Google Organic Ranking & Traffic Dropped
-
Hello,
We have been struggling to keep our website (http://goo.gl/vS37qA) ranking well in Google since April 30, 2015. Around that time, roughly 15,000 blocked pages (mainly Magento layered navigation pages) appeared in Google Search Console. We added canonical tags, and all of these pages have since been removed from Google's index and from Search Console. We didn't do anything that violates Google's Guidelines. Currently in Google Search Console we see:
- around 50 crawl errors
- no malware
- no blocked pages
- no other error messages in Webmaster Tools
We have never practiced black hat SEO, paid for links, or used tactics that Google penalizes. We did notice that over the last few months around 1,000 Chinese/Russian/Japanese links have been pointing to our website, and we have used the disavow tool to notify Google of these attacks. Any help would be greatly appreciated!
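For readers unfamiliar with the disavow tool mentioned here: the file you upload is a plain-text list with one entry per line, where `#` starts a comment and `domain:` disavows a whole domain. A minimal sketch (the domains and URLs below are placeholders, not the actual links from this thread):

```text
# Spammy backlinks found during the link audit (example entries only)

# Disavow an entire domain:
domain:example-link-farm.cn
domain:example-spam-directory.ru

# Or disavow individual URLs:
http://example-blog.jp/links/page1.html
```

The file is uploaded per-property through Search Console's disavow links page; it does not take effect until Google recrawls the disavowed pages.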
-
Hi Kristina,
Thank you very much for taking the time to review our site.
These 15,000 pages were basically layered navigation pages. We did not feel they were useful for Google, so we had blocked them through robots.txt. However, Googlebot still got them into the index somehow. We then added canonical tags to our pages and removed the block from robots.txt (in June 2015). Google eventually dropped those pages from the index thanks to the canonical tags. We have around 2,000 "real" pages on our website, so right now it seems to us that Google's index is showing the right number of pages. This has been the case since September 2015, but our rankings have stayed low.
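For anyone following along, the two mechanisms described above behave very differently, which is why the order of operations mattered: a robots.txt `Disallow` stops crawling but does not remove already-indexed URLs (Google can keep a blocked URL indexed from external links alone), while a canonical tag only works if the page can be crawled so the hint is actually seen. A sketch of both, with placeholder Magento-style faceted URLs:

```text
# robots.txt — blocks crawling of layered navigation parameters,
# but URLs blocked this way can remain indexed without content:
User-agent: *
Disallow: /*?color=
Disallow: /*?price=

# Canonical tag placed in the <head> of a layered navigation page,
# pointing at the base category URL. This requires the page to be
# crawlable, i.e. NOT blocked in robots.txt:
<link rel="canonical" href="http://www.example.com/category.html" />
```

Removing the robots.txt block so Googlebot could see the canonicals, as described above, is the standard way to let the duplicate URLs drop out of the index.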
-
Hi Nancy,
To be clear, when you say that there were 15,000 blocked pages, do you mean that Google reported that its crawler was blocked from crawling them? Or that Google was blocking them itself, thinking that they're malware or something like that?
The term "blocked" makes me think there's a technical issue here, not a penalty. Were these 15,000 pages important pages? I see that Google still has around 1,950 of your site's pages in its index. How many should it have?
Best,
Kristina
-
Thanks for your insight and help once again!
Yes, at that time we dropped because we were hit with tons of links from foreign countries. We reviewed all of our backlinks at the time and used the disavow feature in Google Search Console.
-
Hi NancyH, I wouldn't use ebay.com as a comparison; with a DA of 95, any page speed issues are trumped by the backlink portfolio. I'm not seeing anything else that sticks out as an issue. I did notice that your first-page results peaked in March 2014, dipped most sharply during the summer of 2014, and have rebounded a bit since this summer, sitting at about 769 first-page results. Do you remember whether anything changed with the site in the spring/summer of 2014? Have you dug into your analytics to see if any of those metrics changed around the same time? That is about all the insight I can give without doing a full audit of the site. Maybe someone else in the forums can offer additional insight.
-
Hi VERBInteractive,
Thanks for looking into our issue and responding. I agree with you, but improving our page speed on mobile devices is a challenge. Our development team is working on this issue. We noticed that many other ecommerce websites (for instance ebay.com) are still doing well in search even though their page speed on mobile devices is poor. Could you look further and see if you notice any other issues with our website?
-
Hi Nancy,
Have you explored your page speed on mobile devices? I noticed a big drop in the spring of this year (although it does look like you are slowly regaining a lot of first-page results). This would have been right around Google's algorithm change to favor mobile-ready sites. If you look at https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.thecpapshop.com%2F&tab=mobile you will see that although the user experience score is pretty good, the speed score is pretty low. See which of the opportunities listed there you can fix.
Good luck,
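For anyone who wants to track that score over time rather than re-checking the web UI by hand, PageSpeed Insights also exposes a JSON API. The sketch below only shows how one might parse a response of the current (v5) shape; the hard-coded sample payload stands in for a real network call to the endpoint named in the comment:

```python
# Extract the performance score from a PageSpeed Insights (v5) JSON
# response. The payload below is a truncated sample standing in for
# the JSON returned by:
#   https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=<URL>&strategy=mobile

sample_response = {
    "lighthouseResult": {
        "categories": {
            # Score is reported as 0.0-1.0; the web UI shows it as 0-100.
            "performance": {"score": 0.42}
        }
    }
}

def performance_score(response: dict) -> int:
    """Return the 0-100 performance score from a v5 PSI response."""
    raw = response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)

print(performance_score(sample_response))  # → 42
```

Polling this on a schedule and logging the result makes it easy to correlate speed regressions with ranking dips like the one discussed in this thread.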
Related Questions
-
Fred Update & Ecommerce
Hi, I wondered if there have been any other insights since March about the Fred update or any other Google update? I don't think we were hit by Fred, but in March we dropped out of a lot of keyword rankings, and I just cannot pinpoint why. We are an ecommerce site, so some of our product/category pages don't have a huge amount of written content. We might have a couple of extra backlinks to disavow, but nothing major. Does anyone else have any insights? Thanks!
White Hat / Black Hat SEO | BeckyKey -
Any more info on potential Google algo update from April 24th/25th?
Apart from an article on Search Engine Roundtable, I haven’t been able to find anything out about the potential algorithm update that happened on Monday / Tuesday of this week. One of our sites (finance niche) saw drops in rankings for bad credit terms on Tuesday, followed by total collapse on Wednesday and Thursday. We had made some changes the previous week to the bad credit section of the site, but the curious thing here is that rankings for bad credit terms all over the site (not just the changed section) disappeared. Has anyone else seen the impact of this change, and are there any working theories on what caused it? I’m even wondering whether a specific change has been made for bad credit terms (i.e. the payday loan update)?
White Hat / Black Hat SEO | thatkinson -
Google Penguin penalty is automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and seem to be penalised according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool to check for a penalty, or are there other good tools available? Are these penalties the result of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised; the new Penguin update might have flagged their backlinks, causing the penalty. Even we have dropped over the last two weeks. What's the solution for this? How effective is a link audit? Thanks, Satish
White Hat / Black Hat SEO | vtmoz -
What is the best way to eliminate ghost traffic from Google Analytics?
Hey Mozzers, I just wanted to see how you all deal with eliminating ghost traffic sources from Google Analytics. I tried setting up a RegEx 'include' list before, but it seemed as though I was blocking potential real traffic sources when I did (I'm probably missing something here). Anyway, I'm interested to read how you all have dealt with this issue in the past. Thanks for reading!
White Hat / Black Hat SEO | maxcarnage -
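On the ghost-traffic question above: one common approach (sketched below, and not the only one) is a hostname "include" filter. Ghost hits are injected straight into the Measurement Protocol without ever loading your page, so they typically carry a fake or empty hostname; a regex that only admits hostnames that legitimately serve your tracking code screens them out without touching real traffic. A minimal illustration in Python, using placeholder hostnames:

```python
import re

# Placeholder valid-hostname pattern, mirroring what a GA "include"
# filter on the Hostname field would match. Replace the names with
# every hostname that legitimately runs your tracking code (www
# domain, bare domain, checkout subdomain, translate proxy, etc.).
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

hits = [
    "www.example.com",          # real traffic
    "example.com",              # real traffic (bare domain)
    "free-share-buttons.xyz",   # classic ghost-referrer hostname
    "",                         # ghost hits often send no hostname at all
]

kept = [h for h in hits if VALID_HOSTNAME.match(h)]
print(kept)  # → ['www.example.com', 'example.com']
```

The reason an include list can "block potential traffic sources", as the question mentions, is usually a legitimate hostname (e.g. a payment or translation proxy) missing from the pattern, not the technique itself.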
70% organic traffic drop in October?! Algorithm change?
I oversee content for a client, and this past month there was a 70% decrease in traffic. We noticed the hit started on September 29th, and traffic has never rebounded. Any suggestions as to what this could be (i.e. the latest Google algorithm update?) and/or tools I should use to look into it? Nothing is showing up as an alert in Moz Analytics, and I need to address this with my client asap.
White Hat / Black Hat SEO | jfeitlinger -
Google's Related Searches - Optimizing Possible?
Does anyone know how Google determines what suggestions show up at the bottom of SERPs? I've been working with a client to boost his local ranking, but every time we do a branded search for his business his competitors keep popping up in the "Searches related to ______" section.
White Hat / Black Hat SEO | mtwelves -
Tools to check Google Local SEO with suggestions.
Is there any tool to check a website's position on Google Maps? Also, is there a way to check which local directories a website is listed on (and which it is not), and to get suggestions for improvements? In short, we need tools to check Google local SEO, with suggestions.
White Hat / Black Hat SEO | mnkpso -
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js can render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Their explanation: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, on the backend of our site we would detect the user agent of all traffic, and once we found a search bot, serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But isn't this technique cloaking?

From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
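As a side note on the mechanics the Dust.js question describes: the "render server-side for crawlers" decision usually comes down to a user-agent check on each request. A minimal, framework-agnostic sketch in Python (the token list is illustrative, not exhaustive; real deployments match against a maintained list of crawler user-agent strings):

```python
# Decide whether a request should receive the server-rendered page
# instead of the client-side-templated one. The token list below is
# a hypothetical sample, not a complete crawler registry.
BOT_TOKENS = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def wants_server_rendered(user_agent: str) -> bool:
    """True if the user agent looks like a search engine crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

print(wants_server_rendered(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # → True
print(wants_server_rendered("Mozilla/5.0 (Windows NT 6.1) Chrome/46.0"))  # → False
```

Whether this counts as cloaking hinges on the point the question itself makes: both branches must emit identical content, with the user-agent check switching only where rendering happens, not what is shown.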