Organic search traffic dropped 40% - what am I missing?
-
Have a client (an ecommerce site with 1,000+ pages) who recently switched to OpenCart from another cart. Their organic search traffic (from Google, Yahoo, and Bing) dropped roughly 40%. Unfortunately, we weren't involved with the site before the switch, so we can only rely on the Wayback Machine to compare the previous site to the present one.
I've checked all the common causes of traffic drops, and so far I mostly know what probably isn't causing the issue. Any suggestions?
- Some URLs stayed the same and the rest 301 redirect (note that many pages returned 404s until a couple of weeks after the switch, when the client implemented more 301 redirects). A script for spot-checking these redirects is sketched after this list.
- They've got an XML sitemap and are well-indexed.
- The traffic drops hit pretty much across the site, they are not specific to a few pages.
- The traffic drops are not specific to any one country or language.
- Traffic drops hit mobile, tablet, and desktop
- I've done a full site crawl and found only one 404 page; no other significant issues.
- The site crawl didn't find any pages blocked by nofollow, noindex, or robots.txt.
- Canonical URLs are good
- Site has about 20K pages indexed
- They have some bad backlinks, but I don't think it's backlink-related because Google, Yahoo, and Bing have all dropped.
- I'm comparing on-page optimization for select pages before and after, and not finding a lot of differences.
- It does appear that they implemented Schema.org markup when they launched the new site.
- Page load speed is good
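Since the 404-to-301 cleanup happened in stages, it may be worth re-verifying the redirects programmatically. Here is a minimal sketch, assuming you can export the pre-migration URL list (e.g., from the old XML sitemap or a Wayback Machine crawl); the file name is a hypothetical placeholder.

```python
# Minimal sketch: verify that old URLs 301-redirect to pages that resolve to a 200.
# Assumes a list of pre-migration URLs, one per line; "old_urls.txt" is a placeholder.
import requests

with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    # Don't follow redirects, so we see the initial status code.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        target = resp.headers.get("Location", "")
        # Follow the chain to confirm the destination itself returns a 200.
        final = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{resp.status_code} {url} -> {target} (final: {final.status_code})")
    elif resp.status_code == 302:
        # Temporary redirects don't consolidate signals the same way; flag them.
        print(f"302 (should be 301) {url}")
    else:
        print(f"{resp.status_code} {url}")
```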
I feel there must be a pretty basic issue here for Google, Yahoo, and Bing to all drop off, but so far I haven't found it. What am I missing?
-
Hi Adam,
Not to point out something that's likely already well taken care of, but did the GA/analytics code get deployed across the entire site?
Also, is there any heavy JavaScript on the site, especially above the analytics code, that might prevent the analytics code from loading properly? We had this happen with a client a few years ago. We built custom analytics for that client (they didn't want to run GA). The client placed our code in the footer and a slow-loading CRO script in the header. The CRO code took so long to load that visitors had often clicked away from their landing page before our code had a chance to record the visit, since JavaScript generally executes in the order it appears on the page. We had them move our little piece of code up to the top of the page, and the problem was solved (in the meantime, we had been missing about 20,000 visits each week!).
I'm just wondering if this is a tracking issue, since all search traffic, not just Google's, has been affected. It would be quite rare for one issue to hit both Bing's and Google's algorithms the same way at the same time. They're similar, but not identical, and Bing generally takes longer than Google to respond to changes.
Any chance you have raw server logs to compare the analytics stats against?
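If the logs are available, something like the rough sketch below would give a day-by-day count of search-referred hits to set against the analytics numbers. It assumes the common Apache/nginx "combined" log format, and the log path is a placeholder. A drop that appears only in analytics, not in the logs, would point to a tracking problem rather than a real ranking loss.

```python
# Rough sketch: count search-engine-referred hits per day from raw access logs,
# so the totals can be compared against what analytics reports.
import re
from collections import Counter
from datetime import datetime

LOG_LINE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\]'   # timestamp; we keep only the date part
    r' "[^"]*" \d{3} \S+'                # request line, status code, bytes sent
    r' "([^"]*)"'                        # referrer
)
SEARCH_REFERRER = re.compile(r'https?://(www\.)?(google|bing|yahoo)\.', re.I)

daily = Counter()
with open("access.log") as f:            # placeholder path
    for line in f:
        m = LOG_LINE.search(line)
        if m and SEARCH_REFERRER.search(m.group(2)):
            daily[m.group(1)] += 1

for day, hits in sorted(daily.items(),
                        key=lambda kv: datetime.strptime(kv[0], "%d/%b/%Y")):
    print(day, hits)
```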
-
I don't see anything that I'd expect to trigger that. Let me PM you the URL.
-
Did the layout of the header area change significantly? If, for instance, the header went from a tenth of the above-the-fold area to a third, that might run the entire site afoul of Google's "top heavy" page-layout algorithm.
-
Thanks for the suggestions!
-
The homepage, category, and product pages have all lost traffic.
-
So far, I haven't found any noteworthy changes in content.
-
I've been wondering if this might be part of the issue.
-
I've reviewed Majestic link data, and only see a few deleted backlinks, so I'm thinking it's not a backlink issue.
-
Thanks for the suggestion. So far the only significant difference in optimization I've found has been that they added Schema.org markup.
-
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before. Perhaps images are no longer being loaded in a way that Google can see (where they were before), something like that, and Panda now judges the entire site as less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, move to new URLs, etc. I'd expect the link juice calculation to be iterative, which would explain why it takes a number of iterations of the PageRank calculation for entirely new URLs to "get" all the link juice they should have (a toy illustration follows below).
- Their backlinks were moderately dependent on a set of link networks, and those networks have shut down all their sites (so that neither Google nor Bing still sees the links from them).
Those are the ideas that come to mind so far.
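On the third point, here's a toy power-iteration sketch that illustrates why a brand-new URL takes several passes to accumulate its score. This is the classic textbook PageRank formulation, not Google's actual system; the link graph and damping factor are made up for illustration.

```python
# Toy illustration: a new URL's PageRank-style score converges only after
# several iterations, which is one reason migrated URLs may take time to
# "settle". Graph and damping factor are arbitrary; not Google's algorithm.
damping = 0.85
graph = {  # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "new": [],        # the newly created URL
}
# Simulate the migration: A and B now also link to "new".
graph["A"].append("new")
graph["B"].append("new")

rank = {page: 1.0 / len(graph) for page in graph}
for i in range(15):
    new_rank = {}
    for page in graph:
        # Sum contributions from every page that links here.
        incoming = sum(
            rank[src] / len(links)
            for src, links in graph.items()
            if page in links
        )
        new_rank[page] = (1 - damping) / len(graph) + damping * incoming
    rank = new_rank
    print(i + 1, round(rank["new"], 4))  # watch the new page's score converge
```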
-
Did the new cart generate product pages that are optimized differently from the old cart's (assuming cart-generated product pages were used)?
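Since the old site only survives in the Wayback Machine, a sketch like the one below can automate the before/after comparison of basic on-page elements. It uses the public Wayback "availability" API; the product URL and snapshot date are placeholders, and it assumes requests and beautifulsoup4 are installed.

```python
# Hedged sketch: compare title, h1, and meta description between the current
# page and its closest pre-migration Wayback Machine snapshot.
import requests
from bs4 import BeautifulSoup

def on_page(url):
    """Fetch a URL and extract basic on-page SEO elements."""
    soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    return {
        "title": soup.title.get_text(strip=True) if soup.title else None,
        "h1": soup.h1.get_text(strip=True) if soup.h1 else None,
        "description": meta.get("content") if meta else None,
    }

page = "http://www.example.com/some-product"          # placeholder URL
snap = requests.get(
    "http://archive.org/wayback/available",
    params={"url": page, "timestamp": "20140101"},    # pre-migration date
    timeout=15,
).json().get("archived_snapshots", {}).get("closest", {})

if snap.get("url"):
    before, after = on_page(snap["url"]), on_page(page)
    for key in before:
        flag = "SAME" if before[key] == after[key] else "CHANGED"
        print(f"{flag:7} {key}: {before[key]!r} -> {after[key]!r}")
else:
    print("No Wayback snapshot found for", page)
```

Run against a handful of representative product and category pages, this makes the Wayback-based before/after check repeatable instead of eyeballing snapshots one at a time.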