Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Is there any reason for a massive decrease in indexed pages?
-
Hi,
I'm helping with SEO for a large e-commerce site in LatAm, and over the last few months our search traffic has dropped and the number of indexed pages has decreased dramatically.
The site had over 2 million indexed pages (which was far too many, since we believe around 10k would be more than enough to cover the 6k+ SKUs), but that number has dropped to under 3k in less than two months.
I've also noticed that most of the results in which the site still appears are .pdf or .doc files, not actual HTML pages on the website.
I've checked the following:
- Robots (there is no block, you can see that on the image as well)
- Webmaster Tools Penalties
- Duplicated content
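For reference, this is roughly what an accidental crawl block in robots.txt would look like; ours contains nothing of the sort (paths here are placeholders):

```
User-agent: *
Disallow: /
```

A stray `<meta name="robots" content="noindex">` in a shared page template would have a similar deindexing effect, so that's worth checking too.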
I don't know where else to look. Can anyone help?
Thanks in advance!
-
It could be, if they made the switch improperly and Google isn't transferring link equity or can't find the new pages. I like checking services like Archive.org to back up my ideas, but I think you should probably reach out directly to your client and ask about their activities in April.
Hope this helps!
Kristina
-
Gazzerman,
Thank you so much for your reply. I see you're a great detective!
I'm aware of many of those issues, and I'm fighting with the IT team to get some of those changes done ASAP.
I'll be checking everything you've mentioned to make sure these things get fixed.
I'll let you know what happens!
Thank you,
Mat
-
OK, I think I've found some of your issues!
There are a few major ones. I hope you don't mind, but I did some detective work and identified your site from the code screen grabs you posted in another reply below.
The first MAJOR ISSUE is that you have TWO BODY tags in your HTML! You need to check the code on all pages/templates and get it fixed ASAP. Archive.org shows this has been a problem on your site since the redesign in Oct 2013. Google pays much more attention to signals like this now and doesn't look on them favourably.
Another concern: when you do a site: query in Google, your homepage doesn't show, and it should be the first result. I see that you have 15,100 pages indexed.
When I search Google for your website ("sitename.com.ar"), there is no title or description, and even the quick links are broken: only one shows, labelled "untitled".
NEXT: all your pages have the same description and keywords tags! What's going on there? If you can't make them unique, then get rid of them; they are doing more damage than good, and Google hates that stuff. I have had pages not show in the SERPs because they had duplicate description tags; this happened to five of my pages recently that I had somehow missed. And this is on every product page of yours, if not every page on the site.
Also, I would remove the keywords meta tag; it's pointless and will do you more harm than good.
The paths to your JavaScript and CSS files have two slashes (//) where there should only be one.
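To illustrate, a cleaned-up template would look something like this; everything below is a placeholder sketch, not your actual markup:

```html
<!DOCTYPE html>
<html lang="es">
<head>
  <title>Unique page title</title>
  <!-- one description written for this specific page; keywords tag removed entirely -->
  <meta name="description" content="A description unique to this product page.">
  <!-- single slash in asset paths, not //css/style.css -->
  <link rel="stylesheet" href="/css/style.css">
  <script src="/js/main.js"></script>
</head>
<body>
  <!-- exactly one opening and one closing body tag per rendered page -->
</body>
</html>
```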
I would start with those changes, then do a fetch in Google WMT and give it a few days; the homepage should reindex in a very short time frame, maybe even minutes.
After that we can take another look at the site. These are very basic things that a large brand site like this should never get wrong, and you can see the damage they have caused. Lucky for you that you have only just started there, so you can be the knight in shining armour who sorts it all out!
-
Hi Kristina,
Thank you so much for your complete answer. It's great to have such comprehensive feedback!
I'm attaching a screenshot of organic traffic since Nov 2013. You can see there was a huge decline in visits between March and April 2014.
I've checked, and it seems that all pages lost traffic, not just some of them. The only page that still seems to get traffic is the home page.
It also looks like the site moved from https to http. I'm not sure about it, but when I check on Archive.org I see some differences in the source: in the previous version some links pointed to https, and now they don't. (I'm attaching a screenshot as well.)
I'll try the sitemap suggestion, but do you think there might be a penalty, or could it just be the move from https to http?
Thank you so much!
Matías
-
Hi Gazzerman1,
Thank you so much for your feedback. I'll be checking that. Unfortunately I can't publicly disclose the URL.
Best regards,
Matías
-
Hi there,
You've got a lot going on here. I'm worried that Google thinks your site is exceptionally spammy or has malware, since it's showing your .pdf and .doc files but not as many of your HTML pages. But I can't make any definitive statements, because I'd need a lot more information. If I were you, I would:
-
Dig into Google Analytics to get a better idea of what exactly has lost organic traffic, and when
-
Look at the trend:
-
Is it a slow decline or a sharp drop? A sharp drop typically means a penalty, or that the company accidentally deindexed its pages, or something equally horrifying. A slow decline could be an algorithm change (at least the ones I've seen take their time to really take effect) or a new competitor.
-
When did it start? That way you can pinpoint what changes happened around then.
-
Is it a steady decline or has it been jumping up and down? If it's a steady decline, it's probably one thing you're looking for; if traffic is erratic, there could be a bunch of problems.
-
Find out which pages are losing traffic.
-
_Are all pages on the site losing traffic?_ If they are, it's probably an algorithm change or a change that was made by webmasters sitewide. If it's specific pages, it could be that there's now better competition for those key terms.
-
Which pages are losing the most traffic? When did their traffic start to drop? Is it different from the rest of the site? You may need to investigate at a deeper level here. Individual pages could have new competition, you may have accidentally changed the site structure and reduced internal links to them, and/or they may have lost some external links.
-
Get an accurate list of every unique page on your site to clear up duplicate indexation and find out if anything _isn't_ in Google's index.
-
Add a canonical to all pages on your site pointing to the version of the URL you want in Google's index. It's a good way to make sure that parameters and other accidental variations don't lead to duplicate content.
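For example (placeholder URL), the clean URL and any parameterized variants would all carry the same canonical in their head:

```html
<!-- on /product-123 AND on variants like /product-123?sort=price&sessionid=abc -->
<link rel="canonical" href="http://www.example.com/product-123">
```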
-
Divide that list into site sections and create separate XML sitemaps by section. That way you have a better idea of what is indexed and what isn't in Google Webmaster Tools. I got this idea from this guy (who is also my boss) and I swear by it.
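A sketch of the idea (filenames here are just examples): submit one sitemap index that points to a sitemap per site section, and Webmaster Tools will then report indexed counts per file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-articles.xml</loc></sitemap>
</sitemapindex>
```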
-
Based on traffic in Google Analytics, pick out pages that used to get a lot of traffic and now get none. Search for them in Google with a site:[url] query to see if Google has them indexed.
After that, if you're still stumped, definitely come back here with a little more info - we'll all be happy to help!
Kristina
-
-
I'm not sure if this will solve the problem, but thank you so much for your reply.
I've checked with both tools and they're great, but I can't find a solution yet.
Best regards,
Matías
-
Panda has run recently; does the drop coincide with that date? Normally there is a shake-up in the week leading up to it.
I suspect that someone may have made changes before you got there and has not given you the full picture.
Was there any issue with the server during that time, such as slow page speeds or pages not loading? WMT should give you some indication of the decline, even how dramatic the drops were and what kinds of pages were dropped.
FYI, you are better off removing the meta descriptions from the pages that have duplication than keeping them, but the best course of action is, of course, to rewrite them.
If you care to share the URL we can all take a look and see if we can spot anything.
-
Try checking the site in Bing/Yahoo and Yandex. See what type of pages they were, then go to Archive.org to check them.
If it is indeed thin pages, or pages generated by parameters/searches/sorting, then that is usually an automatic algorithmic penalty, usually Panda.
You might also have some good pages with good links, so be sure to check for 404s and 301 the good pages as needed. It's a really tricky situation.
You also have to see if there's trouble with internal links and inbound links (just in case there's a mix with Penguin; I've seen sites with manual and algorithmic penalties for both, wtf).
Try out tools like
http://www.barracuda-digital.co.uk/panguin-tool/
http://fruition.net/google-penalty-checker-tool/
to see when your drops really started. You should at least be able to identify the algorithmic penalty and go from there.
-
Hi Dennis,
Thank you for your help. I'm not sure what those pages were, since I only started managing the account a few days ago. I suspect most were pages generated by different parameters, but I've adjusted that in Webmaster Tools.
Also, the site had an "expires" meta tag dated 1997, which we removed a few days ago since I thought that might be the problem, but the indexed pages continued to decrease even after removing it.
I've checked the http/https question, and the site (at least the indexable content) is all http, so there shouldn't be an issue there.
One of the issues I've asked to be fixed is that most pages have no meta description at all, and the pages that do have one share a generic description that is exactly the same on every single page. I'm aware this is a problem, but I don't think it explains the main issue.
Do you know how I can find out whether there was an algorithmic penalty? I've checked GWT, but I can only see manual penalties there.
Thank you so much!
Matías
-
Hi
That's a lot of indexed pages. What made up the bulk of those pages? Categories? Products? Random tags or thin pages?
Did you switch to https recently? Are all versions included in Webmaster Tools, with a preferred version set? How about their sitemaps?
It's possible that it's Panda. It's also possible that it's something technical, like a switch to https, improper redirects, etc.
I just find it weird to see a big site like that lose that much indexed content (assuming they were good pages) without an algorithmic penalty or a technical problem.