Need Help On Proper Steps to Take To De-Index Our Search Results Pages
-
So, I have finally decided to remove our Search Results pages from Google. This is a big dealio, but our traffic has been declining consistently since 2012, and these pages are the only cause I can think of.
The reason they got indexed: back in 2012 we added tag links to our product pages, but those tags pointed to our search results pages. Over time, hundreds of thousands of search results pages got indexed.
By tag pages I mean:
Keywords: Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies
Each of these would be linked to our search results pages, e.g. http://oursite.com/Search.html?text=Kitten-Doggies
I really think these indexed pages are causing much of our traffic problems, as there are many more Search Pages indexed than actual product pages. So, my question is... Should I go ahead and remove the links/tags on the product pages first? Or, if I remove those, will Google then not be able to re-crawl all of the search results pages it has indexed? Or, if those links are gone, will it notice they are gone and therefore remove the search results pages they were previously pointing to?
Should I remove the links/tags from the product pages (or at least cut them down to the top 8 or so) and add noindex,nofollow to all the Search Results pages at the same time?
Or should I first noindex,nofollow ALL the search results pages and leave the tags on the product pages, giving Google a chance to follow those tags back to all of the Search Results pages so it can see the noindex,nofollow on them? Otherwise, will Google not be able to find these pages?
Can someone comment on what might be the best, safest, or fastest route?
Thanks so much for any help you might offer me!!
Craig
-
Hi Craig,
In general the structure looks ok - just wondering how you're going to manage to keep 1 million products within a reasonable number of clicks from the homepage.
rgds
Dirk
-
Sounds good! Thanks again!
C
-
Hi Craig,
Getting quite late here in Belgium (already past midnight) - will get back to you tomorrow (with a fresher mind...)
Dirk
-
This is a big help as I am finalizing the category pages now.
So our site is big, getting close to 1,000,000 products in the store.
Each product can belong to up to 3 sub-cats. Our internal category structure is generally like this:
Widgets->Awesome Widgets->Blue Widgets
or
Widgets->Awesome Widgets->Large Widgets->Large Blue Widgets
So, currently, my structure is like this:
1. Home Page links to:
   - Primary Category 1
   - Primary Category 2
   - Primary Category 3
   - Primary Category 4
2. Each Primary Category Page:
   1. Links to any sub-categories
   2. Has a list of all products in that category, with pagination linking to their product pages.
3. The Product Page links back to:
   1. The Primary Category Page
   2. Each of the 3 Sub-Category pages that the product belongs to.
   3. A small number of related products.
Generally each sub-cat will have thousands, if not tens of thousands, of products.
How does this sound and do you have any advice related to this?
Thanks again!! :):):):):):):):) You get extra smilies for awesome help.
Craig
-
Hi Craig,
A. The logic seems ok - but it doesn't say much about the depth of the site. Questions for me are:
- can one product belong to more than one category?
- are we talking about 100 products or 10,000?
Suppose the worst case:
- each product belongs to only one subcategory & each subcategory belongs to one category
- you have 500 products in this subcategory
If there is pagination - with 50 products/page, the last 50 products will be >10 clicks from the homepage.
If there is a 'show all as one page' option - there would be too many links on the page, so you cannot be certain that the ones at the bottom of the page will get followed.
If a product can belong to multiple subcategories or categories, and/or there are fewer products, it's more likely that it will be closer to the homepage.
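Dirk's worst-case arithmetic generalizes easily; a quick sketch (the next-link-only pagination and the two clicks from home to the subcategory list are assumptions taken from his example, not facts about Craig's site):

```python
import math

def last_product_depth(num_products, per_page, home_to_list_clicks=2):
    """Clicks from the homepage to the last product, assuming the
    paginated list only exposes a 'next page' link, so reaching
    page N costs N-1 extra clicks beyond the first list page."""
    pages = math.ceil(num_products / per_page)
    # home -> category -> subcategory list, then page-by-page, then product
    return home_to_list_clicks + (pages - 1) + 1

# Dirk's example: 500 products at 50/page in one subcategory
print(last_product_depth(500, 50))  # 12 clicks to the last product
```

With numbered pagination (links to pages 1...10 visible at once) the depth collapses, which is one argument for exposing page numbers rather than only next/previous links.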
B. No - the products would not be removed from the index. However, if there are no links to these pages, they will not be shown in the results (Google expects every part of your content to be reachable by at least one link). No (internal) links = no value, is the way Google thinks. The more links, and the fewer clicks from the homepage, the more value a page gets. You should put the new navigation in place as soon as possible - ideally it would have been done at the same time.
Hope this clarifies,
Dirk
-
I was talking about my search pages specifically: either adding a meta robots noindex,nofollow OR just a noindex. I just went ahead and added the nofollow.
Good point on the Screaming Frog.
Currently, the site is organized like this: HomePage -> Several links to many variations of the Search Page -> Product Pages
The new organization will be:
Home Page -> Various Category Pages -> Various Sub-Category Pages (With products on them and pagination to show all products) -> Possibly Other Sub-Category Pages (With products on them and pagination)
Then on the product pages there will be links back to the primary and secondary category pages.
A. How does that sound? And
B. If I have Product pages that are already indexed, could noindexing the Search pages mean those pages get removed? Or, if they are already in the index, are they safe?
Thanks again for taking the time to help and answer!!
Craig
-
Hi Craig,
Not sure where you would put the nofollow:
- The links to the search pages on the articles need to be "follow" - if Google is never allowed to follow the links to the search pages, it will take a long time before the bot discovers that all the search pages have become "noindex".
- The links on the search pages themselves - here you can do what you want. As the final goal is to remove the search pages from the index, once they're no longer indexed it becomes irrelevant whether the links on these pages are nofollow or not. I would keep these links "follow" - allowing the bots to easily access all the pages, find the links on them that go to the other search pages, and take those out of the index.
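The noindex/follow combination above is one meta tag in the search template's head; a tiny helper to verify a page actually carries it might look like this (the parsing approach is mine, not something from the thread):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = (a.get("content") or "").lower()

def is_noindexed(html):
    parser = MetaRobotsParser()
    parser.feed(html)
    return parser.robots is not None and "noindex" in parser.robots

# The tag a search results template would emit under Dirk's advice:
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

Run against a sample of live search URLs, a check like this catches templates that were missed during the rollout.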
One thing that you should also check, and that I didn't mention before: it is probably a good idea to crawl your site now with Screaming Frog and check the depth of the site (% of articles at 1/2/3... clicks from the homepage). It's possible that if you remove the "search" pages, a large part of your content moves deeper into the site - this could have a negative impact on the ranking of those articles. If this is the case, you could decide:
- to keep some of the search pages (but noindex/follow)
- to increase cross linking between normal articles
- to add some new index pages (again noindex/follow)
(or a mix of these)
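The depth check Dirk recommends is, at its core, a breadth-first search over the internal link graph; a minimal sketch (the page names and structure below are a made-up stand-in, not Craig's actual site):

```python
from collections import deque

def click_depths(links, home="home"):
    """BFS over the internal link graph: returns each page's
    minimum number of clicks from the homepage."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical structure: home -> category -> subcategory -> products
links = {
    "home": ["widgets"],
    "widgets": ["awesome-widgets"],
    "awesome-widgets": ["blue-widget-1", "blue-widget-2"],
}
depths = click_depths(links)
print(depths["blue-widget-1"])  # 3 clicks from the homepage
```

Running this before and after simulating the removal of the search pages (just delete them from `links`) shows exactly which products would sink deeper.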
rgds,
Dirk
-
Hey Dirk,
I have one more follow-up on this if you don't mind. My SEO auditor said I should both no-index AND no-follow the search results pages.
This concerns me a little, as it may have a negative effect on my Product pages - I will have to make sure they can be found another way, which I will do, but it will take time of course.
Any reason why you suggested just noindex and did not include the nofollow? Do you have any other insight on that?
Thanks!
Craig
-
Thank you my brother...
Very much appreciate the time you took for some thorough answers here....
Very good stuff and VERY much appreciated.
I had a chat with my SEO auditor today and he suggested noindex,nofollow on the search pages and then, in about 30 days, removing the product page links.
So, I will likely do that.
Much appreciation to you - Craig
-
I don't think there is an easy route here - you will have to get rid of these indexed search pages in any case. Keeping these low quality pages will continue to hurt your site.
If you currently don't have the resources for the 'ideal' scenario, I would go for the short pain: cut these pages out now. It will probably cost you traffic in the short term, but at least you'll have a clean base to build upon. Keeping the pages is probably better in the short term, but the longer you keep them, the more your site's reputation is going to be affected, and it puts you in danger for future algorithm updates.
Just my opinion
Dirk
-
Right, I hear you on that, and honestly, the scenario you have posited is the reason I haven't done anything yet on this. I agree that is the ideal way to do it, but I am not sure I can. I just don't have the time or resources, and I agree that the positive effect could take some time...
So, I am curious, what you think the quickest route to a positive effect would be?
C
-
Hi,
There is an alternative solution but it would require more work on your side.
The problem with your current situation is that you are creating thousands of pages with little added value (which Google doesn't really like: https://www.mattcutts.com/blog/search-results-in-search-results/) and then heavily promoting these low quality pages by pointing hundreds of links at them. The principal message to Google: these low quality pages are my most important ones.
What you could do is check which search pages are generating traffic (e.g. take the top 100) and create "real" pages for them. If we take the example you gave: http://oursite.com/Search.html?text=Kitten - rather than having a generic search page with little added value, you create a real page with some added value content (yoursite.com/topics/kitten) with links to your most important pages on the subject. As an example of what such a page could look like: http://dogtime.com/dog-breeds/german-shepherd-dog - this page is a kind of "home", containing a definition + links to the most important related articles on the subject. If these kinds of pages already exist on your site, then of course there's no need to create them.
On the related search pages you then put a canonical URL pointing to this page. You also update the links to the search page so they point to the "real" added value page. This way you start promoting new value-added content with minimal risk of losing your current positions & remove the old low value pages from the index. It can take some time, however, before you see a positive effect.
For the search queries where it's not possible to create an added-value version, you point the canonical to the generic search page (or your homepage) and remove all the links to these pages.
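Dirk's canonical mapping could be sketched like this (the topic-page URL scheme and the fallback target follow his hypothetical example, not anything confirmed about Craig's site):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical: search terms that earned a hand-built topic page
TOPIC_PAGES = {"kitten": "http://oursite.com/topics/kitten"}
FALLBACK = "http://oursite.com/"  # homepage for everything else

def canonical_tag(search_url):
    """Return the rel=canonical link a search results page should carry."""
    query = parse_qs(urlparse(search_url).query)
    term = query.get("text", [""])[0].lower()
    target = TOPIC_PAGES.get(term, FALLBACK)
    return '<link rel="canonical" href="%s">' % target

print(canonical_tag("http://oursite.com/Search.html?text=Kitten"))
# -> <link rel="canonical" href="http://oursite.com/topics/kitten">
```

The same lookup table can drive the link rewrite on the product pages, so tag links and canonicals stay in sync.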
Hope this helps,
Dirk
-
Dirk,
THANKS!!! Thanks for the solid response. I guess my only concern is that we are still getting traffic from these indexed Search pages... and I need to minimize the hit from removing them. Any other more advanced methods I could use? Or, in that case, would you recommend a combination of using the URL removal tool PLUS removing the tags?
I just need to do this as right as possible. I can't afford too much of a hit here (if any). But at the same time, we are losing traffic so fast, and have lost so much already, that I don't have any choice at this point. We have doubled our product pages in the past 3 years and yet have lost about half our traffic.
Thanks again!
Craig
-
Hi,
I would first put a noindex on all your search result pages and leave the tags on the product pages, to allow Google to crawl them & "read" the new instructions.
I would also try to block these result pages in robots.txt - it accepts pattern matching (https://support.google.com/webmasters/answer/6062596?hl=en&ref_topic=6061961) - if you try this, make sure you test it properly to avoid unwanted side effects.
You could also try the URL removal tool - it's quite easy to remove an entire directory with the tool (https://support.google.com/webmasters/answer/1663419?hl=en) - you must make sure, however, that the pages cannot be crawled again (so do it after modifying robots.txt). If your search pages sit at the root of your site rather than in a separate directory, I'm not sure it will work.
Just removing the links to these pages, without any other modification, is not going to help - they will simply remain in the index.
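Dirk's warning about testing the pattern can be heeded offline with Python's standard library robots.txt parser (the Disallow rule below is a guess based on Craig's example URL, and note this parser only does prefix matching, not Google's `*` wildcards):

```python
import urllib.robotparser

# Hypothetical rule blocking the parameterised search pages
rules = """\
User-agent: *
Disallow: /Search.html?text=
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search results pages are blocked...
print(rp.can_fetch("*", "http://oursite.com/Search.html?text=Kitten-Doggies"))
# ...while product and category pages stay crawlable.
print(rp.can_fetch("*", "http://oursite.com/widgets/blue-widget"))
```

One caveat worth repeating from the thread: a robots.txt block stops Google from re-crawling the pages, so it should go in only after the noindex tags have been seen, or the noindex will never be read.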
Hope this helps,
Dirk