Removing indexed internal search pages from Google when it's driving lots of traffic?
-
Hi
I'm working on an e-commerce site where the internal search results page is our 3rd most popular landing page. I've also seen in Search Console that Google has often used this page as the "Google-selected canonical" for a few other pages, and it has thousands of these search pages indexed.
Hoping you can help with the below:
To remove these results, is it as simple as adding a "noindex, follow" robots tag to the search pages?
Should I do it incrementally? The indexed results contain parameters (brand, colour, size, etc.), so maybe I should block each one of them over time.
Will there be an initial negative impact on results I should warn others about?
Thanks!
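For context, the "noindex/follow" directive discussed in this thread is a robots meta tag placed in the `<head>` of each search results page. A minimal sketch:

```html
<!-- On every internal search results page: keep the page out of the
     index, but let crawlers follow the links it contains. -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an `X-Robots-Tag: noindex, follow` HTTP response header if editing the page templates is awkward.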
-
Hi effectdigital
Yep, I won't be doing that! I don't think I'll be using the robots.txt file at all for this.
Thanks
Frankie
-
Just a note: if you robots.txt-block Google from crawling the pages, how do you expect them to find the noindex tag? They can only find the tag if they can crawl the pages! You only apply the robots.txt block (which you should do, at the end) **after** Google have removed the offending URLs from their index.
If Google are robots.txt-blocked, they can't crawl the pages' code. If they can't crawl the code, they can't find the noindex. Simple.
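The crawl-versus-index distinction above can be sketched with Python's standard-library robots.txt parser (a minimal illustration of compliant-crawler behaviour, not Googlebot's actual logic; the domain is hypothetical): a blanket `Disallow: /search` stops a crawler from fetching the search URLs at all, which is exactly why a noindex tag on them would never be seen.

```python
from urllib import robotparser

# A robots.txt that blocks the internal search pages outright.
rules = """User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler may not fetch the search results page at all...
print(rp.can_fetch("*", "https://www.example.com/search/?q=jacquemus"))  # False

# ...so any noindex tag in its HTML goes unseen, while ordinary
# pages remain crawlable.
print(rp.can_fetch("*", "https://www.example.com/products/jackets"))  # True
```

Hence the sequencing advised above: noindex first, and only add the disallow once the URLs have dropped out of the index.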
-
Hi Dave
I see Amazeinvent has "noindex, follow" on your search pages. Did you add it afterwards?
Thanks
-
Hi Rajesh
**1- Add robots tag on these type pages -**
Any reason why you'd add "nofollow" on search pages? I'd have thought "follow" is a better option.
2- update robots.txt file - Disallow: /search OR /search/*
I've done this before and traffic dropped a lot over the next 2-4 weeks. We regained the traffic when we removed the disallow.
3- add new parameter in GSC as no URLs.
Done already, thanks
-
Are you talking about this type of page - https://www.xyz.com/search/?q=jacquemus
If yes, please do it like this -
1- Add robots tag on these type pages - e.g. noindex
2- update robots.txt file - Disallow: /search OR /search/*
3- add the new parameters in GSC's URL Parameters tool as "No URLs".
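Step 2 above, written out as a robots.txt fragment (a sketch; note that a bare `/search` prefix already covers parameterised variants, and as other replies in this thread point out, this block should only go live once the URLs have been dropped from the index):

```text
User-agent: *
# Blocks /search, /search/, and any parameterised variants
# such as /search/?q=jacquemus
Disallow: /search
```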