Best way to remove low-quality paginated search pages
-
I have a website with around 90k pages indexed, but after doing the math I realized that only around 20-30k of those pages are actually high quality; the rest are paginated pages from search results within my own site. Every time someone searches a term on my site, that term gets its own page, which includes all of the relevant posts associated with that search term/tag. My site has around 20k different search terms, all being indexed. I have paused new search terms from being indexed, but what I want to know is whether the best route would be to 404 all of the useless paginated pages from the search term pages. And if so, how many should I remove at one time? There must be 40-50k paginated pages, and I am curious what would be the best bet from an SEO standpoint. All feedback is greatly appreciated. Thanks.
-
According to this article, http://www.seroundtable.com/farmer-headers-13111.html,
it sounds like I should be 404ing these pages, since I never plan to rewrite them and I want them removed from my site and from the index.
According to this article, http://www.seroundtable.com/google-robotstxt-advice-12759.html,
they believe you shouldn't use robots.txt.
Anyone know the best option in this situation? Should I just 404 a handful of the 40k pagination pages every week/month until they are all 404'd?
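For reference, retiring these URLs doesn't have to happen a handful at a time; a single server rule can do it in bulk. A minimal sketch, assuming Apache's mod_alias and that the internal-search pages all share a hypothetical /search/ path prefix:

```apache
# .htaccess sketch: answer every retired internal-search URL with "410 Gone",
# which tells Google the removal is deliberate and permanent
RedirectMatch gone ^/search/
```

A 404 works too; 410 just signals intent more strongly. If your search pages don't share a common prefix, the pattern would need to match whatever your URLs actually look like.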
-
If this were in WordPress, there are plugins to mass-fix this, or you can just do it the hard way. I would opt to remove those pages from the index but let them be followed (noindex, follow). It shouldn't hurt too much and will help in the long run.
-
I agree with Fede: search results pages don't contain any value, and search engines may not look on them favourably! The better idea is to block all search result URLs via robots.txt and get the pages with real value indexed, for the search engines as well as your end users!
Hope this helps!
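As a sketch of that robots.txt approach, assuming the internal-search results live under a hypothetical /search/ path:

```text
# robots.txt: keep crawlers out of internal search results
User-agent: *
Disallow: /search/
```

Keep in mind that robots.txt blocks crawling, not indexing: URLs that are already indexed, or that have external links, can linger in the index, which is why some prefer a noindex meta tag for pages Google has already seen.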
-
What I would do, and did for my site, is noindex (via meta tag) all those search results pages. They really don't provide any value and you don't want them indexed; those are pages for users searching your site. If a user searches Google for a term for which you have an article/product or whatever, they will find the article/product page more interesting than a search results page, and Google probably won't even show the search results page in the SERPs...
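That meta tag, for reference (a minimal snippet: noindex drops the page from the index, while follow lets link equity keep flowing through it):

```html
<!-- In the <head> of every internal-search results page -->
<meta name="robots" content="noindex, follow">
```

The page must stay crawlable for Google to see the tag, so don't combine this with a robots.txt block; an equivalent X-Robots-Tag HTTP header works for non-HTML responses.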
Related Questions
-
Indexed Pages Different when I perform a "site:Google.com" site search - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from spiders via the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says they are indexing 8,000 of them. Here's the weird part: when I do a "site:website" search in Google, it says Google is indexing 2.2 million pages on the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say they are indexing 2.2 million URLs, but then won't show me more than 140 of the pages they are indexing? Thank you so much for your help; I tried looking for the answer and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Rel=Alternate on Paginated Pages
I've a question about setting up the rel=alternate and rel=canonical tags between a desktop site and a dedicated mobile site, specifically in regards to paginated pages. On both the desktop and mobile site, all paginated pages have rel=canonical set to a single URL, as per usual. On the desktop site, though, should the rel=alternate point to the relevant paginated page on the mobile site (i.e. a different rel=alternate on every paginated page), or to a single URL, just as it is vice versa? Cheers chaps.
Intermediate & Advanced SEO | eventurerob
-
Removing pages from index
My client is running 4 websites on the ModX CMS, using the same database for all the sites. Roger has discovered that one of the sites has 2,050 302 redirects pointing to the client's other sites. The sitemap for the site in question includes 860 pages. Google Webmaster Tools has indexed 540 pages. Roger has discovered 5,200 pages, and a site: query on Google reveals 7,200 pages. Diving into the SERP results, many of the indexed pages are pointing to the other 3 sites. I believe there is a configuration problem with the site, because the other sites do not have a huge volume of redirects when crawled. My concern is: how can we remove from Google's index the 2,050 pages that are redirecting to the other sites via a 302 redirect?
Intermediate & Advanced SEO | tinbum
-
Best way to move from mixed case url to all lowercase?
We are currently in the process of moving our site from a mixed-case structure, i.e. <sitename>/franchise/childrens-child-care/party/Bricks-4-Kidz/company-information.cfm</sitename>, to all lowercase, i.e. <sitename>/franchise/childrens-child-care/party/bricks-4-kidz/company-information.cfm</sitename>. In order to maintain as much link juice as possible, should we be using 301 redirects to point from the old URLs to the new? Or would it be more advantageous to wait for the next crawl, on the theory that the link juice would be somewhat maintained even though all the uppercase letters have been converted to lowercase?
Intermediate & Advanced SEO | franchisesolutions
-
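For what it's worth, the mixed-case-to-lowercase 301s asked about above are commonly handled at the server level. A sketch assuming Apache with mod_rewrite (note that RewriteMap must live in the server or virtual-host config, not .htaccess):

```apache
# Define a lowercasing map, then 301 any URL containing an uppercase letter
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Waiting for a recrawl without redirects would leave the old mixed-case URLs dangling, so per-URL 301s are the safer route for preserving link equity.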
Previously ranking #1 in Google, web page has 301 / URL rewrite, indexed but now not showing for keyword search?
Two web pages on my website previously ranked well in Google, with consistent top-3 places for 6+ months, but when the site was modified, these two pages, which previously ended in .php, had their page names changed to the keyword to further improve rankings (or so I thought). Since then, neither page ranks at all for that search term in Google. I used Google Webmaster Tools to remove the previous pages from the cache and search results, resubmitted a sitemap, and, where possible, fixed links from other sites to point to the new pages. On previous advice, to fix this I purchased links, web directories, social and articles etc. for the new pages, but so far nothing... It's been almost 5 months, and it's very frustrating, as these two pages previously ranked well and, as landing pages, ended in conversions. This problem only appears in Google; the pages still rank well in Bing and Yahoo. Google has the page indexed if I do a search by the URL, but the page never shows under any search term it should, despite being heavily optimised for certain terms. I've spoken to my developers and they are stumped too; they've now added this code to the affected page(s) to see if it helps:
Header("HTTP/1.1 301 Moved Permanently");
$newurl = SITE_URL . $seo;
Header("Location: $newurl");
Can Google still index a web page but refuse to show it in search results? All other pages on my site rank well; just these two that were once called something different have caused issues. Any advice? Any ideas? Have I missed something? I'm at a loss...
Intermediate & Advanced SEO | seanclc
-
Best Way to Consolidate Domains?
Hello, my company has four websites in the same vertical, and we're planning to integrate them all into our main company site. So instead of www.siteone.com, www.sitetwo.com, www.sitethree.com, etc., it would be www.branddomain.com/site-one, www.branddomain.com/site-two, etc. I have a few questions... Should we redirect the old domains to the new directories, or leave the old domains, stop updating them with new content, and then have the old content, links, etc. 301 to the same content on the new site? Should we literally move all of the content to the new directories? Any tips are appreciated. It's probably pretty obvious that I don't have a ton of technical skills... my development team will be doing the heavy lifting. I just want to be sure we do this correctly from an SEO perspective! Thanks for the help; please let me know if I can clarify anything. E
Intermediate & Advanced SEO | essdee
-
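A consolidation like the one described above is typically done with domain-level 301s so that every old URL maps to its counterpart under the new directory. A sketch assuming Apache on the old domains (siteone.com and branddomain.com are the hypothetical names from the question):

```apache
# In www.siteone.com's virtual host: 301 every path to its new home
# under the brand domain, preserving the rest of the URL
RewriteEngine On
RewriteRule ^/?(.*)$ https://www.branddomain.com/site-one/$1 [R=301,L]
```

Leaving the old domains up without redirects would split the link equity between sites; per-URL 301s to the matching new content are what preserve it.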
Best free way to make our NAPs consistent - online software maybe?
Hello, what's the best free tool or method for making our local SEO citations consistent? We have more than one name and phone number out there, and there are a lot of citations already.
Intermediate & Advanced SEO | BobGW