Google indexing my website's Search Results pages. Should I block this?
-
After running the SEOmoz crawl test, I have a spreadsheet of 11,000 URLs, of which 6,381 are search results pages from our website that have been indexed.
I know I've read that /search should be blocked from the engines, but I can't seem to find that information now. Does anyone have facts behind why these pages should be blocked, or not blocked?
-
Since you already released these out to the wild, I would analyze which search results pages are bringing in traffic and use that analysis to create new category pages on your site. I would certainly block the search parameter in Webmaster Tools and in robots.txt. Most internal search results pages have little content value, and the engines now look at your site as a whole: if a certain percentage of the site is low quality, the whole site can be penalized.
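For example, the blocking rule might look like this in robots.txt (a sketch assuming your internal search lives under /search and uses a q parameter — adjust both patterns to match your own URL structure):

```
User-agent: *
# Keep crawlers out of internal search results pages
Disallow: /search
Disallow: /*?q=
```

Note that robots.txt only stops crawling; URLs that are already indexed may linger in the index until they drop out or are removed via Webmaster Tools.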
-
Jenny,
Take a look at this post in the forums on indexing issues with site search - http://www.seomoz.org/q/block-search-engines-from-urls-created-by-internal-search-engine.
Allowing site search to be indexed can result in a ton of duplicate content on your site. I recommend taking the meta noindex approach.
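A minimal sketch of the noindex approach — placed in the `<head>` of the search results template. The page must remain crawlable (i.e. not blocked in robots.txt) for Google to see the tag:

```html
<!-- Allow crawling but keep the page out of the index;
     "follow" lets link equity still flow through the page's links -->
<meta name="robots" content="noindex, follow">
```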
-
The simple answer is that Google allocates a crawl budget to each site based on multiple factors. With your current setup, the crawlers are wandering off after these search pages, which add no value to the web, and you're losing a lot of your budget on them. I would definitely direct the crawlers to your real content instead, so it gets recrawled whenever you add or update a page.
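To put a number on the crawl-budget waste, you can tally how many URLs in a crawl export are internal search pages. A quick sketch, assuming a plain list of URLs (e.g. a column exported from the SEOmoz spreadsheet) and hypothetical patterns — adjust `SEARCH_PATTERNS` to your site's URL structure:

```python
# Count how many URLs in a crawl export are internal search results pages.
SEARCH_PATTERNS = ("/search", "?q=", "&q=")

def count_search_urls(urls):
    """Return (search_count, total_count) for a list of crawled URLs."""
    search = sum(1 for u in urls if any(p in u for p in SEARCH_PATTERNS))
    return search, len(urls)

urls = [
    "https://example.com/products/widget",
    "https://example.com/search?q=widget",
    "https://example.com/blog/post-1",
    "https://example.com/search?q=gadget&page=2",
]
search, total = count_search_urls(urls)
print(f"{search} of {total} URLs are internal search pages")
```

If a large share of crawled URLs match, that's crawl budget being spent on pages you'd rather have noindexed or blocked.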