Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How to safely exclude search result pages from Google's index?
-
Hello everyone,
I'm wondering what's the best way to prevent/block search result pages from being indexed by Google. The way search works on my site is that the search form generates URLs like:
/index.php?blah-blah-search-results-blah
I wanted to block everything of that sort, but how do I do it without blocking /index.php ?
Thanks in advance and have a great day everyone!
-
Hi Louise,
If you can ID the parameters, you can also look at blocking these in Webmaster Tools. This page explains more. As with any blocking of URLs, of course, proceed with caution.
-
I agree that can be effective. The reason I suggested robots.txt is because Louise mentioned "blocking and preventing" as an objective. Robots.txt is particularly useful in cases like this, where results from a search bar or something of that nature are involved. A NOINDEX, FOLLOW will not prevent bots from getting tired and dizzy, whereas robots.txt can "block and prevent" bots from crawling certain parameters.
With all of that said, I think it is important to understand whether you need the bots to crawl but not index (in which case Spencer's answer is correct), or whether you need to prevent bots from crawling the parameters altogether.
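For instance, a minimal robots.txt sketch along these lines could block crawling of the search result URLs without blocking /index.php itself. It assumes the search URLs all share an identifiable string such as "search-results", as the placeholder URL in the question suggests; the real pattern on your site may differ:

User-agent: *
# Hypothetical pattern; adjust to match your actual search URLs
Disallow: /index.php?*search-results

Googlebot supports the * wildcard in robots.txt, and /index.php on its own has no query string, so it would remain crawlable.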
Hope that is more clear
-
I'm not sure that robots.txt is effective when url parameters are involved.
I would just add a meta robots tag to the head section of the search results template:
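Presumably something along these lines, i.e. the NOINDEX, FOLLOW directive referenced in the reply above:

<meta name="robots" content="noindex, follow">

That lets Googlebot continue crawling the search result pages and following their links while keeping the pages themselves out of the index.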
-
If you are able to identify the URL parameters, you may exclude them using robots.txt. Here is a great resource on robots.txt: http://moz.com/learn/seo/robotstxt
Related Questions
-
Moz reports way fewer backlinks than Google search?
My site is only 11 months old but has steadily (if slowly) been gaining backlinks. My question is: why does Moz show me at 303 backlinks while Google Search Console shows 1,237? I am more than a little suspicious that this could highlight the reason Moz shows such an unfavorable DA for our site, at DA 12. Other competitors that rank for similar keywords to mine are DA 42, DA 65, DA 73, etc. If the largest ranking factor is links, and they have mine reported incorrectly, is this the issue with DA as it relates to sites like mine? Anyone who has experienced something similar, or has a definitive answer, is more than welcome to chime in! Thanks, Kevin
Reporting & Analytics | kvncrll0
-
How to track Google auto search suggestion clicks?
Hello Guys, In google.co.uk, when I search for SEL, Google suggests a number of different sites, and when I click on one of those suggestions, that is the click I need to track. I have attached a screenshot to make this easier to understand. Is it possible to track such things, perhaps via server logs etc.? (attachment: TV99h)
Reporting & Analytics | micey1231
-
"index.htm" for all url's in google analytics
I don't have this issue with other wordpress websites, only this one website, and I don't know what's causing the issue: Google Analytics is adding an "index.htm" to every single page on the website. So it is tracking the pages, I see no errors - is it tracking the right page? When I click on the page link in a report, I naturally go to a "404 page not found" since the website address isn't "www.example.com/rewards/index.htm" - but instead the actual address would be:
Reporting & Analytics | | cceebar
"www.example.com/rewards/". I have navigated to View Settings in GA to insure "default page" is empty. Although adding anything else to this field does not effect the page url in analytics reports either. Could it be htaccess file - or a plugin effecting the htaccess file?_Cindy0 -
Why does Google stubbornly keep indexing my http URLs instead of the https ones?
I moved everything to https in November, but there are plenty of pages which are still indexed by Google as http instead of https, and I am wondering why. Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently to https://www.gomme-auto.it/pneumatici/barum. Nevertheless, if you search for "pneumatici barum" (https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum), the third organic result listed is still http. Since we moved to https, Google's crawler has visited that page tens of times, most recently two days ago, but it doesn't seem to care to update the protocol in Google's index. Does anyone know why? My concern is that when I use APIs like SEMrush and Ahrefs I have to do everything twice to try both http and https, so for a total of around 65k URLs I waste a lot of my quota.
Reporting & Analytics | max.favilli0
-
Find Pages with 0 traffic
Hi, We are trying to consolidate the number of landing pages on our site. Is there any way to find landing pages with a particular URL substring which have had 0 traffic? The minimum which appears in Google Analytics is 1 visit.
Reporting & Analytics | driveawayholidays0
-
Getting Google impressions for a site not in the index...
Hi all, Wondering if I could pick the brains of those wiser than myself... My client has an https website with tons of pages indexed and all ranking well; however, somehow they also managed to set their server up so that non-https versions of the pages were getting indexed, and thus we had the same page indexed twice in the engine but on slightly different URLs (it uses a CMS, so all the internal links are relative too). The non-https version is mainly used as a dev testing environment. Upon seeing this we did a Google removal request in WMT and added noindex in the robots, and that saw the indexed pages drop overnight. See image 1. However, the site still appears to be getting returned for a couple of hundred searches a day! The main site gets about 25,000 impressions, so it's way down, but I'm puzzled as to how a site which has been blocked can appear for that many searches, and whether we are still liable for duplicate content issues. Any thoughts are most welcome. Sorry, I am unable to share the site name, I'm afraid. The client is very strict on this. Thanks, Carl (attachment: image1.png)
Reporting & Analytics | carl_daedricdigital0
-
Setting up Google Analytics for Subsites
I currently have one main .com site and am planning on launching geo-location subsites (.co.uk, .com.au, .ru, etc.). Traffic will flow between the sites, and some of the content on the subsites will be duplicated and will therefore include a canonical tag pointing to the main site. I want to set up GA to capture who is going to the subsites and vice versa, and to correctly capture crossover traffic. Any advice on implementing advanced analytics directly (or links to sources that will point me in the right direction for this project)?
Reporting & Analytics | theLotter0
-
Easiest way to get out of Google local results?
Odd one this, but what's the easiest way to remove a website from the Google local listings? Would removing all the Google Maps listings do the job? A client of ours has been suffering massively since the Google update in the middle of last month. Previously they would appear at no. 1 or no. 2 in the local results and normally at 1 or 2 in the organic results. However, since the middle of last month, any time they rank on the first page for a local result, their organic result drops massively to at least page 4. If I set my location in Google as somewhere different, say 100 miles away, they then rank well in the organic listings (obviously not appearing for local searches). When I change it back to my current location, the organic listing is gone and they are back to ranking for the local one. Since the middle of July, traffic from search engines has dropped about 65%. All the organic rankings remain as strong as ever, just not in the areas where they want to get customers from!! The idea is to remove the local listing and get the organic results reranking, as the CTR on those is much, much higher. On a side note, has anyone else noticed very poor CTR on Google local listings? Maybe users feel they are adverts. Thanks
Reporting & Analytics | ccgale0