Block search engines from URLs created by internal search engine?
-
Hey guys,
I've got a question for you all that I've been pondering for a few days now. I'm currently doing a technical SEO audit for a large-scale directory.
One major issue they are having is that their internal search system (Directory Search) creates a new URL every time a user enters a search query. This creates a huge amount of duplication on the website.
I'm wondering if it would be best to block search engines from crawling these URLs entirely with robots.txt?
What do you guys think, bearing in mind that there are probably thousands of these pages already in the Google index?
Thanks
Kim
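For reference, the kind of robots.txt rule in question might look like the sketch below; the /search/ path and the q parameter are hypothetical placeholders, since the actual pattern depends on how the directory generates its search URLs. Bear in mind that Disallow only prevents crawling; it does not by itself remove pages that are already indexed.

```text
# Hypothetical robots.txt sketch - adjust paths to match the real search URLs
User-agent: *
Disallow: /search/
Disallow: /*?q=
```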
-
That sounds perfect. If the user-generated URLs are getting enough traffic, make them permanent pages and point the duplicates at them with 301 redirects or canonical tags. If not, weed them out of the index.
-
Thanks for your reply, Dr. Meyers. I think you're probably right.
Yes, I'm recommending they define a canonical set of pages for the most popular searches, categories, and locations, reachable via internal links, and we'll get all those duplicates redirected back to that canonical set.
But for pages that fall outside those categories and locations, I'll recommend a meta noindex tag.
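As a sketch, the two tags that plan leans on might look like the following; the canonical target URL is just an illustrative example, and which tag goes on which page depends on whether it maps to a page in the canonical set:

```html
<!-- On a duplicate search URL that maps to a canonical category page
     (where a 301 isn't practical, a canonical tag is the usual fallback): -->
<link rel="canonical" href="http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland+Region">

<!-- On a search page that falls outside the canonical set: -->
<meta name="robots" content="noindex, follow">
```

The `follow` in the noindex tag lets crawlers keep passing link equity through the page even while it drops out of the index.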
-
It can be a complicated question on a very large site, but in most cases I'd META NOINDEX those pages. Robots.txt isn't great at removing content that's already been indexed. Admittedly, NOINDEX will take a while to work (virtually any solution will), as Google probably doesn't crawl these pages very often.
Generally, though, the risk of having your index explode with custom search pages is too high for a site like yours (especially post-Panda). I do think blocking those pages somehow is a good bet.
The only exception I would add is if some of the more popular custom searches are getting traffic and/or links. I assume you have a solid internal link structure and other paths to these listings, but if it looks like a few searches (or a few dozen) have attracted traffic and back-links, you'll want to preserve those somehow.
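Where a handful of popular searches have earned traffic and links, the preservation could be handled with one-off 301 rules; a minimal Apache sketch, with entirely hypothetical paths, might look like this:

```apache
# Hypothetical .htaccess rules: point link-earning search URLs
# at their closest permanent category page (paths are placeholders)
Redirect 301 /search/car-dealer-auckland /yellow+pages/Car+Dealer/Auckland+Region
Redirect 301 /search/plumber-wellington /yellow+pages/Plumber/Wellington+Region
```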
-
Sure, see below for some of the duplication I mean:
Capitalization Duplication
http://yellow.co.nz/yellow+pages/Car+dealer/Auckland+Region
http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland+Region
With a few URL parameters
And with location duplication
http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland
Let me know if you need any more info!
Cheers
Kim
-
What does the content look like on the new URL? Can you give us an example?
Related Questions
-
Need to update Google Search Console profile for http to https change. Will a "change of address" option suffice or do we need to create a new GSC profile?
In the past I have seen most clients create a new Google Search Console profile when they update to an https URL. However, a colleague of mine asked if just updating the change-of-address option will suffice: https://support.google.com/webmasters/answer/83106. Would it be best to just use the change-of-address option for the Google Search Console profile to keep the data seamless? Thanks
Intermediate & Advanced SEO | RosemaryB
-
Redirect to URL with parameter
I have a wiki (wiki 1) where many of the pages are well indexed in Google. Because of a product change, I had to create a new wiki (wiki 2) for the new version of my product. Now that most of my customers are using the new version, I'd like to redirect users from wiki 1 to wiki 2. An example redirect could be from wiki1.website.com/how_to_build_kitchen to wiki2.website.com/how_to_build_kitchen. Because of a technical issue, the URL I redirect to needs to have a parameter like "?", so the example becomes wiki2.website.com/how_to_build_kitchen? Will the search engines see it as two pages with the same content:
wiki2.website.com/how_to_build_kitchen
and
wiki2.website.com/how_to_build_kitchen?
And will the SEO juice from wiki1.website.com/how_to_build_kitchen be transferred to wiki2.website.com/how_to_build_kitchen?
Intermediate & Advanced SEO | Debitoor
-
How should I handle URLs created by an internal search engine?
Hi, I'm aware that internal search result URLs (www.example.co.uk/catalogsearch/result/?q=searchterm) should ideally be blocked using the robots.txt file. Unfortunately the damage has already been done, and a large number of internal search result URLs have already been created and indexed by Google. I have double-checked, and these pages only account for approximately 1.5% of traffic per month. Is there a way I can remove the internal search URLs that have already been indexed and then stop this from happening in the future? I presume the last part would be to disallow /catalogsearch/ in the robots.txt file. Thanks
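The disallow rule described here would be a one-liner, using the /catalogsearch/ path from the question. Note that already-indexed URLs are usually dealt with first (via a noindex tag or a Search Console removal request), because once a URL is blocked in robots.txt, crawlers can no longer fetch it to see a noindex tag:

```text
# robots.txt - prevents new internal search URLs from being crawled
User-agent: *
Disallow: /catalogsearch/
```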
Intermediate & Advanced SEO | GrappleAgency
-
Does having shorter URLs help with rankings?
Hello here. I own an e-commerce website (virtualsheetmusic.com), and some of our most important category pages have pretty long URLs. Here is an example: http://www.virtualsheetmusic.com/downloads/Indici/Violin.html I am evaluating the possibility of shortening URLs like the above to something like: http://www.virtualsheetmusic.com/violin/ But since it is going to be pretty hard and time-consuming (considering the custom system we have in place on that site), I am trying to find out if it really matters and is worth doing from an SEO standpoint. I am aware that from a user perspective shorter URLs are preferable, and we plan to pursue a better URL architecture on our website in the near future just for that, but this question, at the moment, should be strictly related to SEO. Any thoughts on this topic are very welcome!
Intermediate & Advanced SEO | fablau
-
Is this URL structure spammy?
Hey guys/gals, I have tried asking this very specific question 3-4 times already, and somehow my question pertaining to my URL structure keeps getting sidetracked and overlooked. I am wondering whether this URL structure could become an issue with Google in the somewhat near future, considering what I have seen go down in the SEO world over the past 2 years. Does this URL structure look spammy? http://www.pcmedicsoncall.com/computer-repair/laptop-repair/ www.pcmedicsoncall.com/computer-repair/laptop-repair/laptop-screen-repair/ Below is a screenshot of the site, which I designed with a silo site architecture. .....PLEASE... look at the picture. Thank you, Marshall SEOMOZ-PC-MEDICS-ON-CALL-1.jpg
Intermediate & Advanced SEO | MarshallThompson31
-
Should Site Search results be blocked from search engines?
What are the advantages & disadvantages of letting Google crawl site search results? We currently have them blocked via robots.txt, so I'm not sure if we're missing out on potential traffic. Thanks!
Intermediate & Advanced SEO | pbhatt
-
Search Engine Pingler
Hello everyone, it's me again 😉 I've just got a Pro membership on SEOmoz and I am full of questions. A few days ago I found a very interesting tool called Search Engine Pingler, and its description reads like this: your website or your page was published a long time ago, but you cannot find it on Google, because Google has not indexed your site. The Search Engine Pingler tool will assist you: it will ping the URL of your page to more than 80 servers of Google and other search engines, informing the search engines to come and index your site. So my question is: does that tool really help to increase the indexation of a link by a search engine like Google? If not, please explain what its real purpose is. Thank you to the future guru who can give the right answer 🙂
Intermediate & Advanced SEO | smokin_ace
-
Capitals in URL creates duplicate content?
Hey guys, I had a quick look around, however I couldn't find a specific answer to this. Currently, the SEOmoz tools come back and show a heap of duplicate content on my site, and there's a fair bit of it. However, a heap of those errors relate to random capitals in the URLs. For example, "www.website.com.au/Home/information/Stuff" is being treated as duplicate content of "www.website.com.au/home/information/stuff" (note the difference in capitals). Anyone have any recommendations as to how to fix this server-side (keeping in mind it's not practical or possible to fix all of these links) or to tell Google to ignore the capitalisation? Any help is greatly appreciated. LM.
Intermediate & Advanced SEO | CarlS
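For what it's worth, one common server-side fix for the capitalisation issue is a sitewide lowercase redirect. A sketch for Apache, assuming mod_rewrite and access to the server or virtual-host config (the RewriteMap directive cannot be declared in .htaccess):

```apache
# In the server or virtual host config: define a lowercasing map
RewriteMap lc int:tolower

# Then 301-redirect any URL containing uppercase letters to its lowercase form
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Alternatively, a rel=canonical tag pointing at the lowercase version tells Google which variant to index without touching the existing links.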