Webmaster tool parameters
-
Hey forum,
About my site, idealchooser.com: a few weeks ago I defined a parameter "sort" in Google Webmaster Tools, with Effect: "Sorts" and Crawl: "No URLs". The logic is simple: I don't want Google to crawl and index the same pages with different sort parameters, only the default page without the parameter.
The weird thing is that under "HTML Improvements", Google keeps finding "Duplicate Title Tags" for the exact same pages with different sort parameters. For example:
/shop/Kids-Pants/16/
/shop/Kids-Pants/16/?sort=Price
/shop/Kids-Pants/16/?sort=PriceHi
These aren't old pages; they were flagged by Google as duplicates weeks after the sort parameter was defined.
Any idea how to solve this? It seems like Google is ignoring my parameter handling settings.
Thank you.
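(For context, the fix usually suggested alongside parameter settings is a rel=canonical tag on every sorted variant, pointing at the clean URL. A minimal sketch using the example path above; the absolute URL is an assumption about the site's preferred domain:)

```html
<!-- In the <head> of /shop/Kids-Pants/16/?sort=Price and every other sort variant -->
<link rel="canonical" href="http://www.idealchooser.com/shop/Kids-Pants/16/" />
```

Unlike the parameter tool, this is a hint all major search engines understand, and it consolidates duplicate-title signals onto the canonical URL.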
-
I just thought of something else: if I disallow this parameter in robots.txt, wouldn't that kill all the rankings from external links pointing to pages that contain this parameter?
-
It's been at least 6 weeks since I set it up, probably more like 2 months.
Thanks for the robots.txt idea, I'll give it a shot. However, I'd still like to figure out what is going on with the Google parameter definition.
-
How long ago did you set this up? It can take some time before you see changes in Google Webmaster Tools. That said, I personally don't like Google's parameter handling tool, because it only applies to Google, so your site will still look duplicated to other search engines. I would rather fix this in robots.txt, with a simple directive like: Disallow: /*?sort= (note the leading wildcard, so the rule matches the parameter on any path; /?sort=* would only match it at the site root).
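A fuller robots.txt sketch of that suggestion, written with a leading wildcard so the rule catches the sort parameter on any path rather than only at the root:

```
User-agent: *
Disallow: /*?sort=
```

Two caveats worth hedging: the * wildcard is a de-facto extension honoured by the major engines, not part of the original robots.txt standard; and disallowing the URLs blocks crawling, not indexing, so links pointing at the blocked variants stop passing signals, which is exactly the concern raised earlier in the thread.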
Related Questions
-
Have I set up my structured data correctly? The testing tool suggests not.
Hi, I've recently marked up some Events for a client in the hope that they'll appear as rich snippets in their SERPs. I have access to their Google Search Console, so I used the Data Highlighter facility to mark them up, rather than the Raven plugin available for WordPress sites like this. I completed this on 10th July and the snippets are yet to appear. I understand that this can take time and there are no guarantees, but as a novice it would be reassuring if someone could confirm that I have done this correctly. We did, incidentally, resubmit a sitemap after completing this task, but I'm not sure if that makes any difference. I've read that the Structured Data Testing Tool is what I need to use to test my markup, but when I input the URLs below, the tool doesn't tell me a lot, which either suggests I've marked it up incorrectly, or I don't know how to read it!
http://www.ad-esse.com/events/19th-august-2015-reducing-costs-changing-culture-improving-services/
http://www.ad-esse.com/events/160915-reducing-costs-changing-culture-improving-services-london/
http://www.ad-esse.com/events/151015-reducing-costs-changing-culture-improving-services-london/
Any guidance welcomed! Many thanks,
Nathan
Intermediate & Advanced SEO | nathangdavidson
-
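One likely explanation: as far as I know, the Data Highlighter stores its annotations on Google's side and leaves no markup in the page source, so the Structured Data Testing Tool, which only reads on-page markup, has nothing to show for those URLs. If you want the tool to validate something, an on-page JSON-LD Event block is the alternative. A sketch below; the name, date, and location are illustrative guesses based loosely on the first URL, not confirmed details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Reducing Costs, Changing Culture, Improving Services",
  "startDate": "2015-08-19",
  "location": {
    "@type": "Place",
    "name": "Event venue (placeholder)",
    "address": "Address goes here"
  }
}
</script>
```

Pasting a page containing markup like this into the testing tool should then list the detected Event item.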
What exactly is an impression in Google Webmaster Tools search queries with the image filter turned on?
Is it when someone does an image search? Or does it count a regular search that has images in it? On an image search does the picture actually have to be viewed on the screen or can it be below in the infinite scroll?
Intermediate & Advanced SEO | EcommerceSite
-
Should I delete 'Data Highlighter' mark-up in Webmaster Tools after adding schema.org mark-up?
LEDSupply.com is my site, and before becoming familiar with schema mark-up I used the Data Highlighter in Webmaster Tools to mark up as much of the site as I could. Now that schema is set up, I'm wondering if having both active is bad, and am thinking I should delete the previous work done with the Data Highlighter tool. To delete or not to delete? Thank you!
Intermediate & Advanced SEO | saultienut
-
URL Parameter & crawl stats
Hey guys, I recently used the URL parameter tool in Webmaster Tools to mark different URLs that offer the same content. I have the parameters "?source=site1", "?source=site2", etc., so a URL looks like this: www.example.com/article/12?source=site1. The "source" parameters are feeds we provide to partner sites, so we can track the referring site with our internal analytics platform. Although pages like www.example.com/article/12?source=site1 have a canonical to the original page www.example.com/article/12, Google indexed both URLs:
www.example.com/article/12?source=site1
www.example.com/article/12
Last week I used the URL parameter tool to mark the "source" parameter as "No, this parameter doesn't affect page content (track usage)", and today I see a 40% decrease in my crawl stats. On one hand it makes sense that Google is no longer crawling the repeated URLs with different sources, but on the other hand I thought more efficient crawlability would increase my crawl stats. In addition, Google is still indexing the same pages with different source parameters. Has anyone experienced something similar: when crawl efficiency improves, should I expect my crawl stats to go up or down? I really appreciate all the help! Thanks!
Intermediate & Advanced SEO | Mr.bfz
-
Webmaster tools 404
Hey, I'm getting a soft 404 error on a webpage that has content and is definitely not a 404. We've redirected a load of URLs to the page. The old URLs had parameters which are no longer used by the new URL, but these parameters have been carried over in the redirect. Is this what's causing the soft 404 error, or is there another problem that may need addressing? Also, a canonical has been set on the page. Thanks, Luke.
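If the inherited query string is the suspect, the redirect can be told to drop it. A minimal sketch for Apache, assuming mod_rewrite is available and using placeholder paths (old-page/new-page are not the real URLs):

```apache
RewriteEngine On
# The trailing "?" on the target discards the inherited query string;
# on Apache 2.4+ the [QSD] flag does the same thing more explicitly.
RewriteRule ^old-page$ /new-page? [R=301,L]
```

Redirecting to a clean URL removes one reason for Google to treat the target as a near-duplicate soft 404.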
Intermediate & Advanced SEO | NoisyLittleMonkey
-
What is the best tool to crawl a site with millions of pages?
I want to crawl a site that has so many pages that Xenu and Screaming Frog keep crashing at some point after 200,000 pages. What tools will allow me to crawl a site with millions of pages without crashing?
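For a DIY fallback, the usual reason desktop crawlers die at that scale is that the visited-URL set outgrows RAM. A rough Python sketch of the workaround, keeping the seen set in SQLite (on disk, if you pass a filename) instead of memory; fetch_links is a hypothetical stand-in for a real HTTP fetch plus link extraction:

```python
import sqlite3
from collections import deque

def crawl(start, fetch_links, db_path=":memory:", limit=10_000_000):
    """Breadth-first crawl whose visited-URL set lives in SQLite,
    so memory stays roughly flat even across millions of pages."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS seen (url TEXT PRIMARY KEY)")
    db.execute("INSERT OR IGNORE INTO seen VALUES (?)", (start,))
    queue = deque([start])
    crawled = []
    while queue and len(crawled) < limit:
        url = queue.popleft()
        crawled.append(url)  # a real crawler would fetch and record the page here
        for link in fetch_links(url):
            cur = db.execute("INSERT OR IGNORE INTO seen VALUES (?)", (link,))
            if cur.rowcount == 1:  # rowcount is 1 only when the URL was new
                queue.append(link)
    db.commit()
    return crawled
```

The frontier queue can also grow large on a big site, so production crawlers spool that to disk too; this sketch only moves the (much larger) seen set out of RAM.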
Intermediate & Advanced SEO | iCrossing_UK
-
Magic keywords in Google Webmaster Tools
Hi all, I recently moved a friend to a new WordPress back-end website, as they were on Flash, which is pretty but not necessarily the best for SEO: http://francesphotography.com
My question: once Google finally indexed the site, I noticed in Google Webmaster Tools that it found the most significant keyword to be "automatically", on the following top pages:
tag/snow-boarding-photography/
tag/style-photography/
tag/underwater-photography/
tag/vacation-photography/
tag/wedding-photography-beaver-creek/
tag/wedding-photography-copper-mountain/
tag/wedding-photography-denver/
tag/wedding-photography/
underwater-photography-scuba-diving-cozumel-mexico/
wedding-photography/
The goofy thing is I can't find anywhere that "automatically" is actually used - perhaps it's coming from a plug-in, or magic keyword beans that Google found? Any guidance is appreciated.
Intermediate & Advanced SEO | BoulderJoe