Why is Google Webmaster Tools ignoring my URL parameter settings?
-
I have set up several URL parameters in Webmaster Tools that do things like select a specific product's colour or size. I have set each parameter in Google to "narrows" the page and chosen to crawl no URLs, but in the duplicate content section each of these is still shown as two pages with the same content. Is this just normal, i.e. showing me that they are the same anyway, or is Google deliberately ignoring my settings (which I assume it does when it is sure it knows better or thinks I have made a mistake)?
-
It's been about a month, but I'll give it a bit longer. Thanks.
-
Allow a couple of months to see the changes. If they were made recently, Google will take a while to remove the duplicate content errors.
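Whatever Google ends up doing with the parameter settings, a rel=canonical on the parameterised pages gives it an unambiguous second signal. A minimal sketch of building the canonical URL by stripping the narrowing parameters (the parameter names here are hypothetical, based on the colour/size example above):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only narrow the page content (assumed names for this site)
NARROWING_PARAMS = {"colour", "size"}

def canonical_url(url):
    """Return the URL with narrowing parameters removed, for use in rel=canonical."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NARROWING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/product?id=42&colour=red&size=m"))
# https://example.com/product?id=42
```

The returned URL would go into `<link rel="canonical" href="...">` on every colour/size variant of the page.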
Related Questions
-
What to do with existing URL when replatforming and new URL is the same?
We are changing CMS from WordPress to Uberflip. If a URL remains the same, I believe we should not create a redirect. However, what happens to the old page? Should it be deleted?
Technical SEO | maland
URL structure
Hello guys, a quick question regarding URL structure. One of our clients is a hotel chain; they have a group site, www.example.com, and each property is located in a subfolder: www.example.com/example-boston.html, www.example.com/example-ny.html, etc. My question is: where is it better to place the language extension at a subfolder level? Should I go for www.example.com/en/example-ny.html, or is it preferable to specify the language after the property name, as in www.example.com/example-ny/en/accommodation.html? Thanks and regards, Alessio
Technical SEO | travelclickseo
Webmaster Tools Search Queries Data Drop
Hi, I'm seeing a significant drop in search queries being reported for a client in GWT, starting on the 7th of February. I have seen a few articles on SERoundTable etc. saying that many are reporting problems such as delays with GWT updating its data, for example: https://www.seroundtable.com/google-webmaster-tools-data-stalled-19854.html https://www.seroundtable.com/google-webmaster-tools-analytics-data-19870.html However, these seem to suggest the problem is simply a delay in updating the displayed data; in the case I'm looking at, the data is up to date but shows an increasing decline. When I look at the Analytics data, though, it is completely different. For example, GWT says that on the 21st of February there were 23 impressions with zero clicks, but Analytics says there were 6 clicks/sessions from organic search. I take it this means there is likely a problem with the GWT data and I shouldn't worry? All the best, Dan
Technical SEO | Dan-Lawrence
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools, and I was wondering whether anyone who uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing it. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it gives the following options for "Which URLs with this parameter should Googlebot crawl?": Let Googlebot decide (default); Every URL (the page content changes for each value); Only URLs with value (may hide content from Googlebot); No URLs. I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
Technical SEO | Marketing.SCG
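Independent of which GWT option is chosen, it helps to be able to tell programmatically which URLs are filtered duplicates of the category page. A sketch using Python's standard library (the parameter names are the ones mentioned in the question, plus assumed extras; check your own store's URLs):

```python
from urllib.parse import urlsplit, parse_qsl

# Magento layered-navigation parameters (assumed names; verify against your store)
FILTER_PARAMS = {"manufacturer", "price", "color", "size"}

def is_filtered_view(url):
    """True when the URL carries a layered-navigation parameter, i.e. it is a
    narrowed duplicate of the category page rather than unique content."""
    params = {name for name, _ in parse_qsl(urlsplit(url).query)}
    return bool(params & FILTER_PARAMS)

print(is_filtered_view("https://shop.example.com/shoes.html?price=10-20"))  # True
print(is_filtered_view("https://shop.example.com/shoes.html"))              # False
```

The same test could drive a template that emits a `noindex, follow` robots meta tag on filtered views, as an on-site backup to the GWT setting.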
Google Webmaster Site Performance
In Webmaster Tools, under Labs > Site Performance, Google provides your average page load time. When Google grades a page, does it use how long that specific page takes to load, or does it use the overall average page load time for the domain, as provided in Labs > Site Performance?
Technical SEO | Bucky
Q Parameters
I'm having several site issues, and I want to see if the q parameter in the URL is the cause. Both of these get indexed, and any capitalisation combination brings up another indexed page: http://www.website.com/index.php?q=contact-us and http://www.website.com/index.php?q=cOntact-us. The other issue is Google crawl errors: the website has been receiving increasingly more spam crawl errors. I've read that this is a common issue and is most likely a Googlebot problem. Would removing the q parameter fix this entirely? Here is an example: http://www.website/index.php?q=uk-cheap-chloe-bay-bag-wholesale-shoes
Technical SEO | DanSpeicher
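A quick way to confirm that the capitalisation variants collapse to the same page identifier (a standard-library sketch; the on-site fix would normally be a 301 from the mixed-case variant to the lowercase one):

```python
from urllib.parse import urlsplit, parse_qsl

def q_value(url):
    """Extract the q parameter, lowercased, so capitalisation variants compare equal."""
    query = dict(parse_qsl(urlsplit(url).query))
    return query.get("q", "").lower()

print(q_value("http://www.website.com/index.php?q=cOntact-us"))  # contact-us
print(q_value("http://www.website.com/index.php?q=contact-us") ==
      q_value("http://www.website.com/index.php?q=cOntact-us"))  # True
```

If the incoming q differs from its lowercase form, the front controller could issue a 301 to the lowercase URL, so only one casing ever gets indexed.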
Parameter handling (where to find all parameters to handle)?
Google recently said they updated their parameter handling, but I was wondering: what is the best way to know all of the parameters that need "handling"? Will Google Webmaster Tools find them? Should the company know based on what is on their site? Thanks!
Technical SEO | nicole.healthline
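Besides waiting for Webmaster Tools to discover parameters, you can enumerate them yourself from server logs or a crawl export. A small sketch that collects every distinct query-parameter name seen across a list of URLs (the example URLs are hypothetical):

```python
from urllib.parse import urlsplit, parse_qsl

def parameter_names(urls):
    """Collect every distinct query-parameter name across a list of URLs
    (e.g. pulled from server logs or a crawl), for review in parameter handling."""
    names = set()
    for url in urls:
        for name, _ in parse_qsl(urlsplit(url).query, keep_blank_values=True):
            names.add(name)
    return sorted(names)

urls = [
    "https://example.com/?q=shoes&sort=price",
    "https://example.com/list?page=2&sort=name",
]
print(parameter_names(urls))  # ['page', 'q', 'sort']
```

Running this over a full access log gives a complete inventory to check against whatever the Webmaster Tools parameter report shows.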
Remove Deleted (but indexed) Pages Through Webmaster Tools?
I run a blog/directory site. Recently, I changed directory software and, as a result, Google is showing 404 Not Found crawl errors for about 750 non-existent pages. Some have suggested that I should implement 301 redirects, but I can't see the wisdom in this, as the pages are obscure, unlikely to appear in search, and have been deleted. Is the best course to simply enter each 404 error page manually into the Remove Page option in Webmaster Tools? Will entering deleted pages into the Removal area hurt other healthy pages on my site?
Technical SEO | JSOC
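One alternative to manual removal requests, not mentioned in the thread above but commonly suggested, is to return 410 Gone rather than 404 for the deliberately deleted paths, which tells Google the removal is permanent. A minimal status-selection sketch with hypothetical paths:

```python
# Paths of directory pages deleted in the software migration (hypothetical examples)
REMOVED_PATHS = {"/directory/old-listing-1", "/directory/old-listing-2"}

def status_for(path, page_exists):
    """Pick an HTTP status: 410 for known-deleted pages, 404 for anything else missing."""
    if page_exists:
        return 200
    return 410 if path in REMOVED_PATHS else 404

print(status_for("/directory/old-listing-1", page_exists=False))  # 410
print(status_for("/no-such-page", page_exists=False))             # 404
```

In practice `REMOVED_PATHS` would be the list of ~750 old directory URLs exported from the previous software, wired into the server or framework's error handling.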