Correct URL Parameters for GWT?
-
Hi,
I am just double-checking to see if these parameters are OK. I have added an attachment to this post.
We are running an e-commerce store with faceted navigation, so I excluded a lot of parameters from being crawled as I didn't want them indexed (they got indexed anyway!).
Advice and recommendations on the use of GWT would be very helpful - please check my screenshot.
thanks,
-
Hello BJS,
The parameters look OK. Keep in mind that adjusting parameter handling in GWT is more of a band-aid than a cure. There are lots of other ways to handle faceted navigation, including rel canonical, robots meta tags, robots.txt and others. Read this article if you haven't already:
http://moz.com/blog/building-faceted-navigation-that-doesnt-suck
And here's a Whiteboard Friday on the topic: http://moz.com/blog/whiteboard-friday-faceted-navigation
Keep a close eye on rankings once you've implemented those parameter handling rules. Be sure to annotate your analytics account with the date on which this was done for possible future diagnostics. See: http://www.google.com/analytics/features/annotations.html
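As a quick illustration of the robots meta tag option mentioned above, a faceted page you want crawled but kept out of the index would carry something like this in its head (a generic sketch, not specific to your site):

```html
<!-- Tells crawlers: don't index this page, but do follow its links -->
<meta name="robots" content="noindex, follow">
```

This is often a better fit for faceted pages than robots.txt, because the bot can still crawl through the page and pass link equity onward.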
-
It seems that you're putting all sorts of work into this issue, trying to solve it in the most difficult way possible, and still not getting the results you want, which leads you to do even more work.
Try breaking the issue down into three categories: "Do not index", "Please index", and "Group these together".
Now, in your robots.txt file, use the syntax "Disallow: /high-level-path", which tells Googlebot not to crawl any URL on your site whose path begins with "/high-level-path", including everything nested below it. Note that Disallow rules take paths relative to your domain root (forward slashes, not backslashes), not full domain names; a bare "Disallow: /" would block your entire site. Also keep in mind that robots.txt blocks crawling, not indexing, so URLs that are linked from elsewhere can still end up in the index.
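For example, a faceted store's robots.txt might look something like this (the facet paths and parameter names here are placeholders; substitute your own):

```
User-agent: *
# Block crawl of faceted/filtered listing paths
Disallow: /filter/
Disallow: /search/
# Block URLs carrying sort/facet query parameters
Disallow: /*?sort=
Disallow: /*?color=

Sitemap: http://www.example.com/sitemap.xml
```

The `*` wildcard in Disallow patterns is supported by Googlebot, though it is an extension beyond the original robots.txt standard.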
Second issue: instead of trying to include all the rules in GWT, it would probably be simpler to list all the URLs you definitely want indexed in your sitemap.xml file. The free program "SoftPlus GSiteCrawler" will automatically build this file for you.
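A minimal sitemap.xml looks like this, per the sitemaps.org protocol (the example.com URL is a placeholder for one of your canonical product or category pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you definitely want indexed -->
    <loc>http://www.example.com/category/product-page</loc>
    <lastmod>2014-05-01</lastmod>
  </url>
</urlset>
```

Listing only your clean, canonical URLs here gives Google a positive signal about which versions you care about, complementing the Disallow rules above.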
Lastly: the canonical tag is an easy way to tell Google which pages are duplicates or near-duplicates of one another, like the variations created by your "page" parameter.
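For instance, every paginated or filtered variation of a category page can point back to the main version with a link element in its head (the URLs here are placeholders):

```html
<!-- Placed in the <head> of http://www.example.com/shoes?page=2&color=red -->
<link rel="canonical" href="http://www.example.com/shoes" />
```

Unlike robots.txt, this lets Google crawl the variations but consolidate their signals onto the one URL you want ranking.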
Related Questions
-
What is a good URL structure for blog posts?
Please let me know what is a good URL structure for blog posts: http://www.abc.com/postname/ or http://www.abc.com/%category%/%postname%? If we use a category, can we name it "blog" (like website/blog/postname), or is it better to use actual categories? And how many categories can we use?
Intermediate & Advanced SEO | Michael.Leonard
-
Google Keyword Planner tool is not correct
Hi all, you all know about the Google Keyword Planner tool. As far as I can tell, the search volumes it shows are often totally wrong. One keyword shows fewer than 10 searches, but I got 20 exact-match hits for it in a single business day, while another keyword shows more than 10K searches yet brings us only 3-4 hits a day.
Intermediate & Advanced SEO | dotlineseo
-
GWT URL Removal Tool Risky to Use for Duplicate Pages?
I was planning to remove lots of URLs via GWT that are highly duplicated (similar pages exist on other websites across the web). However, this Google article had me a bit concerned: https://support.google.com/webmasters/answer/1269119?hl=en I already have "noindex, follow" on the pages I want removed from the index, but Google seems to take ages to remove pages from the index, and they appear to drag down the unique-content pages on my site.
Intermediate & Advanced SEO | khi5
-
Hash URLs
Hi Mozzers, happy Friday! I have a client that has created some really nice pages from their old content, and we want to redirect the old ones to the new pages. The way the web developers have built these new pages is to use hashbang URLs, for example www.website.co.uk/product#newpage. My question is: can I redirect URLs to these kinds of pages? Would it be done in the .htaccess file? Thanks in advance, Karl
Intermediate & Advanced SEO | KarlBantleman
-
Google: How to See URLs Blocked by Robots?
Google Webmaster Tools says we have 17K out of 34K URLs that are blocked by our robots.txt file. How can I see the URLs that are being blocked? Here's our robots.txt file:
User-agent: *
Disallow: /swish.cgi
Disallow: /demo
Disallow: /reviews/review.php/new/
Disallow: /cgi-audiobooksonline/sb/order.cgi
Disallow: /cgi-audiobooksonline/sb/productsearch.cgi
Disallow: /cgi-audiobooksonline/sb/billing.cgi
Disallow: /cgi-audiobooksonline/sb/inv.cgi
Disallow: /cgi-audiobooksonline/sb/new_options.cgi
Disallow: /cgi-audiobooksonline/sb/registration.cgi
Disallow: /cgi-audiobooksonline/sb/tellfriend.cgi
Disallow: /*?gdftrk
Sitemap: http://www.audiobooksonline.com/google-sitemap.xml
Intermediate & Advanced SEO | lbohen
-
SEO Overly-Dynamic URL Website with thousands of URLs
Hello, I have a new client who has a Diablo 3 database. They have created a very interesting site in which every "build" is its own URL. Every page is a list of weapons and gear for the gamer. Readers may love this, but it's a nightmare for SEO. I have pushed for a blog to help generate inbound links and traffic, but overall I feel the main feature of their site is a headache to optimize. They have thousands of pages indexed in Google, but none really stands on its own: there is no strong content, no H tags, or any real substance at all. With no clear definition for each page, Google sees the site as one huge mess, with duplicate page titles and too many on-page links. The first thing I did was tell them to add a canonical link, which dropped the errors by 12K, leaving only 2,400... which is a nice start, but the remaining errors are still a challenge. I'm thinking about seeing if I can either find a way to give each page its own blurb and H tag, or simply noindex the nav bar and all the links in the database. That way the site is left with only a handful of URLs plus the blog and forum. Thoughts?
Intermediate & Advanced SEO | MikePatch
-
Automatic redirect to external urls
Hi, is there a way to create a "bridge page" with an automatic URL redirect (302) without a Google penalty? At the moment, my bridge pages are indexed on Google with the title and description of the redirected page. Thanks in advance. Mauro.
Intermediate & Advanced SEO | raulo79
-
400 errors and URL parameters in Google Webmaster Tools
On our website we do a lot of dynamic resizing of images, using a script which automatically resizes an image depending on parameters in the URL, like: www.mysite.com/images/1234.jpg?width=100&height=200&cut=false In Webmaster Tools I have noticed there are a lot of 400 errors on these images. Also, when I click the URLs listed as causing the errors, they are URL-encoded and go to pages like this (which gives a bad request): www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse What are your thoughts on what I should do to stop this? I notice in my Webmaster Tools "URL Parameters" there are parameters for height, width, and cut, which must come from the image URLs. These are currently set to "Let Google decide", but should I change them manually to "Doesn't affect page content"? Thanks in advance
Intermediate & Advanced SEO | James77