Should we block a URL parameter in Webmaster Tools after a URL migration?
-
Hi,
We have just released a new version of our website that now has nice, human-readable URLs. Our old, ugly URLs are still accessible and cannot be blocked or redirected. These old URLs use a URL parameter containing an XPath-like expression language to define a location in our catalog.
We have about 2 million pages indexed with this old URL parameter, but only approximately 70k nice URLs after the migration. The high number of old URLs is due to faceting that was done via this URL parameter.
I wonder if we should now completely block this URL parameter in Google Webmaster Tools so that the ugly URLs are removed from the Google index. Or would that harm our position in Google?
Thanks,
Chris
-
You said that the old URLs cannot be redirected.
In that case, why not implement rel=canonical on the important pages, to avoid losing the link authority that may have built up naturally over the years?
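For reference, a rel=canonical implementation is just a link element in the head of each old, parameterized page, pointing at the matching new clean URL. A minimal sketch (the URLs here are invented for illustration):

```html
<!-- Served in the <head> of the old page, e.g. /catalog?path=/shoes/mens -->
<link rel="canonical" href="http://www.example.com/shoes/mens/" />
```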
The rest of the URLs can be blocked through GWT using the URL parameter tool. Since you have a new version of the site, you may want to concentrate only on the new pages; even though the old pages are not redirected, you will still benefit.
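Parameter handling in GWT is configured through the UI, but the same crawl restriction can also be expressed directly in robots.txt. A minimal sketch, assuming (hypothetically) that the legacy parameter is called `path` — the real parameter name will differ:

```
# Block crawling of any URL that carries the legacy catalog parameter,
# whether it appears first or later in the query string.
User-agent: *
Disallow: /*?path=
Disallow: /*&path=
```

One caveat: a URL blocked in robots.txt cannot be crawled, so Google will never see a rel=canonical tag on it. That is another reason to treat canonical and blocking as alternatives rather than combining them on the same URLs.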
I would suggest going with canonical wherever possible and using URL parameters only as a last resort.
Note: for rel=canonical to work, the pages need to be substantially similar.
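With roughly 2 million old URLs, the old-to-new mapping for the canonical tags would likely be generated rather than hand-written. A hypothetical sketch in Python, assuming the legacy parameter is named `path` and using a hard-coded dict to stand in for the catalog database that maps legacy path expressions to clean URLs:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical mapping from legacy catalog-path expressions to the new
# clean URLs; in practice this would come from the catalog database.
LEGACY_TO_CLEAN = {
    "/shoes/mens": "http://www.example.com/shoes/mens/",
}

def canonical_for(old_url):
    """Return the clean URL an old parameterized URL should point to,
    or None if the legacy expression has no clean equivalent."""
    query = parse_qs(urlsplit(old_url).query)
    path_expr = query.get("path", [None])[0]  # 'path' is an assumed name
    return LEGACY_TO_CLEAN.get(path_expr)

print(canonical_for("http://www.example.com/catalog?path=/shoes/mens"))
# -> http://www.example.com/shoes/mens/
```

Faceted variants with no clean equivalent return None here; those would be the URLs left to the parameter tool rather than given a canonical.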