URL Parameters
-
Hi Moz Community,
I'm working on a website that has URL parameters. After crawling the site, I implemented canonical tags on all of these URLs to prevent them from being indexed by Google. However, today I found out that Google has indexed plenty of the URL parameters.
1- Some of these URLs have canonical tags, yet they are still indexed and live.
2- Some can't be discovered through site crawling, and they result in 5xx server errors.
Is there anything else I can do (other than adding canonical tags), and how can I discover URL parameters that are indexed but not visible through site crawling?
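For reference, here's how I've been spot-checking whether the canonical tag actually renders on a given URL - a minimal Python sketch using only the standard library (the sample URL and markup below are hypothetical; in practice you would fetch each parameterised URL, e.g. with urllib.request, and feed the response body in):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def extract_canonical(html: str):
    """Return the canonical href, or None if the page declares none."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

# Hypothetical page markup for illustration.
sample = ('<html><head>'
          '<link rel="canonical" href="https://example.com/find-stockist">'
          '</head></html>')
print(extract_canonical(sample))  # https://example.com/find-stockist
```

If this returns None for a URL you believe is canonicalized (or the fetch itself errors with a 5xx), the tag never reaches Google in the first place.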
Thanks in advance!
-
I'm also facing the same problem with my website pages. My Blackpods Pro website pages don't show the exact permalink URLs.
-
Hi there,
Thanks very much for your response. I checked the sitemap and there are no URL parameters listed - only the canonical URLs are listed in the sitemap.
If you have any other suggestions, they'd be much appreciated.
Thank you!
-
Hi Rajesh,
Thank you for your response. I can't share the website due to client confidentiality, but basically, when I search for a stockist of {brand name}, Google lists URLs similar to the ones below on the first page. The pages show a list of stockists depending on product availability:
1- website.com/find-stockist?model=10 (5xx status code)
2- website.com/find-stockist?model=11 (200 status code)
3- website.com/find-stockist?model=10 (5xx status code)
4- website.com/find-stockist?model=11 (200 status code)
Thank you!
-
Hi Gaston,
Thanks very much for your time. The canonicals were implemented around a month ago, and the pages are almost identical. I discovered all the URL parameters without performing an advanced search.
Also, I came across the 5xx errors when I clicked indexed URL parameters on the Google SERP, and I can't discover them when I crawl the site with Screaming Frog.
I'd appreciate any other suggestions based on your experience!
Many thanks
-
Just so you know, if a URL returns a 5xx server error, it usually won't render your canonical tag to begin with! You might also want to check your XML sitemap to make sure it isn't 'undoing' your canonical tags by feeding these URLs to Google. Indexation tags must be perfectly aligned with your XML sitemap, or you are sending Google mixed messages (e.g. a URL is in the XML sitemap, so Google should index it, but when it is crawled it contains a canonical tag citing itself as non-canonical, which is the opposite signal).
Everything Gaston said is right on the money.
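That alignment check is easy to automate once you have the sitemap and a crawl's canonical data. A minimal stdlib-only sketch (the sitemap and canonical map below are hypothetical) that flags URLs listed in the sitemap whose canonical tag points somewhere else:

```python
import xml.etree.ElementTree as ET

# Sitemap <loc> elements live in this XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> set:
    """Extract all <loc> entries from a sitemap XML string."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

def mixed_signals(xml_text: str, canonicals: dict) -> set:
    """Sitemap URLs whose crawled canonical tag points to a different URL."""
    return {u for u in sitemap_urls(xml_text)
            if canonicals.get(u) not in (None, u)}

# Hypothetical sitemap and crawl output for illustration.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/find-stockist</loc></url>
  <url><loc>https://example.com/find-stockist?model=11</loc></url>
</urlset>"""

crawled_canonicals = {
    "https://example.com/find-stockist": "https://example.com/find-stockist",
    "https://example.com/find-stockist?model=11": "https://example.com/find-stockist",
}
print(mixed_signals(sitemap, crawled_canonicals))
# {'https://example.com/find-stockist?model=11'}
```

Any URL this flags is being submitted for indexing by the sitemap while simultaneously declaring itself non-canonical - exactly the mixed message described above.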
-
I think you need to show some examples.
-
Hi there,
It's important to note that canonicals are a signal, not a directive. Google may obey them if its algorithm considers that those pages really are duplicates of each other.
In my experience, this does not happen immediately; it usually takes Google some time to figure out whether the canonicalization is correct. Keep in mind that pages being canonicalized HAVE TO be nearly identical and cover the same topic.
On the indexation part: pages can be indexed yet only be shown when you search for that specific URL or use an advanced search operator (such as site:).
More information about canonicals:
- Consolidate duplicate URLs - Google Search support
Regarding the second issue: if by "site crawling" you mean crawling with an external tool, such as Screaming Frog or Moz, you may be getting 5xx errors because that tool is making too many requests; try lowering its crawl speed. I know for a fact that Screaming Frog allows you to do that.
But, unfortunately, I don't know any way of discovering URL parameters in bulk other than using an external tool.
Hope it helps,
Best of luck.
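P.S. One stdlib-only way to at least inventory parameters in bulk from any URL list you can export (server logs, a Search Console page export, a SERP scrape) - a minimal sketch with hypothetical URLs:

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl

def group_by_parameter(urls):
    """Map each query-parameter name to the set of paths it appears on."""
    groups = defaultdict(set)
    for url in urls:
        parts = urlparse(url)
        for name, _value in parse_qsl(parts.query):
            groups[name].add(parts.path)
    return dict(groups)

# Hypothetical export from server logs / Search Console.
urls = [
    "https://website.com/find-stockist?model=10",
    "https://website.com/find-stockist?model=11",
    "https://website.com/products?sort=price&page=2",
]
print(group_by_parameter(urls))
# {'model': {'/find-stockist'}, 'sort': {'/products'}, 'page': {'/products'}}
```

This won't tell you what Google has indexed, but it does surface every parameter your own data knows about, so you can then check each one against the index.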