How do I know if I'm correctly solving an uppercase URL issue that may be affecting Googlebot?
-
We have a large e-commerce site (10k+ SKUs). https://www.flagandbanner.com.
As I have begun analyzing how to improve it, I have discovered that we have thousands of URLs containing uppercase characters. For instance: https://www.flagandbanner.com/Products/patriotic-paper-lanterns-string-lights.asp.
This is inconsistently applied throughout the site. I directed our website vendor to fix the issue, and they placed 301 redirects via a rule in the web.config file. Any URL that contains an uppercase character now redirects to its all-lowercase version.
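I believe it's the standard IIS URL Rewrite lowercasing pattern, something like this sketch (I haven't verified the vendor's exact rule):

```xml
<!-- Sketch of a typical IIS URL Rewrite lowercasing rule; the vendor's exact rule may differ -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LowercaseRedirect" stopProcessing="true">
        <!-- Match any requested URL that contains an uppercase character -->
        <match url="[A-Z]" ignoreCase="false" />
        <!-- Permanently (301) redirect to the all-lowercase equivalent -->
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```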
However, as I use Screaming Frog to monitor our site, I see all these 301 redirects--thousands of them. The XML sitemap still shows the uppercase versions, and we have had indexing issues as well. So I'm wondering: what is the most effective way to make sure I'm not placing an extra burden on Googlebot when it crawls our site? Should I have just not cared about the uppercase issue and left it alone?
-
Not that I've noticed... I started with the company back in February and noticed the issue when I crawled the site with Screaming Frog, so they already had uppercase and lowercase permalinks back then. When I brought it to our developers' attention, they didn't seem too concerned. Then I saw something somewhere that discussed Google seeing them as potential duplicates, which is when I posted to Moz and got the response that it was fine since we have canonical URLs in place. So it has not had any negative effect that I can see since I started. However, I don't know how to stop Screaming Frog from flagging them as duplicate pages.
-
Thanks for sharing this, Lindsay! Helpful. Have you seen any negative effects stemming from both the uppercase and lowercase URLs still being accessible?
-
I had the same issue in Screaming Frog and posted about it to Moz Q&A a few weeks ago; the thread resolved it:
https://moz.com/community/q/uppercase-lowercase-reading-as-duplicate-permalinks
-
This is really helpful. Thank you!
Mike
-
It was still a good idea to create the redirects for the uppercase versions to help cut down on duplicate-content issues. Rel-canonical "could" have been used, but I find it's much better to actually redirect.
But that means the lowercase URLs are the canonical URLs, so ONLY they should appear in the sitemap. (Sitemaps aren't supposed to contain any URLs that redirect.) Right now, you're giving the search crawlers contradictory directives, and they don't handle those well.
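In other words, every sitemap entry should point straight at the final lowercase URL. An illustrative entry, using one of the product URLs from your question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the final lowercase (canonical) URL, never a version that 301s -->
  <url>
    <loc>https://www.flagandbanner.com/products/patriotic-paper-lanterns-string-lights.asp</loc>
  </url>
</urlset>
```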
For additional cleanup, it would be good to have rules added to the CMS so that uppercase URL slugs cannot be created in the first place. Also run a check (this can probably be done in the database) to ensure that any internal links on the site have been rewritten NOT to use the uppercase URLs. There's no sense generating unnecessary redirects for URLs you control, and I suspect these account for the majority of the cases Screaming Frog is picking up. You need to ensure all navigation and internal links use the canonical lowercase version.
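If your CMS stores page HTML in a database, that check might look roughly like the sketch below. I'm guessing at SQL Server since the site runs classic ASP, and the table and column names are placeholders:

```sql
-- Hypothetical schema: site_content(page_id, body) holds stored page HTML;
-- adjust the table/column names to the actual CMS database.
-- The binary collation forces a case-sensitive comparison, so this finds
-- internal links still written with the old uppercase /Products/ path.
SELECT page_id
FROM   site_content
WHERE  body COLLATE Latin1_General_BIN LIKE '%href="/Products/%';
-- Repeat (or OR together) for any other uppercase slugs the site has used.
```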
The more directly the crawlers can access the final URL, the better your indexing will be. So don't have the sitemap sending them through redirects, and don't let your site's internal links do so either.
Hope that helps?
-
Related Questions
-
Pages with URL Too Long
I manage a number of Shopify stores for ecommerce clients. Moz keeps kindly telling me the URLs are too long. However, this is largely due to the structure of Shopify, which has to include 'collections' and 'products'. For example: https://domain.com.au/collections/collection-name/products/colour-plus-six-to-seven-word-product-name. Moz recommends no more than 75 characters, which leaves us 25-30 characters for both the collection name and the product name. VERY challenging! Questions: Anyone know how big an issue URL length is as a ranking factor? I thought pretty low. If it's not an issue, how can we turn off this alert in Moz? If it is an issue, has anyone got ideas on how to fix it on Shopify sites?
-
Changing URLs: from a short, well-optimised URL to a longer one – what's the traffic risk?
I'm working with a client who has a website that is relatively well optimised, though it has a pretty flat structure and a lot of top-level pages. They've invested in their content over the years and managed to rank well for key search terms. They're currently changing CMS, and as a result of the new folder structure in the CMS, the URLs for some pages look set to change significantly. E.g. the existing URL website.com/grampians-luxury-accommodation ranked quite well for "luxury accommodation grampians"; the new URL when the site launches on the new CMS would be website.com/destinations/victoria/grampians. My feeling is that the client is going to lose out on a bit of traffic as a result of this. I'm looking for information, methods, or case studies to demonstrate the degree of risk and to help make a recommendation to mitigate it.
-
Localized Domain Issue - Can I use Search Console to solve this?
Struggling to resolve a complicated search issue - would appreciate any community input or suggestions.
The Background Info: We have several brand sites, and each one has both a .ca and a .com domain. For some reason, our website platform was created in a way that hundreds of pages on the .com domain have an equivalent page on the .ca domain, all of which are 301'ed to the appropriate .com pages. Example for clarity: www.domain.ca/gadget/brand is 301-redirected to www.domain.com/gadget/brand, while www.domain.ca/gadget/en/brandcanada is the proper Canadian .ca URL (where en is the language - fr exists as well).
The Problem: Because these .com pages exist under the .ca domain as well, they have started to outrank the correct .ca pages on Google. This has led to Canadian customers finding incorrect information, pricing, and reviews for these products, causing all sorts of customer service issues and therefore affecting our sales. I am told that properly fixing the issue and removing the incorrect URLs under the .ca domain would be prohibitively expensive in terms of resources, so I'm left trying to fix this via the means available to me (i.e. anything but a change to how the platform is currently set up).
The Attempted Fix: I've submitted proper sitemaps for the .ca brand sites, and we have also created a robots.txt file that is served only when the site is crawled through the .ca domain. In that robots.txt, we have disallowed crawling of any /gadget/brand URLs on the .ca domain. This was done a week ago, and I am still seeing the .com URL show up in search results.
The Question: Should I be submitting any www.brand.ca/gadget/brand URLs for temporary removal from Google? Because of the 301 redirect in place from www.brand.ca/gadget/brand to www.brand.com/gadget/brand, I am hesitant to do so, as I do not want the .com URL removed. Will Google simply remove the .ca URL and not follow the 301 redirect to remove the .com URL as well? Any additional insight or feedback would be awesome as well.
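For clarity, the robots.txt we serve on the .ca domain is essentially this (a simplified sketch; "brand" stands in for the real path segment):

```
# robots.txt served only on the .ca domain (simplified; real paths differ)
User-agent: *
Disallow: /gadget/brand
```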
-
URL Formatting - Magento
Hi, we are working with a client on Magento whose URLs are formatted in a Google-friendly way (e.g. productname.html, as seen in site searches on Google), but when you click through to the site, #.VEWKQxbc754 (or similar) is appended to the URL. The site is also having some page-indexing problems. Thoughts? Are there specific settings or add-ons in Magento that could cause this?
-
Block Googlebot from submit button
Hi, I have a website where Googlebot performs many searches on our internal search engine. We can put noindex on the result pages, but we want to stop the bot from calling the AJAX search button - a GET form - because each call passes a request to an external API with associated fees. So, we want to stop the form button from being crawled without noindexing the search page itself. The "nofollow" tag doesn't seem to apply to a button's submit. Any suggestion?
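Would a robots.txt rule aimed at the form's GET endpoint be the right approach? A rough sketch (the endpoint path here is hypothetical):

```
# Hypothetical path -- replace with the actual URL the GET form submits to
User-agent: *
Disallow: /ajax/search
```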
-
Googlebot found an extremely high number of URLs on your site
I keep getting the "Googlebot found an extremely high number of URLs on your site" message in the GWMT for one of the sites that I manage. The error is as below: "Googlebot encountered problems while crawling your site. Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site." I understand the nature of the message - the site uses faceted navigation and is genuinely generating a lot of duplicate pages. However, in order to stop this from becoming an issue, we do the following: noindex a large number of pages using the on-page meta tag, and use a canonical tag where it is appropriate. But we still get the error, and a lot of the example pages that Google suggests are affected are actually pages carrying the noindex tag. So my question is: how do I address this problem? I'm thinking that, as it's a crawling issue, the solution might involve the nofollow meta tag. Any suggestions appreciated.
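For reference, the two measures we use look like this (the canonical URL is a placeholder):

```html
<!-- Measure 1: meta noindex on the duplicate faceted pages -->
<meta name="robots" content="noindex, follow">

<!-- Measure 2: rel-canonical where appropriate (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/category/widgets">
```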
-
Large Site SEO - Dev Issue Forcing URL Change - 301, 302, Block, What To Do?
Hola, thanks in advance for reading and trying to help me out. A client of mine recently created a large-scale company directory (500k+ pages) in Drupal v6, while the "marketing" pages of their site were still manual hard-coded HTML. They redesigned their marketing pages, but used Drupal v7. They're now experiencing server conflicts, with the two instances of Drupal unable to communicate or coexist on the same server. Eventually the directory will be upgraded to Drupal v7, but that could take weeks to months, and the client does not want to wait for the re-launch. The client wants to push the new marketing site live, but also does not want to ruin the overall SEO value of the directory. They have a few options, and I'm looking to help guide them down the path of least resistance:
Option 1: Move the company directory onto a subdomain and the marketing site onto the www subdomain. The client gets to push their redesign live, but large-scale 301s to the directory shake up the structure of the site, with ripple effects that pull pages out of the index for days to weeks. Rankings and traffic drop, subdomain authority gets lost, and the company directory's health looks bad for weeks to months. However, the 301s maintain partial SEO value, and some long-tail traffic still exists. Once the directory is moved to Drupal v7, the 301s to the subdomain would be cancelled and the directory would revert to the original www URLs.
Option 2: Block the company directory from search engines with robots.txt and meta instructions, essentially cutting it off from the established marketing pages. No major scaling 301 ripple effect; the directory takes a few weeks to filter out of the index, and traffic is completely lost. However, once Drupal v7 is in place and the directory is re-opened, it would slowly regain SEO value and get close to its old rankings, traffic, etc.
Option 3: A 302 redirect? That loses all accumulated SEO value temporarily... hmm.
Option 4: Something else? As you can see, this is not an ideal situation. However, a decision has to be made, and I'm looking to choose the lesser of the evils. Any help is greatly appreciated. Thanks again - Chris
-
New AddThis URL Sharing
So, AddThis just added a cool feature that attempts to track when people share URLs by cutting and pasting the address from the browser. It appears to do so by adding a URL fragment to the end of the URL, hoping that the person sharing will cut and paste the entire thing. That seems like a reasonable assumption to me. Unless I misunderstand, it seems like it will add a fragment to every URL (since it's trying to track all of 'em). Probably not a huge issue for the search engines when they crawl, as they'll, hopefully, discard the fragment, or discard the JS that appends the fragment. But what about backlinks? Natural backlinks that someone might post to, say, their blog, by doing exactly what AddThis is attempting to track - cutting and pasting the link. What are people's thoughts on what will happen when this occurs and the search engines crawl that link, fragment included?
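For context, my understanding of the mechanism boils down to something like this sketch (my own illustration of the general technique, not AddThis's actual code):

```js
// Illustrative sketch of address-bar share tracking (not AddThis's real implementation):
// append a short random token to the URL fragment so that a copied-and-pasted
// link can later be attributed back to this page view.
if (!window.location.hash && window.history && window.history.replaceState) {
  var token = Math.random().toString(36).slice(2, 11);
  window.history.replaceState(
    null, '',
    window.location.pathname + window.location.search + '#.' + token
  );
}
```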