Unexplained Crawl Diagnostic Errors & Opencart
-
Hi,
I've been looking at the crawl diagnostics for my site and trying to fix the errors that are showing up, but SEOmoz is producing some strange results.
It's saying pages are duplicated up to 16 times, but those pages don't exist. It's adding "page=3", "page=4" to the end of the product URL, but I don't see how it's finding those pages; nothing on the site (as far as I can tell) is linking to them. There is no "page=3", just the one product page.
Again on the duplicate content, under "other URLs" it's listing URLs like "http:///product-a", but I don't see where it's finding these URLs either, and obviously those URLs don't work. Those three slashes aren't a typo, by the way.
So far I've reduced the number of errors from 2,005 to 543, but the rest of them I can't make sense of.
Also, what does one do when you have two products, e.g. "product-a-white" and "product-a-black", to prevent SEOmoz from seeing duplicates? Canonical links won't work because there's no parent item, just those two. Google Webmaster Tools doesn't seem to have a problem, though.
I'm using OpenCart 1.5, if it helps.
Cheers,
-
Ah, so it may well be OpenCart doing something funky then. It's carrying the page URL over into the product listing by the looks of it. I'll have to look into that, thanks for pointing that out!
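For what it's worth, one common mitigation for this kind of parameter carry-over is a self-referencing canonical tag on the product page. This is a generic sketch, not OpenCart-specific code, and the URL is just one of the examples from this thread:

```html
<!-- In the product page's <head>: whatever ?page=N query string the
     crawler arrived with, the canonical always names the clean URL -->
<link rel="canonical" href="http://www.lustrelingerie.com/Gracya-Lingerie/safari-wild-bra-push-up" />
```

It doesn't stop a crawler from finding the ?page=N variants, but it tells it which version counts, so the variants should stop being reported as separate duplicate pages.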
Do you have any idea how it could be finding the "http://maggie" style links?
Cheers for the help,
-
Ok, here is an example:
http://www.lustrelingerie.com/Gracya-Lingerie/safari-wild-bra-push-up?page=5
linked from
http://www.lustrelingerie.com/Gracya-Lingerie?page=5
It seems that if the ?page= parameter is on the catalog page, it gets carried into the product links.
-
Hi Alan, thanks for the response.
Yeah, sure, there are additional pages for the categories; I'm talking about the individual products.
Take http://www.lustrelingerie.com/Bassaya-Lingerie/camila-red for example. SEOmoz's diagnostics say there's a http://www.lustrelingerie.com/Bassaya-Lingerie/camila-red?page=2. The latter works if you go there, which I don't understand and is likely down to OpenCart, but what I don't get is how SEOmoz is finding the link to it.
And it's the same with links such as "http://maggie" (a real error); I don't see where SEOmoz is finding the links to those. I've checked for stray canonical links, but they seem fine to me.
Thanks,
-
Yes, they do exist:
this page http://www.lustrelingerie.com/Everyday-Luxury-Underwear-Lingerie?page=1
is linked from this page
http://www.lustrelingerie.com/Everyday-Luxury-Underwear-Lingerie
There are many examples.
-
The URL is http://www.lustrelingerie.com/
-
If you can give us a URL, I will tell you for sure.
-
Hi Ben, thanks for the response.
The thing is, I don't think it's a CMS issue; it seems to me that SEOmoz is getting confused somewhere. My product pages are along the lines of "www.domain.com/range/product-a/". They have a canonical link pointing to "www.domain.com/product-a/", and all of them only have a single page. Which is why I can't figure out where SEOmoz is picking up these duplicates.
With regards to your latter paragraph, yeah, I was thinking that. I thought it might confuse customers, though, and I was hoping there would be a more elegant solution. Going back in and editing 500+ products isn't something I was looking forward to, hehe.
Cheers,
-
I'll speak to the duplicates issue, since the other appears to be a CMS issue with how it's displaying the products. Whenever I see "page=1" in the URL, I can usually find a pagination script that isn't helping my SEO efforts. But I don't know for sure in your situation, especially since you said you don't see any links on the product page.
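To illustrate what a pagination fix has to accomplish, here's a minimal sketch in plain Python (not OpenCart code; the helper name is made up) of collapsing the ?page=N variants down to the one URL you actually want indexed:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_pagination(url, params=("page",)):
    """Drop pagination query parameters so every ?page=N variant
    collapses to one canonical URL (illustrative helper)."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in params]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_pagination(
    "http://www.lustrelingerie.com/Bassaya-Lingerie/camila-red?page=2"))
# → http://www.lustrelingerie.com/Bassaya-Lingerie/camila-red
```

Whether you do this with a canonical tag, a template fix, or parameter handling in Webmaster Tools, the goal is the same: all those ?page=N product URLs should resolve, for the crawler's purposes, to a single clean URL.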
As far as the "duplicates" issue. Try to get them as distinct as possible. With our product pages (starting with the most sold items) I have begun changing up the product name. We have the difference of only the height on many of our products so I'm having to get a little creative and add some other aspect to the URL that stays within the products title. I only want one page from my site competing for that exact match product SERP anyway. It's not a good idea to have two pages on your site competing for the same SERP. It seems to always be treated with less authority by Google when that happened in the past.