Page quantity vs no crawl errors.
Which one is better:
- Have a ton of pages but accept crawl errors, or
- Have far fewer pages with no crawl errors?
Let's say I have a product catalogue with 10 regular pages and 500 product pages (the same template, with content and title defined by a URL parameter like 'id').
It seems that even with a different product name, product description, price, color, etc., I get duplicate content crawl errors. I also know I could fix the crawl errors with a link tag using the rel="canonical" attribute, but I would lose indexing on 499 pages.
In this case, is it better SEO-wise to have:
- 510 pages with 499 duplicate content crawl errors, or
- 11 pages with 0 crawl errors?
-
Hi Vincent,
Great question! The reason you are getting those duplicate content errors is the HTML similarity of your product pages. Even though they have different title tags and content, there aren't enough content differences from one page to the next to fully distinguish them.
That said, Google is somewhat more sophisticated at detecting duplicate content, but in the age of Panda it might view the content on these pages as "thin." In general, it's good to have at least 250+ words of rich, unique content on every page you are trying to rank.
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
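To get an intuition for why templated product pages trip duplicate filters, here's a rough sketch comparing two hypothetical product pages by word-shingle overlap. This is purely an illustration (the page text and threshold are made up, and it is not Google's actual algorithm): when the template dominates, most shingles match even though every product detail differs.

```python
# Illustration only (not Google's algorithm): Jaccard similarity over
# 3-word shingles. When the shared template dominates the page text,
# over half the shingles match even though every product detail differs.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word n-grams)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical product pages: identical template, different product.
page_a = ("Acme Store the best products free shipping on all orders "
          "Blue Widget price 9.99 add to cart customer reviews contact us")
page_b = ("Acme Store the best products free shipping on all orders "
          "Red Gadget price 14.99 add to cart customer reviews contact us")

print(round(similarity(page_a, page_b), 2))
```

Here the two "pages" share only the boilerplate, yet their similarity comes out above 0.5; on a real site with shared navigation, footers, and layout, the ratio of template to unique text is usually far higher, which is exactly the "thin content" signal described above.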
In general, you don't want to canonicalize those product pages, as that would effectively drop them from the index. Instead, the ideal solution is to add more content to those pages, above the fold, and beef up the descriptions. Even if that isn't feasible, I would rather accept the duplicate content errors in this scenario than canonicalize everything down to 10 pages.
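For reference, the canonical link tag the question mentions looks like this (URLs here are hypothetical, for illustration only). It goes in the head of each parameterized product URL and points at the page you want indexed in its place, which is exactly why it would drop the other 499 pages from the index:

```html
<!-- Hypothetical example: placed in the <head> of /product?id=42.
     Only do this if you genuinely want this URL consolidated away. -->
<link rel="canonical" href="https://www.example.com/products" />
```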
Hope this helps! Best of luck with your SEO!
-
I don't see why you would need that, as the question was general and the example was not taken from a real website. I guess the problem would apply to both:
and
-
Can you list your site domain?