Duplicate Content: Is a product feed/page rolled out across subdomains deemed duplicate content?
-
A company has a main site on its primary domain (the TLD), which lists every single product:
The company also has subdomains (tailored to a range of products) which list a chosen selection of the products from the TLD - sort of like a feed:
The content on the TLD and subdomain product pages is exactly the same and cannot be changed - the CSS and HTML are slightly different, but the content (text and images) is identical!
My concern (and rightly so, I think) is that Google will deem this to be duplicate content, therefore I'm going to have to add a rel=canonical tag into the header of all subdomain pages, pointing to the original product page on the TLD. Does this sound like the correct thing to do? Or is there a better solution?
Moving on, not only are products fed onto subdomains, there are also a handful of other domains which list the products - again, the content (text and images) is exactly the same:
Would I be best placed to add a rel=canonical tag into the header of the product pages on those other domains, pointing to the original product page on the actual TLD?
Does rel=canonical work across domains?
Would the product pages with a rel=canonical tag in the header still rank?
Let me know if there is a better solution all-round!
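For reference, the tag described above is a single line in the `<head>` of each duplicate page. A minimal sketch, using hypothetical URLs for the subdomain and main-domain product pages:

```html
<!-- In the <head> of the duplicate product page on the subdomain,
     e.g. https://widgets.example.com/products/blue-widget -->
<link rel="canonical" href="https://www.example.com/products/blue-widget" />
```

The same tag works identically whether the duplicate lives on a subdomain or an entirely separate domain; the `href` simply points at whichever URL should be treated as the original.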
-
I only use canonical links on the same domain, as you're telling Google which page is the master. If you use them across domains, I don't think it would pan out very well for the site giving away its content and link juice.
I'd like to know the solution to this if anyone has anything to add, as I also have a site in Ireland which sells the same products as the site in the UK. Luckily for me, the majority of the content isn't duplicate.
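For the UK/Ireland case specifically, one option worth considering (an assumption on my part, not something the original posters mention) is hreflang annotations, which tell Google that two same-language pages are regional variants rather than duplicates. A sketch with hypothetical domains:

```html
<!-- Placed in the <head> of BOTH the UK and the Irish version
     of the same product page, so each page references both variants -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/products/blue-widget" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.ie/products/blue-widget" />
```

Unlike a cross-domain canonical, this lets both pages rank in their respective country's results instead of consolidating everything onto one URL.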
-
Having unique content is not a viable option in this instance!
You mentioned that using canonical links will work to a certain degree - can you expand on this?
-
Duplicate content is evil in Google's eyes.
Imagine you're Google. What Google would like to do is index as few pages as possible and end up with the fewest possible number of results, so that the results are specific to the user's requirements.
OK, so when you add duplicate content to your site or subdomain, you are making Google's job harder, and therefore they will penalise you for it. Using canonical links will work to a certain degree, but not as well as unique, relevant content.
We have a product range which is the best in the world, made by a company called SKF: the humble bearing. However, every man and his web developer adds content taken directly from the SKF website (including my company!! DOH). This means that we will never rank anywhere for the word "bearing", as it gets hidden in all of the duplicate content, and if they haven't already, Google may even drop our page.
It's a constant battle for me, and it should be for you too. Unique content is the way to go.