Long-tail keywords and duplicate content (product descriptions)
-
Hi SEO pros, how are you doing these days? Hope everything is fine...
Let's get down to business: I've got a little question about ecommerce sites with duplicate content (product descriptions). I'm already ranking #1 for exact keyword matches (I did a lot of backlink work with exact-match keywords). That's fine.
The question is: my long-tail keywords are still ranking lower than the competitors', because they published the content first. How do I beat them? What do I need to do to outrank competitors on long-tail keywords? (I really need this because, for most keywords/products in my niche, exact-match searches are only about 10% of the volume.)
Hope someone can shed some light on this!
Thanks!
-
The best long-term answer is to beat their content. Another solution is to beat their links, but you are one Panda-style update away from losing any questionable links.
There you have it in two clear sentences: the methods and the risks.
Thumbs up!
-
EGOL's answer is clearly the way to go.
If you offer identical content to your competitor, then it comes down to the remaining on-page and off-page factors. If both you and your competitor optimize your pages well, then it becomes a battle of links.
The best long-term answer is to beat their content. Another solution is to beat their links, but you are one Panda-style update away from losing any questionable links.
-
If your product is, say, widget 2015-45B, then putting some thought into even some modest optimisation of the product page is worthwhile. SEO is often about experimenting, so for the above widget, why not test expanding the product description? Work up the detail for 3-4 products and see the before/after impact.
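One rough way to measure that test: export the Queries report from Google Search Console as CSV before and after the rewrite and diff the long-tail rows. A minimal sketch, assuming the standard Search Console export headers (check yours; the filenames are placeholders):

```python
import pandas as pd

# Placeholder filenames: Search Console "Queries" CSV exports taken
# before and after rewriting the 3-4 product descriptions.
before = pd.read_csv("queries_before.csv")
after = pd.read_csv("queries_after.csv")

# Standard export columns: Top queries, Clicks, Impressions, CTR, Position.
merged = before.merge(
    after,
    on="Top queries",
    how="outer",
    suffixes=("_before", "_after"),
).fillna(0)

# Focus on long-tail terms: queries of three or more words.
longtail = merged[merged["Top queries"].str.split().str.len() >= 3]

# Rank the biggest movers by change in clicks.
longtail = longtail.assign(
    click_delta=longtail["Clicks_after"] - longtail["Clicks_before"]
).sort_values("click_delta", ascending=False)

print(longtail[["Top queries", "Clicks_before", "Clicks_after", "click_delta"]].head(20))
```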
-
Or are you, as an SEO guru, telling us to just focus on [exact match] and forget the rest...
My advice was either: A) get more links or B) rewrite all of the descriptions.
If I owned a site with this problem, I would rewrite the descriptions. It would be expensive, but I bet I would make a lot of money.
-
Hmmm... maybe a way to become an authority page for the specific keyword would help...
And yes, I know it's a "magic bullet" question that will rack our brains.
Or are you, as an SEO guru, telling us to just focus on [exact match] and forget the rest...
Thanks again for replying.
-
It sounds like you are looking for a "magic bullet" answer?
-
Thanks for the answer, but I still need to know how to beat the duplicate content.
Think of the product description as a table of specifications (dimensions, etc.). We are working with engineering products...
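Even a pure spec table can be turned into unique prose programmatically, instead of pasting the manufacturer's table verbatim. A minimal sketch, with an entirely hypothetical product and field names:

```python
# A hypothetical engineering product, stored as structured specs.
specs = {
    "name": "Flanged Ball Valve FBV-200",
    "material": "316 stainless steel",
    "diameter_mm": 50,
    "pressure_rating_bar": 16,
    "temperature_range_c": (-20, 180),
}

def describe(product: dict) -> str:
    """Turn a spec table into a sentence-form description."""
    low, high = product["temperature_range_c"]
    return (
        f"The {product['name']} is machined from {product['material']} "
        f"with a {product['diameter_mm']} mm bore, rated to "
        f"{product['pressure_rating_bar']} bar and suitable for service "
        f"between {low} °C and {high} °C."
    )

print(describe(specs))
```

Even a template like this, varied per category, gives crawlers text the competitors' pages don't have; hand-editing the highest-value products on top of it goes further.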
-
How to beat them?
Lots of people would say... "Beat their links"...
If this was my site I would say... "Beat their content"... That means a rewrite of the standard yada yada yada text that everybody is grabbing from the manufacturer's datafeed. Then you are not competing with the competitor on a lot of the long-tail terms, and you might be able to reoptimize for parallel terms that searchers are still using.
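To find which pages most need that rewrite, you can score each of your descriptions against the manufacturer's datafeed text. A rough sketch using word shingles and Jaccard similarity (the sample strings and the threshold are assumptions to adapt to your own catalogue):

```python
def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (0 = distinct, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical inputs: your page copy vs. the manufacturer's datafeed text.
my_copy = "Standard yada yada description pulled from the datafeed for widget 2015-45B"
datafeed = "Standard yada yada description pulled from the datafeed for widget 2015-45B"

score = jaccard(my_copy, datafeed)
if score > 0.5:  # arbitrary threshold -- tune against your own pages
    print(f"{score:.0%} shingle overlap: this description needs a rewrite")
```

Pages that score high against the feed are exactly the ones where a rewrite buys back long-tail ground.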
-
Related Questions
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not identical: each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. They are primarily lots for sale, and it looks like lazy agents with 4 or 5 lots have input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load, and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, it shows 40 of them are duplicates.
Technical SEO | TIM_DOTCOM
-
Duplicate content on user queries
Our website supports a unique business industry where our users come to us looking for something very specific (a very specific product name) to find out where they can get it. The problem we're facing is that the products are constantly changing due to the industry. So, for example, one month a product might be found on our website, and the next it might be removed completely... and then it might come back again a couple of months later. All of this is completely out of our control, and we have no way of receiving any warning when these things might happen. Because of this, we're seeing a lot of duplicate content issues arise... For example... Product A is not active today, so www.mysite.com/search/productA will return no results... Product B is also not active today, so www.mysite.com/search/productB will also return no results. Per Moz Analytics, these show up as duplicate content because both pages indicate "No results were found for {your searched term}." Unfortunately, it's a bit difficult to return a 204 in these situations (and I don't know if a 204 would help anyway) or a 404, because, for a faster user experience, we simultaneously render different sections of the page... so at the very beginning of the page load we start rendering the faster content (the template) that says "returning 200 code, we got the query successfully & we're loading the page"... the unique content results finish loading last since they take the longest. I'm still very new to the SEO world, so I would greatly appreciate any ideas or suggestions that might help with this... I'm stuck. 😛 Thanks in advance!
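One hedged way out, sketched below with an assumed Flask-style app and a hypothetical is_active() lookup: do a cheap status check before anything is streamed, so empty searches can return a real 404 instead of a 200 "no results" page:

```python
from flask import Flask, Response, abort, stream_with_context

app = Flask(__name__)

def is_active(product: str) -> bool:
    """Hypothetical cheap lookup (cache/DB flag) for whether a product is live."""
    return product in {"productC", "productD"}

def expensive_results(product: str) -> str:
    """Placeholder for the slow search that currently loads last."""
    return f"<div>Results for {product}</div>"

@app.route("/search/<product>")
def search(product):
    # Do the cheap status check BEFORE the response starts streaming,
    # so an inactive product can still get a proper 404 status code.
    if not is_active(product):
        abort(404)

    def render():
        yield "<html><body>"              # fast template shell first
        yield expensive_results(product)  # slow unique content last
        yield "</body></html>"

    return Response(stream_with_context(render()))
```

If a 404 truly isn't possible, setting an X-Robots-Tag: noindex response header in that same early step keeps the empty pages out of the index by a similar route.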
Technical SEO | SFMoz
-
Localized domains and duplicate content
Hey guys, In my company we are launching a new website and there's an issue that's been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same except for the homepage; they have a slightly different proposition. Using GeoIP, I will redirect UK traffic to ABC.co.uk and the rest of the traffic will still visit the .com website. Might Google penalize this? The sites will be almost identical apart from the homepage. This may count as duplicate content, even though I'm geo-targeting different regions so they never overlap. Thanks in advance for your advice
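Geotargeted near-duplicates are usually signalled with reciprocal hreflang annotations alongside (or instead of) the GeoIP redirect, so Google knows the two sites are intentional regional alternates rather than duplicates. A minimal sketch generating the tags both homepages would carry (the domains are the ones from the question; each page must list all versions, including itself):

```python
# Regional alternates: every page lists ALL versions, including itself.
alternates = {
    "en-us": "https://www.abc.com/",
    "en-gb": "https://www.abc.co.uk/",
}

def hreflang_tags(alts: dict) -> str:
    """Emit the <link rel="alternate"> block to place in each page's <head>."""
    lines = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alts.items()
    ]
    # x-default tells Google which version to show everyone else.
    lines.append(
        f'<link rel="alternate" hreflang="x-default" href="{alts["en-us"]}" />'
    )
    return "\n".join(lines)

print(hreflang_tags(alternates))
```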
Technical SEO | fabrizzio
-
Determining where duplicate content comes from...
I am getting duplicate content warnings on the SEOMOZ crawl. I don't know where the content is duplicated. Is there a site that will find duplicate content?
Technical SEO | JML1179
-
Is Noindex Enough To Solve My Duplicate Content Issue?
Hello SEO Gurus! I have a client who runs 7 web properties. 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the other satellite websites, we would simply link to it in the article. Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in set up on the main website's blog that pipes the articles we write into the corresponding satellite blog as well. My concern is duplicate content. In a sense, this is like autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature for giving users of his satellite websites some great fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit. I am thinking that noindexing each of the satellite websites' blog pages might suffice, but I'd like to hear from all of you whether even this may not be a foolproof solution. Thanks in advance! Kind Regards, Mike
Technical SEO | RCNOnlineMarketing
-
Help removing duplicate content from the index?
Last week, after a significant drop in traffic, I noticed a subdomain in the index with duplicate content. The main site and subdomain can be found below. http://mobile17.com http://232315.mobile17.com/ I've 301'd everything on the subdomain to the appropriate location on the main site. Problem is, site: searches show me that if the subdomain content is being deindexed, it's happening really slowly. Traffic is still down about 50% in the last week or so... what's the best way to tackle this issue moving forward?
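The 301s only help if every old subdomain URL actually returns one. A quick hedged spot-check with the requests library (the second URL is a placeholder; feed it the subdomain URLs from a site: export or your logs):

```python
import requests

# Sample of still-indexed subdomain URLs to spot-check
# (the second one is a hypothetical placeholder).
urls = [
    "http://232315.mobile17.com/",
    "http://232315.mobile17.com/some-page",
]

for url in urls:
    # Don't follow the redirect -- we want the raw status code and target.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(none)")
    ok = resp.status_code == 301 and target.startswith("http://mobile17.com")
    print(f"{url} -> {resp.status_code} {target} {'OK' if ok else 'CHECK'}")
```

Getting Google to recrawl the old URLs faster (for example via a temporary sitemap listing the redirected URLs) tends to speed the deindexing up.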
Technical SEO | ccorlando
-
Using robots.txt to deal with duplicate content
I have 2 sites with duplicate content issues. One is a WordPress blog. The other is a store (Pinnacle Cart). I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?
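Worth noting that robots.txt blocks crawling, not indexing, so it can leave duplicate URLs in the index while hiding any canonical signal from Google. If you do go this route, here is a hedged sketch for verifying which URLs a Disallow rule actually covers, using Python's standard-library parser (the domain and paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain -- point this at the store's live robots.txt.
rp = RobotFileParser()
rp.set_url("https://www.example-store.com/robots.txt")
rp.read()

# Candidate duplicate URLs to test against the rules.
test_urls = [
    "https://www.example-store.com/products/widget?sort=price",
    "https://www.example-store.com/products/widget",
]

for url in test_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'crawlable' if allowed else 'blocked':9} {url}")
```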
Technical SEO | bhsiao
-
Duplicate Content and Canonical use
We have a pagination issue, which the developers seem reluctant (or unable) to fix, whereby we have 3 versions of the same page (with slightly differing URLs) coming up on different pages of the archived article index. The indexing convention was very poorly thought out by the developers and has left us with the same article on, for example, pages 1, 2 and 3 of the article index, hence the duplications. Is this a clear-cut case for using a canonical tag? Quite concerned this is going to have a negative impact on ranking, of course. Cheers Martin
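For cases like this, each duplicate URL should carry a rel=canonical pointing at one preferred version. A rough sketch to audit what the duplicated copies currently declare (the URLs are placeholders, and the regex parsing is deliberately simple, so adapt both):

```python
import re
import requests

# Placeholder URLs for the same article appearing on several index pages.
duplicates = [
    "https://www.example.com/articles/page/1/some-article",
    "https://www.example.com/articles/page/2/some-article",
    "https://www.example.com/articles/page/3/some-article",
]

# Naive pattern: assumes rel comes before href in the link element.
canonical_re = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I
)

for url in duplicates:
    html = requests.get(url, timeout=10).text
    match = canonical_re.search(html)
    print(url, "->", match.group(1) if match else "NO CANONICAL TAG")
```

All three should report the same single URL; if they don't, or report nothing, that's the gap the developers need to close.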
Technical SEO | Martin_S