How does Google recognize original content?
-
Well, we wrote our own product descriptions for 99% of the products we carry. They are all descriptive and each has at least four bullet points highlighting the product's best features for shoppers who don't read the full description. So instead of using the manufacturer's descriptions, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website.
However, since we use a product data feed and send it to Amazon and Google, they use our product descriptions too. I always wait a couple of days until Google crawls our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be recognized as the owner of the content. Am I right? If not, I believe Amazon is taking advantage of my original content.
I am asking because we are a relatively new ecommerce store (online since February 1st). We didn't have a lot of organic traffic to begin with, but it dropped by about 50% in April, which seems to coincide with the latest Google update. We have never bought a link or done black hat link building; in fact, we didn't do any link building at all until last month. So did Google decide we have shallow or duplicate content and drop our rankings? Organic traffic has been recovering very slowly since then, but the improvement is only around 5%-10% of our current daily traffic.
What do you guys think? Is all our original content effort going to waste?
-
Some believe that Google takes the code of your website into consideration. That would imply duplicate content is only an issue when multiple blogs are created with the same code and the same text, a tactic many people used with automated software.
In my view this is just a rumor. From personal experience, movie news blogs and websites tend to churn out identical news stories, including pictures, video, and text, and I have not seen any of these sites held back in their rankings.
-
Thanks.
About ten years ago I sold a lot of stuff on Amazon. Things were going well; I was the only person selling a nice selection of items. Then they started to sell the same items, and sold them at such a low price there was no way for me to make a profit. Impossible. That was just like working really, really hard for someone who would then become an almost impossible-to-beat competitor and dominate your SERPs for the next decade.
-
(offers napkin to EGOL to wipe up coffee spittle)
-
Excellent points by EGOL.
Amazon and Walmart are two-edged swords that cut only one way (toward you). I understand why businesses go that route, but it is very difficult to win. Sometimes someone does, though:
About 15 years ago, a friend of mine took over the US arm of a German toy distributor, and they created a very cool doll. Everyone at the German company and everyone on the US marketing team screamed that she had to take it to Walmart. She politely refused and said, "Let Walmart come to me." She then went all over hawking the doll and ended up on HSN (I think that is the original big TV sales channel). About a year in, everyone wanted these dolls and Walmart did not have them.
When Walmart called, she named the price - she did not have to kiss someone's... They were pleased to do the kissing.
One of my favorite stories of all time.
-
Well, it sounds like I am screwed, since we have been sending our feeds to Amazon for the last 7 months. I am going to update the feed and remove the descriptions from the Amazon feed (a rough sketch of what that might look like is below), but I don't know if it will help at all. By the way, I am talking about Amazon ads, not selling on Amazon. However, if Amazon doesn't have a product in their database, they basically use your descriptions to create a product page that says the product is available on an external website.
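For anyone attempting the same cleanup, here is a minimal sketch of what stripping the copywritten descriptions out of an outgoing feed might look like. It assumes a tab-delimited feed file with hypothetical column names (sku, title, description); a real Amazon feed template has its own required fields.

```python
import csv

# Minimal sketch: copy a tab-delimited product feed, replacing the long-form
# descriptions so the original copy stays exclusive to the site.
# The column names ("sku", "title", "description") are hypothetical;
# adjust them to match the actual feed template.
def strip_descriptions(in_path: str, out_path: str) -> None:
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src, delimiter="\t")
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames, delimiter="\t")
        writer.writeheader()
        for row in reader:
            # Keep a short generic blurb instead of the copywritten text.
            row["description"] = f"{row['title']} - full details on our website."
            writer.writerow(row)

if __name__ == "__main__":
    strip_descriptions("feed_full.txt", "feed_for_amazon.txt")
```

Whether a stripped-down feed is even accepted depends on the channel's requirements, so treat this purely as an illustration of keeping the unique copy off third-party sites.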
-
"However, since we use a product data feed and send it to Amazon and Google, they use our product descriptions too."
*spits coffee*
Whoa! I would not do that. I would remove or replace those descriptions on Amazon if at all possible.
When you sell on Amazon, any content, any image, any anything that you put on their site will be used against you. And, if you strike gold there then Amazon will quickly become your competitor.
This is exactly why I don't sell on Amazon. They solicit me a couple of times a year to sell my stuff on their site. No way. I did that in the past, and my work benefited Amazon more than it benefited me, and it benefited my competitors too.
"I always wait a couple of days until Google crawls our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be recognized as the owner of the content. Am I right? If not, I believe Amazon is taking advantage of my original content."
This is not true. I don't care who says this is true, I am going to argue. No way. I'll argue with anybody about this. Even the big names at Google. They do a horrible job at attributing first publisher. Horrible. Horrible.
I have published a lot of content given to me by others, and other people have stolen my content. I can tell you with assurance that the powerful site often wins... and if a LOT of people have grabbed your content, you can lose to a ton of weak sites.
Google does not honor first publisher. They honor powerful publishers - like Amazon. Giving content to Amazon that you are going to publish on your website is feeding the snake!
"So did Google decide we have shallow or duplicate content and drop our rankings?"
If your content is on Amazon, they are probably taking your traffic. Go out and look at the SERPs.
-
Serkie
Given that these are product descriptions that apply only to you selling the products (even if it is through Amazon/Google), I think there are a couple of ways you can go. One would be to add author markup if that is possible; I don't know how many products you are dealing with or what type of eCommerce or other platform you may be using.
Second, within the actual text, you could state authorship and place a link back to your site (likely at the very end of the description). A rough sketch of the markup idea is below.
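To make the markup suggestion concrete, here is a hedged sketch that generates Schema.org Product JSON-LD naming the store as the brand and pointing back to the canonical product page. The helper name and field values are hypothetical, and how the script tag gets into the page depends entirely on the platform; Product markup has no dedicated "author of the description" property, so the url and brand fields are doing the attribution work here.

```python
import json

# Hypothetical helper: build Schema.org Product markup that credits the store
# and points back to the canonical page on the store's own domain.
def render_product_jsonld(name, description, page_url, store_name):
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "url": page_url,  # canonical page on your own store
        "brand": {"@type": "Brand", "name": store_name},
    }
    # The returned string would be dropped into the page template.
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

print(render_product_jsonld(
    "Example Widget",
    "Hand-written description... Copyright Example Store.",
    "https://www.example-store.com/products/example-widget",
    "Example Store",
))
```

Stating authorship in the visible text with a link back, as suggested above, can then mirror what the markup declares.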
Last, you could register a copyright (no, not just a circle with a c in it as most do - the real thing); it can be fairly inexpensive. Depending on how you package it for the copyright office, we find it can run about a dollar a page. That would give you proof of ownership should you ever have an issue with someone using your descriptions without authorization (obviously you have given them to Amazon and Google).
A final note: when you started rewriting the descriptions, my guess is you wrote, changed, and rewrote them many times. If you ever have to defend yourself or prove in court that you are the actual owner, the documents showing how you arrived at the final versions are invaluable.
I don't know if this is what you were looking for, but I hope something here will help.
Best
-
For our ecommerce sites, we always make sure the content in our product feeds is original and distinct from the content on our pages. That way the feed listings don't poach from our own pages, we cover a broader range of search terms, and we have more avenues through which we can be found.
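As a rough illustration of that setup (the record fields here are invented for the example), each product carries two deliberately different pieces of copy, and the feed builder only ever touches the feed version:

```python
from dataclasses import dataclass

# Illustration only: keep two deliberately different pieces of copy per product,
# one for the website page and a shorter, reworded one for outgoing feeds.
@dataclass
class Product:
    sku: str
    title: str
    page_description: str   # long, copywritten text used only on our own site
    feed_description: str   # shorter, reworded text sent to Amazon/Google

def build_feed_row(p: Product) -> dict:
    # The feed never sees the on-page copy, so the two versions don't compete.
    return {"sku": p.sku, "title": p.title, "description": p.feed_description}

widget = Product(
    sku="W-100",
    title="Example Widget",
    page_description="Our in-house, detailed description with four bullet points...",
    feed_description="A durable example widget for everyday use.",
)
print(build_feed_row(widget))
```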
-
Google typically looks at who published the content first, as well as the authority of the sites that host it. You could be running into problems because Amazon has much more authority.