Does Google recognize original content when affiliates use XML feeds of this content?
-
Hi,
Concerning the upcoming Panda release (we're from the Netherlands):
- Could the fact that our affiliates use XML feeds of our content affect our rankings in some way?
- Is it possible to indicate to Google that content is yours?
Kind regards,
Dennis Overbeek
dennis@acsi.eu | ACSI publishing | www.suncamp.nl | www.eurocampings.eu
-
Thank you very much!
-
In my own experience with affiliates and feeds, Google does not like duplicate content and will try to identify the original source. So if you produce original content, make sure Google crawls your site first before you send it to your affiliates. As for automated feeds from your site, Google will identify this and credit you with the hard work.
Not sure what others think, but this is from my own experience over the last few years....
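Beyond being crawled first, one concrete way to signal ownership is to ask affiliates to include a cross-domain rel=canonical in their pages pointing back at your original URL. A minimal sketch of how you might spot-check an affiliate page for that tag, using only the Python standard library (the URLs here are placeholders, not real pages):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# A hypothetical affiliate page that credits the original source
# via a cross-domain canonical (placeholder URL):
affiliate_html = """
<html><head>
  <link rel="canonical" href="https://www.example.com/campsite/123" />
</head><body>...</body></html>
"""

parser = CanonicalParser()
parser.feed(affiliate_html)
print(parser.canonical)  # the URL search engines are asked to treat as original
```

If an affiliate page carries no such tag, you are relying entirely on crawl order and link signals for attribution.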
Related Questions
-
Google Cache issue
Hi, We’ve got a really specific issue – we have an SEO team in-house, and have had numerous agencies look at this – but no one can get to the bottom of this. We’re a UK travel company with a number of great positions on the search engines – our brand is www.jet2holidays.com. If you try ‘Majorca holidays’, ‘tenerife holidays’, ‘gran canaria holidays’ etc you’ll see us in the top few positions on Google when searching from the UK. However, none of our destination pages (and it’s only the destination pages), show a ‘cached’ option next to them. Example: https://www.google.com/search?q=majorca+holidays&oq=majorca+holidays&aqs=chrome..69i57j69i60l3.2151j0j9&sourceid=chrome&ie=UTF-8 This isn’t affecting our rankings, but we’re fairly certain it is affecting our ability to be included in the Featured Snippets. Checked and there aren’t any noarchive tags on the pages, example: https://www.jet2holidays.com/destinations/balearics/majorca Anyone have any ideas?
Technical SEO | fredgray0
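For anyone debugging a missing cache link: noarchive can be set either in a meta robots tag or in the HTTP X-Robots-Tag response header, so both are worth checking. A rough self-check sketch over sample data (a real check would fetch the live page and use a proper HTML parser):

```python
def has_noarchive(html, headers):
    """Return True if noarchive appears in a meta robots tag
    or in the X-Robots-Tag response header (crude string scan)."""
    if "noarchive" in headers.get("X-Robots-Tag", "").lower():
        return True
    lowered = html.lower()
    # Very rough meta-tag scan; can false-positive on unrelated text.
    return 'name="robots"' in lowered and "noarchive" in lowered

page = '<head><meta name="robots" content="index, follow"></head>'
print(has_noarchive(page, {"X-Robots-Tag": "noarchive"}))  # True
print(has_noarchive(page, {}))                             # False
```

If neither is present, as the poster reports, the missing cache link is likely something Google is deciding on its own rather than something the page is requesting.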
Purchasing duplicate content
Morning all, I have a client who is planning to expand their product range (online dictionary sites) to new markets and are considering the acquisition of data sets from low ranked competitors to supplement their own original data. They are quite large content sets and would mean a very high percentage of the site (hosted on a new sub domain) would be made up of duplicate content. Just to clarify, the competitor's content would stay online as well. I need to lay out the pros and cons of taking this approach so that they can move forward knowing the full facts. As I see it, this approach would mean forgoing ranking for most of the site and would need a heavy dose of original content as well as supplementing the data on page to build around the data. My main concern would be that launching with this level of duplicate data would end up damaging the authority of the site and subsequently the overall domain. I'd love to hear your thoughts!
Technical SEO | BackPack851
Why do HTML entities get crawled as content keywords in Google Search Console?
My Google Search Console shows HTML parameters such as div, class, img, src, gif, and align as content keywords. Why does Google crawl HTML parameters as keywords? Because of this, I'm losing traffic for my on-page content keywords. Please let me know how to solve this. Thanks, Jenifer
Technical SEO | Jenifer300
My PR1 website with no manual action is not appearing in the Google top 200 results; it used to be #1
Hi All, My website has 5 years of history. It used to be top 3 for its keyword, with an SEO firm I hired back then, at PR3. After 2013, its ranking started dropping until it stopped appearing at all in the Google search results. But my site is not banned; there is still some long-tail Google traffic, and there is no manual action in Webmaster Tools. Even today it still has PR1, but it is still not appearing in the top 200 results on Google. My on-site optimization for the main keyword is good; the site is just not appearing in Google results at all. I manually reviewed all of the top 200 results: so many spammy blogs were listed, but not my legit website homepage with unique content. I do regret hiring that SEO firm now, but since there is no manual action on my site according to Google, I don't know if I should disavow all the old backlinks. I have started to do some quality SEO work myself now, and my ranking in Yahoo/Bing has moved from 30-40 to around 11-15. Do I see some light at the end of the tunnel? Does that mean my site may appear in Google again? Thank you all for the reply.
Technical SEO | ChelseaP0
The use of robots.txt
Could someone please confirm that if I do not want to block any pages from my URL, then I do not need a robots.txt file on my site? Thanks
Technical SEO | ICON_Malta0
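To confirm this in practice: with no robots.txt at all (a 404) or an empty one, crawlers are blocked from nothing. A quick sketch with Python's standard-library robot parser (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# An empty robots.txt blocks nothing:
rp = RobotFileParser()
rp.parse("".splitlines())
print(rp.can_fetch("Googlebot", "https://www.example.com/any-page"))  # True

# The explicit allow-all equivalent many sites ship anyway:
rp2 = RobotFileParser()
rp2.parse("User-agent: *\nDisallow:".splitlines())
print(rp2.can_fetch("Googlebot", "https://www.example.com/any-page"))  # True
```

So a robots.txt file is optional if you block nothing, though serving an allow-all file avoids 404s in your crawl logs.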
How different does content need to be to avoid a duplicate content penalty?
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as another page (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate penalty?
Technical SEO | WayneBlankenbeckler0
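Google does not publish a similarity threshold, but a rough way to gauge how alike two landing pages are is word-shingle overlap. This is not how Google measures duplication; it is only a self-check sketch (the sample texts are made up):

```python
def jaccard_shingles(a, b, k=3):
    """Rough word-shingle similarity between two texts (0.0 to 1.0).
    NOT Google's algorithm; just a quick duplicate self-check."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

page1 = "affordable majorca holidays with flights and hotel included for families"
page2 = "affordable tenerife holidays with flights and hotel included for families"
print(round(jaccard_shingles(page1, page2), 2))  # 0.6
```

Pages that differ by only 10-15 words will score very high on any measure like this, which is why keyword-swapped landing pages tend to be treated as near-duplicates.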
Is this dangerous? (a content question)
Hi, I am building a new shop with unique products, but I also want to offer tips and articles on the same topic as the products (fishing). I think if I was to add the articles and advice one piece at a time, the site would look very empty and give little reason to come back very often. The plan, therefore, is to launch the site pulling articles from a number of article websites - with the sites' permission. Obviously this would be 100% duplicate content, but it would make the user experience much better and offer added value to my site, as people are likely to keep returning even when not in the mood to purchase anything; it also offers the potential for people to email links to friends, etc. Note: over time we will be adding more unique content and slowly turning off the pulled articles. Anyway, from an SEO point of view I know the duplicate content would harm the site, but if I was to tell Google not to index the directory and block it from even crawling the directory, would it still know there is duplicate content on the site and apply the penalty to the non-duplicate pages? I'm guessing no, but always worth a second opinion. Thanks, Carl
Technical SEO | Grumpy_Carl0
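Blocking a directory from crawling is a two-line robots.txt rule. A sketch using Python's standard-library robot parser, with a hypothetical /articles/ directory and placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking a directory of syndicated articles:
rules = """User-agent: *
Disallow: /articles/
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())
print(rp.can_fetch("Googlebot", "https://example.com/articles/fly-fishing-tips"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/shop/rods"))                  # True
```

One caveat: a URL blocked in robots.txt can still be indexed if other sites link to it, and a meta noindex tag only works if the page can be crawled, so Disallow and noindex on the same pages work against each other.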
Google +1 not recognizing rel-canonical
So I have a few pages with the same content, just with a different URL:
http://nadelectronics.com/products/made-for-ipod/VISO-1-iPod-Music-System
http://nadelectronics.com/products/speakers/VISO-1-iPod-Music-System
http://nadelectronics.com/products/digital-music/VISO-1-iPod-Music-System
All pages rel-canonical to:
http://nadelectronics.com/products/made-for-ipod/VISO-1-iPod-Music-System
My question is... why can't Google +1 (or Facebook and Twitter, for that matter) consolidate the +1s across all these pages? So if the first two had 5 +1s each and the rel-canonical page had 5 +1s, it would be nice for all pages to display 15 +1s, not 5 on each. It's my understanding that Google +1 gives the juice to the correct page, so why not display all the +1s at the same time? Hope that makes sense.
Technical SEO | kevin4803
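For reference, the consolidation signal described above is just a link element in the head of each duplicate URL, pointing at the chosen original (the pattern below is copied from the URLs in the question):

```html
<!-- In the <head> of each of the three duplicate product URLs -->
<link rel="canonical"
      href="http://nadelectronics.com/products/made-for-ipod/VISO-1-iPod-Music-System" />
```

The canonical tells crawlers which URL should receive credit; whether a given sharing widget displays consolidated counts across duplicates is up to that service.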