Syndicated Content Appearing Above Original
-
Hi.
I run a travel blog, and my content is often re-posted by related sites with a backlink to my content (and full credit, etc.), but the copies still rank above my article in Google.
Any ideas what I can do to stop this happening?
Thanks
-
I think it's partly because my content is not 'blog-like'; it's more like a travel guide plus an events guide and some products, so it might not look right to people. But the drop was dramatic.
By the way, I was just looking at your site. Would you be open to a guest article? I run an Israel travel site and could write about the 'real Israel'! Of course, I can link back in exchange, or whatever you like...
-
Wow, that is amazing.
I've been encouraging our authors to get onto G+ to add authorship.
(I have 60 so far)
The benefit is the author photo next to results, but there has been zero traffic improvement.
I wouldn't blame that on authorship either. There must be something wrong with my site that I haven't fixed yet, but I don't know what it could be. Except that I just finished, about two weeks ago, finding and removing the last of the duplicate descriptions and titles.
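For anyone else hunting down duplicate titles and meta descriptions, here's a minimal sketch of an audit script. It assumes you already have the raw HTML for each page (the crawling/fetching step is omitted), and it uses only the standard library:

```python
from html.parser import HTMLParser
from collections import defaultdict

class HeadParser(HTMLParser):
    """Pull the <title> text and the meta description out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """pages: dict mapping URL -> raw HTML.
    Returns {(field, value): [urls]} for any title or description
    that appears on more than one page."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = HeadParser()
        parser.feed(html)
        seen[("title", parser.title.strip())].append(url)
        seen[("description", parser.description.strip())].append(url)
    return {key: urls for key, urls in seen.items() if len(urls) > 1}
```

Run it over your whole page set and anything it returns is a duplication candidate worth fixing.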
-
With some people, maybe I need to look at that. The problem is that many of my articles are only valid in the short term because they're about events (and those are the ones people love to use), so a DMCA complaint would take a while to have any effect, right?
-
I reversed the authorship change, though, and instantly went back to the same traffic level as before.
-
I would not blame that on authorship. I would blame that on an increased level of piracy. Eventually they can strangle your appearance in the SERPs.
-
I understand. Those weasels do it with my content too.
When they copy I often use DMCA complaints.
-
So annoying! I find that sometimes Google corrects itself after a week or two, and sometimes it doesn't. I was advised to implement authorship, which I did, and this problem more or less solved itself, but overall search traffic fell by more than 25%!
-
Welcome to my world, Ben.
There are thousands of sites that post our headline and a snippet of text, some with a link to us, some without.
It is very frustrating. Google buries our page and promotes those guys. Sometimes several of them all show in the results pages while our original page is nowhere to be found, buried by the duplicate-content filter. If you go to the end of the results and redisplay with the omitted pages included, there we are on page 1.
I've been trying to overcome this for 18 months now, but I'm not getting anywhere.
-
The problem is that if they don't syndicate, they'll modify and copy instead.
-
It might. It might not.
Content syndication has both Panda and Penguin risks.
And, you have the competitor problem.
-
Hmmm. If it isn't fully identical, then Google might display both, right?
-
They will still have a relevant title tag and relevant content.
(Adding this to my original reply...)
The best way to get filtered out of the search results is to have an article on another site linking to an identical article on your site.
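One mitigation worth asking syndication partners for is a cross-domain rel=canonical on their copy, pointing back at your original; Google has supported cross-domain canonicals for exactly this situation. A sketch of what the copy's head section could include (the URL is a placeholder for your original article):

```html
<!-- On the syndicating site's copy of the article -->
<head>
  <link rel="canonical" href="https://www.example.com/original-article/" />
</head>
```

Whether partners will actually add it is another matter, but it costs nothing to make it a condition of syndication.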
-
The interesting thing is that sometimes it's tiny sites with much less authority that are ranking better.
Maybe if I got them to syndicate half the article with a "read more, click here" link, that would help?
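If partners agreed to that, generating the teaser is straightforward. A minimal sketch (the word limit and link text are arbitrary choices, not anything Google prescribes):

```python
def make_teaser(article_text, original_url, max_words=75):
    """Truncate an article to its first max_words words and append
    a 'read more' link pointing back to the original article."""
    words = article_text.split()
    teaser = " ".join(words[:max_words])
    if len(words) > max_words:
        teaser += "..."
    return f'{teaser} <a href="{original_url}">Read the full article here.</a>'
```

The copy is then no longer identical to the original, and every syndicated excerpt carries a link back to the full article.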
-
This happens because the other sites that post your content have more authority and that gives them a higher ranking. What can you do to prevent this?
-- only syndicate to sites that have less authority than you
-- create different content for syndication than what appears on your website
-- stop syndicating
This is just one reason why I do not syndicate anything. It creates new competitors and feeds existing competitors.