Publishing the same article content on Yahoo? Worth It? Penalties? Urgent
-
Hey All,
I am currently working for a company that publishes exactly the same content on its own website and on Yahoo. On top of that, when I search for one of the same article's titles, Yahoo's copy outranks ours. Isn't this against Google's guidelines? I also think Yahoo gets more traffic than we do, since they hold the first position. Do you think the company should stop this practice? I need urgent responses to these questions, please.
Also, look at the attachment and compare the snippets. Our snippet (description) is just the first paragraph, but Yahoo somehow scans the content and creates meta descriptions based on the search query. How do they do that?
-
Thank you very much for your advice. It really helped me out here. I will message you later and tell you how it went, if you are interested. This week I will put together a presentation for the team with the reports.
I think this should be addressed ASAP
-
I'd definitely make that point you put in bold.
If you're a paid contributor, it comes down to whether the income outweighs the drawbacks. It's pretty hard to put a tangible figure on that, but there are definite upsides and downsides. Arguably, being seen on Yahoo adds to Moneywise's branding, but you can't track that. What you can track are clicks through to the site.
And of course it all depends on what the goal of Yahoo inclusion is. If it is just a money-spinner and a worthwhile one at that, don't even put the same content on your site. It's not worth running the risk of duplication penalties and/or link penalties, depending on how Google sees it.
If it is being done to raise brand awareness then (personally) I think it cannibalises your online visibility more than it promotes it - while still presenting SEO problems.
Outside looking in here, but I hope it helps. I'm with you - it's quite a predicament and a delicate situation, so I hope it works out for you. At the very least, my SEO advice can be seen as impartial and without an agenda, which may be useful to bring to a discussion among people looking out for the company's interests, plus their teams'.
-
Thank you for your clear and detailed response. I really appreciate it. The hardest part of this case is persuading the company that the costs outweigh the benefits. It seems that we are being paid by Yahoo as contributors. I can outline the negative impacts on SEO, and I will definitely use your points. I also need to think through the returns in terms of potential revenue. What do you think?
Or I guess I should just point out that we are losing overall position as a brand, and that content duplication can be one of the main reasons we are losing so many positions.
Right now I will look at the reports.
-
Hey there
I can't see any sense in doing this.
At the very least, it diverts clicks away from your site, as it promotes Yahoo over your site. To a reader, it may also look like Moneywise is taking content from Yahoo (rather than the other way round), which cheapens the brand.
The worst-case scenario would be that your site is seen as duplicating/stealing content - especially given how poor Google can be at identifying the original source of content. Google could also decide that you're duplicating content for the sole purpose of getting links, which again could lead to penalties.
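One mitigation worth raising if the syndication continues: where the partner supports it, a cross-domain rel=canonical on the Yahoo copy pointing back at the original tells Google which version is the source. A quick stdlib sketch for checking what canonical a page currently declares (the URL here is illustrative, not Moneywise's actual markup):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the rel=canonical URL from a page's <head>, if one exists."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical":
            self.canonical = attr.get("href")

page = """<html><head>
<title>Example article</title>
<link rel="canonical" href="https://www.moneywise.co.uk/example-article">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://www.moneywise.co.uk/example-article
```

If the syndicated copy declares no canonical (or one pointing at itself), Google is left to guess which copy is the original.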
To me, this doesn't make sense. I'd be much more inclined to keep the content on your own site - get people to come directly to you. You're getting comments on the articles so you already have a solid user base, clearly.
If your colleagues argue that the Yahoo copies of the content bring new people to the site, pull up a Google Analytics report and look at how many people entered your site via Yahoo over the last 3 months. I can almost guarantee that hardly anyone is clicking those links in the article - and those links, by the way, look pretty manipulative/commercial in terms of anchor text, which could prompt another penalty.
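To make that Analytics check concrete: assuming you export the Source/Medium report as CSV (the column names and rows below are made up for illustration; adjust them to match your actual export), a few lines of Python can total the Yahoo-referred sessions:

```python
import csv
from io import StringIO

def yahoo_sessions(csv_text):
    """Total the sessions for rows whose traffic source mentions Yahoo."""
    total = 0
    for row in csv.DictReader(StringIO(csv_text)):
        if "yahoo" in row["Source / Medium"].lower():
            total += int(row["Sessions"])
    return total

# Made-up rows standing in for a real Analytics export:
report = """Source / Medium,Sessions
google / organic,4120
yahoo / referral,37
(direct) / (none),890
news.yahoo.com / referral,12
"""

print(yahoo_sessions(report))  # 49
```

If that number is a rounding error next to your organic traffic, the "Yahoo brings us visitors" argument falls apart on its own.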
And in SEO terms, despite the link coming from Yahoo, if no one is linking to or sharing that URL on Yahoo, I can tell you now that the link won't carry much value.
In terms of your snippet question, it just looks like Yahoo is pulling the title and content from the page and generating a fresh meta description from there. Probably a time-saving solution for a website of that size, but certainly not an ideal one. Your meta descriptions look much better.
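For what it's worth, that kind of auto-generated description can be approximated in a few lines: take the page text and truncate it at a word boundary near the typical snippet length. A sketch of the idea (the example paragraph is made up; this is a guess at the behaviour, not Yahoo's actual pipeline):

```python
def auto_description(body_text, limit=155):
    """Naive auto-generated meta description: collapse whitespace,
    truncate at a word boundary near the limit, add an ellipsis."""
    text = " ".join(body_text.split())
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)
    if cut == -1:  # no space found before the limit; hard cut
        cut = limit
    return text[:cut] + "..."

para = ("Publishing identical articles on two domains can split "
        "your search visibility and leave Google guessing which "
        "copy is the original source of the content.")
print(auto_description(para, limit=80))
```

A hand-written description will almost always read better than this, which is exactly why yours look stronger in the SERPs.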
Hope this helps.
Related Questions
-
Without prerender.io, is Google able to render & index geographical dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render & index geographical dynamic content? I'm assuming no. If the answer is no, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
White Hat / Black Hat SEO | imjonny1231
-
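On the prerender question: Google has been able to render JavaScript for some time, though results vary by framework, and note that Googlebot crawls mostly from US IP addresses, so IP-based geographic content will generally only be seen in its US variant. One common workaround after dropping prerender.io is "dynamic rendering": serve a prerendered snapshot to known crawler user agents. A minimal sketch of the routing decision (the bot list is illustrative, not exhaustive):

```python
# Illustrative server-side "dynamic rendering": known crawlers get a
# prerendered HTML snapshot, everyone else gets the JavaScript app.
BOT_MARKERS = ("googlebot", "bingbot", "yandexbot", "duckduckbot")

def should_prerender(user_agent):
    """True when the request's user agent looks like a search crawler."""
    ua = (user_agent or "").lower()
    return any(marker in ua for marker in BOT_MARKERS)

def handle_request(user_agent):
    # In a real app this branch would live in web-server or
    # framework middleware, not a bare function.
    if should_prerender(user_agent):
        return "serve prerendered snapshot"
    return "serve JS application"

print(handle_request("Mozilla/5.0 (compatible; Googlebot/2.1)"))
# serve prerendered snapshot
```

Serving crawlers the same content users would see is generally considered acceptable; serving them different content is cloaking, so the snapshot must match the rendered app.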
Is Syndicated (Duplicate) Content considered Fresh Content?
Hi all, I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers & I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in a lot of cases duplicate) - would this be considered fresh content on an individual domain?
An example may clearly show what I'm after: domain1.com is a lawyer in Seattle. domain2.com is a lawyer in New York. Both need content on their website relating to being a lawyer for Google to understand what the domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor). Therefore, fresh content is needed on their domain. But what if that content is duplicate - does it still hold the same value?
Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicate (across multiple domains)?
Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com. And both customers would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as an authority & improve their website's ranking. We aren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just as a means to really understand content marketing outside of SEO.
Conclusion: IF duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial & I desire an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain.
TLDR version: Is duplicate content (the same article across multiple domains) considered fresh content on an individual domain? Thanks so much, Cole
White Hat / Black Hat SEO | ColeLusby
-
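For anyone wanting to quantify how "duplicate" syndicated text actually is, a common measure is Jaccard similarity over word shingles: identical articles score 1.0, unrelated ones near 0. A small sketch (the example sentences are made up to echo the lawyer scenario):

```python
def shingles(text, k=5):
    """Set of k-word shingles, the usual unit for near-duplicate checks."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "A lawyer in Seattle who handles personal injury claims for local clients"
syndicated = "A lawyer in New York who handles personal injury claims for local clients"
print(round(jaccard(original, syndicated, k=3), 2))  # 0.5
```

Swapping only the city name still leaves half the shingles shared, which is roughly how search engines would see lightly "spun" syndicated copy.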
How do I make a content calendar to increase my rank for a key word?
I've watched more than a few seminars on having a content calendar. Now I'm curious what I would need to do to increase ranking for a specific keyword in local SEO. Let's say I wanted to help a client increase their rank for "used trucks in Buffalo, NY". Would I regularly publish blog posts about used trucks? Thanks!
White Hat / Black Hat SEO | oomdomarketing
-
80% of traffic lost overnight, Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all estates in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. 3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (<address> <house type> <rooms> <area> <city>), I'm now only found on the fifth page. I suspect that I have become subject to a Google penalty. How do I get out of this mess?
Just like all search engines and aggregators, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by letting users compare houses, giving them tons more data on pricing and history, offering extra functionality the source sites do not, and so on. My analytics data show good user engagement. Here is one example of a source page and the corresponding page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. If I am penalized - I'm not attempting anything black hat, and I really believe the app gives a lot of value to users - what tweaks or changes to the application do you suggest so I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | Hemjakt
-
Is it still valuable to place content in subdirectories to represent hierarchy or is it better to have every URL off the root?
Is it still valuable to place content in subdirectories to represent hierarchy on the site, or is it better to have every URL off the root? I have seen websites structured both ways. It seems having everything off the root would dilute the value associated with pages closest to the homepage. Also, from a user perspective, I see the value of a visual hierarchy in the URL.
White Hat / Black Hat SEO | belcaro1986
-
Help figuring out if certain paid directories are worth it
The person in my position previously had our site listed in quite a few paid directories. What are the best resources you have used or know of to figure out which ones are worth keeping? For instance, one that is up for renewal this week is site-sift.com. I know my predecessor did some not-so-ethical stuff, and I'm trying to clean up the mess. Any advice on directories would be much appreciated.
White Hat / Black Hat SEO | inhouseninja
-
Identifying why my site has a penalty
Hi, my site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this. I had been working on my SEO since the latter part of last year and was seeing good results; then all of a sudden my search referrals dropped by 70%. Can anyone advise on what it could be? Thanks! Will
White Hat / Black Hat SEO | madegood
-
When to give up on a website with a Google penalty?
I recently had a Google 60 penalty hit my website. There were two main issues: a person helping me with SEO bought some links, and I own about 90 URLs in my vertical, created about 60 one-page sites on those keyword-targeted domains, and linked them all to my main site. Big mistake! I also kept those URLs on the same server as my main site. In October 2010 I noticed my site's hits dropped dramatically, and I started looking for the cause; I didn't know which issue had triggered the penalty. I fixed both issues in November 2010, asked Google for reconsideration in early December 2010, and kept link building for my site by finding quality links. I was extremely honest with Google: I gave them all of the domains I own and told them the name of the person who bought links for me and the websites where those links were placed. As of late February 2011, a Google search for my domain still showed it in approximately the 64th position. I recently asked Google again to lift the penalty. I basically told them that I had fixed everything that led to the penalty and that I had been waiting for almost 3 months. I told them I have put the past 2 years of my life into this website and begged them to forgive me. I also asked them to let me know if my site was never going to be forgiven. I got the typical canned response from the Google team, and as of today the penalty is still in effect.
I just want to know when you should give up on a site. I have spent about $20,000 and about 2 years of hard work on this site. I don't want to give up, but I don't want to keep pouring time and effort into the site if it will never escape the dreaded Google penalty. Do you think I should continue to wait, and if so, how long? Is there anything else I can do to persuade Google to release me from this penalty hell?
If I do abandon the site and start from scratch, what steps should I take? Do I need a new server? What content, if any, can I take from my current site and transfer to the new site? If I can, how do I do this without getting another penalty or losing credit for the original content? I created about 2,000 pages of original content for this site and would love to be able to transfer it if I have to start over. Any ideas or detailed help plans would be greatly appreciated.
White Hat / Black Hat SEO | tadden
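On the start-from-scratch scenario: when content moves to a new domain, the usual mechanism is a permanent (301) redirect from each old URL to its new home so readers and existing links don't dead-end - though it is often said that a 301 can carry penalty signals along with it, which is why many people migrate penalized sites without redirects. A minimal sketch of a redirect map (domains and paths are placeholders):

```python
# Placeholder mapping from old URLs to their new homes.
REDIRECTS = {
    "/old-article": "https://newdomain.example/old-article",
    "/category/old-post": "https://newdomain.example/guides/old-post",
}

def redirect_for(path):
    """Return (status, location) for an incoming request path."""
    target = REDIRECTS.get(path)
    if target:
        return 301, target
    return 404, None

print(redirect_for("/old-article"))  # (301, 'https://newdomain.example/old-article')
```

In practice this lives in server config (e.g. rewrite rules) rather than application code; the sketch just shows the one-to-one old-to-new mapping you would want to prepare before any move.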