Duplicate Content Penalty
-
If our pages have roughly 30% non-original textual content, can we be penalized by Google? Or are we OK as long as the non-original content is relevant to the pages?
-
My iPad garbled that last one, Andiswa... lol... I just meant that original content is king.
-
As far as I understand the recent Google posts, original content is king and content farming is becoming a thing of the past as Google tries to improve its search results. That said, if your website is a portal for unique information but still pulls in variables from other sources (e.g. your site is about surfing, but you pull in content on local weather conditions from another site), I don't think Google penalizes you. Your thoughts?
Related Questions
-
Would a lot of images on one post be categorized as thin content?
As an example, if I write an article on the 12 best print ads by BMW, it will have 12 images and possibly 12 one-liners plus a paragraph. The images will have the necessary alt tags. But overall, will this post be counted as low content, and is there a chance of being penalized by Google for it?
Content Development | marketing910
-
Reviving a (very) old blog - is it worth shifting the content onto a new blog?
I look after a few ecommerce sites; one of them doesn't currently have a blog, so we are setting up a WordPress blog for the site now. Going way back in time, the site did have a blog, which was on a separate Typepad domain. What I'm wondering is whether it's worth redirecting that whole blog to the new blog section of the site and copying some of the content over to the new blog as historical posts. I don't think it will be possible to redirect each individual post to a new one, so it would just be a straight redirect of the old blog domain to the new one, with the same (or most of the same) content. Do you think it is worth doing this for the value of the content, which is relevant but dated (many of the links have now expired)? Doing this will take some time, so it's not 'free' content we'd be getting. We have a lot of new content planned out, so we won't be short of content; it would just be nice to have some historical content on there too. Thanks
Content Development | PeterLeatherland0
-
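For the straight domain-level redirect described in the question above, a minimal Apache .htaccess sketch could look like the following. This assumes the old blog lives on a domain whose server you control (Typepad itself only offers its own redirect settings); `oldblog.example.com` and the target URL are placeholders, not the actual sites:

```
# Hypothetical sketch: 301-redirect every URL on the old blog domain
# to the new blog section, since per-post mapping isn't possible here.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^oldblog\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/blog/ [R=301,L]
```

A 301 (permanent) redirect is what passes most of the old domain's link value; a blanket redirect to the blog index loses per-post relevance, which is the trade-off the question describes.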
How do I split page content?
So I offer two services, each of which has an FAQ page (let's call them S1 and S2). The problem is that I also have a longer FAQ page that covers both services (S1-2). I would like to eliminate the longer one and attach the relevant content to each of the shorter pages, but I'm concerned that deleting pages with a lot of content might be a bad idea. I could redirect, I suppose, but I wouldn't know which page, S1 or S2, to redirect to. Any advice on this?
Content Development | NationalPardon0
-
Content Marketing - Car Space
Hey, looking for cool content marketing examples in the car industry, like major car companies leveraging their resources to develop awesome, viral content. Anyone aware of any cool campaigns? Cheers, Mark
Content Development | MBASydney0
-
Correcting Duplicate Page Title Problems for a Blog
EDITED: To focus on just the issue at hand. I am trying to figure out the SEO rules instead of just working on the content, so please bear with me. I am technically adept; I just don't know the rules of the SEO process, or even some of the terminology, so I'm trying to attack problems one at a time. Today's problem: **Duplicate Page Titles**. We evidently have thousands of duplicate page titles. We are using Joomla 2.5 and EasyBlog, and our sitemap is automated by XML Sitemap. EasyBlog takes the title of the site and uses it as the name of the summary pages. We post 5 blog items per page, and all the page names are the same: http://www.OursiteName.com/?start=5 (page title = site name), http://www.OursiteName.com/?start=10 (page title = site name). A similar thing happens when sorting by author, category, etc. Basically, non-duplicate pages are looking like duplicates. What is the best practice/approach? Using robots.txt or the XML sitemap to tell Google not to crawl these pages? Writing a script, or editing the EasyBlog code to fix the 2000 duplicate page titles? Other thoughts?
Content Development | Romana0
-
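Before deciding on a fix for a problem like the one above, it helps to measure it. A minimal sketch (assuming the HTML of each page has already been fetched; the URLs and titles below are the examples from the question) that groups URLs sharing the same `<title>`, so paginated pages like `?start=5` and `?start=10` show up as one duplicate cluster:

```python
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: dict of url -> raw HTML. Returns {title: [urls]}
    for any title shared by more than one URL."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        # Naive title extraction; a real crawl would use an HTML parser.
        m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = m.group(1).strip() if m else ""
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "http://www.OursiteName.com/?start=5": "<html><head><title>Site Name</title></head></html>",
    "http://www.OursiteName.com/?start=10": "<html><head><title>Site Name</title></head></html>",
    "http://www.OursiteName.com/about": "<html><head><title>About Us</title></head></html>",
}
print(find_duplicate_titles(pages))
# {'Site Name': ['http://www.OursiteName.com/?start=5', 'http://www.OursiteName.com/?start=10']}
```

A report like this tells you whether the duplicates are all paginated/sorted views (a crawling/indexing fix) or genuinely distinct pages that need unique titles (a template fix).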
Will Scrape Content Become Unique Content
I always scrape content from article directories and make it unique through TBS, passing Copyscape, but I still want to know: will Google detect that the content is not unique? I'm unable to write content myself because of my limited English. I know there are lots of cheap article writers available, but is there any way to succeed this way?
Content Development | mamuti0
-
Duplicate Page Content on a WordPress blog with categories?
Just got a crawl report back from SEOmoz, and it gives me lots of errors for "duplicate page content". Upon investigating, I noticed this is because my WP blog is set up with categories, so the home page is almost identical to one of the category pages. None of my actual posts are the same, but the category pages have some overlap, since the same post can show up in two or more categories. Is this a problem, or can I just ignore this error? Anything I should be doing differently? Thanks!
Content Development | frankthetank20
-
Displaying archive content articles in a writer's bio page
My site has writers, and each has their own profile page (accessible when you click their name inside an article). We set up the code so that the bios, in addition to the actual writer photo/bio, dynamically generate links to each article he/she produces. We figured that someone reading something by Bob Smith might want to read other stuff by him. Which was fine, initially. Fast forward, and some of these writers have 3, 4, even 15 pages of archives, as the archive system paginates every 10 articles (so www.example.com/bob-smith/archive-page3, etc.). My thinking is that this is a bad thing. The articles are likely already found elsewhere on the site (under the content landing page each was written for, for example), and I visualize spiders getting sucked into these archive black holes, never to return. I also assume that it is just more internal mass linking (yech) and probably doesn't help the overall TOS (time on site)/bounce/exit numbers. Thoughts?
Content Development | EricPacifico0
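One common pattern for paginated author archives like the ones described above is a robots meta tag that keeps the archive pages themselves out of the index while still letting crawlers follow the links through to each article. A sketch only; which template file emits this tag depends entirely on your CMS:

```
<!-- On paginated archive pages such as /bob-smith/archive-page2,
     /bob-smith/archive-page3, etc. -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this still lets spiders crawl through the archives to discover articles, which addresses the "black hole" worry without hiding the content they link to.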