Duplicate Content using templates
-
Hi,
Our web site is designed using a template, which means the header and footer are consistent across all pages. Only the body content is unique on each page.
Is Googlebot able to see that the header and footer content is defined by a common template?
Will this have any impact in terms of duplicate content? For example, we have two lines of text in the footer that summarize the services we provide. Because the same text is in the footer of every page, I am concerned about creating duplicate content.
Finally, does it make sense to include keywords in the header and footer of the template? Will doing so have any positive or negative SEO impact?
-
The claim is that Google is pretty good at detecting headers, sidebars, and footers, and makes some accommodation for them. In my experience that accommodation is mostly related to links in those areas, though there's some duplicate content consideration as well.
Having said that, I've also consistently seen that unless you've got a mega-site with a lot of other SEO going for it, that accommodation is not enough to let you get away with almost no actual unique content on important pages. You still need to consider the overall competitive landscape, and you still need to have more than a spit's worth of content. And the less you do in regard to other SEO factors, the more content you absolutely need.
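For reference, the setup the question describes looks something like this: a shared template wrapping per-page body content. This is a generic, hypothetical sketch, not taken from any particular site:

```html
<!-- Shared template: identical markup on every page -->
<header>Site navigation, logo, etc.</header>

<!-- Unique per-page body: this is the part that needs
     substantial, unique content on important pages -->
<main>
  ... page-specific content ...
</main>

<!-- Shared footer, including the repeated two-line services blurb -->
<footer>We provide X and Y services to clients across the region.</footer>
```

The risk described in the answer arises when the `<main>` section is thin relative to the repeated boilerplate around it.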
Related Questions
-
Duplicate content from page links
So for the last month or so I have been going through and fixing SEO content issues on our site. One of the biggest issues has been duplicate content with WHMCS. Some instances have been easy to fix and others have been a nightmare. Some of the duplicate content has been the login page shown when a page requires a login, for example knowledge base articles that are only viewable by clients. That was easily fixed for me, as I don't really need them locked down like that. However, I am unsure how to take care of pages like affiliate.php and pwreset.php that are only linked from a single page. Here are some pages that are being listed as duplicate: Should this type of stuff be a 301 redirect to cart.php, or would that break something? I am guessing that everything should point back to cart.php.
On-Page Optimization | | blueray
https://www.bluerayconcepts.com/brcl...art.php?a=view
https://www.bluerayconcepts.com/brcl...php?a=checkout These are the ones that are really weird to me. They are showing as duplicate content, but pwreset.php is only linked from the KB category page. It shows up as duplicate many times, as does affiliate.php: https://www.bluerayconcepts.com/brcl...ebase/16/Email
https://www.bluerayconcepts.com/brcl...16/pwreset.php Any help is very welcome.
How to explain to a client that duplicate content is bad...
Afternoon! An SEO client of ours has copied a load of landing/category page content from other sites. Lots of emails have been sent back and forth asking them to remove it, but they are adamant about keeping it up until we have time to amend it. We have explained to them:
- The Google penalty risks
- The copyright risks
- The short- and long-term implications for their brand-new business/website
- That the money they are spending on our SEO package could be completely wasted if they're caught
I think the above is pretty black and white, but the director of this company will not budge. Does anyone have any different approaches? The director said he's happy for us to amend the content but, in the meantime, the plagiarised content will not be removed. Cheers, Lewis
On-Page Optimization | | PeaSoupDigital
How to deal with duplicate content when presenting event and sub-event information?
Hi, I have a sports event calendar website.
On-Page Optimization | | ahotu
It presents events that may have multiple races.
Both the event and each of its races have their own pages. For example:
Event: /event/edinburgh-marathon-festival Races:
/race/emf-half-marathon
/race/emf-10-km
/race/edinburgh-marathon
/race/emf-5-km The pages may have a lot of information in common (location, date, description) and they all link to each other.
What would be the best practices to avoid having the pages considered duplicate content by Google? Thanks
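One common pattern for this situation (a hedged sketch, not necessarily the right call for this site) is to give each race page genuinely unique copy where possible, and to use a canonical link on any race page that is essentially a duplicate subset of the event page. Using the example paths above on a hypothetical domain:

```html
<!-- In the <head> of /race/emf-half-marathon, if that page largely
     repeats the event page's content (hypothetical example): -->
<link rel="canonical" href="https://www.example.com/event/edinburgh-marathon-festival" />
```

A canonical consolidates ranking signals onto the event page, at the cost of the race page rarely ranking on its own; if each race page carries enough unique detail, self-referencing canonicals are usually the better choice.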
Duplicate content issues?
Our company consists of several smaller companies, some of whom deal with very similar things. For instance, two of our companies resell accounts software, but only one provides after-sales support. Because of the number of different companies and websites we have, it would sometimes be easier to simply copy content from one site to another, optimised in the same manner, since in some instances we would want different websites to rank for the same keywords. I have been asked my opinion on the potential impact of this practice, and my initial response was that we should avoid it due to potential penalties. However, I thought I'd garner opinion from a wider audience before making any recommendations either way. What do people think? Thanks.
On-Page Optimization | | HBPGroup
Duplicate content issue in SEOmoz campaign.
Hi, We are running a campaign for a website in SEOmoz. We get a duplicate content warning: http://www.oursite.com and http://www.oursite.com/ are being seen as 2 different URLs. The only difference between the 2 URLs is the trailing slash at the end of the second one. Why is this happening? I was aware of www vs. non-www, but I've never heard of an issue related to the slash. Thanks for your help!
On-Page Optimization | | gerardoH
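For the bare hostname, the two forms are normally the same resource (an HTTP request always carries a path of at least "/"), so this warning is often a quirk of how the tool records URLs rather than real duplication. When trailing-slash variants of deeper paths really do resolve as separate pages, a common fix is a 301 rewrite that enforces one canonical form. A hypothetical Apache .htaccess sketch, assuming mod_rewrite is enabled:

```apache
RewriteEngine On
# Redirect any URL with a trailing slash (that isn't a real directory)
# to the slash-less version with a permanent 301.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

Whichever form you standardize on, keep internal links consistent with it so crawlers don't keep rediscovering the redirecting variant.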
Using Transcriptions
Hi everyone, I've spent a long time trying to figure this one out, so I'm looking forward to your insights. I've recently started having our videos transcribed and keyworded. The videos are hosted on YouTube and already embedded on our website. Each embedded video is accompanied by an existing keyword-rich article that covers pretty much the same content as the video, but in a little more detail. I'm now going back and having these videos transcribed. The reason I started doing this was essentially to lengthen the article and get more keywords on the page. Question A. My concern is that the transcription covers the same content as the article, so it doesn't add that much for the reader. That's why when I post the transcription (below the embedded video), I use a little JavaScript link for people to click if they want to read it. Then it becomes visible; otherwise it's not visible. Note that I am NOT trying to hide it from Google by doing this - it will still show up for people who don't have JavaScript on - so I'm not trying to cheat Google at all, and I think I'm doing it the way they want it done. You can see an example here: http://www.healthyeatingstartshere.com/nutrition/healthy-diet-plan-mistakes So my first question is: do you think the JavaScript method is a good way of doing it? Question B. Does anyone have any insight on whether it would be better to put the transcription:
On-Page Optimization | | philraymond
1. On the same page as the embedded video/article (which I am doing now), or
2. On a different page, linked to from the above page, or
3. On various other websites (WordPress, Blogspot, Web 2.0 sites) that link back to the video/article on our site. I know it's usually best practice to put the transcription on the same page as the video, but from an SEO point of view I wonder if I'm wasting a 500-word transcription by posting it on the same page as a 500-word article that covers the same topic and uses the same keywords, and whether it would be better to use the transcription elsewhere. Do you have any thoughts on which of the above methods would be best? Thanks so much for reading and any advice you may have.
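The click-to-reveal approach described above can be built so the transcript is present in the HTML and only hidden once JavaScript runs, which keeps it visible to crawlers and no-JS users. A minimal hypothetical sketch (the element IDs are made up):

```html
<a href="#transcript"
   onclick="document.getElementById('transcript').style.display='block'; return false;">
  Read the transcript
</a>

<div id="transcript">
  ... full transcript text ...
</div>

<script>
  // Hide the transcript only when JavaScript is available;
  // without JS it stays visible in the page source.
  document.getElementById('transcript').style.display = 'none';
</script>
```

Because the hiding happens client-side after load, the transcript text remains in the served HTML rather than being omitted or cloaked.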
What is the best solution for printable product pages (duplicate content)?
What do you think is the best solution for preventing duplicate content issues on printable versions of product pages? The printable versions are identical in content. Disallow in robots.txt? Meta robots noindex, follow? Meta robots noindex, nofollow? Rel canonical?
On-Page Optimization | | BlinkWeb
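Of the options listed, rel=canonical is the usual recommendation for identical printable versions, since it consolidates signals onto the main page rather than just hiding the duplicate. A hypothetical sketch (the URLs are made up):

```html
<!-- In the <head> of the printable version, e.g. /product/widget?print=1 -->
<link rel="canonical" href="https://www.example.com/product/widget" />
```

A noindex keeps the print page out of the index but doesn't consolidate link signals as cleanly, and a robots.txt disallow prevents crawling entirely, so Google can never see a canonical placed on the page.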
Duplicate content issue with dynamically generated url
Hi, For those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From the page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented, and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
On-Page Optimization | | multilang
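Two common options for parameterized listing URLs like the one above: point a canonical from each filtered page back at the unfiltered listing, or block the parameterized variants in robots.txt. A hypothetical robots.txt sketch (prefix matching blocks any URL beginning with this string, so /listing.html itself stays crawlable):

```text
User-agent: *
# Block crawl of filtered/paginated listing variants (hypothetical rule)
Disallow: /listing.html?
```

A rel=canonical on the filtered pages pointing at /listing.html achieves a similar end while still letting the filtered pages be crawled, which preserves the paths to the individual listings they link to.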