Please help - Duplicate Content
-
Hi,
I am really struggling to understand why my site has a lot of duplicate content issues.
It's flagging up as ridiculously high and I have no idea how to fix it. Can anyone help me, please?
Website is www.firstcapitol.co.uk
-
We have a company that specialises in building luxurious garden buildings.
We found that many of the main pages we wrote weren't complete duplicates, but the wording was too similar from page to page, which we think damaged our SEO.
We started adding well-written content marketing pieces.
Then we saw organic traffic increase massively, which improved our SEO.
-
Hi Alix
It will solve your problems as long as you make sure all the meta data is completed, the content is differentiated, and those pages are noindexed or removed as I said. This is what is causing your duplication.
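For reference, the noindex mentioned here is a meta robots tag placed in the <head> of each page you want kept out of the index - a minimal sketch:

```html
<!-- Keeps this page out of search results while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```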
Glad to help
Regards Nigel
-
Wow - thank you so much, you have no idea how much this helps me!
I thought it was the code that was being flagged as duplicate content, since the pages all use a similar template that is slightly modified for each page.
If I fix these, will it reduce the amount of duplicate content that has been flagged?
It was quite high.
Thanks again!
-
Hi Alix
Just go to Google and type in site:firstcapitol.co.uk/ and you will see all of your pages that are indexed.
It's pretty easy as there are only 46.
1. Go through them and noindex the payment pages, author pages and category pages.
2. Delete the sample blog page.
3. Delete /hello-world.
4. Delete /london-office.
5. Then write unique content for what is left. Your London page (/london-collection-agency/) is a duplicate of the home page and should be canonicalised, as it offers no value. You also have a contact page and a London contact page, so get rid of the latter.
6. Write unique titles and descriptions, keeping to roughly the 60-character and 140-character limits respectively, and make sure no two pages share them (see the markup sketch below).
This is very easy to fix.
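To make the checklist concrete, here is a rough sketch of the head markup it refers to - the title, description and canonical values are made-up examples rather than actual recommendations for the site, and the sketch assumes the https www version is the preferred one:

```html
<head>
  <!-- Unique title for this page, roughly 60 characters -->
  <title>Example Unique Page Title | First Capitol</title>
  <!-- Unique description, roughly 140 characters, different on every page -->
  <meta name="description" content="A one- or two-sentence summary written specifically for this page, so no two pages share the same description.">
  <!-- On the London page that duplicates the home page, point the canonical at the home page -->
  <link rel="canonical" href="https://www.firstcapitol.co.uk/">
</head>
```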
Regards Nigel
-
Related Questions
-
Should I switch my website builder/host? Please help.
My website, www.joeborders.com, is hosted with a service called Jigsy (www.jigsy.com). I'm losing my mind trying to figure out if I should stay or not. Lol. I am positive I have done waaaayyy more work on my SEO than many people ranking above me. I used to be on the first page, but over the last year I've slowly dropped in rankings. I've checked everything! I need to do some work on my blog, but I'm really thinking now that it might have something to do with my host. Some concerns I've identified:
1. I can't give pages individual H1 tags - the same one is blanketed across the site.
2. I'm told there are a lot of .css and JavaScript files.
3. I can't redirect blog posts, so Moz is tagging me with 250 critical issues because my posts are on both the www and http versions of my site.
But that's all I know. I've talked with Squarespace and WordPress and they have no way of transferring my site. It would probably take me a good 30 hours to set everything up. Should I move? Please help 😞
Intermediate & Advanced SEO | joebordersmft
-
HTTP to HTTPS Migration Gone Wrong - Please Help!
We have a large (25,000-product) ecommerce website, and we did an HTTP => HTTPS migration on 3/14/17. Our rankings went in the tank, but they are slowly coming back. We initially lost 80% of our organic traffic; we are currently down about 50%. In retrospect, we may have been too aggressive in the move. Here are some of the issues:
1. We didn't post our old sitemaps on the new site until about 5 days into the move.
2. We created a new HTTPS property in Search Console.
3. Our redirects were 302, not 301, and we also had some other redirect issues.
4. We changed our URL taxonomy from http://www.oursite.com/category-name.html to https://www.oursite.com/category-name (removed the .html).
5. We changed our filters plugin. Proper canonicals were used, but the filters can generate N! canonical pages. Yesterday I added some parameters (and posted them to Search Console) and noindexed pages with multiple filter choices to cut down on our crawl budget (roughly the sort of markup sketched below).
Here are some observations: Google is crawling like crazy - 120,000+ pages per day since the move. These are clearly the filtered pages, but they do have canonicals. Our old sitemaps got "Roboted Out" error messages, yet when we test the URLs in Google's robots.txt tester they test fine. Very odd. At this point, in Search Console:
a. The HTTPS property has 23,000 pages indexed.
b. The HTTP property has 7,800 pages indexed.
c. The crawl of our old category sitemap (852 categories) is still pending; it was posted and submitted on Friday 3/17.
Our average daily organic traffic in Search Console before the move was +/- 5,800 clicks; the most recent data shows HTTP: 645 clicks, HTTPS: 2,000 clicks. Our rank tracker shows a massive drop over 2 days, bottoming out, and then some recovery over the next 3 days. The HTTP site is showing 500,000 backlinks; HTTPS is showing 23,000 backlinks. I am planning on resubmitting the old sitemaps today in an attempt to remap our redirects to 301s. Is this typical? Any ideas?
Intermediate & Advanced SEO | GWMSEO
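For the filtered pages described in point 5, the markup in question would look roughly like this - the URL and filter parameters are invented for illustration:

```html
<!-- Hypothetical filtered URL: https://www.oursite.com/category-name?color=red&size=large -->
<head>
  <!-- Keep filter combinations out of the index but let crawlers follow their links -->
  <meta name="robots" content="noindex, follow">
  <!-- Consolidate signals on the unfiltered category page -->
  <link rel="canonical" href="https://www.oursite.com/category-name">
</head>
```
-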
Best Way to Incorporate FAQs into Every Page - Duplicate Content?
Hi Mozzers, We want to incorporate a 'Dictionary' of terms onto quite a few pages on our site, similar to an FAQ system. The 'Dictionary' has 285 terms in it, with about 1 sentence of content for each one (approximately 5,000 words total). The content is unique to our site and not keyword stuffed, but I am unsure what Google will think about us having all this shared content on these pages. I have a few ideas about how we can build this, but my higher-ups really want the entire dictionary on every page. Thoughts? Image of what we're thinking here - http://screencast.com/t/GkhOktwC4I Thanks!
Intermediate & Advanced SEO | Travis-W
-
Could the hreflang tag solve duplicate content problems on the different country versions?
I have run across a couple of articles recently suggesting that using the hreflang tag could solve the SEO problems associated with having duplicate content on the different country versions (.co.uk, .com, .ca, etc.). Here is an example: http://www.emarketeers.com/e-insight/how-to-use-hreflang-for-international-seo/ Over to you and your technical colleagues, I think…
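For context, hreflang annotations for a setup like that would look roughly like this - the domains below are placeholders, not the asker's actual sites:

```html
<!-- Placed in the <head> of every country version; each version lists itself and all of its alternates -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<!-- Fallback for visitors whose language/region matches none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that hreflang identifies alternate language/region versions so Google can serve the right one; it does not canonicalise or merge the pages.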
Intermediate & Advanced SEO | JordanBrown
-
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - we have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (one for each specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = canonical tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = canonical tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = canonical tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com/ = canonical tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = canonical tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = canonical tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (Option 1), but I am wondering if setting the canonical tag to the root domain will pass "some link juice" to the root domain and be more beneficial (Option 1 is sketched as markup below). Thanks!
Intermediate & Advanced SEO | NewspaperArchive
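As markup, Option 1 for the first example would be a single link element in the subdomain's head - shown here purely to illustrate the question, not as a recommendation:

```html
<!-- In the <head> of http://covina.abc.com/ -->
<link rel="canonical" href="http://abc.com/us/california/covina/">
```
-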
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) 3 domains for one niche travel business across three TLDs - .com, .com.au and .co.uk - and a fourth domain on a .co.nz TLD which was recently removed from Google's index.
Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au pages being rendered in .com results) and Panda-related ranking devaluations between our .com site and our .com.au site. Around 12 months ago the .com TLD was hit hard (an 80% drop in target keywords) by Panda (probably) and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, at almost the same percentage we saw significant drops in our .com.au rankings - basically Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin) and we have always aimed for quality in an often spammy industry.
Have done:
- Added hreflang markup to all pages on all domains.
- Each TLD uses local vernacular, e.g. the .com site uses American English.
- Each TLD has pricing in the regional currency.
- Each TLD has details of the respective local offices and the copy references the location; we have significant press coverage in each country, like The Guardian for our .co.uk site and the Sydney Morning Herald for our Australia site.
- Targeted each site to its respective market in WMT.
- Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique.
- We're continuing to re-write and publish unique content to each TLD on a weekly basis.
- As the .co.nz site drove so little traffic, rather than re-writing it we added noindex, and the TLD has almost completely disappeared from the SERPs (16% of pages remain).
- XML sitemaps.
- Google+ profile for each TLD.
Have not done:
- Hosted each TLD on a local server.
- Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content). These are way down the IA but still duplicated.
- Images/video sourced from local servers.
- Added address and contact details using schema markup (a rough sketch of what this could look like is below).
Any help, advice or just validation on this subject would be appreciated! Kian
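The address/contact schema markup mentioned in the "Have not done" list could look something like this JSON-LD sketch - the business details are entirely made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "TravelAgency",
  "name": "Example Travel Co (UK office)",
  "url": "https://www.example.co.uk/",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  }
}
</script>
```

Each TLD would carry its own local office details, which reinforces the regional targeting signals described above.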
Intermediate & Advanced SEO | team_tic
-
Wordpress Duplicate Content Due To Allocating Two Post Categories
It looks like Google has done a pretty deep crawl of my site and is now showing around 40 duplicate content issues for posts that I have tagged in two separate categories, for example: http://www.musicliveuk.com/latest-news/live-music-boosts-australian-economy http://www.musicliveuk.com/live-music/live-music-boosts-australian-economy I use the All in One SEO Pack and have checked the noindex boxes for categories, archives, and tag archives, so Google shouldn't even crawl this content, should it? I guess the obvious answer is to only put each post in one category, but I shouldn't have to, should I? Some posts are relevant in more than one category.
Intermediate & Advanced SEO | SamCUK
-
Two Brands One Site (Duplicate Content Issues)
Say your client has a national product that's known by different brand names in different parts of the country. Unilever owns a mayonnaise sold east of the Rockies as "Hellmanns" and west of the Rockies as "Best Foods". It's marketed the same way - same slogan, graphics, etc. - only the logo/brand is different. The websites are nearly identical apart from the logos, especially the interior pages. The Hellmanns version of the site has earned slightly more domain authority. Here is an example recipe page for a "Waldorf Salad Wraps by Bobby Flay" recipe: http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1 http://www.hellmanns.us/recipe_detail.aspx?RecipeID=12497&version=1 Both recipe pages are identical except for one logo. Neither page ranks very well, neither has earned any backlinks, etc. Oddly, the Best Foods version ranks better (even though everything is the same, with the same backlinks, and hellmanns.us has more authority). If you were advising the client, what would you do? You would ideally like the Hellmanns version to rank well for East Coast searches and the Best Foods version for West Coast searches. So do you:
1. Keep both versions with duplicate content and focus on earning location-relevant links, i.e. earn Yelp reviews from East Coast users for Hellmanns and West Coast users for Best Foods?
2. Cross-domain canonical to give more of the link juice to only one brand, so that only one of the pages ranks well for non-branded keywords (but both sites would still rank for their branded keywords)? A sketch of what this would look like is below.
3. Noindex one of the brands so that only one version gets in the index and ranks at all? The other brand wouldn't even rank for its branded keywords.
Assume it's not practical to create unique content for each brand (the obvious answer). Note: I don't work for Unilever, but I have a client in a similar position. I lean towards #2, but the social media firm on the account wants to do #1 (obviously some functionally based bias in both our opinions, but we both just want to do what will work best for the client). Any thoughts?
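As a sketch of option 2, a cross-domain canonical on the recipe example would be one line in the head of whichever version you demote - here the Best Foods page defers to the Hellmanns page, though it could just as easily be the other way round:

```html
<!-- In the <head> of http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1 -->
<link rel="canonical" href="http://www.hellmanns.us/recipe_detail.aspx?RecipeID=12497&amp;version=1">
```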
Intermediate & Advanced SEO | crvw