Penalized for duplication?
-
Hi there,
In February 2012, one of my web pages (.co.uk) dropped from page 1 to page 5 for the keyword 'menopause' and was replaced in the results by a PDF.
In late January 2012 I had launched a duplicate version of this web page targeting .ie, due to differences in currency and legalities. I made sure in Webmaster Tools that both websites were geographically targeted correctly, and I am also using hreflang tags on both pages.
One strange thing: if I paste the first few paragraphs of the web page in question into Google.co.uk, it's the .ie page that appears.
Any help in understanding why this has happened would be appreciated.
Kind Regards
-
Hi Stephen, the URLs in question are:
UK version:
http://www.avogel.co.uk/health/menopause/
Irish version:
http://www.avogel.ie/health/menopause/
Any help would be appreciated.
Kind Regards
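For anyone wanting to double-check their own markup: a complete, reciprocal hreflang setup for these two URLs would look like the sketch below. Both pages need the full set of annotations (each page references itself and its counterpart), and the href values must match the indexed URLs exactly.

```html
<!-- On http://www.avogel.co.uk/health/menopause/ -->
<link rel="alternate" hreflang="en-gb" href="http://www.avogel.co.uk/health/menopause/" />
<link rel="alternate" hreflang="en-ie" href="http://www.avogel.ie/health/menopause/" />

<!-- On http://www.avogel.ie/health/menopause/ -->
<link rel="alternate" hreflang="en-gb" href="http://www.avogel.co.uk/health/menopause/" />
<link rel="alternate" hreflang="en-ie" href="http://www.avogel.ie/health/menopause/" />
```

If the tags on either page are missing the self-referencing entry, or only one page carries the annotations, Google may ignore the hreflang signal entirely.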
-
I have had the same thing happen to two of our sites: same language, but different country and TLD.
Google ignores the hreflang element. We even made sure to include the country in the page title and in the footer. Since we use country-specific TLDs, we have no option to set a specific market in Webmaster Tools, and shouldn't need to.
I guess it is just the way Google works when you have two sites with basically the same content.
In our case we weren't penalized; it's just that Google thinks our foreign site is more relevant.
-
It could be almost anything. We need more information. Post the URL.
S
Related Questions
-
Near Duplicate Title Tag Checker
Hi Everyone, I know there are a lot of tools like Siteliner, which can check the uniqueness of body copy, but are there any that can restrict the check to the title tags alone? Alternatively, is there an Excel or Google Sheets function that would allow me to do the same thing? Thanks, Andy
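A short script can do this check from an exported crawl. Below is a minimal sketch in Python, assuming you already have (url, title) pairs (e.g. exported from your crawler of choice); the 0.9 similarity threshold and the sample data are arbitrary illustrations, not from any real site:

```python
# Flag near-duplicate title tags from (url, title) pairs.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicate_titles(pages, threshold=0.9):
    """Return (url_a, url_b, similarity) for every pair of pages
    whose titles are at least `threshold` similar (case-insensitive)."""
    dupes = []
    for (url_a, title_a), (url_b, title_b) in combinations(pages, 2):
        ratio = SequenceMatcher(None, title_a.lower(), title_b.lower()).ratio()
        if ratio >= threshold:
            dupes.append((url_a, url_b, round(ratio, 2)))
    return dupes

pages = [
    ("/menopause/", "Menopause symptoms and treatment | Example Store"),
    ("/menopause-2/", "Menopause symptoms and treatment | Example Store UK"),
    ("/contact/", "Contact us | Example Store"),
]
print(near_duplicate_titles(pages))
```

In Google Sheets there is no built-in fuzzy match, but `=EXACT()` catches exact duplicates, and sorting the title column alphabetically makes near-duplicates easy to eyeball.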
Intermediate & Advanced SEO | AndyRSB
-
Duplicate Content Question With New Domain
Hey Everyone, I hope your day is going well. I have a question regarding duplicate content. Let's say that we have Website A and Website B. Website A is a directory for multiple stores & brands. Website B is a new domain that will serve the delivery niche for these stores & brands (users can click a "Delivery" anchor on Website A and it will redirect them to Website B). We want Website B to rank organically when someone types "<brand> delivery" into Google. Website B has NOT been created yet. The issue: Website B has to be a separate domain from Website A (no getting around this). Website B will also pull all of its content from Website A (menus, reviews, about, etc.). Will we face any duplicate content issues on either Website A or Website B in the future? Should we rel=canonical to the main website even though we want Website B to rank organically?
Intermediate & Advanced SEO | imjonny
-
Canonical tags for duplicate listings
Hi there, We are restructuring a website. The website lists jobs that will have duplicate content. We have asked the client not to use duplicates, but apparently that is not something they can control in their industry. My recommendation was to have category pages (which will hold the main description for a group of jobs) and the individual job listing pages; the job listing pages would then have canonical tags pointing to the category page as the primary URL to be indexed. Another opinion, from a third party, is that this could be seen as tricking Google and would get us penalised. **Is that even true?** Why would Google penalise this if it's their recommendation in the first place? This third party suggested using nofollow on the links to these listings, or even not indexing them altogether. What are your thoughts? Thanks Issa
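If you do go the canonical route, the markup on each job listing page is a single link element in the head. A sketch, with hypothetical URLs:

```html
<!-- In the <head> of a job listing page, e.g. /jobs/senior-engineer-123/ -->
<link rel="canonical" href="https://www.example.com/jobs/engineering/" />
```

Note that rel=canonical is intended for duplicate or near-duplicate pages, and Google treats it as a hint rather than a directive; if the listing pages differ substantially from the category page, Google may simply ignore it.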
Intermediate & Advanced SEO | iQi
-
Should I 301 a penalized domain to another domain's subfolder?
I have a niche domain that seems to have been hit by Penguin. It had very good rankings before the update, and I think at least a good part of the penalty might be due to over-optimized anchor text. So here is the question: if I decide to take this site down, should I 301 the entire domain to a relevant subfolder of another site, e.g. comtemporaryfurniture.com to domain.com/category/modern-furniture.html? Will the penalty get passed on to the new domain? If the penalty is partly due to anchor text, then pointing it to another site's subfolder would mean the target URL has more varied anchor text, and could boost rankings.
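For reference, if you did decide to redirect, a minimal sketch of the .htaccess on the old domain (Apache with mod_rewrite assumed; the domain names are the examples from the question) would be:

```apache
# .htaccess at the root of the old (penalized) domain:
# send every request on this host to the target page with a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?comtemporaryfurniture\.com$ [NC]
RewriteRule ^ https://domain.com/category/modern-furniture.html [R=301,L]
```

Whether the penalty follows the redirect is a separate question from the mechanics; a 301 passes link signals, good and bad, so the risk described in the question is real.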
Intermediate & Advanced SEO | inhouseseo
-
SEO advice / plan? Penalized?
I built an ecommerce site for a client just over 9 months ago. To begin with the SERPs were great and everything was listed in the results, but a few weeks in, all the results vanished from Google, and now we're lucky to find anything. I've been as far as page 200 and haven't found any results. It's been like this for a solid 8 months, so I can only presume the site has been penalised in some form. Searching for unique phrases from the site doesn't even return results.
The website in question is http://goo.gl/A6Gz2
Keywords we're aiming for: coloured contact lenses, fashion contact lenses
Target: Google UK
Now I'm not really an SEO guy, but regardless, my client has hired me to see what's going on and correct it. I've been scratching my head thinking all sorts of things, none of which I'm certain about, so I'm looking for someone to point me in the right direction before I do anything drastic. To begin with, here are my suspicions about what could be affecting ranking and possibly causing penalisation:
#1 - Too many links on the page
#2 - Possibly over-optimised
#3 - Lack of content on the product and category pages
#4 - Lack of backlinks and links in general coming from other sites
My main concern is the lack of links from other sites and the odd link coming from low-quality sites. I've also just found out that my client has been using an automatic link submitter, which I've always thought of as a big no-no. Some of the sites these links have been submitted to have nothing to do with the keywords we are targeting and are spammy sites containing all sorts of links. I'm wondering if these poor-quality links could have caused the site to be penalised; Google may be seeing it as a spammy site because of them.
What's your opinion on the above? Are my suspicions correct, and can this be recovered? My planned course of action is as follows:
#1 - Rewrite the content currently on the site so that it is better written and includes more keywords, especially long tails, since I think these will help bring the SERPs up.
#2 - Write detailed category and product descriptions, and make sure every page has some well-written content with links and keywords.
#3 - Keep each of the above pages to one main subject/keyword so that Google doesn't get confused.
#4 - Get some links on popular and relevant sites. The only problem here is the lack of fashion contact lens sites. Does anyone have any advice on how to find these, or where I should be getting links placed? Are directories worthwhile?
#5 - Get more involved in the social side, i.e. Facebook and Twitter.
I will be building on the above over time, as well as running Google ads moderately for our chosen keywords. Is there anything I have missed, or anything I shouldn't be doing? Please advise. Thanks.
Intermediate & Advanced SEO | gfxpixeldesigns
-
Google Places Duplicate Listings
Hey Mozzers, I know the basic process for handling duplicate listings, but I just want to make sure and ask, because this one is a little sensitive. I have a client with a claimed and verified listing page, which is here: http://maps.google.com/maps/place?q=chambers+and+associates&hl=en&cid=9065936543314453461 There is also another listing (which I have not claimed yet) here: http://maps.google.com/maps/place?q=dr.+george+chambers&hl=en&cid=14758636806656154330 The first listing has 0 reviews, whereas the 2nd, unverified listing has 12 fantastic 5-star reviews. We can all agree that if I can get these two listings to merge, his listing will perform much better than it already does (the first listing gets about 200 actions per month). So, what is the best way to merge these two without losing any reviews and without suspending my Places account? Thanks in advance! Ian
Intermediate & Advanced SEO | itrogers
-
Duplicate block of text on category listings
Fellows, We are deciding whether we should include our category description on all pages of a paginated category listing, for example page 1, page 2, page 3... The category description is currently a few paragraphs of text that sits only on page 1 of the category. It also includes an image (linked to a large version of it) with appropriate ALT text. Would we benefit from including this introductory text on the rest of the pages in the category, or should we leave it on the first page only? Would it flag up duplicate content signals? Ideas please! Thanks.
Intermediate & Advanced SEO | Peter264
-
Mobile version creating duplicate content
Hi, We have a mobile site which is a subfolder within our main site: the desktop site is www.mysite.com and the mobile version is www.mysite.com/m/. URLs for specific pages are the same, with the exception of the /m/ in the mobile version. The mobile version uses user-agent detection. I never saw this as duplicate content initially, as I did some research and found the following links:
http://www.youtube.com/watch?v=mY9h3G8Lv4k
http://searchengineland.com/dont-penalize-yourself-mobile-sites-are-not-duplicate-content-40380
http://www.seroundtable.com/archives/022109.html
What I am finding now is that when I look in Google Webmaster Tools, Google shows 2 pages with the same page title, so I'm concerned that Google sees this as duplicate content. The page titles and meta descriptions are the same simply because the content on the 2 versions is identical; only the layout changes for handheld browsing. Are there any specific precautions I could take, or best practices, to ensure that Google does not see the mobile pages as duplicates of the desktop pages? Does anyone know solid best practices for achieving maximum results when running an identical mobile version of your main site?
Intermediate & Advanced SEO | peterkn
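One documented way to tell Google that a desktop URL and its /m/ counterpart are the same page, rather than duplicates, is the bidirectional alternate/canonical annotation for separate mobile URLs. A sketch, with "/page/" as a placeholder path:

```html
<!-- On the desktop page, www.mysite.com/page/ -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://www.mysite.com/m/page/" />

<!-- On the mobile page, www.mysite.com/m/page/ -->
<link rel="canonical" href="http://www.mysite.com/page/" />
```

Because the site serves different HTML by user agent, it is also worth sending a `Vary: User-Agent` HTTP header on these URLs so caches and crawlers know the response differs by device.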