Competitor has same site with multiple languages
-
Hey Moz,
I am working with a dating review website, and we have noticed that one of our competitors is essentially making duplicates of their site across .com, .de, .co.uk, etc.
My first thought is that this is basically a way to game the system, but I could be wrong.
They are tapping into Google's geo results by including major cities and states, e.g. "dating in texas," "dating in atlanta," but the content itself doesn't really change. I can't figure out exactly why they are ranking so much higher. For example, some third-party SEO tools give them an estimated traffic value of $500,000 per month, whereas we are sitting around $2,000. So either the traffic estimates grossly misrepresent traffic volume, or they really are crushing it.
TLDR: Is geo-locating/translating sites a valid way to create backlinks? It seems a lot like a PBN.
-
1. Good!
2. You are confused for good reason; there has not been clear direction here for some time. If you use hreflang annotations between the two sites, then for the last several years the content has not been treated as duplicative. Hreflang inherently tells Google that the content is the same but aimed at different languages or regions.
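For illustration only, a minimal sketch of reciprocal hreflang annotations in each page's `<head>` (the domains here are hypothetical placeholders, not the sites discussed above):

```html
<!-- On https://example.com/ (English version) -->
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="de" href="https://example.de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />

<!-- On https://example.de/ (German version) — the same set, verbatim -->
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="de" href="https://example.de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that the annotations must be reciprocal: each version lists every alternate, including itself. If site A points to site B but B doesn't point back, the annotations are ignored.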
3. There is so much that goes into this, but I can tell you with years of experience under my belt that the numbers don't ever tell the whole story.
-
Thank you!
-
We have decided that localizing is not worth it; it appears spammy, and we cannot offer curated content at that level.
-
I am still a little confused about hreflang. For example, let's say we have a .com and a .de website. Both are nearly identical with the exception of hreflang and a handful of product/pricing descriptions (for example, the US website might not list Germany-specific dating sites, whereas the German site will most likely include most major dating sites simply because they have the reach). Does Google see this as duplicate content, or does hreflang indicate to Google that these should be treated as two different websites?
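As a point of reference, hreflang can also be declared in an XML sitemap instead of on-page markup, which is often easier to maintain for two near-identical sites. A minimal sketch for a hypothetical .com/.de pair (URLs and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/pricing</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/pricing"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.de/pricing"/>
  </url>
  <url>
    <loc>https://example.de/pricing</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/pricing"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.de/pricing"/>
  </url>
</urlset>
```

Either way, hreflang signals that the pages are intended alternates for different audiences; it does not merge them into one site, and the pages with country-specific listings remain distinct URLs.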
-
To sum this up, we are trying to determine how our competition has such a large keyword footprint when, as far as the numbers are concerned (page count, word count, etc.), we are basically on the same level.
-
This is a tough one, because there are arguments on both sides for why this kind of duplicate content should or should not be allowed. Think about an ERP SaaS company that needs to change its content just a bit between countries. They might build what looks like a duplicate site, even though some pages exist in one country's version but not another's.
In your example it seems like a poor experience, but logged-in users might be getting different content depending on their location.
To my second point: Those outside tools are shit. Total shit. Don't trust those numbers. Do your numbers line up with theirs?
Final point: Do not build and maintain duplicate content in hopes of getting links. It won't work over time. Anything can work for a short period of time, but in the end, they will figure it out. Trust me.