The use of a ghost site for SEO purposes
-
Hi Guys,
We've just taken on a new client (.co.uk domain) and during our research we've identified that they also have a .com domain, which is a replica of the existing site but with every link pointing back to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site.
After speaking to the client, it appears they were approached by a company that claimed it could get the .com site ranking for local search queries and then push all of that traffic to the .co.uk. From analytics we can see that very little referral traffic is actually coming from the .com.
It sounds remarkably dodgy to us - surely the duplicate site is an issue in its own right for obvious reasons, and these links could also be deemed to have been created purely for SEO gain?
Does anyone have any experience of this as a tactic?
Thanks,
Dan
-
The appropriate solution here would be to use the hreflang tag to relate the two geographically targeted sites to each other. However, before you take that step, I would make sure the previous SEO company that created the .com did not point any harmful links at it, which would make it inadvisable to connect the two sites. Use Open Site Explorer to review the backlink profile, and verify the .com site in Google Webmaster Tools to check for any manual penalty notices. If all looks clear, go ahead with the hreflang implementation.
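To illustrate, here's a minimal sketch of what the reciprocal annotations might look like in the head of each pair of equivalent pages. The domains are placeholders (the real ones aren't given in this thread), and the en-gb/en-us targeting is an assumption - swap in whichever market the .com is actually meant to serve:

```html
<!-- Placeholder domains; the en-gb / en-us targeting is an assumption.
     Both the .co.uk and .com version of each page should carry the same
     reciprocal set of annotations. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/widgets/" />
```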
Good luck and feel free to ask more questions here!
-
Thanks Both,
Pretty much confirms our thoughts here, and yes Eddie - it appears to be a smash-and-grab job.
Dan
-
Certainly sounds dodgy, but suddenly removing all of those backlinks might cause you some SEO issues.
Depending on how Google is currently reading your site, it may improve - the site would seem less spammy without them - or it may really hurt the site, at least to start with; losing that many backlinks could make Google think something is up with your site.
I would bite the bullet and remove the duplicate content, but warn your clients that it may take a while for the natural benefits to come through, because if your site isn't penalised yet for having that many dodgy backlinks and duplicate content, it soon will be!
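If taking the .com offline has to wait, one interim option - my own suggestion, not something covered above - is a cross-domain rel=canonical from each .com page to its .co.uk equivalent, so Google consolidates the duplicate content onto the site you actually want to rank. A minimal sketch with placeholder URLs, placed in the head of the .com version of the page:

```html
<!-- Hypothetical example: placed on the .com replica page, pointing to the
     equivalent .co.uk page so duplicate signals consolidate there. -->
<link rel="canonical" href="https://www.example.co.uk/widgets/" />
```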
-
Certainly seems like the wrong thing to do. A good test: if you think it may be dodgy, it probably is. I certainly wouldn't recommend it as a tactic. There are potentially multiple issues with this - duplicate content, as you mentioned, but also dilution of real links. Good-quality, legitimate links could point to the ghost site and therefore not count for the real site.
I have seen cases where a legitimate attempt to run a .com and a .co.uk for the same shop ended up with both versions online due to incompetent development, though I didn't have to deal with cleaning it up.
Unpicking that could be messy. A good example of quick-fix SEO for a fast buck, I suspect.
-
5mil+ links?! Wow!
What's their spam score? I'm surprised they haven't been blocked or something.
To answer your question - what does common sense tell you? Google and its bots work largely on common sense. So: a duplicate-content website, a ridiculous number of links, no referral traffic - all of these are obvious signals to run, Forrest, run!