Two sites, heavily cross-linked, targeting the same keyword - is this a battle worth fighting?
-
Hi Mozzers,
Would appreciate your input, as people have differing views on this when asked...
We manage 2 websites for the same company (on very different domains). Both sites target the same primary keyword phrase; however, the user journey should incorporate both websites, so the sites are very heavily cross-linked, allowing us to easily pass a user from one site to the other.
Whilst site 1 is performing well for the target keyword phrase, site 2 isn't. Site 1 consistently ranks around position 2 to 3, but we've only ever seen site 2 reach the top of page 2 of the SERPs at best, despite a great deal of white-hat optimisation, and it is now on the decline.
There's also a trend (albeit a minimal one): when site 1 improves in rank, site 2 drops.
Because the 2 sites are so heavily interlinked, could Google be treating them as one site and therefore dropping site 2 in the SERPs, since it is in Google's interest to show different, relevant sites?
-
Alpha,
Is the domain authority the same for both sites? How similar are their backlink profiles? Is there other interlinking going on between them and other sites? If all else is on the up and up, my feeling is that Google wouldn't be seeing them as a single site, but that as the algorithm evolves, Google's understanding of the relationship between the two sites is being more clearly defined.
But then again, if you think in terms of an "entity" (rather than "sites"), maybe Google does see them as "conjoined twins" : ), with one being stronger than the other; if a searcher finds one, it will certainly find the other.
Maybe it's time to start experimenting with redirecting things to a single domain. You could start with a single page or category and see how traffic or rankings change. I wouldn't bet that it would bring you more traffic, but maintaining a single domain vs. two interlinked ones could save time and effort.
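An experiment like that can be scoped to one URL with server rewrite rules. A minimal Apache sketch, assuming both sites run on Apache with mod_rewrite; the domains and path here are placeholders, not from the thread:

```apache
# Hypothetical .htaccess on site2: permanently redirect one test page
# to its counterpart on site1, leaving the rest of site2 untouched.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site2\.example$ [NC]
RewriteRule ^category/tv-stands/?$ https://www.site1.example/category/tv-stands/ [R=301,L]
```

Because only one rule is added, rankings and traffic for that single page can be watched in isolation before committing to a full domain merge.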
-
Hi,
Yes, it is in Google's interest to show different, relevant sites to users.
You said, "2 websites for the same company (very different domains) - both sites are targeting the same primary keyword phrase, however, the user journey should incorporate both websites, and therefore the sites are very heavily cross linked - so we can easily pass a user from one site to another."
I am assuming that by "very different domains" you mean different site addresses, because two sites on two very different topics can't be heavily interlinked. In this case, I would recommend keeping the single website that is performing well in the search engines and merging the other one into it. As you said, the user journey should incorporate both websites, which means they are closely linked. It would be better to have a single website that satisfies users' needs.
Regards
Related Questions
-
Clean-up question after a WordPress site hack added pages with external links from a massive link wheel?
Hey all, thought I would throw this out to ensure I am dotting my i's and crossing my t's... A client's WordPress site was hacked: 3-4 injected pages cross-linked to hundreds of affiliate spam pages (a link wheel). The pages were removed, and a 3rd party cleared all malware/viruses. A heavy-duty firewall and security monitoring are in place. The hacked pages now return 404. No penalties or ranking issues - if anything, there was a temporary BOOST in rankings due to the large link-wheel-type net the pages were receiving. That has since leveled out. I guess my question is: in your opinion, is it best to let those pages 404? I am noticing a large number of links pointing to them from all over the world from this link net. I find the temptation to 301-redirect the deleted pages to the homepage difficult...lol... {the temptation is REAL}. Is there anything I am missing? Any other steps that YOU would take? I am assuming letting those pages 404 would be the best bet, as in time they will roll off the index... Thank you in advance, I appreciate any feedback or opinions.
White Hat / Black Hat SEO | Anthony_Howard
-
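For the 404-vs-redirect dilemma above, one option is to answer for the removed URLs with an explicit 410 Gone, which signals permanent removal more strongly than a 404. A hedged Apache sketch, assuming an Apache host with mod_alias; the paths are invented for illustration, not the actual hacked URLs:

```apache
# Hypothetical .htaccess rules: return 410 Gone for the injected
# pages the hack created, rather than a soft 404 or a 301 to the homepage.
Redirect gone /injected-spam-page-1/
Redirect gone /injected-spam-page-2/
```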
Links Identified in WMT not on Webpages
Hi, We're currently reviewing one of our clients' backlink profiles in Google Webmaster Tools, Majestic & OSE, as we can see many toxic links. However, we cannot find the links on the webpages listed in Google WMT. We have searched through the websites and checked the source code. Should we still disavow the domains? Thanks, Edd
White Hat / Black Hat SEO | tomcraig86
-
Hacked site vs No site
So I have this website that got hacked with cloaking, and Google has labeled it as such in the SERPs - with due reason, of course. My question is: I am going to relaunch an entirely new, redesigned website in less than 30 days. Do I pull the hacked site down until then or leave it up? Which option is better?
White Hat / Black Hat SEO | Rich_Coffman
-
Preventing CNAME Site Duplications
Hello fellow Mozzers! Let me see if I can explain this properly. First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive any ignorance of terms).
We have a client that needs a CNAME record set up, as they need sales.DOMAIN.com to point to a different data provider. They have a "store" platform that is hosted elsewhere, and the provider requires a CNAME pointed at a custom subdomain they have set up on their end.
My question is: how do we prevent the CNAME subdomain from being indexed along with the main domain? If we set up a redirect for the subdomain, then the site will not be able to fetch and display the other provider's info. Currently, if you type in sales.DOMAIN.com, it shows the main site's homepage. That cannot be allowed to happen, as we all know that having more than one domain with the exact same content is very bad for SEO, and I'd rather not rely on Google to figure it out.
Should we just have the CNAME host (where it's pointing) add a robots rule set to not index the subdomain? The store does not need to be indexed, as the items change almost daily. Lastly, is an A record required for this type of situation in any way?
Forgive my ignorance of subdomains, CNAME records and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful!
White Hat / Black Hat SEO | David-Kley
-
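One common answer to the subdomain question above, assuming the store provider lets you control responses served on the subdomain: each hostname gets its own robots.txt, so a file served only on the subdomain blocks it without touching the main domain (sales.DOMAIN.com is the asker's placeholder):

```
# robots.txt served only at https://sales.DOMAIN.com/robots.txt
# The main domain's robots.txt is a separate file and is unaffected.
User-agent: *
Disallow: /
```

Note that a robots.txt Disallow prevents crawling but does not guarantee the URL stays out of the index; if de-indexing is the goal, serving an `X-Robots-Tag: noindex` HTTP header on the subdomain's responses (with crawling left allowed so Googlebot can see the header) is the more reliable signal.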
Embedded links/badges
Hi there, Just picking up on something Rand said in his blog post analysing his predictions for 2014. Rand predicted that "Google will publicly acknowledge algorithmic updates targeting... embeddable infographics/badges as manipulative linking practices". While this hasn't exactly materialised yet, it has got me thinking. We have a fair few partners linking to us through an embedded badge. This was done to build the brand, but the positives here wouldn't be worth being penalised in search. Does anyone have any further evidence of websites being penalised for doing this, or any views on whether removing those badges should be a priority for us? Many thanks
White Hat / Black Hat SEO | HireSpace
-
Why should I reach out to webmasters before disavowing links?
Almost all the blogs, and Google itself, tell us to reach out to webmasters and request that the offending links be removed before using Google's Disavow tool. None of the blogs, nor Google, explain why you "must" do this; it's time-consuming, and many webmasters don't care and don't act. Why is this a "required" thing to do?
White Hat / Black Hat SEO | RealSelf
-
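As background to the disavow discussion above: the file Google's tool accepts is plain UTF-8 text with one full URL or `domain:` entry per line and `#` for comments. A small, hypothetical Python helper to assemble one (the function name and the domains are invented for illustration):

```python
# Hypothetical helper that assembles a file in Google's disavow format:
# one "domain:example.com" entry or full URL per line, "#" for comments.
def build_disavow(domains, urls):
    lines = ["# Outreach attempted; no response from these webmasters"]
    lines += [f"domain:{d}" for d in sorted(domains)]   # disavow whole domains
    lines += sorted(urls)                               # disavow individual URLs
    return "\n".join(lines) + "\n"

text = build_disavow(
    {"spam-directory.example", "link-wheel.example"},
    {"http://blog.example/bad-comment-page"},
)
print(text)
```

The `domain:` form disavows every link from a host, which is usually what you want for the spammy directories and link wheels discussed in this thread.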
How to stop links from sites that have plagiarized my blogs
I have been hit hard by Penguin 2.0. My webmaster explains that I have many links to my articles (a medical website with quality content) from "bad sites." These sites publish my articles with my name and a link to my site, so it appears I have posted my articles on their sites, although I have not - these sites have copied and pasted my articles. Is there a way to prevent sites from posting my content with links to my site?
White Hat / Black Hat SEO | wianno168
-
Why are these sites ranking so high with poorly relevant links?
Hello, Keyword: TV Stands. I have been researching competitors for a client, and we cannot understand why certain pages are ranking on page 1 of Google UK for the keyword "TV stands". E.g.: http://www.furnitureinfashion.net/plasma-TV-stand.html (Google UK position 8 for "TV stands") and http://direct.tesco.com/q/N.1999542/Nr.99.aspx (Google UK position 9 for "TV stands"). The Furniture in Fashion page has links from sites like http://www.ummah.com/forum/ and http://www.muslimco.com/, which are totally irrelevant to the site. Any ideas on other factors? The tesco.com page does not have direct links pointing to it. Cheers
White Hat / Black Hat SEO | JohnW-UK