Multiple sitewide (deep)links devalued by Google?
-
In my experience, sitewide links can still be very powerful if used sensibly and in moderation. However, I'm finding that sitewide text blocks with 2 or 3 (deep)links to a single domain appear to be working poorly, or not at all, in raising the authority of those target pages. Is anyone having the same experience?
In your experience, is the link value diminished when there are multiple deeplinks to a single domain in a sitewide text area? Is anything more than 1 link per target domain bad?
Or could it even be that it's not so much the number of deeplinks to a single domain that matter, but purely the fact that they are sitewide "deeplinks"? Are sitewide deeplinks treated differently than sitewide links linking to an external homepage?
Very interested in hearing your personal experience on this matter. Factual experience would be best, but "gut feeling" experience is also appreciated.
Best regards, Joost
-
Yep, it's another "it depends" - if Moz links to Search Engine Land in multiple blog posts over many years (as it has done), this is going to count for more than one vote. Those links also undoubtedly go to different pages on SEL's site (new posts, etc.). But if I write one blog post every other day, linking to my affiliate site in every post, this really won't help the affiliate site at all.
-
Hi Jane, I now tend to agree in the case of multiple links that are in a sitewide block. I also agree that receiving multiple links from one IP address is worth less than receiving the same number of links from different domains (all else being equal with regard to trustworthiness, relevancy, etc.).
But I am quite sure that receiving multiple links from one domain (or even one URL on that domain) counts as more than just one 'vote' from that domain. In my experience, the raw number of links from a domain definitely helps with strengthening transferred trust, page-specific authority, and page-specific topical relevancy. So, yes to that BBC versus Blogspot example.
-
Hey Joost,
That's a tough one, because it probably should be subjective and depend on other factors: the linking site, the site it links to, and how it links. If the BBC were to link to me twice, once to a new product page on my website and once to my home page, I'm not going to be concerned that only the first link in the HTML code carries value, and freak out if that first link isn't to the page I'm interested in. The same goes for sites less authoritative than the BBC, though the BBC is an extreme example of authority.
That said, if you're counting links by C-class IP range or by IP, which is a very common way to assess backlinks, that page on the BBC is going to count as one "vote".
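The C-class counting heuristic described above can be sketched in a few lines. This is purely an illustration of how a backlink tool might tally "votes" (it is not Google's actual algorithm); the `votes_per_domain` function, the IPs, and the URLs are all hypothetical, and "C-class" here means the first three octets of the linking server's IPv4 address:

```python
from collections import defaultdict
from urllib.parse import urlparse

def votes_per_domain(backlinks):
    """Count one 'vote' per distinct linking C-class per target domain:
    multiple links from the same /24 block collapse into a single vote."""
    blocks_by_domain = defaultdict(set)
    for source_ip, target_url in backlinks:
        c_class = ".".join(source_ip.split(".")[:3])  # first three octets, e.g. "212.58.244"
        target_domain = urlparse(target_url).netloc
        blocks_by_domain[target_domain].add(c_class)
    return {domain: len(blocks) for domain, blocks in blocks_by_domain.items()}

# Two links from the same C-class count once; a second C-class adds a vote.
backlinks = [
    ("212.58.244.18", "https://example.com/product"),
    ("212.58.244.19", "https://example.com/"),
    ("74.125.224.72", "https://example.com/about"),
]
print(votes_per_domain(backlinks))  # {'example.com': 2}
```

Under this counting scheme, the hypothetical BBC page linking twice from the same server still contributes just one vote to the target domain.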
If I see a sidebar linking out twice to the same domain, I'm not going to be all that comfortable claiming that both those links are going to be any more useful than one would have been.
I don't believe Google would be simplistic enough to treat two links from one URL on bbc.co.uk to two different pages on one website the same way as it would treat two links on a blogspot blog to two pages on another website, if that makes sense.
-
Hi Jane, thanks. Unfortunately, my data so far is only good enough for me to develop a "hunch"; I was hoping for empirical data from the Moz community.
By the way, are you saying that one page linking to URL A and URL B on an external domain would only count as one 'vote' for that entire domain? Not as individual votes for each page, with its own (anchor text / contextual / landing page) relevancy? I did read a lot about multiple links from one page to the same external URL not adding any value over just one link, but I always thought that links to individual URLs still have their own merit, even if they come from a single source page?
Best regards, Joost
-
Hi Joost,
I don't have hard data on this at all; this is a what-I-know plus gut feeling answer.
Gary is right - multiple links from one page to another target should be treated more like one link to the target domain, although this might not be a uniform rule. In effect, two links from one page, whether those links are sitewide or individual, shouldn't have much more effect on the target website than just one.
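What "treated more like one link" could mean in a link graph can be sketched as a deduplication pass, assuming a first-link-counts rule (the first link in source order is kept, later links from the same page to the same domain are ignored). The `collapse_links` helper and the URLs below are illustrative, not a description of Google's actual behavior:

```python
from urllib.parse import urlparse

def collapse_links(links):
    """Keep only the first link from each source page to each target
    domain; later links along the same page->domain edge are dropped."""
    seen = set()
    kept = []
    for source_page, target_url in links:
        edge = (source_page, urlparse(target_url).netloc)
        if edge not in seen:
            seen.add(edge)
            kept.append((source_page, target_url))
    return kept

links = [
    ("https://blog.example.org/post", "https://shop.example.com/widgets"),
    ("https://blog.example.org/post", "https://shop.example.com/"),  # same page->domain edge, dropped
    ("https://other.example.net/", "https://shop.example.com/"),
]
print(collapse_links(links))
```

Under this model, the second deeplink in a sitewide block would contribute nothing beyond the first, which matches the behavior Joost reports observing.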
That said, if Google felt that site-wide link or text blocks were being used manipulatively, there is no reason why they would not discount the value of those links altogether. It's interesting that you may have seen a correlation between multiple links from site-wide areas and poorer performance. It would also be interesting to see the data - you could put together a good blog post about it with enough data, for sure.
-
Hi Joost
Yes, it now reads 'unnatural links.' Sorry for my error!
Gary
-
Hi Gary,
Thanks for your reply. I don't really understand this sentence though:
"My question would be, "are your domains carrying natural links?" This would of course have a negative impact, but if not great."
Could you clarify what you mean please? Thanks again!
-
Search engines read this type of link juice as a single vote for a site. My question would be, "are your domains carrying unnatural links?" This would of course have a negative impact, but if not great.
I have heard of sitewide links being 'devalued.' This isn't hard fact, just something that has come up in conversations I've had with large corporations.