Will adding 1000's of outbound links to just a few websites impact rankings?
-
I manage a large website that hosts 1000's of business listings covering 7 state counties. Currently a category page (such as lodging) hosts a group of listings, each of which links to its own page. From these pages, links point directly to the business each represents. The client is proposing that we change all listings to link to the representative county website and remove the individual pages. This would essentially create 1000's of external links to 7 different websites and remove 1000's of pages from our site.
Does anyone have thoughts on how adding 1000's of links (potentially upwards of 3000) to only 7 websites (which I would deem relevant links) would affect SEO? I know that if 1000's of links are added pointing to 1000's of websites, the site can be considered a link farm, but I can't find any info online that speaks to a case like this. -
Do you have any evidence that linking out can improve domain authority? I don't think it can.
Matt Cutts once said that it can be beneficial to link out. Well, of course it can be, but can it make you rank higher?
The evidence shows it can make you rank lower, not higher.
-
Thanks for all the info. We have had a solid SEO strategy to date, and currently the site ranks VERY well for all of its identified keywords. There is a well-thought-out site architecture and internal linking strategy in place. I know that generally adding external links can improve authority over time if they point to relevant, authoritative sites and are done in moderation. To me, the biggest concern is that we are going from linking to the actual businesses from individual pages to having more of an overall listing page that links to 7 other "directory" sites. Also, I don't know how Google will interpret a website that only links to 7 other websites (I should mention that we are already linking to those 7 - before this proposed change - in many places across the website). I have already mentioned to the client that if we move forward, we will be implementing nofollows on the links.
-
Yes, there is hard data: Google released and patented their PageRank algorithm.
http://en.wikipedia.org/wiki/PageRank
This page is a simple explanation
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
A nofollow will not save any PageRank; it will only stop it reaching the linked-to page.
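To make the "hard data" concrete, here is a minimal sketch of the PageRank power iteration those pages describe. This is a simplified toy model (no dangling-page handling, fixed iteration count), not Google's production algorithm, and the graph is made up for illustration:

```python
# Toy PageRank power iteration: each page splits its rank equally
# across its outbound links; a damping factor d models random jumps.
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets the baseline "random jump" share...
        new = {p: (1 - d) / n for p in pages}
        # ...plus an equal slice of each page that links to it.
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += d * share
        rank = new
    return rank

# Hypothetical 3-page graph: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

In this toy graph, B ends up with the least rank because only one page links to it, and it only receives half of A's passable rank - the same splitting effect the answers below are arguing about.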
-
Adding more internal links just so the link juice isn't diluted over one external link would be like playing black-hat SEO... I'm sure it would be seen as spam. A nofollow is enough. Still, a directory of only 7 sites without the inner pages is useless.
-
Is there any hard data to back that up? Just curious if there has been a study done over a ton of pages, links, etc.
-
Yes it would. When you link out, you lose PageRank.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
To minimize the loss of PR, you can add more links to your own site on the same page.
If you have a page with 3 internal links and 1 external link, you are giving away 25% of your PR; but if you have 99 internal links and 1 external link, you are only giving away 1%.
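That arithmetic can be sketched with a small hypothetical helper, assuming the classic model where a page splits its passable PageRank equally across all of its links:

```python
def external_share(internal_links: int, external_links: int) -> float:
    """Fraction of a page's passable PageRank that flows to external sites,
    assuming rank is divided equally among all outbound links."""
    total = internal_links + external_links
    if total == 0:
        return 0.0
    return external_links / total

# 3 internal + 1 external: 25% of the page's passable PR leaves the site.
print(external_share(3, 1))   # 0.25
# 99 internal + 1 external: only 1% leaves the site.
print(external_share(99, 1))  # 0.01
```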
You are also losing content, and depending on your internal linking structure, you are more than likely going to lower the PR of your home page by removing sub-pages; again I refer to http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Using nofollow will not help you; all links lose link juice.
There are ways of using JavaScript to do this, but one day that may come back to bite you. -
Couple of questions:
The website is a directory and yet it points to only 7 outbound websites?
What about using nofollow for all those links?
On the content side, you are about to lose much of the site's content, so you should expect a massive traffic drop. What's the point of a directory if it only links to 7 websites without offering any extra valuable content?
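For what it's worth, the nofollow change itself is a one-attribute edit per outbound link (the county URL below is hypothetical, for illustration only):

```html
<!-- A standard followed link -->
<a href="https://example-county.gov/lodging">County lodging directory</a>

<!-- The same link marked nofollow, asking search engines not to pass PageRank -->
<a href="https://example-county.gov/lodging" rel="nofollow">County lodging directory</a>
```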