Internal linking to subdomains
-
Hi all, I have a main site called example.org, and a lot of user-generated pages on subdomains such as foo.example.org, bar.example.org, and so on.
Most of those pages link back to example.org.
In example.org I have a page that links to all subdomains.
How can I optimize the PageRank of the list page? Should I add nofollow to the links pointing at the subdomains, so I avoid passing link juice to them, while keeping the normal (followed) links from the subdomains back to example.org?
-
It's true that links to subdomains are not internal links. But if they are your own subdomains, they are trusted links.
When you place a link, PR from that page automatically flows to it. The only question you have to answer is: now that the link juice has been spent, do you want the link's target to benefit, and let your PR flow to that page?
Can you share the context of where or why you think it would be beneficial to use nofollow on a link to one of your own subdomains?
-
Yes, but from what I understand, subdomains are not always considered internal links; that's why I asked.
I thought that having a lot of subdomains linking TO a page, while that page passes no PageRank back FROM it, would be beneficial to the listing page.
-
That is correct. PR assigned to a nofollowed link does not flow to the target page. It does not flow anywhere else either; it simply dies on the linking page.
-
Will you please clarify "Your page rank flows TO the link whether it is followed or not"? I thought that if we add rel="nofollow", the PageRank does not flow to the target page.
-
As a rule, never add "nofollow" to internal links. Your PageRank flows TO the link whether it is followed or not. When you add the "nofollow" attribute, you stop the PR from flowing FROM the link, and it simply dies.
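The model described above can be sketched in a few lines of code. This is a toy illustration, not Google's actual algorithm: a page's rank is assumed to be split evenly across ALL of its outgoing links, and the share assigned to a nofollowed link is simply discarded rather than passed on. The function name, the flat split, and the example subdomains are all assumptions made for the sake of the sketch.

```python
def distribute_pagerank(page_rank, links):
    """Toy model of PR flow. links: list of (target, followed) tuples.

    Every link consumes an equal share of the page's rank, but only
    followed links pass their share on; nofollowed shares evaporate.
    Returns a dict of target -> rank received.
    """
    if not links:
        return {}
    share = page_rank / len(links)  # each link spends a share, followed or not
    received = {}
    for target, followed in links:
        if followed:
            received[target] = received.get(target, 0.0) + share
        # nofollowed links get nothing; their share is lost, not redistributed
    return received

links = [
    ("foo.example.org", True),
    ("bar.example.org", True),
    ("baz.example.org", False),  # rel="nofollow"
]
flow = distribute_pagerank(9.0, links)
print(flow)  # {'foo.example.org': 3.0, 'bar.example.org': 3.0}
```

Note that in this model adding nofollow to the third link does not make the first two links any stronger: each still receives 3.0 of the 9.0 units, and the remaining 3.0 simply dies. That is exactly why nofollow on your own internal or subdomain links buys you nothing.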