Backlinks to less important subdomain
-
We have two subdomains on our site: blogs and www. Our most important and competitive pages are on the www subdomain. I have some pages on the blogs subdomain that have valuable backlinks. Would it help the SEO of the www subdomain if we moved those pages from the blogs subdomain to the www subdomain and redirected the old URLs?
-
Hello!
I don't think it would be beneficial to worry about moving just a few pages. Instead, do what you can to link from those blog pages to relevant pages on the main site, and try to drive users to the main site where appropriate. Google also looks at traffic patterns in addition to links, and will pass more value from the blog pages to your main site if user behavior supports it.
-
I don't believe that moving just a couple of pages will make a difference.
Remember that the "power" you want to transfer flows through the root domain. So I'd move the whole blog: you do the work once and the whole site benefits.
Best of luck.
GR
-
Ideally, absolutely. But I'm not sure I can get buy-in for moving the whole blog right now, which is why I'm considering moving just a couple of pages, if it would be worthwhile.
-
Hi there,
My opinion? Go for a subdirectory. Move all the content from blog.domain.com to www.domain.com/blog/. Just remember to set everything up correctly during the migration.
A Moz resource:
The Website Migration Guide: SEO Strategy, Process, & Checklist - Moz Blog
Many people maintain that having the complete site on the same subdomain improves overall performance.
There have been many discussions on this topic, culminating in an official video from Google:
Subdomain or subfolder, which is better for SEO? - Google Webmasters
And other discussions and related articles:
The Great Subdomain vs. Subfolder Debate, what is the best answer? - Moz Q&A
An SEO Guide On Subdomains vs Subdirectories <- Jump to conclusion at the end.
SEO & Google Fight Over How Google Treats Subdomains vs Subdirectories - SERoundTable
Subdomains vs Subdirectories? - Sistrix
Hope it helps.
Best of luck.
GR
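If you do end up moving pages (or the whole blog), the part that protects those valuable backlinks is a clean 301 from every old URL to its new counterpart. Below is a minimal sketch for sanity-checking that after launch; the domain names and URL mapping are placeholders, not your actual site, and the third-party `requests` package is assumed to be installed.
```
# A minimal sketch (placeholder URLs): check that each migrated blog URL
# returns a single 301 hop to its new www counterpart.
# Requires the third-party "requests" package (pip install requests).
import requests

REDIRECT_MAP = {
    "https://blogs.example.com/popular-post/": "https://www.example.com/blog/popular-post/",
    "https://blogs.example.com/linked-guide/": "https://www.example.com/blog/linked-guide/",
}

for old_url, expected in REDIRECT_MAP.items():
    # allow_redirects=False exposes the first hop instead of following it;
    # some servers mishandle HEAD, so switch to requests.get if needed.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    verdict = "OK" if resp.status_code == 301 and location == expected else "CHECK"
    print(f"{old_url} -> {resp.status_code} {location} [{verdict}]")
```
Re-crawling the old URLs this way also catches redirect chains, which can dilute the signal you are trying to consolidate.
-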
Related Questions
-
DeepCrawl Calls Incomplete Open Graph Tags and Missing Twitter Cards An Issue. How important is this?
Hi, Let me first say that I really like the tool DeepCrawl. So, not busting on them. More like I'm interested in the relative importance of two items they call out as "Issues." Those items are "Incomplete Open Graph Tags" and "No Valid Twitter Cards." They call this out on every page. To define it a bit further, I'm interested in the importance as it relates to organic search. I'm also interested in whether there's some basic functionality we may have missed in our Share42 implementation. To me, it looks like the social sharing buttons work and are quite functional. If it would help, I could private message you an example URL. Thanks! Best... Mike
Intermediate & Advanced SEO | 945011
-
Backlinks to internal pages
Hi, Our website of 3K+ pages currently has more links coming to internal pages (2nd & 3rd level) than to the homepage. Just wanted to know if this is bad for rankings? Please share your thoughts. Thanks.
Intermediate & Advanced SEO | Umesh-Chandra
-
Subdomain Placeholder
So long story short - we are rolling out a new website earlier than expected. Unfortunately, we are being rushed, and in order to make the deadline, we have decided to create a www2 subdomain and release our HTML-only version of the site for the next 2 weeks. During that time, the HTML site will be ported over to a Drupal 8 instance and resume its www domain. My question is - will a temporary (302) redirect from www to www2 and then back to www screw the proverbial pooch? Is there a better way to implement a temporary site? Feel free to probe with some questions - I know I could be clearer here 😉 Thanks community!
Intermediate & Advanced SEO | BDS2016
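One hedged suggestion while the www2 placeholder is live: verify that the hop from www really is a 302, since a cached 301 would be harder to unwind two weeks later. A minimal sketch, assuming the third-party `requests` package and a placeholder URL:
```
# Sketch: confirm the temporary redirect is a 302 (not a 301) while the
# www2 placeholder is live. Placeholder URL; requires "requests".
import requests

resp = requests.get("https://www.example.com/", allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 302 during the interim period
print(resp.headers.get("Location"))  # expect the matching www2 URL
```
-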
Block subdomain directory in robots.txt
Instead of blocking an entire subdomain (fr.sitegeek.com) with robots.txt, we would like to block one directory (fr.sitegeek.com/blog).
'fr.sitegeek.com/blog' and 'www.sitegeek.com/blog' contain the same articles in one language; only the labels are changed for the 'fr' version, and we assume the duplicate content causes problems for SEO. We would like 'www.sitegeek.com/blog' articles to be crawled and indexed, but not 'fr.sitegeek.com/blog'. So, how can we block a single subdomain directory (fr.sitegeek.com/blog) with robots.txt? This applies only to the blog directory of the 'fr' version; all other directories and pages of the 'fr' version should still be crawled and indexed. Thanks,
Rajiv
Intermediate & Advanced SEO | gamesecure
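One detail worth noting: robots.txt is evaluated per host, so the rules for the 'fr' subdomain belong in the file served at fr.sitegeek.com/robots.txt, while www.sitegeek.com keeps its own file and is unaffected. A minimal sketch of the two-line rule, verified locally with Python's standard-library parser (the sample URLs are illustrative):
```
# Sketch: the rules below would be served at fr.sitegeek.com/robots.txt;
# the stdlib robotparser lets us confirm their effect before deploying.
from urllib.robotparser import RobotFileParser

RULES = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# Blocked: anything under /blog/ on the fr host
print(parser.can_fetch("Googlebot", "https://fr.sitegeek.com/blog/some-article"))  # False
# Still crawlable: every other directory and page of the fr version
print(parser.can_fetch("Googlebot", "https://fr.sitegeek.com/other-page"))         # True
```
Keep in mind that robots.txt blocks crawling, not indexing of already-known URLs; for duplicate-content consolidation, a canonical tag pointing at the www version is often suggested instead.
-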
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey Guys,
Need your suggestions.
I have a website that has a duplicate content issue.
A subdomain called facebook.asherstrategies.com comes from nowhere and is getting indexed.
Website link: asherstrategies.com
Subdomain link: facebook.asherstrategies.com
This subdomain is actually a mirror of the website, and I have no idea how it was created.
I am trying to resolve the issue but cannot find the cause.
Intermediate & Advanced SEO | b2bmarketer
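A mirror subdomain that appears out of nowhere is often the result of wildcard DNS, where any hostname under the domain resolves to the main site and search engines index whichever variants they find linked. A quick diagnostic sketch (standard library only; the random label is arbitrary):
```
# Sketch: if a random, never-configured hostname resolves, wildcard DNS
# is probably enabled and is the likely source of the mirror subdomain.
import socket
import uuid

random_host = f"{uuid.uuid4().hex[:12]}.asherstrategies.com"
try:
    print(random_host, "->", socket.gethostbyname(random_host))
    print("Resolves: wildcard DNS is likely enabled.")
except socket.gaierror:
    print(random_host, "does not resolve: wildcard DNS seems unlikely.")
```
-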
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (each for a specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (option 1), but I am wondering if setting the canonical tag to the root domain would pass "some link juice" to the root domain and be more beneficial. Thanks!
Intermediate & Advanced SEO | NewspaperArchive
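For what it's worth, option 1 (each subdomain canonicalized to its matching city page) keeps the tag truthful to the content. A small sketch that prints the tag each subdomain homepage would carry, using the mappings given in the question:
```
# Sketch: emit the canonical <link> tag for each library subdomain,
# pointing at the matching city page (option 1 from the question).
CANONICAL_MAP = {
    "covina.abc.com": "http://abc.com/us/california/covina/",
    "galveston.abc.com": "http://abc.com/us/texas/galveston/",
    "hutchnews.abc.com": "http://abc.com/us/kansas/hutchinson/",
}

for subdomain, canonical_url in CANONICAL_MAP.items():
    print(f'{subdomain}: <link rel="canonical" href="{canonical_url}" />')
```
-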
Site-wide Image Backlinks - Are they bad?
Hey guys, A little help required. We are potentially taking on a new client who has over 5,000 image backlinks (4,000 of those from one site) out of around 7,000 total backlinks. Would this be a problem? It's been noticeable recently that both footer and blogroll links seem to be getting targeted by Google. Would this be the case for image links too? Especially considering the top-heavy nature of the link profile? Thoughts welcome. Cheers.
Intermediate & Advanced SEO | Webrevolve
-
How important is the number of indexed pages?
I'm considering making a change to using AJAX filtered navigation on my e-commerce site. If I do this, the user experience will be significantly improved, but the number of pages Google finds on my site will go down significantly (in the 10,000's). It feels like our filtered navigation has grown out of control, and we spend too much time worrying about its URL structure - in some ways it's paralyzing us. I'd like to be able to focus on pages that matter (explicit Category and Sub-Category pages) and then just let AJAX take care of filtering products below these levels. For customer usability this is smart. From the perspective of manageable code and long-term design this also seems very smart - we can't continue to worry so much about filtered navigation. My concern is that losing so many indexed pages will have a large negative effect (however, we will reduce duplicate content and be able to provide much better category and sub-category pages). We probably should have thought about this a year ago before Google indexed everything :-). Does anybody have any experience with this or insight on what to do? Thanks, -Jason
Intermediate & Advanced SEO | cre8