Effects of significant cross-linking between subdomains
-
A client has noticed in recent months that their traffic from organic search has been declining, little by little.
They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain!
The client is wondering whether this is what's causing the decline in traffic, and whether they should change the whole structure of their site.
Interested to hear the thoughts of the community on this one!
-
Helen,
I know people who have had success in reducing the number of links within mega menus by turning some of them (after the first two levels, for instance, though you could get much more sophisticated if you wanted) into JavaScript links. If the JavaScript is not too complex, Google will still have no trouble getting to those pages, but the links won't be hrefs and therefore won't spend PageRank on pages that are less important relative to the others. The upside is that the links are still there for users, assuming that is a good thing.
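To make the trade-off concrete, here is a minimal sketch of the two approaches. All markup and names are hypothetical, not taken from any real site, and note that modern Googlebot executes JavaScript, so this is a way to stop spending link equity rather than a guaranteed way to hide URLs from crawling.

```javascript
// Minimal sketch: render a menu item either as a normal crawlable <a href>,
// or, past a given depth, as a JS-driven element with no href attribute.
// Labels, URLs, and the depth cutoff are all illustrative.
function renderMenuItem(label, url, depth, maxCrawlableDepth = 2) {
  if (depth <= maxCrawlableDepth) {
    // Normal anchor: counted as a link and passes PageRank.
    return `<a href="${url}">${label}</a>`;
  }
  // No href: still clickable for users via the handler, but not an
  // <a href> that spends link equity on a deep, less important page.
  return `<span class="menu-link" data-url="${url}" ` +
         `onclick="window.location = this.dataset.url">${label}</span>`;
}
```

Used over a menu tree, everything below the cutoff depth stays navigable for users while the page's href count drops sharply.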
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences slow, steady traffic drops, I always look into the uniqueness of its product copy. That pattern is often a sign that the site is sharing product copy with other sites, either through use of manufacturer descriptions or by publishing feeds to third-party sites like Amazon, eBay, or comparison shopping engines.
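A quick way to spot shared copy is to compare two product descriptions by the overlap of their word n-grams ("shingles"). This is only a rough sketch of the idea, with invented example text and no particular threshold implied:

```javascript
// Compare two product descriptions by word 3-gram ("shingle") overlap.
// A ratio near 1 suggests the copy is shared rather than unique.
function shingles(text, n = 3) {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const out = new Set();
  for (let i = 0; i + n <= words.length; i++) {
    out.add(words.slice(i, i + n).join(' '));
  }
  return out;
}

function shingleOverlap(a, b) {
  const sa = shingles(a), sb = shingles(b);
  if (sa.size === 0 || sb.size === 0) return 0;
  let shared = 0;
  for (const s of sa) if (sb.has(s)) shared++;
  // Normalize by the smaller set so a short copy pasted into a
  // longer page still scores high.
  return shared / Math.min(sa.size, sb.size);
}
```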
Good luck!
-
You mentioned it's a large site. Google only crawls so deep into a site, though since your site is already indexed that detail may not matter here. Have you tried blocking some of the unused pages with robots.txt and/or implementing tags like rel=canonical and/or the pagination tags?
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
https://support.google.com/webmasters/answer/139394?hl=en
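As a rough sketch of what those two suggestions could look like in practice (all paths and URLs here are hypothetical, not from the site in question):

```
# robots.txt — keep crawlers out of low-value or duplicate faceted URLs
User-agent: *
Disallow: /search/
Disallow: /*?sort=
```

```html
<!-- In the <head> of page 2 of a paginated category, per the
     rel=next/prev post linked above (URLs are illustrative) -->
<link rel="canonical" href="https://shoes.example.com/trainers/?page=2" />
<link rel="prev" href="https://shoes.example.com/trainers/" />
<link rel="next" href="https://shoes.example.com/trainers/?page=3" />
```

Note the canonical points at the paginated page itself (or a view-all page if you have one), not at page 1.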
You could look at Google Trends for a rough idea of search volume over the years, but as you mentioned that won't help your site directly. You can try tracking your SERP rankings in Moz or other software like SERPBook etc.
Sounds like you've dropped down in the SERPs to me.
-
Hi Chris,
Thanks for your reply. The issue isn't that Google hasn't indexed those pages, though - it has. I'm not sure what you mean by 'Google won't index huge sites it just doesn't have time', as it clearly does index plenty of huge sites. The site is pretty much fully indexed, so it's not that Google can't find the pages.
We have also, of course, tried using the client's Analytics to identify the issue, as you describe, but the client accidentally deleted all the historical data beyond about the six month mark (oops), so I can't do a lot of the analysis I would normally do. I have one or two odd old printouts showing some historical Analytics and ranking data, and their sales data to go on, and this does tend to suggest that organic traffic has indeed dropped off (for reasons other than seasonal ones) and that there has been some decline in their search engine rankings for some key phrases. But I can't tell a lot more than that.
What I'm looking for is to see whether anyone else has had experience of this or a similar issue - whether anyone has seen excessive links between subdomains have a negative impact on rankings & traffic. I've been working in SEO for ten years and never come across anyone who has quite this many links within their own website, so it's not something I've encountered before.
Anyone else out there come across this before?
-
The first thing I always do is pretend I don't work for the company: go to the site as a user and see how easy it is to navigate. Can I find the product I need easily? (Try imagining you want a specific product before you start.) Can I get back to the home page easily? I try to make sure the home page (or the main products) is only 3 clicks away (5 max). Google won't index huge sites - it just doesn't have time - so if your structure is bad it may be that Googlebot is giving up because it can't get all the way down to where you actually want it to go.
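That "3 clicks away" check can be done mechanically: a breadth-first search over the internal-link graph gives each page's click depth from the homepage. A toy sketch, where the graph and URLs are invented purely for illustration (in practice you would build the graph from a crawl):

```javascript
// Breadth-first search over an internal-link graph to measure each page's
// click depth from the homepage. Pages missing from the result are
// unreachable by following links at all.
function clickDepths(linkGraph, start = '/') {
  const depths = { [start]: 0 };
  const queue = [start];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const target of linkGraph[page] || []) {
      if (!(target in depths)) {
        depths[target] = depths[page] + 1;
        queue.push(target);
      }
    }
  }
  return depths;
}

// Hypothetical site: home -> category -> subcategory -> product
const graph = {
  '/': ['/shoes/', '/bags/'],
  '/shoes/': ['/shoes/running/'],
  '/shoes/running/': ['/shoes/running/model-x/'],
};
```

Pages that come back with a depth above 4 or 5 (or don't come back at all) are the ones a shallow crawl may never reach.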
If you find yourself lost in the megamenus, imagine the user or Googlebot - can you reduce the menus and still achieve a good result?
Another factor could be behind the decline in traffic: has there been a decline in your SERP placement, or is it seasonal? Although not a permanent fix, PPC can help top up traffic to your site while you tweak it a bit.
I hope some of the questions above help you look at the site in a different light. There are obviously other things it could be, but first off I would look into your SERP placement and seasonal dips. You can also use GA to look at users' drop-off points and see where they are getting bored or getting lost!
Best of luck!