Effects of significant cross-linking between subdomains
-
A client has noticed in recent months that their traffic from organic search has been declining, little by little.
They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain!
The client is wondering if this is what is causing the decline in traffic, and whether to change the whole structure of their site.
Interested to hear the thoughts of the community on this one!
-
Helen,
I know people who have had success in reducing the number of links within mega menus by turning some of them (after the first two levels, for instance, but you could get much more sophisticated if you wanted) into JavaScript links. If the JavaScript is not too complex Google will still have no trouble getting to those pages, but the links won't be "hrefs" and therefore won't waste PageRank on pages that are not as important relative to the others. The upside to this is that the links are still there for users, assuming that is a good thing.
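To illustrate the idea, here is a minimal sketch of the kind of non-href menu link being described. This is a hypothetical helper, not anyone's actual implementation: deeper menu items carry a `data-url` attribute instead of an `<a href>`, and a click handler performs the navigation for real users.

```javascript
// Sketch only (hypothetical names): deep menu items have data-url
// attributes instead of <a href>, so no crawlable link is emitted,
// but a click handler still navigates for real users.
function bindMenuNavigation(items, navigate) {
  items.forEach(function (item) {
    item.onclick = function () {
      // Still reachable for users; no href for PageRank to flow through.
      navigate(item.dataset.url);
    };
  });
}

// In a browser you would wire it up roughly like this:
// bindMenuNavigation(
//   document.querySelectorAll('.megamenu [data-url]'),
//   function (url) { window.location.href = url; }
// );
```

Whether Google treats such links as invisible depends on how it executes your JavaScript, so test before rolling it out site-wide.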
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences a slow, steady traffic drop, I always look into the uniqueness of their product copy. That pattern is often a sign that they are sharing product copy with other sites, either through use of manufacturer descriptions or by publishing feeds to third-party sites like Amazon, eBay, or price comparison shopping engines.
Good luck!
-
You mentioned it's a large site, and Google only goes so deep into a site, though that may be a side issue here. Have you tried blocking some of the unused pages with robots.txt and/or implementing tags like rel=canonical and/or the rel=next/rel=prev pagination tags?
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
https://support.google.com/webmasters/answer/139394?hl=en
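For reference, the tags those two articles describe look roughly like this. The URLs are placeholders, not the client's actual pages; the canonical should point at the version of the page you want indexed, while rel=prev/next describe a paginated sequence:

```html
<!-- In the <head> of page 2 of a paginated category on a subdomain -->
<link rel="canonical" href="https://shoes.example.com/trainers?page=2">
<link rel="prev" href="https://shoes.example.com/trainers?page=1">
<link rel="next" href="https://shoes.example.com/trainers?page=3">
```

Note that each subdomain serves its own robots.txt, so any Disallow rules would need to be added per subdomain.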
You could look in Google Trends for a rough idea of search volume over the years, but that won't help your site as you mentioned. You can try tracking your SERP rank in Moz or other software like SERPBook etc.
Sounds like you've dropped down in the SERPs to me.
-
Hi Chris,
Thanks for your reply. The issue isn't that Google hasn't indexed those pages, though - it has. I'm not sure what you mean by 'Google won't index huge sites it just doesn't have time', as it clearly does index plenty of huge sites. The site is pretty much fully indexed, so it's not that Google can't find the pages.
We have also, of course, tried using the client's Analytics to identify the issue, as you describe, but the client accidentally deleted all the historical data beyond about the six-month mark (oops), so I can't do a lot of the analysis I would normally do. I have one or two odd old printouts showing some historical Analytics and ranking data, plus their sales data, to go on, and this does tend to suggest that organic traffic has indeed dropped off (for reasons other than seasonal ones) and that there has been some decline in their search engine rankings for some key phrases. But I can't tell a lot more than that.
What I'm looking for is to see whether anyone else has had experience of this or a similar issue - whether anyone has seen excessive links between subdomains have a negative impact on rankings & traffic. I've been working in SEO for ten years and never come across anyone who has quite this many links within their own website, so it's not something I've encountered before.
Anyone else out there come across this before?
-
The first thing I always do is pretend I don't work for the company: go to the site as a user and see how easy it is to navigate. Can I find the product I need easily? (Try to imagine you want a specific product before going on.) Can I get back to the home page easily, etc.? I try to make sure I can reach the home page (or the main products) only 3 pages away (5 max). Google won't index huge sites it just doesn't have time, so if your structure is bad it may be Googlebot giving up, as it can't get all the way down to where you in fact want it to go.
If you find yourself lost in "megamenus", imagine the user or Googlebot - can you reduce the menus and still achieve a good result?
Another factor could be behind the decline in traffic: has there been a decline in your SERP placement, or is it seasonal? Although not a permanent fix, PPC can help top up traffic to your site whilst you jiggle it a bit.
I hope some of the questions above help you look at the site in a different light. There are obviously other things it could be, but first off I would look into your SERP placement and seasonal dips. You can also use GA to look at users' drop-off points and see where they are getting bored or getting lost!
Best of luck!