What is the best way to hide duplicate, image-embedded links from search engines?
-
Hello!
Hoping to get the community's advice on a technical SEO challenge we are currently facing. [Apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I provided enough detail.]
Context: I manage a human-anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. Although we tried our best to design the new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user-interaction metrics.
SEOMoz's Website Crawler helped us uncover that we may now have too many links on our pages, and that this could be at least part of the reason behind the lower traffic: we are not making optimal use of links and are potentially 'leaking' link juice.
Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:
- Clickable links embedded via JavaScript in our images. These let users explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on the muscles of the arm and hand and want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
- Anatomy Terms lists (to the left of the image) that list all the different parts of the body shown in the image. These are for users who might not know where on the arm the biceps actually is; such a user can simply click the term "Biceps" and reach our biceps page that way.
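For concreteness, here is a minimal sketch of the two duplicate link sets; all markup, function names, and URLs below are illustrative, not our actual code:

```javascript
// Set 1: crawlable anchor-text links in the "Anatomy Terms" sidebar.
// Every term renders as a normal <a href>, so search engines see it.
function buildTermsList(terms) {
  return '<ul class="anatomy-terms">' +
    terms.map(t => `<li><a href="${t.url}">${t.name}</a></li>`).join('') +
    '</ul>';
}

// Set 2: JavaScript click handlers on image hotspots that navigate to
// the very same pages. In the browser this would run once the DOM is
// ready; `navigate` would typically set window.location.href.
function attachHotspots(hotspots, navigate) {
  hotspots.forEach(h => {
    h.element.addEventListener('click', () => navigate(h.url));
  });
}

const terms = [{ name: 'Biceps', url: '/muscles/biceps' }];
console.log(buildTermsList(terms));
// → <ul class="anatomy-terms"><li><a href="/muscles/biceps">Biceps</a></li></ul>
```

So a page with 150 anatomy terms ends up with roughly 300 ways to reach the same set of detail pages.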
Since many sections of the body have hundreds of smaller parts, many of our pages have 150 links or more each. And to make matters worse, in most cases the links in the images and in the terms lists point to the exact same pages.
My Question: Is there any way we could hide one set of links (preferably the anchor-text-less, image-based links) from search engines, so that only one set would be visible to them? I have read conflicting accounts of different methods, from using JavaScript to embedding links in HTML5 tags. And we definitely do not want to do anything that could be considered black hat.
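To illustrate what I mean by "hiding" the image links: one approach we have seen discussed is rendering the hotspots without an `href` at all, carrying the target in a data attribute and navigating via JavaScript. This is a sketch only (whether a given search engine still discovers such URLs depends on how it executes JavaScript):

```javascript
// Markup generator for an image hotspot: note the absence of any
// <a href>, so there is no anchor for a crawler to count as a link.
// The class name, data attribute, and URL are illustrative assumptions.
function buildHotspot(name, url) {
  return `<span class="hotspot" data-url="${url}" role="link">${name}</span>`;
}

// Browser-side wiring (would run after the DOM is ready):
// document.querySelectorAll('.hotspot').forEach(el => {
//   el.addEventListener('click', () => { window.location.href = el.dataset.url; });
// });

console.log(buildHotspot('Biceps', '/muscles/biceps'));
// → <span class="hotspot" data-url="/muscles/biceps" role="link">Biceps</span>
```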
Thanks in advance for your thoughts!
Eric
-
The drop in your site traffic may be due to a number of things, not just the links on the page. If you restructured your URLs, changed title tags, and so on, there can be many reasons besides the number of links per page. You now have a new site that Google is trying to sort out, so you may simply see a temporary dip in the SERPs.
If you have the tools, check whether your rankings actually dropped, and monitor them. Also use your analytics to see where the drop is coming from.
There are a couple of good posts on SEOMoz with info on the 100-links-per-page guideline:
http://www.seomoz.org/q/can-i-reduce-link-count-by-no-following-links
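The linked thread discusses using rel="nofollow" to reduce the counted link total. A minimal sketch of tagging one duplicate set that way (whether nofollow actually conserves link equity has changed over time, so verify current search-engine guidance before relying on it):

```javascript
// Sketch: add rel="nofollow" to one set of duplicate links so crawlers
// are told not to follow them. A naive string rewrite for illustration
// only; a real site would set the attribute server-side or via the DOM.
function nofollowLinks(html) {
  return html.replace(/<a /g, '<a rel="nofollow" ');
}

const sidebar = '<a href="/muscles/biceps">Biceps</a>';
console.log(nofollowLinks(sidebar));
// → <a rel="nofollow" href="/muscles/biceps">Biceps</a>
```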