Still too many internal links reported on page
-
Hi Guys
I am new here, learning a lot, and enjoying the benefits of being an SEOmoz user.
So here goes with my first question (probably of many).
I have known for some time that our website has a top-heavy number of links in the primary navigation, but I wasn't too sure how important this was. Our main objective was to make an easy-to-use nav for customers, and all of the feedback we have had says that customers really like our navigation.
However, when running an SEOmoz campaign on our site, we again got back that there are too many links on the pages. For example, the home page has 500+ links.
So I decided to do something about this. I have implemented what I think is a good solution, whereby the drop-down navigation isn't loaded on first page load. If the user then hovers over one of our "departments", the sub-navigation is loaded via AJAX and dropped in. This means if the user wants it, they get it; if not, it's not loaded with the page. My theory is that Google loads the page without all the links, but a user gets the links as and when they need them.
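A minimal sketch of this hover-triggered, cached AJAX load might look like the following. (The `/subnav` endpoint, the `.department` markup, and the `makeLazyNavLoader` helper are all illustrative assumptions, not taken from the actual site.)

```javascript
// Build a loader that fetches a department's sub-navigation at most once.
// `fetchSubNav` is whatever performs the actual AJAX call and returns a
// promise resolving to the sub-navigation HTML.
function makeLazyNavLoader(fetchSubNav) {
  const cache = new Map(); // department -> Promise resolving to HTML
  return function load(department) {
    if (!cache.has(department)) {
      cache.set(department, fetchSubNav(department));
    }
    return cache.get(department);
  };
}

// In the browser this could be wired up to the hover event like so
// (illustrative endpoint and markup):
//
// const loadSubNav = makeLazyNavLoader(dep =>
//   fetch('/subnav?department=' + encodeURIComponent(dep)).then(r => r.text()));
//
// document.querySelectorAll('.department').forEach(el => {
//   el.addEventListener('mouseenter', () => {
//     loadSubNav(el.dataset.name).then(html => {
//       el.querySelector('.subnav').innerHTML = html;
//     });
//   });
// });
```

Because the result is cached per department, repeat hovers cost nothing, and the initial HTML served to the crawler contains none of the sub-navigation links.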
I tested with the SEOmoz toolbar, and this tells me that when I load the home page there are 167 links in it vs 500+ previously. However, my campaign still tells me that my home page has 450+ links (and this is from a recent crawl of the page).
Our site is here: www.uniquemagazines.co.uk
Can you tell me a) whether what I have done is a good solution, and b) whether the SEOmoz crawler has the ability to trigger the hover event and cause the AJAX load of the sub-navigation content?
-
That's great thank you.
I don't want to get hung up on building a website which is good for Google, I want a website which is good for users, but of course I want to get as much benefit from Google as possible.
One side effect of cutting out nearly 300 links is that the site loads faster, which is good for users, so a great benefit there.
Thanks again.
-
Yes, what you have done is good, although I am not sure about the Moz bot's capabilities for crawling drop-downs. I would wait until your next crawl, and even if it still picks up the links, don't worry about it too much; just do what you can, and stick with it if it's good for the user.