Is my megamenu negatively impacting my SEO?
-
Hello everyone,
I have a megamenu with 87 links in total. We offer a ton of products, so when we decided to have this developed, it seemed like a no-brainer because a straight drop-down menu was really hard to digest.
But I have been wondering if it is creating too many links on every single page and/or muddling the signals to the search engines.
If anyone could take a look and give me their insight, I would really appreciate it.
Thanks,
-
Judging from what I see on your site, there is nothing to worry about on that front. If you were well above 100-150 links per page, it would be worth addressing. As it is, it's fine.
K
-
Thanks for your response.
So in your opinion, am I going over the rule of thumb?
Should I just cut down the number of items?
-
Hi -
Moz advises keeping the total number of links on a given page under 100. Here is some information that may help you understand it better -- http://moz.com/blog/how-many-links-is-too-many
Regards,
K
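As a rough way to sanity-check a page against that ~100-link guideline, you can count the anchor tags in its HTML. Here is a minimal sketch using only Python's standard library; it is not a crawler, just a counter you would feed one page's HTML (the sample markup is invented for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that actually carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; only real links count
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Hypothetical page fragment: two real links, one named anchor without href
sample = '<nav><a href="/a">A</a><a href="/b">B</a><a name="top">x</a></nav>'
print(count_links(sample))  # 2
```

Note that search engines count every link on the rendered page, so run this against the full page source, megamenu included.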
Related Questions
-
Is the NitroPack plugin Black Hat SEO for speed optimization?
We are getting ready to launch our redesigned WP site and were considering using the NitroPack performance optimization plugin, until some of our developers started ringing the alarm. Here is what some in the SEO community are saying about the tool: the rendering of a website built with the NitroPack plugin in page-metric test tools is based entirely on the inline CSS and JS in the HTML file, without taking into account the numerous additional CSS and JS files loaded on the page. As a result, the final metric score does not include evaluation and parsing of those CSS and JavaScript files. So what they are saying is that a lot of websites using NitroPack never become interactive in the page-metric tools, because all interactivity is derived from JavaScript and CSS execution; their "Time to Interactive" and "Speed Index" should therefore be reported as infinite. Would Google consider this Black Hat SEO and start serving manual actions to sites using NitroPack? We are not ready to lose our hard-earned Google ranking. Please let me know your thoughts on the plugin. Is it simply JS and CSS "lazy loading" that finally delivers a real-world implementation with fantastic results, or is it truly a Black Hat attempt at gaming Google PageSpeed Insights numbers? Thank you!
On-Page Optimization | opiates0
-
WooCommerce URLs & SEO
My company is in the process of converting from a .net site to a WordPress site using WooCommerce. We are #1 in Google SEO ranking and have spent a lot of time getting to this point. My current URL is smithco.com/rotating-widgets. Because of the way WordPress works, I'm going to have /product/ in the URL on the new site, so the new URL will be smithco.com/product/rotating-widgets. Should I be worried about this? Thanks in advance
On-Page Optimization | ThomasErb0
-
SEO: can id and class be used in an H1?
Can id and class attributes be used in my H1 tag? I realize the best case would be to change it, but that would require a change order from the ecommerce company to fix their sloppy code. Will this hurt SEO? Example:
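The original example didn't survive here, but the short answer is that id and class attributes don't change the heading text a parser extracts from an H1. A minimal sketch with Python's standard-library parser (the attribute values below are hypothetical, just for illustration):

```python
from html.parser import HTMLParser

class H1Text(HTMLParser):
    """Collects the text content of <h1> elements, ignoring attributes."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.headings.append(data)

def h1_text(html: str) -> list:
    p = H1Text()
    p.feed(html)
    return p.headings

# id/class values are made up; both variants yield identical heading text
plain = h1_text("<h1>Page Title</h1>")
styled = h1_text('<h1 id="page-title" class="hero">Page Title</h1>')
print(plain == styled)  # True
```

Since the extracted text is identical either way, the attributes themselves are not an SEO problem.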
On-Page Optimization | K-WINTER0
-
Static pages with dynamic content: Good for SEO?
I wanted to know one's thoughts on reducing duplication by creating a static page with a few dynamic fields. Has anyone done this? If 75% of a page is static and 25% is dynamic, is that a good ratio? It's an idea I am thinking about to combat duplication issues affecting ecommerce pages. Some ecommerce sites generate a new page for a small change like size, but the content is the same. What if you could create a single static page and, depending on the size chosen, only the fields connected to the size are dynamic? Everything else remains the same. For caching purposes, you always serve a cached page with default values. Wouldn't this work? Isn't this a solution for duplicate ecommerce pages? It would also help in ranking: rather than external links spread across duplicate ecommerce pages, they would all point to a single page.
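One way to picture the 75/25 split described above: keep a single static template and substitute only the size-dependent fields at request time. A minimal sketch using Python's standard library (the product name, field names, and prices are all invented for illustration):

```python
from string import Template

# One static page template; only the size-dependent fields vary.
PAGE = Template(
    "<h1>Rotating Widget</h1>"
    "<p>Shared description and specs, identical for every size.</p>"
    "<p>Size: $size - Price: $price</p>"
)

# Size-dependent data; everything else on the page stays the same.
VARIANTS = {
    "small": {"size": "Small", "price": "$9.99"},
    "large": {"size": "Large", "price": "$14.99"},
}

def render(variant: str) -> str:
    """Fill the static template with one variant's dynamic fields."""
    return PAGE.substitute(VARIANTS[variant])

print(render("small"))
```

The static portion of the output is byte-identical across variants, which is the point: one URL, one cacheable page, with only the size fields changing.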
On-Page Optimization | Bio-RadAbs0
-
Is there SEO benefit of automated content through Narrative Science or Automated Insights?
I'm considering working with a group like Narrative Science or Automated Insights to create content for 10k cities around the country. Each article they would create (3-5 per city) would be completely original, based on data we either own or license, written to our editorial tone, voice and direction, and consist of 300-500 words per page. If you are familiar with these groups, you'll know that it is not spun content or the spammy crap that we know Google kills off in droves. It will be well-written, accurate, articulate, original content on topics like health, demographics, population growth, schools and education, and weather pertaining to a city or metro area. My question - assuming the answer is actually known - is how well (or if) this content will perform in Google. It is a significant investment for my group (well into six figures) and we don't want to take this decision lightly. We are looking to challenge sites like city-data.org and bestplaces.net, which largely just regurgitate aggregated data.
On-Page Optimization | barberm0
-
What software do you use/work within for SEO?
Hi, Our site was put together in Dreamweaver and I'm not great at using html so I use Contribute to modify the info on our site. There are some limitations using Contribute so I'm wondering what other people use. Thanks for your input!
On-Page Optimization | karlseidel0
-
How to SEO a website that is being held back by duplicate content?
We have over 20 websites that sell property, each targeted to a different country. People advertise with us to sell their property. The websites are not reaching page 1 for the terms we want, probably because of duplication issues: if we compare one country's website with another on www.duplicatecontent.net, we find nearly 70% duplication between the two, and we are trying to understand why. If someone wants to sell a property in Spain, we create an advert for them, but rather than putting it on the back-end of the Spain website, it goes on a separate website that serves all countries. We have tried adding nofollow tags so that the country-specific website gets acknowledged as the original, but the rankings for key terms will not rise and the duplication remains at nearly 70%. Can anyone suggest the best way forward?
On-Page Optimization | Feily0
-
SEO Site Planning Tool?
Does anyone know of a good SEO Site planning tool? I see that SEOBOOK has something that looked interesting but they want $300/mo! Thanks in advance! Andy
On-Page Optimization | MaxOtto0