Tricky: Should I remove this extra navigation?
-
Hello there,
I am working on a website where the owner has manually written navigation links at the end of every page/post.
Example:
Page/Post entry content.
Go to home, click here
Menu item 2
Menu item 3
Menu item 4, etc.
We already have navigation links for these destinations in the main menu and additionally in a sidebar.
But we are two years in, and they have always been there.
Is there a chance that removing all these links will improve internal PageRank distribution?
The website has millions of views per month, so I want to be sure whether I should just leave them as they are or remove them all.
And what extra questions should I be asking myself about this? -
Thanks James, I really appreciate your thoughts. Would love to get confirmation from anyone else willing to chime in.
-
Hi everyone, would appreciate any further comments, thanks in advance.
-
Hi James,
We're trying to increase traffic even more! In my eyes, this could have an effect even at the domain-authority level.
Thanks for the kind words. -
Hi Ricky,
Thanks for your input. We are talking about 8 links across 200 pages; that's 1,600 internal links. That's a lot of internal links, which are in theory diluting the PageRank of every page.
The paradox seems to be that if they are diluting said PageRank toward the main pages of the site, then I suppose it cancels itself out?
I don't mind doing these changes if we have a solid argument for or against...
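Since the thread keeps circling the dilution question, here is a toy illustration, emphatically not how Google computes rankings today, of classic PageRank on a hypothetical 4-page site, with and without site-wide footer links. The graph and all numbers are made up for illustration.

```python
# Toy PageRank power iteration. Site A: inner pages link only back to
# home. Site B: every page also carries footer links to every other
# page (the "extra navigation" scenario from the question).

def pagerank(links, n, damping=0.85, iters=50):
    """links: dict node -> list of nodes it links to."""
    pr = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - damping) / n] * n
        for src, outs in links.items():
            share = damping * pr[src] / len(outs)
            for dst in outs:
                new[dst] += share
        pr = new
    return pr

# Site A: pages 1-3 link only back to the home page (node 0).
site_a = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
# Site B: every page links to all other pages (site-wide footer nav).
site_b = {i: [j for j in range(4) if j != i] for i in range(4)}

pr_a = pagerank(site_a, 4)
pr_b = pagerank(site_b, 4)
```

On this toy graph the extra links flatten the distribution (the home page's share drops from roughly 0.48 to 0.25), which is the "dilution" being asked about; whether that effect is measurable on a real site with millions of views is exactly the part nobody in the thread can promise.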
-
Hi there,
I'm not totally sure that anyone can definitively answer that question (I could be wrong), but my thought is that those types of links, if listed on every page, would probably be ignored, and any real ranking impact would be extremely minimal. I can't imagine this being very important one way or the other.
If you're nervous about removing them, you could always experiment a bit at a time and monitor the impact, but I wouldn't be quick to attribute results one way or the other to such a minor issue (or non-issue).
Related Questions
-
Looking to remove dates from URL permalink structure. What do you think of this idea?
I know most people who remove dates from their URL structure usually do so and then set up a 301 redirect. I believe that's typically the right way to go about this. My biggest fear with doing a global 301-redirect implementation like that across an entire site is that I've seen cases where this has sort of shocked Google and the site took a pretty bad hit in organic traffic. Here's what I'm thinking a safer approach would be, and I'd like to hear others' thoughts. What if...
Change the permalink structure moving forward to remove the date from future posts.
All current URLs stay as is, with their dates.
Moving forward, we go back and optimize past posts in waves (including proper 301 redirects and better URL structure).
This way we avoid potentially shocking Google with a global change across all URLs. Do you know of a way this is possible with a large WordPress website? Do you see any complications that could come about in this process? I'd like to hear any other thoughts about this, please. Thanks!
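For contrast, the "global" version being avoided here is a single server rule. This is a hedged Apache sketch assuming mod_alias is available and date-based permalinks of the form /YYYY/MM/DD/post-slug/; the wave approach would instead add individual redirects batch by batch (for example via a WordPress redirect plugin) as each group of posts is updated.

```apache
# Global date-stripping redirect (the all-at-once version the poster
# wants to avoid). Sketch only -- test on a staging site first.
RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/([0-9]{2})/(.+)$ /$4
```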
Intermediate & Advanced SEO | | HashtagJeff0 -
How to switch from URL-based navigation to Ajax; thousands of URLs gone
Hi everyone, We have thousands of URLs generated by the numerous product filters on our ecommerce site, e.g. /category1/category11/brand/color-red/size-xl+xxl/price-cheap/in-stock/. We are thinking of moving these filters to Ajax in order to offer a better user experience and get rid of these useless URLs. In your opinion, what is the best way to deal with this huge move?
1. Leave the existing URLs responding as before: as they will disappear from our sitemap (they won't be linked anymore), I imagine robots will someday consider them obsolete?
2. Redirect them permanently (301) to the closest existing URL.
3. Mark them as gone (4xx).
I'd vote for option 2. Bots will suddenly see thousands of 301s, but this reflects what is really happening, right? Do you think this could result in some penalty? Thank you very much for your help. Jeremy
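Option 2's "closest existing URL" can be computed mechanically if the filter segments follow known patterns. A sketch: the prefix list below is guessed from the example URL in the question, and a real implementation would use the site's actual filter taxonomy.

```python
# Compute the 301 target for a faceted filter URL by stripping known
# filter segments and keeping only the category path.
# FILTER_PREFIXES is a made-up example, not a real site's taxonomy.

FILTER_PREFIXES = ("brand", "color-", "size-", "price-", "in-stock")

def redirect_target(path):
    """Map /cat1/cat2/brand/color-red/... down to /cat1/cat2/."""
    segments = [s for s in path.strip("/").split("/") if s]
    kept = [s for s in segments if not s.startswith(FILTER_PREFIXES)]
    return "/" + "/".join(kept) + "/" if kept else "/"
```

Note the caveat baked into this approach: a real category whose slug happens to match a filter prefix would be stripped too, so the mapping needs checking against the live category list before generating redirects.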
Intermediate & Advanced SEO | | JeremyICC0 -
Recommended link removal contractors?
Looking for recommendations for a reliable & experienced contractor to help with a link cleanup project. We've identified the problem links, we just need someone to assist with the actual outreach. Would appreciate any suggestions.
Intermediate & Advanced SEO | | MattBarker0 -
Should I Remove Dates From My Old Posts
I have a website with content about home-improvement topics, but the site has had no new content since 2010. All the posts on the WordPress site show dates, which are all from 2010 and prior. Is there a downside in terms of search-engine rankings to removing or changing the dates? What are the risks of removing the dates? Could I lose rankings if I do this? Do you have any personal experience with this situation?
Intermediate & Advanced SEO | | alpha170 -
Latest Panda: removed a lot of duplicate content but still no luck!
Hello here, my website virtualsheetmusic.com has been hit several times by Panda since its inception back in February 2011, so 5 weeks ago we decided to get rid of about 60,000 thin, almost-duplicate pages via noindex meta tags and canonicals (we have not physically removed those pages from our site to return a 404, because our users may search for those items on our own website). We expected this last Panda update (#25) to give us some traffic back... instead we lost an additional 10-12% of our Google traffic, and it now looks even more badly targeted. Let me say how disappointing this is after so much work! I must admit that we still have many pages that may look like thin and duplicate content, and we are considering removing those too (but those are actually bringing us sales from Google!), but I expected to recover a little bit from this last Panda and improve our positions in the index. Instead, nothing: we have been hit again, and badly. I am pretty desperate, and I am afraid I have lost my compass here. I am particularly afraid that the removal of over 60,000 pages from the index via noindex meta tags has, for some unknown reason, been more damaging than beneficial. What do you think? Is it just a matter of time? Am I on the right path? Do we need to wait just a little bit more, keep removing duplicate content (via noindex meta tags), and improve all the rest as usual? Thank you in advance for any thoughts.
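For reference, the head-section markup described above would look roughly like this (the URL is a placeholder, not the site's real one). One caveat worth checking: Google's representatives have reportedly advised against combining noindex with a canonical tag pointing at a different URL, since the two signals conflict.

```html
<!-- Thin page kept live for on-site search (no 404), but dropped
     from Google's index: -->
<meta name="robots" content="noindex, follow">

<!-- OR, where a near-duplicate has a preferred version, point at it
     instead of noindexing: -->
<link rel="canonical" href="https://www.example.com/preferred-version/">
```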
Intermediate & Advanced SEO | | fablau0 -
Restructuring/Removing 301 Redirects Due To Newly Optimized Keywords
Just to be clear, this is for one unique page on a website. Also, please see my diagram attached. Let's say that a page's URL was originally /original. You optimize the page for a new keyword (keyword 1) and therefore change the URL to /keyword-1. A 301 redirect would then be placed: /original > /keyword-1. However, let's say 6 months down the road you realize that the keyword you optimized the page for (keyword 1) just isn't working. You research a new keyword and come up with keyword 2, so you'd like to rename the page's URL to /keyword-2. After placing a redirect from the current page (keyword 1) to the new page (keyword 2), it would look like this: /original > /keyword-1 > /keyword-2. We know that making a server go through more than one redirect slows load time, and even more 'link juice' is lost in translation. Because of this, would it make sense to remove the original redirect and instead place redirects like this? /original > /keyword-2 and /keyword-1 > /keyword-2. To me, this would make the most sense for preserving SEO. However, I've read that removing 301 redirects can cause user issues due to browsers caching the now-removed redirect. Even if this is ideal for SEO, could it be more work than it's worth? Does anyone have any experience/input on this? If so, I greatly appreciate your time!
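Flattening the chain as described would look something like this in Apache mod_alias terms (a sketch using the placeholder slugs from the question):

```apache
# One hop from every historical URL to the current one:
Redirect 301 /original  /keyword-2
Redirect 301 /keyword-1 /keyword-2
```

On the caching worry: browsers do cache 301s aggressively, but because /keyword-1 still redirects somewhere valid rather than being removed outright, a visitor whose browser cached /original > /keyword-1 simply takes one extra (still working) hop.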
Intermediate & Advanced SEO | | LogicalMediaGroup1 -
How to remove duplicate content, which is still indexed, but not linked to anymore?
Dear community, A bug in the tool we use to create search-engine-friendly URLs (sh404sef) changed our whole URL structure overnight, and we only noticed after Google had already indexed the pages. Now we have a massive duplicate-content issue, causing a harsh drop in rankings. Webmaster Tools shows over 1,000 duplicate title tags, so I don't think Google understands what is going on.
Right URL: abc.com/price/sharp-ah-l13-12000-btu.html
Wrong URL: abc.com/item/sharp-l-series-ahl13-12000-btu.html (created by mistake)
After that, we...
Changed all URLs back to the "Right URLs"
Set up a 301 redirect for all "Wrong URLs" a few days later
Now a massive number of pages is still in the index twice. As we no longer link internally to the "Wrong URLs", I am not sure whether Google will re-crawl them any time soon. What can we do to solve this issue and tell Google that all the "Wrong URLs" now redirect to the "Right URLs"? Best, David
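Beyond just waiting, one prerequisite is making sure every "Wrong URL" answers with a single 301 hop. A small sketch that turns a wrong-to-right mapping (e.g. exported from the sh404sef tables) into Apache redirect lines; the sample pair is the one from the question, everything else is hypothetical.

```python
# Generate one-hop Redirect directives from a wrong -> right URL map.

def redirect_rules(mapping):
    """mapping: dict of wrong path -> right path."""
    return "\n".join(
        f"Redirect 301 {wrong} {right}"
        for wrong, right in sorted(mapping.items())
    )

pairs = {
    "/item/sharp-l-series-ahl13-12000-btu.html":
        "/price/sharp-ah-l13-12000-btu.html",
}
```

Some practitioners also temporarily submit the old URLs in a sitemap so Googlebot recrawls them and sees the 301s sooner; treat that as an experiment, not a guarantee.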
Intermediate & Advanced SEO | | rmvw0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income (about 3%) lands on internal search pages, because Google has indexed them. I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
They are a bad user experience
The search pages are (probably) stealing rankings from our real landing pages
We received the webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
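The robots.txt option would look like this (the /search/ path is a placeholder for the site's actual internal-search URL pattern):

```text
User-agent: *
Disallow: /search/
```

The catch for the "damage is done" case: a URL blocked in robots.txt can no longer be crawled, so Googlebot never sees a noindex tag on it, and already-indexed search pages can linger in the index for a long time. If the goal is to get millions of already-indexed pages out, the meta noindex,follow route requires leaving those pages crawlable (i.e. not blocking them in robots.txt) until they drop out.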
Intermediate & Advanced SEO | | HrThomsen0