Broken links: should I put them in my .htaccess file to recover link juice?
-
Hi, when I had to rebuild my website after my hosting company made an error, I lost over 10,000 pages and many thousands of links coming to my site. What I want to know is: instead of trying to recreate those pages, which would take me a long time, should I put them into my .htaccess file and have them point back into my site?
So for example, if I have a link coming to my site for an article such as "Holidays in Benidorm are not selling well", would it be a good idea to have that link pointed at the main Benidorm section, which is Benidorm news?
And if I had an article called "People are finding it hard to lose weight", instead of writing a new article, could I have the link pointing to my health section?
If this is the correct way to grab back some link juice, would it slow my site down, and how many links could I put in my .htaccess file? What I am trying to say is: if I put, say, 1,000 redirects into my .htaccess file, would that slow my site down? Is this a wise thing to do, or should I just let the links go?
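For reference, the redirects described above are one line each in .htaccess. A minimal sketch, assuming Apache with mod_alias enabled (the old article paths here are made up to match the examples in the question; substitute your real URLs):

```apache
# Point each lost article at the most relevant surviving section page.
Redirect 301 /holidays-in-benidorm-are-not-selling-well /benidorm-news/
Redirect 301 /people-are-finding-it-hard-to-lose-weight /health/
```

Apache re-reads the whole .htaccess file on every request, so a file with thousands of these lines does add some overhead. If you have access to the main server configuration, mod_rewrite's RewriteMap with a hashed map file scales much better, but note that RewriteMap cannot be declared inside .htaccess itself.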
-
Cheers Keri, I have read the information and I am now putting it in place to see how things move along. I have gone through all the 404s and found that I am losing around 9,000 links, so it will take me a long time to get them all into the .htaccess file. I am also just about to post a question about my hosting company, as they are telling me I need a new server over this, so I am not sure whether to stick with them or move my site. Anyway, thanks for the help with this 404 problem.
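With roughly 9,000 lost URLs, writing the redirect lines by hand is painful; a small script can map each old path to a section by keyword and emit the .htaccess lines for you. A rough sketch, assuming a simple keyword-to-section mapping (the keywords and section paths below are hypothetical placeholders for your own):

```python
# Sketch: generate "Redirect 301" lines for an .htaccess file from a list of
# lost URLs. SECTION_MAP and DEFAULT_TARGET are placeholders -- substitute
# the keywords and section pages that fit your own site.

SECTION_MAP = {
    "benidorm": "/benidorm-news/",
    "weight": "/health/",
    "diet": "/health/",
}

DEFAULT_TARGET = "/"  # fall back to the homepage if no keyword matches


def redirect_line(old_path: str) -> str:
    """Return an Apache 'Redirect 301' directive for one lost URL."""
    target = DEFAULT_TARGET
    for keyword, section in SECTION_MAP.items():
        if keyword in old_path.lower():
            target = section
            break
    return f"Redirect 301 {old_path} {target}"


if __name__ == "__main__":
    # In practice, read these from your 404 report export instead.
    lost_urls = [
        "/holidays-in-benidorm-are-not-selling-well",
        "/people-finding-it-hard-to-lose-weight",
        "/some-forgotten-article",
    ]
    for url in lost_urls:
        print(redirect_line(url))
```

Feeding the script your 404 report and redirecting its output into a file gives you a block you can paste into .htaccess, rather than typing thousands of lines manually.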
-
Well, you could let the links go to the graveyard, but why do that? Do you have time to manually rewrite specific content around each targeted keyword and the associated link that points to the old location?
You could also use http://archive.org/web/web.php to recover the source of most of your old site and manually rewrite the page content with a view to doing the above.
Also, you state the hosting company lost over 10,000 pages; I'm assuming you have exhausted all avenues with them about backups of the website pages?
Apart from the above, I can't think of anything other than letting those links go.
Rob
-
Stephanie Chang of Distilled wrote a good post about dealing with expired content and large numbers of 404 pages, and what you should do with those pages: http://www.seomoz.org/blog/how-should-you-handle-expired-content. She covers where you might want to redirect these pages, how the result should look to the user, and so on.