What is the longest you would go back to resurrect links that should have been 301s?
-
I have never thought of anything beyond a site that was possibly developed a month or two ago, but an interesting prospective client has come along and raises a question.
They had their site "redesigned" in April 2014, and it appears whoever did the work did not realize what a 301 was for. Using Ahrefs or Majestic SEO, they have gone from roughly 15,000 referring pages to 500, and the timeline perfectly intersects the redesign. Sooooo, just wondering if any of you geniuses have ever gone back that far to try and pull off a 301... I am actually just thinking of a link building / content marketing plan, but thought it was an interesting question.
Thanks for the help,
Robert
-
This link is about using a compass
-
The link in this reply is to a website about "cheap goalkeeper gloves"
-
...a man can dream...
My favorite is taking the time to explain followed by silence...not awkward at all!
-
Think about it... They did a poor UI/UX site, with few or no redirects from a fairly well-ranked, high-DA/PA site... do you really think they would even consider a custom 404 page? I think we spend more time trying to explain to clients that not everyone gets good design, SEO, etc., no matter what the name of their company says.
Best -
Linda,
I think the point about it being anecdotal really is what I was looking for. There is no clear direction from the search engines on this, so that is one of the things that makes Moz so strong: good SEOs sharing anecdotal and other evidence/ideas.
Thanks so much, Robert
-
Very good point, Ash, very good. I have seen it continue to crawl for a year or more as well. Checking for the 404s as a comparison and redirecting to fix the 404s is a good explanation. Well done.
-
Tom,
Great points. I am not as concerned with content relevance, as we are fairly careful with that. The issue was that it was a new site that reused the old content, and they did not do redirects. I am going to give it a try with what I find to be the most relevant pages from the old site, but not with all of them, as I do not want to "overdo" it.
Thanks
-
Great story.
That reminds me of one... I know of a small AdSense site that went offline without the owner realizing it. A few months went by before they noticed that the hosting was not responding. The site was brought back online, popped back into the SERPs, and resumed making money.
-
When I first started my current job, I found out that in the past there had been a separate, small website for one of the products, which had been abandoned a couple of years before that. (The site, not the product.)
The site still seemed to have some good links pointing to it, so for the heck of it I 301'd it to the main page on the current site for that product. That page quickly grew to be one of the strongest pages on the current site.
This is just one anecdotal data point, but based on my experience, if it's not a huge amount of work, I'd try redirecting, at least for the pages with the best links.
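A toy way to put "at least for the pages with the best links" into practice: from a backlink export, keep only the old URLs above some referring-domain threshold and redirect those first. The URLs, counts, and cutoff below are all made up for illustration.

```python
# Hypothetical (url, referring_domains) pairs from a backlink export.
backlinks = {
    "/old-product": 120,
    "/blog/2013-announcement": 3,
    "/resources/guide.pdf": 45,
    "/press/old-release": 1,
}

MIN_REFERRING_DOMAINS = 10  # arbitrary cutoff for this sketch

# Keep only pages worth the redirect effort, strongest first.
worth_redirecting = sorted(
    (url for url, count in backlinks.items() if count >= MIN_REFERRING_DOMAINS),
    key=lambda url: -backlinks[url],
)
print(worth_redirecting)
# ['/old-product', '/resources/guide.pdf']
```

The weaker pages can be left to 404 (or pointed at a relevant hub page) rather than mapped one-to-one.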
-
Really hope that they had a custom 404 page at least!
-
What? You callin me a fruit picker???
There are worse things
-
I've been in a similar situation. My recommendation is to look in Google Webmaster Tools in 'Crawl > Crawl Errors' and if it is reporting them as 404 pages, what's the harm in redirecting them?
Google can crawl old URLs which 404 or provide another error for years (as in my case - it was an old website redesign from A LONG time ago).
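To build on the Crawl Errors suggestion: once you export the report, filtering it down to the 404s gives you the candidate list for 301s. A minimal sketch, assuming a CSV export with `URL` and `Response Code` columns (the column names and sample rows are assumptions; adjust them to match your actual export):

```python
import csv
import io

# Stand-in for a Webmaster Tools "Crawl Errors" CSV export.
SAMPLE_EXPORT = """URL,Response Code,Detected
/old-services.html,404,2014-05-01
/about-us,200,2014-05-01
/old-products/widget,404,2014-05-02
"""

def urls_to_redirect(csv_text: str) -> list[str]:
    """Return the URLs reported as 404s -- these are the 301 candidates."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["URL"] for row in reader if row["Response Code"] == "404"]

print(urls_to_redirect(SAMPLE_EXPORT))
# ['/old-services.html', '/old-products/widget']
```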
-
Hi Robert,
I would pull the site up on archive.org and take a look at the old site structure as best you can that way. Then I would figure out if there are any links that are valuable to the site and relevant to the pages that exist.
It is still very risky. I have a friend who changed his domain and redirected 35,000 URLs to a site of half a million URLs; however, the links from the old domain, which had very high PageRank, still did not benefit the site very much at all.
I would export the URLs from Ahrefs, upload them to DeepCrawl, and look at the similarities between the old URLs' content and the new URLs,
considering that the links have already been checked and confirmed not to be bad.
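If you don't want to run the comparison through a crawler, a crude version of that old-URL-to-new-URL matching can be done with plain string similarity on the paths. This is only a sketch; every URL below is a hypothetical example, and a low score just flags pairs that need a human look.

```python
import difflib

# Hypothetical exports: old paths from Ahrefs, new paths from the live site.
old_paths = ["/services/web-design.html", "/about-the-team.php", "/portfolio/2013"]
new_paths = ["/web-design", "/about", "/our-work"]

def best_match(old: str, candidates: list[str]) -> tuple[str, float]:
    """Return the new path most similar to the old one, with a 0-1 score."""
    scored = [(new, difflib.SequenceMatcher(None, old, new).ratio())
              for new in candidates]
    return max(scored, key=lambda pair: pair[1])

for old in old_paths:
    new, score = best_match(old, new_paths)
    # Only map confidently similar pairs; send the rest to a relevant hub page.
    print(f"{old} -> {new} ({score:.2f})")
```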
http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
http://www.seoconsultants.com/tools/check-server-headers-tool/
I hope this is of some help,
Tom
-
I don't know the answer to your original question... but I would be jumping to redirect anything from April 2014.
Nobody really "knows" the answer to this... but I think there is a good chance that Google will continue to crawl these connections, and some of those links may still be good.
-
What? You callin me a fruit picker???
Actually, I am pondering it quite a bit. Really a shame these people did this to them. And... a bad job on the site as well, as we must rebuild.
Thanks Andy.
-
Hi Robert,
I have never gone back that far myself (30-40 days max), but I can see no reason why this isn't worth a shot. There could still be a lot of potential hanging around out there for the grabbing. Grab any low-hanging fruit with both hands!
-Andy