Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Any Tips for Reviving Old Websites?
-
Hi,
I have a series of websites that have been offline for seven years. Do you guys have any tips that might help restore them to their former SERPs glory?
Nothing about the sites themselves has changed since they went offline: same domains, same content, only a different server. What has changed is the SERPs landscape. Competitive terms that these sites used to rank on the first page for now return far more results. I have also noticed that some searches return thesaurus-style matches: pages from traditionally more authoritative websites that use similar vocabulary rather than the exact phrase searched for. This concerns me because I could see a less relevant page outranking me just because it is on a .gov domain with similar vocabulary, even though it is not what people searching for the term actually want.
The sites have also lost numerous backlinks but still have some really good ones.
-
We would highly recommend writing very high-quality, evergreen content.
We would also recommend building a mix of high-quality dofollow and nofollow backlinks.
You should also make sure your web design company delivers a site with a good user experience, so it is simple for shoppers to use.
-
Content Refresh: Update outdated content, add new information, and improve formatting to make it more engaging and relevant to current trends.
SEO Audit: Conduct a thorough SEO audit to identify and fix issues such as broken links, outdated keywords, and poor site structure.
Mobile Optimization: Ensure your website is mobile-friendly, as more users are accessing the internet through mobile devices.
Speed Optimization: Improve page loading speed by optimizing images, minifying CSS and JavaScript files, and using caching techniques.
Backlink Analysis: Review and disavow low-quality or spammy backlinks while seeking opportunities to acquire high-quality backlinks from reputable sources.
User Experience Enhancement: Enhance user experience by improving navigation, implementing clear calls-to-action, and optimizing for readability.
Social Media Integration: Promote your website through social media channels to increase visibility and attract more traffic.
Update Design: Modernize the website design to reflect current design trends and improve overall aesthetics.
Regular Updates: Commit to regularly updating the website with fresh content, news, or blog posts to keep visitors engaged and encourage return visits.
Analytics Monitoring: Use website analytics tools to monitor traffic, user behavior, and conversion rates, and make data-driven decisions to optimize performance.
By implementing these strategies, you can breathe new life into your old website and improve its visibility, usability, and overall effectiveness.
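To make the audit step concrete, here is a minimal stdlib-only Python sketch that pulls the outbound links out of a page's HTML so each one can then be checked for broken (4xx/5xx) responses. The `LinkExtractor` class and the sample markup are illustrative, not part of any particular SEO tool:

```python
# Hypothetical sketch: collect <a href> values from a page's HTML so they
# can be fed into a broken-link check during an audit.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

sample = '<p><a href="/pricing">Pricing</a> <a href="https://example.com/blog">Blog</a></p>'
print(extract_links(sample))  # ['/pricing', 'https://example.com/blog']
```

Each extracted URL could then be requested and anything returning a 4xx/5xx status flagged for fixing or redirecting.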
-
Improving the organic SEO of an old company website is the same as the SEO you would apply to a brand-new company website: white-hat SEO.
You do need high-quality content marketing and good-quality backlinks. We own a summerhouse company, and this is how we got the business onto the first page of Google.
-
If you are reviving an old website, make sure it is mobile-friendly. Then refresh the content and update page titles and meta descriptions. Also make sure you add new content regularly.
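The title and meta description refresh can be semi-automated. Below is a rough stdlib-only Python sketch that flags missing or over-length titles and descriptions; the 60/160-character limits are common rules of thumb rather than official cutoffs, and all names here are illustrative:

```python
# Hypothetical sketch: flag title/meta-description issues on a page.
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Captures the <title> text and the meta description, if present."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html, max_title=60, max_desc=160):
    """Return a list of human-readable issues found on one page."""
    p = TitleMetaParser()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing <title>")
    elif len(p.title) > max_title:
        issues.append("title too long")
    if p.description is None:
        issues.append("missing meta description")
    elif len(p.description) > max_desc:
        issues.append("meta description too long")
    return issues

print(audit_page("<head><title>Hi</title></head>"))  # ['missing meta description']
```

Running something like this across every restored page gives a quick worklist of titles and descriptions to rewrite.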
-
That's a good question. I imagine that references to your website in published books online could be treated similarly to mentions across the web. Whether Google gives them any extra weight is unclear, but the implication is that a mention in a published book could carry some weight.
-
Thank you for the replies. They give me more hope because I was thinking along similar lines.
I certainly plan on reaching out to the authors of old articles that dropped their links, but I am not so sure it is always wise. One of the old websites got its coverage specifically from being controversial, so I am not sure whether sites unlinked because it went down or because of complaints from people pointing out that linking to it was helping it. I have been noticing articles like https://moz.com/learn/seo/backlinks and I would hate to risk losing mentions on high-quality sites by drawing the attention of new editors who might just delete the articles entirely.
Another question I have related to mentions is mentions in books. I have noticed a site of mine showing up in Google Books from a couple of published books discussing it. Does that help SEO like a brand mention on a high quality site?
I would think that Google would consider sites mentioned in published books to be more authoritative than ones just mentioned in blogs or news stories.
-
Hi there,
I'd suggest a few things:
1. If you have old analytics data or log file data to show you which content performed best when the site was last live, take a look at that and prioritise restoring and updating the content which worked well previously.
2. Go through the content and update it with fresh information, data, images, links etc. to give everything a freshen up. Don't worry if content is still relevant and evergreen; just do some checks to make sure.
3. Once you've updated the content and you're happy with it, generate some new XML sitemaps and submit to Google Search Console to prompt Google to crawl the pages again and get them into the index.
4. In addition, perhaps submit the homepage and a few key pages to Google Search Console for crawling and indexing.
5. Once the pages are indexed, keep an eye on Search Console to see how pages are performing and use this data to update the most popular pages.
6. In terms of links, if you can restore any valuable lost ones by reaching back out to the websites, letting them know that the site has relaunched and seeing if they can restore the links, that may give it a nudge too.
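Step 3 above can be sketched in a few lines of stdlib Python. The `build_sitemap` helper and the example URLs below are placeholders, not a Moz or Google tool; the namespace is the standard sitemaps.org one:

```python
# Hypothetical sketch: build a fresh XML sitemap for the restored pages
# before submitting it in Google Search Console.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """urls: list of dicts with 'loc' and optional 'lastmod' keys."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:
            ET.SubElement(url, "lastmod").text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

pages = [{"loc": "https://example.com/", "lastmod": "2024-01-15"}]
print(build_sitemap(pages))
```

The resulting file would be saved as e.g. `sitemap.xml` at the site root and submitted under Sitemaps in Search Console.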
I hope that helps!
Paddy
-
Hi,
As previously stated by seotoolshelp5, with the addition of:
1. Check for any issues with dead links leading to these websites
2. Check for crawling errors
3. Check website speed and improve it if necessary
4. Prioritize the mobile version (if you don't have one, create it)
That's all I can think of for now.
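Points 1–3 can be spot-checked with a small script: the HTTP status surfaces dead links and crawl errors, and the elapsed time gives a crude speed signal. This is an illustrative sketch; the `timed_fetch` name and the stand-in response object used in the demo are assumptions, not a real crawler:

```python
# Hypothetical sketch: fetch one URL, returning its HTTP status and how
# long the full download took.
import time
import urllib.request

def timed_fetch(url, opener=urllib.request.urlopen):
    """Return (status_code, elapsed_seconds) for a single URL fetch."""
    start = time.monotonic()
    with opener(url) as resp:
        resp.read()  # force the full body download so timing is realistic
        status = resp.status
    return status, time.monotonic() - start

# Offline demo with a stand-in response object, so the sketch runs without
# touching the network; swap in the real default opener for your own URLs.
class FakeResponse:
    status = 200
    def read(self):
        return b"ok"
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        return False

status, elapsed = timed_fetch("https://example.com/", opener=lambda url: FakeResponse())
print(status)  # 200
```

Anything returning 4xx/5xx, or taking several seconds, would go on the fix list.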
-