Pull meta descriptions from a website that isn't live anymore
-
Hi all, we moved a website over to WordPress two months ago. It was using .cfm pages before, so all of the URLs have changed. We implemented 301 redirects for each page, but we weren't able to copy over any of the meta descriptions.
We have an export file that contains all of the old web pages. Is there a tool that would let us upload the old pages and extract the meta descriptions so we can get them onto the new website? We use the Yoast SEO plugin, which has a bulk meta descriptions editor, so I'm assuming the easiest/most effective approach would be a tool that generates some sort of .csv or Excel file we can copy and paste from? Any feedback/suggestions would be awesome, thanks!
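In case it helps, the extraction itself is simple enough to script. Below is a minimal Python sketch (standard library only; the folder layout and output columns are my assumptions, not Yoast's import format) that walks a directory of exported .cfm files and writes each file's meta description to a CSV:

```python
import csv
import pathlib
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Grabs the content attribute of the first <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and self.description is None:
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content") or ""

def extract_descriptions(export_dir, out_csv):
    """Walk export_dir for .cfm files; write one CSV row per file with its meta description."""
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "meta_description"])
        for path in sorted(pathlib.Path(export_dir).rglob("*.cfm")):
            parser = MetaDescriptionParser()
            parser.feed(path.read_text(encoding="utf-8", errors="ignore"))
            writer.writerow([str(path), parser.description or ""])
```

You'd still need to map each old .cfm path to its new WordPress URL before pasting anything into Yoast's bulk editor, and any CFML server-side tags in the export are simply ignored by Python's lenient HTMLParser.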
-
If your old site is archived in the Wayback Machine, you can pull the meta descriptions from it with Screaming Frog. If you want to do this, let me know and I'll help you with the settings.
-
I'd go one better and crawl from a local web server, just to be safe. Realistically, though, a password-protected directory is probably more practical in this instance.
-
Note that Ray-pp suggests using a private directory. Make sure to keep it out of the SERPs.
-
Thanks Ray, we've used the Screaming Frog Spider for some time now, and I've flirted with the idea of re-uploading the web files. This may be our best option, thanks.
-
Hi George,
If you can upload the old pages to a private directory, you can then use the Screaming Frog SEO tool to crawl all of the pages and retrieve the meta descriptions. That would allow you to easily export much of the on-page SEO, including your meta information.
The Screaming Frog SEO Spider is a must-have tool for SEOs - check it out if you haven't already!
Related Questions
-
After a hack and remediation, thousands of URLs still appear as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index, until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped to 18,000, but the Search Console reports give me no way to find out why the jump happened or which new URLs were added; the only sort mechanism is "last crawled", and they don't show up there. How long can I expect it to take for these remaining URLs to be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which so far has not helped. Is there any way to see, in the new GSC view, why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | | rickyporco0 -
Temporarily redirecting a small website to a specific url of another website
Hi, I would like to temporarily redirect a small website containing info about a specific project to a specific URL about that project on my main website. The reason is that the small website no longer contains accurate info. We will adapt the content in the next few weeks and then remove the redirect again. Should I set up a 301 or a 302? Thanks
Intermediate & Advanced SEO | | Mat_C1 -
Duplicate Content through 'Gclid'
Hello, we've had the known problem of duplicate content through the gclid parameter appended by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site, so that when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see below: https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
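For anyone tackling this programmatically: the "original page" for each gclid variant can be computed by stripping the tracking parameter before emitting the canonical tag. A minimal sketch (Python standard library; the parameter set is an example you'd extend with fbclid, utm_*, etc. as needed):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Tracking parameters to strip; extend to taste (this set is an assumption).
TRACKING_PARAMS = {"gclid"}

def canonical_url(url):
    """Return the URL with known tracking parameters removed from the query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

Running every landing-page URL through a function like this gives you the clean URL that each gclid variant's canonical tag should point to.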
Intermediate & Advanced SEO | | MyPetWarehouse0 -
Server responds with 302 but the page doesn't appear to redirect?
I'm working on a site and running some basic audits, including a campaign within Moz. When I put the domain into any of these tools, including response-header checkers, the response is a 302 saying there is a redirect to an error page. However, the page itself doesn't redirect and resolves fine in the browser, yet none of the audit tools can seem to get any information from any of the pages. What is the best way to troubleshoot what is going on here? Thanks.
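One way to troubleshoot is to fetch the raw response headers yourself, without following redirects, so you see exactly what the audit tools see. A hedged sketch using only Python's standard library (the user-agent string is a placeholder; try it with both a browser-like and a bot-like value, since user-agent sniffing by a firewall or security plugin could explain browsers resolving fine while crawlers get 302'd to an error page):

```python
import http.client
from urllib.parse import urlsplit

def fetch_status(url, user_agent="Mozilla/5.0 (compatible; audit-check)"):
    """Fetch a URL WITHOUT following redirects; return (status code, Location header)."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    else:
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("GET", path, headers={"User-Agent": user_agent})
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location
```

If the status and Location differ depending on the User-Agent you send, you've found your culprit.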
Intermediate & Advanced SEO | | jim_shook0 -
Google is ranking the wrong page and I don't know why?
I have an e-commerce store, and to make things easy, let's say I am selling shoes. There is a category named 'Shoes' and 3 products: 'Sport shoes', 'Hiking shoes' and 'Dancing shoes'. My problem: for the keyword 'Shoes', Google is showing the product result 'Sport shoes'. This makes no sense from a user perspective. (It's like searching for 'iPhone' and getting a result for 'iPhone 4s' instead of a general overview.) Now, the specifics of my category page (which I want Google to rank): It has more external links, of higher quality. It has more internal links. It has much higher page authority. It has useful text to guide the user for the keyword. It is a category instead of a product. Given all this, I just don't know how to signal to Google that this page is the one that makes sense to show in SERPs. Hope you can help with this!
Intermediate & Advanced SEO | | soralsokal0 -
Can I swap a website yet keep its high ranking for a competitive keyword?
Couldn't fit the entire question in the main bit, so the explanation is here: I've been working on a client's website, which is hosted by Volusion, and doing SEO for them for about a year. Now we've finally got them ranking at the lower end of page 1 (around 10+) for their main keyword. They now want to move from Volusion over to Amazon Webstore 😢, which seems to be an SEO nightmare even from my basic understanding of SEO. From looking at the coding, the way Amazon Webstore is built, and how restricted you are from doing anything with it, I am almost certain the shop will be extremely difficult to optimise and we will have to completely change nearly all of the content. Finally, the actual question: I was thinking I could get them to delay their move to Amazon Webstore until they are ranking in the top 5 for this top keyword. Once they switch over, I assume they'll keep this ranking for at least a short while? This keyword attracts a high volume of traffic, and if that traffic is clicking on the result for their website, Google sees that people are finding the website valuable (not clicking back to the Google results). Will they be able to hold onto this high ranking? Basically what I'm asking is: this will be a terribly outdated, badly SEO'd shop, but if a high volume of people are clicking on it and staying on it from its lingering ranking, will Google just let it stay at the top? A massive amount of gratitude in advance for anyone who tries to help with this! 😄
Intermediate & Advanced SEO | | acecream0 -
My website hasn't been cached for over a month. Can anyone tell me why?
I have been working on an eCommerce site, www.fuchia.co.uk. I asked an earlier question about how to get it working and ranking, took on board what people said (such as optimising product pages), and I think I'm getting there. The problem now is that Google hasn't indexed my site in over a month, and the homepage cache 404s when I check it on Google. At the moment there is a problem with the site being live for both www and non-www versions; I have told Google in Webmaster Tools which preferred domain to use and will also have the developers 301 to the preferred domain. Would this be the problem stopping Google from properly indexing me? Also, only around 30 of 137 pages were indexed from the last crawl. Can anyone tell me or suggest why my site hasn't been indexed in such a long time? Thanks
Intermediate & Advanced SEO | | SEOAndy0 -
What if you can't navigate naturally to your canonicalized URL?
Assume this situation for a second... Let's say you place a rel=canonical tag on a page and point it to the original/authentic URL. Now, let's say that original/authentic URL is also populated into your XML sitemap. So, here's my question: since you can't actually navigate to that original/authentic URL (it still loads with a 200, it's just not actually linked to from within the site itself), does that create an issue for search engines? Last consideration: the bots can still access those pages via the canonical tag and the XML sitemap; it's just that the user wouldn't be able to reach those original/authentic pages in their natural site navigation. Thanks, Rodrigo
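If you want to sanity-check that setup, you can pull the rel=canonical href out of each sitemap URL's page source and confirm it self-references. A minimal sketch with Python's standard-library HTMLParser (the function and class names are mine):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if (d.get("rel") or "").lower() == "canonical":
                self.canonical = d.get("href")

def extract_canonical(html):
    """Return the canonical URL declared in an HTML document, or None if absent."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical
```

Fetching each sitemap URL and asserting `extract_canonical(page_html) == url` would flag any original/authentic page whose canonical points somewhere unexpected.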
Intermediate & Advanced SEO | | AlgoFreaks0