Export Website into XML File
-
Hi,
I am having an agency optimize the content on my sites. I need to create an XML schema before I export the content into XML.
What is the best way to export the content, including meta tags, for an entire site, and what are the steps to do it?
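(For illustration only, not from the original thread: a rough sketch of this kind of export using Python with the requests and BeautifulSoup libraries and a hand-typed URL list. The libraries, example URLs, and XML element names are all assumptions made for the sketch, not anything the site actually uses.)
import xml.etree.ElementTree as ET
import requests
from bs4 import BeautifulSoup

# Hypothetical URL list; in practice this would come from a crawl or the site's sitemap.xml.
urls = [
    "http://www.example.com/",
    "http://www.example.com/about/",
]

root = ET.Element("site")
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    page = ET.SubElement(root, "page", url=url)

    # Page title
    ET.SubElement(page, "title").text = (soup.title.string or "") if soup.title else ""

    # Meta description and keywords, if present
    for name in ("description", "keywords"):
        tag = soup.find("meta", attrs={"name": name})
        ET.SubElement(page, "meta_" + name).text = tag.get("content", "") if tag else ""

    # Visible on-page text
    body = soup.find("body")
    ET.SubElement(page, "content").text = body.get_text(" ", strip=True) if body else ""

ET.ElementTree(root).write("site-content.xml", encoding="utf-8", xml_declaration=True)
The agency may well want a different structure, so agreeing the schema with them first (as the question suggests) is the sensible order of operations.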
-
I don't know if it does anything more than make an offline copy. I haven't encountered your use case before, so I haven't looked into that. You might check whether that program, or others like it, has the kinds of options that could help you.
-
Will this software be able to export the site in XML, or does it basically just make an offline copy?
-
I've used HTTrack Website Copier (http://www.httrack.com/) before. "Website copy software" is one keyword search to get you started finding tools like this.
-
That would probably work, Keri. What are the tools you speak of?
-
There are tools that will crawl and scrape your entire site and make a local copy of it. Would that work as something you could hand off to the agency?
-
I want a copy of the site content (on-page content and metadata) to give to an agency to optimize. It's a regular site hosted on an Apache server.
-
Are you talking about a WordPress blog? What are you trying to do by exporting the site content/metadata into an XML file? Are you trying to use it as a backup, or what?
Related Questions
-
Disavow File and SSL Conversion Question
Moz Community, we have a website that we are moving to SSL. It has been 4 years since we submitted our disavow file to Google via GWT. Since we are moving to SSL, I understand Google looks at this as a new site, so we decided to go through our backlinks and realized that many of the domains we are currently disavowing are no longer active (after 4 years this is expected). Given that, is it OK to create a new disavow file with the new profile in GWT (the SSL version of our site)? Also, is it OK if the new GWT disavow file doesn't include URLs we previously disavowed with the non-HTTPS version? Some links from the old disavow file were disavowed when they shouldn't have been, and we also found new links we want to disavow. Thanks, QL
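(For reference, not part of the original question: the disavow file Google expects is a plain UTF-8 text file, one entry per line, roughly like the sketch below. The domains and comments here are made up for illustration.)
# links we could not get removed after outreach
domain:spam-example-1.com
domain:spam-example-2.net
# a single bad URL rather than a whole domain
http://another-example.org/paid-links-page.html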
Intermediate & Advanced SEO | QuickLearner
-
Hacked website - Dealing with 301 redirects and a large .htaccess file
One of my client's websites was recently hacked and I've been dealing with the aftereffects. The website is now clean of malware and I have already appealed to Google about the malware issue. The current issue is dealing with the 20,000+ crawl errors, which are garbage links created by the hacking. How does one go about dealing with all the 301 redirects I need to create for all the 404 crawl errors? I'm already noticing an increased load time on the website due to a rather large .htaccess file with a couple thousand 301 redirects done already, which I fear will cause my client's website performance and SEO performance to take a hit as well.
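(An illustrative aside, not from the original post: if the garbage URLs created by the hack share a recognizable pattern, a single mod_rewrite rule in .htaccess can answer all of them at once instead of thousands of one-off 301 lines. The pattern below is invented for the example and would need to match the real URLs.)
RewriteEngine On
# Answer all hacked spam URLs, e.g. /cheap-pills-xyz123.html, with 410 Gone
RewriteRule ^cheap-pills- - [G,L]
Whether 410s or 301s are the right response for those URLs is a separate judgement call; the point is only that a pattern-based rule keeps the .htaccess file small.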
Intermediate & Advanced SEO | FPK
-
XML Sitemaps - Multi-lingual website
Hi Mozzers, I am working with a large website that has some of its content translated across multiple languages. I am planning on using The Media Flow to create an hreflang sitemap for content in various languages. Please see the attached image for the questions below. Thanks! Section highlighted yellow: when there is a URL that does not have a translated version, should it not be included in the same hreflang sitemap? Alternatively, could I just remove the languages that are not being targeted, so this would just reflect English-language targeting?
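(For reference, not part of the original question: a minimal hreflang sitemap entry follows the pattern below, with example.com and the language codes chosen purely for illustration. A URL with no translated version would simply appear without the xhtml:link alternate lines.)
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/en/page.html</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/page.html"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.example.com/es/page.html"/>
  </url>
</urlset>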
Intermediate & Advanced SEO | J-Banz
-
XML Sitemaps - how to create the perfect XML Sitemap
Hello, We have a site that is not updated very often - currently we have a script running to create/update the XML sitemap every time a page is added, edited, or deleted. I have a few questions about best practices for creating XML sitemaps.
1. If the site is not updated for months on end, is it a bad idea to force the script to update, i.e. changing the dates once a month? Will Google notice that nothing has changed except the date, i.e. that all the content on the site is exactly the same? Will they start penalising you for updating an XML sitemap when there is nothing new about the website?
2. Is it worth automating submission of the XML file to Bing/Google via their webmaster tools - as I say, even if the site is never updated?
3. Is the use of "priorities" necessary?
4. The changefreq - does that mean Google/Bing expects to see a new file every month?
5. The ordering of the pages - the script seems pretty random and puts the pages in a random order - should we make it order the pages with the most important ones first? Should the home page always be first?
6. Below is a sample of how our XML sitemap appears - is there anything that we should change, i.e. is it all marked up properly?
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.domain.com/contact/</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.domain.com/sitemap/</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
Hope someone can help enlighten us on best practices.
Intermediate & Advanced SEO | JohnW-UK
-
Files blocked in robots.txt and SEO
I use Joomla and I have blocked the following in my robots.txt - is there anything here that is bad for SEO?
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Disallow: /mailto:myemail@myemail.com/
Disallow: /javascript:void(0)
Disallow: /.pdf
Intermediate & Advanced SEO | seoanalytics
-
Redirecting Canonical 301s and Magento Website
I have an issue with a client's website: it has 3,700+ pages, but roughly half of them are duplicates. Thankfully, the only difference between the original and the duplicates is the "?print" at the end of each URL (I suppose this is Magento's way of making a printable version of the same page - I don't know, I didn't build it). My question is, how can I get all the pages like this: http://www.mycompany.com/blah.html?print to redirect to pages like this: http://www.mycompany.com/blah.html Also, do they NEED to be canonical, or will a 301 redirect be sufficient? And after having done this, if anybody knows, is there a way to turn that feature off in Magento? We're expanding our product line, and I don't want to have to keep chasing after these "?print" pages after the fact.
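(An illustrative aside, not from the original post: one way to handle this at the Apache level is a mod_rewrite rule in .htaccess that 301s any URL carrying a print query string back to the clean URL. The rule below is a sketch and assumes the duplicate URLs really are identical apart from "?print".)
RewriteEngine On
# If the query string is, or contains, "print", redirect to the same path with no query string
RewriteCond %{QUERY_STRING} (^|&)print(=|&|$)
RewriteRule ^(.*)$ /$1? [R=301,L]
A rel="canonical" tag pointing at the clean URL is another option; the 301 takes the duplicate out of circulation entirely, while the canonical only tells search engines which version to index.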
Intermediate & Advanced SEO | ClifThompson
-
XML sitemap advice for a website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? So, if I have 12 categories, the total number of URLs will be 12??? If this is true, how do you suggest handling our home page, where the latest articles are displayed regardless of their category... i.e. the spiders will find links to a given article both on the home page and in the category it belongs to. We are using canonical tags. Thanks, Jarrett
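(For reference, not part of the original question: a sitemap is just a flat list of URLs and doesn't instruct spiders to crawl to any particular depth, so splitting 100,000+ articles usually means one sitemap file per category, or per 50,000 URLs (the per-file limit), plus a sitemap index that points at them. The filenames below are invented for the example.)
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-category-1.xml</loc>
    <lastmod>2013-11-06</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-category-2.xml</loc>
    <lastmod>2013-11-06</lastmod>
  </sitemap>
</sitemapindex>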
Intermediate & Advanced SEO | jarrett.mackay
-
One Website - Local + National Ranking
If a client (e.g. a winery) wants to rank both nationally and locally, what are some best practices for doing this on one website? The goals are to:
1. Rank nationally for their wines, wine varietals, etc., so they're found by restaurants, distributors, and customers (could include national directories, content creation, etc.)
2. Rank locally for their tasting room and wines, for people searching locally or looking at that specific region (this could also include Google Places, local directories, etc.)
I'm wondering if the site would need to be subdivided (or "siloed") so that one section is heavily focused on national and another on regional? Also, for the home page, which focus would be most important (maybe national, because it's harder)? Thanks for any ideas! Tom
Intermediate & Advanced SEO | DirectionSEO