Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
How to download an entire Website (HTML only), ready to rehost
-
Hi all,
I work for a large retail brand and we have lots of counterfeit sites ranking for our products. Our legal team seizes the websites from the owners, who then set up more counterfeit sites, and so forth.
As soon as we seize control of a website, the site content is deleted and subsequently it falls out of the SERPs to be immediately replaced by the next lot of counterfeit sites.
I need to be able to download a copy of the site before it is seized, so that once I have control of it I can put the content back and hopefully quickly regain the SERPs (with an additional 'counterfeit site' notice superimposed on that page in JS).
Does anyone know or can recommend good software to be able to download an entire website, so that it can be easily rehosted?
Thanks
FashionLux
(Edited title to reflect only wanting to download the HTML, CSS and images of the site. I don't want the sites to actually be functional - only to appear the same to Google)
-
Thanks for the detailed explanation.
If you know of any software or techniques to crawl and download multiple (html) pages and images of a site then please let me know.
There are many programs designed to crawl websites and grab the HTML. Legitimate sites are often duplicated in this manner. You can try searching a couple of relevant terms, or search black hat SEO sites.
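For a plain HTML/CSS site, the crawl-and-grab approach described above can be sketched with nothing but the Python standard library. The function and class names here are purely illustrative, not from any particular tool, and dedicated mirroring software is far more robust in practice:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects every href/src URL found in a page's HTML."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Resolve relative links against the page's own URL.
                self.links.add(urljoin(self.base_url, value))


def same_site_links(html, base_url):
    """Return only the links that stay on the same host as base_url."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return {u for u in parser.links if urlparse(u).netloc == host}


def crawl(start_url, limit=100):
    """Breadth-first fetch of same-site pages, up to `limit` URLs."""
    seen, queue, pages = set(), [start_url], {}
    while queue and len(pages) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = urlopen(url).read().decode("utf-8", errors="replace")
        pages[url] = html
        queue.extend(same_site_links(html, url) - seen)
    return pages  # {url: html}, ready to write to disk
```

Note this only captures what the server publicly sends back - rendered HTML, not the scripts or databases behind it - which matches the "HTML only" goal in the question.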
-
"If it is a very basic pure html/css site, you can pretty much achieve your goal." - Yes this is exactly what I need, I don't want the site to be functional and allow users to place orders (which could happen for non-JS users who don't see the notice that fills the entire screen). I don't want to do anything apart from rehost the site and put a big message up that says "THIS SITE WAS A SCAM - BEWARE OF OTHER SCAM SITES" and cannot be closed down.
"Do you obtain control over just the domain?" - Yes only the domain, not the hosting. We go through legal proceeding to prove the site is illegally selling counterfeit goods and obtain the blank domain.
"I understand your intentions are good, but the method is not complaint with Google's Guidelines." Fair point, but Google shouldn't rank these sites in the first place - they have no genuine links and should be banned already. Google aren't spotting this, so I have to fix Google's **** up. If the site gets banned I couldn't care less. Whilst they rank they serve a genuine purpose of (a) showing users that there are counterfeit sites and they need to be wary and (b) new sites have to better the SEO ability of the old ones in order to rank on page 1.
"Your goal is purely to manipulate search engine results which makes these activities black hat and subject to penalty." Yes but it doesn't matter if the domain is banned, it's not my genuine website and has no links back to my genuine site. I'm not going to host the sites on the same server as our genuine site so no risk to the company. Really I couldn't care less if it gets banned - the counterfeit sites are ranking due to black hat techniques - its in my interest for Google to eventually work it out and fix their algo as it will stop the hundreds of other counterfeit sites from ranking too.
"You can use every social media page, etc. If you put in the time and effort, these pages will rank very well in SERPs."
Yes I could, by building links to social media pages for the hundreds of search terms currently dominated by counterfeiters but this is not a good idea for two reasons:
1. Trying to rank social media sites for irrelevant terms isn't a good thing - you wouldn't do it for users if this situation wasn't happening. As you said already, this is a form of trying to manipulate SERPs and I wouldn't want to risk these genuine SocMed pages getting banned because of this.
2. There are hundreds of search terms to optimise for, and 8 remaining slots on Google to fill for many of these. These sites are also powerful in their SEO strength - 17 counterfeit sites made it into Majestic's top 1 million sites by links. These sites have literally tons of scummy, comment-spammed links pointing at them and they are ranking (shame on Google). Competing against these isn't possible via white hat methods, and I'm not a black hat kind of guy.
My thought process is - Why try and compete against these sites (and waste A LOT of time and effort) trying to bump them down the rankings when they've already done the hard work of optimisation and link building for these terms? I could simply 're-use' them for a genuine purpose (making our customers beware of ordering from unofficial websites).
The previous owner won't sue us for re-using their content - that involves making themselves known to authorities and they'd get arrested in turn for their illegal activities.
I'm happy to debate it more as it's an interesting subject and I don't want to waste time going down the wrong route, but I think re-using the sites is the best option - I just need to get copies of them so they LOOK the same to Google and hopefully keep their SERPs.
If you know of any software or techniques to crawl and download multiple (html) pages and images of a site then please let me know.
Thanks for all of the responses
FashionLux
-
Thanks for the response.
"you can download the the html but not the files themselves" - the html is all I need. I don't want the site to actually work so having only the html files is perfect.
I can go to the homepage and manually save it, and go through 100+ pages and manually download them - I just wanted to ask if there was any software that would do this for me and save some leg work.
Thanks again
-
Most sites are database-driven, and the public does not have direct access to the database. Accordingly, you cannot download a fully functioning copy of the website in the manner you desire.
If it is a very basic pure html/css site, you can pretty much achieve your goal.
Do you obtain control over just the domain? Or do you have access to their hosting account? If you gain access to the hosting account, you can request the host restore the site from a backup.
Even if you gain access to the full site, you really need to be careful. Your goal is purely to manipulate search engine results, which makes these activities black hat and subject to penalty. I understand your intentions are good, but the method is not compliant with Google's Guidelines.
If you own the brand, and you have a trademark, you can build quality sites promoting the brand. You can use every social media page, etc. If you put in the time and effort, these pages will rank very well in SERPs.
Some great legal victories are being won in the US to help with these types of issues. Coach recently won a similar case. It's great to hear the good guys are gaining some ground.
-
Dude, you won't be able to do that - the source files are stored on the server, which you can't access without credentials.
Like Ryan said, you can download the html but not the files themselves.
As long as you get the content, that should be enough - put it into a Word doc and paste it back up once you have the domain; it doesn't even need a template.
You need to stop them from re-using the content on another site.
-
Hi Ryan,
Thanks for the reply. To clarify, the site is deleted prior to me gaining control of it - by the time it comes into my hands it's completely blank, so FTP'ing isn't an option.
The site owners are essentially scamming members of the public by charging hundreds of dollars for goods that are never delivered. We've seized hundreds of sites through legal proceedings, but more keep popping up the moment we get hold of them.
These sites rank for hundreds of popular search terms (some have hundreds/thousands of spammy inbound links), so bumping them off page 1 for all SERPs isn't achievable.
By seizing the sites, keeping the content, but making the site non-functioning (imagine a popup image that fills the screen and can't be escaped) it will hopefully mean we own these SERPs and new counterfeit sites have to try and outrank them.
In turn we'll seize those sites, so the next wave of counterfeit sites have to do even more link building - eventually (maybe years) they'll realise it's not worth it and give up.
Manually downloading individual webpages isn't an option, so I'm wondering if there are any programs that can download all of a site's HTML files, so I can then just upload them via FTP once the site has been seized and add my JavaScript notice.
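The fetch-and-save step being asked about can be sketched roughly as follows, assuming simple path-based URLs; the helper names are made up for illustration. Each fetched page is written to a local folder that mirrors the site's URL structure, ready to re-upload via FTP:

```python
import os
from urllib.parse import urlparse
from urllib.request import urlopen


def local_path(url, out_dir="mirror"):
    """Map a URL to a local file path that mirrors the site's structure."""
    path = urlparse(url).path
    if not path or path.endswith("/"):
        path += "index.html"  # directory-style URLs become index.html files
    return os.path.join(out_dir, path.lstrip("/"))


def save_page(url, out_dir="mirror"):
    """Fetch one URL and write its raw bytes under out_dir."""
    target = local_path(url, out_dir)
    os.makedirs(os.path.dirname(target) or ".", exist_ok=True)
    with urlopen(url) as resp, open(target, "wb") as f:
        f.write(resp.read())
    return target
```

Off-the-shelf mirroring tools (e.g. GNU wget with `--mirror --convert-links --page-requisites`, or HTTrack) cover the same ground, including images and CSS, with far less legwork.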
Thanks for all of the responses
FashionLux
-
Based on your question, I am not clear whether the site is deleted prior to your gaining control of it.
If you are trying to copy a site before you have control over it, all you can do is download the HTML of the various web pages. With more time you may be able to figure out file names on the server and download those too, but that moves into the territory of security probing and hacking.
If you are trying to copy a site after you have control over it, the easiest method to capture everything would be a cPanel backup. cPanel is the most popular software used to administrate Apache web servers. That is the most likely hosting environment for counterfeit sites. A single cPanel backup will capture everything.
Otherwise you can go through and copy the public_html folder (or whatever the main folder is called, it will vary based on server setup) along with the database and other settings you wish to retain such as e-mail.
Understand the old site owner will still have all the passwords and an understanding of the code. While it is unlikely, they could leave themselves backdoors into the site as well. This is one reason why maintaining their site is not likely to be a good idea.
Once you begin running these sites from your server, what is the plan? You would place a "counterfeit" notice and then ??? that's it? Or would you redirect them to your site? If you redirect them to your site and maintain these sites on an ongoing basis, it can be seen as a network of doorway sites.
I understand what you are doing and why. The issue is you are taking actions purely based on search engine rankings. To do such for a short period such as 30-60 days is likely fine. To do it on a more permanent basis will likely lead you to a penalty.
-
Hi Dean,
Heather is right! You should access the websites through FTP. Also, if there are databases, you should be able to export the data from the software that is managing them.
Istvan
-
Hi Dean
Could you not just use your FTP client (like Filezilla or Dreamweaver) to pull the entire site content down, save it locally, ready to upload later? Or do you not have FTP details of the sites you're taking over?
Sorry if I've misunderstood the question
Heather