You can back up a copy of an entire website, and automate the process with a simple batch file (more on that later), using a piece of software named wget. Wget lets you save a backup of every file on a website.
Set up wget
- Download a Windows build of Wget.
- Create a folder at C:\WebsiteBackup (or any name and location you prefer).
- Extract the contents of Wget into the folder you created.
Set up the batch file
- Open Notepad and type the following on the first line:
- cd C:\WebsiteBackup
- On the second line type:
- wget -r -k -p http://example.com (replace example.com with the address of the site you want to back up; -r downloads recursively, -k converts links for offline viewing, and -p fetches page requisites such as images and stylesheets)
- Save the file as WebsiteBackup.bat.
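Assuming the folder name used above (and example.com as a stand-in for the real site address), the finished WebsiteBackup.bat contains just these two lines:

```bat
REM Change to the backup folder so downloaded files land here
cd C:\WebsiteBackup
REM -r recursive, -k convert links for offline viewing, -p fetch page requisites
REM Replace example.com with the site you want to back up
wget -r -k -p http://example.com
```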
To run the file, double-click WebsiteBackup.bat. A command prompt window will open; wait while the website downloads.
You can automate this process: right-click the file, create a shortcut, and put that shortcut into your “Startup” folder. Whenever you log into Vista/XP, a backup of the website will be made.
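If you would rather not tie the backup to logging in, Windows' built-in schtasks command can run the batch file on a schedule instead. A sketch (the task name and run time here are just examples):

```bat
REM Create a scheduled task that runs the backup daily at 2:00 AM
schtasks /create /tn "WebsiteBackup" /tr "C:\WebsiteBackup\WebsiteBackup.bat" /sc daily /st 02:00
```

Run schtasks /delete /tn "WebsiteBackup" to remove the task later.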
About Rich
Rich is the owner and creator of Windows Guides; he spends his time breaking things on his PC so he can write how-to guides to fix them.
Not working
Just verified the link — the site was probably down — so it should work for you now
Great little tip and software. I will share this with my hosting clients so they can do some personal backups of their websites.
Make it great,
Matt
Is this legal? I mean, someone might not like their property being used this way. But then it’s a nice backup, huh…!
@Aviator – You will need a password to get to any server-side data.
Just keep in mind: be careful if you use this command, since it will recursively download ALL the files from that website. Can you imagine what would happen if you used this on a site with a gazillion pages, like http://www.microsoft.com or http://www.cnn.com?
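As that comment notes, a plain -r will try to mirror everything it can reach. Wget has flags to rein that in; for example (the depth and wait values here are only illustrative):

```bat
REM -l 2 limits recursion to two levels of links deep
REM --wait=1 pauses one second between requests to go easy on the server
wget -r -l 2 --wait=1 -k -p http://example.com
```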
I thought wget was just a Linux thing.
Works fine on my site, but some directories are password protected, how do I get those?
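For directories behind HTTP basic authentication, wget can supply credentials on the command line; note this won't help with form-based logins. A sketch (the username, password, and URL are placeholders):

```bat
REM Pass HTTP basic-auth credentials for password-protected directories
wget -r -k -p --http-user=myuser --http-password=mypass http://example.com/protected/
```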