How to Download Entire Webpages

A lot of the time you’d like to download an entire website to your computer for backup or research purposes. Doing that by hand, page by page, is impractical, but luckily Linux provides a terminal command for the job: wget. The syntax is

wget -r [website URL]

The -r flag stands for recursive: it tells wget to follow the links on each page it downloads and fetch those pages too.
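
By default the recursion stops five levels of links deep; wget’s standard -l (or --level) option lets you adjust that, so for example

wget -r -l 2 [website URL]

grabs only pages within two links of the starting page.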

For example, to download a website called example.com you’d use

wget -r http://www.example.com
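
Run against a real site, that produces a local copy of everything wget can reach. On a hypothetical site the result might look something like this (the file names here are invented for illustration):

www.example.com/
├── index.html
├── about.html
└── images/
    └── logo.png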

That top-level directory is created inside whatever directory you ran the command from, and it’s titled the same as the website’s domain name. Simple as that… well, most of the time. Some websites have structures and files that resist a plain recursive download, so for those, use a more complete command line like this…

wget -mpck --user-agent="" -e robots=off --wait 1 -E [whateverwebsite.com]

and that’ll do it.
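
Since that command packs a lot of flags together, it’s worth knowing what each one does (all of these are standard wget options):

-m              (--mirror) shorthand for -r -N -l inf --no-remove-listing: unlimited-depth recursion plus timestamping, so re-running only fetches what changed
-p              (--page-requisites) also downloads the images, stylesheets, and scripts each page needs to display properly
-c              (--continue) resumes partially downloaded files instead of starting over
-k              (--convert-links) rewrites links in the saved pages so they work offline
--user-agent="" sends a blank user-agent string, for sites that refuse requests identifying themselves as wget
-e robots=off   ignores the site’s robots.txt restrictions
--wait 1        pauses one second between requests so you don’t hammer the server
-E              (--adjust-extension) saves HTML pages with a .html extension so they open cleanly in a browser

Filled in with a placeholder domain, the whole thing looks like

wget -mpck --user-agent="" -e robots=off --wait 1 -E http://www.example.com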