How to Download Entire Webpages

Sometimes you want to save an entire webpage to your computer for backup or offline research. Doing this by hand is tedious, but Linux ships with a command-line tool, wget, that automates it. The basic syntax is

wget --recursive URL

For example, to download a website called example.com, you'd run

wget --recursive http://www.example.com

When run, the command creates a subdirectory inside your current working directory, named after the site's domain (here, www.example.com), and saves the downloaded files there. Simple as that.
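On its own, --recursive grabs linked pages but not necessarily everything a page needs to display offline. wget has a few more flags that help with that; the sketch below builds a fuller command (example.com is just a placeholder domain) and prints it instead of running it, so you can inspect it first:

```shell
# Build the command as an array; example.com is a placeholder
# for the site you actually want to mirror.
# --recursive        follow links and download linked pages
# --page-requisites  also fetch the images, CSS, and scripts each page uses
# --convert-links    rewrite links so the saved copy browses offline
# --no-parent        don't climb above the starting directory
# --wait=1           pause one second between requests, out of politeness
cmd=(wget --recursive --page-requisites --convert-links --no-parent --wait=1 http://www.example.com)

# Print the full command line instead of executing it;
# drop the echo to actually start the download.
echo "${cmd[@]}"
```

Be aware that a recursive download can pull a lot of data, so flags like --wait keep you from hammering someone else's server.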