How to Download an Entire Website Using Wget for Offline Viewing

Valic — June 24, 2010

Wget is a free utility for non-interactive download of files from the Web.  It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
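In its simplest form, wget just fetches a single URL from the command line, for example (the page below is only an illustration):

wget http://www.debian-tutorials.com/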

Using Wget to download an entire website:

Step 1. Create the directory where you are planning to store the website content:

mkdir /home/user/offline_site

Step 2. Go to the directory that you just created:

cd /home/user/offline_site/

Step 3. Use the following command to download the website:

wget -r -N -m -k http://www.debian-tutorials.com/

NOTE: Command explanation:

-r turns on recursive retrieving, so linked pages are downloaded as well
-N turns on time-stamping, so only files newer than your local copies are re-downloaded
-m turns on mirroring options suited to copying a whole site (it implies -r and -N, among others)
-k converts the links in the downloaded documents so they point to the local files, which is what makes the site viewable offline

A slightly fuller variant of the command is sketched below.
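If pages come out missing images or stylesheets, wget can also be asked to fetch page requisites and to stay below the starting URL. The flags below are standard wget options, but this exact combination is only a suggestion rather than the command used in this tutorial:

wget -m -k -p --no-parent http://www.debian-tutorials.com/

Here -p (--page-requisites) pulls in the images and CSS needed to render each page, and --no-parent keeps wget from wandering above the starting directory.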

After the command completes, all of the site's content will be downloaded into the directory you created.
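To browse the copy offline, open the downloaded start page in a web browser. By default wget saves the files in a subdirectory named after the host, so for this example the start page should end up at a path like the one below:

firefox /home/user/offline_site/www.debian-tutorials.com/index.html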

Done.

Valic

Editor in Chief at Debian-Tutorials, Linux enthusiast.
