How to Disable Website Pages Download Using Wget from .htaccess

Valic —  February 15, 2013 — 2 Comments

If you want to stop wget from downloading your site's pages, you need to declare wget as a bad_bot. To do that, add the following code to the .htaccess file located in your website's public_html directory.
After you insert this code, anybody who tries to download your site's pages using wget will receive a 403 Forbidden error.

#Declare Wget a bad_bot
SetEnvIfNoCase User-Agent "^Wget" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
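Note that the Order/Allow/Deny directives above are Apache 2.2 syntax. If your server runs Apache 2.4 with mod_authz_core (an assumption about your setup), a rough equivalent, carrying over the same bad_bot variable, would be:

```apache
# Declare Wget a bad_bot (same as above)
SetEnvIfNoCase User-Agent "^Wget" bad_bot

# Apache 2.4 authorization syntax: allow everyone except bad_bot
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```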

root@debian:~# wget http://www.debian-tutorials.com/
--2013-02-14 xx:xx:xx--  http://www.debian-tutorials.com/
Resolving www.debian-tutorials.com... xxx.yyy.zzz.aaa
Connecting to www.debian-tutorials.com|xxx.yyy.zzz.aaa|:80... connected.
HTTP request sent, awaiting response... 403 Forbidden
2013-02-14 xx:xx:xx ERROR 403: Forbidden.
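The match that triggers the 403 above can be sketched in shell: SetEnvIfNoCase performs a case-insensitive regular-expression match, and the "^" anchors it to the start of the User-Agent header. The is_bad_bot helper below is a hypothetical illustration of that rule, not part of Apache:

```shell
#!/bin/sh
# Sketch of the case-insensitive, start-anchored match that
# SetEnvIfNoCase User-Agent "^Wget" bad_bot performs.
is_bad_bot() {
    printf '%s' "$1" | grep -qiE '^Wget'
}

is_bad_bot "Wget/1.13.4 (linux-gnu)" && echo "blocked"   # matches
is_bad_bot "wget/1.21" && echo "blocked"                 # NoCase: lowercase matches too
is_bad_bot "Mozilla/4.0" || echo "allowed"               # no match, request passes
```

Because only the start of the header is tested, a client that sends any other User-Agent string sails through, which is exactly the weakness the first comment below points out.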

Editor in Chief at Debian-Tutorials, Linux enthusiast.

2 responses to How to Disable Website Pages Download Using Wget from .htaccess

  1. If you’re the bad guy who still wants to download something, just tell wget to alter its user-agent like this:
    wget --user-agent "Mozilla/4.0"

  2. There will always be someone who knows how to get around this, but this method will still stop a lot of those who try.
