How to take an offline copy of a website with wget


When you're going away on leave and need copies of your favourite websites, use wget! It's available on all Linux distros.

You can also build a web archive like the Wayback Machine, which saves websites at a moment in time.
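One simple way to sketch this (the directory layout and URL here are my own example, not a full Wayback-style system): run the mirror on a schedule and save each crawl into a directory named by date, so every run preserves a separate snapshot.

```shell
# Save today's crawl under a dated directory, e.g. archive/2024-01-15/.
# The archive path and target URL are example values.
wget --mirror \
     --convert-links \
     --page-requisites \
     --no-parent \
     -P "archive/$(date +%F)" \
     https://example.com/
```

Re-running the same command tomorrow creates a new dated directory alongside the old one, giving you a crude point-in-time archive.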

You need to be a little sneaky with the command you use, as most larger websites these days will detect that you are trying to copy them and try to stop you. You will need to include extra options with your command to make the site think you are a real user.

Here is a command that works for me.
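A command along these lines should work; the user-agent string, wait times and rate limit below are example values I've chosen, so adjust them for the site you are copying.

```shell
# Mirror a whole site for offline browsing while looking like a normal visitor.
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --no-parent \
     --wait=2 --random-wait \
     --limit-rate=200k \
     -e robots=off \
     --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0" \
     https://example.com/
```

`--mirror` recurses through the site, `--convert-links` and `--adjust-extension` rewrite pages so they work offline, and `--page-requisites` pulls in the images, CSS and JavaScript each page needs. The `--wait`, `--random-wait` and `--limit-rate` options slow the crawl down, and `--user-agent` makes the requests look like a real browser, which is what fools sites that block scrapers. Use `-e robots=off` responsibly, since it tells wget to ignore the site's robots.txt.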

