Downloading a page for offline analysis with Wget
Wget is part of the GNU project and is included in most major Linux distributions, including Kali Linux. It can recursively download a web page for offline browsing, converting links for local viewing and fetching non-HTML files along the way.
In this recipe, we will use Wget to download the pages of an application hosted on our vulnerable_vm.
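The recursive-download capability described above maps to a handful of Wget options. The sketch below assembles one plausible invocation (the URL, depth limit, and the choice to print rather than run the command are assumptions for illustration, since the target host only exists inside the lab network):

```shell
# Hypothetical target; substitute your own lab host.
TARGET="http://192.168.56.102/bodgeit/"

# -r   : recurse into linked pages
# -l 2 : limit recursion depth to two levels
# -k   : convert links in the saved pages so they work offline
# -p   : also fetch page requisites (images, CSS, scripts)
CMD="wget -r -l 2 -k -p $TARGET"

# Print the command instead of executing it, since the target
# is only reachable inside the lab environment.
echo "$CMD"
```

Dropping `-l 2` makes Wget use its default depth of five levels, which can pull down far more than intended on a heavily linked site.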
Getting ready
All recipes in this chapter require the vulnerable_vm to be running. In this book's scenario, it has the IP address 192.168.56.102.
How to do it...
- Let's make a first attempt to download the page by calling Wget with the URL as its only parameter:
wget http://192.168.56.102/bodgeit/
As we can see, it only downloaded the index.html file to the current directory, which is the start page of the application. - We will have to use some options to tell Wget to save all the downloaded files to a specific directory and to copy all the files contained in the...
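The options hinted at above can be combined as in the following sketch. This is one plausible option set, not necessarily the book's exact command; the output directory name is hypothetical, and the command is printed rather than executed because the host only exists in the lab:

```shell
# Hypothetical target; substitute your own lab host.
TARGET="http://192.168.56.102/bodgeit/"

# -r                 : download recursively, following links
# -P bodgeit_offline : directory prefix; everything is saved under it
#                      (the directory name here is a made-up example)
CMD="wget -r -P bodgeit_offline $TARGET"

# Printed rather than run, since the target is lab-only.
echo "$CMD"
```

With `-P`, Wget creates the prefix directory if it does not exist and recreates the site's path structure beneath it, keeping the offline copy separate from your working directory.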