The basics of web requests
The worldwide capacity to generate data is estimated to double every two years. Even though there is an entire interdisciplinary field, data science, dedicated to studying data, almost every programming task in software development also involves collecting and analyzing data in some form. A significant part of this is, of course, data collection. However, the data that our applications need is not always stored nicely and cleanly in a database – sometimes, we need to collect it from web pages.
Web scraping is one such data extraction method: it automatically makes requests to web pages and downloads specific information from them. Web scraping allows us to comb through numerous websites and collect the data we need systematically and consistently. The collected data can then be analyzed by our applications or simply saved on our computers in various formats. An example of this would be Google, which...
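The core of this process – making a request to a URL and downloading the response body – can be sketched with Python's standard-library urllib. To keep the sketch self-contained and runnable without internet access, it serves a tiny hypothetical page from a local HTTP server and then fetches it; in a real scraper, the URL would point at an actual website.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A tiny local page to stand in for a real website, so the example
# runs offline. The page content here is purely illustrative.
class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body><h1>Hello, scraper!</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request log lines

def fetch(url: str) -> str:
    """Make a GET request to url and return the response body as text."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), PageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

html = fetch(f"http://127.0.0.1:{server.server_port}/")
print(html)
server.shutdown()
```

A real scraper would follow this same request-download cycle for each target page, then parse the returned HTML to pull out the specific fields it needs.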