Optimizing an application for search engines
Before we get started optimizing our app for search engines, let’s briefly learn how search engines work. Search engines store information about websites in an index, which contains the location, content, and meta information of each site. Adding or updating pages in the index is called indexing and is done by a crawler. A crawler is an automated program that fetches websites and indexes them. It is called a crawler because it follows links on each website to find more websites. More advanced crawlers, such as Googlebot, can also detect whether JavaScript is required to render a website’s contents and, if so, render it before indexing.
The following graphic visualizes how a search engine crawler works:
Figure 8.1 – Visualization of how a search engine crawler works
As we can see, a search crawler has a queue containing URLs that it needs to crawl and index. It then visits the URLs one by one.
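The queue-based crawl loop described above can be sketched in a few lines of code. The following is a minimal illustration, not how any real search engine is implemented: it crawls an in-memory "web" (the pages object and its URLs are made up for this example) instead of fetching real sites:

```javascript
// A made-up, in-memory "web": each URL maps to its content and outgoing links
const pages = {
  "https://example.com/": {
    content: "Home",
    links: ["https://example.com/about", "https://example.com/blog"],
  },
  "https://example.com/about": { content: "About", links: ["https://example.com/"] },
  "https://example.com/blog": { content: "Blog", links: ["https://example.com/about"] },
};

function crawl(startUrl) {
  const queue = [startUrl]; // URLs waiting to be crawled
  const index = new Map();  // URL -> indexed content
  while (queue.length > 0) {
    const url = queue.shift();
    // Skip URLs that are already indexed or unknown in our mock web
    if (index.has(url) || !(url in pages)) continue;
    index.set(url, pages[url].content); // "index" the page
    queue.push(...pages[url].links);    // follow links to discover more pages
  }
  return index;
}

const index = crawl("https://example.com/");
console.log([...index.keys()]);
// → all three pages are discovered starting from a single URL
```

Note that the crawler only ever needs one starting URL; the visited check (`index.has(url)`) prevents it from looping forever when pages link back to each other.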