The PostgreSQL logger process collects every enabled log event, whether it contains queries, errors, or other messages, into a single log file. It is currently not possible to configure separate log files for separate purposes; for example, you cannot maintain a separate error log and a separate slow-query log. The logging we enabled using the previous recipes in this chapter can produce several thousand, or even millions, of lines, which makes it difficult for a DBA or developer to parse the log files and get a clear view of which queries are running slowly and how often each query has run.
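As a minimal sketch (not necessarily the exact settings used in the earlier recipes), a postgresql.conf tuned for pgBadger-friendly logging might look like the following; the specific values shown here are illustrative assumptions and should be adjusted to your environment:

    # Send log output to files via the logging collector
    logging_collector = on
    log_destination = 'stderr'
    # Log every statement running longer than 1 second (example threshold)
    log_min_duration_statement = 1000
    # A log_line_prefix format that pgBadger can parse
    log_line_prefix = '%t [%p]: user=%u,db=%d,app=%a,client=%h '
    # Additional events that enrich the report
    log_checkpoints = on
    log_connections = on
    log_disconnections = on
    log_lock_waits = on
    log_temp_files = 0
    log_autovacuum_min_duration = 0

Lowering log_min_duration_statement captures more queries but makes the log, and the parsing effort, correspondingly larger.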
For this reason, we use pgBadger, which can parse the log files and generate a rich HTML report that can be viewed in a browser. In this recipe, we shall discuss how to analyze PostgreSQL logs and generate a report using pgBadger.
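Assuming pgBadger is already installed and the server writes its log under /var/log/postgresql (the path and file names below are examples only), a report can typically be generated with a single command:

    # Parse the log file and write an HTML report
    pgbadger /var/log/postgresql/postgresql.log -o postgres_report.html

    # Optionally parse large or multiple log files using several CPU cores
    pgbadger -j 4 /var/log/postgresql/*.log -o postgres_report.html

The resulting postgres_report.html file can then be copied to a workstation and opened in any browser.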
A sample pgBadger report can be viewed at the following URL: http://pgbadger.darold.net/samplev7.html. The following screenshot shows what the pgBadger dashboard looks like...