Reporting on website access and errors
After a website has been active for some time, you will likely want to review the logs to see which content is popular and which is not. Assuming you have kept the default W3C logging format and its default fields, the script explained in this recipe should give you a high-level view of your website's traffic.
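For reference, a W3C-format IIS log begins with directive lines (prefixed with #) followed by space-separated entries, where the #Fields directive names the columns. The excerpt below is illustrative only, not taken from a real server:

```
#Software: Microsoft Internet Information Services 8.0
#Version: 1.0
#Date: 2014-01-01 00:00:00
#Fields: date time s-ip cs-method cs-uri-stem sc-status
2014-01-01 00:00:01 192.168.1.10 GET /index.html 200
2014-01-01 00:00:02 192.168.1.11 GET /missing.html 404
```

The script in this recipe strips the directive lines and reuses the #Fields line as a CSV header so the entries can be queried like any other tabular data.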
How to do it...
Here, we use a PowerShell function to parse the IIS log files and generate a report:
Function Parse-IISLogs {
    # Identify the IIS logging directory for the site
    Import-Module WebAdministration
    $logFile = Get-ItemProperty 'IIS:\Sites\Default Web Site' `
        -Name logFile.directory.value
    $logFile = [Environment]::ExpandEnvironmentVariables($logFile)

    # Export the log files to a temporary CSV file: skip the #Software,
    # #Version, and #Date directives, and turn #Fields into the header row
    $logFile += "\*\*.log"
    (Get-Content $logFile | Where-Object {$_ -notlike "#[S,V,D]*"}) `
        -replace "#Fields: ","" | Out-File $env:temp\webLog.csv

    # Import the CSV file into memory (IIS log entries are space-delimited)
    $webLog = Import-Csv $env:temp\webLog.csv -Delimiter " "
    ...
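The same directive-skipping technique can be sketched outside PowerShell. The minimal Python example below (an illustration only, not part of the recipe, and using made-up sample data) drops the #Software/#Version/#Date lines, promotes the #Fields line to a header, and reads the space-separated entries into dictionaries:

```python
import csv
import io

def parse_w3c_log(text):
    """Parse W3C extended log text: skip '#' directive lines,
    except #Fields, which becomes the CSV header row."""
    lines = []
    for line in text.splitlines():
        if line.startswith("#Fields: "):
            lines.append(line[len("#Fields: "):])  # header row
        elif line.startswith("#"):
            continue                               # other directives
        else:
            lines.append(line)
    # W3C log entries are space-separated
    reader = csv.DictReader(io.StringIO("\n".join(lines)), delimiter=" ")
    return list(reader)

sample = """#Software: Microsoft Internet Information Services 8.0
#Version: 1.0
#Date: 2014-01-01 00:00:00
#Fields: date time cs-method cs-uri-stem sc-status
2014-01-01 00:00:01 GET /index.html 200
2014-01-01 00:00:02 GET /missing.html 404
"""

entries = parse_w3c_log(sample)
print(entries[0]["cs-uri-stem"])  # /index.html
print(entries[1]["sc-status"])    # 404
```

Once the entries are in this shape, counting hits per URL or filtering by status code is a simple grouping operation, which is exactly what the PowerShell recipe does with the imported CSV.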