Replacing the foreach loop with the foreach-object cmdlet
When you write a function to process a file, a typical approach might look like this:
function process-file {
    param($filename)
    $contents = get-content $filename
    foreach ($line in $contents) {
        # do something interesting
    }
}
This pattern works well for small files, but for really large files this kind of processing performs very badly and can even crash with an out-of-memory exception. For instance, running this function against a 500 MB text file on my laptop took over five seconds, even though the loop doesn't actually do anything. To measure how long it takes to run, we can use the measure-command cmdlet, as shown in the following screenshot:
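The call itself looks something like this (the file path is only a placeholder for whatever large file you test against):

Measure-Command { process-file -filename 'C:\temp\bigfile.txt' }

# The elapsed time in seconds can be pulled straight from the result:
(Measure-Command { process-file -filename 'C:\temp\bigfile.txt' }).TotalSeconds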
Note that the result is a TimeSpan object and that its TotalSeconds property holds the value we are looking for. You might not have any large files handy, so I wrote the following quick function to create large text files that are approximately the size you ask for:
function...
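A minimal sketch of such a generator (the New-BigFile name, its parameter names, and the StreamWriter approach here are illustrative assumptions) might look like this:

function New-BigFile {
    param(
        [string]$Path,      # full path of the file to create; .NET resolves relative
                            # paths against the process working directory, so pass a full path
        [int]$SizeInMB      # approximate size to create, in megabytes
    )
    # Each line is roughly 1 KB, so $SizeInMB * 1024 lines comes out near the requested size.
    $line = 'x' * 1KB
    $writer = New-Object System.IO.StreamWriter $Path
    try {
        for ($i = 0; $i -lt ($SizeInMB * 1KB); $i++) {
            $writer.WriteLine($line)
        }
    }
    finally {
        $writer.Dispose()
    }
}

Creating a 500 MB test file is then just New-BigFile -Path 'C:\temp\bigfile.txt' -SizeInMB 500. Writing through a StreamWriter keeps the generator itself from holding the whole file in memory, which matters when you are deliberately creating files too big to load comfortably.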