How to do logging in PowerShell without ‘file is locked’ exceptions
While I was working on a simple script that calls a REST API to pause and resume the monitoring of devices, we ran into trouble because the script was executed in the startup and shutdown events of various workstations. So I came up with the idea of logging the script execution to a file. This was fine, but my customer reminded me that a mass of workstations will start up in parallel, so we might get the exception ‘The process cannot access the file ‘xxx’ because it is being used by another process.’
I had two problems:
1. How to test parallel file access
2. How to solve the ‘file is being used by another process’ exception, should it become a problem.
First, I developed my simple Log4PowerShell function, which writes log entries to a CSV file:
function Write-Log {
    [CmdletBinding()]
    param(
        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [string]$Message,

        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [ValidateSet('DEBUG','INFO','WARN','ERROR')]
        [string]$Severity = 'INFO'
    )

    [pscustomobject]@{
        Date     = (Get-Date -Format "dd.MM.yyyy")
        Time     = (Get-Date -Format "HH:mm:ss.fff")
        Severity = $Severity
        Message  = $Message
    } | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation
}
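For illustration, a call like the following (the message text is just an example) appends one row to C:\Temp\PowerShell-Log.csv:

# Example call - appends one row to C:\Temp\PowerShell-Log.csv
Write-Log -Message "Pausing monitoring for this workstation." -Severity WARN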
Next, I thought about how to test this. The Write-Log function writes its entries to the file C:\Temp\PowerShell-Log.csv, and I now want to force parallel access to that file. The Start-Job cmdlet is the right choice for this: it starts a PowerShell background job on the local computer. That’s exactly what I want, but not just one job, I want to start 20 jobs in parallel.
One of the simplest ways to call Start-Job is with just a -ScriptBlock and an -ArgumentList.
So we will try that with:
Start-Job -ArgumentList 10 -ScriptBlock {param ($i) Write-Output $i}
As a result we get output like this:
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
123 Job123 BackgroundJob Running True localhost param ($i) Write-Outpu...
To get the output of the job, we have to use the Receive-Job cmdlet with the Id of the job:
Receive-Job -Id 123
How do we run this 20 times in parallel? We could use a simple loop, but then we would get a list of job Ids, have to wait until they are all finished, and then call Receive-Job for each one separately. It is better to create an array of jobs and wait for them all using Wait-Job -Job and Receive-Job:
$jobs = @()
(1..20) | %{ $jobs += Start-Job -ArgumentList $_ -ScriptBlock { param ($i) Write-Output $i } }
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs
The result will be a simple output from 1 to 20.
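As a side note, on PowerShell 3.0 and later the waiting, receiving and cleanup can be combined into one call; this variant is not part of the original script, just a sketch:

# Wait for the results of all jobs and remove the finished jobs from the session afterwards.
Receive-Job -Job $jobs -Wait -AutoRemoveJob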
Now that we have solved that, let’s try out the logging function Write-Log. For this we create a $ScriptBlock = {...} variable, which also contains the Write-Log function:
$ScriptBlock = {
    param ($init)

    # ---------------------------------------------------------------
    # Log4PowerShell function
    # ---------------------------------------------------------------
    function Write-Log {
        [CmdletBinding()]
        param(
            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [string]$Message,

            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [ValidateSet('DEBUG','INFO','WARN','ERROR')]
            [string]$Severity = 'INFO'
        )

        [pscustomobject]@{
            Date     = (Get-Date -Format "dd.MM.yyyy")
            Time     = (Get-Date -Format "HH:mm:ss.fff")
            Severity = $Severity
            Message  = $Message
        } | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation
    }

    $thread = $init
    $start = Get-Date
    (1..30) | % { Start-Sleep -Seconds 1; $init += 1; Write-Log -Message "Thread: $($thread) Step: $($_)." -Severity INFO }
    $stop = Get-Date
    Write-Output "Counted from $($init - 30) until $init in $($stop - $start)."
}

$jobs = @()
(1..20) | %{ $jobs += Start-Job -ArgumentList $_ -ScriptBlock $ScriptBlock }
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs
When we execute this in a PowerShell console window, we get a mass of exceptions like this:
The process cannot access the file 'C:\Temp\PowerShell-Log.csv' because it is being used by another process.
+ CategoryInfo : OpenError: (:) [Export-Csv], IOException
+ FullyQualifiedErrorId : FileOpenFailure,Microsoft.PowerShell.Commands.ExportCsvCommand
+ PSComputerName : localhost
To solve this, I simply separate the creation of the row that should be written to the CSV file from the export itself, and wrap the export in a try/catch with a retry loop (up to 1,000 attempts, pausing 10 milliseconds between them). So I just substitute the Write-Log function in the code above with:
$ScriptBlock = {
    param ($init)

    # ---------------------------------------------------------------
    # Log4PowerShell function
    # ---------------------------------------------------------------
    function Write-Log {
        [CmdletBinding()]
        param(
            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [string]$Message,

            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [ValidateSet('DEBUG','INFO','WARN','ERROR')]
            [string]$Severity = 'INFO'
        )

        # Build the row first, then retry the export until the file is free.
        $data = [pscustomobject]@{
            Date     = (Get-Date -Format "dd.MM.yyyy")
            Time     = (Get-Date -Format "HH:mm:ss.fff")
            Severity = $Severity
            Message  = $Message
        }

        $done = $false
        $loops = 1
        While (-Not $done -and $loops -lt 1000) {
            try {
                $data | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation
                $done = $true
            }
            catch {
                Start-Sleep -Milliseconds 10
                $loops += 1
            }
        }
    }

    $thread = $init
    $start = Get-Date
    (1..30) | % { Start-Sleep -Seconds 1; $init += 1; Write-Log -Message "Thread: $($thread) Step: $($_)." -Severity INFO }
    $stop = Get-Date
    Write-Output "Counted from $($init - 30) until $init in $($stop - $start)."
}

$jobs = @()
(1..20) | %{ $jobs += Start-Job -ArgumentList $_ -ScriptBlock $ScriptBlock }
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs
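A quick sanity check (a sketch, assuming the same file path and -UseCulture delimiter as in the function above): each of the 20 jobs writes 30 entries, so the log should now contain exactly 600 rows.

# 20 jobs x 30 steps = 600 log entries expected
(Import-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture).Count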
Now check the CSV file. I would recommend adding an additional column ‘Number’ with the values 1..600. When you then sort ascending by the ‘Time’ column, you will see that the ‘Number’ column is no longer sequential, and you will also see many identical times.
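If you prefer to do this check directly in PowerShell rather than in the CSV file itself, a small sketch like this (the property name ‘Number’ is just my choice) numbers the rows on import and sorts them by the ‘Time’ column:

$rows = Import-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture
$i = 0
$numbered = $rows | ForEach-Object {
    $i += 1
    # Add a running number reflecting the original write order in the file.
    $_ | Add-Member -NotePropertyName Number -NotePropertyValue $i -PassThru
}
# After sorting by Time, the Number column is no longer sequential,
# and many rows share the same timestamp.
$numbered | Sort-Object Time | Format-Table Number, Time, Severity, Message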
I hope this helps some of you.