
How to do logging in PowerShell without 'file is locked' exceptions

While I was working on a simple script that calls a REST API to pause and resume the monitoring of devices, we ran into trouble because the scripts were executed in the startup and shutdown events of various workstations. So I came up with the idea of logging the script execution to a file. This worked fine, but my customer reminded me that many of the workstations will start up in parallel, so we might get the exception 'The process cannot access the file 'xxx' because it is being used by another process.'

I had two problems:
1. How to test parallel file access.
2. How to solve the 'file is being used by another process' exception, if it turned out to be a problem.

First, I developed my simple Log4PowerShell function, which writes log entries to a CSV file:

function Write-Log {
	[CmdletBinding()]
	param(
		[Parameter()]
		[ValidateNotNullOrEmpty()]
		[string]$Message,

		[Parameter()]
		[ValidateNotNullOrEmpty()]
		[ValidateSet('DEBUG','INFO','WARN','ERROR')]
		[string]$Severity = 'INFO'
	)
	[pscustomobject]@{
		Date = (Get-Date -Format "dd.MM.yyyy")
		Time = (Get-Date -Format "HH:mm:ss.fff")
		Severity = $Severity
		Message = $Message
	} | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation
}

Next, I was thinking about how to test this. The Write-Log function writes entries to the file C:\Temp\PowerShell-Log.csv, and I now want to force parallel access to that file. For that, the Start-Job cmdlet is the right choice. The Start-Job cmdlet starts a PowerShell background job on the local computer. That's exactly what I want, but not just one job: I want to start 20 jobs in parallel.

One of the simplest calls of Start-Job is: Start-Job [-ScriptBlock] <scriptblock> [-ArgumentList <Object[]>].
So we will try that with:

Start-Job -ArgumentList 10 -ScriptBlock {param ($i) Write-Output $i}

As a result we get output like this:

Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
123 Job123 BackgroundJob Running True localhost param ($i) Write-Outpu...

To get the output result of the job, we have to use the Receive-Job cmdlet with the ID of the job:

Receive-Job -Id 123

How do we run this 20 times in parallel? We could use a simple loop, but then we would get a list of job IDs, have to wait until they are all finished, and then call Receive-Job for each of them separately. It is better to collect the jobs in an array and wait for them all using Wait-Job -Job and Receive-Job -Job:

$jobs = @()
(1..20) | %{$jobs += Start-Job -ArgumentList $_ -ScriptBlock {param ($i) Write-Output $i}}
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs

The result will be a simple output from 1 to 20.
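Once their output has been received, finished jobs still live in the session until they are removed. Here is a small sketch of the complete pattern including cleanup with Remove-Job (the job count and script block are just for illustration):

```powershell
# Start a few background jobs, collect their output and clean up afterwards
$jobs = @()
(1..5) | ForEach-Object { $jobs += Start-Job -ArgumentList $_ -ScriptBlock { param ($i) Write-Output $i } }
Wait-Job -Job $jobs | Out-Null
$result = Receive-Job -Job $jobs

# Remove the finished jobs so they no longer show up in Get-Job
Remove-Job -Job $jobs
$result
```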

Now that this is solved, let's try out the logging function Write-Log. For that, we create a $ScriptBlock = {...} variable which also contains the Write-Log function:

$ScriptBlock = {
    param ($init)
    # ---------------------------------------------------------------
    # Log4PowerShell function
    # ---------------------------------------------------------------
    function Write-Log {
        [CmdletBinding()]
        param(
            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [string]$Message,

            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [ValidateSet('DEBUG','INFO','WARN','ERROR')]
            [string]$Severity = 'INFO'
        )
        [pscustomobject]@{
                Date = (Get-Date -Format "dd.MM.yyyy")
                Time = (Get-Date -Format "HH:mm:ss.fff")
                Severity = $Severity
                Message = $Message
        } | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation
    }
    $thread = $init
    $start = Get-Date
    (1..30) | % { Start-Sleep -Seconds 1; $init +=1 ; Write-Log -Message "Thread: $($thread) Step: $($_)." -Severity INFO}
    $stop = Get-Date
    Write-Output "Counted from $($init - 30) until $init in $($stop - $start)."
}
$jobs = @()
(1..20) | %{$jobs += Start-Job -ArgumentList $_ -ScriptBlock $ScriptBlock}
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs

When we execute this in a PowerShell console window, we get a mass of exceptions like this:

The process cannot access the file 'C:\Temp\PowerShell-Log.csv' because it is being used by another process.
    + CategoryInfo          : OpenError: (:) [Export-Csv], IOException
    + FullyQualifiedErrorId : FileOpenFailure,Microsoft.PowerShell.Commands.ExportCsvCommand
    + PSComputerName        : localhost

To solve this, I simply separated the creation of the row that should be written to the CSV file from the export itself, and wrapped the export in a try/catch inside a retry loop (up to 1000 attempts with a 10-millisecond pause, so roughly ten seconds at most). So I just substituted the Write-Log function in the above code with:

$ScriptBlock = {
    param ($init)
    # ---------------------------------------------------------------
    # Log4PowerShell function
    # ---------------------------------------------------------------
    function Write-Log {
        [CmdletBinding()]
        param(
            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [string]$Message,
    
            [Parameter()]
            [ValidateNotNullOrEmpty()]
            [ValidateSet('DEBUG','INFO','WARN','ERROR')]
            [string]$Severity = 'INFO'
        )
        $data = [pscustomobject]@{
                Date = (Get-Date -Format "dd.MM.yyyy")
                Time = (Get-Date -Format "HH:mm:ss.fff")
                Severity = $Severity
                Message = $Message
        }
        $done = $false    
        $loops = 1
        While(-Not $done -and $loops -lt 1000) {
            try {
                # -ErrorAction Stop makes the file-in-use error a terminating
                # error, so the catch block below is actually reached
                $data | Export-Csv -Path "C:\Temp\PowerShell-Log.csv" -UseCulture -Append -NoTypeInformation -ErrorAction Stop
                $done = $true
            } catch {
                Start-Sleep -Milliseconds 10
                $loops += 1
            }
        }
    }
    $thread = $init
    $start = Get-Date
    (1..30) | % { Start-Sleep -Seconds 1; $init +=1 ; Write-Log -Message "Thread: $($thread) Step: $($_)." -Severity INFO}
    $stop = Get-Date
    Write-Output "Counted from $($init - 30) until $init in $($stop - $start)."
}
$jobs = @()
(1..20) | %{$jobs += Start-Job -ArgumentList $_ -ScriptBlock $ScriptBlock}
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs

Now check the CSV file. I would recommend adding an additional column 'Number' with the values 1..600. When you then sort ascending on the 'Time' column, you will see that the 'Number' column is no longer contiguous, and you will also see many identical times.
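To inspect the interleaving, the log file can be read back with Import-Csv and sorted by the 'Time' column. Here is a self-contained sketch; it writes a few sample rows to a temporary file instead of C:\Temp\PowerShell-Log.csv, so the path is just for the demo:

```powershell
# Write a few sample log rows and read them back sorted by time
$logFile = Join-Path ([IO.Path]::GetTempPath()) 'PowerShell-Log-Demo.csv'
if (Test-Path $logFile) { Remove-Item $logFile }

foreach ($i in 1..3) {
    [pscustomobject]@{
        Date     = (Get-Date -Format 'dd.MM.yyyy')
        Time     = (Get-Date -Format 'HH:mm:ss.fff')
        Severity = 'INFO'
        Message  = "Step $i"
    } | Export-Csv -Path $logFile -UseCulture -Append -NoTypeInformation
}

# Read the log back and sort ascending on the Time column
$entries = Import-Csv -Path $logFile -UseCulture | Sort-Object Time
$entries | Format-Table -AutoSize
```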

I hope I could help one or the other of you with this.


List latest files from all directories in a given path using PowerShell

PowerShell is mostly used to execute scripts. But as everyone knows, PowerShell is also great for interactive use. For that, the best choice is the PowerShell Integrated Scripting Environment (PowerShell_ise).

But what brought me to this idea? I just wanted to know how much activity there is in certain directories by listing all the new files. To work with files, two of the PowerShell cmdlets are Get-Item and Get-ChildItem. By piping the result to other cmdlets (or aliases) the job can be done.

Here it is :

# List the newest file from each directory in path
Get-ChildItem -Path <Drive & Path where to start> -Recurse | group directory | foreach {$_.group | sort LastWriteTime -Descending | select -First 1}

When you look over the result, you will maybe recognize that some files are already years old. That means that for years there have been no new files and no activity in that folder. To address this, it is possible to filter the returned files by date. And if you don't expect hundreds of new files, you can remove the | select -First 1 at the end of the statement, or raise the value to maybe 10. Have a look at this:

# List the newest file from each directory in path
Get-ChildItem -Path <Drive & Path where to start> -Recurse | group directory | foreach {$_.group | sort LastWriteTime -Descending | where LastWriteTime -GE (Date).AddDays(-7) | select -First 5}

When the result is spread over many folders, it may be better to return a complete list of files with their full names. Have a look at this:

# List the newest file from each directory in path
Get-ChildItem -Path <Drive & Path where to start> -Recurse | group directory | foreach {$_.group | where LastWriteTime -GE (Date).AddDays(-7)  | sort LastWriteTime -Descending | select FullName, LastWriteTime -First 5}

The samples above work fine in PowerShell 5.1. Just now I found out that there are problems in earlier versions of PowerShell: the simplified Where-Object comparison syntax used above requires PowerShell 3.0 or later. On an older version, you have to change the samples to use a script block:

# List the newest file from each directory in path
Get-ChildItem -Path <Drive & Path where to start> -Recurse | group directory | foreach {$_.group | where {$_.LastWriteTime -GE (Date).AddDays(-7)}  | sort LastWriteTime -Descending | select FullName, LastWriteTime -First 5}
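Which syntax you need can be checked quickly with the $PSVersionTable automatic variable:

```powershell
# Show the version of the current PowerShell session
$PSVersionTable.PSVersion

# The simplified Where-Object syntax needs at least major version 3
$hasSimplifiedSyntax = $PSVersionTable.PSVersion.Major -ge 3
$hasSimplifiedSyntax
```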

How to handle reboot and resume or continue in PowerShell with PowerShell Workflow

While developing an Azure Resource Manager template with a virtual machine resource and a custom script extension, I ran into the problem that I had to install software that needs a reboot, and after the reboot the script should continue to configure this software and install other applications.

The recommended way with custom script extensions is to have one script that acts as a start script calling other scripts which do the actual work. This sounds like a workflow. So I thought I could use PowerShell Workflow to handle this, and I gave it a try. With PowerShell Workflow we have all the mechanisms to handle reboot and resume or continue in a PowerShell script, but it is still a little bit tricky, because we have to use a scheduled task which is triggered "At startup".

In the end I could not use it in the Azure Resource Manager virtual machine extension for other reasons (finally I solved it with Windows PowerShell Desired State Configuration (DSC)), but in general PowerShell Workflow is a fine technology for tasks which need state, because they can be suspended and resumed and can run in parallel. So I will share my experience with you.

First, let's think about the workflow and the single steps, called "activities". What must be known in advance is, as I wrote above, that PowerShell Workflow is designed to run activities in parallel and that each activity has its own workspace. That means that results returned to variables, for example, cannot be used by the next activity. Each PowerShell command that runs within a workflow is a single, standalone activity. To run activities in parallel, the parallel{} keyword must be used, and when activities inside the parallel block should run in a defined order, the sequence{} keyword must be used.

For our purpose, the following script snippet can be used:

Workflow New-ServerSetup
{
    parallel {
    "1. activity?"
    "2. activity?"
    "3. activity?"
    "4. activity?"
    "..."
    }
    Restart-Computer -Wait 
    "Last activity"
    "or more activities..."
}
# Run the workflow
New-ServerSetup

If this were executed, the "Last activity" and "or more activities..." steps would not be processed, because with the Restart-Computer -Wait activity the New-ServerSetup job is suspended and stays in that state after the reboot. This can be checked after the server has rebooted with:

Get-Job
Check status of current jobs

To manually resume the Job, just type

# In our case the job Id is 3. Check if you put the right Id
Resume-Job -Id 3
PowerShell Workflow Suspended, Running, Completed

But of course we don't have the option to start the job manually when it is executed from the Azure Resource Manager template virtual machine extension. So we have to define a scheduled task which resumes the job "at startup":

Workflow New-ServerSetup
{
    "First activity"
    "Second activity"
    "..."
    Restart-Computer -Wait 
    "Last activity"
    "or more activities..."
    Unregister-ScheduledJob -Name NewServerSetupResume
}
# -------------------------------------------------------------------------
# Use the New-JobTrigger cmdlet to create an "At startup" trigger
# to resume the suspended job.
# Replace <Password> with a password of a administrator
# on the local machine. 
# -------------------------------------------------------------------------
$adm = "Administrator"
# Note: don't call the variable $pwd - that is an automatic variable
# holding the current directory
$password = ConvertTo-SecureString -String "<Password>" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($adm, $password)
$AtStartup = New-JobTrigger -AtStartup
Register-ScheduledJob -Name NewServerSetupResume `
                      -Credential $cred `
                      -Trigger $AtStartup `
                      -ScriptBlock {Import-Module PSWorkflow; `
                          Get-Job -Name NewSrvSetup -State Suspended `
                          | Resume-Job}

# Run the workflow. It is suspended when the computer restarts.
# We give a defined name for the job, to be able to use the name
# in the scheduled task, otherwise the name would be "Job<n>"
New-ServerSetup -JobName NewSrvSetup

To find out more about Windows PowerShell Workflows use:

Get-Help about_workflows

How to put single line and multiline or block comments in Windows PowerShell

This seems like a topic not worth a post, but there are people searching Google for exactly this.

In PowerShell, single-line comments start with a hash symbol; everything to the right of the # is treated as a comment and ignored as scripting code.

# This is a comment

Comments in PowerShell spanning multiple lines came with PowerShell 2.0. They start with "<#" and end with "#>". They can be placed anywhere except inside strings, and anything between them is treated as a comment.

<# This is a block comment #> Write-Host "Cmdlet after a block comment"
<# Here starts a
   multi line comment
#>
Write-Host "Cmdlet after a multi-line comment"

But there is more to comments. They can be used by the Get-Help cmdlet when a help keyword is given; this is then called comment-based help. For functions, the help has to appear in one of three locations:

  • At the beginning of the function body.
  • At the end of the function body.
  • Before the Function keyword. There cannot be more than one blank line between the last line of the function help and the Function keyword.

When used for scripts, comment-based help can appear in one of the following locations in the script:

  • At the beginning of the script file. Script help can be preceded in the script only by comments and blank lines.
  • At the end of the script file. However, if the script is signed, place Comment-based help at the beginning of the script file. The end of the script is occupied by the signature block.
  • If the first item in the script body (after the help) is a function declaration, there must be at least two blank lines between the end of the script help and the function declaration. Otherwise, the help is interpreted as being help for the function, not help for the script.
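A minimal sketch of comment-based help at the beginning of a function body (the function Get-Greeting is just an illustrative example):

```powershell
function Get-Greeting {
    <#
    .SYNOPSIS
        Returns a greeting for the given name.
    .EXAMPLE
        Get-Greeting -Name 'World'
    #>
    param([string]$Name)
    "Hello, $Name!"
}

# Get-Help now picks up the comment-based help
Get-Help Get-Greeting
```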

To find out more, use:

Get-Help about_comment_based_help

How to find existing Windows features for Desired State Configuration (DSC)?

Using Windows PowerShell Desired State Configuration (DSC) is the preferred way to configure a Windows Server. The only problem when starting with DSC is to find out the names of the built-in configuration entries, especially the Windows features.

To get help with that you can use:

Get-DscResource

This shows a list of the built-in DSC resources:

Get-DscResource Result

To find out more about the Windows features, type:

Get-WindowsFeature

Which information and variables can I reference in Windows PowerShell?

When creating scripts in Windows PowerShell, there is quickly a need for information about the environment, like the folder from which the script was executed or the version of Windows PowerShell running in the current session.

The solution for that are the so-called "automatic variables", which are created and maintained by Windows PowerShell.
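A few commonly used automatic variables, for illustration:

```powershell
# The version of the current PowerShell session
$PSVersionTable.PSVersion

# The directory containing the running script (empty in an interactive session)
$PSScriptRoot

# Whether the last operation succeeded
Get-Date | Out-Null
$?
```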

To find out more about them and which variables exist, use:

Get-Help about_Automatic_Variables

Windows PowerShell ‘#Requires’ Statement

#Requires is a statement that prevents a script or module from running if the prerequisites defined by the requirement aren't met. The statement can appear on any line in a script but must be the first item on that line. A script can include more than one #Requires statement.
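For example, a script that starts with #Requires -Version 3.0 refuses to run on PowerShell 2.0. The effect can be demonstrated with a throw-away script that states an impossible requirement (the temporary file name is just for the demo):

```powershell
# Create a throw-away script whose #Requires statement can never be met
$script = Join-Path ([IO.Path]::GetTempPath()) 'requires-demo.ps1'
Set-Content -Path $script -Value "#Requires -Version 99.0`n'never reached'"

$failed = $false
try {
    & $script   # fails with a ScriptRequiresException before any line runs
} catch {
    $failed = $true
}
Remove-Item $script
"Script blocked by #Requires: $failed"
```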

To find out which parameters can be used, type

Get-Help about_requires

in the Windows PowerShell Console.