PowerShell ForEach-Object Parallel Feature

Paul Higinbotham

PowerShell 7.0 Preview 3 is now available with a new ForEach-Object Parallel Experimental feature. This feature is a great new tool for parallelizing work, but like any tool, it has its uses and drawbacks.

This article describes this new feature, how it works, when to use it and when not to.

What is ForEach-Object -Parallel?

ForEach-Object -Parallel is a new parameter set added to the existing PowerShell ForEach-Object cmdlet.

ForEach-Object -Parallel <scriptblock> [-InputObject <psobject>] [-ThrottleLimit <int>]
[-TimeoutSeconds <int>] [-AsJob] [-WhatIf] [-Confirm] [<CommonParameters>]

Normally, when you use the ForEach-Object cmdlet, each object piped to the cmdlet is processed sequentially.

1..5 | ForEach-Object { "Hello $_"; sleep 1 }
Hello 1
Hello 2
Hello 3
Hello 4
Hello 5

(Measure-Command {
    1..5 | ForEach-Object { "Hello $_"; sleep 1 } 
}).Seconds
5

But with the new ForEach-Object -Parallel parameter set, the script block runs in parallel for each piped input object.

1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 1; } -ThrottleLimit 5 
Hello 1 
Hello 3 
Hello 2 
Hello 4 
Hello 5 

(Measure-Command {
    1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 1; } -ThrottleLimit 5 
}).Seconds
1

Because each script block in the ForEach-Object example above takes 1 second to run, running all five in parallel takes only about one second, instead of the five seconds needed to run them sequentially.

Since the script blocks are run in parallel for each of the 1-5 piped input integers, the order of execution is not guaranteed. The -ThrottleLimit parameter limits the number of script blocks running in parallel at a given time, and its default value is 5.
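
To see the throttle in action, here is a quick sketch (timings are approximate): with the default limit of 5, ten one-second script blocks run in two waves of five, so the pipeline completes in roughly two seconds plus a little startup overhead.

(Measure-Command {
    1..10 | ForEach-Object -Parallel { "Hello $_"; sleep 1 }
}).Seconds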

This new feature also supports jobs, where you can choose to have a job object returned instead of having results written to the console.

$job = 1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 1; } -ThrottleLimit 5 -AsJob
$job | Wait-Job | Receive-Job
Hello 1 
Hello 2 
Hello 3 
Hello 5 
Hello 4
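
The parameter set shown earlier also includes a -TimeoutSeconds parameter. As a rough sketch of its behavior: any script blocks still running when the timeout expires are stopped, and whatever output was produced up to that point is returned.

# Stop any script blocks still running after 5 seconds
1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 60 } -TimeoutSeconds 5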

ForEach-Object -Parallel is not the same as the foreach language keyword

Don’t confuse the ForEach-Object cmdlet with PowerShell’s foreach keyword. The foreach keyword does not handle piped input, but instead iterates over an enumerable object. There is currently no parallel support for the foreach keyword.

foreach ($item in (1..5)) { "Hello $item" }
Hello 1
Hello 2
Hello 3
Hello 4
Hello 5

How does it work?

The new ForEach-Object -Parallel parameter set uses existing PowerShell APIs for running script blocks in parallel. These APIs have been around since PowerShell v2, but are cumbersome and difficult to use correctly. This new feature makes it much easier to run script blocks in parallel. But there is a fair amount of overhead involved, and in many cases there is no gain from running scripts in parallel; in fact, it can end up significantly slower than running ForEach-Object normally.

PowerShell currently supports parallelism in three main categories.

  1. PowerShell remoting. Here PowerShell sends script to external machines to run, using PowerShell’s remoting system.
  2. PowerShell jobs. This is the same as remoting except that script is run in separate processes on the local machine, rather than on external machines.
  3. PowerShell runspaces. Here script is run on the local machine within the same process but on separate threads.

This new feature uses the third method for running scripts in parallel. It has the least overhead of the three and does not use the PowerShell remoting system, so it is generally much faster than the other two methods.
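
To give a sense of what the cmdlet wraps for you, here is a minimal sketch of the underlying API pattern (simplified; the actual implementation also handles throttling, streaming and cleanup). Each script block gets its own PowerShell instance with its own runspace, and is invoked asynchronously on a separate thread.

# Start five script block invocations concurrently using the raw APIs
$tasks = 1..5 | ForEach-Object {
    $ps = [powershell]::Create()
    $null = $ps.AddScript('param($i) "Hello $i"; Start-Sleep 1').AddParameter('i', $_)
    [pscustomobject]@{ PowerShell = $ps; Async = $ps.BeginInvoke() }
}

# Wait for each invocation to finish, collect its output, and release the runspace
$results = foreach ($t in $tasks) {
    $t.PowerShell.EndInvoke($t.Async)
    $t.PowerShell.Dispose()
}
$results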

However, there is still quite a bit of overhead to run script blocks in parallel. Script blocks run in a context called a PowerShell runspace. The runspace context contains all of the defined variables, functions and loaded modules, so initializing a runspace for script to run in takes time and resources. When scripts are run in parallel, each must run within its own runspace, and each runspace must load whatever modules are needed and have any variables explicitly passed in from the calling script. The only variable that automatically appears in the parallel script block is the piped-in object. Other variables are passed in using the $using: keyword.

$computers = 'computerA','computerB','computerC','computerD' 
$logsToGet = 'LogA','LogB','LogC' 

# Read specified logs on each machine, using custom module
$logs = $computers | ForEach-Object -ThrottleLimit 10 -Parallel {
    Import-Module MyLogsModule 
    Get-Logs -ComputerName $_ -LogName $using:logsToGet 
}

Given the overhead required to run scripts in parallel, the -ThrottleLimit parameter becomes very useful for preventing the system from being overwhelmed. There are some cases where running many script blocks in parallel makes sense, but also many cases where it does not.

When should it be used?

There are two primary reasons to run script blocks in parallel with the ForEach-Object -Parallel feature (keeping in mind that this feature runs the script on separate system threads).

  1. Highly compute intensive script. If your script is crunching a lot of data over a significant period of time and the scripts can be run independently, then it is worthwhile to run them in parallel, but only if the machine you are running on has multiple cores that can host the script block threads. In this case the -ThrottleLimit parameter should be set approximately to the number of available cores (a sketch follows the log example below). If you are running on a VM with a single core, then it makes little sense to run high compute script blocks in parallel since the system must serialize them anyway to run on the single core.
  2. Script that must wait on something. If you have script that can run independently and performs long running work that requires waiting for something to complete, then it makes sense to run these tasks in parallel. If you have 5 scripts that take 5 minutes each to run but spend most of the time waiting, you can have them all run/wait at the same time, and complete all 5 tasks in 5 minutes instead of 25 minutes. Scripts that do a lot of file operations, or perform operations on external machines, can benefit from running in parallel. Since the waiting scripts do not keep all of the machine cores busy, it makes sense to set the -ThrottleLimit parameter to something greater than the number of cores. If one script execution waits many minutes to complete, you may want to allow tens or hundreds of scripts to run in parallel.
$logNames.count 
10

Measure-Command { 
    $logs = $logNames | ForEach-Object -Parallel {
        Get-WinEvent -LogName $_ -MaxEvents 5000 2>$null
    } -ThrottleLimit 10
}

TotalMilliseconds : 115994.3 (1 minute 56 seconds)
$logs.Count
50000


Measure-Command {
    $logs = $logNames | ForEach-Object {
        Get-WinEvent -LogName $_ -MaxEvents 5000 2>$null
    } 
}

TotalMilliseconds : 229768.2364 (3 minutes 50 seconds)
$logs.Count
50000

The script above collects 50,000 log entries on the local machine from 10 system log names. Running this in parallel is almost twice as fast as running it sequentially, because it involves some relatively slow disk access and can also take advantage of the machine’s multiple cores as it processes the log entries.
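
For the compute-bound case (item 1 above), a hypothetical sketch might tie the throttle limit to the number of logical cores. The workload below (hashing large in-memory buffers) is only a stand-in for real CPU-intensive work.

# Hypothetical CPU-bound workload, throttled to the number of logical cores
$coreCount = [Environment]::ProcessorCount

$hashes = 1..20 | ForEach-Object -Parallel {
    $sha = [System.Security.Cryptography.SHA256]::Create()
    $data = [byte[]]::new(50MB)    # stand-in for real data to crunch
    [System.BitConverter]::ToString($sha.ComputeHash($data))
    $sha.Dispose()
} -ThrottleLimit $coreCount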

When should it be avoided?

ForEach-Object -Parallel should not be thought of as something that will always speed up script execution. In fact, it can significantly slow down script execution if used heedlessly. For example, if your script block executes only trivial script, then running it in parallel adds a huge amount of overhead and it will run much slower.

(Measure-Command {
    1..1000 | ForEach-Object -Parallel { "Hello: $_" } 
}).TotalMilliseconds
10457.962


(Measure-Command {
    1..1000 | ForEach-Object { "Hello: $_" } 
}).TotalMilliseconds
18.4473

In the above example, a trivial script block is run 1000 times. The -ThrottleLimit is 5 by default, so only 5 runspaces/threads are created at a time, but a runspace and thread are still created 1000 times to do a simple string evaluation. Consequently, it takes over 10 seconds to complete. Removing the -Parallel parameter and running the ForEach-Object cmdlet normally completes in about 18 milliseconds.

So, it is important to use this feature wisely.

Implementation details

As previously mentioned, the new ForEach-Object -Parallel feature uses existing PowerShell functionality to run script blocks concurrently. The primary addition is the ability to limit the number of concurrent scripts running at a given time with the -ThrottleLimit parameter. Throttling is accomplished by a PSTaskPool class that holds running tasks (running scripts), and has a settable size limit which is set to the throttle limit value. An Add method allows tasks to be added to the pool, but if it is full then the method blocks until a new slot becomes available. Adding tasks to the task pool was initially performed on the ForEach-Object cmdlet piped input processing thread. But that turned out to be a performance bottleneck, and now a dedicated thread is used to add tasks to the pool.
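
The rough idea behind the blocking Add can be sketched in script. This is an illustration of the concept only, not the actual PSTaskPool code; for simplicity it waits on the oldest running task, whereas the real pool frees whichever slot finishes first.

# Keep at most $throttleLimit invocations in flight; starting another one
# first waits for an earlier task to complete and free a slot
$throttleLimit = 2
$running = [System.Collections.Generic.List[object]]::new()

foreach ($i in 1..6) {
    if ($running.Count -ge $throttleLimit) {
        $oldest = $running[0]
        $oldest.PowerShell.EndInvoke($oldest.Async)   # blocks until that task is done
        $oldest.PowerShell.Dispose()
        $running.RemoveAt(0)
    }
    $ps = [powershell]::Create()
    $null = $ps.AddScript('param($n) Start-Sleep 1; "task $n done"').AddParameter('n', $i)
    $running.Add([pscustomobject]@{ PowerShell = $ps; Async = $ps.BeginInvoke() })
}

# Drain the remaining tasks
foreach ($t in $running) { $t.PowerShell.EndInvoke($t.Async); $t.PowerShell.Dispose() }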

PowerShell itself imposes conditions on how scripts run concurrently, based on its design and history. Scripts have to run in runspace contexts and only one script thread can run at a time within a runspace. So in order to run multiple scripts simultaneously multiple runspaces must be created. The current implementation of ForEach-Object -Parallel creates a new runspace for each script block execution instance. It may be possible to optimize this by re-using runspaces from a pool, but one concern in doing this is leaking state from one script execution to another.

Runspace contexts are an isolation unit for running scripts, and generally do not allow sharing state between themselves. However, variables can be passed from the calling script to the parallel script block at the beginning of script execution through the $using: keyword. This was borrowed from the remoting layer, which uses the keyword for the same purpose but over a remote connection. There is a big difference when the $using: keyword is used in ForEach-Object -Parallel: with remoting, the variable being passed is a copy sent over the remoting connection, whereas with ForEach-Object -Parallel, the actual object reference is passed from one script to another, violating normal isolation restrictions. So it is possible to have a non-thread-safe variable used in two scripts running on different threads, which can lead to unpredictable behavior.

# This does not throw an error, but is not guaranteed to work since the dictionary object is not thread safe 
$threadUnSafeDictionary = [System.Collections.Generic.Dictionary[string,object]]::new()
Get-Process | ForEach-Object -Parallel {
    $dict = $using:threadUnSafeDictionary
    $dict.TryAdd($_.ProcessName, $_)
}
# This *is* guaranteed to work because the passed in concurrent dictionary object is thread safe
$threadSafeDictionary = [System.Collections.Concurrent.ConcurrentDictionary[string,object]]::new()
Get-Process | ForEach-Object -Parallel {
    $dict = $using:threadSafeDictionary
    $dict.TryAdd($_.ProcessName, $_)
}

$threadSafeDictionary["pwsh"]

NPM(K) PM(M) WS(M) CPU(s) Id SI ProcessName
------ ----- ----- ------ -- -- -----------
112 108.25 124.43 69.75 16272 1 pwsh

Conclusion

This feature can greatly improve your life for many workload scenarios. As long as you understand how it works and what its limitations are, you can experiment with parallelism and make real performance improvements to your scripts.

Paul Higinbotham, Senior Software Engineer, PowerShell Team

Comments


  • ALIEN Quake

    That’s absolutely fantastic! One thought:
    Can we use “System.Collections.ArrayList” for the last example? Assuming that we could lock the collection via something like [System.Threading.Monitor]::Enter($AL)/[System.Threading.Monitor]::Exit($AL) or $ALSync = [System.Collections.ArrayList]::Synchronized($al)
    BTW: List of Thread Safe Collections


    EDIT: Years later I discover that [System.Collections.Concurrent.ConcurrentDictionary[string,object]]::new() is case-sensitive. To make it more PS-aligned, use this:

    [System.Collections.Concurrent.ConcurrentDictionary[string, object]]::new([StringComparer]::InvariantCultureIgnoreCase)
    • Paul Higinbotham (Microsoft employee)

      Thanks.  Yes, you could use managed synchronization objects directly to protect non-threadsafe objects.  But it is much easier to rely on existing thread safe classes (as you showed above), and not have to worry about creating a deadlock situation!

  • David McDonough

    Great writeup, and a really cool feature. I think one really common scenario that should be in the ‘when not to use this’ section is when you’re using cmdlets that are already asynchronous when targeting remote machines, like Invoke-Command and Get-CimInstance.

    • Paul Higinbotham (Microsoft employee)

      Thanks, that is a good point.  Many existing cmdlets already provide ways to parallelize work.

  • Craig Landis (Microsoft employee)

    Any plans for Wait-Job -ShowProgress like PoshRSJob supports with Wait-RSJob -ShowProgress?
    With ForEach-Object -Parallel -AsJob making it simple to create jobs, it would be helpful to also have a simple way to show job progress.

  • Morgan, Mark

    I’ve been testing out this feature and I like it. Thank you!

    How would I call a function while I’m in a parallel loop? Also, I want my function to be recursive and use parallel processing as well. Would you be able to provide a sample? I haven’t been able to figure this out.

    • Paul Higinbotham (Microsoft employee)

      You either need to define the function within the -Parallel script block, or import a module in the script block that defines and exports the function (which I recommend as the cleanest way).

      You can run ForEach-Object -Parallel from within a ForEach-Object -Parallel script block or function. However, stopping the cmdlet from running using Ctrl+C, does not always clean up the underlying runspaces immediately, due to how PowerShell stops a running cmdlet.

      But I question how useful this would be. As I mention in this article, running script blocks in parallel involves a fair amount of overhead and it seems like this would not be beneficial except in very special cases.

  • J S

    Oops. Get-WinEvent doesn’t like the way I generate the list of 10 logs:

    get-winevent -Listlog * | select -first 10

    LogMode MaximumSizeInBytes RecordCount LogName
    ------- ------------------ ----------- -------
    Circular 20971520 33435 Application
    Circular 20971520 0 HardwareEvents
    Circular 1052672 0 Internet Explorer
    Circular 20971520 0 Key Management Service
    Circular 1052672 14 myps
    Circular 1052672 132 OAlerts
    Circular 20971520 36023 System
    Circular 15728640 13308 Windows PowerShell
    Circular 1052672 0 AMSI/Operational
    Circular 1052672 0 CCMS
    Get-WinEvent: To access the ‘CCMS’ log start PowerShell with elevated user rights. Error: The pipeline has been stopped.
    Get-WinEvent: To access the ‘ExploitPrevention’ log start PowerShell with elevated user rights. Error: The pipeline has been stopped.
    Get-WinEvent: To access the ‘ForwardedEvents’ log start PowerShell with elevated user rights. Error: The pipeline has been stopped.
    Get-WinEvent: To access the ‘MaliciousActivityProtection’ log start PowerShell with elevated user rights. Error: The pipeline has been stopped.
    Get-WinEvent: To access the ‘Microsoft-AppV-Client/Admin’ log start PowerShell with elevated user rights. Error: The pipeline has been stopped.

  • Michael Minor

    Mostly as a programming exercise I wrote a one-liner to scan a subnet for open Remote Desktop ports. It works fine without -parallel but returns $null if made parallel.

    $out = 1..254|% -Parallel {select @{N='Address';E={('10.22.14.'+("{0:d3}" -f $_))}}, @{N='OpenPort';E={Test-Connection -ComputerName ('10.22.14.'+$_) -TcpPort 3389}}, @{N='Port';E={'3389'}}}

    It runs pretty slow w/o -Parallel, but it finishes much quicker than I would have expected with -Parallel, which I would have expected to be 4x-5x (-ish) faster. I’m using the release version of PowerShell Core 7. What am I doing wrong?

  • Andrew Stanton

    Yet another powershell feature that makes me have to transfer all the preference variables into the script block scope. The script block code is even in the same script file. Can you folks please reduce the amount of boilerplate we have to do for transferring preference (and other originating scope) variables into other script blocks and modules?

  • ping zhong

    I installed PowerShell 7.0.1, and tried to run the script:

    1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 1; } -ThrottleLimit 5 

    But I got the following error:

    ForEach-Object : Parameter set cannot be resolved using the specified named parameters.
    + 1..5 | ForEach-Object -Parallel { "Hello $_"; sleep 1; } -ThrottleLim …
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : MetadataError: (:) [ForEach-Object], ParameterBindingException
    + FullyQualifiedErrorId : AmbiguousParameterSet,Microsoft.PowerShell.Commands.ForEachObjectCommand

    Can anyone help?
