PowerShell Jobs Week: Job Processes

Doctor Scripto

Summary: Richard Siddaway looks at how Windows PowerShell jobs actually run.

Honorary Scripting Guy, Richard Siddaway, here today filling in for my good friend, The Scripting Guy. This is the sixth in a series of posts that, hopefully, will shine the spotlight on Windows PowerShell jobs, remind people of their capabilities, and encourage their greater adoption. The full series comprises:

  1. Introduction to PowerShell Jobs
  2. WMI and CIM Jobs
  3. Remote Jobs
  4. Scheduled Jobs
  5. Jobs and Workflows
  6. Job Processes (this post)
  7. Jobs in the Enterprise

The focus of this series has been on Windows PowerShell jobs, including learning about the core job cmdlets, using the –AsJob parameter, running jobs on remote machines, and figuring out how scheduled jobs work. Today, we are going to go under the hood a bit more and look at how jobs actually run. I’ve stated in passing several times in this series that jobs don’t run in the context of your current Windows PowerShell session, whether that’s the Windows PowerShell console, the ISE, or another Windows PowerShell host. Try this:

$proc = "p*"

Get-Process -Name $proc

The commands will run, and you will see a list of processes with names that start with the letter "p." In my environment, I got a single instance of the Windows PowerShell process. Now try this:

Start-Job -ScriptBlock {Get-Process -Name $proc}

Your job will start and apparently complete. However, when you look at the data from the job, you’ll get something like this:

£> Receive-Job -Id 4
Cannot validate argument on parameter 'Name'. The argument is null or empty. Provide an argument that is not null or empty, and then try the command again.
    + CategoryInfo          : InvalidData: (:) [Get-Process], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationError,Microsoft.PowerShell.Commands.GetProcessCommand
    + PSComputerName        : localhost

The error message is telling you, in a very roundabout way, that the job didn’t know anything about the $proc variable. If you want your job to be able to use the contents of the $proc variable, you need to pass it into the job:

Start-Job -ScriptBlock {param ([string]$proc) Get-Process -Name $proc} -ArgumentList $proc

A param block has been added to the script block, defining the proc parameter. The –ArgumentList parameter of Start-Job is used to pass the value in. If you need to pass multiple values into the script block, list them in the order the parameters are defined. You don’t have to use the same name for the variable inside and outside the script block; I’ve done so only to reduce confusion.

The job will run, and when you use Receive-Job, you will receive the expected results. Or are they what you expected? When I ran Get-Process at the beginning of this explanation, I deliberately chose "p*" so that I got the Windows PowerShell process (I’m running all of this in a Windows PowerShell console). I got one Windows PowerShell process. When I ran the second job (the one that worked), I got two Windows PowerShell processes. But if I run Get-Process now, I’ll only see my original Windows PowerShell process. The extra process appeared because a standard background job spawns a new, separate Windows PowerShell process in which it runs the script block. That’s why the variable we defined couldn’t be found when we ran the first job: the work is being done in a separate instance of Windows PowerShell that is brand new and clean.
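On Windows PowerShell 3.0 and later, an alternative to the param block is the $using: scope modifier, which copies a variable from the calling session into the job's process. A minimal sketch:

```powershell
$proc = "p*"

# $using:proc is expanded from the calling session's $proc,
# so no param block or -ArgumentList is needed
$job = Start-Job -ScriptBlock { Get-Process -Name $using:proc }
Wait-Job $job | Receive-Job
```

Start-Job also offers an –InitializationScript parameter whose script block runs in the new process before your own, which is a convenient place to import any modules the job needs.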

Note: Your profile won’t be run in the Windows PowerShell instance that the job starts, so you won’t have any aliases or functions that you define in your profile. Nor will any snap-ins or modules that your profile loads be available; you’ll have to load them within your job’s script block.

You’ve now discovered why you get the Windows PowerShell prompt back as soon as your job starts: the work is being done in a separate process, so the Windows PowerShell job engine can give you the prompt back. Does that mean that we know how all jobs run? Unfortunately, not! Each job type has its own behavior pattern as far as the process, or processes, it requires. The easiest way to dig into this is to open three Windows PowerShell consoles. In the first one, run Get-Process and keep the results visible. In the second, run:

Get-WmiObject -Class Cim_datafile –AsJob

That will create a very long-running WMI job. In your third Windows PowerShell console, run Get-Process again. Compare the results, and you will discover that a process called unsecapp is now running:

Get-Process unsecapp | fl *

It shows the following details (I’m picking out only the interesting bits):

Path                       : C:\windows\system32\wbem\unsecapp.exe
Company                    : Microsoft Corporation
Description                : Sink to receive asynchronous callbacks for WMI client application

The wbem folder is where Windows keeps all of its WMI-related tools. Stop and remove the job with:

Get-Job | Stop-Job
Get-Job | Remove-Job

If you wait a little while and then run…

Get-Process u*

…you’ll see that unsecapp.exe has closed. If you repeat the exercise, but use Get-CimInstance -ClassName CIM_DataFile –AsJob, you can discover where CIM jobs run. Oh. You can’t, because the CIM cmdlets don’t have an –AsJob parameter. If you want to run the CIM cmdlets as a job, you have to use Start-Job. If you try that experiment, you’ll see that another instance of Windows PowerShell is started for the background job, as you would expect. Commands that are created by using CDXML get an –AsJob parameter added automatically. Create a simple CDXML module by using the CIM_DataFile class:

<?xml version='1.0' encoding='utf-8'?>
<PowerShellMetadata xmlns='http://schemas.microsoft.com/cmdlets-over-objects/2009/11'>
  <Class ClassName='ROOT\cimv2\CIM_DataFile'>
    <Version>1.0</Version>
    <DefaultNoun>CimDataFile</DefaultNoun>
    <InstanceCmdlets>
      <GetCmdletParameters DefaultCmdletParameterSet='DefaultSet' />
    </InstanceCmdlets>
  </Class>
</PowerShellMetadata>

Now save it as datafile.cdxml. Import the module and run the command:

Import-Module .\datafile.cdxml
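Before running the generated cmdlet, you can confirm that CDXML really did add an –AsJob parameter. A quick check:

```powershell
# Show the syntax of the cmdlet generated from datafile.cdxml;
# the parameter list should include -AsJob (and -ThrottleLimit),
# which the CDXML machinery adds automatically
Get-Command Get-CimDataFile -Syntax
```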

Get-CimDataFile –AsJob

You will see a job of type CimJob running. Testing the processes as you did previously doesn’t highlight any obvious new process, so we have to assume that CIM jobs run in an existing process. You will have a similar experience if you try to track down where workflow jobs run: there is no obvious new process started. You’ve seen two other job types: remote jobs and scheduled jobs. Let’s try our experiment on those. You can simulate a remote job like this:

Invoke-Command -ComputerName $env:COMPUTERNAME -ScriptBlock {while($true){sleep -Seconds 10}} –AsJob

Looking at the running processes, you will discover the wsmprovhost process:

Get-Process wsmprovhost | fl *

It shows these details:

Path                       : C:\windows\system32\wsmprovhost.exe
Company                    : Microsoft Corporation
Description                : Host process for WinRM plug-ins

That seems sensible, because we’re dealing with remoting. Scheduled jobs are interesting. If you look at the definition of the scheduled task that is associated with the scheduled job, you’ll see:

powershell.exe -NoLogo -NonInteractive -WindowStyle Hidden -Command "Import-Module PSScheduledJob; $jobDef = [Microsoft.PowerShell.ScheduledJob.ScheduledJobDefinition]::LoadFromStore('MyLongJob', 'C:\Users\Richard\AppData\Local\Microsoft\Windows\PowerShell\ScheduledJobs'); $jobDef.Run()"

That means you’re starting Windows PowerShell and then starting a job from that Windows PowerShell process, which starts another Windows PowerShell process, for a total of two new Windows PowerShell processes.

You now know that a Windows PowerShell job creates at least one new process in which to run, and each job creates its own. If you start 30 long-running standard background jobs, you’ll get 30 new instances of Windows PowerShell. This could have a serious impact on your admin machine, depending on its specification, so you need to think about how many jobs you start at once. Cmdlets such as Invoke-Command and Get-WmiObject, which have an –AsJob parameter, also have a –ThrottleLimit parameter. This accepts an integer value and controls the number of jobs that can run simultaneously from that command. Now run:

Invoke-Command –ComputerName (Get-Content computers.txt) –ScriptBlock {Get-Process} –AsJob

The computers.txt file contains the names of 200 computers, and you don’t want the command starting 200 jobs simultaneously. The –ThrottleLimit parameter defaults to 32, so only 32 jobs will be started, and one has to finish before the next one starts. But that’s not 32 running jobs in total on the machine; it’s 32 running jobs from that one command. If you have jobs started in other Windows PowerShell sessions, or you started long-running jobs before you used Invoke-Command, you could have many more jobs running, all taking resources. You can use the –ThrottleLimit parameter to control the number of jobs that start at one time, but don’t make the number too high, because your machine will run out of resources and crash.

That’s it for today. Tomorrow the series concludes with a look at how and when you can use jobs in your enterprise to the best advantage.

Bye for now.

~Richard

Thanks, Richard. I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy

