Building a PowerShell Troubleshooting Toolkit Revisited

Recently we posted an article about a PowerShell script you could use to build a USB key that contains free troubleshooting and diagnostic utilities. Those scripts relied on a list of links that were processed sequentially by the Invoke-WebRequest cmdlet. The potential downside is that it might take a bit of time to completely download everything. Wouldn’t it be much nicer if you could download files in batches of, say, five?

Unfortunately, there are no cmdlets or parameters that you can use to throttle a set of commands. You could try to create some sort of throttling mechanism with PowerShell’s job infrastructure. If you are proficient with .NET, then you could try your hand at runspaces and runspace pools. Frankly, those make my head hurt, and I don’t expect IT pros to have to be .NET developers to use PowerShell. Fortunately, there is an alternative that I think is a good compromise between usability and systems programming: workflow.

PowerShell 3.0 brought us the ability to create workflows in PowerShell script. The premise of a workflow is that you can orchestrate a series of activities that can run unattended on 10, 100, or 1,000 remote computers. I don’t have space here to fully explain workflows. There is a chapter on workflow in the PowerShell in Depth book from Manning. But one of the great features in my opinion is the ability to execute multiple commands simultaneously.

In a workflow, you can use a ForEach construct with the –Parallel parameter.
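Here’s a minimal sketch of the syntax; Test-Workflow is a placeholder name, and Do-Something stands in for whatever command you want to run:

```powershell
# foreach -parallel is only valid inside a workflow
workflow Test-Workflow {
    param([string[]]$Computers)

    foreach -parallel ($computer in $Computers) {
        # each iteration runs as a separate, simultaneous activity
        Do-Something -ComputerName $computer
    }
}
```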

The –Parallel parameter only works in a workflow. If there are 10 items, then the Do-Something command will run all 10 at the same time. Or you can throttle the number of simultaneous commands.
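To throttle, add –ThrottleLimit to the foreach statement. (As I recall, the –throttlelimit parameter on foreach -parallel requires PowerShell 4.0 or later; the names below are again placeholders.)

```powershell
workflow Test-Workflow {
    param([string[]]$Computers)

    # no more than 5 iterations run at the same time
    foreach -parallel -throttlelimit 5 ($computer in $Computers) {
        Do-Something -ComputerName $computer
    }
}
```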

Now only five commands will run at a time. As one command finishes, the next one in the queue is processed. You don’t have to write any complicated .NET code to take advantage of this feature. Use a workflow. And there’s no rule that says I have to use a workflow with a remote computer. I can create a workflow and run it locally, which is what I’ve done with my original script.

Workflows come with rules, so I had to make a few adjustments. Workflows aren’t designed to run from pipelined input, so the CSV data is embedded in the script. Otherwise, you should recognize most of the code. The key difference is that downloads now happen in parallel and are throttled.
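The revised script looks roughly like this condensed sketch; the workflow name is mine, and the embedded link list is abbreviated to a single illustrative entry rather than the real download URLs:

```powershell
workflow Get-MyTool {
    param([string]$Path = "D:\Toolkit")

    # workflows aren't designed for pipelined input,
    # so the CSV data is embedded as a here-string
    $data = @"
Name,Link
SampleTool,http://example.com/downloads/sampletool.zip
"@ | ConvertFrom-Csv

    # download in parallel, five files at a time
    foreach -parallel -throttlelimit 5 ($item in $data) {
        $file = Join-Path -Path $Path -ChildPath (Split-Path -Path $item.Link -Leaf)
        Write-Verbose -Message "Downloading $($item.Link)"
        Invoke-WebRequest -Uri $item.Link -OutFile $file
    }
}
```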

I do the same thing for the SysInternals links.
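In outline, that block is just another throttled loop; the $sysTools variable and the file names it would hold are illustrative:

```powershell
# another parallel, throttled loop inside the same workflow
foreach -parallel -throttlelimit 5 ($name in $sysTools) {
    # individual tools can be pulled from live.sysinternals.com
    Invoke-WebRequest -Uri "https://live.sysinternals.com/$name" `
        -OutFile (Join-Path -Path $Path -ChildPath $name)
}
```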

Once the workflow is loaded into my session, which you can do by dot-sourcing the script, I can run it the same way I would any command.
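Assuming the script is saved as C:\Scripts\Get-MyTool.ps1 (the path and the workflow name are mine), loading and running it looks like this:

```powershell
# dot-source the script to load the workflow into the current session
. C:\Scripts\Get-MyTool.ps1

# then invoke the workflow just like a regular command
Get-MyTool -Path D:\Toolkit
```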

Use the –Verbose parameter if you want to see more detail. Using the workflow and parallel processing, I can download everything in about five minutes! Personally, I don’t find much use for workflows, but I love this parallel feature and hope that someday we’ll see it as part of the main PowerShell language.
