TechNet

Windows PowerShell: Make a Command into a Reusable Tool

You can repackage and reuse your efforts when it comes to Windows PowerShell commands and cmdlets.

Don Jones

No matter how inexperienced you are with Windows PowerShell when you first start working with it, there’s always plenty of room for growth. You can start out running simple commands, work up to more-complicated commands, and eventually repackage those commands into something that looks and feels almost like a native cmdlet. These are called advanced functions, and are informally known as “script cmdlets.”

Consider a situation where you may want to retrieve some critical inventory information from a computer. You need the Windows version, BIOS serial number, service pack version and processor architecture. You can get that information with these three commands:

Get-WmiObject -class Win32_OperatingSystem -computername SERVER-R2 | Select-Object -property __SERVER,BuildNumber,Caption,ServicePackMajorVersion

Get-WmiObject -class Win32_BIOS -computername SERVER-R2 | Select-Object -property SerialNumber

Get-WmiObject -class Win32_Processor -computername SERVER-R2 | Select-Object -property AddressWidth

The problem is that these commands generate three different result sets. You can’t directly pipe all that out to a single CSV file, for example, to store the inventory data, or a single HTML file to display the inventory as a Web page. It’s preferable to bundle the data into a single parameterized command that a less-experienced user could still use. You need the command to:

  • Accept one or more computer names as strings from the pipeline, as in:
Get-Content names.txt | Get-OSInfo | ConvertTo-HTML | Out-File info.html
  • Accept one or more computer names on a -computername parameter, as in:
Get-OSInfo -computername Server-R2,ServerDC4 | Format-Table
  • Accept from the pipeline one or more objects that each has a computername property, as in:
Get-ADComputer -filter * -searchbase "ou=West,dc=company,dc=com" | Select-Object @{label='computername';expression={$_.Name}} | Get-OSInfo | Export-CSV inventory.csv

This way, you won’t need to worry about where computer names are coming from. You also never need to worry about what kind of output you’ll create. The shell will handle that.
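The single-result-set problem can be solved by merging the data from all three queries into one custom object, which is what the eventual worker function will do. Here's a minimal sketch of that idea for one hard-coded computer (the property names are illustrative, not necessarily those in the final code):

```powershell
# Sketch only: merge the three WMI result sets into one custom object,
# so the output can be piped as a single unit to ConvertTo-HTML,
# Export-CSV and so on. Property names here are illustrative.
$os   = Get-WmiObject -class Win32_OperatingSystem -computername SERVER-R2
$bios = Get-WmiObject -class Win32_BIOS -computername SERVER-R2
$proc = Get-WmiObject -class Win32_Processor -computername SERVER-R2 |
        Select-Object -First 1
New-Object -TypeName PSObject -Property @{
    ComputerName = $os.__SERVER
    BuildNumber  = $os.BuildNumber
    OSVersion    = $os.Caption
    SPVersion    = $os.ServicePackMajorVersion
    BIOSSerial   = $bios.SerialNumber
    OSArch       = $proc.AddressWidth
}
```

Because a single object comes out, one computer's complete inventory becomes one row in a CSV file or one row in an HTML table.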

In addition, you’ll want to have a -logfile parameter, which will accept the path and filename of a log file. Because the command will still be using Windows Management Instrumentation (WMI) to connect and query information, there’s a chance that one or more computers will be unreachable. Their names need to be written to that log file, which you can then use to troubleshoot the problem or even to retry those computers.
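Because the log ends up holding one failed computer name per line, retrying those machines later is a one-liner (errors.txt is a hypothetical log file name here):

```powershell
# Retry just the computers that failed on the previous run; assumes
# the log file (errors.txt here) holds one computer name per line.
Get-Content errors.txt | Get-OSInfo | Export-CSV retry.csv
```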

Dealing with Errors

You can accomplish that last bit using the Try…Catch construct within PowerShell. Simply wrap the first WMI query in a Try…Catch block, and specify the -ErrorAction Stop parameter so that any error the command generates becomes a terminating error that Catch can trap. Set a variable that keeps track of whether an error occurred, so your script will know whether to try the next two WMI queries:

$continue = $True

Try {

  Get-WmiObject -class Win32_OperatingSystem -computername $computername -EA Stop

} Catch {

  $continue = $False

  $computername | Out-File -FilePath $logfile -Append

}
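The $continue flag then gates the rest of the work, so an unreachable computer is logged once and skipped rather than generating three errors. Roughly (a sketch, not the final code):

```powershell
# Only run the remaining WMI queries if the first one succeeded.
if ($continue) {
    # ...the other two Get-WmiObject calls go here, followed by code
    # that combines everything into a single output object...
}
```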

Dealing with Input

The tricky part is dealing with the input. You’ll receive input in one of two ways—via the pipeline or via a parameter. You can actually direct input coming in via the pipeline to the parameter, using cmdlet-style parameter binding. Here’s the basic structure for that:

Function Get-OSInfo {

  [CmdletBinding()]

  param(

    [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]

    [string[]]$computername,

    [string]$logfile

  )

  BEGIN {}

  PROCESS {}

  END {}

}

The PROCESS block within the function is guaranteed to execute at least once, even if no pipeline input is provided. If there is pipeline input, the PROCESS block will execute once for each piped-in item, placing that item into the $computername variable.

Therein lies the problem: If input is only specified via the parameter, as in the second bulleted example, then PROCESS will execute only once, and $computername will contain every computer name that was given to the parameter. You’d have to enumerate, or “unwind,” those yourself. If the items are piped in, you’ll only have to work with one at a time, but you’ll still receive them in the $computername variable.
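You can watch this difference with a throwaway function (Test-Input is a made-up name, not part of the article's code) by counting how many times PROCESS fires:

```powershell
# Demo: the same parameter behaves differently for parameter input
# versus pipeline input.
Function Test-Input {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline=$True)]
        [string[]]$computername
    )
    PROCESS {
        Write-Host "PROCESS ran with: $($computername -join ',')"
    }
}

Test-Input -computername ONE,TWO   # PROCESS runs once, with both names
'ONE','TWO' | Test-Input           # PROCESS runs twice, one name each
```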

The trick is to create a second function that does the actual work. Use the advanced function to accept either kind of input, and break things down into one computer name at a time. Then call the second “worker” function with a single computer name at a time. Here’s the basic structure, which would go inside the PROCESS block of the main function:

PROCESS {

  if ($PSBoundParameters.ContainsKey('computername')) {

    foreach($computer in $computername) {

      OSInfoWorker -computername $computer -logfile $logfile

    }

  } else {

    OSInfoWorker -computername $computername -logfile $logfile

  }

}

Finishing Up

With the major technical hurdles solved, bring it all together and add a few other details, such as cleaning up old log files before running the function each time (which is a perfect candidate for the BEGIN block). The resulting script, consisting of two functions, is a bit lengthy. However, very little of it is actual “programming.” Most of it is just Windows PowerShell commands, like you’d run on the command line. They’re just surrounded by a lot of declarative structure to make it all behave like a cmdlet.
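For example, the log cleanup might look something like this inside the main function (a sketch; the final code may take a slightly different approach):

```powershell
BEGIN {
    # If a log path was given, delete any log file left over from a
    # previous run; -ErrorAction SilentlyContinue keeps the very first
    # run (when no file exists yet) quiet.
    if ($logfile) {
        Remove-Item -Path $logfile -ErrorAction SilentlyContinue
    }
}
```

BEGIN is the right home for this because it runs exactly once, before any pipeline input is processed.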

The entire thing is posted on my Web site at ow.ly/39YcX, along with a video walkthrough of the final code. This is a template that you can absolutely reuse. The primary function is really just dealing with the input, so you can use it almost as is. Change the parameters to match your specific needs, and you’re good to go. The actual work is being done in a separate function.

Next month, I’ll show you how to package all of this as an easy-to-distribute script module. I’ll also show you how to set up a shared location for script modules that all of your administrator colleagues can utilize.

Don Jones
Don Jones is a founder of Concentrated Technology, and answers questions about Windows PowerShell and other technologies at ConcentratedTech.com. He’s also an author for Nexus.Realtimepublishers.com, which makes many of his books available as free electronic editions through his web site.


Get More

This month’s article has a companion video walkthrough, as well as downloadable sample code. Get these extras here.
