Hey, Scripting Guy! What Should I Include in Windows PowerShell Script Help?



Hey, Scripting Guy! I heard your interview the other day on the PowerScripting podcast, and I was really impressed. It is not every day I get to hear one of my heroes be interviewed on a podcast. I wanted to submit a question, but I got too carried away with the flow of things, and before I knew it, Hal was wrapping everything up. Oh, well. What I wanted to ask is in reference to something you said about adding Help to a script: What should I add to Help in a script? For example, you say I should add Help to a script, but I do not know what type of Help I should add. Does this make sense?

— OC

Hello OC,

Microsoft Scripting Guy Ed Wilson here. There are good things about living in Charlotte, North Carolina. One benefit is we almost never receive any snow. When the recent snowstorm hit the Eastern seaboard of the United States, all airports closed and cities declared states of emergency. We simply woke up with a little frost on the grass. It was pretty, and Knut would have been happy, even though there was hardly enough snow in the entire subdivision to make even a miniature snow person. When a snowstorm hits Charlotte, the entire city closes down because people are not familiar with driving on snow. A Windows PowerShell script can be like driving on a snow-laden road—it is deceptively familiar. Windows PowerShell code is fairly readable, and a well-written Windows PowerShell script should be easy to read.


Note: Portions of today’s Hey, Scripting Guy! post are excerpted from the Microsoft Press book, Windows PowerShell 2.0 Best Practices by Ed Wilson, which is now available.

Although well-written code should be easy to understand, easy to maintain, and easy to troubleshoot, it still benefits from well-written documentation. Good documentation not only tells the people who will use your script how to get the most out of it, but also explains how they can modify it or even use your functions in other scripts. Good documentation in a script is a sign of a professional at work. In the CreateFileNameFromDate.ps1 script, the header section uses comments to explain how the script works, what it does, and the limitations of the approach. The CreateFileNameFromDate.ps1 script is seen here.


# ------------------------------------------------------------------
# NAME: CreateFileNameFromDate.ps1
# AUTHOR: ed wilson, Microsoft
# DATE: 12/15/2008
# KEYWORDS: .NET Framework, io.path, get-date
# file, new-item, Standard Date and Time Format Strings
# regular expression, ref, pass by reference
# COMMENTS: This script creates an empty text file
# based upon the date-time stamp. Uses a format string
# to specify a sortable date. Uses the GetInvalidFileNameChars
# method to get all the characters that are not allowed
# in a file name. It assumes there is a folder named fso off the
# C: drive. If the folder does not exist, the script will fail.
# ------------------------------------------------------------------
Function GetFileName([ref]$fileName)
{
 $invalidChars = [io.path]::GetInvalidFileNameChars()
 $date = Get-Date -format s
 $fileName.value = ($date.ToString() -replace "[$invalidChars]","-") + ".txt"
} #end GetFileName

$fileName = $null
GetFileName([ref]$fileName)
New-Item -path C:\fso -name $fileName -itemtype file

As seen in the following image, adding comments does not change the way the script runs.

Image showing that adding comments doesn't change how the script runs
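Beginning with Windows PowerShell 2.0, the same header information can also be written as comment-based Help, which the Get-Help cmdlet reads directly. Here is a sketch of what a comment-based Help block for the script above might look like; the keyword names (.SYNOPSIS, .DESCRIPTION, .EXAMPLE, .NOTES) are the standard comment-based Help keywords, and the wording is illustrative:

```powershell
<#
.SYNOPSIS
    Creates an empty text file whose name is based on the current date and time.
.DESCRIPTION
    Uses the sortable date format string, replaces characters that are not
    valid in file names with hyphens, and creates the file in C:\fso.
.EXAMPLE
    .\CreateFileNameFromDate.ps1
    Creates an empty .txt file in C:\fso named for the current date and time.
.NOTES
    Assumes the folder C:\fso exists; the script fails if it does not.
#>
```

When a block like this appears at the top of a script, running Get-Help .\CreateFileNameFromDate.ps1 -Full displays it in the same format as Help for a built-in cmdlet.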

In general, you should always provide information about how to use your functions. Each parameter and underlying dependency must be explained. In addition to documenting the operation and the dependencies of the functions, you should also include information that would be beneficial to those who must maintain the code. You should always assume that the person who will maintain your code does not understand what the code actually does; therefore, ensure that the documentation explains everything. In the BackUpFiles.ps1 script, comments are added to the header and to each function to explain the logic and limitations of the functions. The BackUpFiles.ps1 script is seen here.


# ------------------------------------------------------------------
# NAME: BackUpFiles.ps1
# AUTHOR: ed wilson, Microsoft
# DATE: 12/12/2008
# KEYWORDS: Filesystem, get-childitem, where-object
# date manipulation, regular expressions
# COMMENTS: This script backs up a folder. It will
# back up files that have been modified within the past
# 24 hours. You can change the interval, the destination,
# and the source. It creates a backup folder that is named based upon
# the time the script runs. If the destination folder does not exist, it
# will be created. The destination folder is based upon the time the
# script is run and will look like this: C:\bu12.12.2008.1.22.51.PM.
# The interval is the age in days of the files to be copied.
# ------------------------------------------------------------------
Function New-BackUpFolder($destinationFolder)
{
 #Receives the path to the destination folder and creates the path to
 #a child folder based upon the date / time. It then calls the New-Backup
 #function while passing the source path, destination path, and interval
 #in days.
 $dte = get-date
 #The following regular expression pattern replaces white space, colons, and
 #forward slashes in the date with periods to create the
 #backup folder name.
 $dte = $dte.tostring() -replace "[:\s/]", "."
 $backUpPath = "$destinationFolder" + $dte
 $null = New-Item -path $backUpPath -itemType directory
 New-Backup $dataFolder $backUpPath $backUpInterval
} #end New-BackUpFolder

Function New-Backup($dataFolder,$backUpPath,$backUpInterval)
{
 #Does a recursive copy of all files in the data folder, and selects only
 #files that have been written to within the number of days specified
 #by the interval. Writes copied files to the destination and will create
 #the destination (including the parent path) if it does not exist. Will
 #overwrite files if the destination already exists. This is unlikely,
 #however, unless the script is run twice during the same minute.
 "backing up $dataFolder... check $backUpPath for your files"
 Get-ChildItem -path $dataFolder -recurse |
 Where-Object { $_.LastWriteTime -ge (get-date).addDays(-$backUpInterval) } |
 ForEach-Object { Copy-Item -path $_.FullName -destination $backUpPath -force }
} #end New-Backup

# *** entry point to script ***

$backUpInterval = 1
$dataFolder = "C:\fso"
$destinationFolder = "C:\BU"
New-BackUpFolder $destinationFolder

Make sure that, when you modify the script, you also modify your comments, so that both the comments and the script refer to the same information. It is easy to forget to update comments that describe the parameters of a function when you add additional parameters to that function. In a similar fashion, it is easy to overlook information in the header of the script that refers to dependencies or assumptions within the script. Make sure that you treat both the script and the comments with the same level of attention and importance.
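One way to keep parameter documentation close to the parameters themselves is to put comment-based Help inside the function, so that adding a parameter naturally prompts adding the matching .PARAMETER entry. Here is a sketch using the New-Backup function from the script above; the Help text is illustrative, and the keyword names are the standard comment-based Help keywords:

```powershell
Function New-Backup($dataFolder,$backUpPath,$backUpInterval)
{
 <#
 .SYNOPSIS
     Copies recently modified files from a data folder to a backup folder.
 .PARAMETER dataFolder
     The folder whose files are backed up.
 .PARAMETER backUpPath
     The destination folder; created if it does not already exist.
 .PARAMETER backUpInterval
     The age, in days, of the files to be copied.
 #>
 # ... function body as shown in BackUpFiles.ps1 ...
}
```

Because Get-Help New-Backup then reports each documented parameter, a stale or missing .PARAMETER entry is easy to spot during a review.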


OC, that is all there is to adding Help to a Windows PowerShell script. Windows PowerShell Help Week will continue tomorrow.

If you want to know exactly what we will be looking at tomorrow, follow us on Twitter or Facebook. If you have any questions, send e-mail to us at scripter@microsoft.com or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.


Ed Wilson and Craig Liebendorfer, Scripting Guys


