An Advanced PowerShell HTML HotFix Report

Last time, I showed you a PowerShell script that leveraged my hotfix reporting function to create a basic HTML report complete with highlights and active links. The challenge I left for myself, and perhaps you worked on it yourself, was to display the page title for each hotfix's online link. That way I can tell at a glance what problem the hotfix solves. The difficulty, at least in my development efforts, is that the target HTML document does not lend itself to the typical approach with Invoke-WebRequest or Invoke-RestMethod. My solution was to resurrect a technique from my VBScript days: use Internet Explorer.

This version of my reporting script includes the same features as last time so I won’t repeat that discussion. If you recall, the function output includes a URL. All I need to do is navigate Internet Explorer to that page and get the document title. This will require a COM object for Internet Explorer.

$ie = New-Object -ComObject internetexplorer.application

As each hotfix is enumerated, I can search for the URL.

$frag.SelectNodes("//*[contains(text(),'http')]") | foreach {
      #get the current value
      $url = $_.'#text'

Now the fun part. I can tell the IE object to go to that page.

$ie.Navigate($url)
#need to give IE a chance to open the web page
do {
  Start-Sleep -Milliseconds 10
} while ($ie.busy)

In my experience, I have found it helpful to loop while Internet Explorer is busy; otherwise, the script moves on to subsequent steps before the page is ready. After the page has loaded, I can get the document title.
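Before I do, it's worth noting that the busy-wait above could spin indefinitely if a page never finishes loading. Here is a minimal sketch of a capped wait; the 30-second limit is my own addition and is not part of the script in this article.

#busy-wait with a safety timeout (the 30-second cap is my own addition)
$deadline = (Get-Date).AddSeconds(30)
do {
    Start-Sleep -Milliseconds 10
} while ($ie.Busy -and ((Get-Date) -lt $deadline))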

$ietitle = $ie.locationname
#sometimes it takes a bit longer to get to the final document
if ($ietitle -eq 'Microsoft Support') {
    Start-Sleep -Seconds 5
    #get the title from the document
    $ietitle = $ie.document.title
}

In my testing, I discovered that sometimes the location took a little time to resolve, so I added an extra pause and a second read of the title. But once I have the title, I can insert it into the HTML table.

$_.'#text' = "<a href=$url target=_blank>$ietitle</a>"

When I’m finished getting the titles, I can get rid of the IE object.

#clean up Internet Explorer
$ie.quit()
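If you want to be extra tidy, you can also release the underlying COM object after quitting Internet Explorer. This is optional and not something my script does; a minimal sketch:

#optional: release the COM object after quitting IE (not part of the original script)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($ie)
Remove-Variable -Name ie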

Now I have a very meaningful report.

an advanced HTML hotfix report (Image Credit: Jeff Hicks)

As you can imagine, making all of these connections is a time-consuming task. If you think about it for a moment, the same hotfix is very likely installed on multiple servers, so it doesn't make sense to keep checking the same URL. My solution is to create a hashtable. As each URL is resolved, the title is added to the hashtable.

$online.Add($url,$ietitle)

With the hashtable in place, I can check it first and, if there is an entry, use that value instead of trying to resolve it online.

#check hashtable to see if url has already been resolved
if ($online.ContainsKey($url)) {
    $ietitle = $online.item($url)
}

I also realized that the next time I run the script, assuming it's a monthly report, there is bound to be some overlap. In other words, the next report will include hotfixes that already appear in this one. So why not save the resolved hotfix links and re-use them? In my script, I export the online links to an XML file in the same directory as the script.

$online | Export-Clixml -Path $psscriptRoot\hfonline.xml

If this file exists, I can import it to initialize the hashtable.

#initialize a hashtable for hotfix links or import
#a saved file
if (Test-Path "$PSScriptRoot\hfonline.xml") {
    Write-Host "Importing online data from $psscriptRoot\hfonline.xml" -ForegroundColor magenta
    $online = import-clixml $psscriptRoot\hfonline.xml
}
else {
    $online = @{}
}

The end result is that the script runs much faster, at least after the very first time. The only online links that need to be resolved are the new ones.
And since I was revising the script to make it more advanced, I decided to add some parameter attributes and validation. I also realized that because my hotfix function supports alternate credentials, I should offer that option in this script and pass the credential on, which I do with splatting.

#define a hashtable of parameters to splat to Get-MyHotfix
$hfParams = @{
    After = (Get-Date).AddDays(-$days)
}
if ($Credential.username) {
    $hfParams.Add("Credential",$Credential)
}
#group data by computername
#get all hotfixes installed since $Days days ago
Write-Host "Getting hot fix data...please wait" -foregroundcolor magenta
$data = $computers | Get-MyHotFix @hfParams | Group-Object -Property Computername
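For context, this pipeline assumes the imported CSV exposes a property, presumably Computername, that Get-MyHotFix can bind from the pipeline. A hypothetical computers.csv (the server names here are made up) might look something like this:

Computername
chi-dc01
chi-fp02
chi-core01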

The result is a script that anyone can run to generate a meaningful and useful HTML-based report. Here is the complete script for your reference.

#requires -version 4.0
#create an HTML hotfix report
Param(
[Parameter(Position=0)]
[ValidateNotNullorEmpty()]
[string]$Path = "C:\work\HotFixReport.htm",
[ValidateNotNullorEmpty()]
[int]$Days = 45,
[System.Management.Automation.Credential()]$Credential = [System.Management.Automation.PSCredential]::Empty
)
#import computer information
$computers = import-csv C:\scripts\computers.csv
#dot source the hot fix function if it isn't part of a module
. C:\scripts\AdvancedFunction-HotfixReport.ps1
#define a hashtable of parameters to splat to Get-MyHotfix
$hfParams = @{
    After = (Get-Date).AddDays(-$days)
}
if ($Credential.username) {
    $hfParams.Add("Credential",$Credential)
}
#group data by computername
#get all hotfixes installed since $Days days ago
Write-Host "Getting hot fix data...please wait" -foregroundcolor magenta
$data = $computers | Get-MyHotFix @hfParams | Group-Object -Property Computername
Write-Host "Preparing report..." -foregroundcolor magenta
#initialize a hashtable for hotfix links or import
#a saved file
if (Test-Path "$PSScriptRoot\hfonline.xml") {
    Write-Host "Importing online data from $psscriptRoot\hfonline.xml" -ForegroundColor magenta
    $online = import-clixml $psscriptRoot\hfonline.xml
}
else {
    $online = @{}
}
#initialize an empty array for HTML fragments
$fragments=@()
#create an Internet Explorer object for getting the document title
$ie = New-Object -ComObject internetexplorer.application
#create an HTML fragment for each computername
foreach ($item in $data) {
    #define a heading with the computer name and total number of hotfixes
    $fragments+="<H2>$($item.name) [$($item.count)]</H2>"
    #convert data to an XML fragment
    [xml]$frag = $item.Group | Select-Object -Property * -ExcludeProperty Computername |
    ConvertTo-HTML -Fragment -as Table
    #insert security class for Security Updates
    $frag.SelectNodes("//td[text()='Security Update']") | foreach {
      $class = $frag.CreateAttribute("class")
      $class.value = 'security'
      $_.Attributes.append($class) | Out-Null
    }
    #turn urls into links. This assumes the entire text value is a url
    $frag.SelectNodes("//*[contains(text(),'http')]") | foreach {
      #get the current value
      $url = $_.'#text'
      #check hashtable to see if url has already been resolved
      if ($online.ContainsKey($url)) {
        $ietitle = $online.item($url)
      }
      else {
        #get the online title and add to the hashtable
        #make sure we aren't still waiting for IE to do something
        do {
            Start-Sleep -Milliseconds 10
        } while ($ie.Busy)
        Write-Host "Resolving title for $url"
        $ie.Navigate($url)
        #need to give IE a chance to open the web page
        do {
            Start-Sleep -Milliseconds 10
        } while ($ie.busy)
        $ietitle = $ie.locationname
        #sometimes it takes a bit longer to get to the final document
        if ($ietitle -eq 'Microsoft Support') {
            Start-Sleep -Seconds 5
            #get the title from the document
            $ietitle = $ie.document.title
        }
        $online.Add($url,$ietitle)
      }
      #replace the value with html link
      $_.'#text' = "<a href=$url target=_blank>$ietitle</a>"
    }
    #replace XML characters for <> in the body
    $fragments+= $frag.InnerXml.Replace("&lt;","<").Replace("&gt;",">")
}
#clean up Internet Explorer
$ie.quit()
#html report title
$ReportTitle = "Company Hotfix Report - $Days Days"
#define a header with an embedded style sheet
$head = @"
<Title>$ReportTitle</Title>
<style>
body { background-color:#D5DBDB;
       font-family:Tahoma;
       font-size:10pt; }
td, th { border:1px solid black;
         border-collapse:collapse; }
th { color:white;
     background-color:black; }
table, tr, td, th { padding: 2px; margin: 0px }
table { width:95%;margin-left:5px; margin-bottom:20px;}
.security { background-color:red;}
</style>
<br>
<H1>$ReportTitle</H1>
"@
$footer = "<H5><i>Report run $(Get-Date)</i></H5>"
#create the HTML report and save to a file
ConvertTo-HTML -Head $head -body $fragments -PostContent $footer | Out-File -FilePath $Path -Encoding ascii
#export online data for reuse
Write-Host "Exporting online data to $psscriptRoot\hfonline.xml" -ForegroundColor magenta
$online | Export-Clixml -Path $psscriptRoot\hfonline.xml
#write the file object to the pipeline
Get-item -Path $path
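To give you an idea of how you might invoke it, here is a hypothetical example. The script file name is my own; substitute whatever name you saved it under.

#example usage (the script name here is hypothetical)
.\New-HotfixReport.ps1 -Path C:\work\HotFixReport.htm -Days 30 -Credential (Get-Credential)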

The script is written to work in my environment because it dot sources the file where I keep my Get-MyHotfix function; you would need to change that path. It also imports a CSV file from my computer, so you would need to create your own CSV file, or perhaps modify the script to take computer names as parameter values. The last change you might want to make is to replace the Write-Host status messages with Write-Progress, as sketched below.
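Here is a rough sketch of how the main loop might report progress with Write-Progress instead of Write-Host. Treat it as a starting point rather than a drop-in replacement.

#sketch: report progress per computer instead of using Write-Host
$counter = 0
foreach ($item in $data) {
    $counter++
    $percent = ($counter / $data.Count) * 100
    Write-Progress -Activity "Building hotfix report" -Status $item.Name -PercentComplete $percent
    #...build the HTML fragment for $item as in the script above...
}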

If you had tried to write this final script from scratch, you would most likely be frustrated and cursing PowerShell. Instead, I encourage you to follow the development process I've outlined over the course of several articles. I think you will find it much more enjoyable and probably more educational.
I hope you’ll leave a comment and let me know what you think about all of this.