I code a fair amount in PowerShell at work and at home, and I find myself reusing the same snippets of code from script to script. Here are a few handy ones I like to keep around.

Get the directory of the executing script

Having the directory of the executing script is handy for loading adjacent resources or as a location for writing logs and other data:


$ExecutionDir = Split-Path $MyInvocation.MyCommand.Path
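
On PowerShell 3.0 and later, the automatic variable $PSScriptRoot gives you the same thing inside a script. Either way, here's a quick usage sketch; the log file name is purely illustrative:

# Hypothetical usage: write a log file next to the script
$LogPath = Join-Path $ExecutionDir "run.log"
"Script started at $(Get-Date)" | Out-File $LogPath -Append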

Dynamically add a new column to a CSV imported into a collection

Many times you need to add one or more columns to a data file you’re working on. Here’s a way to load your data file and add those other columns in one line:


Import-Csv "C:\somepath\some.csv" | select *, @{Name='my_new_column'; Expression={'some value'}}
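
If you want to persist the result, just keep piping; the output path here is illustrative:

Import-Csv "C:\somepath\some.csv" |
    select *, @{Name='my_new_column'; Expression={'some value'}} |
    Export-Csv "C:\somepath\some_with_new_column.csv" -NoTypeInformation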

Test whether an object is a single object or an array

One thing I find frustrating with PowerShell is that when you retrieve an object, say through a web request or simply by filtering a collection, you don't necessarily know the datatype of the result set: you could have an array of objects or a single object. The problem is that the available properties differ between an array and a single object. If you try to read "Count" on a single object, for example, PowerShell can throw an exception. To keep my scripts from crashing, I use code like the following to test the datatype of the object before continuing:


if($null -ne $myObj){
    if($myObj.GetType().IsArray){
        # $myObj is a collection, so deal with it as such
    }else{
        # $myObj is a single object
    }
}
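
An alternative, if you just want a consistent shape to work with, is to force array context with the @() operator; it wraps a single object in a one-element array and leaves an existing array alone:

$items = @($myObj)
$items.Count    # safe now, even if $myObj was a single object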

Add attributes to an XML document

Manipulating XML documents can be a real pain. Here's an easy way to add an attribute to an XML element:


$x = [xml]"<top_level_element/>"
$x.DocumentElement.SetAttribute("my_attribute", "some value")
$x.OuterXml
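
The same document API covers child elements too; here's a minimal sketch (the element name and text are made up):

$child = $x.CreateElement("child_element")
$child.InnerText = "some value"
$x.DocumentElement.AppendChild($child) | Out-Null   # AppendChild echoes the node, so discard it
$x.OuterXml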

Upload a document to a SharePoint document library

There are probably easier ways to do this with the SharePoint web APIs, but here's a technique I've used in the past to upload a document to a document library in SharePoint:


$web_client = New-Object System.Net.WebClient
$web_client.Credentials = [System.Net.CredentialCache]::DefaultCredentials
$my_file = gci "C:\somepath\somefile.txt"
$web_client.UploadFile( ("http://some_sharepoint_domain/sites/some_site/Shared Documents/{0}" -f $my_file.Name), "PUT", $my_file.FullName )
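
WebClient works in the other direction too. Assuming the same credentials, here's a sketch for pulling a file back down; the URL and local path are placeholders:

$web_client.DownloadFile("http://some_sharepoint_domain/sites/some_site/Shared Documents/somefile.txt", "C:\somepath\downloaded_somefile.txt")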

Use the HTML Agility Pack to parse an HTML document

Parsing HTML is the worst! In the Microsoft world, some genius came up with the HTML Agility Pack, which effectively lets you treat your HTML page like an XML document and then use XPath queries to easily find the data you're interested in:


Add-Type -Path "C:\nuget_packages\HtmlAgilityPack.1.8.4\lib\Net40\HtmlAgilityPack.dll"

# Download and parse the page
$hap_web = New-Object HtmlAgilityPack.HtmlWeb
$html_doc = $hap_web.Load("https://finance.yahoo.com/")

# Find each anchor whose href mentions the Dow Jones index, then pull its aria-label text
$xpath_qry = "//a[contains(@href, 'DJI')]"
$dow_data = $html_doc.DocumentNode.SelectNodes($xpath_qry)
$dow_stmt = ($dow_data.Attributes | ? {$_.Name -eq "aria-label"}).Value
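
From there, the usual node properties apply. For example, to dump each matching link's text and target (GetAttributeValue takes a default value for when the attribute is missing):

foreach($node in $dow_data){
    "{0} -> {1}" -f $node.InnerText, $node.GetAttributeValue("href", "")
}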

Convert one collection to another (and guarantee column order)

First, imagine you have a collection of complex objects, say, a JSON document with lots of nesting. You want to pull out just the relevant data elements and flatten the collection into a simple CSV. This snippet lets you iterate through that collection of complex objects and append simplified records to a new collection. A related problem: plain hashtables don't guarantee key order, so building records from them means Export-Csv may not emit the columns in the order you added them. If order matters, [pscustomobject], which preserves the order of the properties as written, is the way to go:


$col2 = @()
$col1 | %{ $col2 += [pscustomobject]@{"column1"=$_.val1; "column2"=$_.val2} }
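
Writing the new collection out then keeps the columns in the order they were declared; the output path is illustrative:

$col2 | Export-Csv "C:\somepath\flattened.csv" -NoTypeInformation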

Load multiple CSV files into one collection

It’s not uncommon to have multiple data files that you need to load into one collection to work on. Here’s a technique I use for that situation:


$col = @()
dir "C:\somepath" -Filter "somefile*.csv" | % { $col += Import-Csv $_.FullName }

# If you need to filter out certain files, try this:
dir "C:\somepath" -Filter "somefile*.csv" | ? { $_.Name -notmatch "excludeme" } | % { $col += Import-Csv $_.FullName }

Parse a weird date/time format

It’s inevitable that you’ll run into a non-standard date/time format that you’ll have to parse. Here’s a way to handle that:


$date = [datetime]::ParseExact( "10/Jun/18", "dd/MMM/yy",$null )
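
Passing $null for the format provider means the current culture is used. If the script has to parse the same way on any machine, pin the invariant culture instead:

$date = [datetime]::ParseExact( "10/Jun/18", "dd/MMM/yy", [System.Globalization.CultureInfo]::InvariantCulture )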