Archive for the 'PowerShell v2' Category

Remote File Explorer in PowerGUI

Ravi published a PowerPack which lets you browse the file system of a remote computer and download files from it:

PowerGUI Remote File Explorer

This is based on PowerShell 2.0 remoting and Oisin’s pModem module. A great mashup of multiple technologies from different people coming together in a cool admin UI.

Download the Remote File Explorer PowerPack or read more about it in Ravi’s blog.

PowerGUI Survey

When should we stop supporting PowerShell 1.0 in newer releases of PowerGUI and switch to PowerShell 2.0 only? You can influence this by responding to the poll Darin put on the PowerGUI User Discussions forum:

The poll is in the right-hand column of the PowerGUI Discussion Forum

Please take a few seconds to answer the poll’s single question: when do you expect to have your administrative workstation upgraded to PowerShell 2.0? (You need to log into PowerGUI.org to vote.)


You can also post your comments here or in this discussion thread.

Thank you in advance!

Better Together: PowerShell 2.0 and PowerGUI

Now that PowerShell v2 is officially released for all platforms – XP, 2003, Vista, and 2008 (and, obviously, it ships with Windows 7 and Server 2008 R2) – HURRAY!!! Let me quickly summarize how you can start benefiting from the new functionality immediately once you upgrade to PowerShell 2.0 on the machine where you use PowerGUI:

So what are you waiting for? Download PowerShell 2.0 now! 🙂

The KB article also lists the new functionality in PowerShell 2.0, and this video from Kirk shows most of it in action.

Add filepath to ConvertTo-HTML

Converting PowerShell data into an HTML report and saving it to disk with no extra pipeline stages has long been my dream. Unfortunately, there is no native Export-HTML cmdlet (unlike, say, Export-CSV), and ConvertTo-HTML has no -Path parameter: it only displays the HTML code on the screen (very useful ;)) unless you pipe it to Out-File.

So, inspired by Kirk adding parameters to Import-CSV and using the PowerShell 2.0 code snippets, I created an Export-HTML function which behaves exactly like ConvertTo-HTML but adds an optional -Path parameter to specify the output file.

Download it, copy/paste the function into PowerShell (or dot-source it, or include it in your PowerShell profile), and you will be able to do something like:

Get-Process | Export-Html -Path C:\pr.htm

or

Get-Process |
    Export-Html C:\pr.htm -Title 'My Processes'

or

Get-Process |
    Export-Html C:\pr.htm -Property Name, Handles -Title 'My Processes'

You can download the code here, or copy/paste it from the text below.

You may also consider renaming the function from Export-HTML to ConvertTo-HTML (or using Set-Alias to make them the same thing): the -Path parameter is optional, and the old behavior of outputting HTML code to the console/pipeline is preserved, along with all the native parameters.
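For instance, here is a minimal sketch of the alias approach. Aliases take precedence over cmdlets during PowerShell command resolution, so the alias effectively shadows the native cmdlet (this assumes the Export-Html function below has already been loaded into the session):

```powershell
# Point the familiar cmdlet name at the proxy function.
# Aliases resolve before cmdlets, so ConvertTo-Html now runs Export-Html.
Set-Alias -Name ConvertTo-Html -Value Export-Html

# The familiar name now accepts the extra -Path parameter:
Get-Process | ConvertTo-Html -Path C:\pr.htm
```

To get the native behavior back, just remove the alias with Remove-Item Alias:ConvertTo-Html.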

Here’s how I created this proxy function:

  1. Downloaded and installed the PowerShell 2.0 code snippets.
  2. Used the function (proxy) snippet to generate the proxy for ConvertTo-HTML.
  3. Added Path to the parameters section:

        [Parameter(Position=0)]
        [Alias('PSPath', 'FilePath')]
        [ValidateNotNullOrEmpty()]
        [System.String]
        ${Path},

  4. Added a variable to store the modified PowerShell code to be executed:

        $scriptCmdPipeline = ''

  5. Added parameter handling which (if Path is present) appends the Out-File code to the pipeline:

        if ($Path) {
            $PSBoundParameters.Remove('Path') | Out-Null
            $scriptCmdPipeline += " | Out-File -FilePath $Path"
        }

  6. Got the original command line for ConvertTo-HTML:

        $scriptCmd = {& $wrappedCmd @PSBoundParameters}

  7. And appended the new pipeline stage to it:

        $scriptCmd = $ExecutionContext.InvokeCommand.NewScriptBlock(
                [string]$scriptCmd + $scriptCmdPipeline
            )

The rest was handled by the code snippet.

Here’s the resultant code:

#Requires -Version 2.0

<#
    Export-Html behaves exactly like native ConvertTo-HTML
    However it has one optional parameter -Path
    Which lets you specify the output file: e.g.
    Get-Process | Export-Html C:\temp\processes.html
#>

function Export-Html {
[CmdletBinding(DefaultParameterSetName='Page')]
param(
    [Parameter(ValueFromPipeline=$true)]
    [System.Management.Automation.PSObject]
    ${InputObject},

# Adding Path parameter 
# (made it Position 0, and incremented Position for others)
    [Parameter(Position=0)]
    [Alias('PSPath', 'FilePath')]
    [ValidateNotNullOrEmpty()]
    [System.String]
    ${Path},

    [Parameter(Position=1)]
    [ValidateNotNullOrEmpty()]
    [System.Object[]]
    ${Property},

    [Parameter(ParameterSetName='Page', Position=4)]
    [ValidateNotNullOrEmpty()]
    [System.String[]]
    ${Body},

    [Parameter(ParameterSetName='Page', Position=2)]
    [ValidateNotNullOrEmpty()]
    [System.String[]]
    ${Head},

    [Parameter(ParameterSetName='Page', Position=3)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    ${Title},

    [ValidateSet('Table','List')]
    [ValidateNotNullOrEmpty()]
    [System.String]
    ${As},

    [Parameter(ParameterSetName='Page')]
    [Alias('cu','uri')]
    [ValidateNotNullOrEmpty()]
    [System.Uri]
    ${CssUri},

    [Parameter(ParameterSetName='Fragment')]
    [ValidateNotNullOrEmpty()]
    [Switch]
    ${Fragment},

    [ValidateNotNullOrEmpty()]
    [System.String[]]
    ${PostContent},

    [ValidateNotNullOrEmpty()]
    [System.String[]]
    ${PreContent})

begin
{
    try {
        $outBuffer = $null
        if ($PSBoundParameters.TryGetValue('OutBuffer', [ref]$outBuffer))
        {
            $PSBoundParameters['OutBuffer'] = 1
        }
        $wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('ConvertTo-Html', 
            [System.Management.Automation.CommandTypes]::Cmdlet)
        
        # define string variable to become the target command line
        #region Initialize helper variable to create command
        $scriptCmdPipeline = ''
        #endregion

        # add new parameter handling
        #region Process and remove the Path parameter if it is present
        if ($Path) {
            $PSBoundParameters.Remove('Path') | Out-Null
            $scriptCmdPipeline += " | Out-File -FilePath $Path"
        }
        #endregion
        
        $scriptCmd = {& $wrappedCmd @PSBoundParameters}
        
        # redefine command invocation
        #region Append our pipeline command to the wrapped command script block
        $scriptCmd = $ExecutionContext.InvokeCommand.NewScriptBlock(
                [string]$scriptCmd + $scriptCmdPipeline
            )
        #endregion
        
        
        $steppablePipeline = 
          $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
        $steppablePipeline.Begin($PSCmdlet)
    } catch {
        throw
    }
}

process
{
    try {
        $steppablePipeline.Process($_)
    } catch {
        throw
    }
}

end
{
    try {
        $steppablePipeline.End()
    } catch {
        throw
    }
}
<#

.ForwardHelpTargetName ConvertTo-Html
.ForwardHelpCategory Cmdlet

#>}

Hope you find this useful and that it gets you the feature you wanted without waiting for PowerShell v3. 😉


Background Jobs in PowerGUI

Here’s a quick tutorial on how you can do asynchronous script execution in the PowerGUI admin console.

Suppose you want a node in PowerGUI which shows the computers in your office that are currently online. The script (using Quest AD cmdlets and the Test-Connection cmdlet from PowerShell v2) could look like this:

Get-QADComputer -Location 'EMEA/RU/St.Petersburg' | 
    ForEach-Object {
        if (Test-Connection -ComputerName $_.Name -Quiet) 
            { $_  } 
    }

You will probably have a different Location value in your environment. Please change it here and in all the samples below.

And it seems to work, but it takes an awfully long time to execute (sure, we could have optimized it a bit – but that’s not the point ;)). And you kind of have to sit there and watch it execute, because if you click another node in the PowerGUI admin console, it will abort the operation…

The easy fix is to turn it into a background job: take the script, add Add-PSSnapin (because background jobs start in their own runspace), and keep the job handle in a global variable (so we don’t lose the job when we leave the node):

$global:OnlineComputersJob = Start-Job {
    Add-PSSnapin 'Quest.ActiveRoles.ADManagement'
    Get-QADComputer -Location 'EMEA/RU/St.Petersburg' | 
        ForEach-Object {
            if (Test-Connection -ComputerName $_.Name -Quiet) 
                { $_ } 
        }
}

This will start the job, and it will keep running in the background while you use PowerGUI for other tasks.
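As a quick sketch, you can check on the job at any time with the standard PowerShell 2.0 job cmdlets (for example, from the embedded PowerShell console):

```powershell
# List all jobs in the session with their State (Running / Completed / Failed)
Get-Job

# Peek at the output produced so far; -Keep leaves it in the job
# so later Receive-Job calls still see it
Receive-Job $global:OnlineComputersJob -Keep

# Or block until the job finishes and drain everything at once
Wait-Job $global:OnlineComputersJob | Receive-Job
```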

Now I will change it a little to add actual output both to the grid and to the Output window. I will add yet another global variable, $global:OnlineComputersResults, and use it to keep all the results.

# if no job started yet, start a new one, else output old results
if ( $global:OnlineComputersJob -eq $null ) {

    # this is the collection in which we will keep job results
    $global:OnlineComputersResults = @()

    $global:OnlineComputersJob = Start-Job {
        Add-PSSnapin 'Quest.ActiveRoles.ADManagement'
        Get-QADComputer -Location 'EMEA/RU/St.Petersburg' | 
            ForEach-Object {
                if (Test-Connection -ComputerName $_.Name -Quiet) 
                    { $_ } 
            }
    }
} else {
    $global:OnlineComputersResults
}

# Now let's keep looping and adding new results to the grid
while ( $global:OnlineComputersJob.JobStateInfo.State -ne 'Completed' ) {
    # get new results, add them to old ones, and output them to the grid
    # @() required to make sure PowerShell knows it is a collection
    $results = @(Receive-Job $global:OnlineComputersJob)
    $global:OnlineComputersResults += $results
    $results
    Write-Host "Added $($results.count) records. More on the way."
    Start-Sleep -Seconds 5
}

# When the job is completed, output whatever remains
$results = @(Receive-Job $global:OnlineComputersJob)
$global:OnlineComputersResults += $results
$results
Write-Host "Added $($results.Count) records."
Write-Host "Job completed."

That is it. The node will start running and add new results to the grid as they appear.

Now let’s make it really shine by adding an action to reset the computer list, and make other computer-related actions show up in the right-hand pane.

To add the reset action, we first add a category: right-click the right-hand pane and select New / Category, then name it something, e.g. “Job Control”.

After that, right-click the new category and select New / Script Action, then create a new action called “Reset computer list” with the following code:

$global:OnlineComputersResults = $null
Remove-Job $global:OnlineComputersJob -Force
$global:OnlineComputersJob = $null

When you click this action, the job state is cleared and the computer list gets rebuilt from scratch the next time the node runs.

Finally, let’s modify the original node code to tell PowerGUI that the objects in the grid can be used as computer objects. We need to do this because background jobs return not the real objects but their deserialized versions (data fields only), so the type name also changes, to “Deserialized.Quest.ActiveRoles.ArsPowerShellSnapIn.Data.ArsComputerObject”.

If we want to tell PowerGUI that these objects can be treated as regular computer objects, we have two options:
1. (Safer) Go through the regular actions displayed for computers and, for the ones which are not using any methods, add ‘Deserialized.Quest.ActiveRoles.ArsPowerShellSnapIn.Data.ArsComputerObject’ in Properties / Display Configuration / Associate the action with these types.
2. (Easier) Modify the original node code so that the first element we output contains the regular computer type name: ‘Quest.ActiveRoles.ArsPowerShellSnapIn.Data.ArsComputerObject’.
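You can see the renamed type for yourself. A quick illustrative check (assuming the background job from above has already produced at least one object):

```powershell
# Take the first object the job has returned so far
# (-Keep leaves the output in the job for the node's own Receive-Job calls)
$first = @(Receive-Job $global:OnlineComputersJob -Keep)[0]

# The first entry carries the "Deserialized." prefix PowerShell adds
# to objects that cross the job boundary
$first.PSObject.TypeNames
```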

So with that easier option, the node code will look like:

# if no job started yet, start a new one, else output old results
if ( $global:OnlineComputersJob -eq $null ) {

    # this is the collection in which we will keep job results
    $global:OnlineComputersResults = @()

    $global:OnlineComputersJob = Start-Job {
        Add-PSSnapin 'Quest.ActiveRoles.ADManagement'
        Get-QADComputer -Location 'EMEA/RU/St.Petersburg' | 
            ForEach-Object {
                if (Test-Connection -ComputerName $_.Name -Quiet) 
                    { $_ } 
            }
    }
} else {
    $global:OnlineComputersResults
}

# Now let's keep looping and adding new results to the grid
while ( $global:OnlineComputersJob.JobStateInfo.State -ne 'Completed' ) {
    # get new results, add them to old ones, and output them to the grid
    # @() required to make sure PowerShell knows it is a collection
    $results = @(Receive-Job $global:OnlineComputersJob)
    
    #for the first object in the collection, set object type
    if (( $global:OnlineComputersResults.Count -eq 0 ) -and 
        ( $results.count -gt 0 )) {
        $results[0].PSObject.TypeNames.Insert(0,`
        'Quest.ActiveRoles.ArsPowerShellSnapIn.Data.ArsComputerObject')
    }
    
    $global:OnlineComputersResults += $results
    $results
    Write-Host "Added $($results.count) records. More on the way."
    Start-Sleep -Seconds 5
}

# When the job is completed, output whatever remains
$results = @(Receive-Job $global:OnlineComputersJob)
$global:OnlineComputersResults += $results
$results
Write-Host "Added $($results.Count) records."
Write-Host "Job completed."

That is it. Now you have your Online Computers node which:
A. Executes in the background asynchronously,
B. Outputs results as they come in,
C. Has all the actions for its results, plus a new one to restart it.

This is how your console will probably look:

background

And, yes, PowerGUI at some point will probably have a lot of this just built into the framework. Meanwhile, I hope that you find this workaround useful. 🙂

Big thanks to Karl for the idea of this blog post.

Introduction to PowerShell v2

Check out this video to learn how to start using the most important new features of PowerShell 2.0 including background jobs, modules, advanced functions, and function help (HQ and full screen recommended):

Great job by Kirk Munro! I know I will be now using these features a lot more.


PowerGUI and Modules

With full PowerShell v2, Windows 7, and Windows Server 2008 R2 support in PowerGUI, we obviously had to fully support modules as well – and we do. 🙂 Modules are first-class citizens in PowerGUI 1.9: you can add them, get IntelliSense and F1 help in the script editor, search for them when adding admin console elements, set them as PowerPack dependencies, and so on.

As a quick 101: modules are a new format of PowerShell libraries introduced in PowerShell version 2. Unlike v1 snap-ins, modules can be written as scripts, so you do not have to use Visual Studio and compile them.

To create a simple module, just take a PowerShell script file with a function, e.g.:

function Add-Number {
    param ($A, $B)
    $A + $B
}

Then create a folder with the name you want for the module (e.g. MATH) and save the file into that folder as MATH.psm1.

To make this module show up as available, put the module folder (MATH in our example) into one of the folders listed in $env:PSModulePath. That basically means C:\Windows\system32\WindowsPowerShell\v1.0\Modules\ if you want it to be available to everyone, or your Documents\WindowsPowerShell\Modules if it is just for you.
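The whole round trip can be sketched like this (the per-user folder under Documents is just an example location; any folder listed in $env:PSModulePath works):

```powershell
# Create the module folder under the per-user module path
$modulePath = Join-Path ([Environment]::GetFolderPath('MyDocuments')) `
    'WindowsPowerShell\Modules\MATH'
New-Item -ItemType Directory -Path $modulePath -Force | Out-Null

# Save the function as MATH.psm1 inside that folder
@'
function Add-Number {
    param ($A, $B)
    $A + $B
}
'@ | Out-File (Join-Path $modulePath 'MATH.psm1')

# Load the module (by name, since the folder is on $env:PSModulePath)
# and call the exported function
Import-Module MATH
Add-Number -A 2 -B 3    # returns 5
```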

Now, if you click File / PowerShell Libraries in PowerGUI (admin console or script editor), you’ll see it listed as one of the options:

PowerShell-Libraries

As you can see, other modules (BitsTransfer and PSDiagnostics) are also available, as well as v1-style snapins.

You can also use the Add Module button if you want to add a module located outside of the default module folders.

Anyway, once you select MATH and click OK, the module becomes available to you. E.g. you get IntelliSense and F1 help for the module members:

Module-Intellisense

For more information on creating modules of your own, see this MSDN page.

Happy PowerShell v2 scripting! 😉



Legal

The posts on this blog are provided “as is” with no warranties and confer no rights. The opinions expressed on this site are mine and mine alone, and do not necessarily represent those of my employer - WSO2 or anyone else for that matter. All trademarks acknowledged.

© 2007-2014 Dmitry Sotnikov
