Optimize PowerShell Performance and Memory Consumption

PowerShell can be pretty resource-intensive, especially if you use it to retrieve data from your directory. I thought it would be useful to share this issue and workaround, which were recently discussed in the PowerShell newsgroup.

SYMPTOMS

When trying to retrieve all user objects for subsequent processing with the following code:

$users = Get-QADUser -IncludeAllProperties -SizeLimit 0

PowerShell throws the following exception:

Get-QADUser : Exception retrieving members: "Exception of type
'System.OutOfMemoryException' was thrown."

(Looks like a correct line, right? SizeLimit set to zero makes PowerShell retrieve all user objects, IncludeAllProperties makes it retrieve whole user objects (not just the default set of attributes), and then the collection is assigned to a variable for subsequent use. So why the exception? Read on!)

CAUSE

In reality, on a fairly big domain, the single line quoted above consumed all the RAM the machine had, because it effectively retrieved the whole AD database wrapped into PowerShell objects.

You should avoid retrieving, and keeping in memory, data your scripts don’t need. The resolution below shows how.

RESOLUTION

Here are a few important tips to keep in mind to optimize your PowerShell code when working with Active Directory:

Use the Pipeline

Don’t save the whole collection of objects to a variable; that forces PowerShell to retrieve all the objects and keep them in memory for the whole session. Use the pipeline instead, which makes PowerShell pass the retrieved objects one by one to the next cmdlet.

So instead of:

$users = Get-QADUser
ForEach ($user in $users) { <# here goes the code #> }

Use:

Get-QADUser | ForEach-Object { <# here goes the code #> }
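
For instance, a streaming export never holds more than one object in flight at a time (a minimal sketch; the property names and output path here are just placeholders):

Get-QADUser | Select-Object Name,Email | Export-Csv C:\Temp\users.csv -NoTypeInformation

Both Select-Object and Export-Csv process objects as they arrive, so memory use stays flat no matter how many users come back.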

Retrieve Only What You Need

-IncludeAllProperties is a dangerous parameter because it, well, includes all properties, even the binary blobs you will have no idea how to use. If you are OK with the attributes the cmdlet retrieves by default (to get the list, just run Get-QADUser -SizeLimit 1 | Get-Member), simply use:

Get-QADUser -SizeLimit 0

If you need a couple of additional attributes, use the -IncludedProperties parameter to add them, and just them (not all the attributes!):

Get-QADUser -SizeLimit 0 -IncludedProperties proxyAddresses

If you need to optimize even further, limit the retrieval to only the attributes you need by using the -DontUseDefaultIncludedProperties switch:

Get-QADUser -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses
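
One way to eyeball what was actually retrieved is to pull a single object and list its populated properties (a quick sanity check; adjust the attribute names to your case):

Get-QADUser -SizeLimit 1 -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses | Format-List *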

Filter Using Cmdlet Parameters

Finally, if you need only a subset of objects, use cmdlet parameters to do the filtering, not a Where-Object clause.

So instead of:

Get-QADUser | Where-Object { $_.City -eq 'Amsterdam' } | ForEach-Object { <# here goes the code #> }

Use:

Get-QADUser -City Amsterdam | ForEach-Object { <# here goes the code #> }

The difference is huge. In the former case, PowerShell retrieves all user objects and then does the filtering on the client. In the latter, the filtering becomes part of the LDAP query and is thus performed on the domain controller during data retrieval.
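
If you are curious what this looks like at the LDAP level, the Quest cmdlets also accept a raw LDAP filter. Assuming -City maps to the standard l (locality) attribute, the server-side equivalent would be roughly:

Get-QADUser -LdapFilter '(l=Amsterdam)' | ForEach-Object { <# here goes the code #> }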

SUMMARY

The ideal PowerShell script:

  1. Does not keep the whole set of objects but processes them one by one upon retrieval.
  2. Retrieves only the attributes it needs.
  3. Filters objects during retrieval.

Get-QADUser -City Amsterdam -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses | ForEach-Object { <# here goes the code #> }
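
As a concrete end-to-end illustration, the same pattern can stream results straight to a CSV file without ever holding the whole set in memory (a sketch; the output path is made up, and the calculated property just flattens the multi-valued proxyAddresses attribute):

Get-QADUser -City Amsterdam -SizeLimit 0 -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses |
    Select-Object SamAccountName,@{Name='ProxyAddresses';Expression={$_.proxyAddresses -join ';'}} |
    Export-Csv C:\Temp\amsterdam-users.csv -NoTypeInformation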


17 Responses to “Optimize PowerShell Performance and Memory Consumption”


  1. JP, April 21, 2009 at 11:05 pm

    Thank you for posting this. I have been looking for ways to filter objects during retrieval. I was hoping Import-Csv had something like this, but I have not found anything so far. I only need 4 columns out of the 70 or so in my CSV file (which has about 10,000 records). It looks like I may be stuck with grabbing all the data from the CSV file. =(

  2. Dmitry Sotnikov, April 22, 2009 at 9:45 am

    And you don’t own the CSV, do you? You could probably use regular expressions to parse the file and extract just the columns you need.
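
    A rough sketch of that idea, streaming the file line by line (the column positions are made up, and a plain -split ',' assumes no quoted commas in the data):

    Get-Content .\big.csv | ForEach-Object {
        $cols = $_ -split ','
        # keep only the four columns you need (hypothetical positions 0, 3, 12, 47)
        '{0},{1},{2},{3}' -f $cols[0], $cols[3], $cols[12], $cols[47]
    } | Set-Content .\slim.csv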

  3. Anonymous, July 15, 2011 at 8:46 pm

    This was a tremendously helpful article and helped me fix a memory leak I had been contending with.

  4. Lars Gottlieb, February 24, 2012 at 6:30 pm

    Using the Quest AD tools I often run into memory consumption problems. I thought it was a question of memory leaks, but it’s not; it’s the garbage collection that doesn’t run until it’s too late.
    So I’m using this when I use Quest’s AD Management Cmdlets in PowerShell, where $i is a simple counter:

    if (($i % 200) -eq 0)
    {
        [System.GC]::Collect()
    }
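
    In context, inside a streaming loop, that would look something like this (a sketch based on the snippet above; the 200-object interval is just the commenter’s value):

    $i = 0
    Get-QADUser | ForEach-Object {
        # process $_ here
        $i++
        if (($i % 200) -eq 0) { [System.GC]::Collect() }  # periodically nudge the garbage collector
    }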

  5. Lars Gottlieb, February 24, 2012 at 8:08 pm

    Aha, I was running in Debug Mode; I didn’t know. I tried running it by pressing CTRL+F5, but it still uses a huge amount of memory compared to running it in an external PowerShell window, where it uses 65-75 MB; in PowerGUI it ends up using 1.4 GB and then crashes (debug or not).

    • Lars Gottlieb, February 24, 2012 at 8:52 pm

      OK, when I start an external PowerShell through the menu of PowerGUI, the memory consumption is even worse compared to running it in PowerGUI (3.1.0.2058)!
      But everything is fine when I start a PowerShell console and execute the script from in there.

