Optimize PowerShell Performance and Memory Consumption

PowerShell can be pretty resource intensive, especially if you use it to retrieve data from your directory. I thought it would be useful to share this issue and workaround, which were recently discussed in the PowerShell newsgroup.

SYMPTOMS

When trying to retrieve all user objects for subsequent processing with the following code:

$users = Get-QADUser -IncludeAllProperties -SizeLimit 0

the command fails with the following exception:

Get-QADUser : Exception retrieving members: "Exception of type
'System.OutOfMemoryException' was thrown."

(Looks like a correct line, right? SizeLimit set to zero makes PowerShell retrieve all user objects, IncludeAllProperties makes it retrieve whole user objects (not just the default set of attributes), and the collection is assigned to a variable for subsequent use. So why the exception? Read on!)

CAUSE

On a fairly big domain, the single line quoted above consumes all the RAM the machine has: it effectively retrieves the whole AD database wrapped into PowerShell objects.

You should avoid retrieving, and keeping in memory, data your scripts don't need. The tips below show how.

RESOLUTION

Here are a few important tips to keep in mind to optimize your PowerShell code when working with Active Directory:

Use Pipeline

Don't save the whole collection of objects to a variable. That makes PowerShell retrieve all the objects and keep them in memory for the whole session. Use the pipeline instead, which makes PowerShell pass the retrieved objects one by one to the next cmdlet.

So instead of:

$users = Get-QADUser
ForEach ($user in $users) { here goes the code }

Use:

Get-QADUser | ForEach { here goes the code }
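
For instance, here is a minimal runnable sketch of the streaming approach (assuming the Quest snap-in is loaded; printing the name is just a placeholder action):

Get-QADUser -SizeLimit 0 | ForEach-Object {
    # each object is processed and released one at a time;
    # the full result set never accumulates in memory
    Write-Host $_.Name
}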

Retrieve Only What You Need

-IncludeAllProperties is a dangerous parameter because it, well, includes all properties, even the binary blobs you will have no idea how to use. If you are OK with the attributes the cmdlet retrieves by default (to get the list, just run Get-QADUser | Get-Member), simply use:

Get-QADUser -SizeLimit 0

If you need a couple of additional attributes, use the -IncludedProperties parameter to add them, and just them (not all the attributes!):

Get-QADUser -SizeLimit 0 -IncludedProperties proxyAddresses

If you need to optimize even further, limit the retrieval to only the attributes you need by adding the -DontUseDefaultIncludedProperties switch:

Get-QADUser -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses
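
As a quick sketch of where this pays off (the output path is just an example), exporting those two attributes for every user streams through the pipeline with a small, steady memory footprint:

# dump the two requested attributes to a CSV file
Get-QADUser -SizeLimit 0 -DontUseDefaultIncludedProperties `
    -IncludedProperties SamAccountName,proxyAddresses |
    Select-Object SamAccountName,
        @{ Name = 'proxyAddresses'; Expression = { $_.proxyAddresses -join ';' } } |
    Export-Csv .\proxyAddresses.csv -NoTypeInformation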

Filter Using cmdlet Parameters

Finally, if you need only a subset of objects, use cmdlet parameters to do the filtering rather than a Where-Object clause.

So instead of:

Get-QADUser | where { $_.City -eq 'Amsterdam' } | ForEach { here goes the code }

Use:

Get-QADUser -City Amsterdam | ForEach { here goes the code }

The difference is huge. In the former case, PowerShell retrieves all user objects and then does the filtering on the client. In the latter, the filtering becomes part of the LDAP query and is thus performed on the domain controller during data retrieval.
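
If you want to verify this on your own domain, Measure-Command gives a quick (if unscientific) comparison. Absolute numbers will vary, but the server-side version typically wins by a wide margin:

# client-side: every user object crosses the wire before the filter runs
Measure-Command { Get-QADUser | where { $_.City -eq 'Amsterdam' } }

# server-side: the condition becomes part of the LDAP query
Measure-Command { Get-QADUser -City Amsterdam }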

SUMMARY

The ideal PowerShell script:

  1. Does not keep the whole set of objects but processes them one by one upon retrieval.
  2. Retrieves only the attributes it needs.
  3. Filters objects during retrieval.

Get-QADUser -City Amsterdam -DontUseDefaultIncludedProperties -IncludedProperties SamAccountName,proxyAddresses | ForEach { here goes the code }
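
And here is that one-liner expanded into a runnable sketch (the action inside the loop is just a placeholder):

Get-QADUser -City Amsterdam -SizeLimit 0 `
    -DontUseDefaultIncludedProperties `
    -IncludedProperties SamAccountName,proxyAddresses |
    ForEach-Object {
        # filtered on the DC, trimmed to two attributes,
        # and processed one object at a time
        '{0}: {1}' -f $_.SamAccountName, ($_.proxyAddresses -join ';')
    }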


20 Responses to “Optimize PowerShell Performance and Memory Consumption”


  1. JP April 21, 2009 at 11:05 pm

    Thank you for posting this. I have been looking for ways to filter objects during the retrieval. I was hoping Import-Csv had something like this, but I have not found anything so far. I only need 4 columns out of 70 or so from my csv file (which has about 10,000 records). It looks like I may be stuck with grabbing all the data from the csv file. =(

  2. Dmitry Sotnikov April 22, 2009 at 9:45 am

    And you don’t own the csv, do you? You could probably use regular expressions to parse the file and extract just the columns you need.
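
    For what it's worth, Import-Csv does stream records through the pipeline one at a time, so piping it straight into Select-Object keeps just the columns you name without holding the whole set in a variable. A sketch (file and column names are hypothetical):

    Import-Csv .\somelist.csv |
        Select-Object Name,Department,City,Email |
        Export-Csv .\trimmed.csv -NoTypeInformation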

  3. Anonymous July 15, 2011 at 8:46 pm

    This was a tremendously helpful article and helped me fix a memory leak I had been contending with.

  4. Lars Gottlieb February 24, 2012 at 6:30 pm

    Using the Quest AD tools I often run into memory consumption problems. I thought it was a question of memory leaks, but it's not: it's the garbage collection that doesn't get run until it's too late.
    So I'm using this when I use Quest's AD Management Cmdlets in PowerShell, where $i is a simple counter:

    if (($i % 200) -eq 0)
    {
        [System.GC]::Collect()
    }
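
    In context the pattern looks something like this (a sketch; the processing step is a placeholder):

    $i = 0
    Get-QADUser -SizeLimit 0 | ForEach-Object {
        $i++
        # ... per-object processing goes here ...
        if (($i % 200) -eq 0)
        {
            [System.GC]::Collect()   # force a collection every 200 objects
        }
    }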

  5. Lars Gottlieb February 24, 2012 at 8:08 pm

    Aha – I was running in Debug Mode and didn't know it. I tried running it by pressing CTRL+F5, but it still uses a huge amount of memory compared to running it in an external PowerShell window, where it uses 65-75 MB; in PowerGUI it ends up using 1.4 GB and then crashes (debug or not).

    • Lars Gottlieb February 24, 2012 at 8:52 pm

      OK – when I start an external PowerShell through the menu of PowerGUI, the memory consumption is even worse than running it inside PowerGUI (3.1.0.2058)!
      But everything is fine when I start a PowerShell command window and execute the script from in there.

  6. Travis Loyd December 3, 2014 at 9:21 am

    This is so incredibly helpful. I wonder if you could also write something about how to avoid memory leaks. I have a PowerShell script which runs forever: processing information… sleeping… repeat. The memory usage of the ISE just grows and grows. I've discovered and added "[gc]::collect()" into the loop, which seems to help. Also, some articles mention calling .Dispose() when using a custom object. Should I be calling a function? An article to clarify this would also be extremely helpful. Articles out there suggesting to run from the Task Scheduler rather than a forever loop offer a valid workaround, but avoid the issue.

    • Ken B March 28, 2016 at 7:44 am

      I ran into a similar issue using the Quest cmdlets. What I've started doing when processing items in a loop is using rv (Remove-Variable) to free up the space used by a variable (setting it to null is not the same thing!). This helped quite a bit with the Quest cmdlets (combined with invoking the garbage collection).

      I even try to do this within functions, before the function ends, on the variables used within that function (local variables). I assume those values would be removed and cleaned up anyway – I am just forcing it to happen.

      really basic example:

      $list = Import-Csv somelist.csv

      foreach ($i in $list) {
          $var1 = $i.value1
          $var2 = $i.value2

          rv var1
          rv var2
      } # end foreach
      rv i
      rv list

  7. naveenbasatikittu421 September 29, 2015 at 8:38 pm

    Very nice article and very helpful indeed. It reduced my code length and memory usage. Thanks


  1. Essential PowerShell: Understanding foreach « Poshoholic Trackback on August 22, 2007 at 3:51 am
  2. Measure vs Count vs $i++ « Dmitry’s PowerBlog: PowerShell and beyond Trackback on January 18, 2008 at 1:16 pm
  3. Find large objects in AD « Dmitry’s PowerBlog: PowerShell and beyond Trackback on June 8, 2009 at 10:02 am
  4. Freeing up memory in PowerShell using garbage collector « Dmitry’s PowerBlog: PowerShell and beyond Trackback on February 24, 2012 at 7:29 pm
  5. Quest cmdlets performance and optimization « Jacques DALBERA's IT world Trackback on December 22, 2012 at 10:00 pm
  6. PowerShellのForeach-Objectは一体何をしているのか | guitarrapc.wordpress.com Trackback on March 9, 2013 at 9:46 pm




