Monitor web-site availability

Did you know that you can use PowerShell to monitor your website and send you alerts when something goes wrong? We had availability issues with our community site, and I was quite surprised that a 20-line (!) PowerShell script did the job!

Basically, all I had to do was use the Net.WebClient object and its DownloadString method to query the page (with some proxy-handling code I got from Alexey Chuikov), and trap any exception it generates when something goes wrong. The trap handler uses our internal relay server to email me and everyone involved in site administration.

Here’s the code:

##########################################################
# Test-Site - script to test web site availability
# and notify in case of any issues
# (c) Dmitry Sotnikov
# https://dmitrysotnikov.wordpress.com
##########################################################

function Test-Site {
    param($URL)
    trap{
        "Failed. Details: $($_.Exception)"
        $emailFrom = "my.email@address.com"
        # Use commas for multiple addresses
        $emailTo = "my.email@address.com,another.admin@address.com"
        $subject = "PowerGUI.org down"
        $body = "PowerGUI web site is down. Details: $($_.Exception)"
        $smtpServer = "smtp.server.to.use.for.relay"
        $smtp = new-object Net.Mail.SmtpClient($smtpServer)
        $smtp.Send($emailFrom, $emailTo, $subject, $body)    
        exit 1
    }
    $webclient = New-Object Net.WebClient
    # The next 5 lines are required if your network has a proxy server
    $webclient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
    if ($webclient.Proxy -ne $null) {
        $webclient.Proxy.Credentials = `
                [System.Net.CredentialCache]::DefaultNetworkCredentials
    }
    # This is the main call
    $webclient.DownloadString($URL) | Out-Null
} 

Test-Site "http://powergui.org"

To test it, you can of course just put an invalid URL into the call.

Once I had the script running, I just set up a scheduled task in Windows Task Scheduler to run the script every 15 minutes:
Windows Task Scheduler with a PowerShell task

One trick I learned from MoW and used in the task was to use the -Command parameter (rather than just supplying the script) and include exit $LASTEXITCODE in the command, so the exit code from the PowerShell script gets registered as the scheduled task result.

So here’s the command-line I have scheduled:

c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -Noninteractive -command ". c:\scripts\test-site.ps1; exit $LASTEXITCODE"

Works flawlessly! And can save you tons of money on a monitoring solution. Talk about ROI from learning PowerShell! 😉


32 Responses to “Monitor web-site availability”


  1. Server Monitoring August 15, 2008 at 2:15 pm

    That script looks pretty interesting. The problem I have with local solutions is that I would have to run my Windows box 24/7 and configure it to send me a notification whenever my websites become unavailable.

  2. dmitrysotnikov August 15, 2008 at 3:33 pm

    Yes, this is an obvious limitation. At the same time, specifically for the task of monitoring web site availability, I actually like that this script runs remotely (in fact, from another network), so network-related problems such as DNS or routing issues get caught as well.

    • eric October 4, 2010 at 10:27 am

      Hello,

      Thanks for this information.

      Where do you set the URL to test (at which line)?
      And what about the case of multiple URLs?

      I'm a beginner in PS.

      Regards,

      Eric

      • Dmitry Sotnikov October 4, 2010 at 4:38 pm

        The URL is the function parameter and is being supplied in the call at the end of the script:

        Test-Site "http://powergui.org"

        Just change this to yours.

  3. joe January 20, 2009 at 11:14 pm

    Have a look at http://www.mywebkpi.com which does web kpi monitoring around the clock for free. Try the live demo here http://www.mywebkpi.com/cgi-bin/demo-cgi to see how websites can be monitored without scripts and without software or config changes. Talk about ROI here for investing nothing and getting your site monitored.

    works great for me

  4. JKav February 6, 2009 at 12:12 am

    Well I don’t see any limitations :=) Actually, I will be using this for a daily checkout script to report that critical intranet sites are up. The .Proxy and .CredentialCache were driving me crazy and this was a huge help. This ole SysAdmin is gonna have to learn .Net. Thank you very much Dmitry!

  5. David Johnson March 4, 2009 at 5:25 pm

    Whilst looking for a script to monitor website availability I found another option using kixtart here:

    http://www.jjclements.co.uk/index.php/2009/02/19/kixtart-script-to-check-for-website-availability

    It also seems to report the actual error codes that may be returned when there is a web server issue.

    • MadLogik November 21, 2009 at 2:47 pm

      Thanks for this code!
      I modified it to simply return true or false,
      and to add the http:// if it’s not there.

      The only thing I’m missing is a timeout for the webclient.
      I found a few examples with a do-while loop in PowerShell and timespans… but I need something better.

      Here’s your modified code Mr.
      I don’t understand the param so I skipped it…

      function mad-TestSite {
          if ($args.Count -eq 1) {
              $ErrorActionPreference = "SilentlyContinue"
              $URL = $args[0]
              if ($URL -like "http://*") {
                  # write-host "URL has the http://"
              }
              else {
                  # write-host "URL does not have the http://"
                  $URL = "http://" + $URL
              }
              $webclient = New-Object Net.WebClient
              # The next 5 lines are required if your network has a proxy server
              $webclient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
              if ($webclient.Proxy -ne $null) {
                  $webclient.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
              }
              # This is the main call
              $webdata = $webclient.DownloadString($URL)
              if ($webdata -ne $null) { return $true }
              else { return $false }
          }
          else {
              write-host "Usage: mad-TestSite http://thewebsite.com or thewebsite.com"
          }
      }
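      On the missing timeout: WebClient has no public Timeout property, but a commonly cited workaround is to derive a small class that overrides GetWebRequest and sets the underlying request's Timeout. A sketch, assuming PowerShell v2+ for Add-Type; the class name TimeoutWebClient and the 5-second value are made up for illustration:

```powershell
# Sketch (assumption, not from the original post): a WebClient subclass
# whose requests time out after a configurable number of milliseconds.
Add-Type -TypeDefinition @"
using System;
using System.Net;

public class TimeoutWebClient : WebClient
{
    // Timeout in milliseconds; default matches WebRequest's usual behavior
    public int TimeoutMs = 30000;

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request != null) { request.Timeout = TimeoutMs; }
        return request;
    }
}
"@

$webclient = New-Object TimeoutWebClient
$webclient.TimeoutMs = 5000   # fail after 5 seconds instead of hanging
```

      Swapping this in for New-Object Net.WebClient in the function above would make a slow site fail fast instead of blocking the script.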

  6. Dmitry Sotnikov November 23, 2009 at 8:03 am

    Thanks MadLogik!

  7. Igor March 11, 2010 at 8:52 pm

    this is exactly what I was looking for.

    However, I am getting this error while sending the email:

    Exception calling “Send” with “4” argument(s): “The SMTP server requires a secure connection or the client was not authenticated. The server response was: 5.7.1 Client was not authenticated”
    At O:\test\rendertest.ps1:12 char:19
    + $smtp.Send <<<< ($emailFrom, $emailTo, $subject, $body)
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

    Also, I have multiple websites to test. Is there a way to test 10 sites or more? Maybe get-content from sites.txt?

  8. Igor March 11, 2010 at 11:57 pm

    nvm i figured it out.

    $smtp.Credentials = New-Object System.Net.NetworkCredential("username","password")

    • Dmitry Sotnikov March 12, 2010 at 10:28 am

      Igor, good catch! In my environment I was at my office desktop, already authenticated in AD, so I did not have to explicitly provide a username and password.

      Thanks for finding the issue and providing the fix!

      Dmitry

  9. Igor March 12, 2010 at 5:15 pm

    I'm having problems with this code. Please help!
    Here is what I want to do: get a list of sites. If any give errors, send an email with the error. All others that work, put into a table and send a separate email.

    • Igor March 12, 2010 at 5:16 pm
      function Test-Site {
      	BEGIN{}
      	PROCESS{
      	$URL = $_	
      	$webclient = New-Object Net.WebClient
          $webdata = $webclient.DownloadString($URL)
      	if ($webdata -ne $null)
      		{#Write-Host `t "$URL is good" -ForegroundColor Cyan
      		GoodRender}
      	else 
      		{#return $false
      		  SendBadRenderMail}
      	}
      	END {}
       }
      
      Set-ExecutionPolicy RemoteSigned
      Get-Content o:\Test\sites.txt | test-site
      
      • Igor March 12, 2010 at 5:18 pm

        OUTPUT (supposed to give me false for the bad sites):

        PS O:\test> .\rendertest.ps1
                 http://microsoft.com is good
        Exception calling "DownloadString" with "1" argument(s): "The remote name could not be resolved: 'badsite'" At O:\test\rendertest.ps1:9 char:41
        +     $webdata = $webclient.DownloadString <<<< ($URL)
            + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
            + FullyQualifiedErrorId : DotNetMethodException
        
                 http://badsite:8080 is good
        Exception calling "DownloadString" with "1" argument(s): "The remote server returned an error: (502) Bad Gateway." At O:\test\rendertest.ps1:9 char:41
        +     $webdata = $webclient.DownloadString <<<< ($URL)
            + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
            + FullyQualifiedErrorId : DotNetMethodException
        
                 http://badsite.com:123 is good
        PS O:\test> 
      • Dmitry Sotnikov March 17, 2010 at 10:32 am

        Igor,

        Your code is failing where it should indeed fail. 🙂 For invalid/offline site $webclient.DownloadString($URL) does NOT return $null – instead it throws an exception.

        In my original code (see the blog post above) I used trap to catch it and report the unavailable site. (In PowerShell 2.0 you can also use the try/catch statement.)

        Change your script to catch the exception (instead of testing for $null) and you should be fine.
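        A minimal sketch of that change (PowerShell 2.0+ try/catch), with plain output strings standing in as hypothetical placeholders for the poster's own GoodRender and SendBadRenderMail calls:

```powershell
# Sketch: pipeline function that catches the exception DownloadString
# throws for an unavailable site, instead of testing its result for $null.
function Test-Site {
    process {
        $URL = $_
        $webclient = New-Object Net.WebClient
        try {
            # Throws on DNS failures, HTTP errors, timeouts, etc.
            $webclient.DownloadString($URL) | Out-Null
            "$URL is good"                            # replace with GoodRender
        }
        catch {
            "$URL failed: $($_.Exception.Message)"    # replace with SendBadRenderMail
        }
    }
}
```

        Then pipe the list in as before: Get-Content o:\Test\sites.txt | Test-Site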

        Sorry for delayed response. Forums at http://powergui.org are really a much better way to get answers fast. I am a bit slow on responding to comments and email.

        Dmitry

  10. eric October 4, 2010 at 11:07 am

    Hello,

    It seems to not be working for HTTPS URLs.

    Eric

    • Dmitry Sotnikov October 4, 2010 at 4:42 pm

      Eric,

      Seems to work for https for me. For example this worked just fine (Google supports both HTTP and HTTPS connections):

      Test-Site "https://www.google.com"

      Dmitry

  11. Kanna November 11, 2010 at 6:50 am

    This worked for me, nice one ……..

    Thanks a lot for Sharing this information.

  12. Gowrish February 15, 2012 at 6:33 am

    Hi All,

    I am unable to get the information for a link. I am getting a syntax error.

    $webclient = New-Object Net.WebClient;$webdata = ${webclient.DownloadString(https://stbap701.gsp.accenture.com/portal/login/login.do)} if ($webdata -ne $nulll) {#Write-Host `t “$URL is good”} else {#return $false}

    Please correct me where I am wrong

  13. Anonymous August 30, 2012 at 4:39 pm

    Thank you. It's simple and works best. I had been trying to create something like this in SCOM for a long time and still can't make it work.

    awesome

  14. Sean November 30, 2013 at 7:00 pm

    Thank you for posting this very helpful script, Dmitry! The only issue I have been having is the receipt of timeout errors when the website being monitored is not down and responds fine. The error detail is “website is down. Details: System.Net.WebException: The operation has timed out”. Have you seen this error and is there a way around it?

    Thanks again,
    Sean

    • Dmitry Sotnikov December 12, 2013 at 1:55 am

      Sean, no I have not had such issues myself – the script was quite reliable for me when I used it. Is this sporadic or happening to you all the time? I have not looked into the APIs lately – maybe there is a way to set a different timeout.

      • Sean December 12, 2013 at 9:58 pm

        Thanks for the reply, Dmitry. I originally had the script set to run in Task Scheduler every 5 minutes. I have moved that to every 30 minutes and now only receive the error a couple of times a day. Is it possible to trap this error and have the script not send an email when the error is “operation has timed out”? I do receive a different error when the site is actually down, so I know I won't miss an actual issue by ignoring the “operation has timed out”.

  15. Dmitry Sotnikov December 17, 2013 at 8:25 pm

    If you are on PowerShell v3, you can use Invoke-WebRequest instead of the .net object – it has a parameter to set a longer timeout: http://technet.microsoft.com/en-us/library/hh849901.aspx
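    A minimal sketch of the same check with Invoke-WebRequest; the 60-second value is just an example, and -UseBasicParsing avoids the Internet Explorer parsing dependency:

```powershell
# Sketch (PowerShell v3+): Invoke-WebRequest with an explicit timeout.
try {
    # -TimeoutSec raises the request timeout to 60 seconds;
    # a failure (DNS, HTTP error, timeout) throws into the catch block.
    Invoke-WebRequest "http://powergui.org" -UseBasicParsing -TimeoutSec 60 | Out-Null
}
catch {
    "Failed. Details: $($_.Exception)"
}
```

    The same trap/email logic from the original script would go in the catch block.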

  16. Preethi August 5, 2017 at 8:01 pm

    Hi Dmitry, thanks for sharing the script. In a few cases, the URL is redirected to error web pages preset in the application. For example: when launching https://mysite.abc.com, it should land on http://mysite.abc.com/home, but when something goes wrong, the URL lands on https://mysite.abc.com/home/error.aspx?error=Licenseerror. How do we capture the destination URL and set an alert for this?


  1. Brian Nettles » Blog Archive » Automate Website Monitoring Trackback on August 27, 2008 at 2:57 pm
  2. run as command | keyongtech Trackback on January 18, 2009 at 5:25 pm
  3. Automate Website Monitoring : Trisummit Technologies Trackback on March 29, 2010 at 2:23 am
  4. PowerShell scripts to monitor web server crash eventhang and send notification emails | MSDN Blogs Trackback on June 21, 2012 at 11:59 am

Legal

The posts on this blog are provided “as is” with no warranties and confer no rights. The opinions expressed on this site are mine and mine alone, and do not necessarily represent those of my employer - WSO2 or anyone else for that matter. All trademarks acknowledged.

© 2007-2014 Dmitry Sotnikov
