Did you know that you can use PowerShell to monitor your website and send you alarms when something goes wrong? We had availability issues with our community site, and I was quite surprised that a 20-line (!) PowerShell script did the job!
Basically, all I had to do was use the Net.WebClient object and its DownloadString method to query the page (with some proxy handling code I got from Alexey Chuikov), and trap any exception it generates when something goes wrong. The trap uses our internal relay server to email me and everyone else involved in the site administration.
Here’s the code:
##########################################################
# Test-Site - script to test web site availability
# and notify in case of any issues
# (c) Dmitry Sotnikov
# https://dmitrysotnikov.wordpress.com
##########################################################
function Test-Site {
  param($URL)
  trap {
    "Failed. Details: $($_.Exception)"
    $emailFrom = "my.email@address.com"
    # Use commas for multiple addresses
    $emailTo = "my.email@address.com,another.admin@address.com"
    $subject = "PowerGUI.org down"
    $body = "PowerGUI web site is down. Details: $($_.Exception)"
    $smtpServer = "smtp.server.to.use.for.relay"
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
    exit 1
  }
  $webclient = New-Object Net.WebClient
  # The next 5 lines are required if your network has a proxy server
  $webclient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
  if ($webclient.Proxy -ne $null) {
    $webclient.Proxy.Credentials = `
      [System.Net.CredentialCache]::DefaultNetworkCredentials
  }
  # This is the main call
  $webclient.DownloadString($URL) | Out-Null
}
Test-Site "http://powergui.org"
To test it you can obviously just put an invalid URL into the call.
Once I had the script running, I just set up a scheduled task in Windows Task Scheduler to run the script every 15 minutes:
One trick I learned from MoW and used in the task was using the -command parameter (rather than just supplying the script) and including exit $LASTEXITCODE in the command, so the exit code from the PowerShell script gets registered as the scheduled task result.
So here’s the command-line I have scheduled:
c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -Noninteractive -command ". c:\scripts\test-site.ps1; exit $LASTEXITCODE"
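If you prefer to script the task creation itself, here is a sketch using the ScheduledTasks cmdlets (these require PowerShell 4.0+ on Windows 8 / Server 2012 or later, so they post-date this post; the task name and paths are placeholders matching the setup above):

```powershell
# Run the monitoring script every 15 minutes (sketch; adjust paths to yours)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -NonInteractive -Command ". c:\scripts\test-site.ps1; exit $LASTEXITCODE"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 15)
Register-ScheduledTask -TaskName "Test-Site" -Action $action -Trigger $trigger
```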
Works flawlessly! And can save you tons of money on a monitoring solution. Talk about ROI from learning PowerShell! 😉
That script looks pretty interesting. The problem I have with local solutions is that I would have to run my Windows box 24/7 and configure it to send me a notification whenever my websites become unavailable.
Yes, this is an obvious limitation. At the same time specifically with the task of monitoring web site availability I actually like that this script is running remotely (actually from another network) so network-related problems such as DNS or routing get caught as well.
Hello,
Thanks for this information.
Where do I set the URL to test (at which line)?
And in the case of multiple URLs?
I'm a beginner in PS.
Regards,
Eric
The URL is the function parameter and is being supplied in the call at the end of the script:
Test-Site “http://powergui.org”
Just change this to yours.
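For multiple URLs, one option (a sketch, with a placeholder file path) is to call the function once per line of a text file:

```powershell
# One URL per line in c:\scripts\sites.txt (placeholder path)
Get-Content c:\scripts\sites.txt | ForEach-Object {
    Test-Site $_
}
```

Note that the trap in the original function calls exit, so as written the loop would stop at the first failed site; remove the exit if you want it to keep checking the rest.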
works great for me
Well, I don’t see any limitations :) Actually, I will be using this for a daily checkout script to report that critical intranet sites are up. The .Proxy and .CredentialCache were driving me crazy, and this was a huge help. This ole SysAdmin is gonna have to learn .NET. Thank you very much, Dmitry!
Whilst looking for a script to monitor website availability I found another option using kixtart here:
http://www.jjclements.co.uk/index.php/2009/02/19/kixtart-script-to-check-for-website-availability
It also seems to report the actual HTTP error codes returned when there is a webserver issue.
Thanks for this code!
I modified it to simply return true or false,
and to add the http:// if it’s not there.
The only thing I’m missing is a timeout for the webclient.
I found a few examples with a dowhile loop in powershell and timespans… but I need something better
Here’s your modified code.
I don’t understand the param block so I skipped it…
[CODE]
function mad-TestSite
{
    if ($args.Count -eq 1)
    {
        $ErrorActionPreference = "SilentlyContinue"
        $URL = $Args[0]
        if ($URL -like "http://*")
        {
            #write-host "URL has the http://"
        }
        else
        {
            #write-host "URL does not have the http://"
            $URL = "http://" + $URL
        }
        $webclient = New-Object Net.WebClient
        # The next 5 lines are required if your network has a proxy server
        $webclient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
        if ($webclient.Proxy -ne $null)
        {
            $webclient.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        }
        # This is the main call
        $webdata = $webclient.DownloadString($URL)
        if ($webdata -ne $null)
        { return $true }
        else
        { return $false }
    }
    else
    {
        write-host "Usage: mad-TestSite http://thewebsite.com or thewebsite.com"
    }
}
[/CODE]
Thanks MadLogik!
this is exactly what I was looking for.
However, I am getting this error while sending the email:
Exception calling “Send” with “4” argument(s): “The SMTP server requires a secure connection or the client was not authenticated. The server response was: 5.7.1 Client was not authenticated”
At O:\test\rendertest.ps1:12 char:19
+ $smtp.Send <<<< ($emailFrom, $emailTo, $subject, $body)
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
Also, I have multiple websites to test. Is there a way to test 10 sites or more, maybe with Get-Content from a sites.txt?
Never mind, I figured it out:
$smtp.Credentials = New-Object System.Net.NetworkCredential("username","password")
Igor, good catch! In my environment I was already at my office desktop, authenticated in AD, so I did not have to explicitly provide a username and password.
Thanks for finding the issue and providing the fix!
Dmitry
I’m having problems with this code. Please help!
Here is what I want to do: get a list of sites; if any give errors, send an email with the error; put all the others that work into a table and send a separate email.
OUTPUT: (supposed to give me false for the bad sites)
Igor,
Your code is failing where it should indeed fail. 🙂 For invalid/offline site $webclient.DownloadString($URL) does NOT return $null – instead it throws an exception.
In my original code (see the blog post above) I used trap to catch it and report unavailable site. (In PowerShell 2.0 you can also use try/catch statement).
Change your script to catch the exception (instead of testing for $null) and you should be fine.
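For example, with the PowerShell 2.0 try/catch statement, a true/false check along the lines of the original function could look like this (a sketch; the function name is mine):

```powershell
function Test-SiteBool {
    param($URL)
    $webclient = New-Object Net.WebClient
    try {
        # DownloadString throws an exception for invalid/offline sites
        $webclient.DownloadString($URL) | Out-Null
        return $true
    }
    catch {
        return $false
    }
}
```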
Sorry for delayed response. Forums at http://powergui.org are really a much better way to get answers fast. I am a bit slow on responding to comments and email.
Dmitry
Hello,
It seems to not be working for HTTPS URLs.
Eric
Eric,
Seems to work for https for me. For example this worked just fine (Google supports both HTTP and HTTPS connections):
Test-Site “https://www.google.com”
Dmitry
This worked for me, nice one ……..
Thanks a lot for Sharing this information.
Hi All,
I am unable to get the information for a link. I am getting a syntax error.
$webclient = New-Object Net.WebClient
$webdata = ${webclient.DownloadString(https://stbap701.gsp.accenture.com/portal/login/login.do)}
if ($webdata -ne $nulll) {#Write-Host `t "$URL is good"}
else {#return $false}
Please correct me where I am wrong
Thank you. It’s simple and works great. I had been trying to create something like this in SCOM for so long and still couldn’t make it work.
Awesome!
This script is exactly what I was looking for. However, I have a site that prompts for a username and password. Any idea on how to get around this?
Jose,
In that case, instead of the WebClient .NET object you can use the Invoke-WebRequest cmdlet (which was not yet in PowerShell when I wrote the original script). You can find examples here: http://technet.microsoft.com/en-us/library/hh849901.aspx
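For a site that prompts for a username and password, a minimal sketch (PowerShell 3.0+; the URL is a placeholder) could be:

```powershell
# Prompts for the site's username/password, then requests the page with them
$cred = Get-Credential
$response = Invoke-WebRequest -Uri "https://intranet.example.com" -Credential $cred
$response.StatusCode    # 200 when the page is reachable
```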
Dmitry
Thank you for posting this very helpful script, Dmitry! The only issue I have been having is the receipt of timeout errors when the website being monitored is not down and responds fine. The error detail is “website is down. Details: System.Net.WebException: The operation has timed out”. Have you seen this error and is there a way around it?
Thanks again,
Sean
Sean, no I have not had such issues myself – the script was quite reliable for me when I used it. Is this sporadic or happening to you all the time? I have not looked into the APIs lately – maybe there is a way to set a different timeout.
Thanks for the reply, Dmitry. I originally had the script set to run in Task Scheduler every 5 minutes. I have moved that to every 30 minutes and now only receive the error a couple of times a day. Is it possible to trap this error and have the script not send an email when the error is “operation has timed out”? I do receive a different error when the site is actually down, so I know I won’t miss an actual issue by ignoring “operation has timed out”.
If you are on PowerShell v3, you can use Invoke-WebRequest instead of the .net object – it has a parameter to set a longer timeout: http://technet.microsoft.com/en-us/library/hh849901.aspx
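A sketch of both ideas together (the URL is a placeholder; -TimeoutSec takes seconds, and the message match for the timeout case is an assumption worth testing in your environment):

```powershell
try {
    # Allow up to 60 seconds before Invoke-WebRequest gives up
    Invoke-WebRequest -Uri "http://powergui.org" -TimeoutSec 60 | Out-Null
}
catch {
    # Only alert on errors other than a timeout
    if ($_.Exception.Message -notmatch "timed out") {
        "Site is down. Details: $($_.Exception)"
    }
}
```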
Hi Dmitry, thanks for sharing the script. In a few cases, the URL is redirected to error webpages preset in the application. For example, when launching https://mysite.abc.com it should land on http://mysite.abc.com/home, but when something goes wrong the URL lands on https://mysite.abc.com/home/error.aspx?error=Licenseerror. How do we capture the destination URL and set an alert for this?