Bringing test tools to Nagios monitoring

With all the TDD (test-driven development) and BDD (behavior-driven development) going around these days, it'd be a shame not to use these tools on monitoring applications.

You might have a boatload of tests that test your application before you roll a new version, but do you use those tests while the application is in production? Can you? Yes!

Let's take an important example of monitoring some complex interaction, like searching Google and checking the results. Simple with a mouse, but perhaps complex in code. Even if you wrote a script to do it, using an existing testing framework gets you pass/fail reporting automatically.

For this example, I'll use the following Ruby tools: rspec and webrat. This is fairly easy, though it took me a while to find the right bits of documentation to clue me in to the right way.

require 'rubygems'
require 'webrat'

Spec::Runner.configure do |config|
  # Make webrat's helpers (visit, fill_in, click_link, ...) available in examples
  config.include Webrat::Methods
end

describe "google search for my name" do
  it "should include semicomplete.com in results" do
    visit "http://www.google.com/"
    webrat.response.title.should =~ /Google/
    query = "jordan sissel"
    fill_in "q", :with => query
    field_named("btnG").click
    webrat.response.title.should == "#{query} - Google Search"
    click_link "semicomplete.com"
  end
end
Now, we run this with the 'spec' tool:
% spec rspec-webrat.rb 
.

Finished in 0.578546 seconds

1 example, 0 failures
Seems ok. Let's break the test and see what happens. Change the 'visit' line to something else:
    visit "http://www.yahoo.com/"
Now rerun the test, which was checking specifically for google things in the page and will now fail on yahoo's page:
 % spec rspec-webrat.rb
F

1)
'google search for my name should include semicomplete.com in results' FAILED
expected: /Google/,
     got: "Yahoo!" (using =~)
./rspec-webrat.rb:29:

Finished in 0.186847 seconds

1 example, 1 failure
This output kind of sucks. Additionally, rspec failures seem to exit with code 1, not the 2 that Nagios expects for a check reporting CRITICAL. Let's fix both. First, the exit code can be hacked around directly in Ruby if you want:
# Nagios checks expect exit code '2' to mean CRITICAL.
# Let's make any failing exit always exit 2 (EXIT_CRITICAL).
EXIT_CRITICAL = 2
module Kernel
  alias :original_exit :exit
  # Default the argument so bare 'exit' calls still work; leave
  # success (true or 0) alone and remap everything else to 2.
  def exit(value = true)
    value = EXIT_CRITICAL unless value == true || value == 0
    original_exit(value)
  end
end
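If monkeypatching Kernel feels too invasive, the same remapping can live in a small shell wrapper around the check instead. This is a sketch; the function name is mine, not part of rspec or Nagios:

```shell
# Hypothetical wrapper: run any command and remap a nonzero exit
# status to 2, which Nagios interprets as CRITICAL.
nagios_critical() {
  "$@"
  status=$?
  if [ "$status" -ne 0 ]; then
    return 2
  fi
  return 0
}

# Example usage:
#   nagios_critical /usr/bin/spec rspec-webrat.rb
```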
Fixing the output just means telling spec to use a different output format. I like the 'nested' format. Rerun the test:
% spec -f nested rspec-webrat.rb
google search for my name
  should include semicomplete.com in results (FAILED - 1)

1)
'google search for my name should include semicomplete.com in results' FAILED
expected: /Google/,
     got: "Yahoo!" (using =~)
./rspec-webrat.rb:30:

Finished in 0.017534 seconds

1 example, 1 failure

% echo $?
2
All set.

Even better, you can include multiple checks in the same script if you want. RSpec lets you select any test to run alone, so a Nagios check for a given web application could be as simple as:

define command {
  command_name check_google_for_semicomplete
  command_line /usr/bin/spec -f nested -e "google search for my name" mytests.rb
}
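To make that command an actual check, attach it to a service. A sketch of the service definition, with assumed host and template names:

```
define service {
  use                  generic-service               ; assumed template name
  host_name            webserver                     ; hypothetical host
  service_description  google_finds_semicomplete
  check_command        check_google_for_semicomplete
}
```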

Squid + Selenium == Win

Selenium cannot be used to test remote sites because browsers enforce the same-origin policy, which prevents scripts from modifying content served from other domains. What happens when we fool the browser into believing content comes from a given domain? An extremely simple Squid configuration can do just this.

I wrote one test. It visits Technorati's homepage, searches for 'barcamp', and verifies that a text string exists in the search results.

Steps:

  1. Tell Firefox to use my squid proxy as an HTTP proxy
  2. Visit http://www.technorati.com/_selenium/
  3. Run the tests
Simple, right?

The Squid proxy intercepts any requests for '/_selenium' and redirects them internally to my Selenium web server. This has been tested in IE, Firefox, and Safari with 100% success over vanilla HTTP. HTTPS probably doesn't work for obvious "Duh, it's encrypted" reasons. Squid can fix this as well with SSL reverse proxying.
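For reference, the interception can be expressed in a few lines of squid.conf. This is a sketch with assumed values (Selenium's server listening on localhost port 4444), not necessarily the exact config used here:

```
# Match any request whose path starts with /_selenium
acl selenium_paths urlpath_regex ^/_selenium

# Route matching requests to a local Selenium server instead of the origin
cache_peer 127.0.0.1 parent 4444 0 no-query originserver name=selenium
cache_peer_access selenium allow selenium_paths
cache_peer_access selenium deny all
never_direct allow selenium_paths
```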

If I run my single test, the result is something that looks like the following (Firefox and IE, respectively):