Monday, December 19, 2011

31 Days of Testing-Day 16: Testing Web Services

Updated: Index to all posts in this series is here!

Today’s post is from long-time pal and one-time colleague Dan Hounshell. Dan’s been around the Heartland developer community for quite some time, although it’s been a while since he’s gotten up at a conference or user group to share his great knowledge. (Yes, Dan, that’s a public Dope Slap for you to get presenting again!)

Dan is the Development lead for the Core at Telligent, makers of Telligent Enterprise and Community platforms for social networking. He’s spent several years building out the extensibility services layer for those platforms and helped build up several thousand integration tests covering all aspects of those services. Dan’s post really speaks to the lessons he’s learned through this journey, which is why I was so happy when he agreed to write this post. He’s been there and done that.

In other words, he knows of what he speaks. (Well, or writes in this case.)

Testing Web Services

Let’s kick this off with a big dose of honesty. Creating tests for your web service layer is a lot of hard work. It involves working with a lot of moving parts that can be finicky and stubborn: HTTP, data, XML, maybe JSON, etc. At some point you may get very frustrated. At some point you may consider “skipping it just this once” or maybe even quitting entirely. Do not give up – the payoff is worth it. Once you have a good test suite in place, new tests will be easy to add; you will get on a roll and eventually realize you have thousands of tests covering your application. You will be amazed and proud as you watch your test count grow from 100 to 500 and ultimately into the thousands.

The following tips and advice are based on knowledge that my team and I have gained over the last few years building out a hefty integration test suite for the web service layer of our rather large product.

Types of tests

I have seen several different methods used for testing web services, including unit tests, “light” integration tests, and “full” integration tests. The approach you choose will depend on your experience, how testable your application is, and your goals for your tests.

Unit testing is valuable if your web service layer has been built in a way that allows it to be tested in isolation. If you already have a thorough test suite of integration tests and functionality tests then unit tests may be all that you need. In this scenario you could unit test common methods, serializers, any mappers or converters, etc. Adding unit tests should prove that your web services work properly when given the right data, but they will not guarantee that they are getting it.
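As a rough sketch of what one of those unit tests might look like, here is a hypothetical mapper test; the User, UserContract and UserMapper types are invented purely for illustration:

using NUnit.Framework;

public class User { public string Username; public string Email; }
public class UserContract { public string Username; }

// Hypothetical mapper of the kind you might unit test in isolation
public class UserMapper
{
    public UserContract Map(User user)
    {
        return new UserContract { Username = user.Username };
    }
}

[TestFixture]
public class UserMapperTests
{
    [Test]
    public void map_copies_username_to_contract()
    {
        var contract = new UserMapper().Map(new User { Username = "bob" });
        Assert.AreEqual("bob", contract.Username);
    }
}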

By “light” integration tests I am referring to only testing a layer or so beneath the web services along with them rather than the full stack. Perhaps you have a model layer directly beneath your web services that you can disconnect from the data/persistence layer. Mocking or faking the data will allow you to fully test your web services along with the logic beneath them. This approach is a good option if you already have a full set of integration tests and do not want to spend unnecessary time duplicating testing of the full stack.
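A sketch of that idea, with a hand-rolled fake standing in for the persistence layer; the IUserRepository, FakeUserRepository and UserService types are made up for this example, and User is the same stub used above:

using System.Collections.Generic;
using NUnit.Framework;

public interface IUserRepository { User FindByUsername(string username); }

// Fake data layer so the logic beneath the web services can run without a database
public class FakeUserRepository : IUserRepository
{
    private readonly Dictionary<string, User> _users = new Dictionary<string, User>();
    public void Add(User user) { _users[user.Username] = user; }
    public User FindByUsername(string username)
    {
        User user;
        return _users.TryGetValue(username, out user) ? user : null;
    }
}

public class UserService
{
    private readonly IUserRepository _repository;
    public UserService(IUserRepository repository) { _repository = repository; }
    public User GetUser(string username) { return _repository.FindByUsername(username); }
}

[TestFixture]
public class when_getting_a_user_through_the_service_layer
{
    [Test]
    public void existing_user_is_returned()
    {
        var repository = new FakeUserRepository();
        repository.Add(new User { Username = "bob" });

        Assert.IsNotNull(new UserService(repository).GetUser("bob"));
    }
}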

The approach that we adopted is to do “full” integration testing with our web services. Think of where web services live… on top of everything else. Testing the full stack in this manner is akin to doing a functional test but without having to worry about the UI. We continue unit testing pieces of concern but spend the bulk of our budget testing the entire stack. Most of the rest of this article relates to integration testing web services.

Every journey begins with one small step

Before you can have a great test suite you have to write the first test. Do it. Make it simple. Do it wrong, it doesn’t matter. Don’t set out to create a grand testing framework, write just one test in one test fixture on one page. New up a web client, make a request to an endpoint on your local development site that will return some data, and write a couple of asserts to make sure the response contains something expected. Now run your test(s). Rinse and repeat. Congratulations, you are on your way!
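For example, that first test might be nothing fancier than this; the URL and the expected content are placeholders for whatever your local development site actually serves:

using System.Net;
using NUnit.Framework;

[TestFixture]
public class my_first_web_service_test
{
    [Test]
    public void users_endpoint_returns_something_that_looks_like_users()
    {
        // Point this at a real endpoint on your local development site
        var client = new WebClient();
        var response = client.DownloadString("http://localhost/api/users.xml");

        Assert.IsNotNull(response);
        Assert.IsTrue(response.Contains("<user"));
    }
}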

Refactor, refactor, refactor

Now that you have some tests in place start looking for duplicate/similar code. Make note of every time you cut and paste the same setup code from one test or test fixture to another. Start refactoring those sorts of things out into helpers or into base classes. There is no reason to have the same web client setup code in every test; refactor it so you’ll have something more like this:

var response = MakeGetRequest(baseUrl, "/api/users.xml");
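
A minimal sketch of what a shared helper like that might look like, assuming a base test class that wraps WebClient (the names here are illustrative, not from the original suite):

using System.Net;

public abstract class WebServiceTestBase
{
    // One place for web client setup instead of repeating it in every test
    protected string MakeGetRequest(string baseUrl, string relativeUrl)
    {
        using (var client = new WebClient())
        {
            return client.DownloadString(baseUrl.TrimEnd('/') + relativeUrl);
        }
    }
}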
I am compelled to issue a warning at this point because this is a hole I’ve fallen into myself. Be careful about refactoring too far. Avoid so many layers of abstraction that you hinder test readability. Jim also mentioned this in a previous post, “Day 8: Pay Attention to Your Tests’ Setup!”.

In addition to refactoring the tests themselves, look at the code you are using to setup data for your tests and refactor that as well.

Standing up data

Build a library for standing up data that is easy to use with as little input as possible.

var user = test_data.CreateUser("bob");
In the above example, passwords, email addresses and other things may be required to create a user in your application, but you ought to be able to supply default values for those.
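
One way to supply those defaults is optional parameters (or overloads on older compilers); the TestData class here is a hypothetical sketch of such a library:

public class TestData
{
    // Only the username is required; everything else falls back to sensible defaults
    public User CreateUser(string username, string password = "P@ssw0rd!", string email = null)
    {
        email = email ?? username + "@example.com";
        // call your application's service layer here to actually create the user
        return new User { Username = username, Email = email };
    }
}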

Add fluent interface support to your test data creation methods to make it easy and fast to create complex sets of related data.

var blog = test_data.CreateBlog().WithPosts(3).EachWithComments(3);

Compare that to the tediousness of the example below:

var blog = test_data.CreateBlog(); 
var post1 = test_data.CreateBlogPost(blog); 
var post2 = test_data.CreateBlogPost(blog); 
var post3 = test_data.CreateBlogPost(blog); 
var comment1_1 = test_data.CreateBlogPostComment(post1); 
var comment1_2 = test_data.CreateBlogPostComment(post1); 
… 5 more lines of the same go here … 
var comment3_3 = test_data.CreateBlogPostComment(post3);
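
Under the hood, the fluent version can be a thin builder that remembers the blog and posts it just created. A rough sketch follows; the Blog and BlogPost types are invented, while CreateBlogPost and CreateBlogPostComment are the same test data methods shown above, and CreateBlog() would simply return a new BlogBuilder wrapping the blog it made:

using System.Collections.Generic;

public class BlogBuilder
{
    private readonly TestData _testData;
    private readonly Blog _blog;
    private readonly List<BlogPost> _posts = new List<BlogPost>();

    public BlogBuilder(TestData testData, Blog blog)
    {
        _testData = testData;
        _blog = blog;
    }

    // Each call creates real data through the same test data methods shown above
    public BlogBuilder WithPosts(int count)
    {
        for (var i = 0; i < count; i++)
            _posts.Add(_testData.CreateBlogPost(_blog));
        return this;
    }

    public BlogBuilder EachWithComments(int count)
    {
        foreach (var post in _posts)
            for (var i = 0; i < count; i++)
                _testData.CreateBlogPostComment(post);
        return this;
    }
}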

Consider adding a session monitor to your test data creation library that will keep track of the objects that you create in order to delete them after your tests finish. This allows your developers to concentrate on testing the functionality rather than on cleaning up test data when they are finished with it.
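One simple way to do that is for the test data library to register a delete action for everything it creates and run them all when the session ends; this is a hypothetical sketch:

using System;
using System.Collections.Generic;

// Tracks everything the test data library creates so it can all be removed afterwards
public class TestDataSession : IDisposable
{
    private readonly List<Action> _cleanupActions = new List<Action>();

    // CreateUser, CreateBlog, etc. call this with a delegate that deletes what they just made
    public void Track(Action cleanup)
    {
        _cleanupActions.Add(cleanup);
    }

    public void Dispose()
    {
        // Delete in reverse order so children are removed before their parents
        for (var i = _cleanupActions.Count - 1; i >= 0; i--)
            _cleanupActions[i]();
    }
}

The fixture teardown (or a using block around the test data) disposes the session, and everything created along the way disappears.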

Providing data

In order to perform a full integration test of your web services you will more than likely need a data store to provide known test data or allow for creation of data on the fly. If your web services serve up mostly static or known data and you do not allow creation or updating of data, then you may be able to use your existing local development database for your test suite. Another option is to build a separate database with known data for use by tests, which will work fine for mostly static data or a small number of tests. The best option for dynamic data, however, is to use a data store separate from your development/staging database(s). Creating a new database on the fly, filling it with data as needed for each test, and destroying it when testing is completed is the cleanest method, but it is expensive and will increase the time it takes for your test suite to run.
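
If you go the route of a database created on the fly, an NUnit SetUpFixture is one place to hang that work. This is only a sketch; the connection string, database name and schema scripts are placeholders you would swap for your own:

using System.Data.SqlClient;
using NUnit.Framework;

[SetUpFixture]
public class TestDatabaseSetup
{
    private const string MasterConnectionString =
        @"Data Source=.\SQLEXPRESS;Initial Catalog=master;Integrated Security=True";

    [SetUp]
    public void CreateTestDatabase()
    {
        Execute("CREATE DATABASE WebServiceTests");
        // run schema scripts and seed any known data here
    }

    [TearDown]
    public void DropTestDatabase()
    {
        Execute("ALTER DATABASE WebServiceTests SET SINGLE_USER WITH ROLLBACK IMMEDIATE");
        Execute("DROP DATABASE WebServiceTests");
    }

    private static void Execute(string sql)
    {
        using (var connection = new SqlConnection(MasterConnectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}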

Specifications

Do yourself and your team a favor by writing your tests using specification syntax. I’m not saying that you need to adopt TDD, BDD or the Talking Mister Ed, just that you should make your tests readable. Use an existing specification tool or develop your own wrapper around your chosen test framework’s Assert methods. You will appreciate the naming standards when you have hundreds of files and thousands of tests in your suite. The code below is a simplified example of something you might see in our web services test suite:

Folder: Users
Filename: when_making_a_get_user_request_with_invalid_username
 
 
[TestFixture] 
public class when_making_a_get_user_request_with_invalid_username 
{ 
    // setup stuff for http request goes here 
    [Test] 
    public void response_code_should_be_404()
    { 
        response.Code.ShouldBe(404); 
    }
 
    [Test] 
    public void response_should_have_errors()
    { 
        response.Errors.ShouldNotBeNull(); 
    }
 
    [Test] 
    public void response_should_have_user_not_found_error()
    { 
        response.Errors
                .Where(x => x.Message.Contains("User not found")) 
                .FirstOrDefault()
                .ShouldNotBeNull(); 
    }
}

The above is very readable and is a benefit to your developers, testers and product/project managers. Adopting a specification syntax allows your test suite to describe exactly what your web services deliver.
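
The ShouldBe-style helpers in the example above do not require a separate library; a handful of extension methods wrapping your test framework’s asserts is enough. A minimal sketch against NUnit:

using NUnit.Framework;

public static class ShouldExtensions
{
    public static void ShouldBe<T>(this T actual, T expected)
    {
        Assert.AreEqual(expected, actual);
    }

    public static void ShouldNotBeNull(this object actual)
    {
        Assert.IsNotNull(actual);
    }
}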

Foster the process

Building a good/thorough/dependable test suite is as much about making testing a first-class citizen of the development process as it is about writing and running tests. You must evangelize writing tests. To make sure that your test suite gets built and maintained, you have to require tests with each code change and require that checks for those tests are part of the dev/code review process.

Be strict about feature development – updated code should include updated tests and updated documentation. For web services work I recommend adopting a workflow along these lines:

1. Development

2. Create tests (can be swapped with #1 if you practice TDD)

3. Create/Update documentation

4. Dev review / Code review (make sure tests and documentation are in place)

5. QA review

6. Build a proof of concept, possibly a throwaway app, that makes use of your new web service functionality

If you have good tests in place your QA review may be as simple as explaining the tests that were created and showing that they work.

Get everyone involved. Let everyone know “this is how we do things when we work on web services”. Empower team members to make wholesale changes, refactor, and become part of the solution.

Other tips

Test the edges. Make sure your web services are returning exceptions/errors when they should be.

Always ask “how can/should/do I test this?” when creating/updating code. As Jim has mentioned in many of his previous posts in this series, testing should be part of the development cycle, not something optional or tacked onto the end. How can you know that something works without testing it?

Make writing tests fun. Try stuff that you would not/cannot use in production code.

Always look for new things to test and new ways to test.

“You can’t test everything. Get over it.”

Run the tests for the area you are working in regularly, as in several times per coding session. Run the complete suite (if possible) before committing your changes.

Enjoy!

Remember that building and maintaining a good test suite is a journey and not a destination. At some point you will realize that you have succeeded in building something great and that all the hard work is worth it. It is a pleasure watching your test suite grow over time and flag bugs as they are introduced into your code base. Your customers will appreciate it too!

Comments:

Lakshminarayanan Sampath said...

Hi,
I have been reading your series for the past 3 days and it’s been quite a nice learning experience.
Is there a chance that you would post a consolidated post of all the articles? Possibly in PDF for offline reading as well?

Cheers

Andrew Frank said...

Thanks for compiling this series Jim! One technique that we have used to test our web services is to use SpecFlow. (Although, one could also use Cucumber.)

This helps to make the tests more readable and drives consensus on what the service is actually supposed to do.

Thanks again for the series.

Happy holidays!
