CHAPTER 11
Unit-testing database applications
[...]
suite.addTestSuite(TestCustomerN.class);
return suite;
}
}
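A complete functional suite of this kind typically follows the standard JUnit 3.x suite() pattern. The sketch below illustrates that pattern with hypothetical placeholder class names; it is not the book's actual listing.

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

/** Minimal stand-in for one of the grouped test cases (hypothetical). */
class TestCustomerN extends TestCase
{
    public void testPlaceholder()
    {
        assertTrue(true);
    }
}

/** Groups related test cases into one functional test suite. */
public class TestCustomerFunctionalSuite
{
    public static Test suite()
    {
        TestSuite suite = new TestSuite("Customer functional tests");
        // addTestSuite pulls in every testXxx method of the given TestCase class.
        suite.addTestSuite(TestCustomerN.class);
        return suite;
    }
}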
11.6.3 Using an in-memory database
Most applications spend much of their time waiting for the database, and the same is true of database integration unit tests: the faster a test can reach the database, the faster it runs.
For running tests, an in-memory database can be accessed several orders of magnitude faster than a conventional database server, and your tests speed up accordingly. Hypersonic SQL (HSQLDB), among others, has an in-memory mode that is ideal for running tests.
Unfortunately, using a different database isn't a viable option for many applications. Often the SQL is optimized for the target database, and several common needs are not covered by standard SQL. For example, to return pages of data, Oracle offers the ROWNUM pseudocolumn, but other databases implement paging differently. Vendors also implement other commonly used database features, like triggers and referential integrity, in different ways. However, if your application's SQL runs against an in-memory database as well as your target database, this can be an excellent optimization.
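As an illustration of the idea (not a listing from the book), here is a minimal sketch of a JUnit 3.x test whose fixture opens an in-memory Hypersonic SQL database. The table, data, driver class, and JDBC URL are assumptions and may need adjusting for your HSQLDB version.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import junit.framework.TestCase;

/**
 * Sketch of a test that runs its SQL against an in-memory HSQLDB
 * database instead of the real database server (hypothetical names).
 */
public class TestCustomerDaoInMemory extends TestCase
{
    private Connection connection;

    protected void setUp() throws Exception
    {
        // Load the Hypersonic SQL driver and open an in-memory database.
        Class.forName("org.hsqldb.jdbcDriver");
        connection = DriverManager.getConnection(
            "jdbc:hsqldb:mem:testdb", "sa", "");

        // Create the schema and test data the code under test expects.
        Statement stmt = connection.createStatement();
        stmt.executeUpdate(
            "CREATE TABLE CUSTOMER (ID INTEGER PRIMARY KEY, NAME VARCHAR(50))");
        stmt.executeUpdate("INSERT INTO CUSTOMER VALUES (1, 'John Doe')");
        stmt.close();
    }

    protected void tearDown() throws Exception
    {
        // SHUTDOWN discards the in-memory database between tests.
        Statement stmt = connection.createStatement();
        stmt.execute("SHUTDOWN");
        stmt.close();
        connection.close();
    }

    public void testFindCustomerName() throws Exception
    {
        // In a real test, this query would live in the data-access code under test.
        Statement stmt = connection.createStatement();
        ResultSet rs = stmt.executeQuery(
            "SELECT NAME FROM CUSTOMER WHERE ID = 1");
        assertTrue(rs.next());
        assertEquals("John Doe", rs.getString("NAME"));
        rs.close();
        stmt.close();
    }
}

Because the database lives entirely in memory and is rebuilt in setUp(), each test starts from a known state without ever touching the real database server.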
11.7 Overall database unit-testing strategy
You have seen two main approaches to testing database applications. One uses mock objects to perform unit tests in isolation. Another uses in-container testing (with a tool like Cactus) and preset database data (created with a tool like DbUnit) to perform integration tests.
At this point, you’re probably wondering whether you need to write both types of tests. Is it enough to write only mock-object tests? Can Cactus tests be replaced by functional tests? Let’s address these crucial questions.
11.7.1 Choosing an approach
There are two valid approaches:
■ Mock objects / functional testing—Write mock-objects-style unit tests for the business logic (by separating the business logic from the database access layer); a minimal sketch of this style follows below. Also write mock-objects-style unit tests for the database access layer code. Finally, write functional tests to ensure the whole thing works when it's running in the deployed environment.
■ Mock objects / database integration unit tests / functional tests—Write mock-objects-style unit tests for the business logic, as shown in section 11.2, by separating the business logic from the database access layer. Don't write mock-object unit tests for the database access layer code. Instead, write Cactus integration unit tests (possibly using some mocks to test failure conditions). Also write functional tests (as in the previous approach) for end-to-end testing.
The difference between the two approaches is that in the second, you use Cactus to perform integration unit tests. This approach provides better test coverage but requires more setup, because you need a database (or private dataspace) installed for each developer. You also need to think about builds and deployment from day one (but that can be a good thing).
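To make the first approach concrete, here is a minimal, hypothetical sketch (the interface and class names are ours, not the book's). Because the business logic depends only on a data-access interface, a hand-rolled mock can stand in for the database during the unit test.

import java.util.Arrays;
import java.util.List;

import junit.framework.TestCase;

/** Hypothetical data-access layer interface the business logic depends on. */
interface CustomerDao
{
    List findCustomerNames();
}

/** Hypothetical business logic class, written against the interface only. */
class CustomerReport
{
    private final CustomerDao dao;

    CustomerReport(CustomerDao dao)
    {
        this.dao = dao;
    }

    int countCustomers()
    {
        return dao.findCustomerNames().size();
    }
}

/** Unit test that replaces the database with a hand-rolled mock. */
public class TestCustomerReport extends TestCase
{
    public void testCountCustomers()
    {
        CustomerDao mockDao = new CustomerDao()
        {
            public List findCustomerNames()
            {
                // Canned data instead of a real SQL query.
                return Arrays.asList(new String[] {"John", "Jane"});
            }
        };

        CustomerReport report = new CustomerReport(mockDao);
        assertEquals(2, report.countCustomers());
    }
}

This same separation is what the second approach builds on: the business logic keeps its mock-objects tests, while the real implementation of the data-access interface is covered by Cactus integration unit tests.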
11.7.2 Applying continuous integration
The Cactus approach pushes the integration issues into the development realm. This is helpful, because too often integration comes as a second phase in the project life cycle. In some projects, integration is treated almost as an afterthought!
Integration is when the real problems start to happen: architectural decisions change, forcing parts of the application to be rewritten; deployment teams that have not been trained beforehand begin discovering how the application works; and, of course, the project falls behind schedule. The cost of dealing with integration issues in a J2EE project is extremely high.
Our recommendation is to begin practicing packaging and deployment from day one (maybe after a first exploratory iteration) and improve the process continuously as the project progresses, by automating more and more. As you do this, you’ll find that you gain increasing control over the configuration of your container, and that it takes less time to put a release into production. On a recent project, a complex system took two hours flat to deploy from development to preproduction, and we could put our system into production at any point in time.
The second approach, using Cactus (discussed in the previous section), takes longer to execute, but that shouldn't be an issue. We usually run all the pure JUnit and mock-objects tests continuously from our development IDE and execute the integration unit tests only from time to time. For large projects, we always have a continuous build system on a dedicated machine; it runs continuously, executes all the tests, and sends us an email if there is a build failure. This way, we benefit from the best of both worlds.
11.8 Summary
Most developers feel that it's difficult to unit-test applications that access a database. In this chapter, we've shown three types of simple unit tests for database code: business logic unit tests that test the business code in isolation from the database, data-access unit tests that test the data-access code in isolation from the database, and integration unit tests that exercise the data-access code against the real database.
Business logic tests and data-access unit tests are easy to run as ordinary JUnit tests, with some help from mock objects. The MockObjects.com framework is especially useful, because it provides prewritten mocks for the java.sql interfaces.
For integration unit tests, we have demonstrated the powerful Cactus/DbUnit combo. Not only have we succeeded in running database tests from inside the container, but we have also automated the execution of the tests using Ant. This combo paves the way for practicing continuous integration.
In chapter 9, we started down a path to completely unit-test a web application, including its servlets, filters, taglibs, JSPs, and database access. In this chapter, we completed that path by showing how easy it can be to unit-test database access. In the next chapter, we will introduce an EJB application that demonstrates unit-testing of EJBs.