Designing JUnit Test Cases

Effective functional testing

Functional testing, or integration testing, is concerned with the entire system, not just small pieces (or units) of code. It involves taking features that have been tested independently, combining them into components, and verifying that they work together as expected. For Java, this testing is typically performed using the JUnit framework.

Most Java developers are well-versed in logistical test construction matters, such as how to develop a test fixture, add test methods with assertions, use the setUp() method for initialization, and so forth. However, many Java developers could benefit from a deeper understanding of how to develop a functional test suite that effectively verifies whether code works as designed.

This article introduces and demonstrates the following strategy for building an effective JUnit functional test suite:

  • Identify use cases that cover all actions that your program should be able to perform.
  • Identify the code's entry points - central pieces of code that exercise the functionality that the code as a whole is designed to undertake.
  • Pair entry points with the use cases that they implement.
  • Create test cases by applying the initialize/work/check procedure.
  • Develop runtime event diagrams and use them to facilitate testing.
I demonstrate these strategies by applying them to source code from the Saxon project (http://saxon.sourceforge.net/), an XML utility kit that can process XPath, XQuery, and XSLT. This library is built from approximately 50,000 lines of Java code; it is open source, well written, and well documented.

Identifying Use Cases
There are two goals of functional testing that must be balanced: coverage and granularity. To be complete, functional testing must cover every function that the program provides, and it must do so with tests that are fine-grained enough to isolate individual behaviors. Tests can rely on each other, but no single test should verify two things.

The first step to creating a comprehensive functionality test suite is assembling a list of all the actions that your program should be able to perform. This list can be codified as use cases, each modeling a supported action taken by an outside actor (a human user or another software component) that performs work inside the system.

A typical enterprise Java application already has several documents detailing the requirements of the various users. These may include use case specifications, nonfunctional requirements specifications, test case specifications, user interface design documents, mockups, user profiles, and various additional artifacts. Simple applications typically have one simple text document that details all relevant requirements.

Using these documents, you can quickly identify use cases that should be tested. Each use case describes a scenario that can be exercised through the application. A good practice is to aim for similar-sized scenarios that verify one and only one functionality - larger scenarios can be broken into smaller ones along the lines of the functionalities that they verify.

There are many ways to model use cases, but the simplest is in terms of input/output pairs. For Saxon's Query class, the simplest use case is passing a query file and a path to an output file. The output file is created as needed and filled with the result of running the query in the query file.
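
Expressed directly as an input/output pair, that use case might look like the following sketch of a JUnit test. This is a hedged illustration rather than code from Saxon's own suite: it assumes Saxon's command-line Query class accepts a -o option naming the output file, and it uses java.io.File for a very basic check on the output:

    public void testXQueryEvaluation() throws Exception {
      //the input: a query file and a path for the output
      String queryFilePath = "/path/to/query.xq";
      String outputFilePath = "/path/to/output.xml";
      //run the query through Saxon's Query entry point
      Query.main(new String[] { "-o", outputFilePath, queryFilePath });
      //the output: the result file should exist and be non-empty
      assertTrue(new File(outputFilePath).length() > 0);
    }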

More complex use cases may take more input or produce more output. The defining point, however, is that use cases do not specify or care how the work is performed internally. They treat the software as a "black box" inside of which all work could be performed by gnomes, as long as it's performed. This is an important point: use cases expressed as input/output pairs translate easily and directly into test cases, so complex specifications map onto simple tests that verify that the required operations work and that operations that should fail actually fail.
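
For the "should fail" half, the same framing applies: the input is something invalid, and the expected output is a failure. A minimal sketch, assuming that Saxon's XPathEvaluator (used in a full example later in this article) rejects a malformed expression by throwing an exception:

    public void testMalformedXPathIsRejected() {
      XPathEvaluator xpe = new XPathEvaluator();
      try {
        //deliberately invalid input: an unclosed predicate
        xpe.createExpression("/some/xpath[unclosed");
        fail("expected the malformed expression to be rejected");
      } catch (Exception expected) {
        //the failure itself is the expected output
      }
    }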

Defining the use cases for the designated entry points is simple if the class is relatively straightforward, or if there is already a specification document that enumerates all of the possible class uses. If not, it might be necessary to learn about the various ways the class is expected to behave (and possibly highlight confusion as to the class's purpose and use). Use cases can also be extracted from the code itself if you are willing to look for all of the places where the code is called.

Most likely, the class has some rudimentary documentation, and by supplementing this documentation with the developers' domain knowledge, it should be possible to fully determine what the class should and shouldn't be able to do. With this knowledge, an appropriate set of use cases can be developed.

Translating Test Cases
Each test case is divided into two parts: input and expected output. The input part lists all the test case statements that create variables or assign values to variables. The expected output part indicates the expected results; it shows either assertions or the message 'no exception' (when no assertions exist).

The basic input/output format is the simplest, easiest to understand model to follow for test cases. It follows the pattern of normal functions (pass arguments, get return value), and most user actions (press this button to perform this action). The pattern, then, is to:

  • Initialize: Create the environment that the test expects to run in. The initialization code can either be in the beginning of the test or in the setUp() method.
  • Work: Call the code that is being tested, capturing any interesting output and recording any interesting statistics.
  • Check: Use assert statements to ensure that the code worked as expected.
For instance, assume that you want to test the Saxon library's Transform class entry point. One of its use cases is to transform an XML file into an HTML file, given an XSL file that describes the transformation. The inputs are the paths to the three files, and the output is the contents of the HTML file. This translates very directly into the following test:

    public void testXSLTransformation() throws Exception {
      /* initialize the variables
        (or do this in setUp if used in many tests) */
      String processMePath = "/path/to/file.xml";
      String stylesheetPath = "/path/to/stylesheet.xsl";
      String outputFilePath = "/path/to/output.html";
      //do the work (Saxon expects options such as -o
      //to precede the source and stylesheet arguments)
      Transform.main(new String[] {
        "-o", outputFilePath,
        processMePath,
        stylesheetPath } );
      //check the work
      assertTrue(checkOutputFile(outputFilePath));
    }

Each step can be as simple or complex as necessary. The variables declared here could just as easily call methods to obtain their values. The work could consist of several steps that achieve the desired outcome. Moreover, the check can sometimes be omitted entirely: if the code under test signals failure by throwing an exception, then completing without an exception is itself the expected result (the 'no exception' case noted above).
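
When several tests in the same fixture need the same environment, the initialization can move out of the test body and into setUp(), as mentioned earlier. Here is a minimal sketch of the same transformation test restructured that way; the paths and the checkOutputFile() helper remain the same placeholders used above:

    private String processMePath;
    private String stylesheetPath;
    private String outputFilePath;

    protected void setUp() {
      //initialize: build the environment shared by every test in the fixture
      processMePath = "/path/to/file.xml";
      stylesheetPath = "/path/to/stylesheet.xsl";
      outputFilePath = "/path/to/output.html";
    }

    public void testXSLTransformation() throws Exception {
      //work: options such as -o precede the source and stylesheet
      Transform.main(new String[] {
        "-o", outputFilePath, processMePath, stylesheetPath } );
      //check
      assertTrue(checkOutputFile(outputFilePath));
    }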

The pattern is very simple and very flexible, but step two is decidedly generic. This template gives us no method for finding the code to be tested, or any assurances that the code is set up in a way that facilitates testing. This is a serious concern.

Focusing Functional Tests
The search can be narrowed to a more useful context by identifying central pieces of code that exercise the functionality that the code as a whole is designed to undertake. These classes are considered the code's entry points because they provide a way to jump into the system from the outside.

The overall goal of this process is to identify a group of classes that provide a high-level interface to the system functionality. The easier it is to use each class independently, the better. After all, the more the class can be decoupled from its surroundings, the easier it is to test.

Determining which code to identify as entry points is a fairly straightforward process. In a library, there are usually a select few entry points that control all of the library's functionality. These facade classes act as mediators between client code and the library, shielding the developer on the outside from the complexity of the code within. This is exactly the type of class whose methods should be tested first.

For instance, Saxon provides a small collection of classes that act as a portal into the rest of the library, and thus serve as a logical entry point. By coding to the facade classes such as Transform, Configuration, and Query, library client code can use a vast number of worker classes without having to worry about their interfaces… or even their existence. These facade classes therefore provide a simple way to test the system functionality using the high-level and easy-to-use interfaces that are a sign of a good library.

In application code, there is usually an obvious separation between modules of functionality. In some code, these modules are segregated to the extent that they can largely be treated as if they were each separate libraries whose functionality can be accessed through a handful of facade classes. These classes are the logical places to look for high-level interfaces. A plug-in architecture will usually follow this design, in that each individual plug-in has a simple interface that can effectively exercise the entirety of the contained code.

In less rigidly delineated systems, there is usually a central point through which all activity passes. This mediator class is often a 'controller' in an MVC paradigm, and it routes requests to and from parts of the system. The vast majority of the overall system functionality is implemented by classes to which this controller connects; consequently, these classes are prime candidates for testing. This can be seen in Applet design, where the class deriving from java.applet.Applet will be the central processor of the entire code base. Depending on whether the code is thoroughly decomposed, testing can focus on either the Applet subclass itself, or on those classes with which it works.

Code between modules is also prime code to test. The class that converts application requests into database queries is a good candidate, as are similar adapter classes.

Various MVC (Model-View-Controller) framework-based components may be easier to test with other testing frameworks, some of which extend JUnit. For example, Struts actions are best tested using the StrutsTestCase extension of JUnit; server-side components such as Servlets, JSPs, and EJBs are best tested using Cactus; and HttpUnit is the best framework for conducting black-box Web application testing. All of the techniques discussed in this article apply when writing tests in these frameworks.

Moving from Use Case to Test Case
Once the entry points have been discovered, they must be paired with the use cases that they implement. In some cases, this is a trivial step because the classes' names self-document to the point that matching is automatic: for instance, Saxon's Transform class performs the XSL transformations; the Query class performs the XQuery resolutions, etc.

In other cases, the search is more difficult. Often, a use case describes functionality that exists only as a cross-cutting concern that is not exemplified in any single class; the behavior in question is visible only when a group of classes interacts, or when certain conditions apply. In these cases, the test has a longer than average initialization phase, or the setUp() method can be used to provide the requisite environment.

The work phase, where the code is actually being called, should be only a single line if possible. Minimizing the contact with the tested code helps you avoid depending on side effects and unstable implementation details. The test's check phase is commonly the most complex because it must often compensate for code that was not written to be tested. The test may be forced to pull apart the results to ensure that they satisfy the requirements. Occasionally, the results are so difficult to obtain that multiple steps are required to get them into a form that the test can recognize. Both of these cases are true in the above test for XSL transformations; the results are in a file, which must be read into memory, and are in a complex XML format, which must be scrutinized to ensure accuracy.
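
The checkOutputFile() helper called by the transformation test above is an example of this extra work: before anything can be asserted, the output file has to be read back into memory. The helper is not shown in this article, but a minimal, hypothetical sketch of what it might do (assuming java.io.BufferedReader, java.io.FileReader, and java.io.IOException are imported) could look like this:

    private boolean checkOutputFile(String path) {
      try {
        //read the whole output file back into memory
        BufferedReader reader = new BufferedReader(new FileReader(path));
        StringBuffer contents = new StringBuffer();
        String line;
        while ((line = reader.readLine()) != null) {
          contents.append(line).append('\n');
        }
        reader.close();
        //a real suite would scrutinize the markup; this only looks
        //for the root element the transformation was expected to produce
        return contents.indexOf("<html") >= 0;
      } catch (IOException e) {
        //a missing or unreadable output file fails the check
        return false;
      }
    }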

A simpler example can be taken from Saxon. Given an XML file and an XPath expression, Saxon can evaluate the expression and return a list of all matches. Saxon ships with a sample class - the XPathExample class - that does precisely this. With its interactive portions pared away, the class reduces to this test:

    public void testXPathEvaluation() throws Exception {
      //initialize
      XPathEvaluator xpe = new XPathEvaluator(
        new SAXSource(new InputSource("/path/to/file.xml")));
      XPathExpression findLine =
        xpe.createExpression("/some/xpath[expression]");
      //work
      List matches = findLine.evaluate();
      //check
      assertTrue(matches.size() > 0);
    }

The two inputs are the two constant strings, and the output is the list of matches, which is tested to ensure that matches were indeed found. All the work is performed in one line, which simply calls the method that is being tested.


Nada daVeiga is the Product Manager of Java Solutions at Parasoft, where she has been a senior member of the Professional Services team for two years. Nada's background includes development of service-oriented architectures for the integration of rich media applications such as Artesia Teams, IBM Content Manager, Stellent Content Server, and Virage Video Logger. Nada developed J2EE enterprise applications and specialized in content transport frameworks using XML, JMS, SOAP, and JWSDP technologies. As a presales engineer, Nada worked with clients such as Cisco, Fidelity, HBO, and Time Warner. Nada holds a bachelor's degree in computer science from the University of California, Los Angeles (UCLA).
