Unit Test Results, Code Coverage and Quality Metrics for Android Apps

For a recent demo, I wanted to use Jenkins to give a “QA dashboard” view of a native Android application build.  In particular, I wanted to display metrics for:

  1. Unit Test Results (in JUnit format)
  2. Code Coverage (using Emma)
  3. Code Quality (using Android Lint)

It’s not hard to do, and there are excellent OSS Jenkins plugins to do the heavy lifting - you just need to configure your Android build to work with them.  This means adding a customized test target to your Android test project’s build.xml file, plus a one-line change to the test project’s Android manifest file.  The easiest approach (see the Jenkins CI wiki for details) is to copy the entire test target from the main Android SDK build file (${sdk.dir}/tools/ant/build.xml) and paste it into the test project’s build.xml to override the target.  Make sure that you insert the new target before the line that imports the Android SDK’s build.xml (<import file="${sdk.dir}/tools/ant/build.xml" />); you also need to change version-tag to “custom” (i.e. <!-- version-tag: custom -->) to keep your changes from being overwritten by the android update project command.
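As a sketch, the skeleton of the test project’s build.xml then looks something like this (the project name and the elided target body are illustrative; the real target body is the copy you made from the SDK build file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="DroidFishTest" default="help">
    <!-- Changed from the generated value so that
         "android update project" leaves this file alone. -->
    <!-- version-tag: custom -->

    <!-- Customized copy of the SDK's test target, inserted
         BEFORE the SDK build file is imported so it overrides
         the stock target. -->
    <target name="test" depends="-test-project-check"
            description="Runs tests from the package defined in test.package property">
        <!-- ... customized body copied from ${sdk.dir}/tools/ant/build.xml ... -->
    </target>

    <import file="${sdk.dir}/tools/ant/build.xml" />
</project>
```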

I’ve given an example below of the Android manifest and build.xml that I use with the DroidFishTest project: you can see the full source code on GitHub and the Jenkins configuration for this example is available online.  Let’s take a look at how this works.

First, you need to use a custom TestRunner class that can output the results of the Android unit tests in JUnit format.  There are several open source projects that you can use for this: two that I looked at are android-junit-report and the-missing-android-xml-junit-test-runner.  Just download the jar file for whichever one you want to use and copy it to the libs folder of the test project (in my example, DroidFishTest/libs); remember to add it to your source or binary artifact repository, as the library must be available when the test project is built.  Then you add the full class name in the <instrumentation> section of the test project’s Android manifest file and set a corresponding test.runner property in the test project’s build.xml file.  That causes the Android build job to output the unit test results in the JUnit XML format that Jenkins understands.

By default, the TestRunner class will store its output on the device: check the documentation for the implementation you are using to find the exact location.  In this example, I’m using android-junit-report and this sends output to /data/data/${tested.project.app.package}/files/junit-reports. The <exec> task at the end of the test target uses adb pull to copy the report files from the device to the project workspace.
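If you want to inspect the output by hand before wiring up Jenkins, you can pull the reports from a connected device or emulator with adb yourself (the package name here follows the DroidFish example; adjust it for your app):

```shell
# Copy the JUnit XML reports written by android-junit-report
# from the device into a local junit-results directory.
mkdir -p junit-results
adb pull /data/data/org.petero.droidfish/files/junit-reports junit-results
```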

The other minor change required is to configure the Emma code coverage tool to produce its output in XML format as well as the default HTML, as required by the Jenkins Emma Plugin.  That’s just a one-line change: <xml outfile="coverage/coverage.xml"/> does the trick.

Here’s the AndroidManifest.xml:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="org.petero.droidfish.test"
    android:versionCode="1"
    android:versionName="1.0" >

    <uses-sdk android:minSdkVersion="3" />

    <instrumentation
        android:name="com.zutubi.android.junitreport.JUnitReportTestRunner"
        android:targetPackage="org.petero.droidfish" />

    <application
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name" >
        <uses-library android:name="android.test.runner" />
    </application>
</manifest>

And here’s the relevant section of the Android test project’s build.xml:    

<!-- Overriding test target to configure Emma XML output and android-junit-report test runner -->
<target name="test" depends="-test-project-check"
        description="Runs tests from the package defined in test.package property">
    <property name="test.runner" value="com.zutubi.android.junitreport.JUnitReportTestRunner" />
[...]
                <!-- TODO: reports in other, indicated by user formats -->
                <html outfile="coverage/coverage.html" />
                <xml outfile="coverage/coverage.xml" />
            </report>
        </emma>
        <echo level="info">Cleaning up temporary files...</echo>
        <delete file="${out.absolute.dir}/coverage.ec" />
        <delete file="${out.absolute.dir}/coverage.em" />
        <echo level="info">Saving HTML report file in ${basedir}/coverage/coverage.html</echo>
        <echo level="info">Saving XML report file in ${basedir}/coverage/coverage.xml</echo>
[...]
    <mkdir dir="${basedir}/junit-results"/>
    <exec executable="${adb}" failonerror="true" dir="junit-results">
        <arg line="${adb.device.arg}" />
        <arg value="pull" />
        <arg value="/data/data/${tested.project.app.package}/files/" />
    </exec>
</target>


Now the rest of the configuration is easy.  We just add the emma target to the main Ant build, which becomes ant clean emma debug install test, and we add an Execute Shell task after the Ant build to run Android Lint with ( cd DroidFish; os_opts="-Djava.awt.headless=true" lint --xml lint-results.xml . ).  Then we add three post-build actions to publish the JUnit, Emma and Lint results, giving the workspace locations for the output XML files we configured above:
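Putting the build steps together, the Jenkins job’s shell commands can be sketched as follows (the directory layout follows the DroidFish example; adjust paths for your own projects):

```shell
# Build step 1: instrumented build with Emma coverage,
# install on the device/emulator, then run the tests.
cd DroidFishTest
ant clean emma debug install test

# Build step 2: run Android Lint over the application project,
# writing results in the XML format the Lint plugin reads.
cd ../DroidFish
os_opts="-Djava.awt.headless=true" lint --xml lint-results.xml .
```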

[Screenshot: Jenkins post-build action publishing Android Lint results]

Finally, for completeness, I added Cobertura support to my mongochess service, which is a Java Maven project.  I added cobertura-maven-plugin to the <build> and <reporting> sections: the latter isn’t necessary for configuring reporting in Jenkins as the Cobertura Plugin does the job, but I wanted to run the standard Maven reporting goals outside of Jenkins, so I added that for good measure.  Here are the relevant sections from my project pom.xml file:

<build>
    <finalName>mongochess</finalName>
    <plugins>
      <plugin>
        <groupId>com.cloudbees</groupId>
        <artifactId>bees-maven-plugin</artifactId>
        <version>1.0-SNAPSHOT</version>
      </plugin>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>cobertura-maven-plugin</artifactId>
        <version>2.5.2</version>
        <configuration>
          <format>xml</format>
        </configuration>
      </plugin>
    </plugins>
  </build>

  <reporting>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>cobertura-maven-plugin</artifactId>
        <version>2.5.2</version>
        <configuration>
          <format>xml</format>
        </configuration>
      </plugin>
    </plugins>
  </reporting>
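With the plugin configured, the coverage data that the Jenkins Cobertura Plugin publishes is generated by the plugin’s standard goal; for example:

```shell
# Runs the tests against Cobertura-instrumented classes and
# writes the coverage report (including coverage.xml, since
# <format>xml</format> is configured) under target/site/cobertura.
mvn clean cobertura:cobertura
```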

And finally, here’s what the Cobertura Code Coverage Report looks like in the Jenkins project dashboard:

[Screenshot: Cobertura Code Coverage Report in the Jenkins project dashboard]
 

Mark Prichard, Senior Director of Product Management

CloudBees

 

Mark Prichard is Java PaaS Evangelist for CloudBees. He came to CloudBees after 13 years at BEA Systems and Oracle, where he was Product Manager for the WebLogic Platform. A graduate of St John’s College, Cambridge and the Cambridge University Computer Laboratory, Mark works for CloudBees in Los Altos, CA.  Follow Mark on Twitter and via his blog Clouds, Bees and Blogs.