How-to: Test HBase Applications Using Popular Tools

While Apache HBase adoption for building end-user applications has skyrocketed, many of those applications (and many apps generally) have not been well-tested. In this post, you’ll learn some of the ways this testing can easily be done.

We will start with unit testing via JUnit, then move on to using Mockito and Apache MRUnit, and then to using an HBase mini-cluster for integration testing. (The HBase codebase itself is tested via a mini-cluster, so why not tap into that for upstream applications, as well?)

As a basis for discussion, let’s assume you have an HBase data access object (DAO) that does the following insert into HBase. The logic could be more complicated, of course, but for the sake of example, this does the job.

public class MyHBaseDAO {

    public static void insertRecord(HTableInterface table, HBaseTestObj obj)
            throws Exception {
        Put put = createPut(obj);
        table.put(put);
    }

    // package-private (rather than private) so the unit test below can call it directly
    static Put createPut(HBaseTestObj obj) {
        Put put = new Put(Bytes.toBytes(obj.getRowKey()));
        put.add(Bytes.toBytes("CF"), Bytes.toBytes("CQ-1"),
                Bytes.toBytes(obj.getData1()));
        put.add(Bytes.toBytes("CF"), Bytes.toBytes("CQ-2"),
                Bytes.toBytes(obj.getData2()));
        return put;
    }
}


HBaseTestObj is a basic data object with getters and setters for rowkey, data1, and data2.
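
Its definition isn’t included in this post; a minimal sketch of what it could look like, purely for reference, is:

public class HBaseTestObj {
    private String rowKey;
    private String data1;
    private String data2;

    public String getRowKey() { return rowKey; }
    public void setRowKey(String rowKey) { this.rowKey = rowKey; }
    public String getData1() { return data1; }
    public void setData1(String data1) { this.data1 = data1; }
    public String getData2() { return data2; }
    public void setData2(String data2) { this.data2 = data2; }
}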

The insertRecord method does an insert into the HBase table against the column family CF, with CQ-1 and CQ-2 as the column qualifiers. The createPut method simply populates a Put and returns it to the calling method.

Using JUnit

JUnit, which is well known to most Java developers at this point, is easily applied to many HBase applications. First, add the dependency to your pom:

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.11</version>
    <scope>test</scope>
</dependency>


Now, within the test class:

public class TestMyHbaseDAOData {
    @Test
    public void testCreatePut() throws Exception {
        HBaseTestObj obj = new HBaseTestObj();
        obj.setRowKey("ROWKEY-1");
        obj.setData1("DATA-1");
        obj.setData2("DATA-2");
        Put put = MyHBaseDAO.createPut(obj);
        assertEquals(obj.getRowKey(), Bytes.toString(put.getRow()));
        assertEquals(obj.getData1(), Bytes.toString(put.get(Bytes.toBytes("CF"),
                Bytes.toBytes("CQ-1")).get(0).getValue()));
        assertEquals(obj.getData2(), Bytes.toString(put.get(Bytes.toBytes("CF"),
                Bytes.toBytes("CQ-2")).get(0).getValue()));
    }
}


What you did here was to ensure that your createPut method creates, populates, and returns a Put object with expected values.

Using Mockito

So how do you go about unit testing the above insertRecord method? One very effective approach is to do so with Mockito.

First, add Mockito as a dependency to your pom:

<dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-all</artifactId>
    <version>1.9.5</version>
    <scope>test</scope>
</dependency>


Then, in the test class:

@RunWith(MockitoJUnitRunner.class)
public class TestMyHBaseDAO {
  @Mock
  private HTableInterface table;
  @Mock
  private HTablePool hTablePool;
  @Captor
  private ArgumentCaptor<Put> putCaptor;

  @Test
  public void testInsertRecord() throws Exception {
    //return the mock table when getTable is called
    when(hTablePool.getTable("tablename")).thenReturn(table);
    //create a test object and make a call to the DAO that needs testing
    HBaseTestObj obj = new HBaseTestObj();
    obj.setRowKey("ROWKEY-1");
    obj.setData1("DATA-1");
    obj.setData2("DATA-2");
    MyHBaseDAO.insertRecord(table, obj);
    //capture the Put that the DAO handed to the table
    verify(table).put(putCaptor.capture());
    Put put = putCaptor.getValue();

    assertEquals(obj.getRowKey(), Bytes.toString(put.getRow()));
    assertTrue(put.has(Bytes.toBytes("CF"), Bytes.toBytes("CQ-1")));
    assertTrue(put.has(Bytes.toBytes("CF"), Bytes.toBytes("CQ-2")));
    assertEquals("DATA-1", Bytes.toString(put.get(Bytes.toBytes("CF"),
        Bytes.toBytes("CQ-1")).get(0).getValue()));
    assertEquals("DATA-2", Bytes.toString(put.get(Bytes.toBytes("CF"),
        Bytes.toBytes("CQ-2")).get(0).getValue()));
  }
}


Here you have populated HBaseTestObj with “ROWKEY-1”, “DATA-1”, “DATA-2” as values. You then used the mocked table and the DAO to insert the record. You captured the Put that the DAO would have inserted and verified that the rowkey, data1, and data2 are what you expect them to be.

The key here is to manage HTablePool and HTableInterface instance creation outside the DAO. This allows you to mock them cleanly and test Puts as shown above. Similarly, you can now expand into all the other operations such as Get, Scan, Delete, and so on.
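
For example, a caller might own the pool and hand tables to the DAO along these lines (RecordService and the table name are illustrative, not part of the original code):

public class RecordService {
    private final HTablePool pool;

    public RecordService(HTablePool pool) {
        this.pool = pool;
    }

    public void save(HBaseTestObj obj) throws Exception {
        HTableInterface table = pool.getTable("tablename");
        try {
            MyHBaseDAO.insertRecord(table, obj);   //the DAO never creates its own table instance
        } finally {
            table.close();                         //returns the pooled table
        }
    }
}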

Using MRUnit

With regular data access unit testing covered, let’s turn toward MapReduce jobs that go against HBase tables.

Testing MR jobs that go against HBase is as straightforward as testing regular MapReduce jobs: MRUnit makes it easy to test MapReduce jobs, including ones that read from or write to HBase.

Imagine you have an MR job that writes to an HBase table, “MyTest”, which has one column family, “CF”. The reducer of such a job could look like:

public class MyReducer extends TableReducer<Text, Text, ImmutableBytesWritable> {
   public static final byte[] CF = "CF".getBytes();
   public static final byte[] QUALIFIER = "CQ-1".getBytes();

   @Override
   public void reduce(Text key, Iterable<Text> values, Context context)
       throws IOException, InterruptedException {
     //bunch of processing to extract the data to be inserted; in our case, let's say we
     //simply append all the records we receive from the mapper for this particular key
     //and insert one record into HBase
     StringBuffer data = new StringBuffer();
     Put put = new Put(Bytes.toBytes(key.toString()));
     for (Text val : values) {
         data.append(val);
     }
     put.add(CF, QUALIFIER, Bytes.toBytes(data.toString()));
     //write to HBase
     context.write(new ImmutableBytesWritable(Bytes.toBytes(key.toString())), put);
   }
}


Now how do you go about unit-testing the above reducer in MRUnit? First, add MRUnit as a dependency to your pom.

<dependency>
   <groupId>org.apache.mrunit</groupId>
   <artifactId>mrunit</artifactId>
   <version>1.0.0</version>
   <scope>test</scope>
</dependency>


Then, within the test class, use the ReduceDriver that MRUnit provides as below:

public class MyReducerTest {
    ReduceDriver<Text, Text, ImmutableBytesWritable, Writable> reduceDriver;
    byte[] CF = "CF".getBytes();
    byte[] QUALIFIER = "CQ-1".getBytes();

    @Before
    public void setUp() {
      MyReducer reducer = new MyReducer();
      reduceDriver = ReduceDriver.newReduceDriver(reducer);
    }

    @Test
    public void testHBaseInsert() throws IOException {
      String strKey = "RowKey-1", strValue = "DATA", strValue1 = "DATA1",
          strValue2 = "DATA2";
      List<Text> list = new ArrayList<Text>();
      list.add(new Text(strValue));
      list.add(new Text(strValue1));
      list.add(new Text(strValue2));
      //since all the reducer does is append the records the mapper sends it,
      //we should get the following back
      String expectedOutput = strValue + strValue1 + strValue2;
      //set up the input, mimicking what the mapper would have passed to the reducer
      reduceDriver.withInput(new Text(strKey), list);
      //run the reducer and get its output
      List<Pair<ImmutableBytesWritable, Writable>> result = reduceDriver.run();

      //extract the key from the result and verify it
      assertEquals(strKey, Bytes.toString(result.get(0).getFirst().get()));

      //extract the value for CF/QUALIFIER and verify it
      Put put = (Put) result.get(0).getSecond();
      String actual = Bytes.toString(put.get(CF, QUALIFIER).get(0).getValue());
      assertEquals(expectedOutput, actual);
    }
}


Basically, after a bunch of processing in MyReducer, you verified that:

  • The output is what you expect.
  • The Put that is inserted into HBase has “RowKey-1” as the rowkey.
  • “DATADATA1DATA2” is the value for the CF column family and CQ-1 column qualifier.


You can also test Mappers that get data from HBase in a similar manner using MRUnit’s MapDriver, or test MR jobs that read from HBase, process the data, and write to HDFS.
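
As a quick sketch of the mapper side (the mapper here is hypothetical — it simply emits the row key and the CF:CQ-1 value it reads — and your real mapper will differ), the test builds the Result by hand and feeds it to MapDriver:

public class MyTableMapper extends TableMapper<Text, Text> {
   public static final byte[] CF = "CF".getBytes();
   public static final byte[] QUALIFIER = "CQ-1".getBytes();

   @Override
   public void map(ImmutableBytesWritable key, Result value, Context context)
       throws IOException, InterruptedException {
     //emit the row key and the CF:CQ-1 cell value as Text
     context.write(new Text(Bytes.toString(key.get())),
         new Text(Bytes.toString(value.getValue(CF, QUALIFIER))));
   }
}

public class MyTableMapperTest {
    MapDriver<ImmutableBytesWritable, Result, Text, Text> mapDriver;

    @Before
    public void setUp() {
      mapDriver = MapDriver.newMapDriver(new MyTableMapper());
    }

    @Test
    public void testMap() throws IOException {
      byte[] row = Bytes.toBytes("ROWKEY-1");
      //build the Result that HBase's TableInputFormat would have handed to the mapper
      KeyValue kv = new KeyValue(row, "CF".getBytes(), "CQ-1".getBytes(),
          Bytes.toBytes("DATA-1"));
      mapDriver.withInput(new ImmutableBytesWritable(row), new Result(new KeyValue[] { kv }));
      mapDriver.withOutput(new Text("ROWKEY-1"), new Text("DATA-1"));
      mapDriver.runTest();
    }
}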

Using an HBase Mini-cluster

Now we’ll look at how to go about integration testing. HBase ships with HBaseTestingUtility, which makes writing integration tests with an HBase mini-cluster straightforward. In order to pull in the correct libraries, the following dependencies are required in your pom:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.0-cdh4.2.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.94.2-cdh4.2.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>
        
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.0.0-cdh4.2.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.0.0-cdh4.2.0</version>
    <scope>test</scope>
</dependency>


Now, let’s look at how to run through an integration test for the MyHBaseDAO insert described in the introduction:

public class MyHBaseIntegrationTest {
    private static HBaseTestingUtility utility;
    byte[] CF = "CF".getBytes();
    byte[] CQ1 = "CQ-1".getBytes();
    byte[] CQ2 = "CQ-2".getBytes();

    @Before
    public void setup() throws Exception {
        utility = new HBaseTestingUtility();
        utility.startMiniCluster();
    }

    @After
    public void tearDown() throws Exception {
        //shut the mini-cluster down so each test starts from a clean slate
        utility.shutdownMiniCluster();
    }

    @Test
    public void testInsert() throws Exception {
        HTableInterface table = utility.createTable(Bytes.toBytes("MyTest"),
                Bytes.toBytes("CF"));
        HBaseTestObj obj = new HBaseTestObj();
        obj.setRowKey("ROWKEY-1");
        obj.setData1("DATA-1");
        obj.setData2("DATA-2");
        MyHBaseDAO.insertRecord(table, obj);
        Get get1 = new Get(Bytes.toBytes(obj.getRowKey()));
        get1.addColumn(CF, CQ1);
        Result result1 = table.get(get1);
        assertEquals(obj.getRowKey(), Bytes.toString(result1.getRow()));
        assertEquals(obj.getData1(), Bytes.toString(result1.value()));
        Get get2 = new Get(Bytes.toBytes(obj.getRowKey()));
        get2.addColumn(CF, CQ2);
        Result result2 = table.get(get2);
        assertEquals(obj.getRowKey(), Bytes.toString(result2.getRow()));
        assertEquals(obj.getData2(), Bytes.toString(result2.value()));
    }
}


Here you created an HBase mini-cluster and started it. You then created a table called “MyTest” with one column family, “CF”. You inserted a record using the DAO you needed to test, did a Get from the same table, and verified that the DAO inserted records correctly.

The same could be done for much more complicated use cases, along with MR jobs like the ones shown above. You can also access the HDFS and ZooKeeper mini-clusters created alongside the HBase one, run an MR job against them, output to HBase, and verify the inserted records, as in the sketch below.
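
For instance, a test method along these lines could be added to MyHBaseIntegrationTest. This is only a rough sketch: MyMapper, the input path, and the job wiring are illustrative assumptions, not code from this post. With just startMiniCluster(), the job runs via the local job runner against the mini HDFS; HBaseTestingUtility also offers startMiniMapReduceCluster() if you want a mini MR cluster as well.

@Test
public void testMRJobWritingToHBase() throws Exception {
    //the utility's configuration is already wired to the mini HDFS, ZooKeeper, and HBase
    Configuration conf = utility.getConfiguration();
    FileSystem fs = utility.getTestFileSystem();
    fs.copyFromLocalFile(new Path("src/test/resources/input.txt"), new Path("/input"));

    utility.createTable(Bytes.toBytes("MyTest"), Bytes.toBytes("CF"));
    Job job = new Job(conf, "writes-to-MyTest");
    job.setMapperClass(MyMapper.class);              //hypothetical mapper emitting Text/Text pairs
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    TableMapReduceUtil.initTableReducerJob("MyTest", MyReducer.class, job);
    FileInputFormat.addInputPath(job, new Path("/input"));
    assertTrue(job.waitForCompletion(true));

    //finally, Get or Scan the "MyTest" table, as in testInsert() above, to verify what the job wrote
}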

Just a quick note of caution: starting up a mini-cluster takes 20 to 30 seconds and cannot be done on Windows without Cygwin. However, because mini-cluster tests should only be run periodically, the longer run time should be acceptable.
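
One convenient way to keep these slower tests out of every build (a suggestion beyond what’s shown above, not a requirement) is to name them with an IT suffix and bind them to the Maven Failsafe plugin, which picks up *IT.java classes only during mvn verify:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.14</version>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>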

You can find sample code for the above examples at https://github.com/sitaula/HBaseTest. Happy testing!

Sunil Sitaula