Cloudera Engineering Blog · Security Posts

How-to: Set Up a Hadoop Cluster with Network Encryption

Hadoop network encryption is a feature introduced in Apache Hadoop 2.0.2-alpha and in CDH4.1.

In this blog post, we’ll first cover Hadoop’s pre-existing security capabilities, then explain why network encryption may be required and how it has been implemented. At the end, you’ll find step-by-step instructions to help you set up a Hadoop cluster with network encryption.
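
As a quick preview of what’s involved, here is a minimal client-side sketch, assuming a CDH4.1-era cluster. The two properties shown are the core knobs; in a real deployment they are set cluster-wide in core-site.xml and hdfs-site.xml rather than in client code.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WireEncryptionPreview {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // SASL quality of protection for Hadoop RPC:
        // "authentication" (the default), "integrity", or "privacy" (encrypted).
        conf.set("hadoop.rpc.protection", "privacy");

        // Encrypt the block data exchanged with DataNodes. This must also be
        // enabled in hdfs-site.xml on the NameNode and DataNodes.
        conf.setBoolean("dfs.encrypt.data.transfer", true);

        // Subsequent HDFS traffic from this client is encrypted on the wire.
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/")));
    }
}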

A Bit of History on Hadoop Security

How-to: Enable User Authentication and Authorization in Apache HBase

With the default Apache HBase configuration, everyone is allowed to read from and write to all tables available in the system. For many enterprise setups, this kind of policy is unacceptable. 

Administrators can set up firewalls that decide which machines are allowed to communicate with HBase. However, machines that pass the firewall can still read from and write to all tables. This kind of mechanism is effective but insufficient, because HBase still cannot differentiate between multiple users sharing the same client machine, and there is still no granularity of access control at the level of HBase tables, column families, or column qualifiers.
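
For contrast, here is the kind of fine-grained grant that HBase’s own access control makes possible. This is a minimal sketch using the AccessControlClient helper from later HBase releases (0.98 and up); it assumes the AccessController coprocessor is already enabled on the cluster, and the user and table names are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.access.AccessControlClient;
import org.apache.hadoop.hbase.security.access.Permission;
import org.apache.hadoop.hbase.util.Bytes;

public class GrantExample {
    public static void main(String[] args) throws Throwable {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf)) {
            // Grant user "alice" read-only access to a single column family
            // of one table -- granularity no network firewall can express.
            AccessControlClient.grant(conn,
                TableName.valueOf("sales"),   // hypothetical table
                "alice",                      // hypothetical user
                Bytes.toBytes("details"),     // column family
                null,                         // null = any column qualifier
                Permission.Action.READ);
        }
    }
}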

Authorization and Authentication in Hadoop

One of the more confusing topics in Hadoop is how authorization and authentication work in the system. The first thing to recognize is the subtle yet crucial distinction between authorization and authentication, so let’s define these terms first:

Authentication is the process of determining whether someone is who they claim to be.
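
To make the distinction concrete, here is a minimal sketch of a client on a secured cluster. The Kerberos keytab login is authentication (proving who the caller is); what that user may then do, such as reading a given HDFS file, is a separate authorization decision made by the server. The principal and keytab path are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class AuthnVersusAuthz {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Authentication: verify identities using Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");
        // Authorization: have services check ACLs for protocol access.
        conf.setBoolean("hadoop.security.authorization", true);
        UserGroupInformation.setConfiguration(conf);

        // Authentication step: prove we are "alice" with a keytab
        // (principal and path are hypothetical).
        UserGroupInformation.loginUserFromKeytab(
            "alice@EXAMPLE.COM", "/etc/security/keytabs/alice.keytab");

        // What alice may now read or write is decided separately by
        // authorization checks such as HDFS file permissions.
        System.out.println(UserGroupInformation.getLoginUser());
    }
}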

Introducing Alfredo, Kerberos HTTP SPNEGO for Java

What Are Kerberos & SPNEGO?

Kerberos is an authentication protocol that provides mutual authentication and single sign-on capabilities.
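
On the client side, Alfredo hides the SPNEGO handshake behind its AuthenticatedURL class. Here is a minimal sketch, assuming the caller already holds a Kerberos ticket (for example, obtained via kinit) and a hypothetical endpoint protected by Alfredo’s AuthenticationFilter.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

import com.cloudera.alfredo.client.AuthenticatedURL;

public class SpnegoClient {
    public static void main(String[] args) throws Exception {
        // The token caches the signed cookie issued after the initial
        // SPNEGO handshake, so follow-up requests skip Kerberos entirely.
        AuthenticatedURL.Token token = new AuthenticatedURL.Token();

        // Hypothetical URL; the endpoint must be protected by Alfredo.
        URL url = new URL("http://host.example.com:8080/app/resource");
        HttpURLConnection conn = new AuthenticatedURL().openConnection(url, token);

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            System.out.println(in.readLine());
        }
    }
}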

Configuring Security Features in CDH3

Post written by Cloudera Software Engineer Aaron T. Myers.

Apache Hadoop has had mechanisms for user authorization for some time. The Hadoop Distributed File System (HDFS) has a permissions model similar to Unix’s to control file and directory access, and MapReduce has access control lists (ACLs) per job queue to control which users may submit jobs. These authorization schemes allow Hadoop users and administrators to specify exactly who may access Hadoop’s resources. However, until recently, these mechanisms relied on a fundamentally insecure method of identifying the user interacting with Hadoop; that is, Hadoop had no way of performing reliable authentication. This limitation meant that any authorization system built on top of Hadoop, while helpful in preventing accidental unwanted access, could do nothing to prevent malicious users from accessing other users’ data.
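
To ground that point, here is a minimal sketch of the Unix-like authorization HDFS already offered. The path, owner, and group are hypothetical, and changing ownership requires superuser privileges.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class HdfsAuthorizationExample {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path reports = new Path("/data/reports");  // hypothetical path

        // Unix-like authorization: owner, group, and mode decide who may
        // read or write the directory...
        fs.setOwner(reports, "alice", "analysts");
        fs.setPermission(reports, new FsPermission((short) 0750));

        // ...but without reliable authentication, a malicious user could
        // simply claim to be "alice" -- the gap that Kerberos support closes.
    }
}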

Hadoop World: Security and API Compatibility

Today’s Hadoop World talk comes from Owen O’Malley and covers two of the biggest challenges facing Hadoop: security and API compatibility.

Over the past several months, Yahoo! has been leading the charge in both areas. This work will enable wider use of Hadoop within Yahoo! as well as lower the barrier for new users – particularly those working with sensitive data. A big thanks to Yahoo! and everyone else in the community for helping out.
