Meet the Instructor: Nathan Neff
In this installment of “Meet the Instructor,” we speak to St. Louis-based Nathan Neff, the Training Lead for Cloudera’s new Data Analyst course.
What is your role at Cloudera?
I’m an instructor teaching almost all of Cloudera’s curricula: Developer, Administrator, Data Analyst, HBase, and Hadoop Essentials. I’m currently gearing up to start delivering Cloudera’s Introduction to Data Science training, which, from an instructor’s perspective, is a pretty exciting challenge. Most of the classes I teach are live and in-person, but I’ve also recorded screencasts and helped design multimedia courseware for Cloudera’s customers, which was a lot of fun.
Right now, my primary role is Training Lead for Cloudera’s new Data Analyst course. I worked closely with our curriculum team during design and development to ensure the best possible experience for every student. I taught the train-the-trainer session and the first public classes, relaying participant feedback along with guidance from my own experience. My goal is to help create the best course available to analysts, developers, and administrators who want to work with Big Data at scale and in real time using Hadoop tools.
Now that the course has been released, my responsibility is twofold: ensure that Cloudera’s Data Analyst Training stays up-to-date with the rapidly changing ecosystem, and equip our instructors with the tools to deliver a fun, challenging, and consistent class.
What do you enjoy most about training and/or curriculum development?
Like most technical instructors, I enjoy meeting new people who see opportunity in skills advancement. It’s a treat to teach students who are as excited about the technology as I am. My biggest reward is hearing about all the cool new ways people and organizations are using Hadoop and Big Data in practice.
In addition to teaching, curriculum development helps me remain an active and contributing member of the Hadoop community. One of the qualities that typifies Cloudera is that we not only manage the largest Hadoop knowledge base, but also work to broaden access in any way we can. A relevant and state-of-the-art training curriculum is a fantastic way to spread the word on the latest and greatest in the platform. As often as I can, I provide additional questions, exercises, and challenges to students, and I’ve now started screencasting tips and tricks for our Data Analyst instructors when I find a clever way of teaching a subject or point from our classes.
Describe an interesting application you’ve seen or heard about for Apache Hadoop.
The applications of Hadoop that I find most interesting are those that hold the potential to positively impact huge numbers of people, particularly projects related to genome sequencing. A student in a recent class discussed the benefits and risks of knowing our genetic makeup: prevention and early treatment of diseases, increased life expectancy, and issues surrounding privacy and ownership of all this information. Hadoop has essentially made possible the storage, management, and analysis of the vast quantities of data that help us understand human biology. Over time, Hadoop will enable the collection of all the information stored in the cells and behaviors of every living thing on Earth, which could have amazing implications for humanitarianism, science, and health.
What advice would you give to an engineer, system administrator, or analyst who wants to learn more about Big Data?
The best advice is to engage with people who are similarly interested in Big Data. Cloudera University classes are a great place to meet other people delving into Hadoop-based scenarios and hear first-hand accounts and use cases from other students and our widely experienced instructors.
I also recommend finding user groups in your city that specialize in Hadoop, Big Data, machine learning, data science, or ecosystem projects like Cloudera Impala, Apache HBase, and Apache Pig. Start by exploring meetup.com for a Hadoop User Group near you.
Finally, I suggest diving in head first! Cloudera has a fantastic QuickStart VM available for download that has Hadoop and the ecosystem components already set up. Think of some kind of Big Data project you’d like to work on and get to it! Just remember that you’re never done learning.
How did you become involved in technical training and Hadoop?
My previous experience as an instructor came at BWorks in St. Louis, where, for five years, I taught computer classes to kids in fifth through eighth grades. I wrote the curriculum for those classes, including one in which students built a video game using MIT’s excellent Scratch programming tool.
I learned about Hadoop while working for a startup that specialized in comparing the features of and making recommendations about enterprise software. We used Hadoop, HBase, and Apache Solr to collect and serve this huge and diverse set of data to customers who wanted objective comparisons and evaluations of the relevant software products available to them.
What’s one interesting fact or story about you that a training participant would be surprised to learn?
I play guitar in an Irish band called Rusty Nail. We released an album and gig around the St. Louis area. I even recently found out that an Oakland A’s pitcher liked our album and uses one of our songs as his warm-up and entrance soundtrack during home games. That’s really cool!