Lessons learned putting Hadoop into production
Many Apache Hadoop deployments begin as small test clusters, serving either as a sandbox for analyzing data in new ways or as a means of solving a small, specific business problem. As more use cases are discovered, more data is loaded into the cluster, and the cluster grows to provide expanded capacity to the organization. Eventually, one or more of those use cases yields insight critical to the efficient operation of the business, creating the need for a full-scale production Hadoop system. As clusters grow and the business becomes more dependent on the results, challenges arise in many aspects of deployment, from installation and configuration to monitoring and managing the daily operations of the cluster.
Cloudera Solution Architect Eric Sammer has helped move dozens of Hadoop clusters into production, including some of the largest seen to date. In his upcoming webinar, Eric will share key insights and considerations drawn not only from his own experiences but also from those of the entire Cloudera Solution Architect team.
When bringing your Hadoop cluster into production, some of the key items to keep in mind include:
- Proper planning
- Data ingestion
- ETL and data processing infrastructure
- Authentication, authorization, and sharing
- Monitoring
During this one-hour webinar, titled “Production-izing Hadoop: Lessons Learned,” Eric will cover each of these topics in detail and leave ample time for questions at the end of the hour.
Eric presented a condensed version of this webinar at Hadoop World in October, where it was rated the top breakout session of the 42 offered. If you weren’t able to attend, here’s another opportunity to better understand the obstacles and considerations involved in bringing a Hadoop cluster to production.