This is a guest post written by Ronda Swaney, a freelance author and journalist.
Gartner has predicted that there will be 20.4 billion connected things in existence by 2020. Those things range from Internet of Things (IoT) sensors on manufacturing equipment, to smart electric meters attached to homes, to continuous glucose monitors worn by people with diabetes. As the sources and speed of data capture grow, data management must evolve to keep up. But as data management evolves, what role will streaming data and IoT data architecture play?
The Need for IoT Data Architecture That Delivers Real-Time Results
With billions of IoT and streaming devices on the horizon, the growth of IoT appears to be unstoppable. Enterprises are building initiatives around these devices because they realize there’s value contained in the data they produce. But the raw data on its own isn’t what’s valuable. The only way to realize the true value of IoT and streaming data is to process and understand that data in real time.
As data is collected from IoT and streaming devices, it’s vital to have the appropriate data architecture in place to organize it, scale as the data stream fluctuates, and ensure that real-time analysis is possible. In today’s business world, learning what happened two or three days ago is often too late to act on. You should be able to harness predictive capabilities based on your real-time data. For that, you need infrastructure that can scale as needed, manage whatever data volume comes in, and enable the analysis needed to react in real time.
For most organizations, analytics has been a backward-looking exercise. What this new world offers is the ability to greatly reduce the time between when an event happens and when you know about it. More importantly, this data allows organizations to predict what will happen in the future. IoT is the connecting fabric that makes this possible: it delivers data from the edge, lets you analyze it, and enables you to act on the results.
The Evolution of IoT Data Architecture
The data flow from IoT devices enables a continuous cycle that captures sensor data, uses it to refine algorithms, and feeds it back to the point of origination. This demands an infrastructure that is both flexible and scalable.
To create the best architecture setup for IoT initiatives, you must answer a few questions:
- IoT enables extreme data ingestion. For example, a single autonomous vehicle can create four terabytes of data per day. How will your architecture ingest this data? Can you use batch mode, or will stream processing be required?
- The data coming in from IoT sensors will also contain metadata. How will your architecture manage metadata? How crucial is that metadata for analysis?
- Can your architecture handle scaling? How unpredictable are your data streams? The answers to these questions determine how elastic you must make your IoT data architecture, and that elasticity directly affects how quickly your queries can return results.
- Lastly, you need to determine where the data must be analyzed. Do you need to filter and process the data at the edge? Should it be processed in multiple points along the stream? Must all the data be stored for analysis, or can some be filtered out?
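The filtering question in the last bullet can be sketched in a few lines. The sensor format, device names, and temperature threshold below are illustrative assumptions, not part of any particular IoT platform; the point is simply that dropping uninteresting readings at the edge shrinks the volume sent upstream for central analysis:

```python
# Minimal sketch of edge-side filtering for a sensor stream.
# Device IDs, the reading schema, and the 75 °C threshold are
# hypothetical examples, not a real platform's API.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Reading:
    device_id: str
    temperature_c: float


def filter_at_edge(readings: Iterable[Reading],
                   threshold_c: float = 75.0) -> Iterator[Reading]:
    """Forward only readings above the alert threshold, so the
    bulk of routine data never leaves the edge device."""
    for reading in readings:
        if reading.temperature_c > threshold_c:
            yield reading


# A small simulated stream: only two readings exceed the threshold.
stream = [
    Reading("pump-01", 68.2),
    Reading("pump-02", 81.5),
    Reading("pump-01", 90.1),
]
forwarded = list(filter_at_edge(stream))
```

In a real deployment the same decision, which readings to keep, which to summarize, and which to discard, might be applied at several points along the stream, as the bullet above suggests.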
The State of Your IoT Data Architecture
Just as there is no one-size-fits-all approach to IoT initiatives, there’s no one-size-fits-all answer for building the best IoT data architecture. The correct answer depends on the parameters of your initiative, as well as your business and data strategy. Research from TDWI found an even split in the preparedness of organizations looking to deploy IoT. Roughly one-third said they had their architecture already in place. Another third were uncertain if their current architecture could support IoT. The final third did not believe their architecture could support such projects. Where does your organization fall on that scale?
If you’re questioning the strength and flexibility of your architecture to manage IoT projects, then download the TDWI Pulse Report: Developing a Data Strategy for IoT.