Kathleen Martin

Just about every surface in the material world, and every environment in the digital one, is being instrumented to emit data describing itself and its behavior. That includes, but is not limited to, cars, software applications, factories, 400-ton mining trucks, financial markets, power grids, ice caps, satellites, apparel, appliances, phones, bodies, brains, and jet engines; the list goes on. If the ripening of your apples isn’t already being tracked by line graphs in a cloud-based app, it may be soon.
It’s no surprise, then, that data is growing exponentially. By 2025, the world is expected to hold 41.7 billion IoT devices transmitting 73.1 zettabytes of data. Data only becomes more important as businesses increasingly use it to make business-altering decisions, and getting more data means instrumenting things that previously went uninstrumented.
This expected ballooning of data is a good thing for us data nerds, but it comes with an infamous set of complications. As it stands, hundreds of vendors and thousands of individual community contributors are responsible for instrumenting the world. But this network of contributors is far from a well-oiled machine: disparate technologies handle data communication in disparate ways all over the world. What’s needed is a set of IoT interoperability standards spanning data collection, stream processing, visualization, alerting, and machine learning and AI frameworks.
Dissecting one of IoT’s biggest challenges
A quick look at the IoT landscape shows how quickly the complexity can grow. There are multiple variables in play, and there is a potential for issues at every level of IoT architecture.
Take instrumentation as one example. If a device vendor bakes metric emission into the firmware, the instrumentation is typically unchangeable by users; if you’re lucky, the target systems these devices send data to may be configurable. Meanwhile, purpose-built sensors are designed to fit equipment and gather their respective signals, but efficiently collecting that data can again be hampered by vendor firmware. You could deploy third-party services to pull, or “scrape,” data from data sources, but this requires that the data sources actually allow it.
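In practice, that kind of scraping often amounts to a small poller that pulls readings over HTTP, assuming the device exposes an endpoint at all. Here is a minimal sketch in Python; the device address, endpoint path, and JSON fields are all hypothetical:

import time

import requests  # third-party HTTP client: pip install requests

# Hypothetical device endpoint; real firmware may or may not expose one.
DEVICE_URL = "http://192.168.1.50/metrics"

def scrape_once():
    """Pull a single reading from the device, if its firmware allows it."""
    resp = requests.get(DEVICE_URL, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. {"temperature_c": 21.4, "ts": 1700000000}

while True:
    try:
        print(scrape_once())
    except requests.RequestException as err:
        # Firmware that blocks or throttles third-party polling surfaces here.
        print(f"scrape failed: {err}")
    time.sleep(10)  # poll interval; tune to the sensor's update rate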
After instrumentation, the next thing to consider is the architecture of the pipeline from the data source to where that data is ultimately analyzed—a convoluted space to say the least. The moment you begin to think about agents, gateways, message queues, and streaming engines, the questions pour in. Which do you use? One? Some? All? Which of each do you use? How many? Where do you put them? What order do they go in?
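To put names on those moving parts, here is one minimal arrangement sketched in Python: an agent beside the data source reads a sensor and publishes each reading to an MQTT message queue, leaving the streaming engine and everything downstream to other components. The broker address, topic, and reading are hypothetical, and this is one of many possible pipelines, not a recommendation. It assumes the Eclipse paho-mqtt client library:

import json
import time

from paho.mqtt import publish  # third-party: pip install paho-mqtt

BROKER = "broker.example.local"    # hypothetical gateway/broker address
TOPIC = "plant/line1/temperature"  # hypothetical topic hierarchy

def read_sensor():
    # Stand-in for a real driver call; returns a fabricated reading.
    return {"ts": time.time(), "temperature_c": 21.4}

while True:
    reading = read_sensor()
    # The agent's only job is to move the reading onto the queue;
    # stream processing, storage, and alerting all live downstream.
    publish.single(TOPIC, payload=json.dumps(reading), hostname=BROKER)
    time.sleep(10)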
To complicate matters further, the answers to these questions depend on your answers to all the other questions: a Cartesian product of possible solutions. They’re all interdependent decisions, so the candidate technologies have to be evaluated and chosen essentially simultaneously. Is your head spinning yet? It’s no wonder digital transformation has felt more like a lifestyle than a step toward progress.
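That Cartesian product is literal. A toy enumeration in Python, using a few representative and arbitrarily chosen technologies as stand-ins, shows how quickly the option space multiplies:

from itertools import product

agents = ["vendor agent", "Telegraf", "none"]
queues = ["Kafka", "MQTT broker", "none"]
engines = ["Flink", "Spark Streaming", "none"]
stores = ["time series database", "data lake"]

# Every pipeline is one pick from each list, and each pick constrains
# the others, so candidates must be weighed together, not one by one.
candidates = list(product(agents, queues, engines, stores))
print(len(candidates))  # 3 * 3 * 3 * 2 = 54 possible pipelines

And with realistic option counts, plus the questions of how many of each to run and where to put them, the space grows far beyond what one-at-a-time evaluation can handle.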
Continue reading: https://www.infoworld.com/article/3649795/overcoming-the-iot-interoperability-hurdle.html