Industry 4.0 Is Running on Impulse Power
At the Imagine conference, Edge Impulse announced a suite of new tools geared toward moving Industry 4.0 from R&D mode to production.
A lot of ink is spilled these days about the coming of the Fourth Industrial Revolution, or Industry 4.0. The story goes that the advances that brought us to the present moment, in which digital and communications technologies are tightly integrated into industrial processes, will soon give way to a new era driven by artificial intelligence (AI). In practice, however, progress has been quite slow. There have been some wins, especially in areas like predictive maintenance, but by and large, Industry 4.0 is still in R&D mode.
Many of us know Edge Impulse from their platform that simplifies the development and deployment of edge machine learning models. At their Imagine conference that just wrapped up, they put a strong emphasis on bringing production-grade AI to industry. Toward that goal, they announced that a new suite of tools will soon be released on their platform. These tools include a production-ready object detection algorithm, model tuning and deployment applications, and even some GenAI tooling to enhance the entire pipeline.
YOLO + Industry 4.0 = YOLO-Pro
One of the more interesting announcements was the development of a new object detection algorithm called YOLO-Pro. It is based on the well-known and effective YOLO algorithm, but with a first-of-its-kind twist: it has been extensively trained on large industrial datasets to optimize its performance for real-world Industry 4.0 applications. Various sizes of YOLO-Pro are available so that it can be deployed anywhere from highly constrained edge devices all the way up to powerful systems with cutting-edge GPUs.
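In practice, offering a family of model sizes means a deployment pipeline can select the largest variant that still fits a device's resources. The sketch below illustrates that selection logic in a simplified way; the variant names and memory figures are invented for illustration and are not Edge Impulse's actual catalog or API.

```python
# Hypothetical illustration: choosing an object-detection model variant
# that fits within a device's RAM budget. All names and numbers are
# invented for this example.
VARIANTS = [  # (variant name, approximate RAM required in MB)
    ("yolo-pro-nano", 1),
    ("yolo-pro-small", 16),
    ("yolo-pro-medium", 256),
    ("yolo-pro-large", 4096),
]

def pick_variant(ram_budget_mb: float) -> str:
    """Return the largest variant that fits within the RAM budget."""
    fitting = [name for name, ram in VARIANTS if ram <= ram_budget_mb]
    if not fitting:
        raise ValueError("no variant fits the budget")
    return fitting[-1]  # VARIANTS is ordered smallest to largest

print(pick_variant(512))  # a single-board computer class device
print(pick_variant(2))    # a microcontroller class device
```

The same idea scales in the other direction: a GPU-equipped gateway would simply report a much larger budget and receive the largest variant.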
Once a pipeline has been developed, it will need to be fine-tuned for optimal performance before it is ready for a production deployment. To simplify this process, the team at Edge Impulse also announced a tool called Application Behavior. It was designed to optimize models in the full context of a deployment against desired business outcomes. Whether you are building a model for sound or speech detection, object tracking and counting, or fall detection, Application Behavior ensures peak performance on edge AI devices.
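To make "optimizing against business outcomes" concrete, consider tuning a detection threshold so that the total business cost of mistakes is minimized, where a missed event costs far more than a false alarm. The snippet below is a simplified stand-in for the kind of optimization a tool like Application Behavior automates; the cost weights and score data are invented for illustration.

```python
# Hedged sketch: pick the detection threshold that minimizes a business
# cost function, where missed events are weighted more heavily than
# false alarms. Data and cost weights are invented for illustration.

def total_cost(threshold, scores_labels, miss_cost=10.0, false_alarm_cost=1.0):
    """Sum the cost of missed detections and false alarms at a threshold."""
    cost = 0.0
    for score, is_event in scores_labels:
        predicted = score >= threshold
        if is_event and not predicted:
            cost += miss_cost          # missed a real event
        elif predicted and not is_event:
            cost += false_alarm_cost   # raised a false alarm
    return cost

# (model confidence score, whether a real event occurred)
data = [(0.9, True), (0.8, True), (0.4, True), (0.6, False), (0.2, False)]

# Sweep candidate thresholds and keep the cheapest one.
best = min((t / 100 for t in range(101)), key=lambda t: total_cost(t, data))
print(best, total_cost(best, data))
```

A naive accuracy-maximizing threshold would treat both error types equally; weighting them by their real-world cost is what ties the tuning back to the business outcome.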
Deployment is only the beginning
Deployment is not the end of the life cycle for a machine learning pipeline. These algorithms will encounter new situations from time to time, and they must be updated to maintain high levels of performance. For this reason, Model Monitoring was introduced. This tool continuously monitors edge AI models after they have been deployed to track how they are performing. Issues can be addressed by updating the model with new data collected in the field, and other over-the-air updates can be applied to optimize performance.
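One common way to decide when a deployed model needs fresh field data is to watch for drift in its prediction confidence relative to a baseline established at deployment time. The sketch below shows that idea in miniature; the threshold and the sample confidence values are invented, and this is a generic technique rather than a description of how Model Monitoring works internally.

```python
# Hedged sketch: flag a deployed model for retraining when its average
# prediction confidence on recent field data drops well below the
# baseline measured at deployment. Threshold and data are invented.
from statistics import mean

def needs_retraining(baseline_conf, recent_conf, drop_threshold=0.10):
    """Return True when recent mean confidence falls more than
    `drop_threshold` below the baseline mean."""
    return mean(baseline_conf) - mean(recent_conf) > drop_threshold

baseline = [0.92, 0.88, 0.95, 0.90]   # confidences at deployment time
recent = [0.70, 0.65, 0.72, 0.68]     # confidences from the field

print(needs_retraining(baseline, recent))  # True: confidence has dropped
```

When the flag trips, the remedy described above applies: collect the new field data, retrain, and push the updated model over the air.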
Last but certainly not least, the team unveiled a tool called AI Actions. This suite of GenAI tools serves to not only enhance model performance, but also accelerate the pace of pipeline development. These goals were achieved by using technologies like large language models for data labeling and quality control, and integration with NVIDIA Omniverse, OpenAI, Hugging Face, and ElevenLabs to produce synthetic audio, video, and sensor data for more accurate and robust models.
Whew! That was a lot to pack into a one-day conference. As industrial partners start to leverage these new tools, we should expect to see their AI transformations move from hype to reality.