Embedding AI at the Factory Edge
Use cases covered: visual inspection, production area reduction, acoustic monitoring.
Explore the history of large-format 3D printing and how to design your parts for best results on these machines. Guest: Marques Franklin, GE Additive Customer Success Technical Account Manager.
Assembly Line
How Edge Analytics Can Help Manufacturers Overcome Obstacles Associated with More Equipment Data
Big data is transforming a variety of sectors, ushering them into the era of Industry 4.0. However, having access to raw data and knowing what to do with it sit at opposite ends of the digitalization spectrum. To help manufacturers understand and overcome some of the challenges associated with smart manufacturing, Martin Thunman, CEO and co-founder of Crosser, a leading low-code platform for streaming analytics, automation, and integration for industrial IoT, shares his insight.
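To make the idea of edge analytics concrete, here is a minimal, hypothetical sketch of the general pattern the article describes: aggregating high-frequency equipment data at the gateway so that only compact summaries travel upstream. The function and parameter names (sensor_stream, publish, window_size) are illustrative assumptions, not any specific platform's API.

```python
import statistics
import time


def summarize_window(readings):
    """Reduce a window of raw sensor samples to a few summary features."""
    return {
        "timestamp": time.time(),
        "mean": statistics.fmean(readings),
        "stdev": statistics.pstdev(readings),
        "peak": max(abs(r) for r in readings),
        "count": len(readings),
    }


def edge_loop(sensor_stream, publish, window_size=1000):
    """Aggregate at the edge; only summaries leave the gateway."""
    window = []
    for sample in sensor_stream:
        window.append(sample)
        if len(window) >= window_size:
            # publish() stands in for whatever transport the plant uses,
            # e.g. MQTT or HTTP to a historian or cloud endpoint.
            publish(summarize_window(window))
            window.clear()
```

The point of the pattern is simply that a thousand raw vibration samples become one small record, which is what makes growing volumes of equipment data manageable.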
SLAM for the real world
To take the next leap forward, the robotics industry needs software that is reliable and effective in the real world, yet flexible and cost-effective to integrate into a wider range of robot platforms, and optimized to make efficient use of limited compute, power, and memory resources. Creating "commercial-grade" software that is robust enough to be deployed in thousands of robots in the real world, at prices that make that scale achievable, is the next challenge for the industry.
Tree Model Quantization for Embedded Machine Learning Applications
Compressed tree-based models are worth considering for embedded machine learning applications, particularly when paired with the compression technique of quantization. Quantization can shrink models significantly, with the trade-off of a slight loss in model fidelity, leaving more room on the device for other programs.
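As a rough illustration of the idea, the sketch below affine-quantizes the split thresholds of a tree to 8-bit integers so each threshold takes one byte instead of four. This is a generic example of the technique, not the scheme of any particular library; the array of thresholds is made up for demonstration.

```python
import numpy as np


def quantize_array(values, bits=8):
    """Affine-quantize float values to unsigned integers with a scale and offset."""
    values = np.asarray(values, dtype=np.float32)
    lo, hi = float(values.min()), float(values.max())
    scale = (hi - lo) / (2**bits - 1) or 1.0  # guard against a constant array
    q = np.round((values - lo) / scale).astype(np.uint8)
    return q, scale, lo


def dequantize_array(q, scale, lo):
    """Recover approximate float thresholds for use at inference time."""
    return q.astype(np.float32) * scale + lo


# Example: split thresholds of a small tree, stored as uint8 instead of float32.
thresholds = [0.12, 3.7, -1.5, 0.9, 2.25]
q, scale, offset = quantize_array(thresholds)
print(q)                                     # 4x smaller storage per threshold
print(dequantize_array(q, scale, offset))    # close to, but not exactly, the originals
```

The small gap between the original and recovered thresholds is the "slight loss in model fidelity" traded for the memory savings.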
The realities of developing embedded neural networks
With any embedded software destined for deployment in volume production, an enormous amount of effort goes into the code once the implementation of its core functionality has been completed and verified. This optimization phase is all about minimizing the memory, CPU, and other resources needed, so that as much of the software's functionality as possible is preserved while the resources required to execute it are reduced to the absolute minimum.
This process of creating embedded software from lab-based algorithms enables production engineers to cost-engineer software functionality into a mass-production-ready form, requiring far cheaper, less capable chips and hardware than the massive datacenter compute used to develop it. However, it usually requires the functionality to be frozen from the beginning, with code modifications made only to improve the way the algorithms themselves are executed. For most software, that is fine: indeed, it enables a rigorous verification methodology to be used to ensure the embedding process retains all the functionality needed.
However, when embedding NN-based AI algorithms, that can be a major problem. Why? Because by freezing the functionality from the beginning, you are removing one of the main ways in which the execution can be optimized.
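A short numerical sketch of why this is so: the optimizations that matter most for neural networks, such as quantizing weights, slightly change the network's outputs, so the "functionality" cannot be treated as bit-identical and frozen up front. The example below uses a NumPy stand-in for one layer of a trained network; the shapes and the symmetric per-tensor scheme are assumptions for illustration, not a description of any specific toolchain.

```python
import numpy as np

rng = np.random.default_rng(0)


def quantize_weights(w, bits=8):
    """Symmetric quantization: float32 weights -> int8 values plus one scale factor."""
    scale = np.abs(w).max() / (2**(bits - 1) - 1)
    q = np.clip(np.round(w / scale), -(2**(bits - 1)), 2**(bits - 1) - 1).astype(np.int8)
    return q, scale


# A stand-in for one fully connected layer of a trained network.
w = rng.normal(size=(256, 128)).astype(np.float32)
x = rng.normal(size=(1, 256)).astype(np.float32)

q, scale = quantize_weights(w)
y_float = x @ w
y_quant = x @ (q.astype(np.float32) * scale)  # dequantized weights at inference

# The outputs drift slightly: the behaviour is no longer bit-identical to the
# original model, which is why freezing the functionality before optimization
# removes one of the main levers for reducing memory and compute.
print(float(np.abs(y_float - y_quant).max()))
```

In practice this is why NN optimization flows often re-evaluate accuracy, or even retrain, as the model is compressed, rather than verifying against a frozen reference.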
Surge Demand
Warehouse automation is being deployed at a rapid pace due to labor shortages. AI can now beat you at crosswords.