Imagimob Edge is an easy-to-use SaaS solution that transforms AI models developed in TensorFlow and saved in the h5 file format into highly efficient C-code, ready for use in edge devices. "Doing the job in seconds instead of weeks, Imagimob Edge is a huge game-changer for companies looking to innovate with Edge AI. Offering a huge step forward for both productivity and accessibility, Imagimob Edge stands to impact the growth of the Edge AI device market at large," the company states.
Imagimob Edge is designed for software developers who want to run TensorFlow deep learning AI models on embedded devices with constrained resources, such as the popular Arm Cortex-M series MCUs. TensorFlow is the leading open-source machine learning framework on the market, and thousands of companies use it to develop deep learning applications. The standard file format extension for trained AI models in TensorFlow or Keras is h5.
Traditionally, converting an h5 file to C-code optimized for an edge device takes days or even weeks for a skilled programmer. But with Imagimob Edge, anyone can do it in a matter of seconds.
Imagimob Edge is extremely easy to operate: users simply open an h5 file, let the service start up, then click a button to generate the C-code automatically. The generated C-code can be compiled and run on any edge device without external dependencies and without a runtime, and it is highly efficient in terms of both processing power and RAM usage.
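The article does not show what the generated code looks like, but dependency-free inference C-code for an MCU typically bakes the trained weights into the binary as constant data and computes each layer with plain loops. The sketch below is a hypothetical illustration of that idea for a single dense layer with ReLU activation; the dimensions, weight values, and function name are invented for this example and are not Imagimob's actual output.

```c
/* Hypothetical sketch only: the weights, dimensions, and function name
 * are invented for illustration; real generated code will differ. */

#define IN_DIM  3
#define OUT_DIM 2

/* Trained parameters stored as constant data compiled into the binary,
 * so no file system, heap, or runtime is needed on the device. */
static const float weights[OUT_DIM][IN_DIM] = {
    {  0.5f, -0.25f, 0.10f },
    { -0.3f,  0.80f, 0.05f }
};
static const float bias[OUT_DIM] = { 0.01f, -0.02f };

/* Dense layer with ReLU activation: out = max(0, W*in + b).
 * Plain C with no external libraries, so it can be compiled for
 * a bare-metal MCU such as an Arm Cortex-M. */
void dense_relu(const float *in, float *out)
{
    for (int i = 0; i < OUT_DIM; i++) {
        float acc = bias[i];
        for (int j = 0; j < IN_DIM; j++)
            acc += weights[i][j] * in[j];
        out[i] = acc > 0.0f ? acc : 0.0f;
    }
}
```

A full generated model would chain many such layers, but the principle stays the same: constant weight arrays plus simple loops, with memory usage fixed at compile time.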
"Imagimob Edge significantly reduces time to market and increases productivity," says Anders Hardebring, CEO at Imagimob. "This will help companies launch new and exciting products to the market much faster."
Imagimob Edge, scheduled to be commercially available on November 30, 2020, at the latest, is a subset of Imagimob AI, the company's end-to-end SaaS solution for the development of Edge AI Applications. Users of Imagimob Edge can easily upgrade to Imagimob AI to benefit from a full suite of development support.
In June 2020, a consortium of Swedish companies consisting of Imagimob, Acconeer, and Flexworks received a grant worth $450,000 to build gesture-controlled in-ear headphones. Acconeer is working on the sensing part, Flexworks is responsible for hardware and mechanics, and Imagimob is developing the gesture detection application and is perfecting a hardware-accelerated system for the machine learning code running on an Arm Cortex-M series MCU.