
Google Introduces Its Artificial Intelligence Model TensorFlow Lite 1.0 For Mobile And Embedded Devices

Artificial Intelligence News

Search engine giant Google has unveiled TensorFlow Lite 1.0, its developer framework for deploying Artificial Intelligence models on mobile and IoT (Internet of Things) devices. TensorFlow Lite 1.0 brings improvements including selective registration and quantization during and after training for faster, smaller models; quantization has led to 4x compression of some models.
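For context, post-training quantization of this kind is typically applied when a trained model is converted into the Lite format. The snippet below is a minimal sketch, assuming a TensorFlow 2.x-style converter API and a hypothetical SavedModel directory named "my_saved_model"; it is not code from Google's announcement:

    import tensorflow as tf

    # Load a trained model from a SavedModel directory (path is hypothetical).
    converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")

    # Enable default optimizations, which apply post-training quantization
    # (the step that can shrink some models by roughly 4x).
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    # Produce a FlatBuffer that the TensorFlow Lite interpreter can run.
    tflite_model = converter.convert()

    with open("model.tflite", "wb") as f:
        f.write(tflite_model)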

According to the company, the workflow starts with training AI models in TensorFlow; the trained models are then converted into Lite models that run on mobile devices. TensorFlow Lite was first introduced at the I/O developer conference in May 2017 and entered developer preview later that year. The TensorFlow Lite team at Google also shared its post-1.0 roadmap, which aims to condense and accelerate AI models for edge deployment: it covers model acceleration, particularly for Android developers using neural nets, along with a Keras-based connecting pruning kit and further quantization advancements. The company also revealed other changes on the way for TensorFlow Lite, including support for control flow, which is essential for models such as Recurrent Neural Networks (RNNs); CPU performance optimization for Lite models, potentially involving partnerships with other companies; and expanded coverage of GPU delegate operations, with the API to be finalized and made generally available.
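The Keras-based pruning kit referenced in the roadmap was later released as part of the TensorFlow Model Optimization Toolkit. The sketch below assumes that package (tensorflow_model_optimization) and a hypothetical two-layer Keras model; it only illustrates the general shape of the API:

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    # A hypothetical Keras model to be pruned.
    base_model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Gradually raise weight sparsity from 0% to 50% over 1,000 training steps.
    schedule = tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=1000)

    # Wrap the model so low-magnitude weights are pruned away during training.
    pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
        base_model, pruning_schedule=schedule)

    pruned_model.compile(optimizer="adam",
                         loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])

    # Training would also need tfmot.sparsity.keras.UpdatePruningStep() as a
    # callback so the sparsity schedule advances each step.

A pruned (and optionally quantized) model can then be passed through the same converter shown above to produce a smaller Lite model.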

A TensorFlow 2.0 model converter for producing Lite models will be made available so that developers can better understand what goes wrong during the conversion process and how to fix it. TensorFlow Lite engineer Raziel Alvarez said in a statement that TensorFlow Lite is deployed on more than two billion devices to date. He further stated that TensorFlow Lite increasingly makes TensorFlow Mobile obsolete, except for users who want to use it for training, though a solution for that is in the works. In the future, Google plans to make GPU delegates generally available, expand operation coverage, and finalize APIs.
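For completeness, once a model has been converted it runs on-device through the TensorFlow Lite interpreter, and GPU delegates plug into that same interface. The following is a minimal sketch of CPU inference in Python, assuming the "model.tflite" file produced above and a dummy input:

    import numpy as np
    import tensorflow as tf

    # Load the converted model into the TensorFlow Lite interpreter.
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy input matching the model's expected shape and dtype.
    input_data = np.zeros(input_details[0]["shape"],
                          dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], input_data)

    # Run inference and read back the output tensor.
    interpreter.invoke()
    output = interpreter.get_tensor(output_details[0]["index"])
    print(output.shape)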