Posted by Bernhard Bauer and Terry Heo, Software Engineers, Google
Today we’re excited to announce that the Google Play services API for TensorFlow Lite is generally available on Android devices. We recommend this distribution as the path to adding custom machine learning to your apps. Last year, we launched a public beta of TensorFlow Lite in Google Play services at Google I/O. Since then, we’ve received lots of feedback and made improvements to the API. Most recently, we added the GPU delegate and Task Library support. Today we’re moving from beta to general availability on billions of Android devices globally.
TensorFlow Lite in Google Play services is already used by Google teams, including ML Kit, serving over a billion monthly active users and running more than 100 billion daily inferences.
TensorFlow Lite is an inference runtime optimized for mobile devices, and now that it’s part of Google Play services, it helps you deliver better ML experiences because it:
- Reduces your app size by up to 5 MB compared to statically bundling TensorFlow Lite with your app
- Uses the same API as the version of TensorFlow Lite you would bundle into your app
- Receives regular performance updates in the background so it’s always getting better automatically
Get started by learning how to add TensorFlow Lite in Google Play services to your Android app.
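To make the setup concrete, here is a minimal Kotlin sketch of using the Play services runtime, based on the public `TfLite.initialize` and `InterpreterApi` APIs. The function name `createInterpreter`, the callback parameter, and the Gradle version numbers are illustrative, not prescribed by this post; check the official documentation for current artifact versions.

```kotlin
// build.gradle dependencies (versions are illustrative):
//   implementation 'com.google.android.gms:play-services-tflite-java:16.0.0'
//   implementation 'com.google.android.gms:play-services-tflite-support:16.0.0'

import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// Initialize the TensorFlow Lite runtime shipped in Google Play services,
// then create an interpreter that uses only that system-provided runtime
// (so no TF Lite runtime needs to be bundled into the APK).
fun createInterpreter(
    context: Context,
    modelBuffer: ByteBuffer,          // your .tflite model, loaded into a buffer
    onReady: (InterpreterApi) -> Unit // hypothetical callback for this sketch
) {
    TfLite.initialize(context).addOnSuccessListener {
        val interpreter = InterpreterApi.create(
            modelBuffer,
            InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
        )
        onReady(interpreter)
    }
}
```

Because initialization is asynchronous (it may download or update the runtime in the background), the interpreter is created only after the initialization `Task` succeeds.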