Get models for TensorFlow Lite
You don't have to build a TensorFlow Lite model to start using machine learning on mobile or edge devices. Many pre-built and optimized models are available for you to use right away in your application. You can start by using pre-trained models in TensorFlow Lite and move up to building custom models over time, as follows:
- Start developing machine learning features with already trained models.
- Modify existing TensorFlow Lite models using tools such as Model Maker.
- Build a custom model with TensorFlow tools and then convert it to TensorFlow Lite.
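As a minimal sketch of the convert step above, assuming TensorFlow is installed: a trained Keras model (the tiny model here is a hypothetical stand-in for your own) is passed to `tf.lite.TFLiteConverter`, and the resulting flatbuffer is saved for bundling with an app.

```python
import tensorflow as tf

# A small example Keras model; this stands in for your trained model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

# Convert the model to the TensorFlow Lite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Save the flatbuffer to disk for bundling with a mobile app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The output is a single `.tflite` flatbuffer file, which is the format the TensorFlow Lite interpreter loads on device.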
Using models for quick tasks: ML Kit
If you are trying to quickly implement features or utility tasks with machine learning, you should review the use cases supported by ML Kit before starting development with TensorFlow Lite. This development tool provides APIs you can call directly from mobile apps to complete common ML tasks such as barcode scanning and on-device translation. Using this method can help you get results fast. However, ML Kit has limited options for extending its capabilities. For more information, see the ML Kit developer documentation.
Building models for your app: Constraints
If building a custom model for your specific use case is your ultimate goal, you should start with developing and training a TensorFlow model or extending an existing one. Before you start your model development process, you should be aware of the constraints for TensorFlow Lite models and build your model with these constraints in mind:
- Limited compute capabilities
- Size of models
- Size of data
- Supported TensorFlow operations
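Two of the constraints above, supported operations and model size, can be checked at conversion time. A hedged sketch, assuming TensorFlow is installed and using a hypothetical stand-in model: restricting the converter to the built-in TensorFlow Lite op set makes conversion fail early if the model relies on an unsupported TensorFlow operation, and the length of the converted flatbuffer gives the on-device model size.

```python
import tensorflow as tf

# A tiny stand-in model; your real model goes here.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Restrict conversion to the built-in TensorFlow Lite op set, so that any
# unsupported TensorFlow operation surfaces as a conversion error rather
# than a runtime failure on device.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]

tflite_model = converter.convert()

# The flatbuffer size is the model size you ship with the app.
print(f"Model size: {len(tflite_model)} bytes")
```

If a model genuinely needs ops outside the built-in set, the converter can also be configured to fall back to select TensorFlow ops, at the cost of a larger binary.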
For more detail about each of these constraints, see model design constraints in the Model build overview. For more information about building effective, compatible, high-performance models for TensorFlow Lite, see Performance best practices.