
A Guide on YOLO11 Model Export to TFLite for Deployment


Deploying computer vision models on edge devices or embedded devices requires a format that can ensure seamless performance.

The TensorFlow Lite or TFLite export format allows you to optimize your Ultralytics YOLO11 models for tasks like object detection and image classification in edge device-based applications. In this guide, we'll walk through the steps for converting your models to the TFLite format, making it easier for your models to perform well on various edge devices.

Why Export to TFLite?

TensorFlow Lite, or TFLite for short, was introduced by Google in May 2017 as part of its TensorFlow framework. It is an open-source deep learning framework designed for on-device inference, also known as edge computing, and it gives developers the tools needed to execute trained models on mobile, embedded, and IoT devices, as well as traditional computers.

TFLite is compatible with a wide range of platforms, including embedded Linux, Android, iOS, and MCUs. Exporting your model to TFLite makes your applications faster, more reliable, and capable of running offline.

Key Features of TFLite Models

TFLite models offer a range of key features that enable on-device machine learning by helping developers run their models on mobile, embedded, and edge devices:

  • On-Device Optimization: TFLite optimizes for on-device ML by processing data locally to reduce latency, keeping personal data on the device to improve privacy, and minimizing model size to save space.

  • Multiple Platform Support: TFLite offers extensive platform compatibility, supporting Android, iOS, embedded Linux, and microcontrollers.

  • Diverse Language Support: TFLite is compatible with a variety of programming languages, including Java, Swift, Objective-C, C++, and Python.

  • High Performance: Achieves superior performance through hardware acceleration and model optimization.
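To make the "minimizing model size" point concrete, a rough back-of-the-envelope sketch: a float32 weight takes 4 bytes, float16 takes 2, and an int8-quantized weight takes 1, so weight storage shrinks roughly 4x under int8 quantization. The parameter count below is an assumed, illustrative figure (roughly YOLO11n-scale), not a measurement of any exported file:

```python
# Back-of-the-envelope estimate of weight-storage size under different
# precisions. The parameter count is assumed for illustration only.

def estimated_size_mb(num_params: int, bytes_per_param: float) -> float:
    """Approximate weight-storage size in megabytes."""
    return num_params * bytes_per_param / (1024 * 1024)

params = 2_600_000  # assumed parameter count, roughly YOLO11n-scale

float32_mb = estimated_size_mb(params, 4)  # full precision
float16_mb = estimated_size_mb(params, 2)  # half precision
int8_mb = estimated_size_mb(params, 1)     # int8 quantized

print(f"float32: ~{float32_mb:.1f} MB")
print(f"float16: ~{float16_mb:.1f} MB")
print(f"int8:    ~{int8_mb:.1f} MB")
```

Real exported files also contain graph structure and activations metadata, so actual sizes differ, but the 4:2:1 weight ratio is the driving factor behind the savings.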

Deployment Options in TFLite

Before we look at the code for exporting YOLO11 models to the TFLite format, let's understand how TFLite models are normally used.

TFLite offers various on-device deployment options for machine learning models, including:

  • Deploying with Android and iOS: Both Android and iOS applications with TFLite can analyze edge-based camera feeds and sensors to detect and identify objects. TFLite also offers native iOS libraries written in Swift and Objective-C. The architecture diagram below shows the process of deploying a trained model onto Android and iOS platforms using TensorFlow Lite.

(Architecture diagram)

  • Implementing with Embedded Linux: If running inference on a Raspberry Pi using the Ultralytics Guide does not meet the speed requirements for your use case, you can use an exported TFLite model to accelerate inference times. Additionally, you can improve performance further by utilizing a Coral Edge TPU device.

  • Deploying with Microcontrollers: TFLite models can also be deployed on microcontrollers and other devices with only a few kilobytes of memory. The core runtime fits in just 16 KB on an Arm Cortex M3 and can run many basic models. It requires no operating system support, no standard C or C++ libraries, and no dynamic memory allocation.

Export to TFLite: Converting Your YOLO11 Model

You can improve on-device model execution efficiency and optimize performance by converting your models to the TFLite format.

Installation

To install the required packages, run:

# Install the required package for YOLO11
pip install ultralytics

For detailed instructions and best practices related to the installation process, check our Ultralytics Installation guide. While installing the required packages for YOLO11, if you encounter any difficulties, consult our Common Issues guide for solutions and tips.

Usage

Before diving into the usage instructions, note that while all Ultralytics YOLO11 models are available for exporting, you can verify that the model you select supports export functionality here.

Python

from ultralytics import YOLO

# Load the YOLO11 model
model = YOLO("yolo11n.pt")

# Export the model to TFLite format
model.export(format="tflite")  # creates 'yolo11n_float32.tflite'

# Load the exported TFLite model
tflite_model = YOLO("yolo11n_float32.tflite")

# Run inference
results = tflite_model("https://ultralytics.com/images/bus.jpg")
CLI

# Export a YOLO11n PyTorch model to TFLite format
yolo export model=yolo11n.pt format=tflite  # creates 'yolo11n_float32.tflite'

# Run inference with the exported model
yolo predict model='yolo11n_float32.tflite' source='https://ultralytics.com/images/bus.jpg'

For more details about the export process, visit the Ultralytics documentation page on exporting.
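If you later run the exported file through a raw TFLite interpreter instead of `YOLO("model.tflite")`, input preprocessing falls to you. Exported float32 detection models typically expect a square, normalized `(1, H, W, 3)` float32 input; the 640x640 size below is an assumption matching the default export image size. A minimal letterbox sketch in numpy, independent of any TFLite runtime:

```python
import numpy as np

def letterbox(img: np.ndarray, size: int = 640, pad_value: int = 114) -> np.ndarray:
    """Resize an HxWx3 uint8 image to size x size, preserving aspect ratio
    with gray padding, then scale to float32 in [0, 1] with a leading
    batch dimension: (1, size, size, 3)."""
    h, w = img.shape[:2]
    scale = size / max(h, w)
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # Nearest-neighbour resize via index maps (avoids a cv2 dependency)
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[rows][:, cols]
    canvas = np.full((size, size, 3), pad_value, dtype=np.uint8)
    top, left = (size - nh) // 2, (size - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas[None].astype(np.float32) / 255.0

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)  # dummy camera frame
batch = letterbox(frame)
print(batch.shape, batch.dtype)  # (1, 640, 640, 3) float32
```

The padding value 114 mirrors the gray letterbox commonly used in YOLO preprocessing; int8-quantized exports may instead expect uint8 input, so check the interpreter's input details before normalizing.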

Deploying Exported YOLO11 TFLite Models

After successfully exporting your Ultralytics YOLO11 models to TFLite format, you can now deploy them. The primary and recommended first step for running a TFLite model is to utilize the YOLO("model.tflite") method, as outlined in the previous usage code snippet. However, for in-depth instructions on deploying your TFLite models in various other settings, take a look at the following resources:

  • Android: A quick start guide for integrating TensorFlow Lite into Android applications, providing easy-to-follow steps for setting up and running machine learning models.

  • iOS: Check out this detailed guide for developers on integrating and deploying TensorFlow Lite models in iOS applications, offering step-by-step instructions and resources.

  • End-To-End Examples: This page provides an overview of various TensorFlow Lite examples, showcasing practical applications and tutorials designed to help developers implement TensorFlow Lite in machine learning projects on mobile and edge devices.
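In those standalone deployments, the interpreter returns a raw prediction tensor that still needs decoding before boxes can be drawn. The `(1, 84, 8400)` layout below — 4 box values plus 80 class scores per candidate — is an assumption matching a COCO-trained YOLO11n at 640x640; a hedged numpy sketch of the decoding step:

```python
import numpy as np

def decode_predictions(raw: np.ndarray, conf_thres: float = 0.25) -> np.ndarray:
    """Decode a (1, 4 + num_classes, num_candidates) prediction tensor into
    (cx, cy, w, h, score, class_id) rows above the confidence threshold.
    Non-maximum suppression would still be applied afterwards; omitted here."""
    preds = raw[0].T          # (num_candidates, 4 + num_classes)
    boxes = preds[:, :4]      # cx, cy, w, h
    scores = preds[:, 4:]     # per-class confidences
    class_ids = scores.argmax(axis=1)
    best = scores.max(axis=1)
    keep = best >= conf_thres
    return np.hstack([boxes[keep], best[keep, None], class_ids[keep, None]])

# Dummy tensor shaped like the assumed YOLO11n output: 84 = 4 box + 80 classes
raw = np.zeros((1, 84, 8400), dtype=np.float32)
raw[0, :4, 0] = [0.5, 0.5, 0.2, 0.3]  # one fake box
raw[0, 4 + 7, 0] = 0.9                # class 7 with 0.9 confidence
detections = decode_predictions(raw)
print(detections)  # one row: box, score, class id
```

The exact output layout varies by task and export settings, so inspect `interpreter.get_output_details()` on your own model rather than relying on the shape assumed here.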

Summary

In this guide, we focused on how to export to TFLite format. By converting your Ultralytics YOLO11 models to TFLite model format, you can improve the efficiency and speed of YOLO11 models, making them more effective and suitable for edge computing environments.

For further details on usage, visit the official TFLite documentation.

Also, if you're curious about other Ultralytics YOLO11 integrations, make sure to check out our integration guide page. You'll find tons of helpful info and insights waiting for you there.

FAQ

How do I export a YOLO11 model to TFLite format?

To export a YOLO11 model to TFLite format, you can use the Ultralytics library. First, install the required package using:

pip install ultralytics

Then, use the following code snippet to export your model:

from ultralytics import YOLO

# Load the YOLO11 model
model = YOLO("yolo11n.pt")

# Export the model to TFLite format
model.export(format="tflite")  # creates 'yolo11n_float32.tflite'

For CLI users, you can achieve this with:

yolo export model=yolo11n.pt format=tflite  # creates 'yolo11n_float32.tflite'

For more details, visit the Ultralytics export guide.

What are the benefits of using TensorFlow Lite for YOLO11 model deployment?

TensorFlow Lite (TFLite) is an open-source deep learning framework designed for on-device inference, making it ideal for deploying YOLO11 models on mobile, embedded, and IoT devices. Key benefits include:

  • On-Device Optimization: Minimizes latency and enhances privacy by processing data locally.
  • Platform Compatibility: Supports Android, iOS, embedded Linux, and MCUs.
  • Performance: Uses hardware acceleration to optimize model speed and efficiency.

To learn more, check out the TFLite guide.

Is it possible to run YOLO11 TFLite models on Raspberry Pi?

Yes, you can run YOLO11 TFLite models on Raspberry Pi to improve inference speeds. First, export your model to TFLite format as explained here. Then, use a tool like TensorFlow Lite Interpreter to execute the model on your Raspberry Pi.

For further optimization, you might consider using a Coral Edge TPU. For detailed steps, refer to our Raspberry Pi deployment guide.
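On a Raspberry Pi, the lightweight `tflite-runtime` package is often preferred over installing full TensorFlow; both expose an `Interpreter` class with the same interface. A hedged sketch of loading whichever is available (the model filename matches the export example earlier in this guide and is assumed to exist on the device):

```python
def load_interpreter(model_path: str):
    """Return a TFLite Interpreter for model_path, preferring the small
    tflite-runtime wheel and falling back to full TensorFlow. Raises
    ImportError if neither package is installed."""
    try:
        from tflite_runtime.interpreter import Interpreter  # lightweight wheel
    except ImportError:
        import tensorflow as tf  # full TensorFlow fallback
        Interpreter = tf.lite.Interpreter
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    return interpreter

# Usage on the device (not executed here, since it needs the model file):
# interpreter = load_interpreter("yolo11n_float32.tflite")
# inp = interpreter.get_input_details()[0]
# interpreter.set_tensor(inp["index"], preprocessed_batch)
# interpreter.invoke()
# raw = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
```

For a Coral Edge TPU, you would additionally pass an Edge TPU delegate when constructing the interpreter; see the Coral documentation for the delegate setup on your platform.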

Can I use TFLite models on microcontrollers for YOLO11 predictions?

Yes, TFLite supports deployment on microcontrollers with limited resources. TFLite's core runtime requires only 16 KB of memory on an Arm Cortex M3 and can run basic YOLO11 models. This makes it suitable for deployment on devices with minimal computational power and memory.

To get started, visit the TFLite Micro for Microcontrollers guide.

What platforms are compatible with TFLite exported YOLO11 models?

TensorFlow Lite provides extensive platform compatibility, allowing you to deploy YOLO11 models on a wide range of devices, including:

  • Android and iOS: Native support through the TFLite Android and iOS libraries.
  • Embedded Linux: Ideal for single-board computers such as Raspberry Pi.
  • Microcontrollers: Suitable for MCUs with limited resources.

For more information on deployment options, see our detailed deployment guide.

How do I troubleshoot common issues during YOLO11 model export to TFLite?

If you encounter errors while exporting YOLO11 models to TFLite, common solutions include:

  • Check package compatibility: Ensure you're using compatible versions of Ultralytics and TensorFlow. Refer to our installation guide.
  • Model support: Verify that the specific YOLO11 model supports TFLite export by checking here.

For additional troubleshooting tips, visit our Common Issues guide.

📅 Created 7 months ago ✏️ Updated 20 days ago
