
How to Export to NCNN from YOLO11 for Smooth Deployment

Deploying computer vision models on devices with limited computational power, such as mobile or embedded systems, can be tricky. You need to use a format optimized for efficient performance, so that even devices with limited processing power can handle advanced computer vision tasks well.

The export to NCNN format feature allows you to optimize your Ultralytics YOLO11 models for lightweight device-based applications. In this guide, we'll walk you through how to convert your models to the NCNN format, making it easier for your models to perform well on various mobile and embedded devices.

Why Export to NCNN?

Overview of NCNN

The NCNN framework, developed by Tencent, is a high-performance neural network inference computing framework optimized specifically for mobile platforms, including mobile phones, embedded devices, and IoT devices. NCNN is compatible with a wide range of platforms, including Linux, Android, iOS, and macOS.

NCNN is known for its fast processing speed on mobile CPUs and enables rapid deployment of deep learning models to mobile platforms. This makes it easier to build smart apps, putting the power of AI right at your fingertips.

Key Features of NCNN Models

NCNN models offer a wide range of key features that enable on-device machine learning by helping developers run their models on mobile, embedded, and edge devices:

  • Efficient and High-Performance: NCNN models are designed to be efficient and lightweight, optimized for running on mobile and embedded devices like the Raspberry Pi with limited resources. They also deliver high performance and strong accuracy across a range of computer vision tasks.

  • Quantization: NCNN models often support quantization, a technique that reduces the precision of the model's weights and activations. This leads to further performance improvements and a smaller memory footprint (a reduced-precision export sketch follows this list).

  • Compatibility: NCNN models are compatible with popular deep learning frameworks such as TensorFlow, Caffe, and ONNX. This compatibility lets developers reuse existing models and workflows with little friction.

  • Ease of Use: NCNN models integrate easily into a variety of applications thanks to their compatibility with popular deep learning frameworks. NCNN also provides user-friendly tools for converting models between formats, ensuring smooth interoperability across your development environment.
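
To make the quantization point concrete, the snippet below is a minimal sketch of a reduced-precision export using the Ultralytics half argument for FP16 weights; whether full INT8 quantization is available for NCNN depends on your Ultralytics version, so check the export documentation before relying on it.

from ultralytics import YOLO

# Load the YOLO11 model
model = YOLO("yolo11n.pt")

# Export to NCNN with FP16 weights; 'half' is a standard Ultralytics export
# argument, but confirm NCNN support in the export docs for your release
model.export(format="ncnn", half=True)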

Deployment Options with NCNN

Before we look at the code for exporting YOLO11 models to the NCNN format, let's understand how NCNN models are normally used.

NCNN models, designed for efficiency and performance, are compatible with a variety of deployment platforms:

  • Mobile Deployment: Specifically optimized for Android and iOS, allowing seamless integration into mobile applications for efficient on-device inference.

  • Embedded Systems and IoT Devices: NCNN is a great fit for devices like the Raspberry Pi and NVIDIA Jetson, especially in situations where you need fast processing directly on the device.

  • Desktop and Server Deployment: Capable of being deployed in desktop and server environments on Linux, Windows, and macOS, supporting development, training, and evaluation with higher compute capacity.

Export to NCNN: Converting Your YOLO11 Model

You can expand model compatibility and deployment flexibility by converting YOLO11 models to NCNN format.

Installation

To install the required packages, run:

Installation

# Install the required package for YOLO11
pip install ultralytics

For detailed instructions and best practices related to the installation process, check our Ultralytics Installation guide. While installing the required packages for YOLO11, if you encounter any difficulties, consult our Common Issues guide for solutions and tips.
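
If you want to confirm the installation before moving on, a quick environment check is an easy sanity test. The sketch below uses ultralytics.checks(), which prints the package version along with Python, PyTorch, and hardware details.

import ultralytics

# Print Ultralytics, Python, and PyTorch versions plus basic hardware info
ultralytics.checks()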

Usage

Before diving into the usage instructions, note that while all Ultralytics YOLO11 models are available for exporting, you can verify that the model you select supports export functionality here.

Usage

Python

from ultralytics import YOLO

# Load the YOLO11 model
model = YOLO("yolo11n.pt")

# Export the model to NCNN format
model.export(format="ncnn")  # creates '/yolo11n_ncnn_model'

# Load the exported NCNN model
ncnn_model = YOLO("./yolo11n_ncnn_model")

# Run inference
results = ncnn_model("https://ultralytics.com/images/bus.jpg")

CLI

# Export a YOLO11n PyTorch model to NCNN format
yolo export model=yolo11n.pt format=ncnn  # creates '/yolo11n_ncnn_model'

# Run inference with the exported model
yolo predict model='./yolo11n_ncnn_model' source='https://ultralytics.com/images/bus.jpg'

For more details about supported export options, visit the Ultralytics documentation page on deployment options.
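
As a rough sketch of what the export produces, the snippet below captures the path returned by model.export() and lists the files inside it; you will typically see .param and .bin weight files plus a metadata file, though the exact filenames can differ between Ultralytics releases.

from pathlib import Path

from ultralytics import YOLO

# Export and capture the returned path of the NCNN model directory
model = YOLO("yolo11n.pt")
export_dir = model.export(format="ncnn")  # e.g. 'yolo11n_ncnn_model'

# List what the exporter produced (typically model.ncnn.param, model.ncnn.bin,
# and metadata; exact contents may vary by release)
for item in sorted(Path(export_dir).iterdir()):
    print(item.name)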

Deploying Exported YOLO11 NCNN Models

After successfully exporting your Ultralytics YOLO11 models to NCNN format, you can now deploy them. The primary and recommended first step for running an NCNN model is to use the YOLO("./model_ncnn_model") method, as outlined in the previous usage code snippet. For in-depth instructions on deploying your NCNN models in various other settings, take a look at the following resources; a minimal raw-NCNN loading sketch also follows the list:

  • Android: This blog explains how to use NCNN models for performing tasks like object detection through Android applications.

  • macOS: Learn how to use NCNN models to perform tasks on macOS.

  • Linux: Explore this page to learn how to deploy NCNN models on limited-resource devices such as the Raspberry Pi and similar hardware.

  • Windows x64 using VS2017: Explore this blog to learn how to deploy NCNN models on Windows x64 using Visual Studio Community 2017.
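
If you would rather load the exported files with the NCNN runtime directly, for example inside a custom deployment pipeline, the sketch below uses the ncnn Python bindings (pip install ncnn). It assumes the default pnnx blob names in0 and out0 and skips image preprocessing and box decoding, so treat it as a structural outline rather than a complete inference pipeline; check the generated .param file for the actual blob names.

import ncnn
import numpy as np

# Load the exported network (paths assume the default export directory name)
net = ncnn.Net()
net.load_param("yolo11n_ncnn_model/model.ncnn.param")
net.load_model("yolo11n_ncnn_model/model.ncnn.bin")

# Run a forward pass on dummy CHW float input; real use needs proper
# preprocessing (resize, normalization) and YOLO output decoding
ex = net.create_extractor()
ex.input("in0", ncnn.Mat(np.random.rand(3, 640, 640).astype(np.float32)))
ret, out = ex.extract("out0")
print(out.numpy().shape)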

Summary

In this guide, we've gone over exporting Ultralytics YOLO11 models to the NCNN format. This conversion step is crucial for improving the efficiency and speed of YOLO11 models, making them more effective and suitable for limited-resource computing environments.

For detailed instructions on usage, please refer to the official NCNN documentation.

Also, if you're interested in exploring other integration options for Ultralytics YOLO11, be sure to visit our integration guide page for further insights and information.

FAQ

How do I export Ultralytics YOLO11 models to NCNN format?

To export your Ultralytics YOLO11 model to NCNN format, follow these steps:

  • Python: Use the export function of the YOLO class.

    from ultralytics import YOLO
    
    # Load the YOLO11 model
    model = YOLO("yolo11n.pt")
    
    # Export to NCNN format
    model.export(format="ncnn")  # creates '/yolo11n_ncnn_model'
    
  • CLI: Use the yolo command with the export argument.

    yolo export model=yolo11n.pt format=ncnn  # creates '/yolo11n_ncnn_model'
    

For detailed export options, check the Export page in the documentation. A quick round-trip check of an exported model is sketched below.
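
As a quick round-trip check under the same assumptions as the examples above, export() returns the path of the NCNN model directory, which can be fed straight back into YOLO for a sanity-check prediction:

from ultralytics import YOLO

# Export, then reload the exported model and run one prediction on a sample image
ncnn_path = YOLO("yolo11n.pt").export(format="ncnn")
results = YOLO(ncnn_path)("https://ultralytics.com/images/bus.jpg")
print(results[0].boxes)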

What are the advantages of exporting YOLO11 models to NCNN?

Exporting your Ultralytics YOLO11 models to NCNN offers several benefits:

  • Efficiency: NCNN models are optimized for mobile and embedded devices, ensuring high performance even with limited computational resources.
  • Quantization: NCNN supports techniques like quantization that improve model speed and reduce memory usage.
  • Broad Compatibility: You can deploy NCNN models on multiple platforms, including Android, iOS, Linux, and macOS.

For more details, see the Export to NCNN section in the documentation.

Why should I use NCNN for mobile AI applications?

NCNN, developed by Tencent, is specifically optimized for mobile platforms. Key reasons to use NCNN include:

  • High Performance: Designed for efficient, fast processing on mobile CPUs.
  • Cross-Platform: Compatible with popular frameworks such as TensorFlow and ONNX, making it easier to convert and deploy models across different platforms.
  • Community Support: Active community support ensures continuous improvements and updates.

To learn more, visit the NCNN overview in the documentation.

What platforms are supported for NCNN model deployment?

NCNN is versatile and supports a variety of platforms:

  • Mobile Platforms: Android, iOS.
  • Embedded Systems and IoT Devices: Devices like the Raspberry Pi and NVIDIA Jetson.
  • Desktops and Servers: Linux, Windows, and macOS.

If running models on a Raspberry Pi isn't fast enough, converting to the NCNN format could speed things up, as detailed in our Raspberry Pi Guide; a rough timing sketch follows below.
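
For a rough sense of the speedup on your own hardware, a simple timing loop like the sketch below works anywhere Python runs; it assumes you have already exported yolo11n_ncnn_model, and for rigorous numbers the benchmarking steps in the Raspberry Pi Guide are the better reference.

import time

import numpy as np

from ultralytics import YOLO

# Synthetic 640x640 BGR image keeps the timing loop free of file and network I/O
img = np.random.randint(0, 255, (640, 640, 3), dtype=np.uint8)

models = {
    "PyTorch": YOLO("yolo11n.pt"),
    "NCNN": YOLO("yolo11n_ncnn_model"),  # created earlier by model.export(format="ncnn")
}

for name, model in models.items():
    model(img, verbose=False)  # warm-up run
    start = time.perf_counter()
    for _ in range(20):
        model(img, verbose=False)
    print(f"{name}: {(time.perf_counter() - start) / 20 * 1000:.1f} ms/image")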

How can I deploy Ultralytics YOLO11 NCNN models on Android?

To deploy your YOLO11 models on Android:

  1. Build for Android: Follow the NCNN Build for Android guide.
  2. Integrate with Your App: Use the NCNN Android SDK to integrate the exported model into your application for efficient on-device inference.

For step-by-step instructions, refer to our guide on Deploying YOLO11 NCNN Models.

For more advanced guides and use cases, visit the Ultralytics documentation page.

