
Ultralytics YOLO11 Modes

Ultralytics YOLO ecosystem and integrations

Introduction

Ultralytics YOLO11 is not just another object detection model; it's a versatile framework designed to cover the entire lifecycle of machine learning models—from data ingestion and model training to validation, deployment, and real-world tracking. Each mode serves a specific purpose and is engineered to offer you the flexibility and efficiency required for different tasks and use-cases.



Watch: Ultralytics Modes Tutorial: Train, Validate, Predict, Export & Benchmark.

Modes at a Glance

Understanding the different modes that Ultralytics YOLO11 supports is critical to getting the most out of your models:

  • Train mode: Fine-tune your model on custom or preloaded datasets.
  • Val mode: A post-training checkpoint to validate model performance.
  • Predict mode: Unleash the predictive power of your model on real-world data.
  • Export mode: Make your model deployment-ready in various formats.
  • Track mode: Extend your object detection model into real-time tracking applications.
  • Benchmark mode: Analyze the speed and accuracy of your model in diverse deployment environments.

This comprehensive guide aims to give you an overview and practical insights into each mode, helping you harness the full potential of YOLO11.

Train

Train mode is used for training a YOLO11 model on a custom dataset. In this mode, the model is trained using the specified dataset and hyperparameters. The training process involves optimizing the model's parameters so that it can accurately predict the classes and locations of objects in an image.

Train Example
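A minimal training sketch is shown below; it assumes the yolo11n.pt checkpoint and the bundled coco8.yaml example dataset (swap in your own dataset YAML as needed).

from ultralytics import YOLO

# Load a pre-trained YOLO11 model
model = YOLO("yolo11n.pt")

# Train the model for 100 epochs at 640x640 image size
results = model.train(data="coco8.yaml", epochs=100, imgsz=640)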

Val

Val mode is used for validating a YOLO11 model after it has been trained. In this mode, the model is evaluated on a validation set to measure its accuracy and generalization performance. This mode can be used to tune the hyperparameters of the model to improve its performance.

Val Example
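A minimal validation sketch, assuming a model trained as above; when called without arguments, val() reuses the dataset and settings recorded with the checkpoint, and the returned metrics object exposes mAP values.

from ultralytics import YOLO

# Load a trained YOLO11 model
model = YOLO("yolo11n.pt")

# Validate the model (dataset and settings are read from the model's training args)
metrics = model.val()
print(metrics.box.map)  # mAP50-95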

Predict

Predict mode is used for making predictions using a trained YOLO11 model on new images or videos. In this mode, the model is loaded from a checkpoint file, and the user can provide images or videos to perform inference. The model predicts the classes and locations of objects in the input images or videos.

Predict Example
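A minimal inference sketch, assuming the yolo11n.pt checkpoint and a placeholder image path; the source can also be a directory, video, or URL.

from ultralytics import YOLO

# Load a trained YOLO11 model
model = YOLO("yolo11n.pt")

# Run inference and save annotated output
results = model.predict(source="path/to/image.jpg", save=True)

# Inspect the detections for the first image
for box in results[0].boxes:
    print(box.cls, box.conf, box.xyxy)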

Export

Export mode is used for exporting a YOLO11 model to a format that can be used for deployment. In this mode, the model is converted to a format that can be used by other software applications or hardware devices. This mode is useful when deploying the model to production environments.

Export Example
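A minimal export sketch, assuming the yolo11n.pt checkpoint; ONNX is used here, but other formats can be passed to the format argument.

from ultralytics import YOLO

# Load a trained YOLO11 model
model = YOLO("yolo11n.pt")

# Export the model to ONNX format; the path of the exported file is returned
onnx_path = model.export(format="onnx")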

Track

Track mode is used for tracking objects in real-time using a YOLO11 model. In this mode, the model is loaded from a checkpoint file, and the user can provide a live video stream to perform real-time object tracking. This mode is useful for applications such as surveillance systems or self-driving cars.

Track Example
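A minimal tracking sketch, assuming the yolo11n.pt checkpoint and a placeholder video path; the source can also be a webcam index or a stream URL.

from ultralytics import YOLO

# Load a pre-trained YOLO11 model
model = YOLO("yolo11n.pt")

# Track objects across video frames and display the annotated stream
results = model.track(source="path/to/video.mp4", show=True)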

Benchmark

Benchmark mode is used to profile the speed and accuracy of various export formats for YOLO11. The benchmarks provide information on the size of the exported format, its mAP50-95 metrics (for object detection, segmentation, and pose) or accuracy_top5 metrics (for classification), and the inference time in milliseconds per image across various formats like ONNX, OpenVINO, TensorRT, and others. This information can help users choose the optimal export format for their specific use case based on their requirements for speed and accuracy.

Benchmark Example
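A minimal benchmarking sketch, assuming the yolo11n.pt checkpoint, the coco8.yaml example dataset, and a CUDA device at index 0.

from ultralytics.utils.benchmarks import benchmark

# Profile speed and accuracy across export formats on GPU 0
benchmark(model="yolo11n.pt", data="coco8.yaml", imgsz=640, half=False, device=0)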

FAQ

How do I train a custom object detection model with Ultralytics YOLO11?

Training a custom object detection model with Ultralytics YOLO11 involves using the train mode. You need a dataset formatted in YOLO format, containing images and corresponding annotation files. Use the following command to start the training process:

from ultralytics import YOLO

# Load a pre-trained YOLO model (you can choose n, s, m, l, or x versions)
model = YOLO("yolo11n.pt")

# Start training on your custom dataset
model.train(data="path/to/dataset.yaml", epochs=100, imgsz=640)
# Train a YOLO model from the command line
yolo train data=path/to/dataset.yaml epochs=100 imgsz=640

For detailed instructions, refer to the Ultralytics Train Guide.

What metrics does Ultralytics YOLO11 use to validate the model's performance?

Ultralytics YOLO11 uses various metrics during the validation process to assess model performance. These include:

  • mAP (mean Average Precision): Evaluates the accuracy of object detection.
  • IoU (Intersection over Union): Measures the overlap between predicted and ground-truth bounding boxes.
  • Precision and Recall: Precision measures the ratio of true positive detections to the total detected positives, while recall measures the ratio of true positive detections to the total actual positives.

You can start validation with the following command:

from ultralytics import YOLO

# Load a pre-trained or custom YOLO model
model = YOLO("yolo11n.pt")

# Run validation on your dataset
model.val(data="path/to/validation.yaml")
# Validate a YOLO model from the command line
yolo val data=path/to/validation.yaml

For more details, refer to the Validation Guide.

How can I export my YOLO11 model for deployment?

Ultralytics YOLO11 offers export functionality to convert your trained model into various deployment formats such as ONNX, TensorRT, CoreML, and more. Use the following example to export your model:

from ultralytics import YOLO

# Load your trained YOLO model
model = YOLO("yolo11n.pt")

# Export the model to ONNX format (you can specify other formats as needed)
model.export(format="onnx")
# Export a YOLO model to ONNX format from the command line
yolo export model=yolo11n.pt format=onnx

Detailed steps for each export format are available in the Export Guide.

What is the purpose of the benchmark mode in Ultralytics YOLO11?

Benchmark mode in Ultralytics YOLO11 is used to analyze the speed and accuracy of various export formats such as ONNX, TensorRT, and OpenVINO. It provides metrics like model size, mAP50-95 for object detection, and inference time across different hardware setups, helping you choose the format best suited to your deployment requirements.

from ultralytics.utils.benchmarks import benchmark

# Run benchmark on GPU (device 0)
# You can adjust parameters like model, dataset, image size, and precision as needed
benchmark(model="yolo11n.pt", data="coco8.yaml", imgsz=640, half=False, device=0)
# Benchmark a YOLO model from the command line
# Adjust parameters as needed for your specific use case
yolo benchmark model=yolo11n.pt data='coco8.yaml' imgsz=640 half=False device=0

For more details, refer to the Benchmark Guide.

How can I perform real-time object tracking using Ultralytics YOLO11?

Real-time object tracking can be achieved using the track mode in Ultralytics YOLO11. This mode extends object detection capabilities to track objects across video frames or live feeds. Use the following example to enable tracking:

from ultralytics import YOLO

# Load a pre-trained YOLO model
model = YOLO("yolo11n.pt")

# Start tracking objects in a video
# You can also use live video streams or webcam input
model.track(source="path/to/video.mp4")
# Perform object tracking on a video from the command line
# You can specify different sources like webcam (0) or RTSP streams
yolo track source=path/to/video.mp4

For detailed instructions, refer to the Track Guide.
