Reference for ultralytics/engine/predictor.py
Note
This file is available at https://github.com/ultralytics/ultralytics/blob/main/ultralytics/engine/predictor.py. If you spot a problem please help fix it by contributing a Pull Request 🛠️. Thank you 🙏!
ultralytics.engine.predictor.BasePredictor
A base class for creating predictors.
This class provides the foundation for prediction functionality, handling model setup, inference, and result processing across various input sources.
Attributes:

| Name | Type | Description |
|---|---|---|
| `args` | `SimpleNamespace` | Configuration for the predictor. |
| `save_dir` | `Path` | Directory to save results. |
| `done_warmup` | `bool` | Whether the predictor has finished setup. |
| `model` | `Module` | Model used for prediction. |
| `data` | `dict` | Data configuration. |
| `device` | `device` | Device used for prediction. |
| `dataset` | `Dataset` | Dataset used for prediction. |
| `vid_writer` | `dict` | Dictionary of {save_path: video_writer} for saving video output. |
| `plotted_img` | `ndarray` | Last plotted image. |
| `source_type` | `SimpleNamespace` | Type of input source. |
| `seen` | `int` | Number of images processed. |
| `windows` | `list` | List of window names for visualization. |
| `batch` | `tuple` | Current batch data. |
| `results` | `list` | Current batch results. |
| `transforms` | `callable` | Image transforms for classification. |
| `callbacks` | `dict` | Callback functions for different events. |
| `txt_path` | `Path` | Path to save text results. |
| `_lock` | `Lock` | Lock for thread-safe inference. |
Methods:

| Name | Description |
|---|---|
| `preprocess` | Prepare input image before inference. |
| `inference` | Run inference on a given image. |
| `postprocess` | Process raw predictions into structured results. |
| `predict_cli` | Run prediction for command line interface. |
| `setup_source` | Set up input source and inference mode. |
| `stream_inference` | Stream inference on input source. |
| `setup_model` | Initialize and configure the model. |
| `write_results` | Write inference results to files. |
| `save_predicted_images` | Save prediction visualizations. |
| `show` | Display results in a window. |
| `run_callbacks` | Execute registered callbacks for an event. |
| `add_callback` | Register a new callback function. |
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `cfg` | `str \| dict` | Path to a configuration file or a configuration dictionary. | `DEFAULT_CFG` |
| `overrides` | `dict \| None` | Configuration overrides. | `None` |
| `_callbacks` | `dict \| None` | Dictionary of callback functions. | `None` |
Source code in ultralytics/engine/predictor.py
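BasePredictor is normally used through a task-specific subclass rather than instantiated directly. A minimal sketch, assuming `DetectionPredictor` as the concrete subclass and that the `yolo11n.pt` weights and `bus.jpg` image are available locally (both paths are placeholders):

```python
from ultralytics.models.yolo.detect import DetectionPredictor

# DetectionPredictor is a concrete subclass of BasePredictor; the override keys
# below (model, imgsz, conf) are standard Ultralytics configuration names.
predictor = DetectionPredictor(overrides={"model": "yolo11n.pt", "imgsz": 640, "conf": 0.25})
results = predictor(source="bus.jpg")  # invokes BasePredictor.__call__
```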
__call__
Perform inference on an image or stream.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `source` | `str \| Path \| List[str] \| List[Path] \| List[ndarray] \| ndarray \| Tensor \| None` | Source for inference. | `None` |
| `model` | `str \| Path \| Module \| None` | Model for inference. | `None` |
| `stream` | `bool` | Whether to stream the inference results. If True, returns a generator. | `False` |
| `*args` | `Any` | Additional arguments for the inference method. | `()` |
| `**kwargs` | `Any` | Additional keyword arguments for the inference method. | `{}` |

Returns:

| Type | Description |
|---|---|
| `List[Results] \| generator` | Results objects or generator of Results objects. |
Source code in ultralytics/engine/predictor.py
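The `stream` flag controls whether results are returned all at once or lazily. A usage sketch, continuing from the predictor constructed above (the video path is a placeholder):

```python
# Non-streaming: a list of Results objects, built in memory for the whole source.
results = predictor(source="video.mp4", stream=False)

# Streaming: a generator; consume it frame by frame to keep memory bounded.
for result in predictor(source="video.mp4", stream=True):
    print(len(result.boxes))  # number of detections in this frame
```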
add_callback
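A sketch of registering a callback; `on_predict_start` is one of the standard Ultralytics callback events, and the callback body here is purely illustrative:

```python
def log_start(predictor):
    # Illustrative callback: report where results will be saved.
    print(f"Prediction starting, save_dir={predictor.save_dir}")

predictor.add_callback("on_predict_start", log_start)
# run_callbacks("on_predict_start") is invoked internally during stream_inference.
```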
inference
Run inference on a given image using the specified model and arguments.
Source code in ultralytics/engine/predictor.py
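The preprocess and inference steps can also be driven manually for debugging. A sketch assuming the predictor's model has already been initialized (for example via `setup_model`) and that OpenCV is available; the image path is a placeholder:

```python
import cv2

frame = cv2.imread("bus.jpg")       # HWC BGR uint8 image
im = predictor.preprocess([frame])  # batched, normalized CHW float tensor
preds = predictor.inference(im)     # raw model output; postprocess() turns this into Results
```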
postprocess
pre_transform
Pre-transform input image before inference.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `im` | `List[ndarray]` | Images of shape (N, 3, h, w) for tensor, [(h, w, 3) x N] for list. | *required* |

Returns:

| Type | Description |
|---|---|
| `List[ndarray]` | A list of transformed images. |
Source code in ultralytics/engine/predictor.py
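The actual implementation resizes with aspect-ratio-preserving padding (LetterBox); the sketch below substitutes a plain resize to show only the shape contract, a list of HWC images in and a list of same-size HWC images out:

```python
import cv2
import numpy as np

def pre_transform_sketch(ims, imgsz=640):
    # Simplified stand-in: resize every image to a common size so the batch can
    # be stacked. The real pre_transform pads instead of stretching.
    return [cv2.resize(im, (imgsz, imgsz)) for im in ims]

out = pre_transform_sketch([np.zeros((480, 640, 3), dtype=np.uint8)])
```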
predict_cli
Method used for Command Line Interface (CLI) prediction.
This function is designed to run predictions using the CLI. It sets up the source and model, then processes the inputs in a streaming manner. This method ensures that no outputs accumulate in memory by consuming the generator without storing results.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `source` | `str \| Path \| List[str] \| List[Path] \| List[ndarray] \| ndarray \| Tensor \| None` | Source for inference. | `None` |
| `model` | `str \| Path \| Module \| None` | Model for inference. | `None` |
Note
Do not modify this function or remove the generator. The generator ensures that no outputs are accumulated in memory, which is critical for preventing memory issues during long-running predictions.
Source code in ultralytics/engine/predictor.py
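Conceptually, predict_cli drains the stream_inference generator without retaining results, roughly like the sketch below (paths are placeholders):

```python
gen = predictor.stream_inference(source="video.mp4", model="yolo11n.pt")
for _ in gen:
    pass  # outputs are written to disk by write_results/save_predicted_images; nothing accumulates
```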
preprocess
Prepares input image before inference.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `im` | `torch.Tensor \| List[np.ndarray]` | Images of shape (N, 3, h, w) for tensor, [(h, w, 3) x N] for list. | *required* |
Source code in ultralytics/engine/predictor.py
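For list inputs, preprocessing roughly amounts to stacking, channel reordering, and scaling. A simplified sketch; the real method also applies pre_transform, half-precision handling, and device placement:

```python
import numpy as np
import torch

def preprocess_sketch(ims):
    im = np.stack(ims)                        # (N, H, W, 3), BGR uint8
    im = im[..., ::-1].transpose(0, 3, 1, 2)  # BGR -> RGB, HWC -> CHW
    im = np.ascontiguousarray(im)
    return torch.from_numpy(im).float() / 255.0  # scale to [0, 1]
```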
run_callbacks
save_predicted_images
Save video predictions as mp4 or images as jpg at specified path.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `save_path` | `str` | Path to save the results. | `''` |
| `frame` | `int` | Frame number for video mode. | `0` |
Source code in ultralytics/engine/predictor.py
setup_model
Initialize YOLO model with given parameters and set it to evaluation mode.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `str \| Path \| Module \| None` | Model to load or use. | *required* |
| `verbose` | `bool` | Whether to print verbose output. | `True` |
Source code in ultralytics/engine/predictor.py
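A usage sketch; the weights path is a placeholder, and stream_inference calls setup_model automatically when the model has not been set up yet:

```python
predictor.setup_model(model="yolo11n.pt", verbose=False)  # wraps the weights in AutoBackend and sets eval mode
```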
setup_source
Set up source and inference mode.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `source` | `str \| Path \| List[str] \| List[Path] \| List[ndarray] \| ndarray \| Tensor` | Source for inference. | *required* |
Source code in ultralytics/engine/predictor.py
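A usage sketch with a placeholder path; setup_source builds the dataset and inference mode for the given input:

```python
predictor.setup_source("path/to/images/")  # file, directory, glob, URL, stream index, array, or tensor
```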
show
Display an image in a window.
Source code in ultralytics/engine/predictor.py
stream_inference
Stream real-time inference on camera feed and save results to file.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `source` | `str \| Path \| List[str] \| List[Path] \| List[ndarray] \| ndarray \| Tensor \| None` | Source for inference. | `None` |
| `model` | `str \| Path \| Module \| None` | Model for inference. | `None` |
| `*args` | `Any` | Additional arguments for the inference method. | `()` |
| `**kwargs` | `Any` | Additional keyword arguments for the inference method. | `{}` |

Yields:

| Type | Description |
|---|---|
| `Results` | Results objects. |
Source code in ultralytics/engine/predictor.py
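A usage sketch for webcam streaming; source `"0"` selects the default camera (an assumption about available hardware) and the weights path is a placeholder:

```python
for result in predictor.stream_inference(source="0", model="yolo11n.pt"):
    annotated = result.plot()  # draw predictions on the frame as a numpy array
```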
write_results
Write inference results to a file or directory.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `i` | `int` | Index of the current image in the batch. | *required* |
| `p` | `Path` | Path to the current image. | *required* |
| `im` | `Tensor` | Preprocessed image tensor. | *required* |
| `s` | `List[str]` | List of result strings. | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | String with result information. |