Workouts Monitoring using Ultralytics YOLO11
Monitoring workouts through pose estimation with Ultralytics YOLO11 enhances exercise assessment by accurately tracking key body landmarks and joints in real-time. This technology provides instant feedback on exercise form, tracks workout routines, and measures performance metrics, optimizing training sessions for users and trainers alike.
Watch: Workouts Monitoring using Ultralytics YOLO11 | Pushups, Pullups, Ab Workouts
Advantages of Workouts Monitoring?
- Optimized Performance: Tailoring workouts based on monitoring data for better results.
- Goal Achievement: Track and adjust fitness goals for measurable progress.
- Personalization: Customized workout plans based on individual data for effectiveness.
- Health Awareness: Early detection of patterns indicating health issues or overtraining.
- Informed Decisions: Data-driven decisions for adjusting routines and setting realistic goals.
Real World Applications
Workouts Monitoring | Workouts Monitoring |
---|---|
PushUps Counting | PullUps Counting |
Workouts Monitoring Example
Workouts Monitoring

import cv2

from ultralytics import solutions

cap = cv2.VideoCapture("path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

gym = solutions.AIGym(
    model="yolo11n-pose.pt",
    show=True,
    kpts=[6, 8, 10],
)

while cap.isOpened():
    success, im0 = cap.read()
    if not success:
        print("Video frame is empty or video processing has been successfully completed.")
        break
    im0 = gym.monitor(im0)

cap.release()
cv2.destroyAllWindows()
Workouts Monitoring with Save Output

import cv2

from ultralytics import solutions

cap = cv2.VideoCapture("path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
video_writer = cv2.VideoWriter("workouts.avi", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

gym = solutions.AIGym(
    model="yolo11n-pose.pt",
    show=True,
    kpts=[6, 8, 10],
)

while cap.isOpened():
    success, im0 = cap.read()
    if not success:
        print("Video frame is empty or video processing has been successfully completed.")
        break
    im0 = gym.monitor(im0)
    video_writer.write(im0)

cap.release()
cv2.destroyAllWindows()
video_writer.release()
KeyPoints Map
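The keypoints map defines which index corresponds to which body landmark. For reference, YOLO11 pose models predict 17 keypoints in the standard COCO order, sketched below; the kpts=[6, 8, 10] used in the examples selects the right shoulder, right elbow, and right wrist.

# COCO keypoint order used by YOLO pose models (index -> landmark)
COCO_KEYPOINTS = {
    0: "nose",
    1: "left_eye", 2: "right_eye",
    3: "left_ear", 4: "right_ear",
    5: "left_shoulder", 6: "right_shoulder",
    7: "left_elbow", 8: "right_elbow",
    9: "left_wrist", 10: "right_wrist",
    11: "left_hip", 12: "right_hip",
    13: "left_knee", 14: "right_knee",
    15: "left_ankle", 16: "right_ankle",
}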
AIGym Arguments
Name | Type | Default | Description |
---|---|---|---|
kpts | list | None | List of three keypoint indices used to count a specific workout, following the keypoints map. |
line_width | int | 2 | Thickness of the drawn lines. |
show | bool | False | Flag to display the image. |
up_angle | float | 145.0 | Angle threshold for the "up" pose. |
down_angle | float | 90.0 | Angle threshold for the "down" pose. |
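As a minimal sketch, these arguments can be combined when constructing the solution; the values below simply restate the defaults from the table, not tuned recommendations:

from ultralytics import solutions

# Illustrative AIGym configuration combining the arguments above
gym = solutions.AIGym(
    model="yolo11n-pose.pt",  # pose model used for keypoint detection
    kpts=[6, 8, 10],          # right shoulder, right elbow, right wrist
    line_width=2,             # thickness of drawn lines
    show=False,               # do not open a display window
    up_angle=145.0,           # angle threshold for the "up" pose
    down_angle=90.0,          # angle threshold for the "down" pose
)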
model.predict Arguments
Argument | Type | Default | Description |
---|---|---|---|
source | str | 'ultralytics/assets' | Specifies the data source for inference. Can be an image path, video file, directory, URL, or device ID for live feeds. Supports a wide range of formats and sources, enabling flexible application across different types of input. |
conf | float | 0.25 | Sets the minimum confidence threshold for detections. Objects detected with confidence below this threshold are ignored. Adjusting this value can help reduce false positives. |
iou | float | 0.7 | Intersection Over Union (IoU) threshold for Non-Maximum Suppression (NMS). Lower values result in fewer detections by eliminating overlapping boxes, useful for reducing duplicates. |
imgsz | int or tuple | 640 | Defines the image size for inference. Can be a single integer 640 for square resizing or a (height, width) tuple. Proper sizing can improve detection accuracy and processing speed. |
half | bool | False | Enables half-precision (FP16) inference, which can speed up model inference on supported GPUs with minimal impact on accuracy. |
device | str | None | Specifies the device for inference (e.g., cpu, cuda:0 or 0). Allows users to select between CPU, a specific GPU, or other compute devices for model execution. |
max_det | int | 300 | Maximum number of detections allowed per image. Limits the total number of objects the model can detect in a single inference, preventing excessive outputs in dense scenes. |
vid_stride | int | 1 | Frame stride for video inputs. Allows skipping frames in videos to speed up processing at the cost of temporal resolution. A value of 1 processes every frame; higher values skip frames. |
stream_buffer | bool | False | Determines whether to queue incoming frames for video streams. If False, old frames are dropped to accommodate new frames (optimized for real-time applications). If True, new frames are queued in a buffer, ensuring no frames get skipped, but this will cause latency if inference FPS is lower than stream FPS. |
visualize | bool | False | Activates visualization of model features during inference, providing insight into what the model "sees". Useful for debugging and model interpretation. |
augment | bool | False | Enables test-time augmentation (TTA) for predictions, potentially improving detection robustness at the cost of inference speed. |
agnostic_nms | bool | False | Enables class-agnostic Non-Maximum Suppression (NMS), which merges overlapping boxes of different classes. Useful in multi-class detection scenarios where class overlap is common. |
classes | list[int] | None | Filters predictions to a set of class IDs. Only detections belonging to the specified classes are returned. Useful for focusing on relevant objects in multi-class detection tasks. |
retina_masks | bool | False | Uses high-resolution segmentation masks if available in the model. This can enhance mask quality for segmentation tasks, providing finer detail. |
embed | list[int] | None | Specifies the layers from which to extract feature vectors or embeddings. Useful for downstream tasks like clustering or similarity search. |
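These arguments apply to the underlying pose model when you run inference directly. A minimal sketch, with the file path and values as placeholders:

from ultralytics import YOLO

# Load the pose model and run prediction with a few of the arguments above
model = YOLO("yolo11n-pose.pt")
results = model.predict(
    source="path/to/video/file.mp4",  # image, video, directory, URL, or device ID
    conf=0.25,     # minimum confidence threshold
    iou=0.7,       # NMS IoU threshold
    imgsz=640,     # inference image size
    device="cpu",  # or "cuda:0" on a supported GPU
)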
model.track Arguments
Argument | Type | Default | Description |
---|---|---|---|
source | str | None | Specifies the source directory for images or videos. Supports file paths and URLs. |
persist | bool | False | Enables persistent tracking of objects between frames, maintaining IDs across video sequences. |
tracker | str | botsort.yaml | Specifies the tracking algorithm to use, e.g., bytetrack.yaml or botsort.yaml. |
conf | float | 0.3 | Sets the confidence threshold for detections; lower values allow more objects to be tracked but may include false positives. |
iou | float | 0.5 | Sets the Intersection over Union (IoU) threshold for filtering overlapping detections. |
classes | list | None | Filters results by class index. For example, classes=[0, 2, 3] only tracks the specified classes. |
verbose | bool | True | Controls the display of tracking results, providing a visual output of tracked objects. |
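Tracking can also be run frame by frame to keep consistent IDs across a video. A minimal sketch, assuming a pose model and a local video file:

import cv2

from ultralytics import YOLO

model = YOLO("yolo11n-pose.pt")
cap = cv2.VideoCapture("path/to/video/file.mp4")

while cap.isOpened():
    success, frame = cap.read()
    if not success:
        break
    # persist=True keeps track IDs consistent between frames
    results = model.track(frame, persist=True, conf=0.3, iou=0.5, verbose=False)

cap.release()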
FAQ
How do I monitor my workouts using Ultralytics YOLO11?
To monitor your workouts using Ultralytics YOLO11, you can utilize the pose estimation capabilities to track and analyze key body landmarks and joints in real-time. This allows you to receive instant feedback on your exercise form, count repetitions, and measure performance metrics. You can start by using the provided example code for pushups, pullups, or ab workouts as shown:
import cv2

from ultralytics import solutions

cap = cv2.VideoCapture("path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

gym = solutions.AIGym(
    line_width=2,
    show=True,
    kpts=[6, 8, 10],
)

while cap.isOpened():
    success, im0 = cap.read()
    if not success:
        print("Video frame is empty or video processing has been successfully completed.")
        break
    im0 = gym.monitor(im0)

cv2.destroyAllWindows()
For further customization and parameters, refer to the AIGym section of the documentation.
What are the benefits of using Ultralytics YOLO11 for workout monitoring?
Using Ultralytics YOLO11 for workout monitoring provides several key benefits:
- Optimized Performance: Tailoring workouts based on monitoring data helps you achieve better results.
- Goal Achievement: Easily track and adjust fitness goals for measurable progress.
- Personalization: Get customized workout plans based on your individual data for optimal effectiveness.
- Health Awareness: Early detection of patterns indicating potential health issues or overtraining.
- Informed Decisions: Make data-driven decisions to adjust routines and set realistic goals.
You can watch the YouTube video demonstration to see these advantages in action.
How accurate is Ultralytics YOLO11 in detecting and tracking exercises?
Ultralytics YOLO11 is highly accurate in detecting and tracking exercises due to its state-of-the-art pose estimation capabilities. It can accurately track key body landmarks and joints, providing real-time feedback on exercise form and performance metrics. The model's pretrained weights and robust architecture ensure high precision and reliability. For real-world examples, check out the real-world applications section in the documentation, which showcases pushups and pullups counting.
Can I use Ultralytics YOLO11 for custom workout routines?
Yes, Ultralytics YOLO11 can be adapted for custom workout routines. The AIGym class supports different pose types such as pushups, pullups, and ab workouts. You can specify keypoints and angles to detect specific exercises. Here is an example setup:
from ultralytics import solutions

gym = solutions.AIGym(
    line_width=2,
    show=True,
    kpts=[6, 8, 10],
)
For more details on setting arguments, refer to the AIGym Arguments section. This flexibility allows you to monitor various exercises and customize routines based on your needs; a left-arm variation is sketched below.
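For example, to monitor the left arm instead of the right, you could pass the left shoulder, elbow, and wrist indices from the keypoints map (an illustrative variation, not an official preset):

from ultralytics import solutions

# Track the left arm: left shoulder (5), left elbow (7), left wrist (9)
gym_left = solutions.AIGym(
    line_width=2,
    show=True,
    kpts=[5, 7, 9],
)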
How can I save the workout monitoring output using Ultralytics YOLO11?
To save the workout monitoring output, you can modify the code to include a video writer that saves the processed frames. Here is an example:
import cv2

from ultralytics import solutions

cap = cv2.VideoCapture("path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
video_writer = cv2.VideoWriter("workouts.avi", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

gym = solutions.AIGym(
    line_width=2,
    show=True,
    kpts=[6, 8, 10],
)

while cap.isOpened():
    success, im0 = cap.read()
    if not success:
        print("Video frame is empty or video processing has been successfully completed.")
        break
    im0 = gym.monitor(im0)
    video_writer.write(im0)

cv2.destroyAllWindows()
video_writer.release()
This setup writes the monitored video to an output file. For more details, see the "Workouts Monitoring with Save Output" section.