์ฝ˜ํ…์ธ ๋กœ ๊ฑด๋„ˆ๋›ฐ๊ธฐ

์ฝ”์ฝ” ํฌ์ฆˆ ๋ฐ์ดํ„ฐ ์„ธํŠธ

COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ํฌ์ฆˆ ์ถ”์ • ์ž‘์—…์„ ์œ„ํ•ด ์„ค๊ณ„๋œ COCO(Common Objects in Context) ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ํŠน์ˆ˜ ๋ฒ„์ „์ž…๋‹ˆ๋‹ค. COCO ํ‚คํฌ์ธํŠธ 2017 ์ด๋ฏธ์ง€์™€ ๋ ˆ์ด๋ธ”์„ ํ™œ์šฉํ•˜์—ฌ ํฌ์ฆˆ ์ถ”์ • ์ž‘์—…์„ ์œ„ํ•œ YOLO ๊ฐ™์€ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

Pose sample images

์ฝ”์ฝ” ํฌ์ฆˆ ์‚ฌ์ „ ํ›ˆ๋ จ๋œ ๋ชจ๋ธ

๋ชจ๋ธ ํฌ๊ธฐ
(ํ”ฝ์…€)
mAPpose
50-95
mAPpose
50
์†๋„
CPU ONNX
(ms)
์†๋„
A100 TensorRT
(ms)
๋งค๊ฐœ๋ณ€์ˆ˜
(M)
FLOPs
(B)
YOLOv8n-pose 640 50.4 80.1 131.8 1.18 3.3 9.2
YOLOv8s-pose 640 60.0 86.2 233.2 1.42 11.6 30.2
YOLOv8m-pose 640 65.0 88.8 456.3 2.00 26.4 81.0
YOLOv8l-pose 640 67.6 90.0 784.5 2.59 44.4 168.6
YOLOv8x-pose 640 69.2 90.2 1607.1 3.73 69.4 263.2
YOLOv8x-pose-p6 1280 71.6 91.2 4088.7 10.04 99.1 1066.4
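
The checkpoints listed above can be pulled by name through the Ultralytics Python API and inspected locally; a minimal sketch (speed numbers measured on your own hardware will of course differ from the table):

from ultralytics import YOLO

# Load one of the pretrained COCO-Pose checkpoints from the table above;
# it is downloaded automatically on first use.
model = YOLO('yolov8n-pose.pt')

# Print a layer/parameter/GFLOPs summary to compare against the table.
model.info()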

Key Features

  • COCO-Pose builds upon the COCO Keypoints 2017 dataset, which contains 200K images labeled with keypoints for pose estimation tasks.
  • The dataset supports 17 keypoints for human figures, facilitating detailed pose estimation.
  • Like COCO, it provides standardized evaluation metrics, including Object Keypoint Similarity (OKS) for pose estimation tasks, making it suitable for comparing model performance (a minimal OKS sketch follows below).
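
For reference, OKS scores a predicted pose against a ground-truth pose with a per-keypoint, scale-normalized Gaussian falloff. Below is a minimal NumPy sketch of the standard COCO formulation; the per-keypoint sigmas are the values used by the official COCO evaluation code and are an assumption here, not something defined on this page.

import numpy as np

# Per-keypoint constants (COCO sigmas for the 17 person keypoints), taken from
# the standard COCO keypoint evaluation.
COCO_SIGMAS = np.array([
    0.026, 0.025, 0.025, 0.035, 0.035, 0.079, 0.079, 0.072, 0.072,
    0.062, 0.062, 0.107, 0.107, 0.087, 0.087, 0.089, 0.089,
])

def oks(pred_xy, gt_xy, gt_vis, area, sigmas=COCO_SIGMAS, eps=1e-7):
    """Object Keypoint Similarity between one predicted and one ground-truth pose.

    pred_xy, gt_xy: (17, 2) arrays of keypoint coordinates.
    gt_vis:         (17,) visibility flags (>0 means the keypoint is labeled).
    area:           ground-truth object area, i.e. the scale term s**2.
    """
    d2 = ((pred_xy - gt_xy) ** 2).sum(axis=1)  # squared distances d_i^2
    k2 = (2 * sigmas) ** 2                     # per-keypoint constant k_i^2
    e = d2 / (2 * area * k2 + eps)             # exponent d_i^2 / (2 * s^2 * k_i^2)
    labeled = gt_vis > 0
    if not labeled.any():
        return 0.0
    return float(np.exp(-e)[labeled].mean())   # average over labeled keypoints only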

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์กฐ

COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์„ธ ๊ฐœ์˜ ํ•˜์œ„ ์ง‘ํ•ฉ์œผ๋กœ ๋‚˜๋‰ฉ๋‹ˆ๋‹ค:

  1. Train2017: This subset contains a portion of the 118K images in the COCO dataset, annotated for training pose estimation models.
  2. Val2017: This subset contains images used for validation purposes during model training (see the validation sketch after this list).
  3. Test2017: This subset consists of images used for testing and benchmarking trained models. Ground truth annotations for this subset are not publicly available; results are submitted to the COCO evaluation server for performance evaluation.
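
For example, to evaluate a pose checkpoint on the Val2017 split, the Ultralytics val() method can be pointed at coco-pose.yaml; a minimal sketch (the metric attribute names assume the current Ultralytics PoseMetrics API):

from ultralytics import YOLO

# Load a pose checkpoint (pretrained here; your own best.pt works the same way)
model = YOLO('yolov8n-pose.pt')

# Validate on the val2017 split defined in coco-pose.yaml
metrics = model.val(data='coco-pose.yaml')
print(metrics.pose.map)    # mAP50-95 (pose)
print(metrics.pose.map50)  # mAP50 (pose)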

์• ํ”Œ๋ฆฌ์ผ€์ด์…˜

COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ํŠนํžˆ OpenPose์™€ ๊ฐ™์€ ํ‚คํฌ์ธํŠธ ๊ฐ์ง€ ๋ฐ ํฌ์ฆˆ ์ถ”์ • ์ž‘์—…์—์„œ ๋”ฅ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๊ณ  ํ‰๊ฐ€ํ•˜๋Š” ๋ฐ ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ์ˆ˜๋งŽ์€ ์ฃผ์„์ด ๋‹ฌ๋ฆฐ ์ด๋ฏธ์ง€์™€ ํ‘œ์ค€ํ™”๋œ ํ‰๊ฐ€ ๋ฉ”ํŠธ๋ฆญ์ด ํฌํ•จ๋˜์–ด ์žˆ์–ด ํฌ์ฆˆ ์ถ”์ •์— ์ค‘์ ์„ ๋‘” ์ปดํ“จํ„ฐ ๋น„์ „ ์—ฐ๊ตฌ์ž ๋ฐ ์‹ค๋ฌด์ž์—๊ฒŒ ํ•„์ˆ˜์ ์ธ ๋ฆฌ์†Œ์Šค์ž…๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ YAML

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์„ฑ์„ ์ •์˜ํ•˜๋Š” ๋ฐ๋Š” YAML(๋˜ ๋‹ค๋ฅธ ๋งˆํฌ์—… ์–ธ์–ด) ํŒŒ์ผ์ด ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ๋กœ, ํด๋ž˜์Šค ๋ฐ ๊ธฐํƒ€ ๊ด€๋ จ ์ •๋ณด์— ๋Œ€ํ•œ ์ •๋ณด๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ์šฐ, ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ coco-pose.yaml ํŒŒ์ผ์€ ๋‹ค์Œ ์œ„์น˜์—์„œ ์œ ์ง€๋ฉ๋‹ˆ๋‹ค. https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco-pose.yaml.

ultralytics/cfg/datasets/coco-pose.yaml

# Ultralytics YOLO ๐Ÿš€, AGPL-3.0 license
# COCO 2017 dataset https://cocodataset.org by Microsoft
# Documentation: https://docs.ultralytics.com/datasets/pose/coco/
# Example usage: yolo train data=coco-pose.yaml
# parent
# โ”œโ”€โ”€ ultralytics
# โ””โ”€โ”€ datasets
#     โ””โ”€โ”€ coco-pose  โ† downloads here (20.1 GB)

# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]
path: ../datasets/coco-pose # dataset root dir
train: train2017.txt # train images (relative to 'path') 118287 images
val: val2017.txt # val images (relative to 'path') 5000 images
test: test-dev2017.txt # 20288 of 40670 images, submit to https://competitions.codalab.org/competitions/20794

# Keypoints
kpt_shape: [17, 3] # number of keypoints, number of dims (2 for x,y or 3 for x,y,visible)
flip_idx: [0, 2, 1, 4, 3, 6, 5, 8, 7, 10, 9, 12, 11, 14, 13, 16, 15]

# Classes
names:
  0: person

# Download script/URL (optional)
download: |
  from ultralytics.utils.downloads import download
  from pathlib import Path

  # Download labels
  dir = Path(yaml['path'])  # dataset root dir
  url = 'https://github.com/ultralytics/yolov5/releases/download/v1.0/'
  urls = [url + 'coco2017labels-pose.zip']  # labels
  download(urls, dir=dir.parent)
  # Download data
  urls = ['http://images.cocodataset.org/zips/train2017.zip',  # 19G, 118k images
          'http://images.cocodataset.org/zips/val2017.zip',  # 1G, 5k images
          'http://images.cocodataset.org/zips/test2017.zip']  # 7G, 41k images (optional)
  download(urls, dir=dir / 'images', threads=3)
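
Two pose-specific fields above deserve a note: kpt_shape: [17, 3] means every person is labeled with 17 keypoints stored as (x, y, visibility), and flip_idx gives, for each keypoint index, its left/right counterpart so labels remain correct under horizontal-flip augmentation. A minimal sketch of how such a mapping is applied (the keypoint name ordering is the usual COCO convention, assumed here rather than stated in the YAML):

import numpy as np

# Usual COCO keypoint order (assumed): 0 nose, 1 left_eye, 2 right_eye, 3 left_ear,
# 4 right_ear, 5 left_shoulder, 6 right_shoulder, ..., 15 left_ankle, 16 right_ankle
flip_idx = [0, 2, 1, 4, 3, 6, 5, 8, 7, 10, 9, 12, 11, 14, 13, 16, 15]

def hflip_keypoints(kpts, img_w):
    """Horizontally flip one person's (17, 3) keypoints: mirror x, then swap left/right."""
    flipped = kpts.copy()
    flipped[:, 0] = img_w - flipped[:, 0]  # mirror the x coordinate
    return flipped[flip_idx]               # e.g. left_eye (1) <-> right_eye (2)

kpts = np.random.rand(17, 3) * [640, 640, 1]  # dummy (x, y, visibility) labels
print(hflip_keypoints(kpts, img_w=640)[:3])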

Usage

์ด๋ฏธ์ง€ ํฌ๊ธฐ๊ฐ€ 640์ธ 100๊ฐœ์˜ ์—ํฌํฌ์— ๋Œ€ํ•ด COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ์—์„œ YOLOv8n-pose ๋ชจ๋ธ์„ ํ•™์Šตํ•˜๋ ค๋ฉด ๋‹ค์Œ ์ฝ”๋“œ ์Šค๋‹ˆํŽซ์„ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์‚ฌ์šฉ ๊ฐ€๋Šฅํ•œ ์ธ์ˆ˜์˜ ์ „์ฒด ๋ชฉ๋ก์€ ๋ชจ๋ธ ํ•™์Šต ํŽ˜์ด์ง€๋ฅผ ์ฐธ์กฐํ•˜์„ธ์š”.

Train Example

from ultralytics import YOLO

# Load a model
model = YOLO('yolov8n-pose.pt')  # load a pretrained model (recommended for training)

# Train the model
results = model.train(data='coco-pose.yaml', epochs=100, imgsz=640)

# CLI alternative: start training from a pretrained *.pt model
yolo pose train data=coco-pose.yaml model=yolov8n-pose.pt epochs=100 imgsz=640

์ƒ˜ํ”Œ ์ด๋ฏธ์ง€ ๋ฐ ์ฃผ์„

COCO-Pose ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ํ‚คํฌ์ธํŠธ๋กœ ์ฃผ์„์„ ๋‹จ ๋‹ค์–‘ํ•œ ์ด๋ฏธ์ง€ ์„ธํŠธ๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ๋‹ค์Œ์€ ํ•ด๋‹น ์ฃผ์„๊ณผ ํ•จ๊ป˜ ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ์ด๋ฏธ์ง€ ๋ช‡ ๊ฐ€์ง€ ์˜ˆ์‹œ์ž…๋‹ˆ๋‹ค:

๋ฐ์ดํ„ฐ ์„ธํŠธ ์ƒ˜ํ”Œ ์ด๋ฏธ์ง€

  • ๋ชจ์ž์ดํฌ ์ด๋ฏธ์ง€: ์ด ์ด๋ฏธ์ง€๋Š” ๋ชจ์ž์ดํฌ๋œ ๋ฐ์ดํ„ฐ ์„ธํŠธ ์ด๋ฏธ์ง€๋กœ ๊ตฌ์„ฑ๋œ ํ›ˆ๋ จ ๋ฐฐ์น˜์˜ ์˜ˆ์‹œ์ž…๋‹ˆ๋‹ค. ๋ชจ์ž์ดํฌ๋Š” ์—ฌ๋Ÿฌ ์ด๋ฏธ์ง€๋ฅผ ํ•˜๋‚˜์˜ ์ด๋ฏธ์ง€๋กœ ๊ฒฐํ•ฉํ•˜์—ฌ ๊ฐ ํ›ˆ๋ จ ๋ฐฐ์น˜ ๋‚ด์—์„œ ๋‹ค์–‘ํ•œ ๊ฐœ์ฒด์™€ ์žฅ๋ฉด์„ ๋Š˜๋ฆฌ๊ธฐ ์œ„ํ•ด ํ›ˆ๋ จ ์ค‘์— ์‚ฌ์šฉ๋˜๋Š” ๊ธฐ์ˆ ์ž…๋‹ˆ๋‹ค. ์ด๋ฅผ ํ†ตํ•ด ๋‹ค์–‘ํ•œ ๊ฐ์ฒด ํฌ๊ธฐ, ์ข…ํšก๋น„ ๋ฐ ์ปจํ…์ŠคํŠธ์— ์ผ๋ฐ˜ํ™”ํ•˜๋Š” ๋ชจ๋ธ์˜ ๋Šฅ๋ ฅ์„ ํ–ฅ์ƒ์‹œํ‚ฌ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

The example showcases the variety and complexity of the images in the COCO-Pose dataset and the benefits of using mosaicing during the training process.
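
Mosaic is controlled through the standard Ultralytics training augmentation hyperparameters; a brief sketch showing how it can be tuned or disabled (the mosaic and close_mosaic arguments are regular train settings, shown here with example values):

from ultralytics import YOLO

model = YOLO('yolov8n-pose.pt')

# mosaic=1.0 applies mosaic to every batch; mosaic=0.0 disables it entirely.
# close_mosaic=10 turns mosaic off for the final 10 epochs so training ends
# on undistorted images.
model.train(data='coco-pose.yaml', epochs=100, imgsz=640, mosaic=1.0, close_mosaic=10)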

์ธ์šฉ ๋ฐ ๊ฐ์‚ฌ

์—ฐ๊ตฌ ๋˜๋Š” ๊ฐœ๋ฐœ ์ž‘์—…์— COCO-Pose ๋ฐ์ดํ„ฐ์…‹์„ ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ ๋‹ค์Œ ๋…ผ๋ฌธ์„ ์ธ์šฉํ•ด ์ฃผ์„ธ์š”:

@misc{lin2015microsoft,
      title={Microsoft COCO: Common Objects in Context},
      author={Tsung-Yi Lin and Michael Maire and Serge Belongie and Lubomir Bourdev and Ross Girshick and James Hays and Pietro Perona and Deva Ramanan and C. Lawrence Zitnick and Piotr Dollรกr},
      year={2015},
      eprint={1405.0312},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

We would like to acknowledge the COCO Consortium for creating and maintaining this valuable resource for the computer vision community. For more information about the COCO-Pose dataset and its creators, visit the COCO dataset website.



์ƒ์„ฑ 2023-11-12, ์—…๋ฐ์ดํŠธ 2024-04-17
์ž‘์„ฑ์ž: glenn-jocher (4), RizwanMunawar (1), Laughing-q (1)

๋Œ“๊ธ€