์ฝ˜ํ…์ธ ๋กœ ๊ฑด๋„ˆ๋›ฐ๊ธฐ

SKU-110k ๋ฐ์ดํ„ฐ ์„ธํŠธ

SKU-110k ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์กฐ๋ฐ€ํ•˜๊ฒŒ ํฌ์žฅ๋œ ์†Œ๋งค์  ์ง„์—ด๋Œ€ ์ด๋ฏธ์ง€ ๋ชจ์Œ์œผ๋กœ, ๋ฌผ์ฒด ๊ฐ์ง€ ์ž‘์—…์˜ ์—ฐ๊ตฌ๋ฅผ ์ง€์›ํ•˜๊ธฐ ์œ„ํ•ด ์„ค๊ณ„๋˜์—ˆ์Šต๋‹ˆ๋‹ค. Eran Goldman ๋“ฑ์ด ๊ฐœ๋ฐœํ•œ ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ์œ ์‚ฌํ•˜๊ฑฐ๋‚˜ ์‹ฌ์ง€์–ด ๋™์ผํ•ด ๋ณด์ด๋Š” ๋ฌผ์ฒด๊ฐ€ ๋ฐ€์ง‘๋˜์–ด ์žˆ๊ณ  ๊ฐ€๊นŒ์šด ๊ณณ์— ๋ฐฐ์น˜๋œ 110,000๊ฐœ ์ด์ƒ์˜ ๊ณ ์œ ํ•œ SKU(์ƒํ’ˆ ๋ถ„๋ฅ˜ ๋‹จ์œ„) ์นดํ…Œ๊ณ ๋ฆฌ๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ ์ƒ˜ํ”Œ ์ด๋ฏธ์ง€

Key Features

  • SKU-110k์—๋Š” ์ „ ์„ธ๊ณ„์˜ ๋งค์žฅ ์ง„์—ด๋Œ€ ์ด๋ฏธ์ง€๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์œผ๋ฉฐ, ์ตœ์ฒจ๋‹จ ๋ฌผ์ฒด ๊ฐ์ง€๊ธฐ์— ๋„์ „ํ•  ์ˆ˜ ์žˆ๋Š” ๋ฐ€์ง‘๋œ ๋ฌผ์ฒด๊ฐ€ ํŠน์ง•์ž…๋‹ˆ๋‹ค.
  • ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” 110,000๊ฐœ ์ด์ƒ์˜ ๊ณ ์œ ํ•œ SKU ์นดํ…Œ๊ณ ๋ฆฌ๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์–ด ๋‹ค์–‘ํ•œ ๊ฐ์ฒด ๋ชจ์–‘์„ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.
  • ์ฃผ์„์—๋Š” ๊ฐ์ฒด์˜ ๊ฒฝ๊ณ„ ์ƒ์ž ๋ฐ SKU ์นดํ…Œ๊ณ ๋ฆฌ ๋ ˆ์ด๋ธ”์ด ํฌํ•จ๋ฉ๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์กฐ

SKU-110k ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์„ธ ๊ฐ€์ง€ ์ฃผ์š” ํ•˜์œ„ ์ง‘ํ•ฉ์œผ๋กœ ๊ตฌ์„ฑ๋ฉ๋‹ˆ๋‹ค:

  1. ํ›ˆ๋ จ ์ง‘ํ•ฉ: ์ด ํ•˜์œ„ ์ง‘ํ•ฉ์—๋Š” ๊ฐ์ฒด ๊ฐ์ง€ ๋ชจ๋ธ ํ•™์Šต์— ์‚ฌ์šฉ๋˜๋Š” ์ด๋ฏธ์ง€์™€ ์ฃผ์„์ด ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.
  2. ์œ ํšจ์„ฑ ๊ฒ€์‚ฌ ์ง‘ํ•ฉ: ์ด ํ•˜์œ„ ์ง‘ํ•ฉ์€ ํ•™์Šต ์ค‘ ๋ชจ๋ธ ์œ ํšจ์„ฑ ๊ฒ€์‚ฌ์— ์‚ฌ์šฉ๋˜๋Š” ์ด๋ฏธ์ง€์™€ ์ฃผ์„์œผ๋กœ ๊ตฌ์„ฑ๋ฉ๋‹ˆ๋‹ค.
  3. ํ…Œ์ŠคํŠธ ์„ธํŠธ: ์ด ํ•˜์œ„ ์ง‘ํ•ฉ์€ ํ•™์Šต๋œ ๊ฐ์ฒด ๊ฐ์ง€ ๋ชจ๋ธ์˜ ์ตœ์ข… ํ‰๊ฐ€๋ฅผ ์œ„ํ•ด ์„ค๊ณ„๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
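
Once a model has been trained (see Usage below), the validation and test subsets can be evaluated with the Ultralytics Python API. A minimal sketch; the weights path is an assumption based on the default training output location:

from ultralytics import YOLO

# Load trained weights (default save location is an assumption; adjust to your run)
model = YOLO('runs/detect/train/weights/best.pt')

# Evaluate on the validation split, then on the held-out test split
val_metrics = model.val(data='SKU-110K.yaml')                 # uses the 'val' split
test_metrics = model.val(data='SKU-110K.yaml', split='test')  # uses the 'test' split
print(val_metrics.box.map, test_metrics.box.map)  # mAP50-95 for each split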

์• ํ”Œ๋ฆฌ์ผ€์ด์…˜

SKU-110k ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ๋ฌผ์ฒด ๊ฐ์ง€ ์ž‘์—…, ํŠนํžˆ ์†Œ๋งค์  ์ง„์—ด๋Œ€์™€ ๊ฐ™์ด ๋ฐ€์ง‘๋œ ์žฅ๋ฉด์—์„œ ๋”ฅ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๊ณ  ํ‰๊ฐ€ํ•˜๋Š” ๋ฐ ๋„๋ฆฌ ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋‹ค์–‘ํ•œ SKU ์นดํ…Œ๊ณ ๋ฆฌ์™€ ๋ฐ€์ง‘๋œ ๊ฐ์ฒด ๋ฐฐ์—ด์€ ์ปดํ“จํ„ฐ ๋น„์ „ ๋ถ„์•ผ์˜ ์—ฐ๊ตฌ์ž์™€ ์‹ค๋ฌด์ž์—๊ฒŒ ์œ ์šฉํ•œ ๋ฆฌ์†Œ์Šค์ž…๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ YAML

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์„ฑ์„ ์ •์˜ํ•˜๋Š” ๋ฐ๋Š” YAML(๋˜ ๋‹ค๋ฅธ ๋งˆํฌ์—… ์–ธ์–ด) ํŒŒ์ผ์ด ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ๋กœ, ํด๋ž˜์Šค ๋ฐ ๊ธฐํƒ€ ๊ด€๋ จ ์ •๋ณด์— ๋Œ€ํ•œ ์ •๋ณด๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. SKU-110K ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ์šฐ, ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ SKU-110K.yaml ํŒŒ์ผ์€ ๋‹ค์Œ ์œ„์น˜์—์„œ ์œ ์ง€๋ฉ๋‹ˆ๋‹ค. https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/SKU-110K.yaml.

ultralytics/cfg/datasets/SKU-110K.yaml

# Ultralytics YOLO ๐Ÿš€, AGPL-3.0 license
# SKU-110K retail items dataset https://github.com/eg4000/SKU110K_CVPR19 by Trax Retail
# Documentation: https://docs.ultralytics.com/datasets/detect/sku-110k/
# Example usage: yolo train data=SKU-110K.yaml
# parent
# โ”œโ”€โ”€ ultralytics
# โ””โ”€โ”€ datasets
#     โ””โ”€โ”€ SKU-110K  โ† downloads here (13.6 GB)

# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]
path: ../datasets/SKU-110K # dataset root dir
train: train.txt # train images (relative to 'path')  8219 images
val: val.txt # val images (relative to 'path')  588 images
test: test.txt # test images (optional)  2936 images

# Classes
names:
  0: object

# Download script/URL (optional) ---------------------------------------------------------------------------------------
download: |
  import shutil
  from pathlib import Path

  import numpy as np
  import pandas as pd
  from tqdm import tqdm

  from ultralytics.utils.downloads import download
  from ultralytics.utils.ops import xyxy2xywh

  # Download
  dir = Path(yaml['path'])  # dataset root dir
  parent = Path(dir.parent)  # download dir
  urls = ['http://trax-geometry.s3.amazonaws.com/cvpr_challenge/SKU110K_fixed.tar.gz']
  download(urls, dir=parent)

  # Rename directories
  if dir.exists():
      shutil.rmtree(dir)
  (parent / 'SKU110K_fixed').rename(dir)  # rename dir
  (dir / 'labels').mkdir(parents=True, exist_ok=True)  # create labels dir

  # Convert labels
  names = 'image', 'x1', 'y1', 'x2', 'y2', 'class', 'image_width', 'image_height'  # column names
  for d in 'annotations_train.csv', 'annotations_val.csv', 'annotations_test.csv':
      x = pd.read_csv(dir / 'annotations' / d, names=names).values  # annotations
      images, unique_images = x[:, 0], np.unique(x[:, 0])
      with open((dir / d).with_suffix('.txt').__str__().replace('annotations_', ''), 'w') as f:
          f.writelines(f'./images/{s}\n' for s in unique_images)
      for im in tqdm(unique_images, desc=f'Converting {dir / d}'):
          cls = 0  # single-class dataset
          with open((dir / 'labels' / im).with_suffix('.txt'), 'a') as f:
              for r in x[images == im]:
                  w, h = r[6], r[7]  # image width, height
                  xywh = xyxy2xywh(np.array([[r[1] / w, r[2] / h, r[3] / w, r[4] / h]]))[0]  # instance
                  f.write(f"{cls} {xywh[0]:.5f} {xywh[1]:.5f} {xywh[2]:.5f} {xywh[3]:.5f}\n")  # write label

Usage

์ด๋ฏธ์ง€ ํฌ๊ธฐ๊ฐ€ 640์ธ 100๊ฐœ์˜ ์—ํฌํฌ์— ๋Œ€ํ•ด SKU-110K ๋ฐ์ดํ„ฐ ์„ธํŠธ์—์„œ YOLOv8n ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๋ ค๋ฉด ๋‹ค์Œ ์ฝ”๋“œ ์กฐ๊ฐ์„ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์‚ฌ์šฉ ๊ฐ€๋Šฅํ•œ ์ธ์ˆ˜์˜ ์ „์ฒด ๋ชฉ๋ก์€ ๋ชจ๋ธ ํ•™์Šต ํŽ˜์ด์ง€๋ฅผ ์ฐธ์กฐํ•˜์„ธ์š”.

Train Example

Python:

from ultralytics import YOLO

# Load a model
model = YOLO('yolov8n.pt')  # load a pretrained model (recommended for training)

# Train the model
results = model.train(data='SKU-110K.yaml', epochs=100, imgsz=640)

CLI:

# Start training from a pretrained *.pt model
yolo detect train data=SKU-110K.yaml model=yolov8n.pt epochs=100 imgsz=640
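
After training completes, the resulting weights can be run on new shelf images to localize and count products. A minimal inference sketch; the weights path (the default Ultralytics save location) and the image file name are assumptions:

import cv2
from ultralytics import YOLO

# Load the trained weights (default save path is an assumption; adjust to your run)
model = YOLO('runs/detect/train/weights/best.pt')

# Run inference on a retail shelf image (the image path is hypothetical)
results = model.predict('shelf.jpg', conf=0.25)
print(f'{len(results[0].boxes)} objects detected')

# Draw the predicted boxes and save an annotated copy
annotated = results[0].plot()  # BGR numpy array
cv2.imwrite('shelf_predictions.jpg', annotated)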

์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ ๋ฐ ์ฃผ์„

SKU-110k ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ๊ฐ์ฒด๊ฐ€ ๋ฐ€์ง‘๋œ ๋‹ค์–‘ํ•œ ์†Œ๋งค์  ์ง„์—ด๋Œ€ ์ด๋ฏธ์ง€๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์–ด ๊ฐ์ฒด ๊ฐ์ง€ ์ž‘์—…์— ๋Œ€ํ•œ ํ’๋ถ€ํ•œ ์ปจํ…์ŠคํŠธ๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค. ๋‹ค์Œ์€ ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋ฐ์ดํ„ฐ ์˜ˆ์‹œ ๋ช‡ ๊ฐ€์ง€์™€ ํ•ด๋‹น ์ฃผ์„์ž…๋‹ˆ๋‹ค:

๋ฐ์ดํ„ฐ ์„ธํŠธ ์ƒ˜ํ”Œ ์ด๋ฏธ์ง€

  • Densely packed retail shelf image: This image demonstrates an example of densely packed objects in a retail shelf setting. Objects are annotated with bounding boxes and SKU category labels.

This example showcases the variety and complexity of the data in the SKU-110k dataset and highlights the importance of high-quality data for object detection tasks.
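
To inspect samples like this locally, the ground-truth boxes can be drawn back onto an image from its converted YOLO-format label file. A minimal sketch, assuming the default dataset root and a hypothetical image name:

import cv2
from pathlib import Path

root = Path('../datasets/SKU-110K')  # dataset root defined in the YAML above
name = 'train_0.jpg'                 # hypothetical image name
img = cv2.imread(str(root / 'images' / name))
h, w = img.shape[:2]

# Each label line is "<class> <x_center> <y_center> <width> <height>", normalized
for line in (root / 'labels' / name).with_suffix('.txt').read_text().splitlines():
    _, xc, yc, bw, bh = map(float, line.split())
    x1, y1 = int((xc - bw / 2) * w), int((yc - bh / 2) * h)
    x2, y2 = int((xc + bw / 2) * w), int((yc + bh / 2) * h)
    cv2.rectangle(img, (x1, y1), (x2, y2), (0, 255, 0), 2)

cv2.imwrite('sku110k_sample.jpg', img)  # save the annotated sample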

์ธ์šฉ ๋ฐ ๊ฐ์‚ฌ

์—ฐ๊ตฌ ๋˜๋Š” ๊ฐœ๋ฐœ ์ž‘์—…์— SKU-110k ๋ฐ์ดํ„ฐ์…‹์„ ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ ๋‹ค์Œ ๋…ผ๋ฌธ์„ ์ธ์šฉํ•ด ์ฃผ์„ธ์š”:

@inproceedings{goldman2019dense,
 author    = {Eran Goldman and Roei Herzig and Aviv Eisenschtat and Jacob Goldberger and Tal Hassner},
 title     = {Precise Detection in Densely Packed Scenes},
 booktitle = {Proc. Conf. Comput. Vision Pattern Recognition (CVPR)},
 year      = {2019}
}

We would like to acknowledge Eran Goldman et al. for creating and maintaining the SKU-110k dataset as a valuable resource for the computer vision research community. For more information about the SKU-110k dataset and its creators, visit the SKU-110k dataset GitHub repository.


