์ฝ˜ํ…์ธ ๋กœ ๊ฑด๋„ˆ๋›ฐ๊ธฐ

VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ

VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์ค‘๊ตญ ํ†ˆ์ง„๋Œ€ํ•™๊ต ๋จธ์‹ ๋Ÿฌ๋‹ ๋ฐ ๋ฐ์ดํ„ฐ ๋งˆ์ด๋‹ ์—ฐ๊ตฌ์†Œ์˜ AISKYEYE ํŒ€์ด ๋งŒ๋“  ๋Œ€๊ทœ๋ชจ ๋ฒค์น˜๋งˆํฌ์ž…๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” ๋“œ๋ก  ๊ธฐ๋ฐ˜ ์ด๋ฏธ์ง€ ๋ฐ ๋น„๋””์˜ค ๋ถ„์„๊ณผ ๊ด€๋ จ๋œ ๋‹ค์–‘ํ•œ ์ปดํ“จํ„ฐ ๋น„์ „ ์ž‘์—…์— ๋Œ€ํ•œ ์„ธ์‹ฌํ•œ ์ฃผ์„์ด ๋‹ฌ๋ฆฐ ์‹ค์ธก ๋ฐ์ดํ„ฐ๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.

VisDrone์€ ๋“œ๋ก ์— ์žฅ์ฐฉ๋œ ๋‹ค์–‘ํ•œ ์นด๋ฉ”๋ผ๋กœ ์ดฌ์˜ํ•œ 261,908 ํ”„๋ ˆ์ž„์˜ 288๊ฐœ ๋น„๋””์˜ค ํด๋ฆฝ๊ณผ 10,209๊ฐœ์˜ ์ •์  ์ด๋ฏธ์ง€๋กœ ๊ตฌ์„ฑ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์œ„์น˜(์ค‘๊ตญ ์ „์—ญ์˜ 14๊ฐœ ๋„์‹œ), ํ™˜๊ฒฝ(๋„์‹œ ๋ฐ ๋†์ดŒ), ๋ฌผ์ฒด(๋ณดํ–‰์ž, ์ฐจ๋Ÿ‰, ์ž์ „๊ฑฐ ๋“ฑ), ๋ฐ€๋„(๋“œ๋ฌธ๋“œ๋ฌธ ๋ถ๋น„๋Š” ์žฅ๋ฉด) ๋“ฑ ๋‹ค์–‘ํ•œ ์ธก๋ฉด์„ ํฌ๊ด„ํ•ฉ๋‹ˆ๋‹ค. ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ๋‹ค์–‘ํ•œ ์‹œ๋‚˜๋ฆฌ์˜ค์™€ ๋‚ ์”จ ๋ฐ ์กฐ๋ช… ์กฐ๊ฑด์—์„œ ๋‹ค์–‘ํ•œ ๋“œ๋ก  ํ”Œ๋žซํผ์„ ์‚ฌ์šฉํ•˜์—ฌ ์ˆ˜์ง‘๋˜์—ˆ์Šต๋‹ˆ๋‹ค. ์ด๋Ÿฌํ•œ ํ”„๋ ˆ์ž„์—๋Š” ๋ณดํ–‰์ž, ์ž๋™์ฐจ, ์ž์ „๊ฑฐ, ์„ธ๋ฐœ์ž์ „๊ฑฐ ๋“ฑ 260๋งŒ ๊ฐœ๊ฐ€ ๋„˜๋Š” ๋Œ€์ƒ์˜ ๊ฒฝ๊ณ„ ์ƒ์ž์— ์ˆ˜๋™์œผ๋กœ ์ฃผ์„์„ ๋‹ฌ์•˜์Šต๋‹ˆ๋‹ค. ์”ฌ ๊ฐ€์‹œ์„ฑ, ์˜ค๋ธŒ์ ํŠธ ํด๋ž˜์Šค, ์˜คํด๋ฃจ์ „๊ณผ ๊ฐ™์€ ์†์„ฑ๋„ ์ œ๊ณต๋˜์–ด ๋ฐ์ดํ„ฐ ํ™œ์šฉ๋„๋ฅผ ๋†’์ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์กฐ

VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” 5๊ฐœ์˜ ์ฃผ์š” ํ•˜์œ„ ์ง‘ํ•ฉ์œผ๋กœ ๊ตฌ์„ฑ๋˜์–ด ์žˆ์œผ๋ฉฐ, ๊ฐ ์ง‘ํ•ฉ์€ ํŠน์ • ์ž‘์—…์— ์ค‘์ ์„ ๋‘๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค:

  1. ์ž‘์—… 1: ์ด๋ฏธ์ง€์—์„œ ๋ฌผ์ฒด ๊ฐ์ง€
  2. ์ž‘์—… 2: ๋™์˜์ƒ์—์„œ ๊ฐ์ฒด ๊ฐ์ง€
  3. ์ž‘์—… 3: ๋‹จ์ผ ๊ฐœ์ฒด ์ถ”์ 
  4. ์ž‘์—… 4: ๋‹ค์ค‘ ๊ฐœ์ฒด ์ถ”์ 
  5. ์ž‘์—… 5: ๊ตฐ์ค‘ ๊ณ„์‚ฐ

์• ํ”Œ๋ฆฌ์ผ€์ด์…˜

VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ๋ฌผ์ฒด ๊ฐ์ง€, ๋ฌผ์ฒด ์ถ”์ , ๊ตฐ์ค‘ ๊ณ„์‚ฐ๊ณผ ๊ฐ™์€ ๋“œ๋ก  ๊ธฐ๋ฐ˜ ์ปดํ“จํ„ฐ ๋น„์ „ ์ž‘์—…์—์„œ ๋”ฅ ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๊ณ  ํ‰๊ฐ€ํ•˜๋Š” ๋ฐ ๋„๋ฆฌ ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋‹ค์–‘ํ•œ ์„ผ์„œ ๋ฐ์ดํ„ฐ, ๊ฐ์ฒด ์ฃผ์„ ๋ฐ ์†์„ฑ์€ ๋“œ๋ก  ๊ธฐ๋ฐ˜ ์ปดํ“จํ„ฐ ๋น„์ „ ๋ถ„์•ผ์˜ ์—ฐ๊ตฌ์ž์™€ ์‹ค๋ฌด์ž์—๊ฒŒ ์œ ์šฉํ•œ ๋ฆฌ์†Œ์Šค์ž…๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ YAML

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์„ฑ์„ ์ •์˜ํ•˜๋Š” ๋ฐ๋Š” YAML(๋˜ ๋‹ค๋ฅธ ๋งˆํฌ์—… ์–ธ์–ด) ํŒŒ์ผ์ด ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ๋กœ, ํด๋ž˜์Šค ๋ฐ ๊ธฐํƒ€ ๊ด€๋ จ ์ •๋ณด์— ๋Œ€ํ•œ ์ •๋ณด๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. Visdrone ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ์šฐ, ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ VisDrone.yaml ํŒŒ์ผ์€ ๋‹ค์Œ ์œ„์น˜์—์„œ ์œ ์ง€๋ฉ๋‹ˆ๋‹ค. https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/VisDrone.yaml.

ultralytics/cfg/datasets/VisDrone.yaml

# Ultralytics YOLO ๐Ÿš€, AGPL-3.0 license
# VisDrone2019-DET dataset https://github.com/VisDrone/VisDrone-Dataset by Tianjin University
# Documentation: https://docs.ultralytics.com/datasets/detect/visdrone/
# Example usage: yolo train data=VisDrone.yaml
# parent
# โ”œโ”€โ”€ ultralytics
# โ””โ”€โ”€ datasets
#     โ””โ”€โ”€ VisDrone  โ† downloads here (2.3 GB)

# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]
path: ../datasets/VisDrone # dataset root dir
train: VisDrone2019-DET-train/images # train images (relative to 'path')  6471 images
val: VisDrone2019-DET-val/images # val images (relative to 'path')  548 images
test: VisDrone2019-DET-test-dev/images # test images (optional)  1610 images

# Classes
names:
  0: pedestrian
  1: people
  2: bicycle
  3: car
  4: van
  5: truck
  6: tricycle
  7: awning-tricycle
  8: bus
  9: motor

# Download script/URL (optional) ---------------------------------------------------------------------------------------
download: |
  import os
  from pathlib import Path

  from ultralytics.utils.downloads import download

  def visdrone2yolo(dir):
      from PIL import Image
      from tqdm import tqdm

      def convert_box(size, box):
          # Convert VisDrone box to YOLO xywh box
          dw = 1. / size[0]
          dh = 1. / size[1]
          return (box[0] + box[2] / 2) * dw, (box[1] + box[3] / 2) * dh, box[2] * dw, box[3] * dh

      (dir / 'labels').mkdir(parents=True, exist_ok=True)  # make labels directory
      pbar = tqdm((dir / 'annotations').glob('*.txt'), desc=f'Converting {dir}')
      for f in pbar:
          img_size = Image.open((dir / 'images' / f.name).with_suffix('.jpg')).size
          lines = []
          with open(f, 'r') as file:  # read annotation.txt
              for row in [x.split(',') for x in file.read().strip().splitlines()]:
                  if row[4] == '0':  # VisDrone 'ignored regions' class 0
                      continue
                  cls = int(row[5]) - 1
                  box = convert_box(img_size, tuple(map(int, row[:4])))
                  lines.append(f"{cls} {' '.join(f'{x:.6f}' for x in box)}\n")
          with open(str(f).replace(f'{os.sep}annotations{os.sep}', f'{os.sep}labels{os.sep}'), 'w') as fl:
              fl.writelines(lines)  # write label.txt once per annotation file


  # Download
  dir = Path(yaml['path'])  # dataset root dir
  urls = ['https://github.com/ultralytics/yolov5/releases/download/v1.0/VisDrone2019-DET-train.zip',
          'https://github.com/ultralytics/yolov5/releases/download/v1.0/VisDrone2019-DET-val.zip',
          'https://github.com/ultralytics/yolov5/releases/download/v1.0/VisDrone2019-DET-test-dev.zip',
          'https://github.com/ultralytics/yolov5/releases/download/v1.0/VisDrone2019-DET-test-challenge.zip']
  download(urls, dir=dir, curl=True, threads=4)

  # Convert
  for d in 'VisDrone2019-DET-train', 'VisDrone2019-DET-val', 'VisDrone2019-DET-test-dev':
      visdrone2yolo(dir / d)  # convert VisDrone annotations to YOLO labels
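The convert_box helper in the script above maps VisDrone's pixel-space (left, top, width, height) boxes to YOLO's normalized (x_center, y_center, width, height) format. A standalone sketch of the same arithmetic, with a made-up image size and box for illustration:

```python
def convert_box(size, box):
    """Convert a VisDrone (left, top, width, height) pixel box to a normalized
    YOLO (x_center, y_center, width, height) box, given size = (img_w, img_h)."""
    dw = 1.0 / size[0]  # scale factor for x coordinates
    dh = 1.0 / size[1]  # scale factor for y coordinates
    return ((box[0] + box[2] / 2) * dw,  # x center, normalized
            (box[1] + box[3] / 2) * dh,  # y center, normalized
            box[2] * dw,                 # width, normalized
            box[3] * dh)                 # height, normalized

# Hypothetical 1920x1080 frame with a 100x50 box whose top-left corner is (860, 515)
x, y, w, h = convert_box((1920, 1080), (860, 515, 100, 50))
print(f"{x:.4f} {y:.4f} {w:.4f} {h:.4f}")  # 0.4740 0.5000 0.0521 0.0463
```

All four values fall in [0, 1], which is what YOLO label files expect.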

Usage

To train a YOLOv8n model on the VisDrone dataset for 100 epochs with an image size of 640, you can use the following code snippets. For a comprehensive list of available arguments, refer to the model Training page.

Train Example

from ultralytics import YOLO

# Load a model
model = YOLO('yolov8n.pt')  # load a pretrained model (recommended for training)

# Train the model
results = model.train(data='VisDrone.yaml', epochs=100, imgsz=640)

CLI Example

# Start training from a pretrained *.pt model
yolo detect train data=VisDrone.yaml model=yolov8n.pt epochs=100 imgsz=640

์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ ๋ฐ ์ฃผ์„

VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ๋“œ๋ก ์— ์žฅ์ฐฉ๋œ ์นด๋ฉ”๋ผ๋กœ ์บก์ฒ˜ํ•œ ๋‹ค์–‘ํ•œ ์ด๋ฏธ์ง€์™€ ๋™์˜์ƒ์ด ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ๋‹ค์Œ์€ ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋ช‡ ๊ฐ€์ง€ ๋ฐ์ดํ„ฐ ์˜ˆ์‹œ์™€ ํ•ด๋‹น ์ฃผ์„์ž…๋‹ˆ๋‹ค:

๋ฐ์ดํ„ฐ ์„ธํŠธ ์ƒ˜ํ”Œ ์ด๋ฏธ์ง€

  • ์ž‘์—… 1: ์ด๋ฏธ์ง€์—์„œ ๊ฐ์ฒด ๊ฐ์ง€ - ์ด ์ด๋ฏธ์ง€๋Š” ๊ฐ์ฒด์— ๊ฒฝ๊ณ„ ์ƒ์ž๊ฐ€ ์ฃผ์„์œผ๋กœ ํ‘œ์‹œ๋œ ์ด๋ฏธ์ง€์—์„œ ๊ฐ์ฒด๋ฅผ ๊ฐ์ง€ํ•˜๋Š” ์˜ˆ์‹œ๋ฅผ ๋ณด์—ฌ์ค๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ๋‹ค์–‘ํ•œ ์œ„์น˜, ํ™˜๊ฒฝ ๋ฐ ๋ฐ€๋„์—์„œ ์ดฌ์˜๋œ ๋‹ค์–‘ํ•œ ์ด๋ฏธ์ง€๋ฅผ ์ œ๊ณตํ•˜์—ฌ ์ด ์ž‘์—…์„ ์œ„ํ•œ ๋ชจ๋ธ ๊ฐœ๋ฐœ์„ ์šฉ์ดํ•˜๊ฒŒ ํ•ฉ๋‹ˆ๋‹ค.

์ด ์˜ˆ๋Š” VisDrone ๋ฐ์ดํ„ฐ ์„ธํŠธ์— ํฌํ•จ๋œ ๋ฐ์ดํ„ฐ์˜ ๋‹ค์–‘์„ฑ๊ณผ ๋ณต์žก์„ฑ์„ ๋ณด์—ฌ์ฃผ๋ฉฐ ๋“œ๋ก  ๊ธฐ๋ฐ˜ ์ปดํ“จํ„ฐ ๋น„์ „ ์ž‘์—…์—์„œ ๊ณ ํ’ˆ์งˆ ์„ผ์„œ ๋ฐ์ดํ„ฐ์˜ ์ค‘์š”์„ฑ์„ ๊ฐ•์กฐํ•ฉ๋‹ˆ๋‹ค.

์ธ์šฉ ๋ฐ ๊ฐ์‚ฌ

์—ฐ๊ตฌ ๋˜๋Š” ๊ฐœ๋ฐœ ์ž‘์—…์— VisDrone ๋ฐ์ดํ„ฐ์„ธํŠธ๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ ๋‹ค์Œ ๋…ผ๋ฌธ์„ ์ธ์šฉํ•ด ์ฃผ์„ธ์š”:

@ARTICLE{9573394,
  author={Zhu, Pengfei and Wen, Longyin and Du, Dawei and Bian, Xiao and Fan, Heng and Hu, Qinghua and Ling, Haibin},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title={Detection and Tracking Meet Drones Challenge},
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TPAMI.2021.3119563}}

We would like to acknowledge the AISKYEYE team at the Lab of Machine Learning and Data Mining, Tianjin University, China, for creating and maintaining the VisDrone dataset as a valuable resource for the drone-based computer vision research community. For more information about the VisDrone dataset and its creators, visit the VisDrone Dataset GitHub repository.



2023-11-12 ์ƒ์„ฑ, 2023-11-22 ์—…๋ฐ์ดํŠธ๋จ
์ž‘์„ฑ์ž: glenn-jocher (3), Laughing-q (1)
