์ฝ˜ํ…์ธ ๋กœ ๊ฑด๋„ˆ๋›ฐ๊ธฐ

xView ๋ฐ์ดํ„ฐ ์„ธํŠธ

xView ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์ „ ์„ธ๊ณ„์˜ ๋ณต์žกํ•œ ์žฅ๋ฉด์— ๋ฐ”์šด๋”ฉ ๋ฐ•์Šค๋ฅผ ์‚ฌ์šฉํ•ด ์ฃผ์„์ด ๋‹ฌ๋ฆฐ ์ด๋ฏธ์ง€๊ฐ€ ํฌํ•จ๋œ ๊ณต๊ฐœ์ ์œผ๋กœ ์‚ฌ์šฉ ๊ฐ€๋Šฅํ•œ ๊ฐ€์žฅ ํฐ ์˜ค๋ฒ„ํ—ค๋“œ ์ด๋ฏธ์ง€ ๋ฐ์ดํ„ฐ ์„ธํŠธ ์ค‘ ํ•˜๋‚˜์ž…๋‹ˆ๋‹ค. xView ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋ชฉํ‘œ๋Š” ๋„ค ๊ฐ€์ง€ ์ปดํ“จํ„ฐ ๋น„์ „ ๋ถ„์•ผ์˜ ๋ฐœ์ „์„ ๊ฐ€์†ํ™”ํ•˜๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค:

  1. ํƒ์ง€๋ฅผ ์œ„ํ•œ ์ตœ์†Œ ํ•ด์ƒ๋„๋ฅผ ๋‚ฎ์ถฅ๋‹ˆ๋‹ค.
  2. ํ•™์Šต ํšจ์œจ์„ฑ์„ ๊ฐœ์„ ํ•˜์„ธ์š”.
  3. ๋” ๋งŽ์€ ๊ฐœ์ฒด ํด๋ž˜์Šค๋ฅผ ๊ฒ€์ƒ‰ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
  4. ์„ธ๋ถ„ํ™”๋œ ํด๋ž˜์Šค์— ๋Œ€ํ•œ ํƒ์ง€ ๊ธฐ๋Šฅ์„ ๊ฐœ์„ ํ•ฉ๋‹ˆ๋‹ค.

xView builds on the success of challenges like Common Objects in Context (COCO) and aims to leverage computer vision to analyze the growing amount of imagery available from space, in order to understand the visual world in new ways and address a range of important applications.

Key Features

  • xView์—๋Š” 60๊ฐœ ํด๋ž˜์Šค์— ๊ฑธ์ณ 100๋งŒ ๊ฐœ ์ด์ƒ์˜ ์˜ค๋ธŒ์ ํŠธ ์ธ์Šคํ„ด์Šค๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.
  • ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ํ•ด์ƒ๋„๋Š” 0.3๋ฏธํ„ฐ๋กœ, ๋Œ€๋ถ€๋ถ„์˜ ๊ณต๊ฐœ ์œ„์„ฑ ์ด๋ฏธ์ง€ ๋ฐ์ดํ„ฐ ์„ธํŠธ๋ณด๋‹ค ๋” ๋†’์€ ํ•ด์ƒ๋„์˜ ์ด๋ฏธ์ง€๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.
  • xView๋Š” ๋ฐ”์šด๋”ฉ ๋ฐ•์Šค ์ฃผ์„์ด ์žˆ๋Š” ์ž‘๊ณ  ํฌ๊ท€ํ•˜๋ฉฐ ์„ธ๋ถ„ํ™”๋œ ๋‹ค์–‘ํ•œ ์œ ํ˜•์˜ ์˜ค๋ธŒ์ ํŠธ ์ปฌ๋ ‰์…˜์„ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.
  • TensorFlow ๊ฐ์ฒด ๊ฐ์ง€ API๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์‚ฌ์ „ ํ•™์Šต๋œ ๊ธฐ์ค€ ๋ชจ๋ธ๊ณผ PyTorch ์— ๋Œ€ํ•œ ์˜ˆ์ œ๊ฐ€ ํ•จ๊ป˜ ์ œ๊ณต๋ฉ๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์กฐ

xView ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์›”๋“œ๋ทฐ-3 ์œ„์„ฑ์—์„œ 0.3m์˜ ์ง€์ƒ ์ƒ˜ํ”Œ ๊ฑฐ๋ฆฌ์—์„œ ์ˆ˜์ง‘ํ•œ ์œ„์„ฑ ์ด๋ฏธ์ง€๋กœ ๊ตฌ์„ฑ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” 1,400kmยฒ ์ด์ƒ์˜ ์ด๋ฏธ์ง€์— 60๊ฐœ ํด๋ž˜์Šค์— ๊ฑธ์ณ 100๋งŒ ๊ฐœ ์ด์ƒ์˜ ๋ฌผ์ฒด๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.

์• ํ”Œ๋ฆฌ์ผ€์ด์…˜

xView ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์˜ค๋ฒ„ํ—ค๋“œ ์ด๋ฏธ์ง€์—์„œ ๋ฌผ์ฒด ๊ฐ์ง€๋ฅผ ์œ„ํ•œ ๋”ฅ ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๊ณ  ํ‰๊ฐ€ํ•˜๋Š” ๋ฐ ๋„๋ฆฌ ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋‹ค์–‘ํ•œ ๊ฐ์ฒด ํด๋ž˜์Šค์™€ ๊ณ ํ•ด์ƒ๋„ ์ด๋ฏธ์ง€๋Š” ์ปดํ“จํ„ฐ ๋น„์ „ ๋ถ„์•ผ, ํŠนํžˆ ์œ„์„ฑ ์ด๋ฏธ์ง€ ๋ถ„์„์„ ์œ„ํ•œ ์—ฐ๊ตฌ์ž์™€ ์‹ค๋ฌด์ž์—๊ฒŒ ์œ ์šฉํ•œ ๋ฆฌ์†Œ์Šค์ž…๋‹ˆ๋‹ค.

๋ฐ์ดํ„ฐ ์„ธํŠธ YAML

๋ฐ์ดํ„ฐ ์„ธํŠธ ๊ตฌ์„ฑ์„ ์ •์˜ํ•˜๋Š” ๋ฐ๋Š” YAML(๋˜ ๋‹ค๋ฅธ ๋งˆํฌ์—… ์–ธ์–ด) ํŒŒ์ผ์ด ์‚ฌ์šฉ๋ฉ๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์—๋Š” ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ๋กœ, ํด๋ž˜์Šค ๋ฐ ๊ธฐํƒ€ ๊ด€๋ จ ์ •๋ณด์— ๋Œ€ํ•œ ์ •๋ณด๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. xView ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๊ฒฝ์šฐ, ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ xView.yaml ํŒŒ์ผ์€ ๋‹ค์Œ ์œ„์น˜์—์„œ ์œ ์ง€๋ฉ๋‹ˆ๋‹ค. https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/xView.yaml.

ultralytics/cfg/datasets/xView.yaml

# Ultralytics YOLO ๐Ÿš€, AGPL-3.0 license
# DIUx xView 2018 Challenge https://challenge.xviewdataset.org by U.S. National Geospatial-Intelligence Agency (NGA)
# --------  DOWNLOAD DATA MANUALLY and jar xf val_images.zip to 'datasets/xView' before running train command!  --------
# Documentation: https://docs.ultralytics.com/datasets/detect/xview/
# Example usage: yolo train data=xView.yaml
# parent
# โ”œโ”€โ”€ ultralytics
# โ””โ”€โ”€ datasets
#     โ””โ”€โ”€ xView  โ† downloads here (20.7 GB)

# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]
path: ../datasets/xView # dataset root dir
train: images/autosplit_train.txt # train images (relative to 'path') 90% of 847 train images
val: images/autosplit_val.txt # val images (relative to 'path') 10% of 847 train images

# Classes
names:
  0: Fixed-wing Aircraft
  1: Small Aircraft
  2: Cargo Plane
  3: Helicopter
  4: Passenger Vehicle
  5: Small Car
  6: Bus
  7: Pickup Truck
  8: Utility Truck
  9: Truck
  10: Cargo Truck
  11: Truck w/Box
  12: Truck Tractor
  13: Trailer
  14: Truck w/Flatbed
  15: Truck w/Liquid
  16: Crane Truck
  17: Railway Vehicle
  18: Passenger Car
  19: Cargo Car
  20: Flat Car
  21: Tank car
  22: Locomotive
  23: Maritime Vessel
  24: Motorboat
  25: Sailboat
  26: Tugboat
  27: Barge
  28: Fishing Vessel
  29: Ferry
  30: Yacht
  31: Container Ship
  32: Oil Tanker
  33: Engineering Vehicle
  34: Tower crane
  35: Container Crane
  36: Reach Stacker
  37: Straddle Carrier
  38: Mobile Crane
  39: Dump Truck
  40: Haul Truck
  41: Scraper/Tractor
  42: Front loader/Bulldozer
  43: Excavator
  44: Cement Mixer
  45: Ground Grader
  46: Hut/Tent
  47: Shed
  48: Building
  49: Aircraft Hangar
  50: Damaged Building
  51: Facility
  52: Construction Site
  53: Vehicle Lot
  54: Helipad
  55: Storage Tank
  56: Shipping container lot
  57: Shipping Container
  58: Pylon
  59: Tower

# Download script/URL (optional) ---------------------------------------------------------------------------------------
download: |
  import json
  import os
  from pathlib import Path

  import numpy as np
  from PIL import Image
  from tqdm import tqdm

  from ultralytics.data.utils import autosplit
  from ultralytics.utils.ops import xyxy2xywhn


  def convert_labels(fname=Path('xView/xView_train.geojson')):
      # Convert xView geoJSON labels to YOLO format
      path = fname.parent
      with open(fname) as f:
          print(f'Loading {fname}...')
          data = json.load(f)

      # Make dirs
      labels = Path(path / 'labels' / 'train')
      os.system(f'rm -rf {labels}')
      labels.mkdir(parents=True, exist_ok=True)

      # xView classes 11-94 to 0-59
      xview_class2index = [-1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 0, 1, 2, -1, 3, -1, 4, 5, 6, 7, 8, -1, 9, 10, 11,
                           12, 13, 14, 15, -1, -1, 16, 17, 18, 19, 20, 21, 22, -1, 23, 24, 25, -1, 26, 27, -1, 28, -1,
                           29, 30, 31, 32, 33, 34, 35, 36, 37, -1, 38, 39, 40, 41, 42, 43, 44, 45, -1, -1, -1, -1, 46,
                           47, 48, 49, -1, 50, 51, -1, 52, -1, -1, -1, 53, 54, -1, 55, -1, -1, 56, -1, 57, -1, 58, 59]

      shapes = {}
      for feature in tqdm(data['features'], desc=f'Converting {fname}'):
          p = feature['properties']
          if p['bounds_imcoords']:
              id = p['image_id']
              file = path / 'train_images' / id
              if file.exists():  # 1395.tif missing
                  try:
                      box = np.array([int(num) for num in p['bounds_imcoords'].split(",")])
                      assert box.shape[0] == 4, f'incorrect box shape {box.shape[0]}'
                      cls = p['type_id']
                      cls = xview_class2index[int(cls)]  # xView class to 0-59
                      assert 59 >= cls >= 0, f'incorrect class index {cls}'

                      # Write YOLO label
                      if id not in shapes:
                          shapes[id] = Image.open(file).size
                      box = xyxy2xywhn(box[None].astype(float), w=shapes[id][0], h=shapes[id][1], clip=True)
                      with open((labels / id).with_suffix('.txt'), 'a') as f:
                          f.write(f"{cls} {' '.join(f'{x:.6f}' for x in box[0])}\n")  # write label.txt
                  except Exception as e:
                      print(f'WARNING: skipping one label for {file}: {e}')


  # Download manually from https://challenge.xviewdataset.org
  dir = Path(yaml['path'])  # dataset root dir
  # urls = ['https://d307kc0mrhucc3.cloudfront.net/train_labels.zip',  # train labels
  #         'https://d307kc0mrhucc3.cloudfront.net/train_images.zip',  # 15G, 847 train images
  #         'https://d307kc0mrhucc3.cloudfront.net/val_images.zip']  # 5G, 282 val images (no labels)
  # download(urls, dir=dir)

  # Convert labels
  convert_labels(dir / 'xView_train.geojson')

  # Move images
  images = Path(dir / 'images')
  images.mkdir(parents=True, exist_ok=True)
  Path(dir / 'train_images').rename(dir / 'images' / 'train')
  Path(dir / 'val_images').rename(dir / 'images' / 'val')

  # Split
  autosplit(dir / 'images' / 'train')
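If you want to inspect this configuration programmatically, a minimal sketch is shown below. It assumes a local checkout of the ultralytics repository (so the path to xView.yaml may differ on your machine) and the PyYAML package.

import yaml

# Load the dataset configuration; the local path is an assumption and may differ
with open('ultralytics/cfg/datasets/xView.yaml') as f:
    cfg = yaml.safe_load(f)

print(cfg['path'])        # dataset root dir, e.g. ../datasets/xView
print(len(cfg['names']))  # 60 classes
print(cfg['names'][0])    # Fixed-wing Aircraft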

Usage

์ด๋ฏธ์ง€ ํฌ๊ธฐ๊ฐ€ 640์ธ xView ๋ฐ์ดํ„ฐ ์„ธํŠธ์—์„œ 100๊ฐœ์˜ ์—ํฌํฌ์— ๋Œ€ํ•œ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•˜๋ ค๋ฉด ๋‹ค์Œ ์ฝ”๋“œ ์กฐ๊ฐ์„ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์‚ฌ์šฉ ๊ฐ€๋Šฅํ•œ ์ธ์ˆ˜์˜ ์ „์ฒด ๋ชฉ๋ก์€ ๋ชจ๋ธ ํ•™์Šต ํŽ˜์ด์ง€๋ฅผ ์ฐธ์กฐํ•˜์„ธ์š”.

Train Example

Python

from ultralytics import YOLO

# Load a model
model = YOLO('yolov8n.pt')  # load a pretrained model (recommended for training)

# Train the model
results = model.train(data='xView.yaml', epochs=100, imgsz=640)

CLI

# Start training from a pretrained *.pt model
yolo detect train data=xView.yaml model=yolov8n.pt epochs=100 imgsz=640
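After training completes, you can evaluate the resulting weights on the validation split. The snippet below is a sketch assuming the default Ultralytics run directory (runs/detect/train/weights/best.pt); adjust the weights path to match your actual run.

from ultralytics import YOLO

# Validate the trained weights on the xView val split
model = YOLO('runs/detect/train/weights/best.pt')  # assumed path to your trained weights
metrics = model.val(data='xView.yaml', imgsz=640)
print(metrics.box.map)  # mAP50-95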

์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ ๋ฐ ์ฃผ์„

xView ๋ฐ์ดํ„ฐ ์„ธํŠธ์—๋Š” ๋ฐ”์šด๋”ฉ ๋ฐ•์Šค๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์ฃผ์„์ด ๋‹ฌ๋ฆฐ ๋‹ค์–‘ํ•œ ๊ฐœ์ฒด ์ง‘ํ•ฉ์ด ํฌํ•จ๋œ ๊ณ ํ•ด์ƒ๋„ ์œ„์„ฑ ์ด๋ฏธ์ง€๊ฐ€ ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ๋‹ค์Œ์€ ๋ฐ์ดํ„ฐ ์„ธํŠธ์˜ ๋ฐ์ดํ„ฐ์™€ ํ•ด๋‹น ์ฃผ์„์˜ ๋ช‡ ๊ฐ€์ง€ ์˜ˆ์ž…๋‹ˆ๋‹ค:

๋ฐ์ดํ„ฐ ์„ธํŠธ ์ƒ˜ํ”Œ ์ด๋ฏธ์ง€

  • ์˜ค๋ฒ„ํ—ค๋“œ ์ด๋ฏธ์ง€: ์ด ์ด๋ฏธ์ง€๋Š” ์˜ค๋ฒ„ํ—ค๋“œ ์ด๋ฏธ์ง€์—์„œ ๊ฐ์ฒด์— ๊ฒฝ๊ณ„ ์ƒ์ž๊ฐ€ ์ฃผ์„์œผ๋กœ ํ‘œ์‹œ๋œ ๊ฐ์ฒด ๊ฐ์ง€์˜ ์˜ˆ๋ฅผ ๋ณด์—ฌ์ค๋‹ˆ๋‹ค. ์ด ๋ฐ์ดํ„ฐ ์„ธํŠธ๋Š” ์ด ์ž‘์—…์„ ์œ„ํ•œ ๋ชจ๋ธ ๊ฐœ๋ฐœ์„ ์šฉ์ดํ•˜๊ฒŒ ํ•˜๊ธฐ ์œ„ํ•ด ๊ณ ํ•ด์ƒ๋„ ์œ„์„ฑ ์ด๋ฏธ์ง€๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.

์ด ์˜ˆ๋Š” xView ๋ฐ์ดํ„ฐ ์„ธํŠธ์— ํฌํ•จ๋œ ๋ฐ์ดํ„ฐ์˜ ๋‹ค์–‘์„ฑ๊ณผ ๋ณต์žก์„ฑ์„ ๋ณด์—ฌ์ฃผ๋ฉฐ ๋ฌผ์ฒด ๊ฐ์ง€ ์ž‘์—…์—์„œ ๊ณ ํ’ˆ์งˆ ์œ„์„ฑ ์ด๋ฏธ์ง€์˜ ์ค‘์š”์„ฑ์„ ๊ฐ•์กฐํ•ฉ๋‹ˆ๋‹ค.

์ธ์šฉ ๋ฐ ๊ฐ์‚ฌ

์—ฐ๊ตฌ ๋˜๋Š” ๊ฐœ๋ฐœ ์ž‘์—…์— xView ๋ฐ์ดํ„ฐ์„ธํŠธ๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ ๋‹ค์Œ ๋…ผ๋ฌธ์„ ์ธ์šฉํ•ด ์ฃผ์„ธ์š”:

@misc{lam2018xview,
      title={xView: Objects in Context in Overhead Imagery},
      author={Darius Lam and Richard Kuzma and Kevin McGee and Samuel Dooley and Michael Laielli and Matthew Klaric and Yaroslav Bulatov and Brendan McCord},
      year={2018},
      eprint={1802.07856},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

We would like to acknowledge the Defense Innovation Unit (DIU) and the creators of the xView dataset for their valuable contribution to the computer vision research community. For more information about the xView dataset and its creators, visit the xView dataset website.



2023-11-12 ์ƒ์„ฑ, 2023-11-22 ์—…๋ฐ์ดํŠธ๋จ
์ž‘์„ฑ์ž: glenn-jocher (3), Laughing-q (1)

๋Œ“๊ธ€