μ½˜ν…μΈ λ‘œ κ±΄λ„ˆλ›°κΈ°

λ‹€μŒμ„ μ‚¬μš©ν•œ 뢄석 Ultralytics YOLOv8

Introduction

이 κ°€μ΄λ“œμ—μ„œλŠ” μ„  κ·Έλž˜ν”„, λ§‰λŒ€ν˜• 차트, μ›ν˜• 차트의 μ„Έ 가지 κΈ°λ³Έ 데이터 μ‹œκ°ν™” μœ ν˜•μ— λŒ€ν•œ 포괄적인 κ°œμš”λ₯Ό μ œκ³΅ν•©λ‹ˆλ‹€. 각 μ„Ήμ…˜μ—λŠ” Python 을 μ‚¬μš©ν•˜μ—¬ μ΄λŸ¬ν•œ μ‹œκ°ν™”λ₯Ό λ§Œλ“œλŠ” 방법에 λŒ€ν•œ 단계별 지침과 μ½”λ“œ μŠ€λ‹ˆνŽ«μ΄ ν¬ν•¨λ˜μ–΄ μžˆμŠ΅λ‹ˆλ‹€.

μ‹œκ°μ  μƒ˜ν”Œ

Sample images: line graph, bar plot, and pie chart.

Why Graphs are Important

  • Line graphs are ideal for tracking changes over short and long periods and for comparing changes across multiple groups over the same period.
  • Bar plots, on the other hand, are suitable for comparing quantities across categories and showing the relationship between a category and its numerical value.
  • Lastly, pie charts are effective for illustrating proportions among categories and showing parts of a whole.

Analytics Examples

Line Graph

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("Path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

out = cv2.VideoWriter("line_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(
    type="line",
    writer=out,
    im0_shape=(w, h),
    view_img=True,
)
total_counts = 0
frame_count = 0

while cap.isOpened():
    success, frame = cap.read()

    if success:
        frame_count += 1
        results = model.track(frame, persist=True, verbose=True)  # track objects across frames

        if results[0].boxes.id is not None:
            boxes = results[0].boxes.xyxy.cpu()
            for box in boxes:
                total_counts += 1  # count every tracked object in this frame

        analytics.update_line(frame_count, total_counts)  # update the line graph

        total_counts = 0  # reset the counter for the next frame
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    else:
        break

cap.release()
out.release()
cv2.destroyAllWindows()

Multiple Lines

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("Path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
out = cv2.VideoWriter("multiple_line_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(
    type="line",
    writer=out,
    im0_shape=(w, h),
    view_img=True,
    max_points=200,
)

frame_count = 0
data = {}
labels = []

while cap.isOpened():
    success, frame = cap.read()

    if success:
        frame_count += 1

        results = model.track(frame, persist=True)

        if results[0].boxes.id is not None:
            boxes = results[0].boxes.xyxy.cpu()
            track_ids = results[0].boxes.id.int().cpu().tolist()
            clss = results[0].boxes.cls.cpu().tolist()

            for box, track_id, cls in zip(boxes, track_ids, clss):
                # Store each class label
                if model.names[int(cls)] not in labels:
                    labels.append(model.names[int(cls)])

                # Store each class count
                if model.names[int(cls)] in data:
                    data[model.names[int(cls)]] += 1
                else:
                    data[model.names[int(cls)]] = 1  # first detection of this class in the frame

        # Update the plotted lines once per frame
        analytics.update_multiple_lines(data, labels, frame_count)
        data = {}  # clear the per-frame counts for the next frame
    else:
        break

cap.release()
out.release()
cv2.destroyAllWindows()

Pie Chart

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("Path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

out = cv2.VideoWriter("pie_chart.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(
    type="pie",
    writer=out,
    im0_shape=(w, h),
    view_img=True,
)

clswise_count = {}

while cap.isOpened():
    success, frame = cap.read()
    if success:
        results = model.track(frame, persist=True, verbose=True)
        if results[0].boxes.id is not None:
            boxes = results[0].boxes.xyxy.cpu()
            clss = results[0].boxes.cls.cpu().tolist()
            for box, cls in zip(boxes, clss):
                if model.names[int(cls)] in clswise_count:
                    clswise_count[model.names[int(cls)]] += 1
                else:
                    clswise_count[model.names[int(cls)]] = 1

            analytics.update_pie(clswise_count)
            clswise_count = {}

        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    else:
        break

cap.release()
out.release()
cv2.destroyAllWindows()

Bar Plot

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("Path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

out = cv2.VideoWriter("bar_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(
    type="bar",
    writer=out,
    im0_shape=(w, h),
    view_img=True,
)

clswise_count = {}

while cap.isOpened():
    success, frame = cap.read()
    if success:
        results = model.track(frame, persist=True, verbose=True)
        if results[0].boxes.id is not None:
            boxes = results[0].boxes.xyxy.cpu()
            clss = results[0].boxes.cls.cpu().tolist()
            for box, cls in zip(boxes, clss):
                if model.names[int(cls)] in clswise_count:
                    clswise_count[model.names[int(cls)]] += 1
                else:
                    clswise_count[model.names[int(cls)]] = 1

            analytics.update_bar(clswise_count)
            clswise_count = {}

        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    else:
        break

cap.release()
out.release()
cv2.destroyAllWindows()

Area Chart

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

out = cv2.VideoWriter("area_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(
    type="area",
    writer=out,
    im0_shape=(w, h),
    view_img=True,
)

clswise_count = {}
frame_count = 0

while cap.isOpened():
    success, frame = cap.read()
    if success:
        frame_count += 1
        results = model.track(frame, persist=True, verbose=True)

        if results[0].boxes.id is not None:
            boxes = results[0].boxes.xyxy.cpu()
            clss = results[0].boxes.cls.cpu().tolist()

            for box, cls in zip(boxes, clss):
                if model.names[int(cls)] in clswise_count:
                    clswise_count[model.names[int(cls)]] += 1
                else:
                    clswise_count[model.names[int(cls)]] = 1

        analytics.update_area(frame_count, clswise_count)
        clswise_count = {}
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    else:
        break

cap.release()
out.release()
cv2.destroyAllWindows()

Analytics Arguments

Here is a table with the Analytics arguments:

Name Type Default Description
type str None Type of data or object.
im0_shape tuple None Shape of the initial image.
writer cv2.VideoWriter None Object for writing video files.
title str ultralytics Title for the visualization.
x_label str x Label for the x-axis.
y_label str y Label for the y-axis.
bg_color str white Background color.
fg_color str black Foreground color.
line_color str yellow Color of the lines.
line_width int 2 Width of the lines.
fontsize int 13 Font size for the text.
view_img bool False Flag to display the image or video.
save_img bool True Flag to save the image or video.
max_points int 50 For multiple lines, the total points drawn on the frame before the oldest points are removed.
points_width int 15 Width of the line points highlighter.
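
The appearance of a plot can be tuned with these arguments. Below is a minimal sketch that assumes the same solutions.Analytics constructor used in the examples above; the video path and output file name are placeholders:

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")

cap = cv2.VideoCapture("Path/to/video/file.mp4")
assert cap.isOpened(), "Error reading video file"
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))

out = cv2.VideoWriter("custom_line_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

# Line plot with a custom title, axis labels, colors, and point budget
analytics = solutions.Analytics(
    type="line",
    writer=out,
    im0_shape=(w, h),
    title="Objects per Frame",
    x_label="Frame",
    y_label="Count",
    bg_color="white",
    fg_color="black",
    line_color="yellow",
    line_width=2,
    fontsize=13,
    view_img=False,
    save_img=True,
    max_points=100,
)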

model.track Arguments

Name Type Default Description
source im0 None Source directory for images or videos
persist bool False Persist the tracks between frames
tracker str botsort.yaml Tracking method: 'bytetrack' or 'botsort'
conf float 0.3 Confidence threshold
iou float 0.5 IoU threshold
classes list None Filter results by class, i.e. classes=0, or classes=[0,2,3]
verbose bool True Display the object tracking results
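
For reference, a tracking call combining these arguments might look like the following sketch; the parameter values are illustrative and the video path is a placeholder:

import cv2

from ultralytics import YOLO

model = YOLO("yolov8s.pt")
cap = cv2.VideoCapture("Path/to/video/file.mp4")
success, frame = cap.read()

if success:
    results = model.track(
        frame,
        persist=True,              # keep track IDs between frames
        tracker="bytetrack.yaml",  # or the default "botsort.yaml"
        conf=0.3,                  # confidence threshold
        iou=0.5,                   # IoU threshold
        classes=[0, 2, 3],         # keep only these class indices
        verbose=False,             # silence per-frame output
    )
    print(results[0].boxes.cls.tolist())  # class indices of the tracked objects

cap.release()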

Conclusion

Understanding when and how to use different types of visualizations is crucial for effective data analysis. Line graphs, bar plots, and pie charts are fundamental tools that help you convey your data's story more clearly and effectively.

FAQ

How do I create a line graph using Ultralytics YOLOv8 Analytics?

To create a line graph using Ultralytics YOLOv8 Analytics, follow these steps:

  1. Load a YOLOv8 model and open your video file.
  2. Initialize the Analytics class with the type set to "line".
  3. Iterate over the video frames, updating the line graph with relevant data, such as the object count per frame.
  4. Save the output video displaying the line graph.

Example:

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")
cap = cv2.VideoCapture("Path/to/video/file.mp4")
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
out = cv2.VideoWriter("line_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(type="line", writer=out, im0_shape=(w, h), view_img=True)
frame_count = 0

while cap.isOpened():
    success, frame = cap.read()
    if success:
        frame_count += 1
        results = model.track(frame, persist=True)
        total_counts = len(results[0].boxes)  # objects detected in this frame
        analytics.update_line(frame_count, total_counts)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
out.release()
cv2.destroyAllWindows()

For further details on configuring the Analytics class, visit the Analytics using Ultralytics YOLOv8 πŸ“Š section.

What are the benefits of using Ultralytics YOLOv8 for creating bar plots?

Using Ultralytics YOLOv8 for creating bar plots offers several benefits:

  1. Real-time data visualization: Seamlessly integrate object detection results into bar plots that update dynamically.
  2. Ease of use: A simple API and functions make it straightforward to implement and visualize data.
  3. Customization: Customize titles, labels, colors, and more to fit your specific requirements.
  4. Efficiency: Handle large amounts of data efficiently and update plots in real time during video processing.

Use the following example to generate a bar plot:

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")
cap = cv2.VideoCapture("Path/to/video/file.mp4")
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
out = cv2.VideoWriter("bar_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(type="bar", writer=out, im0_shape=(w, h), view_img=True)

while cap.isOpened():
    success, frame = cap.read()
    if success:
        results = model.track(frame, persist=True)
        # Count detections per class name for this frame
        clswise_count = {}
        for cls in results[0].boxes.cls.tolist():
            clswise_count[model.names[int(cls)]] = clswise_count.get(model.names[int(cls)], 0) + 1
        analytics.update_bar(clswise_count)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
out.release()
cv2.destroyAllWindows()

μžμ„Ένžˆ μ•Œμ•„λ³΄λ €λ©΄ κ°€μ΄λ“œμ˜ λ§‰λŒ€ν˜• 차트 μ„Ήμ…˜μ„ μ°Έμ‘°ν•˜μ„Έμš”.

데이터 μ‹œκ°ν™” ν”„λ‘œμ νŠΈμ—μ„œ 파이 차트λ₯Ό λ§Œλ“€ λ•Œ Ultralytics YOLOv8 을 μ‚¬μš©ν•΄μ•Ό ν•˜λŠ” μ΄μœ λŠ” λ¬΄μ—‡μΈκ°€μš”?

Ultralytics YOLOv8 λŠ” 파이 차트λ₯Ό λ§Œλ“œλŠ” 데 νƒμ›”ν•œ μ„ νƒμž…λ‹ˆλ‹€:

  1. 객체 감지와 톡합: 개체 감지 κ²°κ³Όλ₯Ό 파이 μ°¨νŠΈμ— 직접 ν†΅ν•©ν•˜μ—¬ 즉각적인 μΈμ‚¬μ΄νŠΈλ₯Ό 얻을 수 μžˆμŠ΅λ‹ˆλ‹€.
  2. μ‚¬μš©μž μΉœν™”μ μΈ API: μ΅œμ†Œν•œμ˜ μ½”λ“œλ‘œ κ°„νŽΈν•˜κ²Œ μ„€μ •ν•˜κ³  μ‚¬μš©ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
  3. μ‚¬μš©μž 지정 κ°€λŠ₯: 색상, 라벨 등에 λŒ€ν•œ λ‹€μ–‘ν•œ μ‚¬μš©μž 지정 μ˜΅μ…˜μ΄ μžˆμŠ΅λ‹ˆλ‹€.
  4. μ‹€μ‹œκ°„ μ—…λ°μ΄νŠΈ: μ‹€μ‹œκ°„μœΌλ‘œ 데이터λ₯Ό μ²˜λ¦¬ν•˜κ³  μ‹œκ°ν™”ν•  수 μžˆμ–΄ λ™μ˜μƒ 뢄석 ν”„λ‘œμ νŠΈμ— μ΄μƒμ μž…λ‹ˆλ‹€.

λ‹€μŒμ€ κ°„λ‹¨ν•œ μ˜ˆμž…λ‹ˆλ‹€:

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")
cap = cv2.VideoCapture("Path/to/video/file.mp4")
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
out = cv2.VideoWriter("pie_chart.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(type="pie", writer=out, im0_shape=(w, h), view_img=True)

while cap.isOpened():
    success, frame = cap.read()
    if success:
        results = model.track(frame, persist=True)
        # Count detections per class name for this frame
        clswise_count = {}
        for cls in results[0].boxes.cls.tolist():
            clswise_count[model.names[int(cls)]] = clswise_count.get(model.names[int(cls)], 0) + 1
        analytics.update_pie(clswise_count)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
out.release()
cv2.destroyAllWindows()

μžμ„Έν•œ λ‚΄μš©μ€ κ°€μ΄λ“œμ˜ 파이 차트 μ„Ήμ…˜μ„ μ°Έμ‘°ν•˜μ„Έμš”.

Can Ultralytics YOLOv8 track objects and dynamically update visualizations?

Yes, Ultralytics YOLOv8 can track objects and dynamically update visualizations. It supports tracking multiple objects in real time and can update various visualizations, such as line graphs, bar plots, and pie charts, based on the tracked objects' data.

Example of tracking and updating a line graph:

import cv2

from ultralytics import YOLO, solutions

model = YOLO("yolov8s.pt")
cap = cv2.VideoCapture("Path/to/video/file.mp4")
w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))
out = cv2.VideoWriter("line_plot.avi", cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

analytics = solutions.Analytics(type="line", writer=out, im0_shape=(w, h), view_img=True)
frame_count = 0

while cap.isOpened():
    success, frame = cap.read()
    if success:
        frame_count += 1
        results = model.track(frame, persist=True)
        total_counts = len(results[0].boxes)  # objects detected in this frame
        analytics.update_line(frame_count, total_counts)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
out.release()
cv2.destroyAllWindows()

To learn about the complete functionality, see the Tracking section.

What makes Ultralytics YOLOv8 different from other object detection solutions like OpenCV and TensorFlow?

Ultralytics YOLOv8 stands out from other object detection solutions like OpenCV and TensorFlow for several reasons:

  1. State-of-the-art accuracy: YOLOv8 provides superior accuracy in object detection, segmentation, and classification tasks.
  2. Ease of use: A user-friendly API allows for quick implementation and integration without extensive coding.
  3. Real-time performance: Optimized for high-speed inference, suitable for real-time applications.
  4. Diverse applications: Supports various tasks, including multi-object tracking, custom model training, and exporting to formats such as ONNX, TensorRT, and CoreML (see the export sketch after this list).
  5. Comprehensive documentation: Extensive documentation and blog resources to guide users through every step.
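
As a minimal sketch of the export capability mentioned in point 4, a model can be converted with the export method; the format value shown is one of the documented Ultralytics export formats:

from ultralytics import YOLO

model = YOLO("yolov8s.pt")

# Export the PyTorch weights to ONNX; other supported formats include TensorRT ("engine") and CoreML ("coreml")
model.export(format="onnx")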

μžμ„Έν•œ 비ꡐ 및 μ‚¬μš© μ‚¬λ‘€λŠ” Ultralytics λΈ”λ‘œκ·Έμ—μ„œ 확인할 수 μžˆμŠ΅λ‹ˆλ‹€.



Created 2024-05-23, Updated 2024-07-05
Authors: glenn-jocher (4), IvorZhu331 (1), RizwanMunawar (3)
