

[v0.31] Image Processing_Finding Correct Matching Points

징징알파카 2022. 1. 17. 23:56

Written 220117

<This post was written while studying from and referring to the 귀퉁이 서재 (bkshin) blog>

https://bkshin.tistory.com/entry/OpenCV-29-%EC%98%AC%EB%B0%94%EB%A5%B8-%EB%A7%A4%EC%B9%AD%EC%A0%90-%EC%B0%BE%EA%B8%B0?category=1148027 

 


1. Finding Correct Matching Points

The match() function

: Compares every descriptor one by one and finds matching points

: Correct matches can be picked out by keeping only those whose distance lies within the lowest few percent of the range between the smallest and largest distance values

# Finding correct matches from the match() function
import cv2, numpy as np

img1 = cv2.imread('img/taekwonv1.jpg')
img2 = cv2.imread('img/figures.jpg')
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# Extract descriptors with ORB ---①
detector = cv2.ORB_create()
kp1, desc1 = detector.detectAndCompute(gray1, None)
kp2, desc2 = detector.detectAndCompute(gray2, None)
# Match with BF-Hamming ---②
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(desc1, desc2)

# Sort matches by distance in ascending order ---③
matches = sorted(matches, key=lambda x:x.distance)
# Get the minimum and maximum distance values ---④
min_dist, max_dist = matches[0].distance, matches[-1].distance
# Set the threshold at 20% of the way from the min to the max distance ---⑤
ratio = 0.2
good_thresh = (max_dist - min_dist) * ratio + min_dist
# Classify only matches below the threshold as good matches ---⑥
good_matches = [m for m in matches if m.distance < good_thresh]
print('matches:%d/%d, min:%.2f, max:%.2f, thresh:%.2f' \
        %(len(good_matches),len(matches), min_dist, max_dist, good_thresh))
# Draw only the good matches ---⑦
res = cv2.drawMatches(img1, kp1, img2, kp2, good_matches, None, \
                flags=cv2.DRAW_MATCHES_FLAGS_NOT_DRAW_SINGLE_POINTS)
# Show the result
cv2.imshow('Good Match', res)
cv2.waitKey()
cv2.destroyAllWindows()

Correct matches found with the match() function

The knnMatch() function

: Returns the k nearest-neighbor matches for each descriptor, ordered from closest to farthest

: Among the k nearest neighbors, the closest one is likely to be a good match, while the more distant ones are more likely to be bad matches

: Good matches can be found by selecting mainly the neighbors with the smallest distances

# Finding correct matches from the knnMatch() function
import cv2, numpy as np

img1 = cv2.imread('img/taekwonv1.jpg')
img2 = cv2.imread('img/figures.jpg')
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# Extract descriptors with ORB ---①
detector = cv2.ORB_create()
kp1, desc1 = detector.detectAndCompute(gray1, None)
kp2, desc2 = detector.detectAndCompute(gray2, None)
# Create a BF-Hamming matcher ---②
matcher = cv2.BFMatcher(cv2.NORM_HAMMING2)
# knnMatch with k=2 ---③
matches = matcher.knnMatch(desc1, desc2, 2)

# Keep only matches whose first-neighbor distance is within 75% of the second-neighbor distance ---④
ratio = 0.75
good_matches = [first for first,second in matches \
                    if first.distance < second.distance * ratio]
print('matches:%d/%d' %(len(good_matches),len(matches)))

# Draw only the good matches
res = cv2.drawMatches(img1, kp1, img2, kp2, good_matches, None, \
                    flags=cv2.DRAW_MATCHES_FLAGS_NOT_DRAW_SINGLE_POINTS)
# Show the result
cv2.imshow('Matching', res)
cv2.waitKey()
cv2.destroyAllWindows()

Correct matches found with the knnMatch() function

- A match is kept only when its nearest-neighbor distance is less than 75% of the second nearest neighbor's distance

2. Perspective Transform of the Matched Region

: Computing a perspective transform matrix from the correctly matched coordinates makes it possible to mark where the matched object is

: The object being compared may be slightly rotated or slightly different in size between the two images

: With the perspective transform matrix, the position of the object being searched for can still be located well

: Matches that do not fit the perspective transform matrix can be identified, so bad matches can be removed one more time

 

mtrx, mask = cv2.findHomography(srcPoints, dstPoints, method, ransacReprojThreshold, mask, maxIters, confidence)

: A function that computes a perspective transform matrix from multiple matching points

 

- srcPoints : array of source coordinates
- dstPoints : array of destination coordinates
- method=0 (optional) : algorithm used to estimate the matrix (a small sketch comparing these flags follows this list)

  • 0 : least-squares over all points
  • cv2.RANSAC : does not use all coordinates; repeatedly picks random subsets, scores how well they fit, and returns a mask that separates inliers from outliers
  • cv2.LMEDS : uses the least median of squared errors; works properly only when inliers make up more than 50% of the points
  • cv2.RHO : faster when there are many outliers

- ransacReprojThreshold=3 (optional) : inlier distance threshold (for RANSAC and RHO)
- maxIters=2000 (optional) : maximum number of iterations for the estimation
- confidence=0.995 (optional) : confidence level (a value between 0 and 1)
- mtrx : resulting transform matrix
- mask : inlier classification result, an N x 1 array (0: outlier, 1: inlier)
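
To make the difference between the method flags concrete, here is a minimal sketch (not from the original post; it uses made-up synthetic point pairs) that feeds the same correspondences to cv2.findHomography() with each flag and compares the returned masks:

# Sketch: same synthetic correspondences, different estimation methods
import cv2, numpy as np

# A known shift of (+10, +20), plus one deliberately wrong pair at the end (outlier)
src = np.float32([[0,0],[100,0],[100,100],[0,100],[50,50],[25,75],[75,25],[10,90]])
dst = src + np.float32([10, 20])
dst[-1] = [300, 5]                                    # bad match

H0, mask0 = cv2.findHomography(src, dst, 0)           # least squares over all points
Hr, maskr = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC, 5 px reprojection threshold
Hl, maskl = cv2.findHomography(src, dst, cv2.LMEDS)   # least median of squares

print('RANSAC inliers: %d/%d' % (maskr.sum(), maskr.size))
print('LMEDS  inliers: %d/%d' % (maskl.sum(), maskl.size))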

 

dst = cv2.perspectiveTransform(src, m, dst)

: Transforms the original coordinates with a perspective transform matrix

 

- src : input coordinate array
- m : transform matrix
- dst (optional) : output coordinate array

 

: cv2.getPerspectiveTransform() returns an exact perspective transform matrix from 4 corner points,

: whereas cv2.findHomography() returns a perspective transform matrix approximated from many points

: cv2.perspectiveTransform() returns the new, perspective-transformed coordinate array
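
As a quick illustration of that difference, here is a minimal sketch (made-up corner coordinates, not from the original post): an exact matrix is built from exactly 4 point pairs with cv2.getPerspectiveTransform(), and cv2.perspectiveTransform() then maps points with it. Note that cv2.perspectiveTransform() expects a float32 array of shape (N, 1, 2).

# Sketch: exact matrix from 4 corners, then mapping arbitrary points with it
import cv2, numpy as np

# 4 source corners and the 4 corners they should map to (assumed values)
src_quad = np.float32([[0, 0], [200, 0], [200, 100], [0, 100]])
dst_quad = np.float32([[20, 10], [220, 30], [210, 140], [10, 120]])

M = cv2.getPerspectiveTransform(src_quad, dst_quad)   # exact 3x3 matrix from exactly 4 pairs

# perspectiveTransform() expects shape (N, 1, 2) float32
pts = np.float32([[0, 0], [100, 50], [200, 100]]).reshape(-1, 1, 2)
mapped = cv2.perspectiveTransform(pts, M)
print(mapped.reshape(-1, 2))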

# Finding the object region with a perspective transform of the matching points
import cv2, numpy as np

img1 = cv2.imread('img/taekwonv1.jpg')
img2 = cv2.imread('img/figures.jpg')
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# knnMatch with ORB and BF-Hamming ---①
detector = cv2.ORB_create()
kp1, desc1 = detector.detectAndCompute(gray1, None)
kp2, desc2 = detector.detectAndCompute(gray2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING2)
matches = matcher.knnMatch(desc1, desc2, 2)

# Extract good matches with the 75% neighbor-distance ratio ---②
ratio = 0.75
good_matches = [first for first,second in matches \
                    if first.distance < second.distance * ratio]
print('good matches:%d/%d' %(len(good_matches),len(matches)))

# Get the source image coordinates from the good matches' queryIdx ---③
src_pts = np.float32([ kp1[m.queryIdx].pt for m in good_matches ])
# Get the target image coordinates from the good matches' trainIdx ---④
dst_pts = np.float32([ kp2[m.trainIdx].pt for m in good_matches ])
# Compute the perspective transform matrix ---⑤
mtrx, mask = cv2.findHomography(src_pts, dst_pts)
# Build region coordinates the size of the source image ---⑥
h, w = img1.shape[:2]
pts = np.float32([ [[0,0]],[[0,h-1]],[[w-1,h-1]],[[w-1,0]] ])
# Perspective-transform the source image corner coordinates ---⑦
dst = cv2.perspectiveTransform(pts,mtrx)
# Draw the transformed region on the target image ---⑧
img2 = cv2.polylines(img2,[np.int32(dst)],True,255,3, cv2.LINE_AA)

# Draw the good matches and show the result ---⑨
res = cv2.drawMatches(img1, kp1, img2, kp2, good_matches, None, \
                    flags=cv2.DRAW_MATCHES_FLAGS_NOT_DRAW_SINGLE_POINTS)
cv2.imshow('Matching Homography', res)
cv2.waitKey()
cv2.destroyAllWindows()

Finding the object region with a perspective transform of the matching points

: The correct matches are used to compute a perspective transform matrix, and a rectangle the size of the source image is perspective-transformed and drawn on the result image

+) good_matches is built from the result returned by the knnMatch() function

: match(), knnMatch(), and radiusMatch() all return their results as lists of DMatch objects

 

DMatch

: An object that represents a single match result (see the small sketch after this list)

- queryIdx : index into queryDescriptors
- trainIdx : index into trainDescriptors
- imgIdx : image index of the train descriptor
- distance : similarity distance
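
For example, the fields of a single DMatch can be inspected like this (a small sketch that assumes matches, kp1 and kp2 come from the match() example above):

# Sketch: look at the best (smallest-distance) DMatch from the match() example
best = sorted(matches, key=lambda m: m.distance)[0]
print(best.queryIdx, best.trainIdx, best.imgIdx, best.distance)
print('query keypoint :', kp1[best.queryIdx].pt)   # coordinate in img1
print('train keypoint :', kp2[best.trainIdx].pt)   # coordinate in img2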

 

 

 

 

 

+) Removing additional wrong matches with RANSAC homography estimation

# Removing bad matches with RANSAC homography estimation
import cv2, numpy as np

img1 = cv2.imread('img/taekwonv1.jpg')
img2 = cv2.imread('img/figures2.jpg')
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# Match with ORB and BF-Hamming ---①
detector = cv2.ORB_create()
kp1, desc1 = detector.detectAndCompute(gray1, None)
kp2, desc2 = detector.detectAndCompute(gray2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(desc1, desc2)

# Sort matches by distance in ascending order ---②
matches = sorted(matches, key=lambda x:x.distance)
# Draw all matches ---③
res1 = cv2.drawMatches(img1, kp1, img2, kp2, matches, None, \
                    flags=cv2.DRAW_MATCHES_FLAGS_NOT_DRAW_SINGLE_POINTS)

# Collect the matched coordinates for the perspective transform ---④
src_pts = np.float32([ kp1[m.queryIdx].pt for m in matches ])
dst_pts = np.float32([ kp2[m.trainIdx].pt for m in matches ])
# Estimate the transform matrix with RANSAC ---⑤
mtrx, mask = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 5.0)
h, w = img1.shape[:2]
pts = np.float32([ [[0,0]],[[0,h-1]],[[w-1,h-1]],[[w-1,0]] ])
dst = cv2.perspectiveTransform(pts,mtrx)
img2 = cv2.polylines(img2,[np.int32(dst)],True,255,3, cv2.LINE_AA)

# Draw only the inlier matches ---⑥
matchesMask = mask.ravel().tolist()
res2 = cv2.drawMatches(img1, kp1, img2, kp2, matches, None, \
                    matchesMask = matchesMask,
                    flags=cv2.DRAW_MATCHES_FLAGS_NOT_DRAW_SINGLE_POINTS)
# Ratio of inliers to all matches ---⑦
accuracy = float(mask.sum()) / mask.size
print("accuracy: %d/%d(%.2f%%)"% (mask.sum(), mask.size, accuracy * 100))

# Show the results
cv2.imshow('Matching-All', res1)
cv2.imshow('Matching-Inlier ', res2)
cv2.waitKey()
cv2.destroyAllWindows()

Removing bad matches with RANSAC homography estimation

- RANSAC was used when computing the perspective transform matrix, and the resulting mask is used to filter out the wrong matches (see the sketch below)

- mask holds, at the same index as each input coordinate, 1 for an inlier and 0 for an outlier
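
The same mask can also be used directly to build a plain list of inlier matches, e.g. (a small sketch reusing matches and mask from the RANSAC example above):

# Sketch: keep only the matches that RANSAC classified as inliers
inlier_matches = [m for m, keep in zip(matches, mask.ravel()) if keep]
print('inliers kept: %d/%d' % (len(inlier_matches), len(matches)))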

 

 

 

 

 

 

Whew, this is hard...
