


[DEEPNOID One-Point Lesson]_4_Classification 2. MobileNet & EfficientNet

์ง•์ง•์•ŒํŒŒ์นด 2022. 1. 26. 10:41

Written 220125

<This post was written while studying the DEEPNOID one-point lesson.>

https://www.deepnoid.com/

 


1. MobileNet

: the key is a lightweight design that can run even on mobile devices!

: reducing computation is the top-priority goal

 

- Depthwise Separable Convolution

1) Depthwise Convolution

: ๊ฐ๊ฐ์˜ feature map ์— ๋Œ€ํ•ด 1-channel Conv ์—ฐ์‚ฐ ์ˆ˜ํ–‰ ํ›„ Concat ํ˜•ํƒœ๋กœ ๊ฐ๊ฐ ๊ฒฐ๊ณผ feature map ์ฑ„๋„๋ณ„๋กœ ์Œ“์Œ

 

2) Separable Convolution

: ์ฑ„๋„๋ณ„๋กœ ๊ตฌํ•ด์ง„ feature map ์„ 1x1 kernel Conv ์—ฐ์‚ฐ์œผ๋กœ ํ•˜๋‚˜์˜ channel ๋กœ ํ•ฉ์„ฑ

=> ๊ฐ๊ฐ ์—ฐ์‚ฐ๋œ feature map ๋“ค์ด ํ•˜๋‚˜์˜ ์˜์ƒ์œผ๋กœ stack ๋œ ํ›„ 1x1 kernel Conv์œผ๋กœ ์ฑ„๋„์ˆ˜๋งŒ ๋ฐ”๊ฟ”์ฃผ๋Š” ๊ฒƒ

 

 

2. Standard Conv computation cost

: (Dk^2 * M) * (N * Dg^2)

=> each of the N * Dg^2 output values needs Dk^2 * M multiplications (one per kernel position per input channel)

- Dk : kernel size

Df : input (feature map) size

Dg : output size

M : input channels

N : output channels

 

 

3. Depthwise Conv computation cost

: (Dk^2 * 1) * (M * Dg^2)

- M = 1 : each Conv runs independently on a single channel

- N = M : the Depthwise Conv output has as many channels as the input; the per-channel results are concatenated back together

 

 

4. Separable Conv computation cost

: (1^2 * M) * (N * Dg^2)

- Dk = 1 : a 1x1 kernel convolution is performed, so the Dk^2 factor becomes 1

- Df = Dg : the 1x1 kernel convolution does not change the spatial size

 

 

5. Standard VS Depthwise Separable Conv

- Standard

= (Dk^2 * M) * (N * Dg^2) = Dk^2 * M * N * Dg^2

- Depthwise Separable

= (Dk^2 * M * Dg^2) + (M * N * Dg^2)

= M * Dg^2 * (Dk^2 + N)

=> ratio = (Dk^2 + N) / (Dk^2 * N) = 1/N + 1/Dk^2

(Figure: standard vs Depthwise Separable Conv computation)
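
As a quick sanity check of the formulas, here is the multiplication count for one layer with illustrative sizes (Dk = 3, M = 64, N = 128, Dg = 56; the numbers are only an example):

```python
# Multiplication counts for one layer, using the symbols defined above (illustrative sizes).
Dk, M, N, Dg = 3, 64, 128, 56

standard = (Dk**2 * M) * (N * Dg**2)              # standard convolution
depthwise = (Dk**2 * 1) * (M * Dg**2)             # depthwise 3x3, one filter per channel
pointwise = (1**2 * M) * (N * Dg**2)              # 1x1 pointwise convolution
separable = depthwise + pointwise                 # = M * Dg^2 * (Dk^2 + N)

print(f"standard  : {standard:,}")                # 231,211,008
print(f"separable : {separable:,}")               # 27,496,448
print(f"ratio     : {separable / standard:.3f}")  # ~0.119 = 1/N + 1/Dk^2
```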

6. EfficientNet

: EfficientNet = Baseline (MBConv) Block + Compound Scaling

=> GOOD on both performance and computation cost

 

1) Inverted residual block (Linear Bottleneck)

- residual block

: when the channel count is large, a 1x1 Conv first reduces the channels, a 3x3 Conv is applied, the channels are expanded back, and then the skip connection is added

- inverted residual block

: an expansion layer first increases the channels (the opposite order), the Conv is applied, the result is projected back down to the original channel count, and the skip connection is added (see the sketch below)

 

- manifold

: the information in a high-dimensional channel space can be represented in a lower-dimensional space
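
A minimal PyTorch sketch of an inverted residual block with a linear bottleneck, assuming a 3x3 depthwise conv, stride 1, and an expansion factor of 6 (MobileNetV2-style defaults chosen for illustration):

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """Expand channels with a 1x1 conv, run a depthwise 3x3 conv, project back down."""
    def __init__(self, channels, expansion=6):
        super().__init__()
        hidden = channels * expansion
        self.block = nn.Sequential(
            nn.Conv2d(channels, hidden, 1, bias=False),        # expansion layer (1x1)
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, 3, padding=1,
                      groups=hidden, bias=False),               # depthwise 3x3
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, channels, 1, bias=False),         # linear bottleneck (no activation)
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)  # skip connection at the original channel size

x = torch.randn(1, 32, 28, 28)
print(InvertedResidual(32)(x).shape)  # torch.Size([1, 32, 28, 28])
```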

 

 

7. MBConv Block - SE Block

- squeeze

: compress each feature map with GAP (global average pooling)

 

- excitation

: compute the importance of each channel (ReLU + Sigmoid)

=> SE Block : GAP + FC + ReLU + FC + Sigmoid

=> an importance weight is computed per feature map, so more accurate information is passed on for class prediction
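
A minimal PyTorch sketch of the SE block described above; the reduction ratio of 4 is an illustrative choice:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: GAP -> FC -> ReLU -> FC -> Sigmoid, then rescale channels."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),   # FC
            nn.ReLU(inplace=True),                        # ReLU
            nn.Linear(channels // reduction, channels),   # FC
            nn.Sigmoid(),                                 # per-channel weight in (0, 1)
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))           # squeeze: global average pooling -> (b, c)
        w = self.fc(w).view(b, c, 1, 1)  # excitation: per-channel importance
        return x * w                     # reweight each feature map by its importance

x = torch.randn(1, 64, 28, 28)
print(SEBlock(64)(x).shape)  # torch.Size([1, 64, 28, 28])
```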

8. EfficientNet Baseline Block : MBConv Block

- MobileNet

: depthwise separable

: inverted residual

 

- SENet

: squeeze

: excitation
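
Putting the two sketches above together, an MBConv-style block could look roughly like this; the expansion factor, SE reduction ratio, and the SiLU activation are assumptions for illustration, not the exact EfficientNet configuration:

```python
import torch
import torch.nn as nn

class MBConv(nn.Module):
    """Inverted residual block with an SE stage between the depthwise conv and the projection."""
    def __init__(self, channels, expansion=6, se_reduction=4):
        super().__init__()
        hidden = channels * expansion
        self.expand = nn.Sequential(nn.Conv2d(channels, hidden, 1, bias=False),
                                    nn.BatchNorm2d(hidden), nn.SiLU())
        self.depthwise = nn.Sequential(nn.Conv2d(hidden, hidden, 3, padding=1,
                                                 groups=hidden, bias=False),
                                       nn.BatchNorm2d(hidden), nn.SiLU())
        self.se = nn.Sequential(nn.AdaptiveAvgPool2d(1),                        # squeeze (GAP)
                                nn.Conv2d(hidden, hidden // se_reduction, 1), nn.SiLU(),
                                nn.Conv2d(hidden // se_reduction, hidden, 1), nn.Sigmoid())
        self.project = nn.Sequential(nn.Conv2d(hidden, channels, 1, bias=False),
                                     nn.BatchNorm2d(channels))                  # linear bottleneck

    def forward(self, x):
        h = self.depthwise(self.expand(x))
        h = h * self.se(h)            # excitation: reweight channels
        return x + self.project(h)    # skip connection

x = torch.randn(1, 32, 28, 28)
print(MBConv(32)(x).shape)  # torch.Size([1, 32, 28, 28])
```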

9. Compound Scaling

: ๋ชจ๋ธ์˜ ์„ฑ๋Šฅ์„ ํ–ฅ์ƒ์‹œํ‚ฌ ์ˆ˜ ์žˆ๋Š” ์ตœ์ ์˜ Width, Depth, Resolution scaling
