
[DEEPNOID One-Point Lesson]_3_Classification 1. ResNet/DenseNet

์ง•์ง•์•ŒํŒŒ์นด 2022. 1. 25. 11:36

Written 2022-01-25

<This post was written while studying from the DEEPNOID One-Point Lesson>

https://www.deepnoid.com/

1. ResNet

: Deep Residual Learning for Image Recognition

: Very deep networks utilizing residual connections (up to 152 layers)

: Shortcut Connection

: Residual Learning

: A shortcut path that lets the input be added directly to the output

[Figure: ResNet]

: The block computes H(x) = F(x) + x, and training aims to drive the residual F(x) toward zero

: x is fixed at this point in the network, so the block cannot change it

: Since F(x) = H(x) - x, minimizing F(x) is the same as minimizing H(x) - x

=> H(x) - x is the residual

=> Because the network learns by minimizing this residual, it is called ResNet

=> Applying skip connections alleviates the vanishing gradient problem, so very deep networks (152 layers) can be trained with improved performance
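
To make the shortcut concrete, here is a minimal sketch of a ResNet basic block in PyTorch (the framework is my assumption; the lesson shows no code). The two convolutions form F(x), and the addition in the last line of forward() is the shortcut that produces H(x) = F(x) + x.

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # F(x): the residual branch the block has to learn
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        f = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        return self.relu(f + x)  # shortcut connection: H(x) = F(x) + x

x = torch.randn(1, 64, 32, 32)
print(BasicBlock(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```

Because F(x) only has to model the difference from the identity, a block with nothing useful to add can simply learn F(x) ≈ 0 and pass x through unchanged.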

2. DenseNet

: To further improve model compactness

: Dense Connectivity

: Composite Function (BN-ReLU-Conv)

: Pooling Layers

: Growth Rate

: Bottleneck Layers

: Compression

- Advantages

: Alleviates the vanishing gradient problem

: Ease of feature propagation and feature reuse

: Regularization effect

: High parameter efficiency

=> By applying an evolved form of skip connection together with bottleneck layers, a very deep network that retains all of its features can be trained, improving performance

 

1) Dense connections, evolved from ResNet's skip connections

2) Bottleneck layers applied inside DenseNet

: at every layer, the newly produced feature maps are concatenated onto all of the previous feature maps (a minimal sketch follows below)
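
Here is a minimal sketch of one bottleneck dense layer in PyTorch (the framework and names such as growth_rate are my assumptions, following the DenseNet paper rather than the lesson). The composite function is BN-ReLU-Conv, the 1x1 bottleneck first narrows the input to 4 * growth_rate channels, and the 3x3 convolution then produces growth_rate new feature maps that are concatenated onto everything that came before.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        # Composite function (BN-ReLU-Conv), with a 1x1 bottleneck in front
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, 4 * growth_rate, 1, bias=False),  # bottleneck
            nn.BatchNorm2d(4 * growth_rate),
            nn.ReLU(inplace=True),
            nn.Conv2d(4 * growth_rate, growth_rate, 3, padding=1, bias=False),
        )

    def forward(self, x):
        # Dense connectivity: keep all previous feature maps, append new ones
        return torch.cat([x, self.body(x)], dim=1)

x = torch.randn(1, 64, 32, 32)
print(DenseLayer(64, growth_rate=32)(x).shape)  # torch.Size([1, 96, 32, 32])
```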

 

3) Transition Layer

Downsampling

: reduces the width and height of each feature map and also reduces the number of feature maps (sketched below)
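
A minimal sketch of a transition layer under the same assumptions: a 1x1 convolution applies the compression factor (0.5 here, as in the paper's DenseNet-C) to cut the number of feature maps, and 2x2 average pooling halves the spatial size.

```python
import torch
import torch.nn as nn

class Transition(nn.Module):
    def __init__(self, in_channels, compression=0.5):
        super().__init__()
        out_channels = int(in_channels * compression)  # compression: fewer feature maps
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, out_channels, 1, bias=False),
            nn.AvgPool2d(2),  # downsampling: halves width and height
        )

    def forward(self, x):
        return self.body(x)

x = torch.randn(1, 96, 32, 32)
print(Transition(96)(x).shape)  # torch.Size([1, 48, 16, 16])
```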

 

4) Classification Layer

: reduces the parameter count by not using large fully connected layers; global average pooling feeds a single small classifier (sketched below)
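
A minimal sketch of that classification head (the channel and class counts are hypothetical examples): global average pooling collapses each feature map to a single value, so the only dense weights left are one small linear layer.

```python
import torch
import torch.nn as nn

head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),  # global average pooling: (N, C, H, W) -> (N, C, 1, 1)
    nn.Flatten(),             # (N, C, 1, 1) -> (N, C)
    nn.Linear(48, 10),        # hypothetical: 48 channels in, 10 classes out
)

x = torch.randn(1, 48, 16, 16)
print(head(x).shape)  # torch.Size([1, 10])
```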

์ข€ ๋” ์ฐธ๊ณ ํ•ด์„œ ๊ณต๋ถ€ํ•ด๋ณด์•„์”๋‹ˆ๋‹น

https://warm-uk.tistory.com/46

[CNN Concepts] The Evolution of CNN, Model Summary 2 (ResNet, DenseNet)

 
