

[Paper Review] Time Series Forecasting (TSF) Using Various Deep Learning Models

징징알파카 · 2022. 9. 19. 15:18

Written on 2022-09-19

<This post was written while studying the paper by Jimeng Shi, Mahek Jain, and Giri Narasimhan :-) >

https://arxiv.org/abs/2204.11115

 


 

 

🟣 Abstract

  • Time Series Forecasting (TSF) is used to predict a target variable at future time points, based on learning from previous time points
  • To keep the problem tractable, the learning methods use data from a fixed-length window in the past as an explicit input
  • Deep learning methods (RNN, LSTM, GRU, and Transformer) are compared along with baseline methods
  • The Transformer model achieves the best performance
    • Lowest mean absolute error (MAE = 14.599, 23.273) on most single-step and multi-step predictions
    • Lowest root mean squared error (RMSE = 23.573, 38.165)
    • The best look-back window size for predicting 1 hour into the future is one day, while 2 or 4 days perform best for predicting 3 hours into the future

 

 

 

1๏ธโƒฃ INTRODUCTION

  • TIME ์˜์ƒ ์‹œ๋ฆฌ์ฆˆ๋Š” ํŠน์ • ๊ธฐ๊ฐ„ ๋™์•ˆ ์ฃผ์–ด์ง„ γ ๋ณ€์ˆ˜ ์ง‘ํ•ฉ์˜ ๋ฐ˜๋ณต ๊ด€์ธก์น˜ ์‹œํ€€์Šค
    • EX) ์ฃผ๊ฐ€, ๊ฐ•์ˆ˜๋Ÿ‰, ๊ตํ†ต๋Ÿ‰, ํ†ต์‹ , ์šด์†ก ๋„คํŠธ์›Œํฌ ๋“ฑ
  • ๋ชจ๋ธ๋“ค์€ ์‹œ๊ฐ„ ์‹œ๋ฆฌ์ฆˆ ํฌ์ฐฉํ•˜๋Š” ๋ฐ 3๊ฐ€์ง€๋กœ ๋‚˜๋ˆŒ ์ˆ˜ ์žˆ์Œ
    • ์ „ํ†ต์ ์ธ ๋ชจ๋ธ
      • ์„ ํ˜•
        • Autoregressive Moving Average (ARMA)
        • Autoregressive Integrated Moving Average (ARIMA)
      • ๋น„์„ ํ˜•
        • Autoregressive Fractionally Integrated Moving Average (ARFIMA)
        • Seasonal Autoregressive Integrated Moving Average (SARIMA)
      • ํ•œ๊ณ„
        • ์˜ˆ์ธก์„ ์ƒ์„ฑํ•˜๊ธฐ ์œ„ํ•ด ๊ฐ€์žฅ ์ตœ๊ทผ์˜ ๊ณผ๊ฑฐ ๋ฐ์ดํ„ฐ์—์„œ ๊ณ ์ •๋œ ์š”์ธ ์ง‘ํ•ฉ์— ํšŒ๊ท€๋ฅผ ์ ์šฉ
        • ์ „ํ†ต์ ์ธ ๋ฐฉ๋ฒ•์€ ๋ฐ˜๋ณต์ ์ด๋ฉฐ ์ข…์ข… ํ”„๋กœ์„ธ์Šค๊ฐ€ ์‹œ๋“œ๋˜๋Š” ๋ฐฉ์‹์— ๋ฏผ๊ฐ
        • ์ •์ƒ์„ฑ์€ ์—„๊ฒฉํ•œ ์กฐ๊ฑด์ด๋ฉฐ, ๋“œ๋ฆฌํ”„ํŠธ, ๊ณ„์ ˆ์„ฑ, ์ž๊ธฐ์ƒ๊ด€์„ฑ, ์ด์งˆ์„ฑ๋งŒ์„ ๋‹ค๋ฃจ๋Š” ๊ฒƒ๋งŒ์œผ๋กœ๋Š” ํœ˜๋ฐœ์„ฑ ์‹œ๊ณ„์—ด์˜ ์ •์ƒ์„ฑ์„ ๋‹ฌ์„ฑํ•˜๊ธฐ ์–ด๋ ค์›€
    • ๊ธฐ๊ณ„ ํ•™์Šต ๋ชจ๋ธ
      • ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ (SVM) 
      • ์ˆœํ™˜ ์‹ ๊ฒฝ๋ง(RNN)
      • ์žฅ๋‹จ๊ธฐ ๋ฉ”๋ชจ๋ฆฌ(LSTM)
      • Transformers๋ผ๊ณ  ๋ถˆ๋ฆฌ๋Š” ์ฃผ์˜ ๊ธฐ๋ฐ˜ ๋ฐฉ๋ฒ•
    • ๋”ฅ๋Ÿฌ๋‹ ๋ชจ๋ธ
      • ์ธ๊ณต ์‹ ๊ฒฝ๋ง(ANN)
      • ๋”ฅ ๋Ÿฌ๋‹ NN
    • ๋ชฉํ‘œ
      • (1) ์‹œ๊ณ„์—ด ์˜ˆ์ธก์„ ์œ„ํ•œ ๋”ฅ ๋Ÿฌ๋‹ ๋ชจ๋ธ(RNN, LSTM, GRU, Transformer)์„ ์ ์šฉ, ๊ฒ€์ฆํ•˜๊ณ  ํ•ด๋‹น ์„ฑ๋Šฅ์„ ๋น„๊ต
      • (2) ์ด๋Ÿฌํ•œ ๋ชจ๋ธ์˜ ๊ฐ•์ ๊ณผ ์•ฝ์ ์„ ํ‰๊ฐ€
      • (3) ๋ฃฉ๋ฐฑ ์ฐฝ์˜ ํฌ๊ธฐ์™€ ๋ฏธ๋ž˜ ์˜ˆ์ธก ์‹œ๊ฐ„์˜ ๊ธธ์ด๊ฐ€ ๋ฏธ์น˜๋Š” ์˜ํ–ฅ์„ ์ดํ•ด
      • (4) ์ง€์ •๋œ ๋ฏธ๋ž˜ ์‹œ๊ฐ„์— ์ตœ์ƒ์˜ ์˜ˆ์ธก์„ ์œ„ํ•ด ์‚ฌ์šฉํ•  ์ตœ์ ์˜ ๋ฃฉ๋ฐฑ ์ฐฝ ํฌ๊ธฐ๋ฅผ ์ •ํ™•ํžˆ ํŒŒ์•…

 

2๏ธโƒฃ METHODOLOGY

  • ๊ณผ๊ฑฐ ๋ฐ์ดํ„ฐ๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋”ฅ ๋Ÿฌ๋‹ ๋ชจ๋ธ์€ ๋Œ€์ƒ ๋ณ€์ˆ˜์˜ ์ž…๋ ฅ ๊ธฐ๋Šฅ๊ณผ ๋ฏธ๋ž˜ ๊ฐ’ ์‚ฌ์ด์˜ ๊ธฐ๋Šฅ์  ๊ด€๊ณ„๋ฅผ ํ•™์Šต
    • ๊ฒฐ๊ณผ ๋ชจ๋ธ์€ ๋ฏธ๋ž˜ ์‹œ์ ์— ๋ชฉํ‘œ ๋ณ€์ˆ˜์— ๋Œ€ํ•œ ์˜ˆ์ธก์„ ์ œ๊ณต
    • ๊ท ์ผํ•œ ๊ธธ์ด์˜ ๋ชจ๋ธ์— ์ž…๋ ฅํ•˜๊ธฐ ์œ„ํ•ด ๊ทธ๋ฆผ 1๊ณผ ๊ฐ™์ด θ ํฌ๊ธฐ์˜ ๊ณ ์ • ๊ธธ์ด ์Šฌ๋ผ์ด๋”ฉ ์‹œ๊ฐ„ ์ฐฝ์„ ์‚ฌ์šฉ
      • ์‹ (5)์„ ์ด์šฉํ•œ ์ตœ์†Œ-์ตœ๋Œ€ ์Šค์ผ€์ผ๋ง์œผ๋กœ ๋ฐ์ดํ„ฐ๋ฅผ ๋ณ€ํ™˜
      • ์ˆ˜ํ•™์ ์œผ๋กœ ๊ธฐ๊ณ„ ํ•™์Šต ๋ชจ๋ธ์— ์˜ํ•ด ํ•™์Šต๋œ ํ•จ์ˆ˜ ๊ด€๊ณ„๋Š” Eq์™€ ๊ฐ™์ด ์ ์„ ์ˆ˜ ์žˆ์Œ

[Figure 1: a fixed-length sliding window of size w moving over the time series]

Eq. (5), min-max scaling: x' = (x − x_min) / (x_max − x_min)

Learned relationship: y^(t+k) = f(k)( y(t−w), …, y(t−1), x(t−w), …, x(t−1) )

  • where y^(t+k) is the predicted value of the target variable at time t + k
  • k is the length of time into the future at which the target variable is predicted
  • y(t−w) to y(t−1) are the observed target values
  • x(t−w) to x(t−1) are the vectors of observed input features from time (t−w) to (t−1)
  • f(k) is the function learned by the deep learning model
  • m is the number of input features
  • w is the size of the window used as input (a toy sketch of this windowing follows below)
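
Below is a minimal NumPy sketch (my own, not from the paper's code) of this setup: min-max scaling as in Eq. (5), then slicing a series into (window, label) pairs. The array sizes and the toy data are assumptions for illustration only.

```python
import numpy as np

def min_max_scale(a):
    """Eq. (5): rescale each column to the range [0, 1]."""
    lo, hi = a.min(axis=0), a.max(axis=0)
    return (a - lo) / (hi - lo)

def make_windows(X, y, w, k):
    """Slide a window of size w over the series; each label is the
    target value k steps after the window's last time point."""
    inputs, labels = [], []
    for t in range(w, len(X) - k + 1):
        inputs.append(X[t - w:t])    # x(t-w) .. x(t-1)
        labels.append(y[t + k - 1])  # the value k steps after x(t-1)
    return np.stack(inputs), np.array(labels)

# Toy stand-in for the scaled multivariate series (1000 steps, 5 features).
rng = np.random.default_rng(0)
series = min_max_scale(rng.normal(size=(1000, 5)))
X_win, y_win = make_windows(series, series[:, 0], w=24, k=1)
print(X_win.shape, y_win.shape)  # (976, 24, 5) (976,)
```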

 

 

3๏ธโƒฃ DEEP LEARNING FRAMEWORKS

This section briefly describes the deep learning models used in this work: recurrent neural networks (RNN), long short-term memory (LSTM), gated recurrent units (GRU), and the Transformer.

💗 A. Recurrent Neural Networks (RNN)

 

 

  • RNN์€ ์‹œ๊ณ„์—ด ๋ฐ์ดํ„ฐ ๋ชจ๋ธ๋ง์— ๊ฐ€์žฅ ์ ํ•ฉ
    • ์‹ ๊ฒฝ๋ง์„ ์‚ฌ์šฉํ•˜์—ฌ ์ตœ๊ทผ ์ž…๋ ฅ ๊ธฐ๋Šฅ๊ณผ ๋ฏธ๋ž˜์˜ ๋ชฉํ‘œ ๋ณ€์ˆ˜ ์‚ฌ์ด์˜ ๊ธฐ๋Šฅ ๊ด€๊ณ„๋ฅผ ๋ชจ๋ธ๋ง
    • ๊ทธ๋ฆผ 2์— ๋‚˜ํƒ€๋‚œ ๋ฐ”์™€ ๊ฐ™์ด, RNN์€ ํ˜„์žฌ t - 1์—์„œ t๋กœ ๋‚ด๋ถ€(์ˆจ๊ฒจ์ง„) ์ƒํƒœ์˜ ์ „ํ™˜์— ์ดˆ์ ์„ ๋งž์ถ”์–ด ๊ณผ๊ฑฐ ๋ฐ์ดํ„ฐ์˜ ํ›ˆ๋ จ ์„ธํŠธ์—์„œ ํ•™์Šต 
      • ๊ฒฐ๊ณผ ๋ชจ๋ธ์€ ๋ชจ๋ธ์„ ์ •์˜ํ•˜๋Š” ๋ฐ ๋„์›€์ด ๋˜๋Š” ์„ธ ๊ฐœ์˜ ๋งค๊ฐœ ๋ณ€์ˆ˜ ํ–‰๋ ฌ w(x), w(y), w(s)
      • ๋‘ ๊ฐœ์˜ ๋ฐ”์ด์–ด์Šค ๋ฒกํ„ฐ b(s) ๋ฐ b(y) ์— ์˜ํ•ด ๊ฒฐ์ •
    • ์ถœ๋ ฅ y(t)๋Š” ๋‚ด๋ถ€ ์ƒํƒœ s(t)์— ๋”ฐ๋ผ ๋‹ฌ๋ผ์ง€๋ฉฐ, ์ด๋Š” ํ˜„์žฌ ์ž…๋ ฅ x(t)์™€ ์ด์ „ ์ƒํƒœ(t-1) ๋ชจ๋‘์— ๋”ฐ๋ผ ๋‹ฌ๋ผ์ง
    • ๊ฐ๊ฐ์˜ ์€๋‹‰ ์ƒํƒœ(์€๋‹‰ ๋‹จ์œ„ ๋˜๋Š” ์€๋‹‰ ์…€)์˜ ์—ฐ์‚ฐ ๊ณผ์ •์„ ๊ทธ๋ฆผ 3์— ๋‚˜ํƒ€๋ƒ„

  • b(s), b(y) ∈ R^N : bias vectors for the internal state and the output
  • σ : the sigmoid activation function
  • s(t) : the internal (hidden) state

  • RNN์˜ ๊ฐ€์žฅ ํฐ ๋‹จ์ ์€ ๋ฐ˜๋ณต ๊ฐ€์ค‘์น˜ ํ–‰๋ ฌ์˜ ๋ฐ˜๋ณต ๊ณฑ์…ˆ์œผ๋กœ ์ธํ•ด ๊ธฐ์šธ๊ธฐ ์†Œ์‹ค ๋ฌธ์ œ๋กœ ์–ด๋ ค์›€์„ ๊ฒช์Œ
    • ์‹œ๊ฐ„์ด ์ง€๋‚จ์— ๋”ฐ๋ผ ๊ธฐ์šธ๊ธฐ๊ฐ€ ๋„ˆ๋ฌด ์ž‘์•„์ง€๊ณ  RNN์ด ์งง์€ ์‹œ๊ฐ„ ๋™์•ˆ๋งŒ ์ •๋ณด๋ฅผ ๊ธฐ์–ตํ•˜๊ฒŒ ๋˜๊ธฐ ๋•Œ๋ฌธ

 

💗 B. Long Short-Term Memory (LSTM)

  • The Long Short-Term Memory (LSTM) network is a variant of the RNN that partially solves the vanishing gradient problem and learns long-term dependencies in time series data
  • At time t, it is described by an internal (hidden) state s(t) and a cell state c(t)
  • As shown in Figure 4, c(t) has three different dependencies
    • (1) the previous cell state, c(t−1)
    • (2) the previous internal state, s(t−1)
    • (3) the input at the current time point, x(t)
  • The process shown in Figure 4 allows removal/filtering, multiplication/combining, and addition of information via the forget gate, input gate, addition gate, and output gate, which implement the functions f(t), i(t), c~(t), and o(t), respectively, giving finer control over learning long-term dependencies (see the sketch after this list)

 

 

💗 C. Gated Recurrent Unit (GRU)

  • ๊ฒŒ์ดํŠธ ์ˆœํ™˜ ์žฅ์น˜(GRU)๋Š” ์‚ฌ๋ผ์ง€๋Š” ๊ฒฝ์‚ฌ ๋ฌธ์ œ๋ฅผ ์ถ”๊ฐ€๋กœ ํ•ด๊ฒฐํ•˜๊ธฐ ์œ„ํ•œ LSTM์˜ ๋ณ€ํ˜•
  • ๊ทธ๋ฆผ 5 ๊ฐ™์ด, ์ด ๋ฐฉ๋ฒ•์˜ ์‹ ๊ทœ์„ฑ์€ ๊ฐ๊ฐ z(t), r(t) ๋ฐ s~(t) ๊ธฐ๋Šฅ์„ ๊ตฌํ˜„ํ•œ ์—…๋ฐ์ดํŠธ ๊ฒŒ์ดํŠธ, ๋ฆฌ์…‹ ๊ฒŒ์ดํŠธ ๋ฐ ์ œ3 ๊ฒŒ์ดํŠธ๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒƒ
  • ๊ฐ ๊ฒŒ์ดํŠธ๋Š” ์‚ฌ์ „ ์ •๋ณด๋ฅผ ํ•„ํ„ฐ๋ง, ์‚ฌ์šฉ ๋ฐ ๊ฒฐํ•ฉํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ์ œ์–ดํ•˜๋Š” ๋ฐ ์„œ๋กœ ๋‹ค๋ฅธ ์—ญํ• 
    • (1 - z(t)) • S(t-1)์— ์˜ํ•ด ์ฃผ์–ด์ง„ ๋‹ค์Œ ์ƒํƒœ์— ๋Œ€ํ•œ ์‹์˜ ์ฒซ ๋ฒˆ์งธ ์šฉ์–ด๋Š” ๊ณผ๊ฑฐ๋กœ๋ถ€ํ„ฐ ๋ฌด์—‡์„ ์œ ์ง€ํ• ์ง€๋ฅผ ๊ฒฐ์ •
    • z(t) • S~(t)๋Š” ํ˜„์žฌ ๋ฉ”๋ชจ๋ฆฌ ๋‚ด์šฉ์—์„œ ๋ฌด์—‡์„ ์ˆ˜์ง‘ํ• ์ง€๋ฅผ ๊ฒฐ์ •
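
And the corresponding sketch for one GRU step, showing how (1 − z(t)) · s(t−1) keeps past information while z(t) · s~(t) collects new content; as before, the shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, s_prev, W, b):
    """One GRU step with update gate z(t), reset gate r(t),
    and candidate state s~(t)."""
    v = np.concatenate([s_prev, x_t])
    z = sigmoid(W["z"] @ v + b["z"])  # update gate: keep vs. refresh
    r = sigmoid(W["r"] @ v + b["r"])  # reset gate: how much past to expose
    s_cand = np.tanh(W["s"] @ np.concatenate([r * s_prev, x_t]) + b["s"])
    return (1.0 - z) * s_prev + z * s_cand  # mix past state and new content

N, m = 8, 5  # hidden size and feature count (assumed)
rng = np.random.default_rng(0)
W = {g: rng.normal(size=(N, N + m)) for g in "zrs"}
b = {g: np.zeros(N) for g in "zrs"}
s = gru_step(rng.normal(size=m), np.zeros(N), W, b)
```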

 

💗 D. Transformer Model

  • LSTMs and GRUs partially address the vanishing gradient problem of RNNs
  • However, the use of the hyperbolic tangent and sigmoid activation functions still causes gradient decay in deeper layers
    • Transformer networks are known to achieve the best performance on time series because of their use of attention, which selectively gives more weight to important information from the past
    • Figure 6 shows a schematic of the transformer network
      • It consists of two parts, an encoder and a decoder
      • w is the look-back window size
      • k is the number of steps into the future to be predicted
    • The decoder part has a masked attention mechanism
      • plus a multi-head attention mechanism that selects from the encoder outputs, which become the feature vectors for the decoder
      • Although the Transformer is not a recurrent network, it uses positional encoding to represent the temporal order of the data (a sketch follows below)
    • The encoder is fed data from a look-back window of size w and outputs the feature vectors to be used by the decoder
      • During training, the decoder is also fed the future data that is expected to be modeled, along with the encoder output
      • The attention feature of transformer networks helps them learn to pay attention to important features and past trends

 

4๏ธโƒฃ DATA AND EXPERIMENTS

  • We apply the four machine learning techniques to the Beijing air quality dataset from the UCI website, performing time series forecasting (TSF) for air quality prediction
  • Two types of experiments were performed
    • "single-step", which predicts the next time point using data from previous time points
    • "multi-step", which predicts several future time points using data from previous time points

💕 A. Dataset

  • The dataset used is the hourly Beijing air quality dataset from the UCI website
    • It covers five years of data, from January 1, 2010 to December 31, 2014
    • It was collected every hour, and the dataset has 43,824 rows and 13 columns
      • The first column is a simple index and is ignored in the analysis
      • The four columns giving year, month, day, and hour are combined into a single feature, "year-month-day-hour"
    • The 'PM2.5' column is the target variable
    • All other variables (along with time) are used as input features
    • Column names and descriptions are given in Table I
    • The time series of all input and target features, except time and 'cbwd', are shown in Figure 7
  • Some rows (24 out of 43,824) were discarded due to missing data
    • One-hot encoding is applied to the categorical wind-direction feature
    • The data are normalized to the range [0, 1] using min-max normalization
  • The data are split into a training set (first 70% of rows) and a test set (last 30% of rows); a preprocessing sketch follows below

 

 

💕 B. Experiments

  • k = 1์„ ์‚ฌ์šฉํ•œ ์‹คํ—˜์€ ๋ฏธ๋ž˜์— ๋Œ€ํ•œ ํ•œ ๋ฒˆ์˜ ๋‹จ๊ณ„๋ฅผ ์˜ˆ์ธกํ•˜๊ณ  ๋‹จ์ผ ๋‹จ๊ณ„ ์˜ˆ์ธก
  • k > 1์„ ์‚ฌ์šฉํ•œ ์‹คํ—˜์€ ๋ฏธ๋ž˜์— ๋Œ€ํ•œ ํ•˜๋‚˜ ์ด์ƒ์˜ ์‹œ์ ์„ ์˜ˆ์ธกํ•˜๊ณ  ๋‹ค๋‹จ๊ณ„ ์˜ˆ์ธก
  • ๋‘ ์‹คํ—˜ ๋ชจ๋‘ ์ž…๋ ฅ์œผ๋กœ ์‚ฌ์šฉ๋œ ์ตœ๊ทผ ๊ณผ๊ฑฐ์˜ ๋ถ€๋ถ„์„ ๋‚˜ํƒ€๋‚ด๋Š” look-back window ์˜ ๋‹ค๋ฅธ ๊ฐ’์œผ๋กœ ์ˆ˜ํ–‰๋˜์—ˆ๋‹ค.
    • 1, 2, 4, 8, 16์ผ์˜ ์ฐฝ ํฌ๊ธฐ๊ฐ€ ์‹คํ—˜์— ๋ชจ๋‘ ์‚ฌ์šฉ
    • ์ฐฝ ํฌ๊ธฐ๊ฐ€ ์˜ˆ์ธก ์ •ํ™•๋„์—S ๋ฏธ์น˜๋Š” ์˜ํ–ฅ์„ ์ดํ•ดํ•˜๊ธฐ ์œ„ํ•ด ์ฐฝ ํฌ๊ธฐ์˜ ์ง€์ˆ˜ ์„ ํƒ์ด ์„ ํƒ
  • ๋‹ค๋‹จ๊ณ„ ์˜ˆ์ธก์€ ํ–ฅํ›„ 1, 2, 4, 8, 16์‹œ๊ฐ„ ์‹œ์ ์˜ ๋Œ€๊ธฐ์งˆ ๊ฐ’์„ ์˜ˆ์ธกํ•˜๋Š” ๋ฐ ์‚ฌ์šฉ
  • 4๊ฐœ์˜ ๋”ฅ ๋Ÿฌ๋‹ ๋ชจ๋ธ ๊ฐ๊ฐ์— ๋Œ€ํ•ด ํ‘œ II์— ํ‘œ์‹œ๋œ ๊ฒƒ์ฒ˜๋Ÿผ ์„œ๋กœ ๋‹ค๋ฅธ ํ•˜์ดํผ ํŒŒ๋ผ๋ฏธํ„ฐ ์„ค์ •์„ ์‹œ๋„
    • ํ•™์Šต ์†๋„(0.00001, 0.00005, 0.0001, 0.0005, 0.001)
    • ๋ฐฐ์น˜ ํฌ๊ธฐ(128, 256, 512)
    • ์˜ตํ‹ฐ๋งˆ์ด์ €(Adam, SGD)

 

💕 C. Measures of Evaluation

  • Mean squared error (MSE) is used as the loss function
    • The training and test losses are computed with this loss function
    • and plotted against epochs to detect possible overfitting
    • Figures 8 and 9 show short-term air quality predictions and observations from 2013-07-04 09:00 to 2013-07-19 08:00
    • Mean absolute error (MAE) and root mean squared error (RMSE) are computed using the standard formulas: MAE = (1/n) Σ |y^_i − y_i| and RMSE = √((1/n) Σ (y^_i − y_i)²) (a sketch follows below)
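
A sketch of both error measures on toy numbers (the arrays below are made up for illustration):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: average of |prediction - observation|."""
    return np.mean(np.abs(y_pred - y_true))

def rmse(y_true, y_pred):
    """Root mean squared error: sqrt of the average squared residual."""
    return np.sqrt(np.mean((y_pred - y_true) ** 2))

y = np.array([30.0, 45.0, 60.0])      # toy observed PM2.5 values
y_hat = np.array([28.0, 50.0, 55.0])  # toy predictions
print(mae(y, y_hat))                  # 4.0
print(rmse(y, y_hat))                 # ≈ 4.243
```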

✔ A. Predict Multiple Timesteps Ahead

  • ๊ณ ์ •๋œ look-back window ํฌ๊ธฐ์— ๋Œ€ํ•ด ์‹œ๊ณ„์—ด ๊ฐ’์„ ์˜ˆ์ธกํ•˜๋Š” ๋ฏธ๋ž˜์˜ ์‹œ๊ฐ„์ธ k ๊ฐ’์„ ๋Š˜๋ฆฌ๋ฉด ๋ชจ๋ธ ์„ฑ๋Šฅ์ด ์–ด๋–ป๊ฒŒ ์ €ํ•˜๋˜๋Š”์ง€ ์กฐ์‚ฌ
    • ์š”๊ตฌ ์‚ฌํ•ญ์ด ์ฆ๊ฐ€ํ•จ์— ๋”ฐ๋ผ ์„ฑ๋Šฅ์ด ์ €ํ•˜๋  ๊ฒƒ์œผ๋กœ ์˜ˆ์ƒํ•ด๋„ ๋ฌด๋ฐฉ
    • TABLE III์˜ ๊ฐ ์—ด์— MAE ๋ฐ RMSE ๊ฐ’์ด k์— ๋”ฐ๋ผ ์ฆ๊ฐ€ํ•œ๋‹ค๋Š” ์‚ฌ์‹ค์— ์˜ํ•ด ํ™•์ธ
    • transformer models ์€ ์‹คํ—˜์˜ 80%์—์„œ RNN, LSTM ๋ฐ GRU๋ณด๋‹ค ์„ฑ๋Šฅ์ด ์šฐ์ˆ˜
    • ๋ฏธ๋ž˜๋ฅผ 4์‹œ๊ฐ„ ์ด์ƒ ์˜ˆ์ธกํ•ด์•ผ ํ•  ๊ฒฝ์šฐ ์˜ˆ์ธก ์„ฑ๋Šฅ์ด ๊ธ‰๊ฒฉํžˆ ๋–จ์–ด์ง

 

✔ B. Different Look-back Window Sizes

  • ๋‹จ์ผ ๋‹จ๊ณ„ ๋ฐ ๋‹ค์ค‘ ๋‹จ๊ณ„ ์˜ˆ์ธก์˜ ์„ฑ๋Šฅ์ด Look-back Window ํฌ๊ธฐ์— ์˜ํ•ด ์–ด๋–ป๊ฒŒ ์˜ํ–ฅ์„ ๋ฐ›๋Š”์ง€ ์กฐ์‚ฌ
    • ์‹คํ—˜์€ w = 24์‹œ๊ฐ„, 48์‹œ๊ฐ„, 96์‹œ๊ฐ„, 192์‹œ๊ฐ„, 384์‹œ๊ฐ„์œผ๋กœ ์ˆ˜ํ–‰
    • Single-step predictions : ํ‘œ IV๋Š” ์šฐ๋ฆฌ์˜ ์‹คํ—˜ ๊ฒฐ๊ณผ๋ฅผ ์š”์•ฝ
      • transformer network model ์€ w( 96 96์‹œ๊ฐ„)์˜ ๋” ํฐ ๊ฐ’์— ๋Œ€ํ•ด ๋‹ค๋ฅธ ๋ฐฉ๋ฒ•๋ณด๋‹ค ์„ฑ๋Šฅ์ด ์šฐ์ˆ˜
      • attention ๊ธฐ๋ฐ˜ ์ ‘๊ทผ ๋ฐฉ์‹์˜ ์•Œ๋ ค์ง„ ๊ฐ•์ ๊ณผ ์ผ์น˜
      • ๋” ์ž‘์€ ์ฐฝ ํฌ๊ธฐ(24์‹œ๊ฐ„ ๋˜๋Š” 48์‹œ๊ฐ„)์˜ ๊ฒฝ์šฐ, GRU์™€ LSTM์ด RNN๋ณด๋‹ค ๋” ๋‚˜์€ ์„ฑ๋Šฅ
        • ์ด๋Š” GRU์™€ LSTM์ด RNN๋ณด๋‹ค ๋” ๊ธด ๋ฉ”๋ชจ๋ฆฌ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์œผ๋ฉฐ ๋ถ€๋ถ„์ ์œผ๋กœ ์‚ฌ๋ผ์ง€๋Š” ๊ธฐ์šธ๊ธฐ ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ–ˆ๋‹ค๋Š” ์ฃผ์žฅ๊ณผ ์ผ์น˜
      • Single-step predictions์˜ ๊ฒฝ์šฐ ์ž‘์€ ์ฐฝ ํฌ๊ธฐ๋งŒ ์„ ํƒํ•  ์ˆ˜ ์žˆ๋Š” ๊ฒฝ์šฐ LSTM๊ณผ GRU๊ฐ€ ๋” ๋‚˜์€ ์„ ํƒ
      • ๋ถˆํ–‰ํžˆ๋„, Transformer ๋„คํŠธ์›Œํฌ๋Š” ์ฐฝ์ด ๋” ํด์ˆ˜๋ก ์†Œ์Œ ์ˆ˜์ค€์ด ์ฆ๊ฐ€ํ•˜๊ธฐ ๋•Œ๋ฌธ์— ํ›จ์”ฌ ๋” ํฐ ์ฐฝ ํฌ๊ธฐ๋ฅผ ์‚ฌ์šฉํ•˜๋”๋ผ๋„ ๋” ๋‚˜์€ ์„ฑ๋Šฅ์„ ์ œ๊ณตํ•˜์ง€ ๋ชปํ•จ
      • ๊ฒŒ๋‹ค๊ฐ€, ๋” ์ž‘์€ ์ฐฝ๋“ค์€ ๋” ํšจ์œจ์ ์ธ ๋ฐฉ๋ฒ•๋“ค๋กœ ์ด์–ด์งˆ ๊ฐ€๋Šฅ์„ฑ์ด ์žˆ์Œ
        • ์ด์ „ ์‹œ์ ์˜ ์‹œ๊ณ„์—ด ๊ฐ’๋งŒ ๋ณด๊ณ ํ•˜๋Š” ์˜ˆ์ธก์— ๋Œ€ํ•œ ๋‹จ์ˆœํ•œ ๊ธฐ์ค€์„  ์ ‘๊ทผ ๋ฐฉ์‹์€ ๊ฐ๊ฐ 16.624์™€ 26.828์˜ MAE ๊ฐ’์„ ๊ฐ€์ง
    • Multi-step predictions : ํ‘œ IV๋Š” w์˜ ๋‹ค์–‘ํ•œ ๊ฐ’์— ๋Œ€ํ•ด k = 3์‹œ๊ฐ„ ํ›„๋ฅผ ์˜ˆ์ธกํ•˜๋Š” ์‹คํ—˜ ๊ฒฐ๊ณผ๋ฅผ ๋ณด์—ฌ์คŒ
      • transformer network model ๋Š” ๋‹ค๋ฅธ ๋ชจ๋“  ๋„๊ตฌ๋ณด๋‹ค ์„ฑ๋Šฅ์ด ์šฐ์ˆ˜
      • ์ตœ์†Œ๊ฐ’์ด w = 48 ๋˜๋Š” 96์‹œ๊ฐ„ ๋™์•ˆ ๋„๋‹ฌํ•˜๊ธฐ ๋•Œ๋ฌธ์— ์„ฑ๋Šฅ ๋ณ€ํ™”๋Š” ๋‹จ์กฐ๋กญ์ง€ ์•Š์œผ๋ฉฐ, ์ด๋Š” ํ•™์Šต ๋ฐฉ๋ฒ•์— ๋Œ€ํ•œ ์ตœ์ ์˜ ๊ฐ’์ผ ์ˆ˜ ์žˆ์Œ์„ ์‹œ์‚ฌ
      • ๊ทธ๋ฆผ 9๋Š” ์‹คํ—˜์— ๋Œ€ํ•œ ์˜ˆ์ธก์„ ์‹œ๊ฐํ™”
        • T1, T2, T3 ๊ณก์„ ์€ k = 1, 2, 3์‹œ๊ฐ„์— ๋Œ€ํ•œ ์‹คํ—˜ ๊ฒฐ๊ณผ๋ฅผ ๋‚˜ํƒ€๋ƒ„

 

 

5๏ธโƒฃ CONCLUSIONS

  • The conclusions from our experiments with four different deep learning models can be summarized as follows
    • Transformer network models perform best when predicting further into the future
      • LSTM and GRU outperform RNN for short-term predictions
    • The dependence of performance on the look-back window size shows a local minimum
    • For single-step predictions, the optimal window size is w = 24 hours
      • For multi-step predictions, the optimal value is w = 48 or 96 hours (when predicting k = 3 hours ahead)
    • For multi-step predictions, the transformer outperforms the other methods
      • For single-step predictions, the transformer performs better only when the look-back window is longer

 

 

 
