
[FuncAnimation] 2. Building a univariate graph and visualizing it in a GUI

Jjingjjing Alpaca 2022. 10. 24. 15:06

Written 221024

<This post was written while studying, with reference to ahnjg's blog>

https://ahnjg.tistory.com/33

 

[keras, TF2.0] Temperature data, time series forecasting (Time Series Forecasting) — ahnjg.tistory.com
(based on the TensorFlow tutorial: https://www.tensorflow.org/tutorials/structured_data/time_series)

 

 

😊 Let's take a csv file and build a real-time graph

  • Scale the csv data
  • Predict the future temperature from the most recent data
  • Predict the future temperature with an LSTM
# 1. Univariate graph
## https://operstu1.tistory.com/97

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
import tensorflow as tf

TRAIN_SPLIT = 4000          # the csv has 4558 rows; the first 4000 are used for training
tf.random.set_seed(13)      # fix the random seed for reproducibility

def load_file():
    # load the data
    data = pd.read_csv("/home/ubuntu/FPDS/20220929/TESTTEST.csv")
    # aggregate (sum) by insert_date_time
    temp = data.groupby(["insert_date_time"], as_index=False).sum()
    # select a single variable
    uni_temp = temp['cnt_item']
    uni_temp.index = temp['insert_date_time']
    return uni_temp


def scaling(uni_temp):
    # standardization: subtract the mean and divide by the standard deviation,
    # using statistics computed on the training split only
    uni_temp = uni_temp.values
    uni_train_mean = uni_temp[:TRAIN_SPLIT].mean()
    uni_train_std = uni_temp[:TRAIN_SPLIT].std()
    uni_data = (uni_temp - uni_train_mean)/(uni_train_std)
    return uni_data

def univariate_data(dataset, start_index, end_index, history_size, target_size):
    # slide a window of length history_size over the series and pair each
    # window with the value target_size steps after it
    data=[]
    labels=[]

    start_index = start_index + history_size
    if end_index is None:
        end_index = len(dataset) - target_size

    for i in range(start_index, end_index):
        indices = range(i-history_size, i)
        # reshape from (history_size,) to (history_size, 1)
        data.append(np.reshape(dataset[indices], (history_size,1)))
        labels.append(dataset[i+target_size])
    return np.array(data), np.array(labels)

def prediction(uni_data):
    # use the 20 most recently collected data points to predict the future temperature
    univariate_past_history = 20
    univariate_future_target = 0

    x_train_uni, y_train_uni = univariate_data(uni_data, 0, TRAIN_SPLIT, univariate_past_history, univariate_future_target)
    x_val_uni, y_val_uni = univariate_data(uni_data, TRAIN_SPLIT, None, univariate_past_history, univariate_future_target)

    print('Single window of past history')
    print(x_train_uni[0])
    print('\n Target temperature to predict')
    print(y_train_uni[0])
    return x_train_uni, y_train_uni

def create_time_steps(length):
    # negative time steps so the history window ends at t = 0
    return list(range(-length, 0))

def show_plot(plot_data, delta, title):
    # compare the history, the true future value, and the model's prediction
    labels = ['History', 'True Future', 'Model Prediction']
    marker = ['.-', 'rx', 'go']
    time_steps = create_time_steps(plot_data[0].shape[0])
    if delta:
        future = delta
    else:
        future = 0
        
    plt.title(title)
    for i, x in enumerate(plot_data):
        if i:
            plt.plot(future, plot_data[i], marker[i], markersize=10, label=labels[i])
        else:
            plt.plot(time_steps, plot_data[i].flatten(), marker[i], label=labels[i])
    
    plt.legend()
    plt.xlim([time_steps[0], (future+5)*2])
    plt.xlabel('Time-Step')
    return plt
 
# real-time graph: re-read the csv every frame and plot the first i points
def animation(i):
    data = load_file()

    x = data[0:i].index
    y1 = data[0:i]

    ax.cla()
    ax.plot(x,y1, label='cnt_item')

    plt.legend(loc = 'upper left')
    plt.tight_layout()


# 1. load the data and keep only a single variable
# 2. standardization: subtract the mean and divide by the standard deviation
uni_data = scaling(load_file())
# 3. use the 20 most recently collected data points to predict the future temperature
x_train_uni, y_train_uni = prediction(uni_data)
# 4. visualize the prediction window
show_plot([x_train_uni[0], y_train_uni[0]], 0, 'Sample Example')

# 5. real-time data visualization
plt.style.use('seaborn')
fig = plt.figure()
ax = fig.add_subplot(1,1,1)

# keep a reference to the animation (named graph_ani so the animation() function is not shadowed)
graph_ani = FuncAnimation(fig, func=animation, interval=100)
plt.show()

# 6. save as a gif
#graph_ani.save('graph_ani.gif', writer='imagemagick', fps=3, dpi=100)
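The commented save line above needs ImageMagick installed to work. A minimal alternative sketch uses matplotlib's built-in Pillow writer instead; the frames count and fps here are illustrative assumptions, not values from the original post:

# sketch: limit the animation to a fixed number of frames, then save with the Pillow writer
graph_ani = FuncAnimation(fig, func=animation, frames=200, interval=100)
graph_ani.save('graph_ani.gif', writer='pillow', fps=3, dpi=100)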

 

✔ Predicting the future temperature from the most recent data

[Figures: the existing data and the predicted value · the value predicted from the most recent data]
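As a sanity check before training a model, the referenced TensorFlow tutorial uses a simple baseline that predicts the next value as the mean of the last 20 observations. A minimal sketch of that baseline is shown here, reusing show_plot from the code above; the baseline function comes from the tutorial and is not part of this post's script:

def baseline(history):
    # naive prediction: the mean of the history window
    return np.mean(history)

show_plot([x_train_uni[0], y_train_uni[0], baseline(x_train_uni[0])],
          0, 'Baseline Prediction Example').show()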

 

 

✔ Predicting the future temperature with an LSTM

Recurrent Neural Network (RNN)

: A recurrent neural network can process time-series data one time step at a time, summarizing the data that has been fed in as input up to that point.
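The results below are shown without the model code, so here is a minimal sketch following the referenced TensorFlow tutorial. The layer width, batch size, buffer size and epoch/step counts are the tutorial's illustrative values and are assumptions here, not values confirmed by this post:

BATCH_SIZE = 256
BUFFER_SIZE = 10000

# rebuild the validation windows (prediction() above only returns the training windows)
x_val_uni, y_val_uni = univariate_data(uni_data, TRAIN_SPLIT, None, 20, 0)

train_univariate = tf.data.Dataset.from_tensor_slices((x_train_uni, y_train_uni))
train_univariate = train_univariate.cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE).repeat()

val_univariate = tf.data.Dataset.from_tensor_slices((x_val_uni, y_val_uni))
val_univariate = val_univariate.batch(BATCH_SIZE).repeat()

# a single LSTM layer followed by a dense layer that outputs one value
simple_lstm_model = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(8, input_shape=x_train_uni.shape[-2:]),
    tf.keras.layers.Dense(1)
])
simple_lstm_model.compile(optimizer='adam', loss='mae')

simple_lstm_model.fit(train_univariate, epochs=10,
                      steps_per_epoch=200,
                      validation_data=val_univariate,
                      validation_steps=50)

# compare the model's prediction with the true future value
for x, y in val_univariate.take(3):
    show_plot([x[0].numpy(), y[0].numpy(),
               simple_lstm_model.predict(x)[0]], 0, 'Simple LSTM model').show()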

[Figures: training the LSTM model · the last point is predicted well · drawing the graph in real time]
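One hypothetical way to combine the two parts of this post (not shown in the original) is to have the animation callback also plot the trained LSTM's one-step-ahead prediction, converted back from standardized units to the raw cnt_item scale. The names simple_lstm_model (from the sketch above), train_mean, train_std and animation_with_prediction are assumptions for illustration:

# statistics used to undo the standardization (same split as scaling())
raw = load_file().values
train_mean = raw[:TRAIN_SPLIT].mean()
train_std = raw[:TRAIN_SPLIT].std()

def animation_with_prediction(i):
    data = load_file()
    y1 = data[0:i]

    ax.cla()
    ax.plot(range(len(y1)), y1.values, label='cnt_item')

    # once a full 20-point history window is available, predict the next value
    if len(y1) >= 20:
        window = ((y1.values[-20:] - train_mean) / train_std).reshape(1, 20, 1)
        pred = simple_lstm_model.predict(window, verbose=0)[0, 0]
        ax.plot(len(y1), pred * train_std + train_mean, 'go', label='LSTM prediction')

    plt.legend(loc='upper left')
    plt.tight_layout()

graph_ani = FuncAnimation(fig, func=animation_with_prediction, interval=100)
plt.show()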
