

This post covers how to create a callback for TensorBoard and the magic commands that let you load and view TensorBoard directly in Colab.

As mentioned again further down, if you run this on a local machine or a Jupyter Notebook server you need to install a separate extension, and you can then view TensorBoard through that TensorBoard extension.





Prerequisites

  • On Google Colab this works out of the box, with no extra installation.
  • If you run it locally or on a Jupyter Notebook server, a separate installation step is required.
  • After installing jupyter-tensorboard as described below, click Tensorboard in Jupyter Notebook and TensorBoard opens right away.

Installation for loading TensorBoard on a Jupyter Notebook server

  • Install

pip install jupyter-tensorboard

Run via Docker

docker pull lspvic/tensorboard-notebook

docker run -it --rm -p 8888:8888 lspvic/tensorboard-notebook

Hands-on: START

Import the required libraries

import tensorflow as tf
import numpy as np
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Sequential
from tensorflow.keras.callbacks import ModelCheckpoint

Load & preprocess the data

fashion_mnist = tf.keras.datasets.fashion_mnist

(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()

# Scale the pixel values to the [0, 1] range
x_train = x_train / 255.0
x_test = x_test / 255.0
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
32768/29515 [=================================] - 0s 2us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26427392/26421880 [==============================] - 1s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
8192/5148 [===============================================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4423680/4422102 [==============================] - 0s 0us/step
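
Before building the model, it is worth a quick sanity check of what was loaded. A minimal sketch; the shapes and value range below are simply what Fashion-MNIST ships with, after the division by 255:

print(x_train.shape, y_train.shape)   # (60000, 28, 28) (60000,)
print(x_test.shape, y_test.shape)     # (10000, 28, 28) (10000,)
print(x_train.min(), x_train.max())   # 0.0 1.0 after scaling
print(np.unique(y_train))             # integer class labels 0-9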

Simple modeling

model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(512, activation='relu'),
    Dense(256, activation='relu'),
    Dense(128, activation='relu'),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['acc'])
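
Passing the string 'adam' uses the optimizer with its default settings. If you later want to compare runs with different learning rates in TensorBoard, you can pass an optimizer instance instead; a minimal sketch (1e-3 is just Adam's default value):

from tensorflow.keras.optimizers import Adam

model.compile(optimizer=Adam(learning_rate=1e-3),
              loss='sparse_categorical_crossentropy',
              metrics=['acc'])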

[Key code] Creating the TensorBoard callback

# Use the current timestamp to build a unique folder name
import datetime
# Create (designate) the folder where the training logs will be stored
log_dir = "logs/my_board/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

# Define the TensorBoard callback
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
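# histogram_freq=1 writes weight histograms to the log every epoch (0, the default, disables them);
# passing the callback to model.fit below is what makes Keras write the logs during training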
model.fit(x_train, y_train,
          validation_data=(x_test, y_test),
          epochs=10,
          callbacks=[tensorboard_callback],
          )
Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 6s 99us/sample - loss: 0.0853 - acc: 0.9695 - val_loss: 0.6725 - val_acc: 0.8950
Epoch 2/10
60000/60000 [==============================] - 6s 97us/sample - loss: 0.0796 - acc: 0.9712 - val_loss: 0.7866 - val_acc: 0.8898
Epoch 3/10
60000/60000 [==============================] - 6s 95us/sample - loss: 0.0755 - acc: 0.9720 - val_loss: 0.7241 - val_acc: 0.8978
Epoch 4/10
60000/60000 [==============================] - 6s 96us/sample - loss: 0.0822 - acc: 0.9707 - val_loss: 0.8183 - val_acc: 0.8965
Epoch 5/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.0770 - acc: 0.9723 - val_loss: 0.7377 - val_acc: 0.8930
Epoch 6/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.0786 - acc: 0.9721 - val_loss: 0.8396 - val_acc: 0.8920
Epoch 7/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.0788 - acc: 0.9714 - val_loss: 0.8566 - val_acc: 0.8976
Epoch 8/10
60000/60000 [==============================] - 6s 96us/sample - loss: 0.0730 - acc: 0.9742 - val_loss: 0.8196 - val_acc: 0.8976
Epoch 9/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.0704 - acc: 0.9746 - val_loss: 0.8780 - val_acc: 0.8948
Epoch 10/10
60000/60000 [==============================] - 6s 99us/sample - loss: 0.0715 - acc: 0.9746 - val_loss: 0.8415 - val_acc: 0.8940
<tensorflow.python.keras.callbacks.History at 0x7fe8d7cfa780>
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten (Flatten)            (None, 784)               0         
_________________________________________________________________
dense (Dense)                (None, 512)               401920    
_________________________________________________________________
dense_1 (Dense)              (None, 256)               131328    
_________________________________________________________________
dense_2 (Dense)              (None, 128)               32896     
_________________________________________________________________
dense_3 (Dense)              (None, 64)                8256      
_________________________________________________________________
dense_4 (Dense)              (None, 10)                650       
=================================================================
Total params: 575,050
Trainable params: 575,050
Non-trainable params: 0
_________________________________________________________________
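
The TensorBoard callback records losses and metrics automatically. If you also want to log your own values into the same run, the tf.summary API can write into the same folder; a minimal sketch that reuses the log_dir defined above (the metric name and values are purely illustrative):

# Write an arbitrary scalar series; TensorBoard shows it alongside the Keras logs
file_writer = tf.summary.create_file_writer(log_dir + "/custom")
with file_writer.as_default():
    for step, value in enumerate([0.9, 0.7, 0.5]):   # illustrative values only
        tf.summary.scalar("my_custom_metric", value, step=step)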

Loading TensorBoard directly in Colab

Magic command for loading the TensorBoard extension

%load_ext tensorboard
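
If the extension is already loaded in the current session (for example, when the cell is re-run), IPython prints a warning instead; the standard %reload_ext magic reloads it.

%reload_ext tensorboard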

Load TensorBoard.

%tensorboard --logdir {log_dir}
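
Outside of Colab, the same logs can also be viewed by running tensorboard --logdir logs/my_board in a terminal and opening http://localhost:6006 (TensorBoard's default port) in a browser.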
