In [1]:
from IPython.core.display import display, HTML
display(HTML("<style>.container {width:90% !important;}</style>"))
Import the required libraries
In [2]:
import tensorflow as tf
from tensorflow.keras import preprocessing, layers
Create the input and label data
In [3]:
samples = ["너 오늘 이뻐 보인다",
           "나는 오늘 기분이 더러워",
           "끝내주는데, 좋은 일이 있나봐",
           "나 좋은 일이 생겼어",
           "아 오늘 진짜 짜증나",
           "환상적인데, 정말 좋은거 같아"]
labels = [[1], [0], [1], [1], [0], [1]]
Preprocessing
In [4]:
tokenizer = preprocessing.text.Tokenizer()
tokenizer.fit_on_texts(samples)
word_index = tokenizer.word_index
word_index
Out[4]:
{'오늘': 1, '좋은': 2, '일이': 3, '너': 4, '이뻐': 5, '보인다': 6, '나는': 7, '기분이': 8, '더러워': 9, '끝내주는데': 10, '있나봐': 11, '나': 12, '생겼어': 13, '아': 14, '진짜': 15, '짜증나': 16, '환상적인데': 17, '정말': 18, '좋은거': 19, '같아': 20}
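Tokenizer assigns indices in descending order of word frequency, breaking ties by order of first appearance; that is why '오늘', which appears three times across the samples, gets index 1. A small self-contained check of that behavior (the toy texts are hypothetical):

```python
from tensorflow.keras import preprocessing

# Frequencies: 'a' appears 3 times, 'b' twice, 'c' once,
# so they receive indices 1, 2, 3 respectively
tok = preprocessing.text.Tokenizer()
tok.fit_on_texts(["b a a", "c a b"])
print(tok.word_index)  # {'a': 1, 'b': 2, 'c': 3}
```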
In [5]:
sequences = tokenizer.texts_to_sequences(samples)
sequences
Out[5]:
[[4, 1, 5, 6], [7, 1, 8, 9], [10, 2, 3, 11], [12, 2, 3, 13], [14, 1, 15, 16], [17, 18, 19, 20]]
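Every sample here happens to tokenize to exactly four words, so `sequences` already forms a rectangular array that can be fed to `fit` directly. With variable-length sentences you would pad them to a common length first; index 0 is reserved for padding, which is also why `vocab_size` below is `len(word_index) + 1`. A sketch with hypothetical ragged sequences:

```python
from tensorflow.keras import preprocessing

# Hypothetical ragged sequences for illustration
ragged = [[4, 1, 5], [7, 1, 8, 9], [10, 2]]

# Pad every sequence to length 4 with trailing zeros
padded = preprocessing.sequence.pad_sequences(ragged, maxlen=4, padding="post")
print(padded.tolist())  # [[4, 1, 5, 0], [7, 1, 8, 9], [10, 2, 0, 0]]
```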
Set the hyperparameters
In [6]:
batch_size = 2
num_epochs = 10
vocab_size = len(word_index) + 1
emb_size = 128
hidden_dimension = 256
output_dimension = 1
Build the deep neural network model
In [7]:
class CustomModel(tf.keras.Model):
    def __init__(self, vocab_size, embed_dimension, hidden_dimension, output_dimension):
        super(CustomModel, self).__init__(name="my_model")
        self.embedding = layers.Embedding(vocab_size, embed_dimension)
        self.dense_layer = layers.Dense(hidden_dimension, activation="relu")
        self.output_layer = layers.Dense(output_dimension, activation="sigmoid")

    def call(self, inputs):
        x = self.embedding(inputs)
        # Average the word embeddings over the sequence axis
        x = tf.reduce_mean(x, axis=1)
        x = self.dense_layer(x)
        x = self.output_layer(x)
        return x
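In `call`, `tf.reduce_mean` averages the word embeddings over the sequence axis, collapsing each sentence of shape `(sequence_length, emb_size)` into a single `emb_size`-dimensional vector before the dense layers. A quick shape check with the dimensions used in this post:

```python
import tensorflow as tf

# Embedding output for a batch of 2 sentences of 4 words each: (2, 4, 128)
x = tf.random.normal((2, 4, 128))

# Averaging over axis 1 (the word axis) leaves one 128-d vector per sentence
pooled = tf.reduce_mean(x, axis=1)
print(pooled.shape)  # (2, 128)
```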
Train the model
In [8]:
model = CustomModel(vocab_size=vocab_size, embed_dimension=emb_size,
                    hidden_dimension=hidden_dimension, output_dimension=output_dimension)
model.compile(optimizer=tf.keras.optimizers.Adam(0.001), loss="binary_crossentropy", metrics=["accuracy"])
model.fit(sequences, labels, epochs=num_epochs, batch_size=batch_size)
Epoch 1/10
3/3 [==============================] - 0s 1ms/step - loss: 0.6914 - accuracy: 0.6667
Epoch 2/10
3/3 [==============================] - 0s 1ms/step - loss: 0.6727 - accuracy: 1.0000
Epoch 3/10
3/3 [==============================] - 0s 1ms/step - loss: 0.6571 - accuracy: 1.0000
Epoch 4/10
3/3 [==============================] - 0s 2ms/step - loss: 0.6425 - accuracy: 1.0000
Epoch 5/10
3/3 [==============================] - 0s 1ms/step - loss: 0.6213 - accuracy: 1.0000
Epoch 6/10
3/3 [==============================] - 0s 1ms/step - loss: 0.6000 - accuracy: 1.0000
Epoch 7/10
3/3 [==============================] - 0s 1ms/step - loss: 0.5724 - accuracy: 1.0000
Epoch 8/10
3/3 [==============================] - 0s 997us/step - loss: 0.5429 - accuracy: 1.0000
Epoch 9/10
3/3 [==============================] - 0s 1ms/step - loss: 0.5068 - accuracy: 1.0000
Epoch 10/10
3/3 [==============================] - 0s 1ms/step - loss: 0.4676 - accuracy: 1.0000
Out[8]:
<tensorflow.python.keras.callbacks.History at 0x200869bf388>
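To classify a new sentence with the trained model, it has to go through the same tokenizer first. One caveat worth knowing: `texts_to_sequences` silently drops any word that was not seen by `fit_on_texts`. A self-contained sketch of that preprocessing step (the test sentence and the unseen word '하루' are hypothetical):

```python
from tensorflow.keras import preprocessing

samples = ["너 오늘 이뻐 보인다", "나는 오늘 기분이 더러워",
           "끝내주는데, 좋은 일이 있나봐", "나 좋은 일이 생겼어",
           "아 오늘 진짜 짜증나", "환상적인데, 정말 좋은거 같아"]

tokenizer = preprocessing.text.Tokenizer()
tokenizer.fit_on_texts(samples)

# '하루' is not in the fitted vocabulary, so it is dropped
test_seq = tokenizer.texts_to_sequences(["오늘 하루 좋은 일이 생겼어"])
print(test_seq)  # [[1, 2, 3, 13]]
```

The resulting sequence can then be passed to `model.predict`, whose sigmoid output is near 1 for a positive sentence and near 0 for a negative one.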