University Year 4, Semester 1 (16)
printf("ho_tari\n");
Purpose / Overview
"Can real videos (originals) be clearly distinguished from fake videos (videos with a deep fake applied)?"
Cases of deep fake technology being abused for various crimes are on the rise, e.g. identity theft using deep fakes, synthesized deep fake pornography, voice-cloning ("deep voice") fraud using deep fake technology, etc.
By classifying the videos used in such crimes as real or fake, fake videos can be kept out of real-time video calls and live broadcasts, and video-upload sites can screen out fake videos before they are published, preventing crimes that abuse deep fake videos.
Data
The deep fake.. from a Kaggle Competition
Daily COVID-19 confirmed cases prediction
Let's predict the daily number of confirmed cases of COVID-19 in Spain using a recurrent neural network (RNN). Predicting the daily number of confirmed cases of COVID-19 is a regression problem.
Dataset
• Daily incidence of the 52 Spanish provinces for 450 days from January 1st, 2020 to March 27th, 2021
Q1 Let's load the dataset using the "pickle" Python mo..
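A minimal sketch of how the Q1 loading step and a small RNN regressor might look, assuming the data comes as a pickled array of shape (52 provinces, 450 days); the file name covid_spain.pkl, the 7-day window, and the training settings are illustrative assumptions, not the assignment's actual values.

import pickle
import numpy as np
from tensorflow.keras import layers, models

# Hypothetical file name; the assignment's pickle file may be named differently.
with open("covid_spain.pkl", "rb") as f:
    daily_cases = pickle.load(f)                    # assumed shape: (52, 450)
daily_cases = np.asarray(daily_cases, dtype="float32")

# Turn each province's series into (previous 7 days -> next day) samples.
window = 7
X, y = [], []
for series in daily_cases:
    for t in range(len(series) - window):
        X.append(series[t:t + window])
        y.append(series[t + window])
X = np.array(X)[..., np.newaxis]                    # (samples, timesteps, 1)
y = np.array(y)

# A small RNN for the daily-count regression problem.
model = models.Sequential([
    layers.SimpleRNN(32, input_shape=(window, 1)),
    layers.Dense(1)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=64, validation_split=0.2)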
Chest X-ray (Pneumonia) Classification problem
◦ input variable: images
◦ 1 binary output variable (pneumonia or normal)
5863 x-ray images
◦ Already split into train, validation and test.
Q1 Data preprocessing: The image sizes all vary. Thus, resizing is essential.
◦ When loading images, resize the image into [128, 128]
◦ flow_from_directory(train_dir, target_size=(128,128), batch_size=20, c..
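The flow_from_directory call in the preview is cut off; below is a minimal sketch of how it might be completed with Keras' ImageDataGenerator, assuming a chest_xray/train directory with one sub-folder per class (the paths are assumptions).

from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Assumed directory layout: chest_xray/train/NORMAL and chest_xray/train/PNEUMONIA
train_dir = "chest_xray/train"

train_datagen = ImageDataGenerator(rescale=1.0 / 255)
train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size=(128, 128),   # resize every image to 128x128 while loading
    batch_size=20,
    class_mode="binary"       # two classes: pneumonia vs. normal
)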
Breast Cancer Wisconsin Dataset
Classification problem
◦ 10 input variables
◦ 1 binary output variable (benign or malignant)
569 data samples
◦ Use the first 100 samples as the test set
◦ Use the next 100 samples as the validation set
◦ Use the others as the training set
◦ Use NumPy slicing. Do not use the "train_test_split" function.
Data Preparation
Download breast-cancer-wisconsin.data
Remove the row..
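A minimal sketch of the required NumPy-slicing split; the genfromtxt loading line is only one illustrative way to get the array, and the post's actual preprocessing (removing rows with missing values, dropping the id column) is assumed to happen before this point.

import numpy as np

# Illustrative loading step; replace with the post's own preprocessing.
raw = np.genfromtxt("breast-cancer-wisconsin.data", delimiter=",")
data, labels = raw[:, 1:-1], raw[:, -1]

# Split with plain NumPy slicing instead of train_test_split.
x_test,  y_test  = data[:100],    labels[:100]     # first 100 samples
x_val,   y_val   = data[100:200], labels[100:200]  # next 100 samples
x_train, y_train = data[200:],    labels[200:]     # remaining samples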
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import mnist
from tensorflow.keras import backend as K
import numpy as np
import matplotlib.pyplot as plt

def Conv2D(filters, kernel_size, padding="same", activation="relu"):
    return layers.Conv2D(filters, kernel_size, padding=padding, activation=activation)

class SCAE(models.Model):
    def __init__(self, org_shape=(1, 28, 28))..
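Since the SCAE class definition above is cut off, here is only a hedged sketch of how the MNIST images might be prepared for it, using the backend check that the imports suggest; the channels-first/last reshaping choice is an assumption based on org_shape=(1, 28, 28).

from tensorflow.keras.datasets import mnist
from tensorflow.keras import backend as K

# Load MNIST and add a channel axis in the layout the backend expects.
(x_train, _), (x_test, _) = mnist.load_data()
if K.image_data_format() == 'channels_first':
    x_train = x_train.reshape(-1, 1, 28, 28)
    x_test = x_test.reshape(-1, 1, 28, 28)
else:
    x_train = x_train.reshape(-1, 28, 28, 1)
    x_test = x_test.reshape(-1, 28, 28, 1)
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0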
from tensorflow.keras import layers, models

class AE(models.Model):
    def __init__(self, x_nodes=784, z_dim=36):
        x_shape = (x_nodes,)
        x = layers.Input(shape=x_shape)
        z = layers.Dense(z_dim, activation='relu')(x)
        y = layers.Dense(x_nodes, activation='sigmoid')(z)

        # Essential parts:
        super().__init__(x, y)
        self.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

        # Optional Par..
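A possible way to train the AE above on flattened MNIST digits; the epoch and batch-size values are arbitrary illustrations, not the post's settings.

from tensorflow.keras.datasets import mnist

# Flatten the 28x28 images to 784-dimensional vectors in [0, 1].
(x_train, _), (x_test, _) = mnist.load_data()
x_train = x_train.reshape(-1, 784).astype('float32') / 255.0
x_test = x_test.reshape(-1, 784).astype('float32') / 255.0

autoencoder = AE(x_nodes=784, z_dim=36)
autoencoder.fit(x_train, x_train,             # reconstruct the input itself
                epochs=10, batch_size=256,
                validation_data=(x_test, x_test))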
import os
import numpy as np

class Data():
    def __init__(self, fname, ratio):
        # Read the whole CSV file into memory.
        f = open(fname)
        data = f.read()
        f.close()

        lines = data.split("\n")
        header = lines[0].split(',')   # header row is comma-separated column names
        lines = lines[1:]

        # Drop the first column of every row and keep the numeric values.
        values = [line.split(",")[1:] for line in lines]
        self.float_data = np.array(values).astype('float32')
        self.data_length = self.float_data.shape[-1]
        self.ratio = ratio
        self.train_set_length = int(self.float_da..