  • [CNN] ResNet
    Machine Learning & Deep Learning 2022. 1. 23. 21:34

    ResNet

    • Why ResNet was introduced: as networks grew deeper, performance actually degraded (the vanishing gradient problem).
    • Key characteristics of ResNet: shortcut connections and a structure built from identity blocks.
    • Shortcut: the output of the previous layer is passed along as-is, without going through the conv layers.
    • What is ResNet? Defining H(x) = F(x) + x, it is a network built around residual learning so that the residual F(x) is what gets learned and minimized.
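
    The H(x) = F(x) + x idea maps directly onto Keras layers: F(x) is whatever stack of Conv/BN layers sits on the main path, and the shortcut simply adds the unchanged input back in. A minimal sketch of just that wiring (the layer sizes and variable names here are illustrative, not taken from the code below):

    from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, Activation, Add

    inputs = Input(shape=(56, 56, 64))              # x
    f = Conv2D(64, (3, 3), padding='same')(inputs)
    f = BatchNormalization()(f)
    f = Activation('relu')(f)
    f = Conv2D(64, (3, 3), padding='same')(f)
    f = BatchNormalization()(f)                     # F(x): the residual the block actually learns
    h = Activation('relu')(Add()([f, inputs]))      # H(x) = F(x) + x via the shortcut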

     

    ResNet Architecture

    • Composed of multiple identity blocks connected in sequence.
    • The input from the previous layer skips over the conv layers of the residual block through a skip connection and is wired directly to the output.
    • The residual block output and the original input are added (Add), and then ReLU is applied again.
    • Each residual block contains two 3x3 conv layers by default (in the basic block, when 1x1 convs are not used).
    • Within a residual block, the feature maps keep the same size and depth. After a run of consecutive residual blocks (3, 4, or 6 of them in ResNet-50, depending on the stage), the feature map size is halved and the depth is doubled, and the residual blocks continue in this pattern (see the stage summary sketched just below).
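
    Spelled out for ResNet-50, which is what the code below builds, the per-stage layout looks like the following. This is a reference sketch of the standard ResNet-50 design (the dictionary itself is not from the post), matching the conv_block()/identity_block() calls made later in create_resnet():

    # ResNet-50 stage layout after the initial 7x7 conv + max pool: number of bottleneck
    # blocks per stage, their [1x1, 3x3, 1x1] filter counts, and the spatial size of the
    # feature maps each stage produces for a 224x224 input.
    RESNET50_STAGES = {
        'stage2': {'blocks': 3, 'filters': [64, 64, 256],    'output_size': (56, 56)},
        'stage3': {'blocks': 4, 'filters': [128, 128, 512],  'output_size': (28, 28)},
        'stage4': {'blocks': 6, 'filters': [256, 256, 1024], 'output_size': (14, 14)},
        'stage5': {'blocks': 3, 'filters': [512, 512, 2048], 'output_size': (7, 7)},
    }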

     

    Implementing this structure in code

    import numpy as np
    import pandas as pd
    import os
    from tensorflow.keras.layers import Conv2D, Dense, BatchNormalization, Activation
    from tensorflow.keras.layers import add, Add
    
    # The identity block is the block whose shortcut path has no conv layer.
    def identity_block(input_tensor, middle_kernel_size, filters, stage, block):
        '''
        Function arguments
        input_tensor: the input tensor
        middle_kernel_size: size of the middle kernel. Of the conv layers inside the identity block, this is
                            the one that is not a 1x1 kernel, i.e. the 3x3 kernel. Besides 3x3, a 5x5 kernel
                            can also be specified.
        filters: the filter counts of the 3 conv layers, given as a list. The first element is the number of
                 filters of the first 1x1 conv, the second is the 3x3 filter count, and the third is the filter
                 count of the last 1x1 conv.
        stage: since many identity blocks are chained together, this distinguishes them. Identity blocks that
               share the same filter counts get the same stage.
        block: identifier that distinguishes identity blocks within the same stage.
        '''
    
        # Unpack the filter counts passed in as a list into filter1, filter2, filter3.
        # filter1 is the first 1x1 filter count, filter2 the 3x3 filter count, filter3 the last 1x1 filter count.
        filter1, filter2, filter3 = filters
        # Give each conv layer and batch normalization layer a unique name, based on the stage and block arguments.
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
    
        # Starting from input_tensor (the output of the previous layer, passed in as an argument), run the first 1x1 Conv -> Batch Norm -> ReLU.
        # The first 1x1 conv performs channel dimension reduction: filter1 is 1/4 of the channel depth of input_tensor (the input feature map).
        x = Conv2D(filters=filter1, kernel_size=(1, 1), kernel_initializer='he_normal', name=conv_name_base+'2a')(input_tensor)
        # Apply Batch Norm. The input data is 4-dimensional including the batch size (batch_size, height, width, channel depth).
        # The Batch Norm axis is 3, the axis index of the channel depth (this assumes the channel is always the last dimension).
        x = BatchNormalization(axis=3, name=bn_name_base+'2a')(x)
        # Apply ReLU activation.
        x = Activation('relu')(x)
    
        # Run the second 3x3 Conv -> Batch Norm -> ReLU.
        # Use middle_kernel_size, passed in as an identity_block() argument, so kernel sizes other than 3x3 can be configured.
        # Set padding='same' so the conv output size does not change. The filter count is the same as the previous 1x1 conv.
        x = Conv2D(filters=filter2, kernel_size=middle_kernel_size, padding='same', kernel_initializer='he_normal', name=conv_name_base+'2b')(x)
        x = BatchNormalization(axis=3, name=bn_name_base+'2b')(x)
        x = Activation('relu')(x)
    
        # Run the last 1x1 Conv -> Batch Norm. Note that ReLU is NOT applied here.
        # The filter count is restored to the channel depth of input_tensor.
        x = Conv2D(filters=filter3, kernel_size=(1, 1), kernel_initializer='he_normal', name=conv_name_base+'2c')(x)
        x = BatchNormalization(axis=3, name=bn_name_base+'2c')(x)
        # Add the residual block output and input_tensor.
        x = Add()([input_tensor, x])
        # Alternatively this could be written as x = add([x, input_tensor]).
    
        # Finally, apply the last ReLU inside the identity block.
        x = Activation('relu')(x)
    
        return x

     

    Call the identity_block() defined above and check how the identity block is laid out.

    from tensorflow.keras.layers import Input
    from tensorflow.keras.models import Model
    
    # Create an arbitrary feature map size as input_tensor.
    input_tensor = Input(shape=(56, 56, 256), name='test_input')
    # input_tensor has 256 channels. filters reduces the dimension to 1/4 of 256, then the last 1x1 conv restores it to 256.
    filters = [64, 64, 256]
    # The middle conv kernel size is 3x3.
    kernel_size = (3, 3)
    stage = 2
    block = 'a'
    
    # Call identity_block, wrap the result in a Model, and call summary() to check how the layers are composed.
    output = identity_block(input_tensor, kernel_size, filters, stage, block)
    identity_layers = Model(inputs=input_tensor, outputs=output)
    identity_layers.summary()
    Model: "model"
    __________________________________________________________________________________________________
     Layer (type)                   Output Shape         Param #     Connected to                     
    ==================================================================================================
     test_input (InputLayer)        [(None, 56, 56, 256  0           []                               
                                    )]                                                                
    
     res2a_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['test_input[0][0]']             
    
     bn2a_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation (Activation)        (None, 56, 56, 64)   0           ['bn2a_branch2a[0][0]']          
    
     res2a_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation[0][0]']             
    
     bn2a_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_1 (Activation)      (None, 56, 56, 64)   0           ['bn2a_branch2b[0][0]']          
    
     res2a_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_1[0][0]']           
    
     bn2a_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2a_branch2c[0][0]']         
     ion)                                                                                             
    
     add (Add)                      (None, 56, 56, 256)  0           ['test_input[0][0]',             
                                                                      'bn2a_branch2c[0][0]']          
    
     activation_2 (Activation)      (None, 56, 56, 256)  0           ['add[0][0]']                    
    
    ==================================================================================================
    Total params: 71,552
    Trainable params: 70,784
    Non-trainable params: 768
    __________________________________________________________________________________________________
    

     

    Chain identity blocks back to back to build one stage.

    • In the version below, the input tensor size is not halved when the feature maps are generated; the blocks still need to be set up so the input tensor size can be halved.
    • Within the same stage, the feature map size stays the same; instead, the number of filters changes inside each block.
    input_tensor = Input(shape=(56, 56, 256), name='test_input')
    x = identity_block(input_tensor, middle_kernel_size=3, filters=[64, 64, 256], stage=2, block='a')
    x = identity_block(x, middle_kernel_size=3, filters=[64, 64, 256], stage=2, block='b')
    output = identity_block(x, middle_kernel_size=3, filters=[64, 64, 256], stage=2, block='c')
    identity_layers = Model(inputs=input_tensor, outputs=output)
    identity_layers.summary()
    Model: "model_1"
    __________________________________________________________________________________________________
     Layer (type)                   Output Shape         Param #     Connected to                     
    ==================================================================================================
     test_input (InputLayer)        [(None, 56, 56, 256  0           []                               
                                    )]                                                                
    
     res2a_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['test_input[0][0]']             
    
     bn2a_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_3 (Activation)      (None, 56, 56, 64)   0           ['bn2a_branch2a[0][0]']          
    
     res2a_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_3[0][0]']           
    
     bn2a_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_4 (Activation)      (None, 56, 56, 64)   0           ['bn2a_branch2b[0][0]']          
    
     res2a_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_4[0][0]']           
    
     bn2a_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2a_branch2c[0][0]']         
     ion)                                                                                             
    
     add_1 (Add)                    (None, 56, 56, 256)  0           ['test_input[0][0]',             
                                                                      'bn2a_branch2c[0][0]']          
    
     activation_5 (Activation)      (None, 56, 56, 256)  0           ['add_1[0][0]']                  
    
     res2b_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['activation_5[0][0]']           
    
     bn2b_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_6 (Activation)      (None, 56, 56, 64)   0           ['bn2b_branch2a[0][0]']          
    
     res2b_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_6[0][0]']           
    
     bn2b_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_7 (Activation)      (None, 56, 56, 64)   0           ['bn2b_branch2b[0][0]']          
    
     res2b_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_7[0][0]']           
    
     bn2b_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_2 (Add)                    (None, 56, 56, 256)  0           ['activation_5[0][0]',           
                                                                      'bn2b_branch2c[0][0]']          
    
     activation_8 (Activation)      (None, 56, 56, 256)  0           ['add_2[0][0]']                  
    
     res2c_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['activation_8[0][0]']           
    
     bn2c_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_9 (Activation)      (None, 56, 56, 64)   0           ['bn2c_branch2a[0][0]']          
    
     res2c_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_9[0][0]']           
    
     bn2c_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_10 (Activation)     (None, 56, 56, 64)   0           ['bn2c_branch2b[0][0]']          
    
     res2c_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_10[0][0]']          
    
     bn2c_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_3 (Add)                    (None, 56, 56, 256)  0           ['activation_8[0][0]',           
                                                                      'bn2c_branch2c[0][0]']          
    
     activation_11 (Activation)     (None, 56, 56, 256)  0           ['add_3[0][0]']                  
    
    ==================================================================================================
    Total params: 214,656
    Trainable params: 212,352
    Non-trainable params: 2,304
    __________________________________________________________________________________________________
    

     

    Write a conv_block() function that creates the block that halves the input feature map size in the first block of each stage.

    • The conv_block() function is almost identical to the identity_block() function implemented above, except that it halves the input feature map size and applies a 1x1 conv with stride 2 on the shortcut path.
    • However, in the first block of the first stage the input feature map has already been halved by max pooling, so it is not reduced again.
    def conv_block(input_tensor, middle_kernel_size, filters, stage, block, strides=(2, 2)):
        '''
        Function arguments
        input_tensor: the input tensor
        middle_kernel_size: size of the middle kernel. Of the conv layers inside the block, this is the one
                            that is not a 1x1 kernel, i.e. the 3x3 kernel. Besides 3x3, a 5x5 kernel can also
                            be specified.
        filters: the filter counts of the 3 conv layers, given as a list. The first element is the number of
                 filters of the first 1x1 conv, the second is the 3x3 filter count, and the third is the
                 filter count of the last 1x1 conv.
        stage: since many blocks are chained together, this distinguishes them. Blocks that share the same
               filter counts get the same stage.
        block: identifier that distinguishes blocks within the same stage.
        strides: used to halve the input feature map size. The default is 2, but the first block of the first
                 stage must be called with strides of 1, since its input feature map has already been halved
                 by max pooling and should not be reduced again.
        '''
    
        # Unpack the filter counts passed in as a list into filter1, filter2, filter3.
        # filter1 is the first 1x1 filter count, filter2 the 3x3 filter count, filter3 the last 1x1 filter count.
        filter1, filter2, filter3 = filters
        # Give each conv layer and batch normalization layer a unique name, based on the stage and block arguments.
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
    
        # Starting from input_tensor (the output of the previous layer, passed in as an argument), run the first 1x1 Conv -> Batch Norm -> ReLU.
        # Pass the strides argument here to halve the input feature map size.
        x = Conv2D(filters=filter1, kernel_size=(1, 1), strides=strides, kernel_initializer='he_normal', name=conv_name_base+'2a')(input_tensor)
        # Apply Batch Norm. The input data is 4-dimensional including the batch size (batch_size, height, width, channel depth).
        # The Batch Norm axis is 3, the axis index of the channel depth (this assumes the channel is always the last dimension).
        x = BatchNormalization(axis=3, name=bn_name_base+'2a')(x)
        # Apply ReLU activation.
        x = Activation('relu')(x)
    
        # Run the second 3x3 Conv -> Batch Norm -> ReLU.
        # Use middle_kernel_size, passed in as an argument, so kernel sizes other than 3x3 can be configured.
        # Set padding='same' so the conv output size does not change. The filter count is the same as the previous 1x1 conv.
        x = Conv2D(filters=filter2, kernel_size=middle_kernel_size, padding='same', kernel_initializer='he_normal', name=conv_name_base+'2b')(x)
        x = BatchNormalization(axis=3, name=bn_name_base+'2b')(x)
        x = Activation('relu')(x)
    
        # Run the last 1x1 Conv -> Batch Norm. Note that ReLU is NOT applied here.
        # The filter count is restored to the output channel depth of the block.
        x = Conv2D(filters=filter3, kernel_size=(1, 1), kernel_initializer='he_normal', name=conv_name_base+'2c')(x)
        x = BatchNormalization(axis=3, name=bn_name_base+'2c')(x)
    
        # Apply a 1x1 conv (with the same strides) to the shortcut so its size and channel count (filter3) match the block output.
        shortcut = Conv2D(filter3, (1, 1), strides=strides, kernel_initializer='he_normal', name=conv_name_base+'1')(input_tensor)
        shortcut = BatchNormalization(axis=3, name=bn_name_base+'1')(shortcut)
    
        # Add the residual block output and the 1x1-conv shortcut.
        x = add([x, shortcut])
    
        # Finally, apply the last ReLU inside the block.
        x = Activation('relu')(x)
    
        return x
    
    input_tensor = Input(shape=(56, 56, 256), name='test_input')
    # Call conv_block() with strides=2 to halve the input feature map size. strides=1 would keep the size unchanged.
    x = conv_block(input_tensor, middle_kernel_size=3, filters=[64, 64, 256], strides=2, stage=2, block='a')
    x = identity_block(x, middle_kernel_size=3, filters=[64, 64, 256], stage=2, block='b')
    output = identity_block(x, middle_kernel_size=3, filters=[64, 64, 256], stage=2, block='c')
    identity_layers = Model(inputs=input_tensor, outputs=output)
    identity_layers.summary()
    Model: "model_3"
    __________________________________________________________________________________________________
     Layer (type)                   Output Shape         Param #     Connected to                     
    ==================================================================================================
     test_input (InputLayer)        [(None, 56, 56, 256  0           []                               
                                    )]                                                                
    
     res2a_branch2a (Conv2D)        (None, 28, 28, 64)   16448       ['test_input[0][0]']             
    
     bn2a_branch2a (BatchNormalizat  (None, 28, 28, 64)  256         ['res2a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_15 (Activation)     (None, 28, 28, 64)   0           ['bn2a_branch2a[0][0]']          
    
     res2a_branch2b (Conv2D)        (None, 28, 28, 64)   36928       ['activation_15[0][0]']          
    
     bn2a_branch2b (BatchNormalizat  (None, 28, 28, 64)  256         ['res2a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_16 (Activation)     (None, 28, 28, 64)   0           ['bn2a_branch2b[0][0]']          
    
     res2a_branch2c (Conv2D)        (None, 28, 28, 256)  16640       ['activation_16[0][0]']          
    
     res2a_branch1 (Conv2D)         (None, 28, 28, 256)  65792       ['test_input[0][0]']             
    
     bn2a_branch2c (BatchNormalizat  (None, 28, 28, 256)  1024       ['res2a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn2a_branch1 (BatchNormalizati  (None, 28, 28, 256)  1024       ['res2a_branch1[0][0]']          
     on)                                                                                              
    
     add_4 (Add)                    (None, 28, 28, 256)  0           ['bn2a_branch2c[0][0]',          
                                                                      'bn2a_branch1[0][0]']           
    
     activation_17 (Activation)     (None, 28, 28, 256)  0           ['add_4[0][0]']                  
    
     res2b_branch2a (Conv2D)        (None, 28, 28, 64)   16448       ['activation_17[0][0]']          
    
     bn2b_branch2a (BatchNormalizat  (None, 28, 28, 64)  256         ['res2b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_18 (Activation)     (None, 28, 28, 64)   0           ['bn2b_branch2a[0][0]']          
    
     res2b_branch2b (Conv2D)        (None, 28, 28, 64)   36928       ['activation_18[0][0]']          
    
     bn2b_branch2b (BatchNormalizat  (None, 28, 28, 64)  256         ['res2b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_19 (Activation)     (None, 28, 28, 64)   0           ['bn2b_branch2b[0][0]']          
    
     res2b_branch2c (Conv2D)        (None, 28, 28, 256)  16640       ['activation_19[0][0]']          
    
     bn2b_branch2c (BatchNormalizat  (None, 28, 28, 256)  1024       ['res2b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_5 (Add)                    (None, 28, 28, 256)  0           ['activation_17[0][0]',          
                                                                      'bn2b_branch2c[0][0]']          
    
     activation_20 (Activation)     (None, 28, 28, 256)  0           ['add_5[0][0]']                  
    
     res2c_branch2a (Conv2D)        (None, 28, 28, 64)   16448       ['activation_20[0][0]']          
    
     bn2c_branch2a (BatchNormalizat  (None, 28, 28, 64)  256         ['res2c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_21 (Activation)     (None, 28, 28, 64)   0           ['bn2c_branch2a[0][0]']          
    
     res2c_branch2b (Conv2D)        (None, 28, 28, 64)   36928       ['activation_21[0][0]']          
    
     bn2c_branch2b (BatchNormalizat  (None, 28, 28, 64)  256         ['res2c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_22 (Activation)     (None, 28, 28, 64)   0           ['bn2c_branch2b[0][0]']          
    
     res2c_branch2c (Conv2D)        (None, 28, 28, 256)  16640       ['activation_22[0][0]']          
    
     bn2c_branch2c (BatchNormalizat  (None, 28, 28, 256)  1024       ['res2c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_6 (Add)                    (None, 28, 28, 256)  0           ['activation_20[0][0]',          
                                                                      'bn2c_branch2c[0][0]']          
    
     activation_23 (Activation)     (None, 28, 28, 256)  0           ['add_6[0][0]']                  
    
    ==================================================================================================
    Total params: 281,472
    Trainable params: 278,656
    Non-trainable params: 2,816
    __________________________________________________________________________________________________
    

     

    Implement the logic that converts the input image with a 7x7 conv and then applies max pooling as a separate function.

    • O = (I - F + 2P)/S + 1, where I is the input size, F is the filter's kernel size, P is the padding, and S is the stride.
    • Without padding, (224 - 7)/2 + 1 = 109.5, which truncates to 109. Therefore ZeroPadding2D((3, 3)) is applied first so the output comes out as 112x112.
    • MaxPooling is then applied to the 112x112 map with a (3, 3) pool size and stride 2, so ZeroPadding2D((1, 1)) is applied to produce a 56x56 output (the quick check below verifies these numbers).
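
    To double-check those sizes, here is a small helper that simply evaluates O = floor((I - F + 2P)/S) + 1. The function name and the check itself are my addition, not part of the original post.

    def conv_output_size(i, f, p=0, s=1):
        # O = (I - F + 2P) / S + 1, floored when the division is not exact.
        return (i - f + 2 * p) // s + 1

    print(conv_output_size(224, 7, p=0, s=2))   # 109 -> too small, so zero-pad first
    print(conv_output_size(224, 7, p=3, s=2))   # 112 after ZeroPadding2D((3, 3)) + 7x7 conv, stride 2
    print(conv_output_size(112, 3, p=1, s=2))   # 56 after ZeroPadding2D((1, 1)) + 3x3 max pool, stride 2

    The post's do_first_conv() implementation follows.
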
    from tensorflow.keras.layers import ZeroPadding2D, MaxPooling2D
    
    def do_first_conv(input_tensor):
        # Run a 7x7 conv to create feature maps at half the size of input_tensor (the image size), with 64 filters.
        # Apply zero padding so the 7x7 conv with strides=2 turns the 224x224 input into a 112x112 output.
        x = ZeroPadding2D(padding=(3, 3), name='conv1_pad')(input_tensor)
        x = Conv2D(64, (7, 7), strides=(2, 2), padding='valid', kernel_initializer='he_normal', name='conv')(x)
        x = BatchNormalization(axis=3, name='bn_conv1')(x)
        x = Activation('relu')(x)
        # Halve the feature map size again with MaxPooling. Apply zero padding so the output comes out as 56x56.
        x = ZeroPadding2D(padding=(1, 1), name='pool1_pad')(x)
        x = MaxPooling2D((3, 3), strides=(2, 2))(x)
    
        return x
    
    input_tensor = Input(shape=(224, 224, 3))
    output = do_first_conv(input_tensor)
    model = Model(inputs=input_tensor, outputs=output)
    model.summary()
    Model: "model_2"
    _________________________________________________________________
     Layer (type)                Output Shape              Param #   
    =================================================================
     input_1 (InputLayer)        [(None, 224, 224, 3)]     0         
    
     conv1_pad (ZeroPadding2D)   (None, 230, 230, 3)       0         
    
     conv (Conv2D)               (None, 112, 112, 64)      9472      
    
     bn_conv1 (BatchNormalizatio  (None, 112, 112, 64)     256       
     n)                                                              
    
     activation_12 (Activation)  (None, 112, 112, 64)      0         
    
     pool1_pad (ZeroPadding2D)   (None, 114, 114, 64)      0         
    
     max_pooling2d (MaxPooling2D  (None, 56, 56, 64)       0         
     )                                                               
    
    =================================================================
    Total params: 9,728
    Trainable params: 9,600
    Non-trainable params: 128
    _________________________________________________________________
    

     

    Create the ResNet-50 model.

    • Call the conv_block() and identity_block() functions created above to build the ResNet-50 model.
    from tensorflow.keras.models import Model
    from tensorflow.keras.layers import Input, Dense, Conv2D, Dropout, Flatten, Activation, MaxPooling2D, GlobalAveragePooling2D
    from tensorflow.keras.optimizers import Adam, RMSprop
    from tensorflow.keras.layers import BatchNormalization
    from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping, ModelCheckpoint, LearningRateScheduler
    
    def create_resnet(in_shape=(224, 224, 3), n_classes=10):
        input_tensor = Input(shape=in_shape)
    
        # Apply the first 7x7 conv and max pooling.
        x = do_first_conv(input_tensor)
    
        # Create the conv_block and identity blocks of stage 2. The first conv_block of stage 2 uses strides of 1 so the size is not reduced.
        x = conv_block(x, 3, [64, 64, 256], stage=2, block='a', strides=(1, 1))
        x = identity_block(x, 3, [64, 64, 256], stage=2, block='b')
        x = identity_block(x, 3, [64, 64, 256], stage=2, block='c')
    
        # Create the conv_block and identity blocks of stage 3. The first conv_block of stage 3 uses the default strides of 2 to halve the size.
        x = conv_block(x, 3, [128, 128, 512], stage=3, block='a')
        x = identity_block(x, 3, [128, 128, 512], stage=3, block='b')
        x = identity_block(x, 3, [128, 128, 512], stage=3, block='c')
        x = identity_block(x, 3, [128, 128, 512], stage=3, block='d')
    
        # Create the conv_block and identity blocks of stage 4. The first conv_block of stage 4 uses the default strides of 2 to halve the size.
        x = conv_block(x, 3, [256, 256, 1024], stage=4, block='a')
        x = identity_block(x, 3, [256, 256, 1024], stage=4, block='b')
        x = identity_block(x, 3, [256, 256, 1024], stage=4, block='c')
        x = identity_block(x, 3, [256, 256, 1024], stage=4, block='d')
        x = identity_block(x, 3, [256, 256, 1024], stage=4, block='e')
        x = identity_block(x, 3, [256, 256, 1024], stage=4, block='f')
    
        # Create the conv_block and identity blocks of stage 5. The first conv_block of stage 5 uses the default strides of 2 to halve the size.
        x = conv_block(x, 3, [512, 512, 2048], stage=5, block='a')
        x = identity_block(x, 3, [512, 512, 2048], stage=5, block='b')
        x = identity_block(x, 3, [512, 512, 2048], stage=5, block='c')
    
        # Apply GlobalAveragePooling before connecting to the classification dense layers.
        x = GlobalAveragePooling2D(name='avg_pool')(x)
        x = Dropout(rate=0.5)(x)
        x = Dense(200, activation='relu', name='fc_01')(x)
        x = Dropout(rate=0.5)(x)
        output = Dense(n_classes, activation='softmax', name='fc_final')(x)
    
        model = Model(inputs=input_tensor, outputs=output, name='resnet50')
        model.summary()
    
        return model
    model = create_resnet(in_shape=(224,224,3), n_classes=10)
    Model: "resnet50"
    __________________________________________________________________________________________________
     Layer (type)                   Output Shape         Param #     Connected to                     
    ==================================================================================================
     input_2 (InputLayer)           [(None, 224, 224, 3  0           []                               
                                    )]                                                                
    
     conv1_pad (ZeroPadding2D)      (None, 230, 230, 3)  0           ['input_2[0][0]']                
    
     conv (Conv2D)                  (None, 112, 112, 64  9472        ['conv1_pad[0][0]']              
                                    )                                                                 
    
     bn_conv1 (BatchNormalization)  (None, 112, 112, 64  256         ['conv[0][0]']                   
                                    )                                                                 
    
     activation_22 (Activation)     (None, 112, 112, 64  0           ['bn_conv1[0][0]']               
                                    )                                                                 
    
     pool1_pad (ZeroPadding2D)      (None, 114, 114, 64  0           ['activation_22[0][0]']          
                                    )                                                                 
    
     max_pooling2d_1 (MaxPooling2D)  (None, 56, 56, 64)  0           ['pool1_pad[0][0]']              
    
     res2a_branch2a (Conv2D)        (None, 56, 56, 64)   4160        ['max_pooling2d_1[0][0]']        
    
     bn2a_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_23 (Activation)     (None, 56, 56, 64)   0           ['bn2a_branch2a[0][0]']          
    
     res2a_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_23[0][0]']          
    
     bn2a_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_24 (Activation)     (None, 56, 56, 64)   0           ['bn2a_branch2b[0][0]']          
    
     res2a_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_24[0][0]']          
    
     res2a_branch1 (Conv2D)         (None, 56, 56, 256)  16640       ['max_pooling2d_1[0][0]']        
    
     bn2a_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn2a_branch1 (BatchNormalizati  (None, 56, 56, 256)  1024       ['res2a_branch1[0][0]']          
     on)                                                                                              
    
     add_7 (Add)                    (None, 56, 56, 256)  0           ['bn2a_branch2c[0][0]',          
                                                                      'bn2a_branch1[0][0]']           
    
     activation_25 (Activation)     (None, 56, 56, 256)  0           ['add_7[0][0]']                  
    
     res2b_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['activation_25[0][0]']          
    
     bn2b_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_26 (Activation)     (None, 56, 56, 64)   0           ['bn2b_branch2a[0][0]']          
    
     res2b_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_26[0][0]']          
    
     bn2b_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_27 (Activation)     (None, 56, 56, 64)   0           ['bn2b_branch2b[0][0]']          
    
     res2b_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_27[0][0]']          
    
     bn2b_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_8 (Add)                    (None, 56, 56, 256)  0           ['activation_25[0][0]',          
                                                                      'bn2b_branch2c[0][0]']          
    
     activation_28 (Activation)     (None, 56, 56, 256)  0           ['add_8[0][0]']                  
    
     res2c_branch2a (Conv2D)        (None, 56, 56, 64)   16448       ['activation_28[0][0]']          
    
     bn2c_branch2a (BatchNormalizat  (None, 56, 56, 64)  256         ['res2c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_29 (Activation)     (None, 56, 56, 64)   0           ['bn2c_branch2a[0][0]']          
    
     res2c_branch2b (Conv2D)        (None, 56, 56, 64)   36928       ['activation_29[0][0]']          
    
     bn2c_branch2b (BatchNormalizat  (None, 56, 56, 64)  256         ['res2c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_30 (Activation)     (None, 56, 56, 64)   0           ['bn2c_branch2b[0][0]']          
    
     res2c_branch2c (Conv2D)        (None, 56, 56, 256)  16640       ['activation_30[0][0]']          
    
     bn2c_branch2c (BatchNormalizat  (None, 56, 56, 256)  1024       ['res2c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_9 (Add)                    (None, 56, 56, 256)  0           ['activation_28[0][0]',          
                                                                      'bn2c_branch2c[0][0]']          
    
     activation_31 (Activation)     (None, 56, 56, 256)  0           ['add_9[0][0]']                  
    
     res3a_branch2a (Conv2D)        (None, 28, 28, 128)  32896       ['activation_31[0][0]']          
    
     bn3a_branch2a (BatchNormalizat  (None, 28, 28, 128)  512        ['res3a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_32 (Activation)     (None, 28, 28, 128)  0           ['bn3a_branch2a[0][0]']          
    
     res3a_branch2b (Conv2D)        (None, 28, 28, 128)  147584      ['activation_32[0][0]']          
    
     bn3a_branch2b (BatchNormalizat  (None, 28, 28, 128)  512        ['res3a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_33 (Activation)     (None, 28, 28, 128)  0           ['bn3a_branch2b[0][0]']          
    
     res3a_branch2c (Conv2D)        (None, 28, 28, 512)  66048       ['activation_33[0][0]']          
    
     res3a_branch1 (Conv2D)         (None, 28, 28, 512)  131584      ['activation_31[0][0]']          
    
     bn3a_branch2c (BatchNormalizat  (None, 28, 28, 512)  2048       ['res3a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn3a_branch1 (BatchNormalizati  (None, 28, 28, 512)  2048       ['res3a_branch1[0][0]']          
     on)                                                                                              
    
     add_10 (Add)                   (None, 28, 28, 512)  0           ['bn3a_branch2c[0][0]',          
                                                                      'bn3a_branch1[0][0]']           
    
     activation_34 (Activation)     (None, 28, 28, 512)  0           ['add_10[0][0]']                 
    
     res3b_branch2a (Conv2D)        (None, 28, 28, 128)  65664       ['activation_34[0][0]']          
    
     bn3b_branch2a (BatchNormalizat  (None, 28, 28, 128)  512        ['res3b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_35 (Activation)     (None, 28, 28, 128)  0           ['bn3b_branch2a[0][0]']          
    
     res3b_branch2b (Conv2D)        (None, 28, 28, 128)  147584      ['activation_35[0][0]']          
    
     bn3b_branch2b (BatchNormalizat  (None, 28, 28, 128)  512        ['res3b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_36 (Activation)     (None, 28, 28, 128)  0           ['bn3b_branch2b[0][0]']          
    
     res3b_branch2c (Conv2D)        (None, 28, 28, 512)  66048       ['activation_36[0][0]']          
    
     bn3b_branch2c (BatchNormalizat  (None, 28, 28, 512)  2048       ['res3b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_11 (Add)                   (None, 28, 28, 512)  0           ['activation_34[0][0]',          
                                                                      'bn3b_branch2c[0][0]']          
    
     activation_37 (Activation)     (None, 28, 28, 512)  0           ['add_11[0][0]']                 
    
     res3c_branch2a (Conv2D)        (None, 28, 28, 128)  65664       ['activation_37[0][0]']          
    
     bn3c_branch2a (BatchNormalizat  (None, 28, 28, 128)  512        ['res3c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_38 (Activation)     (None, 28, 28, 128)  0           ['bn3c_branch2a[0][0]']          
    
     res3c_branch2b (Conv2D)        (None, 28, 28, 128)  147584      ['activation_38[0][0]']          
    
     bn3c_branch2b (BatchNormalizat  (None, 28, 28, 128)  512        ['res3c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_39 (Activation)     (None, 28, 28, 128)  0           ['bn3c_branch2b[0][0]']          
    
     res3c_branch2c (Conv2D)        (None, 28, 28, 512)  66048       ['activation_39[0][0]']          
    
     bn3c_branch2c (BatchNormalizat  (None, 28, 28, 512)  2048       ['res3c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_12 (Add)                   (None, 28, 28, 512)  0           ['activation_37[0][0]',          
                                                                      'bn3c_branch2c[0][0]']          
    
     activation_40 (Activation)     (None, 28, 28, 512)  0           ['add_12[0][0]']                 
    
     res3d_branch2a (Conv2D)        (None, 28, 28, 128)  65664       ['activation_40[0][0]']          
    
     bn3d_branch2a (BatchNormalizat  (None, 28, 28, 128)  512        ['res3d_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_41 (Activation)     (None, 28, 28, 128)  0           ['bn3d_branch2a[0][0]']          
    
     res3d_branch2b (Conv2D)        (None, 28, 28, 128)  147584      ['activation_41[0][0]']          
    
     bn3d_branch2b (BatchNormalizat  (None, 28, 28, 128)  512        ['res3d_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_42 (Activation)     (None, 28, 28, 128)  0           ['bn3d_branch2b[0][0]']          
    
     res3d_branch2c (Conv2D)        (None, 28, 28, 512)  66048       ['activation_42[0][0]']          
    
     bn3d_branch2c (BatchNormalizat  (None, 28, 28, 512)  2048       ['res3d_branch2c[0][0]']         
     ion)                                                                                             
    
     add_13 (Add)                   (None, 28, 28, 512)  0           ['activation_40[0][0]',          
                                                                      'bn3d_branch2c[0][0]']          
    
     activation_43 (Activation)     (None, 28, 28, 512)  0           ['add_13[0][0]']                 
    
     res4a_branch2a (Conv2D)        (None, 14, 14, 256)  131328      ['activation_43[0][0]']          
    
     bn4a_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_44 (Activation)     (None, 14, 14, 256)  0           ['bn4a_branch2a[0][0]']          
    
     res4a_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_44[0][0]']          
    
     bn4a_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_45 (Activation)     (None, 14, 14, 256)  0           ['bn4a_branch2b[0][0]']          
    
     res4a_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_45[0][0]']          
                                    )                                                                 
    
     res4a_branch1 (Conv2D)         (None, 14, 14, 1024  525312      ['activation_43[0][0]']          
                                    )                                                                 
    
     bn4a_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4a_branch2c[0][0]']         
     ion)                           )                                                                 
    
     bn4a_branch1 (BatchNormalizati  (None, 14, 14, 1024  4096       ['res4a_branch1[0][0]']          
     on)                            )                                                                 
    
     add_14 (Add)                   (None, 14, 14, 1024  0           ['bn4a_branch2c[0][0]',          
                                    )                                 'bn4a_branch1[0][0]']           
    
     activation_46 (Activation)     (None, 14, 14, 1024  0           ['add_14[0][0]']                 
                                    )                                                                 
    
     res4b_branch2a (Conv2D)        (None, 14, 14, 256)  262400      ['activation_46[0][0]']          
    
     bn4b_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_47 (Activation)     (None, 14, 14, 256)  0           ['bn4b_branch2a[0][0]']          
    
     res4b_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_47[0][0]']          
    
     bn4b_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_48 (Activation)     (None, 14, 14, 256)  0           ['bn4b_branch2b[0][0]']          
    
     res4b_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_48[0][0]']          
                                    )                                                                 
    
     bn4b_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4b_branch2c[0][0]']         
     ion)                           )                                                                 
    
     add_15 (Add)                   (None, 14, 14, 1024  0           ['activation_46[0][0]',          
                                    )                                 'bn4b_branch2c[0][0]']          
    
     activation_49 (Activation)     (None, 14, 14, 1024  0           ['add_15[0][0]']                 
                                    )                                                                 
    
     res4c_branch2a (Conv2D)        (None, 14, 14, 256)  262400      ['activation_49[0][0]']          
    
     bn4c_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_50 (Activation)     (None, 14, 14, 256)  0           ['bn4c_branch2a[0][0]']          
    
     res4c_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_50[0][0]']          
    
     bn4c_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_51 (Activation)     (None, 14, 14, 256)  0           ['bn4c_branch2b[0][0]']          
    
     res4c_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_51[0][0]']          
                                    )                                                                 
    
     bn4c_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4c_branch2c[0][0]']         
     ion)                           )                                                                 
    
     add_16 (Add)                   (None, 14, 14, 1024  0           ['activation_49[0][0]',          
                                    )                                 'bn4c_branch2c[0][0]']          
    
     activation_52 (Activation)     (None, 14, 14, 1024  0           ['add_16[0][0]']                 
                                    )                                                                 
    
     res4d_branch2a (Conv2D)        (None, 14, 14, 256)  262400      ['activation_52[0][0]']          
    
     bn4d_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4d_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_53 (Activation)     (None, 14, 14, 256)  0           ['bn4d_branch2a[0][0]']          
    
     res4d_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_53[0][0]']          
    
     bn4d_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4d_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_54 (Activation)     (None, 14, 14, 256)  0           ['bn4d_branch2b[0][0]']          
    
     res4d_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_54[0][0]']          
                                    )                                                                 
    
     bn4d_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4d_branch2c[0][0]']         
     ion)                           )                                                                 
    
     add_17 (Add)                   (None, 14, 14, 1024  0           ['activation_52[0][0]',          
                                    )                                 'bn4d_branch2c[0][0]']          
    
     activation_55 (Activation)     (None, 14, 14, 1024  0           ['add_17[0][0]']                 
                                    )                                                                 
    
     res4e_branch2a (Conv2D)        (None, 14, 14, 256)  262400      ['activation_55[0][0]']          
    
     bn4e_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4e_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_56 (Activation)     (None, 14, 14, 256)  0           ['bn4e_branch2a[0][0]']          
    
     res4e_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_56[0][0]']          
    
     bn4e_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4e_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_57 (Activation)     (None, 14, 14, 256)  0           ['bn4e_branch2b[0][0]']          
    
     res4e_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_57[0][0]']          
                                    )                                                                 
    
     bn4e_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4e_branch2c[0][0]']         
     ion)                           )                                                                 
    
     add_18 (Add)                   (None, 14, 14, 1024  0           ['activation_55[0][0]',          
                                    )                                 'bn4e_branch2c[0][0]']          
    
     activation_58 (Activation)     (None, 14, 14, 1024  0           ['add_18[0][0]']                 
                                    )                                                                 
    
     res4f_branch2a (Conv2D)        (None, 14, 14, 256)  262400      ['activation_58[0][0]']          
    
     bn4f_branch2a (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4f_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_59 (Activation)     (None, 14, 14, 256)  0           ['bn4f_branch2a[0][0]']          
    
     res4f_branch2b (Conv2D)        (None, 14, 14, 256)  590080      ['activation_59[0][0]']          
    
     bn4f_branch2b (BatchNormalizat  (None, 14, 14, 256)  1024       ['res4f_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_60 (Activation)     (None, 14, 14, 256)  0           ['bn4f_branch2b[0][0]']          
    
     res4f_branch2c (Conv2D)        (None, 14, 14, 1024  263168      ['activation_60[0][0]']          
                                    )                                                                 
    
     bn4f_branch2c (BatchNormalizat  (None, 14, 14, 1024  4096       ['res4f_branch2c[0][0]']         
     ion)                           )                                                                 
    
     add_19 (Add)                   (None, 14, 14, 1024  0           ['activation_58[0][0]',          
                                    )                                 'bn4f_branch2c[0][0]']          
    
     activation_61 (Activation)     (None, 14, 14, 1024  0           ['add_19[0][0]']                 
                                    )                                                                 
    
     res5a_branch2a (Conv2D)        (None, 7, 7, 512)    524800      ['activation_61[0][0]']          
    
     bn5a_branch2a (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_62 (Activation)     (None, 7, 7, 512)    0           ['bn5a_branch2a[0][0]']          
    
     res5a_branch2b (Conv2D)        (None, 7, 7, 512)    2359808     ['activation_62[0][0]']          
    
     bn5a_branch2b (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_63 (Activation)     (None, 7, 7, 512)    0           ['bn5a_branch2b[0][0]']          
    
     res5a_branch2c (Conv2D)        (None, 7, 7, 2048)   1050624     ['activation_63[0][0]']          
    
     res5a_branch1 (Conv2D)         (None, 7, 7, 2048)   2099200     ['activation_61[0][0]']          
    
     bn5a_branch2c (BatchNormalizat  (None, 7, 7, 2048)  8192        ['res5a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn5a_branch1 (BatchNormalizati  (None, 7, 7, 2048)  8192        ['res5a_branch1[0][0]']          
     on)                                                                                              
    
     add_20 (Add)                   (None, 7, 7, 2048)   0           ['bn5a_branch2c[0][0]',          
                                                                      'bn5a_branch1[0][0]']           
    
     activation_64 (Activation)     (None, 7, 7, 2048)   0           ['add_20[0][0]']                 
    
     res5b_branch2a (Conv2D)        (None, 7, 7, 512)    1049088     ['activation_64[0][0]']          
    
     bn5b_branch2a (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_65 (Activation)     (None, 7, 7, 512)    0           ['bn5b_branch2a[0][0]']          
    
     res5b_branch2b (Conv2D)        (None, 7, 7, 512)    2359808     ['activation_65[0][0]']          
    
     bn5b_branch2b (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_66 (Activation)     (None, 7, 7, 512)    0           ['bn5b_branch2b[0][0]']          
    
     res5b_branch2c (Conv2D)        (None, 7, 7, 2048)   1050624     ['activation_66[0][0]']          
    
     bn5b_branch2c (BatchNormalizat  (None, 7, 7, 2048)  8192        ['res5b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_21 (Add)                   (None, 7, 7, 2048)   0           ['activation_64[0][0]',          
                                                                      'bn5b_branch2c[0][0]']          
    
     activation_67 (Activation)     (None, 7, 7, 2048)   0           ['add_21[0][0]']                 
    
     res5c_branch2a (Conv2D)        (None, 7, 7, 512)    1049088     ['activation_67[0][0]']          
    
     bn5c_branch2a (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_68 (Activation)     (None, 7, 7, 512)    0           ['bn5c_branch2a[0][0]']          
    
     res5c_branch2b (Conv2D)        (None, 7, 7, 512)    2359808     ['activation_68[0][0]']          
    
     bn5c_branch2b (BatchNormalizat  (None, 7, 7, 512)   2048        ['res5c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_69 (Activation)     (None, 7, 7, 512)    0           ['bn5c_branch2b[0][0]']          
    
     res5c_branch2c (Conv2D)        (None, 7, 7, 2048)   1050624     ['activation_69[0][0]']          
    
     bn5c_branch2c (BatchNormalizat  (None, 7, 7, 2048)  8192        ['res5c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_22 (Add)                   (None, 7, 7, 2048)   0           ['activation_67[0][0]',          
                                                                      'bn5c_branch2c[0][0]']          
    
     activation_70 (Activation)     (None, 7, 7, 2048)   0           ['add_22[0][0]']                 
    
     avg_pool (GlobalAveragePooling  (None, 2048)        0           ['activation_70[0][0]']          
     2D)                                                                                              
    
     dropout (Dropout)              (None, 2048)         0           ['avg_pool[0][0]']               
    
     fc_01 (Dense)                  (None, 200)          409800      ['dropout[0][0]']                
    
     dropout_1 (Dropout)            (None, 200)          0           ['fc_01[0][0]']                  
    
     fc_final (Dense)               (None, 10)           2010        ['dropout_1[0][0]']              
    
    ==================================================================================================
    Total params: 23,999,522
    Trainable params: 23,946,402
    Non-trainable params: 53,120
    __________________________________________________________________________________________________
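
    As a sanity check on the layer layout, the hand-built model can be compared against the ResNet50 that ships with tensorflow.keras.applications. This comparison is my own addition, not from the post; the parameter totals will not match exactly because the built-in model uses a 1000-class ImageNet head while the model above uses a 200-unit hidden layer and a 10-class output.

    from tensorflow.keras.applications import ResNet50

    reference = ResNet50(weights=None, input_shape=(224, 224, 3))
    print('custom model      :', model.count_params())
    print('keras.applications:', reference.count_params())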
    

     

    Train the ResNet model on the CIFAR10 dataset and test its performance

    IMAGE_SIZE = 128
    BATCH_SIZE = 64

     

    Declaring the (previously used) data preprocessing/encoding/scaling functions and the CIFAR_Dataset class

    import random as python_random
    from tensorflow.keras.utils import to_categorical
    from sklearn.model_selection import train_test_split
    from tensorflow.keras.datasets import cifar10
    from tensorflow.keras.utils import Sequence
    import cv2
    import sklearn
    
    def zero_one_scaler(image):
        return image/255.0
    
    def get_preprocessed_ohe(images, labels, pre_func=None):
        # If a preprocessing function is given, use it to scale the image array.
        if pre_func is not None:
            images = pre_func(images)
        # Apply one-hot encoding to the labels.
        oh_labels = to_categorical(labels)
        return images, oh_labels
    
    # Apply preprocessing and OHE to the train/validation/test sets and return them.
    def get_train_valid_test_set(train_images, train_labels, test_images, test_labels, valid_size=0.15, random_state=2021):
        # Apply OHE to the train and test sets (pixel scaling is deferred to the Dataset's pre_func below).
        train_images, train_oh_labels = get_preprocessed_ohe(train_images, train_labels)
        test_images, test_oh_labels = get_preprocessed_ohe(test_images, test_labels)
    
        # Split the training data again into a training set and a validation set.
        tr_images, val_images, tr_oh_labels, val_oh_labels = train_test_split(train_images, train_oh_labels, test_size=valid_size, random_state=random_state)
    
        return (tr_images, tr_oh_labels), (val_images, val_oh_labels), (test_images, test_oh_labels)
    
    # The input arguments images_array and labels are both numpy arrays.
    # images_array holds the full set of original 32x32 images.
    class CIFAR_Dataset(Sequence):
        def __init__(self, images_array, labels, batch_size=BATCH_SIZE, augmentor=None, shuffle=False, pre_func=None):
            '''
            Parameter description
            images_array: array holding the original 32x32 image values
            labels: labels for the corresponding images
            batch_size: number of samples fetched per __getitem__(self, index) call
            augmentor: albumentations augmentation object
            shuffle: whether to shuffle the training data at the end of every epoch
            pre_func: scaling/preprocessing function applied to each image
            '''
            # Assign the constructor arguments to instance attributes.
            # images_array is the full set of original 32x32 images.
            self.images_array = images_array
            self.labels = labels
            self.batch_size = batch_size
            self.augmentor = augmentor
            self.pre_func = pre_func
            # Only the training data uses shuffle=True.
            self.shuffle = shuffle
            if self.shuffle:
                # Optionally shuffle once at creation time.
                #self.on_epoch_end()
                pass
    
        # A Dataset inheriting from Sequence serves its data in batch_size units.
        # __len__() returns how many batches it takes to cover the whole dataset.
        def __len__(self):
            # Divide the total number of samples by batch_size, rounding up when it does not divide evenly.
            return int(np.ceil(len(self.labels) / self.batch_size))
    
        # Fetch image_array and label_array data in batch_size units, transform them and return them.
        # index indicates which batch to fetch; the corresponding batch_size worth of data is processed and returned.
        # Returns the transformed image_batch and label_batch, each with batch_size entries.
        def __getitem__(self, index):
            # index is the batch number.
            # To fetch the data sequentially, slice the array with index*self.batch_size:(index+1)*self.batch_size.
            # Fetch self.batch_size 32x32 images.
            images_fetch = self.images_array[index*self.batch_size:(index+1)*self.batch_size]
            label_batch = None
            if self.labels is not None:
                label_batch = self.labels[index*self.batch_size:(index+1)*self.batch_size]
    
            # If an albumentations augmentor was passed at construction time, use it to transform each image.
            # albumentations can only transform one image at a time, so iterate over the fetched batch one image at a time.
            # image_batch will hold the transformed image values and is declared as float32.
            image_batch = np.zeros((images_fetch.shape[0], IMAGE_SIZE, IMAGE_SIZE, 3), dtype='float32')
    
            # Iterate over the batch: resize -> augment (if augmentor is not None) -> scale (if pre_func is not None) -> store in image_batch.
            for image_index in range(images_fetch.shape[0]):
                # Resize the original image to IMAGE_SIZE x IMAGE_SIZE.
                image = cv2.resize(images_fetch[image_index], (IMAGE_SIZE, IMAGE_SIZE))
                # Apply the augmentor if one was given.
                if self.augmentor is not None:
                    image = self.augmentor(image=image)['image']
    
                # Apply the scaling function if one was given.
                if self.pre_func is not None:
                    image = self.pre_func(image)
    
                # Store the transformed image in image_batch in order.
                image_batch[image_index] = image
    
            return image_batch, label_batch
    
        # Called by the model's fit() every time an epoch finishes.
        def on_epoch_end(self):
            if(self.shuffle):
                # Shuffle the image array and labels together so the pairs stay aligned; scikit-learn's utils.shuffle provides this.
                self.images_array, self.labels = sklearn.utils.shuffle(self.images_array, self.labels)
            else:
                pass
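    
    Below is a minimal sanity check of the Sequence mechanics described in the comments above. The dummy arrays, the batch size of 64 and the use of zero_one_scaler are purely illustrative; it only assumes the CIFAR_Dataset class and the IMAGE_SIZE constant defined in this post.
    
    # Hypothetical dummy data: 200 random 32x32 RGB images with random one-hot labels.
    dummy_images = np.random.randint(0, 256, size=(200, 32, 32, 3)).astype('uint8')
    dummy_labels = np.eye(10)[np.random.randint(0, 10, size=200)]
    
    sanity_ds = CIFAR_Dataset(dummy_images, dummy_labels, batch_size=64, shuffle=False, pre_func=zero_one_scaler)
    print(len(sanity_ds))              # ceil(200 / 64) = 4 batches per epoch
    images, labels = sanity_ds[0]      # first batch via __getitem__(0)
    print(images.shape, labels.shape)  # (64, 128, 128, 3) (64, 10)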

     

    One-hot encoding and splitting into training/validation/test sets

    • Scaling subtracts [103.939, 116.779, 123.68] from the original per-channel pixel values (see the sketch right below).
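    
    For reference, a rough hand-rolled equivalent of that scaling is sketched below. It is only illustrative: the actual pre_func used later is the Keras-provided resnet50 preprocess_input, and the manual_resnet_scale name plus the assumed caffe-style RGB-to-BGR reordering are not taken from this post's code.
    
    import numpy as np
    
    # Hypothetical manual version of the per-channel scaling described above
    # (assumed to mirror caffe-style preprocessing: BGR reordering + mean subtraction).
    def manual_resnet_scale(image_rgb):
        image = image_rgb.astype('float32')
        image = image[..., ::-1]                                        # reorder channels RGB -> BGR
        image -= np.array([103.939, 116.779, 123.68], dtype='float32')  # subtract per-channel means
        return image
    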
    # Reload the CIFAR10 data and apply OHE preprocessing to create the training/validation/test sets.
    (train_images, train_labels), (test_images, test_labels) = cifar10.load_data()
    print(train_images.shape, train_labels.shape, test_images.shape, test_labels.shape)
    
    (tr_images, tr_oh_labels), (val_images, val_oh_labels), (test_images, test_oh_labels) = \
        get_train_valid_test_set(train_images, train_labels, test_images, test_labels, valid_size=0.2, random_state=2021)
    print(tr_images.shape, tr_oh_labels.shape, val_images.shape, val_oh_labels.shape, test_images.shape, test_oh_labels.shape)
    Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz
    170500096/170498071 [==============================] - 6s 0us/step
    170508288/170498071 [==============================] - 6s 0us/step
    (50000, 32, 32, 3) (50000, 1) (10000, 32, 32, 3) (10000, 1)
    (40000, 32, 32, 3) (40000, 10) (10000, 32, 32, 3) (10000, 10) (10000, 32, 32, 3) (10000, 10)
    

     

    Creating the training and validation CIFAR_Dataset objects

    from tensorflow.keras.applications.resnet50 import preprocess_input as resnet_preprocess
    
    tr_ds = CIFAR_Dataset(tr_images, tr_oh_labels, batch_size=BATCH_SIZE, augmentor=None, shuffle=True, pre_func=resnet_preprocess)
    val_ds = CIFAR_Dataset(val_images, val_oh_labels, batch_size=BATCH_SIZE, augmentor=None, shuffle=False, pre_func=resnet_preprocess)
    
    print(next(iter(tr_ds))[0].shape, next(iter(val_ds))[0].shape)
    print(next(iter(tr_ds))[1].shape, next(iter(val_ds))[1].shape)
    # per-channel pixel values minus [103.939, 116.779, 123.68]
    print(next(iter(tr_ds))[0][0])
    (64, 128, 128, 3) (64, 128, 128, 3)
    (64, 10) (64, 10)
    [[[ 73.061     57.221     40.32    ]
      [ 73.061     57.221     40.32    ]
      [ 70.061     54.221     38.32    ]
      ...
      [-34.939003 -42.779    -50.68    ]
      [-35.939003 -44.779    -53.68    ]
      [-35.939003 -44.779    -53.68    ]]
    
     [[ 73.061     57.221     40.32    ]
      [ 73.061     57.221     40.32    ]
      [ 70.061     54.221     38.32    ]
      ...
      [-34.939003 -42.779    -50.68    ]
      [-35.939003 -44.779    -53.68    ]
      [-35.939003 -44.779    -53.68    ]]
    
     [[ 75.061     59.221     42.32    ]
      [ 75.061     59.221     42.32    ]
      [ 72.061     56.221     40.32    ]
      ...
      [-34.939003 -42.779    -50.68    ]
      [-35.939003 -44.779    -52.68    ]
      [-35.939003 -44.779    -52.68    ]]
    
     ...
    
     [[120.061    102.221    109.32    ]
      [120.061    102.221    109.32    ]
      [116.061     99.221    107.32    ]
      ...
      [-35.939003 -44.779    -55.68    ]
      [-34.939003 -43.779    -53.68    ]
      [-34.939003 -43.779    -53.68    ]]
    
     [[121.061    103.221    110.32    ]
      [121.061    103.221    110.32    ]
      [117.061    100.221    107.32    ]
      ...
      [-36.939003 -45.779    -56.68    ]
      [-35.939003 -44.779    -54.68    ]
      [-35.939003 -44.779    -54.68    ]]
    
     [[121.061    103.221    110.32    ]
      [121.061    103.221    110.32    ]
      [117.061    100.221    107.32    ]
      ...
      [-36.939003 -45.779    -55.68    ]
      [-35.939003 -44.779    -54.68    ]
      [-35.939003 -44.779    -54.68    ]]]
    

     

    Creating the ResNet50 model, then training it and validating its performance

    • Initial learning_rate: 0.001
    resnet_model = create_resnet(in_shape=(128, 128, 3), n_classes=10)
    
    resnet_model.compile(optimizer=Adam(lr=0.001), loss='categorical_crossentropy', metrics=['accuracy'])
    
    # If the validation loss does not improve for 3 epochs, reduce the learning rate to learning rate * 0.2 (0.001 -> 0.0002 -> 0.00004, ...).
    rlr_cb = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=3, mode='min', verbose=1)
    ely_cb = EarlyStopping(monitor='val_loss', patience=10, mode='min', verbose=1)
    
    history = resnet_model.fit(tr_ds, epochs=5, 
                        #steps_per_epoch=int(np.ceil(tr_images.shape[0]/BATCH_SIZE)),
                        validation_data=val_ds, 
                        #validation_steps=int(np.ceil(val_images.shape[0]/BATCH_SIZE)), 
                        callbacks=[rlr_cb, ely_cb]
                       )
    Model: "resnet50"
    __________________________________________________________________________________________________
     Layer (type)                   Output Shape         Param #     Connected to                     
    ==================================================================================================
     input_4 (InputLayer)           [(None, 128, 128, 3  0           []                               
                                    )]                                                                
    
     conv1_pad (ZeroPadding2D)      (None, 134, 134, 3)  0           ['input_4[0][0]']                
    
     conv (Conv2D)                  (None, 64, 64, 64)   9472        ['conv1_pad[0][0]']              
    
     bn_conv1 (BatchNormalization)  (None, 64, 64, 64)   256         ['conv[0][0]']                   
    
     activation_24 (Activation)     (None, 64, 64, 64)   0           ['bn_conv1[0][0]']               
    
     pool1_pad (ZeroPadding2D)      (None, 66, 66, 64)   0           ['activation_24[0][0]']          
    
     max_pooling2d_3 (MaxPooling2D)  (None, 32, 32, 64)  0           ['pool1_pad[0][0]']              
    
     res2a_branch2a (Conv2D)        (None, 32, 32, 64)   4160        ['max_pooling2d_3[0][0]']        
    
     bn2a_branch2a (BatchNormalizat  (None, 32, 32, 64)  256         ['res2a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_25 (Activation)     (None, 32, 32, 64)   0           ['bn2a_branch2a[0][0]']          
    
     res2a_branch2b (Conv2D)        (None, 32, 32, 64)   36928       ['activation_25[0][0]']          
    
     bn2a_branch2b (BatchNormalizat  (None, 32, 32, 64)  256         ['res2a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_26 (Activation)     (None, 32, 32, 64)   0           ['bn2a_branch2b[0][0]']          
    
     res2a_branch2c (Conv2D)        (None, 32, 32, 256)  16640       ['activation_26[0][0]']          
    
     res2a_branch1 (Conv2D)         (None, 32, 32, 256)  16640       ['max_pooling2d_3[0][0]']        
    
     bn2a_branch2c (BatchNormalizat  (None, 32, 32, 256)  1024       ['res2a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn2a_branch1 (BatchNormalizati  (None, 32, 32, 256)  1024       ['res2a_branch1[0][0]']          
     on)                                                                                              
    
     add_7 (Add)                    (None, 32, 32, 256)  0           ['bn2a_branch2c[0][0]',          
                                                                      'bn2a_branch1[0][0]']           
    
     activation_27 (Activation)     (None, 32, 32, 256)  0           ['add_7[0][0]']                  
    
     res2b_branch2a (Conv2D)        (None, 32, 32, 64)   16448       ['activation_27[0][0]']          
    
     bn2b_branch2a (BatchNormalizat  (None, 32, 32, 64)  256         ['res2b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_28 (Activation)     (None, 32, 32, 64)   0           ['bn2b_branch2a[0][0]']          
    
     res2b_branch2b (Conv2D)        (None, 32, 32, 64)   36928       ['activation_28[0][0]']          
    
     bn2b_branch2b (BatchNormalizat  (None, 32, 32, 64)  256         ['res2b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_29 (Activation)     (None, 32, 32, 64)   0           ['bn2b_branch2b[0][0]']          
    
     res2b_branch2c (Conv2D)        (None, 32, 32, 256)  16640       ['activation_29[0][0]']          
    
     bn2b_branch2c (BatchNormalizat  (None, 32, 32, 256)  1024       ['res2b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_8 (Add)                    (None, 32, 32, 256)  0           ['activation_27[0][0]',          
                                                                      'bn2b_branch2c[0][0]']          
    
     activation_30 (Activation)     (None, 32, 32, 256)  0           ['add_8[0][0]']                  
    
     res2c_branch2a (Conv2D)        (None, 32, 32, 64)   16448       ['activation_30[0][0]']          
    
     bn2c_branch2a (BatchNormalizat  (None, 32, 32, 64)  256         ['res2c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_31 (Activation)     (None, 32, 32, 64)   0           ['bn2c_branch2a[0][0]']          
    
     res2c_branch2b (Conv2D)        (None, 32, 32, 64)   36928       ['activation_31[0][0]']          
    
     bn2c_branch2b (BatchNormalizat  (None, 32, 32, 64)  256         ['res2c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_32 (Activation)     (None, 32, 32, 64)   0           ['bn2c_branch2b[0][0]']          
    
     res2c_branch2c (Conv2D)        (None, 32, 32, 256)  16640       ['activation_32[0][0]']          
    
     bn2c_branch2c (BatchNormalizat  (None, 32, 32, 256)  1024       ['res2c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_9 (Add)                    (None, 32, 32, 256)  0           ['activation_30[0][0]',          
                                                                      'bn2c_branch2c[0][0]']          
    
     activation_33 (Activation)     (None, 32, 32, 256)  0           ['add_9[0][0]']                  
    
     res3a_branch2a (Conv2D)        (None, 16, 16, 128)  32896       ['activation_33[0][0]']          
    
     bn3a_branch2a (BatchNormalizat  (None, 16, 16, 128)  512        ['res3a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_34 (Activation)     (None, 16, 16, 128)  0           ['bn3a_branch2a[0][0]']          
    
     res3a_branch2b (Conv2D)        (None, 16, 16, 128)  147584      ['activation_34[0][0]']          
    
     bn3a_branch2b (BatchNormalizat  (None, 16, 16, 128)  512        ['res3a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_35 (Activation)     (None, 16, 16, 128)  0           ['bn3a_branch2b[0][0]']          
    
     res3a_branch2c (Conv2D)        (None, 16, 16, 512)  66048       ['activation_35[0][0]']          
    
     res3a_branch1 (Conv2D)         (None, 16, 16, 512)  131584      ['activation_33[0][0]']          
    
     bn3a_branch2c (BatchNormalizat  (None, 16, 16, 512)  2048       ['res3a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn3a_branch1 (BatchNormalizati  (None, 16, 16, 512)  2048       ['res3a_branch1[0][0]']          
     on)                                                                                              
    
     add_10 (Add)                   (None, 16, 16, 512)  0           ['bn3a_branch2c[0][0]',          
                                                                      'bn3a_branch1[0][0]']           
    
     activation_36 (Activation)     (None, 16, 16, 512)  0           ['add_10[0][0]']                 
    
     res3b_branch2a (Conv2D)        (None, 16, 16, 128)  65664       ['activation_36[0][0]']          
    
     bn3b_branch2a (BatchNormalizat  (None, 16, 16, 128)  512        ['res3b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_37 (Activation)     (None, 16, 16, 128)  0           ['bn3b_branch2a[0][0]']          
    
     res3b_branch2b (Conv2D)        (None, 16, 16, 128)  147584      ['activation_37[0][0]']          
    
     bn3b_branch2b (BatchNormalizat  (None, 16, 16, 128)  512        ['res3b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_38 (Activation)     (None, 16, 16, 128)  0           ['bn3b_branch2b[0][0]']          
    
     res3b_branch2c (Conv2D)        (None, 16, 16, 512)  66048       ['activation_38[0][0]']          
    
     bn3b_branch2c (BatchNormalizat  (None, 16, 16, 512)  2048       ['res3b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_11 (Add)                   (None, 16, 16, 512)  0           ['activation_36[0][0]',          
                                                                      'bn3b_branch2c[0][0]']          
    
     activation_39 (Activation)     (None, 16, 16, 512)  0           ['add_11[0][0]']                 
    
     res3c_branch2a (Conv2D)        (None, 16, 16, 128)  65664       ['activation_39[0][0]']          
    
     bn3c_branch2a (BatchNormalizat  (None, 16, 16, 128)  512        ['res3c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_40 (Activation)     (None, 16, 16, 128)  0           ['bn3c_branch2a[0][0]']          
    
     res3c_branch2b (Conv2D)        (None, 16, 16, 128)  147584      ['activation_40[0][0]']          
    
     bn3c_branch2b (BatchNormalizat  (None, 16, 16, 128)  512        ['res3c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_41 (Activation)     (None, 16, 16, 128)  0           ['bn3c_branch2b[0][0]']          
    
     res3c_branch2c (Conv2D)        (None, 16, 16, 512)  66048       ['activation_41[0][0]']          
    
     bn3c_branch2c (BatchNormalizat  (None, 16, 16, 512)  2048       ['res3c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_12 (Add)                   (None, 16, 16, 512)  0           ['activation_39[0][0]',          
                                                                      'bn3c_branch2c[0][0]']          
    
     activation_42 (Activation)     (None, 16, 16, 512)  0           ['add_12[0][0]']                 
    
     res3d_branch2a (Conv2D)        (None, 16, 16, 128)  65664       ['activation_42[0][0]']          
    
     bn3d_branch2a (BatchNormalizat  (None, 16, 16, 128)  512        ['res3d_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_43 (Activation)     (None, 16, 16, 128)  0           ['bn3d_branch2a[0][0]']          
    
     res3d_branch2b (Conv2D)        (None, 16, 16, 128)  147584      ['activation_43[0][0]']          
    
     bn3d_branch2b (BatchNormalizat  (None, 16, 16, 128)  512        ['res3d_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_44 (Activation)     (None, 16, 16, 128)  0           ['bn3d_branch2b[0][0]']          
    
     res3d_branch2c (Conv2D)        (None, 16, 16, 512)  66048       ['activation_44[0][0]']          
    
     bn3d_branch2c (BatchNormalizat  (None, 16, 16, 512)  2048       ['res3d_branch2c[0][0]']         
     ion)                                                                                             
    
     add_13 (Add)                   (None, 16, 16, 512)  0           ['activation_42[0][0]',          
                                                                      'bn3d_branch2c[0][0]']          
    
     activation_45 (Activation)     (None, 16, 16, 512)  0           ['add_13[0][0]']                 
    
     res4a_branch2a (Conv2D)        (None, 8, 8, 256)    131328      ['activation_45[0][0]']          
    
     bn4a_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_46 (Activation)     (None, 8, 8, 256)    0           ['bn4a_branch2a[0][0]']          
    
     res4a_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_46[0][0]']          
    
     bn4a_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_47 (Activation)     (None, 8, 8, 256)    0           ['bn4a_branch2b[0][0]']          
    
     res4a_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_47[0][0]']          
    
     res4a_branch1 (Conv2D)         (None, 8, 8, 1024)   525312      ['activation_45[0][0]']          
    
     bn4a_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn4a_branch1 (BatchNormalizati  (None, 8, 8, 1024)  4096        ['res4a_branch1[0][0]']          
     on)                                                                                              
    
     add_14 (Add)                   (None, 8, 8, 1024)   0           ['bn4a_branch2c[0][0]',          
                                                                      'bn4a_branch1[0][0]']           
    
     activation_48 (Activation)     (None, 8, 8, 1024)   0           ['add_14[0][0]']                 
    
     res4b_branch2a (Conv2D)        (None, 8, 8, 256)    262400      ['activation_48[0][0]']          
    
     bn4b_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_49 (Activation)     (None, 8, 8, 256)    0           ['bn4b_branch2a[0][0]']          
    
     res4b_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_49[0][0]']          
    
     bn4b_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_50 (Activation)     (None, 8, 8, 256)    0           ['bn4b_branch2b[0][0]']          
    
     res4b_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_50[0][0]']          
    
     bn4b_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_15 (Add)                   (None, 8, 8, 1024)   0           ['activation_48[0][0]',          
                                                                      'bn4b_branch2c[0][0]']          
    
     activation_51 (Activation)     (None, 8, 8, 1024)   0           ['add_15[0][0]']                 
    
     res4c_branch2a (Conv2D)        (None, 8, 8, 256)    262400      ['activation_51[0][0]']          
    
     bn4c_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_52 (Activation)     (None, 8, 8, 256)    0           ['bn4c_branch2a[0][0]']          
    
     res4c_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_52[0][0]']          
    
     bn4c_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_53 (Activation)     (None, 8, 8, 256)    0           ['bn4c_branch2b[0][0]']          
    
     res4c_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_53[0][0]']          
    
     bn4c_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_16 (Add)                   (None, 8, 8, 1024)   0           ['activation_51[0][0]',          
                                                                      'bn4c_branch2c[0][0]']          
    
     activation_54 (Activation)     (None, 8, 8, 1024)   0           ['add_16[0][0]']                 
    
     res4d_branch2a (Conv2D)        (None, 8, 8, 256)    262400      ['activation_54[0][0]']          
    
     bn4d_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4d_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_55 (Activation)     (None, 8, 8, 256)    0           ['bn4d_branch2a[0][0]']          
    
     res4d_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_55[0][0]']          
    
     bn4d_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4d_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_56 (Activation)     (None, 8, 8, 256)    0           ['bn4d_branch2b[0][0]']          
    
     res4d_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_56[0][0]']          
    
     bn4d_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4d_branch2c[0][0]']         
     ion)                                                                                             
    
     add_17 (Add)                   (None, 8, 8, 1024)   0           ['activation_54[0][0]',          
                                                                      'bn4d_branch2c[0][0]']          
    
     activation_57 (Activation)     (None, 8, 8, 1024)   0           ['add_17[0][0]']                 
    
     res4e_branch2a (Conv2D)        (None, 8, 8, 256)    262400      ['activation_57[0][0]']          
    
     bn4e_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4e_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_58 (Activation)     (None, 8, 8, 256)    0           ['bn4e_branch2a[0][0]']          
    
     res4e_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_58[0][0]']          
    
     bn4e_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4e_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_59 (Activation)     (None, 8, 8, 256)    0           ['bn4e_branch2b[0][0]']          
    
     res4e_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_59[0][0]']          
    
     bn4e_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4e_branch2c[0][0]']         
     ion)                                                                                             
    
     add_18 (Add)                   (None, 8, 8, 1024)   0           ['activation_57[0][0]',          
                                                                      'bn4e_branch2c[0][0]']          
    
     activation_60 (Activation)     (None, 8, 8, 1024)   0           ['add_18[0][0]']                 
    
     res4f_branch2a (Conv2D)        (None, 8, 8, 256)    262400      ['activation_60[0][0]']          
    
     bn4f_branch2a (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4f_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_61 (Activation)     (None, 8, 8, 256)    0           ['bn4f_branch2a[0][0]']          
    
     res4f_branch2b (Conv2D)        (None, 8, 8, 256)    590080      ['activation_61[0][0]']          
    
     bn4f_branch2b (BatchNormalizat  (None, 8, 8, 256)   1024        ['res4f_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_62 (Activation)     (None, 8, 8, 256)    0           ['bn4f_branch2b[0][0]']          
    
     res4f_branch2c (Conv2D)        (None, 8, 8, 1024)   263168      ['activation_62[0][0]']          
    
     bn4f_branch2c (BatchNormalizat  (None, 8, 8, 1024)  4096        ['res4f_branch2c[0][0]']         
     ion)                                                                                             
    
     add_19 (Add)                   (None, 8, 8, 1024)   0           ['activation_60[0][0]',          
                                                                      'bn4f_branch2c[0][0]']          
    
     activation_63 (Activation)     (None, 8, 8, 1024)   0           ['add_19[0][0]']                 
    
     res5a_branch2a (Conv2D)        (None, 4, 4, 512)    524800      ['activation_63[0][0]']          
    
     bn5a_branch2a (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5a_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_64 (Activation)     (None, 4, 4, 512)    0           ['bn5a_branch2a[0][0]']          
    
     res5a_branch2b (Conv2D)        (None, 4, 4, 512)    2359808     ['activation_64[0][0]']          
    
     bn5a_branch2b (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5a_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_65 (Activation)     (None, 4, 4, 512)    0           ['bn5a_branch2b[0][0]']          
    
     res5a_branch2c (Conv2D)        (None, 4, 4, 2048)   1050624     ['activation_65[0][0]']          
    
     res5a_branch1 (Conv2D)         (None, 4, 4, 2048)   2099200     ['activation_63[0][0]']          
    
     bn5a_branch2c (BatchNormalizat  (None, 4, 4, 2048)  8192        ['res5a_branch2c[0][0]']         
     ion)                                                                                             
    
     bn5a_branch1 (BatchNormalizati  (None, 4, 4, 2048)  8192        ['res5a_branch1[0][0]']          
     on)                                                                                              
    
     add_20 (Add)                   (None, 4, 4, 2048)   0           ['bn5a_branch2c[0][0]',          
                                                                      'bn5a_branch1[0][0]']           
    
     activation_66 (Activation)     (None, 4, 4, 2048)   0           ['add_20[0][0]']                 
    
     res5b_branch2a (Conv2D)        (None, 4, 4, 512)    1049088     ['activation_66[0][0]']          
    
     bn5b_branch2a (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5b_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_67 (Activation)     (None, 4, 4, 512)    0           ['bn5b_branch2a[0][0]']          
    
     res5b_branch2b (Conv2D)        (None, 4, 4, 512)    2359808     ['activation_67[0][0]']          
    
     bn5b_branch2b (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5b_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_68 (Activation)     (None, 4, 4, 512)    0           ['bn5b_branch2b[0][0]']          
    
     res5b_branch2c (Conv2D)        (None, 4, 4, 2048)   1050624     ['activation_68[0][0]']          
    
     bn5b_branch2c (BatchNormalizat  (None, 4, 4, 2048)  8192        ['res5b_branch2c[0][0]']         
     ion)                                                                                             
    
     add_21 (Add)                   (None, 4, 4, 2048)   0           ['activation_66[0][0]',          
                                                                      'bn5b_branch2c[0][0]']          
    
     activation_69 (Activation)     (None, 4, 4, 2048)   0           ['add_21[0][0]']                 
    
     res5c_branch2a (Conv2D)        (None, 4, 4, 512)    1049088     ['activation_69[0][0]']          
    
     bn5c_branch2a (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5c_branch2a[0][0]']         
     ion)                                                                                             
    
     activation_70 (Activation)     (None, 4, 4, 512)    0           ['bn5c_branch2a[0][0]']          
    
     res5c_branch2b (Conv2D)        (None, 4, 4, 512)    2359808     ['activation_70[0][0]']          
    
     bn5c_branch2b (BatchNormalizat  (None, 4, 4, 512)   2048        ['res5c_branch2b[0][0]']         
     ion)                                                                                             
    
     activation_71 (Activation)     (None, 4, 4, 512)    0           ['bn5c_branch2b[0][0]']          
    
     res5c_branch2c (Conv2D)        (None, 4, 4, 2048)   1050624     ['activation_71[0][0]']          
    
     bn5c_branch2c (BatchNormalizat  (None, 4, 4, 2048)  8192        ['res5c_branch2c[0][0]']         
     ion)                                                                                             
    
     add_22 (Add)                   (None, 4, 4, 2048)   0           ['activation_69[0][0]',          
                                                                      'bn5c_branch2c[0][0]']          
    
     activation_72 (Activation)     (None, 4, 4, 2048)   0           ['add_22[0][0]']                 
    
     avg_pool (GlobalAveragePooling  (None, 2048)        0           ['activation_72[0][0]']          
     2D)                                                                                              
    
     dropout (Dropout)              (None, 2048)         0           ['avg_pool[0][0]']               
    
     fc_01 (Dense)                  (None, 200)          409800      ['dropout[0][0]']                
    
     dropout_1 (Dropout)            (None, 200)          0           ['fc_01[0][0]']                  
    
     fc_final (Dense)               (None, 10)           2010        ['dropout_1[0][0]']              
    
    ==================================================================================================
    Total params: 23,999,522
    Trainable params: 23,946,402
    Non-trainable params: 53,120
    __________________________________________________________________________________________________
    
    /usr/local/lib/python3.7/dist-packages/keras/optimizer_v2/adam.py:105: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
      super(Adam, self).__init__(name, **kwargs)
    
    Epoch 1/5
    625/625 [==============================] - 105s 146ms/step - loss: 1.9412 - accuracy: 0.2651 - val_loss: 1.8480 - val_accuracy: 0.3566 - lr: 0.0010
    Epoch 2/5
    625/625 [==============================] - 90s 144ms/step - loss: 1.6040 - accuracy: 0.3982 - val_loss: 1.4495 - val_accuracy: 0.4608 - lr: 0.0010
    Epoch 3/5
    625/625 [==============================] - 90s 144ms/step - loss: 1.4899 - accuracy: 0.4543 - val_loss: 1.3171 - val_accuracy: 0.5014 - lr: 0.0010
    Epoch 4/5
    625/625 [==============================] - 90s 144ms/step - loss: 1.3179 - accuracy: 0.5252 - val_loss: 1.2866 - val_accuracy: 0.5273 - lr: 0.0010
    Epoch 5/5
    625/625 [==============================] - 90s 144ms/step - loss: 1.1866 - accuracy: 0.5840 - val_loss: 1.5122 - val_accuracy: 0.5247 - lr: 0.0010
    
    test_ds = CIFAR_Dataset(test_images, test_oh_labels, batch_size=BATCH_SIZE, augmentor=None, shuffle=False, pre_func=resnet_preprocess)
    resnet_model.evaluate(test_ds)
    157/157 [==============================] - 8s 51ms/step - loss: 1.5256 - accuracy: 0.5163
    
    [1.5255552530288696, 0.5163000226020813]
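    
    As a quick cross-check of the evaluate() result above, the predicted classes can also be compared against the labels directly. A minimal sketch, assuming the test_ds and test_oh_labels created earlier in this post:
    
    # predict() accepts the Sequence directly; because test_ds was built with shuffle=False,
    # the prediction order matches the order of test_oh_labels.
    pred_probs = resnet_model.predict(test_ds)        # (10000, 10) softmax outputs
    pred_classes = np.argmax(pred_probs, axis=1)
    true_classes = np.argmax(test_oh_labels, axis=1)
    print('manual accuracy:', np.mean(pred_classes == true_classes))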
    
