Resource description

Various Keras fine-tuning examples for deep learning and machine learning. The code is simple and covers VGG, ResNet, Inception, and other backbones.
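All of the scripts follow the same fine-tuning recipe regardless of backbone: build the network, load ImageNet weights, swap in a new classifier head, and retrain with a small learning rate. Below is a minimal sketch of that recipe; `build_backbone`, `weights_path`, and `num_classes` are placeholders for illustration, not names from the package.

from keras.layers import Dense
from keras.models import Model
from keras.optimizers import SGD

base_model = build_backbone()                         # e.g. the DenseNet-121 graph shown below
base_model.load_weights(weights_path, by_name=True)   # ImageNet weights; path is a placeholder

# Replace the ImageNet classifier with a new softmax head sized for the target dataset
features = base_model.layers[-2].output
predictions = Dense(num_classes, activation='softmax', name='new_fc')(features)
model = Model(input=base_model.input, output=predictions)

# A small learning rate avoids destroying the pretrained weights early in training
sgd = SGD(lr=1e-3, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(optimizer=sgd, loss='categorical_crossentropy', metrics=['accuracy'])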

Resource screenshot

Code snippet and file info

# -*- coding: utf-8 -*-

from keras.optimizers import SGD
from keras.layers import Input, merge, ZeroPadding2D
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.convolutional import Convolution2D
from keras.layers.pooling import AveragePooling2D, GlobalAveragePooling2D, MaxPooling2D
from keras.layers.normalization import BatchNormalization
from keras.models import Model
import keras.backend as K

from sklearn.metrics import log_loss

from custom_layers.scale_layer import Scale

from load_cifar10 import load_cifar10_data

def densenet121_model(img_rows, img_cols, color_type=1, nb_dense_block=4, growth_rate=32, nb_filter=64, reduction=0.5, dropout_rate=0.0, weight_decay=1e-4, num_classes=None):
    '''
    DenseNet 121 Model for Keras

    Model Schema is based on
    https://github.com/flyyufelix/DenseNet-Keras

    ImageNet Pretrained Weights
    Theano: https://drive.google.com/open?id=0Byy2AcGyEVxfMlRYb3YzV210VzQ
    TensorFlow: https://drive.google.com/open?id=0Byy2AcGyEVxfSTA4SHJVOHNuTXc

    # Arguments
        nb_dense_block: number of dense blocks to add to end
        growth_rate: number of filters to add per dense block
        nb_filter: initial number of filters
        reduction: reduction factor of transition blocks
        dropout_rate: dropout rate
        weight_decay: weight decay factor
        num_classes: optional number of classes to classify images
    # Returns
        A Keras model instance.
    '''
    eps = 1.1e-5

    # compute compression factor
    compression = 1.0 - reduction
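    # e.g. with the default reduction=0.5 the compression factor is 0.5, so each
    # transition block halves the channel count (256 feature maps in, int(256 * 0.5) = 128 out)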

    # Handle Dimension Ordering for different backends
    global concat_axis
    if K.image_dim_ordering() == 'tf':
        concat_axis = 3
        img_input = Input(shape=(img_rows, img_cols, color_type), name='data')
    else:
        concat_axis = 1
        img_input = Input(shape=(color_type, img_rows, img_cols), name='data')

    # From architecture for ImageNet (Table 1 in the paper)
    nb_filter = 64
    nb_layers = [6, 12, 24, 16]  # For DenseNet-121

    # Initial convolution
    x = ZeroPadding2D((3, 3), name='conv1_zeropadding')(img_input)
    x = Convolution2D(nb_filter, 7, 7, subsample=(2, 2), name='conv1', bias=False)(x)
    x = BatchNormalization(epsilon=eps, axis=concat_axis, name='conv1_bn')(x)
    x = Scale(axis=concat_axis, name='conv1_scale')(x)
    x = Activation('relu', name='relu1')(x)
    x = ZeroPadding2D((1, 1), name='pool1_zeropadding')(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), name='pool1')(x)
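    # After the stem the spatial resolution is down by 4x overall: the 7x7/stride-2
    # convolution and the 3x3/stride-2 max pool turn a 224x224 input into 56x56
    # feature maps with nb_filter (64) channels.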

    # Add dense blocks
    for block_idx in range(nb_dense_block - 1):
        stage = block_idx+2
        x, nb_filter = dense_block(x, stage, nb_layers[block_idx], nb_filter, growth_rate, dropout_rate=dropout_rate, weight_decay=weight_decay)

        # Add transition_block
        x = transition_block(x, stage, nb_filter, compression=compression, dropout_rate=dropout_rate, weight_decay=weight_decay)
        nb_filter = int(nb_filter * compression)

    # ... (the snippet is truncated here in the source archive)
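The snippet stops before the helper functions it calls. For reference, `dense_block`, `transition_block`, and the inner `conv_block` in the DenseNet-Keras layout that this model follows look roughly like the sketch below (same Keras 1 API as the imports above); treat it as an illustration of the block structure, not the exact code shipped in the archive.

def conv_block(x, stage, branch, nb_filter, dropout_rate=None, weight_decay=1e-4):
    '''BN -> Scale -> ReLU -> 1x1 bottleneck conv -> BN -> Scale -> ReLU -> 3x3 conv.'''
    eps = 1.1e-5
    conv_name_base = 'conv' + str(stage) + '_' + str(branch)
    relu_name_base = 'relu' + str(stage) + '_' + str(branch)

    # 1x1 bottleneck producing 4 * growth_rate intermediate channels
    inter_channel = nb_filter * 4
    x = BatchNormalization(epsilon=eps, axis=concat_axis, name=conv_name_base + '_x1_bn')(x)
    x = Scale(axis=concat_axis, name=conv_name_base + '_x1_scale')(x)
    x = Activation('relu', name=relu_name_base + '_x1')(x)
    x = Convolution2D(inter_channel, 1, 1, name=conv_name_base + '_x1', bias=False)(x)
    if dropout_rate:
        x = Dropout(dropout_rate)(x)

    # 3x3 convolution producing growth_rate new feature maps
    x = BatchNormalization(epsilon=eps, axis=concat_axis, name=conv_name_base + '_x2_bn')(x)
    x = Scale(axis=concat_axis, name=conv_name_base + '_x2_scale')(x)
    x = Activation('relu', name=relu_name_base + '_x2')(x)
    x = ZeroPadding2D((1, 1), name=conv_name_base + '_x2_zeropadding')(x)
    x = Convolution2D(nb_filter, 3, 3, name=conv_name_base + '_x2', bias=False)(x)
    if dropout_rate:
        x = Dropout(dropout_rate)(x)
    return x

def transition_block(x, stage, nb_filter, compression=1.0, dropout_rate=None, weight_decay=1e-4):
    '''BN -> Scale -> ReLU -> 1x1 conv (channel compression) -> 2x2 average pooling.'''
    eps = 1.1e-5
    conv_name_base = 'conv' + str(stage) + '_blk'

    x = BatchNormalization(epsilon=eps, axis=concat_axis, name=conv_name_base + '_bn')(x)
    x = Scale(axis=concat_axis, name=conv_name_base + '_scale')(x)
    x = Activation('relu', name='relu' + str(stage) + '_blk')(x)
    x = Convolution2D(int(nb_filter * compression), 1, 1, name=conv_name_base, bias=False)(x)
    if dropout_rate:
        x = Dropout(dropout_rate)(x)
    x = AveragePooling2D((2, 2), strides=(2, 2), name='pool' + str(stage))(x)
    return x

def dense_block(x, stage, nb_layers, nb_filter, growth_rate, dropout_rate=None, weight_decay=1e-4):
    '''Stack nb_layers conv_blocks, concatenating each block's output with its input.'''
    concat_feat = x
    for i in range(nb_layers):
        branch = i + 1
        x = conv_block(concat_feat, stage, branch, growth_rate, dropout_rate, weight_decay)
        concat_feat = merge([concat_feat, x], mode='concat', concat_axis=concat_axis,
                            name='concat_' + str(stage) + '_' + str(branch))
        nb_filter += growth_rate
    return concat_feat, nb_filter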

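A typical driver for the model above, in the style the package uses for CIFAR-10 fine-tuning. The exact signature of `load_cifar10_data` (returning resized train/validation splits) is assumed here, as is the assumption that `densenet121_model` returns a compiled model; adjust both to the copies in the archive.

if __name__ == '__main__':
    img_rows, img_cols = 224, 224   # DenseNet-121 ImageNet input size
    channel = 3
    num_classes = 10
    batch_size = 16
    nb_epoch = 10

    # Load (and resize) CIFAR-10 -- signature assumed from the loader imported above
    X_train, Y_train, X_valid, Y_valid = load_cifar10_data(img_rows, img_cols)

    model = densenet121_model(img_rows=img_rows, img_cols=img_cols, color_type=channel,
                              num_classes=num_classes)

    model.fit(X_train, Y_train,
              batch_size=batch_size,
              nb_epoch=nb_epoch,      # Keras 1 argument name
              shuffle=True,
              verbose=1,
              validation_data=(X_valid, Y_valid))

    # Cross-entropy on the held-out split, using the log_loss import above
    predictions_valid = model.predict(X_valid, batch_size=batch_size, verbose=1)
    score = log_loss(Y_valid, predictions_valid)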