- Size: / File type: .zip / Coins: 2 / Downloads: 0 / Published: 2023-11-13
- Language: Other
- Tags: Machine Learning, Scikit-Learn, TensorFlow
Resource description

Code snippet and file information
# Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Contains a model definition for AlexNet.

This work was first described in:
  ImageNet Classification with Deep Convolutional Neural Networks
  Alex Krizhevsky, Ilya Sutskever and Geoffrey E. Hinton

and later refined in:
  One weird trick for parallelizing convolutional neural networks
  Alex Krizhevsky, 2014

Here we provide the implementation proposed in "One weird trick" and not
"ImageNet Classification", as per the paper, the LRN layers have been removed.

Usage:
  with slim.arg_scope(alexnet.alexnet_v2_arg_scope()):
    outputs, end_points = alexnet.alexnet_v2(inputs)

@@alexnet_v2
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import tensorflow as tf

slim = tf.contrib.slim

trunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)


def alexnet_v2_arg_scope(weight_decay=0.0005):
  with slim.arg_scope([slim.conv2d, slim.fully_connected],
                      activation_fn=tf.nn.relu,
                      biases_initializer=tf.constant_initializer(0.1),
                      weights_regularizer=slim.l2_regularizer(weight_decay)):
    with slim.arg_scope([slim.conv2d], padding='SAME'):
      with slim.arg_scope([slim.max_pool2d], padding='VALID') as arg_sc:
        return arg_sc


def alexnet_v2(inputs,
               num_classes=1000,
               is_training=True,
               dropout_keep_prob=0.5,
               spatial_squeeze=True,
               scope='alexnet_v2'):
  """AlexNet version 2.

  Described in: http://arxiv.org/pdf/1404.5997v2.pdf
  Parameters from:
  github.com/akrizhevsky/cuda-convnet2/blob/master/layers/
  layers-imagenet-1gpu.cfg

  Note: All the fully_connected layers have been transformed to conv2d layers.
        To use in classification mode, resize input to 224x224. To use in fully
        convolutional mode, set spatial_squeeze to false.
        The LRN layers have been removed and change the initializers from
        random_normal_initializer to xavier_initializer.

  Args:
    inputs: a tensor of size [batch_size, height, width, channels].
    num_classes: number of predicted classes.
    is_training: whether or not the model is being trained.
    dropout_keep_prob: the probability that activations are kept in the dropout
      layers during training.
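The preview cuts off above, but the module docstring already gives the intended call pattern. Below is a minimal usage sketch along those lines; the file name alexnet.py and the placeholder input shape are assumptions for illustration, not part of the original listing.

# Minimal usage sketch, not from the original archive. Assumes TensorFlow 1.x
# (tf.contrib.slim) and that the snippet above is saved locally as alexnet.py.
import tensorflow as tf
import alexnet  # hypothetical local module containing the code above

slim = tf.contrib.slim

# Classification mode expects 224x224 inputs, per the docstring note.
images = tf.placeholder(tf.float32, shape=[None, 224, 224, 3])

with slim.arg_scope(alexnet.alexnet_v2_arg_scope()):
  outputs, end_points = alexnet.alexnet_v2(images,
                                           num_classes=1000,
                                           is_training=False)
# With the default spatial_squeeze=True, outputs are [batch_size, num_classes]
# logits; the docstring suggests spatial_squeeze=False for fully convolutional use.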
Attr   Size     Date       Time  Name
-----  -------  ---------- ----- ----
dir    0        2016-11-25 08:34 handson-ml-master\
file   244      2016-11-25 08:34 handson-ml-master\.binder_start
file   112      2016-11-25 08:34 handson-ml-master\.gitignore
file   274510   2016-11-25 08:34 handson-ml-master\01_the_machine_learning_landscape.ipynb
file   2118822  2016-11-25 08:34 handson-ml-master\02_end_to_end_machine_learning_project.ipynb
file   960196   2016-11-25 08:34 handson-ml-master\03_classification.ipynb
file   743852   2016-11-25 08:34 handson-ml-master\04_training_linear_models.ipynb
file   867639   2016-11-25 08:34 handson-ml-master\05_support_vector_machines.ipynb
file   201263   2016-11-25 08:34 handson-ml-master\06_decision_trees.ipynb
file   567343   2016-11-25 08:34 handson-ml-master\07_ensemble_learning_and_random_forests.ipynb
file   1560349  2016-11-25 08:34 handson-ml-master\08_dimensionality_reduction.ipynb
file   83498    2016-11-25 08:34 handson-ml-master\09_up_and_running_with_tensorflow.ipynb
file   326697   2016-11-25 08:34 handson-ml-master\10_introduction_to_artificial_neural_networks.ipynb
file   275788   2016-11-25 08:34 handson-ml-master\11_deep_learning.ipynb
file   15732    2016-11-25 08:34 handson-ml-master\12_distributed_tensorflow.ipynb
file   1507818  2016-11-25 08:34 handson-ml-master\13_convolutional_neural_networks.ipynb
file   897957   2016-11-25 08:34 handson-ml-master\14_recurrent_neural_networks.ipynb
file   328928   2016-11-25 08:34 handson-ml-master\15_autoencoders.ipynb
file   757620   2016-11-25 08:34 handson-ml-master\16_reinforcement_learning.ipynb
file   1927     2016-11-25 08:34 handson-ml-master\Dockerfile
file   10175    2016-11-25 08:34 handson-ml-master\LICENSE
file   3639     2016-11-25 08:34 handson-ml-master\README.md
dir    0        2016-11-25 08:34 handson-ml-master\datasets\
dir    0        2016-11-25 08:34 handson-ml-master\datasets\housing\
file   3679     2016-11-25 08:34 handson-ml-master\datasets\housing\README.md
file   1423529  2016-11-25 08:34 handson-ml-master\datasets\housing\housing.csv
file   409488   2016-11-25 08:34 handson-ml-master\datasets\housing\housing.tgz
dir    0        2016-11-25 08:34 handson-ml-master\datasets\inception\
file   31674    2016-11-25 08:34 handson-ml-master\datasets\inception\imagenet_class_names.txt
dir    0        2016-11-25 08:34 handson-ml-master\datasets\lifesat\
file   4311     2016-11-25 08:34 handson-ml-master\datasets\lifesat\README.md
............ 70 more file entries omitted here
Related resources
- Learning Linux Binary Analysis
- ReportMachine cross-tab report: student grade sheet
- ReportMachine help e-book
- Machine learning personal notes, complete edition v5.2 (A4 print layout)
- TH upstream-inhibited ARHGAP12 subnetwork for
- Bishop - Pattern Recognition And Machine Learn
- [en] Deep Learning [Deep Learning: Adaptive Compu
- Andrew Ng machine learning programming exercises
- Wikipedia machine learning mini e-book, part 4: "D
- AV Foundation development guide, English edition: Learning
- Google paper "Wide & Deep Learning for Recom
- Learning From Data, Yaser S. Abu-Mostafa
- Introduction to Reinforcement Learning (Reinforcement Learning)
- titanic_dataset.csv Titanic dataset
- TensorFlow Machine Learning Cookbook + high-definition, watermark-free
- Hands-On Machine Learning with Scikit-Learn an
- Vapnik's classic The Nature Of Statistical Le
- Learning Generative Adversarial Networks (watermark-free)
- Algorithms for reinforcement learning
- Bioinformatics Algorithms: an Active Learning
- Big Data and Machine Learning in Quantitative
- Learning with Kernels
- master_machine_learning_algorithms285570
- Grokking Deep Learning
- machine-learning-ex4
- Learning Generative Adversarial Networks
- Stanford University 2014 machine learning course, Chinese note
- Deep Learning with R.pdf
- Reinforcement Learning: An Introduction, Rich
- Classic paper on the data-imbalance problem, "Learning f