Resource Overview
MATLAB code for handwritten-digit recognition based on deep learning (Hinton). It is a good entry point into deep-learning-based handwriting recognition: the code is a classic and runs as-is. If parts are hard to follow, many annotated walkthroughs of it are available online.
Code Snippet and File Information
% Version 1.000
%
% Code provided by Ruslan Salakhutdinov and Geoff Hinton
%
% Permission is granted for anyone to copy, use, modify, or distribute this
% program and accompanying programs and documents for any purpose, provided
% this copyright notice is retained and prominently displayed, along with
% a note saying that the original programs are available from our
% web page.
% The programs and documents are distributed without any warranty, express or
% implied. As the programs were written for research purposes only, they have
% not been tested to the degree that would be advisable in any important
% application. All use of these programs is entirely at the user's own risk.
% This program fine-tunes an autoencoder with backpropagation.
% Weights of the autoencoder are going to be saved in mnist_weights.mat
% and training and test reconstruction errors in mnist_error.mat
% You can also set maxepoch; the default value is 200, as in our paper.
maxepoch=200;
fprintf(1,'\nFine-tuning deep autoencoder by minimizing cross entropy error. \n');
fprintf(1,'60 batches of 1000 cases each. \n');
load mnistvh
load mnisthp
load mnisthp2
load mnistpo
makebatches;
[numcases numdims numbatches]=size(batchdata);
N=numcases;
%%%% PREINITIALIZE WEIGHTS OF THE AUTOENCODER %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
w1=[vishid; hidrecbiases];
w2=[hidpen; penrecbiases];
w3=[hidpen2; penrecbiases2];
w4=[hidtop; toprecbiases];
w5=[hidtop'; topgenbiases];
w6=[hidpen2'; hidgenbiases2];
w7=[hidpen'; hidgenbiases];
w8=[vishid'; visbiases];
%%%%%%%%%% END OF PREINITIALIZATION OF WEIGHTS %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
l1=size(w1,1)-1;
l2=size(w2,1)-1;
l3=size(w3,1)-1;
l4=size(w4,1)-1;
l5=size(w5,1)-1;
l6=size(w6,1)-1;
l7=size(w7,1)-1;
l8=size(w8,1)-1;
l9=l1;
test_err=[];
train_err=[];
for epoch = 1:maxepoch
%%%%%%%%%%%%%%%%%%%% COMPUTE TRAINING RECONSTRUCTION ERROR %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
err=0;
[numcases numdims numbatches]=size(batchdata);
N=numcases;
for batch = 1:numbatches
data = [batchdata(:,:,batch)];
data = [data ones(N,1)];
w1probs = 1./(1 + exp(-data*w1)); w1probs = [w1probs ones(N,1)];
w2probs = 1./(1 + exp(-w1probs*w2)); w2probs = [w2probs ones(N,1)];
w3probs = 1./(1 + exp(-w2probs*w3)); w3probs = [w3probs ones(N,1)];
w4probs = w3probs*w4; w4probs = [w4probs ones(N,1)];
w5probs = 1./(1 + exp(-w4probs*w5)); w5probs = [w5probs ones(N,1)];
w6probs = 1./(1 + exp(-w5probs*w6)); w6probs = [w6probs ones(N,1)];
w7probs = 1./(1 + exp(-w6probs*w7)); w7probs = [w7probs ones(N,1)];
dataout = 1./(1 + exp(-w7probs*w8));
err = err + 1/N*sum(sum( (data(:,1:end-1)-dataout).^2 ));
end
train_err(epoch)=err/numbatches;
%%%%%%%%%%%%%% END OF COMPUTING TRAINING RECONSTRUCTION ERROR %%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% DISPLAY FIGURE TOP ROW REAL DATA BOTTOM ROW RECONSTRUCTIONS %%%%%%%%%%%%%%%%%%%%%%%%%
fprintf(1,'Displaying in figure 1: Top row - real data, Bottom row -- reconstructions \n');
output=[];
for i
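The snippet is cut off here, but its core computation is already visible above: a deterministic forward pass through the unrolled 784-1000-500-250-30 autoencoder (logistic units everywhere except the linear 30-unit code layer, with a bias column of ones appended after each layer) followed by the per-case squared reconstruction error. A minimal NumPy sketch of that pass, with random weights standing in for the RBM-pretrained ones that the MATLAB code loads from mnistvh/mnisthp/mnisthp2/mnistpo:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Layer sizes of the 784-1000-500-250-30 autoencoder; random weights
# stand in for the pretrained w1..w8 of the MATLAB code. Each weight
# matrix has one extra input row for the appended bias column.
sizes = [784, 1000, 500, 250, 30, 250, 500, 1000, 784]
weights = [rng.normal(0, 0.01, (m + 1, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def forward(data, weights):
    """Forward pass mirroring the MATLAB loop: append a column of ones
    (the bias) after every layer; layer 4 (the 30-unit code) is linear
    (w4probs = w3probs*w4), all other layers are logistic."""
    probs = np.hstack([data, np.ones((data.shape[0], 1))])
    for i, w in enumerate(weights):
        act = probs @ w
        if i != 3:                       # skip the nonlinearity at the code layer
            act = sigmoid(act)
        probs = np.hstack([act, np.ones((act.shape[0], 1))])
    return probs[:, :-1]                 # drop the trailing bias column

data = rng.random((10, 784))             # 10 fake MNIST cases in [0, 1]
recon = forward(data, weights)
# err mirrors: err + 1/N*sum(sum((data(:,1:end-1)-dataout).^2))
err = np.sum((data - recon) ** 2) / data.shape[0]
```

The bias trick is why every `size(wk,1)-1` in the MATLAB code subtracts one: each stored weight matrix carries the biases as its last input row.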
Attribute  Size  Date  Time  Name
----------- --------- ---------- ----- ----
File  5594  2006-05-21 11:34  Autoencoder_Code\backprop.m
File  5474  2006-06-20 09:49  Autoencoder_Code\backpropclassify.m
File  1853  2006-06-20 09:49  Autoencoder_Code\CG_CLASSIFY.m
File  1136  2006-06-20 09:49  Autoencoder_Code\CG_CLASSIFY_INIT.m
File  2727  2006-06-20 09:49  Autoencoder_Code\CG_MNIST.m
File  3011  2006-06-20 09:49  Autoencoder_Code\converter.m
File  4169  2006-06-20 09:49  Autoencoder_Code\makebatches.m
File  8995  2013-01-24 10:57  Autoencoder_Code\minimize.m.txt
File  1902  2006-06-20 09:49  Autoencoder_Code\mnistclassify.m
File  2199  2006-06-20 09:49  Autoencoder_Code\mnistdeepauto.m
File  1084  2006-06-20 09:49  Autoencoder_Code\mnistdisp.m
File  3914  2006-06-20 09:49  Autoencoder_Code\rbm.m
File  3964  2006-06-20 09:49  Autoencoder_Code\rbmhidlinear.m
File  2934  2006-07-13 23:40  Autoencoder_Code\README.txt
Dir      0  2013-01-24 11:05  Autoencoder_Code
----------- --------- ---------- ----- ----
48956 15