Resource Overview
The node similarity matrix is used as the input to a deep sparse autoencoder, which is trained iteratively to produce a low-dimensional feature matrix as output. (Written in MATLAB.)
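The pipeline described above can be sketched in a few lines of MATLAB. This is a minimal illustration, not the repository's code: it uses a single hidden layer, omits the sparsity (KL-divergence) penalty, and all names (S, hiddenSize, alpha) are illustrative assumptions.

```matlab
% Minimal sketch: node similarity matrix S -> autoencoder -> low-dim features.
n = 50;                            % number of nodes (illustrative)
A = rand(n) > 0.8; A = A | A';     % random symmetric adjacency matrix
S = double(A) * double(A)';        % toy similarity: count of shared neighbors
S = S / max(S(:));                 % scale inputs into [0, 1] for the sigmoid

hiddenSize = 10;                   % target feature dimension
W1 = 0.01 * randn(hiddenSize, n); b1 = zeros(hiddenSize, 1);
W2 = 0.01 * randn(n, hiddenSize); b2 = zeros(n, 1);
sigmoid = @(z) 1 ./ (1 + exp(-z));

alpha = 0.5;                       % learning rate
for iter = 1:2000                  % iterate until reconstruction stabilizes
    H = sigmoid(W1 * S + repmat(b1, 1, n));   % encode each column of S
    X = sigmoid(W2 * H + repmat(b2, 1, n));   % decode (reconstruct S)
    d2 = (X - S) .* X .* (1 - X);             % output-layer delta
    d1 = (W2' * d2) .* H .* (1 - H);          % hidden-layer delta (backprop)
    W2 = W2 - alpha * d2 * H' / n; b2 = b2 - alpha * sum(d2, 2) / n;
    W1 = W1 - alpha * d1 * S' / n; b1 = b1 - alpha * sum(d1, 2) / n;
end
features = sigmoid(W1 * S + repmat(b1, 1, n)); % hiddenSize-by-n feature matrix
```

Each column of `features` is the learned low-dimensional embedding of one node; a deep variant stacks several such layers, feeding each layer's hidden activations into the next.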
Code Snippet and File Information
function [] = checkNumericalGradient()
% This code can be used to check your numerical gradient implementation
% in computeNumericalGradient.m
% It analytically evaluates the gradient of a very simple function called
% simpleQuadraticFunction (see below) and compares the result with your numerical
% solution. Your numerical gradient implementation is incorrect if
% your numerical solution deviates too much from the analytical solution.
% Evaluate the function and gradient at x = [4; 10]; (Here x is a 2d vector.)
x = [4; 10];
[value, grad] = simpleQuadraticFunction(x);
% Use your code to numerically compute the gradient of simpleQuadraticFunction at x.
% (The notation "@simpleQuadraticFunction" denotes a function handle.)
numgrad = computeNumericalGradient(@simpleQuadraticFunction, x);
% Visually examine the two gradient computations. The two columns
% you get should be very similar.
disp([numgrad grad]);
fprintf('The above two columns you get should be very similar.\n(Left-Your Numerical Gradient Right-Analytical Gradient)\n\n');
% Evaluate the norm of the difference between two solutions.
% If you have a correct implementation and assuming you used EPSILON = 0.0001
% in computeNumericalGradient.m then diff below should be 2.1452e-12
diff = norm(numgrad-grad)/norm(numgrad+grad);
disp(diff);
fprintf('Norm of the difference between numerical and analytical gradient (should be < 1e-9)\n\n');
end
function [value, grad] = simpleQuadraticFunction(x)
% this function accepts a 2D vector as input.
% Its outputs are:
% value: h(x1, x2) = x1^2 + 3*x1*x2
% grad: A 2x1 vector that gives the partial derivatives of h with respect to x1 and x2
% Note that when we pass @simpleQuadraticFunction to computeNumericalGradient we're assuming
% that computeNumericalGradients will use only the first returned value of this function.
value = x(1)^2 + 3*x(1)*x(2);
grad = zeros(2, 1);
grad(1) = 2*x(1) + 3*x(2);
grad(2) = 3*x(1);
end
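The checker above calls computeNumericalGradient.m, which is listed in the archive but not shown in this snippet. A typical central-difference implementation (an assumption about the file's contents, not a copy of it) looks like:

```matlab
function numgrad = computeNumericalGradient(J, theta)
% J:     function handle whose first return value is the cost at theta
% theta: parameter vector at which to estimate the gradient
EPSILON = 1e-4;                    % perturbation size; matches the comment above
numgrad = zeros(size(theta));
for i = 1:numel(theta)
    e = zeros(size(theta));
    e(i) = EPSILON;                % perturb only the i-th coordinate
    % central difference: (J(theta + e) - J(theta - e)) / (2 * EPSILON)
    numgrad(i) = (J(theta + e) - J(theta - e)) / (2 * EPSILON);
end
end
```

The central difference has O(EPSILON^2) error, which is why the checker can expect agreement with the analytical gradient to around 1e-12 for this simple quadratic.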
Attribute  Size  Date  Time  Name
-----------  ---------  ----------  -----  ----
File  1982  2014-10-30  12:38  SparseAutoencoder\checkNumericalGradient.m
File  1531  2014-10-30  12:38  SparseAutoencoder\computeNumericalGradient.m
File  2647  2014-10-30  12:38  SparseAutoencoder\display_network.m
File  622  2014-10-30  12:38  SparseAutoencoder\initializeParameters.m
File  884  2019-02-25  17:56  SparseAutoencoder\main.m
File  3143  2014-10-30  12:38  SparseAutoencoder\minFunc\ArmijoBacktrack.m
File  807  2014-10-30  12:38  SparseAutoencoder\minFunc\autoGrad.m
File  901  2014-10-30  12:38  SparseAutoencoder\minFunc\autoHess.m
File  307  2014-10-30  12:38  SparseAutoencoder\minFunc\autoHv.m
File  870  2014-10-30  12:38  SparseAutoencoder\minFunc\autoTensor.m
File  374  2014-10-30  12:38  SparseAutoencoder\minFunc\callOutput.m
File  1763  2014-10-30  12:38  SparseAutoencoder\minFunc\conjGrad.m
File  953  2014-10-30  12:38  SparseAutoencoder\minFunc\dampedUpdate.m
File  2421  2014-10-30  12:38  SparseAutoencoder\minFunc\example_minFunc.m
File  1556  2014-10-30  12:38  SparseAutoencoder\minFunc\example_minFunc_LR.m
File  106  2014-10-30  12:38  SparseAutoencoder\minFunc\isLegal.m
File  885  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgs.m
File  2293  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.c
File  7707  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexa64
File  7733  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexglx
File  9500  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexmac
File  12660  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexmaci
File  8800  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexmaci64
File  7168  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexw32
File  9728  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsC.mexw64
File  594  2014-10-30  12:38  SparseAutoencoder\minFunc\lbfgsUpdate.m
File  397  2014-10-30  12:38  SparseAutoencoder\minFunc\logistic\LogisticDiagPrecond.m
File  208  2014-10-30  12:38  SparseAutoencoder\minFunc\logistic\LogisticHv.m
File  625  2014-10-30  12:38  SparseAutoencoder\minFunc\logistic\LogisticLoss.m
File  1111  2014-10-30  12:38  SparseAutoencoder\minFunc\logistic\mexutil.c
............ 34 more file entries omitted