How can I manually calculate the output of a neural network?

106 views (last 30 days)
Jeff Chang
Jeff Chang on 2 May 2018
Edited: Soumitra Sitole on 20 Apr 2022
Dear all,
I am exploring the Neural Network Toolbox and would like to calculate the network output by hand. I used one of the examples provided with MATLAB and the code below, but unfortunately my manually computed output does not match net(x). Does anyone know why? Thanks.
Below is the MATLAB example code:
[x, y] = crab_dataset;
size(x)  % 6x200
size(y)  % 2x200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
testX = x(:, tr_info.testInd);
testT = y(:, tr_info.testInd);
testY = net(testX);
testIndices = vec2ind(testY);
[error_rate, conf_mat] = confusion(testT, testY);
fprintf('Percentage Correct Classification : %f%%\n', 100*(1 - error_rate));
fprintf('Percentage Incorrect Classification : %f%%\n', 100*error_rate);
%% Manually compute the output
% nFeatures = 6
% nSamples = 200
% nHiddenNodes = 10
% nClasses = 2
% input layer  => x (6x200)
% hidden layer => h = sigmoid(w1*x + b1)
%              = (10x6)(6x200) + (10x1)
%              = (10x200)
% output layer => yhat = w2*h + b2
%              = (2x200)
w1 = net.iw{1}; % (10x6)
w2 = net.lw{2}; % (2x10)
b1 = net.b{1};  % (10x1)
b2 = net.b{2};  % (2x1)
h = logsig(w1*x + b1);
yhat = w2*h + b2;
[testY' yhat(:, tr_info.testInd)']
[vec2ind(testY)' vec2ind(yhat(:, tr_info.testInd))']
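The shape bookkeeping in the comments above can be checked independently of MATLAB. Below is a minimal pure-Python sketch of the naive forward pass h = sigmoid(w1*x + b1), yhat = w2*h + b2; the helper names and random values are illustrative, not part of the toolbox:

```python
import math
import random

def matmul(A, B):
    # (m x k) @ (k x n) -> (m x n), lists of rows
    m, k, n = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(n)]
            for i in range(m)]

def add_bias(M, b):
    # broadcast a length-m bias vector over the n columns of M
    return [[M[i][j] + b[i] for j in range(len(M[0]))] for i in range(len(M))]

def logsig(M):
    # elementwise sigmoid, MATLAB's logsig
    return [[1.0 / (1.0 + math.exp(-v)) for v in row] for row in M]

random.seed(0)
n_feat, n_hidden, n_class, n_samp = 6, 10, 2, 200
x  = [[random.random() for _ in range(n_samp)] for _ in range(n_feat)]
w1 = [[random.random() for _ in range(n_feat)] for _ in range(n_hidden)]
b1 = [random.random() for _ in range(n_hidden)]
w2 = [[random.random() for _ in range(n_hidden)] for _ in range(n_class)]
b2 = [random.random() for _ in range(n_class)]

h    = logsig(add_bias(matmul(w1, x), b1))  # (10x6)(6x200)+(10x1) -> (10x200)
yhat = add_bias(matmul(w2, h), b2)          # (2x10)(10x200)+(2x1) -> (2x200)
print(len(h), len(h[0]), len(yhat), len(yhat[0]))  # 10 200 2 200
```

Note that this reproduces only the matrix algebra the question describes; as the accepted answer explains, patternnet also applies input normalization and different transfer functions.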

Accepted Answer

JESUS DAVID ARIZA ROYETH
JESUS DAVID ARIZA ROYETH on 2 May 2018
You missed several normalization parameters. Here is the solution:
[x, y] = crab_dataset;
size(x)  % 6x200
size(y)  % 2x200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
xoffset = net.inputs{1}.processSettings{1}.xoffset;
gain = net.inputs{1}.processSettings{1}.gain;
ymin = net.inputs{1}.processSettings{1}.ymin;
w1 = net.iw{1}; % (10x6)
w2 = net.lw{2}; % (2x10)
b1 = net.b{1};  % (10x1)
b2 = net.b{2};  % (2x1)
% Input 1: mapminmax normalization of the inputs
y1 = bsxfun(@times, bsxfun(@minus, x, xoffset), gain);
y1 = bsxfun(@plus, y1, ymin);
% Layer 1: tansig hidden layer
a1 = 2 ./ (1 + exp(-2*(repmat(b1, 1, size(x,2)) + w1*y1))) - 1;
% Output layer: softmax
n = repmat(b2, 1, size(x,2)) + w2*a1;
nmax = max(n, [], 1);
n = bsxfun(@minus, n, nmax);
num = exp(n);
den = sum(num, 1);
den(den == 0) = 1;
y2 = bsxfun(@rdivide, num, den); % y2 == net(x)
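The three stages of the answer (mapminmax normalization, tansig hidden layer, max-stabilized softmax) can be sketched per sample in plain Python; the helper names and toy numbers below are illustrative, not values from the trained network:

```python
import math

def mapminmax_apply(x, xoffset, gain, ymin):
    # MATLAB mapminmax forward step: y = (x - xoffset) .* gain + ymin
    return [(v - o) * g + m for v, o, g, m in zip(x, xoffset, gain, ymin)]

def tansig(n):
    # MATLAB's tansig: 2/(1+exp(-2n)) - 1, equivalent to tanh(n)
    return [2.0 / (1.0 + math.exp(-2.0 * v)) - 1.0 for v in n]

def softmax(n):
    # subtract the max first for numerical stability, as in the answer
    nmax = max(n)
    num = [math.exp(v - nmax) for v in n]
    den = sum(num) or 1.0
    return [v / den for v in num]

# toy single-sample pipeline: 2 features -> 3 hidden units -> 2 classes
x = [4.0, 10.0]
xn = mapminmax_apply(x, [0.0, 5.0], [0.25, 0.4], [-1.0, -1.0])
w1 = [[0.5, -0.2], [0.1, 0.3], [-0.4, 0.2]]
b1 = [0.1, -0.1, 0.0]
a1 = tansig([sum(wi * vi for wi, vi in zip(row, xn)) + b
             for row, b in zip(w1, b1)])
w2 = [[1.0, -1.0, 0.5], [-0.5, 0.2, 0.3]]
b2 = [0.0, 0.1]
y2 = softmax([sum(wi * ai for wi, ai in zip(row, a1)) + b
              for row, b in zip(w2, b2)])
print(round(sum(y2), 6))  # 1.0 -- class probabilities sum to one
```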
2 Comments
Sadaf Jabeen
Sadaf Jabeen on 14 Mar 2022
How can I compute the same with linear activation function and no bias values?
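The thread does not answer this, but as a sketch under the stated assumptions (identity/purelin activations, all biases zero), every layer reduces to a plain matrix product, so the whole network collapses to a single matrix W2*W1 applied to the (still normalized, if mapminmax is in use) input. A hypothetical pure-Python illustration with made-up weights:

```python
def matvec(W, v):
    # matrix-vector product over row lists
    return [sum(wi * vi for wi, vi in zip(row, v)) for row in W]

w1 = [[0.5, -0.2], [0.1, 0.3]]   # hypothetical 2x2 hidden weights
w2 = [[1.0, -1.0]]               # hypothetical 1x2 output weights
x = [2.0, 3.0]

# layer-by-layer: no bias, identity activation at each layer
layer_by_layer = matvec(w2, matvec(w1, x))

# collapsed single matrix W = W2 * W1
W = [[sum(w2[i][k] * w1[k][j] for k in range(len(w1)))
      for j in range(len(w1[0]))]
     for i in range(len(w2))]
collapsed = matvec(W, x)
print(layer_by_layer, collapsed)  # both give the same result
```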


More Answers (3)

Amir Qolami
Amir Qolami on 12 Apr 2020
Edited: Amir Qolami on 12 Apr 2020

In{i} and Out{i} are the input and output of the i-th hidden (and output) layer. There are two re-mappings: one before the input layer and one after the output layer.

function output = NET(net,inputs)
w = cellfun(@transpose,[net.iw{1},net.lw(2:size(net.lw,1)+1:end)],'UniformOutput',false);
b = cellfun(@transpose,net.b','UniformOutput',false);
tf = cellfun(@(x)x.transferFcn,net.layers','UniformOutput',false);
%% mapminmax on inputs
if strcmp(net.inputs{1}.processFcns{:},'mapminmax')
    xoffset = net.inputs{1}.processSettings{1}.xoffset;
    gain = net.inputs{1}.processSettings{1}.gain;
    ymin = net.inputs{1}.processSettings{1}.ymin;
    In0 = bsxfun(@plus,bsxfun(@times,bsxfun(@minus,inputs,xoffset),gain),ymin);
else
    In0 = inputs;
end
%% forward pass through the layers
In = cell(1,length(w)); Out = In;
In{1} = In0'*w{1} + b{1};
Out{1} = eval([tf{1},'(In{1})']);
for i = 2:length(w)
    In{i} = Out{i-1}*w{i} + b{i};
    Out{i} = eval([tf{i},'(In{',num2str(i),'})']);
end
%% reverse mapminmax on outputs
if strcmp(net.outputs{end}.processFcns{:},'mapminmax')
    gain = net.outputs{end}.processSettings{:}.gain;
    ymin = net.outputs{end}.processSettings{:}.ymin;
    xoffset = net.outputs{end}.processSettings{:}.xoffset;
    output = bsxfun(@plus,bsxfun(@rdivide,bsxfun(@minus,Out{end},ymin),gain),xoffset);
else
    output = Out{end};
end
end
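The generic layer loop in this answer can be mirrored outside MATLAB. Here is a hedged Python sketch of the same idea: a forward pass driven by per-layer weights, biases, and a transfer function looked up by name (the `TRANSFER` registry and the 2-3-1 toy network are my own illustrative choices, mimicking the `eval([tf{i},...])` dispatch above):

```python
import math

# illustrative registry of transfer functions by name
TRANSFER = {
    "tansig":  math.tanh,
    "logsig":  lambda v: 1.0 / (1.0 + math.exp(-v)),
    "purelin": lambda v: v,
}

def forward(layers, x):
    """layers: list of (W, b, tf_name) tuples; x: input vector."""
    out = x
    for W, b, tf_name in layers:
        f = TRANSFER[tf_name]
        # one dense layer: f(W @ out + b)
        out = [f(sum(wi * vi for wi, vi in zip(row, out)) + bi)
               for row, bi in zip(W, b)]
    return out

# hypothetical 2-3-1 network: tansig hidden layer, linear output
layers = [
    ([[0.2, -0.1], [0.4, 0.3], [-0.5, 0.6]], [0.0, 0.1, -0.1], "tansig"),
    ([[1.0, -1.0, 0.5]], [0.2], "purelin"),
]
out = forward(layers, [1.0, 2.0])
print(len(out))  # 1
```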

1 Comment
Soumitra Sitole
Soumitra Sitole on 20 Apr 2022
Thanks, this also works for relatively deep regression networks.



Jeff Chang
Jeff Chang on 2 May 2018
Edited: Jeff Chang on 2 May 2018
Besides, is it possible to use deepDreamImage() to visualize the hidden layers in this example?

Shounak Mitra
Shounak Mitra 2018年10月8日
Unfortunately, using deepDreamImage() is not possible in this case.