Matlab Neural Net: tansig always returns positive value


I am working on a classification task, but I am facing the problem that the trained net only ever returns positive values, even when I apply the original training patterns (the last line of the code given below) to it, where I would expect output equivalent to the classification targets.

All normalization is being done by the auto-created processing functions (sketched below); my results are summarized after that.
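For context, this is roughly what those auto-created functions do. A minimal sketch using mapminmax, which is the toolbox default process function; the sample vector is made up purely for illustration:

  % mapminmax maps each row to [-1, 1] for training and can reverse the
  % mapping afterwards to return to the original range.
  t = [1 2 3 2 1];                        % made-up target row for illustration
  [tn, ps] = mapminmax(t);                % tn lies in [-1, 1]
  t_back = mapminmax('reverse', tn, ps);  % recovers the original (all-positive) range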

Summary of My Results

Advice?

  [pattern, target] = getData();
  pattern = pattern';   % 11x3078
  target  = target';    % 1x3078
  trainFcn = 'trainlm'; % 'trainlm', 'trainbr', 'trainscg'
  hiddenSizes = 5;      % default 10
  net = feedforwardnet(hiddenSizes, trainFcn);
  % inps = net.inputs{1}.processFcns;
  net.layers{1}.transferFcn = 'tansig'; % default for hidden layers is 'tansig'
  net.layers{2}.transferFcn = 'tansig'; % default for output layer is the linear 'purelin'; 'tansig' preferred for classification
  net.divideParam.trainRatio = 0.7;
  net.divideParam.valRatio  = 1 - net.divideParam.trainRatio;
  net.divideParam.testRatio = 0.0;
  [net, tr] = train(net, pattern, target); % train network
  %% test network
  output = net(pattern);
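To narrow the problem down, here is a minimal sketch of the checks I would run after training. It assumes net, pattern and target from the script above and the toolbox default output processing:

  output = net(pattern);       % network output, already mapped back from [-1, 1]
  [min(output) max(output)]    % does the output ever go negative?
  net.outputs{2}.processFcns   % typically {'removeconstantrows','mapminmax'}
  % mapminmax maps the targets to [-1, 1] for training and reverses that
  % mapping on the output, so if the targets are all positive (e.g. class
  % labels 1..N) the outputs stay positive even though tansig itself
  % returns values in (-1, 1).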


