deepfuture

Neural Networks - Perceptron (8) [MATLAB]

 

A hard-limit (threshold) transfer function can be used to build the perceptron.

Running the script (listed after the output below) in the MATLAB command window produces:

>> test

e =

       10000


e =

    -1     0    -1    -1    -1


Wij =

   -1.0428   -1.5146


b =

    -1     0    -1    -1    -1


e =

     0     1     0     0     0


Wij =

   -0.0428   -0.5146


b =

    -1     1    -1    -1    -1


e =

     0     0     0     0     0


Wij =

   -0.0428   -0.5146


b =

    -1     1    -1    -1    -1


net =

   -1.5146    0.4425   -1.5146   -1.0428   -1.0428


y =

     0     1     0     0     0

%Perceptron learning process for the AND operation
P=[0 1 0 1 1;1 1 1 0 0];   %input patterns, one per column
T=[0 1 0 0 0];             %target outputs
[M,N]=size(P);
[L,N]=size(T);
%weight matrix
Wij=rand(L,M);
%bias (threshold) vector
b=zeros(L,1);
e=10000
while (mae(e)>0.0015)
   net=netsum(Wij*P,b);
   y=hardlim(net);
   e=T-y
   Wij=Wij+e*P'
   %note: b=b+e turns b into a 1xN row (one bias per sample), which is
   %why b prints as a row vector above; the standard rule is b=b+sum(e,2)
   b=b+e
end
net=netsum(Wij*P,b)
y=hardlim(net)


>> help netsum
 NETSUM Sum net input function.
 
  Syntax
 
    N = netsum({Z1,Z2,...,Zn},FP)
    dN_dZj = netsum('dz',j,Z,N,FP)
    INFO = netsum(CODE)
 
  Description
 
    NETSUM is a net input function.  Net input functions calculate
    a layer's net input by combining its weighted inputs and bias.
 
    NETSUM({Z1,Z2,...,Zn},FP) takes Z1-Zn and optional function parameters,
      Zi - SxQ matrices in a row cell array.
      FP - Row cell array of function parameters (ignored).
    Returns element-wise sum of Z1 to Zn.
 
    NETSUM('dz',j,{Z1,...,Zn},N,FP) returns the derivative of N with
    respect to Zj.  If FP is not supplied the default values are used.
    if N is not supplied, or is [], it is calculated for you.
 
    NETSUM('name') returns the name of this function.
    NETSUM('type') returns the type of this function.
    NETSUM('fpnames') returns the names of the function parameters.
    NETSUM('fpdefaults') returns default function parameter values.
    NETSUM('fpcheck',FP) throws an error for illegal function parameters.
    NETSUM('fullderiv') returns 0 or 1, if the derivate is SxQ or NxSxQ.
 
  Examples
 
    Here NETSUM combines two sets of weighted input vectors and a bias.
    We must use CONCUR to make B the same dimensions as Z1 and Z2.
 
      z1 = [1 2 4; 3 4 1]
      z2 = [-1 2 2; -5 -6 1]
      b = [0; -1]
      n = netsum({z1,z2,concur(b,3)})
 
    Here we assign this net input function to layer i of a network.
 
      net.layers{i}.netFcn = 'compet';
 
    Use NEWP or NEWLIN to create a standard network that uses NETSUM.
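The Examples section of the help text maps directly onto NumPy broadcasting; a minimal sketch (plain NumPy, using only the values from the help text; broadcasting the 2x1 bias plays the role of concur(b,3)):

```python
import numpy as np

# Element-wise sum of two weighted-input matrices and a bias,
# mirroring netsum({z1,z2,concur(b,3)}) from the help example.
z1 = np.array([[1, 2, 4], [3, 4, 1]])
z2 = np.array([[-1, 2, 2], [-5, -6, 1]])
b = np.array([[0], [-1]])   # 2x1 bias column
n = z1 + z2 + b             # broadcasting replicates b across the 3 columns
print(n)
```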
 
hardlim computes a layer's output from its net input: it outputs 1 when the net input reaches the threshold, and 0 otherwise.

Combined with the netsum function, it can be used to construct the perceptron learning process.

>> help hardlim
 HARDLIM Hard limit transfer function.
  
  Syntax
 
    A = hardlim(N,FP)
    dA_dN = hardlim('dn',N,A,FP)
    INFO = hardlim(CODE)
 
  Description
 
    HARDLIM is a neural transfer function.  Transfer functions
    calculate a layer's output from its net input.
 
    HARDLIM(N,FP) takes N and optional function parameters,
      N - SxQ matrix of net input (column) vectors.
      FP - Struct of function parameters (ignored).
    and returns A, the SxQ boolean matrix with 1's where N >= 0.
  
    HARDLIM('dn',N,A,FP) returns SxQ derivative of A w-respect to N.
    If A or FP are not supplied or are set to [], FP reverts to
    the default parameters, and A is calculated from N.
 
    HARDLIM('name') returns the name of this function.
    HARDLIM('output',FP) returns the [min max] output range.
    HARDLIM('active',FP) returns the [min max] active input range.
    HARDLIM('fullderiv') returns 1 or 0, whether DA_DN is SxSxQ or SxQ.
    HARDLIM('fpnames') returns the names of the function parameters.
    HARDLIM('fpdefaults') returns the default function parameters.
  
  Examples
 
    Here is how to create a plot of the HARDLIM transfer function.
  
      n = -5:0.1:5;
      a = hardlim(n);
      plot(n,a)
 
    Here we assign this transfer function to layer i of a network.
 
      net.layers{i}.transferFcn = 'hardlim';
 
  Algorithm
 
      hardlim(n) = 1, if n >= 0
                   0, otherwise

 

 

>> help hardlims
 HARDLIMS Symmetric hard limit transfer function.
  
  Syntax
 
    A = hardlims(N,FP)
    dA_dN = hardlims('dn',N,A,FP)
    INFO = hardlims(CODE)
 
  Description
  
    HARDLIMS is a neural transfer function.  Transfer functions
    calculate a layer's output from its net input.
 
    HARDLIMS(N,FP) takes N and optional function parameters,
      N - SxQ matrix of net input (column) vectors.
      FP - Struct of function parameters (ignored).
    and returns A, the SxQ +1/-1 matrix with +1's where N >= 0.
  
    HARDLIMS('dn',N,A,FP) returns SxQ derivative of A w-respect to N.
    If A or FP are not supplied or are set to [], FP reverts to
    the default parameters, and A is calculated from N.
 
    HARDLIMS('name') returns the name of this function.
    HARDLIMS('output',FP) returns the [min max] output range.
    HARDLIMS('active',FP) returns the [min max] active input range.
    HARDLIMS('fullderiv') returns 1 or 0, whether DA_DN is SxSxQ or SxQ.
    HARDLIMS('fpnames') returns the names of the function parameters.
    HARDLIMS('fpdefaults') returns the default function parameters.
  
  Examples
 
    Here is how to create a plot of the HARDLIMS transfer function.
  
      n = -5:0.1:5;
      a = hardlims(n);
      plot(n,a)
 
    Here we assign this transfer function to layer i of a network.
 
      net.layers{i}.transferFcn = 'hardlims';
 
  Algorithm
 
      hardlims(n) = 1, if n >= 0
                   -1, otherwise

 

hardlims outputs 1 when the input reaches the threshold, and -1 otherwise.

>> a=[-5:0.5:5]

a =

  Columns 1 through 6

   -5.0000   -4.5000   -4.0000   -3.5000   -3.0000   -2.5000

  Columns 7 through 12

   -2.0000   -1.5000   -1.0000   -0.5000         0    0.5000

  Columns 13 through 18

    1.0000    1.5000    2.0000    2.5000    3.0000    3.5000

  Columns 19 through 21

    4.0000    4.5000    5.0000

>> c=hardlim(a)

c =

  Columns 1 through 11

     0     0     0     0     0     0     0     0     0     0     1

  Columns 12 through 21

     1     1     1     1     1     1     1     1     1     1

>> d=hardlims(a)

d =

  Columns 1 through 11

    -1    -1    -1    -1    -1    -1    -1    -1    -1    -1     1

  Columns 12 through 21

     1     1     1     1     1     1     1     1     1     1

>>
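The transcript above can be reproduced with plain NumPy (a sketch; hardlim and hardlims are re-implemented with np.where rather than calling the toolbox functions):

```python
import numpy as np

# Hand-rolled hardlim/hardlims over the same grid as a = -5:0.5:5.
def hardlim(n):
    return np.where(n >= 0, 1, 0)    # 1 if n >= 0, else 0

def hardlims(n):
    return np.where(n >= 0, 1, -1)   # 1 if n >= 0, else -1

a = np.arange(-5, 5.5, 0.5)          # 21 points, same as -5:0.5:5
c = hardlim(a)
d = hardlims(a)
```

As in the transcript, both functions flip at a = 0 (index 11 in MATLAB's 1-based columns): c goes 0 to 1 and d goes -1 to 1.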
