Perceptrons

Created 17 years ago by Matt,
Last updated August 5th 2024, 3:50:07 pm

I spent ages trying to work out the fancy mathematics that make up the formula for a single-layer perceptron for simple character recognition, as part of one of my modules. What would have really helped would have been a nice bit of pseudocode somewhere, which I didn't find... so here is my pseudocode for a problem with 7 letter classes:

a b c d e j k

with 63 input values of 0 or 1 (imagine a grid of 7 by 9 which represents a letter).

There are, for example, 21 input patterns (s), three of which represent each letter (3 A's, 3 B's, etc.):

0 0 0 1 0 0 0
0 0 0 1 0 0 0
0 0 1 0 1 0 0 
0 0 1 0 1 0 0
0 1 0 0 0 1 0
0 1 1 1 1 1 0
1 0 0 0 0 0 1
1 0 0 0 0 0 1
1 1 0 0 0 1 1

which is a pattern of the A class (out of 7), whose target output pattern (t) is:

1, -1, -1, -1, -1, -1, -1

The logic is as follows:

for each letter to train for (e.g. a b c d e j k)
    create a new neuron and set a bias, threshold, max_iterations and initialise a set of weights, one for each input unit(63)

foreach neuron created (to train)
    load training data (s:t) (21 different patterns (s), and what the result should be for each one (t) - i.e. neuron A should get 1 for an A and -1 for anything else)

    do (while errors are found)
       foreach pattern entered (21)
           calculate the sum of each input value (1 to 63) times its associated weight (1 to 63), then add the bias
           if this value is greater than 0 then make y equal to 1, else make it equal to -1

           if the value of y is not equal to the expected value for this pattern (i.e. 1 or -1)

               this means an error occurred
               then update each weight in turn by adding to the weight:
                      the learning rate times the expected_output (-1 or 1) times the input value (one of 63)
               update the bias by adding to the bias:
                      the learning rate times the expected_output
    end do
end foreach
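The training loop above can be sketched in Python. This is a minimal sketch, not a definitive implementation: the function name `train_neuron`, the default learning rate and iteration cap are my own illustrative choices, and the weights start at zero for simplicity.

```python
def train_neuron(patterns, targets, n_inputs, learning_rate=1.0, max_iterations=100):
    """Train a single perceptron neuron on bipolar (1 / -1) targets."""
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(max_iterations):
        errors = 0
        for s, t in zip(patterns, targets):
            # weighted sum of the inputs plus the bias
            activation = sum(w * x for w, x in zip(weights, s)) + bias
            y = 1 if activation > 0 else -1
            if y != t:
                errors += 1
                # perceptron learning rule: nudge each weight toward the target
                for i in range(n_inputs):
                    weights[i] += learning_rate * t * s[i]
                bias += learning_rate * t
        if errors == 0:
            break  # converged: every training pattern classified correctly
    return weights, bias
```

For the letter problem you would call this once per class (63 inputs, 21 patterns), but it works the same on any linearly separable data, e.g. a 2-input toy problem.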

That will have trained each neuron (i.e. your a neuron, your b neuron, etc.) to have a certain set of weights that relate to the patterns of letters entered for that neuron. Now, you must test each one in turn with your test pattern by:

 foreach trained neuron
     pass the pattern to a test function which:
           calculates the sum of the weights times the inputs, adds the bias
           if this value is greater than 0 it returns 1
           else it returns -1

The pattern of the returned results is the result of your perceptron, e.g. an A would equate to:

1, -1, -1, -1, -1, -1, -1
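Collecting every neuron's output into one vector gives that result pattern. A sketch, assuming each trained neuron is stored as a `(weights, bias)` pair in a list (an assumption of mine; any container would do):

```python
def classify(neurons, pattern):
    # run the pattern through every trained neuron and collect the outputs
    results = []
    for weights, bias in neurons:
        activation = sum(w * x for w, x in zip(weights, pattern)) + bias
        results.append(1 if activation > 0 else -1)
    return results
```

With the seven letter neurons in order (a, b, c, d, e, j, k), an A pattern should come back as `[1, -1, -1, -1, -1, -1, -1]`.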

Comments

  1. A Perceptron being a type of a transformer? - James on Fri Nov 30 2007 15:01:29 GMT+0000 (Coordinated Universal Time)
  2. I think you'll find thats a Decepticon :) - matt on Sat Dec 01 2007 13:46:40 GMT+0000 (Coordinated Universal Time)