CSCI 335 - Artificial Intelligence
Fall 2017
Programming Project #5: Handwriting Recognition with Multilayer Perceptrons
Overview
You will implement the backpropagation algorithm for training a multi-layer perceptron. You will experiment with different configurations of the
neural network to find the best results for a handwriting recognition problem.
Setup
- Download perceptron.zip.
- Unzip the files into the Eclipse Java project you created for this course.
- Click on your project in the package explorer, and press F5 to refresh.
- Be sure to add JUnit4 to the build path if necessary.
Programming Assignment
The new files are all placed in the handwriting.learners package. You will need to modify Perceptron.java and MultiLayer.java. You will also need to create a class that extends RecognizerAI.java, and place it in the same package.
- handwriting.learners
  - PerceptronNet.java: Abstract base class for both the single-layer and multi-layer perceptron classes. Study the trainN() method carefully to understand the roles of the train() and updateWeights() methods in the training process.
  - Perceptron.java: Implement the compute() and addToWeightDeltas() methods. Study the train() method to understand the role of these two methods.
  - MultiLayer.java: Implement the backpropagate() method. Study the train() method to understand how it is used.
  - PerceptronTest.java: Unit tests for perceptrons. If all tests pass, you should have confidence that your implementations of the Perceptron and MultiLayer classes are correct.
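In most formulations, a single perceptron's compute() reduces to a weighted sum of the inputs (plus a bias) passed through an activation function, and addToWeightDeltas() accumulates the gradient contribution of one training example. The sketch below illustrates that pattern under common assumptions (sigmoid activation, bias stored as weight 0, batch-style updates); all class and method names here are hypothetical, not necessarily the project's actual API.

```java
// Hypothetical sketch of a single sigmoid perceptron; not the project's API.
public class SimplePerceptron {
    private final double[] weights;      // weights[0] is the bias weight
    private final double[] weightDeltas; // accumulated updates

    public SimplePerceptron(int numInputs) {
        weights = new double[numInputs + 1];
        weightDeltas = new double[numInputs + 1];
    }

    // Weighted sum of inputs (plus bias) squashed by the sigmoid.
    public double compute(double[] inputs) {
        double sum = weights[0]; // bias input is implicitly 1.0
        for (int i = 0; i < inputs.length; i++) {
            sum += weights[i + 1] * inputs[i];
        }
        return 1.0 / (1.0 + Math.exp(-sum));
    }

    // Accumulate one example's gradient contribution:
    // delta = error * sigmoid'(net) = error * out * (1 - out).
    public void addToWeightDeltas(double[] inputs, double error, double rate) {
        double out = compute(inputs);
        double delta = error * out * (1.0 - out);
        weightDeltas[0] += rate * delta; // bias term
        for (int i = 0; i < inputs.length; i++) {
            weightDeltas[i + 1] += rate * delta * inputs[i];
        }
    }

    // Apply and clear the accumulated deltas (batch update).
    public void updateWeights() {
        for (int i = 0; i < weights.length; i++) {
            weights[i] += weightDeltas[i];
            weightDeltas[i] = 0.0;
        }
    }
}
```

Separating addToWeightDeltas() from updateWeights() in this way lets the caller decide whether to apply updates after every example (online) or after a full pass over the training set (batch).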
The class you will create that extends RecognizerAI will need to employ at least one MultiLayer object for handwriting recognition. The precise way in which it is used is entirely up to you.
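Backpropagation in a two-layer network first computes each output node's delta from its error, then propagates those deltas backward through the output weights to obtain the hidden-node deltas. The self-contained sketch below shows one online training step under common assumptions (sigmoid activations, one hidden layer, squared-error loss); all names are hypothetical and are not taken from the project's code.

```java
// Hypothetical backpropagation sketch for one hidden layer; not the project's API.
public class TinyMLP {
    double[][] wHidden; // [hiddenNode][input + 1]; index 0 is the bias weight
    double[][] wOutput; // [outputNode][hidden + 1]
    double rate = 0.5;  // learning rate

    public TinyMLP(int in, int hid, int out) {
        java.util.Random rng = new java.util.Random(42);
        wHidden = new double[hid][in + 1];
        wOutput = new double[out][hid + 1];
        for (double[] row : wHidden)
            for (int i = 0; i < row.length; i++) row[i] = rng.nextGaussian() * 0.1;
        for (double[] row : wOutput)
            for (int i = 0; i < row.length; i++) row[i] = rng.nextGaussian() * 0.1;
    }

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One layer's forward pass: weighted sums through the sigmoid.
    static double[] layer(double[][] w, double[] in) {
        double[] out = new double[w.length];
        for (int j = 0; j < w.length; j++) {
            double sum = w[j][0]; // bias
            for (int i = 0; i < in.length; i++) sum += w[j][i + 1] * in[i];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    public double[] compute(double[] x) { return layer(wOutput, layer(wHidden, x)); }

    // One online backpropagation step for a single (input, target) pair.
    public void backpropagate(double[] x, double[] target) {
        double[] h = layer(wHidden, x);
        double[] o = layer(wOutput, h);

        // Output deltas: (target - out) * sigmoid'(net) = err * o * (1 - o).
        double[] dOut = new double[o.length];
        for (int j = 0; j < o.length; j++)
            dOut[j] = (target[j] - o[j]) * o[j] * (1 - o[j]);

        // Hidden deltas: each hidden node's error is the delta-weighted
        // sum over the output nodes it feeds.
        double[] dHid = new double[h.length];
        for (int i = 0; i < h.length; i++) {
            double err = 0;
            for (int j = 0; j < o.length; j++) err += dOut[j] * wOutput[j][i + 1];
            dHid[i] = err * h[i] * (1 - h[i]);
        }

        // Weight updates: gradient descent on the squared error.
        for (int j = 0; j < o.length; j++) {
            wOutput[j][0] += rate * dOut[j];
            for (int i = 0; i < h.length; i++) wOutput[j][i + 1] += rate * dOut[j] * h[i];
        }
        for (int i = 0; i < h.length; i++) {
            wHidden[i][0] += rate * dHid[i];
            for (int k = 0; k < x.length; k++) wHidden[i][k + 1] += rate * dHid[i] * x[k];
        }
    }
}
```

The key idea to carry into your backpropagate() implementation is the middle block: hidden-layer errors are not observed directly but are reconstructed from the output deltas via the output-layer weights.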
You are encouraged to experiment with different variations of the learning
algorithm, each of which would be implemented as a separate class. Here
are some examples of aspects of the algorithm that you might vary:
- Process for decoding perceptron outputs
- Learning rate
- Number of hidden nodes
- Number of training iterations
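As one example of the first variation, a common encoding uses one output node per letter: a letter's target vector is 1.0 at its own node and 0.0 elsewhere, and decoding picks the node with the largest activation. A minimal sketch of that scheme (the encoding choice and all names here are one hypothetical option, not a requirement):

```java
// Hypothetical one-hot encoder/decoder: one output node per letter.
public class OutputDecoder {
    private final char[] letters;

    public OutputDecoder(char[] letters) { this.letters = letters; }

    // Encode a letter as a target vector: 1.0 for its node, 0.0 elsewhere.
    public double[] encode(char letter) {
        double[] target = new double[letters.length];
        for (int i = 0; i < letters.length; i++)
            target[i] = (letters[i] == letter) ? 1.0 : 0.0;
        return target;
    }

    // Decode network outputs as the letter whose node fired most strongly.
    public char decode(double[] outputs) {
        int best = 0;
        for (int i = 1; i < outputs.length; i++)
            if (outputs[i] > outputs[best]) best = i;
        return letters[best];
    }
}
```

Alternatives worth experimenting with include a binary encoding (log2 of the letter count output nodes) or a confidence threshold below which the recognizer declines to answer.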
Generating Data
- Using the drawing editor, draw 10 samples each of two letters. For each drawing, click the "Record" button when it is complete. For the label, use the letter that you drew. Once this is complete, save the file (using the Save command on the File menu).
- Create a second set of samples of your two letters, and save them under a different filename.
- Create a neural network using the first file as the training set. Test its performance using the second file. Then build a new network using the second file for training and the first file for testing. How well does each network perform on its testing set?
- Once you can build a network that distinguishes two letters, expand your training and test sets to train it to distinguish three letters. Continue iterating this process until you can build a network that can distinguish at least eight different letters.
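To compare how well each network performs on its testing set, you will want a consistent accuracy measure: the percentage of test drawings whose predicted label matches the recorded label. A small helper along these lines (names are hypothetical):

```java
// Hypothetical accuracy helper: percentage of correctly classified test examples.
public class Accuracy {
    public static double percentCorrect(char[] predicted, char[] actual) {
        int correct = 0;
        for (int i = 0; i < predicted.length; i++)
            if (predicted[i] == actual[i]) correct++;
        return 100.0 * correct / predicted.length;
    }
}
```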
Presentations
- Thursday, October 5:
- Describe your strategy for encoding/decoding the inputs/outputs of the network.
- Give testing-set results for a two-letter experiment. Include the following details:
- Number of training iterations
- Number of hidden nodes employed
- Percentage of testing-set examples correctly classified
- Tuesday, October 10:
- Describe your overall results for all experiments, up to eight letters
Paper
When you are finished with your experiments, write a paper summarizing your findings. Include the following:
- An analysis and discussion of your data. (Be sure to include the data as well.)
- How well does the multi-layer perceptron perform for this task?
- What effect did variations in the number of training iterations, the learning rate, and the number of hidden nodes have on the results?
- Beyond the actual results, what other issues are noteworthy?