Thinking as a Hobby


Evolving Neural Nets

Yesterday, Philip and I worked on his final project for the neural net class (I'm auditing, so I'm not taking the final or doing the project). Mostly, this consisted of providing nominal input as Philip dashed out the code in Matlab, the software we used for the class.

Basically, after learning a whole semester of different approaches to neural nets, we're doing something completely different. Rather than creating a net and training up the weights of the connections between units, we're using genetic algorithms. That is, we're creating lots of nets with randomized weights, seeing how well they perform, keeping the ones that perform well, killing the ones that don't, cloning the survivors, slightly mutating their weights, and doing the whole thing over and over again.
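The loop described above can be sketched in a few lines. This is a hypothetical illustration, not the actual MATLAB code from the project; names like `fitness`, `POP_SIZE`, and the toy objective are my own assumptions standing in for the real net evaluation.

```python
import random

POP_SIZE = 20       # candidate nets per generation
N_WEIGHTS = 10      # weights per net (toy size for illustration)
MUTATION_STD = 0.1  # standard deviation of the mutation noise

def random_net():
    # A "net" here is just its weight vector, randomly initialized.
    return [random.uniform(-1, 1) for _ in range(N_WEIGHTS)]

def fitness(net):
    # Stand-in objective: in the real project this would run the net
    # on the letter patterns and score how well it classifies them.
    return -sum(w * w for w in net)

def mutate(net):
    # Slightly perturb every weight of a cloned survivor.
    return [w + random.gauss(0, MUTATION_STD) for w in net]

population = [random_net() for _ in range(POP_SIZE)]
for generation in range(50):
    # See how well each net performs; keep the good half, kill the rest.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    # Clone the survivors and slightly mutate the clones' weights.
    population = survivors + [mutate(net) for net in survivors]

best = max(population, key=fitness)
```

Because the survivors themselves carry over unmutated, the best score in the population can never get worse from one generation to the next.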

We're trying to evolve a neural net that can recognize five different letters, or rather, ASCII representations of those letters. We chose a 4x7 grid to represent each letter.

For example:

X000
X000
X000
XXXX
X00X
X00X
XXXX

is a lower-case "b".
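To feed a grid like that into a net, each cell has to become a number. A plausible encoding (an assumption on my part; the actual MATLAB representation may differ) is to read the 4x7 grid row by row into a 28-element binary vector:

```python
# The ideal lower-case "b" pattern from above.
B_GRID = [
    "X000",
    "X000",
    "X000",
    "XXXX",
    "X00X",
    "X00X",
    "XXXX",
]

def grid_to_vector(grid):
    # 'X' -> 1, '0' -> 0, reading row by row: one input per grid cell.
    return [1 if ch == "X" else 0 for row in grid for ch in row]

vec = grid_to_vector(B_GRID)
```

The net then has 28 inputs, and ideally five outputs, one per letter.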

We're trying to evolve a single net that recognizes five letters (a, b, c, d, and e) with a reasonable degree of reliability. Hopefully, when presented with an input vector, even one with a small amount of noise, the system will still respond by outputting the proper letter.

For example:

X000
X000
X000
XX0X
X00X
X00X
XXX0

should still be recognizable as the letter "b", even though it is not exactly the same as the ideal input. This is the strength of neural nets: the ability to generalize given slightly varied inputs.
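A quick way to see why a small amount of noise is recoverable: the noisy pattern above differs from the ideal "b" in only two of its 28 cells. This sketch just counts those differences with a Hamming distance; the real project uses an evolved net rather than any kind of distance matching, so this is purely illustrative.

```python
# Ideal "b" and the noisy "b" from above, as 28-element binary vectors
# (rows read left to right, top to bottom; 'X' -> 1, '0' -> 0).
IDEAL_B = [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,1,1,1, 1,0,0,1, 1,0,0,1, 1,1,1,1]
NOISY_B = [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,1,0,1, 1,0,0,1, 1,0,0,1, 1,1,1,0]

def hamming(a, b):
    # Number of positions where the two patterns disagree.
    return sum(x != y for x, y in zip(a, b))

distance = hamming(IDEAL_B, NOISY_B)  # only 2 flipped cells out of 28
```

With so few cells flipped, the noisy pattern is still far closer to "b" than to any of the other four letter prototypes, which is what the evolved net should exploit.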

Anyway, we spent about four hours on Saturday working on the program (or again, Philip wrote the program and I sort of watched and provided input). We got it to run without errors, but the preliminary results looked pretty flaky. So we'll keep working on it and see how it goes.

