A while ago I gained an interest in A.I. I had been interested in it for a long time, but never had the time to really get started with it.
But hooray: on my birthday I bought a couple of books and actually read them too. Then the A.I. vibes really came to life :)
After reading the book(s) I started to implement a framework for neural networks,
because the N.N. part of A.I. interests me a lot.
It's pretty cool how many similarities you can see between it and a human brain.
Now, I learned that there are many different kinds of neural nets. With the framework, based on the explanation from one of the books, I think I managed to get the biggest kinds implemented: ADALINE, backpropagation, BAM and SOM (or SON).
If you look at how those networks work, you'll see that there are certain similarities between them. I'll go deeper into that later.
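To already give a small taste of what I mean by those similarities: ADALINE and backpropagation, for example, both adjust a link's weight in the same delta-rule shape, weight change = learning rate × error signal × input; they mainly differ in how the error signal is computed. A tiny illustrative sketch (the class and method names here are made up for this example, they are not the exact names from the framework):

```java
// Illustrative sketch: the weight update that ADALINE and backpropagation share.
//   w <- w + learningRate * errorSignal * input
class SketchLink {
    double weight = Math.random() - 0.5;   // small random start weight

    void adjustWeight(double learningRate, double errorSignal, double input) {
        weight += learningRate * errorSignal * input;
    }
}

class DeltaRules {
    // ADALINE: the error is simply target minus the (linear) output.
    static double adalineError(double target, double output) {
        return target - output;
    }

    // Backpropagation (output node with a sigmoid activation): the same idea,
    // but scaled by the derivative of the activation function.
    static double backpropOutputError(double target, double output) {
        return (target - output) * output * (1.0 - output);
    }
}
```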
My Object Oriented Neural Network Framework looks like this:
As you can see, I only show the class names, because the complete content doesn't fit on the screen.
The thing all neural networks have in common is a BaseNode; from there we derive a FeedForward-Node, a SON-Node (for self organizing maps), an InputNode for feeding in input values, and a BaseNetwork, which actually consists of BaseNodes.
Then we have the BaseLink, the link between the neuron nodes (a neuron being an element of a neural network). From it we derive the ADALINE links, the backprop links with the EpochLink (for learning), and the BAM and SON links.
With this construction we can build all combinations of neural networks; at most you will have to inherit from something to extend the functionality.
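To give an impression of how that could fit together in code, here is a heavily simplified sketch. The fields and method bodies are only illustrative (the real classes contain a lot more, like configurable activation functions and the learning logic), but it shows the idea of BaseNode, BaseLink and BaseNetwork:

```java
import java.util.ArrayList;
import java.util.List;

// Every node type in the framework hangs off a common base class.
abstract class BaseNode {
    final List<BaseLink> incoming = new ArrayList<>();
    final List<BaseLink> outgoing = new ArrayList<>();
    double value;

    // Each concrete node decides how it turns its inputs into a value.
    abstract void run();
}

// Input nodes just hold a value that is set from the outside.
class InputNode extends BaseNode {
    void run() { /* nothing to compute */ }
}

// A feed-forward node sums its weighted inputs and squashes the result.
class FeedForwardNode extends BaseNode {
    void run() {
        double sum = 0.0;
        for (BaseLink link : incoming) {
            sum += link.from.value * link.weight;
        }
        value = 1.0 / (1.0 + Math.exp(-sum));   // sigmoid activation
    }
}

// A link connects two neuron nodes and carries a weight.
class BaseLink {
    final BaseNode from;
    final BaseNode to;
    double weight = Math.random() - 0.5;

    BaseLink(BaseNode from, BaseNode to) {
        this.from = from;
        this.to = to;
        from.outgoing.add(this);
        to.incoming.add(this);
    }
}

// A network is, at its core, just a collection of BaseNodes.
class BaseNetwork {
    final List<BaseNode> nodes = new ArrayList<>();

    // Assumes the nodes were added in feed-forward order.
    void run() {
        for (BaseNode node : nodes) {
            node.run();
        }
    }
}
```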
The usual exercises for learning something about neural networks (the ones I encountered the most) are:
- Self Organizing Nets
- Color SOM
- the XOR problem (a small stand-alone sketch follows after this list)
- ... and of course there are others.
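The XOR problem is popular as a first test because the data set is tiny, yet a single-layer network can't solve it; you need at least one hidden layer. Here is a small stand-alone backpropagation sketch of it. It doesn't use the framework classes from above, it's just to show the problem itself (and depending on the random start weights it can occasionally get stuck in a local minimum):

```java
import java.util.Random;

// Tiny XOR example: 2 inputs, 3 hidden sigmoid nodes, 1 sigmoid output,
// trained with plain backpropagation (online updates, biases included).
public class XorSketch {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    public static void main(String[] args) {
        double[][] inputs  = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
        double[]   targets = {  0,      1,      1,      0 };

        int hidden = 3;
        double learningRate = 0.5;
        Random rnd = new Random();

        double[][] wIH = new double[hidden][3];    // input -> hidden (last column = bias)
        double[]   wHO = new double[hidden + 1];   // hidden -> output (last entry = bias)
        for (double[] row : wIH) for (int i = 0; i < 3; i++) row[i] = rnd.nextDouble() - 0.5;
        for (int j = 0; j <= hidden; j++) wHO[j] = rnd.nextDouble() - 0.5;

        for (int epoch = 0; epoch < 20000; epoch++) {
            for (int p = 0; p < inputs.length; p++) {
                // forward pass
                double[] h = new double[hidden];
                for (int j = 0; j < hidden; j++) {
                    h[j] = sigmoid(wIH[j][0] * inputs[p][0] + wIH[j][1] * inputs[p][1] + wIH[j][2]);
                }
                double outSum = wHO[hidden];
                for (int j = 0; j < hidden; j++) outSum += wHO[j] * h[j];
                double out = sigmoid(outSum);

                // backward pass: the same delta-rule shape as shown earlier
                double outDelta = (targets[p] - out) * out * (1 - out);
                for (int j = 0; j < hidden; j++) {
                    double hDelta = outDelta * wHO[j] * h[j] * (1 - h[j]);
                    wHO[j]    += learningRate * outDelta * h[j];
                    wIH[j][0] += learningRate * hDelta * inputs[p][0];
                    wIH[j][1] += learningRate * hDelta * inputs[p][1];
                    wIH[j][2] += learningRate * hDelta;            // bias input = 1
                }
                wHO[hidden] += learningRate * outDelta;            // output bias
            }
        }

        // show what the net learned
        for (double[] in : inputs) {
            double[] h = new double[hidden];
            for (int j = 0; j < hidden; j++) h[j] = sigmoid(wIH[j][0] * in[0] + wIH[j][1] * in[1] + wIH[j][2]);
            double outSum = wHO[hidden];
            for (int j = 0; j < hidden; j++) outSum += wHO[j] * h[j];
            System.out.printf("%d XOR %d -> %.3f%n", (int) in[0], (int) in[1], sigmoid(outSum));
        }
    }
}
```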
I also made a variation on something I picked up on www.generation5.org, and that's detecting picture similarity with neural networks.
I'll post more on those subjects later.
Regards,
F.