Natural Machine Intelligence and Machine Consciousness - fantasy or near-future fact? How can we get there, and do we want to undertake the journey?

Wednesday, May 31, 2006

Peltarion Synapse

This is not an incitement to click on the ads, but I happened to take a look at what was being advertised on my page (only sensible, really, and I was careful to do it in a way that didn't break my agreement with Google not to click on them myself), and one of them was for a product called Synapse from Peltarion.

 

I had never heard of them before, so I took a quick look.  It is a nice piece of software, although annoyingly similar to something I have been meaning to make for some time.  Essentially it provides a toolkit of neural networks and analysis tools, coupled with some wizards that guide you through importing your data and designing a classifier, time-series predictor, or function modeller.  It can take a little while to load on my machine: 30 seconds or so just now, with precious little else using disk or processor (though with 760 MB of page file in use, which slows things down even with 1 GB of RAM around; Synapse itself only uses another 20 MB by the time the splash screen is up).  This is probably due to its highly modular design, with almost everything implemented in separate DLL files.  In fact, most of the filters, neural network components and so on have two DLLs each: one for the gubbins (I assume) and one for the GUI (judging by the names).

 

The splash screen has an annoying 'always on top'-ness about it, but it is bright and friendly and gives you the option to switch that off.  The main software is probably much easier to use with practice, but I must admit that, given the 14-day trial period, I imagine I would stick to using the wizards in order to get anything done.

The only real problem I ran into while messing about with it (that should read 'testing') was that if I set the option to use a genetic algorithm to optimise the workflow parameters, it would sometimes de-optimise them, leaving the analysis in a sad and sorry state.
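
For what it's worth, the usual guard against that sort of regression is elitism: carry the current best individual forward unchanged each generation, so the best score can never get worse.  A toy sketch of the idea (my own illustration, nothing to do with Synapse's internals), minimising a simple two-parameter objective:

```python
import random

def evolve(fitness, n_params, pop_size=20, generations=50, mut_scale=0.5):
    """Minimise `fitness` over real-valued parameter vectors.
    Elitism: the best individual survives verbatim each generation,
    so the best fitness found is monotonically non-increasing."""
    pop = [[random.uniform(-5, 5) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[0]                     # kept unchanged -- no de-optimising
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - 1:
            parent = random.choice(survivors)
            children.append([g + random.gauss(0, mut_scale) for g in parent])
        pop = [elite] + children
    return min(pop, key=fitness)

random.seed(0)  # for reproducibility
# Toy objective: squared distance from the point (1, 2)
best = evolve(lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2, n_params=2)
```

Without that one `elite` line, mutation alone can happily walk the whole population away from a good solution it has already found, which sounds very much like what was happening.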

 

Trying to set some of the components up manually (there is a nice drag-and-drop functionality) left me rather fatigued at one point, as every time I tried to change the number of outputs of one layer of an MLP, something automatically adjusted it back to fit the next component it was connected to.  I think (it was late at night!) I discovered that if you work backwards through the workflow, it is more likely to respect your changes and not fix them for you than if you work forwards.
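
The constraint being enforced is just that each layer's output count must equal the next layer's input count; the annoyance is which end of the chain 'wins' when they disagree.  A toy illustration of the friendlier behaviour (my own sketch, not how Synapse actually does it), where the user's edit propagates forward instead of being reverted:

```python
# Each layer is a (n_in, n_out) pair; adjacent layers must satisfy
# layers[i][1] == layers[i + 1][0].

def set_outputs(layers, i, n_out):
    """Change layer i's output count and propagate the change *forward*,
    updating the next layer's input count, rather than snapping the
    user's edit back to match the downstream component."""
    n_in, _ = layers[i]
    layers[i] = (n_in, n_out)
    if i + 1 < len(layers):
        _, nxt_out = layers[i + 1]
        layers[i + 1] = (n_out, nxt_out)
    return layers

mlp = [(10, 8), (8, 4), (4, 2)]   # input -> hidden -> output
set_outputs(mlp, 0, 6)            # edit the first layer's outputs
```

After the call the chain is `(10, 6), (6, 4), (4, 2)`: still consistent, but the user's number survived.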

 

Lots of good stuff is implemented in there, though: Kalman filters, fuzzy logic, naïve Bayes, SVMs, wavelets, Hebbians, RBFs, self-organising maps... and the design of the system means that it will be simple for them to update it.  In fact, the second time I ran it, it automatically updated the Hebbian component and a couple of other things.  That worries me slightly, because it might mean you get non-reproducible results, but on the other hand it should mean you have the latest and greatest facilities at any time, as long as you are connected to the interweb thingy.
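
For anyone unfamiliar with the less mainstream entries on that list, the Hebbian component presumably implements some variant of Hebb's rule, 'cells that fire together wire together': each weight grows in proportion to the product of its input and the unit's output.  A minimal plain-Python sketch (no relation to Peltarion's implementation):

```python
def hebbian_update(weights, x, learning_rate=0.1):
    """One step of the plain Hebbian rule: dw_i = eta * x_i * y,
    where y is the unit's linear output for input x."""
    y = sum(w * xi for w, xi in zip(weights, x))
    return [w + learning_rate * xi * y for w, xi in zip(weights, x)]

w = [0.0, 0.5, -0.5]
w = hebbian_update(w, [1.0, 1.0, 0.0])   # -> [0.05, 0.55, -0.5]
```

In this raw form the weights grow without bound under repeated presentation of the same input, which is why practical implementations usually add a normalisation or decay term (Oja's rule being the classic example).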

 

Downside? The asking price.  Ouch.  I don't think I will be buying it any time soon, although maybe, just maybe, I could twist my supervisor's arm and get it for the project, since it essentially implements a large chunk of the stuff I am halfway through implementing myself.  Not quite as satisfying as rolling your own, though, is it?
