--- Release 1.0 ---
2009-09-02 BlackLight <blacklight@autistici.org>
* all: Most of the code has been rewritten. A trained network can now
be saved to an XML file via the save() method, and loaded back from
that file using the constructor. This approach is much cleaner, and
easier to fix and customize too. The old approach through binary
files has still been kept, through the saveToBinary() and
loadFromBinary() methods, for backward-compatibility purposes, though
it's strongly deprecated. Moreover, the XML formats (for training the
network and for loading a trained network's information) are fully
compatible with the ones used by the NeuralPerl module, essentially a
Perl port of this library, so you can use the same XML files to train
and load a neural network from both C++ and Perl. A great step
forward has also been made in fixing the dirty old vulnerability
(random synaptic weight overflow) by keeping the weight values very
small and close to 0. I've run tons of tests and the network never
overflowed. Anyway, if you should experience an overflow (synaptic
weights going > 1), just write to me. I've also done a deep redesign
of the library's classes, making public only the methods you *REALLY*
need to be public.
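For example, the new save/load cycle could look like this (a minimal
sketch: save() and the NeuralNet(const char *fname) constructor come
from this ChangeLog, but the parameters used to build a new network,
and save() taking the destination file name, are assumptions here):

    #include <neural++.hpp>
    using namespace neuralpp;

    int main() {
        // Layer sizes, learning rate and epochs are assumed values,
        // not the library's documented defaults.
        NeuralNet net(2, 2, 1, 0.005, 1000);
        // ... train the network here ...
        net.save("network.xml");          // serialize to XML
        NeuralNet loaded("network.xml");  // reload from the same file
        return 0;
    }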
--- Release 0.4 ---
2009-08-16 BlackLight <blacklight@autistici.org>
* all: Now it *REALLY* supports multiple output values, i.e. multiple
neurons in the output layer.
Oh, and I've also fixed that nasty bug that broke training from files
containing more than a single training set.
Added examples to the Doxygen documentation.
* neuron.cpp: Fixed the propagate() function
2009-08-15 BlackLight <blacklight@autistici.org>
* Makefile: Neural++ now compiles with -Wall -pedantic
-pedantic-errors -ansi, and you won't get a single error. I love
writing pedantic, pure ISO C++ code...
* all: Hey, I can't believe it guys, I've fixed that horrible bug and
the neural output won't diverge anymore ^^
Also removed the 'deriv' function-pointer parameter. It was just
redundant: once you have an activation function, you can compute its
derivative at any point. But, of course, this makes old code no
longer compatible with this new version.
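To make that point concrete: given any activation function f, its
derivative at a point can be approximated numerically, so no separate
'deriv' pointer is needed. This is only an illustrative sketch, not
the library's actual code:

    #include <cmath>

    // Central-difference approximation of f'(x) for any activation
    // function f; the step h is an illustrative choice.
    double derivative(double (*f)(double), double x, double h = 1e-6) {
        return (f(x + h) - f(x - h)) / (2.0 * h);
    }

    // Example: the logistic sigmoid, whose exact derivative is
    // f'(x) = f(x) * (1 - f(x)).
    double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }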
--- Release 0.3 ---
2009-08-09 BlackLight <blacklight@autistici.org>
* Makefile: Totally changed
* neural++.hpp: Changed header name, added BETA0 macro
* synapsis.cpp: Added a momentum() method to compute the inertial
momentum of a synapsis (see the sketch after this entry)
* all: Changed the data type from float to double for everything;
fixed the neuralpp namespace, indentation, exception throwing, and
documentation
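For reference, the classic inertial-momentum term in backpropagation
scales the previous weight update by a momentum coefficient; whether
momentum() uses exactly this formula is an assumption, and the names
below are illustrative:

    // Full update: delta_w(t) = rate * gradient + alpha * delta_w(t-1).
    // The inertial part is the alpha * delta_w(t-1) term.
    double momentum_term(double alpha, double prev_delta_w) {
        return alpha * prev_delta_w;
    }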
--- Release 0.2.2 ---
2008-11-04 BlackLight <blacklight@autistici.org>
* all: Added initXML(), XMLFromSet() and closeXML() methods to
auto-generate training XML.
The train() method can now take XML both from a file and from a plain
string.
--- Release 0.2.1 ---
2008-10-22 BlackLight <blacklight@autistici.org>
* all: Added `train()` method to the NeuralNet class, which allows
you to train a neural network using an XML file containing pre-saved
training sets (input values and expected outputs). See
examples/adder.xml for an example of such a file.
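A usage sketch under assumed signatures (train() and
examples/adder.xml come from this entry; the constructor parameters
and the exact argument list of train() are guesses):

    #include <neural++.hpp>
    using namespace neuralpp;

    int main() {
        // Illustrative network shape for the adder example:
        // 2 inputs, one hidden layer, 1 output.
        NeuralNet net(2, 2, 1, 0.005, 1000);
        net.train("examples/adder.xml");  // train from pre-saved sets
        return 0;
    }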
--- Release 0.2 ---
2008-10-12 BlackLight <blacklight@autistici.org>
* all: Added `save()` method to the NeuralNet class, which allows you
to save a trained neural network to a binary file, to be loaded
whenever you need it using the NeuralNet(const char *fname)
constructor.
--- Release 0.01b ---
2008-04-03 BlackLight <blacklight@autistici.org>
* all: First beta release ^^
Copyright 2008, 2009 BlackLight
Copying and distribution of this file, with or without modification,
are permitted provided the copyright notice and this notice are
preserved.