* @example examples/learnAdd.cpp Shows how to train a network that performs sums between
* two real numbers. The training XML is built from scratch and saved to a file; the network
* is then initialized from that XML file, trained, and saved to adder.net. See doAdd.cpp
* for how to load that file and use the trained network.
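*
* A minimal sketch of the flow described above, not taken verbatim from learnAdd.cpp: the
* class and method names used here (NeuralNet, initXML, XMLFromSet, closeXML, train, save)
* and their parameters are assumptions about the neuralpp API.
* @code
* #include <neural++.hpp>   // assumed header name
* #include <fstream>
* #include <string>
*
* int main() {
*     // Assumed constructor: input neurons, hidden neurons, output neurons,
*     // learning rate, training epochs.
*     neuralpp::NeuralNet net(2, 2, 1, 0.005, 1000);
*
*     // Build the training XML from scratch: each set maps the inputs "x,y"
*     // to the expected output "x+y". Helper names are hypothetical.
*     std::string xml;
*     neuralpp::NeuralNet::initXML(xml);
*     xml += neuralpp::NeuralNet::XMLFromSet(0, "2,3;5");
*     xml += neuralpp::NeuralNet::XMLFromSet(1, "4,1;5");
*     neuralpp::NeuralNet::closeXML(xml);
*
*     // Save the XML, train the network from that file, then save the
*     // trained network to adder.net.
*     std::ofstream out("adder.xml");
*     out << xml;
*     out.close();
*     net.train("adder.xml", neuralpp::NeuralNet::file);
*     net.save("adder.net");
*     return 0;
* }
* @endcode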
* @example examples/doAdd.cpp Shows how to use a network that has already been trained and
* saved to a binary file. In this case it is a network trained to perform sums between two
* real numbers, which should already have been created by learnAdd.
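*
* A minimal sketch of loading and querying the saved network; the constructor-from-file,
* setInput(), propagate() and getOutput() calls are assumed names for the neuralpp API:
* @code
* #include <neural++.hpp>   // assumed header name
* #include <iostream>
* #include <vector>
*
* int main() {
*     // Assumed: the constructor restores a trained network from the binary file.
*     neuralpp::NeuralNet net("adder.net");
*
*     std::vector<double> in;
*     in.push_back(2.0);
*     in.push_back(3.0);
*
*     net.setInput(in);      // feed the two addends
*     net.propagate();       // run a forward pass
*     std::cout << "2 + 3 ~= " << net.getOutput() << std::endl;
*     return 0;
* }
* @endcode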
* @example examples/adderFromScratch.cpp Similar to learnAdd.cpp, except that the training
* XML is generated as a string rather than saved to a file, and is parsed by the program
* itself to build the network. The program then asks for two real numbers and computes both
* their sum and their difference, putting the sum on the first output neuron and the
* difference on the second. Note, however, that using more than one neuron in the output
* layer is strongly discouraged, as the network usually will not set the synaptic weights
* accurately enough to give satisfying answers for all of the operations.
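*
* A minimal sketch of the string-based variant; the NeuralNet::str training source and
* getOutputs() (returning one value per output neuron) are assumptions about the API:
* @code
* #include <neural++.hpp>   // assumed header name
* #include <iostream>
* #include <string>
* #include <vector>
*
* int main() {
*     // Assumed constructor: 2 inputs, 2 hidden neurons, 2 outputs (sum and difference).
*     neuralpp::NeuralNet net(2, 2, 2, 0.005, 1000);
*
*     // Training XML kept in memory as a string: inputs "x,y" map to the two
*     // expected outputs "x+y,x-y". Helper names are hypothetical.
*     std::string xml;
*     neuralpp::NeuralNet::initXML(xml);
*     xml += neuralpp::NeuralNet::XMLFromSet(0, "3,2;5,1");
*     neuralpp::NeuralNet::closeXML(xml);
*
*     net.train(xml, neuralpp::NeuralNet::str);   // assumed: parse the string directly
*
*     double a, b;
*     std::cin >> a >> b;
*     std::vector<double> in;
*     in.push_back(a);
*     in.push_back(b);
*     net.setInput(in);
*     net.propagate();
*
*     std::vector<double> out = net.getOutputs(); // assumed: out[0] = sum, out[1] = difference
*     std::cout << a << " + " << b << " ~= " << out[0] << std::endl;
*     std::cout << a << " - " << b << " ~= " << out[1] << std::endl;
*     return 0;
* }
* @endcode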