Mirror of https://github.com/BlackLight/neuralpp.git, synced 2025-07-15 02:48:07 +02:00
Making everything cooler
This commit is contained in:
parent: 458eab5e99
commit: d52976e74e
7 changed files with 41 additions and 21 deletions
12
BUGS
Normal file
@@ -0,0 +1,12 @@
Sometimes the training phase of the network breaks in the middle. This happens
because the synaptical weights are initialized with random values, and
sometimes updating them causes those values to become >= 1. This makes the
output values of the network diverge instead of converging to the desired,
expected values. The library recognizes this behaviour, and when a weight
becomes >= 1 it throws an InvalidSynapticalWeightException. So far there is no
way to prevent this odd, random behaviour. The network uses an inertial
momentum coefficient to damp strong oscillations in the training phase and
make this phenomenon rarer, but even with this mechanism there is a ~10%
chance of getting a diverging network, and hence a training phase aborted by
an InvalidSynapticalWeightException.