start version 0.0.6

2023-06-20 10:15:47 +07:00
parent 57efefc8e3
commit 90446210ac
63 changed files with 4404 additions and 26 deletions


@@ -32,16 +32,38 @@ using .learn
# using .interface
#------------------------------------------------------------------------------------------------100
""" version 0.0.5
""" version 0.0.6
Todo:
[4] implement dormant connection
[] implement dormant connection
[] using RL to control learning signal
[] consider using Dates.now() instead of time_stamp, because time_stamp may overflow
[5] training should include adjusting α, neuron membrane potential decay factor
[] training should include adjusting α, neuron membrane potential decay factor
    which is defined by the neuron.tau_m formula in type.jl
Change from version: 0.0.4
Change from version: 0.0.5
-
All features
- synapticStrength applied at the end of learning
- collect ΔwRecChange during online learning (0th-784th step) and merge with wRec at
  the end of learning (800th step)
- multidispatch + for loop as main compute method
- allow -w_rec yes
- voltage drop when a neuron fires; the drop equals vRest
- v_t decay during refractory
- input data population encoding: each pixel value =>
  a population code, relative between pixel values
- compute neuron weight init rand()
- output neuron weight init randn()
- compute pseudo derivative (n.phi) every time step
- add excitatory, inhibitory to neuron
- implement "start learning", "reset learning", "learning", "end_learning" and
  "inference"
- synaptic connection strength concept: use sigmoid; can turn a connection offline
- neuroplasticity(), i.e. changing connections
- add multithreading
- compute model error in the main loop, so one can decide when and how to
  calculate error in the training sequence
- fix ALIF adaptation formula; n.a is now computed every time step
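Two of the items above — collecting ΔwRecChange online and merging it into wRec only at the end of the sequence, and computing the pseudo-derivative (n.phi) every time step — can be sketched as follows. This is an illustrative Python sketch, not the project's Julia code; the network size, learning rate, and the triangular surrogate-gradient shape are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                                # number of recurrent neurons (illustrative)
w_rec = rng.standard_normal((n, n))  # recurrent weights (randn init, as for output weights)
dw_rec = np.zeros_like(w_rec)        # ΔwRec accumulator, collected during online learning

def pseudo_derivative(v, v_th=1.0, gamma=0.3):
    # Surrogate gradient φ, computed every time step: an assumed triangular
    # function of the membrane potential's distance from threshold.
    return gamma * np.maximum(0.0, 1.0 - np.abs((v - v_th) / v_th))

for t in range(800):                 # online phase: steps 0..799
    v = rng.uniform(0.0, 2.0, n)     # stand-in membrane potentials
    phi = pseudo_derivative(v)
    # Accumulate weight changes instead of applying them immediately.
    dw_rec += 1e-4 * np.outer(phi, phi)

w_rec += dw_rec                      # merge the accumulated changes at the end (step 800)
```

The point of the split is that the recurrent weights stay fixed while the sequence is running, so every step sees the same wRec; only the merge at the end changes the network.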
@@ -65,28 +87,6 @@ using .learn
on the correct answer -> strengthen the right neural pathway (connections) ->
this correct neural pathway resists change.
Unused connections should disappear (forgetting).
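The forgetting idea above pairs with the sigmoid-based connection strength in the feature list: each connection carries a strength score, the sigmoid of that score gates the weight, and a connection whose gated strength falls below a cutoff goes offline. A minimal Python sketch, with hypothetical field names (w, strength, used) and thresholds that are not from the source:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical per-connection record: raw weight plus a trainable strength score.
connections = [
    {"w": 0.8, "strength": 3.0, "used": True},    # frequently used pathway
    {"w": 0.5, "strength": -2.5, "used": False},  # rarely used pathway
]

REINFORCE = 0.5    # strength gained when the connection is used
DECAY = 0.5        # strength lost when it is not
OFFLINE_AT = 0.05  # below this gated strength the connection goes offline

def neuroplasticity(conns):
    """Strengthen used connections, decay unused ones, and prune (forget)."""
    for c in conns:
        c["strength"] += REINFORCE if c["used"] else -DECAY
        c["online"] = sigmoid(c["strength"]) > OFFLINE_AT

neuroplasticity(connections)
# Effective weight: sigmoid(strength) gates the raw weight; offline => 0.
effective = [sigmoid(c["strength"]) * c["w"] if c["online"] else 0.0
             for c in connections]
```

Because sigmoid saturates, a frequently reinforced connection sits on the flat upper part of the curve and resists change, while an unused one slides toward zero and is eventually taken offline — the "forgetting" described above.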
All features
- synapticStrength applied at the end of learning
- collect ΔwRecChange during online learning (0th-784th step) and merge with wRec at
  the end of learning (800th step)
- multidispatch + for loop as main compute method
- allow -w_rec yes
- voltage drop when a neuron fires; the drop equals vRest
- v_t decay during refractory
- input data population encoding: each pixel value =>
  a population code, relative between pixel values
- compute neuron weight init rand()
- output neuron weight init randn()
- compute pseudo derivative (n.phi) every time step
- add excitatory, inhibitory to neuron
- implement "start learning", "reset learning", "learning", "end_learning" and
  "inference"
- synaptic connection strength concept: use sigmoid; can turn a connection offline
- neuroplasticity(), i.e. changing connections
- add multithreading
Removed features