up version

This commit is contained in:
2023-05-28 10:59:29 +07:00
parent 526ffa94be
commit fe35066a94
2 changed files with 46 additions and 49 deletions


@@ -43,28 +43,29 @@ using .learn
[5] training should include adjusting α, the neuron membrane potential decay factor,
which is defined by the neuron.tau_m formula in type.jl
[DONE] each knowledgeFn should have its own noise generator
[DONE] where to put pseudo derivative (n.phi)
[DONE] add excitatory, inhibitory to neuron
[DONE] implement "start learning", "reset learning", "learning", "end_learning" and
Change from version: 0.0.1
- each knowledgeFn should have its own noise generator
- where to put pseudo derivative (n.phi)
- add excitatory, inhibitory to neuron
- implement "start learning", "reset learning", "learning", "end_learning" and
"inference"
[DONE] output neurons connect to multiple random compute neurons and overall have
- output neurons connect to multiple random compute neurons and overall have
the same structure as lif
[DONE] time-based learning method based on new error formula
- time-based learning method based on new error formula
(use output vt compared to vth instead of late time)
if an output neuron does not activate when it should, use that neuron's
(vth - vt)*100/vth as the error;
if an output neuron activates when it should NOT, use that neuron's
(vt*100)/vth as the error
[DONE] use LinearAlgebra.normalize!(vector, 1) to adjust weight after weight merge
[DONE] reset_epsilonRec after ΔwRecChange is calculated
[DONE] synaptic connection strength concept. use sigmoid, turn connection offline
[DONE] wRec should not be normalized as a whole; it should be normalized locally per 5-connection group
[DONE] neuroplasticity() i.e. change connection
[DONE] add multi threads
[DONE] during training on digit 0, if any of the 1-9 output neurons fires, adjust the weights of only those neurons
[DONE] add maximum weight cap of each connection
[DONE] a weaker connection should be harder to strengthen. It requires a lot of
- use LinearAlgebra.normalize!(vector, 1) to adjust weight after weight merge
- reset_epsilonRec after ΔwRecChange is calculated
- synaptic connection strength concept. use sigmoid, turn connection offline
- wRec should not be normalized as a whole; it should be normalized locally per 5-connection group
- neuroplasticity() i.e. change connection
- add multi threads
- during training on digit 0, if any of the 1-9 output neurons fires, adjust the weights of only those neurons
- add maximum weight cap of each connection
- a weaker connection should be harder to strengthen. It requires a lot of
repeated activation to get stronger, while a strong connection requires a lot of
inactivation to get weaker. The concept: a strong connection will lock in the
correct neural pathway through repeated use of the right connections, i.e. keep training
@@ -72,9 +73,6 @@ using .learn
this correct neural pathway resists change.
Unused connections should disappear (forgetting).
Change from version: v06_36a
-
All features
- multiple dispatch + for loop as the main compute method
- hard connection constraint: yes
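The "normalize!(vector, 1)" and "local 5 conn" notes above can be sketched as follows. This is a Python illustration of the idea, not the project's Julia code; `normalize_l1` mimics Julia's `LinearAlgebra.normalize!(v, 1)`, and grouping connections in fixed index blocks of five is an assumption here.

```python
def normalize_l1(v):
    # L1-normalize a list in place, like Julia's LinearAlgebra.normalize!(v, 1)
    s = sum(abs(x) for x in v)
    if s > 0:
        for i in range(len(v)):
            v[i] /= s
    return v

def normalize_local(w_rec, group=5):
    # Normalize w_rec per group of `group` connections instead of as a whole,
    # per the note "it should be local 5 conn normalized" (index-block
    # grouping is an assumption for illustration).
    for start in range(0, len(w_rec), group):
        chunk = w_rec[start:start + group]
        normalize_l1(chunk)
        w_rec[start:start + group] = chunk
    return w_rec

w = [1.0, 1.0, 2.0, 0.0, 1.0]
normalize_local(w)  # each 5-connection block now sums to 1
```

Local normalization keeps one strongly-weighted region from draining weight mass out of every other connection in the network, which whole-vector normalization would do.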

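The voltage-based error rule in the changelog (compare the output neuron's membrane potential vt against its threshold vth instead of spike timing) can be sketched like this. Python is used for illustration of the Julia notes; the function name is hypothetical, while `vt`/`vth` follow the changelog's symbols.

```python
def output_error(vt, vth, should_fire):
    # Error formula from the notes:
    #   silent when it should fire  -> (vth - vt) * 100 / vth
    #   fires when it should NOT    -> vt * 100 / vth
    #   correct behaviour           -> no error
    if should_fire and vt < vth:
        return (vth - vt) * 100 / vth
    if not should_fire and vt >= vth:
        return vt * 100 / vth
    return 0.0

# threshold 1.0, membrane potential 0.25, target says "fire"
print(output_error(0.25, 1.0, True))   # 75.0
print(output_error(1.5, 1.0, False))   # 150.0
```

Expressed this way, the error is a percentage of the threshold, so it gives a graded signal even on trials where no spike occurred at all, which pure spike-latency errors cannot.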

@@ -32,7 +32,7 @@ using .learn
# using .interface
#------------------------------------------------------------------------------------------------100
""" version 0.0.2
""" version 0.0.3
Todo:
[*2] implement connection strength based on right or wrong answers
[*1] how to manage how much the connection strength increases and decreases
@@ -43,36 +43,7 @@ using .learn
[5] training should include adjusting α, the neuron membrane potential decay factor,
which is defined by the neuron.tau_m formula in type.jl
[DONE] each knowledgeFn should have its own noise generator
[DONE] where to put pseudo derivative (n.phi)
[DONE] add excitatory, inhibitory to neuron
[DONE] implement "start learning", "reset learning", "learning", "end_learning" and
"inference"
[DONE] output neurons connect to multiple random compute neurons and overall have
the same structure as lif
[DONE] time-based learning method based on new error formula
(use output vt compared to vth instead of late time)
if an output neuron does not activate when it should, use that neuron's
(vth - vt)*100/vth as the error;
if an output neuron activates when it should NOT, use that neuron's
(vt*100)/vth as the error
[DONE] use LinearAlgebra.normalize!(vector, 1) to adjust weight after weight merge
[DONE] reset_epsilonRec after ΔwRecChange is calculated
[DONE] synaptic connection strength concept. use sigmoid, turn connection offline
[DONE] wRec should not be normalized as a whole; it should be normalized locally per 5-connection group
[DONE] neuroplasticity() i.e. change connection
[DONE] add multi threads
[DONE] during training on digit 0, if any of the 1-9 output neurons fires, adjust the weights of only those neurons
[DONE] add maximum weight cap of each connection
[DONE] a weaker connection should be harder to strengthen. It requires a lot of
repeated activation to get stronger, while a strong connection requires a lot of
inactivation to get weaker. The concept: a strong connection will lock in the
correct neural pathway through repeated use of the right connections, i.e. keep training
on the correct answer -> strengthen the right neural pathway (connections) ->
this correct neural pathway resists change.
Unused connections should disappear (forgetting).
Change from version: v06_36a
Change from version: 0.0.2
-
All features
@@ -87,6 +58,34 @@ using .learn
population encoding, relative between pixel data
- compute neuron weight init rand()
- output neuron weight init randn()
- each knowledgeFn should have its own noise generator
- where to put pseudo derivative (n.phi)
- add excitatory, inhibitory to neuron
- implement "start learning", "reset learning", "learning", "end_learning" and
"inference"
- output neurons connect to multiple random compute neurons and overall have
the same structure as lif
- time-based learning method based on new error formula
(use output vt compared to vth instead of late time)
if an output neuron does not activate when it should, use that neuron's
(vth - vt)*100/vth as the error;
if an output neuron activates when it should NOT, use that neuron's
(vt*100)/vth as the error
- use LinearAlgebra.normalize!(vector, 1) to adjust weight after weight merge
- reset_epsilonRec after ΔwRecChange is calculated
- synaptic connection strength concept. use sigmoid, turn connection offline
- wRec should not be normalized as a whole; it should be normalized locally per 5-connection group
- neuroplasticity() i.e. change connection
- add multi threads
- during training on digit 0, if any of the 1-9 output neurons fires, adjust the weights of only those neurons
- add maximum weight cap of each connection
- a weaker connection should be harder to strengthen. It requires a lot of
repeated activation to get stronger, while a strong connection requires a lot of
inactivation to get weaker. The concept: a strong connection will lock in the
correct neural pathway through repeated use of the right connections, i.e. keep training
on the correct answer -> strengthen the right neural pathway (connections) ->
this correct neural pathway resists change.
Unused connections should disappear (forgetting).
"""
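The sigmoid connection-strength idea above ("use sigmoid, turn connection offline", weak connections hard to strengthen, strong ones hard to weaken) can be sketched as follows. This is a Python illustration under stated assumptions, not the project's Julia implementation: the `Connection` class, the step size, and the 0.05 offline floor are all hypothetical choices made for the example.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Connection:
    """Strength kept as a raw score passed through a sigmoid.

    Because the sigmoid saturates, a weak connection (very negative
    score) needs many repeated activations before its strength moves,
    and a strong one needs many inactivations to weaken: the "locking"
    behaviour described in the changelog.
    """
    def __init__(self, score=0.0):
        self.score = score

    @property
    def strength(self):
        return sigmoid(self.score)

    @property
    def online(self):
        # connections below a strength floor are turned offline
        # (forgetting / neuroplasticity); 0.05 is an assumed floor
        return self.strength >= 0.05

    def activate(self, step=0.5):
        self.score += step   # repeated use -> stronger

    def inactivate(self, step=0.5):
        self.score -= step   # disuse -> weaker, eventually offline
```

Because the update is additive in score space but strength lives on the sigmoid, one activation moves a mid-strength connection far more than an already-weak or already-strong one, which is exactly the asymmetry the notes ask for.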