module Ironpen

export kfn_1, synapticConnStrength!

""" Order the included files by dependency: the first included file must not depend
on any other file, and each file may only depend on the files included before it.
"""

include("types.jl")
using .types # bring the model types into this module's namespace (this module is their parent)

include("snn_utils.jl")
using .snn_utils

# include("Save_and_load.jl")
# using .Save_and_load

# include("DB_services.jl")
# using .DB_services

include("forward.jl")
using .forward

include("learn.jl")
using .learn

include("readout.jl")
using .readout

include("interface.jl")
using .interface
#------------------------------------------------------------------------------------------------100

"""
Todo:
[] use RL to control the learning signal
[] consider using Dates.now() instead of a timestamp, because time_stamp may overflow
[] training should include adjusting α, the neuron membrane-potential decay factor,
   which is defined by the neuron.tau_m formula in types.jl

[DONE] each knowledgeFn should have its own noise generator
[DONE] where to put the pseudo-derivative (n.phi)
[DONE] add excitatory/inhibitory to the neuron
[DONE] implement "start learning", "reset learning", "learning", "end_learning" and
       "inference"
[DONE] output neurons connect to multiple random compute neurons and overall have
       the same structure as LIF
[DONE] time-based learning method based on the new error formula
       (compare the output vt to vth instead of using spike lateness)
       if the output neuron does not activate when it should, use the output neuron's
       (vth - vt)*100/vth as the error
       if the output neuron activates when it should NOT, use the output neuron's
       (vt*100)/vth as the error
[DONE] use LinearAlgebra.normalize!(vector, 1) to adjust weights after a weight merge
[DONE] reset_epsilonRec after ΔwRecChange is calculated
[DONE] synaptic connection strength concept: use a sigmoid; turn connections offline
[DONE] wRec should not be normalized as a whole; it should be normalized locally over
       each group of 5 connections
[DONE] neuroplasticity() i.e. change connection
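The time-based error formula noted above can be sketched in isolation. The function
name, signature, and field handling below are illustrative, not the module's actual API:

```julia
# Sketch of the time-based error formula from the note above.
# `vt` is the output neuron's membrane potential at readout time and
# `vth` its firing threshold; `should_fire` is the teacher signal.
function output_error(vt::Float64, vth::Float64, should_fire::Bool)
    fired = vt >= vth
    if should_fire && !fired
        return (vth - vt) * 100 / vth   # missed spike: distance below threshold, in percent
    elseif !should_fire && fired
        return vt * 100 / vth           # spurious spike: potential relative to threshold
    else
        return 0.0                      # correct behaviour, no error
    end
end
```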
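The sigmoid-based connection strength idea above could look like this minimal sketch.
The exported `synapticConnStrength!` presumably mutates connection state in place;
the function names and the 0.1 cutoff here are assumptions, not taken from the module:

```julia
# Sketch: a sigmoid squashes a raw, unbounded strength parameter into (0, 1);
# connections whose squashed strength falls below a cutoff are taken offline.
sigmoid(x) = 1 / (1 + exp(-x))

# A connection stays online while its squashed strength is above the cutoff.
connection_online(raw_strength; cutoff = 0.1) = sigmoid(raw_strength) >= cutoff
```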
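Normalizing wRec in local groups of 5 connections, rather than over the whole vector,
might be sketched as follows. The group size comes from the note above; the function
name is hypothetical:

```julia
using LinearAlgebra

# Sketch: rescale recurrent weights block-by-block so that each local group of
# `group` connections has a 1-norm of 1, instead of normalizing wRec as a whole.
function normalize_local!(wRec::Vector{Float64}; group::Int = 5)
    for i in 1:group:length(wRec)
        block = @view wRec[i:min(i + group - 1, end)]
        normalize!(block, 1)   # in-place: divide the block by its 1-norm
    end
    return wRec
end
```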

Change from version: v06_36a
-

All features
- multidispatch + for loops as the main compute method
- hard connection constraint: yes
- normalize output: yes
- allow negative w_rec: yes
- voltage drop when a neuron fires: the drop equals vth
- v_t decays during the refractory period (exponential decay)
- input data population encoding: each pixel's data =>
  population encoding, relative between pixel data
- compute neuron weight init rand()
- output neuron weight init randn()
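The refractory-period decay in the feature list could be sketched as a standard
exponential relaxation step, assuming the common v ← v·exp(-Δt/τ) form; the
function name and constants below are illustrative:

```julia
# Sketch: exponential decay of the membrane potential `v` over one step of
# length `dt`, as applied during the refractory period. `tau_m` is the
# membrane time constant (cf. neuron.tau_m in types.jl).
decay_step(v, dt, tau_m) = v * exp(-dt / tau_m)
```

After `tau_m` time units the potential has relaxed to 1/e of its value,
e.g. `decay_step(1.0, 20.0, 20.0) ≈ 0.37`.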
"""
end # module end