version 0.0.5-alpha
@@ -25,19 +25,22 @@ using .interface
#------------------------------------------------------------------------------------------------100
""" version 0.0.3

""" version 0.0.5

Todo:
[2] implement dormant connection and pruning mechanism. the longer the training, the longer a 0 weight stays 0.
[DONE] add excitatory/inhibitory matrix
[-] add temporal summation in addition to the already used spatial summation.
    CANCELLED: spatial summation every second until the membrane potential reaches a threshold is in itself a temporal summation.
[x] add neuroplasticity
[4] implement dormant connection and pruning mechanism. the longer the training, the longer a 0 weight stays 0.
[ ] use RL to control the learning signal
[ ] consider using Dates.now() instead of timestamp, because time_stamp may overflow
[ ] Liquid time constant: training should include adjusting α, the neuron membrane potential decay factor, which is defined by the neuron.tau_m formula in type.jl
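The cancelled temporal-summation item and the liquid-time-constant item both revolve around the same mechanism: membrane potential that decays by a factor α between ticks while spatially summed input accumulates on top of it. A minimal sketch of that idea (written in Python for a self-contained example, though the project itself is Julia; the names `membrane_step`, the default `tau_m`, and the threshold value are all hypothetical, not taken from type.jl):

```python
import math

def membrane_step(v, spatial_input, dt=1.0, tau_m=20.0, threshold=1.0):
    """One hypothetical tick: decay the membrane potential by
    alpha = exp(-dt / tau_m), add this tick's spatially summed input,
    and fire (resetting to 0) once the threshold is reached."""
    alpha = math.exp(-dt / tau_m)  # α, the decay factor tied to tau_m
    v = alpha * v + spatial_input
    if v >= threshold:
        return 0.0, True   # spike fired, membrane resets
    return v, False

# A sub-threshold input repeated over several ticks eventually fires:
# the potential carried over between ticks is the temporal summation.
v, fired, steps = 0.0, False, 0
while not fired:
    v, fired = membrane_step(v, 0.5)
    steps += 1   # fires on the 3rd tick with these parameters
```

Because α < 1, stale potential leaks away over time; making α (equivalently tau_m) trainable per neuron is exactly the "liquid time constant" idea in the last todo item.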
Change from version: 0.0.2
- knowledgeFn in GPU format
- use partial error update for computeNeuron
- frequency regulator

Change from version: 0.0.4
-

All features