pytorch-sbs

SbS Extension for PyTorch

Based on these scientific papers

Back-Propagation Learning in Deep Spike-By-Spike Networks
David Rotermund and Klaus R. Pawelzik
Front. Comput. Neurosci. (2019), https://doi.org/10.3389/fncom.2019.00055
https://www.frontiersin.org/articles/10.3389/fncom.2019.00055/full

Efficient Computation Based on Stochastic Spikes
Udo Ernst, David Rotermund, and Klaus Pawelzik
Neural Computation (2007) 19 (5): 1313-1343. https://doi.org/10.1162/neco.2007.19.5.1313
https://direct.mit.edu/neco/article-abstract/19/5/1313/7183/Efficient-Computation-Based-on-Stochastic-Spikes

Python

The code was developed with Python 3.10.4 and uses some Python 3.10 syntax. You may therefore run into problems with older Python versions.
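A quick way to check that your interpreter is recent enough; this is only a minimal sketch, with 3.10 taken as the minimum from the note above:

    import sys

    # The SbS extension was developed with Python 3.10.4; older interpreters may fail.
    if sys.version_info < (3, 10):
        raise RuntimeError(f"Python 3.10+ required, found {sys.version}")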

C++

You need to adapt the Makefile in the C++ directory to your Python installation.

In addition, your Python installation needs the PyBind11 package. You can install it with
pip install pybind11
The Makefile uses clang as the compiler; if you want to use a different compiler, you need to change the Makefile. For CUDA, version 12.0 was used.
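If you are unsure which paths to put into the Makefile, pybind11 and the standard library can report them. The snippet below is only a sketch for locating the relevant include directories and the extension suffix; the actual variable names used in this repository's Makefile may differ:

    import sysconfig
    import pybind11  # requires: pip install pybind11

    # Include directory of the pybind11 headers.
    print("pybind11 includes:", pybind11.get_include())

    # Include directory of the Python headers (Python.h).
    print("Python includes:  ", sysconfig.get_paths()["include"])

    # Suffix the compiled extension module must carry,
    # e.g. ".cpython-310-x86_64-linux-gnu.so".
    print("extension suffix: ", sysconfig.get_config_var("EXT_SUFFIX"))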

Config files and pre-existing weights

Three .json config files are required:

dataset.json : Information about the dataset

network.json : Describes the network architecture

def.json : Controls the other parameters

If you want to load existing weights, put them into a sub-folder called Previous.
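A minimal sketch of how these files and the weight folder could be picked up from the working directory; the file names follow the list above, but the directory layout and any keys inside the JSON files are assumptions, not the repository's actual API:

    import json
    from pathlib import Path

    # The three configuration files described above.
    with open("dataset.json", "r") as f:
        dataset_config = json.load(f)   # information about the dataset
    with open("network.json", "r") as f:
        network_config = json.load(f)   # network architecture
    with open("def.json", "r") as f:
        default_config = json.load(f)   # the other parameters

    # Pre-existing weights are expected in a sub-folder called "Previous".
    previous_dir = Path("Previous")
    weight_files = sorted(previous_dir.glob("*")) if previous_dir.is_dir() else []
    print(f"Found {len(weight_files)} weight file(s) in {previous_dir}/")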