tether package¶
Subpackages¶
- tether.data package
- tether.functional package
- tether.kernels package
- tether.nn package
Module contents¶
- class tether.ALIF(n_neurons, decay_v=0.9, decay_a=0.9, threshold=1.0, beta=0.5, alpha=2.0, store_traces=False)[source]¶
Bases: Module
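The signature suggests a two-state adaptive LIF neuron: `decay_v` governs the membrane potential and `decay_a` an adaptation variable that raises the effective threshold after each spike. The exact update rule is not documented here, so the following single-neuron sketch shows the usual ALIF dynamics; the threshold-adaptation form and the subtractive reset are assumptions, not tether's confirmed behavior.

```python
def alif_step(v, a, input_current, decay_v=0.9, decay_a=0.9,
              threshold=1.0, beta=0.5):
    """One adaptive LIF step (sketch; tether's exact dynamics may differ).

    a is an adaptation variable that raises the effective threshold
    after each spike, producing spike-frequency adaptation.
    """
    v = decay_v * v + input_current
    eff_threshold = threshold + beta * a        # adapted threshold (assumed form)
    spike = 1.0 if v >= eff_threshold else 0.0
    v = v - spike * eff_threshold               # subtractive reset (assumed)
    a = decay_a * a + spike                     # adaptation accumulates with spikes
    return spike, v, a
```

After a spike, `a` jumps and the effective threshold rises, so an identical input that triggered the first spike may fail to trigger the next one.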
- class tether.Arctan(alpha=2.0, trainable=False)[source]¶
Bases: Surrogate
Arctan surrogate gradient.
The surrogate derivative is given by:
\[f'(x) = \frac{1}{1 + (\alpha \pi x)^2}\]
where x is the normalized membrane potential (v - threshold).
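The derivative above can be evaluated directly; this standalone sketch (not tether's internal implementation) just transcribes the formula:

```python
import math

def arctan_surrogate_grad(x, alpha=2.0):
    """Arctan surrogate derivative: f'(x) = 1 / (1 + (alpha * pi * x)^2).

    x is the normalized membrane potential (v - threshold).
    """
    return 1.0 / (1.0 + (alpha * math.pi * x) ** 2)
```

The gradient peaks at 1.0 exactly at the threshold crossing (x = 0) and decays symmetrically on either side, so neurons far from threshold receive almost no gradient.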
- class tether.FastSigmoid(alpha=2.0, trainable=False)[source]¶
Bases: Surrogate
Fast Sigmoid (approximate) surrogate gradient.
Uses a computationally cheaper approximation of the sigmoid derivative:
\[f'(x) = \frac{1}{(1 + |\alpha x|)^2}\]
This avoids expensive exponential operations.
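As a standalone sketch (not tether's internal code), the formula reduces to an absolute value, a multiply, and a divide, with no call to `exp`:

```python
def fast_sigmoid_grad(x, alpha=2.0):
    # f'(x) = 1 / (1 + |alpha * x|)^2: cheap to evaluate because
    # no exponential is ever computed, unlike the true sigmoid derivative.
    return 1.0 / (1.0 + abs(alpha * x)) ** 2
```

At the threshold (x = 0) this also peaks at 1.0, but its tails decay more slowly than the true sigmoid derivative, which can help gradients reach neurons further from threshold.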
- class tether.LIF(n_neurons, decay=0.9, threshold=1.0, alpha=2.0, surrogate=None, store_traces=False)[source]¶
Bases: Module
- property alpha¶
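The docstring does not spell out the update rule, but a standard leaky integrate-and-fire step using the documented `decay` and `threshold` parameters looks roughly like this single-neuron sketch; the subtractive reset is an assumption (tether may reset to zero instead):

```python
def lif_step(v, input_current, decay=0.9, threshold=1.0):
    """One leaky integrate-and-fire step for a single neuron (sketch).

    Returns (spike, new_membrane_potential).
    """
    v = decay * v + input_current            # leaky integration
    spike = 1.0 if v >= threshold else 0.0   # Heaviside spike generation
    v = v - spike * threshold                # subtractive reset (assumed)
    return spike, v

# Drive a neuron with constant sub-threshold input: the membrane
# potential accumulates over several steps, then spikes periodically.
v = 0.0
spikes = []
for _ in range(10):
    s, v = lif_step(v, 0.5)
    spikes.append(s)
```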
- class tether.PLIF(n_neurons, init_decay=0.9, init_threshold=1.0, alpha=2.0, surrogate=None, store_traces=False)[source]¶
Bases: Module
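The `init_decay` and `init_threshold` names suggest a parametric LIF neuron whose decay is learned rather than fixed. A common way to keep a learnable decay inside (0, 1) is to train a raw parameter in logit space and squash it with a sigmoid; that parameterization is an assumption here, not tether's documented behavior:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

class PLIFCell:
    """Parametric LIF sketch with a learnable decay (hypothetical names).

    raw_decay would be a trainable parameter in a real framework; the
    sigmoid squashing guarantees 0 < decay < 1 throughout training.
    """

    def __init__(self, init_decay=0.9, init_threshold=1.0):
        self.raw_decay = logit(init_decay)
        self.threshold = init_threshold

    @property
    def decay(self):
        return 1.0 / (1.0 + math.exp(-self.raw_decay))

    def step(self, v, x):
        v = self.decay * v + x
        spike = 1.0 if v >= self.threshold else 0.0
        return spike, v - spike * self.threshold
```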
- class tether.Sigmoid(alpha=2.0, trainable=False)[source]¶
Bases: Surrogate
Sigmoid surrogate gradient.
The surrogate function is a sigmoid, and its derivative is:
\[f'(x) = \alpha \cdot \sigma(\alpha x) \cdot (1 - \sigma(\alpha x))\]
where \(\sigma\) is the logistic sigmoid function and x is the membrane potential gap.
- class tether.SpikingSelfAttention(dim, num_heads=8, decay=0.9, threshold=1.0)[source]¶
Bases: Module
- class tether.Surrogate(alpha=2.0, trainable=False)[source]¶
Bases: Module
Base class for surrogate gradient functions used in Spiking Neural Networks.
Surrogate gradients allow for backpropagation through the non-differentiable Heaviside step function used for spike generation.
- Parameters:
alpha (float, optional) – Scaling parameter that controls the steepness/width of the surrogate derivative. Default is 2.0.
trainable (bool, optional) – If True, alpha becomes a learnable parameter. Default is False.
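The pattern these classes implement is: the forward pass uses the hard, non-differentiable Heaviside step, while the backward pass substitutes the smooth surrogate derivative. A minimal pure-Python sketch of that straight-through idea, without any autograd framework (function names are hypothetical, and the arctan form is used as the stand-in derivative):

```python
import math

def heaviside(x):
    # Non-differentiable spike function used in the forward pass.
    return 1.0 if x >= 0.0 else 0.0

def surrogate_grad(x, alpha=2.0):
    # Smooth arctan surrogate derivative, substituted for Heaviside's
    # true derivative (zero everywhere, undefined at the threshold).
    return 1.0 / (1.0 + (alpha * math.pi * x) ** 2)

def spike_with_grad(v, threshold=1.0, alpha=2.0):
    """Forward: hard spike. Also returns the gradient that would flow
    back to v under the surrogate (straight-through) rule."""
    x = v - threshold
    spike = heaviside(x)
    dspike_dv = surrogate_grad(x, alpha)
    return spike, dspike_dv
```

When `trainable=True`, alpha itself would additionally receive gradients, letting the network adapt the width of the surrogate during training.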