
# Organising the paper

We are currently in the process of writing up the paper. If you are one of the contributors and would like to be recognised as an author on this paper, please make yourself known to us: either email Dan Goodman or join the SNUFA Discord channel #sound-localisation-paper (or both).

The current plan for writing up the paper is as follows:

  1. Gather information on all contributors (names, email addresses, institutions, etc.). The current plan is that Marcus will be first author and I will be last author, with no decision yet on the rest of the author order. Please share your thoughts, including whether or not you disagree with the first/last author placements (it’s fine if you disagree; this is only a provisional decision). If nobody has strong feelings, we may simply randomise the rest of the order.
  2. Clean up and merge all notebooks into main. Please note that I have renamed some of the notebooks, moved some into the research folder, and changed the headings on some to give them better descriptions. You may have conflicts that need resolving in your pull request.
  3. Marcus and I will write a first draft of the main body of the paper, which will attempt to summarise everything as well as describe the process. You can see the results so far online as we write.
  4. If you want to write up your section in more detail, including methods, results, etc., these write-ups will be included in the Appendices part of the paper.
  5. Every notebook will be included in the Supplementary Materials section of the paper.
  6. I am happy to discuss any of the decisions above. I’ve only made them in order to get things going quickly!
  7. We’ll do several rounds of iteration and comments. We will also continue our monthly meetings and discuss at those.
  8. We’ll submit a preprint of the paper to arXiv or bioRxiv.
  9. We’ll try to get this submitted to a journal.

## Actions

### Referencing

If you are contributing to the paper, it helps if we all stick to a standard way of handling references (citations and internal cross-references).

Code for figures should look something like this:

```{figure} ../research/diagrams/arch-stimuli.png
:label: basic-arch
:width: 100%

Overall model architecture.
```

The path is relative to the file paper/paper.md. You can reference this figure in the text by writing something like:

See {ref}`basic-arch`

You can even do subpanels of figures and reference those, e.g.:

```{figure}
:label: basic-results
:width: 100%

(confusion-matrix)=
![Confusion matrix.](sections/basicmodel/confusion.png)

(hidden-firing-rates)=
![Hidden neuron firing rates.](sections/basicmodel/hidden-firing-rates.png)

Results of training the network with $f=50$ Hz, $\tau=2$ ms, $N_\psi=100$, $N_h=8$, $N_c=12$. Mean absolute IPD errors are $\sim 2.6$ deg.
```
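
Each subpanel label should then be referenceable in the same way as a whole figure, for example:

See {ref}`confusion-matrix`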

You can also label and reference section headings, which might look like this:

(basic-model)=
## A minimal trainable model of IPD processing

and then reference it in the text as:

See {ref}`basic-model`

For citing papers, add the paper to paper.bib using standard BibTeX notation, and then reference it in one of the following ways:

- [@Zenke2018] for a single reference, which will render as (Zenke et al. 2018).
- [@Zenke2018; @Yin2019] for multiple references (don’t write [@Zenke2018][@Yin2019], which renders badly).
- {cite:t}`Zenke2018` for citing as part of a sentence, which will render as Zenke et al. (2018).
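
As a reminder of the format, an entry in paper.bib is an ordinary BibTeX record. A minimal sketch for the Zenke2018 key used above might look like the following (the field values are illustrative, so please check them against the actual publication):

```bibtex
@article{Zenke2018,
  author  = {Zenke, Friedemann and Ganguli, Surya},
  title   = {SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks},
  journal = {Neural Computation},
  volume  = {30},
  number  = {6},
  pages   = {1514--1541},
  year    = {2018}
}
```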

## Current known contributors

If you add a contribution, please use one of the following templates (see examples below):

  • Wrote the paper (plus which section if you would like to specify)
  • Conducted research (please give a link to your notebook formatted like this [](../research/3-Starting-Notebook.ipynb), or specify another sort of contribution)
  • Supervised research (please give the name of your supervisee)

Table 1: Contributors, ordered by GitHub commits as of 2024-07-16.

| Name | GitHub | Contribution |
| --- | --- | --- |
| Tomas Fiers | @tfiers | Built the website infrastructure, created Figure 2 based on Dan’s sketch of the model architecture. |
| Dan Goodman | @thesamovar | Conceived the project, wrote the paper, wrote and recorded the Cosyne tutorial. Conducted research (Starting Notebook, Analysing performance and solutions as time constants change). |
| Marcus Ghosh | @ghoshm | Managed the project, wrote the paper, conducted research (Quick Start Notebook, Sound localisation following Dale’s law), gave the Cosyne tutorial. |
| Francesco De Santis | @francescodesantis | Conducted research (Inhibition Model Notebook) and wrote the paper (Contralateral glycinergic inhibition as a key factor in creating ITD sensitivity). |
| Dilay Fidan Erçelik | @dilayercelik | Conducted research (Quick Start Notebook, Quick Start Notebook with 250 Hz input). |
| Pietro Monticone | @pitmonticone | Cleaned paper and notebooks. |
| Karim Habashy | @KarimHabashy | Conducted research (Learning delays, Learning delays (v2), Vanilla sound localization problem with a single delay layer (non-spiking)), wrote the paper (Learning delays), project management (Quick Start Notebook). |
| Balázs Mészáros | @mbalazs98 | Wrote the paper (DCLS based delay learning in the appendix). Conducted research (Noise offsets in every iteration, Dilated Convolution with Learnable Spacings). |
| Mingxuan Hong | @mxhong | Conducted research (Altering Output Neurons, Dynamic threshold). |
| Rory Byrne | @rorybyrne | Organised the source code structure, conducted research (Improving Performance: Optimizing the membrane time constant). |
| Sara Evers | @saraevers | Conducted research (Analysing Dale’s law and distribution of excitatory and inhibitory neurons). |
| Zach Friedenberger | @ZachFriedenberger | Conducted research (Improving Performance: Optimizing the membrane time constant). |
| Helena Yuhan Liu | @Helena-Yuhan-Liu | Conducted research (Analysis: thresholding W1W2 plot). |
| Jose Gomes (Portugal, PhD) | @JoseGomesJPG | Conducted research (Sound localisation following Dale’s law). |
| (Unknown) | @a-dtk | Conducted research (Robustness to Noise and Dropout). |
| Ido Aizenbud | @ido4848 | Conducted research (Filter-and-Fire Neuron Model). |
| Sebastian Schmitt | @schmitts | Conducted research (background on neuromorphic hardware in Background). |
| Rowan Cockett | @rowanc1 | MyST technical support. |
| Alberto Antonietti | @alberto-antonietti | Supervised Francesco De Santis, wrote the paper (Contralateral glycinergic inhibition as a key factor in creating ITD sensitivity). |
| Juan Luis Riquelme | @luis-rr | Conducted research (Sound localisation with excitatory-only inputs surrogate gradient descent). |
| Adam Haber | @adamhaber | Conducted research (Compute hessians (jax version)). |
| Gabriel Béna | @GabrielBena | Conducted research (Analysing trained networks - workshop edition, Sound localisation following Dale’s law). |
| Peter Crowe | @pfcrowe | Conducted research (Improving Performance: Optimizing the membrane time constant). |
| Umar Abubacar | @UmarAbubacar | Conducted research (TCA Analysis) and wrote the paper (Tensor component analysis). |
| Gabryel Mason-Williams | None/unknown | Conducted research (Analysing trained networks - workshop edition). |
| Josh Bourne | None/unknown | Conducted research (Analysing trained networks - workshop edition). |
| Zekai Xu | None/unknown | Conducted research (Analysing trained networks - workshop edition). |
| Leonidas Richter | None/unknown | Conducted research (Learning delays). |
| Chen Li | None/unknown | Conducted research (Improving Performance: Optimizing the membrane time constant). |
| Brendan Bicknell | None/unknown | Supervised Dilay Fidan Erçelik. |
| Volker Bormuth | None/unknown | Developed teaching materials and used the project to teach two university courses. Supervised Marcus Ghosh & students at Sorbonne University. |

## Notebook map

The following lists the notebooks in this project, along with their authors, a brief summary, and related notebooks.

### Introductory notebooks

Background
: Explanation of the background. (Author: Dan Goodman.)

Questions & challenges
: List of research questions and challenges. (Author: everyone.)

### Templates / starting points

Starting Notebook
: The template notebook suggested as a starting point, based on the Cosyne tutorial that kicked off this project. (Author: Dan Goodman.)

Quick Start Notebook
: Condensed version of Starting Notebook using the shorter membrane time constants from Improving Performance: Optimizing the membrane time constant and Dale’s law from Sound localisation following Dale’s law. (Authors: Dilay Fidan Erçelik, Karim Habashy, Marcus Ghosh.)

### Individual notebooks

Filter-and-Fire Neuron Model
: Using an alternative neuron model. (Author: Ido Aizenbud, based on work by Dilay Fidan Erçelik.)

Altering Output Neurons
: Comparison of three different ways of reading out the network’s decision (average membrane potential, maximum membrane potential, spiking outputs) with short and long time constants. (Author: Mingxuan Hong.)

Analysing trained networks - workshop edition
: Group project from an early workshop looking at hidden unit spiking activity and single unit ablations. Found that some hidden neurons don’t spike, and ablating those does not harm performance. Builds on (WIP) Analysing trained networks. (Authors: Gabriel Béna, Josh Bourne, Tomas Fiers, Tanushri Kabra, Zekai Xu.)

Sound localisation following Dale’s law
: Investigation into the results of imposing Dale’s law. Incorporated into Quick Start Notebook. Uses a fix from Analysing Dale’s law and distribution of excitatory and inhibitory neurons. (Authors: Marcus Ghosh, Gabriel Béna, Jose Gomes.)

Dynamic threshold
: Adds an adaptive threshold to the neuron model and compares results. Conclusion is that the dynamic threshold does not help in this case. (Author: Mingxuan Hong.)

Sound localisation with excitatory-only inputs surrogate gradient descent
: Results of imposing an excitatory-only constraint on the neurons. Appears to find solutions that are more like what would be expected from the Jeffress model. (Author: Juan Luis Riquelme.)

Learning delays, Learning delays (v2) and Vanilla sound localization problem with a single delay layer (non-spiking)
: Delay learning using a differentiable delay layer, written up in Learning delays. (Author: Karim Habashy.)

Dilated Convolution with Learnable Spacings
: Delay learning using Dilated Convolution with Learnable Spacings, written up in Learning delays. (Author: Balázs Mészáros.)

Robustness to Noise and Dropout
: Tests the effects of adding Gaussian noise and/or dropout during the training phase. Conclusion is that dropout does not help and adding noise decreases performance. (Author: unknown (@a-dtk).)

Version with 250 Hz input, Quick Start Notebook with 250 Hz input
: Analysis of results with a higher frequency input stimulus and different membrane time constants for hidden and output layers. Conclusion is that a smaller time constant matters for the hidden layer but not for the output layer. (Author: Dilay Fidan Erçelik.)

Analysing performance and solutions as time constants change
: Deeper analysis of strategies found by trained networks as time constants vary. Added firing rate regularisation. Extends Improving Performance: Optimizing the membrane time constant. Written up in more detail in A minimal trainable model of IPD processing. (Author: Dan Goodman.)

Workshop 1 Write-up
: Write-up of what happened at the first workshop. (Author: Marcus Ghosh.)

### Inconclusive

The following notebooks did not reach a solid conclusion.

Compute hessians (jax version)
: An unfinished attempt to perform sensitivity analysis using Hessian matrices computed via autodifferentiation with the JAX library. (Author: Adam Haber.)

Noise offsets in every iteration
: Analysis of an alternative way of handling noise. (Author: Balázs Mészáros.)

Analysis: thresholding W1W2 plot
: Unfinished attempt to improve analysis code. (Author: Helena Yuhan Liu.)

### Historical

This subsection lists notebooks whose content was later merged into an updated notebook.

(WIP) Analysing trained networks
: Early work on analysing the strategies learned by trained networks. Folded into Analysing trained networks - workshop edition. (Author: Dan Goodman.)

Improving Performance: Optimizing the membrane time constant
: Analyses how performance depends on the membrane time constant. Folded into Analysing performance and solutions as time constants change. (Authors: Zach Friedenberger, Chen Li, Peter Crowe.)

Analysing Dale’s law and distribution of excitatory and inhibitory neurons
: Fixed a mistake in an earlier version of Sound localisation following Dale’s law. (Author: Sara Evers.)