03. Building and simulating a simple model


tutorial under development

Learning objectives

In this tutorial, you will learn how to build a simple network of integrate-and-fire neurons using PyNN, how to run simulation experiments with this network using different simulators, and how to visualize the data generated by these experiments.

Audience

This tutorial is intended for people with at least a basic knowledge of neuroscience (high-school level or above) and basic familiarity with the Python programming language. It should also be helpful for people with advanced knowledge of neuroscience and neural simulation who simply wish to learn how to use PyNN and how it differs from other simulation tools they already know.

Prerequisites

To follow this tutorial, you need a basic knowledge of neuroscience (high-school level or greater), basic familiarity with the Python programming language, and either a computer with PyNN, NEST, NEURON and Brian 2 installed, or an EBRAINS account and basic familiarity with Jupyter notebooks. If you don't have these tools installed, see one of our previous tutorials which guide you through the installation.

Format

This tutorial will be a video combining slides, animations, and screencast elements. The intended duration is 10 minutes.

Script

Slide showing tutorial title, PyNN logo, link to PyNN service page.

Hello, my name is X.

This video is one of a series of tutorials for PyNN, which is Python software for modelling and simulating spiking neural networks.

For a list of the other tutorials in this series, you can visit ebrains.eu/service/pynn, that's p-y-n-n.

Slide listing learning objectives

In this tutorial, you will learn the basics of PyNN: how to build a simple network of integrate-and-fire neurons using PyNN, how to run simulation experiments with this network using different simulators, and how to visualize the data generated by these experiments.

Slide listing prerequisites

To follow this tutorial, you need a basic knowledge of neuroscience (high-school level or greater), basic familiarity with the Python programming language, and you should have already followed our earlier tutorial video which guides you through the installation process.

This video covers PyNN 0.10. If you've installed a more recent version of PyNN, you might want to look for an updated version of this video.

Slide showing animation of leaky integrate-and-fire model

PyNN is a tool for building models of nervous systems, and parts of nervous systems, at the level of individual neurons and synapses.

We'll start off creating a group of 100 neurons, using a really simple model of a neuron, the leaky integrate-and-fire model.

When we inject positive current into this model, either from an electrode or from an excitatory synapse, it increases the voltage across the cell membrane, until the voltage reaches a certain threshold.

At that point, the neuron produces an action potential, also called a spike, and the membrane voltage is reset.
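
As an aside, the dynamics just described can be written in a few lines of plain Python using forward-Euler integration; this is only an illustrative sketch with assumed parameter values, since PyNN and the simulators do all of this for you:

dt = 0.1          # time step (ms)
tau_m = 10.0      # membrane time constant (ms)
v_rest = -65.0    # resting potential (mV)
v_thresh = -55.0  # spike threshold (mV)
v_reset = -65.0   # reset potential (mV)
r_m = 10.0        # membrane resistance (megohms), assumed value
i_inj = 1.5       # injected current (nA), assumed value

v = v_rest
spike_times = []
for step in range(int(100.0 / dt)):                    # simulate 100 ms
    v += (-(v - v_rest) + r_m * i_inj) * dt / tau_m    # leaky integration towards v_rest + r_m * i_inj
    if v >= v_thresh:                                  # threshold reached: the neuron spikes
        spike_times.append(step * dt)                  # record the spike time (ms)
        v = v_reset                                    # reset the membrane (refractory period omitted)

print(spike_times)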

Screencast - blank document in editor

In this video, you'll see my editor on the left, and on the right my terminal and my file browser. I'll be writing code in the editor, and then running my scripts in the terminal. You're welcome to follow along---you can pause the video at any time if I'm going too fast---or you can just watch.

Let's start by writing a docstring, "Simple network model using PyNN".

For now, we're going to use the NEST simulator to simulate this model, so we import the PyNN-for-NEST module.

Like with any numerical model, we need to break time down into small steps, so let's set that up with steps of 0.1 milliseconds.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim
sim.setup(timestep=0.1)

PyNN comes with a selection of integrate-and-fire models. We're going to use the IF_curr_exp model, where "IF" is for integrate-and-fire, "curr" means that synaptic responses are changes in current, and "exp" means that the shape of the current is a decaying exponential function.
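
As a side note, if you want to see the full list of parameters for this cell type and their default values, you can inspect the class in an interactive Python session; a quick sketch:

import pyNN.nest as sim

print(sim.IF_curr_exp.default_parameters)       # parameter names and default values (units are mV, nF, ms and nA)
print(sim.IF_curr_exp.default_initial_values)   # default initial values of the state variables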

This is where we set the parameters of the model: the resting membrane potential is -65 millivolts, the spike threshold is -55 millivolts, the reset voltage after a spike is again -65 millivolts, the refractory period after a spike is one millisecond, the membrane time constant is 10 milliseconds, and the membrane capacitance is 1 nanofarad. We're also going to inject a constant bias current of 1.1 nanoamps into these neurons, so that we get some action potentials.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim
sim.setup(timestep=0.1)

cell_type = sim.IF_curr_exp(v_rest=-65, v_thresh=-55, v_reset=-65, tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)

Let's create 100 of these neurons, then we're going to record the membrane voltage, and run a simulation for 100 milliseconds.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim
sim.setup(timestep=0.1)

cell_type = sim.IF_curr_exp(v_rest=-65, v_thresh=-55, v_reset=-65, tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population1.record("v")
sim.run(100.0)


Run script in terminal

PyNN has some built-in tools for making simple plots, so let's import those, and plot the membrane voltage of the zeroth neuron in our population (remember Python starts counting at zero).

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(v_rest=-65, v_thresh=-55, v_reset=-65, tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population1.record("v")
sim.run(100.0)

data_v = population1.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Response of neuron #0",
    annotations="Simulated with NEST"
).show()


Run script in terminal, show figure

As you'd expect, the bias current causes the membrane voltage to increase until it reaches threshold---it doesn't increase in a straight line because it's a leaky integrate-and-fire neuron---then once it hits the threshold the voltage is reset, and then stays at the same level for a short time---this is the refractory period---before it starts to increase again.

Now, all 100 neurons in our population are identical, so if we plotted the first neuron, the second neuron, ..., we'd get the same trace.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(v_rest=-65, v_thresh=-55, v_reset=-65, tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population1.record("v")
sim.run(100.0)

data_v = population1.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0:5],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Response of 
first five neurons",
    annotations="Simulated with NEST"
).show()


Run script in terminal, show figure

Let's change that. In nature every neuron is a little bit different, so let's set the resting membrane potential and the spike threshold randomly from a Gaussian distribution.
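
As a side note, if you want the randomly drawn parameter values to be reproducible from one run to the next, you can pass an explicitly seeded random number generator to RandomDistribution; a minimal sketch, with an arbitrary seed value:

from pyNN.random import NumpyRNG, RandomDistribution

rng = NumpyRNG(seed=4242)                      # arbitrary seed, fixed for reproducibility
v_rest_distribution = RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}, rng=rng)
print(v_rest_distribution.next(5))             # draw five sample values to inspect the distribution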

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
from pyNN.random import RandomDistribution
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(
    v_rest=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    v_thresh=RandomDistribution('normal', {'mu': -55.0, 'sigma': 1.0}),
    v_reset=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population1.record("v")
sim.run(100.0)

data_v = population1.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0:5],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Response of first five neurons 
with heterogeneous parameters",
    annotations="Simulated with NEST"
).show()


Run script in terminal, show figure

Now if we run our simulation again, we can see the effect of this heterogeneity in the neuron population.

Slide showing addition of second population, and of connections between them

So far we have a population of neurons, but there are no connections between them, so we don't yet have a network. Let's add a second population of the same size as the first, but we'll set the offset current to zero, so they don't fire action potentials spontaneously.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
from pyNN.random import RandomDistribution
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(
    v_rest=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    v_thresh=RandomDistribution('normal', {'mu': -55.0, 'sigma': 1.0}),
    v_reset=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population2 = sim.Population(100, cell_type, label="Population 2")
population2.set(i_offset=0)

population1.record("v")
population2.record("v")
sim.run(100.0)
data_v = population1.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0:5],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Response of first five neurons with heterogeneous parameters",
    annotations="Simulated with NEST"
).show()

Now we want to create synaptic connections between the neurons in Population 1 and those in Population 2. There are lots of different ways these could be connected.

Slide showing all-to-all connections

We could connect all neurons in Population 1 to all those in Population 2.

Slide showing random connections

We could connect the populations randomly, in several different ways.

Slide showing distance-dependent connections

We could connect the populations randomly, but with a probability of connection that depends on the distance between the neurons.

Slide showing explicit lists of connections

Or we could connect the neurons in a very specific manner, based on an explicit list of connections.

Just as PyNN provides a variety of neuron models, so it comes with a range of connection algorithms built in. You can also add your own connection methods.
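
For reference, here is roughly how those different options look in code. This is just a sketch; the probability value, the distance expression, and the connection list are made up for illustration:

import pyNN.nest as sim

# all-to-all: every neuron in the presynaptic population connects to every neuron in the postsynaptic one
all_to_all = sim.AllToAllConnector()

# random: each possible connection is created with a fixed probability
random_connections = sim.FixedProbabilityConnector(p_connect=0.3)

# distance-dependent: connection probability decays with the distance d between the neurons
distance_dependent = sim.DistanceDependentProbabilityConnector("exp(-d)")

# explicit list of (presynaptic index, postsynaptic index) pairs
explicit_list = sim.FromListConnector([(0, 1), (0, 2), (1, 2)])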

Slide showing addition of second population, and of connections between them, labelled as a Projection.

In PyNN, we call a group of connections between two populations a _Projection_. To create a Projection, we need to specify the presynaptic population, the postsynaptic population, the connection algorithm, and the synapse model. Here we're using the simplest synapse model available in PyNN, for which the synaptic weight is constant over time; there is no plasticity.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
from pyNN.random import RandomDistribution
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(
    v_rest=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    v_thresh=RandomDistribution('normal', {'mu': -55.0, 'sigma': 1.0}),
    v_reset=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population2 = sim.Population(100, cell_type, label="Population 2")
population2.set(i_offset=0)
population1.record("v")
population2.record("v")

connection_algorithm = sim.FixedProbabilityConnector(p_connect=0.5)
synapse_type = sim.StaticSynapse(weight=0.5, delay=0.5)
connections = sim.Projection(population1, population2, connection_algorithm, synapse_type)

sim.run(100.0)
data_v = population1.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0:5],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Response of first five neurons with heterogeneous parameters",
    annotations="Simulated with NEST"
).show()
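
As a quick check, once the Projection has been created you can ask it which connections were actually made; a small sketch, assuming the script above has been run:

# with format='list', each entry is a (presynaptic index, postsynaptic index, weight) tuple
weights = connections.get('weight', format='list')
print(len(weights), "connections created")
print(weights[:5])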

Finally, let's update our figure, by adding a second panel to show the responses of Population 2.

Screencast - current state of editor

"""Simple network model using PyNN"""

import pyNN.nest as sim

from pyNN.utility.plotting import Figure, Panel
from pyNN.random import RandomDistribution
sim.setup(timestep=0.1)
cell_type = sim.IF_curr_exp(
    v_rest=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    v_thresh=RandomDistribution('normal', {'mu': -55.0, 'sigma': 1.0}),
    v_reset=RandomDistribution('normal', {'mu': -65.0, 'sigma': 1.0}),
    tau_refrac=1, tau_m=10, cm=1, i_offset=1.1)
population1 = sim.Population(100, cell_type, label="Population 1")
population2 = sim.Population(100, cell_type, label="Population 2")
population2.set(i_offset=0)
population1.record("v")
population2.record("v")

connection_algorithm = sim.FixedProbabilityConnector(p_connect=0.5)
synapse_type = sim.StaticSynapse(weight=0.5, delay=0.5)
connections = sim.Projection(population1, population2, connection_algorithm, synapse_type)

sim.run(100.0)
data_v = population1.get_data().segments[0].filter(name='v')[0]
data_v2 = population2.get_data().segments[0].filter(name='v')[0]
Figure(
    Panel(
        data_v[:, 0:5],
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    Panel(
        data_v2[:, 0:5],
        xticks=True, xlabel="Time (ms)",
        yticks=True, ylabel="Membrane potential (mV)"
    ),
    title="Responses of first five neurons in each population",
    annotations="Simulated with NEST"
).show()

Run script in terminal, show figure

Summary (In this tutorial, you have learned to do X…)

.

Acknowledgements if appropriate

.

References to websites (For more information, visit us at…)

.

Contact information (For questions, contact us at…)

.