Loading ophys data from the Visual Behavior project

This notebook shows how to load ophys data from the Visual Behavior project using AllenSDK tools. It briefly describes the types of data available and shows a few simple ways of plotting ophys traces along with the animal's behavior.

We will first install allensdk into your environment by running the appropriate commands below.

Install AllenSDK into your local environment

You can install AllenSDK locally with:
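In a local environment, the standard pip install (the package is published on PyPI as `allensdk`) looks like this:

```shell
pip install allensdk
```

Installing into a fresh virtual environment is recommended, since AllenSDK pins versions of several common dependencies.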

Install AllenSDK into your notebook environment (good for Google Colab)

You can install AllenSDK into your notebook environment by executing the cell below.

If you are using Google Colab, click the RESTART RUNTIME button that appears at the end of the output once this cell completes. Note that running this cell will produce a long list of output and some error messages; clicking RESTART RUNTIME at the end will resolve these issues. You can collapse the cell after you are done to hide the output.

Import required libraries

We need to import libraries for plotting and manipulating data.
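A minimal set of imports for this notebook might look like the following; the AllenSDK cache import is shown as a comment because it only works after the install step above:

```python
import numpy as np                 # array manipulation
import pandas as pd                # tabular data
import matplotlib.pyplot as plt    # plotting

# After installing allensdk, the project cache class used below is imported as:
# from allensdk.brain_observatory.behavior.behavior_project_cache import (
#     VisualBehaviorOphysProjectCache)
```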

Load data tables

This code block uses the behavior project cache (bpc) class to get the behavior and ophys tables.

In this notebook, we will use experiment_table to select experiments of interest and look at them in greater detail.

You can get experiment ids from the experiment table by subsetting the table using conditions on its various columns. Here, we select experiments from Sst mice only, from the novel Ophys session 4, with 0 prior exposures to the stimulus (meaning the session was not a retake).
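The subsetting itself is ordinary pandas boolean masking. The sketch below uses a toy stand-in for the real experiment table (column names follow the Visual Behavior schema, but the ids and values are invented for illustration):

```python
import pandas as pd

# Toy stand-in for the real ophys experiment table (values are fabricated).
experiment_table = pd.DataFrame({
    "ophys_experiment_id": [111, 222, 333, 444],
    "cre_line": ["Sst-IRES-Cre", "Vip-IRES-Cre", "Sst-IRES-Cre", "Sst-IRES-Cre"],
    "session_type": ["OPHYS_4_images_B", "OPHYS_4_images_B",
                     "OPHYS_1_images_A", "OPHYS_4_images_B"],
    "prior_exposures_to_image_set": [0, 0, 5, 2],
}).set_index("ophys_experiment_id")

# Combine conditions with & to subset the table.
selected_experiments = experiment_table[
    (experiment_table.cre_line == "Sst-IRES-Cre")
    & (experiment_table.session_type == "OPHYS_4_images_B")
    & (experiment_table.prior_exposures_to_image_set == 0)
]
```

With the real table, the same masking pattern applies unchanged; only the column values differ.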

Remember that any given experiment contains data from only one imaging plane. Some of these experiments come from the same imaging session. Here, we can check how many unique imaging sessions are associated with experiments selected above.
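Counting unique sessions is a one-liner with pandas; this sketch again uses fabricated ids in place of the real table:

```python
import pandas as pd

# Toy subset of an experiment table: each row is one imaging plane, and
# ophys_session_id ties planes back to the session they were recorded in.
selected_experiments = pd.DataFrame({
    "ophys_session_id": [10, 10, 10, 11],
    "targeted_structure": ["VISp", "VISl", "VISp", "VISp"],
})

n_sessions = selected_experiments["ophys_session_id"].nunique()
```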

Load an experiment

Let's pick a random experiment from the table and plot example ophys and behavioral data.
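Picking a random experiment id can be sketched as below; the ids are hypothetical, and the cache call (shown as a comment, per the AllenSDK API) is what would load the dataset in the real notebook:

```python
import numpy as np

# Hypothetical experiment ids; with the real table you would use
# selected_experiments.index.values instead.
experiment_ids = np.array([111, 222, 444])

rng = np.random.default_rng(seed=0)   # seed only for reproducibility here
experiment_id = int(rng.choice(experiment_ids))

# With the real cache, the dataset is then loaded with:
# dataset = cache.get_behavior_ophys_experiment(experiment_id)
```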

show metadata for this experiment

You can get additional information about this experiment from the metadata field of the dataset class. Here, you can see that this experiment used the Sst Cre line, in a female mouse 233 days old, recorded on the mesoscope (this is one of four imaging planes), at an imaging depth of 150 microns, in primary visual cortex (VISp). This experiment is also from an OPHYS 1 session using image set A.
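The metadata field behaves like a plain dictionary. This is a sketch of the kind of mapping it returns, with keys matching the fields described above and illustrative values:

```python
# Illustrative example of the dataset.metadata dictionary (values fabricated
# to match the experiment described in the text).
metadata = {
    "cre_line": "Sst-IRES-Cre",
    "sex": "F",
    "age_in_days": 233,
    "imaging_depth": 150,          # microns
    "targeted_structure": "VISp",  # primary visual cortex
    "session_type": "OPHYS_1_images_A",
}

print(metadata["targeted_structure"])
```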

plot max projection from this experiment

The max projection is a maximum-intensity image across the movie recorded during an imaging session: each pixel shows its brightest value over time. Plotting the max projection can give you a sense of how many neurons were visible during imaging and how clear and stable the imaging session was.
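Displaying the projection is a standard `imshow` call. The sketch below substitutes a random array for the real image; with the real dataset you would pass the max projection image instead:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # headless backend so this runs without a display
import matplotlib.pyplot as plt

# Synthetic stand-in for the dataset's 2-D max projection image.
max_projection = np.random.rand(512, 512)

fig, ax = plt.subplots(figsize=(5, 5))
ax.imshow(max_projection, cmap="gray")
ax.set_title("max intensity projection")
ax.axis("off")
```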

load cell specimen table with cells' imaging metrics

cell_specimen_table includes information about the x and y coordinates of each cell in the imaging plane, as well as how much correction was applied during the motion correction process.

cell_roi_id is a unique id assigned to each ROI during segmentation.

cell_specimen_id is a unique id assigned to each cell after cell matching: if we were able to identify and match the same cell across multiple sessions, it can be identified by its unique cell specimen id.

roi_mask is a boolean array that can be used to visualize where any given cell is in the imaging field.
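One handy use of roi_mask is overlaying all masks into a single image of the imaging field. The sketch below builds a toy cell_specimen_table with two invented ROIs; the column names follow the conventions described above:

```python
import numpy as np
import pandas as pd

# Two toy boolean ROI masks on a small 8x8 "imaging field".
mask_a = np.zeros((8, 8), dtype=bool)
mask_a[2:4, 2:4] = True
mask_b = np.zeros((8, 8), dtype=bool)
mask_b[5:7, 1:3] = True

# Toy cell_specimen_table, indexed by cell_specimen_id (ids fabricated).
cell_specimen_table = pd.DataFrame(
    {"cell_roi_id": [1001, 1002], "roi_mask": [mask_a, mask_b]},
    index=pd.Index([9001, 9002], name="cell_specimen_id"),
)

# Stack the masks and sum over cells to see every ROI in one image.
all_rois = np.sum(np.stack(cell_specimen_table["roi_mask"].values), axis=0)
```

With the real table, `plt.imshow(all_rois)` then shows where every segmented cell sits in the field of view.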

show dff traces for the first 10 cells in this experiment

The dff_traces dataframe contains dF/F traces for all neurons in this experiment, not aligned to any events in the task.

You can select rows by their enumerated number using the .iloc[] method:

Alternatively, you can use cell_specimen_id as an index to select cells with the .loc[] method:

If you don't want dff in pandas dataframe format, you can load the dff traces as an array, using the np.vstack function to stack the data into a cell-by-time array and .values to grab only the values in the dff column:
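All three access patterns can be sketched on a toy dff_traces frame (one row per cell, with the trace stored as an array in the dff column; the ids and traces are invented):

```python
import numpy as np
import pandas as pd

# Toy dff_traces: index is cell_specimen_id, "dff" holds one array per cell.
dff_traces = pd.DataFrame(
    {"cell_roi_id": [1001, 1002, 1003],
     "dff": [np.zeros(5), np.ones(5), np.full(5, 2.0)]},
    index=pd.Index([9001, 9002, 9003], name="cell_specimen_id"),
)

first_two = dff_traces.iloc[:2]                   # select by row position
one_cell = dff_traces.loc[9002, "dff"]            # select by cell_specimen_id
dff_array = np.vstack(dff_traces["dff"].values)   # cells x timepoints array
```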

show events traces for the first 10 cells in this experiment

The events table is similar to dff_traces, but the output provides traces of extrapolated events. Events are computed on unmixed dff traces for each cell as described in Giovannucci et al., 2019. The magnitude of events approximates the firing rate of neurons with a resolution of about 200 ms. The biggest advantage of using events over dff traces is that they exclude prolonged Ca transients that may contaminate neural responses to subsequent stimuli. You can also use filtered_events, which are events convolved with a filter created using the scipy.stats.halfnorm method.
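The filtering idea can be sketched as below: convolve an events trace with a half-normal kernel. The kernel length and scale here are illustrative choices, not the exact parameters AllenSDK uses:

```python
import numpy as np
from scipy import stats

# Toy events trace: a single detected event at frame 20.
events = np.zeros(100)
events[20] = 1.0

# Half-normal filter (shape parameters are illustrative), normalized to
# preserve the total event magnitude after convolution.
kernel = stats.halfnorm.pdf(np.arange(0, 5), scale=1.5)
kernel /= kernel.sum()

filtered_events = np.convolve(events, kernel, mode="same")
```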

lambda is computed from a Poisson distribution of events in the trace (think of it as the center of mass of the distribution; larger lambda means a higher "firing rate").

noise_std is a measure of variability in the events trace.

load ophys timestamps

The timestamps, in seconds, are the same for dff_traces and events.
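The key invariant is that there is one timestamp per imaging frame, shared by every trace. A fabricated example (the real frame rate depends on the rig; roughly 11 Hz per plane on the mesoscope):

```python
import numpy as np

# One timestamp per imaging frame, shared by dff_traces and events.
n_frames = 660
ophys_timestamps = np.linspace(0.0, 60.0, n_frames)   # seconds (fabricated)
dff_array = np.random.randn(3, n_frames) * 0.05       # 3 cells x frames

# Every trace has exactly one value per timestamp.
assert dff_array.shape[1] == ophys_timestamps.shape[0]
```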

Pick a cell and plot the traces

We can select a random cell from the experiment and plot its dff and events traces along with other behavioral and stimulus data.

We can see that, as expected, the events trace is much cleaner than dff and generally follows the big Ca transients really well. We can also see that this cell was not very active during our experiment. Each experiment has a 5-minute movie at the end, which often drives neural activity really well; we can see a notable increase in the cell's activity at the end of this experiment as well.

plot mouse running speed from this experiment

plot pupil area for the same experiment
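Both behavioral streams are dataframes of timestamps plus a value column. The sketch below fabricates running speed and pupil area data (column names follow AllenSDK conventions, values are made up) and plots them on shared time axes:

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")   # headless backend
import matplotlib.pyplot as plt

# Fabricated stand-ins for dataset.running_speed and dataset.eye_tracking.
t = np.linspace(0, 60, 600)
running_speed = pd.DataFrame(
    {"timestamps": t, "speed": np.abs(np.random.randn(600)) * 10})   # cm/s
eye_tracking = pd.DataFrame(
    {"timestamps": t, "pupil_area": 200 + 20 * np.random.randn(600)})

fig, (ax_run, ax_pupil) = plt.subplots(2, 1, sharex=True)
ax_run.plot(running_speed.timestamps, running_speed.speed)
ax_run.set_ylabel("speed (cm/s)")
ax_pupil.plot(eye_tracking.timestamps, eye_tracking.pupil_area)
ax_pupil.set_ylabel("pupil area")
ax_pupil.set_xlabel("time (s)")
```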

You can find all attributes and methods that belong to the dataset class using this helpful function:

You can learn more about them by calling help on them:
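The pattern is Python's built-in dir() and help(); demonstrated here on a pandas class so the sketch is self-contained, but with the real data you would call dir(dataset):

```python
import pandas as pd

# dir() lists everything an object exposes; filtering out private names
# keeps the list readable.
public_names = [name for name in dir(pd.DataFrame)
                if not name.startswith("_")]

# help() prints the docstring of any attribute or method, e.g.:
# help(dataset.stimulus_presentations)
```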

Get information about visual stimuli presented on each trial

get stimulus information for this experiment and assign it to a table called stimulus_table

This table provides helpful information like the image name, the start time, stop time, and duration of each image presentation, and whether the image was omitted.

You can also use the keys() method to see the column names of any pandas dataframe:
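A toy stimulus_table with the columns described above (image names and times fabricated) makes both points concrete:

```python
import pandas as pd

# Toy stimulus table; "omitted" flags presentations where no image was shown.
stimulus_table = pd.DataFrame({
    "image_name": ["im065", "im077", "omitted"],
    "start_time": [10.00, 10.75, 11.50],   # seconds
    "stop_time":  [10.25, 11.00, 11.75],
    "omitted":    [False, False, True],
})
stimulus_table["duration"] = (stimulus_table.stop_time
                              - stimulus_table.start_time)

column_names = list(stimulus_table.keys())   # same as list(df.columns)
```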

Get task and behavioral data for each trial

get behavioral trial information and assign it to trials_table

This table has information about the experiment's trials. go trials are change trials on which the animal was supposed to lick. If the animal licked, hit is set to True for that trial. If the animal was rewarded, reward_time contains the reward time in seconds. If this was an auto-rewarded trial (regardless of whether the animal got it right), auto_rewarded is set to True. The trials table also includes response_latency, which can be used as the animal's reaction time during the experiment.
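A toy trials table with the fields described above (values fabricated) shows a typical use: computing the mean reaction time on hit trials.

```python
import numpy as np
import pandas as pd

# Toy trials table; NaN/inf mark trials with no reward or no response.
trials_table = pd.DataFrame({
    "go":            [True, True, False],
    "hit":           [True, False, False],
    "auto_rewarded": [False, False, False],
    "reward_time":   [12.3, np.nan, np.nan],      # seconds
    "response_latency": [0.41, np.inf, np.nan],   # seconds
})

# Mean reaction time across hit trials only.
mean_latency = trials_table.loc[trials_table.hit, "response_latency"].mean()
```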

Plot an example of one selected cell

Now we will put together a plotting function that uses data in the dataset class to plot ophys traces and behavioral data from an experiment.
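The overall shape of such a function can be sketched as below: stacked axes sharing a time axis, with stimulus presentations drawn as shaded spans. All inputs here are fabricated arrays; with the real dataset you would pass the cell's trace, the running speed, and the stimulus start times instead.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # headless backend
import matplotlib.pyplot as plt

def plot_cell_with_behavior(timestamps, dff, run_t, run_speed,
                            stim_starts, stim_duration=0.25):
    """Plot a dF/F trace over running speed, shading stimulus times."""
    fig, axes = plt.subplots(2, 1, sharex=True, figsize=(8, 4))
    axes[0].plot(timestamps, dff, color="k")
    axes[0].set_ylabel("dF/F")
    axes[1].plot(run_t, run_speed, color="gray")
    axes[1].set_ylabel("speed (cm/s)")
    axes[1].set_xlabel("time (s)")
    for start in stim_starts:                 # shade each image presentation
        for ax in axes:
            ax.axvspan(start, start + stim_duration, alpha=0.2)
    return fig

# Fabricated example inputs.
t = np.linspace(0, 30, 330)
fig = plot_cell_with_behavior(
    t, np.random.randn(330) * 0.05,
    t, np.abs(np.random.randn(330)) * 10,
    stim_starts=np.arange(0, 30, 0.75))
```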

From looking at the activity of this neuron, we can see that it was very active during our experiment, but its activity does not appear to be reliably locked to image presentations. It does seem to loosely follow the animal's running speed, so it might be modulated by running.

Vip cell example

We can get a different experiment, from a Vip mouse in Ophys session 1, and plot it to compare response traces. This gives us a similar plot from a different inhibitory neuron so we can compare their neural dynamics.

We can see that the dynamics of this Vip neuron are also not obviously driven by the visual stimuli. Aligning neural activity to different behavioral or experimental events might reveal what drives this neuron.