Example behavior and ophys data

The following example shows how to access behavioral data for a given recording session and how to align it with the corresponding neural data.

We will first install AllenSDK into your environment by running the appropriate commands below.

Install AllenSDK into your local environment

You can install AllenSDK locally with:
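For example, from a terminal (ideally inside a dedicated virtual environment):

```bash
pip install allensdk
```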

Install AllenSDK into your notebook environment (good for Google Colab)

You can install AllenSDK into your notebook environment by executing the cell below.
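A typical notebook install cell looks like this (the leading `!` runs the command in a shell):

```python
!pip install allensdk
```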

If using Google Colab, click the RESTART RUNTIME button that appears at the end of the output when this cell completes. Note that running this cell will produce a long list of output, including some error messages; clicking RESTART RUNTIME resolves them. You can minimize the cell afterward to hide the output.

Imports
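A typical set of imports for this notebook; we assume the Visual Behavior Ophys project cache as the data access point (the import path below may differ across AllenSDK versions):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

from allensdk.brain_observatory.behavior.behavior_project_cache import VisualBehaviorOphysProjectCache
```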

Make notebook use full screen width
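A common snippet for this, which targets the classic Jupyter Notebook layout (JupyterLab ignores it):

```python
from IPython.display import display, HTML

# Widen the classic Jupyter Notebook container to the full browser width
display(HTML("<style>.container { width:100% !important; }</style>"))
```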

Look at a sample of the experiment table
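A sketch of loading the table, assuming the S3-backed project cache; the cache directory below is a placeholder path you should point at local storage:

```python
# Placeholder path: choose a directory with enough free space for the cache
output_dir = "/path/to/allen_cache"
cache = VisualBehaviorOphysProjectCache.from_s3_cache(cache_dir=output_dir)

# One row per ophys experiment (i.e. per imaging plane)
experiment_table = cache.get_ophys_experiment_table()
experiment_table.sample(5)
```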

Here are all of the unique session types:
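Assuming a session_type column in the experiment table:

```python
experiment_table['session_type'].unique()
```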

Select an OPHYS_1_images_A experiment at random and load the experiment data
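One way to do this (random_state is fixed here only for reproducibility):

```python
# Filter to OPHYS_1_images_A sessions and pick one experiment at random
experiment_id = (
    experiment_table
    .query('session_type == "OPHYS_1_images_A"')
    .sample(1, random_state=0)
    .index[0]
)

# Download (on first access) and load this experiment's data
ophys_experiment = cache.get_behavior_ophys_experiment(experiment_id)
```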

Look at the performance data
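Using the experiment object's performance-metrics convenience method:

```python
ophys_experiment.get_performance_metrics()
```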

We can see that the d-prime metric, a measure of discrimination performance, peaked at 2.14 during this session, indicating mid-range performance.
(A d' of 0 means no discrimination ability, while d' is infinite for perfect performance; in practice it is limited to about 4.5 in this dataset due to trial count limitations.)
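For reference, d' is defined from the hit rate (HR) and false alarm rate (FAR) via the inverse of the standard normal CDF, Z:

$$ d' = Z(\mathrm{HR}) - Z(\mathrm{FAR}) $$

With a finite number of trials, HR and FAR must be clipped away from 0 and 1, which is what caps the attainable d'.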

We can build a trial dataframe that tells us about behavior events on every trial. This can be merged with a rolling performance dataframe, which calculates behavioral performance metrics over a rolling window of 100 trials (excluding aborted trials, or trials where the animal licks prematurely).
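A sketch of that merge; we assume here that the two dataframes share a trial index, which you should verify in your SDK version:

```python
# Merge trial-by-trial behavior events with rolling performance metrics.
# Assumes the rolling performance dataframe is aligned to the trials index;
# rows for aborted trials will carry NaN performance values.
trials = ophys_experiment.trials.merge(
    ophys_experiment.get_rolling_performance_df(),
    left_index=True,
    right_index=True,
)
trials.head()
```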

Now we can plot performance over the full experiment duration
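For example, assuming the merged trials dataframe now includes a rolling_dprime column alongside start_time:

```python
fig, ax = plt.subplots(figsize=(15, 4))
ax.plot(trials['start_time'], trials['rolling_dprime'])
ax.set_xlabel('trial start time (s)')
ax.set_ylabel("rolling d'")
ax.set_title('behavioral performance across the session')
plt.show()
```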

Some key observations:

We can also look at a dataframe of stimulus presentations. This tells us the attributes of every stimulus that was shown in the session.
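For example (we take a copy so we can add columns to it below):

```python
stimulus_presentations = ophys_experiment.stimulus_presentations.copy()
stimulus_presentations.head()
```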

Also note that there is an image name called 'omitted'. These entries represent the times at which a stimulus would have been shown, had it not been omitted from the regular stimulus cadence. They are included here for ease of analysis, but it's important to note that they are not actual stimuli; they mark the absence of an expected stimulus.

For plotting purposes below, let's add a column that specifies a unique color for every unique image:
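One simple approach, using a matplotlib qualitative colormap:

```python
# Map every unique image name (including 'omitted') to a distinct color
cmap = plt.cm.tab10
unique_images = stimulus_presentations['image_name'].unique()
color_map = {image: cmap(i % 10) for i, image in enumerate(unique_images)}
stimulus_presentations['color'] = stimulus_presentations['image_name'].map(color_map)
```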

There are also dataframes containing running speed, licks, eye tracking, and neural data:

running speed

One entry for each read of the analog input line monitoring the encoder voltage, polled at ~60 Hz.
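For example (we assume columns named timestamps and speed, with speed in cm/s):

```python
ophys_experiment.running_speed.head()
```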

licks

One entry for every detected lick onset time, assigned the time of the corresponding visual stimulus frame.
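For example:

```python
ophys_experiment.licks.head()
```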

eye tracking data

One entry containing ellipse fit parameters for the eye, pupil and corneal reflection for every frame of the eye tracking video stream.
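For example:

```python
ophys_experiment.eye_tracking.head()
```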

and deltaF/F values

One row per cell, each containing an array of deltaF/F values.
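For example:

```python
ophys_experiment.dff_traces.head()
```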

We can convert dff_traces to long-form (aka "tidy") format as follows:
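A minimal sketch, assuming dff_traces is indexed by cell_specimen_id with a dff array column, and that ophys_timestamps aligns sample-for-sample with each trace:

```python
# Build one row per (cell, timepoint) so the data are easy to group and plot
tidy_dff = pd.concat(
    [
        pd.DataFrame({
            'cell_specimen_id': cell_specimen_id,
            'timestamp': ophys_experiment.ophys_timestamps,
            'dff': row['dff'],
        })
        for cell_specimen_id, row in ophys_experiment.dff_traces.iterrows()
    ],
    ignore_index=True,
)
tidy_dff.head()
```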

We can look at a few trials in some detail

First define a function to plot a number of data streams
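A condensed sketch of such a function, built on the dataframes assembled above (stimulus_presentations with its color column, tidy_dff, and the experiment's running_speed and licks tables); the original notebook's panels may differ:

```python
def plot_data_streams(center_time, window=(-3.0, 5.0)):
    """Plot stimuli, running speed, licks, and mean dF/F around center_time (s)."""
    t_start, t_end = center_time + window[0], center_time + window[1]
    fig, axes = plt.subplots(3, 1, figsize=(12, 8), sharex=True)

    # Shade every stimulus presentation in the window with its image color
    stims = stimulus_presentations.query('stop_time >= @t_start and start_time <= @t_end')
    for ax in axes:
        for _, stim in stims.iterrows():
            ax.axvspan(stim['start_time'], stim['stop_time'],
                       color=stim['color'], alpha=0.3)

    # Running speed trace
    running = ophys_experiment.running_speed.query('@t_start <= timestamps <= @t_end')
    axes[0].plot(running['timestamps'], running['speed'], color='k')
    axes[0].set_ylabel('running speed (cm/s)')

    # Lick onsets as tick marks
    licks = ophys_experiment.licks.query('@t_start <= timestamps <= @t_end')
    axes[1].plot(licks['timestamps'], np.zeros(len(licks)), marker='|',
                 linestyle='none', color='b', markersize=20)
    axes[1].set_ylabel('licks')
    axes[1].set_yticks([])

    # Population-average dF/F from the tidy dataframe
    dff = tidy_dff.query('@t_start <= timestamp <= @t_end')
    mean_dff = dff.groupby('timestamp')['dff'].mean()
    axes[2].plot(mean_dff.index, mean_dff.values, color='g')
    axes[2].set_ylabel('mean dF/F')
    axes[2].set_xlabel('time in session (s)')

    axes[2].set_xlim(t_start, t_end)
    return fig, axes
```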

Here is a hit trial
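Assuming the boolean trial-outcome columns (hit, miss, false_alarm, correct_reject) in the trials table:

```python
# A hit: a change trial on which the mouse licked within the response window
hit_trial = trials.query('hit').sample(1, random_state=0).iloc[0]
plot_data_streams(hit_trial['change_time'])
```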

Notes:

Here is a miss trial
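```python
# A miss: a change trial with no lick in the response window
miss_trial = trials.query('miss').sample(1, random_state=0).iloc[0]
plot_data_streams(miss_trial['change_time'])
```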

Notes:

Here is a false alarm trial
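A sketch; we assume catch trials may lack a change_time in some SDK versions, so the plot falls back to centering on the lick response time:

```python
# A false alarm: a catch trial on which the mouse licked anyway
fa_trial = trials.query('false_alarm').sample(1, random_state=0).iloc[0]
center = fa_trial['change_time'] if pd.notnull(fa_trial['change_time']) else fa_trial['response_time']
plot_data_streams(center)
```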

Notes:

And finally, a correct rejection
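Again a sketch, with a fallback to the trial start time since a correct rejection has neither a lick nor, in some SDK versions, a recorded change_time:

```python
# A correct rejection: a catch trial with no lick
cr_trial = trials.query('correct_reject').sample(1, random_state=0).iloc[0]
center = cr_trial['change_time'] if pd.notnull(cr_trial['change_time']) else cr_trial['start_time']
plot_data_streams(center)
```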

Notes: