Example behavior session

The following example shows how to access behavioral data for a given mouse across sessions.

We will first install AllenSDK into your environment by running the appropriate commands below.

Install AllenSDK into your local environment

You can install AllenSDK with:
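For example (assuming pip is available in your environment):

```shell
pip install allensdk
```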

Install AllenSDK into your notebook environment (good for Google Colab)

You can install AllenSDK into your notebook environment by executing the cell below.
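The cell in question is typically just a pip invocation; in a notebook, prefix the command with ! so it runs as a shell command:

```shell
pip install allensdk
```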

If using Google Colab, click the RESTART RUNTIME button that appears at the end of the output when this cell completes. Note that running this cell will produce a long list of output, including some error messages; clicking RESTART RUNTIME resolves them. You can minimize the cell after you are done to hide the output.

Imports
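A minimal version of the imports cell might look like the following (exact imports vary; the AllenSDK cache class can be imported at the point of use):

```python
import numpy as np                 # numerical operations
import pandas as pd                # tabular data handling
import matplotlib.pyplot as plt    # plotting
```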

load cache, get behavior session table

This will set a location on your local drive to cache NWB files.
Then a table of all behavior sessions will be loaded.
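A sketch of this step, using the AllenSDK Visual Behavior project cache API (VisualBehaviorOphysProjectCache and its methods are the real names from allensdk; the cache directory path is whatever local location you choose). The logic is wrapped in a function so it can be read without triggering any downloads:

```python
def load_behavior_session_table(my_cache_dir):
    """Create an S3-backed cache at my_cache_dir and return (cache, session table)."""
    # imported here so the sketch can be defined without allensdk installed
    from allensdk.brain_observatory.behavior.behavior_project_cache import (
        VisualBehaviorOphysProjectCache,
    )
    # NWB files downloaded later will be cached under my_cache_dir
    cache = VisualBehaviorOphysProjectCache.from_s3_cache(cache_dir=my_cache_dir)
    behavior_session_table = cache.get_behavior_session_table()
    return cache, behavior_session_table

# usage (triggers a download of the table on first call):
# cache, behavior_session_table = load_behavior_session_table("/path/to/cache")
```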

view a sample of the behavior session table

The behavior_session_table is a Pandas DataFrame with one row for every session and a collection of informative columns. We can view 10 randomly selected rows of the table using the Pandas sample command.
It's important to note that this table contains every session, including sessions performed on a two-photon imaging rig (session_type starts with OPHYS_) and pre-imaging (aka 'training') sessions (session_type starts with TRAINING_).
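For illustration, here are the sample call and the session_type prefixes applied to a small mock of the table (column names follow the real table; the rows are made up):

```python
import pandas as pd

# mock behavior_session_table (hypothetical rows, real-style column names)
mock_table = pd.DataFrame({
    "mouse_id": [445002, 445002, 445002, 409296],
    "session_type": [
        "TRAINING_1_gratings",
        "TRAINING_5_images_A_handoff_ready",
        "OPHYS_1_images_A",
        "OPHYS_0_images_A_habituation",
    ],
})

print(mock_table.sample(2, random_state=0))  # two randomly selected rows

# sessions run on a two-photon imaging rig have an OPHYS_ prefix
ophys = mock_table[mock_table.session_type.str.startswith("OPHYS_")]
print(len(ophys))
```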

Select one mouse

We'll choose one mouse ID from the full list of unique mouse IDs in the dataset.

query the full behavior sessions table for all sessions that this mouse performed

This will return a subset of the full behavior_session_table in which the mouse_id matches our mouse_id variable (mouse 445002). The table should be returned in order of date of acquisition, but we'll use the Pandas command sort_values(by = 'date_of_acquisition') just to be sure.
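The filter-and-sort step looks like this on a toy stand-in for the table (the real code applies the same pandas operations to behavior_session_table):

```python
import pandas as pd

mouse_id = 445002

# hypothetical slice of behavior_session_table, deliberately out of order
mock_table = pd.DataFrame({
    "mouse_id": [445002, 409296, 445002],
    "date_of_acquisition": pd.to_datetime(["2019-04-01", "2019-03-20", "2019-03-15"]),
})

# keep only this mouse's sessions, sorted by acquisition date
this_mouse_table = (
    mock_table[mock_table.mouse_id == mouse_id]
    .sort_values(by="date_of_acquisition")
)
print(this_mouse_table)
```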

What we then see is a table that has metadata for every session performed by this mouse, in sequential order. The equipment_name column tells us where the session was run on that day and the session_type column tells us the name of the session type. See the technical white paper for a description of the progression of stages.

For this mouse, we can see that it progressed through a series of training stages starting on 3/15/2019 in behavior training boxes BEH.B-Box3 and BEH.B-Box1.

On 4/1/2019, it reached the TRAINING_5_images_A_handoff_ready stage, which meant that it was ready for transition to an imaging rig as soon as space became available.

On 4/4/2019, it was transitioned to ophys rig CAM2P.3, where it then underwent three days of habituation without imaging. This is evidenced by the fact that the session type for 4/4/2019, 4/5/2019, and 4/8/2019 was OPHYS_0_images_A_habituation and there was no associated ophys_session_id.

The first day of imaging for this mouse was on 4/9/2019, with session_type = OPHYS_1_images_A.

Note that this mouse has two OPHYS_5_images_B_passive sessions: the first taken in order (immediately after OPHYS_4_images_B) and the second taken at the end of the sequence. The first OPHYS_5_images_B_passive does not have an ophys_session_id associated with it, likely because that session failed to meet quality control standards and was excluded from the dataset. The second OPHYS_5_images_B_passive was likely a retake, acquired after the first was identified as failed.

In general, ophys behavior sessions that do not have associated ophys_session_ids are sessions for which the ophys data has been removed due to failure to meet quality control standards.

iterate over all sessions for this mouse, building a behavior_session_dict that will have one behavior session object for every session this mouse performed, keyed by behavior_session_id

Note that this could take many minutes to complete. For each session in our new table, this_mouse_table, we are pulling the behavior session NWB file from AWS, opening it as a BehaviorSession object using the AllenSDK, and also caching a copy of the NWB file in the directory specified above as my_cache_dir. When the below cell completes, all behavior sessions for this mouse will be held in memory in the behavior_session_dict dictionary.
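A sketch of this loop, assuming the AllenSDK cache API (cache.get_behavior_session is the real method name, and the session table's index is behavior_session_id). It is wrapped in a function so it can be shown without triggering downloads:

```python
def load_all_behavior_sessions(cache, this_mouse_table):
    """Return {behavior_session_id: BehaviorSession} for every row of the table.

    cache: a VisualBehaviorOphysProjectCache.
    The first call downloads each session's NWB file from AWS into the
    cache directory; subsequent calls read the cached copies instead.
    """
    behavior_session_dict = {}
    for behavior_session_id in this_mouse_table.index:
        behavior_session_dict[behavior_session_id] = cache.get_behavior_session(
            behavior_session_id=behavior_session_id
        )
    return behavior_session_dict
```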

If you were to re-run this cell a second time, it would access your cached NWB files instead of downloading them from AWS, allowing it to run substantially faster.

It is important to note that we will only be loading the behavior data here, even for sessions that had corresponding imaging data. The get_behavior_ophys_experiment method would be used to get behavior and ophys data for ophys sessions. See additional sample notebooks for details.

We can view all attributes of the behavior session object

These are all of the methods and attributes available on the BehaviorSession object. Not all are explored in this notebook.

Note that any attribute can be followed by a ? in a Jupyter Notebook to see the docstring. For example, running the cell below will make a frame appear at the bottom of your browser with the docstring for the running_speed attribute.

here are some basic task parameters

We can see the session_type, which is OPHYS_5_images_B_passive and a number of other task parameters.

Look at some of the attributes of the last 'handoff ready session'

We can filter the full table to get the last TRAINING_5_images_A_handoff_ready session. This would have been the last training session before the animal was subsequently handed off to the imaging team, after which all sessions were performed on a two-photon imaging rig.
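The filtering step, shown on a mock table (hypothetical dates; the real code applies the same operations to this mouse's session table):

```python
import pandas as pd

mock_table = pd.DataFrame({
    "session_type": [
        "TRAINING_4_images_A_training",
        "TRAINING_5_images_A_handoff_ready",
        "TRAINING_5_images_A_handoff_ready",
        "OPHYS_1_images_A",
    ],
    "date_of_acquisition": pd.to_datetime(
        ["2019-03-28", "2019-04-01", "2019-04-02", "2019-04-09"]
    ),
})

# all handoff-ready sessions, then the most recent one
handoff = mock_table[mock_table.session_type == "TRAINING_5_images_A_handoff_ready"]
last_handoff = handoff.sort_values(by="date_of_acquisition").iloc[-1]
print(last_handoff.date_of_acquisition.date())
```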

stimuli

One entry for every distinct stimulus. Includes onset and offset time/frame.

licks

One entry for every detected lick onset time, assigned the time of the corresponding visual stimulus frame.

rewards

One entry for every reward that was delivered, assigned the time of the corresponding visual stimulus frame. Autorewarded is True if the reward was delivered without requiring a preceding lick.

running data

One entry for each read of the analog input line monitoring the encoder voltage, polled at ~60 Hz.
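The data streams above are attributes of the BehaviorSession object; a sketch of accessing them (the attribute names are the real allensdk names):

```python
def summarize_data_streams(behavior_session):
    """Print the first rows of the main behavioral data streams.

    behavior_session: an allensdk BehaviorSession object.
    """
    print(behavior_session.stimulus_presentations.head())  # one row per stimulus
    print(behavior_session.licks.head())                   # one row per detected lick
    print(behavior_session.rewards.head())                 # one row per reward
    print(behavior_session.running_speed.head())           # ~60 Hz running trace
```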

we can make a simple plot where we combine together running, licking and stimuli

First, add a column to the stimulus_presentations table that assigns a unique color to every stimulus
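One way to do this, shown on a mock stimulus_presentations table (the real tables use image_name values like 'im065'), is to map each distinct image to one entry of a qualitative colormap:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so the example runs anywhere
import matplotlib.pyplot as plt

# mock stimulus_presentations (hypothetical image names)
stim = pd.DataFrame({"image_name": ["im_a", "im_b", "im_a", "im_c"]})

# one color per distinct image, drawn from the tab10 qualitative colormap
unique_images = stim["image_name"].unique()
color_map = {img: plt.cm.tab10(i) for i, img in enumerate(unique_images)}
stim["color"] = stim["image_name"].map(color_map)
```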

now make some simple plotting functions to plot these datastreams

now make the plot
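A self-contained sketch of the plotting functions and the combined plot, using small mock data streams (hypothetical values shaped like the real running, licks, and stimulus tables):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the example runs anywhere
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# mock data streams (hypothetical values for illustration)
running = pd.DataFrame({"timestamps": np.linspace(775, 795, 200),
                        "speed": 10 + 5 * np.random.rand(200)})
licks = pd.DataFrame({"timestamps": [779.0, 779.3, 786.9, 787.1, 794.0]})
stimuli = pd.DataFrame({"start_time": np.arange(775, 795, 0.75),
                        "end_time": np.arange(775, 795, 0.75) + 0.25})

def plot_running(ax, running):
    ax.plot(running.timestamps, running.speed, color="k")
    ax.set_ylabel("speed (cm/s)")

def plot_licks(ax, licks):
    ax.plot(licks.timestamps, np.zeros(len(licks)), "|", color="b", markersize=10)

def plot_stimuli(ax, stimuli):
    # one shaded span per stimulus presentation
    for _, stim in stimuli.iterrows():
        ax.axvspan(stim.start_time, stim.end_time, color="gray", alpha=0.3)

fig, ax = plt.subplots(figsize=(10, 3))
plot_running(ax, running)
plot_licks(ax, licks)
plot_stimuli(ax, stimuli)
ax.set_xlabel("time (s)")
```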

Above, we can see that stimuli were being delivered at a regular cadence (250 ms on, 500 ms off). There were changes to new stimuli at t = 778.6 and t = 793.7, as indicated by the change in the color of the bars. The mouse licked inside of the required response window following both stimulus changes and received a reward coincident with the first lick following the change. The subsequent licks are likely a result of the mouse consuming the water reward. There was also a brief bout of two licks, likely representing impulsivity, at t = 786.9.

trials

We can view attributes of every trial here. Below is a random sample of 5 trials.

we can examine one trial in some detail. Let's randomly select a hit trial.
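Selecting a random hit trial is a boolean filter plus a sample call; here it is on a mock trials table (the real table has a boolean hit column, among many others):

```python
import pandas as pd

# mock trials table (hypothetical values; real tables have many more columns)
trials = pd.DataFrame({
    "hit": [False, True, False, True],
    "response_time": [None, 780.1, None, 794.2],
})

hit_trials = trials[trials.hit]          # keep only hit trials
one_hit = hit_trials.sample(1, random_state=0)
print(one_hit)
```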

Some things to note:

One useful method is the get_performance_metrics method, which returns some summary metrics on the session, derived from the 'rolling_performance_df'

we can build out a new table that has all performance data for every session as follows:

This might take a minute or so. The AllenSDK will be extracting the performance data from the NWB file for every session individually.
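A sketch of the loop, assuming the behavior_session_dict built earlier and the BehaviorSession.get_performance_metrics method (a real allensdk method returning a dict of summary values). Wrapped in a function so it can be shown without the data in memory:

```python
import pandas as pd

def build_performance_table(behavior_session_dict):
    """One row of performance metrics per session, keyed by behavior_session_id.

    behavior_session_dict: {behavior_session_id: BehaviorSession}.
    """
    rows = []
    for behavior_session_id, session in behavior_session_dict.items():
        metrics = session.get_performance_metrics()  # dict of summary values
        metrics["behavior_session_id"] = behavior_session_id
        rows.append(metrics)
    return pd.DataFrame(rows).set_index("behavior_session_id")
```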

for convenience, we can merge this with the existing table we built for this mouse
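The merge is a standard pandas join on behavior_session_id; here it is on two hypothetical table fragments:

```python
import pandas as pd

# hypothetical fragments of the session and performance tables
sessions = pd.DataFrame({
    "behavior_session_id": [101, 102],
    "session_type": ["OPHYS_1_images_A", "OPHYS_2_images_A_passive"],
})
performance = pd.DataFrame({
    "behavior_session_id": [101, 102],
    "max_dprime": [2.1, 1.8],
})

# join performance metrics onto the session metadata
merged = sessions.merge(performance, on="behavior_session_id")
print(merged)
```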

Now we can plot the max_dprime value for every session
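A minimal version of the plot, using hypothetical max_dprime values in place of the merged table's column:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the example runs anywhere
import matplotlib.pyplot as plt

# hypothetical max_dprime values, one per session in training order
max_dprime = [1.5, 1.8, 2.1, 2.0, 2.2, 1.9]

fig, ax = plt.subplots()
ax.plot(range(len(max_dprime)), max_dprime, marker="o")
ax.set_xlabel("session number")
ax.set_ylabel("max_dprime")
```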

We can see that this particular mouse performed relatively consistently for every session as it progressed through training.