Time and LFP sampling rate in the Visual Behavior dataset

Good afternoon all,

I just wanted to double-check a few things regarding the sampling rate (fs) and time axis of the LFP provided in the Visual Behavior dataset and accessed through the AllenSDK.

First of all, the LFP sampling rate indicated in a session's metadata is 2500 Hz.

However, when I extract the time points of the LFP I want to analyze, I find that fs ≈ 1250 Hz
(1/(t[1] - t[0]) = 1250.007433406601).
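
For reference, here is essentially how I am computing this (a minimal, self-contained sketch; the helper name and the stand-in time vector are mine, and in practice the time axis comes from the LFP returned by the AllenSDK):

```python
import numpy as np

def empirical_sampling_rate(timestamps: np.ndarray) -> float:
    """Estimate the sampling rate (Hz) implied by a vector of timestamps."""
    # Use the median inter-sample interval to be robust to occasional jitter.
    return 1.0 / np.median(np.diff(timestamps))

# Stand-in time axis: a regular 1250 Hz grid starting at ~4.02 s, matching
# what I see for the LFP in my session (in practice I pass the LFP's
# time coordinate, e.g. lfp.time.values).
t = 4.023837588781174 + np.arange(10_000) / 1250.0
print(empirical_sampling_rate(t))  # -> ~1250.0, not the 2500 Hz in the metadata
```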

Does anyone know if that is an error in the metadata?

Also, the LFP time vector has t[0] = 4.023837588781174 seconds, so that is time zero of the LFP.
However, for the same session the running-speed time vector starts at 25.21406 seconds, and the running speed is sampled at ~60 Hz.
How do you align the running speed with the actual LFP time when you want to focus on events with millisecond accuracy?

Thank you all in advance!

Best,
Sofia

Hi Sofia,

The value in the table indicates the sampling rate at which the raw data was collected (2500 Hz). However, when we package the LFP into NWB files, we downsample the signal both spatially (4x) and temporally (2x), giving the 1250 Hz you are seeing. So this isn't an error, but I agree it's a bit confusing. If you'd like, feel free to submit an issue on the AllenSDK GitHub (GitHub - AllenInstitute/AllenSDK: code for reading and processing Allen Institute for Brain Science data) requesting clarification.
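
To make the bookkeeping concrete, here is the arithmetic as a quick sketch (no SDK calls; the 384 below refers to the usual number of Neuropixels recording channels, which is an assumption on my part rather than something stated above):

```python
# Relationship between the acquisition rate in the metadata table and the
# rate of the LFP packaged into the NWB files.
raw_lfp_rate_hz = 2500        # rate at which the raw LFP was collected
temporal_downsample = 2       # temporal downsampling applied during packaging
spatial_downsample = 4        # spatial (channel) downsampling during packaging

nwb_lfp_rate_hz = raw_lfp_rate_hz / temporal_downsample
print(nwb_lfp_rate_hz)        # -> 1250.0, the value implied by the time axis

# Channel count: a probe with 384 recording channels ends up with roughly
# 384 / 4 = 96 channels of LFP in the NWB file.
print(384 // spatial_downsample)  # -> 96
```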

The discrepancy in t[0] for the different signals reflects when these data streams were started relative to our master clock (the LFP started ~4 seconds after the master clock, and the running signal some 20 seconds after that). You are right that the running speed was collected at ~60 Hz, which will limit the resolution with which you can align this data stream to other events.
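
Since both data streams share the same master clock, one common approach is simply to interpolate the running-speed trace onto whatever master-clock times you care about (LFP timestamps, stimulus onsets, etc.). Here is a minimal sketch; the helper and the commented-out variable names are placeholders, not specific SDK fields:

```python
import numpy as np

def running_speed_at(times: np.ndarray,
                     running_timestamps: np.ndarray,
                     running_speed: np.ndarray) -> np.ndarray:
    """Linearly interpolate the ~60 Hz running-speed trace onto arbitrary
    master-clock times (e.g. the 1250 Hz LFP timestamps, or event times).

    Both inputs are assumed to already be in master-clock seconds, so no
    additional offset correction is needed. Keep in mind the result cannot
    contain information finer than the ~16 ms spacing of the running samples.
    """
    return np.interp(times, running_timestamps, running_speed)

# Hypothetical usage (placeholder names, not actual SDK attributes):
# speed_on_lfp_clock = running_speed_at(lfp.time.values,
#                                       running_df["timestamps"].values,
#                                       running_df["speed"].values)
```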

You might find this tutorial helpful for aligning the LFP to other task events:
visual_behavior_neuropixels_LFP_analysis.

And here’s another tutorial demonstrating how to align behavioral data:
aligning_behavioral_data_to_task_events_with_the_stimulus_and_trials_tables.


Thank you, this is really helpful!