Can't open session .nwb files in MATLAB or Python

I’m trying to open Visual Coding Neuropixels .nwb files in both Python and MATLAB, using pynwb and matnwb respectively. When I try to open the session .nwb files I get the following errors:

Python
I’m new to using .nwb files but have managed to write and read an example .nwb file in Python.
However, when I try to read in an example Neuropixels session using the following code

import numpy as np
from pynwb import NWBHDF5IO

io = NWBHDF5IO('session_767871931.nwb', 'r')
nwbfile_in = io.read()

I get the following error:
ValueError: Column name 'name' is not allowed because it is already an attribute

Matlab
After running the generateCore() function I run:

nwb = nwbRead('D:\AllenSDK\session_766640955\session_766640955.nwb')

And get the following error:

Error using types.util.checkUnset (line 13)
Properties {help} are not valid property names.

Error in types.core.TimeSeries (line 82)
types.util.checkUnset(obj, unique(varargin(1:2:end)));

Error in io.parseGroup (line 78)
parsed = eval([typename '(kwargs{:})']);

Error in io.parseGroup (line 38)
subg = io.parseGroup(filename, group, Blacklist);

Error in io.parseGroup (line 38)
subg = io.parseGroup(filename, group, Blacklist);

Error in nwbRead (line 33)
nwb = io.parseGroup(filename, h5info(filename), Blacklist);

From this it looks like there is an issue with the formatting of the session .nwb files that prevents me from opening them. Is there anything obvious I’m missing?

Thanks

To give an update, in Python I have also tried:

from allensdk.core.nwb_data_set import NwbDataSet

file_name = 'D:\AllenSDK\session_767871931\session_767871931.nwb'
data_set = NwbDataSet(file_name)

sweep_numbers = data_set.get_sweep_numbers()
sweep_number = sweep_numbers[0]

But I get the following error:

Traceback (most recent call last):
  File "C:/Users/edward.horrocks/PycharmProjects/AllenSDK/venv/loadsessiontest.py", line 7, in <module>
sweep_numbers = data_set.get_sweep_numbers()
  File "C:\Users\edward.horrocks\PycharmProjects\AllenSDK\venv\lib\site-packages\allensdk\core\nwb_data_set.py", line 306, in get_sweep_numbers
for e in f['epochs'].keys() if e.startswith('Sweep_')]
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "C:\Users\edward.horrocks\PycharmProjects\AllenSDK\venv\lib\site-packages\h5py\_hl\group.py", line 264, in __getitem__
oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5o.pyx", line 190, in h5py.h5o.open
KeyError: "Unable to open object (object 'epochs' doesn't exist)"

Hi eabhorrocks,

Welcome to the forums!

I think you are running afoul of a different issue in each of your posts. These are:

  1. These NWB files were written using pynwb 1.0.2. Some newer versions of pynwb (issue here) cannot open these files. As a workaround, please try installing 1.0.2 using the allensdk requirements file.
  2. The NwbDataSet in allensdk/core is for intracellular ephys (and NWB1); you should instead use EcephysSession from allensdk.brain_observatory.ecephys.ecephys_session (see the sketch after this list). For more information and code examples, see the Neuropixels landing page.
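
As a rough sketch, loading a downloaded session file might look something like the following. The local path is just a placeholder, and from_nwb_path, units, and stimulus_presentations are assumed to be available in the allensdk/pynwb versions pinned by the requirements file:

# Check that the pinned pynwb version (1.0.2) is the one actually installed
import pynwb
print(pynwb.__version__)

from allensdk.brain_observatory.ecephys.ecephys_session import EcephysSession

# Placeholder path to a locally downloaded session file
nwb_path = r'D:\AllenSDK\session_767871931\session_767871931.nwb'

# Build an EcephysSession directly from the NWB file on disk
session = EcephysSession.from_nwb_path(nwb_path)

# Quick sanity checks on the loaded session
print(session.units.head())                   # per-unit metadata table
print(session.stimulus_presentations.head())  # stimulus presentation table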

The problem you are running into when loading these files via MATLAB is unfamiliar to me. We don’t support MATLAB, but I wonder if you just need to load the extension files from here.

Thank you,
Nile

Thanks for the information.

  1. I did not check that the necessary pynwb version had been installed with pip install allensdk, and had installed the latest version over it. Thanks very much for pointing out that this was the source of the issue.

In terms of MATLAB, I tried loading those extension files, but to no avail. It seems like a similar issue to the Python one, whereby file conventions have changed since the ecephys NWB files were generated. Perhaps an older version of matnwb would also solve this issue, as the older pynwb did for Python.

Thanks for such a quick response; I'm looking forward to using these datasets.

Edd