Commit

Merge branch 'main' into add_electrode_group_and_mismatch_test_on_spikeinterface

h-mayorquin authored Jan 14, 2025
2 parents 366bc8f + fe5bfb4 commit 7382a90
Showing 39 changed files with 49 additions and 72 deletions.
7 changes: 7 additions & 0 deletions .gitignore
@@ -6,6 +6,13 @@ __pycache__/
# C extensions
*.so

+
+# UV
+# Similar to Pipfile.lock, it is generally recommended to include uv.lock in version control.
+# This is especially recommended for binary packages to ensure reproducibility, and is more
+# commonly ignored for libraries.
+uv.lock
+
# Distribution / packaging
.Python
build/
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -13,7 +13,7 @@ repos:
    exclude: ^docs/

  - repo: https://github.com/astral-sh/ruff-pre-commit
-   rev: v0.7.1
+   rev: v0.8.3
    hooks:
      - id: ruff
        args: [ --fix ]
36 changes: 27 additions & 9 deletions CHANGELOG.md
@@ -1,12 +1,36 @@
-# v0.6.6 (Upcoming)
+# v0.7.0 (Upcoming)

## Deprecations

## Bug Fixes

## Features
* Source validation is no longer performed when initializing interfaces or converters [PR #1168](https://github.com/catalystneuro/neuroconv/pull/1168)

## Improvements

# v0.6.9 (Upcoming)
Small fixes should be here.

## Deprecations

## Bug Fixes
* Temporarily set a ceiling for hdmf to avoid a chunking bug [PR #1175](https://github.com/catalystneuro/neuroconv/pull/1175)

## Features

## Improvements
* Detect mismatch errors between group and group names when writing ElectrodeGroups [PR #1165](https://github.com/catalystneuro/neuroconv/pull/1165)


# v0.6.6 (December 20, 2024)

## Deprecations
* Removed use of `jsonschema.RefResolver` as it will be deprecated from the jsonschema library [PR #1133](https://github.com/catalystneuro/neuroconv/pull/1133)
* Completely removed compression settings from most places [PR #1126](https://github.com/catalystneuro/neuroconv/pull/1126)
* Soft deprecation for `file_path` as an argument of `SpikeGLXNIDQInterface` and `SpikeGLXRecordingInterface` [PR #1155](https://github.com/catalystneuro/neuroconv/pull/1155)
* `starting_time` in RecordingInterfaces has been given a soft deprecation in favor of time alignment methods [PR #1158](https://github.com/catalystneuro/neuroconv/pull/1158)


## Bug Fixes
* datetime objects can now be validated as conversion options [#1139](https://github.com/catalystneuro/neuroconv/pull/1139)
* Make `NWBMetaDataEncoder` public again [PR #1142](https://github.com/catalystneuro/neuroconv/pull/1142)
@@ -15,7 +39,6 @@
* `SpikeGLXNIDQInterface` is no longer written as an ElectricalSeries [#1152](https://github.com/catalystneuro/neuroconv/pull/1152)
* Fixed a bug in ecephys interfaces where extra electrode groups and devices were written if the "group_name" property was set in the recording extractor [#1164](https://github.com/catalystneuro/neuroconv/pull/1164)


## Features
* Propagate the `unit_electrode_indices` argument from the spikeinterface tools to `BaseSortingExtractorInterface`. This allows users to map units to the electrode table when adding sorting data [PR #1124](https://github.com/catalystneuro/neuroconv/pull/1124)
* Imaging interfaces have a new conversion option `always_write_timestamps` that can be used to force writing timestamps even if neuroconv's heuristics indicate a regular sampling rate [PR #1125](https://github.com/catalystneuro/neuroconv/pull/1125)
@@ -31,7 +54,7 @@
* Use pytest format for dandi tests to avoid Windows permission error on teardown [PR #1151](https://github.com/catalystneuro/neuroconv/pull/1151)
* Added many docstrings for public functions [PR #1063](https://github.com/catalystneuro/neuroconv/pull/1063)
* Cleaned up warnings and deprecations in the testing framework [PR #1158](https://github.com/catalystneuro/neuroconv/pull/1158)
-* Detect mismatch errors between group and group names when writing ElectrodeGroups [PR #1165](https://github.com/catalystneuro/neuroconv/pull/1165)
+* Enhance the typing of the signature on the `NWBConverter` by adding zarr as a literal option on the backend and backend configuration [PR #1160](https://github.com/catalystneuro/neuroconv/pull/1160)


# v0.6.5 (November 1, 2024)
@@ -56,8 +79,6 @@
* Avoid running link test when the PR is on draft [PR #1093](https://github.com/catalystneuro/neuroconv/pull/1093)
* Centralize gin data preparation in a github action [PR #1095](https://github.com/catalystneuro/neuroconv/pull/1095)



# v0.6.4 (September 17, 2024)

## Bug Fixes
@@ -83,11 +104,8 @@
* Consolidated daily workflows into one workflow and added email notifications [PR #1081](https://github.com/catalystneuro/neuroconv/pull/1081)
* Added zarr tests for the test on data with checking equivalent backends [PR #1083](https://github.com/catalystneuro/neuroconv/pull/1083)



# v0.6.3


# v0.6.2 (September 10, 2024)

## Bug Fixes
@@ -23,7 +23,6 @@ Convert LightningPose pose estimation data to NWB using :py:class:`~neuroconv.da
>>> labeled_video_file_path = str(folder_path / "labeled_videos/test_vid_labeled.mp4")
>>> converter = LightningPoseConverter(file_path=file_path, original_video_file_path=original_video_file_path, labeled_video_file_path=labeled_video_file_path, verbose=False)
-Source data is valid!
>>> metadata = converter.get_metadata()
>>> # For data provenance we add the time zone information to the conversion
>>> session_start_time = metadata["NWBFile"]["session_start_time"]
6 changes: 2 additions & 4 deletions docs/conversion_examples_gallery/fiberphotometry/tdt_fp.rst
@@ -207,15 +207,13 @@ Convert TDT Fiber Photometry data to NWB using
>>> LOCAL_PATH = Path(".") # Path to neuroconv
>>> editable_metadata_path = LOCAL_PATH / "tests" / "test_on_data" / "ophys" / "fiber_photometry_metadata.yaml"
->>> interface = TDTFiberPhotometryInterface(folder_path=folder_path, verbose=True)
-Source data is valid!
+>>> interface = TDTFiberPhotometryInterface(folder_path=folder_path, verbose=False)
>>> metadata = interface.get_metadata()
>>> metadata["NWBFile"]["session_start_time"] = datetime.now(tz=ZoneInfo("US/Pacific"))
>>> editable_metadata = load_dict_from_file(editable_metadata_path)
>>> metadata = dict_deep_update(metadata, editable_metadata)
>>> # Choose a path for saving the nwb file and run the conversion
->>> nwbfile_path = LOCAL_PATH / "example_tdtfp.nwb"
+>>> nwbfile_path = f"{path_to_save_nwbfile}"
>>> # t1 and t2 are optional arguments to specify the start and end times for the conversion
>>> interface.run_conversion(nwbfile_path=nwbfile_path, metadata=metadata, t1=0.0, t2=1.0)
-NWB file saved at example_tdtfp.nwb!
1 change: 0 additions & 1 deletion docs/conversion_examples_gallery/recording/spikeglx.rst
@@ -24,7 +24,6 @@ We can easily convert all data stored in the native SpikeGLX folder structure to
>>>
>>> folder_path = f"{ECEPHY_DATA_PATH}/spikeglx/Noise4Sam_g0"
>>> converter = SpikeGLXConverterPipe(folder_path=folder_path)
-Source data is valid!
>>> # Extract what metadata we can from the source files
>>> metadata = converter.get_metadata()
>>> # For data provenance we add the time zone information to the conversion
4 changes: 2 additions & 2 deletions pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "neuroconv"
-version = "0.6.6"
+version = "0.6.7"
description = "Convert data from proprietary formats to NWB format."
readme = "README.md"
authors = [
@@ -39,7 +39,7 @@ dependencies = [
"PyYAML>=5.4",
"scipy>=1.4.1",
"h5py>=3.9.0",
-"hdmf>=3.13.0",
+"hdmf>=3.13.0,<=3.14.5", # Chunking bug
"hdmf_zarr>=0.7.0",
"pynwb>=2.7.0",
"pydantic>=2.0.0",
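The `hdmf` pin above combines a floor and a ceiling in one version range. As a simplified, hypothetical sketch of which versions such a range admits (real installers parse the full PEP 440 grammar; this toy only handles plain `X.Y.Z` strings):

```python
def parse_version(version: str) -> tuple[int, ...]:
    """Parse a plain dotted version like '3.14.5' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))


def satisfies_hdmf_pin(version: str) -> bool:
    """Check a candidate against the ">=3.13.0,<=3.14.5" range added above."""
    return parse_version("3.13.0") <= parse_version(version) <= parse_version("3.14.5")


# The ceiling is inclusive, so 3.14.5 itself is still allowed.
for candidate in ["3.12.2", "3.13.0", "3.14.5", "3.15.0"]:
    print(candidate, satisfies_hdmf_pin(candidate))
```

Once the chunking bug is fixed upstream, dropping the `<=3.14.5` half of the specifier restores the open-ended floor.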
2 changes: 0 additions & 2 deletions src/neuroconv/basedatainterface.py
@@ -68,8 +68,6 @@ def __init__(self, verbose: bool = False, **source_data):
self.verbose = verbose
self.source_data = source_data

-self._validate_source_data(source_data=source_data, verbose=verbose)

def get_metadata_schema(self) -> dict:
"""Retrieve JSON schema for metadata."""
metadata_schema = load_dict_from_file(Path(__file__).parent / "schemas" / "base_metadata_schema.json")
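With the `_validate_source_data` call removed from `__init__`, constructing an interface no longer checks its inputs against the source schema. As a hedged sketch of what that check does conceptually (neuroconv uses JSON Schema for this; the schema and function below are illustrative stand-ins, not the library's API):

```python
# Illustrative source schema: required keys plus expected Python types.
example_source_schema = {
    "required": ["folder_path"],
    "properties": {"folder_path": str, "verbose": bool},
}


def validate_source_data(source_data: dict, schema: dict) -> list[str]:
    """Return human-readable problems; an empty list means the data is valid."""
    problems = []
    for key in schema["required"]:
        if key not in source_data:
            problems.append(f"missing required key: {key}")
    for key, expected_type in schema["properties"].items():
        if key in source_data and not isinstance(source_data[key], expected_type):
            problems.append(f"'{key}' should be of type {expected_type.__name__}")
    return problems


# Valid inputs produce no problems; a bad dict produces a readable report.
print(validate_source_data({"folder_path": "/data/session"}, example_source_schema))  # []
print(validate_source_data({"verbose": "yes"}, example_source_schema))
```

A user who still wants eager checking after this change could run a validation like this explicitly before constructing the interface.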

This file was deleted.

This file was deleted.

Empty file.

This file was deleted.

This file was deleted.

This file was deleted.

3 changes: 0 additions & 3 deletions src/neuroconv/datainterfaces/behavior/sleap/requirements.txt

This file was deleted.

This file was deleted.

This file was deleted.

1 change: 0 additions & 1 deletion src/neuroconv/datainterfaces/ecephys/edf/requirements.txt

This file was deleted.

Empty file.

This file was deleted.

This file was deleted.

This file was deleted.

This file was deleted.

This file was deleted.

2 changes: 0 additions & 2 deletions src/neuroconv/datainterfaces/ecephys/requirements.txt

This file was deleted.

This file was deleted.

1 change: 0 additions & 1 deletion src/neuroconv/datainterfaces/icephys/abf/requirements.txt

This file was deleted.

1 change: 0 additions & 1 deletion src/neuroconv/datainterfaces/icephys/requirements.txt

This file was deleted.

This file was deleted.

This file was deleted.

2 changes: 0 additions & 2 deletions src/neuroconv/datainterfaces/ophys/miniscope/requirements.txt

This file was deleted.

1 change: 0 additions & 1 deletion src/neuroconv/datainterfaces/ophys/requirements.txt

This file was deleted.

This file was deleted.

2 changes: 0 additions & 2 deletions src/neuroconv/datainterfaces/ophys/tdt_fp/requirements.txt

This file was deleted.

1 change: 0 additions & 1 deletion src/neuroconv/datainterfaces/ophys/tiff/requirements.txt

This file was deleted.

2 changes: 0 additions & 2 deletions src/neuroconv/datainterfaces/text/excel/requirements.txt

This file was deleted.

12 changes: 4 additions & 8 deletions src/neuroconv/nwbconverter.py
@@ -80,7 +80,6 @@ def _validate_source_data(self, source_data: dict[str, dict], verbose: bool = Tr
def __init__(self, source_data: dict[str, dict], verbose: bool = True):
"""Validate source_data against source_schema and initialize all data interfaces."""
self.verbose = verbose
-self._validate_source_data(source_data=source_data, verbose=self.verbose)
self.data_interface_objects = {
name: data_interface(**source_data[name])
for name, data_interface in self.data_interface_classes.items()
@@ -204,11 +203,8 @@ def run_conversion(
nwbfile: Optional[NWBFile] = None,
metadata: Optional[dict] = None,
overwrite: bool = False,
-# TODO: when all H5DataIO prewraps are gone, introduce Zarr safely
-# backend: Union[Literal["hdf5", "zarr"]],
-# backend_configuration: Optional[Union[HDF5BackendConfiguration, ZarrBackendConfiguration]] = None,
-backend: Optional[Literal["hdf5"]] = None,
-backend_configuration: Optional[HDF5BackendConfiguration] = None,
+backend: Optional[Literal["hdf5", "zarr"]] = None,
+backend_configuration: Optional[Union[HDF5BackendConfiguration, ZarrBackendConfiguration]] = None,
conversion_options: Optional[dict] = None,
) -> None:
"""
@@ -226,11 +222,11 @@
overwrite : bool, default: False
Whether to overwrite the NWBFile if one exists at the nwbfile_path.
The default is False (append mode).
-backend : "hdf5", optional
+backend : {"hdf5", "zarr"}, optional
The type of backend to use when writing the file.
If a `backend_configuration` is not specified, the default type will be "hdf5".
If a `backend_configuration` is specified, then the type will be auto-detected.
-backend_configuration : HDF5BackendConfiguration, optional
+backend_configuration : HDF5BackendConfiguration or ZarrBackendConfiguration, optional
The configuration model to use when configuring the datasets for this backend.
To customize, call the `.get_default_backend_configuration(...)` method, modify the returned
BackendConfiguration object, and pass that instead.
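The docstring above encodes a small resolution rule: an explicit `backend` wins, otherwise the backend is inferred from the configuration object, otherwise it defaults to "hdf5". A minimal sketch of that rule, using dataclass stand-ins for neuroconv's real configuration models:

```python
from dataclasses import dataclass
from typing import Literal, Optional, Union


# Hypothetical stand-ins for HDF5BackendConfiguration / ZarrBackendConfiguration.
@dataclass
class HDF5BackendConfiguration:
    backend: Literal["hdf5"] = "hdf5"


@dataclass
class ZarrBackendConfiguration:
    backend: Literal["zarr"] = "zarr"


def resolve_backend(
    backend: Optional[Literal["hdf5", "zarr"]] = None,
    backend_configuration: Optional[Union[HDF5BackendConfiguration, ZarrBackendConfiguration]] = None,
) -> str:
    """Mirror the documented rule: explicit backend wins, then the
    configuration's own type, then the "hdf5" default."""
    if backend is not None:
        return backend
    if backend_configuration is not None:
        return backend_configuration.backend
    return "hdf5"


print(resolve_backend())  # hdf5
print(resolve_backend(backend_configuration=ZarrBackendConfiguration()))  # zarr
```

This is only an illustration of the documented dispatch; the library's own auto-detection may differ in detail.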
4 changes: 2 additions & 2 deletions tests/test_on_data/ophys/test_miniscope_converter.py
@@ -159,8 +159,8 @@ def assertNWBFileStructure(self, nwbfile_path: str):
nwbfile = io.read()

self.assertEqual(
-nwbfile.session_start_time,
-datetime(2021, 10, 7, 15, 3, 28, 635).astimezone(),
+nwbfile.session_start_time.replace(tzinfo=None),
+datetime(2021, 10, 7, 15, 3, 28, 635),
)

self.assertIn(self.device_name, nwbfile.devices)
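The test change above swaps a timezone-dependent comparison for a timezone-naive one: `.astimezone()` attaches the machine's local zone, so the expected wall-clock value would differ between CI runners, while `.replace(tzinfo=None)` strips the zone before comparing. A small sketch of the difference:

```python
from datetime import datetime, timezone

# An aware timestamp, as one might be read back from a saved NWB file.
aware = datetime(2021, 10, 7, 15, 3, 28, 635, tzinfo=timezone.utc)

# Dropping tzinfo yields a naive datetime that compares stably across machines.
naive = aware.replace(tzinfo=None)
assert naive == datetime(2021, 10, 7, 15, 3, 28, 635)

# .astimezone() converts to the *local* zone: the instant is unchanged, but the
# wall-clock fields (and thus a hard-coded expected value) depend on the runner.
local = aware.astimezone()
assert local == aware  # same instant, possibly different representation
```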
9 changes: 4 additions & 5 deletions tests/test_on_data/setup_paths.py
@@ -10,20 +10,19 @@


# Load the configuration for the data tests

+project_root_path = Path(__file__).parent.parent.parent

if os.getenv("CI"):
LOCAL_PATH = Path(".") # Must be set to "." for CI
print("Running GIN tests on Github CI!")
else:
# Override LOCAL_PATH in the `gin_test_config.json` file to a point on your system that contains the dataset folder
# Use DANDIHub at hub.dandiarchive.org for open, free use of data found in the /shared/catalystneuro/ directory
-test_config_path = Path(__file__).parent / "gin_test_config.json"
+test_config_path = project_root_path / "tests" / "test_on_data" / "gin_test_config.json"
config_file_exists = test_config_path.exists()
if not config_file_exists:

-root = test_config_path.parent.parent
-base_test_config_path = root / "base_gin_test_config.json"
+base_test_config_path = project_root_path / "base_gin_test_config.json"

test_config_path.parent.mkdir(parents=True, exist_ok=True)
copy(src=base_test_config_path, dst=test_config_path)
@@ -40,4 +39,4 @@
ECEPHY_DATA_PATH = LOCAL_PATH / "ephy_testing_data"
OPHYS_DATA_PATH = LOCAL_PATH / "ophys_testing_data"

-TEXT_DATA_PATH = Path(__file__).parent.parent.parent / "tests" / "test_text"
+TEXT_DATA_PATH = project_root_path / "tests" / "test_text"
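The refactor above computes `project_root_path` once instead of repeating `.parent.parent.parent` chains. For illustration (with a hypothetical file location), `pathlib` also offers the `parents` sequence for the same walk:

```python
from pathlib import Path

# Hypothetical location mirroring tests/test_on_data/setup_paths.py in a repo.
file_path = Path("/repo/tests/test_on_data/setup_paths.py")

# Three .parent hops, as in the original code...
root_by_chain = file_path.parent.parent.parent

# ...equal one lookup in the parents sequence (index 2 = three levels up).
root_by_index = file_path.parents[2]

assert root_by_chain == root_by_index == Path("/repo")
print(root_by_index.joinpath("tests", "test_text").as_posix())  # /repo/tests/test_text
```

Naming the root once also means later path definitions stay correct if the file is ever moved, since only one expression needs updating.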
