Add some missing parameter docstrings #245

Merged: 2 commits, Nov 8, 2024
11 changes: 11 additions & 0 deletions python-spec/src/somacore/data.py
@@ -84,6 +84,11 @@ def create(
the minimum and maximum possible values for the column's
datatype. This makes a dataframe growable.

platform_config: platform-specific configuration; keys are SOMA
implementation names.

context: Other implementation-specific configuration.

Returns:
The newly created dataframe, opened for writing.

@@ -326,6 +331,8 @@ def read(
a partitioned read, and which part of the data to include.
result_order: the order to return results, specified as a
:class:`~options.ResultOrder` or its string value.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns: The data over the requested range as a tensor.

@@ -426,6 +433,8 @@ def read(
and which partition to include, if present.
result_order: the order to return results, specified as a
:class:`~options.ResultOrder` or its string value.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns: The data that was requested in a :class:`SparseRead`,
allowing access in any supported format.
@@ -479,6 +488,8 @@ def write(

Arrow table: a COO table, with columns named ``soma_dim_0``,
..., ``soma_dim_N`` and ``soma_data``.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns: ``self``, to enable method chaining.

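The parameters documented above follow one pattern across create, read, and write: ``platform_config`` is a mapping keyed by SOMA implementation name, and ``context`` carries other implementation-specific configuration. A minimal sketch, assuming a hypothetical concrete implementation ``somaimpl`` (somacore itself only defines abstract interfaces) and purely illustrative config keys:

    import pyarrow as pa
    import somaimpl  # hypothetical concrete SOMA implementation; somacore defines only ABCs

    schema = pa.schema([("soma_joinid", pa.int64()), ("tissue", pa.string())])

    # create(): platform_config is keyed by implementation name; context carries
    # other implementation-specific configuration (both values illustrative).
    df = somaimpl.DataFrame.create(
        "file:///tmp/example_sdf",
        schema=schema,
        platform_config={"somaimpl": {"write_buffer_bytes": 1 << 20}},
        context=somaimpl.SOMAContext(),  # hypothetical context object
    )

    # write() and read() accept the same mapping shape for platform_config.
    df.write(
        pa.Table.from_pydict({"soma_joinid": [0, 1], "tissue": ["lung", "liver"]}),
        platform_config={"somaimpl": {"write_buffer_bytes": 1 << 20}},
    )
    table = df.read(
        column_names=["soma_joinid", "tissue"],
        platform_config={"somaimpl": {"read_buffer_bytes": 1 << 20}},
    ).concat()  # concatenate the ReadIter into a single Arrow table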
6 changes: 3 additions & 3 deletions python-spec/src/somacore/query/axis.py
@@ -72,9 +72,9 @@ class AxisQuery:

AxisQuery() # all data
AxisQuery(coords=()) # also all data
AxisQuery(coords=(slice(1,10),)) # 1D, slice
AxisQuery(coords=([0,1,2])) # 1D, point indexing using array-like
AxisQuery(coords=(slice(None), numpy.array([0,88,1001]))) # 2D
AxisQuery(coords=(slice(1, 10),)) # 1D, slice
AxisQuery(coords=([0, 1, 2])) # 1D, point indexing using array-like
AxisQuery(coords=(slice(None), numpy.array([0, 88, 1001]))) # 2D
AxisQuery(value_filter="tissue == 'lung'")
AxisQuery(coords=(slice(1,None),), value_filter="tissue == 'lung'")

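For reference, the reformatted docstring examples above correspond to code like the following; the filter string is illustrative, and the experiment-level usage in the trailing comment assumes an Experiment handle from a concrete implementation.

    import numpy as np
    from somacore import AxisQuery

    # Two-dimensional query: everything on the first axis, three points on the second.
    two_dim = AxisQuery(coords=(slice(None), np.array([0, 88, 1001])))

    # A coordinate slice combined with a value filter; the filter syntax is
    # implementation-defined.
    obs_query = AxisQuery(coords=(slice(1, None),), value_filter="tissue == 'lung'")

    # With a concrete implementation, such queries are typically handed to an
    # experiment-level query, e.g.:
    #   with exp.axis_query("RNA", obs_query=obs_query) as q:
    #       obs = q.obs().concat()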
6 changes: 4 additions & 2 deletions python-spec/src/somacore/query/query.py
@@ -223,6 +223,8 @@ def X(
and which partition to include, if present.
result_order: the order to return results, specified as a
:class:`~options.ResultOrder` or its string value.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Lifecycle: maturing
"""
@@ -539,7 +541,7 @@ def _read_axis_dataframe(

# Do the actual query.
arrow_table = axis_df.read(
axis_query.coords,
coords=axis_query.coords,
value_filter=axis_query.value_filter,
column_names=query_columns,
).concat()
@@ -679,7 +681,7 @@ class _AxisQueryResult:
var: pd.DataFrame
"""Experiment.ms[...].var query slice, as a pandas DataFrame"""
X: sparse.csr_matrix
"""Experiment.ms[...].X[...] query slice, as an SciPy sparse.csr_matrix """
"""Experiment.ms[...].X[...] query slice, as a SciPy sparse.csr_matrix """
X_layers: Dict[str, sparse.csr_matrix] = attrs.field(factory=dict)
"""Any additional X layers requested, as SciPy sparse.csr_matrix(s)"""
obsm: Dict[str, np.ndarray] = attrs.field(factory=dict)
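The ``coords=`` change above simply passes the coordinates by keyword instead of positionally. A sketch of the resulting call pattern, assuming ``obs_df`` is a DataFrame handle from a concrete implementation and with illustrative column names:

    from somacore import AxisQuery

    def read_obs_slice(obs_df, axis_query: AxisQuery):
        # Mirror _read_axis_dataframe: pass coords by keyword (as in the fix),
        # forward the value filter, and concatenate the ReadIter into one table.
        return obs_df.read(
            coords=axis_query.coords,
            value_filter=axis_query.value_filter,
            column_names=["soma_joinid", "tissue"],  # illustrative columns
        ).concat()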
29 changes: 28 additions & 1 deletion python-spec/src/somacore/spatial.py
@@ -93,6 +93,9 @@ def create(
or if ``None`` in a given dimension, the corresponding index-column
domain will use the minimum and maximum possible values for the
column's datatype. This makes a point cloud dataframe growable.
platform_config: platform-specific configuration; keys are SOMA
implementation names.
context: Other implementation-specific configuration.

Returns:
The newly created geometry dataframe, opened for writing.
@@ -122,6 +125,8 @@ def read(
Defaults to ``()``, meaning no constraint -- all IDs.
column_names: the named columns to read and return.
Defaults to ``None``, meaning no constraint -- all column names.
batch_size: The size of batches that should be returned from a read.
See :class:`options.BatchSize` for details.
partitions: If present, specifies that this is part of
a partitioned read, and which part of the data to include.
result_order: the order to return results, specified as a
@@ -130,6 +135,8 @@
The default of ``None`` represents no filter. Value filter
syntax is implementation-defined; see the documentation
for the particular SOMA implementation for details.
platform_config: platform-specific configuration; keys are SOMA
implementation names.
Returns:
A :class:`ReadIter` of :class:`pa.Table`s.

@@ -177,6 +184,8 @@ def read_spatial_region(
The default of ``None`` represents no filter. Value filter
syntax is implementation-defined; see the documentation
for the particular SOMA implementation for details.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns:
A :class:`SpatialRead` with :class:`ReadIter` of :class:`pa.Table`s data.
@@ -201,6 +210,8 @@ def write(
values: An Arrow table containing all columns, including
the index columns. The schema for the values must match
the schema for the ``DataFrame``.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns: ``self``, to enable method chaining.

@@ -323,6 +334,9 @@ def create(
the corresponding index-column domain will use the minimum and maximum
possible values for the column's datatype. This makes a dataframe
growable.
platform_config: platform-specific configuration; keys are SOMA
implementation names.
context: Other implementation-specific configuration.

Returns:
The newly created geometry dataframe, opened for writing.
@@ -352,6 +366,8 @@ def read(
Defaults to ``()``, meaning no constraint -- all IDs.
column_names: the named columns to read and return.
Defaults to ``None``, meaning no constraint -- all column names.
batch_size: The size of batches that should be returned from a read.
See :class:`options.BatchSize` for details.
partitions: If present, specifies that this is part of
a partitioned read, and which part of the data to include.
result_order: the order to return results, specified as a
@@ -360,6 +376,8 @@
The default of ``None`` represents no filter. Value filter
syntax is implementation-defined; see the documentation
for the particular SOMA implementation for details.
platform_config: platform-specific configuration; keys are SOMA
implementation names.
Returns:
A :class:`ReadIter` of :class:`pa.Table`s.

@@ -407,6 +425,8 @@ def read_spatial_region(
The default of ``None`` represents no filter. Value filter
syntax is implementation-defined; see the documentation
for the particular SOMA implementation for details.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns:
A :class:`SpatialRead` with :class:`ReadIter` of :class:`pa.Table`s data.
@@ -431,6 +451,8 @@ def write(
values: An Arrow table containing all columns, including
the index columns. The schema for the values must match
the schema for the ``DataFrame``.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns: ``self``, to enable method chaining.

@@ -553,7 +575,7 @@ def create(
SOMADenseNDArray it must have the shape provided by
``level_shape`` and the type specified in ``type``. If set to ``None``, the
``level_key`` will be used to construct a default child URI. For more
on URIs see :meth:`collection.Collection.add_new_collction`.
on URIs see :meth:`collection.Collection.add_new_collection`.
coordinate_space: Either the coordinate space or the axis names for the
coordinate space the ``level=0`` image is defined on. This does not
include the channel dimension, only spatial dimensions.
@@ -562,6 +584,9 @@
axis is provided, this defaults to the channel axis followed by the
coordinate space axes in reverse order (e.g.
``("soma_channel", "y", "x")`` if ``coordinate_space=("x", "y")``).
platform_config: platform-specific configuration; keys are SOMA
implementation names.
context: Other implementation-specific configuration.

Returns:
The newly created collection, opened for writing.
@@ -662,6 +687,8 @@ def read_spatial_region(
:class:`~options.ResultOrder` or its string value. This is the result
order the data is read from disk. It may be permuted if
``data_axis_order`` is not the default order.
platform_config: platform-specific configuration; keys are SOMA
implementation names.

Returns:
The data bounding the requested region as a :class:`SpatialRead` with
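The spatial read methods take the same implementation-keyed ``platform_config`` mapping, plus the newly documented ``batch_size``. A minimal sketch, assuming ``points`` is a point-cloud dataframe handle from a concrete implementation and that ``options.BatchSize`` accepts a row-count argument; the batch size and config keys are illustrative.

    from somacore import options

    def read_point_batches(points):
        # Plain read: batch the results (count-based batching assumed here)
        # and pass implementation-keyed, illustrative configuration.
        return points.read(
            batch_size=options.BatchSize(count=65_536),
            platform_config={"someimpl": {"read_buffer_bytes": 1 << 20}},
        )

    # read_spatial_region() additionally restricts results to data bounding a
    # requested region and accepts the same platform_config mapping, e.g.:
    #   points.read_spatial_region(region, platform_config={"someimpl": {...}})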