diff --git a/docs/src/userguide/cube_statistics.rst b/docs/src/userguide/cube_statistics.rst index 4eb016078e..d62a056f33 100644 --- a/docs/src/userguide/cube_statistics.rst +++ b/docs/src/userguide/cube_statistics.rst @@ -23,9 +23,9 @@ Collapsing Entire Data Dimensions In the :doc:`subsetting_a_cube` section we saw how to extract a subset of a cube in order to reduce either its dimensionality or its resolution. -Instead of simply extracting a sub-region of the data, -we can produce statistical functions of the data values -across a particular dimension, +Instead of simply extracting a sub-region of the data, +we can produce statistical functions of the data values +across a particular dimension, such as a 'mean over time' or 'minimum over latitude'. .. _cube-statistics_forecast_printout: @@ -57,9 +57,9 @@ For instance, suppose we have a cube: um_version: 7.3 -In this case we have a 4 dimensional cube; -to mean the vertical (z) dimension down to a single valued extent -we can pass the coordinate name and the aggregation definition to the +In this case we have a 4 dimensional cube; +to mean the vertical (z) dimension down to a single valued extent +we can pass the coordinate name and the aggregation definition to the :meth:`Cube.collapsed() ` method: >>> import iris.analysis @@ -88,8 +88,8 @@ we can pass the coordinate name and the aggregation definition to the mean: model_level_number -Similarly other analysis operators such as ``MAX``, ``MIN`` and ``STD_DEV`` -can be used instead of ``MEAN``, see :mod:`iris.analysis` for a full list +Similarly other analysis operators such as ``MAX``, ``MIN`` and ``STD_DEV`` +can be used instead of ``MEAN``, see :mod:`iris.analysis` for a full list of currently supported operators. For an example of using this functionality, the @@ -103,14 +103,14 @@ in the gallery takes a zonal mean of an ``XYT`` cube by using the Area Averaging ^^^^^^^^^^^^^^ -Some operators support additional keywords to the ``cube.collapsed`` method. -For example, :func:`iris.analysis.MEAN ` supports -a weights keyword which can be combined with +Some operators support additional keywords to the ``cube.collapsed`` method. +For example, :func:`iris.analysis.MEAN ` supports +a weights keyword which can be combined with :func:`iris.analysis.cartography.area_weights` to calculate an area average. -Let's use the same data as was loaded in the previous example. -Since ``grid_latitude`` and ``grid_longitude`` were both point coordinates -we must guess bound positions for them +Let's use the same data as was loaded in the previous example. +Since ``grid_latitude`` and ``grid_longitude`` were both point coordinates +we must guess bound positions for them in order to calculate the area of the grid boxes:: import iris.analysis.cartography @@ -155,24 +155,24 @@ including an example on taking a :ref:`global area-weighted mean Partially Reducing Data Dimensions ---------------------------------- -Instead of completely collapsing a dimension, other methods can be applied -to reduce or filter the number of data points of a particular dimension. +Instead of completely collapsing a dimension, other methods can be applied +to reduce or filter the number of data points of a particular dimension. Aggregation of Grouped Data ^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The :meth:`Cube.aggregated_by ` operation -combines data for all points with the same value of a given coordinate. 
-To do this, you need a coordinate whose points take on only a limited set -of different values -- the *number* of these then determines the size of the +The :meth:`Cube.aggregated_by ` operation +combines data for all points with the same value of a given coordinate. +To do this, you need a coordinate whose points take on only a limited set +of different values -- the *number* of these then determines the size of the reduced dimension. -The :mod:`iris.coord_categorisation` module can be used to make such -'categorical' coordinates out of ordinary ones: The most common use is -to aggregate data over regular *time intervals*, +The :mod:`iris.coord_categorisation` module can be used to make such +'categorical' coordinates out of ordinary ones: The most common use is +to aggregate data over regular *time intervals*, such as by calendar month or day of the week. -For example, let's create two new coordinates on the cube +For example, let's create two new coordinates on the cube to represent the climatological seasons and the season year respectively:: import iris @@ -188,8 +188,8 @@ to represent the climatological seasons and the season year respectively:: .. note:: - The 'season year' is not the same as year number, because (e.g.) the months - Dec11, Jan12 + Feb12 all belong to 'DJF-12'. + The 'season year' is not the same as year number, because (e.g.) the months + Dec11, Jan12 + Feb12 all belong to 'DJF-12'. See :meth:`iris.coord_categorisation.add_season_year`. @@ -206,10 +206,10 @@ to represent the climatological seasons and the season year respectively:: iris.coord_categorisation.add_season_year(cube, 'time', name='season_year') annual_seasonal_mean = cube.aggregated_by( - ['clim_season', 'season_year'], + ['clim_season', 'season_year'], iris.analysis.MEAN) - + Printing this cube now shows that two extra coordinates exist on the cube: .. doctest:: aggregation @@ -238,20 +238,20 @@ These two coordinates can now be used to aggregate by season and climate-year: .. doctest:: aggregation >>> annual_seasonal_mean = cube.aggregated_by( - ... ['clim_season', 'season_year'], + ... ['clim_season', 'season_year'], ... iris.analysis.MEAN) >>> print(repr(annual_seasonal_mean)) - -The primary change in the cube is that the cube's data has been -reduced in the 'time' dimension by aggregation (taking means, in this case). -This has collected together all data points with the same values of season and + +The primary change in the cube is that the cube's data has been +reduced in the 'time' dimension by aggregation (taking means, in this case). +This has collected together all data points with the same values of season and season-year. The results are now indexed by the 19 different possible values of season and season-year in a new, reduced 'time' dimension. -We can see this by printing the first 10 values of season+year -from the original cube: These points are individual months, +We can see this by printing the first 10 values of season+year +from the original cube: These points are individual months, so adjacent ones are often in the same season: .. doctest:: aggregation @@ -271,7 +271,7 @@ so adjacent ones are often in the same season: djf 2007 djf 2007 -Compare this with the first 10 values of the new cube's coordinates: +Compare this with the first 10 values of the new cube's coordinates: All the points now have distinct season+year values: .. 
doctest:: aggregation @@ -294,7 +294,7 @@ All the points now have distinct season+year values: Because the original data started in April 2006 we have some incomplete seasons (e.g. there were only two months worth of data for 'mam-2006'). -In this case we can fix this by removing all of the resultant 'times' which +In this case we can fix this by removing all of the resultant 'times' which do not cover a three month period (note: judged here as > 3*28 days): .. doctest:: aggregation @@ -306,7 +306,7 @@ do not cover a three month period (note: judged here as > 3*28 days): >>> full_season_means -The final result now represents the seasonal mean temperature for 17 seasons +The final result now represents the seasonal mean temperature for 17 seasons from jja-2006 to jja-2010: .. doctest:: aggregation diff --git a/lib/iris/common/metadata.py b/lib/iris/common/metadata.py index 174f115187..801ba57c44 100644 --- a/lib/iris/common/metadata.py +++ b/lib/iris/common/metadata.py @@ -27,10 +27,6 @@ __all__ = [ - "SERVICES_COMBINE", - "SERVICES_DIFFERENCE", - "SERVICES_EQUAL", - "SERVICES", "AncillaryVariableMetadata", "BaseMetadata", "CellMeasureMetadata", @@ -38,11 +34,19 @@ "CubeMetadata", "DimCoordMetadata", "hexdigest", + "metadata_filter", "metadata_manager_factory", + "SERVICES", + "SERVICES_COMBINE", + "SERVICES_DIFFERENCE", + "SERVICES_EQUAL", ] # https://www.unidata.ucar.edu/software/netcdf/docs/netcdf_data_set_components.html#object_name + +from ..util import guess_coord_axis + _TOKEN_PARSE = re.compile(r"""^[a-zA-Z0-9][\w\.\+\-@]*$""") # Configure the logger. @@ -194,9 +198,18 @@ def func(field): return result # Note that, for strict we use "_fields" not "_members". - # The "circular" and "src_dim" members do not participate in strict equivalence. + # TODO: refactor so that 'non-participants' can be held in their specific subclasses. + # Certain members never participate in strict equivalence, so + # are filtered out. fields = filter( - lambda field: field not in ("circular", "src_dim"), + lambda field: field + not in ( + "circular", + "src_dim", + "node_dimension", + "edge_dimension", + "face_dimension", + ), self._fields, ) result = all([func(field) for field in fields]) @@ -1330,6 +1343,146 @@ def equal(self, other, lenient=None): return super().equal(other, lenient=lenient) +def metadata_filter( + instances, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, +): + """ + Filter a collection of objects by their metadata to fit the given metadata + criteria. Criteria can be one or both of: specific properties / other objects + carrying metadata to be matched. + + Args: + + * instances + One or more objects to be filtered. + + Kwargs: + + * item + Either + + (a) a :attr:`standard_name`, :attr:`long_name`, or + :attr:`var_name`. Defaults to value of `default` + (which itself defaults to `unknown`) as defined in + :class:`~iris.common.CFVariableMixin`. + + (b) a 'coordinate' instance with metadata equal to that of + the desired coordinates. Accepts either a + :class:`~iris.coords.DimCoord`, :class:`~iris.coords.AuxCoord`, + :class:`~iris.aux_factory.AuxCoordFactory`, + :class:`~iris.common.CoordMetadata` or + :class:`~iris.common.DimCoordMetadata` or + :class:`~iris.experimental.ugrid.ConnectivityMetadata`. + * standard_name + The CF standard name of the desired coordinate. If None, does not + check for standard name. + * long_name + An unconstrained description of the coordinate. If None, does not + check for long_name. 
+ * var_name + The netCDF variable name of the desired coordinate. If None, does + not check for var_name. + * attributes + A dictionary of attributes desired on the coordinates. If None, + does not check for attributes. + * axis + The desired coordinate axis, see + :func:`~iris.util.guess_coord_axis`. If None, does not check for + axis. Accepts the values 'X', 'Y', 'Z' and 'T' (case-insensitive). + + Returns: + A list of the objects supplied in the ``instances`` argument, limited + to only those that matched the given criteria. + + """ + name = None + obj = None + + if isinstance(item, str): + name = item + else: + obj = item + + # apply de morgan's law for one less logical operation + if not (isinstance(instances, str) or isinstance(instances, Iterable)): + instances = [instances] + + result = instances + + if name is not None: + result = [instance for instance in result if instance.name() == name] + + if standard_name is not None: + result = [ + instance + for instance in result + if instance.standard_name == standard_name + ] + + if long_name is not None: + result = [ + instance for instance in result if instance.long_name == long_name + ] + + if var_name is not None: + result = [ + instance for instance in result if instance.var_name == var_name + ] + + if attributes is not None: + if not isinstance(attributes, Mapping): + msg = ( + "The attributes keyword was expecting a dictionary " + "type, but got a %s instead." % type(attributes) + ) + raise ValueError(msg) + + def attr_filter(instance): + return all( + k in instance.attributes + and hexdigest(instance.attributes[k]) == hexdigest(v) + for k, v in attributes.items() + ) + + result = [instance for instance in result if attr_filter(instance)] + + if axis is not None: + axis = axis.upper() + + def get_axis(instance): + if hasattr(instance, "axis"): + axis = instance.axis.upper() + else: + axis = guess_coord_axis(instance) + return axis + + result = [ + instance for instance in result if get_axis(instance) == axis + ] + + if obj is not None: + if hasattr(obj, "__class__") and issubclass( + obj.__class__, BaseMetadata + ): + target_metadata = obj + else: + target_metadata = obj.metadata + + result = [ + instance + for instance in result + if instance.metadata == target_metadata + ] + + return result + + def metadata_manager_factory(cls, **kwargs): """ A class instance factory function responsible for manufacturing diff --git a/lib/iris/cube.py b/lib/iris/cube.py index a15951900b..e8b6d4a692 100644 --- a/lib/iris/cube.py +++ b/lib/iris/cube.py @@ -13,7 +13,6 @@ from collections.abc import ( Iterable, Container, - Mapping, MutableMapping, Iterator, ) @@ -40,11 +39,10 @@ import iris.aux_factory from iris.common import ( CFVariableMixin, - CoordMetadata, CubeMetadata, - DimCoordMetadata, metadata_manager_factory, ) +from iris.common.metadata import metadata_filter import iris.coord_systems import iris.coords import iris.exceptions @@ -1639,14 +1637,6 @@ def coords( See also :meth:`Cube.coord()`. 
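        The per-keyword filtering in this method is now delegated to the shared
        :func:`~iris.common.metadata.metadata_filter` helper, which can also be
        used directly on any collection of metadata-bearing objects. An
        illustrative sketch (editorial; the collection and names here are
        assumptions, not taken from this change)::

            from iris.common.metadata import metadata_filter

            # 'candidates' is any iterable of coordinates / connectivities.
            # Keep only the members named 'longitude' that look like an X axis.
            matches = metadata_filter(candidates, item='longitude', axis='x')
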
""" - name = None - coord = None - - if isinstance(name_or_coord, str): - name = name_or_coord - else: - coord = name_or_coord - coords_and_factories = [] if dim_coords in [True, None]: @@ -1656,62 +1646,15 @@ def coords( coords_and_factories += list(self.aux_coords) coords_and_factories += list(self.aux_factories) - if name is not None: - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if coord_.name() == name - ] - - if standard_name is not None: - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if coord_.standard_name == standard_name - ] - - if long_name is not None: - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if coord_.long_name == long_name - ] - - if var_name is not None: - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if coord_.var_name == var_name - ] - - if axis is not None: - axis = axis.upper() - guess_axis = iris.util.guess_coord_axis - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if guess_axis(coord_) == axis - ] - - if attributes is not None: - if not isinstance(attributes, Mapping): - msg = ( - "The attributes keyword was expecting a dictionary " - "type, but got a %s instead." % type(attributes) - ) - raise ValueError(msg) - - def attr_filter(coord_): - return all( - k in coord_.attributes and coord_.attributes[k] == v - for k, v in attributes.items() - ) - - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if attr_filter(coord_) - ] + coords_and_factories = metadata_filter( + coords_and_factories, + item=name_or_coord, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + ) if coord_system is not None: coords_and_factories = [ @@ -1720,20 +1663,6 @@ def attr_filter(coord_): if coord_.coord_system == coord_system ] - if coord is not None: - if hasattr(coord, "__class__") and coord.__class__ in ( - CoordMetadata, - DimCoordMetadata, - ): - target_metadata = coord - else: - target_metadata = coord.metadata - coords_and_factories = [ - coord_ - for coord_ in coords_and_factories - if coord_.metadata == target_metadata - ] - if contains_dimension is not None: coords_and_factories = [ coord_ diff --git a/lib/iris/exceptions.py b/lib/iris/exceptions.py index 1c05d13163..12d24ef70f 100644 --- a/lib/iris/exceptions.py +++ b/lib/iris/exceptions.py @@ -39,6 +39,12 @@ class AncillaryVariableNotFoundError(KeyError): pass +class ConnectivityNotFoundError(KeyError): + """Raised when a search yields no connectivities.""" + + pass + + class CoordinateMultiDimError(ValueError): """Raised when a routine doesn't support multi-dimensional coordinates.""" diff --git a/lib/iris/experimental/ugrid.py b/lib/iris/experimental/ugrid.py index e5420b6041..f14d2f0a35 100644 --- a/lib/iris/experimental/ugrid.py +++ b/lib/iris/experimental/ugrid.py @@ -10,6 +10,8 @@ """ +from abc import ABC, abstractmethod +from collections import Iterable, namedtuple from functools import wraps import dask.array as da @@ -18,6 +20,7 @@ from .. 
import _lazy_data as _lazy from ..common.metadata import ( BaseMetadata, + metadata_filter, metadata_manager_factory, SERVICES, SERVICES_COMBINE, @@ -25,15 +28,64 @@ SERVICES_DIFFERENCE, ) from ..common.lenient import _lenient_service as lenient_service -from ..coords import _DimensionalMetadata +from ..common.mixin import CFVariableMixin +from ..config import get_logger +from ..coords import _DimensionalMetadata, AuxCoord +from ..exceptions import ConnectivityNotFoundError, CoordinateNotFoundError +from ..util import guess_coord_axis __all__ = [ "Connectivity", "ConnectivityMetadata", + "Mesh1DConnectivities", + "Mesh1DCoords", + "Mesh2DConnectivities", + "Mesh2DCoords", + "MeshEdgeCoords", + "MeshFaceCoords", + "MeshNodeCoords", + "MeshMetadata", ] +# Configure the logger. +logger = get_logger(__name__, fmt="[%(cls)s.%(funcName)s]") + + +# Mesh dimension names namedtuples. +Mesh1DNames = namedtuple("Mesh1DNames", ["node_dimension", "edge_dimension"]) +Mesh2DNames = namedtuple( + "Mesh2DNames", ["node_dimension", "edge_dimension", "face_dimension"] +) + +# Mesh coordinate manager namedtuples. +Mesh1DCoords = namedtuple( + "Mesh1DCoords", ["node_x", "node_y", "edge_x", "edge_y"] +) +Mesh2DCoords = namedtuple( + "Mesh2DCoords", + ["node_x", "node_y", "edge_x", "edge_y", "face_x", "face_y"], +) +MeshNodeCoords = namedtuple("MeshNodeCoords", ["node_x", "node_y"]) +MeshEdgeCoords = namedtuple("MeshEdgeCoords", ["edge_x", "edge_y"]) +MeshFaceCoords = namedtuple("MeshFaceCoords", ["face_x", "face_y"]) + +# Mesh connectivity manager namedtuples. +Mesh1DConnectivities = namedtuple("Mesh1DConnectivities", ["edge_node"]) +Mesh2DConnectivities = namedtuple( + "Mesh2DConnectivities", + [ + "face_node", + "edge_node", + "face_edge", + "face_face", + "edge_face", + "boundary_node", + ], +) + + class Connectivity(_DimensionalMetadata): """ A CF-UGRID topology connectivity, describing the topological relationship @@ -497,7 +549,7 @@ def xml_element(self, doc): class ConnectivityMetadata(BaseMetadata): """ - Metadata container for a :class:`~iris.coords.Connectivity`. + Metadata container for a :class:`~iris.experimental.ugrid.Connectivity`. """ @@ -615,17 +667,1347 @@ def equal(self, other, lenient=None): return super().equal(other, lenient=lenient) +class MeshMetadata(BaseMetadata): + """ + Metadata container for a :class:`~iris.experimental.ugrid.Mesh`. + + """ + + # The node_dimension", "edge_dimension" and "face_dimension" members are + # stateful only; they not participate in lenient/strict equivalence. + _members = ( + "topology_dimension", + "node_dimension", + "edge_dimension", + "face_dimension", + ) + + __slots__ = () + + @wraps(BaseMetadata.__eq__, assigned=("__doc__",), updated=()) + @lenient_service + def __eq__(self, other): + return super().__eq__(other) + + def _combine_lenient(self, other): + """ + Perform lenient combination of metadata members for meshes. + + Args: + + * other (MeshMetadata): + The other mesh metadata participating in the lenient + combination. + + Returns: + A list of combined metadata member values. + + """ + + # Perform "strict" combination for "topology_dimension", + # "node_dimension", "edge_dimension" and "face_dimension". + def func(field): + left = getattr(self, field) + right = getattr(other, field) + return left if left == right else None + + # Note that, we use "_members" not "_fields". + values = [func(field) for field in MeshMetadata._members] + # Perform lenient combination of the other parent members. 
+ result = super()._combine_lenient(other) + result.extend(values) + + return result + + def _compare_lenient(self, other): + """ + Perform lenient equality of metadata members for meshes. + + Args: + + * other (MeshMetadata): + The other mesh metadata participating in the lenient + comparison. + + Returns: + Boolean. + + """ + # Perform "strict" comparison for "topology_dimension". + # "node_dimension", "edge_dimension" and "face_dimension" are not part + # of lenient equivalence at all. + result = self.topology_dimension == other.topology_dimension + if result: + # Perform lenient comparison of the other parent members. + result = super()._compare_lenient(other) + + return result + + def _difference_lenient(self, other): + """ + Perform lenient difference of metadata members for meshes. + + Args: + + * other (MeshMetadata): + The other mesh metadata participating in the lenient + difference. + + Returns: + A list of difference metadata member values. + + """ + # Perform "strict" difference for "topology_dimension", + # "node_dimension", "edge_dimension" and "face_dimension". + def func(field): + left = getattr(self, field) + right = getattr(other, field) + return None if left == right else (left, right) + + # Note that, we use "_members" not "_fields". + values = [func(field) for field in MeshMetadata._members] + # Perform lenient difference of the other parent members. + result = super()._difference_lenient(other) + result.extend(values) + + return result + + @wraps(BaseMetadata.combine, assigned=("__doc__",), updated=()) + @lenient_service + def combine(self, other, lenient=None): + return super().combine(other, lenient=lenient) + + @wraps(BaseMetadata.difference, assigned=("__doc__",), updated=()) + @lenient_service + def difference(self, other, lenient=None): + return super().difference(other, lenient=lenient) + + @wraps(BaseMetadata.equal, assigned=("__doc__",), updated=()) + @lenient_service + def equal(self, other, lenient=None): + return super().equal(other, lenient=lenient) + + +class Mesh(CFVariableMixin): + """ + + .. todo:: + + .. questions:: + + - decide on the verbose/succinct version of __str__ vs __repr__ + + .. notes:: + + - the mesh is location agnostic + + - no need to support volume at mesh level, yet + + - topology_dimension + - use for fast equality between Mesh instances + - checking connectivity dimensionality, specifically the highest dimensonality of the + "geometric element" being added i.e., reference the src_location/tgt_location + - used to honour and enforce the minimum UGRID connectivity contract + + - support pickling + + - copy is off the table!! + + - MeshCoord.guess_points() + - MeshCoord.to_AuxCoord() + + - don't provide public methods to return the coordinate and connectivity + managers + + - validate both managers contents e.g., shape? more...? + + """ + + # TBD: for volume and/or z-axis support include axis "z" and/or dimension "3" + AXES = ("x", "y") + TOPOLOGY_DIMENSIONS = (1, 2) + + def __init__( + self, + topology_dimension, + node_coords_and_axes, + connectivities, + standard_name=None, + long_name=None, + var_name=None, + units=None, + attributes=None, + edge_coords_and_axes=None, + face_coords_and_axes=None, + node_dimension=None, + edge_dimension=None, + face_dimension=None, + ): + # TODO: support volumes. 
+ # TODO: support (coord, "z") + + self._metadata_manager = metadata_manager_factory(MeshMetadata) + + # topology_dimension is read-only, so assign directly to the metadata manager + if topology_dimension not in self.TOPOLOGY_DIMENSIONS: + emsg = f"Expected 'topology_dimension' in range {self.TOPOLOGY_DIMENSIONS!r}, got {topology_dimension!r}." + raise ValueError(emsg) + self._metadata_manager.topology_dimension = topology_dimension + + # TBD: these are strings, if None is provided then assign the default string. + self.node_dimension = node_dimension + self.edge_dimension = edge_dimension + self.face_dimension = face_dimension + + # assign the metadata to the metadata manager + self.standard_name = standard_name + self.long_name = long_name + self.var_name = var_name + self.units = units + self.attributes = attributes + + # based on the topology_dimension, create the appropriate coordinate manager + def normalise(location, axis): + result = str(axis).lower() + if result not in self.AXES: + emsg = f"Invalid axis specified for {location} coordinate {coord.name()!r}, got {axis!r}." + raise ValueError(emsg) + return f"{location}_{axis}" + + if not isinstance(node_coords_and_axes, Iterable): + node_coords_and_axes = [node_coords_and_axes] + + if not isinstance(connectivities, Iterable): + connectivities = [connectivities] + + kwargs = {} + for coord, axis in node_coords_and_axes: + kwargs[normalise("node", axis)] = coord + if edge_coords_and_axes is not None: + for coord, axis in edge_coords_and_axes: + kwargs[normalise("edge", axis)] = coord + if face_coords_and_axes is not None: + for coord, axis in face_coords_and_axes: + kwargs[normalise("face", axis)] = coord + + # check the UGRID minimum requirement for coordinates + if "node_x" not in kwargs: + emsg = ( + "Require a node coordinate that is x-axis like to be provided." + ) + raise ValueError(emsg) + if "node_y" not in kwargs: + emsg = ( + "Require a node coordinate that is y-axis like to be provided." + ) + raise ValueError(emsg) + + if self.topology_dimension == 1: + self._coord_manager = _Mesh1DCoordinateManager(**kwargs) + self._connectivity_manager = _Mesh1DConnectivityManager( + *connectivities + ) + elif self.topology_dimension == 2: + self._coord_manager = _Mesh2DCoordinateManager(**kwargs) + self._connectivity_manager = _Mesh2DConnectivityManager( + *connectivities + ) + else: + emsg = f"Unsupported 'topology_dimension', got {topology_dimension!r}." 
+ raise NotImplementedError(emsg) + + def __eq__(self, other): + # TBD + return NotImplemented + + def __getstate__(self): + return ( + self._metadata_manager, + self._coord_manager, + self._connectivity_manager, + ) + + def __ne__(self, other): + # TBD + return NotImplemented + + def __repr__(self): + # TBD + args = [] + return f"{self.__class__.__name__}({', '.join(args)})" + + def __setstate__(self, state): + metadata_manager, coord_manager, connectivity_manager = state + self._metadata_manager = metadata_manager + self._coord_manager = coord_manager + self._connectivity_manager = connectivity_manager + + def __str__(self): + # TBD + args = [] + return f"{self.__class__.__name__}({', '.join(args)})" + + def _set_dimension_names(self, node, edge, face, reset=False): + args = (node, edge, face) + currents = ( + self.node_dimension, + self.edge_dimension, + self.face_dimension, + ) + zipped = zip(args, currents) + if reset: + node, edge, face = [ + None if arg else current for arg, current in zipped + ] + else: + node, edge, face = [arg or current for arg, current in zipped] + + self.node_dimension = node + self.edge_dimension = edge + self.face_dimension = face + + if self.topology_dimension == 1: + result = Mesh1DNames(self.node_dimension, self.edge_dimension) + elif self.topology_dimension == 2: + result = Mesh2DNames( + self.node_dimension, self.edge_dimension, self.face_dimension + ) + else: + message = ( + f"Unsupported topology_dimension: {self.topology_dimension} ." + ) + raise NotImplementedError(message) + + return result + + @property + def all_coords(self): + return self._coord_manager.all_members + + @property + def edge_dimension(self): + return self._metadata_manager.edge_dimension + + @edge_dimension.setter + def edge_dimension(self, name): + if not name or not isinstance(name, str): + edge_dimension = f"Mesh{self.topology_dimension}d_edge" + else: + edge_dimension = name + self._metadata_manager.edge_dimension = edge_dimension + + @property + def edge_coords(self): + return self._coord_manager.edge_coords + + @property + def face_dimension(self): + return self._metadata_manager.face_dimension + + @face_dimension.setter + def face_dimension(self, name): + if not name or not isinstance(name, str): + face_dimension = f"Mesh{self.topology_dimension}d_face" + else: + face_dimension = name + self._metadata_manager.face_dimension = face_dimension + + @property + def face_coords(self): + return self._coord_manager.face_coords + + @property + def node_dimension(self): + return self._metadata_manager.node_dimension + + @node_dimension.setter + def node_dimension(self, name): + if not name or not isinstance(name, str): + node_dimension = f"Mesh{self.topology_dimension}d_node" + else: + node_dimension = name + self._metadata_manager.node_dimension = node_dimension + + @property + def node_coords(self): + return self._coord_manager.node_coords + + @property + def all_connectivities(self): + return self._connectivity_manager.all_members + + @property + def face_node_connectivity(self): + # required + return self._connectivity_manager.face_node + + @property + def edge_node_connectivity(self): + # optionally required + return self._connectivity_manager.edge_node + + @property + def face_edge_connectivity(self): + # optional + return self._connectivity_manager.face_edge + + @property + def face_face_connectivity(self): + # optional + return self._connectivity_manager.face_face + + @property + def edge_face_connectivity(self): + # optional + return self._connectivity_manager.edge_face + + 
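    # Illustrative sketch only (editorial, not part of this change): how a
    # caller might assemble a minimal 1-D mesh from the pieces above. The
    # exact Connectivity constructor arguments are an assumption here.
    #
    #   node_x = AuxCoord([0.0, 1.0, 2.0], standard_name="longitude")
    #   node_y = AuxCoord([0.0, 0.5, 1.0], standard_name="latitude")
    #   edge_node = Connectivity(
    #       [[0, 1], [1, 2]], cf_role="edge_node_connectivity"
    #   )
    #   mesh = Mesh(
    #       topology_dimension=1,
    #       node_coords_and_axes=[(node_x, "x"), (node_y, "y")],
    #       connectivities=[edge_node],
    #   )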
@property + def boundary_node_connectivity(self): + # optional + return self._connectivity_manager.boundary_node + + def add_coords( + self, + node_x=None, + node_y=None, + edge_x=None, + edge_y=None, + face_x=None, + face_y=None, + ): + self._coord_manager.add( + node_x=node_x, + node_y=node_y, + edge_x=edge_x, + edge_y=edge_y, + face_x=face_x, + face_y=face_y, + ) + + def add_connectivities(self, *connectivities): + self._connectivity_manager.add(*connectivities) + + def connectivities( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + cf_role=None, + node=None, + edge=None, + face=None, + ): + return self._connectivity_manager.filters( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + cf_role=cf_role, + node=node, + edge=edge, + face=face, + ) + + def connectivity( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + cf_role=None, + node=None, + edge=None, + face=None, + ): + return self._connectivity_manager.filter( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + cf_role=cf_role, + node=node, + edge=edge, + face=face, + ) + + def coord( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=None, + edge=None, + face=None, + ): + return self._coord_manager.filter( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + node=node, + edge=edge, + face=face, + ) + + def coords( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=False, + edge=False, + face=False, + ): + return self._coord_manager.filters( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + node=node, + edge=edge, + face=face, + ) + + def remove_connectivities( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + cf_role=None, + node=None, + edge=None, + face=None, + ): + return self._connectivity_manager.remove( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + cf_role=cf_role, + node=node, + edge=edge, + face=face, + ) + + def remove_coords( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=None, + edge=None, + face=None, + ): + return self._coord_manager.remove( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + node=node, + edge=edge, + face=face, + ) + + def xml_element(self): + # TBD + pass + + # the MeshCoord will always have bounds, perhaps points. However the MeshCoord.guess_points() may + # be a very useful part of its behaviour. + # after using MeshCoord.guess_points(), the user may wish to add the associated MeshCoord.points into + # the Mesh as face_coordinates. + + # def to_AuxCoord(self, location, axis): + # # factory method + # # return the lazy AuxCoord(...) for the given location and axis + # + # def to_AuxCoords(self, location): + # # factory method + # # return the lazy AuxCoord(...), AuxCoord(...) 
+ # + # def to_MeshCoord(self, location, axis): + # # factory method + # # return MeshCoord(..., location=location, axis=axis) + # # use Connectivity.indices_by_src() for fetching indices, passing in the lazy_indices() result as an argument. + # + # def to_MeshCoords(self, location): + # # factory method + # # return MeshCoord(..., location=location, axis="x"), MeshCoord(..., location=location, axis="y") + # # use Connectivity.indices_by_src for fetching indices, passing in the lazy_indices() result as an argument. + + def dimension_names_reset(self, node=False, edge=False, face=False): + return self._set_dimension_names(node, edge, face, reset=True) + + def dimension_names(self, node=None, edge=None, face=None): + return self._set_dimension_names(node, edge, face, reset=False) + + @property + def cf_role(self): + return "mesh_topology" + + @property + def topology_dimension(self): + return self._metadata_manager.topology_dimension + + +class _Mesh1DCoordinateManager: + """ + + TBD: require clarity on coord_systems validation + TBD: require clarity on __eq__ support + TBD: rationalise self.coords() logic with other manager and Cube + + """ + + REQUIRED = ( + "node_x", + "node_y", + ) + OPTIONAL = ( + "edge_x", + "edge_y", + ) + + def __init__(self, node_x, node_y, edge_x=None, edge_y=None): + # initialise all the coordinates + self.ALL = self.REQUIRED + self.OPTIONAL + self._members = {member: None for member in self.ALL} + + # required coordinates + self.node_x = node_x + self.node_y = node_y + # optional coordinates + self.edge_x = edge_x + self.edge_y = edge_y + + def __eq__(self, other): + # TBD + return NotImplemented + + def __getstate__(self): + return self._members + + def __iter__(self): + for item in self._members.items(): + yield item + + def __ne__(self, other): + # TBD + return NotImplemented + + def __repr__(self): + args = [ + f"{member}={coord!r}" + for member, coord in self + if coord is not None + ] + return f"{self.__class__.__name__}({', '.join(args)})" + + def __setstate__(self, state): + self._members = state + + def __str__(self): + args = [ + f"{member}=True" for member, coord in self if coord is not None + ] + return f"{self.__class__.__name__}({', '.join(args)})" + + def _remove(self, **kwargs): + result = {} + members = self.filters(**kwargs) + + for member in members.keys(): + if member in self.REQUIRED: + dmsg = f"Ignoring request to remove required coordinate {member!r}" + logger.debug(dmsg, extra=dict(cls=self.__class__.__name__)) + else: + result[member] = members[member] + setattr(self, member, None) + + return result + + def _setter(self, location, axis, coord, shape): + axis = axis.lower() + member = f"{location}_{axis}" + + # enforce the UGRID minimum coordinate requirement + if location == "node" and coord is None: + emsg = ( + f"{member!r} is a required coordinate, cannot set to 'None'." + ) + raise ValueError(emsg) + + if coord is not None: + if not isinstance(coord, AuxCoord): + emsg = f"{member!r} requires to be an 'AuxCoord', got {type(coord)}." + raise TypeError(emsg) + + guess_axis = guess_coord_axis(coord) + + if guess_axis and guess_axis.lower() != axis: + emsg = f"{member!r} requires a {axis}-axis like 'AuxCoord', got a {guess_axis.lower()}-axis like." + raise TypeError(emsg) + + if coord.climatological: + emsg = f"{member!r} cannot be a climatological 'AuxCoord'." + raise TypeError(emsg) + + if shape is not None and coord.shape != shape: + emsg = f"{member!r} requires to have shape {shape!r}, got {coord.shape!r}." 
+ raise ValueError(emsg) + + self._members[member] = coord + + def _shape(self, location): + coord = getattr(self, f"{location}_x") + shape = coord.shape if coord is not None else None + if shape is None: + coord = getattr(self, f"{location}_y") + if coord is not None: + shape = coord.shape + return shape + + @property + def _edge_shape(self): + return self._shape(location="edge") + + @property + def _node_shape(self): + return self._shape(location="node") + + @property + def all_members(self): + return Mesh1DCoords(**self._members) + + @property + def edge_coords(self): + return MeshEdgeCoords(edge_x=self.edge_x, edge_y=self.edge_y) + + @property + def edge_x(self): + return self._members["edge_x"] + + @edge_x.setter + def edge_x(self, coord): + self._setter( + location="edge", axis="x", coord=coord, shape=self._edge_shape + ) + + @property + def edge_y(self): + return self._members["edge_y"] + + @edge_y.setter + def edge_y(self, coord): + self._setter( + location="edge", axis="y", coord=coord, shape=self._edge_shape + ) + + @property + def node_coords(self): + return MeshNodeCoords(node_x=self.node_x, node_y=self.node_y) + + @property + def node_x(self): + return self._members["node_x"] + + @node_x.setter + def node_x(self, coord): + self._setter( + location="node", axis="x", coord=coord, shape=self._node_shape + ) + + @property + def node_y(self): + return self._members["node_y"] + + @node_y.setter + def node_y(self, coord): + self._setter( + location="node", axis="y", coord=coord, shape=self._node_shape + ) + + def _add(self, coords): + member_x, member_y = coords._fields + + # deal with the special case where both members are changing + if coords[0] is not None and coords[1] is not None: + cache_x = self._members[member_x] + cache_y = self._members[member_y] + self._members[member_x] = None + self._members[member_y] = None + + try: + setattr(self, member_x, coords[0]) + setattr(self, member_y, coords[1]) + except (TypeError, ValueError): + # restore previous valid state + self._members[member_x] = cache_x + self._members[member_y] = cache_y + # now, re-raise the exception + raise + else: + # deal with the case where one or no member is changing + if coords[0] is not None: + setattr(self, member_x, coords[0]) + if coords[1] is not None: + setattr(self, member_y, coords[1]) + + def add(self, node_x=None, node_y=None, edge_x=None, edge_y=None): + """ + use self.remove(edge_x=True) to remove a coordinate e.g., using the + pattern self.add(edge_x=None) will not remove the edge_x coordinate + + """ + self._add(MeshNodeCoords(node_x, node_y)) + self._add(MeshEdgeCoords(edge_x, edge_y)) + + def filter(self, **kwargs): + # TODO: rationalise commonality with MeshConnectivityManager.filter and Cube.coord. + result = self.filters(**kwargs) + + if len(result) > 1: + names = ", ".join( + f"{member}={coord!r}" for member, coord in result.items() + ) + emsg = ( + f"Expected to find exactly 1 coordinate, but found {len(result)}. " + f"They were: {names}." + ) + raise CoordinateNotFoundError(emsg) + + if len(result) == 0: + item = kwargs["item"] + if item is not None: + if not isinstance(item, str): + item = item.name() + name = ( + item + or kwargs["standard_name"] + or kwargs["long_name"] + or kwargs["var_name"] + or None + ) + name = "" if name is None else f"{name!r} " + emsg = ( + f"Expected to find exactly 1 {name}coordinate, but found none." 
+ ) + raise CoordinateNotFoundError(emsg) + + return result + + def filters( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=None, + edge=None, + face=None, + ): + # TBD: support coord_systems? + + # rationalise the tri-state behaviour + args = [node, edge, face] + state = not any(set(filter(lambda arg: arg is not None, args))) + node, edge, face = map( + lambda arg: arg if arg is not None else state, args + ) + + def populated_coords(coords_tuple): + return list(filter(None, list(coords_tuple))) + + members = [] + if node: + members += populated_coords(self.node_coords) + if edge: + members += populated_coords(self.edge_coords) + if hasattr(self, "face_coords"): + if face: + members += populated_coords(self.face_coords) + else: + dmsg = "Ignoring request to filter non-existent 'face_coords'" + logger.debug(dmsg, extra=dict(cls=self.__class__.__name__)) + + result = metadata_filter( + members, + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + ) + + # Use the results to filter the _members dict for returning. + result_ids = [id(r) for r in result] + result_dict = { + k: v for k, v in self._members.items() if id(v) in result_ids + } + return result_dict + + def remove( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=None, + edge=None, + ): + return self._remove( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + node=node, + edge=edge, + ) + + +class _Mesh2DCoordinateManager(_Mesh1DCoordinateManager): + OPTIONAL = ( + "edge_x", + "edge_y", + "face_x", + "face_y", + ) + + def __init__( + self, + node_x, + node_y, + edge_x=None, + edge_y=None, + face_x=None, + face_y=None, + ): + super().__init__(node_x, node_y, edge_x=edge_x, edge_y=edge_y) + + # optional coordinates + self.face_x = face_x + self.face_y = face_y + + @property + def _face_shape(self): + return self._shape(location="face") + + @property + def all_members(self): + return Mesh2DCoords(**self._members) + + @property + def face_coords(self): + return MeshFaceCoords(face_x=self.face_x, face_y=self.face_y) + + @property + def face_x(self): + return self._members["face_x"] + + @face_x.setter + def face_x(self, coord): + self._setter( + location="face", axis="x", coord=coord, shape=self._face_shape + ) + + @property + def face_y(self): + return self._members["face_y"] + + @face_y.setter + def face_y(self, coord): + self._setter( + location="face", axis="y", coord=coord, shape=self._face_shape + ) + + def add( + self, + node_x=None, + node_y=None, + edge_x=None, + edge_y=None, + face_x=None, + face_y=None, + ): + super().add(node_x=node_x, node_y=node_y, edge_x=edge_x, edge_y=edge_y) + self._add(MeshFaceCoords(face_x, face_y)) + + def remove( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + axis=None, + node=None, + edge=None, + face=None, + ): + return self._remove( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + axis=axis, + node=node, + edge=edge, + face=face, + ) + + +class _MeshConnectivityManagerBase(ABC): + # Override these in subclasses. 
+ REQUIRED: tuple = NotImplemented + OPTIONAL: tuple = NotImplemented + + def __init__(self, *connectivities): + cf_roles = [c.cf_role for c in connectivities] + for requisite in self.REQUIRED: + if requisite not in cf_roles: + message = ( + f"{self.__name__} requires a {requisite} Connectivity." + ) + raise ValueError(message) + + self.ALL = self.REQUIRED + self.OPTIONAL + self._members = {member: None for member in self.ALL} + self.add(*connectivities) + + def __eq__(self, other): + # TBD + return NotImplemented + + def __getstate__(self): + return self._members + + def __iter__(self): + for item in self._members.items(): + yield item + + def __ne__(self, other): + # TBD + return NotImplemented + + def __repr__(self): + args = [ + f"{member}={connectivity!r}" + for member, connectivity in self + if connectivity is not None + ] + return f"{self.__class__.__name__}({', '.join(args)})" + + def __setstate__(self, state): + self._members = state + + def __str__(self): + args = [ + f"{member}=True" + for member, connectivity in self + if connectivity is not None + ] + return f"{self.__class__.__name__}({', '.join(args)})" + + @property + @abstractmethod + def all_members(self): + return NotImplemented + + def add(self, *connectivities): + # Since Connectivity classes include their cf_role, no setters will be + # provided, just a means to add one or more connectivities to the + # manager. + # No warning is raised for duplicate cf_roles - user is trusted to + # validate their outputs. + add_dict = {} + for connectivity in connectivities: + if not isinstance(connectivity, Connectivity): + message = f"Expected Connectivity, got: {type(connectivity)} ." + raise ValueError(message) + cf_role = connectivity.cf_role + if cf_role not in self.ALL: + message = ( + f"Not adding connectivity ({cf_role}: " + f"{connectivity!r}) - cf_role must be one of: {self.ALL} ." + ) + logger.debug(message, extra=dict(cls=self.__class__.__name__)) + else: + add_dict[cf_role] = connectivity + + # Validate shapes. + proposed_members = {**self._members, **add_dict} + locations = set( + [ + c.src_location + for c in proposed_members.values() + if c is not None + ] + ) + for location in locations: + counts = [ + len(c.indices_by_src(c.lazy_indices())) + for c in proposed_members.values() + if c is not None and c.src_location == location + ] + # Check is list values are identical. + if not counts.count(counts[0]) == len(counts): + message = ( + f"Invalid Connectivities provided - inconsistent " + f"{location} counts." + ) + raise ValueError(message) + + self._members = proposed_members + + def filter(self, **kwargs): + # TODO: rationalise commonality with MeshCoordManager.filter and Cube.coord. + result = self.filters(**kwargs) + if len(result) > 1: + names = ", ".join( + f"{member}={connectivity!r}" + for member, connectivity in result.items() + ) + message = ( + f"Expected to find exactly 1 connectivity, but found " + f"{len(result)}. They were: {names}." + ) + raise ConnectivityNotFoundError(message) + elif len(result) == 0: + item = kwargs["item"] + _name = item + if item is not None: + if not isinstance(item, str): + _name = item.name() + bad_name = ( + _name or kwargs["standard_name"] or kwargs["long_name"] or "" + ) + message = ( + f"Expected to find exactly 1 {bad_name} connectivity, " + f"but found none." 
+ ) + raise ConnectivityNotFoundError(message) + + return result + + def filters( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + cf_role=None, + node=None, + edge=None, + face=None, + ): + members = [c for c in self._members.values() if c is not None] + + if cf_role is not None: + members = [ + instance for instance in members if instance.cf_role == cf_role + ] + + def location_filter(instances, loc_arg, loc_name): + if loc_arg is False: + filtered = [ + instance + for instance in instances + if loc_name + not in (instance.src_location, instance.tgt_location) + ] + elif loc_arg is None: + filtered = instances + else: + # Interpret any other value as =True. + filtered = [ + instance + for instance in instances + if loc_name + in (instance.src_location, instance.tgt_location) + ] + + return filtered + + for arg, loc in ( + (node, "node"), + (edge, "edge"), + (face, "face"), + ): + members = location_filter(members, arg, loc) + + # No need to actually modify filtering behaviour - already won't return + # any face cf-roles if none are present. + supports_faces = any(["face" in role for role in self.ALL]) + if face and not supports_faces: + message = ( + "Ignoring request to filter for non-existent 'face' cf-roles." + ) + logger.debug(message, extra=dict(cls=self.__class__.__name__)) + + result = metadata_filter( + members, + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + ) + + # Use the results to filter the _members dict for returning. + result_ids = [id(r) for r in result] + result_dict = { + k: v for k, v in self._members.items() if id(v) in result_ids + } + return result_dict + + def remove( + self, + item=None, + standard_name=None, + long_name=None, + var_name=None, + attributes=None, + cf_role=None, + node=None, + edge=None, + face=None, + ): + removal_dict = self.filters( + item=item, + standard_name=standard_name, + long_name=long_name, + var_name=var_name, + attributes=attributes, + cf_role=cf_role, + node=node, + edge=edge, + face=face, + ) + for cf_role in self.REQUIRED: + excluded = removal_dict.pop(cf_role, None) + if excluded: + message = ( + f"Ignoring request to remove required connectivity " + f"({cf_role}: {excluded!r})" + ) + logger.debug(message, extra=dict(cls=self.__class__.__name__)) + + for cf_role in removal_dict.keys(): + self._members[cf_role] = None + + return removal_dict + + +class _Mesh1DConnectivityManager(_MeshConnectivityManagerBase): + REQUIRED = ("edge_node_connectivity",) + OPTIONAL = () + + @property + def all_members(self): + return Mesh1DConnectivities(edge_node=self.edge_node) + + @property + def edge_node(self): + return self._members["edge_node_connectivity"] + + +class _Mesh2DConnectivityManager(_MeshConnectivityManagerBase): + REQUIRED = ("face_node_connectivity",) + OPTIONAL = ( + "edge_node_connectivity", + "face_edge_connectivity", + "face_face_connectivity", + "edge_face_connectivity", + "boundary_node_connectivity", + ) + + @property + def all_members(self): + return Mesh2DConnectivities( + face_node=self.face_node, + edge_node=self.edge_node, + face_edge=self.face_edge, + face_face=self.face_face, + edge_face=self.edge_face, + boundary_node=self.boundary_node, + ) + + @property + def boundary_node(self): + return self._members["boundary_node_connectivity"] + + @property + def edge_face(self): + return self._members["edge_face_connectivity"] + + @property + def edge_node(self): + return self._members["edge_node_connectivity"] + 
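    # Usage sketch (editorial; 'mesh' is an assumed Mesh instance, not taken
    # from this change's tests):
    #
    #   # the single entry matching the required face_node cf-role
    #   result = mesh.connectivity(cf_role="face_node_connectivity")
    #   # every connectivity referencing the 'edge' location, as a dict
    #   edge_related = mesh.connectivities(edge=True)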
+ @property + def face_edge(self): + return self._members["face_edge_connectivity"] + + @property + def face_face(self): + return self._members["face_face_connectivity"] + + @property + def face_node(self): + return self._members["face_node_connectivity"] + + #: Convenience collection of lenient metadata combine services. -SERVICES_COMBINE.append(ConnectivityMetadata.combine) -SERVICES.append(ConnectivityMetadata.combine) +_services = [ConnectivityMetadata.combine, MeshMetadata.combine] +SERVICES_COMBINE.extend(_services) +SERVICES.extend(_services) #: Convenience collection of lenient metadata difference services. -SERVICES_DIFFERENCE.append(ConnectivityMetadata.difference) -SERVICES.append(ConnectivityMetadata.difference) +_services = [ConnectivityMetadata.difference, MeshMetadata.difference] +SERVICES_DIFFERENCE.extend(_services) +SERVICES.extend(_services) #: Convenience collection of lenient metadata equality services. -SERVICES_EQUAL.extend( - [ConnectivityMetadata.__eq__, ConnectivityMetadata.equal] -) -SERVICES.extend([ConnectivityMetadata.__eq__, ConnectivityMetadata.equal]) +_services = [ + ConnectivityMetadata.__eq__, + ConnectivityMetadata.equal, + MeshMetadata.__eq__, + MeshMetadata.equal, +] +SERVICES_EQUAL.extend(_services) +SERVICES.extend(_services) + +del _services diff --git a/lib/iris/tests/unit/common/metadata/test_metadata_filter.py b/lib/iris/tests/unit/common/metadata/test_metadata_filter.py new file mode 100644 index 0000000000..b5dad2864c --- /dev/null +++ b/lib/iris/tests/unit/common/metadata/test_metadata_filter.py @@ -0,0 +1,136 @@ +# Copyright Iris contributors +# +# This file is part of Iris and is released under the LGPL license. +# See COPYING and COPYING.LESSER in the root of the repository for full +# licensing details. +""" +Unit tests for the :func:`iris.common.metadata_filter`. + +""" + +# Import iris.tests first so that some things can be initialised before +# importing anything else. 
+import iris.tests as tests + +import numpy as np + +from iris.common.metadata import ( + CoordMetadata, + DimCoordMetadata, + metadata_filter, +) +from iris.coords import AuxCoord + +Mock = tests.mock.Mock + + +class Test_standard(tests.IrisTest): + def test_instances_non_iterable(self): + item = Mock() + item.name.return_value = "one" + result = metadata_filter(item, item="one") + self.assertEqual(1, len(result)) + self.assertIn(item, result) + + def test_name(self): + name_one = Mock() + name_one.name.return_value = "one" + name_two = Mock() + name_two.name.return_value = "two" + input_list = [name_one, name_two] + result = metadata_filter(input_list, item="one") + self.assertIn(name_one, result) + self.assertNotIn(name_two, result) + + def test_item(self): + coord = Mock(__class__=AuxCoord) + mock = Mock() + input_list = [coord, mock] + result = metadata_filter(input_list, item=coord) + self.assertIn(coord, result) + self.assertNotIn(mock, result) + + def test_item_metadata(self): + coord = Mock(metadata=CoordMetadata) + dim_coord = Mock(metadata=DimCoordMetadata) + input_list = [coord, dim_coord] + result = metadata_filter(input_list, item=coord) + self.assertIn(coord, result) + self.assertNotIn(dim_coord, result) + + def test_standard_name(self): + name_one = Mock(standard_name="one") + name_two = Mock(standard_name="two") + input_list = [name_one, name_two] + result = metadata_filter(input_list, standard_name="one") + self.assertIn(name_one, result) + self.assertNotIn(name_two, result) + + def test_long_name(self): + name_one = Mock(long_name="one") + name_two = Mock(long_name="two") + input_list = [name_one, name_two] + result = metadata_filter(input_list, long_name="one") + self.assertIn(name_one, result) + self.assertNotIn(name_two, result) + + def test_var_name(self): + name_one = Mock(var_name="one") + name_two = Mock(var_name="two") + input_list = [name_one, name_two] + result = metadata_filter(input_list, var_name="one") + self.assertIn(name_one, result) + self.assertNotIn(name_two, result) + + def test_attributes(self): + # Confirm that this can handle attrib dicts including np arrays. 
+ attrib_one_two = Mock( + attributes={"one": np.arange(1), "two": np.arange(2)} + ) + attrib_three_four = Mock( + attributes={"three": np.arange(3), "four": np.arange(4)} + ) + input_list = [attrib_one_two, attrib_three_four] + result = metadata_filter( + input_list, attributes=attrib_one_two.attributes + ) + self.assertIn(attrib_one_two, result) + self.assertNotIn(attrib_three_four, result) + + def test_invalid_attributes(self): + attrib_one = Mock(attributes={"one": 1}) + input_list = [attrib_one] + self.assertRaisesRegex( + ValueError, + ".*expecting a dictionary.*", + metadata_filter, + input_list, + attributes="one", + ) + + def test_axis__by_guess(self): + # see https://docs.python.org/3/library/unittest.mock.html#deleting-attributes + axis_lon = Mock(standard_name="longitude") + del axis_lon.axis + axis_lat = Mock(standard_name="latitude") + del axis_lat.axis + input_list = [axis_lon, axis_lat] + result = metadata_filter(input_list, axis="x") + self.assertIn(axis_lon, result) + self.assertNotIn(axis_lat, result) + + def test_axis__by_member(self): + axis_x = Mock(axis="x") + axis_y = Mock(axis="y") + input_list = [axis_x, axis_y] + result = metadata_filter(input_list, axis="x") + self.assertEqual(1, len(result)) + self.assertIn(axis_x, result) + + def test_multiple_args(self): + coord_one = Mock(__class__=AuxCoord, long_name="one") + coord_two = Mock(__class__=AuxCoord, long_name="two") + input_list = [coord_one, coord_two] + result = metadata_filter(input_list, item=coord_one, long_name="one") + self.assertIn(coord_one, result) + self.assertNotIn(coord_two, result) diff --git a/lib/iris/tests/unit/experimental/ugrid/test_MeshMetadata.py b/lib/iris/tests/unit/experimental/ugrid/test_MeshMetadata.py new file mode 100644 index 0000000000..105365c908 --- /dev/null +++ b/lib/iris/tests/unit/experimental/ugrid/test_MeshMetadata.py @@ -0,0 +1,784 @@ +# Copyright Iris contributors +# +# This file is part of Iris and is released under the LGPL license. +# See COPYING and COPYING.LESSER in the root of the repository for full +# licensing details. +""" +Unit tests for the :class:`iris.experimental.ugrid.MeshMetadata`. + +""" + +# Import iris.tests first so that some things can be initialised before +# importing anything else. 
+import iris.tests as tests + +from copy import deepcopy +import unittest.mock as mock +from unittest.mock import sentinel + +from iris.common.lenient import _LENIENT, _qualname +from iris.common.metadata import BaseMetadata +from iris.experimental.ugrid import MeshMetadata + + +class Test(tests.IrisTest): + def setUp(self): + self.standard_name = mock.sentinel.standard_name + self.long_name = mock.sentinel.long_name + self.var_name = mock.sentinel.var_name + self.units = mock.sentinel.units + self.attributes = mock.sentinel.attributes + self.topology_dimension = mock.sentinel.topology_dimension + self.node_dimension = mock.sentinel.node_dimension + self.edge_dimension = mock.sentinel.edge_dimension + self.face_dimension = mock.sentinel.face_dimension + self.cls = MeshMetadata + + def test_repr(self): + metadata = self.cls( + standard_name=self.standard_name, + long_name=self.long_name, + var_name=self.var_name, + units=self.units, + attributes=self.attributes, + topology_dimension=self.topology_dimension, + node_dimension=self.node_dimension, + edge_dimension=self.edge_dimension, + face_dimension=self.face_dimension, + ) + fmt = ( + "MeshMetadata(standard_name={!r}, long_name={!r}, " + "var_name={!r}, units={!r}, attributes={!r}, " + "topology_dimension={!r}, node_dimension={!r}, " + "edge_dimension={!r}, face_dimension={!r})" + ) + expected = fmt.format( + self.standard_name, + self.long_name, + self.var_name, + self.units, + self.attributes, + self.topology_dimension, + self.node_dimension, + self.edge_dimension, + self.face_dimension, + ) + self.assertEqual(expected, repr(metadata)) + + def test__fields(self): + expected = ( + "standard_name", + "long_name", + "var_name", + "units", + "attributes", + "topology_dimension", + "node_dimension", + "edge_dimension", + "face_dimension", + ) + self.assertEqual(self.cls._fields, expected) + + def test_bases(self): + self.assertTrue(issubclass(self.cls, BaseMetadata)) + + +class Test__eq__(tests.IrisTest): + def setUp(self): + self.values = dict( + standard_name=sentinel.standard_name, + long_name=sentinel.long_name, + var_name=sentinel.var_name, + units=sentinel.units, + attributes=sentinel.attributes, + topology_dimension=sentinel.topology_dimension, + node_dimension=sentinel.node_dimension, + edge_dimension=sentinel.edge_dimension, + face_dimension=sentinel.face_dimension, + ) + self.dummy = sentinel.dummy + self.cls = MeshMetadata + # The "node_dimension", "edge_dimension" and "face_dimension" members + # are stateful only; they do not participate in lenient/strict equivalence. 
+ self.members_dim_names = filter( + lambda member: member + in ("node_dimension", "edge_dimension", "face_dimension"), + self.cls._members, + ) + + def test_wraps_docstring(self): + self.assertEqual(BaseMetadata.__eq__.__doc__, self.cls.__eq__.__doc__) + + def test_lenient_service(self): + qualname___eq__ = _qualname(self.cls.__eq__) + self.assertIn(qualname___eq__, _LENIENT) + self.assertTrue(_LENIENT[qualname___eq__]) + self.assertTrue(_LENIENT[self.cls.__eq__]) + + def test_call(self): + other = sentinel.other + return_value = sentinel.return_value + metadata = self.cls(*(None,) * len(self.cls._fields)) + with mock.patch.object( + BaseMetadata, "__eq__", return_value=return_value + ) as mocker: + result = metadata.__eq__(other) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + self.assertEqual(dict(), kwargs) + + def test_op_lenient_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_same_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["var_name"] = None + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_same_topology_dim_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["topology_dimension"] = None + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_same_dim_names_none(self): + for member in self.members_dim_names: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_different(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["units"] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_different_topology_dim(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["topology_dimension"] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_lenient_different_dim_names(self): + for member in self.members_dim_names: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_strict_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + + with mock.patch("iris.common.metadata._LENIENT", 
return_value=False): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["long_name"] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different_topology_dim(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["topology_dimension"] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different_dim_names(self): + for member in self.members_dim_names: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["long_name"] = None + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different_topology_dim_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["topology_dimension"] = None + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertFalse(lmetadata.__eq__(rmetadata)) + self.assertFalse(rmetadata.__eq__(lmetadata)) + + def test_op_strict_different_dim_names_none(self): + for member in self.members_dim_names: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertTrue(lmetadata.__eq__(rmetadata)) + self.assertTrue(rmetadata.__eq__(lmetadata)) + + +class Test___lt__(tests.IrisTest): + def setUp(self): + self.cls = MeshMetadata + self.one = self.cls(1, 1, 1, 1, 1, 1, 1, 1, 1) + self.two = self.cls(1, 1, 1, 2, 1, 1, 1, 1, 1) + self.none = self.cls(1, 1, 1, None, 1, 1, 1, 1, 1) + self.attributes = self.cls(1, 1, 1, 1, 10, 1, 1, 1, 1) + + def test__ascending_lt(self): + result = self.one < self.two + self.assertTrue(result) + + def test__descending_lt(self): + result = self.two < self.one + self.assertFalse(result) + + def test__none_rhs_operand(self): + result = self.one < self.none + self.assertFalse(result) + + def test__none_lhs_operand(self): + result = self.none < self.one + self.assertTrue(result) + + def test__ignore_attributes(self): + result = self.one < self.attributes + self.assertFalse(result) + result = self.attributes < self.one + self.assertFalse(result) + + +class Test_combine(tests.IrisTest): + def setUp(self): + self.values = dict( + standard_name=sentinel.standard_name, + long_name=sentinel.long_name, + var_name=sentinel.var_name, + units=sentinel.units, + attributes=sentinel.attributes, + topology_dimension=sentinel.topology_dimension, + node_dimension=sentinel.node_dimension, + edge_dimension=sentinel.edge_dimension, + 
face_dimension=sentinel.face_dimension, + ) + self.dummy = sentinel.dummy + self.cls = MeshMetadata + self.none = self.cls(*(None,) * len(self.cls._fields)) + + def test_wraps_docstring(self): + self.assertEqual( + BaseMetadata.combine.__doc__, self.cls.combine.__doc__ + ) + + def test_lenient_service(self): + qualname_combine = _qualname(self.cls.combine) + self.assertIn(qualname_combine, _LENIENT) + self.assertTrue(_LENIENT[qualname_combine]) + self.assertTrue(_LENIENT[self.cls.combine]) + + def test_lenient_default(self): + other = sentinel.other + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "combine", return_value=return_value + ) as mocker: + result = self.none.combine(other) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + self.assertEqual(dict(lenient=None), kwargs) + + def test_lenient(self): + other = sentinel.other + lenient = sentinel.lenient + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "combine", return_value=return_value + ) as mocker: + result = self.none.combine(other, lenient=lenient) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + self.assertEqual(dict(lenient=lenient), kwargs) + + def test_op_lenient_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + expected = self.values + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_lenient_same_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["var_name"] = None + rmetadata = self.cls(**right) + expected = self.values + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_lenient_same_members_none(self): + for member in self.cls._members: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + expected = right.copy() + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertEqual( + expected, lmetadata.combine(rmetadata)._asdict() + ) + self.assertEqual( + expected, rmetadata.combine(lmetadata)._asdict() + ) + + def test_op_lenient_different(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["units"] = self.dummy + rmetadata = self.cls(**right) + expected = self.values.copy() + expected["units"] = None + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_lenient_different_members(self): + for member in self.cls._members: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + expected = self.values.copy() + expected[member] = None + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertEqual( + expected, lmetadata.combine(rmetadata)._asdict() + ) + self.assertEqual( + expected, rmetadata.combine(lmetadata)._asdict() + ) + + def 
test_op_strict_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + expected = self.values.copy() + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_strict_different(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["long_name"] = self.dummy + rmetadata = self.cls(**right) + expected = self.values.copy() + expected["long_name"] = None + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_strict_different_members(self): + for member in self.cls._members: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + expected = self.values.copy() + expected[member] = None + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertEqual( + expected, lmetadata.combine(rmetadata)._asdict() + ) + self.assertEqual( + expected, rmetadata.combine(lmetadata)._asdict() + ) + + def test_op_strict_different_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["long_name"] = None + rmetadata = self.cls(**right) + expected = self.values.copy() + expected["long_name"] = None + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertEqual(expected, lmetadata.combine(rmetadata)._asdict()) + self.assertEqual(expected, rmetadata.combine(lmetadata)._asdict()) + + def test_op_strict_different_members_none(self): + for member in self.cls._members: + lmetadata = self.cls(**self.values) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + expected = self.values.copy() + expected[member] = None + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertEqual( + expected, lmetadata.combine(rmetadata)._asdict() + ) + self.assertEqual( + expected, rmetadata.combine(lmetadata)._asdict() + ) + + +class Test_difference(tests.IrisTest): + def setUp(self): + self.values = dict( + standard_name=sentinel.standard_name, + long_name=sentinel.long_name, + var_name=sentinel.var_name, + units=sentinel.units, + attributes=sentinel.attributes, + topology_dimension=sentinel.topology_dimension, + node_dimension=sentinel.node_dimension, + edge_dimension=sentinel.edge_dimension, + face_dimension=sentinel.face_dimension, + ) + self.dummy = sentinel.dummy + self.cls = MeshMetadata + self.none = self.cls(*(None,) * len(self.cls._fields)) + + def test_wraps_docstring(self): + self.assertEqual( + BaseMetadata.difference.__doc__, self.cls.difference.__doc__ + ) + + def test_lenient_service(self): + qualname_difference = _qualname(self.cls.difference) + self.assertIn(qualname_difference, _LENIENT) + self.assertTrue(_LENIENT[qualname_difference]) + self.assertTrue(_LENIENT[self.cls.difference]) + + def test_lenient_default(self): + other = sentinel.other + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "difference", return_value=return_value + ) as mocker: + result = self.none.difference(other) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + 
self.assertEqual(dict(lenient=None), kwargs) + + def test_lenient(self): + other = sentinel.other + lenient = sentinel.lenient + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "difference", return_value=return_value + ) as mocker: + result = self.none.difference(other, lenient=lenient) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + self.assertEqual(dict(lenient=lenient), kwargs) + + def test_op_lenient_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertIsNone(lmetadata.difference(rmetadata)) + self.assertIsNone(rmetadata.difference(lmetadata)) + + def test_op_lenient_same_none(self): + lmetadata = self.cls(**self.values) + right = self.values.copy() + right["var_name"] = None + rmetadata = self.cls(**right) + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertIsNone(lmetadata.difference(rmetadata)) + self.assertIsNone(rmetadata.difference(lmetadata)) + + def test_op_lenient_same_members_none(self): + for member in self.cls._members: + lmetadata = self.cls(**self.values) + member_value = getattr(lmetadata, member) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected[member] = (member_value, None) + rexpected = deepcopy(self.none)._asdict() + rexpected[member] = (None, member_value) + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_lenient_different(self): + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right["units"] = self.dummy + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected["units"] = (left["units"], right["units"]) + rexpected = deepcopy(self.none)._asdict() + rexpected["units"] = lexpected["units"][::-1] + + with mock.patch("iris.common.metadata._LENIENT", return_value=True): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_lenient_different_members(self): + for member in self.cls._members: + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected[member] = (left[member], right[member]) + rexpected = deepcopy(self.none)._asdict() + rexpected[member] = lexpected[member][::-1] + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=True + ): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_strict_same(self): + lmetadata = self.cls(**self.values) + rmetadata = self.cls(**self.values) + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertIsNone(lmetadata.difference(rmetadata)) + self.assertIsNone(rmetadata.difference(lmetadata)) + + def test_op_strict_different(self): + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right["long_name"] = self.dummy + rmetadata = 
self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected["long_name"] = (left["long_name"], right["long_name"]) + rexpected = deepcopy(self.none)._asdict() + rexpected["long_name"] = lexpected["long_name"][::-1] + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_strict_different_members(self): + for member in self.cls._members: + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right[member] = self.dummy + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected[member] = (left[member], right[member]) + rexpected = deepcopy(self.none)._asdict() + rexpected[member] = lexpected[member][::-1] + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_strict_different_none(self): + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right["long_name"] = None + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected["long_name"] = (left["long_name"], right["long_name"]) + rexpected = deepcopy(self.none)._asdict() + rexpected["long_name"] = lexpected["long_name"][::-1] + + with mock.patch("iris.common.metadata._LENIENT", return_value=False): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + def test_op_strict_different_members_none(self): + for member in self.cls._members: + left = self.values.copy() + lmetadata = self.cls(**left) + right = self.values.copy() + right[member] = None + rmetadata = self.cls(**right) + lexpected = deepcopy(self.none)._asdict() + lexpected[member] = (left[member], right[member]) + rexpected = deepcopy(self.none)._asdict() + rexpected[member] = lexpected[member][::-1] + + with mock.patch( + "iris.common.metadata._LENIENT", return_value=False + ): + self.assertEqual( + lexpected, lmetadata.difference(rmetadata)._asdict() + ) + self.assertEqual( + rexpected, rmetadata.difference(lmetadata)._asdict() + ) + + +class Test_equal(tests.IrisTest): + def setUp(self): + self.cls = MeshMetadata + self.none = self.cls(*(None,) * len(self.cls._fields)) + + def test_wraps_docstring(self): + self.assertEqual(BaseMetadata.equal.__doc__, self.cls.equal.__doc__) + + def test_lenient_service(self): + qualname_equal = _qualname(self.cls.equal) + self.assertIn(qualname_equal, _LENIENT) + self.assertTrue(_LENIENT[qualname_equal]) + self.assertTrue(_LENIENT[self.cls.equal]) + + def test_lenient_default(self): + other = sentinel.other + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "equal", return_value=return_value + ) as mocker: + result = self.none.equal(other) + + self.assertEqual(return_value, result) + self.assertEqual(1, mocker.call_count) + (arg,), kwargs = mocker.call_args + self.assertEqual(other, arg) + self.assertEqual(dict(lenient=None), kwargs) + + def test_lenient(self): + other = sentinel.other + lenient = sentinel.lenient + return_value = sentinel.return_value + with mock.patch.object( + BaseMetadata, "equal", return_value=return_value + ) as mocker: + result = self.none.equal(other, lenient=lenient) 
+
+        self.assertEqual(return_value, result)
+        self.assertEqual(1, mocker.call_count)
+        (arg,), kwargs = mocker.call_args
+        self.assertEqual(other, arg)
+        self.assertEqual(dict(lenient=lenient), kwargs)
+
+
+if __name__ == "__main__":
+    tests.main()