Merge pull request #43 from pastas/dev
Dev
dbrakenhoff authored Sep 3, 2021
2 parents a1fba2b + 6163d1c commit 13bd833
Showing 20 changed files with 1,832 additions and 217 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.7]
python-version: [3.7, 3.8]
services:
mongodb:
image: mongo:latest
21 changes: 9 additions & 12 deletions docs/connectors.rst
@@ -46,7 +46,7 @@ Arctic
The :ref:`ArcticConnector` is an object that creates a
connection with a MongoDB database. This can be an existing or a new database.
For each of the datasets a collection or library is created. These are named
using the following convention: `<database name>.<collection name>`.
using the following convention: `<database name>.<library name>`.

The Arctic implementation uses the following structure:

Expand All @@ -61,14 +61,12 @@ are stored as pandas.DataFrames. Models are stored in JSON (actually binary
JSON) and *do not* contain the timeseries themselves. These are picked up from
the other libraries when the model is loaded from the database.

The `ArcticConnector` object allows the user to add different versions for
datasets, which can be used to keep a history of older models for example.
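
For illustration only, connecting to a local MongoDB instance might look as
follows (the database name and connection string are placeholders, and the
exact call signature may differ between pastastore versions)::

    import pastastore as pst

    # connector backed by a MongoDB instance running locally;
    # "my_db" is the name of the new or existing database
    conn = pst.ArcticConnector("my_db", "mongodb://localhost:27017/")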

Pystore
-------
The :ref:`PystoreConnector` is an object that links
to a location on disk. This can either be an existing or a new Pystore. A new
store is created with collections that hold the different datasets:
store is created with collections (or libraries) that hold the different
datasets:

* observation timeseries
* stresses timeseries
@@ -79,7 +77,7 @@ The Pystores have the following structure:
.. code-block:: bash
+-- store
| +-- collections... (i.e. oseries, stresses, models)
| +-- collections or libraries... (i.e. oseries, stresses, models)
| | +-- items... (i.e. individual timeseries or models)
@@ -91,28 +89,27 @@ design allows the models to be saved in a PyStore. The timeseries are picked
up from their respective stores when the model is loaded from disk.

PyStore supports so-called snapshots (which store the current state of the
store) but this has not been actively implemented in this module. Pystore does
not have the same versioning capabilities as Arctic.
store) but this has not been actively implemented in this module.
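
For illustration only, creating a connector that stores its data in a PyStore
on disk might look as follows (the store name and path are placeholders, and
the exact call signature may differ between pastastore versions)::

    import pastastore as pst

    # connector backed by a PyStore located at the given path on disk
    conn = pst.PystoreConnector("my_db", "./pystore")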

Custom Connectors
-----------------
It should be relatively straightforward to write your own custom connector
object. The :ref:`Base` submodule contains the
`BaseConnector` class that defines which methods and properties *must*
be defined. The `ConnectorUtil` mix-in class contains some general methods that
are used by each connector. Each Connector object should inherit from these
are used by each connector. Each Connector object should inherit from these two
classes.

The `BaseConnector` class also shows the expected call signature for each
method. Following the same call signature should ensure that your new connector
works directly with `PastaStore`. Though extra keyword arguments can be
works directly with `PastaStore`. Extra keyword arguments can be
added in the custom class.

Below is a small snippet showing a custom Connector class::

class MyCustomConnector(BaseConnector, ConnectorUtil):
"""Must override each method and property in BaseConnector, e.g."""

def get_oseries(self, name, progressbar=False):
# your code to get oseries from database here
def _get_item(self, name, progressbar=False):
# your code here for getting an item from your database
pass
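
As a slightly fuller, purely illustrative sketch (the `__init__` and the
in-memory dict shown here are assumptions, the import path is assumed to be
`pastastore.base`, and the real `BaseConnector` defines the complete set of
methods and properties that must be overridden)::

    from pastastore.base import BaseConnector, ConnectorUtil

    class InMemoryConnector(BaseConnector, ConnectorUtil):
        """Illustrative connector that keeps its items in a plain dict."""

        def __init__(self, name):
            self.name = name
            self._items = {}  # item name -> stored object

        def _get_item(self, name, progressbar=False):
            # a real connector would query its backend here
            return self._items[name]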
3 changes: 3 additions & 0 deletions docs/examples/003_pastastore_plots_and_maps.nblink
@@ -0,0 +1,3 @@
{
"path": "../../examples/notebooks/ex03_pastastore_plots_and_maps.ipynb"
}
22 changes: 22 additions & 0 deletions docs/modules.rst
@@ -21,13 +21,15 @@ DictConnector

.. autoclass:: pastastore.DictConnector
:members:
:private-members:
:show-inheritance:

PasConnector
^^^^^^^^^^^^

.. autoclass:: pastastore.PasConnector
:members:
:private-members:
:show-inheritance:

ArcticConnector
@@ -36,6 +38,7 @@ ArcticConnector
.. autoclass:: pastastore.ArcticConnector
:members:
:undoc-members:
:private-members:
:show-inheritance:

PystoreConnector
@@ -44,6 +47,7 @@ PystoreConnector
.. autoclass:: pastastore.PystoreConnector
:members:
:undoc-members:
:private-members:
:show-inheritance:

.. _Pastastore_API:
@@ -54,6 +58,24 @@ PastaStore
.. automodule:: pastastore.store
:members:


Plots
-----

.. autoclass:: pastastore.plotting.Plots
:members:
:undoc-members:
:private-members:

Maps
----

.. autoclass:: pastastore.plotting.Maps
:members:
:undoc-members:
:private-members:


Util
----

2 changes: 1 addition & 1 deletion docs/pstore.rst
@@ -3,7 +3,7 @@ PastaStore object
=================

The `PastaStore` object is essentially a class for working with timeseries and
pastas Models. A connector has to be passed to the object which manages the
pastas Models. A Connector has to be passed to the object which manages the
retrieval and storage of data.
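
For illustration only, a Connector is created first and then passed to the
`PastaStore` (the names below are placeholders and the argument order may
differ between pastastore versions)::

    import pastastore as pst

    # an in-memory connector; any other Connector type works the same way
    conn = pst.DictConnector("my_db")

    # the PastaStore uses the connector for all reading and writing of data
    store = pst.PastaStore("my_project", conn)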

Methods are available for the following tasks:
6 changes: 3 additions & 3 deletions docs/utils.rst
@@ -3,16 +3,16 @@ Utilities
=========

The `pastastore.util` submodule contains useful functions, i.e. for deleting
databases, connector objects, and PastaStore objects or emptying a library of
all its contents:
databases, connector objects, and PastaStore objects, emptying a library of
all its contents or copying all data to a new database:


* :meth:`pastastore.util.delete_pastastore`
* :meth:`pastastore.util.delete_dict_connector`
* :meth:`pastastore.util.delete_pas_connector`
* :meth:`pastastore.util.delete_pystore_connector`
* :meth:`pastastore.util.delete_arctic_connector`
* :meth:`pastastore.util.empty_library`
* :meth:`pastastore.util.copy_database`


It also contains a method for making a detailed comparison between two
30 changes: 18 additions & 12 deletions examples/notebooks/ex01_intro+guide_for_pastas-Project_users.ipynb
@@ -49,14 +49,13 @@
"metadata": {},
"outputs": [],
"source": [
"import pastastore as pst\n",
"import os\n",
"import pandas as pd\n",
"import pastas as ps\n",
"\n",
"import sys\n",
"sys.path.insert(1, \"../..\")\n",
"\n",
"import pastastore as pst"
"sys.path.insert(1, \"../..\")"
]
},
{
@@ -221,7 +220,8 @@
],
"source": [
"datadir = \"../../tests/data/\" # relative path to data directory\n",
"oseries1 = pd.read_csv(os.path.join(datadir, \"head_nb1.csv\"), index_col=0, parse_dates=True)\n",
"oseries1 = pd.read_csv(os.path.join(\n",
" datadir, \"head_nb1.csv\"), index_col=0, parse_dates=True)\n",
"ometa = {\"x\": 100300, \"y\": 400400}\n",
"oseries1.head()"
]
@@ -275,13 +275,15 @@
"metadata": {},
"outputs": [],
"source": [
"# prec \n",
"p = pd.read_csv(os.path.join(datadir, \"rain_nb1.csv\"), index_col=0, parse_dates=True)\n",
"# prec\n",
"p = pd.read_csv(os.path.join(datadir, \"rain_nb1.csv\"),\n",
" index_col=0, parse_dates=True)\n",
"p.columns = ['value']\n",
"pmeta = {\"x\": 100300, \"y\": 400400}\n",
"\n",
"# evap \n",
"e = pd.read_csv(os.path.join(datadir, \"evap_nb1.csv\"), index_col=0, parse_dates=True)\n",
"# evap\n",
"e = pd.read_csv(os.path.join(datadir, \"evap_nb1.csv\"),\n",
" index_col=0, parse_dates=True)\n",
"e.columns = [\"value\"]\n",
"emeta = {\"x\": 100300, \"y\": 400400}"
]
@@ -1006,16 +1008,19 @@
"outputs": [],
"source": [
"# oseries 2\n",
"o2 = pd.read_csv(os.path.join(datadir, \"obs.csv\"), index_col=0, parse_dates=True)\n",
"o2 = pd.read_csv(os.path.join(datadir, \"obs.csv\"),\n",
" index_col=0, parse_dates=True)\n",
"o2.index.name = \"oseries2\"\n",
"ometa2 = {\"x\": 100000, \"y\": 400000}\n",
"\n",
"# prec 2\n",
"p2 = pd.read_csv(os.path.join(datadir, \"rain.csv\"), index_col=0, parse_dates=True)\n",
"p2 = pd.read_csv(os.path.join(datadir, \"rain.csv\"),\n",
" index_col=0, parse_dates=True)\n",
"pmeta2 = {\"x\": 100000, \"y\": 400000}\n",
"\n",
"# evap 2\n",
"e2 = pd.read_csv(os.path.join(datadir, \"evap.csv\"), index_col=0, parse_dates=True)\n",
"e2 = pd.read_csv(os.path.join(datadir, \"evap.csv\"),\n",
" index_col=0, parse_dates=True)\n",
"emeta2 = {\"x\": 100000, \"y\": 400000}"
]
},
@@ -1353,7 +1358,8 @@
],
"source": [
"# pastas.Project\n",
"prj.get_parameters([\"recharge_A\", \"recharge_a\", \"recharge_n\", \"recharge_f\", \"constant_d\", \"noise_alpha\"])"
"prj.get_parameters([\"recharge_A\", \"recharge_a\", \"recharge_n\",\n",
" \"recharge_f\", \"constant_d\", \"noise_alpha\"])"
]
},
{