
Commit

deploy: 3b7f4ce
austinwarner-8451 committed May 24, 2024
0 parents commit e418669
Showing 60 changed files with 7,068 additions and 0 deletions.
4 changes: 4 additions & 0 deletions .buildinfo
@@ -0,0 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 7acb99269bfb29a6f862db5dc2907fcf
tags: 645f666f9bcd5a90fca523b33c5a78b7
Binary file added .doctrees/api.doctree
Binary file not shown.
Binary file added .doctrees/changelog.doctree
Binary file not shown.
Binary file added .doctrees/contributing.doctree
Binary file not shown.
Binary file added .doctrees/environment.pickle
Binary file not shown.
Binary file added .doctrees/index.doctree
Binary file not shown.
Binary file added .doctrees/installation.doctree
Binary file not shown.
Binary file added .doctrees/usage.doctree
Binary file not shown.
Empty file added .nojekyll
Empty file.
29 changes: 29 additions & 0 deletions _sources/api.rst.txt
@@ -0,0 +1,29 @@
Functional Pypelines API
========================

Pipeline Class
--------------
.. autoclass:: functional_pypelines.Pipeline
   :members:

.. autoclass:: functional_pypelines.core.PipelineDebugger
   :members:

Validators
----------
.. autoclass:: functional_pypelines.validator.ValidatorPipeline
   :members:

.. autoclass:: functional_pypelines.validator.SUCCESS
.. autofunction:: functional_pypelines.validator.FAILURE

JSON API
--------
.. autofunction:: functional_pypelines.run
.. autofunction:: functional_pypelines.api.core.dry_run

CLI
---
.. click:: functional_pypelines.api.cli:cli_run
   :prog: functional_pypelines
   :nested: full
4 changes: 4 additions & 0 deletions _sources/changelog.md.txt
@@ -0,0 +1,4 @@
# Changelog

## Version 3.2.0
- Release as open source
68 changes: 68 additions & 0 deletions _sources/contributing.md.txt
@@ -0,0 +1,68 @@
# Contributing

If you would like to add new functionality or fix a bug, we welcome contributions. All change requests should start with
an [issue on the repo](https://github.com/8451/functional-pypelines/issues/new/choose). If you would like to develop the
solution yourself, use the following flow:

1. Read the [style guide](#style-guide) below
2. Tag @8451/cobra-owners in the issue and ask to be assigned to it
   - If this is your first time contributing, ask to be granted write access to the repo
3. Create a new branch based off of [develop](https://github.com/8451/functional-pypelines/tree/develop)
   - Give your branch a name starting with `feature/`, `bug/` or `misc/`
4. Clone the repo in your favorite IDE, check out your new branch, and add your changes
5. Run the tests to ensure nothing breaks
   - `pip install -e .[test]`
   - `pytest`
6. Push the changes on your branch to the repo, and open a Pull Request where the base branch is `develop`
   - Request a review from @8451/cobra-owners

## Style Guide

### Pre-Commit
The repo has [pre-commit](https://pre-commit.com/) configured to enforce much (but not all) of the style guide
automatically. When developing locally, please perform the one-time setup using `pip install pre-commit` followed by
`pre-commit install` before making any commits.

### PEP 8
We try to follow [PEP 8](https://peps.python.org/pep-0008/) wherever possible. It is worth familiarizing yourself
with it, though most IDEs will help you follow it automatically. Alternatively, you can use a code formatter like
[black](https://pypi.org/project/black/) to format your code for you.

### Imports
Imports should take place at the top of the file and be broken into three sections, separated by a single empty line.
The sections are:
1. Standard library imports (e.g. `typing`, `re`, `math`)
2. 3rd-party library imports (e.g. `numpy`, `pandas`)
3. Relative imports (importing other modules from the `functional_pypelines` package)

For sections 1-3, prefer using the fully-qualified namespace over an unqualified import (`import json` over
`from json import load`). An exception to this rule is imports from the typing library (`from typing import List`).

When importing another module from `functional_pypelines`, write it like `from . import core`.

Under no circumstances should `from module import *` be used.
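
For example, the import header of a module inside `functional_pypelines` might look like the following (the specific
modules shown are illustrative, not a required set):

```python
# Standard library imports
import json
import math
from typing import Dict, List

# 3rd-party library imports
import numpy
import pandas

# Relative imports (only valid inside the functional_pypelines package)
from . import core
```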

### Typing
All function definitions should be fully type-hinted (arguments and return value). Where applicable, use
generic types from the `typing` library, like `List[str]` rather than a bare `list`. If a function does not return
anything, mark the return type as `None`.
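
As a short sketch of this convention (the functions below are illustrative, not part of the package):

```python
from typing import Dict, List


def count_words(lines: List[str]) -> Dict[str, int]:
    """Count how many times each word appears across all lines."""
    counts: Dict[str, int] = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts


def print_counts(counts: Dict[str, int]) -> None:
    """Print each word with its count; returns nothing, so the hint is None."""
    for word, count in counts.items():
        print(f"{word}: {count}")
```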

### Naming
Prefer long, descriptive function names over short abbreviated names. For example, `load_config` is preferred over
`ld_cfg`.

Function and variable names should be lowercase with underscores separating words (`my_function`). Class names should be
camel case (`MyClass`). Constants should be all uppercase with underscores separating words (`MY_CONSTANT`).

Any function, variable, or constant that is not intended to be used outside of the module it is defined in should be
prefixed with an underscore (`_my_private_function`).
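
Putting those naming rules together, an illustrative (hypothetical) module might contain:

```python
MAX_RETRIES = 3  # constant: uppercase with underscores


class ConfigLoader:  # class: camel case
    pass


def load_config(path: str) -> dict:  # function: lowercase with underscores
    return {"path": _normalize_path(path), "retries": MAX_RETRIES}


def _normalize_path(path: str) -> str:  # module-private helper: leading underscore
    return path.strip()
```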

### Documentation
All public-facing functions should be documented using docstrings in the
[Numpy Style](https://numpydoc.readthedocs.io/en/latest/format.html#docstring-standard).
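
For example, a NumPy-style docstring for an illustrative helper (not part of the package) might look like:

```python
from typing import List


def scale_values(values: List[float], factor: float = 1.0) -> List[float]:
    """Multiply every value in a list by a constant factor.

    Parameters
    ----------
    values : List[float]
        The values to scale.
    factor : float, optional
        The multiplier applied to each value, by default 1.0.

    Returns
    -------
    List[float]
        A new list containing the scaled values.
    """
    return [value * factor for value in values]
```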

### Functional Programming
We strongly prefer Functional Programming over Object-Oriented programming. OOP is used specifically to store
configuration data in an object for later use. Regardless of paradigm, there is a preference for stateless programming
and avoiding in-place data mutation where possible. For example, when combining two lists, prefer `list_a + list_b`
over `list_a.extend(list_b)`.
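
A minimal illustration of that preference:

```python
list_a = [1, 2, 3]
list_b = [4, 5, 6]

# Preferred: build a new list, leaving both inputs untouched
combined = list_a + list_b

# Avoided: mutates list_a in place
# list_a.extend(list_b)
```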
30 changes: 30 additions & 0 deletions _sources/index.rst.txt
@@ -0,0 +1,30 @@
.. confect documentation master file, created by
   sphinx-quickstart on Wed Mar 2 11:50:23 2022.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Functional Pypelines
--------------------

Framework for creating composable functions, including an API for running them via a JSON config. Get started with the
:doc:`Installation Instructions <installation>`.


.. toctree::
   :hidden:
   :maxdepth: 1
   :caption: References

   Installation <installation>
   Usage <usage>
   API <api>
   Changelog <changelog>
   Contributing <contributing>


Indices and tables
==================

* :ref:`genindex`
* :ref:`search`
7 changes: 7 additions & 0 deletions _sources/installation.md.txt
@@ -0,0 +1,7 @@
# Installation

Functional Pypelines is available for installation via pip.

```bash
pip install functional-pypelines
```
150 changes: 150 additions & 0 deletions _sources/usage.md.txt
@@ -0,0 +1,150 @@
# Usage

## Introduction

There are many tasks in programming, especially Data Science, that can be best modeled as a sequence of data
transformations. In such situations, it is ideal to be able to chain a series of functions together, passing the output
of one as the input to the next.

Python has no built-in way to chain functions together like this. For example, here is how we might compose 3 functions
using Python out of the box.

```python
def double(x):
    return 2 * x

def negate(x):
    return -x

def to_string(x):
    return str(x)


# Inline composition
to_string(negate(double(2))) == '-4'


# Define new function
def str_of_neg_dbl(x):
    return to_string(negate(double(x)))

str_of_neg_dbl(2) == '-4'


# Assign output on each call
x = 2
x = double(x)
x = negate(x)
x = to_string(x)

x == '-4'
```

But with Pipelines, we can compose the functions using ``>>`` into a new function that chains the steps together. All
we need to do is decorate our functions with the `Pipeline.step` decorator.

```python
from functional_pypelines import Pipeline


@Pipeline.step
def double(x):
    return 2 * x


@Pipeline.step
def negate(x):
    return -x


@Pipeline.step
def to_string(x):
    return str(x)


# Inline Composition
(double >> negate >> to_string)(2) == '-4'

# Define new function
str_of_neg_dbl = double >> negate >> to_string
str_of_neg_dbl(2) == '-4'

# Can still use the functions like normal
double(2) == 4
to_string(True) == 'True'
```

Using the decorator gives you a lot of flexibility, but you can also use Functional Pypelines with undecorated
functions. You just need to start the chain with a call to `Pipeline()` to kick it off, and wrap the whole thing in
parentheses when passing the input data inline.

```python
from functional_pypelines import Pipeline


def double(x):
    return 2 * x


def negate(x):
    return -x


def to_string(x):
    return str(x)


# Inline Composition
(Pipeline() >> double >> negate >> to_string)(2) == '-4'

# Define new function
str_of_neg_dbl = Pipeline() >> double >> negate >> to_string
str_of_neg_dbl(2) == '-4'
```

## JSON Config API

In addition to letting you write more expressive code, Functional Pypelines also allows you to run a sequence of functions via a
JSON config. For example, if our three functions `double`, `negate`, and `to_string` lived in a `functions.py` file,
we could accomplish the same task using the following config.

```json
{
"PIPELINE": [
"functions.double",
"functions.negate",
"functions.to_string"
],
"DATA": 2
}
```

With this config, we can either run the pipeline from the command line like so:

```bash
functional_pypelines -c conf.json
```

Or, if we have the same config as a Python dictionary, we can run it from Python like so:

```python
import functional_pypelines

config = {
"PIPEILNE": [
"functions.double",
"functions.negate",
"functions.to_string"
],
"DATA": 2
}

functional_pypelines.run(config) == '-4'
```


## Extending Pipeline

While Functional Pypelines is powerful out of the box, it can feel limiting to write only functions that pass a single
value around. For more complex tasks, the `Pipeline` class can be subclassed to customize its behavior. The
`Pipeline.step` decorator can be overridden to support richer functionality, such as passing multiple arguments to a
function.
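
As a rough illustration only: the sketch below assumes a subclass can redefine `step` and delegate to the base
decorator via `super().step(...)` after adapting the wrapped function. Those override points, and the dict-merging
behavior, are assumptions for the sake of the sketch rather than the library's documented API — consult the API
reference for the real extension hooks.

```python
from functional_pypelines import Pipeline


class DictPipeline(Pipeline):
    """Hypothetical subclass whose steps share a dict of named values."""

    @classmethod
    def step(cls, func):
        # Assumption: adapt a function taking keyword arguments into a
        # single-argument function over a dict, then reuse the base decorator.
        # A real implementation would likely also filter the dict down to the
        # parameters the wrapped function actually accepts.
        def adapter(data: dict) -> dict:
            result = func(**data)
            # Merge any dict the step returns back into the running state.
            return {**data, **result} if isinstance(result, dict) else data

        return super().step(adapter)


@DictPipeline.step
def add_total(price: float, quantity: int) -> dict:
    return {"total": price * quantity}
```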