merge master #238

Merged · 15 commits · Mar 26, 2020
Makefile (10 additions, 0 deletions)
@@ -45,6 +45,7 @@ NNI_INSTALL_PATH ?= $(INSTALL_PREFIX)/nni

BIN_FOLDER ?= $(ROOT_FOLDER)/bin
NNI_PKG_FOLDER ?= $(ROOT_FOLDER)/nni
NASUI_PKG_FOLDER ?= $(ROOT_FOLDER)/nni/nasui

## Dependency information
NNI_DEPENDENCY_FOLDER = /tmp/$(USER)
@@ -132,6 +133,8 @@ clean:
-rm -rf src/sdk/pynni/nni_sdk.egg-info
-rm -rf src/webui/build
-rm -rf src/webui/node_modules
-rm -rf src/nasui/build
-rm -rf src/nasui/node_modules

# Main targets end

@@ -193,6 +196,11 @@ install-node-modules:
sed -ie 's/$(NNI_VERSION_TEMPLATE)/$(NNI_VERSION_VALUE)/' $(NNI_PKG_FOLDER)/package.json
$(NNI_YARN) --prod --cwd $(NNI_PKG_FOLDER)
cp -r src/webui/build $(NNI_PKG_FOLDER)/static
# Install nasui
mkdir -p $(NASUI_PKG_FOLDER)
cp -rf src/nasui/build $(NASUI_PKG_FOLDER)
cp src/nasui/server.js $(NASUI_PKG_FOLDER)


.PHONY: dev-install-node-modules
dev-install-node-modules:
@@ -203,6 +211,8 @@ dev-install-node-modules:
sed -ie 's/$(NNI_VERSION_TEMPLATE)/$(NNI_VERSION_VALUE)/' $(NNI_PKG_FOLDER)/package.json
ln -sf ${PWD}/src/nni_manager/node_modules $(NNI_PKG_FOLDER)/node_modules
ln -sf ${PWD}/src/webui/build $(NNI_PKG_FOLDER)/static
ln -sf ${PWD}/src/nasui/build $(NASUI_PKG_FOLDER)/build
ln -sf ${PWD}/src/nasui/server.js $(NASUI_PKG_FOLDER)/server.js

.PHONY: install-scripts
install-scripts:
README.md (19 additions, 15 deletions)
@@ -28,7 +28,8 @@ The tool manages automated machine learning (AutoML) experiments, **dispatches a
### **NNI v1.4 has been released! &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**

## **NNI capabilities in a glance**

NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.

The following table summarizes NNI's current capabilities. We are gradually adding new capabilities, and we'd love to have your contribution.

@@ -105,24 +106,24 @@ Within the following table, we summarized the current NNI capabilities, we are g
<b>Heuristic search</b>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Evolution">Naïve Evolution</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Anneal">Anneal</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Hyperband">Hyperband</a></li>
</ul>
<b>Bayesian optimization</b>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#BOHB">BOHB</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#TPE">TPE</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#SMAC">SMAC</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#MetisTuner">Metis Tuner</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#GPTuner">GP Tuner</a></li>
</ul>
<b>RL Based</b>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#PPOTuner">PPO Tuner</a> </li>
</ul>
</ul>
<a href="docs/en_US/NAS/Overview.md">Neural Architecture Search</a>
<ul>
<ul>
<ul>
<li><a href="docs/en_US/NAS/ENAS.md">ENAS</a></li>
<li><a href="docs/en_US/NAS/DARTS.md">DARTS</a></li>
@@ -131,7 +132,7 @@ Within the following table, we summarized the current NNI capabilities, we are g
<li><a href="docs/en_US/NAS/SPOS.md">SPOS</a></li>
<li><a href="docs/en_US/NAS/Proxylessnas.md">ProxylessNAS</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a> </li>
</ul>
</ul>
</ul>
<a href="docs/en_US/Compressor/Overview.md">Model Compression</a>
<ul>
@@ -155,7 +156,7 @@ Within the following table, we summarized the current NNI capabilities, we are g
<a href="docs/en_US/Assessor/BuiltinAssessor.md">Early Stop Algorithms</a>
<ul>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Medianstop">Median Stop</a></li>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Curvefitting">Curve Fitting</a></li>
</ul>
</td>
<td>
@@ -288,8 +289,9 @@ You can use these commands to get more information about the experiment
</table>

## **Documentation**

* To learn about what NNI is, read the [NNI Overview](https://nni.readthedocs.io/en/latest/Overview.html).
* To get familiar with how to use NNI, read the [documentation](https://nni.readthedocs.io/en/latest/index.html).
* To get started and install NNI on your system, please refer to [Install NNI](https://nni.readthedocs.io/en/latest/installation.html).

## **Contributing**
@@ -300,6 +302,7 @@ When you submit a pull request, a CLA-bot will automatically determine whether y
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.

After getting familiar with the contribution agreements, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:

* We recommend that new contributors start with simple issues: ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help-wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22).
* [NNI developer environment installation tutorial](docs/en_US/Tutorial/SetupNniDeveloperEnvironment.md)
* [How to debug](docs/en_US/Tutorial/HowToDebug.md)
@@ -311,15 +314,14 @@ After getting familiar with contribution agreements, you are ready to create you

## **External Repositories and References**
With the authors' permission, we have listed a set of NNI usage examples and relevant articles.

* ### **External Repositories** ###
* Run [ENAS](examples/tuners/enas_nni/README.md) with NNI
* Run [Neural Network Architecture Search](examples/trials/nas_cifar10/README.md) with NNI
* [Automatic Feature Engineering](examples/feature_engineering/auto-feature-engineering/README.md) with NNI
* [Hyperparameter Tuning for Matrix Factorization](https://github.com/microsoft/recommenders/blob/master/notebooks/04_model_select_and_optimize/nni_surprise_svd.ipynb) with NNI
* [scikit-nni](https://github.com/ksachdeva/scikit-nni) Hyper-parameter search for scikit-learn pipelines using NNI

* ### **Relevant Articles** ###

* [Hyper Parameter Optimization Comparison](docs/en_US/CommunitySharings/HpoComparision.md)
* [Neural Architecture Search Comparison](docs/en_US/CommunitySharings/NasComparision.md)
* [Parallelizing a Sequential Algorithm TPE](docs/en_US/CommunitySharings/ParallelizingTpeSearch.md)
@@ -330,11 +332,13 @@ With authors' permission, we listed a set of NNI usage examples and relevant art
* **Blog (in Chinese)** - [A summary of NNI new capabilities in 2019](https://mp.weixin.qq.com/s/7_KRT-rRojQbNuJzkjFMuA) by @squirrelsc

## **Feedback**

* Discuss on the NNI [Gitter](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) channel.
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Ask a question with NNI tags on [Stack Overflow](https://stackoverflow.com/questions/tagged/nni?sort=Newest&edited=true).

## Related Projects

Targeting openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has also released a few other open source projects.

* [OpenPAI](https://github.com/Microsoft/pai): an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud, and hybrid environments at various scales.
azure-pipelines.yml (9 additions, 33 deletions)
@@ -52,24 +52,12 @@ jobs:
displayName: 'Run flake8 tests to find Python syntax errors and undefined names'
- script: |
cd test
source unittest.sh
source scripts/unittest.sh
displayName: 'Unit test'
- script: |
cd test
python3 naive_test.py
displayName: 'Naive test'
- script: |
cd test
python3 tuner_test.py
displayName: 'Built-in tuners / assessors tests'
- script: |
cd test
python3 metrics_test.py
displayName: 'Trial job metrics test'
- script: |
cd test
python3 cli_test.py
displayName: 'nnicli test'
python3 nni_test/nnitest/run_tests.py --config config/pr_tests.yml
displayName: 'Simple test'
- script: |
cd docs/en_US/
sphinx-build -M html . _build -W
@@ -101,20 +89,12 @@ jobs:
displayName: 'Install dependencies'
- script: |
cd test
source unittest.sh
source scripts/unittest.sh
displayName: 'Unit test'
- script: |
cd test
python3 naive_test.py
displayName: 'Naive test'
- script: |
cd test
python3 tuner_test.py
displayName: 'Built-in tuners / assessors tests'
- script: |
cd test
python3 cli_test.py
displayName: 'nnicli test'
python3 nni_test/nnitest/run_tests.py --config config/pr_tests.yml
displayName: 'Simple test'

- job: 'basic_test_pr_Windows'
pool:
@@ -137,13 +117,9 @@ jobs:
displayName: 'Install dependencies'
- script: |
cd test
powershell.exe -file unittest.ps1
powershell.exe -file scripts/unittest.ps1
displayName: 'unit test'
- script: |
cd test
python tuner_test.py
displayName: 'Built-in tuners / assessors tests'
- script: |
cd test
PATH=$HOME/.local/bin:$PATH python3 cli_test.py
displayName: 'nnicli test'
python nni_test/nnitest/run_tests.py --config config/pr_tests.yml
displayName: 'Simple test'
deployment/pypi/Makefile (3 additions, 0 deletions)
@@ -36,10 +36,13 @@ build:
tar -xf $(NNI_YARN_TARBALL) -C $(NNI_YARN_FOLDER) --strip-components 1
cd $(CWD)../../src/nni_manager && $(NNI_YARN) && $(NNI_YARN) build
cd $(CWD)../../src/webui && $(NNI_YARN) && $(NNI_YARN) build
cd $(CWD)../../src/nasui && $(NNI_YARN) && $(NNI_YARN) build
rm -rf $(CWD)nni
cp -r $(CWD)../../src/nni_manager/dist $(CWD)nni
cp -r $(CWD)../../src/nni_manager/config $(CWD)nni
cp -r $(CWD)../../src/webui/build $(CWD)nni/static
cp -r $(CWD)../../src/nasui/build $(CWD)nni/nasui
cp $(CWD)../../src/nasui/server.js $(CWD)nni/nasui
cp $(CWD)../../src/nni_manager/package.json $(CWD)nni
sed -ie 's/$(NNI_VERSION_TEMPLATE)/$(NNI_VERSION_VALUE)/' $(CWD)nni/package.json
cd $(CWD)nni && $(NNI_YARN) --prod
deployment/pypi/install.ps1 (5 additions, 0 deletions)
@@ -46,11 +46,16 @@ yarn build
cd $CWD\..\..\src\webui
yarn
yarn build
cd $CWD\..\..\src\nasui
yarn
yarn build
if(Test-Path $CWD\nni){
Remove-Item $CWD\nni -r -fo
}
Copy-Item $CWD\..\..\src\nni_manager\dist $CWD\nni -Recurse
Copy-Item $CWD\..\..\src\webui\build $CWD\nni\static -Recurse
Copy-Item $CWD\..\..\src\nasui\build $CWD\nni\nasui -Recurse
Copy-Item $CWD\..\..\src\nasui\server.js $CWD\nni\nasui -Recurse
Copy-Item $CWD\..\..\src\nni_manager\package.json $CWD\nni
(Get-Content $CWD\nni\package.json).replace($NNI_VERSION_TEMPLATE, $NNI_VERSION_VALUE) | Set-Content $CWD\nni\package.json
cd $CWD\nni
docs/en_US/Assessor/BuiltinAssessor.md (15 additions, 15 deletions)
@@ -1,21 +1,21 @@
# Built-in Assessors

NNI provides state-of-the-art tuning algorithm in our builtin-assessors and makes them easy to use. Below is the brief overview of NNI current builtin Assessors:
NNI provides state-of-the-art tuning algorithms within our builtin-assessors and makes them easy to use. Below is a brief overview of NNI's current builtin Assessors.

Note: Click the **Assessor's name** to get the Assessor's installation requirements, suggested scenario and using example. The link for a detailed description of the algorithm is at the end of the suggested scenario of each Assessor.
Note: Click the **Assessor's name** to get each Assessor's installation requirements, suggested usage scenario, and a config example. A link to a detailed description of each algorithm is provided at the end of the suggested scenario for each Assessor.

Currently we support the following Assessors:
Currently, we support the following Assessors:

|Assessor|Brief Introduction of Algorithm|
|---|---|
|[__Medianstop__](#MedianStop)|Medianstop is a simple early stopping rule. It stops a pending trial X at step S if the trial’s best objective value by step S is strictly worse than the median value of the running averages of all completed trials’ objectives reported up to step S. [Reference Paper](https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/46180.pdf)|
|[__Curvefitting__](#Curvefitting)|Curve Fitting Assessor is a LPA(learning, predicting, assessing) algorithm. It stops a pending trial X at step S if the prediction of final epoch's performance worse than the best final performance in the trial history. In this algorithm, we use 12 curves to fit the accuracy curve. [Reference Paper](http://aad.informatik.uni-freiburg.de/papers/15-IJCAI-Extrapolation_of_Learning_Curves.pdf)|
|[__Curvefitting__](#Curvefitting)|Curve Fitting Assessor is an LPA (learning, predicting, assessing) algorithm. It stops a pending trial X at step S if the prediction of the final epoch's performance is worse than the best final performance in the trial history. In this algorithm, we use 12 curves to fit the accuracy curve. [Reference Paper](http://aad.informatik.uni-freiburg.de/papers/15-IJCAI-Extrapolation_of_Learning_Curves.pdf)|

## Usage of Builtin Assessors

Use builtin assessors provided by NNI SDK requires to declare the **builtinAssessorName** and **classArgs** in `config.yml` file. In this part, we will introduce the detailed usage about the suggested scenarios, classArg requirements, and example for each assessor.
Usage of builtin assessors provided by the NNI SDK requires one to declare the **builtinAssessorName** and **classArgs** in the `config.yml` file. In this part, we will introduce the details of usage and the suggested scenarios, classArg requirements, and an example for each assessor.

Note: Please follow the format when you write your `config.yml` file.
Note: Please follow the provided format when writing your `config.yml` file.
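
For reference, a minimal sketch of where the assessor section sits in `config.yml` is shown below. The `tuner` block, the `TPE` choice, and all values are illustrative assumptions, and the remaining experiment settings are omitted:

```yaml
# Sketch only: placement of the assessor section in config.yml.
# The surrounding experiment fields (trial, searchSpacePath, etc.) are assumed and omitted.
tuner:
  builtinTunerName: TPE             # illustrative tuner choice
assessor:
  builtinAssessorName: Medianstop   # select a builtin assessor by name
  classArgs:
    optimize_mode: maximize         # assessor-specific arguments go here
```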

<a name="MedianStop"></a>

@@ -25,12 +25,12 @@ Note: Please follow the format when you write your `config.yml` file.

**Suggested scenario**

It is applicable in a wide range of performance curves, thus, can be used in various scenarios to speed up the tuning progress. [Detailed Description](./MedianstopAssessor.md)
It's applicable in a wide range of performance curves, thus, it can be used in various scenarios to speed up the tuning progress. [Detailed Description](./MedianstopAssessor.md)

**Requirement of classArg**
**classArgs requirements:**

* **optimize_mode** (*maximize or minimize, optional, default = maximize*) - If 'maximize', assessor will **stop** the trial with smaller expectation. If 'minimize', assessor will **stop** the trial with larger expectation.
* **start_step** (*int, optional, default = 0*) - A trial is determined to be stopped or not, only after receiving start_step number of reported intermediate results.
* **start_step** (*int, optional, default = 0*) - A trial is determined to be stopped or not only after receiving start_step number of reported intermediate results.

**Usage example:**
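
As a sketch, a Medianstop entry in `config.yml` using the classArgs described above might look like the following; the specific values are illustrative, not prescribed:

```yaml
# Sketch only: Medianstop assessor configuration (values are illustrative).
assessor:
  builtinAssessorName: Medianstop
  classArgs:
    optimize_mode: maximize   # stop trials whose expectation is worse than the median
    start_step: 5             # wait for 5 intermediate results before judging
```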

@@ -53,15 +53,15 @@ assessor:

**Suggested scenario**

It is applicable in a wide range of performance curves, thus, can be used in various scenarios to speed up the tuning progress. Even better, it's able to handle and assess curves with similar performance. [Detailed Description](./CurvefittingAssessor.md)
It's applicable in a wide range of performance curves, thus, it can be used in various scenarios to speed up the tuning progress. Even better, it's able to handle and assess curves with similar performance. [Detailed Description](./CurvefittingAssessor.md)

**Requirement of classArg**
**classArgs requirements:**

* **epoch_num** (*int, **required***) - The total number of epoch. We need to know the number of epoch to determine which point we need to predict.
* **epoch_num** (*int, **required***) - The total number of epochs. We need to know the number of epochs to determine which points we need to predict.
* **optimize_mode** (*maximize or minimize, optional, default = maximize*) - If 'maximize', assessor will **stop** the trial with smaller expectation. If 'minimize', assessor will **stop** the trial with larger expectation.
* **start_step** (*int, optional, default = 6*) - A trial is determined to be stopped or not, we start to predict only after receiving start_step number of reported intermediate results.
* **threshold** (*float, optional, default = 0.95*) - The threshold that we decide to early stop the worse performance curve. For example: if threshold = 0.95, optimize_mode = maximize, best performance in the history is 0.9, then we will stop the trial which predict value is lower than 0.95 * 0.9 = 0.855.
* **gap** (*int, optional, default = 1*) - The gap interval between Assesor judgements. For example: if gap = 2, start_step = 6, then we will assess the result when we get 6, 8, 10, 12...intermedian result.
* **start_step** (*int, optional, default = 6*) - A trial is determined to be stopped or not only after receiving start_step number of reported intermediate results.
* **threshold** (*float, optional, default = 0.95*) - The threshold that we use to decide to early stop the worst performance curve. For example: if threshold = 0.95, optimize_mode = maximize, and the best performance in the history is 0.9, then we will stop the trial whose predicted value is lower than 0.95 * 0.9 = 0.855.
* **gap** (*int, optional, default = 1*) - The gap interval between Assessor judgments. For example: if gap = 2 and start_step = 6, then we will assess the result when we receive the 6th, 8th, 10th, 12th... intermediate results.

**Usage example:**
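
Similarly, a sketch of a Curvefitting entry in `config.yml` using the classArgs described above; the values are illustrative assumptions:

```yaml
# Sketch only: Curvefitting assessor configuration (values are illustrative).
assessor:
  builtinAssessorName: Curvefitting
  classArgs:
    epoch_num: 20            # required: total number of training epochs
    optimize_mode: maximize
    start_step: 6            # begin predicting after 6 intermediate results
    threshold: 0.95          # early-stop curves predicted below 0.95 * best
    gap: 1                   # assess every intermediate result
```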
