
Commit

Update UI nav flow and views.
jeff-phillips-18 committed Jan 28, 2021
1 parent 7bc1738 commit f1425a3
Showing 38 changed files with 888 additions and 18,876 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -16,6 +16,7 @@ node_modules
npm-debug.log*
yarn-debug.log*
yarn-error.log*
yarn.lock

install/odh/overlays/dev/kustomization.yaml
install/odh/overlays/dev/deployment_patch.yaml
84 changes: 31 additions & 53 deletions CONTRIBUTING.md
@@ -24,7 +24,7 @@ $ oc login https://api.my-openshift-cluster.com:6443 -u kubeadmin -p my-password
or log in using the makefile and `.env.local` settings
```.env.local
OC_URL=https://specify.in.env:6443
OC_PROJECT=opendatahub
OC_PROJECT=my-project
OC_USER=kubeadmin
OC_PASSWORD=my-password
```
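The same login can be done with a token instead of a username and password (the `OC_TOKEN` option shown further down); a minimal sketch with placeholder values:
```shell
# Log in with an API token instead of user/password credentials
# (token and server URL are placeholders; copy the real values from your cluster console)
$ oc login --token=my-token --server=https://specify.in.env:6443
```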
@@ -72,40 +72,49 @@ The dotenv files have access to default settings grouped by facet; frontend, bac

...

## Deploy
Add the dashboard component and the repo to the ODH instance KfDef yaml.
```yaml
apiVersion: kfdef.apps.kubeflow.org/v1
kind: KfDef
spec:
  applications:
    # ... other components ...

    # Add Dashboard Component
    - kustomizeConfig:
        repoRef:
          name: odh-dashboard
          path: install/odh/base
      name: odh-dashboard
  repos:
    # ... other repos ...

    # Add Dashboard Dev Repo
    - name: odh-dashboard
      uri: 'https://github.com/opendatahub-io/odh-dashboard/tarball/master'
  version: vX.Y.Z
```

## Deploy your version
Edit the opendatahub KfDef in your project and remove the following section:
```
    - kustomizeConfig:
        repoRef:
          name: manifests
          path: odh-dashboard
      name: odh-dashboard
```
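A typical way to open the KfDef for editing is sketched below; the resource name `opendatahub` and the project name are assumptions, so substitute the names used in your cluster.
```shell
# Edit the opendatahub KfDef and delete the odh-dashboard section shown above
# (KfDef and project names are assumptions; use the ones from your cluster)
$ oc edit kfdef opendatahub -n my-project
```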
Remove the current deployment of the ODH Dashboard
```shell
$ make undeploy
```
or
```
$ npm run make:undeploy
```

### Customize your env
Customize the `.env.local` file with the image and source information you want to use. `npm` and the `s2i` command line tool are required.

```.env.local
CONTAINER_BUILDER=docker
IMAGE_REPOSITORY=quay.io/my-org/odh-dashboard:latest
SOURCE_REPOSITORY_URL=git@github.com:my-org/odh-dashboard.git
SOURCE_REPOSITORY_REF=my-branch
OC_URL=https://specify.in.env:6443
OC_PROJECT=specify_in_.env
# user and password login
OC_USER=specify_in_.env
OC_PASSWORD=specify_in_.env
# or token login
#OC_TOKEN=specify_in_.env
```
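Before building, you can confirm the required tools are on your `PATH`:
```shell
# Verify the prerequisites mentioned above are installed
$ npm --version
$ s2i version
```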

### Build
Push your branch to your repo for it to be visible to the s2i build.
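For example, assuming the fork remote is `origin` and the branch matches the `SOURCE_REPOSITORY_REF` sketched above (both assumptions), the push might look like:
```shell
# Publish the working branch so the s2i build can fetch it
# (remote and branch names are assumptions; use your own)
$ git push origin my-branch
```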

Then build:
```shell
$ make build
```
@@ -115,13 +124,6 @@ $ npm run make:build
```

### Pushing the image
Customize the `.env.local` file with your image information and container builder.
```.env.local
CONTAINER_BUILDER=docker
IMAGE_REPOSITORY=quay.io/my-org/odh-dashboard:latest
```
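If the target registry requires authentication, log the container builder in first; `quay.io` below is taken from the sample `IMAGE_REPOSITORY` and may differ for you:
```shell
# Authenticate to the image registry before pushing
# (registry host comes from the example IMAGE_REPOSITORY; adjust as needed)
$ docker login quay.io
```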

```shell
$ make push
```
@@ -131,32 +133,8 @@ $ npm run make:push
```

### Deploying your image
Customize the `.env.local` file with your deployment information. The OpenShift command line tool, `oc`, is required.

First, set the image to deploy to the custom image you built in the previous steps.
```.env.local
IMAGE_REPOSITORY=quay.io/my-org/odh-dashboard:latest
```

Then set your login information to deploy to your cluster.
```.env.local
OC_URL=https://specify.in.env:6443
OC_PROJECT=specify_in_.env
# user and password login
#OC_USER=specify_in_.env
#OC_PASSWORD=specify_in_.env
```
or
```.env.local
OC_URL=https://specify.in.env:6443
OC_PROJECT=specify_in_.env
# token login
OC_TOKEN=specify_in_.env
```

Now execute the deployment scripts.
```shell
$ make deploy
```
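To sanity-check the rollout, list the pods and routes in the target project; the project name is an assumption taken from the examples above:
```shell
# Confirm the dashboard pods are running and find the exposed route
# (project name is an assumption; use your OC_PROJECT value)
$ oc get pods -n my-project
$ oc get routes -n my-project
```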
5 changes: 5 additions & 0 deletions backend/plugins/kube.js
@@ -1,4 +1,5 @@
'use strict';
const { DEV_MODE } = require('../utils/constants');

const fs = require('fs');
const fp = require('fastify-plugin');
@@ -9,6 +10,7 @@ kc.loadFromDefault();

const currentContext = kc.getCurrentContext();
const customObjectsApi = kc.makeApiClient(k8s.CustomObjectsApi);
const currentUser = kc.getCurrentUser();

module.exports = fp(async function (fastify) {
  let namespace;
@@ -23,6 +25,7 @@ module.exports = fp(async function (fastify) {
    currentContext,
    namespace,
    customObjectsApi,
    currentUser,
  });
});

@@ -39,6 +42,8 @@ async function getCurrentNamespace() {
        resolve(data);
      },
    );
  } else if (DEV_MODE) {
    resolve(process.env.OC_PROJECT);
  } else {
    resolve(currentContext.split('/')[0]);
  }
162 changes: 85 additions & 77 deletions backend/routes/api/components/available-components.js
@@ -1,105 +1,113 @@
module.exports = [
  {
    key: 'jupyterhub',
    label: 'JupyterHub',
    description:
      'A multi-user version of the notebook designed for companies, classrooms and research labs',
    kfdefApplications: ['jupyterhub', 'notebook-images'],
    route: 'jupyterhub',
    img: 'images/jupyterhub.svg',
    docsLink: 'https://jupyter.org/hub',
    support: 'redhat',
  },
  {
    key: 'argo',
    label: 'Argo',
    description: 'Kubernetes native workflows, events, CI and CD',
    kfdefApplications: ['odhargo-cluster', 'odhargo'],
    route: 'argo-server',
    img: 'images/argo.svg',
    docsLink: 'https://argoproj.github.io/',
    support: 'other',
  },
  {
    key: 'superset',
    label: 'Superset',
    description: 'A modern, enterprise-ready business intelligence web application',
    kfdefApplications: ['superset'],
    route: 'superset',
    img: 'images/superset.svg',
    docsLink: 'https://superset.incubator.apache.org/',
    support: 'other',
  },
  {
    key: 'prometheus',
    label: 'Prometheus',
    description: 'Systems monitoring and alerting toolkit',
    kfdefApplications: ['prometheus-cluster', 'prometheus-operator'],
    route: 'prometheus-portal',
    img: 'images/prometheus.svg',
    docsLink: 'https://prometheus.io/docs/',
    support: 'redhat',
  },
  {
    key: 'grafana',
    label: 'Grafana',
    description: 'Visualization and analytics software',
    kfdefApplications: ['grafana-cluster', 'grafana-instance'],
    route: 'grafana-route',
    img: 'images/grafana.svg',
    docsLink: 'https://grafana.com/docs/grafana/latest/',
    support: 'redhat',
  },
  {
    key: 'spark',
    label: 'Spark',
    description: 'Unified analytics engine for large-scale data processing',
    kfdefApplications: ['radanalyticsio-spark-cluster'],
    route: null,
    img: 'images/spark.svg',
    docsLink: 'https://spark.apache.org/docs/latest/',
    support: 'other',
  },
  {
    key: 'seldon',
    label: 'Seldon',
    description: 'Platform for rapidly deploying machine learning models on Kubernetes.',
    kfdefApplications: ['odhseldon'],
    route: null,
    img: 'images/seldon.svg',
    docsLink: 'https://docs.seldon.io/',
    support: 'other',
  },
  {
    key: 'kafka',
    label: 'Kafka',
    description: 'Distributed event streaming platform',
    kfdefApplications: ['strimzi-operator', 'kafka-cluster'],
    route: null,
    img: 'images/kafka.svg',
    docsLink: 'https://kafka.apache.org/documentation/',
    support: 'other',
  },
  {
    key: 'airflow',
    label: 'Airflow',
    description: 'Platform to programmatically author, schedule, and monitor workflows',
    kfdefApplications: ['airflow-cluster', 'airflow-operator'],
    route: null,
    img: 'images/airflow.svg',
    docsLink: 'https://airflow.apache.org/',
    support: 'other',
  },
  {
    key: 'hue',
    label: 'Hue',
    description: 'Data exploration platform for Hive and S3 storage',
    kfdefApplications: ['hue'],
    route: 'hue',
    img: 'images/hue.svg',
    docsLink: 'https://docs.gethue.com/',
    support: 'other',
  },
  {
    key: 'thriftserver',
    label: 'Spark SQL Thrift Server',
    description: 'Expose Spark data frames modeled as Hive tables through a JDBC connection',
    kfdefApplications: ['thriftserver'],
    route: 'thriftserver',
    img: 'images/spark.svg',
    docsLink: 'https://spark.apache.org/docs/latest/sql-distributed-sql-engine.html',
    support: 'other',
  },
];
