- Data ingestion is hard; Airbyte makes it easier and more scalable
- Take advantage of many data sources: files, APIs, databases, ...
- Give users a choice beyond traditional, centralized solutions (Kafka, BigQuery, Snowflake, ...)
Pull the connector image:

```sh
docker pull mihthanh27/airbyte-destination-streamr
```

Then go to Airbyte > Settings > Destination.
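The exact form fields depend on your Airbyte version, but registering the pulled image as a custom destination typically means filling in values along these lines (the `latest` tag and display name are assumptions; use whatever tag you pulled):

```
Connector display name:  Streamr
Docker repository name:  mihthanh27/airbyte-destination-streamr
Docker image tag:        latest
```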
- Install `nvm`
- Install Node.js: `nvm install 14 && nvm use 14`
- Update `npm` to version 7.x by running `npm install -g npm@7`
- Install `lerna` by running `npm install -g lerna`
- Run `npm run prepare` to install dependencies for all projects (`npm run clean` to clean all)
- Run `npm run build` to build all projects (for a single project add scope, e.g. `npm run build -- --scope faros-destination`)
- Run `npm run test` to test all projects (for a single project add scope, e.g. `npm run test -- --scope faros-destination`)
- Run `npm run lint` to apply the linter on all projects (for a single project add scope, e.g. `npm run lint -- --scope faros-destination`)
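Taken together, a first-time setup might look like the following shell session (a sketch that assumes `nvm` is already installed):

```sh
# toolchain: Node.js 14, npm 7.x, lerna
nvm install 14 && nvm use 14
npm install -g npm@7
npm install -g lerna

# install dependencies, then build, test, and lint every project
npm run prepare
npm run build
npm run test
npm run lint

# the same steps can be scoped to a single package, e.g.:
npm run build -- --scope faros-destination
```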
- Audit fix: `npm audit fix`
- Clean your project: `lerna run clean` (sometimes you also want to `rm -rf ./node_modules`)
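For a full reset, the two commands above can be combined and followed by a fresh dependency install:

```sh
lerna run clean        # run each package's clean script
rm -rf ./node_modules  # remove the root node_modules
npm run prepare        # reinstall dependencies for all projects
```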
Read more about lerna here: https://github.com/lerna/lerna
To build a Docker image for a connector, run the `docker build` command and set the `path` argument. For example, for the Streamr Destination connector run:

```sh
docker build . --build-arg path=destinations/streamr-destination -t airbyte-destination-streamr
```
And then run it:

```sh
docker run airbyte-destination-streamr
```
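Airbyte connectors conventionally accept commands such as `spec`, which prints the connector's configuration schema, so a quick smoke test of the built image could look like this (assuming this connector follows the standard Airbyte connector protocol):

```sh
# print the connector's configuration spec (standard Airbyte protocol command, assumed supported here)
docker run --rm airbyte-destination-streamr spec
```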
Connector Docker images are automatically published to Docker Hub after updates to the main branch. They are tagged by the version listed in the connector's `package.json`. If the connector is updated without incrementing the version, GitHub will NOT overwrite the existing image in Docker Hub.
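In practice this means bumping the `version` field in the connector's `package.json` as part of any change you want published. One way to do that (the path is taken from the build example above; the flag just skips creating a git tag) is:

```sh
cd destinations/streamr-destination
npm version patch --no-git-tag-version  # bump the version in package.json
```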