This page is for users who have decided that the Airbyte server is the right deployment option for their production instance.
A full Airbyte server gives you infrastructure that manages the scheduled runs of your connections, plus an Airbyte UI for creating and managing those connections.
- [Recommended] Spin up an EC2 instance for the Airbyte app and a corresponding RDS database. Follow along with the linked Airbyte docs to set up your server.
- If you anticipate scaling issues hosting your Airbyte server on EC2 and have expertise in managing Kubernetes, you can instead deploy Airbyte on Kubernetes.
By default, Airbyte Server stores secrets, such as API keys and other credentials, as unencrypted strings in the Airbyte database. Airbyte offers alternate options for storing secrets more securely. We recommend using HashiCorp Vault to store secrets. To configure your Airbyte Server to use Vault, perform the following steps:
Create a new Key/Value Secrets Engine in your Vault instance. Both Engine versions 1 and 2 should work.
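As a sketch, the secrets engine can be enabled with the Vault CLI; the path name `airbyte` below is just an example (use whatever path you like, and reference it later in `VAULT_PREFIX`):

```bash
# Enable a KV v2 secrets engine at the example path "airbyte"
vault secrets enable -path=airbyte -version=2 kv

# Or, for a v1 engine:
# vault secrets enable -path=airbyte -version=1 kv
```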
If you're running Airbyte on EC2, add the following environment variables to the bootloader, server, and worker Docker services:
```
SECRET_PERSISTENCE=VAULT
VAULT_ADDRESS=<VAULT URL>
VAULT_PREFIX=<Secrets Engine Name>/
VAULT_AUTH_TOKEN="<VAULT TOKEN>"
```
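One way to inject these variables (a sketch, not the only option) is a Docker Compose override file; the service names below are assumptions, so match them to the service names in your Airbyte `docker-compose.yaml`:

```yaml
# docker-compose.override.yaml (sketch; service names are assumptions)
services:
  airbyte-bootloader:
    environment:
      - SECRET_PERSISTENCE=VAULT
      - VAULT_ADDRESS=<VAULT URL>
      - VAULT_PREFIX=<Secrets Engine Name>/
      - VAULT_AUTH_TOKEN=<VAULT TOKEN>
  # Repeat the same environment block for the server and worker services.
```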
If you're running Airbyte on Kubernetes, add the following entries to your Helm `values.yaml` file, or to a custom values file passed with the `-f, --values` option during deployment.
```yaml
airbyte-bootloader:
  extraEnv:
    - name: SECRET_PERSISTENCE
      value: VAULT
    - name: VAULT_ADDRESS
      value: <VAULT URL>
    - name: VAULT_PREFIX
      value: <Secrets Engine Name>/
    - name: VAULT_AUTH_TOKEN
      value: "<VAULT TOKEN>"
server:
  extraEnv:
    - name: SECRET_PERSISTENCE
      value: VAULT
    - name: VAULT_ADDRESS
      value: <VAULT URL>
    - name: VAULT_PREFIX
      value: <Secrets Engine Name>/
    - name: VAULT_AUTH_TOKEN
      value: "<VAULT TOKEN>"
worker:
  extraEnv:
    - name: SECRET_PERSISTENCE
      value: VAULT
    - name: VAULT_ADDRESS
      value: <VAULT URL>
    - name: VAULT_PREFIX
      value: <Secrets Engine Name>/
    - name: VAULT_AUTH_TOKEN
      value: "<VAULT TOKEN>"
```
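With the values in place, deployment is a standard Helm install or upgrade. The chart repository URL and release name below are the common defaults for Airbyte's chart and may differ in your setup:

```bash
# Register Airbyte's Helm chart repository (sketch; release name "airbyte" is an example)
helm repo add airbyte https://airbytehq.github.io/helm-charts
helm repo update

# Deploy (or redeploy) with your custom values file
helm upgrade --install airbyte airbyte/airbyte -f values.yaml
```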
You can also store secrets using GCP Secret Manager or AWS Secrets Manager if you're already integrated with one of those cloud platforms. See the Airbyte documentation for the environment variables required for each option.
Instead of running your Airbyte connections through the command line, you'll recreate them in your new Airbyte server. Fortunately, all the parameters stay the same.
This only needs to be done once and will be used by all sources.
- Add a new destination definition
- Add a Faros destination definition
- Choose Faros Destination
- Configure the Faros destination (Note: select "v2" for GraphQL API version)
The example below reproduces the GitHub CLI command in the Airbyte server UI.
Add the connector to your sources
- Connector display name: Faros Feeds
- Docker repository name: farosai/airbyte-faros-feeds-source
- Docker image tag: latest (Note: this pulls the latest version at the time you create the connector. It does not update to the latest version each time the source runs.)
- Connector Documentation URL: <https://docs.faros.ai>
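The same connector definition can also be created through Airbyte's Configuration API instead of the UI. The host and port below are assumptions for a default local deployment; adjust them to wherever your server is exposed:

```bash
# Sketch: register the Faros Feeds source connector via the Airbyte API
curl -X POST http://localhost:8000/api/v1/source_definitions/create \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Faros Feeds",
    "dockerRepository": "farosai/airbyte-faros-feeds-source",
    "dockerImageTag": "latest",
    "documentationUrl": "https://docs.faros.ai"
  }'
```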
Create a new source to pull from GitHub Enterprise using the connector you created above
- Source type: Faros Feeds (or whatever you named the connector above)
- Feed type: Choose 'github' from the dropdown. This will show the GitHub-specific configuration fields.
- Authentication: Choose your authentication method and complete the corresponding fields.
- Repos Query Mode: Select GitHub Org and enter a list of repositories to pull.
- GitHub API URL: Enter the GitHub Enterprise API URL (defaults to the GitHub API URL).
- Cutoff days: Fetch entities updated in the last number of days (defaults to 90 days).
- Feed command line arguments: Leave blank. This is how extra feed arguments not present in the UI can be passed.
- Enable debug logs if desired.
Create a connection between the source and the Faros destination
- Destination Stream Prefix: Enter a prefix ending in faros_feeds (e.g., ghefaros_feeds). The leading part (e.g., ghe) is used as the origin for your records. The trailing part (faros_feeds) is used by the destination connector to convert records emitted by the source into the Faros models.
- There's a single stream: faros_feed. Activate it and select the sync mode.
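To make the prefix convention concrete, here is a tiny sketch (plain Python, purely for illustration; the function name is ours, not part of any Airbyte or Faros API) of how a prefix like ghefaros_feeds splits into an origin plus the faros_feeds marker:

```python
SUFFIX = "faros_feeds"

def split_prefix(prefix: str) -> str:
    """Return the origin part of a destination stream prefix.

    The Faros destination expects prefixes ending in 'faros_feeds';
    the leading part (e.g. 'ghe') becomes the origin on your records.
    """
    if not prefix.endswith(SUFFIX):
        raise ValueError(f"prefix must end in {SUFFIX!r}")
    return prefix[: -len(SUFFIX)]

print(split_prefix("ghefaros_feeds"))  # -> ghe
```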