Google BigQuery Integration
Connect Google BigQuery to Edilitics as a source or destination. Step-by-step setup for Service Account and User Account authentication.
Google BigQuery can be used as both a source and a destination in Edilitics. In Replicate pipelines it supports full load, incremental, and CDC (destination only); in Transform flows it can serve as either a source or a destination.
Before You Begin
- A Google Cloud project with BigQuery enabled.
- Either a Service Account key (JSON file) with the BigQuery Data Editor and BigQuery Job User roles, or, for User Account authentication, a Google account with the same permissions.
- The Project ID and Dataset ID you want to connect.
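Project IDs and dataset IDs follow Google Cloud naming rules, so a quick sanity check can catch copy-paste mistakes before you start the setup. A minimal sketch, assuming the standard rules (project IDs: 6–30 characters, lowercase letters, digits, and hyphens, starting with a letter and not ending with a hyphen; dataset IDs: letters, digits, and underscores, up to 1024 characters):

```python
import re

# Google Cloud project IDs: 6-30 chars, lowercase letters, digits, and
# hyphens; must start with a letter and must not end with a hyphen.
PROJECT_ID_RE = re.compile(r"^[a-z][a-z0-9-]{4,28}[a-z0-9]$")

# BigQuery dataset IDs: letters, digits, and underscores, up to 1024 chars.
DATASET_ID_RE = re.compile(r"^[A-Za-z0-9_]{1,1024}$")

def looks_like_project_id(value: str) -> bool:
    return bool(PROJECT_ID_RE.fullmatch(value))

def looks_like_dataset_id(value: str) -> bool:
    return bool(DATASET_ID_RE.fullmatch(value))

print(looks_like_project_id("my-analytics-prod"))  # True
print(looks_like_project_id("My_Project"))         # False: uppercase and underscore
print(looks_like_dataset_id("sales_2024"))         # True
```

Values that fail these checks will also fail when Edilitics tries to fetch datasets, so it is cheaper to catch them here.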
Authentication Methods
BigQuery supports two authentication methods:
Service Account (recommended for production): Upload a JSON key file generated from a Google Cloud Service Account. The service account must have the BigQuery Data Editor and BigQuery Job User IAM roles on the target project.
User Account: Authenticate using a Google account via OAuth. Use the same Google account you use to log in to Edilitics. This method is suitable for personal or trial usage.
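A downloaded service-account key is a JSON file with a fixed set of fields. Before uploading it, you can pre-check the file locally; the sketch below is a hedged example based on the standard Google service-account key format (the field list is an assumption about that format, not an Edilitics requirement):

```python
import json

# Fields present in a standard Google service-account key file
# (assumed from the usual key format).
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def key_problems(key: dict) -> list[str]:
    """Return a list of problems with a parsed key; an empty list means it looks OK."""
    problems = [f"missing field: {name}" for name in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append(f"unexpected type: {key.get('type')!r}")
    return problems

# Typical usage before uploading the file in Edilitics:
# with open("my-key.json") as fh:
#     print(key_problems(json.load(fh)))
```

Note that this only checks the file's shape; whether the service account actually holds the BigQuery Data Editor and BigQuery Job User roles is verified by Test Connection later.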
Steps to Connect
Step 1: Add the BigQuery Integration
- In Edilitics, go to the Integrate module.
- Click New Integration.
- Select Google BigQuery from the integration catalog.
Step 2: Choose Authentication Type
Select Service Account or User Account.
For Service Account:
- Enter your Project ID.
- Upload your Service Account Key (JSON file).
- Click Fetch Datasets to load the available datasets in your project.
- Select your Dataset ID from the list.
For User Account:
- Enter your Project ID.
- Click Authenticate with Google and complete the OAuth flow using the same account you use to log in to Edilitics.
- Click Fetch Datasets to load datasets.
- Select your Dataset ID.
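Behind the scenes, fetching datasets corresponds to BigQuery's REST dataset-listing endpoint. A minimal sketch of the URL involved (calling it yourself would additionally require an OAuth bearer token or service-account credentials; the helper name is illustrative, not an Edilitics API):

```python
from urllib.parse import quote

BASE = "https://bigquery.googleapis.com/bigquery/v2"

def list_datasets_url(project_id: str) -> str:
    """REST endpoint that lists the datasets in a project."""
    return f"{BASE}/projects/{quote(project_id, safe='')}/datasets"

print(list_datasets_url("my-analytics-prod"))
# https://bigquery.googleapis.com/bigquery/v2/projects/my-analytics-prod/datasets
```

If Fetch Datasets returns an empty list, check that the authenticated identity has access to at least one dataset in the project you entered.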
Step 3: Configure the Integration
- Enter an Integration Name to identify this connection in your workspace.
- Add an optional description.
- Configure AI Column Insights if required. See AI Column Insights for privacy mode options.
Step 4: Test and Save
- Click Test Connection to validate credentials and access.
- On success, click Preview to confirm the data is accessible.
- Click Save to create the integration.
Connection Details
| Field | Value |
|---|---|
| Protocol | HTTPS |
| Port | 443 |
| IP whitelisting | Not required |
Supported Operations
| Operation | Source | Destination |
|---|---|---|
| Replicate: Full Load | Yes | Yes |
| Replicate: Incremental | Yes | Yes |
| Replicate: CDC | No | Yes |
| Transform | Yes | Yes |
| Visualize | Yes | n/a |
| AskEdi | Yes | n/a |
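For scripted checks, the support matrix above can be encoded as a small lookup. This is a hypothetical helper that simply mirrors the table, not part of any Edilitics API:

```python
# Mirror of the Supported Operations table: (operation, role) -> supported.
SUPPORT = {
    ("replicate_full_load", "source"): True,
    ("replicate_full_load", "destination"): True,
    ("replicate_incremental", "source"): True,
    ("replicate_incremental", "destination"): True,
    ("replicate_cdc", "source"): False,   # CDC is destination-only for BigQuery
    ("replicate_cdc", "destination"): True,
    ("transform", "source"): True,
    ("transform", "destination"): True,
    ("visualize", "source"): True,        # Visualize and AskEdi only read data
    ("askedi", "source"): True,
}

def supports(operation: str, role: str) -> bool:
    return SUPPORT.get((operation, role), False)

print(supports("replicate_cdc", "source"))  # False
```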
Need help? Email support@edilitics.com with your workspace, job ID, and context. We reply within one business day.