> **Work in progress:** This page is under construction. Content may be incomplete or subject to change. To contribute, see the contribution guide.
# Google Cloud Platform (GCP)

Reference for GCP projects, IAM, and operational procedures at Patria Investments.

## Access
| Method | Details |
|---|---|
| Console | [console.cloud.google.com](https://console.cloud.google.com) — authenticate with the corporate Google Workspace account |
| CLI | `gcloud auth login` — opens a browser-based SSO flow |
| Service accounts | Managed via IaC; keys are stored in Key Vault (never committed to code) |
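A minimal sketch of both authentication paths; the key-file path is illustrative only — real keys live in Key Vault and must never be committed:

```shell
# Interactive login with the corporate Workspace account (opens browser SSO)
gcloud auth login

# Activate a service account for non-interactive use — the key-file path
# below is a placeholder, not a real location
gcloud auth activate-service-account \
    --key-file=/path/to/key.json
```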
## Projects
| Project | Purpose | Environment |
|---|---|---|
| patria-data-prod | BigQuery data lake, Cloud Storage, Composer / Airflow | Production |
| patria-data-staging | Staging data pipelines | Staging |
| patria-data-dev | Developer sandboxes, exploratory analysis | Development |
## Key resources
| Resource | Type | Project |
|---|---|---|
| patria-datalake | BigQuery dataset | patria-data-prod |
| patria-raw | Cloud Storage bucket (raw ingestion) | patria-data-prod |
| patria-airflow | Cloud Composer environment | patria-data-prod |
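The BigQuery and Storage resources above can be inspected from the CLI; a sketch, assuming the bucket is addressable as `gs://patria-raw`:

```shell
# List tables in the data lake dataset (bq uses project:dataset syntax)
bq ls patria-data-prod:patria-datalake

# Browse the raw ingestion bucket
gsutil ls gs://patria-raw/
```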
## IAM roles in use
| Role | Description |
|---|---|
| `roles/bigquery.dataEditor` | Read and write access to BigQuery datasets |
| `roles/bigquery.jobUser` | Permission to run BigQuery jobs |
| `roles/storage.objectViewer` | Read-only access to Cloud Storage buckets |
| `roles/composer.worker` | Runtime access for Airflow workers |
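Role grants are normally applied through IaC rather than by hand, but for reference, a binding for one of the roles above looks like this — the service account address is a made-up example:

```shell
# Grant read/write BigQuery access to a service account in the prod project.
# The member address is illustrative; in practice this change goes through IaC.
gcloud projects add-iam-policy-binding patria-data-prod \
    --member="serviceAccount:pipeline-sa@patria-data-prod.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataEditor"
```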
## Common procedures

### Set the active project

```shell
gcloud config set project patria-data-prod
```

### List BigQuery datasets

```shell
bq ls --project_id patria-data-prod
```

### Check Composer environment status

```shell
gcloud composer environments describe patria-airflow \
    --location <region> \
    --format="value(state)"
```

## Runbooks
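Beyond checking the environment state, the Composer environment also exposes the Airflow CLI through `gcloud`; a sketch, keeping the `<region>` placeholder used above:

```shell
# Run an Airflow CLI subcommand against the Composer-managed instance,
# here listing the DAGs it currently knows about
gcloud composer environments run patria-airflow \
    --location <region> \
    dags list
```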
For step-by-step operational procedures, see Runbook — GCP.
## Escalation
Owner: Infra & Cloud Squad — see Contacts