Google Cloud Storage Configuration
Complete guide for configuring Google Cloud Storage connectivity with FastTransfer.
Authentication Methods
FastTransfer supports multiple Google Cloud authentication methods:
- gcloud CLI (recommended for local development)
- Service Account Key File (recommended for production)
- Default Application Credentials (for GCP infrastructure)
1. gcloud CLI (Recommended)
The simplest method for local development and testing.
Setup:
# Install gcloud CLI
# Windows: Download from https://cloud.google.com/sdk/docs/install
# Linux: curl https://sdk.cloud.google.com | bash
# Initialize gcloud
gcloud init
# Login to Google Cloud
gcloud auth login
# List available projects
gcloud projects list
# Set active project
gcloud config set project my-project-id
# Create application default credentials
gcloud auth application-default login
FastTransfer will automatically use the gcloud CLI's authentication credentials.
Using with FastTransfer:
# No additional configuration needed - just run FastTransfer
./FastTransfer \
...
--directory "gs://my-bucket/exports" \
--fileoutput "data.parquet" \
...
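Before running an export, you can confirm that the active credentials can reach the bucket (using the example bucket name from above):
# Show which account is currently active
gcloud auth list
# Confirm the credentials can list the bucket
gsutil ls gs://my-bucket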
2. Service Account Key File
Recommended for production environments, CI/CD pipelines, and automated workflows.
Create Service Account:

Google Cloud Console:
- Go to IAM & Admin → Service Accounts
- Click Create Service Account
- Enter a name (e.g., "fasttransfer-export"; service account IDs must be lowercase)
- Click Create and Continue
- Grant the Storage Object Admin role
- Click Continue → Done
- Click on the service account
- Go to the Keys tab
- Click Add Key → Create new key
- Select JSON format
- Click Create (the JSON key file will download)

gcloud CLI:
# Create service account (IDs must be lowercase)
gcloud iam service-accounts create fasttransfer-export \
--display-name="FastTransfer Export Service Account"
# Grant Storage Object Admin role
gcloud projects add-iam-policy-binding my-project-id \
--member="serviceAccount:fasttransfer-export@my-project-id.iam.gserviceaccount.com" \
--role="roles/storage.objectAdmin"
# Create and download key file
gcloud iam service-accounts keys create ~/FastTransfer-key.json \
--iam-account=fasttransfer-export@my-project-id.iam.gserviceaccount.com
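To confirm the downloaded key works before wiring it into FastTransfer, you can activate it with gcloud and list the target bucket (note that this switches the active gcloud account to the service account):
# Activate the service account from the key file
gcloud auth activate-service-account --key-file ~/FastTransfer-key.json
# Verify it can reach the bucket
gsutil ls gs://my-bucket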
Use Service Account Key:

Windows (PowerShell):
# Set path to service account key file
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\keys\FastTransfer-key.json"
# Run FastTransfer
.\FastTransfer.exe `
...
--directory "gs://my-bucket/exports" `
--fileoutput "data.parquet" `
...

Linux:
# Set path to service account key file
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/FastTransfer-key.json"
# Run FastTransfer
./FastTransfer \
...
--directory "gs://my-bucket/exports" \
--fileoutput "data.parquet" \
...
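The export above lasts only for the current shell session. If you want the variable to persist across sessions (optional; FastTransfer only requires that it is set at run time), add it to your shell profile on Linux or use setx on Windows:
# Linux: persist for future shells
echo 'export GOOGLE_APPLICATION_CREDENTIALS="/path/to/FastTransfer-key.json"' >> ~/.bashrc
# Windows (PowerShell): persist at the user level
setx GOOGLE_APPLICATION_CREDENTIALS "C:\keys\FastTransfer-key.json"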
3. Default Application Credentials
When FastTransfer runs on Google Cloud infrastructure, it automatically uses the attached default service account. No credential configuration is needed!
Supported GCP Services:
- Compute Engine instances
- Google Kubernetes Engine (GKE) pods
- Cloud Run services
- Cloud Functions
- App Engine
Verify Default Credentials:
gcloud auth application-default print-access-token
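On Compute Engine (and the other runtimes above) you can also ask the standard metadata server which service account FastTransfer will inherit:
# Show the service account attached to the instance
curl -H "Metadata-Flavor: Google" \
"http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email"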
GCS URI Format
gs://bucket-name/path/to/directory/
URI Examples
# Root of bucket
--directory "gs://my-bucket"
# Single folder level
--directory "gs://my-bucket/exports"
# Multiple folder levels
--directory "gs://my-bucket/exports/sales/2024"
# Hive-style partitioning
--directory "gs://my-bucket/data/year=2024/month=01/day=15"
Complete Examples
Basic Export to GCS
Windows:
.\FastTransfer.exe `
--connectiontype "mssql" `
--server "localhost" `
--database "SalesDB" `
--trusted `
--sourceschema "dbo" `
--sourcetable "Orders" `
--directory "gs://my-bucket/exports/orders" `
--fileoutput "orders.parquet"

Linux:
./FastTransfer \
--connectiontype "mssql" \
--server "localhost" \
--database "SalesDB" \
--trusted \
--sourceschema "dbo" \
--sourcetable "Orders" \
--directory "gs://my-bucket/exports/orders" \
--fileoutput "orders.parquet"
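After the export completes, a quick way to verify that the file landed (same bucket and path as above):
# List the uploaded file with size and timestamp
gsutil ls -l gs://my-bucket/exports/orders/
# Show the object's metadata
gsutil stat gs://my-bucket/exports/orders/orders.parquet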
Parallel Export with Query
Windows:
.\FastTransfer.exe `
--connectiontype "pgsql" `
--server "localhost" `
--port "5432" `
--database "analytics" `
--user "postgres" `
--password "postgres" `
--query "SELECT * FROM events WHERE event_date >= '2024-01-01'" `
--directory "gs://analytics-bucket/events/2024" `
--fileoutput "events.parquet" `
--parallelmethod "Ctid" `
--paralleldegree 10

Linux:
./FastTransfer \
--connectiontype "pgsql" \
--server "localhost" \
--port "5432" \
--database "analytics" \
--user "postgres" \
--password "postgres" \
--query "SELECT * FROM events WHERE event_date >= '2024-01-01'" \
--directory "gs://analytics-bucket/events/2024" \
--fileoutput "events.parquet" \
--parallelmethod "Ctid" \
--paralleldegree 10
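Note that a parallel export may write several files under the target prefix (one per worker) unless merging is enabled, so it is worth listing the prefix to see exactly what was produced:
# Inspect the files written by the parallel export
gsutil ls -l gs://analytics-bucket/events/2024/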
Export with Date Partitioning
Windows:
.\FastTransfer.exe `
--connectiontype "mysql" `
--server "localhost" `
--port "3306" `
--database "ecommerce" `
--user "root" `
--password "password" `
--sourceschema "ecommerce" `
--sourcetable "orders" `
--directory "gs://datalake-bucket/orders/" `
--fileoutput "orders.parquet" `
--parallelmethod "DataDriven" `
--distributekeycolumn "o_orderdate" `
--paralleldegree 8

Linux:
./FastTransfer \
--connectiontype "mysql" \
--server "localhost" \
--port "3306" \
--database "ecommerce" \
--user "root" \
--password "password" \
--sourceschema "ecommerce" \
--sourcetable "orders" \
--directory "gs://datalake-bucket/orders/" \
--fileoutput "orders.parquet" \
--parallelmethod "DataDriven" \
--distributekeycolumn "o_orderdate" \
--paralleldegree 8
Export from Oracle to GCS using Rowid parallelmethod
Windows:
.\FastTransfer.exe `
--connectiontype "oraodp" `
--server "localhost:1521/ORCLPDB1" `
--user "FastUser" `
--password "FastPassword" `
--sourceschema "SALES" `
--sourcetable "ORDERS" `
--directory "gs://warehouse-bucket/oracle-export/orders" `
--fileoutput "orders.parquet" `
--parallelmethod "Rowid" `
--paralleldegree 8 `
--merge "True"

Linux:
./FastTransfer \
--connectiontype "oraodp" \
--server "localhost:1521/ORCLPDB1" \
--user "FastUser" \
--password "FastPassword" \
--sourceschema "SALES" \
--sourcetable "ORDERS" \
--directory "gs://warehouse-bucket/oracle-export/orders" \
--fileoutput "orders.parquet" \
--parallelmethod "Rowid" \
--paralleldegree 8 \
--merge "True"
Bucket Configuration
Create Bucket
Google Cloud Console:
- Go to Cloud Storage → Buckets
- Click Create Bucket
- Enter a bucket name (globally unique)
- Choose location type and region
- Choose a storage class (Standard, Nearline, Coldline, Archive)
- Set access control (Uniform or Fine-grained)
- Click Create

gcloud CLI:
# Create bucket in us-central1
gsutil mb -l us-central1 gs://my-bucket
# Create bucket with specific storage class
gsutil mb -c STANDARD -l us-central1 gs://my-bucket
# Set uniform bucket-level access
gsutil ubla set on gs://my-bucket
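On recent Google Cloud SDK releases, the gcloud storage command group covers the same operations as gsutil; the equivalents of the commands above are:
# Create bucket in us-central1
gcloud storage buckets create gs://my-bucket --location=us-central1
# Create bucket with a specific storage class
gcloud storage buckets create gs://my-bucket --location=us-central1 --default-storage-class=STANDARD
# Enable uniform bucket-level access
gcloud storage buckets update gs://my-bucket --uniform-bucket-level-access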
Required Permissions
IAM Roles
Recommended Roles:
- Storage Object Admin (roles/storage.objectAdmin) - Create, read, and delete objects (used throughout this guide)
- Storage Object Creator (roles/storage.objectCreator) - Create objects only
- Storage Object Viewer (roles/storage.objectViewer) - Read objects only
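For an export-only workflow you can grant Storage Object Creator instead of Storage Object Admin, using the same gsutil iam syntax shown below. Keep in mind that create-only access cannot overwrite existing objects, since overwriting requires storage.objects.delete (see Minimal Permissions):
# Grant create-only access to the service account
gsutil iam ch serviceAccount:fasttransfer-export@my-project-id.iam.gserviceaccount.com:roles/storage.objectCreator gs://my-bucket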
Grant Access via Console
- Go to Cloud Storage → Buckets
- Click bucket name
- Go to Permissions tab
- Click Grant Access
- Enter principal (user, service account, or group)
- Select role (e.g., Storage Object Admin)
- Click Save
Grant Access via gcloud CLI
# Grant to service account
gsutil iam ch serviceAccount:fasttransfer-export@my-project-id.iam.gserviceaccount.com:roles/storage.objectAdmin gs://my-bucket
Minimal Permissions
If you prefer not to grant a broad role such as Storage Object Admin, these are the minimum permissions FastTransfer needs:
- storage.objects.create - Upload objects
- storage.objects.delete - Delete objects (for overwrite)
- storage.objects.get - Read objects (for verification)
- storage.objects.list - List objects in bucket
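These four permissions can be bundled into a custom IAM role and bound to the service account; a minimal sketch (the role ID fastTransferMinimal is just an example name):
# Create a custom role carrying only the permissions above
gcloud iam roles create fastTransferMinimal \
--project=my-project-id \
--title="FastTransfer Minimal" \
--permissions=storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list
# Bind the custom role to the service account on the bucket
gsutil iam ch serviceAccount:fasttransfer-export@my-project-id.iam.gserviceaccount.com:projects/my-project-id/roles/fastTransferMinimal gs://my-bucket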