Frequently Asked Questions (FAQ)

Answers to common questions regarding the architecture, security, and implementation of the OneLens Google Cloud Platform (GCP) integration.

What is the high-level architecture of this integration?

The integration utilizes GCP Billing Exports to BigQuery.

  • Data Flow: GCP exports detailed billing and pricing data into a BigQuery dataset (billing_export) within a dedicated project in your organization. OneLens queries this BigQuery dataset securely to generate insights.

  • Resource Impact: Minimal. The integration runs read-only queries against your billing data and reads metadata from your projects. It does not deploy agents to your VMs or clusters.

What GCP scopes do you support for onboarding?

We support onboarding at the Organization, Folder, and Project levels.

  • Automation Support: Our Terraform script (deploy.sh) accepts a list of specific Target Folder IDs or Target Project IDs.

  • Default Behavior: If no specific targets are provided, the script defaults to onboarding the entire Organization (all active projects).
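As a rough illustration of what organization-wide onboarding enumerates, here is an equivalent gcloud query; the organization and folder IDs are placeholders, and note that `parent.id` only matches direct children, so nested folders each need their own filter:

```shell
# Active projects directly under the organization node
# (the default onboarding scope).
gcloud projects list \
  --filter="parent.id=123456789012 AND lifecycleState:ACTIVE" \
  --format="value(projectId)"

# Active projects directly under a specific folder
# (folder-level onboarding).
gcloud projects list \
  --filter="parent.id=987654321098 AND parent.type=folder AND lifecycleState:ACTIVE" \
  --format="value(projectId)"
```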

How is authentication handled?

We do not require (and do not recommend) sharing long-lived Service Account JSON keys.

  • Method: We utilize Service Account Impersonation.

  • Mechanism: You grant the role Service Account Token Creator to our external identity ([email protected]). This allows our platform to generate short-lived credentials to access only the specific OneLens Reader SA service account you create in your environment.

  • Benefit: This is a zero-trust aligned pattern that eliminates the risk of key leakage and allows you to revoke access instantly by removing the IAM binding.
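A minimal sketch of the impersonation binding using gcloud. The service account and external identity names are the ones quoted above; `YOUR_PROJECT_ID` is a placeholder, and we assume the external identity is a user principal (use the `serviceAccount:` prefix instead if it is a service account):

```shell
# Allow the OneLens external identity to mint short-lived tokens
# for the reader service account in your project.
gcloud iam service-accounts add-iam-policy-binding \
  onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com \
  --member="user:[email protected]" \
  --role="roles/iam.serviceAccountTokenCreator"

# Revoking access later is the inverse operation; no keys to rotate.
gcloud iam service-accounts remove-iam-policy-binding \
  onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com \
  --member="user:[email protected]" \
  --role="roles/iam.serviceAccountTokenCreator"
```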

What specific permissions does OneLens require?

We adhere to a read-only posture using a dedicated Service Account (onelens-reader-sa).

  • Billing Access: BigQuery Data Viewer and BigQuery Job User on the billing project to read cost data.

  • Resource Access: Detailed "Viewer" roles on the target projects (e.g., Compute Viewer, Kubernetes Engine Viewer, Vertex AI Viewer) to map costs to resources and generate rightsizing recommendations.

  • Organization Access: Organization Viewer and Billing Account Viewer to visualize hierarchy and subscription mapping.

| IAM Role | Scope | Assignee | Purpose |
| --- | --- | --- | --- |
| Organization Viewer | Organization | Service Account, External user | Read organization hierarchy. |
| Billing Viewer | Billing account | Service Account, External user | Read billing account metadata. |
| BigQuery Data Viewer | Billing project | Service Account, External user | Read data from the BigQuery export dataset. |
| BigQuery Job User | Billing project | Service Account, External user | Run queries on the billing data. |
| *Service Viewer roles | **Target scope | Service Account | Read metadata for services like Compute, GKE, etc. |
| Viewer | **Target scope | External user | Read-only access to the console. |
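For reference, the bindings in the table map onto standard gcloud commands. A sketch for the billing-project and organization-level grants, with project, organization, and billing account IDs as placeholders:

```shell
SA="serviceAccount:onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com"

# Billing project: read the export dataset and run queries against it.
gcloud projects add-iam-policy-binding BILLING_PROJECT_ID \
  --member="$SA" --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding BILLING_PROJECT_ID \
  --member="$SA" --role="roles/bigquery.jobUser"

# Organization scope: hierarchy visibility.
gcloud organizations add-iam-policy-binding ORG_ID \
  --member="$SA" --role="roles/resourcemanager.organizationViewer"

# Billing account scope: billing metadata.
gcloud beta billing accounts add-iam-policy-binding BILLING_ACCOUNT_ID \
  --member="$SA" --role="roles/billing.viewer"
```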

Which APIs need to be enabled?

To provide accurate recommendations and map costs effectively, the following 15 APIs must be enabled on all target projects:

  1. aiplatform.googleapis.com (Vertex AI API)

  2. cloudfunctions.googleapis.com (Cloud Functions API)

  3. sqladmin.googleapis.com (Cloud SQL Admin API)

  4. compute.googleapis.com (Compute Engine API)

  5. container.googleapis.com (Kubernetes Engine API)

  6. dataflow.googleapis.com (Dataflow API)

  7. dataproc.googleapis.com (Cloud Dataproc API)

  8. file.googleapis.com (Cloud Filestore API)

  9. monitoring.googleapis.com (Cloud Monitoring API)

  10. networkmanagement.googleapis.com (Network Management API)

  11. recommender.googleapis.com (Recommender API)

  12. redis.googleapis.com (Google Cloud Memorystore for Redis API)

  13. serviceusage.googleapis.com (Service Usage API)

  14. cloudasset.googleapis.com (Cloud Asset API)

  15. bigquery.googleapis.com (BigQuery API)
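Enabling all 15 APIs on a project is a single gcloud call; a sketch, with the target project ID as a placeholder:

```shell
# Enable every API OneLens reads from, in one batch.
gcloud services enable \
  aiplatform.googleapis.com cloudfunctions.googleapis.com \
  sqladmin.googleapis.com compute.googleapis.com \
  container.googleapis.com dataflow.googleapis.com \
  dataproc.googleapis.com file.googleapis.com \
  monitoring.googleapis.com networkmanagement.googleapis.com \
  recommender.googleapis.com redis.googleapis.com \
  serviceusage.googleapis.com cloudasset.googleapis.com \
  bigquery.googleapis.com \
  --project=TARGET_PROJECT_ID
```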

Is there any cost for enabling all these APIs?

No.

  • Explanation: Enabling an API (like compute.googleapis.com) acts as a "gateway" allowing interaction with the service. It does not incur a fee by itself. Costs are incurred when you provision resources (like running a VM) or make high-volume API calls (Data Plane operations).

  • OneLens Usage: Our platform performs low-volume "metadata read" operations (e.g., "List Instances", "Get Disk Type") which typically fall well within the Google Cloud Free Tier limits for API requests. You are not charged simply for having the API enabled in your projects.

Can we automate this setup?

Yes. We provide a pre-packaged Terraform module wrapper (deploy.sh).

  • Workflow: You upload the provided onelens-gcp-onboarding folder to your Google Cloud Shell.

  • Script Actions: The script automatically enables the 15 required APIs, creates the dedicated billing project (astuto-<company>-billing), configures the BigQuery dataset, and applies the necessary IAM bindings.

  • Prerequisites: The user running the script needs the Organization Administrator, Billing Account Administrator, and Service Usage Admin roles.
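Before running the script, you can confirm your own user holds the prerequisite roles at the organization level; a sketch, with the organization ID and email as placeholders:

```shell
# List every organization-level role bound to your user.
gcloud organizations get-iam-policy ORG_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:user:[email protected]" \
  --format="value(bindings.role)"
```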

Why does OneLens need the "Viewer" role for the External User?

The "Viewer" role (roles/viewer) is assigned to the External User ([email protected]), which represents our support/engineering team, not the automated platform.

  • Reason: This facilitates rapid troubleshooting of permission issues or data discrepancies during the onboarding phase without requiring back-and-forth granular permission grants.

  • Least Privilege: The automated platform (the Service Account) uses strictly granular roles (e.g., roles/compute.viewer, roles/cloudsql.viewer) defined in the Terraform/Manual guide, ensuring your daily data ingestion is scoped tightly.

What if we have a third-party billing partner (CSP) and do not have access to enable the BigQuery cost export?

This is a common scenario with billing CSP partners.

  • Action: You cannot create the export yourself if you do not own the Billing Account. You must request the vendor to share the existing BigQuery dataset containing your billing data.

  • Procedure: Ask your partner to grant BigQuery Data Viewer and BigQuery Job User roles on their export dataset to the OneLens Reader SA service account email we provide during onboarding. You will then point OneLens to that Partner Project/Dataset ID instead of creating a new one.
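If your partner manages access with the bq CLI, dataset-level read access is typically granted by editing the dataset's access list; a sketch, assuming the partner's project is PARTNER_PROJECT and the dataset is billing_export:

```shell
# Export the current dataset definition, including its access list.
bq show --format=prettyjson PARTNER_PROJECT:billing_export > dataset.json

# Add an entry like this to the "access" array in dataset.json:
#   {"role": "READER",
#    "userByEmail": "onelens-reader-sa@YOUR_PROJECT.iam.gserviceaccount.com"}

# Apply the updated definition.
bq update --source dataset.json PARTNER_PROJECT:billing_export
```

Note that BigQuery Job User is a project-level role, so the partner grants it with `gcloud projects add-iam-policy-binding` rather than on the dataset itself.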

Why must the BigQuery Dataset be in the "US (multiple regions)" location?

This is a critical GCP constraint for historical data.

  • Reason: When you enable GCP Billing Exports, setting the dataset location to Multi-region US allows GCP to potentially back-fill billing data from the start of the previous month.

  • Impact: If you select a different region, the export will likely only contain data starting from the moment you enable it, resulting in a gap in your initial reporting.
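Creating the export dataset in the multi-region location looks like this with the bq CLI (project and dataset names follow the conventions above):

```shell
# Dataset location is immutable after creation, so set US up front.
bq mk --dataset --location=US YOUR_BILLING_PROJECT:billing_export
```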

Why is "Table Expiry" disabled on the dataset?

By default, BigQuery may set tables to expire (auto-delete) after 60 days.

  • Requirement: We explicitly require this to be unchecked (disabled).

  • Reason: FinOps requires long-term trend analysis (year-over-year, month-over-month). If the data partitions are auto-deleted, we lose the historical audit trail required for forecasting and anomaly detection.
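If the dataset was created with a default expiration, it can be cleared with bq; a sketch:

```shell
# A value of 0 removes the default table expiration on the dataset,
# so billing partitions are retained indefinitely.
bq update --default_table_expiration 0 YOUR_BILLING_PROJECT:billing_export

# Confirm: defaultTableExpirationMs should no longer appear in the output.
bq show --format=prettyjson YOUR_BILLING_PROJECT:billing_export
```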

I already have a billing export. Do I need to create a new one?

You can reuse an existing billing export, but we strongly recommend a dedicated setup.

  • Existing Project: Our script supports an input Existing Billing Export Project ID.

  • Recommendation: Creating a dedicated OneLens billing project isolates our access. It ensures we never query unrelated tables, and it prevents our BigQuery Job User queries from consuming quota in, or creating noise in, your production analytics project.

Why do you require BigQuery roles (Job User, Metadata Viewer) on the target projects?

This is separate from the billing export access.

  • Reason: Many organizations run significant BigQuery workloads (queries) within their application projects. These roles allow OneLens to analyze the query history, slot usage, and job performance in those specific projects.

  • Benefit: This enables us to perform BigQuery optimization, which often uncovers substantial savings by identifying inefficient queries and underutilized slot commitments. This is purely for workload optimization, not for reading the data inside your application tables.
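The kind of analysis these roles enable can be reproduced manually via BigQuery's INFORMATION_SCHEMA views; for example, the most expensive jobs in a project over the last week (the region qualifier depends on where your datasets live):

```shell
# Top 10 jobs by bytes billed in the last 7 days.
bq query --use_legacy_sql=false '
SELECT user_email,
       job_id,
       ROUND(total_bytes_billed / POW(1024, 4), 3) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY total_bytes_billed DESC
LIMIT 10'
```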

What is the cost incurred for this setup?

The analysis below provides approximate monthly costs for two scenarios:

Scenario 1: Same Region (our standard setup)

(Example: Customer's BigQuery in US Multi-region, OneLens processing in US Multi-region)

| Component | $5K/month spend | $50K/month spend | $500K/month spend | $5M/month spend |
| --- | --- | --- | --- | --- |
| Storage size | 1.25 GB | 12.5 GB | 125 GB | 1.25 TB |
| Storage cost | $0.00 | ~$0.05 | ~$2.30 | ~$24.80 |
| Query cost | $0.00 | $0.00 | $0.00 | ~$1.56 |
| Egress cost | $0.00 | $0.00 | $0.00 | $0.00 |
| Total cost per month | $0.00 | ~$0.05 | ~$2.30 | ~$26.36 |

  • For storage, up to 10 GB is free.

  • For queries, up to 1 TB per month is free.

  • For egress, same-region transfer is free.

We default to Scenario 1 during the export setup to maximize the value of Google’s free tier and eliminate network egress fees.
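You can confirm that a given statement stays within the free query tier by dry-running it: bq reports the bytes that would be processed without billing anything. A sketch; the table name follows GCP's standard export naming (`gcp_billing_export_v1_<billing account ID>`) and is illustrative:

```shell
# Estimate bytes processed without actually running (or paying for) the query.
bq query --dry_run --use_legacy_sql=false '
SELECT service.description, SUM(cost) AS total_cost
FROM `YOUR_BILLING_PROJECT.billing_export.gcp_billing_export_v1_XXXXXX`
GROUP BY 1'
```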

Scenario 2: Different Region (common with existing billing project)

(Example: Customer's BigQuery in asia-south1 (Mumbai), OneLens processing in US Multi-region)

| Component | $5K/month spend | $50K/month spend | $500K/month spend | $5M/month spend |
| --- | --- | --- | --- | --- |
| Storage size | 1.25 GB | 12.5 GB | 125 GB | 1.25 TB |
| Storage cost | $0.00 | ~$0.06 | ~$2.65 | ~$28.52 |
| Query cost | $0.00 | $0.00 | $0.00 | ~$1.56 |
| Egress cost | ~$0.10 | ~$1.00 | ~$10.00 | ~$100.00 |
| Total cost per month | ~$0.10 | ~$1.06 | ~$12.65 | ~$130.08 |

  • For storage, up to 10 GB is free.

  • For queries, up to 1 TB per month is free.

  • For egress, cross-region transfer is billed per GB, which accounts for the egress costs above.
