# Frequently Asked Questions (FAQ)

## What is the high-level architecture of this integration?

The integration utilizes **GCP Billing Exports to BigQuery**.

* **Data Flow:** GCP exports detailed billing and pricing data into a BigQuery dataset (`billing_export`) within a dedicated project in your organization. OneLens queries this BigQuery dataset securely to generate insights.
* **Resource Impact:** Minimal. The integration runs read-only queries against your billing data and reads metadata from your projects. It does not deploy agents to your VMs or clusters.
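
As a concrete illustration of the data flow, a read-only query against the export dataset might look like the sketch below. The project, dataset, and table-name suffix are placeholders (the standard export table is named `gcp_billing_export_v1_` followed by your billing account ID), not your actual identifiers:

```sql
-- Illustrative: monthly cost by service from the standard billing export table.
SELECT
  service.description AS service,
  SUM(cost) AS total_cost
FROM `billing_project.billing_export.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE invoice.month = '202501'
GROUP BY service
ORDER BY total_cost DESC;
```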

## What GCP scopes do you support for onboarding?

We support onboarding at the **Organization**, **Folder**, and **Project** levels.

* **Automation Support:** Our Terraform wrapper script (`deploy.sh`) accepts a list of specific `Target Folder IDs` or `Target Project IDs`.
* **Default Behavior:** If no specific targets are provided, the script defaults to onboarding the entire Organization (all active projects).

## How is authentication handled?

We do **not** require (and do not recommend) sharing long-lived Service Account JSON keys.

* **Method:** We utilize **Service Account Impersonation**.
* **Mechanism:** You grant the role `Service Account Token Creator` to our external identity (`onelens-customer-sa@astuto-prod-mum.iam.gserviceaccount.com`). This allows our platform to generate short-lived credentials to access *only* the specific `OneLens Reader SA` service account you create in your environment.
* **Benefit:** This is a zero-trust aligned pattern that eliminates the risk of key leakage and allows you to revoke access instantly by removing the IAM binding.
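
As a sketch (assuming the reader service account lives in a project referred to here as `YOUR_PROJECT_ID`), the grant and the instant revocation look like this:

```bash
# Grant: allow the OneLens external identity to mint short-lived tokens
# for the reader SA only.
gcloud iam service-accounts add-iam-policy-binding \
  onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com \
  --member="serviceAccount:onelens-customer-sa@astuto-prod-mum.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"

# Revoke: removing the same binding cuts off access immediately.
gcloud iam service-accounts remove-iam-policy-binding \
  onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com \
  --member="serviceAccount:onelens-customer-sa@astuto-prod-mum.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```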

## What specific permissions does OneLens require?

We adhere to a read-only posture using a dedicated Service Account (`onelens-reader-sa`).

* **Billing Access:** `BigQuery Data Viewer` and `BigQuery Job User` on the billing project to read cost data.
* **Resource Access:** Detailed "Viewer" roles on the target projects (e.g., `Compute Viewer`, `Kubernetes Engine Viewer`, `Vertex AI Viewer`) to map costs to resources and generate rightsizing recommendations.
* **Organization Access:** `Organization Viewer` and `Billing Account Viewer` to visualize hierarchy and subscription mapping.

| IAM Role                        | Scope            | Assignee                       | Purpose                                            |
| ------------------------------- | ---------------- | ------------------------------ | -------------------------------------------------- |
| Organization Viewer             | Organization     | Service Account, External user | Read organization hierarchy.                       |
| Billing Account Viewer          | Billing account  | Service Account, External user | Read billing account metadata.                     |
| BigQuery Data Viewer            | Billing project  | Service Account, External user | Read data from BigQuery export dataset.            |
| BigQuery Job User               | Billing project  | Service Account, External user | Run queries on the billing data.                   |
| Service Viewer roles            | Target scope     | Service Account                | Read metadata for services like Compute, GKE, etc. |
| Viewer                          | Target scope     | External user                  | Read-only access to console.                       |
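
For example, one of the granular service-viewer bindings from the table above could be applied manually as follows (project IDs are placeholders; the Terraform script normally applies these for you):

```bash
# Bind a single granular viewer role on a target project.
gcloud projects add-iam-policy-binding TARGET_PROJECT_ID \
  --member="serviceAccount:onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/compute.viewer"
```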

## Which APIs need to be enabled?

To provide accurate recommendations and map costs effectively, the following 15 APIs must be enabled on all target projects:

1. `aiplatform.googleapis.com` (Vertex AI API)
2. `cloudfunctions.googleapis.com` (Cloud Functions API)
3. `sqladmin.googleapis.com` (Cloud SQL Admin API)
4. `compute.googleapis.com` (Compute Engine API)
5. `container.googleapis.com` (Kubernetes Engine API)
6. `dataflow.googleapis.com` (Dataflow API)
7. `dataproc.googleapis.com` (Cloud Dataproc API)
8. `file.googleapis.com` (Cloud Filestore API)
9. `monitoring.googleapis.com` (Cloud Monitoring API)
10. `networkmanagement.googleapis.com` (Network Management API)
11. `recommender.googleapis.com` (Recommender API)
12. `redis.googleapis.com` (Google Cloud Memorystore for Redis API)
13. `serviceusage.googleapis.com` (Service Usage API)
14. `cloudasset.googleapis.com` (Cloud Asset API)
15. `bigquery.googleapis.com` (BigQuery API)
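
If you prefer to enable these manually rather than via the script, a single `gcloud` invocation per project covers the full list (`TARGET_PROJECT_ID` is a placeholder):

```bash
gcloud services enable \
  aiplatform.googleapis.com cloudfunctions.googleapis.com sqladmin.googleapis.com \
  compute.googleapis.com container.googleapis.com dataflow.googleapis.com \
  dataproc.googleapis.com file.googleapis.com monitoring.googleapis.com \
  networkmanagement.googleapis.com recommender.googleapis.com redis.googleapis.com \
  serviceusage.googleapis.com cloudasset.googleapis.com bigquery.googleapis.com \
  --project=TARGET_PROJECT_ID
```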

## Is there any cost for enabling all these APIs?

No.

* **Explanation:** Enabling an API (like `compute.googleapis.com`) acts as a "gateway" allowing interaction with the service. It does not incur a fee by itself. Costs are incurred when you *provision resources* (like running a VM) or make high-volume API calls (Data Plane operations).
* **OneLens Usage:** Our platform performs low-volume "metadata read" operations (e.g., "List Instances", "Get Disk Type") which typically fall well within the Google Cloud Free Tier limits for API requests. You are not charged simply for having the API enabled in your projects.

## Can we automate this setup?

Yes. We provide a pre-packaged Terraform module wrapper (`deploy.sh`).

* **Workflow:** You upload the provided `onelens-gcp-onboarding` folder to your Google Cloud Shell.
* **Script Actions:** The script automatically enables the 15 required APIs, creates the dedicated billing project (`astuto-<company>-billing`), configures the BigQuery dataset, and applies the necessary IAM bindings.
* **Prerequisites:** The user running the script needs the `Organization Administrator`, `Billing Account Administrator`, and `Service Usage Admin` roles.

## Why does OneLens need "Viewer" role for the External User?

The "Viewer" role (`roles/viewer`) is assigned to the **External User** (`onelens.finops@astuto.ai`), which represents our support/engineering team, *not* the automated platform.

* **Reason:** This facilitates rapid troubleshooting of permission issues or data discrepancies during the onboarding phase without requiring back-and-forth granular permission grants.
* **Least Privilege:** The *automated platform* (the Service Account) uses strictly granular roles (e.g., `roles/compute.viewer`, `roles/cloudsql.viewer`) defined in the Terraform/Manual guide, ensuring your daily data ingestion is scoped tightly.

## What if we have a third-party billing partner (CSP) and do not have access to enable the BigQuery cost export?

This is a common scenario with billing CSP partners.

* **Action:** You cannot create the export yourself if you do not own the Billing Account. You must request the vendor to **share** the existing BigQuery dataset containing your billing data.
* **Procedure:** Ask your partner to grant `BigQuery Data Viewer` and `BigQuery Job User` roles on *their* export dataset to the `OneLens Reader SA` service account email we provide during onboarding. You will then point OneLens to that Partner Project/Dataset ID instead of creating a new one.
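
Assuming the partner manages the dataset, the dataset-level grant can be applied with BigQuery's SQL DCL as sketched below (project, dataset, and service account email are placeholders). Note that `BigQuery Job User` is a project-level role and must be granted via IAM on the partner's project rather than on the dataset itself:

```sql
-- Run by the partner in their billing export project.
GRANT `roles/bigquery.dataViewer`
ON SCHEMA `partner_project.billing_export`
TO "serviceAccount:onelens-reader-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com";
```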

## Why must the BigQuery Dataset be in the "US (multiple regions)" location?

This is a critical GCP constraint for **historical data**.

* **Reason:** When you enable GCP Billing Exports, setting the dataset location to Multi-region US allows GCP to potentially back-fill billing data from the **start of the previous month**.
* **Impact:** If you select a different region, the export will likely only contain data starting from the *moment* you enable it, resulting in a gap in your initial reporting.
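
For reference, creating the dataset in the US multi-region from the command line looks like this sketch (`BILLING_PROJECT_ID` is a placeholder):

```bash
bq --location=US mk --dataset BILLING_PROJECT_ID:billing_export
```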

## Why is "Table Expiry" disabled on the dataset?

By default, BigQuery may set tables to expire (auto-delete) after 60 days.

* **Requirement:** We explicitly require this to be **unchecked (disabled)**.
* **Reason:** FinOps requires long-term trend analysis (year-over-year, month-over-month). If the data partitions are auto-deleted, we lose the historical audit trail required for forecasting and anomaly detection.
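
If an expiration has already been set, it can be cleared from the command line; setting the default table expiration to `0` removes it (`BILLING_PROJECT_ID` is a placeholder):

```bash
bq update --default_table_expiration 0 BILLING_PROJECT_ID:billing_export
```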

## I already have a billing export. Do I need to create a new one?

You can reuse an existing billing export, but we strongly recommend a dedicated setup.

* **Existing Project:** Our script supports an input `Existing Billing Export Project ID`.
* **Recommendation:** Creating a dedicated `OneLens Billing Project` isolates our access. It ensures we never query unrelated tables and prevents our `BigQuery Job User` queries from consuming quota or adding noise in your production analytics environment.

## Why do you require BigQuery roles (`Job User`, `Metadata Viewer`) on the *target* projects?

This is separate from the *billing export* access.

* **Reason:** Many organizations run significant BigQuery workloads (queries) within their application projects. These roles allow OneLens to analyze the query history, slot usage, and job performance in those specific projects.
* **Benefit:** This enables us to perform BigQuery optimization, which often uncovers substantial savings by identifying inefficient queries and underutilized slot commitments. This is purely for workload optimization, not for reading the data *inside* your application tables.
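
To make this concrete, the kind of job-history analysis these roles unlock can be sketched with an `INFORMATION_SCHEMA` query like the following (the region qualifier and limits are illustrative):

```sql
-- Illustrative: most expensive query jobs in a target project, last 7 days.
SELECT
  user_email,
  job_id,
  total_bytes_billed,
  total_slot_ms
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 20;
```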

## What is the cost incurred for this setup?

The analysis below provides approximate monthly costs for two scenarios:

#### Scenario 1: Same Region (our standard setup)

*(Example: Customer's BigQuery in US Multi-region, OneLens processing in US Multi-region)*

<table><thead><tr><th>Component</th><th width="149.5">$5K/month spend</th><th width="164">$50K/month spend</th><th width="167.5">$500K/month spend</th><th width="159.5">$5M/month spend</th></tr></thead><tbody><tr><td>Storage size</td><td>1.25 GB</td><td>12.5 GB</td><td>125 GB</td><td>1.25 TB</td></tr><tr><td>Storage cost</td><td>$0.00</td><td>~$0.05</td><td>~$2.30</td><td>~$24.80</td></tr><tr><td>Query cost</td><td>$0.00</td><td>$0.00</td><td>$0.00</td><td>~$1.56</td></tr><tr><td>Egress cost</td><td>$0.00</td><td>$0.00</td><td>$0.00</td><td>$0.00</td></tr><tr><td><strong>Total cost per month</strong></td><td><strong>$0.00</strong></td><td><strong>~$0.05</strong></td><td><strong>~$2.30</strong></td><td><strong>~$26.36</strong></td></tr></tbody></table>

* For storage, the first 10 GB per month is free.
* For queries, the first 1 TB processed per month is free.
* For egress, same-region transfer is free.

We default to **Scenario 1** during the export setup to maximize the value of Google’s free tier and eliminate network egress fees.
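
The figures above can be reproduced, to a rough approximation, from BigQuery's on-demand prices (about $0.02 per GB-month of active storage after the 10 GB free tier, and about $6.25 per TB queried after the 1 TB free tier; these rates are illustrative and should be checked against the current pricing page):

```python
# Rough reproduction of the Scenario 1 estimates above.
# Assumed prices (illustrative): ~$0.02/GB-month active storage after a
# 10 GB free tier; ~$6.25 per TB queried after a 1 TB free tier.

STORAGE_PRICE_PER_GB = 0.02
QUERY_PRICE_PER_TB = 6.25

def storage_cost(gb: float) -> float:
    """Monthly active-storage cost in USD after the free tier."""
    return max(0.0, gb - 10) * STORAGE_PRICE_PER_GB

def query_cost(tb_scanned: float) -> float:
    """Monthly on-demand query cost in USD after the free tier."""
    return max(0.0, tb_scanned - 1) * QUERY_PRICE_PER_TB

# $500K/month spend: 125 GB stored; $5M/month: 1,250 GB stored.
print(round(storage_cost(125), 2), round(storage_cost(1250), 2))
# $5M/month spend: ~1.25 TB scanned per month.
print(round(query_cost(1.25), 2))
```

These are list prices and free tiers at the time of writing; confirm against the current BigQuery pricing page before relying on the exact numbers.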

#### Scenario 2: Different Region (common with existing billing project)

*(Example: Customer's BigQuery in asia-south1 (Mumbai), OneLens processing in US Multi-region)*

<table><thead><tr><th>Component</th><th width="149.5">$5K/month spend</th><th width="164">$50K/month spend</th><th width="167.5">$500K/month spend</th><th width="159.5">$5M/month spend</th></tr></thead><tbody><tr><td>Storage size</td><td>1.25 GB</td><td>12.5 GB</td><td>125 GB</td><td>1.25 TB</td></tr><tr><td>Storage cost</td><td>$0.00</td><td>~$0.06</td><td>~$2.65</td><td>~$28.52</td></tr><tr><td>Query cost</td><td>$0.00</td><td>$0.00</td><td>$0.00</td><td>~$1.56</td></tr><tr><td>Egress cost</td><td>~$0.10</td><td>~$1.00</td><td>~$10.00</td><td>~$100.00</td></tr><tr><td><strong>Total cost per month</strong></td><td><strong>~$0.10</strong></td><td><strong>~$1.06</strong></td><td><strong>~$12.65</strong></td><td><strong>~$130.08</strong></td></tr></tbody></table>

* For storage, the first 10 GB per month is free.
* For queries, the first 1 TB processed per month is free.
* For egress, cross-region transfers are billed at network rates; the same-region free tier does not apply here.
