Exporting Logs to Google BigQuery
This guide explains, step by step, how to connect Patronum logs to Google BigQuery using a Google Cloud service account and the BigQuery integration settings inside Patronum.
1. Create a Google Cloud Project
- Go to the Google Cloud Console.
- Click the Project Dropdown (top navigation bar).
- Select an existing project or click New Project.
- Enter a project name (e.g., "Patronum Logs Project") and click Create.
2. Enable the BigQuery API
Before using BigQuery, you need to enable the API.
- In the Google Cloud Console, use the search bar at the top.
- Type "BigQuery API" and open the result.
- Click Enable.

3. Create a Service Account
- Go to IAM & Admin → Service Accounts.
- Click + Create Service Account.
- Provide a name (e.g., patronum-logs-sa) and a description.
- Click Create and Continue.
4. Assign Roles to the Service Account
Assign project-level roles so Patronum can create datasets and insert logs into BigQuery.
- BigQuery Data Editor — Required for creating tables and inserting data.
- BigQuery Job User — Required for running streaming insert jobs.
Select both roles, click Continue, and then click Done.
5. Generate and Download the Service Account Key (JSON File)
- Open the Service Account you just created.
- Go to the Keys tab.
- Click Add Key → Create New Key.
- Select JSON as the key type.
- Click Create → a JSON file will download automatically.
Important:
- This is the only time you can download this key.
- If it is lost, you must create a new one.
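The downloaded key is a JSON file with a fixed set of fields. As a sanity check before uploading it to Patronum, you can verify its structure locally. The following is a minimal sketch; the dummy key below is illustrative only, not a real credential:

```python
import json
import tempfile

# Fields present in every Google service account key file.
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "token_uri",
}

def validate_key_file(path):
    """Return a list of required fields missing from a key file."""
    with open(path) as f:
        key = json.load(f)
    missing = sorted(REQUIRED_FIELDS - key.keys())
    if key.get("type") != "service_account":
        missing.append("type=service_account")
    return missing

# Example with a dummy key structure (real values come from the downloaded file).
dummy = {
    "type": "service_account",
    "project_id": "patronum-logs-project",
    "private_key_id": "abc123",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "patronum-logs-sa@patronum-logs-project.iam.gserviceaccount.com",
    "token_uri": "https://oauth2.googleapis.com/token",
}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(dummy, f)
print(validate_key_file(f.name))  # [] means the file looks structurally valid
```

An empty list means every expected field is present; a non-empty list names what is missing, which usually indicates a truncated or wrong file was selected.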
6. Secure the JSON Key File
- Store it securely (environment variables, encrypted storage, secret manager).
- Never upload it to GitHub, email, or public locations.
- Delete unused keys and rotate keys regularly.
7. Configure Patronum with BigQuery
Now that you have the JSON key, connect Patronum to BigQuery:
- Login to Patronum App.
- Click Settings (left menu).
- Go to Integrations.
- Expand the Exporting Logs dropdown.

- In the Action column, enable BigQuery by clicking the radio button.
- In the Configure column, click the Settings (⚙️) icon.
- Upload the JSON file (Service Account Key).
- Click Next.
- Enter a Dataset ID:
- If this is your first time, create one by typing a new Dataset ID; Patronum will create the dataset in your project.

- Click Next.
- Enter a Table ID (the table where Patronum logs will be stored).

- Click Save.
Patronum is now connected to Google BigQuery.
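When choosing the Dataset ID and Table ID above, note that BigQuery dataset IDs may contain only letters, numbers, and underscores. The sketch below sanity-checks an ID before you enter it in Patronum; it applies the conservative dataset rule to table IDs as well (table IDs also accept some additional Unicode characters):

```python
import re

# BigQuery dataset IDs may contain only letters, numbers, and underscores,
# up to 1024 characters. The same safe subset works for table IDs.
ID_PATTERN = re.compile(r"^[A-Za-z0-9_]{1,1024}$")

def check_id(name):
    """Return True if the ID uses only the safe BigQuery character set."""
    return bool(ID_PATTERN.match(name))

print(check_id("patronum_logs"))   # True  - valid Dataset/Table ID
print(check_id("patronum-logs"))   # False - hyphens are not allowed
print(check_id("patronum logs"))   # False - spaces are not allowed
```

Hyphens and spaces are the most common cause of a rejected Dataset ID; use underscores instead.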
8. Troubleshooting Dataset Creation Errors
If you encounter errors when Patronum attempts to create the BigQuery dataset, check the following:
Error: Dataset creation fails silently
Cause: Your Google Cloud Project is in Sandbox mode (no billing enabled).
Solution: Enable billing on your Google Cloud Project. Go to Billing in the Google Cloud Console and link a valid billing account. BigQuery Sandbox does not support streaming inserts.
Error: Permission denied or "Access Denied"
Cause: The service account lacks the required project-level permissions.
Solution: Ensure the service account has both BigQuery Data Editor and BigQuery Job User roles assigned at the project level (not just dataset level). Go to IAM & Admin → IAM, find the service account, and verify the roles.
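To compare quickly, you can copy the roles granted to the service account (from the IAM page) and check them against the two required role IDs. A minimal sketch, using the standard role identifiers for BigQuery Data Editor and BigQuery Job User:

```python
# The two project-level roles Patronum needs (from step 4).
REQUIRED_ROLES = {
    "roles/bigquery.dataEditor",
    "roles/bigquery.jobUser",
}

def missing_roles(granted):
    """Given the roles granted to the service account, return any still missing."""
    return sorted(REQUIRED_ROLES - set(granted))

# Example: the account only has Data Editor, so Job User is reported missing.
print(missing_roles(["roles/bigquery.dataEditor"]))
# ['roles/bigquery.jobUser']
```

If the returned list is non-empty, add the missing role(s) to the service account at the project level and retry the export.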
Error: Table remains empty after 24 hours
Cause: No Patronum activity has occurred to generate logs, or the table was manually created with an incompatible schema.
Solution:
- Delete any manually created datasets/tables and let Patronum auto-create them.
- Trigger Patronum activity (e.g., user onboarding, policy updates) to generate log entries.
- Verify that the integration is enabled in Patronum Settings → Integrations → Exporting Logs.
9. (Optional) Rotate Keys and Manage Security
- Remove old keys when not in use.
- Assign only the roles needed (principle of least privilege).
- Monitor BigQuery logs for security and usage.
10. Where to See Your Logs in Google Cloud
- Go to Google Cloud Console.
- Open the Navigation Menu → BigQuery.
- Expand your Project.
- Select your Dataset ID.
- Click on your Table ID.
- Open the Table Explorer View.
- Here you can view your Patronum logs (on the Preview tab).

11. Querying Your Logs
You can use SQL queries in BigQuery to analyze your Patronum logs.
For example:
SELECT *
FROM `project_name.dataset_id.table_id`
LIMIT 50;
This query fetches up to 50 rows from your Patronum logs. Replace project_name, dataset_id, and table_id with your own values; to see the most recent events first, add an ORDER BY clause on your table's timestamp column.
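If your project ID contains hyphens (many do), the fully qualified table name must be wrapped in backticks for the query to parse. A small helper to assemble the query string, assuming placeholder project, dataset, and table names:

```python
def build_logs_query(project, dataset, table, limit=50):
    """Build a BigQuery Standard SQL query for the Patronum log table.

    Backticks around the fully qualified name keep the query valid even
    when the project ID contains hyphens.
    """
    return (
        f"SELECT *\n"
        f"FROM `{project}.{dataset}.{table}`\n"
        f"LIMIT {int(limit)};"
    )

print(build_logs_query("my-project", "patronum_logs", "audit_events"))
```

You can paste the resulting string into the BigQuery console's query editor, or run it programmatically with the google-cloud-bigquery client (e.g., `bigquery.Client().query(sql)`), authenticated with the same service account key.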
Updated on: 23/02/2026
Thank you!
