
Exporting Logs to Google BigQuery

Connecting Patronum Logs to Google BigQuery

This guide explains step by step how to connect Patronum logs to Google BigQuery using a Google Cloud Service Account and BigQuery integration settings inside Patronum.


You must be using the paid version of Google BigQuery; the Sandbox and Trial versions do not support data streaming.


Before you begin: Your Google Cloud Project must have billing enabled and Sandbox mode disabled. BigQuery Sandbox does not support streaming inserts, which will cause Patronum log exports to fail silently. Additionally, the Patronum service account requires project-level permissions to create datasets and tables automatically.


1. Create a Google Cloud Project

  1. Go to the Google Cloud Console.
  2. Click the Project Dropdown (top navigation bar).
  3. Select an existing project or click New Project.
  4. Enter a project name (e.g., "Patronum Logs Project") and click Create.


2. Enable the BigQuery API

Before using BigQuery, you need to enable the API.

  1. In the Google Cloud Console, click the search bar at the top.
  2. Type BigQuery API.
  3. Click on it and then click Enable.
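
If you prefer the command line, steps 1 and 2 can also be done with the gcloud CLI. This is a sketch — the project ID below is an example and must be globally unique:

```shell
# Create the project (the ID is an example; choose your own):
gcloud projects create patronum-logs-project --name="Patronum Logs Project"

# Enable the BigQuery API on it:
gcloud services enable bigquery.googleapis.com --project=patronum-logs-project
```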






3. Create a Service Account

  1. Go to IAM & Admin → Service Accounts.
  2. Click + Create Service Account.
  3. Provide a name (e.g., patronum-logs-sa) and description.
  4. Click Create and Continue.


4. Assign Roles to the Service Account

Assign project-level roles so Patronum can create datasets and insert logs into BigQuery.


These roles must be assigned at the project level, not just to a specific dataset. Patronum needs permissions to create datasets and tables automatically.


  • BigQuery Data Editor — Required for creating tables and inserting data.
  • BigQuery Job User — Required for running streaming insert jobs.

Select both roles and click Continue → then click Done.
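If you prefer the command line, steps 3 and 4 can also be done with the gcloud CLI. A sketch, assuming the example service account name above and a placeholder PROJECT_ID:

```shell
# Create the service account (name matches the example above):
gcloud iam service-accounts create patronum-logs-sa \
  --display-name="Patronum Logs" --project=PROJECT_ID

# Grant both required roles at the project level:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:patronum-logs-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:patronum-logs-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
```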


5. Generate and Download the Service Account Key (JSON File)

  1. Open the Service Account you just created.
  2. Go to the Keys tab.
  3. Click Add Key → Create New Key.
  4. Select JSON as the key type.
  5. Click Create → a JSON file will download automatically.

Important:

  • This is the only time you can download the key.
  • If lost, you must create a new one.
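
The same key can be generated from the command line. A sketch, assuming the example service account name above and a placeholder PROJECT_ID; the key is written to key.json:

```shell
# Generate a new JSON key for the service account:
gcloud iam service-accounts keys create key.json \
  --iam-account=patronum-logs-sa@PROJECT_ID.iam.gserviceaccount.com
```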


6. Secure the JSON Key File

  • Store it securely (environment variables, encrypted storage, secret manager).
  • Never upload it to GitHub, email, or public locations.
  • Delete unused keys and rotate keys regularly.
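A minimal sketch of the local hardening steps above, assuming the key was saved as key.json. The Secret Manager command requires that API to be enabled on your project, and the secret name patronum-bq-key is just an example:

```shell
# Restrict file permissions so only your user can read the key:
chmod 600 key.json

# Optionally store the key in Google Secret Manager instead of on disk:
gcloud secrets create patronum-bq-key --data-file=key.json
```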


7. Configure Patronum with BigQuery

Now that you have the JSON key, connect Patronum to BigQuery:

  1. Log in to the Patronum App.
  2. Click Settings (left menu).
  3. Go to Integrations.
  4. Expand the Exporting Logs dropdown.







  5. In the Action column, enable BigQuery by clicking the radio button.
  6. In the Configure column, click the Settings (⚙️) icon.
  7. Upload the JSON file (Service Account Key).
  8. Click Next.
  9. Enter a Dataset ID:
    • If this is your first time, create one by typing a new Dataset ID.
  10. Click Next.
  11. Enter a Table ID (the table where Patronum logs will be stored).
  12. Click Save.

Patronum is now connected to Google BigQuery.


8. Troubleshooting Dataset Creation Errors

If you encounter errors when Patronum attempts to create the BigQuery dataset, check the following:

Error: Dataset creation fails silently

Cause: Your Google Cloud Project is in Sandbox mode (no billing enabled).

Solution: Enable billing on your Google Cloud Project. Go to Billing in the Google Cloud Console and link a valid billing account. BigQuery Sandbox does not support streaming inserts.

Error: Permission denied or "Access Denied"

Cause: The service account lacks the required project-level permissions.

Solution: Ensure the service account has both BigQuery Data Editor and BigQuery Job User roles assigned at the project level (not just dataset level). Go to IAM & Admin → IAM, find the service account, and verify the roles.
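
You can also verify the roles from the command line. A sketch using gcloud, with PROJECT_ID and the service account email as placeholders:

```shell
# List the roles granted to the service account at the project level:
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:patronum-logs-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```

Both roles/bigquery.dataEditor and roles/bigquery.jobUser should appear in the output.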

Table remains empty after 24 hours

Cause: No Patronum activity has occurred to generate logs, or the table was manually created with an incompatible schema.

Solution:

  • Delete any manually created datasets/tables and let Patronum auto-create them.
  • Trigger Patronum activity (e.g., user onboarding, policy updates) to generate log entries.
  • Verify that the integration is enabled in Patronum Settings → Integrations → Exporting Logs.


9. (Optional) Rotate Keys and Manage Security

  • Remove old keys when not in use.
  • Assign only the roles needed (principle of least privilege).
  • Monitor BigQuery logs for security and usage.


10. Where to See Your Logs in Google Cloud

  1. Go to Google Cloud Console.
  2. Open the Navigation Menu → BigQuery.
  3. Expand your Project.
  4. Select your Dataset ID.
  5. Click on your Table ID.
  6. Open the Table Explorer View.
  7. Here you can view your Patronum logs (on the Preview tab).






11. Querying Your Logs

You can use SQL queries in BigQuery to analyze your Patronum logs.

For example:

SELECT *
FROM `project_name.datasetID.tableId`
LIMIT 50;

This query fetches 50 rows from your Patronum logs. Without an ORDER BY clause, BigQuery does not return rows in any guaranteed order, so add one (for example, on a timestamp column) if you want the most recent events.
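
The same query can also be run from a terminal with the bq command-line tool that ships with the Google Cloud SDK — useful for quickly confirming that logs are arriving. A sketch; substitute your own project, dataset, and table IDs:

```shell
# Count the rows in the Patronum logs table:
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS log_count
   FROM `project_name.datasetID.tableId`'
```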












Updated on: 23/02/2026
