Automate Pinterest Ads to BigQuery or Snowflake

Centralize Pinterest Ads Data in Your Warehouse

Build a robust, automated pipeline for Pinterest Ads data in BigQuery or Snowflake. See how Keboola delivers unified analytics, automation, and fast setup—no code required.
Try Keboola Now

Why and How to Move Pinterest Ads Data Into BigQuery or Snowflake With Keboola

Why Centralize Pinterest Ads Data?

  • [analytics]Unified Marketing Analytics: Combining Pinterest Ads data with spend, CRM, commerce, and web analytics enables accurate ROAS, CAC, LTV, attribution, and channel pacing. Centralizing in a data warehouse lets you analyze cross-channel performance and make data-driven decisions.
  • [data-source]Full Fidelity & History: Retain raw, daily-grain Pinterest facts. This allows you to re-run attributions, audit changes, and correct late conversions without data loss.
  • [automation]Governed Automation & Observability: Keboola's Flows orchestrate data extraction, transformation, and loading with scheduling, notifications, run history, and versioning. No fragile custom scripts required.
  • [fast]Fast Build with AI Assistance: Keboola MCP scaffolds your entire pipeline from natural language prompts, so you can deploy in minutes and tailor as needed.

Architecture Overview

  1. Extract: Use Keboola’s Pinterest Ads extractor to pull data from one or multiple accounts. Customize reports by time range, granularity, conversion windows, and report basis.
  2. Transform (Optional): Apply SQL (or dbt) transformations to validate types, deduplicate records, and model warehouse-friendly facts and dimensions.
  3. Load: Write data to Google BigQuery or Snowflake with support for full or incremental loads. Snowflake offers native UPSERTs; BigQuery supports append, with MERGE transformations for deduplication.
  4. Orchestrate: Automate and monitor all steps with Keboola Flows, including scheduling and notifications for success or failure.

[2col]Prerequisites

  • [company]Pinterest: Business account and access to desired ad accounts. Understand your attribution windows and reporting basis.
  • [tool]Keboola: Project access with permissions to create components and Flows. For fast setup, connect to Keboola MCP server.
  • [database]BigQuery: Service account with Data Editor and Job User roles, and a JSON key for writer configuration.
  • [database]Snowflake: Access to your own Snowflake or a Keboola-provisioned instance. Set up a dedicated writer user/role with required privileges.
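For the BigQuery prerequisite, the service account and key can be provisioned from the command line. A minimal sketch using the gcloud CLI, assuming you are authenticated and `my-gcp-project` stands in for your real project ID; the account name `keboola-writer` is illustrative:

```shell
# Hypothetical project ID -- replace with your own.
PROJECT_ID=my-gcp-project

# Create the service account for the Keboola BigQuery writer.
gcloud iam service-accounts create keboola-writer \
  --project="$PROJECT_ID" \
  --display-name="Keboola BigQuery writer"

# Grant the two roles the writer needs: Data Editor and Job User.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:keboola-writer@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:keboola-writer@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Download the JSON key to paste into the Keboola writer configuration.
gcloud iam service-accounts keys create keboola-writer-key.json \
  --iam-account="keboola-writer@${PROJECT_ID}.iam.gserviceaccount.com"
```

Treat the downloaded key like a password: store it only in the Keboola writer configuration and delete the local copy.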

Step 1: Extract Pinterest Ads Data

  1. Configure the Pinterest Ads Extractor: In Keboola, add and authorize the extractor with your Pinterest Business credentials. Create extractor rows for each dataset (each row becomes a table in Storage).
  2. Define Your Report:
  • Option A - Build in Keboola: Select accounts, time range (absolute or relative), granularity (Advertiser, Campaign, Ad Group, Ad), conversion window, time basis, destination table, and load type (Full or Incremental).
  • Option B - Use Pinterest Report Template: Create a report template in Pinterest Ads Manager, retrieve the template ID, and reference it in Keboola. Set overrides as needed.
  • Tip: Use a rolling 30–35 day window to capture late conversions and avoid data loss. Stagger extraction for large backfills to respect Pinterest’s rate limits.
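Because a rolling 30–35 day window re-extracts days that earlier runs already loaded, deduplicate on replay before modeling. A minimal BigQuery SQL sketch, assuming a staging table `stg_pinterest_campaign_daily` with a load-timestamp column `_loaded_at` (both names illustrative):

```sql
-- Keep only the most recently loaded row per campaign-day.
CREATE OR REPLACE TABLE stage.pinterest_campaign_daily_dedup AS
SELECT * EXCEPT (rn)
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (
      PARTITION BY campaign_id, report_date
      ORDER BY _loaded_at DESC
    ) AS rn
  FROM stage.stg_pinterest_campaign_daily
)
WHERE rn = 1;
```

The same pattern works in Snowflake with `QUALIFY rn = 1` in place of the outer filter.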

Step 2: Transform Your Data (Recommended)

  • [sql]Use SQL transformations in Keboola to normalize types, deduplicate, and model star schema (facts and dimensions).
  • [data-source]Suggested Models:
  • dimension tables: dim_ad_account, dim_campaign, dim_ad_group, dim_ad
  • fact tables: fct_ad_daily, fct_campaign_daily
  • grain: ad_id × date or campaign_id × date
  • [analytics]Example Transformation:
CREATE OR REPLACE TABLE WORK.fct_campaign_daily AS
SELECT
  DATE(report_date)             AS dt,
  CAST(advertiser_id AS STRING) AS advertiser_id,
  CAST(campaign_id AS STRING)   AS campaign_id,
  SUM(impressions)              AS impressions,
  SUM(clicks)                   AS clicks,
  SUM(spend)                    AS spend,
  SUM(conversions)              AS conversions,
  SAFE_DIVIDE(SUM(conversions), SUM(clicks)) AS conv_rate
FROM STAGE.pinterest_campaign_report
GROUP BY 1, 2, 3;
  • [clean]Normalize numeric fields, cast dates/times, and handle schema drift proactively.
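The dimension tables can be derived from the same staging report. A sketch for dim_campaign, assuming the extractor output carries campaign attributes such as `campaign_name` and `campaign_status` (column names illustrative; adjust to your report):

```sql
-- dim_campaign: one row per campaign, with latest known attributes
-- and the date range over which the campaign was observed.
CREATE OR REPLACE TABLE WORK.dim_campaign AS
SELECT
  CAST(campaign_id AS STRING)   AS campaign_id,
  CAST(advertiser_id AS STRING) AS advertiser_id,
  ANY_VALUE(campaign_name)      AS campaign_name,
  ANY_VALUE(campaign_status)    AS campaign_status,
  MIN(DATE(report_date))        AS first_seen_date,
  MAX(DATE(report_date))        AS last_seen_date
FROM STAGE.pinterest_campaign_report
GROUP BY 1, 2;
```

Joining facts to this dimension on campaign_id keeps reporting queries free of repeated attribute columns.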

Step 3: Load Data into BigQuery or Snowflake

Google BigQuery Writer

  1. Paste your service account JSON in Keboola Writer. Select the dataset.
  2. Map Storage tables to destination tables.
  3. Select load type: Full (overwrite) or Incremental (append).
  4. For deduplication, use a MERGE transformation from staging into analytics tables.
MERGE `analytics.pinterest_campaign_daily` T
USING `stage.pinterest_campaign_daily` S
  ON T.dt = S.dt AND T.campaign_id = S.campaign_id
WHEN MATCHED THEN UPDATE SET
  impressions = S.impressions,
  clicks      = S.clicks,
  spend       = S.spend,
  conversions = S.conversions
WHEN NOT MATCHED THEN INSERT
  (dt, campaign_id, impressions, clicks, spend, conversions)
  VALUES (S.dt, S.campaign_id, S.impressions, S.clicks, S.spend, S.conversions);

Snowflake Writer

  1. Set up a dedicated writer user/role with necessary privileges (see example below).
  2. Configure tables in Keboola Writer, choosing Full or Incremental mode. Set a Primary Key for UPSERTs in Incremental mode.
  3. Leverage Snowflake transformations as needed, and quote identifiers to preserve case.
CREATE ROLE WRITER_KBC;
CREATE DATABASE MARKETING;
CREATE SCHEMA MARKETING.PINTEREST;
GRANT USAGE ON DATABASE MARKETING TO ROLE WRITER_KBC;
GRANT USAGE ON SCHEMA MARKETING.PINTEREST TO ROLE WRITER_KBC;
GRANT ALL PRIVILEGES ON SCHEMA MARKETING.PINTEREST TO ROLE WRITER_KBC;
GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE WRITER_KBC;
CREATE USER KBC_WRITER
  PASSWORD=''  -- supply a strong password here
  DEFAULT_ROLE=WRITER_KBC
  DEFAULT_WAREHOUSE=ANALYTICS_WH
  DEFAULT_NAMESPACE=MARKETING.PINTEREST;
GRANT ROLE WRITER_KBC TO USER KBC_WRITER;
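Quoting matters because Snowflake folds unquoted identifiers to upper case, while quoted identifiers keep the exact case the writer created them with. A short illustration (table and column names hypothetical):

```sql
-- Unquoted names resolve case-insensitively (stored upper case);
-- quoted names must match the case Keboola wrote exactly.
SELECT "campaign_id", "spend"
FROM MARKETING.PINTEREST."pinterest_campaign_daily"
WHERE "dt" >= DATEADD(day, -35, CURRENT_DATE());
```

If your downstream tools expect unquoted, case-insensitive names, consider having transformations re-create tables with upper-case identifiers.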

Step 4: Orchestrate and Monitor with Keboola Flows

  • [flow]Create a Flow to run extractors, transformations, and writers in sequence.
  • [clock]Schedule daily refreshes (e.g., 07:00) and set up notifications for job success/failure via email or webhook.
  • [information]Track Flow versions for reproducibility and rollback.

Accelerate with Keboola MCP (Model Context Protocol)

Keboola MCP lets you scaffold pipelines from a natural language prompt. For example, prompt:

Create a Flow named Pinterest → BigQuery that extracts Pinterest Ads for accounts 123, 456 with granularity CAMPAIGN, time basis TIME_OF_CONVERSION, conversion window click 30d / view 1d, rolling window last 35 days; write Storage tables stg_pinterest_campaign_daily and stg_pinterest_ad_daily. Then add a BigQuery transformation to MERGE into analytics.pinterest_* tables, and add a BigQuery writer to dataset marketing. Schedule daily at 07:00 and notify me on failure. 

Swap to Snowflake by changing the writer and transformation steps.

Data Modeling Guidance

  • [database]Use composite keys (advertiser_id, campaign_id, ad_group_id, ad_id, date) for fact tables. Keep dimension tables for reference data.
  • [decision]Be consistent and explicit with attribution windows and time basis (ad action vs. conversion).
  • [clean]Cast all metrics and timestamps explicitly to avoid type drift.
  • [sql]For BigQuery, use MERGE for deduplication. For Snowflake, use UPSERTs with Primary Key in Incremental mode.
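Before relying on a composite key for MERGE or UPSERT, verify the key is actually unique in staging; duplicate key combinations make a MERGE non-deterministic. A quick check at the campaign-day grain (table name illustrative):

```sql
-- Any rows returned indicate duplicates that would break a
-- MERGE/UPSERT keyed on (campaign_id, dt).
SELECT campaign_id, dt, COUNT(*) AS row_count
FROM stage.pinterest_campaign_daily
GROUP BY campaign_id, dt
HAVING COUNT(*) > 1;
```

Run the same check with the full composite key (advertiser_id, campaign_id, ad_group_id, ad_id, date) for ad-level facts.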

Common Use Cases

  • [analytics]Cross-channel performance and budget allocation
  • [star]Creative analytics (variant performance, theme rollups)
  • [question]Attribution QA (align conversions with commerce/CRM)
  • [collaboration]Executive dashboards and alerts (daily refresh via Flows)
  • [tool]Activation (reverse ETL to marketing tools after modeling)
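Cross-channel analysis typically starts by stacking per-channel daily facts into one view. A sketch assuming a sibling table for another channel exists with matching columns (the google_ads table is hypothetical):

```sql
-- Blended daily spend across channels, ready for pacing dashboards.
CREATE OR REPLACE VIEW analytics.channel_spend_daily AS
SELECT dt, 'pinterest' AS channel, SUM(spend) AS spend
FROM analytics.pinterest_campaign_daily
GROUP BY dt
UNION ALL
SELECT dt, 'google_ads' AS channel, SUM(spend) AS spend
FROM analytics.google_ads_campaign_daily
GROUP BY dt;
```

Keeping each channel's facts at the same daily grain makes this union trivial and avoids double counting.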

End-to-End Build Recipes

Recipe A: Pinterest → BigQuery

  1. Extractor rows for campaign_daily and ad_daily (rolling 35 days, Conversion basis).
  2. Transformations to build fact tables and merge staging to analytics.
  3. Configure BigQuery writer with service account.
  4. Orchestrate with a Flow; schedule daily, add notifications.

Recipe B: Pinterest → Snowflake

  1. Extractor rows as above.
  2. (Optional) Snowflake transformations.
  3. Snowflake writer with Primary Key for Incremental UPSERTs, or Full loads.
  4. Orchestrate with a Flow; schedule and notify.

Troubleshooting & Best Practices

  • [slow]Rate Limits: For large extractions, stagger windows or accounts. Limit concurrency to avoid hitting Pinterest API rate limits.
  • [decision]Attribution Windows: Standardize and document your windows and time basis. If Pinterest defaults change, update extractor and models.
  • [incomplete]Schema Drift: Update transformations and writer mappings if Pinterest adds new fields.
  • [information]Observability: Use Flow run history and job logs to investigate failures. Set up email/webhook alerts for failures.

One-Page Checklists

BigQuery Path

  • [checkbox]Pinterest extractor rows (campaign_daily, ad_daily) with rolling window, Conversion basis
  • [checkbox]BigQuery transformations to MERGE staging into facts
  • [checkbox]BigQuery writer with service account
  • [checkbox]Flow scheduled + notifications enabled

Snowflake Path

  • [checkbox]Pinterest extractor rows
  • [checkbox]Snowflake writer with Primary Key for Incremental UPSERTs (or Full overwrite)
  • [checkbox](Optional) Snowflake transformations
  • [checkbox]Flow scheduled + notifications enabled

Optional: dbt Integration

If you use dbt in Keboola, convert SQL transformations into dbt models and orchestrate them in your Flow for modular, versioned analytics.

Unlock the value of your data
Try Keboola
Book a Meeting