How to upload raw data from Google Ads to Google BigQuery
When you analyze the effectiveness of Google Ads campaigns in Google Analytics, you may run into sampling, data aggregation, and other limitations of the system interface. Fortunately, this problem is easily solved by uploading raw data from your advertising service to Google BigQuery.
In this article, you'll learn how to upload raw data from your Google Ads account to BigQuery and how to identify the source, medium, and campaign for auto-labeled campaigns.
You need OWOX BI to link campaign information to user activity on your site. Sign up for a demo, and we'll discuss in detail all the challenges you can solve with OWOX BI.
Table of contents
- Why you need raw data from Google Ads
- Two ways to upload raw data from Google Ads into BigQuery
- How to configure upload using Data Transfer
- How to set up upload using Ads Script
- How to connect data download from Google Ads to OWOX BI
- Useful tips

Why you need raw data from Google Ads
Raw data from Google Ads will allow you to analyze advertising campaigns with accuracy down to each keyword. By uploading data to BigQuery, you can:
- Build reports as detailed as you want, without being limited by Google Analytics restrictions.
- Determine the effectiveness of ad campaigns at the session and user levels.
- Calculate ROI, ROAS, and CRR by region, user type (new or returning), device, and any other parameter.
- Manage your bids effectively and create remarketing lists.
- Combine data from Google Ads, Google Analytics, and your CRM to assess campaign effectiveness based on the margin and redemption rate of your items.
- Train your ML models for more accurate planning.
To understand which campaigns, ads, and keywords bring users to your site, you need to combine data from Google Ads and Google Analytics in BigQuery. You can do this using OWOX BI streaming.
The streaming pipeline sends unsampled data about user behavior on your site to GBQ. Hits are transmitted in real time, and sessions are then formed from these hits.
OWOX BI takes traffic source information from the UTM tags in your advertising markup. Tags can be manual or automatic.
Let's say you tagged an ad manually and got this URL:
https://example.com/?utm_source=facebook&utm_medium=cpc&utm_campaign=utm_tags
In this case, after you connect OWOX BI, the source, medium, and campaign data will be available in the GBQ table:
- trafficSource.source — facebook
- trafficSource.medium — cpc
- trafficSource.campaign — utm_tags
Read more about how to create UTM tags correctly.
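Once sessions are collected, you can check these fields with a simple query. This is a minimal sketch: the owoxbi_sessions_* table name and the project and dataset are placeholders, so substitute the names your pipeline actually creates.

SELECT
  trafficSource.source,
  trafficSource.medium,
  trafficSource.campaign,
  COUNT(*) AS sessions  -- assumes one row per session
FROM
  `your_project.your_dataset.owoxbi_sessions_*`  -- placeholder table name
WHERE
  _TABLE_SUFFIX BETWEEN '20200101' AND '20200131'
GROUP BY
  1, 2, 3
ORDER BY
  sessions DESC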
If you've enabled auto-tagging in the advertising service, a special gclid parameter is assigned to each of your ads. It's added to the landing page URL when a user clicks on the ad.
Example of such a link:
http://www.example.com/?gclid=TeSter-123
If you use auto-tagging, you can't get the source, medium, or campaign from the gclid without raw data: these fields will be empty in the BigQuery tables that OWOX BI collects.
What can you do in this case, and how can you get campaign names and other parameters when all you have is the gclid? Configure automatic upload from Google Ads to GBQ.
Note: if an ad isn't tagged at all, OWOX BI will attribute the traffic as follows:
- for non-Google sources, as referral traffic (e.g. facebook/referral)
- for Google sources, as direct traffic (direct/none)
If there is a lot of direct/none traffic in your reports, you may not have bot filtering enabled, or you may have a large number of untagged ads.
Two ways to upload raw data from Google Ads into BigQuery
We use and recommend two methods to upload raw data from Google Ads: Data Transfer Connector and Ads Script.
Which way you choose depends on your goals and budget.
Data Transfer features
- Native integration with GBQ.
- Lets you download historical data for any period.
- Paid: GCP charges based on the number of connected Google Ads accounts (see Tip 2 below).
Google Ads Script features
- Free of charge.
- By default, you can't upload historical data: only the previous day's information is downloaded. (The script below does include a commented-out block for a one-time backfill of up to 90 days.)
- Requires more effort if you want to set up uploads from a large number of accounts: you'll need to modify the script manually for each account, with a high risk of making a mistake.
What you need for the setup
Active projects and accounts in:
- Google Cloud Platform (GCP)
- Google BigQuery (GBQ)
- OWOX BI
- Google Ads
Access:
- Owner in GCP
- Administrator in GBQ
- Edit access in OWOX BI. Important: only the user who created the Google Analytics → Google BigQuery streaming pipeline can turn on downloading from Google Ads.
- Read access in Google Ads
How to grant access rights in GBQ
Open the GCP console and select IAM & Admin — Manage Resources from the side menu. Then select the project and click Add Member. Enter the user's email, select the BigQuery Administrator role, and save your changes.

How to configure upload using Data Transfer
Step 1. Create a project in the Google Cloud Platform
If you already have a project in GCP, skip this step. If not, open the GCP console and select IAM & Admin — Manage Resources from the side menu. Click the Create Project button, then enter the project name, specify the organization, and click Create:

Be sure to enable billing. To do this, open Billing — Account Management from the side menu, select the project, and link a billing account:

Next, complete all the fields: enter your contact information and payment card details. If this is your first project in GCP, you'll receive a $300 credit that can be used over 12 months. For projects with 1-2 Google Ads accounts and up to 100,000 unique users per month, this is enough for a year. Once the credit is exhausted, you don't have to pay anything back; to keep using the service, simply top up the balance of the card linked to the project.
Step 2. Enable the BigQuery API
After you create a project, you must activate the BigQuery API. To do this, go to APIs & Services — Dashboard from the GCP side menu, select the project, and click Enable APIs and Services:

In the API library, search for "BigQuery API" and click Enable:

To use the API, click Create Credentials:

From the drop-down list, choose the BigQuery API and click What credentials do I need?

Enter a name for the service account and specify the BigQuery role for its access level. Select JSON as the key type and click Continue:

Step 3. Activate Data Transfer API
Next, you need to activate the data service in BigQuery. To do this, open GBQ and select Transfers from the side menu on the left. Then enable the BigQuery Data Transfer API:

Step 4. Prepare the data set in GBQ
In BigQuery, select the project and click the Create Dataset button on the right. Fill in all the required fields for the new dataset (name, location, data expiration):

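If you prefer SQL to the console, the same dataset can be created with a DDL statement. A minimal sketch; the project and dataset names are placeholders, and the location should match where you want the data stored:

CREATE SCHEMA `your_project.google_ads_raw`  -- placeholder names
OPTIONS (
  location = 'US',
  description = 'Raw Google Ads data'
);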
Step 5. Set up data transfer from Google Ads
Click the Transfers tab in the side menu, and then click Create Transfer. Select Google Ads (formerly AdWords) as the source and enter a name for the upload, for example, Data Transfer.
Under Schedule options, you can leave the default Start now or set the date and time you want the download to start. In the Repeats field, select how often to upload: daily, weekly, monthly, on demand, etc.

Then specify the GBQ dataset to load the Google Ads reports into, enter your Customer ID (the ID of your Google Ads account or your MCC ID), and click Add. You can find the Customer ID in your Google Ads account in the upper right corner, next to your email.

Then authorize the Gmail account you're using. The following day, the information will appear in the dataset you specified when setting up the transfer.
As a result, you will receive a large amount of raw data in GBQ that you can work with: tables by campaigns, audiences, common (custom) tables, keywords, and conversions. For example, if you want to build a custom dashboard, you can pull non-aggregated data from these tables.
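To confirm that the transfer actually delivered data, you can list the tables that have appeared in the destination dataset. A quick check; the project and dataset names are placeholders:

SELECT
  table_name
FROM
  `your_project.google_ads_raw.INFORMATION_SCHEMA.TABLES`  -- placeholder names
ORDER BY
  table_name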
How to set up upload using Ads Script
Open your Google Ads account, click Tools and Settings in the upper right corner, select Bulk Actions — Scripts, and click the Plus symbol:

Then, in the upper-right corner, click the Advanced APIs button, select BigQuery, and save your changes:

Be sure to sign in with the same account you use in Google Ads:

Copy the script below. In the BIGQUERY_PROJECT_ID, BIGQUERY_DATASET_ID, and 'Your email' lines, replace the placeholder values with your own project name, GBQ dataset, and email. Then paste the script into the script editor.
/**
 * @name Export Data to BigQuery
 *
 * @overview The Export Data to BigQuery script sets up a BigQuery
 * dataset and tables, downloads a report from AdWords and then
 * loads the report to BigQuery.
 *
 * @author AdWords Scripts Team [adwords-scripts@googlegroups.com]
 *
 * @version 1.3
 */
var CONFIG = {
  BIGQUERY_PROJECT_ID: 'BQ project name',
  BIGQUERY_DATASET_ID: AdWordsApp.currentAccount().getCustomerId().replace(/-/g, '_'),
  // Truncate existing data, otherwise will append.
  TRUNCATE_EXISTING_DATASET: false,
  TRUNCATE_EXISTING_TABLES: true,
  // Lists of reports and fields to retrieve from AdWords.
  REPORTS: [],
  RECIPIENT_EMAILS: [
    'Your email'
  ]
};
var report = {
  NAME: 'CLICK_PERFORMANCE_REPORT', // https://developers.google.com/adwords/api/docs/appendix/reports/click-performance-report
  CONDITIONS: '',
  FIELDS: {
    'AccountDescriptiveName': 'STRING',
    'AdFormat': 'STRING',
    'AdGroupId': 'STRING',
    'AdGroupName': 'STRING',
    'AoiCountryCriteriaId': 'STRING',
    'CampaignId': 'STRING',
    'CampaignLocationTargetId': 'STRING',
    'CampaignName': 'STRING',
    'CampaignStatus': 'STRING',
    'Clicks': 'INTEGER',
    'ClickType': 'STRING',
    'CreativeId': 'STRING',
    'CriteriaId': 'STRING',
    'CriteriaParameters': 'STRING',
    'Date': 'DATE',
    'Device': 'STRING',
    'ExternalCustomerId': 'STRING',
    'GclId': 'STRING',
    'KeywordMatchType': 'STRING',
    'LopCountryCriteriaId': 'STRING',
    'Page': 'INTEGER'
  },
  // Yesterday's date in YYYYMMDD format: used as the report date range
  // ('start,end') and as the suffix of the destination table name.
  DATE_RANGE: new Date(new Date().setDate(new Date().getDate() - 1)).toISOString().slice(0, 10).replace(/-/g, '') + ',' +
      new Date(new Date().setDate(new Date().getDate() - 1)).toISOString().slice(0, 10).replace(/-/g, ''),
  DATE: new Date(new Date().setDate(new Date().getDate() - 1)).toISOString().slice(0, 10).replace(/-/g, '')
};
// Regular export: yesterday's report only.
CONFIG.REPORTS.push(JSON.parse(JSON.stringify(report)));
// One-time historical export: uncomment this block to backfill up to 90 days.
// for (var i = 2; i < 91; i++) {
//   report.DATE_RANGE = new Date(new Date().setDate(new Date().getDate() - i)).toISOString().slice(0, 10).replace(/-/g, '') + ',' + new Date(new Date().setDate(new Date().getDate() - i)).toISOString().slice(0, 10).replace(/-/g, '');
//   report.DATE = new Date(new Date().setDate(new Date().getDate() - i)).toISOString().slice(0, 10).replace(/-/g, '');
//   CONFIG.REPORTS.push(JSON.parse(JSON.stringify(report)));
// }
/**
 * Main method
 */
function main() {
  createDataset();
  for (var i = 0; i < CONFIG.REPORTS.length; i++) {
    var reportConfig = CONFIG.REPORTS[i];
    createTable(reportConfig);
  }
  var jobIds = processReports();
  waitTillJobsComplete(jobIds);
  sendEmail(jobIds);
}
/**
 * Creates a new dataset.
 *
 * If a dataset with the same id already exists and the truncate flag
 * is set, will truncate the old dataset. If the truncate flag is not
 * set, then will not create a new dataset.
 */
function createDataset() {
  if (datasetExists()) {
    if (CONFIG.TRUNCATE_EXISTING_DATASET) {
      BigQuery.Datasets.remove(CONFIG.BIGQUERY_PROJECT_ID,
          CONFIG.BIGQUERY_DATASET_ID, {'deleteContents': true});
      Logger.log('Truncated dataset.');
    } else {
      Logger.log('Dataset %s already exists. Will not recreate.',
          CONFIG.BIGQUERY_DATASET_ID);
      return;
    }
  }
  // Create new dataset.
  var dataSet = BigQuery.newDataset();
  dataSet.friendlyName = CONFIG.BIGQUERY_DATASET_ID;
  dataSet.datasetReference = BigQuery.newDatasetReference();
  dataSet.datasetReference.projectId = CONFIG.BIGQUERY_PROJECT_ID;
  dataSet.datasetReference.datasetId = CONFIG.BIGQUERY_DATASET_ID;
  dataSet = BigQuery.Datasets.insert(dataSet, CONFIG.BIGQUERY_PROJECT_ID);
  Logger.log('Created dataset with id %s.', dataSet.id);
}
/**
 * Checks if dataset already exists in project.
 *
 * @return {boolean} Returns true if dataset already exists.
 */
function datasetExists() {
  // Get a list of all datasets in project.
  var datasets = BigQuery.Datasets.list(CONFIG.BIGQUERY_PROJECT_ID);
  var datasetExists = false;
  // Iterate through each dataset and check for an id match.
  if (datasets.datasets != null) {
    for (var i = 0; i < datasets.datasets.length; i++) {
      var dataset = datasets.datasets[i];
      if (dataset.datasetReference.datasetId == CONFIG.BIGQUERY_DATASET_ID) {
        datasetExists = true;
        break;
      }
    }
  }
  return datasetExists;
}
/**
 * Creates a new table.
 *
 * If a table with the same id already exists and the truncate flag
 * is set, will truncate the old table. If the truncate flag is not
 * set, then will not create a new table.
 *
 * @param {Object} reportConfig Report configuration including report name,
 *     conditions, and fields.
 */
function createTable(reportConfig) {
  var tableName = reportConfig.NAME + reportConfig.DATE;
  if (tableExists(tableName)) {
    if (CONFIG.TRUNCATE_EXISTING_TABLES) {
      BigQuery.Tables.remove(CONFIG.BIGQUERY_PROJECT_ID,
          CONFIG.BIGQUERY_DATASET_ID, tableName);
      Logger.log('Truncated table %s.', tableName);
    } else {
      Logger.log('Table %s already exists. Will not recreate.',
          tableName);
      return;
    }
  }
  // Create new table.
  var table = BigQuery.newTable();
  var schema = BigQuery.newTableSchema();
  var bigQueryFields = [];
  // Add each field to table schema.
  var fieldNames = Object.keys(reportConfig.FIELDS);
  for (var i = 0; i < fieldNames.length; i++) {
    var fieldName = fieldNames[i];
    var bigQueryFieldSchema = BigQuery.newTableFieldSchema();
    bigQueryFieldSchema.description = fieldName;
    bigQueryFieldSchema.name = fieldName;
    bigQueryFieldSchema.type = reportConfig.FIELDS[fieldName];
    bigQueryFields.push(bigQueryFieldSchema);
  }
  schema.fields = bigQueryFields;
  table.schema = schema;
  table.friendlyName = tableName;
  table.tableReference = BigQuery.newTableReference();
  table.tableReference.datasetId = CONFIG.BIGQUERY_DATASET_ID;
  table.tableReference.projectId = CONFIG.BIGQUERY_PROJECT_ID;
  table.tableReference.tableId = tableName;
  table = BigQuery.Tables.insert(table, CONFIG.BIGQUERY_PROJECT_ID,
      CONFIG.BIGQUERY_DATASET_ID);
  Logger.log('Created table with id %s.', table.id);
}
/**
 * Checks if table already exists in dataset.
 *
 * @param {string} tableId The table id to check existence.
 *
 * @return {boolean} Returns true if table already exists.
 */
function tableExists(tableId) {
  // Get a list of all tables in the dataset.
  var tables = BigQuery.Tables.list(CONFIG.BIGQUERY_PROJECT_ID,
      CONFIG.BIGQUERY_DATASET_ID);
  var tableExists = false;
  // Iterate through each table and check for an id match.
  if (tables.tables != null) {
    for (var i = 0; i < tables.tables.length; i++) {
      var table = tables.tables[i];
      if (table.tableReference.tableId == tableId) {
        tableExists = true;
        break;
      }
    }
  }
  return tableExists;
}
/**
 * Process all configured reports
 *
 * Iterates through each report to: retrieve AdWords data,
 * backup data to Drive (if configured), load data to BigQuery.
 *
 * @return {Array.<string>} jobIds The list of all job ids.
 */
function processReports() {
  var jobIds = [];
  // Iterate over each report type.
  for (var i = 0; i < CONFIG.REPORTS.length; i++) {
    var reportConfig = CONFIG.REPORTS[i];
    Logger.log('Running report %s', reportConfig.NAME);
    // Get data as csv
    var csvData = retrieveAdwordsReport(reportConfig);
    //Logger.log(csvData);
    // Convert to Blob format.
    var blobData = Utilities.newBlob(csvData, 'application/octet-stream');
    // Load data
    var jobId = loadDataToBigquery(reportConfig, blobData);
    jobIds.push(jobId);
  }
  return jobIds;
}
/**
 * Retrieves AdWords data as csv and formats any fields
 * to BigQuery expected format.
 *
 * @param {Object} reportConfig Report configuration including report name,
 *     conditions, and fields.
 *
 * @return {string} csvData Report in csv format.
 */
function retrieveAdwordsReport(reportConfig) {
  var fieldNames = Object.keys(reportConfig.FIELDS);
  var query = 'SELECT ' + fieldNames.join(', ') +
      ' FROM ' + reportConfig.NAME + ' ' + reportConfig.CONDITIONS +
      ' DURING ' + reportConfig.DATE_RANGE;
  Logger.log(query);
  var report = AdWordsApp.report(query);
  var rows = report.rows();
  var csvRows = [];
  // Header row
  csvRows.push(fieldNames.join(','));
  // Iterate over each row.
  while (rows.hasNext()) {
    var row = rows.next();
    var csvRow = [];
    for (var i = 0; i < fieldNames.length; i++) {
      var fieldName = fieldNames[i];
      var fieldValue = row[fieldName].toString();
      var fieldType = reportConfig.FIELDS[fieldName];
      // Strip off % and perform any other formatting here.
      if (fieldType == 'FLOAT' || fieldType == 'INTEGER') {
        if (fieldValue.charAt(fieldValue.length - 1) == '%') {
          fieldValue = fieldValue.substring(0, fieldValue.length - 1);
        }
        fieldValue = fieldValue.replace(/,/g, '');
        if (fieldValue == '--' || fieldValue == 'Unspecified') {
          fieldValue = '';
        }
      }
      // Add double quotes to any string values.
      if (fieldType == 'STRING') {
        if (fieldValue == '--') {
          fieldValue = '';
        }
        fieldValue = fieldValue.replace(/"/g, '""');
        fieldValue = '"' + fieldValue + '"';
      }
      csvRow.push(fieldValue);
    }
    csvRows.push(csvRow.join(','));
  }
  Logger.log('Downloaded ' + reportConfig.NAME + ' with ' + csvRows.length +
      ' rows.');
  return csvRows.join('\n');
}
/**
 * Creates a BigQuery insertJob to load csv data.
 *
 * @param {Object} reportConfig Report configuration including report name,
 *     conditions, and fields.
 * @param {Blob} data Csv report data as an 'application/octet-stream' blob.
 *
 * @return {string} jobId The job id for upload.
 */
function loadDataToBigquery(reportConfig, data) {
  // Create the data upload job.
  var job = {
    configuration: {
      load: {
        destinationTable: {
          projectId: CONFIG.BIGQUERY_PROJECT_ID,
          datasetId: CONFIG.BIGQUERY_DATASET_ID,
          tableId: reportConfig.NAME + reportConfig.DATE
        },
        skipLeadingRows: 1
      }
    }
  };
  var insertJob = BigQuery.Jobs.insert(job, CONFIG.BIGQUERY_PROJECT_ID, data);
  Logger.log('Load job started for %s. Check on the status of it here: ' +
      'https://bigquery.cloud.google.com/jobs/%s', reportConfig.NAME,
      CONFIG.BIGQUERY_PROJECT_ID);
  return insertJob.jobReference.jobId;
}
/**
 * Polls until all jobs are 'DONE'.
 *
 * @param {Array.<string>} jobIds The list of all job ids.
 */
function waitTillJobsComplete(jobIds) {
  var complete = false;
  var remainingJobs = jobIds;
  while (!complete) {
    if (AdWordsApp.getExecutionInfo().getRemainingTime() < 5) {
      Logger.log('Script is about to timeout, jobs ' + remainingJobs.join(',') +
          ' are still incomplete.');
    }
    remainingJobs = getIncompleteJobs(remainingJobs);
    if (remainingJobs.length == 0) {
      complete = true;
    }
    if (!complete) {
      Logger.log(remainingJobs.length + ' jobs still being processed.');
      // Wait 5 seconds before checking status again.
      Utilities.sleep(5000);
    }
  }
  Logger.log('All jobs processed.');
}
/**
 * Iterates through jobs and returns the ids for those jobs
 * that are not 'DONE'.
 *
 * @param {Array.<string>} jobIds The list of job ids.
 *
 * @return {Array.<string>} remainingJobIds The list of remaining job ids.
 */
function getIncompleteJobs(jobIds) {
  var remainingJobIds = [];
  for (var i = 0; i < jobIds.length; i++) {
    var jobId = jobIds[i];
    var getJob = BigQuery.Jobs.get(CONFIG.BIGQUERY_PROJECT_ID, jobId);
    if (getJob.status.state != 'DONE') {
      remainingJobIds.push(jobId);
    }
  }
  return remainingJobIds;
}
/**
 * Sends a notification email that jobs have completed loading.
 *
 * @param {Array.<string>} jobIds The list of all job ids.
 */
function sendEmail(jobIds) {
  var html = [];
  html.push(
      '<html>',
      '<body>',
      '<table width=800 cellpadding=0 border=0 cellspacing=0>',
      '<tr>',
      '<td colspan=2 align=right>',
      "<div style='font: italic normal 10pt Times New Roman, serif; " +
      "margin: 0; color: #666; padding-right: 5px;'>" +
      'Powered by AdWords Scripts</div>',
      '</td>',
      '</tr>',
      "<tr bgcolor='#3c78d8'>",
      '<td width=500>',
      "<div style='font: normal 18pt verdana, sans-serif; " +
      "padding: 3px 10px; color: white'>Adwords data load to " +
      'Bigquery report</div>',
      '</td>',
      '<td align=right>',
      "<div style='font: normal 18pt verdana, sans-serif; " +
      "padding: 3px 10px; color: white'>",
      AdWordsApp.currentAccount().getCustomerId(),
      '</div>',
      '</td>',
      '</tr>',
      '</table>',
      '<table width=800 cellpadding=0 border=1 cellspacing=0>',
      "<tr bgcolor='#ddd'>",
      "<td style='font: 12pt verdana, sans-serif; " +
      'padding: 5px 0px 5px 5px; background-color: #ddd; ' +
      "text-align: left'>Report</td>",
      "<td style='font: 12pt verdana, sans-serif; " +
      'padding: 5px 0px 5px 5px; background-color: #ddd; ' +
      "text-align: left'>JobId</td>",
      "<td style='font: 12pt verdana, sans-serif; " +
      'padding: 5px 0px 5px 5px; background-color: #ddd; ' +
      "text-align: left'>Rows</td>",
      "<td style='font: 12pt verdana, sans-serif; " +
      'padding: 5px 0px 5px 5px; background-color: #ddd; ' +
      "text-align: left'>State</td>",
      "<td style='font: 12pt verdana, sans-serif; " +
      'padding: 5px 0px 5px 5px; background-color: #ddd; ' +
      "text-align: left'>ErrorResult</td>",
      '</tr>',
      createTableRows(jobIds),
      '</table>',
      '</body>',
      '</html>');
  MailApp.sendEmail(CONFIG.RECIPIENT_EMAILS.join(','),
      'Adwords data load to Bigquery Complete', '',
      {htmlBody: html.join('\n')});
}
/**
 * Creates table rows for email report.
 *
 * @param {Array.<string>} jobIds The list of all job ids.
 */
function createTableRows(jobIds) {
  var html = [];
  for (var i = 0; i < jobIds.length; i++) {
    var jobId = jobIds[i];
    var job = BigQuery.Jobs.get(CONFIG.BIGQUERY_PROJECT_ID, jobId);
    var errorResult = '';
    if (job.status.errorResult) {
      errorResult = job.status.errorResult;
    }
    html.push('<tr>',
        "<td style='padding: 0px 10px'>" +
        job.configuration.load.destinationTable.tableId + '</td>',
        "<td style='padding: 0px 10px'>" + jobId + '</td>',
        // Parentheses are required here: '+' binds tighter than '?:'.
        "<td style='padding: 0px 10px'>" +
        (job.statistics.load ? job.statistics.load.outputRows : 0) + '</td>',
        "<td style='padding: 0px 10px'>" + job.status.state + '</td>',
        "<td style='padding: 0px 10px'>" + errorResult + '</td>',
        '</tr>');
  }
  return html.join('\n');
}
Before running the script, be sure to click the Preview button in the lower right corner to check the result. If there are errors, the system will warn you and indicate the line where the error occurred, as in this screenshot:

If there are no errors, click the Run button:

As a result, you'll get a new CLICK_PERFORMANCE_REPORT table in GBQ, with data available starting the next day:

Recall that when you use Data Transfer, you get a large amount of raw, non-aggregated data. With Ads Script, you only get information for specific fields.
The following fields from this upload are included in the session-related OWOX BI tables (a query sketch follows the list):
- GclId
- CampaignId
- CampaignName
- AdGroupId
- AdGroupName
- CriteriaId
- CriteriaParameters
- KeywordMatchType
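For example, you can look up the campaign details for a specific click in the tables the script creates (named CLICK_PERFORMANCE_REPORT plus a date suffix). A minimal sketch with placeholder project and dataset names, using the example gclid from earlier in this article:

SELECT
  GclId,
  CampaignId,
  CampaignName,
  AdGroupId,
  AdGroupName,
  CriteriaId,
  CriteriaParameters,
  KeywordMatchType
FROM
  `your_project.your_ads_dataset.CLICK_PERFORMANCE_REPORT*`  -- placeholder names
WHERE
  GclId = 'TeSter-123'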
How to connect data download from Google Ads to OWOX BI
Now you need to combine the information from Google Ads with your site data to understand which campaigns bring users to your site. The tables you get in BigQuery, for example via Data Transfer, don't contain a Client ID parameter. You can determine which user clicked an ad only by linking the gclid data to the OWOX BI streaming data.
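Conceptually, this linking is a join on gclid. The sketch below illustrates the idea only and is not the OWOX BI implementation: session_gclid is a hypothetical column standing in for the gclid parsed from each session's landing page URL, and all table names are placeholders.

SELECT
  cp.CampaignName,
  cp.AdGroupName,
  cp.CriteriaParameters AS keyword,
  COUNT(*) AS sessions
FROM
  `your_project.your_dataset.owoxbi_sessions_*` AS s  -- placeholder name
JOIN
  `your_project.your_ads_dataset.CLICK_PERFORMANCE_REPORT*` AS cp
  ON s.session_gclid = cp.GclId  -- hypothetical column
GROUP BY
  1, 2, 3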
If you have no Google Analytics → Google BigQuery streaming pipeline in OWOX BI yet, read the instructions on how to create it.
Then go to your OWOX BI project and open this pipeline. Click the Settings tab, and under Session Data Collection, click Edit Settings:

Use the slider to enable data collection for Google Ads auto-labeled campaigns and click Change Settings:

Select the AutoLabel markup type and specify how you upload data to BigQuery: via Data Transfer or Ads Script. Then specify the project and dataset from which the Google Ads data will be downloaded, and save your settings:

Useful tips
Tip 1. With Data Transfer, you can upload historical data from Google Ads to GBQ. There are no restrictions on the total loading period (you can load a year or even three), but you can request data for only 180 days at a time.
You can activate the upload and specify the period using the Schedule Backfill button on the Transfers tab, after selecting the transfer you want:

Tip 2. If you want to check the number of Google Ads accounts that GCP will charge for, determine the number of distinct ExternalCustomerId values in the Customer tables using this query:
SELECT
  ExternalCustomerId
FROM
  `project_name.dataset_name.Customer_*`
WHERE
  _PARTITIONTIME >= "2020-01-01 00:00:00"
  AND _PARTITIONTIME < "2020-07-10 00:00:00"
GROUP BY 1
You can edit the dates in the query.
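Equivalently, you can count the accounts directly and get a single number back instead of a list:

SELECT
  COUNT(DISTINCT ExternalCustomerId) AS accounts
FROM
  `project_name.dataset_name.Customer_*`
WHERE
  _PARTITIONTIME >= "2020-01-01 00:00:00"
  AND _PARTITIONTIME < "2020-07-10 00:00:00"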
Tip 3. You can access the uploaded data yourself using SQL queries. For example, here is a query that evaluates campaign effectiveness using the Campaign and CampaignBasicStats tables created by Data Transfer:
SELECT
  c.ExternalCustomerId,
  c.CampaignName,
  c.CampaignStatus,
  SUM(cs.Impressions) AS Impressions,
  SUM(cs.Interactions) AS Interactions,
  (SUM(cs.Cost) / 1000000) AS Cost  -- cost is stored in micros
FROM
  `[DATASET].Campaign_[CUSTOMER_ID]` c
LEFT JOIN
  `[DATASET].CampaignBasicStats_[CUSTOMER_ID]` cs
ON
  (c.CampaignId = cs.CampaignId
   AND cs._DATA_DATE BETWEEN
   DATE_ADD(CURRENT_DATE(), INTERVAL -31 DAY) AND DATE_ADD(CURRENT_DATE(), INTERVAL -1 DAY))
WHERE
  c._DATA_DATE = c._LATEST_DATE
GROUP BY
  1, 2, 3
ORDER BY
  Impressions DESC
P.S. If you need help uploading and merging data into Google BigQuery, we're ready to assist. Sign up for a demo and we'll discuss the details.