Transferring Data into BigQuery: Methods and Tools


Effectively managing and analyzing data is key for businesses today, and BigQuery offers a solid solution for large-scale analytics.

Whether you're a Data Engineer or a Business Intelligence specialist, knowing how to transfer data into BigQuery can improve your analytics workflow, helping you use data more effectively with SQL and strengthening a data-driven decision-making culture.

In this guide, we'll explore different ways to load data into BigQuery, covering both its built-in features and third-party tools.

By the end of this article, you'll have a clearer idea of how to make your data collection process more efficient and fully leverage BigQuery.

Benefits of Consolidating Data in BigQuery

Whether you're working at a startup or a large enterprise, BigQuery's features ensure smooth, efficient data processing.

Scalability

BigQuery handles massive datasets, making it suitable for organizations of any size. Whether you're a startup or a global company, it easily scales to meet growing data needs.

High-Performance Storage

Optimized for analytics, BigQuery provides fast, reliable storage for structured, semi-structured, and unstructured data, supporting formats like JSON, Avro, and Parquet.

Seamless Integration with BI Tools

BigQuery integrates smoothly with visualization services like Looker Studio, Power BI, and Google Sheets, creating a connected data ecosystem and making data accessible to non-technical stakeholders.

Reliable Security and Compliance

BigQuery offers encryption and identity management, ensuring security and meeting regulatory standards like GDPR, PDPA, and HIPAA.

4 Ways of Collecting Data into BigQuery

Google BigQuery is popular among businesses for its seamless integration with advertising services, analytics systems, CRM tools, and more.

It helps businesses automate tasks and respond quickly to market changes.

Now, let’s explore the different ways to collect data in BigQuery:

#1: BigQuery Data Transfer

The BigQuery Data Transfer Service automates data transfers from popular sources like Google Ads, YouTube, and external platforms like Facebook Ads directly into BigQuery. This service allows users to create, configure, and monitor data transfers without manual intervention, ensuring timely and consistent updates.

#2: bq Command-Line

The bq command-line tool manages data transfers into BigQuery, providing a more hands-on, flexible option than the web interface. It allows users to create, configure, and monitor data transfers through terminal commands, offering greater control over the data transfer process and automation options for advanced users.
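To make this concrete, here is a small Python sketch that assembles a typical `bq load` invocation as an argument list; the dataset, table, and file names are hypothetical placeholders, and the flags shown are a common subset rather than the full option set.

```python
# Sketch: build the argv list for a `bq load` call that loads a local CSV file.
# Dataset, table, and file names below are hypothetical placeholders.
def build_bq_load_command(dataset, table, source_file, source_format="CSV",
                          skip_leading_rows=1, autodetect=True):
    """Return the argument list for a `bq load` invocation."""
    cmd = ["bq", "load", f"--source_format={source_format}"]
    if skip_leading_rows:
        cmd.append(f"--skip_leading_rows={skip_leading_rows}")
    if autodetect:
        cmd.append("--autodetect")
    cmd += [f"{dataset}.{table}", source_file]
    return cmd

cmd = build_bq_load_command("marketing", "ad_costs", "costs.csv")
print(" ".join(cmd))
# → bq load --source_format=CSV --skip_leading_rows=1 --autodetect marketing.ad_costs costs.csv
```

With the Google Cloud SDK installed, such a command list could be executed from a script via `subprocess.run(cmd)`, which is what makes the CLI convenient for automation.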

#3: Third-Party Tools

Third-party paid tools offer advanced data transfer solutions for BigQuery, allowing users to connect with a wide range of platforms beyond Google’s ecosystem. These tools provide no-code options, pre-configured connectors, and support for custom schemas, making it easier to integrate and manage data from sources like Facebook Ads or CRM systems.

#4: Custom Connectors

Custom BigQuery connectors enable smooth integration of external data sources into BigQuery, automating data ingestion without manual ETL processes.

These connectors act as direct bridges for data transfer, making analytics more efficient and reducing errors.

Businesses can tailor data workflows by building custom connectors, ensuring scalability, consistency, and accurate data flow for advanced analysis.

In this article, we’ll show you the ways of collecting data from popular sources by type of data: web analytics, ads platforms, spreadsheets, etc.

Collecting Web Analytics Data to BigQuery

Collecting web analytics data in BigQuery is essential for in-depth insights and reporting.

For example, Google Analytics 4 (GA4) allows you to transfer event data into BigQuery automatically. Let’s take a closer look at both methods for collecting web analytics data into BigQuery.

Google Analytics 4 to BigQuery Export

Exporting data from Google Analytics 4 (GA4) to BigQuery allows you to analyze user behavior and site metrics more flexibly and deeply. Connecting GA4 to BigQuery will enable you to access raw event-level data for more advanced analysis.

Here’s a step-by-step guide on how to export GA4 data to BigQuery:

Step 1: Access Google Analytics Admin

Open your GA4 account and navigate to the Admin section at the bottom left of the page. The Admin section manages all the property settings for your Google Analytics account.

Step 2: Select "BigQuery Linking"

Under the Property column, click on BigQuery Linking to begin setting up the connection between GA4 and BigQuery. This option lets you send raw event data directly from your GA4 property to BigQuery.

Step 3: Create a New Link

Click the Link button to create a new connection between your GA4 property and a BigQuery project. This step initiates the process of exporting data.

Step 4: Choose a BigQuery Project

Select the BigQuery project where you want to send your GA4 data. If you don't have a project, you must create one in Google Cloud Console.

Don’t forget to mark the checkbox next to the BigQuery project you want to link to your GA4 property to confirm your selection. Then click on “Confirm”.

Step 5: Select Preferred Data Location

After choosing your BigQuery project, select the preferred data location where your analytics data will be stored. It's recommended to choose a location that matches your business region to ensure compliance.

Step 6: Choose Data Streams

Next, select the data streams you want to export from GA4 to BigQuery. Depending on your needs, you can choose specific streams, such as website or app data.

Step 7: Set Up Frequency

In this step, choose the export frequency. Depending on your reporting needs, you can select between daily exports for regular data updates and streaming exports for near real-time data analysis.

Step 8: Submit

After configuring all settings, click Submit to finalize the setup. This will initiate the linking process between GA4 and BigQuery.

Step 9: Linking Complete

Once submitted, the connection between your GA4 property and BigQuery is established, and data will start transferring based on the schedule you’ve set.


Server-Side Cookieless Tracking

OWOX Website Tracker offers an effective solution for accurate customer acquisition reporting using server-side first-party cookies collected into BigQuery in real-time.

This method extends cookie lifespan to 365 days, even in browsers like Safari, ensuring more accurate tracking of returning visitors.

Unlike traditional client-side cookies, OWOX BI’s solution requires no engineering resources and offers an out-of-the-box setup, enhancing campaign accuracy by over 30%.

For a better understanding, let's compare GA4 vs. OWOX BI Streaming.


| Feature | Google Analytics (Client-Side Cookies) | OWOX BI Cookieless Tracking (Server-Side Cookies) |
| --- | --- | --- |
| Cookie Type | 3rd party, client-side (set by JS), or 1st party, server-side (set with ssGTM) | 1st party, server-side, same origin |
| Cookie Lifespan | 7 days or 24 hours (due to ITP) | 365 days |
| Adblockers | Highly sensitive | Less sensitive |
| Setup & Maintenance | Setup of ssGTM, plus ongoing maintenance | No engineering required, minutes to set up |
| Accuracy in Acquisition Tracking | Prone to inaccuracies due to cookie expiration, leading to inflated direct traffic | 20%+ higher accuracy in acquisition tracking |
| Cost | Requires Google Cloud/App Engine expenses and ongoing fees | Easy to estimate; no hidden total cost of ownership |
| Impact on Direct Traffic | Returning visitors are often misattributed as new ones | 30% more accurate tracking of returning visitors |


How to Collect Advertising Data to BigQuery

To gather advertising data from platforms like Google Ads, Facebook Ads, or LinkedIn into BigQuery, you can use the BigQuery Data Transfer Service or third-party tools like OWOX BI Pipelines.

BigQuery Data Transfer Service

The BigQuery Data Transfer Service makes it easy to import data from various sources directly into your BigQuery tables. It works with Google services like Google Ads, YouTube, and Google Play, as well as external platforms such as Amazon S3 and Teradata. Transfers are automatically scheduled, ensuring your data is consistently updated without any manual effort.

A standout feature is the ability to backfill data so that you can recover any lost information during outages. You can manage the service through:

  • the Google Cloud Console,
  • the bq command-line tool, or
  • the Data Transfer Service API.

This gives you flexibility in handling data imports. The service lets you customize transfer schedules or start manual transfers when needed. With its integration and automation, the BigQuery Data Transfer Service helps businesses keep their datasets accurate and up-to-date for real-time analysis and reporting.

Google Ads Data Transfer

Transferring Google Ads data to BigQuery allows merging campaign performance data with GA4 event data, enabling an overview of customer behavior and ROI.

Steps to Set Up Google Ads Data Transfer to BigQuery:

Step 1: Create a Dataset in BigQuery

  • Open the BigQuery Console and click on the three dots next to your project name.
  • Select Create dataset, name it, and configure the settings. Ensure the dataset is in the same region as your GA4 event data.

Step 2: Configure Data Transfer

  • Go to Google Cloud Console and navigate to BigQuery > Data Transfers.
  • Click Create Transfer and enable the Data Transfer API if necessary.
  • Select Google Ads as the source. Choose a display name, set the schedule (hours, days, weeks, or months), select your new dataset, and enter your Google Ads customer or MCC ID. If running Performance Max campaigns, select "Include PMax Campaign Tables". The refresh window is 7 days by default, extendable to 30. Enable email notifications if needed, then click Save.

Step 3: Backfill Data Transfer

  • The transfer will be scheduled for the next day. To run the transfer manually, click Schedule backfill.
  • Choose a date range or run a one-time transfer. Backfilling large datasets may take time.

Step 4: First Query

Run your first query by replacing the dataset name in the sample query provided to analyze metrics like impressions, interactions, and cost for your Google Ads campaigns.

SELECT
  c.customer_id,
  c.campaign_name,
  c.campaign_status,
  SUM(cs.metrics_impressions) AS impressions,
  SUM(cs.metrics_interactions) AS interactions,
  (SUM(cs.metrics_cost_micros) / 1000000) AS cost
FROM
  `google_ads.ads_Campaign_` c
LEFT JOIN
  `google_ads.ads_CampaignBasicStats_` cs
ON
  (c.campaign_id = cs.campaign_id
  AND cs._data_date BETWEEN
  DATE_ADD(CURRENT_DATE(), INTERVAL -31 DAY) AND DATE_ADD(CURRENT_DATE(), INTERVAL -1 DAY))
WHERE
  c._data_date = c._latest_date
GROUP BY
  1, 2, 3
ORDER BY
  impressions DESC

This query retrieves Google Ads campaign performance data from the last 31 days. It calculates total impressions, interactions, and cost for each campaign, joining campaign details with basic campaign statistics.
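The division by 1,000,000 in the query reflects how Google Ads reports monetary values: `metrics_cost_micros` is expressed in micros, i.e. millionths of the account currency. A one-line helper makes the conversion explicit:

```python
def micros_to_currency(cost_micros: int) -> float:
    """Convert a Google Ads value in micros to whole currency units."""
    return cost_micros / 1_000_000

# 36,500,000 micros equals 36.5 in the account currency
print(micros_to_currency(36_500_000))  # → 36.5
```

The same conversion applies to other micros-denominated fields, such as bid amounts, when you surface them in reports.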

Facebook Ads Data Transfer

Transferring Facebook Ads data to BigQuery allows businesses to merge ad performance metrics with other datasets.

There are three key methods for this transfer: manual data uploads, API-based connections, or no-code automated tools for seamless, ongoing transfers.

1. Manual Transfer Steps:

  • Export a CSV from Facebook Ads Manager.
  • Clean and preprocess the data if needed.
  • Create a dataset in BigQuery.
  • Create a table and upload the CSV file.
  • Specify or auto-detect the schema.
  • Verify the data import with a query.
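As a sketch of the "clean and preprocess" step, the snippet below normalizes header names and drops empty rows using only the standard library; the column names are hypothetical and should be adapted to your actual export.

```python
import csv
import io

def clean_ads_csv(raw_csv: str) -> str:
    """Normalize headers to snake_case and drop fully empty rows."""
    rows = list(csv.reader(io.StringIO(raw_csv)))
    # Lowercase headers and replace spaces so they are valid BigQuery column names
    header = [h.strip().lower().replace(" ", "_") for h in rows[0]]
    cleaned = [header]
    for row in rows[1:]:
        if any(cell.strip() for cell in row):  # skip fully empty rows
            cleaned.append([cell.strip() for cell in row])
    out = io.StringIO()
    csv.writer(out, lineterminator="\n").writerows(cleaned)
    return out.getvalue()

# Hypothetical Ads Manager export with messy whitespace and an empty row
raw = "Campaign Name,Amount Spent\n My Campaign ,12.50\n,\n"
print(clean_ads_csv(raw))
```

Cleaning before upload avoids schema-detection surprises, since BigQuery infers column names and types from the file as given.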

2. Facebook Ads API Transfer Steps:

  • Obtain an access token from Facebook's Developer portal.
  • Create a service account in Google Cloud, download the key file, and enable the BigQuery API for the project.
  • Set up your server or local environment with the necessary libraries, such as requests for API calls and google-cloud-bigquery for data uploads.
  • Use the API to extract the relevant ad performance metrics.
  • Clean and structure the data, ensuring it aligns with the appropriate data types and schema.
  • Use the Google Cloud BigQuery client library in your script.
  • Automate the process using tools like Google Cloud Scheduler.

Here's a basic example to demonstrate part of the workflow. It assumes you have the google-cloud-bigquery and requests libraries installed in your environment.

from google.cloud import bigquery
import requests

# Initialize BigQuery client
client = bigquery.Client()

# Facebook Ads API credentials and endpoint setup
fb_access_token = 'YOUR_FACEBOOK_ACCESS_TOKEN'
ad_account_id = 'YOUR_AD_ACCOUNT_ID'
fb_ads_endpoint = f'https://graph.facebook.com/v12.0/{ad_account_id}/insights'

# Parameters for the API call
params = {
    'fields': 'impressions,clicks,spend',
    'access_token': fb_access_token
}

# Fetch data from Facebook Ads API
response = requests.get(fb_ads_endpoint, params=params)
ads_data = response.json()['data']  # Ensure this matches the structure of your expected API response

# Define the BigQuery dataset and table IDs
dataset_id = 'YOUR_DATASET_ID'
table_id = 'YOUR_TABLE_ID'

# Transform ads_data as needed to match your BigQuery table schema
# Assuming ads_data is a list of dictionaries that match your BigQuery table schema

# Insert data into BigQuery
errors = client.insert_rows_json(f'{dataset_id}.{table_id}', ads_data)
if errors == []:
    print("New rows have been added.")
else:
    print(f"Errors occurred: {errors}")
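One practical caveat: streaming inserts via insert_rows_json are subject to per-request size and row limits, so larger result sets are usually split into batches before inserting. A minimal sketch of that batching follows; the batch size of 500 is an illustrative choice, not an official quota.

```python
def chunk_rows(rows, chunk_size=500):
    """Yield successive slices so each insert_rows_json call stays small."""
    for i in range(0, len(rows), chunk_size):
        yield rows[i:i + chunk_size]

# Hypothetical payload of 1,200 ad-performance rows
rows = [{"impressions": n} for n in range(1200)]
sizes = [len(chunk) for chunk in chunk_rows(rows)]
print(sizes)  # → [500, 500, 200]
```

In the script above, you would loop over `chunk_rows(ads_data)` and call `client.insert_rows_json(...)` once per chunk, collecting any per-chunk errors.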

3. Automated Transfer (No-Code Integration):

An automated, no-code integration allows businesses to transfer Facebook Ads data to BigQuery using third-party tools, eliminating the need for coding skills.

OWOX BI Pipelines offer a seamless, no-code solution for transferring Facebook Ads data to BigQuery. With easy setup and automated data updates, OWOX BI ensures real-time insights without technical expertise.

Data Transfer for Other Google Products

The BigQuery Data Transfer Service also automates data transfers from other Google products like Display & Video 360 (DV360), Search Ads 360, and YouTube.

Steps for Setting Up Data Transfer for DV360:

1. Open the BigQuery Console from the Google Cloud Console.
2. Ensure that the BigQuery Data Transfer API is enabled for your project.
3. Navigate to BigQuery > Data Transfers and click Create Transfer.
4. Choose Display & Video 360 from the list of available data sources.
5. Set the transfer name and schedule, and select the dataset in BigQuery to store the data.
6. Authorize the connection to DV360 and finalize the setup by saving the transfer configuration.

Your DV360 data will now be automatically transferred to BigQuery on a scheduled basis.

OWOX BI Pipelines

OWOX BI Pipelines simplifies collecting and transferring data from multiple sources, including advertising platforms, CRMs, and web analytics, into BigQuery.

Benefits of OWOX BI Pipelines:

  • Seamless delivery of marketing data to Google Analytics and BigQuery for easy access to insights.
  • Collects raw, unsampled data in an analyst-friendly structure for high-quality analysis.
  • Provides highly granular data for detailed reporting and deeper marketing insights.
  • Ensures reliable, clean, and consistent data through continuous quality control.

Steps for Setting Up OWOX BI Pipelines:

1. Create an account on the OWOX BI platform.
2. Navigate to the OWOX BI Workspace.
3. Click the "+ New" button, then select "Pipeline" from the options.
4. Choose the marketing platform or data source you want to connect, such as Google Ads or Facebook Ads.
5. Specify the BigQuery project and dataset where you want to send your data.
6. Once the setup is complete, click "Create pipeline" to finalize the configuration.
7. Set up an automated schedule for data imports to ensure your data stays up-to-date.

Google Ads to BigQuery

Connecting Google Ads to BigQuery using OWOX BI Pipelines allows the integration of advertising data from multiple sources for comprehensive analysis. This enables businesses to store, query, and analyze large volumes of ad data in real-time, facilitating data-driven decisions.

💡 If you want to dive deeper into connecting Google Ads data to BigQuery, read this article on Google Ads to BigQuery.

Facebook Ads to BigQuery

With OWOX BI Pipelines, you can easily upload Facebook Ads data to BigQuery, linking advertising costs with user activities, website data, and CRM purchase orders. This integration gives you a complete view of ad performance.

💡 You can learn more about connecting Facebook Ads data to BigQuery in this article.

LinkedIn Ads to BigQuery

OWOX BI allows you to automatically import LinkedIn Ads data into Google BigQuery, eliminating the need for manual uploads. It collects daily campaign data, including costs, impressions, and clicks, while converting it into your default currency. This integration helps businesses streamline reporting, optimize ad performance, and ensure data accuracy.

💡 You can learn more about connecting LinkedIn Ads data to BigQuery in this article.

Microsoft Ads (Bing Ads) to BigQuery

OWOX BI streamlines transferring Microsoft Ads (Bing Ads) data to BigQuery through an automated pipeline, removing the need for technical expertise. Users receive clean, ready-to-use datasets that update automatically, enabling them to focus on analysis and insights without manual data processing, ultimately improving decision-making efficiency.

💡 You can learn more about connecting Microsoft Ads data to BigQuery in this article.

How to Move Spreadsheet Data to BigQuery

Moving spreadsheet data to BigQuery enables real-time insights and more efficient, scalable data analysis. Here are a few ways to achieve this.

Upload Google Sheets Data to BigQuery

Google Sheets is a simple, convenient tool for small teams and enterprises to perform analysis, build reports, and share insights. However, as data grows and comes from multiple sources, spreadsheets become less effective for storage. At that stage, sending data from Google Sheets to BigQuery can help streamline data processes and handle larger volumes efficiently.

Below are three effective methods for this data integration.

Option #1: Upload Sheets Data to BigQuery with the OWOX BI Reports Extension

Using the OWOX BI Reports Extension, you can easily upload Google Sheets data to BigQuery. This no-code solution simplifies the process, allowing you to map columns, define data types, and automate data transfers for real-time analysis and reporting.

Option #2: Transfer Data from Google Sheets Using the BigQuery Interface

The BigQuery web interface allows you to upload Google Sheets data directly into a new or existing table.

Step 1: Open the BigQuery Console and create a new dataset.

Step 2: In the dataset, select "Create Table" and choose "Drive" as the data source.

Step 3: Enter the Google Sheets URL.

Step 4: Configure the schema before uploading.

Option #3: Connect to BigQuery from Google Sheets with Connected Sheets

Connected Sheets is Google's native tool for quickly uploading and working with BigQuery data directly from Sheets, with no code needed.

Step 1: Open the Spreadsheet

Step 2: Access the Data Menu

Step 3: Get Connected

Step 4: Select Google Cloud Project

Step 5: Choose a Dataset

Step 6: Select a Table or View

Step 7: Create Connected Charts and Pivot Tables

Once connected, you can build charts, pivot tables, and functions that either work with the entire dataset or extract subsets of data directly into Google Sheets.

CSV File Uploads to BigQuery

Uploading CSV files to BigQuery allows businesses to efficiently manage and analyze large datasets that would otherwise be limited by traditional spreadsheet tools. Using BigQuery, you can quickly upload CSV files and leverage powerful features like fast query execution, machine learning, and geographic data analysis.

You can load CSV data into BigQuery using three main methods:

1. Command Line: Use the bq load command to upload data from a local file or cloud storage. This method is efficient for automating recurring tasks with scripts.
2. BigQuery Web UI: Use the web interface for an easy drag-and-drop method to upload CSV files. It's user-friendly and ideal for one-time or manual uploads.
3. BigQuery API: Programmatically load CSV files using languages like Python, offering greater flexibility and control, particularly for large-scale or custom data processing needs. It's perfect for developers integrating data pipelines.

Each method ensures efficient data integration, helping you maximize BigQuery's advanced analytics capabilities.
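To illustrate what schema auto-detection does conceptually, here is a deliberately simplified sketch that guesses a BigQuery type per column from sample string values. BigQuery's actual algorithm samples the file and handles many more types (dates, booleans, nested data), so treat this as an illustration only; the column data is hypothetical.

```python
def guess_bq_type(values):
    """Rough sketch of type detection: try INT64, then FLOAT64, else STRING."""
    def all_parse(cast):
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if all_parse(int):
        return "INT64"
    if all_parse(float):
        return "FLOAT64"
    return "STRING"

# Hypothetical CSV rows: name, clicks, cost
rows = [["Brand", "120", "10.5"], ["Generic", "95", "8.25"]]
columns = list(zip(*rows))
schema = [guess_bq_type(col) for col in columns]
print(schema)  # → ['STRING', 'INT64', 'FLOAT64']
```

Understanding this logic helps explain why auto-detection occasionally misfires (for example, an ID column of digits detected as INT64), and why explicitly specifying the schema is safer for recurring loads.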

Getting Other Business & Finance Data to BigQuery

Integrating business and finance data into BigQuery streamlines analysis and reporting. By utilizing custom connectors or third-party tools, you can automate data imports from various platforms, ensuring comprehensive insights.

Building Custom Connectors

Custom connectors are essential for automating and streamlining data transfers from unique sources, ensuring real-time access to critical information.

By building custom connectors, organizations can extend BigQuery's capabilities to suit their specific data needs.

Process of Setting Up a Custom Connector:

1. Identify the Data Source: Start by determining the external source you need to connect to and understand its API or access mechanism.
2. Choose Tools: Select a programming language like Python and appropriate libraries to manage data transfer and authentication.
3. Authentication: Securely authenticate using OAuth or similar protocols for safe data access between your source and BigQuery.
4. Data Transfer Logic: Write code that extracts, optionally transforms, and loads the data into BigQuery.
5. Automate: Schedule regular data transfers using tools like Google Cloud Scheduler or cron jobs to automate the process.
6. Monitor: Continuously monitor the connector's performance and adjust for any changes in the data source or API.

Example: Custom Connector in Python

This example illustrates how to create a Python-based custom connector that fetches data from an API, transforms it, and loads it into BigQuery.

# Import necessary libraries
import requests
from google.cloud import bigquery
from google.oauth2 import service_account

# Define your credentials and project details
credentials = service_account.Credentials.from_service_account_file('path/to/key.json')
bq_client = bigquery.Client(credentials=credentials, project='your-project-id')

# Fetch data from the external API
api_url = 'https://your.api/source'
data = requests.get(api_url).json()

# Optional: transform your data as needed
def transform_function(records):
    # Placeholder: reshape records to match your BigQuery table schema
    return records

transformed_data = transform_function(data)

# Load data into BigQuery
dataset_id = 'your_dataset'
table_id = 'your_table'
table_ref = f'{bq_client.project}.{dataset_id}.{table_id}'
job = bq_client.load_table_from_json(transformed_data, table_ref)
job.result()  # Wait for the job to complete

It demonstrates the basic steps of authenticating with BigQuery, fetching data from an external source, and optionally transforming the data before uploading it.
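In a production connector, the fetch step usually also needs retries, since external APIs fail transiently. Below is a small, stdlib-only sketch of exponential backoff; the `flaky` fetch function and the delay values are purely illustrative.

```python
import time

def fetch_with_retries(fetch, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call `fetch`, retrying with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# Hypothetical fetch that fails twice before succeeding
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient API error")
    return {"status": "ok"}

result = fetch_with_retries(flaky, sleep=lambda s: None)
print(result, len(calls))  # → {'status': 'ok'} 3
```

Wrapping the `requests.get` call from the connector above in such a helper makes scheduled runs far more robust against momentary outages.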

Automated Data Transfers with Third-Party Tools

Third-party data transfers in the BigQuery Data Transfer Service allow businesses to automate the recurring loading of data from external sources such as Salesforce CRM, Adobe Analytics, and Facebook Ads.

Using third-party tools simplifies the extraction, transformation, and loading (ETL) process, ensuring timely and accurate data ingestion across multiple platforms.

Example: Automated Data Transfer from Facebook Ads to BigQuery with OWOX BI

Using OWOX BI Pipelines, businesses can automate the transfer of Facebook Ads data into BigQuery. OWOX BI simplifies the ETL process, automatically extracting ad performance data and loading it directly into BigQuery for real-time analysis.

1. Data Integration: OWOX BI Pipelines fetch data from Facebook Ads, such as impressions, clicks, and costs, and automate the transfer to BigQuery.
2. ETL Simplified: OWOX BI handles data transformation, ensuring accurate and structured data loading.
3. Automation: Once set up, OWOX BI regularly syncs data, ensuring your BigQuery dataset is always up-to-date without any manual effort.

This automation allows businesses to focus on deriving insights and optimizing campaigns rather than managing data transfers.

How to Use Business Data in BigQuery

Getting data into BigQuery is just the beginning. The next step is turning it into clear, actionable reports. With simple, intuitive tools, turning your data into actionable decisions has never been smoother.

Generate SQL Queries 50X Faster with AI

With OWOX SQL Copilot, you can write SQL queries up to 50 times faster. It integrates directly with BigQuery, allowing users to streamline complex queries without needing advanced SQL skills.

Best of all, it's completely free, boosting productivity and accelerating data-driven insights.

SQL Copilot

Generate Complex SQL Based on Your Context

Transform how you interact with your database: faster, smarter, and error-free.

Get started now

Use Google Sheets as a Window to Corporate Data in BigQuery

Google Sheets can act as a front-end interface for BigQuery, allowing non-technical users to interact with data. With the OWOX BI BigQuery Reports Extension, you can effortlessly visualize BigQuery data directly in Google Sheets. This integration simplifies data access, enabling teams to analyze and report on corporate data and make data-driven decisions more efficiently.

Access BigQuery Data at Your Fingertips

Make BigQuery corporate data accessible for business users. Easily query data, run reports, create pivots & charts, and enjoy automatic updates.

Elevate Your Analytics

FAQ

• What is the easiest way to transfer data from Google Sheets to BigQuery?

  The simplest way is to create a table directly from Drive in the BigQuery interface: select "Create Table", choose "Drive" as the source, and point it at your sheet. For scheduled, no-code imports, tools like the OWOX BI Reports Extension can automate the transfer so your Google Sheets data stays in sync with your BigQuery tables for analysis.

• How can I automate data transfers to BigQuery from multiple sources?

  The BigQuery Data Transfer Service supports automated transfers from various Google products. For non-Google sources, third-party tools like OWOX BI or custom APIs can be used to set up automatic data pipelines, ensuring timely updates from multiple platforms directly into BigQuery without manual uploads.

• What are the differences between first-party and third-party BigQuery data transfers?

  First-party transfers refer to data sourced from Google's products, like Google Ads or YouTube, and are natively supported by the BigQuery Data Transfer Service. Third-party transfers involve external platforms, often requiring custom APIs or third-party tools like OWOX BI for seamless integration.

• Can I transfer marketing analytics data from platforms like Google Ads or Facebook Ads to BigQuery?

  Yes, you can transfer marketing data from platforms like Google Ads using the BigQuery Data Transfer Service. For non-Google platforms like Facebook Ads, third-party tools such as OWOX BI Pipelines offer automated data transfers, helping you consolidate your marketing data in BigQuery for analysis.

• How do I handle large CSV files when uploading them to BigQuery?

  For large CSV files, you can use the BigQuery web interface, command-line tool, or API to upload data directly. Google Cloud Storage is recommended for especially large files, allowing you to first upload the CSV to storage and then import it into BigQuery efficiently.

• Is it possible to create custom connectors for unique data sources in BigQuery?

  Yes, custom connectors can be developed using the BigQuery API or Google Cloud Functions. This allows you to extract data from unique or unsupported sources and transfer it to BigQuery. Third-party integration platforms like OWOX BI also help streamline custom data connections.