Best Practices for Integrating GraphQL Data with Your BI Tools

Learn how to integrate GraphQL data into your BI platform for powerful visualizations and data analysis.

Written by Michael Wallace

In today's data-driven world, organizations need to extract and visualize data from many sources to gain actionable insights. With PlusPlus, you can access your learning data using our GraphQL API, but integrating this data into business intelligence (BI) platforms like Tableau, Power BI, Looker, and others can be challenging. This guide walks you through fetching GraphQL data and importing it into these BI tools, enabling you to create compelling visualizations and make informed decisions based on your data.

Integrating GraphQL Data into BI Tools

A common question is, "How can I import my learning data into BI tools to visualize it according to my needs?" Here is an example to help get you started:

1. Fetch Data from GraphQL API

To integrate GraphQL data into BI tools, you must fetch data from the GraphQL API and then transform that data into a format the BI tool can read, such as JSON or CSV. Since each BI tool handles data differently, the following steps will help you standardize the process.

Python Script Example

First, install the required libraries:

```sh
pip install requests pandas
```

Then you can use the following script to fetch data and save it as a CSV.

Note that the query you write and how you parse data out of the results are tightly coupled. Querying different objects means parsing different fields out of the results. Here’s an example to illustrate this:

```python
import requests
import pandas as pd

# Define your GraphQL query for fetching event data
query = """
query {
  events(first: 150, starts_at_after: "20240101T000000-0400", tag: ["Workplace"]) {
    edges {
      node {
        pk
        name
        tags
        url
        cover
      }
    }
  }
}
"""

# Define the URL of your GraphQL endpoint
url = "https://your-graphql-endpoint.com/graphql"

# Define the headers, including authentication if needed
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_ACCESS_TOKEN"
}

# Send the request
response = requests.post(url, json={'query': query}, headers=headers)
response_data = response.json()

# Parse the data into a pandas DataFrame
events = []
for edge in response_data['data']['events']['edges']:
    node = edge['node']
    events.append({
        'pk': node['pk'],
        'name': node['name'],
        'tags': ", ".join(node['tags']),
        'url': node['url'],
        'cover': node['cover']
    })

df = pd.DataFrame(events)

# Save the DataFrame to a CSV file
df.to_csv('events_data.csv', index=False)
```
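The `first: 150` argument caps how many events come back in a single request. If your endpoint follows the Relay connection pattern and also exposes `pageInfo { hasNextPage endCursor }` plus an `after` argument (check your schema first; these fields are an assumption here, not something shown above), you can page through larger result sets with a loop like this sketch:

```python
import requests
import pandas as pd

# NOTE: This assumes the API exposes Relay-style pagination fields
# (pageInfo { hasNextPage endCursor } and an `after` argument).
# Verify these exist in your schema before relying on this sketch.
paged_query = """
query ($after: String) {
  events(first: 150, after: $after, tag: ["Workplace"]) {
    pageInfo { hasNextPage endCursor }
    edges { node { pk name tags url cover } }
  }
}
"""

url = "https://your-graphql-endpoint.com/graphql"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
}

events = []
after = None
while True:
    # Pass the cursor from the previous page (None on the first request)
    response = requests.post(
        url,
        json={"query": paged_query, "variables": {"after": after}},
        headers=headers,
    )
    connection = response.json()["data"]["events"]
    for edge in connection["edges"]:
        node = edge["node"]
        events.append({
            "pk": node["pk"],
            "name": node["name"],
            "tags": ", ".join(node["tags"]),
            "url": node["url"],
            "cover": node["cover"],
        })
    if not connection["pageInfo"]["hasNextPage"]:
        break
    after = connection["pageInfo"]["endCursor"]

pd.DataFrame(events).to_csv("events_data.csv", index=False)
```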

Understanding the Query and Parsing Coupling

The critical point here is that the structure of your GraphQL query determines the structure of the data you receive. This means that if you change the query, you must also change the way you parse the results.

For example, if you decide to query user data instead of event data, the fields you need to extract will be different:

Example for Fetching User Data

Let's say you want to fetch user data instead:

```python
# Define your GraphQL query for fetching user data
query = """
query {
  users(first: 100) {
    edges {
      node {
        id
        username
        email
        dateJoined
      }
    }
  }
}
"""

# Send the request
response = requests.post(url, json={'query': query}, headers=headers)
response_data = response.json()

# Parse the user data into a pandas DataFrame
users = []
for edge in response_data['data']['users']['edges']:
    node = edge['node']
    users.append({
        'id': node['id'],
        'username': node['username'],
        'email': node['email'],
        'date_joined': node['dateJoined']
    })

df = pd.DataFrame(users)

# Save the DataFrame to a CSV file
df.to_csv('users_data.csv', index=False)
```

In this example, the query structure is different, and therefore, the parsing logic also changes to accommodate the new fields (id, username, email, date_joined). This tight coupling between the query and the parsing logic is a critical point to understand for successful data integration.
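One way to soften this coupling is to flatten whatever nodes come back without hard-coding the field names. The sketch below is a minimal helper built on `pandas.json_normalize`; it assumes the query selects a single top-level connection with the usual `edges`/`node` shape, and it does not handle pagination or deeply nested lists:

```python
import requests
import pandas as pd

def fetch_connection_as_dataframe(url, headers, query, connection_name):
    """Run a GraphQL query and flatten one edges/node connection into a
    DataFrame, keeping whatever fields the query selected."""
    response = requests.post(url, json={"query": query}, headers=headers)
    response.raise_for_status()
    edges = response.json()["data"][connection_name]["edges"]
    # json_normalize turns each node's fields into DataFrame columns
    return pd.json_normalize([edge["node"] for edge in edges])

# Usage: the query and the connection name still travel together,
# but the parsing code no longer hard-codes individual fields.
# df = fetch_connection_as_dataframe(url, headers, query, "users")
# df.to_csv("users_data.csv", index=False)
```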

2. Load the CSV File into BI Tools

Once you have the CSV file, you can load it into your BI Tool:

Tableau

  1. Open Tableau.

  2. Connect to Data:

    1. Go to the "Connect" pane on the left side.

    2. Select "Text File" under "To a File".

    3. Choose the `events_data.csv` file you created.

  3. Preview and Load Data:

    1. Tableau will display a preview of your data.

    2. Click "Sheet 1" to start creating visualizations with your data.

Power BI

  1. Open Power BI.

  2. Get Data:

    1. Go to the Home tab and select "Get Data".

    2. Choose "Text/CSV" and select your CSV file.

  3. Load Data:

    1. Power BI will load your data into a table.

    2. You can now create reports and dashboards based on this data.

Looker

  1. Upload Data:

    1. Use Looker’s APIs or the data upload interface to upload your CSV file.

  2. Create LookML Models:

    1. Define LookML models to specify how Looker should interpret your data.

  3. Explore Data:

    1. Use Looker’s explore interface to create visualizations and dashboards.

Automating the Process

For automated data updates, schedule the script with your internal tooling, such as cron (Linux/macOS) or Task Scheduler (Windows). You can also use internal data movement tools to write the data to your data warehouse, join it with other analytical datasets, and move it to platforms like Sheets, Airtable, or Tableau.
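As a starting point, the fetch-and-save logic can be packaged as a small standalone script that a scheduler invokes on a fixed cadence. The sketch below shows one way to do that; the file name, output path, and schedule are placeholders to adapt to your environment:

```python
# fetch_events.py - a cron-friendly entry point (names and paths are placeholders).
import datetime
import logging

import pandas as pd
import requests

URL = "https://your-graphql-endpoint.com/graphql"
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
}
QUERY = """
query {
  events(first: 150, tag: ["Workplace"]) {
    edges { node { pk name tags url cover } }
  }
}
"""

def main():
    logging.basicConfig(level=logging.INFO)
    response = requests.post(URL, json={"query": QUERY}, headers=HEADERS)
    response.raise_for_status()
    edges = response.json()["data"]["events"]["edges"]
    df = pd.json_normalize([edge["node"] for edge in edges])
    # Timestamp the output so each scheduled run keeps its own snapshot
    out_path = f"events_data_{datetime.date.today():%Y%m%d}.csv"
    df.to_csv(out_path, index=False)
    logging.info("Wrote %d rows to %s", len(df), out_path)

if __name__ == "__main__":
    main()

# Example crontab entry to run this every day at 6 AM:
# 0 6 * * * /usr/bin/python3 /path/to/fetch_events.py
```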

Using Internal Data Warehouses and Movement Tools

Some customers prefer to write data to their internal data warehouses. This approach allows them to join GraphQL data with other analytical datasets and use internal data movement tools to send data to various platforms, including Sheets, Airtable, or Tableau. Here’s how you can adapt the process:

  1. Fetch and Save Data: Use the provided Python script to fetch data and save it as a CSV.

  2. Write to Data Warehouse: Load the CSV data into your data warehouse using your preferred method (e.g., ETL tools, custom scripts); see the sketch after this list.

  3. Use Internal Tools: Rely on your internal data movement tools to transfer data from the warehouse to your BI platforms.
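For step 2, if your warehouse is reachable through SQLAlchemy (for example, a Postgres-compatible warehouse), loading the CSV can be as small as the sketch below. The connection string, schema, and table name are placeholders, and your team's ETL tooling may be the better fit for production loads:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string - substitute your warehouse's driver and credentials.
engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5432/analytics")

# Load the CSV produced by the fetch script and append it to a warehouse table.
df = pd.read_csv("events_data.csv")
df.to_sql("plusplus_events", engine, schema="learning", if_exists="append", index=False)
```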

Summary

Integrating GraphQL data into BI tools like Tableau, Power BI, and Looker allows you to create insightful visualizations and drive data-informed decisions. By understanding the coupling between your GraphQL queries and parsing logic, and leveraging internal data movement tools, you can streamline this process and make the most of your data.
