How to connect Google Data Studio to a CSV file using BigQuery and Cloud Storage
Let’s say you have data sitting in a very big CSV file. What’s the best way to connect it to Google Data Studio? In this example the file is over 30 MB with more than 35,000 rows, which is too large for Google Sheets. The best option is to upload it to Google Cloud Storage and then access it via Google BigQuery, and in this post I’ll show you how.

Step 1: Create a Storage Bucket within Google Cloud Platform
If you don’t already have a Google Cloud Platform account, you will need to set one up at cloud.google.com.
The next step is to create a Storage Bucket. Click on Storage.

Now click on ‘Create Bucket’.

You will then need to give your bucket a unique name. For this example I’ve used ‘example_bucket_abcdefg’.

Next there are various options for setting up your bucket. You can choose whichever options best suit your needs, or simply go for the default options.
For example, one of the options presented is to choose Multi-region as the Location type.

Once you have finished selecting the options for your bucket, click Create.
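If you prefer the command line, the same bucket can be created with the `gsutil` tool from the Google Cloud SDK. This is a sketch using the bucket name from this example; the `us` multi-region location is an assumption, so swap in whichever location you chose above:

```shell
# Create the bucket (bucket names must be globally unique).
# -l sets the location; "us" is a multi-region location.
gsutil mb -l us gs://example_bucket_abcdefg/
```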

Step 2: Uploading your CSV file to the Storage Bucket
The next step is to upload the CSV file to your newly created storage bucket.
Click ‘Upload files’ and select the file you want.

Your file should then start uploading, as shown below.

Once it has uploaded, you will see it in your storage bucket.
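The upload can also be done from the command line with `gsutil`. Here `my_data.csv` is a placeholder for your local file name:

```shell
# Copy the local CSV into the bucket created in Step 1.
gsutil cp my_data.csv gs://example_bucket_abcdefg/
```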

Step 3: Create a Dataset based on the file in BigQuery
Open up Google BigQuery and click ‘Create Dataset’.

Give it a unique name. In this case I’ve named it ‘example_data_01’.

You will see a notification that the dataset has been created, and you will then be able to view it under your project’s datasets.
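Equivalently, if you have the Cloud SDK installed, the dataset can be created with the `bq` command-line tool, using the dataset ID from this example:

```shell
# Create the dataset in your default project.
bq mk --dataset example_data_01
```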

The next step is to create a table within this dataset.
Step 4: Create a new Table within this Dataset
Next we need to click ‘Create Table’ within our newly created dataset.

When asked where to create the table from we need to select ‘Google Cloud Storage’.

We then need to choose the storage bucket we created a few moments ago, click into it, and select our CSV file.

Once we’ve selected the CSV file there are a few more things we need to do.
We need to give the table a name. In this example I’ve named it ‘example_data_table’.
I’ve also checked the box to auto-detect schema and input parameters. We can then click ‘Create table’.

Once we’ve created the table we will see a notification in the bottom left corner of the screen. Click ‘Go to table’.

We now have our table with all the data accessible using BigQuery.
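Loading the file from Cloud Storage into a table can also be done in one command. This is a sketch assuming the bucket, dataset and table names from this example; `my_data.csv` is a placeholder for your uploaded file, and the header-row flag is an assumption about your CSV:

```shell
# Load the CSV from Cloud Storage into a new BigQuery table,
# letting BigQuery auto-detect the schema and skipping the header row.
bq load --autodetect --source_format=CSV --skip_leading_rows=1 \
  example_data_01.example_data_table \
  gs://example_bucket_abcdefg/my_data.csv
```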

Step 5: Connect to the data using Google Data Studio
The final step is to connect to this data using Google Data Studio.
Open up Google Data Studio and choose the BigQuery connector.

Select the project, dataset and table that we’ve created, then click the correct table and add the data.

Lo and behold, we have the data accessible in Google Data Studio, ready for some very fast visualizations!
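One advantage of having BigQuery in the middle is that you can filter or aggregate the data before Data Studio ever sees it, for example by saving a view and pointing Data Studio at the view instead of the raw table. A sketch, assuming a hypothetical `status` column in the example table:

```shell
# Create a view that Data Studio can connect to instead of the raw table.
bq mk --use_legacy_sql=false \
  --view 'SELECT * FROM example_data_01.example_data_table WHERE status = "active"' \
  example_data_01.example_data_filtered
```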

I hope you found this blog post about Google BigQuery and Cloud Storage useful.
A note on alternatives: it is also possible to connect a CSV file to Data Studio directly, using its built-in CSV upload feature, without BigQuery in the middle. However, that route limits you to 100 MB per dataset and gives you no way to filter the data in BigQuery before it reaches Data Studio. That may work for some cases, but if your data is larger than 100 MB you have no real choice except going with BigQuery.