Working with large datasets
April 2017 - last edited April 2017
I have Data Studio connected to a PostgreSQL database hosted on Google Cloud. One of my tables has about 2.5 million rows.
Everything worked fine while there were only a few thousand records, but now that we have reached this size we have started to see problems - many of our charts show "configuration error".
Limiting the date range or filtering in any other way does not help; it appears that Data Studio always pulls back all of the data and applies the filters afterwards.
There isn't currently a way to filter the dataset before it's pulled into Data Studio, so unless you can do this at the source and strip out some of the data (I appreciate this may not be something you want to do), the only other option I can suggest is Google BigQuery.
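One way to "strip out data at the source", as suggested above, is to point Data Studio at a pre-aggregated view instead of the raw table. The sketch below is a minimal, hypothetical illustration of that idea: it uses Python's built-in sqlite3 module purely so the example is self-contained and runnable - in the original poster's setup the equivalent would be a `CREATE VIEW` (or `CREATE MATERIALIZED VIEW`) run directly in PostgreSQL. The table and column names (`events`, `day`, `amount`) are invented for the example.

```python
import sqlite3

# Stand-in for the 2.5M-row production table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("2017-04-%02d" % (i % 30 + 1), float(i)) for i in range(10_000)],
)

# Pre-aggregate at the source so the reporting tool sees one row
# per day instead of every raw event.
conn.execute("""
    CREATE VIEW daily_summary AS
    SELECT day, COUNT(*) AS n_events, SUM(amount) AS total
    FROM events
    GROUP BY day
""")

# The view collapses 10,000 raw rows down to one row per distinct day.
rows = conn.execute("SELECT COUNT(*) FROM daily_summary").fetchone()[0]
print(rows)  # → 30
```

Connecting Data Studio to a summary view like `daily_summary` keeps the row count small enough to avoid the errors described above, at the cost of losing access to the raw per-event detail in the report.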