Data Set Configuration Error: Data Studio cannot connect to your data set

A Google Sheets data set is connected to Google Data Studio and then explored in a chart using a date range.
For one or two days the data is visible in the chart and we can also set the date range to that period, but after two days it shows this screen:

error_screenshot

The data set stays connected for some time, sometimes a few days, and then this error occurs:

Data Set Configuration Error
Data Studio cannot connect to your data set. Failed to fetch data from the underlying data set.

asked Jan 21, 2020 at 12:44

Chanchal S

I got the same issue because I had changed the owner of the data source. I fixed it by changing the "data credentials".

answered May 17, 2021 at 11:19

Katie

This issue normally seems to be related to updates in the underlying Spreadsheet data, for example adding new columns. If that's the case, you can refresh the data source fields to regain access to the data. Also check for changes in the file that might break the connection, such as a change of ownership.

If this does not apply to your case, my suggestion is to go to the Data Studio Community site, where you can check similar questions and even get responses from the Data Studio team.

answered Feb 5, 2020 at 16:45

Tlaquetzal

It looks like this issue may stem from the original source being a Sheets file. I recently came across a case where pointing to a view built off a sheet tab had this exact same problem.

We tried changing credentials, refreshing, etc., and nothing worked. It turned out you need to navigate to "Manage Data Sources" -> "Add Dataset" -> and select Google Sheets. You should be prompted to approve the connection to Google Drive. Do this even if you are not literally pulling a sheet. To be clear, our workflow is as follows: the team inputs information in Sheets -> the sheet is loaded into BigQuery -> a view is created off the sheet-backed table in BigQuery so we can manipulate it there -> the dashboard points to that view in BigQuery.

Upon doing so, refresh the connection and you should see data populate the table/chart.

answered Apr 28, 2021 at 16:53

seanwsullivan1

In my case, it was the data source owner.
Changing the owner worked for me, because there was a permission issue with the previous owner:

  1. Click Data Credentials in the top menu.
  2. Update the credentials owner.

answered Jun 17, 2022 at 4:56

Akitha_MJ

I encountered this issue today (https://twitter.com/wey_gu/status/1385101021576372228) with a BigQuery data source being imported into Data Studio. Since the same data source could be imported properly by my project-owner account, I suspected a permission issue.

By checking all predefined IAM roles related to BigQuery, I found that the BigQuery Job User role is needed, as the import action triggers yet another BigQuery job under the hood.

answered Apr 22, 2021 at 5:37

Wey Gu

Go to the Edit Connection option in your data source. From there, select the appropriate data sheet and click the RECONNECT option. It prompts you if any issues are identified in your spreadsheet and automatically fixes them. The data set configuration error should then be gone.

answered Jan 28, 2022 at 11:37

vignesh Subash


First: you can always check out the open-source implementations that others have built for custom Google Data Studio connectors. They are a great source of information. For more details, check out the documentation on Open Source Community Connectors.

Second: my implementation is for a time-tracking system and therefore holds confidential, GDPR-relevant data, so I cannot simply share the response messages. Instead I assembled the code below. It contains authentication, an HTTP GET data fetch, and data conversions. An explanation follows the code. Again, check out the open-source connectors if you need further assistance.

var cc = DataStudioApp.createCommunityConnector();

const URL_DATA = 'https://www.myverysecretdomain.com/api';
const URL_PING = 'https://www.myverysecretdomain.com/ping';
const AUTH_USER = 'auth.user'
const AUTH_KEY = 'auth.key';
const JSON_TAG = 'user';

String.prototype.format = function() {
  // https://coderwall.com/p/flonoa/simple-string-format-in-javascript
  var a = this;
  for (var k in arguments) {
    a = a.replace("{" + k + "}", arguments[k]);
  }
  return a;
}

function httpGet(user, token, url, params) {
  try {
    // this depends on the URL you are connecting to
    var headers = {
      'ApiUser': user,
      'ApiToken': token,
      'User-Agent': 'my super freaky Google Data Studio connector'
    };

    var options = {
      headers: headers
    };

    if (params && Object.keys(params).length > 0) {
      var params_ = [];
      for (const [key, value] of Object.entries(params)) {
        var value_ = value;
        if (Array.isArray(value))
          value_ = value.join(',');

        params_.push('{0}={1}'.format(key, encodeURIComponent(value_)))
      }

      var query = params_.join('&');
      url = '{0}?{1}'.format(url, query);
    }

    var response = UrlFetchApp.fetch(url, options);

    return {
      code: response.getResponseCode(),
      json: JSON.parse(response.getContentText())
    }  
  } catch (e) {
    throwConnectorError(e);
  }
}

function getCredentials() {
  var userProperties = PropertiesService.getUserProperties();
  return {
    username: userProperties.getProperty(AUTH_USER),
    token: userProperties.getProperty(AUTH_KEY)
  }
}

function validateCredentials(user, token) {
  if (!user || !token) 
    return false;

  var response = httpGet(user, token, URL_PING);

  if (response.code == 200)
    console.log('API key for the user %s successfully validated', user);
  else
    console.error('API key for the user %s is invalid. Code: %s', user, response.code);

  return response;
}  

function getAuthType() {
  var cc = DataStudioApp.createCommunityConnector();
  return cc.newAuthTypeResponse()
    .setAuthType(cc.AuthType.USER_TOKEN)
    .setHelpUrl('https://www.myverysecretdomain.com/index.html#authentication')
    .build();
}

function resetAuth() {
  var userProperties = PropertiesService.getUserProperties();
  userProperties.deleteProperty(AUTH_USER);
  userProperties.deleteProperty(AUTH_KEY);

  console.info('Credentials have been reset.');
}

function isAuthValid() {
  var credentials = getCredentials();
  // getCredentials always returns an object, so check its properties
  if (!credentials.username || !credentials.token) {
    console.info('No credentials found.');
    return false;
  }

  var response = validateCredentials(credentials.username, credentials.token);
  return (response != null && response.code == 200);
}

function setCredentials(request) {
  var credentials = request.userToken;
  var response = validateCredentials(credentials.username, credentials.token);

  if (response == null || response.code != 200) return { errorCode: 'INVALID_CREDENTIALS' };

  var userProperties = PropertiesService.getUserProperties();
  userProperties.setProperty(AUTH_USER, credentials.username);
  userProperties.setProperty(AUTH_KEY, credentials.token);

  console.info('Credentials have been stored');

  return {
    errorCode: 'NONE'
  };
}

function throwConnectorError(text) {
  DataStudioApp.createCommunityConnector()
    .newUserError()
    .setDebugText(text)
    .setText(text)
    .throwException();
}

function getConfig(request) {
  // ToDo: handle request.languageCode for different languages being displayed
  console.log(request)

  var params = request.configParams;
  var config = cc.getConfig();

  // ToDo: add your config if necessary

  config.setDateRangeRequired(true);
  return config.build();
}

function getDimensions() {
  var types = cc.FieldType;

  return [
    {
      id:'id',
      name:'ID',
      type:types.NUMBER
    },
    {
      id:'name',
      name:'Name',
      isDefault:true,
      type:types.TEXT
    },
    {
      id:'email',
      name:'Email',
      type:types.TEXT
    }
  ];
}

function getMetrics() {
  return [];
}

function getFields(request) {
  Logger.log(request)

  var fields = cc.getFields();

  var dimensions = this.getDimensions();
  var metrics = this.getMetrics();
  dimensions.forEach(dimension => fields.newDimension().setId(dimension.id).setName(dimension.name).setType(dimension.type));  
  metrics.forEach(metric => fields.newMetric().setId(metric.id).setName(metric.name).setType(metric.type).setAggregation(metric.aggregations));

  var defaultDimension = dimensions.find(field => field.hasOwnProperty('isDefault') && field.isDefault == true);
  var defaultMetric = metrics.find(field => field.hasOwnProperty('isDefault') && field.isDefault == true);

  if (defaultDimension)
    fields.setDefaultDimension(defaultDimension.id);
  if (defaultMetric)
    fields.setDefaultMetric(defaultMetric.id);

  return fields;
}

function getSchema(request) {
  var fields = getFields(request).build();
  return { schema: fields };
}

function convertValue(value, id) {  
  // ToDo: add special conversion if necessary
  switch(id) {      
    default:
      // value will be converted automatically
      return value[id];
  }
}

function entriesToDicts(schema, data, converter, tag) {

  return data.map(function(element) {

    var entry = element[tag];
    var row = {};    
    schema.forEach(function(field) {

      // field has same name in connector and original data source
      var id = field.id;
      var value = converter(entry, id);

      // use UI field ID
      row[field.id] = value;
    });

    return row;
  });
}

function dictsToRows(requestedFields, rows) {
  return rows.reduce((result, row) => ([...result, {'values': requestedFields.reduce((values, field) => ([...values, row[field]]), [])}]), []);
}

function getParams (request) { 
  var schema = this.getSchema();
  var params;

  if (request) {
    params = {};

    // ToDo: handle pagination={startRow=1.0, rowCount=100.0}
  } else {
    // preview only
    params = {
      limit: 20
    }
  }

  return params;
}

function getData(request) {
  Logger.log(request)

  var credentials = getCredentials()
  var schema = getSchema();
  var params = getParams(request);

  var requestedFields;  // fields structured as I want them (see above)
  var requestedSchema;  // fields structured as Google expects them
  if (request) {
    // make sure the ordering of the requested fields is kept correct in the resulting data
    requestedFields = request.fields.filter(field => !field.forFilterOnly).map(field => field.name);
    requestedSchema = getFields(request).forIds(requestedFields);
  } else {
    // use all fields from schema
    requestedFields = schema.map(field => field.id);
    requestedSchema = getFields(request);
  }

  var filterPresent = request && request.dimensionsFilters;
  //var filter = ...
  if (filterPresent) {
    // ToDo: apply request filters on API level (before the API call) to minimize data retrieval from API (number of rows) and increase speed
    // see https://developers.google.com/datastudio/connector/filters

    // filter = ...   // initialize filter
    // filter.preFilter(params);  // low-level API filtering if possible
  }

  // get HTTP response; e.g. check for HTTT RETURN CODE on response.code if necessary
  var response = httpGet(credentials.username, credentials.token, URL_DATA, params);  

  // get JSON data from HTTP response
  var data = response.json;

  // convert the full dataset including all fields (the full schema). non-requested fields will be filtered later on  
  var rows = entriesToDicts(schema, data, convertValue, JSON_TAG);

  // match rows against filter (high-level filtering)
  //if (filter)
  //  rows = rows.filter(row => filter.match(row) == true);

  // remove non-requested fields
  var result = dictsToRows(requestedFields, rows);

  console.log('{0} rows received'.format(result.length));
  //console.log(result);

  return {
    schema: requestedSchema.build(),
    rows: result,
    filtersApplied: filterPresent ? true : false
  };
}
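Outside of Apps Script, the query-string assembly inside httpGet can be exercised on its own. Below is a minimal standalone sketch of just that logic; `buildUrl` and the sample params are names I introduce here for illustration, not part of the connector above.

```javascript
// Standalone sketch of the query-string assembly from httpGet:
// array values are joined with commas, then each value is URL-encoded.
function buildUrl(url, params) {
  if (!params || Object.keys(params).length === 0) return url;
  var parts = [];
  for (const [key, value] of Object.entries(params)) {
    var v = Array.isArray(value) ? value.join(',') : value;
    parts.push(key + '=' + encodeURIComponent(v));
  }
  return url + '?' + parts.join('&');
}

console.log(buildUrl('https://www.myverysecretdomain.com/api', { limit: 20, fields: ['id', 'name'] }));
// https://www.myverysecretdomain.com/api?limit=20&fields=id%2Cname
```

Note that joining array values before encoding means the comma itself gets percent-encoded; whether your API accepts `%2C` or expects a literal comma depends on the backend.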

A sample request that filters for all users with names starting with J.

{
    configParams={}, 
    dateRange={
        endDate=2020-05-14, 
        startDate=2020-04-17
    }, 
    fields=[
        {name=name}
    ], 
    scriptParams={
        lastRefresh=1589543208040
    }, 
    dimensionsFilters=[
        [
            {
                values=[^J.*], 
                operator=REGEXP_EXACT_MATCH, 
                type=INCLUDE, 
                fieldName=name
            }
        ]
    ]
}

The JSON data returned by the HTTP GET contains all fields (full schema).

[ { user: 
     { id: 1,
       name: 'Jane Doe',
       email: 'jane@doe.com' } },
  { user: 
     { id: 2,
       name: 'John Doe', 
       email: 'john@doe.com' } }
]
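The conversion pipeline (entriesToDicts, the regex filter from the sample request, then dictsToRows) can be replayed on this payload outside Apps Script. This is a simplified, standalone sketch of those steps, not the connector code itself:

```javascript
// The raw payload from above: each entry is wrapped in a 'user' tag.
const data = [
  { user: { id: 1, name: 'Jane Doe', email: 'jane@doe.com' } },
  { user: { id: 2, name: 'John Doe', email: 'john@doe.com' } }
];
const schema = [{ id: 'id' }, { id: 'name' }, { id: 'email' }];

// entriesToDicts: unwrap the tag and key each row by field id
const dicts = data.map(el => {
  const row = {};
  schema.forEach(f => { row[f.id] = el.user[f.id]; });
  return row;
});

// high-level filtering: REGEXP_EXACT_MATCH '^J.*' from the sample request
const filtered = dicts.filter(row => /^J.*$/.test(row.name));

// dictsToRows: keep only the requested fields, in request order
const requestedFields = ['name'];
const rows = filtered.map(row => ({ values: requestedFields.map(f => row[f]) }));

console.log(rows);
// [ { values: [ 'Jane Doe' ] }, { values: [ 'John Doe' ] } ]
```

Both sample users match `^J.*`, which is why both rows survive the filter and appear in the final result shown below.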

Once the data is filtered and converted/transformed, you’ll get this result, which is perfectly displayed by Google Data Studio:

{
    filtersApplied=true, 
    schema=[
        {
            isDefault=true, 
            semantics={
                semanticType=TEXT, 
                conceptType=DIMENSION
            }, 
            label=Name, 
            name=name, 
            dataType=STRING
        }
    ], 
    rows=[
        {values=[Jane Doe]}, 
        {values=[John Doe]}
    ]
}

Screenshot of Google Data Studio report


I am currently trying to connect to a data source that contains form data. Some of the fields are optional, for example _vfb_field-1048, which is the country. When I include everything except the optional fields such as _vfb_field-1048, it works fine. As soon as I add those dimensions, it tells me:

Data Studio cannot connect to your data set.
Failed to fetch data from the underlying data set.
Error ID: 929212f4

Any idea what I need to do to get the data to display?

Sample data (not the full data, so don't mind the formatting):

meta: {
  "_vfb_field-1045": "John",
  "_vfb_field-1069": "15551234567",
  "_vfb_field-1046": "Doe",
  "_vfb_field-1047": "email@example.com",
  "_vfb_field-1048": "United States",
  "_vfb_field-1049": "Online",
  "_vfb_form_id": "72",
  "_vfb_entry_id": "13560"
}

meta: {
  "_vfb_field-1045": "John",
  "_vfb_field-1069": "15551234567",
  "_vfb_field-1046": "Doe",
  "_vfb_field-1047": "email@example.com",
  "_vfb_field-1048": "",
  "_vfb_field-1049": "Online",
  "_vfb_form_id": "72",
  "_vfb_entry_id": "13561"
}

function getFields() {
  var cc = DataStudioApp.createCommunityConnector();
  var fields = cc.getFields();
  var types = cc.FieldType;
  var aggregations = cc.AggregationType;

  fields.newDimension()
      .setId('slug')
      .setName('Slug')
      .setType(types.TEXT);

  fields.newDimension()
      .setId('country')
      .setName('Country')
      .setType(types.TEXT);

  fields.newDimension()
      .setId('form')
      .setName('Form')
      .setType(types.TEXT);


  fields.newMetric()
      .setId('id')
      .setName('ID')
      .setType(types.NUMBER)
      .setAggregation(aggregations.SUM);

  return fields;
}
function getSchema(request) {
  var fields = getFields().build();
  return {'schema': fields};
}

function responseToRows(requestedFields, response) {
  // Transform parsed data and filter for requested fields
  return response.map(function(dailyDownload) {
    var row = [];
    requestedFields.asArray().forEach(function(field) {
      switch (field.getId()) {
        case 'slug':
          return row.push(dailyDownload.slug);
        case 'country':
          // Hyphenated keys must be read with bracket notation: dot access
          // is parsed as a subtraction (meta._vfb_field - 1048 => NaN).
          // Fall back to '' so optional/empty fields never push undefined.
          return row.push(dailyDownload.meta['_vfb_field-1048'] || '');
        case 'id':
          return row.push(dailyDownload.id);
        case 'form':
          return row.push(dailyDownload.meta._vfb_form_id);
        default:
          return row.push('');
      }
    });
    return {values: row};
  });
}
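Object keys containing hyphens, like `_vfb_field-1048`, cannot be read with dot notation: JavaScript parses `meta._vfb_field-1048` as the expression `meta._vfb_field - 1048`, which evaluates to NaN. A minimal standalone sketch (with a hypothetical `meta` object) showing the difference, plus a `|| ''` guard for optional fields:

```javascript
// Hypothetical form entry; keys mirror the sample data above.
const meta = {
  '_vfb_field-1048': 'United States',
  '_vfb_form_id': '72'
};

// Dot access parses as (meta._vfb_field - 1048); meta._vfb_field is
// undefined, and undefined - 1048 evaluates to NaN.
const wrong = meta._vfb_field - 1048;

// Bracket notation reads the hyphenated key literally; the || '' guard
// keeps optional or empty fields from pushing undefined into a row.
const country = meta['_vfb_field-1048'] || '';
const missing = meta['_vfb_field-9999'] || '';

console.log(Number.isNaN(wrong)); // true
console.log(country);             // United States
console.log(missing === '');      // true
```

Rows that contain NaN or undefined values are a common cause of the generic "Failed to fetch data from the underlying data set" error, since the connector's response no longer matches the declared schema.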



