
==============================================================================
Magnitude Simba Google BigQuery ODBC Data Connector Release Notes
==============================================================================

The release notes provide details of enhancements, features, known issues, and workflow changes in Simba Google BigQuery ODBC Connector 2.3.5, as well as the version history.

2.3.5 ========================================================================

Released 2021-05-07

Enhancements & New Features

  • Updated documentation

Instructions on using the ClientId and ClientSecret properties for authentication are included in the Installation and Configuration Guide.
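
A minimal sketch of a connection string that supplies these properties for user authentication follows; the values are placeholders, the OAuthMechanism, RefreshToken, and Catalog keys are assumptions here, and the authoritative list of required properties is in the Installation and Configuration Guide.

  Driver=Simba ODBC Driver for Google BigQuery;OAuthMechanism=1;ClientId=<your client ID>;ClientSecret=<your client secret>;RefreshToken=<your refresh token>;Catalog=<your project>;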

Resolved Issues
The following issues have been resolved in Simba Google BigQuery ODBC Connector 2.3.5.

  • [GAUSS-1264] If the Dataset drop-down list is opened immediately after the DSN dialog box is opened, the connector terminates unexpectedly.

  • [GAUSS-1281] When using Service Authentication, an unparseable KeyFile value causes the connector to terminate unexpectedly.

  • [GAUSS-1282] When the KeyFile property is set to a key file path, the connector incorrectly returns an error.

Known Issues
The following are known issues that you may encounter due to limitations in the data source, the connector, or an application.

  • The connector no longer supports parameters in the exception block.

This is a limitation of the Google BigQuery server, discovered in March 2021.

  • On macOS or Linux platforms, when the connector converts SQL_DOUBLE data to SQL_C_CHAR or SQL_C_WCHAR, values that are small or large enough to require scientific notation may have a 0 prepended to the exponent (for example, 1.0E+010 instead of 1.0E+10).

This is a limitation of Google BigQuery. For a list of BigQuery data types that the connector maps to the SQL_DOUBLE ODBC type, see the Installation and Configuration Guide.

  • When casting data, you must specify the data type according to Google BigQuery standards.

When casting data to a specific data type, you must use the corresponding data type name shown in the "Casting" section of the Query Reference: https://cloud.google.com/bigquery/sql-reference/functions-and-operators#casting

For example, to cast the "salary" column to the INTEGER type, you must specify INT64 instead of INTEGER:

  SELECT position, CAST(salary AS INT64) FROM Employee
  • When using the Standard SQL dialect, the connector's ODBC escape functionality is subject to the following limitations:

    • Standard SQL does not support the seed in the RAND([seed]) scalar function. As a result, the connector maps both RAND() and RAND(seed) (for example, RAND(6)) to RAND().

    • For the following scalar functions, BigQuery only returns values in UTC, but ODBC expects the values in local time:
      • CURDATE()
      • CURRENT_DATE()
      • CURRENT_TIME[(TIME_PRECISION)]
      • CURRENT_TIMESTAMP[(TIME_PRECISION)]
      • CURTIME()
      • NOW()

    • Time precision values are not supported for the CURRENT_TIME[(TIME_PRECISION)] and CURRENT_TIMESTAMP[(TIME_PRECISION)] scalar functions.

    • TIME data types are not supported for the following scalar functions:
      • EXTRACT(interval FROM datetime)
      • TIMESTAMPADD(interval,integer_exp,timestamp_exp)
      • TIMESTAMPDIFF(interval,timestamp_exp1,timestamp_exp2)

      For TIMESTAMPADD and TIMESTAMPDIFF, only the TIMESTAMP and DATE data types are supported.

    • When calling the TIMESTAMPADD() scalar function to work with DAY, WEEK, MONTH, QUARTER, or YEAR intervals, the connector escapes the function and calls DATE_ADD() instead. DATE_ADD() only supports DATE types, so time information is lost if the function is called on TIMESTAMP data (see the example after this list).

    • When calling the TIMESTAMPDIFF() scalar function to work with DAY, MONTH, QUARTER, or YEAR intervals, the connector escapes the function and calls DATE_DIFF() instead. DATE_DIFF() only supports DATE types, so time information is lost if the function is called on TIMESTAMP data.

    • For the BIT_LENGTH scalar function, only the STRING and BYTES data types are supported. This behavior aligns with the SQL-92 specification, but not the ODBC specification.
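
For example, the following escape sequence (a sketch; the column and table names are hypothetical) is rewritten by the connector to use DATE_ADD() because it uses a DAY interval, so the time-of-day portion of a TIMESTAMP column would be lost:

  SELECT {fn TIMESTAMPADD(SQL_TSI_DAY, 7, created_ts)} FROM Orders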

  • When using the Legacy SQL dialect, the connector's ODBC escape functionality is subject to the following limitations:

    • For the following scalar functions, BigQuery only returns values in UTC, but ODBC expects the values in local time:
      • CURDATE()
      • CURRENT_DATE()
      • CURRENT_TIME[(TIME_PRECISION)]
      • CURRENT_TIMESTAMP[(TIME_PRECISION)]
      • CURTIME()

    • Time precision values are not supported for the CURRENT_TIME[(TIME_PRECISION)] and CURRENT_TIMESTAMP[(TIME_PRECISION)] scalar functions.

    • For the following scalar functions, TIME data types are not supported. Only the TIMESTAMP and DATE data types are supported.
      • TIMESTAMPADD(interval,integer_exp,timestamp_exp)
      • TIMESTAMPDIFF(interval,timestamp_exp1,timestamp_exp2)

Workflow Changes =============================================================

The following changes may disrupt established workflows for the connector.

2.3.5 -----------------------------------------------------------------------

  • [GAUSS-1246] Removed support for macOS earlier than 10.14

Beginning with this release, the connector no longer supports macOS versions earlier than 10.14. For a list of supported macOS versions, see the Installation and Configuration Guide.

2.2.4 ------------------------------------------------------------------------

  • [GAUSS-980] Removed support for the Visual C++ Redistributable for Visual Studio 2013

Beginning with this release, the driver no longer supports this version of the dependency, and requires Visual C++ Redistributable for Visual Studio 2015 instead.

2.2.2 ------------------------------------------------------------------------

  • [GAUSS-875] New service endpoints

The driver now uses a new set of service endpoints to connect to the Google BigQuery API. The previous service endpoints have been deprecated. For a list of the new endpoints, see the "Service Endpoints" section of the Installation and Configuration Guide.

  • [GAUSS-897] Precedence for default large result dataset

If the Use Default _bqodbc_temp_tables Large Results Dataset check box is selected (the UseDefaultLargeResultsDataset property is set to 1) and a dataset is specified in the Dataset Name For Large Result Sets field (the LargeResultsDataSetID property), the driver now uses the default _bqodbc_temp_tables dataset. For more information, see the Installation and Configuration Guide.
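
For example, with both options set as in the following connection-string fragment (the dataset name is hypothetical), the driver now uses the default _bqodbc_temp_tables dataset rather than my_results_dataset:

  UseDefaultLargeResultsDataset=1;LargeResultsDataSetID=my_results_dataset;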

2.2.0 ------------------------------------------------------------------------

  • Linux support changes

Beginning with this release, the Linux version of the driver now requires glibc 2.17 or later to be installed on the target machine.

As a result, the driver no longer supports CentOS 6 or Red Hat Enterprise Linux (RHEL) 6. Only CentOS 7, RHEL 7, and SUSE Linux Enterprise Server (SLES) 11 and 12 are supported.

2.1.22 -----------------------------------------------------------------------

  • [GAUSS-653] Updated large result set behavior

The driver's behavior for handling large result sets with legacy SQL has changed. When the driver sends a query, it checks whether the "Allow Large Results" option is enabled and whether a dataset name is specified. If the option is enabled, it requests a temporary large result set for your data. This data storage has cost implications for your BigQuery account; consult the BigQuery service documentation for details.

2.1.14 -----------------------------------------------------------------------

  • Minimum TLS Version

Beginning with this release, the driver requires a minimum version of TLS for encrypting the data store connection. By default, the driver requires TLS version 1.2. This requirement may cause existing DSNs and connection strings to stop working, if they are used to connect to data stores that use a TLS version earlier than 1.2.

To resolve this, in your DSN or connection string, set the Minimum TLS option (the Min_TLS property) to the appropriate version of TLS for your server. For more information, see the Installation and Configuration Guide.
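
For example, a connection string for a server that only supports TLS 1.1 might include the following fragment (a sketch; confirm the accepted values in the Installation and Configuration Guide):

  Min_TLS=1.1;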

  • Large result set handling

If you have a default destination set for large datasets but have not enabled the Allow Large Result Sets option (the AllowLargeResults property), the driver reports an error.

To resolve this, enable the Allow Large Result Sets option (the AllowLargeResults property).
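
For example, the following connection-string fragment (the dataset name is hypothetical) enables the option alongside the large-results destination:

  AllowLargeResults=1;LargeResultsDataSetID=my_results_dataset;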

Version History ==============================================================

2.3.4 ------------------------------------------------------------------------

Released 2021-04-29

Enhancements & New Features

  • [GAUSS-1176] Support for plain text JSON objects as key files

You can now substitute a plain text JSON object in place of a JSON or P12 key file path using the KeyFile property. To do this, set the KeyFile property to either a plain text JSON object or a key file path. The existing KeyFilePath property can still be used for key file paths only. For more information, see the Installation and Configuration Guide.
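
For example, a DSN entry in odbc.ini might supply the key directly (a sketch; the JSON shown is truncated and illustrative):

  KeyFile={"type": "service_account", "project_id": "...", "private_key": "...", "client_email": "..."}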

  • [GAUSS-1193] Improvement to Email option

The Email option is no longer required to use JSON key files. If no Email value is provided when using Service Authentication, the connector attempts to read it from the provided JSON key file. The Email option is still required for P12 key files. For more information, see the Installation and Configuration Guide.

  • [GAUSS-1248][GAUSS-1258] Upgraded third-party libraries

The connector has been updated to use the following libraries:
- libcurl version 7.74.0 with control flow guard enabled (previously 7.68.0)
- OpenSSL version 1.1.1k (previously 1.1.1i)

  • [GAUSS-5298] Support for SLES 15

The connector now supports SUSE Linux Enterprise Server (SLES) 15.

2.3.3 ------------------------------------------------------------------------

Released 2021-02-26

Enhancements & New Features

  • [GAUSS-781] Improved fetch handling

The mechanism with which the connector fetches query results over the REST API has been refactored, and performance has improved.

  • [GAUSS-1117] Improved error handling

When using SELECT, DML, and DDL statements, the connector now returns a more specific error message for "table not found" scenarios.

  • [GAUSS-1158] Support for additional DDL and DML keywords

The connector now supports additional DDL and DML keywords. For more information, see:
- https://cloud.google.com/blog/topics/developers-practitioners/smile-new-user-friendly-sql-capabilities-bigquery
- https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language

  • [GAUSS-1170] EXECUTE IMMEDIATE support

The connector now supports the EXECUTE IMMEDIATE script keyword.

  • Updated OpenSSL support

The connector now uses OpenSSL 1.1.1i for encryption of data. Previously, the connector used OpenSSL 1.1.1g.

Resolved Issues
The following issues have been resolved in Simba Google BigQuery ODBC Connector 2.3.3.

  • [GAUSS-1104] When the key-value pair for a required setting is missing from odbc.ini, the connector does not return an error message about the missing setting and causes iodbctest to exit.

  • [GAUSS-1159] When the value of StrLen_or_IndPtr in SQLBindParameter is very large, the connector becomes unresponsive.

This issue has been resolved. The connector now returns the error message "Request entity too large."

  • [GAUSS-1166] When parameterizing column names for use in queries, the connector returns an error message.

  • [GAUSS-1175] In some cases, when querying, large result datasets are not found.

  • [GAUSS-1177] In some cases, when using the User or Server Authentication OAuth mechanisms to request refresh tokens with RequestGoogleDriveScope enabled, the connector does not request access to Google Drive.

2.3.2 ------------------------------------------------------------------------

Released 2020-11-27

Enhancements & New Features

  • [GAUSS-1101] Support for BIGNUMERIC data

The driver now supports data of type BIGNUMERIC. For more information, see the Installation and Configuration Guide.

  • [GAUSS-1108] Improved driver logic

The driver now uses the response from the BigQuery API call made in the SQLExecute/SQLExecDirect ODBC call to preload the table with data.

Resolved Issues
The following issue has been resolved in Simba ODBC Driver for Google BigQuery 2.3.2.

  • [GAUSS-1135] In some cases, when DefaultStringColumnLength is set to a large value, SQLGetTypeInfo() terminates unexpectedly.

2.3.1 ------------------------------------------------------------------------

Released 2020-10-16

Enhancements & New Features

  • [GAUSS-1105][GAUSS-1103] Updated logging configurations

You can now configure logging for the current connection by setting the logging configuration properties in the DSN or in a connection string. For more information, see the Installation and Configuration Guide.
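
For example, a connection string might enable per-connection logging with a fragment such as the following (a sketch assuming the standard Simba LogLevel and LogPath keys and a placeholder path; see the Installation and Configuration Guide for the supported properties and values):

  LogLevel=4;LogPath=C:\BigQueryLogs;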

  • [GAUSS-1108] Updated OpenSSL support

The driver now uses OpenSSL 1.1.1g for encryption of data. Previously, the driver used OpenSSL 1.1.1d.

Resolved Issues
The following issues have been resolved in Simba ODBC Driver for Google BigQuery 2.3.1.

  • [GAUSS-916] The REST API incorrectly retains trailing zeroes in nested TIMESTAMP data.

  • [GAUSS-1087] In some cases, when querying a dataset containing long strings, the driver returns the following error: "Error interacting with REST API: Response too large to return".

  • [GAUSS-1107] In some cases, when making calls to SQLTables and projects are specified in the AdditionalProjects property, the driver takes longer to return data than expected.

  • [GAUSS-1111] The driver populates some data in nested ARRAY and STRUCT formats differently from flat data, as follows:
    • BOOL: A flat BOOL is converted to SQL_BIT, and populated as 1 or 0. A nested BOOL is incorrectly populated as "true" or "false".
    • BYTES: A flat BYTES is decoded to SQL_VARBINARY. A nested BYTES is not decoded.
    • DATETIME: A flat DATETIME trims trailing zeroes. A nested DATETIME incorrectly retains trailing zeroes.
    • TIME: A flat TIME is returned as SQL_TIME with microsecond precision. A nested TIME is returned as a formatted string, such as "00:00:00".

This issue has been resolved. The driver now correctly populates the data.

  • [GAUSS-1118] When querying datasets in a non-US location, the driver returns the following error: "Error interacting with REST API: Not found: Job #".

  • [GAUSS-1129] When calling the SQLDescribeCol function, the driver returns incorrect values.

  • [GAUSS-1131] In some cases, due to changes in the Google BigQuery API, DDL statements in scripts cause the driver to terminate unexpectedly.

2.3.0 ------------------------------------------------------------------------

Released 2020-07-29

Enhancements & New Features

  • [GAUSS-1070] High-throughput API for anonymous tables

The driver now uses the high-throughput API for all queries whose results meet the criteria defined by the HTAPI_MinResultsSize and HTAPI_MinActivationRatio connection properties. To enable the high-throughput API for large result sets, select the Enable High-Throughput API check box (set the EnableHTAPI property to 1). For more information, see the Installation and Configuration Guide.
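
For example, the high-throughput API might be enabled with a connection-string fragment such as the following (a sketch; the threshold values are placeholders, not recommended settings):

  EnableHTAPI=1;HTAPI_MinResultsSize=<minimum result size>;HTAPI_MinActivationRatio=<activation ratio>;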

  • [GAUSS-1048] Google BigQuery jobs.query API

The driver now preferentially uses the jobs.query API for queries executed by SQLExecDirect.

Note that the jobs.query API is not used for the following query types:
- Parameterized queries
- SCRIPT statements
- Queries which require the use of a destination table for large results

Resolved Issues
The following issue has been resolved in Simba ODBC Driver for Google BigQuery 2.3.0.

  • [GAUSS-1067] On Windows, the registry file for the 64-bit driver uses improper encoding.
