For documentation regarding the Power BI connector, please refer to the Power BI connector build guide.
- This project uses Google's C++ Style Guide as a basis for C/C++ usage and formatting.
- Some formatting is set using the `.clang-format` file at the base of the repository. Other options for Visual Studio can be imported from the `VS-C++-Settings-Export.vssettings` file, also found at the root of the repository.
- To use an IAM access key ID and secret key to access AWS Timestream, you need to specify the following environment variables.

  | Environment Variable | Description |
  | --- | --- |
  | `AWS_ACCESS_KEY_ID` | `<AWS Access Key ID>` |
  | `AWS_SECRET_ACCESS_KEY` | `<AWS Secret Access Key>` |
  | `AWS_SESSION_TOKEN` | `<Session Token>`, if your account is enabled with MFA |
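  For example, on Linux/macOS the credentials can be exported in the shell before running the driver or tests (all values below are placeholders):

  ```sh
  # Placeholder IAM credentials -- substitute your own values.
  export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
  export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
  # Only needed for MFA / temporary credentials.
  export AWS_SESSION_TOKEN="xxxxxxxx"
  ```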
- [Optional] Log configurations

  Set these two variables only if you would like to set a custom log path or log level for connection tests; they are completely optional.

  `TIMESTREAM_LOG_PATH=<path_to_log_file>` (e.g. `"C:\\Users\\BitQuillUser\\Desktop\\Timestream ODBC Driver"`)

  The user needs to ensure that the directory mentioned in the log file path exists, or the driver will ignore the passed-in value and create the log file in the default log path. Do not include a slash at the end of the log path.

  The log path indicates where the log file is stored. The log file name is formatted as `timestream_odbc_YYYYMMDD.log`, where `YYYYMMDD` (e.g., 20220225 for Feb 25th, 2022) is the date of the first log message.

  `TIMESTREAM_LOG_LEVEL=<log_level>`. The default is `2` (WARNING level). Possible values:

  - 0: OFF
  - 1: ERROR
  - 2: WARNING
  - 3: INFO
  - 4: DEBUG
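  A minimal sketch of a custom logging setup for a connection-test run in Windows PowerShell (the directory is the example path above and must already exist):

  ```powershell
  # Custom log directory (no trailing slash); the directory must exist beforehand.
  $Env:TIMESTREAM_LOG_PATH = "C:\Users\BitQuillUser\Desktop\Timestream ODBC Driver"
  # 4 = DEBUG, the most verbose level.
  $Env:TIMESTREAM_LOG_LEVEL = "4"
  ```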
  More details about logging can be found in `support\troubleshooting-guide.md`.

- AWS Logs

  This ODBC driver uses AWS logs in addition to its own logging. Please see the official AWS documentation for how AWS logs work. The logs will be stored in the executable directory following the default naming pattern of `aws_sdk_<date>.log`.
Test data is needed for the tests to run successfully, and it only needs to be loaded once for each AWS account. Check here for instructions on loading the test data.
- Microsoft Visual Studio (Community 2022 Verified)
  - Desktop Development for C++
  - Visual Studio core editor
  - C++ ATL for latest v142 build tools (x86 & x64)
  - C++ MFC for latest v142 build tools (x86 & x64)
  - WiX Toolset v3 Schemas for Visual Studio. If you encounter "wix toolset requires .net 3.5.1 to be enabled", follow the steps below.
  - WiX Toolset Visual Studio 2019 Extension
- WiX Installer (3.11)
  - Ensure the path to the WiX executables (e.g. `C:\Program Files (x86)\WiX Toolset v3.11\bin`) is added to PATH.
- Boost Test Framework
- Install VCPKG
- `cd vcpkg`
- Install Boost and the AWS SDK:

  ```
  .\vcpkg install boost-test:x64-windows boost-asio:x64-windows boost-chrono:x64-windows boost-interprocess:x64-windows boost-regex:x64-windows boost-system:x64-windows boost-thread:x64-windows --recurse
  ```

- Run `.\vcpkg integrate install` to implicitly add Include Directories, Link Directories, and Link Libraries for all packages installed with Vcpkg to all VS2015, VS2017, and VS2019 MSBuild projects.
- On the Developer PowerShell, run one of the build scripts to create an initial compilation.
  - E.g.: `.\build_win_debug64.ps1`
- Navigate to the `build\odbc\cmake` folder and use the generated solution file, `Timestream-ODBC.sln`, for source code development and testing.
- Open a 64-bit command shell or 64-bit PowerShell window, as Administrator, and run the command below. Ensure that backslashes are used in your command.

  ```
  .\<repo-folder>\src\odbc\install\install_amd64.cmd <repo-folder>\build\odbc\cmake\Debug\timestream.odbc.dll
  ```
- Now you're ready to begin configuration for integration and unit testing.
- Once configured, run the tests:
  - Run integration tests: `.\build\odbc\bin\<Release or Debug>\timestream-odbc-integration-tests.exe`
  - Run unit tests: `.\build\odbc\bin\<Release or Debug>\timestream-odbc-unit-tests.exe`
Driver installer detects existing driver and isn't allowing installation

- Go to the `Apps & Features` settings page and search for `Timestream ODBC Driver`. If there is already a Timestream ODBC driver installed, manually uninstall the driver (uninstalling the C++ Redistributable is optional; for details see the Windows installation guide), and then attempt installing again.
- If the driver is not showing under `Apps & Features`, remove the driver registry key `Computer\HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBCINST.INI\Amazon Timestream ODBC Driver` using Registry Editor, and then attempt installing again.
- If you wish to re-register the driver, follow step 6 under the Windows developer setup guide.
- Install dependencies

  ```
  brew install cmake
  brew install libiodbc
  ```

  - You may need to unlink `unixodbc` if you already have it installed: `brew unlink unixodbc`.
  - You may need to run `brew link --overwrite --force libiodbc`.

  ```
  brew install boost
  ```

- If creating a debug build (`./build_mac_debug64.sh`), LLVM is required.
  - If you only have Xcode Command Line Tools, use the LLVM included with Xcode by modifying the PATH with `export PATH=/Library/Developer/CommandLineTools/usr/bin/:$PATH`. Ensure this Xcode path comes first in `$PATH`. If an error occurs, check that clang and llvm are under the folder `Library/Developer/CommandLineTools/usr/bin`.
  - If you have the Xcode application, to ensure LLVM and CMake are compatible, use the LLVM included with Xcode by modifying the PATH with `export PATH=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/:$PATH`.
- Run one of the build scripts to create an initial compilation.
  - E.g.: from the root of the Timestream ODBC repository, run `./build_mac_release64.sh`
  - The generated driver files will be placed in the `build/odbc/lib` folder.
- Set the environment variable `ODBCINSTINI`. On a developer's machine, set it to `<repo-folder>/build/odbc/lib/timestream-odbc-install.ini`.
- Set the environment variable `DYLD_LIBRARY_PATH`. On a developer's machine, set it to `<repo-folder>/build/odbc/lib:$DYLD_LIBRARY_PATH`.
- Run the following command to register the ODBC driver: `./scripts/register_driver_unix.sh`.
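  Taken together, a minimal sketch of this setup, assuming the repository is cloned at `~/timestream-odbc` (a placeholder path):

  ```sh
  # Placeholder repository location -- adjust to your checkout.
  REPO=~/timestream-odbc
  export ODBCINSTINI="$REPO/build/odbc/lib/timestream-odbc-install.ini"
  export DYLD_LIBRARY_PATH="$REPO/build/odbc/lib:$DYLD_LIBRARY_PATH"
  # Register the driver with the script shipped in the repository.
  cd "$REPO" && ./scripts/register_driver_unix.sh
  ```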
. - Now you're ready to begin configuration for integration and unit testing.
- Once configured, run the tests:
  - Run integration tests: `./build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false`
  - Run unit tests: `./build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false`
You should have all the following environment variables set. If you encounter any issues, check that these variables have all been set correctly:

- `AWS_ACCESS_KEY_ID` (from prerequisites)
- `AWS_SECRET_ACCESS_KEY` (from prerequisites)
- `ODBCINSTINI`
- `DYLD_LIBRARY_PATH`
- Install all dependencies
  - Ubuntu dev dependencies, e.g.:

    ```
    apt-get -y update \
    && apt-get -y install wget \
       curl \
       libcurl4-openssl-dev \
       libssl-dev \
       uuid-dev \
       zlib1g-dev \
       libpulse-dev \
       gcc \
       gcc-multilib \
       g++ \
       g++-multilib \
       build-essential \
       valgrind \
       libboost-all-dev \
       libsasl2-dev \
       lcov \
       git \
       unixodbc-dev \
       zip \
       unzip \
       tar \
       rpm
    ```
- Run one of the build scripts to create an initial compilation, e.g. `./build_linux_release64_deb.sh`
- Set all necessary environment variables and run the following command to register the ODBC driver: `./scripts/register_driver_unix.sh`
- Set environment variables for testing, and double-check that all dev environment variables are set by running `scripts/env_variables_check.sh`.
- Now you're ready to begin configuration for integration and unit testing.
- Once configured, run the tests under the repository root folder:
  - Run integration tests: `./build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false`
  - Run unit tests: `./build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false`
- Install all dependencies
  - openSUSE dev dependencies, e.g.:

    ```
    zypper refresh \
    && zypper install wget \
       curl \
       gcc \
       gcc-c++ \
       valgrind \
       lcov \
       git \
       zip \
       unzip \
       tar \
       rpm \
       libopenssl-3-devel \
       openssl \
       cmake \
       libcurl-devel \
       unixODBC \
       unixODBC-devel \
       rpmbuild \
       libboost_regex-devel \
       libboost_system-devel \
       libboost_thread-devel \
       libboost_chrono-devel \
       libboost_test-devel \
       boost-devel
    ```
- Run one of the build scripts to create an initial compilation, e.g. `./build_linux_release64_deb.sh`
- Set all necessary environment variables and run the following command to register the ODBC driver: `sudo ./scripts/register_driver_unix.sh`
- Set environment variables for testing, and double-check that all dev environment variables are set by running `scripts/env_variables_check.sh`.
- Now you're ready to begin configuration for integration and unit testing.
- Once configured, run the tests under the repository root folder:
  - Run integration tests: `./build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false`
  - Run unit tests: `./build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false`
- Install all dependencies
  - Ubuntu dev dependencies, e.g.:

    ```
    apt-get -y update \
    && apt-get -y install wget \
       curl \
       libcurl4-openssl-dev \
       libssl-dev \
       uuid-dev \
       zlib1g-dev \
       libpulse-dev \
       gcc \
       gcc-multilib \
       g++ \
       g++-multilib \
       build-essential \
       valgrind \
       libboost-all-dev \
       libsasl2-dev \
       lcov \
       git \
       unixodbc-dev \
       zip \
       unzip \
       tar \
       rpm
    ```
- Install cmake: `apt-get install cmake`
  - If the installed version of cmake is lower than 3.20, which is the minimum required version, follow the steps below to build cmake 3.20 (or above) from source.
    - Download cmake 3.20 or above from https://github.com/Kitware/CMake/releases/
    - Under the cmake source directory, create a build directory: `mkdir build`
    - Run `cmake` under the source directory
    - `cd build` and run `make`
    - Install the new cmake: `sudo make install`
    - Add `/usr/local/bin` to PATH and make sure it is ahead of the lower-version cmake path: `export PATH=/usr/local/bin:$PATH`
- Run one of the build scripts to create an initial compilation, e.g. `./build_linux_release32_deb.sh`
- Set all necessary environment variables and run the following command to register the ODBC driver: `./scripts/register_driver_unix.sh`
- Set environment variables for testing, and double-check that all dev environment variables are set by running `scripts/env_variables_check.sh`.
- Now you're ready to begin configuration for integration and unit testing.
- Once configured, run the tests under the repository root folder:
  - Run integration tests: `./build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false`
  - Run unit tests: `./build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false`
When running integration tests you may encounter the error message: `01000: [unixODBC][Driver Manager]Can't open lib 'Amazon Timestream ODBC Driver' : file not found`

Running `./scripts/register_driver_unix.sh` and copying `/etc/odbcinst.ini` to `$HOME/.odbcinst.ini` and `/etc/odbc.ini` to `$HOME/.odbc.ini` may help the driver and DSN be discovered.
To generate code coverage reports, you need to use the debug builds for macOS/Linux, run the tests, and use `gcovr` to compile the report. For an installation guide and usage details, see the official gcovr documentation. If you want a detailed report, generate it with the `--html-details` option.
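For instance, a minimal sketch run from the repository root after a debug build and a test run (the output location is a placeholder):

```sh
# Compile an HTML report from the coverage data produced by the test binaries.
mkdir -p coverage
gcovr -r . --html --html-details -o coverage/index.html
```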
OpenCppCoverage is used to generate code coverage on Windows; for more information, see its official documentation.
- To set the version of the ODBC driver, update the `src/ODBC_DRIVER_VERSION.txt` file with the appropriate version.
By default, the Timestream ODBC driver reports databases as catalogs. This behavior matches Timestream ODBC Driver version 1.0.0, and it is due to Excel on macOS not showing tables with duplicate names when databases are reported as schemas.

The driver will report databases as schemas when the user sets the environment variable `DATABASE_AS_SCHEMA` to `TRUE`.

| Value of `DATABASE_AS_SCHEMA` | Behavior of driver |
| --- | --- |
| `TRUE` | Databases are reported as schemas. Catalogs will not be supported by the driver. |
| any other value / unset | Databases are reported as catalogs. Schemas will not be supported by the driver. |
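For example, to switch the driver to schema reporting before launching an ODBC application, a one-line shell sketch:

```sh
# Report Timestream databases as schemas instead of catalogs.
export DATABASE_AS_SCHEMA=TRUE
```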
For the tests to run successfully, test data needs to be loaded. Note that all test data must be loaded in region `us-west-2`.
- Follow the instructions below to load sample data from Timestream.

  i. To load `SampleDB.IoTMulti`: Go to Timestream → Databases. Click `Create database`. Select `Sample database` and enter `SampleDB` as the database name. Check `IoT` and select `Multi-measure records`. Then click "Create database" to create the desired database.

  ii. To load `meta_queries_test_db.IoTMulti` and `meta_queries_test_db.DevOpsMulti`: Go to Timestream → Databases. Click `Create database`. Select `Sample database` and enter `meta_queries_test_db` as the database name. Check `IoT` and `DevOps` and select `Multi-measure records`. Then click "Create database" to create the desired database.
- Run the commands below to load the test data. The commands provided work on Linux and Windows Command Prompt; please modify them to work with your platform of choice. For help with using quotation marks on different platforms, please visit the AWS official guide for using quotation marks. For general help with AWS CLI commands, please visit the Amazon Timestream official CLI guide and the AWS CLI general guide.
# Run the following AWS CLI command to create table meta_queries_test_db.TestColumnsMetadata1 and enable magnetic storage writes
aws timestream-write create-table --database-name meta_queries_test_db --table-name TestColumnsMetadata1 --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
# Run the following AWS CLI command to fill data for meta_queries_test_db.TestColumnsMetadata1
aws timestream-write write-records --database-name meta_queries_test_db --table-name TestColumnsMetadata1 --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000001\"}, {\"Name\":\"地区\", \"Value\":\"us-west-1\"} ], \"Time\":\"1666292462000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"1\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.1\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"true\",\"Type\":\"BOOLEAN\"}]}]"
# Run the following AWS CLI command to disable magnetic storage writes for table meta_queries_test_db.TestColumnsMetadata1
aws timestream-write update-table --database-name meta_queries_test_db --table-name TestColumnsMetadata1 --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": false}"
# Run the following AWS CLI command to create table meta_queries_test_db.TestColumnsMetadata2 and enable magnetic storage writes
aws timestream-write create-table --database-name meta_queries_test_db --table-name TestColumnsMetadata2 --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
# Run the following AWS CLI command to fill data for meta_queries_test_db.TestColumnsMetadata2
aws timestream-write write-records --database-name meta_queries_test_db --table-name TestColumnsMetadata2 --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000001\"}, {\"Name\":\"地区\", \"Value\":\"us-west-1\"} ], \"Time\":\"1666292462000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"1\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.1\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"true\",\"Type\":\"BOOLEAN\"}]}]"
# Run the following AWS CLI command to disable magnetic storage writes for table meta_queries_test_db.TestColumnsMetadata2
aws timestream-write update-table --database-name meta_queries_test_db --table-name TestColumnsMetadata2 --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": false}"
# Run the command below to create database data_queries_test_db
aws timestream-write create-database --database-name data_queries_test_db
# Run the command below to create table meta_queries_test_db.testTableMeta. It is kept as an empty table
aws timestream-write create-table --database-name meta_queries_test_db --table-name testTableMeta
# Run the command below to create table data_queries_test_db.TestScalarTypes and enable magnetic storage writes
aws timestream-write create-table --database-name data_queries_test_db --table-name TestScalarTypes --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
# Run the following AWS CLI commands to fill data for data_queries_test_db.TestScalarTypes
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000001\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone1\"}, {\"Name\":\"os_version\", \"Value\":\"1.0\"}, {\"Name\":\"region\", \"Value\":\"us-west-1\"} ], \"Time\":\"1666292462000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"1\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.1\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"TRUE\",\"Type\":\"BOOLEAN\"}]}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000002\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone2\"}, {\"Name\":\"os_version\", \"Value\":\"2.0\"}, {\"Name\":\"region\", \"Value\":\"us-west-2\"} ], \"Time\":\"1666378862000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"2\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.2\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"false\",\"Type\":\"BOOLEAN\"}]}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000003\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone3\"}, {\"Name\":\"os_version\", \"Value\":\"3.0\"}, {\"Name\":\"region\", \"Value\":\"us-west-3\"} ], \"Time\":\"1666465262000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"3\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.3\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"TRUE\",\"Type\":\"BOOLEAN\"}]}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000004\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone4\"}, {\"Name\":\"os_version\", \"Value\":\"4.0\"}, {\"Name\":\"region\", \"Value\":\"us-west-4\"} ], \"Time\":\"1666551662000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"video_startup_time\",\"Value\":\"4\",\"Type\":\"BIGINT\"}, {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.4\",\"Type\":\"DOUBLE\"}, {\"Name\":\"flag\",\"Value\":\"False\",\"Type\":\"BOOLEAN\"}]}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000005\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone13\"}, {\"Name\":\"os_version\", \"Value\":\"13.0\"}, {\"Name\":\"region\", \"Value\":\"us-west-5\"} ], \"Time\":\"1666551663000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"performance\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"cpu_usage\",\"Value\":\"63.7\",\"Type\":\"DOUBLE\"}]}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestScalarTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \"Value\":\"00000006\"}, {\"Name\":\"device_type\", \"Value\":\"iPhone13\"}, {\"Name\":\"os_version\", \"Value\":\"13.0\"}, {\"Name\":\"region\", \"Value\":\"美西-5\"} ], \"Time\":\"1666551664000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"performance\", \"MeasureValueType\":\"MULTI\", \"MeasureValues\": [{\"Name\":\"cpu_usage\",\"Value\":\"89.6\",\"Type\":\"DOUBLE\"}]}]"
# Run the following AWS CLI command to disable magnetic storage writes for table data_queries_test_db.TestScalarTypes
aws timestream-write update-table --database-name data_queries_test_db --table-name TestScalarTypes --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": false}"
# Run the command below to create table data_queries_test_db.TestComplexTypes and enable magnetic storage writes
aws timestream-write create-table --database-name data_queries_test_db --table-name TestComplexTypes --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
# Run the following AWS CLI commands to fill data for data_queries_test_db.TestComplexTypes
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef0\"}], \"Time\":\"1575486000000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"35.2\"}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef0\"}], \"Time\":\"1575486060000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"38.2\"}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef0\"}], \"Time\":\"1575486120000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"45.3\"}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef1\"}], \"Time\":\"1575486000000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"54.1\"}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef1\"}], \"Time\":\"1575486060000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"42.5\"}]"
aws timestream-write write-records --database-name data_queries_test_db --table-name TestComplexTypes --common-attributes "{\"Dimensions\":[{\"Name\":\"region\", \"Value\":\"us-east-1\"}, {\"Name\":\"az\", \"Value\":\"us-east-1d\"}, {\"Name\":\"vpc\", \"Value\":\"vpc-1a2b3c4d\"}, {\"Name\":\"instance_id\", \"Value\":\"i-1234567890abcdef1\"}], \"Time\":\"1575486120000\",\"TimeUnit\":\"MILLISECONDS\"}" --records "[{\"MeasureName\":\"cpu_utilization\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"33.7\"}]"
# Run the following AWS CLI command to disable magnetic storage writes for table data_queries_test_db.TestComplexTypes
aws timestream-write update-table --database-name data_queries_test_db --table-name TestComplexTypes --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": false}"
# Run the command below to create table data_queries_test_db.EmptyTable. It is kept as an empty table
aws timestream-write create-table --database-name data_queries_test_db --table-name EmptyTable
- Load large-scale data with the data writer.

  i. Run the driver build script; it will build the data writer executable.

  ii. Run the AWS CLI command below to create table `data_queries_test_db.TestMultiMeasureBigTable` and enable magnetic storage writes.

  ```
  aws timestream-write create-table --database-name data_queries_test_db --table-name TestMultiMeasureBigTable --magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
  ```

  iii. Run the `timestream-populate-data` command below to insert test data for the big table tests. For help with this command, see the data-population guide. Note that this command writes data in region `us-west-2`, which is expected by our tests.

  ```
  timestream-populate-data -u <access_key_id> -p <secret_access_key> -d data_queries_test_db -t TestMultiMeasureBigTable -ty computer -l 20000
  ```

  Since this command inserts 20,000 randomly generated rows of data, it is estimated to take 15 to 20 minutes to run.
- (Optional) Now all test data has been loaded. Double-check that all table dependencies have been created:

  ```
  SampleDB
  └-- IoTMulti
  meta_queries_test_db
  └-- IoTMulti
  └-- DevOpsMulti
  └-- TestColumnsMetadata1
  └-- TestColumnsMetadata2
  └-- testTableMeta (empty table)
  data_queries_test_db
  └-- TestComplexTypes
  └-- TestScalarTypes
  └-- EmptyTable (empty table)
  └-- TestMultiMeasureBigTable
  ```
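  One way to perform this check is with the AWS CLI (a sketch; `list-tables` inspects one database at a time, and the region must be `us-west-2`):

  ```sh
  # Confirm the test databases and their tables exist.
  aws timestream-write list-databases --region us-west-2
  aws timestream-write list-tables --database-name SampleDB --region us-west-2
  aws timestream-write list-tables --database-name meta_queries_test_db --region us-west-2
  aws timestream-write list-tables --database-name data_queries_test_db --region us-west-2
  ```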
AWS CLI commands may have parsing errors with PowerShell. The workaround is to wrap the JSON inputs in single quotes.

For example, before wrapping the JSON data: `aws timestream-write write-records --database-name <database name> --table-name <table name> --common-attributes "{<json data>}" --records "{<json data>}"`

After wrapping the JSON data: `aws timestream-write write-records --database-name <database name> --table-name <table name> --common-attributes '"{<json data>}"' --records '"{<json data>}"'`
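As a concrete illustration, applying this workaround to the earlier `create-table` command would look roughly like this in PowerShell:

```powershell
# Same create-table call as above, with the JSON argument wrapped in single quotes for PowerShell.
aws timestream-write create-table --database-name meta_queries_test_db --table-name TestColumnsMetadata1 --magnetic-store-write-properties '"{\"EnableMagneticStoreWrites\": true}"'
```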
Reading and writing data on Timestream requires the corresponding permissions. For read permissions, it is suggested to add the Amazon-managed policy `AmazonTimestreamReadOnlyAccess`. For write permissions, see the example policy below.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": [
"timestream:WriteRecords",
"timestream:CreateDatabase",
"timestream:UpdateTable",
"timestream:CreateTable"
],
"Resource": "*"
}
]
}
`timestream-populate-data` is a data population tool provided together with the Timestream ODBC driver. It is located under `<your_timestream_repository>\build\odbc\bin\<Debug or Release>`; add that path to your PATH environment variable to use the tool without specifying its full path.
- The IAM profile tests are disabled by default because they require valid IAM profiles. They can be enabled by setting the environment variable `ENABLE_PROFILE_TEST` to `true`. Follow the instructions below to set up valid IAM profiles and the required environment variables.
  - `cd` to the repository root. Set the environment variable `REPOSITORY_ROOT` to your repository root.
  - Run `.\src\tests\input\create_credentials_file.ps1` to create credentials files for testing. Note that this script will write the AWS IAM credentials file `src\tests\input\credentials`.
  - Set the environment variable `AWS_SHARED_CREDENTIALS_FILE` to the newly created credentials file (see the sketch after this list).
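  A minimal PowerShell sketch of these steps (the repository path is a placeholder):

  ```powershell
  # Placeholder repository location -- adjust to your checkout.
  $Env:REPOSITORY_ROOT = "C:\dev\timestream-odbc"
  cd $Env:REPOSITORY_ROOT
  # Write the test credentials file under src\tests\input.
  .\src\tests\input\create_credentials_file.ps1
  $Env:AWS_SHARED_CREDENTIALS_FILE = "$Env:REPOSITORY_ROOT\src\tests\input\credentials"
  # Enable the IAM profile tests.
  $Env:ENABLE_PROFILE_TEST = "true"
  ```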
- `cd` to the repository root. Set the environment variable `REPOSITORY_ROOT` to your repository root: `export REPOSITORY_ROOT=<your repository root>`
- Run `./src/tests/input/create_credentials_file.sh` from the repository root to create credentials files for testing. Note that this script will write the AWS IAM credentials file `src/tests/input/credentials`.
- Set the environment variable `AWS_SHARED_CREDENTIALS_FILE` to the newly created credentials file: `export AWS_SHARED_CREDENTIALS_FILE=$REPOSITORY_ROOT/src/tests/input/credentials`
- The Azure AD authentication tests are disabled by default because they require a valid Azure AD test account. They can be enabled by setting the environment variable `ENABLE_AAD_TEST` to `true`.
- For the AAD integration tests to run successfully, you need to specify the following environment variables. For instructions on how to set up and find the AAD connection property values, go to the SAML 2.0 Azure AD setup guide.

  | Variable Name | Corresponding Connection String Option |
  | --- | --- |
  | `AAD_APP_ID` | AADApplicationID |
  | `AAD_ROLE_ARN` | RoleARN |
  | `AAD_IDP_ARN` | IdPARN |
  | `AAD_TENANT` | AADTenant |
  | `AAD_USER` | UID or IdPUserName |
  | `AAD_USER_PWD` | PWD or IdPPassword |
  | `AAD_CLIENT_SECRET` | AADClientSecret |
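  For example, on macOS/Linux the variables could be exported like this (all values are placeholders):

  ```sh
  # Enable the Azure AD tests and supply the connection values -- all placeholders.
  export ENABLE_AAD_TEST=true
  export AAD_APP_ID="<application id>"
  export AAD_ROLE_ARN="arn:aws:iam::123456789012:role/AADRole"          # placeholder ARN
  export AAD_IDP_ARN="arn:aws:iam::123456789012:saml-provider/AzureAD"  # placeholder ARN
  export AAD_TENANT="<tenant id>"
  export AAD_USER="<user>"
  export AAD_USER_PWD="<password>"
  export AAD_CLIENT_SECRET="<client secret>"
  ```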
- The Okta authentication tests are disabled by default because they require a valid Okta test account. They can be enabled by setting the environment variable `ENABLE_OKTA_TEST` to `true`.
- To run the Okta authentication tests, the environment variables in the following table need to be configured with the correct values. Refer to the Okta Authentication Setup Guide for instructions on setting up Okta authentication.

  | Variable Name | Corresponding Connection String Option |
  | --- | --- |
  | `OKTA_HOST` | IdPHost |
  | `OKTA_USER` | UID or IdPUserName |
  | `OKTA_USER_PWD` | PWD or IdPPassword |
  | `OKTA_APP_ID` | OktaApplicationID |
  | `OKTA_ROLE_ARN` | RoleARN |
  | `OKTA_IDP_ARN` | IdPARN |

  Ensure `OKTA_HOST` does not include `https://` or `http://`.
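  A corresponding sketch for the Okta variables (placeholders; note the host carries no scheme):

  ```sh
  # Enable the Okta tests and supply the connection values -- all placeholders.
  export ENABLE_OKTA_TEST=true
  export OKTA_HOST="dev-123456.okta.com"                                # no https:// prefix
  export OKTA_USER="<user>"
  export OKTA_USER_PWD="<password>"
  export OKTA_APP_ID="<okta application id>"
  export OKTA_ROLE_ARN="arn:aws:iam::123456789012:role/OktaRole"        # placeholder ARN
  export OKTA_IDP_ARN="arn:aws:iam::123456789012:saml-provider/Okta"    # placeholder ARN
  ```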
Big table pagination tests are time-consuming, so to save integration-test time they are disabled by default. They can be enabled by setting the environment variable `BIG_TABLE_PAGINATION_TEST_ENABLE` to `true`.
For setting up connection proxy properties, see the connection proxy guide.
- Download proxy-test-server from https://github.com/andris9/proxy-test-server

  ```
  git clone git@github.com:andris9/proxy-test-server.git
  ```

- Install proxy-test-server

  ```
  npm install proxy-test-server
  ```

- Run the proxy server at port 9999

  ```
  cd proxy-test-server/examples
  node proxy.js
  ```

- Set the environment variables `TS_PROXY_HOST`, `TS_PROXY_PORT`, and `TS_PROXY_SCHEME` (see the sketch after this list).
- Start the DSN window and create a connection to Timestream. Click the 'Test' button to verify.
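A minimal sketch of the proxy variables for the local test server started above (the `http` scheme is an assumption for this test setup):

```sh
# Point the driver at the local proxy-test-server listening on port 9999.
export TS_PROXY_HOST=localhost
export TS_PROXY_PORT=9999
export TS_PROXY_SCHEME=http   # assumption: plain HTTP for the local test proxy
```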
Unit test results can be viewed in `odbc_unit_test_result.xml` and integration test results can be viewed in `odbc_test_result.xml`.