IBM Watson Data Platform Data Flows Service - Data Asset and Connection Properties

The following information describes the possible values of the 'properties' section of a connection within a data flow. The same properties can also be used in the 'properties' section of a data asset definition, although care must be taken to confirm the type of connection referenced by the data asset. Different types of connections support different properties (termed 'interaction properties' below). Some of these properties apply only when the connection is used as a source, some only when it is used as a target, and some apply in either case.

For example, if an 'IBM Db2 Warehouse on Cloud' connection is used as a source, both the schema_name and table_name properties can be set as follows:
	{  
	   "id":"source1",
	   "type":"binding",
	   "output":{  
	      "id":"source1Output"
	   },
	   "connection":{  
	      "properties":{  
	         "schema_name":"GOSALESHR",
	         "table_name":"EMPLOYEE"
	      },
	      "ref":"{connection_id}"
	   }
	}
Alternatively, the 'IBM Db2 Warehouse on Cloud' connection used as a source also allows just a SQL SELECT statement to be provided:
	{  
	   "id":"source1",
	   "type":"binding",
	   "output":{  
	      "id":"source1Output"
	   },
	   "connection":{  
	      "properties":{  
	         "select_statement":"select * from GOSALES.PRODUCT_NAME_LOOKUP"
	      },
	      "ref":"{connection_id}"
	   }
	}

Note that in the tables below a * character next to a property name denotes that the property must be assigned a value.


Table of contents

Amazon Redshift
Amazon S3
Apache Hive
BigInsights HDFS
Cloud Object Storage
Cloud Object Storage (infrastructure)
Cloudant
Cloudera Impala
Cognos Analytics
Compose for MySQL
Compose for PostgreSQL
Db2
Db2 Big SQL
Db2 Hosted
Db2 Warehouse
Db2 for i
Db2 for z/OS
Db2 on Cloud
Dropbox
FTP
Google BigQuery
Google Cloud Storage
Hortonworks HDFS
Informix
Looker
Microsoft Azure Data Lake Store
Microsoft Azure SQL Database
Microsoft SQL Server
MySQL
Object Storage OpenStack Swift (infrastructure)
Oracle
Pivotal Greenplum
PostgreSQL
PureData for Analytics
Salesforce.com
Sybase
Sybase IQ
Tableau
Teradata




Amazon Redshift


Description: Amazon Redshift database
Data source type ID: 31170994-f54c-4148-9c5a-807832fa1d07
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
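
For example, the 'properties' section of an 'Amazon Redshift' connection used as a target could be set as follows (the schema and table names here are illustrative only):
	"connection":{
	   "properties":{
	      "schema_name":"GOSALESHR",
	      "table_name":"EMPLOYEE_COPY",
	      "table_action":"replace",
	      "write_mode":"insert"
	   },
	   "ref":"{connection_id}"
	}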




Amazon S3


Description: Amazon Simple Storage Service (S3)
Data source type ID: a0b1d14a-4767-404c-aac1-4ce0e62818c3
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
access_key * - The access key ID (user name) for authorizing access to AWS
bucket - The name of the bucket that contains the files to access
secret_key * - The password associated with the access key ID for authorizing access to AWS

Interaction properties (when used as a source)

Name - Description
bucket - The name of the bucket that contains the files to read
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2
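
For example, the 'properties' section of an 'Amazon S3' connection used as a source to read a CSV file with a header row could be set as follows (the bucket and file names here are illustrative only, and boolean values are shown as strings in keeping with the earlier examples):
	"connection":{
	   "properties":{
	      "bucket":"sales-data",
	      "file_name":"products.csv",
	      "file_format":"csv",
	      "first_line_header":"true",
	      "infer_schema":"true"
	   },
	   "ref":"{connection_id}"
	}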

Interaction properties (when used as a target)

Name - Description
bucket - The name of the bucket that contains the files to write to
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_bucket - Create the bucket that contains the files to write to. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write




Apache Hive


Description: Apache Hive database
Data source type ID: 0fd83fe5-8995-4e2e-a1be-679bb8813a6d
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
http_path - The path of the endpoint, such as gateway/default/hive, when the server is configured for HTTP transport mode
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
null_value - The value that represents null (a missing value) in the file, for example, NULL
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [replace]. Default: replace
table_name * - The name of the table to write to




BigInsights HDFS


Description: IBM BigInsights HDFS via the WebHDFS API
Data source type ID: 9344f081-797c-4e75-a3f3-4e734b61275e
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
hive_db - The database in Apache Hive. Default: default
hive_host - The host name or IP address of the Apache Hive server
hive_password - The password associated with the user name for connecting to Apache Hive
hive_port - The port of the Apache Hive server. Default: 10000
hive_user - The user name for connecting to Apache Hive
password - The password associated with the user name for accessing the system
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system
url * - The WebHDFS URL for accessing HDFS

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name - Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_hive_table - Create a table in the database. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
hive_table - The name of the table to create
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write
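
For example, the 'properties' section of a 'BigInsights HDFS' connection used as a target to write a delimited file and expose it as a Hive table could be set as follows (the file path and table name here are illustrative only):
	"connection":{
	   "properties":{
	      "file_name":"/user/demo/employees.csv",
	      "first_line_header":"true",
	      "create_hive_table":"true",
	      "hive_table":"EMPLOYEES"
	   },
	   "ref":"{connection_id}"
	}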




Cloud Object Storage


Description: Cloud Object Storage service on IBM Cloud. Offers an S3 API and application binding, with regional and cross-regional resiliency
Data source type ID: 193a97c1-4475-4a19-b90c-295c4fdc6517
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
api_key * - An API key for accessing the Cloud Object Storage service. API keys are assigned roles that grant them authorization to call certain sets of HTTP APIs. Find the API key by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of api_key, not including quotation marks.
access_key - Connecting to the IBM COS service with the S3 API requires credentials and an endpoint. Credentials consist of an Access Key and a Secret Key. Find the Access Key by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of access_key, not including quotation marks.
bucket - The name of the bucket that contains the files to access
url * - The URL for logging in to IBM Cloud Object Storage. Find this URL by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, and then clicking Endpoint in the left pane. Copy the value of the public endpoint that you want to use.
resource_instance_id * - The identifier of the resource instance that you created when you ordered IBM Cloud Object Storage. Find the resource instance ID by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of resource_instance_id, not including quotation marks.
secret_key - Connecting to the IBM COS service with the S3 API requires credentials and an endpoint. Credentials consist of an Access Key and a Secret Key. Find the Secret Key by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of secret_key, not including quotation marks.

Interaction properties (when used as a source)

Name - Description
bucket - The name of the bucket that contains the files to read
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name - The name of the file to read
first_line - The first line of data or header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name - Description
bucket - The name of the bucket that contains the files to write to
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_bucket - Create the bucket that contains the files to write to. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
storage_class - The storage class for the created bucket. Values: [cold_vault, flex, standard, vault]. Default: standard
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write




Cloud Object Storage (infrastructure)


Description: Object storage for workloads requiring AWS SDKs and AWS Signature style authentication
Data source type ID: 4bf2dedd-3809-4443-96ec-b7bc5726c07b
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
access_key - Connecting to the IBM COS service with the S3 API requires credentials and an endpoint. Credentials consist of an Access Key and a Secret Key. Find the Access Key by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of access_key, not including quotation marks.
url * - The URL for logging in to IBM Cloud Object Storage. Find this URL by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, and then clicking Endpoint in the left pane. Copy the value of the public endpoint that you want to use.
secret_key * - Connecting to the IBM COS service with the S3 API requires credentials and an endpoint. Credentials consist of an Access Key and a Secret Key. Find the Secret Key by going to https://console.ng.bluemix.net/dashboard/services, clicking the Cloud Object Storage service, clicking Service credentials in the left pane, and then clicking View credentials in the Actions column of the Service Credentials table. Copy the value of secret_key, not including quotation marks.

Interaction properties (when used as a source)

Name - Description
bucket - The name of the bucket that contains the files to read
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name - The name of the file to read
first_line - The first line of data or header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name - Description
bucket - The name of the bucket that contains the files to write to
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_bucket - Create the bucket that contains the files to write to. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
storage_class - The storage class for the created bucket. Values: [cold_vault, flex, standard, vault]. Default: standard
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write




Cloudant


Description: IBM Cloudant
Data source type ID: 44e904b5-0cb2-4d8e-a5c0-c48bc3e24fdd
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
custom_url - The URL to the Cloudant database
password * - The password associated with the user name for accessing the system
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Type Description
byte_limitThe maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
database *The database to connect to
infer_record_countThe number of records to process to obtain the structure of the data. Default: 1000
infer_schemaObtain the schema from the file. Default: false
infer_as_varcharTreat the data in all columns as VARCHARs. Default: false
read_modeThe method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_limitThe maximum number of rows to return

Interaction properties (when used as a target)

Name Type Description
blob_truncation_sizeThe maximum size for BLOB values. Values larger than this will be truncated.. Default: 8000
batch_sizeThe number of documents to send per request. Default: 1000
clob_truncation_sizeThe maximum size for CLOB values. Values larger than this will be truncated.. Default: 8000
create_databaseCreate the database to connect to. Default: false
database *The database to connect to
document_typeThe type of the document
input_formatThe format of the source data. Values: [json, relational]. Default: relational
write_modeWhether to write to, or delete, the target. Values: [delete, write]. Default: write
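For example, a Cloudant connection used as a target can create the database it writes to. A sketch of the 'connection' section of such a target binding follows; the database name is a placeholder:
	{
	   "connection":{
	      "properties":{
	         "database":"employees",
	         "create_database":true
	      },
	      "ref":"{connection_id}"
	   }
	}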




Cloudera Impala


Description: Cloudera Impala database
Data source type ID: 05c58384-862e-4597-b19a-c71ea7e760bc
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from




Cognos Analytics


Description: Cognos Analytics
Data source type ID: 11f3029d-a1cf-4c4d-b8e7-64422fa54a94
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
url * - The gateway URL to access Cognos
namespace_id - The identifier of the authentication namespace
password - The password associated with the user name for accessing the system
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
use_anonymous_access - Connect without providing logon credentials. Default: false
username - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
file_name * - The name of the file to read
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_limit - The maximum number of rows to return




Compose for MySQL


Description: Compose for MySQL database
Data source type ID: 0cd4b64c-b485-47ed-a8c4-329c25412de3
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
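For example, to merge incoming records into an existing table using a caller-supplied key instead of the table's primary key, the 'connection' section of a target binding could be set as follows. This is a sketch; the schema, table, and column names are placeholders:
	{
	   "connection":{
	      "properties":{
	         "schema_name":"SALES",
	         "table_name":"ORDERS",
	         "write_mode":"merge",
	         "key_column_names":"ORDER_ID,LINE_ID"
	      },
	      "ref":"{connection_id}"
	   }
	}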




Compose for PostgreSQL


Description: Compose for PostgreSQL database
Data source type ID: 048ed1bf-516c-46f0-ae90-fa3349d8bc1c
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2


Description: IBM Db2 database
Data source type ID: 8c1a4480-1c29-4b33-9086-9cb799d7b157
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
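For example, a Db2 connection used as a target can replace the target table using a supplied DDL statement. A sketch of the 'connection' section of such a target binding follows; the schema name, table name, and column definitions are placeholders:
	{
	   "connection":{
	      "properties":{
	         "schema_name":"GOSALESHR",
	         "table_name":"EMPLOYEE_COPY",
	         "table_action":"replace",
	         "create_statement":"CREATE TABLE GOSALESHR.EMPLOYEE_COPY (EMPLOYEE_ID INTEGER, NAME VARCHAR(128))"
	      },
	      "ref":"{connection_id}"
	   }
	}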




Db2 Big SQL


Description: IBM Db2 Big SQL
Data source type ID: 2bdd9544-f13a-47b6-b6c3-f5964a08066a
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2 Hosted


Description: IBM Db2 hosted database
Data source type ID: 9525f6a6-1c44-4925-b1a0-9a2b731518cb
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2 Warehouse


Description: IBM Db2 Warehouse on Cloud database
Data source type ID: cfdcb449-1204-44ba-baa6-9a8a878e6aa7
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2 for i


Description: IBM Db2 database for i
Data source type ID: 335cbfe7-e495-474e-8ad7-78ad63c05091
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The unique name of the Db2 location you want to access
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2 for z/OS


Description: IBM Db2 database for z/OS
Data source type ID: c8d3eab2-25f6-4a90-8e10-0b4226693c45
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
collection - The ID of the collection of packages to use
database * - The unique name of the Db2 location you want to access
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Db2 on Cloud


Description: IBM Db2 fully-managed cloud SQL database
Data source type ID: 506039fb-802f-4ef2-a2bf-c1682e9c8aa2
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
create_statement - The CREATE DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Dropbox


Description: Dropbox secure file sharing and storage service
Data source type ID: 507b850c-f4a1-41d7-ad64-4182a1264014
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
access_token * - The OAuth2 access token that you obtained by following the instructions at https://www.dropbox.com/developers/reference/oauth-guide

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping identifies characters as part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The line number at which the data or header begins. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
sheet_name - The name of the Excel worksheet to read from
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2
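For example, a Dropbox connection used as a source to read a delimited file with a header row, inferring the schema from the data, could be defined as follows (the file path is a placeholder):
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "file_name":"/sales/products.csv",
	         "file_format":"csv",
	         "first_line_header":true,
	         "infer_schema":true
	      },
	      "ref":"{connection_id}"
	   }
	}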

Interaction properties (when used as a target)

Name - Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping identifies characters as part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write




FTP


Description: Remote file system (FTP)
Data source type ID: d5dbc62f-7c4c-4d49-8eb2-dab6cef2969c
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name - Description
connection_mode * - The method for authenticating and managing the FTP connection, which depends on the FTP server configuration. Values: [anonymous, basic, ftps, sftp]
host * - The host name or IP address of the remote FTP server
key_passphrase - If the private key is encrypted, this passphrase is needed to decrypt it
password - The password associated with the user name for connecting to the FTP server
port - The port of the FTP server
private_key - The private key that's uniquely associated with your account and recognized by the SFTP server
username * - The user name for connecting to the FTP server. Default: anonymous

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping identifies characters as part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The line number at which the data or header begins. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
sheet_name - The name of the Excel worksheet to read from
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2
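For example, an FTP connection used as a source can read a specific cell range from an Excel worksheet. A sketch of such a binding follows; the file path, sheet name, and cell range are placeholders:
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "file_name":"/reports/budget.xlsx",
	         "file_format":"excel",
	         "sheet_name":"Q1",
	         "range":"A1:F100",
	         "first_line_header":true
	      },
	      "ref":"{connection_id}"
	   }
	}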

Interaction properties (when used as a target)

Name - Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping identifies characters as part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write




Google BigQuery


Description: Google BigQuery
Data source type ID: 933152db-99e1-453a-8ce5-ae0e6714d1a9
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name - Description
credentials * - The contents of the Google service account key (JSON) file

Interaction properties (when used as a source)

Name - Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name * - The name of the schema that contains the table to read from
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name - Description
schema_name * - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert]. Default: insert
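Because Google BigQuery targets support only the append action with insert mode, a minimal target binding needs only the schema and table names. A sketch of the 'connection' section follows; the schema and table names are placeholders:
	{
	   "connection":{
	      "properties":{
	         "schema_name":"mydataset",
	         "table_name":"mytable"
	      },
	      "ref":"{connection_id}"
	   }
	}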




Google Cloud Storage


Description: Google Cloud Storage
Data source type ID: 05b7f0ea-6ae4-45e2-a455-cc280f110825
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name Type Description
credentials *The contents of the Google service account key (JSON) file.

Interaction properties (when used as a source)

Name Description
bucket - The name of the bucket that contains the files to read
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or of the header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name Description
bucket - The name of the bucket that contains the files to write to
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_bucket - Create the bucket that contains the files to write to. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
storage_class - The storage class for the created bucket. Values: [coldline, multi_regional, nearline, regional, standard]. Default: standard
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write
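For example, a Google Cloud Storage connection used as a source could identify the bucket and file to read. The bucket and file names below are illustrative placeholders, and values are shown as strings in keeping with the earlier examples:
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "bucket":"sales-data",
	         "file_name":"products.csv",
	         "file_format":"csv",
	         "first_line_header":"true"
	      },
	      "ref":"{connection_id}"
	   }
	}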




Hortonworks HDFS


Description: Hortonworks HDFS via the WebHDFS API
Data source type ID: c10e5224-f17d-4524-844f-e97b1305e489
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
hive_db - The database in Apache Hive. Default: default
hive_host - The host name or IP address of the Apache Hive server
hive_password - The password associated with the user name for connecting to Apache Hive
hive_port - The port of the Apache Hive server. Default: 10000
hive_user - The user name for connecting to Apache Hive
password - The password associated with the user name for accessing the system
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system
url * - The WebHDFS URL for accessing HDFS

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or of the header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
create_hive_table - Create a table in the database. Default: false
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
hive_table - The name of the table to create
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write
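For example, the 'connection' section of a target binding for Hortonworks HDFS could write a file and create a matching Hive table. The file path and table name below are illustrative placeholders:
	"connection":{
	   "properties":{
	      "file_name":"/user/data/products.csv",
	      "create_hive_table":"true",
	      "hive_table":"PRODUCTS"
	   },
	   "ref":"{connection_id}"
	}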




Informix


Description: IBM Informix database
Data source type ID: 029e5d1c-ba73-4b09-b742-14c3a39b6cf9
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
server * - The name of the database server to connect to
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
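For example, the 'connection' section of a target binding for Informix could merge records into an existing table, overriding the primary key. The schema, table, and column names below are illustrative placeholders:
	"connection":{
	   "properties":{
	      "schema_name":"informix",
	      "table_name":"EMPLOYEE",
	      "write_mode":"merge",
	      "key_column_names":"EMPLOYEE_ID"
	   },
	   "ref":"{connection_id}"
	}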




Looker


Description: Looker
Data source type ID: 69857d6b-2be8-4a59-8a70-723405f09708
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name Description
client_id * - The client ID for authorizing access to Looker
client_secret * - The password associated with the client ID for authorizing access to Looker
host * - The host name or IP address of the Looker server
port - The port of the Looker server. Default: 19999

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
sheet_name - The name of the Excel worksheet to read from
file_format - The format of the file. Values: [csv, delimited, excel, json]. Default: csv
file_name * - The name of the file to read
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_limit - The maximum number of rows to return
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2
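For example, a Looker connection used as a source could read a file as CSV with a row limit. The file name below is a placeholder, and values are shown as strings in keeping with the earlier examples:
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "file_name":"{file_name}",
	         "file_format":"csv",
	         "row_limit":"1000"
	      },
	      "ref":"{connection_id}"
	   }
	}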





Microsoft Azure Data Lake Store


Description: Microsoft Azure Data Lake Store via the WebHDFS API
Data source type ID: 6863060d-97c4-4653-abbe-958bde533f8c
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
client_id * - The client ID for authorizing access to Microsoft Azure Data Lake Store
client_secret * - The authentication key associated with the client ID for authorizing access to Microsoft Azure Data Lake Store
tenant_id * - The Azure Active Directory tenant ID
url * - The WebHDFS URL for accessing Microsoft Azure Data Lake Store

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or of the header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write
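For example, the 'connection' section of a target binding for Microsoft Azure Data Lake Store could write a Snappy-compressed Parquet file. The file path below is an illustrative placeholder:
	"connection":{
	   "properties":{
	      "file_name":"/curated/products.parquet",
	      "file_format":"parquet",
	      "codec_parquet":"snappy"
	   },
	   "ref":"{connection_id}"
	}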




Microsoft Azure SQL Database


Description: Microsoft Azure SQL Database
Data source type ID: e375c0ae-cba9-47fc-baf7-523bef88c09e
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
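For example, a Microsoft Azure SQL Database connection used as a source can supply just a SQL SELECT statement, in the same way as the earlier examples (the schema and table named here are illustrative):
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "select_statement":"select * from GOSALES.PRODUCT_NAME_LOOKUP"
	      },
	      "ref":"{connection_id}"
	   }
	}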




Microsoft SQL Server


Description: Microsoft SQL Server database
Data source type ID: 48695e79-6279-474a-b539-342625d3dfc2
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
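For example, the 'connection' section of a target binding for Microsoft SQL Server could truncate the target table before loading. The schema and table names below are illustrative placeholders:
	"connection":{
	   "properties":{
	      "schema_name":"dbo",
	      "table_name":"EMPLOYEE",
	      "table_action":"truncate"
	   },
	   "ref":"{connection_id}"
	}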




MySQL


Description: MySQL database
Data source type ID: b2cc3dc2-aff7-4a80-8f80-5e8c5703e9d2
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
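For example, a MySQL connection used as a source could name the schema and table to read, following the pattern of the earlier examples (the schema and table names below are illustrative placeholders):
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "schema_name":"GOSALESHR",
	         "table_name":"EMPLOYEE"
	      },
	      "ref":"{connection_id}"
	   }
	}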




Object Storage OpenStack Swift (infrastructure)


Description: Swift based object storage accessed through IBM Cloud infrastructure customer portal
Data source type ID: 9e9d55fe-f268-4ac2-ab95-0754a7fcb3c4
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name Description
secret_key * - The password associated with the user name for connecting to the IBM Cloud infrastructure Object Storage service
access_key * - The user name for connecting to the IBM Cloud infrastructure Object Storage service
url * - The authentication endpoint for logging in to IBM Cloud infrastructure Object Storage

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
range - The range of cells to retrieve from the Excel worksheet, for example, C1:F10
container * - The name of the container that contains the files to read
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
display_value_labels - Display the value labels
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
sheet_name - The name of the Excel worksheet to read from
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file. Values: [avro, csv, delimited, excel, json, orc, parquet, sav]. Default: csv
file_name * - The name of the file to read
first_line - The first line of data or of the header. Default: 0
first_line_header - The first line of the file contains column headers. Default: false
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
json_path - The path that identifies the elements to retrieve from a JSON document, for example, /book/publisher
null_value - The value that represents null (a missing value) in the file, for example, NULL
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
row_limit - The maximum number of rows to return
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2

Interaction properties (when used as a target)

Name Description
codec_avro - The compression codec to use when writing. Values: [bzip2, deflate, null, snappy]
codec_orc - The compression codec to use when writing. Values: [lz4, lzo, none, snappy, zlib]
codec_parquet - The compression codec to use when writing. Values: [gzip, snappy, uncompressed]
container * - The name of the container that contains the files to write to
date_format - The format of date values, for example, yyyy-[m]m-[d]d
decimal_format - The format of decimal values, for example, #,###.##
encoding - The appropriate character encoding for your data, for example, UTF-8. Default: utf-8
escape_character - The character that's used to escape other characters, for example, a backslash. Escaping is a string technique that identifies characters as being part of a string value. Values: [, backslash, double_quote, none, single_quote]. Default: none
field_delimiter - The character that separates each value from the next value, for example, a comma. Values: [, colon, comma, tab]. Default: comma
file_format - The format of the file to write to. Values: [avro, csv, delimited, excel, json, orc, parquet]. Default: csv
file_name * - The name of the file to write to or delete
first_line_header - The first line of the file contains column headers. Default: false
include_types - Include data types in the first line of the file. Default: false
null_value - The value that represents null (a missing value) in the file, for example, NULL
partitioned - Write the file as multiple partitions. Default: false
quote_character - The character that's used to enclose string values, for example, a double quotation mark. Values: [", ', double_quote, none, single_quote]. Default: none
row_delimiter - The character or characters that separate one line from another, for example, CR/LF (Carriage Return/Line Feed). Values: [carriage_return, carriage_return_line_feed, line_feed, new_line]. Default: new_line
time_format - The format of time values, for example, HH:mm:ss[.f]
timestamp_format - The format of timestamp values, for example, yyyy-MM-dd H:m:s
write_mode - Whether to write to, or delete, the target. Values: [delete, write, write_raw]. Default: write
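For example, an Object Storage OpenStack Swift (infrastructure) connection used as a source could identify the container and file to read. The container and file names below are illustrative placeholders:
	{
	   "id":"source1",
	   "type":"binding",
	   "output":{
	      "id":"source1Output"
	   },
	   "connection":{
	      "properties":{
	         "container":"sales-data",
	         "file_name":"products.csv",
	         "file_format":"csv"
	      },
	      "ref":"{connection_id}"
	   }
	}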




Oracle


Description: Oracle database
Data source type ID: 971223d3-093e-4957-8af9-a83181ee9dd9
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
sid * - The unique name of the database instance. If you provide a SID, do not provide a service name.
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
ssl - The port is configured to accept SSL connections. Default: false
ssl_certificate - A self-signed certificate that was created by a tool such as OpenSSL
service_name * - The name of the service. If you provide a service name, do not provide a SID.
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names that overrides the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
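For example, the 'connection' section of a target binding for Oracle could replace the target table using a supplied DDL statement. The schema, table, and DDL below are illustrative placeholders:
	"connection":{
	   "properties":{
	      "schema_name":"GOSALES",
	      "table_name":"EMPLOYEE",
	      "table_action":"replace",
	      "create_statement":"CREATE TABLE GOSALES.EMPLOYEE (ID INTEGER, NAME VARCHAR2(100))"
	   },
	   "ref":"{connection_id}"
	}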




Pivotal Greenplum


Description: Pivotal Greenplum database
Data source type ID: e278eff1-a7c4-4d60-9a02-bde1bb1d26ef
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
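Following the pattern of the source examples at the top of this document, the 'connection' section of a binding that writes to a Pivotal Greenplum table might combine these properties as below. The schema and table names are illustrative, and {connection_id} is a placeholder for the actual connection asset ID:
	"connection":{
	   "properties":{
	      "schema_name":"public",
	      "table_name":"SALES_SUMMARY",
	      "table_action":"replace"
	   },
	   "ref":"{connection_id}"
	}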




PostgreSQL


Description: PostgreSQL database
Data source type ID: e1c23729-99d8-4407-b3df-336e33ffdc82
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from
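For example, the 'connection' section of a PostgreSQL source binding could limit a table read to the first 1000 rows. The schema and table names are illustrative, and {connection_id} is a placeholder for the actual connection asset ID:
	"connection":{
	   "properties":{
	      "schema_name":"public",
	      "table_name":"orders",
	      "row_limit":1000
	   },
	   "ref":"{connection_id}"
	}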

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
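To merge incoming rows into an existing PostgreSQL table matched on columns other than the primary key, the write_mode and key_column_names properties can be combined. The schema, table, and column names below are illustrative:
	"connection":{
	   "properties":{
	      "schema_name":"public",
	      "table_name":"customers",
	      "write_mode":"merge",
	      "key_column_names":"customer_id,region"
	   },
	   "ref":"{connection_id}"
	}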




PureData for Analytics


Description: IBM PureData for Analytics database
Data source type ID: c2a82a72-0711-4376-a468-4e9951cabf22
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert




Salesforce.com


Description: Salesforce.com
Data source type ID: 06847b16-07b4-4415-a924-c63d11a17aa1
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name Description
password * - The password associated with the user name for accessing the system
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, update]. Default: insert




Sybase


Description: Sybase database
Data source type ID: 6976a3fc-b2ad-4db6-818c-ea049cac309d
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, update]. Default: insert




Sybase IQ


Description: Sybase IQ database
Data source type ID: 49079262-fac2-4762-99d1-452c1caf6b49
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, update]. Default: insert




Tableau


Description: Tableau
Data source type ID: 9ebc33eb-8c01-43fd-be1e-7202cf5c2c82
Can be used as a source: Yes
Can be used as a target: No
Secure gateway: Not applicable / not supported

Connection properties (connection asset)

Name Description
host * - The host name or IP address of the Tableau server
password * - The password associated with the user name for accessing the system
site * - The name of the Tableau site to use
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
file_name * - The name of the file to read
infer_record_count - The number of records to process to obtain the structure of the data. Default: 1000
infer_schema - Obtain the schema from the file. Default: false
infer_as_varchar - Treat the data in all columns as VARCHARs. Default: false
invalid_data_handling - How to handle values that are not valid: fail the job, null the column, or drop the row. Values: [column, fail, row]. Default: fail
read_mode - The method for reading files. Values: [read_multiple_regex, read_multiple_wildcard, read_raw, read_single]. Default: read_single
row_limit - The maximum number of rows to return
type_mapping - Overrides the data types of specified columns in the file's inferred schema, for example, inferredType1:newType1;inferredType2:newType2
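Because Tableau is a file-oriented source, its interaction properties describe how to read and interpret a file rather than a database table. A sketch of the 'connection' section for a Tableau source binding, with an illustrative file name and with infer_schema enabled, might look like the following ({connection_id} is a placeholder for the actual connection asset ID):
	"connection":{
	   "properties":{
	      "file_name":"sales_extract",
	      "infer_schema":true,
	      "infer_record_count":500
	   },
	   "ref":"{connection_id}"
	}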

Interaction properties (when used as a target)

Name Description
(none)




Teradata


Description: Teradata database
Data source type ID: 96ec8f53-a818-4ba1-bd8d-c86cc33a0b45
Can be used as a source: Yes
Can be used as a target: Yes
Secure gateway: Supported but optional

Connection properties (connection asset)

Name Description
database * - The name of the database
host * - The host name or IP address of the database
password * - The password associated with the user name for accessing the system
port * - The port of the database
username * - The user name for accessing the system

Interaction properties (when used as a source)

Name Description
byte_limit - The maximum number of bytes to return. Use any of these suffixes: KB, MB, GB, or TB
row_limit - The maximum number of rows to return
schema_name - The name of the schema that contains the table to read from
select_statement * - The SQL SELECT statement for retrieving data from the table
table_name * - The name of the table to read from

Interaction properties (when used as a target)

Name Description
create_statement - The Create DDL statement for recreating the target table
key_column_names - A comma-separated list of column names to override the primary key used during an update or merge
schema_name - The name of the schema that contains the table to write to
table_action - The action to take on the target table to handle the new data set. Values: [append, replace, truncate]. Default: append
table_name * - The name of the table to write to
write_mode - The mode for writing records to the target table. Values: [insert, merge, update]. Default: insert
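For example, to empty a Teradata table before loading new rows, the table_action property can be set to truncate in the 'connection' section of the target binding. The schema and table names are illustrative, and {connection_id} is a placeholder for the actual connection asset ID:
	"connection":{
	   "properties":{
	      "schema_name":"GOSALES",
	      "table_name":"PRODUCT_FACT",
	      "table_action":"truncate"
	   },
	   "ref":"{connection_id}"
	}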



Generated on: 2019-07-22T18:08:23.796Z
Generated from: GET /v2/datasource_types?interactive_properties=true&connection_properties=true