Macro Name | Description |
---|---|
$(AVALANCHE_CONNECT_STRING) | ODBC connection string for connecting to the Avalanche database. You can obtain this information from the Avalanche portal. Note: This is required only if you are running Integration Manager on DataCloud. |
$(AVALANCHE_USERNAME) $(AVALANCHE_PASSWORD) | Credentials for connecting to the database. |
$(AVALANCHE_TABLE) | Name of the table in Avalanche that the data will be written to. Note: Do not encrypt this macro value. |
$(SALESFORCE_USERNAME) $(SALESFORCE_PASSWORD) | Credentials for connecting to Salesforce. |
$(SALESFORCE_TABLE) | Name of the table or entity in Salesforce. |
$(SALESFORCE_SANDBOX_MODE) | If set to True, connects to the Salesforce application in the sandbox environment. |
$(SALESFORCE_QUERY) | Salesforce SOQL query for fetching data. The value can be the query statement itself or a path to a query file. For a path, prefix the value with "file:///". Note: Either the SALESFORCE_TABLE or the SALESFORCE_QUERY macro must be specified. |
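Taken together, a minimal set of Salesforce connection macros might look like the following sketch. All values are hypothetical placeholders, not defaults shipped with the template:

```text
SALESFORCE_USERNAME     = integration.user@example.com
SALESFORCE_PASSWORD     = <password>
SALESFORCE_SANDBOX_MODE = True
SALESFORCE_QUERY        = SELECT Id, Name, AnnualRevenue FROM Account
```

To read the query from a file instead, the value would take the form SALESFORCE_QUERY = file:///C:/queries/account.soql (path hypothetical).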
Macro Name | Description |
---|---|
$(AWS_ACCESS_KEY) $(AWS_SECRET_KEY) | Credentials for accessing AWS services. |
$(AWS_BUCKET_NAME) | Name of the AWS storage bucket to pull the data from. |
$(AWS_REGION) | Region ID of the location that hosts the specific S3 bucket. The supported regions are: • US East (N. Virginia) • US East (Ohio) • US West (Oregon) • Europe (Ireland) • Europe (London) • Europe (Frankfurt) You can specify the region in the above format or in the following format: • us-east-1 • us-east-2 • us-west-2 • eu-west-1 • eu-west-2 • eu-central-1 Note: Region information is case- and format-sensitive. If you do not enter the region value correctly, the template fails during execution. The Avalanche cluster and the bucket must be in the same region. |
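A sketch of the AWS macros under these rules (the keys and bucket name are placeholders); note that the region ID is written exactly in one of the two supported forms:

```text
AWS_ACCESS_KEY  = <access key>
AWS_SECRET_KEY  = <secret key>
AWS_BUCKET_NAME = my-landing-bucket
AWS_REGION      = us-east-1
```

Because the cluster and bucket must share a region, us-east-1 is valid only if the Avalanche cluster also runs in US East (N. Virginia).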
Macro Name | Description |
---|---|
$(SALESFORCE_API_VERSION) | Salesforce API version. The default version is 42.0. |
$(SALESFORCE_QUERY_ALL) | If set to True, fetches deleted or archived records. |
$(AVALANCHE_DSN) | Name of the ODBC data source for connecting to the Avalanche database. Specify this macro if you want to use a preconfigured DSN on your system instead of the connection string. |
$(AVALANCHE_CREATE_TABLE_QUERY) | CREATE TABLE statement to use for creating the table. Make sure partitioning is specified. The table name in the query must match the AVALANCHE_TABLE macro value. |
$(AVALANCHE_CREATE_TABLE_OPTIONS) | Use this macro when you do not want to build the complete query but only want to specify options to pass to the WITH clause of the CREATE TABLE query. Note: Make sure partitioning is specified. This macro is ignored if the AVALANCHE_CREATE_TABLE_QUERY macro is defined. |
$(OUTPUT_MODE) | Table operation to perform before inserting data. The available operations are: • replace: Drops the existing table and creates a new table. • delete_append: Truncates the table before inserting. • append: Creates the table only if it does not exist and inserts records. The default value is append. |
$(DEFAULT_TEXT_COL_SIZE) | Sets the default size of the text columns in the table. Set it to a reasonable value based on your data to avoid truncation. This property is also useful for inserting double-byte characters, such as Japanese or Chinese. The varchar data type used for text columns supports single-byte characters; to store double-byte characters in a varchar column, double the column size using this macro. |
$(UNICODE_CHARS) | Indicates whether the data contains Unicode characters. If set to True, nvarchar data type is used for text columns. |
$(VW_XXX) | Specify VWLOAD options as macros in the VW_XXX format, where XXX can be any of the properties listed in the COPY VWLOAD section of the Avalanche documentation. One macro can be added for each property. For properties such as STRICTNULLS that do not accept a value, the macro value must be the name of the property itself. Note: FDELIM, RDELIM, QUOTE, and AWS credentials must be specified using their dedicated macros. For example: VW_NULLS = NULL VW_STRICTNULLS = STRICTNULLS |
$(BATCH_SIZE) | Number of records per chunk that the source is split into. The default value is 50000. |
$(COPYVW_PARALLEL_LOAD_SIZE) | Number of chunks to load in parallel using COPY VWLOAD. The default value is 20. |
$(AVALANCHE_DBADMIN_GROUP_ACCESS) | Grants table access to the "dbadmingrp" group. Applicable only when a new table is created. The default value is True. |
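A rough sketch of how BATCH_SIZE and COPYVW_PARALLEL_LOAD_SIZE interact, using their default values. The total row count is a hypothetical figure for illustration, not something the template defines:

```python
import math

total_records = 1_250_000    # hypothetical source row count
batch_size = 50_000          # $(BATCH_SIZE) default
parallel_load_size = 20      # $(COPYVW_PARALLEL_LOAD_SIZE) default

# The source is split into fixed-size chunks ...
chunks = math.ceil(total_records / batch_size)

# ... and COPY VWLOAD loads up to parallel_load_size chunks at a time.
waves = math.ceil(chunks / parallel_load_size)

print(chunks, waves)  # → 25 2
```

Raising BATCH_SIZE reduces the number of chunks (and therefore load waves), at the cost of more memory per chunk.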