Category | Description |
---|---|
Actian Warehouse | Enables you to quickly load data from the following sources into Actian Warehouse on AWS, Azure, or Google Cloud Platform storage: • Salesforce • NetSuite • ServiceNow • Actian Zen • Delimited files You need to edit only a few macro values (bucket/Actian Warehouse credentials, object name, table name, and so on) before running the template and loading your data. These templates dynamically discover the schema (fields, field sizes, data types, and so on), perform field-level mapping, and determine the target output mode (append, insert) without any manual intervention. The templates leverage VWLoad for high-performance loading. For information about running these templates, see the Using Actian Library Templates for Loading Data to Actian Warehouse section in the Actian Data Platform help. |
Compression | Compresses or decompresses text or binary files. |
DataProfiler | The Data Profiler templates consist of two Profiles and one Process: • DataProfile.dp illustrates how to use data quality rules within Data Profiler to identify inconsistencies and data quality issues in a dataset. Records that meet the criteria specified in the data quality rules are written to the PASS_TARGET output file, and records that do not are written to the FAIL_TARGET output file (see the pass/fail routing sketch after this table). • DuplicateClusters.dp shows how the Duplicate Clusters data quality rule can be used to identify duplicate records in a dataset. • DataProfileWorkflow.process illustrates how Profiles can be used within a Process as part of a data pipeline. The Profile DataProfile.dp is referenced within the Data Profiler Invoker step. Data from the Profile's PASS_TARGET is sent to the PASS_TARGET transformation step for processing, and data from the FAIL_TARGET is sent to the FAIL_TARGET transformation step, where it can be enriched and remediated. |
Email | Receives emails from IMAP or POP3 servers and sends emails using SMTP. |
EZscript Libraries | Reusable user-defined function that extracts the text found between any two text strings (see the sketch after this table). |
File Folder Queues | Gets all files with a matching file name pattern from a file server and moves them to a destination folder for processing (see the file-queue sketch after this table). Note: The destination folder can be monitored as a File Listener for event-driven integration. |
File Listener | Gets data from the file that triggers a real-time integration and uses that data as an input to the integration. |
File Splitting | Splits a large file into multiple smaller files that can be used as input to a parallel or distributed processing workflow (see the sketch after this table). |
FTP | Gets all files with a matching file name pattern from a secure FTP server and moves them to a destination folder for processing. This is similar to File Folder Queues (see the file-queue sketch after this table). |
GUID | Example of extensibility: calls a Java object to generate a unique ID (see the sketch after this table). |
Orchestration | Uses the Orchestration Invoker to dynamically call any integration already deployed to an Integration Manager instance. |
SFDC Login Invoker | Uses the Salesforce Login Invoker to get a session token that can be used throughout the workflow, reducing API calls (see the sketch after this table). |
SHA-1 Encryption | Example of reusable user-defined functions from an EZscript Library module that hash string data using SHA-1 (SHA-1 is a one-way hash rather than true encryption; see the sketch after this table). |
Web Service API | Example of using the API Invoker to make a RESTful web service API call and pass the response data into a Transformation Map (see the sketch after this table). |
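
The pass/fail routing described in the DataProfiler row can be pictured with a short, generic Python sketch. Everything here is an assumption for illustration — the rule, the field names, and the file names input.csv, pass_target.csv, and fail_target.csv — since the real templates evaluate the rules defined inside the .dp Profile:

```python
import csv

# Hypothetical data quality rule: a record passes if it has an email
# containing "@" and a numeric age between 0 and 120.
def passes_rules(record: dict) -> bool:
    email_ok = "@" in record.get("email", "")
    try:
        age_ok = 0 <= int(record.get("age", "")) <= 120
    except ValueError:
        age_ok = False
    return email_ok and age_ok

with open("input.csv", newline="") as src, \
     open("pass_target.csv", "w", newline="") as ok, \
     open("fail_target.csv", "w", newline="") as bad:
    reader = csv.DictReader(src)
    pass_writer = csv.DictWriter(ok, fieldnames=reader.fieldnames)
    fail_writer = csv.DictWriter(bad, fieldnames=reader.fieldnames)
    pass_writer.writeheader()
    fail_writer.writeheader()
    for row in reader:
        # Route each record to PASS_TARGET or FAIL_TARGET, mirroring
        # how DataProfile.dp splits its output.
        (pass_writer if passes_rules(row) else fail_writer).writerow(row)
```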
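The EZscript Libraries entry describes a function that pulls the text found between two delimiting strings. A minimal Python equivalent of that idea (the name extract_between is mine, not the template's):

```python
def extract_between(text: str, start: str, end: str) -> str:
    """Return the substring of `text` between the first occurrence of
    `start` and the next occurrence of `end`; empty string if not found."""
    i = text.find(start)
    if i == -1:
        return ""
    i += len(start)
    j = text.find(end, i)
    return text[i:j] if j != -1 else ""

# Example: pull an order ID out of a log line.
print(extract_between("order=[A-1042] shipped", "[", "]"))  # A-1042
```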
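The file-queue pattern behind the File Folder Queues and FTP entries, reduced to local folders. The folder names and the file name pattern are assumptions, and a real FTP variant would fetch via an FTP/SFTP client rather than a local move:

```python
import glob
import os
import shutil

SOURCE = "/data/incoming"   # hypothetical source folder
DEST = "/data/queue"        # hypothetical destination / listener folder
PATTERN = "orders_*.csv"    # hypothetical file name pattern

os.makedirs(DEST, exist_ok=True)
for path in glob.glob(os.path.join(SOURCE, PATTERN)):
    # Move each matching file into the folder a File Listener can watch,
    # so arrival there triggers the downstream integration.
    shutil.move(path, os.path.join(DEST, os.path.basename(path)))
```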
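File splitting of the kind the File Splitting row describes can be sketched as a split by line count. The chunk size and the part-file naming scheme below are illustrative assumptions:

```python
import itertools

def split_file(path: str, lines_per_part: int = 100_000) -> list[str]:
    """Split a large text file into numbered part files of at most
    `lines_per_part` lines each; return the part file names."""
    parts = []
    with open(path) as src:
        for n in itertools.count(1):
            chunk = list(itertools.islice(src, lines_per_part))
            if not chunk:
                break
            part = f"{path}.part{n:04d}"
            with open(part, "w") as out:
                out.writelines(chunk)
            parts.append(part)
    return parts

# Each part file can then feed one branch of a parallel workflow.
```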
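The GUID template demonstrates extensibility by calling a Java object; the same idea in Python is a one-liner with the standard library's uuid module (shown only to illustrate what "generate a unique ID" means, not how the template itself does it):

```python
import uuid

# A random (version 4) UUID, comparable to java.util.UUID.randomUUID().
unique_id = str(uuid.uuid4())
print(unique_id)  # e.g. '3f2c9a6e-...'; the value differs on every run
```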
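The idea behind the SFDC Login Invoker row — log in once, then reuse the token for every call — looks like this with Salesforce's OAuth 2.0 username-password flow. The credentials are placeholders, and the choice of flow is an assumption; the Invoker handles the login mechanics for you:

```python
import requests

# One login call; the token is then reused for every subsequent request.
resp = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",        # assumption: username-password flow
        "client_id": "YOUR_CLIENT_ID",   # placeholder credentials
        "client_secret": "YOUR_SECRET",
        "username": "user@example.com",
        "password": "password+securitytoken",
    },
    timeout=30,
)
resp.raise_for_status()
auth = resp.json()

# Reuse the same token across the workflow instead of logging in per call.
headers = {"Authorization": f"Bearer {auth['access_token']}"}
r = requests.get(
    f"{auth['instance_url']}/services/data/v59.0/limits",
    headers=headers,
    timeout=30,
)
print(r.status_code)
```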
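A minimal Python equivalent of the SHA-1 hashing function the SHA-1 Encryption row describes:

```python
import hashlib

def sha1_hex(value: str) -> str:
    """Return the SHA-1 digest of a string as lowercase hex."""
    return hashlib.sha1(value.encode("utf-8")).hexdigest()

print(sha1_hex("hello"))  # aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d
```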
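The Web Service API pattern — call a REST endpoint, then hand the response to a mapping step — can be sketched with the requests library. The URL and field names are placeholders, and the transform function is only a stand-in for what a Transformation Map does:

```python
import requests

# Hypothetical endpoint; substitute the API you actually call.
resp = requests.get("https://api.example.com/v1/customers", timeout=30)
resp.raise_for_status()

# Minimal stand-in for a Transformation Map: rename and reshape fields.
def transform(record: dict) -> dict:
    return {"id": record.get("customerId"), "name": record.get("fullName")}

# Assumes the endpoint returns a JSON array of objects.
mapped = [transform(r) for r in resp.json()]
print(mapped)
```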