Methods to Run DataFlow Processes
You can run DataFlow processes through Integration Manager using any of the following methods:
Java Execution
JSON Execution
RushScript Execution
Java Execution
When executing a DataFlow Java class through Integration Manager, you must upload the jar file that contains your class and specify the fully qualified name of that class as the Entry Point. You may also upload supporting jars to the Files section; any file with a .jar extension is added to the classpath. Other files in the Files section are placed in an “include” folder.
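For example, a minimal entry point might look like the sketch below. The class, package, and file names are hypothetical, and the graph composition assumes the standard DataFlow LogicalGraph API; substitute your own operators and paths.

    package com.example.jobs;

    import com.pervasive.datarush.graphs.LogicalGraph;
    import com.pervasive.datarush.graphs.LogicalGraphFactory;
    import com.pervasive.datarush.operators.io.WriteMode;
    import com.pervasive.datarush.operators.io.textfile.ReadDelimitedText;
    import com.pervasive.datarush.operators.io.textfile.WriteDelimitedText;

    public class MyDataFlowJob {
        public static void main(String[] args) {
            // Compose a simple read-and-write graph.
            LogicalGraph graph = LogicalGraphFactory.newLogicalGraph("my-dataflow-job");

            // Non-jar files uploaded to the Files section land in the "include" folder.
            ReadDelimitedText reader = graph.add(new ReadDelimitedText("include/input.csv"));
            WriteDelimitedText writer = graph.add(
                new WriteDelimitedText("output.csv", WriteMode.OVERWRITE));

            graph.connect(reader.getOutput(), writer.getInput());

            // Compile and execute the graph.
            graph.run();
        }
    }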
JSON Execution
To run a JSON execution, upload your JSON script as the package; the file must have a .dr extension. As with Java execution, any supporting files the job needs may be uploaded to the Files section of the Job Config. These files are made available to the execution in the “include” folder.
JSON execution also lets you set overrides for the execution in two ways (see the example below). First, you can set Macros on the Job Config; these are passed to the execution as individual overrides. Each key takes the form operator.property, and the value may be any value; the back end handles the formatting.
Second, you can set an override file on the Job Config.
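For illustration, Macros entries might look like the following. The operator and property names here are hypothetical; use the names defined in your own JSON script.

    ReadCustomers.source = include/customers.csv
    WriteResults.writeMode = OVERWRITE

Each entry is passed to the execution as an individual override of that operator's property.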
RushScript Execution
As with JSON execution, you must upload your RushScript as the package; the file must have a .js extension. Do not set an Entry Point. You may upload additional files to the Files section of the Job Config; they are included in the “include” directory for the execution.
RushScript also supports variables. You may define these in the Macros section, and they are passed to the execution as variables.
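For example, a script along the following lines could consume variables supplied through Macros. This is a minimal sketch assuming the standard RushScript dr composition functions; the inputPath and outputPath variable names are hypothetical and would be defined in the Macros section of the Job Config.

    // inputPath and outputPath are assumed to arrive as variables
    // from the Macros section of the Job Config.
    var data = dr.readDelimitedText({ source: inputPath, header: true });
    dr.writeDelimitedText(data, { target: outputPath, header: true });

    // Compile and run the composed application.
    dr.execute();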