Creating a Step


This section explains how to create your own custom step that you can then upload and run on an SL1 PowerFlow system.

All Python step code should be Python 3.7 or later.

What is a Step?

In PowerFlow, a step is a generic Python class that performs a single action, such as gathering data about an organization.

You can use existing steps to create your own workflows, and you can re-use steps in more than one workflow. When these steps are combined as part of a PowerFlow application, they provide a workflow that satisfies a business requirement.

In the PowerFlow builder user interface, you can click the ellipsis icon on a step and select View step code to view the code for that step.

You can configure how a step works by adjusting a set of arguments called input parameters. The parameters specify the values, variables, and configurations to use when executing the step. Parameters allow steps to accept arguments and allow steps to be re-used in multiple integrations. For example, you can use the same step to query both the local system and another remote system; only the arguments, such as hostname, username, and password change.

You can view and edit the input parameters for a step:

  1. Go to the Applications page of the PowerFlow user interface and click the name of a PowerFlow application.

  2. Click the Open Editor button.

  3. Click the ellipsis icon on the step and select Configure. The Configuration pane for that step appears.

    The PowerFlow builder is available only in SL1 Premium solutions. To upgrade, contact ScienceLogic Customer Support. For more information, see https://sciencelogic.com/pricing.

A step can pass the data it generates during execution to a subsequent step, and a step can use the data generated by a previous step. You can also run a step with test data by hovering over the Run button and selecting Custom Run.

PowerFlow analyzes the required parameters for each step and alerts you if any required parameters are missing before running the step.

Steps are grouped into the following types: 

  • Standard. Standard steps do not require any previously collected data to run. Standard steps are generally used to generate data for a transformation or a database insert. These steps can be run independently and concurrently.
  • Aggregated. Aggregated steps require data that was generated by a previously run step. Aggregated steps are not executed by PowerFlow until all data required for the aggregation is available. These steps can be run independently and concurrently.
  • Trigger. Trigger steps are used to trigger other PowerFlow applications. These steps can be configured to be blocking or non-blocking.

Default Steps

The Base Steps Synchronization PowerPack contains a set of steps that are used in a variety of different Synchronization PowerPacks, including "Query REST", "Query GQL", and "MySQL Select" steps, which are used throughout PowerFlow for a variety of use cases.

The Base Steps Synchronization PowerPack is included with the most recent release of the PowerFlow Platform. The Base Steps Synchronization PowerPack version 1.5.0 was shipped with PowerFlow Platform version 2.4.0. You can also download newer versions of this Synchronization PowerPack, if available, from the PowerPacks page of the ScienceLogic Support Site at https://support.sciencelogic.com/s/.

Version 1.5.0

Version 1.5.0 of the Base Steps Synchronization PowerPack includes support for sending multiple payloads using the new "PostREST" and "GetREST" steps. This release also includes a new "DeleteREST" step and a new "PutREST" step.

For more information, see the Base Steps Synchronization PowerPack Release Notes, version 1.5.0.

Version 1.4.2

Version 1.4.2 of the Base Steps Synchronization PowerPack includes updates to the third-party packages used by PowerFlow.

For more information, see the Base Steps Synchronization PowerPack Release Notes, version 1.4.2.

Version 1.4.1

Version 1.4.1 of the Base Steps Synchronization PowerPack includes the following features:

  • Updated the "QueryREST" step to return a response body when errors occur.
  • The "SSHCommand" step can now use SSH keys.

For more information, see the Base Steps Synchronization PowerPack Release Notes, version 1.4.1.

Contents of This Synchronization PowerPack

This Synchronization PowerPack contains the "Template App" PowerFlow application. You can use the "Template App" application as a template for building new PowerFlow applications.

This Synchronization PowerPack includes the following steps, which you can use in new and existing applications:

  • Cache Delete
  • Cache Read
  • Cache Save
  • DeleteREST
  • Direct Cache Read (deprecated)
  • GetREST
  • Jinja Template Data Render
  • MS-SQL Describe
  • MS-SQL Insert
  • MS-SQL Select
  • MySQL Delete
  • MySQL Describe
  • MySQL Insert
  • MySQL Select
  • PostREST
  • PutREST
  • Query GraphQL
  • Query REST (deprecated)
  • QueryRest: BearerAuth
  • QueryRest: OAuth
  • Run a command through an SSH tunnel
  • Trigger Application

To view the code for a step, select a Synchronization PowerPack from the SyncPacks page, click the Steps tab, and select the step you want to view.

You can configure the existing "Query REST" step to use bearer authentication by adding the bearer token to the request headers in the Configuration pane for that step. The headers field should look like the following:

{
  "Authorization": "Bearer <token_id>",
  "accept": "application/json",
  "content-type": "application/json"
}

Requirements

To create a custom step, you must perform the following tasks:

  1. Download or copy the step template, called stepTemplate.
  2. Set up the required classes and methods in the step.
  3. Define logic for the step, including transferring data between steps.
  4. Define parameters for the step.
  5. Define logging for the step.
  6. Define exceptions for the step.
  7. Upload the step to PowerFlow.
  8. Validate and test the step.

Creating a Step from the Step Template

The easiest way to create a new step is to use the step template that is included with PowerFlow. To copy this template to your desktop:

  1. Using an API tool like Postman or cURL, use the API GET /steps/{step_name}:

    GET URL_for_your_PowerFlow_system/api/v1/steps/stepTemplate

    where URL_for_your_PowerFlow_system is the IP address or URL for PowerFlow. (A scripted sketch of this procedure appears after these steps.)

  2. Select and copy all the text in the stepTemplate.

  3. Open a source-code editor and paste the content of the stepTemplate in the source-code editor.

  4. Save the new file as newfilename.py

    where newfilename.py is the new name of the step and includes the .py suffix. The file name must be unique within your PowerFlow system and cannot contain spaces. Note that the step name will also be the name of the Python class for the step.
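If you prefer to script the download-and-save portion of this procedure, the following is a minimal sketch that uses the Python requests library. The hostname, credentials, and the MyNewStep.py file name are placeholders, not values from your PowerFlow system.

# Minimal sketch: download the stepTemplate step and save it under a new name.
# The hostname, credentials, and "MyNewStep.py" file name are placeholders.
import requests

powerflow_url = "https://URL_for_your_PowerFlow_system"

response = requests.get(
    powerflow_url + "/api/v1/steps/stepTemplate",
    auth=("isadmin", "password"),  # placeholder credentials
    verify=False,                  # assumption: self-signed certificate
)
response.raise_for_status()

# If your system returns JSON instead of raw text, extract the step code
# from the response before saving it.
with open("MyNewStep.py", "w") as step_file:
    step_file.write(response.text)

You would then edit the saved file as described in the following sections.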

Including the SubClass and Required Methods

To execute successfully on PowerFlow, your step must be a subclass of the ipaascore.BaseStep class and include the init method and the execute method.

Subclass

The stepTemplate.py file is already configured to include the new step as a subclass of the ipaascore.BaseStep class. To update your step:

  • Use a source-code editor to open the re-named file for editing.

  • Notice that the file includes these lines of text:

    from ipaascore.BaseStep import BaseStep

    from ipaascommon import ipaas_exceptions

  • Do not remove or alter these lines of text.

  • Search for the following:

    class stepTemplate(BaseStep):

  • Replace stepTemplate with the new name of the file (without the .py suffix).

  • Save and close the file.

Required Methods

To execute successfully on PowerFlow, your step must contain at least two methods:

  • init method. Allows you to define initialization options and parameters for the step.
  • execute method. The execute method includes the logic for the step and performs the action. After PowerFlow evaluates all parameters and initialization settings and aligns the step with a worker process, PowerFlow runs the execute method.

Without these methods, PowerFlow will consider your step to be "incomplete" and will not execute the step.

The stepTemplate.py file includes these two methods and the syntax of some of the sub-methods you can use within the main methods.
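To see how these pieces fit together, the following is a minimal sketch of a completed step file after the class has been renamed. The class name MyNewStep and its "message" parameter are placeholders used only for this sketch; the individual methods are described in detail in the sections that follow.

from ipaascore.BaseStep import BaseStep


class MyNewStep(BaseStep):
    # The class name must match the file name (MyNewStep.py).

    def __init__(self):
        self.friendly_name = "My New Step"
        self.description = "Example step that logs a message and passes it to the next step"
        self.version = "1.0.0"
        # "message" is a placeholder parameter used only in this sketch.
        self.new_step_parameter(name="message", description="Text to log and pass along", sample_value="hello", default_value=None, required=True)

    def execute(self):
        message = self.get_parameter("message")
        self.logger.info("Received message: {}".format(message))
        self.save_data_for_next_step({"message": message})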

Defining the Logic for the Step

Each step requires the init method and the execute method. Within those methods, you can specify parameters and logic for the step. The following sections describe how to do this.

The init Method

From the init method, you can define the friendly name, the step description, and the step version:

self.friendly_name = "friendly name of the step. This name appears in the user interface"

self.description = "Description of the step"

self.version = "version number"

In the QueryREST step, the friendly name, description, and version number are defined like this:

def __init__(self):
    self.friendly_name = "Query REST"
    self.description = "Step facilitates REST interactions and will return the returned data dictionary and specified headers as data to the next step"
    self.version = "1.0.0"

From the init method, you also define the parameters for the step. PowerFlow will examine the parameters and enforce the parameters when the step is run.

For example, if you specify a parameter as required, and the user does not specify the required parameter when calling the step, PowerFlow will display an error message and will not execute the step.

To define a parameter, use the self.new_step_parameter function.

self.new_step_parameter(name=parameter_name, description="description", sample_value="sample value", default_value=None, required=False)

where:

  • name. The name of the parameter. This value will be used to create a name:value tuple in the PowerFlow application file (in JSON).
  • description. A description of the step parameter.
  • sample_value. A sample value of the required data type or schema.
  • default_value. If no value is specified for this parameter, use the default value. Can be any Python data structure. To prevent a default value, specify "None".
  • required. Specifies whether this parameter is required by the step. The possible values are "True" or "False".

Here is an example from the QueryREST step (available on each PowerFlow system):

self.new_step_parameter(name=PREFIX_URL, description="used with relative_url to create the full URL.", sample_value="http://10.2.11.253", default_value=None, required=True)

The execute Method

From the execute method, you can retrieve parameter values, retrieve data from previous steps, perform the logic of the step, save data for the next step, write log messages, and raise exceptions.

For details on all the functions you can use in the execute method, see the chapter on the ipaascore.BaseStep class.

You can also define additional methods in the step. For examples of this, see the QueryREST step provided with PowerFlow.

For examples of the logic in a step, view one or more of the steps listed in Default Steps.

Transferring Data Between Steps

An essential part of integrations is passing data between tasks. PowerFlow includes native support for saving and transferring Python objects between steps. The ipaascore.BaseStep class includes multiple functions for transferring data between steps.

Saving Data for the Next Step

The save_data_for_next_step function saves an object or other type of data and makes the data available to another step. The object to be saved and made available must be able to be serialized with pickle:

save_data_for_next_step(data_to_save)

where: data_to_save is a variable that contains the data.

NOTE: The data_to_save object must be of a data type that can be pickled by Python: None, True and False, integers, long integers, floating point numbers, complex numbers, normal strings, unicode strings, tuples, lists, sets, and dictionaries.

The following is an example of the save_data_for_next_step function:

save_data = {'key': 'value'}

self.save_data_for_next_step(save_data)

The PowerFlow application must then specify that the data from the current step should be passed to one or more subsequent steps, using the output_to parameter. For details, see the section about using applications to transfer data between steps.

Retrieving Data from a Previous Step

The ipaascore.BaseStep class includes multiple functions that retrieve data from a previous step:

NOTE: To retrieve data from a previous step, that previous step must save the data with the save_data_for_next_step function, and the application must specify that the data from the previous step should be passed to the current step using the output_to parameter.

get_data_from_step_by_name

The get_data_from_step_by_name function retrieves data saved by a previous step.

NOTE: Although the get_data_from_step_by_name function is simple to use, it does not allow you to write a generic, reusable step, because the step name will be hard-coded in the function. The functions join_previous_step_data or get_data_from_step_by_order allow you to create a more generic, reusable step.

get_data_from_step_by_name(step_name)

where: step_name is the name of a previous step in the PowerFlow application.

The following is an example of the get_data_from_step_by_name function:

em7_data = self.get_data_from_step_by_name('FetchDevicesFromEM7')

snow_data = self.get_data_from_step_by_name('FetchDevicesFromSnow')

get_data_from_step_by_order

The get_data_from_step_by_order function retrieves data from a step based on the position of the step in the application.

get_data_from_step_by_order(position)

where: position is the position of the step (the order that the step was run) in the application. Position starts at "0" (zero).

For example:

  • Suppose your application has four steps: stepA, stepB, stepC, and stepD

  • Suppose stepA was run first (position "0") and includes the parameter "output_to":["stepD"]

  • Suppose stepB was run second (position "1") and includes the parameter "output_to":["stepD"]

  • Suppose stepC was run third (position "2") and includes the parameter "output_to":["stepD"]

  • Suppose stepD was run fourth

  • If the current step is stepD, and stepD needs the data from stepC, you could use the following:

    data_from_stepC = self.get_data_from_step_by_order(2)

join_previous_step_data

The join_previous_step_data function is the easiest and most generic way of retrieving data from one or more previous steps in the application.

If you are expecting similar data from multiple steps, or expecting data from only a single step, the join_previous_step_data function is the best choice.

The join_previous_step_data function gathers all data from all steps that include the save_data_for_next_step function and the output_to parameter in the application. By default, this function returns the joined set of all data that is passed to the current step. You can also specify a list of previous steps from which to join data.

The retrieved data must be of the same type. The data is then combined into a list in a dictionary.

If the data types are not the same, then the function will raise an exception:

join_previous_step_data([step_name])

where step_name is an optional list of step names that specifies the steps from which to join data. For example, if you wanted to join only the data from stepA and stepD, you could specify the following:

self.join_previous_step_data(["stepA", "stepD"])

The following is an example of the join_previous_step_data function in the QueryREST step (included in each PowerFlow system):

def query_with_url_generated_from_input(self):
    """
    Iterates over data from previous steps and generates a relative url for each.
    Then executes that command.
    :return:
    """
    count = 0
    input_data = self.join_previous_step_data()
    payload = self.get_parameter(PAYLOAD)
    if type(input_data) is list:
        for input_d in input_data:
            relative_url = self.get_parameter(RELATIVE_URL, input_d)
            self.process_REST_command(payload, relative_url)
            count += 1
    elif type(input_data) is dict:
        relative_url = self.get_parameter(RELATIVE_URL, input_data)
        self.process_REST_command(payload, relative_url)
        count += 1
    else:
        raise NotImplementedError("Data type: {} is not currently supported for generating relative urls from data".format(type(input_data)))

Step Parameters

Steps accept arguments, called parameters. These arguments specify the values, variables, and configurations to use when executing.

Base Parameters Available in All Steps

The PowerFlow BaseClass has a few base parameters that are automatically inherited by all steps and cannot be overwritten. You do not need to define these parameters before using them in steps:

  • name. The application-unique name for this step. Other steps can use this parameter to refer to this step.
  • file. The name of the file that will be executed by the step. For example, you could write step logic in a single file, but use that step logic with different applications and use different names for the step in each application.
  • output_to. A list indicating that the data retrieved from this step should be output to another step. Setting this parameter links the steps, and the subsequent step will be able to retrieve data from the current step. The format is:

"output_to":["stepA", "stepB"]

Defining a Parameter

From the init method, you define the parameters for the step. The PowerFlow system will examine the parameters and enforce the parameters when the step is run.

For example, if you specify a parameter as required, and the user does not specify the required parameter when calling the step, the PowerFlow system will display an error message and will not execute the step.

To define a parameter, use the new_step_parameter function.

new_step_parameter(name='', description='', sample_value='', default_value=None, required=False)

where:

  • name. The name of the parameter. This value will be used to create a name:value tuple in the application file (in JSON).
  • description. A description of the step parameter.
  • sample_value. A sample value of the required data type or schema.
  • default_value. If no value is specified for this parameter, use the default value. Can be any combination of alphanumeric characters. To prevent a default value, specify "None".
  • required. Specifies whether this parameter is required by the step. The possible values are "True" or "False".

Here is an example from the QueryREST step:

self.new_step_parameter(name=PREFIX_URL, description="used with relative_url to create the full URL.", sample_value="http://10.2.11.253", default_value=None, required=True)

Retrieving Parameter Values

To retrieve the latest value of a parameter, use the get_parameter function.

get_parameter(param_name, lookup_data=None)

where:

  • param_name is the name of the parameter that you want to retrieve the value for.
  • lookup_data is an optional dictionary that can provide a reference for additional variable substitutions.

For example, suppose we defined this parameter in the step named "GETgoogle.com":

self.new_step_parameter(name="prefix_url", description="used with relative_url to create the full URL.", sample_value="http://10.2.11.253", default_value=None, required=True)

Suppose in the PowerFlow application that calls "GETgoogle.com", we specified:

"steps": [

{

"file": "QueryREST",

"method": "GET",

"name": "GETgoogle.com",

"output_to": ["next_step"],

"prefix_url": "http://google.com"

}

],

Suppose we use the get_parameter function in the step "GETgoogle.com" to retrieve the value of the "prefix_url" parameter:

build_url_1 = self.get_parameter("prefix_url")

The value of build_url_1 would be "http://google.com".
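The optional lookup_data argument is useful when a parameter value contains a variable that should be resolved against data from a previous step. The following is a minimal sketch of that pattern; the "relative_url" parameter name and the contents of the records are hypothetical.

# Hypothetical sketch: resolve the "relative_url" parameter once per record
# returned by the previous step, using each record for variable substitution.
records = self.join_previous_step_data()
for record in records:
    relative_url = self.get_parameter("relative_url", record)
    self.logger.info("Resolved URL: {}".format(relative_url))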

Variable Substitution in Parameters

The PowerFlow system allows users to define variables, so that parameters can be populated dynamically.

To include a variable in a parameter, use the following syntax:

${exampleVariable}

The PowerFlow system includes the following types of variables for use in parameters:

  • ${object_from_previous_step}. The PowerFlow system will search the data from the previous steps for object_from_previous_step. If found, the PowerFlow system will substitute the value of the object for the variable.
  • ${config.exampleVariable}. Configuration variables are defined in a stand-alone file that lives on the PowerFlow system and can be accessed by all applications and their steps. Including the config. prefix with a variable tells the PowerFlow system to look in a configuration file to resolve the variable. If you want to re-use the same settings (like hostname and credentials) between applications, define configuration variables.
  • ${appvar.exampleVariable}. Application variables are defined in the PowerFlow application. These variables can be accessed only by steps in the application. Including the appvar. prefix with a variable tells the PowerFlow system to look in the application to resolve the variable.
  • ${step_func.exampleFunction args}. The variable value will be the output from the user-defined function, specified in exampleFunction, with the arguments specified in args. The exampleFunction must exist in the current step. Additional arguments can be specified in args with a space delimiter. You can also specify additional variable substitution values as the arguments. This allows you to dynamically set the value of a variable using a proprietary function, with dynamically generated arguments. For example:

"param": "${step_func.add_numbers 1 2}"

will call a function (defined in the current step) called "add_numbers" and pass it the arguments "1" and "2". The value of "param" will be "3".
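A function referenced with ${step_func...} must be defined in the current step. The following is a hypothetical sketch of what the "add_numbers" function from the example above might look like; it assumes that the space-delimited arguments arrive as strings.

def add_numbers(self, a, b):
    # Assumption: arguments supplied through ${step_func.add_numbers 1 2}
    # arrive as strings, so convert them before adding.
    return int(a) + int(b)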

For details on defining configuration variables and application variables, see the chapter on defining an application.

Retrieving Variable Values

You can use the get_app_variable function to retrieve the latest value of a variable.

get_app_variable(variable_name)

where:

  • variable_name is the name of the application variable that you want to retrieve the value for.

For example, suppose we defined this application variable in the application:

"app_variables": [

{

"name": "sl1_hostname",

"description": "The SL1 hostname to participate in the sync",

"sample_value": "10.2.253.115",

"default_value": null,

"required": true,

"value": 10.64.68.25

},

]

Suppose this application calls the step "sync_SL1_data".

In the step "sync_SL1_data" , we could use the following function to resolve the value of "sl1hostname":

hostname = self.get_app_variable("sl1_hostname")

The value of hostname would be "10.64.68.25".

Defining Logging for the Step

The PowerFlow system includes a logger for steps. The BaseStep class initializes the logger, so it is ready for use by each step.

To define logging in a step, use the following syntax:

self.logger.logging_level("log_message")

where:

  • logging_level is one of the following Python logging levels:
    • critical
    • error
    • info
    • warning
  • log_message is the message that will appear in the step log.

For example:

self.logger.info("informational message")

Raising Exceptions

The PowerFlow platform natively handles exceptions raised from custom steps. You can include a user-defined exception or any standard Python exception.

If an exception is raised at runtime, the step will immediately be marked as a failure and be discarded.

To view the exception and the complete stack trace, use the steps from the section on Viewing Logs.
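For example, a step might validate a value inside the execute method and raise a standard Python exception when something unexpected occurs. The following is a hypothetical sketch; the status-code check is not part of any shipped step.

def execute(self):
    status_code = 503  # hypothetical value returned by an earlier request in this step
    if status_code != 200:
        self.logger.error("Request failed with status {}".format(status_code))
        raise Exception("Unexpected status code: {}".format(status_code))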

Uploading Your Step

When you create a new step or edit an existing step, you must upload the step to the PowerFlow system.

There are two ways to upload a step to the PowerFlow system:

  • At the command line with the iscli utility
  • With the API

Uploading a Step with iscli

The PowerFlow system includes a command line tool called iscli. When you install PowerFlow, iscli is automatically installed.

To upload a step to the PowerFlow system using iscli:

  • Either go to the console of the PowerFlow system or use SSH to access the server.

  • Log in as isadmin with the appropriate password.

  • Enter the following at the command line:

    iscli -u -s -f path_and_name_of_step_file.py -H hostname_or_IP_address_of_powerflow_system -P port_number_of_http_on_powerflow_system -u user_name -p password

    where:

  • path_and_name_of_step_file.py is the full path and file name for the step.
  • hostname_or_IP_address_of_powerflow_system is the hostname or IP address of the PowerFlow system.
  • port_number_of_http_on_powerflow_system is the port number to access the PowerFlow system. The default value is 443.
  • user_name is the user name you use to log in to the PowerFlow system.
  • password is the password you use to log in to the PowerFlow system.

Uploading a Step with the API

The PowerFlow system includes an API. When you install the PowerFlow system, the API is available.

To upload a step with the API POST /steps:

POST /steps

{
  "name": "name_of_step",
  "data": "string"
}

where:

  • data is all the information included in the step.
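If you prefer to script the upload, the following is a minimal sketch that uses the Python requests library. The hostname, credentials, step name, and file name are placeholders, and the /api/v1 prefix is assumed to match the GET /steps example shown earlier in this section.

# Minimal sketch: upload a step file with POST /steps. All values are placeholders.
import requests

powerflow_url = "https://URL_for_your_PowerFlow_system"

with open("MyNewStep.py") as step_file:
    step_code = step_file.read()

response = requests.post(
    powerflow_url + "/api/v1/steps",
    json={"name": "MyNewStep", "data": step_code},
    auth=("isadmin", "password"),  # placeholder credentials
    verify=False,                  # assumption: self-signed certificate
)
print(response.status_code, response.text)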

Validating Your Step

After uploading a step, you can use the API POST /steps/run to run the step individually without running an application. This allows you to validate that the step works as designed.

To run a step from the IS API:

POST /steps/run

{
  "name": "name_of_step",
  ...all other data from the .json file for the step...
}

After the POST request is made, the PowerFlow system will dispatch the step to a remote worker process for execution. By default, the POST request will wait five seconds for the step to complete. To override the default wait period, you can specify the wait time as a parameter in the POST request. For example, to specify that the wait time should be 10 seconds:

POST /steps/run?wait=10

{
  "name": "example_step"
}

If the step completes within the wait time, the PowerFlow system returns a 200 return code, logs, output, and the result of the step.

If the step does not complete within the wait time, the PowerFlow system returns a task ID. You can use this task ID to view the logs for the step.

The API returns one of the following codes:

  • 200. Step executed and completed within the timeout period.
  • 202. Step executed but did not complete within the timeout period, or the user did not specify wait. Returned data includes a task ID to query for the status of the step.
  • 400. Required parameter for the step is missing.
  • 404. Step not found.
  • 500. Internal error. Database connection might be lost.
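The wait-and-poll behavior described above can be scripted as follows. This is a minimal sketch that uses the Python requests library; the hostname, credentials, and step payload are placeholders, and the /api/v1 prefix is assumed.

# Minimal sketch: run a step with a 10-second wait and inspect the result.
# All values are placeholders.
import requests

powerflow_url = "https://URL_for_your_PowerFlow_system"

response = requests.post(
    powerflow_url + "/api/v1/steps/run",
    params={"wait": 10},
    json={"name": "example_step"},  # plus any other data from the step's .json file
    auth=("isadmin", "password"),   # placeholder credentials
    verify=False,                   # assumption: self-signed certificate
)

if response.status_code == 200:
    print("Step completed:", response.json())
elif response.status_code == 202:
    print("Step still running; poll the returned task:", response.json())
else:
    print("Step did not run:", response.status_code, response.text)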

Viewing Logs

After running a step, you can view the log information for a step. Log information for a step is saved for the duration of the result_expires setting in the PowerFlow system. The result_expires setting is defined in the file /opt/iservices/scripts/docker-compose.yml. The default value for log expiration is 24 hours.

To view the log information for a step before running an integration, you can use the API POST /steps/run to run the step individually without running an application. You can then use the information in steps 3-6 below to view step logs.

To view the log information for a step:

  1. Run a PowerFlow application.

  2. Using an API tool like Postman or cURL, use the API GET /applications/{appName}/logs:

    GET URL_for_your_PowerFlow_system/api/v1/applications/integration_application_name/logs

    where URL_for_your_PowerFlow_system is the IP address or URL for the PowerFlow system.

  3. You should see something like this:

{
  "app_name": "example_integration",
  "app_vars": {},
  "href": "/api/v1/tasks/isapp-af7d3824-c147-4d44-b72a-72d9eae2ce9f",
  "id": "isapp-af7d3824-c147-4d44-b72a-72d9eae2ce9f",
  "start_time": 1527429570,
  "state": "SUCCESS",
  "steps": [
    {
      "href": "/api/v1/tasks/2df5e7d5-c680-4d9d-860c-e1ceccd1b189",
      "id": "2df5e7d5-c680-4d9d-860c-e1ceccd1b189",
      "name": "First EM7 Query",
      "state": "SUCCESS",
      "traceback": null
    },
    {
      "href": "/api/v1/tasks/49e1212b-b512-4fa7-b099-ea6b27acf128",
      "id": "49e1212b-b512-4fa7-b099-ea6b27acf128",
      "name": "second EM7 Query",
      "state": "SUCCESS",
      "traceback": null
    }
  ],
  "triggered_by": [
    {
      "application_id": "isapp-af7d3824-c147-4d44-b72a-72d9eae2ce9f",
      "triggered_by": "USER"
    }
  ]
}

  1. In the "steps" section, notice the lines that start with href and id. You can use these lines to view the logs for the application and the steps.

  5. To use the href value to get details about a step, use an API tool like Postman or cURL and then make a GET request with the href value:

    GET URL_for_your_PowerFlow_system/href_value

    where:

  • URL_for_your_PowerFlow_system is the IP address or URL for the PowerFlow system.
  • href_value is the href value you can copy from the log file for the application. The href value is another version of the step name.

    To view logs for subsequent runs of the application, you can include the href specified in the last_run field.

  6. To use the task id value to view details about a step, use an API tool like Postman or cURL and then use the API GET /tasks/{task_ID}:

    GET URL_for_your_PowerFlow_system/api/v1/tasks/task_id

    where:

    • URL_for_your_PowerFlow_system is the IP address or URL for the PowerFlow system.
    • task_id is the ID value you can copy from the log file for the application. The task ID specifies the latest execution of the step.

After you find the href and task ID for a step, you can use those values to retrieve the most recent logs and status of the step.
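As a final example, the following is a minimal sketch that retrieves the logs for an application run and then follows each step's href to get the step details. The hostname, credentials, and application name are placeholders; the href values are used exactly as they are returned by the logs endpoint.

# Minimal sketch: fetch application logs, then follow each step's href.
# All values are placeholders.
import requests

powerflow_url = "https://URL_for_your_PowerFlow_system"
auth = ("isadmin", "password")  # placeholder credentials

logs = requests.get(
    powerflow_url + "/api/v1/applications/example_integration/logs",
    auth=auth,
    verify=False,  # assumption: self-signed certificate
).json()

for step in logs.get("steps", []):
    detail = requests.get(powerflow_url + step["href"], auth=auth, verify=False)
    print(step["name"], step["state"], detail.status_code)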