Using the PowerFlow Synchronization PowerPack SDK

The PowerFlow Synchronization PowerPack Software Development Kit (SDK) contains a functional environment for developing and testing Synchronization PowerPacks, without the need for a full PowerFlow System. The SDK includes auto-completion and other features to help you quickly create a Synchronization PowerPack for PowerFlow.

This section describes how to set up and use the PowerFlow Synchronization PowerPack SDK to create custom Synchronization PowerPacks. It also covers how you can use the PowerFlow Synchronization PowerPack Pytest Fixtures to set up unit test coverage of individual steps.

The Synchronization PowerPack SDK is meant to be used by developers with expertise in Python development.

Contact your ScienceLogic Customer Success Manager (CSM) to get access to the SDK, which is not included in a PowerFlow system by default.

Setting up the PowerFlow Synchronization PowerPack SDK

To set up the PowerFlow Synchronization PowerPack SDK:

  1. Install the Cookiecutter repository and set up the Synchronization PowerPack directory structure, as shown in the example below.
  2. Open the newly created directory for Synchronization PowerPacks using VS Code or PyCharm, following the steps below.
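
The following commands are a minimal sketch of step 1, assuming Cookiecutter is installed from PyPI and the Synchronization PowerPack project is generated from the SyncPacks Cookiecutter repository (the exact template prompts and repository access depend on your environment):

pip install cookiecutter
cookiecutter https://github.com/ScienceLogic/is_syncpack_cookiecutter.git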

VS Code

The generated package contains a VS Code Dev Container. For more information about Dev Containers, see https://code.visualstudio.com/docs/devcontainers/containers.

When you open the newly created Synchronization PowerPacks directory with VS Code, you will get a prompt to open the workspace within the container.

To open the new directory with VS Code:

  1. Install the Dev Containers extension.

  2. Install the Python extension and tools recommended by VS Code.

  3. After installing the Cookiecutter repository, open the newly created directory for Synchronization PowerPacks in VS Code. The .devcontainer folder should be detected, and you should be prompted to re-open the workspace in the container.

  4. Click Yes.

  5. If you are not prompted, or if you miss the prompt, open the Command Palette (Ctrl+Shift+P) and run the following command:

    Remote-Containers: Reopen in Container

    If you need any additional Synchronization PowerPacks or packages to be available, pip install them into your container. Be sure to add those Synchronization PowerPacks or packages to your dependencies.

  6. Verify that the container is running by running docker ps.

  7. To make sure that the Synchronization PowerPack workspace was correctly set up, run the "DummyStep" test script located in tests/steps/test_DummyStep.py by running pytest, as shown in the example after this list. For more information, see https://code.visualstudio.com/docs/python/testing.

  8. You can now start using the SDK to create steps and applications for the new Synchronization PowerPack. Auto-completion and documentation are enabled for all of the PowerFlow classes and methods.
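
For example, you can run the "DummyStep" test from a bash terminal inside the container:

pytest tests/steps/test_DummyStep.py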

For lists of additional VS Code tasks and available snippets, see the Tasks and Snippets sections in the README.md file at the SyncPacks Cookiecutter GitHub page.

PyCharm Professional Edition

The Cookiecutter-generated package includes PyCharm configuration settings to improve the process of developing Synchronization PowerPacks.

To use PyCharm:

  1. After installing the Cookiecutter repository, open the newly created directory for Synchronization PowerPacks in PyCharm.

    If you are working with more than one Synchronization PowerPack repository, ScienceLogic recommends that you open the root directory where the needed repositories are located.

  2. Right-click the Synchronization PowerPack directory, select Mark Directory, and select Sources Root.

  3. Configure the docker-compose.yml file located in .pycharm_devcontainer as a Docker Compose Remote Python interpreter. Follow the instructions in the PyCharm official documentation: https://www.jetbrains.com/help/pycharm/using-docker-compose-as-a-remote-interpreter.html#docker-compose-remote.
  4. Select the interpreter that you created in step 3 as the default for your workspace.
  5. Check the test profile for tests/steps/test_DummyStep.py to make sure that the Python interpreter is set to the Docker Compose interpreter configured above. Follow the instructions in the PyCharm run/debug configuration documentation to set the corresponding interpreter: https://www.jetbrains.com/help/pycharm/creating-run-debug-configuration-for-tests.html.
  6. To make sure that the Synchronization PowerPack workspace was correctly set up, run the "DummyStep" test script located in tests/steps/test_DummyStep.py, using the steps in the PyCharm Run Tests documentation: https://www.jetbrains.com/help/pycharm/performing-tests.html#test-mutliprocessing.
  7. You can now start using the SDK to create steps and applications for the new Synchronization PowerPack. Auto-completion and documentation are enabled for all of the PowerFlow classes and methods.

For advanced settings, see Advanced Setup.

PowerFlow Synchronization PowerPack Pytest Fixtures

The PowerFlow pytest fixtures allow for unit test coverage of individual steps. The pytest fixtures are included in the PowerFlow Synchronization PowerPack SDK.

You can list the available fixtures by running pytest --fixtures in the dev container. The SDK includes the following fixtures:

  • test_is_conf. The consistent configuration object for Step Run tests.
  • patched_logger. The patched logging instance that allows for assert tests and output to the console.
  • content_manager. The generic content manager fixture, which can be patched for further testing needs.
  • syncpack_step_runner. Runs a PowerFlow step.
  • powerflow_content_manager. Provides a Content Manager instance that can be asserted against.

These fixtures allow for unit test coverage of a wide variety of steps, including steps that use data from previous steps, read or write from the cache, make API calls, and require patching. The fixtures also let you assert against log messages.
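
For example, a test can combine the syncpack_step_runner and patched_logger fixtures to run a step and assert against its log output. The following is a minimal sketch: the step_dict and in_data arguments, the log message, and the mock-style assertion on patched_logger are all assumptions that depend on your step and on the fixture's implementation:

def test_MyStep_logging(step_dict, in_data, syncpack_step_runner, patched_logger):
    # Run the step with simulated input data (step_dict and in_data are
    # assumed to be parametrized elsewhere in the test module).
    syncpack_step_runner.data_in = in_data
    syncpack_step_runner.run(step_dict)
    # Hypothetical assertion; assumes the patched logging instance records
    # calls like a mock object.
    patched_logger.info.assert_any_call("Step finished successfully")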

Enabling Unit Tests for PowerFlow Steps

To enable unit tests using the pytest fixtures:

  1. Choose the step to test. Be sure to understand the goal of the step, which step parameters are needed, and what the execute method is doing. For more information, see Creating a Step.

  2. In the tests directory, add a file called test_{step_name}.py.

  3. Copy the content of tests/steps/test_DummyStep.py into the new test step file.

  4. Rename the test function to test_{step_name}.

  5. Edit the following test function arguments:

    • step_dict. The step definition as it would be defined in an application.

      • name and file. Use the name of your step.
      • syncpack. Uses the current Synchronization PowerPack by default. No changes are needed.
      • custom step parameters. Any custom parameters that your step uses.
    • in_data. Input data that simulates the data received from the previous step.

    • out_data. Output data that represents the data saved for the next step using the save_data_for_next_step method. This is useful for comparing the result generated by the step with the expected data (out_data).

  6. Parametrize any other argument the test step may need. The following is a simple example of how to mock API responses using the library requests_mock and a mock_data argument.

    # ... other imports ...
    from requests_mock import Mocker


    def test_GetREST(
        step_dict, in_data, mock_data, out_data, syncpack_step_runner, requests_mock: Mocker
    ):
        # Register the mocked API response; json=mock_data assumes the
        # endpoint returns a JSON body.
        requests_mock.get(
            f"https://snow.test/api/x_sclo_scilogic/v1/sciencelogic/file_systems"
            f"?region=pytest&sysparm_limit={step_dict['chunk_size']}",
            json=mock_data,
        )
        # Feed the simulated input data to the step, run it, and compare the
        # output with the expected data.
        syncpack_step_runner.data_in = in_data
        data = syncpack_step_runner.run(step_dict)
        assert data == out_data
  7. Use any of the pytest fixtures as needed inside your step test code. The following fixture is included by default as a test argument, and it should always be part of the step's test arguments:

    • syncpack_step_runner. Helps execute the step defined in the step_dict argument.
  8. Run the unit tests using pytest, as described in the setup steps above for your development environment (VS Code or PyCharm). A sketch of parametrized test arguments appears below.
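
The following is a minimal sketch of how the step_dict, in_data, and out_data arguments described in step 5 can be supplied with pytest.mark.parametrize. The step name, Synchronization PowerPack name, custom parameter, and data values are all hypothetical:

import pytest


@pytest.mark.parametrize(
    "step_dict,in_data,out_data",
    [
        (
            {
                "file": "MyStep",           # hypothetical step name
                "name": "MyStep",
                "syncpack": "my_syncpack",  # hypothetical Synchronization PowerPack
                "chunk_size": 10,           # hypothetical custom step parameter
            },
            {"devices": ["device_1"]},      # simulated data from the previous step
            {"devices": ["device_1"]},      # data expected via save_data_for_next_step
        ),
    ],
)
def test_MyStep(step_dict, in_data, out_data, syncpack_step_runner):
    syncpack_step_runner.data_in = in_data
    data = syncpack_step_runner.run(step_dict)
    assert data == out_data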

Using the iscli Tool

You can use the iscli tool with VS Code and PyCharm Professional Edition.

Using iscli with VS Code

After the SDK is up and running, you can open a bash terminal and execute iscli commands to build and upload Synchronization PowerPacks as mentioned in the iscli section.

VS Code tasks created by the Cookiecutter tool can also be used to execute some iscli commands. For more information, see the Tasks section in the README.md file at the SyncPacks Cookiecutter GitHub page.

For Linux environments, you will need to uncomment the line "runArgs": ["--network", "host"] in the .devcontainer/devcontainer.json file, so that Synchronization PowerPacks can be published to remote systems using iscli, and dependencies can be installed from PyPI.

Using iscli with PyCharm

PyCharm currently does not support keeping a container alive as VS Code does.

To be able to open a terminal and execute commands inside the SDK container:

  1. In the Docker tool window that PyCharm offers, go to Services > Docker, start the container if it is not already running, then right-click the running container and select Create terminal.
  2. When the terminal is up and running, you can execute bash commands in it, including iscli commands. For more information, see https://www.jetbrains.com/help/pycharm/docker.html#interacting-with-containers.

You can also perform this action using the following Docker command:

docker run -it --volume /home/syncpack_test/:/workspace/syncpack_test --name devcontainer_iscli --user=1000 --rm registry.scilo.tools/sciencelogic/pf-syncpack-devcontainer:2.4.1 /bin/bash

Use --network host for Linux environments, if needed. Be sure the user UID matches your local user.
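
For example, on Linux you can combine both recommendations by adding --network host and passing your own UID with a command substitution; this is a variation of the command above:

docker run -it --network host --volume /home/syncpack_test/:/workspace/syncpack_test --name devcontainer_iscli --user=$(id -u) --rm registry.scilo.tools/sciencelogic/pf-syncpack-devcontainer:2.4.1 /bin/bash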

When the container is running, you can execute iscli commands inside the container.

Advanced Setup

This section explains how a developer can use third-party online or offline dependencies, and how to load more than one Synchronization PowerPack directory into a single workspace.

Load Dependencies into a Synchronization PowerPack Workspace

When developing Synchronization PowerPacks, external dependencies are often needed, such as ScienceLogic Synchronization PowerPacks or third-party Python libraries.

Be sure to update the "requires_dist" array in the meta.json file with the dependencies needed. For more information about how to edit this file, see Synchronization PowerPack Properties.
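
For example, a "requires_dist" array might look like the following. The package names and version constraints are hypothetical and depend on your Synchronization PowerPack:

"requires_dist": [
    "base_steps_syncpack >= 1.0.0",
    "requests >= 2.28.0"
]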

Advanced Setup: VS Code

To address offline dependencies (.whl files):

  1. Copy all of the necessary .whl files, including ScienceLogic Synchronization PowerPacks, into the .offline_dependencies directory.

  2. Run the following task to install the Synchronization PowerPack using those offline dependencies:

    PF: Install SP - Offline Dependencies

To address dependencies from a PowerFlow System:

  1. Choose a PowerFlow system that has the ScienceLogic Synchronization PowerPacks that are needed as dependencies for this Synchronization PowerPack. You will need the corresponding credentials for that PowerFlow system.

  2. Run the following task to install the Synchronization PowerPack using packages from the PowerFlow system:

    PF: Install SP - Dependencies from PF(devpi)

If dependencies from a PowerFlow system need to be used along with offline dependencies, run the following task: PF: Install SP - Dependencies from PF(devpi) + Offline dependencies.

Advanced Setup: PyCharm

To address offline dependencies (.whl files), copy all of the needed .whl files (including ScienceLogic Synchronization PowerPacks) into the .offline_dependencies directory.

To address dependencies from a PowerFlow System:

  1. Add the PF System to the /syncpack_name/.pycharm_devcontainer/pip.conf file:

    [global]
    timeout = 0
    retries = 0
    extra-index-url = https://isadmin:password@PF_HOST:3141/isadmin/syncpacks
    trusted-host = PF_HOST

  2. After the file is configured, set up the Docker Compose file as a Remote Python interpreter if you have not already done so: https://www.jetbrains.com/help/pycharm/using-docker-compose-as-a-remote-interpreter.html#docker-compose-remote.

  3. Rebuild the image if needed, following the PyCharm documentation: https://www.jetbrains.com/help/pycharm/using-docker-compose-as-a-remote-interpreter.html#tw.

  4. Run the tests normally. The environment should have the necessary dependencies installed.

Opening Multiple Synchronization PowerPack Directories in One Workspace

This section describes how to set up multiple Synchronization PowerPack directories in a single workspace, which makes it easier to update Synchronization PowerPacks that share dependencies.

Multiple Directories: VS Code

  1. Select a Synchronization PowerPack directory as primary.

  2. Edit the .devcontainer/devcontainer.json file by adding the root directory /home/username/pf_syncpacks_workspace as a bind volume:

    "mounts": [

    "source=/home/username/pf_syncpacks_workspace,target=/workspaces,type=bind"

    ],

  3. Edit the /.vscode/pf-syncpack.code-workspace file and add the secondary Synchronization PowerPack paths.

    {
        "folders": [
            {
                "path": ".."
            },
            {
                "path": "/workspaces/pf_syncpack_test"
            },
            {
                "path": "/workspaces/pf_syncpack_other"
            }
        ]
    }
  4. Open the Synchronization PowerPack directory in VS Code, selecting the Reopen in Container option as explained in the Cookiecutter README.md file: https://github.com/ScienceLogic/is_syncpack_cookiecutter#devcontainer. Rebuild the container, if necessary.

  5. After the Dev Container has started successfully in step 4, go to the primary Synchronization PowerPack's .vscode/pf_syncpacks_workspace.code-workspace file and select Open Workspace.

  6. Install all of the secondary Synchronization PowerPacks using any of the PF: Install SP ... tasks. This enables the tests to run.

For more information, see the VS Code multi-root workspaces documentation: https://code.visualstudio.com/docs/editor/workspaces#_multiroot-workspaces.

Multiple Directories: PyCharm

  1. Open the root Synchronization PowerPack directory mentioned above at /home/username/pf_syncpacks_workspace.
  2. Make sure that all of the Synchronization PowerPack directories are located in that directory. PyCharm will automatically recognize any new Synchronization PowerPack located in the root directory.
  3. Right-click each Synchronization PowerPack directory, select Mark Directory, and select Sources Root.
  4. When all of the Synchronization PowerPacks are marked as sources, the tests for all of them can run normally.

Using Templates to Create Steps and Application JSON Files

This section describes how to use templates to create new applications, steps, and step test files. When creating many new files, the use of templates can help speed up the process.

Using Templates: VS Code

Because VS Code does not have a built-in way of configuring file templates, you can achieve something similar using snippets. The Cookiecutter includes three snippets that can help you quickly add application, step, and step test code into a file.

To use the Cookiecutter snippets:

  1. Create the corresponding empty file: app.json or StepName.py.
  2. Start typing the prefix of one of the snippets (pfapp, pfstep, or pfsteptest), and when VS Code suggests the expected snippet, press Enter.
  3. The cursor is positioned on the first property that needs to be edited. Press Tab to go to the next property.

For more information, see the Cookiecutter README.md file: https://github.com/ScienceLogic/is_syncpack_cookiecutter#devcontainer.

To create your custom snippet:

  1. Create a new file in the .vscode directory called {SnippetName}.code-snippets.
  2. Copy the needed code into the VS Code snippet generator: https://snippet-generator.app/. Use variables if needed; check the Synchronization PowerPack's current snippets for reference.
  3. Copy the resulting JSON code into the {SnippetName}.code-snippets file, as shown in the sketch below.
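
The following is a minimal sketch of a resulting {SnippetName}.code-snippets file. The snippet name, prefix, and body are hypothetical; only the overall structure follows the VS Code snippet format:

{
  "My PF Step": {
    "prefix": "pfmystep",
    "body": [
      "class ${1:StepName}(${2:BaseClass}):",
      "    def execute(self):",
      "        $0"
    ],
    "description": "Hypothetical skeleton for a new PowerFlow step"
  }
}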

Using Templates: PyCharm

PyCharm lets you create file templates. When creating many new files, the use of templates can help speed up the process.

The Cookiecutter includes a zip file with three templates in /.pycharm_devcontainer/settings.zip with basic information to create a new step, application, and step test.

  1. To import the zip file, see https://www.jetbrains.com/help/resharper/Templates__Managing_Templates__Importing_and_Exporting_Templates.html#cb67c1bc.
  2. To use the templates, select New and select any of the three templates.

To create your custom template with PyCharm:

  1. Create a File Template: https://www.jetbrains.com/help/pycharm/using-file-and-code-templates.html#create-new-template.

  2. Copy the corresponding code into the template, defining custom template variables if needed: https://www.jetbrains.com/help/pycharm/file-template-variables.html#custom_template_variables.

  3. Save the template: https://www.jetbrains.com/help/pycharm/using-file-and-code-templates.html#save-file-as-template.

Frequently Asked Questions

Can I remove the dummy steps and applications created by the Cookiecutter tool?

Yes. Their only purpose is to serve as examples.

Should git be initialized in the newly created Synchronization PowerPack directory?

It is not required, but it is recommended so that you can push and save your changes to the corresponding code repository.

You can also do this with VS Code tasks. For a list of VS Code tasks, see the Tasks section in the README.md file at the SyncPacks Cookiecutter GitHub page.

Should I use the SDK within a PowerFlow system?

No. You should only use the SDK in a local development environment, and not in a PowerFlow system.