Feature Request: Test Framework

One major omission we’ve found in SnapLogic is that there is no standard framework for testing Pipelines. One approach has been to put in a JSON Generator Snap with some test data; however, this means changing the Pipeline definition in order to perform the test. Ideally, we should be able to test the Pipeline without changing it, so that if it passes, we can promote that same version to Production without changing it back. Otherwise there is a risk that something breaks when the test data is removed.

An alternative is to create a triggered task for each Pipeline and use an external test tool (e.g. JUnit) to submit test data to the Pipeline. This works well, and is particularly useful for automated test execution and performance testing; however, it requires users to have Java development skills to develop tests.
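The triggered-task approach above can be driven from any test tool, not only JUnit. As a minimal sketch, here is the same pattern in Python using only the standard library; the task URL and bearer token are hypothetical placeholders, not real endpoints:

```python
import json
import urllib.request

# Hypothetical triggered-task URL and token -- substitute your own.
TASK_URL = "https://example.elastic.snaplogic.com/api/1/rest/slsched/feed/MyOrg/MyProject/MyPipelineTask"
TOKEN = "my-bearer-token"

def build_request(url: str, token: str, documents: list) -> urllib.request.Request:
    """Build a POST request that submits test documents to a triggered task."""
    body = json.dumps(documents).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

def run_test_case(documents: list) -> list:
    """Submit test input and return the Pipeline's output documents."""
    req = build_request(TASK_URL, TOKEN, documents)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A test runner can then call `run_test_case` with documents drawn from a pool of test data and assert on the returned output, without touching the Pipeline definition itself.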

We’ve requested (Service Request #18186) a feature set including:

  • Ability to validate and execute Pipelines using test data without changing the Pipeline definition (e.g. ‘run in test mode’)
  • Ability to set specific Snaps to ‘No Execute in Production’ or ‘No Execute in Test’, so that test-only logic can be included in Pipelines if necessary, and vice versa
  • Ability to mock Snap outputs and ‘flick a switch’ between mock results and real results without changing the Pipeline definition
  • Ability to create suites of automated tests which randomly submit from a pool of test data and can be scheduled
  • Ability to perform performance, soak and smoke testing using pre-defined automated tests
  • Ability to compare test results (Pipeline outputs) against expected results and notify only where there are discrepancies
  • Ability to capture, browse and compare test results over an extended period
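The comparison-and-notify item above is straightforward to prototype outside the platform. A minimal sketch, assuming Pipeline outputs are lists of flat JSON documents (the function name and message format are illustrative, not a SnapLogic API):

```python
def diff_documents(expected: list, actual: list) -> list:
    """Compare Pipeline output documents against expected ones.

    Returns a list of human-readable discrepancy messages; an empty
    list means the test passed and no notification is needed.
    """
    problems = []
    if len(expected) != len(actual):
        problems.append(
            f"document count: expected {len(expected)}, got {len(actual)}"
        )
    for i, (exp, act) in enumerate(zip(expected, actual)):
        # Check every field present in either document.
        for key in sorted(set(exp) | set(act)):
            if exp.get(key) != act.get(key):
                problems.append(
                    f"doc {i}, field '{key}': "
                    f"expected {exp.get(key)!r}, got {act.get(key)!r}"
                )
    return problems
```

A scheduler could run this after each test execution and send a notification only when the returned list is non-empty.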

I’d be keen to hear any thoughts or feedback from the community - is this a feature you would like to see too?


@cj.ruggles Thank you for raising this enhancement request. The information you provided is useful, and we will ensure this is taken into consideration in future product planning.


I would like to see some incremental progress on this front, but it is difficult for me to say exactly what I would like to see first. The list looks fairly complete though.

In general, we have been logically dividing our non-trivial pipelines into three functional areas: ingress, transformation, egress. Typically, the ingress and egress portions are abstracted to sub-pipelines so that the transformation area can be tested with Fake (or Mock if you prefer) sub-pipelines.

So some tooling to support that pattern would be most helpful.
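To make the ingress/transformation/egress pattern concrete, here is a sketch with the sub-pipelines modeled as plain Python functions; the names and data are purely illustrative, not SnapLogic APIs:

```python
def fake_ingress():
    # Fake ingress sub-pipeline: emit fixture documents
    # instead of reading from the real source system.
    return [{"amount": "10.50"}, {"amount": "3.25"}]

def transform(docs):
    # The transformation logic under test:
    # convert string currency amounts to integer cents.
    return [{"amount_cents": round(float(d["amount"]) * 100)} for d in docs]

captured = []

def fake_egress(docs):
    # Fake egress sub-pipeline: capture output
    # instead of writing to the real target system.
    captured.extend(docs)

# Wire the transformation to the fakes and assert on what was captured.
fake_egress(transform(fake_ingress()))
assert captured == [{"amount_cents": 1050}, {"amount_cents": 325}]
```

The design choice is the same as in the pipelines: because ingress and egress are abstracted behind a boundary, the transformation can be exercised end to end with fakes, and only the fakes change between test and production.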

Our typical pain point is acquiring and managing test data, so a way to manage that would be helpful as well.


Excellent request. Highly desired. We are becoming overwhelmed by the manual testing as we increase our new pipeline development.


This goes a long way to what I have been looking for as well. A full Test Driven Development SnapLogic environment is the dream I was hoping for.

We are looking for the ability to run performance tests using pre-defined automated tests as well. Do you plan to develop something in the near future? Perhaps you could share the SnapLogic team’s approach to performance testing of pipelines? Thank you.