Expert opinion

A Framework for deploying them all!

Or: how can you deploy IRIS productions more quickly and with greater peace of mind?

The aim of interoperability productions is to enable you to connect systems in order to transform and route messages between them. To connect systems, you develop, configure, deploy and manage productions that integrate several solutions.

This is what the InterSystems documentation tells us on its reference site, but what do you actually have to do to deploy a production?

Depending on the use case, productions are composed to connect external systems to the IRIS Data Platform. To do this, an environment specific to each production has to be created, made up of the following components (a minimal code sketch follows the list):

  • Business service
  • Business process (optional)
  • Business operation
  • Table definition schemas (.cls class files)
  • Namespace initialisation file (.cpf)
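
To make these components concrete, here is a minimal sketch of a business service and a business operation written in Python, in the style of the community interoperability-embedded-python (IoP) package. All class names, fields and targets below are invented for the example, and the exact API may differ between versions:

```python
# Minimal sketch, assuming IoP-style components; names are illustrative.
from dataclasses import dataclass

from iop import BusinessOperation, BusinessService, Message


@dataclass
class PatientMsg(Message):
    # Data structure of one message
    patient_id: str = None
    name: str = None


class PatientService(BusinessService):
    # Generates messages and hands them to the production
    # (adapter/polling settings are configured on the production, not here)
    def on_process_input(self, message_input):
        msg = PatientMsg(patient_id="123", name="Doe")
        return self.send_request_sync("Demo.PatientOperation", msg)


class PatientOperation(BusinessOperation):
    # Ingests each message into the corresponding database table
    def on_message(self, request: PatientMsg):
        self.log_info(f"Storing patient {request.patient_id}")
```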

Of course, a key benefit of processing messages through productions is that every message is traced, so that any undesirable event can be tracked back to its source.

What if I told you that you could deploy your productions using our IRIS interoperability framework with a wave of a magic wand?

The mainframe approach on which our Framework is based means that InterSystems IRIS® productions can be deployed at high speed, without having to recreate all the components by hand.

The Framework also lets us add an interesting feature: an outgoing API (RestForms2) through which the data in the tables deployed with the production can be read.

Data can then be queried and returned in JSON format.
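
Purely as an illustration, and assuming a RestForms2 endpoint published under /forms with default development credentials, reading one of the deployed objects from Python could look like this (host, port, path, class name and id are all assumptions to adapt to your deployment):

```python
# Hypothetical RestForms2-style read; adjust URL, class and credentials.
import requests

BASE = "http://localhost:52773/forms"

resp = requests.get(
    f"{BASE}/form/object/Demo.Patient/1",  # one object of class Demo.Patient, id 1
    auth=("_SYSTEM", "SYS"),               # IRIS credentials (dev defaults)
)
resp.raise_for_status()
print(resp.json())                         # the data comes back as JSON
```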

The Framework generates all the components from a functional specification file, completed together with the business and with our project manager (whose role is to make sure that every necessary piece of information finds its place).
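
The exact layout of the functional specification file belongs to the Framework, but to fix ideas, a single entry could carry something like the following (every key and value here is invented for illustration):

```python
# Invented example of what one specification entry might describe.
spec_entry = {
    "flow": "PatientIntake",      # names the service/operation pair to generate
    "table": "Demo.Patient",      # target table for the DDL script
    "fields": [
        {"name": "patient_id", "type": "VARCHAR(32)",  "required": True},
        {"name": "name",       "type": "VARCHAR(128)", "required": False},
    ],
}
```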

The script works in two stages: building the ETL flow and the data drop point.

Once it has been filled in as required, the functional specification file is used first to generate:

  • the message serialisation file (data classes; obj.py)
  • the data structure file for each message (msg.py)
  • the message generation file (bs.py)
  • the file for ingesting messages into the corresponding database (bo.py)

It is then used to create/delete the tables in the database, in the form of an SQL script containing DDL (Data Definition Language) instructions.
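
As a rough sketch of the second stage, here is how a generator could turn such an entry into a DDL instruction. This is an assumption about the approach, not the Framework's actual code:

```python
# Minimal sketch: build a CREATE TABLE statement from a spec entry.
def build_ddl(table: str, fields: list[dict]) -> str:
    cols = ",\n  ".join(
        f"{f['name']} {f['type']}" + (" NOT NULL" if f.get("required") else "")
        for f in fields
    )
    return f"CREATE TABLE {table} (\n  {cols}\n)"

print(build_ddl("Demo.Patient", [
    {"name": "patient_id", "type": "VARCHAR(32)", "required": True},
    {"name": "name", "type": "VARCHAR(128)"},
]))
# CREATE TABLE Demo.Patient (
#   patient_id VARCHAR(32) NOT NULL,
#   name VARCHAR(128)
# )
```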

All of which saves you a lot of time!


The best part is that the Framework can be easily deployed from a Docker container!

If you’re still not convinced, how exactly does using this Framework save you 80% of your time?

What if I told you that the code deployed by the Framework is validated by the publisher, InterSystems®; that it lets your team work on standardised code; that this standardisation makes maintenance campaigns more efficient, whether you are updating code or hunting for bugs; and that it lets you interact with your data through an API mechanism (from the repository of InterSystems IRIS compatible packages, all versions included)?

What do we mean by “the code is validated by the publisher”?

Simply that it respects Python standards and those of the vendor in terms of architecture and of calls to the internal mechanisms of InterSystems IRIS®, and that it can interface with the ObjectScript language and vice versa.
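
For example, inside IRIS's embedded Python the built-in iris module exposes ObjectScript classes and SQL directly; the table name below is a hypothetical example, and the reverse direction goes through %SYS.Python from ObjectScript:

```python
# Runs inside IRIS embedded Python, where the `iris` module is built in.
import iris

# Call an ObjectScript classmethod from Python
print(iris.cls("%SYSTEM.Version").GetVersion())

# Query a production's table through the same bridge (hypothetical table)
rs = iris.sql.exec("SELECT COUNT(*) FROM Demo.Patient")
for row in rs:
    print(row[0])
```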

In the near future, we will show you a case study of the Framework in an operational environment.