How to create a Guided Decision Table in Drools

1. Create a new project <project-name> (e.g., UPLOAD).

2. Add a data object asset (e.g., upload).

Picture1

3. Add a Guided Decision Table asset using the "Add Asset" option on the home page.

4. Enter the GDT name, choose a package, and select an option from the Hit Policy drop-down. You can pick any of the listed policies based on your requirement; in this example, we use First Hit.

Picture2

5. Click the OK button; the GDT appears as below.

Picture3

6. Insert columns by clicking the Columns tab; a popup window appears as below.

Picture4

7. Configure the options as shown in the screenshot below.

Picture5

7.1 New Column:

Select "Add a Condition", then click Next.

7.2 Pattern:

Picture6

Then click the "Create a new fact pattern" button; a new popup appears.

Picture7

Provide the Binding value and click Next.

Picture8

7.3 Calculation type:

Select the Literal value radio button and click Next.

You can select other options based on your requirement.

Picture10

7.4 Field:

Choose the field value from the drop-down; Binding is optional.

Picture11

7.5 Operator:

Choose the "equal to" option from the drop-down, then click Next.

Picture12

7.6 Value options:

Here we can provide multiple options, comma-separated, in the Value list. They then appear in the Default value drop-down as shown below.

Picture13

We have entered gold and diamond, and the same values appear in the drop-down. Then click Next.

7.7 Additional Info:

The Header field is mandatory; fill it in and click the Finish button.

Picture14

8. Below is the table structure.

Picture15

9. Here we can add rows using "Append row" under the Insert button drop-down.

Picture16

10. We can add multiple rows using the same option. If we double-click the gold cell, it shows the list of options we added in the previous steps.

Picture17

11. Do not forget to save.

Picture18

12. Using the above steps, we can add multiple columns as per our requirement.

13. Next, we add one more column of type Action.

14. Click the Insert tab and choose "Insert Column".

Picture19

15. Select "Set the value of a field" and click Next.

Picture20

16. Choose the upload[auto] pattern from the Pattern field drop-down, then click Next.

Picture21

17. Select the discount field from the Field drop-down.

Picture22

18. Provide the optional values; they then appear in the drop-down as below.

Picture23

19. Enter the Header description as Action and click Finish.

Picture24

20. Below is the final GDT structure.

Picture25

21. Click the Save button, then click the Validate button; validation should succeed.
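Behind the scenes, each row of the guided decision table compiles to an ordinary Drools rule, so the finished table can also be exercised from plain Java (for example, in a unit test) rather than only through Test Scenarios or REST. The sketch below is illustrative only: the Upload fact class, its type and discount fields, and the presence of a default KIE base/session on the classpath are assumptions based on the objects created above, not something the steps themselves produce.

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class GdtSmokeTest {
    public static void main(String[] args) {
        // Assumes the kjar built from this project is on the classpath
        // and that a default KIE base/session is defined.
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession();

        // Hypothetical fact class generated from the "upload" data object;
        // the field names stand in for the condition and action columns above.
        Upload fact = new Upload();
        fact.setType("gold");        // one of the value-list entries from the condition column

        session.insert(fact);
        session.fireAllRules();      // with First Hit, only the first matching row applies

        System.out.println("Discount applied: " + fact.getDiscount());
        session.dispose();
    }
}

The Test Scenario asset described next is the simpler option inside Business Central; a Java sketch like this is only useful if you already consume the kjar from application code.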

 

Use the Test Scenario asset for testing instead of Postman:

1. Once deployment is complete, add a Test Scenario asset using the "Add Asset" option.

2. Below is the normal Test Scenario Asset.

Picture26

3. Provide the condition and action details as below.

Picture27

4. Start testing using the Play option; the response looks like below.

Picture28

5. If something is wrong, it is shown like below.

Picture29


CI/CD Email Alert Configuration on Specific File Change

Background:

A notification has to be sent to the developer who committed a change to a configuration file, so that retrofitting the file to the other repos is not missed, thereby avoiding issues upfront. At present, git offers a notification feature via the "GIT Integrator service call email on push". However, this service cannot restrict the mail notification to a particular file change, so inboxes are flooded with emails for every check-in.

Steps to achieve this

  • To configure an email notification for specific file changes on a git commit, we can use the GitLab CI "only" keyword. Following is an example of the pipeline job used for this.

send_email:                       # GitLab CI job / stage
  stage: notify
  only:                           # GitLab CI "only" keyword
    changes:
      - sb/temp/azu/*.prop
  script:
    - git log -p -2 *.prop | head -50 > commit_history.txt
    - |
      if [[ -z "$REPO" ]]; then
         curl -s --user "api:$API_KEY" "https://api.mailgun.net/v3/$DOMAIN/messages" -F from='Gitlab <gitlabsuer@domain.com>' -F to=$GITLAB_USER_EMAIL -F to=$GITLAB_USER_TLD -F to=$GITLAB_USER_NLD -F cc=$GITLAB_USER_TWO -F cc=$GITLAB_USER_PM -F subject='Gate Property file Modified' -F text='Retrofit property files to other Repos' -F attachment='@commit_history.txt'
      else
         echo "It's not allowed to trigger"
      fi

  • In the above yml snippet, the "only" keyword holds the condition for executing the script. Whenever one of the property files changes, the pipeline recognizes the change and triggers this job.
  • An email is sent via the Mailgun service provider for that particular property-file change. Without running our own SMTP server, we can create a Mailgun account; it provides an API that can be called with the required parameters to send the notification for us. It is a handy feature and easy to use. The service is free for a couple of months and includes 5,000 emails per month.
  • Sample curl command to hit the email API:

curl -s --user "api:$API_KEY" "https://api.mailgun.net/v3/$DOMAIN/messages" -F from='Gitlab <gitlabsuer@domain.com>' -F to=$GITLAB_USER_EMAIL -F to=$GITLAB_USER_TLD -F to=$GITLAB_USER_NLD -F cc=$GITLAB_USER_TWO -F cc=$GITLAB_USER_PM -F subject='Property file Modified' -F text='Retrofit property files to other Repos' -F attachment='@commit_history.txt'

  • The private key and service domain from Mailgun are required in the yml. The steps below help us find them in Mailgun.
  • Create an account with Mailgun using the link below:

1. https://signup.mailgun.com/new/signup  (Do not click "Add payment info now"; we can sign up to the site without credit/debit card details.)

2. Provide all the sign-up details, such as email and name.

3. Once logged in, you will see the screen below:

Picture1

4. Then you will see the screen below:

Picture2

5. Click on the domain to see the domain name. On the right side are the recipients, where we have to add the mail IDs of the people to whom the notification should be sent; with a free account we can currently send to only 5 recipients.

Picture3

6. Click on Settings and then API Keys to get the private key:

Picture4

7. All the variables are defined globally in the GitLab UI under Settings > CI/CD > Variables.

Picture5

  • We can also use SMTP parameters instead of the API; a sample is below:

./swaks --auth \
        --server smtp.mailgun.org \
        --au postmaster@sandbox43a6751f9b1c43faaf8fa187eadc3a0f.mailgun.org \
        --ap 583627bfa8d4a292c4aa779f9b61aafb-ba042922-a9cb1219 \
        --to recipient@domain.com \
        --h-Subject: "Alert!!! Property file changed" \
        --body 'check the commits'

Reference: https://documentation.mailgun.com/en/latest/


Processing X12 EDI data in Mule 4

Background

X12 EDI is a data format based on ASC X12 standards developed by the American National Standards Institute (ANSI) Accredited Standards Committee (ASC). It is used to electronically exchange business documents in specified formats between two or more trading partners. EDI data is widely used in the logistics and healthcare industries.

In this article, we shall see how an X12 EDI document is parsed and converted into an XML document. The following connector from Anypoint Exchange is used for working with X12 data: "X12 Connector – Mule 4".

Steps

The following steps are performed for working with EDI documents

  1. Create a project and import X12 Connector
  2. Create a Mule flow that reads the X12 data, parses, and transforms it into XML format

1.  Create a project and import X12 Connector

  • Open Anypoint Studio and create a project.
  • In the new project, click on Search in Exchange (highlighted in yellow). This is to import X12 Connector – Mule 4 from Exchange into Studio.

Picture1

  • In the dialog box that opens, click Add Account to enter the credentials for the Anypoint Platform, if not already saved.

Picture2

  • After the Anypoint Platform credentials are entered, you are connected to Anypoint Exchange from Anypoint Studio.
  • Type x12 as shown in the picture below, select X12 Connector – Mule 4, click Add and then Finish.

Picture4

  • X12 Connector – Mule 4 is now successfully added to Studio from Exchange.

 

2. Create a Mule flow that reads the X12 data, parses, and transforms it into XML format

  • Create a Mule flow like the one below with the following processors.
  • X12 Read reads the EDI payload received from the Listener.

When an X12 Read processor is executed, it generates an object with the schema shown in the screenshot below.

Picture7

  • Transform Message reads the output of X12 Read and converts it to an XML payload; a minimal DataWeave sketch follows the screenshot below.

Picture8
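The Transform Message processor shown above typically contains only a short DataWeave script. A minimal sketch, assuming the parsed X12 object is passed through as-is and wrapped in a single root element (XML requires exactly one root; the element name EDIDocument is a placeholder, not something the connector mandates):

%dw 2.0
output application/xml
---
{
    EDIDocument: payload
}

Changing "output application/xml" to "output application/json" would produce JSON instead, in line with the note further below about transforming the X12 Read output into other formats.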

  • Run the Mule flow in Studio in either run or debug mode.
  • Use Postman (or any tool of your choice) to test the above REST API flow. Screenshot below.
  • In the Body section, enter an EDI payload of any type and invoke the Mule flow running in Anypoint Studio.
  • The X12 Read processor parses the data and returns it as a Java object.
  • Using a Transform Message processor, as seen above, the object is converted into XML format. In general, the output of X12 Read may be transformed into any other data format (e.g., JSON) that the application must work with.
  • The screenshot below shows the input EDI and the output XML after it is transformed.

Picture9

Conclusion

As seen in the flow, parsing and processing EDI data into other formats using the X12 Connector is straightforward. In practice, for file processing with a large number of EDI transaction sets, file-based batch processing is used, and each transaction is parsed separately with the EDI X12 Read processor before further processing is done.

The screenshot below shows the expected output for X12-Write processor.

Picture10

After the EDI data is read using the X12 Read processor and the transaction processing is complete, the response data may have to be formatted back to EDI. To generate an EDI response, a Java object with the above schema is built (screenshot above). This "Expected" object is provided as input to the X12 Write processor, which transforms the data into EDI format that can then be used for the response.

In the next part of the article, we shall delve into the details of trading partner setup for exchange of EDI documents and customizing EDI validation rules. So, stay tuned!


Custom Library

To perform custom manipulations and logic in a process, and to accomplish unique and advanced requirements that fall outside the native functionality of the Boomi platform, custom scripting is written. Such integrations need custom files, third-party custom scripting libraries, or specific libraries for connectors.

Custom Library components are collections of Java Archive (JAR) files that you can use to support these needs in Boomi integration processes.

Creating and deploying Custom Library components enables you to manage JAR files through the Boomi Integration UI. You can deploy Custom Library components to any Atom, Molecule, Atom Cloud, or environment. 
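As an illustration of what such a JAR might contain, the sketch below shows a small, self-contained utility class; the package, class, and method names are hypothetical, not part of any Boomi API. Once the JAR is uploaded and the Custom Library component is deployed as described in the steps below, a custom scripting step or connector running on the Atom can load the class from its classpath.

package com.example.boomi.util; // hypothetical package bundled into the uploaded JAR

/**
 * Example helper that a Boomi custom scripting step could call
 * once the containing JAR is deployed via a Custom Library component.
 */
public class OrderNumberFormatter {

    /** Left-pads an order number with zeros to a fixed width, e.g. "123" -> "0000000123". */
    public static String pad(String orderNumber, int width) {
        StringBuilder sb = new StringBuilder(orderNumber == null ? "" : orderNumber.trim());
        while (sb.length() < width) {
            sb.insert(0, '0');
        }
        return sb.toString();
    }
}

Package the class into a JAR (for example with "jar cf") and then upload and deploy it following the steps that follow.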

To configure the custom library in Atom Management, follow the steps below:

Upload external libraries to the Boomi account:

Settings > Account details > Manage Account libraries > Upload a JAR file.

Create a custom library component in Boomi: 

Under the Custom Library type in Boomi, there are three component types: General, Scripting, and Connector.

Click Create New Custom Library component from the process and, in the configuration tab, define the Custom Library component name and folder, then click Create. Once the component is created, select the library type we want from the drop-down list; if we select the custom library type Connector, we have to provide the connector type. We can then add JAR files from the previously uploaded custom JAR files and click Save.

Picture2

Deployment of the custom library component: 

To deploy the custom library, create a packaged component with all the details.

Once all packaged components are successfully created, click the Attachments tab to attach the environment and version, and click the Deployment tab to deploy the component.

Restart Atom: 

Before you can use the library in your integration process, the Atom needs to be restarted: Manage > Atom Management > select an Atom > Atom Information tab > "Restart Atom".

 Removing files from a custom library: 

You can remove custom JAR files from a Custom Library component, but you should not do so if the component is currently deployed. 

Custom library component > Select Jar File > Delete > Save. 

 Migrating existing JAR files to use custom libraries: 

Existing JAR files that were placed in user library folders manually continue to work as they did before the introduction of Custom Library components. However, as a best practice, Boomi recommends that you migrate any manually deployed JAR files to custom libraries that can be managed through the Boomi Integration UI. 

To make existing JAR files known to Boomi Integration, follow the normal custom library deployment flow: 

 Upload JAR file > Create Custom library component > Deploy. 

 When the JAR files are deployed, Boomi Integration checks for existing files of the same name: 

  • If the file name and the contents of the file are the same, the file is not replaced. 
  • If the file name is the same but the contents are different, the new JAR file is deployed with a unique suffix to avoid a naming conflict. The existing file is marked for deletion. 

In either case, the JAR file is now recognized by the Boomi Integration and can be managed through the UI. 


Use of Message Shape During Boomi Process Testing 

After a Boomi integration process is developed, the Message shape can meet the requirement when changes need testing across multiple sub-process calls or when only a particular sub-component needs to be tested. There is no need to test the end-to-end flow when changes need to be tested and verified only for a particular process or sub-flow.

With the use of the Message shape, the following can be achieved:

  1. Required data can be passed to the next shape
  2. Modified data can be supplied on the fly
  3. Time otherwise spent retrieving test data from the source connector can be saved
  4. Ability to test multiple flows using different test data supplied to each Message shape in the flow

To test the process flow or changes in the Boomi platform, the content is added in the Message field of the Message shape, which is then connected to the next shape.

Below are different scenarios with different message types to test with the Message shape:

1. JSON Message: 

For testing with JSON messages, use single quotes. 

Ex:   

'{
  "FirstName" : "Russell",
  "LastName" : "Crowe"
}'

2. XML Message: 

Paste the original XML message (which may contain &, < and >) directly into the Message shape, keeping the double quotes (") in the top line of the XML. The Message shape will not take single quotes in XML.

Ex: 

<?xml version="1.0" encoding="UTF-8"?>
<root>
  <row>
    <userId>1</userId>
    <parentid>1</parentid>
    <title>delectus aut autem</title>
    <status>false</status>
  </row>
</root>

3. EDI or Flat file Message: 

Copy the actual EDI or flat file message without any modifications into the Message shape. If the EDI data contains single quotes, add an extra single quote to escape each one.

Ex: 

(a) X12 message (truncated) 

ISA*00*0000000000*00*0000000000*08*4532345856 *12*8038025150 *020624*1158*U*00401*000000009*0*P*>~ 

GS*PO*4532345856*8038025150*20020624*1158*2*X*004010~ 

(b)EDIFACT message (truncated) 

UNB+UNOA:1+US::US+50138::THEM+140531:0305+001934++ORDERS 

UNH+1+ORDERS:91:2:UN 

BGM+220+A761902+4:20140530:102+9''

4. Database Message: 

When testing with a DB message, the component ID (after DBSTART and DBEND) needs to be replaced with the new DB profile component ID if the profiles are copied during testing.

Eg: DBSTART|828ac9d7-8334-44e0-8739-4139cf86f2e5|2|@|BEGIN|2|@|OUT_START|3|@|1259|^|20201209 182925.764|^|Alex Jr|^|1111111|#||#|OUT_END|3|@|END|2|@|DBEND|828ac9d7-8334-44e0-8739-4139cf86f2e5|2|@| 

In the above example, the component ID 828ac9d7-8334-44e0-8739-4139cf86f2e5 (appearing after DBSTART and DBEND) needs to be replaced with the component ID of the new DB profile.