ESB Integration Design Patterns

Use Case 1: How to Create an ESB Service Consumed by Client Applications?

In this example, we will go through a scenario where an external system, such as an insurance underwriting system, calls an ESB integration service that takes a request, fetches the requested data by calling an API, and passes the results back to the underwriting system. To illustrate, the underwriter may request specific coverage information for an insurance policy, the insured’s previous policy history, or recent claims filed by the insured. By clicking the “request info” action in the underwriting system, the application calls the ESB service running in Adeptia, which retrieves the data from the underlying system and presents it to the underwriter.

The purpose of creating this type of web service is to prevent applications and clients from directly accessing the backend systems. Instead, all requests pass through a reusable ESB service that authenticates and validates each request, then calls the underlying systems and returns the requested data.

In the ESB orchestration diagram, we have the main steps defined in the flow. The first step is to get and validate the request from the client system. The request is then passed to the next step, which converts it into a format that the backend system can parse and respond to. The call to the backend system can be through an API, as depicted in the flow, or through a database connection or a specific application connector such as an SAP IDoc connector. Data sent back from the backend system is passed to the client as a web service response; it may also need to be formatted to match the response structure of the ESB web service.

To implement this example in Adeptia Connect, you would set up the following micro-services (a minimal code sketch follows the list):

  • Publish the orchestration as a web service provider
  • Create the data mapping service that would convert the incoming client request into the format that the backend system would accept, such as an API request format.
  • Call the API to connect with the backend system and retrieve the data
  • Pass the data as a web service response to the client
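
As a minimal illustration of these steps, the Java sketch below validates a client request, maps it onto a backend API call, and reshapes the API result into the service response. The backend URL, query parameters, and class names are illustrative assumptions, not Adeptia internals.

```java
// Sketch of the provider flow: validate -> map -> call backend API -> respond.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PolicyInfoService {

    private static final HttpClient HTTP = HttpClient.newHttpClient();
    // Hypothetical backend endpoint; in Adeptia this would be an API connection activity.
    private static final String BACKEND_URL = "https://backend.example.com/policies/";

    /** Entry point: mirrors the validate -> map -> call -> respond steps. */
    public String handleRequest(String policyId, String requestType) throws Exception {
        // Step 1: validate the incoming client request before touching the backend.
        if (policyId == null || policyId.isBlank()) {
            throw new IllegalArgumentException("policyId is required");
        }
        // Step 2: map the client request onto the backend API's request format.
        URI uri = URI.create(BACKEND_URL + policyId + "?view=" + requestType);
        HttpRequest apiRequest = HttpRequest.newBuilder(uri)
                .header("Accept", "application/json")
                .GET()
                .build();
        // Step 3: call the backend API on the client's behalf.
        HttpResponse<String> apiResponse =
                HTTP.send(apiRequest, HttpResponse.BodyHandlers.ofString());
        if (apiResponse.statusCode() != 200) {
            throw new IllegalStateException("Backend returned " + apiResponse.statusCode());
        }
        // Step 4: reshape the backend payload into the ESB service's response structure.
        return "{\"status\":\"OK\",\"data\":" + apiResponse.body() + "}";
    }
}
```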

Use Case 2: How to Integrate Synchronous ESB Service with API, Database and MQ?

Suppose there is a scenario where a supplier wants to pull all the purchase order data from its various backend systems and consolidate this information into a report. The purchase order data may reside in multiple data repositories such as order management systems, databases, and message queues.

In Adeptia, you can build a single service that connects to the order management system through an API, extracts records from a database and pulls data from a message queue. Once the data is pulled from all these systems, Adeptia merges the data into a single data stream. The output is converted into a custom report that is sent out to the requestor.
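
A minimal Java sketch of this “many to one” collection step is shown below; it pulls records over HTTP, JDBC, and JMS and merges them into one list. The connection URLs, queue and table names are assumptions, and ActiveMQ stands in for whichever JMS broker the environment actually uses.

```java
// Pull purchase orders from an API, a database, and a message queue, then
// merge them into a single data stream for the report step.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.activemq.ActiveMQConnectionFactory;

public class PurchaseOrderAggregator {

    public List<String> collectOrders() throws Exception {
        List<String> merged = new ArrayList<>();
        merged.addAll(fromApi());
        merged.addAll(fromDatabase());
        merged.addAll(fromQueue());
        return merged; // single data stream handed to the report step
    }

    private List<String> fromApi() throws Exception {
        // Order management system exposed over HTTP (URL is illustrative).
        HttpResponse<String> r = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("https://oms.example.com/orders")).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        return List.of(r.body());
    }

    private List<String> fromDatabase() throws Exception {
        List<String> rows = new ArrayList<>();
        try (Connection c = DriverManager.getConnection("jdbc:postgresql://db/orders", "user", "pw");
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery("SELECT po_number, amount FROM purchase_orders")) {
            while (rs.next()) {
                rows.add(rs.getString("po_number") + "," + rs.getString("amount"));
            }
        }
        return rows;
    }

    private List<String> fromQueue() throws Exception {
        List<String> messages = new ArrayList<>();
        var factory = new ActiveMQConnectionFactory("tcp://mq-host:61616");
        javax.jms.Connection jmsConn = factory.createConnection();
        try {
            jmsConn.start();
            Session session = jmsConn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(session.createQueue("po.queue"));
            TextMessage msg;
            // Drain waiting messages; stop once the queue is quiet for one second.
            while ((msg = (TextMessage) consumer.receive(1000)) != null) {
                messages.add(msg.getText());
            }
        } finally {
            jmsConn.close();
        }
        return messages;
    }
}
```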

The advantage of this ESB integration design pattern is that it supports “many to one” application integration in a single orchestration: taking data from many systems and combining the multiple data streams into a single output that is then consumed by the client application. The design can be further extended by adding a gateway (decision node) to the flow, where, based on the type of incoming request, the process routes the request to the appropriate data source.

The individual micro-services configured to extract the purchase order data can also be a “call” to a sub-process that is executed at runtime and its response is sent back to the parent flow.

Adeptia embeds various error-handling capabilities within the ESB integration design framework; some of the important ones are listed here:

  • Retries in case of failure in connecting to an application (configurable; see the retry sketch after this list)
  • Validation of requests and routing of errors to email notifications, to another activity in the flow as an error data stream, and/or to human workflow tasks
  • Passing erroneous records into the process flow logs
  • Flagging errors in the process flow dashboard with error descriptions
  • Validation of data against schema definitions and routing of parsing errors into the process flow logs
  • Handling of API error responses and logging of those errors in the process flow logs
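
The first bullet can be illustrated with a small, generic retry wrapper; the attempt count and delay are illustrative configuration values, not Adeptia defaults, and callBackendApi is a hypothetical stand-in for any connection step.

```java
// Retry a connection attempt a configurable number of times with a delay
// between attempts, then surface the failure to the flow's error handling.
import java.util.concurrent.Callable;

public final class Retry {

    public static <T> T withRetries(Callable<T> action, int maxAttempts, long delayMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e; // a real flow would also write this attempt to the process logs
                if (attempt < maxAttempts) {
                    Thread.sleep(delayMillis);
                }
            }
        }
        // All attempts exhausted: hand off to error routing (email, workflow task, ...).
        throw last;
    }
}
```

A call such as Retry.withRetries(() -> callBackendApi(request), 3, 2000) would attempt the connection three times, two seconds apart, before routing the failure onward.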

Use Case 3: How to Trigger a Message Broker Service from a Message Queue?

In this example, we will discuss a multi-step message brokering service that is invoked by an MQ; the resulting service orchestration connects to multiple systems, combines the data into a single output data stream, and posts the result back into the message queue. The process also has a human workflow event to handle errors, and it generates a report that is sent out to the process owners.

In this example, we are using a real-time message queue event that triggers a service in Adeptia. In Adeptia Connect, users can configure a JMS trigger that listens for new messages posted to a particular queue. Once a message is posted, the JMS trigger automatically invokes the associated service published in Adeptia. The service orchestration takes the request, parses it, and routes it to multiple applications; in this case, we are routing the request to the claims and policy management applications.
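
Conceptually, the configured JMS trigger behaves like the listener sketched below, which waits for messages on a queue and hands each one to the orchestration. In Adeptia Connect this is configuration rather than code; the broker URL, queue name, and routeRequest placeholder are assumptions.

```java
// Listen on a queue and fire the orchestration for each new message.
import javax.jms.Connection;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.activemq.ActiveMQConnectionFactory;

public class JmsTrigger {

    public static void main(String[] args) throws Exception {
        var factory = new ActiveMQConnectionFactory("tcp://mq-host:61616");
        Connection connection = factory.createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(session.createQueue("broker.requests"));

        // Each arriving message invokes the published service, like the configured trigger.
        consumer.setMessageListener((Message message) -> {
            try {
                String payload = ((TextMessage) message).getText();
                routeRequest(payload); // parse and route to the claims and policy systems
            } catch (Exception e) {
                e.printStackTrace(); // a real flow routes this to error handling
            }
        });
        Thread.currentThread().join(); // keep the listener alive
    }

    private static void routeRequest(String payload) {
        // Placeholder for the parse-and-route step of the orchestration.
        System.out.println("Routing request: " + payload);
    }
}
```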

As part of this orchestration design, we have also shown that you can include a decision node that checks the validity of the application response; if the resulting output is correct, the data is posted back to the message queue. If there are validation errors, such as an invalid policy expiration date or a policy whose coverage does not cover the reported claim, the output is routed to a workflow task for review and approval. The final step in the error-handling sub-process generates a report that is sent to the business team for review.

The advantage of this type of integration design pattern is that you can persist a message in the Adeptia process flow while the flow routes the message to multiple applications. It also includes the human workflow tasks needed for review and approval of the data.

Types of micro-services used in application integration are:

  • JMS trigger that kicks off the service
  • Connections to multiple source applications such as claims and policy management systems
  • Decision node that checks the validity of the data extracted from the source applications
  • Routing the result back to the message queue
  • Human workflow task for data review and approval

Use Case 4: How to Ensure Message Persistence in an Asynchronous ESB Service?

In cases where the ESB service is asynchronous and needs to persist the message at runtime, Adeptia provides different functions for message persistence. As part of its long-running process support, users can embed events that wait for a certain action to occur before proceeding to the next step in the process flow. Adeptia’s ESB integration services with embedded event activities support long-running transactions and can wait for events to occur before completing the execution of that service.

There are several examples of this scenario. In case management, a new support ticket created in ServiceNow triggers a service in Adeptia. This service takes the new support ticket from ServiceNow and persists it in the process flow at runtime until the ticket is resolved and closed. The process waits for the ticket to be assigned, worked on, and completed by the support rep; when all these events are complete, it proceeds to update ServiceNow. Another example is an order fulfillment process where the status of the order is not changed and an invoice is not sent until an advance shipment notice (ASN) is sent out to the customer.

To implement this process flow, we can use the following micro-services:

  • Publish the service as a web service provider and configure it as a webhook in the source application.
  • An event activity that polls for a certain action to finish, such as the sending of an ASN to the customer. This activity can be a JMS, File, or Database event.
  • The third activity is the generation of an invoice that is sent out to the customer. This can be pulling an invoice IDoc from SAP and converting it to an EDI 810.
  • The last step is changing the status of the purchase order in a database. This can be a database update activity that changes the status of a specific PO.
  • To further extend this process, consider adding an exit condition on the event in case the activity takes longer than the threshold time acceptable to the business. In that case, another activity can send out a notification with a token associated with the transaction; when the activity eventually completes, the process flow can use the token to finish the rest of the transaction (a minimal sketch follows this list).
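
The wait-with-exit-condition idea can be sketched as a polling loop over the order database; the table, column, polling interval, and threshold below are illustrative assumptions rather than Adeptia's event implementation.

```java
// Poll for the ASN-sent flag until it appears or the business threshold
// expires; on timeout, emit a resume token for completing the transaction later.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.time.Duration;
import java.time.Instant;
import java.util.UUID;

public class AsnWaitEvent {

    public boolean waitForAsn(String orderId, Duration threshold) throws Exception {
        Instant deadline = Instant.now().plus(threshold);
        while (Instant.now().isBefore(deadline)) {
            if (asnSent(orderId)) {
                return true; // event satisfied: proceed to invoice generation
            }
            Thread.sleep(30_000); // poll every 30 seconds
        }
        // Exit condition: threshold exceeded. Notify with a correlation token that
        // lets the flow resume this transaction when the ASN finally goes out.
        String token = UUID.randomUUID().toString();
        System.out.println("ASN overdue for order " + orderId + "; resume token " + token);
        return false;
    }

    private boolean asnSent(String orderId) throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:postgresql://db/orders", "user", "pw");
             PreparedStatement ps = c.prepareStatement(
                     "SELECT asn_sent FROM purchase_orders WHERE order_id = ?")) {
            ps.setString(1, orderId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() && rs.getBoolean("asn_sent");
            }
        }
    }
}
```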

Use Case 5: How to Design a Reusable, Dynamic ESB Service to Convert SAP IDocs into EDI?

In Adeptia, an ESB service’s behavior can be governed dynamically based on the type of data received from the source application. Whether the data is XML, EDI, CSV, Excel, semi-structured, fixed-width, RosettaNet, JSON, or any other format, the service can look into its mapping repository at runtime to find the matching data conversion mapping needed to process the data.

In this ESB integration design example, suppose Adeptia’s ESB service listens for an incoming SAP IDoc and needs to convert the data into an outbound EDI document. The incoming data can relate to purchase orders, invoices, ASNs, payments, and financials that need to be converted into EDI documents for customers. In this example, a user can design a single “template” process that automatically looks up the EDI data mapping associated with each IDoc conversion.

The advantage of this design is that the user doesn’t have to create individual flows to handle different IDocs; a single flow can support all IDoc conversions. Extending this model further, we can use a more generic source activity that is not tied to a specific source application: it can be SAP in one case, or Oracle EBS or JD Edwards in another, sending out the source data, and based on the data type, Adeptia can determine how to process the message. Dynamic binding is one of the key methods used in Adeptia’s ESB development and is generally recommended as a best practice in solution design.
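
A minimal sketch of this dynamic binding follows: the service inspects the inbound message type, looks up the matching conversion in a repository, and raises a notification when nothing matches. The repository map, IDoc type names, and placeholder conversions are hypothetical stand-ins for Adeptia's service library.

```java
// Inspect the inbound message type, look up the matching conversion service,
// and fall back to a notification when no mapping is registered.
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

public class DynamicMappingRouter {

    // IDoc message type -> data conversion service (placeholder implementations).
    private final Map<String, Function<String, String>> repository = Map.of(
            "ORDERS05", idoc -> "EDI 850 for " + idoc,
            "INVOIC02", idoc -> "EDI 810 for " + idoc,
            "DESADV01", idoc -> "EDI 856 for " + idoc);

    public String process(String messageType, String idocPayload) {
        Optional<Function<String, String>> mapping =
                Optional.ofNullable(repository.get(messageType));
        if (mapping.isEmpty()) {
            // Gateway path: no matching conversion service, notify and stop.
            notifyNoMapping(messageType);
            throw new IllegalStateException("No mapping for IDoc type " + messageType);
        }
        // Bind the matched service into the placeholder mapping step and run it.
        return mapping.get().apply(idocPayload);
    }

    private void notifyNoMapping(String messageType) {
        System.out.println("ALERT: no conversion service registered for " + messageType);
    }
}
```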

Some of the key elements in designing a dynamic ESB service are:

  • Introspection of the source data to determine the type of request or message from the source application.
  • A lookup into Adeptia’s service library to find the associated data conversion service for the incoming data.
  • A gateway or decision node to handle situations where no matching data conversion service is found in the service library/repository.
  • Notifications sent when no matching service is found.
  • Binding of the matching service into the placeholder mapping activity in the flow.
  • Sending the converted EDI document to the customer.

Use Case 6: How to Carry Out Error Handling in a Mediation Flow?

In this example, a verification request for a new enrollee’s profile is initiated by the customer onboarding system, which sends an inquiry to an ESB service to verify the enrollment data; the service sends its verification results back to the client application.

For error handling, we have added an error event on the first step of the mediation flow that checks the validity of the request. If the inquiry contains data that is incomplete or has discrepancies, the mediation flow is routed to an error-handling sub-process. The normal flow from the request module through the parsing, lookup, and conversion steps is not executed at runtime when the incoming request contains errors.

Error-handling events can be configured in any part of the mediation flow. When connecting to an API results in connectivity errors, those errors can be routed to another step in the flow that notifies users about the API connection failure after several retries.

Having a common exception-handling sub-process is typically recommended in an ESB design pattern when there is commonality in handling request errors across multiple services. The sub-process can have contextual error mediation rules that are applied based on the type of error. Runtime errors can occur at both the data and the process level, and mediation rules for handling data errors can differ from those for process execution errors around application connectivity and integration.
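
The data-versus-process distinction can be sketched as a simple classifier at the entry of the common sub-process; the exception types and handler methods below are illustrative choices, not Adeptia's error model.

```java
// Classify a runtime error as a data error or a process/connectivity error
// and apply a different mediation rule to each.
import java.io.IOException;

public class ErrorMediator {

    public void mediate(Exception error, String requestId) {
        if (error instanceof IllegalArgumentException) {
            // Data-level error: incomplete or inconsistent request content.
            routeToDataErrorHandler(requestId, error.getMessage());
        } else if (error instanceof IOException) {
            // Process-level error: application connectivity / integration failure.
            notifyConnectionFailure(requestId, error.getMessage());
        } else {
            // Unclassified: log and escalate to the process owners.
            System.err.println("Unhandled error for " + requestId + ": " + error);
        }
    }

    private void routeToDataErrorHandler(String requestId, String detail) {
        System.out.println("Data error sub-process for " + requestId + ": " + detail);
    }

    private void notifyConnectionFailure(String requestId, String detail) {
        System.out.println("Connectivity alert for " + requestId + ": " + detail);
    }
}
```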

Mediation modules consist of:

  • Routing the request in the process flow
  • Looking up the enrollee’s profile in the master database
  • Converting the lookup results into an output format
  • Logging all the messages in the monitoring dashboard
  • Sending the results back to the client application
  • Error-handling rules that route exceptions to a sub-process
  • Notification of errors to the process owners

Use Case 7: How to Integrate Data with Heterogeneous Systems via ESB Mediation?

In a heterogeneous environment where multiple teams are responsible for maintaining different systems and data repositories, it becomes important to separate the technical details of these systems and create an abstraction layer (ESB) that allows these systems to interact with each other through an ESB service rather than through convoluted point-to-point integrations.

In this example, we will go through the steps of taking a purchase order request from a client application and routing it to multiple systems in a single data mediation flow. The example also illustrates protocol transformation, as these applications require use of different protocols such as HTTPS, AMQP, JDBC, SMTP and others.

The first step in the data mediation flow is the arrival of a purchase order from a procurement system. The request is passed via HTTPS, as the ESB service is published as an asynchronous REST service. As the data arrives, it is parsed, validated, and sent to a repeater service.

The repeater service clones the request into two data streams. One data stream is routed to a data transformation service that converts the data and loads it into a product order database using the JDBC protocol. The second data stream is persisted into a message queue using AMQP in order to invoke other services that require this data, such as creating the shipment notice and the customer invoice. Finally, email notifications are sent out over SMTP by the end event to all the process owners.
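
The repeater step might look like the sketch below, which writes one clone of the order to the database over JDBC and publishes the other to an AMQP queue. The connection settings, table, and queue name are assumptions, and RabbitMQ's Java client stands in for the AMQP broker.

```java
// Clone the validated purchase order into two streams: a JDBC insert and an
// AMQP publish for downstream shipment/invoice services.
import java.nio.charset.StandardCharsets;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.ConnectionFactory;

public class OrderRepeater {

    public void repeat(String orderId, String orderPayload) throws Exception {
        writeToDatabase(orderId, orderPayload);   // stream 1: JDBC
        publishToQueue(orderPayload);             // stream 2: AMQP
        // An SMTP notification to process owners would follow in the end event.
    }

    private void writeToDatabase(String orderId, String payload) throws Exception {
        try (var conn = DriverManager.getConnection("jdbc:postgresql://db/orders", "user", "pw");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO product_orders (order_id, payload) VALUES (?, ?)")) {
            ps.setString(1, orderId);
            ps.setString(2, payload);
            ps.executeUpdate();
        }
    }

    private void publishToQueue(String payload) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("mq-host");
        try (var conn = factory.newConnection();
             Channel channel = conn.createChannel()) {
            channel.queueDeclare("order.events", true, false, false, null);
            channel.basicPublish("", "order.events", null,
                    payload.getBytes(StandardCharsets.UTF_8));
        }
    }
}
```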

Key modules of a data mediation flow that integrates with multiple systems are:

  • Protocol Transformation: connecting to multiple systems using different protocols in a single flow. Protocols include AMQP, HTTPS, JDBC, SMTP, POP/IMAP, UDP, TCP, MLLP, SFTP, AS2, RosettaNet, etc.
  • Service Mapping: converting data into application-specific format in order to achieve data integration. Since no common data model exists among these systems, data conversion is part of the mediation flow.
  • Application adapters: connectivity to ERP, CRM, legacy and cloud applications

Use Case 8: How to Design an ESB Service That Connects to Multiple Systems Based on Request Type?

In this example, the request type determines the specific application that needs to be called, and the lookup result is routed back to the requestor as a response. The source data for the request can be a file, database, message queue, or an ERP system. The gateway contains the routing rules that direct the flow to the appropriate source application for the lookup (a minimal routing sketch follows).
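
Conceptually, the gateway's routing rules reduce to a dispatch on the request type, as in the sketch below; the request types and lookup placeholders are illustrative, not a fixed Adeptia vocabulary.

```java
// Pick the source application to call based on the request type.
public class LookupGateway {

    public String route(String requestType, String key) {
        switch (requestType) {
            case "POLICY":
                return lookupPolicySystem(key);     // e.g., policy admin API call
            case "CLAIM":
                return lookupClaimsDatabase(key);   // e.g., JDBC lookup
            case "ORDER":
                return lookupOrderQueue(key);       // e.g., message queue browse
            default:
                throw new IllegalArgumentException("Unknown request type: " + requestType);
        }
    }

    private String lookupPolicySystem(String key) { return "policy:" + key; }
    private String lookupClaimsDatabase(String key) { return "claim:" + key; }
    private String lookupOrderQueue(String key) { return "order:" + key; }
}
```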

Calls to the various applications can also be sub-processes that contain the protocol transformations for connecting to the appropriate application, the data mediation rules for converting lookup results into a canonical format, and the step that sends the converted output to the global flow.

In this process flow, we have added a source selector function after the application call and this function selects the particular output stream sent by the application and then passes the stream as a response to the client. The source selector consumes only one stream. This process can also be changed or further extended where multiple source applications are called and multiple streams are merged into a single normalized output that is sent out as a response. This type of multi-system application integration ESB design depends on the type of process or data mediation needed for the request.

You can easily plug in additional application connections as sub-processes, which also helps modularize the ESB flow into individual flows that are easy to manage. Using the sub-process approach to encapsulate application connections, one doesn’t have to modify the parent flow if a data mediation rule needs to be changed for a particular application. Another advantage of using sub-processes is that these flows can be consumed by other global ESB services or orchestrations that require similar data lookups based on the type of request received from the client applications.

Key components for designing a multi-app ESB mediation are:

  • Calling sub-processes that call individual applications based on request type
  • Data translation that converts the lookup results
  • Merging multiple application results into a normalized output

Use Case 9: How to Incorporate Data Mediation in ESB That Converts Data into Multiple Formats?

Adeptia’s ESB service can be used to convert data flowing onto or off the bus, depending on the types of message formats supported by the applications that are integrated into the data flow.

In this example, we are using a data conversion ESB service that takes the payload from the incoming request and converts the data into two outputs. One output is a database format used to load records into tables; the other is an XML format used to load data into a cloud ERP system.
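
A minimal sketch of this one-in, two-out conversion is below: a CSV-style record is turned into both a database insert and an XML document. The field order, table, and element names are assumptions about the payload.

```java
// Convert one incoming record into two outputs: a JDBC insert and an XML
// document for the cloud ERP.
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class DualFormatConverter {

    /** Output 1: load the record into a database table over JDBC. */
    public void toDatabase(String csvLine) throws Exception {
        String[] f = csvLine.split(",");
        try (var conn = DriverManager.getConnection("jdbc:postgresql://db/erp", "user", "pw");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO orders (order_id, customer, amount) VALUES (?, ?, ?)")) {
            ps.setString(1, f[0]);
            ps.setString(2, f[1]);
            ps.setBigDecimal(3, new java.math.BigDecimal(f[2]));
            ps.executeUpdate();
        }
    }

    /** Output 2: render the same record as XML for the cloud ERP. */
    public String toXml(String csvLine) {
        String[] f = csvLine.split(",");
        return "<order><id>" + f[0] + "</id><customer>" + f[1]
                + "</customer><amount>" + f[2] + "</amount></order>";
    }
}
```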

In Adeptia’s ESB integration solution, data conversions range from a simple CSV-to-XML conversion to conversions specific to the message, application, or API that needs the data in a particular format. Adeptia supports all structured and semi-structured data. Data standards such as EDI, ACORD, BAI, ACH, MISMO, HIPAA, HL7, NCPDP, ODETTE, OAGIS, RosettaNet, and FIX are supported out of the box, with data dictionaries for the major standards pre-bundled in the product.

The complexity of a data map varies with the hierarchical data structure that the source or target schema represents and with the complexity of the conversion rules that the target application requires for successful data integration. The mapping can also involve multiple sources and targets, where data from two or more sources needs to be merged or joined prior to mapping the result to the target.

The components used in building a data conversion ESB service are:

  • Data Mapper: loads the schemas of multiple sources or target applications and converts the data into multiple output formats
  • Data Parsing: reads the incoming data and verifies the validity of the data structure and routes any errors to the logs or error handling process
  • Use of data mapping functions: conversion requires functions such as data quality rules and filter and sorting conditions that are applied to the incoming data. Functions for manipulating data, such as string operations, if/else, switch-case, key-value pairs, constants, and date conversions, are some of the important functions used in data conversion.

Use Case 10: How to Convert an ESB Service Into a REST API?

There are many ways an ESB service can be triggered, such as via events like SFTP or message queues, but a service can also be triggered in real time through an API call by a client application. For a client application to invoke an ESB service, one has to publish the ESB service as a REST or SOAP service and define the request/response structures (i.e., JSON or WSDL) needed by the source application to call the service.

In this example, we have published a card payment service for an e-commerce website, which calls this service to execute a card payment transaction. Adeptia Connect allows you to publish an ESB service by creating a web service provider activity. You can select the API type (REST or SOAP) and provide the related parameters and methods that source applications use to invoke the service.

The advantage of publishing an ESB service as an API consumable by clients is reusability across multiple use cases: for example, both an order fulfillment system and a shipment service system can call the same service and get the information needed by a customer rep or by an ERP system.

An important part of application integration is support for web services. In Adeptia, process orchestrations can be published as either REST or SOAP web services. The ESB architecture supports the use of different versions of the same API and the creation of different resource end-paths.

Developers can create multiple operations for the API, such as “requestCardPayment”, “getPaymentResult”, “updatePayment”, and “createSalesOrder”, and each of these operations can have different parameters that are passed in the body or the header of the request. Adeptia also supports the use of API security keys along with SSL encryption for secure connectivity when integrating with backend or cloud applications.
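
To make the shape of such an API concrete, here is a sketch of two of these operations as JAX-RS endpoints with a versioned resource path. In Adeptia the operations are defined on the web service provider activity rather than hand-coded; the paths, parameter names, and response bodies are illustrative.

```java
// Two of the card payment operations exposed as REST endpoints.
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/v1/payments")
public class CardPaymentApi {

    /** requestCardPayment: the body carries the card transaction details. */
    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public String requestCardPayment(String transactionJson) {
        // Hand the payload to the underlying ESB orchestration.
        return "{\"status\":\"ACCEPTED\"}";
    }

    /** getPaymentResult: the transaction id is passed on the path. */
    @GET
    @Path("/{transactionId}")
    @Produces(MediaType.APPLICATION_JSON)
    public String getPaymentResult(@PathParam("transactionId") String transactionId) {
        return "{\"transactionId\":\"" + transactionId + "\",\"status\":\"SETTLED\"}";
    }
}
```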

To learn more about Adeptia’s application integration capabilities, check out the following resources: