The other day, the Chief Executive Officer of a manufacturing company asked me, “I have over one thousand stove-piped on-premise applications. How can I migrate them to the cloud with minimum friction?” It is a common question looming in the minds of business leaders envisioning a cloud-first enterprise. I replied, “The answer depends on how you deal with the fair share of challenges that come with rehosting and rearchitecting, data migration, line-of-business (LoB) needs, security, partner data exchange, and so on.” I added, “Transitioning systems to the cloud can be simplified and accelerated if you manage the sum total of these challenges together.”
Climbing to the top and setting up a future-ready IT organization ultimately depends on the ability to understand legacy system integration challenges, which I am going to cover in this blog:
Stove-piped Applications: In the good old days, applications were developed as stovepipes that offered very limited modes of integration. Adding trading partners or onboarding customers required complex logic dedicated to the specific applications involved. There are drawbacks attached to this approach:
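The combinatorics behind this drawback are worth spelling out: with point-to-point integration, each pair of applications can end up needing its own dedicated link, so the number of connections grows quadratically with the number of systems. A quick Python sketch (the system counts are illustrative only):

```python
# With point-to-point integration, each new system may need a dedicated
# link to every existing one: n * (n - 1) / 2 connections in the worst case.
def point_to_point_links(n_systems: int) -> int:
    return n_systems * (n_systems - 1) // 2

for n in (10, 100, 1000):
    print(f"{n} systems -> up to {point_to_point_links(n)} dedicated links")
# 1,000 stove-piped applications can imply up to 499,500 point-to-point links.
```

This is why a shared integration layer, rather than per-pair custom logic, is the usual remedy.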
Lengthy Data Loads and Latency Issues: There are performance differences between cloud and on-premise applications. For example, an on-premise ERP system can often be fed with direct data loads into its database, but cloud platforms do not provision direct loads. Bringing data into a cloud platform such as Salesforce requires the data to pass through a dense network of API layers, and heavy Java coding is required for custom validations, triggers, workflows, duplicate-detection rules, and the like.
Apart from this, the fundamental technical differences between Salesforce.com and SAP, or between Amazon EC2 and Oracle, demand entirely different integration approaches. Using a separate point solution for each of these endpoints can do more harm than good. All of this can cause inefficiencies, performance degradation, and outright malfunctions.
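To make the API-layer point concrete, here is a minimal Python sketch of what a single-record insert through the Salesforce REST API looks like. The instance URL, token, and API version are placeholders (real values come from your org's OAuth flow), and the request is only built, not sent; the point is that every record takes a per-record API round trip, in contrast with a direct database load:

```python
import json
import urllib.request

def build_salesforce_insert(instance_url: str, token: str,
                            sobject: str, record: dict) -> urllib.request.Request:
    """Build (but do not send) a REST request inserting one record.

    Each record must travel through this API layer one call at a time;
    there is no direct bulk table load as with an on-premise database.
    """
    url = f"{instance_url}/services/data/v57.0/sobjects/{sobject}"
    headers = {
        "Authorization": f"Bearer {token}",   # placeholder session token
        "Content-Type": "application/json",
    }
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# Hypothetical example: one Account record per round trip.
req = build_salesforce_insert(
    "https://example.my.salesforce.com",  # placeholder instance URL
    "PLACEHOLDER_TOKEN",
    "Account",
    {"Name": "Acme Manufacturing"},
)
print(req.get_method(), req.full_url)
```

Custom validations, triggers, and duplicate rules then run on the platform side for every such call, which is where the latency accumulates.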
Rattling Downtime Issues: Downtime is hard to beat, and it is the major barrier to systems and data consolidation. Heterogeneous systems cause disruptions when they are integrated. Legacy systems support only unidirectional interfaces, which do not scale to support hosting or rehosting of services or data schemas. Getting data feeds out of such systems becomes an uphill task for developers.
Redundant Batch Data Integration: The batch-based Extract, Transform, and Load (ETL) approach has become obsolete, as its archaic rows-and-columns model does not support cloud-based initiatives. Moreover, it does not accommodate storage of structured, semi-structured, and unstructured data alike.
Cloud-based applications use Extensible Markup Language (XML), which is complex and notoriously incompatible with poly-structured data. XML needs to be converted into a simpler, comprehensible format such as Comma-Separated Values (CSV) for ease of data sharing, and with the traditional ETL approach it is hard to convert XML to CSV or vice versa. Apart from this, modern approaches require Physical-to-Virtual (P2V) conversion, data replication, and more.
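For flat, predictable XML, the conversion itself is straightforward; the trouble begins when feeds are nested or poly-structured. A minimal Python sketch using the standard library (the order feed and its element names are illustrative only):

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical order feed -- element names are illustrative only.
XML_FEED = """
<orders>
  <order><id>1001</id><customer>Acme</customer><total>250.00</total></order>
  <order><id>1002</id><customer>Globex</customer><total>99.50</total></order>
</orders>
"""

def xml_to_csv(xml_text: str, row_tag: str, fields: list) -> str:
    """Flatten repeating XML elements into CSV rows.

    This works only when every row element has the same flat fields;
    deeply nested or poly-structured XML needs real mapping logic,
    which is exactly where traditional ETL tools struggle.
    """
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    for elem in root.iter(row_tag):
        writer.writerow({f: elem.findtext(f, default="") for f in fields})
    return out.getvalue()

print(xml_to_csv(XML_FEED, "order", ["id", "customer", "total"]))
```

Once the schema varies per record, this one-liner mapping breaks down, and that gap is what modern integration platforms are built to close.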
Data Security: Data protection is now firmly on the agenda of organizations and world governments. More stringent data protection laws such as the General Data Protection Regulation (GDPR) are anticipated in the near future, and they will impose heavy liabilities on firms for data breaches. At the same time, data is exposed to several threat actors when legacy systems are integrated with cloud-based systems or deployed in the cloud. The problem grows when multiple cloud providers such as Microsoft and Amazon are used. During high-volume data externalization, it becomes difficult to safeguard enterprise data without a unified security platform and identity management controls.
None of these problems is insurmountable, and there are ways to avoid painful tradeoffs. A robust integration model can help organizations get more mileage out of their cloud migration. In Part II, I will cover best practices to combat these challenges.