Enterprise applications belong to a vibrant ecosystem, and consequently the data they generate is large and varied. Enterprises both benefit and suffer from this nature of applications and data. Whenever a new application that integrates with the applications in the ecosystem is to be deployed, the precondition is an 'expansive data definition with referential value' on day one before integration can start. Traditionally, data integration involves identifying a target data structure and force-fitting data from all sources into it. This is done to ensure 'seamless' integration - never mind the loss of data considered irrelevant.
Data Warehouse Approach
Traditional data warehouses use the following approach for data ingestion, sketched in code after the list:
- Fixed target structure in which to ingest data
- Source data is transformed to fit into the target structure
- Any 'alien' data is just ignored and dropped
- Unstructured data is sparingly allowed
- Reporting and analytics are run on top of this target structure
- The structure is reviewed periodically for changes in definitions
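To make the force-fit concrete, here is a minimal sketch in Python of fixed-structure ingestion. The schema and field names are illustrative assumptions, not a real warehouse definition; the point is that any field outside the target structure is silently lost.

```python
# Hypothetical fixed target structure; field names are assumptions for
# illustration only.
TARGET_SCHEMA = {"customer_id", "txn_amount", "txn_date"}

def ingest_fixed(record: dict) -> dict:
    """Force-fit a source record into the fixed target structure.

    Any 'alien' field not in the schema is ignored and dropped.
    """
    return {k: v for k, v in record.items() if k in TARGET_SCHEMA}

row = ingest_fixed({"customer_id": "C42", "txn_amount": 99.5, "channel": "mobile"})
print(row)  # {'customer_id': 'C42', 'txn_amount': 99.5} -- 'channel' is lost
```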
So how do we absorb data without defining it up front, so that we can discover the structure and value of the data on an ongoing basis?
Benefits of Stream-based Data Integration
In the real world, data gets collected or elicited through events, whether enterprise-initiated or customer-initiated. Data collection from transactional systems into analytical systems can follow the same pattern if event data is allowed to have a variable structure. This works best when data flows as streams from multiple sources. This approach is called event-based data modelling using data streams; a short sketch in code follows the list below.
- The unit of integration is a data packet that contains a series of connected name-value pairs.
- Data packets flow into the data ingestion environment and onto one or more target streams. A packet is considered for processing on a stream if the minimum variables required for that stream are present in the packet.
- Each data packet can provide foundational information for multiple different events.
- Data is stored as events; data for the same event can be provided incrementally.
- The variables in a packet that are unknown to the data ingestion environment are not ignored or dropped. Instead, they are retained on all events that are identified from the data packet.
- Since events are real world concepts, they form an excellent foundation for analytical models targeted towards behavioral outcomes.
- Data ingestion can start very quickly with conformance to a minimum set of variables required per stream.
- Adjunct variables can be discovered after ingestion and used for analytical value.
- Works very well with the real-time paradigm in which today's businesses compete.
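As a minimal sketch of the points above, the following Python fragment routes a packet onto every stream whose minimum variables it satisfies, retaining unknown variables rather than dropping them. Stream names, required-variable sets and field names are assumptions for illustration; this is not the CRUX API.

```python
# Hypothetical minimum-variable sets per stream; names are illustrative.
STREAM_MIN_VARS = {
    "transactions": {"customer_id", "txn_amount"},
    "interactions": {"customer_id", "channel"},
}

def route_packet(packet: dict) -> dict:
    """Place a packet on every stream whose minimum variables it satisfies.

    The full packet is retained on each matching stream, so variables that
    are unknown to the ingestion environment are kept, not dropped.
    """
    return {
        stream: dict(packet)
        for stream, required in STREAM_MIN_VARS.items()
        if required <= packet.keys()
    }

packet = {"customer_id": "C42", "txn_amount": 99.5,
          "channel": "mobile", "geo_tag": "18.52,73.85"}  # geo_tag is 'alien'
print(route_packet(packet))
# One packet yields events on both streams; 'geo_tag' rides along on each.
```

Note that the single packet above produces events on two streams, matching the point that one data packet can provide foundational information for multiple different events.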
In consumer-based enterprise businesses where relationships are long term and are influenced by experience, the following streams are essential:
- Customer
- Relationship
- Transactions
- Interactions
These streams abstract the natural structures of the processes that govern the operations of the business. And to top it all, this approach can be kick-started quickly and improved upon on an ongoing basis.
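Kick-starting can be as simple as declaring the minimum variables for each of the four streams and refining them over time. The variable names below are assumptions for illustration, not a prescribed schema.

```python
# Illustrative minimum-variable definitions for the four essential streams.
# Variable names are assumptions; each set can be refined on an ongoing basis.
ESSENTIAL_STREAMS = {
    "customer":     {"customer_id"},
    "relationship": {"customer_id", "relationship_id"},
    "transactions": {"customer_id", "txn_amount", "txn_date"},
    "interactions": {"customer_id", "channel", "timestamp"},
}
```

Ingestion can begin as soon as packets conform to these small sets; richer, adjunct variables are discovered after ingestion.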
Advantages of Stream-based Integration
The following are some of the technical advantages of stream-based integration:
- Cloud technologies: applications hosted on the cloud, such as Salesforce, can be connected via native APIs or integration platforms such as MuleSoft, using adapters.
- Legacy application integration: legacy applications that allow connectivity via message queues or flat files can also integrate into a stream-based environment.
- Batch-based upload: data that becomes available after EOD processing, or arrives from external systems as flat files, can also be integrated onto streams.
- Real-time integration: enterprise environments with an ESB can connect in real time to the web-service endpoint of the related stream, and any application that can call web services can connect the same way (see the sketch after this list).
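As one concrete flavour of real-time integration, the sketch below POSTs a packet to a stream's web-service endpoint over HTTP. The base URL and endpoint layout are hypothetical; any ESB, adapter or web-service-capable application could make an equivalent call.

```python
import json
import urllib.request

def post_packet(stream: str, packet: dict,
                base_url: str = "https://ingest.example.com/streams") -> int:
    """POST a data packet to the web-service endpoint of a stream.

    The endpoint shape (base_url/stream) is an assumption for illustration.
    """
    req = urllib.request.Request(
        url=f"{base_url}/{stream}",
        data=json.dumps(packet).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# post_packet("transactions", {"customer_id": "C42", "txn_amount": 99.5})
```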
Conclusion
If you want to see analytics results in real time, then stream-based processing is the way to go. A stream-based approach to data integration preserves the sanctity of data through its life cycle. It also fits easily with many different environments and sources: the cloud, legacy applications, batch uploads and real-time integration.
Download the Aureus Analytics whitepaper "Data Integration with CRUX" to learn how data stream-based integration is used to bring together multiple datasets in real-time.