

Cheers to Stream-based Data Integration...and to Never Losing a Byte!!

Enterprise applications belong to a vibrant ecosystem, and consequently the data they generate is large and varied. Enterprises both benefit and suffer from this nature of applications and data. Whenever a new application that integrates with the rest of the ecosystem is to be deployed, the precondition is an 'expansive data definition with referential value' on day 1 before integration can start. Traditionally, this approach to data integration involves identifying a target data structure and force-fitting data from all sources into it. This is done to ensure a 'seamless' integration - never mind the loss of data considered irrelevant.

Data Warehouse Approach

Traditional data warehouses use the following approach for data ingestion:

  • A fixed target structure into which data is ingested
  • Source data is transformed to fit the target structure
  • Any 'alien' data is simply ignored and dropped
  • Unstructured data is sparingly allowed
  • Reporting and analytics are run on top of this target structure
  • The structure is reviewed periodically for changes in definitions
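
To make the loss concrete, here is a minimal Python sketch of this style of ingestion; the target schema and field names are hypothetical. Anything outside the fixed structure is silently discarded:

```python
# A minimal sketch of fixed-schema ingestion; the target structure
# and field names are hypothetical, for illustration only.

TARGET_SCHEMA = {"customer_id", "txn_amount", "txn_date"}  # fixed on day 1

def ingest_warehouse(record: dict) -> dict:
    """Force-fit a source record into the fixed target structure."""
    # Fields outside the schema are considered 'alien' and dropped.
    return {k: v for k, v in record.items() if k in TARGET_SCHEMA}

row = ingest_warehouse({
    "customer_id": "C-101",
    "txn_amount": 2500.0,
    "txn_date": "2020-01-15",
    "channel": "mobile",   # unknown to the warehouse: silently lost
})
print(row)  # {'customer_id': 'C-101', 'txn_amount': 2500.0, 'txn_date': '2020-01-15'}
```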

So how do we absorb data without defining it all up front, so that we can discover the structure and value of the data on an ongoing basis?

Benefits of Stream-based Data Integration

In the real world, data is collected or elicited through events, whether enterprise-initiated or customer-initiated. Data collection from transactional systems into analytical systems can follow the same pattern, provided event data is allowed a variable structure. This works best when data flows as streams from multiple sources. This approach is called event-based data modelling using data streams.

  1. The unit of integration is a data packet, which contains a series of connected name-value pairs.
  2. Data packets flow into the data ingestion environment and onto one or more target streams. A packet is considered for processing on a stream if the minimum variables required for that stream are present in the packet.
  3. Each data packet can provide foundational information for multiple different events.
  4. Data is stored as events; data for the same event can be provided incrementally.
  5. Variables in a packet that are unknown to the data ingestion environment are not ignored or dropped. Instead, they are retained with all events identified from the data packet (see the sketch after this list).
  6. Since events are real-world concepts, they form an excellent foundation for analytical models targeted at behavioral outcomes.
  7. Data ingestion can start very quickly, with conformance to only a minimum set of variables required per stream.
  8. Adjunct variables can be discovered after ingestion and used for analytical value.
  9. The approach works very well with the real-time paradigm in which today's businesses compete.
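
The following minimal Python sketch illustrates points 2, 3, 5 and 8 above; the stream names and minimum variables are hypothetical, not a fixed specification:

```python
# A minimal sketch of stream-based ingestion under the rules above.
# Stream names and minimum variables are hypothetical.

MINIMUM_VARIABLES = {
    "customer":     {"customer_id", "name"},
    "transactions": {"customer_id", "txn_amount"},
}

def ingest_packet(packet: dict) -> dict:
    """Route one packet (connected name-value pairs) onto every stream
    whose minimum variables are present, keeping ALL variables."""
    events = {}
    for stream, required in MINIMUM_VARIABLES.items():
        if required <= packet.keys():      # point 2: minimum variables present
            events[stream] = dict(packet)  # point 5: unknown variables retained
    return events                          # point 3: one packet, many events

events = ingest_packet({
    "customer_id": "C-101",
    "name": "Asha",
    "txn_amount": 2500.0,
    "wearable_steps": 9400,  # unknown today, retained for later discovery (point 8)
})
print(sorted(events))  # ['customer', 'transactions']
```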

In consumer-facing enterprise businesses, where relationships are long-term and influenced by experience, the following streams are essential:

  • Customer
  • Relationship
  • Transactions
  • Interactions

These streams abstract the natural structures of the processes that govern the operations of the business. And to top it all, they can be kick-started quickly and improved upon on an ongoing basis.
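
As a rough illustration, the four streams could be declared as nothing more than minimum-variable sets (the field names below are hypothetical), which is why ingestion can be kick-started so quickly:

```python
# One hypothetical way to declare the four essential streams as
# minimum-variable sets. Ingestion can start as soon as sources
# supply these few fields; everything extra is retained as-is.
STREAMS = {
    "customer":     {"customer_id", "name"},
    "relationship": {"customer_id", "relationship_type"},
    "transactions": {"customer_id", "txn_amount", "txn_date"},
    "interactions": {"customer_id", "channel", "timestamp"},
}
```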

Advantages of Stream-based Integration

The following are some of the technical advantages of stream-based integration:

  1. Cloud technologies: cloud-hosted applications like Salesforce can be connected via their native APIs, or via integration platforms like MuleSoft, using adapters.
  2. Legacy application integration: legacy applications that allow connectivity via message queues or flat files can also integrate into a stream-based environment.
  3. Batch-based upload: data that becomes available after EOD processing, or that arrives from external systems as flat files, can also be integrated on streams.
  4. Real-time integration: enterprise environments with an ESB can easily connect in real time to the web service endpoint of the related stream, and any application that can call web services can connect in real time as well (see the sketch after this list).
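
As an example of the last point, here is a minimal sketch of a packet being posted to the web service endpoint of a stream; the endpoint URL and payload are hypothetical, and any application or ESB that can call web services could do the same:

```python
# A minimal sketch of real-time integration: an application posts a
# packet to the web service endpoint of the related stream.
# The URL and payload here are hypothetical.
import json
from urllib import request

packet = {
    "customer_id": "C-101",
    "channel": "mobile",
    "timestamp": "2020-01-15T10:30:00Z",
}

req = request.Request(
    "https://ingest.example.com/streams/interactions",  # hypothetical endpoint
    data=json.dumps(packet).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with request.urlopen(req) as resp:
    print(resp.status)  # e.g. 200 if the stream accepted the packet
```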

Conclusion

If you want to see analytics results in real time, then stream-based processing is the way to go. A stream-based approach to data integration preserves the sanctity of data through its life cycle. It also fits easily into many different environments and sources: cloud applications, legacy applications, batch uploads, and real-time integrations.


Download the Aureus Analytics whitepaper "Data Integration with CRUX" to learn how stream-based data integration is used to bring together multiple datasets in real time.


Nitin Purohit
Nitin is CTO and co-founder at Aureus. With over 15 years of experience in leveraging technology to drive and achieve top-line and bottom-line numbers, Nitin has helped global organizations optimize value from their significant IT investments. Over the years, Nitin has been responsible for the creation of many product IPs. Prior to this role at Aureus, Nitin was the Global Practice Head for Application Services at Omnitech Infosolutions Ltd and was responsible for sales and profitability of offerings from application services across geographies.
