Event streaming platforms provide a mechanism to store aggregated data in memory, enabling analytics and actionable insights to be derived on the fly. This white paper explains the emergence of event streaming, the architectural components that make up a streaming platform, and then describes a use case implemented with the TIBCO Streaming platform.
Since the dismantling of the early monolithic application, computer systems have evolved continuously. Earlier systems accumulated data on disk, and on a configured schedule the data was pumped into the next system. As the next system processed this data, it stored the needed information in an operational data store (ODS) and produced another set of data for the system after it to consume.
The lifecycle of the active, to-be-processed data ended when it reached the last hop and the final system finished processing it. In a typical flow comprising several hops, it is easy to see that this was a staggered approach: processing was interspersed with waiting periods until the data was sent to the next hop, and then the next.
When messaging systems were introduced into the mix, systems began to operate closer to real time. The scheduler-driven, staggered approach was replaced with real-time systems built on messaging platforms. In a typical process model, as messages were received the data was parsed, enriched, transformed, and/or aggregated as needed; the result was stored in the system's ODS, and messages were then published to the next system. This was a significant shift, since messages were processed independently, regardless of any schedule.
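The message-driven process model described above can be sketched as follows. This is a minimal illustration, not any specific product's API: the in-memory queues stand in for a messaging platform, the dictionary stands in for an ODS, and all names (`handle_message`, `run_once`, the record fields) are hypothetical.

```python
import json
import queue

# Hypothetical stand-ins for the inbound and outbound messaging
# channels; a real deployment would use a messaging platform here.
inbound = queue.Queue()
outbound = queue.Queue()
ods = {}  # stand-in for the operational data store

def handle_message(raw: str) -> None:
    """Process one message: parse, enrich, store, then publish."""
    record = json.loads(raw)           # parse the incoming message
    record["status"] = "processed"     # enrich/transform as needed
    ods[record["id"]] = record         # persist the result to the ODS
    outbound.put(json.dumps(record))   # publish to the next system

def run_once() -> None:
    """Drain whatever is currently queued; no schedule involved."""
    while not inbound.empty():
        handle_message(inbound.get())

# Messages are handled as they arrive, independent of any batch window.
inbound.put(json.dumps({"id": "42", "amount": 10}))
run_once()
```

The key contrast with the scheduled approach is that each message flows through the parse-enrich-store-publish steps on arrival, rather than waiting for a batch window to open.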