Pivotal + VMware: Transforming how more of the world builds software
Tech Insights
Five Minute Read

Event streaming:
Build applications that keep up with business

Event streaming reflects how businesses actually work. Thousands of small changes happening, all day, every day. Shouldn’t your applications work the same way?


Event streaming and data

Streaming data is a constant flow of events, each carrying information about a change of state. A common example is a stock market ticker: every time a stock’s price changes, a new message is created containing the time of day, the stock’s code, and its new trade price. It’s called “streaming data” because thousands of stocks are traded thousands of times every second, resulting in a constant stream of data.
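
To make the idea concrete, here is a minimal sketch of what one such event might look like as a data structure. The class and field names are illustrative only, not taken from any particular market data feed:

    import java.time.Instant;

    // A single stock-tick event: an immutable record of one change of state.
    // (Illustrative field names; real feeds carry more detail.)
    public record StockTick(Instant timestamp, String symbol, double tradePrice) { }

    // Every price change produces a brand-new event, for example:
    //   new StockTick(Instant.now(), "VMW", 142.37)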

But streams of data don’t have to come in a torrent to be useful. Streams can have lower cadence and still be valuable. The price change of a single stock from one second to the next is probably not very interesting, but the change in price between one day and the next could be life-changing.

Streaming data is unbounded, meaning it has no real beginning and no real end—each event is processed and acted on as it occurs. Events can trigger specific application behaviors and feed real-time business dashboards, as well as make their way into other systems for later batch processing. This is in contrast to legacy approaches, where, by default, data is sent to a backend data warehouse or Hadoop cluster to be analyzed every day, week, month, or quarter.
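
To make the contrast concrete, the sketch below (reusing the StockTick record from above; the two sink methods are hypothetical placeholders) handles each event the moment it arrives, feeding a live view while also landing the event for later batch analysis:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Illustrative only: events are handled one at a time, as they occur,
    // rather than being collected and analyzed at the end of a reporting period.
    public class TickProcessor {
        private final BlockingQueue<StockTick> events = new LinkedBlockingQueue<>();

        public void offer(StockTick tick) {
            events.offer(tick);                 // a new event arrives
        }

        public void run() throws InterruptedException {
            while (true) {                      // unbounded: there is no "end" of the stream
                StockTick tick = events.take(); // blocks until the next event occurs
                updateDashboard(tick);          // feed a real-time view
                archiveForBatch(tick);          // also land it for later batch analysis
            }
        }

        private void updateDashboard(StockTick tick) { /* hypothetical sink */ }
        private void archiveForBatch(StockTick tick) { /* hypothetical sink */ }
    }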





Event streaming and microservice architectures

When combined with microservices, event streaming can open up a whole new set of exciting opportunities. To appreciate that value, however, it helps to understand some of the known issues with microservice architectures.

One of the inherent challenges with microservices is the coupling that can occur between the services. In a conventional “ask, don’t tell” architecture, data is gathered on demand: Service A “asks” Services B, C, and D, “What’s your current state?” This assumes B, C, and D are always available to respond. However, if they’ve moved or are offline, they can’t respond to Service A’s query.

To compensate, microservice architectures tend to include workarounds (such as retries) to deal with any ill effects caused by changes in deployment topology or network outages. This adds an extra layer of complexity and cost.

In an event-driven architecture (i.e., “tell, don’t ask”), the approach is reversed: Services B, C, and D “publish” continuous streams of data as events, and Service A “subscribes” to those event streams, processing the facts, collating the results, and caching them locally, ready for the next time they’re needed.
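
A minimal, in-memory sketch of the pattern is below. The Topic class stands in for a real messaging platform such as RabbitMQ or Apache Kafka, and all names are illustrative:

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArrayList;
    import java.util.function.Consumer;

    // Stand-in for a broker topic; in practice this role is played by a
    // messaging platform such as RabbitMQ or Apache Kafka.
    class Topic<E> {
        private final List<Consumer<E>> subscribers = new CopyOnWriteArrayList<>();
        void subscribe(Consumer<E> subscriber) { subscribers.add(subscriber); }
        void publish(E event) { subscribers.forEach(s -> s.accept(event)); }
    }

    // The event that Services B, C, and D publish whenever their state changes.
    record StateChanged(String serviceName, String state) { }

    // "Tell, don't ask": Service A caches the latest state it has been told about,
    // instead of querying B, C, and D on demand.
    class ServiceA {
        private final Map<String, String> latestState = new ConcurrentHashMap<>();

        ServiceA(Topic<StateChanged> events) {
            events.subscribe(e -> latestState.put(e.serviceName(), e.state()));
        }

        // Answered from the local cache; no call to B, C, or D is needed,
        // even if they happen to be offline right now.
        String stateOf(String serviceName) {
            return latestState.get(serviceName);
        }
    }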

Using event streams in this way opens up the possibility of fully event-driven systems, mimicking more closely how businesses actually work, perhaps even using scale-to-zero functions (or “serverless computing”) more widely.



Prepare for an event-driven future

If your organization wants to adapt to future business demands more quickly, then stream processing of events really matters. Streaming architectures can power everything from simple event notifications—for example, sending an alert when a stock price slumps—to real-time machine learning models to detect suspicious trade activity. Even in batch operations, streaming data can improve analytics and business intelligence by inherently connecting specific events with the times they occurred.

Modern stream processing systems such as Apache Kafka can also act as a stateful source of truth about the business. Because they can store and process events—while also holding onto historical data—systems can analyze and aggregate data in real time without reaching out to external data sources.
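
As a sketch of what that looks like in practice, the Kafka Streams API that ships with Apache Kafka can maintain a running trade count per stock symbol in a local, fault-tolerant state store, with no external database involved. The topic names, application ID, and broker address below are illustrative assumptions:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class TradeCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "trade-counts");      // illustrative app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Count trades per symbol; the running totals live in a local state
            // store that is backed up to Kafka itself, not an external database.
            KTable<String, Long> countsBySymbol = builder
                    .<String, String>stream("trades")   // hypothetical topic of trade events keyed by symbol
                    .groupByKey()
                    .count();

            countsBySymbol.toStream()
                    .to("trade-counts-by-symbol", Produced.with(Serdes.String(), Serdes.Long()));

            new KafkaStreams(builder.build(), props).start();
        }
    }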

However simple or advanced the implementation, event streaming can help organizations future-proof their applications. A common benefit realized by Pivotal customers at all levels is their ability to add new features or eliminate problematic ones simply by adding or removing subscribers to a data stream. There’s no monolithic business logic to reconfigure; just microservices to plug or unplug.

In the years to come, the ability to think in events and build applications around them will become ever more important. Apart from the business improvements that come with being faster, making better use of data, and designing more intuitive user experiences, technologies such as serverless computing and functions have the notion of events baked into their core. The best enterprises will take full advantage of these innovations to build better, more successful applications.



Streaming data at Pivotal

As a contributor to the Spring Framework, RabbitMQ, and many other open-source projects related to modern application development, Pivotal supports developers building streaming data applications in many ways.


Pivotal is also partnering closely with Confluent, provider of the leading commercial event-streaming platform based on Apache Kafka, to bring enterprise-grade Kafka to our ecosystem. The Confluent Operator automates and simplifies the provisioning and management of the Confluent Platform on Pivotal Container Service (PKS). Enterprises can:

  • Deploy Kafka as a cloud-native application on Kubernetes to simplify and automate provisioning, and minimize the operating burden of managing Kafka clusters at scale.
  • Run applications and their Kafka service on any (public or private) cloud without modification or the dreaded “vendor lock-in.”

Solace PubSub+ for Pivotal Platform, another partner integration, registers a service broker on Pivotal Platform and exposes preconfigured service plans in the Pivotal Marketplace for developers to choose from. Solace, together with Pivotal, also developed the first third-party, commercially developed binder to ensure optimal integration with Spring Cloud Stream. It’s been released as an open-source project and is available on GitHub and Maven.

The tile, along with the Spring Cloud Stream Binder for PubSub+, gives developers the tools they need to deploy event-driven microservices and streaming applications to the Pivotal Platform.
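
To show how little application code is involved, here is a minimal Spring Cloud Stream consumer written in the functional style. The function name and destination are hypothetical, and the middleware behind it (RabbitMQ, Kafka, or Solace PubSub+ via its binder) is selected through the binder dependency and configuration rather than in the code itself:

    import java.util.function.Consumer;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class TradeAlertApplication {

        public static void main(String[] args) {
            SpringApplication.run(TradeAlertApplication.class, args);
        }

        // Bound to a destination via configuration, for example:
        //   spring.cloud.stream.bindings.onTrade-in-0.destination=trades
        // (binding and topic names are illustrative)
        @Bean
        public Consumer<String> onTrade() {
            return payload -> System.out.println("Received trade event: " + payload);
        }
    }

Swapping the underlying broker, or adding another subscriber to the same stream, is then a matter of configuration and deployment rather than a change to business logic.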

Contact us