10 Questions For @PrimeDimensions: Real-Time, Event-Driven, Complex Event Processing In Government and Healthcare

A few years ago I flew all the way to Cape Town, South Africa to present a paper at the triennial World Congress on Medical and Health Informatics, commonly known as MedInfo. My topic was a “big picture” framework of how electronic health records can be supplemented with process-aware Business Process Management ideas and tech. A key link in that framework was Complex Event Processing. I am not an expert on Event-Driven Architectures. So I was delighted to convince Michael Joseph of Prime Dimensions, who is an expert, to answer my questions! He’s experienced in both government and healthcare real-time information processing systems. By the way, I’m tweeting out chunks of this interview during the NextGov Prime2014 conference. Both Michael (@PrimeDimensions) and I (@wareFLO) will be responding to replies and retweets — in real-time!


Every hour or two I’ll update this blog post and tweet links to each added question for which there is a new answer. I’ll tweet on the #Prime2014 hashtag, so be sure to follow it during the Prime2014 conference, Monday, September 2014. Say hi to Michael for me if you bump into him at #Prime2014.

Thanks, Michael! First of all, enjoy the conference! Now, let’s dive into the real-time, event-driven weeds! Could you provide us a brief glossary of terms, so we won’t feel lost?

Sure, Chuck. The following is from one of my slide decks.

  • Real-time Data: Data streams processed and analyzed within seconds of data collection.
  • Business Process Management (BPM): Includes user-friendly business modeling and optimization tools, system integration tools, business activity monitoring tools, and rich workflow capabilities for end users.
  • Complex Event Processing (CEP): Advanced analytics on one or more streams of real-time data to quickly identify and respond to meaningful events.
  • Data in motion: Streaming data flowing within a system or from one system to another (e.g., HL7, medical devices, real-time location services).
  • Data at rest: Data stored in a database or file.
  • Business Activity Monitoring: Real-time activity data converted to information and pushed to end-users through a visualization tool or dashboard.
  • Operational Business Intelligence (BI): Reporting of real-time data triggered by an end-user request.
  • Event-driven Architecture (EDA): A framework that orchestrates behavior around the production, detection, and consumption of events, as well as the responses they evoke.

Thanks, Michael, that was very helpful! If we count that as the first question in this interview, it gets us to a nice round ten (plus one!). Here are the rest of the questions I have, based on past conversations with you. I know you’ve got a foot in both the government and healthcare spheres, so I’d appreciate it if you could address both areas, when practical.

  1. Could you provide us a brief glossary of terms before we dive into the Real-Time weeds?
  2. What are the advantages in adopting a real-time data capability?
  3. What are the advantages in adopting a real-time data capability in healthcare?
  4. Is there an industry-wide, accepted definition of “real-time?”
  5. What are typical sources of real-time data?
  6. What type of analytics can be generated with real-time data?
  7. How does this real-time capability influence or drive the future-state solution architecture?
  8. What are advantages of an Event-Driven Architecture?
  9. What technologies are required to deploy a real-time capability?
  10. What is the relationship between business process management (BPM) and Complex-Event Processing (CEP)?
  11. You’ve got a great marketecture of how an Event-Driven Architecture could fit into a hospital IT architecture. May I share it?

2. What are the advantages in adopting a real-time data capability?

Government agencies continuously generate and collect valuable real-time data associated with specific transactions, but they generally have limited ability to effectively analyze, process and present this data for actionable intelligence and real-time decision support.

Real-time data allows information to be disseminated in a timely manner, when and where it is needed. Most organizations have been focusing efforts on leveraging technology to become a data-driven enterprise; the next evolution is to also consider how to become an event-enabled enterprise. Real-time capability and related process automation assist in accessing data to build event-driven applications enriched with other relevant information and bringing them together in a scalable, vendor agnostic platform.

The expectation and pressure to deliver measurable results and meaningful change has never been more pronounced, as government executives face enormous challenges and risks as they navigate the complexity of our digital, data-driven world. These circumstances necessitate next-generation technologies designed to extract value from very large volumes of disparate, multi-structured data by enabling high-velocity capture, discovery, and analysis.

3. What are the advantages in adopting a real-time data capability in healthcare?

For the healthcare industry, a real-time capability is required to meet and exceed future standards of care, provider experience, and patient engagement expectations, and to accommodate the massive transformation currently occurring in healthcare — a transformation focused on opening up health data to facilitate exchange among providers, payers, and patients. For this reason, healthcare providers should be seeking to deploy an enterprise-wide real-time processing capability that provides improved clinical insights, operational effectiveness and situational awareness associated with key indicators and events.

4. Is there an industry-wide, accepted definition of “real-time?”

Not exactly. The real-time platform should be designed based on requirements for providing minimally acceptable timeliness of information based on feasibility and clinical necessity. In collecting, processing and analyzing real-time data, there is inherent latency depending on data rates, volume, aggregation method, processing power, embedded analytics and throughput. In general, real-time data is defined as data streams that are processed and analyzed from milliseconds to approximately 30 seconds of collection. This is done either through a machine-to-machine or machine-to-human interface. Sensors and medical devices generate real-time data that are captured by other systems for continuous monitoring. Depending on the scenario, anomalies may be resolved by automated responses or alerts for human intervention.
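[Chuck writing] To make the latency idea concrete, here’s a toy Python sketch. All the names here are mine, not Michael’s; it simply classifies a reading as “real-time” only if its end-to-end latency fits within the roughly 30-second window he describes above:

```python
from dataclasses import dataclass

# Threshold echoing the "milliseconds to approximately 30 seconds" rule of thumb.
REAL_TIME_WINDOW_SECONDS = 30.0

@dataclass
class Reading:
    value: float
    collected_at: float   # epoch seconds when the device sampled the value
    received_at: float    # epoch seconds when the platform ingested it

def is_real_time(reading: Reading, window: float = REAL_TIME_WINDOW_SECONDS) -> bool:
    """A reading counts as real-time if end-to-end latency fits the window."""
    return (reading.received_at - reading.collected_at) <= window

fresh = Reading(value=98.6, collected_at=1000.0, received_at=1002.5)   # 2.5 s latency
stale = Reading(value=98.6, collected_at=1000.0, received_at=1045.0)   # 45 s latency
```

In a real platform, of course, the acceptable window would vary by clinical necessity, as Michael notes, rather than being a single constant.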

5. What are typical sources of real-time data?

Sources of real-time data include data-in-motion, such as instant messages, flow sheets, device and sensor data, business process and workflow data, and real-time location services (RTLS). The goal of the real-time capability is not only to capture and integrate these sources, but also to transform and collect latent, transactional data as it becomes available in various source systems, such as financial, human resources, operations, and supply chain management. The challenge is efficiently integrating these disparate data sources across a fragmented information infrastructure, with multiple data silos, data marts and operational data stores. The real-time platform will provide specialized data services to extract data-at-rest that represents the most current records and maintains “a single version of the truth.”

6. What type of analytics can be generated with real-time data?

Real-time processing provides better visibility into all dimensions of healthcare delivery. Disparate clinical and operational applications create the need to aggregate patient data in a new IT environment for real-time collection, processing and analysis. Analytic engines process vast amounts of data to identify and correlate the most important factors influencing patient care to develop an optimal treatment plan. In addition to real-time response and alerting, the platform enables the emergence of a new class of analytic applications, dashboards and visualizations for predictive modeling, clustering analysis, decision trees, root-cause analysis and optimization.

By supporting both on-demand and continuous analytics, the real-time platform extends and improves operational business intelligence and business activity monitoring through integration with Enterprise reporting and dashboard tools. On-demand real-time analytics are generated and delivered based on a user query; the data are pulled in real-time. Continuous real-time analytics notify users with updates as they occur; the data are pushed on a regular basis. Algorithms include statistical analysis, predictive modeling, root-cause analysis, optimization, data mining, decision trees, clustering and natural language processing.
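[Chuck writing] The pull-versus-push distinction is the crux here, so here’s a minimal Python sketch of my own devising (not Michael’s platform) contrasting the two delivery styles in a single toy store:

```python
from typing import Callable, Dict, List

class RealTimeStore:
    """Toy store contrasting pull (on-demand) and push (continuous) delivery."""

    def __init__(self) -> None:
        self._latest: Dict[str, float] = {}
        self._subscribers: List[Callable[[str, float], None]] = []

    def ingest(self, metric: str, value: float) -> None:
        self._latest[metric] = value
        for notify in self._subscribers:   # push: continuous analytics
            notify(metric, value)

    def query(self, metric: str) -> float:
        return self._latest[metric]        # pull: on-demand analytics

    def subscribe(self, callback: Callable[[str, float], None]) -> None:
        self._subscribers.append(callback)

store = RealTimeStore()
pushed = []
store.subscribe(lambda metric, value: pushed.append((metric, value)))
store.ingest("heart_rate", 72.0)
store.ingest("heart_rate", 110.0)
```

An on-demand dashboard would call `query` when a user asks; a continuous dashboard would register a subscriber and receive every update as it arrives.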

7. How does this real-time capability influence or drive the future-state solution architecture?

Event-driven architecture (EDA) integrates relational, non-relational and stream data structures to create a unified analytics environment. EDA describes an architecture where layers are partitioned and decoupled from each other to allow better or easier composition, asynchronous performance characteristics, loose coupling of APIs and dependencies and easier profiling and monitoring. Its goal is a highly reliable design that allows different parts of the system to fail independently and still allow eventual consistency.

EDA enables discovery, or exploratory, analytics, which rely on low-latency continuous batch processing techniques and high frequency querying on large, dynamic datasets. This type of architecture requires a different class of tools and system interfaces to promote a looser coupling of applications to streamline data access, integration, exploration, and analysis. It is also designed to deploy real-time Web applications using NoSQL databases, RESTful interfaces and advanced platforms that maximize throughput and efficiency by providing evented, asynchronous I/O and guaranteed, non-blocking libraries, thereby sharing code between the browser and server, effectively eliminating the Web server layer.
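[Chuck writing] The “loose coupling plus evented, asynchronous I/O” idea can be sketched in a few lines of Python with asyncio. This is my own illustrative toy, not Michael’s architecture: a producer and a consumer that know nothing about each other, connected only by a queue, so either side can be scaled or replaced independently:

```python
import asyncio
from typing import List

async def device(queue: asyncio.Queue, readings: List[int]) -> None:
    """A producer (say, a monitor) that emits readings without blocking."""
    for r in readings:
        await queue.put(r)
    await queue.put(None)  # sentinel: stream finished

async def analytics(queue: asyncio.Queue, out: List[int]) -> None:
    """A decoupled consumer; it only knows about the queue, not the device."""
    while True:
        reading = await queue.get()
        if reading is None:
            break
        out.append(reading * 2)  # stand-in for some stream computation

async def main() -> List[int]:
    queue: asyncio.Queue = asyncio.Queue()
    results: List[int] = []
    # Producer and consumer run concurrently and fail independently.
    await asyncio.gather(device(queue, [1, 2, 3]), analytics(queue, results))
    return results

results = asyncio.run(main())
```

The queue is the moral equivalent of the event bus in an EDA: layers are partitioned and decoupled, and the consumer catches up eventually rather than blocking the producer.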

8. What are advantages of an Event-Driven Architecture?

  • Promotes operational effectiveness, process automation and analytic excellence
  • Enables advanced analytics and clinical informatics through an interoperable and scalable infrastructure
  • Streamlines technology insertion based on agile development methodology for rapid deployments
  • Controls IT operational costs by eliminating redundancies and aligning capabilities
  • Supports strategic planning, organizational alignment, and continuous process improvement
  • Provides a practical framework for defining and realizing the evolving future state
  • Integrates multi-structured and stream data using advanced technologies that provide high velocity data capture, discovery and analysis
  • Establishes a virtualized data environment and extensible service-oriented architecture that supports both Restful and SOAP APIs, allowing multiple data structures and formats (JSON, XML, etc.)
  • Provides an application development platform with domain-specific enclaves for evolving from “systems of record” to “systems of engagement”

9. What technologies are required to deploy a real-time capability?

A real-time platform ingests and processes data streams from clinical and operational systems, performs complex event processing (CEP), pattern matching and anomaly detection, applies on-demand and continuous analytics and triggers notifications and alerts based on an embedded rules engine. The platform also aggregates at-rest retrospective data from other source systems with real-time data streams for enhancing the context of information presented through operational business intelligence. With an in-memory cache, the platform has the ability to retain and persist data as long as it remains relevant to the real-time event. Detecting and reacting to events in real-time allows a wide variety of business processes to be automated and optimized, enabling a patient’s entire care team to improve communication and collaboration.

CEP provides an organization with the ability to detect, manage and predict events, situations, conditions, opportunities and threats in complex, heterogeneous networks. The in-memory cache provides the capability to run multiple, complex filters that compare events to other events in the same stream, or in another stream, or compare events to a computed metric. Moreover, in conjunction with the CEP rules engines, multiple algorithms can be deployed simultaneously and include the following rule types: (1) message syntax and semantic rules, (2) routing and decision rules, and (3) aggregation and transformation rules. Sophisticated rules engines can invoke in-memory analytics. To optimize performance, the platform can apply data compression and message parsing directly on the incoming streams, depending on the data rate, content and structure. Detection-oriented Complex Event Processing focuses on detecting combinations or patterns of events. Aggregation-oriented Complex Event Processing focuses on executing embedded algorithms as a response to data streams entering the system.
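[Chuck writing] Here’s a toy Python sketch of the two CEP styles Michael distinguishes. The names and logic are mine, meant only to illustrate the concepts: aggregation-oriented CEP maintains a computed metric over a window of events, while detection-oriented CEP flags a meaningful combination of events in the stream:

```python
from collections import deque
from statistics import mean
from typing import Deque, Iterable, List, Sequence

class SlidingWindow:
    """Aggregation-oriented CEP: maintain a metric over the last N events."""

    def __init__(self, size: int) -> None:
        self._values: Deque[float] = deque(maxlen=size)

    def push(self, value: float) -> float:
        self._values.append(value)
        return mean(self._values)

def detect_pattern(events: Iterable[dict], pattern: Sequence[str]) -> bool:
    """Detection-oriented CEP: flag when the event types contain the pattern
    as a contiguous subsequence of the stream."""
    types = [e["type"] for e in events]
    n = len(pattern)
    return any(types[i:i + n] == list(pattern) for i in range(len(types) - n + 1))

# Aggregation: a 3-event moving average over a vitals stream.
window = SlidingWindow(size=3)
averages: List[float] = [window.push(v) for v in [60, 70, 80, 90]]

# Detection: an "alarm after disconnect" pattern in a device event stream.
stream = [{"type": "connect"}, {"type": "disconnect"}, {"type": "alarm"}]
found = detect_pattern(stream, ("disconnect", "alarm"))
```

A production CEP engine would evaluate such filters continuously, in-memory, across multiple correlated streams; the shape of the computation is the same.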

10. What is the relationship between Business Process Management (BPM) and Complex Event Processing (CEP)?

An ideal real-time platform integrates Business Process Management and Complex Event Processing. Together, these components create an agile, high performance, scalable platform that can deliver fast insights through real-time queries, notifications, alerts and advanced analytics. With BPM, the platform is able to detect discrete events and trigger a workflow that completes a specific process through a series of transactions. CEP extends this capability by correlating multiple events through a common interface that invokes an embedded rules engine. Event filtering evaluates a specified logical condition based on event attributes, and, if the condition is true, publishes the event to the destination stream as a notification or alert. Moreover, integrating BPM tools, and other line-of-business (LOB) applications, improves operational business intelligence and business activity monitoring (BAM), the use of technology to proactively define and analyze the most critical opportunities and risks in an enterprise. By deploying a BPM capability based on CEP technology, providers will be able to process high volumes of underlying technical events to derive higher level business and clinical decision support, extending the functionality of BPM tools and providing insights to a wider audience of users.
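[Chuck writing] To see how event filtering hands off to a workflow, here’s one more illustrative Python sketch (again, my own names, not any vendor’s API): a CEP-style filter evaluates a logical condition on each event’s attributes and publishes matches to a destination stream, and each published alert then stands in for a BPM engine kicking off a workflow instance:

```python
from typing import Callable, Dict, List

def make_filter(condition: Callable[[Dict], bool],
                destination: List[Dict]) -> Callable[[Dict], None]:
    """Event filtering: if the condition on the event's attributes is true,
    publish the event to the destination stream as an alert."""
    def handle(event: Dict) -> None:
        if condition(event):
            destination.append(event)
    return handle

alerts: List[Dict] = []        # destination stream
workflow_log: List[str] = []   # stands in for a BPM engine starting a process

critical = make_filter(lambda e: e["heart_rate"] > 120, alerts)

for event in [{"patient": "A", "heart_rate": 85},
              {"patient": "B", "heart_rate": 130}]:
    critical(event)

for alert in alerts:           # each alert triggers a (mock) workflow instance
    workflow_log.append(f"notify-care-team:{alert['patient']}")
```

The CEP layer decides *which* events matter; the BPM layer decides *what process* each one triggers. That division of labor is exactly the integration Michael describes.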

11. You’ve got a great marketecture of how an Event-Driven Architecture could fit into a hospital IT architecture. May I share it?

[Michael, you’ve got an incredibly detailed “marketecture” of how an Event-Driven Architecture integrates into a hospital venue. May I share it? Also, your detailed answers to my in-the-weeds questions really started a bunch of wheels spinning in my head. Do you mind if I follow up at a future date with ten more questions, even more healthcare focused?]

Please do! I look forward to it!

Cheers

Michael



P.S. [Chuck writing] You can see that Michael is a fount of detailed knowledge about real-time analytics. Michael comes from a business intelligence and analytics background. I, of course, come from a healthcare workflow tech background. It’s fascinating to see how event-driven architectures sit at the intersection between clinical business intelligence and clinical business process management. And I’d like to thank Michael for having a conversation about that intersection. By the way, as I mentioned at the outset, I’m tweeting out chunks of this interview during the NextGov #Prime2014 conference. And both Michael and I will be responding to replies and retweets — in real-time!
