Out Of The Health IT Tar Pit: My Comments on A Robust Health Data Infrastructure

A Robust Health Data Infrastructure is an HHS report addressing why Meaningful Use has, so far, failed to deliver on early promises to dramatically improve healthcare while reducing cost. At its core is a proposed architecture to facilitate more interoperable EHRs and health IT systems. Since I also recently weighed in on why interoperability has eluded us (Pragmatic Interoperability), I thought I’d compare A Robust Health Data Infrastructure to my own views on healthcare workflow and health IT workflow technology.

Here is the proposed architecture for exchange of information.

Figure 1: JASON’s proposed software architecture for exchange of health information

The report’s authors are collectively referred to as JASON. I’ve not been able to find out what this acronym stands for, but it sounds like JSON (JavaScript Object Notation), a popular lightweight data-interchange Web format.

This architecture is, essentially, a marketecture, a “one page, typically informal depiction of the system’s structure and interactions. It shows the major components, their relationships and has a few well-chosen labels and text boxes that portray the design philosophies embodied in the architecture. A marketecture is an excellent vehicle for facilitating discussion by stakeholders during design, build, review, and of course the sales process. It’s easy to understand and explain, and serves as a starting point for deeper analysis.”

Marketecture is a portmanteau of “marketing” and “architecture”. Besides the above more technical meaning, a marketecture is also “used by a vendor to place itself in such a way as to promote all their strongest abilities whilst simultaneously masking their weaknesses.” I think it’s fair to say the proposed JASON diagram serves both purposes: education about what will be necessary to create truly interoperable health IT software systems, and marketing its adoption by a variety of healthcare and health IT stakeholders.

The JASON report prompted me to reread Out Of The Tar Pit. You’ll see the connection in a moment.

Out Of The Tar Pit is about why software is so difficult. Software is difficult because it is complex, by which the authors mean it is difficult to understand (not the more formal concept of computational complexity). They list reasons — state, control, and code volume — to which I will return.

Why did I think of Out Of The Tar Pit when I read the JASON report? Because software architectures are about managing complexity, attempting to reduce it by design, in order to arrive at more understandable and reliable software systems. Divide and conquer: if we can decompose a software system into understandable components, and understand their interactions, we can hope to understand the whole, and make it do what we wish more reliably.

So far, so good. The JASON-suggested software architecture for healthcare information exchange is a valiant effort. It’s only missing one key ingredient: process-aware information systems. Now, the architecture is, indeed, intended to be agnostic about which specific software platforms should be used to implement it. I, however, am not agnostic. I believe, without a doubt, that many of health IT’s problems regarding usability, interoperability, and cost are due to not using technologies that have been prevalent in other industries for years, in some cases even decades. These are the workflow technologies, including workflow management systems, business process management, and dynamic case management systems.

Workflow tech is used by, and embedded in, a wide variety of social, mobile, analytics, and cloud platforms. From speech recognition and natural language processing systems to “big data” and machine learning workflows, executable process models are helping to manage software complexity and increase understandability and reliability. I might visit some of the boxes of this architecture in a later post: stovepipe legacy systems, UI apps, middleware apps, semantics and language translation, as well as privacy services. However, before I do so, we need to review the major sources of software complexity the JASON architecture seeks to tame.

Out Of The Tar Pit blames software complexity on

  • state (data values)
  • control (order of execution), and
  • code volume (lines of text).

Out Of The Tar Pit suggests programmers think more declaratively, about what the software does rather than how it is accomplished, which would go a long way toward reducing software complexity. In other words, if health IT could better manage state, control, and code volume, it would accomplish much of the higher-level goal that motivates the JASON architecture.
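As a toy illustration of that declarative shift (my example, not the paper’s, and the field names are invented), compare spelling out how to build a result step by step with simply stating what the result should be:

```python
# Hypothetical example: find encounters still waiting on vitals.
encounters = [
    {"id": 101, "vitals_taken": False},
    {"id": 102, "vitals_taken": True},
    {"id": 103, "vitals_taken": False},
]

# Imperative: explicit control flow and mutable intermediate state.
waiting = []
for enc in encounters:
    if not enc["vitals_taken"]:
        waiting.append(enc["id"])

# Declarative: state *what* we want; the "how" is left to the language.
waiting_declarative = [enc["id"] for enc in encounters if not enc["vitals_taken"]]

assert waiting == waiting_declarative == [101, 103]
```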

On all three points, workflow technology contributes. Let’s take an extremely simplified patient encounter workflow.

A more realistic representation of healthcare workflow would be an order of magnitude more complex than this example. However, it would not be more complex than the software code necessary to support users as they interact with patients, other users, and data partners. This simplified workflow is sufficient, though, to illustrate how workflow technology manages state, control, and code.

First of all, this simplified workflow, drawn as a network of nodes and arrows, is what is called a state transition network, or STN. Each numbered node is a patient encounter state. “S” is the start state. Circled nodes are end states. Arrows between states represent state transitions. For example, to get from “S” to “1” (vitals), the task of taking vitals must be recorded as having been accomplished. And so on. “S” and “1” and so on represent state just as much as variables in a computer program represent state. However, these states are at a level high enough that a human familiar with patient encounter workflow can understand the “program” (the STN), yet low enough that a workflow engine can execute it, perhaps by putting a task in a user’s worklist and checking it off when completed, or by recognizing from other data when the vitals task has been accomplished.
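Since the original STN figure is not reproduced here, a minimal Python sketch may help show how such a network can be written down as plain data. The states and tasks beyond “S” and “1” (vitals) are hypothetical placeholders of my own, not the actual diagram:

```python
# A simplified patient-encounter STN as data: start state, end states,
# and the transitions allowed from each state. "S" is the start state;
# "1" represents "vitals taken". Other states/tasks are illustrative guesses.
stn = {
    "start": "S",
    "end_states": {"4"},
    "transitions": {
        ("S", "take_vitals"): "1",       # S -> 1 once vitals are recorded
        ("1", "document_history"): "2",  # hypothetical downstream steps
        ("2", "order_tests"): "3",
        ("2", "finish_encounter"): "4",
        ("3", "finish_encounter"): "4",
    },
}

def next_state(state: str, task: str) -> str:
    """Return the state reached by completing `task` while in `state`."""
    try:
        return stn["transitions"][(state, task)]
    except KeyError:
        raise ValueError(f"Task {task!r} is not allowed in state {state!r}")

# Completing the vitals task moves the encounter from S to 1.
assert next_state("S", "take_vitals") == "1"
```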

Workflow systems still have a lot of variables, most pertaining to patient state, stored in databases but also entered or calculated during a patient encounter. However, by explicitly representing workflow states, and the connections among them, in a declarative manner (think of declaring a variable and assigning it a type and a value) at a level that makes sense to both humans and machines, a significant portion of EHR software complexity is more understandably managed.

What about control, the order of task execution? The arrows among states represent this. Instead of confusing if-then and case statements where commas matter, not to mention chunks of software executing in parallel with no guarantee of the order in which they will complete, sequence order is represented at a level of abstraction understandable by humans but executable by engines. Workflow technology has a natural means of representing and automatically executing state and task order, a means that is missing in traditional health IT systems that lack workflow engines and executable process definitions.
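As a rough sketch of what “executable by engines” can look like, here is a minimal, hypothetical Python example of a generic engine that simply walks whatever task sequence it is handed; the task names are invented, and this is my illustration, not any particular BPM product’s API:

```python
# A tiny, generic "workflow engine": it executes whatever task sequence it is
# given, so changing the order means editing the definition, not the code.
from typing import Callable, Dict, List

def run_workflow(tasks: List[str], handlers: Dict[str, Callable[[], None]]) -> None:
    worklist = list(tasks)      # tasks still to be done, in the declared order
    while worklist:
        task = worklist.pop(0)  # take the next task the definition calls for
        handlers[task]()        # the engine, not hand-written branching, drives order

# Hypothetical patient-encounter definition and task handlers.
encounter_tasks = ["take_vitals", "document_history", "finish_encounter"]
handlers = {
    "take_vitals": lambda: print("Vitals recorded"),
    "document_history": lambda: print("History documented"),
    "finish_encounter": lambda: print("Encounter closed"),
}

run_workflow(encounter_tasks, handlers)
```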

Finally, there is code volume. Many workflow technology vendors bend the truth a bit and boast about “no code” development. However, dragging, dropping, and double-clicking on Visio-style icons can indeed be “substantially less code” development. Properly trained business and clinical analysts, who don’t know how to program but do understand healthcare workflow, can create applications that would otherwise require teams of programmers and take an order of magnitude or more longer to build.
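As a rough sketch of why “substantially less code” is plausible, here is a hypothetical process definition expressed as JSON (echoing the earlier JSON aside). The idea is that an analyst edits this short document, while the generic code that interprets it never changes; the names and structure are mine, for illustration only:

```python
import json

# Hypothetical process definition an analyst might edit in a graphical tool;
# the application "code" is this short document, not hand-written logic.
definition = json.loads("""
{
  "name": "simplified_patient_encounter",
  "tasks": ["take_vitals", "document_history", "finish_encounter"]
}
""")

# A generic interpreter (shared across all processes) reads the definition.
for step, task in enumerate(definition["tasks"], start=1):
    print(f"Step {step}: {task}")
```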

To close the loop between my workflow discussion and my JASON architecture discussion, both seek to manage software complexity in different, but complementary ways. So, why not use both? Use workflow tech to implement the JASON architecture.

