From Data and Information Governance to Workflow and Process Governance

Have you heard of the phrase “photobomb”? It’s the unexpected appearance of an unintended subject in the camera’s field of view as a picture is taken. Well, just call me the “workflow bomber” because I keep showing up at data-oriented events shouting about workflow.

That’s me, in the lower right. (And on the right? That’s one of my more data-oriented #HITsm colleagues looking at me, well, owlishly…)

Today’s #HITsm tweet chat topic, hosted by @ErinHead_HIM, is Data and Information Governance. In the questions below, I replaced “data” and “information” with “workflow” and answered THOSE questions.

Topic 1: Do you think there is a difference between #InformationGovernance and #DataGovernance in healthcare versus #WorkflowGovernance? Why or why not?

There’s a big, big difference between data and information governance versus workflow governance (though less so between information and workflow governance than between data and workflow governance).

But workflow governance is extremely important to data and information governance, as we shall see.

Data Governance

“Data governance (DG) refers to the overall management of the availability, usability, integrity, and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures, and a plan to execute those procedures.”

Information Governance

“Information governance is a holistic approach to managing corporate information by implementing processes, roles, controls and metrics that treat information as a valuable business asset.”

One could argue that IG overlaps with WG, and it does, because it specifically mentions processes, which are almost synonymous with workflows, depending on context. However, I think it’s worth taking a look at BPM governance (BPM, or Business Process Management, is the umbrella term for modern workflow and related technologies).

There are five key elements to BPM governance:

  • Measurement of workflow performance (cycle time, throughput, bottlenecks, rework, errors, etc.; see the sketch after this list)
  • Ownership (who has the right to create and execute workflows)
  • Accountability (how are process owners held accountable for workflow performance)
  • Control (how much control do owners have over workflows?)
  • Support (how are workflow creators, managers, and executors supported to make sure workflows perform at their best?)
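To make that first element, measuring workflow performance, a bit more concrete, here is a minimal sketch against a simple, hypothetical event log. The clinic steps, field names, and timestamps are all invented for illustration; they are not from any particular system.

    from datetime import datetime
    from statistics import mean

    # Hypothetical event log: (case_id, step, started, completed).
    event_log = [
        ("visit-1", "check-in", "2015-07-01 09:00", "2015-07-01 09:05"),
        ("visit-1", "rooming",  "2015-07-01 09:05", "2015-07-01 09:20"),
        ("visit-1", "exam",     "2015-07-01 09:25", "2015-07-01 09:45"),
        ("visit-2", "check-in", "2015-07-01 09:10", "2015-07-01 09:30"),
        ("visit-2", "rooming",  "2015-07-01 09:35", "2015-07-01 09:50"),
    ]

    FMT = "%Y-%m-%d %H:%M"

    def minutes(start, end):
        return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

    # Cycle time per case: first start to last completion (these timestamps sort lexically).
    spans = {}
    for case, step, started, completed in event_log:
        first, last = spans.get(case, (started, completed))
        spans[case] = (min(first, started), max(last, completed))

    cycle_times = {case: minutes(first, last) for case, (first, last) in spans.items()}
    print("Cycle times (min):", cycle_times)
    print("Average cycle time (min):", mean(cycle_times.values()))

    # Bottleneck candidates: average duration of each step.
    durations = {}
    for _, step, started, completed in event_log:
        durations.setdefault(step, []).append(minutes(started, completed))
    print("Average step duration (min):", {s: mean(d) for s, d in durations.items()})

Governance then comes in around questions like who owns this log, who is accountable for the numbers it produces, and who is allowed to change the workflows it measures.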

Topic 2: Who do you see as the proper owner(s) of #WorkflowGovernance in an organization and why?

As is the case with data and information, ownership is a thorny issue. Workflows often span departmental fiefdoms, so breaking down workflow silos is essential to beginning-to-end workflow ownership and accountability. Sometimes the owner is whoever is responsible for the most important or most numerous steps in a workflow. Sometimes ownership becomes the responsibility of a “Workflow Council” (which sort of reminds me of hospital form design committees).

Topic 3: Of the #WorkflowGovernance drivers listed, which do you think are the most important and why? Which drivers are missing?

Here’s the thing. For different organizations, at different times and in different contexts, any of the following drivers may become the most important. One of the primary values of workflow technology is the ability to quickly change workflows in response to evolving conditions, such as, in this case, changing governance drivers. In this you can see workflow tech’s unique contribution. Data is static and tactical. Workflow is dynamic and strategic. Increasingly, data and information governance will be made possible by workflow governance, because it is workflow that changes data. In other words, future systems of data and information governance will increasingly rely on systems of workflow and process governance.

Drivers of Workflow Governance:

  • Regulatory Compliance
  • Improve patient safety/patient care
  • Need to manage and contain costs
  • Need for clinical, quality, and/or business analytics
  • Changing payment environment
  • Need for increased standardization
  • Need to integrate and/or improve systems and technologies
  • New care delivery models (population health management)
  • Lack of trust or confidence in workflow and processes

Topic 4: How do you see #WorkflowGovernance affecting Population Health? What technologies are involved?

Great question! As I’ve written about in the past, in several blog posts, the most mature population health solutions today invariably have some sort of workflow engine powering a care coordination platform.

Topic 5: What do you think the workflow risks are for an organization without a formal #WorkflowGovernance framework?

Just as there are many risks to an organization without a system of data and information governance, there are many risks without a system of workflow governance. However, since most healthcare organizations have immature workflow technology, these risks are themselves both immature and underappreciated. Workflow governance will increasingly attract attention as healthcare organizations adopt more sophisticated interior-facing and exterior-facing workflow platforms.

I’ll highlight just one risk, unique to workflow technology: “workflows gone wild.” As it becomes easier and easier to create new workflow applications, without having to be a Java, C#, or MUMPS programmer, there tends to be an explosion of workflow creation. The combination of pent-up need, creativity, and easy-to-create workflows results in a dramatic increase in workflows, which may not work together well. So some sort of central authority is needed, sometimes called a CoE, or Center of Excellence, in the BPM world.


Twitter, Periscope, Health IT, and Patient Experience

This post is prompted by today’s #HITsm tweetchat Crowdsourcing Patient Experience on Social Media.

I’ve fallen hard for Periscope, Twitter’s new live video streaming app. Despite a long list of “But…”s (privacy, flaky clients, low-rez video, difficulty finding the best videos in real time, trolls…), the idea itself — “Explore the world in real time through someone else’s eyes” — is great, perhaps even, dare I say, revolutionary. For example, yesterday I explored the world of EHR and health IT medical office workflow through the eyes of a patient and her physician. (By the way, the Periscope link is only good for 24 hours, so it will cease to work today around 2 PM EST. See further below for the YouTube archive. The Periscope version is to be preferred, because it includes comments and hearts.) I’ve surfed off the coast of Australia. I’ve admired kittens online (now, that IS revolutionary!).

For five years I’ve been messing around with almost real-time wearable video streaming. I started with Looxcie (now defunct), a small camera I clipped to my baseball cap. What I really wanted was almost real-time wearable video streaming, but I could never quite pull it off. Under the right conditions, I could sometimes stream short stretches of very low-rez mobile video to a couple of people with the right clients installed. I could get somewhat higher-rez videos uploaded and tweeted within reasonably short turnaround times, say within 30 minutes.

I know, this wasn’t anywhere near real time, but sending a series of these out during a 3-4 day conference felt more real-time than the alternatives. With Google Glass I finally got my One-Minute Interviews from health IT conferences down to a couple of minutes after shooting the video. I’d shoot the video. Begin the video upload on the spot. Get a notification a minute later. And tweet.

Ironically, many didn’t like Glass because they thought you were live streaming everything, which it wasn’t very capable of doing for technical reasons. Then Periscope came along. Now folks are watching 40 years of live-streamed Periscope mobile video every day. A Glass/Periscope app would have been a killer app. I’ve actually tried to figure out how to use Periscope without having to hold the smartphone. I used a 3D printer to make an attachment for a photographer’s tripod. Did the same for a bike-handlebar attachment. Walked around with my smartphone peeking out of my pocket. Bought a fisherman’s chest pack (like a small backpack) and adapted it. Investigated upper-arm jogging straps for smartphones. Even considered a sort of hat thing that would suspend my smartphone over my right eye, a la Google Glass.

At the same time, I’ve been watching how other people are using Periscope. Just like on Twitter, there is a lot of uninteresting crap to wade through. But just like Twitter, when you find the right accounts, its sense of transport, of exploring the world in real time through someone else’s eyes, is amazing. I’ve surfed in Australia, multiple times. That’s amazing. I can take or leave puppy and cat videos on YouTube, but I enjoy them so much more on Periscope. Because it is in real time and unscripted, I feel like, for the moment, that is MY puppy or kitty, because I and other viewers are contributing our thoughts and feelings, and the ’scoper is reacting in kind, in the moment, from, on average, thousands of miles away.

Just as with Twitter, I’ve used Periscope for both professional and personal purposes. I’ve streamed entire 50-minute presentations, to all of two people. Afterwards I uploaded them to YouTube and was glad I did. (Note, the YouTubed versions of Periscope videos lack comments and hearts, which are an important part of the Periscope experience.)

I’ve interviewed health IT colleagues at health IT conferences.

But I’ve mostly used Periscope for personal purposes. I live in Washington DC, a fun and interesting place. I ’scope walks among its monuments, vacation adventures, and strolls through my neighborhood (“Feral cat!”).

Sometimes I know the viewers (in the sense I know people on Twitter I’ve never met but still consider friends) and sometimes I don’t (Wow, someone from New Zealand is watching planes land at Reagan National Airport with me!). I think you get the idea. I really like Periscope, and the general idea of live streaming. But how might Periscope be relevant to patient experience? I first started thinking about this when I wrote my first blog post about Periscope: Periscope Helped Me Change My Hubcaps! Real-time Crowdsourced Healthcare Social Media Problem-Solving?

That post described how surprised I was when I frivolously Periscoped installing replacement hubcaps. I had no expectation of anything more than a bit of fun and a couple of viewers. Forty-four showed up. I ran into problems installing the hubcaps. One of the viewers practically came through my smartphone when I decided to give up. He was in the “motor trade” from Birmingham, UK. He told us, step by step, how to successfully install our hubcaps. And we did.



At the very end of that post I wrote:

“Could this work, or be adapted, in healthcare? Obviously, there are all kinds of privacy, security and real-time search issues. But just imagine, if those obstacles could be overcome…. A doctor could Periscope, “I’m looking at a skin lesion I’ve never seen before #dermatology” and a thousand physicians tune in. A patient could Periscope, “I’m about to get a diagnosis I’m afraid of…. #cancer” and a thousand patients tune in, take notes, and offer support. Of course, I know, privacy, security and real-time search… but… What if?!”

While I’ve been exploring Periscope, @Jimmie_Vanagon, an internist in Montana, has too. Similarly, his ’scopes span personal and professional topics. I’ve watched hawks in his backyard, his wife kiln (verb?) and sell pottery, and his father-in-law harvest wheat from within giant John Deere combines.

Dr. Vanagon (he likes euro vans) has also used Periscope to explain how he has set up his EHR and medical office workflows. (My interest in this subject is legendary among an occasionally somewhat horrified but thankfully, apparently, bemusedly tolerant corner of the health IT Twitterverse.)

In the following video, uploaded from Periscope to YouTube after the fact, Dr. V interviews one of his patients, Martha, about her impressions of how he uses health IT in his practice. By the way, while the YouTube version is good, it’s not nearly as good as the original Periscope, which included lots of comments (to which he responded) and hearts (which he appreciated). I think it is a good example of what is currently possible regarding the use of Periscope to explore patient experience, HIT, and workflow.



I think Periscope will be a lot like Twitter. It will be used in a great variety of ways that cannot be foreseen.

Folks have asked me how Periscope is better than other video streaming options.

Folks have asked me how Periscope could possibly be the best thing since sliced bread as a leadership tool.

I have asked myself, how can Periscope be used in health IT social media?

Great question!

P.S. A great way to watch some examples of Periscope videos is to just click on this link. You won’t be able to comment; that requires an Android or iOS client. Skip around until you find something compelling. It’s early days, wild and woolly! Then join the party!

Trade-offs, Preferences, Utilities, Part Two: A Conceptual Framework on Value of Cancer Care

Tonight the Healthcare Leadership (#HCLDR) Blog community will discuss the American Society of Clinical Oncology’s proposed Conceptual Framework on Value of Cancer Care. Preparing to participate in HCLDR tweet chats is a great way to review areas of interest I’ve not thought about in a while. I mostly think about healthcare IT workflow. So HCLDR usefully takes me outside my mental box, so to speak, though I did find some interesting healthcare and IT workflow wrinkles!

The value to patients and society of cancer treatments depends on the preferences of various stakeholders. And closely tied to value and preference is the economic concept of utility. So I also recommend this white paper, What Are Health Utilities?


But first I read A Conceptual Framework to Assess the Value of Cancer Treatment Options.

While the title includes “Value,” the paper is motivated by the high cost of cancer treatment. Since I have an accountancy undergraduate degree, I’m interested in the variety of ways healthcare measures (or fails to measure) cost, and the implications for clinical and health policy decision-making. If cancer treatments cost a lot, then we need to know which treatments are most valuable to patients and society.

Coincidentally, several weeks ago the HCLDR topic was trade-offs in healthcare. I did my research and wrote Healthcare Trade-offs, Shared Decision Making, Vulcan Mind-melds, and a Marriage Metaphor.

“A Conceptual Framework to Assess the Value of Cancer Treatment Options” is also about trade-offs. That’s why my title for this blog post starts off “Trade-offs, Preferences, Utilities, Part Two: ….” The very interesting Conceptual Framework paper is about value, and value cannot be understood without understanding patient preferences. Different cancer patients have different preferences about such things as quality of life versus length of life, and perhaps even the financial consequences of treatment (“financial toxicity”).

In my previous post I explained how expected utility decision theory is the gold standard for combining probabilities (of whether a treatment will work or not) and patient utilities (preferences for possible consequences of treatment, pain, longevity, etc.). I also noted problems with formal methods such as expected utility approaches to shared medical decision making.
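To make that concrete, here is a minimal sketch of how expected utility combines probabilities and patient utilities for two hypothetical treatment options. All of the probabilities and utility values below are invented for illustration; real values would come from clinical evidence and from eliciting the individual patient’s preferences.

    # Each treatment maps possible outcomes to (probability, patient utility on a 0..1 scale).
    treatments = {
        "aggressive chemotherapy": {
            "remission":                  (0.40, 0.90),
            "partial response":           (0.35, 0.60),
            "no response, high toxicity": (0.25, 0.20),
        },
        "palliative care": {
            "good symptom control": (0.70, 0.70),
            "poor symptom control": (0.30, 0.40),
        },
    }

    def expected_utility(outcomes):
        # Expected utility = sum over outcomes of (probability x utility).
        return sum(p * u for p, u in outcomes.values())

    for name, outcomes in treatments.items():
        print(f"{name}: expected utility = {expected_utility(outcomes):.2f}")

The arithmetic is trivial; the hard part, as noted above, is getting trustworthy utilities to plug in.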

“Conceptual Framework…” is a high-level document. However, it mentions a variety of formal cost-benefit models in Table 1, which summarizes how other countries measure the costs and benefits of treatments. So I decided to drill down into the cancer patient preference topic, to see what the state of the art has become since I studied this subject in graduate and medical school. The best three papers (that I could find and read, and for which full PDF text is available for free on the Web) are the following:

In addition to What Are Health Utilities, I also highly recommend Preferences For Cancer Treatments: An Overview Of Methods and Applications in Oncology. Published in reasonably recent 2012, Blinman et al. include discussion of expected utility theory, similar to what I covered in my previous posts on trade-offs. A particular application of expected utility theory, the Standard Gamble, is “often considered to be the gold standard of [patient] preference assessment.” They also cover Time Trade-Off, Discrete Choice Experiments, and Multi-Attribute Utility Instruments. As I noted in my previous blog post, I was sure there had been many interesting developments in estimating patient preferences and utilities since I studied the subject decades ago. And this paper is a great way to catch up on this topic.
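For readers unfamiliar with these elicitation methods, here is a formula-level sketch of the Standard Gamble and Time Trade-Off. The indifference values below are illustrative only; a real elicitation would iterate with the patient to find the actual indifference point.

    def standard_gamble_utility(p):
        """Standard Gamble: the patient is indifferent between living in the health
        state for certain and a gamble with probability p of full health and
        (1 - p) of immediate death. The utility of the state is then p."""
        return p

    def time_tradeoff_utility(years_full_health, years_in_state):
        """Time Trade-Off: the patient is indifferent between x years in full health
        and t years in the health state. The utility of the state is then x / t."""
        return years_full_health / years_in_state

    print(standard_gamble_utility(0.85))   # indifferent at p = 0.85 -> utility 0.85
    print(time_tradeoff_utility(8, 10))    # 8 healthy years ~ 10 years in state -> 0.8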

The second paper, Patient Preferences in Choosing Chemotherapy Regimens for Advanced Non-Small Cell Lung Cancer, is a patient research survey study. The authors conclude “our study findings indicated that lung cancer patients have significant and varying concerns about the side-effects of chemotherapy and that these concerns are not being uniformly identified by physicians or integrated into decisions about treatment plans…. Further prospective studies of this issue should include exploration of both patient concerns and preferences and physician perceptions about tailoring chemotherapy regimens according to the adverse reactions they may cause.”

Deriving A Preference-Based Utility Measure For Cancer Patients… is a technical paper. It describes a method to derive a Health State Classification System, a collection of core domains of health-related quality of life (e.g., fatigue, pain, nausea) and associated levels (e.g., poor, moderate, good), and then a scoring algorithm that assigns a utility value to each possible health state. If you recall, in my previous post on healthcare trade-offs, I remarked how difficult it is to estimate the utilities to be plugged into expected utility models of shared decision making. This research addresses that problem.
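Here is a toy sketch of what a health state classification system plus scoring algorithm might look like. The domains, levels, and decrement weights are invented, not those derived in the paper; real instruments derive such weights from preference studies.

    # Additive scoring: start at full health (1.0) and subtract a decrement per domain level.
    DECREMENTS = {
        "fatigue": {"good": 0.00, "moderate": 0.08, "poor": 0.20},
        "pain":    {"good": 0.00, "moderate": 0.10, "poor": 0.25},
        "nausea":  {"good": 0.00, "moderate": 0.05, "poor": 0.15},
    }

    def utility(health_state):
        return 1.0 - sum(DECREMENTS[domain][level] for domain, level in health_state.items())

    print(utility({"fatigue": "moderate", "pain": "good", "nausea": "poor"}))  # 0.77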

Finally, since I’m a health IT guy, and always, always, looking for the workflow and usability angle, I’ll quote the following from the ASCO Conceptual Framework paper:

“The complexity of the value framework makes it clear that for it to eventually be used effectively in a practice setting, the information must be presented in a visually appealing, user-friendly way and acquired almost immediately. Thus, our vision entails preloading data for all regimens to be evaluated, and that of their comparators, into user-friendly software that can be used on a smart phone, tablet, or computer and integrated into the electronic medical record. The tool that is envisioned will include the key elements discussed here for clinical benefit and toxicity for the majority of commonly used cancer regimens in a variety of clinical scenarios and will permit incorporation of patient weighting preferences. For example, if, in the advanced disease setting, longevity is less important to a patient than freedom from toxicity, the tool should be able to adjust the clinical benefit and toxicity parameters to reduce the impact of clinical benefit and enhance the impact of toxicity, thereby producing a personalized NHB. The ability to modify the framework at the point of care would facilitate decision making by enabling patients to create a personalized NHB score that takes into account not only the specific clinical problem but also existing comorbidities, personal preferences, and values. In addition, access to the cost of the regimen in question and the patient’s out-of-pocket costs will provide additional context to the physician and patient in determining the relative value of treatment options.”
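As a thought experiment, here is one possible way such patient weighting could be applied to produce a personalized score. The base scores and the weighting scheme below are invented for illustration; they are not ASCO’s actual scoring rules.

    def personalized_nhb(clinical_benefit, toxicity, benefit_weight, toxicity_weight):
        """Weight a regimen's clinical-benefit score against its toxicity score
        according to the patient's stated priorities (weights sum to 1)."""
        return benefit_weight * clinical_benefit - toxicity_weight * toxicity

    regimen = {"clinical_benefit": 70, "toxicity": 40}  # hypothetical regimen scores

    # A patient who values freedom from toxicity over longevity:
    print(round(personalized_nhb(regimen["clinical_benefit"], regimen["toxicity"], 0.35, 0.65), 1))  # -1.5
    # The same regimen for a patient who prioritizes longevity:
    print(round(personalized_nhb(regimen["clinical_benefit"], regimen["toxicity"], 0.65, 0.35), 1))  # 31.5

The same regimen scores very differently depending on the patient’s weights, which is exactly the point of a personalized NHB at the point of care.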

This topic fascinates me. It touches on some specific interests of mine. There’s an interesting connection between representing clinical protocols and clinical workflows. And clinical protocols, especially those with decision branch points, resemble decision trees such as I discussed in Part One of this series on trade-offs. So, naturally, I wonder if there might be a role for workflow technology at the point-of-care, to effect the kind of shared clinical decision-making decision support envisioned here…

That’s it! As usual, I learned lots preparing for a #HCLDR tweetchat. I look forward to it tonight, and every Tuesday night at 8:30 EST. I hope I’ll see you there!

I’m Going on Vacation, To The New York Finger Lakes: Past Photos!

I usually only blog or tweet about workflow (80%?) or non-personal personal stuff (weird news, archeology, random science). The exception is when I go on vacation. I’m a bit of a photography nut… not very good… just a nut. Since I’m heading up to the Finger Lakes next week, and taking and tweeting lots of photos (plus some ‘scopes, via Periscope), I thought I’d prime the pump with this collection of past tweeted Finger Lakes photos.

Thank you for your patience, until the 16th, when I’ll be back to healthcare workflow and workflow tech, 24/7!

My Five-Part Series on Interoperability Is In Healthcare IT News This Week

7,000 words on the missing layer of interoperability that no one talks about: workflow interoperability! You may have seen my post on pragmatic interoperability. It’s workflow tech that will make pragmatic interoperability possible. That’s why I call it workflow interoperability around health IT folks; it’s less scary. 🙂 Pragmatics is a term from linguistics, as syntax and semantics are, but is lesser known in the health IT realm (I’m ABD — all but dissertation — in computational linguistics). I start by defining task interoperability, and then use that definition to build a definition of workflow interoperability.

The next five years will see enormous investments in time, energy, and money, climbing an important learning curve from the data-centric notions of interoperability toward workflow-centric notions: from essentially fancy database management systems toward true workflow management systems, whose modern manifestation is Business Process Management (BPM) application platforms.

By the way, I am extremely gratified by how widely my series was distributed via social media (almost 150 shares on LinkedIn, over 400 on Twitter), as well as the many public and private comments of praise and encouragement. Viva la workflow!

  1. Achieving task and workflow interoperability in healthcare (Monday)

  2. A look at what healthcare task interoperability means (Tuesday)

  3. Laying down a definition of workflow interoperability (Wednesday)

  4. Bridging the gap between healthcare data and healthcare workflow (Thursday)

  5. Achieving workflow interoperability among healthcare organizations (Friday)

Coincidentally, tonight’s Healthcare Leadership (#HCLDR) tweetchat is about interoperability! My answers to its four questions are a sort of highly condensed executive summary of my entire five-part series: Healthcare Data Interoperability and Workflow Interoperability: Four Questions (and Answers).

Healthcare Data Interoperability and Workflow Interoperability: Four Questions (and Answers)

This post is prompted by this week’s Healthcare Leadership (#HCLDR) tweetchat. It’s at 8:30 EST on Tuesday. The following questions come from @JoeBabiain‘s tee-off post, The Challenge of True Interoperability and Why It Matters. (By the way, please also check out my five-part series on Healthcare IT News this week: Achieving Task and Workflow Interoperability in Healthcare.)


T1: How urgent is the need for true healthcare data interoperability and why?

Even more urgent than data interoperability is workflow interoperability. The latter is a layer of interoperability above the former. Data interop is about getting messages from one system to another and having them mean the same thing in both systems. Workflow interop is about messages having the effect intended by the sender. Much health IT investment and software development activity today is putting down a layer on top of legacy EHR and health IT systems. Much of this activity is about getting to workflow interoperability.

To some extent, data interoperability is a prerequisite for workflow interoperability, but not completely. Workflow interop can strategically compensate for problems at the data interoperability level. For example, workflow technology can escalate data interop problems for more intelligent automated handling or even human intervention. Consider this extreme example. Before data interoperability existed between EHR and health IT systems, some degree of workflow interoperability already existed. How is this possible? When a physician clicked a button to send a document to another organization, she or he did not care how this was accomplished, merely that it was accomplished.

Before any data interop even existed, humans did the necessary work to achieve workflow interoperability. It was inefficient — involving copy machines, faxes, phone calls and sneaker net — but it was intelligent. Workflows were intelligent because the humans carrying them out were intelligent. They understood the purpose of the communications, so they tried to do what was necessary to achieve the intent of the communications. Today we have increasing data interoperability, but we’ve lost some of that intelligent workflow processing along the way. We need to marry together both data and workflow interoperability to get where we need to go.

Workflow interoperability essentially requires models of workflow and work, and their automated interpretation by workflow engines, AKA orchestration or process engines. Therefore workflow interoperability requires workflow technology.
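To make the idea tangible, here is a toy sketch of a workflow model being interpreted by a tiny “engine,” including the escalation-to-a-human behavior described above. The task names, handlers, and simulated failure are all hypothetical; real workflow engines are far richer than this.

    def send_document():
        pass  # imagine the actual document transmission happening here

    def confirm_receipt():
        raise ValueError("unparseable acknowledgment")  # simulated data-interop failure

    def schedule_consult():
        pass

    referral_workflow = [
        {"task": "send referral document", "handler": send_document},
        {"task": "confirm receipt",        "handler": confirm_receipt},
        {"task": "schedule consult",       "handler": schedule_consult},
    ]

    def run(workflow, worklist):
        for step in workflow:
            try:
                step["handler"]()
                print("done:", step["task"])
            except Exception as problem:
                # Escalate to a human worklist rather than silently dropping the case,
                # then stop and wait for the human to resolve it.
                worklist.append((step["task"], str(problem)))
                print("escalated:", step["task"], "-", problem)
                return

    human_worklist = []
    run(referral_workflow, human_worklist)
    print(human_worklist)  # [('confirm receipt', 'unparseable acknowledgment')]

The point is that the workflow model, not hardcoded application logic, decides what happens next, which is what makes the workflow transparent, changeable, and governable.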

T2: What experiences have you had with lack of interoperability?

My experience with lack of interoperability is that of a programmer working on EHR interoperability for a vendor. Typically, our EHR customer would approach us about interfacing with some source or recipient of patient data, such as clinical labs, e-prescribing networks, or vaccine registries. So my experience was that of moving from a state of data and workflow non-interoperability to a state of data and workflow interoperability via the use of interface engines, message parsers, and configuration of incoming and outgoing patient data workflows.
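For readers who haven’t seen the raw material that interface engines and message parsers chew on, here is a minimal, hand-rolled illustration of HL7 v2 parsing. The message below is fabricated, and real feeds are messier and are handled by interface engines, not code like this.

    # HL7 v2 basics: segments separated by carriage returns, fields by "|", components by "^".
    hl7_message = "\r".join([
        "MSH|^~\\&|LAB|ACME|EHR|CLINIC|201507091230||ORU^R01|12345|P|2.3",
        "PID|1||000123^^^ACME||DOE^JANE||19700101|F",
        "OBX|1|NM|GLU^Glucose||98|mg/dL|70-99|N",
    ])

    def parse_hl7(message):
        """Split an HL7 v2 message into {segment_name: [list of field lists]}."""
        segments = {}
        for segment in message.split("\r"):
            fields = segment.split("|")
            segments.setdefault(fields[0], []).append(fields)
        return segments

    parsed = parse_hl7(hl7_message)
    obx = parsed["OBX"][0]
    # OBX-3 is the observation identifier (caret-delimited components),
    # OBX-5 the value, OBX-6 the units.
    print(obx[3].split("^")[1], "=", obx[5], obx[6])  # Glucose = 98 mg/dL

Getting the fields across is the data interop part; making sure the result lands in the right person’s worklist and triggers the right follow-up is the workflow interop part.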

T3: Do you see incumbent providers willingly getting “on board” or will further market forces/regulation come into play?

First of all I’ll assume by “getting ‘on board'” you mean “Ready to participate or be included; amenable.”

I’m not sure if you mean for “providers” to mean clinicians, or providers of EHRs and interoperability solutions. However, it is an interesting question in all three respects. In all three cases, providers are essentially already “on board.” Healthcare interoperability is like motherhood, apple pie, and the American flag. Everyone is for healthcare interoperability. The main exception is that EHR vendors get a lot of flak because they are perceived to be “information blocking.”

While I am sure that there is some of this going on, on the whole, I don’t believe this is the main, root cause of healthcare’s “interoperability problem.” (I also know I am probably in the minority in this view.)

The contentious nature of healthcare interoperability, especially regarding blaming corporations and/or the government, reminds me of some discussions of why HIEs (Health Information Exchanges) aren’t as successful as everyone hoped. The reasonable, and in my opinion accurate, view has been that there’s been a lack of sustaining business models. Exactly the same point can be made about why we’ve not better achieved healthcare data (and workflow) interoperability. If we can figure out how to better incent healthcare interoperability, then we’ll make better progress, goes an increasingly popular view.

Part of the reason we lack sustaining healthcare interoperability business models is that obsolete workflow-oblivious technology makes even the possibility of interoperability too expensive. This is the link between business models and technology models I’ve written about before.

The point I’d like to make here is that business models do not exist in a vacuum. They rely on technology models. Our current health IT infrastructure is notoriously (in my mind, and in more and more other minds as well) “workflow-oblivious.” Without what academics call “process-aware” information systems, data and workflow interoperability are simply too difficult a problem to solve effectively and efficiently. In other words, a major reason healthcare interoperability has not been forthcoming is that our fundamental health IT infrastructure lacks the architectural characteristics necessary for transparent and flexible workflows within and among healthcare organizations. So, the healthcare interoperability problem still boils down to a lack of sufficiently sophisticated workflow technology within and among healthcare data-exchanging partners.

T4: What can we as healthcare leaders do today to change the current state of interoperability? Can it be done?

Yes, workflow (and data) interoperability in healthcare can indeed be accomplished. Open discussion forums such as the weekly #HCLDR (Healthcare Leadership) tweetchat are extraordinarily important to getting to true workflow interoperability in healthcare.

I am sure most readers of this post are familiar with the healthcare Triple Aim (and related Quadruple Aim). I have a Healthcare Workflow Triple Aim, which I wrote about in my Health IT Workflow Silo post in HL7Standards.

  1. Educate healthcare leaders, clinicians, health IT, and healthcare social media influencers about healthcare workflow and workflow technology.
  2. Highlight healthcare workflow and workflow interoperability successes, in terms of healthcare organizations, IT vendors, and stakeholders.
  3. Recruit the best minds in workflow technology, from both inside and outside of healthcare, into accelerating use of process-aware technologies to facilitate true workflow interoperability.

Are we making progress in regards to the Healthcare Workflow Triple Aim?

Yes!

I see plenty of evidence that all three legs of the Workflow Triple Aim are materializing into existence and lots of wonderful synergies are occurring between them! (Boy is that a mixed and mangled metaphor!)

My evidence? Too much to go into here, but I’ll leave you with this morning’s Today’s Thought… 🙂

… which I wrote before I knew I would be writing this post.

Anyway, I look forward to tomorrow evening’s #HCLDR discussion of interoperability!