NIST EMR / EHR Usability Workshop: A Highly Annotated Tweetstream

Short link: http://ehr.bz/73

A.S. Check out the EHR.BZ REPORT on Workflow, Usability and Productivity.

June 7th I attended the well-run and thought-provoking NIST EHR Usability workshop in Gaithersburg, Maryland (http://ehr.bz/nistux). I tweeted my notes. I’ve tried this before with mixed results (Dr. G’s Workflow Management EMR presentation at HIMSS, Tweeting Live from HIMSS, Tweeting Live from Process.gov).

I decided to try again!

So I…

  • Tweeted my “notes.”
  • Copied the tweets from Twitter.
  • Reversed their order (“tail -r tweets.txt” for you UNIX folks).
  • Pasted them into WordPress.
  • Edited for readability.
  • Added more thoughts and material such as links and specific slides.
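For non-UNIX folks, the reversal step above can be sketched in Python ("tail -r" is BSD-specific; GNU systems typically use "tac"). File contents here are illustrative:

```python
# Reverse the order of copied tweets (newest-first -> oldest-first),
# the Python equivalent of BSD "tail -r tweets.txt" (or GNU "tac").
def reverse_lines(text: str) -> str:
    lines = text.splitlines()
    return "\n".join(reversed(lines))

tweets = "third tweet\nsecond tweet\nfirst tweet"
print(reverse_lines(tweets))
```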

This is a long (and sometimes meandering and ruminative) document, so here is a table of contents if you’d rather proceed directly to one or other presentation topic.

I should make a disclaimer. I am biased. Lack of EMR usability has more to do with document-based versus process-based approaches to building EMRs than with government versus industry approaches to driving EMR innovation. Most current EMRs rely on structured documents and unstructured processes. Until EMR users are given tools to model, execute, monitor and systematically improve the standardizable processes generating the structured documents (and potentially redirect those processes on the fly), lack of usability will continue to slow EMR adoption. (See EHR/EMR Usability: Natural, Consistent, Relevant, Supportive, Flexible Workflow) Whether government or industry or both accomplish this, and how, is an important debate, but it is more about ideology than about what is technically retarding EMR usability.

Let the tweets begin!

Looking forward to the NIST EHR / EMR Usability workshop tomorrow

nist-ehr-emr-usuability-screen

I’ve arrived at the NIST EMR / EHR usability workshop, my gadgets are charged, I have my coffee, presentations are about to begin …

By the way, I read NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records to prepare

Today I’m attending Measuring, Evaluating & Improving Usability of EHRs EMRs Workshop NIST Gaithersburg, MD

A Community-Building Workshop: Measuring, Evaluating and Improving the Usability of Electronic Health Records, June 7th, 2011, Gaithersburg, MD

Original announcement and agenda:

http://ehr.bz/nistux (cached announcement and agenda)

Opening remarks A (cached)

Welcome to: A Community-Building Workshop: Measuring, Evaluating and Improving the Usability of Electronic Health Records

Opening remarks B (cached)

A Community-Building Workshop: Measuring, Evaluating and Improving the Usability of Electronic Health Records

NIST: innovation/competitiveness by advancing measurement/standards/tech 2 enhance econ security/quality of life

Cramming NIST’s mission statement into 140 tweetable characters resulted in something a bit terse, so…

NIST’s mission:

“To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.”

Why is Improved Usability of EHRs Important?

Starting: Intro 2 workshop

Usability: strong/direct relationship w/ clinical productivity, error rate, user fatigue & satisfaction

Also a bit terse, but the cool thing is that these were great search terms to find the original quote

“usability is one of the major factors—possibly the most important factor—hindering widespread adoption of EMRs. Usability has a strong, often direct relationship with clinical productivity, error rate, user fatigue and user satisfaction–critical factors for EMR adoption. Clinicians lose productivity during the training days and for months afterward as they adapt to the new tools and workflow. Some productivity losses are sustained, mostly due to longer time needed for encounter documentation in complex patients” (Defining and Testing EHR Usability)

ONC goals include EHR usability transparency

There were several other goals regarding usability, but I wasn’t quick enough, so, from presentation slides distributed later…

ONC’s Goals

  • Improve transparency on usability
  • Promote technology that fully supports care
  • Identify and address potential safety issues, but also factors that affect efficiency and effectiveness, user satisfaction, etc.
  • Enable constructive innovation

#ehrusability is the hashtag for NIST EHR usability workshop

Introduction: The Promise of EMRs

Presentation PDF / Cached

@c_wb Starting: Introduction: Why is Improved Usability of EHRs Important?

one pediatric growth chart takes 8 clicks to get to open…

Wish I had a nickel for every time “click” was mentioned at this workshop. It was also on the minds of folks not attending the workshop (pro and con: here, here, here, here, here and here are representative).

The number of clicks is a fairly superficial measure of (lack of) usability; three easy, fast, automatic clicks can be more usable than one long tortuous click. However, cognitive effort is harder to measure than physical user events, so it’s a convenient surrogate. This point was made several times by members of the audience. I suspect that most people attending the workshop agree, but that “click” is shorthand for a wide variety of data and order entry costs: time (to target), effort (to find target), and error (if missed).
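That “time (to target)” cost can be made concrete with Fitts’s law, which predicts pointing time from target distance and width. A minimal sketch; the a and b constants below are illustrative, not measured values:

```python
import math

def fitts_time_ms(distance: float, width: float,
                  a: float = 100.0, b: float = 150.0) -> float:
    """Predicted movement time in ms: MT = a + b * log2(distance/width + 1).
    a and b are device/user-specific constants (illustrative values here)."""
    return a + b * math.log2(distance / width + 1)

# Three easy clicks on big, close targets...
easy = 3 * fitts_time_ms(distance=100, width=50)
# ...versus one "long tortuous click" on a tiny, distant target.
hard = 1 * fitts_time_ms(distance=900, width=5)
print(round(easy), round(hard))
```

With these (hypothetical) constants, the single hard click is slower than all three easy ones combined, which is the point: counting clicks alone misses the per-click cost.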

Perhaps entering data into an EHR/EMR should resemble playing a musical instrument. A pianist can effortlessly click on a lot of keys in a very short period of time.

poor usability can cause medical errors. We need to figure out how to test, identify and prevent them

The workshop focussed more on error minimization and patient safety than on other possible usability goals such as speed, productivity and user satisfaction, though each of these topics was indeed represented in presentations and discussion. The following slide is from a later presentation. Notice that “critical errors that impact patient safety” and “errors and failures” are highlighted in red.

summative-ehr-usability-test-plan

Overview of Current Programs for Improving EHR Usability (at NIST)

Starting: Overview of Current Programs for Improving EHR Usability

NIST EHR Usability Program

Presentation PDF / Cached

Starting: NIST EHR Usability Program

NIST funds EMR usability research, such as Human Factors Guidance to Prevent Healthcare Disparities with EHRs http://1.usa.gov/lXcPYT

Resources for the Regional Extension Centers

Presentation PDF / Cached

Starting: Resources for the Regional Extension Centers Usability of EMRs

accessibility and usability of data was a theme of 2010 HITPC Meaningful Use hearings

Themes that Emerged from 2010 HITPC/ Meaningful Use Workgroup Hearings

  • Achieve 4 Es: engage, educate, empower, and enable
  • Meet needs of diverse population
  • Accessibility/usability of data – Need for mobile apps (esp. for vulnerable populations) – Contextualizing information – Multiple languages – Compatible with assistive technologies
  • Patient-provider secure messaging
  • Incorporate patient-generated data into EHR
  • Provide ample training on all functionalities

regional extension center: need training to get to 4 clicks instead of 15

No kidding. I know of two EHRs that both allow physician users to approve a refill request; one takes four clicks 🙂 and the other takes 25 🙁 . Yeah, I know, now I seem to be contradicting myself. But in this case it’s four fast, automatic, effortless clicks versus 25 slow, laborious clicks.

RECs to help train to “vendor specific workflow”

Interesting phrase: “vendor specific workflow”

The slide says “Work directly with providers on a regular basis. Use the HITRC for usability tools and resources. More participation with RECs; Webinars, Training, Response; Vendor specific workflow.” (my emphasis)

Given that a majority of the total cost of owning an EHR / EMR can be associated with workflow customization issues (I do have a reference on this, but not immediately at hand), it’ll be interesting to observe how RECs cope with this particularly important and problematic aspect of implementing traditional EHRs.

See my Mirror, Mirror, On the Wall, Which EMR is Least Traditional Of All? for some tongue-in-cheek comments on how darned difficult it is to cope with the relatively uncustomizable workflow of traditional EMRs / EHRs.

Evidence-based Usability Guidelines for Promoting Safety and Efficacy

Presentation PDF / Cached

Starting: Evidence-based Usability Guidelines 4 Promoting Safety & Efficacy

took sixty yrs to get to standardized time zones, can we get 2 usability faster?

lessons other industries: apple microsoft android “usability guidelines are powerful”

Usability Guidelines are Powerful

  • Apple, Microsoft, Android, …
  • NASA, FAA, DoD, …
  • HFES, WWW-Consortium, …
  • RAISE QUALITY
    • Promote Consistency
    • User Performance
    • Programmer Productivity
    • Organization Reputation
  • SPEED DEVELOPMENT
    • Reduce Errors

Interesting though to contrast the slide above and the slide below (from a later presentation)!

if-the-gov2

This second slide shouldn’t be taken out of context though. It was presented as a typical criticism of government-led EMR usability testing that must be acknowledged and addressed.

for a good model of research-based web design and usability guidelines see http://usability.gov

HIT usability researchers find it difficult to obtain user docs, screenshots, demos from EMR EHR vendors for research purposes

Yeah, I know what he is talking about. When I was in academia it was very difficult to get cooperation from EHR / EMR vendors to help train students, let alone conduct research. When I worked for an EHR / EMR vendor I suddenly had access to a plethora of such materials, but did not have time (or a charge) to train students (as opposed to users) or to conduct and present research. I started this blog in an attempt to bridge this gap and resolve (to my satisfaction) some of these inherent contradictions. I wrote about this in Walking the Fine Line between Marketing and Education.

TURF – A Unified Framework for Defining, Evaluating, Measuring, and Designing EHR Usability

Presentation PDF / Cached

Starting: TURF: Unified Framework 4 Defining, Evaluating, Measuring & Designing EHR usability

TURF = UFuRT? Below is a diagram from 2007 paper…

ufurt-diagram1

(cached)

TURF = task + user + function + representation

Note the common components between the previous and subsequent slides: task, users, function, representation, intrinsic difficulty/complexity, extrinsic usability/difficulty

turf-framework1

“function saturation”

function-saturation1

“overhead in designer model” less is better

overhead-designer-model

measuring usability: learnability, efficiency, error prevention and recovery

How to Measure Usableness?

  • Learnability
    • trials to reach a certain performance level
    • items that need to be memorized
    • sequences of steps that need to be memorized
    • Etc.
  • Efficiency
    • Time on task
    • Task steps
    • Task Success
    • Mental effort
  • Error Prevention and Recovery
    • Error occurrence rate
    • Error recovery rate
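The efficiency and error measures above can be sketched as simple computations over logged test sessions. The session records and field names here are hypothetical, invented for illustration:

```python
# Hypothetical usability-test log: one record per task attempt.
sessions = [
    {"seconds": 210, "steps": 14, "success": True,  "errors": 1, "recovered": 1},
    {"seconds": 300, "steps": 25, "success": False, "errors": 3, "recovered": 2},
    {"seconds": 180, "steps": 12, "success": True,  "errors": 0, "recovered": 0},
]

def usability_metrics(sessions):
    n = len(sessions)
    total_errors = sum(s["errors"] for s in sessions)
    recovered = sum(s["recovered"] for s in sessions)
    return {
        "mean_time_on_task_s": sum(s["seconds"] for s in sessions) / n,
        "mean_task_steps": sum(s["steps"] for s in sessions) / n,
        "task_success_rate": sum(s["success"] for s in sessions) / n,
        "error_occurrence_rate": total_errors / n,  # errors per attempt
        "error_recovery_rate": recovered / total_errors if total_errors else 1.0,
    }

print(usability_metrics(sessions))
```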

efficiency measure: time on task in seconds (5 minutes on CPOE )

time-on-task2

number of steps, mental effort (thru cognitive modeling)

turf-action2

For an overview of cognitive modeling see Toward Cognitive Modeling for Predicting Usability

TURF in action: reduced steps from 91 to 14

Note that if an EHR does not have some means for its users to modify its workflow then this means going back to the programmers to “unfreeze”, change, and then “refreeze” its workflow. I addressed this in my post Litmus Test for Detecting Frozen EHR Workflow.

in summary: usability is definable, measurable, doable

By the way, usability approaches that emphasize an entire team as the user instead of a single user before a single display (groupware instead of “singleware”) are complementary with workflow-oriented approaches such as TURF/UFuRT. This is why, in 2004 I wrote the following about the importance of workflow management systems (today business process management systems and suites) to EHR / EMR usability:

“Workflow Management and EHR Usability

EHR workflow management concepts mesh with research initiatives to improve EHR usability. For example, Human-Centered Distributed Information Design [6] (there applied to EHR usability issues) distinguishes four levels of distributed analysis: user, function, task, and representation, which correspond well to workflow management architectural distinctions.

[1] Distributed user analysis can be interpreted to include allocation of tasks, relationship between roles, and task-related messaging, all of which are important workflow management concepts.

[2] Distributed function analysis involves high-level relationships among users and system resources. From a workflow management perspective, this includes who reports to whom and who is allowed to accomplish what.

[3] Distributed task analysis roughly corresponds to the creation of process definitions that in turn drive EHR behavior: What is to be accomplished by whom, in what order, and what needs to happen automatically.

[4] Distributed representational analysis corresponds to something that workflow management systems intentionally do not address. Workflow management system design tends to be agnostic about how information is displayed to, transformed, or collected from the user. The underlying workflow engine is intended to be a general purpose tool that can be used to sequentially launch whatever screen or initiate whatever behind the scenes action that the implementer of the workflow system deems most apt as part of workflow analysis and design. However, by remaining orthogonal to the choice of screen, by not mandating or hard coding, the designer/implementer is free to bring to bear the powers of representational analysis to use whatever screen and attendant representation is most appropriate.

Thus, workflow management concepts are consistent with human-centered distributed information design, an important emerging area of medical informatics research. “Task-specific, context-sensitive, and event-related displays are basic elements for implementing HCC [human-centered computing] systems,” (p. 46 [6]) and they are the basic elements provided by EHR workflow management systems, too.” (my emphases)
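The representation-agnostic workflow engine described in [4] can be sketched in a few lines: a process definition is just an ordered list of steps, each naming a role and a screen-launching action, and the engine sequences them without knowing anything about how a screen renders. All names and the in-memory structure here are invented for illustration:

```python
# Minimal workflow-engine sketch: a process definition is an ordered list of
# steps; each step names a task, a role, and a "screen" (any callable). The
# engine is agnostic about what each screen displays -- it only sequences
# steps and launches whatever the definition names.
process_definition = [
    {"task": "rooming",  "role": "nurse",     "screen": lambda ctx: ctx.update(vitals="taken")},
    {"task": "exam",     "role": "physician", "screen": lambda ctx: ctx.update(note="written")},
    {"task": "checkout", "role": "clerk",     "screen": lambda ctx: ctx.update(billed=True)},
]

def run_workflow(definition, context):
    for step in definition:
        # A real WfMS would route each task to the worklist for the step's
        # role; here we simply launch each screen in order.
        step["screen"](context)
    return context

result = run_workflow(process_definition, {})
print(result)
```

Changing the workflow then means editing the process definition, not the screens, which is the sense in which the engine stays orthogonal to representation.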

Overview of HIMSS Usability Taskforce Initiatives

Presentation PDF / Cached

Starting: Overview of HIMSS Usability Taskforce Initiatives

promotes industry education about usability principles & measurement

usability maturity model white paper

Defining and Testing EMR. Usability: Principles and Proposed Methods of EMR Usability Evaluation

subgroups include HIMSS Celltop Design Workgroup: smartphones, handheld design tenets, partnership NIH & HIMSS

next steps: maturity model checklists, attention education, more white papers

Safety, Usability and User Interface Standards in the NHS (Virtual Presentation)

Presentation PDF / Cached

starting virtual presentation from britain on NHS usability program

CUI = Common User Interface, speaker has background in aviation human factors (as have I, sort of) which is advanced over healthIT with respect to usability

common user interface example: patient banner eg “display comma after family name”

NHS lessons learned: usability + safety strongest message, considerable common user interface docs online at portal located at http://www.cui.nhs.uk

“there’s some very poor product out there” “woefully clunky”

Amen! Question: what is the best way to make EHRs generally less clunky?

  • Government?
  • Industry?
  • Both?
  • How?

Even bigger question: Is above the right question?

Human Factors Approaches to Improving EHR Usability

Presentation PDF / Cached

Starting: Human Factors Approaches to Improving EHR Usability

amusing user videos “we only have 3 screens, we don’t need 500 buttons”

The actual quote was “We only have three screens–preop, intraop, postop–we don’t need five hundred buttons.” How do I know? I used the Looxcie wearable camcorder to provide a resource to consult when I want the exact wording of something.

[flv:http://www.chuckwebster.com/video/NIST-EHR-usability/500-buttons/we-dont-need-500-buttons.flv 250 139]

RT NHS reps bemoan lack of visibility into ehrusability; no recourse after you sign the ehr contract

+1 RT : Super cool ehrusability lab in Canada with live video recorded simulation

challenges: generalizability, resource requirements, risk vs usability, comprehensiveness…

question from audience: from a CMIO, how to tap into your (the folks presenting at the NIST EHR usability conference) usability expertise?

Answer: Join this EHR usability community that NIST is in the process of creating.

question: how to make sure requirements are done right?

answer to previous question from virtual NHS folks: usability needs to be brought up in the contract phase; otherwise it is too late for it to influence requirements

member of panel: patient is also a user, an example is a patient identifying an error in chart when looking at the EMR screen

As an aside, I once suggested that pediatric EMR EHR user interfaces ought to, and eventually will, evolve to look like illustrated children’s books (as part of a larger exercise to make pediatric offices more-and-more child friendly). Why? Because the pediatric patient is an EMR user too. EHR buttons ought to be big enough for not only the pediatrician to hit on the fly (respecting Fitts’ and Hick’s laws of target acquisition) but for the sharp-eyed child to see. An EMR is not just [insert standard EHR definition here], it is a form of persuasive technology. Big buttons displaying soccer balls and report cards carrying the letter “A” ought to be part and parcel of pediatric EMR user interfaces.

soccer-ball-report-card

Cute representations of real and imaginary animals *ought* to scamper about an EMR UI, drawing in the child EHR user, not just making them less fearful, but entrancing and motivating them. Imagine a pediatrician clicking on a big animated button while saying “Come sit here and let’s see what the big blue bear thinks you should do about your cough!” EHRs, really usable EMRs, by Disney or Nickelodeon are in our future. Just not sure when–how soon or how long.

question: can NLP natural language processing help? docs wont tolerate lots of clicking

I took all the courses necessary for a degree in Computational Linguistics before switching into Intelligent Systems, including phonetics, phonology, morphology, syntax, semantics, pragmatics, and NLP I, II, and III, plus knowledge representation and NLG (natural language generation). The problem I have with replacing properly-managed structured data entry with speech recognition and natural language processing is this: a human, either the original speaker or someone else (perhaps some sort of post-editor) still needs to proof the string of linguistic tokens emitted by a speech recognition-based EMR user interface, or canonical representations of what those tokens mean.

Even if speech recognition is 99.5% correct, as long as non-automated proofing and post-editing is required, the fire-and-forget nature of clicking (or, preferably, touching) a picklist item, which does not require proofing or post-editing, is superior. And as for slowing down the physician user: again, if Fitts’ and Hick’s laws are respected, structured data entry can actually be faster and more accurate than other data input modalities. Where speech recognition makes more sense, at this point in the evolution of the degree of intelligence possible by the technologies we can bring to bear on this problem, is in the mobile smartphone interface. However, here the context is severely constrained and the amount of required proofing and post-editing is minimal (though still required).
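A back-of-the-envelope calculation illustrates why that residual proofing burden matters even at 99.5% accuracy. The note length and daily volume figures below are illustrative assumptions, not data:

```python
# Residual error burden of "99.5% accurate" speech recognition.
accuracy = 0.995
words_per_note = 300   # illustrative encounter-note length
notes_per_day = 30     # illustrative clinic volume

errors_per_note = (1 - accuracy) * words_per_note
errors_per_day = errors_per_note * notes_per_day
print(round(errors_per_note, 2), round(errors_per_day, 1))
```

Roughly one to two misrecognized words per note, dozens per day, each of which a human must find and fix somewhere in otherwise-plausible text.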

By the way, computational linguistics is relevant to not just processing medical language generated by humans, but communication between EMR systems as well (especially at the higher, currently less considered, levels: pragmatics and discourse).

Answer (to the NLP question above): we’ll compare input modalities, including NLP

RT Following tweets from ehrusability workshop http://bit.ly/kexzxu

Hi Brian!

Collaboration and Consensus through Standards – The National Technology Transfer and Advancement Act

No Presentation PDF

Starting: Collaboration & Consensus thru Standards – National Technology Transfer & Advancement Act

by their very nature, standards are “collaboration” vehicles and “consensus” processes

NIST has a standard for creating standards (hmm, is there a standard for creating standards for creating standards?)

A Community Approach to EHR User Experience Measurement

Presentation PDF / Cached

Starting: Community Approach to EHR User Experience Measurement

classic tradeoff between focus vs participation: so “focussed collaboration” approach

From slide presentation speaker notes:

“Focused collaboration means that we’re engaging a broad array of stakeholders in the development process, but managing their work properly to ensure the most efficient and effective process. What we don’t want to see is a high degree of focus (top down, heavy-handed government-driven process) with little participation from outside stakeholders; nor do we want a highly participatory process with no strong focus that results in a lot of great ideas, but no results. We’re seeking the best of both worlds.”

some say: if the gov created usability tests we wouldn’t have iphone or android

We saw this slide earlier

others say usability is indeed a science but that clinical workflows are “nuanced”

I presume (perhaps too much, though) that “nuanced” means not susceptible to formalization, formal analysis, or automatic execution in the service of greater data and order entry usability. I absolutely disagree.

From EMRs, EHRs, and Clinical Groupware Need to Solve “The BPM Problem”: Why Not Use BPM to Help Do So?

While it is true that most current traditional EMRs lack facility to model and execute workflow, future EMRs based on workflow management systems (WfMSs: workflow engines plus process definitions) and business process management technology (WfMS plus business intelligence, business activity monitoring, process mining, complex event processing, process simulation optimization, adaptive case management, etc.) inevitably will.

vendors don’t want government say what products are good or bad, we [vendors] don’t want the government to incent us into creating technology that stinks

BTW funny typo on slide, text was “…we want…” instead of “we [don’t] want the government to incent us into creating technology that stinks”

Audience reaction to the typo was good-humored laughter. I certainly appreciated the lighter moment after hours of seriousness.

After the lighter moment, there is this: “So we have this classic tradeoff of the vendor community not wanting, for absolute valid reasons, the government to tell the world what products are good or bad, and the provider community saying, look we want to use electronic health records but we need to improve practice workflow, we want to make sure we have the right signals about the products we purchase”

Again, a central question appears to be: How to make EMRs, on the whole, “less clunky”?

EHR EMR Usability professional: “sometimes i feel like an island in my own organization: the product is done, marketing brings me in, ‘the usability stinks'”

From my Intuitive vs. Intuitable EMRs, EHRs, and Clinical Groupware: Do We Need Smarter Users or Smarter User Interfaces?

“Usability can’t be “added” to EMRs, EHRs, or clinical groupware. It has to inform and influence the very first design decisions. And there are no more fundamental early design decisions than what paradigm to adopt and platform to use.

No matter how “intuitable,” EMRs without executable process models (necessary to perceive, reason, and act, and later systematically improve), cannot become fully active and helpful members of the patient care team. Wrong paradigm. Wrong platform.

Truly “intuitive” process-aware clinical groupware, on the other hand, has a brain, variously called a BPM, workflow, or process engine. This is the necessary platform for delivering context-aware intelligent user interfaces and user experience to the point of care. Right paradigm. Right platform.”

Community of Profession Model

Presentation PDF / Cached

Human Factors / Usability for Medical Devices at FDA: An Historical Perspective

Presentation PPT / Cached

Starting: Human Factors / Usability for medical devices at FDA: Historical Perspective

medical device milestones, 1976 bureau of MD, 1984 congress hearings on deaths

1999 “To Err is Human” 98,000 deaths, 5th cause of death, cost $29B

key: FDA review of pre-market submissions, outreach 2 industry

feedback: sales increase w/ satisfaction of customers for devices w attention to human factors driven by FDA

FDA concerns about device usability include relying on checklists and rating scales instead of systematic usability reviews

See my own comments about the use of checklists to evaluate EMR EHR usability

Building More Usable EHRs – Supporting the Needs of Developers “Focus on Faster & Usable Clinical Documentation”

Presentation PDF / Cached

Starting: Building More Usable EHRs: Supporting Developers “Focus on Clinical Documentation” usability EMRs NIST

one study’s conclusion: “current EMRs frustrate physician collection of data”

current-ehrs-frustrate

Another study:

summary-conclusions

“the ‘we computerized the paper, so we can go paperless’ fallacy”: displays not as portable, flexible or well designed as paper

remove tension btwn free text vs structured documentation

This is from this presentation’s key slide:

Recommendations From Literature

Remove tension between free text versus structured documentation

Clinical documentation needs to support both seamlessly

  • Usability and semantic interoperability go hand in hand
  • Refuse systems that do not deliver both
  • Remove tension between clinician/physician documentation as a billing vehicle and as a clinical documentation tool
  • Improved data input and richness of documentation can coexist if you design the system properly
  • Usability is perhaps more crucial than interoperability
  • The question of interoperability will be unresolved if clinicians fail to accurately record the data

improved data input & richness of clinical documentation can coexist

Also the point of my rejoinder to the many criticisms of structured EMR EHR data entry

Developers: Supporting the Needs of Patients

No Presentation PDF (no slides were used)

Starting: Developers: Supporting Needs of Patients Usability of EHRs EMR

among users: physicians have the highest standards but also lowest tolerance (hmm, not bad, just a fact)

In fact my experience has been that part of the difficulty in developing usable EMRs has been creating an EHR that is useful and usable enough to physicians, in spite of its flaws and their high standards, to sustain physician engagement in the necessary process of improving EMR usefulness and usability.

As soon as you talk about certification you are talking about an idealized model; EMR vendor customers are skeptical

“i don’t believe the price of adding government [to this mix] will be rewarded w/ better outcomes”

Again consulting my Looxcie:

“What you are saying is that your design is meaningful for every purchaser out there and your standards, your metrics, your way of saying yes we’ve have these numbers and tests are going to translate into provider happiness. As I said, I challenge the assertion that the industry has failed but I find it even more surprising the assertion we can’t do better. We think there is a role for usability in healthcare because true usability has to start out with the tasks the users have to provide, or have to perform everyday, it’s their requirements. So let’s start to look at some of those requirements that they have for meaningful use right now and figure out the way we can make those processes more efficient and more usable while still maintaining their usefulness. I think ultimately then what we need to do is to first of all provide support to those organizations like ISMT [???] and give the purchasers the tools that they need. Because we have clinicians who demand quality based on markets that have to bow to those demands in order to survive. I don’t believe the price of adding government into this dynamic will be rewarded with better outcomes. Thank you…”

Educate, Motivate, and Improve: In Favor of Inspecting and Rating UCD/Usability Processes

Presentation PDF / Cached

Starting: Educate, Motivate & Improve: In Favor of Inspecting/Rating UCD/Usability Processes

+1 “who is the user? it’s a cooperative *group* of users” [that is the user, not an individual user] #ehrusability #EMR clinical #groupware

I absolutely agree with this point. Focussing on the individual user in front of a single screen will ultimately be counterproductive. I tagged this tweet with the hashtag #groupware because I think that the phrase (and movement) associated with “clinical groupware” has this particularly right. Clinical Groupware, Care Coordination, and EMR Workflow Systems: Key Ideas provides an overview of the history of groupware as it relates to workflow (and getting workflow right is so important to usability and, in my view, considerably misunderstood).

vendors are worried they will spend more time getting certified instead of improving usability

EMR certification does run the danger of falling afoul of the famous software development “Iron Triangle”: given the same level of resources, a software product can improve one or two of 1) features, 2) quality, and 3) time to market only at the expense of the remaining goal or goals. By diverting resources to meet certification, not only can one set of features (those required for certification) crowd out other features, but certification can also become an obstacle to quickly getting a stable and usable EHR release to market. Usability certification may, ironically, forestall the very usability innovation it seeks to advance.

summary: need incremental improvement & competitive dynamic to invent new solutions

Love “Focus on the ‘Five Big Tasks’” RT “War on EMR usability” Should gov get involved? post.ly/2Aagh

Agree RT we should continue to automate where it makes sense and let providers focus on practicing medicine

One of the fun bonuses of tweeting notes live from a presentation is that folks (in the audience or not present) will occasionally publicly (or privately, by direct message) chime in. I’ve previously written about this Twitter-mediated conference back channel.

Usability is the Key to Stimulating EHR Innovation and Adoption

Presentation PDF / Cached

Starting: Usability is the Key 2 Stimulating EHR Innovation & Adoption

Usability: extent EMR EHR can B used by specified/S users 2 achieve S goals w/effectiveness, efficiency & satisfaction in S context NIST

usability-definition

It’s an interesting exercise to extend this usability definition to a clinical groupware approach to EMRs:

“Great definition…but it just seems so, well, “singleware-ish.” Clinical groupware needs a less abstract definition of usability that is more direct about groupware’s unique usability issues (see the sixth, seventh, eighth, ninth, and tenth quotes from my Clinical Groupware…Key Ideas post). How about:

Clinical groupware usability is…

“The extent to which clinical groupware can be used by specified teams of users to coordinate activity and achieve specified collections of goals with overall effectiveness, efficiency and satisfaction in specified contexts of use.”

[Those links to the sixth through tenth quotes about groupware usability? I pull that material into this post after its conclusion below.]
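For what it’s worth, the three components of that definition (effectiveness, efficiency, satisfaction) are all measurable in a usability test. Here is a minimal Python sketch of rolling per-session observations up into those three measures; the Session fields and the simple averaging are my own illustrative choices, not anything prescribed by NIST:

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One user (or care team) attempting one specified task."""
    completed: bool      # effectiveness: was the goal achieved?
    seconds: float       # efficiency: time on task
    satisfaction: int    # satisfaction: e.g. a 1-7 post-task rating

def summarize(sessions):
    """Aggregate per-session observations into the three ISO-style measures."""
    n = len(sessions)
    return {
        "effectiveness": sum(s.completed for s in sessions) / n,
        "mean_seconds": sum(s.seconds for s in sessions) / n,
        "mean_satisfaction": sum(s.satisfaction for s in sessions) / n,
    }

sessions = [Session(True, 95.0, 6), Session(False, 180.0, 3), Session(True, 110.0, 5)]
print(summarize(sessions))
```

Extending this to clinical groupware would mean scoring the *team’s* goal achievement rather than any individual’s, which is exactly where single-user metrics start to break down.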

meaningful use has emphasized functionality over usability, 100 clicks 2 doc’mt smoking status!

Surely an exaggeration, but point well taken.

2 types of usability: individual usability vs workflow usability, how can we reduce physician work/steps/task?

two-types-ehr-usability

Absolutely agree. Also see material on clinical groupware usability in postscript.

Government should do 3 things: define usability measures, promote open platforms and APIs, national EHR usability database

compare users to find “positive deviance” find people doing well and find out what they are doing to learn from it ehrusability

Wrote about the potential to do something similar relative to comparing medical practice productivity measures and explaining them in terms of differences in workflow.

individual vs workflow usability: think more about how usability affects team-based care ehrusability NIST emr clinical groupware

Again, incredibly important insight, this! In my post The Cognitive Science Behind EMR Usability Checklists I wrote:

“[T]here is no guarantee that optimizing single-user usability won’t suboptimize higher-level global system goals. So I prefer a definition of usability that emphasizes team, rather than individual, performance.”

Again, see postscript.

Promoting Usability in Healthcare Organizations with a New Usability Maturity Model

Presentation PDF / Cached

Starting: Promoting Usability in Healthcare Organizations with a New Usability Maturity Model

Speaker defined usability as “sexier products using a process”

Now that is a sexy definition of usability!

Usability Maturity Model: unregulated, preliminary, implemented, integrated, strategic

usability-maturity

While we are on the topic of maturity models, I’d like to mention two others: 1) the Software Capability Maturity Model developed at CMU’s Software Engineering Institute (the original MM that has inspired other MMs), and 2) the Business Process Management (BPM) Maturity Model.

Relative to the Capability Maturity Model, Wikipedia says more generally:

“A maturity model can be viewed as a set of structured levels that describe how well the behaviors, practices and processes of an organization can reliably and sustainably produce required outcomes. A maturity model may provide, for example:

  • a place to start
  • the benefit of a community’s prior experiences
  • a common language and a shared vision
  • a framework for prioritizing actions.
  • a way to define what improvement means for your organization.

A maturity model can be used as a benchmark for comparison and as an aid to understanding – for example, for comparative assessment of different organizations where there is something in common that can be used as a basis for comparison. In the case of the CMM, for example, the basis for comparison would be the organizations’ software development processes.”

Relative to a BPM Maturity Model here is a similar five level chart from “Towards a Business Process Management Maturity Model”

bpm-mm

Why do I include these maturity models? The Software CMM is the granddaddy of maturity models and therefore important context for considering any EHR usability maturity model. And the healthcare industry has the lowest BPM maturity of any major industry segment. At least some of the usability problems afflicting EHRs stem from the fact that they are not process-aware in the sense that workflow management systems and business process management suites are.

Launching usability: wake-up calls, individual infiltration, internal champion, external experts

Guidelines for Improving Usability: Proposed EHR Usability Evaluation Protocol

Presentation PDF / Cached

Starting: Guidelines for Improving Usability: Proposed EHR Usability Evaluation Protocol

EHR Usability Protocol focuses on *most critical* issues first, others later

Presumably preventing errors threatening patient safety

From the slide:

EHR Usability Protocol (EUP)

  • The EUP provides a methodology for identifying and eliminating risks to patients due to poor user interface design.
  • This focus is the foundation of many existing, validated protocols for evaluating the usability of systems where safety is a critical component of user operation.
  • EUP focuses on the most critical issues first.
  • Other dimensions of usability are important. (my emphasis)

My interpretation is that while speed, productivity, profitability, satisfaction, engagement, etc. are important and may eventually be tackled, patient safety is paramount and therefore will drive creation of the EHR Usability Protocol.

objectives: eliminate “never events”, ID/prevent critical use errors…

objectives:… ID areas for improvement and report in Common Industry Format CIF

EUP does not describe “look & feel” & therefore will not discourage innovation

eup-is-not

Is it possible to improve usability without changing the look and feel of a user interface? From the Wikipedia entry on “look and feel”:

“Look and feel in operating system user interfaces serves two general purposes.

First, it provides branding, helping to identify a set of products from one company.

Second, it increases ease of use, since users will become familiar with how one product functions (looks, reads, etc.) and can translate their experience to other products with the same look and feel.” (my emphasis)

Errors of commission vs omission (harder to detect)

Never events: commission, omission, wrong med, [2 more didn’t get!]

Here’s the complete list (from the slide):

“Never Events

The proposed categories of never events are:

  • Wrong patient action of commission event: Actions with potentially fatal consequences are performed for one patient that were intended for another patient because two patient identifiers were not displayed in an area of the screen that is visible without scrolling
  • Wrong patient action of omission event: A patient is not informed of the need for treatment because the wrong patient’s name was displayed on clinical data for another patient
  • Wrong medication event: A patient receives the wrong medication, dose, or route because the displayed information was not accurate or required viewing information on hidden screens to be accurate
  • Delay in care event: A patient should not receive a life-threatening delay in the provision of critical care activities due to design decisions made for administrative, billing, or security objectives
  • Unintended care event: A patient should not receive unintended care actions due to actions taken to test software, train users, or demonstrate software to potential customers.”

More errors: sequence & timing errors (both subclasses of errors of commission)

Quant vs qual/attitude vs behavior: summative usability testing is quantitative results from behaviors

usability-eval-context1(red circle added)

“discount usability testing” not repeatable across multiple designs

summative-ehr-usability-test-plan

(red highlights in original)

Had to look that phrase up: Jakob Nielsen popularized the phrase and idea of discount usability engineering, about which he wrote:

The Discount Usability Engineering Approach

“Usability specialists will often propose using the best possible methodology. Indeed, this is what they have been trained to do in most universities. Unfortunately, it seems that “le mieux est l’ennemi du bien” (the best is the enemy of the good) [Voltaire 1764] to the extent that insisting on using only the best methods may result in having no methods used at all. Therefore, I will focus on achieving “the good” with respect to having some usability engineering work performed, even though the methods needed to achieve this result are definitely not “the best” method and will not give perfect results.

It will be easy for the knowledgeable reader to put down the methods proposed here with various well-known counter-examples showing important usability aspects that will be missed under certain circumstances. Some of these counter-examples are no doubt true and I do agree that better results can be achieved by applying more careful methodologies. But remember that such more careful methods are also more expensive — often in terms of money, and always in terms of required expertise (leading to the intimidation factor discussed above). Therefore, the simpler methods stand a much better chance of actually being used in practical design situations and they should therefore be viewed as a way of serving the user community.”

usability testing process: kickoff/discovery, preparation, data collection, analysis/reporting

key differences summative testing for EHRs: requires more moderators with greater expertise & more tasks mandated by meaningful use than non EMR testing

difference-ehr-usability-test-plan

(red highlights in original)

many are tasks tied to MU meaningful use criteria

test administrators will need advanced degrees in human factors and minimum 3 years experience

evaluators-ehr-usability

Next steps: protocol development, test protocol examples, data sheets, develop more specific tasks

Government Best Practices in System Usability: Brief History & Status

Presentation PDF / Cached

Starting: Government best practices in system usability: Brief history & status

Human factors in design of safety critical systems: Book: The Chapanis Chronicles: 50 years of HF Research, education & design

chapanis

Great to see the well-reviewed biography of aviation human factors/ergonomics researcher Alphonse Chapanis. I became peripherally aware of Chapanis when I took a course at the University of Illinois Institute of Aviation (home of U of I’s Human Factors and Ergonomics program). I’ve frequently cited his contemporary, Paul Fitts, relative to Fitts Law and the need for big buttons for EHRs. I even wrote a poem about him. 🙂
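Since Fitts’s Law came up: in its common Shannon formulation it predicts pointing time as MT = a + b·log2(D/W + 1), where D is the distance to a target and W its width. A quick Python sketch (the constants a and b below are arbitrary illustrative values, not empirically fitted ones) shows why bigger buttons are faster to hit:

```python
import math

def movement_time(distance, width, a=0.1, b=0.15):
    """Shannon formulation of Fitts's Law: predicted pointing time in seconds.
    a and b are device- and user-specific constants; these values are
    illustrative only, not measured."""
    return a + b * math.log2(distance / width + 1)

# Same distance to the target, but quadrupling the button width
# lowers the index of difficulty and hence the predicted time:
small = movement_time(distance=400, width=20)
large = movement_time(distance=400, width=80)
print(small, large)
```

Hence the case for big buttons in EHRs: widening the target shrinks the log term directly.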

“The early educators in the field-Alex Williams, Al Chapanis, Paul Fitts, Ross McFarland, Len Mead, Lick Licklider, Neil Warren, John Lyman, Jack Adams, George Briggs, and Ernest McCormick-had in common a recognition of the importance of a multidisciplinary approach to aviation problems, and their students were so trained.” (The Adolescence of Aviation Psychology)

As we are currently in the adolescence of “EMR Psychology”, I think there is indeed a historical model in aviation psychology to inspire and guide us today.

Human factors was born 70 years ago in the aviation industry when planes were falling out of the skies

I enjoyed seeing connections drawn between the history of aviation human factors and EMR / EHR usability. For more on the subject see EHR/EMR Workflow System Usability–Roots in Aviation Human Factors.

helmet

Notable UI incidents: NORAD false alarms, Three Mile Island, Flight 965 near Cali, Colombia

NIST involved in refinery user interface: reorg control room operator info, reduced plant incidents to one third of previous

FAA Order 9550.08*: “human factors shall be systematically integrated…all FAA elements & activities”

FAA order 9550.08*

“Human factors shall be systematically integrated into the planning and execution of the functions of all FAA elements and activities associated with system acquisitions and system operations. FAA endeavors shall emphasize human factors considerations to enhance system performance and capitalize upon the relative strengths of people and machines.”

Human factors @ Dept of Defense: Human System Integration/Manpower Personnel Integration (MANPRINT, I kid you not)

The Relationship between Health IT Usability and Patient Safety: Towards an EHR Usability Safety Framework

Presentation PDF / Cached

Starting: Relationshp btn HealthIT Usability/Patient Safety: 2wards EHR Usability Safety Framework

NIST working on Usability-Safety Framework max benefits 2 users & patients & min harm

safety-framework

I like this diagram and look forward to where it leads. What I like most is that it is “closed loop” and appears amenable to a “process-aware” approach (process design, implementation, enactment and diagnosis) to systematic optimization (minimization of error, maximization of safety).

Use Errors: Patient ID, mode, data accuracy, visibility, consistency, recall, feedback, data integrity

Evaluation indicators: workarounds, redundancies, burnout

Patient harm: never events, substandard care, morbidity, mortality

Risk Factors: severity, frequency, detectability, complexity (1-4 scale)
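The slide listed the four risk factors and the 1-4 scale, but (as far as I caught) not how they combine. A common FMEA-style convention, assumed here purely for illustration, multiplies the factor scores into a single risk priority number:

```python
def risk_priority(severity, frequency, detectability, complexity):
    """FMEA-style risk priority number: the product of four 1-4 factor scores.
    NOTE: the NIST slide gave the factors and scale but not this combination
    rule -- multiplying them is a common convention, assumed here."""
    for score in (severity, frequency, detectability, complexity):
        if not 1 <= score <= 4:
            raise ValueError("each factor is scored on a 1-4 scale")
    return severity * frequency * detectability * complexity

# A severe, frequent, hard-to-detect error in a complex workflow maxes out:
print(risk_priority(4, 4, 4, 4))   # 256
print(risk_priority(2, 1, 3, 2))   # 12
```

Whatever the actual combination rule turns out to be, some such scoring is what lets a protocol rank “most critical issues first.”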

handoff & interruptions increase complexity

Better supporting handoffs and managing interruptions is key. See Interruptions, Usability, and Pediatric and Primary Care EMR Workflow on the subject.

Audience Questions/Comments During Technical Feedback on EUP (EHR Usability Protocol)

Now moving to Breakout sessions: (Red) Tech Feedback on EUP (Green) Building Collaborative Community 4 Improving Usability

I stayed for the Red technical feedback session

question: will testing occur in typically noisy & distracting environment of real EMR user?

Great question!

“Similar to an under-attack fighter, a busy airport control tower, or a hectic lunchtime restaurant, medical practices can be high cognitive load environments (especially during the flu season in primary care). All four require multitasking and prioritization in the face of interruption and distraction.” (EHR/EMR Workflow System Usability–Roots in Aviation Human Factors)

Going to be very interesting to see how EHR usability, relevant to real-life use of EHRs, will be measured.

comment: warning that dealing with test system versions, configurations, and seeding realistic patient data will be very difficult

Traditional EMRs are highly customizable (often based on table-driven development). However, they are still not customizable enough, especially when it comes to workflow (which will be necessary to improve EMR workflow from 91 to 14 steps, as mentioned in a presentation). As workflow engines executing user-customizable process definitions become more prevalent, EMRs will become even more customizable. So, which set of EMR process definitions will be tested? Since users will likely change these definitions, how do EMR workflows “in vivo” get tested for usability? I suspect we are ultimately looking at more use of participant observation in the wild than summative testing in simulated environments. But then that is more qualitative than quantitative, which makes it difficult to apply to programs for certification of EMR / EHR usability.

question: are looking where in workflow error occurs? team setting, where in setting?

Answer: interface is built based on workflow, it is the interface that is tested

What does it mean to say an “interface is built based on workflow”? I think it means that someone analyzes the workflow and then writes software that fits the workflow. The problem is we (developers) aren’t very good at doing this. Is there an alternative? I think there is. Build EMRs with workflow engines executing process definitions so users don’t have to be programmers to change workflow. There are even so-called Design by Doing approaches that allow users to create their own workflows without having to deal with workflow editors. Instead of trying to make sure an EMR fits medical practice workflow before it is installed, give its users the tools to more easily change its workflow. But, then, how do you measure usability? Isn’t usability then really about the tools used to change EMR screens and workflows, not the resulting screens and workflows?
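To make the workflow-engine idea concrete, here is a toy sketch in Python. The step names, handlers, and list-of-steps “process definition” format are all invented for illustration; real workflow management systems add branching, roles, escalation, and much more:

```python
# A process definition is just data -- an ordered list of named steps --
# so a non-programmer could reorder, add, or remove steps without
# touching code. All step names and handlers here are invented.

def record_vitals(chart):   chart["vitals"] = "recorded"
def review_meds(chart):     chart["meds"] = "reviewed"
def document_visit(chart):  chart["note"] = "written"

HANDLERS = {
    "record_vitals": record_vitals,
    "review_meds": review_meds,
    "document_visit": document_visit,
}

def run_process(definition, chart):
    """Tiny 'workflow engine': executes a user-editable process definition."""
    for step in definition:
        HANDLERS[step](chart)
    return chart

# A practice edits its own process definition -- no code change required:
pediatric_visit = ["record_vitals", "document_visit"]
print(run_process(pediatric_visit, {}))
```

The usability question then shifts, as I argue above, from the resulting screens and workflows to the tools used to edit that definition.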

question: what about variability? comment: reducing usability to number of clicks problematic

Agree: See my comments above…

audience: need more detailed cognitive model of error besides omission/commission?

Might want to check out the Rouse Human Error Scheme (disclaimer: he was my Industrial Engineering advisor). I’ve not been able to find a relevant paper of his that is not behind a paywall, but a table appears at the end of Improving Human Factors in Marine Maintenance by Clive K. Bright BA, PhD and Simon P. Bell BSc, CEng. (cached)

error-table

audience: would need 2 test 100 users at least for each release, which are frequent

Summative testing does require a test design that averages over a number of similar users performing similar tasks in similar environments (in contrast with current certification scripts, which rely on one user (perhaps a pretend one) performing each task once over the Internet). So presumably more work than current certifications.

summative-formative-table1

Formative and summative usability testing are compared here.

response: if user interface doesn’t change, don’t need to test, only need 15 test users per category

response: can still identify critical use errors with smaller groups even if not statistically significant
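This squares with Jakob Nielsen’s well-known problem-discovery model: if each test user independently encounters a given usability problem with probability p (Nielsen reported p of roughly 0.31 averaged across his projects), then n users are expected to uncover 1 − (1 − p)^n of the problems. A quick sketch:

```python
def proportion_found(n_users, p=0.31):
    """Expected share of usability problems uncovered by n test users,
    assuming each user independently hits a given problem with
    probability p (Nielsen's average; your mileage will vary)."""
    return 1 - (1 - p) ** n_users

for n in (5, 15, 100):
    print(n, round(proportion_found(n), 3))
```

Under that assumption, 5 users already find roughly 84% of problems and 15 users over 99%, which is why small-sample testing remains defensible even without statistical significance.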

audience: different sites can run very different versions of the same EMR, test each?

audience: most important causes of critical error may B in the variability between sites such as how they handle interruptions ehrusability

audience: 8 week release cycles, could be a lot of testing…

BTW: all of these questioners were thanked and their comments and questions appreciated; that was the purpose of this portion of the workshop

audience: can eye-specialty vendor adapt usability to their subspecialty workflow?

I recall this audience member commented that, in order to obtain EMR certification, they had to add a pediatric growth chart to their EHR even though it was of no possible use to any of their customers.

Response: likely should just concentrate on those aspects of workflow that are relevant to them [tempted to go back to my Looxcie video to find the exact wording]

audience: where can we get credible test data? response: building community to provide this

audience: radiology oncology errors must be reported to state, 1/10000, pool size statistical too small

audience: why so difficult for gov & vendors to collaborate? response: Question not for this forum (ask ONC)

Believe this question came from a European representative in the audience…

@ re medicine where aviation was in 1940 < coincidence! > conf @ NIST this topic

That’s it! There are tradeoffs when you tweet instead of write your notes. I’m not a fast typist, so I miss stuff. On the other hand, there is less transcription from difficult-to-decipher handwriting. Plus the tweets provide an electronic outline that begs for further electronic annotation. Another thing I like about tweeting notes is that I can retweet other folks in the audience (or sometimes not even in the audience but nonetheless following along) and interact with them.

Cheers!

P.S. Here is that material regarding clinical groupware usability that I promised earlier:

“Great definition…but it just seems so, well, “singleware-ish.” Clinical groupware needs a less abstract definition of usability that is more direct about groupware’s unique usability issues (see the sixth, seventh, eighth, ninth, and tenth quotes from my Clinical Groupware…Key Ideas post). How about:

Clinical groupware usability is…

“The extent to which clinical groupware can be used by specified teams of users to coordinate activity and achieve specified collections of goals with overall effectiveness, efficiency and satisfaction in specified contexts of use.”

“Distributed Cognition takes as its unit of analysis a complex cognitive system: collections of individuals and artifacts that participate in the performance of a task. The external structures exchanged by agents of complex cognitive systems comprise its “mental” state and unlike individual cognition, where mental states are inaccessible, these states are observable and available for direct analysis.”

“The Human Factors in Computing community has a…challenge [to] find ways to test and evaluate technological impacts on groups. It’s difficult enough to get meaningful results that take into account differences in experience and individual differences of users to their reactions to user interfaces. But at least it’s possible to get volunteers to sit down with word processing systems and spreadsheet programs for relatively self-contained tasks. It is more difficult to “stage” a realistic group-work setting in a lab and have volunteers use the system in a way that provides meaningful data. Methodologies for testing individual user interfaces don’t apply as well to group support systems. As a result, CSCW [Computer-Supported Cooperative Work] is looking more to anthropology to find methodologies for studying groups at work in their natural settings.”

“Until recently, most user interface research has focused on single-user systems. Groupware challenges researchers to broaden this perspective, to address the issues of human-user interaction with the context of multiuser or *group* interfaces. Since these interfaces are sensitive to such factors as group dynamics and organizational structure—factors not normally considered relevant to user interface design—it is vital that social scientists and end users play a role in the development of group interfaces.”

“Evaluating groupware ‘in the field’ is remarkably complex because of the number of people to observe at each site, the wide variability of group composition, and the range of environmental factors that play roles in determining acceptance”

“Five factors contributing to groupware failure…:

  1. Groupware applications often fail because they require that some people do additional work, and those people are not the ones who perceive a direct benefit from the use of the applications.
  2. Groupware may lead to activity that violates social taboos, threatens existing political structures, or otherwise demotivates users who are crucial to its success.
  3. Groupware may fail if it does not allow for a wide range of exception handling and improvisation that characterizes much group activity.
  4. We fail to learn from experience because these complex applications introduce insurmountable obstacles to meaningful, generalizable analysis and evaluation.
  5. The groupware development process fails because our intuitions are especially poor for multiuser applications.”

Follow me on Twitter at .
