I can’t leave the iPad alone, literally or figuratively (how many EMR users can say *that* about their EMR?). Last week I explored the relationship between EMR/EHR/clinical groupware contextual usability and process awareness. This week I consider the following apparent contradiction:
Most reviewers agree that the iPad is optimized for content consumption, not content creation.
“The iPad is not a laptop. It’s not nearly as good for creating stuff. On the other hand, it’s infinitely more convenient for consuming it — books, music, video, photos, Web, e-mail and so on. For most people, manipulating these digital materials directly by touching them is a completely new experience — and a deeply satisfying one.” (Looking at the iPad From Two Angles, David Pogue)
However, another article reports that, of all industries, healthcare is the most agog over the iPad’s form factor and usability.
“So while the rest of the world texts, tweets, and generally fawns over the thing, that’s muted compared with the reception the iPad is getting in the health care universe…This isn’t just hot-new-toy fever sweeping the mediverse, though: If the iPad becomes as ubiquitous in medical facilities as the iPod is everywhere else, it could usher in literally billions in savings.” (An Apple a Day: Will the iPad revolutionize health care?, Martha White)
The apparent contradiction? Physicians need to create content at the point of care, not just consume it. They will resist hauling around multiple devices. While the iPad has a virtual keyboard and an optional keyboard accessory, and there’s Dragon Dictation, clicking (or in this case, tapping) to perform routine data entry is not likely to go away.
But consider the following blogosphere comment:
“I have read reports from ‘excellent’ EHR systems…which contain disastrous errors created by a 0.5 mm slip of the mouse pointer and a click. This is what happens when two opposite diagnoses differ by one consonant and are adjacent in the pull-down list. We are trying to treat the patient but we are really doctoring the EHR.”
And (in reaction to the iPad):
“Even most template driven EMR software would not be fun on an iPad. Checking a check box with touch can be painful if the check box is too small, no?”
Absolutely right and exactly my point. The Cognitive Psychology of Pediatric EMR Usability and Workflow starts with a question and graphic example of the issue.
“Which targets are easier to hit quickly, accurately, and repeatedly? Small checkboxes or large buttons?”
Figure 1: See Post Script.
The answer is obvious. There’s even a psychological law.
Fitts’s Law: “The time required to rapidly move to a target area is a function of the distance to and the size of the target.” (Wiki article on Fitts’s Law)
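To make the law concrete, here is a minimal sketch of its standard Shannon formulation, MT = a + b·log2(D/W + 1); the constants a and b below are illustrative placeholders, not measured values.

```python
from math import log2

def fitts_movement_time(distance_mm, width_mm, a=0.2, b=0.1):
    """Shannon formulation of Fitts's Law: MT = a + b * log2(D/W + 1).
    The constants a and b are device- and user-specific; the defaults here
    are illustrative placeholders, not measured values."""
    index_of_difficulty = log2(distance_mm / width_mm + 1)  # in bits
    return a + b * index_of_difficulty                      # in seconds

# A 4 mm checkbox vs. a 40 mm button, each 100 mm from the pointer/finger:
print(fitts_movement_time(100, 4))   # ~0.67 s
print(fitts_movement_time(100, 40))  # ~0.38 s
```

Even with made-up constants, the relationship holds: the big button is acquired markedly faster than the tiny checkbox at the same distance.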
For user interface design, Hick’s Law complements Fitts’s Law:
Hick’s Law: “The more choices you have to choose from, the longer it takes for you to make a decision.” (Wiki article on Hick’s Law–you’ll have to copy and paste “http://en.wikipedia.org/wiki/Hick’s_law” into your browser; WordPress doesn’t handle apostrophes in links well)
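Hick’s Law has a similar logarithmic form; in the sketch below the constant b is again an illustrative placeholder, used only to compare a handful of big buttons against a long pick list.

```python
from math import log2

def hick_decision_time(n_choices, b=0.15):
    """Hick-Hyman Law: decision time grows as log2(n + 1).
    The constant b is user- and task-specific; 0.15 s/bit is an
    illustrative placeholder, not a measured value."""
    return b * log2(n_choices + 1)  # in seconds

# Choosing among 5 large buttons vs. scanning a 50-item pick list:
print(hick_decision_time(5))   # ~0.39 s
print(hick_decision_time(50))  # ~0.85 s
```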
“Acquiring” (human factors speak for clicking or tapping) one small “target” amidst many “competing” targets is slower, more effortful, and more error-prone than acquiring a large target among just a few alternatives.
How, might you ask, can EMRs, EHRs, or clinical groupware present *enough* buttons to their physician users so they can enter all the data and orders that they need? Instead of just a few big screens containing many small buttons and checkboxes and so on, spread larger buttons (and no checkboxes, not a one) across many screens.
How, might you ask, are you expected to navigate to the right screen at the right time to click on the right button? For each specific context (well child visit, sick child visit, vaccination, etc.) present the right screens in the right sequence to the user in a way that mirrors the natural order of the tasks the user needs to accomplish. That was my major point in last week’s post Contextual Usability, My Apple iPad, and Process-Aware Clinical Groupware for Pediatric Practice.
The iPad and similar devices may indeed transform digital medicine. If they do, one important reason will likely be that they force EMR, EHR, and clinical groupware developers to get rid of those cramped rows of itty-bitty little checkboxes and endlessly scrolling lists of skinny pick list items. Doing so requires clinical groupware to ask and answer the right question at the right time and act appropriately, handing the user the right data or order entry screen with all, but only, the right data or options. I don’t see any other way for clinical groupware to do this than to rely on some form of user-programmable executable process model.
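As a purely hypothetical sketch (the screen names, options, and structure below are invented for illustration, not drawn from any actual EMR), such a user-programmable process model might be as simple as a context-keyed sequence of screens, each carrying only a handful of large-button options:

```python
# Hypothetical sketch of a user-programmable process model: a context-keyed
# sequence of screens, each carrying only a handful of large-button options.
# Screen and option names are invented for illustration, not from any real EMR.
PROCESS_DEFINITIONS = {
    "well_child_visit": [
        {"screen": "Vitals",        "options": ["Height", "Weight", "Head Circumference"]},
        {"screen": "Development",   "options": ["Milestones Met", "Refer for Evaluation"]},
        {"screen": "Immunizations", "options": ["DTaP", "MMR", "Defer Today"]},
    ],
    "sick_child_visit": [
        {"screen": "Chief Complaint", "options": ["Fever", "Cough", "Ear Pain"]},
        {"screen": "Orders",          "options": ["Rapid Strep", "Amoxicillin", "Follow-Up"]},
    ],
}

def next_screen(encounter_context, step_index):
    """Return the screen (and its few large-button options) for this step,
    or None when the scripted sequence is finished."""
    steps = PROCESS_DEFINITIONS[encounter_context]
    return steps[step_index] if step_index < len(steps) else None

print(next_screen("well_child_visit", 0))  # the Vitals screen comes first
```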
P.S. Read about post-WIMP (Windows, Icons, Menus, Pointer) user interfaces.
P.P.S. Follow me on Twitter at .
Really great roundup of posts about the iPad. No doubt there are some SERIOUS issues of EMR usability in general that are just blown up when you try to use the iPad’s touch features to use an EMR.
“blown up” That’s almost a pun! (as in larger buttons, etc.)
Thanks John,
You have a great roundup of posts about *everything*.
Your combo of EMR and HIPAA/EMR and EHR are two of my three favorite blogs at the moment (Vince Kuraitis’ e-caremanagement.com rounding out my top three).
Keep up the good fight!
–Chuck
P.S. EMR developers aren’t the only folks who may have difficulty porting their apps to the iPad.
“The interesting thing, on a much bigger screen size the game becomes a lot easier,” Groves said. “If you have larger targets with larger screen, you’ll not have as much of an appeal as far as maintaining a (certain) challenge level.”
Ironic! Bigger “targets” make iPad games *too* easy. Wonder if *that* will ever be a problem with EMRs, as in “I’m so bored! My EMR is too easy!”
I actually did a mock-up of what I thought an EMR should look like on the iPad a few months ago, just after the announcement. It followed similar principles, with a page containing no more than 15 buttons. There was space for context, all numbers popped up a calculator keypad, and each part of the classic SOAP note had not just a page or section but a set of pages. I’ve been contemplating this since that time and it’s nice to see that there’s research and forward thinking going on…
I’d love to see the mock-up; perhaps post it to your blog at Paging Dr. Geek?
You mention a set of pages. How would you intend the user to get to them? Individually tabbed? Or would they appear sequentially and be dismissed?
In your post Pleasantly Surprised you mention “The device disappears….”
That’s exactly what happens to the heads-up displays that are projected onto the inside of an airplane windshield (showing the locations of threats and targets). After a while the pilot is no longer consciously aware of the interposed diagram, and yet he or she still has access to the conveyed information. Eventually I suppose that EMR data will be projected on the inside of an eyeglass lens. In an augmented reality scenario the patient, then, becomes the interface. Look at the patient’s hand, and object recognition plus context triggers a workflow that includes a radiographic image (appearing as an overlay on the hand, making it seem that you have x-ray vision and are literally looking into the hand).
The phrase that I’ve encountered over the years that refers to this sort of thing is the Phenomenology of Tool Use.
A good user interface *should* disappear. The day that the EMR effectively disappears from view will be the day we finally get it right.
You should check out the app “Epic Haiku” for iPhone. It was the first EMR made for the platform.
For my own views on the matter: http://www.macadamian.com/insight/healthcare_detail/ipad_for_healthcare/
Isn’t it more of a presentation layer for interacting with an EMR than an EMR itself?
It’s a good idea, though apparently retooling iPhone apps for the iPad is nontrivial.
http://informationarchitects.jp/designing-for-ipad-reality-check
I think we’ll see a lot of iPad wrapper presentation layers created for accessing existing data in EMRs, plus a bit of data and order entry.
However, it is going to be challenging for traditional EMRs (large screens with many checkboxes) to port their data and order entry functionality to the iPad, let alone the iPhone.
My larger point is not that the iPad is an ideal platform or even presentation layer. It is that this entire class of mobile touchscreen UI platforms will force designers of traditional WIMP-based (windows, icons, menus, pointing devices) EMRs to move to alternative post-WIMP user interfaces.
Doing so will be nontrivial because traditional EMRs lack process models to interpret context (the who, what, why, when, where, and how of that step in that encounter for that patient for that user in that location with that goal or constraints).
Process-aware clinical groupware (that is, groupware that uses rules to interpret context and present the right screens in the right order with the right data and order-entry options) will not have the same difficulties with respect to use of the iPad as a user interface.
The iPad screen can hold only a few large buttons. The process model can, at run time, choose all and only the right buttons, and then present a succession of screens that matches the natural workflow of the encounter.
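As a hypothetical illustration of that run-time selection (the vaccine names and age cutoffs below are invented for the example, not clinical guidance), a simple rule can filter a larger option set down to the few buttons that fit one touch screen:

```python
# Hypothetical illustration of run-time button selection: filter a larger
# option set down to the few buttons that fit one touch screen. The vaccine
# names and age cutoffs are invented for the example, not clinical guidance.
ALL_ORDER_BUTTONS = [
    {"label": "Hib",  "min_months": 2,   "max_months": 59},
    {"label": "MMR",  "min_months": 12,  "max_months": 216},
    {"label": "HPV",  "min_months": 108, "max_months": 216},
    {"label": "Tdap", "min_months": 132, "max_months": 216},
]

def buttons_for(patient_age_months, max_buttons=4):
    """Return all and only the applicable options, capped at what fits on screen."""
    applicable = [b["label"] for b in ALL_ORDER_BUTTONS
                  if b["min_months"] <= patient_age_months <= b["max_months"]]
    return applicable[:max_buttons]

print(buttons_for(18))  # ['Hib', 'MMR'] for an 18-month-old
```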
I agree with you: we need “clinical groupware to ask and answer the right question at the right time and act appropriately, to hand to the user the right data or order entry screen with all, but only, the right data or options.”
I worked w/ Michael Roizen, MD at the Univ of Chicago to develop patient self-administered health assessments (HealthQuiz). In the late 80s we used a three-button laptop device (with only YES, NO, NOT SURE, and NEXT QUESTION choices), which used a simple decision-tree algorithm to present appropriate questions one at a time to the patient. Patients would misunderstand the questions being asked, so we used cognitive interview testing to produce more accurate and reliable questions and algorithms. We later moved HealthQuiz to a desktop PC, but the average patient could not use a mouse and keyboard. We could have used the iPad at that time.
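For readers who have not seen this style of adaptive questionnaire, here is a rough sketch of how a three-answer decision tree can present one question at a time; the questions and branching are invented for illustration, not the actual HealthQuiz content.

```python
# Hypothetical sketch of a three-answer decision-tree questionnaire in the
# spirit of HealthQuiz; the questions and branching are invented, not the
# actual instrument.
QUESTIONS = {
    "q1": {"text": "Do you smoke?",
           "YES": "q2", "NO": "q3", "NOT SURE": "q3"},
    "q2": {"text": "Have you smoked for more than 10 years?",
           "YES": "q3", "NO": "q3", "NOT SURE": "q3"},
    "q3": {"text": "Do you exercise three or more times a week?",
           "YES": None, "NO": None, "NOT SURE": None},  # None = end of quiz
}

def run_quiz(answers):
    """Walk the tree one question at a time, given a dict of patient answers."""
    node, asked = "q1", []
    while node is not None:
        asked.append(QUESTIONS[node]["text"])
        node = QUESTIONS[node][answers.get(node, "NOT SURE")]
    return asked

print(run_quiz({"q1": "YES", "q2": "NO", "q3": "YES"}))
```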
What excites me about the iPad right now is that it has the potential to provide an intelligent, flexible and customizable visual interface for data entry.
On a personal note, I am thinking of transitioning from psychiatry to medical informatics – how could I do this? Thanks in advance for your help!
Richard Kim, MD
Thanks for the comment and question Richard!
I’m looking forward to a flood of inexpensive slate computers (running Android, webOS, or Windows Mobile 7) with a similar form factor to the iPad. Using an EMR should be almost as simple as using an ATM or an airline check-in kiosk. However, this “simplified” interface requires a smart engine to dynamically decide which screens to present and what data and order entry options to display on each screen. The current generation of EMRs mimics Microsoft Office, which is a mistake. Too much thinking and too much clicking. There are some EMRs that are starting to take this superior alternative road. They sometimes describe themselves as having workflow-driven or context-aware interfaces (if you search in Google you’ll find them; I’m thinking about doing a post to highlight them).
On transitioning from psychiatry to medical informatics: it’s hard to advise you without knowing more about how much technical background you have, formal or self-taught. Plus, finding a position in medical informatics is still perhaps based more on serendipity than systematic search. This may be changing, as many educational degree and certificate programs are springing up.
However, psychiatry does have some unique contributions to health information management and technology.
Your domain knowledge of psychiatry will be invaluable for designing psychiatric patient care information systems, where I suspect there has so far been less automation. I hesitate to call them psychiatric EMRs, because I’m not sure that a traditional EMR model is appropriate for psychiatry (though some subfunctions, such as e-prescribing, will likely transfer relatively directly). I’d start with what you currently use on paper and currently do for the patient, and figure out how to make it better with IT. You know your workflows best. While many EMRs currently hardcode their workflow and require physician users to adapt to the software, we’re going to see more malleable (by users) “process-aware” clinical workflow systems that can be programmed by non-programmers. Just as HyperCard enabled non-programmers to create useful applications years ago, process-aware clinical groupware (see below) will allow non-programmers to design their own automated environments and invite others (co-workers and patients) into those environments.
Psychiatrists know a lot about psychology, on both the normal and abnormal side, that will be invaluable for designing more usable EMR user interfaces. And lack of usability is becoming a major obstacle to EMR adoption. You might look into usability courses or certification programs.
Implementing new technology in the workplace triggers all kinds of psychological defense mechanisms. I’ve often thought that a good EMR implementer and trainer also needs to be a good individual and group therapist.
Clinical groupware has an important social group lifecycle and participatory patient element. Again, psychiatrists potentially have special and valuable knowledge about the “group” in “groupware.”
Current problems with adopting health information technology, new technology such as process-aware clinical systems, and the rise of social media and clinical groupware speak directly to the special insights and strengths of psychiatry. You could look for ways to pivot from, or extend, your mental health brand to address these problems and opportunities.
Again, my advice is based on little more than the knowledge you are a psychiatrist interested in user interfaces, so you may already be way ahead of me in your investigation of a medical informatics career.
Please come back to tell us where you go in medical informatics!
Cheers
Chuck
It seems to me what physicians need is a customizable EMR interface built with multiple specialties in mind. In addition, while the iPad is an ideal tool for mobile healthcare, a traditional desktop or laptop is often necessary, or at least more convenient, for creation. MediTouch EHR from HealthFusion is designed and built for the iPad, but also offers cross-compatibility with standard computers, like Windows PCs. With built-in encounter notes on large, clickable buttons, small checkboxes aren’t anything to worry about. And with the new MediDraw feature, physicians can draw directly on patient pictures or anatomical outlines.
Thanks for your comment.
You’re right, I do love big buttons. I write about big buttons a lot. You could say I have a thing for big buttons. I’m not alone among usability professionals. Fitts’s Law is well known in aviation human factors. Larger “targets” are more easily “acquired” quickly and accurately. This reduces cognitive effort and error.
However, simply having big buttons is insufficient. What’s needed is the ability for users to program sequences of screens, and the data and order entry options on those screens, to customize EMR / EHR behavior to their local, idiosyncratic workflow needs and preferences, and for the EMR / EHR to automatically choose and then execute these sequences intelligently, given the evolving clinical context within which this execution occurs.
The only practical means to do both (big buttons *and* automated screens) is to use workflow engines executing process definitions. These are widely prevalent in other industries and just beginning to appear in healthcare. Obviously, these engines must give way to users, letting them jump out of workflows, jump back into workflows, reassign workflows, change workflows on the run, and so forth. These capabilities (under the guise of adaptive case management) are rapidly being added to workflow management systems in the business process management industry.
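As a rough sketch of the kind of run-time control such an engine might expose (not modeled on any particular workflow or BPM product; class and step names are invented for illustration), the operations might look like this:

```python
# Hypothetical sketch of the run-time controls an adaptive workflow engine
# might expose so clinicians aren't locked into a frozen screen sequence.
# Not modeled on any particular BPM product; names are illustrative.
class EncounterWorkflow:
    def __init__(self, steps):
        self.steps = list(steps)   # ordered screen names
        self.position = 0
        self.assignee = "physician"

    def current_step(self):
        return self.steps[self.position] if self.position < len(self.steps) else None

    def jump_to(self, step_name):
        """Jump out of the scripted order to any step, forward or back."""
        self.position = self.steps.index(step_name)

    def reassign(self, role):
        """Hand the remaining steps to another role (e.g., nurse, front desk)."""
        self.assignee = role

    def insert_step(self, step_name):
        """Change the workflow on the run by adding an ad hoc step next."""
        self.steps.insert(self.position + 1, step_name)

wf = EncounterWorkflow(["Vitals", "History", "Exam", "Orders", "Plan"])
wf.jump_to("Orders")            # skip ahead
wf.insert_step("Consult Note")  # change the workflow on the fly
print(wf.current_step(), wf.steps)
```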
Given recent spikes in interest in User-Centered Design of EMRs / EHRs I am rereading Donald Norman and Stephen Draper’s 1986 edited collection User-Centered System Design: New Perspectives on Human-Computer Interaction (526 pages!).
On the very first page they write:
This passage, written more than a quarter of a century ago, applies in spades to most of today’s EMRs / EHRs. So much so that just a few word and phrase substitutions result in the following passage:
Starting with a product design course when I was an undergraduate, human factors and workflow courses as a graduate student, and finally coursework in all the cognitive sciences (except anthro, sorry to say, though I read it extensively), I continue to be fascinated by computational models of mind and community, and their relevance to building usable EMRs / EHRs. So I am delighted that you chose to post here and would love to hear more about how the HealthFusion EMR / EHR Meditouch addresses these impediments to widespread EMR / EHR adoption.
I’m thinking about writing a blog post with a title something like “User-Centered EMR / EHR Design is Good Product Design, Plus Lots of Cognitive Science”. How have you addressed EMR / EHR usability? How malleable (by users, not programmers) is your workflow? Can users edit workflows? Care to share a screenshot or two? Or, better yet, a video explaining how you avoid EMR / EHR “frozen workflow”? I’d feature any relevant screenshots or videos.
You certainly appear to be on the right track!
Thank you and
Cheers
–Chuck