TIGER Usability & Clinical Application Design
This wiki is a shared online whiteboard that allows the entire group to easily share information with each other and with other TIGER collaborative team members. The material on the wiki should be considered a draft and will be published on the TIGER website at www.tigersummit.com upon completion.
This work group is collecting case studies and examples from the field that illustrate usability and clinical application design in various environments. These resources are intended to showcase exemplars as well as lessons learned that can help others avoid common pitfalls and mistakes.
Denise Tyler - Co-Facilitator
Angela Lewis
This team meets every other Monday at 1 p.m. EST/10 a.m. PST
March 17, 2008 at 1:00 p.m. EST - please register at https://www1.gotomeeting.com/register/363976443
April 7, 2008 at 1:00 p.m. EST - please register at https://www1.gotomeeting.com/register/734361695
April 21, 2008 at 1:00 p.m. EST - please register at https://www1.gotomeeting.com/register/186612902
May 5, 2008 at 1:00 p.m. EST - please register at https://www1.gotomeeting.com/register/533307278
May 19, 2008 at 1:00 p.m. EST - please register at https://www1.gotomeeting.com/register/584246816
TIGER_Request for Case Study FINAL.doc Ready to Use Case Study Template
Supplied by Patti Rogers on November 28, 2007
HFEinIH proceedings content.pdf
Case Studies Collected
Case Study Analysis
Usability_CaseStudies_compiled(5).xls Case Study summarizations for Final
11/15/07 - provided by Nancy Staggers (slide used with permission)
Axioms of usability include: a) involving the end user early in the design process, b) iterative prototyping, and c) empirical testing. The attached slide was developed by Suzanne Miller, a nursing informaticist employed by Intermountain Health in the early 2000s. She worked with interdisciplinary care providers to create an interdisciplinary care summary page, using Microsoft FrontPage to mock up the application, present it to clinicians, and elicit comments. Because the mock-up was easy to modify, she could produce iterative designs based on those comments. Over several months she presented to many groups of clinicians, so this design represents the culmination of extensive work with them. Rather than formal empirical testing, Intermountain Health used a systematic but less formal method to solidify the design.
Example of a good case study.
Strawman for case study outlines:
Author
Institution
Application or Function
Target users
Overview of the Process
Positive attributes of this case
Challenges and lessons learned
11.15.07 - submitted by Lee Stabler
Case Study
Health First, Inc.
Selection of Electronic Documentation System
This case study describes the process Health First (HF) hospitals used to select an electronic documentation system. Health First, Inc. comprises three separate acute care hospitals, free-standing clinics, Hospice, Home Care, and a health plan. At the time, electronic documentation was being done by all clinicians in all departments except physicians. We had several best-of-breed niche systems that were implemented during the late 1990s and early 2000s; when those niche systems were purchased and implemented, technology did not yet provide a single electronic system that fit the needs of all specialty areas, such as L&D, the ED, and the OR.
The first step was to define the mission and vision for HF's move to an electronic health record. We determined that we needed a solution that would allow all departments to use the same system, accommodate CPOE, provide documentation support for evidence-based practice, and offer decision support at the point of care. Additionally, the system needed to balance standardization with the ability to support individualization. A key component of this step was establishing executive buy-in and sponsorship.
The second step was evaluating several systems and vendors. Before the evaluations could start, we determined who should be on the evaluation team and what the evaluation criteria would be. The team needed to include all stakeholders the new system would affect, so we chose bedside clinicians from each of the three hospitals, all disciplines, IT, and management from across the organization. Once the team and evaluation tool were established, vendors were scheduled to come in and demonstrate their products. At the end of the demonstrations, the selection was made using an objective, criteria-based evaluation tool.
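An objective, criteria-based evaluation tool of this kind is often implemented as a weighted scoring matrix. The sketch below is a minimal illustration of that idea; the criteria, weights, and ratings are hypothetical examples drawn from the goals listed above, not Health First's actual instrument.

```python
# Illustrative weighted-criteria vendor scoring sketch. All criteria, weights,
# and ratings below are hypothetical examples, not HF's actual evaluation tool.

# Each criterion carries a weight reflecting its importance; weights sum to 1.0.
CRITERIA_WEIGHTS = {
    "single system for all departments": 0.30,
    "CPOE support": 0.25,
    "evidence-based documentation support": 0.20,
    "point-of-care decision support": 0.15,
    "standardization with individualization": 0.10,
}

def weighted_score(ratings):
    """Combine per-criterion ratings (1-5 scale) into one weighted score."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

# Hypothetical ratings, averaged across the evaluation team.
vendor_a = {
    "single system for all departments": 4,
    "CPOE support": 5,
    "evidence-based documentation support": 3,
    "point-of-care decision support": 4,
    "standardization with individualization": 4,
}
vendor_b = {
    "single system for all departments": 5,
    "CPOE support": 3,
    "evidence-based documentation support": 4,
    "point-of-care decision support": 3,
    "standardization with individualization": 5,
}

scores = {"Vendor A": weighted_score(vendor_a), "Vendor B": weighted_score(vendor_b)}
best = max(scores, key=scores.get)
```

Averaging each team member's ratings before weighting, as sketched here, keeps the selection transparent and lets the team see exactly which criteria drove the result.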
Next, we performed a crosswalk between the current system(s) and the new system. A dedicated team of clinicians and support persons looked at each piece of documentation and compared it to the new system to answer several questions:
1) Why are we charting this?
2) Is it required by regulatory agencies?
3) Is it required by HF policy?
4) Are we charting it just because we always have?
5) Is it a standard of care?
6) Is the current documentation included in the new system? If not, why not?
7) Is this evidence-based practice?
8) Will policies and procedures be impacted and need revision?
A master spreadsheet tracked the crosswalk findings, the follow-up needed, the timeline for follow-up, and the person responsible. Once the crosswalk was completed, IT performed the custom configuration that was needed.
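The crosswalk questions and the master tracking spreadsheet can be sketched as a simple record structure. This is only an illustration of the bookkeeping described above; the field names, the sample item, and the owner are hypothetical assumptions, not HF's actual spreadsheet columns.

```python
# Hypothetical sketch of one row of the crosswalk tracking spreadsheet.
# Field names and the sample data are illustrative, not HF's actual columns.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CrosswalkItem:
    element: str                    # documentation element being reviewed
    regulatory_required: bool       # required by regulatory agencies?
    policy_required: bool           # required by HF policy?
    standard_of_care: bool          # is it a standard of care?
    evidence_based: bool            # is this evidence-based practice?
    in_new_system: bool             # included in the new system?
    gap_note: str = ""              # if not in the new system, why not
    follow_up: str = ""             # follow-up needed
    due: Optional[date] = None      # timeline for follow-up
    owner: str = ""                 # person responsible

items = [
    CrosswalkItem(
        element="Fall risk assessment",
        regulatory_required=True,
        policy_required=True,
        standard_of_care=True,
        evidence_based=True,
        in_new_system=False,
        gap_note="Vendor form lacks a pediatric scale",
        follow_up="Request custom form from IT",
        due=date(2008, 3, 1),
        owner="Clinical Informatics",
    ),
    CrosswalkItem(
        element="Intake and output totals",
        regulatory_required=False,
        policy_required=True,
        standard_of_care=True,
        evidence_based=True,
        in_new_system=True,
    ),
]

# Gaps that must be resolved before IT can finish the custom configuration.
open_items = [i for i in items if not i.in_new_system]
```

Filtering the sheet for open gaps, as the last line does, is what drives the follow-up list handed to IT.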
The next step was extensive testing of the content and functionality. This team included Clinical Informatics, our clinical system educators, and selected clinicians. After testing was completed and all issues were resolved, the training plan was put together.
The training plan included the content, the length of classes, documented role-based competencies, and class schedules that accommodated the go-live schedule. At HF we have dedicated clinical system trainers who train all associates on any electronic system, which provides consistent class content and methods. During the training period we made sure the training environment was available on each unit for four to six weeks prior to that unit's go-live, giving staff extra time to practice.
Go-live was the last step in the implementation phase. Each unit/department was scheduled to go live on a Tuesday morning at 0700; this corresponded with shift change and left Monday for prepping. The prep involved Clinical Informatics being on the unit and working with staff to pre-build patients' charts in the new system with the basics, after which individual clinicians customized details such as drains. During go-live week there was scheduled support from Clinical Informatics, Clinical Practice Coordinators, superusers, and management; this support was tapered during the second week based on need.
HF has several mechanisms in place for ongoing support. Clinical Informatics associates are assigned to each facility and make daily rounds to answer questions about system functionality and computer issues. The Clinical Practice Coordinators are responsible for process issues, and the Help Desk is available 24/7. We also have an established change management process that addresses change requests.
Today, all nursing and ancillary departments within the hospitals are using the system successfully. In May 2008 our EDs, which currently use a niche system, will move to the new system. Our ORs' niche system is in partnership with the main system, and integration of that data is progressing. We have seen significant improvements in our quality outcome measures since adopting this system.
----------------------------------------------------------------------------------------------------------------------
11/15/07 Submitted by Angela Lewis