HMIs: Saving lives
The ambulance screams down the road, siren wailing. Cars, cyclists, and pedestrians hurry to get out of its way. Inside, the navigation system gives the driver turn-by-turn directions for the fastest route to the hospital. Behind him, an EMT works on the patient, using a hands-free system to communicate with the doctor at a nearby hospital.
The doctor, the EMT, and the driver are all working desperately to keep the patient alive, relying on devices with complex human-machine interfaces (HMIs).
The navigation system tracks the ambulance’s progress, providing just-in-time instructions to the driver.
“Turn right at the next street, NE 85th,” it says, the automated voice speaking loudly to be heard over the sound of the siren.
Ahead, the traffic light is red. The driver weaves between the cars partially blocking the street, moving quickly toward the intersection. Meanwhile, the navigation system communicates with the city’s smart traffic light system. By the time the driver reaches the corner, the light has already turned green.
He makes the turn.
Once the navigation system senses that the ambulance is through the intersection, it signals the traffic control system, returning the light to its regular programming.
The EMT and the doctor
“He’s going into shock. I’m seeing increased heart rate and respiratory rate,” the doctor says through the EMT’s earpiece. “Blood pressure 74/50, heart rate 144, respirations 38.”
The doctor sees all the patient information on her own terminal at the hospital, thanks to the data transmitted through the medical devices connected to the patient.
“Giving him 1,000 ccs now,” the EMT says, hooking up the IV fluid. The voice-activated communication system ensures that his hands are free at all times to work on the patient.
“Likely internal bleeding and organ damage. I’m going to look at rerouting you.”
From the hospital, the doctor checks the location of the ambulance on a touchscreen. With a few taps, she’s able to see that the ambulance can get to nearby City General Hospital without adding more than a few minutes. That hospital is better equipped to deal with the patient if he needs surgery, so it’s worth the slight delay. The application informs her that the hospital is not overwhelmed, despite the fact that several other patients from the same accident are en route.
“Rerouting you to City General,” she tells the EMT and then selects City General on the tablet device.
In the ambulance’s cab, the navigation system announces, “Rerouting to City General by physician’s order. Turn left at the next light, Maple Avenue.”
There’s an accident on what would be the fastest route to City General, so the navigation system is routing the driver to smaller side streets. The accident was already accounted for in the time estimate given to the doctor because all the systems are using the same data.
The driver follows the directions given by the computer.
In the back of the ambulance, the EMT is now connected to a City General physician. All the patient’s data, read by the medical devices connected to him, is being sent to City General. Responsibility for the patient has been transferred quickly and seamlessly, ensuring the best possible care as the ambulance speeds to its destination.
The patient’s vitals have improved. To get a head start on treatment, the City General physician asks the EMT to stream video of the patient, focusing on the chest area, so that the physician can visually assess the damage. The EMT can do this using a new app that was recently added to his onboard tablet device via a cloud update.
The EMT hasn’t tried to take a video in the field before, and the ambulance is jolting this way and that. He hopes he can figure out how to get a usable image.
“Video streaming,” he says to the tablet. The app opens and a shaky image of the patient appears on the screen.
The EMT quickly scans the interface—and finds that it’s quite familiar. It uses the same design as the data entry application he’s used to. He presses a button to begin streaming. The app is tied into the other systems the EMT is using, and it streams the video to the correct physician, as the EMT zooms in for a close-up of the chest area.
From the video, the physician at City General can already tell that the patient will need surgery. He notifies the thoracic team to start prepping.
The ambulance is at its destination only moments later, and the patient is in surgery shortly thereafter. He’ll have a long recovery, but he will live—because of the capable people tending to him and the advanced equipment they rely on.
Integrated systems, unified interfaces
This scenario is just one example of how human-machine interfaces can be designed to work in coordination to improve and even save lives. HMIs, combined with sensors, cloud data, and advanced communication systems, are changing our world for the better. Designers can help ensure success in scenarios like this by:
- Maintaining consistency in HMIs across devices and applications, so that users don’t have to learn a new interface for every application.
- Choosing appropriate interaction modes, such as voice activation when the user’s hands must stay free, and touch interfaces for fine control.
- Enabling seamless communication and coordination between connected users, via applications that share data and that take control or send commands as appropriate.
- Sensing and optimizing in real time, so that information such as traffic data and current status can be leveraged and systems can respond to one another as conditions change.
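The shared-data principle above can be sketched in miniature. The following is an illustrative sketch only; the class, topic names, and payloads are hypothetical and are not part of EB GUIDE or any real ambulance system. It shows how separate applications, such as a hospital dashboard and a navigation system, could subscribe to the same live feed so that everyone acts on identical data.

```python
# Hypothetical sketch: a tiny publish/subscribe hub that routes named events
# (e.g. ambulance vitals, position updates) to every subscribed application.
from collections import defaultdict
from typing import Callable


class TelemetryHub:
    """Delivers each published payload to all handlers subscribed to its topic."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(payload)


hub = TelemetryHub()
seen_by_doctor: list[dict] = []
seen_by_navigation: list[dict] = []

# The hospital dashboard and the navigation system each subscribe to the
# feed they need; both draw from the same hub, so their views never diverge.
hub.subscribe("ambulance/vitals", seen_by_doctor.append)
hub.subscribe("ambulance/position", seen_by_navigation.append)

hub.publish("ambulance/vitals", {"bp": "74/50", "hr": 144, "rr": 38})
hub.publish("ambulance/position", {"lat": 47.68, "lon": -122.20})
```

A production system would add transport, authentication, and failover, but the core idea is the same: one source of truth, many coordinated consumers.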
EB GUIDE is a reliable, sophisticated HMI development tool designed to allow you to create the interfaces that power these and many other scenarios.
You can try EB GUIDE for free by downloading our Community Edition.