A matte black, cube-shaped room with a holographic projection of a cadaver at a surgeon’s disposal? That beats a turn at the classic board game “Operation.”
Housed at the University of Michigan’s 3D Lab, the Michigan Immersive Digital Experience Nexus (MIDEN) is a small space whose walls and floor are formed by four display screens. These flat surfaces serve as a canvas for exploring the inside of the human frame. Armed with a controller not unlike those bundled with a video game console, and wearing a pair of glasses with three insect-like antennae pointed in different directions, students can step into the futuristic-looking space and prepare to perform an autopsy.
With a slight motion of the thumb on the controller’s right joystick, a healthy layer of the cadaver’s tissue slides away to expose the body’s yawning insides. The controls are that simple. Even the glasses are controls in their own right, tracking the position and orientation of the wearer’s head so the projection shifts with every glance. Through the lenses, each eye receives its own view, resolving into a three-dimensional human form. Your body is the controller in this place. Without the gear, an onlooker remains a mere audience member, watching a procedure on a body flattened like a Looney Tune run over by a steamroller.
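The interaction described above, a joystick axis peeling tissue away and head-tracked lenses giving each eye its own viewpoint, can be sketched in simplified form. This is purely illustrative: all names and values below are hypothetical, not taken from the MIDEN software.

```python
# Illustrative sketch (not the MIDEN codebase): a joystick axis drives a
# clipping plane through an anatomy model, and head tracking yields one
# viewpoint per eye for a stereoscopic image.
from dataclasses import dataclass


@dataclass
class ClippingPlane:
    """A plane that hides geometry on one side, exposing the layers beneath."""
    depth: float = 0.0       # how far the plane has cut into the body (meters)
    max_depth: float = 0.5   # cannot cut past the back of the model

    def update(self, joystick_y: float, dt: float, speed: float = 0.1) -> None:
        # Joystick axis in [-1, 1]: push forward to slide tissue away,
        # pull back to restore it. Clamped so the cut stays inside the body.
        self.depth += joystick_y * speed * dt
        self.depth = min(max(self.depth, 0.0), self.max_depth)


def eye_offsets(head_position, ipd: float = 0.064):
    """From a tracked head position, return left/right eye positions offset
    by half the interpupillary distance, so each lens sees its own view."""
    x, y, z = head_position
    half = ipd / 2.0
    return (x - half, y, z), (x + half, y, z)


plane = ClippingPlane()
plane.update(joystick_y=1.0, dt=0.5)      # half a second of full forward thumb
left_eye, right_eye = eye_offsets((0.0, 1.7, 0.0))
```

Per frame, a renderer would draw the scene twice, once from each eye position, discarding fragments in front of the clipping plane; the clamping keeps a careless thumb from cutting past the model entirely.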
I’m always interested to see handheld controller technology put to more practical uses than video game entertainment. It offers an ergonomic template for a remote control that could serve an array of industries. Might the MIDEN simulation, paired with this device, prove useful for training surgeons? I suppose that isn’t too novel a thought and, honestly, the idea still terrifies me more than it intrigues me. Nevertheless, the controller and its antennae could perhaps evolve into a commonly used surgical tool.
Admittedly, the multilayered simulated model is spooky in an uncanny sort of way; the 3D projection looks like it could be your Uncle Steve on an invisible slab. Yet this use of the app is one budding avenue for learning where tissue layers, organs, and skeleton sit in relation to one another in virtual space. And, for better or worse, it might become a means of desensitizing a future surgeon’s nerves before an operation on an actual person.
The app operates in what’s known as a CAVE environment – a projection-based virtual reality system – but the 3D Lab’s manager, Eric Maslowski, forecasts expansion of its training uses into entirely different fields. One look at the system makes such forecasts easy to believe. What exists now is the bare bones of the app, and it can only go further with new kinds of content.
Personally, I would love to treat that room to the classic science fiction shrinking trope, in which a cadre of super-science heroes is miniaturized to cure a patient’s terminal illness (or defuse the micro-sized nuclear device). The resolution just needs to stay crisp.
But could the technology someday offer a full-body scan that reproduces a person’s unique anatomy in 3D, specific ailments included? Might surgeons opt for a trial run of your organ transplant, recorded as a feature-length documentary that doubles as training material? New apps may spawn that visualize diseases in their internal or external manifestations. How about practicing limb amputations? Do you have what it takes to perform plastic surgery? Step into the virtual OR. Anyone can speculate on where the MIDEN crew can take the app, but it amazes me to see the hybridization of mediums: video game hardware and 3D simulation in the service of medicine. It amazes me to see how the technology can help turn the student into the professional.
For more information, visit the University of Michigan 3D Lab.