Most of us have played in a sandbox. Who can forget the feeling of those tiny grains flowing through our fingertips while waiting to dig the next hole or shovel the next pile into a bucket?
But how many can say they’ve dug their hands into an augmented reality (AR) sandbox?
The prototype of the AR sandbox was developed in the early 1990s at the Keck Center for Active Visualization in Earth Science at the University of California, Davis, thanks to a grant from the National Science Foundation. It was created by a team of computer scientists, science educators, exhibit designers and other professionals.
“They hoped it would improve public understanding of freshwater lake ecosystems by using 3-D visualizations,” explained Ted Kucharski, a structural engineer in Northrop Grumman’s Aerospace Systems sector who has built two AR sandboxes at the company’s FabLab (Fabrication Laboratory) in Redondo Beach, California. UC Davis computer scientist Dr. Oliver Kreylos designed and programmed the first AR sandbox’s software, setting the stage for future systems like the ones Kucharski built.
What’s the difference between augmented reality and virtual reality?
“Augmented reality is a live view of a real-world environment augmented by computer-generated information. It modifies or enhances physical perception,” Kucharski said. “It’s different from virtual reality, which simulates the physical environment with an environment that only exists inside a computer and is viewed with VR goggles.”
In Kucharski’s case, the first AR sandbox he built was a 40-by-30-inch box that contained tiny mountains and equipment to measure sand surface profile and project a contour map in real time. When users poured sand on a mountain to make it taller or flattened a mountain to make it smaller, a computer calculated the difference in height. Users could also place their hands over the landscape to create a shadow, which in turn set off virtual rainstorms and simulated water that flowed down the mountainsides to flood the lowlands below.
How does the augmented reality sandbox work?
Kucharski explained that an AR sandbox uses a computer, a projector and a motion-sensing device, the Xbox 360 Kinect sensor, which provides near real-time measurement of the sand's elevation. The Kinect sensor measures the distance to the sand surface, the computer converts those distances into color-graduated contour lines, and the projector displays the lines on the sand. Whenever sand is displaced, the projected contour-plot image updates in near real time.
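The pipeline described above — depth measurement, elevation conversion, contour coloring — can be sketched in a few lines. This is a minimal illustration, not the AR sandbox's actual software; the sensor height, depth values and band size are invented for the example.

```python
# Sketch of the core loop: read a depth frame from the sensor, convert
# depth (distance to the sand) into elevation, then quantize elevation
# into discrete contour bands, one projected color per band.
# All numbers here are illustrative assumptions.

def depth_to_elevation(depth_mm, sensor_height_mm=1000):
    """Elevation of the sand surface: sensor height minus measured distance."""
    return [[sensor_height_mm - d for d in row] for row in depth_mm]

def contour_band(elev_mm, band_size_mm=20):
    """Quantize elevation into a contour band index (one color per band)."""
    return elev_mm // band_size_mm

def contour_map(depth_mm):
    """Full frame: depth readings in, contour band indices out."""
    return [[contour_band(e) for e in row] for row in depth_to_elevation(depth_mm)]

# Example: a 2x3 patch of depth readings (mm from sensor to sand).
# Smaller depth means taller sand, hence a higher band number.
frame = [[950, 930, 910],
         [960, 940, 900]]
print(contour_map(frame))  # → [[2, 3, 4], [2, 3, 5]]
```

In the real system this loop runs continuously, which is why the projected map appears to follow your hands as you push the sand around.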
“The interaction with the computer through the sand, instead of with a keyboard and mouse, is a surreal experience. It’s captivating and puts a smile on everyone’s face when the virtual rain pours down the terrain,” Kucharski said. “It’s hard to explain the experience without actually going into the sandbox and pushing sand around.”
As part of the company’s Alignment Engineering Group, which has measurement devices covering everything from theodolites to laser trackers to photogrammetry, subject matter expert Kucharski is constantly searching for new and better ways to measure physical objects. “I became interested [in AR sandbox technology] because I like the real-time display of 3-D measurement data,” he said. “Typically our measurement data is provided in a tabulated spreadsheet, so I’m always looking for a more interesting way to present that data. And this is about as interesting as it gets for now.”
It all came together this past January, when Kucharski spotted the perfect platform to begin construction at the FabLab in Redondo Beach. He pitched his idea to Northrop Grumman Innovation Manager Tony Long, who gave him the green light. Within four months, Kucharski built two devices: one for an annual conference showcasing the company’s technology and another for Take Our Daughters and Sons to Work Day.
“I couldn’t wait to get started,” he said. “The weekend after I got approval, I was cutting wood, drilling pieces, bolting parts together and making trips to the electronics and hardware stores. I couldn’t believe my eyes when I got the software to work and the projector illuminated the sand for the first time.
“The FabLab is a great environment to put a project like this together. I’ve had a fantastic time building, demonstrating and watching others enjoy the AR sandbox.”
What are the practical applications for an augmented reality sandbox?
Beyond enjoyment, Kucharski sees many practical applications for the AR sandbox, including using it, much like a 3-D printer, to assist in making a mold from layers of materials such as sand or clay. He's also looking at taking drone-measured information and re-creating the terrain with the AR sandbox. That way Kucharski and other researchers can explore aspects of land use, such as where water would flow during and after a rainstorm.
“The key is getting the desired profile projected in three colors as a sculpting guide. A lot of 3-D terrain data already exists; it just needs to be converted to a compatible format,” he said. “I have many ideas for utilizing imported data.”
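One small step in the conversion Kucharski mentions is rescaling existing elevation data into the sandbox's physical depth range, so the tallest surveyed peak corresponds to the tallest sand pile the box can hold. This sketch is an illustrative assumption about that step; the function name, units and depth range are invented, not the actual file format or software.

```python
# Linearly rescale arbitrary elevation samples (e.g. drone-surveyed
# meters) into a hypothetical sandbox depth range in millimeters.

def rescale_terrain(elev, sand_min_mm=0, sand_max_mm=150):
    """Map the input's min..max elevation onto sand_min_mm..sand_max_mm."""
    flat = [e for row in elev for e in row]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1  # avoid dividing by zero on perfectly flat terrain
    scale = (sand_max_mm - sand_min_mm) / span
    return [[sand_min_mm + (e - lo) * scale for e in row] for row in elev]

# Example: a tiny 2x2 grid of surveyed elevations in meters.
survey = [[120.0, 180.0],
          [150.0, 270.0]]
print(rescale_terrain(survey))  # → [[0.0, 60.0], [30.0, 150.0]]
```

A real conversion would also resample the data to the sensor's grid resolution and write whatever format the sandbox software reads, but the scaling idea is the same.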
Kucharski also would like to see this type of technology incorporated into Northrop Grumman’s factory modernization to provide real-time feedback in the measurement process. “It really could be useful in areas such as cable and harness placement during spacecraft integration,” he said. “The game-changing aspect of it is the way we’re interacting with a computer: mouse and keyboard are not required during operation for input; a monitor is not required for output. The AR sandbox alters the way we typically interface with a computer and perceive reality.”
Are you interested in an engineering career? Northrop Grumman hires engineers in many areas of specialty, including structural engineering and other high-tech areas where you can explore augmented reality and have access to the FabLab!