One of the biggest reasons virtual reality hasn’t taken off is the clunky headsets that users have to wear. But what if you could get the benefits of virtual reality without the headsets, using screens that computationally improve the images they display?
That’s the goal of the startup Brelyon, which is commercializing a new kind of display and content-rendering approach that immerses users in virtual worlds without requiring them to strap goggles onto their heads.
The displays run light through a processing layer before it reaches users’ eyes, recalculating the image to create ultrawide visual experiences with depth. The company is also working on a new kind of content-rendering architecture to generate more visually efficient imagery. The result is a 120-inch screen that simulates the sensation of looking out a window into a virtual world, where content pops in and out of existence at different angles and depths, depending on what you feed the display.
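The article doesn’t disclose how Brelyon’s processing layer works internally, but a toy sketch can make the “different angles and depths” idea concrete. The snippet below is illustrative Python only, not Brelyon’s method: the made-up `render_view` helper shifts pixels by a depth-dependent parallax, so each viewing angle sees a slightly different image and nearer content appears to move more.

```python
# Toy sketch (not Brelyon's actual optics): recompute an image per viewing
# angle so content appears to sit at different depths. Pixels assumed to be
# farther away shift less between views (smaller parallax).
import numpy as np

def render_view(image: np.ndarray, depth: np.ndarray, view_angle: float) -> np.ndarray:
    """Shift each pixel horizontally by a depth-dependent disparity."""
    h, w = image.shape
    out = np.zeros_like(image)
    # Disparity shrinks with depth: near content moves more across views.
    disparity = (view_angle / (depth + 1e-6)).astype(int)
    cols = np.clip(np.arange(w)[None, :] + disparity, 0, w - 1)
    rows = np.arange(h)[:, None]
    out[rows, cols] = image
    return out

# A flat test image with two "objects" at different assumed depths.
img = np.zeros((4, 16)); img[:, 3] = 1.0; img[:, 12] = 2.0
depth = np.ones_like(img); depth[:, 12] = 4.0   # second object is farther away
print(render_view(img, depth, view_angle=8.0))
```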
“Our current displays use different properties of light, specifically the wavefront of the electric field,” says Brelyon co-founder and CEO Barmak Heshmat, a former postdoc in the Media Lab. “In our newest architecture, the display uses a stack of shader programming empowered with inference microservices to modify and generate content on the fly, amplifying your immersion with the screens.”
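Heshmat’s “stack of shader programming” suggests a pipeline of composable per-frame passes, with at least one pass calling out to a model. The following minimal sketch works under that assumption; the pass names and the `run_stack` helper are hypothetical, not Brelyon’s API.

```python
# Illustrative only: a frame pipeline modeled as composable passes, where one
# pass stands in for an inference microservice that modifies content on the fly.
from typing import Callable, List
import numpy as np

Frame = np.ndarray
Pass = Callable[[Frame], Frame]

def run_stack(frame: Frame, passes: List[Pass]) -> Frame:
    for p in passes:                      # each pass transforms the frame in order
        frame = p(frame)
    return frame

def tone_map(frame: Frame) -> Frame:
    return frame / (1.0 + frame)          # simple Reinhard-style curve

def inference_pass(frame: Frame) -> Frame:
    # Placeholder for a model call; here it just perturbs the frame.
    return frame + 0.1 * np.random.rand(*frame.shape)

frame = np.random.rand(2, 2)
print(run_stack(frame, [inference_pass, tone_map]))
```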
Customers are already using Brelyon’s current displays in flight simulators, gaming, defense, and teleoperations, and Heshmat says the company is actively scaling its manufacturing capacity to meet growing demand.
“Wherever you want to increase visual efficiency with screens, Brelyon can help,” Heshmat says. “Optically, these virtual displays allow us to craft a much larger, control-center-like experience without needing added space or wearing headsets, and at the compute level our rerendering architectures allow us to use every bit of that screen in the most efficient way.”
Of light and math
Heshmat came to MIT in 2013 as a postdoc in the Media Lab’s Camera Culture group, which is directed by Associate Professor Ramesh Raskar. At the Media Lab, Heshmat worked on computational imaging, which he describes as “combining mathematics with the physics of light to do interesting things.”
With Raskar, Heshmat worked on a new approach to improving ultrafast cameras that used time as an extra dimension in optical design.
“The system essentially sent light through an array of mirrors to make the photons bounce many times inside the camera,” Heshmat explains. “It allowed us to capture the image at many different times.”
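As a rough back-of-the-envelope illustration of that idea (the published system’s geometry isn’t given here, so the mirror spacing below is an assumption), each extra round trip between the mirrors adds a fixed path length, and therefore a fixed delay, to the light reaching the sensor:

```python
# Hedged toy model of bouncing light between mirrors: every round trip adds
# 2 * CAVITY of path length, so one exposure samples the scene at several times.
C = 3e8            # speed of light, m/s
CAVITY = 0.15      # assumed mirror spacing, m (illustrative, not the paper's value)

def arrival_time(path_to_camera: float, bounces: int) -> float:
    """Time for light to arrive after a given number of cavity round trips."""
    return (path_to_camera + 2 * bounces * CAVITY) / C

for k in range(4):
    print(f"{k} bounces -> {arrival_time(1.0, k) * 1e9:.2f} ns")
```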
Heshmat worked across campus, ultimately publishing papers with five different professors, and says his experience at MIT helped change the way he perceived himself.
“There were many things that I took from MIT,” Heshmat says. “Beyond the technical expertise, I also got the confidence and belief that I could be a leader. That’s what’s different about MIT compared to other schools: It’s a very vibrant, intellectually triggering environment where everyone’s very driven and everyone’s creating their own universe, in a sense.”
After leaving MIT, Heshmat worked at a virtual reality company, where he noticed that people liked the idea of virtual reality but didn’t like wearing headsets. The observation led him to explore ways of achieving immersion without strapping a device to users’ heads.
The idea brought him back to his research with Raskar at MIT.
“There’s this relationship between imaging and displays; they’re kind of like a dual of each other,” Heshmat explains. “What you can do with imaging, the inverse of it is doable with displays. Since I’d worked on this imaging system at MIT, what’s called time-folded imaging, I thought to try the inverse of that in the world of displays. That was how Brelyon started.”
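One way to make that duality concrete: if a camera’s forward model is y = Ax for some light-transport operator A, a computational display runs in the inverse direction, solving for a panel pattern d so that Ad matches a target image. The sketch below uses a random matrix as a stand-in for A; it illustrates the duality in general, not time-folded imaging specifically.

```python
# Hedged illustration of the imaging/display duality: imaging applies a
# light-transport matrix A; a computational display inverts it.
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((6, 4))          # toy light-transport operator (assumed known)
target = rng.random(6)          # image we want the viewer's eye to receive

# Imaging direction: given a scene x, the camera records y = A @ x.
x = rng.random(4)
y = A @ x

# Display direction: given a target y, solve for the drive pattern d.
d, *_ = np.linalg.lstsq(A, target, rcond=None)
print("residual:", np.linalg.norm(A @ d - target))
```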
Brelyon’s first check came from the MIT-affiliated E14 Fund after Heshmat built a prototype of the first device in his living room.
Brelyon’s displays control the angles and focus of light to simulate wide, deep views and give the impression of looking through a window. Brelyon currently sells two models, Ultra Reality and Ultra Reality Mini. The Ultra Reality produces a 10-foot-wide image with a depth of around 3 feet. The displays are fully compatible with standard laptops and computers: users can connect their devices via an HDMI cable and run their favorite simulation or gaming software right away, which Heshmat notes is a key benefit over traditional headset-based virtual reality systems that require companies to create custom software.
“This is a plug-and-play solution that is much smaller than setting up a projection screen, doesn’t require a dedicated room, doesn’t require a special environment, doesn’t need alignment of projectors or any of that,” Heshmat says.
Processing light
Heshmat says Brelyon has sold displays to some of the largest simulation training companies in the world.
“In simulation training, you usually care about large visualizations and large peripheral fields of view, or situational awareness,” Heshmat says. “That allows you to look around in, say, the cockpit of the airplane. Brelyon allows you to do that in the size of a single desktop monitor.”
Brelyon has been focused on selling its displays to other businesses to date, but Heshmat hopes to eventually sell to individuals and believes the company’s displays hold huge potential for anyone who wants to improve the experience of looking at a monitor.
“Imagine you’re sitting in the backseat of a car, and instead of looking at a 12-inch tablet, you have this 14-inch or 12-inch aperture, but this aperture is looking into a much larger image, so you have a window to an IMAX theater,” Heshmat says.
Ultimately, Heshmat believes Brelyon is opening up a new platform to change the way we perceive the digital world.
“We are adding a new layer of control between the world of computers and what your eyes see,” Heshmat explains. “We have this new photon-processing layer on top of displays, and we think we’re bridging the gap between the experience that you see and the world of computers. We’re trying to connect that programming all the way to the end processing of photons. There are some exciting opportunities that come from that. The displays of the future won’t just let light out like an array of lamps. They’ll run light through these photon processors and allow you to do much more with light.”