
MIT deep learning project generates real-time holograms on an iPhone


An MIT project supported by Sony has demonstrated that photorealistic 3D colour holograms can now be generated quickly, even on a smartphone. Generating the vast amounts of data required for stable holography has long been considered a significant 3D visualisation challenge, so this could represent a major breakthrough. Obvious applications include VR and AR headsets and smart spectacles.

Previous attempts to slim down the amount of data required to drive holograms have involved replacing complex physics simulations with simple lookup tables, but this degrades image quality. The MIT approach instead uses deep learning, in the form of a convolutional neural network, to mimic the kind of shortcuts the human brain takes when tackling the problem.
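To make the idea concrete, the sketch below shows the general shape of such a network: a small convolutional stack that maps an RGB-D (colour plus depth) image patch to per-pixel amplitude and phase maps of a hologram. This is a toy illustration only, not the MIT architecture; all layer sizes, weights, and names here are invented for illustration, and the real model is trained on simulated hologram data rather than using random weights.

```python
import numpy as np

def conv2d(x, kernels, bias):
    """Naive 'same'-padded 2-D convolution.
    x: (C_in, H, W), kernels: (C_out, C_in, k, k), bias: (C_out,)."""
    c_out, c_in, k, _ = kernels.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    h, w = x.shape[1:]
    out = np.zeros((c_out, h, w))
    for o in range(c_out):
        for i in range(c_in):
            for dy in range(k):
                for dx in range(k):
                    out[o] += kernels[o, i, dy, dx] * xp[i, dy:dy + h, dx:dx + w]
        out[o] += bias[o]
    return out

rng = np.random.default_rng(0)
rgbd = rng.random((4, 32, 32))                 # hypothetical RGB-D input patch
w1 = rng.standard_normal((8, 4, 3, 3)) * 0.1   # toy first-layer filters
w2 = rng.standard_normal((2, 8, 3, 3)) * 0.1   # toy output-layer filters

hidden = np.maximum(conv2d(rgbd, w1, np.zeros(8)), 0)  # ReLU activation
amp_phase = conv2d(hidden, w2, np.zeros(2))            # amplitude + phase channels
phase = np.pi * np.tanh(amp_phase[1])                  # squash phase into (-pi, pi)
```

The appeal of this formulation is that a forward pass through a few small convolution layers is vastly cheaper than a full wave-propagation simulation, which is what makes smartphone-class hardware plausible.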

The system uses a paltry 620 kilobytes of memory and a consumer-grade GPU to generate sixty 1920 x 1080 colour 3D holograms per second. An iPhone 11 Pro can generate just over one hologram per second, while a Google TPU can do double that, suggesting future VR- and AR-enabled handsets will be able to achieve real-time performance too.

Go here for a more detailed description of the researchers’ work.

Source: IEEE Spectrum
