Google Research advances NeRF (Neural Radiance Fields) technology, using crowdsourced tourist photos to create detailed 3D scene reconstructions
Traditional NeRF copes best with static, controlled image sets and struggles with the wide range of lighting found across photos taken at different times. NeRF in the Wild (NeRF-W) has been developed to handle far more varied lighting conditions, viewing angles and camera settings. Google's neural network can now reconcile appearance variations between the photos it uses as its source, and even remove transient foreign objects such as vehicles. Highly detailed 3D environments can now be created simply by trawling the Internet for holiday snaps people have taken of the target objects or locations.
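The core idea described in the NeRF-W paper is that each training photo gets its own learned appearance embedding that conditions only the colour branch of the network, so per-photo lighting and exposure differences are absorbed by the embedding rather than baked into the shared scene geometry (the full model also adds a transient branch, not shown here, that explains away photo-specific occluders such as vehicles and pedestrians). The sketch below illustrates that conditioning in PyTorch; it is a minimal, illustrative approximation, and all class, layer and parameter names are assumptions rather than Google's actual code.

```python
import torch
import torch.nn as nn

class NeRFWSketch(nn.Module):
    """Toy NeRF-in-the-Wild-style radiance field (illustrative only).

    Density depends only on position, so geometry is shared across all
    photos; colour is additionally conditioned on a per-image appearance
    embedding, letting the model absorb lighting and exposure differences
    between crowdsourced photos.
    """
    def __init__(self, num_images, pos_dim=63, dir_dim=27,
                 appearance_dim=48, hidden=256):
        super().__init__()
        # One learned appearance vector per training photo.
        self.appearance = nn.Embedding(num_images, appearance_dim)
        # Shared "static scene" trunk: encoded xyz -> features.
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.density_head = nn.Linear(hidden, 1)  # sigma (geometry)
        # Colour head sees view direction + per-photo appearance code.
        self.color_head = nn.Sequential(
            nn.Linear(hidden + dir_dim + appearance_dim, hidden // 2),
            nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )

    def forward(self, pos_enc, dir_enc, image_ids):
        feat = self.trunk(pos_enc)
        sigma = torch.relu(self.density_head(feat))  # non-negative density
        app = self.appearance(image_ids)              # per-photo embedding
        rgb = self.color_head(torch.cat([feat, dir_enc, app], dim=-1))
        return rgb, sigma


# Query a batch of 1024 sample points, all attributed to photo #7.
model = NeRFWSketch(num_images=1000)
pos = torch.randn(1024, 63)
dirs = torch.randn(1024, 27)
ids = torch.full((1024,), 7, dtype=torch.long)
rgb, sigma = model(pos, dirs, ids)
```

Because geometry ignores the appearance code, rendering the same scene with different embeddings changes only lighting and colour, which is how a single model trained on mismatched holiday photos can still produce a consistent 3D reconstruction.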
Source: RedShark News