How far off are we from creating a city like NYC that would be indistinguishable from real life?
The surface area of NYC is about 1,213 sq km, which is 1,213,000,000 sq m. Of course the world isn't flat, so if we gave volumetric shape to everything we would need at least 10x the surface area, so about 12,130,000,000 sq m to be textured. For a square-meter texture to be indistinguishable from real life it would need to be about 8192x8192 pixels, which at 40 bits of data per pixel takes up about 320MB. That means we need roughly 4 Exabytes just to store the textures! The polygonal data would probably be around the same, so that makes it roughly 8 Exabytes just for storage. Of course this is uncompressed data, but it is still huge! The rendering power needed would also be counted in Exaflops. So my guess is that we are anywhere from 15-20 years away from being able to do something like that.
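Here's a quick sanity check of that estimate in Python. The 10x surface multiplier, 8192x8192 texels per square meter, and 40 bits per pixel are the assumptions from the paragraph above, not measured values:

```python
# Back-of-envelope texture storage estimate for NYC
# Assumptions (from the post): 10x multiplier for non-flat geometry,
# 8192x8192 pixels per square meter, 40 bits per pixel.

area_m2 = 1_213_000_000            # NYC footprint, ~1,213 sq km
textured_m2 = area_m2 * 10         # rough 10x factor for volumetric shapes

pixels_per_m2 = 8192 * 8192
bytes_per_pixel = 40 / 8           # 40 bits = 5 bytes
bytes_per_m2 = pixels_per_m2 * bytes_per_pixel   # ~335 MB (i.e. 320 MiB)

texture_bytes = textured_m2 * bytes_per_m2
print(f"textures alone: {texture_bytes / 1e18:.1f} EB")          # ~4.1 EB
print(f"with geometry (2x): {2 * texture_bytes / 1e18:.1f} EB")  # ~8.1 EB
```

So uncompressed, the textures alone land around 4 exabytes, and doubling for geometry puts it near 8.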
If we were to use voxels instead of the current polygonal approach, each cubic meter at 8192x8192x8192 voxels (at just 1 byte per voxel) would need 512GB of data! :shock: If we enclosed NYC in a box a kilometer tall, it would take up 1,213,000,000,000 cubic meters. That means we'd need roughly 0.7 Yottabytes (about 670 Zettabytes) of data to store NYC as an ultra-high-definition minecraft world :shock: :shock: :shock:
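And the same kind of check for the voxel version. The 1 byte per voxel and the 1 km bounding-box height are assumptions implied by the numbers in the paragraph above:

```python
# Voxel storage estimate for NYC
# Assumptions: 8192^3 voxels per cubic meter, 1 byte per voxel,
# and a bounding box ~1 km tall over NYC's ~1,213 sq km footprint.

voxels_per_m3 = 8192 ** 3              # ~5.5e11 voxels
bytes_per_m3 = voxels_per_m3 * 1       # 1 byte/voxel -> exactly 512 GiB
volume_m3 = 1_213_000_000 * 1000       # footprint x 1 km height

total_bytes = volume_m3 * bytes_per_m3
print(f"per m^3: {bytes_per_m3 / 2**30:.0f} GiB")      # 512 GiB
print(f"total: {total_bytes / 1e21:.0f} ZB")           # ~667 ZB, ~0.67 YB
```

Even at a single byte per voxel it comes out to hundreds of zettabytes; a 40-bit voxel like the texture estimate would multiply that by five again.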
If Google Earth implements a feature that computes everything in HD 3D (it's already possible with Google Street View and the Oculus Rift), then you could probably walk through cities in 3D with an Oculus Rift...
But why use Oculus VR for a real-life city? There are thousands of games coming out where you can walk through even better places and big cities 😉