Intrepid individuals throughout history have attempted to conquer the skies using everything from pedal-powered planes to rocket-powered jet packs.
Arguably the closest a human has ever come to non-mechanically aided flight, however, is wingsuit BASE jumping – also known as wingsuit flying.
This highly dangerous and technically difficult sport involves BASE jumping from a high point while dressed in a webbing-sleeved jumpsuit that enables the wearer to glide rather than free fall.
Requiring years of skydiving and BASE jumping experience – and with a fatality rate of 1 in 500 jumps – wingsuit BASE jumping is a pursuit that has, until now, been beyond the reach of 99.9% of the population.
JUMP is the world’s first hyperreal wingsuit simulator, combining a real wingsuit, a virtual reality helmet and a mix of suspension, wind effects and hyperreal multi-sensory stimulation.
It is the brainchild of chief executive and founder James Jensen, who was part of the team that set up The VOID, one of the first walking VR simulation companies.
Jensen assembled a team, and between 2019 and 2021, they built a prototype simulator. That led to a working installation in Bluffdale, Utah, which has now been operating for more than four months and has flown more than 5,000 people. “I’ve never skydived or BASE jumped,” says Jensen. “I rely on my professional athletes to tell me this is real—they’ve said it’s about 85% there. We’re pushing for 100%.”
JUMP takes the flyer into hyper-detailed 3D landscapes of some of the world’s most breathtaking BASE jumps, including Notch Peak in the US. To achieve this, the JUMP team flew a helicopter kitted out with top-of-the-range cameras, spending two days capturing thousands of ultra-high-resolution images of the scenery below.
The images were processed using the latest version of the RealityCapture photogrammetry tool, which enables ultra-realistic 3D models to be created from sets of images and/or laser scans.
Reconstructing the 58,000 images captured required five supercomputers. The team also used precise information from gyroscopes and other sensors to create a high-precision custom flight log. The result was an incredibly detailed digital model of the environment of more than 8 billion polygons across 10 square miles.
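As a rough sense of scale, a back-of-the-envelope calculation using only the figures quoted above (with the square-mile conversion as the one outside assumption) puts the capture at roughly 300 polygons per square metre of terrain:

```cpp
#include <cstdio>

int main() {
    // Figures quoted in the article.
    const double polygons   = 8e9;      // total polygons in the digital model
    const double areaSqMi   = 10.0;     // coverage in square miles
    const double images     = 58000.0;  // photographs reconstructed

    // Standard conversion: 1 square mile = 2,589,988 square metres.
    const double areaM2 = areaSqMi * 2.589988e6;

    std::printf("polygons per square metre: %.0f\n", polygons / areaM2);
    std::printf("polygons per source image: %.0f\n", polygons / images);
    return 0;
}
// Prints roughly 309 polygons per square metre and ~138,000 polygons per image.
```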
The next step was to bring the immense dataset into Unreal Engine 5. “It took some work from the RealityCapture team, but in the end, we developed some new tools that helped chop up these massive data sets and assign the appropriate textures and materials,” says Jensen.
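The article doesn’t describe how those tools work internally; the sketch below only illustrates the general idea of chopping a huge triangle mesh into spatial tiles so each part can be exported, textured and imported separately. The tile size, data layout and function name are assumptions for illustration.

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

struct Vec3 { double x, y, z; };
struct Triangle { std::array<Vec3, 3> v; };

// Group triangles into square ground tiles keyed by (i, j) grid coordinates,
// so each tile can later become its own "part" with its own textures.
std::map<std::pair<int64_t, int64_t>, std::vector<Triangle>>
chopIntoTiles(const std::vector<Triangle>& mesh, double tileSizeMetres)
{
    std::map<std::pair<int64_t, int64_t>, std::vector<Triangle>> tiles;
    for (const Triangle& t : mesh) {
        // Assign each triangle to the tile containing its centroid.
        const double cx = (t.v[0].x + t.v[1].x + t.v[2].x) / 3.0;
        const double cy = (t.v[0].y + t.v[1].y + t.v[2].y) / 3.0;
        const auto key = std::make_pair(
            static_cast<int64_t>(std::floor(cx / tileSizeMetres)),
            static_cast<int64_t>(std::floor(cy / tileSizeMetres)));
        tiles[key].push_back(t);
    }
    return tiles;
}
```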
The team leveraged Nanite, Unreal Engine 5’s virtualised micro-polygon geometry system, to handle the import and replication of the multimillion-polygon mesh while maintaining a real-time frame rate without any noticeable loss of fidelity.
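Nanite’s internals aren’t described here, but the core idea of virtualised geometry – rendering only as much detail as the screen can resolve – can be sketched as a screen-space error test per geometry cluster. The cluster representation and one-pixel threshold below are illustrative assumptions, not Nanite’s actual data structures.

```cpp
#include <cmath>

// A simplified stand-in for a geometry cluster: its distance from the camera
// and the worst-case geometric error (in metres) introduced by drawing its
// coarser level of detail instead of the full-detail triangles.
struct Cluster {
    double distanceMetres;
    double geometricErrorMetres;
};

// Project the cluster's geometric error onto the screen and refine only if
// that error would actually be visible, i.e. larger than about one pixel.
bool shouldUseFullDetail(const Cluster& c,
                         double verticalFovRadians,
                         double screenHeightPixels)
{
    const double pixelsPerMetreAtDistance =
        screenHeightPixels /
        (2.0 * c.distanceMetres * std::tan(verticalFovRadians / 2.0));
    const double errorInPixels = c.geometricErrorMetres * pixelsPerMetreAtDistance;
    return errorInPixels > 1.0;  // assumed threshold: refine past ~1 pixel of error
}
```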
For the lighting and shadows, the team harnessed the power of Lumen, a fully dynamic global illumination system in Unreal Engine 5 that enables indirect lighting to adapt on the fly to changes to direct lighting or geometry.
“Because we are looking for full photorealism, we are leaning heavily into Nanite and Lumen to make our scenes come to life,” says Jensen. “We currently have the largest dataset in Nanite at 8 billion polygons – more than 700 parts and 16K textures per part.”
Jensen explains that features such as these are the reason JUMP used Unreal Engine to create the experience. “Unreal Engine is just flat-out leading the industry in high-resolution real-time simulations,” he says.
“Seeing the things that I used to do in video production that would take days, even weeks, and months to render now all happen in real time is unbelievable. Polygon count has always been a bottleneck, and global illumination with Lumen – it’s mind-blowing to see in real time.”
The JUMP team filled out the virtual environment with shrubs, trees, grass and other objects from Quixel Megascans, a scans library offering photorealistic 3D scanned tileable surfaces, textures, vegetation and other high-fidelity CG assets that is included with Unreal Engine 5.
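The article doesn’t say how those assets were placed; below is a minimal sketch of one common approach – randomly scattering foliage instances over the terrain at a target density – with the density, seed, scale range and height lookup all assumed purely for illustration.

```cpp
#include <cstddef>
#include <random>
#include <vector>

struct FoliageInstance { double x, y, z, yawRadians, scale; };

// Scatter roughly `densityPerM2` instances per square metre over a rectangular
// area, sampling the terrain height at each point. `heightAt` is a stand-in
// for whatever terrain height query the real project would use.
std::vector<FoliageInstance> scatterFoliage(double widthMetres,
                                            double depthMetres,
                                            double densityPerM2,
                                            double (*heightAt)(double, double),
                                            unsigned seed = 42)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> ux(0.0, widthMetres);
    std::uniform_real_distribution<double> uy(0.0, depthMetres);
    std::uniform_real_distribution<double> uyaw(0.0, 6.283185307);
    std::uniform_real_distribution<double> uscale(0.8, 1.2);

    const std::size_t count =
        static_cast<std::size_t>(widthMetres * depthMetres * densityPerM2);
    std::vector<FoliageInstance> out;
    out.reserve(count);
    for (std::size_t i = 0; i < count; ++i) {
        const double x = ux(rng), y = uy(rng);
        out.push_back({x, y, heightAt(x, y), uyaw(rng), uscale(rng)});
    }
    return out;
}
```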
They also developed their own physics engine, FLIGHT, which handles all of the configurations and physics for both the physical and digital worlds. Blender and Maya were used for the 3D art.
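FLIGHT itself isn’t documented in the article; the sketch below only shows the textbook lift-and-drag model a simplified wingsuit glide simulation might integrate, with every coefficient, the time step and the 30-second horizon chosen purely for illustration rather than taken from the JUMP simulator.

```cpp
#include <cmath>
#include <cstdio>

// Minimal 2D point-mass wingsuit glide integrator using the standard
// aerodynamic relations L = 0.5*rho*v^2*S*Cl and D = 0.5*rho*v^2*S*Cd.
int main() {
    const double mass = 90.0;   // flyer plus suit, kg (assumed)
    const double g    = 9.81;   // gravity, m/s^2
    const double rho  = 1.0;    // air density at altitude, kg/m^3 (assumed)
    const double S    = 1.4;    // wingsuit reference area, m^2 (assumed)
    const double Cl   = 0.9;    // lift coefficient (assumed)
    const double Cd   = 0.35;   // drag coefficient (assumed)
    const double dt   = 0.01;   // integration step, s

    double vx = 0.0, vz = 0.0;  // horizontal and vertical speed, z positive down
    for (double t = 0.0; t < 30.0; t += dt) {
        const double v = std::sqrt(vx * vx + vz * vz);
        const double q = 0.5 * rho * v * v * S;          // dynamic pressure * area
        const double dirx = (v > 1e-6) ? vx / v : 0.0;   // unit velocity direction
        const double dirz = (v > 1e-6) ? vz / v : 1.0;
        // Drag opposes motion; lift is perpendicular to it, tilted upward.
        const double ax = (-q * Cd * dirx + q * Cl * dirz) / mass;
        const double az = g + (-q * Cd * dirz - q * Cl * dirx) / mass;
        vx += ax * dt;
        vz += az * dt;
    }
    std::printf("glide ratio after 30 s: %.2f (horizontal / vertical speed)\n",
                vz > 0.0 ? vx / vz : 0.0);
    return 0;
}
```

In this simplified model the glide settles toward a ratio of roughly Cl/Cd, which is why wingsuit pilots describe performance in terms of horizontal distance covered per metre of altitude lost.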
The result is an awe-inspiring virtual world realistic enough to trick flyers into believing they are standing on the precipice of a 1,200m drop.
But to create the fully immersive sensation of real flight, what is seen in the VR headset needs to be combined with a real wingsuit, suspension system, wind effects and multi-sensory stimulation.
“Physical effects are essential in being able to mimic reality,” says Jensen. “When you can synchronise physical sensation with visuals and audio, you go to a whole different dimension in virtual reality simulations.”
The simulation’s haptics are triggered by events in the virtual environment. “We’ve written custom code inside Unreal Engine specifically for moments within the wingsuit BASE jumping experience that initiate signals for scent, wind speed, haptic pattern effects, sound effects and physical objects,” explains Jensen.
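The custom code itself isn’t shown in the article; a minimal, engine-agnostic sketch of the pattern it describes – named simulation moments fanned out to scent, wind, haptic and sound channels – might look like the following. All event names, cue values and device labels here are hypothetical.

```cpp
#include <cstdio>
#include <string>
#include <unordered_map>
#include <utility>

// One named moment in the jump (leaving the cliff, the suit inflating, and so
// on) fans out to several effect channels at once.
struct EffectCue {
    double windSpeedNormalised;   // 0..1 fan output
    double hapticIntensity;       // 0..1 suit/harness actuators
    std::string scent;            // scent cartridge to trigger, if any
    std::string sound;            // audio cue to play, if any
};

class EffectDispatcher {
public:
    void registerCue(const std::string& eventName, EffectCue cue) {
        cues_[eventName] = std::move(cue);
    }
    // Called by the simulation whenever a scripted moment is reached.
    void fire(const std::string& eventName) const {
        const auto it = cues_.find(eventName);
        if (it == cues_.end()) return;
        const EffectCue& c = it->second;
        // In a real installation these values would drive hardware; here we log.
        std::printf("[%s] wind=%.2f haptic=%.2f scent=%s sound=%s\n",
                    eventName.c_str(), c.windSpeedNormalised,
                    c.hapticIntensity, c.scent.c_str(), c.sound.c_str());
    }
private:
    std::unordered_map<std::string, EffectCue> cues_;
};

int main() {
    EffectDispatcher fx;
    fx.registerCue("CliffExit",    {0.2, 0.8, "mountain_air", "wind_gust"});
    fx.registerCue("SuitInflated", {0.6, 0.4, "",             "fabric_snap"});
    fx.fire("CliffExit");
    fx.fire("SuitInflated");
    return 0;
}
```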
Details that provide a real sense of presence include filling the wingsuit with compressed air. Once a flyer jumps off the cliff, their wingsuit inflates within a few seconds, while a fan blows wind at an ever-increasing pace to add to the realism of the experience.
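One simple way to produce that ever-increasing wind is to have the fan track the simulated airspeed with a little smoothing, so it spools up as the dive accelerates; the mapping below, including the maximum speed and lag constant, is an assumed illustration rather than the installation’s actual control logic.

```cpp
#include <algorithm>
#include <cstdio>

// Map simulated airspeed (m/s) to a normalised fan command in [0, 1],
// smoothed with a first-order lag so the fan spools up gradually.
double updateFanCommand(double current, double airspeed, double dt) {
    const double maxAirspeed  = 50.0;  // assumed airspeed mapped to full fan output
    const double timeConstant = 1.5;   // assumed spool-up lag, seconds
    const double target = std::clamp(airspeed / maxAirspeed, 0.0, 1.0);
    return current + (target - current) * (dt / timeConstant);
}

int main() {
    double fan = 0.0, airspeed = 0.0;
    for (double t = 0.0; t < 10.0; t += 0.1) {
        airspeed = std::min(airspeed + 4.0 * 0.1, 45.0);  // toy dive acceleration
        fan = updateFanCommand(fan, airspeed, 0.1);
    }
    std::printf("fan command after 10 s: %.2f\n", fan);
    return 0;
}
```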
For now, JUMP is a location-based experience, but Jensen alludes to a future in which a version of the system could be operating in homes around the world.
“The JUMP simulator and technology are the foundation for true full mobility within any metaverse,” he says. “Through a few years of location-based entertainment, we will inevitably derive a perfect virtual reality mobility product for at-home use.”