Taking to the air to put things in sharp relief.
The story behind Visualskies.
Words by Paul Hellard
Visualskies is a full-service company offering VR, AR, high-resolution photogrammetry, drone-lapse, and videography to clients from all over the media spectrum and beyond.
Joe Steel has been flying drones since before drones were on the market. He once worked at a company called Timeslice, which created all kinds of Matrix-like effects for commercials and films with directors like Michael Bay. Ross Dannmayr has a background in architecture and communications tech. Joe and Ross are lifelong friends, so it was natural for them to set up Visualskies together, making drones available for all kinds of projects in film, architecture, city planning, and other areas.
“We all come from very different backgrounds,” says Duncan Lees. Well before joining the tight crew at Visualskies, Lees was a professional archaeologist. He’d also spent time as a surveyor, and the work was very much heritage-based. “I understood coordinate geometry and geospatial data and all that kind of stuff,” he says. “When I moved into that industry, around 15 years ago now, I was developing markets for this new tech that weren’t necessarily full of those survey companies.”
Lees’ experience in gathering real-world spatial data dovetailed neatly with movie production work. With Visualskies collecting data from both the ground and the air, the results could be more cost-effective and complete. “It became a much better fit for our clients, friends, and colleagues,” he adds.
“We also gather a lot of data, collected from online resources,” says Joe Steel. “Google images can be used to build a vast gallery of forests, for instance. We can fulfil our location scouting services, without actually going to the locations.”
Visualskies recently provided solutions for National Geographic’s quest to find out more about the Colombian Lost City discoveries. This featured as part of the ‘Lost Cities’ series on the National Geographic channel.
First, the whole vast area of the site was captured in an overall helicopter scan: five square kilometres of jungle canopy covered by aerial LiDAR. Meanwhile, Joe and Duncan collected higher-detail data on the main sites using ground LiDAR and aerial photogrammetry. Together, these datasets let them visualise the entire city site, even tunnels under the ground, in very high detail.
On top of providing the 3D data, Visualskies was tasked with providing a way to view that data on location. They invented an augmented-reality application, VSScout [more on this below], which let the presenter go to the location and sweep layers of history onto the camera view. For example, presenter Albert Lin held up an iPad to reveal the fortress that once stood where only ruins remain. Lin used the tool to visualise tunnels stretching hundreds of metres beneath the city. Within the iPad application, Visualskies integrated all the data they had captured into a single asset. “This allowed Lin to move and explore exactly where he wanted to go. Each digital element was on a different layer, and he could drive the show instead of us pre-defining it.”
With all of that spatial data on an iPad, Albert Lin was geo-located within the city and could, at any time, switch the camera view from the modern city as it stands to the archaeological assets above and below the ground, all integrated with the reconstructions shown in the right place. “We were capturing and providing data for them in a very short space of time, and this is what we provide in VFX and virtual production,” says Lees. “We scan one day, process the next, and are able to be on set showing results on the third day. We’ve honed our workflow into something we are very proud of.”
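The geo-location described above comes down to mapping GPS coordinates into the local coordinate frame of the 3D scene, so each digital asset lands in the right place on the camera view. A minimal sketch of that step, using a simple equirectangular approximation (a hypothetical stand-in; VSScout's actual projection method isn't documented here):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Convert GPS degrees to metres east/north of a chosen scene origin.

    Uses an equirectangular approximation -- accurate enough over a few
    square kilometres, such as a single archaeological site.
    """
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north

# Example: a ruin roughly 100 m north of the scene origin
east, north = gps_to_local(10.0009, -74.0, 10.0, -74.0)
```

With the device's own GPS position run through the same function, the app knows where the viewer stands relative to every scanned asset.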
On all their jobs, the Visualskies crew works very closely with the VFX Supervisor and team on set. There has always been an open, two-way channel of communication, pre-shoot and during the productions they’ve worked on. “This is definitely an end-to-end service for the shows,” adds Joe Steel. Once the location is scanned, the FBX data is imported into Unreal Engine, a real-time graphics engine. “This basically means we use a game engine to visualise and play back the data onscreen,” adds Joe.
What is clear from talking to the team at Visualskies is that they know exactly which drone, cameras, lenses, scanner, and software they need for each kind of job. Their service is genuinely end-to-end, and the workflow is crystal clear. “This hasn’t been developed in isolation either,” says Duncan. “We’ve been canvassing the views of high-end VFX Supervisors and producers on the ways they would need to use our data.” Once the data has been collected with the best available hardware, the next step is combining all that LiDAR and photogrammetry in RealityCapture, so the fidelity of both is preserved.
Appropriate gear can be selected for each job and location in advance: a desk-based assessment enables the VSScout team to choose the most suitable equipment and deploy quickly to the location. Visualskies are Leica Geosystems-approved operators, using only the highest-quality scanning and photographic equipment both on the ground and in the air, including the UK’s only array-scanning drone, VSArray. LiDAR scanners can collect up to two million precisely measured points per second from the ground and from the air, while VSArray can sweep photogrammetry data into a 144-megapixel RAW file.
The finished asset is optimised for virtual production at around 2.5 million polygons, retopologised and UV-mapped. This part of the job, after the scanning and the building of the virtual set, is completely directable. Photogrammetry textures can have light, reflections, and shadows baked into them, which is a problem, so it can be quite a job to take the textures back to a neutral, raw state. That way, lighting can be applied afterwards and the asset will look real from any angle. This is brought to fruition with additional image maps: ambient occlusion maps and normal maps.
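A normal map encodes fine surface detail as per-pixel directions, so a retopologised low-poly mesh can catch light as if the original scanned geometry were still there. As a rough illustration of the idea (not Visualskies' pipeline, which uses dedicated baking tools), here is how a height field can be turned into a tangent-space normal map with NumPy:

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D height field into a tangent-space normal map.

    Gradients of the height field give the surface slope; the normal is
    perpendicular to that slope. Output is float RGB in [0, 1], with
    flat areas mapping to the familiar (0.5, 0.5, 1.0) normal-map blue.
    """
    # Finite-difference gradients along y (rows) and x (columns)
    dy, dx = np.gradient(height.astype(np.float64))
    # Unnormalised normal = (-dx, -dy, 1), scaled by detail strength
    nx, ny = -strength * dx, -strength * dy
    nz = np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    normals = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Remap components from [-1, 1] to [0, 1] for storage as RGB
    return normals * 0.5 + 0.5

# A perfectly flat surface bakes to uniform normal-map blue
flat = height_to_normal_map(np.zeros((4, 4)))
```

In production the detail is baked from the high-poly scan onto the low-poly mesh rather than from a height field, but the principle is the same: geometry becomes a texture the game engine can relight in real time.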
If the client is a movie studio, Visualskies might now send them the data in the Unreal Engine format. “The move to deliver in Unreal format is a new development predicated upon moves by some studios to use their Virtual Production pipelines,” explains Lees. “Up until now the usual delivery has been as an OBJ file with associated UV layout and image files. But this is entirely down to the client’s requirements. Delivering the data in Unreal ensures there’s no load time required.”
The really clever bit is combining this data in the AR/VR application [VSScout] that the Visualskies team has created.
The data is then stored for use by the client in a VR headset or via the custom-built iPad application. It can also be used by an art department to design sets, or by a previz department to start character animation and block out where actors will go. DOPs and directors can also block out chopper shots of locations that may not even exist in the real world.
Delivering the files on a locked iPad is becoming more popular, using a bespoke, in-house software package called the Visualskies Scouting system. VSScout is a powerful toolkit for exploring a location from afar, creating a totally immersive experience in which key crew can select lenses, filters, camera moves, and even the time of day for a future shoot if required. VSScout can be used for storyboarding and blocking out scenes in the virtual environment on the iPad and/or a VR headset. The data collected can be used across the entire production, from the Art Department through to Post Production, saving both time and money and minimising downtime.
While the COVID-19 crisis has dried up some work, Visualskies is using the time to explore how to use its data in new ways. “We knew there were lots of drone pilots, but our clients need someone who knows how to make TV, capture sets, and understand what producers require,” explains Lees. The pandemic has focused people’s minds on getting as much done pre-shoot as possible. Seeing the location in a virtual aerial view well before the shoot date makes it easier to ensure the location crew is proximity-safe at all times. “I think the pandemic has given us a chance to hone a methodology, a path through this,” Duncan adds. “People like GeoSLAM, RealityCapture, and Leica see us as a really good advert for what they do and the software and hardware they provide. We’ve been putting together sizzle reels showing off what the Leica kit and GeoSLAM’s kit have been doing.”
Joe, Ross, and Duncan have worked together as Visualskies since just before Christmas 2019. They work with a range of scanners: the Leica P40, P50, and RTC360, and the Z+F Imager 5016. They have also mounted a GeoSLAM ZEB Horizon scanner on a drone, as well as using it hand-held.
Digital processing is done in Leica Cyclone and RealityCapture, by Capturing Reality. Maya, ZBrush, Lightroom, Photoshop, and Houdini do a lot of the heavy lifting in image processing and compositing.