Kandao Obsidian Pro is the world's first 12K professional-grade VR panoramic camera and was named one of Time magazine's Best Inventions of 2022. It is equipped with eight 24-megapixel APS-C image sensors, offering more pixels, larger pixel size, and greater light sensitivity. This allows the camera to record and present rich detail, delivering cinematic, ultra-clear image quality that makes it a powerful tool for directors across a wide range of VR film and documentary shoots.
Previously, Kandao and MeetMo.io collaborated on an advanced 8K broadcast workflow at the NASA lunar rocket launch site. They provided immersive technology support for Felix & Paul Studios' "Space Explorers: Artemis Ascending," a live event streamed to 360° domes and to Meta's Horizon Worlds platform.
MeetMo.io is an award-winning technology provider serving customers worldwide. In their recent collaboration with Felix & Paul Studios, MeetMo.io delivered a new series of immersive technologies for the inaugural launch event of the Artemis Program. Led by NASA, the Artemis Program is a human spaceflight program whose primary goal is to return humans to the moon and go beyond, including sending the first woman and the first person of color to the moon by 2024.
Felix & Paul Studios, in collaboration with MeetMo.io, broadcast the inaugural launch event to viewers globally through an immersive experience. The experience was showcased in domes at resolutions up to 8K and simultaneously delivered to Meta Horizon Worlds and to telecommunications companies in Europe and Asia.
To provide an authentic experience, Felix & Paul Studios aimed to transport viewers right onto the rocket launch pad. Through powerful tools and technologies, the team achieved 8K streaming with near-instant delivery worldwide.
MeetMo.io utilized their Virtual Production Desktop (VPD) box powered by the NVIDIA RTX A6000. The team achieved 360° 8K live stitching through a quad-link 12G-SDI workflow and integrated it into FlightLine's broadcast truck. With the VPD and RTX technology, the MeetMo.io team had the necessary tools to create, produce, and deliver real-time, live immersive experiences.
MeetMo.io aimed to create a unique, immersive experience for the inaugural launch of the Artemis Program, transporting audiences to the launch pad at Kennedy Space Center so they could view the event from there. To do so, the team needed to synchronize footage from multiple cameras and deliver high-quality video with ultra-low latency. MeetMo.io developed the Virtual Production Desktop (VPD), an edge computer with a built-in encoder that supports ultra-high resolutions. The VPD is powered by the RTX A6000, delivering reliable, powerful graphics performance, while an RTX A4000 inside the team's encoder/decoder lets them encode and stream 8K video at very high speed. MeetMo.io also benefited from an efficient remote workflow through their cloud-native platform. With the RTX cards, the MeetMo.io team can stitch 8K 360-degree video at up to 60 frames per second.
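MeetMo.io's actual stitcher is proprietary GPU software, but the geometry it relies on can be sketched. An equirectangular 360-degree frame maps each pixel to a direction on a sphere; a stitcher resamples each camera's feed along those rays to build the panorama. The sketch below (illustrative numpy, not MeetMo.io's code) computes that pixel-to-ray mapping:

```python
import numpy as np

def equirect_rays(width, height):
    """Map every pixel of a width x height equirectangular canvas to a
    unit 3D ray direction (x, y, z). A 360-degree stitcher resamples each
    camera's image along these rays to fill the panoramic frame."""
    # Longitude spans [-pi, pi) left to right; latitude spans [pi/2, -pi/2]
    # top to bottom, sampling at pixel centers.
    lon = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi
    lon, lat = np.meshgrid(lon, lat)
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)

# Tiny canvas for illustration; a full "8K" 360-degree frame would be
# 7680 x 3840 pixels with the same mapping.
rays = equirect_rays(96, 48)
```

Because every output pixel's ray is fixed once the canvas size is known, the lookup tables can be precomputed and the per-frame work reduced to a GPU resampling pass, which is what makes real-time 8K stitching feasible.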
Getting Ready to Launch
Typically, creating a unique, immersive experience of this scale requires significant infrastructure and months of preparation. However, MeetMo.io aimed to streamline the workflow and make it a swift and seamless process. By utilizing real-time stitching, encoding and transmission capabilities, and multi-device network access, the team sought to deliver captivating content without disrupting normal broadcast and production standards.
Felix & Paul Studios had a broadcast truck provided by FlightLine Films and four 360-degree cameras placed in different locations. Their objective was to ensure that all captured footage could be displayed in real time with low latency and high resolution.
MeetMo.io's task was to stream videos in multiple distribution formats, ranging from full 360-degree formats to 180-degree dome formats. "We faced a unique challenge when it was time to set up for broadcast—we needed footage from multiple camera angles, as well as the NASA traditional 16:9 live feeds, all in real time," said Michael Mansouri, CEO at MeetMo.io. "This meant not only significant differences in projection and delivery formats between various viewing methods but also the need for quick and seamless integration into the broadcast truck."
The team also had to establish a connection between the 8K livestream cameras and the spatial audio systems. Everything needed to be synchronized for the large-format dome theaters. This experience was also made available for viewing through VR headsets on Meta's Horizon Worlds, mobile and desktop devices on Facebook 360, and dedicated distribution through partnered telecom operators.
Streaming Mission
MeetMo.io has developed an innovative live broadcasting solution: MoLink, a 1U telco rackmount edge computing unit powered by the NVIDIA RTX A4000. By combining virtual and real-world media, it allows the simultaneous multicast of low-latency 8K video across multiple devices, from smartphones to dome displays, using its Metaverse Transmedia Encoder (MTE).
With the NVIDIA RTX cards, the team can stitch 360-degree content at 60 frames per second, enhancing the immersive experience for the launch event. The RTX A6000 enables MeetMo.io to build real-time tools for video repositioning and to synchronize video and audio with the traditional 16:9 format. The A4000 card is used inside the team's encoder/decoder for fast encoding and streaming of 8K video.
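One reason repositioning can run in real time is a property of the equirectangular projection itself: a yaw rotation of the camera viewpoint (spinning around the vertical axis) is just a circular shift of pixel columns, since longitude maps linearly to the x axis. A minimal numpy sketch of that idea (again illustrative, not MeetMo.io's implementation):

```python
import numpy as np

def reposition_yaw(frame, yaw_degrees):
    """Rotate the viewpoint of an equirectangular frame around the
    vertical axis. Longitude maps linearly to x, so a yaw rotation is a
    circular shift of pixel columns; no resampling is needed."""
    width = frame.shape[1]
    shift = int(round(yaw_degrees / 360.0 * width))
    return np.roll(frame, -shift, axis=1)

# Example: a tiny 8-column "frame" whose columns are numbered 0..7.
frame = np.tile(np.arange(8), (4, 1))
rotated = reposition_yaw(frame, 90)  # 90 degrees = 2 columns here
```

Pitch and roll adjustments require a genuine spherical resampling pass, which is where the GPU's throughput matters; the yaw case above shows why horizontal reframing is essentially free.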
Preparing for Future Immersive Broadcasts
Powered by NVIDIA RTX, the VPD (Virtual Production Desktop) brings new capabilities and efficiencies to broadcasters and studios. With VPD, professionals can enhance immersive storytelling and virtual production workflows with the following benefits:
• Higher resolutions, up to 8K
• Higher frame rates at 60 and 120 frames per second
• Direct 8K live input into real-time engines like Unreal and Unity
• Lower latency through the RTMP protocol
• Higher color subsampling
• Rescaling and repositioning, including reframing, rotations, picture-in-picture, and zoom in/zoom out
• Adding graphics, texts, and imagery inside 360-degree projections
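The last capability, inserting graphics and text into a 360-degree projection, comes down to converting a direction (yaw, pitch) into equirectangular pixel coordinates and compositing there. The sketch below is a simplified numpy illustration (a production inserter would warp the overlay to follow the projection, especially near the poles):

```python
import numpy as np

def stamp_overlay(frame, overlay, yaw_deg, pitch_deg):
    """Composite a small flat overlay (e.g. a logo or caption) into an
    equirectangular frame, centered at the given yaw/pitch. Near the
    equator a direct paste is a reasonable approximation; edge wrapping
    and pole distortion are not handled in this sketch."""
    h, w = frame.shape[:2]
    cx = int((yaw_deg + 180.0) / 360.0 * w)   # longitude -> x pixel
    cy = int((90.0 - pitch_deg) / 180.0 * h)  # latitude  -> y pixel
    oh, ow = overlay.shape[:2]
    y0, x0 = cy - oh // 2, cx - ow // 2
    frame[y0:y0 + oh, x0:x0 + ow] = overlay
    return frame
```

For example, stamping a 10x20 caption at yaw 0, pitch 0 places it at the center of the frame, directly ahead of the viewer.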
With these technologies, broadcasters can invite audiences into a new immersive world with higher-quality visuals. Immersive broadcasting not only gives viewers an interactive way of experiencing content but also offers creators tools for crafting unique pieces of their own, from livestreamed gaming events and social gatherings to Hollywood-style productions and video conferencing presentations.