In 2016, gaming juggernaut VALVe hired Punch Drunk to develop a mixed-reality workflow for producing a real-time composite: a person playing a VR game embedded in the actual game landscape, so viewers can watch the player inside the game without a headset or any additional peripherals.
We worked with VALVe and game developers for months to engineer a solution that would be replicable in multiple VR game bays simultaneously and robust enough to withstand constant usage during the entire week of the Ti6 competition.
Outside Ti6, we hosted a huge multi-bay, multiplayer experience space where VR enthusiasts and gamers could play VR games while a specially designed camera system captured them against a green screen and composited them into the game in real time.
That composite was then displayed throughout the experience and streamed on Twitch. After refining the workflow and deploying it at subsequent events with HTC and Ikea, we eventually retired it once game engines and VR headsets had matured to the point where it was no longer needed.
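For readers curious about the core technique, the sketch below shows a minimal green-screen chroma key and composite in Python with OpenCV. It is only an illustration of the general idea, not the actual Valve/Punch Drunk pipeline; the camera index, the `game_frame.png` stand-in for the engine's output, and the HSV thresholds are all assumptions for the example.

```python
# Minimal sketch of real-time green-screen compositing (illustrative only).
# Assumes a webcam on index 0 and a game-view frame saved as "game_frame.png";
# both are placeholders, not part of the original production workflow.
import cv2
import numpy as np

# HSV range that roughly covers a studio green screen; real rigs tune this
# per lighting setup.
GREEN_LOW = np.array([35, 80, 80], dtype=np.uint8)
GREEN_HIGH = np.array([85, 255, 255], dtype=np.uint8)

def composite(player_bgr: np.ndarray, game_bgr: np.ndarray) -> np.ndarray:
    """Key out the green backdrop and place the player over the game view."""
    game_bgr = cv2.resize(game_bgr, (player_bgr.shape[1], player_bgr.shape[0]))
    hsv = cv2.cvtColor(player_bgr, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)         # 255 where green
    green_mask = cv2.morphologyEx(                               # clean up speckle
        green_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    player_mask = cv2.bitwise_not(green_mask)                    # 255 where player
    fg = cv2.bitwise_and(player_bgr, player_bgr, mask=player_mask)
    bg = cv2.bitwise_and(game_bgr, game_bgr, mask=green_mask)
    return cv2.add(fg, bg)

if __name__ == "__main__":
    cam = cv2.VideoCapture(0)                 # green-screen camera feed
    game_view = cv2.imread("game_frame.png")  # stand-in for the engine's output
    while True:
        ok, frame = cam.read()
        if not ok or game_view is None:
            break
        cv2.imshow("mixed-reality composite", composite(frame, game_view))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cam.release()
    cv2.destroyAllWindows()
```

A production setup layers more on top of this, such as feeding the composite back out for display and streaming, but the chroma key over a view of the game is the heart of the effect.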
Prior to our involvement, there were no known Steam-based real-time mixed-reality workflows that could seamlessly composite a player into the game they were playing.