Early in my broadcast and production career I was in awe of the expertise shown by my engineering mentors, peers and colleagues: individuals from Sony, CBC, Global and many more production houses across Canada. They were all gifted people doing something that, to me, was truly amazing and even artistic in its own way. Especially beautiful was the elegance of the video wall in the broadcast truck. In simple terms: vision from multiple cameras orchestrated into the switcher, the patch panel, and cable bundles wrapped, labelled and routed through all the right conduit ports. Then the magic, night after night, of flawless functionality: seeing the wall light up with powerful images from the deployed cameras as we intently scanned the monitor wall. A number of years later, HD ushered in a whole new era of multi-view processing and flat-panel LCD displays. Switching and matrixing the clean feeds began to take on personality: a single button touch could redirect and repurpose each and every window of the event production at the whim of the technical director.
In my mind, even at that time, there was an understanding of what it could mean to put that capability into the hands of the viewer, the user, the analyst, the coach. In professional sport it might be described as a god-like experience, or more appropriately that of a military general. What was fundamental then, and remains so now, is that this kind of control in a monitoring application would have a profound effect on how the subject matter playing out in front of the cameras is understood.
Fast forward 15 years! At #myplayXplay those building blocks of end-point contextual switching have taken shape in our #iPad iOS app. The ability to take multiple angles and let a user define their own viewing conditions is now possible, all in a rich, wireless, untethered user experience. With some very hard work on our part, we venture to change, in-game, how video is used for performance and coaching improvement.
Having achieved four feeds with ease, our investigation suggests the upper limit is bounded only by network capacity and the data throughput (driven by video resolution) of each feed. The bigger question, when coaches ask for all camera angles, is whether the brain can easily process so many views of the game while also making high-value decisions about it. That is a user-interaction and user-experience question.
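As a rough illustration of that throughput ceiling, here is a back-of-envelope sketch. The bitrates, link capacity and headroom factor below are illustrative assumptions, not measured myplayXplay figures; real encoder settings and Wi-Fi conditions will vary.

```python
# Back-of-envelope estimate of how many simultaneous video feeds a
# wireless link can carry. All numbers are illustrative assumptions.

APPROX_BITRATE_MBPS = {
    "720p": 5,    # assumed H.264 stream bitrates per feed
    "1080p": 8,
    "4K": 25,
}

def max_feeds(link_capacity_mbps: float, resolution: str,
              headroom: float = 0.7) -> int:
    """Feeds that fit if we commit only `headroom` of the raw link."""
    usable = link_capacity_mbps * headroom
    return int(usable // APPROX_BITRATE_MBPS[resolution])

# e.g. a Wi-Fi link with ~100 Mbps of real-world throughput:
print(max_feeds(100, "1080p"))  # 8 feeds
print(max_feeds(100, "4K"))     # 2 feeds
```

The point of the sketch is simply that feed count trades off directly against per-feed resolution: the same link that carries eight 1080p angles carries only a couple of 4K ones.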
Our first demo is a Dual Camera View, where myplayXplay lets the user choose two discrete angles. While the user views two angles at a time, they can select those angles from a whole battery of cameras.
When in Dual Camera View mode we then introduce a number of features not possible outside the context of myplayXplay and an iPad. For example, zooming in on detail opens up the possibility of using 4K feeds for pristine digital pixels in key zonal video review. That is to say, we will eventually have 4K cameras on game play, and pinching to zoom will yield an extremely tight view of a scrum in rugby or a face-off in hockey.
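To illustrate why a 4K source matters for pinch-to-zoom, here is a small sketch of the pixel arithmetic. The viewport widths below are assumed values for illustration, not actual product specifications.

```python
# How far can you digitally zoom into a 4K frame before the view drops
# below a 1:1 source-pixel-to-display-pixel mapping (i.e. before it
# starts to soften)? Viewport widths are illustrative assumptions.

SOURCE_4K_WIDTH = 3840  # UHD frame width in pixels

def max_lossless_zoom(source_width: int, viewport_width: int) -> float:
    """Zoom factor at which one source pixel maps to one display pixel."""
    return source_width / viewport_width

# Full-screen viewport assumed 2048 px wide:
print(max_lossless_zoom(SOURCE_4K_WIDTH, 2048))  # 1.875

# Half-screen window in a dual-camera layout (assumed 1024 px wide):
print(max_lossless_zoom(SOURCE_4K_WIDTH, 1024))  # 3.75
```

In other words, the smaller the on-screen window, the more digital zoom a 4K feed affords before detail is lost, which is exactly what tight review of a scrum or face-off needs.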
The next step in viewing concepts is a 4-up view. We will be trialling it shortly with a Super Rugby team in Australia. It is a significant amount of information to take in with one scan of our field of vision.
What we do know is that myplayXplay is the first live-feed game system of its kind to relay and present multiple game-feed angles for coaches. What we don't know is whether, when a coach asks for all the angles, he will really be able to use them. We think so.
The new era of the handheld multi-view video wall is here! And it arrives with so much more: contextualized review/replay and tagging. The application developments and scenarios are truly incredible.