Source Digital, Inc. announces the awarding of a European patent that further cements the company’s dominance in interactive video as well as in monetizable aspects of the metaverse. The patent not only covers functionality for activating visual content in video, but also enables Source to deliver interactivity to end users at any coordinate in any metaverse, empowering a more personalized, seamless video-metaverse (cross-verse) experience.
Source Digital already has the technology and IP – agnostic to audio, video, or metaverse – to facilitate personalized SAMs (Source Activated Moments) designed for user engagement and interactivity around information, socialization, shopping, contextually targeted brands, advertising, and virtually any other extension of interactivity tied to moments in time. This patent further extends the company’s capability to deploy SAMs directly correlated to any on-screen visual location in video (2D), the metaverse (3D), or augmented reality (3D).
Each SAM “coordinate” effectively creates a mapping system that allows end users to touch or click any digital location or object and receive an interactive return of information – even buy or share it! Users can also re-experience or cross-link the precise metaverse moment when they originally found it. (Imagine clicking on a shirt and immediately accessing information about the brand, adding it to your cart, and buying or sharing it with a friend; then jumping right back to the original moment when the item was found.)
This patent also provides the ability to match the physical location of an end user (on Earth) or a digital object (in the metaverse) and identify its real-world coordinates. In all cases, these intersections can trigger additional digital interactive events (similar to the description above).