Apple has been awarded a patent today (via AppleInsider) that describes an augmented reality (AR) system that can tag real-world items in a live video stream and display information about them in a HUD overlay. It sounds exactly like Pop-Up Video in practice: turn your device to focus on Rick Astley, for instance, and get a pop-up picture of the singer belting out “Never Gonna Give You Up.”
The patent describes an AR system for iOS devices that can be used in a variety of different ways. At its most basic, it works by labeling elements of an image in a live video feed, as when it names the parts of a circuit board being shot with the rear-facing camera on an iPad-like device in Apple’s patent. But it has more advanced features, too: Apple describes a user being able to edit the supplied data in case of inaccuracies or incorrect matches, and the patent also covers various means for sharing the information between users and devices.
Apple’s system involves a collaboration aspect: one user can annotate or edit the information being presented in their own view and send it to a second user’s device. The iOS device employing the AR tech is also described as being able to display both the unaltered image and the version with overlaid information at once, in side-by-side windows, allowing both an unobstructed view and one with all the contextual information. In Apple’s provided example, a real-world view of San Francisco is paired with a computer-generated model of the same. The user can interact with the CG model to navigate through streets and modify points of interest while traveling, something that sounds like it would add considerably to the current iOS Maps experience.
This type of dual-view could then be shared live with a second user, Apple says in the patent. So one user could build a virtual map and highlight important POIs, and then sync that with a second user’s device to help them navigate. It could also be used to collaborate in various professions, including doctors comparing x-rays or other medical imaging.
What mostly sets this AR system apart are its sharing and collaboration features, but it also includes techniques that would be right at home in a wearable AR display like Google Glass. Even as a simple extension to Maps it has value, and as an API built into iOS, the possibilities really start to take off. AR is getting more advanced, but we’ve seen players like Layar pivot away from similar products. Still, Apple would have different goals with such an invention, so it’s possible this could make its way to a shipping product.