WWDC 18: Where Apple AI met computer vision

Apple’s focus on the image isn’t just about marketing; it’s about unleashing whole new ways of interacting with devices as the company quietly moves to twin machine intelligence with computer vision in a host of new ways.

What has Apple done?

WWDC saw numerous OS improvements with this in mind, extending across most of Apple’s platforms. Think about:

  • Core ML 2: The latest version of Apple’s machine learning framework lets developers build computer vision features inside apps using very little code. Paired with the Vision framework, supported features include face tracking, face detection, facial landmarks, text detection, rectangle detection, barcode detection, object tracking, and image registration.
  • ARKit 2: The capacity to create shared AR experiences, and to let others watch events taking place in AR space, opens up new opportunities to twin the virtual and real worlds.
  • USDZ: Apple’s newly revealed Universal Scene Description (USDZ) file format allows AR objects to be shared. You can embed USDZ items in a web page, or share them by email. You’ll see this evolve for use across retail and elsewhere.
  • Memoji: More than what they seem, Memoji’s capacity to respond to your facial movements reflects Apple’s gradual move to explore emotion sensing. Will your personal avatar one day represent you in Group FaceTime VR chat?
  • Photos: Apple has boosted the app not just with the capacity to identify all kinds of objects in your images, but also with a talent for recommending ways to improve an image and suggesting new image collections based on what Siri learns you like. Core ML and improved Siri mean Google Lens must be in Apple’s sights.
  • Measure: Apple’s new ARKit measurement app is useful, but it also means that AR objects can begin to offer a much more sophisticated ability to scale, and it provides the camera sensor (and thus the AI) with more accurate information about distance and size.
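
To give a sense of how little code the Core ML bullet above implies, here is a minimal Vision-framework sketch that detects face bounding boxes in a local image. The file name is a placeholder, and error handling is kept deliberately thin; sibling request types (text, barcode, rectangle detection) follow the same pattern.

```swift
import Vision

// Placeholder path; any local image will do.
let url = URL(fileURLWithPath: "photo.jpg")
let handler = VNImageRequestHandler(url: url, options: [:])

// Ask Vision for face bounding boxes.
let request = VNDetectFaceRectanglesRequest { request, error in
    guard let faces = request.results as? [VNFaceObservation] else { return }
    for face in faces {
        // boundingBox is in normalized coordinates (0...1, origin at bottom-left).
        print("Face at \(face.boundingBox)")
    }
}

do {
    try handler.perform([request])
} catch {
    print("Vision request failed: \(error)")
}
```

Swapping in `VNDetectBarcodesRequest` or `VNDetectTextRectanglesRequest` gives the other detectors listed above with essentially the same half-dozen lines.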
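
The shared-AR idea in the ARKit 2 bullet rests on `ARWorldMap`: one device captures a map of its surroundings, sends it to a peer, and the peer relocalizes against it so both see anchors in the same physical space. A rough Swift sketch, assuming a running `ARSession` and leaving the actual transport (e.g. MultipeerConnectivity) out:

```swift
import ARKit

// Sender: capture the current world map and serialize it for sharing.
func shareWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Map unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }
        // The resulting Data can be sent to a nearby peer.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            print("World map archived: \(data.count) bytes")
        }
    }
}

// Receiver: unarchive the map and relocalize the session against it.
func restore(_ data: Data, into session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```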
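
Embedding a USDZ item in a web page, as the USDZ bullet describes, comes down to an AR Quick Look link that Safari on iOS 12 recognizes by its `rel="ar"` attribute. The file names here are placeholders:

```html
<!-- Tapping the preview image on an iOS 12 device opens the
     USDZ model in AR Quick Look instead of navigating away. -->
<a rel="ar" href="chair.usdz">
  <img src="chair-preview.jpg" alt="3D chair model">
</a>
```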

Other features, such as Continuity Camera (which lets you use your iPhone to scan an item for use on your Mac) and, of course, Siri’s new Shortcuts feature, just make it easier for third-party developers to contribute to this effort.
