How to use newly announced technologies to create incredible apps :- You can create an incredible app using iOS 13. Announced at WWDC in June 2019, iOS 13 is Apple’s next operating system for iPhones and iPads. Features include Dark Mode, a Find My app, a revamped Photos app, a new Siri voice, updated privacy features, a new street-level view for Maps, and more.
Latest Features :-
1. Core ML 3
2. The Vision framework
3. New in Siri
4. Sign In with Apple option
5. Dark Mode
6. PencilKit
1. Core ML 3 :-
- Core ML 3 adds support for on-device model personalization, so an app can update a machine learning model with the user’s own data without that data ever leaving the device, and it expands the range of supported model and layer types.
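As a minimal sketch of Core ML 3’s on-device personalization, an app can run an MLUpdateTask against an updatable compiled model. The model file name and the idea of saving the updated model back in the completion handler are assumptions for illustration; a real app would ship or download an updatable .mlmodelc produced with Core ML Tools.

```swift
import CoreML
import Foundation

// Hypothetical compiled model URL; a real app would bundle
// an updatable .mlmodelc file.
let modelURL = URL(fileURLWithPath: "Drawings.mlmodelc")

// Retrain the model on the device using the user's own data,
// supplied as an MLBatchProvider.
func personalize(with trainingBatch: MLBatchProvider) throws {
    let task = try MLUpdateTask(forModelAt: modelURL,
                                trainingData: trainingBatch,
                                configuration: nil,
                                completionHandler: { context in
        // Persist the updated model for future predictions.
        try? context.model.write(to: modelURL)
    })
    task.resume()
}
```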
2. The Vision Framework :-
- Vision is a new, powerful, and easy-to-use framework that provides solutions to computer vision challenges through a consistent interface. It lets you use the Vision API to detect faces, compute facial landmarks, track objects, and more, and you can take things even further by providing custom machine learning models for Vision tasks using Core ML.
- A framework to apply high-performance image analysis and computer vision techniques to identify faces, detect features, and classify scenes in images and video.
And there are many more new frameworks and improvements to existing frameworks.
The Vision framework allows you to :-
- Detect face rectangle and face landmarks (face contour, median line, eyes, brows, nose, lips, pupils position)
- Find projected rectangular region surfaces.
- Find and recognize barcodes.
- Find regions of visible text.
- Determine the horizon angle in an image.
- Detect transforms needed to align the content of two images.
- Process images with a Core ML model.
- Track movement of a previously identified arbitrary object across multiple images or video frames.
There are 3 base class categories :-
- VNRequest and derived classes :- describe your analysis request. Each request has a completion handler and an array of results.
- VNObservation and derived classes :- describe an analysis result.
- VNImageRequestHandler, VNSequenceRequestHandler :- process one or more VNRequests on a given image.
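The three base classes above fit together as in this minimal sketch, which runs a face-rectangle request (a VNRequest subclass) on an image and reads back VNFaceObservation results through an image request handler:

```swift
import Vision
import UIKit

// A minimal sketch: detect face rectangles in a UIImage.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // VNRequest subclass describing what we want to find.
    let request = VNDetectFaceRectanglesRequest { request, error in
        // Each result is a VNObservation subclass (VNFaceObservation).
        let faces = request.results as? [VNFaceObservation] ?? []
        for face in faces {
            // boundingBox is in normalized coordinates (0...1),
            // with the origin at the lower-left corner.
            print("Found face at \(face.boundingBox)")
        }
    }

    // The handler processes one or more requests on a single image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```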
What You Can Do with Vision :-
- Face Detection
- Face Detection: Small Faces
- Face Detection: Strong Profiles
- Face Detection: Partially Occluded
- Face Detection: Hats and Glasses
- Face Landmarks
- Image Registration
- Rectangle Detection
- Barcode Detection
- Text Detection
- Object Tracking [For faces, rectangles, and general templates]
- Vision is a new high-level framework for Computer Vision
- Various detectors and tracking through one consistent interface
- Integration with Core ML allows you to use custom models with ease
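The Core ML integration mentioned above works by wrapping a compiled model in a VNCoreMLModel and running it through a VNCoreMLRequest. This is a sketch under the assumption that the model at `modelURL` is an image classifier; the model itself is a placeholder:

```swift
import Vision
import CoreML
import Foundation

// A sketch of running a custom Core ML model through Vision.
// The model at modelURL is a hypothetical compiled image classifier.
func classify(cgImage: CGImage, modelURL: URL) throws {
    let mlModel = try MLModel(contentsOf: modelURL)
    let vnModel = try VNCoreMLModel(for: mlModel)

    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        if let best = (request.results as? [VNClassificationObservation])?.first {
            print("\(best.identifier): \(best.confidence)")
        }
    }
    // Vision scales and crops the image to fit the model's input size.
    request.imageCropAndScaleOption = .centerCrop

    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```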
3. New in Siri :-
- Siri is getting a new voice in iOS 13, Apple announced on stage at WWDC 2019, with the company employing new “Neural text to speech” technology to make the virtual assistant sound much more natural.
- Unlike the old version of Siri, the iOS 13 voice is entirely generated by software, instead of using audio clips from voice actors. In a brief demo shown on stage, the new voice does seem to do a better job at actually pronouncing words, especially ones that are more complicated (like “thermodynamics”). The new voice is also better at longer sentences, stressing syllables more accurately than the older version.
- The received wisdom is that Siri lags behind Amazon Alexa and Google Home, but received wisdom has a problem: whatever the topic, the opinion tends to form quickly and takes a very long time to change. Siri has now zoomed ahead with the advances in iOS 13 and iPadOS, but it is going to take time for that to really register.
- That’s partly because it’s going to be months before we all have the final versions on our iPhones and iPads. It’s also because it’s then going to take us time to really experience the differences.
- It’s especially so because some of those differences are a direct result of improvements to Siri Shortcuts, which is nowhere near as mainstream as Alexa. It’s also the case that Siri can go further: there are things we’d like it to change and areas we’d love it to improve in. And while this may be us reading too much into it, some of this year’s improvements even lay the groundwork for those areas.
4. Sign In with Apple option :-
- Apple’s new sign-in feature brings a secure way to log in to your iOS 13 apps
- At WWDC 2019 Apple is continuing to make its case and push for stronger privacy features. To make it simpler and more secure for iOS 13 users to sign in to apps, Apple is launching a new sign-in button called “Sign in with Apple.” The tool works like similar social sign-in buttons — like those that allow users to log into third-party apps with either their Google or Facebook ID — but adds Apple’s twist with a focus on privacy and security.
- Most apps and services today require users to either create a user profile or log in with a social ID to deliver a unique, customized experience. The former comes with a lot of friction, as it requires you to enter a lot of information, while the latter is convenient but could reveal a lot about you. “Now, this can be convenient, but it also can come at the cost of your privacy; your personal information sometimes gets shared behind the scenes, and these logins can be used to track you,” Apple Senior Vice President of Software Engineering Craig Federighi said of competing social sign-in options during Apple’s keynote address. “So we wanted to solve this, and many developers do, too. And so now we have the solution. It’s called Sign In with Apple.”
- The Apple sign-in button allows iOS users to sign in to apps, like ride-sharing apps, with their Apple ID but without all the tracking or having to reveal personal information. Apple will provide developers with the sign-in APIs to build into their apps.
- When apps require your email address or name, you can choose how you want to share this information with developers. In a demo using Bird’s scooter rental app, Federighi showed that users can either share their email address with the developer or choose an option that creates a randomized email address that relays messages to your Apple iCloud email address to protect your privacy. “And that’s good news because we give each app a unique, random address, and this means you can disable any one of them at any time when you’re tired of hearing from that app,” he said.
- Though the experience was focused on iOS, Apple will bring its sign-in button to all of its platforms and on the web, Federighi said, so you’ll have more control over the personal information you share with websites and developers when you log in to apps and services.
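On the developer side, the flow described above goes through the AuthenticationServices framework: create a request from ASAuthorizationAppleIDProvider and hand it to an ASAuthorizationController. This is a minimal sketch; a production app would also set a presentation context provider and handle errors:

```swift
import AuthenticationServices
import UIKit

class LoginViewController: UIViewController, ASAuthorizationControllerDelegate {

    // Triggered by a system-provided ASAuthorizationAppleIDButton.
    @objc func handleSignInWithAppleTapped() {
        let provider = ASAuthorizationAppleIDProvider()
        let request = provider.createRequest()
        // Request only what you need; the user can still hide their email
        // behind a private relay address.
        request.requestedScopes = [.fullName, .email]

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        if let credential = authorization.credential as? ASAuthorizationAppleIDCredential {
            // A stable, app-scoped user identifier; the email, if granted,
            // may be a randomized relay address.
            print("Signed in as user: \(credential.user)")
        }
    }
}
```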
5. Dark Mode :-
- iOS 13 introduces a system-wide Dark Mode that users can enable in Settings or Control Center. Apps that use the system’s semantic colors and materials adapt automatically, and developers can respond to the current appearance through the trait collection’s userInterfaceStyle.
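A minimal sketch of adopting Dark Mode: semantic colors such as .systemBackground adapt on their own, and custom colors can be defined as dynamic colors that resolve per trait collection. The color values and view controller here are illustrative only:

```swift
import UIKit

// A custom dynamic color that resolves differently per appearance.
let brandBackground = UIColor { traits in
    traits.userInterfaceStyle == .dark
        ? UIColor(white: 0.10, alpha: 1.0)  // used in Dark Mode
        : UIColor(white: 0.95, alpha: 1.0)  // used in Light Mode
}

class SettingsViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Semantic colors switch automatically when the style changes.
        view.backgroundColor = .systemBackground
    }

    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        if traitCollection.userInterfaceStyle == .dark {
            // React here if non-color resources (e.g. images) need swapping.
        }
    }
}
```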
6. PencilKit :-
- Capture touch input as an opaque drawing and display that content from your app.
- PencilKit makes it easy to incorporate hand-drawn content into your iOS or macOS apps. It provides a drawing environment for your iOS app that takes input from Apple Pencil, or the user’s finger, and turns it into high-quality images you can display in either iOS or macOS. The environment comes with tools for creating, erasing, and selecting lines.
- You capture content in your iOS app using a PKCanvasView object. The canvas object is a view that you integrate into your existing view hierarchy. It supports the low-latency capture of touches originating from Apple Pencil or your finger. It then vends the final results as a PKDrawing object, whose contents you can save with your app’s content. You can also convert the drawn content into an image, which you can display in your iOS or macOS app.
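The capture-and-export cycle described above can be sketched as follows; the view controller and export method names are illustrative:

```swift
import PencilKit
import UIKit

class DrawingViewController: UIViewController {
    let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Add the canvas to the existing view hierarchy.
        canvasView.frame = view.bounds
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 5)
        view.addSubview(canvasView)
    }

    // The captured strokes are vended as a PKDrawing value,
    // which can be persisted or rendered to an image.
    func exportImage() -> UIImage {
        let drawing = canvasView.drawing
        return drawing.image(from: drawing.bounds, scale: UIScreen.main.scale)
    }
}
```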
Pencil Interactions :-
- Apple introduced PencilKit at WWDC 2019 this week for much easier implementation of Apple Pencil experiences in third-party apps. The new framework will allow devs to tap into the same low latency and the new Apple Pencil tool palette and “markup anywhere” features that Apple itself uses for drawing and annotating in its own apps.
- Handle double-tap interactions that a user makes on Apple Pencil.
- Pencil interactions let your app detect double taps the user makes on their Apple Pencil. Supporting Pencil interactions in your app gives the user a quick way to perform an action such as switching between drawing tools by simply double-tapping their Apple Pencil.
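Detecting the double tap described above means attaching a UIPencilInteraction to a view and implementing its delegate; the switch on the user’s preferred action is a sketch, with the tool-switching bodies left as placeholders:

```swift
import UIKit

class CanvasViewController: UIViewController, UIPencilInteractionDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach a pencil interaction to receive double-tap events.
        let interaction = UIPencilInteraction()
        interaction.delegate = self
        view.addInteraction(interaction)
    }

    func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
        // Respect the user's system-wide double-tap preference.
        switch UIPencilInteraction.preferredTapAction {
        case .switchEraser:
            // Toggle between the current tool and the eraser.
            break
        case .switchPrevious:
            // Switch back to the previously used tool.
            break
        default:
            break
        }
    }
}
```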
Lower latency with PencilKit :-
- Apple is already using PencilKit across the entire system in iOS 13 including in Notes for low latency drawing and note-taking, in Pages for marking up documents, and with its “markup anywhere” feature for annotating screenshots and PDFs.
- The new APIs require just three lines of code for developers to get the same low latency, UI, and tool palette that Apple uses for Pencil. That includes the drop from 20 milliseconds to 9 milliseconds latency that Apple announced during its unveiling of iPadOS.
New Dynamic Tool Picker & Expressive Inks :-
- A big part of what developers get access to with the APIs is Apple’s canvas and new dynamic tool picker with pen, marker, pencil, eraser, and lasso tools. That includes Apple’s expressive, responsive inks and drawing models that it uses in apps like Notes and Pages.
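Attaching the system tool palette to a canvas is a few lines of code. This sketch assumes a PKCanvasView already in the view hierarchy and a reference to its window:

```swift
import PencilKit
import UIKit

// Show the shared system tool picker for a canvas.
// Assumes canvasView is already installed in the given window.
func showToolPicker(for canvasView: PKCanvasView, in window: UIWindow) {
    guard let toolPicker = PKToolPicker.shared(for: window) else { return }
    toolPicker.setVisible(true, forFirstResponder: canvasView)
    toolPicker.addObserver(canvasView)
    // The picker appears when the canvas becomes first responder.
    canvasView.becomeFirstResponder()
}
```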
- Some of this new feature set was shown briefly by Apple on stage to demo new and improved “markup anywhere” features for annotating screenshots and PDFs – now integrated system-wide with support for editing full documents and more. With the introduction of PencilKit, developers will be able to much more easily offer users access to the markup controls to draw or annotate in third-party apps too, even for apps that might not use Apple Pencil as the main input device.
- Developers interested in learning more can head over to Apple’s website where it has sample code for the new PencilKit APIs.