3 January 2018
iPhone X has hit the shelves. Ahead of the official release, Apple encouraged developers to start creating and publishing their apps with the Xcode 9 beta long in advance, all to power up the hype for the new addition. And it's quite a tease!
Aside from the crisp 5.8” Super Retina display, developers finally get their hands dirty with the most cutting-edge frameworks for face tracking, augmented reality, machine learning and more. Our company has already begun testing some of these goodies. Here are the main highlights you should consider when planning to develop an app for iPhone X:
1. Machine Vision
iOS 11 now features ARKit. Together with the device's cameras (including the TrueDepth front camera for face-based AR), the brand new framework enables spatial awareness capabilities to plant 3D objects into the camera view. The origin and orientation of the coordinate system are fully customizable.
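As a minimal sketch of what this looks like in practice (class and node names here are illustrative, not from ARKit itself), a session is started with a world-tracking configuration, the coordinate system's alignment is chosen, and a 3D object is planted into the scene:

```swift
import ARKit

// Illustrative view controller; only the ARKit calls are the point here.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // The coordinate system's origin is where the session starts;
        // .gravity aligns the Y axis with gravity, while .gravityAndHeading
        // also aligns the X/Z axes with compass directions.
        configuration.worldAlignment = .gravity
        sceneView.session.run(configuration)

        // Plant a simple 10 cm box half a meter in front of the starting camera position.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(box)
    }
}
```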
2. World Tracking
The key to enabling AR with a moving camera view is awareness of the device's own position. At play here are the aforementioned computer vision techniques together with the internal motion-sensing hardware, a process called visual-inertial odometry. ARKit tracks flat horizontal surfaces on which to plant objects and can also estimate the room lighting, applying the appropriate amount of light to the rendered 3D object.
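Both capabilities are opt-in flags on the configuration. A rough sketch (the class name is illustrative) of enabling plane detection and reading the per-frame light estimate:

```swift
import ARKit

// Illustrative wrapper: turn on horizontal plane detection and light estimation.
class WorldTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal     // find flat surfaces to plant objects on
        configuration.isLightEstimationEnabled = true  // estimate the room lighting
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            // Roughly 1000 lumens corresponds to a well-lit scene; feed this
            // into your renderer so 3D content matches the room.
            print("Ambient intensity: \(estimate.ambientIntensity)")
        }
    }
}
```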
There are two types of request handlers for image processing: VNImageRequestHandler for a single image and VNSequenceRequestHandler for a sequence of images. Results are output as observations.
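A short sketch of both handlers in use (the function, the rectangle-detection request and the `cgImage` parameter are illustrative stand-ins for whatever your app actually processes):

```swift
import Vision

// Illustrative: run the same request through both handler types.
func detectRectangles(in cgImage: CGImage) {
    let request = VNDetectRectanglesRequest { request, _ in
        // Results arrive as an array of VNObservation subclasses.
        let observations = request.results as? [VNRectangleObservation] ?? []
        print("Found \(observations.count) rectangle(s)")
    }

    // Single image:
    let imageHandler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? imageHandler.perform([request])

    // Sequence of images (e.g. consecutive video frames); reuse one handler:
    let sequenceHandler = VNSequenceRequestHandler()
    try? sequenceHandler.perform([request], on: cgImage)
}
```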
3. Face Tracking
Face ID is not the only feature built on facial recognition. There are dozens of potential applications for this capability, which comes as part of ARKit. The TrueDepth front camera attributes the face with an anchor representing its position, orientation and topology, converted into readings about the facial expression. These can be used to gather facial data and, say, mirror it on a 3D character.
The relevant Vision requests for face tracking are VNDetectFaceRectanglesRequest, which detects faces in general, and VNDetectFaceLandmarksRequest, which tracks separate facial features and captures them in its results.
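A sketch of both requests run against a single captured frame (the function name and `cgImage` parameter are illustrative):

```swift
import Vision

// Illustrative: detect faces, then facial landmarks, in one pass.
func analyzeFaces(in cgImage: CGImage) {
    let rectanglesRequest = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Detected \(faces.count) face(s)")
    }
    let landmarksRequest = VNDetectFaceLandmarksRequest { request, _ in
        for face in request.results as? [VNFaceObservation] ?? [] {
            // landmarks exposes regions such as the eyes, nose and outer lips.
            if let leftEye = face.landmarks?.leftEye {
                print("Left eye traced with \(leftEye.pointCount) points")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([rectanglesRequest, landmarksRequest])
}
```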
4. Information Tracking
Less promoted but nonetheless new to iOS devices, and available through the Vision framework, are:
VNDetectTextRectanglesRequest – camera view image text detection;
VNDetectBarcodesRequest – barcode information processing.
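Both requests follow the same pattern as the face requests above. A sketch (function name and `cgImage` parameter are illustrative):

```swift
import Vision

// Illustrative: text and barcode detection on a single image.
func scan(_ cgImage: CGImage) {
    let textRequest = VNDetectTextRectanglesRequest { request, _ in
        let boxes = request.results as? [VNTextObservation] ?? []
        print("Found \(boxes.count) text region(s)")
    }
    textRequest.reportCharacterBoxes = true  // also return per-character boxes

    let barcodeRequest = VNDetectBarcodesRequest { request, _ in
        for code in request.results as? [VNBarcodeObservation] ?? [] {
            print(code.payloadStringValue ?? "unreadable payload")
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([textRequest, barcodeRequest])
}
```

Note that these detect where text sits in the frame; turning the detected regions into characters still requires your own recognition step.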
You might be stoked about all this new technology coming your way. However, many of these things are not yet fully adapted to harsh scenarios. With AR, for example, environmental details can be difficult to measure: poor lighting, reflective or featureless surfaces, and fast motion all degrade tracking.
In the process of iPhone X app development, you should be aware of these limitations and test how much your app is constrained by them.
Also, older devices won't support all the new features. We're talking about iPhone X here, but for a startup, backward compatibility can make a huge difference, so mind the gaps. ARKit itself will run on iPhone 6s, 6s Plus, 7 and 7 Plus, but face-based AR is locked away from any device below iPhone X, since older models lack the TrueDepth front camera. As a business owner embarking on the road to build an app for iPhone X, you should consider that a large portion of the user base will not transition to the new model right away. The main reason is the cost.
5. Machine Learning
iOS 11 now comes with a native machine learning framework, Core ML. It finally addresses a number of issues iOS has had in the past from relying on third-party solutions, such as granting access to non-native software, which raised serious security and performance concerns. Models also run entirely on-device, so there is no need to stay connected to the internet to use the framework.
The sequence is straightforward: a trained model is converted into the Core ML format, bundled into the app, and queried on-device through the interfaces Xcode generates for it.
You may also take advantage of the ready-made models Apple publishes, or use Core ML Tools to convert your own models into the Core ML format.
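A sketch of on-device inference, assuming a bundled image classifier whose Xcode-generated class is named `MobileNet` (one of Apple's ready-made models; any converted .mlmodel works the same way):

```swift
import CoreML
import Vision

// Illustrative: classify an image with a bundled Core ML model.
func classify(_ cgImage: CGImage) throws {
    // MobileNet is the Xcode-generated class for the bundled .mlmodel file.
    let model = try VNCoreMLModel(for: MobileNet().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```

Wrapping the model in Vision this way spares you the image resizing and pixel-buffer conversion the raw Core ML API would require.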
6. Natural Language Processing
Alongside ARKit, iOS 11's natural language processing brings a whole bag of its own features, centered on NSLinguisticTagger. It's the main API, and it goes far beyond simple tagging.
You interact with it through tag schemes and tag options.
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0) creates a tagger for the chosen schemes, which include:
[.tokenType] – identifies the type of every token (word, punctuation, whitespace);
[.language] – identifies dominant language in the text;
[.nameType] – lets you know if the word is a proper name;
[.lemma] – gets you the stem from every word token;
[.lexicalClass] – each token’s lexical class;
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation] sets the conditions that filter out unneeded tokens.
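Putting the schemes and options together, a short sketch that tags each word in a sentence with its lexical class (the sample text is, of course, just an example):

```swift
import Foundation

// Illustrative: print the part of speech for every word in a sentence.
let text = "Apple released the iPhone X in November."
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation]

tagger.string = text
let range = NSRange(location: 0, length: text.utf16.count)
tagger.enumerateTags(in: range, unit: .word,
                     scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag, let swiftRange = Range(tokenRange, in: text) {
        print("\(text[swiftRange]): \(tag.rawValue)")
    }
}
```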
Remember, though, that the addition of Core ML is more of a catch-up for Apple than a technical breakthrough. It does not offer more than machine learning frameworks in general, and on-device models remain limited and laborious to build.
7. New Screen Shape
Relevant for development: the new screen takes up almost the entire front surface, with corners curvier than ever.
The upper part of the screen gives some space to the sensor housing, though not the full width. Mind your app's view when the device is held horizontally, where the notch sits at the side of the screen.
Also, think about leaving the upper status bar unhidden. It might provide the user with additional information that wouldn’t spoil your UX.
As for the bottom of the screen, you will pay a heavy price if you don't heed the following design logic: don't place app controls there. The bottom is where the home indicator lives. Users will confuse interactive elements with it and trigger the Home gesture, resulting in frustration with your UI decision, to put it mildly. Also, don't hide the home indicator away unless your app offers a passive, full-screen experience.
8. New Aspect Ratios
New iPhone means new layouts, and these will be quite different. A primary consideration, whether you update existing apps for iPhone X or create new ones from scratch, is the Safe Area: the portion of the screen not covered by the status bar, navigation bars, the sensor housing or the home indicator. Your future apps will have to live within it.
The Safe Area is referenced in code with safeAreaLayoutGuide or tweaked through the layout margins. In Interface Builder, you will also have to turn on Use Safe Area Layout Guides in the Interface Builder Document section of the File inspector.
Mind that in older storyboards, enabling Use Safe Area Layout Guides converts the constraints attached to the top and bottom layout guides into safe-area constraints, and can also change leading and trailing edges. Always test your constraints after making the switch.
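In code, pinning content to the Safe Area is a matter of constraining against the guide's anchors. A sketch (the class and view names are illustrative):

```swift
import UIKit

// Illustrative: pin a content view to the Safe Area so it clears the
// notch, status bar and home indicator on iPhone X.
class ContentViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let content = UIView()
        content.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(content)

        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            content.topAnchor.constraint(equalTo: guide.topAnchor),
            content.bottomAnchor.constraint(equalTo: guide.bottomAnchor),
            content.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
            content.trailingAnchor.constraint(equalTo: guide.trailingAnchor)
        ])
    }
}
```

On older devices the guide simply matches the usual layout margins, so the same constraints work across the whole lineup.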
These are some of the key UI ideas. Though, if you’re creating apps for iPhone X that don’t exactly walk the edge of the latest AR and ML features, don’t consider yourself left out. Both core hardware and software are getting their boost as well. There are tons of updates that will push the limits of what’s newly possible and provide thoughtful improvements to workflow.
9. Xcode 9
Xcode, the extensive toolset for developers working across all Apple environments, is getting some love too. Aside from a revamped editor with improved scrolling and search for documents of any size, Xcode 9 adds wireless debugging over the network, a rewritten build system, and native refactoring for Swift.
10. Swift 4
The core programming language couldn't be left out either. Open-sourced in 2015, Swift's fourth iteration comes with a gamut of improvements, most notably the Codable protocol for type-safe serialization, multi-line string literals, and a reworked String that is once again a collection of characters.
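Codable is the headline addition: the compiler synthesizes JSON encoding and decoding for plain structs. A quick taste (the type here is made up for illustration):

```swift
import Foundation

// Illustrative: round-trip a struct through JSON with zero boilerplate.
struct Device: Codable {
    let name: String
    let screenInches: Double
}

let device = Device(name: "iPhone X", screenInches: 5.8)
let data = try JSONEncoder().encode(device)
let decoded = try JSONDecoder().decode(Device.self, from: data)
print(decoded.name)  // iPhone X
```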
11. A11 GPU and Metal 2
A treat for game and graphics developers: together with the new Apple-designed A11 GPU, Metal 2 improves rendering, GPU-driven computation and more.
These are some of the key new features and capabilities that come with the release. iPhone X shipped on November 3, and you can already build an app for iPhone X with Xcode 9. The above covers the changes and additions developers should study to prepare.
We are just one click away from helping you develop an amazing application! Let’s get in touch. Drop us a line in the form below, and we’ll reach out to you as soon as humanly possible.