11 iOS 11 Features You Must Know When Developing Apps for iPhone X
29 September 2017
Aside from the crisp 5.8” Super Retina Display, developers will finally get their hands dirty with the most cutting-edge frameworks for face tracking, augmented reality, machine learning and more. Our company has already begun testing some of these goodies. Here are the main highlights you should consider when planning to develop an app for iPhone X:
1. Machine Vision
iOS 11 now features ARKit. Together with the camera and motion sensors, the brand-new framework enables space-awareness capabilities that let you plant 3D objects into the camera view. The origin and orientation of the coordinate system are fully customizable.
2. World Tracking
The key to enabling AR with a moving camera view is awareness of the device's own position. At play here are the aforementioned computer vision techniques together with the internal motion-sensing hardware, a process called visual-inertial odometry. ARKit tracks horizontal surfaces on which to plant objects and can also estimate the room lighting, applying an appropriate amount of light to the rendered 3D object.
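As a sketch of what the setup looks like (assuming an ARSCNView wired up in a storyboard; the class name here is illustrative), a world-tracking session with plane detection and light estimation can be started like this:

```swift
import UIKit
import ARKit

class ARViewController: UIViewController {
    // Assumes an ARSCNView connected in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with motion-sensor data
        // (visual-inertial odometry) and can detect horizontal planes.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save battery.
        sceneView.session.pause()
    }
}
```

Once plane detection has found the surfaces you need, you can turn it off by running the session again with `planeDetection` cleared, as discussed below.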
There are two types of request handlers for image processing in the companion Vision framework: VNImageRequestHandler for a single image and VNSequenceRequestHandler for a sequence of images; both deliver their findings as observations.
3. Face Tracking
Face ID is not the only asset built on facial recognition. There are dozens of potential applications for this feature, which comes as part of ARKit. The TrueDepth front camera attributes the face with a number of anchors representing its position, orientation and topology, converted into readings about the facial expression. These can be used to gather facial data and, say, mirror it on a 3D character.
The Vision requests relevant to face tracking are VNDetectFaceRectanglesRequest, which detects faces in general, and VNDetectFaceLandmarksRequest, which tracks separate facial features and captures them in the request's results.
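A minimal sketch of how these requests are driven; the detectFaces helper is a hypothetical name for illustration, not part of the framework:

```swift
import UIKit
import Vision

// Runs face-rectangle detection on a UIImage and hands back the observations.
func detectFaces(in image: UIImage, completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }
    let request = VNDetectFaceRectanglesRequest { request, error in
        // Results arrive as VNFaceObservation values with normalized bounding boxes.
        completion(request.results as? [VNFaceObservation] ?? [])
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    // Vision work is synchronous, so keep it off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Swapping in VNDetectFaceLandmarksRequest follows the same pattern, with the landmark regions exposed on each observation.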
4. Information Tracking
Less promoted but nonetheless new to iOS devices, and available through the Vision framework, are:
VNDetectTextRectanglesRequest – detects regions of text in the camera image;
VNDetectBarcodesRequest – detects and processes barcode information.
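A sketch of barcode reading with the same request-handler pattern; readBarcodes is an illustrative helper name:

```swift
import Vision

// Detects barcodes in a CGImage and returns their decoded payload strings.
func readBarcodes(in cgImage: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNDetectBarcodesRequest { request, _ in
        let observations = request.results as? [VNBarcodeObservation] ?? []
        // payloadStringValue holds the decoded content, e.g. a QR code's URL.
        completion(observations.flatMap { $0.payloadStringValue })
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```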
You might be stoked about all this new technology coming your way quite soon. However, many of these things are not yet fully adapted to harsh scenarios. For example, with AR, environmental conditions can make tracking difficult to get right:
- A featureless environment, as well as poor lighting, can make it challenging to plant a 3D object adequately;
- Quick camera motions disrupt tracking, and extreme distances also produce bad results;
- Refine the scene with slow camera movement in order to capture the surfaces from changing angles; this gathers relevant world-tracking data on the device;
- When plane detection is complete, disable it to preserve a clear estimate of the environment.
In the process of iPhone X app development, you should be aware of these limitations and test how much your app is constrained by them.
Also, older devices won't be able to support all the new features. Here, we're talking mainly iPhone X; however, for a startup, backward compatibility can make a huge difference, so mind the limits. ARKit itself will be available on iPhone 6s, 6s Plus, 7 and 7 Plus, but face tracking is locked away from any device below iPhone X, because only iPhone X has the TrueDepth front camera. As a business owner embarking on the road to build an app for iPhone X, you should consider that a large portion of the user base will not transition to the new model right away. The main reason is the cost.
5. Machine Learning
iOS 11 now comes with a native machine learning framework. Core ML finally addresses a number of issues iOS has had in the past due to relying on third-party frameworks, such as granting access to non-native software, which raised serious security and performance concerns. Nor do you need to stay connected to the internet to use the framework: inference runs entirely on the device.
(Figure: the complete sequence of how Core ML works with data.)
You may also take advantage of the few Core ML models that are readily available from Apple, or use Core ML Tools to convert models trained elsewhere into the Core ML format.
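As an illustration of the Vision + Core ML pipeline, here is a hedged sketch; note that FlowerClassifier is a hypothetical model name standing in for whatever .mlmodel file you add to the project (Xcode generates the Swift class for it):

```swift
import CoreML
import Vision

// Classifies an image with a bundled Core ML model via Vision.
func classify(_ cgImage: CGImage) {
    // FlowerClassifier is a placeholder for your own generated model class.
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top observation carries the best label and its confidence.
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```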
6. Natural Language Processing
Alongside ARKit, the new natural language processing support brings a whole bag of its own features. These greatly enhance NSLinguisticTagger, the main NLP API, which goes far beyond simple tagging in iOS 11.
The interaction happens between tag schemes and tag options.
You create a tagger with the schemes you need, e.g. let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0). The available schemes include:
[.tokenType] – identifies the type of every token (word, punctuation, whitespace);
[.language] – identifies the dominant language in the text;
[.nameType] – lets you know if the word is a proper name;
[.lemma] – gets you the stem of every word token;
[.lexicalClass] – identifies each token’s lexical class.
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation] supplies the conditions that filter out unneeded elements, such as whitespace and punctuation.
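Putting the pieces together, a minimal tagging pass over a sentence might look like this (output depends on the linguistic data available on the device):

```swift
import Foundation

// Tag each word in a sentence with its lexical class (iOS 11 / Swift 4 API).
let text = "Apple introduced the iPhone X in September."
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = text

// Skip whitespace and punctuation tokens.
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation]
let range = NSRange(location: 0, length: text.utf16.count)

tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag, let swiftRange = Range(tokenRange, in: text) {
        // e.g. "Apple: Noun", "introduced: Verb", …
        print("\(text[swiftRange]): \(tag.rawValue)")
    }
}
```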
Remember, though, that the addition of Core ML is more of a catch-up for Apple than a technical breakthrough. It does not offer more than machine learning frameworks do in general, and those are still pretty limited and laborious to build with:
- No learning on the fly, only from the past and processed data;
- Every app needs to be trained individually;
- You need to manually process and structure all the data before feeding it in as “experience”, and supervise the training like a personal life coach for your AI;
- Only able to process a minor part of human language;
- No causation or ontological connections, only correlational conclusions;
- Unable to handle symbolic meanings;
- Low transferability and reusability of learned data;
- System opacity makes the AI difficult to debug.
7. New Screen Shape
Relevant for development: the new screen now takes up almost the entire front surface, with corners curvier than ever.
The upper part of the screen gives some space to the sensor housing, though not entirely. Mind your app’s view when the device is held horizontally.
Also, think about leaving the upper status bar unhidden. It might provide the user with additional information that wouldn’t spoil your UX.
As for the bottom of the screen, you will pay a heavy price if you don’t heed the following design logic: don’t place app controls there. The bottom edge belongs to the home indicator that replaces the Home button. Users will confuse your interactive elements with it and trigger the Home gesture instead, resulting in frustration with your UI decision, to put it mildly. Also, don’t hide the home indicator unless yours is a passive, full-screen experience.
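iOS 11 exposes view-controller overrides for exactly these cases; a sketch, with PlayerViewController as an illustrative name:

```swift
import UIKit

class PlayerViewController: UIViewController {
    // For passive, full-screen experiences (e.g. video playback) the home
    // indicator may auto-hide; avoid this on interactive screens.
    override var prefersHomeIndicatorAutoHidden: Bool {
        return true
    }

    // Require a second swipe before system gestures fire at the bottom edge,
    // so an accidental brush doesn't leave your app.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return [.bottom]
    }
}
```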
8. New Aspect Ratios
New iPhone means new layouts, and these will be quite different. A primary consideration, both when you update your apps for iPhone X and when you create new ones from scratch, is the Safe Area: the portion of the view not occluded by bars, the sensor housing or the home indicator. Your future layouts will have to live within it.
The Safe Area is referenced through safeAreaLayoutGuide in code or adjusted via the layout margins. In Interface Builder, you will also have to turn on Use Safe Area Layout Guides in the document inspector.
Mind that with old iOS storyboards, enabling the Safe Area Layout Guides will replace constraints to the top and bottom layout guides with constraints to the safe area, and do the same for the leading and trailing edges. Always test the constraints when you do this.
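For layouts built in code, the same guide is available as a view property; a minimal sketch:

```swift
import UIKit

class ContentViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let label = UILabel()
        label.text = "Hello, iPhone X"
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)

        // Pin to safeAreaLayoutGuide so content clears the sensor housing
        // and home indicator on iPhone X, and the bars on older devices.
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            label.topAnchor.constraint(equalTo: guide.topAnchor, constant: 16),
            label.leadingAnchor.constraint(equalTo: guide.leadingAnchor, constant: 16)
        ])
    }
}
```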
These are some of the key UI ideas. But if you’re creating apps for iPhone X that don’t exactly walk the edge of the latest AR and ML features, don’t consider yourself left out. Both the core hardware and the developer tools are getting a boost as well, with plenty of updates that push the limits of what’s possible and bring thoughtful workflow improvements.
9. Xcode 9
The extensive toolset for developers working in all Apple environments is getting some attention too. Aside from a revamped editor with improved scrolling and search for code documents of any size, here are the new features within:
- New Refactoring System – makes it easier for developers to apply a change from one place all across their app;
- GitHub Integration – GitHub repositories are accessible right from the integrated development environment;
- New iOS Playground Templates – designed to work well in both Xcode and Swift Playgrounds;
- Multiple Live App Testing – simultaneously test a single project on many different devices;
- Enhanced Debugging – a new debugger for Metal 2 graphics, an undefined-behavior sanitizer to fight unexpected behaviors, and the ability to debug iOS devices wirelessly;
- Built-in Xcode Server – removes the need to install macOS Server.
10. Swift 4
The core programming language couldn’t be left out either. Open-sourced in 2015, Swift’s fourth iteration now comes with a gamut of great improvements:
- Access changes – private methods and variables used to be restricted to the scope of the declaration that defined them, even within a single file; the workaround was changing private to fileprivate. In Swift 4, you can finally access private members from extensions within the same file;
- Code size cuts – Swift 4 now analyzes which parts of the code are never reached by the app and removes them. E.g., when you link a huge library, the methods you’re not using don’t end up in the binary;
- Sequencing – thanks to the newly added associated type, there is no more need to spell out Iterator.Element in extensions; you may now simply use Element. Be warned, though: this new associated type may conflict with an Element you have already defined on a Sequence or Collection. There are also new methods that allow direct mutation of sequences;
- Combined classes and protocols – you can now write a declaration that composes a class with multiple protocols;
- Strings revamped – String.characters is deprecated now that String is a collection again, and slicing operations (such as splitting a string) return the new Substring type;
- No Objective-C inference unless tagged with @objc – methods, functions and properties no longer automatically generate Objective-C counterparts. You will get a warning for these cases, but it might not be enough to catch every instance that needs attention.
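The access-control change above can be illustrated in a few lines:

```swift
// Swift 4: extensions in the same file can see the type's private members.
struct Counter {
    private var count = 0
}

extension Counter {
    // Compiles in Swift 4; in Swift 3 this required fileprivate on `count`.
    mutating func increment() {
        count += 1
    }
}
```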
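And the string changes in action, assuming the Swift 4 behavior described above:

```swift
let greeting = "Hello, iPhone X"

// String is a Collection again in Swift 4 – no .characters needed.
let letterCount = greeting.count            // 15

// Slicing operations return Substring, which shares storage with the original.
let firstWord = greeting.prefix(5)          // "Hello" as a Substring

// Convert back to String before storing the slice long-term,
// so the full original buffer can be released.
let stored = String(firstWord)
```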
11. A11 GPU and Metal 2
A treat for game and graphics developers: together with the beefed-up A11 GPU, Metal 2 improves rendering, compute and more.
- The new Apple-designed GPU no longer relies on the CPU for resource allocation and similar tasks;
- With the GPU Frame Debugger, developers get their hands on hardware performance metrics that help them make more intricate adjustments;
- The A11 chip moves to a 10 nm process after the previous 16 nm generation, meaning better battery life and more headroom for GPU work.
These are some of the key new features and capabilities that are coming with the release.