Now that the final version of iOS SDK 4.2 is available, let’s have a look at what is new. What follows is a detailed overview of the What’s New in iOS 4.2 and iOS 4.2 API Diffs documents.
The iPad makes the jump to 4.x
iOS 4.2 is the release that finally unifies the iPhone and iPad SDKs. Consequently, the iPad gains some features that already came to the iPhone in iOS 4.0 and 4.1, such as:
- C block objects
- Grand Central Dispatch
- Local Notifications
- Core Motion
- The Assets Library
- Event Kit for calendar access
- Game Center
- Quick Look
- The Accelerate framework
iPad developers who have been living under a rock over the summer should also have a look at the What’s New in iOS 4.0 and iOS 4.1 documents.
Apps can now support wireless printing from iOS devices to supported printers. Unfortunately, Apple pulled the ability to print to any printer shared by an OS X machine on the network at the last minute, but this functionality can already be restored with third-party tools such as Printopia or FingerPrint and will hopefully return in a future OS X 10.6.x release.
To support printing in your app, first determine if the device supports printing (+[UIPrintInteractionController isPrintingAvailable]), then retrieve the singleton +[UIPrintInteractionController sharedPrintController] and provide your content to the print controller via one of its printingItem, printingItems, printFormatter, or printPageRenderer properties. UIPrintInteractionController can directly print images or PDF content (from URLs or in the form of NSData or UIImage objects). Via built-in subclasses of the abstract UIPrintFormatter class, the printing of plain text (UISimplePrintFormatter), HTML markup (UIMarkupTextPrintFormatter), and the contents of a view (UIViewPrintFormatter) is also supported out of the box. To render the printable content yourself, subclass UIPrintPageRenderer.
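A minimal sketch of this flow, assuming `image` is a UIImage you want to print (the job name is purely illustrative):

```objc
// Sketch of the printing flow, assuming `image` is a UIImage to print.
if ([UIPrintInteractionController isPrintingAvailable]) {
    UIPrintInteractionController *printController =
        [UIPrintInteractionController sharedPrintController];

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputPhoto;
    printInfo.jobName = @"Photo"; // illustrative job name
    printController.printInfo = printInfo;

    // printingItem accepts a UIImage, NSData, or NSURL of printable content.
    printController.printingItem = image;

    [printController presentAnimated:YES completionHandler:
        ^(UIPrintInteractionController *controller, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"Printing failed: %@", error);
            }
        }];
}
```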
With the 4.2 SDK, third-party apps can use AirPlay to stream *audio* to AirPlay devices such as the Apple TV. Video streaming is not supported at the moment. Apple:
AirPlay support is built in to the AV Foundation framework and the Core Audio family of frameworks. Any audio content you play using these frameworks is automatically made eligible for AirPlay distribution. Once the user chooses to play your audio using AirPlay, it is routed automatically by the system.
Core MIDI is a new framework that lets an iOS device communicate with MIDI devices such as keyboards and synthesizers. It consists of three classes to set up and manage MIDI connections: MIDINetworkHost, MIDINetworkConnection, and MIDINetworkSession. The framework also includes the C-based MIDI Services API. I have not had the opportunity to investigate this further, especially how to connect a MIDI device to an iPhone or iPad (Apple says devices can be connected via the dock connector or network).
Weak Linking Support
Developers can now weak-link individual classes (instead of entire frameworks) that are not available in their app’s deployment target iOS version. This mechanism can simplify the code you need to check for the availability of certain features at runtime. Marco Arment has already written a great tutorial on this topic: Supporting older versions of iOS while using new APIs.
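As a sketch of what this enables: when building against the 4.2 SDK with an earlier deployment target, a weak-linked class reference simply evaluates to nil on older systems, so a plain class check suffices (UIPrintInteractionController here is just an example of a 4.2-only class):

```objc
// Sketch: with the 4.2 SDK and an older deployment target, a weak-linked
// class symbol evaluates to nil when the class is missing at runtime.
if ([UIPrintInteractionController class] != nil) {
    // iOS 4.2 or later: printing APIs are available.
} else {
    // Older iOS: hide or disable the Print button.
}
```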
Enhancements to Existing Frameworks
- AVAsset now has a property to indicate whether the asset is DRM-protected: hasProtectedContent.
- An asset’s metadata now includes its duration.
- New API for determining if the user has authorized the device/your app to use location services: +[CLLocationManager authorizationStatus]. There is also a new delegate method -[CLLocationManagerDelegate locationManager:didChangeAuthorizationStatus:] that notifies your app about changes in its authorization status (important for multitasking apps).
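A brief sketch of both halves, assuming `self.locationManager` is a configured CLLocationManager:

```objc
// Sketch: check authorization up front, then track changes via the delegate.
if ([CLLocationManager authorizationStatus] == kCLAuthorizationStatusAuthorized) {
    [self.locationManager startUpdatingLocation];
}

// CLLocationManagerDelegate: react when the user revokes (or grants) access,
// which can happen while a multitasking app is suspended.
- (void)locationManager:(CLLocationManager *)manager
    didChangeAuthorizationStatus:(CLAuthorizationStatus)status
{
    if (status == kCLAuthorizationStatusDenied) {
        [manager stopUpdatingLocation];
    }
}
```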
- New function: CTFontDrawGlyphs(). No need to convert CTFont objects to Core Graphics fonts with CTFontCopyGraphicsFont() before drawing them anymore.
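A small sketch of drawing glyphs directly with a CTFont, assuming a valid CGContextRef named `context` (e.g. inside -drawRect:):

```objc
// Sketch: draw two glyphs straight from a CTFont into `context`.
CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 24.0, NULL);
UniChar characters[2] = { 'H', 'i' };
CGGlyph glyphs[2];
if (CTFontGetGlyphsForCharacters(font, characters, glyphs, 2)) {
    CGPoint positions[2] = { CGPointMake(10.0, 50.0), CGPointMake(28.0, 50.0) };
    CTFontDrawGlyphs(font, glyphs, positions, 2, context);
}
CFRelease(font);
```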
- New function: CTFontGetLigatureCaretPositions() retrieves a list of possible caret positions inside a ligature.
- EKEventViewController now requires you to set a delegate conforming to EKEventViewDelegate, which informs your app when the view controller should be closed and what action the user took (tapped the Done button, responded to an event and saved it, or deleted it).
- The new GKFriendRequestComposeViewController (with its GKFriendRequestComposeViewControllerDelegate protocol) can be used to present a screen that allows the player to send Game Center friend requests to other players from inside your app.
- iAd now supports iPad-sized banners (1024x66 and 768x66 points). Your app should use the new ADBannerContentSizeIdentifierPortrait and ADBannerContentSizeIdentifierLandscape constants to request the appropriate banner size for the current platform.
- MKMapView has a new method: -annotationsInMapRect: returns a set of all map annotations in a specified region. According to Apple, “This method is much faster than doing a linear search of the objects in the annotations property yourself.”
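A quick sketch, assuming `mapView` is an MKMapView in your view hierarchy:

```objc
// Sketch: fetch only the annotations inside the currently visible map rect.
NSSet *visibleAnnotations = [mapView annotationsInMapRect:mapView.visibleMapRect];
for (id <MKAnnotation> annotation in visibleAnnotations) {
    NSLog(@"Visible annotation: %@", annotation.title);
}
```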
- Apps should override the -setDragState:animated: method to implement drag-and-drop support for custom annotation views. As the system detects user actions that indicate a drag, it calls this method to update the drag state. In response, your app can perform animations to visualize state changes.
- MPMediaEntity is the new common abstract superclass for MPMediaItem and MPMediaItemCollection. With this change, collections can now contain both items and other collections.
- The MPVolumeView interface now includes a control for routing audio content to AirPlay-enabled devices. It gained two new properties, showsRouteButton and showsVolumeSlider, to control which UI elements should be visible.
- Persistent IDs are now not only available for songs, but also for artists, albums, album artists, composers, genres, and podcasts.
- +[MPMediaItem persistentIDPropertyForGroupingType:] helps translate between MPMediaGrouping keys and persistent ID keys; +[MPMediaItem titlePropertyForGroupingType:] translates between MPMediaGrouping keys and title keys.
- Results of an MPMediaQuery can now be further divided into sections, represented by the new MPMediaQuerySection class. Each section has a localized title and identifies the range of items in the media query that fall into that section. You access a query’s sections through MPMediaQuery’s itemSections and collectionSections properties.
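A short sketch of iterating a query’s sections, e.g. to back a sectioned table view:

```objc
// Sketch: log the sections of a songs query and the items in each section.
MPMediaQuery *query = [MPMediaQuery songsQuery];
for (MPMediaQuerySection *section in query.itemSections) {
    NSArray *sectionItems = [query.items subarrayWithRange:section.range];
    NSLog(@"Section %@ contains %u items",
          section.title, (unsigned)sectionItems.count);
}
```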
- MPMoviePlayerController’s playback interface has been standardized.
- CAShapeLayer, the layer class to display Core Graphics paths, gained new properties to control the relative start and end points of the path: strokeStart and strokeEnd. These properties are animatable and should come in handy if you want to animate the creation of a path from start to finish.
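A sketch of the “draw the path” effect, assuming `shapeLayer` is a CAShapeLayer whose path and strokeColor are already set:

```objc
// Sketch: animate strokeEnd from 0 to 1 so the path appears to be drawn.
CABasicAnimation *draw = [CABasicAnimation animationWithKeyPath:@"strokeEnd"];
draw.fromValue = [NSNumber numberWithFloat:0.0f];
draw.toValue   = [NSNumber numberWithFloat:1.0f];
draw.duration  = 2.0;
[shapeLayer addAnimation:draw forKey:@"drawPath"];
shapeLayer.strokeEnd = 1.0f; // keep the final model value in sync
```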
- QLPreviewControllerDelegate gained two new methods to help provide a smooth transition between a document icon or thumbnail and the full-size Quick Look view. -previewController:frameForPreviewItem:inSourceView: asks for the frame of the preview item to animate a zoom effect between the preview and the full-screen view. -previewController:transitionImageForPreviewItem:contentRect: asks for a UIImage of the preview item that the Quick Look controller can crossfade with during the zoom animation.
- New “scroll by page” capabilities using VoiceOver. If your app contains a view that supports a scroll-by-page action, you should implement the -accessibilityScroll: method (part of the UIAccessibilityAction informal protocol).
- UIApplicationDelegate has a new method, -application:openURL:sourceApplication:annotation:, which provides your app with further information when it is launched from another app. You not only get notified which app launched yours; the calling application can also pass arbitrary data in the form of a property list to your app via the annotation argument. Unfortunately, the annotation argument is only available if the calling app uses UIDocumentInteractionController. If the calling app uses -[UIApplication openURL:], it still has to resort to URL parameters to pass information along.
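A sketch of the new app delegate callback:

```objc
// Sketch: inspect the caller and any attached annotation property list.
- (BOOL)application:(UIApplication *)application
            openURL:(NSURL *)url
  sourceApplication:(NSString *)sourceApplication
         annotation:(id)annotation
{
    NSLog(@"URL %@ opened by %@", url, sourceApplication);
    if (annotation != nil) {
        // Property-list object supplied by the calling app's
        // UIDocumentInteractionController, if any.
        NSLog(@"Annotation: %@", annotation);
    }
    return YES;
}
```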
- UIDevice now has a -playInputClick method that lets us play the standard keyboard click sound from our app. A click plays only if the user has enabled keyboard clicks. Yay!
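A sketch using a hypothetical custom input view (the class name and -keyTapped action are illustrative); note that -playInputClick only produces a sound if the input view conforms to UIInputViewAudioFeedback:

```objc
// Sketch: a hypothetical custom keypad view that opts in to keyboard clicks.
@interface KeypadView : UIView <UIInputViewAudioFeedback>
@end

@implementation KeypadView

- (BOOL)enableInputClicksWhenVisible {
    return YES; // required for -playInputClick to produce sound
}

- (void)keyTapped {
    [[UIDevice currentDevice] playInputClick];
}

@end
```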
- The new UITextInputMode class exposes the language currently used for text input via its primaryLanguage property.