Now that the final version of iOS SDK 4.2 is available, let’s have a look at what is new. What follows is a detailed overview of the What’s New in iOS 4.2 and iOS 4.2 API Diffs documents.
The iPad makes the jump to 4.x
iOS 4.2 is the release that finally unifies the iPhone and iPad SDKs. Consequently, the iPad gains some features that already came to the iPhone in iOS 4.0 and 4.1, such as:
- C block objects
- Grand Central Dispatch
- Multitasking support
- Local Notifications
- Core Motion
- The Assets Library
- Event Kit for calendar access
- iAd
- Game Center
- Quick Look
- The Accelerate framework
iPad developers who have been living under a rock over the summer should also have a look at the What’s New in iOS 4.0 and iOS 4.1 documents.
Printing
Apps can now support wireless printing from iOS devices to supported printers. Unfortunately, Apple pulled the ability to print to any printer shared by an OS X machine on the network at the last minute, but this feature can already be brought back with third-party tools such as Printopia or FingerPrint and will hopefully return in a future OS X 10.6.x release.
To support printing in your app, first determine if the device supports printing (+[UIPrintInteractionController isPrintingAvailable]), then retrieve the singleton UIPrintInteractionController with +[UIPrintInteractionController sharedPrintController] and provide your content to the print controller via one of its printingItem, printingItems, printPageRenderer, or printFormatter properties. UIPrintInteractionController can directly print images or PDF content (from URLs or in the form of NSData, UIImage, or ALAsset objects).
Via built-in subclasses of the abstract UIPrintFormatter class, the printing of plain text (UISimpleTextPrintFormatter), HTML (UIMarkupTextPrintFormatter), and UIView contents (UIViewPrintFormatter) is also supported out of the box. To render the printable content yourself, subclass UIPrintPageRenderer.
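To illustrate, here is a minimal sketch of the flow described above; the PDF data, the bar button item, and the job name are placeholders of my own choosing:

    - (void)printPDFData:(NSData *)pdfData fromBarButtonItem:(UIBarButtonItem *)item
    {
        if (![UIPrintInteractionController isPrintingAvailable]) {
            return; // no printing support on this device
        }

        UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];

        // Describe the print job.
        UIPrintInfo *printInfo = [UIPrintInfo printInfo];
        printInfo.outputType = UIPrintInfoOutputGeneral;
        printInfo.jobName = @"My Document"; // placeholder job name
        controller.printInfo = printInfo;

        // PDF (or image) data can be handed to the controller directly.
        controller.printingItem = pdfData;

        // On the iPad, the print UI must be presented from a bar button item or a rect.
        [controller presentFromBarButtonItem:item animated:YES completionHandler:
            ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
                if (!completed && error != nil) {
                    NSLog(@"Printing failed: %@", error);
                }
            }];
    }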
AirPlay
With the 4.2 SDK, third-party apps can use AirPlay to stream audio to AirPlay devices such as the Apple TV. Video streaming is not supported at the moment. Apple:
AirPlay support is built in to the AV Foundation framework and the Core Audio family of frameworks. Any audio content you play using these frameworks is automatically made eligible for AirPlay distribution. Once the user chooses to play your audio using AirPlay, it is routed automatically by the system.
Core MIDI
Core MIDI is a new framework that lets an iOS device communicate with MIDI devices such as keyboards and synthesizers. It consists of three classes to set up and manage MIDI connections: MIDINetworkHost, MIDINetworkConnection, and MIDINetworkSession. The framework also includes the C-based MIDI Services API. I have not had the opportunity to investigate this further, especially how to connect a MIDI device to an iPhone or iPad (Apple says devices can be connected via the dock connector or network).
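From a quick look at the headers, enabling network MIDI seems to boil down to something like this untested sketch (the connection policy and the client name are arbitrary choices of mine):

    #import <CoreMIDI/CoreMIDI.h>
    #import <CoreMIDI/MIDINetworkSession.h>

    // Enable the device's network MIDI session so hosts on the local network
    // (e.g. a Mac running Audio MIDI Setup) can connect to it.
    MIDINetworkSession *session = [MIDINetworkSession defaultSession];
    session.enabled = YES;
    session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;

    // The actual MIDI I/O still goes through the C-based MIDI Services API.
    MIDIClientRef client;
    MIDIClientCreate(CFSTR("MyMIDIClient"), NULL, NULL, &client);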
Weak Linking Support
Developers can now opt to weak-link individual classes (instead of entire frameworks) that are not available in their app’s deployment target iOS version. This mechanism simplifies the code needed to check for the availability of certain features at runtime, as the sketch below shows. Marco Arment has already written a great tutorial on this topic: Supporting older versions of iOS while using new APIs.
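In short, instead of looking classes up via NSClassFromString(), you can message the class directly and compare against nil, roughly like this (a sketch, assuming the app is built with the 4.2 SDK against a lower deployment target):

    if ([UIPrintInteractionController class] != nil) {
        // The printing API exists; we are running on iOS 4.2 or later.
        // Show the print button, set up the print controller, etc.
    } else {
        // Older iOS version: the weakly linked class resolves to nil,
        // so hide or disable the printing UI instead of crashing.
    }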
Enhancements to Existing Frameworks
AVFoundation
- AVAsset now has a property to indicate whether the asset is DRM-protected: AVAsset.hasProtectedContent (see the sketch after this list).
- An asset’s metadata items now include a duration: AVMetadataItem.duration.
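A trivial sketch of the protected-content check (assetURL is a placeholder of mine):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    if (asset.hasProtectedContent) {
        // DRM-protected content: skip features that require raw access,
        // such as exporting or frame-level processing.
    }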
Core Location
- New API for determining if the user has authorized the device/your app to use location services: +[CLLocationManager authorizationStatus].
- There is also a new delegate method, -[CLLocationManagerDelegate locationManager:didChangeAuthorizationStatus:], that notifies your app about changes in the authorization status of your app (important for multitasking apps). See the sketch after this list.
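For example (a sketch; the locationManager property and the fallback behavior are assumptions of mine):

    // A hypothetical helper that checks the status before starting updates.
    - (void)startLocationUpdatesIfAllowed
    {
        CLAuthorizationStatus status = [CLLocationManager authorizationStatus];
        if (status == kCLAuthorizationStatusDenied ||
            status == kCLAuthorizationStatusRestricted) {
            // Location services are off for this app; degrade gracefully.
            return;
        }
        [self.locationManager startUpdatingLocation];
    }

    // CLLocationManagerDelegate: react when the user changes the setting,
    // e.g. while the app was suspended in the background.
    - (void)locationManager:(CLLocationManager *)manager
        didChangeAuthorizationStatus:(CLAuthorizationStatus)status
    {
        if (status == kCLAuthorizationStatusDenied) {
            [manager stopUpdatingLocation];
        }
    }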
Core Text
- New function: CTFontDrawGlyphs(). No need to convert CTFont objects to Core Graphics fonts with CTFontCopyGraphicsFont() before drawing them anymore (see the sketch after this list).
- New function: CTFontGetLigatureCaretPositions() retrieves a list of possible caret positions inside a ligature.
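Here is a rough sketch of drawing glyphs directly with the new function, meant to run inside a view’s -drawRect: (font, characters, and positions are arbitrary; flipping the context for UIKit’s coordinate system is omitted):

    CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 24.0, NULL);
    UniChar characters[2] = { 'H', 'i' };
    CGGlyph glyphs[2];
    CGPoint positions[2] = { {10.0, 50.0}, {30.0, 50.0} };

    if (CTFontGetGlyphsForCharacters(font, characters, glyphs, 2)) {
        CGContextRef context = UIGraphicsGetCurrentContext();
        // No detour via CTFontCopyGraphicsFont() and CGContextShowGlyphs() needed.
        CTFontDrawGlyphs(font, glyphs, positions, 2, context);
    }
    CFRelease(font);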
Event Kit
EKEventViewController now requires you to set a delegate (conforming to EKEventViewDelegate) that informs your app when the view controller should be closed and what action the user took (tapped the Done button, responded to an event and saved it, or deleted it).
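The delegate callback looks roughly like this (a sketch, assuming the controller was presented modally):

    - (void)eventViewController:(EKEventViewController *)controller
          didCompleteWithAction:(EKEventViewAction)action
    {
        // action is EKEventViewActionDone, EKEventViewActionResponded,
        // or EKEventViewActionDeleted.
        [self dismissModalViewControllerAnimated:YES];
    }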
Game Kit
The new GKFriendRequestComposeViewController and GKFriendRequestComposeViewControllerDelegate can be used to present a screen that allows the player to send Game Center friend requests to other players from inside your app.
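A sketch of how this might look (the e-mail address is a placeholder and error handling is omitted):

    - (void)showFriendRequestScreen
    {
        GKFriendRequestComposeViewController *composer =
            [[GKFriendRequestComposeViewController alloc] init];
        composer.composeViewDelegate = self;
        [composer addRecipientsWithEmailAddresses:
            [NSArray arrayWithObject:@"friend@example.com"]];
        [self presentModalViewController:composer animated:YES];
        [composer release];
    }

    // GKFriendRequestComposeViewControllerDelegate
    - (void)friendRequestComposeViewControllerDidFinish:
        (GKFriendRequestComposeViewController *)viewController
    {
        [self dismissModalViewControllerAnimated:YES];
    }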
iAd
iAd now supports iPad-sized banners (1024x66 and 768x66 points). Your app should use the ADBannerContentSizeIdentifierPortrait and ADBannerContentSizeIdentifierLandscape constants to request the appropriate banner size for the current platform.
Map Kit
- MKMapView has a new method: -annotationsInMapRect: returns a set of all map annotations in a specified region. According to Apple, “This method is much faster than doing a linear search of the objects in the annotations property yourself.”
- Apps should override MKAnnotationView’s new -setDragState:animated: method to implement drag and drop support for custom annotation views. As the system detects user actions that would indicate a drag, it calls this method to update the drag state. In response, your app can perform animations to visualize state changes. See the sketch after this list.
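Two quick sketches: fetching the visible annotations, and a skeleton of the drag state override (the custom animations themselves are left out; mapView is a placeholder property of mine):

    // All annotations in the currently visible portion of the map.
    NSSet *visibleAnnotations =
        [self.mapView annotationsInMapRect:self.mapView.visibleMapRect];

    // In a custom MKAnnotationView subclass:
    - (void)setDragState:(MKAnnotationViewDragState)newDragState animated:(BOOL)animated
    {
        if (newDragState == MKAnnotationViewDragStateStarting) {
            // Run a custom "lift" animation here, then mark the drag as active.
            self.dragState = MKAnnotationViewDragStateDragging;
        } else if (newDragState == MKAnnotationViewDragStateEnding ||
                   newDragState == MKAnnotationViewDragStateCanceling) {
            // Run a custom "drop" animation here, then end the drag.
            self.dragState = MKAnnotationViewDragStateNone;
        }
    }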
Media Player
- MPMediaEntity is the new common abstract superclass for MPMediaItem and MPMediaItemCollection. With this change, collections can now contain both items and other collections.
- The MPVolumeView interface now includes a control for routing audio content to AirPlay-enabled devices. It gained two new properties, showsRouteButton and showsVolumeSlider, to control which UI elements should be visible (see the example after this list).
- Persistent IDs are now not only available for songs, but also for artists, albums, album artists, composers, genres, and podcasts. +[MPMediaItem persistentIDPropertyForGroupingType:] helps translate between persistent ID keys and MPMediaGrouping keys. Similarly, +[MPMediaItem titlePropertyForGroupingType:] translates between MPMediaGrouping keys and title keys.
- Results of an MPMediaQuery can now be further divided into sections, represented by the new MPMediaQuerySection class. Each section has a localized title and identifies the range of items in the media query that fall into that section. You access a query’s sections through MPMediaQuery’s itemSections or collectionSections properties.
- MPMoviePlayerController’s playback interface has been standardized in the MPMediaPlayback protocol.
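For example, an AirPlay route button without the volume slider (the frame is a placeholder):

    MPVolumeView *volumeView =
        [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    volumeView.showsRouteButton = YES;   // the AirPlay routing control
    volumeView.showsVolumeSlider = NO;   // hide the volume slider
    [self.view addSubview:volumeView];
    [volumeView release];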
Quartz Core
CAShapeLayer, the layer class for displaying Core Graphics paths, gained new properties to control the relative start and end points of the path: strokeStart and strokeEnd. These properties are animatable and should come in handy if you want to animate the creation of a path from start to finish.
Quick Look
QLPreviewControllerDelegate gained two new methods to help provide a smooth transition between a document icon or thumbnail and the full-size Quick Look view. -previewController:frameForPreviewItem:inSourceView: asks for the frame of the preview item to animate a zoom effect between the preview and the full-screen view. -previewController:transitionImageForPreviewItem:contentRect: requests a UIImage of the preview item that the Quick Look controller can crossfade with during the zoom animation.
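A sketch of the two callbacks, assuming the preview is opened from a thumbnail UIImageView of my own naming (thumbnailView):

    - (CGRect)previewController:(QLPreviewController *)controller
            frameForPreviewItem:(id <QLPreviewItem>)item
                   inSourceView:(UIView **)view
    {
        // Return the thumbnail's frame so the zoom starts from it.
        *view = self.thumbnailView;
        return self.thumbnailView.bounds;
    }

    - (UIImage *)previewController:(QLPreviewController *)controller
        transitionImageForPreviewItem:(id <QLPreviewItem>)item
                          contentRect:(CGRect *)contentRect
    {
        // The image to crossfade with during the zoom animation.
        *contentRect = self.thumbnailView.bounds;
        return self.thumbnailView.image;
    }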
UIKit
- New “scroll by page” capabilities using VoiceOver. If your app contains a view that supports a scroll by page action, you should implement the -accessibilityScroll: method in the UIAccessibilityAction informal protocol.
- UIApplicationDelegate has a new method, -application:openURL:sourceApplication:annotation:, which provides your app with further information when it was launched from another app. You not only get notified which app launched yours, but the calling application can also pass arbitrary data in the form of a property list to your app using the annotation argument (see the sketch after this list). Unfortunately, the annotation is only available if the calling app uses UIDocumentInteractionController. If the calling app uses -[UIApplication openURL:], it still has to resort to URL parameters to pass information along.
- UIDevice now has a -playInputClick method that lets us play the standard keyboard click sound from our app. A click plays only if the user has enabled keyboard clicks. Yay!
- The UITextInputMode class now exposes the language currently in use for inputting text via its primaryLanguage property.
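A sketch of the new launch callback (the logging is just for illustration):

    - (BOOL)application:(UIApplication *)application
                openURL:(NSURL *)url
      sourceApplication:(NSString *)sourceApplication
             annotation:(id)annotation
    {
        // sourceApplication is the bundle identifier of the calling app;
        // annotation is the property list it passed along (may be nil).
        NSLog(@"Opened %@ from %@ with annotation %@",
              url, sourceApplication, annotation);
        return YES;
    }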