What's New in iOS SDK 4.2

Now that the final version of iOS SDK 4.2 is available, let's take a look at what's new. What follows is a detailed overview of Apple's What's New in iOS 4.2 and iOS 4.2 API Diffs documents.

The iPad makes the jump to 4.x

iOS 4.2 is the release that finally unifies the iPhone and iPad SDKs. Consequently, the iPad gains features that already came to the iPhone in iOS 4.0 and 4.1, such as multitasking, folders, and Game Center.

iPad developers who have been living under a rock over the summer should also have a look at the What's New in iOS 4.0 and iOS 4.1 documents.

Printing

Apps can now print wirelessly from iOS devices to supported printers. Unfortunately, Apple pulled the ability to print to any printer shared by an OS X machine on the network at the last minute. That functionality can already be restored with third-party tools such as Printopia or FingerPrint and will hopefully return in a future OS X 10.6.x release.

To support printing in your app, first determine if the device supports printing (+[UIPrintInteractionController isPrintingAvailable]), then retrieve the singleton UIPrintInteractionController with +[UIPrintInteractionController sharedPrintController] and provide your content to the print controller via one of its printingItem, printingItems, printPageRenderer, or printFormatter properties. UIPrintInteractionController can directly print images or PDF content (from URLs or in the form of NSData, UIImage, or ALAsset objects).
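
A minimal sketch of that flow, assuming self is a view controller and the image to print is passed in (error handling kept to a minimum):

    - (void)printImage:(UIImage *)image
    {
        // Bail out if this device cannot print at all.
        if (![UIPrintInteractionController isPrintingAvailable]) {
            return;
        }

        UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];

        // Describe the print job.
        UIPrintInfo *printInfo = [UIPrintInfo printInfo];
        printInfo.outputType = UIPrintInfoOutputPhoto;
        printInfo.jobName = @"My Photo";
        controller.printInfo = printInfo;

        // Hand the image directly to the print controller.
        controller.printingItem = image;

        // Present the standard print sheet. On the iPad you would typically
        // use -presentFromBarButtonItem:animated:completionHandler: or
        // -presentFromRect:inView:animated:completionHandler: instead.
        [controller presentAnimated:YES completionHandler:^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
            if (!completed && error != nil) {
                NSLog(@"Printing failed: %@", error);
            }
        }];
    }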

Via built-in subclasses of the abstract UIPrintFormatter class, the printing of plain text (UISimpleTextPrintFormatter), HTML (UIMarkupTextPrintFormatter), and UIView contents (UIViewPrintFormatter) is also supported out of the box.
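
For example, printing a plain string could look like this (a sketch; the string and page setup are placeholders):

    // Print a plain string via the built-in text formatter.
    UISimpleTextPrintFormatter *formatter =
        [[[UISimpleTextPrintFormatter alloc] initWithText:@"Hello, printer!"] autorelease];
    formatter.startPage = 0;

    UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];
    controller.printFormatter = formatter;
    [controller presentAnimated:YES completionHandler:NULL];

Similarly, -[UIView viewPrintFormatter] returns a ready-made UIViewPrintFormatter for the receiving view.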

To render the printable content yourself, subclass UIPrintPageRenderer.
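
A bare-bones renderer subclass might look like this (the drawing code is just a placeholder):

    @interface MyPageRenderer : UIPrintPageRenderer
    @end

    @implementation MyPageRenderer

    // Tell the printing system how many pages we are going to draw.
    - (NSInteger)numberOfPages
    {
        return 1;
    }

    // Draw each page into the printable area.
    - (void)drawPageAtIndex:(NSInteger)pageIndex inRect:(CGRect)printableRect
    {
        [@"Rendered by MyPageRenderer" drawInRect:printableRect
                                         withFont:[UIFont systemFontOfSize:24.0]];
    }

    @end

You would then assign an instance of the subclass to the print controller's printPageRenderer property.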

AirPlay

With the 4.2 SDK, third-party apps can use AirPlay to stream audio to AirPlay devices such as the Apple TV. Video streaming is not supported at the moment. Apple:

AirPlay support is built in to the AV Foundation framework and the Core Audio family of frameworks. Any audio content you play using these frameworks is automatically made eligible for AirPlay distribution. Once the user chooses to play your audio using AirPlay, it is routed automatically by the system.
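
In other words, if you already play audio through AVAudioPlayer, for example, no AirPlay-specific code should be needed (a sketch; sound.m4a is a placeholder resource):

    #import <AVFoundation/AVFoundation.h>

    // Ordinary AVAudioPlayer playback; the system takes care of AirPlay
    // routing once the user picks an AirPlay destination.
    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"m4a"];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:NULL];
    [player prepareToPlay];
    [player play];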

Core MIDI

Core MIDI is a new framework that lets an iOS device communicate with MIDI devices such as keyboards and synthesizers. It consists of three classes to set up and manage MIDI network connections: MIDINetworkHost, MIDINetworkConnection, and MIDINetworkSession. The framework also includes the C-based MIDI Services API. I have not had the opportunity to investigate this further, especially how to connect a MIDI device to an iPhone or iPad (Apple says devices can be connected via the dock connector or over the network).
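
Judging from the documentation, enabling network MIDI seems to come down to something like the following; consider this an untested sketch:

    #import <CoreMIDI/CoreMIDI.h>
    #import <CoreMIDI/MIDINetworkSession.h>

    // Opt in to network MIDI so other hosts can connect to this device.
    MIDINetworkSession *session = [MIDINetworkSession defaultSession];
    session.enabled = YES;
    session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;

    // The actual MIDI I/O goes through the C-based MIDI Services API.
    MIDIClientRef client;
    MIDIClientCreate(CFSTR("MyMIDIClient"), NULL, NULL, &client);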

Weak Linking Support

Developers can now opt to weak-link individual classes (instead of entire frameworks) that are not available in their deployment target's iOS version. This mechanism simplifies the code needed to check for the availability of certain features at runtime. Marco Arment has already written a great tutorial on this topic: Supporting older versions of iOS while using new APIs.
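
A sketch of the pattern, assuming a Base SDK of 4.2 and a lower deployment target:

    // With weakly linked classes, the class symbol is NULL on older
    // systems, so messaging it simply evaluates to nil there.
    if ([UIPrintInteractionController class] != nil) {
        if ([UIPrintInteractionController isPrintingAvailable]) {
            // Safe to set up and present the print controller here.
        }
    } else {
        // Running on an iOS version without printing support.
    }

Previously, you had to fall back to NSClassFromString(@"UIPrintInteractionController") for this kind of check.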

Enhancements to Existing Frameworks

AVFoundation

Core Location

Core Text

Event Kit

EKEventViewController now requires you to set a delegate conforming to the new EKEventViewDelegate protocol, which informs your app when the view controller should be closed and what action the user took (tapped the Done button, responded to an event and saved it, or deleted it).
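
A sketch of the corresponding delegate method (assuming the event view controller was presented modally by self):

    // EKEventViewDelegate: dismiss the event view controller and
    // react to whatever the user did.
    - (void)eventViewController:(EKEventViewController *)controller
          didCompleteWithAction:(EKEventViewAction)action
    {
        switch (action) {
            case EKEventViewActionDone:      // user tapped Done
            case EKEventViewActionResponded: // user responded to an invitation
            case EKEventViewActionDeleted:   // user deleted the event
                [self dismissModalViewControllerAnimated:YES];
                break;
        }
    }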

Game Kit

The new GKFriendRequestComposeViewController and GKFriendRequestComposeViewControllerDelegate can be used to present a screen that allows the player to send Game Center friend requests to other players from inside your app.
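
Presenting it might look roughly like this (a sketch; self is assumed to be a view controller that adopts GKFriendRequestComposeViewControllerDelegate):

    #import <GameKit/GameKit.h>

    - (void)showFriendRequestScreen
    {
        GKFriendRequestComposeViewController *composer =
            [[[GKFriendRequestComposeViewController alloc] init] autorelease];
        composer.composeViewDelegate = self;
        [self presentModalViewController:composer animated:YES];
    }

    // GKFriendRequestComposeViewControllerDelegate
    - (void)friendRequestComposeViewControllerDidFinish:(GKFriendRequestComposeViewController *)viewController
    {
        [self dismissModalViewControllerAnimated:YES];
    }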

iAd

iAd now supports iPad-sized banners (1024x66 and 768x66 points). Your app should use the ADBannerContentSizeIdentifierPortrait and ADBannerContentSizeIdentifierLandscape constants to request the appropriate banner size for the current platform.
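
Setting up a banner with the new generic identifiers could look like this (a sketch inside a view controller):

    #import <iAd/iAd.h>

    // Let the banner pick the right size for the device and orientation.
    ADBannerView *banner = [[[ADBannerView alloc] initWithFrame:CGRectZero] autorelease];
    banner.requiredContentSizeIdentifiers =
        [NSSet setWithObjects:ADBannerContentSizeIdentifierPortrait,
                              ADBannerContentSizeIdentifierLandscape, nil];
    banner.currentContentSizeIdentifier = ADBannerContentSizeIdentifierPortrait;
    [self.view addSubview:banner];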

Map Kit

  • MKMapView has a new method: -annotationsInMapRect: returns a set of all map annotations in a specified region. According to Apple, “This method is much faster than doing a linear search of the objects in the annotations property yourself.”
  • Apps should override MKAnnotationView's new -setDragState:animated: method to implement drag-and-drop support for custom annotation views. As the system detects user actions that indicate a drag, it calls this method to update the drag state. In response, your app can perform animations to visualize the state changes (see the sketch after this list).
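
An override of -setDragState:animated: in a custom MKAnnotationView subclass might look roughly like this (the lift and drop animations are omitted):

    - (void)setDragState:(MKAnnotationViewDragState)newDragState animated:(BOOL)animated
    {
        if (newDragState == MKAnnotationViewDragStateStarting) {
            // Perform a lift animation here, then report that dragging is in progress.
            self.dragState = MKAnnotationViewDragStateDragging;
        } else if (newDragState == MKAnnotationViewDragStateEnding ||
                   newDragState == MKAnnotationViewDragStateCanceling) {
            // Perform a drop animation here, then return to the normal state.
            self.dragState = MKAnnotationViewDragStateNone;
        } else {
            [super setDragState:newDragState animated:animated];
        }
    }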

Media Player

Quartz Core

  • CAShapeLayer, the layer class to display Core Graphics paths, gained new properties to control the relative start and end points of the path: strokeStart and strokeEnd. These properties are animatable and should come in handy if you want to animate the creation of a path from start to finish.
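
Animating the drawing of a path could then be as simple as this (a sketch; shapeLayer is a CAShapeLayer whose path is already set):

    #import <QuartzCore/QuartzCore.h>

    // Animate strokeEnd from 0 to 1 so the path appears to draw itself.
    CABasicAnimation *drawAnimation = [CABasicAnimation animationWithKeyPath:@"strokeEnd"];
    drawAnimation.fromValue = [NSNumber numberWithFloat:0.0f];
    drawAnimation.toValue   = [NSNumber numberWithFloat:1.0f];
    drawAnimation.duration  = 2.0;
    [shapeLayer addAnimation:drawAnimation forKey:@"drawPathAnimation"];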

Quick Look

UIKit

  • New “scroll by page” capabilities using VoiceOver. If your app contains a view that supports a scroll by page action, you should implement the -accessibilityScroll: method in the UIAccessibilityAction informal protocol.
  • UIApplicationDelegate has a new method, -application:openURL:sourceApplication:annotation:, which provides your app with more information when it is launched from another app (see the sketch after this list). You not only learn which app launched yours, but the calling app can also pass arbitrary data in the form of a property list via the annotation argument. Unfortunately, the annotation is only delivered if the calling app uses UIDocumentInteractionController. If the calling app uses -[UIApplication openURL:], it still has to resort to URL parameters to pass information along.
  • UIDevice now has a -playInputClick method that lets us play the standard keyboard click sound from our app. A click plays only if the user has enabled keyboard clicks. Yay!
  • The new UITextInputMode class exposes the language currently in use for text input via its primaryLanguage property.
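
The sketch mentioned above, showing the new app delegate method:

    // New in iOS 4.2: the delegate also receives the calling app's bundle
    // identifier and an optional property-list annotation.
    - (BOOL)application:(UIApplication *)application
                openURL:(NSURL *)url
      sourceApplication:(NSString *)sourceApplication
             annotation:(id)annotation
    {
        NSLog(@"URL %@ opened by %@ with annotation %@", url, sourceApplication, annotation);
        return YES; // Return YES if the URL was handled.
    }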