Now that the final version of iOS SDK 4.3 is available, let’s have a look at what is new. What follows is a detailed overview of the What’s New in iOS 4.3 and iOS 4.3 API Diffs documents. Compared to the changes in iOS SDK 4.2, the 4.3 update is rather small.
No more iPhone 3G support
iOS 4.3 drops support for the iPhone 3G and the second-generation iPod touch. So if you need your apps to run on these devices, be sure not to require any of the new features.
App switching gestures on the iPad
Apple introduced new four- and five-finger gestures to switch between apps on the iPad. These gestures are not yet activated for consumers, but developers can enable them in Settings. If your app uses gestures that potentially involve four or more fingers, you should test it for possible interference with the new multitasking gestures and discuss your concerns in the Apple Developer Forums.
AirPlay for everybody
Apple introduced AirPlay in iOS 4.2 for some of its own apps; now the feature can be used by all third-party apps that play video through MPMoviePlayerController. AirPlay is disabled by default, but all you have to do is set the player’s allowsAirPlay property to YES and the OS manages the display of the AirPlay button for you as soon as it detects an AirPlay device nearby.
You can also enable AirPlay for web-based video content embedded through the QuickTime Plug-in or HTML5 video element.
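Opting in is a one-liner. Here is a minimal sketch (the video URL is a placeholder):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Sketch: opt a movie player in to AirPlay (iOS 4.3+).
// The URL is a placeholder; substitute your own video source.
NSURL *videoURL = [NSURL URLWithString:@"http://example.com/video.m3u8"];
MPMoviePlayerController *player =
    [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
player.allowsAirPlay = YES; // the OS shows the AirPlay button when a device is in range
[player play];
```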
Chapter information in AVAsset
AVAsset can now expose the chapters it contains via the new method chapterMetadataGroupsWithTitleLocale:containingItemsWithCommonKeys:. Each chapter is returned in the form of an AVMetadataItem containing the chapter’s title and time range. This looks super-useful for apps that work with podcasts or audiobooks. The new property availableChapterLocales can tell you the locales in which chapter information is available.
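A sketch of how reading chapter data might look (the file path is a placeholder; I’m picking the first available chapter locale):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: log the chapters of a local audiobook. The path is a placeholder.
NSURL *fileURL = [NSURL fileURLWithPath:@"/path/to/audiobook.m4b"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
NSLocale *locale = [[asset availableChapterLocales] lastObject];
NSArray *chapters =
    [asset chapterMetadataGroupsWithTitleLocale:locale
                  containingItemsWithCommonKeys:[NSArray array]];
// Each element is an AVTimedMetadataGroup; its items hold the chapter title,
// and its timeRange gives the chapter's position in the asset.
for (AVTimedMetadataGroup *chapter in chapters) {
    for (AVMetadataItem *item in chapter.items) {
        NSLog(@"%@ starts at %.1f s", [item stringValue],
              CMTimeGetSeconds(chapter.timeRange.start));
    }
}
```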
AVAsset usage restrictions
Some new properties of AVAsset provide information on what an asset can be used for:

composable: Indicates whether the asset can be used within a segment of an AVComposition.
exportable: Indicates whether the asset can be exported using AVAssetExportSession.
playable: Indicates whether the asset, or its URL, can be used to initialize an instance of AVPlayerItem.
readable: Indicates whether the asset’s media data can be extracted using AVAssetReader.
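For example, a hypothetical export routine might consult these flags before doing any work (assetURL is a placeholder):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: check an asset's capability flags before acting on it.
// assetURL is a placeholder for a real media URL.
NSURL *assetURL = [NSURL fileURLWithPath:@"/path/to/movie.m4v"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
if (asset.exportable) {
    // Safe to hand the asset to an AVAssetExportSession.
}
if (!asset.readable) {
    NSLog(@"Cannot extract media data with AVAssetReader");
}
```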
Network playback statistics
When playing a network stream, you can now track network playback statistics through two new methods on AVPlayerItem: accessLog and errorLog. These methods return instances of AVPlayerItemAccessLog and AVPlayerItemErrorLog, respectively, which in turn contain arrays of AVPlayerItemAccessLogEvent and AVPlayerItemErrorLogEvent instances that represent the individual log events.
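Inspecting the logs might look like this (playerItem stands in for your AVPlayer’s current item):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: dump playback statistics for an item playing a network stream.
// `playerItem` is assumed to be an existing AVPlayerItem.
AVPlayerItemAccessLog *accessLog = [playerItem accessLog];
for (AVPlayerItemAccessLogEvent *event in accessLog.events) {
    NSLog(@"observed bitrate: %.0f bps", event.observedBitrate);
}
AVPlayerItemErrorLog *errorLog = [playerItem errorLog];
for (AVPlayerItemErrorLogEvent *event in errorLog.events) {
    NSLog(@"error %ld in domain %@", (long)event.errorStatusCode,
          event.errorDomain);
}
```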
Asynchronous metadata loading
The AVMetadataItem class can now load metadata asynchronously. Call loadValuesAsynchronouslyForKeys:completionHandler: to initiate the load process and statusOfValueForKey:error: to check whether the metadata for a key has been loaded.
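A minimal sketch of the pattern (item stands in for an AVMetadataItem you obtained elsewhere, e.g. from an asset’s commonMetadata):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: load a metadata item's value without blocking the caller.
// `item` is assumed to be an existing AVMetadataItem.
[item loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"value"]
                    completionHandler:^{
    NSError *error = nil;
    if ([item statusOfValueForKey:@"value" error:&error] ==
            AVKeyValueStatusLoaded) {
        NSLog(@"metadata value: %@", [item value]);
    } else {
        NSLog(@"metadata load failed: %@", error);
    }
}];
```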
New metadata keys
In addition, Apple has defined some very interesting-looking-but-so-far-undocumented new constants for metadata keys. Judging by their names, there is metadata not only about the location of a video, but also about the direction the camera is facing and the movement of the camera over the duration of the video. It would be interesting to check whether a video taken with the iPhone actually contains this metadata (I haven’t checked).
Audio
Let me just quote from Apple’s What’s New document here because I haven’t got anything to add:
The Audio Unit and Audio Toolbox frameworks include the following enhancements:
The AudioUnitParameterHistoryInfo struct (in the Audio Unit framework), along with supporting audio unit properties, adds the ability to track and use parameter automation history.
The ExtendedAudioFormatInfo struct (in the Audio Toolbox framework) lets you specify which codec to use when accessing the …
The kAFInfoDictionary_SourceBitDepth dictionary key and the kAudioFilePropertySourceBitDepth property (in the Audio Toolbox framework) provide access to the bit depth of an audio stream.
The kAudioConverterErr_NoHardwarePermission result code (in the Audio Toolbox framework) indicates that a request to create a new audio converter object cannot be satisfied because the application does not have permission to use the requested hardware codec.
Core Foundation
In iOS SDK 4.2, the CFStringGetHyphenationLocationBeforeIndex() function was added to hyphenate CFStrings. In iOS SDK 4.3, we got another new function, CFStringIsHyphenationAvailableForLocale(), to ask the system whether hyphenation information is available for the specified locale.
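The two functions combine naturally: check availability first, then ask for a break point. A sketch:

```objc
#import <CoreFoundation/CoreFoundation.h>
#import <Foundation/Foundation.h>

// Sketch: check for hyphenation support, then find a hyphenation point.
CFLocaleRef locale = CFLocaleCreate(kCFAllocatorDefault, CFSTR("en_US"));
if (CFStringIsHyphenationAvailableForLocale(locale)) {
    CFStringRef word = CFSTR("hyphenation");
    CFRange range = CFRangeMake(0, CFStringGetLength(word));
    // Last legal hyphenation point before character index 6,
    // or kCFNotFound if there is none.
    CFIndex breakIndex = CFStringGetHyphenationLocationBeforeIndex(
        word, 6, range, 0, locale, NULL);
    NSLog(@"hyphenate before index %ld", (long)breakIndex);
}
CFRelease(locale);
```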
Core Text
Apple added some new constants to the Core Text framework. They are not documented yet (besides the comments in the header files), but it seems that Core Text on iOS supports a few new font traits and formatting settings, such as line spacing in paragraphs or non-rectangular clipping paths for CTFrames. Judging by the header comments, the new constants cover:

A font trait signaling that color bitmap glyphs are available.
An attribute that specifies an array of paths to clip a frame.
A paragraph style setting for the space in points added between lines within the paragraph.
A vertical forms attribute: a value of false indicates that horizontal glyph forms are to be used; true indicates that vertical glyph forms are to be used.
iAd
In addition to small banners, iAd now also supports full-screen ads on the iPad (to be used, for instance, as full-page ads in a magazine app). Use the new ADInterstitialAd class to display them.
iAd also got a new error state, ADErrorApplicationInactive.
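A rough sketch of the flow (the class name is hypothetical; the iAd delegate methods are from the framework, and presentation happens once the ad has loaded):

```objc
#import <iAd/iAd.h>
#import <UIKit/UIKit.h>

// Sketch: a hypothetical iPad view controller that shows a full-page ad.
@interface MagazinePageViewController : UIViewController <ADInterstitialAdDelegate>
@property (nonatomic, retain) ADInterstitialAd *interstitial;
@end

@implementation MagazinePageViewController
@synthesize interstitial;

- (void)viewDidLoad {
    [super viewDidLoad];
    self.interstitial = [[[ADInterstitialAd alloc] init] autorelease];
    self.interstitial.delegate = self;
}

// Present the ad once it has finished loading.
- (void)interstitialAdDidLoad:(ADInterstitialAd *)interstitialAd {
    [interstitialAd presentFromViewController:self];
}

- (void)interstitialAd:(ADInterstitialAd *)interstitialAd
      didFailWithError:(NSError *)error {
    NSLog(@"interstitial failed: %@", error);
}

@end
```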
Image I/O
Apple defined some new constants to make it easier to retrieve some frequently needed EXIF information about camera and lens model from a photo. Use CGImageSourceCopyProperties() to retrieve an image’s EXIF dictionary.
Media Player
Besides AirPlay support, the MPMoviePlayerController class also gained new properties to track network playback statistics, analogous to the AV Foundation framework. If the player is playing a network stream, accessLog and errorLog reference instances of two new classes, MPMovieAccessLog and MPMovieErrorLog, each containing arrays of MPMovieAccessLogEvent and MPMovieErrorLogEvent objects that represent the individual log events.
UIKit
UIViewController has a new method called disablesAutomaticKeyboardDismissal, which you can override to control whether the keyboard should be dismissed automatically when the user switches from a control that uses the keyboard to one that does not. By default, this method returns NO, except when a view controller is presented modally with its modal presentation style set to UIModalPresentationFormSheet (in which case it returns YES).
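Overriding it is trivial. A sketch, inside a UIViewController subclass:

```objc
// Sketch: keep the keyboard visible even when focus moves
// to a control that does not use it.
- (BOOL)disablesAutomaticKeyboardDismissal {
    return YES;
}
```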
To support the new screen mirroring feature of the iPad 2, a new read-only property was added to UIScreen: if screen mirroring is active, mirroredScreen will contain the screen object that is being mirrored (the device’s main screen). preferredMode (undocumented so far) is the preferred UIScreenMode of the screen in question. From the header file:
Choosing this mode will likely produce the best results.
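Putting the two properties together, detecting an active mirroring session might look like this sketch:

```objc
#import <UIKit/UIKit.h>

// Sketch: check whether any connected screen is mirroring the main screen.
UIScreen *mainScreen = [UIScreen mainScreen];
for (UIScreen *screen in [UIScreen screens]) {
    if (screen.mirroredScreen == mainScreen) {
        NSLog(@"mirroring active, preferred mode: %@", screen.preferredMode);
    }
}
```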