Ole Begemann

iOS Development

What's New for Developers in Mac OS X Lion (Part 3)

This is the third and final article in my little series covering the new developer APIs in Lion. Check out parts one and two if you haven’t done so already.

Core Data

Formalized Concurrency Model

Working with Core Data on a background thread involves manually managing two separate managed object contexts and their associated object graphs, taking care that you do not pass managed objects from one thread/context to another. You should never access a managed object context outside the thread or dispatch queue that created it. This is still true in Lion, and the system now lets you formalize this agreement by passing NSConfinementConcurrencyType to the new initWithConcurrencyType: initializer.

If you pass NSPrivateQueueConcurrencyType to initWithConcurrencyType:, the context creates and manages a private dispatch queue to operate on. To send any message to such a context, you must use the new methods performBlock: or performBlockAndWait:. The context will then execute the passed blocks on its own queue.

NSMainQueueConcurrencyType creates a context that is associated with the main dispatch queue and thus the main thread. Use such a context for work that is tied to controllers and UI objects, which must run on the main thread.
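A minimal sketch of the queue-based type (psc stands in for your own NSPersistentStoreCoordinator):

```objc
#import <CoreData/CoreData.h>

// A context confined to its own private dispatch queue.
NSManagedObjectContext *background =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
[background setPersistentStoreCoordinator:psc]; // psc: your coordinator

// All access must go through performBlock: or performBlockAndWait:.
[background performBlock:^{
    // This block runs on the context's private queue.
    NSError *error = nil;
    if (![background save:&error]) {
        NSLog(@"Save failed: %@", error);
    }
}];
```

performBlock: returns immediately and runs the block asynchronously; performBlockAndWait: blocks the caller until the block has finished.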

Support for Nested Managed Object Contexts

In earlier releases of OS X, a managed object context is always directly associated with a “parent” persistent store coordinator. In Lion, rather than having a persistent store coordinator, a context can also have another managed object context as its parent. This nested relationship between managed object contexts has two very useful use cases:

  1. Fetches and saves are mediated by the parent context, which can execute these time-consuming operations in the background, thereby avoiding blocking the UI.

  2. Edits become discardable: simply throw away the child context and its unsaved changes disappear.

Use the -setParentContext: method to assign a parent context to your managed object context.
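Combining nesting with the new concurrency types, a typical setup might look like this sketch (psc is again a placeholder for your NSPersistentStoreCoordinator):

```objc
#import <CoreData/CoreData.h>

// The parent context does the actual I/O on a private queue.
NSManagedObjectContext *parent =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
[parent setPersistentStoreCoordinator:psc]; // psc: your coordinator

// The child context serves the UI on the main queue; saving it only
// pushes changes up to the parent, not to disk.
NSManagedObjectContext *child =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSMainQueueConcurrencyType];
[child setParentContext:parent];

// Saving the child is cheap; the parent persists in the background.
NSError *error = nil;
if ([child save:&error]) {
    [parent performBlock:^{
        NSError *parentError = nil;
        if (![parent save:&parentError]) {
            NSLog(@"Background save failed: %@", parentError);
        }
    }];
}
```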

Ordered Relationships

To-many relationships in Core Data can now optionally have an order. This much-awaited feature should save many people the tedious work of manually maintaining an order attribute for the items in a relationship. See the isOrdered attribute of NSRelationshipDescription.

In code, ordered relationships are represented by the new NSOrderedSet class (see below). Note that ordered relationships are significantly less efficient than unordered ones. You should only use them if a relationship has an intrinsic order. Do not use ordered relationships just because you want to avoid sorting your fetch results.

External Storage for Attributes

Another area that used to require manual work is large binary data (such as images) that is best kept out of the data store itself and held in external files instead. Developers had to generate unique filenames, create the external files, store the filenames in the data store, and delete the files when the associated managed object was deleted.

In Lion, managed objects support optional external storage for attribute values. If you specify in the Core Data editor that the value of a certain attribute (typically a BLOB) may be stored externally, Core Data uses internal heuristics (probably based on the size of the BLOB) to decide whether to store the data directly in the database or in a separate external file. As a developer, you don’t have to care either way. See the allowsExternalBinaryDataStorage attribute of NSAttributeDescription. I think this is a great feature.

Fetch Requests

NSFetchRequest also got a few new features in Lion.

Implement Your Own Incremental Stores

Two new classes, NSIncrementalStore and NSIncrementalStoreNode, allow you to add support for other non-atomic persistent stores besides the existing SQLite store. (Writing custom atomic persistent stores was already supported in earlier versions of OS X.)

One cool possibility would be to write a Core Data interface to a web service backend such as CouchDB.

The Core Data Release Notes offer the best overview of the new features in Apple’s documentation.

Core Foundation

With the CFStringGetHyphenationLocationBeforeIndex() function it is now possible to locate potential hyphenation points within a word. Since hyphenation is language-specific, you have to pass the locale of the string you are inspecting to the function. Do not forget to check beforehand whether the system supports hyphenation for that locale by calling CFStringIsHyphenationAvailableForLocale().
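For example, to find the last potential hyphenation point in a word (a sketch; the word and locale are arbitrary):

```objc
#include <CoreFoundation/CoreFoundation.h>

CFLocaleRef locale = CFLocaleCreate(kCFAllocatorDefault, CFSTR("en_US"));
CFStringRef word = CFSTR("hyphenation");

if (CFStringIsHyphenationAvailableForLocale(locale)) {
    // Searches backwards from the given index for a hyphenation point.
    CFIndex location = CFStringGetHyphenationLocationBeforeIndex(
        word,
        CFStringGetLength(word),
        CFRangeMake(0, CFStringGetLength(word)),
        0,       // options; none defined at present
        locale,
        NULL);   // optionally receives the suggested hyphen character
    if (location != kCFNotFound) {
        NSLog(@"Possible hyphenation point before index %ld", (long)location);
    }
}
CFRelease(locale);
```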

See the Core Foundation Release Notes for details on this and more changes in Core Foundation.

Core Location

The Core Location framework originated on iOS and was introduced to OS X with Snow Leopard. In Lion, Apple brought the header files up to par with iOS. They even added stuff like CLHeading that doesn’t make a lot of sense on a computer but who knows when the first laptop with a built-in compass will come out? The same goes for the CLDeviceOrientation enum, which includes values for portrait, landscape, face up and face down orientations.

The most useful addition to the framework seems to be region monitoring. Start the CLLocationManager by sending it a startMonitoringForRegion:desiredAccuracy: message to be alerted when the computer enters a specific area, or use startMonitoringSignificantLocationChanges to be notified whenever the computer’s location changes.
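A sketch of region monitoring (the coordinates and identifier are made up, and self is assumed to implement CLLocationManagerDelegate):

```objc
#import <CoreLocation/CoreLocation.h>

CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self; // must conform to CLLocationManagerDelegate

// Define a circular region of 500 meters around a point of interest.
CLLocationCoordinate2D center = CLLocationCoordinate2DMake(37.3317, -122.0307);
CLRegion *region = [[CLRegion alloc] initCircularRegionWithCenter:center
                                                           radius:500.0
                                                       identifier:@"office"];

[manager startMonitoringForRegion:region
                  desiredAccuracy:kCLLocationAccuracyHundredMeters];

// The delegate is then notified of boundary crossings:
// - (void)locationManager:(CLLocationManager *)m didEnterRegion:(CLRegion *)r {
//     NSLog(@"Entered region %@", r.identifier);
// }
```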


Foundation

As usual, the Foundation Release Notes for Lion give the best explanation of the new features in Apple’s documentation. It is unfortunate that these per-framework release notes documents are so hard to discover, as they are not linked from either What’s New in Lion or the API Diffs.

Native JSON Support

With NSJSONSerialization, OS X gets native reading and writing support for JSON, which has become the de facto data format for web APIs.

The +JSONObjectWithData:options:error: class method takes JSON data that you received from a network request and converts it into a hierarchy of Foundation objects (NSDictionary, NSArray, NSString, NSNumber, NSNull). Two useful options, NSJSONReadingMutableContainers and NSJSONReadingMutableLeaves let you specify whether the resulting Foundation objects should be immutable (the default) or mutable. The class also offers a variant to read the data directly from an NSInputStream: +JSONObjectWithStream:options:error:.

To convert an object graph into UTF-8-encoded JSON, call the +dataWithJSONObject:options:error: class method. By default, the resulting JSON data will be as compact as possible. Passing NSJSONWritingPrettyPrinted as an option to the method allows you to generate more readable output.
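A quick round trip through both methods might look like this sketch:

```objc
#import <Foundation/Foundation.h>

// Parsing JSON (here built inline; normally received from the network).
NSData *input = [@"{\"name\":\"Lion\",\"version\":\"10.7\"}"
                    dataUsingEncoding:NSUTF8StringEncoding];
NSError *error = nil;
NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:input
                                                    options:0
                                                      error:&error];
NSLog(@"name = %@", [dict objectForKey:@"name"]);

// Generating pretty-printed, UTF-8-encoded JSON from the object graph.
NSData *output = [NSJSONSerialization dataWithJSONObject:dict
                                                 options:NSJSONWritingPrettyPrinted
                                                   error:&error];
NSLog(@"%@", [[NSString alloc] initWithData:output
                                   encoding:NSUTF8StringEncoding]);
```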

Key-Value Observing

The new method removeObserver:forKeyPath:context: allows you to remove a key-value observer more precisely by passing the same context pointer you used when adding the observer with addObserver:forKeyPath:options:context:. The old removeObserver:forKeyPath: method could remove the wrong observation when the same object is registered as an observer for the same key path more than once, for example once by a superclass and once by a subclass.
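The usual pattern is to use the address of a static variable as the context, which is guaranteed to be unique (account is a placeholder for any KVO-compliant object):

```objc
#import <Foundation/Foundation.h>

// A pointer that no other class can accidentally reuse.
static void *MyObservationContext = &MyObservationContext;

// Register the observation with the unique context...
[account addObserver:self
          forKeyPath:@"balance"
             options:NSKeyValueObservingOptionNew
             context:MyObservationContext];

// ...and later remove exactly that observation, leaving any other
// observations of the same key path untouched.
[account removeObserver:self
             forKeyPath:@"balance"
                context:MyObservationContext];
```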


NSURLConnection

Before Lion, asynchronous NSURLConnections always required a running run loop, making the class a bit difficult to use with operation or dispatch queues. Lion adds queue support: assign an NSOperationQueue with the setDelegateQueue: method and the delegate messages will be delivered on that queue.

The new sendAsynchronousRequest:queue:completionHandler: convenience method will start an asynchronous connection and execute the completionHandler block on the specified queue once the request has finished either successfully or with an error. It frees the developer from implementing any NSURLConnectionDelegate methods if special handling of the connection, continuous progress reports etc. are not required. Very handy.
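A sketch of the convenience method (the URL is a placeholder):

```objc
#import <Foundation/Foundation.h>

NSURLRequest *request =
    [NSURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/api"]];

// Starts the connection and calls the block on the given queue when
// the request finishes, successfully or with an error.
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response,
                                           NSData *data,
                                           NSError *error) {
    if (data != nil) {
        NSLog(@"Received %lu bytes", (unsigned long)[data length]);
    } else {
        NSLog(@"Request failed: %@", error);
    }
}];
```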

Linguistic Tagging

With the new NSLinguisticTagger class, developers can easily analyze natural-language text and identify different grammatical parts of speech. For example, given the sentence “Steve Jobs just resigned as CEO.” and configured to use the tag scheme NSLinguisticTagSchemeLexicalClass, NSLinguisticTagger can identify the words “Steve”, “Jobs” and “CEO” as nouns, the word “just” as an adverb, and the word “as” as a preposition.

Other possible tag schemes include NSLinguisticTagSchemeLemma to identify stem forms of words, NSLinguisticTagSchemeLanguage to identify the language and NSLinguisticTagSchemeScript to identify the script (Latin, Cyrillic, etc.).

The tagging works very well for English. Other languages might not support all tag schemes or might not be supported at all. Call the availableTagSchemesForLanguage: class method to determine the level of support for a language.

To do the analysis, create an instance of NSLinguisticTagger and pass it a string with setString:. Then call enumerateTagsInRange:scheme:options:usingBlock: to iterate over all tags the linguistic tagger found.
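Putting it together for the example sentence above:

```objc
#import <Foundation/Foundation.h>

NSString *sentence = @"Steve Jobs just resigned as CEO.";

NSLinguisticTagger *tagger = [[NSLinguisticTagger alloc]
    initWithTagSchemes:[NSArray arrayWithObject:NSLinguisticTagSchemeLexicalClass]
               options:0];
[tagger setString:sentence];

// Enumerate the lexical class of every word, skipping whitespace
// and punctuation tokens.
[tagger enumerateTagsInRange:NSMakeRange(0, [sentence length])
                      scheme:NSLinguisticTagSchemeLexicalClass
                     options:(NSLinguisticTaggerOmitWhitespace |
                              NSLinguisticTaggerOmitPunctuation)
                  usingBlock:^(NSString *tag, NSRange tokenRange,
                               NSRange sentenceRange, BOOL *stop) {
    NSLog(@"%@: %@", [sentence substringWithRange:tokenRange], tag);
}];
```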

Ordered Sets

NSOrderedSet and NSMutableOrderedSet are new collection classes that combine the advantages of NSArray (ordered collection, fast index-based access) and NSSet (unique contents, fast membership tests). Note that NSOrderedSet inherits neither from NSArray nor from NSSet.

The class also offers two interesting methods, -(NSArray *)array and -(NSSet *)set. These return proxy objects for the underlying ordered set that act like an array or set without actually being one. If the underlying ordered set is mutable, changes to it pass directly through to the proxy objects, so these “immutable” collections will appear to outside code to be changing.
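A small demonstration of both the uniquing behavior and the array proxy:

```objc
#import <Foundation/Foundation.h>

NSMutableOrderedSet *colors = [NSMutableOrderedSet orderedSet];
[colors addObject:@"red"];
[colors addObject:@"green"];
[colors addObject:@"red"];   // duplicate: silently ignored

NSLog(@"count = %lu", (unsigned long)[colors count]);      // 2, not 3
NSLog(@"first = %@", [colors objectAtIndex:0]);            // index-based access
NSLog(@"member = %d", [colors containsObject:@"green"]);   // fast membership test

// The array proxy reflects later changes to the mutable ordered set.
NSArray *proxy = [colors array];
[colors addObject:@"blue"];
NSLog(@"proxy count = %lu", (unsigned long)[proxy count]); // now 3
```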

Regular Expressions and Data Detectors

Two new Lion developer features that you may already know from iOS are built-in regular expression support and data detectors.

The NSRegularExpression class represents a regular expression that you can apply to a string. After creating a regex instance (+regularExpressionWithPattern:options:error:), you use the Block iterator method enumerateMatchesInString:options:range:usingBlock: to enumerate all matches in the specified string. Each match is represented by an NSTextCheckingResult instance.

To use regular expressions for searching and replacing text, use the stringByReplacingMatchesInString:options:range:withTemplate: method.
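For example, matching dates and reordering their components (the pattern and template are illustrative):

```objc
#import <Foundation/Foundation.h>

NSError *error = nil;
NSRegularExpression *regex =
    [NSRegularExpression regularExpressionWithPattern:@"\\b(\\d{4})-(\\d{2})-(\\d{2})\\b"
                                              options:0
                                                error:&error];
NSString *text = @"Lion shipped on 2011-07-20.";

// Enumerate all matches; each one is an NSTextCheckingResult.
[regex enumerateMatchesInString:text
                        options:0
                          range:NSMakeRange(0, [text length])
                     usingBlock:^(NSTextCheckingResult *match,
                                  NSMatchingFlags flags, BOOL *stop) {
    NSLog(@"Year: %@", [text substringWithRange:[match rangeAtIndex:1]]);
}];

// Search and replace using a template that references capture groups.
NSString *reordered =
    [regex stringByReplacingMatchesInString:text
                                    options:0
                                      range:NSMakeRange(0, [text length])
                               withTemplate:@"$3.$2.$1"];
NSLog(@"%@", reordered);
```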

The class reference documentation for NSRegularExpression has a very detailed overview of the supported regex syntax.

NSDataDetector is a subclass of NSRegularExpression that offers pre-configured regular expressions to identify patterns such as dates, addresses, phone numbers, or URLs.


OpenGL 3.2

Lion now supports OpenGL 3.2. Developers should select a specific OpenGL profile to tell the system which version of OpenGL an app is designed for. The current options are either kCGLOGLPVersion_3_2_Core for OpenGL 3.2 or kCGLOGLPVersion_Legacy, which provides the same functionality found in earlier versions of Mac OS X.

See the OpenGL Profiles section in Apple’s OpenGL Programming Guide for details.


QTKit

QTKit provides two new classes that make it easier to export movies in different formats. It also features new APIs for reading movie metadata without having to use the QuickTime C API, which is not available in 64-bit apps.

The new QTExportSession class (not yet documented) represents an export process that produces a transcoded output from a given QTMovie source. The properties of the exported movie are specified by an instance of QTExportOptions. The initializer for an export session is the self-explanatory initWithMovie:exportOptions:outputURL:error: method. After you have created the export session instance and set a delegate, send it a run message to start the export operation asynchronously. The session informs its delegate about success (exportSessionDidSucceed:) or failure (exportSession:didFailWithError:) of the operation, and reports progress regularly while the operation is running (exportSession:didReachProgress:).

The export settings in the QTExportOptions class are specified by instantiating the class with one of the available format constants: QTExportOptionsAppleM4VCellular, QTExportOptionsAppleM4V480pSD, QTExportOptionsAppleM4ViPod, QTExportOptionsAppleM4VAppleTV, QTExportOptionsAppleM4VWiFi, QTExportOptionsAppleM4V720pHD, QTExportOptionsQuickTimeMovie480p, QTExportOptionsQuickTimeMovie720p, QTExportOptionsQuickTimeMovie1080p, QTExportOptionsAppleM4A. It does not seem possible to configure the export options in a more granular manner. See the QTExportSession.h and QTExportOptions.h header files for details.

The new metadata reading capabilities are based on the QTMetadataItem class. The QTMovie and QTTrack classes have been extended with new methods that return these metadata items. The commonMetadata method returns an array of QTMetadataItem objects, one for each common metadata key for which a value is available in the current locale.

Metadata can be available in several formats, including QTMetadataFormatQuickTimeUserData, QTMetadataFormatQuickTimeMetadata, QTMetadataFormatiTunesMetadata and QTMetadataFormatID3Metadata. The availableMetadataFormats method returns an array of those formats available for the current movie or track while the metadataForFormat: method lets you retrieve the metadata for a specific format.


Quick Look

QLPreviewView is a new class that allows you to embed a Quick Look preview into your own view hierarchy. Its designated initializer is initWithFrame:style:, giving you the option of two styles, QLPreviewViewStyleNormal and QLPreviewViewStyleCompact. After creating the view, assign the item to preview (which must implement the QLPreviewItem protocol) to its previewItem property and add the view to your view hierarchy just as you would any other view.
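A minimal sketch (the file path and superview are placeholders; NSURL conforms to QLPreviewItem, so a file URL can be assigned directly):

```objc
#import <Quartz/Quartz.h> // QLPreviewView lives in the Quartz umbrella framework

QLPreviewView *preview =
    [[QLPreviewView alloc] initWithFrame:NSMakeRect(0, 0, 400, 300)
                                   style:QLPreviewViewStyleNormal];

// NSURL adopts the QLPreviewItem protocol, so file URLs work out of the box.
preview.previewItem = (id <QLPreviewItem>)[NSURL fileURLWithPath:documentPath];

[someSuperview addSubview:preview]; // someSuperview: any NSView in your window
```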

Quartz Core

Core Animation

New Features from iOS

Several Core Animation classes inherited new properties that were first introduced in iOS 4.

Remote Layer Clients and Servers

The new (not yet documented) classes CARemoteLayerServer and CARemoteLayerClient seem to allow one process to render a layer tree that is then displayed by another process. I guess this was introduced for the new XPC interprocess communication framework.

Core Image Face Detection

One of the coolest new features of Lion is a public face detection API. Don’t confuse face detection (identifying the position and size of faces in an image) with face recognition (being able to tell if two faces show the same person or not). iPhoto can do the latter while the new API is “only” capable of the former.

Face detection is part of Core Image and therefore works on CIImage objects. The API is very easy to use, provided that you have converted the image you want to analyze into a CIImage. The source can either be a still image or a frame of a video.

The central class for face detection is CIDetector. Although it can only detect faces at the moment, Apple has designed the API in a generalized manner so that other detection algorithms can easily be added in the future. To detect faces, instantiate a CIDetector with the +detectorOfType:context:options: class method, passing CIDetectorTypeFace as the type. The context argument may be nil, but you can improve performance if you pass in a CIContext instance that has already uploaded the image to be processed to the GPU. The options dictionary allows you to opt for higher accuracy or higher speed of the detection, depending on your needs.

Sending the detector instance a featuresInImage: message starts the detection process and returns an array of CIFaceFeature objects if any faces were found. Each face is described by its bounds as well as the leftEyePosition, rightEyePosition and mouthPosition.
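A sketch of the whole process (imageURL is a placeholder for the file URL of an image):

```objc
#import <QuartzCore/QuartzCore.h>

CIImage *image = [CIImage imageWithContentsOfURL:imageURL];

// Ask for a face detector tuned for accuracy rather than speed.
CIDetector *detector =
    [CIDetector detectorOfType:CIDetectorTypeFace
                       context:nil
                       options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                           forKey:CIDetectorAccuracy]];

NSArray *faces = [detector featuresInImage:image];
for (CIFaceFeature *face in faces) {
    NSLog(@"Face bounds: %@", NSStringFromRect(NSRectFromCGRect(face.bounds)));
    if (face.hasLeftEyePosition) {
        NSLog(@"Left eye: %@",
              NSStringFromPoint(NSPointFromCGPoint(face.leftEyePosition)));
    }
}
```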