Ole Begemann

iOS Development

Animating the drawing of a CGPath with CAShapeLayer

One of the nice little additions in iOS SDK 4.2 is a pair of new properties on CAShapeLayer: strokeStart and strokeEnd. Both are floats that hold a value between 0.0 and 1.0, indicating the relative location along the shape layer’s path at which to start and stop stroking the path.

The default values are 0.0 for strokeStart and 1.0 for strokeEnd, obviously, which causes the shape layer’s path to be stroked along its entire length. If you set, say, layer.strokeEnd = 0.5f, only the first half of the path gets stroked. So far so good.
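Before we can animate anything, the path has to live in a CAShapeLayer. A minimal sketch of the setup the animation code below assumes (the concrete bezier path here is just illustrative; in the sample project the path is of course more elaborate):

```objc
// Build an illustrative path (a single curve).
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(10.0f, 10.0f)];
[path addCurveToPoint:CGPointMake(300.0f, 200.0f)
        controlPoint1:CGPointMake(150.0f, 10.0f)
        controlPoint2:CGPointMake(150.0f, 200.0f)];

// Install it in a shape layer; we stroke only, no fill.
CAShapeLayer *pathLayer = [CAShapeLayer layer];
pathLayer.frame = self.view.bounds;
pathLayer.path = path.CGPath;
pathLayer.strokeColor = [[UIColor blackColor] CGColor];
pathLayer.fillColor = nil;
pathLayer.lineWidth = 2.0f;
pathLayer.lineJoin = kCALineJoinBevel;

[self.view.layer addSublayer:pathLayer];
self.pathLayer = pathLayer;
```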

The really nice thing about these properties is that they are animatable. By animating strokeEnd from 0.0 to 1.0 over a duration of a few seconds, we can easily display the path as it is being drawn:

CABasicAnimation *pathAnimation = [CABasicAnimation animationWithKeyPath:@"strokeEnd"];
pathAnimation.duration = 10.0;
pathAnimation.fromValue = [NSNumber numberWithFloat:0.0f];
pathAnimation.toValue = [NSNumber numberWithFloat:1.0f];
[self.pathLayer addAnimation:pathAnimation forKey:@"strokeEndAnimation"];

Finally, add a second layer containing the image of a pen and use a CAKeyframeAnimation to animate it along the same path at the same speed to make the illusion perfect:

CAKeyframeAnimation *penAnimation = [CAKeyframeAnimation animationWithKeyPath:@"position"];
penAnimation.duration = 10.0;
penAnimation.path = self.pathLayer.path;
penAnimation.calculationMode = kCAAnimationPaced;
[self.penLayer addAnimation:penAnimation forKey:@"penAnimation"];
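For the illusion to hold, the pen layer’s anchorPoint should coincide with the pen tip, because the position property (which the keyframe animation moves along the path) refers to the anchor point. A possible setup, assuming a hypothetical image named “pen” whose tip sits in the bottom-left corner:

```objc
// Sketch of the pen layer setup. The anchorPoint depends on where the
// tip sits within your pen image (bottom-left here, in unit coordinates).
UIImage *penImage = [UIImage imageNamed:@"pen"];
CALayer *penLayer = [CALayer layer];
penLayer.contents = (__bridge id)penImage.CGImage;
penLayer.bounds = CGRectMake(0.0f, 0.0f,
                             penImage.size.width, penImage.size.height);
penLayer.anchorPoint = CGPointMake(0.0f, 1.0f); // pen tip = bottom-left
[self.view.layer addSublayer:penLayer];
self.penLayer = penLayer;
```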

Download the sample video (H.264).

This also works with text; we just have to convert the glyphs to a CGPath. Core Text offers a function to do just that, CTFontCreatePathForGlyph(). To use it, we need to create an attributed string with the text we want to render and split it up first into lines and then into glyphs. After converting the glyphs to paths, we add them all to a single CGPath as subpaths. See the excellent article Low-level text rendering by Ohmu for details. The result looks cool:
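A condensed sketch of that glyph-to-path conversion, assuming a single line of text set entirely in one font (a full implementation, like the one in the article linked above, would read the font out of each run’s attributes and handle multiple lines):

```objc
// Build an attributed string in a single font.
CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 72.0f, NULL);
NSDictionary *attrs = [NSDictionary dictionaryWithObject:(__bridge id)font
                                                  forKey:(__bridge id)kCTFontAttributeName];
NSAttributedString *attrString =
    [[NSAttributedString alloc] initWithString:@"Hello" attributes:attrs];

// Lay it out as a single line and walk its glyph runs.
CTLineRef line = CTLineCreateWithAttributedString(
    (__bridge CFAttributedStringRef)attrString);
CGMutablePathRef letters = CGPathCreateMutable();

CFArrayRef runs = CTLineGetGlyphRuns(line);
for (CFIndex runIndex = 0; runIndex < CFArrayGetCount(runs); runIndex++) {
    CTRunRef run = CFArrayGetValueAtIndex(runs, runIndex);
    for (CFIndex glyphIndex = 0; glyphIndex < CTRunGetGlyphCount(run); glyphIndex++) {
        CFRange glyphRange = CFRangeMake(glyphIndex, 1);
        CGGlyph glyph;
        CGPoint position;
        CTRunGetGlyphs(run, glyphRange, &glyph);
        CTRunGetPositions(run, glyphRange, &position);

        // Convert the glyph outline to a CGPath and append it as a
        // subpath at the glyph's position within the line.
        CGPathRef letter = CTFontCreatePathForGlyph(font, glyph, NULL);
        CGAffineTransform t =
            CGAffineTransformMakeTranslation(position.x, position.y);
        CGPathAddPath(letters, &t, letter);
        CGPathRelease(letter);
    }
}
CFRelease(line);
// letters can now be assigned to a CAShapeLayer's path and animated as above.
```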

Download the sample video (H.264).

Get the sample project (for the iPad) on GitHub. I am releasing the parts I wrote under the MIT License, and the code I took from the article I mentioned above is released under the equally liberal Code Project Open License.