NSResponder Modifications: Swipe, Rotate, and Magnify

So, I'm sure you've heard of the MacBook Air and its revolutionary multi-touch trackpad, borrowed from the iPhone technology. The basic gist of it is that it provides application-specific actions that can be triggered by performing gestures on the new trackpad. With that in mind, every Cocoa developer should be asking themselves this question: what has Apple done to NSEvent [and friends] to facilitate gestures in their own applications, and how can I do it in mine?

With the help of my local Apple Store's MacBook Air, and some NSEvent knowledge, I'm going to answer exactly that.

Alright, since we're dealing with what are presumably new events, the first place we should check is the system's headers and documentation for NSEvent, hoping that maybe, just maybe, Apple has defined them for us. Since this is a blog post of decent length, you can infer what I found -- they aren't documented at all. Hooray.

This is furthermore supported by a small test application that I wrote which logs all of the events coming out of NSApplication. By replacing the NSPrincipalClass key in Info.plist with the name of my subclass of NSApplication -- GestureTest -- overriding -(void)sendEvent:(NSEvent *)anEvent, and logging every event that comes through, I can get a quick overview of every single event that gets passed to my application. This is very handy in a lot of situations, particularly this one. Upon closer inspection, we have five new event types -- one for each of the three gestures, plus two corresponding to the beginning and end of a gesture respectively.
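For reference, the logging subclass is only a few lines. This is a minimal sketch of the approach described above; the class name and log format are just my choices:

```objc
// GestureTest.m -- set NSPrincipalClass in Info.plist to "GestureTest"
// so Cocoa instantiates this subclass instead of NSApplication.
#import <Cocoa/Cocoa.h>

@interface GestureTest : NSApplication
@end

@implementation GestureTest

- (void)sendEvent:(NSEvent *)anEvent
{
    // Log every event on its way through; the new gesture events show
    // up here with otherwise-undocumented type values.
    NSLog(@"type: %d event: %@", (int)[anEvent type], anEvent);
    [super sendEvent:anEvent]; // always pass the event along
}

@end
```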

The gestures are as follows, grouped with their corresponding NSEventType.

BeginGesture: 19 (I think, I forgot to double check this, woops!)
Swipe: 31
Magnify: 30
Rotate: 18
EndGesture: 20
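Given those raw values, a quick way to experiment is to compare against them directly in a sendEvent: override. A sketch, with one big caveat: the constant names below are made up for readability -- none of them appear in the current headers, and the values themselves are undocumented, so this could break in any update:

```objc
#import <Cocoa/Cocoa.h>

// Hypothetical names for the raw NSEventType values observed above --
// these constants are NOT declared anywhere in the 10.5.1 headers.
enum {
    GTEventTypeRotate       = 18,
    GTEventTypeBeginGesture = 19,
    GTEventTypeEndGesture   = 20,
    GTEventTypeMagnify      = 30,
    GTEventTypeSwipe        = 31
};

@interface GestureFilterApp : NSApplication
@end

@implementation GestureFilterApp

- (void)sendEvent:(NSEvent *)anEvent
{
    switch ((int)[anEvent type]) {
        case GTEventTypeSwipe:
            // Modifier flags still apply, so we can tell e.g. a
            // Command-swipe apart from a plain swipe.
            if ([anEvent modifierFlags] & NSCommandKeyMask)
                NSLog(@"Command-swipe");
            else
                NSLog(@"swipe");
            break;
        case GTEventTypeMagnify:
        case GTEventTypeRotate:
            NSLog(@"pinch or rotate");
            break;
    }
    [super sendEvent:anEvent]; // always pass the event along
}

@end
```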

Since these events are standard NSEvents, they carry modifier flags (i.e. we can differentiate between a swipe with Command held down and a swipe with Command-Option held down).

Thus, we have a basic way to check for these events -- we can check the event type and handle gestures appropriately. However, this is pretty inefficient (we have to check every event for its type) and it isn't very flexible (we have to intercept at the NSApplication level). No good.

If you recall from Chapter 16 of Aaron Hillegass' excellent book Cocoa Programming for Mac OS X (preorder the third edition here), every object that is going to respond to events needs to descend from NSResponder. NSResponder implements the public interface for responding to various mouse and keyboard events -- our good friends -(void)mouseDown:(NSEvent *)anEvent, -(void)keyDown:(NSEvent *)anEvent, and the rest. However, this documentation hasn't been updated for the MacBook Air, and as expected, the header does not reveal any additional information about these new events.

Fortunately, we have a solution for this: Steve Nygard's excellent class-dump, which allows us to inspect a particular framework for its symbols. Naturally, being the curious person I am, I ran this against the MacBook Air's version of AppKit (949.18.0, if you're curious). After a short wait (AppKit is very large, after all), the results poured out into a freshly pressed text file. I immediately searched for the word "swipe", chosen completely at random. What I found was worth its weight in gold... assuming you want to implement gestures.

- (void)magnifyWithEvent:(id)fp8;
- (void)rotateWithEvent:(id)fp8;
- (void)swipeWithEvent:(id)fp8;
- (void)beginGestureWithEvent:(id)fp8;
- (void)endGestureWithEvent:(id)fp8;

These new methods belong to NSResponder, which means that every single responder now supports these gesture events. That's pretty awesome. My immediate reaction was to set symbolic breakpoints for these symbols and to perform some gestures using my little test application. As expected, they were hit, and I checked out the backtrace. The only thing of interest was a new private method on NSApplication -- -_handleGestureEvent: -- called from NSApplication's -(void)sendEvent:(NSEvent *)anEvent method.

Just to confirm this assumption, I overrode this method and logged events from it. It turns out the only events that come through this method are gestures. Awesome. Interestingly enough, there is a small observation to be made here: since every gesture passes through this method, one could actually replace these gesture events with any other event before passing it down the responder chain -- so you could get some free custom gestures by replacing a gesture with, say, a keyboard shortcut event. Nothing really fantastic, but nice to note if you feel like playing with private methods.
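As a sketch of that experiment: declare the private selector so the compiler is happy, then override it in an NSApplication subclass and forward to super. Everything here is inferred from the backtrace above, not from any documented interface, so treat it as strictly for poking around:

```objc
#import <Cocoa/Cocoa.h>

// Declare the private selector so we can call through to super;
// -_handleGestureEvent: is NOT public API -- its existence is inferred
// purely from the backtrace described above.
@interface NSApplication (GesturePrivate)
- (void)_handleGestureEvent:(NSEvent *)anEvent;
@end

@interface GestureLoggingApp : NSApplication
@end

@implementation GestureLoggingApp

- (void)_handleGestureEvent:(NSEvent *)anEvent
{
    NSLog(@"gesture event: %@", anEvent);
    // Forward to AppKit's implementation so the gesture still travels
    // down the responder chain; omit this and gestures go nowhere.
    [super _handleGestureEvent:anEvent];
}

@end
```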

So finally, we have everything we need to implement gestures for any NSResponder-based object. Simply override the appropriate method for your gesture (swipe, pinch [magnify], or rotate), then grab the values out of the NSEvent using the corresponding methods:

Swipe : -(float)deltaY; -(float)deltaX;
Magnification : -(float)deltaZ; -(float)magnification;
Rotation : -(float)rotation;
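Putting it together, a responder might look something like this minimal sketch. The gesture method names come straight from the class-dump output and the value accessors are the ones listed above; since none of this is documented yet, consider it experimental:

```objc
#import <Cocoa/Cocoa.h>

// A custom view responding to the three gestures. The ...WithEvent:
// method names are taken from the class-dump of AppKit 949.18.0, not
// from any public header.
@interface GestureView : NSView
@end

@implementation GestureView

- (void)swipeWithEvent:(NSEvent *)anEvent
{
    // deltaX / deltaY carry the horizontal / vertical direction
    NSLog(@"swipe dX=%f dY=%f", [anEvent deltaX], [anEvent deltaY]);
}

- (void)magnifyWithEvent:(NSEvent *)anEvent
{
    // deltaZ is the value logged with the event; sign tracks
    // pinching in vs. out
    NSLog(@"magnify %f", [anEvent deltaZ]);
}

- (void)rotateWithEvent:(NSEvent *)anEvent
{
    // sign depends on the direction of rotation
    NSLog(@"rotate %f", [anEvent rotation]);
}

@end
```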

These methods return values that depend on the type of event and the velocity and/or direction of movement. Some interesting notes here: swipes can be detected to the left or right as expected, but also up and down. Pretty cool. I also suspect that the two methods under magnification actually return the same value, but I haven't checked; the value logged with the event is deltaZ, however. Rotation is pretty self-explanatory. The actual values these methods return are a little bit more of a mystery. Swipe is straightforward: 1.0 for right and -1.0 for left from deltaX, and 1.0 for up and -1.0 for down from deltaY. Magnification and rotation both return various positive or negative values depending on the scale of magnification or the angle of rotation. You'll just kind of have to play with these; I couldn't discern a good pattern. However, there are two methods on NSEvent, -(float)standardRotationThreshold and -(float)standardMagnificationThreshold, which I'm sure return some magic values that let you decide when a gesture's angle or scale is significant enough to act on in your responder. Additionally, NSEvent adds -(BOOL)isGesture, which, as expected, lets you determine if an event is a gesture. NSResponder also adds gesture masks via -(unsigned long long)gestureEventMask and -(void)setGestureEventMask:(unsigned long long)aMask, which presumably control which gesture events a responder receives.

So, that should pretty much do it. You should be armed and ready to write some amazing code to take advantage of the trackpad inside the MacBook Air -- and, hopefully, one coming soon to a MacBook and MacBook Pro near you. Before anyone asks, this code will NOT work on current MacBook models, at least not on the current build of Leopard. It's possible that when this AppKit version begins shipping as standard (10.5.2, I suspect), we may be able to fake these events with the old trackpads, depending on how much information we can squeeze out of AppKit.

Finally, I've built a small project based on all of this that will let those of you lucky enough to have your MacBook Air already see it all in action. It's fairly simple -- it just fills a custom view with a color based on the gesture you perform -- but it is set up as a great test bed for playing around with third-party gestures. In addition, it has a commented-out private method from NSApplication to log all the gesture events, if you want to see them. Obviously, if you uncomment this method as-is, the events won't be processed and the color won't change, since it doesn't send them down the chain. I'll leave fixing that as an exercise to the reader. ;)

Thanks to everyone at my local Apple Store for their help, and for the MacBook Air. Hopefully this will help everyone get their applications ready for multi-touch trackpads before they are everywhere.

Be back soon.


1 TrackBacks

Listed below are links to blogs that reference this entry: NSResponder Modifications: Swipe, Rotate, and Magnify.


» Enabling your OS X software for multi-touch from You Can't Fire Me, I Don't Work In This Van

Cocoadex has posted an excellent article on how you can take advantage of the Macbook Air’s multi-touch trackpad in your Cocoa OS X application. Thanks to Charles Srstka for sending me the link. ... Read More

2 Comments

n[ate]vw said:

I had been eyeing the current NSEvent docs looking to see how multitouch might fit in, or if it would end up as a separate "Gesture Services" framework or something. I had noticed the deltaZ, interesting that they are actually using that for zoom.

So thanks for looking into this! Hopefully we'll get a mini-WWDC later this month, but in the meantime I can know a little of what to anticipate as I design my custom views.

Alex said:

Hello,

with your info, I created a little sample app. Trackbacks do not seem to work from my weblog to yours, nor could I create a Movable Type ID :(

But here is the link:

http://vonbelow.com/weblog/archives/41-Rotation-using-gestures.html

Alex
