SOMEWHERE IN QUICKTIME
DERIVED MEDIA HANDLERS
JOHN WANG
In this column, I'll be telling you about derived media handlers in QuickTime 1.5 -- but first, some background.
QuickTime movies contain tracks that refer to media. In QuickTime 1.0, two media types are supported: video and sound. A movie might therefore have one track that refers to video media and one that refers to sound media. Each of these supported media has a media handler, which is code that's responsible for interpreting the media's data. Obviously, displaying video images requires different code than playing sound. The media handler code is in the form of a component of type 'mhlr'. The video media handler has the subtype 'vide', and the sound media handler has the subtype 'soun'.
QuickTime uses the concept of a media to separate media interpretation from the Movie Toolbox and to place that responsibility in individual media handlers. This has the added advantage that media handlers can be created to interpret new media types. However, QuickTime 1.0 provided no easy way to create your own media handler.
DERIVED HANDLERS TO THE RESCUE
Derived media handler support was introduced in QuickTime 1.5 to allow developers to define new
custom media types. As an example of the capabilities of the derived media handler, QuickTime 1.5
has a new 'text' media type that's implemented using a derived media handler. Derived media handler
components can easily be created because they can use the services of a common base media handler
supplied by Apple; hence the name derived media handler. The base media handler manages most of
the duties that must be performed by all media handlers and reduces the intricacies of writing a
standalone media handler.
This column will discuss sample code (provided on this issue's CD) that implements a complete QuickDraw derived media handler. This media handler will interpret QuickDraw pictures stored in the media's data. Each media sample in the data is a QuickDraw picture. For example, you could have a movie of a bouncing ball, but instead of compressed pixel images of the ball bouncing, as in video media, you would have a series of pictures of a ball drawn with PaintOval as it moves along its path. The CD also contains a sample that creates interesting movies using our new QuickDraw media type.
CREATING THE COMPONENT SHELL
The first step in creating our sample derived media handler is to create a component shell to which
we can add media handler-specific calls in a later step. The MyComponent.c file contains the
following routines: main (the dispatch routine for the component), MyOpen, MyClose, MyCanDo,
MyVersion, and MyRegister. MyOpen, the initialization routine for our component, opens the base
media handler and sends a target request to it. A target request is a Component Manager service that allows a new component
instance to establish itself as a target component instance for another (delegate) instance. In our case,
the target is our QuickDraw derived media handler and the delegate is the base media handler. The
delegate will be called by the target whenever the target wants to delegate calls to it. The delegate
should call the target whenever the delegate would normally call itself (for example, when it uses its
own services). This effectively makes our derived media handler sit on top of the base media handler
by handling all requests that it can handle and delegating requests that it can't handle to the base
media handler.
By calling the ComponentSetTarget routine after opening an instance of the base media handler component, we inform the base media handler that our component is derived from it. For example, the following code from our derived media handler component's open routine, MyOpen, will open the base media handler and target it:
myComp = OpenDefaultComponent(MediaHandlerType, BaseMediaType);
ComponentSetTarget(myComp, self);
(**storage).delegate = myComp;
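To make the delegation concrete, here's a minimal sketch of what the component's dispatch routine might look like. This is a sketch in the spirit of MyComponent.c, not the exact code on the CD: it assumes the prototypes for MyOpen and friends, uses the standard kComponent...Select constants and (pre-UPP style) CallComponentFunction routines from the Component Manager interfaces, and refers to the PrivateGlobals storage structure defined later in this column.

// Sketch of the component dispatcher: handle the standard Component Manager
// requests ourselves and delegate everything else to the base media handler
// instance that MyOpen stored in our private globals.
pascal ComponentResult main(ComponentParameters *params, Handle storage)
{
    switch (params->what) {
        case kComponentOpenSelect:
            return CallComponentFunctionWithStorage(storage, params,
                        (ComponentFunction) MyOpen);
        case kComponentCloseSelect:
            return CallComponentFunctionWithStorage(storage, params,
                        (ComponentFunction) MyClose);
        case kComponentCanDoSelect:
            return CallComponentFunction(params, (ComponentFunction) MyCanDo);
        case kComponentVersionSelect:
            return CallComponentFunction(params, (ComponentFunction) MyVersion);
        case kComponentRegisterSelect:
            return CallComponentFunctionWithStorage(storage, params,
                        (ComponentFunction) MyRegister);
        default:
            // Anything we don't implement -- including media handler
            // selectors (0x500-0x5FF) -- goes to the base media handler.
            if (storage == nil)
                return badComponentSelector;
            return DelegateComponentCall(params,
                        (**(PrivateGlobals **) storage).delegate);
    }
}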
The above description of targeting a component is similar to another Component Manager service, called capturing. Capturing is a service whereby the target component completely and permanently overrides the delegate component by hiding it from further use. This is useful, for example, when updating a component or fixing bugs in it: the target can implement only the new features while delegating the original functionality to its delegate. Since the target wouldn't want the outdated delegate component to be visible any longer, it would capture the delegate using the Component Manager's CaptureComponent routine. You shouldn't call CaptureComponent on the base media handler, because that would hide it and prevent other derived media handlers from using it. Conceptually, you're not replacing the base media handler; you're just using its services. Therefore, targeting it is sufficient.
When the media handler is no longer used by the Movie Toolbox, the QuickDraw derived media handler's MyClose routine will be called. To close the connection to the base media handler, MyClose must call CloseComponent to close the base media handler component instance:
CloseComponent((**storage).delegate);
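As a minimal sketch (assuming MyOpen allocated the private storage with NewHandleClear and left the delegate field nil on failure), MyClose might look like this:

// Sketch of the close routine: close the base media handler connection and
// free our private storage.
pascal ComponentResult MyClose(PrivateGlobals **storage, ComponentInstance self)
{
    if (storage != nil) {
        if ((**storage).delegate != nil)
            CloseComponent((**storage).delegate);
        DisposeHandle((Handle) storage);
    }
    return noErr;
}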
To prevent our derived media handler from registering if the base media handler isn't available, the MyRegister routine returns true (to not register) if the initialization done by MyOpen fails or false (to register) if it succeeds.
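Since the Component Manager opens the component before sending it the register request, MyOpen has already run by the time MyRegister is called. A minimal sketch, assuming MyOpen leaves the delegate field nil when it fails:

// Sketch of the register routine. Returning true means "don't register";
// false means "register."
pascal ComponentResult MyRegister(PrivateGlobals **storage)
{
    return (storage != nil && (**storage).delegate != nil) ? false : true;
}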
DEFINING THE MEDIA DATA FORMAT
The next step in creating a derived media handler is to define a media type identifier, a sample
description record that's stored along with the media samples, and the format of the media data.
For our QuickDraw media, we'll use 'Qdrw' as the media type. (As with resource types, Apple reserves all-lowercase types, so we use a media type that contains one uppercase character.) Every movie that contains a track created by using NewTrackMedia with mediaType 'Qdrw' will automatically refer to our custom media handler. Our media handler will have the component type 'mhlr' and subtype 'Qdrw'.
All description records must contain size, type, resvd1, resvd2, and dataRefIndex fields as a minimum. You should always fill in the size and type fields, but you can set the other fields to 0. It's also recommended that a field for the media data version be included in the sample description record so that it's always possible to identify the version of the media data.
#define QDrawMediaType  'Qdrw'
#define QDMediaVersion  0x100

typedef struct GraphicsDescription {
    long    size;
    long    type;           // QDrawMediaType
    long    resvd1;
    short   resvd2;
    short   dataRefIndex;
    short   version;        // QDMediaVersion
} QDrawDescription, *QDrawDescriptionPtr, **QDrawDescriptionHandle;
Since every media data sample has an associated description record, it's possible to find out the version, or any other data defined in the record, for the particular sample. To prevent wasting space with duplicate description records, QuickTime associates a sample description index with each sample; thus, many media samples can refer to the same description record through its index.
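For example, a media handler can fetch the description record that a sample refers to and check the version before interpreting the data. A minimal sketch, assuming the index comes back from GetMediaSample and using GetMediaSampleDescription (which reports errors through GetMoviesError):

// Sketch: look up the description record for a given sample description
// index and verify the media data version before trusting the sample data.
Media                   myMedia;        // our QuickDraw media
long                    descIndex;      // 1-based index, e.g. from GetMediaSample
QDrawDescriptionHandle  myQDDesc;

myQDDesc = (QDrawDescriptionHandle) NewHandleClear(sizeof(QDrawDescription));
GetMediaSampleDescription(myMedia, descIndex,
        (SampleDescriptionHandle) myQDDesc);
if (GetMoviesError() == noErr && (**myQDDesc).version == QDMediaVersion) {
    // The sample data is in the format we understand.
}
DisposeHandle((Handle) myQDDesc);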
As mentioned earlier, the samples stored in our media data are simply QuickDraw pictures. So the data handle passed to AddMediaSample, which is the Movie Toolbox call to add data into a media, will simply be a PicHandle. This allows us to easily create sample data with OpenPicture and ClosePicture and dispose of it with KillPicture.
CREATING A MOVIE THAT USES THE DERIVED MEDIA HANDLER
When we start writing the implementation-specific component routines for the media handler, we'd
like to be able to test them with a QuickTime movie that uses that media handler. So, the next step is
to write an application that creates a movie using the QuickDraw media handler. But there's a sort of
Catch-22: We won't be able to run the application to create the movie until after we begin
implementing some of the QuickDraw media handler component routines, because some of the
Movie Toolbox calls that are used to create the movie will use the QuickDraw media handler. For
example, NewTrackMedia will cause QuickTime to open an instance of our component to prepare
for editing and playback of the new media.
The code below shows how to create a movie using our newly defined QuickDraw media format (the complete code is on this issue's CD). AddGraphics, defined later, is a wrapper procedure for AddMediaSample that any application can call to easily add media samples.
// Create track and media.
myTrack = NewMovieTrack(myMovie, (long) kFrameWidth << 16,
                        (long) kFrameHeight << 16, 0);
myMedia = NewTrackMedia(myTrack, QDrawMediaType, 600, nil, (OSType) nil);

// Add samples to media.
BeginMediaEdits(myMedia);
myQDDesc = (QDrawDescriptionHandle) NewHandleClear(sizeof(QDrawDescription));
(**myQDDesc).size = sizeof(QDrawDescription);
(**myQDDesc).type = QDrawMediaType;
(**myQDDesc).version = QDMediaVersion;
myPict = OpenPicture(&drawRect);
PaintOval(&drawRect);
ClosePicture();
AddGraphics(myMedia, myPict, myQDDesc, 600, 0, nil);
DrawPicture(myPict, &drawRect);
KillPicture(myPict);
EndMediaEdits(myMedia);

// Place media into movie.
InsertMediaIntoTrack(myTrack, 0, 0, GetMediaDuration(myMedia), kFix1);
The main difference between code that generates a movie using normal QuickTime video media and code that uses our QuickDraw media is in the NewTrackMedia and AddMediaSample calls. For NewTrackMedia, we pass 'Qdrw', as defined earlier, for the mediaType parameter.
The wrapper procedure AddGraphics is defined as follows:
pascal OSErr AddGraphics(Media graphicsMedia, PicHandle myPic,
    QDrawDescriptionHandle QDDesc, TimeValue duration, short mySync,
    TimeValue *sampleTime)
{
    return (AddMediaSample(graphicsMedia, (Handle) myPic, 0L,
        GetHandleSize(myPic), duration, (SampleDescriptionHandle) QDDesc,
        1L, mySync, sampleTime));
}
ADDING THE MEDIA HANDLER ROUTINES
The last step is, of course, to complete our derived media handler component.
The file named MyMediaComponentRoutines.c contains the routines that our QuickDraw media handler implements rather than delegating to the base media handler. It wouldn't make much sense to delegate MediaInitialize and MediaIdle, since these routines are crucial to our code: MediaInitialize initializes our media handler and MediaIdle is the routine that gets called for drawing. On the other hand, a call such as MediaGSetVolume wouldn't be very useful to our very quiet graphics media, so MediaGSetVolume would be delegated to the base media handler.
All media handler routines, as defined in the Derived Media Handler Components chapter, must be delegated to the base media handler if not implemented. We've chosen to implement the following routines because our media handler does spatial processing (in other words, it draws).
- MediaInitialize: Prepares access to media by saving necessary information passed to it in the GetMovieCompleteParams record.
- MediaIdle: The Movie Toolbox provides processing time to the derived media handler through this routine. The QuickDraw media handler draws during this call.
- MediaSetActive: The Movie Toolbox calls this routine if the media is enabled or disabled.
- MediaSetRate: Called if the rate changes. This is necessary to determine when the movie rate is reversed.
- MediaSetMediaTimeScale: Our media handler cares about the media time scale since we store times in the media's time coordinate system.
- MediaTrackEdited: Called if the track has been edited. If so, we'll want to redraw.
- MediaSetGWorld: Called if the destination GWorld changes. If so, we'll want to know the new GWorld for drawing.
- MediaSetDimensions: Called if the spatial dimensions change.
- MediaSetMatrix: Called if the track matrix or movie matrix changes due to movie resizing.
- MediaGetTrackOpaque: Called to determine whether the track is opaque. We want to return false so that correct compositing occurs, since our media may be semitransparent.
- MediaSampleDescriptionChanged: Called if the sample description record changes. If we ever store information beyond the media data version, we'll probably want to know if the user changes the sample description contents. If the description record changes due to different media samples referring to different description records, this routine isn't called. Instead, the media handler must check the sample description index returned by GetMediaSample.
The routines we didn't implement are:
- MediaGGetStatus: Our simplified QuickDraw media handler doesn't need any error processing.
- MediaPutMediaInfo, MediaGetMediaInfo: Since we don't store any proprietary information along with the media data, we don't need to implement these.
- MediaSetMovieTimeScale: Our media handler doesn't care about the movie time scale since we don't store any times in the movie's time coordinate system.
- MediaSetClip: Our media handler doesn't support clipping.
- MediaSetGraphicsMode, MediaGetGraphicsMode: Our media handler doesn't support graphics modes.
- MediaGetNextBoundsChange: Our media's bounds don't change dynamically. (The text media handler is an example of a media that has dynamically changing bounds.)
- MediaGetSrcRgn: Our media doesn't have an irregular display region.
- MediaGSetVolume, MediaSetSoundBalance, MediaGetSoundBalance: We don't play sound.
- MediaPreroll: We let the base media handler do our prerolling for us. The data handler is smart about caching, so we really don't need to worry about this.
Since it's difficult for us to cover every possible condition in which the media handler will get called, a clever approach is needed to aid in the development of the derived media handler. The solution lies in DebugStr: By strategically placing DebugStr calls throughout the media handler, we can see which routines are being called. Knowing which events trigger calls to our media handler will allow us to decide which calls the handler should support and which ones we can delegate. For example, it was through this process that I found that MediaSetMatrix was a call I needed to implement because resizing a movie window causes MediaSetMatrix to be called.
This approach makes it possible to create a media handler starting off with just a MediaInitialize routine and building from there. The selectors for a media handler component are in the range of 0x500 to 0x5FF. Therefore, any calls that are delegated with selectors in this range should be examined to determine whether the actions that cause the routine to be called are significant to the media handler implementation. If so, the routine should not be delegated. The QuickTime documentation, this column, the sample code, and common sense should give you a good idea of which media handler-specific routines a derived media handler must implement.
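For example, a development-only helper like the following (a sketch, not part of the shipping handler) can be called from the dispatch routine's default case just before delegating, so the debugger flags every media handler selector that passes through:

// Development aid: break into the debugger whenever a media handler selector
// (0x500-0x5FF) is about to be delegated, so you can see which routines the
// Movie Toolbox is calling and decide whether to implement them.
static void NoteDelegatedSelector(short selector)
{
    Str255  s;

    if (selector >= 0x500 && selector <= 0x5FF) {
        NumToString(selector, s);
        DebugStr(s);    // the selector number appears in the debugger
    }
}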
THE GUTS OF THE MATTER
As you can see in MyMediaComponentRoutines.c, most of the guts are in the routines
MediaInitialize and MediaIdle. MediaInitialize is called by the Movie Toolbox when a movie using
the media is opened. The MediaInitialize routine should grab information it needs that's passed to it
by QuickTime in the GetMovieCompleteParams record and store it in a private data structure.
typedef struct {
    short           version;
    Movie           theMovie;
    Track           theTrack;
    Media           theMedia;
    TimeScale       movieScale;
    TimeScale       mediaScale;
    TimeValue       movieDuration;
    TimeValue       trackDuration;
    TimeValue       mediaDuration;
    Fixed           effectiveRate;
    TimeBase        timeBase;
    short           volume;
    Fixed           width;
    Fixed           height;
    MatrixRecord    trackMovieMatrix;
    CGrafPtr        moviePort;
    GDHandle        movieGD;
    PixMapHandle    trackMatte;
} GetMovieCompleteParams;

typedef struct {
    // Component stuff
    ComponentInstance   delegate;
    ComponentInstance   self;

    // Characteristics
    Movie               myMovie;
    Track               myTrack;
    Media               myMedia;
    Fixed               mediaRate;
    Rect                graphicsBox;
    MatrixRecord        myMatrix;
    CGrafPtr            port;
    GDHandle            device;
    long                sampleDescIndex;

    // Media globals
    long                somethingChanged;
    Boolean             enabled;
    Fixed               newMediaRate;
    TimeValue           lastMediaTime;
} PrivateGlobals;
The above private data structure for the QuickDraw derived media handler shows the fields that the handler is interested in. For example, we would obviously need to know the trackMovieMatrix, but a sound media handler would not. The information in the GetMovieCompleteParams record is valid at the time of the MediaInitialize call and is updated through other derived media handler routines such as MediaSetGWorld. It's important to implement such derived media handler routines to update information used by the media handler.
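For instance, a MediaSetGWorld implementation can be as small as recording the new destination. This is a sketch; somethingChanged is the flag in our private globals that the drawing code checks before reusing a previously drawn frame:

// Sketch: keep the destination GWorld current so that MediaIdle draws to the
// right place, and note that the environment changed.
pascal ComponentResult MyMediaSetGWorld(PrivateGlobals **storage,
        CGrafPtr aPort, GDHandle aGD)
{
    (**storage).port = aPort;
    (**storage).device = aGD;
    (**storage).somethingChanged = true;
    return noErr;
}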
MediaInitialize also needs to inform the base media handler of its capabilities by calling the routine MediaSetHandlerCapabilities. Our media handler uses this routine to tell the base media handler that we perform spatial processing and that we also can work with transfer modes.
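Putting those two responsibilities together, a stripped-down MediaInitialize might look something like the sketch below. The capability flag names (handlerHasSpatial, handlerCanTransferMode) are how I remember them from the QuickTime interfaces, so check your headers; the real routine on the CD saves more state than this.

// A stripped-down sketch of MediaInitialize: save what we need from the
// GetMovieCompleteParams record, then declare our capabilities to the base
// media handler.
pascal ComponentResult MyMediaInitialize(PrivateGlobals **storage,
        GetMovieCompleteParams *gmc)
{
    (**storage).myMovie  = gmc->theMovie;
    (**storage).myTrack  = gmc->theTrack;
    (**storage).myMedia  = gmc->theMedia;
    (**storage).myMatrix = gmc->trackMovieMatrix;
    (**storage).port     = gmc->moviePort;
    (**storage).device   = gmc->movieGD;
    SetRect(&(**storage).graphicsBox, 0, 0,
            FixRound(gmc->width), FixRound(gmc->height));

    // Tell the base media handler that we draw (spatial processing) and that
    // we can handle transfer modes. (Flag names assumed from the headers.)
    MediaSetHandlerCapabilities((MediaHandler) (**storage).delegate,
            handlerHasSpatial | handlerCanTransferMode,
            handlerHasSpatial | handlerCanTransferMode);

    return noErr;
}

Calling MediaSetHandlerCapabilities on the delegate instance sends the request straight to the base media handler, which records the capabilities; calling it on self and letting the dispatcher delegate it would work just as well.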
MediaIdle does the bulk of the processing in a media handler. Our MediaIdle routine uses GetMediaSample to get the media sample and then calls DrawPicture to display the sample. It also uses a scheme of calling GetMediaNextInterestingTime to implement sync frames; this allows greater performance when playing movies backward, because the media handler won't have to begin drawing from the beginning of the media. The concept of sync frames is important in QuickTime movies because it allows temporal compression so that not all frames need to contain complete state information. Keeping a small number of frames between key frames makes it possible to preserve performance when playing movies backward, since rendering of frames must still occur in the forward direction.
The effect of not having sync frames is evident if you create a movie without any (see the example on the CD). Such a movie will look to the media handler like a movie in which every frame is a key frame. As you can see with these movies, backward playback of movies gives different results than forward playback since each sample is treated as a sync sample even though it shouldn't be. This is not recommended because it's conceptually and visually confusing to users.
In addition, to prevent the redrawing of previously drawn frames, our media handler keeps track of the last media time at which the image was updated so that it can continue from there if no other changes to the environment prevent it. This works only when the movie is playing in the forward direction. When a movie is played backward, each frame must be completely recreated starting from the last key frame.
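To make that concrete, here's a simplified sketch of the drawing core, not the CD code: it locates the key frame at or before the target media time with GetMediaNextInterestingTime, then draws each sample forward through the target time with GetMediaSample and DrawPicture. The parameter lists are as I recall them from the QuickTime interfaces, and setting the destination GWorld, checking somethingChanged, and the lastMediaTime optimization are all omitted for brevity.

// Sketch: rebuild the frame for mediaTime by drawing forward from the
// preceding sync sample.
static void DrawSamplesUpTo(PrivateGlobals **storage, TimeValue mediaTime)
{
    Media       media = (**storage).myMedia;
    Handle      data = NewHandle(0);
    TimeValue   syncTime, sampleTime, duration;
    long        size, numSamples;
    short       flags;

    // Find the key frame at or before mediaTime (a negative rate searches
    // backward; nextTimeEdgeOK lets it land exactly on mediaTime).
    GetMediaNextInterestingTime(media, nextTimeSyncSample | nextTimeEdgeOK,
            mediaTime, -fixed1, &syncTime, nil);
    if (syncTime < 0)
        syncTime = 0;

    // Draw each sample in forward order up through the target time.
    while (syncTime <= mediaTime) {
        if (GetMediaSample(media, data, 0, &size, syncTime, &sampleTime,
                &duration, nil, nil, 1, &numSamples, &flags) != noErr)
            break;
        DrawPicture((PicHandle) data, &(**storage).graphicsBox);
        syncTime = sampleTime + duration;   // step to the next sample
    }
    DisposeHandle(data);
}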
OUT OF TIME
Using the QuickDraw derived media handler as a framework, you can create your own media type.
Interactivity tracks, custom sound format tracks (such as MIDI), and even hardware control tracks are
all possible. With some creativity and work, you can expand the capabilities of QuickTime beyond
imaginable limits.
JOHN WANG (AppleLink WANG.JY) of Apple's Printing, Imaging, and Graphics group was once a math and science nerd whose writing skills were as bad as the BASIC programming language compared to C. His hard work (hah) and prep school training finally pulled him through. Yet he still can't believe he's writing to an audience greater than just himself. (The editors can't either.) He's even got a double feature in this issue (see also "Print Hints"). Will wonders never cease.*
For more information about derived media handlers, see the Derived Media Handler Components chapter of Inside Macintosh: QuickTime Components (which is included in the QuickTime Developer's Kit v. 1.5). *
For more information on the Component Manager, see "Techniques for Writing and Debugging Components" in develop Issue 12 and the Component Manager documentation in the QuickTime Developer's Kit v. 1.5. (The Component Manager will also be described in Inside Macintosh: More Macintosh Toolbox.)*
For more information on the picture format, see the "Color Picture Format" section of the Color QuickDraw chapter of Inside Macintosh Volume V.*

Thanks to Ken Doyle, Bill Guschwan, Peter Hoddie, and Guillermo Ortiz for reviewing this column.*