Integrating Ogg into Mac OS X
Warning: this project is in the study/design stage.
Introduction
The aim of this project is to integrate the Ogg Format and Vorbis Codec into Mac OS X with the end result of making the Format and Codec available for use by any of the System's sound applications.
The work consists of two main parts.
- The Audio File Format should be recognised.
- The Sound should be encoded/decoded.
To accomplish these tasks, the new Core Audio API will be used.
The first part would consist of creating an AudioFileComponent. That component has to implement the AudioFileComponentBase, AudioFileFormat and AudioFileObject classes. The example in CoreAudio/AudioFile is a good starting point: it implements a raw-audio file reader/writer. The Ogg part of Ogg Vorbis should go here, and streaming support should go here too.
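As a very small sketch of the recognition side (the helper name is invented for illustration; only the "OggS" capture pattern comes from the Ogg framing spec), the following is the kind of check an Ogg AudioFileFormat subclass would run when Core Audio asks whether a file belongs to this format:

    // Hypothetical helper for an Ogg AudioFileFormat subclass: decide
    // whether a buffer read from the start of a file looks like an Ogg
    // bitstream. Every Ogg page begins with the capture pattern "OggS"
    // followed by a zero stream-structure-version byte.
    #include <cstddef>
    #include <cstring>

    static bool LooksLikeOggFile(const unsigned char *data, size_t size)
    {
        return size >= 5
            && std::memcmp(data, "OggS", 4) == 0
            && data[4] == 0x00;
    }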
The second part is the codec support. To support Vorbis, an AudioCodec has to be created. The example in CoreAudio/AudioCodecs is also a good starting point: it implements an IMA4 codec. No file format is implied here, so there needs to be some glue (which I haven't discovered yet) from the AudioFileComponent API that recognises the contained data as Vorbis and launches the correct AudioCodec. In fact, the AudioFile class contains a ReadHeader() method that makes it possible to discover what kind of data the file contains, and then to create an audio input stream format description that determines the type of the data and the codec to use.
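To make the glue step a little more concrete, here is a minimal sketch (not taken from any existing component) of how a Vorbis identification header parsed with libvorbis could be turned into an AudioStreamBasicDescription for the stream. The kAudioFormatXiphVorbis four-character code is invented for this example; Apple does not define a format ID for Vorbis.

    // Sketch only: map a parsed Vorbis identification header into an
    // AudioStreamBasicDescription. 'kAudioFormatXiphVorbis' is a made-up
    // format ID used for illustration.
    #include <CoreAudio/CoreAudioTypes.h>
    #include <vorbis/codec.h>

    enum { kAudioFormatXiphVorbis = 'XiVs' };   // hypothetical four-char code

    // 'vi' must already have been filled by vorbis_synthesis_headerin()
    // with the first (identification) header packet of the stream.
    static AudioStreamBasicDescription DescriptionFromVorbisInfo(const vorbis_info &vi)
    {
        AudioStreamBasicDescription desc = {0};
        desc.mSampleRate       = vi.rate;            // e.g. 44100
        desc.mFormatID         = kAudioFormatXiphVorbis;
        desc.mChannelsPerFrame = vi.channels;        // e.g. 2 for stereo
        // Vorbis is variable-rate: packet and frame sizes are not fixed,
        // so the per-packet/per-frame fields stay zero.
        desc.mBytesPerPacket   = 0;
        desc.mFramesPerPacket  = 0;
        desc.mBytesPerFrame    = 0;
        desc.mBitsPerChannel   = 0;
        return desc;
    }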
Finally, for the rest of the Ogg codecs, only the second part is needed.
NOTE: This is not sufficient for full OS X support. Many if not most OS X applications (iTunes, for example) use QuickTime APIs for playback and encoding of media files, and QuickTime does not use the Core Audio components (yet). If we want to support those applications, it will be necessary to also create two QuickTime components: a Movie Import component and a Movie Export component (codes 'eat ' and 'spit'). It has been suggested that at some time in the future QuickTime will use Core Audio components if they exist, but it is not known when that will happen. In any case, the QuickTime components will still be necessary for Theora support, and may even be necessary for multi-link Vorbis files as well.
Example code for building QuickTime components can be found at Apple's Developer Connection website.
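As a rough, untested illustration, the snippet below shows how an application could ask the Component Manager whether a Movie Import ('eat ') component for Ogg is installed. The 'OggS' subtype is only a placeholder chosen for this example; the actual subtype is whatever the import component registers.

    // Illustrative sketch only: check whether a QuickTime movie import
    // ('eat ') component for Ogg is registered. The 'OggS' subtype is a
    // placeholder, not an Apple-defined or project-defined constant.
    #include <QuickTime/QuickTime.h>

    static bool HaveOggMovieImporter()
    {
        ComponentDescription desc = {0};
        desc.componentType    = MovieImportType;   // 'eat '
        desc.componentSubType = 'OggS';            // placeholder subtype

        // FindNextComponent() returns the first registered component
        // matching the description, or NULL if none is installed.
        return FindNextComponent(NULL, &desc) != NULL;
    }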
Work in progress...
Related Work
These projects were broken by the upgrade to QuickTime 7.
Resources
- libfishsound: A callback-based wrapper for vorbis and speex libraries. This may be a much better match to the callback-based AudioCodec API than libvorbis.
- Quicktime and GDB: Info on debugging QuickTime components with GDB.
- A useful bug report filed against qtcomponents.
News
Oct 3, 2005: The QTComponent has been further updated. Get it from http://zskl.zsk.p.lodz.pl/~skali/oggvorbis.html
Sep 19, 2005: It looks like somebody has fixed the qtcomponents code enough to get Ogg Vorbis playback working again. However, there are still several issues, and it would still be a good idea to work on a Core Audio-based codec.
Old News
Good news! It turns out that the eat/spit components from the SourceForge QuickTime Components project were not broken by the upgrade to QuickTime 7. It's just the Sound Manager-based sound decompressor component (code 'sdec') that needs to be updated to an AudioCodec. This greatly reduces the required work, since the AudioCodec API is quite simple compared to the QuickTime component API.
See this message in the quicktime-api mailing list archives for details.
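To give an idea of how small the client-facing AudioCodec API is compared with writing a QuickTime component, here is a rough sketch of the basic call pattern a decoder has to serve. It is not taken from any existing component; the function name and parameters are invented for illustration, error handling is minimal, and the codec instance is assumed to have been opened elsewhere (e.g. with FindNextComponent/OpenAComponent).

    // Sketch of the three core calls a client makes on an AudioCodec:
    // initialize with the stream formats, append compressed input, and
    // produce decoded output packets. Buffers and formats are assumed
    // to be set up by the caller.
    #include <AudioUnit/AudioCodec.h>

    OSStatus DecodeSomePackets(AudioCodec codec,
                               const AudioStreamBasicDescription &inputFormat,
                               const AudioStreamBasicDescription &outputFormat,
                               const void *inData, UInt32 inBytes, UInt32 inPackets,
                               const AudioStreamPacketDescription *inPacketDescs,
                               void *outData, UInt32 &ioOutBytes, UInt32 &ioOutPackets)
    {
        // Tell the codec what it will receive and what it must produce
        // (no magic cookie in this simplified example).
        OSStatus err = AudioCodecInitialize(codec, &inputFormat, &outputFormat, NULL, 0);
        if (err != noErr) return err;

        // Push compressed packets in; the codec updates the byte and
        // packet counts to say how much it actually consumed (ignored here).
        err = AudioCodecAppendInputData(codec, inData, &inBytes, &inPackets, inPacketDescs);
        if (err != noErr) return err;

        // ...and pull decoded audio out. For PCM output no packet
        // descriptions are needed, so NULL is passed.
        UInt32 status = 0;
        return AudioCodecProduceOutputPackets(codec, outData, &ioOutBytes,
                                              &ioOutPackets, NULL, &status);
    }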