OggComponent/VorbisComponent

Revision as of 01:29, 15 September 2005

Integrating Ogg into Mac OS X

Warning: this project is in the study/design stage.

Introduction

This project consists of integrating the Ogg container format and the Vorbis codec into Mac OS X, so that every sound application on the system can use them.

The work consists of two main parts.

  • The audio file format should be recognised.
  • The sound should be encoded/decoded.

To accomplish these tasks, the new Core Audio API will be used.

The first part consists of creating an AudioFileComponent. That component has to implement the AudioFileComponentBase, AudioFileFormat and AudioFileObject classes. The example in CoreAudio/AudioFile is a good starting point: it implements a raw-audio file reader/writer. The Ogg (container) half of Ogg Vorbis belongs here, and streaming support goes here too.
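As a rough illustration only (this is not the project's code), the container-level work such a component would wrap looks like the libogg loop below: raw file bytes go into an ogg_sync_state, pages come out, and packets are pulled from the logical stream. Error handling and chained/grouped streams are omitted, and the function name is made up.

  /* Sketch only: feed raw Ogg file data to libogg and pull out packets.
     Handles a single logical stream. */
  #include <stdio.h>
  #include <ogg/ogg.h>

  static void parse_ogg(FILE *f)
  {
      ogg_sync_state   oy;   /* raw bytes -> pages   */
      ogg_stream_state os;   /* pages     -> packets */
      ogg_page         og;
      ogg_packet       op;
      int have_stream = 0;

      ogg_sync_init(&oy);

      for (;;) {
          char *buf = ogg_sync_buffer(&oy, 4096);
          long  n   = (long)fread(buf, 1, 4096, f);
          if (n <= 0) break;
          ogg_sync_wrote(&oy, n);

          while (ogg_sync_pageout(&oy, &og) == 1) {
              if (!have_stream) {
                  ogg_stream_init(&os, ogg_page_serialno(&og));
                  have_stream = 1;
              }
              ogg_stream_pagein(&os, &og);
              while (ogg_stream_packetout(&os, &op) == 1) {
                  /* op.packet / op.bytes is one compressed packet --
                     this is what gets handed to the codec layer. */
              }
          }
      }

      if (have_stream) ogg_stream_clear(&os);
      ogg_sync_clear(&oy);
  }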

The second part is codec support. To support Vorbis, an AudioCodec component is needed. The example in CoreAudio/AudioCodecs is a good starting point too: it implements an IMA4 codec. No file format is implied here, so there must be some glue (which I haven't discovered yet) from the AudioFileComponent API that recognises the contained data as Vorbis and launches the correct AudioCodec.
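Purely as a sketch of what the codec half wraps (not the component itself): the core libvorbis decode step for one compressed packet. It assumes the three Vorbis header packets have already been parsed with vorbis_synthesis_headerin() and the decoder set up with vorbis_synthesis_init() and vorbis_block_init(); the function name is invented for illustration.

  #include <vorbis/codec.h>

  /* Decode one compressed Vorbis packet and hand back any PCM it yields. */
  static void decode_packet(vorbis_dsp_state *vd, vorbis_block *vb,
                            ogg_packet *op)
  {
      float **pcm;     /* one float array per channel, samples in [-1, 1] */
      int     frames;

      if (vorbis_synthesis(vb, op) == 0)
          vorbis_synthesis_blockin(vd, vb);

      while ((frames = vorbis_synthesis_pcmout(vd, &pcm)) > 0) {
          /* Copy `frames` samples per channel to the output here
             (for an AudioCodec, into its output buffer). */
          vorbis_synthesis_read(vd, frames);  /* mark them consumed */
      }
  }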

Finally, for the other Ogg codecs, only the second part is needed.

NOTE: This is not sufficient for full OS X support. Many if not most OS X applications (iTunes, for example) use Quicktime APIs for playback and encoding of media files. Quicktime does not use the Core Audio components (yet). If we want to support those applications it will be necessary to also create two Quicktime Components. We would need a Movie Import component and a Movie Export component (codes 'eat ' and 'spit'). It has been suggested that some time in the future Quicktime will use Core Audio components if they exist, but it is not known when that will happen. In any case, the Quicktime components will still be necessary for Theora support and may even be necessary for multi-link Vorbis files as well.
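For orientation, the Component Manager side of such a component is described by a ComponentDescription keyed on those type codes. The sketch below only probes for a hypothetical Ogg importer; the 'OggS' subtype is an invented value for illustration, not what the real components register, and the file-type/MIME mappings live in additional resources that are not shown.

  #include <QuickTime/QuickTime.h>   /* link with -framework QuickTime */

  int main(void)
  {
      ComponentDescription cd;
      cd.componentType         = MovieImportType;  /* 'eat ': movie importers  */
      cd.componentSubType      = 'OggS';           /* hypothetical Ogg subtype */
      cd.componentManufacturer = 0;                /* 0 = any manufacturer     */
      cd.componentFlags        = 0;
      cd.componentFlagsMask    = 0;

      /* FindNextComponent returns NULL if no matching importer is installed. */
      Component c = FindNextComponent(NULL, &cd);
      return (c != NULL) ? 0 : 1;
  }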

Example code for building Quicktime components can be found in the projects listed under Related Work below.

Work in progress...

Related Work

These projects were broken by the upgrade to Quicktime 7 (but see the Update below):

  • The SourceForge Quicktime Components project: http://qtcomponents.sf.net
  • Quicktime components for FLAC: http://damien.drix.free.fr/qtflac/

Resources

  • libfishsound (http://www.annodex.net/software/libfishsound/html/): a callback-based wrapper for the Vorbis and Speex libraries. This may be a much better match to the callback-based AudioCodec API than libvorbis; see the sketch after this list.
  • Quicktime and GDB: Info on debugging quicktime components with GDB.
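
To show why the callback style fits the AudioCodec model, here is a minimal decode sketch using libfishsound as documented at the link above: compressed bytes are pushed in, and decoded PCM comes back through a callback. The signatures follow the libfishsound documentation of that era and should be checked against the installed fishsound.h; the wrapper function is invented for illustration.

  #include <fishsound/fishsound.h>

  /* Called by libfishsound with decoded PCM: pcm[channel][frame]. */
  static int decoded(FishSound *fsound, float **pcm, long frames, void *user_data)
  {
      /* Copy `frames` samples per channel to the output (e.g. an
         AudioCodec output buffer) here. */
      return 0;
  }

  static void decode_block(unsigned char *buf, long bytes)
  {
      /* In decode mode the stream parameters are discovered from the
         headers, so no FishSoundInfo is passed in. */
      FishSound *fs = fish_sound_new(FISH_SOUND_DECODE, NULL);

      fish_sound_set_decoded_callback(fs, decoded, NULL);
      fish_sound_decode(fs, buf, bytes);  /* may invoke decoded() several times */
      fish_sound_delete(fs);
  }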

Update

Good news! It turns out that the eat/spit components from the SourceForge Quicktime Components project were not broken by the upgrade to Quicktime 7. It's just the SoundConverter-based codec component (code 'cdec') that needs to be updated to an AudioCodec. This greatly reduces the required work, since the AudioCodec API is quite simple compared to the Quicktime component API.

See this message in the quicktime-api mailing list archives for details.
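
For a sense of how small that API is, the sketch below shows the push/pull pair of calls a caller makes against an already-opened, already-initialized AudioCodec instance. Buffer setup, error handling and component opening are omitted; the header location (AudioToolbox/AudioCodec.h here, AudioUnit/AudioCodec.h on older SDKs) and the exact parameter details should be checked against the headers.

  #include <AudioToolbox/AudioCodec.h>

  /* Push some compressed packets in, then pull decoded audio back out. */
  static OSStatus decode_some(AudioCodec codec,
                              const void *inBuf, UInt32 inBufBytes,
                              AudioStreamPacketDescription *inPacketDescs,
                              UInt32 inPacketCount,
                              void *outBuf, UInt32 outBufBytes)
  {
      UInt32 ioBytes   = inBufBytes;    /* in: bytes offered, out: bytes taken     */
      UInt32 ioPackets = inPacketCount; /* in: packets offered, out: packets taken */
      OSStatus err = AudioCodecAppendInputData(codec, inBuf, &ioBytes,
                                               &ioPackets, inPacketDescs);
      if (err != noErr) return err;

      UInt32 outBytes   = outBufBytes;  /* in: capacity, out: bytes produced       */
      UInt32 outPackets = 4096;         /* max output packets (frames) wanted      */
      UInt32 status     = 0;            /* e.g. "needs more input" on return       */
      return AudioCodecProduceOutputPackets(codec, outBuf, &outBytes,
                                            &outPackets, NULL, &status);
  }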