== Integrating Ogg into Mac OS X ==


Warning: this project is in the study/design stage.


=== Introduction ===
The aim of this project is to integrate the [[Ogg]] Format and [[Vorbis]] Codec into [[Wikipedia:Mac OS X|Mac OS X]] with the end result of making the Format and Codec available for use by any of the System's sound applications.


The work consists of two main parts.

* The Audio File Format should be recognised.
* The Sound should be encoded/decoded.

To do such tasks, the new Core Audio API will be used.

The first part would consist of the creation of an [http://developer.apple.com/documentation/MusicAudio/Reference/CAAudioTooboxRef/AudioFileComponent/CompositePage.html AudioFileComponent]. That component has to implement the AudioFileComponentBase, AudioFileFormat and AudioFileObject classes. The example in CoreAudio/AudioFile is a good starting point: it implements a raw audio file reader/writer. The Ogg part of Ogg Vorbis should go here, and the streaming support should go here too.
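
Roughly speaking, the recognition step amounts to handing the first chunk of the file to libogg and checking whether it frames a valid page. The following is only a sketch of that idea, not component code: it reads through plain stdio rather than the component's data source, and the function name is made up.

<pre>
// Sketch: the file-format probe an Ogg AudioFileComponent would need to do.
// Feed the first few kilobytes of a file to libogg and see whether a
// complete Ogg page (the "OggS" capture pattern) comes out.
#include <cstdio>
#include <ogg/ogg.h>

bool LooksLikeAnOggFile(const char* path)          // hypothetical helper name
{
    FILE* f = std::fopen(path, "rb");
    if (!f)
        return false;

    ogg_sync_state oy;
    ogg_sync_init(&oy);

    // Hand libogg the first 8 KB of the file.
    char* buffer = ogg_sync_buffer(&oy, 8192);
    long  bytes  = static_cast<long>(std::fread(buffer, 1, 8192, f));
    ogg_sync_wrote(&oy, bytes);
    std::fclose(f);

    // If libogg can frame a complete page, the data is an Ogg stream
    // and should be handed on to the Ogg reader.
    ogg_page og;
    bool isOgg = (ogg_sync_pageout(&oy, &og) == 1);

    ogg_sync_clear(&oy);
    return isOgg;
}
</pre>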


The second part is the codec support. To support Vorbis, the creation of an [http://developer.apple.com/documentation/MusicAudio/Reference/CoreAudio/audiocodec/chapter_3_section_1.html#//apple_ref/doc/uid/TP30001108-CH204 AudioCodec] is needed. The example in CoreAudio/AudioCodecs is a good starting point too: it implements an IMA4 codec. No file format is implied here, so there needs to be some glue (which I haven't discovered yet) from the AudioFileComponent API that recognises that the contained data is Vorbis and launches the correct AudioCodec. In fact, the AudioFile class contains a ReadHeader() method that makes it possible to discover what kind of data the file contains, and then to create an Audio Input Stream Format Description that determines the type of data and the codec to use.
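
As a rough illustration of that glue, the parsed Vorbis identification header could be turned into a stream format description along the following lines. This is only a sketch: the four-char format ID is made up and would have to match whatever ID the Vorbis AudioCodec actually registers itself under.

<pre>
// Sketch: translate the Vorbis identification header into an
// AudioStreamBasicDescription so the matching AudioCodec can be selected.
#include <CoreAudio/CoreAudioTypes.h>
#include <vorbis/codec.h>

const UInt32 kHypotheticalVorbisFormatID = 'XVrb';   // made-up ID for this example

// inIdHeader is the first (identification) packet of the Vorbis stream,
// already extracted from the Ogg layer.
bool DescribeVorbisStream(ogg_packet* inIdHeader, AudioStreamBasicDescription* outDesc)
{
    vorbis_info    vi;
    vorbis_comment vc;
    vorbis_info_init(&vi);
    vorbis_comment_init(&vc);

    // vorbis_synthesis_headerin() returns 0 if the packet really is a Vorbis header.
    if (vorbis_synthesis_headerin(&vi, &vc, inIdHeader) != 0) {
        vorbis_comment_clear(&vc);
        vorbis_info_clear(&vi);
        return false;
    }

    // Encoded Vorbis has variable-size packets, so the per-packet and
    // per-frame byte counts are left at zero.
    outDesc->mSampleRate       = vi.rate;
    outDesc->mFormatID         = kHypotheticalVorbisFormatID;
    outDesc->mFormatFlags      = 0;
    outDesc->mBytesPerPacket   = 0;
    outDesc->mFramesPerPacket  = 0;
    outDesc->mBytesPerFrame    = 0;
    outDesc->mChannelsPerFrame = vi.channels;
    outDesc->mBitsPerChannel   = 0;
    outDesc->mReserved         = 0;

    vorbis_comment_clear(&vc);
    vorbis_info_clear(&vi);
    return true;
}
</pre>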


Finally, for the rest of Ogg Codecs, just the second part is needed.


NOTE: This is not sufficient for full OS X support. Many if not most OS X applications (iTunes, for example) use [[Wikipedia:QuickTime|QuickTime]] APIs for playback and encoding of media files. <strike>QuickTime does not use the Core Audio components (yet).</strike> (See '''Update''' below.) If we want to support those applications, it will be necessary to also create two QuickTime components: a Movie Import component and a Movie Export component (codes 'eat ' and 'spit'). It has been suggested that some time in the future QuickTime will use Core Audio components if they exist, but it is not known when that will happen. In any case, the QuickTime components will still be necessary for Theora support and may even be necessary for multi-link Vorbis files as well.
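
For orientation, checking from code whether such importer/exporter components are registered takes only a few Component Manager calls. This is a sketch only: the Ogg subtype code below is a placeholder, since the actual four-char codes are up to whoever writes the components.

<pre>
// Sketch: ask the Component Manager whether Ogg movie import/export
// components ('eat ' / 'spit') are registered on this system.
#include <QuickTime/QuickTime.h>
#include <cstdio>

const OSType kHypotheticalOggSubType = 'OggS';   // placeholder subtype for this example

static bool HaveComponent(OSType type, OSType subType)
{
    ComponentDescription cd = {0};
    cd.componentType    = type;
    cd.componentSubType = subType;
    // Manufacturer and flags left at zero: match any registered component.
    return FindNextComponent(0, &cd) != 0;
}

int main()
{
    std::printf("Ogg movie importer registered: %s\n",
                HaveComponent(MovieImportType, kHypotheticalOggSubType) ? "yes" : "no");
    std::printf("Ogg movie exporter registered: %s\n",
                HaveComponent(MovieExportType, kHypotheticalOggSubType) ? "yes" : "no");
    return 0;
}
</pre>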


'''Update:''' Developers can now instruct QuickTime to use Core Audio for audio processing. See [http://developer.apple.com/qa/qa2005/qa1448.html QA1448].

Example code for building QuickTime components can be found [http://developer.apple.com/samplecode/ElectricImageComponent/ElectricImageComponent.html at Apple's Developer Connection website].


Work in progress...

=== Related Work ===
These projects were broken by the upgrade to QuickTime 7.

=== Resources ===
* [http://www.annodex.net/software/libfishsound/ libfishsound:] A callback-based wrapper for the Vorbis and Speex libraries. This may be a much better match to the callback-based AudioCodec API than libvorbis.
* [[Quicktime and GDB]]: Info on debugging QuickTime components with GDB.
* [http://sourceforge.net/tracker/index.php?func=detail&aid=1144430&group_id=41359&atid=430388 A useful bug report] filed against qtcomponents.


=== News ===
'''Dec 17, 2005''': [http://xiph.org/quicktime/ XiphQT 0.1.3 released]. For further information about the development, visit http://xiph.org/quicktime/

'''Oct 3, 2005''': The QTComponent has been further updated. Get it from http://zskl.zsk.p.lodz.pl/~skali/oggvorbis.html

'''Sep 19, 2005''': It looks like somebody [http://sourceforge.net/forum/forum.php?thread_id=1353547&forum_id=135636 has fixed] the qtcomponents code enough to get Ogg Vorbis playback working again. However, there are still several issues, and it would still be a good idea to work on a Core Audio-based codec.

==== Old News ====
Good news! It turns out that the eat/spit components from the [http://qtcomponents.sf.net SourceForge QuickTime Components project] were '''not''' broken by the upgrade to QuickTime 7. It's just the Sound Manager-based sound decompressor component (code 'sdec') that needs to be updated to an AudioCodec. This '''greatly''' reduces the required work, since the AudioCodec API is quite simple compared to the QuickTime component API.


See [http://lists.apple.com/archives/QuickTime-API/2005/Sep/msg00126.html this message] in the quicktime-api mailing list archives for details.
[[Category:Xiph-related Software]]
