Understanding JMF

The Java Media Framework (JMF) provides a unified architecture and messaging protocol for managing the acquisition, processing, and delivery of time-based media data. JMF is designed to support most standard media content types, such as AIFF, AU, AVI, GSM, MIDI, MPEG, QuickTime, RMF, and WAV. JMF provides a common cross-platform Java API for accessing underlying media frameworks.
High-Level Architecture

Time Model

A Clock uses a TimeBase to keep track of the passage of time while a media stream is being presented. A Clock object's media time represents the current position within a media stream--the beginning of the stream is media time zero, and the end of the stream is the maximum media time for the stream. The duration of the media stream is the elapsed time from start to finish--the length of time that it takes to present the media stream.
To keep track of the current media time, a Clock uses:
- The time-base start-time--the time that its TimeBase reports when the presentation begins.
- The media start-time--the position in the media stream where presentation begins.
- The playback rate--how fast the Clock is running in relation to its TimeBase. The rate is a scale factor that is applied to the TimeBase. For example, a rate of 1.0 represents the normal playback rate for the media stream, while a rate of 2.0 indicates that the presentation will run at twice the normal rate. A negative rate indicates that the Clock is running in the opposite direction from its TimeBase--for example, a negative rate might be used to play a media stream backward.
During presentation, the current media time is calculated using the following formula:

MediaTime = MediaStartTime + Rate * (TimeBaseTime - TimeBaseStartTime)
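The formula can be checked with plain arithmetic. The sketch below uses no JMF classes; the MediaTimeDemo class and its sample times are purely illustrative:

```java
// Sketch of the Clock formula:
// MediaTime = MediaStartTime + Rate * (TimeBaseTime - TimeBaseStartTime)
public class MediaTimeDemo {

    // All times in seconds.
    static double mediaTime(double mediaStartTime, double rate,
                            double timeBaseTime, double timeBaseStartTime) {
        return mediaStartTime + rate * (timeBaseTime - timeBaseStartTime);
    }

    public static void main(String[] args) {
        // Presentation began 5 s into the stream; the TimeBase read 10 s
        // at start time and reads 13 s now (3 s of real time elapsed).
        System.out.println(mediaTime(5.0, 1.0, 13.0, 10.0));  // 8.0 at normal rate
        System.out.println(mediaTime(5.0, 2.0, 13.0, 10.0));  // 11.0 at double rate
        System.out.println(mediaTime(5.0, -1.0, 13.0, 10.0)); // 2.0 playing backward
    }
}
```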
Managers

JMF uses intermediary objects called managers. JMF uses four managers:
- Manager--handles the construction of Players, Processors, DataSources, and DataSinks.
- PackageManager--maintains a registry of packages that contain JMF classes, such as custom Players, Processors, DataSources, and DataSinks.
- CaptureDeviceManager--maintains a registry of available capture devices.
- PlugInManager--maintains a registry of available JMF plug-in processing components, such as Multiplexers, Demultiplexers, Codecs, Effects, and Renderers.
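A minimal sketch of the central Manager factory methods (the file URL is hypothetical, and the javax.media classes are assumed to be on the classpath):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;
import javax.media.Processor;

public class ManagerDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical media file; any URL or MediaLocator JMF understands works.
        MediaLocator locator = new MediaLocator("file:/media/sample.mov");

        // Manager constructs the other major objects for you:
        Player player = Manager.createPlayer(locator);           // for presentation
        Processor processor = Manager.createProcessor(locator);  // for processing
        // Manager.createDataSource(...) and Manager.createDataSink(...)
        // construct DataSources and DataSinks the same way.
    }
}
```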
Event Model

Whenever a JMF object needs to report on the current conditions, it posts a MediaEvent. To receive notification when a MediaEvent is posted, you implement the appropriate listener interface and register your listener class with the object that posts the event by calling its addListener method.
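For a Player, for example, the listener interface is ControllerListener and the registration method is addControllerListener. A minimal sketch:

```java
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.EndOfMediaEvent;
import javax.media.Player;
import javax.media.RealizeCompleteEvent;

public class EventDemo implements ControllerListener {

    // Called by JMF whenever the Controller posts an event.
    public void controllerUpdate(ControllerEvent event) {
        if (event instanceof RealizeCompleteEvent) {
            System.out.println("Player realized");
        } else if (event instanceof EndOfMediaEvent) {
            System.out.println("End of media reached");
        }
    }

    // Register this listener with an existing Player.
    public void listenTo(Player player) {
        player.addControllerListener(this);
    }
}
```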
Data Model

JMF media players usually use DataSources to manage the transfer of media content. A DataSource encapsulates both the location of media and the protocol and software used to deliver the media. A DataSource is identified by either a JMF MediaLocator or a URL.
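For example, a DataSource can be constructed from either identifier (a sketch; both addresses are hypothetical):

```java
import java.net.URL;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;

public class DataSourceDemo {
    public static void main(String[] args) throws Exception {
        // A MediaLocator looks like a URL but does not require an
        // installed protocol handler (useful for e.g. rtp: addresses).
        DataSource ds1 = Manager.createDataSource(
                new MediaLocator("rtp://224.0.0.1:22224/audio"));

        // A java.net.URL works for protocols the JDK itself knows.
        DataSource ds2 = Manager.createDataSource(
                new URL("http://example.com/clip.wav"));
    }
}
```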
Push and Pull Data Sources

JMF data sources can be categorized according to how data transfer is initiated:
- Pull Data-Source--the client initiates the data transfer and controls the flow of data from pull data-sources.
- Push Data-Source--the server initiates the data transfer and controls the flow of data from a push data-source.
Data Formats

The exact media format of an object is represented by a Format object. JMF extends Format to define audio- and video-specific formats.
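For instance, AudioFormat and VideoFormat in javax.media.format describe concrete formats; a sketch with illustrative parameter values:

```java
import javax.media.Format;
import javax.media.format.AudioFormat;
import javax.media.format.VideoFormat;

public class FormatDemo {
    public static void main(String[] args) {
        // 44.1 kHz, 16-bit, stereo linear PCM.
        Format pcm = new AudioFormat(AudioFormat.LINEAR, 44100, 16, 2);

        // A video format identified only by its encoding.
        Format jpeg = new VideoFormat(VideoFormat.JPEG);

        System.out.println(pcm);
        System.out.println(jpeg);
    }
}
```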
Controls

A Control often provides access to a corresponding user interface component that enables user control over an object's attributes.
Standard Controls

CachingControl enables download progress to be monitored and displayed. If a Player or Processor can report its download progress, it implements this interface so that a progress bar can be displayed to the user.

GainControl enables audio volume adjustments such as setting the level and muting the output of a Player or Processor. It also supports a listener mechanism for volume changes.
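A sketch of typical GainControl use, assuming an existing Player with audio:

```java
import javax.media.GainChangeEvent;
import javax.media.GainChangeListener;
import javax.media.GainControl;
import javax.media.Player;

public class GainDemo {
    static void adjustVolume(Player player) {
        GainControl gain = player.getGainControl(); // null if no audio track
        if (gain == null) return;

        gain.setLevel(0.5f); // level in the range 0.0 - 1.0
        gain.setMute(false);

        // The listener mechanism for volume changes:
        gain.addGainChangeListener(new GainChangeListener() {
            public void gainChange(GainChangeEvent event) {
                System.out.println("New level: " + event.getLevel());
            }
        });
    }
}
```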
User Interface Components

To get the default user interface component for a particular Control, you call getControlComponent. This method returns an AWT Component that you can add to your applet's presentation space or application window. A Player provides access to both a visual component and a control panel component--to retrieve these components, you call the Player methods getVisualComponent and getControlPanelComponent.
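A sketch that places both components in an AWT frame (assumes a realized Player; either component may be null, e.g. the visual component for audio-only media):

```java
import java.awt.BorderLayout;
import java.awt.Component;
import java.awt.Frame;
import javax.media.Player;

public class PlayerFrame {
    static void show(Player player) {
        Frame frame = new Frame("JMF Player");
        frame.setLayout(new BorderLayout());

        Component visual = player.getVisualComponent();         // video area
        Component controls = player.getControlPanelComponent(); // transport bar
        if (visual != null)   frame.add(visual, BorderLayout.CENTER);
        if (controls != null) frame.add(controls, BorderLayout.SOUTH);

        frame.pack();
        frame.setVisible(true);
    }
}
```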
Presentation

Players

A Player processes an input stream of media data and renders it at a precise time.
Player States

In normal operation, a Player steps through each state until it reaches the Started state:
- A Player in the Unrealized state has been instantiated, but does not yet know anything about its media. When a media Player is first created, it is Unrealized.
- A Realizing Player is in the process of determining its resource requirements. During realization, a Player acquires the resources that it only needs to acquire once.
- When a Player finishes Realizing, it moves into the Realized state. A Realized Player knows what resources it needs and information about the type of media it is to present.
- When prefetch is called, a Player moves from the Realized state into the Prefetching state. A Prefetching Player is preparing to present its media. During this phase, the Player preloads its media data, obtains exclusive-use resources, and does whatever else it needs to do to prepare itself to play.
- When a Player finishes Prefetching, it moves into the Prefetched state. A Prefetched Player is ready to be started.
- Calling start puts a Player into the Started state. A Started Player object's time-base time and media time are mapped and its clock is running.
A Player posts TransitionEvents as it moves from one state to another. The ControllerListener interface provides a way for your program to determine what state a Player is in and to respond appropriately. For example, when your program calls an asynchronous method on a Player or Processor, it needs to listen for the appropriate event to determine when the operation is complete.
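For example, realize is asynchronous; one common sketch blocks the calling thread until the RealizeCompleteEvent arrives:

```java
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.Player;
import javax.media.RealizeCompleteEvent;

public class StateWaiter implements ControllerListener {
    private final Object lock = new Object();
    private boolean realized = false;

    public void controllerUpdate(ControllerEvent event) {
        if (event instanceof RealizeCompleteEvent) {
            synchronized (lock) {
                realized = true;
                lock.notifyAll();
            }
        }
    }

    // Calls the asynchronous realize() and waits for the transition event.
    public void blockingRealize(Player player) throws InterruptedException {
        player.addControllerListener(this);
        player.realize();
        synchronized (lock) {
            while (!realized) lock.wait();
        }
    }
}
```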
Processors

Processors can also be used to present media data. A Processor is just a specialized type of Player that provides control over what processing is performed on the input media stream. A Processor supports all of the same presentation controls as a Player. In addition to rendering media data to presentation devices, a Processor can output media data through a DataSource so that it can be presented by another Player or Processor, further manipulated by another Processor, or delivered to some other destination, such as a file.
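A sketch of chaining a Processor's output DataSource into a Player (the input file is hypothetical, and the waits for the configured and realized states are abbreviated to comments):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;
import javax.media.Processor;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;

public class ProcessorChain {
    public static void main(String[] args) throws Exception {
        Processor p = Manager.createProcessor(
                new MediaLocator("file:/media/sample.mov")); // hypothetical file

        p.configure();
        // ... wait for ConfigureCompleteEvent via a ControllerListener ...
        p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));

        p.realize();
        // ... wait for RealizeCompleteEvent ...

        // The output can feed another Player, Processor, or a DataSink.
        DataSource output = p.getDataOutput();
        Player player = Manager.createPlayer(output);
        p.start();
        player.start();
    }
}
```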
Capture

A multimedia capturing device can act as a source for multimedia data delivery. Such capture devices are abstracted as DataSources. Some devices deliver multiple data streams--for example, an audio/video conferencing board might deliver both an audio and a video stream. The corresponding DataSource can contain multiple SourceStreams that map to the data streams provided by the device.
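A sketch that asks CaptureDeviceManager for devices able to capture linear (PCM) audio; the devices registered vary per system:

```java
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.format.AudioFormat;
import javax.media.protocol.DataSource;

public class CaptureDemo {
    public static void main(String[] args) throws Exception {
        // All registered devices that can deliver linear audio.
        Vector devices = CaptureDeviceManager.getDeviceList(
                new AudioFormat(AudioFormat.LINEAR));

        if (!devices.isEmpty()) {
            CaptureDeviceInfo info = (CaptureDeviceInfo) devices.get(0);
            System.out.println("Capturing from: " + info.getName());

            // The capture device is just another DataSource.
            DataSource source = Manager.createDataSource(info.getLocator());
        }
    }
}
```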
Media Data Storage and Transmission

A DataSink is used to read media data from a DataSource and render the media to some destination--generally a destination other than a presentation device. A particular DataSink might write data to a file, write data across the network, or function as an RTP broadcaster.
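A sketch that writes a DataSource to a file with a DataSink (the output path is hypothetical; the source might be a Processor's output or a capture DataSource):

```java
import javax.media.DataSink;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;

public class DataSinkDemo {
    static void writeToFile(DataSource source) throws Exception {
        MediaLocator dest = new MediaLocator("file:/tmp/out.wav"); // hypothetical path
        DataSink sink = Manager.createDataSink(source, dest);
        sink.open();  // connect to the destination
        sink.start(); // begin transferring data
        // ... wait for the end of the stream, then:
        sink.stop();
        sink.close();
    }
}
```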