JMF Examples
The sample program "MDIApp" allows you to open JMF players and place them in JInternalFrames inside a JFrame.
How to run this sample
1. Compile MDIApp.java
javac MDIApp.java
2. Run MDIApp
java MDIApp
3. Select File->Open and select your favourite movie (clip01.mpg & msnbcmpg.mpg under ./clips).
Source Code
MDIApp.java
Highlight of MDIApp.java:
.....
public void controllerUpdate(ControllerEvent ce) {
    if (ce instanceof RealizeCompleteEvent) {
        // Realized: go on to prefetch.
        mplayer.prefetch();
    } else if (ce instanceof PrefetchCompleteEvent) {
        // Prefetched: lay out the visual and control components, then start.
        if ((visual = mplayer.getVisualComponent()) != null) {
            Dimension size = visual.getPreferredSize();
            videoWidth = size.width;
            videoHeight = size.height;
            getContentPane().add("Center", visual);
        } else
            videoWidth = 320;
        if ((control = mplayer.getControlPanelComponent()) != null) {
            controlHeight = control.getPreferredSize().height;
            getContentPane().add("South", control);
        }
        setSize(videoWidth + insetWidth,
                videoHeight + controlHeight + insetHeight);
        validate();
        mplayer.start();
    } else if (ce instanceof EndOfMediaEvent) {
        // End of media: rewind and loop.
        mplayer.setMediaTime(new Time(0));
        mplayer.start();
    }
}
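For context, a listener like this is registered when the player is created. A minimal sketch (illustrative class and names, not the actual MDIApp code) of creating a player inside a JInternalFrame:

import javax.media.*;
import javax.swing.*;

// Illustrative sketch, not from MDIApp.java: a JInternalFrame that
// hosts one JMF player and listens for its controller events.
public class PlayerFrame extends JInternalFrame implements ControllerListener {
    Player mplayer;

    public PlayerFrame(String mediaURL) throws Exception {
        super(mediaURL, true, true, true, true); // resizable, closable, ...
        mplayer = Manager.createPlayer(new MediaLocator(mediaURL));
        mplayer.addControllerListener(this);     // events go to controllerUpdate
        mplayer.realize();                       // kicks off the chain shown above
    }

    public void controllerUpdate(ControllerEvent ce) {
        // ... handle RealizeCompleteEvent etc. as in the highlight above ...
    }
}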
This sample creates clones from a DataSource and uses them to create several players for playback. The application accepts a URL for the input media and a number specifying how many copies of the playback it should generate. It performs the following steps (sketched in code below):
1. Create a DataSource for the given input URL.
2. Create a cloneable DataSource from the original DataSource with the Manager.createCloneableDataSource call.
3. Create clones from the cloneable DataSource.
4. For the cloneable DataSource and each of the clones created, create a player for playback.
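A minimal sketch of these four steps (assuming the input URL in args[0]; the class name is illustrative):

import javax.media.*;
import javax.media.protocol.DataSource;
import javax.media.protocol.SourceCloneable;

public class CloneSketch {
    public static void main(String[] args) throws Exception {
        // Step 1: a DataSource for the input URL.
        DataSource ds = Manager.createDataSource(new MediaLocator(args[0]));
        // Step 2: wrap it so it can be cloned.
        ds = Manager.createCloneableDataSource(ds);
        // Step 3: each createClone call yields an independent clone.
        DataSource clone = ((SourceCloneable) ds).createClone();
        // Step 4: one player for the cloneable source, one per clone.
        Player p1 = Manager.createPlayer(ds);
        Player p2 = Manager.createPlayer(clone);
    }
}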
How to run this sample
- java Clone <URL> <# of copies>
Example:
- java Clone file:clips/clip01mpg.mpg 2
Source Code
Clone.java
Highlight of Clone.java:
public class Clone extends Frame implements ControllerListener {
    public boolean open(DataSource ds) {
        try {
            p = Manager.createPlayer(ds);
        } catch (Exception e) {
            return false;
        }
        p.addControllerListener(this);
        p.prefetch();
        // (In the full source, the player is prefetched before the
        // components are retrieved, and both components may be null.)
        if ((vc = p.getVisualComponent()) != null)
            add("Center", vc);
        if ((cc = p.getControlPanelComponent()) != null)
            add("South", cc);
        p.start();
        return true;
    }
}
The Seek sample uses a FramePositioningControl to step forward or backward through a movie one video frame at a time, or to jump to a random frame.
How to run this sample
- java Seek <URL>
Example:
- java Seek file:clips/msnbc-vid-mov.mov
Source Code
Seek.java
Highlight of Seek.java:
ml = new MediaLocator(args[0]);
ds = Manager.createDataSource(ml);
Seek seek = new Seek();
seek.open(ds);
public class Seek extends Frame implements ControllerListener, ActionListener {
    // (p, fpc and totalFrames are instance fields in the full source.)
    public boolean open(DataSource ds) {
        System.err.println("create player for: " + ds.getContentType());
        p = Manager.createPlayer(ds);
        p.addControllerListener(this);
        p.realize();
        // Try to retrieve a FramePositioningControl from the player.
        fpc = (FramePositioningControl)
                p.getControl("javax.media.control.FramePositioningControl");
        if (fpc == null) {
            System.err.println("The player does not support FramePositioningControl.");
            System.err.println("There's no reason to go on for the purpose of this demo.");
            return false;
        }
        Time duration = p.getDuration();
        System.err.println("Movie duration: " + duration.getSeconds());
        // Map the duration to a frame number to get the total frame count.
        totalFrames = fpc.mapTimeToFrame(duration);
        System.err.println("Total # of video frames in the movie: " + totalFrames);
        p.prefetch();
        // Build the control panel with the three seek buttons.
        setLayout(new BorderLayout());
        cntlPanel = new Panel();
        fwdButton = new Button("Forward");
        bwdButton = new Button("Backward");
        rndButton = new Button("Random");
        fwdButton.addActionListener(this);
        bwdButton.addActionListener(this);
        rndButton.addActionListener(this);
        cntlPanel.add(fwdButton);
        cntlPanel.add(bwdButton);
        cntlPanel.add(rndButton);
        Component vc = p.getVisualComponent();
        add("Center", vc);
        add("South", cntlPanel);
        return true;
    }

    public void actionPerformed(ActionEvent ae) {
        String command = ae.getActionCommand();
        if (command.equals("Forward")) {
            int dest = fpc.skip(1);       // step one frame forward
        } else if (command.equals("Backward")) {
            int dest = fpc.skip(-1);      // step one frame back
        } else if (command.equals("Random")) {
            int randomFrame = (int)(totalFrames * Math.random());
            randomFrame = fpc.seek(randomFrame);  // returns the frame actually reached
            System.err.println("Jump to a random frame: " + randomFrame);
        }
        int currentFrame = fpc.mapTimeToFrame(p.getMediaTime());
        System.err.println("Current frame: " + currentFrame);
    }
}
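Besides skip and seek, FramePositioningControl can also map a frame number back to a media time. A small sketch of a hypothetical helper (not part of Seek.java) that positions the player by media time:

import javax.media.Player;
import javax.media.Time;
import javax.media.control.FramePositioningControl;

// Hypothetical helper, assuming p and fpc as in the highlight above.
class FrameSeekHelper {
    static void seekToFrame(Player p, FramePositioningControl fpc, int frame) {
        Time t = fpc.mapFrameToTime(frame);            // frame -> media time
        if (t != FramePositioningControl.TIME_UNKNOWN) // the mapping may fail
            p.setMediaTime(t);                         // roughly what fpc.seek(frame) does
    }
}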
Transmitting Video over RTP
The VideoTransmit class is a simple wrapper that can be programmed to take video input from a source of your choice and transmit the video to a destination computer or network in JPEG format. It takes 3 parameters: a media locator for the source, the destination IP address, and the destination port number.
The port number can be any port number that is not in use by another service on your computer; for example, you can use "22222". Make sure that it is an even number.
If all goes well, the video will be sent out for 60 seconds and then the application will exit.
Note: The source video, whether it is a file or live video, needs to be in a format that can be converted to JPEG/RTP. Cinepak, RGB, YUV and JPEG are good formats. Also, the dimensions of the video should be a multiple of 8x8.
How to run this sample
1. Run VideoTransmit with the required 3 command line parameters. For example:
- java VideoTransmit file:clips/clip01mpg.mpg 224.112.112.112 22222
2. To receive the transmission on the client side use JMStudio:
- use open RTP session and specify group: 224.112.112.112 & port: 22222
OR
- use open URL and specify: rtp://224.112.112.112:22222/video
Source Code
VideoTransmit.java
Highlight of VideoTransmit.java:
public class VideoTransmit {
    private MediaLocator locator;
    private String ipAddress;
    private String port;
    private Processor processor = null;
    private DataSink rtptransmitter = null;
    private DataSource dataOutput = null;

    public VideoTransmit(MediaLocator locator, String ipAddress, String port) {
        this.locator = locator;
        this.ipAddress = ipAddress;
        this.port = port;
    }

    // Starts the transmission. Returns null on success.
    // (Error and state handling are elided in this highlight.)
    public synchronized String start() {
        createProcessor();
        createTransmitter();
        processor.start();
        return null;
    }

    // Stops the transmission, if already started.
    public void stop() {
        processor.stop();
        processor.close();
        processor = null;
        rtptransmitter.close();
        rtptransmitter = null;
    }

    private String createProcessor() {
        // (In the full source, the processor is configured before its tracks
        // are queried and realized before getDataOutput; that state handling
        // is elided here.)
        DataSource ds = Manager.createDataSource(locator);
        processor = Manager.createProcessor(ds);
        TrackControl [] tracks = processor.getTrackControls();
        // Search through the tracks for a video track.
        for (int i = 0; i < tracks.length; i++) {
            Format format = tracks[i].getFormat();
            if (tracks[i].isEnabled() && format instanceof VideoFormat) {
                // Found a video track. Try to program it to output JPEG/RTP.
                // Make sure the sizes are multiples of 8.
                Dimension size = ((VideoFormat)format).getSize();
                float frameRate = ((VideoFormat)format).getFrameRate();
                int w = (size.width % 8 == 0 ? size.width : (size.width / 8) * 8);
                int h = (size.height % 8 == 0 ? size.height : (size.height / 8) * 8);
                VideoFormat jpegFormat = new VideoFormat(VideoFormat.JPEG_RTP,
                        new Dimension(w, h), Format.NOT_SPECIFIED,
                        Format.byteArray, frameRate);
                tracks[i].setFormat(jpegFormat);
            } else
                tracks[i].setEnabled(false);
        }
        // Set the output content descriptor to RAW_RTP.
        ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
        processor.setContentDescriptor(cd);
        dataOutput = processor.getDataOutput();
        return null;
    }

    private String createTransmitter() {
        String rtpURL = "rtp://" + ipAddress + ":" + port + "/video";
        MediaLocator outputLocator = new MediaLocator(rtpURL);
        rtptransmitter = Manager.createDataSink(dataOutput, outputLocator);
        rtptransmitter.open();
        rtptransmitter.start();
        dataOutput.start();
        return null;
    }

    public static void main(String [] args) throws Exception {
        VideoTransmit vt = new VideoTransmit(new MediaLocator(args[0]),
                                             args[1], args[2]);
        String result = vt.start();
        // Transmit for 60 seconds, then stop and exit.
        Thread.sleep(60000);
        vt.stop();
    }
}
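As an alternative to JMStudio on the receive side, a Player can be created directly from the RTP URL. A minimal sketch (illustrative class name; the address and port must match the transmitter):

import javax.media.*;

// JMF players can open rtp:// URLs directly (illustrative values below).
public class RTPReceiveSketch {
    public static void main(String[] args) throws Exception {
        Player p = Manager.createPlayer(
                new MediaLocator("rtp://224.112.112.112:22222/video"));
        p.start();  // start() realizes and prefetches as needed
    }
}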
The AudioTransmit class is very similar to the VideoTransmit class, except that it transmits audio instead of video.
How to run this sample
1. Run AudioTransmit with the required 3 command line parameters. For example:
- java AudioTransmit javasound://0 224.112.112.112 22222
OR
- java AudioTransmit file:clips/clip01mpg.mpg 224.112.112.112 22222
OR
- java AudioTransmit file:clips/clip01audio.gsm 224.112.112.112 22222
2. To receive the transmission on the client side use JMStudio:
- use open RTP session and specify group: 224.112.112.112 & port: 22222
OR
- use open URL and specify: rtp://224.112.112.112:22222/audio
Source Code
AudioTransmit.java
Highlight of AudioTransmit.java:
Very similar to VideoTransmit.java except the part that sets the track format:
TrackControl [] tracks = processor.getTrackControls();
// Search through the tracks for an audio track.
for (int i = 0; i < tracks.length; i++) {
    Format format = tracks[i].getFormat();
    if (tracks[i].isEnabled() && format instanceof AudioFormat) {
        // Program the audio track to output DVI/RTP.
        AudioFormat dviFormat = new AudioFormat(AudioFormat.DVI_RTP);
        tracks[i].setFormat(dviFormat);
    } else
        tracks[i].setEnabled(false);
}
private String createTransmitter() {
    String rtpURL = "rtp://" + ipAddress + ":" + port + "/audio";
    ....
}
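Other RTP-ready encodings can be programmed at the same spot. A hypothetical variation (GSM/RTP instead of DVI/RTP; the helper name is illustrative, not from AudioTransmit.java):

import javax.media.control.TrackControl;
import javax.media.format.AudioFormat;

// Hypothetical variation: program an audio track for GSM/RTP.
class GsmTrackProgrammer {
    static void program(TrackControl track) {
        track.setFormat(new AudioFormat(AudioFormat.GSM_RTP));
    }
}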
Transmitting Audio and Video over RTP
The AVTransmit2 class is very similar to VideoTransmit, but uses RTP managers to send the video and audio streams.
Since the media tracks are transmitted in multiple sessions, you'll need to use one Player per track on the receive side. Using JMStudio, you can start multiple Players from the "File" menu using the "New Window" item. Then use open URL with:
rtp://<sourceIP>:<port>/media
where <sourceIP> is the IP address of the RTP session and the port number is the same one that is used on the transmitting side.
How to run this sample
1. Run AVTransmit2 with the required 3 command line parameters. For example:
- java AVTransmit2 file:clips/clip01mpg.mpg 224.112.112.112 22222
2. To receive the transmission on the client side use JMStudio:
- use open RTP session and specify group: 224.112.112.112 & port: 22222
AND use File -> New Window and open RTP session with port 22224.
OR
- use open URL and specify: rtp://224.112.112.112:22222/video
AND use File -> New Window and open URL with 22224/audio.
Notes:
- java AVTransmit2 javasound://0 224.112.112.112 22222 (audio only)
- java AVTransmit2 vfw://0 224.112.112.112 22222 (video only)
In these cases, create only one instance of JMStudio.
To transmit using unicast, give the receiver's address instead, e.g.:
- java AVTransmit2 file:clips/clip01mpg.mpg 128.82.4.7 22222
where 128.82.4.7 is the receiver address. If the sender address is 128.82.4.9, it will use port 22222 as well to send out data. In this case the receiver (e.g., JMStudio) should specify the sender as: 128.82.4.9 22222. Therefore, to use unicast you should have two machines, since you cannot use the same port for both sender and receiver.
Source Code
AVTransmit2.java
Highlight of AVTransmit2.java:
private String createTransmitter() {
    // One RTP manager (i.e., one session) per media track.
    PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
    PushBufferStream pbss[] = pbds.getStreams();
    rtpMgrs = new RTPManager[pbss.length];
    SessionAddress localAddr, destAddr;
    InetAddress ipAddr;
    SendStream sendStream;
    int port;
    SourceDescription srcDesList[];
    for (int i = 0; i < pbss.length; i++) {
        rtpMgrs[i] = RTPManager.newInstance();
        // Use consecutive even ports: portBase, portBase + 2, ...
        port = portBase + 2 * i;
        ipAddr = InetAddress.getByName(ipAddress);
        localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
        destAddr = new SessionAddress(ipAddr, port);
        rtpMgrs[i].initialize(localAddr);
        rtpMgrs[i].addTarget(destAddr);
        sendStream = rtpMgrs[i].createSendStream(dataOutput, i);
        sendStream.start();
    }
    return null;
}
Receiving Audio and Video using RTP
AVReceive2 uses the RTPManager API to receive RTP transmissions. It opens the specified RTP sessions, listens for incoming streams, and creates a player for each new receive stream.
How to run this sample
1. Run AVTransmit2 with the required 3 command line parameters. For example:
- java AVTransmit2 file:clips/clip01mpg.mpg 224.112.112.112 1234
2. Run AVReceive2 and specify the RTP session addresses to receive from. For example:
- java AVReceive2 224.112.112.112/1234 224.112.112.112/1236
to simultaneously receive 2 different RTP sessions (video and audio).
Note: because of port reuse, the programs must be run in this order: 1, then 2.
Source Code
AVReceive2.java
Highlight of AVReceive2.java:
protected boolean initialize() {
    // One RTPManager per session address given on the command line.
    mgrs = new RTPManager[sessions.length];
    for (int i = 0; i < sessions.length; i++) {
        mgrs[i] = (RTPManager) RTPManager.newInstance();
        // Listen for session and receive-stream events.
        mgrs[i].addSessionListener(this);
        mgrs[i].addReceiveStreamListener(this);
        mgrs[i].initialize(localAddr);
        mgrs[i].addTarget(destAddr);
    }
    return true;
}
// SessionListener
public synchronized void update(SessionEvent evt) {
    if (evt instanceof NewParticipantEvent) {
        Participant p = ((NewParticipantEvent)evt).getParticipant();
        System.err.println(" - A new participant has just joined: " + p.getCNAME());
    }
}
// ReceiveStreamListener
public synchronized void update(ReceiveStreamEvent evt) {
    RTPManager mgr = (RTPManager)evt.getSource();
    if (evt instanceof NewReceiveStreamEvent) {
        stream = ((NewReceiveStreamEvent)evt).getReceiveStream();
        DataSource ds = stream.getDataSource();
        // Find out the format of the stream, if an RTPControl is available.
        RTPControl ctl = (RTPControl)ds.getControl("javax.media.rtp.RTPControl");
        System.err.println(" - Received new RTP stream: " + ctl.getFormat());
        // Create a player by passing the datasource to the Media Manager.
        Player p = javax.media.Manager.createPlayer(ds);
        p.addControllerListener(this);
        p.realize();
        PlayerWindow pw = new PlayerWindow(p, stream);
        playerWindows.addElement(pw);
    }
}
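The controllerUpdate handling is elided above. A minimal sketch of what it typically does (modeled on the MDIApp highlight; the window field is an assumption, not AVReceive2 code):

import java.awt.*;
import javax.media.*;

// Sketch: show the video once the receive-side player is realized.
class ReceiveListener implements ControllerListener {
    Frame frame = new Frame("RTP Receive");  // assumed window for the video

    public synchronized void controllerUpdate(ControllerEvent ce) {
        Player p = (Player) ce.getSourceController();
        if (ce instanceof RealizeCompleteEvent) {
            Component vc = p.getVisualComponent();
            if (vc != null)
                frame.add("Center", vc);
            frame.pack();
            frame.setVisible(true);
            p.start();
        }
    }
}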
Exporting RTP Sessions to a File
RTPExport receives RTP streams and saves them to a file. The application accepts an output URL, a recording duration, and one or more RTP sessions to receive from.
How to run this sample
1. Receive:
- java RTPExport -o <output URL> -d <duration> <session> <session> ...
where <session> is of the form: [SourceIP]:[SourcePort]/[ContentType].
Example:
- java RTPExport -o file:foo.mov -d 30 224.122.122.122:8800/video 224.122.122.122:8802/audio
will simultaneously receive video and audio streams and transcode them to a QuickTime movie for a duration of 30 seconds.
2. Send:
Example: use JMStudio: File -> Transmit -> Capture -> OK -> Next -> Finish
Source Code
RTPExport.java
Highlight of RTPExport.java:
....
MediaLocator iml = createMediaLocator(inputURL);
MediaLocator oml = createMediaLocator(outputURL);
RTPExport rtpExport = new RTPExport();
rtpExport.doIt(iml, oml, duration);
.......
public class RTPExport implements ControllerListener, DataSinkListener {
    public boolean doIt(MediaLocator inML, MediaLocator outML, int duration) {
        Processor p = Manager.createProcessor(inML);
        p.addControllerListener(this);
        p.configure();
        // Program the output content type and track formats.
        setContentDescriptor(p, outML);
        setTrackFormats(p);
        p.realize();
        DataSink dsink = createDataSink(p, outML);
        dsink.addDataSinkListener(this);
        // Record for the requested duration, then stop.
        p.setStopTime(new Time((double)duration));
        p.start();
        dsink.start();
        return true;
    }

    DataSink createDataSink(Processor p, MediaLocator outML) {
        DataSource ds = p.getDataOutput();
        DataSink dsink = Manager.createDataSink(ds, outML);
        dsink.open();
        return dsink;
    }
}
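The DataSinkListener side is also elided. A sketch of the assumed handling that closes the sink once the stream ends (not copied from RTPExport.java; the class name is illustrative):

import java.io.IOException;
import javax.media.DataSink;
import javax.media.datasink.*;

// Assumed handling: close the sink once the incoming stream ends.
class SinkCloser implements DataSinkListener {
    public void dataSinkUpdate(DataSinkEvent evt) {
        if (evt instanceof EndOfStreamEvent) {
            DataSink sink = evt.getSourceDataSink();
            try {
                sink.stop();
            } catch (IOException e) {
                // ignore; we are shutting down anyway
            }
            sink.close();
        }
    }
}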
Transcoding to Different Formats
Given an input URL, the objective is to transcode the input media to different track formats and generate a resulting file with the transcoded data. The output file could also be of a different content type from the original input.
The application accepts an input URL where the original data comes from; an output URL where the transcoded data will be saved; and a list of track formats to which the input data will be transcoded.
How to run this sample
- java Transcode -o <output URL> [-a <audio format>] [-v <video format>] <input URL>
Example:
- java Transcode -o file:foo.avi file:clips/clip01mpg.mpg
will transcode from mpg to avi.
- java Transcode -o file:foo.mov file:foo.avi
will transcode from avi to mov.
Source Code
Transcode.java
Highlight of Transcode.java:
....
MediaLocator iml = createMediaLocator(inputURL);
MediaLocator oml = createMediaLocator(outputURL);
Transcode transcode = new Transcode();
transcode.doIt(iml, oml, fmts, mediaStart, mediaEnd);
.......
public class Transcode implements ControllerListener, DataSinkListener {
    public boolean doIt(MediaLocator inML, MediaLocator outML, Format fmts[],
                        int start, int end) {
        .....
    }
}
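The elided body programs the processor before realizing it. A sketch of what the track-programming step might look like (setTrackFormats is an assumed helper here, not the actual source):

import javax.media.Format;
import javax.media.Processor;
import javax.media.control.TrackControl;

// Sketch of an assumed setTrackFormats helper: apply the requested
// formats to the matching tracks of a configured processor.
class TrackFormatter {
    static void setTrackFormats(Processor p, Format fmts[]) {
        TrackControl tcs[] = p.getTrackControls();
        for (int i = 0; i < tcs.length && i < fmts.length; i++) {
            if (fmts[i] != null)
                tcs[i].setFormat(fmts[i]); // returns the format actually set
        }
    }
}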
Capturing the Screen
This example captures the contents of the screen and feeds it to a JMF Player or Processor, for purposes such as saving screen shots to disk or transmitting them live using RTP.
Java 2 v1.3 introduced a new class, Robot, that provides screen capture facilities. Since this is similar to capturing video from a VCR or camera, it is best encapsulated in a JMF DataSource object. This example shows how to use Robot to create a screen capture DataSource.
The DataSource is a PushBufferDataSource that pushes captured screen shots at regular intervals to the connected Player or Processor. It creates a new protocol "screen" with the following URL syntax:
screen://x,y,width,height/fps
where x and y are the coordinates of the top-left corner of the capture area, width and height are its dimensions in pixels, and fps is the number of frames captured per second.
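The LiveStream class shown below parses this URL in its parseLocator method. A simplified sketch of how the fields can be extracted (not the actual implementation; class name is illustrative):

import javax.media.MediaLocator;

// Simplified sketch of parsing "screen://x,y,width,height/fps".
class ScreenLocatorParser {
    static int[] parse(MediaLocator locator) {
        String rem = locator.getRemainder();   // "//x,y,width,height/fps"
        rem = rem.substring(2);                // drop the leading "//"
        String[] parts = rem.split("/");       // geometry and frame rate
        String[] geom = parts[0].split(",");
        return new int[] {
            Integer.parseInt(geom[0]),         // x
            Integer.parseInt(geom[1]),         // y
            Integer.parseInt(geom[2]),         // width
            Integer.parseInt(geom[3]),         // height
            Integer.parseInt(parts[1])         // frames per second
        };
    }
}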
Compiling:
Compile the DataSource and LiveStream classes under jmfsolutions/com.sun.media.protocol.screen:
- javac -d . DataSource.java LiveStream.java
How to run this sample:
1. Run JMStudio with a screen URL:
- java JMStudio screen://0,0,160,120/10
OR
2. Run AVTransmit2 & AVReceive2:
- java AVTransmit2 screen://0,0,160,120/10 224.122.122.122 2222
- java AVReceive2 224.122.122.122/2222
Source Code:
DataSource.java
LiveStream.java
Highlight of the screen data source:
DataSource.java:
package com.sun.media.protocol.screen;
public class DataSource extends PushBufferDataSource {
    .......
    public PushBufferStream [] getStreams() {
        // Create the single stream that grabs the screen.
        stream = new LiveStream(getLocator());
        return new PushBufferStream[] { stream };
    }
}
LiveStream.java:
package com.sun.media.protocol.screen;
public class LiveStream implements PushBufferStream, Runnable {
    .......
    protected Robot robot = null;
    public LiveStream(MediaLocator locator) {
        .......
        // Extract x, y, width, height and frame rate from the locator.
        parseLocator(locator);
        robot = new Robot();
        thread = new Thread(this, "Screen Grabber");
    }
    ......
    public void read(Buffer buffer) throws IOException {
        synchronized (this) {
            ........
            // Grab one screen shot into a BufferedImage.
            BufferedImage bi = robot.createScreenCapture(
                    new Rectangle(x, y, width, height));
            ......
        }
    }
    ...
}
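JMF locates custom protocol handlers by package prefix; the screen package sits under com.sun.media, which JMF already searches by default. A handler under a different prefix (a hypothetical "com.example" here) would have to be registered with PackageManager, roughly like this:

import java.util.Vector;
import javax.media.PackageManager;

public class RegisterPrefix {
    public static void main(String[] args) {
        // Hypothetical prefix "com.example": JMF would then look for
        // com.example.media.protocol.screen.DataSource.
        Vector prefixes = PackageManager.getProtocolPrefixList();
        if (!prefixes.contains("com.example")) {
            prefixes.addElement("com.example");
            PackageManager.setProtocolPrefixList(prefixes);
            PackageManager.commitProtocolPrefixList(); // persist the change
        }
    }
}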
This variant of the screen capture source adds a CaptureFrame window: moving or resizing the frame changes the captured region, and a slider adjusts the capture frame rate.
Compiling:
Compile the CaptureFrame, DataSource & LiveStream classes under jmfsolutions/ScreenCapture/com.sun.media.protocol.screen:
- cd jmfsolutions/ScreenCapture
- javac -d . CaptureFrame.java DataSource.java LiveStream.java
How to run this sample:
- java JMStudio screen://0,0,160,120/10
Source Code:
DataSource.java
LiveStream.java
CaptureFrame.java
Highlight of LiveStream.java:
package com.sun.media.protocol.screen;
public class LiveStream implements PushBufferStream, Runnable {
    .......
    protected Robot robot = null;
    public LiveStream(MediaLocator locator) {
        .......
        parseLocator(locator);
        robot = new Robot();
        // Show the capture frame and hand it a reference to this stream
        // so it can update the capture region and frame rate.
        CaptureFrame frame1 = new CaptureFrame("Capture Frame");
        frame1.FrameTransmit(this);
        thread = new Thread(this, "Screen Grabber");
    }
    ....
}
CaptureFrame.java:
package com.sun.media.protocol.screen;
public class CaptureFrame extends JFrame {
    static final int FPS_INIT = 15;
    LiveStream stream = null;

    public void FrameTransmit(LiveStream s) {
        stream = s;
    }

    public CaptureFrame(String windowTitle) {
        super(windowTitle);
        // Slider that controls the capture frame rate (0-30 fps).
        JSlider framesPerSecond = new JSlider(JSlider.HORIZONTAL, 0, 30, FPS_INIT);
        framesPerSecond.addChangeListener(new SliderListener());
        JLabel sliderLabel = new JLabel("Frames per second");
        JPanel contentPane = new JPanel();
        contentPane.add(sliderLabel);
        contentPane.add(framesPerSecond);
        setContentPane(contentPane);
        addComponentListener(new ComponentAdapter() {
            // Moving the window moves the capture region.
            public void componentMoved(ComponentEvent e) {
                Point pp = e.getComponent().getLocation();
                int X = (int) pp.getX();
                int Y = (int) pp.getY();
                stream.setPoint(X, Y);
                System.out.println("new frame X and Y are: " + X + " , " + Y);
            }
            // Resizing the window resizes the capture region,
            // rounded down to multiples of 8.
            public void componentResized(ComponentEvent e) {
                Dimension dm = e.getComponent().getSize();
                try {
                    int width = ((int) dm.getWidth() / 8) * 8;
                    int height = ((int) dm.getHeight() / 8) * 8;
                    stream.setDimension(width, height);
                    System.out.println("new frame width and height: " + width + " , " + height);
                } catch (Exception ioe) { }
            }
        });
    }

    class SliderListener implements ChangeListener {
        public void stateChanged(ChangeEvent e) {
            JSlider source = (JSlider) e.getSource();
            if (!source.getValueIsAdjusting()) {
                float ffps = (float) source.getValue();
                stream.setFrameRate(ffps);
                System.out.println("new frame rate is: " + ffps);
            }
        }
    }
}