
Experiments in Streaming Content in Java ME (Part 1)

 

Since my book on Mobile Media API (MMAPI), Pro Java ME MMAPI: Mobile Media API for Java Micro Edition, was published in May, I have been inundated with requests to help readers with streaming content via MMAPI for Java-enabled mobile devices. This topic was an important omission from the book, but one that was simply not feasible to include because of the lack of support for it within various MMAPI implementations. In this article, I will show you the results of experiments I have conducted since the publication of the book to stream content via MMAPI using a custom datasource.

DISCLAIMER: Before I commence, I would like to point out that even though I was able to stream data from a streaming server and receive it successfully in a MIDlet using a custom datasource, I wasn't able to utilize this data in any meaningful manner because of limitations in the way the data is read by the MMAPI implementation at my disposal. You may have more success if you have access to an MMAPI implementation that doesn't insist on reading the complete data before processing it. Even if you don't, this article still provides a good study of the issues involved in streaming media data. At the very least, it shows you how to create and use your own custom datasource.

For a background on Java ME, please see my previous tutorial series on getting started. For an introduction to MMAPI, tutorial 4 is a good start, or you can always buy the book.

Background to the streaming problem

MMAPI is a format- and protocol-agnostic API, which means that the API doesn't mandate support from device manufacturers for any particular format or protocol. One of the protocols most widely requested by application developers is the Real Time Streaming Protocol (RTSP) and the associated Real-time Transport Protocol (RTP) for streaming audio/video content. The advantage of streaming content is that it provides a fast turnaround time for the user, gives the distributor control over content distribution, and delivers an overall richer user experience.

However, hardly any manufacturer supports this protocol through Java ME. Some new phones provide support for RTSP, but that support is only on a smattering of devices. A majority of devices still do not support this protocol, therefore limiting useful application development in the streaming media department. A majority of questions in the MMAPI forums of various device manufacturers revolve around this very issue, that is, how to provide streaming data when RTSP is not supported. This article aims to point you in the right direction. I'll start by cutting through the clutter to try to provide an understanding of what streaming means.

What is streaming?

Streaming is the process of transferring data via a channel to its destination, where it is decoded and consumed by the user or device in real time, that is, as the data is being delivered. It differs from non-streaming delivery because it doesn't require the data to be fully downloaded before it can be seen or used. Streaming is not a property of the data being delivered but an attribute of the distribution channel, which means that, technically, most media can be streamed.

HTTP and RTSP

HTTP and RTSP are application-level protocols that allow remote retrieval of data. So why can't you use HTTP for streaming media content? The truth is, you can. When you click a Web page link to play an audio file, in most cases the media data is streamed to your machine. However, streaming content over HTTP is inherently inefficient, because HTTP is based on the Transmission Control Protocol (TCP), which makes sure that media packets are delivered reliably without regard to when they are delivered. RTSP, on the other hand, can run over the User Datagram Protocol (UDP), a connectionless protocol that favors fast delivery over reliability, as well as over TCP. In addition, RTSP has built-in control mechanisms that allow random access to the media data, letting you seek, pause, and play.

Making sense of RTSP, RTP, and RTCP

There is a lot of confusion among newcomers over the acronyms RTSP, RTP, and RTCP. All three are different protocols related to the streaming of media content. An RTSP session initiates both Real-time Transport Protocol (RTP) and RTP Control Protocol (RTCP) sessions. RTSP is only the control protocol, a bit like a remote control for a DVD player, in that it allows you to start, stop, resume, and seek data remotely. The actual data delivery is done via RTP, and RTCP is a companion protocol to RTP that provides feedback to both the sender and receiver on the quality of the media data being transferred.
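To make the division of labor concrete, here is a minimal sketch of what a single RTSP control request could look like from a MIDlet, assuming a device with MIDP 2.0 socket support. The class name is just illustrative, the host, port, and presentation URL point at the Darwin sample stream used later in this article, and a real client would parse the response rather than print it:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.SocketConnection;

public class RTSPDescribeSketch {

    // sends a single DESCRIBE request and prints whatever the server returns
    public static void describe() throws IOException {
        SocketConnection sc =
            (SocketConnection) Connector.open("socket://localhost:554");
        OutputStream os = sc.openOutputStream();
        InputStream is = sc.openInputStream();

        // RTSP is text based, much like HTTP; CSeq numbers each request
        String request =
            "DESCRIBE rtsp://localhost:554/sample_50kbit.3gp RTSP/1.0\r\n" +
            "CSeq: 1\r\n" +
            "\r\n";
        os.write(request.getBytes());
        os.flush();

        // a real client would keep reading until the complete response
        // (status line, headers, and SDP body) has arrived
        byte[] buffer = new byte[2048];
        int read = is.read(buffer);
        System.out.println(new String(buffer, 0, read));

        is.close();
        os.close();
        sc.close();
    }
}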

With this basic introduction about RTSP and streaming out of the way, let's set up our own streaming server to conduct some experiments. You can read more about RTSP, RTP, and RTCP at http://www.rtsp.org.

Set up a streaming server

To conduct experiments for the purposes of this article, you will need access to a specialty streaming server that can create RTSP streams for media objects. One such server is the Darwin Streaming Server, an open-source streaming server based on the same source code as Apple's commercial QuickTime Streaming Server. Implementations of this free server are available for Mac OS, Linux, and Windows. Download the version that is suitable for your OS and run the installer, or download the source code and build it in your environment. I have run the examples in this article on a Windows XP machine, with the server installed in C:/Program Files/Darwin Streaming Server.

For the purposes of this article, you will also need to have Perl installed on your computer to administer the Darwin server. For Windows, you can download ActivePerl.

As part of the installation, you will be asked to provide an administrator username and password, but make sure that you run the administration server after the installation (by running the streamingadminserver.pl file). This starts an administration server on port 1220 with which you can monitor the current activity within the streaming server. More importantly, you will need to supply a username/password combination the first time you log into the administrative console (by navigating to http://localhost:1220 in your browser) for running the movie and MP3 broadcast service. It is important to set this (even though you never really need to supply this username/password combination anywhere when running the examples in this article).

Note: On Windows, if you download the latest version of ActivePerl, streamingadminserver.pl is likely to fail with the following error:

ActivePerl 5.8.0 or higher is required in order to run the Darwin Streaming
Server web-based administration. Please download it from
http://www.activeperl.com/ and install it.

This is because of an incorrect configuration check in this script, and you can easily fix it by commenting out lines 33 and 34 (put a # in front of these lines).

The streaming server starts on port 554 and comes with a few sample movie files, ready for streaming, in the Movies directory under the installation folder. The Darwin server can stream MPEG-4, 3GPP, and QuickTime movie files natively, which means that these files don't need to be "hinted" in order to be streamed. Hinting is a process in which media files are prepared with extra track information for streaming, for example by using the professional version of QuickTime. For the purposes of this article, I will work only with natively streamable files such as 3GPP and MPEG-4.

To test that your streaming server is working correctly, use QuickTime Player to open a file via RTSP. For example, if you can open the URL rtsp://localhost:554/sample_50kbit.3gp in QuickTime Player and view, pause, stop, and seek the file, then your streaming server is working correctly.

Model an RTP packet

As I said earlier, RTP is the actual delivery protocol for streaming data. Each streaming session involves the streaming server sending RTP packets to their destination based on the client's requests (requests that are delivered via the RTSP protocol). A full knowledge of the RTP RFC is not required for the purposes of this article, so the following base class models an RTP packet only to a reasonable approximation.

Note: I have used the Java ME Wireless Toolkit 2.3 (beta) to create and run the examples in this article. You can start by creating a project called "StreamingData" (or whatever you prefer) in this toolkit to place your code in. The J2ME tutorial part 1 gives more details on the process of creating projects in this toolkit.

public class RTPPacket {

    // used to identify separate streams that may contribute to this packet
    private long SSRC;

    // incrementing identifier for each packet that is sent
    private long sequenceNumber;

    // used to place this packet in the correct timing order,
    // that is, where this packet fits in time-based media
    private long timeStamp;

    // the type of the media data, or the payload type
    private long payloadType;

    // the actual media data, also called the payload
    private byte[] data;

    // the get and set methods
    public long getSSRC() { return this.SSRC; }
    public void setSSRC(long SSRC) { this.SSRC = SSRC; }

    public long getSequenceNumber() { return this.sequenceNumber; }
    public void setSequenceNumber(long sequenceNumber) {
        this.sequenceNumber = sequenceNumber;
    }

    public long getTimeStamp() { return this.timeStamp; }
    public void setTimeStamp(long timeStamp) { this.timeStamp = timeStamp; }

    public long getPayloadType() { return this.payloadType; }
    public void setPayloadType(long payloadType) {
        this.payloadType = payloadType;
    }

    public byte[] getData() { return this.data; }
    public void setData(byte[] data) { this.data = data; }

    public String toString() {
        return
            "RTP Packet " + sequenceNumber +
            ": [" +
            "ssrc=0x" + SSRC +
            ", timestamp=" + timeStamp +
            ", payloadtype=" + payloadType +
            "]";
    }
}

 

The comments within the code should give you some idea of the various features of an RTP packet. Since you won't be building a complete RTP client, and will be running this code only within the confines of this example, the main feature of the above class is the data, or payload, contained within such a packet. Note that an RTP packet contains other information as well, which is not modeled by this class.
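For illustration, the following hypothetical helper (not part of the listing above) shows how such a packet could be populated from a raw packet buffer, assuming the standard 12-byte fixed RTP header and ignoring CSRC entries and header extensions; it could be added to RTPPacket as a static factory method:

// hypothetical helper: builds an RTPPacket from a raw packet buffer,
// assuming the 12-byte fixed RTP header with no CSRC entries or extensions
public static RTPPacket parse(byte[] packet) {
    RTPPacket rtp = new RTPPacket();

    // byte 1, lower 7 bits: the payload type
    rtp.setPayloadType(packet[1] & 0x7F);

    // bytes 2-3: the sequence number
    rtp.setSequenceNumber(((packet[2] & 0xFF) << 8) | (packet[3] & 0xFF));

    // bytes 4-7: the timestamp
    rtp.setTimeStamp(((long) (packet[4] & 0xFF) << 24) | ((packet[5] & 0xFF) << 16)
            | ((packet[6] & 0xFF) << 8) | (packet[7] & 0xFF));

    // bytes 8-11: the SSRC identifier
    rtp.setSSRC(((long) (packet[8] & 0xFF) << 24) | ((packet[9] & 0xFF) << 16)
            | ((packet[10] & 0xFF) << 8) | (packet[11] & 0xFF));

    // everything after the fixed header is the payload
    byte[] payload = new byte[packet.length - 12];
    System.arraycopy(packet, 12, payload, 0, payload.length);
    rtp.setData(payload);

    return rtp;
}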

Create a custom DataSource

A DataSource is an MMAPI abstract class, implementations of which encapsulate the task of locating and retrieving media data. Device manufacturers provide their own implementations for most common sources. Developers don't normally need to create custom datasources, because the task of locating data in a file or over a network is routine and is fulfilled by the device manufacturer's implementation. However, when the developer needs to retrieve data from a custom source, a custom datasource is the answer, and media data fetched from a streaming server is a perfect example.

Data retrieval is one thing, while data consumption is another. Since MMAPI doesn't allow you to create custom media players, will a custom datasource suffice in this example? Let's proceed with the creation of the custom datasource before I answer that question. The following listing shows the skeleton of the custom datasource class that I will use for talking to the streaming server:

 

import javax.microedition.media.Control;
import javax.microedition.media.protocol.DataSource;
import javax.microedition.media.protocol.SourceStream;

public class StreamingDataSource extends DataSource {

    // the full URL-like locator to the destination
    private String locator;

    // the internal streams that connect to the source
    private SourceStream[] streams;

    public StreamingDataSource(String locator) {
        super(locator);
        setLocator(locator);
    }

    public void setLocator(String locator) { this.locator = locator; }

    public String getLocator() { return locator; }

    public void connect() {}

    public void stop() {}

    public void start() {}

    public void disconnect() {}

    public String getContentType() { return ""; }

    public Control[] getControls() { return null; }

    public Control getControl(String controlType) { return null; }

    public SourceStream[] getStreams() { return streams; }
}

This class contains only placeholder methods at the moment. Internally, each datasource uses SourceStream implementations to read its individual streams of data, so let's create a simple SourceStream implementation for reading RTP packets:

 

import java.io.IOException;
import javax.microedition.media.Control;
import javax.microedition.media.protocol.SourceStream;
import javax.microedition.media.protocol.ContentDescriptor;

public class RTPSourceStream implements SourceStream {

    public RTPSourceStream(String address) throws IOException {}

    public void close() {}

    public int read(byte[] buffer, int offset, int length)
            throws IOException {
        return 0;
    }

    public long seek(long where) throws IOException {
        throw new IOException("cannot seek");
    }

    public long tell() { return -1; }

    public int getSeekType() { return NOT_SEEKABLE; }

    public Control[] getControls() { return null; }

    public Control getControl(String controlType) { return null; }

    public long getContentLength() { return -1; }

    public int getTransferSize() { return -1; }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor("audio/rtp");
    }
}

As with the previous listing, this class only contains placeholder methods for the moment. However, all listings so far should compile and preverify successfully.
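Once these placeholder methods are filled in, the intention is to hand the datasource directly to MMAPI, which accepts a DataSource when creating a Player. The following is only a rough sketch of that eventual usage; the class name and URL are illustrative, and the stream won't actually play until the placeholders are implemented:

import java.io.IOException;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;

public class StreamingTest {

    // creates a player from the custom datasource instead of an rtsp:// locator,
    // so no native RTSP support is required on the device
    public static void play() throws IOException, MediaException {
        StreamingDataSource source =
            new StreamingDataSource("rtsp://localhost:554/sample_50kbit.3gp");
        source.connect();

        Player player = Manager.createPlayer(source);
        player.start();
    }
}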
