
2008:198 CIV

MASTER'S THESIS

Conversational Real-time Multimedia in Java ME

Adam Bergkvist

Luleå University of Technology
MSc Programmes in Engineering
Computer Science and Engineering
Department of Computer Science and Electrical Engineering
Division of Information and Communication Technology

2008:198 CIV - ISSN: 1402-1617 - ISRN: LTU-EX--08/198--SE

Conversational Real-time Multimedia in Java ME

Adam Bergkvist

Luleå University of Technology
Dept. of Computer Science and Electrical Engineering
Information and Communication Technology

October 19, 2008


ABSTRACT

The Java Platform, Micro Edition is a widely adopted technology for introducing third-party developers, in a controlled way, to the otherwise inaccessible platforms in mobile phones, while at the same time allowing applications to run on devices from different manufacturers. Ericsson Research would like to investigate the possibilities to develop conversational real-time services using Java ME, and the purpose of this report is to conduct such an investigation. A problem with developing such services in Java ME is the very limited and inconsistent support for real-time multimedia among devices. This report targets both the Java ME specifications, investigating the theoretical possibilities to support conversational real-time multimedia, and the currently available support in the seventh Java Platform implementation by Sony Ericsson (JP-7), with a focus on video. It can be concluded that it is not possible to develop such services on the JP-7 platform available today. These conclusions are drawn from the facts that it is not possible to access captured media data in real-time, nor to play a real-time video stream with acceptable performance, using the JP-7 implementation. However, the specifications do not hinder future implementations from being used to develop strongly standardized real-time communication services, with intentionally reduced possibilities to control advanced settings. More general conversational communication services would, however, require new specifications, providing better control and flexibility, to be standardized.


PREFACE

The primary motivation for this work is that Ericsson Research would like to investigate the possibilities to develop platform-independent real-time communication services. Being able to work with Java would enable both faster development and deployment of new services. The work with this thesis started in Spring 2007, at Ericsson Research in Luleå, and was completed the following Autumn. The initial work consisted of developing the thesis specification to find a feasible approach to the problem of real-time streaming on Java-enabled mobile phones. The emphasis of the initial version of the thesis specification was on extending the available multimedia APIs with native functionality. However, uncertainty about whether the tools required to make the extensions would be available in time caused the specification to be redesigned. The second approach consisted of designing an entirely new multimedia framework for Java ME with incorporated support for developing conversational real-time multimedia services. This track was later abandoned for a more narrow and practical approach. By the time of the fourth version, the specification had traversed a loop and the thesis objectives were again similar to those in the initial version.


NOMENCLATURE

AMMS         Advanced Multimedia Supplements
API          Application Programming Interface
CDC          Connected Device Configuration
CIF          Common Intermediate Format
CLDC         Connected Limited Device Configuration
GUI          Graphical User Interface
IP           Internet Protocol
IPC          Inter-Process Communication
Java EE      Java Platform, Enterprise Edition
Java ME      Java Platform, Micro Edition
Java SE      Java Platform, Standard Edition
JMF          Java Media Framework
JNI          Java Native Interface
JP-7         (Sony Ericsson) Java Platform 7
JTWI         Java Technology for the Wireless Industry (JSR-185)
JVM          Java Virtual Machine
MIDP         Mobile Information Device Profile
MIME (type)  Multipurpose Internet Mail Extensions
MMAPI        Mobile Media API
MSA          Mobile Service Architecture (JSR-248)
MTA          Mobile Telephony System A
MTB          Mobile Telephony System B
OS           Operating System
QCIF         Quarter CIF
RTCP         Real-time Transport Control Protocol
RTP          Real-time Transport Protocol
RTSP         Real-Time Streaming Protocol
SDK          Software Development Kit
SDN          Sun Developer Network
SIP          Session Initiation Protocol
TCP          Transmission Control Protocol
UML          Unified Modeling Language
VoIP         Voice over IP

CONTENTS

Chapter 1: Introduction                                     1
  1.1 Evolution of Mobile Communication Services            1
  1.2 Developing Mobile Communication Services              2
  1.3 Problem Area                                          3
  1.4 Purpose and Approach                                  4
  1.5 Delimitation                                          4
  1.6 Related Work                                          5

Chapter 2: Theory                                           7
  2.1 Java Platform, Micro Edition                          7
  2.2 Multimedia in Java ME                                11

Chapter 3: Experiments                                     17
  3.1 Tools                                                18
  3.2 Experiment Structure                                 18
  3.3 Experiment 1 — Recording Audio and Video             19
  3.4 Experiment 2 — Real-Time Recording                   20
  3.5 Experiment 3 — Custom Protocol Player                22
  3.6 Experiment 4 — Java ME and I AM                      25
  3.7 Experiment 5 — Trouble Shooting                      33
  3.8 Experiment 6 — Java ME and I AM, Part 2              39
  3.9 Experiment 7 — A Different Approach                  41

Chapter 4: Evaluation                                      45
  4.1 Summary of Results                                   45
  4.2 Critique                                             46

Chapter 5: Discussion                                      47
  5.1 The Experiments                                      47
  5.2 The Specifications                                   49
  5.3 Conclusions                                          50
  5.4 Future Work                                          52

Appendix A: Implementations                                55
  A.1 Experiment 1 — Recording Audio and Video             56
  A.2 Experiment 2 — Real-Time Recording                   59
  A.3 Experiment 3 — Custom Protocol Player                59
  A.4 Experiment 4 — Java ME and I AM                      62
  A.5 Experiment 5 — Trouble Shooting                      65
  A.6 Experiment 6 — Java ME and I AM, Part 2              66
  A.7 Experiment 7 — A Different Approach                  67


CHAPTER 1: Introduction

This chapter will initially take the reader back to the birth of mobile communication, and then continue with a short discussion about today's and future mobile communication services. The next section describes how mobile communication services are developed today, and the players involved in the process. The following section describes the problem area, which leads up to the thesis purpose and delimitation. The chapter is concluded with a discussion about related work.

1.1 Evolution of Mobile Communication Services

In today's society, it is natural for a large number of people to always be available through their mobile communication devices: in our homes, at our workplaces, while traveling to work or the local supermarket, as well as abroad. Mobile phones are no longer expensive, bulky and energy-consuming devices affordable only to companies and professionals. However, this way of communicating has only been a reality for roughly the last decade of the little more than fifty-year history of the mobile phone. It began in 1956, when Ericsson introduced Mobile Telephony System A (MTA) — the world's first fully automatic mobile telephony system [1]. MTA could only be used in Stockholm and Göteborg, and within the first year it had a total of 26 subscribers. The terminal, which was intended to be mounted in a vehicle, weighed around 40 kg and required a substantial power supply. MTA was shut down in 1967, and by that time the number of subscribers had increased to around 125 [2]. MTA was replaced by Mobile Telephony System B (MTB), which was introduced in 1965. Thanks to the invention of the transistor, the weight of the portable device had decreased to 9 kg [3]. Unlike MTA, MTB was also available in Malmö, and by the time it was shut down in 1983, MTB had around 600 subscribers in the three cities [2].

Making a giant leap to mobile telephony today, the situation is different. For nearly a third of the world's population, mobile communication is a part of everyday life [1]. We are still calling each other by dialing numbers and speaking, but the evolution of mobile communication networks and terminals enables us to do much more. We are able to make calls and access the Internet from almost anywhere, and terminals can traverse several base stations during a call without being interrupted. A mid-range terminal is often equipped with a camera, an MP3 player, an FM radio receiver and software to surf the Internet and handle e-mail. A decade ago, one would have had to buy several devices to achieve the same functionality.

Along with the evolution of mobile communication networks and terminals comes an increasing range of more sophisticated services. A trend is that the previously co-existing worlds of telecommunication and the Internet are converging. As a result, new mobile communication services often have counterparts, or are inspired by services, in the PC-dominated Internet world. Today, terminal manufacturers equip their products with their own software for calling, managing contacts and so on. Future terminals might, except for the operating system, be empty at sale, or shipped with a set of default tools. It would then be up to the user to install the software of his, or her, choice. An example application stack could consist of Google Talk or MSN Messenger for maintaining presence status and exchanging text messages, and a Skype client for calling. An example scenario of how such an application stack could be used to provide a complete mobile communication environment is given below.

Bob, sitting at home by his computer, checks the presence status of Alice. Alice is on her way to the city by bus, but has set her presence status to available. He writes her a short greeting message. Alice's terminal makes her aware that someone is trying to communicate with her; she replies. After exchanging a number of messages, Alice realizes that she cannot keep up with Bob's typing speed, so she decides to add voice to the chat session. Bob accepts the invitation and puts on his headset. The conversation continues, and they eventually decide to meet in the city. Bob hands the session over to his mobile terminal, and the conversation continues while Bob walks to the nearest bus stop.

The above scenario exemplifies three important characteristics of future mobile communication services. First, services are able to communicate regardless of platform; in this case, mobile terminal and PC. Second, a session can be altered during its lifetime. Alice added voice to the chat session, but she might just as well have included full-duplex video. Finally, Bob made a handover from his computer to his mobile terminal. This transition could possibly include synchronization of recent chat logs between the PC and the mobile device. A mobile communication service with the characteristics addressed above puts harsh requirements on its development environment. The environment has to be powerful enough to control real-time streams and seamless handovers, and at the same time provide the level of abstraction needed to produce platform-independent services.

1.2 Developing Mobile Communication Services

Mobile communication devices come in a wide range of models with varying physical formats and capabilities, from simple entry-level and more capable mid-range devices to advanced smartphones. The latter category is usually equipped with a standard Operating System (OS) such as, for example, Symbian OS or Windows Mobile. These operating systems give the user the opportunity to install and run native third-party applications. This is normally not the case with non-smartphone devices. Today, development of advanced mobile communication services for non-smartphone devices is most commonly done using customized tools and native Application Programming Interfaces (APIs) provided by the device manufacturer. This gives the developer a great level of control, and services can be integrated with the device operating system in a seamless way.

However, access to the necessary tools and documentation is normally controlled by the platform manufacturer and only granted to its customers and other business partners. Apart from availability, another downside is portability. Since manufacturers use their own proprietary operating systems, a service built on native APIs gets tailored for a specific device family, and thus cannot easily be ported to other devices. Even if a service could be ported, it would have to be included in the device manufacturer's next official firmware update, since there is usually no other way to install it. Poor portability, combined with a troublesome installation process, has a negative effect on the deployment rate of a service.

There is, however, another way to create services targeting non-smartphone mobile devices. The Java Platform, Micro Edition (Java ME) is a Java platform designed to operate on resource-constrained devices. Similar to the other Java platforms, it requires a virtual machine to run an application. A Java-enabled phone provides just that, and a set of APIs to build services upon. Java ME is addressed in detail in Section 2.1. Unlike the manufacturers' proprietary APIs, Java ME is free to use. Anyone can download a Java ME Software Development Kit (SDK), either directly from Sun Microsystems or as a customized version from a terminal manufacturer. As opposed to developing with native APIs, the Java ME developer is limited to the functionality made available by the manufacturer. For example, if a terminal manufacturer decided not to make the camera available in Java ME, nothing can be done by a third-party Java developer to undo this unfortunate decision. However, since a Java ME application is executed in a virtual machine, it is portable. It is therefore possible for a user to download an application from a service provider and install it directly on the device. He, or she, can then beam it over to another user, and spread the service. This enables fast deployment of new services independently of content servers. Even though Java ME makes it possible to write device-independent code, not every application can be expected to work on every device. The reason for this is that resources and hardware features vary between Java-enabled cellphones. As a result, the APIs that handle these features vary as well. This issue is known as platform fragmentation, and is a big problem within the Java ME community. Section 2.1.2 addresses this problem and the attempts made to resolve it. Despite its shortcomings, Java ME is an interesting platform for future development of mobile communication services.

1.3 Problem Area

To be able to cope with the varying feature sets of the hardware that falls under the category of small devices, the Java ME Runtime Environment is divided into three elements. The first two elements provide the basic functionality needed to build applications. This includes a virtual machine, core APIs, GUI functionality, and support for performing input/output operations. The third element is composed of a set of optional APIs intended to extend the Java ME Runtime Environment with support for device-specific features. See Section 2.1.1 for a more complete description of the Java ME structure. There are mainly two APIs used to develop multimedia applications for Java-enabled phones today; both fall under the category of optional APIs. The Mobile Media API (MMAPI) is an API to play and record time-based media in Java ME. The Advanced Multimedia Supplements (AMMS) is a supplementary package to MMAPI; it adds support for media-related hardware features found in modern devices. Unfortunately, most Java platforms found on cellular phones today lack support for conversational real-time multimedia communication. This makes it hard to develop mobile real-time multimedia communication services using Java ME in its current form. A possible way to overcome this problem could be to extend the media APIs with components capable of making calls to native code. This would, however, affect the portability of the service in a negative way, but it would be acceptable in the scope of this thesis.

1.4 Purpose and Approach

The purpose of this thesis is to investigate whether the available media frameworks for Java-enabled cellular phones can be used to develop conversational real-time communication services. Since it is up to every device manufacturer to use the Java ME specifications and technologies to enable Java on their own devices, one implementation cannot represent Java ME in general. This is particularly important to keep in mind when it comes to the multimedia-related areas. The approach of this thesis has therefore been to divide the research work into the following two tracks.

• Investigate the Java ME structure and its APIs, both in general and, especially, the multimedia-related APIs.
• Conduct experiments with an implementation to test the possibilities and limitations.

As a part of the investigation, the following research problems have been identified to supplement the thesis purpose.

• Is it possible to record and send media data in real-time with performance good enough for conversational purposes?
• Is it possible to play back a media stream in real-time with performance good enough for conversational purposes?
• Can the multimedia capabilities added to Java ME with the Advanced Multimedia Supplements (AMMS) optional package be used to develop a real-time conversational service?
• If the multimedia capabilities of Java ME prove to be insufficient, how could the platform be extended with capabilities available in the native platform?
• Would it be possible to build a complete real-time conversational communication service in Java ME for the Sony Ericsson K800i mobile terminal with the current implementation?
• Are there any limitations in the current specifications of Java ME and its multimedia APIs that could prevent Java ME from being used to develop real-time conversational communication services?

1.5 Delimitation

Building a complete real-time communication service in Java ME is not within the scope of this thesis, for two reasons. First, it is at this point not known whether it is even possible to achieve. Second, if possible, it would require a significant amount of time. Instead, this thesis will be dedicated to investigating the possibilities to record and play back real-time media streams in Java ME. This will mainly be achieved by building small example applications with emphasis on particular functionality, rather than a complete communication service.

1.5.1 Mobile Terminals

The mobile terminal that will be used in this thesis is the Sony Ericsson K800i. That particular model was chosen since it is currently the main device used by Ericsson Research for testing prototype applications.

The Sony Ericsson K800i is equipped with the seventh Java Platform by Sony Ericsson (JP-7). The interested reader can find more information about JP-7, as well as its predecessors and the newly released JP-8, in the Developer Guidelines for Java ME available at http://developer.sonyericsson.com. During the experiments, two types of K800i terminals will be used. The first type is a regular phone that can be bought at any well-stocked cellular phone retailer. The second type, called a prototype phone, can be flashed with customized firmware. As a result, additional software can be installed on a prototype phone. This is not possible with the regular model, which will be limited to running Java applications only.

1.6 Related Work

The following paragraphs address existing work in the area of multimedia in Java ME.

The author of the book Pro Java ME MMAPI: Mobile Media API for Java Micro Edition [4], Vikram Goyal, has published an article about streaming content in Java ME. The article, called Experiments in Streaming Content in Java ME [5], is a complement to the book. According to the author, the topic of streaming was excluded from the book due to lack of support in the terminals at the time. In the experiment, the author uses a customized data source to stream video from a Darwin Streaming Server. The conclusion was that the Player needs to read the entire file before it can be played back, and that streaming would therefore not be possible.

The thesis Multimedia with J2ME in the Cellular Phone by Fredrik Dyrvold [6] investigates the multimedia support of the Mobile Information Device Profile 2.0 (MIDP 2.0). The author defines multimedia as: text, image, audio, video, animation and network handling. The conclusion is that MIDP 2.0 supports multimedia in all aspects except video, which is handled by the Mobile Media API.

JSR-281 specifies the IMS (IP Multimedia Subsystem) Services API [7] for Java ME. The purpose of the specification is to provide a high-level API to access IMS services in an integrated and consistent way. Its high-level design is intended to relieve the developer of IMS implementation details, and to ensure conformance to IMS-related standards. What qualifies the JSR-281 specification as related work to this thesis is that it adopts the MMAPI to handle media streams.

The I AM client is a conversational real-time multimedia communication prototype developed by Sony Ericsson Mobile Communications (SEMC) and Ericsson Research [8]. It supports presence, text chat, VoIP and single-duplex video sharing. It can only be run on prototype versions of the Sony Ericsson K800i. A limitation in the current implementation is that an ongoing voice call is required to be able to share video. The I AM client will be used as the streaming reference during the work with this thesis.


CHAPTER 2: Theory

This chapter will introduce the reader to the terminology and technologies used in the remainder of this report. The majority of this chapter addresses the structure of the Java ME platform, and how it can be extended. The sections about the Java ME platform in general are followed by an investigation of the more multimedia-specific components, and how these can be extended.

2.1 Java Platform, Micro Edition

Java Platform, Micro Edition, or Java ME (previously known as Java 2 Platform, Micro Edition or J2ME), is one of four currently available Java platforms, the other three being Java Platform, Standard Edition(1); Java Platform, Enterprise Edition(2); and Java Card. Java ME cannot be seen as a single piece of software, but rather as a set of technologies and specifications aimed at small devices. A short definition of Java ME could be: Java for small devices. The Java ME Technology section at the Sun Developer Network (SDN) [10] defines Java ME as:

    Java Platform, Micro Edition (Java ME) is a collection of technologies and specifications to create a platform that fits the requirements for mobile devices such as consumer products, embedded devices, and advanced mobile devices. It is a collection of technologies and specifications that can be combined to create a complete Java runtime environment specifically to fit the requirements of a particular device or market.

The idea of using Java on small devices dates back to Java's early days. Java, at the time known as Oak, was initially designed for consumer electronics [11]. To target the consumer electronics market, the Oak platform had to meet a set of specific criteria. To be able to operate on devices produced by different manufacturers, using a variety of hardware configurations, Oak had to be platform independent. For this reason, Oak code would be converted into an intermediate byte-code representation, which would then be executed by an interpreter on the target device. Another requirement was reliability. Functionality classified as a source of potential programmer-introduced errors, such as multiple inheritance in C++, was excluded to increase reliability. To utilize the limited memory resources efficiently, garbage collection was introduced. Finally, pointers were excluded for security reasons.

(1) Known as Java 2 Platform, Standard Edition, or J2SE, until version 6 [9]
(2) Known as Java 2 Platform, Enterprise Edition, or J2EE, until version 5 [9]

Later, the name Oak had to be changed due to copyright issues; during a trip to a coffee shop, the new name Java was thought up. At the time, it would prove to be hard to introduce Java in the consumer electronics market. The turning point for Java came with the World Wide Web. The Internet introduced requirements such as reliability, security and platform independence. A perfect match, since these requirements had been taken into consideration in the initial design of Java. To conclude this short part on Java history, Java ME can be seen as Java returning to what it initially was designed for — a technology for small devices.

2.1.1 Structure

Java ME is divided into three different elements: configurations, profiles and optional packages [12, 10]. A configuration targets a specific group of devices based on processing power and memory resources. On top of the configuration lies a profile. A profile targets a more specific kind of device, with more features in common than within a configuration. Support for features such as displaying graphics and storage management is located in the profile. Optional packages can be used to add support for even more device-specific features than those contained in the profile. Building an implementation combining these three elements on a device results in a complete Java ME runtime environment, or Java ME stack. Figure 2.1 shows two example Java ME stacks, and how these relate to other Java technologies.

2.1.1.1 Configurations

A configuration is the base of a Java ME stack, as illustrated in Figure 2.1 [12, 10]. It specifies a Java Virtual Machine and core functionality. The functionality in a configuration is intentionally limited for it to be as general as possible. Two devices with similar processing power and memory capacity should be able to share a common configuration, even if, for example, input and output are handled differently. There are currently two configurations available: the Connected Device Configuration (CDC)(3) and the Connected Limited Device Configuration (CLDC)(4).

The Connected Device Configuration is designed for more capable devices in terms of network capacity, processing power and memory resources. This is necessary since the CDC specifies that a full Java Virtual Machine (JVM) must be supported. Examples of CDC-capable devices are high-end Personal Digital Assistants (PDAs), smartphones, television set-top boxes and car navigation systems. The CDC is out of the scope of this thesis since it is not supported by the hardware used in the experiments (see the thesis delimitation, Section 1.5). The interested reader can find more information about CDC in the specification [13].

The word "Limited" in Connected Limited Device Configuration distinguishes it from its bigger sibling CDC. The CLDC has generally lower system requirements than CDC. For example, a portable device usually has, compared to a television set-top box, a limited power supply since it is battery powered, and the display has to be small enough to fit in the device. The set-top box can also have a static connection to the Internet, while a mobile device uses radio, and possibly shuts down radio components to save energy between transfer sessions. Processing power is also a battery-draining resource. CLDC is therefore the configuration widely adopted in mobile phones and similar devices today. The JVM in the CLDC specification is actually a concrete piece of software.
Implementors may, however, use their own implementation as long as it conforms to the CLDC specification.

(3) Latest version, 1.1.2, specified in JSR 218 [13]
(4) Latest version, 1.1, specified in JSR 139 [14]

[Figure 2.1: Java runtime environments. The figure shows Java EE and Java SE for servers and personal computers; the two Java ME stacks (CDC with the Foundation, Personal Basis and Personal Profiles for high-end PDAs, TV set-top boxes and embedded devices, and CLDC with the MID Profile for mobile phones and entry-level PDAs, both with optional packages on top); and Java Card with its Card VM for smart cards.]

This virtual machine is, due to its size, called the KVM, since it is a JVM whose size is measured in kilobytes rather than megabytes. A typical target device for the CLDC has the following characteristics [14, 15]:

• A 16- or 32-bit processor with a minimum clock frequency of 16 MHz
• A minimum of 160 KB of non-volatile memory available for libraries and the virtual machine (KVM)
• At least 32 KB of RAM (Random Access Memory) available to the Java platform
• Limited power supply, often battery powered
• Network connectivity, often intermittent and with limited bandwidth (9600 bps or less)
• Manufactured in big volumes
• A user interface varying from sophisticated to none at all

2.1.1.2 Profiles

A profile makes up the middle layer in the Java ME stack. A configuration combined with a profile defines a complete Java Runtime Environment [12, 10]. The profile makes the stack more specific toward a family of devices with more features in common.

Devices using the same profile have similar methods for inputting and displaying information, and can therefore share APIs concerning user input and representation of graphical interfaces. The profiles available for the CDC are: the Foundation Profile (JSR 219), the Personal Basis Profile (JSR 217) and the Personal Profile (JSR 216). These profiles on top of CDC, as shown in Figure 2.1, make up the Personal Java application runtime environment. As with CDC, these profiles lie outside the scope of this report and are therefore not addressed further.

The Mobile Information Device Profile (MID Profile or MIDP)(5) is, on the other hand, designed to work on top of the CLDC, as illustrated in Figure 2.1. The combination of CLDC and MIDP 2.0 is the most popular Java Runtime Environment adopted for mobile devices today, including cell phones and PDAs [17]. A Java application designed to run on the MIDP is called a MIDlet. The word MIDlet is a combination of MID, taken from the MID Profile, and the suffix let, used for Java applications such as Applets and Servlets. The term MIDlet will be used frequently in the remainder of this report.

MIDP 2.0 specifies several important APIs needed to develop applications on a mobile device. The lcdui package introduces classes for building graphical user interfaces (GUIs) on small displays and for handling input. It includes components such as forms, on which items such as text, images and input fields can be appended, and several types of lists which can be used for building menus, among other things. When using these kinds of graphical components to build the user interface, the programmer leaves much of the structuring and placement to the system. The programmer only appends components in the order he/she wants them to appear. It is then up to the system to organize and structure the GUI to fit the screen, and other limited resources, of the particular device. This approach makes the GUI very portable, but limited. For making more general GUIs, the lcdui package provides a Canvas class which allows the screen to be decorated freely.

Another important package in the MIDP 2.0 API is the midlet package. It contains the abstract class MIDlet, from which every MIDlet is derived. The MIDlet class controls the MIDlet life cycle and allows Java to make platform requests. The life cycle is needed to control the execution of the MIDlet, since external events, such as an incoming call, will pause the MIDlet execution. An example of a platform request is to initiate a call from a MIDlet by specifying a number, and relying on platform functionality to handle the actual call. (A minimal MIDlet skeleton is sketched after the list of device characteristics below.) The useful java.util package, known from Java SE, is also a part of MIDP 2.0. For a complete reference of the MID Profile, see [18].

A typical target device for MIDP 2.0 has the following characteristics [19]:

• An additional 256 KB, at least, of non-volatile memory beyond the 160 KB required by CLDC
• A minimum of 8 KB of non-volatile memory for persistent application data
• At least 128 KB of RAM available for the Java runtime
• Two-way network connectivity with limited bandwidth (possibly intermittent)
• A screen with a resolution of at least 94x54 pixels and a color depth of at least 1 bit
• At least one input device of type: keyboard, keypad or touch screen
• Capable of playing tones, by either hardware or software

(5) Latest version, 2.0, specified in JSR 118 [16]
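Tying the life-cycle and lcdui pieces together, a minimal MIDlet skeleton could look as follows. This is a sketch: the class name and form contents are illustrative only, and the three life-cycle methods are invoked by the device's application management software, never by the MIDlet itself.

    import javax.microedition.lcdui.Command;
    import javax.microedition.lcdui.CommandListener;
    import javax.microedition.lcdui.Display;
    import javax.microedition.lcdui.Displayable;
    import javax.microedition.lcdui.Form;
    import javax.microedition.midlet.MIDlet;

    public class HelloMIDlet extends MIDlet implements CommandListener {
        private final Command exitCommand = new Command("Exit", Command.EXIT, 1);

        // Called by the system when the MIDlet is started or resumed.
        protected void startApp() {
            Form form = new Form("Hello");
            form.append("Hello, Java ME!");
            form.addCommand(exitCommand);
            form.setCommandListener(this);
            Display.getDisplay(this).setCurrent(form);
        }

        // Called when an external event, such as an incoming call,
        // pauses the MIDlet; shared resources should be released here.
        protected void pauseApp() {
        }

        // Called when the MIDlet is about to be terminated.
        protected void destroyApp(boolean unconditional) {
        }

        public void commandAction(Command c, Displayable d) {
            if (c == exitCommand) {
                destroyApp(true);
                notifyDestroyed(); // tell the system we are done
            }
        }
    }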

2.1.1.3 Optional Packages

Optional packages, or optional APIs, are found at the top of the Java ME stack. Unlike a profile, an optional package does not define a complete application environment, but is used to extend the runtime environment made up of the configuration and profile [12, 10, 20]. A reason why the functionality found in an optional package is not included in the profile may be that it is not general enough, and that different profiles need to share that particular functionality. An advantage of the separation is that the profile only needs to contain functionality general enough for the specific device type, and many profiles do not have to contain the same functionality, but can rather use the optional package. It is up to the device manufacturer to support optional packages in the runtime environment. Optional packages can, unfortunately, not be added by a third-party Java developer. This makes it impossible for a third-party developer to extend or modify the Java Runtime Environment's support for hardware features. Examples of optional packages are the Wireless Messaging API 2.0 (JSR 205, often available in devices capable of sending and receiving SMS), the Java APIs for Bluetooth (JSR 82) and the Mobile Media API (JSR 135). A more complete list of optional packages can be found in [12].

2.1.2 Java ME in Mobile Phones

A major problem when developing services for mobile phones using Java ME is platform fragmentation. It is a result of the varying capabilities of Java-enabled mobile phones. Varying capabilities result in different optional packages being available, which limits the portability of MIDlets. Two attempts have been made to specify a default set of optional packages. The Java Technology for the Wireless Industry (JSR 185) [21] was the first to do this. More recently, the Mobile Service Architecture (JSR 248) [22] is setting a new standard.

2.2 Multimedia in Java ME

As mobile phones have developed from being used for telephony only into capable multimedia devices, the basic multimedia support in the MID Profile has become insufficient. To fully utilize the capabilities of new devices, optional packages have been specified to extend the multimedia support in Java ME. The Mobile Media API (MMAPI) and the Advanced Multimedia Supplements (AMMS) are examples of optional packages designed to give the developer access to advanced multimedia features on more capable multimedia devices [4, 23].

2.2.1 Mobile Media API

The Mobile Media API, or MMAPI, is the optional package used to play and capture time-based media, such as audio and video, on Java ME enabled devices [24]. A small subset of MMAPI is actually included in MIDP 2.0, but the included packages only allow simple audio playback. MMAPI is, however, not limited to running on the MID Profile [4]. Because of the limited resources available in the target devices, a key design goal of MMAPI was to make it a low-footprint API; MMAPI has to make do with what remains after CLDC and MIDP have allocated their required shares of the available memory. Since the CLDC and MIDP can be adapted to a wide range of devices with varying complexity, one cannot assume that the same media formats are supported on every device. Another design goal of MMAPI was therefore to make it independent of protocols and media formats. This is done by specifying parts of the API as interfaces and factories. It is then left to the device manufacturers to supply implementations of the interfaces based on the capabilities of their devices. By doing so, the API is spared from hard-coded functionality that may not be available on all devices, and new formats can be supported easily. The downside of this approach is that MIDlets written for a particular device cannot expect the same media formats to be available on every other device. To partly cope with this problem, a MIDlet can query the system and discover whether a particular media format is supported. The MIDlet can then adapt and run supplementary code to handle the situation; for example, it could choose a different media format or tell the user that the operation is not supported. This is similar to the way a MIDlet can query whether an implementation of a particular optional package is available on the device.

2.2.1.1 Components

MMAPI consists of four main components [23]. These four components, and a simplified scheme of how they relate to each other, are shown in Figure 2.2. For a complete list of fields and methods, see the API documentation of JSR-135 [25].

[Figure 2.2: The four main MMAPI components — (1) the Player interface, (2) the Manager class, (3) the abstract DataSource class, composed of SourceStreams, and (4) the Control interface; Player, DataSource and SourceStream implement the Controllable interface.]

The first main component is the Player interface (1). The Player is a common interface used to control playback of different media types and formats. The idea is that managing playback of an H.263 video file is not done differently from playing an MPEG-4 video or an MP3 audio file; starting, pausing and stopping playback are done the same way, using methods in the Player interface. Which implementation of the Player interface is actually used to play a media file is decided by the media MIME type. The mapping between MIME types and Player implementations is done by the Manager class when creating a Player.
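Since both the capability queries mentioned above and Player creation go through static methods on the Manager class, the query mechanism can be sketched as follows. This is a minimal sketch; the MIME type strings a device actually reports vary between implementations.

    import javax.microedition.media.Manager;

    public class FormatSupport {
        // Returns true if the given MIME type can be played over any protocol.
        public static boolean supportsContentType(String mimeType) {
            // Passing null asks for the content types supported over all protocols.
            String[] types = Manager.getSupportedContentTypes(null);
            for (int i = 0; i < types.length; i++) {
                if (mimeType.equals(types[i])) {
                    return true;
                }
            }
            return false;
        }
    }

A MIDlet could, for example, call FormatSupport.supportsContentType("video/mpeg") at startup and fall back to an audio-only mode if the format is unsupported.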

As seen in Figure 2.2, the Player implements the Controllable interface. This enables more advanced options than the default methods in the Player interface; controls are addressed further below. It is not possible to extend the capabilities of an MMAPI implementation by defining customized Players.

The next main component in MMAPI is the Manager class (2). The Manager is used to query the system about support for content types and protocols, and it also has a static method for playing simple tones. Its main responsibility is, however, to create Players. The Manager class contains three factory methods used to create Players, depending on how the media will be accessed. The first factory method, createPlayer(String locator), is used to create a Player from a media locator referring to a file in the file system or located on the World Wide Web. The second method, createPlayer(InputStream stream, String type), creates a Player by specifying an input stream from which the Player will read data. Since the data in the stream is read as pure bytes, the Manager needs additional information about the media content to be able to provide a suitable Player; the purpose of the second parameter is therefore to specify the MIME type of the data contained in the stream. The final factory method, createPlayer(DataSource source), accepts an MMAPI DataSource. This method is called internally by the Manager even if the media is specified as a locator or stream. In MMAPI's "bigger brother" for Java SE, the Java Media Framework (JMF), it is possible to register a new player with the Manager and associate it with a media type. However, since customized players cannot be created in MMAPI, this functionality is absent as well.

The DataSource is the third main component in MMAPI (3). As the name suggests, it is used to represent a source of data. The Player's only concern is to connect to the DataSource and read data. The DataSource can, on the other hand, internally manage a complex protocol to retrieve its data from, for example, the network or local storage on the device. The DataSource is, as shown in Figure 2.2, composed of one or more SourceStreams. A video file usually contains two streams, separating audio and video. A SourceStream represents a single media data stream in the DataSource. As seen in Figure 2.2, both DataSource and SourceStream implement the Controllable interface.

The final main component of MMAPI is the Control interface (4). The idea behind the system with controls is to enable a Player to provide more control, specific to the media type it is currently managing, than the default methods provided by the Player interface itself. Since the Player interface implements the Controllable interface, it must provide methods to retrieve a specific Control, or an array of all available Controls, for the current Player. Examples of controls are the VolumeControl, MetaDataControl and RecordControl interfaces. See [23] or [25] for a complete list of standard controls in MMAPI.
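Returning to the Manager's three factory methods, the following sketch makes them concrete. The file name is an illustrative assumption, and the HTTP locator is the example used later in Experiment 3; each call may throw an IOException or a MediaException.

    import java.io.InputStream;
    import javax.microedition.media.Manager;
    import javax.microedition.media.Player;

    public class PlayerCreation {
        public Player fromLocator() throws Exception {
            // 1. From a locator string; supported protocols vary per device.
            return Manager.createPlayer("http://www.mmedia.com/demo.mp4");
        }

        public Player fromStream() throws Exception {
            // 2. From an InputStream plus an explicit MIME type, since a raw
            //    byte stream carries no type information of its own.
            InputStream is = getClass().getResourceAsStream("/clip.mpg");
            return Manager.createPlayer(is, "video/mpeg");
        }

        // 3. createPlayer(DataSource source) is called internally for the two
        //    cases above, but can also be called directly with a custom
        //    DataSource (see Experiment 3 in Chapter 3).
    }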
2.2.1.2 Player Life Cycle

Starting a Player in MMAPI can be done with different levels of programmatic control [25, 12]. The most basic way is simply to let the Manager create a Player and then start it right away. But the Player startup time (in this context, the time between when a Player is commanded to start playing and when it actually starts to render the media) varies greatly. The media type and location are two factors affecting the startup time. To provide greater control over the Player's internal operations, a set of Player states has been defined. When started directly, the Player will traverse all necessary states before the media is rendered. But the Player can also be operated by moving it to a particular state on command. This enables a MIDlet to prepare a Player for playback and only do minimal work when the user hits the play button, thereby minimizing the user-perceived startup time. The Player state machine is described in Figure 2.3.

[Figure 2.3: The MMAPI Player state machine, showing the states UNREALIZED, REALIZED, PREFETCHED, STARTED and CLOSED, and the realize(), prefetch(), start(), stop(), deallocate() and close() transitions between them.]

Initially, a newly created Player is always in the UNREALIZED state. In this state, the Player does not have enough information to start playing. Besides moving to another reachable state, the only useful operation that can be performed on a Player in the UNREALIZED state is to call the setLoopCount() method to set the number of times the Player should loop. Calling almost any other method will result in an IllegalStateException being thrown.

Calling realize() on an UNREALIZED Player results in an attempt to move it to the REALIZED state. In case of a media error during the realization process, a MediaException is thrown. Depending on the media source, realizing a Player could involve reading a file locally or communicating with a remote server. This makes it a potentially resource- and time-consuming process. The realization process can therefore be aborted by calling the deallocate() method on the Player instance before the state transition is completed; the Player will then go back to the UNREALIZED state. When realized, the Player is likely to have acquired the resources it needs, except exclusive system resources such as audio devices [25]. It has also gathered enough information to provide the Controls supported by the source media.

Once realized, the next logical state is the PREFETCHED state. This transition is accomplished by calling the prefetch() method. The process of prefetching includes operations such as buffering media data or acquiring exclusive resources. To release scarce resources, the deallocate() method can be called to transfer the Player back to the REALIZED state.
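The preparation steps described so far can be sketched as follows; start() and the remaining states are described below. The locator is an assumption, and exception handling is reduced to the essentials.

    import javax.microedition.media.Manager;
    import javax.microedition.media.MediaException;
    import javax.microedition.media.Player;
    import javax.microedition.media.PlayerListener;

    public class PreparedPlayback implements PlayerListener {
        private Player player;

        // Walk the Player to PREFETCHED ahead of time, so that start()
        // has minimal work left when the user hits the play button.
        public void prepare(String locator) throws Exception {
            player = Manager.createPlayer(locator);
            player.addPlayerListener(this);
            player.realize();   // gather information, acquire non-exclusive resources
            player.prefetch();  // buffer data, acquire exclusive resources
        }

        public void play() throws MediaException {
            player.start();     // PREFETCHED -> STARTED
        }

        // Asynchronous notifications about state changes and errors.
        public void playerUpdate(Player p, String event, Object eventData) {
            if (PlayerListener.END_OF_MEDIA.equals(event)) {
                // The Player has automatically returned to PREFETCHED;
                // close it here if it is no longer needed.
                p.close();
            }
        }
    }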

Once in the PREFETCHED state, the time it would take to start up the Player on an incoming play request has been reduced to a minimum. The Player is finally started by calling the start() method. This moves the Player into the STARTED state, which indicates that it is processing media data. When the end of the media is reached, the Player will automatically return to the PREFETCHED state. The same happens if stop() is called to pause the playback. The close() method can be called from any state, and results in moving the Player to the CLOSED state, where most of the resources allocated by the instance are released [25]. Apart from examining the current state, no useful operations or state transitions can be made in the CLOSED state.

The state-transitioning methods are called synchronously. That is, the methods will not return before the particular task is completed, unless prematurely aborted by an exception being thrown. However, implementing the PlayerListener interface, and registering as a listener, allows a MIDlet to receive asynchronous events about Player state changes. This is also useful for reacting to unpredictable events, such as end-of-media and error notifications.

2.2.2 Advanced Multimedia Supplements

Features traditionally found in different types of devices are now being integrated under the same shell. A modern mobile terminal is often equipped with an RDS radio receiver, a megapixel camera and an MP3 player. Increased processing power allows the device to perform more advanced tasks, such as media processing [26]. The Mobile Media API was designed to be flexible and independent of platform, media formats, protocols and other features supported by the device. But as devices are being equipped with more advanced multimedia features, the functionality found in MMAPI has become insufficient. The Advanced Multimedia Supplements (AMMS) optional package is designed to extend the multimedia capabilities of MMAPI. AMMS is thus not a replacement for MMAPI but rather, as the name suggests, a supplementary package. The main focus of the additions in AMMS is better support for advanced cameras, radio and audio processing. Using the additional controls introduced in AMMS allows the MIDlet developer to better control camera features such as automatic focus, flash, image format and resolution. AMMS also allows a radio player to change frequency with a control, rather than closing the player and creating a new one with the correct frequency, which was the case with MMAPI.
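Since AMMS delivers its additions as extra Control types, they are retrieved through the same getControl() mechanism as the standard MMAPI controls. A hedged sketch follows, assuming the device exposes the JSR-234 FlashControl; whether a given device implements this control, and which modes it supports, is entirely implementation-dependent.

    import javax.microedition.amms.control.camera.FlashControl;
    import javax.microedition.media.Player;

    public class CameraFlash {
        // Tries to switch the camera flash to automatic mode.
        public static void enableAutoFlash(Player capturePlayer) {
            FlashControl flash = (FlashControl) capturePlayer.getControl(
                    "javax.microedition.amms.control.camera.FlashControl");
            if (flash != null) {
                // AUTO is one of the modes defined by JSR-234; a device
                // may support only a subset of the defined modes.
                flash.setMode(FlashControl.AUTO);
            }
        }
    }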


CHAPTER 3: Experiments

This chapter presents seven experiments conducted during the work with this thesis. The common purpose of these experiments is to evaluate the native multimedia support in the seventh Java ME platform from Sony Ericsson (JP-7), as well as how it can be extended. The chapter starts by briefly describing the tools that were used and how the experiments are structured. The experiment topics were chosen to investigate the multimedia features of the platform, first in a general way, then more specifically toward real-time multimedia. Experiment 5 was, however, not initially planned, but was conducted as a result of Experiment 4. Before continuing with the content of this chapter, this section is concluded with a short description of each experiment.

1. The first experiment explores the media recording capabilities of the Java ME platform on the Sony Ericsson K800i mobile terminal. (See the Delimitation section (1.5) for which hardware is used in this thesis.)

2. Continuing on the subject of recording, the second experiment investigates the possibilities to access recorded media data in real-time.

3. In the third experiment, focus changes to media playback. This experiment evaluates different methods to implement a custom protocol in MMAPI and create Players.

4. The fourth experiment explores several alternatives to extend the JP-7 implementation of MMAPI to enable playback of real-time streams.

5. The fifth experiment is dedicated to investigating the Player-related problems encountered in the previous experiment.

6. Continuing on the subject of extending MMAPI, this experiment uses the results from the previous experiment to try to achieve better results than those of Experiment 4.

7. The last experiment tries a different approach by invoking a high-level Player normally intended to play RTSP streams.

3.1 Tools

The process of creating and deploying a MIDlet on a mobile device consists of several steps. First, the MIDlet code has to be built, preverified and packaged. Then it has to be transferred to the mobile device and installed. To facilitate the MIDlet creation process, an Integrated Development Environment (IDE) based on the Eclipse SDK was used. The Eclipse ME plugin, configured together with the Sony Ericsson Java ME SDK, enabled MIDlets to be created and packaged from within the Eclipse environment. The MIDlets were then transferred to the device and installed using the Serial Proxy application shipped with the Sony Ericsson Java ME SDK. The Serial Proxy also provides a dialog, from within the Eclipse environment, to start, stop and remove MIDlets from the device, as well as forwarding messages written to the standard output and error streams to the Eclipse console. This was very useful for debugging purposes.

3.2 Experiment Structure

In the following sections, every experiment is structured into the same three subsections. The purpose is to clarify the different steps in the experimental process as well as to provide consistency between the experiments. These three building blocks are summarized by the bullets below and described in more detail in the following paragraphs.

• Short introduction, purpose and additional background information
• Implementation
• Results and observations

Every experiment starts with a short introduction briefly describing, for example, how the current experiment relates to the previous one. The introduction also contains a short statement presenting the purpose of that particular experiment. The purpose clearly states what is to be investigated and achieved with the experiment. The accompanying background information provides the reader with knowledge about various technologies used in the experiment, as well as other relevant pieces of information known before the experiment was conducted.

The second subsection, titled Implementation, describes how the experiment was conducted. It only contains brief descriptions of the implementations, to preserve space; more information can be found in the appendices.

The implementation subsection is followed by the third, and final, subsection — Results and observations. Its purpose is to present any generated results. The experiment is concluded by relating back to the initial purpose to see whether it has been achieved. Problems that may have had implications for the result are also presented here. If an experiment has multiple implementation steps, it is sometimes divided into separate implementation sections, in which case every implementation section is followed by a results and observations section of its own. An experiment of this kind is, however, concluded with a summary of the results.

3.3 Experiment 1 — Recording Audio and Video

The purpose of this experiment is to explore the possibilities to record audio and video with the Sony Ericsson K800i using Java ME. For example, one area to examine is whether audio and video can be captured simultaneously and/or separately. Since the Sony Ericsson K800i is equipped with an additional front camera for video calls, the possibilities to use that camera will be investigated as well.

Recording audio and video in Java ME is done using functionality provided by the Mobile Media API (MMAPI) optional package [27]. (See Section 2.2.1 for more information about MMAPI.) The static factory method createPlayer(String locator), provided by the Manager class, is used to create a Player, but instead of specifying the location of a media file, a special locator is used for capturing live media. The locator is defined as:

    "capture://" device [ "?" media_encodings ]

where the capturing device is specified as:

    device = "audio" / "video" / "audio_video" / dev_name

For further information about specifying custom devices (dev_name) and optional media encodings (media_encodings), see the Manager documentation in the MMAPI reference API [25].

3.3.1 Implementation

A prototype MIDlet was created to test the recording capabilities of the MMAPI implementation on the Sony Ericsson K800i. Briefly described, the MIDlet consists of two threads: the main thread handles the GUI and user input, while the second thread takes care of the multimedia functionality. Capturing and playback are handled by two separate Player instances and their associated controls. Appendix A.1 describes the components of the RecordingMIDlet in two separate UML diagrams, one for each Player instance.

3.3.2 Results and Observations

The first area to investigate was capturing different media types with the K800i device: initially audio only, then video only, and finally audio and video simultaneously. The results of the first recording experiment are summarized in Table 1.

    Capture locator          Prototype K800i     Regular K800i
    capture://audio          audio recorded      audio recorded
    capture://video          no RecordControl    no RecordControl
    capture://audio_video    audio recorded      audio & video recorded

    Table 1: Recording results

The first device to be tested was the K800i prototype version. Audio was captured and played back as expected. The second media type was video only. It was no problem to extract the VideoControl and show the camera output in the view finder; it was, however, not possible to retrieve a RecordControl using the video-only locator. Evidently, recording video separately is not supported. The third, and final, capture locator was the combined audio and video locator. The view finder could be started without problems and the RecordControl could be retrieved. However, when playing back the data from the capture session, it appeared that only audio was recorded.
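For reference, the capture-and-record flow exercised in these tests can be sketched as follows. This is a minimal sketch: the Canvas for the view finder is assumed to exist, and exception handling is collapsed into a single throws clause.

    import java.io.ByteArrayOutputStream;
    import javax.microedition.lcdui.Canvas;
    import javax.microedition.media.Manager;
    import javax.microedition.media.Player;
    import javax.microedition.media.control.RecordControl;
    import javax.microedition.media.control.VideoControl;

    public class CaptureSketch {
        public void captureAndRecord(Canvas viewFinder) throws Exception {
            // Create a capture Player with the combined audio and video locator.
            Player player = Manager.createPlayer("capture://audio_video");
            player.realize();

            // Show the camera output in a view finder on the given canvas.
            VideoControl vc = (VideoControl) player.getControl("VideoControl");
            vc.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, viewFinder);
            vc.setVisible(true);
            player.start();

            // Route the recorded data to an in-memory buffer.
            RecordControl rc = (RecordControl) player.getControl("RecordControl");
            ByteArrayOutputStream sink = new ByteArrayOutputStream();
            rc.setRecordStream(sink);
            rc.startRecord();

            // ... record for a while ...

            rc.stopRecord();
            rc.commit(); // the recorded data is now complete in the sink
        }
    }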

The initial intention was to use prototype phones only, but the results when using the combined locator on a prototype phone motivated a second test using a regular K800i. The audio-only and video-only locators produced the same results as with the prototype phone. However, the Player created with the combined locator was able to record both audio and video. During the tests, a strange side effect was noticed: the view finder displayed the camera output normally, but when the actual recording started, the screen was rotated 90 degrees counterclockwise.

The second point mentioned in the purpose of this experiment was to use the video-call camera located on the front of the device. Even though this is supported in the MMAPI specification, by specifying a device in the capture locator, there is no available documentation describing how to do it on the K800i. The question has been discussed at the Sony Ericsson Developers Forums [28], but a representative from Sony Ericsson claims that it is, as far as he knows, not possible.

3.4 Experiment 2 — Real-Time Recording

Experiment 1 investigated the general recording support in Java ME on the Sony Ericsson K800i by recording a piece of media data, and then stopping the recording to play it back. This experiment continues on the subject of recording but, as opposed to the previous experiment, focuses on the real-time aspect. The purpose of this experiment is to investigate whether the MMAPI implementation on the Sony Ericsson K800i can be used to record video data in real-time. In the context of this experiment, recording data in real-time means making the recorded data available to the MIDlet in real-time. This would be necessary to be able to develop a real-time communication service, where a terminal has to act both as a receiver and a sender of media data.

This experiment continues to build on the prototype MIDlet from the previous experiment. The addition consists of a way to monitor the internal activity in the output stream connected to the capturing Player. The ByteArrayOutputStream, used as the data sink in the previous experiment, contains a method size() which returns the size of its internal buffer. This method could be called periodically to monitor the buffer. Even though this solution would reveal whether data is written to the stream during recording, the stream would have to be polled for size updates. The Implementation section proposes a more complete solution that could be extended to support, for example, sending the recorded data to a receiver over radio.

3.4.1 Implementation

For this experiment, the RecordingMIDlet from Experiment 1 is modified by replacing the ByteArrayOutputStream with a customized implementation of the OutputStream abstract class. By implementing the abstract write method, it is possible to access the data directly when it is written to the OutputStream. A complete prototype would include a packetizer and eventually send the packets to the recipient over the network, but for this experiment it is sufficient to log the activity in the customized output stream. Appendix A.2 contains a small UML diagram describing the modifications done to the RecordingMIDlet compared to Experiment 1.
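The customized stream could look roughly as follows. This is a sketch of the idea only; the actual LoggingOutputStream used in the experiment is described in Appendix A.2.

    import java.io.IOException;
    import java.io.OutputStream;

    public class LoggingOutputStream extends OutputStream {
        private final long start = System.currentTimeMillis();

        // Called by the RecordControl implementation whenever recorded
        // data is delivered; a complete prototype would packetize and
        // send the data from here instead of just logging.
        public void write(int b) throws IOException {
            log();
        }

        public void write(byte[] b, int off, int len) throws IOException {
            log();
        }

        private void log() {
            System.out.println("(LOS) write at t = "
                    + (System.currentTimeMillis() - start));
        }
    }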

3.4.2 Results and Observations

The log messages in MIDlet Output 1 were produced when running the RecordingMIDlet with the LoggingOutputStream customized output stream. Every line is prefixed with a short text hinting at which component produced the message. The entire output is described in detail below.

    MIDlet Output 1: RecordingMIDlet Output
    RM  : RecordingMIDlet
    LOS : LoggingOutputStream

     1 (RM)  starting up at t = 0 (ms)
     2 (RM)  record/playback thread started
     3 (RM)  capturePlayer: SystemClass290@ff91a2b7
     4 (RM)  player update - event: started
     5 (RM)  recording started at t = 15004
     6 (RM)  player update - event: recordStarted
     7 (RM)  recording stopped at t = 23149
     8 (RM)  player update - event: recordStopped
     9 (RM)  player update - event: stopped
    10 (LOS) write at t = 23801
    11 (LOS) write at t = 23804
    12 (LOS) write at t = 23806
    13 (LOS) write at t = 23810
    14 (LOS) write at t = 23820
    15 (LOS) write at t = 23821
    16 ...

Line 1 shows the startup message of the RecordingMIDlet and represents the start of time counting (t = 0). The next three lines are produced when the Player is initialized to show the camera output in the view finder. At line 5, the recording has started at t = 15004, and the following line shows the recordStarted event posted by the player. The recording is stopped and committed at t = 23149 without anything yet having been logged by the LoggingOutputStream. Lines 8 and 9 show the events posted by the player as a reaction to the recording being stopped.

All activity in the LoggingOutputStream evidently takes place after the recording has been stopped and committed (from line 10 and onwards). The timestamps show that all write operations occurred later than the point where the recording was stopped, so we are not dealing with delayed outputs. The small time differences between the calls suggest that the data is copied from an internal buffer to the output stream piece by piece. Unfortunately, the MMAPI specification states that the commit method implies a call to the stop method. This prevents small chunks of data from being committed periodically, which could otherwise have been a possible, although not particularly good, way to get around this problem.

3.5 Experiment 3 — Custom Protocol Player

As opposed to the two previous experiments, which dealt with the topic of recording, this experiment focuses on media playback; more precisely, playback using an access protocol not natively supported by the MMAPI implementation. The purpose of this experiment is to investigate how a custom streaming protocol could be implemented in MMAPI. If a suitable alternative is found, a prototype MIDlet should be created to test the implementation.

As described in the theory chapter on MMAPI components (Section 2.2.1.1), an MMAPI Player can be created from a locator string, an input stream or a DataSource. Which method to choose depends primarily on which protocol will be used to access the media.

Using a string locator is a simple and powerful way to create a Player. According to the JSR-135 (MMAPI) specification, a locator is specified in the form [25]:

    scheme ":" scheme-specific-part

where scheme defines the access protocol. Obviously this form is very general. An example of a locator is the capture locator addressed in a previous experiment. Two examples of media locators for playback are "http://www.mmedia.com/demo.mp4" and "file:///c:/other/audio.amr". The first example describes a locator to access an MPEG-4 file located on a web server using HTTP. The second locator is used to play an AMR audio file from the local file system. The protocols that can be used with a locator are, however, limited by the MMAPI implementation on the particular device. The Sony Ericsson K800i, for example, supports the FILE, HTTP and RTSP protocols for media playback. Since there is no way to register new protocols in MMAPI, it is not possible to use a string locator for a custom protocol.
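For reference, the three creation methods offered by the Manager factory are shown side by side in the sketch below. The locator is the example URL from above, and the stream-based variant assumes an AMR stream is supplied by the caller; the class and method names are our own.

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.media.Manager;
    import javax.microedition.media.MediaException;
    import javax.microedition.media.Player;
    import javax.microedition.media.protocol.DataSource;

    // The three ways the Manager can create a Player (JSR-135).
    public class PlayerCreation {
        public static Player fromLocator() throws IOException, MediaException {
            // Protocol support (here HTTP) depends on the device's MMAPI implementation.
            return Manager.createPlayer("http://www.mmedia.com/demo.mp4");
        }

        public static Player fromStream(InputStream in) throws IOException, MediaException {
            // The content type must be given explicitly, since a bare stream
            // carries no scheme or file extension to deduce it from.
            return Manager.createPlayer(in, "audio/amr");
        }

        public static Player fromDataSource(DataSource source) throws IOException, MediaException {
            // The approach evaluated in the remainder of this experiment.
            return Manager.createPlayer(source);
        }
    }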
The other two supported ways to create a Player are from an InputStream or from a DataSource. These are both abstract classes, which means that they are meant to be extended. The remainder of this experiment evaluates which of the two is most suitable for handling a custom streaming protocol.

Theoretically, both the InputStream and the DataSource could be used to implement a custom protocol. The reason for this is that both leave the implementation of the method that retrieves data to the player to the extending class. The standard Java abstract class InputStream is, as the name suggests, a stream used for reading. By extending the InputStream and implementing the abstract read() method, which reads the next single byte from the stream, the extending class can read data from anywhere, for example from a file, the network or any other readable source. The data can then be processed in any way and forwarded to the object reading the stream. The DataSource abstract class is not a stream itself, but is composed of one, or several, SourceStreams. See the section about the main components in MMAPI (2.2.1.1) for more information about the DataSource abstract class and the SourceStream interface. Similar to the standard InputStream, the implementation of the method that reads data is left to the class implementing the SourceStream interface, which provides the same flexibility as the InputStream.

The MMAPI specification [25] mentions two particular reasons to use a DataSource for a custom protocol as opposed to an InputStream. First, the DataSource supports a random seeking API which enables a standard way to freely jump to any position in the media. But since the purpose of this experiment is to investigate a suitable way to implement a real-time streaming protocol, random seeking is not very useful. Second, the DataSource/SourceStream supports setting a "logical" media data transfer size, useful for frame-delimited data such as video. In practice, a method is used to set the minimum size of the buffer used when reading data from the SourceStream. This feature could be useful, since video data will be arriving in packets with, most commonly, one frame per packet.

It would be more complicated to use an InputStream in a similar way, due to its design. The abstract class InputStream has three separate methods for reading: int read(), int read(byte[] b) and int read(byte[] b, int off, int len). All methods have integers as return values, but the meaning differs among them. In the first method, the return value contains the byte of data read from the stream. The other two methods store the retrieved data in the buffer argument, and the return value is then the number of bytes read from the stream. Since only the first method is abstract, it is the only one the extending class is required to supply an implementation for. The predefined implementations of the other two methods then use the first method to read multiple bytes, so reading a number of bytes with one of the two latter methods implies calling the first method once for every byte read. Implementing the delivery mechanism of the custom protocol in a method reading a single byte is not very suitable, since the incoming data is not delivered as single bytes. To get around this problem, incoming data chunks would have to be buffered and then delivered to the Player one byte at a time, or the default implementations of the two latter methods, capable of reading bigger data chunks, would have to be overridden as well. This is, however, not worth doing since there is a better alternative.
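To make the problem concrete, the following sketch shows roughly what the discarded InputStream approach would require. The packet source is hypothetical (represented by a receiveNextPacket() placeholder), and the point is that both the abstract single-byte read() and the chunk-oriented read(byte[], int, int) end up being implemented around the same internal buffer.

    import java.io.IOException;
    import java.io.InputStream;

    // Sketch of the buffering an InputStream-based protocol would need.
    // receiveNextPacket() is a hypothetical placeholder for code that
    // blocks until the next protocol packet has arrived (null = end of stream).
    public abstract class PacketInputStream extends InputStream {
        private byte[] packet;   // most recently received packet
        private int pos;         // read position within the packet

        protected abstract byte[] receiveNextPacket() throws IOException;

        // The only abstract method in InputStream: deliver one byte at a time.
        public int read() throws IOException {
            if (!ensurePacket()) {
                return -1;   // end of stream
            }
            return packet[pos++] & 0xff;
        }

        // Must be overridden as well; the inherited version would call
        // read() once for every single byte.
        public int read(byte[] b, int off, int len) throws IOException {
            if (!ensurePacket()) {
                return -1;
            }
            int n = Math.min(len, packet.length - pos);
            System.arraycopy(packet, pos, b, off, n);
            pos += n;
            return n;
        }

        private boolean ensurePacket() throws IOException {
            while (packet == null || pos >= packet.length) {
                packet = receiveNextPacket();
                pos = 0;
                if (packet == null) {
                    return false;
                }
            }
            return true;
        }
    }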

Conveniently, the SourceStream interface contains only one method for reading, with a signature similar to the third read method of the InputStream described above. Besides random seeking and transfer size, the DataSource abstract class provides additional functionality not supported by the more general InputStream. For example, the DataSource declares a set of abstract methods for connecting to the source and preparing data to be read. These methods can be used to handle setup operations for the streaming protocol. Additionally, both the DataSource abstract class and the SourceStream interface implement the Controllable interface, which enables querying for controls.

To summarize the literature investigation of the different methods to create a Player, using the DataSource abstract class seems most suitable for handling a custom protocol. The DataSource, which was designed specifically for delivering media data to a Player, is clearly better equipped than the more general InputStream abstract class. Even though most of the functionality could be added by extending InputStream, the DataSource has predefined methods and routines for media-specific tasks.

3.5.1 Implementation

A prototype MIDlet called CustomDataSourceMIDlet has been implemented to investigate how a custom DataSource behaves on the device. For testing purposes, data is read from an InputStream encapsulated within the SourceStream to simulate some protocol delivering media data. Note that this should not be confused with a Player created using an InputStream as argument to the Manager player factory method. The encapsulated InputStream could be replaced by any readable data source, for example a network socket. More details about the CustomDataSourceMIDlet implementation can be found in Appendix A.3.
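The following condensed skeleton illustrates the shape of such a DataSource. The class names match those used in the test output below, but the bodies are our own minimal sketch and not the actual implementation from Appendix A.3; in particular, the constructor signature is assumed, and the transfer size of 4096 bytes is only a hint matching the read sizes observed later.

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.media.Control;
    import javax.microedition.media.protocol.ContentDescriptor;
    import javax.microedition.media.protocol.DataSource;
    import javax.microedition.media.protocol.SourceStream;

    // Skeleton of a DataSource feeding a Player from an arbitrary stream.
    // The encapsulated InputStream stands in for a real protocol implementation.
    public class CustomDataSource extends DataSource {
        private final CustomSourceStream stream;
        private final String contentType;

        public CustomDataSource(InputStream in, String contentType) {
            super(null);   // no locator; data comes from the encapsulated stream
            this.contentType = contentType;
            this.stream = new CustomSourceStream(in, contentType);
        }

        public String getContentType() { return contentType; }
        public void connect() throws IOException { /* protocol setup goes here */ }
        public void disconnect() { /* protocol teardown goes here */ }
        public void start() throws IOException { /* begin delivery for pushed data */ }
        public void stop() { }
        public SourceStream[] getStreams() { return new SourceStream[] { stream }; }
        public Control[] getControls() { return new Control[0]; }
        public Control getControl(String type) { return null; }
    }

    class CustomSourceStream implements SourceStream {
        private final InputStream in;
        private final ContentDescriptor descriptor;

        CustomSourceStream(InputStream in, String contentType) {
            this.in = in;
            this.descriptor = new ContentDescriptor(contentType);
        }

        // The single read method; a protocol implementation would hand over
        // received packets here instead of reading from a local stream.
        public int read(byte[] b, int off, int len) throws IOException {
            return in.read(b, off, len);
        }

        public ContentDescriptor getContentDescriptor() { return descriptor; }
        public long getContentLength() { return -1; }       // unknown length
        public int getSeekType() { return NOT_SEEKABLE; }   // streaming source
        public long seek(long where) throws IOException {
            throw new IOException("not seekable");
        }
        public long tell() { return 0; }
        public int getTransferSize() { return 4096; }       // "logical" chunk size hint
        public Control[] getControls() { return new Control[0]; }
        public Control getControl(String type) { return null; }
    }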
3.5.2 Results and Observations

The introduction to this experiment (Section 3.5) described the MMAPI DataSource abstract class and how it manages the life-cycle of its media through a simple connection protocol. The life-cycle management worked well when the CustomDataSourceMIDlet was run in the PC emulator. The connect() method was automatically called upon creation of the Player, and start() when the realization process began. On the other hand, the result was not the same when the MIDlet was tested on the actual device: both the connect() and start() methods had to be called manually at suitable places in the code. This divergence in behavior caused some rather critical errors. For example, necessary instantiations in the connect() method were never performed, which resulted in numerous NullPointerExceptions and termination of the MIDlet. The cleanup procedure, however, worked well on the device. The disconnect() method, which implies a call to the stop() method, was automatically called both in the emulator and on the device.
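In other words, portable code cannot rely on the automatic life-cycle calls. A defensive sketch, using the assumed constructor signature from the skeleton above, is to perform the calls manually before handing the DataSource to the Manager:

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.media.Manager;
    import javax.microedition.media.MediaException;
    import javax.microedition.media.Player;

    public class ManualConnectExample {
        // The device did not invoke connect() automatically as the emulator
        // did, so the life-cycle calls are made explicitly here.
        public static Player createPlayer(InputStream in)
                throws IOException, MediaException {
            CustomDataSource source = new CustomDataSource(in, "video/h263-2000");
            source.connect();   // required manually on the K800i
            return Manager.createPlayer(source);
        }
    }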

MIDlet Output 2 shows the output of a test run of the CustomDataSourceMIDlet prototype MIDlet. The data content of the CustomDataSource, in this case represented by an encapsulated InputStream as described above, was read from the 3GP video file 50h263qcifVideo.3gp. The file contains H.263 video data in QCIF format. To instruct the Manager to create a suitable player, the content type of the CustomDataSource was set to video/h263-2000.

    MIDlet Output 2: CustomDataSourceMIDlet Output
    CDSM  : CustomDataSourceMIDlet
    CSS:0 : CustomSourceStream (first stream)

     1 (CDSM) DataSource: creating..
     2 (CDSM) DataSource: created.
     3 (CDSM) DataSource: Connecting.. (manual)
     4 (CDSM) DataSource: Connected.
     5 (CDSM) Player: creating..
     6 (CDSM) Player: created.
     7 (CDSM) Player: com.sonyericsson.mmedia.H263VideoPlayer@ff91a2b7
     8 (CDSM) Player: realizing..
     9 (CSS:0) read 4096 bytes
    10 (CSS:0) read 4096 bytes
    11 (CSS:0) read 4096 bytes
    12 ...
    13 (CSS:0) read 4096 bytes
    14 (CSS:0) read 4096 bytes
    15 (CSS:0) read 4096 bytes
    16 (CDSM) Player: realized.
    17 (CDSM) Player: starting..
    18 (CSS:0) read 4096 bytes
    19 (CSS:0) read 4096 bytes
    20 (CSS:0) read 4096 bytes
    21 ...
    22 (CSS:0) read 4096 bytes
    23 (CSS:0) read 4096 bytes
    24 (CSS:0) read 4096 bytes
    25 (CDSM) Player: started.
    26 (CSS:0) read 4096 bytes
    27 (CSS:0) read 4096 bytes
    28 (CSS:0) read 4096 bytes
    29 ...

Lines 1 to 7 in MIDlet Output 2 show the logging printouts produced by the creation of the CustomDataSource and Player instances. The manual call to connect() is logged on line 3. The start() method was never called manually, since it has an empty implementation in the CustomDataSource. The reason for this is that all operations necessary to read data from the CustomDataSource are carried out in the connect() method. In the case of a protocol where data is pushed to the DataSource, the start() method could, for example, come in handy. Line 7 shows the result of calling toString() on the Player instance received from the Manager. Since the content type of the CustomDataSource was set to video/h263-2000, we get an H263VideoPlayer.

The code that produced the output from line 8 and onwards consists of a realize() call directly followed by a start() call on the Player instance. The realize() method blocks the execution until the Player instance is realized, which means that the Player should start playing as soon as it is realized. Lines 16 and 17 indicate that the realize() method has returned and the starting process has been initiated. The Player instance is finally started at line 25. The Player continues to read data from line 26 and onwards, which indicates that it does not need to read the entire file before it can start playing. This is a good result if the H263VideoPlayer is going to be used to play a video stream.

3.6 Experiment 4 — Java ME and I AM

The previous experiment investigated how a custom protocol could be implemented in Java ME. This experiment continues where the previous experiment left off, and investigates how a custom DataSource can be used to implement a streaming protocol. The purpose of this experiment is to present and evaluate several ways to implement a custom streaming protocol on a Sony Ericsson K800i prototype phone. The desired result is one, or more, prototype MIDlets capable of playing a video stream received from another mobile terminal running the I AM client.

The prototype will be limited to only receiving video, for two reasons. First, one-way video could at least be useful in some scenarios. For example, a person could want to show another person a view of a particular location during a voice call. One-way audio, on the other hand, could be perceived as limited, since people are used to two-way audio in mobile terminals using circuit-switched systems. Second, it is not necessary to do jitter buffering on the received video packets [8]. Frames can be rendered directly on arrival without compromising quality.

Since this experiment is conducted on prototype phones, it is possible to extend the Java ME Runtime Environment with functionality provided by native code. This can be done by using Inter-Process Communication (IPC) to signal and share data between processes. IPC could, for example, enable a MIDlet to make calls to native functions, or provide access to a byte buffer with media data populated by native code. IPC in conjunction with the I AM prototype could make it possible to let a MIDlet utilize streaming video playback functionality already implemented in native code.

[Figure 3.1: Simplification of the steps involved in playing an RTP video stream. Pipeline: Receive -> Depack -> Decode -> Render, governed by a Control component.]
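To relate the figure to code, the sketch below outlines the first two steps (receive and depacketize) on the Java side. It is a rough sketch only: the RTP handling is reduced to skipping the fixed 12-byte header (a real implementation must also parse the payload-specific H.263 header), the port number is arbitrary, and PacketSink is our own placeholder for whatever component buffers payloads for the decoder, for example a custom SourceStream.

    import java.io.IOException;
    import javax.microedition.io.Connector;
    import javax.microedition.io.Datagram;
    import javax.microedition.io.UDPDatagramConnection;

    // Hypothetical receive/depacketize loop corresponding to the first two
    // boxes in Figure 3.1.
    public class RtpReceiver implements Runnable {
        public interface PacketSink { void deliver(byte[] payload, int len); }

        private static final int RTP_HEADER_SIZE = 12;
        private final PacketSink sink;

        public RtpReceiver(PacketSink sink) { this.sink = sink; }

        public void run() {
            try {
                UDPDatagramConnection conn =
                        (UDPDatagramConnection) Connector.open("datagram://:49152");
                byte[] buf = new byte[2048];
                Datagram dgram = conn.newDatagram(buf, buf.length);
                while (true) {
                    dgram.setLength(buf.length);   // reset before each receive
                    conn.receive(dgram);
                    int len = dgram.getLength();
                    if (len <= RTP_HEADER_SIZE) continue;   // ignore malformed packets
                    // Strip the fixed RTP header and hand the payload onwards.
                    byte[] payload = new byte[len - RTP_HEADER_SIZE];
                    System.arraycopy(dgram.getData(),
                            dgram.getOffset() + RTP_HEADER_SIZE,
                            payload, 0, payload.length);
                    sink.deliver(payload, payload.length);
                }
            } catch (IOException e) {
                // A real client would report the error and shut down gracefully.
            }
        }
    }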

References
