Streaming media files

Playing media files

To play a stream, use the NetStream class. Create a NetStream object and pass the constructor a NetConnection object:

     var ns:NetStream = new NetStream(nc);

Call the NetStream.play() method or the NetStream.play2() method and pass the URI of the media file:

     ns.play("bikes");

This code plays the recorded stream named “bikes.flv” within the application to which you are connected with NetConnection.connect().

Different file types require prefixes and file extensions. For example, the following code plays the file “bikes.f4v”:

     ns.play("mp4:bikes.f4v");

For detailed information about play() and play2(), their parameters, and when to use each method, see the NetStream class in the ActionScript 3.0 Reference.

Playing streams nested in subfolders

To play a stream nested in a subfolder, specify the codec prefix before you specify the path to the stream. FLV streams don’t require a codec prefix, but F4V/MP4 files, MP3 files, and RAW files do.

Suppose you’re using the default vod application. By default, the vod application is configured to look for streams in the applications/vod/media folder. Suppose a stream called “sample” is nested in the applications/vod/media/final folder. The following examples are for the client-side NetStream.play() method:

     ns.play("mp4:final/sample.f4v",0,-1)  // F4V/MP4 files 
     ns.play("raw:final/sample",0,-1)  // RAW files 
     ns.play("mp3:final/sample",0,-1)  // MP3 files 
     ns.play("final/sample",0,-1)  // FLV files

The following are examples for the Server-Side ActionScript Stream.play() method:

     stream.play("mp4:final/sample.f4v",0,-1)  // F4V/MP4 files 
     stream.play("raw:final/sample",0,-1)  // RAW files 
     stream.play("mp3:final/sample",0,-1)  // MP3 files 
     stream.play("final/sample",0,-1)  // FLV files
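In Server-Side ActionScript, a Stream object is obtained with Stream.get() before play() is called on it. The following is a minimal sketch; the stream name "tempStream" and the use of onConnect are examples, not requirements:

```actionscript
// Server-Side ActionScript (main.asc) sketch: play a nested F4V file
// into a server-side stream. "tempStream" is an example stream name.
application.onConnect = function(client) {
    var s = Stream.get("tempStream"); // get (or create) a server-side stream
    if (s) {
        // play(name, start, length, reset): start at 0, play to the end, reset the playlist
        s.play("mp4:final/sample.f4v", 0, -1, true);
    }
    application.acceptConnection(client);
};
```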

Naming streams

Stream names cannot contain any of the following characters: \ / : * ? " < > |.
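A client can check a name against this rule before publishing. The following sketch is illustrative; the helper name isValidStreamName is hypothetical, and it checks only the bare stream name (not a play path, which may legitimately contain "/" and a codec prefix with ":"):

```actionscript
// Hypothetical helper: returns true if a bare stream name avoids
// the reserved characters \ / : * ? " < > |
function isValidStreamName(name:String):Boolean {
    var reserved:RegExp = /[\\\/:*?"<>|]/;
    return name.length > 0 && !reserved.test(name);
}

trace(isValidStreamName("bikes"));     // true
trace(isValidStreamName("my:stream")); // false
```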

Managing the client buffer

  • Very low frame rate H.264 videos can take a long time to start if the buffer is too short.

    H.264 video requires 64 messages before it begins playback. For example, at 15 fps, 2 seconds of buffer holds only 30 samples. In this case, the server and the player wait more than 4 seconds for the 64 messages to arrive, even if the individual images are small.

  • Server-side streams do not play if the file size is less than either the configured buffer time (MinBufferTime tag in the Application.xml configuration file), or 2 seconds.

  • When playing a stream, set NetStream.bufferTime to at least 0.1. Use a smaller value for live applications and a larger value (3-5 seconds) for on-demand applications.
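The buffer guidance above can be sketched as a small helper; the function name applyBufferTime is hypothetical:

```actionscript
import flash.net.NetStream;

// Hypothetical helper: pick a buffer length per the guidance above.
// Live streams get a short buffer; on-demand streams get 3-5 seconds.
function applyBufferTime(ns:NetStream, isLive:Boolean):void {
    ns.bufferTime = isLive ? 0.1 : 4; // seconds
}
```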

Mapping URIs to local and network drives

Use virtual directories to simplify mapping URIs to local and network drives. Virtual directories let you publish and store media files in different, predetermined locations, which can help you organize your media files. Configure virtual directories in the VirtualDirectory/Streams tag of the Vhost.xml file.

One way you can use directory mapping is to separate storage of different kinds of resources. For example, your application could allow users to view either high-bandwidth video or low-bandwidth video, and you might want to store high-bandwidth and low-bandwidth video in separate folders. You can create a mapping wherein all streams that start with low are stored in a specific directory, C:\low_bandwidth, and all streams that start with high are stored in a different directory:
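A Vhost.xml mapping for this scenario might look like the following sketch; the physical paths come from the example above, and the exact surrounding tags in your Vhost.xml file may differ:

```xml
<VirtualDirectory>
    <!-- Each Streams tag maps a virtual directory prefix to a physical path. -->
    <Streams>low;c:\low_bandwidth</Streams>
    <Streams>high;c:\high_bandwidth</Streams>
</VirtualDirectory>
```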


When the client wants to access low-bandwidth video, the client calls ns.play("low/sample"). This call tells the server to look for the sample.flv file in the c:\low_bandwidth folder.

Similarly, a call to ns.play("high/sample") tells the server to look for the sample.flv file in the c:\high_bandwidth folder.

The following table shows three examples of different virtual directory configurations, including mapping to a local drive and a network drive, and how the configurations determine the directory to which a recorded stream is published. In the first case, because the URI specified ("myStream") does not match the virtual directory name that is specified ("low"), the server publishes the stream to the default streams directory.

Mapping in Vhost.xml <VirtualDirectory><Streams> tag | URI in NetStream call | Location of published stream
--- | --- | ---
low;c:\low_bandwidth | "myStream" | The default streams directory of the application
low;c:\low_bandwidth | "low/myStream" | c:\low_bandwidth\myStream.flv
low;\\networkDrive\share\streams | "low/myStream" | \\networkDrive\share\streams\myStream.flv

(The network path in the third row is an example.)

Capturing video snapshots

This feature enables you to capture a thumbnail snapshot of a given video, and to access its sound data, for display purposes.

Flash Player clients are permitted to access data only from streams in the directories specified by the Client.audioSampleAccess and Client.videoSampleAccess properties.

To access data, call BitmapData.draw() and SoundMixer.computeSpectrum() on the client. For more information about accessing raw audio, see Accessing raw sound data.
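As a sketch, a client could capture a snapshot of a playing video like this. It assumes a Video object named video with an attached NetStream, and that the server grants the SWF sample access; otherwise the draw() call throws a SecurityError:

```actionscript
import flash.display.Bitmap;
import flash.display.BitmapData;

// Capture the current video frame into a BitmapData object.
// Assumes "video" is a Video object with an attached NetStream and
// that Client.videoSampleAccess grants access on the server.
var snapshot:BitmapData = new BitmapData(video.width, video.height);
snapshot.draw(video); // throws a SecurityError if access is not granted
var thumbnail:Bitmap = new Bitmap(snapshot);
addChild(thumbnail);
```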

Handling metadata in streams

A recorded media file often has metadata encoded in it by the server or a tool. The Flash Video Exporter utility (version 1.1 or later) is a tool that embeds a video’s duration, frame rate, and other information into the video file itself. Other video encoders embed different sets of metadata, or you can explicitly add your own metadata.

The NetStream object that plays the stream on the client dispatches an onMetaData event when the stream encounters the metadata. To read the metadata, you must handle the event and extract the info object that contains the metadata. For example, if a file is encoded with Flash Video Exporter, the info object contains these properties:


duration 
The duration of the video.

width 
The width of the video display.

height 
The height of the video display.

framerate 
The frame rate at which the video was encoded.

Using XMP metadata

You can deliver Adobe Extensible Metadata Platform (XMP) metadata embedded in video streamed through Adobe Media Server to Flash Player. Adobe Media Server supports XMP metadata embedded in FLV and MP4/F4V formats. Adobe Media Server 3.5 supports one XMP metadata packet per MP4/F4V file.

With XMP metadata, you have a communication system that carries critical media information from media creation to the point where the media is viewed. XMP information you add during the production process can enhance the interactive experience of the media. In addition, the server can deliver speech-to-text metadata that Adobe encoding tools, such as Adobe Media Encoder, embed within files. AMF0 and AMF3 connections are supported. XMP metadata can be internal information about the file or information for end users.

For example, you could create a trailer in Adobe® Premiere® and transfer the metadata to the FLV file. When users view the file, they can use Flash Player 10 search to look for metadata and jump to a specific location in the file. When NetStream plays the content, an onXMPData message with the single field data is sent as a callback. The data field contains the entire XMP message from the media file.
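On the client, you handle this callback on the object assigned to NetStream.client, just as with onMetaData. A minimal sketch (the class name XMPClient is an example):

```actionscript
// Client callback object for NetStream; assign with ns.client = new XMPClient().
// The class name is an example.
class XMPClient {
    public function onXMPData(info:Object):void {
        // info.data holds the entire XMP packet from the media file.
        trace("raw XMP:\n" + info.data);
    }
}
```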

For detailed information about XMP, see the Adobe Extensible Metadata Platform documentation.

Example: Media player

This tutorial uses ActionScript 3.0 to add a Video object to the Stage to display video. For more information about working with video, see the “Working with Video” chapter in the ActionScript 3.0 Developer’s Guide.

This tutorial provides the simplest example of displaying video for learning. To build a more robust video player, see the Open Source Media Framework.


This example uses the MediaPlayer sample from the rootinstall/documentation/samples/MediaPlayer folder.

Run the example in Flash

  1. Create a rootinstall/applications/mediaplayer folder.

  2. Copy the rootinstall/documentation/samples/MediaPlayer/streams folder to the rootinstall/applications/mediaplayer folder so you have the following:

     rootinstall/applications/mediaplayer/streams


  3. In Flash, open the MediaPlayer.fla file from the rootinstall/documentation/samples/MediaPlayer folder.

  4. Select Control > Test Movie. The video plays without sound and the Output window displays messages.

    The Output window and the video in test-movie mode

You can watch the output as the stream plays and the connection status changes. The call to ns.play() triggers the call to onMetaData, which displays metadata in the console window, like this:

 metadata: duration=30 width=292 height=292 framerate=30

Run the example in Flash Builder

  1. Open the MediaPlayer project in Flash Builder.

  2. Choose Run > Debug. For Project, choose MediaPlayer. For Application file, choose MediaPlayer.as.

  3. Click Debug.

    The video runs in an application window. Click the Flash Builder window to see the output messages.

Write the main client class

  1. Create an ActionScript 3.0 class. Import NetConnection, NetStream, and any other classes you need:

     package { 
         import flash.display.Sprite; 
         import flash.events.NetStatusEvent; 
         import flash.media.Video; 
         import flash.net.NetConnection; 
         import flash.net.NetStream; 
  2. Create a new class, MediaPlayer, and declare the variables you’ll need within it:

     public class MediaPlayer extends Sprite 
     { 
         var nc:NetConnection; 
         var ns:NetStream; 
         var video:Video; 
  3. Define the constructor: create a NetConnection object and add an event listener to it, and connect to the server:

     public function MediaPlayer() 
     { 
             nc = new NetConnection(); 
             nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler); 
             nc.connect("rtmp://localhost/mediaplayer"); // example server URI 
  4. Create a netStatusHandler function that handles both NetConnection and NetStream events:

      private function netStatusHandler(event:NetStatusEvent):void{ 
        trace("event.info.level: " + event.info.level + "\n", "event.info.code: " + event.info.code); 
        switch (event.info.code){ 
            case "NetConnection.Connect.Success": 
                // Call doPlaylist() or doVideo() here. 
                break; 
            case "NetConnection.Connect.Failed": 
                // Handle this case here. 
                break; 
            case "NetConnection.Connect.Rejected": 
                // Handle this case here. 
                break; 
            case "NetStream.Play.Stop": 
                // Handle this case here. 
                break; 
            case "NetStream.Play.StreamNotFound": 
                // Handle this case here. 
                break; 
            case "NetStream.Publish.BadName": 
                trace("The stream name is already used"); 
                // Handle this case here. 
                break; 


To see the full list of event codes that are available, see the NetStatusEvent.info property in the ActionScript 3.0 Reference.

  1. Create a NetStream object and register a netStatus event listener:

     private function connectStream(nc:NetConnection):void { 
         ns = new NetStream(nc); 
         ns.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler); 
         ns.client = new CustomClient(); 

    Notice that you set the client property to an instance of the CustomClient class. CustomClient is a separate class that defines some special event handlers.

  2. Create a Video object and attach the stream to it:

         video = new Video(); 
         video.attachNetStream(ns); 

    In ActionScript 3.0, call Video.attachNetStream()—not Video.attachVideo() as in ActionScript 2.0—to attach the stream to the Video object.

  3. Call NetStream.play() to play the stream and call addChild() to add the Video object to the Stage:

         ns.play("bikes", 0); 
         addChild(video); 

    The URI of the stream you pass to NetStream.play() is relative to the URI of the application you pass to NetConnection.connect().

Write the client event handler class

You also need to write the CustomClient class, which contains the onMetaData and onPlayStatus event handlers. You must handle these events when you call NetStream.play(), but you cannot use the addEventListener() method to register the event handlers.

  1. In your main client class, attach the new class to the NetStream.client property:

     ns.client = new CustomClient();
  2. Create the new client class:

     class CustomClient { 
  3. Write a function named onMetaData() to handle the onMetaData event:

     public function onMetaData(info:Object):void { 
         trace("metadata: duration=" + info.duration + " width=" + info.width +  
             " height=" + info.height + " framerate=" + info.framerate); 
  4. Write a function named onPlayStatus() to handle the onPlayStatus event:

     public function onPlayStatus(info:Object):void { 
         trace("handling playstatus here");