Adding metadata to a live stream

About metadata

See Adobe Evangelist Jens Loeffler’s article Working with metadata for live Flash video streaming.

Metadata in streaming media gives subscribers information about the media they are viewing. Metadata can include information such as the title, copyright information, duration, or creation date of the video. A client can use the metadata, for example, to set the width and height of the video player.

In a recorded stream, a special data message inserted at the beginning of the media file provides the metadata. Any client that connects to Adobe Media Server receives the metadata when it plays the recorded stream. However, a client that connects to a live stream after the broadcast has started misses this data keyframe.

You can write code that tells Adobe Media Server to send metadata to clients whenever they connect to a live stream. Any client that connects to the server, even late in the broadcast, receives the metadata when it plays the live stream.

You can also use this feature to add metadata to a live stream at any time during a broadcast.

Note:

Because DVR applications use recorded streams, you do not need to use data keyframes to push metadata to the client. In DVR applications (and in all recorded video applications) the onMetaData() method is called at the beginning of the stream and during events such as seek and pause.
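
For example, a minimal playback sketch, assuming nsPlayer is a NetStream playing a recorded stream and its client object defines onMetaData(); the stream name is hypothetical:

// A sketch for recorded (or DVR) playback. onMetaData() is called when
// playback starts and again during events such as seek and pause.
nsPlayer.play("myRecording", 0); // hypothetical recorded stream name
nsPlayer.seek(30); // onMetaData() is called again after the seek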

Sending metadata to a live stream

To send metadata to clients when they connect to a live stream, pass a special command, @setDataFrame, to the client-side NetStream.send() method or to the server-side Stream.send() method. Whether the command is sent from the client or the server, handle the data on the client the same way you would handle any message from the send() method: pass a handler name to the send() method and define a function with that name to handle the data.

To send the command from the client, call the NetStream.send() method:

 NetStream.send("@setDataFrame", "onMetaData" [, metadata])

The onMetaData parameter specifies the name of the handler function that is called when the metadata is received. You can create multiple data keyframes. Each data keyframe must use a unique handler name (for example, onMetaData1, onMetaData2, and so on).

The metadata parameter is an Object or Array (or any subclass) that contains the metadata to set in the stream. Each metadata item is a property with a name and a value set in the metadata object. You can use any property names, but Adobe recommends using common names so that the metadata is easy for clients to read.

To add metadata to a live stream in a client-side script, use the following code:

var metaData:Object = new Object();
metaData.title = "myStream";
metaData.width = 400;
metaData.height = 200;
ns.send("@setDataFrame", "onMetaData", metaData);

To clear metadata from a live stream in a client-side script, use the following code:

ns.send("@clearDataFrame", "onMetaData");
ns.send("@clearDataFrame", "onMetaData");
ns.send("@clearDataFrame", "onMetaData");

To add metadata to a live stream in a server-side script, use the following code:

s = Stream.get("myStream");
metaData = new Object();
metaData.title = "myStream";
metaData.width = 400;
metaData.height = 200;
s.send("@setDataFrame", "onMetaData", metaData);

To clear metadata from a live stream in a server-side script, use the following code:

s.send("@clearDataFrame", "onMetaData");
s.send("@clearDataFrame", "onMetaData");
s.send("@clearDataFrame", "onMetaData");

Retrieving metadata

You can retrieve metadata from client-side code only; you cannot retrieve it from server-side code. Even if you send the @setDataFrame message from the server, use client-side code to retrieve it.

To retrieve metadata, assign the NetStream.client property to an object. Define an onMetaData function on that object, as in the following:

netstream.client = this;

function onMetaData(info:Object):void {
    var key:String;
    for (key in info) {
        trace(key + ": " + info[key]);
    }
}

This function outputs the metadata added by the server and the metadata sent with @setDataFrame.

To have the server add metadata in addition to any data you send with the @setDataFrame command, publish the video with the "record" flag or the "append" flag, as in the following:

netstream.publish("mycamera", "record");
netstream.publish("mycamera", "append");
netstream.publish("mycamera", "record"); netstream.publish("mycamera", "append");
netstream.publish("mycamera", "record"); 
netstream.publish("mycamera", "append");
Note:

If you publish a live video to the server with the "live" flag or without a type parameter, the server does not record the video. In this case, the server does not add standard metadata to the file.
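
For contrast, a minimal sketch of publishing with the "live" flag; in this case the server records nothing and adds no standard metadata, but data you push with @setDataFrame is still delivered to subscribers:

// A sketch: publish with the "live" flag. The server does not record the
// stream and does not add standard metadata. Data sent with @setDataFrame
// still reaches clients that play the live stream.
netstream.publish("mycamera", "live");
netstream.send("@setDataFrame", "onMetaData", metaData); // metaData as defined earlier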

To play the stream, call NetStream.play() and pass a value for the start parameter to indicate that the stream is a recorded live stream:

netstream.play("mycamera", 0); // Plays a recorded live stream.
netstream.play("mycamera", 0); // Plays a recorded live stream.
netstream.play("mycamera", 0); // Plays a recorded live stream.

If you pass 0 or greater for the start parameter, the client plays the recorded stream starting from the time given. The recorded stream includes the standard metadata added by the server. If the server doesn’t find a recorded stream, it ignores the play() method.

If you pass -2 or -1 for the start parameter, the client plays the live video and does not receive the standard metadata:

netstream.play("mycamera", -2); // Looks for a live stream first.
netstream.play("mycamera", -1); // Plays a live stream.
netstream.play("mycamera", -2); // Looks for a live stream first. netstream.play("mycamera", -1); // Plays a live stream.
netstream.play("mycamera", -2); // Looks for a live stream first. 
netstream.play("mycamera", -1); // Plays a live stream.

Example: Add metadata to live video

This example is a client application that does the following:

  • Captures and encodes video.

  • Displays the video as it’s captured.

  • Streams video from the client to Adobe Media Server.

  • Sends metadata to the server that the server sends to clients when they play the live stream.

  • Streams video from Adobe Media Server back to the client when you click a button.

  • Displays the video streamed from the server.

  • Displays the metadata sent from the server in a TextArea component.

Note:

To test this code, create a RootInstall/applications/publishlive folder on the server. Open the RootInstall/documentation/samples/metadata/Metadata.swf file to connect to the application.

  1. On Adobe Media Server, create a RootInstall/applications/publishlive folder.

  2. In Flash, create an ActionScript file and save it as Metadata.as.

  3. Copy and paste the following code into the Script window:

    package { 
        import flash.display.MovieClip; 
        import flash.net.NetConnection; 
        import flash.events.NetStatusEvent;  
        import flash.events.MouseEvent; 
        import flash.events.AsyncErrorEvent; 
        import flash.net.NetStream; 
        import flash.media.Video; 
        import flash.media.Camera; 
        import flash.media.Microphone; 
        import fl.controls.Button; 
        import fl.controls.Label; 
        import fl.controls.TextArea; 
        public class Metadata extends MovieClip { 
            private var nc:NetConnection; 
            private var ns:NetStream; 
            private var nsPlayer:NetStream; 
            private var vid:Video; 
            private var vidPlayer:Video; 
            private var cam:Camera; 
            private var mic:Microphone; 
            private var clearBtn:Button; 
            private var startPlaybackBtn:Button; 
            private var outgoingLbl:Label; 
            private var incomingLbl:Label; 
            private var myMetadata:Object; 
            private var outputWindow:TextArea; 
             
            public function Metadata(){ 
                setupUI(); 
                nc = new NetConnection(); 
                nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus); 
                nc.connect("rtmp://localhost/publishlive"); 
            } 
         
            /* 
             *  Clear the MetaData associated with the stream 
             */ 
            private function clearHandler(event:MouseEvent):void { 
                if (ns){ 
                    trace("Clearing MetaData"); 
                    ns.send("@clearDataFrame", "onMetaData"); 
                } 
            }    
             
            private function startHandler(event:MouseEvent):void { 
                displayPlaybackVideo(); 
            } 
             
            private function onNetStatus(event:NetStatusEvent):void { 
                trace(event.target + ": " + event.info.code); 
                switch (event.info.code) 
                { 
                    case "NetConnection.Connect.Success": 
                        publishCamera(); 
                        displayPublishingVideo(); 
                        break; 
                    case "NetStream.Publish.Start": 
                        sendMetadata(); 
                        break; 
                } 
            } 
             
            private function asyncErrorHandler(event:AsyncErrorEvent):void { 
                trace(event.text); 
            } 
            private function sendMetadata():void { 
                trace("sendMetaData() called") 
                myMetadata = new Object(); 
                myMetadata.customProp = "Welcome to the Live feed of YOUR LIFE, already in progress."; 
                ns.send("@setDataFrame", "onMetaData", myMetadata); 
            } 
            private function publishCamera():void { 
                cam = Camera.getCamera(); 
                mic = Microphone.getMicrophone(); 
                ns = new NetStream(nc); 
                ns.client = this; 
                ns.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus); 
                ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler); 
                ns.attachCamera(cam); 
                ns.attachAudio(mic); 
                ns.publish("myCamera", "record"); 
            } 
            private function displayPublishingVideo():void { 
                vid = new Video(cam.width, cam.height); 
                vid.x = 10; 
                vid.y = 10; 
                vid.attachCamera(cam); 
                addChild(vid);  
            } 
            private function displayPlaybackVideo():void { 
                nsPlayer = new NetStream(nc); 
                nsPlayer.client = this; 
                nsPlayer.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus); 
                nsPlayer.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler); 
                nsPlayer.play("myCamera", 0); 
                vidPlayer = new Video(cam.width, cam.height); 
                vidPlayer.x = cam.width + 100; 
                vidPlayer.y = 10; 
                vidPlayer.attachNetStream(nsPlayer); 
                addChild(vidPlayer); 
            } 
             
            private function setupUI():void { 
                outputWindow = new TextArea(); 
                outputWindow.move(250, 175); 
                outputWindow.width = 250; 
                outputWindow.height = 150; 
                 
                outgoingLbl = new Label(); 
                incomingLbl = new Label(); 
                outgoingLbl.width = 150; 
                incomingLbl.width = 150; 
                outgoingLbl.text = "Publishing Stream"; 
                incomingLbl.text = "Playback Stream"; 
                outgoingLbl.move(30, 150); 
                incomingLbl.move(300, 150); 
                 
                startPlaybackBtn = new Button(); 
                startPlaybackBtn.width = 150; 
                startPlaybackBtn.move(250, 345);
                startPlaybackBtn.label = "View Live Event"; 
                startPlaybackBtn.addEventListener(MouseEvent.CLICK, startHandler); 
                 
                clearBtn = new Button(); 
                clearBtn.width = 100; 
                clearBtn.move(135,345); 
                clearBtn.label = "Clear Metadata"; 
                clearBtn.addEventListener(MouseEvent.CLICK, clearHandler); 
                 
                addChild(clearBtn); 
                addChild(outgoingLbl); 
                addChild(incomingLbl); 
                addChild(startPlaybackBtn); 
                addChild(outputWindow); 
            } 
             
            public function onMetaData(info:Object):void { 
                var key:String; 
                for (key in info){ 
                    outputWindow.appendText(key + ": " + info[key] + "\n"); 
                } 
            } 
        } 
    }
  4. Save the file.

  5. Choose File > New > Flash File (ActionScript 3.0) and click OK.

  6. Save the file as Metadata.fla in the same folder as the Metadata.as file.

  7. Open the Components panel, drag a Button, a Label, and a TextArea component to the Stage, and delete them.

    This action adds the components to the Library. The components are added to the application at runtime.

  8. Choose File > Publish Settings. Click the Flash tab. Click Script Settings and enter Metadata as the Document class. Click the checkmark to validate the path.

  9. Save the file and choose Control > Test Movie to run the application.

Flash Media Live Encoder metadata properties

Flash Media Live Encoder sets the following metadata properties and values. You do not need to add this metadata to live streams:

lastkeyframetimestamp (Number): The timestamp of the last video keyframe recorded.
width (Number): The width of the video, in pixels.
height (Number): The height of the video, in pixels.
videodatarate (Number): The video bit rate.
audiodatarate (Number): The audio bit rate.
framerate (Number): The frame rate at which the video was recorded, in frames per second.
creationdate (String): The creation date of the file.
createdby (String): The creator of the file.
audiocodecid (Number): The ID of the audio codec used in the file. Values are: 0 (Uncompressed), 1 (ADPCM), 2 (MP3), 5 (Nellymoser 8 kHz Mono), 6 (Nellymoser), 10 (HE-AAC), 11 (Speex).
videocodecid (Number): The ID of the video codec used in the file. Values are: 2 (Sorenson H.263), 3 (Screen video), 4 (On2 VP6), 5 (On2 VP6 with transparency), 7 (H.264).
audiodelay (Number): The delay introduced by the audio codec, in seconds.
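
Because these encoder-set properties arrive through the same onMetaData handler as any data keyframe, a client can use them, for example, to size the video player. The following is a minimal client-side sketch, assuming vid is a Video object attached to the playback NetStream and that this object is assigned to NetStream.client:

// A sketch: size the player from encoder-set metadata.
// Assumes vid is a flash.media.Video attached to the playing NetStream.
function onMetaData(info:Object):void {
    if (info.width > 0 && info.height > 0) {
        vid.width = info.width;
        vid.height = info.height;
    }
    trace("framerate: " + info.framerate);
}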

Metadata properties for recorded live streams

If you record the file as you stream it, Adobe Media Server adds the metadata listed in the following table. To record a file as you publish it to the server, use the "record" parameter, as in the following:

ns.publish("myCamera", "record");
ns.publish("myCamera", "record");
ns.publish("myCamera", "record");

audiocodecid (Number): The ID of the audio codec used in the file. Values are: 0 (Uncompressed), 1 (ADPCM), 2 (MP3), 5 (Nellymoser 8 kHz Mono), 6 (Nellymoser), 10 (HE-AAC), 11 (Speex).
canSeekToEnd (Boolean): Whether the last video frame is a keyframe (true if yes, false if no).
createdby (String): The name of the file creator.
duration (Number): The length of the file, in seconds.
creationdate (String): The date the file was created.
videocodecid (Number): The ID of the video codec used in the file. Values are: 2 (Sorenson H.263), 3 (Screen video), 4 (On2 VP6), 5 (On2 VP6 with transparency), 7 (H.264).

When you use the "record" flag with the NetStream.publish() call, the server attempts to merge your metadata properties with the standard metadata properties. If there is a conflict between the two, the server uses the standard metadata properties. For example, suppose you add these metadata properties:

 duration=5
 x=200
 y=300

When the server starts to record the video, it begins to write its own metadata properties to the file, including duration. If the recording is 20 seconds long, the server adds duration=20 to the metadata, overwriting the value you specified. However, x=200 and y=300 are still saved as metadata, because they create no conflict. The other properties the server sets, such as audiocodecid, videocodecid, creationdate, and so on, are also saved in the file.
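
The following client-side sketch illustrates the merge described above; the server overwrites the duration value with the actual recorded length when the recording ends:

// Publish with "record" so the server merges its standard metadata.
// duration conflicts with a standard property and is overwritten by the
// server; x and y do not conflict, so they are preserved in the file.
ns.publish("myCamera", "record");
var myProps:Object = new Object();
myProps.duration = 5;
myProps.x = 200;
myProps.y = 300;
ns.send("@setDataFrame", "onMetaData", myProps);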

When a stream is recorded, the recording starts at a keyframe. Recordings stop immediately on keyframes or I-frames.
