Testing Live Streaming Applications – Video Stream, AV Sync, CC

Testing live streaming or multimedia applications requires a proper lab setup, which includes a bandwidth controller; network sniffing tools like Wireshark and Fiddler; application debug tools like ADB and Monitor for Android-based applications, and iTools, iTunes, and Xcode for iOS-based applications; physical devices to test the application; and an internet connection with the required bandwidth.

Testing Approach

1. Functional Testing

We should first test the functionality of the application. The following points should be tested with respect to the player.

  • Ability to launch the application
  • Ability to play/pause the video stream
  • Ability to increase/decrease the volume
  • Ability to see Closed Captions, if implemented
  • Ability to forward/rewind the stream
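If the player exposes a scriptable control surface, these checks can be automated. The sketch below is a minimal illustration in Python, assuming a hypothetical `Player` wrapper around your app's playback controls (the class and all its methods are stand-ins for illustration, not a real API):

```python
class Player:
    """Hypothetical stand-in for an automatable player under test."""
    def __init__(self):
        self.playing = False
        self.volume = 50
        self.position = 0.0
        self.cc_enabled = False

    def play(self): self.playing = True
    def pause(self): self.playing = False
    def set_volume(self, v): self.volume = max(0, min(100, v))
    def seek(self, seconds): self.position = max(0.0, self.position + seconds)
    def toggle_cc(self): self.cc_enabled = not self.cc_enabled

def run_functional_checks(player):
    # exercise each item from the functional checklist above
    player.play()
    assert player.playing, "play failed"
    player.pause()
    assert not player.playing, "pause failed"
    player.set_volume(80)
    assert player.volume == 80, "volume change failed"
    player.seek(30)       # forward 30 s
    player.seek(-10)      # rewind 10 s
    assert player.position == 20.0, "seek failed"
    player.toggle_cc()
    assert player.cc_enabled, "CC toggle failed"
    return "all checks passed"
```

In a real harness the `Player` methods would drive the device through ADB, Appium, or a player SDK rather than flipping in-memory flags.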

2. Testing Video Stream

Stream testing should be done to check the bitrate, buffer length, and lag in video playback. The following points should be tested in video streaming:

  • How much time it takes to start playback
  • Lag of the video behind the live content
  • How full the playback buffer gets (buffer fill)
  • Ability to play back the stream at variable bandwidth, if implemented
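A few of these measurements reduce to simple arithmetic once your test harness can observe the player. Below is a minimal Python sketch; the two fetch callables and the target-buffer figure are assumptions for illustration, not part of any real player API:

```python
import time

def startup_time(fetch_manifest, fetch_first_segment):
    # time from request until the first playable chunk is available;
    # the callables are stand-ins for real network fetches in a harness
    t0 = time.monotonic()
    fetch_manifest()
    fetch_first_segment()
    return time.monotonic() - t0

def live_lag(live_edge_s, playhead_s):
    # how far the playhead trails the live edge, in seconds
    return live_edge_s - playhead_s

def buffer_fill_pct(buffered_s, target_buffer_s):
    # percentage of the target buffer currently filled, capped at 100
    return min(100.0, 100.0 * buffered_s / target_buffer_s)
```

For example, a playhead at 112.5 s against a live edge of 120.0 s gives a 7.5-second lag, and 5 s buffered against a 10-second target is a 50% buffer fill.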

3. AV Sync

Audio-Video Sync (or Lip Sync) refers to the timing alignment between the audio (sound) and the video (images) during playback. Whenever a streaming application performs video playback, AV sync comes into the picture, and it must be tested to deliver a quality streaming application.

You can test AV sync simply by watching and observing the video playback, or you can use AV test videos; several are available on the internet. However, you cannot pinpoint exact AV sync-out conditions just by observing. To find exact issues, you need to analyze your video with an available stream analyzer.

4. Closed Captions (CC)

Closed Captions (CC) are the subtitles you see during playback of a video. They need to be tested if they are implemented in your stream. To test this functionality, you need to verify that the CC is in sync with the audio.
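One way to make this check concrete, assuming the captions are delivered as WebVTT cues (an assumption; your stream may use CEA-608/708 or another format instead), is to compare cue start times against the expected audio timing within a tolerance:

```python
def vtt_ts_to_seconds(ts):
    # convert a WebVTT-style timestamp like "00:01:05.500" to seconds
    h, m, s = ts.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def cc_in_sync(cue_start, audio_start, tolerance_s=0.5):
    # a caption counts as "in sync" if its cue starts within the
    # tolerance of the matching audio event; the 0.5 s budget is an
    # assumed test threshold, not a standard value
    delta = abs(vtt_ts_to_seconds(cue_start) - vtt_ts_to_seconds(audio_start))
    return delta <= tolerance_s
```

A cue at 00:00:10.200 against dialogue starting at 00:00:10.000 passes with the default tolerance; a cue two seconds late does not.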

5. Profile Switching

This is the main part to test if your stream supports Adaptive Bitrate. To test it, you will require a bandwidth controller capable of throttling your network bandwidth as required, a network sniffing tool to check the switching of profiles, and debugging tools to capture any error situation.

Below is an illustration of how Adaptive Bitrate works under variable bandwidth conditions. Suppose that when your player starts playback it is getting a bandwidth of 500 kbps, so it plays the first chunk from the (B-500 kbps, R-480x270) stream.

Then your bandwidth suddenly increases to 2000 kbps, so the player switches to the second chunk from the (B-2000 kbps, R-1280x720) stream. The bandwidth does not change for the third chunk, so the player plays the third chunk from the same (B-2000 kbps, R-1280x720) stream.

Your bandwidth then drops to 300 kbps, so the player switches to the fourth chunk from the (B-300 kbps, R-240x144) stream.

This goes on in the same manner: the stream is switched between profiles according to the available bandwidth.
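The switching logic above can be sketched as a simple rung-selection rule. The bitrate ladder below mirrors the walkthrough's three profiles; note that real players also weigh buffer health and switching history, which this sketch ignores:

```python
# ladder matching the walkthrough: (bitrate in bps, resolution)
LADDER = [(300_000, "240x144"), (500_000, "480x270"), (2_000_000, "1280x720")]

def select_profile(bandwidth_bps):
    # pick the highest-bitrate profile the measured bandwidth can
    # sustain, falling back to the lowest rung when bandwidth is poor
    candidates = [p for p in LADDER if p[0] <= bandwidth_bps]
    return max(candidates) if candidates else LADDER[0]

def simulate(bandwidth_per_chunk):
    # resolution chosen for each chunk, given the bandwidth seen then
    return [select_profile(bw)[1] for bw in bandwidth_per_chunk]
```

Replaying the walkthrough's bandwidth sequence, `simulate([500_000, 2_000_000, 2_000_000, 300_000])` yields `["480x270", "1280x720", "1280x720", "240x144"]`, matching the four chunks described above.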

[image_frame url="http://rcvacademy.com/wp-content/uploads/2016/10/http-live-streaming-adaptive-bitrate.png" border_style="boxed-frame" action="open-lightbox"]

How Does the Player Get the Stream from the HLS Server?

When a player requests the stream, the server sends a playlist/manifest (master playlist) file which contains the details of all the streams. The master playlist contains the information of the sub-playlists/streams, which are encoded at different bitrates and resolutions. When you open these playlist (.m3u8) files in a text editor, you can see all the details. Below are samples which will give you a complete picture of these playlists.

Master Playlist

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000,RESOLUTION=624x352
624x352_800.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2600000,RESOLUTION=1280x720
1280x720_2600.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,RESOLUTION=640x360
640x360_1200.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=450000,RESOLUTION=448x252
448x252_450.m3u8

The master playlist provides the address of each individual playlist/stream. In this master playlist, you can see there are four streams, encoded at different bitrates/bandwidths and resolutions. The tags denote:

  1. EXTM3U: Extended M3U file/playlist
  2. EXT-X-STREAM-INF: Indicates that the next URL in the playlist file identifies another playlist file.
  3. PROGRAM-ID: A decimal integer that uniquely identifies a particular presentation.
  4. BANDWIDTH: A decimal integer of bits per second (the bitrate at which the video is encoded).
  5. RESOLUTION: The resolution of the video.
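A quick way to see this structure programmatically is to parse the variant entries. The Python sketch below pairs each EXT-X-STREAM-INF tag with the URI on the line after it; this is a simplified reading of the format for illustration, not a full m3u8 parser (the sample text here is a shortened version of the master playlist above):

```python
import re

MASTER = """\
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000,RESOLUTION=624x352
624x352_800.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2600000,RESOLUTION=1280x720
1280x720_2600.m3u8
"""

def parse_master(text):
    # pair each EXT-X-STREAM-INF tag with the URI on the following line
    variants = []
    lines = text.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            # pull ATTRIBUTE=value pairs out of the tag line
            attrs = dict(re.findall(r"([A-Z-]+)=([^,]+)", line))
            variants.append((int(attrs["BANDWIDTH"]),
                             attrs["RESOLUTION"],
                             lines[i + 1]))
    return variants
```

Running `parse_master(MASTER)` yields one `(bandwidth, resolution, uri)` tuple per variant, which is exactly the information a player uses when deciding which rung of the ladder to fetch.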

Sub Playlist

When a player chooses a playlist based on the available bandwidth, it gets the sub-playlist, which contains the details of the stream segments. Below is an example of such a sub-playlist.

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10, no desc
fileSequence0.ts
#EXTINF:10, no desc
fileSequence1.ts
#EXTINF:10, no desc
fileSequence2.ts
#EXTINF:10, no desc
fileSequence3.ts
#EXTINF:10, no desc
fileSequence4.ts
#EXTINF:10, no desc
fileSequence5.ts
#EXTINF:10, no desc
fileSequence6.ts
#EXTINF:10, no desc
fileSequence7.ts
#EXTINF:10, no desc
fileSequence8.ts
#EXTINF:10, no desc
fileSequence9.ts
#EXTINF:10, no desc
fileSequence10.ts
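Since each #EXTINF line carries its segment's duration, the total playable length of a sub-playlist is just the sum of those values. A minimal sketch (with a truncated two-segment sample playlist for brevity):

```python
SUB = """\
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10, no desc
fileSequence0.ts
#EXTINF:10, no desc
fileSequence1.ts
"""

def total_duration(playlist_text):
    # sum the EXTINF segment durations to get total playable seconds
    total = 0.0
    for line in playlist_text.splitlines():
        if line.startswith("#EXTINF:"):
            # "#EXTINF:10, no desc" -> "10, no desc" -> "10"
            total += float(line.split(":")[1].split(",")[0])
    return total
```

For the eleven 10-second segments in the sub-playlist above, the same sum gives 110 seconds of content.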

Playlist Relationship

[image_frame url="http://rcvacademy.com/wp-content/uploads/2016/10/http-live-streaming-playlist-relationship.png" border_style="boxed-frame" action="open-lightbox"]

HTTP Live Streaming – HLS Architecture | HLS Server Component

HTTP Live Streaming lets you send audio and video over HTTP from a web server for playback on Android, iOS, desktop, and other platform applications that support HLS playback. HLS supports both live and Video on Demand content.

HLS Architecture

[image_frame url="http://rcvacademy.com/wp-content/uploads/2016/10/http-live-streaming-hls-architecture.png" border_style="boxed-frame" action="open-lightbox"]

HLS consists of three major components:

  1. Server Component
  2. Distribution Component
  3. Client Software

 

HLS Server Component

The server component is responsible for taking input media streams, encoding them digitally, and creating multiple bitrate streams suitable for delivery; these are then sent to the segmenter, which creates segments or chunks.
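The segmenter's job can be illustrated with a toy model: cut a stream of known length into fixed-duration chunks, with a shorter final chunk when the length does not divide evenly. This is a deliberate simplification; a real segmenter cuts on keyframe boundaries in the encoded bitstream:

```python
def segment(total_duration_s, target_s=10):
    # split a stream of the given length into fixed-length chunks,
    # naming them in the fileSequenceN.ts style used above
    chunks = []
    remaining = total_duration_s
    seq = 0
    while remaining > 0:
        length = min(target_s, remaining)
        chunks.append((f"fileSequence{seq}.ts", length))
        remaining -= length
        seq += 1
    return chunks
```

A 25-second input with a 10-second target yields two full chunks and one 5-second trailing chunk; the chunk durations are what end up on the #EXTINF lines of the sub-playlist.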

[image_frame url="http://rcvacademy.com/wp-content/uploads/2016/10/http-live-streaming-hls-server-component.jpg" border_style="boxed-frame" action="open-lightbox"]

HLS Distribution Component

The distribution component is responsible for content distribution. When a client/player sends a request, it reaches the origin web server, and the response is sent back to the client in the form of index files. The player reads an index file and then requests the content, which is sent to the client in the form of chunks or segments.

All requests and responses travel through the CDN over HTTP. Once content has been served to a client, a cached copy of that content is kept on the CDN, so if another client requests the same data, it is served directly from the CDN, which reduces the load on the origin web server.
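This caching behavior can be modeled as a simple look-aside cache in front of the origin. In the toy sketch below (all names are illustrative, not a real CDN API), `origin_hits` counts how often the origin is actually contacted:

```python
class Cdn:
    """Toy edge cache in front of an origin fetch function."""
    def __init__(self, origin_fetch):
        self.origin_fetch = origin_fetch
        self.cache = {}
        self.origin_hits = 0

    def get(self, url):
        if url not in self.cache:
            self.origin_hits += 1            # cache miss: go to origin
            self.cache[url] = self.origin_fetch(url)
        return self.cache[url]               # cache hit thereafter
```

However many clients request the same segment, the origin is contacted only once per URL; every subsequent request is served from the edge cache.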

A few examples of media servers and CDNs are Wowza Streaming Engine, Akamai, and Amazon CloudFront.

[image_frame url="http://rcvacademy.com/wp-content/uploads/2016/10/http-live-streaming-hls-distribution-component.png" border_style="boxed-frame" action="open-lightbox"]

Client Software

Client software is a player capable of playing an HLS stream in a native application or in an HLS-supporting browser. Any player that supports HLS streams can be embedded into an application for live and on-demand playback.

A few examples are QuickTime Player, JW Player, ffplay/avplay, the Safari browser, and many more.

Streaming Protocol Basics – HLS Streaming, HTTP Dynamic Streaming

Before we try to understand streaming protocols, let us first understand the types of streaming.

Types of Streaming

There are mainly two types of Streaming:

Progressive Download

In this process, the client/player asks the server for a video file, and the server sends the whole file to the client over HTTP. Playback can start only once enough of the file has been downloaded.

Since the content is downloaded to your local machine, the content is not secure. Users cannot skip forward in the timeline, as content is downloaded in a linear manner. There is no monitoring of the network or adaptation of quality; the file is simply downloaded and played back.

If the network speed is slow, you will see buffering/stalling in the video. Some examples of progressive download are YouTube and Vimeo.

Adaptive Bitrate Streaming

It is a technique for multimedia streaming over the internet at fluctuating bandwidth. Adaptive Bitrate streaming works by detecting a user's bandwidth and CPU capacity and adjusting the quality of the video stream accordingly.

You will get a clear picture later when we explain it with a figure and an example.

If we look at the technologies used in the past, most streaming was done over the RTSP streaming protocol; for adaptive bitrate streaming, however, we now use HTTP-based streaming, which is designed to work efficiently in distributed HTTP networks such as the Internet.

This kind of streaming requires an encoder that encodes the raw video from a single stream into multiple bitrates. The client application or player switches between these multiple-bitrate streams according to the available bandwidth and plays the stream accordingly; as a result, we do not see much buffering and get a good experience on both high-bandwidth and low-bandwidth connections.

Adaptive Bitrate Streaming Protocols

Some of the popular streaming protocols nowadays are as follows:

  1. Apple HTTP Live Streaming
  2. Adobe HTTP Dynamic Streaming
  3. Microsoft Smooth Streaming

We will explain HTTP Live Streaming (HLS) in the next post, as it is widely used in the media streaming industry.

Video Streaming Introduction – What is Live Streaming

Video/media that is continuously received and played back by a multimedia player is called streaming video.

Live streaming is the process of broadcasting real-time video to people who are watching it over the internet.

You can find many real-world examples of media streaming, such as YouTube, Livestream, Vimeo, Hulu, Ustream, Netflix, and many more.

How does Live Streaming fit in IT?

We all love to watch live events, especially sports and concerts by our favorite entertainers. However, when it is not possible to attend a live event, we watch these programs in real time on our TV or over the internet through live streaming apps available on different platforms like Android, iOS, desktop, etc.

Who uses Video Streaming?

Broadcasting companies that want their content delivered everywhere use these streaming concepts to deliver video over the internet to their consumers.