
Video Streaming - Etc. 2020



Video Streaming - Terminology
  • Encode
    To convert raw audio and/or video content into compressed form using technologies such as MPEG.
  • Transcode
    To convert a video signal that is encoded in one technology (MPEG-2) into another (MPEG-4).
  • Transrate
    To change the bit rate of a compressed video stream.
  • Transmux
    To convert to a different container format without changing the file contents.
  • Pull
    A Pull is a connection initiated by a streaming server to receive a broadcast from a designated encoder for re-distribution across a network.
  • Push
    A Push is a connection initiated by an encoder to a streaming server to deliver a broadcast for re-distribution across a network. This typically requires a username and password.
  • Latency
    Latency refers to the amount of time taken for data to complete a return trip between two points.
  • ABR
    Adaptive BitRate video streaming; Apple's HTTP-based implementation (HLS) is used by iOS (and many other products)
  • RTMP
    Real-Time Messaging Protocol, developed by Adobe and used by Flash
  • CBR (Constant Bit Rate) encoding
    The encoding software attempts to keep the total bits/second constant through the entire video. This makes the size of the file predictable (see the quick calculation after this list) and easier to stream. Most modern codecs allow you to set an upper threshold on the bit rate and let the rate drop when it is not required for quality, to help reduce the amount of bandwidth used.
  • Variable bit rate encoding (VBR)
    A method of encoding video that first analyzes the video and then compresses it. While it can take up to twice as long to encode, the video is compressed at an optimal rate for the smallest file size. The variability of the stream's data rate makes it less appropriate for RTSP-streamed content, but good for progressive download or video on CDs and other physical media.
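
For example, CBR makes the size of an encoded file easy to predict. A quick sketch in Python, with made-up numbers:

bitrate_bps = 1_000_000              # 1 Mbit/s CBR video
duration_s = 10 * 60                 # a 10-minute clip
size_bytes = bitrate_bps * duration_s / 8
print(f"{size_bytes / 1e6:.0f} MB")  # -> 75 MB, before container overhead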



Video Streaming - Streaming vs. Download

We have two ways to view media on the internet: Downloading and Streaming. In this section, I'll briefly summarize each and then address them more deeply later.


download_vs_streaming.png
  • Downloading

    When we download a file, the entire file is saved on our computer. This has some advantages (such as quicker access to different parts of the file) but has the big disadvantage of having to wait for the whole file to download before any of it can be viewed.

    The easiest way to provide downloadable video files is to use a simple hyperlink to the file. A slightly more advanced method is to embed the file in a web page using special HTML code.

    Delivering video files this way is known as HTTP streaming or HTTP delivery. Because it uses a standard web server, it is easy to set up and use on almost any website, without requiring additional software or special hosting plans.

  • Streaming

    Streaming media works differently - the end user can start watching the file almost as soon as it begins downloading. In effect, the file is sent to the user in a constant stream, and the user watches it as it arrives. The obvious advantage with this method is no waiting. Streaming media has additional advantages such as being able to broadcast live events such as a webcast or netcast.

    True streaming video must be delivered from a specialized streaming server.

  • Progressive Downloading

    There is also a hybrid method known as progressive download. In this method the video clip is downloaded but begins playing as soon as a portion of the file has been received. This simulates true streaming, but doesn't have all the advantages.


Most end users cannot tell the difference between video delivered by progressive download and video delivered via a streaming video server. They are all streams, after all. It is not until we look carefully at player functions such as navigation (rewind and fast-forward) that a difference appears.


The progressive download method uses a standard web server to transmit the file. The streaming method uses a streaming media server.

Streaming means that when a viewer clicks the button on a website, the video starts playing immediately, and it continues to play more or less smoothly to the end. To make this happen, the data rate of the encoded file must be smaller than the bandwidth capacity of the remote viewer; otherwise, the video will frequently stop playing.

The technical definition of streaming video is video delivered via a streaming server, which is a software program dedicated to delivering streaming media. This is in contrast with a traditional web server. A streaming server has several functions beyond those of a standard web server:

    
  • Real-time flow control
  • Intelligent stream switching
  • Interactive clip navigation

In progressive download, the end-user experience is similar to streaming media; however, the digital file is downloaded to a physical drive on the end user's device. The file is typically stored in the temp folder of the associated web browser if the media was embedded in a web page, or in a storage directory set in the preferences of the media player used for playback. Playback will stutter or stop if the playback rate exceeds the rate at which the file is downloaded, and it begins again after further download.

In streaming, a streaming server works with the client to send audio and/or video over the Internet or an intranet and play it almost immediately. Streaming servers allow real-time 'broadcasting' of live events and the ability to control the playback of on-demand content. Playback begins as soon as sufficient data has downloaded. The viewer can skip to a point partway through a clip without needing to download the beginning. If the data cannot be downloaded fast enough, a streamed webcast sacrifices quality so that viewing remains synchronized with the original timing of the content.



live_vs_ondemand.png

One of the primary reasons that producers use streaming servers is that once video is stored (or cached) on a hard drive, it's very easy to copy. Streaming video can be cache-less, which makes it inherently more secure.

Most Internet Video is delivered by progressive download. For example, YouTube video is delivered by progressive download.

What's happening when we watch a movie from a web site? First, the video file is placed on a server. Second, we click a link to that file on a web page. The server sends the data to us. It may appear to be streaming since playback can begin almost immediately. The progressive download feature in most media players allows them to begin playing the file as soon as enough data has been downloaded.

Years ago, the only acceptable way to deliver video was via streaming protocols such as RTSP, RTP, or RTCP, which required proprietary players and expensive servers. Often referred to as Streaming Video, these solutions were costly and did not scale well, but they offered more functionality at the time. As technology has evolved, there has been a wholesale migration toward delivering video via the standard HTTP protocol (often referred to as Progressive Download). Less expensive, it also scales well. This shift occurred as HTTP delivery grew to include many of the features that were once only possible with streaming protocols, and customers accepted it.

By using metadata attached to encoded files, progressive download can now allow users full seek and navigation at any time without requiring a full file download. By using bandwidth throttling (specifying the bit rate at which files should be delivered), it is now possible to deliver only the amount of video that will be viewed, preventing wasted bandwidth.

To maximize security, no-cache headers can be used to prevent browsers from storing content in cache and further DRM protection is easily available from partners.

So, with the broader availability of high-bandwidth networks and the new media delivery features of web servers, the differences that previously favored a streaming server over a web server for delivering digital media content have blurred. In non-multicast streaming scenarios, depending upon your business need, either a streaming server or a web server can be a viable option for digital media content delivery today.

However, in many ways progressive download is inferior to adaptive streaming, such as HTTP Live Streaming (HLS), which will be described later.


download_and_play_vs_streaming.png


Progressive download can be achieved using a regular web (http) server. The client (player) handles the buffering and playing during the download process.

With progressive download, the quality of the file is pre-determined. A user watching over a mobile connection on a 3-inch screen gets the same video as a user watching over a cable modem connection on a 1080p TV. The player is unable to adjust dynamically based on the user's network and screen conditions. Furthermore, if a user starts in a high-bandwidth environment and then moves to a low-bandwidth environment, HTTP progressive download is completely unable to keep pace. HLS, however, handles this scenario gracefully with minimal rebuffering and lag.


Video Streaming Primer
Video Streaming 1.12.pdf


web_server_vs_streaming_server.png


HLS is another HTTP-based media streaming communications protocol implemented by Apple Inc. as part of their QuickTime and iPhone software systems. There are multiple adaptive streaming alternatives today, including Adobe's Dynamic Streaming, Apple's HTTP Live Streaming and Microsoft's Smooth Streaming. For more info, check Adaptive Bit Rate Video Streaming: Why Delivery Will Matter More Than Codec.

HLS was originally unveiled with the introduction of iPhone OS 3.0 in 2009. Prior to that, no streaming protocols were supported natively on the iPhone. What HLS does is adaptive streaming over HTTP. A traditional streaming server, by contrast, usually runs streaming media software such as Windows Media Services (Microsoft) or the Helix Server (RealNetworks).

Adaptive streaming segments video into small chunks; HLS, for example, usually uses 10-second chunks. The video is encoded at multiple bit rates and resolutions, creating chunks of different sizes. This is the adaptive part of adaptive streaming: the client can choose between different bitrates/resolutions and adapt to larger or smaller chunks automatically as network conditions change.
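
To make the chunked, multi-bitrate idea concrete, here is a minimal sketch in Python of what an HLS client conceptually does: fetch the master playlist, then pick the variant that fits the measured bandwidth. The URL and the measured throughput are hypothetical, and the attribute parsing is deliberately naive (it ignores quoted commas, e.g. in CODECS):

import urllib.request

MASTER_URL = "https://example.com/video/master.m3u8"   # hypothetical URL

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def parse_master(text):
    """Return (bandwidth_bps, playlist_uri) pairs from a master playlist."""
    variants, bandwidth = [], None
    for line in text.splitlines():
        if line.startswith("#EXT-X-STREAM-INF"):
            for attr in line.split(":", 1)[1].split(","):
                if attr.startswith("BANDWIDTH="):
                    bandwidth = int(attr.split("=", 1)[1])
        elif line and not line.startswith("#") and bandwidth is not None:
            variants.append((bandwidth, line))   # URI follows its STREAM-INF tag
            bandwidth = None
    return variants

measured_bps = 2_000_000                 # pretend we measured ~2 Mbit/s
variants = sorted(parse_master(fetch(MASTER_URL)))
fitting = [v for v in variants if v[0] <= measured_bps]
bandwidth, uri = fitting[-1] if fitting else variants[0]
print("switching to variant:", uri, "at", bandwidth, "bps")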

HTTP Live Streaming lets you send audio and video over HTTP from an ordinary web server for playback on iPhone, iPad, iPod touch, and desktop computers. HTTP Live Streaming supports both live broadcasts and prerecorded content (video on demand). HTTP Live Streaming supports multiple alternate streams at different bit rates, and the client software can switch streams intelligently as network bandwidth changes. HTTP Live Streaming also provides for media encryption and user authentication over HTTPS, allowing publishers to protect their work.

By default, traditional streaming servers such as Windows Media Services and the Helix Server use protocols more suited to streaming, such as RTSP (Real Time Streaming Protocol) and UDP (User Datagram Protocol). RTSP provides built-in support for the control messages and other features of streaming servers.

With progressive download, if the playback rate exceeds the download rate, playback is delayed until more data is downloaded. Files downloaded over the Web without progressive playback are generally only viewable after the entire file has been downloaded.

But with HLS, files delivered from the server are playable at the same time they are being received by the computer playing them. A streaming server works in the same spirit: it cooperates with the client to send audio and/or video over the Internet and play it almost immediately.



hls

hls2

Pictures from Apple. MIME types: application/x-mpegURL for .m3u8 playlists and video/MP2T for .ts segments.



Streaming has the following steps:

  • The audio stream is compressed using an audio codec (MP3, Vorbis or AAC).
  • The video stream is compressed using a video codec (H.264 or VP8).
  • Encoded audio and video streams are assembled in a container bitstream (FLV, WebM, ASF or ISMA).
  • The bitstream is delivered from a streaming server to a streaming client using a transport protocol (MMS or RTP).
  • The streaming client may interact with the streaming server using a control protocol (MMS or RTSP).

RTP (Real-time Transport Protocol)

The Real-time Transport Protocol (RTP) is a transport protocol developed for streaming data. RTP includes extra data fields not present in TCP. It provides a timestamp and sequence number to facilitate data transport timing, and it allows control of the media server so that the video stream is served at the correct rate for real-time display. The media player then uses these RTP fields to assemble the received packets into the correct order and playback rate.

  • Sequence number
    The value of this 16-bit number increments by one for each packet. It is used by the player to detect packet loss and then to sequence the packets in the correct order. The initial number for a stream session is chosen at random.
  • Timestamp
    This is a sampling instance derived from a reference clock to allow for synchronization and jitter calculations. It is monotonic and linear in time.
  • Source identifiers
    The SSRC is a unique identifier for the synchronization source of the RTP stream. One or more CSRCs (contributing source identifiers) are present when the RTP stream carries information from multiple media sources, as in a video mix between two sources or embedded content.

In a sense, RTP is not a true transport protocol; it is designed to use UDP as its packet transport mechanism. In other words, RTP usually runs on UDP and uses UDP's multiplexing and checksum features. Note that RTP does not provide any quality-of-service control or reservation of network resources.
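
As a sketch of how a player reads these fields, here is a minimal parser in Python for the fixed 12-byte RTP header (RFC 3550), assuming the packet bytes have already been read from a UDP socket:

import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Decode the fixed 12-byte RTP header."""
    if len(packet) < 12:
        raise ValueError("too short to be an RTP packet")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,              # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,         # CSRC entries that follow the header
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,       # e.g. dynamic payload types are 96-127
        "sequence": seq,                 # 16-bit; detects loss and reordering
        "timestamp": timestamp,          # sampling instant from the media clock
        "ssrc": ssrc,                    # synchronization source identifier
    }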


rtp_header.png




RTCP (Real Time Control Protocol)

RTCP is used in conjunction with RTP. In other words, whenever an RTP connection is made, an RTCP connection also needs to be made. This connection is made using a second neighboring UDP port; if the RTP connection uses port 1500, then the RTCP connection uses port 1501.

RTCP gives feedback to each participant in an RTP session that can be used to control the session. The messages include reception reports with the number of packets lost and jitter statistics (early or late arrivals). This information potentially can be used by higher-layer applications to modify the transmission. For example, the bit rate of a stream could be changed to counter network congestion. Some RTCP messages relate to control of a video conference with multiple participants.

RTCP provides the following features:

  • Allows synchronization between different media types, such as video and audio. It carries timestamps that the receiver uses to align the clocks of the different RTP streams so that video and audio signals can be synced.
  • Reports reception quality to the senders (the sketch below shows how the jitter statistic is computed).
  • Provides identification of the senders in the RTP session so that new receivers can join and figure out which streams they need to obtain in order to participate fully.
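
The jitter figure in those reception reports is a running estimate, not a raw measurement. A sketch in Python of the interarrival-jitter update defined in RFC 3550 (transit times are in RTP timestamp units):

def update_jitter(jitter: float, prev_transit: int, transit: int) -> float:
    """One step of the RFC 3550 interarrival-jitter estimator.

    transit = packet arrival time - RTP timestamp, for two
    consecutive packets; their difference approximates jitter.
    """
    d = abs(transit - prev_transit)
    return jitter + (d - jitter) / 16.0   # smooths over roughly the last 16 packets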




Session Description Protocol (SDP)

SDP is a media description format intended for describing multimedia sessions, including video-conferencing. It includes session announcement and session invitation. Below is a sample of SDP.

v=0
o=- 32176 32176 IN IP4 13.16.32.209
s=ONetworkRenderer
i=OLiveBroadcast
c=IN IP4 221.1.0.1
t=0 0
b=AS:32
a=x-qt-text-nam:ONetwork Renderer
a=x-qt-text-inf:OLive Broadcast
a=x-qt-text-cmt:source application:ONetwork Renderer
a=x-qt-text-aut:
a=x-qt-text-cpy:
a=range:npt=0-
m=audio 22002 RTP/AVP 96
a=rtpmap:96 MP4A-LATM/44100/1
a=fmtp:96 cpresent=0;config=400024100000
a=control:trackID=1

The meaning of each field in the SDP is shown below:

v: Version
o: Originator, session identifier, version, network type, protocol type, address
s: Subject
i: Information
c: Connection type, address
t: Start and stop times
b: Bandwidth (AS = application-specific maximum, in kbit/s)
m: Media type, port number, transport protocol, RTP profile
a: Dynamic payload type description

When we change something in the SDP, the client must start a new session to see the effect of the changes. It works the same way as an HTML file: the web server holds the new HTML pages, and a client needs to refresh the page to see any changes made.
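
As a small illustration, a Python sketch that splits an SDP body into labeled fields, using a few lines from the sample above:

SAMPLE = """\
v=0
o=- 32176 32176 IN IP4 13.16.32.209
s=ONetworkRenderer
c=IN IP4 221.1.0.1
m=audio 22002 RTP/AVP 96"""

FIELD_NAMES = {"v": "version", "o": "originator", "s": "subject",
               "i": "information", "c": "connection", "t": "timing",
               "b": "bandwidth", "m": "media", "a": "attribute"}

# Each SDP line is a one-letter field, "=", and a value.
for line in SAMPLE.splitlines():
    key, _, value = line.partition("=")
    print(f"{FIELD_NAMES.get(key, key):>12}: {value}")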





RTSP (Real Time Streaming Protocol)

RTSP provides a means for users to control media sessions. RTSP does not actually transport the video signal itself; rather, it allows the signal to be controlled by the user. RTSP (Real Time Streaming Protocol) is a network control protocol for streaming media servers, used for establishing and controlling media sessions between the streaming server and the client. RTSP is considered more of a framework than a protocol, and it is designed to work on top of RTP to both control and deliver real-time content.


streaming_link.png

rtsp

RTSP is one of a number of different protocols that have been developed to facilitate real-time streaming of multimedia content. Streaming means that the mean frame rate of the video viewed at the player is dictated by the transmitted frame rate. The delivery rate has to be controlled so that the video data arrives just before it is required for display on the player. The associated audio track or tracks must also remain synchronized with the video. IP data transmission is not a synchronous process, and delivery is best-effort. To achieve synchronism, timing references have to be embedded in the stream.


rtsp_tcp.png

It delivers content as a unicast stream. It is an application-level protocol that was created specifically to control the delivery of real-time data, such as audio and video content. It is implemented over a connection-oriented transport protocol. It supports player control actions such as stopping, pausing, rewinding, and fast-forwarding.

If the connection URL uses RTSP, RTSP automatically negotiates the best delivery mechanism for the content. It then directs the RTP protocol to deliver streaming content using UDP, or using a TCP-based protocol on a network that does not support UDP.

The default transport layer port number is 554.
streaming_control.png

  • OPTIONS

    An OPTIONS request returns the request types the server will accept (a request sketch in Python follows this list).

  • DESCRIBE

    A DESCRIBE request includes an RTSP URL (rtsp://...) and the type of reply data that can be handled. The reply includes the presentation description, typically in Session Description Protocol (SDP) format. Among other things, the presentation description lists the media streams controlled with the aggregate URL. In the typical case, there is one media stream each for audio and video.

  • SETUP

    A SETUP request specifies how a single media stream must be transported. This must be done before a PLAY request is sent. The request contains the media stream URL and a transport specifier. This specifier typically includes a local port for receiving RTP data (audio or video), and another for RTCP data (meta information). The server reply usually confirms the chosen parameters, and fills in the missing parts, such as the server's chosen ports. Each media stream must be configured using SETUP before an aggregate play request may be sent.

  • PLAY

    A PLAY request will cause one or all media streams to be played. Play requests can be stacked by sending multiple PLAY requests. The URL may be the aggregate URL (to play all media streams), or a single media stream URL (to play only that stream). A range can be specified. If no range is specified, the stream is played from the beginning and plays to the end, or, if the stream is paused, it is resumed at the point it was paused.

  • PAUSE

    A PAUSE request temporarily halts one or all media streams, so it can later be resumed with a PLAY request. The request contains an aggregate or media stream URL. A range parameter on a PAUSE request specifies when to pause. When the range parameter is omitted, the pause occurs immediately and indefinitely.

  • RECORD

    The RECORD request can be used to send a stream to the server for storage.

  • TEARDOWN

    A TEARDOWN request is used to terminate the session. It stops all media streams and frees all session related data on the server.
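
Putting the first two request types together, here is a minimal Python sketch that sends OPTIONS and DESCRIBE over a TCP connection. The server address and stream URL are hypothetical, and a real client would parse the responses rather than just print them:

import socket

SERVER, PORT = "192.0.2.10", 554                 # hypothetical RTSP server
URL = f"rtsp://{SERVER}:{PORT}/stream"           # hypothetical stream URL

def rtsp_request(sock, method, url, cseq, extra=""):
    """Send one RTSP request and return the raw response text."""
    request = f"{method} {url} RTSP/1.0\r\nCSeq: {cseq}\r\n{extra}\r\n"
    sock.sendall(request.encode("ascii"))
    return sock.recv(4096).decode("ascii", "replace")

with socket.create_connection((SERVER, PORT), timeout=5) as s:
    print(rtsp_request(s, "OPTIONS", URL, 1))
    print(rtsp_request(s, "DESCRIBE", URL, 2, "Accept: application/sdp\r\n"))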


With Windows Media Services (WMS), RTSP supports the following features:

  • RTP packets can stream over UDP or over TCP. If the client can tolerate packet loss, streaming over UDP can be more efficient than TCP because UDP does not incur the overhead of retransmitting lost packets.
  • The encapsulation of Advanced Streaming Format (ASF) packets in RTP is proprietary.
  • The description of the ASF file, called ASF encapsulated in SDP, is proprietary.
  • WMS supports retransmission of lost RTP packets sent over UDP. This behavior allows a client to give up on expired RTP packets, which in turn helps the client avoid falling behind after losing packets.
  • WMS supports a forward error correction (FEC) scheme for RTP packets.
  • Streaming with RTSP fails if a firewall separates the client and server, and the firewall blocks the ports and protocols that RTSP uses. This problem is especially common with home Internet gateways. Even if the gateway has a built-in RTSP NAT, streaming might fail at times.
  • RTSP has the overhead of requiring multiple requests before playback can begin. However, the client can pipeline many of these requests and send them over a single TCP connection, in which case WMP does not need to block waiting for a response.

rtsp_rfc.png



For more info, please read the following:

HTTP versus RTMP: Which Way to Go and Why?
HTTPvsRTMP.pdf


What Is a Streaming Media Protocol?
HTTP, RTSP, RTMP, Smooth Streaming, HLS, HDS, and more:
What exactly are streaming protocols, and how do they interact with other communications protocols?
What Is a Streaming Media Protocol?





Containers

Just like a ZIP file can contain any sort of file within it, video container formats only define how to store things within them, not what kinds of data are stored. By definition, a container format could wrap any kind of data. Most container formats are specialized for specific data requirements.

The most popular multi-media containers are:

  • MP4

    • Most commonly used video format.
    • 4th standard produced by MPEG group.
    • It can store almost any type of data, including H.264 video, metadata, chapters, DVD-style menus, and subtitles.
    • It can also be easily streamed over the internet.
  • AVI

    • Created by Microsoft in 1992.
    • Doesn't support metadata, chapters, or subtitles.
  • MOV

    • Developed by Apple as part of QuickTime.
    • Released in 1991.
    • Starting from 2001, MOV and MP4 began using the same format specifications.



Here is more info on container formats:

  • 3GP (used by many mobile phones; based on the ISO base media file format)
  • ASF (container for Microsoft WMA and WMV, which today usually do not use a container)
  • AVI (the standard Microsoft Windows container, also based on RIFF)
  • DVR-MS ("Microsoft Digital Video Recording", proprietary video container format developed by Microsoft based on ASF)
  • Flash Video (FLV, F4V) (container for video and audio from Adobe Systems)
  • IFF (first platform-independent container format)
  • Matroska (MKV) (not limited to any codec or system, as it can hold virtually anything. It is an open standard and open source container format).
  • MJ2 - Motion JPEG 2000 file format, based on the ISO base media file format which is defined in MPEG-4 Part 12 and JPEG 2000 Part 12
  • QuickTime File Format (standard QuickTime video container from Apple Inc.)
  • MPEG program stream (standard container for MPEG-1 and MPEG-2 elementary streams on reasonably reliable media such as disks; used also on DVD-Video discs)
  • MPEG-2 transport stream (a.k.a. MPEG-TS) (standard container for digital broadcasting and for transportation over unreliable media; used also on Blu-ray Disc video; typically contains multiple video and audio streams, and an electronic program guide)
  • MP4 (standard audio and video container for the MPEG-4 multimedia portfolio, based on the ISO base media file format defined in MPEG-4 Part 12 and JPEG 2000 Part 12) which in turn was based on the QuickTime file format.
  • Ogg (standard container for the Xiph.org audio format Vorbis and video format Theora)
  • RM (RealMedia; standard container for RealVideo and RealAudio)


MPEG-2 Transport Stream (M2TS) container format

TS (Transport Stream) is a standard method used in MPEG for converting PES (Packetized Elementary Stream) streams into a stream of packets that can easily be transported over an IP network, a satellite link, or a digital television broadcast to a home. Each packet is a fixed 188 bytes long.
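
Because every packet is exactly 188 bytes and starts with the sync byte 0x47, a demuxer can walk a TS very simply. A Python sketch of reading the 4-byte packet header:

def parse_ts_header(packet: bytes) -> dict:
    """Decode the 4-byte header of one 188-byte MPEG-TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:    # 0x47 is the sync byte
        raise ValueError("not an aligned TS packet")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # which elementary stream
        "continuity_counter": packet[3] & 0x0F,        # detects dropped packets
    }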


TransportStream.png

Multiple MPEG programs are combined and then sent to a transmitting antenna. In the US broadcast digital TV system, an ATSC (Advanced Television Systems Committee) receiver then decodes the TS and displays it. In most other parts of the world, transmission is accomplished by one or more variants of the modular DVB system (DVB-H, DVB-SH (Satellite to Handhelds), DVB-NGH) or by eMBMS (not yet deployed).

From http://en.wikipedia.org/wiki/MPEG_transport_stream



MPEG-2 TS profiles
  • Main
    • segments are time-aligned
  • Simple
    • seamlessly switchable subset of Main
    • Segments start with PAT (Program Association Table), PMT (Program Map Table), and PCR (Program Clock Reference).
    • Segments contain only complete AUs (access units)



HTML5 - Video codecs and containers

HTML5 - Audio and Video.



Audio codecs

Audio codecs are discussed in my HTML5 section.




Tools




Best Practices Suite for Digital Audio-Visual Distribution

For the best practices of video streaming, please check digital-ema-toolkit-complete.pdf from EMA.




Unicast, Broadcast, Multicast, and Anycast

Unicast

Unicast packets are sent from host to host. The communication is from a single host to another single host: one device transmitting a message destined for one receiver.


Broadcast

Broadcast is when a single device is transmitting a message to all other devices in a given address range. This broadcast could reach all hosts on the subnet, all subnets, or all hosts on all subnets. Broadcast packets have the host (and/or subnet) portion of the address set to all ones. By design, most modern routers will block IP broadcast traffic and restrict it to the local subnet.


Multicast

Multicast appears to be a hybrid of unicast and broadcast communication, but that's not the case. Multicast does allow point-to-multipoint communication, which is similar to broadcast, but it happens in a different manner. The crux of multicast is that it enables multiple recipients to receive messages without flooding the messages to all hosts on a broadcast domain.

Multicast works by sending messages/data to IP multicast group addresses. Unlike broadcasts (which are not forwarded), routers then forward copies of the packet out of every interface that has hosts subscribed to that group address.

This is where multicast differs from broadcast messages: with multicast communication, copies of packets are sent only to subscribed hosts.

There are several different groups that users or applications can subscribe to. The range of multicast addresses starts at 224.0.0.0 and goes through 239.255.255.255.


multicast_network_addressing.png

As we can see, this range of addresses falls within IP Class D address space.

Multicasting sounds like a very efficient solution to the resource problems of delivering a webcast to very large audiences. But it can be used only for live or simulated live webcasting. You lose the interactivity of on-demand streaming.

A multicast is similar to a broadcast in the sense that its target is a number of machines on a network, but not all. Where a broadcast is directed to all hosts on the network, a multicast is directed to a group of hosts. The hosts can choose whether they wish to participate in the multicast group (often done with the Internet Group Management Protocol), whereas in a broadcast, all hosts are part of the broadcast group whether they like it or not!

How do you talk to a group of hosts (our multicast group), where each host has a different MAC address, and at the same time ensure that the other hosts, which are not part of the multicast group, don't process the information?

Here is the answer:
Multicast - Understand How IP Multicast Works.
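
As a concrete sketch of the receiver side, here is how a host joins a multicast group in Python; the group address and port are made up:

import socket
import struct

GROUP, PORT = "239.1.1.1", 5004        # hypothetical multicast group and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Joining the group triggers an IGMP membership report, telling the
# local router to start forwarding this group's traffic to us.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, sender = sock.recvfrom(2048)
print(f"received {len(data)} bytes from {sender}")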


Anycast

Anycast communication allows the same address to be placed on more than one device, so that when traffic is sent to that address, it is routed to the nearest host that shares the address.

Each destination address identifies a set of receiver endpoints, but only one of them is chosen at any given time to receive information from any given sender.

So, like multicast addresses, an anycast address identifies multiple interfaces, but there's a big difference: the anycast packet is delivered to only one address, actually the first one it finds, defined in terms of routing distance. This address is special because we can apply a single address to more than one interface. We could call them one-to-one-of-many addresses.





Video Streaming Servers


CDN (Content Delivery Network) Systems

CDNs are used to rapidly and cost-effectively deliver a variety of content to numerous end points, whether the end points are web browsers, mobile devices, set-top boxes, or even gaming consoles.

  • Tier 1 Service Providers
    AT&T, Bell Canada, Deutsche Telekom, Global Crossing, Level 3, Tata Communications, and Verizon
  • Data Service Providers That Also Provide Some CDN Services
    Amazon and Rackspace
  • Stand-alone CDNs
    Akamai, EdgeCast Networks, Highwinds, Limelight, Mirror Image, iStreamPlanet, Octoshape, PowerStream, StreamGuys, Streamzilla
  • Technology Providers to Many CDNs
    Adobe, Cisco, Jet-Stream, Juniper, Microsoft (Microsoft also has Windows Azure, a cloud computing infrastructure)

    • Cisco CDS
      Cisco CDS


    • Velocix Alcatel-Lucent
      Velocix


    • Juniper Media Flow
      Juniper MediaFlow


    • JetStream
      JetStream


    • Edgeware
      Edgeware


    • Verivue


FFmpeg

FFmpeg is a set of libraries (such as libavcodec, libavformat, and libavutil) and tools. With it we can read and write audio/video file formats and decompress and compress their contents.

  • It's the most widely used multimedia toolkit in Linux distributions.
  • It's also compatible with Windows and Mac.
  • It's mainly created as the culmination of free and open source encoding and decoding libraries that are out there. This makes it one-stop shopping for encoding and decoding pretty much any type of video.
  • libavcodec - an audio/video codec library used by several other projects.
  • libavformat - an audio/video container mux and demux library
  • ffmpeg - command line program for transcoding multimedia files.
  • For more on ffmpeg, please visit Video Streaming : FFmpeg.

FFmpeg - install

On linux (Fedora 18):

  1. Get a snapshot from FFmpeg Download and Source Code Repository Access
    or download the ffmpeg here: FFmpeg 1.2 "Magic" (ffmpeg-1.2.tar.bz2).
  2. tar xvjf ffmpeg-1.2.tar.bz2
  3. ./configure in the ffmpeg directory.
  4. make
  5. make install
  6. $ which ffmpeg
    /usr/local/bin/ffmpeg
    $ which ffprobe
    /usr/local/bin/ffprobe
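
Once installed, a typical transcode can be driven from Python. A sketch, with hypothetical input/output file names; the flags shown are current ffmpeg options, and very old builds may differ:

import subprocess

# Transcode to H.264 video and AAC audio in an MP4 container,
# capping the video bit rate at 1 Mbit/s.
subprocess.run([
    "ffmpeg", "-i", "input.mpg",
    "-c:v", "libx264", "-b:v", "1000k",
    "-c:a", "aac",
    "output.mp4",
], check=True)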







DirectShow

DirectShow is a multimedia framework like Apple's QuickTime framework and Linux multimedia frameworks such as GStreamer.

It is a multimedia framework built into Windows operating systems, and it is used by players such as Windows Media Player and Media Player Classic.

Most media players on Windows use DirectShow for playback. DirectShow uses a graph of filters to process multimedia data: multiple DirectShow filters serve as building blocks to construct what is called a DirectShow graph. DirectShow divides video playback into a sequence of fundamental processing steps known as filters, and each filter represents one stage in the processing of the data.


A video codec is one example of a filter. It takes raw uncompressed video and compresses it using a video standard such as H.264. To compress a multimedia stream, a filter graph could have two inputs (audio and video), usually expressed as file sources. The file sources feed compression filters; the output of the compression filters is fed to a multiplexer that combines the two inputs and produces a single output. An example of a multiplexer would be an MPEG transport stream creator. Finally, the multiplexer output is fed to a file sink, which creates a file from the output.

A filter has input and/or output pins that may be used to connect it to other filters. The generic nature of this connection mechanism enables filters to be connected in various ways so as to implement different complex functions. To implement a specific complex task, a developer must first build a filter graph by creating instances of the required filters, and then connect the filters together.

A filter graph in multimedia processing is a directed graph: edges represent one-way data flow, and nodes represent a data processing step. The terms pins and pads are used to describe the connection points between nodes and edges.
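
To illustrate the filter-graph idea, here is a toy model in Python. This is not the actual DirectShow COM API, just a sketch of filters connected by pins into a directed graph:

class Filter:
    """Toy stand-in for a filter: one processing step with output pins."""
    def __init__(self, name, transform):
        self.name, self.transform = name, transform
        self.downstream = []               # filters connected to our output pins

    def connect(self, other):
        self.downstream.append(other)      # wire an output pin to an input pin
        return other                       # allows chaining connect() calls

    def push(self, data):
        result = self.transform(data)
        for nxt in self.downstream:
            nxt.push(result)

# Build a playback-style graph: source -> decoder -> renderer.
source = Filter("file source", lambda d: d)
decoder = Filter("decoder", lambda d: f"decoded({d})")
renderer = Filter("renderer", lambda d: print("render:", d))
source.connect(decoder).connect(renderer)

source.push("compressed-frame-1")          # -> render: decoded(compressed-frame-1)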


Read more: What is DirectShow Decoder?
eHow.com: http://www.ehow.com/about_5077088_directshow-decoder.html





K-Lite Codec Pack

The K-Lite Codec Pack is a collection of codecs and tools designed to make using DirectShow easier. There are many types of codec packs, and most are available free over the Internet. Different codec packs may come with different files, and having one may not mean you can play every form of media.

Download K-Lite_Codec_Pack_810_Mega(21MB): K-Lite_Codec_Pack_810_Mega.exe

Note 1:
If we have to use a codec pack, a minimal one like K-Lite Basic or Combined Community Codec Pack (CCCP) might be the one we should use.

Note 2:
Most of the best video players out there, like the popular VLC or PotPlayer, have their own self-contained sets of codecs that won't conflict with anything on our system, nor do they require any kind of separate management or updating on our part.
If VLC and PotPlayer won't play our video, we probably don't need to view it.