HTML5 Rocks

Alpha transparency in Chrome video

By Sam Dutton

Chrome Canary now supports video alpha transparency in WebM.

In other words, Chrome takes the alpha channel into account when playing 'green screen' videos encoded to WebM with an alpha channel. This means you can play videos with transparent backgrounds: over web pages, images or even other videos.

There's a demo, though it only works in Chrome Canary at this point, and you'll need to enable VP8 alpha transparency from the chrome://flags page. Somewhat surreal, and a bit rough around the edges (literally), but you get the idea!

Here's a screencast:

How to make alpha videos

The method we describe uses the open source tools Blender and ffmpeg:

  1. Film your subject in front of a single colour background such as a bright green curtain.
  2. Process the video to build an array of PNG still images with transparency data.
  3. Encode to a video format (in this case, WebM).

There are also proprietary tools to do the same job, such as Adobe After Effects, which you may find simpler.

1. Make a green screen video

First of all, you need to film your subject so that everything in the background can be 'removed' (made transparent) by subsequent processing.

The easiest way to do this is to film in front of a single colour background, such as a screen or curtain. Green or blue are the colours most often used, mostly because of their difference from skin tones.

There are several guides online to filming green screen video (also known as chroma key) and lots of places to buy green and blue screen backdrops. Alternatively, you can paint a background with chroma key paint.

The Great Gatsby VFX reel shows just how much can be accomplished with green screen.

Some tips for filming:

  • Ensure your subject does not have clothes or objects that are the same color as the backdrop, otherwise these will show up as 'holes' in the final video. Even small logos or jewelry can be problematic.
  • Use consistent, even lighting, and avoid shadows: the aim is to have the smallest possible range of colors in the background that will subsequently need to be made transparent.
  • Using multiple diffused lights helps to avoid shadows and background color variations.
  • Avoid shiny backgrounds: matte surfaces diffuse light better.

2. Create raw alpha video from green screen video

The following steps describe one way to create a raw alpha video from a green screen video:

  1. Once you've shot a green screen video, you can use an open source tool like Blender to convert the video to an array of PNG files with alpha data. Use Blender's color keying to remove the green screen and make it transparent. (Note that PNG is not compulsory: any format that preserves alpha channel data is fine.)
  2. Convert the array of PNG files to a raw YUVA video using an open source tool such as ffmpeg:

    ffmpeg -i image%04d.png -pix_fmt yuva420p video.raw

    Alternatively encode the files directly to WebM, using an ffmpeg command like this:

    ffmpeg -i image%04d.png output.webm

If you want to add audio, you can use ffmpeg to mux that in with a command like this:

ffmpeg -i image%04d.png -i audio.wav output.webm

3. Encode alpha video to WebM

Raw alpha videos can be encoded to WebM in two ways.

  1. With ffmpeg: we added support to ffmpeg to encode WebM alpha videos.

    Use ffmpeg with an input video including alpha data, set the output format to WebM, and encoding will automatically be done in the correct format as per the spec. (Note: you'll currently need to make sure to get the latest version of ffmpeg from the git tree for this to work.)

    Sample command:

    ffmpeg -i myAlphaVideo.webm output.webm

  2. Using webm-tools:

    git clone

    webm-tools is a set of simple open source tools related to WebM, maintained by the WebM Project authors, including a tool for creating WebM videos with alpha transparency.

Run the binary with --help to see the list of options supported by alpha_encoder.

4. Playback in Chrome

To play the encoded WebM file in Chrome, simply set the file as the source of a video element.

As of now, VP8 alpha playback is behind a flag, so you have to either enable it in about:flags or set the command line flag --enable-vp8-alpha-playback when you start Chrome. When the flag is enabled, alpha playback also works with MediaSource.
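Wiring up playback takes only a few lines of script. The sketch below is illustrative: the file name alpha.webm and the alphaVideoMarkup helper are assumptions, not part of any shipped API, and the browser branch assumes Chrome Canary with the flag above enabled.

```javascript
// Build the markup for an alpha video element.
// (alphaVideoMarkup is a hypothetical helper; "alpha.webm" is a placeholder.)
function alphaVideoMarkup(src) {
  return '<video src="' + src + '" autoplay loop></video>';
}

// In a browser, overlay the transparent video on existing page content;
// the guard lets the helper above run outside a browser too.
if (typeof document !== 'undefined') {
  document.body.insertAdjacentHTML('beforeend', alphaVideoMarkup('alpha.webm'));
}
```

Because the background pixels are transparent, whatever the page renders underneath the video element shows through.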

How did they do it?

We talked to Google engineer Vignesh Venkatasubramanian about his work on the project. He summarised the key challenges involved:

  • The VP8 bitstream had no support for alpha channel. So we had to incorporate alpha without breaking the VP8 bitstream and without breaking existing players.
  • Chrome's renderer was not capable of rendering videos with alpha.
  • Chrome has multiple rendering paths for multiple hardware/GPU devices. Every rendering path had to be changed to support rendering of alpha videos.

We can think of lots of interesting use cases for video alpha transparency: games, interactive videos, collaborative storytelling (add your own video to a background video/image), videos with alternative characters or plots, web apps that use overlay video components...

Happy film making! Let us know if you build something amazing with alpha transparency.

WebRTC hits Firefox, Android and iOS

By Sam Dutton

A lot has happened with WebRTC over the last few weeks. Time for an update!

In particular, we're really excited to see WebRTC arriving on multiple browsers and platforms.

getUserMedia is available now in Chrome with no flags, as well as in Opera and Firefox Nightly/Aurora (though for Firefox you'll need to set preferences). Take a look at the cross-browser getUserMedia demo, and check out Chris Wilson's amazing examples of using getUserMedia as input for Web Audio.
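Since each browser still ships getUserMedia under its own prefix, a small shim smooths over the differences. This is a sketch; pickGetUserMedia is a hypothetical helper name, and you pass it the real `navigator` in a browser.

```javascript
// Return whichever prefixed getUserMedia the given navigator object
// provides, or null if none is available.
function pickGetUserMedia(nav) {
  return nav.getUserMedia || nav.webkitGetUserMedia ||
         nav.mozGetUserMedia || null;
}

// Browser usage (call with navigator as `this`):
// var getUserMedia = pickGetUserMedia(navigator);
// getUserMedia.call(navigator, {video: true}, onStream, onError);
```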

webkitRTCPeerConnection is now in Chrome stable and it's flagless. TURN server support is available in Chrome 24 and above. There's an ultra-simple demo of Chrome's RTCPeerConnection implementation, as well as a great video chat application. (A word of explanation about the name: after several iterations, it's currently known as webkitRTCPeerConnection. Other names and implementations have been deprecated. When the standards process has stabilised, the webkit prefix will be removed.)

WebRTC has also now been implemented for desktop in Firefox Nightly and Aurora, and for iOS and Android via the Ericsson Bowser browser.


DataChannel is a WebRTC API for high performance, low latency, peer-to-peer communication of arbitrary data. The API is simple—similar to WebSocket—but communication occurs directly between browsers, so DataChannel can be much faster than WebSocket even if a relay (TURN) server is required (when 'hole punching' to cope with firewalls and NATs fails).

DataChannel is planned for version 25 of Chrome, behind a flag – though it may miss this version. This will be for experimentation only, may not be fully functional, and communication won't be possible with the Firefox implementation. DataChannel in later versions should be more stable and will be implemented so as to enable interaction with DataChannel in Firefox.

Firefox Nightly/Aurora supports mozGetUserMedia, mozRTCPeerConnection and DataChannel (but don't forget to set your about:config preferences!)

Here's a screenshot of DataChannel running in Firefox:

Here's a code snippet from the demo:

pc1.onconnection = function() {
  log("pc1 onConnection ");
  dc1 = pc1.createDataChannel("This is pc1", {}); // reliable (TCP-like)
  dc1 = pc1.createDataChannel("This is pc1",
      {outOfOrderAllowed: true, maxRetransmitNum: 0}); // unreliable (UDP-like)
  log("pc1 created channel " + dc1 + " binarytype = " + dc1.binaryType);
  channel = dc1;
  channel.binaryType = "blob";
  log("pc1 new binarytype = " + dc1.binaryType);

  // Since we create the datachannel, don't wait for onDataChannel!
  channel.onmessage = function(evt) {
    if (evt.data instanceof Blob) {
      fancy_log("*** pc2 sent Blob: " + evt.data + ", length=" + evt.data.size, "blue");
    } else {
      fancy_log('pc2 said: ' + evt.data, "blue");
    }
  };
  channel.onopen = function() {
    log("pc1 onopen fired for " + channel);
    channel.send("pc1 says Hello...");
    log("pc1 state: " + channel.state);
  };
  channel.onclose = function() {
    log("pc1 onclose fired");
  };
  log("pc1 state:" + channel.readyState);
};

More information and demos for the Firefox implementation are available from the blog. Basic WebRTC support is due for release in Firefox 18 at the beginning of 2013, and support is planned for additional features including getUserMedia and createOffer/Answer constraints, as well as TURN (to allow communication between browsers behind firewalls).

For more information about WebRTC, see Getting Started With WebRTC. There's even a WebRTC book, available in print and several eBook formats.

Resolution Constraints

Constraints have been implemented in Chrome 24 and above. These can be used to set values for video resolution for getUserMedia() and RTCPeerConnection addStream() calls.

There's an example you can experiment with: play around with different constraints by setting a breakpoint and tweaking values.
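As a sketch, a mandatory minimum-resolution constraint in the Chrome 24-era syntax looks like the object below; the variable name is arbitrary, and the exact keys may change as the spec evolves.

```javascript
// Request at least 720p video from getUserMedia (Chrome 24-era syntax).
var hdConstraints = {
  video: {
    mandatory: {
      minWidth: 1280,
      minHeight: 720
    }
  },
  audio: false
};

// Browser usage:
// navigator.webkitGetUserMedia(hdConstraints, onStream, onError);
```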

A couple of gotchas... getUserMedia constraints set in one browser tab affect constraints for all tabs opened subsequently. Setting a disallowed value for constraints gives a rather cryptic error message:

navigator.getUserMedia error:  NavigatorUserMediaError {code: 1, PERMISSION_DENIED: 1}

The error is just as cryptic if you try to use getUserMedia from the local file system rather than from a server!

Streaming screen capture

Tab Capture is now available in the Chrome Dev channel. This makes it possible to capture the visible area of the tab as a stream, which can then be used locally, or with RTCPeerConnection's addStream(). Very useful for screencasting and web page sharing. For more information see the WebRTC Tab Content Capture proposal.

Keep us posted by commenting on this update: we'd love to hear what you're doing with these APIs.

...and don't forget to file any bugs you encounter!

Introducing Video Player Sample

By Pete LePage

Have you ever wanted a fun and beautiful way to publish videos on your own site like the new 60 Minutes or apps from the Chrome Web Store? I'm excited to announce the release of The Video Player Sample web app! The Video Player Sample is an open source video player web app built using the same architecture as the 60 Minutes and apps. It can be customized, extended, or just used out of the box and populated with your own content.

How it works

When a user opens the Video Player Sample, they can choose to watch a single video or create a playlist of videos/episodes from the list of content loaded into the app. The app's configuration and the information about its videos are stored in JSON files (config.json and data.json, respectively), both located in the data directory.

Key features

  • A beautiful video watching experience, including a full screen view
  • Ability to subscribe to shows, watch episodes, create playlists
  • Support for multiple video formats depending on what the user’s browser supports (including WebM, Ogg, MP4, and even a Flash fallback)
  • A Categories page with an overview of the different shows/categories available in the app
  • Notifications of new episodes (when the app is installed via the Chrome Web Store)
  • Built in support for sharing to Google+, Twitter and Facebook
  • To ensure easy customization, all source files, including the Photoshop PSDs, are included

How it's built

The Video Player Sample is written for the open web platform using HTML and JavaScript, broadly following the Model View Controller pattern and structure.

Browser Support

In addition to working as an app that can be installed through the Chrome Web Store, the Video Player Sample has been tested and works in all of the modern browsers.

Try it out

You can see a demo of the video player in action in the demo app, or by adding it to Chrome through the Chrome Web Store. To learn more about how the app works, check out the documentation.

You can grab the code from Google Code.