[whatwg] Exposing framerate / statistics of <video> playback and related feedback
Hugh Guiney
hugh.guiney at gmail.com
Tue May 1 10:21:06 PDT 2012
On Mon, Apr 30, 2012 at 7:37 PM, Ian Hickson <ian at hixie.ch> wrote:
> On Fri, 28 May 2010, Ian Fette wrote:
>>
>> Has any thought been given to exposing such metrics as framerate, how
>> many frames are dropped, rebuffering, etc from the <video> tag?
>
> It has come up a lot, but the main question is: what is the use case?
Web-based non-linear editors (NLEs). This software already exists:
YouTube has one (http://www.youtube.com/editor), Mozilla has one
(http://mozillapopcorn.org/popcorn-maker/), and there are/have been
several independent efforts as well
(http://lifehacker.com/5629683/jaycut-is-a-pretty-awesome-web+based-video-editor,
http://www.spacebarstudios411.com/easyclip/, etc.).
Right now all of this software is alpha-stage, but the kinds of
problems these tools attempt to solve include pop-up annotations,
synchronized slide shows, clickable video areas, etc. Essentially,
they will allow users to author rich multimedia experiences that
aren't achievable with a traditional desktop NLE. Even if desktop NLEs
were to offer this functionality with an HTML export like Adobe is
doing with Flash CS6, it is advantageous to work in the destination
medium rather than one fundamentally different; a similar trend is
happening right now as web designers are moving away from Photoshop
and beginning to design in the browser directly, and I can only
imagine the same will happen with moving images, technology
permitting.
As it stands, frame rate awareness is a feature of NLEs that you would
have to try very hard NOT to find. It is quite common for modern
camcorders to offer an array of different available frame rates, for
instance Panasonic's prosumer models (HVX200, HPX170 etc.) offer at
least 20 different fps options: 12, 15, 18, 20, 21, 22, 24, 25, 26,
27, 28, 30, 32, 34, 36, 40, 44, 48, 54, and 60. One of the primary
purposes of these options is to allow the user to achieve time
distortion effects: if your main timeline is 24fps, you could shoot at
12fps and play it back at 24fps for fast motion, or shoot at 48fps for
slow motion.
These are called "undercranking" and "overcranking" respectively and
have been in use since the dawn of film.
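As a rough illustration (this is just arithmetic, not tied to any
existing or proposed API), the effective speed factor is the ratio of
the two rates:

    // Effective playback speed when footage shot at captureFps is
    // conformed frame-for-frame to a timeline running at timelineFps.
    function effectiveSpeed(captureFps, timelineFps) {
      return timelineFps / captureFps;
    }

    effectiveSpeed(12, 24); // 2   -> fast motion ("undercranking")
    effectiveSpeed(48, 24); // 0.5 -> slow motion ("overcranking")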
A ubiquitous UI paradigm in modern video editing is a timeline with a
set frame rate, into which videos of other frame rates can be dragged
to change their effective playback speed. This is useful not only for
artistic time distortion effects but also for pragmatic time
distortion, such as mixing 24fps (US film) with 30fps (US broadcast),
or 24fps with 25fps (European film), etc., while keeping a
non-variable output frame rate.
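A web NLE could automate that conforming step if the media element
exposed its intrinsic frame rate. The videoFrameRate property below is
purely hypothetical (no such attribute exists today; that is the gap
being discussed), but the rest is the existing playbackRate API:

    // Hypothetical sketch: retime a dropped clip so each of its source
    // frames lands on one timeline frame.
    function conformToTimeline(videoEl, timelineFps) {
      var sourceFps = videoEl.videoFrameRate; // hypothetical attribute
      videoEl.playbackRate = timelineFps / sourceFps;
    }
    // e.g. a 12fps clip in a 24fps timeline plays at 2x (undercranked).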
Other use cases:
- Categorizing/filtering a video catalog by frame rate, such as on a
stock videography or film archive site, to see only those videos that
match the user's interest.
- Video player UI displaying the frame rate so that users can tell if
it is worthwhile to attempt playback on a slow connection or a device
with limited playback capabilities. For instance, such a user might
discern that watching a 1080p60 video on a mobile device would take up
too much bandwidth BEFORE pressing play and having the video stutter
or play too slowly. Similarly, devices could detect this on their own
and report it to the user.
- Frame-accurate subtitle authoring; timing the display of text with a
character's lip movements is a precise art, and if it is off by even a
few seconds, it is distracting to the audience (see the sketch after
this list).
- An NLE that ingests Edit Decision List (EDL) XML files, which denote
cuts, transitions, etc. in SMPTE timecode, so editors can work on
projects that were originally cut in another NLE. This would be
especially useful for desktop-to-web migration.
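For the subtitle case, frame accuracy mostly comes down to snapping
cue times to frame boundaries, which again requires knowing the frame
rate. A minimal sketch, assuming fps is supplied from somewhere:

    // Snap a cue time (in seconds) to the nearest frame boundary.
    function snapToFrame(seconds, fps) {
      return Math.round(seconds * fps) / fps;
    }

    snapToFrame(12.345, 24); // 12.3333... (frame 296 at 24fps)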
> If you have fixed frame rates, it's trivial to do the conversion to and
> from SMPTE timecode in JavaScript; you don't need any direct support from
> the media element API.
Yes, but we currently have no way of knowing what fixed frame rate we
are working with, making this kind of conversion impossible except
through pure guesswork. If the frame rate is exposed, we don't need
SMPTE internally.
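For what it's worth, here is roughly what that conversion looks like
in JavaScript, assuming an integer, non-drop-frame rate (drop-frame
timecode needs extra handling); note that both directions hinge on
knowing fps, which is exactly the value that isn't exposed:

    function smpteToSeconds(tc, fps) {
      var p = tc.split(':').map(Number); // "HH:MM:SS:FF"
      return p[0] * 3600 + p[1] * 60 + p[2] + p[3] / fps;
    }

    function secondsToSmpte(t, fps) {
      var totalFrames = Math.round(t * fps);
      var ff = totalFrames % fps;
      var s = Math.floor(totalFrames / fps);
      var pad = function (n) { return (n < 10 ? '0' : '') + n; };
      return pad(Math.floor(s / 3600)) + ':' +
             pad(Math.floor(s / 60) % 60) + ':' +
             pad(s % 60) + ':' + pad(ff);
    }

    secondsToSmpte(smpteToSeconds('00:01:30:12', 24), 24); // "00:01:30:12"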