Declarative programming of TV applications using NCL
Abstract
NCL is the declarative programming language used to develop TV applications in IPTV systems and terrestrial TV, standardized by the ITU and the Brazilian DTV Forum, respectively. Its main characteristics are: the definition of temporal synchronization among media assets and viewer interactions; layout reuse facilities; support for multi-device presentation; support for embedding HTML code and scripts written in the lightweight scripting language Lua; and an API for life-cycle control (start, pause, resume, stop) and on-the-fly application modification, called NCL editing commands. This talk briefly introduces NCL, highlights its recent advances, and discusses the future of the language.
Keywords:
- Declarative Multimedia
- Digital TV
- Integrated Broadcast Broadband
- IPTV
- Ginga
- NCL
1. Introduction
The convergence of TV systems should be supported not only at the low-level network layers (internet, broadcast, cable), but also at the application layer. It requires an application programming language able to integrate and coordinate several media objects in a simple and synchronized manner. The solution can come from languages using the declarative paradigm. Nested Context Language (NCL) [5] is a declarative XML-based language initially designed for hypermedia document specification on the web. The language's flexibility, its facilities, and mainly its intrinsic ability to easily define spatiotemporal synchronization among media assets, including viewer interactions, make it an outstanding solution.
In 2007, NCL and its player, called Ginga, were adopted in the Brazilian terrestrial DTV standard, SBTVD. At the beginning of 2009, they became part of an ITU-T (ITU Telecommunication Standardization Sector) Recommendation for IPTV services [3]. NCL and Ginga were designed at the TeleMídia Laboratory (http://www.telemidia.puc-rio.br) of the Pontifical Catholic University of Rio de Janeiro (PUC-Rio), which offers an open-source Ginga implementation (http://www.github.com/TeleMidia/ginga). Independent of the distribution network where they may be applied, all Ginga and NCL specifications are open and royalty-free.
Recently, the Brazilian DTV system standards have been upgraded to address new use cases related to a deeper integration between broadcast and broadband services. Such an evolution could not disrupt the current DTV services, since Brazil and many ISDB-T countries are still in the switch-off process from analog to digital. The middleware layer is the best candidate for such an incremental (yet powerful) evolution. Accordingly, the new Ginga D-profile for receivers was created; its new features include: new media players for adaptive streaming video formats (MPEG-DASH and HLS); digital copy control through DRM; support for downloaded fonts; a new media preparation event; and video buffer control.
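As an illustration of the first of these features, the fragment below sketches how adaptive-streaming sources could be declared as ordinary <media> objects in a D-profile receiver; the ids and URLs are hypothetical and not taken from the standard.

<!-- Hypothetical sketch: adaptive-streaming sources as <media> objects -->
<media id="mainVideo" src="https://example.com/show/manifest.mpd"/>  <!-- MPEG-DASH manifest -->
<media id="altVideo" src="https://example.com/show/master.m3u8"/>    <!-- HLS playlist -->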
There are other declarative approaches to programming TV applications in other interactive TV standards, all of them based on HTML5 technologies. Hybridcast (http://www.iptvforum.jp/en/hybridcast) is standardized in Japan as the second generation of multimedia coding specifications, succeeding the first one, which was based on BML (an HTML-like language). The HbbTV (http://www.hbbtv.org) specification is based on existing web standards; it succeeds the former European MHP interactive standard and adds support for some forms of media synchronization, companion devices, and new media types. ATSC (http://www.atsc.org/standards/atsc-3-0-standards) refers to the digital television standards developed in the US; ATSC has been designed with HTML5 and IP as the key components of its IBB services.
In this paper, we focus on introducing NCL and the recent advances addressed by the D-profile receiver. These specification efforts were carried out in the Brazilian DTV Forum, in a collaborative effort led by academia. The paper is organized as follows. Section 2 introduces NCL programming. In Section 3, we discuss Ginga and the D-profile features. Finally, Section 4 presents our final remarks and future work.
2. NCL
NCL is today the standard declarative language for the specification of interactive applications in the Brazilian digital TV system [1] and in the ITU-T Recommendation for IPTV services [5]. It is used for application coding in both fixed and mobile TV receivers (Figure 1). Its main characteristics are: the definition of temporal synchronization among media assets and viewer interactions; layout reuse facilities; support for multi-device presentation; support for embedding HTML code and scripts written in the lightweight scripting language Lua; and an API for life-cycle control (start, pause, resume, stop) and on-the-fly application modification, called NCL editing commands.
NCL 1.0 was developed in 2000 [6], and the most recent version, NCL 3.0, was launched in 2006 [1]. An NCL program is an XML (eXtensible Markup Language) document that describes an interactive multimedia presentation. NCL can be seen as a glue language, which holds media objects together in a multimedia presentation: it only defines how media objects are structured and related in time and space, and does not restrict or prescribe any media object content type. It is based on NCM (Nested Context Model) [8]. The first version of this model dates from 1991, and it consists of a conceptual model for the specification and presentation of hypermedia documents. NCM combines the ideas of nodes and hypermedia links with the concept of nested contexts, which allows scope definition and a better organization of the represented information. Figure 2 presents the main entities of NCM. In an NCL application, there is a set of <media> objects and a set of synchronism <link> elements, which determine the objects' behavior during the presentation.
A <media> element may be an image, video, audio, text, imperative, or other declarative object, and so on. It is defined by its type, content location, and properties. Some special types are predefined by the language: application/x-ncl-settings, for global variables defined by the programmer or reserved to the NCL player; application/x-ncl-time, for Universal Time Coordinated (UTC); application/x-ncl-NCL, for another embedded NCL application; and imperative media object types for code written in Lua (application/x-ncl-NCLua) or Java. It is worth mentioning that NCL treats the main audiovisual streams like any other media object it can relate. Moreover, it treats an HTML document as one of its possible media object types. Therefore, NCL does not substitute HTML, but rather integrates HTML documents.
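For illustration, the fragment below sketches <media> declarations for some of these types; the ids, file names, and the property shown are merely illustrative.

<!-- Hypothetical sketch: different media object types in an NCL document -->
<media id="globalVars" type="application/x-ncl-settings">
  <property name="appState" value="idle"/>  <!-- programmer-defined global variable -->
</media>
<media id="infoPage" src="info.html"/>   <!-- HTML document -->
<media id="luaLogic" src="logic.lua"/>   <!-- NCLua imperative object -->
<media id="childApp" src="child.ncl"/>   <!-- embedded NCL application -->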
Media objects and links are grouped in contexts, which can be nested, enabling a better code organization. A <context> is a composite that contains <media>, <context>, <switch>, and <link> child elements. The <switch> element allows the definition of alternative objects, chosen at presentation time by testing rules. The <descriptor> element specifies the temporal and spatial information needed to present each media object, including its initial position on the screen (<region>).
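The fragment below sketches how these elements can be combined: a <context> groups a <switch> whose alternatives are chosen by a rule over the settings node. The rule, ids, and file names are hypothetical, and the system.language property is assumed to be declared in a settings <media> elsewhere in the body.

<!-- Hypothetical sketch: in the <head> -->
<ruleBase>
  <rule id="rPortuguese" var="system.language" comparator="eq" value="pt"/>
</ruleBase>
<!-- Hypothetical sketch: in the <body> -->
<context id="ctxIntro">
  <switch id="swLanguage">
    <bindRule constituent="audioPt" rule="rPortuguese"/>
    <defaultComponent component="audioEn"/>
    <media id="audioPt" src="intro_pt.mp3"/>
    <media id="audioEn" src="intro_en.mp3"/>
  </switch>
</context>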
The <link> element defines temporal relationships between nodes. Links are based on reusable behaviors called connectors. In NCL 3.0, the main connector is the <causalConnector>, which defines a set of conditions that must be satisfied to trigger a set of actions. The conditions act over nodes and may test: the occurrence of a media event (e.g., a presentation occurring); an assessment statement that compares node attributes; a user interaction through a specific key; or logical expressions combining these with “and” or “or” operators. Actions change object presentation (start, pause, resume, stop) and attributes.
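As an illustration, the fragment below sketches a connector parameterized by a key code and a link that reuses it to start a hypothetical menu object when the viewer presses the RED key over video1; the connector id, the parameter name, and the menu object are assumptions, not part of Listing 1.

<!-- Hypothetical sketch: in the <connectorBase> -->
<causalConnector id="onKeySelectionStart">
  <connectorParam name="keyCode"/>
  <simpleCondition role="onSelection" key="$keyCode"/>
  <simpleAction role="start"/>
</causalConnector>
<!-- Hypothetical sketch: in the <body> -->
<link xconnector="onKeySelectionStart">
  <linkParam name="keyCode" value="RED"/>
  <bind role="onSelection" component="video1"/>
  <bind role="start" component="menu"/>
</link>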
To illustrate an NCL use case, Listing 1 shows a simple application that presents a sequence of video, audio, and image. This sample application is organized in two parts: the <head>, with reusable components (the rgCenter <region>, the dCenterFade <descriptor>, and the onEndStart <causalConnector>), and the <body>, which defines the application behavior (<media> and <link> elements). Lines 18–20 define the three <media> objects video1, audio1, and img1. The visual ones use the dCenterFade descriptor to be presented at the center of the screen with a fade effect. video1 is started by the entry <port>, while the first <link> (lines 21–24) and the second <link> (lines 25–28) define the start of audio1 and img1, respectively.
1 <ncl>
2 <head>
3 <regionBase>
4 <region id="rgCenter" left="25%" top="25%" width="50%" />
5 </regionBase>
6 <descriptorBase>
7 <descriptor id="dCenterFade" region="rgCenter" transIn="fade" />
8 </descriptorBase>
9 <connectorBase>
10 <causalConnector id="onEndStart">
11 <simpleCondition role="onEnd"/>
12 <simpleAction role="start"/>
13 </causalConnector>
14 </connectorBase>
15 </head>
16 <body>
17 <port id="entry" component="video1" />
18 <media id="video1" descriptor="dCenterFade" src="v1.mp4" />
19 <media id="audio1" src="a1.mp3" />
20 <media id="img1" descriptor="dCenterFade" src="img1.png" />
21 <link xconnector="onEndStart">
22 <bind role="onEnd" component="video1"/>
23 <bind role="start" component="audio1"/>
24 </link>
25 <link xconnector="onEndStart">
26 <bind role="onEnd" component="audio1"/>
27 <bind role="start" component="img1"/>
28 </link>
29 </body>
30 </ncl>
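The transIn value in line 7 refers to a transition definition, which would be declared in a <transitionBase> inside the <head>; a minimal sketch, assuming a one-second fade, is shown below.

<!-- Hypothetical sketch: fade transition referenced by the dCenterFade descriptor -->
<transitionBase>
  <transition id="fade" type="fade" dur="1s"/>
</transitionBase>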