WO2017038065A1 - Mapping event signaling with HTML
- Publication number: WO2017038065A1 (application PCT/JP2016/003889)
- Authority: WIPO (PCT)
Classifications
- H04N21/2362: Generation or processing of Service Information [SI]
- H04N21/242: Synchronization processes, e.g. processing of PCR [Program Clock References]
- H04N21/812: Monomedia components thereof involving advertisement data
- H04N21/8456: Structuring of content by decomposing the content in the time domain, e.g. in time segments
- H04N21/8543: Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
Definitions
- the present disclosure relates generally to a system and method for mapping an event to HTML.
- digital terrestrial radio and television broadcasts, Direct Broadcast Satellite digital television (TV) networks such as DirecTV and Dish Network, and other digital broadcast infrastructures offer a cost-effective manner to reach a potentially large local target audience with digital content such as web pages, advertisements for products related to the subject of the broadcast, supplementary information giving more detail about the subject of the broadcast, etc.
- the primary limitations of such networks are the aggregate bandwidth and the maximum transmission unit (MTU) size of the broadcast network's channels.
- broadcast infrastructure does not provide a digital upstream channel for interactivity such as requesting more information about an advertisement, ordering books, songs, Digital Video Discs (DVDs) or services which are the subjects of broadcasts or for any other purpose.
- the various content types are carried on one or more sub-channels.
- Sub-channels are referred to by different names in different systems.
- the aggregate bandwidth of a channel can be provisioned across the different sub-channels and consequentially the content type can be provisioned to various channels and sub-channels.
- the term “In-Band Transmission” as used herein means the content of supplementary digital data such as a web page is broadcast in the same sub-channel as the main audio or video or captions broadcast.
- the term “Out of Band Transmission” or “out-of-band” means the broadcast of the supplementary digital data is transmitted on a different sub-channel than the main audio or video or captions transmission, or otherwise made available using an alternative network, such as the Internet.
- Digital data may be transmitted using any other digital downstream broadcast which provides additional information, ads for services or products which may or may not be related to the broadcast subject, etc. This provides an opportunity to send downstream with the broadcast any digital data which can be web pages, ads related to the broadcast, excerpts of books, video clips from movies, audio clips from songs, etc. Significantly, it provides an opportunity to send advertisements for products or services related to the broadcast subject.
- a method for determining information associated with a duration value of a media element comprising: receiving a first one or more attributes of elements; receiving a first one or more properties; determining a first duration value from the first one or more attributes of elements; determining a second duration value from the first one or more properties; determining a third duration value as a function of the first duration value and second duration value; and associating the third duration value with the duration value of a media element.
- a method for determining information associated with a media presentation end time comprising: receiving a first one or more attributes of elements; receiving a first one or more properties; determining a first begin time value from the first one or more attributes of elements; determining a second begin time value from the first one or more properties; determining a third begin time value from the first begin time value and second begin time value; determining a first duration value from the first one or more attributes of elements; determining a second duration value from the first one or more properties; determining a third duration value from the first duration value and the second duration value; determining a first end time value from the first one or more attributes of elements and the first one or more properties; determining a second end time value from the first one or more properties; determining a third end time value from the third begin time value and third duration value; determining a fourth end time value as a function of the first end time value, the second end time value, and the third end time value; and associating the fourth end time value with the media presentation end time.
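- A minimal TypeScript sketch of one possible way to combine attribute-derived and property-derived values into the resolved duration and end time described above. The precedence rule used here (attribute value over property value, end time falling back to begin plus duration) is an assumption for illustration only; the method above does not fix the combining function.
```typescript
// Illustrative precedence: an attribute-derived value, when present, takes
// precedence over a property-derived value; the end time falls back to
// begin + duration. These combining rules are assumptions for illustration.
function resolveDuration(attrDuration?: number, propDuration?: number): number | undefined {
  return attrDuration !== undefined ? attrDuration : propDuration;
}

function resolveEndTime(
  attrEnd: number | undefined,
  propEnd: number | undefined,
  begin: number,
  duration?: number
): number | undefined {
  if (attrEnd !== undefined) return attrEnd;                      // first end time value
  if (propEnd !== undefined) return propEnd;                      // second end time value
  return duration !== undefined ? begin + duration : undefined;   // third end time value
}
```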
- FIG. 1 illustrates an Audio-Video (AV) system with event messages.
- FIG. 2 illustrates a transport mechanism for event messages.
- FIG. 3 illustrates an exemplary event stream semantics.
- FIG. 4 illustrates another exemplary event stream semantics.
- FIG. 5 illustrates AV program segments and AV insert segments.
- FIG. 6 illustrates an exemplary splice_insert().
- FIG. 7 illustrates a Dynamic Adaptive Streaming over Hypertext transfer protocol (DASH) set of event messages.
- FIG. 8 illustrates a HyperText Markup Language (HTML) set of mapped event messages.
- FIG. 9 illustrates an exemplary flow in an in-band signaling.
- FIG. 10A illustrates out-of-band Moving Picture Experts Group Media Transport (MMT) Events.
- FIG. 10B illustrates another example of signaling out-of-band MMT Events.
- FIG. 11A illustrates in-band (in Media Processing Unit (MPU)) MMT Events.
- FIG. 11B illustrates another example of in-band (in MPU) MMT Events.
- FIG. 11C is an exemplary structure of the inband event descriptor.
- FIG. 12 illustrates mapping MMT Events to Moving Picture Experts Group (MPEG) Composition Information (CI).
- FIG. 13 illustrates mapping DASH and MMT Event Streams to TextTrack objects.
- FIG. 14A illustrates DASH and MMT Events reported to applications as DataCue.
- FIG. 14B illustrates another embodiment of DASH and MMT Events reported to applications as DataCue.
- FIG. 15A illustrates mapping MMT Event message fields to MPEG CI MediaSync elements.
- FIG. 15B illustrates another mapping MMT Event message fields to MPEG CI MediaSync elements.
- FIG. 16A illustrates an example message flow for Media Presentation Description (MPD) EventStream.
- FIG. 16B illustrates an example message flow for DASH Inband EventStream.
- FIG. 16C illustrates an example message flow for Application Event Information (AEI) EventStream.
- FIG. 16D illustrates an example message flow for MPU (in-band) EventStream.
- FIG. 17 illustrates logical components of a conceptual DASH client model.
- FIG. 18 depicts an end-to-end architecture.
- FIG. 19 illustrates another exemplary event stream semantics.
- FIG. 20 is another embodiment of in-band MPU (MMT) Event (evti box).
- FIG. 21A illustrates mapping of a subset of MMT Event information to MPEG CI MediaSync and view element’s properties.
- FIG. 21B illustrates mapping of a subset of MMT Event information to MPEG CI MediaSync elements properties.
- FIG. 21C illustrates mapping of a subset of MMT Event information to MPEG CI view elements properties.
- FIG. 21D illustrates an exemplary signaling of out-of-band MMT event targeting a CI MediaSync element.
- FIG. 21E illustrates an exemplary signaling of in-band MMT event targeting a CI MediaSync element.
- FIG. 22 illustrates a binarized representation of <sourceList> element.
- FIG. 23A illustrates an exemplary MMT event transmission and processing for different CI element types.
- FIG. 23B illustrates an exemplary MMT event reception and processing for different CI element types.
- FIG. 24 illustrates an exemplary mapping of @value (for AEI) and value (for evti) to the corresponding CI element type (e.g. MediaSync, view) indication.
- FIG. 25 illustrates a binarized representation of viewRole attribute of view element.
- FIG. 26 illustrates a binarized representation of a sequence of <divLocation> elements.
- FIG. 27 illustrates mapping of MMT Event information to MPEG CI divLocation element’s properties.
- FIG. 28A illustrates mapping of the id attribute of the CI document to the refId attribute of the <Event> element and the ref_id field of the evti box.
- FIG. 28B illustrates the mapping of timing attributes for CI document elements (e.g. MediaSync, view, divLocation) to the signaled values of the timing attributes for the CI elements and the signaled values of timing information for MMT events.
- FIG. 29 illustrates HTML5 documents serving as entry point and the associated CI documents.
- FIG. 30A illustrates another embodiment of MMT Events reporting to application.
- FIG. 30B illustrates the mapping of timing attributes for CI document elements to wall clock time.
- FIG. 30C illustrates another mapping of timing attributes for CI document elements to wall clock time.
- FIG. 31 is a flowchart of an exemplary mapping of timing attributes for CI document elements to a wall clock time.
- Media elements are used to present audio data, or video and audio data, to the user.
- the textTracks attribute of media elements may return an array host object for objects of type TextTrack that is fixed length and read only.
- the array contains the TextTrack objects of the text tracks in the media element's list of text tracks, in the same order as in the list of text tracks.
- the TextTrack object may contain the kind attribute which returns the text track kind of the text track that the TextTrack object represents.
- the TextTrack object may contain the label attribute which returns the text track label of the text track that the TextTrack object represents.
- the TextTrack object may contain the language attribute which returns the text track language of the text track that the TextTrack object represents.
- the TextTrack object may contain the readyState attribute which returns the numeric value corresponding to the text track readiness state of the text track that the TextTrack object represents, as may be defined by the following list:
- the TextTrack object may contain the mode attribute, on getting, which returns the numeric value corresponding to the text track mode of the text track that the TextTrack object represents, as may be defined by the following list:
- the cues attribute may return a live TextTrackCueList object that represents the subset of the text track list of cues of the text track that the TextTrack object represents whose start times occur before the earliest possible position when the script started, in text track cue order. Otherwise, it returns null. When an object is returned, the same object is returned each time.
- the activeCues attribute returns a live TextTrackCueList object that represents the subset of the text track list of cues of the text track that the TextTrack object represents whose active flag was set when the script started, in text track cue order. Otherwise, it returns null. When an object is returned, the same object is returned each time.
- the TextTrackList of media elements returns an array host object for objects of type TextTrack.
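- A minimal TypeScript sketch of reading the TextTrack attributes described above from a media element. It assumes a page containing a video element; note that in current browser implementations the mode attribute is a string keyword rather than the numeric value of older drafts.
```typescript
const video = document.querySelector("video") as HTMLVideoElement;
const tracks: TextTrackList = video.textTracks;   // fixed-length, read-only list

for (let i = 0; i < tracks.length; i++) {
  const track: TextTrack = tracks[i];
  console.log(track.kind, track.label, track.language, track.mode);

  // cues/activeCues are null unless the track's cues are exposed (e.g. mode "hidden")
  const cues = track.cues;
  if (cues) {
    for (let j = 0; j < cues.length; j++) {
      console.log(cues[j].startTime, cues[j].endTime);
    }
  }
}
```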
- ISO BMFF refers to the ISO Base Media File Format (ISO/IEC 14496-12).
- earliest_presentation_time within the context of ISO BMFF is the earliest presentation time of any content in the reference stream in the first subsegment, in the timescale indicated in the timescale field; the earliest presentation time is derived from media in access units, or parts of access units, that are not omitted by an edit list (if any).
- timescale within the context of ISO BMFF provides the timescale, in ticks per second, for the time and duration fields within this box; it is preferable that this match the timescale of the reference stream or track; for files based on this specification, that is the timescale field of the Media Header Box of the track.
- the edit list (elst) box within the context of ISO BMFF contains an explicit timeline map. Each entry defines part of the track time-line: by mapping part of the media time-line, or by indicating ‘empty’ time, or by defining a ‘dwell’, where a single time-point in the media is held for a period.
- the Segment Index box ('sidx') within the context of ISO BMFF provides a compact index of one media stream within the media segment to which it applies.
- Each Segment Index box documents how a (sub)segment is divided into one or more subsegments (which may themselves be further subdivided using Segment Index boxes).
- a subsegment may be defined as a time interval of the containing (sub)segment, and corresponds to a single range of bytes of the containing (sub)segment. The durations of all the subsegments sum to the duration of the containing (sub)segment.
- Each entry in the Segment Index box contains a reference type that indicates whether the reference points directly to the media bytes of a referenced leaf subsegment, or to a Segment Index box that describes how the referenced subsegment is further subdivided; as a result, the segment may be indexed in a ‘hierarchical’ or ‘daisy-chain’ or other form by documenting time and byte offset information for other Segment Index boxes applying to portions of the same (sub)segment.
- Each Segment Index box provides information about a single media stream of the Segment, referred to as the reference stream.
- the first Segment Index box in a segment may document the entirety of that media stream in the segment, and may precede any other Segment Index box in the segment for the same media stream. If a segment index is present for at least one media stream but not all media streams in the segment, then normally a media stream in which not every access unit is independently coded, such as video, is selected to be indexed. For any media stream for which no segment index is present, referred to as non-indexed stream, the media stream associated with the first Segment Index box in the segment serves as a reference stream in a sense that it also describes the subsegments for any non-indexed media stream.
- Segment Index boxes may be inline in the same file as the indexed media or, in some cases, in a separate file containing only indexing information.
- a Segment Index box contains a sequence of references to subsegments of the (sub)segment documented by the box. The referenced subsegments are contiguous in presentation time. Similarly, the bytes referred to by a Segment Index box are contiguous in both the media file, and the separate index segment, or in the single file if indexes are placed within the media file. The referenced size gives the count of the number of bytes in the material referenced. In the file containing the Segment Index box, the anchor point for a Segment Index box is the first byte after that box.
- the anchor point in the media file is the beginning of the top-level segment (i.e. the beginning of the segment file if each segment is stored in a separate file).
- the material in the file containing media (which may also be the file that contains the segment index boxes) starts at the indicated offset from the anchor point. If there are two files, the material in the index file starts at the anchor point, i.e. immediately following the Segment Index box.
- the Segment Index box documents the presence of Stream Access Points (SAPs).
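- A minimal sketch of the Segment Index ('sidx') information described above, modeled as TypeScript structures. Field names follow ISO/IEC 14496-12; this is a simplified data model for illustration, not a parser.
```typescript
interface SidxReference {
  referenceType: 0 | 1;          // 1 = points to another Segment Index box, 0 = media bytes
  referencedSize: number;        // byte count of the referenced material
  subsegmentDuration: number;    // in 'timescale' ticks
  startsWithSAP: boolean;        // Stream Access Point signaling
  sapType: number;
  sapDeltaTime: number;
}

interface SegmentIndexBox {
  referenceId: number;           // the reference stream this index describes
  timescale: number;             // ticks per second for times in this box
  earliestPresentationTime: number;
  firstOffset: number;           // byte offset of the first material from the anchor point
  references: SidxReference[];   // contiguous in presentation time and in bytes
}
```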
- FIG. 17 illustrates the logical components of a conceptual DASH client model.
- the DASH Access Engine receives the Media Presentation Description (MPD), constructs and issues requests and receives Segments or parts of Segments.
- the output of the DASH Access Engine consists of media in MPEG container formats (e.g. ISO/International Electrotechnical Commission (IEC) 14496-12 ISO Base Media File Format, ISO/IEC 13818-1 MPEG-2 Transport Stream), or parts thereof, together with timing information that maps the internal timing of the media to the timeline of the Media Presentation.
- the DASH Access client may also receive and extract Events that are related to the media time. The events may be processed in the DASH client or may be forwarded to an application in the execution environment of the DASH client.
- FIG. 18 depicts the end-to-end architecture.
- the MMT sending entity is responsible for sending the Packages to the MMT receiving entity as MMT protocol (MMTP) packet flows.
- the Package specifies the components comprising the media content and the relationship among them to provide necessary information for advanced delivery.
- the format of data units in this specification is defined to encapsulate the encoded media data for either storage or delivery, and to allow for easy conversion between data to be stored and data to be delivered.
- the sending entity may be required to gather contents from content providers based on the presentation information (PI) of the Package that is provided by a Package provider.
- a Package provider and Content providers may be co-located.
- Media content is provided as an Asset that is segmented into a series of encapsulated MMT Processing Units that forms a MMTP packet flow.
- the MMTP packet flow of such content is generated by using the associated transport characteristics information. Signaling messages may be used to manage the delivery and the consumption of Packages.
- An MMT receiving entity may also be referred to
- Extensible Markup Language (XML) documents include XML elements.
- XML elements are written with a start tag, with an end tag, with the content in between:
- the XML element is everything from the start tag to the end tag
- Some XML elements can be "closed" in the opening tag, e.g. <tagname />.
- Some XML elements do not have a closing tag.
- a type definition defines the structure and legal building blocks. Type definitions may apply to a XML document, XML element, XML elements attributes etc.
- a XML child element is an element that is fully contained within the content of the XML parent element.
- a XML parent element is the smallest element that fully contains the child element.
- uimsbf corresponds to unsigned integer, most significant bit first.
- bslbf corresponds to bit string with leftmost bit first.
- an application may correspond to a CI document processor.
- a broadcast system may transmit audio and/or video (“AV”) and/or captions content 100 to a receiver 110.
- the transmission may be through a network 140 and may be done in any manner, such as television broadcast, cable networks, satellite broadcasts, local networks, Internet, etc.
- Additional data such as event messages 120, may be signaled together with the AV and/or captions content 100 or separately from AV and/or captions content 100.
- the events may come from one or more servers which are different from the broadcast station.
- the event messages 120 may be signaled either using an in-band technique or an out-of-band technique.
- the term “In-Band Transmission” or “in-band” means the content of supplementary digital data such as a web page is transmitted in the same sub-channel as the main audio and/or video and/or captions transmission.
- the term “Out of Band Transmission” or “out-of-band” means the transmission of the supplementary digital data is on a different sub-channel than the main audio and/or video and/or captions transmission, or otherwise made available using an alternative network, such as the Internet.
- the event messages 120 may be used to signal any suitable type of event.
- one particular type of event that may be signaled is to provide a targeted advertisement for the particular user or type of user.
- Another particular type of event that may be signaled would provide information about an upcoming splice point, for example, corresponding to an ad avail.
- the broadcast system may use a transport mechanism 220 to carry the event messages through the network 140.
- One transport mechanism especially suitable for use with broadcast systems such as the Advanced Television Systems Committee (ATSC) standards, is Dynamic Adaptive Streaming Over HTTP (DASH) together with Real-Time Object Delivery Over Unidirectional Transport (ROUTE) 200.
- DASH supports media streaming for delivery of media content in which control lies mainly with the client. Clients may request data using a HTTP protocol from web servers.
- DASH is suitable for a media presentation that is broken into a sequence of small file segments, each containing a short playback interval of a longer program, e.g., live sports event, pre-recorded television episode, etc.
- ROUTE is a real-time object delivery protocol that is agnostic to and independent of the object’s internal structure.
- Some of the features of ROUTE include (1) a single transport protocol for linear television, non-real-time files and signaling metadata, (2) early playout of segments, (3) flexible packetization for playout timing and transport-optimized delivery, and (4) out-of-band and advanced delivery of file descriptors to enhance reliability of object recovery and to reduce signaling overhead.
- DASH is described in ISO/IEC 23009-1:2014, “Information technology -- Dynamic adaptive streaming over HTTP (DASH) -- Part 1: Media presentation description and segment formats”, incorporated by reference herein in its entirety.
- Another transport mechanism especially suitable for use with such broadcast systems is MPEG Media Transport (MMT) together with the Media Processing Unit (MPU) and MPEG Composition Information (CI) 215.
- MPU supports tools that provide formatting, delivery, and signaling.
- the formatting defines a logical structure of the media content together with the format of the data units to be processed by the MMT entity.
- the delivery defines an application layer transport protocol and a payload format.
- the signaling defines formats of signaling messages to manage delivery and consumption of media data.
- the combination of MMT/MPU provides a self-contained single media track with the sample data being provided in decoding order.
- MMT is described in ISO/IEC DIS 23008-1, “Information technology - High efficiency coding and media delivery in heterogeneous environments - Part 1: MPEG media transport (MMT)”, incorporated by reference herein in its entirety.
- MPEG CI is described in ISO/IEC DIS 23008-11, “Information technology - High efficiency coding and media delivery in heterogeneous environments - Part 11: MPEG Media Transport Composition Information”, incorporated by reference herein in its entirety.
- the ISO/IEC DIS 23008-11 standard may be further modified and published as a new reference document updating and/or including variant description of MPEG CI.
- An MPEG CI document may include a reference to a HTML document.
- the HTML document may include a reference to the CI document.
- Some elements in a CI document may reference a corresponding element in HTML document for example using id.
- An MMT CI Processing Engine may receive as input both CI and HTML document.
- the CI annotation may indicate that the presentation needs to be carried out on a secondary companion device for certain elements.
- a CI document may contain media synchronization instructions.
- a CI document may contain spatial layout instructions.
- An example CI processing may include:
- event messages included within the ROUTE/DASH transport mechanism may be referred to as DASH events.
- event messages included within the MMT/MPU transport mechanism may be referred to as MMT events.
- the events are received using the corresponding transport mechanism and may be mapped to a corresponding HyperText Markup Language (HTML) and/or MPEG CI so that the content can be managed and/or suitably rendered visually and/or audibly.
- out-of-band DASH events may be signaled using an event stream semantics.
- the events are timed, i.e., each event starts at a specific media presentation time and typically has a duration.
- Events of the same type may be clustered as an Event Stream (“EventStream”).
- This enables a DASH client to subscribe to an EventStream of interest and to ignore an EventStream that is of no relevance or interest.
- the EventStream element contains a @schemeIdUri attribute that provides a Uniform Resource Identifier (“URI”) to identify the scheme and an optional attribute @value.
- the URI identifying the scheme may be a Uniform Resource Name (“URN”) or a Uniform Resource Locator (“URL”).
- Since Event Streams contain timed events, a time scale attribute @timescale is provided to assign events to a specific media presentation time. The timed events themselves are described by the Event element, which may include @presentationTime, @duration, and @id.
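- A minimal TypeScript sketch, with illustrative field names mirroring the attributes above, of how @timescale assigns an Event's @presentationTime and @duration to the media presentation timeline.
```typescript
// Field names mirror the EventStream/Event attributes described above.
interface MpdEvent {
  presentationTime: number;  // @presentationTime, in @timescale ticks
  duration?: number;         // @duration, in @timescale ticks
  id?: number;               // @id
}

interface EventStream {
  schemeIdUri: string;       // @schemeIdUri
  value?: string;            // @value
  timescale: number;         // @timescale, ticks per second
  events: MpdEvent[];
}

// Convert an Event's tick values into seconds on the media presentation timeline.
function eventTimesInSeconds(stream: EventStream, ev: MpdEvent) {
  const start = ev.presentationTime / stream.timescale;
  const duration = ev.duration !== undefined ? ev.duration / stream.timescale : undefined;
  return { start, duration };
}
```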
- in-band DASH events may be signaled using a modified Event Stream semantics.
- the event streams may be multiplexed, if desired.
- the transmitter and the receiver may use a suitable technique for digital program insertion of different AV content.
- program segment 1 is the main broadcast program
- insert 1 is a first advertisement
- program segment 2 is a continuation of the main broadcast program
- insert 2 is a second advertisement
- program segment 3 is a continuation of the main broadcast program.
- the splice signaling may be as defined in ANSI/SCTE 35 2013 (American National Standards Institute/Society of Cable Telecommunications Engineers).
- the splice_insert() signals a splice event, namely, to signal different AV content to be inserted in the presentation between segments.
- the timing of the splice event may be specified in terms of a presentation time stamp (“PTS”). It uses two fields: the pts_adjustment field in the splice_info_section() syntax and the pts_time field in the splice_insert() syntax.
- the splice event PTS is calculated as (pts_time + pts_adjustment) mod M, where M is the PTS rollover modulus 2^33. In practice, the pts_time field often carries a relative PTS from the program start. When broadcasting the program, the multiplexer sets the current PTS offset in pts_adjustment so that the resultant event PTS is aligned with the broadcast time base.
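- A minimal TypeScript sketch of the splice event PTS calculation described above, with the 33-bit PTS rollover applied.
```typescript
// event PTS = (pts_time + pts_adjustment) mod 2^33; BigInt keeps the modular
// arithmetic exact regardless of the operand sizes.
const PTS_MODULO = 1n << 33n;

function spliceEventPts(ptsTime: bigint, ptsAdjustment: bigint): bigint {
  return (ptsTime + ptsAdjustment) % PTS_MODULO;
}

// Example: a pts_time relative to program start plus the multiplexer's current offset.
const eventPts = spliceEventPts(900000n, 123456789n);
```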
- Cue insertion data used to determine the appropriate AV content to be inserted into the video stream, such as an advertisement, may be passed as an XML Linking Language (XLink) which in turn may be passed to a resolver, such as an XLink resolver.
- XLink is an XML markup language that provides methods for creating internal and external links within XML documents and associated metadata with those links.
- XLink may be defined by “World Wide Web Consortium (W3C) XML Linking Language (XLink) Version 1.1, W3C Recommendation, 06 May 2010”, incorporated by reference herein in its entirety.
- the result returned by the XLink resolver for the cue insertion data may be any suitable response, and in the case of DASH is often one of three events: (1) MPD validity expiration; (2) MPD Patch; and (3) MPD Update.
- the MPD Patch indicates AV content to be included within the presentation.
- the MPD Update indicates replacement AV content to be included within the presentation.
- DASH events may be used to deliver content program metadata as defined in TV-Anytime, such as a parental rating information.
- TV-Anytime is described in European Telecommunications Standards Institute (ETSI) Technical Specification (TS) 102 822-1 “Broadcast and On-line Services: Search, select, and rightful use of content on personal storage systems (“TV-Anytime”)”, incorporated by reference herein in its entirety.
- out-of-band DASH events may be received that include, in part, a presentation_time_delta and a duration.
- the duration may be replaced by event_duration.
- Event 1 may start at time 00m:00s with a duration of 30m:00s (30 minutes).
- Event 2 may start at a time 30m:00s with a duration of 00m:30s (30 seconds).
- Event 3 may start at a time 30m:30s with a duration of 00m:30s (30 seconds).
- Event 4 may start at a time 31m:00s with an unspecified duration. While this set of events defines the presentation of the respective events in a time ordered manner, this set of events is not suitable for the constructs defined by HTML.
- the received in-band DASH events may be mapped to HTML using a cue list.
- the in-band DASH events may be mapped to a DataCue (TextTrackCue) property as defined in HbbTV 2.0 Specification (2015-02-02), incorporated by reference herein in its entirety.
- the duration signaled within DASH is mapped to endTime within the DataCue.
- the mapping may be performed, for example, by adding the duration to the corresponding startTime.
- the DataCue provides a structure to maintain an ordered list of AV and/or captions content to be presented. This is particularly useful to include advertisements (e.g., Ad #), within the program.
- the additional advertisements may be personalized to characteristics of the particular viewer, such as their geographic location, their interests, their gender, their profile, etc.
- the syntax video.TextTracks[0].cues[0] may be used, and preferably has its mode set to hidden.
- the “video” portion of the syntax identifies the content as being video content
- the “TextTracks[0]” portion of the syntax identifies the AV content to be presented
- the “cues[0]” portion of the syntax identifies the startTime and the endTime of the TextTrack.
- the Event 1 is mapped to Program segment 1 with a DataCue (TextTrackCue) of cues[0], with a start time of 00m:00s and an end time of 30m:00s.
- the Event 2 is mapped to Ad1 with a DataCue (TextTrackCue) of cues[1], with a start time of 30m:00s and an end time of 30m:30s.
- the Event 3 is mapped to Ad2 with a DataCue (TextTrackCue) of cues[2], with a start time of 30m:30s and an end time of 31m:00s.
- the Event 4 is mapped to Program segment 2 with a DataCue (TextTrackCue) of cues[3], with a start time of 31m:00s.
- a CueChange event may be fired at 00m:00s, 30m:00s, 30m:30s, and 31m:00s to signal the change in AV content.
- the mapping of the start time is preferably based upon a scaled presentation_time_delta plus a time offset from the start of the segment. Since the particular program segments and advertisements (other AV and/or captions content) may not start with an absolute presentation time of zero, the startTime should be modified to account for the temporal offset from the start.
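- A minimal TypeScript sketch of the cue-list construction described above, with startTime derived from the scaled presentation_time_delta plus the segment's time offset and endTime as startTime plus the scaled duration. The received event field names are illustrative, and VTTCue is used as a stand-in cue type where a DataCue constructor is not available.
```typescript
interface ReceivedDashEvent {
  presentationTimeDelta: number;   // in timescale ticks
  timescale: number;
  duration?: number;               // in timescale ticks; undefined = unspecified
  messageData: string;
}

function addEventAsCue(track: TextTrack, ev: ReceivedDashEvent,
                       segmentStartOffset: number): void {
  const startTime = ev.presentationTimeDelta / ev.timescale + segmentStartOffset;
  const endTime = ev.duration !== undefined
    ? startTime + ev.duration / ev.timescale
    : Number.MAX_VALUE;            // open-ended cue for an unspecified duration
  track.addCue(new VTTCue(startTime, endTime, ev.messageData));
}

// Usage: a hidden metadata track on the video element, as suggested above.
const video = document.querySelector("video") as HTMLVideoElement;
const eventTrack = video.addTextTrack("metadata", "dash-events");
eventTrack.mode = "hidden";
addEventAsCue(eventTrack,
  { presentationTimeDelta: 0, timescale: 1, duration: 1800, messageData: "Program segment 1" },
  0);
```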
- the DASH events may be signaled in-band or out-of band.
- a different data structure other than DataCue may be used to store the event information.
- One or more DASH Events are received by a media player with HTML elements, e.g., audio and/or video.
- the DASH Event details are mapped to the DataCue.
- an XLink with event related data is sent to an XLink resolver which resolves the data to a particular URI.
- the URI is provided to the media player.
- the media player may be a HbbTV 2.0 compliant media player.
- out-of-band MMT events may be signaled using an event stream semantics.
- the events are timed, i.e., each event starts at a specific media presentation time and typically has a duration.
- Events of the same type may be clustered as an Application Event (AE) descriptor (“AE_descriptor”), where AE refers to an application event.
- the AE_descriptor element contains a scheme_id_uri attribute that provides a Uniform Resource Identifier (“URI”) to identify the scheme and an optional attribute value.
- the URI identifying the scheme may be a Uniform Resource Name (“URN”) or a Uniform Resource Locator (“URL”).
- the AE_descriptor may include an id which is an identifier.
- the AE_descriptor may include a presentation_time indicating the start time of the content, a duration indicating the duration of the content, and data descriptive of the content.
- the AE_descriptor may also include an application_identifier().
- the application identifier() may be used to direct event messages to a particular application.
- FIG. 10B is another example of signaling out-of-band MMT Events. Compared to FIG. 10A, FIG. 10B includes the following additional fields: @mpuSeqNum, @timestamp, @timescale. The semantic of @presentationTime is changed with respect to presentation_time_delta in FIG. 10A. In another embodiment, a select combination of fields from FIG. 10A and FIG. 10B may be used.
- in-band MMT events may be signaled using modified AE_descriptor semantics, replacing AE_descriptor with event and replacing presentation_time with presentation_time_delta.
- the event streams may be multiplexed by adding the event messages as part of segments.
- the application_identifier() may be used to direct event messages to a particular application.
- FIG. 11B is another example of signaling in-band MMT Events. Compared to FIG. 11A, FIG. 11B includes the following additional fields: timescale.
- the semantic of presentation_time_delta is changed with respect to presentation_time_delta in FIG. 11A as noted in the comment field in FIG. 11B.
- a select combination of fields from FIG. 11A and FIG. 11B may be used.
- an in-band MMT event may comprise an inband event descriptor and an event signaled in the MPU.
- Events in an MMT-based service may be carried in evti boxes in MPUs.
- FIG. 11B indicates an exemplary structure of an evti box.
- the in-band MMT event information maps to an evti box.
- Such an evti box may appear at the beginning of the file, after the ftyp box, but before the moov box, or it may appear immediately before any moof box.
- the inband event descriptor may be signaled at the asset-level.
- the inband event descriptor may be signaled in the MMT Package table (MPT).
- FIG. 11C is an exemplary structure of the inband event descriptor.
- the received in-band MMT events and out-of-band MMT events may be mapped to MediaSync elements of MPEG CI.
- MediaSync elements provide temporal information of the corresponding media elements of the HTML document.
- the MediaSync element may include the following attributes: (1) refId, (2) begin, (3) dur, (4) end, (5) clipBegin, (6) clipEnd, (7) mediaSrc, (8) type, (9) href, (10) show, (11) actuate, (12) isDependent, (13) depId, (14) sourceList.
- the in-band MMT events and the out-of-band MMT events may be mapped to the MediaSync elements.
- the presentation_time_delta (or presentation_time in the case of out-of-band) signaled within MMT is mapped to “begin” within the MediaSync elements.
- the MediaSync element provides a structure to maintain an ordered list of AV and/or captions content to be presented. This is particularly useful to include advertisements (e.g., Ad #), within the program.
- the additional advertisements may be personalized to characteristics of the particular viewer, such as their geographic location, their interests, their gender, their profile, etc.
- the Event 1 is mapped to Program segment 1 with a begin time of 00m:00s, a dur of 30m:00s, and an end time of 30m:00s.
- the Event 2 is mapped to Ad1 with a begin time of 30m:00s, a dur of 00m:30s, and an end time of 30m:30s.
- the Event 3 is mapped to Ad2 with a begin time of 30m:30s, a dur of 00m:30s, and an end time of 31m:00s.
- the Event 4 is mapped to Program segment 2 with a begin time of 31m:00s.
- the mapping of the begin time is preferably based upon a presentation_time_delta (or presentation_time in the case of out-of-band) plus a mpu_presentation_time (i.e., time offset).
- the mpu_presentation_time indicates the presentation time of the first access unit in the designated MPU using a 64-bit network time protocol (“NTP”) time stamp format. Since the particular program segments and advertisements (other AV and/or captions content) may not start with an absolute presentation time of zero, the begin time should be modified to account for the temporal offset from the start. It is to be understood that the text and figures may interchangeably use underscore case notations, camel case notations, and/or syntax names preceded by an @ sign.
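- A minimal TypeScript sketch of deriving the begin time from mpu_presentation_time plus the event delta. It assumes the standard 64-bit NTP layout (32 bits of seconds, 32 bits of binary fraction) and a timescale-scaled delta; both are assumptions for illustration.
```typescript
// ntp64 packs 32 bits of seconds and 32 bits of binary fraction of a second.
function ntp64ToSeconds(ntp: bigint): number {
  const seconds = Number(ntp >> 32n);
  const fraction = Number(ntp & 0xffffffffn) / 2 ** 32;
  return seconds + fraction;
}

// begin = MPU anchor time (mpu_presentation_time) + scaled event delta.
function mediaSyncBegin(
  mpuPresentationTimeNtp: bigint,
  presentationTimeDelta: number,
  timescale: number
): number {
  return ntp64ToSeconds(mpuPresentationTimeNtp) + presentationTimeDelta / timescale;
}
```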
- the TextTrackList may include TextTracks corresponding to the DASH event streams and/or the MMT event streams.
- the different properties of the TextTrack may be included with the MPD Events (out-of-band DASH Events), the in-band DASH Events, the (out-of-band) MMT Events, and in-band MPU Events.
- the inBandMetadataTrackDispatchType attribute may return the text track in-band metadata track dispatch type of the text track that the TextTrack object represents.
- a different value of inBandMetadataTrackDispatchType may be used to identify out-of-band MMT Events and/or in-band MPU Events.
- mapping of received events may be performed in any suitable manner, and may be based upon one or more of the fields included therein.
- the mapping may be to the start times, the duration, and/or the end times of different portions of audio, video, and/or captions content. Preferable examples of such mapping are provided herein.
- the DASH events and MMT events may be reported to applications as DataCue.
- the MPD Events may be mapped as the @presentationTime (scaled according to the EventStream @timescale attribute) + the time offset of the start of the period from the start of the presentation + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- time offset calculated for time-shifted playback may be zero.
- time offset calculated for time-shifted playback may be determined based on the current wall clock time and the time included in the content being played back.
- the Inband DASH Events may be mapped as presentation_time_delta (scaled according to the timescale value) + the time offset of the start of the segment from the start of the presentation + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- time offset calculated for time-shifted playback may be zero.
- time offset calculated for time-shifted playback may be determined based on the current wall clock time and the time included in the content being played back.
- the MMT Events may be mapped as presentation_time + the time offset (e.g. mpu_presentation_time, or the presentation time in Presentation Information after application of offsets in the elst box if available) of the start of the MPU + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- the MMT Events may be mapped as presentation_time (scaled according to the EventStream @timescale value) + the time offset (e.g. mpu_presentation_time, or the presentation time in Presentation Information after application of offsets in the elst box if available) of the start of the MPU + (optional) time offset calculated for time-shifted playback.
- time offset calculated for time-shifted playback may be zero.
- time offset calculated for time-shifted playback may be determined based on the current wall clock time and the time included in the content being played back.
- the Inband MPU Events may be mapped as presentation_time_delta + the time offset (anchor time, e.g. mpu_presentation_time, or the presentation time in PI after application of offsets in the elst box if available) of the start of the MPU + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- the Inband MPU Events may be mapped as presentation_time_delta (scaled according to the timescale value) + the time offset (anchor time, e.g. mpu_presentation_time, or the presentation time in PI after application of offsets in the elst box if available) of the start of the MPU + (optional) time offset calculated for time-shifted playback.
- time offset calculated for time-shifted playback may be converted to HTML5 media timeline.
- time offset calculated for time-shifted playback may be zero.
- time offset calculated for time-shifted playback may be determined based on the current wall clock time and the time included in the content being played back.
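- A minimal consolidated TypeScript sketch of the four startTime mappings listed above, each yielding a DataCue startTime on the HTML5 media timeline; the parameter names are illustrative and the time-shift offset defaults to zero.
```typescript
function mpdEventStart(presentationTime: number, timescale: number,
                       periodStart: number, timeShift = 0): number {
  return presentationTime / timescale + periodStart + timeShift;
}

function inbandDashEventStart(presentationTimeDelta: number, timescale: number,
                              segmentStart: number, timeShift = 0): number {
  return presentationTimeDelta / timescale + segmentStart + timeShift;
}

function mmtEventStart(presentationTime: number, mpuStart: number,
                       timeShift = 0, timescale = 1): number {
  // timescale division applies only when the variant that signals a timescale is used
  return presentationTime / timescale + mpuStart + timeShift;
}

function inbandMpuEventStart(presentationTimeDelta: number, mpuAnchor: number,
                             timeShift = 0, timescale = 1): number {
  return presentationTimeDelta / timescale + mpuAnchor + timeShift;
}
```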
- an anchor time A (e.g. reference_start_time) may be signaled.
- the startTime is mapped to presentation_time_delta + the time offset (e.g. reference_start_time) of the start of MPU + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- the startTime is mapped to presentation_time_delta (scaled according to the timescale value) + the time offset (e.g. reference_start_time) of the start of MPU + (optional) time offset calculated for time-shifted playback.
- the calculated time may be converted to HTML5 media timeline.
- time offset calculated for time-shifted playback may be zero.
- time offset calculated for time-shifted playback may be determined based on the current wall clock time and the time included in the content being played back.
- the startTime calculation may, in part, be based on mpu_presentation_time of a designated MPU.
- the designated MPU may be signaled in the out-of-band MMT Event message.
- the startTime calculation may, in part, be based on presentation time signaled in Presentation Information e.g. media presentation information (MPI).
- MPI may correspond to MPD.
- MPI may correspond to an MPEG CI document.
- the presentation time may correspond to the value of the begin attribute of the corresponding MediaSync element in the MPEG CI document (e.g. identified using the value of Id).
- the presentation time may correspond to a designated MPU.
- the startTime calculation may, in part, be based on mpu_presentation_time of a designated MPU.
- the designated MPU may be inferred (e.g. the current MPU being encoded/decoded).
- the startTime calculation may, in part, be based on presentation time signaled in Presentation Information e.g. media presentation information (MPI).
- MPI may correspond to MPD.
- MPI may correspond to an MPEG CI document.
- the presentation time may correspond to the value of the begin attribute of the corresponding MediaSync element in the MPEG CI document (e.g. identified using the value of Id).
- the presentation time may correspond to a designated MPU.
- startTime is set to currentTime.
- if the calculated value of endTime of the DataCue exceeds a value, then it is set to that value.
- in an example, if the calculated endTime of the DataCue exceeds a first value, e.g. the end attribute value of the corresponding MediaSync element in the MPEG CI document (e.g. identified using the value of Id), then it is set to a second value.
- the second value may be the same as the first value.
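- A minimal TypeScript sketch of the clamping rule above, where the second value is taken to be the same as the first (e.g. the end attribute of the corresponding MediaSync element).
```typescript
function clampEndTime(calculatedEndTime: number, limit: number): number {
  // If the calculated DataCue endTime exceeds the limiting value, set it to that value.
  return calculatedEndTime > limit ? limit : calculatedEndTime;
}
```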
- FIG. 14B illustrates another embodiment of how DASH events and MMT events may be reported to applications as DataCue.
- the cuechange event of the TextTrack object may be fired according to the “time marches on” steps of HTML5, as defined in the W3C Recommendation of 28 October 2014, “HTML5: A vocabulary and associated Application Program Interfaces (APIs) for HTML and Extensible HyperText Markup Language (XHTML)”, incorporated by reference herein in its entirety. This allows the possibility of “missed cues” (cues that start and end between successive iterations of the technique). For these cues a cuechange event will be fired but the cue will not be available in the activeCues TextTrackCueList when the handler is called.
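- A minimal TypeScript sketch of a cuechange handler that also detects missed cues, i.e. cues that start and end between successive iterations of “time marches on” and therefore never appear in activeCues. The chosen track index is an assumption for illustration.
```typescript
const videoEl = document.querySelector("video") as HTMLVideoElement;
const metaTrack = videoEl.textTracks[0];   // assumed: the metadata track carrying events
const handled = new Set<TextTrackCue>();

metaTrack.addEventListener("cuechange", () => {
  const active = metaTrack.activeCues;
  if (active) {
    for (let i = 0; i < active.length; i++) handled.add(active[i]);
  }
  // Missed cues: already ended but never observed in activeCues.
  const all = metaTrack.cues;
  if (all) {
    for (let i = 0; i < all.length; i++) {
      const cue = all[i];
      if (!handled.has(cue) && cue.endTime <= videoEl.currentTime) {
        handled.add(cue);
        // handle the missed event payload here
      }
    }
  }
});
```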
- the cues attribute of the TextTrack may be populated as follows: For an MPD EventStream, the cues attribute may contain cues representing the complete list of DASH Events currently defined for that EventStream. If the MPD is dynamic, the list may be updated if the list of Events changes following an MPD update. An example message flow is illustrated in FIG. 16A. For an InbandEventStream, the cues attribute may contain cues representing at least the DASH Events whose start times are less than or equal to the current playback position and whose end times are greater than the current playback position.
- past cues may be retained in the cue list at least until the completion of the next iteration of “time marches on” that occurs after the end time of the cue.
- the cue list may also contain additional past or future Events which the terminal has acquired.
- An example message flow is illustrated in FIG. 16B.
- For an AEI EventStream, the cues attribute may contain cues representing the complete list of MMT Events currently defined for that EventStream. If the AEI message is replaced/updated by receipt of a later AEI with the same id, then the list may be replaced/updated. It is noted that the AEI EventStream corresponds to a set of Events received in AEI.
- An example message flow is illustrated in FIG. 16C.
- For an MPU (in-band) EventStream, the cues attribute may contain cues representing at least the MMT Events whose start times are less than or equal to the current playback position and whose end times are greater than the current playback position.
- past cues may be retained in the cue list at least until the completion of the next iteration of “time marches on” that occurs after the end time of the cue.
- the cue list may also contain additional past or future Events which the terminal has acquired. It is noted that the MPU EventStream corresponds to a set of Events received in MPUs. An example message flow is illustrated in FIG. 16D
- the AEI may be signaled as a descriptor, e.g. in the MMT Package (MP) table or in the Media Presentation Information (MPI). The AEI may be received in MMT signaling.
- when delivered via broadcast in an MMT-based system, Events may be delivered in an XML document called an Application Event Information (AEI) document.
- when an AEI is delivered via broadcast, it may be delivered in the service layer signaling (SLS) for the service.
- inband MPU Events may be received as a descriptor in MP table.
- cues may correspond to DASH periods, MPUs, captions/subtitle paragraphs.
- events may replace existing cues in cue list.
- the MMT events and MPU Events, and subsets thereof, may be reported to applications as a MediaSync property (attribute value).
- the href attribute may be used to map events which are not suitable to be directly mapped to MediaSync attributes. In this case, a URL may be created based upon such event fields.
- the value for begin attribute of MediaSync may be calculated in any of the different options described for calculating the startTime attribute value of DataCue object.
- the value for end attribute of MediaSync may be calculated in any of the different options described for calculating the endTime attribute value of DataCue object.
- the clipBegin may be set to a pre-determined value e.g. 0.
- the clipBegin may be determined based on a corresponding mpu_presentation_time of a designated MPU.
- the clipEnd may be set to a pre-determined value e.g. Indefinite.
- the clipEnd may be determined based on a corresponding mpu_presentation_time of a designated MPU.
- mediaSrc may be a URI. In an example, mediaSrc may be a URN.
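- A minimal TypeScript sketch of filling MediaSync-style properties from a received MMT event, with clipBegin defaulting to 0, clipEnd to indefinite, and begin/end computed as described for the DataCue startTime/endTime. The attribute names follow the MediaSync list above; the input event fields are illustrative assumptions.
```typescript
interface MediaSyncProps {
  refId: string;
  begin: number;
  dur?: number;
  end?: number;
  clipBegin: number;
  clipEnd: number | "indefinite";
  mediaSrc?: string;    // URI or URN
  href?: string;        // fallback for fields with no direct MediaSync mapping
}

function mapMmtEventToMediaSync(
  ev: { id: string; begin: number; duration?: number; uri?: string }
): MediaSyncProps {
  return {
    refId: ev.id,
    begin: ev.begin,
    dur: ev.duration,
    end: ev.duration !== undefined ? ev.begin + ev.duration : undefined,
    clipBegin: 0,                 // pre-determined value, as described above
    clipEnd: "indefinite",        // pre-determined value, as described above
    mediaSrc: ev.uri,
  };
}
```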
- FIG. 15B illustrates another embodiment of how the MMT events and MPU Events, and subsets thereof, may be reported to applications as a MediaSync property (attribute value).
- a reference to the CI document may be included in the HTML5 document. Spatial information of the media elements may be used to present the corresponding media data (such as the MMT Assets).
- the body element of the HTML5 document may include more than one div element, when the presentation is meant to be consumed on multiple screens.
- the CI document defines in a hierarchical structure the concepts of Area and View.
- An Area represents a spatial region defined by div element of HTML5 which may relate to one or more HTML5 media elements.
- a View represents a set of Areas to be consumed on a single screen.
- the view element provides temporal information about the spatial changes of a View and its Areas.
- the viewRole attribute indicates the role of the view element. Several view elements can be contained in one CI document, and this attribute can be used to distinguish each of them. The following table defines the meaning of the possible values for the viewRole attribute.
- Event messages may include information that indicates whether they apply to a MediaSync element or to a view element, e.g. by way of an appropriate id.
- the value for attributes of view element may be assigned/derived using any of the options listed for the corresponding attribute of MediaSync element.
- the value of the viewRole attribute may be signaled, e.g. in an event message or by some other means defined within the system. In another example, the value of the viewRole attribute may be derived, e.g. based on video essences present in the corresponding channel.
- Event messages may carry a syntax element, say refId, which has the same value as the id attribute of an element (e.g. MediaSync, view, CI) in the corresponding CI document.
- Event messages may carry a syntax element, say refId, which has the same value as the refId attribute of an element (e.g. MediaSync) in the corresponding CI document.
- Event messages may carry a syntax element, say refId, which has the same value as the refDiv attribute of an element (e.g. divLocation) in the corresponding CI document.
- a sourceList may correspond to a list of assets to be presented in sequence.
- a sourceList element provides one or more mediaSrc elements, which should be considered as alternative sources for the same component.
- a receiver should play only one mediaSrc element out of a sourceList for playback.
- An example sourceListType type definition may include the following rules:
- mediaSrcType type definition may enable inclusion of following attributes: mimeType, isDependent, depId.
- the componentId provides a unique identifier for the media component to which the media source list pertains.
- a video component of a service is identified as the “mainVideo” component.
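The selection rule above (exactly one mediaSrc alternative of a sourceList is played) can be sketched as follows. The XML shape and the componentId/mimeType attribute names are taken from the description above, but the concrete schema and the capability-based selection criterion are assumptions.

```python
# Minimal sketch of the "play only one mediaSrc out of a sourceList" rule.
# The XML shape and attribute names (componentId, mimeType) follow the
# description above but are assumptions, not a normative schema.
import xml.etree.ElementTree as ET

SOURCE_LIST = """
<sourceList componentId="mainVideo">
  <mediaSrc mimeType="video/mp4">urn:example:asset:video-hevc</mediaSrc>
  <mediaSrc mimeType="video/mp2t">urn:example:asset:video-avc</mediaSrc>
</sourceList>
"""

SUPPORTED_MIME_TYPES = {"video/mp4"}  # receiver capability (example only)

def select_media_src(source_list_xml, supported):
    """Return the URI of the single mediaSrc alternative chosen for playback."""
    root = ET.fromstring(source_list_xml)
    for media_src in root.findall("mediaSrc"):
        if media_src.get("mimeType") in supported:
            return media_src.text.strip()
    return None  # no playable alternative

print(select_media_src(SOURCE_LIST, SUPPORTED_MIME_TYPES))
```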
- An example xsd:ID type definition may include the following rules:
- An example refIdType type definition may be based on xsd:ID.
- An example xsd:token type definition may include the following rules:
- An example viewRoleType type definition may include the following rules:
- divLocationType type definition may enable inclusion of following attributes: begin, end, dur, refDiv, plungeIn, plungeOut.
- refDiv may conform to refDivType type definition.
- An example refDivType type definition may be based on xsd:ID.
- out-of-band events may be signaled using an event stream semantics.
- the events are timed, i.e., each event starts at a specific media presentation time and typically has a duration.
- Events of the same type may be clustered as an Event Stream (“EventStream”). This enables a client to subscribe to an EventStream of interest and to ignore an EventStream that is of no relevance or interest.
- the EventStream element contains a @schemeIdUri attribute that provides a Uniform Resource Identifier (“URI”) to identify the scheme and an optional attribute @value.
- the URI identifying the scheme may be a Uniform Resource Name (“URN”) or a Uniform Resource Locator (“URL”).
- since Event Streams contain timed events, a time scale attribute @timescale is also provided to assign events to a specific media presentation time.
- the timed events themselves are described by the Event element, which may include @presentationTime, @duration, @id, and @refId.
- @xxx corresponds to content for tag xxx.
- a field_name of evti box corresponds to the value of field_name.
- the @duration attribute of Event is required to be signaled i.e. it is not optional (since the default value is unknown).
- the end time of Event may then be determined as the sum of the @presentationTime value and the @duration value.
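As a worked example of the end-time rule, the sketch below derives the start and end of an Event in media presentation time from @presentationTime and the (mandatory) @duration, both expressed in @timescale ticks; the attribute-to-parameter mapping is illustrative.

```python
# Sketch of the end-time rule above: with @duration mandatory, the event end
# time follows directly from @presentationTime + @duration, both expressed in
# @timescale ticks of the EventStream.
def event_end_time_seconds(presentation_time, duration, timescale=1):
    """Return (start, end) of an Event in seconds of media presentation time."""
    start = presentation_time / timescale
    end = (presentation_time + duration) / timescale
    return start, end

# e.g. @timescale="90000", @presentationTime="450000", @duration="900000"
print(event_end_time_seconds(450000, 900000, 90000))  # (5.0, 15.0)
```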
- FIG. 21A and FIG. 21B illustrate another embodiment of how the out-of-band MMT events and in-band MMT Events (corresponding to evti box), and subsets thereof, may be reported to applications (or CI processor) as a MediaSync or view property (e.g. attribute value, child element).
- FIG. 21A lists the mapping for properties that exist in both MediaSync and view elements.
- FIG. 21B lists the mapping for properties that exist in the MediaSync element. Note that in case of AEI based eventing, the refId and sourceList property of MediaSync element may correspond to properties (e.g. attribute, content) of <Event> element.
- as listed in FIG. 21B, the content of <Event> element may be in XML format and may include a <sourceList> element corresponding to the sourceList child element of MediaSync.
- the refId attribute for the <Event> element corresponds to the refId attribute of MediaSync element.
- the refId attribute of MediaSync element corresponds to information carried in ref_id field in evti box within MPU.
- the sourceList property in MediaSync element corresponds to information carried by event_data[] field in evti box within MPU.
- the event_data[] field in evti box within MPU may be in XML format and may include a <sourceList> element.
- the <sourceList> element in the event_data[] corresponds to the sourceList child element of MediaSync.
- the <sourceList> element within event_data carried in MPU(s) may be in binary format.
- FIG. 22 shows an example binarized representation source_list() of the <sourceList> element.
- the location of the binarized representation source_list() in MPU carrying the event may be fixed.
- a binarized representation of <sourceList> may be carried in any suitable location e.g. encoded in video essence.
- the tags of XML elements in event_data[] may be preceded by a namespace qualifier e.g. mmtci.
- the tags of XML elements in content of <Event> element and corresponding to CI elements may be preceded by a namespace qualifier e.g. mmtci.
- FIG. 21A and FIG. 21C illustrate another embodiment of how the out-of-band MMT events and in-band MMT Events (corresponding to evti box), and subsets thereof, may be reported to applications (or CI processor) as a MediaSync or view property (e.g. attribute value, child element).
- FIG. 21A lists the mapping for properties that exist in both MediaSync and view elements.
- FIG. 21C lists the mapping for properties that exist in the view element. Note that in case of AEI based eventing, the id, viewRole and divLocation property of view element may correspond to properties (e.g. attribute, content) of <Event> element.
- as listed in FIG. 21C, the content of <Event> element may be in XML format and may include a sequence of <divLocation> elements corresponding to the divLocation child elements of view.
- the content of <Event> element may be in XML format and may include a <viewRole> element corresponding to the viewRole attribute of the CI view element.
- the refId attribute for the <Event> element corresponds to the id attribute of view element.
- the id attribute of view element corresponds to information carried in ref_id field in evti box within MPU.
- the viewRole and divLocation (sequence) properties in view element correspond to information carried by event_data[] field in evti box within MPU.
- the event_data[] field in evti box within MPU may be in XML format and may include a <viewRole> element and a sequence of <divLocation> elements.
- the content of <viewRole> element in event_data[] corresponds to the viewRole attribute of view element and the sequence of <divLocation> elements in the event_data[] corresponds to the sequence of divLocation child elements of view.
- the <viewRole> element within event_data carried in MPU(s) may be in binary format.
- FIG. 25 shows an example binarized representation view_role() of the <viewRole> element. In an example, the location of the binarized representation view_role() in MPU carrying the event may be fixed. In another example the sequence of <divLocation> elements within event_data carried in MPU(s) may be in binary format.
- FIG. 26 shows an example binarized representation div_location_list() of the sequence of <divLocation> elements. In an example, the location of the binarized representation div_location_list() in MPU carrying the event may be fixed.
- a binarized representation of <divLocation> may be carried in any suitable location e.g. encoded in video essence.
- the tags of XML elements in event_data[] may be preceded by a namespace qualifier e.g. mmtci.
- the tags of XML elements in content of <Event> element and corresponding to CI elements may be preceded by a namespace qualifier e.g. mmtci.
- the out-of-band MMT events may signal a particular scheme (@schemeIdURI) value e.g. “urn:atsc:mmt:event:2015”.
- the @value of an out-of-band MMT event may be used to signal an event type e.g. whether the event is meant for a MediaSync CI element or a view CI element.
- the in-band MMT events may signal a particular scheme (scheme_id_uri) value e.g. “urn:atsc:mmt:event:2015”.
- the value field of an in-band MMT event may be used to signal an event type e.g. whether the event is meant for a MediaSync CI element or a view CI element.
- FIG. 23A illustrates an exemplary MMT event transmission and processing for different CI element types.
- FIG. 23B illustrates an exemplary MMT event reception and processing for different CI element types.
- FIG. 24 illustrates an exemplary mapping of @value (for EventStream of AEI) and value field (for evti) to the corresponding CI element type (e.g. MediaSync, view) indication.
- a mapping of event information to corresponding CI element may be determined based on @value (for EventStream of AEI) and value (for evti). This mapping would be used during transmission as well as reception.
- in an example, events may be defined only for the MediaSync CI element; in such an example only a subset of the @value and/or value entries in FIG. 24 may be defined.
- a transmitter using FIG. 23A may skip over determining whether the MMT event is for view CI element and the corresponding processing step of the YES path.
- a receiver using FIG. 23B may skip over determining whether the MMT event is for view CI element and the corresponding processing step of the YES path.
- the scheme for out-of-band and in-band MMT events may take on different values. In such an event the @value (for EventStream of AEI) and/or value field (of evti) signaling may be skipped and the targeted CI element (e.g. MediaSync, view) may be inferred from scheme value.
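A receiver-side sketch of this dispatch step is given below: the targeted CI element type is looked up from the event scheme and value, and EventStreams with other schemes are ignored. The scheme URN is the example value quoted above; the value-to-element assignment is hypothetical, since the actual table is defined by FIG. 24.

```python
# Receiver-side sketch of the FIG. 23B / FIG. 24 idea: the targeted CI element
# type is looked up from the event's scheme and value. The scheme URN below is
# the example given in the text; the value-to-element mapping is hypothetical.
MMT_EVENT_SCHEME = "urn:atsc:mmt:event:2015"
VALUE_TO_CI_ELEMENT = {"1": "MediaSync", "2": "view"}  # hypothetical mapping

def target_ci_element(scheme_id_uri, value):
    """Map an out-of-band (@schemeIdUri/@value) or in-band (scheme_id_uri/value)
    MMT event to the CI element type it targets, or None if not an MMT event."""
    if scheme_id_uri != MMT_EVENT_SCHEME:
        return None  # EventStream of no interest: ignore
    return VALUE_TO_CI_ELEMENT.get(value)

print(target_ci_element("urn:atsc:mmt:event:2015", "1"))  # MediaSync
print(target_ci_element("urn:atsc:mmt:event:2015", "2"))  # view
```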
- @refId (for EventStream of AEI) and/or ref_id (of evti) may be absorbed into the content of <Event> element (for EventStream of AEI) and/or event_data[] field (for evti).
- FIG. 21D illustrates how an out-of-band MMT event targeting a CI MediaSync element may be signaled. It is based on the mapping outlined in FIG. 24, FIG. 21A, FIG. 21B. As shown, the <Event> element contains a refId attribute which corresponds to a refId attribute of CI MediaSync element.
- the sourceList child element of the CI MediaSync element is signaled as <mmtci:sourceList> element within the content of <Event> element.
- FIG. 21E illustrates how an in-band MMT event targeting a CI MediaSync element may be signaled. It is based on the mapping outlined in FIG. 24, FIG. 21A, FIG. 21B. As shown, the event_data[] contains <sourceList> which corresponds to the sourceList child element of the CI MediaSync element. The information in event_data[] may use any suitable character encoding e.g. UTF-8 (8-bit Unicode Transformation Format).
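The following sketch parses an out-of-band <Event> shaped along the lines of FIG. 21D: the refId attribute maps to the refId of a CI MediaSync element and the content carries an <mmtci:sourceList>. The namespace URI and the sample payload are assumptions for illustration only.

```python
# Sketch of an out-of-band MMT event targeting a CI MediaSync element in the
# FIG. 21D style: @refId of <Event> maps to refId of MediaSync, and the event
# content carries an <mmtci:sourceList>. The namespace URI is hypothetical.
import xml.etree.ElementTree as ET

MMTCI_NS = "urn:example:mmtci"  # assumed namespace URI for the mmtci prefix

EVENT = """
<Event xmlns:mmtci="urn:example:mmtci" presentationTime="0" duration="3600" refId="ms1">
  <mmtci:sourceList componentId="mainVideo">
    <mmtci:mediaSrc mimeType="video/mp4">urn:example:asset:replacement</mmtci:mediaSrc>
  </mmtci:sourceList>
</Event>
"""

event = ET.fromstring(EVENT)
media_sync_ref_id = event.get("refId")                   # -> refId attribute of MediaSync
source_list = event.find(f"{{{MMTCI_NS}}}sourceList")    # -> sourceList child of MediaSync
print(media_sync_ref_id, source_list.get("componentId"))
```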
- the content of <Event> element may include application specific data in addition to pre-defined data (for e.g. as shown in FIG. 21D, FIG. 21E).
- the application specific data may be passed on to appropriate application which can then interpret the specific data and may carry out some specific action.
- the elements with tags that are not from a pre-defined set may correspond to application specific data and are passed to the appropriate application.
- the entire content of <Event> element may be passed to the appropriate application.
- the value of event_data[] may include application specific data in addition to pre-defined data.
- the application specific data may be passed on to appropriate application which can then interpret the specific data and may carry out some specific action.
- when the value of event_data[] is XML formatted then the elements with tags that are not from a pre-defined set may correspond to application specific data and are passed to the appropriate application.
- the value of event_data may be passed to the appropriate application.
- the CI element may be created by the receiver with that refId (for Event of EventStream of AEI) and/or ref_id (for evti) for the CI document.
- a CI MediaSync element may be created by the receiver with attribute refId equal to the received event refId (for Event of EventStream of AEI) and/or ref_id (for evti) for the CI document.
- a CI view element may be created by the receiver with attribute id equal to the received event refId/ref_id for the CI document.
- An example xsd:positiveInteger type definition may include the following rules:
- An example plungeInType type definition may be xsd:positiveInteger.
- An example plungeOutType type definition may include the following rules:
- FIG. 27 illustrates another embodiment of how the out-of-band MMT events and in-band MMT Events (corresponding to evti box), and subsets thereof, may be reported to applications (or CI processor) as a divLocation property (e.g. attribute value, child element).
- the refDiv, begin, dur, end, plungeIn, plungeOut property of divLocation element may correspond to properties (e.g. attribute, content) of <Event> element.
- the content of <Event> element may be in XML format and may include a <plungeIn> element the content of which corresponds to the plungeIn attribute of the divLocation elements under consideration.
- the content of <Event> element may be in XML format and may include a <plungeOut> element the content of which corresponds to the plungeOut attribute of the divLocation elements under consideration.
- the refId attribute for the <Event> element corresponds to the refDiv attribute of divLocation element.
- the refDiv attribute of divLocation element corresponds to information carried in ref_id field in evti box within MPU.
- the plungeIn, plungeOut (sequence) properties in divLocation element correspond to information carried by event_data[] field in evti box within MPU.
- the event_data[] field in evti box within MPU may be in XML format and may include a <plungeIn> element and a <plungeOut> element.
- the content of <plungeIn> element in event_data[] corresponds to the plungeIn attribute of divLocation element and the content of <plungeOut> element in the event_data[] corresponds to the plungeOut attribute of divLocation element.
- a pre-determined (e.g. 4) @value (of EventStream of AEI) may be used to signal that the out-of-band event corresponds to a CI divLocation element.
- a pre-determined (e.g. 4) value field (of evti) may be used to signal that the in-band event corresponds to a CI divLocation element.
- the tags of XML elements in event_data[] may be preceded by a namespace qualifier e.g. mmtci.
- the tags of XML elements in content of <Event> element and corresponding to CI elements may be preceded by a namespace qualifier e.g. mmtci.
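A small parsing sketch for a divLocation-targeted event is shown below: ref_id selects the divLocation (via refDiv), and <plungeIn>/<plungeOut> elements in an XML-formatted event_data[] supply the new attribute values. The wrapper element and payload layout are assumptions.

```python
# Sketch of how an XML-formatted event_data[] targeting a CI divLocation
# element could be interpreted: ref_id selects the divLocation (via refDiv),
# and <plungeIn>/<plungeOut> carry the new attribute values. Tag names follow
# the description above; the exact payload layout is an assumption.
import xml.etree.ElementTree as ET

EVENT_DATA = "<eventData><plungeIn>2</plungeIn><plungeOut>1</plungeOut></eventData>"

def div_location_update(ref_id, event_data_xml):
    root = ET.fromstring(event_data_xml)
    return {
        "refDiv": ref_id,                          # from ref_id field of evti box
        "plungeIn": root.findtext("plungeIn"),     # -> plungeIn attribute
        "plungeOut": root.findtext("plungeOut"),   # -> plungeOut attribute
    }

print(div_location_update("pipArea", EVENT_DATA))
```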
- FIG. 28A and FIG. 28B illustrate another embodiment of how the out-of-band MMT events and in-band MMT Events (corresponding to evti box), and subsets thereof, may be reported to applications (or CI processor) when the event information includes a CI document.
- a CI document may be included in the content of an <Event> element.
- the id attribute of the CI document would be signaled in the refId attribute of <Event> element; the corresponding mapping is shown in FIG. 28A.
- the id attribute of the CI document would be signaled as an attribute of the CI element included in the CI document included in the content of <Event> element.
- FIG. 28B illustrates the relationship between the timing information sent for the <AEI> element (e.g. @timestamp of AEI; @timescale of EventStream; @presentationTime and @duration of Event) and the timing attributes (e.g. begin, end, dur) of the CI elements.
- the timing attributes for the CI elements are updated, for e.g. as shown in FIG. 28B, based on their signaled value and the signaled value for <Event> (e.g. @timestamp of AEI; @timescale of EventStream; @presentationTime and @duration of Event). This updated timing information may undergo further transformation to a suitable format e.g. UTC, NTP.
- a CI document may be included in the content of an event_data[] field of evti box.
- the id attribute of the CI document would be signaled in the ref_id field of evti box shown in FIG. 20; the corresponding mapping is shown in FIG. 28A.
- the id attribute of the CI document would be signaled as an attribute of the CI element included in the CI document included in the event_data field of evti box.
- FIG. 28B illustrates the relationship between the timing information sent for the evti box (e.g. timescale field, event_presentation_time_delta field, event_duration field, earliest presentation time of current MPU) and the timing attributes (e.g. begin, end, dur) of the CI elements (e.g. MediaSync, view, divLocation).
- the timing attributes for the CI elements are updated, for e.g. as shown in FIG. 28B, based on their signaled value and the signaled value of the timing information corresponding to the evti box. This updated timing information may undergo further transformation to a suitable format e.g. UTC, NTP.
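An assumption-laden sketch of this FIG. 28B-style update for in-band (evti) events follows: the CI begin/end offsets are re-based using the earliest presentation time of the MPU as anchor (A), the scaled event_presentation_time_delta as presentation time offset (P), and the CI offsets (D), in line with the A + P + D scheme described further below.

```python
# Illustrative sketch of the FIG. 28B-style update for in-band (evti) events:
# the CI begin/end offsets are re-based onto a wall-clock-like time line using
# the evti timing fields. The exact combination is defined by FIG. 28B; this
# follows the anchor (A) + presentation-time offset (P) + CI offset (D) scheme.
def update_ci_timing(earliest_mpu_time,                       # anchor A, in time base T (seconds)
                     event_presentation_time_delta, timescale,  # evti fields
                     ci_begin_offset, ci_dur=None):             # D values (seconds)
    p = event_presentation_time_delta / timescale    # offset P in time base T
    begin = earliest_mpu_time + p + ci_begin_offset  # A + P + D
    end = begin + ci_dur if ci_dur is not None else float("inf")  # dur absent -> indefinite
    return begin, end

print(update_ci_timing(1_000_000.0, 90_000, 90_000, ci_begin_offset=5.0, ci_dur=30.0))
```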
- a pre-determined (e.g. 3) @value may be used to signal that the out-of-band event corresponds to a CI document (or CI element).
- a pre-determined (e.g. 3) value (evti) may be used to signal the in-band event corresponds to a CI document (or CI element).
- the tags of XML elements in event_data[] may be preceded by a namespace qualifier e.g. mmtci.
- the tags of XML elements in content of <Event> element and corresponding to CI elements may be preceded by a namespace qualifier e.g. mmtci.
- the CI document may be defined with several elements referring to the elements defined in the HTML5 document.
- FIG. 29 shows how several HTML5 documents and CI documents may be emitted and/or received and the references from CI document to HTML5 document.
- a HTML5 document serves as an entry point for the presentation and the presentation of HTML5 document may be effectively modified by the associated CI documents.
- HTML5 document A serves as the entry point for CI documents B, C, D; and HTML5 document 0 serves as the entry point for CI documents 1, 2.
- a CI document may contain references to more than one HTML5 document i.e. the instructions included in the CI document impact elements (presentation) in more than one HTML5 document.
- only one HTML5 document, e.g. the one received later, may be active at the receiver at any given time.
- the instructions listed in CI documents B, C, D cannot be executed without receiving HTML5 document A which serves as an entry point for presentation.
- the instructions listed in CI documents 1, 2 cannot be executed without receiving HTML5 document 0.
- a receiver may skip processing of CI documents for which it has not received the corresponding HTML5 document, that serves as an entry point for presentation.
- FIG. 30A illustrates another embodiment of how the out-of-band MMT events (AEI) and in-band MMT Events (corresponding to evti box), and subsets thereof, may be reported to applications (or CI processor).
- @value attribute (of EventStream) may be set to a pre-determined value (e.g. 1) to indicate that the event information (e.g. content of <Event> element) contains a HTML5 document that corresponds to the entry point of a presentation.
- a CI document may also optionally be present within the event information (e.g. content of <Event> element).
- value field may be set to a pre-determined value (e.g. “1”) to indicate that the event information (e.g. within event_data[] field) contains a HTML5 document that corresponds to the entry point of a presentation.
- a CI document may also optionally be present within the event information (e.g. within event_data[] field).
- when an HTML5 document and a CI document are both included in the event information (e.g. content of <Event> element, within event_data[] field) then they may be included in a fashion that allows a receiver to identify them and separate them into different documents; for e.g. the HTML5 document and CI document may be included sequentially, or CI documents would be included inside a <CI> element or <mmtci:CI> element within the HTML5 document.
- multiple HTML5, multiple CI document may be included in the event information.
- a receiver that tunes into a channel may skip processing of events which correspond to CI documents only until it receives an event with an entry point to a presentation.
- @value attribute (of EventStream) may be set to a pre-determined value (e.g. 2) to indicate the event information (e.g. content of <Event> element) contains a CI document.
- value field may be set to a pre-determined value (e.g. “2”) to indicate the event information (e.g. within event_data[] field) contains a CI document.
- scheme value listed in FIG. 30A “urn:atsc:mmt:event:xxxx” may be replaced by any suitable value.
- the value of begin, dur, end attributes of elements within a CI document may be required to be media time i.e. offset based timing.
- Offset based time may be specified as offsets relative to an anchor time. Offset based time may be a positive or negative offset. Offset based time may be specified in pre-determined units of time (e.g. hours, minutes, seconds, milliseconds). Offset based time may be specified in a pre-determined time format.
- the value of begin, dur, end attributes of elements within a CI document may be required to be media time i.e. offset based timing for specific values of @value attribute of EventStream of AEI and/or value field of evti box.
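A small parser for offset-based time values is sketched below. The accepted unit suffixes (h, min, s, ms) and string forms are illustrative assumptions; the normative format is whatever the system pre-determines.

```python
# Sketch of offset-based timing as described above: values are offsets from an
# anchor time, may be negative, and may use unit suffixes. The accepted formats
# here (e.g. "10s", "-500ms", "1.5min") are illustrative assumptions only.
_UNITS = {"h": 3600.0, "min": 60.0, "s": 1.0, "ms": 0.001}

def parse_offset(value):
    """Parse an offset-based time string into signed seconds."""
    for suffix in sorted(_UNITS, key=len, reverse=True):  # try "min"/"ms" before "s"
        if value.endswith(suffix):
            return float(value[: -len(suffix)]) * _UNITS[suffix]
    return float(value)  # unitless: assume seconds

print(parse_offset("10s"), parse_offset("-500ms"), parse_offset("1.5min"))
```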
- FIG. 30B shows an example mapping for the timing attributes (e.g. begin, dur, end attributes) of elements within a CI document received in content of <Event> element or event_data[] of evti box.
- this mapping may be used to derive common time base (e.g. wall-clock, like NTP, UTC) time values of the timing attributes prior to consumption by, for e.g., a broadcaster application or a CI processor.
- FIG. 30C shows another example mapping for the timing attributes (e.g. begin, dur, end attributes) of elements within a CI document received in content of <Event> element or event_data[] of evti box.
- this mapping may be used to derive common time base (e.g. wall-clock, like NTP, UTC) time values of the timing attributes prior to consumption by, for e.g., a broadcaster application or a CI processor.
- the smaller of the derived duration values (CI attribute, Event attribute) is effective.
- the effective end value of the CI attribute is based on the effective duration.
- FIG. 30B or FIG. 30C is used only when a CI document is received as indicated in FIG. 30A.
- when values are mapped to the common time base (e.g. wall-clock time), the resulting value may roll over.
- An example roll-over operation may be described as follows: if maximum value that can be accommodated is MV and the derived value is DV where DV>MV, then the derived value is set to DV % (MV+1).
- for AEI, end2 is calculated as the sum of the new value of begin attribute (set in the table, two rows above) and the received (or inferred) value of dur attribute. When dur attribute is not received then it is inferred to be indefinite.
- for evti box, end2 is calculated as the sum of the new value of begin attribute (set in the table, two rows above) and the received (or inferred) value of dur attribute. When dur attribute is not received then it is inferred to be indefinite.
- the anchor MPU time is scaled according to the time scale value (e.g. @timestamp for AEI is scaled according to the EventStream @timescale value) when being used for mapping timing attributes (e.g. begin, dur, end attributes) of elements within CI document received in Event Information (e.g. content of <Event> element).
- the timing attributes (e.g. begin, dur, end attributes) of elements within the CI document may be mapped to a common time base T (e.g. wall-clock time) as follows.
- an anchor time A may be derived based on signaled information.
- the anchor time is signaled as an attribute (e.g. @timestamp).
- the anchor time may be inferred to be the mpu_presentation_time of a MPU (e.g. the MPU carrying the event).
- the anchor time may need to be converted to a time base T (e.g. NTP, UTC).
- a presentation time offset P may be derived based on signaled information. For example for AEI the presentation time offset is signaled as an attribute (e.g. @presentationTime). If the attribute is not signaled then the value of the attribute may be set to a default value (e.g. 0).
- the presentation time offset may be scaled (e.g. based on @timescale attribute of EventStream, the scaling may involve multiplication and/or division by received value of @timescale).
- the presentation time offset is signaled as a field (e.g. event_presentation_time_delta).
- the presentation time offset may be scaled (e.g. based on timescale field of evti box, the scaling may involve multiplication and/or division by received value of timescale field).
- This presentation time offset may need to be converted to a time base T (e.g. NTP, UTC).
- the presentation time in time base T is the sum of the derived values A and P.
- when an MMT event information (e.g. content of <Event>, value of event_data[]) does not signal the timing attribute (e.g. begin, end) of a CI element, a default value may be used; the default value is specified either in the MPEG CI standard, or is a pre-determined value for the attribute if no default value is specified in the MPEG CI standard.
- the processing blocks in the YES path of the decision diamond that checks whether the MMT event information contains a CI document are an exemplary embodiment of mapping different timing attributes of the CI document to a time base T.
- D is derived in time base T.
- D is an offset value in time base T.
- the actual time for the begin, end timing attributes in time base T is A + P + D.
- the dur (duration) attribute is an offset time value; in time base T, its value is D.
- the event information may be processed further (for e.g. resolving Event duration and duration specified in CI document).
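For the out-of-band (AEI) case, the same A + P + D mapping can be sketched as follows: A from @timestamp, P from @presentationTime scaled by @timescale, D from the CI begin/end offsets, with dur remaining an offset. The attribute-to-parameter mapping is an assumption for illustration.

```python
# Companion sketch for the out-of-band (AEI) case of the A + P + D mapping
# described above: A from @timestamp, P from @presentationTime scaled by
# @timescale, D from the CI begin/end offsets; dur stays an offset (D).
def map_ci_attributes(timestamp,                     # anchor A, in time base T (seconds)
                      presentation_time, timescale,  # P, in @timescale ticks
                      begin_offset, end_offset=None, dur_offset=None):
    base = timestamp + presentation_time / timescale  # A + P
    mapped = {"begin": base + begin_offset}           # A + P + D
    if end_offset is not None:
        mapped["end"] = base + end_offset
    if dur_offset is not None:
        mapped["dur"] = dur_offset                    # duration stays an offset D
    return mapped

print(map_ci_attributes(1_700_000_000.0, 180_000, 90_000,
                        begin_offset=1.0, dur_offset=10.0))
```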
- an input value x may undergo a modulo operation, say by a value y; the output is remainder of x divided by y, defined only for integers x and y with x greater than or equal to 0 and y greater than 0.
- when the input value x is greater than or equal to y then roll-over(s) occur.
- the number of roll-overs is x / y, where / denotes integer division with truncation of the result toward zero. For example, 7 / 4 is truncated to 1.
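The roll-over arithmetic just described can be written as a two-line helper; this is a literal restatement of the stated rules, not an implementation of any particular time format.

```python
# Sketch of the roll-over arithmetic described above: values are reduced
# modulo (MV + 1) and the number of roll-overs is the truncating integer
# division x / y. Pure illustration of the stated rules.
def roll_over(derived_value, max_value):
    """Return (stored_value, number_of_rollovers) for a derived time value."""
    modulus = max_value + 1
    return derived_value % modulus, derived_value // modulus

print(roll_over(7, 3))  # (3, 1): 7 % 4 == 3, and 7 / 4 truncates to 1 roll-over
```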
- comparisons of received, inferred, or calculated timing values are performed by the receiver in a common time base e.g. 64-bit NTP time stamp format, precision time protocol (PTP) timestamp as defined in Institute of Electrical and Electronics Engineers (IEEE) 1588 (2008).
- These comparisons may lead to selection of one of the received, inferred, or calculated wall clock time value for the timing attributes of CI document elements.
- the rolled over time value is inferred to be the greater of the two values.
- when both time values may have rolled over then first the number of times each time value rolled over is used for the comparison; a time value with a larger number of roll-overs is determined to be larger. If the number of roll-overs is equal then the rolled-over values are used for comparison.
- the rules for pair-wise comparison also apply for comparison of more than two time values for e.g. by taking all possible pair wise combinations and performing the comparison.
- the comparison may include tracking the value of interest (smallest or largest).
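A sketch of the pair-wise comparison rule follows, representing each time value as a (roll-over count, rolled-over value) pair; that representation is an assumption made for illustration.

```python
# Sketch of the pair-wise comparison rule above: a value with more roll-overs
# is larger; with equal roll-over counts the rolled-over values themselves are
# compared. Representation as (rollovers, value) tuples is an assumption.
def compare(a, b):
    """a, b are (rollover_count, rolled_over_value). Return the larger one."""
    if a[0] != b[0]:
        return a if a[0] > b[0] else b
    return a if a[1] >= b[1] else b

print(compare((1, 2), (0, 1000)))  # (1, 2): more roll-overs wins
print(compare((1, 2), (1, 7)))     # (1, 7): equal roll-overs, compare values
```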
- the event related timing information (e.g. event presentation time, duration) may undergo scaling (e.g. division by a time scale).
- the scaled event related timing information may further be converted to a common time base (e.g. by truncation to a fixed number of decimal points for the binary representation).
- 64-bit NTP time stamp may be used and for the scaled event related timing information the number of binary bits after the decimal point may be truncated to be 32 bits.
- PTP time stamp may be used and for the scaled event related timing information the number of binary bits after the decimal point may be truncated to be 32 bits.
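The truncation step above can be sketched for the 64-bit NTP layout (32 integer-second bits plus a fraction truncated to 32 bits); NTP era/epoch handling is deliberately omitted.

```python
# Sketch of converting a scaled time value (seconds) to a 64-bit NTP-style
# layout mentioned above: 32 bits of whole seconds and the fractional part
# truncated to 32 bits. NTP era/epoch handling is omitted for brevity.
def to_ntp64(seconds):
    """Return a 64-bit integer: upper 32 bits = whole seconds, lower 32 = fraction."""
    whole = int(seconds)
    frac = int((seconds - whole) * (1 << 32))  # truncate fraction to 32 bits
    return (whole << 32) | frac

ts = to_ntp64(5.75)
print(hex(ts))  # 0x5c0000000: 5 whole seconds, fraction 0.75 -> 0xC0000000
```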
- the end time attribute (in a common time base) of an element of CI document exceeds the end time information (in the common time base) of the event then it is set to the end time information of the event.
- the end time information (in the common time base) of the event is determined based on an anchor time, event presentation time and event duration.
- the end time attribute of an element of CI document is determined based on the begin attribute, the dur attribute (received, inferred or derived) values.
- the smallest of a set of multiple values is the value that is less than or equal to all the other values in the set.
- timing values such as time offset, duration, absolute (wall clock) time are converted to time offset, duration, absolute (wall clock) time of a common time base prior to carrying out operations such as addition, subtraction etc.
- indefinite + any time value is equal to indefinite. In an example, indefinite is greater than any finite time value.
- when the timing attributes (e.g. begin, end, dur) are not signaled, the inferred time value for processing is indefinite.
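Modelling indefinite as +infinity reproduces the two rules above (indefinite plus any finite value stays indefinite, and indefinite is greater than any finite value); the modelling choice itself is an assumption.

```python
# Tiny sketch of the "indefinite" rules above, modelling indefinite as +infinity:
# indefinite + any finite value stays indefinite, and indefinite compares
# greater than any finite time value. This modelling choice is an assumption.
INDEFINITE = float("inf")

end = INDEFINITE + 42.0      # indefinite + any time value -> indefinite
print(end == INDEFINITE)     # True
print(INDEFINITE > 1e12)     # True: greater than any finite time value
```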
- the values of evti box fields are constrained based on a context.
- the context may be determined by the CI element indicated for the event (e.g. based on value of scheme_id_uri field, value of value field).
- the properties (e.g. attributes, content) of <Event> element are constrained based on a context.
- the context may be determined by the CI element indicated for the event (e.g. based on value of @schemeIdURI attribute of <Event> element, value of @value attribute of <Event> element).
- the value of fields corresponding to the evti box may be derived based on information transmitted in an MPU event (for in-band MMT events).
- the earliest presentation time mpu_presentation_time of current MPU corresponds to the mpu_presentation_time of the first access unit in current MPU.
- the current MPU may correspond to the MPU identified by a MPU sequence number.
- the MPU sequence number in turn corresponds to AEI descriptor which is defined at the asset level.
- An Event message may be valid only for a certain duration which may be signaled explicitly or implicitly in the emission. At the end of the validity duration the Event message may be discarded at the receiver.
- the Event messages may include information such as an identifier.
- a later Event message received with the same identifier may replace an earlier Event message received with the same identifier.
- the event information (e.g. MPEG CI document, HTML5 document) is encapsulated in a “metadata envelope” that includes a “valid from” and a “valid until” attribute associated with it.
- the “valid from” and “valid until” attributes define the interval of validity of the event information.
- valid from is inferred (e.g. now) and not signaled.
- valid until is inferred (e.g. indefinite) and not signaled.
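The validity rules above can be sketched as a small check with the stated inferred defaults (valid from = now, valid until = indefinite); the envelope field names used here are assumptions.

```python
# Sketch of the metadata-envelope validity rules above: missing "valid from" is
# inferred as now, missing "valid until" as indefinite, and event information
# outside its validity interval is discarded. Field names are assumptions.
import time

def is_valid(envelope, now=None):
    now = time.time() if now is None else now
    valid_from = envelope.get("validFrom", now)             # inferred: now
    valid_until = envelope.get("validUntil", float("inf"))  # inferred: indefinite
    return valid_from <= now <= valid_until

print(is_valid({"validUntil": time.time() + 3600}))  # True for the next hour
print(is_valid({"validUntil": 0}))                   # False: already expired
```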
- the event information (e.g. MPEG CI document, HTML5 document) is encapsulated in a “metadata envelope” that includes a “next URL” attribute associated with it.
- the “next URL” attribute is the URL of the next scheduled version of that event information.
- each functional block or various features of the base station device and the terminal device (the video decoder and the video encoder) used in each of the aforementioned embodiments may be implemented or executed by circuitry, which is typically an integrated circuit or a plurality of integrated circuits.
- the circuitry designed to execute the functions described in the present specification may comprise a general-purpose processor, a digital signal processor (DSP), an application specific or general application integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic devices, discrete gates or transistor logic, or a discrete hardware component, or a combination thereof.
- the general-purpose processor may be a microprocessor, or alternatively, the processor may be a conventional processor, a controller, a microcontroller or a state machine.
- the general-purpose processor or each circuit described above may be configured by a digital circuit or may be configured by an analogue circuit. Further, if integrated circuit technology that supersedes present-day integrated circuits emerges due to advancement of semiconductor technology, an integrated circuit based on that technology may also be used.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Security & Cryptography (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Information Transfer Between Computers (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A system for mapping an event with HTML is described.
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562213540P | 2015-09-02 | 2015-09-02 | |
US62/213,540 | 2015-09-02 | ||
US201562244626P | 2015-10-21 | 2015-10-21 | |
US62/244,626 | 2015-10-21 | ||
US201562250383P | 2015-11-03 | 2015-11-03 | |
US62/250,383 | 2015-11-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017038065A1 true WO2017038065A1 (fr) | 2017-03-09 |
Family
ID=58186922
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/JP2016/003889 WO2017038065A1 (fr) | 2016-08-26 | Event signaling mapping with HTML
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017038065A1 (fr) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2018171794A1 (fr) * | 2017-03-24 | 2018-09-27 | Mediatek Inc. | Methods and apparatus for media content asset changes |
- CN112188256A (zh) * | 2019-07-02 | 2021-01-05 | Tencent America LLC | Information processing method, information providing method, apparatus, electronic device and storage medium |
US20220337647A1 (en) * | 2021-04-19 | 2022-10-20 | Tencent America LLC | Extended w3c media extensions for processing dash and cmaf inband events |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2014004955A1 (fr) * | 2012-06-28 | 2014-01-03 | Azuki Systems, Inc. | Method and system for ad insertion in over-the-top (OTT) live streaming of media content |
- 2016-08-26 WO PCT/JP2016/003889 patent/WO2017038065A1/fr active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2014004955A1 (fr) * | 2012-06-28 | 2014-01-03 | Azuki Systems, Inc. | Method and system for ad insertion in over-the-top (OTT) live streaming of media content |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018171794A1 (fr) * | 2017-03-24 | 2018-09-27 | Mediatek Inc. | Procédés et appareil de changements d'actifs de contenu multimédia |
TWI656784B (zh) * | 2017-03-24 | 2019-04-11 | 聯發科技股份有限公司 | 用於媒體內容資產改變的方法及裝置 |
US10958988B2 (en) | 2017-03-24 | 2021-03-23 | Mediatek Inc. | Methods and apparatus for media content asset changes |
- CN112188256A (zh) * | 2019-07-02 | 2021-01-05 | Tencent America LLC | Information processing method, information providing method, apparatus, electronic device and storage medium |
- CN112188256B (zh) * | 2019-07-02 | 2024-05-24 | Tencent America LLC | Information processing method, information providing method, apparatus, electronic device and storage medium |
US20220337647A1 (en) * | 2021-04-19 | 2022-10-20 | Tencent America LLC | Extended w3c media extensions for processing dash and cmaf inband events |
US11882170B2 (en) * | 2021-04-19 | 2024-01-23 | Tencent America LLC | Extended W3C media extensions for processing dash and CMAF inband events |
US12225070B2 (en) | 2021-04-19 | 2025-02-11 | Tencent America LLC | Extended W3C media extensions for processing DASH and CMAF inband events |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10595065B2 (en) | Method and apparatus for transmitting and receiving multi-media services | |
US10129609B2 (en) | Method for transceiving media files and device for transmitting/receiving using same | |
KR101946861B1 (ko) | 멀티미디어 방송 서비스의 미디어 데이터 동기화 방법 및 장치 | |
US9860611B2 (en) | Broadcast service transmitting method, broadcasting service receiving method and broadcast service receiving apparatus | |
KR101215747B1 (ko) | 디지털 텔레비전 수신기 | |
US9225443B2 (en) | Method for transmitting broadcast service, method for receiving the broadcasting service, and apparatus for receiving the broadcasting service | |
US9596510B2 (en) | Method for transmitting broadcast service, method for receiving broadcast service, and apparatus for receiving broadcast service | |
US9667902B2 (en) | Method for transmitting a broadcast service, method for receiving a broadcast service, and apparatus for receiving a broadcast service | |
KR20130138777A (ko) | 멀티미디어 흐름을 동기화시키기 위한 방법 및 대응하는 장치 | |
KR102135255B1 (ko) | Uri 메시지 워터마크 페이로드가 있는 방송 시스템 | |
- WO2017038065A1 (fr) | Event signaling mapping with HTML | |
- Recommendation ITU-T H.750 | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16841100 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16841100 Country of ref document: EP Kind code of ref document: A1 |