
WO2006092996A1 - Recording medium, playback apparatus, and playback method - Google Patents


Info

Publication number
WO2006092996A1
WO2006092996A1 (PCT/JP2006/303148)
Authority
WO
WIPO (PCT)
Prior art keywords
event
playback
title
time
information
Prior art date
Application number
PCT/JP2006/303148
Other languages
English (en)
Japanese (ja)
Inventor
Shigeki Matsunaga
Hideyuki Kuwano
Ryuichiro Takamatsu
Wataru Ikeda
Takashi Kakiuchi
Kakuya Yamamoto
Takahiro Yamaguchi
Original Assignee
Matsushita Electric Industrial Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co., Ltd.
Publication of WO2006092996A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/426 Internal components of the client; Characteristics thereof
    • H04N21/42646 Internal components of the client; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/65 Transmission of management data between client and server
    • H04N21/658 Transmission by the client directed to the server
    • H04N21/6581 Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8543 Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/78 Television signal recording using magnetic recording
    • H04N5/782 Television signal recording using magnetic recording on tape
    • H04N5/783 Adaptations for reproducing at a rate different from the recording rate
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/84 Television signal recording using optical recording
    • H04N5/85 Television signal recording using optical recording on discs or drums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/00086 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy
    • G11B20/00137 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy involving measures which result in a restriction to contents recorded on or reproduced from a record carrier to authorised users
    • G11B20/00159 Parental control systems
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/00086 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy
    • G11B20/00731 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy involving a digital rights management system for enforcing a usage restriction
    • G11B20/00739 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy involving a digital rights management system for enforcing a usage restriction wherein the usage restriction is associated with a specific geographical region
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10 Digital recording or reproducing
    • G11B20/10527 Audio or video recording; Data buffering arrangements
    • G11B2020/1062 Data buffering arrangements, e.g. recording or playback buffers
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10 Digital recording or reproducing
    • G11B20/12 Formatting, e.g. arrangement of data block or words on the record carriers
    • G11B2020/1264 Formatting, e.g. arrangement of data block or words on the record carriers wherein the formatting concerns a specific kind of data
    • G11B2020/1288 Formatting by padding empty spaces with dummy data, e.g. writing zeroes or random data when de-icing optical discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G11B2220/25 Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537 Optical discs
    • G11B2220/2541 Blu-ray discs; Blue laser DVR discs

Definitions

  • The present invention relates to a recording medium, such as a Blu-ray Disc (BD) ROM, on which AV stream data is recorded, and to a reproducing apparatus and a reproducing method for reproducing the recording medium.
  • A typical information recording medium on which video data is recorded is the DVD (hereinafter also referred to as an "SD (Standard Definition) DVD").
  • The conventional DVD will be described below.
  • FIG. 1 is a diagram showing the structure of an SD-DVD. As shown in the lower part of FIG. 1, a logical address space is provided on the DVD disc from the lead-in to the lead-out. At the beginning of the logical address space, volume information of the file system is recorded, followed by application data such as video and audio.
  • A file system is a mechanism for managing data defined by standards such as ISO9660 and Universal Disc Format (UDF).
  • The SD-DVD uses both the UDF and ISO9660 file systems; together they are also called a "UDF bridge".
  • The recorded data can be read by a file system driver for either UDF or ISO9660.
  • the DVD handled here is a ROM disk for package media, and is not physically writable.
  • Data recorded on a DVD can be viewed, through the UDF bridge, as directories and files as shown in the upper left of FIG. 1.
  • A directory called "VIDEO_TS" is placed directly under the root directory ("ROOT" in FIG. 1), and DVD application data is recorded there.
  • Application data is recorded as multiple files; the main files are of the following types.
  • VIDEO_TS.IFO: disc playback control information file
  • VTS_01_0.IFO: Video Title Set #1 playback control information file
  • VTS_01_0.VOB: Video Title Set #1 stream file
  • As shown in the above example, two extensions are defined. "IFO" is an extension indicating that the file has playback control information recorded in it, and "VOB" is an extension indicating that the file is an MPEG stream of AV data.
  • Playback control information refers to information used to implement the interactivity employed in DVDs (a technology that dynamically changes playback in response to user operations), and to information, such as metadata, that accompanies the titles and the AV data.
  • The playback control information is generally called navigation information.
  • The playback control information files are "VIDEO_TS.IFO", which manages the entire disc, and "VTS_01_0.IFO", which is the playback control information for an individual video title set.
  • A DVD can record multiple titles, in other words multiple different movies or songs, on a single disc.
  • "01" in the file name body indicates the number of the video title set; for example, for video title set #2, the file name is "VTS_02_0.IFO".
  • The upper right part of FIG. 1 shows the DVD navigation space in the DVD application layer, the logical structure space in which the playback control information described above is expanded.
  • Information in "VIDEO_TS.IFO" is expanded in the DVD navigation space as VIDEO Manager Information (VMGI), and the playback control information of each video title set, such as "VTS_01_0.IFO", is expanded as Video Title Set Information (VTSI).
  • VTSI describes Program Chain Information (PGCI), which is information on a playback sequence called a Program Chain (PGC).
  • PGCI consists of a set of Cells and a kind of programming information called commands. A Cell itself refers to part or all of a VOB (an abbreviation of Video Object, meaning an MPEG stream), and playing a Cell means playing the section of the VOB that the Cell specifies.
  • Commands are processed by the DVD's virtual machine and are similar to, for example, JavaScript (Java is a registered trademark) executed on a browser displaying a web page.
  • However, while JavaScript performs window and browser control (for example, opening a new browser window) in addition to logical operations, a DVD command only executes playback control of AV titles, for example designating the chapter to be played back, in addition to logical operations. This is where they differ.
  • A Cell has, as its internal information, the start and end addresses (logical addresses on the disc) of a VOB recorded on the disc, and the player reads out and plays the data using the VOB start and end address information described in the Cell.
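The Cell-to-VOB addressing described above can be sketched as follows. This is purely an illustration: the field names (`startAddress`, `endAddress`) are hypothetical, since the DVD-Video format defines these structures in binary form, not as script objects.

```javascript
// Minimal sketch of PGC playback driven by Cell address information.
// Each Cell carries the logical start/end addresses of its VOB span;
// the player reads exactly that range and feeds it to the decoder.
function playProgramChain(cells, readSectors) {
  const played = [];
  for (const cell of cells) {
    // Read the span of the VOB designated by this Cell.
    played.push(readSectors(cell.startAddress, cell.endAddress));
  }
  return played;
}
```

In this model, reordering the Cells in the PGC changes the playback sequence without touching the VOB itself, which is how a single stream can serve multiple playback paths.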
  • FIG. 2 is a schematic diagram for explaining navigation information embedded in an MPEG stream that is AV data.
  • The interactivity that is a feature of the SD-DVD is not realized solely by the navigation information recorded in "VIDEO_TS.IFO" and "VTS_01_0.IFO" mentioned above; several important pieces of information are multiplexed into the VOB together with the video and audio data, using dedicated carriers called navigation packs (NV_PCK).
  • Here, a menu screen is described as a simple example of interactivity. Several buttons appear on the menu screen, and each button defines the processing to be performed when it is selected and activated.
  • One button is selected on the menu screen (the selected button is indicated to the user by a highlight, a semi-transparent color overlaid on it), and the user can move the selection to any of the buttons above, below, left, or right of it using the up/down/left/right keys on the remote control.
  • When the user presses the enter key to confirm, the program of the corresponding command is executed; typically, playback of a corresponding title or chapter is executed by the command.
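The selection-movement and confirm behavior just described can be sketched like this. The shape of the `buttons` objects (`moves`, `command`) is an illustrative stand-in for the button movement and command information the format actually encodes:

```javascript
// Sketch of menu button navigation: each button lists the destination
// button for each arrow key, and the enter key yields the button command.
function navigate(buttons, selected, key) {
  if (key === "enter") {
    // Confirming runs the selected button's command.
    return { selected, command: buttons[selected].command };
  }
  // Arrow keys move the highlight to the designated destination button,
  // or stay put if no destination is defined for that direction.
  const next = buttons[selected].moves[key];
  return { selected: next !== undefined ? next : selected, command: null };
}
```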
  • The upper left part of FIG. 2 shows an outline of the information stored in the NV_PCK.
  • The NV_PCK contains highlight color information and individual button information. The highlight color information describes color palette information, which specifies the semi-transparent color of the overlaid highlight.
  • The button information describes rectangular area information (the position information of each button), movement information from that button to the other buttons (designation of the destination button for each of the user's up/down/left/right key operations), and button command information (the command to be executed when the button is confirmed).
  • The highlight on the menu screen is created as an overlay image, as shown in the upper right part of FIG. 2.
  • The overlay image is generated by giving the color from the color palette information to the rectangular area indicated in the button information. This overlay image is combined with the background image shown in the right part of FIG. 2 and displayed on the screen.
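The overlay composition above can be sketched as follows. A real player composites sub-picture bitmaps with alpha blending; here a tiny 2-D array of pixel labels stands in for the image planes, purely for illustration:

```javascript
// Sketch of highlight overlay generation: paint the selected button's
// rectangle with a palette color, then composite it over the background.
function renderMenu(background, buttons, selectedIndex, palette) {
  // Copy the background so compositing is non-destructive.
  const frame = background.map((row) => row.slice());
  const btn = buttons[selectedIndex];
  for (let y = btn.y; y < btn.y + btn.h; y++) {
    for (let x = btn.x; x < btn.x + btn.w; x++) {
      // In a real player this would be a semi-transparent palette color.
      frame[y][x] = palette.highlight;
    }
  }
  return frame;
}
```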
  • In this way, a menu screen is realized on a DVD. The reason part of the navigation data is embedded in the stream using NV_PCK is to allow the menu information to be updated dynamically in synchronization with the stream. For example, a menu screen may be displayed only during a five-to-ten-minute portion of movie playback; processing like this, in which the synchronization timing is easily problematic, can then be realized reliably.
  • FIG. 3 is a schematic diagram showing the configuration of a VOB in a DVD.
  • Data such as video, audio, and subtitles (stage 1) is packetized and packed (stage 2) based on the MPEG system standard (ISO/IEC 13818-1), and the results are multiplexed into one MPEG program stream (stage 3).
  • The NV_PCK, which includes the button commands for realizing the interactive functions, is multiplexed together with them.
  • A characteristic point of multiplexing in the MPEG system is that, while each individual data stream being multiplexed forms a bit string based on its own decoding order, among the multiplexed streams, that is, among video, audio, and subtitles, the bit string is not necessarily formed in playback order, in other words decoding order.
  • This derives from the fact that the decoder model for the MPEG system stream (stage 4, generally called the System Target Decoder, or STD) has decoder buffers corresponding to the individual elementary streams after demultiplexing, and data is temporarily accumulated there until its decoding timing.
  • These decoder buffers have a different size for each elementary stream: 232 KB for video, 4 KB for audio, and 52 KB for subtitles. Because the data input timing to each decoder buffer therefore differs among the elementary streams, there is a discrepancy between the order in which the bit string is formed in the MPEG system stream and the display (decoding) timing.
  • Subtitle data multiplexed side by side with video data is therefore not necessarily decoded at the same timing.
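The STD behavior described above, where arrival order need not match decode order, can be illustrated with a toy model. The buffer sizes are the ones stated in the text; everything else (packet objects, a single `now` timestamp) is a deliberate simplification of the real clock-driven model:

```javascript
// Toy illustration of the System Target Decoder idea: demultiplexed
// packets sit in per-stream buffers of different sizes until their
// decode time arrives, so multiplexed order need not match decode order.
const BUFFER_SIZES = { video: 232 * 1024, audio: 4 * 1024, subtitle: 52 * 1024 };

function demuxAndDecode(packets, now) {
  const buffers = { video: [], audio: [], subtitle: [] };
  for (const p of packets) buffers[p.stream].push(p); // demultiplex
  const decoded = [];
  for (const stream of Object.keys(buffers)) {
    for (const p of buffers[stream]) {
      if (p.decodeTime <= now) decoded.push(p); // drain at decode timing
    }
  }
  // Presentation follows decode-time order, not multiplexed order.
  return decoded.sort((a, b) => a.decodeTime - b.decodeTime);
}
```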
  • Patent Document 1 Japanese Patent No. 2813245
  • In the BD-ROM, which is a next-generation information recording medium, it is planned to record content that includes programs consisting of XML/XHTML documents and scripts, rather than the navigation commands described above.
  • Interactive applications can then be realized by executing scripts that run in response to events.
  • The playback device of the present invention is a playback device that plays back a title that is digital content, comprising: title playback means for acquiring and playing back the title; event generation means for acquiring a point list in which events to be generated during playback of the title are described in association with event times, each being the time at which the corresponding event is to be generated, and for generating, in accordance with the acquired point list, the event corresponding to an event time when the playback time of the title being played back by the title playback means reaches that event time described in the point list; and script execution means for acquiring a script, which is an application program describing event processing, that is, the processing corresponding to the events described in the point list acquired by the event generation means, and for executing, in accordance with the acquired script, the event processing corresponding to each event generated by the event generation means. The event generation means generates events that include playback state information indicating the playback state of the title by the title playback means, and the script execution means switches the event processing to be executed depending on the playback state information included in the generated event. With such a configuration, it is possible to generate events linked to the playback time of the content and to execute a script that performs optimal processing for each event, which makes it possible to realize expressive applications.
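The point-list mechanism claimed above can be sketched as follows. This is an illustrative reading of the claim, not the format's actual API: the names (`pointList`, `eventTime`, `onEvent`) are hypothetical, and the handler passed in plays the role of the script that switches on the playback state carried by each event:

```javascript
// Sketch of point-list-driven event generation: when the playback time
// reaches an event time, generate the event, attaching the current
// playback state so the script can switch its processing on it.
function makeEventGenerator(pointList, onEvent) {
  const fired = new Set(); // each point fires at most once per playback
  return function tick(playbackTime, playbackState) {
    for (const point of pointList) {
      if (playbackTime >= point.eventTime && !fired.has(point.eventTime)) {
        fired.add(point.eventTime);
        // The event includes playback state information.
        onEvent({ name: point.event, time: point.eventTime, state: playbackState });
      }
    }
  };
}
```

A commentary-picture application, for example, would register a handler that draws an image on a "normal" state and does nothing otherwise.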
  • Here, the event generation means may generate an event including playback state information indicating that the title is being specially played back when the title playback means performs special playback (trick play) of the title.
  • The script execution means may set the visible attribute of the image associated with the event to the invisible state when the playback state information included in the event indicates that the title is being specially played back, and otherwise set it to the visible state and display the image that has been set visible.
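That visibility rule can be sketched as an event handler. The `image` object with a `visible` flag and the `draw` callback are hypothetical stand-ins for a real graphics-plane API:

```javascript
// Sketch of the described handler: during special (trick) playback,
// keep the associated image invisible; during normal playback, make it
// visible and draw it.
function onTimedEvent(event, image, draw) {
  if (event.state === "special") {
    image.visible = false; // suppress the overlay during trick play
  } else {
    image.visible = true;  // normal playback: show the image
    draw(image);
  }
  return image.visible;
}
```

Switching on the state this way keeps, say, a commentary picture from flashing past and lingering on screen while the user fast-forwards.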
  • In the point list, an event generation condition, which is a condition for generating the event, may be described in association with the event. In that case, when the playback time of the title being played back by the title playback means reaches an event time described in the point list, the event generation means determines whether or not the event generation condition corresponding to that event time is satisfied, and generates the event corresponding to the event time only when the event generation condition is satisfied.
  • The event generation condition may be a condition indicating whether or not the event needs to be generated during special playback; in that case, when the title playback means performs special playback of the title, the event generation means generates or does not generate the event according to the condition indicated by the event generation condition corresponding to the event time.
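The generation-condition gate can be sketched in a few lines. The flag name `fireDuringSpecial` is illustrative; the patent only says that each point may carry a condition saying whether its event is needed during special playback:

```javascript
// Sketch of condition-gated event generation: each point may carry a
// flag saying whether its event should also fire during special playback.
function shouldGenerate(point, playbackState) {
  if (playbackState === "special") {
    // During trick play, fire only if the point's condition allows it.
    return point.fireDuringSpecial === true;
  }
  return true; // during normal playback, always generate
}
```

A chapter-title event would typically set the flag (titles should update while fast-forwarding), whereas a commentary-picture event would not.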
  • The event may be an event generated for each chapter, a chapter being a unit of the story constituting the title, and the event processing corresponding to the event may be processing that responds to the switching of chapters.
  • The present invention is also a playback method for playing back a title that is digital content, comprising: a title playback step of acquiring and playing back the title; an event generation step of acquiring a point list in which events to be generated during playback of the title are described in association with event times, each being the time at which the corresponding event is to be generated, and of generating, in accordance with the acquired point list, the event corresponding to an event time when the playback time of the title being played back in the title playback step reaches that event time described in the point list; and a script execution step of acquiring a script, which is an application program describing the event processing corresponding to the events described in the acquired point list, and of executing, in accordance with the acquired script, the event processing corresponding to the event generated in the event generation step. The script execution step switches the event processing to be executed depending on the playback state information included in the event generated in the event generation step.
  • The present invention also provides a program for a playback device that plays back a title that is digital content, the program causing a computer to execute the steps recited in claim 7.
  • The recording medium of the present invention is a computer-readable recording medium on which a program for a playback device that plays back a title that is digital content is recorded, characterized in that the program according to claim 8 is recorded on the recording medium.
  • The integrated circuit of the present invention is an integrated circuit for playing back a title that is digital content, comprising: title playback means for acquiring and playing back the title; event generation means for acquiring a point list in which events to be generated during playback of the title are associated with event times, each being the time at which the corresponding event is to be generated, and for generating, in accordance with the acquired point list, the event corresponding to an event time when the playback time of the title being played back by the title playback means reaches that event time described in the point list; and script execution means for acquiring a script, which is an application program describing event processing, that is, the processing corresponding to the generated events, and for executing, in accordance with the acquired script, the event processing corresponding to the event generated by the event generation means. The event generation means generates events including playback state information indicating the playback state of the title by the title playback means, and the script execution means switches the event processing to be executed depending on the playback state information included in the event generated by the event generation means.
  • The recording medium of the present invention is also a computer-readable recording medium on which are recorded a title that is digital content and a point list in which events to be generated during playback of the title are described in association with event times, each being the time at which the corresponding event is to be generated.
  • FIG. 1 is a diagram showing the structure of an SD-DVD.
  • FIG. 2 is a schematic diagram for explaining navigation information embedded in an MPEG stream that is AV data.
  • FIG. 3 is a schematic diagram showing the configuration of a VOB in a DVD.
  • FIG. 4 is a diagram showing a data hierarchy of a BD-ROM.
  • FIG. 5 is a diagram showing the structure of logical data recorded on a BD-ROM.
  • FIG. 6 is a diagram showing an outline of a basic configuration of a BD-ROM player that plays BD-ROM.
  • FIG. 7 is a detailed block diagram of the configuration of the player shown in FIG.
  • FIG. 8 is a diagram showing a BD-ROM application space.
  • FIG. 9 is a diagram showing the structure of an MPEG stream (VOB).
  • FIG. 10 is a diagram illustrating a pack configuration in an MPEG stream.
  • FIG. 11 is a diagram for explaining the relationship between AV data and a player configuration.
  • FIG. 12 is a diagram for explaining a VOB data continuous supply model using a track buffer.
  • FIG. 13 is a diagram showing an internal structure of a VOB management information file.
  • FIG. 14 is a diagram for explaining details of VOBU information.
  • FIG. 15 is a diagram for explaining an address information acquisition method using a time map.
  • FIG. 16 is a diagram showing a structure of playlist information.
  • FIG. 17 is a diagram showing a structure of an event handler table.
  • FIG. 18 is a diagram showing the structure of BD.INFO, which is the overall BD-ROM information.
  • FIG. 19 is a diagram showing a configuration of a global event handler table.
  • FIG. 20 is a diagram showing an example of a time event.
  • FIG. 21 is a diagram showing an example of a user event by a user's menu operation.
  • FIG. 22 is a diagram showing an example of a global event.
  • FIG. 23 is a diagram for explaining a functional configuration of a program processor.
  • FIG. 24 is a diagram showing a list of system parameters (SPRM).
  • FIG. 25 is a diagram showing an example of a program in an event handler related to control of a menu screen having two selection buttons.
  • FIG. 26 is a diagram showing an example of a program in an event handler related to a user event of menu selection.
  • FIG. 27 is a flowchart showing a flow of basic processing of AV data reproduction in a BD-ROM player.
  • FIG. 28 is a flowchart showing the flow of processing from the start of playlist playback to the end of VOB playback in the BD-ROM player.
  • FIG. 29(A) is a flowchart showing the flow of processing related to a time event in a BD-ROM player.
  • FIG. 29(B) is a flowchart showing the flow of processing related to a user event in a BD-ROM player.
  • FIG. 30 is a flowchart showing a flow of processing of caption data in a BD-ROM player.
  • FIG. 31 is a diagram for explaining the module configuration and control flow of the playback apparatus of the present invention.
  • FIG. 32 is a diagram for explaining the relationship between the Index Table and the XHTML file.
  • FIG. 33 is a diagram showing an example of an XHTML file in which buttons are displayed.
  • FIG. 34 is an explanatory diagram for explaining the life cycle of data.
  • FIG. 35 is a timing chart illustrating a data life cycle.
  • FIG. 36 is a diagram for explaining a method of combining data into one.
  • FIG. 37 is an explanatory diagram of a method for introducing memory size expansion.
  • FIG. 38 is a diagram of a player variable table to which language settings for applications are added.
  • FIG. 39 is a diagram showing an example of switching the language displayed or processed by the application according to the language setting for the application.
  • FIG. 40 is a diagram showing an example of switching the language to be displayed or processed by the application based on the XML/XHTML description and the script.
  • FIG. 41 is a sequence diagram of a language switching method using an IndexTable for a set language, a file list, or a compressed file.
  • FIG. 42 is a diagram showing an example in which an IndexTable for a set language, a file list, or a compressed file is recorded on a disc.
  • FIG. 43 is a configuration diagram of a playlist file in which a point list has been added and modified.
  • FIG. 44 is a diagram showing an example in which a commentary picture application is realized by a system that generates and processes an event linked to a content playback time.
  • FIG. 45 is a flowchart of a system for generating and processing an event linked to a content playback time.
  • FIG. 46 is a diagram illustrating a description example of a commentary picture application.
  • FIG. 47 is a diagram showing an outline of the operation of a system that generates and processes an event that is linked to the content playback time and can also notify the playback state.
  • FIG. 48 is a flowchart of a system that generates and processes an event that is linked to the content playback time and can also notify the playback state.
  • FIG. 49 is a diagram showing a description example of a chapter title application.
  • FIG. 50 is a configuration diagram of a playlist file in which event generation condition information is added to the point information and modified.
  • FIG. 51 is a diagram showing an outline of the operation of a system that generates and processes events linked to content playback times according to event generation conditions.
  • FIG. 52 is a flowchart of a system for generating and processing an event linked to a content playback time according to an event generation condition.
  • VOB information (YYY.VOBI) reading step; S403: VOB (YYY.VOB) reading step
  • FIG. 4 shows the data hierarchy of the BD-ROM.
  • As shown in FIG. 4, on the BD-ROM 104, which is a disc medium, there are recorded AV data 103, BD management information 102 (management information related to the AV data, AV playback sequences, and the like), and a BD playback program 101 that realizes interactive operations.
  • The substance of each title exists as the AV data 103, and the scenario control description data of each title (hereinafter also simply referred to as the "scenario") exists as the BD management information 102.
  • In the present embodiment, the BD-ROM is described mainly as an AV application for playing back AV content such as movies, but, like a CD-ROM or DVD-ROM, a BD-ROM can of course also be used as a computer recording medium for various purposes.
  • FIG. 5 is a diagram showing the structure of logical data recorded on the BD-ROM 104 described above.
  • BD-ROM104 like other optical discs such as DVD and CD, has a recording area that spirals from the inner periphery to the outer periphery, and stores logical data between the inner lead-in and outer lead-out. It has a logical address space that can be recorded.
  • BCA: Burst Cutting Area
  • At the head of the logical address space, file system information (volume) is recorded.
  • The file system is a mechanism for managing data defined by standards such as UDF and ISO9660, as explained in the prior art; it makes it possible to read out the recorded logical data using a directory and file structure, just as on an ordinary PC.
  • As for the directory and file structure on the BD-ROM 104, a BDVIDEO directory is placed immediately under the root directory (ROOT).
  • This directory is a directory in which data such as AV data and management information handled by the BD-ROM (BD playback program 101, BD management information 102, AV data 103 shown in FIG. 4) is recorded.
  • "BD.INFO" ("BD management information") is a file that records information about the entire BD-ROM. The BD-ROM player reads this file first.
  • "XXX.PL" ("BD management information") is a file that records playlist information, that is, a scenario. There is one file per playlist.
  • "YYY.VOB" ("AV data") is the VOB explained in the conventional example.
  • "YYY.VOBI" ("BD management information") is a file that records management information related to the VOB, which is AV data.
  • the correspondence with the VOB is identified by the file body name ("YYY" matches).
  • "ZZZ.PNG" ("AV data") is an image file in PNG format, an image format standardized by the World Wide Web Consortium (W3C) and read as "ping", which holds image data for composing subtitles and menu screens.
  • W3C: World Wide Web Consortium
  • One PNG image corresponds to one file.
  • Next, the configuration of the player that plays the BD-ROM 104 will be described with reference to FIGS. 6 and 7.
  • FIG. 6 is a diagram showing an outline of a basic configuration of a BD-ROM player that reproduces the BD-ROM 104.
  • In the BD-ROM player, data on the BD-ROM 104 is read through the optical pickup 202.
  • the read data is recorded in a dedicated memory according to the type of each data.
  • The BD playback programs ("BD.PROG" or "XXX.PROG" files) are recorded in the program recording memory 203, the BD management information ("BD.INFO", "XXX.PL" or "YYY.VOBI" files) in the management information recording memory 204, and the AV data ("YYY.VOB" or "ZZZ.PNG" files) in the AV recording memory 205, respectively.
  • the BD playback program recorded in the program recording memory 203 is processed by the program processing unit 206.
  • the BD management information recorded in the management information recording memory 204 is processed by the management information processing unit 207.
  • the AV data recorded in the AV recording memory 205 is processed by the presentation processing unit 208.
  • The program processing unit 206 receives, from the management information processing unit 207, event information such as the playlist to be played and the execution timing of a program, and processes the program. The program can also dynamically change the playlist to be played; in that case, this is realized by sending a playback instruction for the changed playlist to the management information processing unit 207.
  • The program processing unit 206 also receives events from the user, for example a remote control request operated by the user, and, if there is a program corresponding to the user event, executes it.
  • The management information processing unit 207 receives an instruction from the program processing unit 206, analyzes the playlist corresponding to the instruction and the management information of the VOB corresponding to that playlist, and instructs the presentation processing unit 208 to play the target AV data.
  • The management information processing unit 207 also receives reference time information from the presentation processing unit 208 and, based on that time information, instructs the presentation processing unit 208 to stop AV data playback. Furthermore, it generates, for the program processing unit 206, an event indicating the program execution timing.
  • the presentation processing unit 208 has a decoder corresponding to video, audio, and subtitle data, and decodes and outputs AV data in accordance with instructions from the management information processing unit 207.
  • Video data and subtitle data are drawn on each dedicated plane after decoding. Specifically, video data is drawn on the video plane 210, and image data such as caption data is drawn on the image plane 209. Further, the composition processing of the video drawn on the two planes is performed by the composition processing unit 211 and output to a display device such as a TV.
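As an illustrative sketch of this composition step, one pixel of the image plane (subtitles or menus, with alpha) can be composed over the opaque video plane as below; the pixel format, value ranges, and blending rule are assumptions for illustration, not taken from this description.

```javascript
// Hypothetical sketch: compose one RGBA image-plane pixel (subtitle/menu) over
// an opaque RGB video-plane pixel, as the composition processing unit 211 might.
// 8-bit channels and straight alpha blending are assumptions.
function composePixel(videoPx, imagePx) {
  const a = imagePx.a / 255; // image-plane alpha in [0, 1]
  return {
    r: Math.round(imagePx.r * a + videoPx.r * (1 - a)),
    g: Math.round(imagePx.g * a + videoPx.g * (1 - a)),
    b: Math.round(imagePx.b * a + videoPx.b * (1 - a)),
  };
}
```

In a real player this blending runs per pixel over the whole planes; the sketch only shows the rule applied at each position.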
  • In this way, the BD-ROM player operates based on the data recorded on the BD-ROM 104 shown in FIG. 4.
  • FIG. 7 is a detailed block diagram of the configuration of the player shown in FIG.
  • the correspondence between each component shown in FIG. 6 and each component shown in FIG. 7 is as follows.
  • the AV recording memory 205 corresponds to the image memory 308 and the track buffer 309.
  • the program processing unit 206 corresponds to a program processor 302 and a UO (User Operation) manager 303.
  • the management information processing unit 207 corresponds to the scenario processor 305 and the presentation controller 306.
  • the presentation processing unit 208 corresponds to a clock 307, a demultiplexer 310, an image processor 311, a video processor 312, and a sound processor 313.
  • VOB data: MPEG stream
  • PNG: image data
  • The demultiplexer 310 extracts the VOB data recorded in the track buffer 309 based on the time obtained from the clock 307, sends the video data contained in the VOB data to the video processor 312, and sends the audio data to the sound processor 313.
  • The video processor 312 and the sound processor 313 each consist of a decoder buffer and a decoder, as defined by the MPEG system standard. That is, the video and audio data sent from the demultiplexer 310 are temporarily recorded in the respective decoder buffers and decoded by the individual decoders according to the clock 307.
  • The PNG data recorded in the image memory 308 is processed in one of the following two ways.
  • When the image data is for subtitles, the presentation controller 306 instructs the decode timing. The scenario processor 305 receives time information from the clock 307 and, at each subtitle display time (start and end), issues display and hide instructions to the presentation controller 306 so that the subtitles are displayed appropriately.
  • the image processor 311 extracts the corresponding PNG data from the image memory 308, decodes it, and renders it on the image plane 209.
  • When the image data is for menus, the program processor 302 instructs the decode timing. At what timing the program processor 302 instructs image decoding depends on the BD program that the program processor 302 is processing, and is therefore not known in advance.
  • As described with reference to FIG. 6, the image data and the video data are each decoded and drawn on the image plane 209 and the video plane 210, and then composed and output by the composition processing unit 211.
  • Management information (scenario, AV management information) read from the BD-ROM 104 is recorded in the management information recording memory 204.
  • Scenario information ("BD.INFO" and "XXX.PL") is read and processed by the scenario processor 305.
  • the AV management information (“YYY. VOBI”) is read and processed by the presentation controller 306.
  • the scenario processor 305 analyzes the information of the playlist and instructs the presentation controller 306 on the VOB referenced by the playlist and its playback position.
  • The presentation controller 306 analyzes the management information ("YYY.VOBI") of the target VOB and instructs the drive controller 317 to read out the target VOB.
  • the drive controller 317 moves the optical pickup according to the instruction of the presentation controller 306, and reads the target AV data.
  • the read AV data is recorded in the image memory 308 or the track buffer 309 as described above.
  • scenario processor 305 monitors the time of the clock 307 and throws an event to the program processor 302 at the timing set in the management information.
  • the BD program (“BD. PROG” or “XXX. PROG”) recorded in the program recording memory 203 is executed by the program processor 302.
  • the program processor 302 processes the BD program when an event is sent from the scenario processor 305 or when an event is sent from the UO manager 303.
  • When a request is sent from the user via a remote control key, the UO manager 303 generates an event corresponding to the request and sends it to the program processor 302.
  • the BD-ROM is played back by the operation of each component as described above.
  • FIG. 8 is a diagram showing a BD-ROM application space.
  • A playlist is one playback unit.
  • A playlist has a static scenario, which is the playback sequence of cells, and a dynamic scenario described by a program. Unless a dynamic scenario by a program intervenes, the playlist simply plays back the individual cells in order, and playback of the playlist ends when playback of all the cells is finished.
  • The program can dynamically change the playback target beyond the playlist, in accordance with the playback description, the user's selection, or the state of the player.
  • a typical example is dynamic change of a playback target via a menu screen.
  • Here, a menu is a scenario played back by the user's selection, that is, one of the components of the function for dynamically selecting a playlist.
  • the program referred to here is an event handler executed by a time event or a user event.
  • A time event is an event generated based on time information embedded in a playlist; it corresponds to the event sent from the scenario processor 305 to the program processor 302 described with reference to FIG. 7. When a time event is issued, the program processor 302 executes the event handler associated with it by the event ID.
  • The executed program can instruct playback of another playlist; in this case, playback of the currently playing playlist is stopped and playback transitions to the specified playlist.
  • the user event is an event generated by a user's remote control key operation.
  • Event handlers corresponding to menu selection events are valid only for a limited period in the playlist. In other words, the validity period of each event handler is set as playlist information.
  • When the "Up", "Down", "Left", "Right" key or the "Determination" key of the remote control is pressed, the program processor 302 searches for a valid event handler; if one exists, the handler is executed. Otherwise, the menu selection event is ignored.
  • the second user event is a menu screen call event generated by operating the “menu” key.
  • When a menu screen call event is generated, a global event handler is called. A global event handler is an event handler that is always valid, independent of any playlist. Using this function, a DVD-style menu call can be implemented; by implementing the menu call, it becomes possible to call up an audio or subtitle menu during title playback and to resume title playback from the point where it was interrupted after changing the audio or subtitles.
  • a cell which is a unit constituting a static scenario in a playlist, refers to all or part of a playback section of a VOB (MPEG stream).
  • the cell has the playback period in the VOB as information on the start and end times.
  • The VOB management information (VOBI) paired with each VOB contains a time map (Time Map, TM), which makes it possible to derive, from the playback start and end times of the VOB described above, the read start address and end address within the target file ("YYY.VOB"). Details of the time map will be described later.
  • FIG. 9 is a diagram showing the structure of an MPEG stream (VOB) used in the present embodiment.
  • a VOB is composed of multiple Video Object Units (VOBU).
  • VOBU is a unit based on Group Of Pictures (GOP) in an MPEG video stream, and is one playback unit as a multiplexed stream including audio data.
  • GOP: Group Of Pictures
  • A VOBU has a playback time of 0.4 to 1.0 seconds, and normally a playback time of about 0.5 seconds. This comes from the fact that an MPEG GOP structure is usually 15 frames (about 0.5 seconds in the case of NTSC).
  • The VOBU contains video packs (V_PCK), which are video data, and audio packs (A_PCK), which are audio data.
  • V_PCK: video pack
  • A_PCK: audio pack
  • Each pack is composed of one sector, and in this embodiment, it is composed of 2 kB units.
  • Fig. 10 is a diagram illustrating a pack configuration in an MPEG stream.
  • Video and audio elementary data are put sequentially into the data storage area of the packet, called the payload.
  • a packet header is attached to the payload to form one packet.
  • The packet header contains an ID identifying which stream's data is stored in the payload, that is, whether it is video data or audio data and which of several video or audio streams it belongs to, together with a Decode Time Stamp (DTS) and a Presentation Time Stamp (PTS), which are time stamps for the decode and display time of the payload.
  • DTS: Decode Time Stamp
  • PTS: Presentation Time Stamp
  • The DTS and PTS are not necessarily recorded in every packet header; the rules for when they are recorded are specified by MPEG. The details of those rules are described in the MPEG system (ISO/IEC 13818-1) standard and are omitted here.
  • The pack header contains a System Clock Reference (SCR), which is a time stamp indicating when the pack passes through the demultiplexer and is input to the decoder buffer of each elementary stream.
  • SCR: System Clock Reference
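The header fields described above can be pictured with the following hedged sketch; the layout is heavily simplified (real pack and packet headers are packed bitfields defined in ISO/IEC 13818-1), and all names and values here are illustrative assumptions.

```javascript
// Hypothetical, simplified model of a pack and its packet headers.
// Field names and sizes are illustrative, not the actual bitfield layout.
const pack = {
  scr: 900, // System Clock Reference: when the pack enters the decoder buffer
  packets: [{
    streamId: 0xE0, // which elementary stream the payload belongs to (0xE0: a video stream)
    dts: 1800,      // Decode Time Stamp (optional per the MPEG rules)
    pts: 3600,      // Presentation Time Stamp (optional)
    payload: new Uint8Array(2048 - 32), // elementary data; sizes illustrative
  }],
};

// The demultiplexer feeds a pack to a decoder buffer once the clock reaches its SCR.
function readyForInput(clock, pack) {
  return clock >= pack.scr;
}
```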
  • FIG. 11 is a diagram for explaining the relationship between AV data and the configuration of a BD-ROM player.
  • The upper part of FIG. 11 is a part of the player configuration diagram described above. As shown in the figure, data on the BD-ROM is input through the optical pickup to the track buffer 309 if it is a VOB (MPEG stream), and to the image memory 308 if it is PNG (image data).
  • The track buffer 309 is a FIFO (First-In First-Out) buffer, and the input VOB data is sent to the demultiplexer 310 in the order of input. At this time, each pack is extracted from the track buffer 309 according to the SCR described above, and its data is sent to the video processor 312 or the sound processor 313 via the demultiplexer 310.
  • FIFO: First-In First-Out
  • For image data, which image to draw is instructed by the presentation controller 306 (see FIG. 7). Image data used for drawing is deleted from the image memory 308 at the same time as it is drawn if it is subtitle image data, but is left in the image memory as it is if it is menu image data. This is because menu drawing depends on user operations, so the same image may be drawn multiple times.
  • The lower part of Fig. 11 shows interleaved recording of a VOB file and PNG files on the BD-ROM.
  • AV data that forms a series of continuous playback units is recorded continuously. As long as the data is recorded continuously, the drive only needs to read the data sequentially and send it to the player.
  • However, when AV data that should be played back continuously is divided and arranged discretely on the disc, a seek operation occurs between the individual continuous sections, and data reading stops during that period; that is, the supply of data may stop.
  • It is preferable that the VOB file be recorded in a continuous area. However, there is also data, such as caption data, that is played back in synchronization with the video data recorded in the VOB, and, as with the VOB file, the caption data must also be read from the BD-ROM by some method. As one such means, the VOB file is divided into several blocks and recorded interleaved with the image data.
  • the lower part of Fig. 11 is a diagram for explaining the interleaved recording.
  • FIG. 12 is a diagram for explaining a VOB data continuous supply model using a track buffer 309 that solves the problem in the interleaved recording described above.
  • VOB data is temporarily stored in the track buffer 309. If the data input rate to the track buffer 309 is set higher than the data output rate from it, the amount of data stored in the track buffer 309 keeps increasing as long as data continues to be read from the BD-ROM.
  • the input rate to the track buffer 309 is Va and the output rate from the track buffer is Vb.
  • Suppose that one continuous recording area of the VOB continues from logical address "a1" to "a2", and that between "a2" and "a3" image data is recorded and VOB data cannot be read.
  • the lower diagram of FIG. 12 is a diagram showing the accumulation amount of the track buffer 309.
  • the horizontal axis indicates time, and the vertical axis indicates the amount of data stored in the track buffer 309.
  • Time "t1" indicates the time at which reading of "a1", the start point of the continuous recording area of the VOB, was started.
  • Time "t2" is the time at which the data at "a2", the end point of the continuous recording area, is read. That is, the amount of data in the track buffer increases at the rate (Va - Vb) from time "t1" to time "t2", and the data storage amount B(t2) at time "t2" can be obtained by the following (Formula 1):
  B(t2) = (Va - Vb) × (t2 - t1) ... (Formula 1)
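The accumulation model above, including (Formula 1), can be sketched as follows; the rates, times, and function names are illustrative assumptions.

```javascript
// (Formula 1): data buffered in the track buffer while reading the continuous
// area, assuming input rate Va exceeds output rate Vb.
function bufferAtEndOfRead(Va, Vb, t1, t2) {
  return (Va - Vb) * (t2 - t1); // B(t2) = (Va - Vb) x (t2 - t1)
}

// While no VOB data can be read (the image-data section between "a2" and "a3"),
// the buffer drains at Vb; supply is continuous if the buffered amount covers it.
function survivesJump(Va, Vb, t1, t2, jumpTime) {
  return bufferAtEndOfRead(Va, Vb, t1, t2) >= Vb * jumpTime;
}
```

`survivesJump` expresses the continuous-supply condition: the amount buffered by "t2" must cover the drain at Vb for as long as VOB reading is stopped.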
  • the structure of the navigation data (BD management information) recorded on the BD-ROM will be described with reference to FIG. 13 and FIG.
  • FIG. 13 shows the internal structure of the VOB management information file (“YYY. VOBI”).
  • The VOB management information includes the stream attribute information (Attribute) and the time map (TMAP).
  • The stream attribute information includes the video attribute (Video) and the audio attributes (Audio).
  • the time map (TMAP) is a table having information for each VOBU, and has the number of VOBUs (Number) and the information of each VOBU (VOBU # 1 to VOBU #n).
  • Each VOBU information has a playback time length (Duration) of the VOBU and a data size (Size) of the VOBU.
  • FIG. 14 is a diagram for explaining the details of the VOBU information.
  • an MPEG stream has two physical quantity aspects, a temporal aspect and a data size aspect.
  • For example, AC3 (Audio Code number 3), which is a compression standard for audio, performs compression at a fixed bit rate, so the relationship between time and address can be obtained from a linear expression.
  • In the case of MPEG video, each frame has a fixed display time (for example, in NTSC one frame has a display time of 1/29.97 seconds), but the data size of each frame after compression varies greatly depending on the characteristics of the picture and on the picture type used for compression, the so-called I/P/B pictures. Therefore, in the case of MPEG video, the relationship between time and address cannot be expressed in a general form.
  • the time map (TMA P) links the relationship between time and address in the VOB.
  • the time map (TMAP) is a table having the number of frames in the VOBU and the number of packs in the VOBU as entries for each VOBU.
  • FIG. 15 is a diagram for explaining an address information acquisition method using a time map.
  • When time information (Time) is given, first, a search is performed to find which VOBU the time belongs to. Specifically, the number of frames of each VOBU in the time map is added up, and the VOBU at which the sum of frames first exceeds or matches the value obtained by converting the time into a number of frames is the VOBU corresponding to that time.
  • Next, the sizes of the VOBUs in the time map are added up, up to the VOBU immediately before the VOBU found above; that sum is the start address (Address) of the pack that must be read to play the frame containing the given time.
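The two steps above (find the VOBU containing the time, then sum the sizes of the preceding VOBUs) can be sketched as follows; the time-map structure and the 0-based frame numbering are assumptions for illustration.

```javascript
// Hedged sketch of the time-map lookup: each entry holds the frame count and
// pack count of one VOBU; given a time converted to a 0-based frame number,
// return the start address (in packs) of the VOBU containing that frame.
function lookupAddress(tmap, timeInFrames) {
  let frames = 0, packs = 0;
  for (const vobu of tmap) {
    frames += vobu.frames;
    if (frames > timeInFrames) return packs; // sum of sizes of preceding VOBUs
    packs += vobu.packs;
  }
  return packs; // time past the end of the VOB
}
```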
  • Fig. 16 is a diagram showing the structure of playlist information.
  • the playlist information includes a cell list (CellList) and an event list (EventList).
  • the cell list (CellList) is information indicating a reproduction cell sequence in the playlist, and the cells are reproduced in the description order of the list.
  • The contents of the cell list (CellList) are the number of cells (Number) and cell information (Cell#1 to Cell#n).
  • Each cell information (Cell#1 to Cell#n) contains a VOB file name (VOBName), a valid section start time (In) and a valid section end time (Out) in the VOB, and a subtitle table (SubtitleTable).
  • VOBName: VOB file name
  • In: valid section start time
  • Out: valid section end time
  • SubtitleTable: subtitle table
  • The valid section start time (In) and valid section end time (Out) are each expressed as a frame number in the VOB, and by using the time map (TMAP) described above, the address of the VOB data necessary for playback can be obtained.
  • The subtitle table (SubtitleTable) is a table holding subtitle information to be played back in synchronization with the VOB. Like audio, subtitles can exist in multiple languages, and the subtitle table consists of the number of languages (Number) followed by a table for each individual language (Language#1 to Language#k).
  • The table for each language consists of language information (Language), the number of pieces of subtitle information to be displayed (Number), and the subtitle information to be displayed (Speech#1 to Speech#j).
  • Each piece of subtitle information consists of the file name of the corresponding image data (Name), a caption display start time (In), a caption display end time (Out), and a caption display position (Position).
  • the event list (EventList) is a table that defines events that occur in the play list.
  • The event list is composed of the number of events (Number) followed by the individual events (Event#1 to Event#m).
  • Each event (Event#1 to Event#m) consists of an event type (Type), an event ID (ID), an event generation time (Time), and a validity period (Duration).
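As a hedged sketch of how such an event list might be consumed by the scenario processor, using the field names above (everything else, including the "TimeEvent" type string, is an assumption):

```javascript
// Sketch: when the clock reaches an event's generation time (Time), the IDs of
// the time events due at that moment are thrown to the program processor.
// Field names follow the description; type strings and structure are assumed.
function dueEvents(eventList, now) {
  return eventList
    .filter((ev) => ev.type === "TimeEvent" && ev.time === now)
    .map((ev) => ev.id);
}
```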
  • FIG. 17 is a diagram showing a configuration of an event handler table (“XXX. PROG”) having event handlers (time events and user events for menu selection) for each playlist.
  • XXX.PROG: event handler table
  • The event handler table holds the number of event handlers/programs defined (Number) and the individual event handlers/programs (Program#1 to Program#n).
  • The description in each event handler/program begins with an event handler start definition (an event_handler tag), followed by an event handler ID (event_handler id) paired with the event ID described above, after which the program is described between the brackets that follow "function".
  • BD.INFO: information relating to the entire BD-ROM
  • FIG. 18 is a diagram showing the structure of BD. INFO, which is BD-ROM overall information.
  • the entire BD-ROM information is composed of a title list (TitleList) and an event list (EventList) for global events.
  • the title list (TitleList) is composed of the number of titles in the disc (Number) and the following title information (Title #l to Title #n).
  • Each title information (Title#1 to Title#n) contains a table of the playlists included in the title and a chapter list (ChapterList). The playlist table contains the number of playlists in the title (Number) and the playlist names (Name), that is, the file names of the playlists.
  • The chapter list (ChapterList) is composed of the number of chapters (Number) included in the title and the individual chapter information (Chapter#1 to Chapter#n). Each chapter information has a cell table (CellTable) of the cells included in the corresponding chapter, and the cell table is composed of the number of cells (Number) and the entry information of each cell (CellEntry#1 to CellEntry#k).
  • Cell entry information (CellEntry # 1 to CellEntry #k) is described by the name of the playlist including the cell and the cell number in the playlist.
  • The event list contains the number of global events (Number) and the information of each global event (Event#1 to Event#m). It should be noted that the first global event defined is called the first event (FirstEvent) and is the event called first when the BD-ROM is inserted into the player. Each global event information has only an event type (Type) and an event ID (ID).
  • FIG. 19 shows the structure of the global event handler table (“BD. PROG”). This table has the same contents as the event handler table described in FIG. 17, and its description is omitted.
  • BD.PROG: global event handler table
  • the event generation mechanism will be described with reference to FIGS.
  • FIG. 20 is a diagram illustrating an example of a time event.
  • the time event is defined by the event list (EventList) of the playlist information ("XXX. PL").
  • FIG. 21 is a diagram showing an example of a user event by a user's menu operation.
  • A user event is defined in the event list (EventList) of the playlist information as an event whose event type (Type) is "UserEvent".
  • The user event becomes ready when the event generation time ("t1") is reached. At this point the event itself has not yet been generated; the event remains in the ready state for the period ("T1") described in its validity period information (Duration).
  • The UO event is generated by the UO manager 303 and output to the program processor 302.
  • The program processor 302 sends the UO event to the scenario processor 305, and the scenario processor 305 searches for a user event that is valid at the time the UO event is received.
  • When a target user event exists as a result of the search, the scenario processor 305 generates a user event and outputs it to the program processor 302.
  • The program processor 302 searches for an event handler having the event ID ("Ev1" in the example shown in FIG. 21) and executes the target event handler. In this example, playback of playlist#2 is started.
  • the generated user event does not include information on which remote control key is pressed by the user.
  • Information on the selected remote control key is transmitted to the program processor 302 by the UO event, and is recorded and held in the register SPRM (8) of the virtual player.
  • the event handler program can check the value of this register and execute branch processing.
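A hypothetical sketch of such branch processing; the key codes are invented for illustration and are not values defined in this description.

```javascript
// Assumed key codes for the sketch only; SPRM(8) holds the selected key,
// GPRM(0) holds the currently selected button, as described in the text.
const KEY_LEFT = 1, KEY_RIGHT = 2;

function onMenuSelect(SPRM, GPRM) {
  if (SPRM[8] === KEY_LEFT) GPRM[0] = 1;  // "Left" selects button 1
  if (SPRM[8] === KEY_RIGHT) GPRM[0] = 2; // "Right" selects button 2
  return GPRM[0];                         // other keys leave the selection unchanged
}
```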
  • FIG. 22 is a diagram illustrating an example of a global event.
  • A global event is defined in the event list (EventList) of the BD-ROM overall information ("BD.INFO").
  • An event defined as a global event, that is, an event whose event type (Type) is "GlobalEvent", is generated only when the user performs a remote control key operation.
  • a UO event is generated by the UO manager 303 and output to the program processor 302.
  • the program processor 302 sends a UO event to the scenario processor 305.
  • the scenario processor 305 generates a corresponding global event and sends it to the program processor 302.
  • the program processor 302 searches for an event handler having the event ID “menu” and executes the target event handler. For example, in the example shown in FIG. 22, playback of playlist # 3 is started.
  • Here only a single key simply called the "menu" key is used, but there may be multiple menu keys, as on the remote control of a player that plays DVDs. By defining an ID corresponding to each menu key, appropriate processing corresponding to each menu key can be performed.
  • FIG. 23 is a diagram for explaining a functional configuration of the program processor.
  • the program processor 302 is a processing module having a virtual player machine inside.
  • The virtual player machine is a functional model defined for the BD-ROM and does not depend on the implementation of each BD-ROM player; in other words, it guarantees that the same function can be executed on any BD-ROM player.
  • The virtual player machine has two major capabilities: programming functions and player variables (registers).
  • the programming function is based on Java (registered trademark) Script, and the following two functions are defined as BD-ROM specific functions.
  • Link function: stops the current playback and starts playback from the specified playlist, cell, and playback start time in the cell.
  • PNG drawing function: draws the specified PNG data on the image plane. Draw(File, X, Y)
  • Image plane clear function: clears the specified area of the image plane. Clear(X, Y, W, H)
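To a BD program, the three functions might look as follows; this is a stub sketch, and the Link parameter list is an assumption, since the description above does not give its signature.

```javascript
// Stub sketch of the virtual player's BD-ROM-specific functions. A real player
// would drive the decoders and planes; here Link/Draw/Clear only record state.
const player = {
  current: null,   // playback position set by the last Link call
  imagePlane: [],  // images currently drawn on the image plane
  Link(pl, cell, time) {  // assumed parameter list: playlist, cell, time in cell
    this.current = { pl, cell, time };
  },
  Draw(file, x, y) {      // draw the named PNG on the image plane at (x, y)
    this.imagePlane.push({ file, x, y });
  },
  Clear(x, y, w, h) {     // clear the given region of the image plane
    this.imagePlane = this.imagePlane.filter(
      (img) => img.x < x || img.x >= x + w || img.y < y || img.y >= y + h
    );
  },
};
```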
  • the player variable includes a system parameter (SPRM) indicating the player status and a general parameter (GPRM) that can be used for general purposes.
  • SPRM: system parameter
  • GPRM: general parameter
  • FIG. 24 is a diagram showing a list of system parameters (SPRM).
  • SPRM(8): selection key information
  • the programming function of the virtual player is based on Java (registered trademark) Script.
  • The programming function need not be Java (registered trademark) Script; it could be another programming function such as the B-Shell or Perl Script used on UNIX (registered trademark) OS and the like. In other words, the program language used in the present invention is not limited to Java (registered trademark) Script.
  • 25 and 26 are diagrams showing examples of programs in the event handler.
  • FIG. 25 is a diagram showing an example of a program in an event handler related to control of a menu screen having two selection buttons.
  • The program on the left side of FIG. 25 is executed using a time event at the top of PlayList#1 (Cell#1).
  • "1" is set in GPRM(0), one of the general parameters.
  • GPRM(0) is used in the program to identify the selected button. In the initial state, button 1, placed on the left side, is selected as the initial value.
  • For button 1, the PNG image "1black.png" is drawn starting at the coordinates (10, 200) (its upper-left corner).
  • For button 2, the PNG image "2white.png" is drawn starting at the coordinates (330, 200) (its upper-left corner).
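A hedged reconstruction of this initialization sequence; the exact syntax of the program in FIG. 25 may differ, and the Draw callback is passed in only to keep the sketch self-contained.

```javascript
// Sketch of the menu initialization handler: select button 1 via GPRM(0),
// then draw the two button images at the positions given in the description.
function menuInit(GPRM, Draw) {
  GPRM[0] = 1;                 // button 1 (left side) selected initially
  Draw("1black.png", 10, 200); // button 1 image, upper-left corner at (10, 200)
  Draw("2white.png", 330, 200); // button 2 image, upper-left corner at (330, 200)
}
```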
  • FIG. 26 is a diagram showing an example of a program in the event handler related to the user event of menu selection.
  • the program shown in FIG. 26 is interpreted and executed as described above.
  • FIG. 27 is a flowchart showing the flow of basic processing of AV data playback in the BD-ROM player.
  • When the BD-ROM is inserted (S101), the BD-ROM player reads and analyzes "BD.INFO" (S102) and then reads "BD.PROG" (S103). Both "BD.INFO" and "BD.PROG" are stored in the management information recording memory 204 and analyzed by the scenario processor 305.
  • The scenario processor 305 generates the first event in accordance with the first event (FirstEvent) information in the "BD.INFO" file (S104).
  • the generated first event is received by the program processor 302, and an event handler corresponding to the event is executed (S105).
  • the event handler corresponding to the first event is expected to record information specifying the playlist to be played first. If play list reproduction is not instructed, the player simply waits to accept a user event without reproducing anything (No in S201).
  • When receiving a remote control operation from the user (Yes in S201), the UO manager 303 generates a UO event for the program processor 302 (S202).
  • The program processor 302 determines whether the UO event is caused by the menu key (S203). In the case of the menu key (Yes in S203), the program processor 302 sends the UO event to the scenario processor 305, and the scenario processor 305 generates a user event (S204). The program processor 302 then executes the event handler corresponding to the generated user event (S205).
  • FIG. 28 is a flowchart showing a process flow from the start of playlist playback to the end of VOB playback in the BD-ROM player.
  • playlist reproduction is started by the first event handler or the global event handler (S301).
  • The scenario processor 305 reads and analyzes the playlist information "XXX.PL" (S302) and reads the program information "XXX.PROG" corresponding to the playlist (S303), as the information necessary for playing back the playlist to be played.
  • the scenario processor 305 starts cell reproduction based on the cell information registered in the playlist (S304).
  • Cell playback means that a request is sent from the scenario processor to the presentation controller 306, and the presentation controller 306 starts AV data playback (S305).
  • the presentation controller 306 reads the VOB information file "XXX. VOBI" corresponding to the cell to be reproduced (S402) and analyzes it.
  • the presentation controller 306 specifies the VOBU to start playback using the time map and its address, and instructs the drive controller 317 about the read address.
  • the drive controller 317 reads the target VOB data “YYY. VOB” (S403).
• The read VOB data is sent to the decoder and playback is started (S404). VOB playback continues until the playback section of the VOB ends (S405). When a next cell exists (Yes in S406), the process returns to cell playback (S304). If there is no next cell (No in S406), the processing related to playback ends.
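The playlist-to-VOB loop of steps S301 to S406 can be expressed as a simple iteration over cells, each resolved to a VOBU read address via the time map. This is a hedged sketch under assumed data shapes; the file names follow the document's "YYY.VOB" convention.

```python
# Sketch of S304-S406: for each cell in the playlist, look up the VOBU
# start address in the time map and "play" the VOB section. Illustrative.
def play_playlist(playlist, time_map):
    played = []
    for cell in playlist["cells"]:                    # S304: cell playback
        vob = cell["vob"]
        address = time_map[vob][cell["start_time"]]   # S402: time map lookup
        played.append((vob, address))                 # S403-S405: read + decode
    return played                                     # S406: no next cell

playlist = {"cells": [{"vob": "YYY.VOB", "start_time": 0},
                      {"vob": "ZZZ.VOB", "start_time": 10}]}
time_map = {"YYY.VOB": {0: 0x100}, "ZZZ.VOB": {10: 0x2000}}
result = play_playlist(playlist, time_map)
```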
  • FIG. 29 is a flowchart showing a flow of event processing after the start of AV data reproduction.
  • FIG. 29 (A) is a flowchart showing a flow of processing relating to a time event in the BD-ROM player.
  • the BD-ROM player is an event-driven player model.
• When playlist playback starts, event processing for the time event system, user event system, and caption display system is started, and these event processes are executed in parallel.
  • FIG. 29 (B) is a flowchart showing a flow of processing relating to a user event in the BD-ROM player.
• The scenario processor 305 determines whether the current time is within the user event valid period. If it is within the valid period (Yes in S606), the scenario processor 305 generates a user event (S607), and the program processor 302 executes the target event handler (S608).
  • FIG. 30 is a flowchart showing the flow of processing of caption data in a BD-ROM player.
  • the presentation controller 306 instructs the image processor 311 to erase the caption.
• The image processor 311 erases the drawn subtitles from the image plane 209 according to the instruction (S706).
  • the BD-ROM player performs basic processing related to playback of the BD-ROM based on a user instruction or BD management information recorded on the BD-ROM.
• The second embodiment relates to introducing an XML/XHTML-based screen configuration environment and a programming environment using events and scripts in order to realize richer interactivity on the BD-ROM.
  • the contents are basically based on the first embodiment, and the explanation will focus on the expanded or different parts.
• FIG. 31 is a diagram showing the module configuration, control flow, and events related to scenario control using XHTML and scripts in the playback apparatus of the present invention.
  • the user event processing unit is a module that receives a remote control signal or the like and allocates an event to the next module.
• Events related to playback control, such as play / stop / fast forward / rewind / skip / angle change / audio switching / subtitle switching, are sent to the AV playback control unit.
  • Events such as button focus movement (up / down / left / right keys) and determination are sent to the XHTML processor.
  • Events for title selection and menu call related to title switching are sent to the title control unit.
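The three routing rules above can be summarized as a small dispatch table. This is an illustrative sketch only; the module names come from the document, but the event identifiers are invented for the example.

```python
# Sketch of the user event processing unit's routing rule: playback-control
# events go to the AV playback control unit, focus/decision events to the
# XHTML processing unit, title-switch events to the title control unit.
PLAYBACK = {"play", "stop", "fast_forward", "rewind", "skip",
            "angle_change", "audio_switch", "subtitle_switch"}
FOCUS = {"up", "down", "left", "right", "enter"}
TITLE = {"title_select", "menu_call"}

def route(event):
    if event in PLAYBACK:
        return "AV playback control unit"
    if event in FOCUS:
        return "XHTML processing unit"
    if event in TITLE:
        return "title control unit"
    return "unhandled"
```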
  • the Index Table is a file listing the titles on the disc.
• It corresponds to the Title List portion of FIG. 18 extracted as a single file.
  • the title control unit is a module that performs title switching according to the Index Table when title switching is requested. If the title is defined in XHTML, control is performed so that the XHTML file associated with the title is read in the XHTML processing section.
  • the XHTML processing unit is a module that reads an XHTML file, configures the screen according to style definition information, and executes related scripts according to events. As a result of executing the script, if AV playback is necessary, the AV playback control unit is controlled to start playback, and if title switching is necessary, the title control unit is controlled.
• The AV playback control unit plays back an AV stream according to the event or instruction, and generates an event and notifies the XHTML processing unit when the state of the AV playback control unit changes or when the playback position of the AV stream reaches a specific position.
• For example, an event notifying that the player state has changed works as follows: when the user instructs playback and the user event processing unit notifies the AV playback control unit of the playback start request event, the AV playback control unit starts playback. At this point, the AV playback control unit generates an event notifying the change from the stopped state to the playback state.
• An event for notifying the playback position is an event notifying that the end of the AV stream has been reached, that a cell boundary has been reached, or that a point in the AV stream marked by data called a mark has been reached.
  • FIG. 32 is a diagram for explaining the relationship between the Index Table and the XHTML file.
  • Figure 32 shows the behavior when a title is selected.
  • the XHTML file also contains a script that controls playback.
  • the script file is indirectly referenced, but it can also be described directly in the XHTML file.
• Here it is written as XHTML, but as long as it follows the XML format, a format using proprietary tags may be used.
  • the "onLoad” attribute described in the XHTML file in the figure specifies a script to be executed when the file is loaded.
• The script "playTitle1" is called, and the script itself is described in the script file.
  • the user event processing unit and AV playback control unit also provide a mechanism for executing a script in response to the notified event.
  • the script "jumpTitle2" is called.
  • the “EndOfStream” event is an event generated by the AV playback control unit when, for example, AV playback reaches the end of the file.
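The "onLoad"/"EndOfStream" binding described above amounts to a handler registry in the XHTML processing unit. The sketch below is hypothetical; "playTitle1" and "jumpTitle2" are the script names from the document, but the registry mechanism itself is an assumption for illustration.

```python
# Sketch of script-to-event binding: "onLoad" runs playTitle1 when the file
# is loaded, and the "EndOfStream" event from the AV playback control unit
# runs jumpTitle2. Illustrative only.
class XhtmlProcessor:
    def __init__(self):
        self.handlers = {}
        self.calls = []

    def bind(self, event, script):
        self.handlers[event] = script     # e.g. from the onLoad attribute

    def fire(self, event):
        if event in self.handlers:        # run the bound script, if any
            self.handlers[event](self)

proc = XhtmlProcessor()
proc.bind("onLoad", lambda p: p.calls.append("playTitle1"))
proc.bind("EndOfStream", lambda p: p.calls.append("jumpTitle2"))
proc.fire("onLoad")        # the XHTML file is loaded
proc.fire("EndOfStream")   # AV playback reached the end of the file
```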
  • FIG. 33 is a diagram showing an example of an XHTML file in which buttons are displayed.
  • Fig. 33 shows an example of screen generation in addition to the previous example.
• XHTML is called from the Index Table as before, and the XHTML contains information for generating a menu screen and the like.
  • buttons are placed on the screen.
  • the corresponding script is executed.
• The script defined by the "onClick" attribute is executed. Selecting the button on the left, associated with the PNG image file labeled TitleA, executes the "playA" script and jumps to title 1.
• When the TitleB button is selected, playback jumps to Title 2 after a certain AV stream has been played.
  • a data file such as a PNG image file is referenced.
• The more sophisticated the menu, the more complex the graphics that are displayed, and the larger the size of the image data used to display them.
• An XHTML file can also be created so that the screen is switched by a user operation, as when selecting an item in the menu screen causes a submenu to appear. In such a case, image data that is not displayed at the first moment the menu screen appears may need to be held in the background.
  • FIG. 34 is an explanatory diagram for explaining the life cycle of data.
  • Figure 34 shows the data life cycle in such cases.
• Data files such as XHTML files and script files necessary to display the screen, as well as PNG image files referenced from those files, are preloaded into a buffer.
• When the XHTML processing unit needs data, it is read from the buffer into the work memory of the XHTML processing unit and used for screen display. When the screen disappears, the data that is no longer needed is released from the work memory, and the next necessary data is loaded into the freed buffer capacity.
  • This work memory may be shared with a buffer.
• The data loaded into the buffer is retained until it is no longer needed, and the corresponding memory is freed when it is no longer needed.
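The preload/release life cycle described above can be sketched as a capacity-tracked buffer. This is a minimal illustration under assumed sizes and file names; the real player's memory management is not specified at this level of detail.

```python
# Sketch of the preload buffer: files needed for a screen are preloaded,
# then released when the screen disappears so the next screen's data fits.
class PreloadBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.files = {}                       # name -> size

    def used(self):
        return sum(self.files.values())

    def preload(self, name, size):
        if self.used() + size > self.capacity:
            raise MemoryError("buffer full")  # author must stay within capacity
        self.files[name] = size

    def release(self, name):
        self.files.pop(name, None)            # freed when no longer needed

buf = PreloadBuffer(capacity=100)
buf.preload("menu.xhtml", 10)
buf.preload("menu.png", 60)
buf.release("menu.png")        # the menu screen disappeared
buf.preload("submenu.png", 80) # the submenu's data now fits in the freed space
```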
• The content creator can easily design a content configuration that operates on any player, so content creation becomes easy.
• The start and end timing of the data life cycle should preferably be a timing at which AV playback is stopped, paused, or otherwise not played continuously.
• Such timings include AV stream switching points, switching points of screens configured with XHTML files, title switching timings, and other points as long as the above conditions are satisfied.
  • the timing of title switching can be referred to explicitly as static data in the Index Table and is easy to control from the player. For this reason, the data life cycle starts when a certain title starts, and ends when a certain title ends.
  • FIG. 35 is a timing chart showing the flow of control between each module, the flow of data, and the data life cycle corresponding thereto.
  • the buffer may be released forcibly from the title control unit or a higher-level module, or may be performed by the XHTML processing unit.
  • the title control unit cancels the currently executing script and instructs the XHTML processing unit to release the buffer. Also, when the title is switched, it is instructed to load the file related to the next title into the buffer.
• Fig. 36 is a diagram for explaining a method of combining data into one. As shown in Fig. 36 (a), a file list that lists the files required for each title is created for each title, and when a title is selected, all the files listed in that title's file list are simply loaded into the buffer.
• Alternatively, the files, together with the directory structure, may be grouped into a single ZIP file; after the file is read into the buffer, it is expanded there and the directory structure and so on are reconstructed. The format does not have to be ZIP as long as it can bundle the files while maintaining the directory structure, nor does it need to be a compressed format.
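The bundling idea can be demonstrated with Python's standard zipfile module: one archive per title, read in a single operation, then expanded with its directory structure intact. The file names are illustrative; ZIP is used here only because the document names it as one acceptable format.

```python
import io
import zipfile

# Sketch: bundle a title's files (with directory paths) into one archive so
# a single read fills the buffer, then expand them back out.
def bundle(files):
    blob = io.BytesIO()
    with zipfile.ZipFile(blob, "w") as z:
        for path, data in files.items():
            z.writestr(path, data)            # path preserves directory layout
    return blob.getvalue()

def unbundle(blob):
    with zipfile.ZipFile(io.BytesIO(blob)) as z:
        return {name: z.read(name) for name in z.namelist()}

title_files = {"title1/menu.xhtml": b"<html/>",
               "title1/img/a.png": b"PNG..."}
restored = unbundle(bundle(title_files))
```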
• FIG. 37 is an explanatory diagram of a method for handling memory size expansion. As shown in FIG. 37, there is a method of distinguishing between the buffer size guaranteed to be installed in any player and the actual size of the installed buffer. With this method, if richer content is produced in the future, a corresponding player will display it richly, while even a minimal player can still guarantee the minimum display.
• The content creator prepares two types of data. Both are data read for the same title.
• data1 in the figure is kept below the buffer size guaranteed by any player. Therefore, any player can read and play data1.
• data2 is larger than the guaranteed buffer size. A player equipped with only the minimum required buffer size cannot read data2. However, a player with a larger buffer can read this data, making it possible to play richer content by taking advantage of data2 being larger than data1. Richer content here means larger image data, higher resolution, more colors, and more scripts, allowing various additional controls.
  • data that is always kept to a size smaller than the guaranteed buffer size is indispensable data recorded on the disc.
• Data that is larger than the guaranteed buffer size but can provide richer content is optional data that may or may not be recorded according to the author's intention.
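The data1/data2 selection rule above reduces to a simple comparison against the player's installed buffer size. This sketch assumes an arbitrary guaranteed size; the actual guaranteed value is not specified in the document.

```python
# Sketch of the selection rule: data1 is mandatory and must fit the
# guaranteed buffer; data2 is optional richer data loaded only by players
# whose actual buffer is large enough. Sizes are illustrative.
GUARANTEED_SIZE = 64  # assumed guaranteed buffer size, units arbitrary

def choose_data(player_buffer_size, data1_size, data2_size=None):
    assert data1_size <= GUARANTEED_SIZE        # data1 must fit any player
    if data2_size is not None and data2_size <= player_buffer_size:
        return "data2"                          # richer content, big buffer
    return "data1"                              # minimum display, any player

```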
• The third embodiment relates to introducing a mechanism for switching the language to be displayed or processed according to the player's language setting or the user's setting. The contents are basically based on the first and second embodiments, and the explanation focuses on the expanded or different parts.
  • FIG. 38 is a diagram of a player variable table to which language settings for applications are added.
• In this embodiment, as shown in FIG. 38, a system parameter indicating the player's own language setting, or a user-specified language designation, is introduced for the XML/XHTML-based screen configuration environment and the programming environment using events and scripts. A navigation language code parameter is added to the system parameters (SPRM) described with reference to FIG. 24 as the language setting for the application.
• The system parameter shown in FIG. 38 is introduced as an example; the language designation from the user may instead be the player's own language area information (region or country name) or a fixed value (e.g., English) given at the time of player manufacture.
• The language code for the audio stream (Language code for AST) and the language code for the subtitle stream (Language code for STST) described with reference to FIG. 24 may be applied as the language setting for the application, and subsequent processing may be performed accordingly.
  • FIG. 39 is a diagram showing an example of switching the language displayed or processed by the application according to the language setting for the application.
  • the outline of the third embodiment will be described with reference to FIG. 39.
• Among the scripts, some functions, such as a program function for language selection, execute the same operation whether the language is English or Japanese, while other functions, such as a program function for a bonus video, are executed differently for English and Japanese.
• FIG. 40 shows an example in which the application example used in FIG. 39 is switched by XML/XHTML and script descriptions.
• Playback buttons for English and Japanese are described in a single XML/XHTML file with an invisible attribute, and the script program function onload(), executed when this application runs, is specified in the <body> element.
• In the program function onload(), the variable lang corresponding to the above-mentioned language setting for the application is checked. If the language setting for the application is Japanese, the playback button for Japanese is made visible; if it is English, the playback button for English is made visible.
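The onload() logic of FIG. 40 can be restated in Python as a visibility toggle driven by the language variable. The element names and language codes below are hypothetical; only the both-invisible-then-show-one behavior comes from the document.

```python
# Sketch of FIG. 40's onload(): both playback buttons start invisible, and
# the one matching the application language setting is made visible.
def onload(lang, elements):
    target = "play_jpn" if lang == "ja" else "play_eng"
    for name in elements:
        elements[name] = (name == target)   # visible only if it matches
    return elements

buttons = {"play_eng": False, "play_jpn": False}  # the invisible attribute
japanese_view = onload("ja", dict(buttons))
english_view = onload("en", dict(buttons))
```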
• FIG. 40 is merely an example; although not illustrated, other descriptions may be used as long as they switch the language displayed or processed by the application based on XML/XHTML or script descriptions.
• When the application does not correspond to the language specified in the language setting for the application, the language displayed or processed by the application may be switched to a predetermined standard language.
• The standard language may be determined as a fixed specification such as English, or the representative language of the region to which the player belongs, the language code for the audio stream described above, or the language code for the subtitle stream described above may be used as the standard language.
• The file names and reserved words are examples only, and a separate extension may be added instead of modifying the file body.
• This section explains how the XML/XHTML and script application described in this embodiment, which can change the language displayed or processed based on the language setting for the application, can be combined with the method described in the second embodiment of reading only the files necessary for that language into the buffer.
• First, an IndexTable, file list, or compressed file as described above is prepared for each language, described so as to load the file group for that language, and recorded on the disc.
• Then, there is a method of loading the IndexTable, file list, or compressed file for the set language.
  • FIG. 41 is a sequence diagram of a language switching method using an IndexTable for a set language, a file list, or a compressed file.
• The XHTML processing unit checks the value of the navigation language code to confirm the language setting for the application (step S802).
• In step S803, in order to confirm whether the application corresponding to the title can display or process the language confirmed in step S802, a search is performed for an IndexTable, file list, or compressed file corresponding to that language (step S803).
• As shown in FIGS. 42 (a) and 42 (b), there is a method of appending a reserved word for the corresponding language to the file name of the per-language index table, file list, or compressed file.
• For example, the index table file name for the standard language is Index.bd, and the file name of the Japanese index table is formed by appending the Japanese reserved word "JPN" to the file body, giving Index_JPN.bd. The file names and reserved words are just examples, and a separate extension may be added instead of modifying the file body.
• The IndexTable may also hold the number of languages supported by the application together with a list of pairs of each supported language and a link to the corresponding file list or compressed file.
• In step S804, if the search in step S803 is successful, step S805 is executed; if the search is unsuccessful, step S806 is executed.
• In step S805, the IndexTable, file list, or compressed file corresponding to the language confirmed in step S802 is loaded. Specifically, in the case of an IndexTable, the title control unit reads the IndexTable of the corresponding language and executes the title (step S807). On the other hand, in step S806, an IndexTable, file list, or compressed file corresponding to the standard language described above is loaded. Specifically, in the case of an IndexTable, the title control unit reads the standard-language IndexTable and executes the title (step S807).
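The search-then-fallback flow of steps S803 to S807 can be sketched as a file-name lookup using the reserved-word convention (Index_JPN.bd falling back to Index.bd). The disc contents below are illustrative.

```python
# Sketch of S803-S806: look for an IndexTable named with the reserved word
# for the set language, and fall back to the standard-language Index.bd
# when the search fails.
def select_index_table(disc_files, lang_code):
    candidate = f"Index_{lang_code}.bd"   # S803: search per-language file
    if candidate in disc_files:           # S804: search succeeded?
        return candidate                  # S805: load language IndexTable
    return "Index.bd"                     # S806: load standard language

disc = {"Index.bd", "Index_JPN.bd"}
```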
• The application may correspond to the language specified in the language setting for the application, or to the standard language. As shown in FIGS. 39, 40, and 42, it may be specified that the above reserved word is not appended to file names corresponding to the standard language.
• The fourth embodiment relates to an event processing mechanism that also takes special playback such as fast-forwarding and rewinding into account, within the XML/XHTML-based screen configuration environment and the programming environment using events and scripts described in the second embodiment. It also relates to technology that makes such a mechanism applicable to stream distribution over a network. The contents are basically based on the first and second embodiments, and the description focuses on the expanded or different parts.
  • the configuration of the playback device of the present embodiment is the same as that described in the second embodiment with reference to FIG. 31 on the block diagram.
• The playback apparatus executes events according to the playback state of the AV playback control unit (for example, during special playback or normal playback) while playing back a title that is digital content; this point differs from the above embodiments. The playback device of this embodiment is therefore configured as follows. The AV playback control unit realizes title playback means for acquiring and playing back the title, and event generation means that acquires a point list describing the events to be generated during playback of the title and the event times at which to generate them; when the playback time of the title being played by the title playback means reaches an event time described in the acquired point list, the event generation means generates the event corresponding to that event time. The XHTML processing unit acquires a script, which is an application program describing event processing corresponding to the events described in the point list acquired by the event generation means, and realizes script execution means for executing, in accordance with the acquired script, the event processing corresponding to the event generated by the event generation means. The script execution means switches the event processing to be executed depending on playback state information included in the event generated by the event generation means.
• An application using XML/XHTML and scripts realizes an interactive application by means of events and scripts.
• However, handling event processing during special playback such as fast-forward and rewind, in addition to normal playback, made the description of content complicated.
• In this embodiment, the above-mentioned "BD management information" is applied to generate events linked to the content playback time, thereby realizing an event generation and handling system capable of supporting more expressive applications.
• FIG. 43 is a configuration diagram of a playlist file modified to include the point list.
• The event list in the playlist information described with reference to FIG. 16 is deleted, and a point list (PointList) is added as shown in FIG. 43.
• The point list consists of multiple pieces of point information (Point#1 to Point#m). Each piece of point information consists of four items: the point type (PointType), which is the type of point information; Cell-ID, which indicates the Cell in which the event is to be generated; Time, which indicates at which playback time (playback position) in the Cell specified by Cell-ID the event is generated; and Duration, which is the period during which the event continues to be generated.
  • PointType has three types as an example.
  • the first is ChapterPoint, which indicates the point at which chapters provided as story breaks in the title are switched.
  • the second is JumpPoint, which indicates the point at which playback jumps to a specific playback position for a specific title from the application.
• Third, there is EventPoint, which indicates a point (event) that provides some application mainly in conjunction with the playback time of the content, such as the aforementioned commentary picture. When the point type is ChapterPoint or JumpPoint and it is not necessary to specify a duration for the point, "0" is assigned to Duration, the period during which the event continues to be generated.
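The point-information structure of FIG. 43 (PointType, Cell-ID, Time, Duration) can be sketched as a small record type. The concrete values below are invented for illustration.

```python
from dataclasses import dataclass

# Sketch of one point-information entry in the point list of FIG. 43.
@dataclass
class Point:
    point_type: str    # "ChapterPoint", "JumpPoint", or "EventPoint"
    cell_id: int       # Cell in which the event is generated
    time: int          # playback time (position) within that Cell
    duration: int = 0  # period the event keeps being generated; 0 when unused

point_list = [
    Point("ChapterPoint", cell_id=1, time=0),               # Duration omitted -> 0
    Point("EventPoint", cell_id=1, time=120, duration=5),   # e.g. commentary picture
]
```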
  • FIG. 44 is a diagram showing an example in which a commentary picture application is realized by a system that generates and processes an event linked to a content playback time. The outline of the event generation system according to the present invention will be described with reference to FIG.
  • the lower part shows the behavior of the AV playback control unit that performs AV stream playback control processing based on the AV stream VOB file, VOB management information file, and playlist information file.
• The middle section shows the behavior of the above-mentioned XHTML processing unit that receives information such as various events and playback times from the AV playback control unit and controls execution of applications using XML/XHTML and scripts.
• The top row shows an example of an application (here, a commentary picture) realized by this system.
  • the AV playback control unit plays back the AV stream based on the VOB file, the VOB management information file, and the playlist information file.
• The AV playback control unit monitors whether the content playback time has reached a playback time indicated by the point information included in the playlist information described above.
• When the content playback time reaches a playback time described in the point information ((1) Point detection), the AV playback control unit notifies the XHTML processing unit of an event carrying the point type information ((2) Point event notification). When notified of the event, the XHTML processing unit checks the event description of the application and executes the script (event handler) that should run when the event occurs ((3) Event processing).
• As the event processing, when the content playback time arrives, a balloon (commentary picture) noting the altitude of 3776 m is drawn over Mt. Fuji appearing in the scene ((4) Commentary picture display).
  • FIG. 45 is a flowchart of a system for generating and processing an event linked to content playback time.
• A specific flowchart of the event generation and handling system according to the present invention described with reference to FIG. 44 will be described with reference to FIG. 45.
  • the AV playback control unit caches point information (Point information) included in the playlist information (step S902).
  • the AV playback control unit repeats the following processing (step S903: no) until playback of the playlist ends (step S903: yes).
• When the playback time of the AV stream reaches a time specified by the point information cached in step S902 (step S904: yes), the AV playback control unit generates an event based on the point information and notifies the XHTML processing unit (step S905).
  • the event includes at least point type information (PointType) and event period information (Duration) as necessary.
  • the XHTML processing unit executes the event handler (event processing script) of the event based on the description of the application (step S906).
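The loop of FIG. 45 (S902 cache, S903 until playback ends, S904 point reached, S905/S906 notify and handle) can be sketched as follows. Playback time is modeled as integer ticks purely for illustration; the handler stands in for the XHTML processing unit's event handler.

```python
# Sketch of FIG. 45: cache the point times, then on each tick of playback
# check whether a point time has been reached and invoke the handler.
def play_with_events(point_times, end_time, handler):
    fired = []
    cached = set(point_times)            # S902: cache point information
    for t in range(end_time + 1):        # S903: until playlist playback ends
        if t in cached:                  # S904: playback time reached a point?
            fired.append(handler(t))     # S905-S906: notify + run event handler
    return fired

events = play_with_events({3, 7}, end_time=10,
                          handler=lambda t: f"EventPoint@{t}")
```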
  • FIG. 46 is a diagram illustrating a description example of the commentary picture application.
  • Figure 46 shows a specific application description example.
• The application description includes a content description that describes screen drawing items such as buttons, images, and text information for menu screens, popup screens, games, and the like; a style description that describes the position, appearance, and so on of the screen items described in the content description; a script description that describes the specific operations of the application, such as what happens when a button is pressed or when an event is generated; and an event description that describes information such as the script (event handler) to be executed when an event occurs and its arguments.
  • the XHTML processing unit receives the event of the point when the playback time of the content reaches the time described in Point # 0001.
• The XHTML processing unit first checks the event description delimited by the <bdi:bevent> element and searches for the <bdi:beitem> element in which information related to "EventPoint" is described. The type "EventPointFired" is described in the <bdi:beitem> element, and the script to be executed as the event handler, display(), is specified in the onoccur attribute. As information that can be handled in the script, the object name to be operated on in the script ("comment", indicated by the object attribute) is described.
  • This application description is only an example. Other information such as a point information number may also be provided as an attribute.
• The XHTML processing unit checks the script description delimited by the <script> element and executes the display() function.
• The XHTML processing unit draws the image "comment.png" corresponding to the above-mentioned commentary picture, realizing the commentary picture.
  • the system according to the present embodiment described above can easily realize an application that provides some service in conjunction with content, such as a commentary picture that attaches a detailed information image in synchronization with a video.
• The start timing of an application to be provided in conjunction with the content is described in the point information of the point list in association with the playback time of the content. Therefore, even during special playback such as fast-forward and rewind, events can be generated, and the application can be described concisely without requiring any special description on the content side.
• FIG. 47 is a diagram illustrating an example in which a chapter title application is realized by a system that generates and processes events that can also notify the playback state. Specifically, as shown in the upper part of FIG. 47, there is a need to display the chapter title when a chapter is reached during fast-forward playback, to help the user keep track of the story. Since this chapter title would be intrusive during normal playback, it is not displayed then; it is displayed only during fast-forward playback.
• The AV playback control unit, which is the event generation means, generates an event including playback state information indicating that the title is being specially played back when the title playback means specially plays back the title.
  • the lower part shows the behavior of the AV playback control unit that performs AV stream playback control processing based on the AV stream VOB file, VOB management information file, and playlist information file.
• The middle section shows the behavior of the XHTML processing unit that receives information such as various events and playback times from the AV playback control unit and controls the execution of applications using XML/XHTML and scripts.
  • the upper row shows an example of the application realized by this system (chapter title display during fast forward).
  • the AV playback control unit plays back the AV stream based on the VOB file, the VOB management information file, and the playlist information file.
• The AV playback control unit monitors whether the content playback time has reached a playback time indicated by the point information included in the playlist information described above.
• The AV playback control unit detects from the point information that the point reached is a chapter point ((1) Point detection).
  • the AV playback control unit acquires player status information when the point is reached ((2) playback status check).
• The player status information is set to the title playback state; here, it is confirmed that the title playback state is the fast-forward playback state.
• The player status information may be, for example, the content playback time, setting information for the player such as language and angle, or even real-time information at the time of playback.
  • the AV playback control unit notifies the XHTML processing unit of the event with the above point type information and title playback state information ((3) Point + playback state event notification).
• The XHTML processing unit checks the event description of the application and executes the script (event handler) that should be executed when the event occurs ((4) Event processing).
  • FIG. 48 is a flowchart of a system that generates and processes an event that is linked to the content playback time and can also notify the playback state.
• A specific flowchart of the event generation and handling system according to the present invention described with reference to FIG. 47 will be described with reference to FIG. 48.
• When the AV playback control unit is instructed to play a playlist (step S1001), it caches the point information (Point information) included in the playlist information (step S1002).
• The AV playback control unit repeats the following processing (step S1003: no) until playback of the playlist ends (step S1003: yes). When the playback time of the AV stream reaches a time specified by the point information cached in step S1002 (step S1004: yes), the AV playback control unit checks the playback state information at the time the point is reached (step S1005). Next, the AV playback control unit generates an event based on the point information and notifies the XHTML processing unit (step S1006).
• The event includes at least the point type information (PointType) and the playback state information confirmed in step S1005, and also includes event period information (Duration) as necessary.
  • FIG. 49 is a diagram illustrating a specific description example of the chapter title application.
  • The 12th point information (Point#0012) is the delimiter of the 12th chapter, and its point type is "ChapterPoint".
  • When the playback time of the content reaches the time described in Point#0012, the XHTML processing unit first checks the event description delimited by the <bdi:bevent> element and searches for the <bdi:beitem> element that describes information about ChapterPoint (only one in this example).
  • The event type "ChapterPointFired" and related information are described in that <bdi:beitem> element, and the script to be executed as the event handler is specified in its onoccur attribute.
  • Here, the script chaptertitle() is described as the one to be executed. That is, the event is generated for each chapter, a chapter being a unit of the story constituting the title, and the event processing corresponding to the event is processing that accompanies the switching of chapters.
  • The XHTML processing unit then checks the script description delimited by the <script> element and executes the chaptertitle() function.
  • The chaptertitle() function acquires the order number of the point information as beitem.pointId, and when the order number of the point is "12", as in this embodiment, it substitutes "visible" (the visible state) for the visibility attribute of the object whose id attribute (of the <object> element) is "chapter12".
  • As a result, the visibility attribute of the image associated with the event is changed from the invisible state to the visible state, and the image set to the visible state is then displayed.
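As a rough reconstruction of the handler logic in FIG. 49: the beitem and object models below are stand-ins for the real XHTML document objects, so all names here are assumptions rather than the actual application code.

```javascript
// Sketch of the chaptertitle() event handler behavior described above.
// "objects" stands in for the <object> elements of the application;
// each chapter title image starts out invisible.
const objects = {
  chapter12: { visibility: "invisible" },
};

// Invoked via the onoccur attribute when a "ChapterPointFired" event
// occurs; beitem.pointId carries the order number of the fired point.
function chaptertitle(beitem) {
  if (beitem.pointId === 12) {
    // Reveal the chapter-12 title image, as in this embodiment.
    objects.chapter12.visibility = "visible";
  }
}
```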
  • The scheme in which the AV playback control unit notifies the XHTML processing unit of an event including the player status information at the time of event generation is not limited to events whose point type is ChapterPoint.
  • For events of other point types as well, the AV playback control unit may notify the XHTML processing unit including the player status information.
  • a generation condition flag is newly added for each event described in the BD management information.
  • FIG. 50 is a configuration diagram of a playlist file modified so that event generation condition information is added to the point information. This will be specifically described with reference to FIG. 50.
  • event generation condition information is further added as a member of each point information of the point list added in FIG.
  • This event generation condition information describes the condition for generating an event based on the point information. For example, in this embodiment, it describes whether or not the event must be generated (is required) even during fast-forward playback. Since the roles of the other existing members are the same as described above, their explanation is omitted.
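One possible in-memory shape for the point list of FIG. 50, with the added condition member, might look like this. The field names are illustrative assumptions; the figure defines the actual file layout.

```javascript
// Hypothetical representation of point information carrying the new
// event generation condition member described for FIG. 50.
const pointList = [
  {
    pointId: 11,
    time: 5000,                    // event time on the title timeline
    pointType: "ChapterPoint",
    // New member: is this event required even during fast-forward?
    generationCondition: { requiredDuringSpecialPlayback: false },
  },
  {
    pointId: 12,
    time: 5400,
    pointType: "ChapterPoint",
    generationCondition: { requiredDuringSpecialPlayback: true },
  },
];
```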
  • The playlist information shown in FIG. 50 is merely an example; other methods may be used as long as the player can acquire the point information, including the event generation conditions, at the time of content playback.
  • For example, information corresponding to the above point information may be described in the VOB management information shown in FIG., or in the XML/XHTML or the script describing the application.
  • The above generation conditions are merely examples, and the contents of the event generation condition information are not particularly limited to them. For example, the information may state whether or not event generation is indispensable in every playback situation.
  • Alternatively, a condition according to the state of the player may be set, such as generating an event only during a specific special playback (for example, only during rewinding).
  • The condition may also depend on the player's capabilities, such as whether or not the player is connected to the network. A combination of these conditions may also be used.
  • Instead, a conditional branch that determines whether to acquire the playback state information and execute the corresponding processing could be written in the script. However, by including the condition in the playlist information as an event generation condition, the event generation processing itself can be omitted, so the load on the player is lighter than when the conditions are judged in the script.
  • Information necessary for event processing, such as event handlers, may be cached in memory, or part of the processing may be executed collectively in advance.
  • FIG. 51 is a diagram showing an outline of the operation of the system that generates and processes an event linked to the content playback time according to the event generation condition.
  • An example in which the event generation condition information is information on whether or not it is necessary to generate an event during special playback will be described with reference to FIG.
  • Event Point A is located at the content playback time before Event Point B, and the event generation condition information of Event Point A describes that event generation during special playback is unnecessary.
  • On the other hand, the event generation condition information of Event Point B describes that event generation during special playback is mandatory.
  • During normal playback, Event A based on the point information of Event Point A and Event B based on the point information of Event Point B are each generated when the corresponding point is reached. That is, in the point list, an event generation condition, which is the condition for generating an event, is described in association with that event, and this mechanism is realized by the AV playback control unit.
  • The event generation means determines, when the playback time of the title being played by the title playback means reaches an event time described in the point list, whether or not the event generation condition corresponding to that event time is satisfied, and generates the event corresponding to the event time only when the condition is satisfied.
  • When Event Point B is reached ((1) Point detection), the event generation condition information included in the point information of Event Point B is checked ((2) Generation condition check). Since the event generation condition information of Event Point B describes that event generation during special playback is mandatory, the AV playback control unit checks the playback state information of the title at the moment Event Point B is reached ((3) Playback state check), generates an event carrying the playback state information based on the point information of Event Point B, and notifies the XHTML processing unit ((4) Point + playback state event notification). When the XHTML processing unit receives the event, it executes the corresponding event processing based on the application description ((5) Event processing).
  • The event generation condition realized by the AV playback control unit is thus a condition indicating whether or not an event needs to be generated during special playback, and while the title playback means is specially playing back the title, the event generation means generates or does not generate the event according to the condition indicated by the event generation condition corresponding to the event time.
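The decision in FIG. 51 can be expressed as a small predicate. This is a sketch under the assumption that any non-normal playback state counts as special playback; the function and field names are illustrative, not defined by the specification.

```javascript
// Sketch of the generation-condition check: during special playback
// (fast-forward, rewind, ...), only points whose condition marks them
// as required still produce events.
function shouldGenerateEvent(point, playbackState) {
  const isSpecialPlayback = playbackState !== "normal";
  if (!isSpecialPlayback) return true; // normal playback: always generate
  return point.generationCondition.requiredDuringSpecialPlayback;
}
```

With this, Event Point A (not required during special playback) is skipped while fast-forwarding, whereas Event Point B still fires.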
  • Point information, information necessary for event processing, and scripts may be cached in memory beforehand in order to reduce the load required for event processing during special playback.
  • FIG. 52 is a flowchart of a system that generates and processes an event linked to the content playback time according to the event generation condition.
  • Event generation based on the event generation condition information will now be concretely described with reference to the flowchart of FIG. 52.
  • When the AV playback control unit is instructed to play a playlist (step S1101), it caches the point information (Point information) included in the playlist information (step S1102).
  • The AV playback control unit repeats the following processing (step S1103: no) until playback of the playlist ends (step S1103: yes).
  • When the playback time reaches a point, the AV playback control unit checks the event generation condition included in the point information of that point and confirms whether the state at the moment the point is reached satisfies the event generation condition (step S1105). If the event generation condition is satisfied, the process proceeds to step S1106 to continue the event generation processing (step S1105: yes); if it is not satisfied, playback of the playlist continues without generating an event (step S1105: no).
  • the AV playback control unit checks the playback state information when the point is reached (step S1106). Next, the AV playback control unit generates an event based on the above point information and notifies the XHTML processing unit (step S1107).
  • the event includes at least point type information (PointType) and playback state information confirmed in step S1106, and also includes event period information (Duration) as necessary.
  • the XHTML processing unit executes the event handler of the event based on the description of the application (step S1108).
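Steps S1105 to S1107 can be sketched together as one helper. The names below are assumptions; only the control flow mirrors the flowchart of FIG. 52.

```javascript
// Sketch of steps S1105-S1107 of FIG. 52: check the event generation
// condition first, and only then attach the playback state and notify
// the XHTML processing unit.
function maybeNotify(point, playbackState, notifyXHTML) {
  const isSpecialPlayback = playbackState !== "normal";
  // S1105: condition not met -> no event, playback simply continues.
  if (isSpecialPlayback && !point.generationCondition.requiredDuringSpecialPlayback) {
    return false;
  }
  // S1106 + S1107: generate the event with the playback state attached.
  notifyXHTML({
    type: point.pointType + "Fired",
    pointId: point.pointId,
    playbackState,              // playback state information (S1106)
    duration: point.duration,   // event period information (Duration)
  });
  return true;
}
```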
  • As described above, event generation based on the event generation condition can reduce the event generation load on the player during special playback, and content descriptions with a higher degree of freedom become possible.
  • this embodiment can be applied to network distribution.
  • In that case, the playback device first downloads the IndexTable, and then, based on it, downloads the script described in XHTML and the stream to be played.
  • The playback device of the present invention then handles the IndexTable and the script as already described in the second embodiment.
  • The downloaded script and stream may be stored on the hard disk for each title and cached in memory immediately before playback. By doing so, it is possible to enjoy interactive playback of content in real time and seamlessly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention concerns a playback control system using XML and a script, implemented via network distribution, thereby enabling event processing during special playback such as fast-forward or rewind. An A/V playback control unit acquires a point list in which each of the listed events to be generated during playback of titles, which are digital contents, is associated with its respective listed event time, i.e. the time at which that event is to be generated. When the playback time of a title being played reaches the corresponding event time in the point list, the A/V playback control unit generates the event corresponding to that event time. An XHTML processing unit acquires a script, which is an application program describing event processing, i.e. the processing corresponding to the events in the acquired point list. According to the acquired script, the XHTML processing unit executes the event processing corresponding to the generated event, and furthermore switches the event processes to be executed according to the playback state information included in the event.
PCT/JP2006/303148 2005-03-02 2006-02-22 Recording medium, playback apparatus, and playback method WO2006092996A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005057955 2005-03-02
JP2005-057955 2005-03-02

Publications (1)

Publication Number Publication Date
WO2006092996A1 true WO2006092996A1 (fr) 2006-09-08

Family

ID=36941030

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/303148 WO2006092996A1 (fr) 2005-03-02 2006-02-22 Recording medium, playback apparatus, and playback method

Country Status (1)

Country Link
WO (1) WO2006092996A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008114389A1 (fr) * 2007-03-19 2008-09-25 Pioneer Corporation Système de reproduction de contenu et son procédé de commande

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11112922A * 1997-09-30 1999-04-23 Hitachi Ltd Stream event point detection and display method and apparatus
JP2002033993A * 2000-07-17 2002-01-31 Sanyo Electric Co Ltd Video recording and playback apparatus
JP2002354395A * 2001-05-30 2002-12-06 Sanyo Electric Co Ltd Optical disc playback apparatus
WO2003051058A1 * 2001-12-12 2003-06-19 Koninklijke Philips Electronics N.V. Playback of interactive television applications
WO2004025651A1 * 2002-09-12 2004-03-25 Matsushita Electric Industrial Co., Ltd. Recording medium, playback device, program, playback method, and recording method
JP2004221765A * 2003-01-10 2004-08-05 Toshiba Corp Information playback apparatus and information playback method
JP2005346873A * 2004-06-04 2005-12-15 Canon Inc Information playback apparatus and information playback method



Similar Documents

Publication Publication Date Title
JP6541853B2 (ja) Playback apparatus and playback method
JP2005332521A (ja) Information recording medium and information playback apparatus
JP2017204320A (ja) Playback method and playback apparatus
JP2007036663A (ja) Information recording medium considering compatibility, and recording apparatus, recording method, and recording program therefor
JP5295572B2 (ja) Information recording medium and information recording medium playback system
WO2006092996A1 (fr) Recording medium, playback apparatus, and playback method
JP6445933B2 (ja) Recording medium, playback apparatus, and method therefor
WO2006090664A1 (fr) Information recording medium, playback device, and playback method
JP2007018623A (ja) Information recording medium, and playback apparatus and playback method therefor
JP2007133938A (ja) Information recording medium having a flag indicating whether audio mixing output is permitted, and playback apparatus and playback method therefor
JP2007011899A (ja) Information recording medium considering start-up conditions of a playback control environment described in a declarative language, and playback apparatus and playback method therefor
JP2006244654A (ja) Printing method from a disc player
JP6591202B2 (ja) Recording medium, playback apparatus, and method therefor
JP2007048383A (ja) Information recording medium, and recording apparatus, recording method, and recording program therefor
JP2007235185A (ja) Information recording medium suitable for random access, and recording/playback apparatus and recording/playback method therefor
JP2006236453A (ja) Information recording medium, and playback apparatus and playback method therefor
JP2006228339A (ja) Information recording medium considering resource management of a playback control environment described in a declarative language, and playback apparatus and playback method therefor
JP2006244655A (ja) Playback apparatus and playback method
JP2007012217A (ja) Information recording medium, and playback apparatus and playback method therefor
WO2016009606A1 (fr) Recording medium, playback device, and method therefor
WO2016021118A1 (fr) Recording medium, playback device, and playback method
JP2006013726A (ja) Information recording medium, and recording/playback apparatus and recording/playback method therefor
JP2006073127A (ja) Information recording medium suitable for random access, and recording/playback apparatus and recording/playback method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 06714288

Country of ref document: EP

Kind code of ref document: A1
