
WO2018142945A1 - Information processing device and method - Google Patents

Information processing device and method

Info

Publication number
WO2018142945A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame rate
video
identification information
mpd
information processing
Prior art date
Application number
PCT/JP2018/001314
Other languages
English (en)
Japanese (ja)
Inventor
Yoshiyuki Kobayashi (義行 小林)
Toshiya Hamada (俊也 浜田)
Mitsuhiro Hirabayashi (平林 光浩)
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2018142945A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/488 Data services, e.g. news ticker
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus

Definitions

  • the present disclosure relates to an information processing apparatus and method, and more particularly to an information processing apparatus and method capable of conveying a change in frame rate.
  • In MPEG-DASH (Moving Picture Experts Group - Dynamic Adaptive Streaming over HTTP), the client selects and acquires content having a suitable bit rate from among contents of different bit rates, and continues reproduction.
  • MPEG-DASH (Dynamic Adaptive Streaming over HTTP) (URL: http://mpeg.chiariglione.org/standards/mpeg-dash/media-presentation-description-and-segment-formats/text-isoiec-23009-12012-dam-1)
  • However, because the network bandwidth changes dynamically during a connection, moving image playback may slow down regardless of the intention of the end user or the content provider, and it was difficult to distinguish whether such a slowdown was intentional.
  • The present disclosure has been made in view of such a situation, and is intended to make it possible to communicate a change in frame rate.
  • An information processing apparatus according to one aspect of the present technology includes a reproducing unit that reproduces moving images of two or more different frame rates, and a display control unit that, when a moving image having a frame rate different from the frame rate reproduced immediately before is received by the reproducing unit, displays identification information indicating that the frame rate is different from the frame rate reproduced immediately before.
  • the display control unit may display the identification information at a frame rate different from a video frame rate.
  • the identification information may indicate that the frame rate is lower than the frame rate reproduced immediately before.
  • the identification information may indicate that the frame rate is higher than the frame rate reproduced immediately before.
  • the identification information may indicate a ratio to a maximum frame rate.
  • the identification information may indicate a ratio to the frame rate reproduced immediately before.
  • the moving images of the two or more different frame rates are transmitted from the content distribution side.
  • the identification information is transmitted from the content distribution side.
  • the identification information is described in an adaptation set (AdaptationSet) or a representation (Representation) of the MPD and transmitted.
  • In an information processing method according to one aspect of the present technology, the information processing apparatus reproduces moving images of two or more different frame rates and, when a moving image having a frame rate different from the frame rate reproduced immediately before is received, displays identification information indicating that the frame rate is different from the frame rate reproduced immediately before.
  • An information processing apparatus according to another aspect of the present technology includes a moving image generation unit that generates moving images of two or more different frame rates, an identification information generation unit that generates identification information to be displayed on the reproduction side when a moving image having a frame rate different from the frame rate reproduced immediately before is received, the identification information indicating that the received frame rate is different from the frame rate reproduced immediately before, and a transmission unit that transmits the moving images generated by the moving image generation unit and the identification information generated by the identification information generation unit.
  • An MPD generation unit that generates an MPD in which the identification information generated by the identification information generation unit is described may further be included.
  • The MPD generation unit may generate the MPD by describing the frame rate of the identification information generated by the identification information generation unit.
  • the MPD generation unit may describe the identification information generated by the identification information generation unit in an adaptation set (AdaptationSet) of the MPD.
  • the MPD generation unit may describe the identification information generated by the identification information generation unit in a representation of the MPD.
  • the identification information may indicate that the frame rate is lower than the frame rate reproduced immediately before on the reproduction side.
  • the identification information may indicate that the frame rate is higher than the frame rate reproduced immediately before on the reproduction side.
  • the identification information may indicate a ratio to a maximum frame rate.
  • The identification information may indicate a ratio to the frame rate reproduced immediately before on the reproduction side.
  • In an information processing method according to the other aspect of the present technology, the information processing apparatus generates moving images of two or more different frame rates, generates identification information to be displayed on the reproduction side when a moving image having a frame rate different from the frame rate reproduced immediately before is received, the identification information indicating that the received frame rate is different from the frame rate reproduced immediately before, and transmits the generated moving images and the generated identification information.
  • moving images of two or more different frame rates are reproduced.
  • identification information indicating that the frame rate is different from the frame rate reproduced immediately before is displayed.
  • In the other aspect of the present technology, moving images of two or more different frame rates are generated, identification information to be displayed on the reproduction side when a moving image having a frame rate different from the frame rate reproduced immediately before is received is generated, the identification information indicating that the frame rate is different from the frame rate reproduced immediately before, and the generated moving images and the generated identification information are transmitted.
  • information can be processed.
  • changes in frame rates can be communicated.
  • FIG. 17 is a diagram showing an example of MPD syntax in the case of indicating the ratio of the video frame rate currently being reproduced to the maximum video frame rate.
  • FIG. 18 is a flowchart illustrating MPEG-DASH streaming playback processing in the case of streaming playback of the MPD of FIG. 17. FIG. 19 is a diagram showing an example of MPD syntax in the case of indicating the ratio between the video frame rate reproduced immediately before and the video frame rate currently being reproduced.
  • FIG. 20 is a flowchart illustrating MPEG-DASH streaming playback processing in the case of streaming playback of the MPD of FIG. 19. FIG. 21 is a flowchart illustrating the maximum frame rate determination process. FIG. 22 is a flowchart illustrating the video output frame rate determination process.
  • <Bit rate switching> <Distribution of video and audio>
  • streaming delivery via the Internet is expected as a means for delivering video and music to consumers.
  • the Internet as a transmission means is unstable in transmission compared to broadcast and optical disks.
  • the maximum rate of the transmission band changes greatly depending on the user's environment.
  • a constant transmission band is not always secured, and it fluctuates with the passage of time.
  • the fluctuation of the transmission bandwidth also means that the response time to the request from the client is not constant.
  • MPEG-DASH (Moving Picture Experts Group - Dynamic Adaptive Streaming over HTTP) is a pull-type model in which a plurality of files with different data sizes are placed on the server side, and the client refers to the MPD (Media Presentation Description) to select the most suitable file.
  • Since HTTP is used rather than a special protocol, a general HTTP (HyperText Transfer Protocol) server can be used.
  • The file format is not only MPEG-TS (Moving Picture Experts Group - Transport Stream) but also the International Organization for Standardization Base Media File Format (ISOBMFF).
  • An example of data transmission using MPEG-DASH is shown in FIG. 1.
  • the file generation device 2 generates video data and audio data as moving image content, encodes the data, and converts it into a file format for transmission. For example, the file generation device 2 files (segments) these data every 10 seconds or so.
  • the file generation device 2 uploads the generated segment file to the Web server 3.
  • the file generation device 2 generates an MPD file (management file) for managing moving image content, and uploads it to the Web server 3.
  • the Web server 3 as a DASH server distributes the file of the moving image content generated by the file generation device 2 live to the reproduction terminal 5 via the Internet 4 in a method conforming to MPEG-DASH.
  • the web server 3 stores the segment file and the MPD file uploaded from the file generation device 2. Further, in response to a request from the reproduction terminal 5, the Web server 3 transmits the stored segment file and MPD file to the reproduction terminal 5.
  • The reproduction terminal 5 executes streaming data control software (hereinafter also referred to as control software) 6, moving image reproduction software 7, and client software for HTTP access (hereinafter referred to as access software) 8.
  • The control software 6 is software that controls the data to be streamed from the Web server 3. For example, the control software 6 acquires the MPD file from the Web server 3. Also, based on, for example, the reproduction time information indicating the reproduction time specified by the MPD file or designated by the moving image reproduction software 7, and the network bandwidth of the Internet 4, the control software 6 commands the access software 8 to request transmission of the segment file to be reproduced.
  • the video reproduction software 7 is software that reproduces the encoded stream acquired from the Web server 3 via the Internet 4.
  • The moving image reproduction software 7 designates reproduction time information to the control software 6.
  • the video reproduction software 7 decodes the encoded stream supplied from the access software 8.
  • the video reproduction software 7 outputs video data and audio data obtained as a result of decoding.
  • the access software 8 is software that controls communication with the Web server 3 using HTTP. For example, the access software 8 supplies the moving image reproduction software 7 with a notification of reception start. Further, the access software 8 transmits a transmission request for the encoded stream of the segment file to be reproduced to the Web server 3 in response to an instruction of the control software 6. Furthermore, the access software 8 receives the segment file of the bit rate according to the communication environment and the like transmitted from the Web server 3 in response to the transmission request. Then, the access software 8 extracts the encoded stream from the received file, and supplies it to the moving image reproduction software 7.
  • the MPD has, for example, a configuration as shown in FIG.
  • To reproduce content, the client (the reproduction terminal 5 in the example of FIG. 1) selects the optimum Representation from the attributes of the Representations included in the Period of the MPD (Media Presentation in FIG. 2).
  • The client reads the top segment (Segment) of the selected Representation, acquires and processes the Initialization Segment, and then acquires and reproduces the subsequent segments.
  • The relationship among Period, Representation, and Segment in the MPD is as shown in FIG. 3. That is, one piece of media content can be managed for each Period, which is a data unit in the time direction, and each Period can be managed for each Segment, which is a data unit in the time direction. Also, for each Period, a plurality of Representations with different attributes, such as bit rate, can be configured.
  • Therefore, the file of this MPD (also referred to as an MPD file) has a hierarchical structure below Period, as shown in FIG. 4. Further, when the structure of this MPD is arranged on the time axis, it becomes as shown in the example of FIG. 5. As is clear from the example of FIG. 5, a plurality of Representations exist for the same Segment. By adaptively selecting one of these, the client can acquire and reproduce appropriate stream data according to the communication environment, its own decoding capability, and the like.
  • <Frame rate notification UI (User Interface)>
  • the end user recognizes the frame rate in the following manner.
  • In video playback devices such as Blu-ray Disc (registered trademark) players, the frame rate of the video is directly displayed as a numerical value. Even if the video stops, the end user indirectly recognizes that the video has a low frame rate because the audio continues, or indirectly recognizes it from the progress of the playback position (temporal position) display.
  • the display of the UI is controlled to notify the end user that the moving image frame rate has been intentionally changed.
  • the UI can also be referred to as identification information indicating that the frame rate of a moving image is intentionally changed, that is, a frame rate different from the frame rate reproduced immediately before, and hereinafter, it is also referred to as identification information.
  • The UI may be an OSD (On Screen Display) of the playback device, but in MPEG-DASH the combination of specific frame rate values is not known in advance to the playback device, so it is desirable that the creator of the moving image provides the UI as an image file included in the graphics assets.
  • the UI also has the effect of making the presence and distribution of the low frame rate portion and the high frame rate portion clearer with respect to the entire content.
  • bit rate adjustment by frame rate adjustment may be implemented as one implementation method of bit rate adaptation.
  • For the content itself, video of each frame rate is prepared for all sections, and the distribution of the low frame rate portions and the high frame rate portions over the whole content is determined dynamically at the time of streaming reproduction. That is, moving images of a plurality of (two or more) different frame rates are distributed from the server.
  • In the case of streaming delivery, as described above, the playback device often provides its own UI for menu graphics. In addition, even when menu graphics and the like are distributed and animation graphics are included, they are usually aligned with the frame rate of the video.
  • If animation graphics are displayed at the frame rate of the video, they may be degraded. For example, as shown in FIG. 6, when the video output is lowered to 30 fps in accordance with the 30 fps of the video, the frame rate of the animation is halved. Therefore, in the case of a blinking animation, it may continue to be displayed as shown in FIG. 7, or may not be displayed at all as shown in FIG. 8.
  • Therefore, in the present technology, an announcement is made with an animation icon using graphics whose frame rate is higher than that of the video.
  • Thereby, the end user can at least easily see that it is not a malfunction of the playback device. For example, when the video frame rate is low, an animation icon moving at high speed, such as 2x or 4x the video, can be prepared.
  • a frame rate of graphics is defined in metadata (for example, MPD).
  • the playback device can set the frame rate setting of the video output to the frame rate of graphics.
  • a URL of a graphics asset and a SupplementalProperty element capable of describing a graphics frame rate are added.
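  • As a concrete sketch of such a description and of how a client might read it, the fragment and parsing code below are illustrative only: the schemeIdUri values follow the urn:mpeg:dash:icon_frame_rate*:2016 schemes used in the MPD examples of FIGS. 15 to 19, while the file names, frame rate values, and function names are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical AdaptationSet fragment: SupplementalProperty elements carry the
# graphics (icon) frame rate and the URLs of the icon image files, following
# the urn:mpeg:dash:icon_frame_rate*:2016 schemes described in this disclosure.
ADAPTATION_SET = """\
<AdaptationSet xmlns="urn:mpeg:dash:schema:mpd:2011" mimeType="video/mp4">
  <SupplementalProperty schemeIdUri="urn:mpeg:dash:icon_frame_rate:2016" value="60"/>
  <SupplementalProperty schemeIdUri="urn:mpeg:dash:icon_frame_rate_up_file:2016" value="up.png"/>
  <SupplementalProperty schemeIdUri="urn:mpeg:dash:icon_frame_rate_down_file:2016" value="down.png"/>
  <Representation id="V1_2" frameRate="30" bandwidth="2000000"/>
  <Representation id="V1_3" frameRate="60" bandwidth="4000000"/>
</AdaptationSet>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def read_supplemental_properties(adaptation_set_xml: str) -> dict:
    """Collect schemeIdUri -> value for every SupplementalProperty element."""
    root = ET.fromstring(adaptation_set_xml)
    return {sp.get("schemeIdUri"): sp.get("value")
            for sp in root.findall("dash:SupplementalProperty", NS)}

props = read_supplemental_properties(ADAPTATION_SET)
graphics_fps = int(props["urn:mpeg:dash:icon_frame_rate:2016"])     # 60 in this sketch
icon_up = props.get("urn:mpeg:dash:icon_frame_rate_up_file:2016")   # "up.png"
icon_down = props.get("urn:mpeg:dash:icon_frame_rate_down_file:2016")
print(graphics_fps, icon_up, icon_down)
```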
  • the frame rate of the video output may be reduced in accordance with the frame rate of the video.
  • the frame rate of the graphics superimposed on the video is also reduced concomitantly.
  • Here, however, the video frame rate has been lowered in order to lower the bit rate in response to a network bandwidth problem when downloading segments. Since there is no problem with the video output frame rate of the playback device itself, there is no need for the playback device to follow the decrease in the video frame rate by lowering the video output frame rate. This can be signaled by metadata to which the present technology is applied.
  • the introduction of metadata to which the present technology is applied can avoid the triggering of an unintended graphics display state. At least, as shown in FIG. 9, it is effective when the graphics update rate (animation rate etc.) exceeds the video frame rate.
  • video decoders have a decoding delay caused by reordering B and P frames of MPEG video coding.
  • The decoding delay is derived from the MPEG video coding system and arises in principle. For low frame rate video, the number of video frames per segment may fall below this decoding delay.
  • The playback device can grasp the specific delay amount. Therefore, in the present technology, the video decoder can be controlled according to the set value with reference to the frame rate attribute of the AdaptationSet or Representation element of the MPD file.
  • Many video decoders have an instruction defined to forcibly flush the video frames stored in the decoder, and the stored video frames can be obtained by issuing this instruction to the video decoder.
  • For example, when the decoder has a decoding delay of 3 video frames, the first video frame is not output until the fourth video segment has been input to the decoder.
  • the number of video frames in the video segment to be reproduced can be calculated from the value of the frame rate attribute of the Representation element and the time length of the video segment. If the calculated number of video frames is equal to or smaller than the delay amount of the decoder, a situation may occur in which video frames are not output at the start of streaming playback.
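  • As a simple check (with illustrative numbers that are not taken from the disclosure), the number of video frames contained in one segment is

```latex
N_{\text{frames}} = R_{\text{video}} \times T_{\text{segment}}, \qquad
\text{and no frame of the first segment is output while } N_{\text{frames}} \le D_{\text{decoder}} .
```

  For example, with a 2-second segment at 0.5 fps, N_frames = 1; with a decoder delay of D_decoder = 3 frames, the first video frame is output only after the fourth segment has been fed to the decoder, matching the example above.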
  • FIG. 10 is a block diagram illustrating an example of a configuration of a delivery system which is an aspect of an information processing system to which the present technology is applied.
  • the delivery system 100 shown in FIG. 10 is a system for delivering data (content) such as images and sounds.
  • the file generation apparatus 101, the distribution server 102, and the reproduction terminal 103 are communicably connected to each other via the network 104.
  • The file generation apparatus 101 is an aspect of an information processing apparatus to which the present technology is applied, and is an apparatus that performs processing related to generation of an MP4 file storing image data and audio data and of an MPD (Media Presentation Description) file.
  • The file generation apparatus 101 generates image data and audio data, generates an MP4 file storing the generated image data and audio data and an MPD file for managing that MP4 file, and supplies the generated files to the distribution server 102.
  • The distribution server 102 is an aspect of an information processing apparatus to which the present technology is applied, and is a server that performs processing related to a content data distribution service using MPEG-DASH (that is, a distribution service of MP4 files using MPD files).
  • the distribution server 102 acquires and manages the MPD file and the MP4 file supplied from the file generation apparatus 101, and provides a distribution service using MPEG-DASH.
  • In response to a request from the reproduction terminal 103, the distribution server 102 provides the MPD file to the reproduction terminal 103. Further, in response to a request from the reproduction terminal 103 based on the MPD file, the distribution server 102 supplies the requested MP4 file to the reproduction terminal 103.
  • the reproduction terminal 103 is an aspect of an information processing apparatus to which the present technology is applied, and is an apparatus that performs processing related to reproduction of image data and audio data.
  • the reproduction terminal 103 requests the distribution server 102 to distribute the MP4 file in accordance with the MPEG-DASH, and acquires the supplied MP4 file in response to the request.
  • the reproduction terminal 103 acquires an MPD file from the distribution server 102, and acquires from the distribution server 102 an MP4 file storing desired content data according to the information of the MPD file.
  • the reproduction terminal 103 decodes the acquired MP4 file, and reproduces image data and audio data.
  • the network 104 is an arbitrary communication network, may be a communication network of wired communication, may be a communication network of wireless communication, or may be configured by both of them. Also, the network 104 may be configured by one communication network, or may be configured by a plurality of communication networks.
  • For example, the network 104 may include the Internet, a public telephone network, a wide area communication network for wireless mobile terminals such as so-called 3G or 4G circuits, a WAN (Wide Area Network), a LAN (Local Area Network), a wireless communication network for communication conforming to the Bluetooth (registered trademark) standard, a communication path for near field communication such as NFC (Near Field Communication), a communication path for infrared communication, a communication network of wired communication conforming to a standard such as HDMI (High-Definition Multimedia Interface) or USB (Universal Serial Bus), or a communication network or communication path of any other communication standard.
  • the file generation apparatus 101, the distribution server 102, and the reproduction terminal 103 are communicably connected to the network 104, and can exchange information with each other via the network 104.
  • The file generation apparatus 101, the distribution server 102, and the reproduction terminal 103 may each be connected to the network 104 by wired communication, by wireless communication, or by both.
  • One file generation apparatus 101, one distribution server 102, and one reproduction terminal 103 are shown as the configuration of the distribution system 100, but these numbers are arbitrary and do not have to be the same.
  • the file generation device 101, the distribution server 102, and the reproduction terminal 103 may be singular or plural.
  • FIG. 11 is a block diagram showing an example of the main configuration of the file generation apparatus 101.
  • the file generation device 101 includes a video stream generation unit 110, an image file generation unit 111, a content file generation unit 112, an MPD generation unit 113, and a communication unit 114.
  • Audio is omitted here because it is not directly related to the present technology.
  • the video stream generation unit 110 and the image file generation unit 111 perform processing related to generation of a stream of content data.
  • The video stream generation unit 110 performs A/D conversion, frame rate conversion, and encoding of an input video analog signal (also referred to as an image signal) to generate a video stream, which is a stream of video digital data (also referred to as image data), and supplies it to the content file generation unit 112.
  • the video stream generation unit 110 generates video streams of two or more different frame rates.
  • The image file generation unit 111 generates an image file (also referred to as UI data) by encoding the UI to which the present technology is applied, created by the content creator, and supplies the image file to the content file generation unit 112.
  • the content of the signal processing on the video analog signal by the video stream generation unit 110 is arbitrary.
  • the content of the signal processing for the image file by the image file generation unit 111 is arbitrary.
  • the conversion method or coding method is arbitrary.
  • the video stream generation unit 110 can generate an MPEG2 stream, an AVC stream, an HEVC stream, and the like from a video analog signal.
  • the content file generation unit 112 performs processing related to generation of a file (content file) storing the content data supplied from the video stream generation unit 110 and the image file generation unit 111.
  • the content file generation unit 112 generates an MP4 file which is a content file storing a video stream supplied as content data from the video stream generation unit 110 and an image file supplied as content data from the image file generation unit 111.
  • the specification of the content file generated by the content file generation unit 112 is arbitrary.
  • the content file generation unit 112 can generate an MP4 file storing an MPEG2 stream, an AVC stream, an HEVC stream or the like, a DSD lossless stream, an AAC stream, an LPCM stream or the like.
  • the content file generation unit 112 may generate content files other than MP4 files.
  • DSD is an abbreviation of Direct Stream Digital, which is one of high-quality audio encoding methods.
  • the MPD generation unit 113 performs processing related to generation of management information of the content file generated by the content file generation unit 112. For example, the MPD generation unit 113 generates an MPD file for the MP4 file supplied from the content file generation unit 112 and supplies the MPD file to the communication unit 114. When generating the MPD file, the MPD generation unit 113 applies the above-described present technology and sets the frame rate of graphics (image file) to the MPD.
  • the communication unit 114 performs processing related to communication with another device via the network 104. For example, the communication unit 114 supplies the supplied MPD file or MP4 file to the distribution server 102.
  • When the distribution data generation process is started, in step S101 the video stream generation unit 110 of the file generation device 101 generates a video stream from the video analog signal. At this time, the video stream generation unit 110 generates, for example, video streams of two or more different frame rates.
  • the image file generation unit 111 generates an image file in step S102.
  • In step S103, the content file generation unit 112 generates a content file (for example, an MP4 file) storing the video stream generated in step S101 and the image file generated in step S102.
  • In step S104, the MPD generation unit 113 executes the MPD file generation process described later with reference to FIG. 15, FIG. 17, or FIG. 19, and generates an MPD file that manages the content file (MP4 file) generated in step S103. That is, the URL of the image file (graphics), the frame rate of the image file, and the like are described in the MPD file.
  • In step S105, the communication unit 114 supplies (uploads) the content file generated in step S103 and the MPD file generated in step S104 to the distribution server 102.
  • When the process of step S105 ends, the distribution data generation process ends.
  • FIG. 13 is a block diagram showing a main configuration example of the playback terminal 103.
  • the playback terminal 103 has a downloader 151 which is access software, an MPD buffer 152, control software 153, moving image playback software 154, and a display 155.
  • the display 155 may be external.
  • The downloader 151 receives the URLs of the MPD file and the MP4 file from the control software 153, and requests the MPD file and the MP4 file from the distribution server 102. In response to the request, the downloader 151 receives the segment file of the bit rate corresponding to the communication environment and the like sent from the distribution server 102. The downloader 151 extracts the encoded stream from the received file and supplies it to the video reproduction software 154.
  • the MPD buffer 152 temporarily stores the MPD file acquired via the video reproduction software 154 and supplies the MPD file to the control software 153 at a predetermined timing.
  • First, the control software 153 acquires the MPD file from the distribution server 102. Also, based on, for example, the MPD file, the reproduction time information indicating the reproduction time designated by the moving image reproduction software 154, and the network bandwidth of the network 104, the control software 153 commands the downloader 151 to request transmission of the segment file to be reproduced. Also, the control software 153 supplies information on the frame rates of the video, the graphics, and the video output to the blender 168 of the video reproduction software 154, and controls the reproduction by the video reproduction software 154 and the display of the graphics (icons). Furthermore, the control software 153 controls the video decoder 163 according to the set value with reference to the frame rate attribute of the AdaptationSet or Representation element of the MPD file as necessary.
  • The video playback software 154 plays back moving images of two or more different frame rates from the distribution server 102. That is, the video reproduction software 154 designates reproduction time information to the control software 153. Also, upon receiving the notification of reception start from the downloader 151, the moving image reproduction software 154 decodes the encoded stream supplied from the downloader 151 and outputs the video data and audio data obtained as a result of decoding. In the example of FIG. 13, the description regarding audio is omitted because it is not directly related to the present technology.
  • The video playback software 154 includes a demultiplexer 161, a segment buffer 162, a video decoder 163, a video frame buffer 164, an image file buffer 165, an image decoder 166, a graphics frame buffer 167, a blender 168, and a video renderer 169.
  • When the demultiplexer 161 receives the encoded stream supplied from the downloader 151, it separates the stream into the MPD file, video segments, and image files. The demultiplexer 161 supplies the MPD file to the MPD buffer 152, the video segments to the segment buffer 162, and the image files to the image file buffer 165.
  • the segment buffer 162 temporarily stores the video segment from the demultiplexer 161 and supplies it to the video decoder 163 at a predetermined timing.
  • the video decoder 163 decodes the video segment in units of segments according to the encoding scheme at the time of encoding.
  • the video decoder 163 temporarily stores the decoded video frame in the video frame buffer 164 and supplies it to the blender 168 at a predetermined timing.
  • the image file buffer 165 temporarily stores the image file from the demultiplexer 161 and supplies it to the image decoder 166 at a predetermined timing.
  • the image decoder 166 decodes the image file in units of segments according to the encoding method at the time of encoding.
  • the image decoder 166 temporarily stores the decoded image frame in the graphics frame buffer 167 and supplies it to the blender 168 at a predetermined timing.
  • the blender 168 superimposes the video and the image based on the information on the frame rate of the video, graphics and video output from the control software 153. At that time, resolution conversion is also performed.
  • the blender 168 outputs the frame resulting from the superposition of the video and the image to the video renderer 169.
  • the video renderer 169 sends the frame after superposition to the external output. For example, the video renderer 169 outputs the frame after superposition to the display 155 for display.
  • the video frame rate may change at the timing of segment switching.
  • Next, the variable frame rate notification processing performed after the parsing of the MPD file will be described with reference to the flowchart in FIG. 14.
  • In step S151, the control software 153 sets Vfr to 0.
  • In step S152, the control software 153 selects one video Representation.
  • In step S153, the control software 153 determines whether a Representation having a frame rate different from that of the video Representation selected in step S152 is included. If it is determined in step S153 that such a Representation is included, the process proceeds to step S154.
  • In step S154, the control software 153 sets Vfr to 1.
  • If it is determined in step S153 that no such Representation is included, step S154 is skipped, and the process proceeds to step S155.
  • In step S155, the control software 153 determines whether all video Representations in the MPD have been checked. If it is determined in step S155 that not all video Representations in the MPD have been checked, the process returns to step S152, and the subsequent processes are repeated.
  • If it is determined in step S155 that all video Representations in the MPD have been checked, the process proceeds to step S156.
  • In step S156, the control software 153 determines whether the value of Vfr is 1. If it is determined in step S156 that the value of Vfr is 1, the process proceeds to step S157.
  • In step S157, the control software 153 supplies information on the frame rates of the video, the graphics, and the video output to the blender 168, and causes the display 155 to notify that variable frame rate reproduction is being performed.
  • If it is determined in step S156 that the value of Vfr is 0, the variable frame rate notification process of FIG. 14 ends.
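  • The check above can be sketched as follows (assuming the MPD has already been parsed into a list of per-Representation frame rates; the class and function names are hypothetical stand-ins, not part of the disclosure).

```python
class BlenderStub:
    """Stand-in for the blender 168; the real one overlays the notification icon."""
    def show_variable_frame_rate_notice(self):
        print("variable frame rate playback")

def notify_if_variable_frame_rate(representation_frame_rates, blender):
    """Sketch of the FIG. 14 flow: set Vfr when the video Representations in the
    MPD do not all share a single frame rate (steps S151 to S157)."""
    vfr = 0                                   # step S151
    rates = set()
    for fr in representation_frame_rates:     # steps S152/S155: scan every Representation
        rates.add(fr)
        if len(rates) > 1:                    # step S153: a Representation with a different frame rate exists
            vfr = 1                           # step S154
    if vfr == 1:                              # step S156
        blender.show_variable_frame_rate_notice()  # step S157

# With the 30 fps and 60 fps Representations of the earlier illustrative fragment:
notify_if_variable_frame_rate([30, 60], BlenderStub())
```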
  • FIG. 15 is a diagram illustrating an example of the syntax of the MPD in the case of displaying “raised” and “decreased” for a predetermined period.
  • the numbers on the left side indicate the number of lines from the top, and are added for the convenience of description.
  • Here, the frame rate of graphics is set to 60 fps (frames per second).
  • The frame rate of graphics is described in the value attribute of the SupplementalProperty element defined as urn:mpeg:dash:icon_frame_rate:2016, as shown in the sixth line.
  • the maximum frame rate of all representations (Representation) is confirmed at the start of MPD playback, and the video output is fixed to that value.
  • The schemeIdUri attribute on the seventh line is defined as urn:mpeg:dash:icon_frame_rate_up_file:2016, and its value describes the URL of the image file that contains the icon to be displayed when the frame rate is raised.
  • The schemeIdUri attribute on the eighth line is defined as urn:mpeg:dash:icon_frame_rate_down_file:2016, and its value describes the URL of the image file that contains the icon to be displayed when the frame rate is lowered.
  • The icon may be displayed only for a fixed period, or may continue to be displayed until the next frame rate change occurs.
  • In step S171, the control software 153 parses the MPD stored in the MPD buffer 152.
  • In step S172, the control software 153 performs the variable frame rate notification processing. Since the variable frame rate notification processing is the same as the processing described above with reference to FIG. 14, its detailed description is omitted.
  • In step S173, the video playback software 154 starts video segment playback. That is, the video decoder 163 decodes the video segment stored in the segment buffer 162, temporarily stores the decoded video frames in the video frame buffer 164, and supplies them to the blender 168 at a predetermined timing.
  • the blender 168 superimposes the video and the image based on the information on the frame rate of the video, graphics and video output from the control software 153. At that time, resolution conversion is also performed.
  • the blender 168 outputs the frame resulting from the superposition of the video and the image to the video renderer 169.
  • the video renderer 169 sends the frame after superposition to the external output. For example, the video renderer 169 outputs the frame after superposition to the display 155 for display.
  • In step S174, the control software 153 determines whether there is a frame rate change of the video segment. That is, in step S174, when a transition of the Representation occurs, it is determined that there is a frame rate change of the video segment; when the change is to a lower rate, the process proceeds to step S175. In step S175, the control software 153 controls the blender 168 to display an icon indicating a frame rate drop (for example, down.png on the eighth line of FIG. 15). Thereafter, the process proceeds to step S177.
  • When a transition of the Representation occurs in step S174 and the change is to a higher rate, it is likewise determined that there is a frame rate change of the video segment, and the process proceeds to step S176.
  • In step S176, the control software 153 controls the blender 168 to display an icon indicating a frame rate rise (for example, up.png on the seventh line of FIG. 15). Thereafter, the process proceeds to step S177.
  • When no transition of the Representation occurs in step S174, it is determined that there is no frame rate change of the video segment, and the process proceeds to step S177.
  • In step S177, the control software 153 determines whether all video segments have been reproduced. If it is determined in step S177 that the reproduction has not yet ended, the process returns to step S173, and the subsequent processes are repeated. On the other hand, when it is determined in step S177 that reproduction of all video segments has ended, the MPEG-DASH streaming reproduction process ends.
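  • The branch in steps S174 to S176 amounts to the following small sketch (the icon file names follow the FIG. 15 example; the function itself is an illustrative stand-in for the control exerted over the blender 168).

```python
def icon_for_rate_change(previous_fps, new_fps,
                         up_icon="up.png", down_icon="down.png"):
    """Return the icon to overlay when the Representation transition changes the
    frame rate (steps S174 to S176 of FIG. 16); None means no change."""
    if new_fps == previous_fps:
        return None                       # no Representation transition / no rate change
    return down_icon if new_fps < previous_fps else up_icon

# Illustrative transitions:
print(icon_for_rate_change(60, 30))   # down.png  (rate lowered)
print(icon_for_rate_change(30, 60))   # up.png    (rate raised)
print(icon_for_rate_change(60, 60))   # None      (no change)
```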
  • FIG. 17 is a diagram showing an example of MPD syntax in the case of indicating the ratio of the video frame rate currently being reproduced to the maximum video frame rate.
  • the numbers on the left side indicate the number of lines from the top, and are added for the convenience of description.
  • Here, the frame rate of graphics is set to 60 fps (frames per second).
  • The frame rate of graphics is described in the value attribute of the SupplementalProperty element defined as urn:mpeg:dash:icon_frame_rate:2016, as shown in the sixth line.
  • the maximum frame rate of all representations (Representation) is confirmed at the start of MPD playback, and the video output is fixed to that value.
  • By displaying a predetermined icon, it is possible to announce to the end user that the playback device is operating at the intended frame rate.
  • The URL of the image file in which the icon is stored is described in the value attribute of the SupplementalProperty element whose schemeIdUri attribute is defined as urn:mpeg:dash:icon_frame_rate_file:2016.
  • Image files for icons corresponding to the respective frame rates of the three defined Representations are defined.
  • 1.png on the eighth line is an icon indicating 24 fps, which is 2/5 of the maximum frame rate of 60 fps.
  • 2.png is an icon indicating 30 fps, which is half of the maximum frame rate of 60 fps.
  • 3.png is an icon indicating the maximum frame rate of 60 fps.
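  • The correspondence just described can be pictured as a small lookup keyed by the frame rate of the Representation being reproduced (a sketch; the 24/30/60 fps values and the 1.png to 3.png file names follow the FIG. 17 example, everything else is illustrative).

```python
from fractions import Fraction

MAX_FPS = 60  # fixed at the start of MPD playback from the maximum over all Representations

# Icon image file per Representation frame rate, as in the FIG. 17 example.
RATIO_ICONS = {24: "1.png",   # 24/60 = 2/5 of the maximum frame rate
               30: "2.png",   # 30/60 = 1/2 of the maximum frame rate
               60: "3.png"}   # the maximum frame rate itself

def ratio_icon(current_fps):
    """Pick the icon showing the ratio of the current rate to the maximum rate."""
    return RATIO_ICONS[current_fps], Fraction(current_fps, MAX_FPS)

print(ratio_icon(24))   # ('1.png', Fraction(2, 5))
print(ratio_icon(30))   # ('2.png', Fraction(1, 2))
```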
  • Instead of an icon, the numerical value of the frame rate may be displayed directly, or a fraction such as "30/60" may be displayed. The value may be displayed like an analog meter or a digital meter, in shades of color, or by the density of lines like the contour lines of a map. An icon in which an object rotates may be used, with the difference in rotational speed indicating the difference in frame rate, or an icon in which an object moves, with the difference in movement speed indicating the difference in frame rate.
  • In step S191, the control software 153 parses the MPD stored in the MPD buffer 152.
  • In step S192, the control software 153 performs the variable frame rate notification processing. Since the variable frame rate notification processing is the same as the processing described above with reference to FIG. 14, its detailed description is omitted.
  • In step S193, the video playback software 154 starts video segment playback. That is, the video decoder 163 decodes the video segment stored in the segment buffer 162, temporarily stores the decoded video frames in the video frame buffer 164, and supplies them to the blender 168 at a predetermined timing.
  • the blender 168 superimposes the video and the image based on the information on the frame rate of the video, graphics and video output from the control software 153. At that time, resolution conversion is also performed.
  • the blender 168 outputs the frame resulting from the superposition of the video and the image to the video renderer 169.
  • the video renderer 169 sends the frame after superposition to the external output. For example, the video renderer 169 outputs the frame after superposition to the display 155 for display.
  • In step S194, the control software 153 determines whether there is a frame rate change of the video segment. That is, in step S194, when a transition of the Representation occurs, it is determined that there is a frame rate change of the video segment, and the process proceeds to step S195.
  • In step S195, the control software 153 controls the blender 168 to display an icon indicating the ratio of the video frame rate to the maximum video frame rate (for example, 1.png, 2.png, or 3.png in FIG. 17). Thereafter, the process proceeds to step S196.
  • When no transition of the Representation occurs in step S194, it is determined that there is no frame rate change of the video segment, and the process proceeds to step S196.
  • In step S196, the control software 153 determines whether all video segments have been reproduced. If it is determined in step S196 that the reproduction has not yet ended, the process returns to step S193, and the subsequent processes are repeated. On the other hand, if it is determined in step S196 that reproduction of all video segments has ended, the MPEG-DASH streaming reproduction process ends.
  • FIG. 19 is a diagram showing an example of MPD syntax in the case of indicating the ratio between the video frame rate reproduced immediately before and the video frame rate currently being reproduced.
  • the numbers on the left side indicate the number of lines from the top, and are added for the convenience of description.
  • Here, the frame rate of graphics is set to 60 fps (frames per second).
  • The frame rate of graphics is described in the value attribute of the SupplementalProperty element defined as urn:mpeg:dash:icon_frame_rate:2016, as shown in the sixth line.
  • the maximum frame rate of all representations (Representation) is confirmed at the start of MPD playback, and the video output is fixed to that value.
  • The Representation with Representation_id "V1_2" on the seventh line has a frame rate value different from the graphics frame rate, but it is set so that, by repeatedly displaying the same video frame several times, output at the same rate as the graphics frame rate is possible.
  • When the Representation with Representation_id "V1_3" on the 12th line is selected, the video is rendered at 60 fps, so the graphics are also rendered at 60 fps, and the final video output is also set to 60 fps.
  • By displaying a predetermined icon, it is possible to announce to the end user that the playback device is operating at the intended frame rate.
  • The URL of the image file in which the icon is stored is described in the value attribute of the SupplementalProperty element whose schemeIdUri attribute is defined as urn:mpeg:dash:icon_frame_rate_file:2016.
  • Instead of an icon, the numerical value of the frame rate may be displayed directly, or a fraction such as "30/60" may be displayed. The value may be displayed like an analog meter or a digital meter, in shades of color, or by the density of lines like the contour lines of a map. An icon in which an object rotates may be used, with the difference in rotational speed indicating the difference in frame rate, or an icon in which an object moves, with the difference in movement speed indicating the difference in frame rate.
  • In step S201, the control software 153 parses the MPD stored in the MPD buffer 152.
  • In step S202, the control software 153 performs the variable frame rate notification processing. Since the variable frame rate notification processing is the same as the processing described above with reference to FIG. 14, its detailed description is omitted.
  • In step S203, the video playback software 154 starts video segment playback. That is, the video decoder 163 decodes the video segment stored in the segment buffer 162, temporarily stores the decoded video frames in the video frame buffer 164, and supplies them to the blender 168 at a predetermined timing.
  • the blender 168 superimposes the video and the image based on the information on the frame rate of the video, graphics and video output from the control software 153. At that time, resolution conversion is also performed.
  • the blender 168 outputs the frame resulting from the superposition of the video and the image to the video renderer 169.
  • the video renderer 169 sends the frame after superposition to the external output. For example, the video renderer 169 outputs the frame after superposition to the display 155 for display.
  • In step S204, the control software 153 determines whether there is a frame rate change of the video segment. That is, in step S204, when a transition of the Representation occurs, it is determined that there is a frame rate change of the video segment, and the process proceeds to step S205.
  • In step S205, the control software 153 controls the blender 168 to display an icon indicating the ratio of the video frame rate to the video frame rate of the video segment reproduced immediately before (for example, V1_3toV1_2.png on the ninth line or V1_2toV1_3.png on the 14th line of FIG. 19; a sketch of this selection follows the flowchart). Thereafter, the process proceeds to step S206.
  • When no transition of the Representation occurs in step S204, it is determined that there is no frame rate change of the video segment, and the process proceeds to step S206.
  • In step S206, the control software 153 determines whether all video segments have been reproduced. If it is determined in step S206 that the reproduction has not yet ended, the process returns to step S203, and the subsequent processes are repeated. On the other hand, when it is determined in step S206 that reproduction of all video segments has ended, the MPEG-DASH streaming reproduction process ends.
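  • The selection in step S205 can be sketched as a lookup keyed by the pair of Representations before and after the transition (the V1_2/V1_3 identifiers and file names follow the FIG. 19 example; the function is an illustrative stand-in).

```python
# Icon shown when switching between Representations, keyed by
# (previous Representation id, current Representation id), per the FIG. 19 example.
TRANSITION_ICONS = {
    ("V1_3", "V1_2"): "V1_3toV1_2.png",   # 60 fps -> 30 fps
    ("V1_2", "V1_3"): "V1_2toV1_3.png",   # 30 fps -> 60 fps
}

def transition_icon(previous_rep_id, current_rep_id):
    """Return the icon for a Representation transition (step S205 of FIG. 20),
    or None if the Representation is unchanged or no icon is defined."""
    if previous_rep_id == current_rep_id:
        return None
    return TRANSITION_ICONS.get((previous_rep_id, current_rep_id))

print(transition_icon("V1_3", "V1_2"))   # V1_3toV1_2.png
print(transition_icon("V1_2", "V1_2"))   # None
```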
  • In step S222, the control software 153 selects one video Representation.
  • In step S223, the control software 153 reads the frame rate attribute value (fr) of the video Representation selected in step S222.
  • If it is determined in step S224 that fr_max is equal to or greater than fr, the process proceeds to step S226.
  • In step S226, the control software 153 determines whether all video Representations in the MPD have been checked. If it is determined in step S226 that all video Representations in the MPD have been checked, the maximum frame rate determination process of FIG. 21 ends.
  • If it is determined in step S226 that not all video Representations in the MPD have been checked, the process returns to step S222, and the subsequent processes are repeated.
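  • Put as code, the loop above reduces to the following sketch (the initialization of fr_max and its update, presumably steps S221 and S225, are not spelled out in the text above and are assumed here).

```python
def determine_max_frame_rate(representation_frame_rates):
    """Sketch of the maximum frame rate determination of FIG. 21: fr_max is the
    largest frame rate attribute value among all video Representations in the MPD."""
    fr_max = 0                               # assumed initialization (step S221)
    for fr in representation_frame_rates:    # steps S222/S223: select each Representation, read fr
        if fr > fr_max:                      # step S224
            fr_max = fr                      # assumed update (step S225)
    return fr_max                            # reached once all Representations are checked (step S226)

print(determine_max_frame_rate([24, 30, 60]))   # 60
```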
  • In step S241, the control software 153 performs the maximum frame rate (fr_max) determination process.
  • the maximum frame rate (fr_max) determining process is basically the same as the determining process described above with reference to FIG. 21, and thus the description thereof is omitted.
  • In step S242, the control software 153 determines whether the AdaptationSet has a definition of icon_frame_rate. If it is determined in step S242 that the AdaptationSet has a definition of icon_frame_rate, the process proceeds to step S243. In step S243, the control software 153 determines whether fr_max is smaller than icon_frame_rate.
  • If it is determined in step S243 that fr_max is smaller than icon_frame_rate, the process proceeds to step S244.
  • In step S244, the control software 153 sets the frame rate of the video output to icon_frame_rate, and ends the video output frame rate determination process.
  • If it is determined in step S242 that the AdaptationSet has no definition of icon_frame_rate, or if it is determined in step S243 that fr_max is equal to or greater than icon_frame_rate, the process proceeds to step S245.
  • In step S245, the frame rate of the video output is set to fr_max, and the video output frame rate determination process ends.
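  • Combined with the fr_max determination above, the decision in steps S242 to S245 reduces to the following sketch (icon_frame_rate stands for the graphics frame rate defined in the AdaptationSet, or None when no definition exists).

```python
def determine_video_output_frame_rate(fr_max, icon_frame_rate=None):
    """Sketch of the video output frame rate determination (steps S241 to S245):
    the output runs at the graphics (icon) frame rate when it exceeds the maximum
    video frame rate, and at fr_max otherwise."""
    if icon_frame_rate is not None and fr_max < icon_frame_rate:   # steps S242/S243
        return icon_frame_rate                                     # step S244
    return fr_max                                                  # step S245

print(determine_video_output_frame_rate(30, icon_frame_rate=60))   # 60
print(determine_video_output_frame_rate(60, icon_frame_rate=60))   # 60 (fr_max)
print(determine_video_output_frame_rate(30))                       # 30
```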
  • the frame rate of the video output may be the same as the frame rate of the video, or may be set by the end user.
  • the end user When the end user can select, it is generally fixed to the highest frame rate that can be received by a television receiver or a monitor for a personal computer such as "60p fixed output".
  • For example, suppose the video frame rate is changed from 60 fps to 30 fps.
  • The graphics may also be rendered at 30 fps, with the final output changed to 30 fps.
  • The graphics may continue to be rendered at 60 fps, with the final output changed to 30 fps.
  • The graphics may continue to be rendered at 60 fps, with the final output maintained at 60 fps.
  • In the above, an example was shown in which the frame rate attribute is described in the Representation and referred to there; however, as described above, it may instead be described in the AdaptationSet and referred to there.
  • FIG. 23 is a view showing an example of displaying an icon of a frame rate value.
  • For example, when the frame rate is decreased from 30 fps to 15 fps, as shown in A of FIG. 23, a fraction may be written together with an icon indicating a drop.
  • FIG. 24 is a view showing another icon display example of the frame rate value.
  • Notation with a digital meter, as shown in FIG. 24, may also be used.
  • A of FIG. 24 shows a digital meter representing 24 fps, and B of FIG. 24 shows a digital meter representing 15 fps.
  • FIG. 25 is a view showing still another exemplary icon display of the frame rate value. As shown in the example of FIG. 25, the notation may be performed as an analog meter.
  • FIG. 26 is a view showing an example of an animation icon notifying of a frame rate change.
  • A of FIG. 26 illustrates an example of an icon sequence in which the display (1, 2, ..., 6, 1, 2, ..., 6, ...) is repeated at 30 fps.
  • B of FIG. 26 illustrates an example of an icon sequence in which the display (1, 2, 3, 1, 2, 3, ...) is repeated at 15 fps.
  • In this manner, an icon indicating that the frame rate is different from the frame rate reproduced immediately before is displayed.
  • the identification information to be displayed may not necessarily use the one described in the MPD, and may use the one prepared in the playback device.
  • According to the present technology, it is possible to notify the end user that a frame drop is intentional, and not a problem on the playback device side, for example, when a Representation with a reduced frame rate is selected in order to reduce the bit rate.
  • the series of processes described above can be performed by hardware or software.
  • a program that configures the software is installed on a computer.
  • The computer includes a computer incorporated in dedicated hardware, and a general-purpose personal computer, for example, that can execute various functions by installing various programs.
  • FIG. 27 is a block diagram showing an example of a hardware configuration of a computer that executes the series of processes described above according to a program.
  • A central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are mutually connected via a bus 1004.
  • An input / output interface 1010 is also connected to the bus 1004.
  • An input unit 1011, an output unit 1012, a storage unit 1013, a communication unit 1014, and a drive 1015 are connected to the input / output interface 1010.
  • the input unit 1011 includes, for example, a keyboard, a mouse, a microphone, a touch panel, an input terminal, and the like.
  • the output unit 1012 includes, for example, a display, a speaker, and an output terminal.
  • the storage unit 1013 includes, for example, a hard disk, a RAM disk, and a non-volatile memory.
  • the communication unit 1014 includes, for example, a network interface.
  • the drive 1015 drives removable media 1021 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • The CPU 1001 loads a program stored in the storage unit 1013 into the RAM 1003 via the input / output interface 1010 and the bus 1004 and executes it, whereby the series of processes described above is performed.
  • the RAM 1003 also stores data necessary for the CPU 1001 to execute various processes.
  • The program executed by the computer (CPU 1001) can be provided by, for example, being recorded on the removable medium 1021 as a package medium or the like.
  • the program can be installed in the storage unit 1013 via the input / output interface 1010 by attaching the removable media 1021 to the drive 1015.
  • the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be received by the communication unit 1014 and installed in the storage unit 1013.
  • this program may be installed in advance in the ROM 1002 or the storage unit 1013.
  • Various kinds of information relating to encoded data may be multiplexed with the encoded data and then transmitted or recorded, or may be transmitted or recorded as separate data associated with the encoded data without being multiplexed with it.
  • Here, "association" means, for example, that one piece of data can be used (linked) when the other piece of data is processed. That is, pieces of data associated with each other may be combined into a single piece of data or may remain separate pieces of data.
  • the information associated with the coded data (image) may be transmitted on a transmission path different from that of the coded data (image).
  • information associated with encoded data (image) may be recorded on a recording medium (or another recording area of the same recording medium) different from the encoded data (image).
  • The association may apply to a part of the data rather than the entire data.
  • an image and information corresponding to the image may be associated with each other in an arbitrary unit such as a plurality of frames, one frame, or a part in a frame.
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules is housed in one housing, are both systems.
  • the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
  • the configuration described as a plurality of devices (or processing units) in the above may be collectively configured as one device (or processing unit).
  • configurations other than those described above may be added to the configuration of each device (or each processing unit).
  • A part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit), as long as the configuration and operation of the entire system remain substantially the same.
  • the present technology can have a cloud computing configuration in which one function is shared and processed by a plurality of devices via a network.
  • the program described above can be executed on any device.
  • the device may have necessary functions (functional blocks and the like) so that necessary information can be obtained.
  • each step described in the above-described flowchart can be executed by one device or in a shared manner by a plurality of devices.
  • the plurality of processes included in one step can be executed by being shared by a plurality of devices in addition to being executed by one device.
  • The processes of the steps describing the program may be executed in chronological order in the order described in this specification, or may be executed in parallel or individually at necessary timing, for example when a call is made. Furthermore, the processes of these steps may be executed in parallel with the processes of another program, or may be executed in combination with the processes of another program.
  • a reproduction unit that reproduces a moving image of two or more different frame rates
  • a display control unit that, when a moving image having a frame rate different from the frame rate reproduced immediately before is received by the reproduction unit, displays identification information indicating that the frame rate is different from the frame rate reproduced immediately before.
  • the display control unit displays the identification information at a frame rate different from a video frame rate.
  • the identification information indicates that the frame rate is lower than the frame rate reproduced immediately before.
  • The information processing apparatus, wherein the identification information indicates a ratio to the maximum frame rate.
  • The information processing apparatus, wherein the identification information indicates a ratio to the frame rate reproduced immediately before.
  • the information processing apparatus according to any one of (1) to (6), wherein the moving images of two or more different frame rates are transmitted from the content distribution side.
  • the information processing apparatus according to any one of (1) to (6), wherein the identification information is transmitted from a content distribution side.
  • the information processing apparatus according to (8), wherein the identification information is described and transmitted in an adaptation set (AdaptationSet) or a representation (Representation) of the MPD.
  • An information processing method in which the information processing apparatus: plays back moving images of two or more different frame rates; and, when a moving image having a frame rate different from the frame rate reproduced immediately before is received, displays identification information indicating that the frame rate is different from the frame rate reproduced immediately before.
  • a moving image generation unit that generates moving images of two or more different frame rates
  • an identification information generation unit that generates identification information indicating that the frame rate is different from the frame rate reproduced immediately before, to be displayed when a moving image having a frame rate different from the frame rate reproduced immediately before is received on the reproduction side;
  • An information processing apparatus comprising: a transmission unit that transmits the moving image generated by the moving image generation unit and the identification information generated by the identification information generation unit.
  • the information processing apparatus further including: an MPD generation unit that generates an MPD by describing identification information generated by the identification information generation unit.
  • The MPD generation unit describes, in the MPD, the frame rate of the identification information generated by the identification information generation unit, and generates the MPD.
  • The MPD generation unit describes the identification information generated by the identification information generation unit in an adaptation set (AdaptationSet) of the MPD.
  • the MPD generation unit describes identification information generated by the identification information generation unit in a representation of the MPD.
  • An information processing method in which the information processing apparatus: generates moving images of two or more different frame rates; and generates identification information indicating that the frame rate is different from the frame rate reproduced immediately before, to be displayed when a moving image having a frame rate different from the frame rate reproduced immediately before is received on the reproduction side.
  • DESCRIPTION OF SYMBOLS 100 distribution system, 101 file generation apparatus, 102 distribution server, 103 reproduction terminal, 104 network, 110 video stream generation unit, 111 image file generation unit, 112 content file generation unit, 113 MPD generation unit, 114 communication unit, 151 downloader, 152 MPD buffer, 153 control software, 154 video playback software, 155 display, 161 demultiplexer, 162 segment buffer, 163 video decoder, 164 video frame buffer, 165 image file buffer, 166 image decoder, 167 graphics frame buffer, 168 blender, 169 video renderer

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An information processing device and method capable of giving notification of a frame rate change. Video playback software plays back a moving image, received from a distribution server, that has two or more different frame rates. Control software supplies, to a blender of the video playback software, information on the video, graphics (an image file) serving as identification information indicating that the frame rate differs from the one reproduced immediately before, and the frame rate for video output, and controls the playback performed by the video playback software and the display of the graphics on a display. The present technology can be applied to a distribution system that distributes content such as images and sound.
PCT/JP2018/001314 2017-01-31 2018-01-18 Dispositif et procédé de traitement d'informations WO2018142945A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-015364 2017-01-31
JP2017015364 2017-01-31

Publications (1)

Publication Number Publication Date
WO2018142945A1 true WO2018142945A1 (fr) 2018-08-09

Family

ID=63039648

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/001314 WO2018142945A1 (fr) 2017-01-31 2018-01-18 Dispositif et procédé de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2018142945A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114339411A (zh) * 2021-12-30 2022-04-12 西安紫光展锐科技有限公司 视频处理方法、装置及设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006324941A (ja) * 2005-05-19 2006-11-30 Sony Corp 画像処理装置および方法、並びにプログラム
WO2015053352A1 (fr) * 2013-10-10 2015-04-16 シャープ株式会社 Dispositif d'affichage et procede de traitement super-resolution
JP2016007015A (ja) * 2011-01-07 2016-01-14 シャープ株式会社 再生装置、再生装置の制御方法、生成装置、生成装置の制御方法、制御プログラム、及び該プログラムを記録した記録媒体

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006324941A (ja) * 2005-05-19 2006-11-30 Sony Corp 画像処理装置および方法、並びにプログラム
JP2016007015A (ja) * 2011-01-07 2016-01-14 シャープ株式会社 再生装置、再生装置の制御方法、生成装置、生成装置の制御方法、制御プログラム、及び該プログラムを記録した記録媒体
WO2015053352A1 (fr) * 2013-10-10 2015-04-16 シャープ株式会社 Dispositif d'affichage et procede de traitement super-resolution

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114339411A (zh) * 2021-12-30 2022-04-12 西安紫光展锐科技有限公司 视频处理方法、装置及设备
CN114339411B (zh) * 2021-12-30 2023-12-26 西安紫光展锐科技有限公司 视频处理方法、装置及设备

Similar Documents

Publication Publication Date Title
JP2021061628A (ja) 情報処理装置および情報処理方法
US10623754B2 (en) Information processing device and method
JP5829626B2 (ja) 再生装置、再生装置の制御方法、生成装置、生成装置の制御方法、制御プログラム、及び該プログラムを記録した記録媒体
CN109937448B (zh) 用于在特技播放回放期间提供音频内容的系统和方法
JP6555263B2 (ja) 情報処理装置および方法
WO2017061297A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
WO2016002513A1 (fr) Dispositif et procédé de traitement d'informations
KR101978922B1 (ko) 관심 영역과 배경프레임 개별 전송을 이용한 고화질 360도 영상 스트리밍 방법
EP3005708A1 (fr) Procédés et appareils de diffusion en continu de contenu
US9516357B2 (en) Recording variable-quality content stream
US20190373213A1 (en) Information processing device and method
WO2018139284A1 (fr) Dispositif et procédé de traitement d'image, et programme
WO2022212591A1 (fr) Publicité ciblée améliorée pour diffusion vidéo en continu
US9060184B2 (en) Systems and methods for adaptive streaming with augmented video stream transitions using a media server
WO2014112186A1 (fr) Serveur de contenu et procédé de distribution de contenu
JP6258168B2 (ja) 配信装置、再生装置および配信システム
WO2018142945A1 (fr) Dispositif et procédé de traitement d'informations
WO2018142947A1 (fr) Dispositif et procédé de traitement d'informations
JP6793526B2 (ja) 動画配信システム、配信サーバ、及びプログラム
WO2014171385A1 (fr) Dispositif serveur, procédé de distribution de contenu et programme informatique
JP2012222530A (ja) 受信装置及び方法、並びにプログラム
WO2013039042A1 (fr) Dispositif de reproduction, procédé de reproduction, dispositif de distribution, système de distribution, programme de reproduction et support d'enregistrement
WO2018079293A1 (fr) Dispositif et procédé de traitement d'informations
KR101656102B1 (ko) 컨텐츠 파일 생성/제공 장치 및 방법
KR101026682B1 (ko) 포트변환 데이터 스트리밍 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18747371

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 18747371

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载