
WO2018152063A1 - Tone curve mapping for high dynamic range images - Google Patents

Tone curve mapping for high dynamic range images

Info

Publication number
WO2018152063A1
WO2018152063A1 (PCT/US2018/017830; US2018017830W)
Authority
WO
WIPO (PCT)
Prior art keywords
output, dynamic range, mid-tones, input
Prior art date
Application number
PCT/US2018/017830
Other languages
English (en)
Inventor
Jaclyn Anne Pytlarz
Robin Atkins
Original Assignee
Dolby Laboratories Licensing Corporation
Priority date
Filing date
Publication date
Application filed by Dolby Laboratories Licensing Corporation filed Critical Dolby Laboratories Licensing Corporation
Priority to JP2019543902A (JP6738972B2)
Priority to US16/486,392 (US10600166B2)
Priority to BR112019016825A (BR112019016825A2)
Priority to EP18706171.8A (EP3559901B1)
Priority to PL18706171T (PL3559901T3)
Priority to RU2019124451A (RU2713869C1)
Priority to ES18706171T (ES2817852T3)
Priority to KR1020197023862A (KR102122165B1)
Priority to CN201880011955.3A (CN110337667B)
Priority to DK18706171.8T (DK3559901T3)
Publication of WO2018152063A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/90: Dynamic range modification of images or parts thereof
    • G06T5/92: Dynamic range modification of images or parts thereof based on global image properties
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G06T2207/10016: Video; Image sequence
    • G06T2207/20: Special algorithmic details
    • G06T2207/20172: Image enhancement details
    • G06T2207/20208: High dynamic range [HDR] image processing

Definitions

  • the present invention relates generally to images. More particularly, an embodiment of the present invention relates to determining parameters for a tone curve for mapping high-dynamic range (HDR) images and video signals from a first dynamic range to a second dynamic range.
  • the term 'dynamic range' may relate to a capability of the human visual system (HVS) to perceive a range of intensity (e.g., luminance, luma) in an image, e.g., from darkest grays (blacks) to brightest whites (highlights).
  • DR relates to a 'scene-referred' intensity.
  • DR may also relate to the ability of a display device to adequately or approximately render an intensity range of a particular breadth. In this sense, DR relates to a 'display-referred' intensity.
  • the term may be used in either sense, e.g. interchangeably.
  • high dynamic range relates to a DR breadth that spans the some 14-15 orders of magnitude of the human visual system (HVS).
  • EDR: enhanced dynamic range
  • VDR: visual dynamic range
  • images where n ≤ 8 (e.g., color 24-bit JPEG images) may be considered images of standard dynamic range.
  • images where n > 8 may be considered images of enhanced dynamic range.
  • EDR and HDR images may also be stored and distributed using high-precision (e.g., 16-bit) floating-point formats, such as the OpenEXR file format developed by Industrial Light and Magic.
  • Metadata relates to any auxiliary information that is transmitted as part of the coded bitstream and assists a decoder to render a decoded image.
  • metadata may include, but are not limited to, color space or gamut information, reference display parameters, and auxiliary signal parameters, as those described herein.
  • LDR: lower dynamic range
  • SDR: standard dynamic range
  • HDR content may be color graded and displayed on HDR displays that support higher dynamic ranges (e.g., from 1,000 nits to 5,000 nits or more).
  • the methods of the present disclosure relate to any dynamic range higher than SDR.
  • High Dynamic Range (HDR) content authoring is now becoming widespread as this technology offers more realistic and lifelike images than earlier formats.
  • many display systems, including hundreds of millions of consumer television displays, are not capable of reproducing HDR images.
  • HDR content optimized on one HDR display may not be suitable for direct playback on another HDR display.
  • One approach being used to serve the overall market is to create multiple versions of new video content; say, one using HDR images, and another using SDR (standard dynamic range) images.
  • a potentially better approach might be to create one version of content (in HDR), and use an image data transformation system (e.g., as part of a set-top box functionality) to automatically down-convert (or up-convert) the input HDR content to the appropriate output SDR or HDR content.
  • improved techniques for determining a tone curve for display mapping of HDR images are developed.
  • FIG. 1 depicts an example process for a video delivery pipeline
  • FIG. 2 depicts an example mapping function for mapping images from a first dynamic range to a second dynamic range using three anchor points according to prior art
  • FIG. 3 depicts an example mapping function for mapping images from a first dynamic range to a second dynamic range according to an embodiment of the present invention
  • FIG. 4A and FIG. 4B depict example processes for determining the tone-mapping function according to embodiments of the present invention
  • FIG. 5 depicts two example mapping functions determined according to embodiments of the present invention.
  • FIG. 6A and FIG. 6B depict example plots showing how input mid-tones values are mapped to output mid-tones values to determine the mid-tones anchor point of a tone-curve mapping function according to embodiments of the present invention.
  • Example embodiments described herein relate to methods for determining a tone curve for display mapping of high dynamic range (HDR) images.
  • a processor receives first information data comprising an input black point level (x1, SMin), an input mid-tones level (x2, SMid), and an input white point level (x3, SMax) in a first dynamic range.
  • the processor also accesses second information data for an output image in a second dynamic range comprising a first output black point level (TminPQ) and a first output white point level (TmaxPQ) in the second dynamic range.
  • the processor determines an output mid-tones value in the second dynamic range based on the first and second information data.
  • the processor computes a second output black point and a second output white point in the second dynamic range based on the second information data and the output mid-tones value. Then, it computes a tail slope, a head slope, and a mid-tones slope based on the first information data, the second information data, and the output mid-tones value.
  • the processor determines a transfer function to map pixel values of the input image in the first dynamic range to corresponding pixel values of the output image in the second dynamic range, wherein the transfer function comprises two segments, wherein the first segment is determined based on the tail slope, the mid-tones slope, the input black point level, the input mid-tones level, the second output black point, and the output mid-tones value, and the second segment is determined based on the mid-tones slope, the head slope, the input mid-tones level, the input white point level, the output mid-tones value, and the second output white point.
  • the processor maps the input image to the output image using the determined transfer function.
  • the processor computes the second output black point and the second output white point based on the first information data, the second information data and the output mid-tones value.
  • the processor computes a tail slope based on the input black point level, the input mid-tones level, the second output black point, and the output mid-tones value.
  • In an embodiment, the processor computes a head slope based on the input white point level, the input mid-tones level, the second output white point, and the output mid-tones value.
  • the processor computes a mid-tones slope based on the first information data, the second output black point, the second output white point and the output mid-tones value.
  • FIG. 1 depicts an example process of a conventional video delivery pipeline (100) showing various stages from video capture to video content display.
  • a sequence of video frames (102) is captured or generated using image generation block (105).
  • Video frames (102) may be digitally captured (e.g. by a digital camera) or generated by a computer (e.g. using computer animation) to provide video data (107).
  • video frames (102) may be captured on film by a film camera. The film is converted to a digital format to provide video data (107).
  • In a production phase (110), video data (107) is edited to provide a video production stream (112).
  • Block (115) post-production editing may include adjusting or modifying colors or brightness in particular areas of an image to enhance the image quality or achieve a particular appearance for the image in accordance with the video creator's creative intent. This is sometimes called "color timing" or "color grading.”
  • Other editing (e.g., scene selection and sequencing, image cropping, addition of computer-generated visual special effects, etc.) may also be performed as part of post-production editing (115).
  • video images are viewed on a reference display (125).
  • video data of final production may be delivered to encoding block (120) for delivering downstream to decoding and playback devices such as television sets, set-top boxes, movie theaters, and the like.
  • coding block (120) may include audio and video encoders, such as those defined by ATSC, DVB, DVD, Blu-Ray, and other delivery formats, to generate coded bit stream (122).
  • the coded bit stream (122) is decoded by decoding unit (130) to generate a decoded signal (132) representing an identical or close approximation of signal (117).
  • the receiver may be attached to a target display (140) which may have completely different characteristics than the reference display (125).
  • a display management block (135) may be used to map the dynamic range of decoded signal (132) to the characteristics of the target display (140) by generating display- mapped signal (137).
  • As depicted in FIG. 2, mapping function (220) is defined by three anchor points (205, 210, 215): a black point (x1, y1), a mid-tones point (x2, y2), and a white point (x3, y3).
  • This transformation has the following properties:
  • the values from x1 to x3 represent the range of possible values that describe the pixels that make up the input image. These values may be the brightness levels for a particular color primary (R, G, or B), or may be the overall luminance levels of the pixel (say, the Y component in a YCbCr representation). Typically, these values correspond to the darkest (black point) and brightest (white point) levels supported by the display used to create the content (the "mastering display" (125)); however, in some embodiments, when the characteristics of the reference display are unknown, these values may represent minimum and maximum values in the input image (e.g., in R, G, or B, luminance, and the like), which can be either received via image metadata or computed in the receiver.
  • the values from y1 to y3 represent the range of possible values that describe the pixels that make up the output image (again, either color primary brightness or overall luminance levels). Typically, these values correspond to the darkest (black point) and brightest (white point) levels supported by the intended output display (the "target display" (140)).
  • Any input pixel with the value x1 is constrained to map to an output pixel with the value y1, and any input pixel with the value x3 is constrained to map to an output pixel with the value y3.
  • x2 and y2 represent a specific mapping from input to output that is used as an anchor point for some "midrange” (or mid-tones level) element of the image. Considerable latitude is permitted in the specific choice of x2 and y2, and the '480 Patent teaches a number of alternatives for how these parameters may be chosen.
  • C1, C2, and C3 are constants
  • x denotes the input value for a color channel or luminance
  • y denotes the corresponding output value
  • n is a free parameter that can be used to adjust mid-tone contrast.
  • the C1, C2, and C3 values may be determined by solving a system of three equations defined by the three anchor points.
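  • For illustration only, the sketch below assumes the prior-art parametric curve has the sigmoid-like form y = (C1 + C2*x^n)/(1 + C3*x^n); under that assumption, the three constants follow from a 3x3 linear system built from the three anchor points. The function names (solve_sigmoid_constants, sigmoid_map) are hypothetical, and variants with an additional outer exponent are omitted; this is a sketch, not the '480 Patent's exact procedure.

        import numpy as np

        def solve_sigmoid_constants(anchors, n=1.0):
            # anchors: [(x1, y1), (x2, y2), (x3, y3)] = black, mid-tones, white points.
            # Rearranging y*(1 + C3*x**n) = C1 + C2*x**n gives, per anchor point,
            # one linear equation in (C1, C2, C3): C1 + C2*x**n - C3*y*x**n = y.
            A = [[1.0, x**n, -y * (x**n)] for x, y in anchors]
            b = [y for _, y in anchors]
            c1, c2, c3 = np.linalg.solve(np.array(A), np.array(b))
            return c1, c2, c3

        def sigmoid_map(x, c1, c2, c3, n=1.0):
            # Evaluate the assumed prior-art curve at input value x.
            xn = x**n
            return (c1 + c2 * xn) / (1.0 + c3 * xn)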
  • FIG. 3 depicts an example of a new tone-mapping curve according to an embodiment.
  • tone mapping curve (320) comprises up to four segments: an optional linear segment L1 for values lower than (x1, y1) (x1 can be negative as well); a spline segment S1 between (x1, y1) and (x2, y2); a spline segment S2 between (x2, y2) and (x3, y3); and an optional linear segment L2 for values larger than (x3, y3).
  • segments SI and S2 are determined using a third-order (cubic) Hermite polynomial.
  • mappings using linear segments LI and L2 may be particularly useful when one computes the inverse of the tone curve, for example, during inverse tone mapping, say, when one needs to generate a high dynamic range image from a standard dynamic range image.
  • curve (320) is controlled by three anchor points (305, 310, 315): a black point (x1, y1), a mid-tones value point (x2, y2), and a white point (x3, y3).
  • each of the spline segments is further constrained by the slopes at its two end-points; thus, the full curve is controlled by three anchor points and three slopes: the tail slope at (x1, y1), the mid-tones slope at (x2, y2), and the head slope at (x3, y3).
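  • As a point of reference, a third-order segment constrained by its two end-points and end-point slopes can be evaluated with the standard cubic Hermite basis, as in the sketch below (illustrative only; the coefficients of the patent's equation (7) may be arranged differently, and hermite_segment is a hypothetical name).

        def hermite_segment(x, x0, y0, m0, x1, y1, m1):
            # Cubic Hermite interpolation on [x0, x1] with end-point values y0, y1
            # and end-point slopes m0, m1 (expressed as dy/dx).
            dx = x1 - x0
            t = (x - x0) / dx
            h00 = 2*t**3 - 3*t**2 + 1      # basis weight of y0
            h10 = t**3 - 2*t**2 + t        # basis weight of m0 (scaled by dx)
            h01 = -2*t**3 + 3*t**2         # basis weight of y1
            h11 = t**3 - t**2              # basis weight of m1 (scaled by dx)
            return h00*y0 + h10*dx*m0 + h01*y1 + h11*dx*m1

    With this form, segment S1 would span (SMin, TMin) to (SMid, TMid) with slopes slopeMin and slopeMid, and segment S2 would span (SMid, TMid) to (SMax, TMax) with slopes slopeMid and slopeMax.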
  • the complete curve may be defined based on the following parameters:
  • SMid (x2) denotes the average (e.g., arithmetic, median, geometric) luminance of the source content. In some embodiments, it may simply denote an "important" luminance feature in the input picture. In some other embodiments it may also denote the average or median of a selected region (say, a face).
  • These data may be received using image or source metadata, they may be computed by a display management unit (e.g., 135), or they may be based on known assumptions about the mastering or reference display environment. In addition, the following data are assumed to be known for the target display (e.g., received by reading the display's Extended Display Identification Data (EDID)):
  • TminPQ: the minimum luminance of the target display;
  • TmaxPQ: the maximum luminance of the target display;
  • slopeMin: the slope at (x1, y1);
  • slopeMid: the slope at (x2, y2);
  • slopeMax: the slope at (x3, y3).
  • FIG. 4A depicts an example process for determining the anchor points and the slopes of the tone-mapping curve according to an embodiment.
  • Given the input parameters (405) SMin, SMid, SMax, TminPQ, and TmaxPQ, step 412 computes y2 (TMid).
  • TMid: the mid-tones anchor point
  • TMid may be selected to be within a fixed distance (e.g., between TminPQ+c and TmaxPQ-d, where c and d are fixed values), or a distance computed as a percentage of these values.
  • step 425 computes the remaining unknown parameters (e.g., y1, y3, and the three slopes).
  • step 430 computes the two spline segments and the two linear segments.
  • computing TMid may be further refined by attempting to preserve the original ratios between the input "tail-contrast” (that is, the contrast between the blacks (SMin) and the mid-tones (SMid)) and the input "head contrast” (that is, the contrast between the mid-tones (SMid) and the highlights (SMax)).
  • In an embodiment (see Table 1), with TDR = TmaxPQ - TminPQ denoting the target dynamic range, and with tailroom = (SMid - SMin)/TDR denoting the source tail contrast (the SMin to SMid range; headroom is defined analogously for the SMid to SMax range), the head offset and the target anchor points may be computed as:
      offsetScalar = -0.5*(cos(pi*min(1, max(0, (midLoc - 0.5)/0.5))) - 1);
      offsetHead = max(0, headroom*preservationHead + midLoc - 1)*TDR*offsetScalar;
      TMid = SMid - offsetHead + offsetTail;
      TMax = min(TMid + SMax - SMid, TmaxPQ);
      TMin = max(TMid - SMid + SMin, TminPQ);
      maxMaxSlope = 3*(TMax - TMid)/(SMax - SMid).
  • In an embodiment, contrastFactor = 1; however, its value may be adjusted to further change the output signal contrast based on user preferences (e.g., using the "contrast" control on a TV set).
  • the purpose of this function is to provide a smooth transition between the maximum possible offsets (e.g., headroom*preservationHead or tailroom*preservationTail) and the corresponding head or tail slopes.
  • the s() function in equation (5) can be any suitable linear or non-linear function of midLoc, e.g., a function of the clipped value min(1, max(0, ...)).
  • slopeMid is computed to be as close as possible to the diagonal of a 1-to-1 mapping.
  • the four segments may be generated as follows:
  • step (410) computes a first target mid-tones value and associated target headroom and tailroom values based on the input parameters (e.g., see step 2 in Table 1).
  • step (415) computes head and tail offsets (e.g., see step 4 in Table 1).
  • Step (420) applies the head and tail offsets to the input SMid value to compute the target TMid value (e.g., see step 5 in Table 1).
  • step (425) computes TMax and TMin based on the input parameters and TMid (e.g., see step 6 in Table 1).
  • step (425) also computes the two end-slopes and the mid-slope for the two splines (e.g., see step 8 in Table 1) while trying to preserve monotonicity. Given the input parameters, TMid, TMin, TMax, and the three slopes, one can then apply equation (7) in step (430) to compute the tone-mapping curve.
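  • The derivation above can be collected into a single sketch, shown below. It reuses the expressions quoted from Table 1; the function name derive_anchor_points and the definitions of midLoc, headroom, and offsetTail are assumptions added for completeness (flagged in the comments), and the default preservation values of 0.5 follow the approximately 50% contrast preservation mentioned in EEE 13 below.

        import math

        def derive_anchor_points(SMin, SMid, SMax, TminPQ, TmaxPQ,
                                 preservationHead=0.5, preservationTail=0.5):
            TDR = TmaxPQ - TminPQ                          # target dynamic range
            midLoc = (SMid - TminPQ) / TDR                 # assumed: preliminary output mid-tones location
            tailroom = (SMid - SMin) / TDR                 # source tail contrast
            headroom = (SMax - SMid) / TDR                 # assumed: source head contrast
            # Raised-cosine weight: smooth transition between the maximum possible
            # offsets and the corresponding head or tail slopes.
            offsetScalar = -0.5 * (math.cos(math.pi * min(1.0, max(0.0, (midLoc - 0.5) / 0.5))) - 1.0)
            offsetHead = max(0.0, headroom * preservationHead + midLoc - 1.0) * TDR * offsetScalar
            # Assumed tail counterpart, mirroring the offsetHead expression above.
            offsetTail = max(0.0, tailroom * preservationTail - midLoc) * TDR * (1.0 - offsetScalar)
            TMid = SMid - offsetHead + offsetTail
            TMax = min(TMid + SMax - SMid, TmaxPQ)
            TMin = max(TMid - SMid + SMin, TminPQ)
            maxMaxSlope = 3 * (TMax - TMid) / (SMax - SMid)  # monotonicity bound on the head slope
            return TMin, TMid, TMax, maxMaxSlope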
  • FIG. 5 depicts two different examples of the proposed tone curve, mapping PQ-coded data from a first dynamic range to a second, lower, dynamic range. Note that the same curve may also be used for an inverse mapping, say, for translating the data from the lower dynamic range back to the original high dynamic range.
  • the inverse mapping may be computed as follows:
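  • One way to realize such an inverse, consistent with the forward/backward look-up-table approach described in EEE 18 further below, is sketched here. Because the tone curve is monotonically increasing, swapping the roles of its sampled inputs and outputs yields the backward mapping (build_backward_lut is a hypothetical name; this is an illustrative sketch, not the patent's exact equations).

        import numpy as np

        def build_backward_lut(forward_curve, SMin, SMax, num_samples=1024):
            # Sample the forward tone curve on a dense grid of input values.
            x = np.linspace(SMin, SMax, num_samples)
            y = np.array([forward_curve(v) for v in x])
            # For a monotonically increasing curve, interpolating x as a function
            # of y gives the backward (inverse) mapping.
            return lambda t: np.interp(t, y, x)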
  • FIG. 6A depicts an example plot showing how input mid-tones values (SMid) are mapped to output mid-tones values (TMid) to determine (e.g., using steps 1 to 4 in Table 1) the mid-tones anchor point of the tone-curve mapping function according to an embodiment.
  • In FIG. 6A, one observes that within a small range of SMid values function (605) is not monotonic, which may yield unexpected results, say, during fade-ins or fade-outs.
  • offsetHead = min(max(0, (midLoc - cutoff)), ...)*TDR
  • offsetTail = min(max(0, (cutoff - midLoc)), ...)*TDR
  • cutoff is a variable that prioritizes the maintenance of highlights versus shadows by specifying the percentage of the target dynamic range where the head and tail offsets will be clipped to. For example, a cutoff value of 0.5 gives equal prioritization to highlights and shadows, whereas a cutoff value of 0.7 gives higher prioritization to highlights than to shadows.
  • the proposed tone curve exhibits multiple advantages, including:
  • Image contrast can be explicitly controlled via slopeMid and the contrastFactor variable (see, Table 1, step 8).
  • This curve can map negative input values (e.g., x < 0). Such a condition may occur due to noise or other system conditions (e.g., spatial filtering). One typically clamps negative values to 0; however, such clamping may end up increasing the noise and reducing overall quality. By being able to map negative input values, such clipping-based artifacts may be reduced or eliminated (see the sketch after this list).
  • the tone-mapping curve can easily be inverted by simply switching the x-to-y mapping to a y-to-x mapping;
  • the computations in Table 1 never require any non-integer power functions (computing powers of 2 or 4 requires simple multiplications).
  • equation (7) requires a few more multiply/divide and add/subtract operations per input sample, but requires no power functions.
  • equation (7) may be implemented using a table look up. More importantly, user studies performed by the inventors seem to indicate that users prefer the overall picture quality of images generated using the new tone mapping function (320).
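  • Putting the pieces together, a minimal sketch of the four-segment mapping referenced in the list above (linear L1, splines S1 and S2, linear L2) follows; apply_tone_curve is a hypothetical name and the spline form is the standard cubic Hermite basis rather than a verbatim restatement of equation (7). Inputs below SMin, including negative values, follow the L1 extension rather than being clamped, and the piecewise-monotonic form can be inverted by exchanging the roles of x and y.

        def apply_tone_curve(x, SMin, SMid, SMax, TMin, TMid, TMax,
                             slopeMin, slopeMid, slopeMax):
            def hermite(x, x0, y0, m0, x1, y1, m1):
                # Cubic Hermite segment through (x0, y0) and (x1, y1) with slopes m0, m1.
                dx = x1 - x0
                t = (x - x0) / dx
                return ((2*t**3 - 3*t**2 + 1)*y0 + (t**3 - 2*t**2 + t)*dx*m0
                        + (-2*t**3 + 3*t**2)*y1 + (t**3 - t**2)*dx*m1)
            if x < SMin:
                return TMin + slopeMin * (x - SMin)      # linear segment L1 (covers x < 0)
            if x <= SMid:
                return hermite(x, SMin, TMin, slopeMin, SMid, TMid, slopeMid)   # spline S1
            if x <= SMax:
                return hermite(x, SMid, TMid, slopeMid, SMax, TMax, slopeMax)   # spline S2
            return TMax + slopeMax * (x - SMax)          # linear segment L2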
  • Embodiments of the present invention may be implemented with a computer system, systems configured in electronic circuitry and components, an integrated circuit (IC) device such as a microcontroller, a field programmable gate array (FPGA), or another configurable or programmable logic device (PLD), a discrete time or digital signal processor (DSP), an application specific IC (ASIC), and/or apparatus that includes one or more of such systems, devices or components.
  • IC integrated circuit
  • FPGA field programmable gate array
  • PLD configurable or programmable logic device
  • DSP discrete time or digital signal processor
  • ASIC application specific IC
  • the computer and/or IC may perform, control, or execute instructions related to image transformations for images with high dynamic range, such as those described herein.
  • the computer and/or IC may compute any of a variety of parameters or values that relate to image transformation processes described herein.
  • the image and video embodiments may be implemented in hardware, software, firmware and various combinations thereof.
  • Certain implementations of the invention comprise computer processors which execute software instructions which cause the processors to perform a method of the invention.
  • processors in a display, an encoder, a set top box, a transcoder or the like may implement methods related to image transformations for HDR images as described above by executing software instructions in a program memory accessible to the processors.
  • the invention may also be provided in the form of a program product.
  • the program product may comprise any non-transitory medium which carries a set of computer-readable signals comprising instructions which, when executed by a data processor, cause the data processor to execute a method of the invention.
  • Program products according to the invention may be in any of a wide variety of forms.
  • the program product may comprise, for example, physical media such as magnetic data storage media including floppy diskettes, hard disk drives, optical data storage media including CD ROMs, DVDs, electronic data storage media including ROMs, flash RAM, or the like.
  • the computer-readable signals on the program product may optionally be compressed or encrypted.
  • Where a component (e.g., a software module, processor, assembly, device, circuit, etc.) is referred to above, unless otherwise indicated, reference to that component should be interpreted as including as equivalents of that component any component which performs the function of the described component (e.g., that is functionally equivalent), including components which are not structurally equivalent to the disclosed structure which performs the function in the illustrated example embodiments of the invention.
  • EEE 1 A method to map using a processor an image from a first dynamic range to a second dynamic range, the method comprising:
  • receiving first information data for an input image in the first dynamic range, comprising an input black point level (x1, SMin), an input mid-tones level (x2, SMid), and an input white point level (x3, SMax) in the first dynamic range;
  • the transfer function comprises two segments, wherein the first segment is determined based on the tail slope, the mid-tones slope, the input black point, the input mid-tones level, the second output black point, and the output mid-tones value, and the second segment is determined based on the mid-tones slope, the head slope, the input mid-tones level, the input white point, the output mid-tones level, and the second output white point; and
  • mapping the input image to the output image using the determined transfer function.
  • EEE 2 The method of EEE 1, wherein the first dynamic range comprises a high dynamic range and the second dynamic range comprises a standard dynamic range.
  • EEE 3 The method of EEE 1 or EEE 2, wherein the transfer function further comprises a first linear segment for input values lower than the input black point in the first dynamic range, wherein the linear segment has a slope equal to the tail slope.
  • EEE 4 The method of EEE 3, wherein the transfer function for the first linear segment comprises y = slopeMin*(x - SMin) + TMin, where
  • x denotes an input pixel value
  • y denotes an output pixel value
  • slopeMin denotes the tail slope
  • TMin denotes the second output black point in the second dynamic range
  • SMin denotes the input black point in the first dynamic range
  • EEE 5 The method of any of the EEEs 1-4, wherein the transfer function further comprises a second linear segment for input values larger than the input white point in the first dynamic range, wherein the linear segment has a slope equal to the head slope.
  • EEE 6 The method of EEE 5, wherein the transfer function for the second linear segment comprises y = slopeMax*(x - SMax) + TMax, where
  • x denotes an input pixel value
  • y denotes an output pixel value
  • slopeMax denotes the head slope
  • TMax denotes the second output white point in the second dynamic range
  • SMax denotes the input white point in the first dynamic range.
  • EEE 7 The method of any of the EEEs 1-6, wherein the first segment and/or the second segment are determined based on a third-order Hermite spline polynomial.
  • T = (x - SMin)/(SMid - SMin)
  • x denotes an input pixel value
  • y denotes an output pixel value
  • slopeMin denotes the tail slope
  • slopeMid denotes the mid-tones slope
  • TMin and TMid denote the second output black point and the output mid-tones level
  • SMin and SMid denote the input black point and the input mid-tones level in the first dynamic range.
  • EEE 10 The method of any of the EEEs 1-9, wherein computing the output mid-tones value comprises computing:
  • TMid = TminPQ + a*TminPQ, or
  • TMid = TmaxPQ - b*TmaxPQ, wherein
  • TMid denotes the output mid-tones value
  • a and b are percentile values in [0, 1]
  • SMid denotes the input mid-tones level
  • TminPQ and TmaxPQ comprise values in the second information data.
  • EEE 11 The method of any of the EEEs 1-9, wherein computing the output mid-tones value further comprises:
  • EEE 12 The method of EEE 11, wherein determining the preliminary output mid-tones value and the first and second bounding values for the first output mid-tones value, comprises computing
  • tailroom = (SMid - SMin)/TDR, with
  • TDR = TmaxPQ - TminPQ, wherein
  • midLoc denotes the preliminary output mid-tones value
  • tailroom and headroom denote the first and second bounding values for the first output mid-tones value
  • SMin, SMid, and SMax denote the first information data
  • TminPQ denotes the first output black point level.
  • EEE 13 The method of EEE 11 or EEE 12, wherein at least one contrast preservation value is approximately 50%.
  • EEE 14 The method of any of the EEEs 11-13, wherein computing the output mid-tones value (TMid) comprises computing
  • TMid = SMid - offsetHead + offsetTail, wherein
  • SMid denotes the input mid-tones level in the first dynamic range
  • offsetHead and offsetTail denote the head and tail offsets.
  • EEE 15 The method of EEE 14, wherein computing the second output white point and the second output black point in the second dynamic range comprises computing
  • TMax = min(TMid + SMax - SMid, TmaxPQ),
  • TMin = max(TMid - SMid + SMin, TminPQ), wherein
  • TMax denotes the second output white point
  • SMin and SMax denote the input black point and the input white point in the first dynamic range
  • TminPQ and TmaxPQ denote the first output black point and the first output white point in the second dynamic range.
  • EEE 16 The method of EEE 10 or EEE 15, further comprising computing a maximum value of the tail slope (maxMinSlope) and/or a maximum value of the head slope (maxMaxSlope) as
  • maxMaxSlope = 3*(TMax - TMid)/(SMax - SMid).
  • contrastFactor denotes a contrast parameter
  • EEE 18 The method of any of the EEEs 1-17, further comprising determining a backward look-up table to map an image from the second dynamic range back to the first dynamic range, the method comprising: determining a forward look-up table mapping a plurality of x(i) values in the first dynamic range to y(i) values in the second dynamic range based on the determined transfer function; and
  • EEE 19 An apparatus comprising a processor and configured to perform any one of the methods recited in EEEs 1-18.
  • EEE 20 A non-transitory computer-readable storage medium having stored thereon computer-executable instructions for executing a method with one or more processors in accordance with any one of the EEEs 1-18.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

Methods for mapping an image from a first dynamic range to a second dynamic range are presented. The mapping is based on a function comprising two polynomial spline functions determined using three anchor points and three slopes. The first anchor point is determined using the black point levels of the input and the target output, the second anchor point is determined using the white point levels of the input and the target output, and the third anchor point is determined using mid-tones information data for the input and the target output. The mid-tones level of the target output is computed adaptively based on an ideal one-to-one mapping and by preserving the input contrast in the blacks and the highlights. An example tone-mapping transfer function based on third-order (cubic) Hermite splines is presented.
PCT/US2018/017830 2017-02-15 2018-02-12 Mappage de courbe de tonalité pour des images à plage dynamique élevée WO2018152063A1 (fr)

Priority Applications (10)

Application Number Priority Date Filing Date Title
JP2019543902A JP6738972B2 (ja) 2017-02-15 2018-02-12 ハイダイナミックレンジ画像のためのトーン曲線マッピング
US16/486,392 US10600166B2 (en) 2017-02-15 2018-02-12 Tone curve mapping for high dynamic range images
BR112019016825A BR112019016825A2 (pt) 2017-02-15 2018-02-12 mapeamento de curva de tons para imagens com alta faixa dinâmica
EP18706171.8A EP3559901B1 (fr) 2017-02-15 2018-02-12 Mappage de courbe tonale d'images à plage dynamique élevée
PL18706171T PL3559901T3 (pl) 2017-02-15 2018-02-12 Odwzorowywanie krzywej tonalnej dla obrazów o wysokim zakresie dynamiki
RU2019124451A RU2713869C1 (ru) 2017-02-15 2018-02-12 Преобразование тоновой кривой для изображений с расширенным динамическим диапазоном
ES18706171T ES2817852T3 (es) 2017-02-15 2018-02-12 Mapeo mediante curva de tonos para imágenes de alto rango dinámico
KR1020197023862A KR102122165B1 (ko) 2017-02-15 2018-02-12 하이 다이내믹 레인지 이미지들에 대한 톤 곡선 매핑
CN201880011955.3A CN110337667B (zh) 2017-02-15 2018-02-12 高动态范围图像的色调曲线映射
DK18706171.8T DK3559901T3 (da) 2017-02-15 2018-02-12 Tonekurveafbildning for billeder med højt dynamikområde

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201762459141P 2017-02-15 2017-02-15
EP17156284.6 2017-02-15
US62/459,141 2017-02-15
EP17156284 2017-02-15
US201762465298P 2017-03-01 2017-03-01
US62/465,298 2017-03-01

Publications (1)

Publication Number Publication Date
WO2018152063A1 true WO2018152063A1 (fr) 2018-08-23

Family

ID=58094214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/017830 WO2018152063A1 (fr) 2017-02-15 2018-02-12 Mappage de courbe de tonalité pour des images à plage dynamique élevée

Country Status (2)

Country Link
TW (1) TWI671710B (fr)
WO (1) WO2018152063A1 (fr)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020060980A1 (fr) * 2018-09-17 2020-03-26 Dolby Laboratories Licensing Corporation Mappage d'affichage pour images à grande plage dynamique sur affichages limitant la puissance
JP2020113907A (ja) * 2019-01-11 2020-07-27 日本テレビ放送網株式会社 変換装置、変換方法及びプログラム
WO2020219341A1 (fr) 2019-04-23 2020-10-29 Dolby Laboratories Licensing Corporation Gestion d'affichage pour images à grande gamme dynamique
CN112215760A (zh) * 2019-07-11 2021-01-12 华为技术有限公司 一种图像处理的方法及装置
CN112686810A (zh) * 2019-10-18 2021-04-20 华为技术有限公司 一种图像处理的方法及装置
WO2021222310A1 (fr) * 2020-04-28 2021-11-04 Dolby Laboratories Licensing Corporation Contrôle de contraste et de luminosité dépendant de l'image pour des dispositifs d'affichage hdr
WO2021223205A1 (fr) * 2020-05-08 2021-11-11 Huawei Technologies Co., Ltd. Codeur, décodeur, système et procédé de détermination de paramètres de courbe de mappage de tons
US20220301126A1 (en) * 2021-03-18 2022-09-22 Canon Kabushiki Kaisha Image processing apparatus, display apparatus, image processing method, and non-transitory computer readable medium
WO2023281264A3 (fr) * 2021-07-08 2023-02-09 British Broadcasting Corporation Procédé et appareil pour la conversion de signaux hdr
US11594159B2 (en) 2019-01-09 2023-02-28 Dolby Laboratories Licensing Corporation Display management with ambient light compensation
US11632527B2 (en) 2018-09-26 2023-04-18 Dolby Laboratories Licensing Corporation Projector light source dimming using metadata from future frames
CN116132648A (zh) * 2020-04-30 2023-05-16 华为技术有限公司 动态范围映射的方法和装置
EP4239565A4 (fr) * 2020-12-17 2024-05-08 Huawei Technologies Co., Ltd. Procédé et dispositif de reproduction de tons
GB2625218A (en) * 2021-07-08 2024-06-12 British Broadcasting Corp Method and apparatus for conversion of HDR signals
US12333694B2 (en) 2019-10-18 2025-06-17 Huawei Technologies Co., Ltd. Image processing method and apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252133A1 (en) * 2003-06-11 2004-12-16 Agfa-Gevaert Method and user interface for modifying at least one of contrast and density of pixels of a processed image
US20130155330A1 (en) * 2011-12-19 2013-06-20 Dolby Laboratories Licensing Corporation Color Grading Apparatus and Methods
US8593480B1 (en) 2011-03-15 2013-11-26 Dolby Laboratories Licensing Corporation Method and apparatus for image data transformation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252133A1 (en) * 2003-06-11 2004-12-16 Agfa-Gevaert Method and user interface for modifying at least one of contrast and density of pixels of a processed image
US8593480B1 (en) 2011-03-15 2013-11-26 Dolby Laboratories Licensing Corporation Method and apparatus for image data transformation
US20130155330A1 (en) * 2011-12-19 2013-06-20 Dolby Laboratories Licensing Corporation Color Grading Apparatus and Methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JOANNA L POWER ET AL: "Reproducing color images as duotones", COMPUTER GRAPHICS PROCEEDINGS. SIGGRAPH '96, ACM, NEW YORK, US, 1 August 1996 (1996-08-01), pages 237 - 248, XP058220100, ISBN: 978-0-89791-746-9, DOI: 10.1145/237170.237261 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020060980A1 (fr) * 2018-09-17 2020-03-26 Dolby Laboratories Licensing Corporation Mappage d'affichage pour images à grande plage dynamique sur affichages limitant la puissance
US11361699B2 (en) 2018-09-17 2022-06-14 Dolby Laboratories Licensing Corporation Display mapping for high dynamic range images on power-limiting displays
KR102287095B1 (ko) 2018-09-17 2021-08-06 돌비 레버러토리즈 라이쎈싱 코오포레이션 전력-제한 디스플레이 상에서의 하이 다이나믹 레인지 이미지를 위한 디스플레이 매핑
KR20210046832A (ko) * 2018-09-17 2021-04-28 돌비 레버러토리즈 라이쎈싱 코오포레이션 전력-제한 디스플레이 상에서의 하이 다이나믹 레인지 이미지를 위한 디스플레이 매핑
CN112703529A (zh) * 2018-09-17 2021-04-23 杜比实验室特许公司 功率限制显示器上高动态范围图像的显示映射
CN112703529B (zh) * 2018-09-17 2022-05-31 杜比实验室特许公司 功率限制显示器上高动态范围图像的显示映射
JP2022500969A (ja) * 2018-09-17 2022-01-04 ドルビー ラボラトリーズ ライセンシング コーポレイション パワー制限ディスプレイにおけるハイダイナミックレンジ画像のディスプレイマッピング
US11632527B2 (en) 2018-09-26 2023-04-18 Dolby Laboratories Licensing Corporation Projector light source dimming using metadata from future frames
US11594159B2 (en) 2019-01-09 2023-02-28 Dolby Laboratories Licensing Corporation Display management with ambient light compensation
JP2020113907A (ja) * 2019-01-11 2020-07-27 日本テレビ放送網株式会社 変換装置、変換方法及びプログラム
JP7249154B2 (ja) 2019-01-11 2023-03-30 日本テレビ放送網株式会社 変換装置、変換方法及びプログラム
US11803948B2 (en) 2019-04-23 2023-10-31 Dolby Laboratories Licensing Corporation Display management for high dynamic range images
WO2020219341A1 (fr) 2019-04-23 2020-10-29 Dolby Laboratories Licensing Corporation Gestion d'affichage pour images à grande gamme dynamique
CN112215760A (zh) * 2019-07-11 2021-01-12 华为技术有限公司 一种图像处理的方法及装置
CN112686810A (zh) * 2019-10-18 2021-04-20 华为技术有限公司 一种图像处理的方法及装置
EP4036841A4 (fr) * 2019-10-18 2022-12-07 Huawei Technologies Co., Ltd. Procédé et appareil de traitement d'image
CN112686810B (zh) * 2019-10-18 2025-04-04 华为技术有限公司 一种图像处理的方法及装置
US12333694B2 (en) 2019-10-18 2025-06-17 Huawei Technologies Co., Ltd. Image processing method and apparatus
KR20230003002A (ko) * 2020-04-28 2023-01-05 돌비 레버러토리즈 라이쎈싱 코오포레이션 Hdr 디스플레이들을 위한 이미지 의존적 콘트라스트 및 밝기 제어
US11935492B2 (en) 2020-04-28 2024-03-19 Dolby Laboratories Licensing Corporation Image-dependent contrast and brightness control for HDR displays
WO2021222310A1 (fr) * 2020-04-28 2021-11-04 Dolby Laboratories Licensing Corporation Contrôle de contraste et de luminosité dépendant de l'image pour des dispositifs d'affichage hdr
KR102799246B1 (ko) 2020-04-28 2025-04-23 돌비 레버러토리즈 라이쎈싱 코오포레이션 Hdr 디스플레이들을 위한 이미지 의존적 콘트라스트 및 밝기 제어
TWI766666B (zh) * 2020-04-28 2022-06-01 美商杜拜研究特許公司 用於高動態範圍顯示之影像相依對比度及亮度控制
EP4250279A3 (fr) * 2020-04-28 2023-11-01 Dolby Laboratories Licensing Corporation Contrôle de contraste et de luminosité dépendant de l'image pour des dispositifs d'affichage hdr
CN116132648A (zh) * 2020-04-30 2023-05-16 华为技术有限公司 动态范围映射的方法和装置
WO2021223205A1 (fr) * 2020-05-08 2021-11-11 Huawei Technologies Co., Ltd. Codeur, décodeur, système et procédé de détermination de paramètres de courbe de mappage de tons
EP4239565A4 (fr) * 2020-12-17 2024-05-08 Huawei Technologies Co., Ltd. Procédé et dispositif de reproduction de tons
US11869174B2 (en) * 2021-03-18 2024-01-09 Canon Kabushiki Kaisha Image processing apparatus, display apparatus, image processing method, and non-transitory computer readable medium
US20220301126A1 (en) * 2021-03-18 2022-09-22 Canon Kabushiki Kaisha Image processing apparatus, display apparatus, image processing method, and non-transitory computer readable medium
WO2023281264A3 (fr) * 2021-07-08 2023-02-09 British Broadcasting Corporation Procédé et appareil pour la conversion de signaux hdr
GB2625218A (en) * 2021-07-08 2024-06-12 British Broadcasting Corp Method and apparatus for conversion of HDR signals
GB2608990B (en) * 2021-07-08 2024-10-30 British Broadcasting Corp Method and apparatus for conversion of HDR signals
GB2625218B (en) * 2021-07-08 2024-11-13 British Broadcasting Corp Method and apparatus for conversion of HDR signals

Also Published As

Publication number Publication date
TW201835861A (zh) 2018-10-01
TWI671710B (zh) 2019-09-11

Similar Documents

Publication Publication Date Title
US10600166B2 (en) Tone curve mapping for high dynamic range images
WO2018152063A1 (fr) Mappage de courbe de tonalité pour des images à plage dynamique élevée
RU2762384C1 (ru) Переформирование сигналов для сигналов широкого динамического диапазона
US10580367B2 (en) Display mapping for high dynamic range images
KR102287095B1 (ko) 전력-제한 디스플레이 상에서의 하이 다이나믹 레인지 이미지를 위한 디스플레이 매핑
EP3459248A1 (fr) Remodelage de signal destiné à des images à plage dynamique élevée
WO2016172091A1 (fr) Remodelage et codage de signaux dans l'espace de couleurs ipt-pq
WO2022039930A1 (fr) Métadonnées d'image pour vidéo à haute gamme dynamique
AU2022358503B2 (en) Multi-step display mapping and metadata reconstruction for hdr video

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18706171

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018706171

Country of ref document: EP

Effective date: 20190724

ENP Entry into the national phase

Ref document number: 20197023862

Country of ref document: KR

Kind code of ref document: A

Ref document number: 2019543902

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112019016825

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112019016825

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20190813

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载