
CN113836705B - Method, device, storage medium and electronic device for processing illumination data - Google Patents

Method, device, storage medium and electronic device for processing illumination data

Info

Publication number
CN113836705B
Authority
CN
China
Prior art keywords
color
illumination
light source
target model
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111040489.1A
Other languages
Chinese (zh)
Other versions
CN113836705A
Inventor
钱静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111040489.1A
Publication of CN113836705A
Application granted
Publication of CN113836705B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 - Computer-aided design [CAD]
    • G06F 30/20 - Design optimisation, verification or simulation
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2119/00 - Details relating to the type or aim of the analysis or the optimisation
    • G06F 2119/02 - Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Image Generation (AREA)

Abstract


The present invention discloses a method, device, storage medium and electronic device for processing illumination data. The method comprises: obtaining a first position of a target model to be processed; determining a second position of a virtual light source based on the first position; simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result; adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model. Through the present invention, the technical effect of improving the processing efficiency of illumination data is achieved.

Description

Illumination data processing method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of computers, and in particular, to a method and apparatus for processing illumination data, a storage medium, and an electronic apparatus.
Background
Currently, in illumination data processing, an illumination baking (bake) process may be performed on a model to generate an illumination map (lightmap). However, this usually requires the user to have substantial lighting experience to adjust the illumination map appropriately; when a less experienced user generates the illumination map, the dark portions of the model often turn out dead black, or the model remains unlit even though the scene already contains many lights.
Therefore, since the processing effect of the illumination data is difficult to predict, the user often needs to debug repeatedly, resulting in the technical problem of low processing efficiency of the illumination data.
No effective solution has yet been proposed for this technical problem of low illumination data processing efficiency in the prior art.
Disclosure of Invention
The invention mainly aims to provide a processing method and device of illumination data, a storage medium and an electronic device, so as to at least solve the technical problem of low efficiency of processing the illumination data.
In order to achieve the above object, according to one aspect of the present invention, there is provided a method of processing illumination data. The method comprises the steps of obtaining a first position of a target model to be processed, determining a second position of a virtual light source based on the first position, simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result, and adjusting an illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
Optionally, simulating the illumination of the target model based on the virtual light source to obtain a simulation result comprises: determining a gray image based on the target model and the virtual light source, coloring the gray image to obtain a color image, and determining the color image as the simulation result.
Optionally, the target model comprises a plurality of vertices, and determining the gray image based on the target model and the virtual light source comprises: determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the plurality of first distances correspond one-to-one with the plurality of vertices, and representing the plurality of first distances as the gray image.
Optionally, the method further comprises obtaining a radius of the virtual light source, and determining the first distance based on the second location of the virtual light source and the location of each vertex comprises determining the first distance based on the second location, the location of each vertex and the radius.
Optionally, coloring the gray image to obtain a color image comprises converting a first distance into a second distance and a third distance based on a first parameter, determining a first color based on the second distance and a second color based on the third distance, and determining the color image based on the first color and the second color, wherein the first color is the color of a first area of the color image and the second color is the color of a second area of the color image.
Optionally, the method further comprises adjusting the color intensity of the color image from a first color intensity to a second color intensity, and determining the color image as a simulation result comprises determining the color image of the second color intensity as a simulation result.
Optionally, the method further comprises obtaining an original color map of the target model, and adjusting the illumination map of the target model based on the simulation result to obtain the target illumination data of the target model comprises: adjusting the original color map based on the simulation result to obtain an adjustment result, and superposing the adjustment result and the illumination map to obtain the target illumination data.
Optionally, the original color map is adjusted based on the simulation result to obtain an adjustment result, including adjusting the original color map based on the second parameter and the simulation result to obtain an adjustment result.
Optionally, superposing the adjustment result and the illumination map to obtain the target illumination data comprises: obtaining a product of a third parameter, the adjustment result, first channel data of the illumination map and second channel data of the illumination map, and determining the product as color data in the target illumination data.
Optionally, acquiring the first position of the target model to be processed comprises acquiring an axial center position of the target model and determining the axial center position as the first position.
Optionally, determining the second location of the virtual light source based on the first location includes obtaining a target offset and determining the second location based on the first location and the target offset.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a processing apparatus for illumination data. The device comprises an acquisition unit, a determination unit, a simulation unit and an adjustment unit, wherein the acquisition unit is used for acquiring a first position of a target model to be processed, the determination unit is used for determining a second position of a virtual light source based on the first position, the simulation unit is used for simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result, and the adjustment unit is used for adjusting an illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
To achieve the above object, according to another aspect of the present invention, there is provided a computer-readable storage medium. The computer readable storage medium stores a computer program, wherein the computer program is used for controlling a device where the computer readable storage medium is located to execute the method for processing illumination data according to the embodiment of the invention when the computer program is run by a processor.
In order to achieve the above object, according to another aspect of the present invention, there is provided an electronic device. The electronic device comprises a memory and a processor, wherein the memory stores a computer program and the processor is arranged to run the computer program to perform the method of processing illumination data according to the embodiments of the invention.
In the embodiment, a first position of a target model to be processed is acquired, a second position of a virtual light source is determined based on the first position, illumination of the target model is simulated based on the virtual light source at the second position to obtain a simulation result, and an illumination map of the target model is adjusted based on the simulation result to obtain target illumination data of the target model. That is, in this embodiment, by setting a virtual light source in combination with the position of the target model, the result of illumination of the target model is simulated by using the virtual light source, and then the illumination map of the target model is adjusted to obtain the illumination data of the target model, so that the effect of supplementing (brightening) the target model by the virtual light source is achieved, the condition that the dark part of the target model is dead and black is reduced, the technical problem of low processing efficiency of the illumination data is solved, and the technical effect of improving the processing efficiency of the illumination data is achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application. In the drawings:
Fig. 1 is a block diagram of a hardware structure of a mobile terminal according to a processing method of illumination data according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of processing illumination data according to an embodiment of the invention;
FIG. 3 is a schematic diagram of light scene baking according to the related art;
FIG. 4 is a schematic diagram of direct illumination only according to the related art;
FIG. 5 is a schematic diagram of direct illumination and indirect illumination according to the related art;
Fig. 6 is a schematic view of effects before baking and after baking according to the related art;
FIG. 7 is a comparative schematic illustration of illumination data processing according to an embodiment of the invention;
FIG. 8 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention;
FIG. 9 is a schematic diagram of a result of processing illumination data according to an embodiment of the present invention;
FIG. 10 is a schematic illustration of a gray image according to an embodiment of the present invention;
FIG. 11 is a schematic illustration of another gray image according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of converting a gray image into a color image in accordance with an embodiment of the invention;
FIG. 13 is a schematic comparison of a gray image and a color image according to an embodiment of the present invention;
FIG. 14 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention;
fig. 15 is a schematic diagram of an illumination data processing apparatus according to an embodiment of the present invention.
Detailed Description
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the application herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The method embodiments provided by the embodiments of the present application may be performed in a mobile terminal, a computer terminal, or similar computing device. Taking the mobile terminal as an example, fig. 1 is a block diagram of a hardware structure of the mobile terminal according to an embodiment of the present application. As shown in fig. 1, the mobile terminal may include one or more (only one is shown in fig. 1) processors 102 (the processors 102 may include, but are not limited to, a microprocessor MCU or a processing device such as a programmable logic device FPGA) and a memory 104 for storing data, and optionally, a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a method of data processing in an embodiment of the present invention, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, implement the above-mentioned method. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the mobile terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as a NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a method for processing illumination data running on the mobile terminal is provided, and fig. 2 is a flowchart of a method for processing illumination data according to an embodiment of the present invention. As shown in fig. 2, the method may include the steps of:
Step S202, a first position of a target model to be processed is acquired.
In the technical solution provided in the above step S202 of the present invention, the target model to be processed may be a scene model, for example a scene model in a game scene, which may include, but is not limited to, a landscape-type model. The embodiment may determine a first location of the target model. Alternatively, this embodiment obtains the pivot (axis center) position of the target model, which may be represented by frag.
Step S204, determining a second position of the virtual light source based on the first position.
In the solution provided in the above step S204 of the present invention, after the first position of the object model to be processed is acquired, the second position of the virtual light source may be determined based on the first position.
In this embodiment, a virtual light source may be newly added in the renderer (shader); the virtual light source may be a pseudo point light source, also called a simulated point light source or a simulated light source, used to simulate a fill-light effect on the target model. Alternatively, the embodiment may determine the second position of the virtual light source based on the first position and a position-related parameter, where the second position may be the position of the center point of the virtual light source and may be represented by (x0, y0, z0). Optionally, the position-related parameter may be manually input by the user, so that the user can manually and flexibly adjust the target model.
Alternatively, this embodiment may obtain a target offset, which is an adjustable parameter that may be represented by pivotOffset, and may determine the second position based on the first position and the target offset, for example by offsetting the first position by the target offset to obtain the second position of the virtual light source: second position of the virtual light source = first position + target offset, which may be written as new_pivot = frag + pivotOffset.
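The offset computation above can be sketched in Python (illustrative only; the patent describes shader-side logic, and the function name is an assumption, while new_pivot, frag, and pivotOffset follow the identifiers in the text):

```python
def light_source_position(pivot, pivot_offset):
    """Second position = first position + target offset, per component,
    i.e. new_pivot = frag + pivotOffset in the text's notation."""
    return tuple(p + o for p, o in zip(pivot, pivot_offset))
```

For example, a pivot at (1.0, 2.0, 3.0) offset by (0.0, 1.0, -1.0) yields (1.0, 3.0, 2.0).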
In step S206, in the second position, the illumination of the target model is simulated based on the virtual light source, so as to obtain a simulation result.
In the technical solution provided in the above step S206 of the present invention, after determining the second position of the virtual light source based on the first position, the illumination of the target model may be simulated based on the virtual light source at the second position, so as to obtain a simulation result.
In this embodiment, at the second position, the virtual light source may be adjusted, and the illumination of the target model may be simulated based on the second position of the virtual light source, the radius (Rx, Ry, Rz) of the virtual light source, and so on, to obtain a simulation result, which may be represented by color; the simulation result is thus a result of simulated illumination. Alternatively, the radius of the virtual light source may be manually input by the user as an adjustable parameter, so that the range of the virtual light source can be adjusted by adjusting its radius.
Step S208, adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
In the technical scheme provided in the step S208, after the illumination of the target model is simulated based on the virtual light source to obtain the simulation result, the illumination map of the target model can be adjusted based on the simulation result to obtain the target illumination data of the target model.
In this embodiment, the illumination map of the target model is a map generated by baking the target model, and may be implemented by a target engine, which may be a game engine. The illumination map may include color channel (RGB) data and grayscale channel (Alpha) data, and the simulation result may be superimposed with the color channel data and the grayscale channel data of the illumination map to obtain target illumination data of the target model, where the target illumination data may be local illumination data of the target model, including final color data.
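A minimal sketch of the superposition described above, assuming (as stated in the summary of the invention) that the final color is a product of a scalar parameter, the simulated/adjusted color, and the lightmap's RGB and Alpha channel data; the function name and argument layout are illustrative, not the patent's implementation:

```python
def target_illumination(adjustment, lm_rgb, lm_alpha, k=1.0):
    # Per-channel product: k * adjustment result * lightmap RGB * lightmap Alpha.
    return tuple(k * a * rgb * lm_alpha for a, rgb in zip(adjustment, lm_rgb))
```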
Optionally, the illumination map of the target model in this embodiment is a map generated after scene-baking the illumination of the target model; it records the scene illumination data of the target model as a texture and can be used to enhance the lighting atmosphere and artistic effect of the scene. The illumination map is essentially one or more maps applied to the target model, containing information such as indirect illumination and shadows obtained by pre-calculation during lightmap baking (when baking, one may choose to bake only indirect illumination without baking shadows). This embodiment can thus avoid real-time illumination and shadow calculation while the game is running, improving runtime performance, and is therefore suitable for computing platforms with weaker performance, such as mobile platforms.
The method of the embodiment is applied to scene baking, which can be light scene baking, is a processing method in a game scene, and can be used for simulating an illumination environment to generate an illumination map, so that the use of dynamic illumination is reduced in a game, and a large amount of illumination calculation is further reduced. Optionally, in gaming applications, the lighting of the lighting map is applied simultaneously with the dynamic direct lighting.
The method comprises the steps of S202 to S208, obtaining a first position of a target model to be processed, determining a second position of a virtual light source based on the first position, simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result, and adjusting an illumination map of the target model based on the simulation result to obtain target illumination data of the target model. That is, in this embodiment, by setting a virtual light source in combination with the position of the target model, the result of illumination of the target model is simulated by using the virtual light source, and then the illumination map of the target model is adjusted to obtain the illumination data of the target model, so that the effect of supplementing (brightening) the target model by the virtual light source is achieved, the condition that the dark part of the target model is dead and black is reduced, the technical problem of low processing efficiency of the illumination data is solved, and the technical effect of improving the processing efficiency of the illumination data is achieved.
The above-described method of this embodiment is further described below.
As an optional implementation manner, in step S206, simulating the illumination of the target model based on the virtual light source to obtain a simulation result comprises: determining a gray image based on the target model and the virtual light source, coloring the gray image to obtain a color image, and determining the color image as the simulation result.
In this embodiment, when simulating the illumination of the target model based on the virtual light source, a gray (grayscale) image, whose gray values may vary smoothly, may first be determined based on the target model and the virtual light source. After the gray image is determined, a coloring process may be applied to it to convert it into a color image; that is, the embodiment converts the gray image obtained from the target model and the virtual light source into a color image and then determines the simulation result for the target model based on the color image.
As an alternative implementation, the target model comprises a plurality of vertices, and determining the gray image based on the target model and the virtual light source comprises: determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the plurality of first distances correspond one-to-one with the plurality of vertices, and representing the plurality of first distances as the gray image.
In this embodiment, the target model includes a plurality of vertices, and the position of each vertex may be determined. A first distance may then be determined based on the second position of the virtual light source and the position (x1, y1, z1) of each vertex, where the first distance may be the distance of each vertex from the virtual light source, so as to obtain a plurality of first distances corresponding one-to-one to the plurality of vertices. The plurality of first distances may be represented as a gray image; that is, the gray image of this embodiment represents the plurality of first distances.
As an alternative embodiment, the method further comprises obtaining a radius of the virtual light source, and determining the first distance based on the second location of the virtual light source and the location of each vertex comprises determining the first distance based on the second location, the location of each vertex, and the radius.
In this embodiment, the radius of the virtual light source may be obtained in response to a first operation instruction acting on the graphical user interface; that is, the parameters entered by the user may include the radius of the virtual light source, which may be used to adjust the range of the virtual light source. The embodiment may then determine a first distance, denoted a0, based on the second position (x0, y0, z0) of the virtual light source, the position (x1, y1, z1) of each vertex, and the radius (Rx, Ry, Rz) of the virtual light source, thereby generating the gray image from the first distance a0 corresponding to each vertex of the target model.
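The patent text does not reproduce the exact formula for a0, so the following Python sketch assumes one plausible form: the vertex-to-light distance normalized per axis by the light radius. Both the function name and the formula itself are assumptions for illustration.

```python
import math

def vertex_gray_value(light_pos, vertex_pos, radius):
    # Assumed form of the first distance a0: per-axis difference between the
    # vertex (x1, y1, z1) and the light source (x0, y0, z0), each divided by
    # the corresponding radius component (Rx, Ry, Rz), combined as a Euclidean
    # distance. Illustrative only; not the patent's exact formula.
    return math.sqrt(sum(((v - p) / r) ** 2
                         for v, p, r in zip(vertex_pos, light_pos, radius)))
```

With this form, a vertex exactly one radius away from the light center maps to a gray value of 1.0, and closer vertices map toward 0.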
As an alternative implementation, coloring the gray image to obtain a color image comprises: converting the first distance into a second distance and a third distance based on a first parameter, determining a first color based on the second distance and a second color based on the third distance, and determining the color image based on the first color and the second color, wherein the first color is the color of a first area of the color image and the second color is the color of a second area of the color image.
In this embodiment, when the coloring process is performed on the gray image to obtain the color image, a first parameter may first be acquired; the first parameter is a parameter for controlling the distance and may be represented by Mid. The embodiment may convert the first distance a0 into a second distance A1 = saturate(a0/Mid), which corresponds to a first area of the color image; a first color may then be determined according to an interpolation function (lerp) and A1 as color = lerp(color1.rgb * colorIntensity.x, color2.rgb * colorIntensity.y, A1), where color1.rgb and color2.rgb are color parameters for determining the first color, colorIntensity.x and colorIntensity.y are color intensities for determining the first color, and the first color may include red and green. Optionally, the embodiment further converts the first distance into a third distance A2 = saturate((a0 - Mid)/(1.0 - Mid)), which may correspond to a second area of the color image; a second color may be determined according to the interpolation function (lerp) and A2 as color = lerp(color.rgb, color3.rgb * colorIntensity.z, A2), where color.rgb and color3.rgb are color parameters for determining the second color and colorIntensity.z is the color intensity for determining the second color, and the second color may include blue. The embodiment may then determine the color image based on the first color and the second color, dyeing the first area with the first color and the second area with the second color, for example obtaining a color image whose central region is red and green and whose outer region is blue.
Alternatively, the first area and the second area in this embodiment are adjacent areas in the color image, where the second area may enclose the first area; that is, the distance between a point in the first area and the center of the color image may be smaller than the distance between a point in the second area and the center of the color image. For example, the color image may be a circular (or approximately circular) color image in which the first area is the inner circle and the second area is the outer ring; the color of the inner circle may be determined by the second distance A1 = saturate(a0/Mid) corresponding to the inner circle, and the color of the outer ring by the third distance A2 = saturate((a0 - Mid)/(1.0 - Mid)) corresponding to the outer ring, thereby determining the circular (or approximately circular) color image.
It should be noted that the color image of this embodiment is obtained by coloring the gray-scale image determined from the target model and the virtual light source, and is used to represent the result of simulating illumination of the target model. The shape of the color image described here is merely an example and does not limit the embodiments of the present invention; the color image may have any other shape, such as a triangle, rectangle, square, or any other shape that can form a region. Any shape of color image that can represent the result of simulating illumination of the target model falls within the scope of this embodiment, and no particular limitation is imposed here.
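The two-region coloring described above can be sketched in Python as follows. `saturate` and `lerp` mirror the HLSL intrinsics of the same names; the function and parameter names (`shade`, `intensity`, etc.) are illustrative and not taken from the patent.

```python
def saturate(x):
    # Clamp to [0, 1], mirroring HLSL's saturate() intrinsic.
    return max(0.0, min(1.0, x))

def lerp(a, b, t):
    # Component-wise linear interpolation, mirroring HLSL's lerp().
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def shade(a0, mid, color1, color2, color3, intensity):
    """Color one point from its distance a0 to the virtual light source.

    Mid splits the image into an inner region (a0 < Mid) and an outer
    region (a0 >= Mid); intensity = (x, y, z) scales the three colors.
    """
    ix, iy, iz = intensity
    c1 = tuple(c * ix for c in color1)
    c2 = tuple(c * iy for c in color2)
    c3 = tuple(c * iz for c in color3)
    a1 = saturate(a0 / mid)                  # second distance A1
    color = lerp(c1, c2, a1)                 # first color (inner region)
    a2 = saturate((a0 - mid) / (1.0 - mid))  # third distance A2
    return lerp(color, c3, a2)               # second color (outer region)

red, green, blue = (1, 0, 0), (0, 1, 0), (0, 0, 1)
center = shade(0.0, 0.5, red, green, blue, (1.0, 1.0, 1.0))  # pure red
edge = shade(1.0, 0.5, red, green, blue, (1.0, 1.0, 1.0))    # pure blue
```

With Mid = 0.5 this reproduces the red-center, green-middle, blue-periphery gradient described for the spherical example.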
As an alternative embodiment, the method further comprises adjusting the color intensity of the color image from a first color intensity to a second color intensity, and determining the color image as a simulation result comprises determining the color image of the second color intensity as a simulation result.
In this embodiment, after the color image is determined based on the first color and the second color, the color intensity (ColorIntensity) of the color image may be adjusted from the original first color intensity to the second color intensity. Optionally, the embodiment multiplies the color data of the color image by a coefficient to adjust the color intensity: for example, if the color data of the color image is (0.5, 0) and the coefficient is 2, multiplying the color data by 2 changes it to (1, 0), and the resulting values may even exceed 1. After the color intensity of the color image has been adjusted from the first color intensity to the second color intensity, the color image of the second color intensity may be determined as the simulation result.
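The intensity adjustment is a simple per-channel multiplication; a minimal sketch (the function name is illustrative):

```python
def scale_intensity(color, k):
    # Multiply every channel by coefficient k; results may exceed 1.0,
    # which is deliberate (over-bright values are allowed here).
    return tuple(c * k for c in color)

boosted = scale_intensity((0.5, 0.5, 0.0), 2.0)  # -> (1.0, 1.0, 0.0)
```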
As an alternative embodiment, the method further comprises obtaining an original color map of the target model; adjusting the illumination map of the target model based on the simulation result to obtain the target illumination data of the target model then comprises adjusting the original color map based on the simulation result to obtain an adjustment result, and superimposing the adjustment result and the illumination map to obtain the target illumination data.
In this embodiment, an original color map of the target model is obtained; it may be referred to as a base color map (Diffuse map), i.e., the color map of the target model before baking, and may be represented by RawData.Tex0.rgb. The original color map may be adjusted based on the simulation result to obtain the adjustment result, and the adjustment result, the color channel data, and the gray channel data of the illumination map are then superimposed to obtain the target illumination data.
As an alternative implementation, adjusting the original color map based on the simulation result to obtain the adjustment result comprises adjusting the original color map based on a second parameter and the simulation result to obtain the adjustment result.
In this embodiment, the second parameter may be obtained in response to a second operation instruction acting on the graphical user interface; that is, the parameters entered by the user may include the second parameter. The second parameter is an adjustable tinting parameter that may be used to adjust the color of the virtual light source and may be denoted BaseColor. The embodiment may adjust the original color map RawData.Tex0.rgb based on the second parameter BaseColor.rgb and the simulation result Color to obtain the adjustment result, for example, Mtl.Albedo = SRGB2LinearF3(RawData.Tex0.rgb × BaseColor.rgb) × Color.
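A sketch of this albedo adjustment in Python, assuming the patent's SRGB2LinearF3 performs an sRGB-to-linear conversion on a float3; the gamma-2.2 power used here is a common approximation of that conversion, and all names are illustrative:

```python
def srgb_to_linear(color):
    # Per-channel sRGB -> linear using the simple gamma-2.2
    # approximation (assumed equivalent in spirit to SRGB2LinearF3).
    return tuple(max(0.0, c) ** 2.2 for c in color)

def adjust_albedo(tex0_rgb, base_color_rgb, sim_color):
    # Sketch of Mtl.Albedo = SRGB2LinearF3(Tex0.rgb * BaseColor.rgb) * Color:
    # tint the base color map by the adjustable light color, linearize,
    # then modulate by the simulated illumination color.
    tinted = tuple(t * b for t, b in zip(tex0_rgb, base_color_rgb))
    linear = srgb_to_linear(tinted)
    return tuple(l * s for l, s in zip(linear, sim_color))

albedo = adjust_albedo((1.0, 1.0, 1.0), (1.0, 1.0, 1.0), (0.5, 0.25, 0.0))
```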
As an optional implementation, superimposing the adjustment result and the illumination map to obtain the target illumination data comprises obtaining the product of a third parameter, the adjustment result, first channel data of the illumination map, and second channel data of the illumination map, and determining the product as color data in the target illumination data.
In this embodiment, when the adjustment result and the illumination map are superimposed to obtain the target illumination data, a third parameter may be obtained, where the third parameter may be 0.2. The first channel data and the second channel data of the illumination map may also be obtained, where the first channel data may be the color channel data (RGB) and the second channel data may be the gray channel data (Alpha). The product of the third parameter, the adjustment result, the first channel data, and the second channel data may then be computed; for example, 0.2 × adjustment result × RGB of the illumination map × Alpha of the illumination map is the final color data in the target illumination data.
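The final composition is a straight product of the four factors; a minimal sketch (names illustrative):

```python
def compose_final(third_param, adjusted, lightmap_rgb, lightmap_alpha):
    # Final color = third parameter * adjustment result
    #             * lightmap RGB * lightmap Alpha, per channel.
    return tuple(third_param * a * l * lightmap_alpha
                 for a, l in zip(adjusted, lightmap_rgb))

final = compose_final(0.2, (1.0, 1.0, 1.0), (1.0, 0.5, 1.0), 1.0)
```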
According to this embodiment, the baking shader is optimized: the second position of the virtual light source can be adjusted according to the input target offset, the range of the virtual light source is determined according to the input radius, the color of the virtual light source is determined according to the adjustable tinting parameter, and the virtual-light-source algorithm is applied to the original color map of the target model. This achieves the effect that the target model can be supplemented with light (brightened) through the virtual light source by manually inputting parameters, so that even inexperienced artists can obtain a better final effect through adjustment. The occurrence of dead-black dark parts of the target model is reduced, the technical problem of low illumination-data processing efficiency is solved, and the technical effect of improving the processing efficiency of illumination data is achieved.
The technical solution of the embodiments of the present invention is further described below by way of example with reference to a preferred implementation, specifically a game scenario in a game application.
Lighting-scene baking is a common technique in game scenes, as shown in fig. 3. Fig. 3 is a schematic diagram of baking a lighting scene according to the related art: the lighting environment is simulated to generate illumination maps (which may form a set comprising a plurality of illumination maps), so that dynamic lighting in the game is reduced and lighting calculation is greatly decreased.
Fig. 4 is a schematic view of direct illumination only according to the related art, and fig. 5 is a schematic view of direct plus indirect illumination according to the related art. As shown in figs. 4 and 5, the illumination range with both direct and indirect illumination is larger than with direct illumination only; in a game, lightmap-baked illumination and dynamic direct illumination should be applied simultaneously.
Baking illumination usually requires substantial lighting experience to adjust. A user with insufficient experience may end up with dead-black dark parts of the model after baking, or find that the scene is still not bright even though many lights have been placed in it.
Moreover, in the related art the effect before baking differs somewhat from the effect after baking, as shown in fig. 6. Fig. 6 is a schematic diagram of the effect before baking and the effect after baking according to the related art, where the scene model is a room scene model: information such as illumination and shadows of the room differs between the two, so the actual baking effect is difficult to predict, repeated baking and debugging are often required, and this is not very friendly to users with insufficient experience.
In order to improve users' working efficiency, this embodiment optimizes the baking shader: a virtual light source is added in the shader to simulate a light-supplementing effect on the model, a manual local-brightening function is added, and the model is fine-tuned while the light arrangement of the baked illumination map remains unchanged, which can well reduce the problems of dead-black dark parts and over-bright parts of the model.
FIG. 7 is a comparative schematic diagram of illumination data processing according to an embodiment of the present invention. As shown in fig. 7, the model before processing has dead-black dark parts; after the manual light-supplement parameters are added, the dark parts of the processed model are brightened.
An illumination map is essentially one or more maps applied to the scene model. They contain information such as indirect illumination and shadows obtained by pre-calculation during illumination-map baking (when baking, one may choose to bake only indirect illumination, without shadows). Illumination maps avoid real-time illumination and shadow calculation while the game is running, improve runtime performance, and can be used on computing platforms with weaker performance, for example mobile platforms.
FIG. 8 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention. As shown in fig. 8, the light-supplementing effect on the target model to be processed is simulated by adding a virtual light source in the shader, realizing a local-brightening function; the target model is fine-tuned while the light arrangement of the baked illumination map remains unchanged, which can well reduce the problems of dead-black dark parts and over-bright parts.
The above-described method of this embodiment is further described below.
According to this embodiment, an algorithm simulating a point light source can be applied to the base color map (Diffuse map) of the target model according to input parameters, and the position, color, and range of the virtual light source can be adjusted to process the illumination data of the target model. For example, a sphere may be made red at the center, green in the middle, and blue at the periphery, as shown in fig. 9. Fig. 9 is a schematic diagram of a result of processing illumination data according to an embodiment of the present invention.
This embodiment may determine the position of the virtual light source by shifting the pivot position of the target model by an offset PivotOffset, where the offset is an adjustable parameter; for example, the position of the virtual light source may be New_Pivot = Pivot + PivotOffset.
This embodiment calculates the distance between each vertex on the target model and the virtual light source. Optionally, with a vertex of the target model at (x1, y1, z1), the center point of the virtual light source at (x0, y0, z0), and the radius of the virtual light source being (Rx, Ry, Rz), the distance A0 between each vertex and the virtual light source can be computed from these quantities.
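The patent reproduces the exact A0 formula only as a figure, so the following is a hypothetical reconstruction: given the per-axis radius (Rx, Ry, Rz), a plausible reading is a distance in which each axis offset is divided by the corresponding radius, so that A0 reaches 1.0 on the boundary of the light's ellipsoid of influence.

```python
import math

def normalized_distance(vertex, center, radius):
    # Assumed form of A0: per-axis offsets from the light center,
    # each scaled by the light's radius, combined Euclidean-style.
    return math.sqrt(sum(((v - c) / r) ** 2
                         for v, c, r in zip(vertex, center, radius)))

# A vertex halfway to the light's radius along x gives A0 = 0.5.
a0 = normalized_distance((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
```

Values of A0 in [0, 1] then feed directly into the saturate/lerp coloring of the next steps.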
This embodiment may represent the distance from each vertex to the virtual light source as a gray-scale map, as shown in fig. 10. Fig. 10 is a schematic diagram of a gray-scale map according to an embodiment of the present invention, in which the values of A0 vary uniformly.
This embodiment then colors the gray-scale map from the previous step: A1 = saturate(A0/Mid) and A2 = saturate((A0 - Mid)/(1.0 - Mid)), where Mid is a parameter for controlling distance, A1 is the inner circle, and A2 is the outer circle. A1 is tinted by Color = lerp(Color1.rgb × ColorIntensity.x, Color2.rgb × ColorIntensity.y, A1), and A2 is tinted by Color = lerp(Color.rgb, Color3.rgb × ColorIntensity.z, A2), so a three-color map can be obtained, as shown in fig. 11. Fig. 11 is a schematic diagram of another gray-scale map according to an embodiment of the present invention, including the inner circle A1 and the outer circle A2.
This embodiment colors the gray-scale map from the previous step, turning it into a color map, as shown in fig. 12. Fig. 12 is a schematic diagram of converting a gray-scale map into a color map according to an embodiment of the present invention. This embodiment can also adjust the color intensity: for example, a color (0.5, 0) multiplied by 2 becomes (1, 0), and the values may even exceed 1.
Fig. 13 is a comparative schematic diagram of a gray-scale image and a color image according to an embodiment of the present invention. As shown in fig. 13, the color image may be a three-color image that is, from the inside out, red, green, and blue in sequence.
Optionally, Mtl.Albedo = SRGB2LinearF3(RawData.Tex0.rgb × BaseColor.rgb) × Color, where RawData.Tex0.rgb represents the base color map, BaseColor is an adjustable tinting parameter that can be used to adjust the color of the virtual light source, and Color is the simulated illumination result (the color image) calculated above.
In this embodiment, the above calculation result Mtl.Albedo and the lightmap may be superimposed to obtain the final color: final color = 0.2 × the above calculation result × RGB of the lightmap × Alpha of the lightmap, where the lightmap refers to the map generated after baking the illumination of the scene and may be provided by the game engine.
FIG. 14 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention. As shown in fig. 14, even inexperienced artists can, through adjustment, make the final effect of the target model look better and more controllable, and local illumination can be adjusted manually after baking: this embodiment can manually add supplementary light and fine-tune the model while the light arrangement of the baked illumination map remains unchanged, thereby correcting dead-black dark parts.
According to this embodiment, the baking shader is optimized: the position of the virtual light source can be adjusted according to the input offset, the range of the virtual light source is determined according to the input radius, the color of the virtual light source is determined according to the adjustable tinting parameter, and the virtual-light-source algorithm is applied to the original color map of the target model. This achieves the effect that the target model can be supplemented with light (brightened) through the virtual light source by manually inputting parameters, so that even inexperienced artists can obtain a better and more controllable final effect through adjustment. The occurrence of dead-black dark parts of the target model is reduced, the technical problem of low illumination-data processing efficiency is solved, and the technical effect of improving the processing efficiency of illumination data is achieved.
The embodiment of the invention also provides a device for processing the illumination data. It should be noted that, the apparatus of this embodiment may be used to perform the method for processing illumination data shown in fig. 2 according to the embodiment of the present invention.
Fig. 15 is a schematic diagram of an illumination data processing apparatus according to an embodiment of the present invention. As shown in fig. 15, the illumination data processing apparatus 150 includes an acquisition unit 151, a determination unit 152, a simulation unit 153, and an adjustment unit 154.
An acquiring unit 151, configured to acquire a first position of a target model to be processed.
A determining unit 152 for determining a second position of the virtual light source based on the first position.
And a simulation unit 153, configured to simulate, at the second position, illumination of the target model based on the virtual light source, to obtain a simulation result.
And an adjusting unit 154, configured to adjust the illumination map of the target model based on the simulation result, so as to obtain target illumination data of the target model.
In this embodiment, a virtual light source is set in combination with the position of the target model, the result of illuminating the target model is simulated using the virtual light source, and the illumination map of the target model is then adjusted to obtain the illumination data of the target model. This achieves the effect of supplementing light to (brightening) the target model through the virtual light source, reduces the occurrence of dead-black dark parts of the target model, solves the technical problem of low illumination-data processing efficiency, and achieves the technical effect of improving the processing efficiency of illumination data.
Embodiments of the present invention also provide a computer-readable storage medium. The computer readable storage medium stores a computer program, wherein the computer program is used for controlling a device where the computer readable storage medium is located to execute the method for processing illumination data according to the embodiment of the invention when the computer program is run by a processor.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to, various media capable of storing a computer program, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented in a general purpose computing device, they may be concentrated on a single computing device, or distributed across a network of computing devices, they may alternatively be implemented in program code executable by computing devices, so that they may be stored in a memory device for execution by computing devices, and in some cases, the steps shown or described may be performed in a different order than that shown or described, or they may be separately fabricated into individual integrated circuit modules, or multiple modules or steps within them may be fabricated into a single integrated circuit module for implementation. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (14)

1. A method for processing illumination data, comprising:
acquiring a first position of a target model to be processed;
determining a second location of the virtual light source based on the first location;
Simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result;
Adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model;
wherein adjusting the illumination map of the target model based on the simulation result to obtain the target illumination data comprises superimposing the simulation result and the illumination map to obtain the target illumination data, wherein the illumination map comprises color channel data and gray channel data.
2. The method of claim 1, wherein simulating the illumination of the target model based on the virtual light source results in a simulation result, comprising:
determining a gray scale image based on the target model and the virtual light source;
Coloring the gray level image to obtain a color image;
and determining the color image as the simulation result.
3. The method of claim 2, wherein the target model includes a plurality of vertices, determining a grayscale image based on the target model and the virtual light source, comprising:
Determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the plurality of first distances are in one-to-one correspondence with the plurality of vertices;
the plurality of first distances is represented as the grayscale image.
4. The method of claim 3, wherein:
The method further includes obtaining a radius of the virtual light source;
Determining a first distance based on the second location of the virtual light source and the location of each of the vertices includes determining the first distance based on the second location, the location of each of the vertices, and the radius.
5. The method of claim 4, wherein coloring the gray scale image to obtain a color image comprises:
Converting the first distance into a second distance and a third distance based on a first parameter;
determining a first color based on the second distance and a second color based on the third distance;
The color image is determined based on the first color and the second color, wherein the first color is a color of a first region of the color image and the second color is a color of a second region of the color image.
6. The method of claim 2, wherein:
The method further includes adjusting a color intensity of the color image from a first color intensity to a second color intensity;
determining the color image as the simulation result includes determining the color image of the second color intensity as the simulation result.
7. The method of claim 1, wherein:
The method further includes obtaining an original color map of the target model;
superposing the simulation result and the illumination map to obtain the target illumination data, wherein the method comprises the following steps:
Adjusting the original color map based on the simulation result to obtain an adjustment result; and superposing the adjustment result and the illumination map to obtain the target illumination data.
8. The method of claim 7, wherein adjusting the original color map based on the simulation result results to obtain an adjustment result comprises:
and adjusting the original color map based on the second parameter and the simulation result to obtain the adjustment result.
9. The method of claim 7, wherein superimposing the adjustment result and the illumination map to obtain the target illumination data comprises:
Obtaining a product among a third parameter, the adjustment result, first channel data of the illumination map and second channel data of the illumination map, wherein the first channel data is the color channel data and the second channel data is the gray channel data;
the product is determined as color data in the target illumination data.
10. The method according to any one of claims 1 to 9, wherein acquiring a first position of the target model to be processed comprises:
acquiring the axis position of the target model;
And determining the axial center position as the first position.
11. The method of any of claims 1 to 9, wherein determining a second location of a virtual light source based on the first location comprises:
A target offset is obtained and the second location is determined based on the first location and the target offset.
12. A lighting data processing apparatus, comprising:
the acquisition unit is used for acquiring a first position of the target model to be processed;
a determining unit configured to determine a second position of the virtual light source based on the first position;
The simulation unit is used for simulating the illumination of the target model based on the virtual light source at the second position to obtain a simulation result;
The adjusting unit is used for adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model;
The adjustment unit is further configured to superimpose the simulation result and the illumination map to obtain the target illumination data, where the illumination map includes color channel data and gray channel data.
13. A computer readable storage medium, characterized in that a computer program is stored in the computer readable storage medium, wherein the computer program, when run by a processor, controls a device in which the computer readable storage medium is located to perform the method according to any of the claims 1 to 11.
14. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is arranged to run the computer program to perform the method of any of claims 1 to 11.
CN202111040489.1A 2021-09-06 2021-09-06 Method, device, storage medium and electronic device for processing illumination data Active CN113836705B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111040489.1A CN113836705B (en) 2021-09-06 2021-09-06 Method, device, storage medium and electronic device for processing illumination data


Publications (2)

Publication Number Publication Date
CN113836705A CN113836705A (en) 2021-12-24
CN113836705B true CN113836705B (en) 2025-01-21

Family

ID=78962346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111040489.1A Active CN113836705B (en) 2021-09-06 2021-09-06 Method, device, storage medium and electronic device for processing illumination data

Country Status (1)

Country Link
CN (1) CN113836705B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111724313A (en) * 2020-04-30 2020-09-29 完美世界(北京)软件科技发展有限公司 Shadow map generation method and device
CN111899325A (en) * 2020-08-13 2020-11-06 网易(杭州)网络有限公司 Rendering method and device of crystal stone model, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3838981B2 (en) * 2003-01-31 2006-10-25 Necアクセステクニカ株式会社 Image processing apparatus, image processing method, and image processing program
US8525847B2 (en) * 2009-06-01 2013-09-03 Apple Inc. Enhancing images using known characteristics of image subjects
US10303973B2 (en) * 2015-04-15 2019-05-28 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium for lighting processing on image using model data
CN111862290B (en) * 2020-07-03 2021-05-11 完美世界(北京)软件科技发展有限公司 Radial fuzzy-based fluff rendering method and device and storage medium
CN112819941B (en) * 2021-03-05 2023-09-12 网易(杭州)网络有限公司 Method, apparatus, device and computer readable storage medium for rendering water surface




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载