CN101710429A - Illumination algorithm of augmented reality system based on dynamic light map - Google Patents
- Publication number
- CN101710429A CN101710429A CN200910044517A CN200910044517A CN101710429A CN 101710429 A CN101710429 A CN 101710429A CN 200910044517 A CN200910044517 A CN 200910044517A CN 200910044517 A CN200910044517 A CN 200910044517A CN 101710429 A CN101710429 A CN 101710429A
- Authority
- CN
- China
- Prior art keywords
- light
- illumination
- map
- augmented reality
- light source
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Generation (AREA)
Abstract
To achieve accurate real-time lighting in an augmented reality system, a fast illumination algorithm based on a dynamic light map is proposed. The method samples the pixel brightness of a calibration object in the video image to compute the camera's signal gain and signal bias, then uses the ambient-lighting information recorded in a dynamically updated light map to achieve consistent illumination between the virtual and real scenes, adapting to scenes whose lighting changes.
Description
Technical Field
The invention belongs to the field of computer graphics and image processing, and specifically relates to a method for enhancing the illumination of an augmented reality scene through graphics and image processing.
Background Art
Augmented Reality (AR) is a branch of virtual reality technology that fuses the real environment with virtual objects. Abroad, AR has been widely applied to medical visualization, maintenance and repair, annotation and guidance, robot path planning, entertainment, and military aircraft navigation and attack targeting. In recent years, AR has also attracted the attention of domestic researchers, and related applications keep growing.
Augmented reality must solve three key problems: 1. how to register virtual objects with the real scene so that the two combine seamlessly, which is the task of AR registration technology; 2. how to display virtual objects within the real scene so that they appear consistent with it, which is the task of stereoscopic display technology; 3. real-time performance. As AR registration technology has matured, attention has gradually shifted to enhancing the realism of AR scenes.
To make augmented reality scenes look more believable, the problem of illumination consistency between virtual objects and the real scene must be solved. To obtain a convincing lighting effect, the illumination of the real scene is analyzed first, and a lighting model for the virtual objects is then built from the analysis, so that virtual and real objects blend together. Many recent methods use auxiliary photometric equipment, such as a fisheye-lens camera, to record the illumination of the environment, determine the position of each light source from it, and then illuminate the AR scene with the known sources. Such methods measure ambient illumination accurately and achieve realistic lighting effects, but they require expensive photometric equipment that ordinary experimental setups cannot afford.
Summary of the Invention
Addressing the high hardware requirements of existing augmented reality lighting techniques, the present invention proposes an illumination algorithm based on a dynamic light map (an illumination map that records the lighting arriving from each normal direction in the real scene). The algorithm avoids HDR (High Dynamic Range) cameras: it records the environment's illumination in a light map that is updated dynamically, reaches interactive frame rates, produces correct shading and shadows, and is comparatively simple and general to implement.
The illumination algorithm of the invention is based on natural features rather than traditional artificial markers. The object used for calibration is a planar object with a clear pattern. The algorithm records ambient illumination in a dynamically updated light map and illuminates the virtual objects with virtual light sources that simulate the real ones, producing lighting consistent with the real environment. Figure 1 shows the lighting-generation process of the AR system.
1. Lighting model of the scene
The calibration pattern chosen in the invention has a diffuse surface: it reflects the same amount of light in every direction. At time t, the amount of light reflected by a point on the calibration pattern depends on the surface normal direction nt at that point. To compute the normal direction accurately, the point is usually replaced by a small facet π centred on it; the light energy received by facet π at time t is then

xπ,t = ΣL ΩL max(dL·nt, 0) (1)

where ΩL denotes the energy of light source L and dL its direction. Once the reflection coefficients of the points on the surface of the calibration pattern have been computed, the pixel brightness I(π,t) of facet π at time t can be expressed by the following formula:
I(π,t)=gρπxt+b (2)
where ρπ denotes the average reflection coefficient of the points on facet π, computed from the average brightness over π, and g and b denote the camera's signal gain (amplification) and signal bias, respectively.
2. Generating the dynamic light map
From the analysis of the lighting model above, computing the brightness of facet π requires first computing the light energy Ω. Equivalently, consider a position in space (for example, where the calibration pattern lies): the light energy it receives along each orientation (the normal direction of each point on a sphere centred at that position) depends on the normal direction n (see Figure 2), so xπ,t in formula (1) can be regarded as a function of n. A light map M(n) can therefore represent the illumination at that position. The pixel brightness of the virtual object's surface can then be expressed as:
I(v,t)=gρvMt(nv)+b (3)
where ρv denotes the reflection coefficient of the virtual object's surface, defined when the virtual object is modelled.
From formula (3), obtaining the brightness of the virtual object's surface requires the camera parameters g and b, which can be found from formula (2). Formula (2) contains three unknowns: g, b, and xt. Let g′=1/g and b′=b/g; formula (2) can then be rewritten as
-I(π,t)g′+ρπxt+b′=0 (4)
Collecting formula (4) at the different sampling times t1, …, tn yields the following system of equations:

-I(π,t1)g′+ρπxt1+b′=0
-I(π,t2)g′+ρπxt2+b′=0
……
-I(π,tn)g′+ρπxtn+b′=0 (5)

System (5) can also be written as formula (6); to simplify the expression, Itn denotes the I(π,tn) of formula (5):

-It1g′+ρπxt1+b′=0, -It2g′+ρπxt2+b′=0, …, -Itng′+ρπxtn+b′=0 (6)

Writing system (6) in matrix form gives

A·v=0 (7)

where row i of the coefficient matrix A is (-Iti, 0, …, ρπ, …, 0, 1), with ρπ placed in the column corresponding to xti, and v=(g′, xt1, …, xtn, b′)T. Solving formula (7) yields several eigenvalues and eigenvectors, and the eigenvector corresponding to the smallest eigenvalue is the required solution. The components (xt1, xt2, …, xtn) of the solution vector are the light-energy values along the individual normal directions needed to form M(n).
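As an illustration of this step, the following Python sketch (my own, not part of the patent; numpy and the name solve_camera_and_light are assumptions) stacks formula (4) over n frames and extracts the right singular vector of the coefficient matrix associated with its smallest singular value, which is equivalent to the smallest-eigenvalue eigenvector of ATA:

```python
import numpy as np

def solve_camera_and_light(I, rho):
    """Stack -I_i * g' + rho * x_i + b' = 0 over all frames and solve the
    homogeneous system A v = 0 for v = (g', x_1, ..., x_n, b') as the
    right singular vector of A with the smallest singular value."""
    n = len(I)
    A = np.zeros((n, n + 2))
    A[:, 0] = -np.asarray(I, dtype=float)     # coefficients of g'
    A[np.arange(n), 1 + np.arange(n)] = rho   # coefficients of x_i
    A[:, -1] = 1.0                            # coefficients of b'
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]  # right singular vector for the smallest singular value
    return v[0], v[1:-1], v[-1]               # g', (x_1..x_n), b'
```

Because the system is homogeneous, the solution is defined only up to scale (and, with a single patch, up to the null space of A); any returned vector satisfies every equation of system (5), and g and b then follow as g=1/g′ and b=b′/g′.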
The results above are obtained under constant illumination. When the spatial position of the calibration object or the light sources change, the illumination of the virtual object changes as well. The light map must therefore be updated promptly to track changes in ambient lighting.
Let Mt(n) be the light map at time t; the light map at time t+1 can then be expressed by the following equation:
Mt+1(n)=(1-f(n))Mt(n)+f(n)xt+1 (8)
where f(n) is a direction-dependent weighting function given by formula (9), in which s is a blur constant; xt+1 in formula (8) is the light energy received by the facet, observed at time t+1, whose normal direction is nt+1. In fact, when the illumination changes abruptly, this method cannot update the entire light map at once: in each frame only one normal direction is sampled and only the light-energy value along that direction is updated, while the values along all other normal directions remain unchanged. Therefore, once the light map has been built, it is updated on every new frame, so that it stays as up to date as possible.
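A minimal numpy sketch of the update rule in formula (8). The exact form of f(n) from formula (9) is not reproduced in this text, so a Gaussian falloff in the angle between each stored normal and the observed normal, governed by the blur constant s, is assumed here purely for illustration (function and parameter names are mine):

```python
import numpy as np

def update_light_map(M, normals, n_obs, x_obs, s=10.0):
    """Blend a new light-energy sample x_obs, observed along unit normal
    n_obs, into the light map M (one energy value per stored unit normal).
    The weight f(n) is 1 at n = n_obs and decays with the angle to n_obs,
    so only directions near the observed one are updated (formula (8))."""
    cos_ang = np.clip(normals @ n_obs, -1.0, 1.0)
    ang = np.arccos(cos_ang)
    f = np.exp(-s * ang**2)   # assumed falloff shape; the patent only fixes s
    return (1.0 - f) * M + f * x_obs
```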
When rendering the virtual object, the light map can be applied as a texture map to reproduce the shading of the virtual object's surface.
3. Analysis of light-source positions
To determine the shadows in the AR scene, the positions of the light sources in the real environment must be computed. When the real environment contains only one light source, its position is easy to recover: scan the entire light map and find the normal direction ns corresponding to the maximum light-energy value; that direction is the direction of the light source. When the real environment contains more than one light source, several virtual light sources can be used to simulate the real ones. A large brightness value Ic is chosen as a critical threshold: when the light-energy value along some normal direction is greater than (or equal to) Ic, a light source may exist along that direction; otherwise no light source exists there. For each normal direction n that may host a light source, the following procedure decides whether a light source really exists along it. First, define a small angle ε as the critical angle. Then scan the normal directions in the light map one by one; when the angle between a normal direction n′ and n is smaller than the critical angle ε, n′ is regarded as a neighbouring normal direction of n. Finally, check whether the light intensity along each neighbouring normal direction is smaller than that along n: if all of them are smaller, a light source must exist along normal direction n; otherwise there is none. This procedure can be expressed by formula (10). All virtual light sources are distributed on a sphere centred on the virtual object, as shown in Figure 3.

In formula (10), I, In, and In′ denote the light energy of the environment, the light energy along normal direction n, and the light energy along normal direction n′, respectively.
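The threshold-plus-local-maximum test above can be sketched as follows; representing the light map as parallel arrays of unit normals and energies, and the function name, are assumptions made for illustration:

```python
import numpy as np

def detect_light_sources(normals, energies, I_c, eps):
    """Return the indices of normal directions that host a light source:
    the energy along n must reach the threshold I_c, and every
    neighbouring direction n' within angle eps of n must carry strictly
    less energy than n (the test of formula (10))."""
    sources = []
    for i, n in enumerate(normals):
        if energies[i] < I_c:
            continue  # below the critical brightness: no source here
        ang = np.arccos(np.clip(normals @ n, -1.0, 1.0))
        neighbours = (ang < eps) & (ang > 1e-6)  # exclude n itself
        if np.all(energies[neighbours] < energies[i]):
            sources.append(i)
    return sources
```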
Once the positions of all virtual light sources are determined, the Phong lighting model (a widely accepted "standard" model that combines the diffuse, specular, and ambient contributions to a surface's appearance) can be used to compute the shadows in the augmented reality scene.
4. Shadow generation
The virtual object used in the invention consists of many triangular facets; each facet has three vertices v1(x1, y1, z1), v2(x2, y2, z2), and v3(x3, y3, z3). These three vertices determine a unique plane in space:
ax+by+cz+d=0 (11)
where
a=y1*(z2-z3)+y2*(z3-z1)+y3*(z1-z2);
b=z1*(x2-x3)+z2*(x3-x1)+z3*(x1-x2);
c=x1*(y2-y3)+x2*(y3-y1)+x3*(y1-y2);
d=-(x1*(y2*z3-y3*z2)+x2*(y3*z1-y1*z3)+x3*(y1*z2-y2*z1)).
Assuming the light source lies at (xp, yp, zp), once the plane parameters of a facet have been determined, the following formula decides whether that facet faces the light source:
S=a*xp+b*yp+c*zp+d (12)
If the computed S is greater than 0, the facet faces the light source; otherwise it faces away from it.
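Formulas (11) and (12) translate directly into code; the following sketch (function names are my own) computes the plane coefficients of a facet from its three vertices and applies the sign test for whether the facet faces a light source at (xp, yp, zp):

```python
def plane_coefficients(v1, v2, v3):
    """Coefficients of the plane a*x + b*y + c*z + d = 0 through three
    vertices, using the cofactor expressions of formula (11)."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = v1, v2, v3
    a = y1*(z2 - z3) + y2*(z3 - z1) + y3*(z1 - z2)
    b = z1*(x2 - x3) + z2*(x3 - x1) + z3*(x1 - x2)
    c = x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)
    d = -(x1*(y2*z3 - y3*z2) + x2*(y3*z1 - y1*z3) + x3*(y1*z2 - y2*z1))
    return a, b, c, d

def faces_light(plane, light_pos):
    """Formula (12): the facet faces the light when S > 0."""
    a, b, c, d = plane
    xp, yp, zp = light_pos
    return a*xp + b*yp + c*zp + d > 0
```

For the triangle (0,0,0), (1,0,0), (0,1,0) this yields the plane z = 0, which faces a light source above it (positive z) and faces away from one below it.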
Scan the triangular facets of the virtual object one by one. When a facet faces the light source, test each of its three edges in turn to decide whether it is a critical edge: if the facet adjacent to an edge does not exist, or faces away from the light source, that edge is a critical edge; otherwise it is not.
All the critical edges together form the boundary of the region of the virtual object that can cast shadows. Once the 3D structure of the other objects in the scene (both virtual and real) is known, the shadow-volume method from computer graphics can generate the shadows this object casts onto them.
The shadow-volume method generates shadows in two steps. First, connect the light source to the two endpoints of each critical edge and extend the lines far beyond them; each critical edge and the far ends of its two extension lines determine a semi-infinite quadrilateral, and all such quadrilaterals together form the object's shadow volume. Then the shadow regions the object casts onto other objects are determined from the intersections of the shadow volume with those objects.
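The critical-edge scan described above can be sketched as follows. The adjacency representation (a mapping from a triangle's edge slot to the index of the neighbouring triangle, if any) and the function name are assumptions; the patent specifies only the test itself:

```python
def critical_edges(triangles, adjacency, facing):
    """Collect the critical (silhouette) edges of a mesh:
    an edge of a light-facing triangle is critical when the triangle
    adjacent across that edge is absent or faces away from the light.

    triangles: list of (i, j, k) vertex-index triples
    adjacency: dict mapping (triangle index, edge slot 0..2) to the
               neighbouring triangle's index; a missing key means no
               neighbour exists across that edge
    facing:    list of booleans, True if the triangle faces the light"""
    edges = []
    for t, (i, j, k) in enumerate(triangles):
        if not facing[t]:
            continue  # only light-facing triangles contribute edges
        for e, edge in enumerate(((i, j), (j, k), (k, i))):
            nb = adjacency.get((t, e))
            if nb is None or not facing[nb]:
                edges.append(edge)
    return edges
```

These edges are the ones extruded away from the light source to build the semi-infinite quadrilaterals of the shadow volume.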
Description of the Drawings
Figure 1: Lighting-generation process of the AR system.
Figure 2: Schematic diagram of surface illumination.
Figure 3: Distribution of the virtual light sources.
Figure 4: Lighting results.
Specific Implementation Process
1. Program framework
Input: the system's inputs are the video images (VImage[i]) and the 3D model of the virtual object (obj).
2. Experimental environment
The hardware environment of the invention consists of three parts: a USB camera, a PC, and an LCD monitor. The PC has a Core 2 Duo E6550 CPU at 2.33 GHz, 1 GB of RAM, and an NVIDIA GeForce 8600 GT graphics card running OpenGL version 2.1.1.
The software is implemented on the Windows XP platform in the Visual Studio 2005 environment, using libraries such as OpenCV and OpenGL.
In the experiment, the USB camera first captures video of the real scene. Once the program has gathered sufficient lighting information, the virtual object is drawn into the real scene and the AR scene is shaded accordingly, as shown in Figure 4. The planar object in the figure is the calibration object; the teapot is a computer-generated virtual object.
While shading the virtual teapot, the program computes the position of the light source in the environment. From the light-source position and the geometry and spatial positions of the virtual and real objects, it then computes and draws the shadow the teapot casts onto the calibration plane, also shown in Figure 4.
Claims (3)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN200910044517A CN101710429B (en) | 2009-10-12 | 2009-10-12 | Illumination algorithm of augmented reality system based on dynamic light map |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN101710429A true CN101710429A (en) | 2010-05-19 |
| CN101710429B CN101710429B (en) | 2012-09-05 |
Family
ID=42403213
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN200910044517A Expired - Fee Related CN101710429B (en) | 2009-10-12 | 2009-10-12 | Illumination algorithm of augmented reality system based on dynamic light map |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN101710429B (en) |
Cited By (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101923791A (en) * | 2010-05-31 | 2010-12-22 | 华中师范大学 | A Chinese Character Learning Method Using Augmented Reality Technology Combined with Plane Reading Materials |
| CN102096941A (en) * | 2011-01-30 | 2011-06-15 | 北京航空航天大学 | Consistent lighting method under falsehood-reality fused environment |
| CN102314708A (en) * | 2011-05-23 | 2012-01-11 | 北京航空航天大学 | Optical field sampling and simulating method by utilizing controllable light source |
| CN103155004A (en) * | 2010-09-01 | 2013-06-12 | 马斯科公司 | Apparatus, system, and method for demonstrating a lighting solution by image rendering |
| CN103339654A (en) * | 2011-01-25 | 2013-10-02 | 高通股份有限公司 | Use occlusion to detect and track 3D objects |
| CN104463198A (en) * | 2014-11-19 | 2015-03-25 | 上海电机学院 | Method for carrying out illumination estimation on real illumination environment |
| CN104580920A (en) * | 2013-10-21 | 2015-04-29 | 华为技术有限公司 | Imaging processing method and user terminal |
| AT14791U1 (en) * | 2014-09-04 | 2016-06-15 | Zumtobel Lighting Gmbh | Augmented reality-based lighting system and procedure |
| US9449428B2 (en) | 2009-12-21 | 2016-09-20 | Thomson Licensing | Method for generating an environment map |
| WO2016161950A1 (en) * | 2015-04-10 | 2016-10-13 | Beijing Zhigu Rui Tuo Tech Co., Ltd. | Display control methods and apparatuses |
| CN106940897A (en) * | 2017-03-02 | 2017-07-11 | 苏州蜗牛数字科技股份有限公司 | A kind of method that real shadow is intervened in AR scenes |
| CN107025683A (en) * | 2017-03-30 | 2017-08-08 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
| CN107093204A (en) * | 2017-04-14 | 2017-08-25 | 苏州蜗牛数字科技股份有限公司 | It is a kind of that the method for virtual objects effect of shadow is influenceed based on panorama |
| CN107424206A (en) * | 2017-04-14 | 2017-12-01 | 苏州蜗牛数字科技股份有限公司 | A kind of interactive approach that the performance of virtual scene shadow is influenceed using actual environment |
| CN107749076A (en) * | 2017-11-01 | 2018-03-02 | 太平洋未来科技(深圳)有限公司 | The method and apparatus that real illumination is generated in augmented reality scene |
| CN107808409A (en) * | 2016-09-07 | 2018-03-16 | 中兴通讯股份有限公司 | The method, device and mobile terminal of illumination render are carried out in a kind of augmented reality |
| CN107871339A (en) * | 2017-11-08 | 2018-04-03 | 太平洋未来科技(深圳)有限公司 | The rendering intent and device of virtual objects color effect in video |
| CN108375830A (en) * | 2017-01-25 | 2018-08-07 | 矢崎总业株式会社 | head-up display device and display control method |
| CN108460841A (en) * | 2018-01-23 | 2018-08-28 | 电子科技大学 | A kind of indoor scene light environment method of estimation based on single image |
| CN109064544A (en) * | 2018-08-09 | 2018-12-21 | 太平洋未来科技(深圳)有限公司 | Light and shadow rendering method, device and electronic equipment for virtual objects in panoramic video |
| CN109242800A (en) * | 2018-09-26 | 2019-01-18 | 北京邮电大学 | The method of dummy model illumination consistency is realized by Image estimation ambient lighting |
| CN110166760A (en) * | 2019-05-27 | 2019-08-23 | 浙江开奇科技有限公司 | Image treatment method and terminal device based on panoramic video image |
| WO2019216688A1 (en) * | 2018-05-10 | 2019-11-14 | Samsung Electronics Co., Ltd. | Method for estimating light for augmented reality and electronic device thereof |
| CN110458964A (en) * | 2019-08-21 | 2019-11-15 | 四川大学 | A Real-time Calculation Method of Dynamic Lighting in Real Environment |
| US10546422B2 (en) | 2013-09-13 | 2020-01-28 | Signify Holding B.V. | System and method for augmented reality support using a lighting system's sensor data |
| WO2020019131A1 (en) * | 2018-07-23 | 2020-01-30 | 太平洋未来科技(深圳)有限公司 | Method and apparatus for determining light ray information, and electronic device |
| CN111260769A (en) * | 2020-01-09 | 2020-06-09 | 北京中科深智科技有限公司 | Real-time rendering method and device based on dynamic illumination change |
| CN111462295A (en) * | 2020-03-27 | 2020-07-28 | 咪咕文化科技有限公司 | Shadow processing method, device and storage medium in augmented reality snap |
| CN112200712A (en) * | 2020-09-08 | 2021-01-08 | 成都安易迅科技有限公司 | GLES image rendering method and device, storage medium and computer equipment |
| WO2021223133A1 (en) * | 2020-05-07 | 2021-11-11 | 浙江大学 | Neural network-based augmented reality drawing method |
| CN115080693A (en) * | 2022-06-23 | 2022-09-20 | 蔚来汽车科技(安徽)有限公司 | Text processing methods, electronic devices, dialogue systems and automobiles |
| CN119323656A (en) * | 2024-12-16 | 2025-01-17 | 山东天竞电子科技有限公司 | Method for synthesizing real virtual image |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN100552713C (en) * | 2006-06-23 | 2009-10-21 | 腾讯科技(深圳)有限公司 | A kind of image processing method |
| CN101102398B (en) * | 2007-07-26 | 2010-05-19 | 上海交通大学 | Fully automatic real-time digital image processing enhancement system |
- 2009-10-12 CN CN200910044517A patent/CN101710429B/en not_active Expired - Fee Related
Cited By (47)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9449428B2 (en) | 2009-12-21 | 2016-09-20 | Thomson Licensing | Method for generating an environment map |
| CN101923791A (en) * | 2010-05-31 | 2010-12-22 | 华中师范大学 | A Chinese Character Learning Method Using Augmented Reality Technology Combined with Plane Reading Materials |
| CN103155004A (en) * | 2010-09-01 | 2013-06-12 | 马斯科公司 | Apparatus, system, and method for demonstrating a lighting solution by image rendering |
| CN103155004B (en) * | 2010-09-01 | 2016-05-18 | 玛斯柯有限公司 | Demonstrate equipment, the system and method for illumination scheme by image rendering |
| CN103339654A (en) * | 2011-01-25 | 2013-10-02 | 高通股份有限公司 | Use occlusion to detect and track 3D objects |
| US10109065B2 (en) | 2011-01-25 | 2018-10-23 | Qualcomm Incorporated | Using occlusions to detect and track three-dimensional objects |
| CN102096941A (en) * | 2011-01-30 | 2011-06-15 | 北京航空航天大学 | Consistent lighting method under falsehood-reality fused environment |
| CN102096941B (en) * | 2011-01-30 | 2013-03-20 | 北京航空航天大学 | Consistent lighting method under falsehood-reality fused environment |
| CN102314708A (en) * | 2011-05-23 | 2012-01-11 | 北京航空航天大学 | Optical field sampling and simulating method by utilizing controllable light source |
| CN102314708B (en) * | 2011-05-23 | 2013-07-31 | 北京航空航天大学 | Optical field sampling and simulating method by utilizing controllable light source |
| US10546422B2 (en) | 2013-09-13 | 2020-01-28 | Signify Holding B.V. | System and method for augmented reality support using a lighting system's sensor data |
| CN104580920A (en) * | 2013-10-21 | 2015-04-29 | 华为技术有限公司 | Imaging processing method and user terminal |
| CN104580920B (en) * | 2013-10-21 | 2018-03-13 | 华为技术有限公司 | The method and user terminal of a kind of imaging |
| AT14791U1 (en) * | 2014-09-04 | 2016-06-15 | Zumtobel Lighting Gmbh | Augmented reality-based lighting system and procedure |
| CN104463198A (en) * | 2014-11-19 | 2015-03-25 | 上海电机学院 | Method for carrying out illumination estimation on real illumination environment |
| WO2016161950A1 (en) * | 2015-04-10 | 2016-10-13 | Beijing Zhigu Rui Tuo Tech Co., Ltd. | Display control methods and apparatuses |
| CN106162117A (en) * | 2015-04-10 | 2016-11-23 | 北京智谷睿拓技术服务有限公司 | Display control method and device |
| CN107808409B (en) * | 2016-09-07 | 2022-04-12 | 中兴通讯股份有限公司 | Method and device for performing illumination rendering in augmented reality and mobile terminal |
| CN107808409A (en) * | 2016-09-07 | 2018-03-16 | 中兴通讯股份有限公司 | The method, device and mobile terminal of illumination render are carried out in a kind of augmented reality |
| CN108375830A (en) * | 2017-01-25 | 2018-08-07 | 矢崎总业株式会社 | head-up display device and display control method |
| US11295702B2 (en) | 2017-01-25 | 2022-04-05 | Yazaki Corporation | Head-up display device and display control method |
| CN106940897A (en) * | 2017-03-02 | 2017-07-11 | 苏州蜗牛数字科技股份有限公司 | A kind of method that real shadow is intervened in AR scenes |
| CN107025683A (en) * | 2017-03-30 | 2017-08-08 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
| CN107424206B (en) * | 2017-04-14 | 2020-09-22 | 苏州蜗牛数字科技股份有限公司 | Interaction method for influencing shadow expression of virtual scene by using real environment |
| CN107424206A (en) * | 2017-04-14 | 2017-12-01 | 苏州蜗牛数字科技股份有限公司 | A kind of interactive approach that the performance of virtual scene shadow is influenceed using actual environment |
| CN107093204A (en) * | 2017-04-14 | 2017-08-25 | 苏州蜗牛数字科技股份有限公司 | It is a kind of that the method for virtual objects effect of shadow is influenceed based on panorama |
| CN107749076A (en) * | 2017-11-01 | 2018-03-02 | 太平洋未来科技(深圳)有限公司 | The method and apparatus that real illumination is generated in augmented reality scene |
| CN107871339B (en) * | 2017-11-08 | 2019-12-24 | 太平洋未来科技(深圳)有限公司 | Rendering method and device for color effect of virtual object in video |
| CN107871339A (en) * | 2017-11-08 | 2018-04-03 | 太平洋未来科技(深圳)有限公司 | The rendering intent and device of virtual objects color effect in video |
| CN108460841A (en) * | 2018-01-23 | 2018-08-28 | 电子科技大学 | A kind of indoor scene light environment method of estimation based on single image |
| WO2019216688A1 (en) * | 2018-05-10 | 2019-11-14 | Samsung Electronics Co., Ltd. | Method for estimating light for augmented reality and electronic device thereof |
| US10902669B2 (en) | 2018-05-10 | 2021-01-26 | Samsung Electronics Co., Ltd. | Method for estimating light for augmented reality and electronic device thereof |
| WO2020019131A1 (en) * | 2018-07-23 | 2020-01-30 | 太平洋未来科技(深圳)有限公司 | Method and apparatus for determining light ray information, and electronic device |
| CN109064544A (en) * | 2018-08-09 | 2018-12-21 | Light and shadow rendering method, device and electronic device for virtual objects in panoramic video |
| CN109242800A (en) * | 2018-09-26 | 2019-01-18 | Method for realizing illumination consistency of a virtual model by estimating environmental lighting from images |
| CN109242800B (en) * | 2018-09-26 | 2022-03-29 | 北京邮电大学 | Method for realizing illumination consistency of virtual model by estimating environmental illumination through image |
| CN110166760A (en) * | 2019-05-27 | 2019-08-23 | Image processing method and terminal device based on panoramic video images |
| CN110458964A (en) * | 2019-08-21 | 2019-11-15 | 四川大学 | A Real-time Calculation Method of Dynamic Lighting in Real Environment |
| CN111260769A (en) * | 2020-01-09 | 2020-06-09 | 北京中科深智科技有限公司 | Real-time rendering method and device based on dynamic illumination change |
| CN111462295A (en) * | 2020-03-27 | 2020-07-28 | Shadow processing method, device and storage medium in augmented reality co-production |
| CN111462295B (en) * | 2020-03-27 | 2023-09-05 | 咪咕文化科技有限公司 | Shadow processing method, device and storage medium in augmented reality co-production |
| WO2021223133A1 (en) * | 2020-05-07 | 2021-11-11 | 浙江大学 | Neural network-based augmented reality drawing method |
| CN112200712A (en) * | 2020-09-08 | 2021-01-08 | 成都安易迅科技有限公司 | GLES image rendering method and device, storage medium and computer equipment |
| CN112200712B (en) * | 2020-09-08 | 2023-10-27 | 成都安易迅科技有限公司 | GLES image rendering method and device, storage medium and computer equipment |
| CN115080693A (en) * | 2022-06-23 | 2022-09-20 | 蔚来汽车科技(安徽)有限公司 | Text processing methods, electronic devices, dialogue systems and automobiles |
| CN119323656A (en) * | 2024-12-16 | 2025-01-17 | Method for synthesizing realistic virtual images |
| CN119323656B (en) * | 2024-12-16 | 2025-06-20 | 山东天竞电子科技有限公司 | A method for synthesizing realistic virtual images |
Also Published As
| Publication number | Publication date |
|---|---|
| CN101710429B (en) | 2012-09-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN101710429B (en) | Illumination algorithm of augmented reality system based on dynamic light map | |
| JP6246757B2 (en) | Method and system for representing virtual objects in field of view of real environment | |
| CN100594519C (en) | Method for real-time generation of an augmented reality environment illumination model using a spherical panoramic camera |
| Wang et al. | View-dependent displacement mapping | |
| CN102096941B (en) | Consistent lighting method in a virtual-real fused environment |
| CN103400003B (en) | Laser radar scene simulation method based on GPU programming |
| CN110033509B (en) | Method for constructing three-dimensional face normal based on diffuse reflection gradient polarized light | |
| CN108460841A (en) | Indoor scene lighting environment estimation method based on a single image |
| US8436855B1 (en) | Efficient illumination of large three dimensional environments | |
| CN107644453A (en) | Rendering method and system based on physically based shading |
| Liao et al. | Interreflection removal for photometric stereo by using spectrum-dependent albedo | |
| Gruyer et al. | Modeling and validation of a new generic virtual optical sensor for ADAS prototyping | |
| TW201044316A (en) | Geospatial modeling system for colorizing images and related methods | |
| CN103413346B (en) | Realistic fluid real-time reconstruction method and system |
| CN110021067A (en) | A method of three-dimensional face normal is constructed based on mirror-reflection gradient polarised light | |
| Fouque et al. | Photometric DIC: a unified framework for global Stereo Digital Image Correlation based on the construction of textured digital twins | |
| Zhuang et al. | The influence of active projection speckle patterns on underwater binocular stereo vision 3D imaging | |
| CN119068120B (en) | A basic data generation method for unmanned system intelligence level assessment | |
| CN115205492A (en) | A method and device for real-time mapping of a three-dimensional model by a laser beam | |
| CN119758290A (en) | Point cloud reflection intensity simulation method, medium and equipment | |
| CN119251430A (en) | A method, medium and device for constructing a virtual scene based on point cloud data | |
| CN106780708A (en) | 3D model rendering method and system based on simulated refraction and global illumination |
| Balz et al. | Improved real-time SAR simulation in urban areas | |
| Meister | On creating reference data for performance analysis in image processing | |
| Schiavone et al. | Interoperability issues for terrain databases in distributed interactive simulation |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C53 | Correction of patent for invention or patent application | ||
| CB03 | Change of inventor or designer information |
Inventors after: Chen Hao, Sun Jianhua, Zhao Liming, Zhang Xiguang, Tan Ming, Zeng Ping. Inventors before: Chen Hao, Sun Jianhua, Tan Ming, Zeng Ping |
|
| COR | Change of bibliographic data |
Free format text: CORRECT: INVENTOR; FROM: CHEN HAO SUN JIANHUA TAN MING ZENG PING TO: CHEN HAO SUN JIANHUA ZHAO LIMING ZHANG XIGUANG TAN MING ZENG PING |
|
| C14 | Grant of patent or utility model | ||
| GR01 | Patent grant | ||
| CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20120905 Termination date: 20151012 |
|
| EXPY | Termination of patent right or utility model |