CN106131530A - Naked-eye 3D virtual reality display system and display method thereof - Google Patents
Naked-eye 3D virtual reality display system and display method thereof
- Publication number
- CN106131530A CN201610739396.0A CN201610739396A CN 106131530 A
- Authority
- CN
- China
- Prior art keywords
- player
- camera
- unit
- actual
- naked eye
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2215/00—Indexing scheme for image rendering
- G06T2215/16—Using real world measurements to influence rendering
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a naked-eye 3D virtual reality display system and a display method thereof. A VR sensing unit collects the player's motion information and sends triggering or manipulation instructions to a 3D rendering unit, which triggers or controls a three-dimensional virtual scene, renders a binocular image and sends it to a VR display unit, giving the player an immersive virtual reality experience. At the same time, an actual stereoscopic camera array, combined with green-screen keying, places the player himself into the digitized virtual scene, and the player together with the virtual scene is finally presented as a naked-eye 3D display, so that onlookers can observe, from a third-person viewpoint and without any auxiliary equipment, the scene the player is experiencing. Onlookers can also manually adjust the viewing position and angle of the third-person viewpoint, watching what happens in the virtual scene according to their own interest, or observing and analysing the whole virtual scene comprehensively.
Description
Technical field
The present invention relates to naked-eye 3D display technology, and in particular to a naked-eye 3D virtual reality display system and a display method thereof.
Background technology
Humans live in a three-dimensional world and perceive it through the mechanism of stereoscopic vision. To express this world, people have proposed and developed many means, among which images are the most intuitive. However, most display devices can only provide 2D (two-dimensional) display: they convey the content of a scene but ignore depth information, so viewers can only judge the front-to-back relationships between objects from accumulated everyday experience and from cues such as shading in the 2D image. In the information age, as society develops, 2D display increasingly fails to meet human needs; 3D display has become a new research goal and a new trend in the display field. As research on 3D display has deepened, various technologies have been proposed and several 3D display modes realized. Among them, multi-view grating-type naked-eye 3D display allows several viewers to watch a stereoscopic picture simultaneously within a relatively large viewing angle, delivering a striking visual experience without any viewing aid, and has therefore attracted wide attention.
Virtual reality (Virtual Reality, VR) takes computer technology as its core and, combined with related science and technology, generates a digitized environment that approximates a real environment to a certain extent in vision, hearing, touch and other senses. Through the necessary equipment the user interacts with objects in the digitized environment and is influenced by them, producing the feeling and experience of being personally present in the corresponding real environment. Virtual reality is a scientific method and technology that humans have formed, in the course of exploring nature, for understanding and simulating nature and thereby better adapting to and using it.
With the development of social productivity and of science and technology, VR technology is receiving increasing attention from all walks of life and has made enormous progress, gradually becoming a new field of science and technology. At present, however, virtual reality technology is mainly aimed at a single immersive experience: bystanders can only see the player wearing the VR headset making various movements, but cannot see the content on the VR display, and therefore cannot share or understand the immersive experience the player is having. Although an onlooker can see the monocular picture in the VR headset, i.e. the first-person view, on a monitor, it lacks the immersive stereoscopic effect produced by a binocular picture, and the onlooker can only passively accept images that change rapidly with the player's fast head movements. As a result, onlookers not only cannot share the excitement the player experiences, but may even suffer visual fatigue from the rapidly changing images.
Summary of the invention
In view of the above problems in the prior art, the present invention proposes a naked-eye 3D virtual reality display system and a display method thereof.
One object of the present invention is to provide a naked-eye 3D virtual reality display system.
The naked-eye 3D virtual reality display system of the present invention includes: a VR sensing unit, a multi-camera positioning and acquisition unit, a 3D rendering unit, a VR display unit, a naked-eye 3D synthesis display unit and a green-screen space. The VR sensing unit, the multi-camera positioning and acquisition unit, the VR display unit and the player are all located inside the green-screen space; the VR sensing unit and the multi-camera positioning and acquisition unit are each connected to the 3D rendering unit; the 3D rendering unit is connected to the VR display unit and to the naked-eye 3D synthesis display unit. According to the viewing object, the naked-eye 3D virtual reality display system is divided into a player part and an onlooker part. In the player part, the VR sensing unit collects the player's motion information in the green-screen space and transmits it to the 3D rendering unit, where it triggers or controls the three-dimensional virtual scene and causes the 3D rendering unit to render the binocular image of the virtual scene seen from the player's viewpoint; the binocular image is transmitted to the VR display unit, which delivers the image pair with binocular parallax to the player's left and right eyes, so that the player obtains an immersive virtual reality experience. In the onlooker part, the actual stereoscopic camera array in the multi-camera positioning and acquisition unit photographs the player inside the green-screen space, collecting several player images with green-screen background from different angles, and sends them, together with the pose of the actual stereoscopic camera array, to the 3D rendering unit. The 3D rendering unit sets the parameters of the multi-view virtual cameras according to the parameters of the actual stereoscopic camera array, and controls the pose of the virtual cameras according to the pose of the actual array so that the two stay synchronized at every moment; the virtual cameras render, in real time, the three-dimensional virtual scene triggered or controlled by the VR sensing unit, obtaining virtual-scene rendered images. The green-screen background of the player images is then keyed out, and the keyed player images are superimposed on the virtual-scene rendered images to form multiple parallax images, which are transmitted to the naked-eye 3D synthesis display unit. The naked-eye 3D synthesis display unit synthesizes the multiple parallax images and performs naked-eye 3D display. The onlooker controls the pose of the actual stereoscopic camera array in the multi-camera positioning and acquisition unit and, through the display of the naked-eye 3D synthesis display unit, obtains a third-person viewing experience.
The green-screen space is the interior of a cuboid built from green screens, each face of which is covered by a green screen. The VR sensing unit, the player and the multi-camera positioning and acquisition unit can all be placed inside the cuboid, and a personnel entrance is provided on one side of the cuboid.
The VR sensing unit handles the human-computer interaction between the player and the naked-eye 3D virtual reality display system, carrying the perceptual exchange between the player and the virtual scene, including haptic/force-feedback perception, three-dimensional orientation tracking and interactive behaviour. Three-dimensional tracking and positioning technology is used: the player's motion information captured by the tracking and positioning device gives the player an interactive space in which to move freely and increases the flexibility of interactive operation. The motion information includes the position and angle of the player's head as well as limb information. Tracking and positioning technologies can be divided into active and passive ones. The tracking device of the active technology has both a transmitter and a receiver and determines the player's motion information from the physical link between the emitted and received signals. The tracking device of the passive technology has no active signal source; the receiver only measures changes in the received signal to determine the position and attitude of the tracked object. The tracking and positioning technology used is one of laser positioning, optical positioning, infrared active optical and visible-light active optical technology. The VR sensing unit transmits the collected motion information of the player to the 3D rendering unit.
The multi-camera positioning and acquisition unit includes an actual stereoscopic camera array, a spatial positioning and tracking system and a data acquisition unit. The actual stereoscopic camera array is connected to the data acquisition unit, which collects the data of the array and transmits it to the 3D rendering unit; the spatial positioning and tracking system locates and tracks the six degrees of freedom of the actual stereoscopic camera array in space, including position information and orientation information. The actual stereoscopic camera array consists of multiple actual cameras that photograph the player in the green-screen space from different angles, so that player images with green-screen background are obtained from different viewpoints. The hardware and software parameters of all the actual cameras are kept identical: the hardware parameters mainly include the acquisition chip (e.g. CCD or CMOS), the camera circuitry and the lens specifications; the software parameters include resolution, exposure time, white balance, colour correction, image cropping region and Bayer conversion type. All of these parameters must be consistent in order to achieve a good result on the naked-eye 3D synthesis display unit. The actual cameras are arranged in a converging (toed-in) configuration: the optical centres of all cameras lie on the same horizontal line, their optical axes lie in the same plane, the spacing between adjacent cameras is equal, and the optical axes intersect at a single point in front of the array, called the convergence point. With an even number of actual cameras, the convergence point lies on the perpendicular bisector of the line joining the optical centres of the two middle cameras; with an odd number, it lies on the optical axis of the middle camera. The spacing of adjacent cameras and the position of the convergence point are calculated from the specifications of the naked-eye 3D display screen and the binocular fusion ability of the human eye, ensuring that the parallax of the photographed player on the naked-eye 3D display screen stays within the fusion range of the human eye, and that, when the virtual cameras are set up consistently with the actual stereoscopic camera array, the parallax of the rendered virtual scene also stays within that fusion range.
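A rough way to picture this constraint, assuming the common small-angle parallax model for toed-in rigs, p ≈ M·f·b·(1/Z0 − 1/Z), is sketched below; the focal length and sensor width are illustrative assumptions, while the camera spacing, convergence distance and screen width follow the embodiment described later.

```python
def screen_parallax_mm(b_m, z0_m, z_m, focal_mm, sensor_width_mm, screen_width_m):
    """Approximate horizontal parallax on the naked-eye 3D display, in millimetres.

    b_m: spacing of adjacent cameras; z0_m: convergence distance (zero parallax);
    z_m: object distance; focal_mm / sensor_width_mm: lens and sensor geometry;
    screen_width_m: physical width of the display.  A negative result means
    crossed (in front of screen) parallax, a positive one uncrossed.
    """
    sensor_parallax_mm = focal_mm * b_m * (1.0 / z0_m - 1.0 / z_m)  # parallax on the sensor
    magnification = (screen_width_m * 1000.0) / sensor_width_mm     # sensor -> screen scale
    return sensor_parallax_mm * magnification

# spacing 3.57 cm, convergence 2 m, nearest object 1.5 m (embodiment values);
# 12 mm focal length and 11.3 mm sensor width are assumed for illustration only
p = screen_parallax_mm(0.0357, 2.0, 1.5, focal_mm=12.0, sensor_width_mm=11.3,
                       screen_width_m=5.76)
print(f"on-screen parallax ~ {p:.0f} mm")  # compare against the eye's fusion limit
```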
To reduce the running time of the whole system and raise its frame rate, the data acquisition unit of the present invention adopts a multi-threaded camera acquisition scheme: a multi-core CPU is used to acquire the data of the actual cameras in parallel, and the acquired data are then sent to the 3D rendering unit, which greatly improves the frame rate at which the system runs.
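A minimal sketch of such a multi-threaded acquisition scheme, assuming the cameras are exposed through OpenCV's VideoCapture interface (the capture API is not specified in the text); one thread per camera grabs a frame in parallel:

```python
import threading
import cv2  # assumed capture API, used here as a stand-in for the industrial camera SDK

def grab(cam_index, frames, slot):
    """Grab one frame from camera cam_index into frames[slot]."""
    cap = cv2.VideoCapture(cam_index)
    ok, frame = cap.read()
    frames[slot] = frame if ok else None
    cap.release()

def acquire_all(camera_indices):
    """Acquire one frame from every actual camera in parallel."""
    frames = [None] * len(camera_indices)
    threads = [threading.Thread(target=grab, args=(c, frames, i))
               for i, c in enumerate(camera_indices)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return frames  # to be handed on to the 3D rendering unit

frames = acquire_all(range(8))  # 8-camera array as in the embodiment
```

In a real pipeline each thread would keep its camera open and loop continuously, pushing the latest frame into a queue read by the 3D rendering unit, rather than reopening the device per frame.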
The spatial positioning and tracking system locates and tracks the pose of the actual stereoscopic camera array in its six degrees of freedom in space, including position information and orientation information. The tracking and positioning technology it uses is of the same kind as that mentioned for the VR sensing unit: one of laser positioning, optical positioning, infrared active optical and visible-light active optical technology. In general, to avoid signal interference between the two positioning systems and to improve the accuracy and robustness of each, two different tracking and positioning technologies are often used, one for each object to be positioned.
The 3D rendering unit stores the three-dimensional digital virtual scene and a preset rule path. First, according to the player's motion information collected by the VR sensing unit, it triggers or controls the response of the virtual scene, sets the orientation of the virtual binocular camera according to the position and angle of the player's head, renders the binocular image of the virtual scene seen from the player's viewpoint, and transmits it to the VR display unit for display. In this way the player sees the virtual three-dimensional scene from different angles as his head moves, can control the virtual scene with his limbs and interact with it, and thus obtains an immersive virtual reality experience. Because the viewpoint of the binocular image rendered by the 3D rendering unit follows the position and angle of the head, the player's visual system and motion perception system are coupled, and the experience feels more realistic.
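The placement of the virtual binocular camera from the tracked head pose could look like the following sketch (the interpupillary distance and the axis conventions are illustrative assumptions, not values given in the text):

```python
import numpy as np

IPD = 0.064  # assumed interpupillary distance in metres

def binocular_camera_poses(head_pos, head_rot):
    """head_pos: (3,) position; head_rot: (3, 3) rotation matrix of the head.

    Returns (left_pos, right_pos, forward): positions of the two virtual
    cameras, both sharing the head's orientation.
    """
    right = head_rot @ np.array([1.0, 0.0, 0.0])     # head's local +X axis
    forward = head_rot @ np.array([0.0, 0.0, -1.0])  # head's local -Z (viewing) axis
    left_pos = head_pos - 0.5 * IPD * right
    right_pos = head_pos + 0.5 * IPD * right
    return left_pos, right_pos, forward

# example: head at 1.7 m height, looking straight ahead
l, r, fwd = binocular_camera_poses(np.array([0.0, 1.7, 0.0]), np.eye(3))
```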
In addition, the 3D rendering unit is provided with multi-view virtual cameras identical to the actual stereoscopic camera array in number, hardware parameters and software parameters. The 3D rendering unit also sets the pose of the virtual cameras according to the pose of the actual stereoscopic camera array, so that the virtual cameras stay consistent with the parameters and pose of the actual array at every moment, one virtual camera of identical parameters and pose corresponding to each actual camera; the virtual cameras are arranged in the same converging configuration. For the virtual scene triggered or manipulated by the VR sensing unit, the virtual cameras perform real-time rendering, obtaining multiple virtual-scene rendered images; each of these is a render texture, and the data of every frame are saved into the respective render textures. Furthermore, after the green-screen background of the player images collected by the actual stereoscopic camera array has been keyed out, the keyed player images are superimposed on the corresponding virtual-scene rendered images to form multiple parallax images, which are finally sent to the naked-eye 3D synthesis display unit for stereoscopic synthesis and naked-eye 3D display. In this way the onlooker can see, on the naked-eye 3D display screen, the player's motion state inside the virtual scene; moreover, by controlling the pose of the actual stereoscopic camera array the onlooker simultaneously controls the pose of the identical multi-view virtual cameras, and thereby the viewing angle, achieving emotional and stereoscopic sharing with the player.
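This per-frame synchronization can be pictured as composing the tracked 6-DoF pose of the camera mount with the fixed offset of each camera on the mount, as in the sketch below (the data layout is assumed for illustration):

```python
import numpy as np

def compose(pose_a, pose_b):
    """Compose two rigid poses given as (3x3 rotation matrix, 3-vector translation)."""
    ra, ta = pose_a
    rb, tb = pose_b
    return ra @ rb, ra @ tb + ta

def virtual_camera_poses(rig_pose, camera_offsets):
    """rig_pose: tracked 6-DoF pose of the actual camera mount this frame.
    camera_offsets: fixed pose of each actual camera relative to the mount
    (constant, since the cameras are rigidly fixed to the mount).
    Returns the pose to apply to each corresponding virtual camera."""
    return [compose(rig_pose, offset) for offset in camera_offsets]
```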
The VR display unit generally includes a VR display screen and two groups of imaging lenses. The binocular images from the 3D rendering unit are sent to the left and right halves of the VR display screen, each half being imaged by one lens group in front of the player's left or right eye. The images with binocular parallax thus reach the left and right eyes separately, and through the fusion ability of the brain a stereoscopic view of the virtual scene is formed in the player's mind.
The naked-eye 3D synthesis display unit includes an image synthesis unit and a naked-eye 3D display screen; the naked-eye 3D display screen consists of a 2D display panel and a grating. According to the physical structure of the naked-eye 3D display screen and the principle by which it realizes 3D, the image synthesis unit rearranges the multiple parallax images according to the physical characteristics of the screen. The relation determining which parallax image each sub-pixel position of the 2D panel should take is given by the following formula:
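(A reconstruction of this formula, consistent with the variable definitions below, with the worked example of Table 3 in the embodiment, and with the standard multi-view sub-pixel mapping, is:)

$$Q(k,l)=\operatorname{mod}\!\left(k+k_{\mathrm{off}}-3\,l\tan\alpha,\;X\right)\cdot\frac{K}{X}\qquad(1)$$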
where Q is the mapping value between each sub-pixel and the parallax images; k and l are the image coordinates of the sub-pixel, with the upper-left corner as origin; α is the angle of the grating relative to the vertical direction; X is the number of sub-pixels covered by one horizontal grating period; K is the number of viewpoints; koff is the sub-pixel offset of the left boundary of the nearest grating period from the upper-left corner of the 2D panel; and mod() is the remainder function. With this formula the multiple parallax images generated by the 3D rendering unit are synthesized stereoscopically, and the result is sent to the naked-eye 3D display screen. The naked-eye 3D display screen separates the light of the different parallax images in space and makes them converge at the optimum viewing distance, so that the parallax images are spatially separated; when the onlooker's left and right eyes each see a different parallax image, the fusion of the brain forms a stereoscopic view. The grating is either a slit (parallax-barrier) grating or a lenticular grating.
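A small sketch of how this mapping can be evaluated and used to interleave the parallax images (illustrative only; an RGB sub-pixel stripe layout and view numbering starting from 1, as in Table 3, are assumed):

```python
import numpy as np

def view_index(k, l, X, K, tan_alpha, k_off=0):
    """Which parallax image (1..K) sub-pixel column k, row l should take."""
    # floor of the mapping value; small floating-point error near integers is ignored
    return int(np.floor(((k + k_off - 3 * l * tan_alpha) % X) * K / X)) + 1

def interleave(views, X, K, tan_alpha, k_off=0):
    """views: list of K images of shape (H, W, 3); returns the synthesized frame."""
    h, w, _ = views[0].shape
    out = np.empty_like(views[0])
    for l in range(h):
        for k in range(3 * w):            # k runs over sub-pixels (3 per pixel, RGB order)
            v = view_index(k, l, X, K, tan_alpha, k_off)
            out[l, k // 3, k % 3] = views[v - 1][l, k // 3, k % 3]
    return out
```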
Another object of the present invention is to provide a naked-eye 3D virtual reality display method.
In the naked-eye 3D virtual reality display method of the present invention, according to the viewing object, the naked-eye 3D virtual reality display system is divided into a player part and an onlooker part:
(1) Player part:
1) the VR sensing unit collects the player's motion information in the green-screen space and transmits the monitored motion information to the 3D rendering unit;
2) the player's motion information triggers or controls the three-dimensional virtual scene in the 3D rendering unit and causes the 3D rendering unit to render the binocular image of the virtual scene seen from the player's viewpoint, which is then transmitted to the VR display unit;
3) the VR display unit delivers the binocular image with binocular parallax to the player's left and right eyes; through the fusion ability of the brain a stereoscopic view of the virtual scene is formed in the player's mind, giving the player an immersive virtual reality experience;
(2) Onlooker part:
1) the actual stereoscopic camera array is debugged in advance;
2) the onlooker controls the pose of the actual stereoscopic camera array, which photographs the player in the green-screen space, collecting several player images with green-screen background from different angles; these are sent, together with the pose of the actual stereoscopic camera array, to the 3D rendering unit;
3) the 3D rendering unit sets the parameters of the multi-view virtual cameras according to the parameters of the actual stereoscopic camera array, and controls the pose of the virtual cameras according to the pose of the actual array so that the two stay synchronized at every moment;
4) the virtual cameras render, in real time, the three-dimensional virtual scene triggered or controlled by the VR sensing unit, obtaining multiple virtual-scene rendered images;
5) the 3D rendering unit keys out the green-screen background of the player images and superimposes the keyed player images on the corresponding virtual-scene rendered images, forming multiple parallax images, which are transmitted to the naked-eye 3D synthesis display unit;
6) the naked-eye 3D synthesis display unit synthesizes the multiple parallax images and performs naked-eye 3D display; the onlooker obtains a third-person viewing experience through the display of the naked-eye 3D synthesis display unit.
In step (1) 1), the collection of the player's motion information in the green-screen space by the VR sensing unit includes: capturing the motion information of the player with a tracking and positioning device; the motion information includes the position and angle of the player's head as well as limb information, the limb information including the player's position, angle and touch information.
In step (1) 2), the 3D rendering unit renders the binocular image by the following steps:
A) the limb information of the player collected by the VR sensing unit triggers or controls the response of the virtual scene in the 3D rendering unit;
B) the 3D rendering unit sets the orientation of the virtual binocular camera according to the position and angle of the player's head, and the virtual binocular camera renders the virtual scene in real time, obtaining the binocular image of the virtual scene seen from the player's viewpoint;
C) the binocular image is sent to the VR display unit.
In step (2) 1), the preliminary debugging of the actual stereoscopic camera array comprises the following steps:
A) the hardware and software parameters of the actual cameras are all set to be consistent;
B) the actual cameras are arranged in a converging configuration, with the optical centres of all cameras on the same horizontal line and the optical axes in the same plane;
C) the spacing between adjacent actual cameras is adjusted to be equal;
D) the optical axes of the actual cameras are made to intersect at a single point in front of the array, called the convergence point.
In step (2) 2), the onlooker controlling the pose of the actual stereoscopic camera array and transmitting that pose to the 3D rendering unit includes: the onlooker controls the pose of the actual stereoscopic camera array in its six degrees of freedom in space, including position information and orientation information; the spatial positioning and tracking system locates and tracks this six-degree-of-freedom pose and then transmits it to the 3D rendering unit. The actual stereoscopic camera array is mounted on a camera support, so the relative positions and attitudes of the actual cameras remain unchanged; by controlling the pose of the actual stereoscopic camera array, the onlooker simultaneously controls the pose of the multi-view virtual cameras and thereby the viewing angle, achieving emotional and stereoscopic sharing with the player.
In step (2) 3), the number, hardware parameters and software parameters of the multi-view virtual cameras are all identical to those of the actual cameras, one virtual camera of identical parameters corresponding to each actual camera.
In step (2) 4), the virtual cameras render the virtual scene to obtain virtual-scene rendered images, and the data of each frame are saved into the respective render textures.
In step (2) 6), the naked-eye 3D synthesis display unit performs parallax-image synthesis according to formula (1), based on the physical structure of the naked-eye 3D display screen; formula (1) is the mapping relation between the parallax images and the sub-pixels of the 2D panel of the naked-eye 3D display screen, where Q is the mapping value between each sub-pixel and the parallax images, k and l are the image coordinates of the sub-pixel with the upper-left corner as origin, α is the angle of the grating relative to the vertical direction, X is the number of sub-pixels covered by one horizontal grating period, K is the number of viewpoints and equals the number of actual cameras, koff is the sub-pixel offset of the left boundary of the nearest grating period from the upper-left corner of the 2D panel, and mod() is the remainder function.
Advantages of the present invention:
The present invention uses the VR sensing unit to collect the player's motion information and to send triggering or manipulation instructions to the 3D rendering unit, which triggers or controls the three-dimensional virtual scene, renders the binocular image and sends it to the VR display unit, so that the player obtains an immersive virtual reality experience. At the same time, an actual stereoscopic camera array combined with green-screen keying places the player himself into the digitized virtual scene, and the player together with the virtual scene is finally presented as a naked-eye 3D display. Onlookers can therefore observe, from a third-person viewpoint and without any auxiliary equipment, the scene the player is experiencing, and can see that the player himself has been merged into the virtual scene; this makes it easy to understand why the player makes his various movements, so that onlookers can share the fun of the VR experience together with the player. Moreover, onlookers can manually adjust the viewing position and angle of the third-person viewpoint, changing the position and orientation of the virtual cameras in the virtual scene, and thus watch what happens in the virtual scene according to their own interest, or observe and analyse the whole virtual scene comprehensively.
Brief description of the drawings
Fig. 1 is a block diagram of the naked-eye 3D virtual reality display system of the present invention;
Fig. 2 is a top view of an embodiment of the actual stereoscopic camera array of the naked-eye 3D virtual reality display system of the present invention;
Fig. 3 is a schematic structural diagram of the naked-eye 3D virtual reality display system of the present invention.
Detailed description of the invention
The present invention is further explained below by means of specific embodiments, with reference to the accompanying drawings.
As shown in Figs. 1 and 3, the naked-eye 3D virtual reality display system of the present invention includes: a VR sensing unit 1, a multi-camera positioning and acquisition unit 2, a 3D rendering unit 3, a VR display unit 4, a naked-eye 3D synthesis display unit 5 and a green-screen space 6. The VR sensing unit 1, the multi-camera positioning and acquisition unit 2, the VR display unit 4 and the player P are all located inside the green-screen space 6; the VR sensing unit 1 and the multi-camera positioning and acquisition unit 2 are each connected to the 3D rendering unit 3; the 3D rendering unit 3 is connected to the VR display unit 4 and to the naked-eye 3D synthesis display unit 5. The VR display unit 4 images the pictures with binocular parallax in front of the eyes of the player P. The onlooker L obtains a third-person viewing experience through the display of the naked-eye 3D synthesis display unit 5.
1. VR sensing unit
In this embodiment, the VR sensing unit uses the HTC VIVE VR equipment. VIVE's three-dimensional tracking and positioning technology, known as Lighthouse indoor positioning, is a laser-scanning positioning technology that determines the position of a moving object by means of lasers and photosensors. Two laser emitters are placed at diagonal corners, forming a rectangular region of adjustable size with a maximum tracking area of 4.5 m × 4.5 m. Inside each emitter, laser light is emitted by two rows of fixed LEDs, six times per second, and two scanning units sweep the green-screen space with laser light alternately in the horizontal and vertical directions to locate it.
With this laser-scanning positioning technology, multiple photosensors are mounted on the player's head and on the hand-held controllers, and several laser emitters are arranged in the green-screen space. The scanning units in the emitters sweep the green-screen space with laser light in turn; the photosensors receive the laser and report to a computing unit, which distinguishes the different photosensors on the head and on the controllers, calculates the times at which each received the laser, and thereby determines the positions of the head and the controllers respectively.
The player wears a head-mounted display and holds the controllers; the head-mounted display and the controllers together carry more than 70 photosensors. By calculating the time at which the laser is received, the precise position of each sensor relative to the laser emitter is obtained, and from multiple photosensors the position and orientation of the head-mounted display can be determined. It should be noted that in the laser positioning technology used by the HTC Vive, during positioning the ID of each photosensor is passed to the computing unit together with its data; in other words, the computing unit can distinguish the different photosensors directly. From the fixed position of each photosensor on the head-mounted display or controller, together with other information, it finally reconstructs the three-dimensional models of the head-mounted display and the controllers, thereby obtaining the player's motion information.
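As a simplified illustration of this sweep-timing principle (not HTC's actual algorithm), the angle of a sensor relative to a base station can be recovered from the interval between the synchronisation flash and the moment the rotating laser plane sweeps over the sensor:

```python
import math

SWEEP_PERIOD_S = 1.0 / 60.0  # assumed rotor period of one laser sweep

def sweep_angle_rad(t_sync_s, t_hit_s):
    """Angle of the laser plane when it hit the sensor, measured from the sweep start."""
    return 2.0 * math.pi * ((t_hit_s - t_sync_s) / SWEEP_PERIOD_S)

# one horizontal and one vertical sweep give two angles per base station; with the
# known geometry of the sensors on the headset, the 6-DoF pose can then be solved
theta_h = sweep_angle_rad(t_sync_s=0.0, t_hit_s=0.004)
```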
2. Multi-camera positioning and acquisition unit
The actual stereoscopic camera array uses multiple industrial cameras.
The specifications of the industrial cameras and lenses in the actual stereoscopic camera array are as follows:
Table 1 Camera characteristics
Table 2 Lens characteristics
A five-camera actual stereoscopic camera array is shown in Fig. 2. According to the specifications of the naked-eye 3D display screen and the binocular fusion ability, the spacing of adjacent cameras is calculated to be 3.57 cm, the convergence (focusing) distance Z0 is 2 m, the nearest object distance ZA is 1.5 m, the CMOS sensor actually uses 1920 × 1080 pixels, and the vertical field of view of each actual camera is about 42°. Since the naked-eye 3D display screen in this embodiment has 8 viewpoints, 8 industrial cameras are selected to form the actual stereoscopic camera array, which is mounted on a backpack-style camera support for convenient follow shooting.
In this embodiment, the spatial positioning of the actual stereoscopic camera array uses infrared image tracking: a spatial positioning and tracking system set up above the green-screen space captures marker points on the actual stereoscopic camera array and derives its pose, i.e. spatial position and angle, in real time.
3. 3D rendering unit
The 3D rendering unit is responsible for triggering the response of the three-dimensional virtual scene according to the motion information collected by the VIVE, for setting the orientation of the virtual binocular camera in real time according to the head position and angle data, and for rendering the binocular image output to the VR display unit.
In addition, the 3D rendering unit sets the positions and angles of the eight virtual cameras according to the position and angle of the actual stereoscopic camera array; the virtual cameras use the same converging arrangement as the actual stereoscopic camera array, and their pose is kept consistent with that of the actual array at all times. The eight virtual cameras render the virtual scene, obtaining virtual-scene rendered images, and the data of each frame are saved into the respective render textures.
The player images with green-screen background collected by the eight actual cameras are tested pixel by pixel in YUV colour space; after the green has been removed, the result is superimposed on the virtual-scene rendered images to form eight parallax images, which are then sent to the naked-eye 3D synthesis display unit.
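A minimal sketch of such a per-pixel YUV green-screen test and superposition (the thresholds and the exact decision rule are illustrative assumptions; the text only states that the test is performed in YUV space):

```python
import numpy as np

def green_mask(rgb):
    """Return a boolean mask of green-screen pixels for an (H, W, 3) uint8 image."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # BT.601 RGB -> YUV conversion
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    # green screen: strongly negative U and V, reasonably bright (assumed thresholds)
    return (u < -20) & (v < -20) & (y > 40)

def composite(player_rgb, scene_rgb):
    """Superimpose the keyed player image onto the rendered virtual scene."""
    mask = green_mask(player_rgb)
    out = player_rgb.copy()
    out[mask] = scene_rgb[mask]   # green-screen pixels show the virtual scene instead
    return out
```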
4. VR display unit
The VR display unit is the display device mounted on the head-mounted display. Its per-eye resolution is up to 1200 × 1080 and its refresh rate is 90 FPS; the two groups of lenses are both Fresnel lenses, and the field of view is about 100° to 110°.
5. Naked-eye 3D synthesis display unit
The 2D display panel is a P3 three-in-one LED display with a resolution of 1920 × 1080 and a size of 5.76 m × 3.24 m; the optimum viewing distance is 10 m. A slit grating is used, with 8 viewpoints. The tangent of the grating's inclination angle is 1/3, and one grating period covers 8 sub-pixels. The mapping matrix between sub-pixels and viewpoints can therefore be calculated from formula (1) as follows:
Table 3 Sub-pixel/viewpoint mapping matrix (part)
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
| 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 |
| 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 |
| 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 |
| 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 1 | 2 | 3 |
From the 8 parallax images formed after superposition, grey values are picked according to the mapping relation of formula (1) (Table 3) and synthesized into a single 1920 × 1080 image, which is sent to the 2D display panel. Through the occlusion effect of the slit grating, the 8 parallax images are separated in space; when the onlooker's left and right eyes see two of these parallax images (for example the 1st and the 2nd), a stereoscopic view is formed in the brain.
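For reference, the first rows of Table 3 can be reproduced from the reconstructed formula (1) with the embodiment's parameters (koff assumed to be 0):

```python
X, K, tan_alpha = 8, 8, 1 / 3   # 8 sub-pixels per period, 8 viewpoints, tilt tan = 1/3
for l in range(3):
    row = [int(round(((k - 3 * l * tan_alpha) % X) * K / X)) + 1 for k in range(16)]
    print(row)   # matches the first three rows of Table 3, each shifted by one sub-pixel
```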
Finally, it should be noted that the purpose of publishing the embodiments is to help further understanding of the present invention, but those skilled in the art will appreciate that various substitutions and modifications are possible without departing from the spirit and scope of the invention and the appended claims. Therefore, the present invention should not be limited to what is disclosed in the embodiments; the scope of protection of the present invention is defined by the claims.
Claims (10)
1. A naked-eye 3D virtual reality display system, characterised in that the display system includes: a VR sensing unit, a multi-camera positioning and acquisition unit, a 3D rendering unit, a VR display unit, a naked-eye 3D synthesis display unit and a green-screen space; wherein the VR sensing unit, the multi-camera positioning and acquisition unit, the VR display unit and the player are all located inside the green-screen space; the VR sensing unit and the multi-camera positioning and acquisition unit are each connected to the 3D rendering unit; the 3D rendering unit is connected to the VR display unit and to the naked-eye 3D synthesis display unit; according to the viewing object, the naked-eye 3D virtual reality display system is divided into a player part and an onlooker part; in the player part, the VR sensing unit collects the player's motion information in the green-screen space and transmits it to the 3D rendering unit, where it triggers or controls the three-dimensional virtual scene and causes the 3D rendering unit to obtain the binocular image of the virtual scene seen from the player's viewpoint; the binocular image is transmitted to the VR display unit, which delivers the image pair with binocular parallax to the player's left and right eyes, so that the player obtains an immersive virtual reality experience; in the onlooker part, the actual stereoscopic camera array in the multi-camera positioning and acquisition unit photographs the player in the green-screen space, collecting several player images with green-screen background from different angles, and sends them, together with the pose of the actual stereoscopic camera array, to the 3D rendering unit; the 3D rendering unit sets the parameters of the multi-view virtual cameras according to the parameters of the actual stereoscopic camera array, and controls the pose of the virtual cameras according to the pose of the actual array so that the two stay synchronized at every moment; the virtual cameras render, in real time, the three-dimensional virtual scene triggered or controlled by the VR sensing unit, obtaining virtual-scene rendered images; the green-screen background of the player images is keyed out, and the keyed player images are superimposed on the virtual-scene rendered images to form multiple parallax images, which are transmitted to the naked-eye 3D synthesis display unit; the naked-eye 3D synthesis display unit synthesizes the multiple parallax images and performs naked-eye 3D display; the onlooker controls the pose of the actual stereoscopic camera array in the multi-camera positioning and acquisition unit and, through the display of the naked-eye 3D synthesis display unit, obtains a third-person viewing experience.
2. The display system of claim 1, characterised in that the green-screen space is the interior of a cuboid built from green screens, each face of the cuboid being covered by a green screen; the VR sensing unit, the player and the multi-camera positioning and acquisition unit can be placed inside the cuboid, and a personnel entrance is provided on one side of the cuboid.
3. The display system of claim 1, characterised in that the VR sensing unit uses three-dimensional tracking and positioning technology, the motion information of the player being captured by a tracking and positioning device; tracking and positioning technologies are divided into active and passive ones; the tracking device of the active technology has a transmitter and a receiver and can determine the player's motion information from the physical link between the emitted and received signals; the tracking device of the passive technology has no active signal source, and the receiver only measures changes in the received signal to determine the position and attitude of the tracked object; the tracking and positioning technology used is one of laser positioning, optical positioning, infrared active optical and visible-light active optical technology.
4. The display system of claim 1, characterised in that the multi-camera positioning and acquisition unit includes an actual stereoscopic camera array, a spatial positioning and tracking system and a data acquisition unit; the actual stereoscopic camera array is connected to the data acquisition unit, which collects the data of the array and transmits it to the 3D rendering unit; the spatial positioning and tracking system locates and tracks the six degrees of freedom of the actual stereoscopic camera array in space, including position information and orientation information.
5. The display system of claim 4, characterised in that the actual stereoscopic camera array includes multiple actual cameras arranged in a converging configuration, with the optical centres of all cameras on the same horizontal line, the optical axes in the same plane and the spacing between adjacent cameras equal; the optical axes of the actual cameras intersect at a single point in front of the array, called the convergence point; with an even number of actual cameras, the convergence point lies on the perpendicular bisector of the line joining the optical centres of the two middle cameras; with an odd number, it lies on the optical axis of the middle camera; the hardware and software parameters of the actual cameras are all kept consistent, the hardware parameters mainly including the acquisition chip, the camera circuitry and the lens specifications, and the software parameters including resolution, exposure time, white balance, colour correction, image cropping region and Bayer conversion type.
6. The display system of claim 4, characterised in that the spacing of adjacent actual cameras and the position of the convergence point are calculated from the specifications of the naked-eye 3D display screen and the binocular fusion ability, ensuring that the parallax of the photographed player on the naked-eye 3D display screen is within the fusion range of the human eye, and that, when the virtual cameras are set up consistently with the actual stereoscopic camera array, the parallax of the rendered virtual scene is also within the fusion range of the human eye.
7. The display system of claim 1, characterised in that the naked-eye 3D synthesis display unit includes an image synthesis unit and a naked-eye 3D display screen; the naked-eye 3D display screen includes a 2D display panel and a grating.
8. A naked-eye 3D virtual reality display method, wherein, according to the viewing object, the naked-eye 3D virtual reality display system is divided into a player part and an onlooker part:
(1) Player part:
1) the VR sensing unit collects the player's motion information in the green-screen space and transmits the monitored motion information to the 3D rendering unit;
2) the player's motion information triggers or controls the three-dimensional virtual scene in the 3D rendering unit and causes the 3D rendering unit to render the binocular image of the virtual scene seen from the player's viewpoint, which is then transmitted to the VR display unit;
3) the VR display unit delivers the binocular image with binocular parallax to the player's left and right eyes; through the fusion ability of the brain a stereoscopic view of the virtual scene is formed in the player's mind, giving the player an immersive virtual reality experience;
(2) Onlooker part:
1) the actual stereoscopic camera array is debugged in advance;
2) the onlooker controls the pose of the actual stereoscopic camera array, which photographs the player in the green-screen space, collecting several player images with green-screen background from different angles; these are sent, together with the pose of the actual stereoscopic camera array, to the 3D rendering unit;
3) the 3D rendering unit sets the parameters of the multi-view virtual cameras according to the parameters of the actual stereoscopic camera array, and controls the pose of the virtual cameras according to the pose of the actual array so that the two stay synchronized at every moment;
4) the virtual cameras render, in real time, the three-dimensional virtual scene triggered or controlled by the VR sensing unit, obtaining multiple virtual-scene rendered images;
5) the 3D rendering unit keys out the green-screen background of the player images and superimposes the keyed player images on the corresponding virtual-scene rendered images, forming multiple parallax images, which are transmitted to the naked-eye 3D synthesis display unit;
6) the naked-eye 3D synthesis display unit synthesizes the multiple parallax images and performs naked-eye 3D display; the onlooker obtains a third-person viewing experience through the display of the naked-eye 3D synthesis display unit.
9. The display method of claim 8, characterised in that, in step (1) 2), the 3D rendering unit renders the binocular image by the following steps:
A) the limb information of the player collected by the VR sensing unit triggers or controls the response of the virtual scene in the 3D rendering unit;
B) the 3D rendering unit sets the orientation of the virtual binocular camera according to the position and angle of the player's head, and the virtual binocular camera renders the virtual scene in real time, obtaining the binocular image of the virtual scene seen from the player's viewpoint;
C) the binocular image is sent to the VR display unit.
10. The display method of claim 8, characterised in that, in step (2) 1), the preliminary debugging of the actual stereoscopic camera array comprises the following steps:
A) the hardware and software parameters of the actual cameras are all set to be consistent;
B) the actual cameras are arranged in a converging configuration, with the optical centres of all cameras on the same horizontal line and the optical axes in the same plane;
C) the spacing between adjacent actual cameras is adjusted to be equal;
D) the optical axes of the actual cameras are made to intersect at a single point in front of the array, called the convergence point.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610739396.0A CN106131530B (en) | 2016-08-26 | 2016-08-26 | Naked-eye 3D virtual reality display system and display method thereof |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610739396.0A CN106131530B (en) | 2016-08-26 | 2016-08-26 | Naked-eye 3D virtual reality display system and display method thereof |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN106131530A true CN106131530A (en) | 2016-11-16 |
| CN106131530B CN106131530B (en) | 2017-10-31 |
Family
ID=57274811
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201610739396.0A Active CN106131530B (en) | 2016-08-26 | 2016-08-26 | Naked-eye 3D virtual reality display system and display method thereof |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN106131530B (en) |
Cited By (38)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106683501A (en) * | 2016-12-23 | 2017-05-17 | 武汉市马里欧网络有限公司 | AR children scene play projection teaching method and system |
| CN106708074A (en) * | 2016-12-06 | 2017-05-24 | 深圳市元征科技股份有限公司 | Method and device for controlling unmanned aerial vehicle based on VR glasses |
| CN107390173A (en) * | 2017-06-27 | 2017-11-24 | 成都虚拟世界科技有限公司 | A kind of position fixing handle suit and alignment system |
| CN107919085A (en) * | 2017-12-25 | 2018-04-17 | 河南新汉普影视技术有限公司 | A kind of intelligent virtual conference system |
| CN107948631A (en) * | 2017-12-25 | 2018-04-20 | 河南新汉普影视技术有限公司 | It is a kind of based on cluster and the bore hole 3D systems that render |
| CN107976811A (en) * | 2017-12-25 | 2018-05-01 | 河南新汉普影视技术有限公司 | A kind of simulation laboratory and its emulation mode based on virtual reality mixing |
| CN108063938A (en) * | 2017-12-21 | 2018-05-22 | 深圳市得色科技有限公司 | The bore hole 3D imaging display methods and its system of a kind of game engine |
| CN108124509A (en) * | 2017-12-08 | 2018-06-05 | 深圳前海达闼云端智能科技有限公司 | Image display method, wearable intelligent device and storage medium |
| CN108347415A (en) * | 2017-01-24 | 2018-07-31 | 上海乐相科技有限公司 | A kind of wireless communications method and equipment based on virtual reality system |
| CN108414978A (en) * | 2018-02-08 | 2018-08-17 | 北京理工大学 | A kind of expansible base station array, optical tracking system and its tracking |
| CN108459722A (en) * | 2018-06-04 | 2018-08-28 | 北京虚拟映画科技有限公司 | virtual viewing device and system |
| CN108564623A (en) * | 2018-04-17 | 2018-09-21 | 北京轻威科技有限责任公司 | An active optical positioning method, device and system |
| CN108600732A (en) * | 2018-04-11 | 2018-09-28 | 成都黑萤科技有限责任公司 | A kind of bore hole 3D rendering display systems based on entity photo frame |
| CN108614636A (en) * | 2016-12-21 | 2018-10-02 | 北京灵境世界科技有限公司 | A kind of 3D outdoor scenes VR production methods |
| CN108830943A (en) * | 2018-06-29 | 2018-11-16 | 歌尔科技有限公司 | A kind of image processing method and virtual reality device |
| CN109032350A (en) * | 2018-07-10 | 2018-12-18 | 深圳市创凯智能股份有限公司 | Spinning sensation mitigates method, virtual reality device and computer readable storage medium |
| CN109254406A (en) * | 2018-11-07 | 2019-01-22 | 深圳市传智科技有限公司 | A kind of multi-functional augmented reality glasses |
| WO2019037040A1 (en) * | 2017-08-24 | 2019-02-28 | 腾讯科技(深圳)有限公司 | Method for recording video on the basis of a virtual reality application, terminal device, and storage medium |
| CN109597484A (en) * | 2018-12-03 | 2019-04-09 | 山东浪潮商用系统有限公司 | A kind of self-service tax system and method based on VR virtual reality |
| CN109901713A (en) * | 2019-02-25 | 2019-06-18 | 山东大学 | Multi-person cooperative assembly system and method |
| CN110060349A (en) * | 2019-02-25 | 2019-07-26 | 叠境数字科技(上海)有限公司 | A method of extension augmented reality head-mounted display apparatus field angle |
| CN110620917A (en) * | 2019-10-22 | 2019-12-27 | 上海第二工业大学 | Virtual reality cross-screen stereoscopic display method |
| CN110650354A (en) * | 2019-10-12 | 2020-01-03 | 苏州大禹网络科技有限公司 | Live broadcast method, system, equipment and storage medium for virtual cartoon character |
| CN110689570A (en) * | 2019-09-29 | 2020-01-14 | 北京达佳互联信息技术有限公司 | Live virtual image broadcasting method and device, electronic equipment and storage medium |
| CN110753218A (en) * | 2019-08-21 | 2020-02-04 | 佳都新太科技股份有限公司 | Digital twinning system and method and computer equipment |
| CN111383313A (en) * | 2020-03-31 | 2020-07-07 | 歌尔股份有限公司 | Virtual model rendering method, device and equipment and readable storage medium |
| EP3547672A4 (en) * | 2016-11-28 | 2020-07-08 | ZTE Corporation | Data processing method, device, and apparatus |
| CN111586394A (en) * | 2020-03-25 | 2020-08-25 | 万象三维视觉科技(北京)有限公司 | Three-dimensional interactive display system and display method thereof |
| CN112669469A (en) * | 2021-01-08 | 2021-04-16 | 国网山东省电力公司枣庄供电公司 | Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera |
| CN113012270A (en) * | 2021-03-24 | 2021-06-22 | 纵深视觉科技(南京)有限责任公司 | Stereoscopic display method and device, electronic equipment and storage medium |
| CN113192373A (en) * | 2021-03-18 | 2021-07-30 | 徐州九鼎机电总厂 | Periscope simulation imaging method based on immersive human-computer interaction simulation system |
| CN113347407A (en) * | 2021-05-21 | 2021-09-03 | 华中科技大学 | Medical image display system based on naked eye 3D |
| CN113449027A (en) * | 2021-06-23 | 2021-09-28 | 上海国际汽车城(集团)有限公司 | Three-dimensional visual display method and device for dynamic information of urban intersection |
| CN113821107A (en) * | 2021-11-23 | 2021-12-21 | 成都索贝数码科技股份有限公司 | Indoor and outdoor naked eye 3D system with real-time and free viewpoint |
| CN115486836A (en) * | 2022-09-20 | 2022-12-20 | 浙江大学 | Visual motion perception detection method and device based on VR equipment |
| CN115623185A (en) * | 2021-07-12 | 2023-01-17 | 丰田自动车株式会社 | Virtual reality simulator and computer-readable recording medium |
| CN117176936A (en) * | 2023-09-22 | 2023-12-05 | 南京大学 | A freely expandable three-dimensional digital sandbox system and light field rendering method |
| CN119090677A (en) * | 2024-08-05 | 2024-12-06 | 盐城锦源电气科技有限公司 | A virtual reality somatosensory tourism method and system based on naked eye 3D |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1655852A (en) * | 2002-05-24 | 2005-08-17 | 皇家飞利浦电子股份有限公司 | On-line gaming spectator |
| US20060258446A1 (en) * | 2002-03-29 | 2006-11-16 | Igt | Simulating real gaming environments with interactive host and players |
| US20100137052A1 (en) * | 2004-08-11 | 2010-06-03 | Aristocrat Technologies Australia Pty Limited | Tournament gaming system |
| CN101732858A (en) * | 2008-11-11 | 2010-06-16 | 盛乐信息技术(上海)有限公司 | Real and virtual combined networking game system and realizing method thereof |
-
2016
- 2016-08-26 CN CN201610739396.0A patent/CN106131530B/en active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060258446A1 (en) * | 2002-03-29 | 2006-11-16 | Igt | Simulating real gaming environments with interactive host and players |
| CN1655852A (en) * | 2002-05-24 | 2005-08-17 | 皇家飞利浦电子股份有限公司 | On-line gaming spectator |
| US20100137052A1 (en) * | 2004-08-11 | 2010-06-03 | Aristocrat Technologies Australia Pty Limited | Tournament gaming system |
| CN101732858A (en) * | 2008-11-11 | 2010-06-16 | 盛乐信息技术(上海)有限公司 | Real and virtual combined networking game system and realizing method thereof |
Cited By (52)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3547672A4 (en) * | 2016-11-28 | 2020-07-08 | ZTE Corporation | Data processing method, device, and apparatus |
| CN106708074A (en) * | 2016-12-06 | 2017-05-24 | 深圳市元征科技股份有限公司 | Method and device for controlling unmanned aerial vehicle based on VR glasses |
| CN108614636A (en) * | 2016-12-21 | 2018-10-02 | 北京灵境世界科技有限公司 | A kind of 3D outdoor scenes VR production methods |
| CN106683501A (en) * | 2016-12-23 | 2017-05-17 | 武汉市马里欧网络有限公司 | AR children scene play projection teaching method and system |
| CN106683501B (en) * | 2016-12-23 | 2019-05-14 | 武汉市马里欧网络有限公司 | A kind of AR children scene plays the part of projection teaching's method and system |
| CN108347415A (en) * | 2017-01-24 | 2018-07-31 | 上海乐相科技有限公司 | A kind of wireless communications method and equipment based on virtual reality system |
| CN107390173A (en) * | 2017-06-27 | 2017-11-24 | 成都虚拟世界科技有限公司 | A kind of position fixing handle suit and alignment system |
| CN107390173B (en) * | 2017-06-27 | 2020-07-17 | 成都理想智美科技有限公司 | Positioning system |
| US11000766B2 (en) | 2017-08-24 | 2021-05-11 | Tencent Technology (Shenzhen) Company Limited | Video recording method based on virtual reality application, terminal device, and storage medium |
| CN109952757B (en) * | 2017-08-24 | 2020-06-05 | 腾讯科技(深圳)有限公司 | Method for recording video based on virtual reality application, terminal equipment and storage medium |
| WO2019037040A1 (en) * | 2017-08-24 | 2019-02-28 | 腾讯科技(深圳)有限公司 | Method for recording video on the basis of a virtual reality application, terminal device, and storage medium |
| CN109952757A (en) * | 2017-08-24 | 2019-06-28 | 腾讯科技(深圳)有限公司 | Method for recording video based on virtual reality application, terminal device and storage medium |
| CN108124509A (en) * | 2017-12-08 | 2018-06-05 | 深圳前海达闼云端智能科技有限公司 | Image display method, wearable intelligent device and storage medium |
| CN108063938A (en) * | 2017-12-21 | 2018-05-22 | 深圳市得色科技有限公司 | The bore hole 3D imaging display methods and its system of a kind of game engine |
| CN107976811A (en) * | 2017-12-25 | 2018-05-01 | 河南新汉普影视技术有限公司 | A kind of simulation laboratory and its emulation mode based on virtual reality mixing |
| CN107948631A (en) * | 2017-12-25 | 2018-04-20 | 河南新汉普影视技术有限公司 | A kind of bore hole 3D system based on cluster rendering |
| CN107976811B (en) * | 2017-12-25 | 2023-12-29 | 河南诺控信息技术有限公司 | Virtual reality mixing-based simulation laboratory and its simulation method |
| CN107919085A (en) * | 2017-12-25 | 2018-04-17 | 河南新汉普影视技术有限公司 | A kind of intelligent virtual conference system |
| CN108414978B (en) * | 2018-02-08 | 2020-08-11 | 北京理工大学 | Extensible base station array, optical tracking system and tracking method thereof |
| CN108414978A (en) * | 2018-02-08 | 2018-08-17 | 北京理工大学 | A kind of expansible base station array, optical tracking system and tracking method thereof |
| CN108600732A (en) * | 2018-04-11 | 2018-09-28 | 成都黑萤科技有限责任公司 | A kind of bore hole 3D rendering display systems based on entity photo frame |
| CN108600732B (en) * | 2018-04-11 | 2020-07-17 | 成都黑萤科技有限责任公司 | Naked eye 3D image display system based on solid photo frame |
| CN108564623A (en) * | 2018-04-17 | 2018-09-21 | 北京轻威科技有限责任公司 | An active optical positioning method, device and system |
| CN108459722A (en) * | 2018-06-04 | 2018-08-28 | 北京虚拟映画科技有限公司 | Virtual viewing device and system |
| CN108459722B (en) * | 2018-06-04 | 2021-09-07 | 北京虚拟映画科技有限公司 | Virtual film watching device and system |
| CN108830943B (en) * | 2018-06-29 | 2022-05-31 | 歌尔光学科技有限公司 | Image processing method and virtual reality device |
| CN108830943A (en) * | 2018-06-29 | 2018-11-16 | 歌尔科技有限公司 | A kind of image processing method and virtual reality device |
| CN109032350B (en) * | 2018-07-10 | 2021-06-29 | 深圳市创凯智能股份有限公司 | Vertigo sensation alleviating method, virtual reality device, and computer-readable storage medium |
| CN109032350A (en) * | 2018-07-10 | 2018-12-18 | 深圳市创凯智能股份有限公司 | Spinning sensation mitigating method, virtual reality device and computer-readable storage medium |
| CN109254406A (en) * | 2018-11-07 | 2019-01-22 | 深圳市传智科技有限公司 | A kind of multi-functional augmented reality glasses |
| CN109597484A (en) * | 2018-12-03 | 2019-04-09 | 山东浪潮商用系统有限公司 | A kind of self-service tax system and method based on VR virtual reality |
| CN110060349A (en) * | 2019-02-25 | 2019-07-26 | 叠境数字科技(上海)有限公司 | A method of extension augmented reality head-mounted display apparatus field angle |
| CN109901713A (en) * | 2019-02-25 | 2019-06-18 | 山东大学 | Multi-person cooperative assembly system and method |
| CN109901713B (en) * | 2019-02-25 | 2020-07-17 | 山东大学 | Multi-person cooperative assembly system and method |
| CN110753218A (en) * | 2019-08-21 | 2020-02-04 | 佳都新太科技股份有限公司 | Digital twinning system and method and computer equipment |
| CN110689570A (en) * | 2019-09-29 | 2020-01-14 | 北京达佳互联信息技术有限公司 | Live virtual image broadcasting method and device, electronic equipment and storage medium |
| CN110650354A (en) * | 2019-10-12 | 2020-01-03 | 苏州大禹网络科技有限公司 | Live broadcast method, system, equipment and storage medium for virtual cartoon character |
| CN110620917A (en) * | 2019-10-22 | 2019-12-27 | 上海第二工业大学 | Virtual reality cross-screen stereoscopic display method |
| CN111586394A (en) * | 2020-03-25 | 2020-08-25 | 万象三维视觉科技(北京)有限公司 | Three-dimensional interactive display system and display method thereof |
| CN111383313A (en) * | 2020-03-31 | 2020-07-07 | 歌尔股份有限公司 | Virtual model rendering method, device and equipment and readable storage medium |
| CN112669469A (en) * | 2021-01-08 | 2021-04-16 | 国网山东省电力公司枣庄供电公司 | Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera |
| CN112669469B (en) * | 2021-01-08 | 2023-10-13 | 国网山东省电力公司枣庄供电公司 | Power plant virtual roaming system and method based on drones and panoramic cameras |
| CN113192373A (en) * | 2021-03-18 | 2021-07-30 | 徐州九鼎机电总厂 | Periscope simulation imaging method based on immersive human-computer interaction simulation system |
| CN113012270A (en) * | 2021-03-24 | 2021-06-22 | 纵深视觉科技(南京)有限责任公司 | Stereoscopic display method and device, electronic equipment and storage medium |
| CN113347407A (en) * | 2021-05-21 | 2021-09-03 | 华中科技大学 | Medical image display system based on naked eye 3D |
| CN113449027A (en) * | 2021-06-23 | 2021-09-28 | 上海国际汽车城(集团)有限公司 | Three-dimensional visual display method and device for dynamic information of urban intersection |
| CN115623185A (en) * | 2021-07-12 | 2023-01-17 | 丰田自动车株式会社 | Virtual reality simulator and computer-readable recording medium |
| CN113821107B (en) * | 2021-11-23 | 2022-03-04 | 成都索贝数码科技股份有限公司 | Indoor and outdoor naked eye 3D system with real-time and free viewpoint |
| CN113821107A (en) * | 2021-11-23 | 2021-12-21 | 成都索贝数码科技股份有限公司 | Indoor and outdoor naked eye 3D system with real-time and free viewpoint |
| CN115486836A (en) * | 2022-09-20 | 2022-12-20 | 浙江大学 | Visual motion perception detection method and device based on VR equipment |
| CN117176936A (en) * | 2023-09-22 | 2023-12-05 | 南京大学 | A freely expandable three-dimensional digital sandbox system and light field rendering method |
| CN119090677A (en) * | 2024-08-05 | 2024-12-06 | 盐城锦源电气科技有限公司 | A virtual reality somatosensory tourism method and system based on naked eye 3D |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106131530B (en) | 2017-10-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106131530B (en) | A kind of bore hole 3D virtual reality display system and its methods of exhibiting | |
| CN205987196U (en) | Bore hole 3D virtual reality display system | |
| CN106131536A (en) | A kind of bore hole 3D augmented reality interactive exhibition system and methods of exhibiting thereof | |
| US12169276B2 (en) | Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking | |
| CN101072366B (en) | Free stereo display system based on light field and binocular vision technology | |
| CN205901977U (en) | Interactive display system of bore hole 3D augmented reality | |
| CN106681512B (en) | A kind of virtual reality device and corresponding display methods | |
| US20160371884A1 (en) | Complementary augmented reality | |
| CN108513123B (en) | Image array generation method for integrated imaging light field display | |
| CN107333121A (en) | The immersion solid of moving view point renders optical projection system and its method on curve screens | |
| CN106843456A (en) | A kind of display methods, device and virtual reality device followed the trail of based on attitude | |
| CN106444023A (en) | Super-large field angle binocular stereoscopic display transmission type augmented reality system | |
| CN116309854B (en) | Method, device, equipment, system and storage medium for calibrating augmented reality equipment | |
| CN105894567A (en) | Scaling pixel depth values of user-controlled virtual object in three-dimensional scene | |
| JPWO2019198784A1 (en) | Light field image generation system, image display system, shape information acquisition server, image generation server, display device, light field image generation method and image display method | |
| CN102929091A (en) | Method for manufacturing digital spherical curtain three-dimensional film | |
| CN107545537A (en) | A kind of method from dense point cloud generation 3D panoramic pictures | |
| JP6682624B2 (en) | Image processing device | |
| CN107948631A (en) | It is a kind of based on cluster and the bore hole 3D systems that render | |
| CN115767068A (en) | Information processing method and device and electronic equipment | |
| WO2017191703A1 (en) | Image processing device | |
| JP6775669B2 (en) | Information processing device | |
| CN101908233A (en) | Method and system for producing plural viewpoint picture for three-dimensional image reconstruction | |
| CN105893452A (en) | Method and device for presenting multimedia information | |
| CN108537835A (en) | Holographic image generation and display method and equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |
| | GR01 | Patent grant | |