WO2023152403A1 - Three-dimensional visualization device and method - Google Patents
Three-dimensional visualization device and method
- Publication number
- WO2023152403A1 (PCT/EP2023/053675; EP2023053675W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focusing
- image
- optical
- distance
- generation unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/52—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/56—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0127—Head-up displays characterised by optical features comprising devices increasing the depth of field
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0185—Displaying image at variable distance
Definitions
- the present invention relates to the field of visualization of graphic elements onto one or more focal planes, and in particular relates to a three-dimensional visualization device and method.
- a known three-dimensional visualization device, for example the so-called Heads-Up Display (HUD) in augmented reality, for example in the automotive and aircraft fields, comprises electronic control means, an image generation unit, optical elements such as lenses, mirrors and beam splitters for enlarging and focusing the image, and a semi-reflective screen for viewing the image.
- the graphic elements constituting an image to be visualized and formed by the image generation unit follow optical paths through the optical elements and are projected onto the semi-reflective screen at respective variable focal distances, so that, in correspondence with an operating condition of the device, a user sees these graphic elements distributed on different focal planes, i.e. in three dimensions.
- variable magnification projection optical system wherein the display distance of a virtual image is changeable at an arbitrary position within a screen.
- the variable magnification projection optical system has a first projection optical system, which forms a first intermediate image from an image of a display element, and a second projection optical system which reflects image light from the first intermediate image by a combiner that is a display screen, thereby displaying a virtual image through the display screen.
- the second projection optical system has a variable focus optical system in which the display position of the virtual image is variable.
- Document DE112017006073T5 discloses an imaging system for generating virtual images with multiple depths on a screen.
- the imaging system comprises an image realization device for forming a source image, and projection optics for rendering a display image on the screen.
- the display image is a virtual image corresponding to the source image.
- the image realization device comprises an image realization surface and a light structuring device having a surface with a first and second region, wherein the light structuring device is configured to simulate a first lens with a first focal length on the first region of the surface.
- the surface and the image realization surface are arranged so that a first source image formed on a first area of the image realization surface and projected by the projection optics creates a first display image on the screen at a first apparent depth.
- the light structuring device is further configured to simulate a second lens on the second area of the surface, the second lens having a second focal length.
- the surface and the image realization surface are arranged such that a second source image, formed on a second area of the image realization surface and projected by the projection optics, creates a second display image on the screen at a second apparent depth.
- the first and second lenses are independently configurable.
- Document US10353212B2 discloses a see-through type display apparatus and a method of operating the see-through type display apparatus.
- the see-through type display apparatus simultaneously outputs a plurality of two-dimensional (2D) images having different depth information onto different regions, produces a multi-layered depth image by sequentially arranging the plurality of 2D images according to the depth information, and causes the multi-layered depth image and a reality environment to converge onto one region.
- Document DE3532120A1 discloses a windscreen for motor vehicles with a reflecting device for making optical information or signals visible in the field of vision of the windscreen.
- the reflecting device is a hologram with the property of a mirror.
- the reflection angle of the holographic mirror is different from the reflection angle of the glass surface.
- the rays reflected by the glass surface do not reach the eye of the viewer, but only the rays reflected by the holographic mirror.
- in the case of a curved windshield, a hologram with the function of a plane mirror is on a carrier layer that is curved to match the shape of the windshield, so that the reflected image of the optical signals or information is free of optical aberrations even when the windshield is curved.
- a disadvantage of the known three-dimensional visualization devices consists in that focusing is achieved through the mechanical adjustment of the position of the lenses, making it relatively slow, affected by uncertainties deriving from mechanical tolerances, and sensitive to the vibrations of the environment in which the device is installed.
- Another drawback of the known devices consists in that the number of focal planes on which a graphic element can be focused is limited, due to the mechanical adjustment of the focusing.
- a further disadvantage of the known devices consists in that different graphic elements can be displayed simultaneously only on the same focal plane, or on a limited number of focal planes at predetermined and fixed focal distances.
- An object of the present invention is to propose a three-dimensional visualization device in which the graphic elements of each image can be simultaneously and finely displayed and focused each on a respective focal plane placed at a respective focusing distance.
- Another object is to propose a device in which the projected images are always aligned with the direction of the user's gaze.
- figure 1 illustrates a schematic view of the three-dimensional visualization device object of the present invention, in which some parts have been removed to better highlight others
- figure 2 illustrates an enlarged schematic view of a projection group of the device of figure 1 comprising an optical block
- figure 3 illustrates an enlarged schematic view of the optical block of figure 2
- figures 4a, 4b respectively illustrate an ambient scene and an image visualized by the device of figure 1 as seen by a user
- figure 5 illustrates a schematic side view of the graphic elements of the image of figure 4b on their respective focal planes as seen by a user
- figure 6 illustrates a schematic perspective view of the graphic elements of figure 5, in which some parts have been removed to better highlight others
- figure 7 illustrates an expanded image derived from the image of figure 4b
- figures 8a, 8b illustrate respective frames of the expanded image of figure 7
- figure 9 illustrates the ambient scene of figure 4a onto which the visualized image of figure 4b is superimposed
- figure 10a illustrates a general representation of a time diagram of the projection of frames for visualizing an image
- numeral 1 indicates the three-dimensional visualization device object of the present invention.
- the three-dimensional visualization device 1 essentially comprises electronic control means 24, a screen element 40, an image generation unit 26, and two sets of optical elements 30 defining two respective and distinct optical paths P between the image generation unit 26 and the screen element 40.
- An image 50 comprises a finite number of graphic elements or symbols SI-N, each of which is assigned to be focused on focal planes FI-N placed at respective predetermined focusing distances DI-N, equal or different, measured starting from a reference such as for example the screen element 40, the sets of optical elements 30 or a single element of the latter.
- the electronic control means 24 consist of microprocessors and other digital and analog electric and electronic circuits, and are connected to the active components of the device 1 and to circuits which are external to the device itself, as described below.
- the image generation unit 26 preferably consists of a projector means, for example of the DLP or TFT type, and is capable of projecting luminous images at a predetermined resolution, i.e. composed of a predetermined number of pixels.
- each set of optical elements 30 comprises a focusing means 32 and a diffuser element 33 interposed between the focusing means 32 and the image generation unit 26.
- the diffuser element 33 is an almost planar optical component capable of revealing and showing, on a surface thereof facing the focusing means 32, a respective portion of a luminous image projected by the image generation unit 26.
- This portion of luminous image, constituting the object of the projection is preferably focused on the diffuser element 33 thanks to an optical means 27, such as a lens or a group of lenses, interposed between the image generation unit 26 and the diffuser element 33 along its respective optical path P.
- the distance of the diffuser element 33 from the focusing means 32 therefore fixes the so-called object distance which, in combination with the variable focal distance of the focusing means 32, determines the focusing distance DI-N of said image portion.
- the actual frame is established, i.e. the object, which will be projected through its respective set of optical elements 30 and displayed on the screen element 40 focused at its respective focusing distance DI-N.
- the diffuser element 33 thus establishes a reference or offset for calculating and adjusting the focal distance of each focusing means 32 and of each set of optical elements 30.
- each focusing means 32 is a varifocal lens with piezoelectric adjustment of the focal distance, or of a similar lens type with electrical or electronic adjustment, whose actuation by the electronic control means 24, usually via digital signals, allows its focal distance to be adjusted quickly and finely.
- the luminous image shown by the diffuser element 33 on one side of the focusing means 32 is focused by the latter on its opposite side, at the desired focusing distance DI-N.
- each focusing means 32 can be adjusted between a respective minimum focal distance and a respective maximum focal distance, the value of which depends on geometrical and constructive factors of the focusing means 32 itself.
- Each focusing means 32 is assigned to focus one or more graphic elements SI-N at their respective focusing distances DI-N comprised between said minimum and maximum focal distances.
- the optical paths P each passing through its respective diffuser element 33 and through its respective focusing means 32, preferably converge into a combiner element 35 consisting of a beam splitter, or an optical prism, or the like.
- the two sets of optical elements 30 are side by side and their optical axes are aligned along parallel directions.
- the optical axis of one set of optical elements 30 is aligned normally with the center of a front face 35f of the combiner element 35.
- the optical axis of the remaining set of optical elements 30 points to a reflecting element 37 fixed in front of a side face 351 of the combiner element 35, with the normal of the reflecting element 37 oriented along the bisector of the angle between said optical axis and the normal to said side face 351.
- a mirror means 34 is associated with the combiner element 35 to a side face thereof opposite to its side face 351 which faces the reflecting element 37.
- the mirror means 34 is for example of the so-called blocker type.
- Magnifying elements 36 constituted by a set of coaxial lenses aligned normally with the center of the rear face 35r of the combiner element 35, are preferably placed downstream of the combiner element 35 itself between this and the screen element 40.
- the screen element 40 consists of a plate of glass or other transparent material with two layers, between which a holographic optical element, so-called HOE (Holographic Optical Element), is preferably inserted, in the form of a film or thin layer.
- a reflecting means 42 is interposed between the optical elements 30 and the screen element 40, preferably between the magnifying elements 36 and the screen element 40, or alternatively between the combiner element 35 and the magnifying elements 36.
- the reflecting means 42 deviates the optical paths P between the combiner element 35 and the screen element 40, so that an optical path P of a given length thus diverted is contained inside a smaller volume than the one it would occupy if it were not diverted, making the device 1 more compact.
- the device 1 can further comprise a focusing optical means 28 between the combiner element 35 and the screen element 40, or more generally between the sets of optical elements 30 and the screen element 40, for example placed before the reflecting means 42 between the latter and the magnifying elements 36.
- optical elements 30, the combiner element 35, the mirror means 34, the reflective element 37 and the magnifying elements 36 are part of an optical block 300.
- Said optical block 300, the electronic control means 24, the image generation unit 26 and the optical means 27, 28 constitute a so-called projection group 100.
- the preferred operation of the three-dimensional visualization device 1 provides that it is installed in or on the dashboard of a motor vehicle or operating vehicle close to the windshield, with the screen element 40 coinciding with the windshield itself, as shown in figure 5 by way of example.
- said screen element 40 can be a glass plate independently fixed to the dashboard and separated from said windshield.
- the electronic control means 24 are connected to an advanced user assistance system, so-called ADAS (Advanced Driver-Assistance System), external to the device 1, from which they almost continuously receive information on the state of the vehicle, for example speed, position, orientation, information about the route to be followed, information about the vehicle's surroundings, for example about traffic signs or the presence and position of other vehicles and obstacles, and other data that may be relevant for a user G of the vehicle and of the device 1, who is the driver of said vehicle.
- the electronic control means 24 on the basis of predetermined criteria configurable from the outside by the user G through control and command means, known and not illustrated, digitally process a set of graphic elements SI-N to form an image 50, with a predetermined width/height ratio, assigned to be seen by the user G on the screen element 40 superimposed onto the ambient scene A observed by said user G in front of the vehicle.
- the electronic control means 24 produce a new image 50 with a frequency which is fixed or depends on time or other factors, such as the speed of the vehicle.
- Each of the graphic elements SI-N having predetermined colors, shape, position and meaning, carries a digital attribute which indicates its desired focusing distance DI-N.
- the electronic control means 24 distribute the graphic elements SI-N of the image 50 in an expanded image 51 comprising two adjacent portions near 52n and far 52f, each of which has the same width/height ratio as the image 50.
- the near portion 52n contains the graphic elements SI-N assigned to be focused on focal planes FI-N at focusing distances DI-N comprised between a minimum focusing distance Dn and an intermediate focusing distance Dm.
- the far portion 52f contains the graphic elements SI-N assigned to be focused between the intermediate focusing distance Dm and a maximum focusing distance Df.
- the minimum focusing distance Dn and the intermediate focusing distance Dm correspond respectively to the minimum focal distance and the maximum focal distance of one of the focusing means 32
- the intermediate focusing distance Dm and the maximum focusing distance Df correspond respectively to the minimum focal distance and the maximum focal distance of the remaining focusing means 32.
- the electronic control means 24 therefore produce a series or sequence of frames FFI-M deriving from the expanded image 51.
- Each of the frames FFI-M comprises two or more portions, preferably two adjacent portions 53n, 53f, of dimensions identical to those of the expanded image 51.
- Each portion 53n, 53f of one of the frames FFI-M contains only graphic elements SI-N having identical focusing distance DI-N.
- the number NM of the frames FFI-M produced starting from the image 50, i.e. from the expanded image 51, is necessarily greater than or equal to a lower limit number equal to the greatest of: a number of near focusing distances Nn, equal to the number of focusing distances DI-N comprised between the minimum focusing distance Dn and the intermediate focusing distance Dm at which at least one of the graphic elements SI-N of the image 50 is assigned to be focused; a number of far focusing distances Nf, equal to the number of focusing distances DI-N between the intermediate focusing distance Dm and the maximum focusing distance Df at which at least one of the graphic elements SI-N of the image 50 is assigned to be focused.
- the number NM of the frames FFI-M is preferably less than or equal to the least common multiple of said numbers of near and far focusing distances Nn, Nf.
- the sequence of frames for the image 50 comprises frames FFI-M preferably different from one another.
- the higher the number NM of the frames FFI-M, the higher their rate Cfps and the shorter the visualization time Tv of each individual frame.
- a new display cycle of the same sequence of frames begins, preferably visualized according to the order followed during the previous cycle.
- the electronic control means 24 electronically transmit said frame to the image generation unit 26, and simultaneously operate the two focusing means 32 to adjust the focal distance of each of these, setting it to the value indicated by the attributes of the graphic elements SI-N contained in each respective portion 53n, 53f of the frame.
- the image generation unit 26 generates the luminous graphic representation of each portion 53n, 53f of the frame and projects it along the respective optical path P thereof through the focusing means 32 operated by the electronic control means 24.
- the portion 53n, deriving from the near portion 52n of the expanded image 51, is projected through the focusing means 32 adjusted between the minimum focusing distance Dn and the intermediate focusing distance Dm;
- the portion 53f, deriving from the far portion 52f of the expanded image 51, is projected through the focusing means 32 adjusted between the intermediate focusing distance Dm and the maximum focusing distance Df.
- the two portions 53n, 53f of the frame, having identical proportions, are combined together in the combiner element 35 to form a combined frame of proportions equal to those of the image 50.
- the magnifying elements 36 have the function of enlarging the combined frame and transforming it, so that its projection onto the screen element 40, which has a predetermined possibly concave shape, appears to have the correct proportions of the image 50 from the point of view of the user G.
- a display cycle of the sequence of frames FFI-M has a duration of Tv x NM seconds, and the same frame is projected onto the screen element 40 every Tv x NM seconds. Therefore, every Tv x NM seconds, i.e. every NM / Cfps seconds, the user G observes the entire sequence of frames FFI-M containing all the graphic elements SI-N constituting the image 50.
- the user G does not distinguish the sequentiality and alternation with which the graphic elements SI-N are visualized on the screen element 40, but he sees on the screen element 40 a single almost static image 50 in which all the graphic elements SI-N are present almost continuously and simultaneously, each one focused at its respective focusing distance DI-N. In this way, the user G has the impression that the image 50 is three-dimensional, with each of the graphic elements SI-N superimposed on the portion of the ambient scene A to which the meaning of the graphic element itself refers.
- the holographic optical element, associated with the screen element 40, mostly reflects the wavelengths present in the image 50, so that the three-dimensional image 50 appears to the user G to be sufficiently bright with respect to the ambient scene A on which it is superimposed, so that the user G can easily observe the graphic elements SI-N.
- a variant of the device 1 provides that the number of sets of optical elements 30 and of their respective optical paths P is greater than two, for example four. Consequently, the expanded image 51 comprises four portions, each containing the graphic elements SI-N assigned to be focused at focusing distances DI-N within a respective range, wherein these ranges are preferably separated, but alternatively also partially overlapping. All the optical paths P converge into an output combiner element 35 through a tree hierarchy of combiner elements 35 which combine the optical paths P two by two. In this way the focusing of the graphic elements SI-N is entrusted to and distributed among four focusing means 32, the workload of each one being on average lower than in the variants of the device 1 with a smaller number of focusing means 32. The price for this advantage results in a reduction in the resolution of the images displayed, since the resolution of the image generation unit 26 is divided by the number of portions of the expanded image 51.
- the holographic optical element is associated with an element of the device 1 itself other than the screen element 40, for example with the reflecting means 42 located before the latter.
- a second embodiment of the device 1 provides that this also includes eye monitoring means 150 and pointing means 200 of the optical paths P, connected to the electronic control means 24.
- the eye monitoring means 150 are assigned to detect the position of the eyes and/or the gaze direction W of the user G of the device 1, to trace the movements of the eyes and gaze of the user G.
- Said eye monitoring means 150 comprise an optical detector 151, which is preferably a camera framing the user's eyes, and a lighting element 152 such as a LED source or another luminous means, operating in the visible and/or infrared spectrum depending on the detection spectrum of the associated optical detector 151.
- the lighting element 152 illuminates the user's eyes even in low ambient light conditions, so that the optical detector 151 can detect and track the pupils of the user G.
- the data produced by the eye monitoring means 150 are employed in an algorithm for eye tracking running in the electronic control means 24 or in a dedicated processor of the eye tracking means 150 themselves to determine at any instant in which direction the user G is looking.
- a variant of the eye monitoring means 150 provides that the optical detector 151 is of another type, for example a photodiode.
- multiple optical detectors 151 can frame and observe the eyes of the user G from different directions for faster and/or more accurate detection.
- the pointing means 200 are for example so-called f-theta and/or telecentric optical systems.
- the pointing means 200 comprise one or more mirrors 201 interposed between the sets of optical elements 30 and the screen element 40 and orientable, i.e. rotatable and/or translatable, by means of respective actuator elements 202 along at least one direction.
- the pointing means 200 comprise a mirror 201 connected to an actuator element 202 which allows it to be tilted around two orthogonal axes.
- the pointing means 200 are controlled by the electronic control means 24 to adjust the display position of the images 50 on the screen element 40 according to the position of the eyes and/or the gaze direction W of the user G.
- the area occupied by each image 50 visualized on the screen element 40 can be less than the total area of the screen element 40 itself, as illustrated in figures 11 to 13 by way of example. Therefore, each image 50 can be projected in a respective display position covering only a certain portion and/or centered at a certain point of the screen element 40.
- the pointing means 200 allow for varying said display position for each image according to the gaze direction W of the user G, continuously aligning the projected images 50 with the gaze direction W of the user G.
- a variant of the device 1 provides that the mirror 201 of the pointing means 200 coincides with the reflecting means 42, to which an actuator element 202 is connected making it orientable.
- the solution adopted for the device 1 in its second embodiment just described allows the use of varifocal focusing means 32 in which the adjustment of the focal distance is extremely rapid. In fact, in order to react quickly to the actuation commands, these have a pupil of reduced dimensions, which limits the field of view or solid angle of projection of the focusing means 32 themselves.
- the orientation of the optical paths P passing through the focusing means 32 therefore allows the images 50 to be projected in different positions which cover a larger surface of the screen element 40, also in angular terms, with respect to the field of view of the focusing means 32.
- the projection position is determined so that the projected graphic elements SI-N are interposed exactly between the eyes of the user G and the portion of the ambient scene A on which said graphic elements SI-N are assigned to be superimposed.
- the invention provides that the three-dimensional visualization device 1 comprises a plurality of image generation units 26, preferably two, each pertaining to respective sets of optical elements 30 converging in the same output combiner element 35, either directly or through a hierarchy of combiner elements 35.
- the various image generation units 26 cooperate in parallel to project respective portions of the expanded image 51 derived from the same image 50, with the advantageous effect of facilitating the work of their respective focusing means 32 without at the same time reducing the resolution of the image 50 visualized on the screen element 40.
- the present invention also relates to a three-dimensional visualization method by means of the three-dimensional visualization device described above.
- Said method comprises the phases of: processing, by means of the electronic control means 24, an image 50 comprising one or more graphic elements SI-N assigned to be focused at respective focusing distances DI-N predetermined by the electronic control means 24; decomposing, by means of the electronic control means 24, the image 50 into a sequence of frames FFI-M, each preferably having two portions 53n, 53f, each of which contains only graphic elements SI-N assigned to be focused at the same and respective focusing distance DI-N comprised between respective minimum and maximum focal distances; transmitting in a sequential and cyclical manner the sequence of frames FFI-M, derived from the image 50, from the electronic control means 24 to the image generation unit 26; performing the following steps, repeating them for each frame of the sequence of frames FFI-M transmitted to the image generation unit 26: by means of the electronic control means 24, operating each focusing means 32 to adjust its focal distance to the focusing distance DI-N of the graphic elements SI-N of a respective portion 53n, 53f of the transmitted frame; by means of the image generation unit 26, projecting each portion 53n, 53f of the transmitted frame along its respective optical path P through the respective focusing means 32, so that it is visualized on the screen element 40 focused at its respective focusing distance DI-N.
- the phase of decomposing the image 50 provides that a portion 53n of each of the frames FFI-M contains only graphic elements SI-N assigned to be focused between a minimum focusing distance Dn and an intermediate focusing distance Dm, and that the remaining portion 53f contains only graphic elements SI-N assigned to be focused between said intermediate focusing distance Dm and a maximum focusing distance Df.
- the step of operating the focusing means 32 then provides that the electronic control means 24 actuate each focusing means 32 by regulating the focal distance thereof within the range of the focusing distances DI-N of its respective portion 53n, 53f of the frames FFI-M.
- the transmission of the frames restarts cyclically from the beginning of said sequence, preferably according to the order followed in the previous cycle.
- the phases of processing, decomposing, transmitting and performing are repeated for each new image 50 to be visualized on the screen element 40.
- the phase of performing further comprises the step of interrupting, momentarily and for a minimum amount of time, the projection by the image generation unit 26, before the step of operating the focusing means 32, so as not to cause a blur of the previously transmitted frame while adjusting the focus of the focusing means 32 to the value required for the current transmitted frame.
- Another variant of the method provides that the image 50 is decomposed into a different number of portions 53n, 53f, associated with an equal number of respective focusing means 32.
- the method further comprises the phases of: detecting the position of the eyes and/or the gaze direction W of a user G of the device 1 by means of eye monitoring means 150 of the device 1 itself, comprising for example one or more cameras connected to the electronic control means 24 and the shots of which are processed by an eye tracking algorithm; adjusting the display position of the images 50 on the screen element 40, according to the detected position of the eyes and/or the gaze direction W of the user G, by means of pointing means 200 of the device 1 such as for example a mirror 201 adjustable along one or more orthogonal directions by means of an actuator element 202 thereof.
- a third embodiment of the three-dimensional visualization method provides that the frames FFI-M are transmitted to a plurality of image generation units 26, and that each of these projects respective portions of each frame.
- An advantage of the present invention is to provide a three-dimensional visualization device 1 in which the graphic elements SI-N of each image 50 can be simultaneously displayed and accurately focused each on respective focal planes FI-N at respective focusing distances DI-N.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Abstract
A three-dimensional visualization device comprises electronic control means (24), at least one image generation unit (26), a screen element (40) for visualizing the images (50), and at least one set of optical elements (30) defining at least one optical path (P) between the at least one image generation unit (26) and the screen element (40). Each image (50) comprises at least one graphic element (S1-N) assigned to be focused at a respective predetermined focusing distance (D1-N). The device (1) comprises at least two sets of optical elements (30) defining respective optical paths (P). Each set of optical elements (30) comprises at least a diffuser element (33) and a focusing means (32) which is assigned to focus at least one graphic element (S1-N) at the respective focusing distance (D1-N) thereof comprised between a minimum focal distance and a maximum focal distance of the focusing means (32). Graphic elements (S1-N) of the same image (50), having different focusing distances (D1-N) comprised between the same minimum and maximum focal distances, follow the same optical path (P) and are projected sequentially and cyclically over time.
Description
THREE-DIMENSIONAL VISUALIZATION DEVICE AND METHOD
DESCRIPTION
TECHNICAL FIELD
The present invention relates to the field of visualization of graphic elements onto one or more focal planes, and in particular relates to a three-dimensional visualization device and method.
PRIOR ART
There is a known three-dimensional visualization device, for example the so-called Heads-Up Display (HUD) in augmented reality, for example in the automotive and aircraft fields, which comprises electronic control means, an image generation unit, optical elements such as lenses, mirrors and beam splitters for enlarging and focusing the image, and a semi-reflective screen for viewing the image. The graphic elements constituting an image to be visualized and formed by the image generation unit follow optical paths through the optical elements and are projected onto the semi-reflective screen at respective variable focal distances, so that, in correspondence with an operating condition of the device, a user sees these graphic elements distributed on different focal planes, i.e. in three dimensions.
Document WO2018/066675A1 discloses a variable magnification projection optical system wherein the display distance of a virtual image is changeable at an arbitrary position within a screen. The variable magnification projection optical system has a first projection optical system, which forms a first intermediate image from an image of a display element, and a second projection optical system which reflects image light from the first intermediate image by a combiner that is a display screen, thereby displaying a virtual image through the display screen. The second projection optical system has a variable focus optical system in which the display position of the virtual image is variable.
Document DE112017006073T5 discloses an imaging system for generating virtual
images with multiple depths on a screen. The imaging system comprises an image realization device for forming a source image, and projection optics for rendering a display image on the screen. The display image is a virtual image corresponding to the source image. The image realization device comprises an image realization surface and a light structuring device having a surface with a first and second region, wherein the light structuring device is configured to simulate a first lens with a first focal length on the first region of the surface. The surface and the image realization surface are arranged so that a first source image formed on a first area of the image realization surface and projected by the projection optics creates a first display image on the screen at a first apparent depth. The light structuring device is further configured to simulate a second lens on the second area of the surface, the second lens having a second focal length. The surface and the image realization surface are arranged such that a second source image, formed on a second area of the image realization surface and projected by the projection optics, creates a second display image on the screen at a second apparent depth. The first and second lenses are independently configurable.
Document US10353212B2 discloses a see-through type display apparatus and a method of operating the see-through type display apparatus. The see-through type display apparatus simultaneously outputs a plurality of two-dimensional (2D) images having different depth information onto different regions, produces a multi-layered depth image by sequentially arranging the plurality of 2D images according to the depth information, and causes the multi-layered depth image and a reality environment to converge onto one region.
Document DE3532120A1 discloses a windscreen for motor vehicles with a reflecting device for making optical information or signals visible in the field of vision of the windscreen. The reflecting device is a hologram with the property of a mirror. The reflection angle of the holographic mirror is different from the reflection angle of the glass surface. As a result, the rays reflected by the glass surface do not reach the eye of the viewer, but only the rays reflected by the holographic mirror. In the case of a curved windshield, a hologram with the function of a plane mirror is on a carrier layer that is curved to match the shape of the windshield, so that the reflected image of the optical signals or information is free of optical aberrations even when the windshield is curved.
A disadvantage of the known three-dimensional visualization devices consists in that focusing is achieved through the mechanical adjustment of the position of the lenses, making it relatively slow, affected by uncertainties deriving from mechanical tolerances, and sensitive to the vibrations of the environment in which the device is installed.
Another drawback of the known devices consists in that the number of focal planes on which a graphic element can be focused is limited, due to the mechanical adjustment of the focusing.
A further disadvantage of the known devices consists in that different graphic elements can be displayed simultaneously only on the same focal plane, or on a limited number of focal planes at predetermined and fixed focal distances.
DISCLOSURE OF THE INVENTION
An object of the present invention is to propose a three-dimensional visualization device in which the graphic elements of each image can be simultaneously and finely displayed and focused each on a respective focal plane placed at a respective focusing distance.
Another object is to propose a device in which the projected images are always aligned with the direction of the user's gaze.
BRIEF DESCRIPTION OF THE DRAWINGS
The features of the invention are highlighted below with particular reference to the attached drawings in which:
figure 1 illustrates a schematic view of the three-dimensional visualization device object of the present invention, in which some parts have been removed to better highlight others;
figure 2 illustrates an enlarged schematic view of a projection group of the device of figure 1 comprising an optical block;
figure 3 illustrates an enlarged schematic view of the optical block of figure 2;
figures 4a, 4b respectively illustrate an ambient scene and an image visualized by the device of figure 1 as seen by a user;
figure 5 illustrates a schematic side view of the graphic elements of the image of figure 4b on their respective focal planes as seen by a user;
figure 6 illustrates a schematic perspective view of the graphic elements of figure 5, in which some parts have been removed to better highlight others;
figure 7 illustrates an expanded image derived from the image of figure 4b;
figures 8a, 8b illustrate respective frames of the expanded image of figure 7;
figure 9 illustrates the ambient scene of figure 4a onto which the visualized image of figure 4b is superimposed;
figure 10a illustrates a general representation of a time diagram of the projection of frames for visualizing an image;
figure 10b illustrates an exemplary representation of the time diagram of figure 10a of the projection of the frames of figures 8a, 8b for visualizing the image of figure 4b;
figures 11 and 12 illustrate respective images projected in different display positions on the screen element;
figure 13 schematically illustrates the gaze of a user directed in two different directions.
BEST MODE TO CARRY OUT THE INVENTION
With reference to figures 1 to 13, numeral 1 indicates the three-dimensional visualization device object of the present invention.
In the preferred embodiment, the three-dimensional visualization device 1 essentially comprises electronic control means 24, a screen element 40, an image generation unit 26, and two sets of optical elements 30 defining two respective and distinct optical paths P between the image generation unit 26 and the screen element 40.
An image 50 comprises a finite number of graphic elements or symbols SI-N, each of which is assigned to be focused on focal planes FI-N placed at respective predetermined focusing distances DI-N, equal or different, measured starting from a reference such as for example the screen element 40, the sets of optical elements 30 or a single element of the latter.
The electronic control means 24 consist of microprocessors and other digital and analog electric and electronic circuits, and are connected to the active components of the device 1 and to circuits which are external to the device itself, as described below.
The image generation unit 26 preferably consists of a projector means, for example of the DLP or TFT type, and is capable of projecting luminous images at a predetermined resolution, i.e. composed of a predetermined number of pixels.
With particular reference to figure 3, each set of optical elements 30 comprises a focusing means 32 and a diffuser element 33 interposed between the focusing means 32 and the image generation unit 26.
The diffuser element 33 is an almost planar optical component capable of revealing and showing, on a surface thereof facing the focusing means 32, a respective portion of a luminous image projected by the image generation unit 26. This portion of luminous image, constituting the object of the projection, is preferably focused on the diffuser element 33 thanks to an optical means 27, such as a lens or a group of lenses, interposed between the image generation unit 26 and the diffuser element 33 along its respective optical path P.
The distance of the diffuser element 33 from the focusing means 32 therefore fixes the so-called object distance which, in combination with the variable focal distance of the focusing means 32, determines the focusing distance DI-N of said image portion. In other words, onto the diffuser element 33 the actual frame is established, i.e. the object, which will be projected through its respective set of optical elements 30 and displayed on the
screen element 40 focused at its respective focusing distance DI-N. The diffuser element 33 thus establishes a reference or offset for calculating and adjusting the focal distance of each focusing means 32 and of each set of optical elements 30.
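The text does not state the optical relation explicitly; as a hedged illustration, assuming the focusing means 32 behaves as an ideal thin lens and ignoring the downstream combiner and magnifying optics, the fixed object distance set by the diffuser element 33 and the electronically variable focal length would be related to the resulting focusing distance by the standard thin-lens equation:

```latex
% Sketch under assumptions: ideal thin lens, downstream optics ignored.
% d_o : object distance (diffuser element 33 to focusing means 32), fixed
% f   : focal length of the focusing means 32, electronically adjustable
% d_i : resulting image distance, which sets the focusing distance D1-N
\[
  \frac{1}{f} \;=\; \frac{1}{d_o} + \frac{1}{d_i}
  \qquad\Longrightarrow\qquad
  d_i \;=\; \frac{f\, d_o}{d_o - f}
\]
```

In this reading, varying f while d_o stays fixed moves d_i, i.e. the plane on which the projected portion appears in focus; the exact sign convention and the contribution of the magnifying elements 36 would depend on the real optical design.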
Preferably, each focusing means 32 is a varifocal lens with piezoelectric adjustment of the focal distance, or of a similar lens type with electrical or electronic adjustment, whose actuation by the electronic control means 24, usually via digital signals, allows its focal distance to be adjusted quickly and finely. The luminous image shown by the diffuser element 33 on one side of the focusing means 32 is focused by the latter on its opposite side, at the desired focusing distance DI-N.
The focal distance of each focusing means 32 can be adjusted between a respective minimum focal distance and a respective maximum focal distance, the value of which depends on geometrical and constructive factors of the focusing means 32 itself. Each focusing means 32 is assigned to focus one or more graphic elements SI-N at their respective focusing distances DI-N comprised between said minimum and maximum focal distances.
The optical paths P, each passing through its respective diffuser element 33 and through its respective focusing means 32, preferably converge into a combiner element 35 consisting of a beam splitter, or an optical prism, or the like.
The two sets of optical elements 30 are side by side and their optical axes are aligned along parallel directions. The optical axis of one set of optical elements 30 is aligned normally with the center of a front face 35f of the combiner element 35. The optical axis of the remaining set of optical elements 30 points to a reflecting element 37 fixed in front of a side face 351 of the combiner element 35, with the normal of the reflecting element 37 oriented along the bisector of the angle between said optical axis and the normal to said side face 351.
The optical paths P exit coinciding with each other from a rear face 35r of the combiner element 35.
Optionally, a mirror means 34 is associated with the combiner element 35 to a side face thereof opposite to its side face 351 which faces the reflecting element 37. The mirror means 34 is for example of the so-called blocker type.
Magnifying elements 36, constituted by a set of coaxial lenses aligned normally with the center of the rear face 35r of the combiner element 35, are preferably placed downstream of the combiner element 35 itself between this and the screen element 40.
The screen element 40 consists of a plate of glass or other transparent material with two layers, between which a holographic optical element, so-called HOE (Holographic Optical Element), is preferably inserted, in the form of a film or thin layer.
Optionally, a reflecting means 42 is interposed between the optical elements 30 and the screen element 40, preferably between the magnifying elements 36 and the screen element 40, or alternatively between the combiner element 35 and the magnifying elements 36. The reflecting means 42, of the mirror or series of mirrors type, deviates the optical paths P between the combiner element 35 and the screen element 40, so that an optical path P of a given length thus diverted is contained inside a smaller volume than the one it would occupy if it were not diverted, making the device 1 more compact.
The device 1 can further comprise a focusing optical means 28 between the combiner element 35 and the screen element 40, or more generally between the sets of optical elements 30 and the screen element 40, for example placed before the reflecting means 42 between the latter and the magnifying elements 36.
The optical elements 30, the combiner element 35, the mirror means 34, the reflective element 37 and the magnifying elements 36 are part of an optical block 300.
Said optical block 300, the electronic control means 24, the image generation unit 26 and the optical means 27, 28 constitute a so-called projection group 100.
The preferred operation of the three-dimensional visualization device 1 provides that it is installed in or on the dashboard of a motor vehicle or operating vehicle close to the windshield, with the screen element 40 coinciding with the windshield itself, as shown in figure 5 by way of example. Alternatively, said screen element 40 can be a glass plate independently fixed to the dashboard and separated from said windshield.
The electronic control means 24 are connected to an advanced user assistance system, so-called ADAS (Advanced Driver-Assistance System), external to the device 1, from which they almost continuously receive information on the state of the vehicle, for example speed, position, orientation, information about the route to be followed, information about the vehicle's surroundings, for example about traffic signs or the presence and position of other vehicles and obstacles, and other data that may be relevant for a user G of the vehicle and of the device 1, who is the driver of said vehicle.
The electronic control means 24, on the basis of predetermined criteria configurable from the outside by the user G through control and command means, known and not illustrated, digitally process a set of graphic elements SI-N to form an image 50, with a predetermined width/height ratio, assigned to be seen by the user G on the screen element 40 superimposed onto the ambient scene A observed by said user G in front of the vehicle. The electronic control means 24 produce a new image 50 with a frequency which is fixed or depends on time or other factors, such as the speed of the vehicle.
Each of the graphic elements SI-N, having predetermined colors, shape, position and meaning, carries a digital attribute which indicates its desired focusing distance DI-N.
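A minimal sketch of how such a per-element attribute could be represented in software; the class name and fields below are illustrative assumptions, since the text only says that each graphic element carries a digital attribute indicating its desired focusing distance DI-N:

```python
from dataclasses import dataclass

@dataclass
class GraphicElement:
    """Hypothetical stand-in for a graphic element S1..SN of an image 50."""
    symbol_id: str               # e.g. "nav_arrow" or "speed_limit" (illustrative)
    position: tuple              # (x, y) position of the element inside the image 50
    focusing_distance_m: float   # digital attribute: desired focusing distance, in metres

# Example: a navigation arrow to be focused 15 m from the chosen reference.
arrow = GraphicElement("nav_arrow", (0.5, 0.3), 15.0)
```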
The electronic control means 24 distribute the graphic elements SI-N of the image 50 in an expanded image 51 comprising two adjacent portions, near 52n and far 52f, each of which has the same width/height ratio as the image 50. The near portion 52n contains the graphic elements SI-N assigned to be focused on focal planes FI-N at focusing distances DI-N comprised between a minimum focusing distance Dn and an intermediate focusing distance Dm. The far portion 52f contains the graphic elements SI-N assigned to be focused between the intermediate focusing distance Dm and a maximum focusing distance Df.
Preferably, the minimum focusing distance Dn and the intermediate focusing distance Dm correspond respectively to the minimum focal distance and the maximum focal distance of one of the focusing means 32; the intermediate focusing distance Dm and the maximum focusing distance Df correspond respectively to the minimum focal distance and the maximum focal distance of the remaining focusing means 32.
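A hedged sketch of how the graphic elements could be split between the near portion 52n and the far portion 52f of the expanded image 51, assuming example boundary values Dn, Dm, Df and a simple (name, distance) representation; names and numbers are illustrative only:

```python
# Assumed boundary distances, in metres (not taken from the text).
D_N, D_M, D_F = 2.0, 10.0, 50.0   # minimum, intermediate and maximum focusing distances

elements = [("speedometer", 2.5), ("lane_marking", 8.0), ("nav_arrow", 15.0), ("poi_label", 40.0)]

# Elements up to Dm go to the near portion 52n, the rest to the far portion 52f;
# the treatment of an element exactly at Dm is an arbitrary choice here.
near_portion = [e for e in elements if D_N <= e[1] <= D_M]
far_portion  = [e for e in elements if D_M <  e[1] <= D_F]

print(near_portion)   # [('speedometer', 2.5), ('lane_marking', 8.0)]
print(far_portion)    # [('nav_arrow', 15.0), ('poi_label', 40.0)]
```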
The electronic control means 24 therefore produce a series or sequence of frames FFI-M deriving from the expanded image 51. Each of the frames FFI-M comprises two or more portions, preferably two adjacent portions 53n, 53f, of dimensions identical to those of the expanded image 51. Each portion 53n, 53f of one of the frames FFI-M contains only graphic elements SI-N having identical focusing distance DI-N.
The number NM of the frames FFI-M produced starting from the image 50, i.e. from the expanded image 51, is necessarily greater than or equal to a lower limit number equal to the greatest of: a number of near focusing distances Nn, equal to the number of focusing distances DI-N comprised between the minimum focusing distance Dn and the intermediate focusing distance Dm at which at least one of the graphic elements SI-N of the image 50 is assigned to be focused; a number of far focusing distances Nf, equal to the number of focusing distances DI-N between the intermediate focusing distance Dm and the maximum focusing distance Df at which at least one of the graphic elements SI-N of the image 50 is assigned to be focused.
The number NM of the frames FFI-M is preferably less than or equal to the least common multiple of said numbers of near and far focusing distances Nn, Nf.
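Read together, the two conditions bound NM between max(Nn, Nf) and lcm(Nn, Nf). Below is a short sketch of these bounds and of one possible round-robin frame sequence that uses the upper bound; the helper names are assumptions, not terms from the text:

```python
from math import lcm

def frame_count_bounds(near_distances, far_distances):
    """Return (lower bound, preferred upper bound) on NM for the given distances."""
    n_n, n_f = len(set(near_distances)), len(set(far_distances))
    return max(n_n, n_f), lcm(n_n, n_f)

def build_frames(near_distances, far_distances):
    """Cycle both lists so each frame portion holds a single focusing distance."""
    n_n, n_f = len(near_distances), len(far_distances)
    return [(near_distances[i % n_n], far_distances[i % n_f])
            for i in range(lcm(n_n, n_f))]

# Example: two near distances (Nn = 2) and three far distances (Nf = 3).
print(frame_count_bounds([3.0, 7.0], [12.0, 20.0, 45.0]))   # (3, 6)
print(build_frames([3.0, 7.0], [12.0, 20.0, 45.0]))
# [(3.0, 12.0), (7.0, 20.0), (3.0, 45.0), (7.0, 12.0), (3.0, 20.0), (7.0, 45.0)]
```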
The sequence of frames for the image 50 comprises frames FFI-M preferably different from one another. The frames FFI-M of said sequence are assigned to be displayed by the image generation unit 26 on the screen element 40 alternating with each other in a sequential and cyclical manner for the entire time during which the image 50 is visualized by the device 1, with a rate Cfps, expressed in frames per second (fps), which mainly depends on the number NM of the frames FFI-M produced starting from the image 50, and for a visualization time Tv of the single frame linked to the rate Cfps through the relationship Tv = 1 / Cfps. The higher the number NM of the frames FFI-M, the higher their rate Cfps and the shorter the visualization time Tv of each individual frame. As long as the image 50 is visualized by the device 1, at the end of a display cycle of the sequence of frames FFI-M a new display cycle of the same sequence of frames begins, preferably visualized according to the order followed during the previous cycle.
For each frame to be visualized, the electronic control means 24 electronically transmit said frame to the image generation unit 26, and simultaneously operate the two focusing means 32 to adjust the focal distance of each of these, setting it to the value indicated by the attributes of the graphic elements SI-N contained in each respective portion 53n, 53f of the frame.
The image generation unit 26 generates the luminous graphic representation of each portion 53n, 53f of the frame and projects it along the respective optical path P thereof through the focusing means 32 operated by the electronic control means 24. In particular, the portion 53n, deriving from the near portion 52n of the expanded image 51, is projected through the focusing means 32 adjusted between the minimum focusing distance Dn and the intermediate focusing distance Dm; the portion 53f, deriving from the far portion 52f of the expanded image 51, is projected through the focusing means 32 adjusted between the intermediate focusing distance Dm and the maximum focusing distance Df.
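A hedged sketch of this per-frame operation as a control loop: for each frame, both varifocal focusing means 32 are set to the distances required by their respective portions 53n, 53f, after which the two portions are projected. The driver functions and the frame rate below are placeholders, since the actual hardware interfaces are not described:

```python
import time

FRAME_PERIOD_S = 1.0 / 48.0   # assumed visualization time Tv for an example rate of 48 fps

def set_focal_distance(lens_index: int, distance_m: float) -> None:
    """Placeholder for the driver of a piezoelectric varifocal focusing means 32."""
    pass

def project_portions(near_pixels, far_pixels) -> None:
    """Placeholder for the image generation unit 26 projecting portions 53n and 53f."""
    pass

def display_cycle(frames):
    """frames: sequence of ((near_pixels, near_distance_m), (far_pixels, far_distance_m))."""
    while True:                                   # repeat the same sequence cyclically
        for (near_pixels, d_near), (far_pixels, d_far) in frames:
            set_focal_distance(0, d_near)         # focusing means 32 of the near portion
            set_focal_distance(1, d_far)          # focusing means 32 of the far portion
            project_portions(near_pixels, far_pixels)
            time.sleep(FRAME_PERIOD_S)            # hold the frame for Tv seconds
```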
The two portions 53n, 53f of the frame, having identical proportions, are combined together in the combiner element 35 to form a combined frame of proportions equal to those of the image 50.
The projected combined frame exits the rear face 35r of the combiner element 35, passes through the magnifying elements 36, is focused by the optical means 28, is reflected by the reflecting means 42, and arrives on the screen element 40, where it is visible to the user G.
The magnifying elements 36 have the function of enlarging the combined frame and transforming it, so that its projection onto the screen element 40, which has a predetermined possibly concave shape, appears to have the correct proportions of the image 50 from the point of view of the user G.
A display cycle of the sequence of frames FFI-M has a duration of Tv x NM seconds, and the same frame is projected onto the screen element 40 every Tv x NM seconds. Therefore, every Tv x NM seconds, i.e. every NM / Cfps seconds, the user G observes the entire sequence of frames FFI-M containing all the graphic elements SI-N constituting the image 50.
For a rate Cfps sufficiently high, for example Cfps > 20 x NM fps and preferably Cfps = 24 x NM fps, the user G does not distinguish the sequentiality and alternation with which the graphic elements SI-N are visualized on the screen element 40, but he sees on the screen element 40 a single almost static image 50 in which all the graphic elements SI-N are present almost continuously and simultaneously, each one focused at its respective focusing distance DI-N. In this way, the user G has the impression that the image 50 is three-dimensional, with each of the graphic elements SI-N superimposed on the portion of the ambient scene A to which the meaning of the graphic element itself refers.
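A short worked example of the timing relations Tv = 1 / Cfps and cycle duration Tv x NM, using the preferred rate Cfps = 24 x NM fps; the value of NM is only an example:

```python
NM = 2                       # example number of frames in the sequence
Cfps = 24 * NM               # preferred rate: 24 x NM frames per second -> 48 fps
Tv = 1.0 / Cfps              # visualization time of a single frame -> ~20.8 ms
cycle = Tv * NM              # duration of one display cycle -> ~41.7 ms

print(f"Cfps = {Cfps} fps, Tv = {Tv * 1e3:.1f} ms, cycle = {cycle * 1e3:.1f} ms")
# Cfps = 48 fps, Tv = 20.8 ms, cycle = 41.7 ms
```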
The holographic optical element, associated with the screen element 40, mostly reflects the wavelengths present in the image 50, so that the three-dimensional image 50 appears to the user G to be sufficiently bright with respect to the ambient scene A on which it is superimposed, so that the user G can easily observe the graphic elements SI-N.
Graphic elements SI-N of the same image 50 having different focusing distances DI-N comprised between the same minimum and maximum focal distances, therefore, follow the same optical path P and are projected sequentially and cyclically over time.
A variant of the device 1 provides that the number of sets of optical elements 30 and of their respective optical paths P is greater than two, for example four. Consequently, the
expanded image 51 comprises four portions, each containing the graphic elements SI-N assigned to be focused at focusing distances DI-N within a respective range, wherein these ranges are preferably disjoint but may alternatively partially overlap. All the optical paths P converge into an output combiner element 35 through a tree hierarchy of combiner elements 35 which combine the optical paths P two by two. In this way the focusing of the graphic elements SI-N is entrusted to and distributed among four focusing means 32, the workload of each one being on average lower than in the variants of the device 1 with a smaller number of focusing means 32. The price for this advantage is a reduction in the resolution of the images displayed, since the resolution of the image generation unit 26 is divided by the number of portions of the expanded image 51.
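A minimal sketch of how such an assignment by focusing-distance range could be expressed is given below; the four ranges and the example graphic elements are assumed values, not taken from the description.

```python
# Minimal sketch (illustrative data, not from the patent text) of assigning the
# graphic elements S1-N of an image to the portions of the expanded image when
# four optical paths are used, each covering its own range of focusing distances.

def split_by_focus_range(elements, ranges):
    """elements: list of (name, focus_distance); ranges: list of (lo, hi)."""
    portions = [[] for _ in ranges]
    for name, distance in elements:
        for i, (lo, hi) in enumerate(ranges):
            if lo <= distance < hi:
                portions[i].append(name)
                break
    return portions

# Example with four preferably disjoint ranges, in metres (assumed values).
ranges = [(2.0, 5.0), (5.0, 10.0), (10.0, 30.0), (30.0, 100.0)]
elements = [("speed", 2.5), ("nav-arrow", 12.0), ("hazard", 45.0)]
print(split_by_focus_range(elements, ranges))  # [['speed'], [], ['nav-arrow'], ['hazard']]
```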
Another variant of the device 1 provides that the holographic optical element is associated with an element of the device 1 itself other than the screen element 40, for example with the reflecting means 42 located before the latter.
A second embodiment of the device 1 provides that this also includes eye monitoring means 150 and pointing means 200 of the optical paths P, connected to the electronic control means 24.
The eye monitoring means 150 are assigned to detect the position of the eyes and/or the gaze direction W of the user G of the device 1, to trace the movements of the eyes and gaze of the user G.
Said eye monitoring means 150 comprise an optical detector 151, which is preferably a camera framing the user's eyes, and a lighting element 152, such as an LED source or another luminous means, operating in the visible and/or infrared spectrum depending on the detection spectrum of the associated optical detector 151.
The lighting element 152 illuminates the user's eyes even in low ambient light conditions, so that the optical detector 151 can detect and track the pupils of the user G. The data produced by the eye monitoring means 150 are employed by an eye-tracking algorithm, running in the electronic control means 24 or in a dedicated processor of the eye monitoring means 150 themselves, to determine at any instant in which direction the user G is looking.
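The patent does not prescribe a specific eye-tracking algorithm; purely as a hedged illustration, the sketch below shows one very simplified way a gaze direction could be derived from a detected pupil position, assuming a pre-calibrated reference point and a linear pixel-to-angle factor.

```python
# Very simplified sketch of the kind of computation an eye-tracking algorithm
# could perform on the output of the optical detector 151. The calibration
# model, the pixel-to-angle factor and the example coordinates are assumptions.

def gaze_direction(pupil_px, eye_center_px, px_per_degree=30.0):
    """Estimate horizontal/vertical gaze angles (degrees) from the offset of the
    detected pupil centre with respect to a calibrated reference point."""
    dx = pupil_px[0] - eye_center_px[0]
    dy = pupil_px[1] - eye_center_px[1]
    return dx / px_per_degree, dy / px_per_degree

# Example: pupil detected 45 px to the right and 15 px below the calibrated centre.
yaw_deg, pitch_deg = gaze_direction((645, 375), (600, 360))
print(f"gaze W ~ ({yaw_deg:.1f} deg, {pitch_deg:.1f} deg)")
```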
A variant of the eye monitoring means 150 provides that the optical detector 151 is of another type, for example a photodiode.
Optionally, multiple optical detectors 151, for example two cameras, can frame and observe the eyes of the user G from different directions for faster and/or more accurate detection.
The pointing means 200 are for example so-called f-theta and/or telecentric optical systems. The pointing means 200 comprise one or more mirrors 201 interposed between the sets of optical elements 30 and the screen element 40 and orientable, i.e. rotatable and/or translatable, by means of respective actuator elements 202 along at least one direction. Preferably, the pointing means 200 comprise a mirror 201 connected to an actuator element 202 which allows it to be tilted around two orthogonal axes.
The pointing means 200 are controlled by the electronic control means 24 to adjust the display position of the images 50 on the screen element 40 according to the position of the eyes and/or the gaze direction W of the user G.
In particular, also depending on the size of the pupil of the focusing means 32, which determines the solid angle of projection thereof, the area of each image 50 visualized on the screen element 40 can be less than the total area of the screen element 40 itself, as illustrated by way of example in figures 11 to 13. Therefore, each image 50 can be projected in a respective display position covering only a certain portion, and/or centered at a certain point, of the screen element 40. The pointing means 200 allow said display position to be varied for each image according to the gaze direction W of the user G, continuously aligning the projected images 50 with it.
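As a rough illustration of this alignment step, the sketch below maps an estimated gaze direction to tilt commands for an orientable mirror; the linear gain, the factor accounting for the doubling of the beam deviation on reflection, and the actuator limits are all assumptions rather than values from the patent.

```python
# Minimal sketch of how the electronic control means could convert the detected
# gaze direction W into tilt commands for the orientable mirror 201 of the
# pointing means 200. A real system would use a calibrated model of the
# projection geometry; the gain and limits here are assumed values.

def mirror_tilt_from_gaze(yaw_deg: float, pitch_deg: float,
                          gain: float = 0.5, limit_deg: float = 10.0):
    """Return (tilt_x, tilt_y) commands, clamped to the actuator range."""
    clamp = lambda a: max(-limit_deg, min(limit_deg, a))
    # A mirror deviates the reflected beam by twice its own rotation,
    # hence the assumed 0.5 gain between gaze angle and mirror tilt.
    return clamp(gain * yaw_deg), clamp(gain * pitch_deg)

print(mirror_tilt_from_gaze(yaw_deg=6.0, pitch_deg=-2.0))  # -> (3.0, -1.0)
```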
A variant of the device 1 provides that the mirror 201 of the pointing means 200 coincides with the reflecting means 42, to which an actuator element 202 is connected making it orientable.
The solution adopted for the device 1 in its second embodiment just described allows the use of varifocal focusing means 32 in which the adjustment of the focal distance is extremely rapid. In fact, in order to react quickly to the actuation commands, these have a pupil of reduced dimensions, which limits the field of view, or solid angle of projection, of the focusing means 32 themselves. The orientation of the optical paths P passing through the focusing means 32 therefore allows the images 50 to be projected in different positions which cover a larger surface of the screen element 40, also in angular terms, with respect to the field of view of the focusing means 32. Furthermore, the projection position is determined so that the projected graphic elements SI-N are interposed exactly between the eyes of the user G and the portion of the ambient scene A on which said graphic elements SI-N are assigned to be superimposed.
In a third embodiment thereof, the invention provides that the three-dimensional visualization device 1 comprises a plurality of image generation units 26, preferably two, each pertaining to respective sets of optical elements 30 converging in the same output combiner element 35, either directly or through a hierarchy of combiner elements 35.
The various image generation units 26 cooperate in parallel to project respective portions of the expanded image 51 derived from the same image 50, with the advantageous effect of facilitating the work of their respective focusing means 32 without at the same time reducing the resolution of the image 50 visualized on the screen element 40.
The present invention also relates to a three-dimensional visualization method by means of the three-dimensional visualization device described above.
Said method comprises the phases of:
- processing, by means of the electronic control means 24, an image 50 comprising one or more graphic elements SI-N assigned to be focused at respective focusing distances DI-N predetermined by the electronic control means 24;
- decomposing, by means of the electronic control means 24, the image 50 into a sequence of frames FFI-M, each preferably having two portions 53n, 53f, each of which contains only graphic elements SI-N assigned to be focused at the same and respective focusing distance DI-N comprised between respective minimum and maximum focal distances;
- transmitting, in a sequential and cyclical manner, the sequence of frames FFI-M derived from the image 50 from the electronic control means 24 to the image generation unit 26;
- performing the following steps, repeating them for each frame of the sequence of frames FFI-M transmitted to the image generation unit 26: by means of the electronic control means 24, operating each focusing means 32 to adjust its focal distance to the focusing distance DI-N of the graphic elements SI-N of a respective portion 53n, 53f of the transmitted frame; by means of the image generation unit 26, projecting each portion 53n, 53f of the transmitted frame through its respective focusing means 32 onto the screen element 40.
Preferably, the phase of decomposing the image 50 provides that a portion 53n of each of the frames FFI-M contains only graphic elements SI-N assigned to be focused between a minimum focusing distance Dn and an intermediate focusing distance Dm, and that the remaining portion 53f contains only graphic elements SI-N assigned to be focused between said intermediate focusing distance Dm and a maximum focusing distance Dr.
The step of operating the focusing means 32 then provides that the electronic control means 24 actuate each focusing means 32 by regulating the focal distance thereof within the range of the focusing distances DI-N of its respective portion 53n, 53f of the frames FFI-M.
The phase of transmitting the frames FFI-M preferably provides for them to be transmitted at a rate Cfps comprised between 1 frame per second and 10000 frames per second, preferably between 20 and 120 frames per second. Each frame is projected for a visualization time Tv = 1 / Cfps.
After the transmission, one at a time, of the sequence of frames FFI-M, the transmission
of the frames restarts cyclically from the beginning of said sequence, preferably according to the order followed in the previous cycle.
The phases of processing, decomposing, transmitting and performing are repeated for each new image 50 to be visualized on the screen element 40.
In a variant of the method, the phase of performing further comprises the step of interrupting, momentarily and for a minimum amount of time, the projection by the image generation unit 26 before the step of operating the focusing means 32, so as not to cause blurring of the previously transmitted frame while the focus of the focusing means 32 is adjusted to the value required for the current transmitted frame.
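A minimal sketch of this blanking variant is shown below; the blank/project interface, the portion structure and the settling time are assumptions used only to make the sequence explicit.

```python
# Minimal sketch (assumed interfaces) of the variant above: the projection is
# blanked for a short interval while the focusing means settles on the new
# focal distance, so the previous frame is never shown out of focus.

import time

SETTLE_TIME_S = 0.001  # assumed settling time of the varifocal lens

def project_with_blanking(frame_portion, lens, image_unit) -> None:
    image_unit.blank()                                        # suspend projection
    lens.set_focal_distance(frame_portion.focus_distance_m)   # adjust the focus
    time.sleep(SETTLE_TIME_S)                                 # wait for the lens to settle
    image_unit.project(frame_portion.pixels)                  # resume with the new frame
```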
Another variant of the method provides that the image 50 is decomposed into a different number of portions 53n, 53f, associated with an equal number of respective focusing means 32.
In a second embodiment thereof, the method further comprises the phases of:
- detecting the position of the eyes and/or the gaze direction W of a user G of the device 1 by means of eye monitoring means 150 of the device 1 itself, comprising for example one or more cameras connected to the electronic control means 24 and the shots of which are processed by an eye tracking algorithm;
- adjusting the display position of the images 50 on the screen element 40, according to the detected position of the eyes and/or the gaze direction W of the user G, by means of pointing means 200 of the device 1, such as for example a mirror 201 adjustable along one or more orthogonal directions by means of an actuator element 202 thereof.
A third embodiment of the three-dimensional visualization method provides that the frames FFI-M are transmitted to a plurality of image generation units 26, and that each of these projects respective portions of each frame.
An advantage of the present invention is to provide a three-dimensional visualization device 1 in which the graphic elements SI-N of each image 50 can be simultaneously displayed and accurately focused, each on a respective focal plane FI-N at a respective focusing distance DI-N.
Claims

1) Three-dimensional visualization device comprising electronic control means (24), at least one image generation unit (26), a screen element (40) for visualizing the images (50), and at least one set of optical elements (30) defining at least one optical path (P) between the at least one image generation unit (26) and the screen element (40), with each image (50) comprising at least one graphic element (SI-N) assigned to be focused at a respective predetermined focusing distance (DI-N); the device (1) being characterized in that it comprises at least two sets of optical elements (30) defining respective optical paths (P), with each set of optical elements (30) comprising at least a diffuser element (33) and a focusing means (32) which is assigned to focus at least one graphic element (SI-N) at the respective focusing distance (DI-N) thereof comprised between a minimum focal distance and a maximum focal distance of the focusing means (32), wherein graphic elements (SI-N) of the same image (50), having different focusing distances (DI-N) comprised between the same minimum and maximum focal distances, follow the same optical path (P) and are projected sequentially and cyclically over time.

2) Device according to claim 1 characterized in that at least one focusing means (32) is of the lens type with electrical adjustment of the focal distance.

3) Device according to any one of the preceding claims characterized in that the optical paths (P) converge into at least one combiner element (35).

4) Device according to claim 3 characterized in that it further comprises magnifying elements (36) interposed between the at least one combiner element (35) and the screen element (40).

5) Device according to claim 3 or 4 characterized in that it further comprises at least one mirror means (34) associated with the at least one combiner element (35).

6) Device according to any one of the preceding claims characterized in that the screen element (40) comprises a holographic optical element.
7) Device according to any one of the preceding claims characterized in that it further comprises optical means (27, 28) for focusing, interposed between the at least one image generation unit (26) and each diffuser element (33) and/or between the sets of optical elements (30) and the screen element (40), and/or in that it further comprises reflecting means (42) interposed between the sets of optical elements (30) and the screen element (40) for deviating the optical paths (P).
8) Device according to any one of the preceding claims characterized in that it further comprises eye monitoring means (150) and pointing means (200) of the optical paths (P), connected to the electronic control means (24); the eye monitoring means (150) being assigned to detect the position of the eyes and/or the gaze direction (W) of a user (G) of the device (1); the pointing means (200) being interposed between the sets of optical elements (30) and the screen element (40) and assigned to adjust the display position of the images (50) on the screen element (40) according to the position of the eyes and/or the gaze direction (W) of the user (G).
9) Device according to claim 8 characterized in that the eye monitoring means (150) comprise at least one optical detector (151), such as a camera, a photodiode, or the like, and a lighting element (152), such as a visible or infrared LED source, or the like, and in that the pointing means (200) comprise at least one mirror (201) which can be orientated by means of a respective actuator element (202) along at least one direction.
10) Three-dimensional visualization method by means of the three-dimensional visualization device of any one of the preceding claims; the device (1) comprising electronic control means (24), at least one image generation unit (26), a screen element (40) and at least two sets of optical elements (30), with each image (50) comprising at least one graphic element (SI-N) assigned to be focused at a respective predetermined focusing distance (DI-N), and with each set of optical elements (30) comprising at least one diffuser element (33) and a focusing means (32) assigned to focus at least one graphic element (SI-N) at the respective focusing distance (DI-N) thereof comprised between a minimum focal distance and a maximum focal distance of the focusing means (32); the method being characterized in that it comprises the phases of:
- processing an image (50) by means of the electronic control means (24);
- decomposing, by means of the electronic control means (24), the image (50) into a sequence of frames (FFI-M), each frame having at least two portions (53n, 53f) each containing only graphic elements (SI-N) assigned to be focused at the same and respective focusing distance (DI-N);
- transmitting, sequentially and cyclically, the frames (FFI-M) of the image (50) from the electronic control means (24) to the at least one image generation unit (26);
- performing, for each frame of the sequence of frames (FFI-M) transmitted to the at least one image generation unit (26), the steps of: by means of the electronic control means (24), operating each focusing means (32) to adjust its focal distance to the focusing distance (DI-N) of the graphic elements (SI-N) of a respective portion (53n, 53f) of the transmitted frame; by means of the at least one image generation unit (26), projecting each portion (53n, 53f) of the transmitted frame through its respective focusing means (32) onto the screen element (40).

11) Method according to claim 10 characterized in that the phase of performing further comprises the step of suspending the projection by means of the at least one image generation unit (26) before the step of operating the focusing means (32).

12) Method according to claim 10 or 11 characterized in that the phase of decomposing the image (50) provides for dividing each of the frames (FFI-M) into two portions (53n, 53f), wherein each portion contains only graphic elements (SI-N) assigned to be focused at the same and respective focusing distance (DI-N) comprised between respective minimum and maximum focal distances.

13) Method according to any one of claims 10 to 12 characterized by transmitting the frames (FFI-M) at a rate (Cfps) comprised between 1 and 10000 frames per second, preferably between 20 and 120 frames per second.

14) Method according to any one of claims 10 to 13 characterized by repeating the phases of processing, decomposing, transmitting and performing for each new image (50) to be visualized.

15) Method according to any one of claims 10 to 14 characterized in that it further comprises the phases of:
- detecting the position of the eyes and/or the gaze direction (W) of a user (G) of the device (1) by means of eye monitoring means (150) of the device (1);
- adjusting the display position of the images (50) on the screen element (40) according to the position of the eyes and/or the gaze direction (W) of the user (G), by means of pointing means (200) of the device (1).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IT102022000002672 | 2022-02-14 | ||
| IT102022000002672A IT202200002672A1 (en) | 2022-02-14 | 2022-02-14 | THREE-DIMENSIONAL DISPLAY DEVICE AND METHOD |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023152403A1 true WO2023152403A1 (en) | 2023-08-17 |
Family
ID=81392853
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/053675 Ceased WO2023152403A1 (en) | 2022-02-14 | 2023-02-14 | Three-dimensional visualization device and method |
Country Status (2)
| Country | Link |
|---|---|
| IT (1) | IT202200002672A1 (en) |
| WO (1) | WO2023152403A1 (en) |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE3532120A1 (en) | 1985-09-10 | 1987-03-19 | Ver Glaswerke Gmbh | WINDSHIELD WITH A REFLECTIVE DEVICE FOR MIRRORING OPTICAL SIGNALS INTO THE FIELD OF THE DRIVER |
| DE112014003685T5 (en) * | 2013-08-09 | 2016-05-12 | Denso Corporation | Information display device |
| WO2018066675A1 (en) | 2016-10-06 | 2018-04-12 | コニカミノルタ株式会社 | Variable magnification projection optical system and image display device |
| DE112017006073T5 (en) | 2016-11-30 | 2019-08-29 | Cambridge Enterprise Limited | MULTI-DEPTH APPARATUS |
| US10353212B2 (en) | 2017-01-11 | 2019-07-16 | Samsung Electronics Co., Ltd. | See-through type display apparatus and method of operating the same |
| US20210088700A1 (en) * | 2019-09-19 | 2021-03-25 | Facebook Technologies, Llc | Varifocal polarization sensitive diffusive display |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025074044A1 (en) * | 2023-10-05 | 2025-04-10 | Stellantis Auto Sas | Multi-plane head-up display method, and associated device and motor vehicle |
| FR3153902A1 (en) * | 2023-10-05 | 2025-04-11 | Psa Automobiles Sa | Multi-plane head-up display method, associated device and motor vehicle. |
Also Published As
| Publication number | Publication date |
|---|---|
| IT202200002672A1 (en) | 2023-08-14 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23704181; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 23704181; Country of ref document: EP; Kind code of ref document: A1 |