US20090109215A1 - Method and apparatus for user interface communication with an image manipulator - Google Patents
- Publication number
- US20090109215A1 (U.S. application Ser. No. 11/932,450)
- Authority
- US
- United States
- Prior art keywords
- image
- coordinate system
- user interaction
- dimensional coordinate
- distortion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
- G03H2001/0061—Adaptation of holography to specific applications in haptic applications when the observer interacts with the holobject
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Holography (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A system, and method for use thereof, for image manipulation. The system may generate an original image in a three dimensional coordinate system. A sensing system may sense a user interaction with the image. The sensed user interaction may be correlated with the three dimensional coordinate system. The correlated user interaction may be used to project an updated image, where the updated image may be a distorted version of the original image. The image distortion may be in the form of a twisting, bending, cutting, displacement, or squeezing. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system.
Description
- A graphical user interface (GUI) is a type of computer application user interface that allows people to interact with a computer and computer-controlled devices. A GUI typically employs graphical icons, visual indicators or special graphical elements, along with text, labels or text navigation to represent the information and actions available to a user. The actions are usually performed through direct manipulation of the graphical elements.
- Holographic images can be created as single or consecutive images using available holographic technology. These technologies include mirrors, lasers, light, and images strategically positioned to cause the proper reflection to yield a holographic image broadcast through an entry point in the laser and mirror positioning system. A black background and rooms with low or no light may enhance the appearance of the holographic image or images, which may also use a holographic plate as a display medium. Holographic systems may be large in size and spread out over a large broadcasting area or may be compact enough to fit in spaces smaller than a desktop. Holographic technology is limited in size only by the size of its component parts. By using holographic technology, images may be displayed multi-dimensionally, rather than simply on a planar projection.
- Recently, progress has been made in technologies that can enhance the capability and range of holographic media. Specifically, progress has been made in projects that employ multi-million-mirror systems, by companies that have designed specialized high-speed, high-capacity microprocessors for jobs other than holographic systems. This technology could be applied to holographic technologies to make possible the proper positioning of millions of mirrors at a rate of 24 to 60 or more frames of video per second, with correspondingly synched audio.
- Holographic displays generated over the last 20-year period utilize various configurations, including lasers with images on glass plates, such as an AGFA 8E75HD glass plate or other glass plates, as well as a laser such as a Spectra Physics 124B HeNe laser or a 35 mW laser diode system, utilizing different processing methods such as pyrochrome processing. Split-beam techniques, Multi H1 to Multi H2, can also be used. Configurations such as 8×10 triethanolamine prints from Linotronic 300 image setter film are also commonly utilized, as is a rear-illuminated 30×40 cm reflection hologram, where a logo floats 18 inches in front of the plate.
- Some user interfaces have adopted a multi-dimensional interface approach. For example, the "heliodisplay" of IO2 Technology, LLC of San Francisco, Calif., projects images into a volume of free space, i.e., into an aerosol mixture such as fog or a gas, and may operate as a floating touchscreen when connected to a PC by a USB cable. However, with the heliodisplay, the image is displayed in two-dimensional space (i.e., planar). While Heliodisplay images appear three-dimensional ("3-D"), the images are planar and have no physical depth reference.
- Unfortunately, these existing uses have certain limitations in distribution and deployment. For example, functionally, the heliodisplay is a two-dimensional display that projects against a curtain of air, or even glass. While the heliodisplay may give the appearance of 3-D, the images displayed and the interface are 2-D. As such, the heliodisplay is not a true 3-D holographic display; the interface operates on a two-dimensional plane and does not take advantage of a full three dimensional coordinate system.
- Accordingly, there is a need for an integrated user interface that utilizes true 3-D technology to create a computing and multimedia environment in which a user can easily navigate by touch, mouse, voice activation, or pointer system. Such an interface would raise the user experience to a true 3-D environment, with the goal of attaining the clarity, realism, and benefits that match our day-to-day interactions with the 3-D world. With voice activation, a user may announce interface positions, or alter a holographic interface, via voice commands.
- An embodiment of the present invention relates to the creation of a holographic user interface display system that combines physical media or digitally stored files with a digital holographic player hardware system. The result is a multimedia holographic user interface and viewing experience, in which a variety of graphical schematics enable cohesive access to information, utilizing pyramids, blocks, spheres, cylinders, other graphical representations, existing templates, specific object rendering, free-form association, user-delegated images, and quantum representations of information to form a user interface where the available tools combine over time to match a user's evolving data and requests.
- Embodiments of the invention provide a holographic user interface which transforms the computing environment to enable a 3-D holographic-style user interface and display system. The system utilizes holographic projection technology along with a programmed quadrant-matrix sensor field to create multiple methods to select and interact with data and user interface tools and icons presented in a holographic format. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system.
- In an example embodiment of the invention, a system and corresponding method for providing a 3-D user interface involves displaying images in a 3-D coordinate system. Sensors are configured to sense user interaction within the 3-D coordinate system, so that a processor may receive user interaction information from the sensors. The sensors are able to provide information to the processor that enables the processor to correlate user interaction with images in the 3-D coordinate system.
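- As a concrete illustration of this sense-and-correlate flow, the following Python sketch polls sensors for a sensed point and matches it against the bounds of displayed images. The sensor callables, image names, and spherical bounds are illustrative assumptions, not details taken from this disclosure:

```python
def correlate(point, images):
    """Return the name of the displayed image, if any, whose bounds contain the point."""
    x, y, z = point
    for name, (cx, cy, cz, r) in images.items():
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r ** 2:
            return name
    return None

def run_once(sensors, images):
    """One pass of the loop: read each sensor, correlate any sensed point."""
    for read_sensor in sensors:
        point = read_sensor()                  # sensed (x, y, z), or None
        if point is None:
            continue
        target = correlate(point, images)      # image at the sensed location
        if target is not None:
            print(f"user interacted with {target} at {point}")

# Images keyed by name, valued by a bounding sphere (cx, cy, cz, radius).
images = {"icon_a": (0.0, 0.0, 0.5, 0.1), "icon_b": (0.3, 0.1, 0.5, 0.1)}
run_once([lambda: (0.02, -0.01, 0.48), lambda: None], images)  # -> icon_a
```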
- In another example embodiment of the invention, a system, and corresponding method, for manipulating an original image is presented. The system may include at least one sensor that may be configured to sense a user interaction with the image in a three dimensional coordinate system, a correlation unit that may be configured to correlate the user interaction with the three dimensional coordinate system, and a projecting unit that may be configured to project an updated image based on the correlated user interaction. The updated image may be equivalent to the original image manipulated by a distortion. The distortion may include twisting, bending, cutting, displacement, and squeezing. The original image and/or the updated image may be a holographic image.
- The at least one sensor may be a laser sensor which may be configured to geometrically identify a position within the three dimensional coordinate system. The at least one sensor may be further configured to triangulate and/or quadrilate a position within the three dimensional coordinate system.
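- A position can be geometrically identified from laser-sensor range readings by trilateration. The Python sketch below recovers a point from three known sensor positions and measured distances; a fourth sensor (to "quadrilate") could resolve the remaining sign ambiguity in z. The sensor placements and distances are illustrative assumptions:

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Recover a 3-D point from three sensor positions and measured distances."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)      # local x-axis
    i = ex.dot(p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)                      # local y-axis
    ez = np.cross(ex, ey)                         # local z-axis
    d = np.linalg.norm(p2 - p1)
    j = ey.dot(p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))    # take the positive root
    return p1 + x * ex + y * ey + z * ez

sensors = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
target = np.array([0.3, 0.4, 0.5])
dists = [np.linalg.norm(target - np.array(s)) for s in sensors]
print(trilaterate(*sensors, *dists))  # ~[0.3 0.4 0.5]
```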
- The projecting unit may be further configured to project the updated image by selecting a pre-recorded interference pattern based on the correlated data. The projecting unit may also be configured to project the updated image by projecting a generated interference pattern based on the correlated data.
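- These two projection paths can be thought of as a lookup with a generative fallback: consult a store of pre-recorded interference patterns keyed by the correlated state, and compute a pattern only when no recording exists. In the hypothetical Python sketch below, the pattern store, key format, and generate_pattern function are assumptions for illustration only:

```python
# Pre-recorded interference patterns keyed by (image, distortion, amount).
prerecorded = {
    ("cube", "twist", 15): "pattern_0415.holo",
    ("cube", "twist", 30): "pattern_0416.holo",
}

def generate_pattern(image_id, distortion, amount):
    """Stand-in for computer-aided holography generating a pattern on demand."""
    return f"generated:{image_id}:{distortion}:{amount}"

def select_pattern(image_id, distortion, amount):
    key = (image_id, distortion, amount)
    if key in prerecorded:
        return prerecorded[key]        # fixed-media lookup
    return generate_pattern(*key)      # computer-generated fallback

print(select_pattern("cube", "twist", 30))  # pre-recorded pattern
print(select_pattern("cube", "twist", 22))  # generated on demand
```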
- The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the invention.
- FIG. 1 is a block diagram illustrating a holographic user interface according to an example embodiment of the present invention;
- FIG. 2 is a flow chart diagram illustrating a method for providing a three dimensional (3-D) interface with a system according to an example embodiment of the present invention;
- FIG. 3 is a perspective view of a sensor field used in connection with an example embodiment of the present invention;
- FIGS. 4A and 4B are front views of a holographic user interface device according to an example embodiment of the present invention;
- FIG. 5 is a perspective view of a diagram of a holographic user interface according to another example embodiment of the present invention;
- FIG. 6 is a flow diagram depicting example operations of an image manipulation system according to an example embodiment of the present invention;
- FIG. 7 is a schematic of an image manipulation system employing a twisting distortion in accordance with an example embodiment of the present invention;
- FIG. 8 is a schematic of an image manipulation system employing a squeezing distortion in accordance with an example embodiment of the present invention; and
- FIG. 9 is a schematic of an image manipulation system employing a cutting distortion in accordance with an example embodiment of the present invention.
- A description of example embodiments of the invention follows.
- The present invention, in accordance with one embodiment, relates to the creation of a holographic user interface which transforms the computing environment to enable a three dimensional (3-D) holographic-style user interface and display system. The system utilizes holographic projection technology along with a programmed quadrant-matrix sensor field to create multiple methods to select and interact with data and user interface tools and icons presented in a holographic format.
- FIG. 1 illustrates a holographic user interface 100 according to one example embodiment of the present invention. The holographic user interface 100 includes a processor 114 that operates software 112, controls a holographic image projector 116, and processes information obtained from sensors 118a, 118b. The projector may generate a 3-D display image 101, 102 within a 3-D coordinate system 150. The sensors 118a and 118b may be directed toward the 3-D coordinate system to sense a user interaction with images within the 3-D coordinate system. If a user were to interact with an image 101 or 102, the sensors 118a and 118b would provide coordinate information that the processor can correlate with the projected images 101 and 102 in the 3-D coordinate system. The sensed user interaction may include, but is not limited to, a sensed movement about the holographic image or a sensing of blocked light caused by a user "touching" the holographic image. Thermal and audio sensing may also be employed in the user interaction sensing.
- FIG. 2 is a flow chart that illustrates the method for providing a three dimensional (3-D) interface with a system. The interface generates (210) an image in a 3-D coordinate system. In operation, an embodiment of the interface deploys holographic information in the form of a user interface template as a default once turned on. Sensors on the interface sense (220) a user's interaction with the 3-D coordinate system. The sensing may occur through the use of matrixes or triangulated data points that correspond to specific functions and data displays which the system is capable of presenting. The interface may then correlate (230) the user's interaction with an image in the 3-D coordinate system. By sensing and correlating interaction with the 3-D coordinate system, the interface allows a computer system or display to interact with a user. The holographic data displayed by the system becomes the result of a selection process by the user, who triggers the data being displayed by keystrokes or by the use of the three dimensional interactive interface. Users' location commands are read by the system at their exact points, and the system then deploys the appropriate response or holographic media based upon the user's specific request made via the location of that request.
- FIG. 3 illustrates a sensor field used in connection with embodiments of the present invention. The embodiment illustrated in FIG. 3 includes four laser sensors 320a-d. The manipulatable interface may be a relatable and interactive holographic media via the use of a sprocketed sensor system which deploys from the display, either built in or via a retrofit hardware peripheral, creating a quadrilateral angle navigation system to determine the exact point 330 of a fingertip touch point 340 within a quadrant 310 (also referred to as a "3-D coordinate system"). This touch point, if effectively deployed by the user, is mapped to the image deployed by the holographic hardware and software system, as each image displayed in the system is displayed from an exact point at an exact place in space that has been preconfigured to match specific points on the quadrilateral sensor system. The points in space attached to programmed images are then matched to touch points made by the user. The touch point may trigger the same functions as a mouse and cursor.
- One skilled in the art will recognize that other sensing configurations or devices may be used to sense a location within a 3-D coordinate system. For example, the sensors may be laser sensors configured to provide data to triangulate a point within the 3-D coordinate system, photovoltaic sensors, photoelectric light sensors, or image sensors. The sensors may also be motion sensors, which may, for example, be deployed to sense the motion of a user's hand within the 3-D coordinate system. The sensors may be programmed to identify the specific location of the touch point 330, which may extend through multiple planar images, to identify a single image located at a 3-D coordinate space.
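- Because each image is displayed from an exact, preconfigured point in space, resolving a touch can reduce to nearest-point matching against those display points. A minimal Python sketch, in which the display points and the 2 cm tolerance are hypothetical assumptions:

```python
import math

# Preconfigured display points: each image is projected from a known,
# exact coordinate, so a touch is resolved by nearest-point matching.
DISPLAY_POINTS = {"document_icon": (0.10, 0.20, 0.40), "browser_icon": (0.25, 0.20, 0.40)}
TOLERANCE = 0.02  # assumed matching tolerance, in meters

def resolve_touch(touch):
    """Return the image whose preconfigured point is nearest the touch, if close enough."""
    best, best_d = None, float("inf")
    for name, point in DISPLAY_POINTS.items():
        d = math.dist(touch, point)
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= TOLERANCE else None

print(resolve_touch((0.102, 0.195, 0.401)))  # -> document_icon
print(resolve_touch((0.50, 0.50, 0.50)))     # -> None (no image there)
```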
- FIG. 4A illustrates a holographic user interface device 400A according to one embodiment of the present invention. The device 400A has a port 410A that may provide the output projector for the multi-dimensional display, and also the sensors for detecting user interaction. The projector and sensors map out a 3-D coordinate system 420 to serve as the holographic user interface. A communications port 430A, such as a universal serial bus ("USB") port or wireless connection, serves to allow the device 400A to communicate with a computer system. The holographic system may be based upon our prior holographic system technology filing, filed Apr. 5, 2007, U.S. application Ser. No. 11/397,147, which is incorporated herein by reference in its entirety, where the user interface icons and documents may be saved to a fixed media form and activated by commands sent from the operating system to the device managing the index on the holographic fixed media system and display. Similarly, any system that utilizes holographic displays may also be manipulated and selected using the sensor interface system.
- FIG. 4B illustrates holographic user interface devices 400A, as described in relation to FIG. 4A, and 400B. The holographic user interface device 400B may be identical to the holographic user interface device 400A, such that the device 400B may include ports 410B and 430B, and may be configured to provide a holographic image in the 3-D coordinate system 420. Multiple holographic user interface devices may be used to project a holographic image. For example, the user interface device 400A may be configured to project the holographic image from a desk or floor, while the second user interface device 400B may be configured to project the holographic image from a ceiling. If the port 410A of the first user interface device 400A is obstructed by a user or external object, the second interface device 400B may be used to reinforce the obstructed portion of the holographic image. Thus, the full holographic image may be viewed even in the presence of obstructions. It should be appreciated that any number of holographic user interface devices may be employed, and that any number of the user interface devices may be used to sense a user interaction. It should also be appreciated that although the second user interface device 400B has been illustrated in a 180° configuration with respect to the first user interface device 400A, any number of user interface devices may be included and the user interface devices may be offset by any distance or angle.
- FIG. 5 is a perspective view of a diagram of a holographic user interface 500 according to another embodiment of the present invention. The holographic user interface device may operate with a projection screen 580. Images 505 displayed by the projection screen 580 of the user interface 500 can include, but are not limited to, shapes, graphic images, animation sequences, documents, and audiovisual programs, which may be configured as a logical display featuring icons whose organization on the projection screen 580 may be based upon the user's patterns of use with the system. Examples of user patterns with the system may include, but are not limited to, always going online first, always working on a word document second, and always viewing pictures or videos from the user's hard drive. These icons could be presented, for example, to the user in an order of priority on the display representing the user's evolving use habits based upon history (e.g., distinct changes based upon day, time, and date). These icons, which may include traditional UI operating system icons such as Word document icons and portable document format ("PDF") icons, may be presented in a holographic format. Documents may be revised and read through in a traditional manner or through a holographic view. Any displayed holographic item may revert back to the flat display monitor, or vice versa, based upon a user command.
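- Ordering icons by priority from usage history can be as simple as ranking by frequency of use. The following Python sketch assumes a flat usage log; a fuller version might additionally weight entries by day, time, and date as described above:

```python
from collections import Counter

# Hypothetical usage log: one entry per application launch, oldest first.
usage_log = ["browser", "word_processor", "photos", "browser", "browser", "word_processor"]

def icon_order(log):
    """Return icon names ranked by how often the user launched them."""
    return [name for name, _ in Counter(log).most_common()]

print(icon_order(usage_log))  # ['browser', 'word_processor', 'photos']
```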
- It should be appreciated that the methods involved in providing a 3-D user interface system may be utilized by image manipulation systems. FIG. 6 is a flow diagram depicting example operations of an image manipulation system according to an example embodiment of the present invention. FIGS. 7-9 illustrate examples of image manipulation systems employing twisting, squeezing, and cutting distortions, respectively, according to example embodiments of the present invention.
- The image manipulation system, as shown in FIG. 7, may include a holographic user interface system 700. The user interface 700, similarly to the user interface systems described in FIGS. 1-5, may include a projector 725 configured to project a holographic image 715 in a three dimensional (3-D) coordinate system 720. The user interface may also include sensors 710 that may be configured to sense a user interaction with the holographic image 715, which may be a 3-D image (601). The user interaction may be in the form of direct interaction, for example a user's hand 740 interacting with the image 715. The user interaction may also be in the form of voice recognition, retinal scan, fingerprint matching, or any other known input means.
- Once a user interaction has been detected, the sensed data may be correlated with respect to the 3-D coordinate system (602). This correlation may be performed by a correlation unit 735, which may be located in the user interface system 700. It should also be appreciated that the correlation may be performed externally (e.g., by a host computer or any device connected through a network) by transmitting the sensed data through the data exchange port 730.
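- Shipping sensed data through the data exchange port for external correlation implies some wire encoding. A minimal Python sketch, where the JSON message schema and field names are illustrative assumptions:

```python
import json

def encode_sensed(sensor_id, point, timestamp):
    """Serialize one sensor reading into bytes ready for the data exchange port."""
    return json.dumps({"sensor": sensor_id, "point": point, "t": timestamp}).encode("utf-8")

packet = encode_sensed("laser_320a", [0.102, 0.195, 0.401], 1_700_000_000.0)
print(packet)  # bytes that a host computer could decode and correlate
```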
- The correlation of data may be used to interpret an intended distortion of the image 715. For example, a user may attempt to grab the image with a hand; upon the user making a twisting motion with the wrist, the correlation unit may interpret the distortion as an axial rotation, where the entire image is rotated as a whole about a center longitudinal axis, or as a twist, where only a portion of the image is rotated, as shown in FIG. 7. The correlation unit 735 may distinguish between the twist and rotation distortions by, for example, recognizing predefined indicators. Examples of indicators may be, but are not limited to, vocal commands, hand positions, or any other known input means. Thus, the selection of a distortion based on correlated data may further be made by the use of at least one indicator.
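- Indicator-based selection of a distortion might look like the following Python sketch, where the gesture label and indicator vocabulary are illustrative assumptions rather than anything specified in this disclosure:

```python
def classify_distortion(gesture, indicators):
    """Disambiguate a wrist-twist gesture using predefined indicators."""
    if gesture != "wrist_twist":
        return None
    if "voice:rotate" in indicators or "both_hands_open" in indicators:
        return "axial_rotation"   # rotate the whole image about its axis
    return "twist"                # rotate only the grabbed portion

print(classify_distortion("wrist_twist", {"voice:rotate"}))  # axial_rotation
print(classify_distortion("wrist_twist", set()))             # twist
```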
- Once the data has been correlated, the correlated data may be used to select a new image to be projected (603). The new projected image may be based on the original projected image having undergone the distortion; for example, FIG. 7 displays an image having undergone a twisting distortion. A new image may be projected during each stage of distortion; therefore, the image may appear to a user to be moving in real time as the user interacts with it. Each possible position of the image may be stored in a fixed media, for example the fixed media described in U.S. patent application Ser. No. 11/857,161 (herein incorporated by reference), where each position may be referenced to an interference pattern. The measured responses from the sensor may be used to determine which interference pattern is to be projected, in what order the interference patterns are to be projected, and at what rate the interference patterns should be projected. Thus, by projecting the interference pattern as dictated by the correlated data, the projection of the image may continuously change positions in accordance with the movement of the user's hand, or any other form of interaction input. It should also be appreciated that the correlated data may also be used to generate an interference pattern with the use of computer-aided holography. A variable medium, for example a liquid-crystal-based medium, may be used for temporary storage of the computer-generated interference pattern. Therefore, the distorted shape of the image, upon applying the twisting interference, may also be predetermined. It should be appreciated that the image may be twisted or rotated in any location and along any direction.
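- Turning measured motion into a playback plan (which interference patterns to project, in what order, and at what rate) might be sketched as follows in Python. The 5-degree pattern library and file naming are illustrative assumptions:

```python
def playback_plan(angles_deg, samples_per_second):
    """Map sampled twist angles to a sequence of stored patterns and a frame rate."""
    frames = [f"twist_{round(a / 5) * 5:03d}.holo" for a in angles_deg]  # nearest stored 5-degree step
    # Drop consecutive repeats so the image only changes when the hand moves.
    ordered = [f for i, f in enumerate(frames) if i == 0 or f != frames[i - 1]]
    return ordered, samples_per_second

frames, rate = playback_plan([0, 4, 9, 16, 16, 22], samples_per_second=30)
print(frames, rate)
# ['twist_000.holo', 'twist_005.holo', 'twist_010.holo', 'twist_015.holo', 'twist_020.holo'] 30
```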
- FIGS. 8 and 9 illustrate user interface systems similar to those discussed in FIGS. 1-7, and provide examples of squeezing and cutting distortions, respectively. Similarly to the example described in FIGS. 6 and 7, in the example shown in FIG. 8 a measured user interaction 740 may be correlated with the use of a correlation unit 735. The correlation unit may identify the squeezing by, for example, identifying an indicator of a user's fingers being in a pinching configuration. The correlated data may be used to select a new image to be projected. As shown in the example provided by FIG. 8, the user may distort the projected image by applying a squeezing distortion. The amount of distortion applied by the squeezing user interaction may be predetermined by the interference patterns stored in the fixed media, or by the computer-generated interference patterns. Furthermore, the distorted shape of the image, upon applying the squeezing interference, may also be predetermined. It should be appreciated that the image may be squeezed in any location or along any direction.
- In FIG. 9, an example of a cutting distortion is illustrated. A user interaction 740 may be correlated with the use of the correlation unit 735. The correlation unit may identify the cutting by, for example, identifying an indicator of a user's fingers being in a scissor configuration. The correlated data may be used to select a new image to be projected. As shown in the example provided by FIG. 9, the user may distort the projected image by applying a cutting distortion. The amount of distortion applied by the cutting user interaction may be predetermined by the interference patterns stored in the fixed media, or by the computer-generated interference patterns. Furthermore, the distorted shape of the image, upon applying the cutting interference, may also be predetermined. It should be appreciated that the image may be cut in any location or along any direction.
- In a similar manner, an image may also be displaced; for example, a user may move an image side to side, vertically or horizontally, and may also move the image backward or forward. An image may also be bent in any location of the image and along any direction. It should be appreciated that any form or type of image distortion known in the art may be applied. It should also be appreciated that any number of distortions may be applied to an image at a given time.
- It should also be appreciated that image manipulation systems may be used in tandem with voice recognition, retinal scan, fingerprint matching, and standard input systems. It should also be appreciated that at least a portion of the holographic image may become distorted as a result of a user input by means of voice recognition, retinal scan, fingerprint matching, or any other known input means. It should also be appreciated that any number of projection systems may be used in the image manipulation systems. Additionally, the sensors and/or correlation unit may be located externally from the user interface device. It should also be appreciated that any known 3-D imagery, for example volumetric imagery, may be employed by the image manipulation system.
- Those of ordinary skill in the art should recognize that methods involved in providing a 3-D user interface with a system may be embodied in a computer program product that includes a computer usable medium. For example, such a computer usable medium can include a readable memory device, such as a solid state memory device, a hard drive device, a CD-ROM, a DVD-ROM, or a computer diskette, having stored computer-readable program code segments. The computer readable medium can also include a communications or transmission medium, such as electromagnetic signals propagating on a computer network, a bus or a communications link, either optical, wired, or wireless, carrying program code segments as digital or analog data signals. The program code enables and supports computer implementation of the operations described in
FIGS. 1-9 or any other described embodiments.
- While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.
Claims (25)
1. A method of manipulating an original image through a user interface, the method comprising:
sensing a user interaction with the original image in a three dimensional coordinate system;
correlating the user interaction with the three dimensional coordinate system; and
projecting an updated image based on the correlated user interaction, the updated image being equivalent to the original image manipulated by a distortion.
2. The method of claim 1 wherein the original image and/or the updated image is a holographic image.
3. The method of claim 1 wherein sensing includes using laser sensors to geometrically identify a position within the three dimensional coordinate system.
4. The method of claim 3 wherein using laser sensors to geometrically identify includes using laser sensors to triangulate and/or quadrilate a position within the three dimensional coordinate system.
5. The method of claim 1 wherein the distortion is any one or combination of: twisting, bending, cutting, displacement, and squeezing.
6. The method of claim 1 further comprising selecting a distortion based on at least one indicator.
7. The method of claim 2 wherein projecting the updated image further includes selecting a pre-recorded interference pattern based on the correlated data.
8. The method of claim 2 wherein projecting the updated image further includes generating an interference pattern based on the correlated data.
9. A user interface system for manipulating an original image, the system comprising:
at least one sensor configured to sense a user interaction with the image in a three dimensional coordinate system;
a correlation unit configured to correlate the user interaction with the three dimensional coordinate system; and
a projecting unit configured to project an updated image based on the correlated user interaction, the updated image being equivalent to the original image manipulated by a distortion.
10. The system of claim 9 wherein the original image and/or the updated image is a holographic image.
11. The system of claim 9 wherein the at least one sensor is a laser sensor configured to geometrically identify a position within the three dimensional coordinate system.
12. The system of claim 11 wherein the at least one sensor is further configured to triangulate and/or quadrilate a position within the three dimensional coordinate system.
13. The system of claim 9 wherein the distortion is any one or combination of: twisting, bending, cutting, displacement, and squeezing.
14. The system of claim 13 wherein the correlation unit is further configured to select the distortion based on at least one indicator.
15. The system of claim 10 wherein the projecting unit is further configured to project the updated image by selecting a pre-recorded interference pattern based on the correlated data.
16. The system of claim 10 wherein the projecting unit is further configured to project the updated image by projecting a generated interference pattern based on the correlated data.
17. A computer program product having a computer program stored thereon, the computer program defined by instructions which, when executed by a processor, cause the processor to:
sense a user interaction with the original image in a three dimensional coordinate system;
correlate the user interaction with the three dimensional coordinate system; and
project an updated image based on the correlated user interaction, the updated image being equivalent to the original image manipulated by a distortion.
18. The program of claim 17 wherein the original image and/or the updated image is a holographic image.
19. The program of claim 17 wherein the instructions to sense a user interaction further include:
use laser sensors to geometrically identify a position within the three dimensional coordinate system.
20. The program of claim 17 wherein the instructions further include:
use laser sensors to triangulate and/or quadrilate a position within the three dimensional coordinate system.
21. The program of claim 17 wherein the distortion is any one of: twisting, bending, cutting, displacement, and squeezing.
22. The program of claim 21 wherein the instructions to correlate further include:
select the distortion based on at least one indicator.
23. The program of claim 18 wherein the instructions to project the updated image further include:
select a pre-recorded interference pattern based on the correlated data.
24. The program of claim 18 wherein the instructions to project the updated image further include:
generate an interference pattern based on the correlated data.
25. A user interface system for manipulating an original image, the system comprising:
means to sense a user interaction with the original image in a three dimensional coordinate system;
means to correlate the user interaction with the three dimensional coordinate system; and
means to project an updated image based on the correlated user interaction, the updated image being equivalent to the original image manipulated by a distortion.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/932,450 US20090109215A1 (en) | 2007-10-31 | 2007-10-31 | Method and apparatus for user interface communication with an image manipulator |
US13/049,684 US8319773B2 (en) | 2007-10-31 | 2011-03-16 | Method and apparatus for user interface communication with an image manipulator |
US13/654,752 US8902225B2 (en) | 2007-10-31 | 2012-10-18 | Method and apparatus for user interface communication with an image manipulator |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/932,450 US20090109215A1 (en) | 2007-10-31 | 2007-10-31 | Method and apparatus for user interface communication with an image manipulator |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/049,684 Continuation US8319773B2 (en) | 2007-10-31 | 2011-03-16 | Method and apparatus for user interface communication with an image manipulator |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090109215A1 true US20090109215A1 (en) | 2009-04-30 |
Family
ID=40582257
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/932,450 Abandoned US20090109215A1 (en) | 2007-10-31 | 2007-10-31 | Method and apparatus for user interface communication with an image manipulator |
US13/049,684 Expired - Fee Related US8319773B2 (en) | 2007-10-31 | 2011-03-16 | Method and apparatus for user interface communication with an image manipulator |
US13/654,752 Expired - Fee Related US8902225B2 (en) | 2007-10-31 | 2012-10-18 | Method and apparatus for user interface communication with an image manipulator |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/049,684 Expired - Fee Related US8319773B2 (en) | 2007-10-31 | 2011-03-16 | Method and apparatus for user interface communication with an image manipulator |
US13/654,752 Expired - Fee Related US8902225B2 (en) | 2007-10-31 | 2012-10-18 | Method and apparatus for user interface communication with an image manipulator |
Country Status (1)
Country | Link |
---|---|
US (3) | US20090109215A1 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090102603A1 (en) * | 2007-10-19 | 2009-04-23 | Fein Gene S | Method and apparatus for providing authentication with a user interface system |
US20090113348A1 (en) * | 2007-10-31 | 2009-04-30 | Fein Gene S | Method and apparatus for a user interface with priority data |
US20090109176A1 (en) * | 2007-10-31 | 2009-04-30 | Fein Gene S | Digital, data, and multimedia user interface with a keyboard |
US20090109174A1 (en) * | 2007-10-30 | 2009-04-30 | Fein Gene S | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units |
WO2013093837A1 (en) * | 2011-12-23 | 2013-06-27 | Koninklijke Philips Electronics N.V. | Method and apparatus for interactive display of three dimensional ultrasound images |
US20140198077A1 (en) * | 2008-10-10 | 2014-07-17 | Sony Corporation | Apparatus, system, method, and program for processing information |
US8902225B2 (en) | 2007-10-31 | 2014-12-02 | Genedics Llc | Method and apparatus for user interface communication with an image manipulator |
EP2472357A3 (en) * | 2010-12-31 | 2015-04-29 | LG Electronics Inc. | Mobile terminal and hologram controlling method thereof |
US20150205399A1 (en) * | 2011-08-26 | 2015-07-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9110563B2 (en) | 2007-10-31 | 2015-08-18 | Genedics Llc | Method and apparatus for user interface of input devices |
US20160091979A1 (en) * | 2014-09-30 | 2016-03-31 | Shenzhen Estar Technology Group Co., Ltd. | Interactive displaying method, control method and system for achieving displaying of a holographic image |
KR101861667B1 (en) * | 2011-09-30 | 2018-05-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
CN108111538A (en) * | 2018-01-25 | 2018-06-01 | 芜湖应天光电科技有限责任公司 | Smart projector speech control system and method based on voiceprint recognition |
WO2019061858A1 (en) * | 2017-09-26 | 2019-04-04 | 歌尔科技有限公司 | Coordinate mapping method and device for projection area, projector, and projection system |
US11176695B2 (en) * | 2018-12-11 | 2021-11-16 | Canon Kabushiki Kaisha | Shape information acquisition apparatus and shape information acquisition method |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013074997A1 (en) * | 2011-11-18 | 2013-05-23 | Infinite Z, Inc. | Indirect 3d scene positioning control |
US10268277B2 (en) | 2014-09-30 | 2019-04-23 | Hewlett-Packard Development Company, L.P. | Gesture based manipulation of three-dimensional images |
KR20160085613A (en) * | 2015-01-08 | 2016-07-18 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US10101816B2 (en) * | 2015-02-27 | 2018-10-16 | Rovi Guides, Inc. | Systems and methods for displaying media assets associated with holographic structures |
US9713871B2 (en) | 2015-04-27 | 2017-07-25 | Microsoft Technology Licensing, Llc | Enhanced configuration and control of robots |
US10007413B2 (en) | 2015-04-27 | 2018-06-26 | Microsoft Technology Licensing, Llc | Mixed environment display of attached control elements |
US11449146B2 (en) | 2015-06-10 | 2022-09-20 | Wayne Patrick O'Brien | Interactive holographic human-computer interface |
WO2017135481A1 (en) * | 2016-02-04 | 2017-08-10 | 엘지전자 주식회사 | Mobile terminal and control method therefor |
US10234935B2 (en) | 2016-08-11 | 2019-03-19 | Microsoft Technology Licensing, Llc | Mediation of interaction methodologies in immersive environments |
US9983684B2 (en) | 2016-11-02 | 2018-05-29 | Microsoft Technology Licensing, Llc | Virtual affordance display at virtual target |
US10402707B2 (en) * | 2017-01-04 | 2019-09-03 | Justin Garak | Interactive optical code creation |
US11288733B2 (en) * | 2018-11-14 | 2022-03-29 | Mastercard International Incorporated | Interactive 3D image projection systems and methods |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6031519A (en) * | 1997-12-30 | 2000-02-29 | O'brien; Wayne P. | Holographic direct manipulation interface |
US6211848B1 (en) * | 1998-05-15 | 2001-04-03 | Massachusetts Institute Of Technology | Dynamic holographic video with haptic interaction |
US6377238B1 (en) * | 1993-04-28 | 2002-04-23 | Mcpheters Robert Douglas | Holographic control arrangement |
US20020070921A1 (en) * | 2000-12-13 | 2002-06-13 | Feldman Stephen E. | Holographic keyboard |
US7054045B2 (en) * | 2003-07-03 | 2006-05-30 | Holotouch, Inc. | Holographic human-machine interfaces |
US7185271B2 (en) * | 2002-08-20 | 2007-02-27 | Hewlett-Packard Development Company, L.P. | Methods and systems for implementing auto-complete in a web page |
US20070055949A1 (en) * | 2005-01-29 | 2007-03-08 | Nicholas Thomas | Methods and apparatus for rfid interface control |
US7262783B2 (en) * | 2004-03-03 | 2007-08-28 | Virtual Iris Studios, Inc. | System for delivering and enabling interactivity with images |
US7538746B2 (en) * | 2004-07-23 | 2009-05-26 | Lockheed Martin Corporation | Direct ocular virtual 3D workspace |
US20090184851A1 (en) * | 2004-06-30 | 2009-07-23 | Giorgio Grego | Inputting information using holographic techniques |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4593967A (en) | 1984-11-01 | 1986-06-10 | Honeywell Inc. | 3-D active vision sensor |
US4818048A (en) | 1987-01-06 | 1989-04-04 | Hughes Aircraft Company | Holographic head-up control panel |
US5675437A (en) | 1992-11-27 | 1997-10-07 | Voxel | Light control film for use in viewing holograms and related method |
GB9505664D0 (en) | 1995-03-21 | 1995-05-10 | Central Research Lab Ltd | An interactive display and input device |
US6147773A (en) | 1995-09-05 | 2000-11-14 | Hewlett-Packard Company | System and method for a communication system |
US5812292A (en) | 1995-11-27 | 1998-09-22 | The United States Of America As Represented By The Secretary Of The Navy | Optical correlator using optical delay loops |
US6388657B1 (en) | 1997-12-31 | 2002-05-14 | Anthony James Francis Natoli | Virtual reality keyboard system and method |
US6064354A (en) | 1998-07-01 | 2000-05-16 | Deluca; Michael Joseph | Stereoscopic user interface method and apparatus |
EP1028306B1 (en) * | 1998-08-28 | 2008-10-15 | Mitutoyo Corporation | Apparatus and method concerning analysis and generation of a part program for measuring coordinates and surface properties |
US6710770B2 (en) | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US6507353B1 (en) | 1999-12-10 | 2003-01-14 | Godot Huard | Influencing virtual actors in an interactive environment |
GB0012275D0 (en) | 2000-05-22 | 2000-07-12 | Secr Defence Brit | Three dimensional human computer interface |
US6650318B1 (en) | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
CA2410427A1 (en) | 2000-05-29 | 2001-12-06 | Vkb Inc. | Virtual data entry device and method for input of alphanumeric and other data |
US6667751B1 (en) | 2000-07-13 | 2003-12-23 | International Business Machines Corporation | Linear web browser history viewer |
FI116425B (en) | 2002-01-18 | 2005-11-15 | Nokia Corp | Method and apparatus for integrating an extensive keyboard into a small apparatus |
WO2003107039A2 (en) | 2002-06-13 | 2003-12-24 | I See Tech Ltd. | Method and apparatus for a multisensor imaging and scene interpretation system to aid the visually impaired |
US8458028B2 (en) * | 2002-10-16 | 2013-06-04 | Barbaro Technologies | System and method for integrating business-related content into an electronic game |
WO2004044769A1 (en) | 2002-11-11 | 2004-05-27 | Zxibix, Inc. | System and method of facilitating and evaluating user thinking about an arbitrary problem |
US7671843B2 (en) | 2002-11-12 | 2010-03-02 | Steve Montellese | Virtual holographic input method and device |
US7644433B2 (en) | 2002-12-23 | 2010-01-05 | Authernative, Inc. | Authentication system and method based upon random partial pattern recognition |
US20070159453A1 (en) | 2004-01-15 | 2007-07-12 | Mikio Inoue | Mobile communication terminal |
US20050277467A1 (en) | 2004-06-14 | 2005-12-15 | Jcm American Corporation, A Nevada Corporation | Gaming machine using holographic imaging |
US7376903B2 (en) | 2004-06-29 | 2008-05-20 | Ge Medical Systems Information Technologies | 3D display system and method |
US7634741B2 (en) | 2004-08-31 | 2009-12-15 | Sap Ag | Method and apparatus for managing a selection list based on previous entries |
JP2006318515A (en) | 2004-09-10 | 2006-11-24 | Ricoh Co Ltd | Hologram element, production method thereof and optical header |
US20060167971A1 (en) | 2004-12-30 | 2006-07-27 | Sheldon Breiner | System and method for collecting and disseminating human-observable data |
US20060229108A1 (en) | 2005-02-04 | 2006-10-12 | Cehelnik Thomas G | Mobile phone extension and data interface via an audio headset connection |
US7844599B2 (en) | 2005-08-24 | 2010-11-30 | Yahoo! Inc. | Biasing queries to determine suggested queries |
GB0518912D0 (en) | 2005-09-16 | 2005-10-26 | Light Blue Optics Ltd | Methods and apparatus for displaying images using holograms |
US8471812B2 (en) | 2005-09-23 | 2013-06-25 | Jesse C. Bunch | Pointing and identification device |
US20070169066A1 (en) | 2005-11-17 | 2007-07-19 | Nielsen Spencer J | System and method for an extensible 3D interface programming framework |
US7644054B2 (en) | 2005-11-23 | 2010-01-05 | Veveo, Inc. | System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and typographic errors |
US8279168B2 (en) | 2005-12-09 | 2012-10-02 | Edge 3 Technologies Llc | Three-dimensional virtual-touch human-machine interface system and method therefor |
US7724407B2 (en) | 2006-01-24 | 2010-05-25 | American Air Liquide, Inc. | Holographic display and controls applied to gas installations |
US20070266428A1 (en) | 2006-03-06 | 2007-11-15 | James Downes | Method, System, And Apparatus For Nested Security Access/Authentication |
US8334841B2 (en) | 2006-03-13 | 2012-12-18 | Navisense | Virtual user interface method and system thereof |
US9646415B2 (en) | 2006-05-16 | 2017-05-09 | Underground Imaging Technologies, Inc. | System and method for visualizing multiple-sensor subsurface imaging data |
US8086971B2 (en) | 2006-06-28 | 2011-12-27 | Nokia Corporation | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
US20080231926A1 (en) | 2007-03-19 | 2008-09-25 | Klug Michael A | Systems and Methods for Updating Dynamic Three-Dimensional Displays with User Input |
US7881901B2 (en) | 2007-09-18 | 2011-02-01 | Gefemer Research Acquisitions, Llc | Method and apparatus for holographic user interface communication |
US20090102603A1 (en) | 2007-10-19 | 2009-04-23 | Fein Gene S | Method and apparatus for providing authentication with a user interface system |
US20090109174A1 (en) | 2007-10-30 | 2009-04-30 | Fein Gene S | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units |
US8127251B2 (en) | 2007-10-31 | 2012-02-28 | Fimed Properties Ag Limited Liability Company | Method and apparatus for a user interface with priority data |
US8212768B2 (en) | 2007-10-31 | 2012-07-03 | Fimed Properties Ag Limited Liability Company | Digital, data, and multimedia user interface with a keyboard |
US8477098B2 (en) | 2007-10-31 | 2013-07-02 | Gene S. Fein | Method and apparatus for user interface of input devices |
US20090109215A1 (en) | 2007-10-31 | 2009-04-30 | Fein Gene S | Method and apparatus for user interface communication with an image manipulator |
- 2007-10-31: US 11/932,450 filed, published as US20090109215A1 (not active, Abandoned)
- 2011-03-16: US 13/049,684 filed, granted as US8319773B2 (not active, Expired - Fee Related)
- 2012-10-18: US 13/654,752 filed, granted as US8902225B2 (not active, Expired - Fee Related)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6377238B1 (en) * | 1993-04-28 | 2002-04-23 | Mcpheters Robert Douglas | Holographic control arrangement |
US6031519A (en) * | 1997-12-30 | 2000-02-29 | O'brien; Wayne P. | Holographic direct manipulation interface |
US6211848B1 (en) * | 1998-05-15 | 2001-04-03 | Massachusetts Institute Of Technology | Dynamic holographic video with haptic interaction |
US20020070921A1 (en) * | 2000-12-13 | 2002-06-13 | Feldman Stephen E. | Holographic keyboard |
US7185271B2 (en) * | 2002-08-20 | 2007-02-27 | Hewlett-Packard Development Company, L.P. | Methods and systems for implementing auto-complete in a web page |
US7054045B2 (en) * | 2003-07-03 | 2006-05-30 | Holotouch, Inc. | Holographic human-machine interfaces |
US7262783B2 (en) * | 2004-03-03 | 2007-08-28 | Virtual Iris Studios, Inc. | System for delivering and enabling interactivity with images |
US20090184851A1 (en) * | 2004-06-30 | 2009-07-23 | Giorgio Grego | Inputting information using holographic techniques |
US7538746B2 (en) * | 2004-07-23 | 2009-05-26 | Lockheed Martin Corporation | Direct ocular virtual 3D workspace |
US20070055949A1 (en) * | 2005-01-29 | 2007-03-08 | Nicholas Thomas | Methods and apparatus for rfid interface control |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090102603A1 (en) * | 2007-10-19 | 2009-04-23 | Fein Gene S | Method and apparatus for providing authentication with a user interface system |
US20090109174A1 (en) * | 2007-10-30 | 2009-04-30 | Fein Gene S | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units |
US9335890B2 (en) | 2007-10-31 | 2016-05-10 | Genedics Llc | Method and apparatus for user interface of input devices |
US20090113348A1 (en) * | 2007-10-31 | 2009-04-30 | Fein Gene S | Method and apparatus for a user interface with priority data |
US20090109176A1 (en) * | 2007-10-31 | 2009-04-30 | Fein Gene S | Digital, data, and multimedia user interface with a keyboard |
US8127251B2 (en) | 2007-10-31 | 2012-02-28 | Fimed Properties Ag Limited Liability Company | Method and apparatus for a user interface with priority data |
US8212768B2 (en) | 2007-10-31 | 2012-07-03 | Fimed Properties Ag Limited Liability Company | Digital, data, and multimedia user interface with a keyboard |
US9939987B2 (en) | 2007-10-31 | 2018-04-10 | Genedics Llc | Method and apparatus for user interface of input devices |
US8902225B2 (en) | 2007-10-31 | 2014-12-02 | Genedics Llc | Method and apparatus for user interface communication with an image manipulator |
US9110563B2 (en) | 2007-10-31 | 2015-08-18 | Genedics Llc | Method and apparatus for user interface of input devices |
US20140198077A1 (en) * | 2008-10-10 | 2014-07-17 | Sony Corporation | Apparatus, system, method, and program for processing information |
EP2472357A3 (en) * | 2010-12-31 | 2015-04-29 | LG Electronics Inc. | Mobile terminal and hologram controlling method thereof |
US20150205399A1 (en) * | 2011-08-26 | 2015-07-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9256323B2 (en) * | 2011-08-26 | 2016-02-09 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
KR101861667B1 (en) * | 2011-09-30 | 2018-05-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
WO2013093837A1 (en) * | 2011-12-23 | 2013-06-27 | Koninklijke Philips Electronics N.V. | Method and apparatus for interactive display of three dimensional ultrasound images |
US10966684B2 (en) | 2011-12-23 | 2021-04-06 | Koninklijke Philips N.V | Method and apparatus for interactive display of three dimensional ultrasound images |
US20160091979A1 (en) * | 2014-09-30 | 2016-03-31 | Shenzhen Estar Technology Group Co., Ltd. | Interactive displaying method, control method and system for achieving displaying of a holographic image |
US9753547B2 (en) * | 2014-09-30 | 2017-09-05 | Shenzhen Estar Technology Group Co., Ltd. | Interactive displaying method, control method and system for achieving displaying of a holographic image |
WO2019061858A1 (en) * | 2017-09-26 | 2019-04-04 | 歌尔科技有限公司 | Coordinate mapping method and device for projection area, projector, and projection system |
CN108111538A (en) * | 2018-01-25 | 2018-06-01 | 芜湖应天光电科技有限责任公司 | Smart projector speech control system and method based on voiceprint recognition |
US11176695B2 (en) * | 2018-12-11 | 2021-11-16 | Canon Kabushiki Kaisha | Shape information acquisition apparatus and shape information acquisition method |
Also Published As
Publication number | Publication date |
---|---|
US8902225B2 (en) | 2014-12-02 |
US20120038547A1 (en) | 2012-02-16 |
US20130038528A1 (en) | 2013-02-14 |
US8319773B2 (en) | 2012-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8319773B2 (en) | Method and apparatus for user interface communication with an image manipulator | |
CA2699560C (en) | Method and apparatus for holographic user interface communication | |
US20090102603A1 (en) | Method and apparatus for providing authentication with a user interface system | |
US8212768B2 (en) | Digital, data, and multimedia user interface with a keyboard | |
US11954808B2 (en) | Rerendering a position of a hand to decrease a size of a hand to create a realistic virtual/augmented reality environment | |
US12032746B2 (en) | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments | |
Zhu et al. | Bishare: Exploring bidirectional interactions between smartphones and head-mounted augmented reality | |
US9939987B2 (en) | Method and apparatus for user interface of input devices | |
US8127251B2 (en) | Method and apparatus for a user interface with priority data | |
US9092053B2 (en) | Systems and methods for adjusting a display based on the user's position | |
US20090109174A1 (en) | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units | |
Osman et al. | A 3d annotation interface using the divine visual display | |
MacAllister et al. | Implementing native support for oculus and leap motion in a commercial engineering visualization and analysis platform | |
Tahir et al. | Interactive Slide Navigation: An Approach for Manipulating Slides with Augmented Reality Markers | |
Wesche et al. | Immersive Interaction | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: GENEDICS LLC, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: FEIN, GENE S., MERRITT, ELI A.S., MERRITT, EDWARD; SIGNING DATES FROM 20140627 TO 20140701; REEL/FRAME: 033234/0314 |