US20050206583A1 - Selectively controllable heads-up display system - Google Patents
Selectively controllable heads-up display system Download PDFInfo
- Publication number
- US20050206583A1 US20050206583A1 US11/042,662 US4266205A US2005206583A1 US 20050206583 A1 US20050206583 A1 US 20050206583A1 US 4266205 A US4266205 A US 4266205A US 2005206583 A1 US2005206583 A1 US 2005206583A1
- Authority
- US
- United States
- Prior art keywords
- display
- surgeon
- eye
- user
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7445—Display arrangements, e.g. multiple display units
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the inventions relate to electronic display systems and control systems therefor. More particularly the inventions relate to selectively operable heads-up display systems for presenting information and/or image(s) to the user.
- the heads-up display is configured for use by medical technicians or personnel, such as surgeons performing an operation.
- a heads-up display is generally defined as an electronically generated display containing information or data that is superimposed on an observer's normal field of view.
- HUD heads-up display
- One such application is for use by pilots of aircraft.
- a semi-transparent display screen is located generally in front of the eyes of the pilot (i.e. a screen mounted on the pilot's head or helmet, or in the view of the aircraft windshield).
- Such a system enables a pilot to concentrate on the difficult tasks associated with flying the aircraft, without diverting his attention to scan or examine a wide array of instruments.
- surgeons must keep track of many different types of information during an operation. For example, a surgeon must carefully view or monitor the physical surgery while simultaneously monitoring a patient's condition (e.g., blood pressure, heart rate, pulse, etc.). In addition, depending on the procedure, the surgeon must also monitor the status and settings of surgical equipment and tools. Although the additional information is necessary and important, monitoring such information often diverts the surgeon from the immediate task at hand.
- a patient's condition e.g., blood pressure, heart rate, pulse, etc.
- surgeons use many different types of displays and must continually monitor many different sources of information.
- more tools and sources of data become available to surgeons for use during operations, more opportunities for distraction arise.
- prior attempts in the medical field to fulfill that need have been unsatisfactory.
- Efforts have also been made to use head-mounted displays in augmented reality simulations for medical applications wherein a desired image or three-dimensional model is superimposed on the real scene of a patient. For example, it was reported that a research effort in the Department of Computer Science at the University of North Carolina has attempted to develop a see-through head-mounted display that superimposed a computer-generated three-dimensional image of the internal features of a subject over the real-life view of the subject.
- Views acquired from several cameras are then displayed on a head-mounted display with an integrated tracking system to provide images of the remote environment.
- the explained purpose of the effort was to duplicate, at a remote location, a three-dimensional virtual reality environment of a medical room.
- the article does not disclose the use of see-through displays providing a surgeon with the ability to select and display additional forms of data, or to superimpose data over a real-life view of the patient or surgical site.
- the “inserted image” corresponding to the user's gaze point is converted from low resolution to high-resolution.
- the user can not select additional or alternative forms of data or different images to be superimposed over the primary image on the head-mounted display.
- U.S. Pat. No. 4,988,976 discloses a motorcycle helmet that displays data or information such as speed, time, rpm's, fuel, oil, etc. on the transparent visor (i.e. vacuum fluorescent display) of the rider.
- Head-mounted displays that are worn in front of the user's eyes or worn as eye spectacles also exist. For example, see the following U.S. Pat. Nos. 5,129,716; 5,151,722; 5,003,300; 5,162,828; 5,331,333; 5,281,957; 5,334,991; 5,450,596 and 5,392,158.
- Also pertinent to this invention is the field of eye-tracking to control various computer or imaging functions.
- Various systems unrelated to the medical field have used eye-tracking for controlling a field of view.
- U.S. Pat. No. 4,028,725 discloses an eye and head tracking system that controls a beam-splitter and retains the high-resolution part of the image in the field of view.
- the eye-tracking is carried out by infrared detection (i.e. see U.S. Pat. No. 3,724,932). See also U.S. Pat. Nos.
- Dodds discloses a video camera system that records scene conditions and heads-up displays.
- a selectively operable, head-mounted, see-through viewing display system for presenting desired information and/or images to a user, while at the same time allowing the user to view the real-world environment in which he or she operates.
- the above and other objects are achieved in an improved, selectively controllable system for presenting desired data on a head-mounted (or “heads-up”) display.
- the system includes a command computer processor for receiving inputs that represent data and for controlling the display of desired data.
- the computer communicates with and controls the heads-up display system, which is configured to display the desired data in a manner that is aligned in the user's field of view.
- the heads-up display includes a user interface incorporating “hands-free” menu selection to allow the user to control the display of various types of data.
- the hands-free menu selection is carried out using an eye-tracking cursor and a speech recognition computer to point to and select specific menus and operations.
- the above and other objects are also achieved in an user-controllable heads-up system for presenting medical data to a physician.
- the system includes a command control computer for receiving inputs defining medical data and for controlling the display of that data on a head's-up display screen in the normal field of view of the physician.
- the heads-up display provides the physician with a “user interface” including menus and associated operations that can be selected with an eye-tracking cursor.
- the system also includes a microphone and speaker so that a physician can communicate with other personnel and computers both locally and remote from the cite.
- the command computer includes a speech recognition processor to respond to spoken commands of the physician.
- the command computer also communicates with and receives a wide array of data from other computers networked therewith.
- the physician can select the specific data to be displayed on the screen.
- the physician can, with the eye-tracking cursor, control various medical imaging devices.
- a see-through computer display screen is mounted on a head piece that is worn by the user.
- a command computer controls a user interface so that command icons or menus are displayed in a super-imposed manner on the see-through, head-mounted display, thereby allowing the user to see both the normal field of view and the user interface.
- the user interface is provided with a “point-and-select” type of cursor.
- An eye-tracking system is integrated with the command control computer and the user interface to monitor the user's eye movement and to correspondingly control movement of the cursor.
- the user selects various computer operations from menus contained in the user interface by moving the eye-tracking cursor to selected menus or icons.
- the user can control the command computer to selectively display on the see-through HUD screen numerous items of data or images, while still seeing the normal field of view.
- the preferred embodiment of this invention is configured for use by a surgeon performing an operation on a patient.
- the invention is equally applicable to any environment in which the user is conducting precision or detailed procedures with his or hands on a relatively stationary subject, and where the user would find it advantageous to see data superimposed over the normal field of view.
- the potential applications are too numerous to mention, but would include forensics, microelectronics, biotechnology, chemistry, etc.
- the preferred embodiment refers to use by a surgeon, and to the acquisition and display of medical data, its applicability is much broader, and the claims should be interpreted accordingly.
- the description of the preferred embodiments make reference to standard medical imaging devices that are used to generate images to be displayed on the heads-up display.
- the disclosure specifically references several examples of such devices, including video cameras, x-ray devices, CAT and NMR scanners, etc.
- numerous other medical imaging systems are well known to exist, and most likely, numerous improved imaging devices will be developed in the future.
- the present invention does not depend on the type of imaging device that is implemented.
- the inventions described herein are not to be limited to the specific scanning or imaging devices disclosed in the preferred embodiments, but rather, are intended to be used with any and all applicable medical imaging devices.
- the preferred embodiments depicted in the drawings show a single generic imaging device mounted on a manipulator arm. Numerous other tool and manipulator configurations, and multiple imaging devices, can be substituted for the single device.
- the detailed description below shows at least two embodiments for the display screen.
- the preferred embodiment discloses the display screen mounted on the head of the user.
- the second embodiment shows the display screen positioned between the user and the subject, in a manner that is not mounted upon or supported by the head of the user. Additional embodiments also exist, and need not be disclosed.
- the first embodiment can be modified for use in the eye-piece of a scope of any form, such as used in micro-electronics, biotech, medicine, forensics, chemical research, etc.
- the specific arrangement of the icons and menus that appear on the HUD screen, and the associated operations performed by those icons and menu items, are a matter of choice for the specific application.
- the invention is not intended to be limited to the specific arrangement and contents of the icons and menus shown and described in the preferred embodiments.
- the icons and menu items for a selectively controllable heads-up display used by a dentist would likely be different than the arrangement used for a micro-electronics engineer.
- FIGS. 1A, 1B and 1 C depict side, top and front views, respectively, of a selectable heads-up display worn by a technician, such as a surgeon.
- FIG. 2 shows an example for a basic configuration of an integrated head mounted display (HMD) and associated computer system used by a technician, such as a surgeon.
- HMD head mounted display
- FIG. 3 is a block diagram of the primary components of the heads-up surgeon's display of FIG. 2 .
- FIG. 4 is a more detailed block diagram of a preferred embodiment of the heads-up display system.
- FIG. 5 depicts an embodiment for the physical eye-tracking system implemented with an infrared laser.
- FIG. 6A depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display on a transparent or see-through screen.
- FIG. 6B depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a patient's vital signs selected for display on a transparent or see-through screen.
- FIG. 6C depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a view from one of the cameras selected for display on a transparent or see-through screen.
- FIG. 7A depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a view from one of the cameras selected for display on a portion of the screen that has been made selectively non-transparent.
- FIG. 7B depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple x-ray images from the computer or one of the cameras selected for display on a portion of the screen that has been made selectively non-transparent.
- FIG. 7C depicts an embodiment for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple forms of data and images displayed in windows on a portion of the screen.
- FIG. 7D depicts another form for the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple forms of data and images displayed in various windows on a portion of the screen.
- FIGS. 8A, 8B and 8 C depict various views of an alternative embodiment implemented with a transparent heads-up display that is not worn by the surgeon, but rather, is movably stationed over the patient.
- FIGS. 1A, 1B and 1 C are three views of a head-mounted, selectable, display system 10 .
- the display system 10 is worn by a surgeon 12 performing an operation.
- the system is easily programmed for use by any individual performing detailed procedures in which it is advantageous to see the normal field of view, while also having access to and seeing in that field of view a variety of forms of data relating to the procedure.
- a display in accordance with this invention will have many applications, but is primarily intended for procedures where detailed work is performed on relatively stationary objects. Accordingly, while the description below refers repeatedly to the user as a “surgeon”, it should be understood that the other users are included in the scope of the invention.
- the major components of the HUD system 10 as worn by the surgeon 12 include a display screen 14 , microphone 16 , speaker 17 , display driver 18 , camera 29 , display mirrors 20 and 22 , light 31 , eye-tracking laser 24 , eye-tracking detector 26 , and eye-tracking optics 28 .
- each of these components are integrated into a single, adjustable head piece that is placed on the surgeon's head 30 . Once placed on a surgeon's head 30 , the various components are adjusted for proper operation.
- the screen 14 is positioned comfortably in front of and in the normal line of sight of the surgeon's eyes 32 , and the microphone 16 (placed in front of surgical mask 19 ) and speaker 17 are positioned so that the surgeon 12 can communicate with selected medical assistants and computers including an electronic speech recognition system, as discussed in greater detail below.
- the display driver 18 and display mirrors 20 and 22 are adjusted to superimpose selected images and data with light rays 4 from outside of HUD 10 on the display screen 14 via optical path 6 .
- the eye-tracking laser 24 , eye-tracking detector 26 , and eye-tracking optics 28 are aligned to detect and communicate eye movement to an eye-tracking computer, as also discussed below.
- an eyeglass frame may be used to support the display surface 14 , eye-tracking laser 24 , eye-tracking detector 26 , eye-tracking optics 28 , or other components as desired.
- the surgeon 12 wears the HUD system 10 to simultaneously view selected data and images while performing an operation on a patient 34 .
- the surgeon 12 can selectively control the display screen 14 to operate between opaque or translucent modes.
- the screen 14 will display primarily the data or images generated from a control computer 36 or one or more video or medical imaging input device(s) 38 , for example, when conducting particularly detailed or internal surgery.
- the surgeon 12 will be able to see through display screen 14 to the patient 34 , while at the same time seeing the data or images generated by the computer 36 or imaging input device(s) 38 .
- imaging input device 38 is shown in FIG. 2 .
- multiple numbers, and any and all forms, of image-generating systems can be employed in the proposed system.
- CCD, video, x-ray, NMR, CAT, and all other medical imaging systems can be employed.
- the imaging input device 38 is mounted on a moveable and computer controllable manipulator assembly or arm 39 , so that it can be controlled by the surgeon 12 through the HUD system to move to selected parts of the patient 34 , and to obtain and magnify images of particular parts of the body, tissue or organs undergoing surgery.
- a moveable and computer controllable manipulator assembly or arm 39 so that it can be controlled by the surgeon 12 through the HUD system to move to selected parts of the patient 34 , and to obtain and magnify images of particular parts of the body, tissue or organs undergoing surgery.
- the surgeon 12 can command the HUD system 10 , under control of computer 36 , to selectively display on screen 14 various forms of data, graphics or images, including magnified or otherwise modified images from the imaging input device(s) 38 , while at the same time looking through the screen 14 to view the patient 34 and the normal, real-life environment of the operating room.
- the surgeon 12 is able to directly view the patient 34 , while at the same time, select from many forms of data for display on the HUD screen 14 .
- the surgeon 12 may wish to see both the movements of his or her hands, and a superimposed magnified view from one of the image input device(s) 38 .
- the surgeon 12 can control the video device 38 to move to and focus on the critical area or surgical site, and to obtain and magnify and image for display on the HUD display 14 .
- the surgeon can control the computer system 36 to record the generated images, and then to display on the HUD screen 14 selected parts thereof after being magnified or otherwise computer enhanced.
- the surgeon 12 has the great advantage of being able to simultaneously control the HUD and data acquisition systems, while also looking through the HUD display 14 to watch minute hand movements and other aspects of the local environment.
- the surgeon 12 may command the computer 36 in several ways.
- the HUD system 10 incorporates eye-tracking to control a curser that is displayed on the HUD screen 14 . More specifically, as shown graphically in FIGS. 6A-6C , the standard display 14 includes menu items or icons that can be selected by a curser 40 that is in turn controlled by an eye-tracking system. For example, when the surgeon 12 focuses his eyes on a selected icon or menu displayed on the HUD screen 14 , the eye-tracking system will correspondingly cause the curser 40 to move or “track” over the selected icon or menu item.
- surgeon can then select the specific icon or menu by voice command or with a foot-operated select button (not shown) operating in a manner similar to the well known “mouse” button.
- a foot-operated select button (not shown) operating in a manner similar to the well known “mouse” button.
- the surgeon operates eye-tracking curser 40 in a “hands-free” manner to move onto icons and to select various imaging systems, computers or other data sources for display on the HUD display 14 .
- the HUD system 10 includes a standard speech recognition sub-system integrated with the command operations.
- the surgeon 12 can speak select speech commands or select words to select specific menus or to initiate computer operations to acquire and display select images or data.
- the surgeon can use his or her eye to move the curser 40 to a particular menu or icon, and then speak commands to perform various operations associated specifically with that menu or icon, such as obtaining or magnifying images, selecting particular parts of patient histories, etc.
- the HUD display system 10 preferably communicates with the command computer 36 via any appropriate form of wireless communication, as is well known in the art of computer networking.
- the surgeon is shown wearing a radio transmitter 44 and an antenna 46 that operate in a manner that is well known in the art to communicate with computer 36 .
- wireless communication it is expressly noted that any and all forms of wireless or wired communication can be used, and as a result, the invention is not intended to be limited to any specific form of data communication between the HUD system and the computer 36 .
- the surgeon may use the HUD system as a hands-free interface to control an imaging input device 38 (shown as a camera in FIG. 2 ) and its associated manipulator assembly 39 to obtain images of the patient 34 , and to display the images on the HUD display 14 .
- the surgeon can operate the HUD system as an interface to obtain from computer 36 (or from additional remote or local computers not shown) reference or other patient data, and to display that data on the HUD display 14 .
- FIG. 3 Shown in FIG. 3 is a general block diagram of the HUD system integrated in a surgeon's environment in accordance with the present invention.
- the HUD system 10 including the referenced display, eye-tracking, speech recognition and communication elements, is coupled to a main command control computer 36 .
- Also coupled to the command computer 36 are the numerous sensor inputs 47 , such as those that monitor vital signs and acquire medical images.
- An electronic data base 48 of the patient's history is either coupled to or included in the memory of the command computer 36 .
- the command computer 36 has access over a standard network 50 to remote sites and computers (not shown).
- sensor inputs 47 There are numerous types of sensor inputs 47 that will monitor the patient 34 and generate data of interest to the surgeon 12 . Each such sensor 47 is considered old in the art, and operates to monitor and generate computer data defining the characteristics of the patient 34 in a manner well known to those of ordinary skill in the art. Thus, it is expressly noted that while several specific types of sensor inputs may be described in this specification, any and all types of sensor inputs 47 can be used, as long as the data is generated in a manner that is made accessible to the surgeon 12 by the command control computer 36 .
- the patient data base 48 includes any and all type of patient data that a surgeon may wish to access, including, for example, the patient's history and prior medical images. While the data base 48 is shown in FIG. 3 as being separate from the command computer 36 , the data base can also be loaded into the data storage portion of the command computer 36 . Likewise, the patient data base 48 may be located remote from the command computer 36 , and can be accessed over the network 50 .
- the network 50 may access not only the standard hospital network, but also remote sites.
- the surgeon 12 can access and communicate with other computers, expert systems or data devices (not shown) that are both local and remote from the surgeon's location.
- specialists remote from the specific operating site may view the operation and communicate or consult directly with the surgeon 12 .
- the remote sites can selectively display the operation from any number of cameras in the operating room, and in addition, can selectively display and view the procedure with the same perspective of the surgeon 12 through the HUD display screen 14 , using the head-mounted camera 29 . In that manner, specialists at the remote sites will see what the surgeon 12 sees, including the view of the patient 34 and the data, menus, icons and cursor shown on the HUD display screen 14 .
- the video camera 29 is a remote controlled, high-performance camera that is mounted on the HUD gear worn by the surgeon 12 so that it can selectively view and transmit an image of the HUD screen 14 to the command computer 36 , and if desired, to a remote site or storage device (e.g., disk or video tape) controlled thereby.
- the camera 29 can be mounted on the head of the surgeon 12 in a manner so as to make use of the same optics 20 and 22 used by the display driver 18 .
- the head mounted camera 29 and/or imaging device 38 may be mounted instead to the robotic arm 39 controllable by the surgeon to tilt, pan, zoom or otherwise focus upon, selected views of the patient.
- the head mounted camera 29 may also be mounted other than on the top of the surgeon's head.
- the camera 29 can be mounted on the left side of the surgeon's head, wherein additional optics are used to scan the display screen 14 .
- FIG. 4 shows a more specific diagram of the main components of the HUD display system 10 .
- the HUD system 10 is coupled to and communicates with the command control computer 36 via a communication link 52 .
- the communications link comprises of two high speed digital radio transceivers or a optical communication system using fiber optic cable.
- the link allows for video, audio, and/or data to be transported to and from a computer network to and from the operator in the form of graphics, audio, video, text, or other data.
- FIG. 4 shows a more specific diagram of the main components of the HUD display system 10 .
- the HUD system 10 is coupled to and communicates with the command control computer 36 via a communication link 52 .
- the communications link comprises of two high speed digital radio transceivers or a optical communication system using fiber optic cable.
- the link allows for video, audio, and/or data to be transported to and from a computer network to and from the operator in the form of graphics, audio, video, text, or other data.
- the communication link 52 can use any method of communicating appropriate signals or information to and from a computer system and/or network (i.e. including but not limited to a radio transceiver, fiber optic cable, wire etc.).
- One use of the communication link 52 is to transmit to the command control computer 36 the input commands 54 generated by the surgeon 12 .
- the surgeon 12 generates the input commands 54 in one or more of several alternative manners.
- the commands 54 are generated when an eye-tracking system 56 detects the surgeon's eyes 32 focusing on selected icons or menu items displayed on the HUD screen 14 .
- the icons and menu items then cause the initiation of a corresponding operation, as is common with standard icon-based user interfaces employed with computers running the Macintosh or Windows 95 operating systems.
- the surgeon 12 may generate the commands orally by speaking select words or commands through microphone 16 .
- a standard voice recognition sub-system or computer 58 interprets the oral sounds output by the microphone 16 , and generates digital commands 54 in accordance with well known speech recognition processes. These speech commands 54 are then passed to command control computer 36 through communication links 52 .
- For standard speech recognition systems, see C. Schmandt, Voice Communication With Computers (Van Nostrand Reinhold, NY, 1994), and C. Baber et al., Interactive Speech Technology: Human Factors Issues in the Application of Speech Input/Output to Computers (Taylor and Francis, Pa., 1993), incorporated herein by reference.
- For programming icon- or menu-based user interfaces, see J. Sullivan et al., Intelligent User Interfaces (Addison-Wesley Publishing Company, NY, 1991), incorporated herein by reference.
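The gaze-driven icon selection described above can be sketched as a dwell-based interface. This is a minimal sketch, assuming rectangular icon regions, a stream of 2-D gaze points, and an illustrative dwell threshold; none of these names or values appear in the specification:

```python
# Sketch of dwell-based icon selection: the eye-tracking system 56 supplies
# gaze points, and fixating an icon long enough generates the input command.
# The Icon class, threshold, and timing values are illustrative assumptions.

DWELL_THRESHOLD = 0.8  # seconds of continuous fixation required to select

class Icon:
    def __init__(self, name, x, y, w, h):
        self.name = name
        self.rect = (x, y, w, h)

    def contains(self, gx, gy):
        x, y, w, h = self.rect
        return x <= gx < x + w and y <= gy < y + h

def select_icon(icons, gaze_samples, sample_dt):
    """Return the name of the first icon fixated for DWELL_THRESHOLD seconds,
    or None if the gaze never dwells on any icon long enough."""
    current, dwell = None, 0.0
    for gx, gy in gaze_samples:
        hit = next((i for i in icons if i.contains(gx, gy)), None)
        if hit is current and hit is not None:
            dwell += sample_dt
            if dwell >= DWELL_THRESHOLD:
                return hit.name
        else:
            # Gaze moved to a different icon (or off all icons): reset the timer.
            current, dwell = hit, 0.0
    return None
```

With a 60 Hz gaze stream, one second of fixation on an icon region would trigger the corresponding command; looking away resets the dwell timer.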
- the communication link 52 is also responsible for routing video images from a camera and lighting system 49 configured on the HUD system. More specifically, the camera and lighting system 49 generate video information under control of the surgeon 12 for display on the HUD screen 14 .
- the surgeon controls pan driver 65 , tilt driver 67 , magnification driver 69 and light 31 to focus upon selected scenes for imaging.
- the pan driver 65 controls pivotal movement in the horizontal direction by the camera while the tilt driver 67 controls the vertical pivotal scanning movement of the camera.
- the magnification driver 69 controls the degree of zoom of the image input device 38 .
- the camera drivers each control a respective servo-motor, stepper motor or actuator that moves or controls the associated camera parameter.
- the surgeon can control the camera to focus upon and scan a particular feature (such as a tumor), and to generate and display on the HUD screen 14 highly magnified views thereof.
- the head mounted camera 29 can be controlled to scan the HUD screen 14 to generate, record and transmit to remote sites the view as seen by the surgeon 12 .
- the imaging input device 38 can either be controlled manually and/or automatically.
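The pan, tilt, and magnification drivers described above amount to clamped setpoint commands sent to servo-motor, stepper motor, or actuator hardware. The following is a sketch under assumed mechanical limits and a relative-move command interface; the class name and ranges are illustrative, not from the specification:

```python
# Sketch of the camera drivers (pan 65, tilt 67, magnification 69) as
# clamped setpoints; the angular limits and zoom range are illustrative.

class CameraDriver:
    def __init__(self, pan=0.0, tilt=0.0, zoom=1.0):
        self.pan, self.tilt, self.zoom = pan, tilt, zoom

    @staticmethod
    def _clamp(value, lo, hi):
        return max(lo, min(hi, value))

    def command(self, d_pan=0.0, d_tilt=0.0, d_zoom=0.0):
        """Apply a relative move, respecting assumed mechanical limits.
        d_pan/d_tilt are degrees; d_zoom is a magnification factor."""
        self.pan = self._clamp(self.pan + d_pan, -90.0, 90.0)
        self.tilt = self._clamp(self.tilt + d_tilt, -45.0, 45.0)
        self.zoom = self._clamp(self.zoom * (d_zoom or 1.0), 1.0, 10.0)
        return self.pan, self.tilt, self.zoom
```

A command such as `cam.command(d_zoom=5)` would correspond to the surgeon requesting a 5-times magnified view of a selected feature.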
- the communication links 52 also serve to route control information from the command computer 36 to the HUD system 10 to operate the various sub-systems such as the speaker 17 , display driver 18 , display screen 14 , and imaging input device 38 . More specifically, the command computer 36 operates to maintain the contents of the display on HUD screen 14 , including maintaining the display of the basic menus and icons in accordance with the mode selected by the surgeon 12 , controlling and displaying movement of the cursor 40 (shown in FIGS. 6A, 6B , and 6 C) in response to the eye-tracking system 56 and/or speech recognition system 58 , and displaying data and images obtained from the numerous available sources.
- the command computer 36 regularly communicates, through communication links 52 , the control signals necessary to operate the display driver or generating system 18 , which in turn creates and projects the required elements of the basic user interface through the display optics 20 / 22 onto the HUD screen 14 .
- the eye-tracking system 56 generates input signals for the command computer 36 , which in turn controls the display generating system 18 to correspondingly move the cursor 40 (shown in FIGS. 6A, 6B , and 6 C) on the display screen 14 in a manner that tracks the movement of the surgeon's eyes 32 .
- the command computer 36 updates the status and contents of the various menus and icons, in a manner familiar to those who use a “point-and-click” user interface, such as found in common Windows '95 and Macintosh computer systems using a mouse, touch-pad or similar device.
- further input signals 54 are generated for use by the command computer 36 , for example, to obtain patient data or past or current images.
- the command control computer 36 carries out the required operations external to the HUD system 10 (such as controlling the sensor inputs 47 which may include imaging input device 38 or inputs from data base information 48 or other network 50 as shown in FIG. 3 ) to access and obtain the requested data.
- the command computer 36 controls the HUD system 10 via communication links 52 to update the screen 14 to display for the surgeon the requested data or images, and to generate audio through speaker 17 .
- the display driver or generating system 18 operates to generate the images that are transmitted through the optics 20 / 22 and displayed on the HUD screen 14 .
- Such display drivers or systems are well known to those of ordinary skill in the art, and any applicable display system can be used.
- Applicable display generating devices include CRTs, LEDs, laser diodes, LCDs, etc.
- this invention is not limited to any particular system, as long as it can generate and display video or computer-generated images onto the display surface/screen 14 or directly into the user's eyes 32 , thereby superimposing images over the surgeon's actual field of view. See, for example, the display-system references incorporated herein by reference.
- the HUD system 10 uses a projection method for displaying images in the user's field of view using a light source (e.g. CRT, LED, laser diode, LCD projector, etc.).
- the light source intensity or brightness can be varied in the user's field of view so that the image being displayed can be more visible than the surrounding light.
- the light source intensity may also be decreased so that the surrounding light can become more visible and the image being displayed less visible.
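The brightness modulation described above can be sketched as a simple scaling of the projector drive level against ambient light. The normalization constant and the 0..1 drive range below are illustrative assumptions:

```python
# Sketch of varying the light-source intensity so the displayed image is
# more (emphasis > 1) or less (emphasis < 1) visible than surrounding light.
# The 1000-lux normalization and unit drive range are illustrative.

def display_intensity(ambient_lux, emphasis):
    """Return a projector drive level in 0..1 scaled against ambient light."""
    nominal = min(ambient_lux / 1000.0, 1.0)  # normalize ambient to 0..1
    return max(0.0, min(1.0, nominal * emphasis))
```

An emphasis below 1 dims the overlay so the surrounding scene dominates; an emphasis above 1 drives the overlay brighter than the ambient level, up to the projector's maximum.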
- a CRT/LED projector 18 is used as the display driver.
- display mirrors 20 and 22 are used to transmit the projected images to the screen 14 . More specifically, mirrors 20 and 22 are located within the head-mounted system 10 , and are positioned outside of the field of view of the user 12 . However, they reflect the projected image so that it can be superimposed over a real scene on the display screen/surface 14 formed of a glass or other suitable display material. Another method of displaying images in the user's field of view is by using LCD technology embedded inside the display surface 14 .
- Shown in FIG. 5 is an example of an eye-tracking system 56 .
- This eye-tracking system 56 includes eye-tracking electronics 73 , an infrared camera 26 , eye-tracking optics 28 , and an infrared laser 24 , all of which are described below.
- This system operates in a manner known to those of ordinary skill in the art. Several such systems are readily available, and the invention is not limited to any particular device, system, means, step or method for tracking the eye.
- For eye-tracking systems, see, for example, the following U.S. Pat. Nos.: 5,231,674; 5,270,748; 5,341,181; 5,430,505; 5,367,315; 5,345,281; 5,331,149 and 5,471,542, incorporated herein by reference.
- a low power laser 24 generates an infrared eye-tracking laser beam 60 .
- the laser beam is projected through a lens 62 and reflected by a mirror 28 onto the user's eye(s) 32 .
- the user's eyes include a sclera 64 , cornea 66 , and pupil 68 .
- the eye components cause distortions in the infrared laser beam, which are reflected back onto mirror 28 , and then through a lens 70 into an infrared photodetector, infrared camera 26 or other type of photodetector.
- This distortion of the laser beam corresponds to the eye direction vector, which can be measured accurately by eye-tracking electronics 73 (shown in FIG. 4 ).
- Data defining the eye direction vector is subsequently transmitted from the eye-tracking electronics 73 to the command computer 36 through the communication links 52 .
- the eye-tracking optics which include mirror 28 , lens 62 , infrared camera 26 , and laser 24 , may be automatically adjusted for optimal performance through the use of computer controlled actuators (not shown).
- although the eye-tracking electronics 73 are shown in the block diagram as carried by the heads-up display system 10 , it is also possible to transmit the raw data from the infrared detector imaging device 26 to the command computer 36 , which then determines the associated eye direction vectors. Likewise, the eye-tracking computer (and other electronics, if used) can be worn by the surgeon 12 on a belt or backpack (not shown).
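One common way to compute an eye direction vector from an infrared camera image is the pupil-center/corneal-glint offset technique. The specification states only that distortion of the reflected beam encodes the direction vector, so the following is an illustrative sketch of one such technique, not the patented method; the gain and field-of-view constants are assumed calibration values:

```python
# Sketch of gaze estimation from pupil-center and corneal-glint positions
# in the infrared camera image. All constants are assumed calibration values.

def eye_direction(pupil_px, glint_px, gain_deg_per_px=0.15):
    """Map the pupil-to-glint pixel offset to (horizontal, vertical)
    gaze angles in degrees."""
    dx = pupil_px[0] - glint_px[0]
    dy = pupil_px[1] - glint_px[1]
    return dx * gain_deg_per_px, dy * gain_deg_per_px

def gaze_to_screen(angles, screen_w, screen_h, fov_deg=(40.0, 30.0)):
    """Project gaze angles onto display-screen coordinates, clamping the
    result to the screen so the cursor 40 never leaves the display."""
    u = 0.5 + angles[0] / fov_deg[0]
    v = 0.5 + angles[1] / fov_deg[1]
    return (min(max(u, 0.0), 1.0) * screen_w,
            min(max(v, 0.0), 1.0) * screen_h)
```

Data of this form, produced either on the headset or by the command computer 36 from raw detector frames, would drive the on-screen cursor position.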
- Shown in FIGS. 6 and 7 are the contents and arrangement of several preferred forms of visual displays for the screen 14 , including exemplary icons and menus for the user interface.
- an object such as a patient 34 is shown visible through the display 14 in the normal field of view 72 of the user 12 . More specifically, a surgeon 12 wearing the HUD system 10 will see the patient 34 on table 75 through the semi-transparent display screen 14 . The user will also see a number of icons or menu items 74 , 76 , 78 , 80 , 82 , 84 and 86 , superimposed over the real scene along the top portion of the normal field of view 72 .
- the icons or menu items 74 , 76 , 78 , 80 , 82 , 84 , and 86 can be positioned along an opaque portion of the display screen 14 , outside the normal field of view 72 .
- the specific contents and form of the icon or menu items 74 , 76 , 78 , 80 , 82 , 84 , and 86 , along with their associated computer operations, will vary depending on the specific implementation and programming. However, in its preferred form, there will be included icons or menu items that allow the user to control one or more cameras, access one or more computers, control the characteristics of the display 14 , and display data, such as a patient's vital signs.
- three separate icons or menu items 74 , 76 and 78 are assigned to control three cameras, indicated as CAM 1 , CAM 2 and CAM 3 , respectively.
- the user can independently control the associated camera systems (pan, tilt, rotate, zoom, etc.) to obtain and display various images.
- two icons or menu items 80 and 82 are assigned to control access to two computers, indicated as COMP 1 and COMP 2 , respectively.
- By selecting either of the computer icons COMP 1 and COMP 2 , the user can access and control the associated computers or networks, to obtain patient data or other reference material.
- Another icon or menu item 84 is indicated with the label DC and is provided to allow the user to access and vary the characteristics of the screen 14 .
- the DC icon or menu item 84 can be accessed by the user to control brightness, contrast, and the degree of transparency of the screen 14 .
- Another icon 86 allows the user to control various devices to obtain and display on screen 14 any of a wide variety of vital signs.
- each of the icons or menu items 74 , 76 , 78 , 80 , 82 , 84 , and 86 can be accessed and controlled by causing the eye-tracking cursor 40 to move over and select the desired icon. For example, referring to FIG. 6B , to see an update of the patient's vital signs, the surgeon can focus his or her eyes 32 on the icon 86 corresponding to the patient's vital signs. The eye-tracking system 56 will track the surgeon's eyes 32 to cause the cursor 40 to scroll to the VITAL icon 86 .
- Shown in FIG. 6B is the display of the standard vital signs in analog graphic format. Specifically, graphics are shown for the patient's blood pressure 83 , heart rate 85 , respiratory rate 87 and body temperature 89 .
- any additional vital sign (e.g., blood sugar, oxygen level, blood flow, etc.) can also be displayed.
- in addition to the analog displays 83 , 85 , 87 and 89 , digital values and titles can be displayed (not shown).
- the system can be programmed to display the vital signs for a set period of time, continuously, or in an “on-off” fashion.
- Shown in FIG. 6C is the same view as FIG. 6B , with an image captured by an image input device 38 superimposed on the normal field of view 72 .
- the surgeon 12 focuses his eyes upon the associated icon, for example, the CAM 1 icon 74 .
- the eye-tracking system 56 causes the cursor 40 to track to icon 74 , and correspondingly initiates the desired camera image to be superimposed over the image of the patient 34 .
- the image is a 5-times magnified image of an incision 91 in the patient.
- the surgeon may also cause the display screen 14 to operate in an opaque mode, displaying only the image of the incision 91 as if on a normal computer display screen.
- the surgeon can magnify or otherwise control an image input device(s) 38 to obtain the image(s) desired, at the appropriate magnification.
- FIG. 7A shows the display screen 14 operating in an opaque or semi-transparent mode with the incision 91 magnified to a 10-times view.
- Also shown in FIG. 7A is the cursor 40 , used to precisely select portions of the image to be still further magnified, enhanced, and/or centered in the display.
- FIG. 7B depicts the display of skeletal views 94 and 96 , selected as above by the surgeon 12 moving the cursor 40 to still another of the camera icons, for example, CAM 2 icon 76 .
- the user interface and menu/icon programming can be configured to require the surgeon to take further action after the cursor 40 tracks over one of the icons. For example, and in the simplest form, once the surgeon causes the cursor to track over a selected icon, nothing may happen until the surgeon “clicks” a foot-operated mouse button (not shown). In more complex forms of the invention, the surgeon can actually access the selected operation by tracking the cursor to the selected icon and then speaking a select code word (e.g., “select” or “open”) into microphone 16 , which word or words are interpreted by speech recognition system 58 . In still another form of the invention, the surgeon can access the selected operation associated with a particular icon or menu item by blinking a set number of times in quick succession after tracking the cursor 40 to the desired location. The blinking action is detected by the eye-tracking system 56 .
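The three confirmation mechanisms just described (foot switch, spoken code word, rapid blinks) can be sketched as a single gating function. The blink count and code-word list below are illustrative assumptions:

```python
# Sketch of icon-activation confirmation: a hovered icon is activated only
# by a foot click, a recognized code word, or enough rapid blinks.
# BLINKS_REQUIRED and CODE_WORDS are illustrative assumptions.

BLINKS_REQUIRED = 3
CODE_WORDS = {"select", "open"}

def confirm(hovered_icon, foot_click=False, spoken_word=None, blink_count=0):
    """Return the icon to activate, or None if no confirmation occurred."""
    if hovered_icon is None:
        return None
    if foot_click:
        return hovered_icon
    if spoken_word is not None and spoken_word.lower() in CODE_WORDS:
        return hovered_icon
    if blink_count >= BLINKS_REQUIRED:
        return hovered_icon
    return None
```

Requiring an explicit confirmation step of this kind prevents the cursor from triggering operations merely because the surgeon's gaze passed over an icon.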
- the selection by the surgeon of a specific icon will cause the associated data or images to be displayed on the screen 14 .
- a series of menus can be associated with each icon, each menu having successively more detail. For example, instead of having three camera icons 74 , 76 and 78 , a single “video” icon can be substituted, which when selected by the cursor 40 , will change the display to then show the individual icons for each of the many available cameras. Next, when one of the individual camera icons is selected by the cursor 40 , the display will again change to show individual icons for the various camera controls, such as controls for panning, tilting, rotating, magnification, filters, manipulator members, etc.
- the HUD system 10 may incorporate a hierarchical icon or menu system, where several layers of menus are necessary to obtain and display desired data. In that case, greater flexibility is desirable in controlling how much data can be displayed on the screen 14 at any given time. More specifically, as shown in FIG. 7C , in the more complex forms of the invention, the well known programming techniques from the Macintosh or Windows 95 operating systems are adapted and included in command computer 36 to allow the display of multiple windows, that can be easily tiled, cascaded, selected, re-sized, or hidden in a manner now familiar to those skilled in the art.
- the surgeon 12 can simply “hide” the image until it is again desired to view it, at which point it can be easily selected.
- the surgeon is not required to sequence through each of the menu levels to access the desired image.
- the user has configured and controlled the display to show the windows or regions having patient data 98 , magnified camera view 100 , MRI data 102 , magnified skeletal view 96 , and whole body skeletal view 94 .
- the surgeon 12 can independently hide, re-size, and rearrange each of the windows, along with the transparency level of the screen 14 , thereby providing a maximum of flexibility.
- FIG. 7D depicts the display 14 having various user selected information, data, or images positioned thereon at varying degrees of intensity or transparency.
- the surgeon has placed in the center of the display 14 a semi-transparent window 108 through which the normal field of view 72 displays the patient 34 .
- the cursor 40 switches to the eye-tracking cross hairs 90 and 92 .
- the cross hairs 90 and 92 allow the surgeon 12 to select a specific portion of the patient 34 , such as incision 91 .
- the selected portion (e.g., part of the incision 91 ) of patient 34 is then locked on and magnified as a separate view in a different window 112 in the manner described above where a different set of selectable cross hairs 115 and 117 are shown for further magnification.
- data such as the patient name and procedure description is displayed in the title portion 114 at the top of the display screen/surface 14 .
- the surgeon 12 has selected more detailed patient data 116 to be displayed in a separate window 116 in the lower left-hand corner of display 14 .
- Various medication listings 118 and recommendations 120 , along with current cursor or cross-hair coordinates 122 are displayed in a top portion of screen 14 .
- programmable warning indicators 124 , 126 , and 128 are also selected for display.
- the warning indicators may be programmed by the surgeon 12 to monitor certain conditions and to visually indicate warnings at desired levels.
- the same programmable warning indicators will issue various levels of audible warning tones to the surgeon 12 through the speaker 17 .
- the surgeon 12 has selected and displayed the vital signs in a separate window 130 at the top left corner of the display screen/surface 14 , and a 3D model of a select portion of the patient in window 132 , shown below the vital signs 130 and updated in real time.
- Other graphical information 110 is updated in real time and displayed at the bottom right corner of the display screen/surface 14 .
- Other pertinent images or data may be selectively displayed in other areas of the display 14 , for example skeletal images in area 131 and magnetic resonance imaging (MRI) data 132 . All of the described operations can also be effected or initiated by using the speech recognition system 58 .
- the overall medical HUD system 10 is extremely flexible, allowing each surgeon to customize the display and use only the features deemed appropriate, necessary and desirable.
- the surgeon may choose to display on a small part of the screen 14 only the patient's vital signs, as shown in FIG. 6B .
- the surgeon may elect to proceed in a semi-transparent mode, with several windows of data or images, as shown in FIG. 7D .
- Each of the variations in programming set forth above can be configured for each specific surgeon 12 in a computer file assigned to that user and stored in command computer 36 .
- if a particular surgeon 12 prefers to have specific icons or menus shown on specific screens, or, for example, prefers digital over analog displays for vital signs, that user can select those specific settings and the system will perform in accordance therewith.
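The per-surgeon computer file described above could take the form of a simple profile merged over system defaults. The key names and default values below are illustrative assumptions, not details from the specification:

```python
# Sketch of a per-surgeon preference profile stored by the command computer 36;
# the keys, defaults, and JSON storage format are illustrative assumptions.
import json

DEFAULTS = {
    "vital_signs_format": "analog",   # or "digital"
    "icons": ["CAM1", "CAM2", "CAM3", "COMP1", "COMP2", "DC", "VITAL"],
    "screen_transparency": 0.6,
}

def load_profile(raw_json):
    """Merge a surgeon's stored settings over the system defaults, so any
    setting not explicitly chosen falls back to the default behavior."""
    profile = dict(DEFAULTS)
    profile.update(json.loads(raw_json))
    return profile
```

On login, the command computer would load the profile assigned to that user and configure the display accordingly.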
- the system allows for substantial customization for specific types of surgeons or fields outside of surgery (e.g., microelectronics, forensics, etc.).
- Shown in FIGS. 8A, 8B, and 8C is still another embodiment of the invention that incorporates a non-attached heads-up display screen 134 .
- the display screen/surface 134 is shown as a generally flat rectangular semi-transparent display area.
- the display screen/surface 134 is attached to a top end of a sturdy but flexible post 136 via joint 138 , thereby allowing it to be moved to various viewable positions.
- the joint 138 is also constructed to allow the screen 134 to be moved left or right relative to the post 136 .
- the lower end of post 136 is attached to a base 140 that is supported on the floor.
- the post 136 is also adjustable in height and can be bent to allow precise positioning of the screen/surface 134 by the user.
- the display screen 134 is mounted via robotic arm 142 .
- Display screen/surface 134 of HUD 10 is attached at a lower end of the multi-jointed, robotic arm or other manipulator 142 .
- the upper end of arm 142 is coupled to a mounting platform 144 that is fixed to the ceiling 146 of a room.
- Speech commands such as “Adjust Display On”, “Adjust Display Off”, “Up”, “Down”, “Left”, “Right”, “Forward”, and “Backward” can be used for controlling the position of the retractable display screen/surface 134 .
- the position of the display screen/surface 134 may be robotically controlled by speech signals from an operator 12 .
- speech signals may be received by a microphone and processed by a speech recognition system which thereby sends signals to a robotic microcontroller that drives the appropriate actuators to position the display screen/surface 134 as desired.
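The speech-to-actuator pipeline just described can be sketched as a command handler that gates motion on an explicit adjust mode, using the phrases listed above. The step size and axis conventions are illustrative assumptions:

```python
# Sketch of the speech-driven positioning loop for the retractable display
# screen/surface 134: "Adjust Display On/Off" gates an adjust mode, and the
# directional phrases step the actuators. Step size and axes are illustrative.

STEP_CM = 2.0
MOVES = {"Up": (0, 0, STEP_CM), "Down": (0, 0, -STEP_CM),
         "Left": (-STEP_CM, 0, 0), "Right": (STEP_CM, 0, 0),
         "Forward": (0, STEP_CM, 0), "Backward": (0, -STEP_CM, 0)}

class DisplayPositioner:
    def __init__(self):
        self.adjusting = False
        self.position = [0.0, 0.0, 0.0]  # x, y, z in cm

    def handle(self, phrase):
        """Process one recognized phrase; return the resulting position."""
        if phrase == "Adjust Display On":
            self.adjusting = True
        elif phrase == "Adjust Display Off":
            self.adjusting = False
        elif self.adjusting and phrase in MOVES:
            dx, dy, dz = MOVES[phrase]
            self.position = [p + d for p, d in zip(self.position, (dx, dy, dz))]
        return tuple(self.position)
```

Gating movement on the on/off phrases keeps ordinary operating-room speech from inadvertently repositioning the screen.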
- the non-attached heads-up display system operates in a manner as described in the head-mounted HUD display embodiment of FIGS. 1-7 , thereby allowing the surgeon to select and display data in a superimposed manner over the normal field of view of the patient 34 .
- Both the speech recognition system 58 and eye-tracking system 56 can be used to move the cursor 40 to activate select icons or menus for operating computer 36 and display 134 .
- additional methods for moving the cursor are possible, including using a low power laser mounted on the head of the surgeon 12 , or a touch screen incorporated in the display screen itself.
- many computer-controlled instruments used in the micro-electronics field have one or more eyepieces through which the operator views a specimen.
- the preferred embodiment of the heads-up display can be easily modified for use in such applications. More specifically, because the user's eye is relatively aligned with the eye-piece, the eye-tracking laser can be unobtrusively mounted in or below the eye-piece shaft.
- the user interface can be displayed by a display driver onto the same view seen through the eye-piece. In the same manner as described for the preferred embodiment, the user can see the full realm of data in the normal field of view, while simultaneously controlling the associated computer. Still further modifications are possible without departing from the spirit and scope of the invention.
- the HUD system's ability to provide high resolution images directly in the field of view of an operator without forcing the operator to look away can greatly enhance the ability to complete an operation in a very precise and controlled manner.
- This precise control can be incorporated into surgical cutting, probing, and/or positioning tools by clearly presenting the position of such tools onto the display with respect to a patient and/or patient model obtained from real time imagery.
- This technique can be very advantageous in the event that the actual orientation or position of such tool(s) cannot be discerned by the unassisted eye, yet direct visual control is required to operate them.
Description
- The inventions relate to electronic display systems and control systems therefor. More particularly the inventions relate to selectively operable heads-up display systems for presenting information and/or image(s) to the user. In its preferred form, the heads-up display is configured for use by medical technicians or personnel, such as surgeons performing an operation.
- A heads-up display is generally defined as an electronically generated display containing information or data that is superimposed on an observer's normal field of view. As explained in greater detail below, heads-up display (“HUD”) systems have been used in various applications. One such application is for use by pilots of aircraft. In the typical aircraft HUD system, a semi-transparent display screen is located generally in front of the eyes of the pilot (i.e. a screen mounted on the pilot's head or helmet, or in the view of the aircraft windshield). Such a system enables a pilot to concentrate on the difficult tasks associated with flying the aircraft, without diverting his attention to scan or examine a wide array of instruments.
- It is also well known that medical technicians or personnel, such as surgeons, must keep track of many different types of information during an operation. For example, a surgeon must carefully view or monitor the physical surgery while simultaneously monitoring a patient's condition (e.g., blood pressure, heart rate, pulse, etc.). In addition, depending on the procedure, the surgeon must also monitor the status and settings of surgical equipment and tools. Although the additional information is necessary and important, monitoring such information often diverts the surgeon from the immediate task at hand.
- Some surgical operations require that the surgeon divert his eyes to view a video monitor, for example, when performing highly complex laser or internal surgery conducted through scopes. See U.S. Pat. No. 5,222,477, which discloses an endoscope or borescope stereo viewing system. In addition, the surgeon may from time to time need to refer to data, such as defined by a patient's recorded or written history, or to previously taken x-rays or other computer generated images (e.g., CAT, NMR, 3D, etc.). For example, U.S. Pat. No. 5,452,416 discloses an automated system and a method for organizing, presenting, and manipulating medical images for viewing by physicians. See also U.S. Pat. Nos. 5,251,127 and 5,305,203, which disclose a computer-aided surgery apparatus that positions surgical tools during surgery or examinations. In each of the above-described systems, in order to view the displayed information, the surgeon must divert his or her eyes to a remote monitor.
- Thus, surgeons use many different types of displays and must continually monitor many different sources of information. However, as more tools and sources of data become available to surgeons for use during operations, more opportunities for distraction arise. It is difficult for a surgeon to focus on his or her conduct during a surgical procedure while also continually shifting focus away from the patient to other monitors or indicators. Therefore, a need exists for conveniently, efficiently and accurately displaying to a surgeon various types and sources of information, views, and images of a patient undergoing a critical medical procedure. As explained in greater detail below, prior attempts in the medical field to fulfill that need have been unsatisfactory.
- For example, video signal sources have been adapted to scan views or images for different types of medical uses and applications. U.S. Pat. No. 4,737,972 to Schoolman (“Schoolman I”) discloses a head-mounted device that provides stereoscopic x-ray images. Furthermore, U.S. Pat. No. 4,651,201 to Schoolman (“Schoolman II”) discloses an endoscope that provides stereoscopic images of the patient on a display. Both Schoolman I and Schoolman II allow for the selective transmission of other video data to the display. However, Schoolman I and Schoolman II do not use a “see through” display that allows the surgeon to monitor both the environment around him and the video image. If the surgeon wishes to monitor or view the real-world environment, as opposed to the displayed graphics, the head-mounted display must be removed.
- Efforts have also been made to use head-mounted displays in augmented reality simulations for medical applications wherein a desired image or three-dimensional model is superimposed on the real scene of a patient. For example, it was reported that a research effort in the Department of Computer Science at the University of North Carolina has attempted to develop a see-through head-mounted display that superimposed a computer-generated three-dimensional image of the internal features of a subject over the real-life view of the subject. Information describing those research efforts may be found on the World Wide Web in a document maintained by Jannick Rolland at the site on the World Wide Web Pages of the NSF/ARPA Science and Technology Center for Computer Graphics and Scientific Visualization at the University of North Carolina, Chapel Hill (http://www.cs.unc.edu/˜rolland, cited February, 1996, copies of which are included in the information disclosure statement that has been filed concurrently with this application). That World Wide Web site in turn referenced the following publication: A. R. Kancheral, et al., “A Novel Virtual Reality Tool for Teaching 3D Anatomy,” Proc. CVR Med '95 (1995). Other research efforts at the University of North Carolina attempted to use a video see-through head-mounted display and a high-performance computer graphics engine to superimpose ultrasound images over the real view of the subject, thereby allowing a user to “see within” the subject. A set of trackers captured the motion of the body part with respect to the field of view of the user, and a computer updated the position of the body part in real time. The computer attempted to correlate the “tracked” position of the body with the three-dimensional model and to display the model on the heads-up display in a manner that gave the appearance of “x-ray vision.”
- In the above-described University of North Carolina research efforts, the focus was primarily to help teach students by superimposing a single computer-generated image over a moving, real-life, image of a subject. However, as explained in the associated literature, the “tracking” requirements made the research effort quite complicated, and the results appeared less than satisfactory. Moreover, such a teaching system is not applicable to the real-world environment of a surgeon, where the patient is not moved (and “tracking” is unnecessary), and where the surgeon needs or desires other information to be made readily available for viewing.
- Still another research program associated with the University of North Carolina is described in Fuchs, et al., “Virtual Space Teleconferencing using a Sea of Cameras,” Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery (Pittsburgh, Pa, Sep. 22-24, 1994). That article describes research efforts that attempted to use a multitude of stationary cameras to acquire both photometric and depth data. The acquired data was purportedly used to construct a remote site in accordance with the head position and orientation of a local participant. According to the article, each participant wears a head-mounted display to look around a remote environment having surface geometries that are continuously sensed by a multitude of video cameras mounted along the walls and ceiling, from which cameras depth maps are extracted through cross-correlation stereo techniques. Views acquired from several cameras are then displayed on a head-mounted display with an integrated tracking system to provide images of the remote environment. The explained purpose of the effort was to duplicate, at a remote location, a three-dimensional virtual reality environment of a medical room. However, the article does not disclose the use of see-through displays providing a surgeon with the ability to select and display additional forms of data, or to superimpose data over a real-life view of the patient or surgical site.
- Another type of head-mounted display is described in Yoshida, et al., “Optical Design and Analysis of a Head-Mounted Display with a High-Resolution Insert,” Proc. SPIE 2537 (1995). That article describes yet another research program associated with the University of North Carolina in which a small area of a high-resolution image is inserted on a large field of a low-resolution image displayed on a head-mounted screen. The system is described as using eye-tracking information to dynamically place the high-resolution insert at the user's gaze point. The system purports to provide the user with both high-resolution imagery and a large field of view. In essence, using eye-tracking electronics, the “inserted image” corresponding to the user's gaze point is converted from low to high resolution. However, as above, the user cannot select additional or alternative forms of data or different images to be superimposed over the primary image on the head-mounted display.
- Thus, few head-mounted displays have been developed for the medical industry, and all those described above have had limited purpose and utility. On the other hand, and as discussed briefly above, a wide variety of head-mounted devices are commonly used in military applications. As mentioned, aircraft pilots, tank commanders, weapon operators and foot soldiers have all used head-mounted displays to display various forms of weapon or image information along with other data defining the real-world environment of the person wearing the display. For examples of such systems, see the following U.S. Pat. Nos. 4,028,725; 5,281,960; 5,000,544; 5,227,769; 4,994,794; 5,341,242; 4,878,046; 3,940,204; 3,923,370; 4,884,137; 4,915,487; and 4,575,722. Likewise, helmet or head-mounted displays have also been used for motorcycle riders. U.S. Pat. No. 4,988,976 discloses a motorcycle helmet that displays data or information such as speed, time, rpm's, fuel, oil, etc. on the transparent visor (i.e. vacuum fluorescent display) of the rider. Head-mounted displays that are worn in front of the user's eyes or worn as eye spectacles also exist. For example, see the following U.S. Pat. Nos. 5,129,716; 5,151,722; 5,003,300; 5,162,828; 5,331,333; 5,281,957; 5,334,991; 5,450,596 and 5,392,158.
- The field of virtual reality also has driven advances in the use of various types of head-mounted displays. For example, see the following U.S. Pat. Nos. 4,636,866; 5,321,416; 5,347,400; 5,348,477; 5,406,415; 5,414,544; 5,416,876; 5,436,765; 5,479,224; 5,473,365; D363,279; 5,485,172; 5,483,307; 5,130,794. See also the publication How Virtual Reality Works, by J. Eddings (Ziff-Davis Press, Emeryville, Calif., 1994), and the site maintained by Rolland (referenced above) relating to telepresence systems and augmented reality.
- Advances have also been made in the area of heads-up displays or screens that are not attached to or worn by the user. Most commonly, such systems are employed in automotive or military environments, to provide vehicle performance, weapon status, and other data for the driver or pilot. For examples of such systems, see the following U.S. Pat. Nos. 5,278,696; 4,652,870; 4,711,512; 4,729,634; 4,799,765; 4,927,234; 4,973,139; 4,988,976; 4,740,780; 4,787,711; 4,831,366; 5,005,009; 5,037,182; 5,231,379; 4,824,228; 4,763,990; 4,669,810; 4,688,879; 4,818,048; 4,930,847; 4,932,731; 5,198,895; 5,210,624; 5,214,413; 5,302,964; 4,725,125; 4,188,090; 5,028,119 and 4,769,633.
- Numerous advances have occurred in the specific forms of, and materials used in, heads-up display systems. See, for example, U.S. Pat. Nos. 4,987,410 and 4,961,625 (use of Liquid Crystal Displays (LCDs)); U.S. Pat. Nos. 5,108,479 and 5,066,525 (laminating glass plates or panels); and U.S. Pat. No. 5,457,356 (making a flat panel head-mounted display).
- Also pertinent to this invention is the field of eye-tracking to control various computer or imaging functions. Various systems unrelated to the medical field have used eye-tracking for controlling a field of view. For example, see U.S. Pat. No. 4,028,725, which discloses an eye and head tracking system that controls a beam-splitter and retains the high-resolution part of the image in the field of view. The eye-tracking is carried out by infrared detection (see, e.g., U.S. Pat. No. 3,724,932). See also U.S. Pat. Nos. 5,287,437; 4,439,755; 4,349,815; 4,437,113; 4,028,725 (referenced earlier) and the article “Optical Design and Analysis of a Head-Mounted Display with a High-Resolution Insert,” referenced above, particularly at
footnote 19, which refers to the Eye-tracking Systems Handbook, Applied Science Laboratories, Waltham, Mass. (1992). - Finally, video recording systems for recording scenery and heads-up displays have been taught by the prior art. U.S. Pat. No. 5,241,391 to Dodds (“Dodds”) discloses a video camera system that records scene conditions and heads-up displays.
- Notwithstanding the large number of articles and patents issued in the area of heads-up or head-mounted displays, there has been no such display that is designed for the special needs of individuals performing detailed but critical tasks on relatively stationary subjects. Such a system would be extremely useful to personnel working in the fields of medicine, forensics, and micro-electronics.
- Presently, there is a need for a selectively operable, head-mounted, see-through viewing display system for presenting desired information and/or images to a user, while at the same time allowing the user to view the real-world environment in which he or she operates. There is a further need to provide a convenient selectable viewing system that can be easily controlled by an eye-tracking cursor and speech recognition to control different images or displays on a video monitor or to control a field of view, while keeping the user's hands free to conduct precision operations.
- Accordingly, it is an object of this invention to provide an improved heads-up display system.
- It is another object of the invention to provide a “hands-free” heads-up display system that is useful to individuals performing detailed procedures, such as those working in the fields of medicine, forensics, micro-electronics, biotech, etc.
- It is another object of the invention to provide an improved head-mounted display that allows the user to view both the subject and selected data.
- It is another object of the invention to provide an improved heads-up display that includes a user-friendly interface to a command control computer.
- It is another object of this invention to provide an improved heads-up display that interfaces with a command control computer and includes an eye-tracking cursor to select menus to control computer performance and the display of data.
- It is another object of this invention to provide an improved heads-up display that interfaces with a command control computer and includes a speech recognition circuit to control computer performance and display of data.
- It is another object of the invention to provide an improved heads-up display that can be positioned between a surgeon and a patient in the surgeon's line of sight.
- It is another object of the invention to provide an improved heads-up display that allows the user to view the subject while simultaneously monitoring the output from a number of different information sources, including imaging devices and remote or networked computer systems.
- It is another object of the invention to provide an improved heads-up display that allows a medical technician to control medical imaging devices to obtain images of select parts of a patient and to display those images on the heads-up display.
- It is another object of the invention to provide an improved heads-up display that allows a user to control a computer to acquire and display data defining a subject's history while simultaneously viewing the subject.
- It is another object of the invention to provide an improved method for conducting surgery on a patient while simultaneously obtaining access to and conveniently displaying on a heads-up display a variety of types of data relating to the patient.
- It is another object of the invention to provide an improved method of controlling a heads-up display by employing a “point-and-click” type user interface and cursor controlled by tracking movement of the eye.
- It is another object of the invention to provide an improved method of controlling a heads-up display by employing speech recognition, both alone and in combination with an eye-tracking cursor.
- It is another object of the invention to provide an improved heads-up display system that allows the user to control tools or instruments in a hands-free manner.
- It is another object of the invention to provide an improved heads-up display system that allows a surgeon to control surgical tools or other instruments in a hands-free manner.
- It is another object of the invention to provide an improved heads-up display maintained in an eyepiece of a scope or instrument and that is controlled with integral eye-tracking and speech recognition systems.
- The above and other objects are achieved in an improved, selectively controllable system for presenting desired data on a head-mounted (or “heads-up”) display. The system includes a command computer processor for receiving inputs that represent data and for controlling the display of desired data. The computer communicates with and controls the heads-up display system, which is configured to display the desired data in a manner that is aligned in the user's field of view. The heads-up display includes a user interface incorporating “hands-free” menu selection to allow the user to control the display of various types of data. In its preferred form, the hands-free menu selection is carried out using an eye-tracking cursor and a speech recognition computer to point to and select specific menus and operations.
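The “hands-free” menu selection summarized above can be sketched in outline. The following is an illustrative sketch only, not the patented implementation: the names (`MenuItem`, `GazeCursor`) and the dwell threshold are hypothetical, and dwell-based confirmation stands in here for the voice-command or foot-switch confirmation that the specification actually contemplates.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MenuItem:
    """A selectable icon or menu region on the HUD screen (pixel coordinates)."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


class GazeCursor:
    """Moves a cursor with the user's gaze point and reports a selection
    once the gaze has dwelled on one item long enough."""

    def __init__(self, items, dwell_s=0.8):
        self.items = list(items)
        self.dwell_s = dwell_s
        self._candidate = None   # item currently under the gaze
        self._since = 0.0        # time the gaze entered that item

    def update(self, px, py, t):
        """Feed one gaze sample (screen coords, timestamp in seconds).
        Returns the selected MenuItem when the dwell elapses, else None."""
        hit = next((m for m in self.items if m.contains(px, py)), None)
        if hit != self._candidate:
            self._candidate, self._since = hit, t
            return None
        if hit is not None and t - self._since >= self.dwell_s:
            self._since = t      # re-arm so one dwell yields one selection
            return hit
        return None
```

A display loop would feed `update()` with samples from the eye-tracking detector and, when an item is returned, hand that item to the command computer for execution.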
- The above and other objects are also achieved in a user-controllable heads-up system for presenting medical data to a physician. The system includes a command control computer for receiving inputs defining medical data and for controlling the display of that data on a heads-up display screen in the normal field of view of the physician. The heads-up display provides the physician with a “user interface” including menus and associated operations that can be selected with an eye-tracking cursor. The system also includes a microphone and speaker so that a physician can communicate with other personnel and computers both locally and remote from the site. The command computer includes a speech recognition processor to respond to spoken commands of the physician. The command computer also communicates with and receives a wide array of data from other computers networked therewith. The physician can select the specific data to be displayed on the screen. In addition, the physician can, with the eye-tracking cursor, control various medical imaging devices.
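The spoken-command side of such a system can be illustrated as a dispatch table mapping recognized phrases to display operations, with the gaze-selected icon passed along as context. Again a sketch under stated assumptions: the phrase set, the handler bodies, and the `CommandDispatcher` name are invented for illustration, and recognized text would arrive from a speech-recognition engine rather than a literal string.

```python
class CommandDispatcher:
    """Maps recognized spoken phrases to display operations; the item under
    the eye-tracking cursor, if any, is passed to the handler as context."""

    def __init__(self):
        self._handlers = {}

    def register(self, phrase, handler):
        self._handlers[phrase.lower()] = handler

    def dispatch(self, recognized, gaze_target=None):
        """Route one recognized phrase to its handler, tolerating case and
        surrounding whitespace; unknown phrases are reported, not executed."""
        handler = self._handlers.get(recognized.strip().lower())
        if handler is None:
            return f"unrecognized: {recognized}"
        return handler(gaze_target)


# Hypothetical command set for illustration.
dispatcher = CommandDispatcher()
dispatcher.register("show vitals", lambda target: "vitals window opened")
dispatcher.register("magnify", lambda target: f"magnifying {target}")

print(dispatcher.dispatch("Show Vitals"))          # prints "vitals window opened"
print(dispatcher.dispatch("magnify", "camera 1"))  # prints "magnifying camera 1"
```

In the combined mode described in the specification, the `gaze_target` argument would be whatever icon the eye-tracking cursor currently rests on.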
- The above and other objects are also achieved in a method of selectively displaying multiple forms of data on a head-mounted display. In accordance with the method, a see-through computer display screen is mounted on a head piece that is worn by the user. A command computer controls a user interface so that command icons or menus are displayed in a superimposed manner on the see-through, head-mounted display, thereby allowing the user to see both the normal field of view and the user interface. The user interface is provided with a “point-and-select” type of cursor. An eye-tracking system is integrated with the command control computer and the user interface to monitor the user's eye movement and to correspondingly control movement of the cursor. The user selects various computer operations from menus contained in the user interface by moving the eye-tracking cursor to selected menus or icons. By using the eye-tracking cursor to select various computer operations, the user can control the command computer to selectively display on the see-through HUD screen numerous items of data or images, while still seeing the normal field of view.
- The preferred embodiments of the inventions are described below in the Figures and Detailed Description. Unless specifically noted, it is the intention of the inventors that the words and phrases in the specification and claims be given the ordinary and accustomed meaning to those of ordinary skill in the applicable art(s). If the inventors intend any other meaning, they will specifically state that they are applying a special meaning to a word or phrase.
- Likewise, the use of the words “function” or “means” in the Detailed Description is not intended to indicate a desire to invoke the special provisions of 35 U.S.C. Section 112, ¶ 6 to define the invention. To the contrary, if the provisions of 35 U.S.C. Section 112, ¶ 6 are sought to be invoked to define the inventions, the claims will specifically state the phrases “means for” or “step for” and a function, without also reciting in such phrases any structure, material or act in support of the function. Even when the claims recite a “means for” or “step for” performing a function, if they also recite any structure, material or acts in support of that means or step, then the intention is not to invoke the provisions of 35 U.S.C. Section 112, ¶ 6. Moreover, even if the inventors invoke the provisions of 35 U.S.C. Section 112, ¶ 6 to define the inventions, it is the intention that the inventions not be limited only to the specific structure, material or acts that are described in the preferred embodiments. Rather, if the claims specifically invoke the provisions of 35 U.S.C. Section 112, ¶ 6, it is nonetheless the intention to cover and include any and all structures, materials or acts that perform the claimed function, along with any and all known or later developed equivalent structures, materials or acts for performing the claimed function.
- As a primary example, the preferred embodiment of this invention is configured for use by a surgeon performing an operation on a patient. However, the invention is equally applicable to any environment in which the user is conducting precision or detailed procedures with his or her hands on a relatively stationary subject, and where the user would find it advantageous to see data superimposed over the normal field of view. The potential applications are too numerous to mention, but would include forensics, microelectronics, biotechnology, chemistry, etc. Thus, even though the preferred embodiment refers to use by a surgeon, and to the acquisition and display of medical data, its applicability is much broader, and the claims should be interpreted accordingly.
- Further, the description of the preferred embodiments makes reference to standard medical imaging devices that are used to generate images to be displayed on the heads-up display. The disclosure specifically references several examples of such devices, including video cameras, x-ray devices, CAT and NMR scanners, etc. However, numerous other medical imaging systems are well known to exist, and most likely, numerous improved imaging devices will be developed in the future. Thus, the present invention does not depend on the type of imaging device that is implemented. The inventions described herein are not to be limited to the specific scanning or imaging devices disclosed in the preferred embodiments, but rather, are intended to be used with any and all applicable medical imaging devices. Likewise, the preferred embodiments depicted in the drawings show a single generic imaging device mounted on a manipulator arm. Numerous other tool and manipulator configurations, and multiple imaging devices, can be substituted for the single device.
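The device independence described above can be made concrete with a small abstraction layer. This is only a sketch of the idea, not the disclosed system: `ImagingDevice`, `VideoCamera`, and `display_on_hud` are hypothetical names, and a real modality would return actual image data rather than a placeholder frame.

```python
from abc import ABC, abstractmethod


class ImagingDevice(ABC):
    """Device-agnostic contract: any imaging source (video, x-ray, CAT,
    NMR, or a future modality) need only report its type and yield a frame."""

    @abstractmethod
    def modality(self) -> str: ...

    @abstractmethod
    def acquire(self) -> bytes: ...


class VideoCamera(ImagingDevice):
    def modality(self) -> str:
        return "video"

    def acquire(self) -> bytes:
        return b"\x00" * 16  # stand-in for a captured frame


def display_on_hud(device: ImagingDevice) -> str:
    """The display pipeline sees only the interface, never the device type."""
    frame = device.acquire()
    return f"{device.modality()} frame, {len(frame)} bytes"


print(display_on_hud(VideoCamera()))  # prints "video frame, 16 bytes"
```

Adding a new scanner type then means writing one subclass; nothing in the display pipeline changes, which is the sense in which the invention "does not depend on the type of imaging device".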
- Further, the specification in some places refers to several computers or controllers that perform various control operations. The specific form of computer is not important to the invention. In its preferred form, applicant divides several of the computing, control and analysis operations into several cooperating computers or embedded systems. However, with appropriate programming well known to those of ordinary skill in the art, the inventions can be implemented using a single, high-power computer. Thus, it is not the intention to limit the inventions to any particular form or any number of computers, or to any specific computer network arrangement.
- Likewise, the detailed description below shows at least two embodiments for the display screen. The preferred embodiment discloses the display screen mounted on the head of the user. The second embodiment shows the display screen positioned between the user and the subject, in a manner that is not mounted upon or supported by the head of the user. Additional embodiments also exist, and need not be disclosed. For example, the first embodiment can be modified for use in the eye-piece of a scope of any form, such as used in micro-electronics, biotech, medicine, forensics, chemical research, etc.
- Similarly, the specific arrangement of the icons and menus that appear on the HUD screen, and the associated operations performed by those icons and menu items, are a matter of choice for the specific application. Thus, the invention is not intended to be limited to the specific arrangement and contents of the icons and menus shown and described in the preferred embodiments. For example, the icons and menu items for a selectively controllable heads-up display used by a dentist would likely be different than the arrangement used for a micro-electronics engineer.
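The point that menu arrangement is application-specific can be illustrated by defining the menus as configuration data rather than code. The layouts below are invented examples; only the underlying idea, that retargeting the display (say, from a surgeon to a micro-electronics engineer) means swapping one data table for another, is drawn from the text.

```python
# Hypothetical layouts: the same HUD renderer is retargeted by swapping data.
SURGEON_MENUS = {
    "Vitals": ["Heart rate", "Blood pressure", "SpO2"],
    "Imaging": ["Camera 1", "X-ray", "Magnify"],
    "Records": ["History", "Prior images"],
}

MICROELECTRONICS_MENUS = {
    "Inspection": ["Microscope", "Zoom", "Annotate"],
    "Schematics": ["Layer view", "Netlist"],
}


def build_menu_bar(menus):
    """Flatten a layout into (menu, item) pairs for a renderer to position."""
    return [(menu, item) for menu, items in menus.items() for item in items]


print(len(build_menu_bar(SURGEON_MENUS)))           # prints 8
print(len(build_menu_bar(MICROELECTRONICS_MENUS)))  # prints 5
```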
- Further examples exist throughout the disclosure, and it is not the intention to exclude from the scope of the invention the use of structures, materials or acts that are not expressly identified in the specification, but nonetheless are capable of performing a recited function.
- The inventions of this application are better understood in conjunction with the following Figures and Detailed Description of their preferred embodiments. The various hardware and software elements used to carry out the inventions are illustrated in the attached drawings in the form of block diagrams and flow charts. For simplicity and brevity, the Figures and Detailed Description do not address in detail features that are well known in the prior art, such as the literature listed in the Background of the Invention, above. However, to assure an adequate disclosure, the specification hereby expressly incorporates by reference each and every patent and other publication referenced above in the Background of the Invention.
- FIGS. 1A, 1B and 1C depict side, top and front views, respectively, of a selectable heads-up display worn by a technician, such as a surgeon.
- FIG. 2 shows an example of a basic configuration of an integrated head-mounted display (HMD) and associated computer system used by a technician, such as a surgeon.
- FIG. 3 is a block diagram of the primary components of the heads-up surgeon's display of FIG. 2.
- FIG. 4 is a more detailed block diagram of a preferred embodiment of the heads-up display system.
- FIG. 5 depicts an embodiment of the physical eye-tracking system implemented with an infrared laser.
- FIG. 6A depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display on a transparent or see-through screen.
- FIG. 6B depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a patient's vital signs selected for display on a transparent or see-through screen.
- FIG. 6C depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a view from one of the cameras selected for display on a transparent or see-through screen.
- FIG. 7A depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with a view from one of the cameras selected for display on a portion of the screen that has been made selectively non-transparent.
- FIG. 7B depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple x-ray images from the computer or one of the cameras selected for display on a portion of the screen that has been made selectively non-transparent.
- FIG. 7C depicts an embodiment of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple forms of data and images displayed in windows on a portion of the screen.
- FIG. 7D depicts another form of the display view and associated icons and menu items as seen by the surgeon wearing the heads-up display, with multiple forms of data and images displayed in various windows on a portion of the screen.
- FIGS. 8A, 8B and 8C depict various views of an alternative embodiment implemented with a transparent heads-up display that is not worn by the surgeon, but rather, is movably stationed over the patient.
- Shown in
FIGS. 1A, 1B and 1C are three views of a head-mounted, selectable display system 10. In the preferred embodiment shown in the Figures, the display system 10 is worn by a surgeon 12 performing an operation. However, the system is easily programmed for use by any individual performing detailed procedures in which it is advantageous to see the normal field of view, while also having access to and seeing in that field of view a variety of forms of data relating to the procedure. Thus, it is envisioned that a display in accordance with this invention will have many applications, but is primarily intended for procedures where detailed work is performed on relatively stationary objects. Accordingly, while the description below refers repeatedly to the user as a “surgeon,” it should be understood that other users are included in the scope of the invention. - For convenience, the phrases “head-mounted display,” “heads-up display” and “HUD” are used interchangeably throughout this specification. The major components of the
HUD system 10 as worn by the surgeon 12 include a display screen 14, microphone 16, speaker 17, display driver 18, camera 29, display mirrors 20 and 22, light 31, eye-tracking laser 24, eye-tracking detector 26, and eye-tracking optics 28. In its preferred form, each of these components is integrated into a single, adjustable head piece that is placed on the surgeon's head 30. Once placed on the surgeon's head 30, the various components are adjusted for proper operation. More specifically, the screen 14 is positioned comfortably in front of and in the normal line of sight of the surgeon's eyes 32, and the microphone 16 (placed in front of surgical mask 19) and speaker 17 are positioned so that the surgeon 12 can communicate with selected medical assistants and computers, including an electronic speech recognition system, as discussed in greater detail below. The display driver 18 and display mirrors 20 and 22 are adjusted to superimpose selected images and data with light rays 4 from outside of the HUD 10 on the display screen 14 via optical path 6. Likewise, the eye-tracking laser 24, eye-tracking detector 26, and eye-tracking optics 28 are aligned to detect and communicate eye movement to an eye-tracking computer, as also discussed below. In more compact versions of the head-mounted display system discussed above and shown in FIGS. 1A, 1B, 1C, and 2, an eyeglass frame may be used to support the display surface 14, eye-tracking laser 24, eye-tracking detector 26, eye-tracking optics 28, or other components as desired. - Referring to
FIG. 2, the surgeon 12 wears the HUD system 10 to simultaneously view selected data and images while performing an operation on a patient 34. In a preferred form, the surgeon 12 can selectively control the display screen 14 to operate in either an opaque or a translucent mode. In the opaque mode, the screen 14 will display primarily the data or images generated from a control computer 36 or one or more video or medical imaging input device(s) 38, for example, when conducting particularly detailed or internal surgery. In the translucent mode, the surgeon 12 will be able to see through the display screen 14 to the patient 34, while at the same time seeing the data or images generated by the computer 36 or imaging input device(s) 38. - To simplify the disclosure, only one
imaging input device 38 is shown in FIG. 2. However, it is expressly noted that multiple numbers, and any and all forms, of image-generating systems can be employed in the proposed system. Thus, CCD, video, x-ray, NMR, CAT, and all other medical imaging systems can be employed. In its most basic form, the imaging input device 38 is mounted on a moveable and computer-controllable manipulator assembly or arm 39, so that it can be controlled by the surgeon 12 through the HUD system to move to selected parts of the patient 34, and to obtain and magnify images of particular parts of the body, tissue or organs undergoing surgery. For a more detailed discussion on automatically controlled and moveable cameras, see U.S. Pat. No. 5,249,045 and the patents cited therein, all of which are incorporated herein by reference. - In operation, the
surgeon 12 can command the HUD system 10, under control of computer 36, to selectively display on screen 14 various forms of data, graphics or images, including magnified or otherwise modified images from the imaging input device(s) 38, while at the same time looking through the screen 14 to view the patient 34 and the normal, real-life environment of the operating room. Thus, the surgeon 12 is able to directly view the patient 34, while at the same time select from many forms of data for display on the HUD screen 14. For example, if the surgeon 12 is performing surgery in close proximity to critical organs, the surgeon may wish to see both the movements of his or her hands and a superimposed magnified view from one of the image input device(s) 38. The surgeon 12 can control the video device 38 to move to and focus on the critical area or surgical site, and to obtain and magnify an image for display on the HUD display 14. In addition, the surgeon can control the computer system 36 to record the generated images, and then to display on the HUD screen 14 selected parts thereof after being magnified or otherwise computer-enhanced. In accordance with the invention, the surgeon 12 has the great advantage of being able to simultaneously control the HUD and data acquisition systems, while also looking through the HUD display 14 to watch minute hand movements and other aspects of the local environment. - As explained in greater detail below, the
surgeon 12 may command the computer 36 in several ways. In the preferred mode, the HUD system 10 incorporates eye-tracking to control a cursor that is displayed on the HUD screen 14. More specifically, as shown graphically in FIGS. 6A-6C, the standard display 14 includes menu items or icons that can be selected by a cursor 40 that is in turn controlled by an eye-tracking system. For example, when the surgeon 12 focuses his eyes on a selected icon or menu displayed on the HUD screen 14, the eye-tracking system will correspondingly cause the cursor 40 to move or “track” over the selected icon or menu item. The surgeon can then select the specific icon or menu by voice command or with a foot-operated select button (not shown) operating in a manner similar to the well-known “mouse” button. Thus, the surgeon operates the eye-tracking cursor 40 in a “hands-free” manner to move onto icons and to select various imaging systems, computers or other data sources for display on the HUD display 14. - Alternatively, or in combination with the eye-tracking
cursor 40, the HUD system 10 includes a standard speech recognition sub-system integrated with the command operations. Thus, the surgeon 12 can speak select commands or words to select specific menus or to initiate computer operations to acquire and display select images or data. In the combined speech and eye-tracking mode, the surgeon can use his or her eyes to move the cursor 40 to a particular menu or icon, and then speak commands to perform various operations associated specifically with that menu or icon, such as obtaining or magnifying images, selecting particular parts of patient histories, etc. - As shown in
FIG. 2, the HUD display system 10 preferably communicates with the command computer 36 via any appropriate form of wireless communication, as is well known in the art of computer networking. Thus, the surgeon is shown wearing a radio transmitter 44 and an antenna 46 that operate in a manner that is well known in the art to communicate with computer 36. Although it is preferred to use wireless communication, it is expressly noted that any and all forms of wireless or wired communication can be used, and as a result, the invention is not intended to be limited to any specific form of data communication between the HUD system and the computer 36. - As discussed above, the surgeon may use the HUD system as a hands-free interface to control an imaging input device 38 (shown as a camera in
FIG. 2) and its associated manipulator assembly 39 to obtain images of the patient 34, and to display the images on the HUD display 14. In addition, the surgeon can operate the HUD system as an interface to obtain from computer 36 (or from additional remote or local computers not shown) reference or other patient data, and to display that data on the HUD display 14. - Shown in
FIG. 3 is a general block diagram of the HUD system integrated in a surgeon's environment in accordance with the present invention. The HUD system 10, including the referenced display, eye-tracking, speech recognition and communication elements, is coupled to a main command control computer 36. Also coupled to the command computer 36 are the numerous sensor inputs 47, such as those that monitor vital signs and acquire medical images. An electronic database 48 of the patient's history is either coupled to or included in the memory of the command computer 36. The command computer 36 has access over a standard network 50 to remote sites and computers (not shown). - There are numerous types of
sensor inputs 47 that will monitor the patient 34 and generate data of interest to the surgeon 12. Each such sensor 47 is considered old in the art, and operates to monitor and generate computer data defining the characteristics of the patient 34 in a manner well known to those of ordinary skill in the art. Thus, it is expressly noted that while several specific types of sensor inputs may be described in this specification, any and all types of sensor inputs 47 can be used, as long as the data is generated in a manner that is made accessible to the surgeon 12 by the command control computer 36. - The
patient database 48 includes any and all types of patient data that a surgeon may wish to access, including, for example, the patient's history and prior medical images. While the database 48 is shown in FIG. 3 as being separate from the command computer 36, the database can also be loaded into the data storage portion of the command computer 36. Likewise, the patient database 48 may be located remote from the command computer 36, and can be accessed over the network 50. - In its preferred form, the
network 50 may access not only the standard hospital network, but also remote sites. In that manner, the surgeon 12 can access and communicate with other computers, expert systems or data devices (not shown) that are both local and remote from the surgeon's location. Likewise, specialists remote from the specific operating site may view the operation and communicate or consult directly with the surgeon 12. More specifically, the remote sites can selectively display the operation from any number of cameras in the operating room, and in addition, can selectively display and view the procedure with the same perspective as the surgeon 12 through the HUD display screen 14, using the head-mounted camera 29. In that manner, specialists at the remote sites will see what the surgeon 12 sees, including the view of the patient 34 and the data, menus, icons and cursor shown on the HUD display screen 14. In its preferred form, the video camera 29 is a remote-controlled, high-performance camera that is mounted on the HUD gear worn by the surgeon 12 so that it can selectively view and transmit an image of the HUD screen 14 to the command computer 36, and if desired, to a remote site or storage device (e.g., disk or video tape) controlled thereby. As shown in FIGS. 1B and 1C, the camera 29 can be mounted on the head of the surgeon 12 in a manner so as to make use of the same optics as the display driver 18. In addition, as described below, the head-mounted camera 29 and/or imaging device 38 may be mounted instead to the robotic arm 39, controllable by the surgeon to tilt, pan, zoom or otherwise focus upon selected views of the patient. The head-mounted camera 29 may also be mounted other than on the top of the surgeon's head. For example, the camera 29 can be mounted on the left side of the surgeon's head, wherein additional optics are used to scan the display screen 14. -
FIG. 4 shows a more specific diagram of the main components of the HUD display system 10. The HUD system 10 is coupled to and communicates with the command control computer 36 via a communication link 52. In its preferred form, the communication link comprises two high-speed digital radio transceivers or an optical communication system using fiber optic cable. The link allows video, audio, and/or data to be transported to and from a computer network to and from the operator in the form of graphics, audio, video, text, or other data. For examples of such systems, see R. Gagliardi et al., Optical Communications (John Wiley & Sons, Inc., NY, 1995); C. Lynch et al., Packet Radio Networks: Architectures, Protocols, Technologies and Applications (Pergamon Press, NY, 1987); J. Cabral et al., “Multimedia Systems for Telemedicine and Their Communications Requirements,” IEEE Communications Magazine (July 1996); M. Tsiknakis et al., “Intelligent Image Management in a Distributed PACS and Telemedicine Environment,” IEEE Communications Magazine (July 1996); and A. Hutchison, “Electronic Data Interchange for Health Care,” IEEE Communications Magazine (July 1996). The above publications are incorporated herein by reference. It is noted that the communication link 52 can use any method of communicating appropriate signals or information to and from a computer system and/or network (i.e., including but not limited to a radio transceiver, fiber optic cable, wire, etc.). - One use of the
communication link 52 is to transmit to the command control computer 36 the input commands 54 generated by the surgeon 12. The surgeon 12 generates the input commands 54 in one or more of several alternative manners. Primarily, the commands 54 are generated when an eye-tracking system 56 detects the surgeon's eyes 32 focusing on selected icons or menu items displayed on the HUD screen 14. The icons and menu items then cause the initiation of a corresponding operation, as is common with standard icon-based user interfaces employed with computers running the Macintosh or Windows 95 operating systems. Alternatively, the surgeon 12 may generate the commands orally by speaking select words or commands through microphone 16. A standard voice recognition sub-system or computer 58 interprets the oral sounds output by the microphone 16, and generates digital commands 54 in accordance with well-known speech recognition processes. These speech commands 54 are then passed to command control computer 36 through communication links 52. For more information on standard speech recognition systems, see C. Schmandt, Voice Communication With Computers (Van Nostrand Reinhold, NY, 1994), and C. Baber et al., Interactive Speech Technology: Human Factors Issues in the Application of Speech Input/Output to Computers (Taylor and Francis, PA, 1993), incorporated herein by reference. For more information on programming icon or menu based user interfaces, see J. Sullivan et al., Intelligent User Interfaces (Addison-Wesley Publishing Company, NY, 1991), incorporated herein by reference. - The
communication link 52 is also responsible for routing video images from a camera and lighting system 49 configured on the HUD system. More specifically, the camera and lighting system 49 generate video information under control of the surgeon 12 for display on the HUD screen 14. Thus, using commands generated by speech or from the icon/menu system, the surgeon controls pan driver 65, tilt driver 67, magnification driver 69 and light 31 to focus upon selected scenes for imaging. The pan driver 65 controls pivotal movement in the horizontal direction by the camera, while the tilt driver 67 controls the vertical pivotal scanning movement of the camera. The magnification driver 69 controls the degree of zoom of the image input device 38. The camera drivers each control a respective servo-motor, stepper motor or actuator that moves or controls the associated camera parameter. In that manner, the surgeon can control the camera to focus upon and scan a particular feature (such as a tumor), and to generate and display on the HUD screen 14 highly magnified views thereof. In addition, the head-mounted camera 29 can be controlled to scan the HUD screen 14 to generate, record and transmit to remote sites the view as seen by the surgeon 12. Although only one head-mounted camera 29 is actually shown in the drawings, it should be understood that multiple cameras can be used, including multiple different types of cameras (such as video, television, infra-red), and that those and additional cameras may be controlled by other than the surgeon 12. Thus, the imaging input device 38 can be controlled either manually and/or automatically. - In addition to routing input commands 54, eye vector information from eye-tracking
system 56, and data from image input device(s) 38 to the command control computer 36, the communication links 52 also serve to route control information from the command computer 36 to the HUD system 10 to operate the various sub-systems such as the speaker 17, display driver 18, display screen 14, and imaging input device 38. More specifically, the command computer 36 operates to maintain the contents of the display on HUD screen 14, including maintaining the display of the basic menus and icons in accordance with the mode selected by the surgeon 12, controlling and displaying movement of the cursor 40 (shown in FIGS. 6A, 6B, and 6C) in response to the eye-tracking system 56 and/or speech recognition system 58, and displaying data and images obtained from the numerous available sources. - Thus, the
command computer 36 regularly communicates, through communication links 52, the control signals necessary to operate the display driver or generating system 18, which in turn creates and projects the required elements of the basic user interface through the display optics 20/22 onto the HUD screen 14. As the surgeon 12 moves his eyes 32 to focus upon the various icons and menus of the user interface, the eye-tracking system 56 generates input signals for the command computer 36, which in turn controls the display generating system 18 to correspondingly move the cursor 40 (shown in FIGS. 6A, 6B, and 6C) on the display screen 14 in a manner that tracks the movement of the surgeon's eyes 32. At the same time, the command computer 36 updates the status and contents of the various menus and icons, in a manner familiar to those who use a “point-and-click” user interface, such as found in common Windows 95 and Macintosh computer systems using a mouse, touch-pad or similar device. As various menus and icons are selected, further input signals 54 are generated for use by the command computer 36, for example, to obtain patient data or past or current images. In response, the command control computer 36 carries out the required operations external to the HUD system 10 (such as controlling the sensor inputs 47, which may include imaging input device 38 or inputs from data base information 48 or other network 50, as shown in FIG. 3) to access and obtain the requested data. The command computer 36 then controls the HUD system 10 via communication links 52 to update the screen 14 to display for the surgeon the requested data or images, and to generate audio through speaker 17. - The display driver or generating
system 18, shown in FIG. 4, operates to generate the images that are transmitted through the optics 20/22 and displayed on the HUD screen 14. Such display drivers or systems are well known to those of ordinary skill in the art, and any applicable display system can be used. For example, it is known to use display generating devices such as CRTs, LEDs, laser diodes, LCDs, etc., and this invention is not limited to any particular system, as long as it can generate and display video or computer-generated images onto the display surface/screen 14 or directly into the user's eyes 32, thereby superimposing images over the surgeon's actual field of view. See, for example, the following references related to display systems, each of which is incorporated herein by reference: A. Yoshida et al., “Design and Applications of a High-Resolution Insert Head-Mounted-Display,” Proc. VRAIS '95 (pgs. 84-93, 1995); E. Williams, Liquid Crystals for Electronic Devices (Noyes Data Corporation, NJ, 1975); M. Tidwell et al., “The Virtual Retinal Display—A Retinal Scanning Imaging System,” Proceedings of Virtual Reality World '95 (pgs. 325-334, Munich, Germany: IDG Conferences and Seminars, 1995); J. Kollin, “Optical Engineering Challenges of the Virtual Retinal Display,” Proceedings of the SPIE (Vol. 2537, pgs. 48-60, Bellingham, Wash., 1995); J. Kollin, “A Retinal Display for Virtual-Environment Applications,” Proceedings of Society for Information Display (1993 International Symposium, Digest of Technical Papers, Vol. XXIV, pg. 827, Playa del Rey, Calif.: Society for Information Display, 1993); and G. Robinson, “Display Prototype Uses Eye's Retina as Screen,” Electronic Engineering Times (pgs. 33-34, Apr. 1, 1996). - In its preferred form, the
HUD system 10 uses a projection method for displaying images in the user's field of view using a light source (e.g., CRT, LED, laser diode, LCD projector, etc.). The light source intensity or brightness can be varied in the user's field of view so that the image being displayed can be more visible than the surrounding light. The light source intensity may also be decreased so that the surrounding light becomes more visible and the image being displayed less visible. - Most CRT, LED, and other projection display methods require distance (optical path length) for projecting images. Thus, as shown in
FIG. 1, a CRT/LED projector 18 is used as the display driver. In order to make the display screen 14 as small as possible, display mirrors 20 and 22 are used to transmit the projected images to the screen 14. More specifically, mirrors 20 and 22 are located within the head-mounted system 10, and are positioned outside of the field of view of the user 12. However, they reflect the projected image so that it can be superimposed over a real scene on the display screen/surface 14 formed of a glass or other suitable display material. Another method of displaying images in the user's field of view is by using LCD technology embedded inside the display surface 14. Here, light from the surrounding environment is blocked or filtered by the display when the appropriate voltage is applied to cells in an LCD matrix. Part of the control of the display screen 14 is done through eye-tracking system 56. This eye-tracking system 56 includes eye-tracking electronics 73, an infrared camera 26, eye-tracking optics 28, and an infrared laser 24, all of which are described below. - Shown in
FIG. 5 is an example of an eye-tracking system 56. This system operates in a manner known to those of ordinary skill in the art. Several such systems are readily available, and the invention is not limited to any particular device, system, means, step or method for tracking the eye. For more detail on such eye-tracking systems, see, for example, the following U.S. Pat. Nos. 5,231,674; 5,270,748; 5,341,181; 5,430,505; 5,367,315; 5,345,281; 5,331,149 and 5,471,542, incorporated herein by reference. In its preferred form, a low-power laser 24 generates an infrared eye-tracking laser beam 60. The laser beam is projected through a lens 62 and reflected by a mirror 28 onto the user's eye(s) 32. The user's eyes include a sclera 64, cornea 66, and pupil 68. When the user's eye(s) 32 move, the eye components cause distortions in the infrared laser beam, which are reflected back onto mirror 28, and then through a lens 70 into an infrared photodetector, infrared camera 26 or other type of photodetector. This distortion of the laser beam corresponds to the eye direction vector, which can be measured accurately by eye-tracking electronics 73 (shown in FIG. 4). Data defining the eye direction vector is subsequently transmitted from the eye-tracking electronics 73 to the command computer 36 through the communication links 52. For calibration, the eye-tracking optics, which include mirror 28, lens 62, infrared camera 26, and laser 24, may be automatically adjusted for optimal performance through the use of computer-controlled actuators (not shown). - It is expressly noted that, while separate eye-tracking electronics 73 are shown in the block diagram as carried by the heads-up
display system 10, it is also possible to transmit the raw data from the infrared detector imaging device 26 to the command computer 36, which then determines the associated eye direction vectors. Likewise, the eye-tracking computer (and other electronics, if used) can be worn by the surgeon 12 on a belt or backpack (not shown). - Shown in
FIGS. 6 and 7 are the contents and arrangement of several preferred forms of visual displays for the screen 14, including exemplary icons and menus for the user interface. Referring first to FIG. 6A, an object such as a patient 34 is shown visible through the display 14 in the normal field of view 72 of the user 12. More specifically, a surgeon 12 wearing the HUD system 10 will see the patient 34 on table 75 through the semi-transparent display screen 14. The user will also see a number of icons or menu items positioned within the normal field of view 72. Alternatively, the icons or menu items can be positioned on the periphery of the display screen 14, outside the normal field of view 72. The specific contents and form of the icon or menu items can vary, and may include, for example, items to select cameras, to adjust the display 14, and to display data, such as a patient's vital signs. - For example, as shown in
FIG. 6A, three separate camera icons or menu items are positioned within the user's view. One further menu item 84 is indicated with the label DC and is provided to allow the user to access and vary the characteristics of the screen 14. For example, the DC icon or menu item 84 can be accessed by the user to control brightness, contrast, and the degree to which the user can see through the screen 14. Another icon 86 allows the user to control various devices to obtain and display on screen 14 any of a wide variety of vital signs. - As discussed above, each of the icons or
menu items can be selected by using the eyes 32 to cause the cursor 40 to move over and select the desired icon. For example, referring to FIG. 6B, to see an update of the patient's vital signs, the surgeon can focus his or her eyes 32 on the icon 86 corresponding to the patient's vital signs. The eye-tracking system 56 will track the surgeon's eyes 32 to cause the cursor 40 to scroll to the VITAL icon 86. Depending on the programming of HUD system 10, when the cursor 40 tracks over the VITAL icon 86, the patient's vital signs will be superimposed over a portion of the surgeon's field of view 72. Shown in FIG. 6B is the display of the standard vital signs in analog graphic format. Specifically, graphics are shown for the patient's blood pressure 83, heart rate 85, respiratory rate 87 and body temperature 89. However, any additional vital sign (e.g., blood sugar, oxygen level, blood flow, etc.) can be programmed into the system and selected by the surgeon for display. In addition to, or in place of, the analog displays 83, 85, 87 and 89, digital values and titles can be displayed (not shown). Likewise, the system can be programmed to display the vital signs for a set period of time, continuously, or in an “on-off” fashion. - Referring now to
FIG. 6C, there is shown the same view as FIG. 6B, with an image captured by an image input device 38 superimposed on the normal field of view 72. To select a camera and display its image, the surgeon 12 focuses his eyes upon the associated icon, for example, the CAM1 icon 74. As above, the eye-tracking system 56 causes the cursor 40 to track to icon 74, and correspondingly initiates the desired camera image to be superimposed over the image of the patient 34. In the example shown in FIG. 6C, the image is a 5-times magnified image of an incision 91 in the patient. If desired, using appropriate menu icons, such as, for example, the display control icon 84, the surgeon may also cause the display screen 14 to operate in an opaque mode, displaying only the image of the incision 91 as if on a normal computer display screen. Likewise, the surgeon can magnify or otherwise control an image input device(s) 38 to obtain the image(s) desired, at the appropriate magnification. For example, FIG. 7A shows the display screen 14 operating in an opaque or semi-transparent mode with the incision 91 magnified to a 10-times view. As also shown in FIG. 7A, the cursor 40 has been replaced with a cross hatch formed by dotted lines, allowing the surgeon 12 to precisely select portions of the image to be still further magnified, enhanced, and/or centered in the display. By way of further example, reference is made to FIG. 7B, which depicts the display of skeletal figures 94 and 96, selected as above by the surgeon 12 moving the cursor 40 to still another of the camera icons, for example, CAM2 icon 76. - If desired, the user interface and menu/icon programming can be configured to require the surgeon to take further action after the
cursor 40 tracks over one of the icons. For example, and in the simplest form, once the surgeon causes the cursor to track over a selected icon, nothing may happen until the surgeon “clicks” a foot-operated mouse button (not shown). In more complex forms of the invention, the surgeon can actually access the selected operation by tracking the cursor to the selected icon and then speaking a select code word (e.g., “select” or “open”) into microphone 16, which word or words are interpreted by speech recognition system 58. In still another form of the invention, the surgeon can access the selected operation associated with a particular icon or menu item by blinking a set number of times in quick succession after tracking the cursor 40 to the desired location. The blinking action is detected by the eye-tracking system 56. - In its simplest form, the selection by the surgeon of a specific icon will cause the associated data or images to be displayed on the
screen 14. In such a mode of operation, it is desirable to include numerous icons on the periphery of the field of view 72, so that the surgeon 12 can quickly select an operation. In a more complex form of the invention, a series of menus can be associated with each icon, each menu having successively more detail. For example, instead of having three separate camera icons, a single camera icon can be provided which, when selected by the cursor 40, will change the display to then show the individual icons for each of the many available cameras. Next, when one of the individual camera icons is selected by the cursor 40, the display will again change to show individual icons for the various camera controls, such as those controlling the panning, tilting, rotating, magnification, filters, manipulator members, etc. - As indicated, in more complex forms of the invention, the
HUD system 10 may incorporate a hierarchical icon or menu system, where several layers of menus are necessary to obtain and display desired data. In that case, greater flexibility is desirable in controlling how much data can be displayed on the screen 14 at any given time. More specifically, as shown in FIG. 7C, in the more complex forms of the invention, the well-known programming techniques from the Macintosh or Windows 95 operating systems are adapted and included in command computer 36 to allow the display of multiple windows that can be easily tiled, cascaded, selected, re-sized, or hidden in a manner now familiar to those skilled in the art. Thus, once a specific camera is selected, the surgeon 12 can simply “hide” the image until it is again desired to view it, at which point it can be easily selected. Thus, the surgeon is not required to sequence through each of the menu levels to access the desired image. For example, as shown in FIG. 7C, the user has configured and controlled the display to show the windows or regions having patient data 98, magnified camera view 100, MRI data 102, magnified skeletal view 96, and whole body skeletal view 94. The surgeon 12 can independently hide, re-size, and rearrange each of the windows, along with the transparency level of the screen 14, thereby providing a maximum of flexibility. - The flexibility of the system is further shown in
FIG. 7D, which depicts the display 14 having various user-selected information, data, or images positioned thereon at varying degrees of intensity or transparency. For example, in FIG. 7D the surgeon has placed in the center of the display 14 a semi-transparent window 108 through which the normal field of view 72 displays the patient 34. When the surgeon's eyes 32 are focused on the window 108, the cursor 40 switches to eye-tracking cross hairs. Using the eye-tracking system 56, the cross hairs allow the surgeon 12 to select a specific portion of the patient 34, such as incision 91. The selected portion (e.g., part of the incision 91) of patient 34 is then locked on and magnified as a separate view in a different window 112 in the manner described above, where a different set of selectable cross hairs 115 and 117 are shown for further magnification. As also shown in FIG. 7D, data such as the patient name and procedure description is displayed in the title portion 114 at the top of the display screen/surface 14. The surgeon 12 has selected more detailed patient data to be displayed in a separate window 116 in the lower left-hand corner of display 14. Various medication listings 118 and recommendations 120, along with current cursor or cross-hair coordinates 122, are displayed in a top portion of screen 14. Also selected for display are programmable warning indicators 124, 126, and 128, generally shown at the top right portion of the display screen/surface 14. The warning indicators may be programmed by the surgeon 12 to monitor certain conditions and to visually indicate warnings at desired levels. The same programmable warning indicators will issue various levels of audible warning tones to the surgeon 12 through the speaker 17. In the configuration of FIG.
7D, the surgeon 12 has selected and displayed the vital signs in a separate window 130 at the top left corner of the display screen/surface 14, and a 3D model of a select portion of the patient in window 132, which is continually updated in real time and is shown below the vital signs 130. Other graphical information 110 is updated in real time and displayed at the bottom right corner of the display screen/surface 14. Other pertinent images or data may be selectively displayed in other areas of the display 14, for example skeletal images in area 131 and magnetic resonance imaging (MRI) data 132. All of the described operations can also be effected or initiated by using the speech recognition system 58. - Thus, the overall
medical HUD system 10 is extremely flexible, allowing each surgeon to customize the display and use only the features deemed appropriate, necessary and desirable. In the simplest operating mode, the surgeon may choose to display on a small part of the screen 14 only the patient's vital signs, as shown in FIG. 6B. In other modes, and depending on the procedure, the surgeon may elect to proceed in a semi-transparent mode, with several windows of data or images, as shown in FIG. 7D. - Each of the variations in programming set forth above can be configured for each
specific surgeon 12 in a computer file assigned to that user and stored in command computer 36. Thus, if a particular surgeon 12 prefers to have specific icons or menus shown on specific screens, or, for example, prefers digital over analog displays for vital signs, that user can select those specific settings and the system will perform in accordance therewith. Similarly, the system allows for substantial customization for specific types of surgeons or fields outside of surgery (e.g., microelectronics, forensics, etc.). - Shown in
FIGS. 8A, 8B, and 8C is still another embodiment of the invention that incorporates a non-attached heads-up display screen 134. In FIGS. 8A and 8B, the display screen/surface 134 is shown as a generally flat rectangular semi-transparent display area. The display screen/surface 134 is attached to a top end of a sturdy but flexible post 136 via joint 138, thereby allowing it to be moved to various viewable positions. The joint 138 is also constructed to allow the screen 134 to be moved left or right relative to the post 136. The lower end of post 136 is attached to a base 140 that is supported on the floor. The post 136 is also adjustable in height and can be bent to allow precise positioning of the screen/surface 134 by the user. In a further modification to this non-attached HUD embodiment, as shown in FIG. 8C, the display screen 134 is mounted via robotic arm 142. Display screen/surface 134 of HUD 10 is attached at a lower end of the multi-jointed robotic arm or other manipulator 142. The upper end of arm 142 is coupled to a mounting platform 144 that is fixed to the ceiling 146 of a room. Speech commands such as “Adjust Display On”, “Adjust Display Off”, “Up”, “Down”, “Left”, “Right”, “Forward”, and “Backward” can be used for controlling the position of the retractable display screen/surface 134. Here, the position of the display screen/surface 134 may be robotically controlled by speech signals from an operator 12. Such speech signals may be received by a microphone and processed by a speech recognition system, which thereby sends signals to a robotic microcontroller that drives the appropriate actuators to position the display screen/surface 134 as desired. - In the embodiments of
FIGS. 8A, 8B and 8C, the non-attached heads-up display system operates in a manner as described in the head-mounted HUD display embodiment of FIGS. 1-7, thereby allowing the surgeon to select and display data in a superimposed manner over the normal field of view of the patient 34. Both the speech recognition system 58 and eye-tracking system 56 can be used to move the cursor 40 to activate select icons or menus for operating computer 36 and display 134. However, in the modified forms of the invention shown in FIGS. 8A, 8B and 8C, additional methods for moving the cursor are possible, including using a low-power laser mounted on the head of the surgeon 12, along with a touch screen incorporated in the display screen itself. - The foregoing description of a preferred embodiment and best mode of the invention known to applicant at the time of filing the application has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and many modifications and variations are possible in the light of the above teaching. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application, and to enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated.
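The spoken positioning commands listed for the non-attached display of FIGS. 8A-8C can be sketched as a small command interpreter (an illustrative sketch only; the 2 cm step size and the on/off latch behavior are assumptions not specified above):

```python
# Illustrative sketch of mapping the quoted speech commands ("Adjust Display
# On/Off", "Up", "Down", "Left", "Right", "Forward", "Backward") to actuator
# increments. Step size and the enable latch are assumptions.

STEP_CM = 2.0  # assumed increment per spoken movement command


class DisplayPositioner:
    def __init__(self):
        self.enabled = False
        self.x = self.y = self.z = 0.0  # left/right, up/down, fwd/back (cm)

    def speak(self, phrase):
        """Apply one recognized phrase; return the current position."""
        moves = {"Up": (0, STEP_CM, 0), "Down": (0, -STEP_CM, 0),
                 "Left": (-STEP_CM, 0, 0), "Right": (STEP_CM, 0, 0),
                 "Forward": (0, 0, STEP_CM), "Backward": (0, 0, -STEP_CM)}
        if phrase == "Adjust Display On":
            self.enabled = True
        elif phrase == "Adjust Display Off":
            self.enabled = False
        elif self.enabled and phrase in moves:
            dx, dy, dz = moves[phrase]
            self.x += dx
            self.y += dy
            self.z += dz
        return (self.x, self.y, self.z)


p = DisplayPositioner()
p.speak("Up")                    # ignored: adjustment not yet enabled
p.speak("Adjust Display On")
p.speak("Up"); p.speak("Up"); p.speak("Left")
pos = p.speak("Adjust Display Off")
```

The enable/disable latch prevents ordinary operating-room conversation containing words like "up" from moving the screen.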
- For example, many computer-controlled instruments used in the micro-electronics field have one or more eyepieces through which the operator views a specimen. The preferred embodiment of the heads-up display can be easily modified for use in such applications. More specifically, because the user's eye is relatively aligned with the eye-piece, the eye-tracking laser can be unobtrusively mounted in or below the eye-piece shaft. The user interface can be displayed by a display driver onto the same view seen through the eye-piece. In the same manner as described for the preferred embodiment, the user can see the full realm of data in the normal field of view, while simultaneously controlling the associated computer. Still further modifications are possible without departing from the spirit and scope of the invention.
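The gaze-driven icon selection common to all of the embodiments above, where focusing on an icon for a moment activates it, can be sketched as a dwell-time loop (an illustrative sketch only; the class names and the dwell threshold are assumptions, not taken from the patent):

```python
# Illustrative sketch of gaze-dwell icon selection. An icon's command fires
# once the gaze point has rested inside its bounds for a minimum dwell time,
# giving a "point" without a physical "click". Names/thresholds are assumed.

class Icon:
    def __init__(self, name, x, y, w, h, command):
        self.name, self.command = name, command
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


class DwellSelector:
    def __init__(self, icons, dwell_s=0.8):
        self.icons = icons
        self.dwell_s = dwell_s
        self._current = None   # icon currently under the gaze point
        self._since = 0.0      # time the gaze entered that icon

    def update(self, gx, gy, t):
        """Feed one gaze sample (pixels, seconds); return a command on dwell."""
        hit = next((i for i in self.icons if i.contains(gx, gy)), None)
        if hit is not self._current:
            self._current, self._since = hit, t
            return None
        if hit is not None and t - self._since >= self.dwell_s:
            self._since = t  # re-arm so the command does not repeat each frame
            return hit.command
        return None


icons = [Icon("VITAL", 0, 0, 50, 20, "show_vitals"),
         Icon("CAM1", 60, 0, 50, 20, "select_cam1")]
sel = DwellSelector(icons, dwell_s=0.5)
fired = [c for t in (0.0, 0.2, 0.4, 0.6)
         if (c := sel.update(10, 10, t))]
```

A blink or spoken confirmation, as the description suggests, would simply gate the returned command before it is dispatched.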
- The HUD system's ability to provide high-resolution images directly in the field of view of an operator, without forcing the operator to look away, can greatly enhance the ability to complete an operation in a very precise and controlled manner. This precise control can be incorporated into surgical cutting, probing, and/or positioning tools by clearly presenting the position of such tools on the display with respect to a patient and/or a patient model obtained from real-time imagery. This technique can be very advantageous when the actual orientation or position of such tool(s) is unobtainable by the unassisted eye but direct visual control is required to operate.
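Presenting a tracked tool's position on the display with respect to the patient, as described above, reduces to projecting the tool's 3D coordinates into display coordinates. A minimal sketch follows (the pinhole model, focal length, and coordinate conventions are assumptions; the patent does not specify a projection method):

```python
# Illustrative sketch of overlaying a tracked tool tip on the HUD screen via
# a simple pinhole projection. Focal length, principal point, and the
# camera-frame convention (z forward, y down, metres) are all assumed.

def project_to_display(point_xyz, focal_px=500.0, cx=320, cy=240):
    """Project a 3D point in camera coordinates to display pixel coordinates.

    Returns None when the point lies behind the viewpoint (nothing to draw).
    """
    x, y, z = point_xyz
    if z <= 0:
        return None
    return (int(cx + focal_px * x / z), int(cy + focal_px * y / z))


tool_tip = (0.05, -0.02, 0.5)  # 5 cm right, 2 cm up, 50 cm ahead of the eye
pixel = project_to_display(tool_tip)
```

In a real system the tool pose would come from the tracking sensors and be re-projected every frame so the overlay stays registered with the patient.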
Claims (2)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/042,662 US20050206583A1 (en) | 1996-10-02 | 2005-01-24 | Selectively controllable heads-up display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/720,662 US6847336B1 (en) | 1996-10-02 | 1996-10-02 | Selectively controllable heads-up display system |
US11/042,662 US20050206583A1 (en) | 1996-10-02 | 2005-01-24 | Selectively controllable heads-up display system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/720,662 Continuation US6847336B1 (en) | 1996-10-02 | 1996-10-02 | Selectively controllable heads-up display system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20050206583A1 true US20050206583A1 (en) | 2005-09-22 |
Family
ID=34063582
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/720,662 Expired - Fee Related US6847336B1 (en) | 1996-10-02 | 1996-10-02 | Selectively controllable heads-up display system |
US11/042,662 Abandoned US20050206583A1 (en) | 1996-10-02 | 2005-01-24 | Selectively controllable heads-up display system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/720,662 Expired - Fee Related US6847336B1 (en) | 1996-10-02 | 1996-10-02 | Selectively controllable heads-up display system |
Country Status (1)
Country | Link |
---|---|
US (2) | US6847336B1 (en) |
Cited By (219)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030107643A1 (en) * | 2001-08-17 | 2003-06-12 | Byoungyi Yoon | Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion |
US20040246269A1 (en) * | 2002-11-29 | 2004-12-09 | Luis Serra | System and method for managing a plurality of locations of interest in 3D data displays ("Zoom Context") |
US20050190181A1 (en) * | 1999-08-06 | 2005-09-01 | Canon Kabushiki Kaisha | Image processing method and apparatus |
US20050203367A1 (en) * | 2001-06-13 | 2005-09-15 | Ahmed Syed N | Guide system |
US20050228281A1 (en) * | 2004-03-31 | 2005-10-13 | Nefos Thomas P | Handheld diagnostic ultrasound system with head mounted display |
US20070174060A1 (en) * | 2001-12-20 | 2007-07-26 | Canon Kabushiki Kaisha | Control apparatus |
US20070200863A1 (en) * | 2005-12-28 | 2007-08-30 | Depuy Products, Inc. | System and method for wearable user interface in computer assisted surgery |
DE102006011233A1 (en) * | 2006-03-10 | 2007-09-13 | Siemens Ag | Image representation optimizing method, involves optimizing image representation on imaging device e.g. monitor, by darkening area in which device is arranged and by increasing contrast of device when e.g. cardiologist, looks to device |
US20080030404A1 (en) * | 2005-09-20 | 2008-02-07 | Irwin L Newberg | Antenna transceiver system |
US20080039818A1 (en) * | 2006-08-11 | 2008-02-14 | Siemens Aktiengesellschaft | Technical medical system and method for operating it |
US20080045807A1 (en) * | 2006-06-09 | 2008-02-21 | Psota Eric T | System and methods for evaluating and monitoring wounds |
US20080097176A1 (en) * | 2006-09-29 | 2008-04-24 | Doug Music | User interface and identification in a medical device systems and methods |
US20080107361A1 (en) * | 2006-11-07 | 2008-05-08 | Sony Corporation | Imaging apparatus, display apparatus, imaging method, and display method |
US20080129839A1 (en) * | 2006-11-07 | 2008-06-05 | Sony Corporation | Imaging apparatus and imaging method |
US20080181452A1 (en) * | 2006-07-25 | 2008-07-31 | Yong-Moo Kwon | System and method for Three-dimensional interaction based on gaze and system and method for tracking Three-dimensional gaze |
US20080180521A1 (en) * | 2007-01-29 | 2008-07-31 | Ahearn David J | Multi-view system |
US20080191950A1 (en) * | 2007-02-13 | 2008-08-14 | Raytheon Company | Conformal electronically scanned phased array antenna and communication system for helmets and other platforms |
US20080253695A1 (en) * | 2007-04-10 | 2008-10-16 | Sony Corporation | Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program |
US20080259199A1 (en) * | 2006-12-07 | 2008-10-23 | Sony Corporation | Image display system, display apparatus, and display method |
US20090040231A1 (en) * | 2007-08-06 | 2009-02-12 | Sony Corporation | Information processing apparatus, system, and method thereof |
EP2120212A1 (en) * | 2007-03-12 | 2009-11-18 | Sony Corporation | Image processing device, image processing method, and image processing system |
US20100045569A1 (en) * | 2008-08-22 | 2010-02-25 | Leonardo William Estevez | Display Systems and Methods for Mobile Devices |
US20100085462A1 (en) * | 2006-10-16 | 2010-04-08 | Sony Corporation | Display apparatus, display method |
US20100113940A1 (en) * | 2008-01-10 | 2010-05-06 | The Ohio State University Research Foundation | Wound goggles |
US20100164990A1 (en) * | 2005-08-15 | 2010-07-01 | Koninklijke Philips Electronics, N.V. | System, apparatus, and method for augmented reality glasses for end-user programming |
US20100241992A1 (en) * | 2009-03-21 | 2010-09-23 | Shenzhen Futaihong Precision Industry Co., Ltd. | Electronic device and method for operating menu items of the electronic device |
US20100328204A1 (en) * | 2009-06-25 | 2010-12-30 | The Boeing Company | Virtual Control Station |
US20110043644A1 (en) * | 2008-04-02 | 2011-02-24 | Esight Corp. | Apparatus and Method for a Dynamic "Region of Interest" in a Display System |
US20110069041A1 (en) * | 2005-03-18 | 2011-03-24 | Cohen Alexander J | Machine-differentiatable identifiers having a commonly accepted meaning |
US20110161998A1 (en) * | 2009-12-31 | 2011-06-30 | Motorola, Inc. | Systems and Methods Providing Content on a Display Based Upon Facial Recognition of a Viewer |
US20110221657A1 (en) * | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Optical stabilization of displayed content with a variable lens |
WO2011116332A2 (en) * | 2010-03-18 | 2011-09-22 | SPI Surgical, Inc. | Surgical cockpit comprising multisensory and multimodal interfaces for robotic surgery and methods related thereto |
US20110298621A1 (en) * | 2010-06-02 | 2011-12-08 | Lokesh Shanbhag | System and method for generating alerts |
WO2011156195A2 (en) * | 2010-06-09 | 2011-12-15 | Dynavox Systems Llc | Speech generation device with a head mounted display unit |
US20120021806A1 (en) * | 2010-07-23 | 2012-01-26 | Maltz Gregory A | Unitized, Vision-Controlled, Wireless Eyeglass Transceiver |
US20120026191A1 (en) * | 2010-07-05 | 2012-02-02 | Sony Ericsson Mobile Communications Ab | Method for displaying augmentation information in an augmented reality system |
US20120069050A1 (en) * | 2010-09-16 | 2012-03-22 | Heeyeon Park | Transparent display device and method for providing information using the same |
US20120098752A1 (en) * | 1997-09-19 | 2012-04-26 | Rolus Borgward Glenn | Digital Book |
US20120206323A1 (en) * | 2010-02-28 | 2012-08-16 | Osterhout Group, Inc. | Ar glasses with event and sensor triggered ar eyepiece interface to external devices |
US20120212414A1 (en) * | 2010-02-28 | 2012-08-23 | Osterhout Group, Inc. | Ar glasses with event and sensor triggered control of ar eyepiece applications |
US20120249400A1 (en) * | 2009-12-22 | 2012-10-04 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Signal processing eye protecting digital glasses |
US20120287040A1 (en) * | 2011-05-10 | 2012-11-15 | Raytheon Company | System and Method for Operating a Helmet Mounted Display |
US20130002525A1 (en) * | 2011-06-29 | 2013-01-03 | Bobby Duane Foote | System for locating a position of an object |
US20130044130A1 (en) * | 2011-08-17 | 2013-02-21 | Kevin A. Geisner | Providing contextual personal information by a mixed reality device |
US20130169533A1 (en) * | 2011-12-29 | 2013-07-04 | Grinbath, Llc | System and Method of Cursor Position Control Based on the Vestibulo-Ocular Reflex |
US8531394B2 (en) | 2010-07-23 | 2013-09-10 | Gregory A. Maltz | Unitized, vision-controlled, wireless eyeglasses transceiver |
US20130242262A1 (en) * | 2005-10-07 | 2013-09-19 | Percept Technologies Inc. | Enhanced optical and perceptual digital eyewear |
WO2013138647A1 (en) * | 2012-03-15 | 2013-09-19 | Google Inc. | Using convergence angle to select among different ui elements |
US20130257832A1 (en) * | 2012-03-30 | 2013-10-03 | Exelis, Inc. | Image pickoff apparatus system and method |
US8599174B2 (en) | 2005-03-18 | 2013-12-03 | The Invention Science Fund I, Llc | Verifying a written expression |
US8617197B2 (en) | 2010-03-18 | 2013-12-31 | SPI Surgical, Inc. | Introducer device |
US8643951B1 (en) | 2012-03-15 | 2014-02-04 | Google Inc. | Graphical menu and interaction therewith through a viewing window |
US8640959B2 (en) | 2005-03-18 | 2014-02-04 | The Invention Science Fund I, Llc | Acquisition of a user expression and a context of the expression |
WO2014009859A3 (en) * | 2012-07-10 | 2014-03-06 | Aïmago S.A. | Perfusion assessment multi-modality optical medical device |
WO2014047402A1 (en) * | 2012-09-20 | 2014-03-27 | MUSC Foundation for Research and Development | Head-mounted systems and methods for providing inspection, evaluation or assessment of an event or location |
US20140098135A1 (en) * | 2012-10-05 | 2014-04-10 | Elwha Llc | Correlating user reactions with augmentations displayed through augmented views |
US20140098134A1 (en) * | 2012-10-05 | 2014-04-10 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US20140146038A1 (en) * | 2012-11-28 | 2014-05-29 | International Business Machines Corporation | Augmented display of internal system components |
JP2014145734A (en) * | 2013-01-30 | 2014-08-14 | Nikon Corp | Information input/output device, and information input/output method |
US20140282196A1 (en) * | 2013-03-15 | 2014-09-18 | Intuitive Surgical Operations, Inc. | Robotic system providing user selectable actions associated with gaze tracking |
US20140275760A1 (en) * | 2013-03-13 | 2014-09-18 | Samsung Electronics Co., Ltd. | Augmented reality image display system and surgical robot system comprising the same |
CN104055478A (en) * | 2014-07-08 | 2014-09-24 | 金纯 | Medical endoscope control system based on sight tracking control |
US8860660B2 (en) | 2011-12-29 | 2014-10-14 | Grinbath, Llc | System and method of determining pupil center position |
DE102013107041A1 (en) * | 2013-04-18 | 2014-10-23 | Carl Gustav Carus Management Gmbh | Ultrasound system and method for communication between an ultrasound device and bidirectional data goggles |
US8897605B2 (en) | 2005-03-18 | 2014-11-25 | The Invention Science Fund I, Llc | Decoding digital information included in a hand-formed expression |
CN104166239A (en) * | 2014-08-25 | 2014-11-26 | 成都贝思达光电科技有限公司 | Head-worn video glasses view-finding device for high definition camera |
DE102013210354A1 (en) * | 2013-06-04 | 2014-12-04 | Bayerische Motoren Werke Aktiengesellschaft | Eye-controlled interaction for data glasses |
US8928632B2 (en) | 2005-03-18 | 2015-01-06 | The Invention Science Fund I, Llc | Handwriting regions keyed to a data receptor |
CN104298344A (en) * | 2013-07-16 | 2015-01-21 | 精工爱普生株式会社 | Information processing apparatus, information processing method, and information processing system |
CN104298499A (en) * | 2013-07-16 | 2015-01-21 | 精工爱普生株式会社 | Information processing apparatus, information processing method, and information processing system |
DE102013013698A1 (en) * | 2013-08-16 | 2015-02-19 | Audi Ag | Method for operating electronic data glasses and electronic data glasses |
ITMI20131527A1 (en) * | 2013-09-17 | 2015-03-18 | Menci Software S R L | SURGICAL DISPLAY DEVICE |
US8996413B2 (en) | 2012-12-28 | 2015-03-31 | Wal-Mart Stores, Inc. | Techniques for detecting depleted stock |
WO2015054322A1 (en) * | 2013-10-07 | 2015-04-16 | Avegant Corporation | Multi-mode wearable apparatus for accessing media content |
US20150123880A1 (en) * | 2013-11-04 | 2015-05-07 | Weng-Kong TAM | Digital loupe device |
US9035878B1 (en) | 2012-02-29 | 2015-05-19 | Google Inc. | Input system |
US9046999B1 (en) * | 2010-06-08 | 2015-06-02 | Google Inc. | Dynamic input at a touch-based interface based on pressure |
US20150153913A1 (en) * | 2013-12-01 | 2015-06-04 | Apx Labs, Llc | Systems and methods for interacting with a virtual menu |
US9063650B2 (en) | 2005-03-18 | 2015-06-23 | The Invention Science Fund I, Llc | Outputting a saved hand-formed expression |
US9091851B2 (en) | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US9097891B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
US9105126B2 (en) | 2012-10-05 | 2015-08-11 | Elwha Llc | Systems and methods for sharing augmentation data |
US9111384B2 (en) | 2012-10-05 | 2015-08-18 | Elwha Llc | Systems and methods for obtaining and using augmentation data and for sharing usage data |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US9128281B2 (en) | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US9134534B2 (en) | 2010-02-28 | 2015-09-15 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including a modular image source |
US9141188B2 (en) | 2012-10-05 | 2015-09-22 | Elwha Llc | Presenting an augmented view in response to acquisition of data inferring user activity |
US20150268483A1 (en) * | 2005-10-07 | 2015-09-24 | Percept Technologies Inc. | Enhanced optical and perceptual digital eyewear |
US20150290031A1 (en) * | 2013-05-16 | 2015-10-15 | Wavelight Gmbh | Touchless user interface for ophthalmic devices |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US20150355815A1 (en) * | 2013-01-15 | 2015-12-10 | Poow Innovation Ltd | Dynamic icons |
US9213405B2 (en) | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
US9223134B2 (en) | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9292086B2 (en) | 2012-09-26 | 2016-03-22 | Grinbath, Llc | Correlating pupil position to gaze location within a scene |
WO2016064800A1 (en) * | 2014-10-20 | 2016-04-28 | Mayo Foundation For Medical Education And Research | Imaging data capture and video streaming system |
US9341843B2 (en) | 2010-02-28 | 2016-05-17 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a small scale image source |
US9366862B2 (en) | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US20160220105A1 (en) * | 2015-02-03 | 2016-08-04 | Francois Duret | Device for viewing an interior of a mouth |
WO2016133644A1 (en) * | 2015-02-20 | 2016-08-25 | Covidien Lp | Operating room and surgical site awareness |
US20160349539A1 (en) * | 2015-05-26 | 2016-12-01 | Lumenis Ltd. | Laser safety glasses with an improved imaging system |
US9536350B2 (en) | 2011-08-24 | 2017-01-03 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9576329B2 (en) * | 2014-07-31 | 2017-02-21 | Ciena Corporation | Systems and methods for equipment installation, configuration, maintenance, and personnel training |
DE102015216917A1 (en) * | 2015-09-03 | 2017-03-09 | Siemens Healthcare Gmbh | System for presenting an augmented reality about an operator |
US9626072B2 (en) | 2012-11-07 | 2017-04-18 | Honda Motor Co., Ltd. | Eye gaze control system |
US9639964B2 (en) | 2013-03-15 | 2017-05-02 | Elwha Llc | Dynamically preserving scene elements in augmented reality systems |
US9645640B2 (en) | 2013-12-21 | 2017-05-09 | Audi Ag | Device and method for navigating within a menu for controlling a vehicle, and selecting a menu entry from the menu |
RU2619794C1 (en) * | 2010-06-07 | 2017-05-18 | The Boeing Company | Virtual control station |
US20170163866A1 (en) * | 2013-07-24 | 2017-06-08 | Google Inc. | Input System |
US20170186157A1 (en) * | 2015-12-23 | 2017-06-29 | Siemens Healthcare Gmbh | Method and system for outputting augmented reality information |
US9757039B2 (en) | 2008-07-10 | 2017-09-12 | Ecole Polytechnique Federale De Lausanne (Epfl) | Functional optical coherent imaging |
US9823474B2 (en) | 2015-04-02 | 2017-11-21 | Avegant Corp. | System, apparatus, and method for displaying an image with a wider field of view |
CN107427216A (en) * | 2015-12-25 | 2017-12-01 | 韦斯特尤尼蒂斯株式会社 | Medical system |
WO2017210497A1 (en) * | 2016-06-03 | 2017-12-07 | Covidien Lp | Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator |
CN107850778A (en) * | 2015-06-05 | 2018-03-27 | Mark Lemchen | Apparatus and method for image capture of medical or dental images using head mounted camera and computer system |
CN108153424A (en) * | 2015-06-03 | 2018-06-12 | 塔普翊海(上海)智能科技有限公司 | The eye of aobvious equipment is moved moves exchange method with head |
US9995857B2 (en) | 2015-04-03 | 2018-06-12 | Avegant Corp. | System, apparatus, and method for displaying an image using focal modulation |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US10021430B1 (en) | 2006-02-10 | 2018-07-10 | Percept Technologies Inc | Method and system for distribution of media |
US10025486B2 (en) | 2013-03-15 | 2018-07-17 | Elwha Llc | Cross-reality select, drag, and drop for augmented reality systems |
US10109075B2 (en) | 2013-03-15 | 2018-10-23 | Elwha Llc | Temporal element restoration in augmented reality systems |
US10169862B2 (en) | 2015-05-07 | 2019-01-01 | Novadaq Technologies ULC | Methods and systems for laser speckle imaging of tissue using a color image sensor |
US10180572B2 (en) | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
US10195058B2 (en) | 2013-05-13 | 2019-02-05 | The Johns Hopkins University | Hybrid augmented reality multimodal operation neural integration environment |
EP2923638B1 (en) * | 2011-03-18 | 2019-02-20 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Optical measuring device and system |
US10269179B2 (en) | 2012-10-05 | 2019-04-23 | Elwha Llc | Displaying second augmentations that are based on registered first augmentations |
US10278782B2 (en) * | 2014-03-19 | 2019-05-07 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking |
US10303242B2 (en) | 2014-01-06 | 2019-05-28 | Avegant Corp. | Media chair apparatus, system, and method |
US10306037B2 (en) * | 2008-09-30 | 2019-05-28 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US20190272029A1 (en) * | 2012-10-05 | 2019-09-05 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US10409443B2 (en) | 2015-06-24 | 2019-09-10 | Microsoft Technology Licensing, Llc | Contextual cursor display based on hand tracking |
US10409079B2 (en) | 2014-01-06 | 2019-09-10 | Avegant Corp. | Apparatus, system, and method for displaying an image using a plate |
US20190290101A1 (en) * | 2018-03-23 | 2019-09-26 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
US10432922B2 (en) | 2014-03-19 | 2019-10-01 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking for stereo viewer |
US10527847B1 (en) | 2005-10-07 | 2020-01-07 | Percept Technologies Inc | Digital eyewear |
US10528130B2 (en) | 2010-07-23 | 2020-01-07 | Telepatheye Inc. | Unitized eye-tracking wireless eyeglasses system |
US10539787B2 (en) | 2010-02-28 | 2020-01-21 | Microsoft Technology Licensing, Llc | Head-worn adaptive display |
US10575737B2 (en) | 2012-04-27 | 2020-03-03 | Novadaq Technologies ULC | Optical coherent imaging medical device |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
US10726765B2 (en) | 2018-02-15 | 2020-07-28 | Valve Corporation | Using tracking of display device to control image display |
EP3726834A1 (en) * | 2019-04-17 | 2020-10-21 | Medneo GmbH | Telepresence system and method |
CN111904768A (en) * | 2020-08-27 | 2020-11-10 | Shanghai United Imaging Healthcare Co., Ltd. | Medical equipment scanning intra-aperture image display method and medical equipment |
JP2020533681A (en) * | 2017-09-08 | 2020-11-19 | サージカル シアター インコーポレイテッド | Dual mode augmented reality surgery system |
US10860100B2 (en) | 2010-02-28 | 2020-12-08 | Microsoft Technology Licensing, Llc | AR glasses with predictive control of external device based on event input |
EP3764367A1 (en) * | 2019-07-11 | 2021-01-13 | Milestone S.r.l. | System and method for medical gross examination |
US10962789B1 (en) | 2013-03-15 | 2021-03-30 | Percept Technologies Inc | Digital eyewear system and method for the treatment and prevention of migraines and photophobia |
KR20210048954A (en) * | 2019-10-24 | 2021-05-04 | Meere Company Inc. | Surgical system using surgical robot |
TWI731430B (en) * | 2019-10-04 | 2021-06-21 | 財團法人工業技術研究院 | Information display method and information display system |
US20210260749A1 (en) * | 2004-02-26 | 2021-08-26 | Teladoc Health, Inc. | Graphical interface for a remote presence system |
US11112865B1 (en) * | 2019-02-13 | 2021-09-07 | Facebook Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
US11219428B2 (en) * | 2014-01-29 | 2022-01-11 | Becton, Dickinson And Company | Wearable electronic device for enhancing visualization during insertion of an invasive device |
JP2022509460A (en) * | 2018-10-25 | 2022-01-20 | Beyeonics Surgical Ltd. | UI for head-mounted display system |
US20220104910A1 (en) * | 2020-10-02 | 2022-04-07 | Ethicon Llc | Monitoring of user visual gaze to control which display system displays the primary information |
WO2022070076A1 (en) * | 2020-10-02 | 2022-04-07 | Cilag Gmbh International | Reconfiguration of display sharing |
US11382642B2 (en) | 2010-02-11 | 2022-07-12 | Cilag Gmbh International | Rotatable cutting implements with friction reducing material for ultrasonic surgical instruments |
US11419626B2 (en) | 2012-04-09 | 2022-08-23 | Cilag Gmbh International | Switch arrangements for ultrasonic surgical instruments |
US11426191B2 (en) | 2012-06-29 | 2022-08-30 | Cilag Gmbh International | Ultrasonic surgical instruments with distally positioned jaw assemblies |
US11452525B2 (en) | 2019-12-30 | 2022-09-27 | Cilag Gmbh International | Surgical instrument comprising an adjustment system |
US11471209B2 (en) | 2014-03-31 | 2022-10-18 | Cilag Gmbh International | Controlling impedance rise in electrosurgical medical devices |
US11559347B2 (en) | 2015-09-30 | 2023-01-24 | Cilag Gmbh International | Techniques for circuit topologies for combined generator |
US11583306B2 (en) | 2012-06-29 | 2023-02-21 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US11589916B2 (en) | 2019-12-30 | 2023-02-28 | Cilag Gmbh International | Electrosurgical instruments with electrodes having variable energy densities |
US11660089B2 (en) | 2019-12-30 | 2023-05-30 | Cilag Gmbh International | Surgical instrument comprising a sensing system |
US11666375B2 (en) | 2015-10-16 | 2023-06-06 | Cilag Gmbh International | Electrode wiping surgical device |
US11672534B2 (en) | 2020-10-02 | 2023-06-13 | Cilag Gmbh International | Communication capability of a smart stapler |
US11684402B2 (en) | 2016-01-15 | 2023-06-27 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on tissue characterization |
US11684412B2 (en) | 2019-12-30 | 2023-06-27 | Cilag Gmbh International | Surgical instrument with rotatable and articulatable surgical end effector |
US11696776B2 (en) | 2019-12-30 | 2023-07-11 | Cilag Gmbh International | Articulatable surgical instrument |
US11717311B2 (en) | 2012-06-29 | 2023-08-08 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US11717706B2 (en) | 2009-07-15 | 2023-08-08 | Cilag Gmbh International | Ultrasonic surgical instruments |
US11723716B2 (en) | 2019-12-30 | 2023-08-15 | Cilag Gmbh International | Electrosurgical instrument with variable control mechanisms |
US11748924B2 (en) | 2020-10-02 | 2023-09-05 | Cilag Gmbh International | Tiered system display control based on capacity and user operation |
US11759251B2 (en) | 2019-12-30 | 2023-09-19 | Cilag Gmbh International | Control program adaptation based on device status and user input |
US11779387B2 (en) | 2019-12-30 | 2023-10-10 | Cilag Gmbh International | Clamp arm jaw to minimize tissue sticking and improve tissue control |
US11779329B2 (en) | 2019-12-30 | 2023-10-10 | Cilag Gmbh International | Surgical instrument comprising a flex circuit including a sensor system |
US11786291B2 (en) | 2019-12-30 | 2023-10-17 | Cilag Gmbh International | Deflectable support of RF energy electrode with respect to opposing ultrasonic blade |
US11798676B2 (en) * | 2012-09-17 | 2023-10-24 | DePuy Synthes Products, Inc. | Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking |
US20230356095A1 (en) * | 2021-07-09 | 2023-11-09 | Gel Blaster, Inc. | Smart target co-witnessing hit attribution system and method |
US11812957B2 (en) | 2019-12-30 | 2023-11-14 | Cilag Gmbh International | Surgical instrument comprising a signal interference resolution system |
US11830602B2 (en) | 2020-10-02 | 2023-11-28 | Cilag Gmbh International | Surgical hub having variable interconnectivity capabilities |
US11864820B2 (en) | 2016-05-03 | 2024-01-09 | Cilag Gmbh International | Medical device with a bilateral jaw configuration for nerve stimulation |
US11871982B2 (en) | 2009-10-09 | 2024-01-16 | Cilag Gmbh International | Surgical generator for ultrasonic and electrosurgical devices |
US11871955B2 (en) | 2012-06-29 | 2024-01-16 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US11877897B2 (en) | 2020-10-02 | 2024-01-23 | Cilag Gmbh International | Situational awareness of instruments location and individualization of users to control displays |
US11883022B2 (en) | 2020-10-02 | 2024-01-30 | Cilag Gmbh International | Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information |
US11903634B2 (en) | 2015-06-30 | 2024-02-20 | Cilag Gmbh International | Surgical instrument with user adaptable techniques |
US11911063B2 (en) | 2019-12-30 | 2024-02-27 | Cilag Gmbh International | Techniques for detecting ultrasonic blade to electrode contact and reducing power to ultrasonic blade |
US11937863B2 (en) | 2019-12-30 | 2024-03-26 | Cilag Gmbh International | Deflectable electrode with variable compression bias along the length of the deflectable electrode |
US11937866B2 (en) | 2019-12-30 | 2024-03-26 | Cilag Gmbh International | Method for an electrosurgical procedure |
US11944366B2 (en) | 2019-12-30 | 2024-04-02 | Cilag Gmbh International | Asymmetric segmented ultrasonic support pad for cooperative engagement with a movable RF electrode |
US11950797B2 (en) | 2019-12-30 | 2024-04-09 | Cilag Gmbh International | Deflectable electrode with higher distal bias relative to proximal bias |
US11963683B2 (en) | 2020-10-02 | 2024-04-23 | Cilag Gmbh International | Method for operating tiered operation modes in a surgical system |
US11974772B2 (en) | 2016-01-15 | 2024-05-07 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with variable motor control limits |
US11989930B2 (en) | 2018-10-25 | 2024-05-21 | Beyeonics Surgical Ltd. | UI for head mounted display system |
US11986201B2 (en) | 2019-12-30 | 2024-05-21 | Cilag Gmbh International | Method for operating a surgical instrument |
US11992372B2 (en) | 2020-10-02 | 2024-05-28 | Cilag Gmbh International | Cooperative surgical displays |
US11998230B2 (en) | 2016-11-29 | 2024-06-04 | Cilag Gmbh International | End effector control and calibration |
US12016566B2 (en) | 2020-10-02 | 2024-06-25 | Cilag Gmbh International | Surgical instrument with adaptive function controls |
US12023086B2 (en) | 2019-12-30 | 2024-07-02 | Cilag Gmbh International | Electrosurgical instrument for delivering blended energy modalities to tissue |
JP7523466B2 (en) | 2019-03-29 | 2024-07-26 | Razmik Ghazaryan | Method and apparatus for variable resolution screens |
US12051214B2 (en) | 2020-05-12 | 2024-07-30 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US12053224B2 (en) | 2019-12-30 | 2024-08-06 | Cilag Gmbh International | Variation in electrode parameters and deflectable electrode to modify energy density and tissue interaction |
US12064293B2 (en) | 2020-10-02 | 2024-08-20 | Cilag Gmbh International | Field programmable surgical visualization system |
US12064109B2 (en) | 2019-12-30 | 2024-08-20 | Cilag Gmbh International | Surgical instrument comprising a feedback control circuit |
US12076006B2 (en) | 2019-12-30 | 2024-09-03 | Cilag Gmbh International | Surgical instrument comprising an orientation detection system |
US12082808B2 (en) | 2019-12-30 | 2024-09-10 | Cilag Gmbh International | Surgical instrument comprising a control system responsive to software configurations |
US12114912B2 (en) | 2019-12-30 | 2024-10-15 | Cilag Gmbh International | Non-biased deflectable electrode to minimize contact between ultrasonic blade and electrode |
US12114914B2 (en) | 2016-08-05 | 2024-10-15 | Cilag Gmbh International | Methods and systems for advanced harmonic energy |
US12126916B2 (en) | 2018-09-27 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
US12178403B2 (en) | 2016-11-24 | 2024-12-31 | University Of Washington | Light field capture and rendering for head-mounted displays |
US12193698B2 (en) | 2016-01-15 | 2025-01-14 | Cilag Gmbh International | Method for self-diagnosing operation of a control switch in a surgical instrument system |
US12213801B2 (en) | 2020-10-02 | 2025-02-04 | Cilag Gmbh International | Surgical visualization and particle trend analysis system |
WO2025041183A1 (en) * | 2023-08-22 | 2025-02-27 | Dal Pont Medical S.r.l.s. | Optical device with an augmented reality management system |
US12239360B2 (en) | 2016-01-15 | 2025-03-04 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on button displacement, intensity, or local tissue characterization |
US12261988B2 (en) | 2021-11-08 | 2025-03-25 | Proprio, Inc. | Methods for generating stereoscopic views in multicamera systems, and associated devices and systems |
US12262937B2 (en) | 2020-05-29 | 2025-04-01 | Cilag Gmbh International | User interface for surgical instrument with combination energy modality end-effector |
Families Citing this family (363)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7248232B1 (en) * | 1998-02-25 | 2007-07-24 | Semiconductor Energy Laboratory Co., Ltd. | Information processing device |
US6674449B1 (en) * | 1998-11-25 | 2004-01-06 | Ge Medical Systems Global Technology Company, Llc | Multiple modality interface for imaging systems |
DE50104533D1 (en) * | 2000-01-27 | 2004-12-23 | Siemens Ag | SYSTEM AND METHOD FOR VIEWPOINTED LANGUAGE PROCESSING |
US20020010734A1 (en) * | 2000-02-03 | 2002-01-24 | Ebersole John Franklin | Internetworked augmented reality system and method |
US7057582B2 (en) * | 2000-03-15 | 2006-06-06 | Information Decision Technologies, Llc | Ruggedized instrumented firefighter's self contained breathing apparatus |
DE10041104C1 (en) * | 2000-08-22 | 2002-03-07 | Siemens Ag | Device and method for communication between a mobile data processing device and a stationary data processing device |
JP2002207832A (en) * | 2000-12-28 | 2002-07-26 | Atsushi Takahashi | Distribution system of internet technology instruction education, and instruction system using communication network |
US6791581B2 (en) | 2001-01-31 | 2004-09-14 | Microsoft Corporation | Methods and systems for synchronizing skin properties |
US7073130B2 (en) * | 2001-01-31 | 2006-07-04 | Microsoft Corporation | Methods and systems for creating skins |
JP4878083B2 (en) * | 2001-03-13 | 2012-02-15 | キヤノン株式会社 | Image composition apparatus and method, and program |
GB0122601D0 (en) * | 2001-09-19 | 2001-11-07 | Imp College Innovations Ltd | Manipulation of image data |
US7418480B2 (en) * | 2001-10-25 | 2008-08-26 | Ge Medical Systems Global Technology Company, Llc | Medical imaging data streaming |
US20030160755A1 (en) * | 2002-02-28 | 2003-08-28 | Palm, Inc. | Detachable expandable flexible display |
US6923652B2 (en) * | 2002-02-21 | 2005-08-02 | Roger Edward Kerns | Nonverbal communication device and method |
US6910911B2 (en) | 2002-06-27 | 2005-06-28 | Vocollect, Inc. | Break-away electrical connector |
US7259906B1 (en) * | 2002-09-03 | 2007-08-21 | Cheetah Omni, Llc | System and method for voice control of medical devices |
DE10242262A1 (en) * | 2002-09-12 | 2004-03-25 | Daimlerchrysler Ag | Stereo vision system for assisting night vision in vehicles has arrangement for generating stereoscopic reproduction of image signals acquired by at least two night vision-compatible cameras |
GB0222265D0 (en) * | 2002-09-25 | 2002-10-30 | Imp College Innovations Ltd | Control of robotic manipulation |
US20040196958A1 (en) * | 2002-11-29 | 2004-10-07 | Werner Beck | Operating device for a diagnostic imaging unit |
US7050078B2 (en) * | 2002-12-19 | 2006-05-23 | Accenture Global Services Gmbh | Arbitrary object tracking augmented reality applications |
US20040196399A1 (en) * | 2003-04-01 | 2004-10-07 | Stavely Donald J. | Device incorporating retina tracking |
US7623892B2 (en) | 2003-04-02 | 2009-11-24 | Palm, Inc. | System and method for enabling a person to switch use of computing devices |
US7922321B2 (en) | 2003-10-09 | 2011-04-12 | Ipventure, Inc. | Eyewear supporting after-market electrical components |
US8109629B2 (en) | 2003-10-09 | 2012-02-07 | Ipventure, Inc. | Eyewear supporting electrical components and apparatus therefor |
ATE447205T1 (en) * | 2003-05-12 | 2009-11-15 | Elbit Systems Ltd | METHOD AND SYSTEM FOR AUDIOVISUAL COMMUNICATION |
CN101770073B (en) * | 2003-12-03 | 2013-03-27 | 株式会社尼康 | Information displaying apparatus |
US7317955B2 (en) * | 2003-12-12 | 2008-01-08 | Conmed Corporation | Virtual operating room integration |
US7629989B2 (en) * | 2004-04-02 | 2009-12-08 | K-Nfb Reading Technology, Inc. | Reducing processing latency in optical character recognition for portable reading machine |
EP2202609B8 (en) | 2004-06-18 | 2016-03-09 | Tobii AB | Eye control of computer apparatus |
US20050285844A1 (en) * | 2004-06-29 | 2005-12-29 | Ge Medical Systems Information Technologies, Inc. | 3D display system and method |
US11644693B2 (en) | 2004-07-28 | 2023-05-09 | Ingeniospec, Llc | Wearable audio system supporting enhanced hearing support |
US11829518B1 (en) | 2004-07-28 | 2023-11-28 | Ingeniospec, Llc | Head-worn device with connection region |
US9155373B2 (en) * | 2004-08-02 | 2015-10-13 | Invention Science Fund I, Llc | Medical overlay mirror |
WO2006012678A1 (en) * | 2004-08-03 | 2006-02-09 | Silverbrook Research Pty Ltd | Walk-up printing |
US11852901B2 (en) | 2004-10-12 | 2023-12-26 | Ingeniospec, Llc | Wireless headset supporting messages and hearing enhancement |
US7573439B2 (en) * | 2004-11-24 | 2009-08-11 | General Electric Company | System and method for significant image selection using visual tracking |
US7501995B2 (en) * | 2004-11-24 | 2009-03-10 | General Electric Company | System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation |
US7598928B1 (en) * | 2004-12-16 | 2009-10-06 | Jacqueline Evynn Breuninger Buskop | Video display hat |
US20060142740A1 (en) * | 2004-12-29 | 2006-06-29 | Sherman Jason T | Method and apparatus for performing a voice-assisted orthopaedic surgical procedure |
US7896869B2 (en) * | 2004-12-29 | 2011-03-01 | Depuy Products, Inc. | System and method for ensuring proper medical instrument use in an operating room |
US20070273674A1 (en) * | 2005-03-18 | 2007-11-29 | Searete Llc, A Limited Liability Corporation | Machine-differentiatable identifiers having a commonly accepted meaning |
US7809215B2 (en) | 2006-10-11 | 2010-10-05 | The Invention Science Fund I, Llc | Contextual information encoded in a formed expression |
SE528518C2 (en) * | 2005-04-29 | 2006-12-05 | Totalfoersvarets Forskningsins | Method for navigating in a world recorded by one or more image sensors, and device for carrying out the method |
US9648907B2 (en) * | 2005-05-31 | 2017-05-16 | Philip Morris Usa Inc. | Virtual reality smoking system |
US20070030211A1 (en) * | 2005-06-02 | 2007-02-08 | Honeywell International Inc. | Wearable marine heads-up display system |
US7918863B2 (en) | 2005-06-24 | 2011-04-05 | Conceptus, Inc. | Minimally invasive surgical stabilization devices and methods |
US20070015999A1 (en) * | 2005-07-15 | 2007-01-18 | Heldreth Mark A | System and method for providing orthopaedic surgical information to a surgeon |
DE602006021760D1 (en) * | 2005-09-27 | 2011-06-16 | Penny Ab | Device for controlling an external unit |
US20070078678A1 (en) * | 2005-09-30 | 2007-04-05 | Disilvestro Mark R | System and method for performing a computer assisted orthopaedic surgical procedure |
US20080082363A1 (en) * | 2005-10-07 | 2008-04-03 | Nader Habashi | On-line healthcare consultation services system and method of using same |
US11733549B2 (en) | 2005-10-11 | 2023-08-22 | Ingeniospec, Llc | Eyewear having removable temples that support electrical components |
US12044901B2 (en) | 2005-10-11 | 2024-07-23 | Ingeniospec, Llc | System for charging embedded battery in wireless head-worn personal electronic apparatus |
US7697827B2 (en) | 2005-10-17 | 2010-04-13 | Konicek Jeffrey C | User-friendlier interfaces for a camera |
USD549694S1 (en) | 2005-11-15 | 2007-08-28 | Vocollect, Inc. | Headset |
USD552595S1 (en) | 2005-11-16 | 2007-10-09 | Vocollect, Inc. | Control panel for a headset |
US8417185B2 (en) | 2005-12-16 | 2013-04-09 | Vocollect, Inc. | Wireless headset and method for robust voice data communication |
JP5159041B2 (en) * | 2006-01-30 | 2013-03-06 | 株式会社東芝 | Ultrasonic diagnostic apparatus and image processing program thereof |
US7885419B2 (en) | 2006-02-06 | 2011-02-08 | Vocollect, Inc. | Headset terminal with speech functionality |
US7773767B2 (en) * | 2006-02-06 | 2010-08-10 | Vocollect, Inc. | Headset terminal with rear stability strap |
US7764247B2 (en) | 2006-02-17 | 2010-07-27 | Microsoft Corporation | Adaptive heads-up user interface for automobiles |
US8635082B2 (en) | 2006-05-25 | 2014-01-21 | DePuy Synthes Products, LLC | Method and system for managing inventories of orthopaedic implants |
DE102006041952B3 (en) * | 2006-08-30 | 2008-04-30 | Schmücker, Hartmut, Dr. | Multifunctional communication system on a head strap |
DE102006059144A1 (en) * | 2006-12-14 | 2008-06-26 | Siemens Ag | Device and method for controlling a diagnostic and / or therapy system |
US10795457B2 (en) * | 2006-12-28 | 2020-10-06 | D3D Technologies, Inc. | Interactive 3D cursor |
US11228753B1 (en) * | 2006-12-28 | 2022-01-18 | Robert Edwin Douglas | Method and apparatus for performing stereoscopic zooming on a head display unit |
US11275242B1 (en) * | 2006-12-28 | 2022-03-15 | Tipping Point Medical Images, Llc | Method and apparatus for performing stereoscopic rotation of a volume on a head display unit |
US10847184B2 (en) | 2007-03-07 | 2020-11-24 | Knapp Investment Company Limited | Method and apparatus for initiating a live video stream transmission |
WO2008109172A1 (en) * | 2007-03-07 | 2008-09-12 | Wiklof Christopher A | Recorder with retrospective capture |
US8265949B2 (en) | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
AU2008308868B2 (en) * | 2007-09-30 | 2014-12-04 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrumentation |
US9703369B1 (en) * | 2007-10-11 | 2017-07-11 | Jeffrey David Mullen | Augmented reality video game systems |
GB0722592D0 (en) * | 2007-11-16 | 2007-12-27 | Birmingham City University | Surgeons headgear |
US8159458B2 (en) | 2007-12-13 | 2012-04-17 | Apple Inc. | Motion tracking user interface |
WO2009083191A1 (en) * | 2007-12-21 | 2009-07-09 | Holger Essiger | System for selectively displaying data, comprising an eyeglasses-type device, especially eyeglasses |
US9158116B1 (en) | 2014-04-25 | 2015-10-13 | Osterhout Group, Inc. | Temple and ear horn assembly for headworn computer |
USD626949S1 (en) | 2008-02-20 | 2010-11-09 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
US20090216534A1 (en) * | 2008-02-22 | 2009-08-27 | Prakash Somasundaram | Voice-activated emergency medical services communication and documentation system |
EP2108328B2 (en) * | 2008-04-09 | 2020-08-26 | Brainlab AG | Image-based control method for medicinal devices |
WO2009128781A1 (en) * | 2008-04-17 | 2009-10-22 | Lundgren & Nordstrand Ab | A method and a device for remote visualization |
DE102008027832A1 (en) * | 2008-06-11 | 2009-12-17 | Vrmagic Gmbh | Ophthalmoscope simulator |
US7954996B2 (en) * | 2008-07-08 | 2011-06-07 | General Electric Company | Positioning system with tilting arm support for imaging devices |
USD605629S1 (en) | 2008-09-29 | 2009-12-08 | Vocollect, Inc. | Headset |
US8386261B2 (en) | 2008-11-14 | 2013-02-26 | Vocollect Healthcare Systems, Inc. | Training/coaching system for a voice-enabled work environment |
US20150205111A1 (en) | 2014-01-21 | 2015-07-23 | Osterhout Group, Inc. | Optical configurations for head worn computing |
US9952664B2 (en) | 2014-01-21 | 2018-04-24 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9229233B2 (en) | 2014-02-11 | 2016-01-05 | Osterhout Group, Inc. | Micro Doppler presentations in head worn computing |
US9400390B2 (en) | 2014-01-24 | 2016-07-26 | Osterhout Group, Inc. | Peripheral lighting for head worn computing |
US9366867B2 (en) | 2014-07-08 | 2016-06-14 | Osterhout Group, Inc. | Optical systems for see-through displays |
US20150277120A1 (en) | 2014-01-21 | 2015-10-01 | Osterhout Group, Inc. | Optical configurations for head worn computing |
US9965681B2 (en) | 2008-12-16 | 2018-05-08 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9298007B2 (en) | 2014-01-21 | 2016-03-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9715112B2 (en) | 2014-01-21 | 2017-07-25 | Osterhout Group, Inc. | Suppression of stray light in head worn computing |
DE102009010263B4 (en) * | 2009-02-24 | 2011-01-20 | Reiner Kunz | Method for navigating an endoscopic instrument during technical endoscopy and associated device |
US9436276B2 (en) * | 2009-02-25 | 2016-09-06 | Microsoft Technology Licensing, Llc | Second-person avatars |
US20100240988A1 (en) * | 2009-03-19 | 2010-09-23 | Kenneth Varga | Computer-aided system for 360 degree heads up display of safety/mission critical data |
US20100238161A1 (en) * | 2009-03-19 | 2010-09-23 | Kenneth Varga | Computer-aided system for 360° heads up display of safety/mission critical data |
US8160287B2 (en) | 2009-05-22 | 2012-04-17 | Vocollect, Inc. | Headset with adjustable headband |
US8786873B2 (en) * | 2009-07-20 | 2014-07-22 | General Electric Company | Application server for use with a modular imaging system |
US9728006B2 (en) | 2009-07-20 | 2017-08-08 | Real Time Companies, LLC | Computer-aided system for 360° heads up display of safety/mission critical data |
DE102009037835B4 (en) | 2009-08-18 | 2012-12-06 | Metaio Gmbh | Method for displaying virtual information in a real environment |
WO2011024134A1 (en) * | 2009-08-26 | 2011-03-03 | Ecole Polytechnique Federale De Lausanne (Epfl) | Wearable systems for audio, visual and gaze monitoring |
US8438659B2 (en) | 2009-11-05 | 2013-05-07 | Vocollect, Inc. | Portable computing device and headset interface |
JP5814261B2 (en) * | 2010-01-13 | 2015-11-17 | バイオ−ラッド ラボラトリーズ,インコーポレイティド | Education system for dental professionals |
IT1401669B1 (en) | 2010-04-07 | 2013-08-02 | Sofar Spa | Robotic surgery system with improved control |
US8243882B2 (en) | 2010-05-07 | 2012-08-14 | General Electric Company | System and method for indicating association between autonomous detector and imaging subsystem |
US8659397B2 (en) | 2010-07-22 | 2014-02-25 | Vocollect, Inc. | Method and system for correctly identifying specific RFID tags |
USD643400S1 (en) | 2010-08-19 | 2011-08-16 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
USD643013S1 (en) | 2010-08-20 | 2011-08-09 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
US9111498B2 (en) | 2010-08-25 | 2015-08-18 | Eastman Kodak Company | Head-mounted display with environmental state detection |
US8780014B2 (en) | 2010-08-25 | 2014-07-15 | Eastman Kodak Company | Switchable head-mounted display |
US8619005B2 (en) * | 2010-09-09 | 2013-12-31 | Eastman Kodak Company | Switchable head-mounted display transition |
US8633979B2 (en) | 2010-12-29 | 2014-01-21 | GM Global Technology Operations LLC | Augmented road scene illustrator system on full windshield head-up display |
US8605011B2 (en) * | 2010-12-29 | 2013-12-10 | GM Global Technology Operations LLC | Virtual viewfinder on full windshield head-up display |
WO2012095664A1 (en) * | 2011-01-14 | 2012-07-19 | Bae Systems Plc | An apparatus for presenting an image and a method of presenting the image |
EP2477059A1 (en) * | 2011-01-14 | 2012-07-18 | BAE Systems PLC | An Apparatus for Presenting an Image and a Method of Presenting the Image |
US8996386B2 (en) * | 2011-01-19 | 2015-03-31 | Denso International America, Inc. | Method and system for creating a voice recognition database for a mobile device using image processing and optical character recognition |
US20120235065A1 (en) | 2011-03-16 | 2012-09-20 | Intellirad Control, Inc. | Radiation control and minimization system and method |
JP6457262B2 (en) | 2011-03-30 | 2019-01-23 | アヴィザル,モルデチャイ | Method and system for simulating surgery |
US8911087B2 (en) | 2011-05-20 | 2014-12-16 | Eyefluence, Inc. | Systems and methods for measuring reactions of head, eyes, eyelids and pupils |
US8885877B2 (en) | 2011-05-20 | 2014-11-11 | Eyefluence, Inc. | Systems and methods for identifying gaze tracking scene reference locations |
US9138343B2 (en) | 2011-05-31 | 2015-09-22 | Bayer Healthcare Llc | Tip protector sleeve |
CN103607971B (en) * | 2011-07-07 | 2016-08-31 | 奥林巴斯株式会社 | Medical master slave manipulator |
US8912979B1 (en) | 2011-07-14 | 2014-12-16 | Google Inc. | Virtual window in head-mounted display |
AU2011204946C1 (en) | 2011-07-22 | 2012-07-26 | Microsoft Technology Licensing, Llc | Automatic text scrolling on a head-mounted display |
US8767306B1 (en) | 2011-09-22 | 2014-07-01 | Google Inc. | Display system |
US8937646B1 (en) * | 2011-10-05 | 2015-01-20 | Amazon Technologies, Inc. | Stereo imaging using disparate imaging devices |
US9215293B2 (en) | 2011-10-28 | 2015-12-15 | Magic Leap, Inc. | System and method for augmented and virtual reality |
US8929589B2 (en) | 2011-11-07 | 2015-01-06 | Eyefluence, Inc. | Systems and methods for high-resolution gaze tracking |
US8756060B2 (en) * | 2011-12-22 | 2014-06-17 | Ncr Corporation | Methods and apparatus for audio input for customization of digital displays |
US9684374B2 (en) | 2012-01-06 | 2017-06-20 | Google Inc. | Eye reflection image analysis |
US9153043B1 (en) | 2012-02-16 | 2015-10-06 | Google, Inc. | Systems and methods for providing a user interface in a field of view of a media item |
US9569594B2 (en) | 2012-03-08 | 2017-02-14 | Nuance Communications, Inc. | Methods and apparatus for generating clinical reports |
US9569593B2 (en) * | 2012-03-08 | 2017-02-14 | Nuance Communications, Inc. | Methods and apparatus for generating clinical reports |
US9423870B2 (en) | 2012-05-08 | 2016-08-23 | Google Inc. | Input determination method |
IN2014DN08500A (en) | 2012-05-25 | 2015-05-15 | Surgical Theater LLC | |
US9415745B1 (en) | 2012-06-08 | 2016-08-16 | The Boeing Company | High intensity light source blocking system and method |
US9389420B2 (en) * | 2012-06-14 | 2016-07-12 | Qualcomm Incorporated | User interface interaction for transparent head-mounted displays |
US9152226B2 (en) * | 2012-06-15 | 2015-10-06 | Qualcomm Incorporated | Input method designed for augmented reality goggles |
US8882662B2 (en) | 2012-06-27 | 2014-11-11 | Camplex, Inc. | Interface for viewing video from cameras on a surgical visualization system |
US9642606B2 (en) | 2012-06-27 | 2017-05-09 | Camplex, Inc. | Surgical visualization system |
US20140024889A1 (en) * | 2012-07-17 | 2014-01-23 | Wilkes University | Gaze Contingent Control System for a Robotic Laparoscope Holder |
IL221863A (en) * | 2012-09-10 | 2014-01-30 | Elbit Systems Ltd | Digital system for surgical video capturing and display |
KR101470411B1 (en) * | 2012-10-12 | 2014-12-08 | 주식회사 인피니트헬스케어 | Medical image display method using virtual patient model and apparatus thereof |
US9729831B2 (en) * | 2012-11-29 | 2017-08-08 | Sony Corporation | Wireless surgical loupe |
US9681982B2 (en) * | 2012-12-17 | 2017-06-20 | Alcon Research, Ltd. | Wearable user interface for use with ocular surgical console |
US20140184802A1 (en) * | 2012-12-28 | 2014-07-03 | Wal-Mart Stores, Inc. | Techniques for reducing consumer wait time |
US10660526B2 (en) | 2012-12-31 | 2020-05-26 | Omni Medsci, Inc. | Near-infrared time-of-flight imaging using laser diodes with Bragg reflectors |
US9500635B2 (en) | 2012-12-31 | 2016-11-22 | Omni Medsci, Inc. | Short-wave infrared super-continuum lasers for early detection of dental caries |
CA2895969A1 (en) | 2012-12-31 | 2014-07-03 | Omni Medsci, Inc. | Near-infrared lasers for non-invasive monitoring of glucose, ketones, hba1c, and other blood constituents |
EP3184038B1 (en) | 2012-12-31 | 2019-02-20 | Omni MedSci, Inc. | Mouth guard with short-wave infrared super-continuum lasers for early detection of dental caries |
WO2014143276A2 (en) | 2012-12-31 | 2014-09-18 | Omni Medsci, Inc. | Short-wave infrared super-continuum lasers for natural gas leak detection, exploration, and other active remote sensing applications |
US9993159B2 (en) | 2012-12-31 | 2018-06-12 | Omni Medsci, Inc. | Near-infrared super-continuum lasers for early detection of breast and other cancers |
EP2752730B1 (en) * | 2013-01-08 | 2019-04-03 | Volvo Car Corporation | Vehicle display arrangement and vehicle comprising a vehicle display arrangement |
US9342145B2 (en) | 2013-01-22 | 2016-05-17 | Kabushiki Kaisha Toshiba | Cursor control |
US10288881B2 (en) * | 2013-03-14 | 2019-05-14 | Fresenius Medical Care Holdings, Inc. | Wearable interface for remote monitoring and control of a medical device |
US9498291B2 (en) | 2013-03-15 | 2016-11-22 | Hansen Medical, Inc. | Touch-free catheter user interface controller |
US9234742B2 (en) | 2013-05-01 | 2016-01-12 | Faro Technologies, Inc. | Method and apparatus for using gestures to control a laser tracker |
EP2999414B1 (en) | 2013-05-21 | 2018-08-08 | Camplex, Inc. | Surgical visualization systems |
US10905943B2 (en) * | 2013-06-07 | 2021-02-02 | Sony Interactive Entertainment LLC | Systems and methods for reducing hops associated with a head mounted system |
AT14754U3 (en) * | 2013-06-18 | 2016-10-15 | Kolotov Alexandr Alexandrovich | Helmet for motorcyclists and participants in extreme activities |
US9563331B2 (en) | 2013-06-28 | 2017-02-07 | Microsoft Technology Licensing, Llc | Web-like hierarchical menu display configuration for a near-eye display |
JP6102588B2 (en) * | 2013-07-10 | 2017-03-29 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US20150157188A1 (en) * | 2013-08-14 | 2015-06-11 | Ahmnon D. Moskowitz | Systems and apparatuses for improved visualization of endoscopic images |
US9990771B2 (en) * | 2013-08-28 | 2018-06-05 | Aurelian Viorel DRAGNEA | Method and system of displaying information during a medical procedure |
WO2015042460A1 (en) | 2013-09-20 | 2015-03-26 | Camplex, Inc. | Surgical visualization systems and displays |
US10881286B2 (en) | 2013-09-20 | 2021-01-05 | Camplex, Inc. | Medical apparatus for use with a surgical tubular retractor |
US9626801B2 (en) * | 2013-12-31 | 2017-04-18 | Daqri, Llc | Visualization of physical characteristics in augmented reality |
KR102146856B1 (en) | 2013-12-31 | 2020-08-21 | 삼성전자주식회사 | Method of displaying a photographing mode using lens characteristics, Computer readable storage medium of recording the method and a digital photographing apparatus. |
US9558592B2 (en) | 2013-12-31 | 2017-01-31 | Daqri, Llc | Visualization of physical interactions in augmented reality |
US9244539B2 (en) | 2014-01-07 | 2016-01-26 | Microsoft Technology Licensing, Llc | Target positioning with gaze tracking |
US10254856B2 (en) | 2014-01-17 | 2019-04-09 | Osterhout Group, Inc. | External user interface for head worn computing |
US11103122B2 (en) | 2014-07-15 | 2021-08-31 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9529195B2 (en) | 2014-01-21 | 2016-12-27 | Osterhout Group, Inc. | See-through computer display systems |
US9366868B2 (en) | 2014-09-26 | 2016-06-14 | Osterhout Group, Inc. | See-through computer display systems |
US11227294B2 (en) | 2014-04-03 | 2022-01-18 | Mentor Acquisition One, Llc | Sight information collection in head worn computing |
US20150277118A1 (en) | 2014-03-28 | 2015-10-01 | Osterhout Group, Inc. | Sensor dependent content position in head worn computing |
US20150228119A1 (en) | 2014-02-11 | 2015-08-13 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US20160019715A1 (en) | 2014-07-15 | 2016-01-21 | Osterhout Group, Inc. | Content presentation in head worn computing |
US9299194B2 (en) | 2014-02-14 | 2016-03-29 | Osterhout Group, Inc. | Secure sharing in head worn computing |
US10649220B2 (en) | 2014-06-09 | 2020-05-12 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9841599B2 (en) | 2014-06-05 | 2017-12-12 | Osterhout Group, Inc. | Optical configurations for head-worn see-through displays |
US10191279B2 (en) | 2014-03-17 | 2019-01-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9829707B2 (en) | 2014-08-12 | 2017-11-28 | Osterhout Group, Inc. | Measuring content brightness in head worn computing |
US9671613B2 (en) | 2014-09-26 | 2017-06-06 | Osterhout Group, Inc. | See-through computer display systems |
US9939934B2 (en) | 2014-01-17 | 2018-04-10 | Osterhout Group, Inc. | External user interface for head worn computing |
US9448409B2 (en) | 2014-11-26 | 2016-09-20 | Osterhout Group, Inc. | See-through computer display systems |
US9746686B2 (en) | 2014-05-19 | 2017-08-29 | Osterhout Group, Inc. | Content position calibration in head worn computing |
US9810906B2 (en) | 2014-06-17 | 2017-11-07 | Osterhout Group, Inc. | External user interface for head worn computing |
US9575321B2 (en) | 2014-06-09 | 2017-02-21 | Osterhout Group, Inc. | Content presentation in head worn computing |
US10684687B2 (en) | 2014-12-03 | 2020-06-16 | Mentor Acquisition One, Llc | See-through computer display systems |
US9594246B2 (en) | 2014-01-21 | 2017-03-14 | Osterhout Group, Inc. | See-through computer display systems |
US9538915B2 (en) | 2014-01-21 | 2017-01-10 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US11669163B2 (en) | 2014-01-21 | 2023-06-06 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
US12093453B2 (en) | 2014-01-21 | 2024-09-17 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
US9766463B2 (en) | 2014-01-21 | 2017-09-19 | Osterhout Group, Inc. | See-through computer display systems |
US11487110B2 (en) | 2014-01-21 | 2022-11-01 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US9753288B2 (en) | 2014-01-21 | 2017-09-05 | Osterhout Group, Inc. | See-through computer display systems |
US9310610B2 (en) | 2014-01-21 | 2016-04-12 | Osterhout Group, Inc. | See-through computer display systems |
US9651784B2 (en) | 2014-01-21 | 2017-05-16 | Osterhout Group, Inc. | See-through computer display systems |
US12105281B2 (en) | 2014-01-21 | 2024-10-01 | Mentor Acquisition One, Llc | See-through computer display systems |
US9836122B2 (en) | 2014-01-21 | 2017-12-05 | Osterhout Group, Inc. | Eye glint imaging in see-through computer display systems |
US9740280B2 (en) | 2014-01-21 | 2017-08-22 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9523856B2 (en) | 2014-01-21 | 2016-12-20 | Osterhout Group, Inc. | See-through computer display systems |
US11892644B2 (en) | 2014-01-21 | 2024-02-06 | Mentor Acquisition One, Llc | See-through computer display systems |
US20150205135A1 (en) | 2014-01-21 | 2015-07-23 | Osterhout Group, Inc. | See-through computer display systems |
US11737666B2 (en) | 2014-01-21 | 2023-08-29 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US9811152B2 (en) | 2014-01-21 | 2017-11-07 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9494800B2 (en) | 2014-01-21 | 2016-11-15 | Osterhout Group, Inc. | See-through computer display systems |
US9846308B2 (en) | 2014-01-24 | 2017-12-19 | Osterhout Group, Inc. | Haptic systems for head-worn computers |
US20160085072A1 (en) | 2014-01-24 | 2016-03-24 | Osterhout Group, Inc. | See-through computer display systems |
US9524588B2 (en) | 2014-01-24 | 2016-12-20 | Avaya Inc. | Enhanced communication between remote participants using augmented and virtual reality |
US9401540B2 (en) | 2014-02-11 | 2016-07-26 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US12112089B2 (en) | 2014-02-11 | 2024-10-08 | Mentor Acquisition One, Llc | Spatial location presentation in head worn computing |
US9852545B2 (en) | 2014-02-11 | 2017-12-26 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
EP3108801A4 (en) * | 2014-02-21 | 2017-10-25 | Sony Corporation | Head-mounted display, control device, and control method |
US9387589B2 (en) * | 2014-02-25 | 2016-07-12 | GM Global Technology Operations LLC | Visual debugging of robotic tasks |
US20160187651A1 (en) | 2014-03-28 | 2016-06-30 | Osterhout Group, Inc. | Safety for a vehicle operator with an hmd |
US11547499B2 (en) | 2014-04-04 | 2023-01-10 | Surgical Theater, Inc. | Dynamic and interactive navigation in a surgical environment |
US10853589B2 (en) | 2014-04-25 | 2020-12-01 | Mentor Acquisition One, Llc | Language translation with head-worn computing |
US9672210B2 (en) | 2014-04-25 | 2017-06-06 | Osterhout Group, Inc. | Language translation with head-worn computing |
US9423842B2 (en) | 2014-09-18 | 2016-08-23 | Osterhout Group, Inc. | Thermal management for head-worn computer |
US20150309534A1 (en) | 2014-04-25 | 2015-10-29 | Osterhout Group, Inc. | Ear horn assembly for headworn computer |
US9651787B2 (en) | 2014-04-25 | 2017-05-16 | Osterhout Group, Inc. | Speaker assembly for headworn computer |
US20160137312A1 (en) | 2014-05-06 | 2016-05-19 | Osterhout Group, Inc. | Unmanned aerial vehicle launch system |
US9829547B2 (en) | 2014-05-08 | 2017-11-28 | Resonance Technology, Inc. | Head-up display with eye-tracker for MRI applications |
US10235567B2 (en) | 2014-05-15 | 2019-03-19 | Fenwal, Inc. | Head mounted display device for use in a medical facility |
EP3200109B1 (en) * | 2014-05-15 | 2022-10-05 | Fenwal, Inc. | Head-mounted display device for use in a medical facility |
US11100327B2 (en) | 2014-05-15 | 2021-08-24 | Fenwal, Inc. | Recording a state of a medical device |
US10663740B2 (en) | 2014-06-09 | 2020-05-26 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
IL235073A (en) * | 2014-10-07 | 2016-02-29 | Elbit Systems Ltd | Head-mounted displaying of magnified images locked on an object of interest |
US9684172B2 (en) | 2014-12-03 | 2017-06-20 | Osterhout Group, Inc. | Head worn computer display systems |
US10702353B2 (en) | 2014-12-05 | 2020-07-07 | Camplex, Inc. | Surgical visualizations systems and displays |
USD743963S1 (en) | 2014-12-22 | 2015-11-24 | Osterhout Group, Inc. | Air mouse |
USD751552S1 (en) | 2014-12-31 | 2016-03-15 | Osterhout Group, Inc. | Computer glasses |
USD753114S1 (en) | 2015-01-05 | 2016-04-05 | Osterhout Group, Inc. | Air mouse |
WO2016130533A1 (en) * | 2015-02-10 | 2016-08-18 | Brian Mullins | Dynamic lighting for head mounted device |
US20160239985A1 (en) | 2015-02-17 | 2016-08-18 | Osterhout Group, Inc. | See-through computer display systems |
US10878775B2 (en) | 2015-02-17 | 2020-12-29 | Mentor Acquisition One, Llc | See-through computer display systems |
GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
WO2016154589A1 (en) | 2015-03-25 | 2016-09-29 | Camplex, Inc. | Surgical visualization systems and displays |
KR102694898B1 (en) | 2015-05-19 | 2024-08-12 | 매직 립, 인코포레이티드 | Dual composite optical field device |
US12151101B2 (en) * | 2015-06-02 | 2024-11-26 | Battelle Memorial Institute | Non-invasive eye-tracking control of neuromuscular stimulation system |
KR20180021086A (en) | 2015-06-30 | 2018-02-28 | 쓰리엠 이노베이티브 프로퍼티즈 컴파니 | Illuminator |
US11003246B2 (en) | 2015-07-22 | 2021-05-11 | Mentor Acquisition One, Llc | External user interface for head worn computing |
US10139966B2 (en) | 2015-07-22 | 2018-11-27 | Osterhout Group, Inc. | External user interface for head worn computing |
CN107613897B (en) | 2015-10-14 | 2021-12-17 | 外科手术室公司 | Augmented reality surgical navigation |
ITUB20155830A1 (en) * | 2015-11-23 | 2017-05-23 | R A W Srl | Navigation, tracking and guidance system for the positioning of operating instruments |
WO2017091704A1 (en) | 2015-11-25 | 2017-06-01 | Camplex, Inc. | Surgical visualization systems and displays |
US10646289B2 (en) * | 2015-12-29 | 2020-05-12 | Koninklijke Philips N.V. | System, controller and method using virtual reality device for robotic surgery |
US10850116B2 (en) | 2016-12-30 | 2020-12-01 | Mentor Acquisition One, Llc | Head-worn therapy device |
US10591728B2 (en) | 2016-03-02 | 2020-03-17 | Mentor Acquisition One, Llc | Optical systems for head-worn computers |
US10667981B2 (en) | 2016-02-29 | 2020-06-02 | Mentor Acquisition One, Llc | Reading assistance system for visually impaired |
US9880441B1 (en) | 2016-09-08 | 2018-01-30 | Osterhout Group, Inc. | Electrochromic systems for head-worn computer systems |
US9826299B1 (en) | 2016-08-22 | 2017-11-21 | Osterhout Group, Inc. | Speaker systems for head-worn computer systems |
EP3440660A1 (en) * | 2016-04-06 | 2019-02-13 | Koninklijke Philips N.V. | Method, device and system for enabling to analyze a property of a vital sign detector |
US10824253B2 (en) | 2016-05-09 | 2020-11-03 | Mentor Acquisition One, Llc | User interface systems for head-worn computers |
US10466491B2 (en) | 2016-06-01 | 2019-11-05 | Mentor Acquisition One, Llc | Modular systems for head-worn computers |
US10684478B2 (en) | 2016-05-09 | 2020-06-16 | Mentor Acquisition One, Llc | User interface systems for head-worn computers |
US9910284B1 (en) | 2016-09-08 | 2018-03-06 | Osterhout Group, Inc. | Optical systems for head-worn computers |
US10578852B2 (en) * | 2016-05-05 | 2020-03-03 | Robert D. Watson | Surgical stereoscopic visualization system with movable head mounted display |
US10690936B2 (en) | 2016-08-29 | 2020-06-23 | Mentor Acquisition One, Llc | Adjustable nose bridge assembly for headworn computer |
USD840395S1 (en) | 2016-10-17 | 2019-02-12 | Osterhout Group, Inc. | Head-worn computer |
US10867445B1 (en) * | 2016-11-16 | 2020-12-15 | Amazon Technologies, Inc. | Content segmentation and navigation |
US11202687B2 (en) | 2016-12-23 | 2021-12-21 | Biolase, Inc. | Dental system and method |
USD864959S1 (en) | 2017-01-04 | 2019-10-29 | Mentor Acquisition One, Llc | Computer glasses |
US20190355179A1 (en) * | 2017-01-19 | 2019-11-21 | Hewlett-Packard Development Company, L.P. | Telepresence |
US20180247712A1 (en) | 2017-02-24 | 2018-08-30 | Masimo Corporation | System for displaying medical monitoring data |
WO2018156809A1 (en) * | 2017-02-24 | 2018-08-30 | Masimo Corporation | Augmented reality system for displaying patient data |
US10977858B2 (en) | 2017-03-30 | 2021-04-13 | Magic Leap, Inc. | Centralized rendering |
CN110495186B (en) | 2017-03-30 | 2021-11-19 | 奇跃公司 | Sound reproduction system and head-mounted device |
AU2018252665A1 (en) | 2017-04-14 | 2019-10-17 | Magic Leap, Inc. | Multimodal eye tracking |
US10401954B2 (en) * | 2017-04-17 | 2019-09-03 | Intel Corporation | Sensory enhanced augmented reality and virtual reality device |
EP3622529A1 (en) | 2017-05-08 | 2020-03-18 | Masimo Corporation | System for pairing a medical system to a network controller by use of a dongle |
US10918455B2 (en) | 2017-05-08 | 2021-02-16 | Camplex, Inc. | Variable light source |
US10760931B2 (en) * | 2017-05-23 | 2020-09-01 | Microsoft Technology Licensing, Llc | Dynamic control of performance parameters in a six degrees-of-freedom sensor calibration subsystem |
US11079522B1 (en) | 2017-05-31 | 2021-08-03 | Magic Leap, Inc. | Fiducial design |
FR3068481B1 (en) | 2017-06-29 | 2019-07-26 | Airbus Operations (S.A.S.) | DISPLAY SYSTEM AND METHOD FOR AN AIRCRAFT |
US10366691B2 (en) | 2017-07-11 | 2019-07-30 | Samsung Electronics Co., Ltd. | System and method for voice command context |
US10422995B2 (en) | 2017-07-24 | 2019-09-24 | Mentor Acquisition One, Llc | See-through computer display systems with stray light management |
US11409105B2 (en) | 2017-07-24 | 2022-08-09 | Mentor Acquisition One, Llc | See-through computer display systems |
US10578869B2 (en) | 2017-07-24 | 2020-03-03 | Mentor Acquisition One, Llc | See-through computer display systems with adjustable zoom cameras |
US10969584B2 (en) | 2017-08-04 | 2021-04-06 | Mentor Acquisition One, Llc | Image expansion optic for head-worn computer |
US11316865B2 (en) | 2017-08-10 | 2022-04-26 | Nuance Communications, Inc. | Ambient cooperative intelligence system and method |
US10546655B2 (en) | 2017-08-10 | 2020-01-28 | Nuance Communications, Inc. | Automated clinical documentation system and method |
US10152141B1 (en) | 2017-08-18 | 2018-12-11 | Osterhout Group, Inc. | Controller movement tracking with light emitters |
CN107616797A (en) * | 2017-08-25 | 2018-01-23 | 深圳职业技术学院 | Critically ill patient calling system |
AU2018353008B2 (en) | 2017-10-17 | 2023-04-20 | Magic Leap, Inc. | Mixed reality spatial audio |
JPWO2019092954A1 (en) * | 2017-11-07 | 2020-11-12 | ソニー・オリンパスメディカルソリューションズ株式会社 | Medical display device and medical observation device |
EP4335391A1 (en) | 2017-12-07 | 2024-03-13 | Augmedics Ltd. | Spinous process clamp |
US11071595B2 (en) | 2017-12-14 | 2021-07-27 | Verb Surgical Inc. | Multi-panel graphical user interface for a robotic surgical system |
JP7252965B2 (en) | 2018-02-15 | 2023-04-05 | マジック リープ, インコーポレイテッド | Dual Listener Position for Mixed Reality |
EP3753238A4 (en) | 2018-02-15 | 2021-04-07 | Magic Leap, Inc. | Mixed reality virtual reverberation |
IL305389B2 (en) | 2018-02-15 | 2024-09-01 | Magic Leap Inc | Mixed reality musical instrument |
US11515020B2 (en) | 2018-03-05 | 2022-11-29 | Nuance Communications, Inc. | Automated clinical documentation system and method |
US11250382B2 (en) | 2018-03-05 | 2022-02-15 | Nuance Communications, Inc. | Automated clinical documentation system and method |
US20190272902A1 (en) | 2018-03-05 | 2019-09-05 | Nuance Communications, Inc. | System and method for review of automated clinical documentation |
US11284646B2 (en) | 2018-03-22 | 2022-03-29 | Altria Client Services Llc | Augmented reality and/or virtual reality based e-vaping device vapor simulation systems and methods |
WO2019226269A2 (en) | 2018-04-24 | 2019-11-28 | Mentor Acquisition One, Llc | See-through computer display systems with vision correction and increased content density |
US10771727B2 (en) | 2018-04-27 | 2020-09-08 | Vital Optics, Inc | Monitoring system with heads-up display |
EP3787543A4 (en) | 2018-05-02 | 2022-01-19 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
US10667072B2 (en) | 2018-06-12 | 2020-05-26 | Magic Leap, Inc. | Efficient rendering of virtual soundfields |
WO2019241760A1 (en) | 2018-06-14 | 2019-12-19 | Magic Leap, Inc. | Methods and systems for audio signal filtering |
JP7478100B2 (en) | 2018-06-14 | 2024-05-02 | マジック リープ, インコーポレイテッド | Reverberation Gain Normalization |
CN116156411A (en) | 2018-06-18 | 2023-05-23 | 奇跃公司 | Spatial audio for interactive audio environments |
EP3810018A1 (en) * | 2018-06-19 | 2021-04-28 | Tornier, Inc. | Multi-user collaboration and workflow techniques for orthopedic surgical procedures using mixed reality |
CN112513983B (en) | 2018-06-21 | 2024-12-17 | 奇跃公司 | Wearable system speech processing |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
US10895757B2 (en) | 2018-07-03 | 2021-01-19 | Verb Surgical Inc. | Systems and methods for three-dimensional visualization during robotic surgery |
JP7499749B2 (en) | 2018-07-24 | 2024-06-14 | マジック リープ, インコーポレイテッド | Application Sharing |
EP3857291A4 (en) | 2018-09-25 | 2021-11-24 | Magic Leap, Inc. | SYSTEMS AND PROCEDURES FOR EXTENDED REALITY |
WO2020073024A1 (en) | 2018-10-05 | 2020-04-09 | Magic Leap, Inc. | Emphasis for audio spatialization |
CN116320907A (en) | 2018-10-05 | 2023-06-23 | 奇跃公司 | Near field audio rendering |
WO2020076856A1 (en) | 2018-10-08 | 2020-04-16 | Mcginley Education Innovations, Llc | Augmented reality based real-time ultrasonography image rendering for surgical assistance |
CN113168526B (en) | 2018-10-09 | 2024-08-27 | 奇跃公司 | System and method for virtual and augmented reality |
US11487316B2 (en) | 2018-10-24 | 2022-11-01 | Magic Leap, Inc. | Asynchronous ASIC |
US10898151B2 (en) * | 2018-10-31 | 2021-01-26 | Medtronic Inc. | Real-time rendering and referencing for medical procedures |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
CN113632030B (en) | 2018-12-27 | 2024-11-01 | 奇跃公司 | System and method for virtual reality and augmented reality |
CN113748462A (en) | 2019-03-01 | 2021-12-03 | 奇跃公司 | Determining input for a speech processing engine |
US10659848B1 (en) | 2019-03-21 | 2020-05-19 | International Business Machines Corporation | Display overlays for prioritization of video subjects |
WO2020198385A1 (en) | 2019-03-25 | 2020-10-01 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
US11413111B2 (en) * | 2019-05-24 | 2022-08-16 | Karl Storz Imaging, Inc. | Augmented reality system for medical procedures |
EP3980880A4 (en) | 2019-06-06 | 2022-11-23 | Magic Leap, Inc. | Photoreal character configurations for spatial computing |
US11611705B1 (en) * | 2019-06-10 | 2023-03-21 | Julian W. Chen | Smart glasses with augmented reality capability for dentistry |
US11216480B2 (en) | 2019-06-14 | 2022-01-04 | Nuance Communications, Inc. | System and method for querying data points from graph data structures |
US11043207B2 (en) | 2019-06-14 | 2021-06-22 | Nuance Communications, Inc. | System and method for array data simulation and customized acoustic modeling for ambient ASR |
US11227679B2 (en) | 2019-06-14 | 2022-01-18 | Nuance Communications, Inc. | Ambient clinical intelligence system and method |
US11531807B2 (en) | 2019-06-28 | 2022-12-20 | Nuance Communications, Inc. | System and method for customized text macros |
US10846911B1 (en) * | 2019-07-09 | 2020-11-24 | Robert Edwin Douglas | 3D imaging of virtual fluids and virtual sounds |
US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
US12178666B2 (en) | 2019-07-29 | 2024-12-31 | Augmedics Ltd. | Fiducial marker |
US11328740B2 (en) | 2019-08-07 | 2022-05-10 | Magic Leap, Inc. | Voice onset detection |
US11704874B2 (en) | 2019-08-07 | 2023-07-18 | Magic Leap, Inc. | Spatial instructions and guides in mixed reality |
US11670408B2 (en) | 2019-09-30 | 2023-06-06 | Nuance Communications, Inc. | System and method for review of automated clinical documentation |
JP7565349B2 (en) | 2019-10-18 | 2024-10-10 | マジック リープ, インコーポレイテッド | Gravity Estimation and Bundle Adjustment for Visual Inertial Odometry |
CN114586382A (en) | 2019-10-25 | 2022-06-03 | 奇跃公司 | Reverberation fingerprint estimation |
CN114846434A (en) | 2019-10-25 | 2022-08-02 | 奇跃公司 | Non-uniform stereo rendering |
US11959997B2 (en) | 2019-11-22 | 2024-04-16 | Magic Leap, Inc. | System and method for tracking a wearable device |
US11381791B2 (en) | 2019-12-04 | 2022-07-05 | Magic Leap, Inc. | Variable-pitch color emitting display |
WO2021113781A1 (en) | 2019-12-06 | 2021-06-10 | Magic Leap, Inc. | Environment acoustics persistence |
US11269181B2 (en) | 2019-12-09 | 2022-03-08 | Magic Leap, Inc. | Systems and methods for operating a head-mounted display system based on user identity |
US11992373B2 (en) * | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11337023B2 (en) | 2019-12-20 | 2022-05-17 | Magic Leap, Inc. | Physics-based audio and haptic synthesis |
US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
EP3848779A1 (en) | 2020-01-09 | 2021-07-14 | BHS Technologies GmbH | Head-mounted display system and method for controlling a medical imaging device |
US11090873B1 (en) * | 2020-02-02 | 2021-08-17 | Robert Edwin Douglas | Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects |
JP2023513250A (en) | 2020-02-10 | 2023-03-30 | マジック リープ, インコーポレイテッド | Dynamic co-location of virtual content |
WO2021163382A1 (en) | 2020-02-14 | 2021-08-19 | Magic Leap, Inc. | Multi-application audio rendering |
US11778410B2 (en) | 2020-02-14 | 2023-10-03 | Magic Leap, Inc. | Delayed audio following |
CN115398316A (en) | 2020-02-14 | 2022-11-25 | 奇跃公司 | 3D object annotation |
EP4103999A4 (en) | 2020-02-14 | 2023-08-02 | Magic Leap, Inc. | Session manager |
CN118276683A (en) | 2020-02-14 | 2024-07-02 | 奇跃公司 | Tool Bridge |
CN117714967A (en) | 2020-03-02 | 2024-03-15 | 奇跃公司 | Immersive audio platform |
US11917384B2 (en) | 2020-03-27 | 2024-02-27 | Magic Leap, Inc. | Method of waking a device using spoken voice commands |
US11636843B2 (en) | 2020-05-29 | 2023-04-25 | Magic Leap, Inc. | Surface appropriate collisions |
US11561613B2 (en) | 2020-05-29 | 2023-01-24 | Magic Leap, Inc. | Determining angular acceleration |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US12239385B2 (en) | 2020-09-09 | 2025-03-04 | Augmedics Ltd. | Universal tool adapter |
US11222103B1 (en) | 2020-10-29 | 2022-01-11 | Nuance Communications, Inc. | Ambient cooperative intelligence system and method |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
US12150821B2 (en) | 2021-07-29 | 2024-11-26 | Augmedics Ltd. | Rotating marker and adapter for image-guided surgery |
US12064397B2 (en) | 2021-08-25 | 2024-08-20 | Fenwal, Inc. | Determining characteristic of blood component with handheld camera |
WO2024057210A1 (en) | 2022-09-13 | 2024-03-21 | Augmedics Ltd. | Augmented reality eyewear for image-guided medical intervention |
Citations (99)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US531379A (en) * | 1894-12-25 | mathewson | ||
US3724932A (en) * | 1971-04-09 | 1973-04-03 | Stanford Research Inst | Eye tracker and method |
US3923370A (en) * | 1974-10-15 | 1975-12-02 | Honeywell Inc | Head mounted displays |
US3940204A (en) * | 1975-01-23 | 1976-02-24 | Hughes Aircraft Company | Optical display systems utilizing holographic lenses |
US3988533A (en) * | 1974-09-30 | 1976-10-26 | Video Tek, Inc. | Video-type universal motion and intrusion detection system |
US4028725A (en) * | 1976-04-21 | 1977-06-07 | Grumman Aerospace Corporation | High-resolution vision system |
US4181405A (en) * | 1978-08-07 | 1980-01-01 | The Singer Company | Head-up viewing display |
US4188090A (en) * | 1977-06-01 | 1980-02-12 | Elliott Brothers (London) Limited | Retractable head-up displays |
US4349815A (en) * | 1979-01-11 | 1982-09-14 | Redifon Simulation Limited | Head-movable frame-scanner for head-coupled display |
US4437113A (en) * | 1981-12-21 | 1984-03-13 | The United States Of America As Represented By The Secretary Of The Air Force | Anti-flutter apparatus for head mounted visual display |
US4439755A (en) * | 1981-06-04 | 1984-03-27 | Farrand Optical Co., Inc. | Head-up infinity display and pilot's sight |
US4482326A (en) * | 1982-01-26 | 1984-11-13 | Instrument Flight Research Inc. | Flight training glasses |
US4575722A (en) * | 1982-05-05 | 1986-03-11 | Litton Systems, Inc. | Magneto-optic display |
US4636866A (en) * | 1982-12-24 | 1987-01-13 | Seiko Epson K.K. | Personal liquid crystal image display |
US4651201A (en) * | 1984-06-01 | 1987-03-17 | Arnold Schoolman | Stereoscopic endoscope arrangement |
US4652870A (en) * | 1984-02-10 | 1987-03-24 | Gec Avionics Limited | Display arrangements for head-up display systems |
US4669810A (en) * | 1984-02-03 | 1987-06-02 | Flight Dynamics, Inc. | Head up display system |
US4688879A (en) * | 1985-07-08 | 1987-08-25 | Kaiser Optical Systems, Inc. | Holographic multi-combiner for head-up display |
US4711512A (en) * | 1985-07-12 | 1987-12-08 | Environmental Research Institute Of Michigan | Compact head-up display |
US4725125A (en) * | 1985-08-27 | 1988-02-16 | Gec Avionics Limited | Head-up displays |
US4729634A (en) * | 1985-02-04 | 1988-03-08 | United Technologies Corporation | Reflective head-up display |
US4737972A (en) * | 1982-02-24 | 1988-04-12 | Arnold Schoolman | Stereoscopic fluoroscope arrangement |
US4740780A (en) * | 1985-06-24 | 1988-04-26 | Gec Avionics, Inc. | Head-up display for automobile |
US4763990A (en) * | 1984-02-03 | 1988-08-16 | Flight Dynamics, Inc. | Head up display system |
US4769633A (en) * | 1985-04-17 | 1988-09-06 | Gec Avionics Limited | Head-up displays |
US4787711A (en) * | 1986-01-23 | 1988-11-29 | Yazaki Corporation | On-vehicle head up display device with optical means for correcting parallax in a vertical direction |
US4796987A (en) * | 1984-12-20 | 1989-01-10 | Linden Harry A | Digital display for head mounted protection |
US4799765A (en) * | 1986-03-31 | 1989-01-24 | Hughes Aircraft Company | Integrated head-up and panel display unit |
US4818048A (en) * | 1987-01-06 | 1989-04-04 | Hughes Aircraft Company | Holographic head-up control panel |
US4824228A (en) * | 1986-11-28 | 1989-04-25 | Hughes Aircraft Company | Stereomicroscopic inspection system with low eyestrain features |
US4831366A (en) * | 1988-02-05 | 1989-05-16 | Yazaki Corporation | Head up display apparatus for automotive vehicle |
US4878046A (en) * | 1987-07-30 | 1989-10-31 | United Technologies Corporation | Mounting a cathode ray tube for a heads-up display system |
US4884137A (en) * | 1986-07-10 | 1989-11-28 | Varo, Inc. | Head mounted video display and remote camera system |
US4915487A (en) * | 1989-02-01 | 1990-04-10 | Systems Research Laboratories | Heads up display for night vision goggle |
US4927234A (en) * | 1987-08-21 | 1990-05-22 | The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Optical system for head-up displays |
US4930847A (en) * | 1987-07-09 | 1990-06-05 | Environmental Research Institute Of Michigan | Multicolor holographic element and apparatus for head-up display applications |
US4932731A (en) * | 1986-11-14 | 1990-06-12 | Yazaki Corporation | Holographic head-up display apparatus |
US4961625A (en) * | 1987-09-18 | 1990-10-09 | Flight Dynamics, Inc. | Automobile head-up display system with reflective aspheric surface |
US4973139A (en) * | 1989-04-07 | 1990-11-27 | Hughes Aircraft Company | Automotive head-up display |
US4984179A (en) * | 1987-01-21 | 1991-01-08 | W. Industries Limited | Method and apparatus for the perception of computer-generated imagery |
US4987410A (en) * | 1988-01-25 | 1991-01-22 | Kaiser Aerospace & Electronics Corporation | Multiple image forming apparatus |
US4988976A (en) * | 1989-09-06 | 1991-01-29 | Lu Hsing Tseng | Head-up display with magnetic field speed detecting means |
US4994794A (en) * | 1987-06-29 | 1991-02-19 | Gec-Marconi Limited | Methods and apparatus for displaying data |
US5000544A (en) * | 1988-08-01 | 1991-03-19 | Gec-Marconi Limited | Helmet system with optical display projection system including a cylindrical refractive surface |
US5003300A (en) * | 1987-07-27 | 1991-03-26 | Reflection Technology, Inc. | Head mounted display for miniature video display system |
US5005009A (en) * | 1988-02-16 | 1991-04-02 | K. W. Muth Company, Inc. | Method and apparatus for multiple object viewing |
US5028119A (en) * | 1989-04-07 | 1991-07-02 | Hughes Aircraft Company | Aircraft head-up display |
US5037182A (en) * | 1990-09-12 | 1991-08-06 | Delco Electronics Corporation | Rearview mirror head-up display |
US5066525A (en) * | 1989-01-25 | 1991-11-19 | Central Glass Company, Limited | Laminated glass panel incorporating hologram sheet |
US5108479A (en) * | 1989-10-09 | 1992-04-28 | Asahi Glass Company Ltd | Process for manufacturing glass with functional coating |
US5129716A (en) * | 1987-10-23 | 1992-07-14 | Laszlo Holakovszky | Stereoscopic video image display appliance wearable on head like spectacles |
US5130794A (en) * | 1990-03-29 | 1992-07-14 | Ritchey Kurtis J | Panoramic display system |
US5150137A (en) * | 1990-10-10 | 1992-09-22 | Pulse Medical Instruments | Positioning system for pupil imaging optics |
US5151722A (en) * | 1990-11-05 | 1992-09-29 | The Johns Hopkins University | Video display on spectacle-like frame |
US5162828A (en) * | 1986-09-25 | 1992-11-10 | Furness Thomas A | Display system for a head mounted viewing transparency |
US5198895A (en) * | 1991-08-29 | 1993-03-30 | Rockwell International Corporation | Holographic head-up display |
US5210624A (en) * | 1989-09-19 | 1993-05-11 | Fujitsu Limited | Heads-up display |
US5214413A (en) * | 1987-06-23 | 1993-05-25 | Nissan Motor Company, Limited | Head-up display apparatus for vehicular display |
US5222477A (en) * | 1991-09-30 | 1993-06-29 | Welch Allyn, Inc. | Endoscope or borescope stereo viewing system |
US5227769A (en) * | 1991-05-23 | 1993-07-13 | Westinghouse Electric Corp. | Heads-up projection display |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5231391A (en) * | 1990-11-30 | 1993-07-27 | Skf France | Passive pick-up device for monitoring the state of the tire of a vehicle wheel and measuring the rotation characteristics of the wheel |
US5241391A (en) * | 1990-10-19 | 1993-08-31 | Gec Ferranti Defence Systems Limited | Video camera system for recording a scene and a head-up display |
US5243448A (en) * | 1988-09-28 | 1993-09-07 | The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Head-up display |
US5251127A (en) * | 1988-02-01 | 1993-10-05 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US5278696A (en) * | 1992-05-22 | 1994-01-11 | Kaiser Aerospace & Electronics Corporation | Head-up display system |
US5281960A (en) * | 1991-11-19 | 1994-01-25 | Silhouette Technology, Inc. | Helmet mounted display |
US5281957A (en) * | 1984-11-14 | 1994-01-25 | Schoolman Scientific Corp. | Portable computer and head mounted display |
US5287437A (en) * | 1992-06-02 | 1994-02-15 | Sun Microsystems, Inc. | Method and apparatus for head tracked display of precomputed stereo images |
US5302964A (en) * | 1992-09-25 | 1994-04-12 | Hughes Aircraft Company | Heads-up display (HUD) incorporating cathode-ray tube image generator with digital look-up table for distortion correction |
US5305203A (en) * | 1988-02-01 | 1994-04-19 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US5311220A (en) * | 1992-06-10 | 1994-05-10 | Dimension Technologies, Inc. | Autostereoscopic display |
US5319363A (en) * | 1990-08-31 | 1994-06-07 | The General Hospital Corporation | Network for portable patient monitoring devices |
US5321416A (en) * | 1992-07-27 | 1994-06-14 | Virtual Research Systems | Head-mounted visual display apparatus |
US5331149A (en) * | 1990-12-31 | 1994-07-19 | Kopin Corporation | Eye tracking system having an array of photodetectors aligned respectively with an array of pixels |
US5331333A (en) * | 1988-12-08 | 1994-07-19 | Sharp Kabushiki Kaisha | Display apparatus |
US5334991A (en) * | 1992-05-15 | 1994-08-02 | Reflection Technology | Dual image head-mounted display |
US5341181A (en) * | 1992-11-20 | 1994-08-23 | Godard Roger R | Systems and methods for capturing and presenting visual information |
US5341242A (en) * | 1991-09-05 | 1994-08-23 | Elbit Ltd. | Helmet mounted display |
US5345281A (en) * | 1992-12-17 | 1994-09-06 | John Taboada | Eye tracking system and method |
US5347400A (en) * | 1993-05-06 | 1994-09-13 | Ken Hunter | Optical system for virtual reality helmet |
US5348477A (en) * | 1992-04-10 | 1994-09-20 | Cae Electronics Ltd. | High definition television head mounted display unit |
US5367315A (en) * | 1990-11-15 | 1994-11-22 | Eyetech Corporation | Method and apparatus for controlling cursor movement |
US5392158A (en) * | 1991-11-01 | 1995-02-21 | Sega Enterprises, Ltd. | Head-mounted image display |
US5406415A (en) * | 1992-09-22 | 1995-04-11 | Kelly; Shawn L. | Imaging system for a head-mounted display |
US5414544A (en) * | 1992-12-25 | 1995-05-09 | Sony Corporation | Display apparatus |
US5416876A (en) * | 1994-01-28 | 1995-05-16 | Hughes Training, Inc. | Fiber optic ribbon subminiature display for head/helmet mounted display |
US5430505A (en) * | 1992-01-30 | 1995-07-04 | Mak Technologies, Inc. | High speed eye tracking device and method |
US5436841A (en) * | 1992-03-16 | 1995-07-25 | Aerospatiale Societe Nationale Industrielle | Method and device for determining the relative position and the relative path of two space vehicles |
US5436765A (en) * | 1992-07-27 | 1995-07-25 | Olympus Optical Co., Ltd. | Visual display apparatus |
US5450596A (en) * | 1991-07-18 | 1995-09-12 | Redwear Interactive Inc. | CD-ROM data retrieval system using a hands-free command controller and headwear monitor |
US5452416A (en) * | 1992-12-30 | 1995-09-19 | Dominator Radiology, Inc. | Automated system and a method for organizing, presenting, and manipulating medical images |
US5457356A (en) * | 1993-08-11 | 1995-10-10 | Spire Corporation | Flat panel displays and process |
USD363279S (en) * | 1993-10-01 | 1995-10-17 | Seiko Epson Corporation | Head set with visual display unit and stereophonic headphones |
US5471542A (en) * | 1993-09-27 | 1995-11-28 | Ragland; Richard R. | Point-of-gaze tracker |
US5483307A (en) * | 1994-09-29 | 1996-01-09 | Texas Instruments, Inc. | Wide field of view head-mounted display |
US5485172A (en) * | 1993-05-21 | 1996-01-16 | Sony Corporation | Automatic image regulating arrangement for head-mounted image display apparatus |
US5539422A (en) * | 1993-04-12 | 1996-07-23 | Virtual Vision, Inc. | Head mounted display system |
US5751260A (en) * | 1992-01-10 | 1998-05-12 | The United States Of America As Represented By The Secretary Of The Navy | Sensory integrated data interface |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5493595A (en) * | 1982-02-24 | 1996-02-20 | Schoolman Scientific Corp. | Stereoscopically displayed three dimensional medical imaging |
US4613219A (en) * | 1984-03-05 | 1986-09-23 | Burke Marketing Services, Inc. | Eye movement recording apparatus |
US5231379A (en) | 1987-09-18 | 1993-07-27 | Hughes Flight Dynamics, Inc. | Automobile head-up display system with apparatus for positioning source information |
US5473365A (en) | 1992-12-25 | 1995-12-05 | Olympus Optical Co., Ltd. | Head-mounted image display apparatus in which a system is capable of changing aspect ratio of observed image |
JPH06194598A (en) | 1992-12-25 | 1994-07-15 | Olympus Optical Co Ltd | Display device of head mounting type |
US5421589A (en) * | 1993-05-14 | 1995-06-06 | The Walt Disney Company | Method and apparatus for displaying an alpha channel virtual image |
US5526812A (en) * | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
US5649061A (en) * | 1995-05-11 | 1997-07-15 | The United States Of America As Represented By The Secretary Of The Army | Device and method for estimating a mental decision |
US5671158A (en) * | 1995-09-18 | 1997-09-23 | Envirotest Systems Corp. | Apparatus and method for effecting wireless discourse between computer and technician in testing motor vehicle emission control systems |
- 1996-10-02: US application US08/720,662, granted as US6847336B1, not active (Expired - Fee Related)
- 2005-01-24: US application US11/042,662, published as US20050206583A1, not active (Abandoned)
Patent Citations (99)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US531379A (en) * | 1894-12-25 | mathewson | ||
US3724932A (en) * | 1971-04-09 | 1973-04-03 | Stanford Research Inst | Eye tracker and method |
US3988533A (en) * | 1974-09-30 | 1976-10-26 | Video Tek, Inc. | Video-type universal motion and intrusion detection system |
US3923370A (en) * | 1974-10-15 | 1975-12-02 | Honeywell Inc | Head mounted displays |
US3940204A (en) * | 1975-01-23 | 1976-02-24 | Hughes Aircraft Company | Optical display systems utilizing holographic lenses |
US4028725A (en) * | 1976-04-21 | 1977-06-07 | Grumman Aerospace Corporation | High-resolution vision system |
US4188090A (en) * | 1977-06-01 | 1980-02-12 | Elliott Brothers (London) Limited | Retractable head-up displays |
US4181405A (en) * | 1978-08-07 | 1980-01-01 | The Singer Company | Head-up viewing display |
US4349815A (en) * | 1979-01-11 | 1982-09-14 | Redifon Simulation Limited | Head-movable frame-scanner for head-coupled display |
US4439755A (en) * | 1981-06-04 | 1984-03-27 | Farrand Optical Co., Inc. | Head-up infinity display and pilot's sight |
US4437113A (en) * | 1981-12-21 | 1984-03-13 | The United States Of America As Represented By The Secretary Of The Air Force | Anti-flutter apparatus for head mounted visual display |
US4482326A (en) * | 1982-01-26 | 1984-11-13 | Instrument Flight Research Inc. | Flight training glasses |
US4737972A (en) * | 1982-02-24 | 1988-04-12 | Arnold Schoolman | Stereoscopic fluoroscope arrangement |
US4575722A (en) * | 1982-05-05 | 1986-03-11 | Litton Systems, Inc. | Magneto-optic display |
US4636866A (en) * | 1982-12-24 | 1987-01-13 | Seiko Epson K.K. | Personal liquid crystal image display |
US4669810A (en) * | 1984-02-03 | 1987-06-02 | Flight Dynamics, Inc. | Head up display system |
US4763990A (en) * | 1984-02-03 | 1988-08-16 | Flight Dynamics, Inc. | Head up display system |
US4652870A (en) * | 1984-02-10 | 1987-03-24 | Gec Avionics Limited | Display arrangements for head-up display systems |
US4651201A (en) * | 1984-06-01 | 1987-03-17 | Arnold Schoolman | Stereoscopic endoscope arrangement |
US5281957A (en) * | 1984-11-14 | 1994-01-25 | Schoolman Scientific Corp. | Portable computer and head mounted display |
US4796987A (en) * | 1984-12-20 | 1989-01-10 | Linden Harry A | Digital display for head mounted protection |
US4729634A (en) * | 1985-02-04 | 1988-03-08 | United Technologies Corporation | Reflective head-up display |
US4769633A (en) * | 1985-04-17 | 1988-09-06 | Gec Avionics Limited | Head-up displays |
US4740780A (en) * | 1985-06-24 | 1988-04-26 | Gec Avionics, Inc. | Head-up display for automobile |
US4688879A (en) * | 1985-07-08 | 1987-08-25 | Kaiser Optical Systems, Inc. | Holographic multi-combiner for head-up display |
US4711512A (en) * | 1985-07-12 | 1987-12-08 | Environmental Research Institute Of Michigan | Compact head-up display |
US4725125A (en) * | 1985-08-27 | 1988-02-16 | Gec Avionics Limited | Head-up displays |
US4787711A (en) * | 1986-01-23 | 1988-11-29 | Yazaki Corporation | On-vehicle head up display device with optical means for correcting parallax in a vertical direction |
US4799765A (en) * | 1986-03-31 | 1989-01-24 | Hughes Aircraft Company | Integrated head-up and panel display unit |
US4884137A (en) * | 1986-07-10 | 1989-11-28 | Varo, Inc. | Head mounted video display and remote camera system |
US5162828A (en) * | 1986-09-25 | 1992-11-10 | Furness Thomas A | Display system for a head mounted viewing transparency |
US4932731A (en) * | 1986-11-14 | 1990-06-12 | Yazaki Corporation | Holographic head-up display apparatus |
US4824228A (en) * | 1986-11-28 | 1989-04-25 | Hughes Aircraft Company | Stereomicroscopic inspection system with low eyestrain features |
US4818048A (en) * | 1987-01-06 | 1989-04-04 | Hughes Aircraft Company | Holographic head-up control panel |
US4984179A (en) * | 1987-01-21 | 1991-01-08 | W. Industries Limited | Method and apparatus for the perception of computer-generated imagery |
US5214413A (en) * | 1987-06-23 | 1993-05-25 | Nissan Motor Company, Limited | Head-up display apparatus for vehicular display |
US4994794A (en) * | 1987-06-29 | 1991-02-19 | Gec-Marconi Limited | Methods and apparatus for displaying data |
US4930847A (en) * | 1987-07-09 | 1990-06-05 | Environmental Research Institute Of Michigan | Multicolor holographic element and apparatus for head-up display applications |
US5003300A (en) * | 1987-07-27 | 1991-03-26 | Reflection Technology, Inc. | Head mounted display for miniature video display system |
US4878046A (en) * | 1987-07-30 | 1989-10-31 | United Technologies Corporation | Mounting a cathode ray tube for a heads-up display system |
US4927234A (en) * | 1987-08-21 | 1990-05-22 | The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Optical system for head-up displays |
US4961625A (en) * | 1987-09-18 | 1990-10-09 | Flight Dynamics, Inc. | Automobile head-up display system with reflective aspheric surface |
US5129716A (en) * | 1987-10-23 | 1992-07-14 | Laszlo Holakovszky | Stereoscopic video image display appliance wearable on head like spectacles |
US4987410A (en) * | 1988-01-25 | 1991-01-22 | Kaiser Aerospace & Electronics Corporation | Multiple image forming apparatus |
US5305203A (en) * | 1988-02-01 | 1994-04-19 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US5251127A (en) * | 1988-02-01 | 1993-10-05 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US4831366A (en) * | 1988-02-05 | 1989-05-16 | Yazaki Corporation | Head up display apparatus for automotive vehicle |
US5005009A (en) * | 1988-02-16 | 1991-04-02 | K. W. Muth Company, Inc. | Method and apparatus for multiple object viewing |
US5000544A (en) * | 1988-08-01 | 1991-03-19 | Gec-Marconi Limited | Helmet system with optical display projection system including a cylindrical refractive surface |
US5243448A (en) * | 1988-09-28 | 1993-09-07 | The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Head-up display |
US5331333A (en) * | 1988-12-08 | 1994-07-19 | Sharp Kabushiki Kaisha | Display apparatus |
US5066525A (en) * | 1989-01-25 | 1991-11-19 | Central Glass Company, Limited | Laminated glass panel incorporating hologram sheet |
US4915487A (en) * | 1989-02-01 | 1990-04-10 | Systems Research Laboratories | Heads up display for night vision goggle |
US5028119A (en) * | 1989-04-07 | 1991-07-02 | Hughes Aircraft Company | Aircraft head-up display |
US4973139A (en) * | 1989-04-07 | 1990-11-27 | Hughes Aircraft Company | Automotive head-up display |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US4988976A (en) * | 1989-09-06 | 1991-01-29 | Lu Hsing Tseng | Head-up display with magnetic field speed detecting means |
US5210624A (en) * | 1989-09-19 | 1993-05-11 | Fujitsu Limited | Heads-up display |
US5108479A (en) * | 1989-10-09 | 1992-04-28 | Asahi Glass Company Ltd | Process for manufacturing glass with functional coating |
US5130794A (en) * | 1990-03-29 | 1992-07-14 | Ritchey Kurtis J | Panoramic display system |
US5319363A (en) * | 1990-08-31 | 1994-06-07 | The General Hospital Corporation | Network for portable patient monitoring devices |
US5037182A (en) * | 1990-09-12 | 1991-08-06 | Delco Electronics Corporation | Rearview mirror head-up display |
US5150137A (en) * | 1990-10-10 | 1992-09-22 | Pulse Medical Instruments | Positioning system for pupil imaging optics |
US5241391A (en) * | 1990-10-19 | 1993-08-31 | Gec Ferranti Defence Systems Limited | Video camera system for recording a scene and a head-up display |
US5151722A (en) * | 1990-11-05 | 1992-09-29 | The Johns Hopkins University | Video display on spectacle-like frame |
US5367315A (en) * | 1990-11-15 | 1994-11-22 | Eyetech Corporation | Method and apparatus for controlling cursor movement |
US5231391A (en) * | 1990-11-30 | 1993-07-27 | Skf France | Passive pick-up device for monitoring the state of the tire of a vehicle wheel and measuring the rotation characteristics of the wheel |
US5331149A (en) * | 1990-12-31 | 1994-07-19 | Kopin Corporation | Eye tracking system having an array of photodetectors aligned respectively with an array of pixels |
US5227769A (en) * | 1991-05-23 | 1993-07-13 | Westinghouse Electric Corp. | Heads-up projection display |
US5450596A (en) * | 1991-07-18 | 1995-09-12 | Redwear Interactive Inc. | CD-ROM data retrieval system using a hands-free command controller and headwear monitor |
US5198895A (en) * | 1991-08-29 | 1993-03-30 | Rockwell International Corporation | Holographic head-up display |
US5341242A (en) * | 1991-09-05 | 1994-08-23 | Elbit Ltd. | Helmet mounted display |
US5222477A (en) * | 1991-09-30 | 1993-06-29 | Welch Allyn, Inc. | Endoscope or borescope stereo viewing system |
US5392158A (en) * | 1991-11-01 | 1995-02-21 | Sega Enterprises, Ltd. | Head-mounted image display |
US5281960A (en) * | 1991-11-19 | 1994-01-25 | Silhouette Technology, Inc. | Helmet mounted display |
US5751260A (en) * | 1992-01-10 | 1998-05-12 | The United States Of America As Represented By The Secretary Of The Navy | Sensory integrated data interface |
US5430505A (en) * | 1992-01-30 | 1995-07-04 | Mak Technologies, Inc. | High speed eye tracking device and method |
US5436841A (en) * | 1992-03-16 | 1995-07-25 | Aerospatiale Societe Nationale Industrielle | Method and device for determining the relative position and the relative path of two space vehicles |
US5348477A (en) * | 1992-04-10 | 1994-09-20 | Cae Electronics Ltd. | High definition television head mounted display unit |
US5334991A (en) * | 1992-05-15 | 1994-08-02 | Reflection Technology | Dual image head-mounted display |
US5278696A (en) * | 1992-05-22 | 1994-01-11 | Kaiser Aerospace & Electronics Corporation | Head-up display system |
US5287437A (en) * | 1992-06-02 | 1994-02-15 | Sun Microsystems, Inc. | Method and apparatus for head tracked display of precomputed stereo images |
US5311220A (en) * | 1992-06-10 | 1994-05-10 | Dimension Technologies, Inc. | Autostereoscopic display |
US5436765A (en) * | 1992-07-27 | 1995-07-25 | Olympus Optical Co., Ltd. | Visual display apparatus |
US5321416A (en) * | 1992-07-27 | 1994-06-14 | Virtual Research Systems | Head-mounted visual display apparatus |
US5406415A (en) * | 1992-09-22 | 1995-04-11 | Kelly; Shawn L. | Imaging system for a head-mounted display |
US5302964A (en) * | 1992-09-25 | 1994-04-12 | Hughes Aircraft Company | Heads-up display (HUD) incorporating cathode-ray tube image generator with digital look-up table for distortion correction |
US5341181A (en) * | 1992-11-20 | 1994-08-23 | Godard Roger R | Systems and methods for capturing and presenting visual information |
US5345281A (en) * | 1992-12-17 | 1994-09-06 | John Taboada | Eye tracking system and method |
US5414544A (en) * | 1992-12-25 | 1995-05-09 | Sony Corporation | Display apparatus |
US5452416A (en) * | 1992-12-30 | 1995-09-19 | Dominator Radiology, Inc. | Automated system and a method for organizing, presenting, and manipulating medical images |
US5539422A (en) * | 1993-04-12 | 1996-07-23 | Virtual Vision, Inc. | Head mounted display system |
US5347400A (en) * | 1993-05-06 | 1994-09-13 | Ken Hunter | Optical system for virtual reality helmet |
US5485172A (en) * | 1993-05-21 | 1996-01-16 | Sony Corporation | Automatic image regulating arrangement for head-mounted image display apparatus |
US5457356A (en) * | 1993-08-11 | 1995-10-10 | Spire Corporation | Flat panel displays and process |
US5471542A (en) * | 1993-09-27 | 1995-11-28 | Ragland; Richard R. | Point-of-gaze tracker |
USD363279S (en) * | 1993-10-01 | 1995-10-17 | Seiko Epson Corporation | Head set with visual display unit and stereophonic headphones |
US5416876A (en) * | 1994-01-28 | 1995-05-16 | Hughes Training, Inc. | Fiber optic ribbon subminiature display for head/helmet mounted display |
US5483307A (en) * | 1994-09-29 | 1996-01-09 | Texas Instruments, Inc. | Wide field of view head-mounted display |
Cited By (378)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120098752A1 (en) * | 1997-09-19 | 2012-04-26 | Rolus Borgward Glenn | Digital Book |
US8654082B2 (en) * | 1997-09-19 | 2014-02-18 | Glenn Rolus Borgward | Digital book |
US20050190181A1 (en) * | 1999-08-06 | 2005-09-01 | Canon Kabushiki Kaisha | Image processing method and apparatus |
US20050203367A1 (en) * | 2001-06-13 | 2005-09-15 | Ahmed Syed N | Guide system |
US7493153B2 (en) * | 2001-06-13 | 2009-02-17 | Volume Interactions Pte., Ltd. | Augmented reality system controlled by probe position |
US7190825B2 (en) | 2001-08-17 | 2007-03-13 | Geo-Rae Co., Ltd. | Portable communication device for stereoscopic image display and transmission |
US20030117395A1 (en) * | 2001-08-17 | 2003-06-26 | Byoungyi Yoon | Method and system for calculating a photographing ratio of a camera |
US20070035619A1 (en) * | 2001-08-17 | 2007-02-15 | Byoungyi Yoon | Method and system for controlling space magnification for stereoscopic images |
US20030107643A1 (en) * | 2001-08-17 | 2003-06-12 | Byoungyi Yoon | Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion |
US20030112508A1 (en) * | 2001-08-17 | 2003-06-19 | Byoungyi Yoon | Method and system for controlling space magnification for stereoscopic images |
US20030108236A1 (en) * | 2001-08-17 | 2003-06-12 | Byoungyi Yoon | Portable communication device for stereoscopic image display and transmission |
US20030122925A1 (en) * | 2001-08-17 | 2003-07-03 | Byoungyi Yoon | Method and system for providing the motion information of stereoscopic cameras |
US7664649B2 (en) * | 2001-12-20 | 2010-02-16 | Canon Kabushiki Kaisha | Control apparatus, method and computer readable memory medium for enabling a user to communicate by speech with a processor-controlled apparatus |
US20070174060A1 (en) * | 2001-12-20 | 2007-07-26 | Canon Kabushiki Kaisha | Control apparatus |
US20040246269A1 (en) * | 2002-11-29 | 2004-12-09 | Luis Serra | System and method for managing a plurality of locations of interest in 3D data displays ("Zoom Context") |
US20210260749A1 (en) * | 2004-02-26 | 2021-08-26 | Teladoc Health, Inc. | Graphical interface for a remote presence system |
US20050228281A1 (en) * | 2004-03-31 | 2005-10-13 | Nefos Thomas P | Handheld diagnostic ultrasound system with head mounted display |
US20110069041A1 (en) * | 2005-03-18 | 2011-03-24 | Cohen Alexander J | Machine-differentiatable identifiers having a commonly accepted meaning |
US9063650B2 (en) | 2005-03-18 | 2015-06-23 | The Invention Science Fund I, Llc | Outputting a saved hand-formed expression |
US20140225823A1 (en) * | 2005-03-18 | 2014-08-14 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Machine-differentiatable identifiers having a commonly accepted meaning |
US8823636B2 (en) | 2005-03-18 | 2014-09-02 | The Invention Science Fund I, Llc | Including environmental information in a manual expression |
US8599174B2 (en) | 2005-03-18 | 2013-12-03 | The Invention Science Fund I, Llc | Verifying a written expression |
US8897605B2 (en) | 2005-03-18 | 2014-11-25 | The Invention Science Fund I, Llc | Decoding digital information included in a hand-formed expression |
US8928632B2 (en) | 2005-03-18 | 2015-01-06 | The Invention Science Fund I, Llc | Handwriting regions keyed to a data receptor |
US8640959B2 (en) | 2005-03-18 | 2014-02-04 | The Invention Science Fund I, Llc | Acquisition of a user expression and a context of the expression |
US9459693B2 (en) * | 2005-03-18 | 2016-10-04 | Invention Science Fund I, Llc | Machine-differentiatable identifiers having a commonly accepted meaning |
US8749480B2 (en) | 2005-03-18 | 2014-06-10 | The Invention Science Fund I, Llc | Article having a writing portion and preformed identifiers |
US8787706B2 (en) | 2005-03-18 | 2014-07-22 | The Invention Science Fund I, Llc | Acquisition of a user expression and an environment of the expression |
US20100164990A1 (en) * | 2005-08-15 | 2010-07-01 | Koninklijke Philips Electronics, N.V. | System, apparatus, and method for augmented reality glasses for end-user programming |
US7545322B2 (en) * | 2005-09-20 | 2009-06-09 | Raytheon Company | Antenna transceiver system |
US20080030404A1 (en) * | 2005-09-20 | 2008-02-07 | Irwin L Newberg | Antenna transceiver system |
US10976575B1 (en) | 2005-10-07 | 2021-04-13 | Percept Technologies Inc | Digital eyeware |
US20130242262A1 (en) * | 2005-10-07 | 2013-09-19 | Percept Technologies Inc. | Enhanced optical and perceptual digital eyewear |
US11630311B1 (en) | 2005-10-07 | 2023-04-18 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
US20150268483A1 (en) * | 2005-10-07 | 2015-09-24 | Percept Technologies Inc. | Enhanced optical and perceptual digital eyewear |
US11428937B2 (en) * | 2005-10-07 | 2022-08-30 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
US11675216B2 (en) * | 2005-10-07 | 2023-06-13 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
US10795183B1 (en) | 2005-10-07 | 2020-10-06 | Percept Technologies Inc | Enhanced optical and perceptual digital eyewear |
US10527847B1 (en) | 2005-10-07 | 2020-01-07 | Percept Technologies Inc | Digital eyewear |
US7810504B2 (en) | 2005-12-28 | 2010-10-12 | Depuy Products, Inc. | System and method for wearable user interface in computer assisted surgery |
US20070200863A1 (en) * | 2005-12-28 | 2007-08-30 | Depuy Products, Inc. | System and method for wearable user interface in computer assisted surgery |
US10021430B1 (en) | 2006-02-10 | 2018-07-10 | Percept Technologies Inc | Method and system for distribution of media |
US7851736B2 (en) | 2006-03-10 | 2010-12-14 | Siemens Aktiengesellschaft | Method and device for optimizing the image display on imaging apparatuses of a medical system during a medical procedure |
DE102006011233B4 (en) * | 2006-03-10 | 2011-04-28 | Siemens Ag | Method and device for optimizing the image representation on an imaging device |
DE102006011233A1 (en) * | 2006-03-10 | 2007-09-13 | Siemens Ag | Image representation optimizing method, involves optimizing image representation on imaging device e.g. monitor, by darkening area in which device is arranged and by increasing contrast of device when e.g. cardiologist, looks to device |
US20080045807A1 (en) * | 2006-06-09 | 2008-02-21 | Psota Eric T | System and methods for evaluating and monitoring wounds |
US8032842B2 (en) * | 2006-07-25 | 2011-10-04 | Korea Institute Of Science & Technology | System and method for three-dimensional interaction based on gaze and system and method for tracking three-dimensional gaze |
US20080181452A1 (en) * | 2006-07-25 | 2008-07-31 | Yong-Moo Kwon | System and method for Three-dimensional interaction based on gaze and system and method for tracking Three-dimensional gaze |
US20080039818A1 (en) * | 2006-08-11 | 2008-02-14 | Siemens Aktiengesellschaft | Technical medical system and method for operating it |
US8498868B2 (en) * | 2006-08-11 | 2013-07-30 | Siemens Aktiengesellschaft | Technical medical system and method for operating it |
US20080097176A1 (en) * | 2006-09-29 | 2008-04-24 | Doug Music | User interface and identification in a medical device systems and methods |
US8681256B2 (en) * | 2006-10-16 | 2014-03-25 | Sony Corporation | Display method and display apparatus in which a part of a screen area is in a through-state |
US9846304B2 (en) | 2006-10-16 | 2017-12-19 | Sony Corporation | Display method and display apparatus in which a part of a screen area is in a through-state |
US20100085462A1 (en) * | 2006-10-16 | 2010-04-08 | Sony Corporation | Display apparatus, display method |
US9182598B2 (en) | 2006-10-16 | 2015-11-10 | Sony Corporation | Display method and display apparatus in which a part of a screen area is in a through-state |
US20080129839A1 (en) * | 2006-11-07 | 2008-06-05 | Sony Corporation | Imaging apparatus and imaging method |
US20080107361A1 (en) * | 2006-11-07 | 2008-05-08 | Sony Corporation | Imaging apparatus, display apparatus, imaging method, and display method |
US8872941B2 (en) * | 2006-11-07 | 2014-10-28 | Sony Corporation | Imaging apparatus and imaging method |
US20100220037A1 (en) * | 2006-12-07 | 2010-09-02 | Sony Corporation | Image display system, display apparatus, and display method |
US20080259199A1 (en) * | 2006-12-07 | 2008-10-23 | Sony Corporation | Image display system, display apparatus, and display method |
US8009219B2 (en) | 2006-12-07 | 2011-08-30 | Sony Corporation | Image display system, display apparatus, and display method |
US7876374B2 (en) | 2006-12-07 | 2011-01-25 | Sony Corporation | Image display system, display apparatus, and display method |
US20080180521A1 (en) * | 2007-01-29 | 2008-07-31 | Ahearn David J | Multi-view system |
US9300949B2 (en) | 2007-01-29 | 2016-03-29 | David J. Ahearn | Multi-view system |
US7532163B2 (en) | 2007-02-13 | 2009-05-12 | Raytheon Company | Conformal electronically scanned phased array antenna and communication system for helmets and other platforms |
US20080191950A1 (en) * | 2007-02-13 | 2008-08-14 | Raytheon Company | Conformal electronically scanned phased array antenna and communication system for helmets and other platforms |
EP2120212A1 (en) * | 2007-03-12 | 2009-11-18 | Sony Corporation | Image processing device, image processing method, and image processing system |
EP2120212A4 (en) * | 2007-03-12 | 2011-07-06 | Sony Corp | Image processing device, image processing method, and image processing system |
US20100091139A1 (en) * | 2007-03-12 | 2010-04-15 | Sony Corporation | Image processing apparatus, image processing method and image processing system |
US20080253695A1 (en) * | 2007-04-10 | 2008-10-16 | Sony Corporation | Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program |
US8687925B2 (en) | 2007-04-10 | 2014-04-01 | Sony Corporation | Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program |
US20090040231A1 (en) * | 2007-08-06 | 2009-02-12 | Sony Corporation | Information processing apparatus, system, and method thereof |
US9972116B2 (en) | 2007-08-06 | 2018-05-15 | Sony Corporation | Information processing apparatus, system, and method for displaying bio-information or kinetic information |
US10937221B2 (en) | 2007-08-06 | 2021-03-02 | Sony Corporation | Information processing apparatus, system, and method for displaying bio-information or kinetic information |
US9568998B2 (en) | 2007-08-06 | 2017-02-14 | Sony Corporation | Information processing apparatus, system, and method for displaying bio-information or kinetic information |
US10262449B2 (en) | 2007-08-06 | 2019-04-16 | Sony Corporation | Information processing apparatus, system, and method for displaying bio-information or kinetic information |
US10529114B2 (en) | 2007-08-06 | 2020-01-07 | Sony Corporation | Information processing apparatus, system, and method for displaying bio-information or kinetic information |
US8797331B2 (en) | 2007-08-06 | 2014-08-05 | Sony Corporation | Information processing apparatus, system, and method thereof |
US20100113940A1 (en) * | 2008-01-10 | 2010-05-06 | The Ohio State University Research Foundation | Wound goggles |
US20110043644A1 (en) * | 2008-04-02 | 2011-02-24 | Esight Corp. | Apparatus and Method for a Dynamic "Region of Interest" in a Display System |
US9618748B2 (en) * | 2008-04-02 | 2017-04-11 | Esight Corp. | Apparatus and method for a dynamic “region of interest” in a display system |
US10617303B2 (en) | 2008-07-10 | 2020-04-14 | Ecole Polytechnique Federale De Lausanne (Epfl) | Functional optical coherent imaging |
US9757039B2 (en) | 2008-07-10 | 2017-09-12 | Ecole Polytechnique Federale De Lausanne (Epfl) | Functional optical coherent imaging |
US8628197B2 (en) | 2008-08-22 | 2014-01-14 | Texas Instruments Incorporated | Display systems and methods for mobile devices |
US8353598B2 (en) * | 2008-08-22 | 2013-01-15 | Texas Instruments Incorporated | Display systems and methods for mobile devices |
US20100045569A1 (en) * | 2008-08-22 | 2010-02-25 | Leonardo William Estevez | Display Systems and Methods for Mobile Devices |
US11716412B2 (en) | 2008-09-30 | 2023-08-01 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10897528B2 (en) | 2008-09-30 | 2021-01-19 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US11089144B2 (en) | 2008-09-30 | 2021-08-10 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US11258891B2 (en) | 2008-09-30 | 2022-02-22 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10686922B2 (en) | 2008-09-30 | 2020-06-16 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10530914B2 (en) | 2008-09-30 | 2020-01-07 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10530915B2 (en) | 2008-09-30 | 2020-01-07 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US12126748B2 (en) | 2008-09-30 | 2024-10-22 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10306037B2 (en) * | 2008-09-30 | 2019-05-28 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US20100241992A1 (en) * | 2009-03-21 | 2010-09-23 | Shenzhen Futaihong Precision Industry Co., Ltd. | Electronic device and method for operating menu items of the electronic device |
US20100328204A1 (en) * | 2009-06-25 | 2010-12-30 | The Boeing Company | Virtual Control Station |
US8773330B2 (en) * | 2009-06-25 | 2014-07-08 | The Boeing Company | Method and apparatus for a virtual mission control station |
US11717706B2 (en) | 2009-07-15 | 2023-08-08 | Cilag Gmbh International | Ultrasonic surgical instruments |
US11871982B2 (en) | 2009-10-09 | 2024-01-16 | Cilag Gmbh International | Surgical generator for ultrasonic and electrosurgical devices |
US20120249400A1 (en) * | 2009-12-22 | 2012-10-04 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Signal processing eye protecting digital glasses |
US8589968B2 (en) | 2009-12-31 | 2013-11-19 | Motorola Mobility Llc | Systems and methods providing content on a display based upon facial recognition of a viewer |
US20110161998A1 (en) * | 2009-12-31 | 2011-06-30 | Motorola, Inc. | Systems and Methods Providing Content on a Display Based Upon Facial Recognition of a Viewer |
US11382642B2 (en) | 2010-02-11 | 2022-07-12 | Cilag Gmbh International | Rotatable cutting implements with friction reducing material for ultrasonic surgical instruments |
US9875406B2 (en) | 2010-02-28 | 2018-01-23 | Microsoft Technology Licensing, Llc | Adjustable extension for temple arm |
US20120206323A1 (en) * | 2010-02-28 | 2012-08-16 | Osterhout Group, Inc. | Ar glasses with event and sensor triggered ar eyepiece interface to external devices |
US20110221657A1 (en) * | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Optical stabilization of displayed content with a variable lens |
US9366862B2 (en) | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
US9223134B2 (en) | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US10860100B2 (en) | 2010-02-28 | 2020-12-08 | Microsoft Technology Licensing, Llc | AR glasses with predictive control of external device based on event input |
US10539787B2 (en) | 2010-02-28 | 2020-01-21 | Microsoft Technology Licensing, Llc | Head-worn adaptive display |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US8814691B2 (en) | 2010-02-28 | 2014-08-26 | Microsoft Corporation | System and method for social networking gaming with an augmented reality |
US9285589B2 (en) * | 2010-02-28 | 2016-03-15 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered control of AR eyepiece applications |
US10180572B2 (en) | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
US9134534B2 (en) | 2010-02-28 | 2015-09-15 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including a modular image source |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US9759917B2 (en) * | 2010-02-28 | 2017-09-12 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered AR eyepiece interface to external devices |
US9341843B2 (en) | 2010-02-28 | 2016-05-17 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a small scale image source |
US9329689B2 (en) | 2010-02-28 | 2016-05-03 | Microsoft Technology Licensing, Llc | Method and apparatus for biometric data capture |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
US20120212414A1 (en) * | 2010-02-28 | 2012-08-23 | Osterhout Group, Inc. | Ar glasses with event and sensor triggered control of ar eyepiece applications |
US10268888B2 (en) | 2010-02-28 | 2019-04-23 | Microsoft Technology Licensing, Llc | Method and apparatus for biometric data capture |
US9091851B2 (en) | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US9097891B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment |
WO2011116332A3 (en) * | 2010-03-18 | 2012-04-19 | SPI Surgical, Inc. | Surgical cockpit comprising multisensory and multimodal interfaces for robotic surgery and methods related thereto |
WO2011116332A2 (en) * | 2010-03-18 | 2011-09-22 | SPI Surgical, Inc. | Surgical cockpit comprising multisensory and multimodal interfaces for robotic surgery and methods related thereto |
US20110238079A1 (en) * | 2010-03-18 | 2011-09-29 | SPI Surgical, Inc. | Surgical Cockpit Comprising Multisensory and Multimodal Interfaces for Robotic Surgery and Methods Related Thereto |
US8617197B2 (en) | 2010-03-18 | 2013-12-31 | SPI Surgical, Inc. | Introducer device |
US9622828B2 (en) | 2010-03-18 | 2017-04-18 | SPI Surgical, Inc. | Introducer device |
US9474580B2 (en) | 2010-03-18 | 2016-10-25 | SPI Surgical, Inc. | Surgical cockpit comprising multisensory and multimodal interfaces for robotic surgery and methods related thereto |
US20110298621A1 (en) * | 2010-06-02 | 2011-12-08 | Lokesh Shanbhag | System and method for generating alerts |
RU2619794C1 (en) * | 2010-06-07 | 2017-05-18 | Зе Боинг Компани | Virtual control station |
US9046999B1 (en) * | 2010-06-08 | 2015-06-02 | Google Inc. | Dynamic input at a touch-based interface based on pressure |
US9791957B2 (en) | 2010-06-08 | 2017-10-17 | X Development Llc | Dynamic input at a touch-based interface based on pressure |
WO2011156195A2 (en) * | 2010-06-09 | 2011-12-15 | Dynavox Systems Llc | Speech generation device with a head mounted display unit |
US10031576B2 (en) | 2010-06-09 | 2018-07-24 | Dynavox Systems Llc | Speech generation device with a head mounted display unit |
WO2011156195A3 (en) * | 2010-06-09 | 2012-03-01 | Dynavox Systems Llc | Speech generation device with a head mounted display unit |
US20120026191A1 (en) * | 2010-07-05 | 2012-02-02 | Sony Ericsson Mobile Communications Ab | Method for displaying augmentation information in an augmented reality system |
US8531355B2 (en) * | 2010-07-23 | 2013-09-10 | Gregory A. Maltz | Unitized, vision-controlled, wireless eyeglass transceiver |
US8531394B2 (en) | 2010-07-23 | 2013-09-10 | Gregory A. Maltz | Unitized, vision-controlled, wireless eyeglasses transceiver |
US10528130B2 (en) | 2010-07-23 | 2020-01-07 | Telepatheye Inc. | Unitized eye-tracking wireless eyeglasses system |
US20120021806A1 (en) * | 2010-07-23 | 2012-01-26 | Maltz Gregory A | Unitized, Vision-Controlled, Wireless Eyeglass Transceiver |
US9128281B2 (en) | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US20120069050A1 (en) * | 2010-09-16 | 2012-03-22 | Heeyeon Park | Transparent display device and method for providing information using the same |
US9213405B2 (en) | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
EP2923638B1 (en) * | 2011-03-18 | 2019-02-20 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Optical measuring device and system |
US20120287040A1 (en) * | 2011-05-10 | 2012-11-15 | Raytheon Company | System and Method for Operating a Helmet Mounted Display |
US8872766B2 (en) * | 2011-05-10 | 2014-10-28 | Raytheon Company | System and method for operating a helmet mounted display |
US20130002525A1 (en) * | 2011-06-29 | 2013-01-03 | Bobby Duane Foote | System for locating a position of an object |
US9153195B2 (en) * | 2011-08-17 | 2015-10-06 | Microsoft Technology Licensing, Llc | Providing contextual personal information by a mixed reality device |
US10223832B2 (en) | 2011-08-17 | 2019-03-05 | Microsoft Technology Licensing, Llc | Providing location occupancy analysis via a mixed reality device |
US20130044130A1 (en) * | 2011-08-17 | 2013-02-21 | Kevin A. Geisner | Providing contextual personal information by a mixed reality device |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US11127210B2 (en) | 2011-08-24 | 2021-09-21 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9536350B2 (en) | 2011-08-24 | 2017-01-03 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9910490B2 (en) * | 2011-12-29 | 2018-03-06 | Eyeguide, Inc. | System and method of cursor position control based on the vestibulo-ocular reflex |
US20130169533A1 (en) * | 2011-12-29 | 2013-07-04 | Grinbath, Llc | System and Method of Cursor Position Control Based on the Vestibulo-Ocular Reflex |
US8860660B2 (en) | 2011-12-29 | 2014-10-14 | Grinbath, Llc | System and method of determining pupil center position |
US9035878B1 (en) | 2012-02-29 | 2015-05-19 | Google Inc. | Input system |
US8643951B1 (en) | 2012-03-15 | 2014-02-04 | Google Inc. | Graphical menu and interaction therewith through a viewing window |
WO2013138647A1 (en) * | 2012-03-15 | 2013-09-19 | Google Inc. | Using convergence angle to select among different ui elements |
US20130257832A1 (en) * | 2012-03-30 | 2013-10-03 | Exelis, Inc. | Image pickoff apparatus system and method |
US12167866B2 (en) | 2012-04-09 | 2024-12-17 | Cilag Gmbh International | Switch arrangements for ultrasonic surgical instruments |
US11419626B2 (en) | 2012-04-09 | 2022-08-23 | Cilag Gmbh International | Switch arrangements for ultrasonic surgical instruments |
US10575737B2 (en) | 2012-04-27 | 2020-03-03 | Novadaq Technologies ULC | Optical coherent imaging medical device |
US11426191B2 (en) | 2012-06-29 | 2022-08-30 | Cilag Gmbh International | Ultrasonic surgical instruments with distally positioned jaw assemblies |
US11717311B2 (en) | 2012-06-29 | 2023-08-08 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US11583306B2 (en) | 2012-06-29 | 2023-02-21 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US11871955B2 (en) | 2012-06-29 | 2024-01-16 | Cilag Gmbh International | Surgical instruments with articulating shafts |
US10101571B2 (en) | 2012-07-10 | 2018-10-16 | Novadaq Technologies ULC | Perfusion assessment multi-modality optical medical device |
WO2014009859A3 (en) * | 2012-07-10 | 2014-03-06 | Aïmago S.A. | Perfusion assessment multi-modality optical medical device |
US11798676B2 (en) * | 2012-09-17 | 2023-10-24 | DePuy Synthes Products, Inc. | Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking |
US9819843B2 (en) | 2012-09-20 | 2017-11-14 | Zeriscope Inc. | Head-mounted systems and methods for providing inspection, evaluation or assessment of an event or location |
WO2014047402A1 (en) * | 2012-09-20 | 2014-03-27 | MUSC Foundation for Research and Development | Head-mounted systems and methods for providing inspection, evaluation or assessment of an event or location |
US9292086B2 (en) | 2012-09-26 | 2016-03-22 | Grinbath, Llc | Correlating pupil position to gaze location within a scene |
US9111383B2 (en) | 2012-10-05 | 2015-08-18 | Elwha Llc | Systems and methods for obtaining and using augmentation data and for sharing usage data |
US10713846B2 (en) | 2012-10-05 | 2020-07-14 | Elwha Llc | Systems and methods for sharing augmentation data |
US9105126B2 (en) | 2012-10-05 | 2015-08-11 | Elwha Llc | Systems and methods for sharing augmentation data |
US10254830B2 (en) * | 2012-10-05 | 2019-04-09 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US9111384B2 (en) | 2012-10-05 | 2015-08-18 | Elwha Llc | Systems and methods for obtaining and using augmentation data and for sharing usage data |
US20140098134A1 (en) * | 2012-10-05 | 2014-04-10 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US10269179B2 (en) | 2012-10-05 | 2019-04-23 | Elwha Llc | Displaying second augmentations that are based on registered first augmentations |
US20170315613A1 (en) * | 2012-10-05 | 2017-11-02 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US9671863B2 (en) * | 2012-10-05 | 2017-06-06 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US9674047B2 (en) * | 2012-10-05 | 2017-06-06 | Elwha Llc | Correlating user reactions with augmentations displayed through augmented views |
US9077647B2 (en) | 2012-10-05 | 2015-07-07 | Elwha Llc | Correlating user reactions with augmentations displayed through augmented views |
US20190272029A1 (en) * | 2012-10-05 | 2019-09-05 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US9448623B2 (en) | 2012-10-05 | 2016-09-20 | Elwha Llc | Presenting an augmented view in response to acquisition of data inferring user activity |
US20140098135A1 (en) * | 2012-10-05 | 2014-04-10 | Elwha Llc | Correlating user reactions with augmentations displayed through augmented views |
US10665017B2 (en) | 2012-10-05 | 2020-05-26 | Elwha Llc | Displaying in response to detecting one or more user behaviors one or more second augmentations that are based on one or more registered first augmentations |
US9141188B2 (en) | 2012-10-05 | 2015-09-22 | Elwha Llc | Presenting an augmented view in response to acquisition of data inferring user activity |
US10180715B2 (en) | 2012-10-05 | 2019-01-15 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US10481757B2 (en) | 2012-11-07 | 2019-11-19 | Honda Motor Co., Ltd. | Eye gaze control system |
US9626072B2 (en) | 2012-11-07 | 2017-04-18 | Honda Motor Co., Ltd. | Eye gaze control system |
US20140146038A1 (en) * | 2012-11-28 | 2014-05-29 | International Business Machines Corporation | Augmented display of internal system components |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US8996413B2 (en) | 2012-12-28 | 2015-03-31 | Wal-Mart Stores, Inc. | Techniques for detecting depleted stock |
US20150355815A1 (en) * | 2013-01-15 | 2015-12-10 | Poow Innovation Ltd | Dynamic icons |
US10884577B2 (en) * | 2013-01-15 | 2021-01-05 | Poow Innovation Ltd. | Identification of dynamic icons based on eye movement |
JP2014145734A (en) * | 2013-01-30 | 2014-08-14 | Nikon Corp | Information input/output device, and information input/output method |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US9767608B2 (en) * | 2013-03-13 | 2017-09-19 | Samsung Electronics Co., Ltd. | Augmented reality image display system and surgical robot system comprising the same |
US20140275760A1 (en) * | 2013-03-13 | 2014-09-18 | Samsung Electronics Co., Ltd. | Augmented reality image display system and surgical robot system comprising the same |
US10025486B2 (en) | 2013-03-15 | 2018-07-17 | Elwha Llc | Cross-reality select, drag, and drop for augmented reality systems |
US20140282196A1 (en) * | 2013-03-15 | 2014-09-18 | Intuitive Surgical Operations, Inc. | Robotic system providing user selectable actions associated with gaze tracking |
US10628969B2 (en) | 2013-03-15 | 2020-04-21 | Elwha Llc | Dynamically preserving scene elements in augmented reality systems |
US11747895B2 (en) * | 2013-03-15 | 2023-09-05 | Intuitive Surgical Operations, Inc. | Robotic system providing user selectable actions associated with gaze tracking |
US20230266823A1 (en) * | 2013-03-15 | 2023-08-24 | Intuitive Surgical Operations, Inc. | Robotic system providing user selectable actions associated with gaze tracking |
US10109075B2 (en) | 2013-03-15 | 2018-10-23 | Elwha Llc | Temporal element restoration in augmented reality systems |
US9639964B2 (en) | 2013-03-15 | 2017-05-02 | Elwha Llc | Dynamically preserving scene elements in augmented reality systems |
US11209654B1 (en) | 2013-03-15 | 2021-12-28 | Percept Technologies Inc | Digital eyewear system and method for the treatment and prevention of migraines and photophobia |
US10962789B1 (en) | 2013-03-15 | 2021-03-30 | Percept Technologies Inc | Digital eyewear system and method for the treatment and prevention of migraines and photophobia |
US11977678B2 (en) * | 2013-03-15 | 2024-05-07 | Intuitive Surgical Operations, Inc. | Robotic system providing user selectable actions associated with gaze tracking |
DE102013107041A1 (en) * | 2013-04-18 | 2014-10-23 | Carl Gustav Carus Management Gmbh | Ultrasound system and method for communication between an ultrasound device and bidirectional data goggles |
US10195058B2 (en) | 2013-05-13 | 2019-02-05 | The Johns Hopkins University | Hybrid augmented reality multimodal operation neural integration environment |
CN105120812A (en) * | 2013-05-16 | 2015-12-02 | 视乐有限公司 | Touchless user interface for ophthalmic devices |
US20150290031A1 (en) * | 2013-05-16 | 2015-10-15 | Wavelight Gmbh | Touchless user interface for ophthalmic devices |
DE102013210354A1 (en) * | 2013-06-04 | 2014-12-04 | Bayerische Motoren Werke Aktiengesellschaft | Eye-controlled interaction for data glasses |
US20140354516A1 (en) * | 2013-06-04 | 2014-12-04 | Bayerische Motoren Werke Aktiengesellschaft | Vision-Controlled Interaction for Data Spectacles |
US9283893B2 (en) * | 2013-06-04 | 2016-03-15 | Bayerische Motoren Werke Aktiengesellschaft | Vision-controlled interaction for data spectacles |
US20180364965A1 (en) * | 2013-07-16 | 2018-12-20 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
US20160154620A1 (en) * | 2013-07-16 | 2016-06-02 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
US10664216B2 (en) | 2013-07-16 | 2020-05-26 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
TWI617279B (en) * | 2013-07-16 | 2018-03-11 | 精工愛普生股份有限公司 | Information processing apparatus, information processing method, and information processing system |
RU2645004C2 (en) * | 2013-07-16 | 2018-02-15 | Сейко Эпсон Корпорейшн | Information processing device, information processing method and information processing system |
CN104298344A (en) * | 2013-07-16 | 2015-01-21 | 精工爱普生株式会社 | Information processing apparatus, information processing method, and information processing system |
US10042598B2 (en) * | 2013-07-16 | 2018-08-07 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
US9898662B2 (en) * | 2013-07-16 | 2018-02-20 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
CN104298499A (en) * | 2013-07-16 | 2015-01-21 | 精工爱普生株式会社 | Information processing apparatus, information processing method, and information processing system |
JP2015019678A (en) * | 2013-07-16 | 2015-02-02 | セイコーエプソン株式会社 | Information processing apparatus, information processing method, and information processing system |
WO2015008469A3 (en) * | 2013-07-16 | 2015-05-14 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
CN109637622A (en) * | 2013-07-16 | 2019-04-16 | 精工爱普生株式会社 | Information processing unit, information processing method and information processing system |
US20160148052A1 (en) * | 2013-07-16 | 2016-05-26 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
US11341726B2 (en) | 2013-07-16 | 2022-05-24 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
US20170163866A1 (en) * | 2013-07-24 | 2017-06-08 | Google Inc. | Input System |
DE102013013698B4 (en) | 2013-08-16 | 2024-10-02 | Audi Ag | Method for operating electronic data glasses |
DE102013013698A1 (en) * | 2013-08-16 | 2015-02-19 | Audi Ag | Method for operating electronic data glasses and electronic data glasses |
ITMI20131527A1 (en) * | 2013-09-17 | 2015-03-18 | Menci Software S R L | Surgical display device |
WO2015054322A1 (en) * | 2013-10-07 | 2015-04-16 | Avegant Corporation | Multi-mode wearable apparatus for accessing media content |
US9772495B2 (en) * | 2013-11-04 | 2017-09-26 | Weng-Kong TAM | Digital loupe device |
US20150123880A1 (en) * | 2013-11-04 | 2015-05-07 | Weng-Kong TAM | Digital loupe device |
US10254920B2 (en) * | 2013-12-01 | 2019-04-09 | Upskill, Inc. | Systems and methods for accessing a nested menu |
US10558325B2 (en) * | 2013-12-01 | 2020-02-11 | Upskill, Inc. | Systems and methods for controlling operation of an on-board component |
US20150153913A1 (en) * | 2013-12-01 | 2015-06-04 | Apx Labs, Llc | Systems and methods for interacting with a virtual menu |
US20150153826A1 (en) * | 2013-12-01 | 2015-06-04 | Apx Labs, Llc | Systems and methods for providing a virtual menu |
US20150153912A1 (en) * | 2013-12-01 | 2015-06-04 | Apx Labs, Llc | Systems and methods for accessing a nested menu |
US10466858B2 (en) * | 2013-12-01 | 2019-11-05 | Upskill, Inc. | Systems and methods for interacting with a virtual menu |
US9645640B2 (en) | 2013-12-21 | 2017-05-09 | Audi Ag | Device and method for navigating within a menu for controlling a vehicle, and selecting a menu entry from the menu |
US10303242B2 (en) | 2014-01-06 | 2019-05-28 | Avegant Corp. | Media chair apparatus, system, and method |
US10409079B2 (en) | 2014-01-06 | 2019-09-10 | Avegant Corp. | Apparatus, system, and method for displaying an image using a plate |
US11219428B2 (en) * | 2014-01-29 | 2022-01-11 | Becton, Dickinson And Company | Wearable electronic device for enhancing visualization during insertion of an invasive device |
US10432922B2 (en) | 2014-03-19 | 2019-10-01 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking for stereo viewer |
US11438572B2 (en) | 2014-03-19 | 2022-09-06 | Intuitive Surgical Operations, Inc. | Medical devices, systems and methods using eye gaze tracking for stereo viewer |
US11147640B2 (en) * | 2014-03-19 | 2021-10-19 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking |
US10965933B2 (en) | 2014-03-19 | 2021-03-30 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking for stereo viewer |
US10278782B2 (en) * | 2014-03-19 | 2019-05-07 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking |
US11792386B2 (en) | 2014-03-19 | 2023-10-17 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking for stereo viewer |
US11471209B2 (en) | 2014-03-31 | 2022-10-18 | Cilag Gmbh International | Controlling impedance rise in electrosurgical medical devices |
CN104055478A (en) * | 2014-07-08 | 2014-09-24 | 金纯 | Medical endoscope control system based on sight tracking control |
US9576329B2 (en) * | 2014-07-31 | 2017-02-21 | Ciena Corporation | Systems and methods for equipment installation, configuration, maintenance, and personnel training |
CN104166239A (en) * | 2014-08-25 | 2014-11-26 | 成都贝思达光电科技有限公司 | Head-worn video glasses viewfinder device for a high-definition camera |
WO2016064800A1 (en) * | 2014-10-20 | 2016-04-28 | Mayo Foundation For Medical Education And Research | Imaging data capture and video streaming system |
US20160220105A1 (en) * | 2015-02-03 | 2016-08-04 | Francois Duret | Device for viewing an interior of a mouth |
US9877642B2 (en) * | 2015-02-03 | 2018-01-30 | Francois Duret | Device for viewing an interior of a mouth |
US20210157403A1 (en) * | 2015-02-20 | 2021-05-27 | Covidien Lp | Operating room and surgical site awareness |
JP2020049296A (en) * | 2015-02-20 | 2020-04-02 | Covidien LP | Operating room and surgical site awareness |
US10908681B2 (en) | 2015-02-20 | 2021-02-02 | Covidien Lp | Operating room and surgical site awareness |
JP2018511359A (en) * | 2015-02-20 | 2018-04-26 | Covidien LP | Operating room and surgical site recognition |
WO2016133644A1 (en) * | 2015-02-20 | 2016-08-25 | Covidien Lp | Operating room and surgical site awareness |
EP3258876A4 (en) * | 2015-02-20 | 2018-12-26 | Covidien LP | Operating room and surgical site awareness |
CN107249497A (en) * | 2015-02-20 | 2017-10-13 | 柯惠Lp公司 | Operating room and operative site are perceived |
JP2021100690A (en) * | 2015-02-20 | 2021-07-08 | Covidien LP | Operating room and surgical site awareness |
US9823474B2 (en) | 2015-04-02 | 2017-11-21 | Avegant Corp. | System, apparatus, and method for displaying an image with a wider field of view |
US9995857B2 (en) | 2015-04-03 | 2018-06-12 | Avegant Corp. | System, apparatus, and method for displaying an image using focal modulation |
US10169862B2 (en) | 2015-05-07 | 2019-01-01 | Novadaq Technologies ULC | Methods and systems for laser speckle imaging of tissue using a color image sensor |
US20160349539A1 (en) * | 2015-05-26 | 2016-12-01 | Lumenis Ltd. | Laser safety glasses with an improved imaging system |
US10197816B2 (en) * | 2015-05-26 | 2019-02-05 | Lumenis Ltd. | Laser safety glasses with an improved imaging system |
CN108153424A (en) * | 2015-06-03 | 2018-06-12 | 塔普翊海(上海)智能科技有限公司 | Eye-movement and head-movement interaction method for a head-mounted display device |
CN107850778A (en) * | 2015-06-05 | 2018-03-27 | 马克·莱姆陈 | Apparatus and method for image capture of medical or dental images using head mounted camera and computer system |
AU2016270422B2 (en) * | 2015-06-05 | 2020-12-03 | Marc Lemchen | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
EP3304173A4 (en) * | 2015-06-05 | 2018-05-30 | Marc Lemchen | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
US10409443B2 (en) | 2015-06-24 | 2019-09-10 | Microsoft Technology Licensing, Llc | Contextual cursor display based on hand tracking |
US11903634B2 (en) | 2015-06-30 | 2024-02-20 | Cilag Gmbh International | Surgical instrument with user adaptable techniques |
US10055870B2 (en) | 2015-09-03 | 2018-08-21 | Siemens Healthcare Gmbh | Method and system for displaying an augmented reality to an operator of a medical imaging apparatus |
DE102015216917A1 (en) * | 2015-09-03 | 2017-03-09 | Siemens Healthcare Gmbh | System for presenting an augmented reality about an operator |
CN106491206A (en) * | 2015-09-03 | 2017-03-15 | 西门子保健有限责任公司 | For the system for showing extension reality with regard to operator |
US11559347B2 (en) | 2015-09-30 | 2023-01-24 | Cilag Gmbh International | Techniques for circuit topologies for combined generator |
US11766287B2 (en) | 2015-09-30 | 2023-09-26 | Cilag Gmbh International | Methods for operating generator for digitally generating electrical signal waveforms and surgical instruments |
US11666375B2 (en) | 2015-10-16 | 2023-06-06 | Cilag Gmbh International | Electrode wiping surgical device |
US20170186157A1 (en) * | 2015-12-23 | 2017-06-29 | Siemens Healthcare Gmbh | Method and system for outputting augmented reality information |
CN110148453A (en) * | 2015-12-23 | 2019-08-20 | 西门子医疗有限公司 | For exporting the method and system of augmented reality information |
US10366489B2 (en) * | 2015-12-23 | 2019-07-30 | Siemens Healthcare Gmbh | Method and system for outputting augmented reality information |
US10846851B2 (en) | 2015-12-23 | 2020-11-24 | Siemens Healthcare Gmbh | Method and system for outputting augmented reality information |
US11694328B2 (en) | 2015-12-23 | 2023-07-04 | Siemens Healthcare Gmbh | Method and system for outputting augmented reality information |
CN106909771A (en) * | 2015-12-23 | 2017-06-30 | 西门子医疗有限公司 | Method and system for exporting augmented reality information |
CN107427216A (en) * | 2015-12-25 | 2017-12-01 | 韦斯特尤尼蒂斯株式会社 | Medical system |
US12193698B2 (en) | 2016-01-15 | 2025-01-14 | Cilag Gmbh International | Method for self-diagnosing operation of a control switch in a surgical instrument system |
US11684402B2 (en) | 2016-01-15 | 2023-06-27 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on tissue characterization |
US11896280B2 (en) | 2016-01-15 | 2024-02-13 | Cilag Gmbh International | Clamp arm comprising a circuit |
US11751929B2 (en) | 2016-01-15 | 2023-09-12 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on tissue characterization |
US11974772B2 (en) | 2016-01-15 | 2024-05-07 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with variable motor control limits |
US12201339B2 (en) | 2016-01-15 | 2025-01-21 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on tissue characterization |
US12239360B2 (en) | 2016-01-15 | 2025-03-04 | Cilag Gmbh International | Modular battery powered handheld surgical instrument with selective application of energy based on button displacement, intensity, or local tissue characterization |
US11864820B2 (en) | 2016-05-03 | 2024-01-09 | Cilag Gmbh International | Medical device with a bilateral jaw configuration for nerve stimulation |
US11612446B2 (en) * | 2016-06-03 | 2023-03-28 | Covidien Lp | Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator |
WO2017210497A1 (en) * | 2016-06-03 | 2017-12-07 | Covidien Lp | Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator |
US12114914B2 (en) | 2016-08-05 | 2024-10-15 | Cilag Gmbh International | Methods and systems for advanced harmonic energy |
US12178403B2 (en) | 2016-11-24 | 2024-12-31 | University Of Washington | Light field capture and rendering for head-mounted displays |
US11998230B2 (en) | 2016-11-29 | 2024-06-04 | Cilag Gmbh International | End effector control and calibration |
JP2020533681A (en) * | 2017-09-08 | 2020-11-19 | サージカル シアター インコーポレイテッド | Dual mode augmented reality surgery system |
US11199900B2 (en) * | 2017-12-04 | 2021-12-14 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
US10726765B2 (en) | 2018-02-15 | 2020-07-28 | Valve Corporation | Using tracking of display device to control image display |
US10849484B2 (en) * | 2018-03-23 | 2020-12-01 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
US20190290101A1 (en) * | 2018-03-23 | 2019-09-26 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
US12126916B2 (en) | 2018-09-27 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
JP7367041B2 (en) | 2018-10-25 | 2023-10-23 | Beyeonics Surgical Ltd. | UI for head-mounted display systems |
JP2022509460A (en) * | 2018-10-25 | 2022-01-20 | Beyeonics Surgical Ltd. | UI for head-mounted display system |
US11989930B2 (en) | 2018-10-25 | 2024-05-21 | Beyeonics Surgical Ltd. | UI for head mounted display system |
US11112865B1 (en) * | 2019-02-13 | 2021-09-07 | Facebook Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
US20210397255A1 (en) * | 2019-02-13 | 2021-12-23 | Facebook Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
US11662812B2 (en) * | 2019-02-13 | 2023-05-30 | Meta Platforms Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
JP7523466B2 (en) | 2019-03-29 | 2024-07-26 | Razmik Ghazaryan | Method and apparatus for variable resolution screens |
EP3726834A1 (en) * | 2019-04-17 | 2020-10-21 | Medneo GmbH | Telepresence system and method |
EP3764367A1 (en) * | 2019-07-11 | 2021-01-13 | Milestone S.r.l. | System and method for medical gross examination |
TWI731430B (en) * | 2019-10-04 | 2021-06-21 | 財團法人工業技術研究院 | Information display method and information display system |
US11467400B2 (en) | 2019-10-04 | 2022-10-11 | Industrial Technology Research Institute | Information display method and information display system |
KR102304962B1 (en) * | 2019-10-24 | 2021-09-27 | (주)미래컴퍼니 | Surgical system using surgical robot |
KR20210048954A (en) * | 2019-10-24 | 2021-05-04 | (주)미래컴퍼니 | Surgical system using surgical robot |
US11684412B2 (en) | 2019-12-30 | 2023-06-27 | Cilag Gmbh International | Surgical instrument with rotatable and articulatable surgical end effector |
US11759251B2 (en) | 2019-12-30 | 2023-09-19 | Cilag Gmbh International | Control program adaptation based on device status and user input |
US11696776B2 (en) | 2019-12-30 | 2023-07-11 | Cilag Gmbh International | Articulatable surgical instrument |
US11452525B2 (en) | 2019-12-30 | 2022-09-27 | Cilag Gmbh International | Surgical instrument comprising an adjustment system |
US11786291B2 (en) | 2019-12-30 | 2023-10-17 | Cilag Gmbh International | Deflectable support of RF energy electrode with respect to opposing ultrasonic blade |
US11786294B2 (en) | 2019-12-30 | 2023-10-17 | Cilag Gmbh International | Control program for modular combination energy device |
US11707318B2 (en) | 2019-12-30 | 2023-07-25 | Cilag Gmbh International | Surgical instrument with jaw alignment features |
US12076006B2 (en) | 2019-12-30 | 2024-09-03 | Cilag Gmbh International | Surgical instrument comprising an orientation detection system |
US11779329B2 (en) | 2019-12-30 | 2023-10-10 | Cilag Gmbh International | Surgical instrument comprising a flex circuit including a sensor system |
US11911063B2 (en) | 2019-12-30 | 2024-02-27 | Cilag Gmbh International | Techniques for detecting ultrasonic blade to electrode contact and reducing power to ultrasonic blade |
US11937863B2 (en) | 2019-12-30 | 2024-03-26 | Cilag Gmbh International | Deflectable electrode with variable compression bias along the length of the deflectable electrode |
US11937866B2 (en) | 2019-12-30 | 2024-03-26 | Cilag Gmbh International | Method for an electrosurgical procedure |
US11944366B2 (en) | 2019-12-30 | 2024-04-02 | Cilag Gmbh International | Asymmetric segmented ultrasonic support pad for cooperative engagement with a movable RF electrode |
US11950797B2 (en) | 2019-12-30 | 2024-04-09 | Cilag Gmbh International | Deflectable electrode with higher distal bias relative to proximal bias |
US11779387B2 (en) | 2019-12-30 | 2023-10-10 | Cilag Gmbh International | Clamp arm jaw to minimize tissue sticking and improve tissue control |
US11812957B2 (en) | 2019-12-30 | 2023-11-14 | Cilag Gmbh International | Surgical instrument comprising a signal interference resolution system |
US11974801B2 (en) | 2019-12-30 | 2024-05-07 | Cilag Gmbh International | Electrosurgical instrument with flexible wiring assemblies |
US11660089B2 (en) | 2019-12-30 | 2023-05-30 | Cilag Gmbh International | Surgical instrument comprising a sensing system |
US11589916B2 (en) | 2019-12-30 | 2023-02-28 | Cilag Gmbh International | Electrosurgical instruments with electrodes having variable energy densities |
US11723716B2 (en) | 2019-12-30 | 2023-08-15 | Cilag Gmbh International | Electrosurgical instrument with variable control mechanisms |
US11986201B2 (en) | 2019-12-30 | 2024-05-21 | Cilag Gmbh International | Method for operating a surgical instrument |
US11986234B2 (en) | 2019-12-30 | 2024-05-21 | Cilag Gmbh International | Surgical system communication pathways |
US12114912B2 (en) | 2019-12-30 | 2024-10-15 | Cilag Gmbh International | Non-biased deflectable electrode to minimize contact between ultrasonic blade and electrode |
US11744636B2 (en) | 2019-12-30 | 2023-09-05 | Cilag Gmbh International | Electrosurgical systems with integrated and external power sources |
US12064109B2 (en) | 2019-12-30 | 2024-08-20 | Cilag Gmbh International | Surgical instrument comprising a feedback control circuit |
US12023086B2 (en) | 2019-12-30 | 2024-07-02 | Cilag Gmbh International | Electrosurgical instrument for delivering blended energy modalities to tissue |
US12082808B2 (en) | 2019-12-30 | 2024-09-10 | Cilag Gmbh International | Surgical instrument comprising a control system responsive to software configurations |
US12053224B2 (en) | 2019-12-30 | 2024-08-06 | Cilag Gmbh International | Variation in electrode parameters and deflectable electrode to modify energy density and tissue interaction |
US12051214B2 (en) | 2020-05-12 | 2024-07-30 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US12262937B2 (en) | 2020-05-29 | 2025-04-01 | Cilag Gmbh International | User interface for surgical instrument with combination energy modality end-effector |
CN111904768A (en) * | 2020-08-27 | 2020-11-10 | 上海联影医疗科技有限公司 | Medical equipment scanning intra-aperture image display method and medical equipment |
US11830602B2 (en) | 2020-10-02 | 2023-11-28 | Cilag Gmbh International | Surgical hub having variable interconnectivity capabilities |
US20220104910A1 (en) * | 2020-10-02 | 2022-04-07 | Ethicon Llc | Monitoring of user visual gaze to control which display system displays the primary information |
US12016566B2 (en) | 2020-10-02 | 2024-06-25 | Cilag Gmbh International | Surgical instrument with adaptive function controls |
US11992372B2 (en) | 2020-10-02 | 2024-05-28 | Cilag Gmbh International | Cooperative surgical displays |
US12064293B2 (en) | 2020-10-02 | 2024-08-20 | Cilag Gmbh International | Field programmable surgical visualization system |
US11672534B2 (en) | 2020-10-02 | 2023-06-13 | Cilag Gmbh International | Communication capability of a smart stapler |
US11963683B2 (en) | 2020-10-02 | 2024-04-23 | Cilag Gmbh International | Method for operating tiered operation modes in a surgical system |
US11883022B2 (en) | 2020-10-02 | 2024-01-30 | Cilag Gmbh International | Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information |
US11877897B2 (en) | 2020-10-02 | 2024-01-23 | Cilag Gmbh International | Situational awareness of instruments location and individualization of users to control displays |
WO2022070076A1 (en) * | 2020-10-02 | 2022-04-07 | Cilag Gmbh International | Reconfiguration of display sharing |
US11748924B2 (en) | 2020-10-02 | 2023-09-05 | Cilag Gmbh International | Tiered system display control based on capacity and user operation |
US12213801B2 (en) | 2020-10-02 | 2025-02-04 | Cilag Gmbh International | Surgical visualization and particle trend analysis system |
US11986739B2 (en) * | 2021-07-09 | 2024-05-21 | Gel Blaster, Inc. | Smart target co-witnessing hit attribution system and method |
US20230356095A1 (en) * | 2021-07-09 | 2023-11-09 | Gel Blaster, Inc. | Smart target co-witnessing hit attribution system and method |
US12261988B2 (en) | 2021-11-08 | 2025-03-25 | Proprio, Inc. | Methods for generating stereoscopic views in multicamera systems, and associated devices and systems |
WO2025041183A1 (en) * | 2023-08-22 | 2025-02-27 | Dal Pont Medical S.r.l.s. | Optical device with an augmented reality management system |
Also Published As
Publication number | Publication date |
---|---|
US6847336B1 (en) | 2005-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6847336B1 (en) | Selectively controllable heads-up display system | |
US11819273B2 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery | |
US20230255446A1 (en) | Surgical visualization systems and displays | |
US11989930B2 (en) | UI for head mounted display system | |
US10895742B2 (en) | Microsurgery system for displaying in real time magnified digital image sequences of an operated area | |
US9766441B2 (en) | Surgical stereo vision systems and methods for microsurgery | |
Rolland et al. | Optical versus video see-through head-mounted displays in medical visualization | |
US12062430B2 (en) | Surgery visualization theatre | |
WO2021062375A1 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery | |
US20170143442A1 (en) | Surgical visualization systems and displays | |
US12142367B2 (en) | Surgery visualization theatre | |
EP3907585B1 (en) | Systems and methods of controlling an operating room display using an augmented reality headset | |
JP5571390B2 (en) | Imaging apparatus and imaging method | |
US11094283B2 (en) | Head-wearable presentation apparatus, method for operating the same, and medical-optical observation system | |
Rolland et al. | Optical versus video see-through head-mounted displays | |
WO2021226134A1 (en) | Surgery visualization theatre | |
CA3117533A1 (en) | Ui for head mounted display system | |
JP7367041B2 (en) | UI for head-mounted display systems | |
EP4034028A1 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery | |
US20230179755A1 (en) | Stereoscopic imaging apparatus with multiple fixed magnification levels | |
EP4146115A1 (en) | Surgery visualization theatre | |
JP2024541906A (en) | Procedure guidance and training device, method, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A, NEW YORK
Free format text: GRANT OF SECURITY INTEREST IN UNITED STATES PATENTS;ASSIGNORS:EBUREAU, LLC;IOVATION, INC.;SIGNAL DIGITAL, INC.;AND OTHERS;REEL/FRAME:058294/0161
Effective date: 20211201
Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, NEW YORK
Free format text: GRANT OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNORS:TRU OPTIK DATA CORP.;NEUSTAR INFORMATION SERVICES, INC.;NEUSTAR DATA SERVICES, INC.;AND OTHERS;REEL/FRAME:058294/0010
Effective date: 20211201
|
AS | Assignment |
Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, NEW YORK
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NO. 16/990,698 PREVIOUSLY RECORDED ON REEL 058294 FRAME 0010. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:TRU OPTIK DATA CORP.;NEUSTAR INFORMATION SERVICES, INC.;NEUSTAR DATA SERVICES, INC.;AND OTHERS;REEL/FRAME:059846/0157
Effective date: 20211201