
WO2018131688A1 - Viewing device, underwater space viewing system, and underwater space viewing method - Google Patents


Info

Publication number
WO2018131688A1
WO2018131688A1 (PCT/JP2018/000699)
Authority
WO
WIPO (PCT)
Prior art keywords
viewing
video
display screen
viewing device
omnidirectional
Application number
PCT/JP2018/000699
Other languages
English (en)
Japanese (ja)
Inventor
倫明 二宮
友和 満留
Original Assignee
株式会社日本エスシーマネージメント
Application filed by 株式会社日本エスシーマネージメント
Publication of WO2018131688A1

Classifications

    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H04N 21/21: Server components or server architectures
    • H04N 21/214: Specialised server platform, e.g. server located in an airplane, hotel, hospital
    • H04N 21/2187: Live feed
    • H04N 21/41: Structure of client; Structure of client peripherals
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N 21/4728: End-user interface for interacting with content, for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region

Definitions

  • the present invention relates to a viewing device, an underwater space viewing system, and an underwater space viewing method for providing underwater video to viewers.
  • In the technique of Patent Document 1, a creature shown in an aquarium image taken by an underwater camera is identified, the name of the organism is read from a database and superimposed on the aquarium image, and the aquarium image is thereby converted into a real-time pictorial-book image.
  • the direction of shooting with the underwater camera can be arbitrarily specified through a portable terminal operated by a person (viewer) who is viewing the real-time pictorial book image.
  • Regarding Patent Document 2, an attachment provided by the patentee of Patent Document 2, which easily realizes a head-mounted display when a smartphone, phablet, or the like is mounted on it, has also appeared on the market.
  • When a smartphone equipped with a three-dimensional posture sensor or the like is used with such an attachment and the technique disclosed in Patent Document 2 is applied, a three-dimensional image can be shown while blocking the field of view outside the range that the viewer wants to see, giving the viewer an immersive experience.
  • Patent Document 1: JP 2011-60130 A; Patent Document 2: Japanese Patent No. 5961892
  • An object of the present invention is to solve these problems and to provide an underwater space viewing system and an underwater space viewing method capable of providing viewers with video of scenery as if they were exploring the underwater space of their own will.
  • The viewing device of the present invention includes goggles that a viewer can wear on the head and a portable terminal having a display screen. The goggles have a terminal accommodating portion that detachably accommodates the portable terminal at a position where the viewer can view the display screen in a state where outside light is blocked, and a lens accommodating portion that interchangeably accommodates a monocular lens plate having a monocular lens for enlarging the display screen and a binocular lens plate having a binocular lens for enlarging the display screen.
  • The portable terminal includes: detection means for detecting an attention point that is a specific part on the display screen, the three-dimensional posture of the portable terminal, and the change state of that posture; communication means for transmitting the detection results of the detection means to a predetermined management device, which extracts, from an omnidirectional video of an underwater space, a partial video that is the portion of the video containing the attention point and changes the extracted part of the partial video according to the three-dimensional posture and its change state, and for acquiring the partial video from the management device; and display control means for displaying the acquired partial video on the display screen.
  • the “omnidirectional video” refers to an image obtained by imaging 360 degrees in all directions.
  • The underwater space viewing system of the present invention includes: acquisition means for acquiring an omnidirectional video of an underwater space; a viewing device having a display screen that the viewer views in a state where outside light is blocked; detection means for detecting, from the viewing device, an attention point that is a specific part on the display screen, together with the three-dimensional posture of the viewing device and the change state of that posture; and control means for extracting, from the omnidirectional video acquired by the acquisition means, a partial video that is the portion of the video containing the attention point detected by the detection means, displaying the extracted partial video on the display screen, and changing the extracted part of the partial video according to the detected three-dimensional posture and its change state. The viewing device has a lens accommodating portion that interchangeably accommodates a monocular lens plate having a monocular lens for enlarging the display screen and a binocular lens plate having a binocular lens for enlarging the display screen, and a partition plate accommodating portion that accommodates a partition plate that divides the viewing space of the display screen into two when the binocular lens plate is accommodated in the lens accommodating portion.
  • The underwater space viewing method of the present invention is a method for viewing an omnidirectional video of an underwater space using a viewing device having a display screen that the viewer views in a state where outside light is blocked. A monocular lens plate having a monocular lens for enlarging the display screen or a binocular lens plate having a binocular lens for enlarging the display screen is selectively accommodated in a predetermined portion of the viewing device. The method includes: a detection step of detecting an attention point that is a specific part on the display screen, together with the three-dimensional posture of the viewing device and the change state of that posture; a step in which the viewing device acquires a partial video, extracted from the omnidirectional video so as to contain the attention point and changed in its extracted part according to the three-dimensional posture and its change state; and a display step of displaying the acquired partial video on the display screen.
  • Further, the underwater space viewing system of the present invention has: an underwater imaging device that captures an omnidirectional video of the underwater space; a viewing device having a display screen that the viewer views on land in a state where outside light is blocked; acquisition means for acquiring the omnidirectional video from the underwater imaging device; detection means for detecting, from the viewing device, an attention point that is a specific part of the display screen; and control means that obtains by calculation the relative position of the three-dimensional coordinates of the attention point within the three-dimensional coordinate system of the acquired omnidirectional video and, based on the calculation result, extracts the partial video from the omnidirectional video.
  • The detection means also detects the change state of the three-dimensional posture of the attention point. When the change state satisfies a predetermined first condition, the control means displaces the position of the underwater imaging device; when it satisfies a predetermined second condition, the control means performs attention point change control that continuously changes the extracted part of the partial video.
  • The control means enlarges the display magnification around the attention point when the stopped state following a change of the three-dimensional posture of the attention point continues for a predetermined time.
  • The system may further include vibration detection means for detecting the vibration of a subject shown in the partial video, and vibration transmission means for transmitting the detected vibration to the viewer of the viewing device displaying that partial video.
  • The underwater imaging device includes sound collecting means for collecting underwater sounds in synchronization with the imaging of the omnidirectional video, and the viewing device includes output means for outputting sound based on sound data. The acquisition means acquires the collected sound together with the omnidirectional video, and the control means corrects the sound data to a volume corresponding to the relative position of the partial video within the omnidirectional video and outputs the corrected sound data to the viewing device together with the partial video.
  • The system may include a plurality of the underwater imaging devices and a plurality of the viewing devices, with the position of each underwater imaging device changing independently according to the three-dimensional posture of the attention point of the viewing device linked to it.
  • The underwater imaging device of the present invention includes an omnidirectional camera that captures an omnidirectional video of the underwater space, and a moving buoy that suspends the omnidirectional camera so as to be displaceable in the depth direction while the buoy moves on the water surface. The moving buoy accommodates a communication device that transmits the omnidirectional video captured by the omnidirectional camera to the outside and receives displacement instructions from the outside, and a control device that displaces the position of the moving buoy or the omnidirectional camera according to a displacement instruction received by the communication device.
  • The viewing device of the present invention includes goggles that a viewer can wear on the head and a portable terminal having a display screen. The goggles have a terminal accommodating portion that detachably accommodates the portable terminal at a position where the viewer can view the display screen in a state where outside light is blocked. The portable terminal includes detection means for detecting an attention point that is a specific part of the display screen, the three-dimensional posture of the attention point, and the change state of that posture, and acquires from a predetermined management device a partial video that is the portion of an omnidirectional video of the underwater space containing the attention point and whose extracted part changes according to the three-dimensional posture and its change state. The goggles further have a lens accommodating portion that interchangeably accommodates a monocular lens plate having a monocular lens for enlarging the display screen and a binocular lens plate having a binocular lens for enlarging the display screen, and a partition plate accommodating portion that detachably accommodates a partition plate that divides the viewing space of the display screen into two when the binocular lens plate is accommodated.
  • Further, the underwater space viewing method of the present invention includes an imaging step of placing an underwater imaging device in the water and capturing an omnidirectional video of the underwater space, and a step of acquiring in real time the omnidirectional video captured by the underwater imaging device.
  • An overall configuration diagram of the underwater space viewing system according to this embodiment.
  • A schematic diagram showing a configuration example of an underwater imaging device.
  • A hardware configuration diagram of a management device.
  • A functional configuration diagram of a management device.
  • (a) is a top view of goggles, (b) is a front view, and (c) is a rear view.
  • (a) and (b) are schematic diagrams showing the state of the goggles in use.
  • Here, the underwater space refers to a three-dimensional region in the water, as opposed to one in the air.
  • FIG. 1 is an overall configuration diagram of the underwater space viewing system according to the first embodiment.
  • This underwater space viewing system includes a plurality of underwater imaging devices 10, a plurality of viewing devices 20 each worn by a viewer, and a management device 30.
  • the underwater imaging device 10 and the viewing device 20 are connected by wireless communication via the management device 30.
  • a configuration example of each apparatus will be described.
  • FIG. 2 is a schematic diagram illustrating a configuration example of the underwater imaging device 10.
  • The underwater imaging device 10 includes a moving buoy 110, a waterproof cable holder 120, and a camera box 130 that houses an omnidirectional camera 133.
  • The omnidirectional camera 133 is a camera that can capture 360 degrees in all directions, and is also called a celestial-sphere camera.
  • the moving buoy 110 suspends the camera box 130 (omnidirectional camera 133) movably in the depth direction while moving on the water surface.
  • the moving buoy 110 is made of a radio wave permeable hard resin that floats on water and has a cavity. In the cavity, a communication device 1101 having an antenna, a propulsion mechanism 1102, a cable winding mechanism 1103, and a control device 1104 are installed.
  • a large-capacity rechargeable lithium ion battery is disposed as a power source for the communication device 1101, the propulsion mechanism 1102, and the control device 1104 at a predetermined portion of the moving buoy 110.
  • the communication device 1101 enables bidirectional communication between the management device 30 and the control device 1104.
  • A GPS receiver is further provided, which receives the latitude/longitude/time information of the moving buoy 110 and transmits it to the control device 1104.
  • An orientation detection function may be provided.
  • The latitude/longitude/time information is used to specify the position and orientation of the moving buoy 110 at the time of imaging, that is, its three-dimensional coordinate system (the coordinate space specified by X, Y, and Z coordinates), in which the still water surface is defined by mutually orthogonal X and Y axes in the horizontal plane and the Z axis is orthogonal to that horizontal plane.
  • The control device 1104 performs displacement control of the moving buoy 110 in the horizontal direction and of the omnidirectional camera 133 in the depth direction, based on instructions from the management device 30 connected via the communication device 1101, and transmits the omnidirectional video and sound data from the omnidirectional camera 133 in the camera box 130 to the management device 30 in association with the latitude/longitude/time information of the moving buoy 110 and the distance from the moving buoy 110 to the camera box 130.
  • the horizontal displacement control of the moving buoy 110 is performed by controlling the propulsion mechanism 1102 based on the latitude / longitude received by the communication device 1101 and the displacement instruction from the management device 30.
  • the displacement control of the omnidirectional camera 133 in the depth direction is performed by controlling the cable winding mechanism 1103 based on the displacement instruction from the management device 30 received by the communication device 1101.
  • The displacement amount is converted into the winding amount of the communication cable 121. That is, based on the outer diameter of the communication cable 121 and the reel diameter of the cable winding mechanism 1103, the distance from the moving buoy 110 to the camera box 130 when the camera box 130 is actually raised or lowered is determined and recorded. Since this processing procedure can be made routine, it can be implemented with a one-chip computer or an FPGA (Field-Programmable Gate Array). The sketch below illustrates the idea.
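  • A minimal sketch of this bookkeeping: reel revolutions are converted into paid-out cable length, accounting for the growing effective reel diameter as layers of cable build up. All constants and names are illustrative assumptions, not values from the patent.

```python
import math

CABLE_DIAMETER_M = 0.006  # outer diameter of communication cable 121 (assumed)
REEL_DIAMETER_M = 0.10    # bare reel diameter of cable winding mechanism 1103 (assumed)

def payout_per_turn(layer):
    """Cable length released by one reel turn; the effective diameter
    grows by one cable diameter per wound layer."""
    effective_diameter = REEL_DIAMETER_M + (2 * layer - 1) * CABLE_DIAMETER_M
    return math.pi * effective_diameter

def camera_depth(turns_unwound, turns_per_layer=50):
    """Approximate distance from moving buoy 110 to camera box 130
    after `turns_unwound` reel revolutions of rewinding."""
    depth, remaining, layer = 0.0, turns_unwound, 1
    while remaining > 0:
        turns = min(remaining, turns_per_layer)
        depth += turns * payout_per_turn(layer)
        remaining -= turns
        layer += 1
    return depth
```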
  • the propulsion mechanism 1102 displaces the moving buoy 110 in any two-dimensional direction on the water surface, and includes a screw, a motor, and a steering unit.
  • The displacement direction and displacement amount of the moving buoy 110 are determined by the control device 1104 controlling the motor that rotates the screw and the steering unit.
  • the cable winding mechanism 1103 is a mechanism that enables electrical connection between the control device 1104 and the proximal end core wire of the communication cable 121 that is a coaxial line, and changes the winding amount of the communication cable 121.
  • The camera box 130 includes a waterproof housing part 131 at least partially made of transparent glass, a joint member 132 that is watertightly fixed to the housing part 131, and the omnidirectional camera 133 housed in a predetermined part of the housing part 131.
  • The housing part 131 is made of a material heavy enough that it sinks readily toward the seabed.
  • the omnidirectional camera 133 is a camera that captures an omnidirectional image of the underwater space, and includes an optical system component, a recording system component, various sensors, a storage, and a control unit that controls the operation of these components.
  • the optical system parts are installed near the transparent glass of the housing part 131. The detailed configuration of the control unit will be described later.
  • The waterproof cable holder 120 includes a first connector part 121 whose one end is watertightly fixed to the lower surface of the moving buoy 110, a second connector part 122 whose other end is watertightly fixed to the joint member 132 of the camera box 130, and a corrosion-resistant bellows portion 123 that connects the first connector part 121 and the second connector part 122.
  • the waterproof cable holder 120 has a hollow shaft portion in the length direction, and the communication cable 121 is movably accommodated in the shaft portion.
  • the communication cable 121 is a coaxial line, and its proximal end core wire is electrically connected to the control device 1104 installed in the moving buoy 110.
  • the leading end core wire is electrically connected to the control unit of the omnidirectional camera 133.
  • The communication cable 121 has its distal end fixed to the joint member 132, while the rest of the cable passes through the first connector part 121 and is wound onto, or rewound from, the cable winding mechanism 1103 of the moving buoy 110.
  • When the cable is wound, the bellows portion 123 shortens and the camera box 130 is pulled up toward the water surface; when the cable is rewound, the bellows portion 123 lengthens and the camera box 130 is lowered into the water.
  • the waterproof cable holder 120 is exposed to seawater. Therefore, measures to prevent rust or seawater from entering are indispensable.
  • Seawater is a nearly neutral aqueous solution containing dissolved oxygen and a large amount of salts such as chlorides. Therefore, it is desirable that the first connector part 121 and the second connector part 122 be made of a nickel material or a hard die-cast material whose surface is coated with fluororesin, or whose surface is at least rust-proofed (insulated). Any shape suited to this purpose may be used.
  • FIG. 3 is a diagram showing a configuration example of the control unit of the omnidirectional camera 133.
  • the control unit includes a computer having a CPU (Central Processing Unit) 301, a RAM (Random Access Memory) 302, and a ROM (Read Only Memory) 303 connected as main components via a bus B1.
  • To the bus B1 are further connected an external storage device 304, an input I/F (I/F is an abbreviation for interface; the same applies hereinafter) 305, an image processing block 306, a moving image compression block 307, a sound processing block 308, and an output I/F 309.
  • the omnidirectional camera 133 is provided with a rechargeable battery.
  • The CPU 301 controls the operation of each block and the overall operation.
  • the RAM 302 is used as a work area for the CPU 301.
  • the ROM 303 stores various parameters described by codes that can be read by the CPU 301.
  • the external storage device 304 stores image data, sound data, and the like, but also stores a camera control program that can be read and executed by the CPU 301.
  • Image signals output from two imaging elements 311 and 312 each having an imaging range of 180 degrees in the surface direction are input to the input I / F 305 via the mount 313. Detection signals from the attitude sensor 315 and the sound sensor 317 are also input to the input I / F 305.
  • The attitude sensor 315 is composed of a three-axis acceleration sensor, a three-axis gyro sensor, a geomagnetic sensor, or a combination of these, and is used to detect attitude changes of the omnidirectional camera 133, that is, to specify the position of the moving buoy 110 and the three-dimensional coordinates in the X, Y, and Z directions at the time the omnidirectional camera 133 captures images.
  • the information obtained from the attitude sensor 315 is used when the management device 30 extracts a partial video including a portion (a point of interest described later) that is viewed by the viewer from the omnidirectional video.
  • The sound sensor 317 collects sound whose volume is stronger the closer the sound source is, and weaker the farther away it is.
  • the sound source is associated with the three-dimensional coordinates of the omnidirectional video.
  • the image processing block 306 includes an ISP (Image Signal Processor) or the like, and performs image processing such as shading correction, Bayer interpolation, white balance correction, and gamma correction on the image signal.
  • The image processing block 306 further combines the plurality of image signals obtained through the above processing, thereby generating an omnidirectional video, as sketched below.
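  • The stitching step can be pictured as follows: for every pixel of the output equirectangular frame, compute the viewing ray and sample whichever of the two 180-degree fisheye frames covers that hemisphere. This is only a minimal sketch under an assumed equidistant lens model; real stitching also needs per-unit calibration and seam blending, and the function names are illustrative.

```python
import numpy as np

def dual_fisheye_to_equirectangular(front, back, out_w=1920, out_h=960):
    """front/back: HxWx3 fisheye frames from imaging elements 311 and 312,
    each covering one 180-degree hemisphere."""
    h, w = front.shape[:2]
    lon, lat = np.meshgrid(np.linspace(-np.pi, np.pi, out_w),
                           np.linspace(np.pi / 2, -np.pi / 2, out_h))
    # Unit viewing ray for every output pixel.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    use_front = z >= 0
    xl = np.where(use_front, x, -x)   # lens-local x (the back lens faces -z)
    zl = np.where(use_front, z, -z)
    theta = np.arccos(np.clip(zl, -1.0, 1.0))  # angle from each lens axis
    r = theta / (np.pi / 2)                    # 0 at the axis, 1 at the 180-degree rim
    norm = np.sqrt(xl ** 2 + y ** 2) + 1e-9
    u = (xl / norm * r * (w / 2) + w / 2).astype(int).clip(0, w - 1)
    v = (y / norm * r * (h / 2) + h / 2).astype(int).clip(0, h - 1)
    return np.where(use_front[..., None], front[v, u], back[v, u])
```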
  • The moving image compression block 307 is a codec block that compresses and decompresses moving images in formats such as MPEG-4 AVC/H.264. It also stores the generated omnidirectional video and reproduces and outputs the stored omnidirectional video.
  • the sound processing block 308 converts the data collected by the sound sensor into digital sound data that can be recognized by the management apparatus 30, and transmits the digital sound data to the CPU 301 and records it in the RAM 302 and the external storage device 304.
  • The output I/F 309 outputs the omnidirectional video and sound data, both those already recorded in the external storage device 304 and those currently being recorded, to the viewing device 20 via the communication cable 121, the control device 1104, and the management device 30. By using the omnidirectional video and sound data as they are being recorded, a so-called live view and live sound can be provided to a viewing device 20 connected with real-time communication.
  • the management device 30 is a device that monitors and controls the operations of the underwater imaging device 10 and the viewing device 20, and is realized by a computer having a wireless communication function and a management program.
  • FIG. 4 is a hardware configuration diagram of the management device 30.
  • the management device 30 includes a CPU 401, a RAM 402, and a ROM 403 connected via a bus B2. Further, a large-capacity storage 404, an image processing block 405, a moving image compression block 406, a sound processing block 407, and a communication I / F 408 are connected to the bus B2. Although not shown, the management device 30 is provided with a power supply circuit or a battery.
  • the CPU 401 controls the operation of each block by reading and executing the management program.
  • the RAM 402 is used as a work area for the CPU 401.
  • the ROM 403 stores various parameters described by codes that can be decoded by the CPU 401.
  • The large-capacity storage 404 has a program block for recording the management program, a data block for recording video and sound, and a recording medium on which the databases (DBs) described later are constructed.
  • The basic operation of the image processing block 405 is the same as that of the image processing block 306 described above, but in this embodiment it extracts a partial video from the omnidirectional video obtained by the processing of the image processing block 306, gradually changes the partial video, and changes the display magnification at the attention point of each extracted partial video. Furthermore, it analyzes the extracted partial video to detect vibration information such as the presence or absence of vibration, the magnitude of vibration, and the vibration frequency, and can convert this information into sound.
  • the moving image compression block 406 is a codec block that performs moving image compression and expansion similarly to the moving image compression block 307. In this embodiment, moving image compression and decompression are performed on the partial video.
  • the omnidirectional video obtained by the processing of the image processing block 306 and / or the extracted partial video, display magnification, and vibration information are stored in the data block of the RAM 402 and the large-capacity storage 404, and are appropriately reproduced and output.
  • The sound processing block 407 transmits sound data (voice) input from the viewing device 20 and sound data (sound collected underwater) input from the underwater imaging device 10 to the CPU 401, and records them in the data blocks of the RAM 402 and the large-capacity storage 404.
  • When the image processing block 405 analyzes the vibration information of a subject and converts it into sound, data representing the converted sound is also recorded in the data block.
  • the communication I / F 408 is an interface for performing wireless communication between the underwater imaging device 10 and the viewing device 20.
  • FIG. 5 is a functional configuration diagram of the management apparatus 30 formed by the CPU 401 executing the management program.
  • the management device 30 includes a main control unit 501 that controls the overall operation, and a communication control unit 502 that controls communication between the underwater imaging device 10 and the viewing device 20.
  • a viewing device management DB 503 and an imaging device management DB 504 constructed in the large-capacity storage 404 are provided.
  • the management device 30 also has acquisition means for acquiring various data and transmitting it to the main control unit 501. That is, the camera ID acquisition unit 511 acquires identification information (ID) when the omnidirectional camera 133 is powered on. At this time, the position information of the omnidirectional camera 133 (including the position information of the camera box 130 and the moving buoy 110) is also acquired.
  • the image data acquisition unit 512 acquires an omnidirectional image captured by the omnidirectional camera 133 after the power is turned on.
  • the sound data acquisition unit 513 acquires sound data collected by the omnidirectional camera 133 after the power is turned on.
  • the device ID acquisition unit 514 acquires the ID when the viewing device 20 is powered on.
  • the device orientation acquisition unit 515 acquires data representing the orientation and change of the viewing device 20 after the power is turned on.
  • the audio data acquisition unit 516 acquires the audio of the viewer from the viewing device 20 after the power is turned on.
  • the operation content acquisition unit 517 acquires the operation content of the viewer with respect to the viewing device 20.
  • The operation content includes operations on switches or buttons corresponding to the operation content table loaded in the RAM 402, and the posture or posture-change pattern of the attention point of the viewing device 20 identified from the pattern dictionary loaded in the RAM 402 (for example, a posture change at a predetermined speed in the up, down, left, or right direction, a stop after a posture change, the start of a posture change from a stop, and so on).
  • The acquired omnidirectional video and sound data are transmitted to the viewing device 20 through the communication control unit 502, either directly or after necessary processing.
  • The data representing the operation content acquired by the operation content acquisition unit 517 may be recorded only in the RAM 402.
  • In the viewing device management DB 503, attribute information of each viewing device 20 is stored. That is, for each viewing device 20, its ID, power on/off information, and the image processing performance the viewing device 20 can execute (such as whether it can output 3D video) are stored as attribute information. Furthermore, the IDs of other related viewing devices (for example, viewing devices 20 worn by members of the same group) are stored in association with the attribute information.
  • In the imaging device management DB 504, attribute information of each underwater imaging device 10 is stored. That is, for each underwater imaging device 10, the identification information (ID) of its omnidirectional camera 133, the power on/off information of the omnidirectional camera 133, and the current position, time, and attitude information of the omnidirectional camera 133 when powered on are stored as attribute information.
  • the current position of the omnidirectional camera 133 can be calculated based on the latitude / longitude of the moving buoy 110 and the distance of the camera box 130 from the moving buoy 110.
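  • A minimal sketch of that position calculation, assuming a local tangent-plane approximation around a reference point on the still water surface (the conversion and names are illustrative, not from the patent):

```python
import math

EARTH_RADIUS_M = 6371000.0

def camera_position(lat_deg, lon_deg, cable_out_m, ref_lat_deg, ref_lon_deg):
    """Return (x, y, z) of camera box 130 in a local frame; z is depth
    (positive downward), since the camera hangs straight below the buoy."""
    # Equirectangular approximation, adequate over the buoy's small operating area.
    x = math.radians(lon_deg - ref_lon_deg) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat_deg))
    y = math.radians(lat_deg - ref_lat_deg) * EARTH_RADIUS_M
    z = cable_out_m  # distance of camera box 130 below moving buoy 110
    return (x, y, z)
```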
  • the management apparatus 30 also includes functional blocks of an authentication processing unit 521, a link setting unit 522, an instruction content recognition unit 523, a point of interest detection unit 524, a voice recognition unit 525, an image editing unit 526, and a sound data correction unit 527.
  • the authentication processing unit 521 performs authentication processing of the viewing device 20 when the viewing device 20 logs in.
  • the authentication process is a process for confirming whether the ID or the like of the viewing device 20 is registered in advance.
  • The link setting unit 522 links the underwater imaging device 10 and the viewing device 20 based on their IDs. When the underwater imaging device 10 and the viewing device 20 are to be linked one to one, that is, when one viewing device 20 views the omnidirectional video and sound data from one specific underwater imaging device 10, a pair link is set based on an instruction from the viewing device 20.
  • When one underwater imaging device 10 is shared, the link is set so that the omnidirectional video and sound data from that underwater imaging device 10 can be viewed on N viewing devices 20 (N is a natural number of 2 or more), as sketched below.
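  • The link bookkeeping can be pictured as a table from camera IDs to viewer IDs, supporting both the one-to-one pair link and the one-to-N shared link. The class and method names below are illustrative assumptions, not from the patent.

```python
from collections import defaultdict

class LinkTable:
    """Sketch of the bookkeeping in link setting unit 522."""
    def __init__(self):
        self._links = defaultdict(set)  # camera ID -> set of viewing device IDs

    def set_pair_link(self, camera_id, viewer_id):
        self._links[camera_id] = {viewer_id}   # exclusive one-to-one link

    def join_shared_link(self, camera_id, viewer_id):
        self._links[camera_id].add(viewer_id)  # one-to-N link

    def unlink(self, camera_id, viewer_id):
        self._links[camera_id].discard(viewer_id)  # e.g. when a viewer logs off

    def viewers_of(self, camera_id):
        return set(self._links[camera_id])
```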
  • The instruction content recognition unit 523 recognizes the viewer's instruction content based on the data acquired by the device posture acquisition unit 515 and the data acquired by the operation content acquisition unit 517. For example, suppose the posture of the viewing device 20 represented by the data acquired by the device posture acquisition unit 515 changes from the reference position in the vertical up/down direction or horizontal left/right direction at a predetermined speed, is maintained for a desired time, and then returns. The instruction content recognition unit 523 recognizes such a posture change pattern as a displacement instruction by comparing it with the recognition dictionary loaded in the RAM 402.
  • the instruction content recognition unit 523 recognizes that such a posture change pattern is an attention point displacement instruction.
  • The instruction content recognition unit 523 also recognizes that the viewer is gazing at the video of the attention point. If the operation acquired from the operation content acquisition unit 517 is a touch operation other than the button operations described above, the viewing mode corresponding to that touch operation is recognized.
  • the attention point detection unit 524 detects the attention point of the viewing device 20 and the change state of its three-dimensional posture.
  • the “attention point” refers to a specific part (specific point specified by three-dimensional coordinates) of the omnidirectional video that is estimated to be watched by the viewer of the omnidirectional video.
  • Normally, the approximately central portion of the display screen of the viewing device 20 is the attention point.
  • When the operation content acquired by the operation content acquisition unit 517 specifies an attention point (a pointing instruction), or when the voice recognized by the voice recognition unit 525 described later is an instruction to move the viewpoint, the designated part is detected as the attention point.
  • the three-dimensional posture of the attention point and its change state are detected based on the data representing the posture and change acquired by the device posture acquisition unit 515.
  • The change state includes the start of a change, the continuation of a change, a stop after a change, and a state in which the stop continues.
  • the voice recognition unit 525 recognizes the content of the voice data acquired from the viewing device 20.
  • According to the recognized content, the main control unit 501 executes a predetermined process. As the predetermined process, for example, the attention point in the omnidirectional video is changed in the vertical or horizontal direction.
  • the audio data from a certain viewing device 20 can be configured to be transmitted to another viewing device 20 by setting.
  • The image editing unit 526 obtains by calculation the relative position of the three-dimensional coordinates of the attention point (the coordinate space represented by X, Y, and Z coordinates) within the three-dimensional coordinate system of the acquired omnidirectional video, and based on the calculation result, cuts out from the omnidirectional video a certain range of the coordinate space centered on the attention point. It then performs the process of extracting the video of that range as the partial video.
  • The calculation for obtaining the relative position is a three-dimensional rotation transformation of the three-dimensional coordinates of the omnidirectional video by an angle corresponding to the attention point, and it reduces the processing load compared with performing image processing on the entire omnidirectional video. Therefore, a partial video that does not feel unnatural to the viewer (that is, whose playback does not stop midway) can be output to the viewing device 20.
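  • The extraction can be sketched as follows: build the grid of viewing rays for the output window, rotate the rays by the yaw and pitch of the attention point (the three-dimensional rotation transformation above), and sample the equirectangular omnidirectional frame, so only output-sized work is done per frame. The field of view and names are illustrative assumptions.

```python
import numpy as np

def extract_partial_video(equi, yaw, pitch, out_w=960, out_h=540, fov_deg=90.0):
    """equi: HxWx3 equirectangular frame; yaw/pitch in radians locate the attention point."""
    H, W = equi.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # pinhole focal length
    u, v = np.meshgrid(np.arange(out_w) - out_w / 2,
                       np.arange(out_h) - out_h / 2)
    d = np.stack([u, -v, np.full_like(u, f, dtype=float)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)     # unit rays, camera frame
    cp, sp, cy, sy = np.cos(pitch), np.sin(pitch), np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch about x
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])  # yaw about y
    d = d @ (Ry @ Rx).T                 # rotate rays toward the attention point
    lon = np.arctan2(d[..., 0], d[..., 2])
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))
    px = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    py = ((0.5 - lat / np.pi) * (H - 1)).astype(int)
    return equi[py, px]
```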
  • the image editing unit 526 also performs a process of enlarging the display magnification of the partial video when the stop state at the point of interest continues for a certain time.
  • The sound data correction unit 527 corrects the sound data to a volume corresponding to the relative position within the partial video. That is, based on the three-dimensional coordinates of the omnidirectional video acquired from the omnidirectional camera 133 and the collected sound data, it corrects the volume of the sound restored from the sound data. For example, if the position of the collected sound source is determined to be far from the attention point, the volume is reduced according to the distance.
  • the underwater noise can be eliminated by filtering with a bandpass filter or the like.
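  • A minimal sketch of the volume correction described above, assuming a simple inverse-distance law (the law and names are illustrative assumptions, not from the patent):

```python
import numpy as np

def correct_volume(samples, source_xyz, attention_xyz, ref_dist=1.0):
    """samples: mono PCM array; positions are coordinates in the
    omnidirectional video's three-dimensional coordinate system."""
    dist = float(np.linalg.norm(np.subtract(source_xyz, attention_xyz)))
    gain = ref_dist / max(dist, ref_dist)  # farther sound source -> lower volume
    return samples * gain
```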
  • The viewing device 20 is a device worn by the viewer. In accordance with the viewer's intention, it acquires from the management device 30 a partial video containing the attention point within the omnidirectional video captured by the omnidirectional camera 133, together with the sound collected by the omnidirectional camera 133, and lets the viewer view them. The partial video is viewed in a state where the rest of the viewer's field of view is blocked.
  • In this embodiment, the portable terminal is a smartphone on which a predetermined application program (AP) is installed.
  • a smartphone has a display screen, a posture sensor such as a gyro sensor or an acceleration sensor, a wireless communication function, an input mechanism from a microphone, and an output mechanism to a speaker or an earphone.
  • the attitude sensor can detect the attitude of the casing or its change state, and the detection result can be transmitted to the management apparatus 30 together with its own ID.
  • the voice uttered by the viewer can be transmitted to the management apparatus 30 and the sound data or voice from the management apparatus 30 can be output. Further, it has a login / logoff function using its own ID with the management apparatus 30.
  • FIG. 6(a) is a top view of the goggles 21, FIG. 6(b) is a front view, and FIG. 6(c) is a rear view.
  • FIG. 7 is a schematic diagram showing a use state of the goggles 21.
  • FIGS. 8A and 8B are schematic diagrams showing the state of the goggles 21 in use.
  • The goggles 21 have a terminal accommodating portion 21 and a lens accommodating portion 22, each formed by molding a resin material.
  • the terminal accommodating portion 21 and the lens accommodating portion 22 are configured to be openable and closable around an engagement mechanism provided in a part thereof, for example, a hinge.
  • a buckle is provided on the outer surface of the lens housing portion 22 to fix the belt members 23, 24, and 25 for attaching the goggles 21 to the viewer's head at three points.
  • In the terminal accommodating portion 21 are formed a narrow U-shaped groove-like gap into which a smartphone can be inserted, that is, a terminal accommodating mechanism 211 that is open only at its upper bottom surface, and a partition plate accommodating mechanism 212 into which the partition plate 273 is inserted.
  • the front panel 215 of the terminal accommodating portion 21 blocks external light in cooperation with the casing up to the joint portion with the lens accommodating portion 22 and the casing of the lens accommodating portion 22.
  • a soft resin that elastically holds the smartphone is attached to the back side of the front panel 215.
  • the partition plate 273 is a resin plate extending in the vertical direction with respect to the display screen 280 of the smartphone, and has a role of dividing the viewing space for viewing the display screen 280 into two.
  • The lens accommodating portion 22 is formed with a narrow U-shaped groove-like gap into which an interchangeable lens plate can be inserted interchangeably, that is, a lens plate accommodating mechanism 213 that is open only at its upper bottom surface, and a face portion 222 bordered with a soft elastic material covering the area around both eyes from the part 221 that rests against the nose of the viewer's face.
  • the interchangeable lens plate housed in the lens plate housing mechanism 213 is positioned so that the lens surface overlaps the display screen 280 of the smartphone.
  • As the interchangeable lens plate, a monocular lens plate 271 attached when viewing monocular video or a binocular lens plate 272 attached when viewing stereo video is used. For example, the binocular lens plate 272 can be used for adults and the monocular lens plate 271 for children.
  • Each lens plate holds one or two lenses in a frame. Each of these lenses may be a plano-convex lens, a plano-concave lens, a biconvex lens, a biconcave lens, or the like.
  • a Fresnel lens can also be used. If necessary, a vision correction lens can also be used.
  • In this way, a viewing device 20 that can be used by both children and adults can be realized.
  • FIG. 9 is a functional configuration diagram of a smartphone accommodated in the viewing device 20. These functions are realized by the CPU of the smartphone that reads and executes the AP.
  • the smartphone has a posture detection unit 231.
  • The posture detection unit 231 transmits the detection result of the posture sensor to the management device 30 in real time via the communication control unit 232. That is, it transmits to the management device 30 the data necessary for specifying the attention point corresponding to the azimuth, direction, and change state in which the smartphone is facing.
  • image data for displaying a partial video including a point of interest among the omnidirectional video corresponding to the azimuth / direction / change state, and sound data collected by the omnidirectional camera 133 are input through the image data input unit 241 and the sound data input / output unit 251 respectively.
  • the image data is restored to an image having a resolution and a display size that can be displayed on the display screen 280 of the smartphone by the image processing unit 242 and input to the display control unit 243.
  • The display control unit 243 performs display control on the display screen 280, switching to the monocular display image 281 when the smartphone is set to monocular video display, and to the stereo video display image 282 when the smartphone displays three-dimensional stereo video, as sketched below.
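  • The switching can be pictured as composing the screen image in one of two ways: the partial video shown once for the monocular display image 281, or duplicated side by side for the stereo video display image 282 (one half per eye behind the binocular lens plate 272). This sketch uses naive column subsampling and no parallax; the names are illustrative assumptions.

```python
import numpy as np

def compose_screen(frame, stereo):
    """frame: HxWx3 partial video; returns the image for display screen 280."""
    if not stereo:
        return frame                            # monocular display image 281
    half = np.ascontiguousarray(frame[:, ::2])  # naive per-eye downscale
    screen = np.empty((frame.shape[0], 2 * half.shape[1], 3), dtype=frame.dtype)
    screen[:, :half.shape[1]] = half            # left-eye half
    screen[:, half.shape[1]:] = half            # right-eye half (no parallax here)
    return screen
```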
  • the sound data input from the management device 30 is restored to sound that can be played back by the smartphone by the sound processing unit 252 and output from a speaker or an earphone.
  • audio data input from the microphone is input to the sound processing unit 252 via the audio input unit 253, and is transmitted to the management device 30 through the sound data input / output unit 251 and the communication control unit 232.
  • FIG. 10 is a sequence diagram of the operation of each device.
  • FIG. 11 is an explanatory diagram of the state of the underwater imaging device 10 in water.
  • The management device 30 sets a predetermined part in a predetermined direction in each of the omnidirectional videos from the plurality of underwater imaging devices 10 as the attention point, that is, the direction set by default in each omnidirectional camera 133, and receives the partial video of that part and the sound data collected by the omnidirectional camera 133 (ST00).
  • The viewer accesses the management device 30 through the login screen displayed on the display screen 280 of the smartphone and logs in before mounting the viewing device 20, which has the structure shown in FIGS. 6 to 8, on the head (ST01).
  • the management device 30 performs authentication upon the login from the viewing device 20 (ST02). If the authentication is successful, the partial video from the currently operating omnidirectional camera 133 is presented to the viewing device 20 as a thumbnail video having a resolution lower than that during normal display (ST03).
  • the viewing device 20 displays a thumbnail image on the display screen 280 and waits for the viewer to select the omnidirectional camera 133.
  • When one thumbnail is selected, the management device 30 starts a link between the viewing device 20 and the underwater imaging device 10 that houses the selected omnidirectional camera 133 (ST05).
  • the viewing device 20 identifies the omnidirectional camera 133 (ST06).
  • The underwater imaging device 10 (control device 1104) specifies the viewing device 20 to be linked (ST07), and captures an omnidirectional video and collects sound data for the specified viewing device 20.
  • a partial image of a default direction / part is output from the captured omnidirectional image (ST08). The viewer wears the viewing device 20 on the head and wears the earphone connected to the smartphone to the ear.
  • In order to move the omnidirectional camera 133 in the direction in which he or she wants to view in the water, or to change the partial video, the viewer who is viewing the partial video changes the attitude of the viewing device 20.
  • the posture sensor of the smartphone detects the posture change and transmits it to the management device 30 (ST09).
  • the management device 30 controls the displacement of the omnidirectional camera 133 when the detection result of the posture change satisfies the predetermined first condition (ST10).
  • When the detection result of the posture change satisfies a predetermined second condition, attention point change control (ST11) for continuously changing the extracted part of the partial video is performed.
  • <Example of the first condition> For example, suppose that, with the viewer wearing the viewing device 20 and viewing the partial video on the display screen 280, the head is moved vertically downward at a predetermined speed, the posture is maintained for a desired time, and then returned.
  • the predetermined speed is, for example, a speed at which the video on the display screen 280 is interrupted.
  • the management device 30 recognizes such a posture change of the viewing device 20 as a displacement control instruction in the depth direction, and performs displacement control of the omnidirectional camera 133 (ST10). Specifically, the control device 1104 of the underwater imaging device 10 causes the cable winding mechanism 1103 to rewind the communication cable 121.
  • the amount of rewinding increases as the time during which the viewer maintains the vertically downward posture becomes longer.
  • Thereby, the position of the omnidirectional camera 133 is displaced in the depth direction from that point on.
  • the omnidirectional camera 133 captures and collects an omnidirectional video image at the displaced position (ST12), and transmits the data obtained as a result to the management device 30 in real time.
  • the management device 30 performs image editing and sound processing on the received data, and transmits the result data to the viewing device 20 (ST13).
  • the viewing device 20 performs partial video output and sound output based on the result data received from the management device 30 (ST14).
  • Conversely, when raising the omnidirectional camera 133, the viewer changes the posture of the head vertically upward at a predetermined speed while wearing the viewing device 20, maintains the posture for a desired time, and then returns to level.
  • the posture sensor of the smartphone detects this posture change and transmits it to the management device 30 (ST09).
  • the management device 30 performs displacement control (ST10) of the omnidirectional camera 133. That is, the management device 30 causes the control device 1104 of the underwater imaging device 10 to cause the cable winding mechanism 1103 to wind the communication cable 121 (ST11). The amount of winding increases as the time during which the viewer maintains the vertically upward posture becomes longer. Thereby, the position of the omnidirectional camera 133 is displaced in the water surface direction from the time point.
  • the omnidirectional camera 133 captures and collects an omnidirectional video image at the displaced position (ST12), and transmits the data obtained thereby to the management device 30 in real time.
  • the management device 30 performs image editing and sound processing on the received data, and transmits the result data to the viewing device 20 (ST13).
  • the viewing device 20 performs partial video output and sound output based on the result data received from the management device 30 (ST14).
  • When it is desired to move the omnidirectional camera 133 in the horizontal left direction, the viewer turns the head in the horizontal left direction at a predetermined speed while wearing the viewing device 20, and maintains the posture for a desired time.
  • the posture sensor of the smartphone detects the posture change and transmits it to the management device 30 (ST09).
  • the management device 30 performs displacement control (ST10) of the omnidirectional camera 133. That is, the management device 30 recognizes the displacement control instruction in the horizontal direction, controls the steering mechanism of the underwater imaging device 10, drives the screw, motor, and steering unit, and moves the moving buoy 110 in the horizontal left direction from that point. (ST11).
  • When it is desired to move the omnidirectional camera 133 in the horizontal right direction, the viewer turns the head in the horizontal right direction at a predetermined speed while wearing the viewing device 20 and maintains the posture for a desired time. Subsequent operations of the management device 30 and the omnidirectional camera 133 are the same as described above for the vertical displacement. Note that the omnidirectional camera 133 can be moved in any horizontal direction; in that case, the viewer turns the head in the desired direction at a predetermined speed while wearing the viewing device 20, maintains the posture for a desired time, and then returns to level.
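  • The recognition of these first-condition gestures can be sketched as a simple classifier over the smartphone's posture-sensor rates: fast vertical motion maps to depth control, fast horizontal motion to buoy propulsion, and slow motion falls through to attention point change. The threshold and names are illustrative assumptions, not values from the patent.

```python
FAST_DEG_PER_S = 60.0  # "predetermined speed" threshold (assumed value)

def classify_displacement(yaw_rate_dps, pitch_rate_dps, hold_time_s):
    """Return (direction, hold_time_s) for a displacement instruction, or None
    if the motion is slow; the hold time scales the winding/propulsion amount."""
    if abs(pitch_rate_dps) >= FAST_DEG_PER_S:
        # Fast vertical head motion: depth control via cable winding mechanism 1103.
        return ("down" if pitch_rate_dps < 0 else "up", hold_time_s)
    if abs(yaw_rate_dps) >= FAST_DEG_PER_S:
        # Fast horizontal head motion: propel moving buoy 110 left or right.
        return ("left" if yaw_rate_dps > 0 else "right", hold_time_s)
    return None  # slow motion: handled as an attention point change (second condition)
```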
  • <Example of the second condition> On the other hand, suppose that the viewer, wearing the viewing device 20, gradually turns the head in the vertical, horizontal, or diagonal direction at a speed at which the display of the partial video on the display screen 280 is not interrupted. Further suppose that such a change continues, then stops, and the stopped state continues for a predetermined time.
  • the posture sensor of the smartphone detects such a change in posture (including the stopped state after the change) and transmits it to the management device 30 (ST09).
  • the management device 30 recognizes the state in which the posture change continues as an attention point change instruction from the viewing device 20, and changes the attention point (ST11).
  • the management device 30 performs image editing of the partial video including the changed attention point, performs sound processing, and transmits the result data to the viewing device 20 (ST13).
  • The management device 30 also performs image editing that increases the imaging magnification of the partial video when the stopped state of the viewing device 20 continues for a predetermined time, as sketched below. It then outputs the result data of these processes to the viewing device 20.
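  • The dwell-to-zoom behavior can be sketched as stepping up the display magnification while the attention point stays still; the timing and zoom constants below are illustrative assumptions.

```python
DWELL_THRESHOLD_S = 2.0  # stop duration before zooming begins (assumed)
ZOOM_STEP = 1.05         # magnification growth per update (assumed)
MAX_ZOOM = 4.0

def update_zoom(stopped_for_s, current_zoom):
    """Return the new display magnification for the partial video."""
    if stopped_for_s < DWELL_THRESHOLD_S:
        return 1.0                                  # attention point still moving
    return min(current_zoom * ZOOM_STEP, MAX_ZOOM)  # enlarge around the attention point
```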
  • the viewing device 20 performs partial video output and sound output based on the result data received from the management device 30 (ST14).
  • the viewing device 20 repeats the operations after ST09 until the viewer logs off (ST15: N).
  • When the viewer logs off, the management device 30 cancels the link between the viewing device 20 and the underwater imaging device 10 (ST16). Thereby, the underwater imaging device 10 shifts to the standby mode (ST17).
  • As described above, in this embodiment, the viewing device 20 transmits the viewer's viewing mode (posture changes and the like) to the management device 30, and the management device 30 performs displacement control of the omnidirectional camera 133 and attention point changes according to that viewing mode.
  • in addition, a partial video is extracted from the omnidirectional video, and the partial video is edited according to the viewing mode of the viewer. For example, when the attention point stays stopped after a change for a certain time or longer, the display magnification of the partial video is increased around the attention point. As a result, for example, as shown in FIG., the omnidirectional camera 133 can be displaced to the vicinity of a depression in a reef, and the partial video of that spot can be viewed together with the sound collected there. It therefore becomes easy, for example, to watch creatures that habitually hide behind rocks. Moreover, even when the creature the viewer wants to see does not come close, or when it is small, keeping it at the center of the partial video as the attention point and holding still for a certain time raises the imaging magnification, so the viewer can experience watching the creature as if up close. Thus, according to the present embodiment, it is possible to provide video of a landscape as though the viewer were exploring the underwater space of his or her own volition. The dwell-and-zoom behavior is sketched below.
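This stop-to-magnify behavior is essentially a dwell timer on the attention point. A small sketch of one possible policy (the step size, the cap, and the reset rule are assumptions, not taken from the embodiment):

    import time
    from typing import Optional

    class DwellZoom:
        """Raise the display magnification while the attention point stays still."""

        def __init__(self, dwell_s=2.0, step=0.5, max_zoom=4.0):
            self.dwell_s, self.step, self.max_zoom = dwell_s, step, max_zoom
            self.zoom = 1.0
            self._still_since: Optional[float] = None

        def update(self, moved, now=None):
            """Feed one posture report; returns the current magnification."""
            now = time.monotonic() if now is None else now
            if moved:
                self._still_since = None
                self.zoom = 1.0          # assumption: reset once the viewer looks away
            elif self._still_since is None:
                self._still_since = now
            elif now - self._still_since >= self.dwell_s:
                self.zoom = min(self.zoom + self.step, self.max_zoom)
                self._still_since = now  # require another full dwell before the next step
            return self.zoom

The returned value would feed straight into the zoom parameter of the frame-extraction sketch above.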
  • for example, the parent's viewing device 20 can display a three-dimensional stereo image by accommodating the binocular lens plate 272 in the goggles 21, while the child's viewing device 20 can display a monocular image by accommodating the monocular lens plate 271 in the goggles 21.
  • since each of the viewing devices 20 can control an underwater imaging device 10 independently, the parent and the child can each view the video they wish to see.
  • furthermore, the viewing direction and the part to be viewed can be combined freely for each viewing device 20.
  • if the management device 30 is given a function of detecting vibration of a subject shown in the partial video, and each viewing device 20 displaying that partial video is provided with a vibration transmission medium that conveys the detected vibration to its viewer, the viewer can be given vibrations that could not be perceived from land. In addition, if the vibration is converted into sound, it can be heard as sound.
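One straightforward way to realize the vibration-to-sound conversion mentioned here is to resample the detected vibration waveform to an audio rate and normalize it. A minimal sketch under that assumption (the embodiment does not specify the conversion):

    import numpy as np

    def vibration_to_sound(vib, vib_rate_hz, audio_rate_hz=44100, gain=0.8):
        """Render a detected vibration waveform (at least two samples)
        as an audible float32 signal at the audio sample rate."""
        t_vib = np.arange(len(vib)) / vib_rate_hz
        t_audio = np.arange(0.0, t_vib[-1], 1.0 / audio_rate_hz)
        audio = np.interp(t_audio, t_vib, np.asarray(vib, dtype=float))
        peak = float(np.max(np.abs(audio))) or 1.0  # avoid dividing by zero
        return (gain * audio / peak).astype(np.float32)

A real system might instead shift the vibration spectrum into the audible band, since low-frequency vibrations can lie below hearing range.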
  • further, the present invention can be implemented as an underwater space viewing method having the following steps: (1) an imaging step of placing the underwater imaging device 10 in the water and capturing an omnidirectional video of the underwater space; (2) a display step of acquiring the omnidirectional video captured by the underwater imaging device 10 in real time, extracting from it a partial video including a predetermined attention point, and displaying the partial video on the viewing device 20, whose display screen can be viewed with external light blocked; and (3) a changing step of continuously changing the attention point according to changes in the three-dimensional posture of the viewing device 20, thereby changing the partial video displayed on the viewing device 20. A schematic rendering of these steps as a loop follows.
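Read as software, the three steps form a capture-extract-display loop driven by the posture of the viewing device. The skeleton below is purely illustrative; every class is a stand-in for hardware in the embodiment, not an API it defines:

    class ImagingDevice:
        def capture_omnidirectional(self):
            # (1) Imaging step: stand-in for grabbing one omnidirectional frame.
            return {"frame": "omnidirectional"}

    class Viewer:
        def __init__(self, postures):
            self._postures = iter(postures)

        def read_posture(self):
            # Returns the next (yaw, pitch) report, or None when the session ends.
            return next(self._postures, None)

        def display(self, partial):
            print("displaying", partial)

    def run_session(camera, viewer):
        while (posture := viewer.read_posture()) is not None:
            frame = camera.capture_omnidirectional()
            attention_point = posture                          # (3) changing step
            partial = {"frame": frame, "point": attention_point}
            viewer.display(partial)                            # (2) display step

    run_session(ImagingDevice(), Viewer([(0, 0), (10, 5), (20, 5)]))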
  • the underwater imaging device 10 may be movable in the horizontal and depth directions as in the first embodiment, or it may instead be installed at a specific spot with a good view.
  • alternatively, a controller that independently controls the propulsion mechanism 102, the cable winding mechanism 1103, and the control device 1104 of the moving part 110 may be provided, so that the attention point and the like are displaced by means of this controller.
  • <Mode 3> In the first embodiment, an example was described in which a smartphone is inserted into goggles to serve as the viewing device 20, whose display screen the viewer on land can view with outside light blocked.
  • alternatively, a liquid crystal display connected to a personal computer may be enclosed in a predetermined casing or film.
  • in that case, the AP is installed on a tablet terminal or a personal computer.
  • furthermore, each underwater imaging device 10 may be provided with one or more detection means for detecting the vibration of an object shown in the partial video, and each viewing device 20 may be provided with a vibration transmission medium, so that the vibration detected by the underwater imaging device 10 is transmitted through the vibration transmission medium to the viewer of the viewing device 20 that displayed the partial video.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an underwater space viewing system capable of giving a viewer video of a scene such that an underwater space can be explored according to the viewer's intentions. A control device in a moving buoy (110) is connected to an omnidirectional camera, housed in a camera box (130) in the sea, by a communication cable held in a watertight cable holder (120). The control device in the moving buoy (110) wirelessly transmits omnidirectional video to a management device on land. The management device delivers to a viewing device a partial video extracted from the omnidirectional video according to the orientation of the viewing device, and changes the imaging magnification and the position of the omnidirectional camera according to the movement of the viewing device.
PCT/JP2018/000699 2017-01-13 2018-01-12 Dispositif de visualisation, système de visualisation d'espace sous-marin et procédé de visualisation d'espace sous-marin WO2018131688A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-004568 2017-01-13
JP2017004568A JP6262890B1 (ja) 2017-01-13 2017-01-13 視聴装置、水中空間視聴システム及び水中空間視聴方法

Publications (1)

Publication Number Publication Date
WO2018131688A1 true WO2018131688A1 (fr) 2018-07-19

Family

ID=60989228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/000699 WO2018131688A1 (fr) 2017-01-13 2018-01-12 Dispositif de visualisation, système de visualisation d'espace sous-marin et procédé de visualisation d'espace sous-marin

Country Status (2)

Country Link
JP (1) JP6262890B1 (fr)
WO (1) WO2018131688A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6665981B2 (ja) * 2016-06-08 2020-03-13 株式会社ザクティ 全天球カメラ
JP2021192471A (ja) * 2018-09-14 2021-12-16 ソニーグループ株式会社 表示制御装置および表示制御方法、並びにプログラム

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10304351A (ja) * 1997-02-25 1998-11-13 Masanobu Kujirada 動物移動映像等提供システム
JP2003134505A (ja) * 2001-10-22 2003-05-09 Ishikawajima Harima Heavy Ind Co Ltd 宇宙展望台
JP2007075300A (ja) * 2005-09-13 2007-03-29 Konami Digital Entertainment:Kk 立体視眼鏡
JP2013190451A (ja) * 2010-09-30 2013-09-26 Panasonic Corp ステレオビューア及び表示装置
WO2015145863A1 (fr) * 2014-03-28 2015-10-01 国立研究開発法人理化学研究所 Système d'affichage, dispositif, procédé d'affichage et programme
JP2016005125A (ja) * 2014-06-17 2016-01-12 株式会社ネクストシステム ヘッドマウント型装置及びヘッドマウント型情報表示システム
JP2016092691A (ja) * 2014-11-07 2016-05-23 キヤノン株式会社 画像処理装置及びその制御方法、プログラム、並びに記憶媒体

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS628895A (ja) * 1985-07-06 1987-01-16 Kaiken:Kk ラジオコントロ−ル潜水ロボツト
JPH03169796A (ja) * 1989-11-29 1991-07-23 Kansai Electric Power Co Inc:The 水質測定用ロボット装置
JPH04225478A (ja) * 1990-12-27 1992-08-14 Fujitsu Ltd 視線検出装置
JPH04238296A (ja) * 1991-01-22 1992-08-26 Nuclear Fuel Ind Ltd 水中遊泳検査装置
JP4380890B2 (ja) * 2000-06-14 2009-12-09 三菱重工業株式会社 捜索用水中航走体システム、水中航走体、船舶用捜索指令装置、および画像処理方法
JP3929346B2 (ja) * 2002-04-24 2007-06-13 日立造船株式会社 ステレオ画像表示方法およびステレオ画像表示装置
JP5402690B2 (ja) * 2010-02-04 2014-01-29 日本電気株式会社 鉱物採取システムおよび鉱物採取方法
JP2011191384A (ja) * 2010-03-12 2011-09-29 Panasonic Corp 表示装置
JP5805423B2 (ja) * 2011-04-13 2015-11-04 株式会社フジタ 全方位撮影システム
JP2012257021A (ja) * 2011-06-08 2012-12-27 Sony Corp 表示制御装置および方法、プログラム、並びに記録媒体
JP6139118B2 (ja) * 2012-12-06 2017-05-31 東芝メディカルシステムズ株式会社 X線診断装置及び制御プログラム
CA3027407A1 (fr) * 2014-02-18 2015-08-27 Merge Labs, Inc. Lunettes d'affichage facial a utiliser avec des dispositifs informatiques mobiles
JP2016084021A (ja) * 2014-10-27 2016-05-19 三菱電機株式会社 車両用表示装置及び車両用表示システム
JP6621063B2 (ja) * 2015-04-29 2019-12-18 パナソニックIpマネジメント株式会社 カメラ選択方法及び映像配信システム
JP6256513B2 (ja) * 2016-04-13 2018-01-10 株式会社リコー 撮像システム、撮像装置、方法およびプログラム

Also Published As

Publication number Publication date
JP6262890B1 (ja) 2018-01-17
JP2018113653A (ja) 2018-07-19

Similar Documents

Publication Publication Date Title
JP6094190B2 (ja) 情報処理装置および記録媒体
CN110139028B (zh) 一种图像处理的方法及头戴式显示设备
US9927948B2 (en) Image display apparatus and image display method
JP6252849B2 (ja) 撮像装置および方法
US9055220B1 (en) Enabling the integration of a three hundred and sixty degree panoramic camera within a mobile device case
KR20120053006A (ko) 개선된 오디오/비디오 방법들 및 시스템들
CN107071237A (zh) 图像记录系统、用户佩戴装置、摄像装置、图像处理装置以及图像记录方法
JP6580516B2 (ja) 処理装置および画像決定方法
JP6262890B1 (ja) 視聴装置、水中空間視聴システム及び水中空間視聴方法
WO2020059327A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
GB2558283A (en) Image processing
KR20070113067A (ko) 휴대용 입체 카메라
US20240205513A1 (en) Video display system, information processing device, information processing method, and recording medium
CN213028363U (zh) 一种虚拟现实眼镜
JPWO2016002322A1 (ja) 画像処理装置、画像処理方法およびプログラム
GB2571286A (en) Virtual reality
JP2005277845A (ja) 撮影制御装置
JP2022022871A (ja) 処理装置および没入度導出方法
RU2782312C1 (ru) Способ обработки изображения и устройство отображения, устанавливаемое на голове
JP2009017106A (ja) 撮像装置
WO2022240319A1 (fr) Dispositif pour obtenir une vue panoramique
CN113660478A (zh) 一种无人机立体摄影同步终端的显示方法及立体眼镜
KR20080007914A (ko) 낚시용 카메라
WO2018096315A1 (fr) Réalité virtuelle
JP2021068296A (ja) 情報処理装置、ヘッドマウントディスプレイ、およびユーザ操作処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18739186

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18739186

Country of ref document: EP

Kind code of ref document: A1
