WO2018139073A1 - Display control device, second display device, control method for display control device, and control program

Info

Publication number: WO2018139073A1
Authority: WO (WIPO/PCT)
Application number: PCT/JP2017/044284
Other languages: English (en), Japanese (ja)
Inventor: 久雄 熊井
Original assignee: シャープ株式会社 (Sharp Corporation)
Prior art keywords: display, area, image, unit, display device

Classifications

    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/038 — Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G09G 5/00 — Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/08 — Cursor circuits
    • G09G 5/36 — Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/377 — Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • H04N 21/431 — Generation of visual interfaces for content selection or interaction; content or additional data rendering

Description

  • One aspect of the present invention relates to a display control device that displays content on a display device, a display device that displays a partial image in a specified display region of the image region of the content, and the like.
  • Patent Document 1 discloses a technology related to panoramic video distribution, and Patent Document 2 discloses a technique related to the display of an omnidirectional image. Both documents relate to techniques for causing a display device to display a partial image of a designated display region of an image, such as an omnidirectional image, whose image region does not fit on one screen of the display device.
  • Patent Document 1: Japanese Unexamined Patent Application Publication No. 2015-173424 (published October 1, 2015). Patent Document 2: Japanese Unexamined Patent Application Publication No. 2015-18296 (published January 29, 2015).
  • A head-mounted display (HMD) is known as a display device for displaying such a partial image.
  • However, since the display area can basically be specified freely by the user of the display device, the trainee may not be viewing the area that the educator or instructor wants them to pay attention to. Unless the educator or instructor can recognize which area the trainee is viewing, the intended effects of education and guidance cannot be expected.
  • This problem is not limited to displaying a partial image of an omnidirectional image; it arises whenever a partial image of a designated display area of an image area is displayed.
  • An object of one embodiment of the present invention is to realize a display control device or the like that allows a user other than the user of a display device to recognize the display region of that display device, where the display device displays a partial image of a specified display region of an image region.
  • In order to solve the above problem, a display control device according to one aspect of the present invention is a display control device that displays content on a first display device, and includes: a display area specifying unit that specifies which area of the image area of the content is specified as the display area of a second display device, the second display device displaying a partial image of the specified display area; and a display area notification unit that causes the first display device to display display area information indicating the area specified by the display area specifying unit.
  • A second display device according to one aspect of the present invention can communicate directly or indirectly with a display control device that displays content on a first display device, and displays a partial image of a specified display area of the image area of the content. The second display device includes: a display area specifying unit that specifies which area of the image area is specified as the display area; and a display area notification unit that notifies the display control device of display area information indicating the area specified by the display area specifying unit, thereby causing the display area information to be displayed on the first display device.
  • A control method for a display control device according to one aspect of the present invention is a control method for a display control device that displays content on a first display device, and includes: a display area specifying step of specifying which area of the image area of the content is specified as the display area of a second display device that displays a partial image of the designated display area; and a display area notification step of causing the first display device to display display area information indicating the area specified in the display area specifying step.
  • According to each of the above aspects, the display area of the second display device can be recognized by the user of the first display device.
  • Embodiment 1: An embodiment of the present invention will be described below with reference to the figures.
  • FIG. 1 is a block diagram illustrating an example of a main configuration of a display control device 1 and a display device (second display device) 3 included in a control system 5 according to the present embodiment.
  • In the present embodiment, the display device 3 is a head-mounted display (HMD) that is worn on the user's head.
  • In the present embodiment, a user who is an instructor uses the display control device 1 to guide a trainee, that is, a user who uses the display device 3.
  • the guidance is performed while displaying the same content on both the display control device 1 and the display device 3.
  • the display device 3 is not limited to the HMD, and may be a personal computer, a television receiver, a smartphone, a tablet terminal, or the like provided with a display.
  • the display control device 1 may be a personal computer, a television receiver, a smartphone, a tablet terminal, or the like.
  • the display control device 1 includes a control unit 10, a display unit (first display device) 17, an input unit 18, a communication unit 19, and a storage unit 20.
  • The control unit 10 controls each unit of the display control device 1 in an integrated manner, and includes an omnidirectional image drawing unit 11, a display region specifying unit 12, a display region notification unit 13, a guide unit (display region control unit) 14, a combining unit 15, and a superimposition processing unit 16.
  • the omnidirectional image drawing unit 11 displays the omnidirectional image (content) on the display unit 17.
  • the display device 3 displays a partial image that is a part of the omnidirectional image, but the omnidirectional image drawing unit 11 causes the display unit 17 to display the entire image area of the omnidirectional image on one screen.
  • Note that the image displayed by the omnidirectional image drawing unit 11 need only include at least the partial image displayed on the display device 3, and may itself be a partial image of the image area of the omnidirectional image.
  • the display area specifying unit 12 specifies which area of the image area of the content the display device 3 uses as the display area. Then, the display area notification unit 13 notifies the user of the display control device 1 of the display area information by displaying the display area information indicating the area specified by the display area specifying unit 12 on the display unit 17.
  • the guide unit 14 causes the display unit 17 to display area guide information indicating a guide area that is an image area to be displayed on the display device 3.
  • the guide area can be designated by the instructor through an input operation to the input unit 18.
  • a guide area can be set in advance in guide information 22 described later.
  • the guide unit 14 causes the display device 3 to display the image area indicated by the area guide information.
  • The guide unit 14 also performs control to move the image area of the display device 3 to the position to which the display area information has been moved. Note that the process of moving the image area of the display device 3 may instead be performed by a processing block separate from the guide unit 14.
  • The combining unit 15 combines the above-described display area information with the omnidirectional image, generates a combined image in which the display area information is superimposed on the omnidirectional image, and causes the display unit 17 to display it. Similarly, the combining unit 15 generates a combined image in which the above-described area guide information is superimposed on the omnidirectional image, and a combined image in which a superimposed image (described later) is superimposed on the omnidirectional image, and causes the display unit 17 to display them.
  • the superimposition processing unit 16 causes the display device 3 to display a superimposed image related to the omnidirectional image. Since the partial image of the omnidirectional image is displayed on the display device 3, the superimposed image is displayed superimposed on the partial image. In addition, the superimposition processing unit 16 sets a region designated in the omnidirectional image displayed by the display unit 17 as a superimposition region of the superimposed image. Details of the superimposition processing unit 16 will be described in the second embodiment.
  • the display unit 17 is a device that displays an image.
  • the display unit 17 may be a display device externally attached to the display control device 1.
  • the input unit 18 receives a user input operation, and outputs information indicating the content of the received input operation to the control unit 10.
  • the input unit 18 may be, for example, a receiving unit that receives a signal indicating the content of a user input operation on a controller (not shown) from the controller.
  • The input unit 18 may accept any operation, for example an operation for specifying the position of the guide area on the content, an operation for moving the display area information on the omnidirectional image, an operation for selecting the area guide information, or an operation for specifying the superimposition area on the content.
  • the communication unit 19 is for the display control device 1 to communicate with another device (display device 3 in this example).
  • The communication unit 19 and the communication unit 38 of the display device 3 may communicate with each other in a full-mesh (peer-to-peer) manner or indirectly via a predetermined communication network (for example, the Internet) and other devices.
  • the storage unit 20 stores various data used by the display control device 1.
  • the storage unit 20 stores an omnidirectional image 21, guide information 22, and a superimposed image 23.
  • the omnidirectional image 21 is an image obtained by imaging all directions from the imaging point.
  • the content displayed by the display control device 1 is the omnidirectional image 21.
  • the guide information 22 is information indicating which of the display areas of the omnidirectional image 21 is a guide area.
  • the guide unit 14 can display area guide information based on the guide information 22.
  • the superimposed image 23 is an image having contents related to the omnidirectional image 21. Details of the superimposed image 23 will be described in a second embodiment.
  • the omnidirectional image 21 and the superimposed image 23 may be a moving image or a still image. In the present embodiment, an example in which the omnidirectional image 21 is a moving image will be described.
  • the display device 3 is a device that displays content, and includes a control unit 30, a display unit 35, a storage unit 36, a sensor 37, and a communication unit 38.
  • The control unit 30 controls each unit of the display device 3 in an integrated manner, and includes a line-of-sight direction specifying unit 31, an omnidirectional image drawing unit (display region specifying unit) 32, a combining unit 33, and a display region notification unit 34.
  • the line-of-sight direction specifying unit 31 determines the line-of-sight direction of the user of the display device 3 from the output value of the sensor 37.
  • the sensor 37 detects the orientation of the display device 3, that is, the orientation of the face of the user wearing the display device 3 (front direction).
  • the sensor 37 may be configured by a six-axis sensor that combines at least two of a three-axis gyro sensor, a three-axis acceleration sensor, a three-axis magnetic sensor, and the like.
  • the line-of-sight direction specifying unit 31 sets the direction of the user's face specified from the output values of these sensors as the user's line-of-sight direction.
  • Alternatively, the sensor 37 may detect the position of the user's pupils. In this case, the line-of-sight direction specifying unit 31 specifies the line-of-sight direction from the pupil positions. The sensor 37 may also include both a sensor that detects the orientation of the user's face and a sensor that detects the positions of the user's pupils.
  • the identification of the line-of-sight direction can also be realized by a configuration other than the above.
  • a camera installed outside the display device 3 may be used instead of the sensor 37.
  • For example, the display device 3 may be provided with a light-emitting device that is made to blink; by photographing this with the camera, the position and orientation of the display device 3 can be detected from the captured image.
  • Alternatively, the line-of-sight direction can be determined by back-calculation from the reception times, the angles of reception at each point, or the time differences at which laser light emitted from an external light-emitting device is received by light receivers provided on the display device 3.
  • The omnidirectional image drawing unit 32 specifies, from the line-of-sight direction specified by the line-of-sight direction specifying unit 31, which region of the image region of the omnidirectional image 21 is designated as the display region. The omnidirectional image drawing unit 32 then causes the display unit 35, via the combining unit 33, to display the partial image of the specified display region of the omnidirectional image 21.
  • When there is a superimposed image 23 to be superimposed on the partial image, the combining unit 33 generates a combined image in which the superimposed image is superimposed on the partial image, and causes the display unit 35 to display it.
  • The display area notification unit 34 notifies the display control device 1, via the communication unit 38, of display area information indicating the display area specified by the omnidirectional image drawing unit 32, thereby causing the display area information to be displayed on the display unit 17 of the display control device 1.
  • the display unit 35 is a device that displays an image.
  • the display unit 35 may be a non-transmissive type or a transmissive type. When the transmissive display unit 35 is used, it is possible to provide the user with a mixed reality space in which the image displayed by the display unit 35 is superimposed on the visual field outside the display device 3 (real space). Further, the display unit 35 may be a display device externally attached to the display device 3, or a normal flat panel display or the like.
  • the storage unit 36 stores various data used by the display device 3. Although not shown in FIG. 1, the omnidirectional image 21 is stored in the storage unit 36 as in the display control device 1. When displaying the superimposed image 23, the superimposed image 23 is also stored in the storage unit 36.
  • the communication unit 38 is for the display device 3 to communicate with another device (in this example, the display control device 1).
  • The communication unit 38 and the communication unit 19 of the display control device 1 may communicate with each other in a full-mesh (peer-to-peer) manner or indirectly via a predetermined communication network (for example, the Internet) and other devices.
  • FIG. 2 is a diagram illustrating the relationship between the omnidirectional image and the display area.
  • the omnidirectional image A1 is shown in a three-dimensional coordinate space defined by x, y, and z axes orthogonal to each other.
  • the omnidirectional image A1 forms an omnidirectional sphere that is a sphere of radius r.
  • In FIG. 2, the z-axis direction coincides with the vertical direction of the sensor 37 in real space, the y-axis direction with the front direction of the sensor 37, and the x-axis direction with the left-right direction of the sensor 37.
  • the line-of-sight direction identifying unit 31 determines which direction the sensor 37 is facing from the output value of the sensor 37. Since the sensor 37 is mounted on the display device 3 in a predetermined orientation, if the user wears the display device 3 in the correct orientation, the orientation of the sensor 37 can be regarded as the user's line-of-sight direction. Therefore, in the following, the direction of the sensor 37 is described as the user's line-of-sight direction.
  • The line-of-sight direction can be expressed as a combination of an azimuth angle (yaw) θ (−180° ≤ θ ≤ 180°), which is the rotation angle around the vertical axis (z-axis), and an elevation angle (pitch) φ (−90° ≤ φ ≤ 90°), which is the rotation angle around the horizontal axis (x-axis).
  • When the line-of-sight direction specifying unit 31 specifies the azimuth angle and elevation angle indicating the line-of-sight direction, the omnidirectional image drawing unit 32 obtains the intersection point P between the omnidirectional image A1 and a straight line extending from the user's viewpoint position Q in the direction indicated by those angles. Then, in the omnidirectional image A1, an area of height h and width w centered on the intersection point P is specified as the display area A11, and the omnidirectional image drawing unit 32 causes the display unit 35 to display the portion of the omnidirectional image A1 within the display area A11 (a computational sketch follows below).
  • the display area A11 changes in conjunction with the user's line-of-sight direction, and the image displayed on the display unit 35 also changes accordingly.
  • the viewpoint position Q in the omnidirectional sphere is assumed to be stationary in order to simplify the description, but the viewpoint position in the omnidirectional sphere is linked with the movement of the user in the real space. Q may be moved.
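  • The display-area computation just described can be sketched as follows. This is an illustrative reading of FIG. 2, not code from the patent; the function name, the y-front axis convention, and the default angular width and height are assumptions.

```python
import math

def gaze_to_display_area(azimuth_deg, elevation_deg, r=1.0,
                         width_deg=80.0, height_deg=50.0):
    """Find the intersection point P of the gaze ray from the viewpoint Q
    (sphere centre) with the omnidirectional sphere of radius r, and the
    display area A11 centred on it (FIG. 2 conventions)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    # Gaze direction: y = front, x = left-right, z = vertical (FIG. 2).
    p = (r * math.cos(el) * math.sin(az),   # x
         r * math.cos(el) * math.cos(az),   # y
         r * math.sin(el))                  # z
    # The display area expressed as angular ranges, matching the
    # "azimuth angle range"/"elevation angle range" records of FIG. 5.
    display_area = {
        "azimuth_range": (azimuth_deg - width_deg / 2,
                          azimuth_deg + width_deg / 2),
        "elevation_range": (elevation_deg - height_deg / 2,
                            elevation_deg + height_deg / 2),
    }
    return p, display_area

# Example: gaze at azimuth -45 deg, elevation 50 deg yields a display area
# spanning azimuth -85..-5 deg, elevation 25..75 deg (user X1 in FIG. 5).
print(gaze_to_display_area(-45.0, 50.0))
```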
  • FIG. 3 is a diagram illustrating an image displayed by the display control device 1.
  • the display control device 1 displays a planar image obtained by mapping the omnidirectional image A1 on a two-dimensional plane on the display unit 17.
  • A position on the omnidirectional image A1 can be expressed by the azimuth angle θ (−180° ≤ θ ≤ 180°) and the elevation angle φ (−90° ≤ φ ≤ 90°), the rotation angle around the horizontal axis (x-axis).
  • In this planar image, the azimuth angle is −180° at the left end and 180° at the right end of the omnidirectional image A1, and the elevation angle is 90° at the upper end and −90° at the lower end (a pixel-mapping sketch follows below).
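  • Under these edge conventions, mapping a direction on the sphere to a pixel of the planar image is a simple linear (equirectangular) conversion. The patent gives no formula, so the following is a minimal sketch under the stated conventions; the image size is arbitrary.

```python
def angles_to_pixel(azimuth_deg, elevation_deg, img_width, img_height):
    """Map a sphere direction to a pixel of the planar (equirectangular)
    image: left edge = -180 deg azimuth, top edge = +90 deg elevation."""
    x = (azimuth_deg + 180.0) / 360.0 * (img_width - 1)
    y = (90.0 - elevation_deg) / 180.0 * (img_height - 1)
    return round(x), round(y)

# Centre of user X1's display area (azimuth -45 deg, elevation 50 deg)
# on a 4096x2048 planar image -> pixel (1536, 455).
print(angles_to_pixel(-45.0, 50.0, 4096, 2048))
```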
  • the display area information A11 indicating the display area in the display device 3 is superimposed on the omnidirectional image A1.
  • Display area information A12 and A13 indicating the display areas of other display devices 3 is also superimposed. That is, in FIG. 3, the display areas of three display devices 3 are represented by A11 to A13.
  • the display control device 1 can also display the display area information of the plurality of display devices 3 on one screen.
  • the area guide information B1 and B2 are superimposed on the omnidirectional image A1.
  • the area guide information B1 and B2 indicate guide areas that are image areas to be displayed on the display device 3.
  • the area guide information B1 and B2 can be selected by an input operation via the input unit 18.
  • When the area guide information B1 is selected, a partial image of the guide area indicated by the area guide information B1 is forcibly displayed on the display device 3; the same applies when the area guide information B2 is selected. Note that the same partial image may be displayed collectively on a plurality of display devices 3, or different partial images may be displayed by selecting different area guide information for each display device 3.
  • the display position of the display area information may be changed by an input operation via the input unit 18.
  • the image area of the display device 3 may be moved to the position to which the display area information is moved.
  • the guide information 22 may be information as shown in FIG. 4, for example.
  • FIG. 4 is a diagram illustrating an example of the guide information 22.
  • the guide information 22 in FIG. 4A is data in a table format in which information on “guide area”, “azimuth angle range”, “elevation angle range”, and “reproduction time” is associated.
  • “Guide area” is identification information of the guide area, and in this example, the name of the guide area is described.
  • “Azimuth angle range” and “elevation angle range” are information indicating the range occupied by the guide region in the omnidirectional image.
  • For example, the guide area B1 (the guide area indicated by the area guide information B1 in FIG. 3) is defined as a rectangular region of the omnidirectional image whose left-end azimuth angle is −110°, right-end azimuth angle is −30°, lower-end elevation angle is −10°, and upper-end elevation angle is 40°.
  • “Reproduction time” is information indicating the reproduction time zone for which the guide area is set. For example, since the “reproduction time” of the illustrated guide area B1 is 00:01:00 to 00:05:00, this guide area B1 is set for the time zone in which the content reproduction time is 00:01:00 to 00:05:00.
  • The guide information 22 shown in (b) of FIG. 4 replaces the “azimuth angle range” and “elevation angle range” of (a) of FIG. 4 with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in the guide information 22 of (b) of FIG. 4, these four items are the information indicating the range occupied by the guide area: “width” and “height” indicate the width and height of the guide area, while “azimuth angle” and “elevation angle” indicate a reference position for specifying the guide area. The reference position can be an arbitrary position on the guide area.
  • For example, the position of the lower-left corner may be used as the reference position. In that case, the guide area is the region whose lower-left corner is at the position indicated by “azimuth angle” and “elevation angle” and which has the width and height indicated by “width” and “height” (a data-structure sketch follows below).
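  • A record layout like that of (a) of FIG. 4 might be represented as follows. This is a sketch, not the patent's data format; the field names and the seconds-based time encoding are assumptions, while the B1 values come from the example above.

```python
from dataclasses import dataclass

@dataclass
class GuideArea:
    """One row of guide information 22 in the FIG. 4(a) format."""
    name: str                 # "guide area" identification
    azimuth_range: tuple      # (left_deg, right_deg)
    elevation_range: tuple    # (bottom_deg, top_deg)
    start_s: int              # "reproduction time" zone, in seconds
    end_s: int

GUIDE_INFO = [
    # Guide area B1: azimuth -110..-30 deg, elevation -10..40 deg,
    # reproduction time 00:01:00 to 00:05:00.
    GuideArea("B1", (-110, -30), (-10, 40), 60, 300),
]

def active_guide_areas(playback_s):
    """Guide areas whose reproduction time zone contains playback_s."""
    return [g for g in GUIDE_INFO if g.start_s <= playback_s <= g.end_s]

print(active_guide_areas(120))  # -> [guide area B1]
```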
  • the display area specifying unit 12 specifies the display area of the display device 3.
  • the display area specifying unit 12 may manage the specified display area as display area management information as shown in FIG. 5, for example.
  • FIG. 5 is a diagram illustrating an example of display area management information for managing the display area of the display device 3.
  • the display area management information in FIG. 5A is data in a table format in which information on “user”, “azimuth angle range”, and “elevation angle range” is associated.
  • “User” is identification information of the user corresponding to the display area, that is, the user viewing the image of that display area; in this example, the user name is described.
  • the display area management information may include identification information of the display device 3 in addition to the “user” information or instead of the “user” information.
  • “Azimuth range” and “elevation range” are information indicating the range occupied by the display area of the display device 3 in the omnidirectional image.
  • For example, the display area of the display device 3 of user X1 is defined as a rectangular region of the omnidirectional image whose left-end azimuth angle is −85°, right-end azimuth angle is −5°, lower-end elevation angle is 25°, and upper-end elevation angle is 75°.
  • The display area management information in (b) of FIG. 5 replaces the “azimuth angle range” and “elevation angle range” of (a) of FIG. 5 with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in (b) of FIG. 5, “width” and “height” indicate the width and height of the display area, while “azimuth angle” and “elevation angle” indicate a reference position for specifying the display area. The reference position can be an arbitrary position on the display area.
  • For example, the center position may be used as the reference position. In that case, the display area is the region whose center is at the position indicated by “azimuth angle” and “elevation angle” and which has the width and height indicated by “width” and “height” (a conversion sketch follows below).
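  • Converting between the two record formats of FIG. 5 is straightforward. The sketch below assumes a center reference position, as in the example just given; azimuth wrap-around at ±180° is ignored for brevity.

```python
def center_to_ranges(azimuth_deg, elevation_deg, width_deg, height_deg):
    """Convert a FIG. 5(b)-style record (centre reference plus width and
    height) into FIG. 5(a)-style angle ranges."""
    return ((azimuth_deg - width_deg / 2, azimuth_deg + width_deg / 2),
            (elevation_deg - height_deg / 2, elevation_deg + height_deg / 2))

# User X1's display area in centre form: centre (-45, 50), width 80 deg,
# height 50 deg -> azimuth range (-85, -5), elevation range (25, 75).
print(center_to_ranges(-45, 50, 80, 50))
```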
  • FIG. 6 is a flowchart illustrating an example of processing in which the display control device 1 displays an image. In the following, an example in which there is one display device 3 will be described, but the processing is the same when there are a plurality of display devices 3.
  • First, the omnidirectional image drawing unit 11 acquires the omnidirectional image 21 from the storage unit 20 and displays it on the display unit 17 via the combining unit 15 (S1).
  • Next, the display region specifying unit 12 specifies the display region of the display device 3 by communicating with the display device 3 (S2).
  • The display area specifying unit 12 may manage the specified display areas as display area management information as shown in FIG. 5.
  • Next, the guide unit 14 receives the designation of the position of the guide area by the instructor (S3) and specifies a guide area (S4).
  • The designation of the position of the guide area in S3 may be performed, for example, by the instructor inputting a desired position in the omnidirectional image 21 via the input unit 18. In this case, in S4 the guide unit 14 specifies a guide area whose size corresponds to the width and height of the display area of the display device 3, centered on the input position.
  • Next, the guide unit 14 causes the display unit 17 to display, via the combining unit 15, the area guide information indicating the guide area specified in S4 (S5). Further, the display area notification unit 13 causes the display unit 17 to display, via the combining unit 15, display area information indicating the display area specified in S2. The area guide information and the display area information are thereby displayed superimposed on the omnidirectional image 21. Note that the area guide information and the display area information may be displayed in separate steps, and the processing of S3 and S4 may be performed before S2 or in parallel with S2.
  • Next, the guide unit 14 determines whether or not the area guide information displayed in S5 has been selected (S6). For example, the guide unit 14 may determine that the area guide information has been selected when the instructor selects the displayed area guide information by an input operation via the input unit 18.
  • the method for selecting the area guide information is not particularly limited. For example, the area guide information may be selected by clicking or tapping the displayed area guide information.
  • It may be determined that the area guide information is not selected when it remains unselected for a predetermined time, or when an input operation by which the instructor chooses not to select it is detected. If it is determined in S6 that the area guide information is selected (YES in S6), the process proceeds to S7; if it is determined that it is not selected (NO in S6), the process returns to S1.
  • In S7, the guide unit 14 notifies the display device 3 of the guide area indicated by the selected area guide information, and instructs the display device 3 to set that area as its display area. The display area of the display device 3 is thereby changed, and an image of the guide area indicated by the area guide information is displayed. Thereafter, the process returns to S1.
  • the guide area may be set by automatically detecting an object from the omnidirectional image.
  • a predetermined area including the display area of the detected object may be extracted as the guide area.
  • This allows the instructor to easily confirm whether or not the trainee is looking at a predetermined object in the content.
  • For example, the appearance (shape, size, color, etc.) of the object to be detected may be determined in advance; by analyzing the omnidirectional image, an object having such an appearance can then be detected automatically (see the sketch below).
  • the object may be detected using machine learning or the like.
  • the function of the detection unit for detecting an object may be provided in the guide unit 14, or a processing block of the detection unit may be provided separately and an object may be detected by the processing block.
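  • A guide area derived from a detected object, that is, "a predetermined area including the display area of the detected object", could be computed as sketched below. The margin value is an assumption; the patent says only that the area is predetermined.

```python
def guide_area_from_object(obj_azimuth_range, obj_elevation_range,
                           margin_deg=10.0):
    """Expand a detected object's bounding region by a margin to obtain a
    guide area containing the object's display area."""
    (az_l, az_r), (el_b, el_t) = obj_azimuth_range, obj_elevation_range
    return ((max(az_l - margin_deg, -180.0), min(az_r + margin_deg, 180.0)),
            (max(el_b - margin_deg, -90.0), min(el_t + margin_deg, 90.0)))

# A hypothetical object occupying azimuth -60..-40 deg, elevation 0..15 deg:
print(guide_area_from_object((-60, -40), (0, 15)))
```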
  • FIG. 7 is a flowchart illustrating an example of processing in which the display device 3 displays an image.
  • First, the line-of-sight direction specifying unit 31 specifies the line-of-sight direction of the user wearing the display device 3, and the omnidirectional image drawing unit 32 specifies the display area in the omnidirectional image from the line-of-sight direction specified by the line-of-sight direction specifying unit 31 (S11).
  • Next, the omnidirectional image drawing unit 32 causes the display unit 35, via the combining unit 33, to draw (display) the partial image of the omnidirectional image corresponding to the specified display area (S12).
  • The display area notification unit 34 then notifies the display control device 1 of the specified display area (S13).
  • Specifically, the display area notification unit 34 transmits to the display control device 1 identification information of the display device 3 or of its user, in association with information indicating the range occupied by the display area of the display device 3 in the omnidirectional image (for example, information indicating the azimuth angle range and the elevation angle range); a payload sketch follows below.
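  • The notification of S13 could be serialized as follows. The JSON layout and field names are assumptions for illustration; the patent specifies only that identification information and the occupied range are associated.

```python
import json

def build_display_area_notification(user_id, azimuth_range, elevation_range):
    """Associate identification information with the range the display
    area occupies in the omnidirectional image (S13)."""
    return json.dumps({
        "user_id": user_id,                           # or a device identifier
        "azimuth_range_deg": list(azimuth_range),     # e.g. [-85, -5]
        "elevation_range_deg": list(elevation_range), # e.g. [25, 75]
    })

# Sent by the display area notification unit 34 via the communication
# unit 38 to the display control device 1:
print(build_display_area_notification("X1", (-85, -5), (25, 75)))
```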
  • Next, the omnidirectional image drawing unit 32 determines whether or not a display area change instruction has been received from the display control device 1 (S14). If it is determined that no instruction has been received (NO in S14), the process returns to S11; if it is determined that one has been received (YES in S14), the process proceeds to S15.
  • In S15, the omnidirectional image drawing unit 32 changes the display area according to the instruction received from the display control device 1, and causes the display unit 35, via the combining unit 33, to draw (display) the partial image of the omnidirectional image corresponding to the changed display area. Thereafter, the process returns to S11.
  • Note that in S15 the omnidirectional image drawing unit 32 may change the display area after performing any one of the image processes (1) to (4). This makes the user of the display device 3 less likely to experience motion sickness even when the display area changes suddenly.
  • For this purpose, the display control device 1 may determine whether the amount of change in the position of the display area before and after the change is greater than or equal to a predetermined value; when it determines that it is, the display control device 1 may instruct the display device 3 to execute one of the image processes (1) to (4) (a sketch of such a check follows below).
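  • A threshold check of this kind might look like the following. The metric (angular distance between area centers) and the threshold value are assumptions; the patent leaves the "predetermined value" and the position measure open.

```python
def change_exceeds_threshold(old_area, new_area, threshold_deg=60.0):
    """Check whether the display-area position changed by at least a
    predetermined value, measured here between area centres in degrees."""
    def center(area):
        (az_l, az_r), (el_b, el_t) = area
        return ((az_l + az_r) / 2, (el_b + el_t) / 2)
    (az0, el0), (az1, el1) = center(old_area), center(new_area)
    d_az = min(abs(az1 - az0), 360.0 - abs(az1 - az0))  # shortest arc
    return max(d_az, abs(el1 - el0)) >= threshold_deg

# A jump from azimuth range (-85, -5) to (95, 175) is a 180 deg swing:
print(change_exceeds_threshold(((-85, -5), (25, 75)), ((95, 175), (25, 75))))
```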
  • FIG. 8 is a diagram illustrating an image displayed by the display control device 1 in which a superimposed image is superimposed on an omnidirectional image.
  • display area information A11 to A13 and area guide information B1 and B2 are displayed superimposed on the omnidirectional image A1.
  • In FIG. 8, area information C1 to C3 is also superimposed on the omnidirectional image A1.
  • the area information C1 and C2 is superimposition area information indicating a superimposition area that is an area in which the superimposed image is superimposed and displayed.
  • the area information C3 is superposition prohibition area information indicating a superposition prohibition area that is an area where a superimposed image is not displayed.
  • the instructor who uses the display control device 1 can select the region information C1 or C2 and display the superimposed image in the superimposed region indicated by the region information C1 or C2.
  • the superimposed image may be an image related to the content to be superimposed (the omnidirectional image A1 in this example).
  • the superimposed image may be a moving image content, a still image content, or an annotation displayed as an annotation regarding the omnidirectional image A1 or another superimposed image.
  • The superimposed image is an image capturing the same imaging target as the omnidirectional image A1 (in this example, the cityscape), but may be an image captured by an imaging device different from the one that captured the omnidirectional image A1. Further, the superimposed image may be captured at a higher resolution than the omnidirectional image A1, and thus be of higher definition than the omnidirectional image A1.
  • the superimposed image may be, for example, an image obtained by photographing the imaging target of the omnidirectional image A1 at the same angle as the omnidirectional image A1, or an image (for example, a specific image) taken at an angle different from the omnidirectional image A1. Image of the viewpoint). Further, the superimposed image may be an image obtained by enlarging a part of the imaging target of the omnidirectional image A1.
  • This makes it possible to give the user multifaceted information about the imaging target of the omnidirectional image A1. For example, by using as a superimposed image a high-resolution enlargement of a specific building in the guide area of the area guide information B1, the user can check part of that building in detail while viewing the entire cityscape. Further, by using as a superimposed image an image of a specific building captured at an angle different from that of the omnidirectional image A1, the user can also check portions of the building that are not captured at the imaging angle of the omnidirectional image A1.
  • the content of the annotation is not particularly limited as long as it relates to the omnidirectional image A1 or the superimposed image.
  • the information regarding the omnidirectional image A1 may be information indicating the state of the imaging target, the operation, the name, the notable part of the omnidirectional image A1, and the like.
  • examples of the information related to the superimposed image include the angle at which the image is captured, whether the image is an enlarged image, and which portion of the image to be imaged.
  • a UI (User Interface) menu for operating the display control device 1 or a UI menu for operation control of the display device 3 may be displayed as an annotation.
  • the omnidirectional image A1 is an image in which the cityscape is an imaging target, but the imaging target is arbitrary.
  • the omnidirectional image A1 may be an image obtained by imaging the entire state of the operating room where the operation is performed.
  • the imaging target may include a surgeon, an assistant, a patient, a surgical instrument, various devices, and the like.
  • the display control device 1 and the display device 3 can be used for medical education.
  • For example, while the instructor checks the progress of the entire operation in the omnidirectional image A1, a superimposed image from the surgeon's point of view may be displayed when the system is used for surgeon education, and a superimposed image from the assistant's point of view when used for assistant education.
  • Further, by using as a superimposed image an image capturing a screen that displays the patient's vital data, the user can recognize the relationship between the transition of the vital data during surgery and the movements of each person.
  • A high-resolution image of the operative field may also be used as a superimposed image, allowing the user to recognize the details of the surgeon's fine work.
  • information necessary for surgery, device operation information (for example, on / off of heart-lung machine), and the like may be displayed as annotations.
  • the area management information that defines the overlapping area and the overlapping prohibited area as described above may be stored in the storage unit 20 or the like.
  • the superimposition processing unit 16 can specify the superimposition region and the superposition prohibition region by referring to the stored region management information.
  • the area management information may be information as shown in FIG. 9, for example.
  • FIG. 9 is a diagram illustrating an example of region management information that defines a superimposition region and a superposition prohibition region.
  • The area management information in (a) of FIG. 9 is data in a table format in which “area”, “azimuth angle range”, “elevation angle range”, “superimposition/prohibition”, and “reproduction time” information is associated.
  • “Area” is identification information of a superimposition area or a superposition prohibition area, and in this example, the name of the area is described.
  • “Azimuth angle range” and “elevation angle range” are information indicating the range that the superimposition area or superposition prohibition area occupies in the omnidirectional image. For example, the illustrated area management information defines one area as a rectangular region whose left-end azimuth angle is −100°, right-end azimuth angle is −80°, lower-end elevation angle is 15°, and upper-end elevation angle is 30°.
  • “Superposition / prohibition” is information indicating whether the “region” is a superposition region or a superposition prohibition region.
  • a region where the “superimposition / prohibition” information is “superimposition” is a superimposition region, and a region where “prohibition” is a superimposition prohibition region.
  • “Reproduction time” is information indicating the reproduction time zone for which the superimposition area or superposition prohibition area is set. For example, since the “reproduction time” of the illustrated area C1 (the area indicated by the area information C1 in FIG. 8) is 00:01:00 to 00:05:00, this area C1 is set for the time zone in which the content reproduction time is 00:01:00 to 00:05:00.
  • The area management information in (b) of FIG. 9 replaces the “azimuth angle range” and “elevation angle range” of (a) of FIG. 9 with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in (b) of FIG. 9, “width” and “height” indicate the width and height of the superimposition area or superposition prohibition area, while “azimuth angle” and “elevation angle” indicate a reference position for specifying the area.
  • The reference position can be an arbitrary position on the superimposition area or superposition prohibition area. For example, the position of the lower-left corner may be used as the reference position; in that case, the area is the region whose lower-left corner is at the position indicated by “azimuth angle” and “elevation angle” and which has the width and height indicated by “width” and “height”.
  • the superimposition area may have a size that fits within the display area.
  • The superimposition area is narrower than the display area, which is itself only part of the image area of the omnidirectional image. For this reason, the content may be enlarged and displayed when the instructor selects a superimposition area; compared with the example of FIG. 8, this makes the superimposition area easier to select. This will be described with reference to FIG. 10.
  • FIG. 10 is a diagram showing an example in which the superimposition area information and the superposition prohibition area information are displayed after the content is enlarged and displayed.
  • In FIG. 10, the partial image of the guide area B1 of FIG. 8 is displayed on the entire screen of the display unit 17 of the display control device 1.
  • the area information C1 and C3 can be confirmed in more detail than the example of FIG.
  • Although the content is enlarged to the entire screen here, the content enlarged at an arbitrary enlargement ratio may instead be superimposed on the original content.
  • The display modes of the area information C1 indicating a superimposition area and the area information C3 indicating a superposition prohibition area differ. Specifically, the area information C3 is marked with an “X” indicating that superposition is prohibited, so the instructor can intuitively distinguish superimposition areas from superposition prohibition areas.
  • The area information C3 indicates an area set to include the display area of the object D1; thus, the object D1 is never hidden even when a superimposed image is displayed.
  • the superposition prohibition area may be set by automatically detecting the object D1 from the omnidirectional image. In this case, a predetermined range including the display area of the detected object D1 may be set as the superposition prohibition area.
  • the appearance (shape, size, color, etc.) of the object to be detected may be determined in advance. Thus, by analyzing the omnidirectional image, an object having such an appearance can be automatically detected. Further, the object may be detected using machine learning or the like.
  • Such an object detection may be configured to be performed by the superimposition processing unit 16 or may be configured to separately provide a processing block for detecting an object and detect the object in the processing block.
  • FIG. 11 is a flowchart illustrating an example of processing when the display control device 1 displays a superimposed image on the display device 3. Note that FIG. 11 illustrates processing after the omnidirectional image is displayed on the display unit 17. In the following, an example in which there is one display device 3 will be described, but the processing in the case where there are a plurality of display devices 3 is the same as the following.
  • First, the display area specifying unit 12 specifies the display area of the display device 3 by communicating with the display device 3 (S21).
  • Next, the superimposition processing unit 16 extracts superimposition areas in the displayed omnidirectional image (S22). Specifically, among the areas indicated in the area management information (see FIG. 9), the superimposition processing unit 16 extracts those that correspond to the playback time of the content and whose “superimposition/prohibition” information is “superimposition”.
  • In S22, only the display area specified in S21 may be the extraction target, or a wider range (for example, the display area and a predetermined area around it) may be the extraction target. In the latter case, an area indicated in the area management information (see FIG. 9), that is, an area specified by an azimuth angle range and an elevation angle range, is extracted if at least part of it is included in the extraction target area (an extraction sketch follows below).
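  • The S22 extraction can be sketched as below. The dictionary layout and seconds-based times are assumptions; the example values reuse area C1 from FIG. 9(a) and user X1's display area from FIG. 5(a).

```python
def extract_superimposition_areas(areas, playback_s, target):
    """S22: keep areas marked "superimposition" whose reproduction time
    zone contains the playback time and at least part of which lies in
    the extraction target area."""
    def overlap(a, b):  # 1-D interval intersection
        return a[0] <= b[1] and b[0] <= a[1]

    t_az, t_el = target
    return [a for a in areas
            if a["kind"] == "superimposition"
            and a["start_s"] <= playback_s <= a["end_s"]
            and overlap(a["azimuth_range"], t_az)
            and overlap(a["elevation_range"], t_el)]

AREAS = [{"name": "C1", "kind": "superimposition",
          "azimuth_range": (-100, -80), "elevation_range": (15, 30),
          "start_s": 60, "end_s": 300}]

# Area C1 against user X1's display area at playback time 00:02:00:
print(extract_superimposition_areas(AREAS, 120, ((-85, -5), (25, 75))))
```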
  • Next, the superimposition processing unit 16 determines whether there is a superimposed image to be displayed in any superimposition area extracted in S22 (S23).
  • When area management information as shown in FIG. 9 is set in advance, the superimposed image to be displayed in each area may also be set in advance.
  • the superimposition processing unit 16 determines the presence or absence of a superimposed image associated with the superimposition region extracted in S22. If it is determined in S23 that there is a superimposed image (YES in S23), the process proceeds to S24. On the other hand, if it is determined that there is no superimposed image (NO in S23), the process returns to S21.
  • In S24, the superimposition processing unit 16 acquires information indicating the display mode of the superimposed image. Such information may be stored in advance in association with the superimposed image.
  • The information indicating the display mode includes, for example, information indicating the presence or absence of perspective (perspective display, that is, display by perspective projection), the transmittance of the superimposed image, and the decoration method for superimposition (for example, whether image processing for blurring the outline of the superimposed image is applied).
  • Next, the superimposition processing unit 16 causes the display unit 17 to display superimposition area information indicating the superimposition areas extracted in S22 (S25). At this time, the superimposition processing unit 16 may enlarge and display an image area including the superimposition area (see FIG. 10).
  • Next, the superimposition processing unit 16 receives selection of a superimposition area and a superimposed image (S26).
  • the selection of the overlapping area may be performed by selecting the displayed overlapping area information by an input operation to the input unit 18, for example. Further, when a plurality of superimposed images are associated with the selected superimposed region, the plurality of superimposed images may be displayed and selected by the instructor.
  • Next, the combining unit 15 combines the superimposed image into the superimposition area selected in S26 and draws (displays) the result on the display unit 17. Note that the superimposed image may be displayed as a preview before S26.
  • Next, the superimposition processing unit 16 notifies the display device 3 of the superimposition area and superimposed image selected in S26, and instructs the display device 3 to superimpose and display this superimposed image on that superimposition area. The superimposed image is thereby superimposed and displayed on the superimposition area on the display device 3 as well.
  • the superimposition processing unit 16 may transmit the superimposed image to the display device 3. Thereafter, the process returns to S21.
  • In S22, superposition prohibition areas may also be extracted, and in S25, superposition prohibition area information indicating them may be displayed.
  • Displaying the superposition prohibition area is useful, for example, (1) when the instructor can freely set the position of the superimposition area, (2) when the position of the superimposition area is determined based on the position of an object in the content, and (3) when the position of the superimposition area is determined based on the position of the display area.
  • In case (2) above, for example, an object having a predetermined appearance (a building in the example of FIG. 8) is detected in the content, and a superimposed image is displayed at a position shifted by a predetermined offset from the object (for example, from a reference position in the object). Even in this case, by setting the superimposition area so as to avoid any superposition prohibition area, the superimposed image can be displayed in association with the object without the superposition prohibition area being hidden by the superimposed image (a placement sketch follows below).
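  • Offset-based placement that avoids prohibition areas could work as sketched below. The offset, nudge step, and degree units are assumptions; the patent specifies only a predetermined offset and avoidance of the prohibition area.

```python
def place_superimposed_image(obj_ref, size_deg, prohibited,
                             offset_deg=(15.0, 0.0), step_deg=5.0):
    """Place a superimposed image at a predetermined offset from the
    object's reference position, nudging it along the azimuth until its
    area avoids every superposition prohibition area."""
    def area_at(az, el):
        w, h = size_deg
        return ((az - w / 2, az + w / 2), (el - h / 2, el + h / 2))

    def intersects(a, b):
        return (a[0][0] <= b[0][1] and b[0][0] <= a[0][1] and
                a[1][0] <= b[1][1] and b[1][0] <= a[1][1])

    az, el = obj_ref[0] + offset_deg[0], obj_ref[1] + offset_deg[1]
    for _ in range(24):  # bounded search to avoid looping forever
        candidate = area_at(az, el)
        if not any(intersects(candidate, p) for p in prohibited):
            return candidate
        az += step_deg
    return None  # no valid placement found

# Object reference at (-50, 20), a 20x10 deg image, one prohibition area:
print(place_superimposed_image((-50.0, 20.0), (20.0, 10.0),
                               [((-45, -25), (10, 25))]))
```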
  • FIG. 12 is a block diagram illustrating an example of a main configuration of the display control device 1 and the display device 3 configuring the control system 5 according to an embodiment of the present invention.
  • the display control device 1 is different from the display control device 1 of each embodiment described above in that it includes an image transmission unit 40.
  • the display device 3 is different from the display device 3 of each of the above embodiments in that the omnidirectional image drawing unit 32 is not provided but the image acquisition unit 45 is provided.
  • the image transmission unit 40 transmits an image to the display device 3 via the communication unit 19.
  • the image to be transmitted is a partial image corresponding to the display area of the display device 3 in the omnidirectional image.
  • the display area is specified by the display area specifying unit 12 based on a notification from the display area notifying unit 34.
  • the image acquisition unit 45 acquires the image transmitted by the display control device 1 via the communication unit 38 and causes the display unit 35 to display the image. Accordingly, the display device 3 can display a partial image of the omnidirectional image or an image in which the superimposed image is superimposed on the omnidirectional image without storing the omnidirectional image or the superimposed image.
  • FIG. 13 is a diagram showing an overview of a control system 50 according to Embodiment 4 of the present invention.
  • the control system 50 includes a display control device 1, a display device 3, and a server 7.
  • the control system 50 may include a plurality of display devices 3.
  • The display control device 1, the display device 3, and the server 7 are communicably connected via a predetermined communication network (for example, the Internet). Therefore, the control system 50 can be used even when the user of the display control device 1 (the instructor) and the user of the display device 3 (the trainee) are in locations remote from each other.
  • the server 7 stores various information used in the control system 50, and transmits the information to the display control device 1 and the display device 3.
  • The server 7 includes a communication unit for the server 7 to communicate with other devices (in the present embodiment, the display control device 1 and the display device 3), a control unit that controls each unit of the server 7, and a storage unit that stores various information.
  • The control unit of the server 7 includes an omnidirectional image transmission unit that transmits the omnidirectional image 21 in response to a request from another device. The display control device 1 and the display device 3 can therefore acquire and display the omnidirectional image 21 from the omnidirectional image transmission unit, and need not store the omnidirectional image 21 themselves.
  • The control unit of the server 7 also includes a guide information transmission unit that transmits the guide information 22 in response to a request from another device. The display control device 1 can therefore acquire the guide information 22 from the guide information transmission unit and display the area guide information, and need not store the guide information 22 itself.
  • FIG. When the displayed area guide information is selected, the display control apparatus 1 notifies the display apparatus 3 of the guide area indicated by the selected area guide information. The display device 3 acquires the notified partial image of the guide area from the server 7 and displays it.
  • the control unit of the server 7 likewise includes a superimposed image transmission unit that transmits the superimposed image 23 in response to a request from another device. The display control device 1 and the display device 3 can therefore acquire the superimposed image 23 from this unit and display it, and do not need to store it locally. The area management information (see FIG. 9) can similarly be obtained from the server 7.
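  • One way to picture the role of the server 7 is as a small request/response service keyed by asset type, with one transmission unit per asset. The sketch below is purely illustrative: the asset names and the in-memory store are assumptions, and a real deployment would serve actual image data over a network rather than placeholder strings.

```python
# Hypothetical sketch of server 7: each transmission unit answers a typed request.
ASSETS = {
    "omnidirectional_image": b"<equirectangular image bytes>",
    "guide_information":     b"<guide area list>",
    "superimposed_image":    b"<overlay image bytes>",
    "area_management_info":  b"<area management table>",
}

def handle_request(asset_type: str) -> bytes:
    """Control-unit dispatch: return the requested asset, or fail for unknown types."""
    try:
        return ASSETS[asset_type]
    except KeyError:
        raise ValueError(f"unknown asset type: {asset_type}") from None

# A device that fetches on demand needs no local copy of the omnidirectional image:
image_bytes = handle_request("omnidirectional_image")
```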
  • in the above embodiments, an example has been described in which the display control device 1 displays a rectangular planar image obtained by mapping the omnidirectional image onto a two-dimensional plane. However, the display control device 1 need only display an image that includes at least the display area of the display device 3, and is not limited to this example. For example, when displaying an omnidirectional image captured by two fisheye cameras, the image captured by each fisheye camera may be mapped onto a two-dimensional plane to generate two circular planar images, which are then displayed.
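  • For the rectangular planar image case, the mapping in question is typically an equirectangular projection, in which each viewing direction (longitude, latitude) maps linearly to a pixel position. The sketch below shows that mapping under the assumption of a W×H equirectangular image; the embodiment itself does not fix a particular projection.

```python
def direction_to_equirect(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction to equirectangular pixel coordinates (a sketch).

    yaw_deg: longitude in [-180, 180), 0 = centre column of the image.
    pitch_deg: latitude in [-90, 90], +90 = top row of the image.
    """
    x = (yaw_deg + 180.0) / 360.0 * width      # linear in longitude
    y = (90.0 - pitch_deg) / 180.0 * height    # linear in latitude
    return int(x) % width, min(int(y), height - 1)

# Looking straight ahead lands in the centre of a 4096x2048 image:
print(direction_to_equirect(0.0, 0.0, 4096, 2048))  # -> (2048, 1024)
```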
  • the content need only allow a partial image of a designated display area, i.e. a specified part of the entire image area, to be displayed, and is not limited to an omnidirectional (spherical) image. For example, the image may be a hemispherical image, or a planar image (such as a panoramic photograph) whose display size does not fit on one screen of the display device 3.
  • the content is not limited to content intended for education or instruction.
  • the content may be a single still image, or may include a plurality of still images as components. In the latter case, the still images displayed on the display control device 1 and the display device 3 are synchronized.
  • the display area designation method is not particularly limited.
  • the display area may be specified using a controller or the like.
  • the display area of the display device 3 may be changed by an instruction from the display control device 1, or the user of the display device 3 may instead be prompted to change the display area.
  • the user can be prompted to change the display area by causing the display device 3 to display a superimposed image or an annotation containing a message prompting display of the guide area, or a symbol (such as an arrow) indicating the direction in which the line of sight should be moved to bring the guide area into view.
  • if the display device 3 has a sound output function, the user can also be prompted to change the display area by causing the display device 3 to output a sound.
  • for example, a message such as "Move your line of sight to the right and pay attention to the triangular building" may be displayed as an annotation. Alternatively, a message that guides the user's line of sight toward a predetermined direction, such as "Please move your line of sight slightly to the left so that the low cylindrical building is located in the center of your field of view", may be displayed.
  • the user may also be prompted to display the guide area by changing the display mode of the guide area relative to the other image areas. For example, the display brightness of the guide area may be made higher than that of the other areas; in that case, the user can easily find the guide area by moving the line-of-sight direction, as in the sketch below.
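  • A minimal version of this brightness-based guidance might simply dim every pixel outside the guide area so that the guide area stands out. The sketch below assumes a grayscale image stored as nested lists and a rectangular guide area; both are illustrative simplifications.

```python
def highlight_guide_area(pixels, left, top, width, height, dim=0.5):
    """Return a copy of a grayscale image with everything outside the guide area dimmed."""
    out = []
    for r, row in enumerate(pixels):
        new_row = []
        for c, v in enumerate(row):
            inside = top <= r < top + height and left <= c < left + width
            new_row.append(v if inside else int(v * dim))
        out.append(new_row)
    return out

# The guide area keeps full brightness; the rest of the image is halved:
img = [[200] * 6 for _ in range(4)]
print(highlight_guide_area(img, left=2, top=1, width=2, height=2))
```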
  • the control blocks of the display control device 1 and the display device 3 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • in the latter case, the display control device 1 and the display device 3 each include a CPU that executes the instructions of the program, i.e. software realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU), and a RAM (Random Access Memory) into which the program is loaded.
  • the computer or CPU reads the program from the recording medium and executes the program, thereby achieving the object of one embodiment of the present invention.
  • as the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • a display control device (1) according to aspect 1 of the present invention is a display control device that displays content on a first display device (display unit 17), and includes: a display area specifying unit (12) for specifying which area of the image area of the content a second display device (display device 3), which displays a partial image of a designated display area of that image area, uses as its display area; and a display area notification unit (13) for displaying, on the first display device, display area information indicating the area specified by the display area specifying unit.
  • according to the above configuration, the display area of the second display device is specified, and display area information indicating that area is displayed on the first display device. Therefore, the user of the first display device can recognize the display area of the second display device. Note that the first display device need only display an area of the content that includes at least the display area; it may display the entire image area of the content, or a narrower area.
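  • Read as software, aspect 1 amounts to two cooperating units. The sketch below models them as plain classes; the callback used to receive notifications from the second display device, the tuple representation of an area, and the drawing call are assumptions, not part of the aspect.

```python
# Illustrative decomposition of aspect 1 (names and types are assumptions).
class DisplayAreaSpecifyingUnit:
    """Unit (12): records which area the second display device reports as shown."""
    def __init__(self):
        self.current_area = None

    def on_notification(self, area):           # called when display device 3 reports
        self.current_area = area
        return area

class DisplayAreaNotificationUnit:
    """Unit (13): draws display area information (e.g. a frame) on display unit 17."""
    def __init__(self, first_display):
        self.first_display = first_display

    def show(self, area):
        self.first_display.draw_frame(area)    # hypothetical drawing call

class FirstDisplayStub:
    def draw_frame(self, area):
        print(f"frame drawn at {area}")

specifier = DisplayAreaSpecifyingUnit()
notifier = DisplayAreaNotificationUnit(FirstDisplayStub())
notifier.show(specifier.on_notification((100, 50, 640, 360)))  # (x, y, w, h)
```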
  • the display control device according to aspect 2 of the present invention may be configured such that, in the above aspect 1, it includes a display area control unit (guide unit 14) that moves the image area displayed by the second display device when the display area information is moved on the content displayed on the first display device. According to the above configuration, the user of the first display device can cause a partial image of a given image area to be displayed on the second display device by moving the display area information to the image area that the user of the second display device should view.
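  • Aspect 2 can be pictured as a drag handler: when the frame marking the display area is moved on the first display device, the new position is pushed to the second display device, which re-centres its view. The event and messaging plumbing below is hypothetical.

```python
# Hypothetical drag handler for aspect 2: moving the frame moves device 3's view.
class DisplayAreaControlUnit:
    """Guide unit (14): forwards frame moves to the second display device."""
    def __init__(self, send_to_device):
        self.send_to_device = send_to_device   # e.g. a network send function

    def on_frame_moved(self, new_x, new_y):
        # The second display device re-centres its display area accordingly.
        self.send_to_device({"type": "set_display_area", "x": new_x, "y": new_y})

control = DisplayAreaControlUnit(send_to_device=print)
control.on_frame_moved(1200, 400)   # the instructor drags the frame
```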
  • the display control device according to aspect 3 of the present invention may be configured such that, in the above aspect 1, it includes a guide unit (14) that causes the first display device to display area guide information indicating an image area to be displayed on the second display device.
  • according to the above configuration, the user of the first display device can recognize, from the area guide information, the image area to be displayed on the second display device. Further, as described above, the user of the first display device can recognize the display area of the second display device from the display area information. Therefore, the user of the first display device can recognize whether the image area to be displayed on the second display device matches the actual display area of the second display device.
  • the display control device according to aspect 4 of the present invention may be configured such that, in the above aspect 3, it includes a display area control unit (guide unit 14) that causes the second display device to display the image area indicated by the area guide information when that area guide information is selected.
  • according to the above configuration, the user of the first display device can cause the partial image of that image area to be displayed on the second display device by selecting the area guide information. The user of the second display device can thus view the partial image of the image area that should be displayed on the second display device.
  • the display control device according to aspect 5 of the present invention may be configured such that, in the above aspect 3 or 4, it includes a detection unit that detects an object having a predetermined appearance from the content, and the guide unit causes the first display device to display area guide information indicating an image area that includes the object detected by the detection unit.
  • according to the above configuration, area guide information indicating an image area that includes an object having a predetermined appearance in the content is displayed on the first display device. Therefore, the user of the first display device can recognize whether the display area of the object coincides with the display area of the second display device, that is, whether the user of the second display device is viewing the object.
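  • The detection of an "object having a predetermined appearance" could be realized with any detector; as one concrete possibility, the sketch below uses naive template matching (sum of absolute differences) over a grayscale image. This choice of detector, and all names, are illustrative assumptions.

```python
def find_object(image, template):
    """Return (row, col) of the best template match by sum of absolute differences.

    image, template: grayscale nested lists. The matched region would then be the
    image area that the guide unit advertises as area guide information.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + dr][c + dc] - template[dr][dc])
                      for dr in range(th) for dc in range(tw))
            if best_score is None or sad < best_score:
                best_score, best_pos = sad, (r, c)
    return best_pos

img = [[0, 0, 0, 0], [0, 9, 8, 0], [0, 7, 9, 0], [0, 0, 0, 0]]
print(find_object(img, [[9, 8], [7, 9]]))  # -> (1, 1)
```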
  • the display control device according to aspect 6 of the present invention may be configured such that, in any one of the above aspects 1 to 5, it includes a superimposition processing unit (16) that superimposes a superimposed image related to the content on the partial image and causes the second display device to display the result, the superimposition area in which the superimposed image is superimposed on the partial image being an area designated in the content displayed by the first display device. According to the above configuration, the user of the first display device can cause the superimposed image to be displayed on the partial image displayed on the second display device by designating an area in the content displayed by the first display device.
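  • The superimposition of aspect 6 reduces to compositing the overlay into the designated region of the partial image. The grayscale alpha-blend sketch below, with all names and the blending formula out = a*overlay + (1-a)*base, is assumed for illustration only:

```python
def superimpose(partial, overlay, left, top, alpha=0.7):
    """Blend 'overlay' into 'partial' at (left, top); returns a modified copy."""
    out = [row[:] for row in partial]
    for r, orow in enumerate(overlay):
        for c, v in enumerate(orow):
            base = out[top + r][left + c]
            out[top + r][left + c] = int(alpha * v + (1 - alpha) * base)
    return out

# A bright 2x2 overlay blended into a uniform background at (1, 1):
base = [[100] * 5 for _ in range(4)]
print(superimpose(base, [[255, 255], [255, 255]], left=1, top=1))
```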
  • a second display device (display device 3) according to aspect 7 of the present invention is a second display device that can communicate directly or indirectly with a display control device (1) that displays content on a first display device (display unit 17), and that displays a partial image of a designated display area of the image area of the content; it includes a display area identifying unit that identifies which area of the image area is designated as the display area, and a notification unit (34) that notifies the display control device of display area information indicating the identified area.
  • the display area of the second display device is specified, the display area information indicating the specified area is notified to the display control device, and the display area information is displayed on the first display device. Therefore, the display area of the second display device can be recognized by the user of the first display device.
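  • On the device side, the notification unit of aspect 7 only needs to serialize the identified display area and send it upstream whenever the area changes (for example, when the wearer of a head-mounted display turns their head). The JSON payload and transport below are assumptions.

```python
import json

class DisplayAreaNotifier:
    """Notification unit (34): reports the identified display area upstream."""
    def __init__(self, send_bytes):
        self.send_bytes = send_bytes           # e.g. socket.sendall in practice

    def notify(self, yaw_deg, pitch_deg, fov_deg):
        payload = {"type": "display_area",
                   "yaw": yaw_deg, "pitch": pitch_deg, "fov": fov_deg}
        self.send_bytes(json.dumps(payload).encode("utf-8"))

# Head turned 30 degrees to the right with a 90-degree field of view:
DisplayAreaNotifier(print).notify(yaw_deg=30.0, pitch_deg=0.0, fov_deg=90.0)
```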
  • a control method of a display control device (1) according to aspect 8 of the present invention is a control method of a display control device that displays content on a first display device (display unit 17), and includes: a display area specifying step of specifying which area of the image area of the content the second display device, which displays a partial image of a designated display area of that image area, uses as its display area; and a display area notification step of displaying, on the first display device, display area information indicating the specified area.
  • the display control device (1) according to each aspect of the present invention may be realized by a computer. In this case, a control program for the display control device that realizes the display control device (1) by the computer by causing the computer to operate as each unit (software element) included in the display control device (1), and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
  • similarly, the second display device (3) according to the above aspect of the present invention may be realized by a computer. In this case, a control program for the second display device that realizes the second display device (3) by the computer by causing the computer to operate as each unit (software element) included in the second display device (3), and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention enables a user to recognize the display area of a display device used by another user. This display control device (1) comprises: a display area identifying unit (12) for identifying which area of an image area a display device (3) uses as its display area, said display device (3) displaying a partial image of a designated display area within the image area of an omnidirectional image (21); and a display area notification unit (13) for causing a display unit (17) to display display area information indicating the area identified by the display area identifying unit (12).
PCT/JP2017/044284 2017-01-24 2017-12-11 Display control device, second display device, method for controlling display control device, and control program WO2018139073A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017010316 2017-01-24
JP2017-010316 2017-01-24

Publications (1)

Publication Number Publication Date
WO2018139073A1 true WO2018139073A1 (fr) 2018-08-02

Family

ID=62978209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/044284 WO2018139073A1 (fr) 2017-01-24 2017-12-11 Display control device, second display device, method for controlling display control device, and control program

Country Status (1)

Country Link
WO (1) WO2018139073A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014174401A (ja) * 2013-03-11 2014-09-22 Seiko Epson Corporation Image display system and head-mounted display device
WO2016002445A1 (fr) * 2014-07-03 2016-01-07 Sony Corporation Information processing device, information processing method, and program
JP2016038876A (ja) * 2014-08-11 2016-03-22 Casio Computer Co., Ltd. Image input device, image output device, and image input/output system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020050058A1 (fr) * 2018-09-07 2020-03-12 Sony Corporation Content distribution system, content distribution method, and program
US11470395B2 (en) 2018-09-07 2022-10-11 Sony Corporation Content distribution system and content distribution method
WO2020054456A1 (fr) * 2018-09-14 2020-03-19 Sony Corporation Display control device, display control method, and program
JP7647554B2 (ja) 2019-07-03 2025-03-18 Sony Group Corporation File generation device, file generation method, reproduction processing device, and reproduction processing method
JP2021087136A (ja) * 2019-11-28 2021-06-03 Ricoh Company, Ltd. Communication terminal, image capturing system, image processing method, and program
JP7400407B2 (ja) 2019-11-28 2023-12-19 Ricoh Company, Ltd. Communication terminal, image capturing system, image processing method, and program

Similar Documents

Publication Publication Date Title
WO2018101227A1 (fr) Display control device, head-mounted display, control method for display control device, and control program
EP3019939B1 (fr) Display control apparatus and computer-readable recording medium
EP1404126B1 (fr) Video combining apparatus and method
US20160210785A1 (en) Augmented reality system and method for positioning and mapping
WO2018139073A1 (fr) Display control device, second display device, method for controlling display control device, and control program
US20160292923A1 (en) System and method for incorporating a physical image stream in a head mounted display
JP2007042073A (ja) Video presentation system, video presentation method, program for causing a computer to execute the video presentation method, and storage medium
TW201708883A (zh) Electronic system, portable display device, and guiding device
JP6422584B2 (ja) Information processing device
JP2009267729A (ja) Video processing device, video processing method, program, and recording medium
JP2005174021A (ja) Information presentation method and apparatus
KR20210100170A (ko) Electronic device and control method therefor
KR102200115B1 (ko) Multi-view 360-degree VR content providing system
WO2018168823A1 (fr) Image processing device and electronic apparatus
JP6262283B2 (ja) Method, program, and recording medium for providing a virtual space
US20240264662A1 (en) Head mounted information processing apparatus and head mounted display system
JP7005160B2 (ja) Electronic device and control method therefor
JP6442619B2 (ja) Information processing device
CN112053444B (zh) Method for superimposing virtual objects based on an optical communication apparatus, and corresponding electronic device
WO2018139147A1 (fr) Control device, head-mounted display, control method for control device, and control program
KR101315398B1 (ko) Three-dimensional augmented reality display device and method
JP2010063076A (ja) Image processing device and image processing device program
JP5172794B2 (ja) Video communication system, method, and program
WO2018168825A1 (fr) Image processing device and electronic apparatus
JP5647813B2 (ja) Video presentation system, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17893900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17893900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载