
WO2018139073A1 - Display control device, second display device, method for controlling display control device, and control program - Google Patents


Info

Publication number
WO2018139073A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
area
image
unit
display device
Prior art date
Application number
PCT/JP2017/044284
Other languages
French (fr)
Japanese (ja)
Inventor
Hisao Kumai (熊井 久雄)
Original Assignee
Sharp Corporation (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corporation (シャープ株式会社)
Publication of WO2018139073A1 publication Critical patent/WO2018139073A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08Cursor circuits
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • G09G5/377Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering

Definitions

  • One aspect of the present invention relates to a display control device that displays content on a display device, a display device that displays a partial image of a specified display region of the image region of the content, and the like.
  • Patent Document 1 discloses a technology related to panoramic video distribution.
  • Patent Document 2 discloses a technique related to the display of an omnidirectional image. These documents relate to techniques for causing a display device to display a partial image of a designated display region of an image, such as an omnidirectional image, whose image region is too large to fit on one screen of the display device.
  • Japanese Unexamined Patent Application Publication No. 2015-173424 (published October 1, 2015); Japanese Unexamined Patent Application Publication No. 2015-18296 (published January 29, 2015)
  • A head mounted display (HMD) is known as a display device for displaying the partial image described above.
  • However, since the display area can basically be specified freely by the user of the display device, that user may not be viewing the area to which the educator or instructor wants to draw attention. Unless the educator or instructor can recognize which area the user is viewing, the intended effects of education and guidance cannot be expected.
  • This problem is not limited to the display of partial images of omnidirectional images; it arises whenever a partial image of a designated display area of an image region is displayed.
  • One embodiment of the present invention aims to realize a display control device or the like that allows a user other than the user of a display device to recognize the display region of that display device, where the display device displays a partial image of a specified display region of an image region.
  • In order to solve the above problem, a display control device according to one aspect of the present invention is a display control device that displays content on a first display device, and includes: a display area specifying unit that specifies which area of the image area of the content is designated as the display area of a second display device that displays a partial image of the designated display area; and a display area notification unit that causes display area information, indicating the area specified by the display area specifying unit, to be displayed on the first display device.
  • In order to solve the above problem, a second display device according to one aspect of the present invention is a second display device that can communicate directly or indirectly with a display control device that displays content on a first display device, and that displays a partial image of a designated display area of the image area of the content. The second display device includes: a display area specifying unit that specifies which area of the image area is designated as the display area; and a display area notification unit that notifies the display control device of display area information indicating the area specified by the display area specifying unit, thereby causing the display area information to be displayed on the first display device.
  • In order to solve the above problem, a control method for a display control apparatus according to one aspect of the present invention is a control method for a display control apparatus that displays content on a first display device, and includes: a display area specifying step of specifying which area of the image area of the content is designated as the display area of a second display device that displays a partial image of the designated display area; and a display area notification step of displaying, on the first display device, display area information indicating the area specified in the display area specifying step.
  • According to each of the above aspects, the user of the first display device can recognize the display area of the second display device.
  • Embodiment 1: An embodiment of the present invention will be described below with reference to the drawings.
  • FIG. 1 is a block diagram illustrating an example of a main configuration of a display control device 1 and a display device (second display device) 3 included in a control system 5 according to the present embodiment.
  • the display device 3 is a head mounted display (HMD) that is used by being mounted on the user's head.
  • In the present embodiment, a user acting as an instructor uses the display control device 1 to guide a trainee, that is, a user who uses the display device 3.
  • the guidance is performed while displaying the same content on both the display control device 1 and the display device 3.
  • the display device 3 is not limited to the HMD, and may be a personal computer, a television receiver, a smartphone, a tablet terminal, or the like provided with a display.
  • the display control device 1 may be a personal computer, a television receiver, a smartphone, a tablet terminal, or the like.
  • the display control device 1 includes a control unit 10, a display unit (first display device) 17, an input unit 18, a communication unit 19, and a storage unit 20.
  • The control unit 10 controls each unit of the display control device 1 in an integrated manner, and includes an omnidirectional image drawing unit 11, a display region specifying unit 12, a display region notification unit 13, a guide unit (display region control unit) 14, a combining unit 15, and a superimposition processing unit 16.
  • the omnidirectional image drawing unit 11 displays the omnidirectional image (content) on the display unit 17.
  • the display device 3 displays a partial image that is a part of the omnidirectional image, but the omnidirectional image drawing unit 11 causes the display unit 17 to display the entire image area of the omnidirectional image on one screen.
  • The image displayed by the omnidirectional image drawing unit 11 need only include at least the partial image displayed on the display device 3, and may itself be a partial image of the image area of the omnidirectional image.
  • the display area specifying unit 12 specifies which area of the image area of the content the display device 3 uses as the display area. Then, the display area notification unit 13 notifies the user of the display control device 1 of the display area information by displaying the display area information indicating the area specified by the display area specifying unit 12 on the display unit 17.
  • the guide unit 14 causes the display unit 17 to display area guide information indicating a guide area that is an image area to be displayed on the display device 3.
  • the guide area can be designated by the instructor through an input operation to the input unit 18.
  • a guide area can be set in advance in guide information 22 described later.
  • the guide unit 14 causes the display device 3 to display the image area indicated by the area guide information.
  • The guide unit 14 also performs control to move the image area of the display device 3 to the position to which the display area information has been moved. Note that this moving process may instead be performed by a processing block separate from the guide unit 14.
  • the synthesizing unit 15 synthesizes the display area information described above with the omnidirectional image, generates a synthesized image in which the display area information is superimposed on the omnidirectional image, and causes the display unit 17 to display the synthesized image.
  • Similarly, the synthesizing unit 15 generates a composite image in which the above-described area guide information is superimposed on the omnidirectional image and causes the display unit 17 to display it.
  • the synthesizing unit 15 generates a synthesized image obtained by superimposing a later-described superimposed image on the omnidirectional image and causes the display unit 17 to display the synthesized image.
  • the superimposition processing unit 16 causes the display device 3 to display a superimposed image related to the omnidirectional image. Since the partial image of the omnidirectional image is displayed on the display device 3, the superimposed image is displayed superimposed on the partial image. In addition, the superimposition processing unit 16 sets a region designated in the omnidirectional image displayed by the display unit 17 as a superimposition region of the superimposed image. Details of the superimposition processing unit 16 will be described in the second embodiment.
  • the display unit 17 is a device that displays an image.
  • the display unit 17 may be a display device externally attached to the display control device 1.
  • the input unit 18 receives a user input operation, and outputs information indicating the content of the received input operation to the control unit 10.
  • the input unit 18 may be, for example, a receiving unit that receives a signal indicating the content of a user input operation on a controller (not shown) from the controller.
  • The input unit 18 can accept any operation, such as an operation for specifying the position of a guide area on the content, an operation for moving display area information on the omnidirectional image, an operation for selecting area guide information, or an operation for specifying a superimposition area on the content.
  • the communication unit 19 is for the display control device 1 to communicate with another device (display device 3 in this example).
  • The communication unit 19 and the communication unit 38 of the display device 3 may communicate with each other directly (peer-to-peer) or indirectly via other devices and a predetermined communication network (for example, the Internet).
  • the storage unit 20 stores various data used by the display control device 1.
  • the storage unit 20 stores an omnidirectional image 21, guide information 22, and a superimposed image 23.
  • the omnidirectional image 21 is an image obtained by imaging all directions from the imaging point.
  • the content displayed by the display control device 1 is the omnidirectional image 21.
  • the guide information 22 is information indicating which of the display areas of the omnidirectional image 21 is a guide area.
  • the guide unit 14 can display area guide information based on the guide information 22.
  • the superimposed image 23 is an image having contents related to the omnidirectional image 21. Details of the superimposed image 23 will be described in a second embodiment.
  • the omnidirectional image 21 and the superimposed image 23 may be a moving image or a still image. In the present embodiment, an example in which the omnidirectional image 21 is a moving image will be described.
  • the display device 3 is a device that displays content, and includes a control unit 30, a display unit 35, a storage unit 36, a sensor 37, and a communication unit 38.
  • The control unit 30 controls each unit of the display device 3 in an integrated manner, and includes a line-of-sight direction specifying unit 31, an omnidirectional image drawing unit (display region specifying unit) 32, a combining unit 33, and a display region notification unit 34.
  • the line-of-sight direction specifying unit 31 determines the line-of-sight direction of the user of the display device 3 from the output value of the sensor 37.
  • the sensor 37 detects the orientation of the display device 3, that is, the orientation of the face of the user wearing the display device 3 (front direction).
  • the sensor 37 may be configured by a six-axis sensor that combines at least two of a three-axis gyro sensor, a three-axis acceleration sensor, a three-axis magnetic sensor, and the like.
  • the line-of-sight direction specifying unit 31 sets the direction of the user's face specified from the output values of these sensors as the user's line-of-sight direction.
  • the sensor 37 may detect the position of the user's black eye.
  • the line-of-sight direction specifying unit 31 specifies the line-of-sight direction from the position of the user's black eye.
  • the sensor 37 may include a sensor that detects the orientation of the user's face and a sensor that detects the position of the user's black eyes.
  • the identification of the line-of-sight direction can also be realized by a configuration other than the above.
  • a camera installed outside the display device 3 may be used instead of the sensor 37.
  • For example, the display device 3 may be provided with light emitting elements that blink; a camera photographs this state, and the position and orientation of the display device 3 can be detected from the captured image.
  • Alternatively, the line-of-sight direction can be determined by back-calculating from the reception times at which light receivers provided on the display device 3 receive a laser emitted from an external light emitting device, from the angle at which each point receives the light, or from their time differences.
  • the omnidirectional image drawing unit 32 specifies which region in the image region of the omnidirectional image 21 is designated as the display region from the line-of-sight direction specified by the line-of-sight direction specifying unit 31. Then, the omnidirectional image drawing unit 32 causes the display unit 35 to display the partial image of the identified display region in the image region of the omnidirectional image 21 via the synthesis unit 33.
  • When there is a superimposed image 23 to be superimposed on the partial image, the synthesizing unit 33 generates a synthesized image in which the superimposed image is superimposed on the partial image and causes the display unit 35 to display it.
  • The display area notification unit 34 notifies the display control apparatus 1, via the communication unit 38, of display area information indicating the display area specified by the omnidirectional image drawing unit 32, thereby causing the display area information to be displayed on the display unit 17 of the display control apparatus 1.
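As a minimal sketch, the notification sent from the display device 3 to the display control apparatus 1 can be modeled as a small message. The field names and the JSON transport are illustrative assumptions; the patent does not specify a wire format:

```python
import json

def make_display_area_notification(user, azimuth_range, elevation_range):
    """Build the message that the display area notification unit 34 of the
    display device 3 sends to the display control apparatus 1.
    Field names and the JSON transport are illustrative assumptions."""
    return json.dumps({
        "type": "display_area",
        "user": user,                        # e.g. the trainee's name
        "azimuth_range": azimuth_range,      # (left_deg, right_deg)
        "elevation_range": elevation_range,  # (bottom_deg, top_deg)
    })
```

On the receiving side, the display area specifying unit 12 would decode such a message and record it as display area management information of the kind shown in FIG. 5.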
  • the display unit 35 is a device that displays an image.
  • the display unit 35 may be a non-transmissive type or a transmissive type. When the transmissive display unit 35 is used, it is possible to provide the user with a mixed reality space in which the image displayed by the display unit 35 is superimposed on the visual field outside the display device 3 (real space). Further, the display unit 35 may be a display device externally attached to the display device 3, or a normal flat panel display or the like.
  • the storage unit 36 stores various data used by the display device 3. Although not shown in FIG. 1, the omnidirectional image 21 is stored in the storage unit 36 as in the display control device 1. When displaying the superimposed image 23, the superimposed image 23 is also stored in the storage unit 36.
  • the communication unit 38 is for the display device 3 to communicate with another device (in this example, the display control device 1).
  • The communication unit 38 and the communication unit 19 of the display control device 1 may communicate with each other directly (peer-to-peer) or indirectly via other devices and a predetermined communication network (for example, the Internet).
  • FIG. 2 is a diagram illustrating the relationship between the omnidirectional image and the display area.
  • the omnidirectional image A1 is shown in a three-dimensional coordinate space defined by x, y, and z axes orthogonal to each other.
  • the omnidirectional image A1 forms an omnidirectional sphere that is a sphere of radius r.
  • The z-axis direction coincides with the vertical direction of the sensor 37 in real space, the y-axis direction with the front direction of the sensor 37 in real space, and the x-axis direction with the left-right direction of the sensor 37 in real space.
  • the line-of-sight direction identifying unit 31 determines which direction the sensor 37 is facing from the output value of the sensor 37. Since the sensor 37 is mounted on the display device 3 in a predetermined orientation, if the user wears the display device 3 in the correct orientation, the orientation of the sensor 37 can be regarded as the user's line-of-sight direction. Therefore, in the following, the direction of the sensor 37 is described as the user's line-of-sight direction.
  • The line-of-sight direction can be expressed by a combination of an azimuth angle (yaw) θ (−180° ≤ θ ≤ 180°), which is the rotation angle about the vertical axis (z-axis), and an elevation angle (pitch) φ (−90° ≤ φ ≤ 90°), which is the rotation angle about the horizontal axis (x-axis).
  • When the line-of-sight direction specifying unit 31 specifies the azimuth angle and elevation angle indicating the line-of-sight direction, the omnidirectional image drawing unit 32 obtains the intersection point P between the omnidirectional image A1 and a straight line extending from the user's viewpoint position Q in the direction indicated by the specified angles. It then specifies, as the display area A11, the area of height h and width w centered on the intersection point P in the omnidirectional image A1, and causes the display unit 35 to display the portion of the omnidirectional image A1 within the display area A11.
  • the display area A11 changes in conjunction with the user's line-of-sight direction, and the image displayed on the display unit 35 also changes accordingly.
  • In the present embodiment, the viewpoint position Q in the omnidirectional sphere is assumed to be stationary in order to simplify the description, but the viewpoint position Q in the omnidirectional sphere may be moved in conjunction with the movement of the user in real space.
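The geometry described above can be sketched as follows. This is a minimal illustration assuming the axis convention of FIG. 2 (z up, y front, x left-right); the function names and the angular width/height parameters are not from the patent:

```python
import math

def viewing_point(yaw_deg, pitch_deg, r=1.0):
    """Intersection point P of the gaze ray from the viewpoint Q (the sphere
    centre) with the omnidirectional sphere of radius r, using the FIG. 2
    convention: yaw rotates about the z-axis, pitch about the x-axis."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (r * math.cos(pitch) * math.sin(yaw),   # x: left-right
            r * math.cos(pitch) * math.cos(yaw),   # y: front
            r * math.sin(pitch))                   # z: up

def display_area(yaw_deg, pitch_deg, w_deg, h_deg):
    """Angular bounds of the width-w, height-h display area A11 centred on
    the gaze direction (w_deg and h_deg are illustrative parameters)."""
    return {"azimuth": (yaw_deg - w_deg / 2, yaw_deg + w_deg / 2),
            "elevation": (pitch_deg - h_deg / 2, pitch_deg + h_deg / 2)}
```

As the user's gaze direction changes, recomputing `display_area` yields the new region A11, mirroring how the displayed image tracks head movement.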
  • FIG. 3 is a diagram illustrating an image displayed by the display control device 1.
  • the display control device 1 displays a planar image obtained by mapping the omnidirectional image A1 on a two-dimensional plane on the display unit 17.
  • A position on the omnidirectional image A1 is expressed by an azimuth angle θ (−180° ≤ θ ≤ 180°), which is the rotation angle about the vertical axis (z-axis), and an elevation angle φ (−90° ≤ φ ≤ 90°), which is the rotation angle about the horizontal axis (x-axis).
  • In the planar image, the azimuth angle is −180° at the left end and 180° at the right end of the omnidirectional image A1, and the elevation angle is 90° at the upper end and −90° at the lower end.
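Under the mapping just described, converting an (azimuth, elevation) pair to pixel coordinates in the planar image is a linear rescaling. A minimal sketch (the image dimensions are illustrative):

```python
def angles_to_pixel(azimuth_deg, elevation_deg, img_w, img_h):
    """Map an (azimuth, elevation) pair to pixel coordinates in the planar
    image: azimuth -180..180 deg spans the width left to right, and
    elevation 90..-90 deg spans the height top to bottom."""
    u = (azimuth_deg + 180.0) / 360.0 * img_w
    v = (90.0 - elevation_deg) / 180.0 * img_h
    return (u, v)
```

The display control device 1 could use such a mapping to draw the display area information and area guide information rectangles at the correct positions on the planar image.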
  • the display area information A11 indicating the display area in the display device 3 is superimposed on the omnidirectional image A1.
  • display area information A12 and A13 indicating display areas in the other display devices 3 are also displayed in a superimposed manner. That is, in FIG. 3, the display areas of the three display devices 3 are represented by A11 to A13, respectively.
  • the display control device 1 can also display the display area information of the plurality of display devices 3 on one screen.
  • the area guide information B1 and B2 are superimposed on the omnidirectional image A1.
  • the area guide information B1 and B2 indicate guide areas that are image areas to be displayed on the display device 3.
  • the area guide information B1 and B2 can be selected by an input operation via the input unit 18.
  • For example, when the area guide information B1 is selected, a partial image of the guide area indicated by B1 is displayed on the display device 3; the same applies when the area guide information B2 is selected.
  • In this way, a partial image of the guide area indicated by the area guide information B1 or B2 can be forcibly displayed on the display device 3. Note that the same partial image may be displayed collectively on a plurality of display devices 3, or different partial images may be displayed by selecting different area guide information for each display device 3.
  • the display position of the display area information may be changed by an input operation via the input unit 18.
  • the image area of the display device 3 may be moved to the position to which the display area information is moved.
  • the guide information 22 may be information as shown in FIG. 4, for example.
  • FIG. 4 is a diagram illustrating an example of the guide information 22.
  • the guide information 22 in FIG. 4A is data in a table format in which information on “guide area”, “azimuth angle range”, “elevation angle range”, and “reproduction time” is associated.
  • “Guide area” is identification information of the guide area, and in this example, the name of the guide area is described.
  • “Azimuth angle range” and “elevation angle range” are information indicating the range occupied by the guide region in the omnidirectional image.
  • For example, the guide region B1 (the guide region indicated by the area guide information B1 in FIG. 3) is defined as a rectangular region of the omnidirectional image whose azimuth angle is −110° at the left end and −30° at the right end, and whose elevation angle is −10° at the lower end and 40° at the upper end.
  • “Reproduction time” is information indicating the reproduction time zone in which the guide area is set. For example, since the “reproduction time” of the illustrated guide area B1 is 00:01:00 to 00:05:00, this guide area B1 is set for the content reproduction time zone from 00:01:00 to 00:05:00.
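As an illustrative sketch, the table of FIG. 4A can be represented as records and queried by reproduction time. The B1 values come from the example above; the list layout and function name are assumptions, since the patent specifies no data format:

```python
# Guide information records in the FIG. 4A form; times are in seconds.
GUIDE_INFO = [
    # (name, azimuth_range_deg, elevation_range_deg, start_s, end_s)
    ("B1", (-110, -30), (-10, 40), 60, 300),  # 00:01:00 - 00:05:00
]

def active_guides(t_s):
    """Return the guide areas whose reproduction-time zone contains the
    current content reproduction time t_s."""
    return [g for g in GUIDE_INFO if g[3] <= t_s <= g[4]]
```

The guide unit 14 would consult such a lookup while the content plays, showing only the area guide information whose time zone covers the current reproduction time.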
  • The guide information 22 shown in FIG. 4B replaces the “azimuth angle range” and “elevation angle range” of the guide information 22 shown in FIG. 4A with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in the guide information 22 of FIG. 4B, “width”, “height”, “azimuth angle”, and “elevation angle” are the information indicating the range occupied by the guide region. More specifically, “width” and “height” indicate the width and height of the guide region, respectively, while “azimuth angle” and “elevation angle” indicate a reference position for specifying the guide region. The reference position can be an arbitrary position on the guide area.
  • the position of the lower left corner may be used as the reference position.
  • the lower left corner is the position indicated by “azimuth angle” and “elevation angle”
  • the area having the width and height indicated by “width” and “height” is the guide area.
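The FIG. 4B form can be converted back to the FIG. 4A range form. A sketch assuming, as in the example above, that the lower left corner is the reference position (the function name is illustrative):

```python
def ranges_from_reference(azimuth_deg, elevation_deg, width_deg, height_deg):
    """Convert the FIG. 4B form (reference position plus width/height) to
    the FIG. 4A range form, taking the lower left corner as the
    reference position."""
    return {"azimuth_range": (azimuth_deg, azimuth_deg + width_deg),
            "elevation_range": (elevation_deg, elevation_deg + height_deg)}
```

For example, the guide region B1 of FIG. 4A (azimuth −110° to −30°, elevation −10° to 40°) corresponds to a reference position of (−110°, −10°) with a width of 80° and a height of 50°.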
  • the display area specifying unit 12 specifies the display area of the display device 3.
  • the display area specifying unit 12 may manage the specified display area as display area management information as shown in FIG. 5, for example.
  • FIG. 5 is a diagram illustrating an example of display area management information for managing the display area of the display device 3.
  • the display area management information in FIG. 5A is data in a table format in which information on “user”, “azimuth angle range”, and “elevation angle range” is associated.
  • “User” is identification information of the user corresponding to the display area, that is, the user who is viewing the image of the display area; in this example, the user name is described.
  • the display area management information may include identification information of the display device 3 in addition to the “user” information or instead of the “user” information.
  • “Azimuth range” and “elevation range” are information indicating the range occupied by the display area of the display device 3 in the omnidirectional image.
  • For example, the display area of the display device 3 of the user X1 is defined as a rectangular region of the omnidirectional image whose azimuth angle is −85° at the left end and −5° at the right end, and whose elevation angle is 25° at the lower end and 75° at the upper end.
  • The display area management information in FIG. 5B replaces the “azimuth angle range” and “elevation angle range” of the display area management information in FIG. 5A with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in the display area management information of FIG. 5B, “width”, “height”, “azimuth angle”, and “elevation angle” are the information indicating the range occupied by the display area. More specifically, “width” and “height” indicate the width and height of the display area, respectively, while “azimuth angle” and “elevation angle” indicate a reference position for specifying the display area. The reference position can be an arbitrary position on the display area.
  • the center position may be used as the reference position.
  • the center is the position indicated by “azimuth angle” and “elevation angle”
  • the area having the width and height indicated by “width” and “height” is the display area.
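With the FIG. 5B centre-reference form, an instructor-side check of whether a user's display area overlaps a guide area can be sketched as follows. The function names are illustrative, and the overlap test ignores wrap-around at the ±180° seam for simplicity:

```python
def area_from_center(az_deg, el_deg, w_deg, h_deg):
    """Convert the FIG. 5B form (centre reference plus width/height)
    to angular ranges ((az_min, az_max), (el_min, el_max))."""
    return ((az_deg - w_deg / 2, az_deg + w_deg / 2),
            (el_deg - h_deg / 2, el_deg + h_deg / 2))

def overlaps(a, b):
    """True if two (azimuth_range, elevation_range) rectangles overlap."""
    (a_az, a_el), (b_az, b_el) = a, b
    return (a_az[0] < b_az[1] and b_az[0] < a_az[1] and
            a_el[0] < b_el[1] and b_el[0] < a_el[1])
```

Using the values from the examples above, user X1's display area (azimuth −85° to −5°, elevation 25° to 75°) overlaps the guide region B1 (azimuth −110° to −30°, elevation −10° to 40°), so the instructor could conclude that X1 is at least partly viewing the guided region.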
  • FIG. 6 is a flowchart illustrating an example of processing in which the display control device 1 displays an image.
  • the processing in the case where there are a plurality of display devices 3 is the same as the following.
  • In S1, the omnidirectional image drawing unit 11 acquires the omnidirectional image 21 from the storage unit 20 and displays it on the display unit 17 via the synthesis unit 15.
  • In S2, the display region specifying unit 12 specifies the display region of the display device 3 by communicating with the display device 3.
  • the display area specifying unit 12 may manage the specified display area as display area management information as shown in FIG.
  • In S3, the guide unit 14 receives the designation of the position of the guide area by the instructor.
  • In S4, the guide unit 14 specifies the guide region.
  • The designation of the position of the guide region in S3 may be performed, for example, by the instructor inputting a desired position in the omnidirectional image 21 via the input unit 18.
  • In this case, the guide unit 14 specifies a guide region having a size corresponding to the width and height of the display region of the display device 3, centered on the input position.
  • In S5, the guide unit 14 causes the display unit 17 to display, via the combining unit 15, the area guide information indicating the guide area specified in S4. Further, the display area notification unit 13 causes the display unit 17 to display, via the synthesis unit 15, display area information indicating the display area specified in S2. Thereby, the area guide information and the display area information are displayed superimposed on the omnidirectional image 21. Note that the area guide information and the display area information may be displayed in separate steps. Moreover, the processes of S3 and S4 may be performed before S2 or in parallel with S2.
  • the guide unit 14 determines whether or not the area guide information displayed in S5 has been selected. For example, the guide unit 14 may determine that the region guide information has been selected when the displayed region guide information is selected by the instructor through an input operation via the input unit 18.
  • the method for selecting the area guide information is not particularly limited. For example, the area guide information may be selected by clicking or tapping the displayed area guide information.
  • it may be determined that the area guide information is not selected when the state in which the displayed area guide information is not selected continues for a predetermined time, or when an input operation by the instructor choosing not to select the area guide information is detected. If it is determined in S6 that the area guide information is selected (YES in S6), the process proceeds to S7. If it is determined that it is not selected (NO in S6), the process returns to S1.
  • the guide unit 14 notifies the display device 3 of the guide region indicated by the selected region guide information, and instructs the display device 3 to set the region as the display region. Thereby, in the display device 3, the display area is changed, and an image of the guide area indicated by the area guide information is displayed. Thereafter, the process returns to S1.
  • the guide area may be set by automatically detecting an object from the omnidirectional image.
  • a predetermined area including the display area of the detected object may be extracted as the guide area.
  • the instructor can easily confirm whether or not the instructor is looking at a predetermined object in the content.
  • the appearance (shape, size, color, etc.) of the object to be detected may be determined in advance.
  • an object having such an appearance can be automatically detected.
  • the object may be detected using machine learning or the like.
  • the function of the detection unit for detecting an object may be provided in the guide unit 14, or a processing block of the detection unit may be provided separately and an object may be detected by the processing block.
  • FIG. 7 is a flowchart illustrating an example of processing in which the display device 3 displays an image.
  • the line-of-sight direction specifying unit 31 specifies the line-of-sight direction of the user wearing the display device 3, and the omnidirectional image drawing unit 32 specifies the display area in the omnidirectional image corresponding to the line-of-sight direction specified by the line-of-sight direction specifying unit 31.
  • the omnidirectional image drawing unit 32 causes the display unit 35 to draw (display) the partial image of the omnidirectional image corresponding to the specified display area via the synthesis unit 33.
  • the display area notification unit 34 notifies the display control apparatus 1 of the specified display area.
  • the display area notification unit 34 includes the identification information of the display device 3 or the user of the display device 3 and information indicating the range occupied by the display area of the display device 3 in the omnidirectional image (for example, the range of the azimuth angle). Information indicating the range of the elevation angle) is transmitted to the display control device 1 in association with each other.
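As an illustrative sketch, the notification sent in S13 could be encoded as follows; the field names and value conventions are assumptions, not part of the patent.

```python
# Hypothetical encoding of the S13 notification: identification
# information associated with the range occupied by the display area in
# the omnidirectional image. All names are illustrative assumptions.
notification = {
    "device_id": "display-3",        # identifies the display device 3 or its user
    "azimuth_range": (-30.0, 30.0),  # range of the display area (degrees)
    "elevation_range": (0.0, 40.0),
}

def validate(msg):
    """The display control device 1 needs all three fields to update
    its display area management information."""
    return {"device_id", "azimuth_range", "elevation_range"} <= msg.keys()

validate(notification)  # True
```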
  • the omnidirectional image drawing unit 32 determines whether or not a display area change instruction is received from the display control device 1. If it is determined that it has not been received (NO in S14), the process returns to S11, and if it is determined that it has been received (YES in S14), the process proceeds to S15.
  • the omnidirectional image drawing unit 32 changes the display area in accordance with the instruction received from the display control device 1, and causes the display unit 35 to draw (display) the partial image of the omnidirectional image corresponding to the changed display area via the synthesis unit 33. Thereafter, the process returns to S11.
  • the omnidirectional image drawing unit 32 may change the display area after performing any one of the following image processes (1) to (4). This makes it less likely that the user of the display device 3 will experience motion sickness even when the display area changes suddenly.
  • the display control device 1 may determine whether the amount of change in the position of the display area before and after the change is equal to or greater than a predetermined value. In this case, when the above determination finds the amount of change to be equal to or greater than the predetermined value, the display control device 1 may instruct the display device 3 to execute any one of the image processes (1) to (4).
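One way such a threshold determination could be sketched; the 45-degree threshold and all names are illustrative assumptions.

```python
# Hypothetical sketch: decide whether one of the image processes (1) to
# (4) should be requested, based on whether the change of the display
# area's center position is at least a predetermined value.

def needs_transition(old_center, new_center, threshold_deg=45.0):
    """Centers are (azimuth, elevation) pairs in degrees."""
    d_az = abs(new_center[0] - old_center[0])
    d_az = min(d_az, 360.0 - d_az)  # azimuth wraps around the sphere
    d_el = abs(new_center[1] - old_center[1])
    return max(d_az, d_el) >= threshold_deg

needs_transition((0.0, 0.0), (170.0, 5.0))  # large jump: True
needs_transition((0.0, 0.0), (10.0, 5.0))   # small move: False
```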
  • FIG. 8 is a diagram illustrating an image displayed by the display control device 1 in which a superimposed image is superimposed on an omnidirectional image.
  • display area information A11 to A13 and area guide information B1 and B2 are displayed superimposed on the omnidirectional image A1.
  • the area information C1 to C3 is superimposed and displayed on the omnidirectional image A1.
  • the area information C1 and C2 is superimposition area information indicating a superimposition area that is an area in which the superimposed image is superimposed and displayed.
  • the area information C3 is superposition prohibition area information indicating a superposition prohibition area that is an area where a superimposed image is not displayed.
  • the instructor who uses the display control device 1 can select the region information C1 or C2 and display the superimposed image in the superimposed region indicated by the region information C1 or C2.
  • the superimposed image may be an image related to the content on which it is superimposed (the omnidirectional image A1 in this example).
  • the superimposed image may be a moving image content, a still image content, or an annotation displayed as an annotation regarding the omnidirectional image A1 or another superimposed image.
  • the superimposed image may be an image obtained by imaging the same imaging target as the omnidirectional image A1 (in this example, the cityscape) with an imaging device different from the one that captured the omnidirectional image A1. Further, the superimposed image may be captured at a higher resolution than the omnidirectional image A1 and may therefore be a higher-definition image than the omnidirectional image A1.
  • the superimposed image may be, for example, an image obtained by photographing the imaging target of the omnidirectional image A1 at the same angle as the omnidirectional image A1, or an image taken at a different angle (for example, an image from a specific viewpoint). Further, the superimposed image may be an image in which a part of the imaging target of the omnidirectional image A1 is enlarged.
  • in this way, multifaceted information about the imaging target of the omnidirectional image A1 can be given to the user. For example, by using as a superimposed image a high-resolution image in which a specific building in the guide area of the area guide information B1 is enlarged, the user can check a part of that building in detail while viewing the entire cityscape. Further, for example, by using as a superimposed image an image of the specific building captured at an angle different from that of the omnidirectional image A1, the user can also confirm portions of the building that are not captured at the imaging angle of the omnidirectional image A1.
  • the content of the annotation is not particularly limited as long as it relates to the omnidirectional image A1 or the superimposed image.
  • the information regarding the omnidirectional image A1 may be information indicating the state of the imaging target, the operation, the name, the notable part of the omnidirectional image A1, and the like.
  • examples of the information related to the superimposed image include the angle at which it was captured, whether it is an enlarged image, and which portion of the imaging target it shows.
  • a UI (User Interface) menu for operating the display control device 1 or a UI menu for operation control of the display device 3 may be displayed as an annotation.
  • the omnidirectional image A1 is an image in which the cityscape is an imaging target, but the imaging target is arbitrary.
  • the omnidirectional image A1 may be an image obtained by imaging the entire state of the operating room where the operation is performed.
  • the imaging target may include a surgeon, an assistant, a patient, a surgical instrument, various devices, and the like.
  • the display control device 1 and the display device 3 can be used for medical education.
  • the instructor can check the progress of the entire operation in the omnidirectional image A1.
  • a superimposed image from the surgeon's viewpoint may be displayed when the system is used for surgeon education.
  • a superimposed image from the assistant's viewpoint may be displayed when it is used for assistant education.
  • by using as a superimposed image an image of a screen displaying the patient's vital data, the user can recognize the relationship between the transition of the vital data during surgery and the movements of each person.
  • a high-resolution image of the operative field may be used as a superimposed image, allowing the user to check the surgeon's fine work in detail.
  • information necessary for surgery, device operation information (for example, on / off of heart-lung machine), and the like may be displayed as annotations.
  • the area management information that defines the overlapping area and the overlapping prohibited area as described above may be stored in the storage unit 20 or the like.
  • the superimposition processing unit 16 can specify the superimposition region and the superposition prohibition region by referring to the stored region management information.
  • the area management information may be information as shown in FIG. 9, for example.
  • FIG. 9 is a diagram illustrating an example of region management information that defines a superimposition region and a superposition prohibition region.
  • the area management information in (a) of FIG. 9 has a table format in which information of “area”, “azimuth angle range”, “elevation angle range”, “superimposition / prohibition”, and “reproduction time” is associated. Information.
  • “Area” is identification information of a superimposition area or a superposition prohibition area, and in this example, the name of the area is described.
  • “Azimuth angle range” and “elevation angle range” are information indicating the range occupied by the superimposition region or the superposition prohibition region in the omnidirectional image. For example, the illustrated area management information specifies that the area occupies a rectangle whose left end is at an azimuth angle of −80°, whose right end is at an azimuth angle of −100°, whose lower end is at an elevation angle of 15°, and whose upper end is at an elevation angle of 30°.
  • “Superposition / prohibition” is information indicating whether the “region” is a superposition region or a superposition prohibition region.
  • a region where the “superimposition / prohibition” information is “superimposition” is a superimposition region, and a region where “prohibition” is a superimposition prohibition region.
  • “Reproduction time” is information indicating a reproduction time zone in which a superimposition area or a superposition prohibition area is set. For example, since the “reproduction time” of the illustrated area C1 (the area indicated by the area information C1 in FIG. 8) is 00:01:00 to 00:05:00, the content reproduction time of this area C1 is 00:00. The time zone is set to 01:00 to 00:05:00.
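The table of FIG. 9(a) could be held as records like the following sketch; the C1 playback time is taken from the text, while the range values, the second row, and all field names are illustrative assumptions.

```python
# Hypothetical encoding of the area management information of FIG. 9(a).
# The C1 playback time is quoted in the text; the ranges and the C3 row
# are illustrative assumptions.
AREA_TABLE = [
    {"area": "C1", "azimuth_range": (-100.0, -80.0), "elevation_range": (15.0, 30.0),
     "kind": "superimpose", "time": ("00:01:00", "00:05:00")},
    {"area": "C3", "azimuth_range": (30.0, 60.0), "elevation_range": (0.0, 20.0),
     "kind": "prohibit", "time": ("00:00:00", "00:10:00")},
]

def active_areas(playback_time):
    """Areas whose playback time zone contains the given HH:MM:SS time.
    Zero-padded HH:MM:SS strings compare correctly as plain strings."""
    return [r for r in AREA_TABLE
            if r["time"][0] <= playback_time <= r["time"][1]]

[r["area"] for r in active_areas("00:02:00")]  # ["C1", "C3"]
```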
  • the area management information in (b) of FIG. 9 replaces the “azimuth angle range” and “elevation angle range” of the area management information in (a) of FIG. 9 with “width”, “height”, “azimuth angle”, and “elevation angle”. That is, in the area management information of FIG. 9B, “width”, “height”, “azimuth angle”, and “elevation angle” are the information indicating the range occupied by the superimposition region or the superposition prohibition region. More specifically, “width” and “height” indicate the width and height of the superimposition region or the superposition prohibition region, respectively. In addition, “azimuth angle” and “elevation angle” indicate the reference position for specifying the superimposition region or the superposition prohibition region.
  • the reference position can be an arbitrary position on the superposition area or the superposition prohibition area.
  • the position of the lower left corner may be used as the reference position.
  • in this case, the lower left corner is the position indicated by “azimuth angle” and “elevation angle”, and the region having the width and height indicated by “width” and “height”, with that corner as the reference, is the superimposition region or the superposition prohibition region.
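Under the lower-left-corner convention just described, converting the FIG. 9(b) form into the range form of FIG. 9(a) is a simple addition; the names and example values below are illustrative assumptions.

```python
# Hypothetical sketch: convert "width"/"height" plus a lower-left
# reference position (FIG. 9(b) style) into azimuth/elevation ranges
# (FIG. 9(a) style). All names and values are illustrative assumptions.

def to_ranges(width, height, azimuth, elevation):
    return {
        "azimuth_range": (azimuth, azimuth + width),
        "elevation_range": (elevation, elevation + height),
    }

to_ranges(20.0, 15.0, -100.0, 15.0)
# -> {"azimuth_range": (-100.0, -80.0), "elevation_range": (15.0, 30.0)}
```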
  • the superimposition area may have a size that fits within the display area.
  • the superimposition area is narrower than the display area, which is itself only a part of the image area of the omnidirectional image. For this reason, the content may be enlarged and displayed when the instructor selects the superimposition area. This makes the superimposition area easier to select than in the example of FIG. 8. This will be described with reference to FIG. 10.
  • FIG. 10 is a diagram showing an example in which the superimposition area information and the superposition prohibition area information are displayed after the content is enlarged and displayed.
  • the partial image of the display area B1 of FIG. 8 is displayed on the entire screen of the display unit 17 of the display control device 1.
  • the area information C1 and C3 can be confirmed in more detail than the example of FIG.
  • although the content is enlarged and displayed on the entire screen here, it may instead be enlarged at an arbitrary magnification and superimposed on the original content.
  • the display modes of the area information C1 indicating the superimposition area and the area information C3 indicating the superposition prohibition area differ. Specifically, the area information C3 is marked with “X” to indicate that superposition is prohibited. As a result, the instructor can intuitively distinguish the superimposition area from the superposition prohibition area.
  • the area information C3 indicates an area set so as to include the display area of the object D1. Thereby, even when a superimposed image is displayed, the object D1 is not hidden.
  • the superposition prohibition area may be set by automatically detecting the object D1 from the omnidirectional image. In this case, a predetermined range including the display area of the detected object D1 may be set as the superposition prohibition area.
  • the appearance (shape, size, color, etc.) of the object to be detected may be determined in advance. Thus, by analyzing the omnidirectional image, an object having such an appearance can be automatically detected. Further, the object may be detected using machine learning or the like.
  • Such an object detection may be configured to be performed by the superimposition processing unit 16 or may be configured to separately provide a processing block for detecting an object and detect the object in the processing block.
  • FIG. 11 is a flowchart illustrating an example of processing when the display control device 1 displays a superimposed image on the display device 3. Note that FIG. 11 illustrates processing after the omnidirectional image is displayed on the display unit 17. In the following, an example in which there is one display device 3 will be described, but the processing in the case where there are a plurality of display devices 3 is the same as the following.
  • the display area specifying unit 12 specifies the display area of the display device 3 by communicating with the display device 3.
  • the superimposition processing unit 16 extracts a superimposition region in the displayed omnidirectional image. Specifically, the superimposition processing unit 16 selects an area corresponding to the playback time of the content and having “superimposition / prohibition” information “superimposition” among the areas indicated in the area management information (see FIG. 9). Extract.
  • only the display area specified in S21 may be the extraction target, or a wider range (for example, the display area and a predetermined surrounding area) may be the extraction target. In this case, among the areas indicated in the area management information (see FIG. 9), an area (specified by its azimuth angle range and elevation angle range) at least a part of which is included in the extraction target area is extracted.
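The "at least a part is included" test of S22 amounts to a rectangle-overlap check on azimuth and elevation ranges; the sketch below uses assumed names and example values.

```python
# Hypothetical sketch of the S22 extraction test: a region is a
# candidate if its azimuth range and its elevation range both overlap
# the extraction target area (the display area, possibly widened by a
# surrounding margin). All names are illustrative assumptions.

def overlaps(range_a, range_b):
    """True if two (low, high) intervals share any part."""
    return range_a[0] < range_b[1] and range_b[0] < range_a[1]

def is_candidate(region_az, region_el, target_az, target_el):
    return overlaps(region_az, target_az) and overlaps(region_el, target_el)

# A region at azimuth -100..-80, elevation 15..30, against a display
# area at azimuth -90..-30, elevation 0..20:
is_candidate((-100.0, -80.0), (15.0, 30.0), (-90.0, -30.0), (0.0, 20.0))  # True
```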
  • the superimposition processing unit 16 determines whether there is a superimposed image to be displayed in the superimposition region extracted in S22.
  • when the area management information as shown in FIG. 9 is set in advance, the superimposed image to be displayed in each area may also be set in advance.
  • the superimposition processing unit 16 determines the presence or absence of a superimposed image associated with the superimposition region extracted in S22. If it is determined in S23 that there is a superimposed image (YES in S23), the process proceeds to S24. On the other hand, if it is determined that there is no superimposed image (NO in S23), the process returns to S21.
  • the superimposition processing unit 16 acquires information indicating the display mode of the superimposed image. Such information may be stored in advance in association with the superimposed image.
  • the information indicating the display mode includes, for example, the presence / absence of perspective (perspective display, that is, display by perspective projection), the transmittance of the superimposed image, and the superimposed decoration method (for example, presence / absence of application of image processing for blurring the outline of the superimposed image). The information to show is mentioned.
  • the superimposition processing unit 16 causes the display unit 17 to display superimposition region information indicating the superimposition region extracted in S22. At this time, the superimposition processing unit 16 may enlarge and display an image area including the superimposition area (see FIG. 10).
  • the superimposition processing unit 16 receives selection of a superimposition region and a superimposed image.
  • the selection of the overlapping area may be performed by selecting the displayed overlapping area information by an input operation to the input unit 18, for example. Further, when a plurality of superimposed images are associated with the selected superimposed region, the plurality of superimposed images may be displayed and selected by the instructor.
  • the synthesizing unit 15 synthesizes the superimposed image into the superimposition area selected in S26 and draws (displays) the result on the display unit 17. Note that the superimposed image may be displayed as a preview before S26.
  • the superimposition processing unit 16 notifies the display device 3 of the superposed region and the superposed image that have been selected in S26, and instructs the superposed region to superimpose and display this superposed image. Thereby, also on the display device 3, the superimposed image is superimposed and displayed on the superimposed region.
  • the superimposition processing unit 16 may transmit the superimposed image to the display device 3. Thereafter, the process returns to S21.
  • the superposition prohibition area may be extracted, and in S25, the superposition prohibition area information indicating the superposition prohibition area may be displayed.
  • the superposition prohibition area is useful, for example, (1) when the instructor can freely set the position of the superimposition area, (2) when the position of the superimposition area is determined based on the position of an object in the content, and (3) when the position of the superimposition area is determined based on the position of the display area.
  • in case (2), for example, an object having a predetermined appearance (for example, a building in the example of FIG. 8) is detected in the content, and a superimposed image is displayed at a position shifted by a predetermined offset from the periphery of the object (for example, from a reference position in the object).
  • in such cases, by setting the superimposition area so as to avoid the superposition prohibition area, the superimposed image can be displayed in association with the object, and the superposition prohibition area is not hidden by the superimposed image.
  • FIG. 12 is a block diagram illustrating an example of a main configuration of the display control device 1 and the display device 3 configuring the control system 5 according to an embodiment of the present invention.
  • the display control device 1 is different from the display control device 1 of each embodiment described above in that it includes an image transmission unit 40.
  • the display device 3 is different from the display device 3 of each of the above embodiments in that the omnidirectional image drawing unit 32 is not provided but the image acquisition unit 45 is provided.
  • the image transmission unit 40 transmits an image to the display device 3 via the communication unit 19.
  • the image to be transmitted is a partial image corresponding to the display area of the display device 3 in the omnidirectional image.
  • the display area is specified by the display area specifying unit 12 based on a notification from the display area notifying unit 34.
  • the image acquisition unit 45 acquires the image transmitted by the display control device 1 via the communication unit 38 and causes the display unit 35 to display the image. Accordingly, the display device 3 can display a partial image of the omnidirectional image or an image in which the superimposed image is superimposed on the omnidirectional image without storing the omnidirectional image or the superimposed image.
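If the omnidirectional image is stored as an equirectangular image, the partial image sent by the image transmission unit 40 could be obtained by a pixel crop like the following sketch; the mapping convention and all names are illustrative assumptions.

```python
# Hypothetical sketch: map a display area given as azimuth/elevation
# ranges to pixel coordinates in an equirectangular omnidirectional
# image, assuming azimuth spans [-180, 180) over the image width and
# elevation spans [-90, 90] over the image height.

def crop_pixels(img_w, img_h, az_range, el_range):
    """Return (x0, y0, x1, y1) of the partial image to transmit."""
    x0 = int((az_range[0] + 180.0) / 360.0 * img_w)
    x1 = int((az_range[1] + 180.0) / 360.0 * img_w)
    y0 = int((90.0 - el_range[1]) / 180.0 * img_h)  # image y grows downward
    y1 = int((90.0 - el_range[0]) / 180.0 * img_h)
    return x0, y0, x1, y1

# A 3840x1920 image, display area azimuth -90..0, elevation 0..45:
crop_pixels(3840, 1920, (-90.0, 0.0), (0.0, 45.0))
# -> (960, 480, 1920, 960)
```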
  • FIG. 13 is a diagram showing an overview of a control system 50 according to Embodiment 4 of the present invention.
  • the control system 50 includes a display control device 1, a display device 3, and a server 7.
  • the control system 50 may include a plurality of display devices 3.
  • the display control device 1, the display device 3, and the server 7 are communicably connected via a predetermined communication network (for example, the Internet). Therefore, the user of the display control device 1 (the instructor) and the user of the display device 3 can use the control system 50 even if they are in locations remote from each other.
  • the server 7 stores various information used in the control system 50, and transmits the information to the display control device 1 and the display device 3.
  • the server 7 includes a communication unit for communicating with other devices (in the present embodiment, the display control device 1 and the display device 3), a control unit that controls each unit of the server 7, and a storage unit that stores various information.
  • the control unit of the server 7 includes an omnidirectional image transmission unit that transmits the omnidirectional image 21 in response to a request from another device. Therefore, the display control device 1 and the display device 3 can acquire the omnidirectional image 21 from the omnidirectional image transmission unit and display it. For this reason, the display control device 1 and the display device 3 of the present embodiment do not need to store the omnidirectional image 21.
  • the control unit of the server 7 includes a guide information transmission unit that transmits the guide information 22 in response to a request from another device. Therefore, the display control device 1 can acquire the guide information 22 from the guide information transmission unit and display the area guide information. For this reason, the display control device 1 of the present embodiment does not need to store the guide information 22.
  • when the displayed area guide information is selected, the display control device 1 notifies the display device 3 of the guide area indicated by the selected area guide information. The display device 3 acquires the partial image of the notified guide area from the server 7 and displays it.
  • the control unit of the server 7 includes a superimposed image transmission unit that transmits the superimposed image 23 in response to a request from another device. Therefore, the display control device 1 and the display device 3 can acquire and display the superimposed image 23 from the superimposed image transmission unit. For this reason, the display control device 1 and the display device 3 of the present embodiment do not need to store the superimposed image 23. Similarly, the area management information (see FIG. 9) can be obtained from the server 7.
  • in each of the above embodiments, an example has been described in which the display control device 1 displays a rectangular planar image obtained by mapping the omnidirectional image onto a two-dimensional plane. However, the display control device 1 only needs to display an image including at least the display area of the display device 3, and is not limited to this example. For example, when displaying an omnidirectional image captured by two fisheye cameras, the image captured by each fisheye camera may be mapped onto a two-dimensional plane to generate two circular planar images, and these may be displayed.
  • the content only needs to be capable of displaying a partial image of a designated display area within its entire image area, and is not limited to an omnidirectional image. For example, the image may be a hemispherical image, or a planar image (such as a panoramic photograph) whose display size does not fit on one screen of the display device 3.
  • the content is not limited to those intended for education and guidance.
  • the content may be a still image, or may include a plurality of still images as components. In the latter case, the still images displayed on the display control device 1 and the display device 3 are synchronized.
  • the display area designation method is not particularly limited.
  • the display area may be specified by a controller or the like for specifying the display area.
  • in each of the above embodiments, the display control device 1 changes the display area by instructing the display device 3; however, the display area may instead be changed by prompting the user of the display device 3 to change it.
  • the display device 3 can be prompted to change the display area by causing it to display, as a superimposed image or an annotation, a message prompting display of the guide area or a symbol (such as an arrow) indicating the direction in which the line of sight should be moved to display the guide area.
  • when the display device 3 has a function of outputting sound, it can also be prompted to change the display area by causing it to output such a message as sound.
  • a message such as “Move the line of sight to the right and pay attention to a triangular building” may be displayed as an annotation.
  • a message that guides the user's line-of-sight direction to a predetermined line-of-sight direction such as "Please move your line of sight slightly to the left so that a low-cylindrical building is located in the center of the field of view” It may be displayed.
  • the user may be prompted to display the guide area by changing the display mode of the guide area and the other image areas. For example, the display brightness of the guide area may be made higher than other areas. In this case, the user can easily find the guide area by moving the line-of-sight direction.
  • the control blocks of the display control device 1 and the display device 3 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • in the latter case, the display control device 1 and the display device 3 each include a CPU that executes instructions of a program, which is software implementing each function, a ROM (Read Only Memory) or storage device (referred to as a “recording medium”) in which the program and various data are recorded so as to be readable by a computer (or CPU), and a RAM (Random Access Memory) into which the program is loaded.
  • the computer or CPU reads the program from the recording medium and executes the program, thereby achieving the object of one embodiment of the present invention.
  • as the recording medium, a “non-transitory tangible medium” such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • a display control device (1) according to aspect 1 of the present invention is a display control device that displays content on a first display device (display unit 17), and includes: a display area specifying unit (12) that specifies which area of the image area of the content is the display area of a second display device (display apparatus 3) that displays a partial image of a designated display area among the image areas of the content; and a display area notification unit (13) that causes the first display device to display display area information indicating the area specified by the display area specifying unit.
  • the display area of the second display device is specified, and display area information indicating the area is displayed on the first display device. Therefore, the display area of the second display device can be recognized by the user of the first display device.
  • the first display device only needs to display an area of the content including at least the display area; it may display the entire image area of the content or a narrower area.
  • a display control device according to aspect 2 of the present invention may be configured, in the above aspect 1, to include a display area control unit (guide unit 14) that moves the image area (display area) of the second display device when the display area information is moved on the content displayed on the first display device.
  • the user of the first display device moves the display region information to the image region that the user of the second display device wants to view, thereby displaying the partial image of the image region on the second display. It can be displayed on the device.
  • A display control device according to aspect 3 of the present invention may, in the above aspect 1, further comprise a guide unit (14) that causes the first display device to display area guide information indicating an image area to be displayed on the second display device.
  • According to the above configuration, the user of the first display device can recognize, from the area guide information, the image area to be displayed on the second display device. Further, as described above, the user of the first display device can recognize the display area of the second display device from the display area information. The user of the first display device can therefore recognize whether the image area to be displayed on the second display device matches the actual display area of the second display device.
  • A display control device according to aspect 4 of the present invention may, in the above aspect 3, further comprise a display area control unit (guide unit 14) that causes the second display device to display the image area indicated by the area guide information when that area guide information is selected.
  • According to the above configuration, the user of the first display device can cause the partial image of the image area to be displayed on the second display device by selecting the area guide information. The user of the second display device can therefore view the partial image of the image area that is to be displayed on the second display device.
  • A display control device according to aspect 5 of the present invention may, in the above aspect 3 or 4, further comprise a detection unit that detects an object having a predetermined appearance in the content, and the guide unit may cause the first display device to display area guide information indicating an image area that includes the object detected by the detection unit.
  • According to the above configuration, area guide information indicating an image area that includes an object having a predetermined appearance in the content is displayed on the first display device. The user of the first display device can therefore recognize whether the area in which the object appears coincides with the display area of the second display device, that is, whether the user of the second display device is viewing the object.
  • A display control device according to aspect 6 of the present invention may, in any one of the above aspects 1 to 5, further comprise a superimposition processing unit (16) that causes the second display device to display a superimposed image related to the content, superimposed on the partial image, and the superimposition area in which the superimposed image is superimposed on the partial image may be an area designated in the content displayed by the first display device.
  • According to the above configuration, the user of the first display device can cause the superimposed image to be displayed on the partial image displayed on the second display device by designating an area in the content displayed by the first display device.
  • A second display device (display device 3) according to aspect 7 of the present invention is a second display device that can communicate directly or indirectly with a display control device (1) that displays content on a first display device (display unit 17), and that displays a partial image of a designated display area of the image area of the content, comprising: a display area specifying unit that specifies which area of the image area is designated as the display area; and a display area notification unit (34) that notifies the display control device of display area information indicating the specified area and causes the display area information to be displayed on the first display device.
  • According to the above configuration, the display area of the second display device is specified, display area information indicating the specified area is notified to the display control device, and the display area information is displayed on the first display device. This allows the user of the first display device to recognize the display area of the second display device.
  • A method for controlling a display control device (1) according to aspect 8 of the present invention is a method for controlling a display control device that causes content to be displayed on a first display device (display unit 17), including: a display area specifying step of specifying which area of the image area of the content a second display device, which displays a partial image of a designated display area of that image area, is using as its display area; and a display area notification step of causing the first display device to display display area information indicating the area specified in the display area specifying step.
  • The display control device (1) according to each aspect of the present invention may be realized by a computer. In this case, a control program for the display control device that realizes the display control device (1) by causing the computer to operate as each unit (software element) of the display control device (1), and a computer-readable recording medium on which that control program is recorded, also fall within the scope of the present invention.
  • Similarly, the second display device (3) according to the above aspect of the present invention may be realized by a computer. In this case, a control program for the second display device that realizes the second display device (3) by causing the computer to operate as each unit (software element) of the second display device (3), and a computer-readable recording medium on which that control program is recorded, also fall within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention allows one user to recognize the display area of a display device used by another user. A display control device (1) comprises: a display area identification unit (12) for identifying an area in an image area that a display device (3) defines as a display area, the display device (3) displaying a partial image in a display area designated in the image area of an omnidirectional image (21); and a display area notification unit (13) for presenting display area information representing the area identified by the display area identification unit (12) on a display unit (17).

Description

Display control device, second display device, method for controlling display control device, and control program
 One aspect of the present invention relates to a display control device that causes content to be displayed on a display device, to a display device that displays a partial image of a designated display area of the image area of the content, and the like.
 Patent Document 1 below discloses a technique related to distribution of panoramic video, and Patent Document 2 below discloses a technique related to display of omnidirectional images. Both documents relate to techniques for causing a display device to display a partial image of a designated display area of an image, such as an omnidirectional image, whose image area is too large to fit on one screen of the display device.
Japanese Unexamined Patent Application Publication No. 2015-173424 (published October 1, 2015); Japanese Unexamined Patent Application Publication No. 2015-18296 (published January 29, 2015)
 A head mounted display (HMD) is known as a display device that displays such partial images. By viewing an omnidirectional image through an HMD, a user can be immersed in the video space and enjoy browsing the image.
 Conventionally, however, other users have not been able to recognize which area of the omnidirectional image a user wearing an HMD is looking at. Even when the partial image is displayed on an ordinary flat panel display rather than an HMD, a user at a remote location away from that flat panel display cannot recognize which area of the omnidirectional image is being viewed.
 This can be a problem when, for example, an omnidirectional image is used for education or instruction. Because the display area can basically be designated freely by the learner, the learner may not be looking at the area the educator or instructor wants them to focus on, and if the educator or instructor cannot recognize this, the education or instruction cannot be expected to be effective. This problem is not limited to displaying partial images of omnidirectional images; it arises whenever a partial image of a designated display area of an image area is displayed.
 An object of one aspect of the present invention is to realize a display control device and the like that allow users other than the user of a display device that displays a partial image of a designated display area of an image area to recognize the display area of that display device.
 To solve the above problem, a display control device according to one aspect of the present invention is a display control device that displays content on a first display device, comprising: a display area specifying unit for specifying which area of the image area of the content a second display device, which displays a partial image of a designated display area of that image area, is using as its display area; and a display area notification unit for causing the first display device to display display area information indicating the area specified by the display area specifying unit.
 To solve the above problem, a second display device according to one aspect of the present invention is a second display device that can communicate directly or indirectly with a display control device that displays content on a first display device, and that displays a partial image of a designated display area of the image area of the content, comprising: a display area specifying unit for specifying which area of the image area is designated as the display area; and a display area notification unit for notifying the display control device of display area information indicating the area specified by the display area specifying unit and causing the display area information to be displayed on the first display device.
 To solve the above problem, a method for controlling a display control device according to one aspect of the present invention is a method for controlling a display control device that displays content on a first display device, including: a display area specifying step of specifying which area of the image area of the content a second display device, which displays a partial image of a designated display area of that image area, is using as its display area; and a display area notification step of causing the first display device to display display area information indicating the area specified in the display area specifying step.
 According to one aspect of the present invention, the display area of the second display device can be recognized by the user of the first display device.
[Brief Description of Drawings]
 - A block diagram showing an example of the main configuration of a display control device and a display device included in a control system according to Embodiment 1 of the present invention.
 - A diagram showing the relationship between an omnidirectional image and a display area.
 - A diagram showing an image displayed by the above display control device.
 - A diagram showing an example of guide information.
 - A diagram showing an example of display area information.
 - A flowchart showing an example of processing executed by the above display control device.
 - A flowchart showing an example of processing executed by the above display device.
 - A diagram showing an image displayed by a display control device according to Embodiment 2 of the present invention.
 - A diagram showing an example of area management information.
 - A diagram showing an example in which superimposition area information and superimposition-prohibited area information are displayed on enlarged content.
 - A flowchart showing an example of processing executed by the above display control device.
 - A block diagram showing an example of the main configuration of a display control device and a display device included in a control system according to Embodiment 3 of the present invention.
 - A block diagram showing an overview of a control system according to Embodiment 4 of the present invention.
Embodiment 1
 An embodiment of the present invention will be described below with reference to FIGS. 1 to 7.
[System Overview]
 FIG. 1 is a block diagram showing an example of the main configuration of the display control device 1 and the display device (second display device) 3 included in a control system 5 according to the present embodiment. In the present embodiment, the display device 3 is a head mounted display (HMD) worn on the user's head. This embodiment describes an example in which a user who is an instructor uses the display control device 1 to instruct a learner who uses the display device 3. The instruction is carried out while the same content is displayed on both the display control device 1 and the display device 3. The display device 3 is not limited to an HMD and may be a personal computer, television receiver, smartphone, tablet terminal, or the like equipped with a display. The display control device 1 may likewise be a personal computer, television receiver, smartphone, tablet terminal, or the like.
[Configuration of the Display Control Device]
 The display control device 1 includes a control unit 10, a display unit (first display device) 17, an input unit 18, a communication unit 19, and a storage unit 20. The control unit 10 controls the units of the display control device 1 in an integrated manner, and includes an omnidirectional image drawing unit 11, a display area specifying unit 12, a display area notification unit 13, a guide unit (display area control unit) 14, a combining unit 15, and a superimposition processing unit 16.
 The omnidirectional image drawing unit 11 causes the display unit 17 to display an omnidirectional image (content). While the display device 3 displays a partial image that is part of the omnidirectional image, the omnidirectional image drawing unit 11 causes the display unit 17 to display the entire image area of the omnidirectional image on one screen. The image displayed by the omnidirectional image drawing unit 11 need only include at least the partial image displayed on the display device 3, and may be an image of part of the image area of the omnidirectional image.
 The display area specifying unit 12 specifies which area of the image area of the content the display device 3 is using as its display area. The display area notification unit 13 then notifies the user of the display control device 1 of that area by causing the display unit 17 to display display area information indicating the area specified by the display area specifying unit 12.
 The guide unit 14 causes the display unit 17 to display area guide information indicating a guide area, that is, an image area that should be displayed on the display device 3. The guide area can be designated by the instructor through an input operation on the input unit 18, or set in advance in the guide information 22 described later. When displayed area guide information is selected by the user, the guide unit 14 causes the display device 3 to display the image area indicated by that area guide information. The guide unit 14 also performs control to move the image area of the display device 3 to the destination position of the display area information when the display area information is moved on the content displayed on the display unit 17. The process of moving the image area of the display device 3 may instead be performed by a processing block separate from the guide unit 14.
 The combining unit 15 combines the display area information described above with the omnidirectional image to generate a composite image in which the display area information is superimposed on the omnidirectional image, and causes the display unit 17 to display it. Similarly, the combining unit 15 generates composite images in which the above-described area guide information, or a superimposed image described later, is superimposed on the omnidirectional image, and causes the display unit 17 to display them.
 The superimposition processing unit 16 causes the display device 3 to display a superimposed image related to the omnidirectional image. Since the display device 3 displays a partial image of the omnidirectional image, the superimposed image is displayed superimposed on the partial image. The superimposition processing unit 16 uses an area designated in the omnidirectional image displayed by the display unit 17 as the superimposition area of the superimposed image. Details of the superimposition processing unit 16 are described in Embodiment 2.
 The display unit 17 is a device that displays images. The display unit 17 may be a display device externally attached to the display control device 1.
 The input unit 18 accepts user input operations and outputs information indicating the content of the accepted input operation to the control unit 10. The input unit 18 may be, for example, a receiving unit that receives, from a controller (not shown), a signal indicating the content of the user's input operation on that controller. The input unit 18 need only be capable of accepting operations such as designating the position of a guide area on the content, moving display area information on the omnidirectional image, selecting area guide information, and designating a superimposition area on the content.
 The communication unit 19 allows the display control device 1 to communicate with other devices (in this example, the display device 3). The communication unit 19 and the communication unit 38 of the display device 3 may communicate in a full-mesh (peer-to-peer) manner, or may communicate indirectly via a predetermined communication network (for example, the Internet) and other equipment.
 The storage unit 20 stores various data used by the display control device 1, including an omnidirectional image 21, guide information 22, and a superimposed image 23. The omnidirectional image 21 is an image capturing all directions from the imaging point, and is the content displayed by the display control device 1 in this embodiment. The guide information 22 indicates which areas of the display area of the omnidirectional image 21 are guide areas; the guide unit 14 can display area guide information based on the guide information 22. The superimposed image 23 is an image whose content is related to the omnidirectional image 21; its details are described in Embodiment 2. The omnidirectional image 21 and the superimposed image 23 may each be a moving image or a still image. In this embodiment, an example in which the omnidirectional image 21 is a moving image is described.
[Configuration of the Display Device]
 The display device 3 is a device that displays content, and includes a control unit 30, a display unit 35, a storage unit 36, a sensor 37, and a communication unit 38. The control unit 30 controls the units of the display device 3 in an integrated manner, and includes a line-of-sight direction specifying unit 31, an omnidirectional image drawing unit (display area specifying unit) 32, a combining unit 33, and a display area notification unit 34.
 The line-of-sight direction specifying unit 31 determines the line-of-sight direction of the user of the display device 3 from the output values of the sensor 37. The sensor 37 detects the orientation of the display device 3, that is, the orientation (front direction) of the face of the user wearing the display device 3. The sensor 37 may be configured as a six-axis sensor combining at least two of a three-axis gyro sensor, a three-axis acceleration sensor, a three-axis magnetic sensor, and the like. The line-of-sight direction specifying unit 31 takes the orientation of the user's face, specified from the output values of these sensors, as the user's line-of-sight direction. The sensor 37 may also detect the position of the user's pupils, in which case the line-of-sight direction specifying unit 31 specifies the line-of-sight direction from the pupil positions; the sensor 37 may include both a sensor that detects the orientation of the user's face and a sensor that detects the pupil positions. The line-of-sight direction can also be specified with other configurations. For example, a camera installed outside the display device 3 may be used instead of the sensor 37: in this case, the display device 3 is provided with a blinking light-emitting element, the camera captures it, and the position and orientation of the display device 3 are detected from the captured image. Alternatively, the line-of-sight direction can be calculated back from the reception times, reception angles, and time differences with which a laser or the like emitted from an external light-emitting device is received by receivers provided on the display device 3.
 The omnidirectional image drawing unit 32 specifies, from the line-of-sight direction specified by the line-of-sight direction specifying unit 31, which area of the image area of the omnidirectional image 21 is designated as the display area. The omnidirectional image drawing unit 32 then causes the display unit 35, via the combining unit 33, to display the partial image of the specified display area of the image area of the omnidirectional image 21.
 When there is a superimposed image 23 to be superimposed on the partial image, the combining unit 33 generates a composite image in which the superimposed image is superimposed on the partial image and causes the display unit 35 to display it.
 The display area notification unit 34 notifies the display control device 1, via the communication unit 38, of display area information indicating the display area specified by the omnidirectional image drawing unit 32, thereby causing the display area information to be displayed on the display unit 17 of the display control device 1.
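The notification sent by the display area notification unit 34 could, for example, carry the specified display area as a small serializable record. The sketch below is illustrative only: the patent does not specify a wire format, and the JSON field names, the device identifier, and the choice of angular center-plus-size encoding are all assumptions.

```python
import json

def make_display_area_notification(device_id, theta_deg, phi_deg,
                                   width_deg, height_deg):
    """Build a notification message describing a display area as the
    azimuth/elevation of its center (the intersection point P) plus its
    angular width and height. All field names are hypothetical."""
    return json.dumps({
        "device": device_id,
        "center": {"theta": theta_deg, "phi": phi_deg},
        "size": {"w": width_deg, "h": height_deg},
    })
```

On the display control device side, the display area specifying unit 12 would parse such a message to recover the area to be drawn as display area information.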
 The display unit 35 is a device that displays images. The display unit 35 may be non-transmissive or transmissive. With a transmissive display unit 35, the user can be provided with a mixed reality space in which the image displayed by the display unit 35 is superimposed on the view outside the display device 3 (the real space). The display unit 35 may also be a display device externally attached to the display device 3, such as an ordinary flat panel display.
 The storage unit 36 stores various data used by the display device 3. Although omitted from FIG. 1, the storage unit 36 stores the omnidirectional image 21, as in the display control device 1. When a superimposed image 23 is to be displayed, the superimposed image 23 is also stored in the storage unit 36.
 The communication unit 38 allows the display device 3 to communicate with other devices (in this example, the display control device 1). The communication unit 38 and the communication unit 19 of the display control device 1 may communicate in a full-mesh (peer-to-peer) manner, or may communicate indirectly via a predetermined communication network (for example, the Internet) and other equipment.
[Displaying an Image According to the Line-of-Sight Direction]
 A method by which the display device 3 displays an image corresponding to the user's line-of-sight direction will be described with reference to FIG. 2, which shows the relationship between the omnidirectional image and the display area. In FIG. 2, the omnidirectional image A1 is shown in a three-dimensional coordinate space defined by mutually orthogonal x, y, and z axes. The omnidirectional image A1 forms a full sphere of radius r. The z-axis direction coincides with the vertical direction of the sensor 37 in the real space, the y-axis direction coincides with the front direction of the sensor 37 in the real space, and the x-axis direction coincides with the left-right direction of the sensor 37 in the real space.
 The line-of-sight direction specifying unit 31 determines, from the output values of the sensor 37, which direction the sensor 37 is facing. Since the sensor 37 is mounted in the display device 3 in a predetermined orientation, the orientation of the sensor 37 can be regarded as the user's line-of-sight direction as long as the user wears the display device 3 in the correct orientation; the orientation of the sensor 37 is therefore treated as the user's line-of-sight direction below. The line-of-sight direction specifying unit 31 can express the line-of-sight direction as a combination of an azimuth angle (yaw) θ (-180° ≤ θ ≤ 180°), which is the rotation angle about the vertical axis (z-axis), and an elevation angle (pitch) φ (-90° ≤ φ ≤ 90°), which is the rotation angle about the horizontal axis (x-axis).
 When the line-of-sight direction specifying unit 31 has specified the azimuth and elevation angles indicating the line-of-sight direction, the omnidirectional image drawing unit 32 obtains the intersection point P between the omnidirectional image A1 and a straight line extending from the user's viewpoint position Q in the direction indicated by the specified azimuth and elevation angles. In the omnidirectional image A1, the area of height h and width w centered on this intersection point P is specified as the display area A11. The omnidirectional image drawing unit 32 then causes the display unit 35 to display the portion of the omnidirectional image A1 within this display area A11. The display area A11 thus changes in conjunction with the user's line-of-sight direction, and the image displayed on the display unit 35 changes accordingly. For simplicity of description, the viewpoint position Q within the sphere is assumed to be fixed in this embodiment, but the viewpoint position Q may be moved within the sphere in conjunction with the user's movement in the real space.
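The computation of the intersection point P can be sketched numerically. This is a minimal sketch assuming the axis convention of FIG. 2 (y forward, x right, z up) with the viewpoint Q at the sphere center and angles given in degrees; the function name and signature are illustrative, not taken from the patent.

```python
import math

def gaze_intersection(theta_deg, phi_deg, r=1.0):
    """Intersection P of the gaze ray from viewpoint Q (sphere center)
    with the omnidirectional sphere of radius r.
    theta_deg: azimuth (yaw) about the vertical z-axis, -180..180
    phi_deg:   elevation (pitch) about the horizontal x-axis, -90..90
    Axis convention of FIG. 2: y is front, x is right, z is up."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    x = r * math.cos(phi) * math.sin(theta)
    y = r * math.cos(phi) * math.cos(theta)
    z = r * math.sin(phi)
    return (x, y, z)
```

The display area A11 would then be taken as the h-by-w region of the sphere surface centered on the returned point P.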
 [Image to be displayed]
 An image displayed by the display control device 1 will be described with reference to FIG. 3. FIG. 3 is a diagram illustrating an image displayed by the display control device 1. As illustrated in FIG. 3, the display control device 1 displays, on the display unit 17, a planar image obtained by mapping the omnidirectional image A1 onto a two-dimensional plane. As described above, a position on the omnidirectional image A1 can be expressed as a combination of an azimuth angle θ (−180° ≤ θ ≤ 180°) and an elevation angle φ (−90° ≤ φ ≤ 90°), the rotation angle about the horizontal axis (x-axis). In the illustrated example, the azimuth angle at the left end of the omnidirectional image A1 is −180°, the azimuth angle at the right end is 180°, the elevation angle at the upper end is 90°, and the elevation angle at the lower end is −90°.
 Display area information A11 indicating the display area of a display device 3 is superimposed on the omnidirectional image A1. Display area information A12 and A13 indicating the display areas of other display devices 3 are also superimposed. That is, in FIG. 3, the display areas of three display devices 3 are represented by A11 to A13, respectively. In this way, the display control device 1 can also display the display area information of a plurality of display devices 3 on a single screen.
 Area guide information B1 and B2 are also superimposed on the omnidirectional image A1. The area guide information B1 and B2 indicate guide areas, that is, image areas that should be displayed on the display devices 3. By displaying the area guide information B1 and B2 together with the display area information A11 to A13, the instructor can easily recognize when a trainee using a display device 3 is not looking at a guide area.
 The area guide information B1 and B2 can be selected by an input operation via the input unit 18. When the instructor selects the area guide information B1, the partial image of the guide area indicated by the area guide information B1 is displayed on the display device 3; the same applies when the area guide information B2 is selected. In this way, the partial images of the guide areas indicated by the area guide information B1 and B2 can be forcibly displayed on the display devices 3. Note that the same partial image may be displayed on a plurality of display devices 3 at once, or different partial images may be displayed by selecting different area guide information for each display device 3.
 The display position of the display area information may also be made changeable by an input operation via the input unit 18. When the display position of the display area information is moved, the image area of the corresponding display device 3 may be moved to the destination position of the display area information.
 [Example of guide information]
 The guide information 22 may be, for example, information such as that shown in FIG. 4. FIG. 4 is a diagram illustrating an example of the guide information 22. The guide information 22 in FIG. 4(a) is data in table format in which "guide area", "azimuth angle range", "elevation angle range", and "reproduction time" are associated with one another.
 "Guide area" is identification information of a guide area; in this example, the name of the guide area is described. "Azimuth angle range" and "elevation angle range" are information indicating the range that the guide area occupies in the omnidirectional image. For example, in the illustrated guide information 22, guide area B1 (the guide area indicated by the area guide information B1 in FIG. 3) is defined as the area occupying the rectangle in the omnidirectional image whose left-end azimuth angle is −110°, right-end azimuth angle is −30°, lower-end elevation angle is −10°, and upper-end elevation angle is 40°.
 "Reproduction time" is information indicating the reproduction time period in which the guide area is set. For example, since the "reproduction time" of the illustrated guide area B1 is 00:01:00 to 00:05:00, this guide area B1 is set during the period in which the reproduction time of the content is 00:01:00 to 00:05:00.
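As a concrete sketch, the table of FIG. 4(a) might be held as a list of records and queried by reproduction time. The values for B1 follow the text; the entry for B2 and all field names are hypothetical illustrations, not specified by the embodiment.

```python
# Hypothetical encoding of the table-format guide information of FIG. 4(a).
# B1 matches the values given in the text; B2 is an invented example entry.
GUIDE_INFO = [
    {"guide_area": "B1", "azimuth": (-110, -30), "elevation": (-10, 40),
     "time": ("00:01:00", "00:05:00")},
    {"guide_area": "B2", "azimuth": (30, 100), "elevation": (-30, 20),
     "time": ("00:03:00", "00:08:00")},
]

def to_seconds(hms):
    # Convert an "hh:mm:ss" reproduction time into seconds.
    h, m, s = map(int, hms.split(":"))
    return h * 3600 + m * 60 + s

def active_guide_areas(playback_time, guide_info=GUIDE_INFO):
    """Names of the guide areas set for the given reproduction time."""
    t = to_seconds(playback_time)
    return [g["guide_area"] for g in guide_info
            if to_seconds(g["time"][0]) <= t <= to_seconds(g["time"][1])]
```

At reproduction time 00:02:00 only B1 is set; at 00:04:00 both hypothetical entries overlap.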
 In the guide information 22 of FIG. 4(b), on the other hand, the "azimuth angle range" and "elevation angle range" included in the guide information 22 of FIG. 4(a) are replaced with "width", "height", "azimuth angle", and "elevation angle". That is, in the guide information 22 of FIG. 4(b), "width", "height", "azimuth angle", and "elevation angle" are the information indicating the range occupied by a guide area. More specifically, "width" and "height" indicate the width and height of the guide area, respectively, and "azimuth angle" and "elevation angle" indicate a reference position for specifying the guide area. This reference position can be any position on the guide area; for example, when the guide area is rectangular, the position of its lower-left corner may be used as the reference position. In that case, the lower-left corner is located at the position indicated by "azimuth angle" and "elevation angle", and the area having the width and height indicated by "width" and "height" becomes the guide area.
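The two representations of FIG. 4(a) and FIG. 4(b) carry the same information and can be converted into each other. A minimal sketch, assuming the lower-left-corner convention for the reference position (the function names are illustrative):

```python
def ranges_to_size_and_ref(az_range, el_range):
    """FIG. 4(a) form -> FIG. 4(b) form, taking the lower-left corner
    of the rectangle as the reference position."""
    left, right = az_range
    bottom, top = el_range
    return {"width": right - left, "height": top - bottom,
            "azimuth": left, "elevation": bottom}

def size_and_ref_to_ranges(width, height, azimuth, elevation):
    """FIG. 4(b) form -> FIG. 4(a) form (inverse of the above)."""
    return {"azimuth": (azimuth, azimuth + width),
            "elevation": (elevation, elevation + height)}
```

Applied to guide area B1 of the text (azimuth −110° to −30°, elevation −10° to 40°), this yields a width of 80°, a height of 50°, and the reference position (−110°, −10°), and the inverse conversion recovers the original ranges.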
 [Example of display area management information]
 As described above, the display area specifying unit 12 specifies the display area of a display device 3. The display area specifying unit 12 may manage the specified display area as display area management information such as that shown in FIG. 5, for example. FIG. 5 is a diagram illustrating an example of display area management information for managing the display areas of the display devices 3. The display area management information in FIG. 5(a) is data in table format in which "user", "azimuth angle range", and "elevation angle range" are associated with one another.
 "User" is identification information of the user corresponding to a display area, that is, the user viewing the image of that display area; in this example, a user name is described. Note that the display area management information may include identification information of the display device 3 in addition to, or instead of, the "user" information.
 "Azimuth angle range" and "elevation angle range" are information indicating the range that the display area of the display device 3 occupies in the omnidirectional image. For example, in the illustrated display area management information, the display area of the display device 3 of user X1 is defined as the area occupying the rectangle in the omnidirectional image whose left-end azimuth angle is −85°, right-end azimuth angle is −5°, lower-end elevation angle is 25°, and upper-end elevation angle is 75°.
 In the display area management information of FIG. 5(b), on the other hand, the "azimuth angle range" and "elevation angle range" included in the display area management information of FIG. 5(a) are replaced with "width", "height", "azimuth angle", and "elevation angle". That is, in the display area management information of FIG. 5(b), "width", "height", "azimuth angle", and "elevation angle" are the information indicating the range occupied by a display area. More specifically, "width" and "height" indicate the width and height of the display area, respectively, and "azimuth angle" and "elevation angle" indicate a reference position for specifying the display area. This reference position can be any position on the display area; for example, when the display area is rectangular, its center position may be used as the reference position. In that case, the center is located at the position indicated by "azimuth angle" and "elevation angle", and the area having the width and height indicated by "width" and "height" becomes the display area.
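With the guide areas and the managed display areas both expressed as angle ranges, the check that lets the instructor see whether a trainee is looking at a guide area reduces to a rectangle-overlap test. The sketch below is illustrative only: the 50 % coverage threshold and the omission of azimuth wrap-around are simplifying assumptions.

```python
def interval_overlap(a, b):
    # Overlap length of two closed angle intervals (no wrap-around).
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def sees_guide_area(display, guide, threshold=0.5):
    """True if at least `threshold` of the guide area's solid rectangle
    falls inside the user's display area.

    Both arguments are dicts of the form
    {"azimuth": (left, right), "elevation": (bottom, top)} in degrees.
    """
    overlap = (interval_overlap(display["azimuth"], guide["azimuth"]) *
               interval_overlap(display["elevation"], guide["elevation"]))
    guide_size = ((guide["azimuth"][1] - guide["azimuth"][0]) *
                  (guide["elevation"][1] - guide["elevation"][0]))
    return overlap >= threshold * guide_size
```

With the example values from the text, user X1's display area (azimuth −85° to −5°, elevation 25° to 75°) covers only a sliver of guide area B1 (azimuth −110° to −30°, elevation −10° to 40°), so the check reports that X1 is not looking at B1.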
 [Process flow (display control device)]
 An example of the flow of processing executed by the display control device 1 (a method for controlling the display control device) will be described with reference to FIG. 6. FIG. 6 is a flowchart illustrating an example of processing in which the display control device 1 displays an image. Although an example with a single display device 3 is described below, the processing in the case of a plurality of display devices 3 is the same.
 In S1, the omnidirectional image drawing unit 11 acquires the omnidirectional image 21 from the storage unit 20 and causes the display unit 17 to display it via the synthesis unit 15. In S2 (display area specifying step), the display area specifying unit 12 communicates with the display device 3 and specifies the display area of that display device 3. The display area specifying unit 12 may manage the specified display area as display area management information such as that shown in FIG. 5.
 In S3, the guide unit 14 accepts designation of the position of a guide area by the instructor, and in S4 the guide unit 14 specifies the guide area. The designation of the position of the guide area in S3 may be performed, for example, by inputting a position desired by the instructor in the omnidirectional image 21 via the input unit 18. In this case, in S4, the guide unit 14 specifies a guide area that is centered on the input position and has a size corresponding to the width and height of the display area of the display device 3. When the guide area is specified using the guide information 22 (see FIG. 4), the processing of S3 is omitted, and in S4 the guide unit 14 specifies the guide area by referring to the guide information 22.
 In S5 (display area notification step), the guide unit 14 causes the display unit 17 to display, via the synthesis unit 15, area guide information indicating the guide area specified in S4. The display area notification unit 13 likewise causes the display unit 17 to display, via the synthesis unit 15, display area information indicating the display area specified in S2. The area guide information and the display area information are thereby displayed superimposed on the omnidirectional image 21. Note that the area guide information and the display area information may be displayed in separate steps. The processing of S3 and S4 may also be performed before S2 or in parallel with S2.
 In S6, the guide unit 14 determines whether the area guide information displayed in S5 has been selected. For example, the guide unit 14 may determine that the area guide information has been selected when the displayed area guide information is selected by the instructor through an input operation via the input unit 18. The method of selecting the area guide information is not particularly limited; for example, the displayed area guide information may be selectable by clicking or tapping it. On the other hand, the guide unit 14 may determine that the area guide information has not been selected when the displayed area guide information remains unselected for a predetermined time, or when an input operation by the instructor choosing not to select any area guide information is detected. If it is determined in S6 that the area guide information has been selected (YES in S6), the processing proceeds to S7; if it is determined that it has not been selected (NO in S6), the processing returns to S1.
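The selection test of S6 can be pictured as a simple hit test of the click (or tap) position against the displayed area guide information. This sketch assumes both are given in the equirectangular angle coordinates of the omnidirectional image; the function name and data layout are illustrative.

```python
def pick_guide_area(click, guide_areas):
    """Name of the guide area whose rectangle contains the clicked
    position, or None if the click misses every area.

    click: (azimuth, elevation) in degrees;
    guide_areas: {name: {"azimuth": (l, r), "elevation": (b, t)}}.
    """
    az, el = click
    for name, area in guide_areas.items():
        left, right = area["azimuth"]
        bottom, top = area["elevation"]
        if left <= az <= right and bottom <= el <= top:
            return name
    return None
```

A None result corresponds to the NO branch of S6 (no area guide information selected); a name corresponds to the YES branch leading to S7.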
 In S7, the guide unit 14 notifies the display device 3 of the guide area indicated by the selected area guide information and instructs the display device 3 to use that area as its display area. As a result, the display area of the display device 3 is changed, and the image of the guide area indicated by the area guide information is displayed. Thereafter, the processing returns to S1.
 [Automatic extraction of guide areas]
 A guide area may be set by automatically detecting an object from the omnidirectional image; in this case, an area of a predetermined range including the display area of the detected object may be extracted as the guide area. This allows the instructor to easily confirm whether a trainee is looking at a predetermined object in the content.
 For automatic detection, the appearance (shape, size, color, and the like) of the object to be detected may be determined in advance. An object with such an appearance can then be detected automatically by analyzing the omnidirectional image. Objects may also be detected using machine learning or the like. The function of the detection unit that detects objects may be given to the guide unit 14, or a separate processing block for the detection unit may be provided and objects detected in that processing block.
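As a toy illustration of appearance-based extraction (a real detection unit would analyze actual image data with an image-processing library or a learned model, which the text leaves open), the sketch below scans a small grid of color labels for a predefined target color and pads its bounding box by a margin to form a guide area:

```python
def extract_guide_area(image, target, margin=1):
    """Bounding box of all cells equal to `target`, padded by `margin`
    and clipped to the image, as (row0, col0, row1, col1);
    None if the target color does not appear.

    image: 2D list of color labels (a stand-in for real pixel data).
    """
    hits = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v == target]
    if not hits:
        return None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return (max(0, min(rows) - margin), max(0, min(cols) - margin),
            min(len(image) - 1, max(rows) + margin),
            min(len(image[0]) - 1, max(cols) + margin))
```

The padding corresponds to extracting "an area of a predetermined range including the display area of the detected object" as the guide area.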
 [Process flow (display device)]
 An example of the flow of processing executed by the display device 3 (a method for controlling the display device) will be described with reference to FIG. 7. FIG. 7 is a flowchart illustrating an example of processing in which the display device 3 displays an image.
 In S11, the line-of-sight direction specifying unit 31 specifies the line-of-sight direction of the trainee wearing the display device 3, and the omnidirectional image drawing unit 32 specifies the display area in the omnidirectional image from the line-of-sight direction specified by the line-of-sight direction specifying unit 31. In S12, the omnidirectional image drawing unit 32 causes the display unit 35 to draw (display), via the synthesis unit 33, the partial image of the omnidirectional image corresponding to the specified display area.
 In S13, the display area notification unit 34 notifies the display control device 1 of the specified display area. Specifically, the display area notification unit 34 transmits to the display control device 1 identification information of the display device 3 or of its user, in association with information indicating the range that the display area of the display device 3 occupies in the omnidirectional image (for example, information indicating the azimuth angle range and the elevation angle range).
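The notification of S13 could be serialized, for instance, as a small JSON message carrying the identification information together with the angle ranges. The text does not specify any wire format, so the field names and the use of JSON here are assumptions.

```python
import json

def make_display_area_notification(device_or_user_id, azimuth_range, elevation_range):
    """Build the S13 notification sent from the display area notification
    unit 34 to the display control device 1 (format is hypothetical)."""
    return json.dumps({
        "id": device_or_user_id,
        "azimuth": list(azimuth_range),      # (left, right) in degrees
        "elevation": list(elevation_range),  # (bottom, top) in degrees
    })
```

On the display control device 1 side, the display area specifying unit 12 would parse such a message in S2 and record it in the display area management information of FIG. 5.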
 In S14, the omnidirectional image drawing unit 32 determines whether an instruction to change the display area has been received from the display control device 1. If it is determined that no instruction has been received (NO in S14), the processing returns to S11; if it is determined that one has been received (YES in S14), the processing proceeds to S15.
 In S15, the omnidirectional image drawing unit 32 changes the display area in accordance with the instruction received from the display control device 1, and causes the display unit 35 to draw (display), via the synthesis unit 33, the partial image of the omnidirectional image corresponding to the changed display area. Thereafter, the processing returns to S11.
 [Changing the display area]
 In S15 above, when the amount of change in the position of the display area before and after the change (the amount of change in azimuth angle and elevation angle) is equal to or greater than a predetermined value, the omnidirectional image drawing unit 32 may change the display area after performing any of the following image processing (1) to (4). This makes the trainee less likely to experience motion sickness even when the display area changes abruptly.
 (1) Black out while the display area is being changed (display a black screen)
 (2) Switch the display area by crossfade (fade out the partial image from before the change while fading in the partial image after the change)
 (3) Blur the video while the display area is being changed (lower the resolution or perceived sharpness)
 (4) Reduce the speed at which the display area is changed
 Note that the determination of whether the amount of change in the position of the display area before and after the change is equal to or greater than the predetermined value may instead be performed on the display control device 1 side. In this case, when the amount of change is determined to be equal to or greater than the predetermined value, the display control device 1 may instruct the display device 3 to execute any of the image processing (1) to (4).
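The threshold test behind these measures might look as follows. The 30° threshold and the use of center positions (azimuth, elevation) are illustrative choices; taking azimuth differences the shorter way around the sphere is an assumption the text does not spell out.

```python
def change_magnitude(before, after):
    """Angular change between two display-area center positions
    (azimuth, elevation) in degrees, with azimuth differences taken
    the shorter way around the +/-180 deg seam."""
    d_az = abs(after[0] - before[0]) % 360
    d_az = min(d_az, 360 - d_az)
    d_el = abs(after[1] - before[1])
    return max(d_az, d_el)

def needs_mitigation(before, after, threshold=30):
    """True if the jump is large enough that one of the measures
    (1)-(4) (blackout, crossfade, blur, slowed movement) should be
    applied before switching. The threshold value is illustrative."""
    return change_magnitude(before, after) >= threshold
```

A jump across the seam from azimuth 170° to −170° correctly measures as 20°, not 340°, so it would not trigger a mitigation under this threshold.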
 [Embodiment 2]
 Another embodiment of the present invention will be described with reference to FIGS. 8 to 11. For convenience of explanation, members having the same functions as members described in the preceding embodiment are given the same reference signs, and their description is omitted. The same applies to Embodiment 3 and subsequent embodiments.
 The present embodiment describes an example in which a superimposed image is displayed superimposed on the omnidirectional image. The superimposed image is displayed on the display unit 17 of the display control device 1 and also on the display unit 35 of the display device 3. An image displayed by the display control device 1 of the present embodiment will be described with reference to FIG. 8. FIG. 8 is a diagram illustrating an image, displayed by the display control device 1, in which a superimposed image is superimposed on the omnidirectional image.
 In the example of FIG. 8, as in FIG. 3, the display area information A11 to A13 and the area guide information B1 and B2 are displayed superimposed on the omnidirectional image A1. Area information C1 to C3 is also superimposed on the omnidirectional image A1. Of these, the area information C1 and C2 is superimposition area information indicating superimposition areas, that is, areas in which a superimposed image is displayed. The area information C3 is superimposition-prohibited area information indicating a superimposition-prohibited area, that is, an area in which no superimposed image is displayed.
 The instructor using the display control device 1 can select the area information C1 or C2 to display a superimposed image in the superimposition area indicated by that area information.
 The superimposed image may be any image related to the content on which it is superimposed (the omnidirectional image A1 in this example). For example, the superimposed image may be moving image content or still image content, or may be an annotation displayed as a note on the omnidirectional image A1 or on another superimposed image.
 The superimposed image is an image capturing the same subject as the omnidirectional image A1 (the cityscape in this example), but may be an image captured with an imaging device different from the one that captured the omnidirectional image A1. The superimposed image may also have a higher resolution than the omnidirectional image A1 and therefore be a higher-definition image than the omnidirectional image A1. The superimposed image may be, for example, an image of the subject of the omnidirectional image A1 captured at the same angle as the omnidirectional image A1, or an image captured at a different angle (for example, an image from a specific viewpoint). The superimposed image may also be an enlarged image of part of the subject of the omnidirectional image A1.
 Displaying a superimposed image makes it possible to give the user multifaceted information about the subject of the omnidirectional image A1. For example, by using as a superimposed image a high-resolution enlarged image of a specific building in the guide area of the area guide information B1, the user can examine part of that building in detail while checking the overall view of the cityscape. Also, for example, by using as a superimposed image an image of a specific building captured at an angle different from that of the omnidirectional image A1, the user can also check portions of the building that are not captured at the imaging angle of the omnidirectional image A1.
 The content of an annotation is not particularly limited as long as it relates to the omnidirectional image A1 or to a superimposed image. For example, information about the omnidirectional image A1 may be information indicating the state, motion, or name of the subject, a noteworthy portion of the omnidirectional image A1, and the like. Information about a superimposed image may include the angle at which it was captured, whether it is an enlarged image, which part of the subject it shows, and so on. In addition, for example, a UI (User Interface) menu for operating the display control device 1 or a UI menu for controlling the operation of the display device 3 may be displayed as an annotation.
 In the example of FIG. 8, the omnidirectional image A1 is an image whose subject is a cityscape, but the subject is arbitrary. For example, the omnidirectional image A1 may be an image capturing the overall state of an operating room in which surgery is being performed. In this case, the subject may include the surgeon, assistants, the patient, surgical instruments, various devices, and the like. By using such an image as the omnidirectional image A1, the display control device 1 and the display devices 3 can be used for medical education.
 For example, by using as a superimposed image an image from the viewpoint of a person in the omnidirectional image A1, such as the surgeon or an assistant, the trainee can learn what each person should pay attention to during surgery while checking the progress of the entire operation in the omnidirectional image A1. It may be made possible to switch which person's viewpoint image is displayed according to the learning purpose; for example, a superimposed image from the surgeon's viewpoint may be displayed when used for training surgeons, and a superimposed image from an assistant's viewpoint when used for training assistants. Superimposed images from different viewpoints may also be displayed to different trainees. Furthermore, by using as a superimposed image an image of a screen displaying the patient's vital data, the user can be made aware of the relationship between transitions in the vital data during surgery and the actions each person takes in response. In addition, for example, a high-resolution image of the operative field may be used as a superimposed image, allowing the user to recognize the details of the surgeon's fine work. Information necessary for the surgery, device operation information (for example, turning a heart-lung machine on or off), and the like may also be displayed as annotations.
 [Example of area management information]
 Area management information defining the superimposition areas and superimposition-prohibited areas described above may be stored in the storage unit 20 or the like. In this case, the superimposition processing unit 16 can specify the superimposition areas and superimposition-prohibited areas by referring to the stored area management information.
 The area management information may be, for example, information such as that shown in FIG. 9. FIG. 9 is a diagram illustrating an example of area management information defining superimposition areas and superimposition-prohibited areas. The area management information in FIG. 9(a) is information in table format in which "area", "azimuth angle range", "elevation angle range", "superimpose/prohibit", and "reproduction time" are associated with one another.
 "Area" is identification information of a superimposition area or superimposition-prohibited area; in this example, the name of the area is described. "Azimuth angle range" and "elevation angle range" are information indicating the range that the superimposition area or superimposition-prohibited area occupies in the omnidirectional image. For example, in the illustrated area management information, area C1 is defined as the area occupying the rectangle in the omnidirectional image whose left-end azimuth angle is −80°, right-end azimuth angle is −100°, lower-end elevation angle is 15°, and upper-end elevation angle is 30°.
"Superimposition/prohibition" is information indicating whether the "area" is a superimposition area or a superimposition-prohibited area. An area whose "superimposition/prohibition" field is "superimposition" is a superimposition area, and an area whose field is "prohibition" is a superimposition-prohibited area.
"Reproduction time" is information indicating the reproduction time period during which the superimposition area or superimposition-prohibited area is set. For example, since the "reproduction time" of the illustrated area C1 (the area indicated by the area information C1 in FIG. 8) is 00:01:00 to 00:05:00, this area C1 is set during the period in which the content reproduction time is from 00:01:00 to 00:05:00.
In the area management information in (b) of FIG. 9, on the other hand, the "azimuth angle range" and "elevation angle range" of the area management information in (a) are replaced by "width", "height", "azimuth angle", and "elevation angle". That is, in the area management information in (b) of FIG. 9, "width", "height", "azimuth angle", and "elevation angle" are the information indicating the range occupied by the superimposition area or superimposition-prohibited area. More specifically, "width" and "height" indicate the width and height of the superimposition area or superimposition-prohibited area, respectively, while "azimuth angle" and "elevation angle" indicate a reference position for locating the area. The reference position can be any position on the superimposition area or superimposition-prohibited area; for example, when the area is rectangular, the position of its lower-left corner may be used as the reference position. In this case, the lower-left corner is placed at the position indicated by "azimuth angle" and "elevation angle", and the region having the width and height indicated by "width" and "height" becomes the superimposition area or superimposition-prohibited area.
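As a rough illustration of how the two formats of FIG. 9 can describe the same region, the following Python sketch converts a format-(b) entry, given as a lower-left reference position plus width and height, into the explicit ranges of format (a). All field names and the dictionary encoding are assumptions for illustration, not taken from the specification.

```python
# Illustrative encoding of the two area-management-information formats
# of FIG. 9; field names are hypothetical, not from the specification.

def normalize_region(entry):
    """Return (azimuth_range, elevation_range) for a table entry.

    Format (a) stores explicit ranges; format (b) stores a reference
    position (here assumed to be the lower-left corner) plus width
    and height.
    """
    if "azimuth_range" in entry:                      # format (a)
        return entry["azimuth_range"], entry["elevation_range"]
    az, el = entry["azimuth"], entry["elevation"]     # format (b)
    return (az, az + entry["width"]), (el, el + entry["height"])

# Format (a): area C1 of the example in the text.
c1_a = {"area": "C1", "azimuth_range": (-100, -80),
        "elevation_range": (15, 30), "mode": "superimpose",
        "time": ("00:01:00", "00:05:00")}
# The same region expressed in format (b).
c1_b = {"area": "C1", "azimuth": -100, "elevation": 15,
        "width": 20, "height": 15, "mode": "superimpose",
        "time": ("00:01:00", "00:05:00")}

assert normalize_region(c1_a) == normalize_region(c1_b)
```

Normalizing both formats to one internal representation lets the superimposition processing unit 16 treat entries uniformly regardless of which table layout was stored.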
[Enlarged display of superimposition areas and superimposition-prohibited areas]
A superimposition area may be sized to fit within the display area; in this case, the superimposition area is even narrower than the display area, which is itself only a part of the image area of the omnidirectional image. For this reason, the content may be displayed enlarged when the instructor selects a superimposition area. This makes the superimposition area easier to select than in the example of FIG. 8. This is described below with reference to FIG. 10.
FIG. 10 is a diagram showing an example in which the superimposition area information and the superimposition-prohibited area information are displayed over enlarged content. In the example of FIG. 10, the partial image of the display area B1 of FIG. 8 is displayed on the full screen of the display unit 17 of the display control device 1. Because the partial image is displayed full-screen, the area information C1 and C3 can be examined in more detail than in the example of FIG. 8. Although the content is enlarged to the full screen here, it may instead be enlarged at an arbitrary magnification and superimposed on the original content.
In the example of FIG. 10, the area information C1 indicating a superimposition area and the area information C3 indicating a superimposition-prohibited area are displayed in different manners. Specifically, the area information C3 is marked with an "X" indicating that superimposition is prohibited. This allows the instructor to intuitively distinguish superimposition areas from superimposition-prohibited areas.
The area information C3 is an area set so as to include the display area of the object D1. As a result, the object D1 does not become hidden even when a superimposed image is displayed. The superimposition-prohibited area may also be set by automatically detecting the object D1 from the omnidirectional image; in this case, a region of a predetermined extent including the display area of the detected object D1 may be set as the superimposition-prohibited area. For automatic detection, the appearance (shape, size, color, etc.) of the objects to be detected may be defined in advance, so that objects with such an appearance can be detected automatically by analyzing the omnidirectional image. Objects may also be detected using machine learning or the like. Such object detection may be performed by the superimposition processing unit 16, or a separate processing block for detecting objects may be provided and the detection performed there.
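A minimal sketch of the automatic variant described above, assuming the detector returns an angular bounding box for the object: expanding the box by a fixed margin yields a prohibited region of a predetermined extent containing the object. The rectangle encoding and margin value are illustrative assumptions.

```python
# Illustrative sketch (not from the specification): deriving a
# superimposition-prohibited region from a detected object's bounding
# box by expanding it with a fixed angular margin on all sides.

def prohibited_region(bbox, margin):
    """bbox = (az_min, az_max, el_min, el_max) of the detected object,
    in degrees; returns the expanded prohibited rectangle."""
    az_min, az_max, el_min, el_max = bbox
    return (az_min - margin, az_max + margin,
            el_min - margin, el_max + margin)

# Object D1 detected at azimuth 10..20 deg, elevation 0..25 deg
# (made-up values for illustration).
region = prohibited_region((10, 20, 0, 25), margin=5)
assert region == (5, 25, -5, 30)
```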
[Processing flow (superimposed image display)]
An example of the flow of processing executed by the display control device 1 of the present embodiment (a method for controlling the display control device) is described below with reference to FIG. 11. FIG. 11 is a flowchart illustrating an example of the processing performed when the display control device 1 causes the display device 3 to display a superimposed image. Note that FIG. 11 shows the processing after the omnidirectional image has been displayed on the display unit 17. Although an example with a single display device 3 is described below, the processing is the same when there are a plurality of display devices 3.
In S21, the display area specifying unit 12 communicates with the display device 3 to identify the display area of the display device 3. In S22, the superimposition processing unit 16 extracts the superimposition areas in the displayed omnidirectional image. Specifically, the superimposition processing unit 16 extracts, from the areas listed in the area management information (see FIG. 9), those that correspond to the current content reproduction time and whose "superimposition/prohibition" field is "superimposition". In S22, only the display area identified in S21 may be the extraction target, or a wider range (for example, the display area and a predetermined surrounding region) may be the extraction target. In that case, among the predetermined areas listed in the area management information (the areas specified by the azimuth angle range and elevation angle range), those at least partly contained in the extraction target region are extracted.
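The extraction in S22 can be sketched as follows. The table layout and field names are illustrative assumptions; times are compared as zero-padded HH:MM:SS strings, which order correctly under plain string comparison.

```python
# Hypothetical sketch of the extraction in S22: keep the entries whose
# reproduction-time window covers the current time, whose mode is
# "superimpose", and which overlap the extraction target region (the
# display area identified in S21, possibly widened).

def overlaps(a, b):
    """1-D open-interval overlap test for (lo, hi) pairs."""
    return a[0] < b[1] and b[0] < a[1]

def extract_superimposition_regions(table, now, target_az, target_el):
    result = []
    for e in table:
        t0, t1 = e["time"]
        if not (t0 <= now <= t1):
            continue            # outside the reproduction time window
        if e["mode"] != "superimpose":
            continue            # prohibited areas are skipped here
        if overlaps(e["azimuth_range"], target_az) and \
           overlaps(e["elevation_range"], target_el):
            result.append(e["area"])
    return result

table = [
    {"area": "C1", "azimuth_range": (-100, -80),
     "elevation_range": (15, 30), "mode": "superimpose",
     "time": ("00:01:00", "00:05:00")},
    {"area": "C3", "azimuth_range": (-60, -40),
     "elevation_range": (0, 20), "mode": "prohibit",
     "time": ("00:01:00", "00:05:00")},
]
found = extract_superimposition_regions(
    table, "00:02:30", target_az=(-120, -50), target_el=(0, 40))
assert found == ["C1"]
```

Extending the same loop to also collect `"prohibit"` entries gives the variant described later, in which prohibited-area information is displayed in S25 as well.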
In S23, the superimposition processing unit 16 determines whether there is a superimposed image to be displayed in any superimposition area extracted in S22. When area management information such as that in FIG. 9 is set in advance, the superimposed images to be displayed in each area may also be set in advance. In this case, in S23, the superimposition processing unit 16 determines whether any superimposed image is associated with a superimposition area extracted in S22. If it is determined in S23 that there is a superimposed image (YES in S23), the processing proceeds to S24; if it is determined that there is none (NO in S23), the processing returns to S21.
In S24, the superimposition processing unit 16 acquires information indicating the display manner of the superimposed image. Such information may be stored in advance in association with the superimposed image. Examples of information indicating the display manner include the presence or absence of perspective (perspective display, that is, display by perspective projection), the transparency of the superimposed image, and the decoration method (for example, whether image processing that blurs the outline of the superimposed image is applied).
In S25, the superimposition processing unit 16 causes the display unit 17 to display superimposition area information indicating the superimposition areas extracted in S22. At this time, the superimposition processing unit 16 may enlarge the image region containing the superimposition areas (see FIG. 10).
In S26, the superimposition processing unit 16 accepts the selection of a superimposition area and a superimposed image. The superimposition area may be selected, for example, by selecting the displayed superimposition area information through an input operation on the input unit 18. When a plurality of superimposed images are associated with the selected superimposition area, those superimposed images may be displayed so that the instructor can choose among them.
In S27, the compositing unit 15 composites the selected superimposed image onto the superimposition area accepted in S26 and causes the display unit 17 to render (display) the result. The superimposed image may also be displayed as a preview before S26.
In S28, the superimposition processing unit 16 notifies the display device 3 of the superimposition area and superimposed image selected in S26 and instructs it to superimpose the superimposed image on that area. As a result, the superimposed image is displayed superimposed on the superimposition area on the display device 3 as well. If the display device 3 does not store the superimposed image, the superimposition processing unit 16 may transmit the superimposed image to the display device 3. After this, the processing returns to S21.
[Superimposition-prohibited areas]
In the processing of S22, the superimposition-prohibited areas may also be extracted, and in S25, superimposition-prohibited area information indicating them may also be displayed. Superimposition-prohibited areas are useful, for example, (1) when the instructor can freely set the position of a superimposition area, (2) when the position of a superimposition area is determined relative to the position of an object in the content, and (3) when the position of a superimposition area is determined relative to the position of the display area.
In case (1), the instructor can freely set the position of the superimposition area anywhere outside the superimposition-prohibited areas, so the freedom in positioning the superimposition area is preserved while the superimposition-prohibited areas are never hidden by the superimposed image.
In case (2), suppose, for example, that an object having a predetermined appearance (for example, the building in the example of FIG. 8) is detected in the content and a superimposed image is displayed around it (for example, at a position shifted by a predetermined offset from a reference position in the object). If there is a superimposition-prohibited area around the object, setting the superimposition area so as to avoid it allows the superimposed image to be displayed in association with the object while keeping the superimposition-prohibited area from being hidden by the superimposed image.
As an example of case (3), a superimposed image may be displayed at a position shifted by a predetermined offset from a reference position of the display area (for example, its center). The display area and the superimposition area then keep a fixed positional relationship even when the display area moves, which improves the viewability of the superimposition area. However, if a superimposition-prohibited area lies at the offset position, the superimposition area is set so as to avoid it, ensuring that the superimposition-prohibited area is not hidden by the superimposed image.
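The avoidance behavior in case (3) can be sketched as follows. The rectangle encoding, the candidate offsets, and the idea of trying offsets in preference order are illustrative assumptions, not taken from the specification.

```python
# Hypothetical sketch of case (3): place the superimposed image at a
# fixed offset from the display-area center, but fall back to an
# alternative offset when the candidate position would intersect a
# superimposition-prohibited region.

def intersects(r, s):
    """Rectangles as (az0, az1, el0, el1); open-interval overlap."""
    return r[0] < s[1] and s[0] < r[1] and r[2] < s[3] and s[2] < r[3]

def place(center, size, offsets, prohibited):
    az_c, el_c = center
    w, h = size
    for d_az, d_el in offsets:          # try offsets in preference order
        cand = (az_c + d_az, az_c + d_az + w,
                el_c + d_el, el_c + d_el + h)
        if not any(intersects(cand, p) for p in prohibited):
            return cand
    return None                          # no admissible position found

prohibited = [(10, 30, 0, 20)]           # one prohibited rectangle
pos = place(center=(0, 0), size=(15, 10),
            offsets=[(10, 5), (-30, 5)], prohibited=prohibited)
assert pos == (-30, -15, 5, 15)          # first offset collides, second fits
```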
[Embodiment 3]
Still another embodiment of the present invention is described below with reference to FIG. 12. The present embodiment describes an example in which the display control device generates the image to be displayed on the display device. FIG. 12 is a block diagram illustrating an example of the main configuration of the display control device 1 and the display device 3 constituting the control system 5 according to an embodiment of the present invention.
The display control device 1 differs from the display control device 1 of the above embodiments in that it includes an image transmission unit 40. The display device 3 differs from the display device 3 of the above embodiments in that it does not include the omnidirectional image rendering unit 32 but instead includes an image acquisition unit 45.
The image transmission unit 40 transmits an image to the display device 3 via the communication unit 19. The transmitted image is the partial image of the omnidirectional image that corresponds to the display area of the display device 3. The display area is identified by the display area specifying unit 12 based on a notification from the display area notification unit 34. When a superimposed image is displayed superimposed on the omnidirectional image on the display unit 17, the partial image is transmitted with the superimposed image already superimposed. The image acquisition unit 45 then acquires the image transmitted by the display control device 1 via the communication unit 38 and causes the display unit 35 to display it. In this way, the display device 3 can display a partial image of the omnidirectional image, or such an image with a superimposed image on it, without storing the omnidirectional image or the superimposed image.
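As an illustration of the cropping an image transmission unit of this kind would perform, the following sketch maps an azimuth/elevation display area onto pixel coordinates of an equirectangular omnidirectional image. The projection and coordinate conventions (azimuth spanning -180° to 180° left to right, elevation 90° at the top, y growing downward) are assumptions for illustration, not from the specification.

```python
# Illustrative sketch: compute the pixel box of the partial image for
# a given display area in an equirectangular omnidirectional image.

def crop_box(width, height, az_range, el_range):
    """Map azimuth [-180, 180) and elevation [-90, 90] (degrees) to an
    image-coordinate box (x0, y0, x1, y1), with y growing downward."""
    az0, az1 = az_range
    el0, el1 = el_range
    x0 = int((az0 + 180) / 360 * width)
    x1 = int((az1 + 180) / 360 * width)
    y0 = int((90 - el1) / 180 * height)   # top edge = higher elevation
    y1 = int((90 - el0) / 180 * height)
    return x0, y0, x1, y1

# 3600x1800 equirectangular image; display area 90 deg wide, 60 deg tall,
# centered on azimuth 0 / elevation 0 (made-up values).
box = crop_box(3600, 1800, (-45, 45), (-30, 30))
assert box == (1350, 600, 2250, 1200)
```

The resulting box could be handed to any image library's crop routine before the partial image is sent to the display device.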
[Embodiment 4]
The functions of the display control device 1 and the display device 3 of the above embodiments can also be realized by a control system including a server, a display control device, and a display device. In the present embodiment, another example of the control system is described with reference to FIG. 13. FIG. 13 is a diagram showing an overview of a control system 50 according to Embodiment 4 of the present invention. The control system 50 includes the display control device 1, the display device 3, and a server 7. The control system 50 may include a plurality of display devices 3.
The display control device 1, the display device 3, and the server 7 are communicably connected via a predetermined communication network (for example, the Internet). Therefore, the user of the display control device 1 (the instructor) or a user of the display device 3 (a trainee) can use the control system 50 even from a location remote from the other users (instructor or trainees), as long as the environment allows connection to the communication network.
The server 7 is a device that stores various kinds of information used in the control system 50 and transmits this information to the display control device 1 and the display device 3. The server 7 includes a communication unit through which it communicates with the other devices (in the present embodiment, the display control device 1 and the display device 3), a control unit that centrally controls each unit of the server 7, and a storage unit that stores various kinds of information.
The control unit of the server 7 includes an omnidirectional image transmission unit that transmits the omnidirectional image 21 in response to requests from other devices. The display control device 1 and the display device 3 can therefore acquire the omnidirectional image 21 from the omnidirectional image transmission unit and display it, so the display control device 1 and the display device 3 of the present embodiment do not need to store the omnidirectional image 21.
Furthermore, the control unit of the server 7 includes a guide information transmission unit that transmits the guide information 22 in response to requests from other devices. The display control device 1 can therefore acquire the guide information 22 from the guide information transmission unit and display area guide information, without needing to store the guide information 22 itself. When displayed area guide information is selected, the display control device 1 notifies the display device 3 of the guide area indicated by the selected area guide information. The display device 3 then acquires the partial image of the notified guide area from the server 7 and displays it.
The control unit of the server 7 also includes a superimposed image transmission unit that transmits the superimposed image 23 in response to requests from other devices. The display control device 1 and the display device 3 can therefore acquire the superimposed image 23 from the superimposed image transmission unit and display it, without needing to store the superimposed image 23 themselves. Similarly, the area management information (see FIG. 9) can also be acquired from the server 7.
[Modifications]
In the above embodiments, the display control device 1 displays a rectangular planar image obtained by mapping the omnidirectional image onto a two-dimensional plane, but the display control device 1 need only display an image that includes at least the display area of the display device 3, and is not limited to this example. For example, when displaying an omnidirectional image captured by two fisheye cameras, the image captured by each fisheye camera may be mapped onto a two-dimensional plane to generate two circular planar images, which are then displayed.
In the above embodiments, the content is an omnidirectional image, but the content need only be such that a partial image of a display area, a designated part of the whole image area, can be displayed; it is not limited to an omnidirectional image. For example, it may be a hemispherical image, or a planar image (such as a panoramic photograph) whose display size does not fit on one screen of the display device 3. Furthermore, even an image whose display size fits on one screen of the display device 3 at normal scale may no longer fit on one screen when enlarged, so the display control device 1 may display the display area information and the like while the image is displayed enlarged. The content is also not limited to content intended for education or instruction.
Furthermore, although the above embodiments show examples in which the content is a moving image, the content may be a still image, or may have a plurality of still images as its components. In that case, the still images displayed on the display control device 1 and the display device 3 are synchronized.
In the above embodiments, the trainee who is the user of the display device 3 designates the display area by directing his or her line of sight in the desired direction, but the method of designating the display area is not particularly limited. For example, the display area may be designated with a controller or the like provided for that purpose.
In the above embodiments, the display control device 1 changes the display area by instructing the display device 3, but the display area of the display device 3 may instead be changed by prompting the user of the display device 3 to change it. The change can be prompted by causing the display device 3 to display a superimposed image or an annotation containing a message urging the user to display the guide area, or a symbol (such as an arrow) or message indicating the direction in which to move the line of sight in order to display the guide area. When the display device 3 has an audio output function, the change of display area can also be prompted by having the display device 3 output such a message as audio. For example, a message such as "Move your line of sight to the right and look at the triangular building" may be displayed as an annotation, or a message that guides the user's line of sight toward a predetermined direction may be displayed, such as "Move your line of sight slightly to the left so that the low cylindrical building is at the center of your field of view". The user may also be prompted to display the guide area by displaying the guide area in a manner different from the rest of the image area. For example, the display brightness of the guide area may be made higher than that of other areas; in this case, the user can easily find the guide area by moving his or her line of sight.
[Software implementation example]
The control blocks of the display control device 1 and the display device 3 (in particular, the units included in the control unit 10 and the control unit 30) may be realized by logic circuits (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
In the latter case, the display control device 1 and the display device 3 each include a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or the CPU); a RAM (Random Access Memory) into which the program is loaded; and so on. The object of one aspect of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium (such as a communication network or a broadcast wave) capable of transmitting it. Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
[Summary]
A display control device (1) according to aspect 1 of the present invention is a display control device that causes a first display device (display unit 17) to display content, and includes: a display area specifying unit (12) that specifies which area of the image area of the content is used as the display area by a second display device (display device 3), the second display device displaying a partial image of a designated display area within the image area of the content; and a display area notification unit (13) that causes the first display device to display display area information indicating the area specified by the display area specifying unit.
According to the above configuration, the display area of the second display device is specified, and display area information indicating that area is displayed on the first display device. The display area of the second display device can thus be made known to the user of the first display device. Note that the first display device need only display an area of the content that includes at least the above display area; it may display the entire image area of the content, or a narrower area.
A display control device according to aspect 2 of the present invention may, in aspect 1, further include a display area control unit (guide unit 14) that, when the display area information is moved on the content displayed on the first display device, moves the image area of the second display device to the position to which the display area information was moved.
According to the above configuration, the user of the first display device can cause a partial image of an image area to be displayed on the second display device by moving the display area information to the image area that he or she wants the user of the second display device to view.
 The display control device according to aspect 3 of the present invention may, in aspect 1 above, further include a guide unit (14) that causes the first display device to display area guide information indicating an image region to be displayed on the second display device.
 According to the above configuration, the area guide information allows the user of the first display device to recognize the image region to be displayed on the second display device. Further, as described above, the display area information allows the user of the first display device to recognize the display area of the second display device. The user of the first display device can therefore recognize whether the image region to be displayed on the second display device matches, or deviates from, the actual display area of the second display device.
 The display control device according to aspect 4 of the present invention may, in aspect 3 above, further include a display area control unit (guide unit 14) that, when the area guide information is selected, causes the second display device to display the image region indicated by that area guide information.
 According to the above configuration, the user of the first display device can cause the second display device to display the partial image of an image region by selecting the area guide information. The user of the second display device can thus be shown the partial image of the image region that should be displayed on the second display device.
 The display control device according to aspect 5 of the present invention may, in aspect 3 or 4 above, further include a detection unit that detects an object having a predetermined appearance from the content, wherein the guide unit causes the first display device to display the area guide information indicating an image region that includes the object detected by the detection unit.
 According to the above configuration, area guide information indicating an image region that includes an object having a predetermined appearance in the content is displayed on the first display device. The user of the first display device can therefore recognize whether the display region of the object matches, or deviates from, the display area of the second display device, that is, whether the user of the second display device is viewing the object.
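 As an illustrative sketch of the detection step in aspect 5, detecting an object with a predetermined appearance can be reduced to finding the bounding box of matching pixels. The trivial equality "detector" and all names below are assumptions for illustration; a real implementation might use template matching or a trained model over the content's frames.

```python
def detect_object_region(pixels, target):
    """Return the bounding box (x, y, w, h) of pixels equal to `target`,
    or None if the object does not appear in the content."""
    coords = [(x, y)
              for y, row in enumerate(pixels)
              for x, value in enumerate(row)
              if value == target]
    if not coords:
        return None
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)

# 5x4 toy "content" where value 1 marks the object of interest.
content = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
guide_region = detect_object_region(content, target=1)
print(guide_region)  # (1, 1, 2, 2)
```

The guide unit would then render area guide information covering `guide_region` on the first display device, alongside the display area information of the second display device.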
 The display control device according to aspect 6 of the present invention may, in any one of aspects 1 to 5 above, further include a superimposition processing unit (16) that superimposes a superimposed image related to the content on the partial image and causes the second display device to display the result, wherein the superimposition region in which the superimposed image is superimposed on the partial image is a region designated in the content displayed by the first display device.
 According to the above configuration, the user of the first display device can, by designating a region in the content displayed by the first display device, cause the superimposed image to be displayed in that region of the partial image displayed on the second display device.
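 The coordinate mapping implied by aspect 6 — from a region designated in full-content coordinates to the coordinate space of the partial image on the second display device — can be sketched as follows. This is a simplification assuming axis-aligned rectangles; the function name and rectangle convention are assumptions, not taken from the specification.

```python
def to_partial_coords(designated, display_area):
    """Translate a region designated in full-content coordinates into the
    coordinate space of the partial image shown on the second display.
    Returns None when the designated region lies outside the display area."""
    dx, dy, dw, dh = designated
    ax, ay, aw, ah = display_area
    # Intersect the designated region with the visible display area.
    x0, y0 = max(dx, ax), max(dy, ay)
    x1, y1 = min(dx + dw, ax + aw), min(dy + dh, ay + ah)
    if x0 >= x1 or y0 >= y1:
        return None
    # Shift into the partial image's local coordinates.
    return (x0 - ax, y0 - ay, x1 - x0, y1 - y0)

display_area = (100, 100, 640, 480)   # region the second display currently shows
designated = (300, 200, 200, 100)     # region picked on the first display
print(to_partial_coords(designated, display_area))      # (200, 100, 200, 100)
print(to_partial_coords((0, 0, 50, 50), display_area))  # None
```

The superimposition processing unit would draw the superimposed image at the returned local rectangle; a None result corresponds to a designated region currently outside the second display device's view.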
 The second display device (display device 3) according to aspect 7 of the present invention is a second display device that can communicate directly or indirectly with a display control device (1) that causes content to be displayed on a first display device (display unit 17), and that displays a partial image of a designated display area within the image area of the content, the second display device including: a display area specifying unit (omnidirectional image rendering unit 32) that specifies which area of the image area is designated as the display area; and a display area notification unit (34) that notifies the display control device of display area information indicating the area specified by the display area specifying unit, thereby causing the first display device to display the display area information.
 According to the above configuration, the display area of the second display device is specified, and display area information indicating the specified area is notified to the display control device so that the display area information is displayed on the first display device. This allows the user of the first display device to recognize the display area of the second display device.
 A method for controlling a display control device (1) according to aspect 8 of the present invention is a method for controlling a display control device that causes content to be displayed on a first display device (display unit 17), the method including: a display area specifying step (S2) of specifying which area of the image area of the content a second display device (display device 3), which displays a partial image of a designated display area within the image area, uses as its display area; and a display area notification step (S5) of causing the first display device to display display area information indicating the area specified in the display area specifying step. This provides the same effects as aspect 1 above.
 The display control device (1) according to each aspect of the present invention may be realized by a computer. In that case, a control program for the display control device that realizes the display control device (1) on a computer by operating the computer as each unit (software element) of the display control device (1), and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention. The second display device (3) according to the above aspect of the present invention may likewise be realized by a computer; in that case, a control program for the second display device that realizes the second display device (3) on a computer by operating the computer as each unit (software element) of the second display device (3), and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
 The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims. Embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.
 (Cross-reference to related applications)
 This application claims the benefit of priority from Japanese Patent Application No. 2017-010316, filed on January 24, 2017, the entire contents of which are incorporated herein by reference.
 1       Display control device
12       Display area specifying unit
13       Display area notification unit
14       Guide unit (display area control unit)
16       Superimposition processing unit
17       Display unit (first display device)
21       Omnidirectional image (content)
23       Superimposed image
 3       Display device (second display device)
32       Omnidirectional image rendering unit (display area specifying unit)
34       Display area notification unit

Claims (9)

  1.  A display control device that causes content to be displayed on a first display device, comprising:
     a display area specifying unit that specifies which area of the image area of the content a second display device, which displays a partial image of a designated display area within the image area, uses as its display area; and
     a display area notification unit that causes the first display device to display display area information indicating the area specified by the display area specifying unit.
  2.  The display control device according to claim 1, further comprising a display area control unit that, when the display area information is moved on the content displayed on the first display device, moves the image area of the second display device to the position to which the display area information was moved.
  3.  The display control device according to claim 1, further comprising a guide unit that causes the first display device to display area guide information indicating an image region to be displayed on the second display device.
  4.  The display control device according to claim 3, further comprising a display area control unit that, when the area guide information is selected, causes the second display device to display the image region indicated by the area guide information.
  5.  The display control device according to claim 3 or 4, further comprising a detection unit that detects an object having a predetermined appearance from the content,
     wherein the guide unit causes the first display device to display the area guide information indicating an image region that includes the object detected by the detection unit.
  6.  The display control device according to any one of claims 1 to 5, further comprising a superimposition processing unit that superimposes a superimposed image related to the content on the partial image and causes the second display device to display the result,
     wherein the superimposition region in which the superimposed image is superimposed on the partial image is a region designated in the content displayed by the first display device.
  7.  A second display device that can communicate directly or indirectly with a display control device that causes content to be displayed on a first display device, and that displays a partial image of a designated display area within the image area of the content, the second display device comprising:
     a display area specifying unit that specifies which area of the image area is designated as the display area; and
     a display area notification unit that notifies the display control device of display area information indicating the area specified by the display area specifying unit, thereby causing the first display device to display the display area information.
  8.  A method for controlling a display control device that causes content to be displayed on a first display device, the method comprising:
     a display area specifying step of specifying which area of the image area of the content a second display device, which displays a partial image of a designated display area within the image area, uses as its display area; and
     a display area notification step of causing the first display device to display display area information indicating the area specified in the display area specifying step.
  9.  A control program for causing a computer to function as the display control device according to claim 1, the control program causing the computer to function as the display area specifying unit and the display area notification unit.
PCT/JP2017/044284 2017-01-24 2017-12-11 Display control device, second display device, method for controlling display control device, and control program WO2018139073A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017010316 2017-01-24
JP2017-010316 2017-01-24

Publications (1)

Publication Number Publication Date
WO2018139073A1 true WO2018139073A1 (en) 2018-08-02

Family

ID=62978209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/044284 WO2018139073A1 (en) 2017-01-24 2017-12-11 Display control device, second display device, method for controlling display control device, and control program

Country Status (1)

Country Link
WO (1) WO2018139073A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014174401A (en) * 2013-03-11 2014-09-22 Seiko Epson Corp Image display system and head-mounted type display device
WO2016002445A1 (en) * 2014-07-03 2016-01-07 ソニー株式会社 Information processing device, information processing method, and program
JP2016038876A (en) * 2014-08-11 2016-03-22 カシオ計算機株式会社 Image input device, image output device, and image input/output system


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020050058A1 (en) * 2018-09-07 2020-03-12 ソニー株式会社 Content distribution system, content distribution method, and program
US11470395B2 (en) 2018-09-07 2022-10-11 Sony Corporation Content distribution system and content distribution method
WO2020054456A1 (en) * 2018-09-14 2020-03-19 ソニー株式会社 Display control device and display control method, and program
JP7647554B2 (en) 2019-07-03 2025-03-18 ソニーグループ株式会社 File generation device, file generation method, playback device, and playback processing method
JP2021087136A (en) * 2019-11-28 2021-06-03 株式会社リコー Communication terminal, shooting system, image processing method, and program
JP7400407B2 (en) 2019-11-28 2023-12-19 株式会社リコー Communication terminal, photographing system, image processing method and program

Similar Documents

Publication Publication Date Title
WO2018101227A1 (en) Display control device, head-mounted display, control method for display control device, and control program
EP3019939B1 (en) Display control apparatus and computer-readable recording medium
EP1404126B1 (en) Video combining apparatus and method
US20160210785A1 (en) Augmented reality system and method for positioning and mapping
WO2018139073A1 (en) Display control device, second display device, method for controlling display control device, and control program
US20160292923A1 (en) System and method for incorporating a physical image stream in a head mounted display
JP2007042073A (en) Video presentation system, video presentation method, program for causing computer to execute video presentation method and storage medium
TW201708883A (en) Electronic system, portable display device and guiding device
JP6422584B2 (en) Information processing device
JP2009267729A (en) Image processing apparatus, image processing method, program, and recording medium
JP2005174021A (en) Method and device for presenting information
KR20210100170A (en) Electronic device and its control method
KR102200115B1 (en) System for providing multi-view 360 angle vr contents
WO2018168823A1 (en) Image processing device and electronic equipment
JP6262283B2 (en) Method, program, and recording medium for providing virtual space
US20240264662A1 (en) Head mounted information processing apparatus and head mounted display system
JP7005160B2 (en) Electronic devices and their control methods
JP6442619B2 (en) Information processing device
CN112053444B (en) Method for superposing virtual objects based on optical communication device and corresponding electronic equipment
WO2018139147A1 (en) Control device, head mounted display, method for controlling control device, and control program
KR101315398B1 (en) Apparatus and method for display 3D AR information
JP2010063076A (en) Image processing apparatus and image processing apparatus program
JP5172794B2 (en) Video communication system, method and program
WO2018168825A1 (en) Image processing device and electronic equipment
JP5647813B2 (en) Video presentation system, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17893900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17893900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP
