WO2019003323A1 - Game program - Google Patents
Game program
- Publication number
- WO2019003323A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- field
- player
- game
- hand
- game program
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5258—Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- the technology disclosed by the present specification relates to a game program for a game system.
- Japanese Patent Application Laid-Open No. 7-200162 (hereinafter referred to as Patent Document 1) discloses a game system including a head-mounted display mounted on the head of a player so as to cover the player's field of view, a stepping detection unit for detecting a stepping action of the player, a stride setting unit for setting the moving direction (forward or backward) and the stride of the player, a controller imitating a gun used in the game, and a computer.
- In this game system, the computer causes the head-mounted display to display a game screen representing the range corresponding to the player's view within the virtual three-dimensional space constituting the game. The computer calculates the position and action of the player from the detection signals of the stepping detection unit, the stride setting unit, and the controller, and changes the display mode of the game screen displayed on the head-mounted display in real time.
- When the stepping detection unit detects the player's stepping motion, the game screen is displayed on the head-mounted display as if the player were walking (or running) in the virtual three-dimensional space in accordance with the setting in the stride setting unit at that time. By stepping in place, the player can perceive walking and moving within the virtual three-dimensional space (i.e., the game space).
- The present specification discloses a technique for realizing a game system that can reduce the sense of discomfort given to the player.
- A game program disclosed herein is for a game system that includes a head-mounted display mounted on the player's head so as to cover the player's field of view, an input reception device capable of detecting the physical movement of the player, and a computer.
- The game program causes the computer to execute display processing for displaying, on the head-mounted display, a game screen representing the range corresponding to the player's field of view within the virtual three-dimensional space constituting the game.
- In the display processing, the virtual three-dimensional space includes a field in which an operation target object is disposed, and when the input reception device detects a field change operation, field change processing is performed that changes the display mode of the field displayed in the game screen in accordance with the change mode indicated by the field change operation, without moving the player's viewpoint in the game screen.
- With this configuration, when the player performs the field change operation, the computer can change the display mode of the field in the game screen displayed on the head-mounted display in accordance with the change mode indicated by the operation, without moving the player's view in the game screen. Since the player's view does not move, the player who has performed the field change operation does not feel that he or she is moving through the field by walking or the like, but rather feels that he or she is moving the field itself.
- Therefore, compared with the conventional technique that makes the player perceive walking and the like on the screen even though the player's body does not actually move from the spot, there is less divergence between the real physical condition of the player and the game screen that the player perceives visually.
- Here, the "player's view" includes the player's virtual line of sight in the game, the player's standing position, the player's viewpoint, the player's field of view, and the like.
- The player's view may turn while remaining at the same position in the screen, for example in accordance with a change in the posture of the real player.
- the input reception device may be a device capable of detecting the movement of the player's hand.
- The field change operation may include a first operation for virtually touching the field and a second operation for virtually moving the field touched by the first operation.
- The field change processing may include changing the display mode of the field in accordance with the movement direction and the movement amount indicated by the second operation.
- With this configuration, by performing the field change operation including the first operation and the second operation, the player can change the display mode of the field while virtually feeling that he or she is "touching" (for example, "gripping") and moving the field.
- It is easy for the player to intuitively perceive the sensation of "touching and moving" the field, so the divergence between the real player's physical condition and the game screen that the player perceives visually becomes even smaller. This configuration can therefore realize a game system that more appropriately suppresses the player's sense of discomfort.
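To make this concrete, the following minimal Python sketch (purely illustrative and not part of the application; names such as GameField, Camera, and apply_field_change are assumptions) shows the essential idea: the field change operation transforms the field object itself, while the camera that represents the player's view is deliberately left unchanged.

```python
from dataclasses import dataclass, field as dc_field


@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def __add__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)


@dataclass
class Camera:
    """The player's viewpoint in the virtual three-dimensional space."""
    position: Vec3 = dc_field(default_factory=Vec3)
    yaw_deg: float = 0.0


@dataclass
class GameField:
    """The field on which the operation target objects are arranged."""
    origin: Vec3 = dc_field(default_factory=Vec3)
    yaw_deg: float = 0.0


def apply_field_change(fld: GameField, cam: Camera,
                       drag: Vec3, rotate_deg: float) -> None:
    """Move/rotate the field by the amount indicated by the hand gesture.

    The camera is intentionally not modified, so the player's view does not
    move; only the displayed position of the field changes.
    """
    fld.origin = fld.origin + drag
    fld.yaw_deg = (fld.yaw_deg + rotate_deg) % 360.0
    # cam is deliberately untouched


if __name__ == "__main__":
    cam, fld = Camera(), GameField()
    # Pulling the far side of the field closer: translate the field, not the camera.
    apply_field_change(fld, cam, drag=Vec3(z=-2.0), rotate_deg=0.0)
    print(fld, cam)
```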
- the game screen may further include a hand object that virtually represents the hand of the player.
- the game program may cause the computer to execute hand object operation processing for moving the hand object in accordance with the movement of the player's hand detected by the input reception device.
- With this configuration, the game screen viewed by the player includes a hand object that moves in accordance with the movement of the player's hand. Therefore, for example, when the player performs the field change operation, if the hand object moves so as to grasp and move the field, the player can perceive the sensation of "touching and moving" the field even more intuitively.
- the virtual three-dimensional space may further include a background space.
- The field may be arranged in the background space, and in the game screen the field may be displayed so as to be overlooked (viewed from above).
- When the virtual three-dimensional space constituting the game includes a background space and a field arranged in the background space, and the field is displayed in the game screen so as to be overlooked, the player more easily gets the sense of "moving the field" as a result of the field change operation, compared with a game in which the player perceives standing in the field (that is, a game in which the player's line of sight is inside the field).
- This configuration therefore more appropriately suppresses the player's sense of discomfort.
- a game system realized by the above game program, a control method for realizing the game system, and a computer readable medium storing the above game program are also novel and useful.
- FIG. 1 shows a block diagram of a game system.
- An example of a game screen is shown.
- A flowchart of the field change process executed by the control unit of the main body is shown.
- An example of the game screen immediately before the field is moved is shown.
- An example of the game screen after the field has been moved is shown.
- the game system 2 shown in FIG. 1 and FIG. 2 is a game system for making a player experience and play a game in a virtual three-dimensional space.
- the game system 2 includes an HMD (abbreviation of Head Mount Display) 10, a right controller 30R, a left controller 30L, a camera 50, and a main body 60.
- the HMD 10 and the main body 60, and the camera 50 and the main body 60 are connected to each other in a wired communication manner.
- the right controller 30R, the left controller 30L, and the main unit 60 are connected to each other in a wirelessly communicable manner.
- When the right controller 30R and the left controller 30L are referred to without distinction, they may be simply referred to as the "controller 30".
- the HMD 10 shown in FIGS. 1 and 2 is an image display device (so-called head-mounted display) used by being attached to the head of a player.
- the HMD 10 includes a frame 11, a marker 12, a display unit 13, and an operation unit 14.
- the frame 11 is an eyeglass frame-like member.
- The player can wear the HMD 10 on the head by putting the frame 11 on from the front of the face, in the same manner as putting on eyeglasses.
- the frame 11 may be a headband-like member, a helmet-like member, or any other frame that can be worn on the head.
- the display unit 13 is a light shielding display member. As shown in FIG. 2, when the player wears the HMD 10 on the head, the display unit 13 is disposed at a position facing the player's eyes. When the player wears the HMD 10, the display unit 13 blocks the view of the player. In the present embodiment, a game screen (see FIG. 4 and the like) represented by screen data supplied from the main body 60 is displayed on the display unit 13.
- the markers 12 are disposed at both ends of the front surface of the display unit 13.
- the marker 12 is a mark of a color (for example, bright color such as fluorescent color) different in lightness from the frame 11 and the display unit 13.
- The control unit 70 of the main body 60 can specify the position and orientation of the head of the player wearing the HMD 10 from the position of the marker 12 in the image captured by the camera 50. That is, the marker 12 is a mark for specifying the position and orientation of the HMD 10 (i.e., the position and orientation of the player's head).
- Although the shape of the marker 12 is substantially rectangular in FIGS. 1 and 2, the shape of the marker 12 is not limited to this and may be any shape. In another example, the marker 12 may be disposed at a position different from the example of FIGS. 1 and 2.
- the operation unit 14 is provided exposed on the surface of the frame 11.
- the operation unit 14 includes a plurality of buttons, a dial, and the like.
- the player can operate the operation unit 14 to input various operations to the HMD 10.
- the operation unit 14 may be configured as a controller independent of the frame 11 (that is, provided at a distance from the frame 11). In that case, the controller may be able to communicate with the HMD 10 by wired communication or wireless communication.
- the HMD 10 further includes a wired interface 16, a control unit 20, and a memory 22.
- Although the three-axis acceleration sensor 24 and the position detection device 26 are illustrated by broken lines in FIG. 3, the HMD 10 of this embodiment does not include the three-axis acceleration sensor 24 or the position detection device 26.
- the three-axis acceleration sensor 24 and the position detection device 26 will be mentioned in the second embodiment described later.
- Hereinafter, "interface" is abbreviated as "I/F".
- the wired I / F 16 is an I / F for performing wired communication with the main body 60.
- a communication cable for performing wired communication with the main body 60 is connected to the wired I / F 16.
- the control unit 20 executes various processes in accordance with the program stored in the memory 22.
- the control unit 20 executes a process of causing the display unit 13 to display a game screen (see FIG. 4) or the like represented by screen data supplied from the main body 60.
- the memory 22 is configured by a ROM, a RAM, and the like.
- the memory 22 stores programs for the control unit 20 to execute various processes.
- the right controller 30R shown in FIG. 1 and FIG. 2 is a controller for the player to input an operation related to the game.
- the right controller 30R includes a gripping unit 31R, a marker 32R, and operation units 34R and 35R.
- the gripping portion 31R is a main body of the right controller 30R and a portion for the player to grip with the right hand (see FIG. 2).
- the marker 32R is a substantially spherical member formed integrally with the gripping portion 31R at the upper end portion of the gripping portion 31R.
- the marker 32R is a mark of a color (for example, bright color such as fluorescent color) different in lightness from the grip portion 31R.
- the color of the marker 32R may be the same as the color of the marker 12 described above.
- the control unit 70 of the main body 60 specifies the position and operation of the right hand of the player gripping the right controller 30R according to the position of the marker 32R in the captured image captured by the camera 50.
- the marker 32R is a mark for specifying the position and operation of the player's right hand.
- the shape of the marker 32R is substantially spherical, but in another example, the shape of the marker 32R is not limited to this and may be any shape. Further, in another example, the marker 32R may be disposed at a position different from the example of FIGS. 1 and 2.
- the operating portions 34R, 35R are provided exposed on the surface of the gripping portion 31R.
- the operation units 34R, 35R include a plurality of buttons, levers, dials, and the like.
- the operating unit 34R is disposed at a position where it can be operated by the thumb of the right hand when the player grips the gripping unit 31R with the right hand.
- the operation unit 35R is disposed at a position where it can be operated by a finger other than the thumb of the right hand (for example, a forefinger) when the player holds the grip 31R with the right hand.
- the player can operate the operation units 34R and 35R to input various operations (in particular, operations related to a game, etc.) to the right controller 30R.
- the right controller 30R further includes a wireless I / F 36R, a control unit 40R, and a memory 42R.
- Although the three-axis acceleration sensor 44R and the position detection device 46R are illustrated by broken lines in FIG. 3, the right controller 30R of this embodiment does not include the three-axis acceleration sensor 44R or the position detection device 46R.
- the three-axis acceleration sensor 44R and the position detection device 46R will be mentioned in a second embodiment described later.
- the wireless I / F 36R is an I / F for performing wireless communication with the main unit 60.
- The control unit 40R executes various processes in accordance with the program stored in the memory 42R. In the present embodiment, during the game, the control unit 40R executes processing for transmitting, to the main body 60 by wireless communication, instruction signals indicating the various operations input to the operation units 34R and 35R.
- the memory 42R is configured of a ROM, a RAM, and the like, and stores programs for the control unit 40R to execute various processes.
- The left controller 30L has the same configuration as the right controller 30R, except that the left controller 30L is a mirror image of (laterally symmetrical to) the right controller 30R.
- the left controller 30L also includes a gripping unit 31L, a marker 32L, operation units 34L and 35L, a wireless I / F 36L, a control unit 40L, and a memory 42L.
- The camera 50 shown in FIGS. 1 to 3 is a camera for photographing the player wearing the HMD 10 and holding the controllers 30 (i.e., the player playing the game).
- the camera 50 is disposed at a position where it is possible to shoot the entire body of the player.
- the camera 50 is connected so as to be able to execute wired communication with the main body 60 via a communication cable. In the present embodiment, during the game, the camera 50 continuously supplies the captured image to the main body 60 by wire communication.
- the main body 60 shown in FIGS. 1 to 3 is an apparatus for generating screen data showing a game screen (see FIG. 4) according to a game program.
- the main body 60 includes a wired I / F 62, a wireless I / F 64, a control unit 70, a memory 72, and a recording medium reading unit 74.
- the wired I / F 62 is an I / F for performing wired communication with the HMD 10 and the camera 50.
- a communication cable for performing wired communication with the HMD 10 and a communication cable for performing wired communication with the camera 50 are connected to the wired I / F 62.
- the wireless I / F 64 is an I / F for performing wireless communication with the controller 30.
- The control unit 70 executes various processes, such as the process of generating screen data representing the game screen and the field change process of FIG. 5, in accordance with the OS program 73 stored in the memory 72 and the game program 90 read by the recording medium reading unit 74.
- The memory 72 is configured by a ROM, a RAM, and the like.
- the memory 72 stores the OS program 73 in advance.
- the recording medium reading unit 74 is a DVD drive.
- the recording medium reading unit 74 can read the DVD ROM 80 in which the game program 90 is recorded.
- the recording medium reading unit 74 may be a reading unit capable of reading another recording medium, such as a CD ROM or a flash memory.
- The control unit 70 of the main body 60 starts the game based on the game program 90 recorded in the DVD ROM 80 read by the recording medium reading unit 74. Specifically, the control unit 70 generates screen data representing the game screen (see FIG. 4 and the like) according to the game program 90, supplies the screen data to the HMD 10 by wired communication, and causes the display unit 13 of the HMD 10 to display the game screen.
- Hereinafter, this process may be referred to as the "display process".
- During the game, the control unit 70 acquires, from the controllers 30, operation signals indicating the various operations (for example, operations related to the game) that the player inputs to the operation units 34 and 35 of the controllers 30. Furthermore, the control unit 70 continuously acquires, from the camera 50, the images captured by the camera 50. Based on the positions, orientations, and movements of the markers 12, 32R, and 32L included in the captured images, the control unit 70 identifies the behavior of the real player (head orientation, head position, positions of both hands, movements of both hands, and the like). The control unit 70 then identifies the motion of the player in the game based on the operation signals acquired from the controllers 30 and the identified behavior of the real player, and changes the display mode of the game screen displayed on the display unit 13 (that is, changes the screen data) in accordance with the identified motion.
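As a rough illustration of this per-frame display process (an assumption-laden sketch, not the patented implementation; the marker names and the dictionary-based screen data are invented for the example), the behavior of the real player can be inferred from the marker positions found in the captured image and turned into updated screen data:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Pixel = Tuple[float, float]


@dataclass
class PlayerBehavior:
    head_pos: Pixel
    right_hand: Pixel
    left_hand: Pixel


def infer_behavior(markers: Dict[str, Pixel]) -> PlayerBehavior:
    # Assume the camera pipeline supplies 2-D marker centroids keyed by name.
    return PlayerBehavior(
        head_pos=markers["hmd"],
        right_hand=markers["controller_R"],
        left_hand=markers["controller_L"],
    )


def display_process(markers: Dict[str, Pixel], buttons: Dict[str, bool]) -> dict:
    """One iteration of the display process: player behavior in, screen data out."""
    behavior = infer_behavior(markers)
    screen_data = {
        "hand_object_R": behavior.right_hand,    # hand objects follow the hands
        "hand_object_L": behavior.left_hand,
        "selecting": buttons.get("34R", False),  # e.g. a selection button pressed
    }
    return screen_data


if __name__ == "__main__":
    frame = {"hmd": (320.0, 60.0),
             "controller_R": (400.0, 300.0),
             "controller_L": (240.0, 310.0)}
    print(display_process(frame, {"34R": True}))
```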
- the game screen 100 shown in FIG. 4 is a screen constituting a game realized by the game program 90.
- the game screen 100 is a screen representing a range corresponding to the view of the player in the game in the virtual three-dimensional space constituting the game.
- a virtual three-dimensional space constituting the game of the present embodiment includes a background space 110 and a field 120.
- the background space 110 is a space that constitutes a background in which the field 120 is disposed. In the present embodiment, the background space 110 does not change according to the progress of the game.
- The field 120 is a place where the operation target objects (for example, the own characters P1 to P3 and the enemy characters E1 to E5) are arranged, and constitutes the scene related to the progress of the game.
- The game shown in the example of FIG. 4 is an action-type simulation game in which the player's army, which includes the own characters P1 to P3, fights against an enemy army, which includes the enemy characters E1 to E5.
- The field 120 is the place (battlefield) where the two armies fight, and various terrains such as a pond, a forest, and a mountain are arranged in it.
- The field 120 is disposed in the background space 110. In the game screen 100 of FIG. 4, the field 120 is displayed so as to be overlooked (viewed from above).
- the game screen 100 also includes hand objects 130R and 130L.
- the hand objects 130R and 130L are objects that virtually represent the player's right hand and left hand.
- The hand objects 130R and 130L are moved within the game screen 100 (i.e., their display is changed) in accordance with the movements of the player's right and left hands gripping the right controller 30R and the left controller 30L, and with the various operations input to the operation units 34R, 35R, 34L, and 35L. For example, when the player inputs, to the operation unit 34R of the right controller 30R, a selection operation for selecting a character in the field 120, the hand object 130R moves so as to point to the character to be selected. The character pointed to by the hand object 130R enters the selected state and becomes operable. By displaying the hand objects 130R and 130L in the game screen 100 in this way, the player can intuitively feel that he or she is participating in the game using his or her own hands.
- The control unit 70 repeatedly executes the above-described display process. While the game is being played, the control unit 70 also executes the field change process shown in FIG. 5 in parallel with the display process.
- The field change process is a process for changing the display mode of the field 120 in the game screen 100 in accordance with a field change operation performed by the player, without moving the player's view (the height and position of the player's virtual line of sight in the game, the player's standing position, the player's viewpoint, the player's field of view, and the like).
- the control unit 70 starts the process of FIG. 5 according to the game program 90.
- In S10, the control unit 70 monitors whether a field gripping operation is input.
- the field holding operation is an operation for the player to virtually hold (grip) the field 120 present in the background space 110.
- The player can input a predetermined field gripping operation to the operation units 34 of the controllers 30 (the operation unit 34R of the right controller 30R and the operation unit 34L of the left controller 30L; the same applies hereinafter) and the operation units 35 (the operation unit 35R of the right controller 30R and the operation unit 35L of the left controller 30L; the same applies hereinafter).
- In the present embodiment, the field gripping operation is an operation in which the player simultaneously presses the buttons of the operation units 34R and 35R with the thumb and forefinger of the right hand and simultaneously presses the buttons of the operation units 34L and 35L with the thumb and forefinger of the left hand. That is, the field gripping operation of the present embodiment corresponds to the player actually gripping the controllers 30 strongly.
- In the present embodiment, the field gripping operation remains input while the controllers 30 are gripped strongly (that is, while the buttons of the operation units 34 and 35 are pressed simultaneously), and the field gripping operation is released when the player relaxes the grip on the controllers 30 (that is, when the grip releasing operation described later is input).
- the field gripping operation may be, for example, a predetermined gesture by the player's right and left hands gripping the controller 30.
- Alternatively, the field gripping operation may be input by continuously pressing one of the operation units 34 and 35 for a predetermined period or longer (a so-called long press). In that case, the field gripping operation may be released by releasing the long press.
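A small sketch of how such a field gripping operation could be recognized in code follows (the button names and the long-press threshold are assumptions for illustration): the grip is considered input while both the thumb-side and forefinger-side buttons of both controllers are pressed, and the long-press variant is shown for comparison.

```python
from dataclasses import dataclass


@dataclass
class ControllerButtons:
    btn_34_pressed: bool = False   # thumb-side button (operation unit 34)
    btn_35_pressed: bool = False   # forefinger-side trigger (operation unit 35)


def is_gripping(right: ControllerButtons, left: ControllerButtons) -> bool:
    """Field gripping operation: both buttons pressed on both controllers."""
    return (right.btn_34_pressed and right.btn_35_pressed
            and left.btn_34_pressed and left.btn_35_pressed)


def is_long_press(pressed_duration_s: float, threshold_s: float = 0.8) -> bool:
    """Alternative grip input: a single button held longer than a threshold."""
    return pressed_duration_s >= threshold_s


if __name__ == "__main__":
    r = ControllerButtons(True, True)
    l = ControllerButtons(True, True)
    print(is_gripping(r, l), is_long_press(1.2))
```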
- When the field gripping operation is input, the control unit 70 determines YES in S10 and proceeds to S12.
- At this time, the control unit 70 changes the mode of the hand objects 130R and 130L in the game screen 100 to a mode of gripping the field 120 (see FIG. 6). This allows the player to intuitively recognize that he or she has virtually gripped the field 120 by inputting the field gripping operation.
- Next, in S12, the control unit 70 determines whether or not a field moving operation has been input.
- The field moving operation is an operation by which the player virtually moves the field 120 that is virtually held (gripped) by the field gripping operation.
- After performing the field gripping operation described above (that is, while pressing the predetermined buttons), the player can input a field moving operation by moving the right hand and the left hand holding the controllers 30 in a desired manner.
- For example, when the player wants to pull the field 120 toward himself or herself (for example, to see the far side of the field 120 in detail), the player first stretches the right hand and the left hand holding the controllers 30 forward. As a result, the hand objects 130R and 130L displayed in the game screen 100 also move forward (toward the far side of the field) in the virtual space, and when the field gripping operation is performed there, the hand objects 130R and 130L are displayed as grasping the field.
- The player then pulls the right hand and the left hand holding the controllers 30 from the front toward the body. In this way, the player can input a field moving operation that pulls the far side of the field 120 toward the near side (see FIG. 6).
- The operation may also be performed with one hand.
- Similarly, when the player wants to push the field 120 away (for example, to look at the near side of the field 120), the player pushes the right hand and the left hand forward, thereby inputting a field moving operation that sends the near side of the field 120 toward the far side.
- When the player wants to move the field 120 farther away (for example, to display the field 120 over a wide area), the player can input a field moving operation that moves the field 120 away by pushing the right hand and the left hand downward.
- When the player wants to rotate the field 120 (for example, to see the field 120 from the opposite side), the player can input a field moving operation that rotates the field 120 by rotating the right hand and the left hand in the same rotational direction (clockwise or counterclockwise).
- In this way, the player can input a field moving operation that moves the field 120 in a desired manner by making the corresponding gesture.
- Depending on the size of the gesture, the hand objects 130R and 130L may temporarily move out of the game screen 100 (that is, out of the frame).
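The following hedged sketch turns the field moving operation into numbers: the displacement of the two hands since the grip started becomes a translation of the field, and the change in the angle of the line joining the hands becomes a rotation. The axis conventions and the averaging are illustrative assumptions, not the patent's method.

```python
import math
from dataclasses import dataclass


@dataclass
class Hand:
    x: float   # left/right
    y: float   # up/down
    z: float   # toward the far side of the field


def field_motion(grip_r: Hand, grip_l: Hand, now_r: Hand, now_l: Hand):
    """Return (dx, dy, dz, dyaw_deg) to apply to the field, not to the camera."""
    # Average translation of both hands since the grip started.
    dx = ((now_r.x - grip_r.x) + (now_l.x - grip_l.x)) / 2.0
    dy = ((now_r.y - grip_r.y) + (now_l.y - grip_l.y)) / 2.0
    dz = ((now_r.z - grip_r.z) + (now_l.z - grip_l.z)) / 2.0
    # Rotation: change of the angle of the line joining the two hands (top view).
    a0 = math.atan2(grip_r.z - grip_l.z, grip_r.x - grip_l.x)
    a1 = math.atan2(now_r.z - now_l.z, now_r.x - now_l.x)
    dyaw = math.degrees(a1 - a0)
    return dx, dy, dz, dyaw


if __name__ == "__main__":
    # Pulling both hands toward the body (decreasing z) drags the far side closer.
    print(field_motion(Hand(-0.2, 1.0, 0.6), Hand(0.2, 1.0, 0.6),
                       Hand(-0.2, 1.0, 0.2), Hand(0.2, 1.0, 0.2)))
```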
- When the field moving operation has been input, the control unit 70 determines YES in S12 and proceeds to S14. On the other hand, if the field moving operation has not been input, the control unit 70 determines NO in S12, skips S14, and proceeds to S16.
- In S14, the control unit 70 changes the display mode of the field 120 in accordance with the movement direction and the movement amount indicated by the field moving operation input in S12 (see FIG. 7).
- In S16, the control unit 70 determines whether or not a grip releasing operation for releasing the virtual grip on the field (YES in S10) has been input.
- the player can input a predetermined grip release operation to the operation unit 34 of the controller 30.
- In the present embodiment, the grip releasing operation is the operation of releasing the buttons that were pressed to input the field gripping operation.
- the grip release operation may be, for example, a predetermined gesture by the player's right and left hands gripping the controller 30.
- the grip releasing operation may be, for example, releasing the long press of any of the operation units 34 and 35.
- When the grip releasing operation is input, the field 120 is fixed at its position at the time of the grip releasing operation. In this case, the control unit 70 determines YES in S16, returns to S10, and again monitors whether a field gripping operation is input. On the other hand, when the grip releasing operation has not been input, the control unit 70 determines NO in S16 and returns to S12, where it again determines whether a field moving operation has been input.
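Summarizing S10 to S16, the field change process can be pictured as the following small state loop (a sketch under assumed event names, not the actual implementation): wait for a grip, apply each movement to the field while gripped, and stop when the grip is released.

```python
from enum import Enum, auto


class Ev(Enum):
    GRIP = auto()       # field gripping operation  (S10: YES)
    MOVE = auto()       # field moving operation    (S12: YES)
    RELEASE = auto()    # grip releasing operation  (S16: YES)


def field_change_process(events, apply_move, show_grip_hands):
    """Process a stream of (event, payload) pairs in the S10-S16 loop."""
    gripping = False
    for ev, payload in events:
        if not gripping:
            if ev is Ev.GRIP:            # S10 -> S12
                gripping = True
                show_grip_hands()        # hand objects switch to the gripping pose
        else:
            if ev is Ev.MOVE:            # S12 -> S14: change the field display
                apply_move(payload)
            elif ev is Ev.RELEASE:       # S16: fix the field, back to S10
                gripping = False


if __name__ == "__main__":
    log = []
    field_change_process(
        [(Ev.GRIP, None), (Ev.MOVE, (0, 0, -2)), (Ev.RELEASE, None)],
        apply_move=lambda d: log.append(("moved", d)),
        show_grip_hands=lambda: log.append("gripping"),
    )
    print(log)
```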
- FIG. 4 shows an example of the game screen 100 before the field gripping operation is input (NO in S10 of FIG. 5).
- On the near side, the own character P1 and the enemy character E1 are in battle.
- The player is operating the own character P1, which has been selected.
- Enemy characters E2 and E3 are waiting near the center of the field 120.
- The own characters P2 and P3 and the enemy characters E4 and E5 are present on the far side, still some distance apart from each other.
- FIG. 6 shows an example of the game screen 100 displayed at a point after FIG. 4.
- In FIG. 6, the own character P1, which was fighting the enemy character E1 on the near side in FIG. 4, has defeated the enemy character E1.
- Meanwhile, the own characters P2 and P3 and the enemy characters E4 and E5, which were present on the far side, have closed the distance between them, and a battle appears to be about to start.
- In addition, the enemy characters E2 and E3, which were waiting near the center of the field 120, are moving toward the far side to join the enemy characters E4 and E5.
- Seeing this situation, the player wants to view the far side of the field 120 in a larger size (that is, the player wants to pull the far side of the field 120 toward the near side in order to view it in detail).
- The player therefore moves the right hand and the left hand toward the far side and inputs the field gripping operation at that position (YES in S10).
- As a result, the hand objects 130R and 130L change to the mode of virtually gripping the field 120.
- Next, the player pulls the right hand and the left hand toward the near side, thereby inputting a field moving operation that pulls the far side of the field 120 toward the front (YES in S12).
- After that, the player performs the grip releasing operation (YES in S16).
- FIG. 7 shows an example of the game screen 100 after the field gripping operation and the field moving operation have been performed.
- As a result of the field gripping operation and the field moving operation described above, the far side of the field 120 has been pulled toward the near side and is displayed in a larger size.
- The player still overlooks the field 120 from the same viewpoint in the game, and there is no significant change in the background space 110.
- In FIG. 7, the own characters P2 and P3 and the enemy characters E4 and E5, which were present on the far side in FIG. 6, are now displayed on a larger scale. A part of the enemy characters E2 and E3, which were moving to join the enemy characters E4 and E5, is also displayed.
- In FIG. 7, the display mode of the field 120 has changed from that of the game screens 100 of FIGS. 4 and 6, but when this display mode changes, the player's view (in other words, the player's line of sight, standing position, and the like) in the game screen 100 does not move. Since the player's view does not change, the player does not feel that he or she is moving through the field 120 by walking or the like, but can change the display mode of the field 120 with the sense of having grabbed the field and pulled it closer.
- As described above, in the game system 2 of the present embodiment, when the player performs the field gripping operation and the field moving operation, the control unit 70 can change the display mode of the field 120 in the game screen 100 in accordance with the change mode indicated by the field moving operation, without moving the player's viewpoint in the game screen 100 (see FIGS. 4, 6, and 7). Since the player's view does not change, the player who has performed the field gripping operation and the field moving operation does not feel that he or she is walking through the field 120, but rather can change the display mode of the field 120 with the sense of moving the field 120 itself.
- Therefore, compared with the conventional technique that causes the player to perceive walking and the like on the screen even though the player's body does not actually move from the spot, there is less divergence between the real physical condition of the player and the game screen 100 that the player perceives visually, and the player's sense of discomfort can be suppressed.
- Furthermore, in the present embodiment, to change the display mode of the field 120, the player performs two successive operations: the field gripping operation and the field moving operation.
- The player can therefore change the display mode of the field 120 while virtually feeling that he or she is "holding" (touching) the field 120. Since it is easy for the player to intuitively perceive the sensation of "touching and moving" the field 120, the divergence between the real player's physical condition and the game screen 100 that the player perceives visually becomes even smaller. According to the present embodiment, it is therefore possible to realize the game system 2 that more appropriately suppresses the player's sense of discomfort.
- Furthermore, in the present embodiment, the game screen 100 includes the hand objects 130R and 130L that virtually represent the player's hands.
- The hand objects 130R and 130L move in accordance with the movement of the player's hands (that is, the movement of the markers 32 of the controllers 30 captured by the camera 50). Therefore, when the player performs the field gripping operation and the field moving operation to change the display mode of the field 120, the hand objects 130R and 130L move so as to grip and move the field 120, and the player can perceive the sensation of "touching and moving" the field 120 even more intuitively.
- the virtual three-dimensional space constituting the game includes a background space 110 and a field 120.
- The field 120 is arranged in the background space 110 so as to be overlooked in the game screen 100.
- Since the virtual three-dimensional space constituting the game includes the background space 110 and the field 120 arranged in the background space 110, and the field 120 is displayed in the game screen 100 so as to be overlooked, the player can play the game as if looking down into the field 120 from within the background space 110.
- the HMD 10 is an example of the “head-mounted display”.
- the combination of the controller 30, the camera 50, and the control unit 70 is an example of the “input reception device”.
- the control unit 70 is an example of the “computer”.
- the own characters P1 to P3 and the enemy characters E1 to E5 are examples of the “operation target object”.
- the field gripping operation and the field moving operation are examples of the "field change operation”.
- The field gripping operation is an example of the "first operation", and the field moving operation is an example of the "second operation".
- the game system 2 of the second embodiment will be described focusing on differences from the first embodiment.
- the game system 2 does not have the camera 50.
- the HMD 10 does not have the marker 12.
- the controller 30 also does not have the marker 32.
- the HMD 10 is provided with the three-axis acceleration sensor 24 and the position detection device 26 as illustrated by a broken line in FIG. 3.
- the right controller 30R and the left controller 30L respectively include three-axis acceleration sensors 44R and 44L and position detectors 46R and 46L.
- the three-axis acceleration sensor 24 of the HMD 10 detects accelerations in three axes of X, Y, and Z.
- the control unit 20 of the HMD 10 can specify the posture and the movement state of the HMD 10 using the detection values of the three-axis acceleration sensor 24.
- the position detection device 26 of the HMD 10 is a GPS (abbreviation of Global Positioning System) receiver.
- the position detection device 26 receives radio waves from GPS satellites and measures the current position of the HMD 10.
- The control unit 20 supplies, to the main body 60, positional relationship information indicating the posture and motion state of the HMD 10 specified using the three-axis acceleration sensor 24 and the current position of the HMD 10 measured by the position detection device 26.
- the three-axis acceleration sensor 44R and the position detection device 46R of the right controller 30R are also similar to the above.
- the control unit 40R supplies, to the main body 60, positional relationship information indicating the posture and motion state of the right controller 30R identified by the three-axis acceleration sensor 44R and the current position of the right controller 30R measured by the position detection device 46R.
- the three-axis acceleration sensor 44L and the position detection device 46L of the left controller 30L are also similar to the above.
- The control unit 40L likewise supplies, to the main body 60, positional relationship information indicating the posture and motion state of the left controller 30L specified using the three-axis acceleration sensor 44L and the current position of the left controller 30L measured by the position detection device 46L.
- The control unit 70 of the main body 60 identifies the behavior of the real player (head orientation, head position, positions of both hands, movements of both hands, and the like) based on the positional relationship information acquired from the HMD 10, from the right controller 30R, and from the left controller 30L. The control unit 70 then identifies the motion of the player in the game based on the operation signals acquired from the controllers 30 and the identified behavior of the real player, and changes the display mode of the game screen displayed on the display unit 13 (that is, changes the screen data) in accordance with the identified motion.
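As a toy illustration of how such positional relationship information could be assembled on the main body side (the gravity-based tilt estimate and the simple data container are assumptions, not the patent's algorithm), orientation can be estimated from the acceleration sensor's gravity component and combined with the position reported by the position detection device:

```python
import math
from dataclasses import dataclass


@dataclass
class PositionalInfo:
    pitch_deg: float
    roll_deg: float
    position: tuple   # e.g. (latitude, longitude) from the position detection device


def tilt_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Estimate pitch/roll assuming the sensor mostly measures gravity."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll


def build_positional_info(accel: tuple, position_fix: tuple) -> PositionalInfo:
    pitch, roll = tilt_from_accel(*accel)
    return PositionalInfo(pitch, roll, position_fix)


if __name__ == "__main__":
    # A device lying flat and stationary reports roughly (0, 0, 9.8) m/s^2.
    print(build_positional_info((0.0, 0.0, 9.8), (35.68, 139.76)))
```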
- the method for identifying the actual player's behavior is different from that of the first embodiment.
- the processing of the control unit 70 other than this is common to that of the first embodiment, and thus detailed description will be omitted.
- the combination of the HMD 10 and the controller 30 of the present embodiment is an example of the “input reception device”.
- the main body 60 may be capable of communicating with an external server, another main body, or the like via a communication network (for example, the Internet).
- the game realized by the game program 90 may be a so-called online communication game in which one game is played with another player in one virtual three-dimensional space via a communication network.
- the game program 90 is stored in advance in the DVD ROM 80 and read by the recording medium reading unit 74 of the main body 60.
- the game program may be downloaded to the main unit 60 from an external server or the like via the communication network. In that case, the downloaded game program may be stored in the memory 72.
- The method for identifying the behavior of the real player is not limited to the methods described in the first and second embodiments (the method of photographing the markers 12 and 32 with the camera 50 (first embodiment), and the method using the three-axis acceleration sensors 24, 44R, and 44L and the position detection devices 26, 46R, and 46L (second embodiment)).
- For example, the behavior of the real player may be identified by irradiating the real player with infrared light, photographing the player with an infrared camera, and analyzing the pattern of the captured infrared light.
- Alternatively, the HMD 10 or the controllers 30 may be provided with a light receiving unit for infrared light, and the player's behavior may be identified by irradiating the HMD 10 or the controllers 30 with infrared light and identifying, for example, the direction from which the light receiving unit receives the infrared light. Communication by light other than infrared light, or communication by sound, may also be used to identify the player's behavior.
- the main unit 60 and the HMD 10 may be able to execute wireless communication.
- the main unit 60 and the camera 50 may be capable of wireless communication.
- the main unit 60 and the controller 30 may be capable of executing wired communication.
- In the above embodiments, the player performs the field gripping operation and the field moving operation in order to change the display mode of the field 120.
- However, the operation by which the player virtually contacts the field 120 in order to change its display mode is not limited to the field gripping operation described above and may be any operation.
- For example, to change the display mode of the field, the player may perform a field touch operation that virtually touches the field and a swipe operation (and/or a slide operation) that moves the touched field.
- In the above embodiments, the virtual three-dimensional space constituting the game includes the background space 110 and the field 120, and the field 120 is arranged so as to be overlooked within the background space 110.
- the aspect of the game realized by the game program is not limited to this.
- For example, the virtual three-dimensional space in a game realized by the game program may include only the field, without a background space. That is, the game may be one in which the player perceives standing in the field (that is, the player's line of sight is inside the field). Even in such a case, it suffices that the control unit 70 can execute the field change process of FIG. 5.
Abstract
A game program for a game system, the game system comprising a head-mounted display mounted on a player's head so as to cover the player's field of view, an input reception device capable of detecting a physical movement of the player, and a computer. The game program causes the computer to execute: display processing for displaying, on the head-mounted display, a game screen image representing a range corresponding to the player's field of view within a virtual three-dimensional space constituting a game, the virtual three-dimensional space in the display processing including a field in which an operation target object is disposed; and field change processing for changing a display form of the field displayed in the game screen image in accordance with a change form indicated by a field change action, without moving the player's viewpoint in the game screen image, when the input reception device detects the field change action.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018500812A JP6352574B1 (ja) | 2017-06-27 | 2017-06-27 | ゲームプログラム |
PCT/JP2017/023648 WO2019003323A1 (fr) | 2017-06-27 | 2017-06-27 | Programme de jeu |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/023648 WO2019003323A1 (fr) | 2017-06-27 | 2017-06-27 | Programme de jeu |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019003323A1 true WO2019003323A1 (fr) | 2019-01-03 |
Family
ID=62779931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/023648 WO2019003323A1 (fr) | 2017-06-27 | 2017-06-27 | Programme de jeu |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6352574B1 (fr) |
WO (1) | WO2019003323A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002351309A (ja) * | 2001-05-30 | 2002-12-06 | Cad Center:Kk | 都市vr地図連動情報表示装置 |
US20160361658A1 (en) * | 2015-06-14 | 2016-12-15 | Sony Interactive Entertainment Inc. | Expanded field of view re-rendering for vr spectating |
JP2016224810A (ja) * | 2015-06-02 | 2016-12-28 | キヤノン株式会社 | システム、システムの制御方法 |
JP2017094120A (ja) * | 2016-12-26 | 2017-06-01 | グリー株式会社 | プログラム、ゲームの制御方法、及び情報処理装置 |
JP2017516186A (ja) * | 2014-03-14 | 2017-06-15 | 株式会社ソニー・インタラクティブエンタテインメント | 空間感知を備えるゲーム機 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6643776B2 (ja) * | 2015-06-11 | 2020-02-12 | 株式会社バンダイナムコエンターテインメント | 端末装置及びプログラム |
JP5980393B1 (ja) * | 2015-09-01 | 2016-08-31 | ガンホー・オンライン・エンターテイメント株式会社 | 端末装置 |
- 2017-06-27 WO PCT/JP2017/023648 patent/WO2019003323A1/fr active Application Filing
- 2017-06-27 JP JP2018500812A patent/JP6352574B1/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2019003323A1 (ja) | 2019-06-27 |
JP6352574B1 (ja) | 2018-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12072505B2 (en) | Display control apparatus, display control method, and recording medium | |
US11504609B2 (en) | Head mounted display | |
EP3469466B1 (fr) | Objet à interface directionnelle | |
EP3263194B1 (fr) | Programme de commande d'affichage, dispositif de commande d'affichage, et procédé de commande d'affichage | |
JP2016158795A (ja) | 表示制御プログラム、表示制御装置、及び表示制御方法 | |
JP7138138B2 (ja) | プログラム、情報処理方法及び情報処理装置 | |
JP6479933B1 (ja) | プログラム、情報処理装置、および方法 | |
JP6352574B1 (ja) | ゲームプログラム | |
JP7192151B2 (ja) | プログラム、情報処理装置、及び情報処理方法 | |
JP2022191240A (ja) | プログラム、方法および情報処理装置 | |
JP2019005564A (ja) | 仮想現実プログラム | |
JP2022153478A (ja) | アニメーション制作システム | |
US11660536B2 (en) | Display control program, display control apparatus and display control method | |
JP7218873B2 (ja) | アニメーション制作システム | |
WO2023281819A1 (fr) | Dispositif de traitement d'informations pour déterminer la rétention d'un objet | |
JP2018147497A (ja) | 仮想現実を提供するための方法、当該方法をコンピュータに実行させるためのプログラムおよび、情報処理装置 | |
WO2023286191A1 (fr) | Appareil de traitement d'informations et procédé de génération de données de commande | |
JP2018147469A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
JP2018088019A (ja) | 仮想現実を提供するための方法、当該方法をコンピュータに実行させるためのプログラムおよび、情報処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2018500812; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17916396; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17916396; Country of ref document: EP; Kind code of ref document: A1 |