
WO2016167664A2 - Game control - Google Patents

Game control

Info

Publication number
WO2016167664A2
WO2016167664A2 (PCT/NL2016/050272)
Authority
WO
WIPO (PCT)
Prior art keywords
touchscreen
trajectory
virtual environment
display screen
previous
Prior art date
Application number
PCT/NL2016/050272
Other languages
English (en)
Other versions
WO2016167664A3 (fr)
Inventor
Mark Thomas Gertruda BEUMERS
Jacobus Josephus Adrianus GROEN IN 'T WOUT
Original Assignee
Lagotronics Projects B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from NL2014665A external-priority patent/NL2014665B1/en
Priority claimed from NL2014664A external-priority patent/NL2014664B1/en
Application filed by Lagotronics Projects B.V.
Publication of WO2016167664A2
Publication of WO2016167664A3

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes

Definitions

  • the present invention generally relates to a system for interaction of at least one user or multiple simultaneous users with a virtual environment. In particular the present invention relates to a game controller, and more in particular to a pointing device to be used to shoot virtual projectiles towards an object displayed on the display screen, and to a touchscreen game controller to be used to shoot virtual projectiles towards an object displayed on the display screen.
  • the concept of an interactive large screen entertainment system is that users can interact with the scene displayed at the screen by manipulation of elements of the scene.
  • the manipulation is typically performed by a dedicated device known as a game controller.
  • in home use, small-screen home entertainment systems, these game controllers are for example gamepads, mice, or a pointing device such as a laser gun shooter.
  • Such home and professional entertainment systems are controlled by a computer.
  • the computer receives the input signals from the input devices, e.g. the gamepads, and processes these signals for manipulating certain elements of a virtual scenario, game or the like that is displayed on a screen that is also attached to the computer.
  • a central computer device or a cluster of central computer devices are used to calculate and process all the input and output signals of the system to enable a multi-user interactive environment.
  • Game controllers are known that are able to manipulate objects within the virtual scenario, such as a projectile shot from a shooter, by relative movement of the projectile.
  • the game controller to that end for example consists of a series of action buttons, e.g. to trigger the launch of the projectile, and a direction controller known as a joy-pad or d-pad, to control the direction of the projectile.
  • Modern amusement parks are subject to a public which demands more and more spectacular amusement rides with more realistic virtual environment experiences.
  • there is thus a need for an amusement ride with realistic game controllers, wherein the ride is capable of processing multiple controllers at the same time, such that multiple users can interact in the same virtual environment.
  • Traditional game controllers are not arranged for realistic virtual environment experiences, since the control of, for example, projectiles or other objects is not realistic and is performed in a non-optimal manner, e.g. in a relative instead of absolute manner.
  • Amusement parks with traditional large screen entertainment systems can for example comprise light gun type game controllers in the form of light guns that are able to manipulate objects within the virtual scenario, such as a projectile shot from a shooter, by relative movement thereof.
  • a light gun consists of at least one button, e.g. a trigger to launch a projectile, as well as means for determining the target location on the screen, e.g. a light signal.
  • Light guns are known from home console systems and are often modelled on a ballistic weapon such as a pistol. Different technical designs are known. For example, a detection method is known which involves drawing frames in which each target is sequentially displayed in the form of white light after a full black frame is shown. Such a light gun is for example known from US4813682A. In the frame after the full black frame, the target area is the only area on the screen that is white, and the rest of the screen remains black. A photodiode inside the light gun detects this change from low to bright light, as well as the duration of the flash, to identify a target and optionally multiple targets on screen. A drawback of these light guns is that they can only be used on older cathode ray tube screens and not on modern thin-film transistor displays, liquid crystal displays or projection type screens.
  • More modern type light guns rely on one or several infrared or laser light emitters placed near the screen and one sensor in the gun.
  • certain values are sent toward the computer, such as the intensity of the infrared or laser light beam. Since the intensity depends on both the distance and the relative angle to the screen, angle sensors are located in the gun.
  • a trigonometric equation system is solved, and the gun's position relative to the screen is thus calculated.
  • the impact point is determined.
  • in some variants, no angle detector is comprised but only four light detectors; these are however less accurate since calculation of the position is difficult.
  • Other variants are also known in which the sensors are located around the screen and emitters are located inside the gun.
  • Multiplayer environments can pose problems for known light gun game systems since the system needs to be able to handle all data from the controllers and be able to distinguish the different controllers from each other.
  • a system for interaction of multiple users with a virtual environment comprising:
  • At least one computer device arranged for processing and outputting a video signal to at least one display screen for displaying the virtual environment and moving at least one object within the virtual environment;
  • each of the human interface devices is arranged for generating control signals corresponding to input from the user and for communicating the control signal towards the at least one computer device for manipulating movement of at least one object on the display screen
  • each of the human interface devices comprises a pointing device stationary attached by at least one shaft and arranged for determining an azimuth and altitude of the pointing device and generating the control signals thereon for movement of the at least one object within the virtual environment in a corresponding trajectory
  • the at least one computer device is arranged for distinguishing each of the multiple human interface devices and the relative position thereof in view of the display screen.
  • Professional systems for interaction of one or, mostly, more simultaneous users comprise at least one computer device, one or more (large) display screens for displaying the virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
  • the computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and generate output video signals for controlling the display screen(s).
  • the human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user.
  • the human interface device is a pointing device for controlling movement of one or more objects on the display screen.
  • the pointing device determines a movement of the pointing device by the user and generates a corresponding control signal thereof, which control signal is sent to the computer device for further processing with the virtual environment.
  • a projectile can be fired by a gun or the like; the moment of firing, i.e. the trigger, and the trajectory of the projectile are determined by the control signal, and hence by a trigger of the pointing device and a movement defined by two values in two perpendicular planes.
  • in known systems for interaction of multiple users with a virtual environment wherein the pointing devices comprise laser guns, the gun emits a particle beam in the form of non-visible light, e.g. a laser light, which is aimed at the large screen display.
  • One or more sensors or cameras are arranged to sense the non-visible light and determine the position aimed at. The computer then determines if the aimed position and the actual position of the target on the screen correspond and if thus the shot was a hit or miss.
  • the invention is based on the insight that the movement of the at least one object, e.g. the projectile, can be controlled by the movement of the pointing device in relation to a stationary attachment, e.g. a stationary attachment of the pointing device with a shaft.
  • the present invention is based on the insight of the use of a spherical coordinate system wherein the central position, i.e. the origin, of the coordinate system corresponds to the rotation point of the pointing device on the shaft, e.g. the position at which the device is attached to the shaft.
  • Spherical coordinate systems define an origin and a vector
  • the vector is the part of the control system that determines the movement of the object, i.e. the trajectory of the projectile.
  • the vector comprises an azimuth component and an altitude component.
  • the azimuth component defines the angle of the vector from the origin around a horizon, i.e. a horizontal displacement.
  • the azimuth is for example denoted as the angle alpha, α.
  • Azimuth can also be more generally defined as a horizontal angle measured clockwise from any fixed reference plane or easily established base direction line such as the direction to the horizontal centre of the large screen display.
  • the altitude component defines the angle of the vector from the origin and the plane perpendicular to the horizon. For example, the angle between the vector and a reference horizon such as the horizontal plane through the centre of the large screen display.
  • the altitude is for example denoted as the angle beta, ⁇ .
  • the trajectory of the projectile on the screen, i.e. of the manipulated object in the virtual environment, is thus defined by an origin value and a vector.
  • the origin value is defined by the position of the pointing device, i.e. the relative position with respect to the large display screen.
  • the vector is defined by an azimuth and altitude component from the origin.
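The spherical mapping above, from an origin plus azimuth and altitude angles to a point on the screen, can be sketched as follows. The axis convention (x horizontal, y vertical, z towards a flat screen plane) and the function names are illustrative assumptions, not taken from the patent:

```python
import math

def trajectory_direction(azimuth_deg, altitude_deg):
    """Unit direction vector from the azimuth (horizontal angle) and
    altitude (vertical angle), both measured relative to the line from
    the origin to the screen centre."""
    a = math.radians(azimuth_deg)
    b = math.radians(altitude_deg)
    # x: horizontal, y: vertical, z: towards the screen
    return (math.cos(b) * math.sin(a),
            math.sin(b),
            math.cos(b) * math.cos(a))

def screen_intersection(origin, azimuth_deg, altitude_deg, screen_z):
    """Point (x, y) where the aiming ray from `origin` crosses the
    screen plane z = screen_z; `origin` is the rotation point of the
    pointing device on its shaft."""
    dx, dy, dz = trajectory_direction(azimuth_deg, altitude_deg)
    t = (screen_z - origin[2]) / dz       # ray parameter at the plane
    return (origin[0] + t * dx, origin[1] + t * dy)
```

The origin differs per pointing device, which is how the relative position of each device in view of the display enters the calculation.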
  • the azimuth and altitude of the pointing device are determined by at least two variable resistors, wherein the variable resistors are in particular potentiometers.
  • the pointing device comprises at least two variable resistors, one per component.
  • when the pointing device is moved, the value of the resistor changes in correspondence therewith. From the change in resistance of the resistor, the device can determine the angular movement of the pointing device, i.e. the azimuth or altitude component.
  • the variable resistors are potentiometers having three electrical terminals. Such potentiometers, sometimes called potmeters or simply pots, are arranged to be used as a voltage divider to measure the electric potential, which is used for defining a corresponding angular movement of the pointing device, i.e. the azimuth or altitude component.
  • the advantage of the use of variable resistors or potentiometers is the low cost of these components and their accuracy, since the resistance value is a continuous, non-discrete value.
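The voltage-divider readout described above can be sketched as a simple mapping from a raw ADC sample to an angle. The ADC resolution (10-bit) and the mechanical angle range are illustrative assumptions, not values from the patent:

```python
def adc_to_angle(adc_value, adc_max=1023,
                 angle_min=-60.0, angle_max=60.0):
    """Map a raw ADC reading of the potentiometer's wiper voltage to an
    angle in degrees. Because the wiper voltage varies continuously with
    the shaft rotation, the reading maps linearly onto the angle range.
    The 10-bit resolution and the +/-60 degree travel are assumptions."""
    fraction = adc_value / adc_max           # 0.0 .. 1.0 over the travel
    return angle_min + fraction * (angle_max - angle_min)
```

One such conversion would be run per potentiometer, one for the azimuth shaft and one for the altitude shaft.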
  • the azimuth and altitude of the pointing device are determined by at least two optical rotary encoders.
  • the azimuth and altitude of the pointing device are determined by at least two magnetic rotary encoders.
  • the angular movement, i.e. the azimuth and altitude components, can be determined in accordance with the example of the variable resistor, however without a variable resistor but with an optical rotary encoder or a magnetic rotary encoder.
  • the advantage of the use of rotary encoder over variable resistors such as potentiometers is that rotary encoders are more durable.
  • the control signal of the pointing device comprises the azimuth, the altitude and an identification value of the pointing device for distinguishing the respective pointing device from the multiple human input devices.
  • the control signal further comprises a trigger signal for triggering movement of the at least one object.
  • the control signal thus comprises three components: a movement component having an azimuth and an altitude angular movement part, an identification signal to distinguish each individual pointing device and its corresponding position with respect to the large screen display, and finally a trigger signal defining the timing of the shot.
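The three-component control signal could be serialized as in the following sketch. The wire format (field widths, byte order, the `ControlSignal` name) is a hypothetical illustration, not the patent's protocol:

```python
from dataclasses import dataclass
import struct

@dataclass
class ControlSignal:
    """Hypothetical packet: device id, azimuth, altitude, trigger."""
    device_id: int      # identification value, distinguishes each device
    azimuth: float      # horizontal angle, degrees
    altitude: float     # vertical angle, degrees
    trigger: bool       # True at the moment of the shot

    _FMT = "<Hff?"      # little-endian: uint16, float32, float32, bool

    def pack(self) -> bytes:
        return struct.pack(self._FMT, self.device_id,
                           self.azimuth, self.altitude, self.trigger)

    @classmethod
    def unpack(cls, data: bytes) -> "ControlSignal":
        dev, az, alt, trig = struct.unpack(cls._FMT, data)
        return cls(dev, az, alt, trig)
```

The computer device would unpack each received packet, use `device_id` to look up the stored position of that pointing device, and feed the angles into the trajectory calculation.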
  • the trigger signal is generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
  • the trigger signal can be generated in accordance with multiple embodiments of the invention.
  • the pointing device can be shaped in the form of a gun, or, in a snowball shooting game, the pointing device can be a snowball cannon.
  • the gun can be provided with a gun trigger for example, and the snowball cannon with for example a fire button or a pull cord.
  • the skilled person will understand which other pointing devices can also be used in accordance with the invention, and which corresponding trigger signal generating means these pointing devices comprise.
  • the system is arranged for stereoscopy, and in particular the system comprises multiple 3D glasses, one for each of the multiple users, and the display screen is arranged to display the virtual environment with a depth perspective.
  • the virtual environment can be a conventional 2D, two-dimensional, scene wherein objects are manipulated by the user.
  • the examples of the invention are however also applicable for 3D, three-dimensional, virtual environments wherein the large display screen is arranged for stereoscopy, i.e. comprising a depth perspective.
  • a 3D example of the system according to the invention is arranged to visualize a trajectory of a projectile which starts from the pointing device and ends somewhere in the screen, at or near a target object for example.
  • the trajectory of the projectile shown on the large display screen can be defined by a horizontal position of the pointing device in respect of the large display screen and the horizontal orientation of the pointing device.
  • the system is arranged according to one dimension, e.g. the X or horizontal dimension.
  • the vertical or Y dimension is also defined.
  • the vertical position of the pointing device in respect of the large display screen and/or the vertical orientation of the pointing device provides a two dimensional, 2D system.
  • the trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the pointing device in a trajectory towards a certain target location, all in 3D.
  • 3D display methods are applicable, such as a stereoscopic projector, large screen 3D display, etc.
  • the system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
  • the multiple pointing devices are arranged for wired or wireless communication with the at least one computer. More in particular, the communication between the multiple pointing devices and the at least one computer is arranged by Bluetooth, 802.11 WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485, CAN, WiFi-like, or another appropriate short range communication system such as infra-red.
  • the computer should be able to distinguish the individual input devices, i.e. pointing devices. Such can be done by location determination of each pointing device, for example on a visual basis, or, in a more practical embodiment, by use of digital identification of the input devices.
  • Each input device has a unique identifier that is communicated to the computer such that the computer can distinguish the input devices.
  • the pointing devices can be connected with the computer device via wire or wireless.
  • examples of such connections are Bluetooth, WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN, or WiFi-like communication.
  • the system comprises at least 2 pointing devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
  • the virtual environment is a virtual environment of a shooter game, and wherein the at least one object is a projectile. It is an object of the present invention to provide an improved system for interaction of at least one user with a virtual environment with an increased realistic experience for the users. More in particular, it is an object of the present invention to provide a system for interaction of at least one user with a virtual environment comprising a controller arranged for realistic manipulation of projectiles of a virtual environment.
  • a system for interaction of at least one user with a virtual environment comprising:
  • At least one computer device arranged for processing and outputting a video signal to a display screen for displaying the virtual environment and moving objects within the virtual environment; at least one human interface device, arranged for receiving input from a user, generating a control signal corresponding to the user input and communicating the control signal towards the at least one computer for manipulating movement of at least one of the objects on the display screen, characterized in that the at least one human interface device comprises a touchscreen device arranged for generating the control signal on position and movement of at least one finger on the touchscreen, and wherein the at least one computer device is arranged to display movement of the at least one object on the display screen in a trajectory corresponding to a trajectory defined by the position and movement of the at least one finger on the touchscreen.
  • Professional systems for interaction of one or, mostly, more simultaneous users comprise at least one computer device, one or more (large) screen displays for displaying the virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
  • the computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and generate output video signals for controlling the screen(s).
  • the human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user.
  • the human interface device is in particular a device having a touchscreen arranged to generate the input control signal on the basis of a position and movement of one or more fingers on the touchscreen.
  • a device is provided to control objects within the virtual environment by defining a control signal based on absolute movement of the object. What is meant herewith is that the object can be moved on the touchscreen, and the trajectory determined and calculated thereof can be used to display movement of the object on the display screen in a trajectory that corresponds to the trajectory of movement on the touchscreen, with or without displaying the object on the touchscreen device.
  • the invention is based on the insight that objects such as projectiles are moved in a real life environment in an absolute manner, not in the manner currently known in game controllers, i.e. by moving the object up, down, left, right, or a combination thereof, in respect of the current position.
  • a trajectory of a projectile can be controlled by movement of at least one finger over the touchscreen in a corresponding trajectory.
  • Known game controllers are only able to control the aspects of a game or virtual environment by relative movement.
  • in a shooter game wherein a projectile is fired, the trajectory of the projectile displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of the gun or the like is permanently displayed.
  • the user aims at a target somewhere in the screen and when the fire button or trigger is pushed, the projectile starts its trajectory in a straight line from the bottom centre of the display towards the aimed position on the display, hoping to hit the target if the aimed position and target position correspond.
  • with the touchscreen controller according to the invention, the trajectory is different, since the starting position of the projectile on the display is not static but corresponds to the relative position between the touchscreen controller and the display. If the touchscreen controller is located at a certain distance away from the display but in the centre of the display, i.e. in the centre of the width of the display, thus in the middle of the x-position of the display, the trajectory of the projectile starts at the middle of the display, from the bottom towards the aimed position on the display. But if the touchscreen controller is located at the left side of the display, the trajectory of the projectile starts from the bottom left corner of the screen. In this way, the touchscreen controller according to the invention is able to manipulate an object in a virtual environment, e.g. a projectile, in an absolute manner.
  • the trajectory displayed on the display corresponds to the relative position between touchscreen controller and display in such a way that the starting position of the trajectory of the projectile on the screen is determined by the relative position between the touchscreen controller and the display.
  • the at least one computer device is arranged to display movement of the at least one object on the display screen in a trajectory defined as an extrapolation of a trajectory defined by the position and movement of the at least one finger on the touchscreen.
  • the touchscreen controller is arranged to define or calculate a trajectory of the projectile by a first, starting position on the touchscreen, a second, end position on the touchscreen and a trajectory on the touchscreen from first to second.
  • the computer device to which the touchscreen controller is connected knows the position of the controller and knows the position of the display, for example by a predefined display position and a predefined controller position stored in the computer or remotely.
  • one or both of the positions of the controller and the display are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game.
  • the calibration methods of determining one of the controller and display positions (or both) can be performed on a real-time or near real-time basis.
  • the touchscreen controller is arranged to define or calculate a trajectory of the projectile; that trajectory is defined by the first position on the touchscreen, the second position thereon and the trajectory from first to second.
  • the computer device calculates a further virtual trajectory based on the trajectory determined from the touchscreen.
  • the further virtual trajectory is an extrapolation of the trajectory determined from the touchscreen. If the position of the display is somewhere within that further virtual trajectory, i.e. in the extrapolation of the trajectory of the projectile on the touchscreen, the projectile is shown on the display. If however, the extrapolated trajectory, i.e. the further virtual trajectory does not cross the display, the projectile is not shown on the display.
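The extrapolation step might be sketched in two dimensions (an overhead view of the scene) as follows. The coordinate convention (x across the scene, z towards the display plane) and the function name are assumptions for illustration:

```python
def extrapolate_to_display(p1, p2, display_x_range, display_z):
    """Extrapolate the finger trajectory p1 -> p2, given in scene
    coordinates as (x, z) pairs, onto the display plane z = display_z.
    Returns the x coordinate where the extended trajectory crosses the
    display, or None if the extrapolation misses the display."""
    x1, z1 = p1
    x2, z2 = p2
    if z2 <= z1:                        # swipe not directed at the display
        return None
    t = (display_z - z1) / (z2 - z1)    # ray parameter at the display plane
    x_hit = x1 + t * (x2 - x1)
    lo, hi = display_x_range
    return x_hit if lo <= x_hit <= hi else None
```

A `None` result corresponds to the case described above where the further virtual trajectory does not cross the display and the projectile is therefore not shown.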
  • the computer device can determine on the basis of the position of the display and the defined virtual trajectory, i.e. the extrapolated trajectory, how the projectile should be displayed on the large screen display.
  • if the trajectory on the display, i.e. the extrapolation of the trajectory of the projectile on the touchscreen, crosses a target, the computer can count the "shot" as a "hit".
  • a hit can also occur if the target is displayed with depth perspective, i.e. as if its position lies between the large screen and touchscreen, and if the virtual trajectory between touchscreen and the display crosses the target displayed at some distance between the touchscreen and large screen.
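A depth-perspective hit of this kind can be sketched as a point-along-segment test: find where the virtual trajectory between touchscreen and display reaches the target's depth, and compare with the target position. The coordinates, tolerance and names are illustrative assumptions:

```python
def crosses_target(start, end, target, tolerance=0.1):
    """Check whether the straight virtual trajectory from the touchscreen
    (start) to the display (end) passes within `tolerance` of a target
    rendered at a depth between the two. Points are (x, y, z) tuples,
    with z the depth axis from touchscreen towards the display."""
    sx, sy, sz = start
    ex, ey, ez = end
    tx, ty, tz = target
    if ez == sz or not (min(sz, ez) <= tz <= max(sz, ez)):
        return False                    # target not between the two planes
    t = (tz - sz) / (ez - sz)           # fraction of the way at target depth
    px = sx + t * (ex - sx)
    py = sy + t * (ey - sy)
    return (px - tx) ** 2 + (py - ty) ** 2 <= tolerance ** 2
```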
  • the at least one object is displayed on the touchscreen, and wherein the at least one object on the display screen is introduced on the display screen once the at least one object is moved out of the touchscreen.
  • a timing element is involved. What is meant thereby is that only after completion of the trajectory on the touchscreen will the projectile be shown on the display. In one example, the projectile is shown directly after the end of the trajectory on the touchscreen; in another example, there is time in between, which would represent the distance between the display and the touchscreen. This has the advantage that it is more realistic. In a further, even more realistic example, the time between the start of displaying the projectile on the display and the end of the trajectory on the touchscreen corresponds to the distance between touchscreen and display, and in an even more realistic example the time between both also depends on the speed of the projectile's trajectory on the touchscreen.
  • in one example, the time duration between the end of the trajectory on the touchscreen and the start on the display is zero; in another example it is a certain fixed time duration; in another example the time duration depends only on the absolute distance between touchscreen and display; and in yet another example the time duration depends on both the absolute distance and the speed/velocity of the projectile, defined by the speed at which the trajectory is performed on the touchscreen, i.e. the time between the first and second positions on the touchscreen.
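The distance-and-speed variant of the delay could be computed as in this sketch, assuming metric units and a projectile speed taken equal to the swipe speed on the touchscreen (both assumptions, not values from the patent):

```python
def display_delay(distance_m, swipe_speed_m_per_s):
    """Time between the projectile leaving the touchscreen and appearing
    on the display: the controller-to-display distance divided by the
    projectile speed, here taken equal to the measured swipe speed."""
    if swipe_speed_m_per_s <= 0:
        raise ValueError("swipe speed must be positive")
    return distance_m / swipe_speed_m_per_s
```

A faster swipe thus yields a shorter delay, matching the most realistic example above where the delay depends on both distance and speed.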
  • the virtual environment is a virtual environment of a shooter game, and wherein the at least one object is a projectile.
  • the examples of the present invention can be applied in a plurality of different virtual environments.
  • the preferred virtual environments are however game environments, and in particular shooter games.
  • the projectile of such a shooter game is thus the projectile used in the shooter. That could be a bullet of a gun, an arrow of a bow, a javelin, a spear, etc.
  • This projectile can also be a snowball of a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
  • the at least one computer device is arranged to determine a position of the at least one touchscreen device in relation to the display screen, or in relation to the scene, i.e. the virtual room wherein the system is positioned, or in relation to any other calibration point.
  • the computer device can be a single device, or a cluster of computer devices in case of a large game environment with for example a large amount of simultaneous players and thus large amount of controllers.
  • a single computer or a cluster of computers can be used, depending on the processing power needed to process all information of the virtual environment.
  • the computer can, as already mentioned above, determine the position of one or all touchscreen controllers by identification of a particular touchscreen controller and a predefined stored position of that touchscreen, or the computer can determine the real-time position by calculation of the position.
  • the person skilled in the art will understand that several techniques are known for determining positions such as by radiofrequency signals, by optical detection or imaging.
  • the invention is not restricted to the way in which the position of the touchscreen is determined.
  • the positions can be derived by identification of the touchscreen only.
  • 16 players can simultaneously play a shooter game, each having their own touchscreen.
  • the computer device knows the setup of the scene, i.e. the ride, and thus the absolute position of each touchscreen in the scene. By identifying which input signals are from which touchscreen, and by knowing the absolute position of the display in the scene, the computer can determine the relative position between display and each individual touchscreen.
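The identification-to-position lookup described above can be sketched as a simple predefined table. The device identifiers and positions below are invented placeholder values for illustration:

```python
# Hypothetical scene setup: predefined absolute x positions (metres
# across the scene) of the display centre and of each identified
# touchscreen seat in the ride.
DISPLAY_X = 5.0
TOUCHSCREEN_X = {"ts-01": 1.0, "ts-02": 5.0, "ts-03": 9.0}

def relative_x(device_id):
    """Offset of an identified touchscreen from the display centre.
    A negative value means the controller sits left of the display,
    which determines where its projectile trajectories start."""
    return TOUCHSCREEN_X[device_id] - DISPLAY_X
```

With such a table, identifying the touchscreen is sufficient to derive the relative position, as the text notes; no real-time position measurement is needed.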
  • the at least one computer is arranged to communicate with a plurality of touchscreen devices and wherein each respective position thereof is determined by the at least one computer device.
  • the computer is in communicative connection with all touchscreens, either by wire or wireless.
  • wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, etc.
  • the relative position can be determined either by the computer devices, for example on the basis of a predefined value of all touchscreens and identification of each touchscreen, or by the touchscreen device itself, e.g. by a predefined value stored inside the touchscreen, or by (real-time) calculation of the position by the touchscreen on the basis of known location-determining means.
  • the at least one touchscreen device is a tablet.
  • the touchscreen device can be a passive touchscreen display which is only used as input device, i.e. no information is displayed on the display of the touchscreen device, for example a touchpad, or, as most known touchscreens, a combination of input device and output display device. Any of these touchscreens can be dedicated touchscreens, custom designs for the game, for example embedded in a snowball shooter device, or can be off-the-shelf touchscreen devices such as tablets.
  • one of the touchscreen device or the at least one computer is arranged to determine the movement of the object on the display screen by calculating the trajectory on the basis of the position and movement of the at least one finger on the touchscreen.
  • Touchscreen devices can be controlled by finger or by a dedicated input device for a touchscreen such as a stylus pen.
  • a speed at which the at least one object is moved on the display screen corresponds to a speed at which the at least one object is moved on the touchscreen device.
  • the first start and second end position of the finger (or stylus pen) on the touchscreen are used as variables for manipulation of the object(s) in the virtual environment.
  • more variables can manipulate the movement of the objects. An example thereof is the speed at which the object is moved on the touchscreen, i.e. the time between the first and second position of the finger, in other words the speed at which the trajectory is made on the touchscreen.
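The speed variable mentioned above follows directly from the two sampled touch positions and their timestamps. A minimal Python sketch, assuming positions in touchscreen pixels and timestamps in seconds:

```python
import math

def swipe_speed(p1, t1, p2, t2):
    """Speed of the finger trajectory on the touchscreen: the distance
    between the first and second position divided by the time elapsed
    between the two touch samples."""
    distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return distance / (t2 - t1)
```

The resulting value (pixels per second here) can then be mapped onto the speed of the projectile on the display screen.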
  • Another example is to add more dimensions.
  • not only the X-position of the touchscreen in relation to the display is determined, but also the Y-position.
  • the Z-position, being the distance from the display to the touchscreen, is a variable in manipulating the movement of the object (projectile).
  • input determination based on multiple dimensions is in particular more realistic if a three-dimensional representation is used.
  • the system, i.e. the computer, and the display are arranged for 3D.
  • the trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the touchscreen in a trajectory towards a certain target location, all in 3D.
  • 3D display methods are applicable, such as a stereoscopic projector, large screen 3D display, etc.
  • the system can also be arranged for both 2D and 3D representation, by use of a 3D-capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
  • an output angle under which the at least one object is moved out of the touchscreen device corresponds to an input angle under which the at least one object is introduced on the display screen.
  • there are multiple ways to define the trajectory of the projectile on the display. As indicated above, this can be done by extrapolation of the trajectory defined on the touchscreen.
  • the trajectory on the touchscreen is a combination of the variables start (first) position, end (second) position and the path between the first and second position.
  • the path between the first and second position can be defined by only a few intermediate positions or by multiple intermediate positions.
  • the number of intermediate positions is very high. This number can even be increased by interpolation, by which multiple intermediate positions are calculated, for example by a linear interpolation method, a piecewise constant interpolation method, a polynomial interpolation method, a spline interpolation method, etc.
  • the person skilled in the art will understand which other interpolation methods are known and applicable.
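Of the interpolation methods listed above, the linear case is the simplest to illustrate. The following Python sketch (parameter names are illustrative) inserts n intermediate positions on the straight segment between two sampled touch positions:

```python
def interpolate_linear(p1, p2, n):
    """Insert n intermediate positions on the straight segment between
    two sampled touch positions (linear interpolation)."""
    points = []
    for i in range(1, n + 1):
        t = i / (n + 1)  # fraction of the way from p1 to p2
        points.append((p1[0] + t * (p2[0] - p1[0]),
                       p1[1] + t * (p2[1] - p1[1])))
    return points
```

Polynomial or spline interpolation would replace the straight segment with a curve fitted through several sampled positions, at the cost of more computation.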
  • the trajectory of the projectile on the display can also be defined by the output angle at the touchscreen. What is meant thereby is that the output angle is the angle between the trajectory on the touchscreen and the edge of the touchscreen, or by the tangent of the trajectory and the edge of the touchscreen. That (output) angle then corresponds to the angle under which the trajectory of the projectile on the display starts, i.e. the input angle of the trajectory on the display. That input angle is defined as the angle between the trajectory on the display and the edge of the display or between the tangent of the trajectory and the edge of the display.
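The output angle described above can be obtained from the tangent of the trajectory, i.e. its last segment before the edge of the touchscreen. A minimal Python sketch, assuming screen coordinates with x along the edge of the touchscreen and y towards that edge:

```python
import math

def output_angle(p_prev, p_exit):
    """Angle, in degrees, between the tangent of the trajectory (its last
    segment before leaving the touchscreen) and the edge of the
    touchscreen. In the scheme described, the projectile is introduced on
    the display under the same (input) angle."""
    dx = p_exit[0] - p_prev[0]  # displacement along the edge
    dy = p_exit[1] - p_prev[1]  # displacement towards the edge
    return math.degrees(math.atan2(dy, dx))
```

A diagonal final segment thus yields an output angle of 45 degrees, and the trajectory on the display would start under that same input angle.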
  • the at least one object explodes on the display screen when the touchscreen device is tapped.
  • a projectile is an example of an object that is to be manipulated in the virtual environment. That projectile can for example be a snowball in a snowball game, or a rocket in a shooter game. In an example this projectile is not only fired upon a trigger action, the projectile can also explode on the display upon a further trigger signal such as tapping the touchscreen or pressing/tapping a specific button displayed on the touchscreen. Then the snowball or rocket is fired upon the first trigger and explodes upon the second trigger.
  • the at least one touchscreen device comprises a two-dimensional accelerometer, and in particular a three-dimensional accelerometer.
  • the touchscreen device can be provided with more additional input units than only a touchscreen.
  • examples thereof are an accelerometer, e.g. a one-, two- or three-dimensional version thereof. With such an accelerometer the device can determine a (proper) acceleration of the device (g-force) relative to free fall.
  • the option arises to create an inertial navigation system which can be used to manipulate elements/objects of the virtual environment, such as movement of a player in the game. The player can then freely move within the game by movement and rotation of the touchscreen device, and the touchscreen of the touchscreen device is then used to fire a projectile.
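The core of such an inertial navigation scheme is double integration of the accelerometer signal: once to velocity, once more to position. A deliberately simplified one-dimensional Python sketch (real implementations work per axis, subtract gravity and correct drift):

```python
def integrate_motion(samples, dt):
    """Naive 1-D inertial dead reckoning: integrate acceleration samples
    (m/s^2) once to velocity and once more to position, using simple
    Euler steps of dt seconds."""
    velocity, position = 0.0, 0.0
    for a in samples:
        velocity += a * dt      # first integration: acceleration -> velocity
        position += velocity * dt  # second integration: velocity -> position
    return position
```

The estimated position change can then drive the movement of the player within the virtual environment, while taps on the touchscreen fire projectiles.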
  • the accelerometers and/or gyroscopes can also be used to manipulate other objects or elements of the virtual environment, depending on the type of the game played and selections of the programmer of the game.
  • the at least one touchscreen device is arranged to communicate an identification signal to the at least one computer device.
  • the computer should be able to distinguish the individual input devices, i.e. the touchscreen devices. Such can be done by location determination of each touchscreen device, for example on a visual basis, or, in a more practical embodiment, by use of digital identification of the input devices.
  • Each input device has a unique identifier that is communicated to the computer such that the computer can distinguish the input devices.
  • the at least one touchscreen device is arranged to determine a position of the at least one touchscreen device in relation to the at least one display screen and for communicating the position to the at least one computer device.
  • a touchscreen device arranged to be used as a human interface device in a system for interaction of at least one user with a virtual environment according to any of the previous descriptions, wherein the touchscreen device is arranged for generating a control signal and communication thereof to the computer device based on the position and movement of at least one finger on the touchscreen, and for calculating a trajectory of movement of the at least one object on the display screen in correspondence with a trajectory defined by the position and movement of the at least one finger on the touchscreen.
  • a computer device arranged to be used as a human interface device in a system for interaction of at least one user with a virtual environment according to any of the previous descriptions, wherein the computer device is arranged for receiving a control signal from the at least one touchscreen device and wherein the at least one computer device is arranged for calculating a trajectory of movement of the at least one object on the display screen in correspondence with the control signal.
  • an amusement ride comprising a track and at least one car or carriage for moving users over the track, the amusement ride comprising a system for interaction of the users with a virtual environment according to any of the previous descriptions, wherein the at least one car or carriage comprises a plurality of touchscreen devices arranged for generating a control signal and communication thereof to the computer device based on the position and movement of at least one finger on the touchscreen, and for calculating a trajectory of movement of the at least one object on the display screen in correspondence with a trajectory defined by the position and movement of the at least one finger on the touchscreen.
  • the virtual environment is comprised of a large screen display.
  • the invention also applies to static displays, i.e. a scene in which the physical objects are the targets of the shooter game and the system is comprised of one or more computers, one or more touchscreen controllers and one or more physical target objects positioned somewhere in the scene.
  • the trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory on the touchscreen.
  • the system, i.e. the computer thereof, then determines, on the basis of the determined extrapolation of the trajectory and the positions of the touchscreen and the physical target object, whether the projectile hits or misses the target object.
  • the user, i.e. player, is informed of a hit or miss by visual and/or auditory and/or force feedback information on/by the touchscreen or another device such as a scoreboard, speaker, etc.
  • a pointing device arranged to be used as a human interface device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions, wherein the pointing device is arranged for stationary attachment with at least one shaft and arranged for determining an azimuth and altitude of the pointing device and generating the control signals thereon for movement of the at least one object within the virtual environment in a corresponding trajectory.
  • a computer device arranged to be used as a computer device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions, wherein the computer device is arranged for distinguishing each of the multiple human interface devices and the relative position thereof in view of the display screen.
  • an interactive amusement ride comprising a scene and a system for interaction of at least one user with a virtual environment according to any of the previous descriptions, wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, and wherein the scene, and in particular the car or carriage, comprises a plurality of pointing devices.
  • the virtual environment is comprised of a large screen display.
  • the invention also applies to static displays, i.e. a scene in which the physical objects are the targets of the shooter game and the system is comprised of one or more computers, one or more pointing devices and one or more physical target objects positioned somewhere in the scene.
  • the trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory on the basis of the position of the pointing device and the orientation thereof, or in particular on the basis of an extrapolation of the trajectory from the pointing device.
  • the system, i.e. the computer thereof, determines on the basis of the virtual trajectory and the position of the physical target object whether or not the projectile hits or misses the target object.
  • the user, i.e. player, is informed of a hit or miss by visual and/or auditory and/or force feedback information on/by a screen on the pointing device or another device such as a scoreboard, speaker, etc.
  • Figure 1 shows a setup of a system, according to a first aspect of the invention, with a computer, large screen display and multiple users each controlling a single pointing device.
  • Figure 2 shows an illustration, according to a first aspect of the invention, of an example of a trajectory of a projectile from a snowball cannon and the corresponding trajectory shown on the large screen display.
  • Figure 3 shows a stationary pointing device, according to a first aspect of the invention, in the embodiment of a snowball cannon.
  • Figure 4 shows a setup of a system according to a first aspect of the invention with a computer, large screen display and multiple users with each a touchscreen device.
  • Figure 5 shows an illustration, according to examples of the invention, of trajectories of projectiles on the touchscreens and the corresponding trajectories of the projectiles on the large screen display.
  • Figure 6 shows another illustration, according to an example of the invention, of a trajectory of a projectile on the touchscreen and the corresponding trajectory of the projectile on the large screen display.
  • a scene 100 is illustrated in accordance with a first aspect of the invention.
  • the scene is in this example an interactive stationary ride in which multiple users 131, 132 can manipulate objects 141, 142, 143 in the virtual environment.
  • the interactive virtual environment illustrated in this example is a shooter game, and in particular a game wherein multiple users 131, 132 can each simultaneously operate their own pointing device 151, 152.
  • the game illustrated here is a game wherein projectiles in the form of snowballs are fired from a snowball cannon 151, 152.
  • the first pointing device is a first snowball cannon 151.
  • the second pointing device is the second snowball cannon 152.
  • the snowball cannons 151, 152 are just illustrative. Many different shapes and forms of pointing devices are applicable.
  • the snowball cannon 151, 152 is stationary and attached via a single shaft 154, of which only the shaft of the first snowball cannon 151 is visible in Fig. 1.
  • the first user 131 operates the snowball cannon 151 by pushing the cannon downwards/upwards/left/right, all according to a spherical coordinate system.
  • the present invention is based on the insight of the use of a spherical coordinate system wherein the central position, i.e. the origin, of the coordinate system corresponds to the rotation point of the pointing device 151 on the shaft 154, e.g. the position at which the device is attached to the shaft 154.
  • Fig. 1 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120.
  • the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system.
  • the computer device 120 is either attached to an active large screen display 110, for example a large computer screen or large-size television, or the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160.
  • the projector will be connected with the computer device 120 via wired communication 122.
  • the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
  • the scene 100 of Fig. 1 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment.
  • the scene 100 of Fig. 1 shows, by way of example, a two-speaker set-up.
  • the invention, as the skilled person will understand, is not restricted to merely a two-speaker stereo set-up, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound set-ups.
  • Fig. 1 further shows a camera system 171 which can, for example, be used to record the users 131, 132 and to determine, for example, whether or not a pointing device 151, 152 should be enabled or disabled in the game if a user 131, 132 is detected who is operating the pointing device.
  • the camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene.
  • Fig. 2 shows an illustration of the functioning of the spherical coordinate system of the pointing device 151.
  • the centre of rotation 156 of the pointing device 151 corresponds to the origin 210 of the spherical coordinate system.
  • the pointing device 151 can be rotated about two axes, which together are represented by the vector 260. It is the vector that defines the aimed direction, i.e. the position from the origin 210 towards the aimed position 220.
  • the aimed direction 220, and thus the vector 260 in the spherical coordinate system representation as shown, does not correspond to the orientation in which the shooter 151 in Fig. 2 is shown.
  • the vector 260 thus determines the trajectory of the projectile fired from the pointing device 151, e.g. the snowball in the snowball shooter game illustrated in Fig. 1.
  • the vector 260 is comprised of two components, an azimuth 230 and an elevation or altitude 240 component.
  • the azimuth component defines the angle of the vector from the origin around a horizon, i.e. a horizontal displacement.
  • for example, the angle between the vector and the X horizontal axis of the large screen display 110, or the direction to the horizontal centre of the large screen display.
  • the azimuth is for example denoted as the angle alpha, α.
  • Azimuth can also be more generally defined as a horizontal angle measured clockwise from any fixed reference plane or easily established base direction line such as the direction to the horizontal centre of the large screen display.
  • the components of the spherical coordinate system can also be comprised of a radial distance, a polar angle and azimuthal angle. The radial distance is then the length of the vector, the polar angle the altitude and the azimuthal angle the azimuth.
  • the altitude component defines the angle of the vector from the origin and the plane perpendicular to the horizon. For example, the angle between the vector and a reference horizon such as a horizontal plane through the centre of the large screen display.
  • the altitude is for example denoted as the angle beta, ⁇ .
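The azimuth α and altitude β together determine the aimed direction, i.e. the vector 260 from the origin. A minimal Python sketch converting the two angles to a Cartesian unit vector; the axis convention (x right, y up, z towards the display) is an assumption for illustration:

```python
import math

def aim_vector(azimuth_deg, altitude_deg):
    """Unit direction vector from azimuth (alpha) and altitude (beta),
    with the origin at the rotation point of the pointing device."""
    a = math.radians(azimuth_deg)
    b = math.radians(altitude_deg)
    return (math.cos(b) * math.sin(a),   # x: horizontal displacement
            math.sin(b),                 # y: vertical displacement
            math.cos(b) * math.cos(a))   # z: towards the display
```

With both angles at zero the cannon aims straight at the display; a positive altitude tilts the vector upwards, a positive azimuth swings it sideways.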
  • Fig. 3 discloses how the pointing device 151 is built up, by way of example, from at least a housing, a shaft and two variable resistors, e.g. potentiometers. These potentiometers can also be replaced by a rotary encoder, e.g. an optical rotary encoder or a magnetic rotary encoder.
  • two potentiometers are shown, one potentiometer 159a for determining the azimuth angle, i.e. the rotation of the shaft that determines the azimuth angle 158, and the other potentiometer 159b for determining the elevation/altitude angle, i.e. the rotation of the shaft 159 that determines the elevation/altitude angle.
  • these two potentiometers (or encoders) 159a-b are arranged to determine the azimuth 158 and altitude component 157 of the pointing device, i.e. the vector 260 of Fig. 2.
  • when the pointing device is rotated, the value of the resistor changes in correspondence therewith. From the change in resistance of the resistor, the device can determine the angular movement of the pointing device, i.e. the azimuth or altitude component.
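In practice the resistance of each potentiometer is read through an analogue-to-digital converter and mapped linearly to an angle. A minimal Python sketch; the 10-bit ADC range and the 270-degree electrical travel are assumed values typical of a rotary potentiometer, not figures from the application:

```python
def adc_to_angle(adc_value, adc_max=1023, angle_range=270.0):
    """Map a raw potentiometer ADC reading (0..adc_max) linearly to a
    rotation angle in degrees (0..angle_range)."""
    return (adc_value / adc_max) * angle_range

# One such conversion per potentiometer:
# azimuth from potentiometer 159a, altitude from potentiometer 159b.
```

Two such readings per frame thus yield the azimuth and altitude components that make up the vector 260.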
  • a scene 400 is illustrated in accordance with a first aspect of the invention.
  • the scene 400 is in this example an interactive stationary ride in which multiple users 131, 132 can manipulate objects 141, 142, 143 in the virtual environment.
  • the interactive virtual environment illustrated in this example is a shooter game, and in particular a game wherein multiple users 131, 132 can each (simultaneously) operate their own touchscreen device 451, 452.
  • the game illustrated here is a game wherein snowballs are fired from a snowball cannon 451, 452.
  • the first touchscreen device is a first game controller 451 for firing a projectile in the form of a snowball.
  • the second touchscreen device is the second game controller 452.
  • these touchscreen game controllers 451, 452 are tablets; they can however also be touchscreens comprised in a housing in the shape of a gun or cannon.
  • these touchscreen devices 451, 452 can be stationary, i.e. fixedly, attached to a frame such as a car, carriage or the like. They can also be handheld devices that can be taken from a docking station or the like. In an example they are attached to a fixed structure in such a way that they cannot be removed (stolen) from the scene.
  • the first user 131 operates the first touchscreen device 451 by moving a finger, or a device for operating a touchscreen such as a stylus pen, over the touchscreen device. The movement over the touchscreen defines a trajectory that corresponds with the trajectory of the projectile, e.g. the snowball.
  • the trajectory of the projectile, the snowball, on the large screen display 110, and in particular the active part of the display 111, towards a first target object 141 of the plurality of objects 141, 142, 143 of the virtual environment, corresponds with the trajectory defined by movement over the touchscreen device 451.
  • the snowball trajectory is not only defined by relative movement, i.e. the movement over the touchscreen, but also by the absolute location of the touchscreen device.
  • Fig. 4 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120.
  • the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system.
  • the computer device 120 is either attached to an active large screen display 110, for example a large computer screen or large-size television, or the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160.
  • the projector will be connected with the computer device 120 via wired communication 122.
  • the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
  • the scene 400 of Fig. 4 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment.
  • the scene 400 of Fig. 4 shows, by way of example, a two-speaker set-up.
  • the invention, as the skilled person will understand, is not restricted to merely a two-speaker stereo set-up, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound set-ups.
  • Fig. 4 further shows a camera system 171 which can, for example, be used to record the users 131, 132 and to determine, for example, whether or not a touchscreen device 451, 452 should be enabled or disabled in the game if a user 131, 132 is detected who is operating the touchscreen device.
  • the camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene.
  • Known game controllers are only able to control the aspects of a game or virtual environment by relative movement.
  • the trajectory of the snowball displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of a gun, or applicable here, a snowball cannon, is permanently displayed.
  • the user aims at a target somewhere in the screen and when the fire button or trigger is pushed, the snowball starts its trajectory in a straight line from the bottom centre of the display, at the end of the cannon, towards the aimed position on the display, hoping to hit the target, e.g. target object 141, if the aimed position and target object position 141 correspond.
  • the trajectory is different since the starting position of the snowball on the display is not static but corresponds to the relative position between the touchscreen device 451, 452 and the display 110.
  • if the touchscreen device 451 is located at a certain distance away from the display, illustrated by a position on the Z depth axis of the large display screen 110 as illustrated in Fig. 4, but positioned at the centre of the display, i.e. in the centre of the width of the display and thus at the origin of the X horizontal axis of the large screen display 110 as illustrated in Fig. 4, the trajectory of the projectile starts at the middle of the display, from the bottom towards the aimed position on the display, e.g. target object 141.
  • if the touchscreen device 451 is located at the right side of the display 110, thus at a position away from the origin on the X horizontal axis, the trajectory of the snowball starts from the bottom right corner of the screen 110. In this way, the touchscreen device 451 is able to manipulate an object in a virtual environment, e.g. a trajectory of a snowball in a snowball shooter game, not only on the basis of the aimed direction but also on the basis of the relative position of the touchscreen device 451 in relation to the display 110.
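This position-dependent starting point can be illustrated with a small Python sketch that maps the touchscreen's X position in the scene to the pixel column where the projectile enters at the bottom of the display. The scene width, pixel count and the linear mapping itself are assumptions for illustration:

```python
def start_column(touchscreen_x, scene_width, display_width_px):
    """Map the touchscreen's X position in the scene (centre of the
    display = 0) to the pixel column where the projectile enters at the
    bottom of the display."""
    # normalise from [-scene_width/2, +scene_width/2] to [0, 1]
    t = touchscreen_x / scene_width + 0.5
    return round(t * (display_width_px - 1))
```

A touchscreen at the centre of the scene thus fires from the middle of the bottom edge, while one at the right-hand edge of the scene fires from the bottom right corner.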
  • Touchscreen device 451 is in particular arranged to define or calculate a trajectory of the projectile by a first, starting position on the touchscreen, a second, end position on the touchscreen and a trajectory on the touchscreen from first to second.
  • the computer device 120 to which the touchscreen device 451 is connected knows the position of each touchscreen device 451, 452 and knows the position of the display 110, for example by a predefined device location position value, stored locally or remotely.
  • one of, or both, the positions of the touchscreen devices 451, 452 and the display 110 are determined at certain (calibration) moments in time, for example when a carriage of a ride (not shown) enters the scene where the users 131, 132 can see the display 110 and can start the game.
  • the calibration methods of determining one of the touchscreen 451, 452 and display 110 positions (or both) can be performed on a real-time or near real-time basis.
  • the touchscreen device 451 is arranged to define or calculate a trajectory of the projectile, that trajectory is defined by the first position on the touchscreen, the second position thereon and trajectory from first to second.
  • the computer device calculates a further virtual trajectory based on the trajectory determined from the touchscreen.
  • the further virtual trajectory is an extrapolation of the trajectory determined from the touchscreen. If the position of the display is somewhere within that further virtual trajectory, i.e. in the extrapolation of the trajectory of the projectile on the touchscreen, the projectile is shown on the display. If however, the extrapolated trajectory, i.e. the further virtual trajectory does not cross the display, the projectile is not shown on the display.
  • the computer device can determine on the basis of the position of the display and the defined virtual trajectory, i.e. the extrapolated trajectory, how the projectile should be displayed.
  • if the trajectory on the display, i.e. the extrapolation of the trajectory of the projectile on the touchscreen, crosses a target object, the computer can count the "shot" as a "hit".
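The hit-or-miss decision described above can be sketched in Python for a simplified geometry: a 2-D top view with x across the scene and z towards the display, a straight touchscreen trajectory, and the display as a plane at a known distance. This flattened geometry is an assumption for illustration, not the full 3-D case of the application:

```python
def hit_display(p1, p2, screen_distance, screen_half_width):
    """Extrapolate the straight touchscreen trajectory p1 -> p2 (points
    given as (x, z), with z towards the display) and test whether it
    crosses the display plane within the display's width.
    Returns the crossing x coordinate, or None if the shot misses."""
    dz = p2[1] - p1[1]
    if dz <= 0:                         # not moving towards the display
        return None
    t = (screen_distance - p1[1]) / dz  # steps to reach the display plane
    x = p1[0] + t * (p2[0] - p1[0])
    return x if abs(x) <= screen_half_width else None
```

The returned x coordinate is then the point where the projectile appears on the display; comparing it with the position of a target object decides between a hit and a miss.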
  • in Figs. 5 and 6 the different trajectories are shown.
  • the large screen display 110, with active area 111, displays (besides other elements of the game) two objects 141, 142 that can be manipulated. These are the targets that are aimed at by the two users 131, 132.
  • the first user 131 with the first touchscreen device 451 is on the left side of the screen 110, thus, as shown in Fig. 4, on the left side of the X axis. That first user 131 generates a trajectory 411-412-413 on the touchscreen device 451.
  • the trajectory is defined by a first start point 411, a second end point 412 and the trajectory 413 from the first to the second.
  • that virtual trajectory 431 is an extrapolation of the trajectory 411-412-413 on the touchscreen 451.
  • the virtual trajectory, i.e. the extrapolated trajectory 431, continues on the screen 110 at start point 221 and the trajectory is further continued, as an extrapolation of the trajectory 411-412-413 on the touchscreen 451 and the virtual trajectory 431, towards the end point 222 on the screen. If the trajectory 223 displayed on the screen 110 crosses a target object 141, 142, a hit is counted; otherwise, the shot is counted as a miss.
  • target object 142 and the virtual trajectory 471, i.e. the extrapolation of the trajectory 451-452-453 on the touchscreen 452, correspond.
  • the touchscreen devices as shown in these figures are tablets 451, 452 having a housing and an active touchscreen area 455, 457. They furthermore have a button 454, 456 for additional control signals, such as to generate an explosion of the projectile, or to display the score or the like.
  • in Fig. 6 yet another trajectory is shown.
  • the trajectory 612-613-314 on the touchscreen 451 is not a straight line but is curved.
  • the extrapolation 631 of that trajectory 612-613-314 is thus also curved according to a corresponding radius.
  • This trajectory does not cross a target object 141 and thus the shot is registered by the computer as a missed shot.
  • objects of the virtual environment can also move into the display of the touchscreen device, e.g. on the screen of the tablet.
  • These objects can be new objects that have not yet been shown on the large screen display, but also objects that move from the large screen display towards the touchscreen device and which are then shown in the touchscreen, for example in the same trajectory as the objects move over the large screen display.
  • the person skilled in the art will understand which objects are suitable for which virtual environments. Examples thereof could be snowflakes in a snowstorm that move in a certain trajectory (depending on the wind direction, for example) from the large screen display towards the touchscreen device and are then shown on the touchscreen in a trajectory that corresponds to, or in particular is an extrapolation of, the trajectory on the large screen display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In general, the invention relates to a system for interaction of at least one user, or multiple users, with a virtual environment. In particular, the invention relates to HIDs and, according to one aspect, to a pointing-device game controller for firing virtual projectiles towards an object shown on the display screen, provided by a system for interaction of a plurality of users with a virtual environment, the system comprising: at least one computer device arranged for processing and generating a video signal to at least one display screen for displaying the virtual environment and for moving at least one object within the virtual environment; multiple human interface devices arranged for respectively receiving input from multiple users, each of the human interface devices being arranged for generating control signals corresponding to input of the user and for communicating the control signal to the at least one computer device for manipulating the movement of the at least one object on the display screen. The system is characterized in that each of the human interface devices comprises a pointing device fixedly attached by at least one shaft and arranged for determining an azimuth and altitude of the pointing device and for generating the control signals thereon for movement of the at least one object within the virtual environment in a corresponding trajectory, and in that the at least one computer device is arranged for distinguishing each of the multiple human interface devices and the relative position thereof in view of the display screen.
PCT/NL2016/050272 2015-04-17 2016-04-18 Game controller WO2016167664A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
NL2014665 2015-04-17
NL2014665A NL2014665B1 (en) 2015-04-17 2015-04-17 Shooter game controller.
NL2014664 2015-04-17
NL2014664A NL2014664B1 (en) 2015-04-17 2015-04-17 Touchscreen game controller.

Publications (2)

Publication Number Publication Date
WO2016167664A2 true WO2016167664A2 (fr) 2016-10-20
WO2016167664A3 WO2016167664A3 (fr) 2017-01-05

Family

ID=56360453

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NL2016/050272 WO2016167664A2 (fr) 2015-04-17 2016-04-18 Commande de jeu

Country Status (1)

Country Link
WO (1) WO2016167664A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107773987A (zh) * 2017-10-24 2018-03-09 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4813682A (en) 1985-08-09 1989-03-21 Nintendo Co., Ltd. Video target control and sensing circuit for photosensitive gun

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US8384565B2 (en) * 2008-07-11 2013-02-26 Nintendo Co., Ltd. Expanding operating device and operating system
US20100131947A1 (en) * 2008-11-24 2010-05-27 Disney Enterprises, Inc. System and method for enabling a local user of a real-life simulation environment to interact with a remote user of a corresponding virtual environment
US8226484B2 (en) * 2009-08-27 2012-07-24 Nintendo Of America Inc. Simulated handlebar twist-grip control of a simulated vehicle using a hand-held inertial sensing remote controller
US20120157204A1 (en) * 2010-12-20 2012-06-21 Lai Games Australia Pty Ltd. User-controlled projector-based games
WO2013111119A1 (fr) * 2012-01-27 2013-08-01 Saar Wilf Simulating interaction with a 3D environment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4813682A (en) 1985-08-09 1989-03-21 Nintendo Co., Ltd. Video target control and sensing circuit for photosensitive gun

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107773987A (zh) * 2017-10-24 2018-03-09 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
US10857462B2 (en) 2017-10-24 2020-12-08 Netease (Hangzhou) Network Co., Ltd. Virtual character controlling method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
WO2016167664A3 (fr) 2017-01-05

Similar Documents

Publication Publication Date Title
US20230347254A1 (en) Method and apparatus for providing online shooting game
CN110585712A (zh) Method, apparatus, terminal and medium for throwing virtual explosive in virtual environment
US9684369B2 (en) Interactive virtual reality systems and methods
CN109821238B (zh) Aiming method and apparatus in game, storage medium, and electronic device
US9542011B2 (en) Interactive virtual reality systems and methods
WO2016204617A2 (fr) Game controller
US20150080121A1 (en) Method for tracking physical play objects by virtual players in online environments
US20120329558A1 (en) Method and apparatus for using a common pointing input to control 3d viewpoint and object targeting
EP2221707A1 (fr) System and method for providing user interaction with projected three-dimensional environments
KR101366444B1 (ko) Real-time interactive virtual shooting system
US8267793B2 (en) Multiplatform gaming system
AU2015244158A1 (en) Interactive virtual reality systems and methods
US10928915B2 (en) Distributed storytelling environment
JP2009011657A (ja) Game program and game device
CN113827949A (zh) Screen shooting range using artificial intelligence technology and screen shooting game method
KR101247213B1 (ko) Robot for combat game and combat game system and method using the same
CN206444151U (zh) Immersive virtual reality shooting interaction platform
Wolf BattleZone and the Origins of First-Person Shooting Games
WO2013111119A1 (fr) Simulating interaction with a 3D environment
WO2016167664A2 (fr) Game controller
KR200462198Y1 (ko) Screen shooting game apparatus
US10369487B2 (en) Storytelling environment: mapping virtual settings to physical locations
JP2011004992A (ja) Video game device
NL2014665B1 (en) Shooter game controller.
KR101448244B1 (ko) Control device for PC games

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16735726

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16735726

Country of ref document: EP

Kind code of ref document: A2