
HK1171402B - Game system, controller device, and game process method

Info

Publication number: HK1171402B
Authority: HK (Hong Kong)
Application number: HK12112244.5A
Other languages: Chinese (zh)
Other versions: HK1171402A1
Inventors: 竹田玄洋, 川井英次
Original Assignee: 任天堂株式会社 (Nintendo Co., Ltd.)
Priority claimed from JP2010245299A (JP4798809B1)
Application filed by 任天堂株式会社 (Nintendo Co., Ltd.)
Publication of HK1171402A1
Publication of HK1171402B


Description

Game system, operation device, and game processing method
Technical Field
The present invention relates to a game system including an operation device that can be moved and operated by a player, and an operation device and a game processing method in the game system.
Background
Conventionally, there is a game system in which a player moves an operation device to perform a game operation (see, for example, patent document 1). For example, in the game system described in patent document 1, the operation device includes components such as an acceleration sensor and an imaging element, and the game device can calculate (estimate) the movement of the operation device using these components. Thus, the player can perform a game operation by moving the operation device itself, and thus can perform a more intuitive operation, a more realistic operation, a more complicated operation, and the like, as compared with the case of operating only the buttons or the joysticks.
Patent Document 1: Japanese Patent No. 4265814 Specification
Disclosure of Invention
Problems to be solved by the invention
In the game system described in patent document 1, a game image is displayed on a display device separate from an operation device, and a player performs a game operation using a handheld operation device while viewing a screen of the display device. Therefore, in the above game system, the player cannot directly operate the game image displayed on the screen. That is, although the player can perform an operation of pointing the operation device in the direction of the screen to indicate a desired position on the screen, for example, the player cannot perform an operation of directly touching the screen or an operation of moving the screen itself.
Therefore, an object of the present invention is to provide a game system, an operation device, and a game processing method that enable a new game operation.
Means for solving the problems
In order to solve the above problems, the present invention adopts the following configurations (1) to (10).
(1) One example of the present invention is a game system including a stationary game device and a first operation device.
The game device includes a first operation data receiving unit, a game processing unit, an image generating unit, a game image compressing unit, a game image transmitting unit, and an image output unit. The first operation data receiving unit receives first operation data from the first operation device. The game processing unit executes game processing based on the first operation data. The image generation unit sequentially generates a first game image and a second game image based on the game processing. The game image compression unit sequentially compresses the first game image to generate compressed image data. The game image transmitting unit sequentially transmits the compressed image data wirelessly to the first operation device. The image output unit sequentially outputs the second game image to an external display device independent of the first operation device.
The first operation device includes a display unit, a touch panel, an inertial sensor, a first operation data transmission unit, a game image reception unit, and a game image decompression unit. The touch panel is provided on the screen of the display unit. The first operation data transmitting unit wirelessly transmits first operation data including output data of the touch panel and the inertial sensor to the game device. The game image receiving unit sequentially receives the compressed image data from the game device. The game image decompression unit sequentially decompresses the compressed image data to obtain the first game image. The display unit sequentially displays the first game image obtained by the decompression.
The "game device" may be any information processing device that executes game processing and generates an image based on the game processing. The game device may be an information processing device dedicated to a game, or may be a multi-purpose information processing device such as a general personal computer.
The "first operation device" may have any other configuration as in the terminal device in the embodiment described below, as long as it includes at least the display unit, the touch panel, the inertial sensor, the first operation data transmission unit, the game image reception unit, and the game image decompression unit.
The "game system" may include the external display device that displays the second game image, or may not include the external display device, as long as the game device and the first operation device are included. That is, the game system may be provided so as not to include the external display device, or may be provided so as to include the external display device.
The "external display device" may be independent of the first operation device, and may be any device that can display the second game image generated by the game device, other than the television 2 in the embodiment described below. For example, the external display device may be integrated with the game device (in one housing).
According to the configuration of the above (1), the first operation device includes the touch panel and the inertial sensor, and the game device executes the game process based on the first operation data including the output data of the touch panel and the inertial sensor. Thus, the player can perform a game operation by directly touching the screen of the first operation device or moving the screen itself (the first operation device itself). That is, according to the configuration of the above (1), it is possible to provide a new game operation of directly operating a game image displayed on a screen to a player.
In the configuration of the above (1), the first game image displayed on the screen of the first operation device is often a game image for operation using a touch panel. Depending on the game content, it may be desirable to display an image that is not used for an operation performed by the touch panel, but it is difficult to display such an image while performing an operation on the touch panel. In this regard, in the configuration of the above (1), since the second game image can be displayed on the external display device, two different game images can be presented to the player. Therefore, for example, a first game image suitable for the operation of the touch panel is displayed on the screen of the first operation device, and a second game image suitable for grasping a game space is displayed on the external display device, so that the game space can be expressed by various methods using two kinds of game images. Therefore, according to the configuration of the above (1), a game image that is easier to observe and easier to perform a game operation can be presented to the player.
In addition, according to the configuration of the above (1), the first operation device only needs to execute at least the decompression processing of the image data, and the game processing itself only needs to be executed on the game device side. Even if the game processing becomes complicated, only the processing on the game device side increases, and the amount of image decompression processing in the first operation device is hardly affected. Therefore, even when complicated game processing is necessary, the processing load on the first operation device side can be kept within a predetermined range, and a high information processing capability is not required of the first operation device. As a result, the first operation device to be held in the user's hand can easily be made small and light, and is also easy to manufacture.
Further, according to the configuration of the above (1), since the first game image is compressed and transmitted from the game device to the first operation device, the game image can be wirelessly transmitted at high speed, and a delay from the game process to the display of the game image can be reduced.
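The per-frame flow described in configuration (1) can be pictured with a short sketch. The following Python fragment is illustrative only: the compressor and the wireless link are stand-ins (zlib and in-memory queues), and all names are assumptions rather than elements defined by this specification.

```python
# Illustrative per-frame flow for configuration (1). zlib and in-memory
# deques stand in for the game image compression unit and the wireless
# link; every name here is an assumption, not part of the specification.
import zlib
from collections import deque

terminal_link = deque()   # game device -> first operation device (compressed first game image)
op_data_link = deque()    # first operation device -> game device (first operation data)

def game_device_frame(game_state, external_display):
    # Game processing based on the first operation data (touch panel + inertial sensor).
    first_op_data = op_data_link.popleft() if op_data_link else {}
    game_state["pos"] = game_state.get("pos", 0) + first_op_data.get("tilt", 0)

    # Sequentially generate the first and second game images (placeholder byte strings).
    first_image = f"terminal view {game_state['pos']}".encode()
    second_image = f"external view {game_state['pos']}".encode()

    # Compress the first game image and transmit it wirelessly to the operation device.
    terminal_link.append(zlib.compress(first_image))
    # Output the second game image to the external display device.
    external_display.append(second_image)

def operation_device_frame(touch, tilt):
    # Wirelessly transmit the first operation data (touch panel + inertial sensor output).
    op_data_link.append({"touch": touch, "tilt": tilt})
    # Receive, decompress, and display the first game image, if one has arrived.
    if terminal_link:
        print("LCD shows:", zlib.decompress(terminal_link.popleft()).decode())

tv, state = [], {}
operation_device_frame(touch=(120, 80), tilt=1)   # operation data for the next frame
game_device_frame(state, tv)                      # one frame of game processing
operation_device_frame(touch=(121, 80), tilt=1)   # displays "terminal view 1"
```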
(2) The game system may further include a second operation device. The second operation device includes a second operation data transmission unit that wirelessly transmits second operation data indicating an operation performed on the second operation device to the game device. The game device further includes a second operation data receiving unit that receives the second operation data. The game processing unit executes the game processing based on the second operation data.
The "second operation device" is not limited to the controller in the embodiment described later, and may be any device that can wirelessly transmit operation data (second operation data) to the game device.
According to the configuration of the above (2), the player can perform the game operation using not only the first operation device but also the second operation device. Since the player using the second operation device can play the game while viewing the game image displayed on the external display device, according to the configuration of the above (2), the game can be played by two players viewing the screens of the external display device and the first operation device, respectively.
(3) The game device may further include a game sound generation unit, a game sound output unit, and a game sound transmission unit. The game sound generation unit generates a first game sound and a second game sound according to a game process. The game sound output unit outputs the second game sound to an external acoustic device independent of the first operation device. The game sound transmitting unit wirelessly transmits the first game sound to the first operation device. The first operation device further includes a game sound receiving unit and a speaker. The game sound receiving unit receives a first game sound from a game device. The speaker outputs the first game sound received by the game sound receiving unit.
In the above (3), the first game sound wirelessly transmitted from the game device to the first operation device may be transmitted after being compressed as in the embodiment described later, or may be transmitted without being compressed.
According to the configuration of the above (3), as with the game image, two kinds of game sounds can be output. Thus, a first game sound matching the first game image can be output from the first operation device, and a second game sound matching the second game image can be output from the external acoustic device.
(4) The first operation device may further include a microphone. At this time, the first operation data transmitting unit also wirelessly transmits data of the sound detected by the microphone to the game device.
In the above (4), the data of the voice wirelessly transmitted from the first operation device to the game device may be transmitted after being compressed as in the embodiment described later, or may be transmitted without being compressed.
According to the configuration of the above (4), the sound (microphone sound) detected by the microphone of the first operation device is transmitted to the game device. Thus, the game device can use the microphone sound as the game sound or use the result of the sound recognition processing on the microphone sound as the game input.
(5) The first operation device may further include a camera and a camera image compression unit. The camera image compression unit compresses a camera image captured by the camera to generate compressed captured image data. In this case, the first operation data transmitting unit also wirelessly transmits the compressed captured image data to the game device. The game device further includes a camera image decompression unit configured to decompress the compressed captured image data to obtain the camera image.
According to the configuration of the above (5), the camera image captured by the camera of the first operation device is transmitted to the game device. Therefore, the game device can use the camera image as a game image, or use a result obtained by performing image recognition processing on the camera image as a game input. Further, according to the configuration of (5) above, since the camera image is compressed and transmitted, the camera image can be wirelessly transmitted at high speed.
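As a rough illustration of configuration (5), the sketch below compresses a camera image on the operation device side and decompresses it on the game device side, then derives a toy "recognition" value used as a game input. The codec and the recognition step are placeholders, not the method of this specification.

```python
# Illustrative upstream camera path for configuration (5): the operation
# device compresses the camera image, the game device decompresses it and
# derives a toy "recognition" result used as a game input. zlib and the
# brightest-byte heuristic are placeholders, not this specification's method.
import zlib

def operation_device_send_camera(camera_image: bytes) -> bytes:
    # Camera image compression unit: compress before wireless transmission.
    return zlib.compress(camera_image)

def game_device_handle_camera(compressed: bytes) -> int:
    # Camera image decompression unit, followed by a stand-in recognition step.
    camera_image = zlib.decompress(compressed)
    # Toy "recognition": index of the brightest byte, used as a game input value.
    return max(range(len(camera_image)), key=lambda i: camera_image[i])

packet = operation_device_send_camera(bytes([10, 200, 30, 40]))
print("game input derived from camera image:", game_device_handle_camera(packet))
```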
(6) The first operation device may further include a plurality of front operation buttons and a direction input unit capable of indicating a direction. The plurality of front operation buttons are provided, on both sides of the screen, on the front plane on which the screen of the display unit and the touch panel are provided. The direction input unit is likewise disposed on the front plane on both sides of the screen. In this case, the first operation data further includes data indicating operations performed on the plurality of front operation buttons and the direction input unit.
According to the configuration of the above (6), the operation buttons and the direction input unit are provided on both sides of the screen of the first operation device. The player can therefore operate the operation buttons and the direction input unit while holding the first operation device (typically, with the thumbs of both hands), so that they can easily be operated even during an operation of moving the first operation device.
(7) The first operation device may further include a plurality of back operation buttons and a plurality of side operation buttons. The plurality of back operation buttons are provided on the back plane, which is the surface opposite to the front plane on which the screen of the display unit and the touch panel are provided. The plurality of side operation buttons are provided on the side surface between the front plane and the back plane. In this case, the first operation data further includes data indicating operations performed on the plurality of back operation buttons and the plurality of side operation buttons.
According to the configuration of the above (7), the operation buttons are provided on the back plane and the side surface of the first operation device. The player can therefore operate these operation buttons while holding the first operation device (typically, with the index fingers or middle fingers), so that they can easily be operated even during an operation of moving the first operation device.
(8) The first operation device may further include a magnetic sensor. In this case, the first operation data also includes data of the detection result of the magnetic sensor.
According to the configuration of the above (8), the first operation device includes the magnetic sensor, and the output result of the magnetic sensor is used in the game process in the game device. Thus, the player can perform a game operation by moving the first operation device. Further, since the game device can determine the absolute posture of the first operation device in the real space from the output result of the magnetic sensor, the posture of the first operation device can be accurately calculated by using the output result of the inertial sensor and the output result of the magnetic sensor, for example.
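For reference, the fields that configurations (1) and (4) to (8) add to (or transmit together with) the first operation data can be summarized roughly as below. The field names and types are illustrative assumptions, not the data format of this specification.

```python
# Rough summary of the data carried (or sent together) as first operation
# data after configurations (1) and (4) to (8). Field names and types are
# assumptions for illustration, not this specification's format.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class FirstOperationData:
    # (1) touch panel and inertial sensor outputs
    touch_positions: List[Tuple[int, int]] = field(default_factory=list)
    acceleration: Tuple[float, float, float] = (0.0, 0.0, 0.0)      # e.g. three-axis accelerometer
    angular_velocity: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # e.g. three-axis gyro
    # (6) front operation buttons and direction input
    front_buttons: int = 0                                          # bit field, one bit per button
    direction_input: Tuple[float, float] = (0.0, 0.0)
    # (7) back and side operation buttons
    back_buttons: int = 0
    side_buttons: int = 0
    # (8) magnetic sensor
    magnetic_field: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    # (4)/(5) sound and compressed camera data sent together with the operation data
    microphone_samples: Optional[bytes] = None
    compressed_camera_image: Optional[bytes] = None
```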
(9) The inertial sensor may be any inertial sensor, and may include, for example, a three-axis acceleration sensor and a three-axis gyro sensor.
According to the configuration of the above (9), by using two types of sensors, that is, the acceleration sensor and the gyro sensor, as the inertial sensor, the movement and the posture of the first operation device can be accurately calculated.
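One common way to combine the sensors named in (8) and (9) is a complementary filter: the gyro sensor supplies fast relative rotation, the acceleration sensor anchors pitch and roll to gravity, and the magnetic sensor anchors yaw to an absolute heading. The sketch below is a generic illustration under those assumptions; the coefficient, the axis conventions, and the simplified (tilt-uncompensated) yaw are not taken from this specification.

```python
# Illustrative complementary filter combining the sensors from (8) and (9):
# gyro for fast relative rotation, accelerometer for absolute pitch/roll
# from gravity, magnetic sensor for absolute yaw. Coefficient, axis
# conventions, and the tilt-uncompensated yaw are assumptions only.
import math

ALPHA = 0.98  # weight given to the integrated gyro estimate

def update_attitude(attitude, gyro, accel, mag, dt):
    """attitude = (pitch, roll, yaw) in radians; gyro in rad/s; accel and mag are 3-vectors."""
    pitch, roll, yaw = attitude
    # Integrate angular velocity: responsive, but drifts over time.
    pitch_g = pitch + gyro[0] * dt
    roll_g = roll + gyro[1] * dt
    yaw_g = yaw + gyro[2] * dt

    # Absolute pitch/roll from the gravity direction (valid when nearly static).
    ax, ay, az = accel
    pitch_a = math.atan2(-ax, math.hypot(ay, az))
    roll_a = math.atan2(ay, az)

    # Absolute yaw from the magnetic sensor (tilt compensation omitted here).
    yaw_m = math.atan2(mag[1], mag[0])

    # Blend: mostly gyro, slowly corrected toward the absolute references.
    return (ALPHA * pitch_g + (1 - ALPHA) * pitch_a,
            ALPHA * roll_g + (1 - ALPHA) * roll_a,
            ALPHA * yaw_g + (1 - ALPHA) * yaw_m)

attitude = (0.0, 0.0, 0.0)
attitude = update_attitude(attitude, gyro=(0.01, 0.0, 0.02),
                           accel=(0.0, 0.0, 9.8), mag=(0.3, 0.1, 0.4), dt=1 / 200)
print(attitude)
```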
(10) The game device may further include a reading unit, a network communication unit, and a power supply unit. The reading unit reads information from an external recording medium on which a game program is recorded, the external recording medium being attachable to and detachable from the game device. The network communication unit is connectable to a network and communicates with an information processing apparatus that can communicate via the network. The power supply unit supplies power from a power supply outside the game device to each unit in the game device. The game processing unit executes the game processing based on the game program read by the reading unit.
According to the configuration of the above (10), the game program executed by the game device can be easily changed by replacing the external recording medium on which the game program is recorded. Further, since the game device can communicate with other information processing apparatuses via a network, the functions of the game device and the contents of games executed by the game device can be enriched by, for example, downloading new applications and data via the network. As described later in "7. Another operation example of the game system", the terminal device 7 can also be used as an interface for communicating with another information processing device via a network.
As another example of the present invention, the present invention can be implemented as the first operation device in the above (1) to (10). As another example of the present invention, the present invention can be implemented as a game processing method performed in the game systems of (1) to (10) above.
Advantageous Effects of Invention
According to the present invention, a game process is executed based on an operation performed on an operation device including a touch panel and an inertial sensor, thereby enabling a new game operation.
Drawings
Fig. 1 is an external view of the game system 1.
Fig. 2 is a block diagram showing an internal configuration of game device 3.
Fig. 3 is a perspective view showing an external configuration of the controller 5.
Fig. 4 is a perspective view showing an external configuration of the controller 5.
Fig. 5 is a diagram showing an internal configuration of the controller 5.
Fig. 6 is a diagram showing an internal configuration of the controller 5.
Fig. 7 is a block diagram showing the configuration of the controller 5.
Fig. 8 is a diagram showing an external configuration of the terminal device 7.
Fig. 9 is a diagram showing a state where the user holds the terminal device 7.
Fig. 10 is a block diagram showing an internal configuration of the terminal device 7.
Fig. 11 is a diagram showing various data used in the game processing.
Fig. 12 is a main flowchart showing a flow of game processing executed by the game device 3.
Fig. 13 is a flowchart showing a detailed flow of the game control process.
Fig. 14 is a diagram showing a screen of the television 2 and the terminal device 7 in the first game example.
Fig. 15 is a diagram showing a screen of the television 2 and the terminal device 7 in the second game example.
Fig. 16 is a diagram showing an example of a television game image displayed on the television 2 in the third game example.
Fig. 17 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the third game example.
Fig. 18 is a diagram showing an example of a television game image displayed on the television 2 in the fourth game example.
Fig. 19 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example.
Fig. 20 is a diagram showing a usage of the game system 1 in the fifth game example.
Fig. 21 is a diagram showing a connection relationship of each device included in the game system 1 when connected to an external device via a network.
Description of the reference numerals
1: a game system; 2: a television set; 3: a game device; 4: an optical disc; 5: a controller; 6: a marker device; 7: a terminal device; 10: a CPU; 11e: an internal main memory; 12: an external main memory; 19: a controller communication module; 28: a terminal communication module; 35: an imaging information calculation unit; 37: an acceleration sensor; 44: a wireless module; 48: a gyro sensor; 51: an LCD; 52: a touch panel; 53: an analog stick; 54: an operation button; 55: a marker section; 56: a camera; 62: a magnetic sensor; 63: an acceleration sensor; 64: a gyro sensor; 66: a codec LSI (Large-Scale Integration); 67: a speaker; 69: a microphone; 70: a wireless module; 90: controller operation data; 97: terminal operation data; 98: camera image data; 99: microphone sound data.
Detailed Description
[1. overall Structure of Game System ]
A game system 1 according to an embodiment of the present invention will be described below with reference to the drawings. Fig. 1 is an external view of the game system 1. In fig. 1, a game system 1 includes a stationary display device (hereinafter referred to as a "television") 2 represented by a television receiver or the like, a stationary game device 3, an optical disk 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing in the game device 3 in accordance with game operations performed by the controller 5, and displays a game image obtained by the game processing on the television 2 and/or the terminal device 7.
An optical disk 4 is detachably inserted into the game device 3, and the optical disk 4 is an example of an information storage medium that is replaceably used for the game device 3. The optical disk 4 stores an information processing program (typically, a game program) to be executed by the game device 3. An insertion port for the optical disk 4 is provided on the front surface of the game device 3. The game device 3 reads and executes the information processing program stored in the optical disk 4 inserted into the insertion port to execute game processing.
The game device 3 is connected to the television 2 via a connection cable (cord). The television 2 displays a game image obtained by a game process executed by the game device 3. The television 2 has a speaker 2a (fig. 2), and the speaker 2a outputs game sound obtained as a result of the game processing. In other embodiments, the game device 3 may be integrated with a stationary display device. The communication between the game device 3 and the television 2 may be wireless communication.
A marker device 6 is provided around the screen of the television 2 (on the upper side of the screen in fig. 1). The user (player) can perform a game operation of moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the movement, position, posture, and the like of the controller 5; details thereof will be described later. The marker device 6 is provided with two markers 6R and 6L at both ends thereof. The marker 6R (and likewise the marker 6L) is specifically one or more infrared LEDs (Light Emitting Diodes), and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED provided in the marker device 6. Further, the marker device 6 is portable, and the user can place the marker device 6 at an arbitrary position. Fig. 1 shows a state in which the marker device 6 is disposed above the television 2, but the position and orientation in which the marker device 6 is disposed are arbitrary.
The controller 5 provides the game device 3 with operation data indicating the content of an operation performed on the controller 5. The controller 5 and the game device 3 can communicate by wireless communication. In the present embodiment, for example, Bluetooth (registered trademark) is used for wireless communication between the controller 5 and the game device 3. In another embodiment, the controller 5 and the game device 3 may be connected by a wire. Although the game system 1 includes one controller 5, the game device 3 is capable of communicating with a plurality of controllers, and a plurality of players can play a game by simultaneously using a predetermined number of controllers. The detailed configuration of the controller 5 will be described later.
The terminal device 7 has a size that can be held by a user, and the user can use the terminal device 7 by holding the terminal device 7 with his hand and moving it or by placing the terminal device 7 at an arbitrary position. The terminal device 7 includes an LCD (liquid crystal Display) 51 as a Display unit and an input unit (a touch panel 52, a gyro sensor 64, and the like described later), and the detailed configuration thereof will be described later. The terminal device 7 and the game device 3 can communicate by a wireless method (or a wired method). Terminal device 7 receives data of an image (for example, a game image) generated in game device 3 from game device 3, and displays the image on LCD 51. In the present embodiment, an LCD is used as the display device, but the terminal device 7 may have any other display device such as a display device using EL (Electro Luminescence), for example. Further, the terminal device 7 transmits operation data indicating the contents of the operation performed on the terminal device 7 to the game device 3.
[2. internal Structure of Game device 3 ]
Next, the internal configuration of the game device 3 will be described with reference to fig. 2. Fig. 2 is a block diagram showing an internal configuration of game device 3. The game device 3 includes a CPU (Central processing Unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disk drive 14, an AV-IC 15, and the like.
The CPU 10 executes game processing by executing a game program stored on the optical disk 4; the CPU 10 functions as a game processor. The CPU 10 is connected to the system LSI 11. The system LSI 11 is connected to the external main memory 12, the ROM/RTC 13, the disk drive 14, and the AV-IC 15, in addition to the CPU 10. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later. The volatile external main memory 12 stores programs such as a game program read from the optical disk 4 or from the flash memory 17, and various data, and is used as a work area and a buffer area of the CPU 10. The ROM/RTC 13 includes a ROM (a so-called boot ROM) in which a start-up program of the game device 3 is installed and a clock circuit (RTC: Real Time Clock) for keeping time. The disk drive 14 reads program data, texture data, and the like from the optical disk 4, and writes the read data to the internal main memory 11e or the external main memory 12, which will be described later.
The system LSI 11 is provided with an input/output processor (I/O processor) 11a, a GPU (Graphics Processing Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM (Video RAM) 11d, and an internal main memory 11e. Although not shown, these components 11a to 11e are connected to each other by an internal bus.
The GPU 11b forms a part of the drawing unit, and generates an image in accordance with a graphics command (drawing command) from the CPU 10. The VRAM 11d stores data (polygon data, texture data, and the like) necessary for the GPU 11b to execute the graphics command. When generating an image, the GPU 11b creates image data using the data stored in the VRAM 11d. In the present embodiment, the game device 3 generates both the game image displayed on the television 2 and the game image displayed on the terminal device 7. Hereinafter, the game image displayed on the television 2 may be referred to as a "television game image", and the game image displayed on the terminal device 7 may be referred to as a "terminal game image".
The DSP 11c functions as an audio processor, and generates audio data using audio data (sound data) and audio waveform (tone) data stored in the internal main memory 11e and the external main memory 12. In the present embodiment, as for the game sound, both the game sound output from the speaker of the television set 2 and the game sound output from the speaker of the terminal device 7 are generated similarly to the game image. Hereinafter, the game sound output from the television 2 may be referred to as "television game sound", and the game sound output from the terminal device 7 may be referred to as "terminal game sound".
Of the images and sounds generated in game device 3 as described above, data of the images and sounds to be output by television set 2 is read by AV-IC 15. The AV-IC 15 outputs the read image data to the television set 2 via the AV connector 16, and outputs the read sound data to the speaker 2a built in the television set 2. Thereby, an image is displayed on the television set 2, and sound is output from the speaker 2 a.
In addition, of the images and sounds generated in the game device 3, the data of the image and sound to be output by the terminal device 7 is transmitted to the terminal device 7 through the input/output processor 11a. The data transmission to the terminal device 7 by the input/output processor 11a and the like will be described later.
The input/output processor 11a transmits and receives data to and from the components connected to it, and downloads data from external devices. The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18. An antenna 23 is connected to the controller communication module 19. The codec LSI 27 is connected to a terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.
The game device 3 can be connected to a network such as the Internet to communicate with external information processing devices (for example, other game devices, various servers, and the like). That is, the input/output processor 11a is connected to a network such as the Internet via the network communication module 18 and the antenna 22, and can communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17 to detect whether there is data that needs to be transmitted to the network and, if such data exists, transmits it to the network through the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from external information processing devices and data downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. The CPU 10 reads the data stored in the flash memory 17 by executing the game program and uses the data in the game program. In addition to data exchanged between the game device 3 and the external information processing devices, the flash memory 17 may also store save data of games played using the game device 3 (result data or data saved partway through a game). The flash memory 17 may also store a game program.
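A minimal sketch of this periodic behavior of the input/output processor 11a is shown below: check the flash memory for outbound data, send it, and store whatever arrives from the network back into the flash memory. The classes and the in-memory "network" are illustrative stand-ins, not actual game device interfaces.

```python
# Illustrative stand-ins for the periodic network handling by the
# input/output processor 11a; these classes are not real game device APIs.
class FlashMemory:
    def __init__(self):
        self.outbox = []   # data waiting to be transmitted to the network
        self.stored = []   # data received from external information processing devices

class FakeNetwork:
    def __init__(self):
        self.sent = []
        self.incoming = [b"downloaded data"]
    def send(self, data):
        self.sent.append(data)
    def poll(self):
        received, self.incoming = self.incoming, []
        return received

def io_processor_network_tick(flash, network):
    # Detect whether there is data to be transmitted, and send it if so.
    while flash.outbox:
        network.send(flash.outbox.pop(0))
    # Store data received over the network back into the flash memory.
    for packet in network.poll():
        flash.stored.append(packet)

flash, net = FlashMemory(), FakeNetwork()
flash.outbox.append(b"score upload")
io_processor_network_tick(flash, net)   # in the device this runs periodically
print(net.sent, flash.stored)           # [b'score upload'] [b'downloaded data']
```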
In addition, the game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily stores) the operation data in the buffer area of the internal main memory 11e or the external main memory 12.
The game device 3 can transmit and receive data such as images and sounds to and from the terminal device 7. When a game image (terminal game image) is to be transmitted to the terminal device 7, the input/output processor 11a outputs the data of the game image generated by the GPU 11b to the codec LSI 27. The codec LSI 27 performs predetermined compression processing on the image data from the input/output processor 11a. The terminal communication module 28 performs wireless communication with the terminal device 7. Thus, the image data compressed by the codec LSI 27 is transmitted by the terminal communication module 28 to the terminal device 7 through the antenna 29. In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is data used in a game, and if a delay occurs in the image displayed in the game, the operability of the game is adversely affected. Therefore, it is preferable to avoid delay as much as possible in the transmission of image data from the game device 3 to the terminal device 7. Accordingly, in the present embodiment, the codec LSI 27 compresses the image data using, for example, the highly efficient compression technique of the H.264 standard. Other compression techniques may be used, and when the communication speed is sufficiently high, the image data may be transmitted without being compressed. The terminal communication module 28 is, for example, a Wi-Fi certified communication module, and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (Multiple Input Multiple Output) technology adopted in the IEEE 802.11n standard, or may use another communication method.
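The low-latency intent described above amounts to compressing each terminal game image and handing it to the wireless module immediately, frame by frame, rather than buffering several frames. The sketch below illustrates that idea only; zlib and a plain list stand in for the H.264-class codec LSI 27 and the IEEE 802.11n link, and the frame buffer size is a placeholder.

```python
# Illustrative low-latency send path: compress one frame and hand it to the
# wireless module immediately. zlib and a plain list stand in for the codec
# LSI 27 (H.264-class compression) and the IEEE 802.11n link; the frame
# buffer below is an arbitrary placeholder, not the terminal's resolution.
import time
import zlib

def send_terminal_game_image(frame_pixels: bytes, radio_queue: list) -> float:
    start = time.monotonic()
    compressed = zlib.compress(frame_pixels, 1)   # fast setting: favour speed over ratio
    radio_queue.append(compressed)                # hand off to the wireless module at once
    return time.monotonic() - start               # time spent before the frame is on its way

radio_queue = []
elapsed = send_terminal_game_image(b"\x00" * 100_000, radio_queue)
print(f"compress + enqueue took {elapsed * 1000:.2f} ms for one frame")
```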
In addition to transmitting the image data to the terminal device 7, the game device 3 also transmits sound data to the terminal device 7. That is, the input/output processor 11a outputs the sound data generated by the DSP 11c to the terminal communication module 28 through the codec LSI 27. The codec LSI 27 also performs compression processing on the sound data, as with the image data. Any compression method may be used for the sound data, but a method with a high compression ratio and little sound degradation is preferable. In other embodiments, the sound data may be transmitted without being compressed. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 through the antenna 29.
In addition to the image data and the sound data, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data is data indicating control instructions for the components provided in the terminal device 7, and indicates, for example, an instruction to control the lighting of the marker section (the marker section 55 shown in fig. 10) and an instruction to control image capture by the camera (the camera 56 shown in fig. 10). The input/output processor 11a transmits the control data to the terminal device 7 in accordance with instructions from the CPU 10. In the present embodiment, the codec LSI 27 does not perform data compression processing on the control data, but in other embodiments it may do so. The data transmitted from the game device 3 to the terminal device 7 may be encrypted or not, as necessary.
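The control data can be pictured as a small message carrying on/off instructions for the marker section and the camera, for example as below. The field names are illustrative assumptions, not the actual format used between the game device 3 and the terminal device 7.

```python
# Illustrative shape of the control data: instructions for the marker
# section 55 and the camera 56. Field names are assumptions only.
from dataclasses import dataclass

@dataclass
class TerminalControlData:
    marker_lit: bool = False        # instruction to control lighting of the marker section
    camera_capturing: bool = False  # instruction to control image capture by the camera

# e.g. light the markers and start capturing for a camera-based game
control = TerminalControlData(marker_lit=True, camera_capturing=True)
print(control)
```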
The game device 3 can receive various data from the terminal device 7. In the present embodiment, the terminal device 7 transmits operation data, image data, and sound data, which will be described in detail later. Each piece of data transmitted from the terminal device 7 is received by the terminal communication module 28 through the antenna 29. Here, the image data and the sound data from the terminal device 7 have been subjected to the same kind of compression processing as the image data and the sound data transmitted from the game device 3 to the terminal device 7. Therefore, the image data and the sound data are sent from the terminal communication module 28 to the codec LSI 27, subjected to decompression processing by the codec LSI 27, and output to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 is much smaller in data amount than the images and sounds, and therefore need not be compressed. It may also be encrypted or not, as necessary. Thus, after being received by the terminal communication module 28, the operation data is output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily stores) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.
The game device 3 can be connected to other devices and to external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI (Small Computer System Interface). By connecting a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector to the expansion connector 20, communication with the network can be performed in place of the network communication module 18. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium through the expansion connector 20 or the memory card connector 21 to store data in the external storage medium or read data from it.
The game device 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When power button 24 is turned on, power is supplied from an external power source to each component of game device 3 via an AC adapter not shown. When the reset button 25 is pressed, the system LSI 11 restarts the startup program of the game device 3. An eject button 26 is connected to the disc drive 14. When the eject button 26 is pressed, the optical disk 4 is ejected from the disk drive 14.
In another embodiment, some of the components included in the game device 3 may be configured as an expansion device separate from the game device 3. In this case, the expansion device may be connected to the game device 3 via the expansion connector 20. Specifically, the expansion device may include the codec LSI 27, the terminal communication module 28, and the antenna 29, and may be attachable to and detachable from the expansion connector 20. By connecting such an expansion device to a game device that does not include these components, that game device can be made capable of communicating with the terminal device 7.
[3. Structure of controller 5 ]
Next, the controller 5 will be described with reference to fig. 3 to 7. Fig. 3 is a perspective view showing an external configuration of the controller 5. Fig. 4 is a perspective view showing an external configuration of the controller 5. Fig. 3 is a perspective view of the controller 5 as viewed from the upper rear side of the controller 5, and fig. 4 is a perspective view of the controller 5 as viewed from the lower front side of the controller 5.
In fig. 3 and 4, the controller 5 has a housing 31 formed, for example, by plastic molding. The housing 31 has a substantially rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction shown in fig. 3), and as a whole is sized to be held with one hand by an adult or a child. The user can perform game operations by pressing the buttons provided on the controller 5 and by moving the controller 5 itself to change its position and posture (inclination).
The housing 31 is provided with a plurality of operation buttons. As shown in fig. 3, a cross button 32a, a No. 1 button 32b, a No. 2 button 32c, an A button 32d, a minus (-) button 32e, a home button 32f, a plus (+) button 32g, and a power button 32h are provided on the upper surface of the housing 31. In the present specification, the upper surface of the housing 31 on which these buttons 32a to 32h are provided may be referred to as the "button top". On the other hand, as shown in fig. 4, a recess is formed in the lower surface of the housing 31, and a B button 32i is provided on the rear-side inclined surface of the recess. Functions corresponding to the information processing program executed by the game device 3 are appropriately assigned to these operation buttons 32a to 32i. The power button 32h is used to remotely turn the power of the game device 3 main body on and off. The home button 32f and the power button 32h are disposed such that their upper surfaces are lower than the upper surface of the housing 31. This prevents the user from erroneously pressing the home button 32f or the power button 32h.
A connector 33 is provided on the rear surface of the housing 31. The connector 33 is used to connect other devices (e.g., other sensor units, controllers) to the controller 5. Further, locking holes 33a are provided on both sides of the connector 33 on the rear surface of the housing 31 to prevent the other devices from being easily detached.
A plurality of (four in fig. 3) LEDs 34a to 34d are provided on the rear portion of the upper surface of the housing 31. Here, the controller 5 is assigned a controller type (number) for distinguishing it from other controllers. The LEDs 34a to 34d are used to notify the user of the controller type currently set for the controller 5, or of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, one of the LEDs 34a to 34d is lit according to the controller type.
The controller 5 includes an imaging information calculation unit 35 (fig. 6), and as shown in fig. 4, a light incident surface 35a of the imaging information calculation unit 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that transmits at least infrared light from the markers 6R and 6L.
A sound outlet 31a for emitting sound from a speaker 47 (fig. 5) incorporated in the controller 5 to the outside is formed between the No. 1 button 32b and the home button 32f on the upper surface of the housing 31.
Next, the internal structure of the controller 5 will be described with reference to fig. 5 and 6. Fig. 5 and 6 are diagrams showing an internal structure of the controller 5. Fig. 5 is a perspective view showing a state where the upper housing (a part of the housing 31) of the controller 5 is removed. Fig. 6 is a perspective view showing a state where the lower housing (a part of the housing 31) of the controller 5 is removed. The perspective view shown in fig. 6 is a perspective view of the substrate 30 shown in fig. 5 as viewed from the back side.
In fig. 5, a substrate 30 is fixedly provided inside a housing 31, and operation buttons 32a to 32h, LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, a speaker 47, and the like are provided on an upper main surface of the substrate 30. These are connected to a microcomputer (Micro Computer) 42 (see fig. 6) via a wiring (not shown) formed on the substrate 30 or the like. In the present embodiment, the acceleration sensor 37 is disposed at a position offset from the center of the controller 5 in the X-axis direction. This makes it easy to calculate the movement of the controller 5 when the controller 5 is rotated about the Z axis. The acceleration sensor 37 is disposed forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). The controller 5 functions as a wireless controller using the wireless module 44 (fig. 7) and the antenna 45.
On the other hand, in fig. 6, an imaging information calculation unit 35 is provided at the edge of the front end on the lower main surface of the substrate 30. The imaging information calculation unit 35 includes an infrared filter 38, a lens 39, an imaging device 40, and an image processing circuit 41 in this order from the front of the controller 5. These components 38 to 41 are mounted on the lower main surface of the substrate 30.
The microcomputer 42 and a vibrator 46 are provided on the lower main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 through wiring formed on the substrate 30 and the like. The vibrator 46 operates in accordance with instructions from the microcomputer 42, thereby generating vibration in the controller 5. This enables a so-called vibration-assisted game in which the vibration is transmitted to the hand of the user holding the controller 5. In the present embodiment, the vibrator 46 is disposed somewhat toward the front in the housing 31. That is, since the vibrator 46 is disposed closer to an end than to the center of the controller 5, the vibration of the vibrator 46 can vibrate the entire controller 5 strongly. The connector 33 is mounted at the rear edge of the lower main surface of the substrate 30. In addition to the components shown in fig. 5 and 6, the controller 5 includes a crystal oscillator for generating a basic clock for the microcomputer 42, an amplifier for outputting an audio signal to the speaker 47, and the like.
The shape of the controller 5, the shapes of the operation buttons, and the numbers and installation positions of the acceleration sensors and vibrators shown in fig. 3 to 6 are merely examples, and other shapes, numbers, and installation positions may be used. In the present embodiment, the imaging direction of the imaging unit is the positive Z-axis direction, but the imaging direction may be any direction. That is, the imaging information calculation unit 35 (the light incident surface 35a of the imaging information calculation unit 35) need not be located on the front surface of the housing 31, and may be provided on another surface as long as it can take in light from outside the housing 31.
Fig. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation unit 32 (each of the operation buttons 32a to 32i), an imaging information calculation unit 35, a communication unit 36, an acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data indicating the content of an operation performed on the controller 5 to the game device 3 as operation data. In addition, hereinafter, the operation data transmitted by the controller 5 is sometimes referred to as "controller operation data", and the operation data transmitted by the terminal device 7 is sometimes referred to as "terminal operation data".
The operation unit 32 includes the operation buttons 32a to 32i, and outputs operation button data indicating the input state to the operation buttons 32a to 32i (whether or not the operation buttons 32a to 32i are pressed) to the microcomputer 42 of the communication unit 36.
The imaging information calculation unit 35 is a system for analyzing image data captured by the imaging unit, identifying a region having high brightness therein, and calculating the center-of-gravity position, the size, and the like of that region. The imaging information calculation unit 35 has a maximum sampling frequency of, for example, about 200 frames per second, and can therefore track and analyze the movement of the controller 5 even when the controller 5 moves relatively fast.
The imaging information calculation unit 35 includes the infrared filter 38, the lens 39, the imaging element 40, and the image processing circuit 41. The infrared filter 38 passes only the infrared component of the light incident from the front of the controller 5. The lens 39 condenses the infrared rays that have passed through the infrared filter 38 and makes them incident on the imaging element 40. The imaging element 40 is a solid-state imaging element such as a CMOS sensor or a CCD sensor, for example, and receives the infrared rays condensed by the lens 39 and outputs an image signal. Here, the marker section 55 of the terminal device 7 and the marker device 6, which are the imaging targets, are constituted by markers that output infrared light. Therefore, by providing the infrared filter 38, the imaging element 40 generates image data by receiving only the infrared rays that have passed through the infrared filter 38, and can thus capture the imaging targets (the marker section 55 and/or the marker device 6) more accurately. Hereinafter, an image captured by the imaging element 40 is referred to as a captured image. The image data generated by the imaging element 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates the position of the imaging target within the captured image and outputs coordinates indicating the calculated position to the microcomputer 42 of the communication unit 36. The data of these coordinates is transmitted to the game device 3 as operation data by the microcomputer 42. Hereinafter, the above coordinates are referred to as "marker coordinates". Since the marker coordinates change in accordance with the orientation (inclination angle) and position of the controller 5 itself, the game device 3 can calculate the orientation and position of the controller 5 using the marker coordinates.
In other embodiments, the controller 5 may not include the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game device 3. In this case, the game device 3 may have a circuit or a program having the same function as the image processing circuit 41 to calculate the marker coordinates.
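Whichever side performs it, the marker-coordinate computation described above reduces to finding the position (for example, the centroid) of a high-luminance region in the captured infrared image. The sketch below shows the idea for a single bright region; the threshold value, the image representation, and the single-region simplification are assumptions, and the actual circuit identifies the region for each marker and also computes its size.

```python
# Illustrative marker-coordinate computation: centroid of one high-luminance
# region in the captured image. Threshold and image format are assumptions;
# the actual unit identifies the region for each marker and its size as well.
def marker_coordinates(image, threshold=200):
    """image: 2-D list of luminance values; returns the (x, y) centroid of bright pixels."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, luminance in enumerate(row):
            if luminance >= threshold:   # pixel judged to belong to a marker (infrared LED)
                xs.append(x)
                ys.append(y)
    if not xs:
        return None                      # no marker visible in the captured image
    return (sum(xs) / len(xs), sum(ys) / len(ys))

captured = [[0,   0,   0, 0],
            [0, 250, 255, 0],
            [0, 240, 245, 0]]
print(marker_coordinates(captured))      # -> (1.5, 1.5)
```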
The acceleration sensor 37 detects the acceleration (including gravitational acceleration) of the controller 5, that is, the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects, of the acceleration applied to its detection portion, the value of the acceleration in the linear direction along each sensing axis (linear acceleration). For example, in the case of a multi-axis acceleration sensor having two or more axes, the component of acceleration along each axis is detected as the acceleration applied to the detection portion of the acceleration sensor. The acceleration sensor 37 is, for example, a capacitive MEMS (Micro Electro Mechanical Systems) acceleration sensor, but other types of acceleration sensors may be used.
In the present embodiment, the acceleration sensor 37 detects linear accelerations in three axial directions, i.e., the vertical direction (Y-axis direction shown in fig. 3), the horizontal direction (X-axis direction shown in fig. 3), and the front-rear direction (Z-axis direction shown in fig. 3) with reference to the controller 5. Since the acceleration sensor 37 detects acceleration in a linear direction along each axis, the output from the acceleration sensor 37 represents the value of the linear acceleration of each of the three axes. That is, the detected acceleration is expressed as a three-dimensional vector on an XYZ coordinate system (controller coordinate system) set with reference to the controller 5.
Data (acceleration data) indicating the acceleration detected by the acceleration sensor 37 is output to the communication unit 36. Further, since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (inclination angle) and movement of the controller 5 itself, the game device 3 can calculate the orientation and movement of the controller 5 using the acquired acceleration data. In the present embodiment, the game device 3 calculates the posture, the inclination angle, and the like of the controller 5 from the acquired acceleration data.
Further, a computer such as a processor of the game device 3 (for example, the CPU 10) or a processor of the controller 5 (for example, the microcomputer 42) may process the acceleration signal output from the acceleration sensor 37 (the same applies to the acceleration sensor 63 described later) to estimate or calculate (determine) further information about the controller 5, as those skilled in the art will readily understand from the description of this specification. For example, suppose the computer-side processing is executed on the assumption that the controller 5 on which the acceleration sensor 37 is mounted is stationary (that is, on the assumption that the acceleration detected by the acceleration sensor is gravitational acceleration only). If the controller 5 is actually stationary, it is then possible to determine from the detected acceleration whether, and by how much, the posture of the controller 5 is inclined with respect to the direction of gravity. Specifically, taking as a reference the state in which the detection axis of the acceleration sensor 37 is oriented in the vertical direction, whether or not the controller 5 is inclined with respect to the reference can be determined by whether or not 1 G (gravitational acceleration) is applied, and the degree of inclination can be determined from the magnitude of the detected acceleration. In the case of the multi-axis acceleration sensor 37, the degree of inclination of the controller 5 with respect to the direction of gravity can be determined in more detail by further processing the acceleration signals of the respective axes. In this case, the processor may calculate the inclination angle of the controller 5 from the output of the acceleration sensor 37, or may calculate the direction of inclination of the controller 5 without calculating the inclination angle. In this way, by using the acceleration sensor 37 in combination with the processor, the inclination angle or posture of the controller 5 can be determined.
On the other hand, assuming that the controller 5 is in a dynamic state (a state in which the controller 5 is moving), the acceleration sensor 37 detects acceleration corresponding to the movement of the controller 5 in addition to the gravitational acceleration, and therefore, the movement direction of the controller 5 can be known by removing a component of the gravitational acceleration from the detected acceleration by predetermined processing. Even when the controller 5 is in a dynamic state, the inclination of the controller 5 with respect to the direction of gravity can be known by removing a component of the acceleration corresponding to the movement of the acceleration sensor from the detected acceleration through a predetermined process. In other embodiments, the acceleration sensor 37 may include an embedded processing device or other type of dedicated processing device for performing predetermined processing on the acceleration signal detected by the built-in acceleration detection unit before the acceleration signal is output to the microcomputer 42. The embedded or dedicated processing means may also convert the acceleration signal into a tilt angle (or other preferred parameter), for example in case the acceleration sensor 37 is used to detect static acceleration (e.g. gravitational acceleration).
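The static case discussed above can be made concrete with a small sketch: if the controller is assumed to be at rest, the detected acceleration is essentially gravity, so its magnitude should be close to 1 G and its direction gives the inclination. The axis convention (the Y axis taken as "up" when the controller is held level) and the tolerance below are assumptions for illustration, not the specification's algorithm.

```python
# Illustrative static-case tilt estimate: if the controller is at rest, the
# detected acceleration is gravity, so its magnitude is about 1 G and its
# direction gives the inclination. The Y axis is taken as "up" when the
# controller is level; axis convention and tolerance are assumptions.
import math

G = 9.80665  # 1 G in m/s^2

def tilt_from_acceleration(ax, ay, az):
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    nearly_static = abs(magnitude - G) < 0.5                 # roughly 1 G -> treat as static
    # Angle between the detected acceleration and the controller's Y axis.
    tilt_deg = math.degrees(math.acos(max(-1.0, min(1.0, ay / magnitude))))
    return tilt_deg, nearly_static

print(tilt_from_acceleration(0.0, 9.8, 0.0))   # ~(0.0, True): held level
print(tilt_from_acceleration(6.9, 6.9, 0.0))   # ~(45.0, True): tilted about 45 degrees
```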
The gyro sensor 48 detects angular velocities about three axes (the XYZ axes in the present embodiment). In this specification, with reference to the imaging direction (the positive Z-axis direction) of the controller 5, the direction of rotation about the X axis is referred to as the pitch direction, the direction of rotation about the Y axis as the yaw direction, and the direction of rotation about the Z axis as the roll direction. The gyro sensor 48 may be any sensor capable of detecting angular velocities about three axes, and any number and combination of gyro sensors may be used. For example, the gyro sensor 48 may be a three-axis gyro sensor, or angular velocities about three axes may be detected by combining a two-axis gyro sensor and a single-axis gyro sensor. Data indicating the angular velocities detected by the gyro sensor 48 is output to the communication unit 36. The gyro sensor 48 may also be a sensor that detects angular velocity about one axis or two axes.
The communication unit 36 includes a microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. The microcomputer 42 uses the memory 43 as a storage area at the time of processing, and the microcomputer 42 controls the wireless module 44, and the wireless module 44 wirelessly transmits data acquired by the microcomputer 42 to the game device 3.
Data output from the operation unit 32, the imaging information calculation unit 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 is temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the timing for transmission to the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and radiates the resulting weak radio wave signal from the antenna 45. That is, the operation data is modulated into a weak radio wave signal by the wireless module 44 and transmitted from the controller 5. The weak radio wave signal is received by the controller communication module 19 on the game device 3 side, and the game device 3 can acquire the operation data by demodulating and decoding the received signal. The CPU 10 of the game device 3 then performs game processing using the operation data acquired from the controller 5. The wireless transmission from the communication unit 36 to the controller communication module 19 is performed sequentially at predetermined intervals; since game processing is generally performed in units of 1/60 second (one frame time), the transmission is preferably performed at a period equal to or shorter than this time. The communication unit 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 once every 1/200 second, for example.
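The timing relationship described above (game processing once per 1/60-second frame, operation data transmitted once every 1/200 second) can be pictured with the following sketch, which packs one operation-data record and sends it at a fixed period. The packet layout, field types, and callback names are assumptions for illustration only and do not describe the controller's actual firmware or radio protocol.

```python
import struct
import time

SEND_PERIOD = 1.0 / 200.0  # assumed transmission period (once every 1/200 s)


def pack_operation_data(buttons, accel, angular_velocity, marker_coords):
    """Pack one operation-data record into a fixed-size binary payload.

    Layout (button bitmask, 3-axis acceleration, 3-axis angular velocity,
    two marker coordinates as (x1, y1, x2, y2) integers) is an illustrative
    assumption, not the actual packet format.
    """
    return struct.pack(
        "<H3f3f4h",
        buttons,
        *accel,
        *angular_velocity,
        *marker_coords,
    )


def transmission_loop(read_sensors, send_packet):
    """Send the latest operation data at a fixed period.

    `read_sensors` and `send_packet` are hypothetical callbacks standing in
    for the sensor readout and the wireless module, respectively.
    """
    next_send = time.monotonic()
    while True:
        buttons, accel, gyro, markers = read_sensors()
        send_packet(pack_operation_data(buttons, accel, gyro, markers))
        next_send += SEND_PERIOD
        time.sleep(max(0.0, next_send - time.monotonic()))
```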
As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation button data as operation data indicating an operation on the controller 5. Further, the game device 3 executes game processing using the operation data as game input. Therefore, by using the controller 5, the user can perform not only a conventional general game operation of pressing each operation button but also a game operation of moving the controller 5 itself. For example, an operation of tilting the controller 5 in an arbitrary posture, an operation of instructing an arbitrary position on the screen by the controller 5, an operation of moving the controller 5 itself, and the like can be performed.
In the present embodiment, the controller 5 does not have a display unit for displaying a game image, but may have a display unit for displaying an image indicating the remaining battery level, for example.
[4. Structure of terminal device 7 ]
Next, the configuration of the terminal device 7 will be described with reference to fig. 8 to 10. Fig. 8 is a diagram showing an external configuration of the terminal device 7. Fig. 8 (a) is a front view of the terminal device 7, (b) is a top view, (c) is a right side view, and (d) is a bottom view. Fig. 9 is a diagram showing a state in which the user holds the terminal device 7.
As shown in fig. 8, the terminal device 7 includes a case 50 having a substantially horizontally long rectangular plate shape. The housing 50 is sized to be gripped by a user. Therefore, the user can hold and move the terminal device 7 or change the arrangement position of the terminal device 7.
The terminal device 7 has an LCD 51 on the front surface of the housing 50. The LCD 51 is disposed near the center of the front surface of the housing 50. Therefore, by holding the housing 50 on both sides of the LCD 51 as shown in fig. 9, the user can hold and move the terminal device while viewing the screen of the LCD 51. Fig. 9 shows an example in which the user holds the terminal device 7 horizontally (in a laterally long orientation) by gripping the housing 50 on the left and right sides of the LCD 51, but the user can also hold the terminal device 7 vertically (in a vertically long orientation).
As shown in fig. 8 (a), the terminal device 7 has a touch panel 52 as an operation unit on the screen of the LCD 51. In the present embodiment, the touch panel 52 is a resistive film type touch panel. However, the touch panel is not limited to the resistive type, and any type of touch panel, such as a capacitive type, can be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In the present embodiment, a touch panel having the same resolution (detection accuracy) as the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 and the resolution of the LCD 51 do not necessarily have to coincide. Input to the touch panel 52 is usually performed using a stylus, but the user may also perform input with a finger. The housing 50 may be provided with a receiving hole for storing the stylus used to operate the touch panel 52. Since the terminal device 7 includes the touch panel 52 in this manner, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can directly input to the screen of the LCD 51 (through the touch panel 52) while moving the screen.
As shown in fig. 8, the terminal device 7 includes two analog sticks 53A and 53B and a plurality of buttons 54A to 54L as operation means. Each of the analog sticks 53A and 53B is a device for indicating a direction, and is configured so that the stick portion operated by the user's finger can be slid or tilted in an arbitrary direction (at an arbitrary angle in the up, down, left, right, and oblique directions) with respect to the surface of the housing 50. The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51, so the user can input a direction with either the left or right hand. In addition, as shown in fig. 9, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right portions of the terminal device 7, so the user can easily operate them even while holding and moving the terminal device 7.
The buttons 54A to 54L are operation means for performing predetermined input. As will be described later, the buttons 54A to 54L are provided at positions that can be operated by the user while holding the left and right portions of the terminal device 7 (see fig. 9). Therefore, even when the user holds and moves the terminal device 7, the user can easily operate these operation means.
As shown in fig. 8 a, a cross button (direction input button) 54A and buttons 54B to 54H among the operation buttons 54A to 54L are provided on the front surface of the housing 50. That is, these buttons 54A to 54H are arranged at positions that can be operated by the thumb of the user (see fig. 9).
The cross button 54A is provided on the left side of the LCD 51, below the left analog stick 53A. That is, the cross button 54A is arranged at a position where the user can operate it with the left hand. The cross button 54A has a cross shape and can indicate the up, down, left, and right directions. The buttons 54B to 54D are provided below the LCD 51; these three buttons are disposed at positions where they can be operated with either hand. The four buttons 54E to 54H are provided on the right side of the LCD 51, below the right analog stick 53B. That is, the four buttons 54E to 54H are arranged at positions where the user can operate them with the right hand. Furthermore, the four buttons 54E to 54H are arranged above, below, to the left of, and to the right of their common center position. Therefore, the terminal device 7 can also use the four buttons 54E to 54H as buttons for allowing the user to indicate the up, down, left, and right directions.
In addition, as shown in fig. 8 (a), (b), and (c), the first L button 54I and the first R button 54J are provided at obliquely upper portions (upper left and upper right portions) of the housing 50. Specifically, the first L-shaped button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed from the upper and left side surfaces. The first R button 54J is provided at the right end of the upper side surface of the housing 50 and is exposed from the upper and right side surfaces. In this way, the first L button 54I is disposed at a position where the user can operate with the left index finger, and the first R button 54J is disposed at a position where the user can operate with the right index finger (refer to fig. 9).
As shown in fig. 8 (b) and (c), the second L button 54K and the second R button 54L are disposed on leg portions 59A and 59B that protrude from the rear surface of the plate-shaped housing 50 (i.e., the surface opposite to the front surface on which the LCD 51 is provided). Specifically, the second L button 54K is provided in a slightly upper portion of the left side (left side as viewed from the front) of the rear surface of the housing 50, and the second R button 54L is provided in a slightly upper portion of the right side (right side as viewed from the front) of the rear surface of the housing 50. In other words, the second L button 54K is provided at a position roughly opposite the left analog stick 53A provided on the front surface, and the second R button 54L at a position roughly opposite the right analog stick 53B provided on the front surface. Thus, the second L button 54K is disposed at a position where the user can operate it with the left middle finger, and the second R button 54L at a position where the user can operate it with the right middle finger (see fig. 9). As shown in fig. 8 (c), the second L button 54K and the second R button 54L are provided on the obliquely upward-facing surfaces of the leg portions 59A and 59B, and thus have obliquely upward-facing button surfaces. Since the middle fingers are considered to move in the up-down direction when the user holds the terminal device 7, the user can easily press the second L button 54K and the second R button 54L when the buttons face upward. Further, providing the leg portions on the rear surface of the housing 50 makes the housing easier for the user to grip, and providing the buttons on the leg portions makes them easy to operate while the housing is gripped.
Further, with the terminal device 7 shown in fig. 8, since the second L button 54K and the second R button 54L are provided on the rear surface, when the terminal device 7 is placed with the screen of the LCD51 (the front surface of the housing 50) facing upward, the screen may not be completely horizontal. Therefore, in another embodiment, three or more leg portions may be formed on the rear surface of the housing 50. Accordingly, in a state where the screen of the LCD51 is directed upward, the terminal device 7 can be placed on the placement surface by the leg portions contacting the placement surface, and therefore, the terminal device 7 can be placed with the screen horizontal. Further, the terminal device 7 may be horizontally placed by adding a leg portion that can be attached and detached.
The buttons 54A to 54L are assigned functions corresponding to the game program as appropriate. For example, the cross button 54A and the buttons 54E to 54H may be used for a direction instruction operation, a selection operation, and the like, and the buttons 54B to 54E may be used for a determination operation, a cancel operation, and the like.
Although not shown, the terminal device 7 has a power button for turning on/off the power of the terminal device 7. The terminal device 7 may have a button for turning on/off the screen display of the LCD51, a button for performing connection setting (pairing) with the game device 3, and a button for adjusting the volume of a speaker (speaker 67 shown in fig. 10).
As shown in fig. 8 (a), the terminal device 7 includes a marker section (the marker section 55 shown in fig. 10) consisting of a marker 55A and a marker 55B on the front surface of the housing 50. The marker section 55 is provided on the upper side of the LCD 51. Each of the markers 55A and 55B is composed of one or more infrared LEDs, like the markers 6R and 6L of the marker device 6. The marker section 55 is used when the game device 3 calculates the movement of the controller 5, in the same way as the marker device 6 described above. Further, the game device 3 can control the lighting of each infrared LED provided in the marker section 55.
The terminal device 7 includes a camera 56 as an imaging unit. The camera 56 includes an image pickup element (e.g., a CCD image sensor, a CMOS image sensor, etc.) having a prescribed resolution and a lens. As shown in fig. 8, in the present embodiment, the camera 56 is provided on the front surface of the housing 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 7, for example, the user who is playing a game while watching the LCD 51.
The terminal device 7 includes a microphone (a microphone 69 shown in fig. 10) as an audio input means. A microphone hole 60 is provided in the front surface of the housing 50. The microphone 69 is provided inside the casing 50 in the microphone hole 60. The microphone detects sound around the terminal device 7 such as the user's voice.
The terminal device 7 includes a speaker (speaker 67 shown in fig. 10) as an audio output means. As shown in fig. 8 (d), a speaker hole 57 is provided in the lower side surface of the housing 50. The output sound of the speaker 67 is output from the speaker hole 57. In the present embodiment, the terminal device 7 includes two speakers, and speaker holes 57 are provided at positions of the left speaker and the right speaker, respectively.
The terminal device 7 includes an extension connector 58 for connecting another device to the terminal device 7. In the present embodiment, as shown in fig. 8 (d), the expansion connector 58 is provided on the lower side surface of the housing 50. The other devices connected to the expansion connector 58 may be any devices, and may be input devices such as a controller (gun-type controller or the like) used in a specific game and a keyboard. The expansion connector 58 may not be provided if there is no need to connect other devices.
In the terminal device 7 shown in fig. 8, the shape of each operation button and the housing 50, the number of components, the installation position, and the like are merely examples, and other shapes, numbers, and installation positions may be used.
Next, the internal configuration of the terminal device 7 will be described with reference to fig. 10. Fig. 10 is a block diagram showing an internal configuration of the terminal device 7. As shown in fig. 10, the terminal device 7 includes, in addition to the configuration shown in fig. 8, a touch panel controller 61, a magnetic sensor 62, an acceleration sensor 63, a gyro sensor 64, a user interface controller (UI controller) 65, a codec LSI66, a speaker 67, a voice IC (Integrated Circuit) 68, a microphone 69, a wireless module 70, an antenna 71, an infrared communication module 72, a flash memory 73, a power IC 74, and a battery 75. These electronic components are mounted on an electronic circuit board and housed in a case 50.
The UI controller 65 is a circuit for controlling input/output of data to/from various input/output units. The UI controller 65 is connected to the touch panel controller 61, the analog sticks 53 (analog sticks 53A and 53B), the operation buttons 54 (operation buttons 54A to 54L), the marker section 55, the magnetic sensor 62, the acceleration sensor 63, and the gyro sensor 64. In addition, the UI controller 65 is connected to the codec LSI66 and the expansion connector 58. Further, a power supply IC 74 is connected to the UI controller 65, and power is supplied to each unit through the UI controller 65. A built-in battery 75 is connected to the power IC 74 to supply power. Further, a charger 76 or a cable that can receive power from an external power supply can be connected to the power supply IC 74 via a connector or the like, and the terminal device 7 can be supplied with power from the external power supply and charged with power using the charger 76 or the cable. The terminal device 7 may be mounted on a cradle (cradle) having a charging function (not shown) to charge the terminal device 7.
The touch panel controller 61 is a circuit connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 61 generates touch position data in a predetermined format based on a signal from the touch panel 52 and outputs the touch position data to the UI controller 65. The touch position data indicates the coordinates of the position input on the input surface of the touch panel 52. The touch panel controller 61 reads a signal from the touch panel 52 and generates touch position data at a rate of once every predetermined time period. In addition, various control instructions for the touch panel 52 are output from the UI controller 65 to the touch panel controller 61.
The analog stick 53 outputs stick data indicating the direction and amount of the stick section sliding (or tilting) operated by the user with the finger to the UI controller 65. The operation buttons 54 output operation button data indicating the input status (whether or not they are pressed) to the operation buttons 54A to 54L to the UI controller 65.
The magnetic sensor 62 detects the orientation by detecting the magnitude and direction of the magnetic field. Orientation data representing the detected orientation is output to the UI controller 65, and a control instruction for the magnetic sensor 62 is output from the UI controller 65 to the magnetic sensor 62. The magnetic sensor 62 may use an MI (magnetic impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistive) element, a TMR (tunnel magnetoresistive) element, an AMR (anisotropic magnetoresistive) element, or the like; any sensor may be used as long as it can detect an azimuth. Strictly speaking, the obtained azimuth data does not indicate the true azimuth in a place where a magnetic field other than the geomagnetic field is present, but even in such a case the azimuth data changes when the terminal device 7 moves, so a change in the posture of the terminal device 7 can still be calculated.
The acceleration sensor 63 is provided inside the housing 50, and detects the magnitude of linear acceleration in three-axis directions (xyz axes shown in fig. 8 a). Specifically, the acceleration sensor 63 detects the magnitude of linear acceleration in each axis with the longitudinal direction of the casing 50 as the x-axis, the short-side direction of the casing 50 as the y-axis, and the direction perpendicular to the front surface of the casing 50 as the z-axis. Acceleration data indicating the detected acceleration is output to the UI controller 65. Further, a control instruction for the acceleration sensor 63 is output from the UI controller 65 to the acceleration sensor 63. The acceleration sensor 63 is, for example, a capacitive MEMS acceleration sensor in the present embodiment, but other types of acceleration sensors may be used in other embodiments. The acceleration sensor 63 may be an acceleration sensor that detects a uniaxial direction or a biaxial direction.
The gyro sensor 64 is provided inside the housing 50, and detects angular velocities about the three axes of the x-axis, the y-axis, and the z-axis. Angular velocity data indicating the detected angular velocity is output to the UI controller 65. Further, a control instruction for the gyro sensor 64 is output from the UI controller 65 to the gyro sensor 64. The number and combination of gyro sensors for detecting the angular velocities of the three axes may be arbitrary, and the gyro sensor 64 may be configured by a two-axis gyro sensor and a one-axis gyro sensor, as in the case of the gyro sensor 48. The gyro sensor 64 may be a gyro sensor that detects a uniaxial direction or a biaxial direction.
The UI controller 65 outputs operation data including the touch position data, the joystick data, the operation button data, the azimuth data, the acceleration data, and the angular velocity data received from the above-described components to the codec LSI 66. When another device is connected to the terminal device 7 via the expansion connector 58, the operation data may include data indicating an operation performed on the other device.
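The record assembled by the UI controller 65 can be pictured as one per-cycle aggregation of the individual inputs, as in the following sketch. The field names and types are assumptions for illustration; they are not the actual data format of the terminal device 7.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TerminalOperationData:
    """One per-cycle record of terminal inputs, as aggregated by a UI controller.

    Field names and types are illustrative assumptions only.
    """
    button_states: int                        # bitmask of operation buttons
    stick_left: Tuple[float, float]           # direction/amount of left analog stick
    stick_right: Tuple[float, float]          # direction/amount of right analog stick
    touch_positions: List[Tuple[int, int]]    # touch panel coordinates (may be empty)
    acceleration: Tuple[float, float, float]  # from the acceleration sensor
    angular_velocity: Tuple[float, float, float]  # from the gyro sensor
    azimuth: Tuple[float, float, float]       # magnetic field direction


def aggregate(buttons, sticks, touches, accel, gyro, azimuth) -> TerminalOperationData:
    """Combine the latest readings from each input unit into one record."""
    return TerminalOperationData(
        button_states=buttons,
        stick_left=sticks[0],
        stick_right=sticks[1],
        touch_positions=list(touches),
        acceleration=accel,
        angular_velocity=gyro,
        azimuth=azimuth,
    )
```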
The codec LSI66 is a circuit that performs compression processing on data transmitted to the game device 3 and decompression processing on data transmitted from the game device 3. The codec LSI66 is connected to the LCD51, the camera 56, the audio IC 68, the wireless module 70, the flash memory 73, and the infrared communication module 72. In addition, the codec LSI66 includes a CPU 77 and an internal memory 78. The terminal device 7 is not configured to perform the game process itself, but is required to execute a minimum program for management and communication of the terminal device 7. When the power is turned on, the program stored in the flash memory 73 is read into the internal memory 78, and the CPU 77 executes the program, thereby activating the terminal device 7. In addition, a partial area of the internal memory 78 is used as a VRAM for the LCD 51.
The camera 56 captures an image in accordance with an instruction from the game device 3, and outputs captured image data to the codec LSI 66. Further, a control instruction for the camera 56, such as an image capturing instruction of an image, is output from the codec LSI66 to the camera 56. The camera 56 can also take a moving image. That is, the camera 56 can repeatedly take images and repeatedly output image data to the codec LSI 66.
The audio IC 68 is a circuit connected to the speaker 67 and the microphone 69, and controls input and output of audio data to and from the speaker 67 and the microphone 69. That is, when audio data is received from the codec LSI 66, the audio IC 68 outputs an audio signal obtained by D/A conversion of the audio data to the speaker 67, causing sound to be output from the speaker 67. The microphone 69 detects sound transmitted to the terminal device 7 (such as the user's voice) and outputs an audio signal representing the sound to the audio IC 68. The audio IC 68 performs A/D conversion on the audio signal from the microphone 69 and outputs audio data in a predetermined format to the codec LSI 66.
The codec LSI 66 transmits the image data from the camera 56, the sound data from the microphone 69, and the operation data from the UI controller 65 (as terminal operation data) to the game device 3 through the wireless module 70. In the present embodiment, the codec LSI 66 performs the same compression processing as the codec LSI 27 on the image data and the sound data. The terminal operation data and the compressed image data and sound data are output to the wireless module 70 as transmission data. The wireless module 70 is connected to an antenna 71 and transmits the transmission data to the game device 3 via the antenna 71. The wireless module 70 has the same function as the terminal communication module 28 of the game device 3; that is, it has a function of connecting to a wireless LAN by a method conforming to, for example, the IEEE 802.11n standard. The transmission data may be encrypted as necessary, or may be transmitted without encryption.
As described above, the transmission data transmitted from the terminal device 7 to the game device 3 includes the operation data (terminal operation data), the image data, and the audio data. When another device is connected to the terminal device 7 via the extension connector 58, the transmission data may include data received from the other device. In addition, the Infrared communication module 72 performs Infrared communication with other devices in compliance with, for example, the IRDA (Infrared Data Association) standard. The codec LSI66 may include data received by infrared communication in the transmission data and transmit the data to the game device 3 as necessary.
As described above, the compressed image data and audio data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI66 via the antenna 71 and the wireless module 70. The codec LSI66 decompresses the received image data and sound data. The decompressed image data is output to the LCD51, whereby an image is displayed on the LCD 51. The decompressed sound data is output to the voice IC 68, and the voice IC 68 outputs sound from the speaker 67.
When the data received from the game device 3 includes control data, the codec LSI 66 and the UI controller 65 issue control instructions to the respective units in accordance with the control data. As described above, the control data is data indicating control instructions for the components included in the terminal device 7 (in the present embodiment, the camera 56, the touch panel controller 61, the marker section 55, the sensors 62 to 64, and the infrared communication module 72). In the present embodiment, the control instructions indicated by the control data are instructions to operate the above components or to suspend (stop) their operation. That is, components that are not used in the game may be suspended in order to suppress power consumption; in that case, the transmission data transmitted from the terminal device 7 to the game device 3 does not include data from the suspended components. Since the marker section 55 consists of infrared LEDs, controlling it simply amounts to starting or stopping the supply of power.
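A rough sketch of the suspend/operate handling described above is shown below: a control instruction enables or disables individual components, and data from suspended components is then excluded from the transmission data. The component names and the dictionary-based control-data format are assumptions for illustration only.

```python
# Hypothetical component identifiers; the real control-data format is not specified here.
COMPONENTS = ("camera", "touch_panel", "marker_section", "sensors", "infrared")


def apply_control_data(control_data, active):
    """Update the set of active components from a control instruction.

    `control_data` is assumed to map component name -> True (operate) / False (suspend).
    `active` is the set of currently operating component names.
    """
    for name, enable in control_data.items():
        if name in COMPONENTS:
            if enable:
                active.add(name)
            else:
                active.discard(name)
    return active


def build_transmission_data(readings, active):
    """Include only data from components that are currently operating."""
    return {name: data for name, data in readings.items() if name in active}
```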
As described above, the terminal device 7 includes the operation means such as the touch panel 52, the analog stick 53, and the operation buttons 54, but in other embodiments, another operation means may be provided instead of or in addition to these operation means.
The terminal device 7 includes the magnetic sensor 62, the acceleration sensor 63, and the gyro sensor 64 as sensors for calculating the movement (including the position, the orientation, or the change in the position or the orientation) of the terminal device 7, but may be configured to include only one or two of these sensors in another embodiment. In other embodiments, other sensors may be provided instead of or in addition to these sensors.
Although the terminal device 7 includes the camera 56 and the microphone 69, in other embodiments, the camera 56 and the microphone 69 may not be provided, and only one of the camera 56 and the microphone 69 may be provided.
The terminal device 7 is configured to include the marker 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (the position, orientation, and the like of the terminal device 7 as viewed from the controller 5), but may be configured not to include the marker 55 in other embodiments. In other embodiments, the terminal device 7 may include other means for calculating the positional relationship. For example, in another embodiment, the controller 5 may include a marker portion and the terminal device 7 may include an imaging element. In this case, the marker 6 may be configured to include an image pickup device instead of the infrared LED.
[5. Game processing ]
Next, the game processing executed in the present game system will be described in detail. First, the various data used in the game processing will be described. Fig. 11 is a diagram showing the main data used in the game processing, namely the data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in fig. 11, the game program 90, reception data 91, and processing data 106 are stored in the main memory of the game device 3. In addition to the data shown in fig. 11, the main memory stores data necessary for the game, such as image data of various objects appearing in the game and audio data used in the game.
A part or all of the game program 90 is read from the optical disk 4 and stored in the main memory at an appropriate timing after the game device 3 is powered on. Instead of the optical disk 4, the game program 90 may be acquired from the flash memory 17 or from a device external to the game device 3 (for example, via the internet). In addition, a part of the program included in the game program 90 (for example, a program for calculating the posture of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance.
The reception data 91 is the various data received from the controller 5 and the terminal device 7. The reception data 91 contains controller operation data 92, terminal operation data 97, camera image data 104, and microphone sound data 105. When a plurality of controllers 5 are connected, a plurality of pieces of controller operation data 92 are stored. Likewise, when a plurality of terminal devices 7 are connected, a plurality of pieces of terminal operation data 97, camera image data 104, and microphone sound data 105 are stored.
The controller operation data 92 is data representing an operation of the controller 5 by a user (player). The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes first operation button data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. Further, a predetermined number of controller operation data may be stored in the main memory in order from the latest (last acquired) data.
The first operation button data 93 is data indicating the input state to each of the operation buttons 32a to 32i provided on the controller 5. Specifically, the first operation button data 93 indicates whether or not each of the operation buttons 32a to 32i is pressed.
The first acceleration data 94 is data indicating the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represents three-dimensional acceleration having acceleration in the XYZ triaxial directions shown in fig. 3 as each component, but in other embodiments, it may represent acceleration in one or more arbitrary directions.
The first angular velocity data 95 is data indicating the angular velocity detected by the gyro sensor 48 in the controller 5. Here, the first angular velocity data 95 represents each angular velocity about the XYZ triaxial directions shown in fig. 3, but in other embodiments, it may represent an angular velocity about one or more arbitrary axes.
The marker coordinate data 96 is data indicating the coordinates calculated by the image processing circuit 41 of the imaging information calculation unit 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for indicating a position on a plane corresponding to the captured image, and the marker coordinate data 96 indicates coordinate values in the two-dimensional coordinate system.
The controller operation data 92 may be data representing an operation by a user operating the controller 5, and may include only a part of the data 93 to 96. In addition, in the case where the controller 5 has another input unit (for example, a touch panel, an analog stick, or the like), the controller operation data 92 may include data indicating an operation on the other input unit. Further, in the case where the movement of the controller 5 itself is used as the game operation as in the present embodiment, the controller operation data 92 is made to include data in which the value of the first acceleration data 94, the first angular velocity data 95, or the marker coordinate data 96 changes in accordance with the movement of the controller 5 itself.
The terminal operation data 97 is data indicating an operation of the terminal device 7 by the user. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes second operation button data 98, stick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and azimuth data 103. A predetermined number of pieces of terminal operation data may be stored in the main memory in order from the latest (most recently acquired) data.
The second operation button data 98 is data indicating the input state to each of the operation buttons 54A to 54L provided on the terminal device 7. Specifically, the second operation button data 98 indicates whether or not each of the operation buttons 54A to 54L is pressed.
The stick data 99 is data indicating the direction and amount in which the stick part of the analog stick 53 (analog sticks 53A and 53B) slides (or tilts). The directions and amounts may also be expressed as two-dimensional coordinates or two-dimensional vectors, for example.
The touch position data 100 is data indicating a position (touch position) of an input on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 represents coordinate values on a two-dimensional coordinate system showing the position on the input surface. In addition, when the touch panel 52 is of the multi-touch type, the touch position data 100 may indicate a plurality of touch positions.
The second acceleration data 101 is data indicating an acceleration (acceleration vector) detected by the acceleration sensor 63. In the present embodiment, the second acceleration data 101 represents three-dimensional acceleration having acceleration in the xyz three-axis direction shown in fig. 8 as each component, but in other embodiments, acceleration in one or more arbitrary directions may be represented.
The second angular velocity data 102 is data indicating the angular velocity detected by the gyro sensor 64. In the present embodiment, the second angular velocity data 102 represents each angular velocity about the xyz three axes shown in fig. 8, but in other embodiments, it may represent an angular velocity about any axis of one or more axes.
The azimuth data 103 is data indicating the azimuth detected by the magnetic sensor 62. In the present embodiment, the azimuth data 103 represents the direction of a predetermined azimuth (for example, north) with reference to the terminal device 7. However, in a place where a magnetic field other than the geomagnetic field is generated, the azimuth data 103 does not strictly indicate an absolute azimuth (north or the like), but indicates a relative direction of the terminal device 7 with respect to the magnetic field direction of the place, and therefore, in this case as well, the posture change of the terminal device 7 can be calculated.
The terminal operation data 97 may be data indicating an operation by a user operating the terminal device 7, and may include only one of the data 98 to 103. In the case where the terminal device 7 has another input means (for example, a touch panel, an imaging means of the controller 5, or the like), the terminal operation data 97 may include data indicating an operation on the other input means. In the case where the movement of the terminal device 7 itself is used as the game operation as in the present embodiment, the terminal operation data 97 includes data in which the value of the second acceleration data 101, the second angular velocity data 102, or the direction data 103 changes according to the movement of the terminal device 7 itself.
The camera image data 104 is data indicating an image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained by decompressing compressed image data from the terminal device 7 by the codec LSI27, and is stored in the main memory by the input/output processor 11 a. Further, a predetermined number of camera image data may be stored in the main memory in order from the latest (last acquired) data.
The microphone sound data 105 is data indicating a sound (microphone sound) detected by the microphone 69 of the terminal device 7. The microphone sound data 105 is sound data obtained by decompressing the compressed sound data transmitted from the terminal device 7 by the codec LSI27, and is stored in the main memory by the input/output processor 11 a.
The processing data 106 is data used in a game process (fig. 12) described later. The processing data 106 includes control data 107, controller gesture data 108, terminal gesture data 109, image recognition data 110, and voice recognition data 111. In addition to the data shown in fig. 11, the processing data 106 includes various data used in game processing, such as data indicating various parameters set for various objects appearing in a game.
The control data 107 is data indicating a control instruction for a component provided in the terminal device 7. The control data 107 indicates, for example, an instruction to control the lighting of the marker section 55, an instruction to control the imaging of the camera 56, and the like. The control data 107 is transmitted to the terminal device 7 at an appropriate timing.
The controller posture data 108 is data representing the posture of the controller 5. In the present embodiment, the controller attitude data 108 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92. The calculation method of the controller attitude data 108 is described later in step S23.
The terminal posture data 109 is data indicating the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 is calculated from the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 included in the terminal operation data 97. The calculation method of the terminal posture data 109 will be described later in step S24.
The image recognition data 110 is data indicating a result of predetermined image recognition processing performed on the camera image. The image recognition processing may be any processing as long as it detects some feature from the camera image and outputs the result, and may be, for example, processing of extracting a predetermined object (for example, the face of the user, a marker, or the like) from the camera image and calculating information on the extracted object.
The voice recognition data 111 is data indicating the result of predetermined voice recognition processing performed on the microphone voice. The voice recognition processing may be any processing as long as it is processing to detect some feature from the microphone voice and output the result, and may be processing to detect the language of the user or processing to output only the volume, for example.
Next, the game processing performed in the game device 3 will be described in detail with reference to fig. 12. Fig. 12 is a main flowchart showing the flow of the game processing executed by the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 initializes each unit, such as the main memory, by executing a boot program stored in a boot ROM (not shown). Then, the game program stored in the optical disk 4 is read into the main memory, and the CPU 10 starts executing the game program. In the game device 3, the game program stored in the optical disk 4 may be executed immediately after the power is turned on, or a built-in program that displays a predetermined menu screen may be executed first after the power is turned on, with the game program stored in the optical disk 4 then executed when the user instructs the start of the game. The flowchart shown in fig. 12 shows the processing performed after the above processing is completed.
Note that the processing of each step in the flowchart shown in fig. 12 is merely an example, and the order of the processing of each step may be changed as long as the same result can be obtained. The values of the variables and the threshold used in the determination step are only examples, and other values may be used as necessary. In the present embodiment, the CPU 10 executes the processing of each step in the flowchart, but a processor or a dedicated circuit other than the CPU 10 may execute the processing of a part of the steps.
First, in step S1, the CPU 10 executes initial processing. The initial processing is, for example, the following processing: a virtual game space is constructed and objects appearing in the game space are arranged at initial positions, or initial values of various parameters used in game processing are set.
In the present embodiment, in the initial processing, the CPU 10 controls the lighting of the marker device 6 and the marker section 55 according to the type of the game program. Here, the game system 1 includes both the marker device 6 and the marker section 55 of the terminal device 7 as imaging targets of the imaging means (the imaging information calculation unit 35) of the controller 5. Either one or both of the marker device 6 and the marker section 55 are used depending on the game content (the type of the game program). The game program 90 includes data indicating whether each of the marker device 6 and the marker section 55 is to be lit. The CPU 10 reads out this data and determines whether or not to light each of them. Then, when the marker device 6 and/or the marker section 55 is to be lit, the following processing is executed.
That is, when the marker device 6 is to be turned on, the CPU 10 transmits to the marker device 6 a control signal instructing that each infrared LED provided in the marker device 6 be lit. The transmission of the control signal may simply consist of supplying power. In response, the infrared LEDs of the marker device 6 are lit. On the other hand, when the marker section 55 is to be turned on, the CPU 10 generates control data indicating an instruction to light the marker section 55 and stores the control data in the main memory. The generated control data is transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module 70 of the terminal device 7 is passed to the UI controller 65 through the codec LSI 66, and the UI controller 65 instructs the marker section 55 to light up. Thereby, the infrared LEDs of the marker section 55 are turned on. Although the case of lighting the marker device 6 and the marker section 55 has been described above, they can be turned off by the same processing as for lighting.
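The two lighting paths described above can be summarized in the following sketch, which either signals the marker device directly or queues control data destined for the terminal device. The callback names and the content of the instructions are hypothetical stand-ins.

```python
def control_marker_lighting(game_uses_marker_device, game_uses_marker_section,
                            send_to_marker_device, queue_control_data):
    """Light the marker device and/or the terminal's marker section at game start.

    `send_to_marker_device` and `queue_control_data` are hypothetical callbacks:
    the first sends a control signal (or simply power) to the marker device,
    the second stores control data to be transmitted later to the terminal device.
    """
    if game_uses_marker_device:
        send_to_marker_device({"infrared_leds": "on"})
    if game_uses_marker_section:
        queue_control_data({"marker_section": "on"})
    # Turning either marker off would follow the same two paths with "off".
```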
The process of step S2 is performed after the above step S1. Thereafter, the processing loop formed by the series of processing of steps S2 to S11 is repeatedly executed at a rate of once every predetermined time (one frame time).
In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, the controller communication module 19 in the game device 3 sequentially receives the controller operation data, and the input/output processor 11a sequentially stores the received data in the main memory. The transmission/reception interval is preferably shorter than the game processing time, for example 1/200 second. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. The process of step S3 is performed after step S2.
In step S3, the CPU 10 acquires various data transmitted from the terminal device 7. The terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, and thus the game device 3 sequentially receives these data. In the game device 3, the terminal communication module 28 sequentially receives these data, and the codec LSI27 sequentially performs decompression processing on the camera image data and the microphone sound data. Then, the input-output processor 11a stores the terminal operation data, the camera image data, and the microphone sound data in the main memory in order. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. The process of step S4 is performed after step S3.
In step S4, the CPU 10 executes game control processing. The game control process is a process of advancing a game by executing a process of operating an object in a game space in accordance with a game operation performed by a user. In the present embodiment, the user can play various games using the controller 5 and/or the terminal device 7. Next, the game control process will be described with reference to fig. 13.
Fig. 13 is a flowchart showing a detailed flow of the game control process. The series of processes shown in fig. 13 are various processes that can be executed when the controller 5 and the terminal device 7 are used as operation devices, but the entire processes need not be executed, and only a part of the processes may be executed depending on the type and content of the game.
In the game control process, first, in step S21, the CPU 10 determines whether or not to change the marker to be used. As described above, in the present embodiment, when the game processing is started (step S1), the process of controlling the lighting of the marker device 6 and the marker section 55 is executed. Here, depending on the game, the object to be used (lit) among the marker device 6 and the marker section 55 may be changed during the game. It is also conceivable to use both the marker device 6 and the marker section 55 depending on the game, but if both are lit, one marker may be erroneously detected as the other. Therefore, it is also preferable to switch the lighting during the game so that only one of them is lit. The process of step S21 is a process of determining, in consideration of the above, whether or not the object to be lit is to be changed during the game.
The determination at step S21 can be made, for example, by the following methods. That is, the CPU 10 can make the determination according to whether or not the game situation (the stage of the game, the operation target, or the like) has changed. This is because, when the game situation changes, the operation method is likely to switch between an operation method in which the controller 5 is pointed toward the marker device 6 and an operation method in which the controller 5 is pointed toward the marker section 55. The CPU 10 can also make the determination according to the posture of the controller 5, that is, according to whether the controller 5 is directed toward the marker device 6 or toward the marker section 55. The posture of the controller 5 can be calculated from the detection results of the acceleration sensor 37 and the gyro sensor 48, for example (see step S23 described later). The CPU 10 may also make the determination based on whether or not there is a change instruction from the user.
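One way to picture the posture-based determination mentioned above is to compare the controller's pointing direction against the assumed directions of the two markers and select whichever it points toward more closely, as in the sketch below. The direction vectors and the selection rule are assumptions for illustration, not the determination actually performed in step S21.

```python
def select_marker(pointing_dir, marker_device_dir, marker_section_dir):
    """Choose which marker the controller is pointed toward.

    All arguments are unit-length 3D direction vectors; the two marker
    directions are assumed estimates of where each marker lies as seen from
    the controller, derived from the calculated posture.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    if dot(pointing_dir, marker_device_dir) >= dot(pointing_dir, marker_section_dir):
        return "marker_device"
    return "marker_section"
```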
In the case where the result of the determination in the above-described step S21 is affirmative, the process in step S22 is executed. On the other hand, if the determination result at step S21 is negative, the process at step S22 is skipped and the process at step S23 is executed.
In step S22, the CPU 10 controls the lighting of the marker device 6 and the marker section 55. That is, the lighting state of the marker device 6 and/or the marker portion 55 is changed. Further, as in the case of step S1 described above, the specific processing of turning on or off the marker device 6 and/or the marker portion 55 can be performed. The process of step S23 is performed after step S22.
As described above, according to the present embodiment, the light emission (lighting) of the marker device 6 and the marker unit 55 can be controlled according to the type of the game program by the processing of step S1, and the light emission (lighting) of the marker device 6 and the marker unit 55 can be controlled according to the game situation by the processing of steps S21 and S22.
In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. Next, a method of calculating the posture of the controller 5 will be described.
First, the CPU 10 calculates the attitude of the controller 5 from the first angular velocity data 95 stored in the main memory. The method of calculating the attitude of the controller 5 from the angular velocity may be any method, and the attitude is calculated using the previous attitude (the previously calculated attitude) and the current angular velocity (the angular velocity acquired in step S2 in the current processing cycle). Specifically, the CPU 10 calculates the attitude by rotating the previous attitude at the current angular velocity for a unit time. The previous posture is represented by the controller posture data 108 stored in the main memory, and the current angular velocity is represented by the first angular velocity data 95 stored in the main memory. Thus, the CPU 10 reads the controller attitude data 108 and the first angular velocity data 95 from the main memory to calculate the attitude of the controller 5. Data indicating the "posture based on the angular velocity" calculated as described above is stored in the main memory.
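The step of rotating the previous attitude by the current angular velocity over a unit time can be sketched as follows, with the attitude represented as a 3x3 rotation matrix and the angular velocity given in radians per second about the controller's axes. This is one common way of integrating a gyro reading and is given only for illustration; the representation and axis conventions are assumptions, not necessarily the exact computation performed by the CPU 10.

```python
import numpy as np


def integrate_angular_velocity(prev_attitude, angular_velocity, dt):
    """Rotate the previous attitude by the current angular velocity over dt seconds.

    prev_attitude: 3x3 rotation matrix giving the previously calculated posture.
    angular_velocity: (wx, wy, wz) in rad/s about the device's own axes.
    Returns the new 3x3 rotation matrix.
    """
    w = np.asarray(angular_velocity, dtype=float)
    angle = np.linalg.norm(w) * dt
    if angle < 1e-9:                       # negligible rotation this cycle
        return prev_attitude
    axis = w / np.linalg.norm(w)
    x, y, z = axis
    k = np.array([[0.0, -z, y],
                  [z, 0.0, -x],
                  [-y, x, 0.0]])
    # Rodrigues' formula for the incremental rotation about `axis` by `angle`.
    delta = np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)
    # The angular velocity is measured in the device frame, so apply on the right.
    return prev_attitude @ delta
```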
Further, in the case of calculating the posture from the angular velocity, it is preferable to determine the initial posture in advance. That is, in the case of calculating the posture of the controller 5 from the angular velocity, the CPU 10 initially calculates the initial posture of the controller 5 in advance. The initial posture of the controller 5 may be calculated from the acceleration data, or the specific posture at the time of the specific operation may be used as the initial posture by causing the player to perform the specific operation in a state where the controller 5 is brought into the specific posture. It is preferable to calculate the initial attitude when the attitude of the controller 5 is calculated as an absolute attitude with reference to a predetermined direction in space, but the initial attitude may not be calculated when the attitude of the controller 5 is calculated as a relative attitude with reference to the attitude of the controller 5 at the game start time, for example.
Next, the CPU 10 corrects the attitude of the controller 5 calculated from the angular velocity, using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 based on the first acceleration data 94. Here, when the controller 5 is almost stationary, the acceleration applied to the controller 5 is the gravitational acceleration. Therefore, in this state, the direction of the gravitational acceleration (the direction of gravity) can be calculated using the first acceleration data 94 output by the acceleration sensor 37, and hence the orientation (posture) of the controller 5 with respect to the direction of gravity can be calculated from the first acceleration data 94. Data indicating the "posture based on acceleration" calculated in this way is stored in the main memory.
When the acceleration-based posture is calculated, the CPU 10 then corrects the angular velocity-based posture with the acceleration-based posture. Specifically, the CPU 10 reads out data indicating the posture based on the angular velocity and data indicating the posture based on the acceleration from the main memory, and performs correction to make the posture based on the angular velocity data approach the posture based on the acceleration data at a predetermined ratio. The predetermined ratio may be a predetermined fixed value or may be set based on the acceleration indicated by the first acceleration data 94. In addition, since the attitude based on the acceleration cannot be calculated with respect to the rotation direction about the gravity direction as an axis, the CPU 10 may not correct the rotation direction. In the present embodiment, the data indicating the corrected posture obtained as described above is stored in the main memory.
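The correction described above behaves like a complementary filter: the gravity direction predicted by the angular-velocity-based posture is moved toward the gravity direction indicated by the acceleration data by a predetermined ratio. The following sketch shows one such blend under assumed conventions (rotation-matrix attitude, detected acceleration treated as gravity, a fixed blend ratio); it is an illustration, not necessarily the exact computation performed by the CPU 10.

```python
import numpy as np


def _rotation_matrix(axis, angle):
    """Rodrigues' formula: rotation by `angle` radians about unit vector `axis`."""
    x, y, z = axis
    k = np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])
    return np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)


def correct_with_acceleration(attitude, accel, ratio=0.05):
    """Bring the angular-velocity-based attitude closer to the acceleration-based one.

    attitude: 3x3 rotation matrix mapping device coordinates to world coordinates.
    accel: detected acceleration in device coordinates; as in the description above,
           it is treated as the gravitational acceleration (device nearly stationary).
    ratio: fraction by which the attitude approaches the acceleration-based posture.
    """
    down_world = np.array([0.0, 0.0, -1.0])            # assumed world "down" direction
    g_meas = np.asarray(accel, dtype=float)
    g_meas = g_meas / np.linalg.norm(g_meas)           # measured "down" in device frame
    g_pred = attitude.T @ down_world                    # "down" predicted by the attitude

    axis = np.cross(g_pred, g_meas)
    norm = np.linalg.norm(axis)
    if norm < 1e-9:
        return attitude                                  # already aligned
    axis = axis / norm
    full_angle = np.arccos(np.clip(g_pred @ g_meas, -1.0, 1.0))
    delta = _rotation_matrix(axis, ratio * full_angle)   # rotates g_pred toward g_meas
    # Fold the partial correction into the attitude; the transpose is needed
    # because the correction acts on device-frame vectors produced by attitude.T.
    return attitude @ delta.T
```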
After correcting the attitude based on the angular velocity as described above, the CPU 10 further corrects the corrected attitude using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 (posture based on the marker coordinates) from the marker coordinate data 96. The marker coordinate data 96 indicates the positions of the markers 6R and 6L within the captured image, and therefore the attitude of the controller 5 with respect to the roll direction (the rotational direction about the Z axis) can be calculated from these positions. That is, the attitude of the controller 5 with respect to the roll direction can be calculated from the slope of a straight line connecting the position of the marker 6R and the position of the marker 6L within the captured image. In addition, when the position of the controller 5 with respect to the marker device 6 can be specified (for example, when it can be assumed that the controller 5 is positioned on the front side of the marker device 6), the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the position of the marker device 6 within the captured image. For example, in the case where the positions of the markers 6R and 6L are moved to the left within the captured image, it can be determined that the orientation (posture) of the controller 5 is changed to the right. In this way, the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the positions of the markers 6R and 6L. Through the above processing, the posture of the controller 5 can be calculated from the marker coordinate data 96.
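The roll calculation from the marker positions can be illustrated as follows: the slope of the line connecting the positions of the markers 6L and 6R in the captured image gives the roll angle, and the displacement of their midpoint from the image center gives rough indications for the yaw and pitch directions when the controller can be assumed to be in front of the marker device 6. The image size and coordinate conventions are assumptions for illustration.

```python
import math


def attitude_from_marker_coords(left, right, image_width=1024, image_height=768):
    """Estimate roll (and rough yaw/pitch offsets) from two marker coordinates.

    left, right: (x, y) positions of markers 6L and 6R in the captured image,
    with x increasing to the right and y increasing downward (assumed convention).
    Returns (roll_deg, horizontal_offset, vertical_offset); the offsets are the
    midpoint's displacement from the image center, usable for yaw/pitch only
    when the controller can be assumed to face the marker device.
    """
    dx = right[0] - left[0]
    dy = right[1] - left[1]
    roll_deg = math.degrees(math.atan2(dy, dx))
    mid_x = (left[0] + right[0]) / 2.0
    mid_y = (left[1] + right[1]) / 2.0
    # Markers shifted left in the image suggest the controller turned right.
    horizontal_offset = mid_x - image_width / 2.0
    vertical_offset = mid_y - image_height / 2.0
    return roll_deg, horizontal_offset, vertical_offset
```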
When the posture based on the marker coordinates is calculated, the CPU 10 then corrects the above-described corrected posture (posture corrected using the posture based on the acceleration) using the posture based on the marker coordinates. That is, the CPU 10 performs correction to approximate the corrected posture to the posture based on the marker coordinates at a predetermined ratio. The predetermined ratio may be a predetermined fixed value. In addition, the correction using the posture based on the marker coordinates may be performed only in any one direction or any two directions of the roll direction, the pitch direction, and the yaw direction. For example, in the case of using the marker coordinate data 96, the attitude can be calculated with high accuracy with respect to the roll direction, so the CPU 10 may correct only the roll direction with the attitude based on the marker coordinate data 96. In addition, since the posture based on the marker coordinate data 96 cannot be calculated when the marker device 6 or the marker portion 55 is not imaged by the imaging element 40 of the controller 5, the correction process using the marker coordinate data 96 may not be executed in this case.
From the above, the CPU 10 corrects the first posture of the controller 5 calculated from the first angular velocity data 95 using the first acceleration data 94 and the marker coordinate data 96. Here, with the method of using the angular velocity among the methods for calculating the attitude of the controller 5, the attitude can be calculated regardless of how the controller 5 is moving. On the other hand, in the method using angular velocities, since the attitude is calculated by adding up the sequentially detected angular velocities, there is a possibility that accuracy deteriorates due to accumulation of errors or the like, or accuracy of the gyro sensor deteriorates due to a problem called temperature drift. In addition, the method using acceleration does not accumulate errors, but in a state where the controller 5 is moved vigorously, the posture cannot be calculated with high accuracy (because the direction of gravity cannot be detected accurately). In addition, the method using the marker coordinates can calculate the attitude with high accuracy (particularly with respect to the roll direction), but cannot calculate the attitude in a state where the marker portion 55 is not captured. In contrast, according to the present embodiment, since the three methods having different advantages are used as described above, the posture of the controller 5 can be calculated more accurately. In other embodiments, the gesture may be calculated by any one or two of the above three methods. When the lighting control of the marker is performed in the processing of step S1 or S22, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates.
The process of step S24 is performed after the above-described step S23. In step S24, the CPU 10 calculates the posture of the terminal device 7. That is, since the terminal operation data 97 acquired from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the orientation data 103, the CPU 10 calculates the posture of the terminal device 7 from these data. Here, the CPU 10 can know the amount of rotation per unit time (the amount of change in the posture) of the terminal device 7 from the second angular velocity data 102. In addition, when the terminal device 7 is substantially stationary, the acceleration applied to the terminal device 7 is the gravitational acceleration, so the direction of gravity applied to the terminal device 7 (that is, the posture of the terminal device 7 with the gravitational direction as a reference) can be known from the second acceleration data 101. Further, a predetermined azimuth with respect to the terminal device 7 (that is, the posture of the terminal device 7 with the predetermined azimuth as a reference) can be known from the orientation data 103. Even in a place where a magnetic field other than the geomagnetic field is generated, the orientation data 103 still changes in response to rotation, so the amount of rotation of the terminal device 7 can be known even though the absolute azimuth cannot. Therefore, the CPU 10 can calculate the posture of the terminal device 7 from the second acceleration data 101, the second angular velocity data 102, and the orientation data 103. In the present embodiment, the posture of the terminal device 7 is calculated from these three pieces of data, but in another embodiment, the posture may be calculated from one or two of the three.
Note that a specific calculation method of the orientation of the terminal device 7 may be any method, and for example, a method of correcting the orientation calculated from the angular velocity indicated by the second angular velocity data 102 using the second acceleration data 101 and the orientation data 103 is considered. Specifically, the CPU 10 first calculates the posture of the terminal device 7 based on the second angular velocity data 102. Further, the method of calculating the posture from the angular velocity may be the same as the method in step S23 described above. Next, the CPU 10 corrects the posture calculated from the angular velocity using the posture calculated from the second acceleration data 101 and/or the posture calculated from the orientation data 103 at an appropriate timing (for example, when the terminal device 7 is in a state close to a stationary state). Further, the method of correcting the angular velocity-based posture with the acceleration-based posture may be the same method as the above-described case of calculating the posture of the controller 5. In addition, when the posture based on the angular velocity is corrected by the posture based on the orientation data, the CPU 10 may make the posture based on the angular velocity approach the posture based on the orientation data at a predetermined rate. From the above, the CPU 10 can accurately calculate the posture of the terminal device 7.
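The gravity- and azimuth-based reference posture used in such a correction could, for a nearly stationary terminal device, be constructed as in the sketch below; the accelerometer sign convention and the north/east/down frame are assumptions, and the result would then be blended with the gyro-integrated posture as in the previous sketch.

```python
import numpy as np

def posture_from_gravity_and_north(accel, mag):
    """Build a posture matrix for a (nearly) stationary terminal device from the
    second acceleration data and the orientation (magnetic) data.  The columns of
    the returned matrix are the world north, east, and down directions expressed
    in terminal-device coordinates."""
    down = -np.asarray(accel, dtype=float)       # at rest the accelerometer reads -gravity
    down /= np.linalg.norm(down)
    east = np.cross(down, np.asarray(mag, dtype=float))
    east /= np.linalg.norm(east)                 # horizontal, perpendicular to magnetic north
    north = np.cross(east, down)                 # horizontal component of magnetic north
    return np.column_stack((north, east, down))
```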
Further, since the controller 5 includes the imaging information calculation unit 35 as infrared detection means, the game device 3 can acquire the marker coordinate data 96. Therefore, the game device 3 can know from the marker coordinate data 96 the absolute posture of the controller 5 in the real space (that is, what posture the controller 5 assumes in a coordinate system set in the real space). On the other hand, the terminal device 7 does not include infrared detection means such as the imaging information calculation unit 35. Therefore, from the second acceleration data 101 and the second angular velocity data 102 alone, the game device 3 cannot know the absolute posture in the real space with respect to rotation about an axis along the gravity direction. Therefore, in the present embodiment, the terminal device 7 includes the magnetic sensor 62, and the game device 3 acquires the orientation data 103. As a result, the game device 3 can calculate from the orientation data 103 the absolute posture in the real space with respect to rotation about an axis along the gravity direction, and can calculate the posture of the terminal device 7 more accurately.
As a specific process of step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 from the main memory, and calculates the posture of the terminal device 7 from these data. Then, data indicating the calculated posture of the terminal device 7 is stored in the main memory as terminal posture data 109. The process of step S25 is performed after step S24.
In step S25, the CPU 10 executes recognition processing of the camera image. That is, the CPU 10 performs predetermined recognition processing on the camera image data 104. The recognition processing may be any processing as long as it is processing for detecting some feature from the camera image and outputting the result. For example, when the camera image includes the face of the player, the processing may be face recognition processing. Specifically, the processing may be processing for detecting a part of the face (eyes, nose, mouth, and the like), or processing for detecting an expression of the face. In addition, data indicating the result of the recognition processing is stored in the main memory as the image recognition data 110. The process of step S26 is performed after step S25.
In step S26, the CPU 10 executes the recognition processing of the microphone sound. That is, the CPU 10 performs predetermined recognition processing on the microphone sound data 105. The recognition processing may be any processing as long as it is processing for detecting some feature from the microphone sound and outputting the result. For example, the processing may be processing for detecting an instruction of the player from the microphone sound, or processing for detecting only the volume of the microphone sound. In addition, data indicating the result of the recognition processing is stored in the main memory as voice recognition data 111. The process of step S27 is performed after step S26.
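Purely as an illustration of the kind of recognition processing steps S25 and S26 describe, the sketch below detects faces in the camera image with OpenCV and measures the volume of the microphone sound as an RMS value. The library choice, cascade file, and parameters are assumptions and are not part of the embodiment.

```python
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def recognize_camera_image(camera_image_bgr):
    """Return bounding boxes of any faces found in the camera image."""
    gray = cv2.cvtColor(camera_image_bgr, cv2.COLOR_BGR2GRAY)
    return face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def recognize_microphone_sound(samples):
    """Return the RMS volume of one frame of microphone samples."""
    samples = np.asarray(samples, dtype=np.float32)
    return float(np.sqrt(np.mean(samples ** 2)))
```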
In step S27, the CPU 10 executes game processing corresponding to the game input. Here, the game input may be any data as long as it is data transmitted from the controller 5 or the terminal device 7, or data obtained from such data. Specifically, the game input may be the controller operation data 92 and the terminal operation data 97 themselves, or data obtained from these data (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the voice recognition data 111). The content of the game processing in step S27 may be anything; for example, it may be processing for moving an object (character) appearing in the game, processing for controlling a virtual camera, or processing for moving a cursor displayed on the screen. It may also be processing that uses the camera image (or a part thereof) as a game image, processing that uses the microphone sound as a game sound, or the like. An example of the game processing will be described later. In step S27, data indicating the result of the game control processing, such as data on various parameters set for characters (objects) appearing in the game, data on parameters of the virtual cameras arranged in the game space, and score data, are stored in the main memory. After step S27, the CPU 10 ends the game control process of step S4.
Returning to the description of fig. 12, in step S5, a television game image to be displayed on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read data indicating the result of the game control processing of step S4 from the main memory, read data necessary for generating a game image from the VRAM 11d, and generate the game image. The game image may be generated by any method as long as it represents the result of the game control process of step S4. For example, the game image may be generated as a three-dimensional CG image by arranging a virtual camera in the virtual game space and computing the game space as viewed from that virtual camera, or it may be generated as a two-dimensional image (without using a virtual camera). The generated television game image is stored in the VRAM 11d. The process of step S6 is performed after the above-described step S5.
In step S6, the CPU 10 and the GPU 11b generate a terminal game image to be displayed on the terminal device 7. Like the television game image, the terminal game image may be generated by any method as long as it represents the result of the game control process of step S4. The terminal game image may be generated by the same method as the television game image or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the game content, the television game image and the terminal game image may be identical, in which case the game image generation process need not be executed in step S6. The process of step S7 is performed after the above-described step S6.
In step S7, a television game sound to be output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound corresponding to the result of the game control processing of step S4. The generated game sound may be, for example, a sound effect of the game, the voice of a character appearing in the game, background music (BGM), or the like. The process of step S8 is performed after the above-described step S7.
In step S8, a terminal game sound to be output to the speaker 67 of the terminal device 7 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound corresponding to the result of the game control processing of step S4. The terminal game sound may be the same as or different from the television game sound. The sounds may also be only partly different, for example the same BGM but different sound effects. When the television game sound is the same as the terminal game sound, the game sound generation process need not be executed in step S8. The process of step S9 is performed after the above-described step S8.
In step S9, the CPU 10 outputs a game image and game sound to the television 2. Specifically, the CPU 10 transmits the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. Accordingly, the AV-IC 15 outputs the image and sound data to the television 2 through the AV connector 16. Thereby, the television game image is displayed on the television 2, and the television game sound is output from the speaker 2a. The process of step S10 is performed after step S9.
In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the CPU 10 sends the image data of the terminal game image stored in the VRAM 11d and the audio data generated by the DSP 11c in step S8 to the codec LSI 27, and the codec LSI 27 performs predetermined compression processing. The compressed image and audio data are transmitted to the terminal device 7 through the antenna 29 by the terminal communication module 28. The terminal device 7 receives the image and audio data transmitted from the game device 3 with the wireless module 70, and the codec LSI 66 performs predetermined decompression processing. The decompressed image data is output to the LCD 51, and the decompressed audio data is output to the audio IC 68. Thereby, the terminal game image is displayed on the LCD 51, and the terminal game sound is output from the speaker 67. The process of step S11 is performed after step S10.
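As a greatly simplified stand-in for this transmit-and-decompress pipeline, the sketch below compresses one image frame and one audio frame, sends them over a socket, and decompresses them on the receiving side. zlib and the framing format are illustrative assumptions; the codec LSI 27/66, the terminal communication module 28, and the wireless module 70 are not modeled.

```python
import struct
import zlib

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def send_frame(sock, image_bytes, audio_bytes):
    """Compress and send one frame of terminal game image and game sound."""
    image = zlib.compress(image_bytes)
    audio = zlib.compress(audio_bytes)
    sock.sendall(struct.pack("!II", len(image), len(audio)) + image + audio)

def receive_frame(sock):
    """Receive and decompress one frame on the terminal-device side."""
    image_len, audio_len = struct.unpack("!II", recv_exact(sock, 8))
    image = zlib.decompress(recv_exact(sock, image_len))
    audio = zlib.decompress(recv_exact(sock, audio_len))
    return image, audio
```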
In step S11, the CPU 10 determines whether to end the game. The determination of step S11 is made, for example, based on whether or not the game has been completed or whether the user has given an instruction to quit the game. In the case where the determination result of step S11 is negative, the process of step S2 is executed again; in the case where it is affirmative, the CPU 10 ends the game processing shown in fig. 12. In this way, the series of processes of steps S2 to S11 is repeatedly executed until it is determined in step S11 that the game is to be ended.
As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and inertial sensors such as the acceleration sensor 63 and the gyro sensor 64, and the outputs of the touch panel 52 and the inertial sensors are transmitted to the game device 3 as operation data and used as inputs to the game (steps S3 and S4). The terminal device 7 also includes a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6 and S10). Therefore, the user can perform an operation of directly touching the game image on the touch panel 52, and can also perform an operation of moving the LCD 51 on which the game image is displayed (since the movement of the terminal device 7 is detected by the inertial sensors). With these operations, the user can play the game with an operation feeling as if directly manipulating the game image, and therefore, for example, games with a new operation feeling such as the first and second game examples described later can be provided.
In the present embodiment, the terminal device 7 includes the analog stick 53 and the operation buttons 54 that can be operated while holding the terminal device 7, and the game device 3 can use the operation of the analog stick 53 and the operation buttons 54 as the input of the game (steps S3 and S4). Therefore, even when the game image is directly operated as described above, the user can perform a more detailed game operation by a button operation or a joystick operation.
In the present embodiment, the terminal device 7 includes the camera 56 and the microphone 69, and transmits data of the camera image captured by the camera 56 and data of the microphone sound detected by the microphone 69 to the game device 3 (step S3). Therefore, the game device 3 can use the camera image and/or the microphone sound as a game input, and therefore the user can also perform a game operation by an operation of capturing an image by the camera 56 and an operation of inputting a sound to the microphone 69. Further, since these operations can be performed in a state where the terminal device 7 is held, by performing these operations while directly operating the game image as described above, the user can perform more various game operations.
In the present embodiment, since the game image is displayed on the LCD51 of the portable terminal device 7 (steps S6 and S10), the user can freely arrange the terminal device 7. Therefore, when the user operates the controller 5 toward the marker, the user can play the game in any direction by disposing the terminal device 7 at any position, and the degree of freedom in operating the controller 5 can be increased. Further, since the terminal device 7 can be arranged at an arbitrary position, a game having a stronger sense of reality can be provided by arranging the terminal device 7 at a position suitable for the game content, as in a fifth game example described later.
In addition, according to the present embodiment, the game device 3 acquires operation data and the like from the controller 5 and the terminal device 7 (steps S2, S3), and therefore the user can use both the controller 5 and the terminal device 7 as operation units. Therefore, in the game system 1, a plurality of users can play a game using each device, and one user can play a game using two devices.
Further, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5 and S6), and can cause the television set 2 and the terminal device 7 to display the game images (steps S9 and S10). In this way, by displaying two kinds of game images on different devices, it is possible to provide a game image that is easier for the user to view, and it is possible to improve the operability of the game. For example, when two players play a game, as in the third or fourth game example described later, each player can play the game from a viewpoint that is easy to see by displaying a game image from a viewpoint that is easy to see by one of the users on the television 2 and displaying a game image from a viewpoint that is easy to see by the other user on the terminal device 7. Further, even when a game is played by one person, for example, as in the first, second, and fifth game examples described later, by displaying two types of game images at two different viewpoints, the player can more easily grasp the appearance of the game space, and the operability of the game can be improved.
[6. Game examples]
Next, a specific example of a game to be played in the game system 1 will be described. In addition, in the game example to be described below, there is a case where a part of the configuration of each device in the game system 1 is not used, and there is a case where a part of the series of processes shown in fig. 12 and 13 is not executed. That is, the game system 1 may not have all of the above-described configurations, and the game device 3 may not perform a part of the series of processes shown in fig. 12 and 13.
(first Game example)
The first game example is a game in which the player causes an object (a sword) to fly through the game space by operating the terminal device 7. The player can specify the direction in which the sword is launched by an operation of changing the posture of the terminal device 7 and an operation of drawing a line on the touch panel 52.
Fig. 14 is a diagram showing the screens of the television 2 and the terminal device 7 in the first game example. In fig. 14, game images showing the game space are displayed on the television 2 and on the LCD 51 of the terminal device 7. A sword 121, a control surface 122, and a target 123 are displayed on the television 2. The control surface 122 (and the sword 121) are displayed on the LCD 51. In the first game example, the player plays by operating the terminal device 7 so as to make the sword 121 fly and hit the target 123.
To make the sword 121 fly, the player first changes the posture of the control surface 122 disposed in the virtual game space to a desired posture by manipulating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 from the outputs of the inertial sensors (the acceleration sensor 63 and the gyro sensor 64) and the magnetic sensor 62 (step S24), and changes the posture of the control surface 122 according to the calculated posture (step S27). In the first game example, the posture of the control surface 122 is controlled so as to correspond to the posture of the terminal device 7 in the real space. That is, the player can change the posture of the control surface 122 in the game space (which is displayed on the terminal device 7) by changing the posture of the terminal device 7. In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space.
Next, the player performs an operation of drawing a line on the touch panel 52 using a stylus 124 or the like (see the arrow shown in fig. 14). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 so that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, the direction on the control surface 122 (the direction indicated by the line) can be calculated from the line drawn on the touch panel 52, and the sword 121 is launched in the direction thus determined. As described above, the CPU 10 calculates the direction on the control surface 122 from the touch position data 100 of the touch panel 52, and performs the process of moving the sword 121 in the calculated direction (step S27). The CPU 10 may also control the speed of the sword 121 based on, for example, the length of the line or the speed at which it was drawn.
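One conceivable way to turn a stroke on the touch panel 52 into a launch direction and speed on the control surface 122 is sketched below. The mapping of the panel axes onto the control surface, the speed factor, and the function name are assumptions made for the illustration.

```python
import numpy as np

def stroke_to_launch(touch_points, surface_right, surface_up, panel_size):
    """touch_points: (x, y) panel coordinates sampled along the drawn line.
    surface_right / surface_up: unit vectors spanning the control surface in the
    game space, assumed to correspond to the panel's x and y axes."""
    (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
    dx = (x1 - x0) / panel_size[0]           # normalize by panel width
    dy = (y0 - y1) / panel_size[1]           # panel y grows downward in this sketch
    direction = dx * np.asarray(surface_right) + dy * np.asarray(surface_up)
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        return None, 0.0                     # no usable stroke
    speed = 10.0 * norm                      # longer stroke -> faster launch
    return direction / norm, speed
```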
As described above, according to the first game example, the game device 3 can move the control surface 122 in accordance with the movement (posture) of the terminal device 7 by using the output of the inertial sensor as the game input, and can specify the direction on the control surface 122 by using the output of the touch panel 52 as the game input. As a result, the player can move the game image (the image of the control surface 122) displayed on the terminal device 7 or perform a touch operation on the game image, and thus can play a game with a new operation feeling as if the game image were directly operated.
In addition, in the first game example, directions in a three-dimensional space can be easily indicated by using the outputs of the inertial sensors and the touch panel 52 as game inputs. That is, the player can easily specify a direction by an intuitive operation, as if actually entering a direction in space: adjusting the posture of the terminal device 7 with one hand while drawing a line in the desired direction on the touch panel 52 with the other hand. Further, since the player can perform the posture operation of the terminal device 7 and the input operation on the touch panel 52 simultaneously and in parallel, the operation of specifying a direction in the three-dimensional space can be performed quickly.
In addition, according to the first game example, in order to facilitate the touch input operation on the control surface 122, the control surface 122 is displayed in full screen on the terminal device 7. On the other hand, the television 2 displays an image of a game space including the entire control surface 122 and the target 123 so that the posture of the control surface 122 can be easily grasped and the target 123 can be easily aimed (see fig. 14). That is, in step S27, the first virtual camera for generating the game image for the television is set so that the entire control surface 122 and the target 123 are included in the visual field range, and the second virtual camera for generating the game image for the terminal is set so that the screen of the LCD51 (the input surface of the touch panel 52) and the control surface 122 coincide with each other on the screen. Therefore, in the first game example, the game operation is more easily performed by displaying the images of the game space viewed from different viewpoints on the television set 2 and the terminal device 7.
(second Game example)
Further, games that use the outputs of the inertial sensors and the touch panel 52 as game inputs are not limited to the first game example described above, and various game examples are conceivable. The second game example, like the first game example, is a game in which the terminal device 7 is operated to make an object (a cannonball) fly in the game space. The player can specify the direction in which the cannonball is fired by an operation of changing the posture of the terminal device 7 and an operation of specifying a position on the touch panel 52.
Fig. 15 is a diagram showing the screens of the television 2 and the terminal device 7 in the second game example. In fig. 15, a cannon 131, a cannonball 132, and a target 133 are displayed on the television 2. The cannonball 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space as seen from the position of the cannon 131.
In the second game example, the player can change the display range displayed on the terminal device 7 as the terminal game image by manipulating the posture of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 from the outputs of the inertial sensor (the acceleration sensor 63 and the gyro sensor 64) and the magnetic sensor 62 (step S24), and controls the position and attitude of the second virtual camera for generating the terminal game image based on the calculated attitude (step S27). Specifically, the second virtual camera is provided at the position of the cannon 131, and the orientation (posture) thereof is controlled in accordance with the posture of the terminal device 7. In this way, the player can change the range of the game space displayed on the terminal device 7 by changing the posture of the terminal device 7.
In addition, in the second game example, the player specifies the shooting direction of the cannonball 132 by an operation (touch operation) of inputting a point on the touch panel 52. Specifically, as the processing of step S27 described above, the CPU 10 calculates a position (control position) in the game space corresponding to the touched position, and calculates a direction from a predetermined position (for example, the position of the cannon 131) in the game space toward the control position as the launch direction. Then, a process of moving the cannonball 132 in the shooting direction is performed. As described above, in the first game example, the player performs an operation of drawing a line on the touch panel 52, but in the second game example, an operation of designating a point on the touch panel 52 is performed. The control position can be calculated by setting the same control surface as in the first game example (although the control surface is not shown in the second game example). That is, by arranging the control surface in accordance with the posture of the second virtual camera so as to correspond to the display range of the terminal device 7 (specifically, the control surface rotates around the position of the cannon 131 in accordance with the change in the posture of the terminal device 7), the position on the control surface corresponding to the touched position can be calculated as the control position.
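A conceivable form of the control-position calculation described here is sketched below: the touch position is projected onto a control surface placed in front of the second virtual camera, and the firing direction runs from the cannon toward that point. The surface distance and extent, and the camera representation, are assumptions.

```python
import numpy as np

def firing_direction(touch_uv, cannon_pos, camera_axes,
                     surface_distance=10.0, surface_extent=(8.0, 4.5)):
    """touch_uv: touch position normalized to [-1, 1] in both axes.
    camera_axes: (right, up, forward) unit vectors of the second virtual camera,
    which rotates with the terminal device's posture."""
    right, up, forward = (np.asarray(a, dtype=float) for a in camera_axes)
    u, v = touch_uv
    control_position = (np.asarray(cannon_pos, dtype=float)
                        + surface_distance * forward
                        + u * surface_extent[0] * right
                        + v * surface_extent[1] * up)
    direction = control_position - np.asarray(cannon_pos, dtype=float)
    return direction / np.linalg.norm(direction)
```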
According to the second game example described above, the game device 3 can change the display range of the terminal-use game image in accordance with the movement (posture) of the terminal device 7 by using the output of the inertial sensor as the game input, and can determine the direction (the emission direction of the cannonball 132) within the game space by using the touch input specifying the position within the display range as the game input. Therefore, in the second game example as well, the player can move the game image displayed on the terminal device 7 or perform a touch operation on the game image, as in the first game example, and therefore can play a game with a new operation feeling as if the game image were directly operated.
In the second game example, as in the first game example, the player can easily specify a direction by an intuitive operation, as if actually entering a direction in space: adjusting the posture of the terminal device 7 with one hand while making a touch input on the touch panel 52 with the other hand. Further, since the player can perform the posture operation of the terminal device 7 and the input operation on the touch panel 52 simultaneously and in parallel, the operation of specifying a direction in the three-dimensional space can be performed quickly.
In the second game example, the image displayed on the television set 2 may be an image viewed from the same viewpoint as the terminal device 7, but in fig. 15, the game device 3 is assumed to display an image viewed from a different viewpoint. That is, the second virtual camera for generating the terminal game image is set at the position of the cannon 131, whereas the first virtual camera for generating the television game image is set at the position behind the cannon 131. Here, for example, by displaying the range that cannot be seen on the screen of the terminal device 7 on the television 2, it is possible to realize a game system in which the player aims at the target 133 that cannot be seen on the screen of the terminal device 7 while watching the screen of the television 2. By differentiating the display ranges of the television 2 and the terminal device 7 in this way, not only the appearance in the game space can be grasped more easily, but also the interest of the game can be further improved.
As described above, according to the present embodiment, since the terminal device 7 including the touch panel 52 and the inertial sensor can be used as the operation device, it is possible to realize a game having an operation feeling of directly operating a game image as in the first and second game examples described above.
(third Game example)
Next, a third game example is described with reference to fig. 16 and 17. The third game example is a baseball game in a two-player versus format. That is, the first player operates the batter with the controller 5, and the second player operates the pitcher with the terminal device 7. A game image that makes the game operation easy for each player is displayed on the television 2 and on the terminal device 7, respectively.
Fig. 16 is a diagram showing an example of a television game image displayed on the television 2 in the third game example. The television game image shown in fig. 16 is an image mainly provided to the first player. That is, the television game image shows a game space obtained by observing a pitcher (pitcher object) 142 as an operation object of the second player from the side of a batter (batter object) 141 as an operation object of the first player. The first virtual camera for generating a television game image is disposed at a position behind the batter 141 so as to face the pitcher 142 from the batter 141.
On the other hand, fig. 17 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the third game example. The terminal game image shown in fig. 17 is an image mainly provided to the second player. That is, the terminal game image represents a game space obtained by observing the batter 141 as the operation target of the first player from the pitcher 142 side as the operation target of the second player. Specifically, in step S27, the CPU 10 controls the second virtual camera used to generate the terminal game image in accordance with the orientation of the terminal device 7. As in the second game example, the attitude of the second virtual camera is calculated in accordance with the attitude of the terminal device 7. The position of the second virtual camera is fixed at a predetermined position. The terminal game image includes a cursor 143 for indicating the direction in which the pitcher 142 throws the ball.
Further, the method by which the first player operates the batter 141 and the method by which the second player operates the pitcher 142 may be any methods. For example, the CPU 10 may detect a swing operation of the controller 5 from the output data of the inertial sensor of the controller 5, and cause the batter 141 to swing the bat in response to the swing operation. For example, the CPU 10 may move the cursor 143 in accordance with the operation of the analog stick 53, and cause the pitcher 142 to pitch toward the position indicated by the cursor 143 when a predetermined one of the operation buttons 54 is pressed. The cursor 143 may also be moved according to the posture of the terminal device 7 instead of by operating the analog stick 53.
As described above, in the third game example, the game images are generated from different viewpoints on the television 2 and the terminal device 7, thereby providing the game images that are easy to view and easy to operate for each player.
In the third game example, two virtual cameras are set in a single game space, and two kinds of game images, each obtained by viewing the game space from one of the virtual cameras, are displayed (fig. 16 and 17). This has the advantage that the game processing on the game space (control of objects in the game space, etc.) is almost entirely shared between the two kinds of game images: each image can be generated simply by performing the drawing process twice on the same game space, so processing is more efficient than executing separate game processes.
In the third game example, since the cursor 143 indicating the pitching direction is displayed only on the terminal device 7 side, the first player cannot see the position indicated by the cursor 143. Therefore, no unfairness arises in which the first player learns the pitching direction, to the disadvantage of the second player. In this way, in the present embodiment, if one player seeing a game image would make the game unfair for the other player, that game image can be displayed on the terminal device 7. This prevents problems such as a loss of game strategy. In another embodiment, depending on the content of the game (for example, when no such unfairness arises even if the first player sees the terminal game image), the game device 3 may display the terminal game image on the television 2 together with the television game image.
(fourth Game example)
Next, a fourth game example is described with reference to fig. 18 and 19. A fourth game example is a shooting game in a two-player cooperation format. That is, the first player performs an operation of moving the plane by using the controller 5, and the second player performs an operation of controlling the shooting direction of the cannon of the plane by using the terminal device 7. In the fourth game example, as in the third game example, a game image that is easy for each player to perform a game operation is displayed on the television 2 and the terminal device 7.
Fig. 18 is a diagram showing an example of a television game image displayed on the television 2 in the fourth game example. Fig. 19 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example. As shown in fig. 18, in the fourth game example, an airplane (airplane object) 151 and a target (balloon object) 153 appear in a virtual game space. In addition, the airplane 151 has a cannon (cannon object) 152.
As shown in fig. 18, an image of a game space including a plane 151 is displayed as a television game image. The first virtual camera for generating a television game image is set to generate an image of a game space obtained by viewing the airplane 151 from behind. That is, the first virtual camera is arranged at a position behind the airplane 151 in a posture in which the airplane 151 is included in the shooting range (field range). In addition, the first virtual camera is controlled to move as the airplane 151 moves. That is, the CPU 10 controls the movement of the airplane 151 according to the controller operation data in the process of the above-described step S27, and controls the position and the attitude of the first virtual camera. In this way, the position and posture of the first virtual camera are controlled in accordance with the operation of the first player.
On the other hand, as shown in fig. 19, an image of the game space viewed from the airplane 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Thus, the second virtual camera for generating the terminal game image is arranged at the position of the airplane 151 (more specifically, the position of the cannon 152). In the process of step S27 described above, the CPU 10 controls the movement of the airplane 151 according to the controller operation data and also controls the position of the second virtual camera. The second virtual camera may also be placed at a position around the airplane 151 or the cannon 152 (for example, slightly behind the cannon 152). As described above, the position of the second virtual camera is controlled by the operation of the first player (who controls the movement of the airplane 151). Therefore, in the fourth game example, the first virtual camera and the second virtual camera move in conjunction with each other.
Further, as the terminal game image, an image of the game space viewed in the direction of the shooting direction of the cannon 152 is displayed. Here, the firing direction of the cannon 152 is controlled in a manner corresponding to the attitude of the terminal device 7. That is, in the present embodiment, the posture of the second virtual camera is controlled so that the line-of-sight direction of the second virtual camera coincides with the shooting direction of the cannon 152. In the processing of step S27, the CPU 10 controls the orientation of the cannon 152 and the orientation of the second virtual camera in accordance with the orientation of the terminal device 7 calculated in step S24. In this way, the posture of the second virtual camera is controlled in accordance with the operation of the second player. In addition, the second player can change the shooting direction of the cannon 152 by changing the posture of the terminal device 7.
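As a rough illustration of the camera control described for this game example, the sketch below derives the first virtual camera from the airplane's position (first player's movement input) and the second virtual camera's orientation from the terminal device's posture (second player). The offsets and the cannon-at-the-airplane-origin simplification are assumptions.

```python
import numpy as np

def update_cameras(airplane_pos, airplane_forward, terminal_posture):
    """terminal_posture: 3x3 rotation matrix of the terminal device."""
    airplane_pos = np.asarray(airplane_pos, dtype=float)
    airplane_forward = np.asarray(airplane_forward, dtype=float)

    # First virtual camera: behind and slightly above the airplane, looking at it.
    cam1_pos = airplane_pos - 15.0 * airplane_forward + np.array([0.0, 5.0, 0.0])
    cam1_look_at = airplane_pos

    # Second virtual camera: at the cannon (here taken to be the airplane's
    # position), with its line of sight given by the terminal device's posture,
    # so that it matches the cannon's firing direction.
    cam2_pos = airplane_pos
    cam2_forward = np.asarray(terminal_posture, dtype=float) @ np.array([0.0, 0.0, 1.0])
    return (cam1_pos, cam1_look_at), (cam2_pos, cam2_forward)
```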
Further, to fire a cannonball from the cannon 152, the second player presses a predetermined button of the terminal device 7. When the button is pressed, a cannonball is fired in the direction in which the cannon 152 is oriented. In the terminal game image, an aiming cursor 154 is displayed at the center of the screen of the LCD 51, and the cannonball is fired in the direction indicated by the aiming cursor 154.
As described above, in the fourth game example, the first player operates the airplane 151 (e.g., moves it in the direction of the desired target 153) while mainly observing the television game image (fig. 18) indicating the game space observed in the traveling direction of the airplane 151. On the other hand, the second player operates the cannon 152 while mainly observing the terminal-use game image (fig. 19) representing the game space observed in the shooting direction of the cannon 152. In this way, in the fourth game example, in a game of a type in which two players cooperate, game images that are easy to observe and easy to operate for the respective players can be displayed on the television 2 and the terminal device 7, respectively.
In the fourth game example, the positions of the first virtual camera and the second virtual camera are controlled in accordance with the operation by the first player, and the posture of the second virtual camera is controlled in accordance with the operation by the second player. That is, in the present embodiment, the position or orientation of the virtual camera changes according to the game operation of each player, and as a result, the display range of the game space displayed on each display device changes. Since the display range of the game space displayed on the display device changes in accordance with the operation of each player, each player can actually feel that his or her game operation is sufficiently reflected in the progress of the game, and can sufficiently enjoy the game.
In the fourth game example, a game image viewed from behind the airplane 151 is displayed on the television 2, and a game image viewed from the position of the cannon of the airplane 151 is displayed on the terminal device 7. In another game example, the game device 3 may instead display the game image viewed from behind the airplane 151 on the terminal device 7 and display the game image viewed from the position of the cannon 152 of the airplane 151 on the television 2. In this case, the roles of the players are swapped from the fourth game example: the first player operates the cannon 152 using the controller 5, and the second player operates the airplane 151 using the terminal device 7.
(fifth Game example)
Next, a fifth game example is explained with reference to fig. 20. The fifth game example is a game in which a player operates using the controller 5, and the terminal device 7 is used as a display device instead of an operation device. Specifically, the fifth game example is a golf game, and the game device 3 causes the player character in the virtual game space to perform a golf swing motion in accordance with an operation (swing operation) in which the player swings the controller 5 like a golf club.
Fig. 20 is a diagram showing how the game system 1 is used in the fifth game example. In fig. 20, an image of the game space including a player character 161 (an object) and a golf club 162 (an object) is displayed on the screen of the television 2. A ball 163 (an object) disposed in the game space is also displayed on the television 2, although it is not visible in fig. 20 because it is hidden behind the golf club 162. On the other hand, as shown in fig. 20, the terminal device 7 is placed on the floor in front of the television 2 with the screen of the LCD 51 facing vertically upward. An image showing the ball 163, an image showing a part of the golf club 162 (specifically, the head 162a of the golf club), and an image showing the floor of the game space are displayed on the terminal device 7. The terminal game image is an image of the surroundings of the ball as seen from above.
When playing the game, the player 160 stands near the terminal device 7 and performs a swing operation of swinging the controller 5 like a golf club. At this time, in step S27 described above, the CPU 10 controls the position and posture of the golf club 162 in the game space in accordance with the posture of the controller 5 calculated in the process of step S23 described above. Specifically, the golf club 162 is controlled such that, when the tip direction of the controller 5 (the Z-axis positive direction shown in fig. 3) points at the image of the ball 163 displayed on the LCD 51, the golf club 162 in the game space hits the ball 163.
When the tip direction of the controller 5 is pointed at the LCD 51, an image (head image) 164 showing a part of the golf club 162 is displayed on the LCD 51 (see fig. 20). To increase the sense of realism, the image of the ball 163 may be displayed at actual size, and the orientation of the head image 164 may be rotated in accordance with the rotation of the controller 5 about its Z axis. The terminal game image may be generated using a virtual camera placed in the game space, or may be generated from image data prepared in advance. In the latter case, a detailed and realistic image can be generated with a small processing load, without building a detailed terrain model of the golf course.
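One conceivable way to place the head image 164 over the ball from the marker coordinates obtained while the controller points at the terminal device on the floor is sketched below; the pixel-to-floor mapping, its scale, and the function name are assumptions rather than the embodiment's actual computation.

```python
def club_head_position(marker_midpoint, image_size, ball_pos, scale=0.02):
    """marker_midpoint: midpoint of the two markers of the marker section 55 in
    the controller's captured image, or None when the markers are not imaged.
    ball_pos: (x, z) position of the ball 163 on the game-space floor."""
    if marker_midpoint is None:
        return None                                  # controller not facing the LCD
    # Offset of the pointing position from the image center, in pixels.
    off_x = marker_midpoint[0] - image_size[0] / 2.0
    off_y = marker_midpoint[1] - image_size[1] / 2.0
    # Map the pixel offset onto the floor around the ball (sign and scale are
    # assumptions for this sketch).
    return (ball_pos[0] - scale * off_x, ball_pos[1] - scale * off_y)
```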
When the ball 163 is hit by the golf club 162 as a result of the player 160 performing the swing operation described above, the ball 163 moves (flies). That is, the CPU 10 determines in step S27 whether or not the golf club 162 has come into contact with the ball 163, and moves the ball 163 when contact has occurred. Here, the television game image is generated so as to include the moving ball 163. That is, the CPU 10 controls the position and posture of the first virtual camera for generating the television game image so that the moving ball is included in its shooting range. On the other hand, on the terminal device 7, when the golf club 162 hits the ball 163, the image of the ball 163 moves and soon disappears off the screen. Thus, in the fifth game example, how the ball moves is shown mainly on the television 2, and the player 160 can confirm, in the television game image, the trajectory of the ball sent flying by the swing operation.
As described above, in the fifth game example, the player 160 can swing the golf club 162 (make the player character 161 swing the golf club 162) by swinging the controller 5. Here, in the fifth game example, control is performed in the following manner: when the tip direction of the controller 5 is directed to the image of the ball 163 displayed on the LCD51, the golf club 162 in the game space is caused to hit the ball 163. Thus, the player can obtain a feeling as if he is swinging an actual golf club through the swing operation, so that the swing operation can be performed with more realistic feeling.
In the fifth game example, the head image 164 is also displayed on the LCD51 with the front end direction of the controller 5 facing the terminal device 7. Therefore, the player can obtain a feeling that the posture of the golf club 162 in the virtual space corresponds to the posture of the controller 5 in the real space by directing the tip end direction of the controller 5 toward the terminal device 7, and can perform the swing operation with more realistic feeling.
As described above, in the fifth game example, when the terminal device 7 is used as the display device, the operation by the controller 5 can be performed with more realistic feeling by disposing the terminal device 7 at an appropriate position.
In the fifth game example, the terminal device 7 is placed on the floor, and an image showing only the game space around the ball 163 is displayed on the terminal device 7. Therefore, the position and posture of the entire golf club 162 in the game space cannot be displayed on the terminal device 7, and neither can the movement of the ball 163 after the swing operation. Therefore, in the fifth game example, the entire golf club 162 is displayed on the television 2 before the ball 163 moves, and after the ball 163 starts moving, how the ball 163 moves is displayed on the television 2. In this way, according to the fifth game example, realistic operations can be provided to the player, and easy-to-view game images can be presented to the player through the two screens of the television 2 and the terminal device 7.
In the fifth game example, the marker section 55 of the terminal device 7 is used to calculate the posture of the controller 5. That is, the CPU 10 turns on the marker section 55 (and does not turn on the marker device 6) in the initial processing of step S1, and calculates the posture of the controller 5 from the marker coordinate data 96 in step S23. This makes it possible to accurately determine whether or not the tip direction of the controller 5 is oriented toward the marker section 55. In the fifth game example, steps S21 and S22 described above need not be performed, but in another game example, the marker to be lit may be changed during the game by performing the processes of steps S21 and S22. For example, the CPU 10 may determine from the first acceleration data 94 in step S21 whether or not the tip direction of the controller 5 is oriented toward the gravity direction, and then control the lighting in step S22 as follows: the marker section 55 is lit if the tip is oriented toward the gravity direction, and the marker device 6 is lit otherwise. Then, when the tip direction of the controller 5 is oriented toward the gravity direction, the posture of the controller 5 can be calculated with high accuracy by acquiring marker coordinate data of the marker section 55, and when the tip direction of the controller 5 is oriented toward the television 2, the posture of the controller 5 can be calculated with high accuracy by acquiring marker coordinate data of the marker device 6.
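The marker-switching idea described above could look like the sketch below: the controller's acceleration data is used to judge whether its tip points toward the gravity direction, and the marker to be lit is chosen accordingly. The threshold, the axis convention, and the returned labels are assumptions.

```python
import numpy as np

def choose_marker(first_acceleration, threshold_deg=30.0):
    """first_acceleration: controller acceleration; at rest it is treated here as
    indicating the gravity direction in controller coordinates."""
    gravity = np.asarray(first_acceleration, dtype=float)
    gravity /= np.linalg.norm(gravity)
    tip_axis = np.array([0.0, 0.0, 1.0])          # controller Z axis = tip direction
    angle = np.degrees(np.arccos(np.clip(np.dot(gravity, tip_axis), -1.0, 1.0)))
    # Tip roughly downward -> light the terminal's marker section;
    # otherwise light the TV-side marker device.
    return "marker_section_55" if angle < threshold_deg else "marker_device_6"
```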
As described in the fifth game example, the game system 1 can use the terminal device 7 as a display device by being installed at an arbitrary position. Thus, when marker coordinate data is used as a game input, the controller 5 can be used in any direction by setting the terminal device 7 at a desired position in addition to the controller 5 toward the television 2. That is, according to the present embodiment, since the direction in which the controller 5 can be used is not limited, the degree of freedom of the operation of the controller 5 can be improved.
[7. Other operation examples of the game system]
The game system 1 can perform operations for playing various games as described above. The terminal device 7 can also be used as a portable display or a second display, and can also be used as a controller for performing touch input or input by movement, and according to the game system 1, a wide range of games can be implemented. Further, the following operations can be performed for applications other than games.
(operation example in which a player plays a game only with the terminal device 7)
In the present embodiment, the terminal device 7 functions as a display device and also functions as an operation device. Therefore, the terminal device 7 is used as a display unit and an operation unit without using the television set 2 and the controller 5, whereby the terminal device 7 can also be used like a portable game device.
Specifically describing the game process shown in fig. 12, the CPU 10 acquires the terminal operation data 97 from the terminal device 7 in step S3, and executes the game process using only the terminal operation data 97 as a game input (without using the controller operation data) in step S4. Then, a game image is generated in step S6, and the game image is transmitted to the terminal device 7 in step S10. In this case, steps S2, S5, and S9 may not be performed. According to the above, the game processing is performed in accordance with the operation of the terminal device 7, and the game image showing the result of the game processing is displayed on the terminal device 7. In this way, the terminal device 7 can be used as a portable game device (although the game device actually executes the game process). Therefore, according to the present embodiment, even when a game image cannot be displayed on the television set 2 due to the television set 2 being used (for example, another person is watching a television broadcast), the user can play a game using the terminal device 7.
Further, the image that the CPU 10 transmits to and displays on the terminal device 7 is not limited to the game image; for example, the menu screen displayed after power-on may also be transmitted and displayed. This is convenient because the player can play the game from the start without using the television 2.
In the above, the display device that displays the game image can also be changed from the terminal device 7 to the television 2 during the game. Specifically, the CPU 10 may additionally execute the above step S9 and output the game image to the television 2. The image output to the television 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. Thus, by switching the input of the television 2 so that it displays the input from the game device 3, the same game image as on the terminal device 7 can be displayed on the television 2, and the display device showing the game image can thereby be changed to the television 2. After the game image is displayed on the television 2, the screen display of the terminal device 7 may be turned off.
In the game system 1, an infrared remote control signal for the television 2 may be output from an infrared output unit (the marker device 6, the marker section 55, or the infrared communication module 72). Accordingly, the game device 3 can operate the television 2 by outputting the infrared remote control signal from the infrared output means in accordance with the operation on the terminal device 7. In this case, since the user can operate the television set 2 by using the terminal device 7 without operating the remote controller of the television set 2, it is convenient in the case of switching the input of the television set 2 as described above.
(example of operation for communicating with other devices via network)
As described above, since the game device 3 has a function of connecting to a network, the game system 1 can be used even when communicating with an external device via a network. Fig. 21 is a diagram showing a connection relationship of each device included in the game system 1 when connected to an external device via a network. As shown in fig. 21, the game device 3 can communicate with an external device 201 via a network 200.
As described above, when the external device 201 and the game device 3 can communicate with each other, the game system 1 can communicate with the external device 201 using the terminal device 7 as an interface. For example, the game system 1 can be used as a television phone by transmitting and receiving images and sounds between the external device 201 and the terminal device 7. Specifically, the game device 3 receives images and sounds (images and sounds of the other party of the telephone) from the external device 201 via the network 200, and transmits the received images and sounds to the terminal device 7. Thereby, the terminal device 7 displays an image from the external device 201 on the LCD51, and outputs sound from the external device 201 from the speaker 67. Further, the game device 3 receives the camera image captured by the camera 56 and the microphone sound detected by the microphone 69 from the terminal device 7, and transmits the camera image and the microphone sound to the external device 201 via the network 200. The game device 3 can use the game system 1 as a video phone by repeating the transmission and reception of the image and the sound with the external device 201.
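The videophone-style use described above amounts to the game device relaying media in both directions, roughly as in the sketch below. The transport objects and their send/receive methods are placeholders assumed for the illustration, not a real API.

```python
def relay_once(external_link, terminal_link):
    """One relay step between the external device 201 and the terminal device 7."""
    # Remote party -> terminal device (displayed on the LCD, played on the speaker).
    remote_image, remote_sound = external_link.receive_media()
    terminal_link.send_media(remote_image, remote_sound)

    # Terminal device -> remote party (camera image and microphone sound).
    camera_image, microphone_sound = terminal_link.receive_media()
    external_link.send_media(camera_image, microphone_sound)
```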
In the present embodiment, since the terminal device 7 is a portable device, the user can use the terminal device 7 at an arbitrary position and point the camera 56 in an arbitrary direction. Also, since the terminal device 7 includes the touch panel 52, the game device 3 can transmit the input information to the touch panel 52 (the touch position data 100) to the external device 201. For example, when images and sound from the external device 201 are output by the terminal device 7 and characters or the like written on the touch panel 52 by the user are transmitted to the external device 201, the game system 1 can also be used as an online teaching (e-learning) system.
(example of operation in conjunction with television broadcast)
In addition, the game system 1 can also operate in conjunction with a television broadcast when the television broadcast is viewed through the television set 2. That is, when a television program is being viewed through the television set 2, the game system 1 causes the terminal device 7 to output information and the like relating to the television program. Next, an operation example when the game system 1 operates in conjunction with television broadcasting will be described.
In the above operation example, the game device 3 can communicate with a server via a network (in other words, the external device 201 shown in fig. 21 is a server). The server stores various information (television information) associated with the television broadcast for each channel of the television broadcast. The television information may be information related to a program such as subtitles and cast information, or information of an EPG (electronic program guide) or information broadcast as data broadcast. The television information may be image, sound, or text information, or a combination thereof. Further, the number of servers is not necessarily one, and a server may be provided for each channel of television broadcasting or each program, and the game device 3 may communicate with each server.
When the television 2 is outputting video and audio of a television broadcast, the game device 3 allows the user to input a channel of the television broadcast being viewed using the terminal device 7. Then, the server is requested via the network to transmit television information corresponding to the inputted channel. Accordingly, the server transmits data of the television information corresponding to the channel. Upon receiving the data transmitted from the server, game device 3 outputs the received data to terminal device 7. The terminal device 7 displays image and character data among the data on the LCD51, and outputs audio data from a speaker. According to the above, the user can enjoy information and the like relating to the television program currently being viewed by using the terminal device 7.
As described above, the game system 1 can also provide information linked with television broadcasting to the user through the terminal device 7 by communicating with an external device (server) via a network. In particular, in the present embodiment, the terminal device 7 is a portable device, and therefore, the user can use the terminal device 7 at an arbitrary position, and convenience is high.
As described above, in the present embodiment, the user can use the terminal device 7 in a variety of applications and modes, in addition to using it in games.
[8. Modifications]
The above embodiment is merely an example of the present invention; in other embodiments, the present invention can also be implemented with, for example, the configurations described below.
(modification having a plurality of terminal devices)
In the above embodiment, the game system 1 has only one terminal device, but the game system 1 may have a plurality of terminal devices. That is, the game device 3 may be configured to be capable of wireless communication with each of a plurality of terminal devices, transmitting game image data, game sound data, and control data to each terminal device, and receiving operation data, camera image data, and microphone sound data from each terminal device. In this case, the game device 3 may perform the wireless communication with the terminal devices in a time-division manner, or by dividing the frequency band among them.
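A time-division variant of this wireless communication could look like the following sketch, where each frame period is divided into one slot per terminal device (the scheduling shown is an assumption, not the embodiment's actual scheme):

```python
def run_time_division_frame(terminals, frame_payloads):
    """One frame period on the game device side.

    terminals:      connections to the terminal devices
    frame_payloads: per-terminal data for this frame (game image, game sound,
                    control data), one entry per terminal
    """
    received = []
    for slot, (terminal, payload) in enumerate(zip(terminals, frame_payloads)):
        # Slot `slot` of the frame period is reserved for this terminal device.
        terminal.send(payload)
        received.append(terminal.receive())  # operation data, camera image, microphone sound
    return received
```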
When a plurality of terminal devices are provided as described above, a wider variety of games can be played using the game system. For example, in the case where the game system 1 has two terminal devices, the game system 1 has three display devices, so game images for three players can be generated and displayed on the respective display devices. Also, in the case where the game system 1 has two terminal devices, two players can play simultaneously in a game in which a controller and a terminal device are used as one set (for example, the fifth game example described above). Further, when the game processing in step S27 is performed based on marker coordinate data output from two controllers, both players can perform game operations by pointing their controllers at a marker (the marker device 6 or the marker unit 55). That is, one player can perform game operations by pointing the controller at the marker device 6, and the other player can perform game operations by pointing the controller at the marker unit 55.
(modification of function of terminal device)
In the above embodiment, the terminal device 7 functions as a so-called thin client that does not execute game processing. In other embodiments, however, a part of the series of game processes executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, the terminal device 7 may be caused to execute a part of the processing (for example, the terminal game image generation processing). Also, in a game system having a plurality of information processing apparatuses (game apparatuses) capable of communicating with each other, the plurality of information processing apparatuses may share the execution of the game processing.
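As a rough sketch of this variation, the terminal game image generation processing could be moved onto the terminal device 7 by sending it game state instead of a compressed image; every name below (update_game, render_terminal_image, the state layout) is a hypothetical placeholder, not part of the embodiment:

```python
def update_game(state, operation_data):
    # Placeholder game processing that remains on the game device 3.
    return {"frame": state.get("frame", 0) + 1, "input": operation_data}


def render_terminal_image(view_state):
    # Placeholder for the terminal game image generation processing, which in
    # this variation runs on the terminal device 7 instead of the game device 3.
    return f"terminal game image for frame {view_state['frame']}"


def game_device_step(state, operation_data, terminal_link):
    """On the game device: update the game and send state, not a compressed image."""
    state = update_game(state, operation_data)
    terminal_link.send_state({"frame": state["frame"]})
    return state


def terminal_device_step(terminal_link, display):
    """On the terminal device: generate its own game image from the received state."""
    view_state = terminal_link.receive_state()
    display.show(render_terminal_image(view_state))
```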
Industrial applicability
As described above, the present invention can be used as, for example, a game system or a terminal device used in a game system, for the purpose of enabling a player to perform new game operations.

Claims (14)

1. A game system comprising a stationary game device and a first operation device,
the game device includes:
a first operation data receiving unit that receives first operation data from the first operation device;
a game processing unit that executes a game process based on the first operation data;
an image generation unit that sequentially generates a first game image and a second game image based on the game processing;
a game image compression unit that sequentially compresses the first game image to generate compressed image data;
a game image transmitting unit that sequentially transmits the compressed image data to the first operation device via wireless; and
an image output unit that sequentially outputs the second game images to an external display device independent of the first operation device,
the first operation device includes:
a display unit;
a touch panel provided on a screen of the display unit;
an inertial sensor;
a first operation data transmitting unit that wirelessly transmits first operation data including output data of the touch panel and the inertial sensor to the game device;
a game image receiving unit that sequentially receives the compressed image data from the game device; and
a game image decompression unit that sequentially decompresses the compressed image data to obtain the first game image,
wherein the display unit sequentially displays the first game images obtained by decompression.
2. The game system of claim 1,
the game system further comprises a second operation device,
the second operation device includes a second operation data transmission unit that wirelessly transmits second operation data indicating an operation performed on the second operation device to the game device,
the game device further includes a second operation data receiving unit that receives the second operation data,
the game processing unit executes a game process based on the second operation data.
3. The game system according to claim 1 or 2,
the game device further includes:
a game sound generation unit that generates a first game sound and a second game sound based on the game processing;
a game sound output unit that outputs the second game sound to an external acoustic device that is independent of the first operation device; and
a game sound transmitting unit that wirelessly transmits the first game sound to the first operation device,
the first operation device further includes:
a game sound receiving unit that receives the first game sound from the game device; and
a speaker that outputs the first game sound received by the game sound receiving unit.
4. The game system of claim 1,
the first operation device further includes a microphone,
the first operation data transmitting unit wirelessly transmits data of the sound detected by the microphone to the game device.
5. The game system of claim 1,
the first operation device further includes:
a camera; and
a camera image compression unit which compresses a camera image captured by the camera to generate compressed captured data,
the first operation data transmitting unit further transmits the compressed captured data to the game device by wireless,
the game device further includes a camera image decompression unit configured to decompress the compressed captured data to obtain a camera image.
6. The game system of claim 1,
the first operation device further includes:
a plurality of front operation buttons provided on both sides of the screen on a front plane on which the screen of the display unit and the touch panel are provided; and
direction input units which are provided on both sides of the screen on the front plane and with which a direction can be indicated,
the first operation data further includes data indicating operations performed on the plurality of front operation buttons and the direction input units.
7. The game system according to claim 1 or 6,
the first operation device further includes:
a plurality of back operation buttons provided on a back plane opposite to the front plane on which the screen of the display unit and the touch panel are provided; and
a plurality of side operation buttons provided on a side surface between the front plane and the back plane,
the first operation data further includes data indicating operations performed on the plurality of back operation buttons and the plurality of side operation buttons.
8. The game system of claim 1,
the first operation device further includes a magnetic sensor,
the first operation data further includes data of a detection result of the magnetic sensor.
9. The game system of claim 1,
the inertial sensor includes a three-axis acceleration sensor and a three-axis gyro sensor.
10. The game system of claim 1,
the game device further includes:
a reading unit that reads information from an external recording medium on which a game program is recorded, the external recording medium being attachable to and detachable from the game device;
a network communication unit that can be connected to a network and communicates with an information processing apparatus that can communicate via the network; and
a power supply unit for supplying power from a power supply outside the game device to each unit in the game device,
wherein the game processing unit executes the game processing based on the game program read by the reading unit.
11. A game system comprising a stationary game device and a first operation device,
the game device includes:
a first operation data receiving unit that receives first operation data from the first operation device;
a game processing unit that executes a game process based on the first operation data;
an image generation unit that sequentially generates a first game image and a second game image based on the game processing;
a game image compression unit that sequentially compresses the first game image to generate compressed image data;
a game image transmitting unit that sequentially transmits the compressed image data to the first operation device via wireless;
an image output unit that sequentially outputs the second game images to an external display device that is independent of the first operation device;
a game sound generation unit that generates a first game sound and a second game sound based on the game processing;
a game sound output unit that outputs the second game sound to an external acoustic device that is independent of the first operation device;
a game sound transmitting unit that wirelessly transmits the first game sound to the first operation device;
a camera image decompression unit which decompresses compressed captured data to obtain a camera image;
a reading unit that reads information from an external recording medium on which a game program is recorded, the external recording medium being attachable to and detachable from the game device;
a network communication unit that can be connected to a network and communicates with an information processing apparatus that can communicate via the network; and
a power supply unit that supplies power from a power supply external to the game device to each unit in the game device, wherein the game processing unit executes game processing in accordance with the game program read by the reading unit;
the first operation device includes:
a display unit;
a touch panel provided on a screen of the display unit;
an inertial sensor;
a first operation data transmitting unit that wirelessly transmits first operation data including output data of the touch panel and the inertial sensor to the game device;
a game image receiving unit that sequentially receives the compressed image data from the game device;
a game image decompression unit that sequentially decompresses the compressed image data to obtain the first game image, wherein the display unit sequentially displays the first game image obtained by the decompression;
a game sound receiving unit that receives the first game sound from the game device;
a speaker that outputs the first game sound received by the game sound receiving unit;
a microphone, the first operation data transmitting unit further transmitting data of a sound detected by the microphone to the game device by wireless;
a camera;
a camera image compression unit that compresses a camera image captured by the camera to generate the compressed captured data, wherein the first operation data transmission unit further transmits the compressed captured data to the game device by wireless;
a plurality of front operation buttons provided on both sides of the screen on a front plane on which the screen of the display unit and the touch panel are provided;
direction input units which are provided on both sides of the screen on the front plane and with which directions can be indicated, and the first operation data further includes data indicating operations performed on the plurality of front operation buttons and the direction input units;
a plurality of back operation buttons provided on a back plane opposite to the front plane on which the screen of the display unit and the touch panel are provided;
a plurality of side operation buttons provided on a side surface between the front plane and the back plane, the first operation data further including data indicating operations performed on the plurality of back operation buttons and the side operation buttons;
a magnetic sensor, the first operation data further including data of a detection result of the magnetic sensor; and
the inertial sensor comprises a three-axis acceleration sensor and a three-axis gyro sensor;
wherein the game system further includes a second operation device including a second operation data transmission unit that wirelessly transmits second operation data indicating an operation performed on the second operation device to the game device,
the game device further includes a second operation data receiving unit that receives the second operation data, and the game processing unit executes game processing based on the second operation data.
12. An operation device capable of wireless communication with a stationary game device, the operation device comprising:
a display unit;
a touch panel provided on a screen of the display unit;
an inertial sensor;
an operation data transmitting unit that wirelessly transmits operation data including output data of the touch panel and the inertial sensor to the game device;
a game image receiving unit that sequentially receives, from the game device, compressed image data obtained by applying a compression process to a game image generated based on a game process executed based on the operation data in the game device; and
a game image decompression unit that sequentially decompresses the compressed image data to obtain the game image,
wherein the display unit sequentially displays the game images obtained by decompression.
13. A game processing method executed in a game system including a stationary game device and a first operation device,
the first operation device executes a first operation data transmission step of transmitting first operation data including output data of a touch panel provided on a screen of a display unit provided in the first operation device and output data of an inertial sensor to the game device by wireless,
the game device executes the following steps:
a first operation data receiving step of receiving first operation data from the first operation device;
a game processing step of executing game processing based on the first operation data;
an image generation step of sequentially generating a first game image and a second game image based on the game processing;
a game image compression step of sequentially compressing the first game image to generate compressed image data;
a game image transmission step of sequentially transmitting the compressed image data to the first operation device via wireless; and
an image output step of sequentially outputting the second game images to an external display device independent of the first operation device,
the first operating device further performs the steps of:
a game image receiving step of sequentially receiving the compressed image data from the game device;
a game image decompression step of sequentially decompressing the compressed image data to obtain the first game image; and
a display step of sequentially displaying the first game images obtained by decompression on the display unit.
14. The game processing method of claim 13,
the game system further comprises a second operation device,
the second operation device executes a second operation data transmission step of transmitting second operation data indicating an operation performed on the second operation device to the game device by wireless,
the game device further performs a second operation data receiving step of receiving the second operation data,
in the game processing step, game processing is executed based on the second operation data.
HK12112244.5A 2010-11-01 2012-11-28 Game system, controller device, and game process method HK1171402B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010-245299 2010-11-01
JP2010-245298 2010-11-01
JP2010245298 2010-11-01
JP2010245299A JP4798809B1 (en) 2010-11-01 2010-11-01 Display device, game system, and game processing method

Publications (2)

Publication Number Publication Date
HK1171402A1 HK1171402A1 (en) 2013-03-28
HK1171402B HK1171402B (en) 2014-12-12
