US20180036636A1 - Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor - Google Patents
- Publication number
- US20180036636A1 (application US15/666,769)
- Authority
- US
- United States
- Prior art keywords
- computer
- based data
- display screen
- electronic device
- interest
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1431—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/32—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections
- A63F13/327—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections using wireless networks, e.g. Wi-Fi® or piconet
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
- A63F2300/6072—Methods for processing data by generating or executing the game program for sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
- A63F2300/6081—Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Definitions
- the present disclosure generally relates to a computer having a main display screen for displaying information and an electronic device having a companion display module, to the main display screen, which displays auxiliary information not displayed by the main display screen.
- the computer on which the game is played displays game based graphics as a gamer plays the game.
- graphics displayed generally become more detailed.
- the computer screen generally shows graphics related to the controlled game character.
- graphics related to the controlled game character include, for example, changing scenarios as the game character is moved and/or actions related to the controlled game character.
- activities of the gamer controlled game character need not necessarily be visually perceivable via the computer screen.
- other characters (i.e., other than the aforementioned gamer controlled game character) need not necessarily appear in the same scene as the gamer controlled game character. Therefore, activities (e.g., movement) of these other characters which may be of interest to the gamer may, unfortunately, not be visually perceivable via the computer screen at the current point in time when the gamer controlled game character is shown.
- the gaming experience may suffer, as the capability of the computer screen to provide an appropriate display for sophisticated games (e.g., a Massively Multiplayer Online Role Playing Game) may be limited.
- an electronic device operable with a computer.
- the computer can be configured to run/execute a program which can be associated with graphic based data and audio based data.
- the computer can include a main display screen which can be configured to display information based on graphic based data.
- the electronic device can include a display module (e.g., a supplementary display screen) which can be configured to display auxiliary information.
- Auxiliary information can be related to information being displayed via the main display screen. Moreover, such auxiliary information is not displayed by the main display screen. Specifically, auxiliary information is not visually perceivable via the main display screen at the same point in time the main display screen displays the aforementioned information based on graphic based data. Furthermore, auxiliary information displayed by the display module, but not the main display screen, can be derived based on audio based data.
- auxiliary information displayed via the display module relates to an object of interest.
- the object of interest can be associated with audio based data audibly perceivable via the computer.
- the graphic based data can, for example, be associated with an environment and auxiliary information is associable with approximate location of the object of interest within the environment.
- audio based data associated with the object of interest corresponds to sound effect associated with the object of interest.
- a processing method associated with a system which can include a computer and an electronic device.
- the computer can be configured to signal communicate with the electronic device.
- the processing method can include a selection step where a user of the computer selects an object of interest and an identification step where audio based data associated with the selected object of interest is identified at the computer.
- the processing method can also include a derivation step where supplementary signals are generated, at the computer, based on the identified audio based data.
- the processing method can include a transmission step where generated supplementary signals are communicated from the computer to the electronic device and an output step where received supplementary signals are processed to generate at least one visual cue visually perceivable via the electronic device but not the computer.
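The five steps above (selection, identification, derivation, transmission, output) can be sketched as a minimal pipeline. This is an illustrative sketch only: the function names, the sound-library structure and the per-channel dB values are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch of the five-step processing method; all names and
# data shapes here are illustrative assumptions, not the patent's own.

def identify_audio(object_of_interest, sound_library):
    """Identification step: look up the audio based data associated
    with the selected object of interest."""
    return sound_library[object_of_interest]

def derive_supplementary_signals(channel_levels_db):
    """Derivation step: package the per-channel levels (dB) for the
    identified audio based data into supplementary signals."""
    return {"levels_db": dict(channel_levels_db)}

def transmit(signals):
    """Transmission step: stand-in for sending the supplementary
    signals from the computer to the electronic device."""
    return dict(signals)  # pretend the device received a copy

def output_visual_cue(received):
    """Output step: reduce the received levels to a simple visual cue,
    here just the loudest channel."""
    levels = received["levels_db"]
    return max(levels, key=levels.get)

# Selection step: the gamer picks an enemy target as object of interest.
library = {"enemy": {"FL": 6.0, "FR": 3.0, "RL": 0.0, "RR": 0.0}}
levels = identify_audio("enemy", library)
cue = output_visual_cue(transmit(derive_supplementary_signals(levels)))
# cue is "FL": the loudest channel drives the visual cue.
```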
- FIG. 1 a shows a system which can include a computer which can communicate with an electronic device, according to an embodiment of the disclosure
- FIG. 1 b shows the computer and the electronic device of FIG. 1 a in further detail, according to an embodiment of the disclosure
- FIG. 1 c shows an exemplary scenario where the computer of FIG. 1 a can be a gaming laptop and the electronic device 104 of FIG. 1 a can be a Smartphone, according to an embodiment of the disclosure;
- FIG. 2 shows a flow diagram for a processing method in association with the system of FIG. 1 a , according to an embodiment of the disclosure.
- the present disclosure contemplates an electronic device having a display module which is capable of functioning as, for example, a complementary display.
- the electronic device can be operated with a computer having a screen.
- the screen of the computer can be considered a main display screen and the display module of the electronic device can be considered to be, for example, a supplementary display relative to the main display screen.
- the supplementary display can be in the form of, for example, a supplementary display screen which can be considered to be a complementary display screen to the main display screen.
- the electronic device is a portable type device which can be configured to display, via its display module, information auxiliary to information being displayed by the main display screen. Therefore, the display module can display auxiliary information with reference to information being displayed by the main display screen.
- auxiliary information is not displayed by/visually perceivable via the main display screen and can be derived via audio based data audibly perceivable via the computer as will be discussed with reference to FIG. 1 and FIG. 2 hereinafter.
- FIG. 1 a shows a system 100 in accordance with an embodiment of the disclosure.
- the system 100 can include a computer 102 (e.g., Smartphones, Tablets, Laptops, Desktop computers) which is suitable for gaming.
- the system 100 can further include an electronic device 104 (e.g., Smartphones, Tablets, Laptops, pointing devices such as a mouse, keyboards or another computer similar to the computer 102 ).
- the computer 102 can be coupled to the electronic device 104 such that the computer 102 can communicate with the electronic device 104 . Coupling between the computer 102 and the electronic device 104 can be via one or both of wired coupling and wireless coupling.
- Each of the computer 102 and the electronic device 104 can include a body such as a casing (not shown) shaped and dimensioned to carry, respectively, a screen 102 a and a display module 104 a .
- the screen of the computer 102 can be considered a main display screen 102 a and the display module 104 a of the electronic device 104 can be considered a supplementary display 104 a relative to the main display screen 102 a .
- the main display screen 102 a displays information generated in respect of the computer 102 at a current point in time and the supplementary display 104 a displays information related to that being displayed by the main display screen 102 a , but such related information (i.e., displayed by the supplementary display 104 a ) is not displayed by the main display screen 102 a at that same current point in time. Therefore, it is appreciable that the supplementary display 104 a displays information auxiliary to information being displayed by the main display screen 102 a .
- the supplementary display 104 a can display auxiliary information with reference to information being displayed by the main display screen 102 a .
- the supplementary display 104 a can be considered to be a companion/complementary display (e.g., a companion/complementary display screen) to the main display screen 102 a.
- because auxiliary information is displayed only via the supplementary display screen 104 a and not the main display screen 102 a , and such auxiliary information is related to information being displayed by the main display screen 102 a , it is appreciable that there is a need to derive such auxiliary information.
- auxiliary information can be derived using, for example, audio based data generated by, and audibly perceivable via, the computer 102 .
- the system 100 will be discussed in further detail with reference to FIG. 1 b hereinafter.
- the computer 102 can further include an input portion 102 b , a processing portion 102 c , a visual portion 102 d , an audio portion 102 e and a transceiver portion 102 f .
- the body (e.g., casing) of the computer 102 can be further shaped and dimensioned to carry the input portion 102 b , the processing portion 102 c , the visual portion 102 d , the audio portion 102 e and the transceiver portion 102 f.
- the visual portion 102 d can include a display driver (not shown).
- the audio portion 102 e can include an audio processor 102 g , an audio amplifier 102 h and a plurality of speaker drivers 102 i.
- the input portion 102 b can be coupled to the processing portion 102 c .
- the processing portion 102 c can be coupled to each of the visual portion 102 d , the audio portion 102 e and the transceiver portion 102 f.
- the display driver can be coupled to the processing portion 102 c .
- the display driver can be further coupled to the main display screen 102 a.
- the audio processor 102 g can be coupled to the processing portion 102 c .
- the audio processor 102 g can be further coupled to the audio amplifier 102 h .
- the audio amplifier 102 h can be coupled to the plurality of speaker drivers 102 i .
- the plurality of speaker drivers can include a first speaker driver, a second speaker driver, a third speaker driver and a fourth speaker driver.
- the first speaker driver can, for example, correspond to a front left (i.e., “FL” as indicated in FIG. 1 b ) channel speaker.
- the second speaker driver can, for example, correspond to a front right (i.e., “FR” as indicated in FIG. 1 b ) channel speaker.
- the third speaker driver can, for example, correspond to a rear left (i.e., “RL” as indicated in FIG. 1 b ) channel speaker.
- the fourth speaker driver can, for example, correspond to a rear right (i.e., “RR” as indicated in FIG. 1 b ) channel speaker.
- Audio output from the computer 102 can be via the plurality of speaker drivers 102 i .
- audio output from the computer 102 can include FL audio output, FR audio output, RL audio output and/or RR audio output via the FL channel speaker, the FR channel speaker, the RL channel speaker and/or the RR channel speaker respectively.
- coupling between the computer 102 and the electronic device 104 can be via one or both of wired coupling and wireless coupling.
- the computer 102 can be coupled to the electronic device 104 via a communication network 106 .
- the computer 102 can be configured to wirelessly signal communicate with the electronic device 104 via the communication network 106 .
- the communication network 106 can include, for example, Bluetooth based communication, Infrared based communication and/or Wi-Fi based communication.
- the computer 102 can be coupled to the electronic device 104 via hardwiring.
- the computer 102 can be coupled to the electronic device 104 via a cable (not shown).
- the display module 104 a of the electronic device 104 (i.e., also referable to as the supplementary display 104 a ) can, in one embodiment, be configured for display in accordance with audio output from the computer 102 .
- the supplementary display 104 a can be in the form of a display screen.
- display by the supplementary display 104 a can, for example, be partitioned based on audio output from the computer 102 .
- the supplementary display 104 a can, for example, be partitioned into four sections corresponding to the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker.
- display by the supplementary display 104 a can be partitioned into four equal sections by a vertical axis 108 a and a horizontal axis 108 b , in accordance with an embodiment of the disclosure.
- the display by the supplementary display 104 a can, for example, be partitioned into a first quarter 108 c indicative of FL audio output, a second quarter 108 d indicative of FR audio output, a third quarter 108 e indicative of RL audio output and a fourth quarter 108 f indicative of RR audio output.
- the vertical and horizontal axes 108 a / 108 b are included in this discussion to merely illustrate a possible manner of demarcating display by the supplementary display 104 a and need not necessarily be visually perceivable (i.e., the vertical and horizontal axes 108 a / 108 b can be imaginary lines which are not visually perceivable).
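The four-way partition described above can be modelled by mapping each speaker channel to one quadrant of a coordinate system centred where the (possibly imaginary) vertical and horizontal axes cross. The unit-square model and names below are illustrative assumptions, not the patent's specification.

```python
# Illustrative mapping of the four speaker channels to quadrants of the
# supplementary display, modelled as a unit square centred on (0, 0).
# The sign of each coordinate selects the quadrant, matching the
# vertical/horizontal axes described in the text.
QUADRANT_CENTERS = {
    "FL": (-0.5, +0.5),  # front left  -> upper-left section
    "FR": (+0.5, +0.5),  # front right -> upper-right section
    "RL": (-0.5, -0.5),  # rear left   -> lower-left section
    "RR": (+0.5, -0.5),  # rear right  -> lower-right section
}

def quadrant_for(channel):
    """Return the centre point of the display section assigned to the
    given speaker channel."""
    return QUADRANT_CENTERS[channel]
```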
- the processing portion 102 c can be configured to run/execute a software program. Based on the software program being run, output signals can be generated and communicated from the processing portion 102 c . Output signals can include one or both of visual based output signals and audio based output signals.
- the software program can be associated with one or both of graphics based data and audio based data which can be processed to generate, respectively, visual based output signals and audio based output signals.
- Visual based output signals can be communicated to the visual portion 102 d for further processing (e.g., by the display driver) so that the visual based output signals can, for example, be in a format suitable for display (i.e., visually perceivable by a user) of graphics by the main display screen 102 a.
- Audio based output signals can be communicated to the audio portion 102 e for further processing.
- the audio based output signals can be processed by the audio processor 102 g and the audio amplifier 102 h .
- the processed audio based output signals can be output via the plurality of speaker drivers 102 i for audible perception by a user.
- audio based output signals can include a FL channel signal, a FR channel signal, a RL channel signal and a RR channel signal.
- the FL channel signal, the FR channel signal, the RL channel signal and the RR channel signal can be processed by the audio processor 102 g and the audio amplifier 102 h .
- the processed FL channel signal, the processed FR channel signal, the processed RL channel signal and the processed RR channel signal can be output by the FL channel speaker, the FR channel speaker, the RL channel speaker, the RR channel speaker respectively.
- Outputs from the FL channel speaker, the FR channel speaker, the RL channel speaker, the RR channel speaker can be referred to as FL audio output, FR audio output, RL audio output and RR audio output respectively.
- Input signals can be generated, by a user operating the input portion 102 b (e.g., a keyboard or a pointing device such as a mouse), and communicated to the processing portion 102 c .
- the processing portion 102 c can be configured to process the graphic based data and/or the audio based data based on the input signals to produce, respectively, visual based output signals and/or audio based output signals.
- the processing portion 102 c can be further configured to process one or both of the graphics based data and the audio based data to produce supplementary signals which can be communicated to the electronic device 104 .
- the graphics based data and/or audio based data can be processed by the processing portion 102 c based on the input signals.
- the supplementary signals are communicated to the transceiver portion 102 f which transmits the supplementary signals to the electronic device 104 wirelessly via the communication network 106 .
- auxiliary information can be displayed via the supplementary display 104 a as will be discussed with reference to an exemplary scenario per FIG. 1 c hereinafter.
- the computer 102 can be a gaming laptop and the electronic device 104 can be a Smartphone having a display screen.
- the aforementioned display module/supplementary display 104 a can be in the form of a display screen.
- the aforementioned display module/supplementary display 104 a can be referred to as a supplementary display screen 104 a hereinafter.
- the supplementary display screen 104 a can be configured for coordinate graph type display in which a vertical axis 202 and a horizontal axis 204 section the supplementary display screen 104 a into four equal sections.
- the supplementary display screen 104 a can be partitioned into a first quadrant 206 a indicative of a FL based location, a second quadrant 206 b indicative of a FR based location, a third quadrant 206 c indicative of a RL based location and a fourth quadrant 206 d indicative of a RR based location.
- a point of origin 208 can be displayed.
- the point of origin 208 can be a reference point which is indicative that there is no output from each of the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker in respect of sound effect of interest (i.e., each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can be quantified to be 0 dB by the processing portion 102 c , in relation to the sound effect of interest).
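A minimal check for the point-of-origin condition described above (every channel quantified at 0 dB in relation to the sound effect of interest) might look as follows; the function name and threshold handling are assumptions for illustration.

```python
def at_origin(levels_db, reference_db=0.0):
    """Return True when every channel level for the sound effect of
    interest is at (or below) the 0 dB reference, i.e. the visual cue
    should sit at the point of origin."""
    return all(level <= reference_db for level in levels_db.values())

silent = {"FL": 0.0, "FR": 0.0, "RL": 0.0, "RR": 0.0}
active = {"FL": 6.0, "FR": 3.0, "RL": 0.0, "RR": 0.0}
# at_origin(silent) is True; at_origin(active) is False.
```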
- one or more visual cues 210 a / 210 b (i.e., visual indication(s)) can be displayed via the supplementary display screen 104 a .
- the software program executed can be a game based program.
- the game based program can, for example, be of a war/battle type game genre (i.e., a game which is war/battle themed).
- in a game, there will be one or more game characters and/or a game environment (i.e., graphics based data) which can be visually perceived via the main display screen 102 a .
- a gamer can play the game in accordance with the storyline or game rules.
- there may be a need for the gamer to move one or more game characters in a game environment so as to achieve a certain objective.
- the game can include accompanying game audio (i.e., audio based data) such as background music, soundtracks and/or sound effects which can be audibly perceived via the plurality of speaker drivers 102 i .
- the game character(s) and game environment can correspond to graphics based data.
- the accompanying game audio can correspond to audio based data.
- game characters can include soldiers and the game environment can correspond to a battlefield.
- An objective of the game can be to neutralize enemy targets.
- Gamer control can be via the input portion 102 b (i.e., input signals generated by a gamer using a keyboard) to, for example, move the soldier(s) and/or to shoot at an enemy target.
- Sound effects of the game can include gunshots and footsteps (e.g., as the gamer moves the soldiers through the battlefield and/or as the enemy targets move through the battlefield). Sound effects can further include gunshots as enemy targets shoot at soldiers controlled by the gamer.
- the main display screen 102 a can be configured to display the battlefield, soldiers, etc. (i.e., information displayed by the main display screen 102 a ).
- audio based data can be associated with game audio.
- audio based data can include, for example, background music, soundtracks and/or sound effects.
- information displayed by the main display screen 102 a can include the game environment, the soldier(s) and/or enemy target(s). Information displayed by the main display screen 102 a can further include movement of the soldier(s), movement of the enemy target(s) and/or changes in the game environment as the gamer moves the soldier(s) at a current point in time during gameplay.
- the processing portion 102 c can, for example, be configured to provide an indication and/or perform calculations based on audio based data associated with the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal.
- Audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can, for example, be quantified in decibels (dB).
- the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can be indicative of loudness of the FL audio output, the FR audio output, the RL audio output and the RR audio output respectively.
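Quantifying the loudness of a channel signal in decibels is commonly done from the signal's RMS amplitude against a reference level. The patent does not specify the formula, so the sketch below is a generic assumption rather than the patent's method.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a block of channel samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_db(samples, reference=1.0):
    """Loudness of a channel signal in dB relative to `reference`.
    A silent block is reported as -inf rather than raising an error."""
    r = rms(samples)
    if r == 0.0:
        return float("-inf")  # silence: no contribution to the cue
    return 20.0 * math.log10(r / reference)
```

With this convention, a channel whose RMS amplitude equals the reference sits at 0 dB, and doubling the amplitude adds roughly 6 dB, which matches the 0 dB / 3 dB / 6 dB figures used in the examples below.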
- the supplementary signals communicated from the computer 102 to the electronic device 104 can be derived based on audio based data.
- Audio based data can be based on a sound effect of interest to a gamer.
- the sound effect of interest to a gamer can be associated with an object of interest to the gamer.
- An object of interest can, for example, include a movable game character such as an enemy target.
- the processing portion 102 c can be configured to quantify (e.g., in dB) audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal at any point in time when the game is being played by a gamer.
- when a sound effect of interest (e.g., gunshots from an enemy target and/or footsteps of an enemy target) has been identified by a gamer, the processing portion 102 c can be configured to quantify, in dB, audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal in relation to that sound effect of interest.
- supplementary signals can be generated by the processing portion 102 c and communicated from the computer 102 to the electronic device 104 .
- a sound effect of interest can be identified by a gamer by manner of, for example, selection of an object of interest as will be discussed later in further detail with reference to FIG. 2 .
- the processing portion 102 c can be configured to quantify (e.g., in dB) audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal at any point in time when the game is being played by a gamer.
- the processing portion 102 c can be further configured to compare audio based data associated with one channel signal (e.g., FL channel signal) with audio based data associated with another channel signal (e.g., FR channel signal).
- a sound effect (e.g., gunshots from an enemy target and/or footsteps of an enemy target) of interest can be identified by a gamer, and the processing portion 102 c can be configured to quantify and compare audio based data associated with the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal in relation to that sound effect of interest.
- a sound effect of interest can be identified by a gamer by way of, for example, selection of an object of interest, as will be discussed later in further detail with reference to FIG. 2 .
- audio based data associated with the FL channel signal can be quantified to be 6 dB
- audio based data associated with the FR channel signal can be quantified to be 3 dB
- audio based data associated with the RL channel signal can be quantified to be 0 dB
- audio based data associated with RR channel signal can be quantified to be 0 dB.
- the processing portion 102 c can be configured to compare audio based data associated with the FL and FR channel signals (since audio based data associated with the RL and RR channel signals, being 0 dB, can be disregarded).
- supplementary signals indicating the location of the sound effect of interest during gameplay can be generated. Specifically, given that audio based data associated with the FL channel signal is quantified to be 6 dB whereas audio based data associated with the FR channel signal is quantified to be 3 dB, supplementary signals can be generated indicating that, for example, enemy target gunshots can be heard near the front (i.e., between FL and FR channels) and closer to the left side (i.e., since audio based data associated with the FL channel signal is 6 dB and audio based data associated with the FR channel signal is 3 dB).
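The comparison described above can be sketched as follows (the function name and dictionary layout are illustrative assumptions, not part of the disclosure):

```python
def locate_sound_effect(levels):
    """Given per-channel dB levels for the FL, FR, RL and RR channel
    signals (in relation to the sound effect of interest), return a
    coarse (front/rear, left/right) location, disregarding channels
    quantified at 0 dB as in the example above."""
    active = {ch: db for ch, db in levels.items() if db > 0.0}
    if not active:
        return None  # sound effect of interest not currently audible
    front = active.get("FL", 0.0) + active.get("FR", 0.0)
    rear = active.get("RL", 0.0) + active.get("RR", 0.0)
    left = active.get("FL", 0.0) + active.get("RL", 0.0)
    right = active.get("FR", 0.0) + active.get("RR", 0.0)
    return ("front" if front >= rear else "rear",
            "left" if left >= right else "right")
```

With the figures above (FL at 6 dB, FR at 3 dB, RL and RR at 0 dB), this yields a front-left location, matching the narrated result.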
- audio based data associated with the FL channel signal can be quantified to be 0 dB
- audio based data associated with the FR channel signal can be quantified to be 5 dB
- audio based data associated with the RL channel signal can be quantified to be 0 dB
- audio based data associated with the RR channel signal can be quantified to be 2 dB.
- the processing portion 102 c can be configured to compare audio based data associated with the FR and RR channel signals (since audio based data associated with the FL and RL channel signals, being 0 dB, can be disregarded).
- supplementary signals indicating the location of the sound effect of interest during gameplay can be generated. Specifically, given that audio based data associated with the FR channel signal is quantified to be 5 dB whereas audio based data associated with the RR channel signal is quantified to be 2 dB, supplementary signals can be generated indicating that, for example, enemy target gunshots can be heard near the right side (i.e., between FR and RR channels) and closer to the front (i.e., since audio based data associated with the FR channel signal is 5 dB and audio based data associated with the RR channel signal is 2 dB).
- the supplementary signals can be indicative of an approximate location of an enemy target even though the enemy target may not necessarily be visually perceivable via the main display screen 102 a at a particular point in time during gameplay when the sound effect of interest (i.e., gunshots from an enemy target) can be audibly perceived.
- an indication of an approximate location of an object of interest can be provided via the supplementary display screen 104 a based on audio based data even though the object of interest is not visually perceivable via the main display screen 102 a at a current point in time during gameplay. Therefore, auxiliary information displayed via the supplementary display screen 104 a can relate to the aforementioned approximate location of an object of interest not shown/displayed by the main display screen 102 a.
- the supplementary signals communicated from the computer 102 to the electronic device 104 can be received and processed by the electronic device 104 so as to provide at least one indication (i.e., visual cue(s) 210 a / 210 b ) of, for example, an approximate location of an object of interest (e.g., an enemy target).
- audio based data associated with the FL channel signal can be quantified to be 6 dB
- audio based data associated with the FR channel signal can be quantified to be 3 dB
- audio based data associated with each of the RL channel signal and the RR channel signal can be quantified to be 0 dB in relation to the sound effect of interest
- the supplementary signals can indicate that the object of interest (e.g., the enemy target) is approximately located near the front (i.e., between FL and FR channels) closer to the left side (i.e., since audio based data associated with the FL channel signal is 6 dB and audio based data associated with the FR channel signal is 3 dB).
- the supplementary display screen 104 a can display an indication such as a visual cue 210 a somewhere in the first quadrant 206 a indicating as such.
- audio based data associated with the FL channel signal can be quantified to be 0 dB
- audio based data associated with the FR channel signal can be quantified to be 5 dB
- audio based data associated with the RL channel signal can be quantified to be 0 dB
- audio based data associated with RR channel signal can be quantified to be 2 dB in relation to the sound effect of interest
- the supplementary signals can indicate that the object of interest (e.g., the enemy target) is approximately located at the right side (i.e., between FR and RR channels) closer to the front (i.e., since audio based data associated with the FR channel signal is 5 dB and audio based data associated with the RR channel signal is 2 dB).
- the supplementary display screen 104 a can display an indication such as a visual cue 210 b somewhere in the second quadrant 206 b indicating as such.
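A sketch of mapping the quantified channel signals to the quadrant of the supplementary display screen 104 a in which a visual cue such as 210 a / 210 b would be placed (the function name is an assumption; quadrant labels 206 a - 206 d follow the partitioning discussed above):

```python
def cue_quadrant(levels):
    """Map per-channel dB levels (in relation to the sound effect of
    interest) to the quadrant of the supplementary display screen in
    which a visual cue such as 210a/210b would be displayed."""
    quadrant_for = {"FL": "206a", "FR": "206b", "RL": "206c", "RR": "206d"}
    loudest = max(levels, key=lambda ch: levels[ch])
    if levels[loudest] <= 0.0:
        return None  # all channels at 0 dB: no cue displayed
    return quadrant_for[loudest]
```

For the two examples above this selects the first quadrant 206 a (FL loudest) and the second quadrant 206 b (FR loudest) respectively.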
- the source of the sound effect of interest may not necessarily be shown (i.e., visually perceivable) on the main display screen 102 a even if the sound effect of interest can be audibly perceived at a particular point in time when the game is played.
- information displayed by the main display screen 102 a can include the game character(s) moved by the gamer and the game environment, whereas auxiliary information such as the object of interest (e.g., a movable enemy target whose movement may be computer controlled), which is not displayed by the main display screen 102 a when the associated sound effect of interest can be audibly perceived, can be displayed by the supplementary display screen 104 a .
- the gamer can still be provided with an approximate location of the object of interest in the game environment by virtue of the aforementioned visual cue(s) 210 a / 210 b displayed by the supplementary display screen 104 a.
- since the approximate location of the object of interest in the game environment can be provided by displaying the aforementioned visual cue(s) 210 a / 210 b via the supplementary display screen 104 a , it is appreciable that there is no need to provision any additional space on the main display screen 102 a for display of such auxiliary information.
- by displaying auxiliary information via the supplementary display screen 104 a instead of the main display screen 102 a , there will be no such visual distractions shown on the main display screen 102 a which may detract from the gaming experience. That is, a gamer can have access to desired auxiliary information and yet still fully enjoy the intended display (the movable game character controlled by the gamer, the game environment etc.) on the main display screen 102 a without being distracted by display of auxiliary information.
- FIG. 2 shows a flow diagram for a processing method 300 in association with the system 100 of FIG. 1 in accordance with an embodiment of the disclosure.
- the system 100 includes a computer 102 and an electronic device 104 .
- the computer 102 is communicable with the electronic device 104 .
- the processing method 300 can include a selection step 302 , an identification step 304 , a derivation step 306 , a transmission step 308 and an output step 310 .
- a graphical user interface (GUI) can be provided for selection by a user of the computer 102 .
- the GUI can be displayed by the main display screen 102 a .
- a gamer can be provided with an option to select a desired object of interest (e.g., enemy target).
- the GUI can, for example, be presented to the gamer as the game program is initialized.
- Selection of an object of interest can be by way of the gamer operating the input portion 102 b (e.g., a keyboard or a pointing device such as a mouse) so that input signals communicated to the processing portion 102 c can be indicative of the gamer's selected object of interest. Therefore, a user of the computer 102 can select an object of interest.
- the processing portion 102 c can be configured to identify audio based data associated with the selected object of interest.
- the processing portion 102 c can be configured to identify the relevant sound effect(s) associated with the selected object of interest. Therefore, audio based data associated with the selected object of interest can be identified at the computer 102 .
- the processing portion 102 c can be configured to generate supplementary signals based on the identified audio based data. Therefore, supplementary signals can be generated, at the computer 102 , based on the identified audio based data.
- supplementary signals communicated from the processing portion 102 c to the transceiver portion 102 f can, in turn, be transmitted to the electronic device 104 . Therefore, generated supplementary signals can be communicated from the computer 102 to the electronic device 104 .
- supplementary signals received by the electronic device 104 can be further processed so that visual cue(s) 210 a / 210 b can be displayed via the display module 104 a .
- the gamer can visually perceive auxiliary information displayed by the display module 104 a . Therefore, received supplementary signals can be processed to generate auxiliary information which can be visually perceivable via the electronic device 104 but not the computer 102 .
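The steps 302 to 310 above can be sketched end-to-end as follows (the function name, the `sound_effect_library` structure and the `transmit`/`display` callables are assumptions of this illustration, standing in for the transceiver portion 102 f and the display module 104 a ):

```python
def processing_method_300(selected_object, sound_effect_library,
                          transmit, display):
    """End-to-end sketch of steps 302-310 of the processing method 300."""
    # Identification step 304: audio based data for the selected object.
    sound_effects = sound_effect_library[selected_object]
    # Derivation step 306: supplementary signals from identified data.
    supplementary_signals = {"object": selected_object,
                             "effects": sound_effects}
    # Transmission step 308: computer 102 -> electronic device 104.
    transmit(supplementary_signals)
    # Output step 310: device-side processing yields visual cue(s).
    display(supplementary_signals)
    return supplementary_signals
```

The selection step 302 is represented by the `selected_object` argument, which would be obtained via the GUI discussed above.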
- predictive based visual cues can be provided by analyzing historical gameplay data (e.g., analyzing the history of indications 210 a / 210 b displayed by the supplementary display screen 104 a ) or data marked up by a global community (e.g., in the case of a Massively Multiplayer Online Role Playing Game). Predictive based visual cues can be generated via, for example, Kalman filter based signal processing (e.g., by the processing portion 102 c ).
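A minimal, purely illustrative sketch of Kalman filtering over a history of cue coordinates (a one-dimensional static-state model; the disclosure does not specify the filter design, and the function name and noise parameters are assumptions):

```python
def kalman_smooth(history, q=1e-3, r=0.5):
    """Run a minimal 1D Kalman filter over a history of cue coordinates
    (e.g., the x coordinates of past indications 210a/210b) and return
    the filtered estimate, usable as a predictive cue position."""
    estimate, variance = history[0], 1.0
    for measurement in history[1:]:
        variance += q                                 # predict (static state)
        gain = variance / (variance + r)              # Kalman gain
        estimate += gain * (measurement - estimate)   # measurement update
        variance *= (1.0 - gain)
    return estimate
```

Running one such filter per display axis would smooth the jitter in successive cue positions before display.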
- options can be presented, via the display module 104 a , to a gamer for running macros such as a series of keystrokes or commands.
- six graphical buttons, each customizable by the gamer to turn on/off (i.e., to activate or to deactivate) certain functions, trigger certain activities and/or trigger certain actions associated with a game run/executed at the computer 102 side, can be presented via the display module 104 a .
- one of the six graphical buttons can be customized to trigger an action by the gamer controlled character (e.g., scouting mode, stealth mode and/or to run from/evade enemy target fire).
- the graphical button(s) can effectively function as shortcut key(s) for game control during gameplay.
- the electronic device 104 can be used as a complementary control device during gameplay and can enhance the gaming experience by facilitating ease of control of a game character when a gamer is playing a game.
- the display module 104 a can be in other forms.
- the display module 104 a can be in the form of a light emitting diode (LED) array where an appropriate LED is lit according to the supplementary signals received and processed by the electronic device 104 .
- an LED located at the first quadrant 206 a can be lit if the supplementary signals indicate that the object of interest is located approximately near the front and closer to the left side.
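Assuming a 2x2 LED layout matching the four quadrants (an illustrative assumption; the disclosure does not fix the array layout, and the function name is hypothetical), selecting the LED to light can be sketched as:

```python
def led_to_light(levels):
    """Pick which LED in a 2x2 LED array to light, with indices 0-3
    corresponding to the quadrants 206a (front-left), 206b (front-right),
    206c (rear-left) and 206d (rear-right)."""
    order = ("FL", "FR", "RL", "RR")
    loudest = max(order, key=lambda ch: levels.get(ch, 0.0))
    if levels.get(loudest, 0.0) <= 0.0:
        return None  # nothing audible in relation to the sound effect
    return order.index(loudest)
```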
- although the system 100 has been discussed in the context of a four channel output (e.g., FR channel, FL channel, RL channel and RR channel) system, the system 100 can be based on any number of channel outputs.
- the system 100 can be based on six channel outputs where the plurality of speaker drivers 102 i further includes another two speaker drivers (e.g., a top channel speaker and a side channel speaker) in addition to the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker.
Description
- The present disclosure generally relates to a computer having a main display screen for displaying information and an electronic device having a companion display module, to the main display screen, which displays auxiliary information not displayed by the main display screen.
- Generally, in gaming, the computer on which the game is played displays game based graphics as a gamer plays the game.
- As games become more sophisticated, graphics displayed generally become more detailed. Moreover, as gamers play a game by controlling a game character, it is appreciable that the computer screen generally shows graphics related to the controlled game character. Such related graphics include, for example, changing scenarios as the game character is moved and/or actions related to the controlled game character.
- However, it is appreciable that at the same time the activities of the gamer controlled game character are shown on the computer screen, activities of other characters in the game, which may be computer controlled and/or controlled by another gamer (e.g., in the case of a Massively Multiplayer Online Role Playing Game), need not necessarily be visually perceivable via the computer screen. For example, other characters (i.e., other than the aforementioned gamer controlled game character) may be moving in the background and do not appear in the scene where the aforementioned gamer controlled game character appears at a current point in time. Specifically, the other characters need not necessarily appear in the same scene as the gamer controlled game character. Therefore, activities (e.g., movement) of these other characters which may be of interest to the gamer may, unfortunately, not be visually perceivable via the computer screen at a current point in time when the gamer controlled game character is shown.
- Therefore, the gaming experience may be diminished, as the capability of the computer screen to appropriately provide display for sophisticated games (e.g., a Massively Multiplayer Online Role Playing Game) may be limited.
- It is therefore desirable to provide a solution to address the foregoing problem.
- In accordance with an aspect of the disclosure, there is provided an electronic device operable with a computer. The computer can be configured to run/execute a program which can be associated with graphic based data and audio based data.
- The computer can include a main display screen which can be configured to display information based on graphic based data.
- The electronic device can include a display module (e.g., a supplementary display screen) which can be configured to display auxiliary information.
- Auxiliary information can be related to information being displayed via the main display screen. Moreover, such auxiliary information is not displayed by the main display screen. Specifically, auxiliary information is not visually perceivable via the main display screen at the same point in time the main display screen displays the aforementioned information based on graphic based data. Furthermore, auxiliary information displayed by the display module, but not the main display screen, can be derived based on audio based data.
- In one embodiment, auxiliary information displayed via the display module relates to an object of interest. The object of interest can be associated with audio based data audibly perceivable via the computer. The graphic based data can, for example, be associated with an environment, and auxiliary information is associable with an approximate location of the object of interest within the environment. Moreover, audio based data associated with the object of interest corresponds to a sound effect associated with the object of interest.
- In accordance with another aspect of the disclosure, there is provided a processing method associated with a system which can include a computer and an electronic device. The computer can be configured to signal communicate with the electronic device.
- The processing method can include a selection step where a user of the computer selects an object of interest and an identification step where audio based data associated with the selected object of interest is identified at the computer.
- The processing method can also include a derivation step where supplementary signals are generated, at the computer, based on the identified audio based data.
- Moreover, the processing method can include a transmission step where generated supplementary signals are communicated from the computer to the electronic device and an output step where received supplementary signals are processed to generate at least one visual cue visually perceivable via the electronic device but not the computer.
- Embodiments of the disclosure are described hereinafter with reference to the following drawings, in which:
FIG. 1a shows a system which can include a computer which can communicate with an electronic device, according to an embodiment of the disclosure; -
FIG. 1b shows the computer and the electronic device of FIG. 1a in further detail, according to an embodiment of the disclosure; -
FIG. 1c shows an exemplary scenario where the computer of FIG. 1a can be a gaming laptop and the electronic device 104 of FIG. 1a can be a Smartphone, according to an embodiment of the disclosure; and -
FIG. 2 shows a flow diagram for a processing method in association with the system of FIG. 1a , according to an embodiment of the disclosure. - Representative embodiments of the disclosure, for addressing the foregoing problem(s), are described hereinafter with reference to
FIG. 1 and FIG. 2 . - Specifically, the present disclosure contemplates an electronic device having a display module which is capable of functioning as, for example, a complementary display. The electronic device can be operated with a computer having a screen. In this regard, the screen of the computer can be considered a main display screen and the display module of the electronic device can be considered to be, for example, a supplementary display relative to the main display screen. The supplementary display can be in the form of, for example, a supplementary display screen which can be considered to be a complementary display screen to the main display screen. Preferably, the electronic device is a portable type device which can be configured to display, via its display module, information auxiliary to information being displayed by the main display screen. Therefore, the display module can display auxiliary information with reference to information being displayed by the main display screen. Preferably, such auxiliary information is not displayed by/visually perceivable via the main display screen and can be derived via audio based data audibly perceivable via the computer as will be discussed with reference to
FIG. 1 and FIG. 2 hereinafter. -
FIG. 1a shows a system 100 in accordance with an embodiment of the disclosure. The system 100 can include a computer 102 (e.g., Smartphones, Tablets, Laptops, Desktop computers) which is suitable for gaming. The system 100 can further include an electronic device 104 (e.g., Smartphones, Tablets, Laptops, pointing devices such as a mouse, keyboards or another computer similar to the computer 102). The computer 102 can be coupled to the electronic device 104 such that the computer 102 can communicate with the electronic device 104. Coupling between the computer 102 and the electronic device 104 can be via one or both of wired coupling and wireless coupling. - Each of the
computer 102 and the electronic device 104 can include a body such as a casing (not shown) shaped and dimensioned to carry, respectively, a screen 102 a and a display module 104 a. The screen of the computer 102 can be considered a main display screen 102 a and the display module 104 a of the electronic device 104 can be considered a supplementary display 104 a relative to the main display screen 102 a. In general, the main display screen 102 a displays information generated in respect of the computer 102 at a current point in time and the supplementary display 104 a displays information related to that being displayed by the main display screen 102 a, but such related information (i.e., displayed by the supplementary display 104 a) is not displayed by the main display screen 102 a at that same current point in time. Therefore, it is appreciable that the supplementary display 104 a displays information auxiliary to information being displayed by the main display screen 102 a. Specifically, the supplementary display 104 a can display auxiliary information with reference to information being displayed by the main display screen 102 a. As such, the supplementary display 104 a can be considered to be a companion/complementary display (e.g., a companion/complementary display screen) to the main display screen 102 a. - Since auxiliary information is displayed only via the
supplementary display screen 104 a and not the main display screen 102 a and such auxiliary information is related to information being displayed by the main display screen 102 a, it is appreciable that there is a need to derive such auxiliary information. As will be discussed in further detail with reference to an exemplary scenario, such auxiliary information can be derived using, for example, audio based data generated by, and audibly perceivable via, the computer 102. - The
system 100 will be discussed in further detail with reference to FIG. 1b hereinafter. - Referring to
FIG. 1b , the computer 102 can further include an input portion 102 b, a processing portion 102 c, a visual portion 102 d, an audio portion 102 e and a transceiver portion 102 f. The body (e.g., casing) of the computer 102 can be further shaped and dimensioned to carry the input portion 102 b, the processing portion 102 c, the visual portion 102 d, the audio portion 102 e and the transceiver portion 102 f. - The
visual portion 102 d can include a display driver (not shown). The audio portion 102 e can include an audio processor 102 g, an audio amplifier 102 h and a plurality of speaker drivers 102 i. - The
input portion 102 b can be coupled to the processing portion 102 c. The processing portion 102 c can be coupled to each of the visual portion 102 d, the audio portion 102 e and the transceiver portion 102 f. - In regard to the
visual portion 102 d, the display driver can be coupled to the processing portion 102 c. The display driver can be further coupled to the main display screen 102 a. - In regard to the
audio portion 102 e, the audio processor 102 g can be coupled to the processing portion 102 c. The audio processor 102 g can be further coupled to the audio amplifier 102 h. The audio amplifier 102 h can be coupled to the plurality of speaker drivers 102 i. In one example, the plurality of speaker drivers can include a first speaker driver, a second speaker driver, a third speaker driver and a fourth speaker driver. The first speaker driver can, for example, correspond to a front left (i.e., “FL” as indicated in FIG. 1b ) channel speaker. The second speaker driver can, for example, correspond to a front right (i.e., “FR” as indicated in FIG. 1b ) channel speaker. The third speaker driver can, for example, correspond to a rear left (i.e., “RL” as indicated in FIG. 1b ) channel speaker. The fourth speaker driver can, for example, correspond to a rear right (i.e., “RR” as indicated in FIG. 1b ) channel speaker. Audio output from the computer 102 can be via the plurality of speaker drivers 102 i. In a more specific example, audio output from the computer 102 can include FL audio output, FR audio output, RL audio output and/or RR audio output via the FL channel speaker, the FR channel speaker, the RL channel speaker and/or the RR channel speaker respectively. - As mentioned earlier, coupling between the
computer 102 and the electronic device 104 can be via one or both of wired coupling and wireless coupling. - In regard to wireless coupling, the
computer 102 can be coupled to the electronic device 104 via a communication network 106. Specifically, the computer 102 can be configured to wirelessly signal communicate with the electronic device 104 via the communication network 106. The communication network 106 can include, for example, Bluetooth based communication, Infrared based communication and/or Wi-Fi based communication. - In regard to wired coupling, the
computer 102 can be coupled to the electronic device 104 via hardwiring. For example, the computer 102 can be coupled to the electronic device 104 via a cable (not shown). - The
electronic device 104 display module 104 a (i.e., also referable as supplementary display 104 a) can, in one embodiment, be configured for display in accordance with audio output from the computer 102. Moreover, the supplementary display 104 a can be in the form of a display screen. - More specifically, display by the
supplementary display 104 a can, for example, be partitioned based on audio output from the computer 102. Yet more specifically, the supplementary display 104 a can, for example, be partitioned into four sections corresponding to the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker. - As shown, display by the
supplementary display 104 a can be partitioned into four equal sections by a vertical axis 108 a and a horizontal axis 108 b, in accordance with an embodiment of the disclosure. In this regard, the display by the supplementary display 104 a can, for example, be partitioned into a first quarter 108 c indicative of FL audio output, a second quarter 108 d indicative of FR audio output, a third quarter 108 e indicative of RL audio output and a fourth quarter 108 f indicative of RR audio output. It should be noted that the vertical and horizontal axes 108 a/108 b are included in this discussion to merely illustrate a possible manner of demarcating display by the supplementary display 104 a and need not necessarily be visually perceivable (i.e., the vertical and horizontal axes 108 a/108 b can be imaginary lines which are not visually perceivable). - Operationally, the
processing portion 102 c can be configured to run/execute a software program. Based on the software program being run, output signals can be generated and communicated from the processing portion 102 c. Output signals can include one or both of visual based output signals and audio based output signals. In this regard, the software program can be associated with one or both of graphics based data and audio based data which can be processed to generate, respectively, visual based output signals and audio based output signals. - Visual based output signals can be communicated to the
visual portion 102 d for further processing (e.g., by the display driver) so that the visual based output signals can, for example, be in a format suitable for display (i.e., visually perceivable by a user) of graphics by the main display screen 102 a. - Audio based output signals can be communicated to the
audio portion 102 e for further processing. Specifically, the audio based output signals can be processed by the audio processor 102 g and the audio amplifier 102 h. The processed audio based output signals can be output via the plurality of speaker drivers 102 i for audible perception by a user. For example, audio based output signals can include a FL channel signal, a FR channel signal, a RL channel signal and a RR channel signal. The FL channel signal, the FR channel signal, the RL channel signal and the RR channel signal can be processed by the audio processor 102 g and the audio amplifier 102 h. The processed FL channel signal, the processed FR channel signal, the processed RL channel signal and the processed RR channel signal can be output by the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker respectively. Outputs from the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker can be referred to as FL audio output, FR audio output, RL audio output and RR audio output respectively. - Input signals can be generated, by a user operating the
input portion 102 b (e.g., a keyboard or a pointing device such as a mouse), and communicated to the processing portion 102 c. The processing portion 102 c can be configured to process the graphic based data and/or the audio based data based on the input signals to produce, respectively, visual based output signals and/or audio based output signals. - The
processing portion 102 c can be further configured to process one or both of the graphics based data and the audio based data to produce supplementary signals which can be communicated to the electronic device 104. The graphics based data and/or audio based data can be processed by the processing portion 102 c based on the input signals. Preferably, the supplementary signals are communicated to the transceiver portion 102 f which transmits the supplementary signals to the electronic device 104 wirelessly via the communication network 106. - Based on the supplementary signals, auxiliary information can be displayed via the
supplementary display 104 a as will be discussed with reference to an exemplary scenario per FIG. 1c hereinafter. - Referring to
FIG. 1c , in one exemplary scenario 200, the computer 102 can be a gaming laptop and the electronic device 104 can be a Smartphone having a display screen. In this regard, the aforementioned display module/supplementary display 104 a can be in the form of a display screen. Hence, the aforementioned display module/supplementary display 104 a can be referred to as a supplementary display screen 104 a hereinafter. - The
supplementary display screen 104 a can be configured for coordinate graph type display in which a vertical axis 202 and a horizontal axis 204 section the supplementary display screen 104 a into four equal sections. In this regard, the supplementary display screen 104 a can be partitioned into a first quadrant 206 a indicative of a FL based location, a second quadrant 206 b indicative of a FR based location, a third quadrant 206 c indicative of a RL based location and a fourth quadrant 206 d indicative of a RR based location. Additionally, a point of origin 208 can be displayed. The point of origin 208 can be a reference point which is indicative that there is no output from each of the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker in respect of the sound effect of interest (i.e., each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can be quantified to be 0 dB by the processing portion 102 c, in relation to the sound effect of interest). Moreover, one or more visual cues 210 a/210 b (i.e., visual indication(s)) can be displayed based on the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal as will be discussed later in further detail. - The software program executed can be a game based program. The game based program can, for example, be of a war/battle type game genre (i.e., a game which is war/battle themed).
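The coordinate graph type display described above can be sketched as follows (a non-limiting illustration; the function name, screen dimensions and scaling are assumptions, not part of the disclosure):

```python
def cue_position(levels, width=320, height=240):
    """Place a visual cue on a coordinate graph type display with the
    point of origin 208 at the centre. Per-channel dB levels pull the
    cue toward the corresponding quadrant; all-zero levels leave it at
    the origin. Screen dimensions are illustrative assumptions."""
    fl, fr = levels.get("FL", 0.0), levels.get("FR", 0.0)
    rl, rr = levels.get("RL", 0.0), levels.get("RR", 0.0)
    total = fl + fr + rl + rr
    cx, cy = width / 2.0, height / 2.0
    if total == 0:
        return (cx, cy)  # point of origin 208
    # Positive x lies right of the vertical axis 202; screen y grows downward.
    x = cx + (cx / 2.0) * ((fr + rr) - (fl + rl)) / total
    y = cy - (cy / 2.0) * ((fl + fr) - (rl + rr)) / total
    return (x, y)
```

With the earlier figures (FL at 6 dB, FR at 3 dB), the cue lands in the first quadrant 206 a; with FR at 5 dB and RR at 2 dB, it lands in the second quadrant 206 b.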
- Usually, in a game, there will be one or more game characters and/or a game environment (i.e., graphics based data) which can be visually perceived via the
main display screen 102 a. In the game, there can be a storyline or game rules and a gamer can play the game in accordance with the storyline or game rules. For example, there may be a need for the gamer to move one or more game characters in a game environment so as to achieve a certain objective. Appreciably, in a game, there can be one or more movable game characters and/or one or more stationary game characters. The movable game character(s) can be moved in accordance with gamer control to achieve a certain objective in the game. Furthermore, the game can include accompanying game audio (i.e., audio based data) such as background music, soundtracks and/or sound effects which can be audibly perceived via the plurality of speaker drivers 102 i. The game character(s) and game environment can correspond to graphics based data. The accompanying game audio can correspond to audio based data. - Specifically, in a game which is war themed, game characters can include soldiers and the game environment can correspond to a battlefield. An objective of the game can be to neutralize enemy targets. Gamer control can be via the
input portion 102 b (i.e., input signals generated by a gamer using a keyboard) to, for example, move the soldier(s) and/or to shoot at an enemy target. Sound effects of the game can include gunshots and footsteps (e.g., as the gamer moves the soldiers through the battlefield and/or as the enemy targets move through the battlefield). Sound effects can further include gunshots as enemy targets shoot at soldiers controlled by the gamer. The main display screen 102 a can be configured to display the battlefield and soldiers etc. (i.e., information displayed by the main display screen 102 a). - In this regard, audio based data can be associated with game audio. Specifically, audio based data can include, for example, background music, soundtracks and/or sound effects.
- Additionally, information displayed by the
main display screen 102 a can include the game environment, the soldier(s) and/or enemy target(s). Information displayed by the main display screen 102 a can further include movement of the soldier(s), movement of the enemy target(s) and/or changes in the game environment as the gamer moves the soldier(s) at a current point in time during gameplay. - The
processing portion 102 c can, for example, be configured to provide an indication and/or perform calculations based on audio based data associated with the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal. Audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can, for example, be quantified in decibels (dB). Moreover, the FL channel signal, FR channel signal, RL channel signal and the RR channel signal can be indicative of loudness of the FL audio output, the FR audio output, the RL audio output and the RR audio output respectively. - Preferably, the supplementary signals communicated from the
computer 102 to the electronic device 104 can be derived based on audio based data. Audio based data can be based on a sound effect of interest to a gamer. The sound effect of interest to a gamer can be associated with an object of interest to the gamer. An object of interest can, for example, include a movable game character such as an enemy target. - In one embodiment, the
processing portion 102 c can be configured to quantify (e.g., in dB) audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal at any point in time when the game is being played by a gamer. Specifically, a sound effect (e.g., gunshots from an enemy target and/or footsteps of an enemy target) of interest can be identified by a gamer and the processing portion 102 c can be configured to quantify, in dB, audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal in relation to the sound effect of interest (e.g., gunshots from an enemy target and/or footsteps of an enemy target). Based on such quantification, in relation to the sound effect of interest, of audio based data associated with the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal, supplementary signals can be generated by the processing portion 102 c and communicated from the computer 102 to the electronic device 104. A sound effect of interest can be identified by a gamer by manner of, for example, selection of an object of interest as will be discussed later in further detail with reference to FIG. 2 . - In another embodiment, the
processing portion 102 c can be configured to quantify (e.g., in dB) audio based data associated with each of the FL channel signal, FR channel signal, RL channel signal and the RR channel signal at any point in time when the game is being played by a gamer. The processing portion 102 c can be further configured to compare audio based data associated with one channel signal (e.g., FL channel signal) with audio based data associated with another channel signal (e.g., FR channel signal). Specifically, a sound effect (e.g., gunshots from an enemy target and/or footsteps of an enemy target) of interest can be identified by a gamer, and the processing portion 102 c can be configured to quantify and compare audio based data associated with the FL channel signal, FR channel signal, RL channel signal and/or the RR channel signal in relation to the sound effect of interest (e.g., gunshots from an enemy target and/or footsteps of an enemy target). A sound effect of interest can be identified by a gamer by manner of, for example, selection of an object of interest as will be discussed later in further detail with reference to FIG. 2 . - In one example, in relation to the sound effect of interest audibly perceivable at a point in time during gameplay, audio based data associated with the FL channel signal can be quantified to be 6 dB, audio based data associated with the FR channel signal can be quantified to be 3 dB, audio based data associated with the RL channel signal can be quantified to be 0 dB and audio based data associated with the RR channel signal can be quantified to be 0 dB. The
processing portion 102 c can be configured to compare audio based data associated with the FL and FR channel signals (since audio based data associated with the RL and RR channel signals, being 0 dB, can be disregarded). By comparing audio based data associated with the FL and FR channel signals, supplementary signals indicating location of the sound effect of interest during gameplay can be generated. Specifically, given that audio based data associated with the FL channel signal is quantified to be 6 dB whereas audio based data associated with the FR channel signal is quantified to be 3 dB, supplementary signals can be generated indicating that, for example, enemy target gunshots can be heard near the front (i.e., between FL and FR channels) closer to the left side (i.e., since audio based data associated with the FL channel signal is 6 dB and audio based data associated with the FR channel signal is 3 dB). - In another example, in relation to the sound effect of interest audibly perceivable at a point in time during gameplay, audio based data associated with the FL channel signal can be quantified to be 0 dB, audio based data associated with the FR channel signal can be quantified to be 5 dB, audio based data associated with the RL channel signal can be quantified to be 0 dB and audio based data associated with the RR channel signal can be quantified to be 2 dB. The
processing portion 102 c can be configured to compare audio based data associated with the FR and RR channel signals (since audio based data associated with the FL and RL channel signals, being 0 dB, can be disregarded). By comparing audio based data associated with the FR and RR channel signals, supplementary signals indicating location of the sound effect of interest during gameplay can be generated. Specifically, given that audio based data associated with the FR channel signal is quantified to be 5 dB whereas audio based data associated with the RR channel signal is quantified to be 2 dB, supplementary signals can be generated indicating that, for example, enemy target gunshots can be heard near the right side (i.e., between FR and RR channels) closer to the front (i.e., since audio based data associated with the FR channel signal is 5 dB and audio based data associated with the RR channel signal is 2 dB). - Appreciably, in this manner (e.g., per earlier discussion concerning the two examples immediately preceding this paragraph), an indication of the approximate location of an enemy target can be provided. Therefore, the supplementary signals can be indicative of an approximate location of an enemy target although the enemy target may not necessarily be visually perceivable via the
main display screen 102 a at a particular point in time during gameplay when the sound effect of interest (i.e., gunshots from an enemy target) can be audibly perceived. Specifically, an indication of an approximate location of an object of interest (e.g., an enemy target) can be provided via the supplementary display screen 104 a based on audio based data even though the object of interest is not visually perceivable via the main display screen 102 a at a current point in time during gameplay. Therefore, auxiliary information displayed via the supplementary display screen 104 a can relate to the aforementioned approximate location of an object of interest not shown/displayed by the main display screen 102 a. - In this regard, the supplementary signals communicated from the
computer 102 to the electronic device 104 can be received and processed by the electronic device 104 so as to provide at least one indication (i.e., visual cue(s) 210 a/210 b) of, for example, an approximate location of an object of interest (e.g., an enemy target). - In an earlier example (i.e., audio based data associated with the FL channel signal can be quantified to be 6 dB, audio based data associated with the FR channel signal can be quantified to be 3 dB, audio based data associated with each of the RL channel signal and the RR channel signal can be quantified to be 0 dB in relation to the sound effect of interest), based on audio based data, the supplementary signals can indicate that the object of interest (e.g., the enemy target) is approximately located near the front (i.e., between FL and FR channels) closer to the left side (i.e., since audio based data associated with the FL channel signal is 6 dB and audio based data associated with the FR channel signal is 3 dB). The
supplementary display screen 104 a can display an indication such as a visual cue 210 a somewhere in the first quadrant 206 a indicating as such. - In another earlier example (i.e., audio based data associated with the FL channel signal can be quantified to be 0 dB, audio based data associated with the FR channel signal can be quantified to be 5 dB, audio based data associated with the RL channel signal can be quantified to be 0 dB and audio based data associated with the RR channel signal can be quantified to be 2 dB in relation to the sound effect of interest), based on audio based data, the supplementary signals can indicate that the object of interest (e.g., the enemy target) is approximately located at the right side (i.e., between FR and RR channels) closer to the front (i.e., since audio based data associated with the FR channel signal is 5 dB and audio based data associated with the RR channel signal is 2 dB). The
supplementary display screen 104 a can display an indication such as a visual cue 210 b somewhere in the second quadrant 206 b indicating as such. - Appreciably, for a gamer playing a game using the
computer 102, the sound effect of interest (e.g., gunshots from an enemy target, footsteps of an enemy target) may be audibly perceivable from the plurality of speaker drivers 102 i. However, the object of interest (e.g., an enemy target) associated with the sound effect of interest may not necessarily be shown (i.e., visually perceivable) on the main display screen 102 a even if the sound effect of interest can be audibly perceived at a particular point in time when the game is played. In this regard, information displayed by the main display screen 102 a can include the game character(s) moved by the gamer and the game environment whereas auxiliary information such as the object of interest (e.g., a movable enemy target whose movement may be computer controlled), which is not displayed by the main display screen 102 a when the associated sound of interest can be audibly perceived, can be displayed by the supplementary display screen 104 a. Therefore, even if the object of interest is not displayed by the main display screen 102 a when the associated sound of interest can be audibly perceived by a gamer, the gamer can still be provided with an approximate location of the object of interest in the game environment by virtue of the aforementioned visual cue(s) 210 a/210 b displayed by the supplementary display screen 104 a. - Moreover, since the approximate location of the object of interest in the game environment can be provided by displaying the aforementioned visual cue(s) 210 a/210 b via the
supplementary display screen 104 a, it is appreciable that there is no need to provision any additional space on the main display screen 102 a for display of such auxiliary information. Hence there will be no such visual distractions shown on the main display screen 102 a which may detract from the gaming experience. That is, a gamer can have access to desired auxiliary information and yet still fully enjoy the intended display (the movable game character controlled by the gamer and the game environment etc.) on the main display screen 102 a without being distracted by display of auxiliary information. -
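The channel comparison walked through in the two worked examples above can be sketched as follows. This is a minimal, assumed reconstruction of the described logic rather than the patent's implementation: channels quantified at 0 dB for the sound effect of interest are disregarded, and the remaining pair of levels is compared to word an approximate location.

```python
# Assumed sketch of the pairwise channel comparison from the worked examples.

def locate(fl_db, fr_db, rl_db, rr_db):
    """Describe the approximate location of the sound effect of interest."""
    levels = {"FL": fl_db, "FR": fr_db, "RL": rl_db, "RR": rr_db}
    # Channels quantified at 0 dB for the sound of interest are disregarded.
    active = {ch: db for ch, db in levels.items() if db > 0}
    if len(active) != 2:
        return None  # the text's examples compare exactly two active channels
    (a, _), (b, _) = sorted(active.items(), key=lambda kv: -kv[1])  # a louder
    front, left = {"FL", "FR"}, {"FL", "RL"}
    pair = {a, b}
    if pair <= front or not (pair & front):       # both front or both rear
        where = "front" if pair <= front else "rear"
        side = "left" if a in left else "right"
        return f"near the {where}, closer to the {side} side"
    if pair <= left or not (pair & left):         # both left or both right
        side = "left" if pair <= left else "right"
        depth = "front" if a in front else "rear"
        return f"near the {side} side, closer to the {depth}"
    return None  # diagonal pairs (e.g. FL+RR) are not covered by the examples

# The two worked examples from the text:
print(locate(6, 3, 0, 0))  # near the front, closer to the left side
print(locate(0, 5, 0, 2))  # near the right side, closer to the front
```

A fuller implementation would also handle one, three or four active channels; the sketch is limited to the two-channel comparisons the description actually walks through.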
FIG. 2 shows a flow diagram for a processing method 300 in association with the system 100 of FIG. 1 in accordance with an embodiment of the disclosure. Per earlier discussion, it is appreciable that the system 100 includes a computer 102 and an electronic device 104. The computer 102 is communicable with the electronic device 104. - The
processing method 300 can include a selection step 302, an identification step 304, a derivation step 306, a transmission step 308 and an output step 310. - In regard to the
selection step 302, a graphics user interface (GUI) can be provided for user selection by a user of the computer 102. The GUI can be displayed by the main display screen 102 a. For example, a gamer can be provided with an option to select a desired object of interest (e.g., enemy target). The GUI can, for example, be presented to the gamer as the game program is initialized. Selection of an object of interest can be by manner of the gamer operating the input portion 102 b (e.g., a keyboard or a pointing device such as a mouse) so that input signals communicated to the processing portion 102 c can be indicative of the gamer's selected object of interest. Therefore, a user of the computer 102 is able to select an object of interest. - In regard to the
identification step 304, the processing portion 102 c can be configured to identify audio based data associated with the selected object of interest. For example, the processing portion 102 c can be configured to identify the relevant sound effect(s) associated with the selected object of interest. Therefore, audio based data associated with the selected object of interest can be identified at the computer 102. - In regard to the
derivation step 306, the processing portion 102 c can be configured to generate supplementary signals based on the identified audio based data. Therefore, supplementary signals can be generated, at the computer 102, based on the identified audio based data. - In regard to the
transmission step 308, supplementary signals communicated from the processing portion 102 c to the transceiver portion 102 f can be communicated to the electronic device 104. Therefore, generated supplementary signals can be communicated from the computer 102 to the electronic device 104. - In regard to the
output step 310, supplementary signals received by the electronic device 104 can be further processed so that visual cue(s) 210 a/210 b can be displayed via the display module 104 a. As such, the gamer can visually perceive auxiliary information displayed by the display module 104 a. Therefore, received supplementary signals can be processed to generate auxiliary information which can be visually perceivable via the electronic device 104 but not the computer 102. - In the foregoing manner, various embodiments of the disclosure are described for addressing at least one of the foregoing disadvantages. Such embodiments are intended to be encompassed by the following claims, and are not to be limited to specific forms or arrangements of parts so described, and it will be apparent to one skilled in the art in view of this disclosure that numerous changes and/or modifications can be made, which are also intended to be encompassed by the following claims.
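The five steps of processing method 300 (selection 302, identification 304, derivation 306, transmission 308, output 310) can be sketched end to end as below. All names, the message format and the in-memory stand-ins for the GUI, the transceiver link and the companion display are illustrative assumptions, not the patent's implementation.

```python
# Assumed end-to-end sketch of processing method 300 (steps 302-310).

def selection_step(available_objects, choice):            # step 302
    """Gamer selects an object of interest via the GUI."""
    assert choice in available_objects
    return choice

def identification_step(object_of_interest, sound_map):   # step 304
    """Identify audio based data (sound effects) tied to the selected object."""
    return sound_map[object_of_interest]

def derivation_step(channel_levels_db):                   # step 306
    """Derive supplementary signals from the quantified channel levels."""
    return {"levels_db": channel_levels_db}

def transmission_step(supplementary_signals, send):       # step 308
    """Hand the signals to the transceiver portion for wireless transmission."""
    return send(supplementary_signals)

def output_step(received):                                # step 310
    """Electronic device processes received signals into a visual cue."""
    levels = received["levels_db"]
    return max(levels, key=levels.get)  # dominant channel picks the quadrant

obj = selection_step({"enemy_target", "ally"}, "enemy_target")
sounds = identification_step(obj, {"enemy_target": ["gunshot", "footsteps"],
                                   "ally": []})
signals = derivation_step({"FL": 6, "FR": 3, "RL": 0, "RR": 0})
received = transmission_step(signals, send=lambda s: s)  # loopback stand-in
print(output_step(received))  # prints FL -> cue in the first quadrant 206 a
```

The loopback `send` stands in for the wireless link between transceiver portion 102 f and the electronic device 104; any transport that delivers the same payload would fit the same shape.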
- In one example, predictive based visual cues can be provided by analyzing historical gameplay data (e.g., analyzing history of
indications 210 a/210 b displayed by the supplementary display screen 104 a) or marked up by a global community (e.g., in the case of a Massively Multiplayer Online Role Playing Game). Predictive based visual cues can be generated via, for example, Kalman filtering based signal processing (e.g., by the processing portion 102 c). - In another example, options (e.g., via a graphics user interface) can be presented, via the
display module 104 a, to a gamer for running macros such as a series of keystrokes or commands. For example, six graphical buttons, each being customizable by the gamer to turn on/off (i.e., to activate or to deactivate) certain functions/trigger certain activities/trigger certain actions associated with a game run/executed at the computer 102 side, can be presented via the display module 104 a. In a more specific example, one of the six graphical buttons can be customized to trigger an action by the gamer controlled character (e.g., scouting mode, stealth mode and/or to run from/evade enemy target fire). Therefore, the graphical button(s) can effectively function as shortcut key(s) for game control during gameplay. As can be appreciated, the electronic device 104 can be used as a complementary control device during gameplay and can enhance the gaming experience by facilitating ease of control of a game character when a gamer is playing a game. - In yet another example, although a display screen has been used in earlier examples, it is appreciable that the
display module 104 a can be in other forms. For example, the display module 104 a can be in the form of a light emitting diode (LED) array where an appropriate LED is lit according to the supplementary signals received and processed by the electronic device 104. For example, an LED located at the first quadrant 206 a can be lit if the supplementary signals indicate that the object of interest is located approximately near the front and closer to the left side. - In yet a further example, although earlier examples are based on the
system 100 being a four channel output (e.g., FR channel, FL channel, RL channel and RR channel) system, it is appreciable that the system 100 can be based on any number of channel outputs. For example, the system 100 can be based on six channel outputs where the plurality of speaker drivers 102 i further includes another two speaker drivers (e.g., a top channel speaker and a side channel speaker) in addition to the FL channel speaker, the FR channel speaker, the RL channel speaker and the RR channel speaker.
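Extending the location estimate beyond four channels, as the paragraph above suggests, can be sketched by giving each channel a nominal direction and averaging those directions weighted by loudness. The layout angles, channel names and the whole weighting scheme are illustrative assumptions (a top channel has no azimuth, so it is simply projected to the front in this 2-D sketch); the patent itself does not specify how additional channels are combined.

```python
# Assumed sketch: generalise the cue direction to any number of output channels.
import math

# Nominal azimuths in degrees (assumed layout; 0 = front, 90 = right).
CHANNEL_AZIMUTH = {"FL": -45, "FR": 45, "RL": -135, "RR": 135,
                   "SIDE": 90, "TOP": 0}   # extra channels from the example

def cue_direction(levels_db):
    """Loudness-weighted average azimuth of the sound effect of interest."""
    x = y = 0.0
    for ch, db in levels_db.items():
        if db <= 0:
            continue  # channels quantified at 0 dB are disregarded
        a = math.radians(CHANNEL_AZIMUTH[ch])
        x += db * math.sin(a)
        y += db * math.cos(a)
    return math.degrees(math.atan2(x, y))  # azimuth for placing the visual cue

# First worked example from the text (FL = 6 dB, FR = 3 dB):
print(round(cue_direction({"FL": 6, "FR": 3, "RL": 0, "RR": 0})))  # -18
```

The resulting azimuth (here about -18 degrees, i.e. front, biased to the left) could drive either the quadrant-style supplementary display screen 104 a or the LED-array variant of the display module 104 a.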
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/387,370 US11571621B2 (en) | 2016-08-04 | 2019-04-17 | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG10201606458W | 2016-08-04 | ||
SG10201606458WA SG10201606458WA (en) | 2016-08-04 | 2016-08-04 | A companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/387,370 Continuation US11571621B2 (en) | 2016-08-04 | 2019-04-17 | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180036636A1 true US20180036636A1 (en) | 2018-02-08 |
Family
ID=59485299
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/666,769 Abandoned US20180036636A1 (en) | 2016-08-04 | 2017-08-02 | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
US16/387,370 Active US11571621B2 (en) | 2016-08-04 | 2019-04-17 | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/387,370 Active US11571621B2 (en) | 2016-08-04 | 2019-04-17 | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor |
Country Status (5)
Country | Link |
---|---|
US (2) | US20180036636A1 (en) |
EP (1) | EP3278849B1 (en) |
CN (1) | CN107688449B (en) |
SG (2) | SG10201606458WA (en) |
TW (1) | TWI734812B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108744516A (en) * | 2018-05-29 | 2018-11-06 | 腾讯科技(深圳)有限公司 | Obtain method and apparatus, storage medium and the electronic device of location information |
US20200348387A1 (en) * | 2018-05-29 | 2020-11-05 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112827165A (en) * | 2021-02-15 | 2021-05-25 | 苏州优它科技有限公司 | Wearing recreation external member based on mutual feedback of virtual reality multimode |
US20240121569A1 (en) * | 2022-10-09 | 2024-04-11 | Sony Interactive Entertainment Inc. | Altering audio and/or providing non-audio cues according to listener's audio depth perception |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100113148A1 (en) * | 2008-11-04 | 2010-05-06 | Quado Media Inc. | Multi-player, multi-screens, electronic gaming platform and system |
US20120200667A1 (en) * | 2011-02-08 | 2012-08-09 | Gay Michael F | Systems and methods to facilitate interactions with virtual content |
US20160023116A1 (en) * | 2014-07-03 | 2016-01-28 | Spitfire Technologies, Llc | Electronically mediated reaction game |
US20170165569A1 (en) * | 2015-12-15 | 2017-06-15 | Nvidia Corporation | Built-in support of in-game virtual split screens with peer-to peer-video conferencing |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1804945B1 (en) * | 2004-09-21 | 2022-04-13 | Timeplay Inc. | System, method and handheld controller for multi-player gaming |
WO2010091514A1 (en) * | 2009-02-11 | 2010-08-19 | Hellatronix | Video game with behavior indicators and controller therefor with integrated display screen |
US20110300930A1 (en) * | 2010-06-08 | 2011-12-08 | Hsu Kent T J | Video game controller with an auxiliary display |
EP2506464A1 (en) * | 2011-03-30 | 2012-10-03 | Harman International Industries Ltd. | Audio processing apparatus and method of outputting status information |
US9628858B2 (en) * | 2014-10-31 | 2017-04-18 | Microsoft Technology Licensing, Llc | Individualized content presentation for increased user interaction performance in group settings |
- 2016-08-04 SG SG10201606458WA patent/SG10201606458WA/en unknown
- 2017-07-24 TW TW106124776A patent/TWI734812B/en active
- 2017-07-27 SG SG10201706135UA patent/SG10201706135UA/en unknown
- 2017-07-28 EP EP17183869.1A patent/EP3278849B1/en active Active
- 2017-08-02 US US15/666,769 patent/US20180036636A1/en not_active Abandoned
- 2017-08-03 CN CN201710655005.1A patent/CN107688449B/en active Active
- 2019-04-17 US US16/387,370 patent/US11571621B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100113148A1 (en) * | 2008-11-04 | 2010-05-06 | Quado Media Inc. | Multi-player, multi-screens, electronic gaming platform and system |
US20120200667A1 (en) * | 2011-02-08 | 2012-08-09 | Gay Michael F | Systems and methods to facilitate interactions with virtual content |
US20160023116A1 (en) * | 2014-07-03 | 2016-01-28 | Spitfire Technologies, Llc | Electronically mediated reaction game |
US20170165569A1 (en) * | 2015-12-15 | 2017-06-15 | Nvidia Corporation | Built-in support of in-game virtual split screens with peer-to peer-video conferencing |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108744516A (en) * | 2018-05-29 | 2018-11-06 | 腾讯科技(深圳)有限公司 | Obtain method and apparatus, storage medium and the electronic device of location information |
US20200348387A1 (en) * | 2018-05-29 | 2020-11-05 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US11241625B2 (en) | 2018-05-29 | 2022-02-08 | Tencent Technology (Shenzhen) Company Limited | Positioning information prompting method and apparatus, storage medium, and electronic device |
US11536796B2 (en) * | 2018-05-29 | 2022-12-27 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US11971494B2 (en) * | 2018-05-29 | 2024-04-30 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US12287417B2 (en) | 2018-05-29 | 2025-04-29 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
TW201805050A (en) | 2018-02-16 |
US11571621B2 (en) | 2023-02-07 |
TWI734812B (en) | 2021-08-01 |
SG10201606458WA (en) | 2018-03-28 |
CN107688449A (en) | 2018-02-13 |
US20190240576A1 (en) | 2019-08-08 |
CN107688449B (en) | 2022-08-30 |
EP3278849A1 (en) | 2018-02-07 |
EP3278849B1 (en) | 2022-01-05 |
SG10201706135UA (en) | 2018-03-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11571621B2 (en) | Companion display module to a main display screen for displaying auxiliary information not displayed by the main display screen and a processing method therefor | |
US8187095B2 (en) | Universal game console controller | |
US11465040B2 (en) | System and method for playing video games on touchscreen-based devices | |
KR101582296B1 (en) | Automatic aiming system and method for mobile game | |
JP2019051311A (en) | Information processing method, device, computer program, and computer readable storage medium | |
TWI654019B (en) | Method for inserting virtual resource objects into an application, terminal and computer readable storage medium | |
US20090197679A1 (en) | Video Game Controller | |
JP6663634B2 (en) | Video game device, video game control method, video game control program, and recording medium | |
US10898791B2 (en) | Game apparatus | |
TWI793838B (en) | Method, device, apparatus, medium and product for selecting interactive mode for virtual object | |
CN112843716B (en) | Virtual object prompting and viewing method and device, computer equipment and storage medium | |
US20220001280A1 (en) | Method, device, and computer program for displaying interaction graphic user interface | |
JP2023082039A (en) | Game program, game processing method and game terminal | |
US20120295707A1 (en) | Computer and recording medium | |
KR102614708B1 (en) | Method for selecting target object and gaming device for executint the method | |
KR102609293B1 (en) | Apparatus and method for determining game action | |
KR20200080818A (en) | Method for outputting screen and display device for executing the same | |
KR102584901B1 (en) | Apparatus and method for sending event information, apparatus and method for displayng event information | |
KR20170001539A (en) | Automatic aiming system and method for mobile game | |
JP2023090815A (en) | Game program, game processing method and game terminal | |
KR20200080996A (en) | Smart controler, apparatus for controling user terminal, and method for controling user terminal | |
US20240367035A1 (en) | Information processing method, information processing system and computer program | |
KR102141477B1 (en) | Apparatus and method for controlling game | |
KR101983696B1 (en) | Apparatus for interfacing of game | |
JP7345093B2 (en) | Game program, game processing method, and game terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CREATIVE TECHNOLOGY LTD, SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, YEE SHIAN;CHEONG, CHEE KIN;XU, FENG;REEL/FRAME:043170/0115 Effective date: 20160804 Owner name: CREATIVE TECHNOLOGY LTD, SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CREATIVE LABS PTE. LTD.;REEL/FRAME:043170/0033 Effective date: 20160804 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |