US8232989B2 - Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment - Google Patents
Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment Download PDFInfo
- Publication number
- US8232989B2 US8232989B2 US12/344,531 US34453108A US8232989B2 US 8232989 B2 US8232989 B2 US 8232989B2 US 34453108 A US34453108 A US 34453108A US 8232989 B2 US8232989 B2 US 8232989B2
- Authority
- US
- United States
- Prior art keywords
- avatar
- user
- computer
- virtual environment
- keyframes
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Links
- 238000000034 method Methods 0.000 title claims abstract description 23
- 230000002708 enhancing effect Effects 0.000 title abstract description 4
- 230000033001 locomotion Effects 0.000 claims abstract description 111
- 230000009471 action Effects 0.000 claims abstract description 30
- 229940030850 avar Drugs 0.000 claims description 10
- 238000004590 computer program Methods 0.000 claims 5
- 230000003993 interaction Effects 0.000 abstract description 5
- 238000004891 communication Methods 0.000 description 5
- 230000000881 depressing effect Effects 0.000 description 4
- 238000010586 diagram Methods 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 238000012549 training Methods 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 230000000994 depressogenic effect Effects 0.000 description 2
- 238000002156 mixing Methods 0.000 description 2
- 238000009877 rendering Methods 0.000 description 2
- 238000011960 computer-aided design Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 239000012530 fluid Substances 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000003607 modifier Substances 0.000 description 1
- 230000006855 networking Effects 0.000 description 1
- 230000002688 persistence Effects 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 238000010079 rubber tapping Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A63F13/10—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to virtual environments and, more particularly, to a method and apparatus for enhancing control of an Avatar in a three dimensional computer-generated virtual environment.
- Virtual environments simulate actual or fantasy 3-D environments and allow for many participants to interact with each other and with constructs in the environment via remotely-located clients.
- One context in which a virtual environment may be used is in connection with gaming, although other uses for virtual environments are also being developed.
- In a virtual environment, an actual or fantasy universe is simulated within a computer processor/memory.
- a virtual environment may be implemented as a stand-alone application, such as a computer aided design package or a computer game.
- the virtual environment may be implemented on-line so that multiple people may participate in the virtual environment through a computer network, such as a local area network or a wide area network such as the Internet.
- Users are represented in a virtual environment by an “Avatar,” which is often a three-dimensional representation of a person or other object. Participants interact with the virtual environment software to control how their Avatars move within the virtual environment.
- the participant may control the Avatar using conventional input devices, such as a computer mouse and keyboard or optionally may use more specialized controls such as a gaming controller.
- the view experienced by the Avatar changes according to where the Avatar is located within the virtual environment.
- the views may be displayed to the participant so that the participant controlling the Avatar may see what the Avatar is seeing.
- many virtual environments enable the participant to toggle to a different point of view, such as from a vantage point outside (i.e. behind) the Avatar, to see where the Avatar is in the virtual environment.
- An Avatar may be allowed to walk, run, swim, and make other gross motor movements within the virtual environment.
- the Avatar may also be able to perform fine motor skills, such as picking up an object, throwing an object, using a key to open a door, and performing other similar tasks.
- Virtual environments are commonly used in on-line and stand-alone gaming, such as for example in role playing games where a user assumes the role of a character and takes control over most of that character's actions.
- virtual environments are also being used to simulate real life environments to provide an interface for users that will enable on-line education, training, shopping, and other types of interactions between groups of users and between businesses and users.
- Animation of the Avatar's movements involves three dimensional computer animation.
- a stick-figure or skeleton representation will be used.
- Each segment of the skeletal model is defined by an animation variable, which is commonly referred to as an Avar.
- the collection of Avars defines the Avatar's skeleton and, hence, defines how parts of the Avatar's model can move.
- the skeletal model also restricts movement of the Avatar to prevent the Avatar from assuming unrealistic positions or engaging in unrealistic motion.
- the skeletal model itself is invisible and not rendered. Rather, the position of the Avars defines the position and orientation of the Avatar within the virtual environment.
- a skin is defined around the skeletal model.
- the animator will manipulate the Avars directly. Movement of the Avars causes movement of the Avatar which, when wrapped with the skin, causes the three dimensional model of the Avatar to appear to move.
- Movement of an object is actually a series of frames that shows the object in slightly different positions. By showing the frames sufficiently rapidly, such as at 30 or 60 frames per second, the object may appear to move in a continuous manner. Movement of an Avatar is accomplished in this same way.
- Key positions of the Avars will be defined at strategic points in the animation sequence (referred to as keyframes), and the computer will interpolate between the key positions to determine the precise position of the Avars at every frame.
- the process of interpolating between keyframes is commonly referred to as “Tweening”, which refers to the computer determining the position in between keyframes.
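- As a rough illustration of tweening (a sketch, not taken from the patent text), the snippet below linearly interpolates the Avar values stored in two keyframes; the Keyframe structure and Avar names are assumptions.

```python
# Illustrative sketch: linear "tweening" between two keyframes, where each
# keyframe stores a value for every Avar. Structure and names are hypothetical.
from dataclasses import dataclass

@dataclass
class Keyframe:
    time: float                  # time of the keyframe, in seconds
    avars: dict                  # e.g. {"left_knee_angle": 35.0, ...}

def tween(k0: Keyframe, k1: Keyframe, t: float) -> dict:
    """Interpolate Avar values at time t, with k0.time <= t <= k1.time."""
    span = k1.time - k0.time
    alpha = 0.0 if span == 0 else (t - k0.time) / span
    return {name: k0.avars[name] + alpha * (k1.avars[name] - k0.avars[name])
            for name in k0.avars}

# Example: the knee angle halfway between two keyframes.
k0 = Keyframe(0.0, {"left_knee_angle": 0.0})
k1 = Keyframe(0.5, {"left_knee_angle": 40.0})
print(tween(k0, k1, 0.25))   # {'left_knee_angle': 20.0}
```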
- When a user controls an Avatar in a virtual environment, the user activates sequences of Avar movements, which causes the Avatar to move through specified sequences of keyframes.
- the Avatar will appear to execute the desired motion as the computer moves the Avatar through the predefined sequence of keyframes.
- the user may direct the Avatar's actions within the virtual environment, the user does not actually control the Avatar's movements. Rather, the user selects a particular animation sequence which will enable the user to make a particular movement.
- the movement itself is defined by the keyframes associated with the animation sequence and, hence, the actual movement of the Avatar is controlled by the virtual environment program.
- the virtual environment is set up so that the mouse controls where the Avatar walks in the virtual environment.
- If the user clicks on an area of the virtual environment, the Avatar will walk toward that area.
- The user is thus “directing” the Avatar's action within the virtual environment, since the user controls where the Avatar walks within the virtual environment.
- The user is not, however, controlling the Avatar's actions, since the user is not actually controlling how the Avatar moves its legs, feet, etc.
- the virtual environment client will cause the Avars of the Avatar to move through a sequence that will cause the skeleton and, hence the Avatar, to appear to execute a step motion.
- the actual step motion, i.e. the sequence of keyframes, is provided by the computer without input from the user.
- the actual walking motion and other actions of the Avatar are generated by the computer using particular animation sequences.
- the user provides input to tell the virtual environment what the Avatar should do, and the actual movement or action of the Avatar is implemented by the virtual environment server.
- Enabling canned action sequences to be implemented when the user provides particular input is useful, since it makes it easier for the user to learn how to navigate within the virtual environment and to control the Avatar's gross motor actions in the virtual environment.
- Since the Avatar's actions are not individually controllable by the user, the types of movement in which an Avatar may engage are limited to the available canned action sequences. As virtual environments become more sophisticated, and Avatars become more expressive, it would be advantageous to more directly control how Avatars can operate in a three dimensional computer-generated virtual environment.
- a method and apparatus for enhancing control of an Avatar in a three dimensional computer-generated virtual environment is provided.
- a user can control one or more controllable aspects of an Avatar directly by interacting with a touch sensitive user input device such as a touchpad or touch sensitive screen. Interaction with the touch sensitive user input device enables more precise and more direct control over the actions of the Avatar in the virtual environment.
- Multiple aspects of the Avatar may be controlled, such as the Avatar's forward motion, orientation, arm movements, and grasping of objects.
- FIG. 1 is a functional block diagram of a portion of an example system enabling users to have access to three dimensional computer-generated virtual environment;
- FIG. 2 is a view of an example Avatar appearing on a display as part of a three dimensional computer-generated virtual environment;
- FIGS. 3A-3R are example control patterns that may be used to provide enhanced control of an Avatar in a three dimensional computer-generated virtual environment;
- FIG. 4 is a functional block diagram of a touch pad configured to be used to control multiple aspects of an Avatar in a three dimensional computer-generated virtual environment; and
- FIG. 5 is a functional block diagram of an example keyboard including a touch pad that may be used to control an Avatar in a three dimensional computer-generated virtual environment.
- FIG. 1 shows a portion of an example system 10 showing the interaction between a plurality of users 12 and one or more network-based virtual environments 14 .
- a user may access the network-based virtual environment 14 from their computer 22 over a packet network 16 or other common communication infrastructure.
- the virtual environment 14 is implemented by one or more virtual environment servers 18 .
- Communication sessions such as audio calls between the users 12 may be implemented by one or more communication servers 20 .
- a virtual environment may be implemented by stand-alone virtual environment software 25 as well, which may be instantiated on the user's computer 22 instead of the virtual environment client to enable the computer to generate the virtual environment directly for display to the user.
- the computer may still connect to the network 16 , but would not need to do so to generate the virtual environment and enable the user to interact with the virtual environment.
- the virtual environment 14 may be any type of virtual environment, such as a virtual environment created for an on-line game, a virtual environment created to implement an on-line store, a virtual environment created to implement an on-line training facility, or for any other purpose.
- Virtual environments are being created for many reasons, and may be designed to enable user interaction to achieve a particular purpose.
- Example uses of virtual environments include gaming, business, retail, training, social networking, and many other aspects.
- a virtual environment will have its own distinct three dimensional coordinate space.
- Avatars representing users may move within the three dimensional coordinate space and interact with objects and other Avatars within the three dimensional coordinate space.
- the virtual environment software maintains the virtual environment and generates a visual presentation for each user based on the location of the user's Avatar within the virtual environment.
- stand-alone mode this may be implemented by the stand-alone virtual environment software.
- networked mode this may be implemented by a combination of the virtual environment client and virtual environment server.
- the view may also depend on the direction in which the Avatar is facing and the selected viewing option, such as whether the user has opted to have the view appear as if the user was looking through the eyes of the Avatar, or whether the user has opted to pan back from the Avatar to see a three dimensional view of where the Avatar is located and what the Avatar is doing in the three dimensional computer-generated virtual environment.
- Each user 12 has a computer 22 that may be used to access the three-dimensional computer-generated virtual environment.
- a user interface 26 to the virtual environment enables input from the user to control aspects of the virtual environment.
- the user interface 26 may be part of the virtual environment client 24 , or implemented as a separate process.
- a separate virtual environment client may be required for each virtual environment that the user would like to access, although a particular virtual environment client may be designed to interface with multiple virtual environment servers.
- a communication client 28 is provided to enable the user to communicate with other users who are also participating in the three dimensional computer-generated virtual environment.
- the communication client may be part of the virtual environment client 24 , the user interface 26 , or may be a separate process running on the computer 22 .
- the user may see a representation of a portion of the three dimensional computer-generated virtual environment on a display/audio 30 and input commands via a user input device 32 such as a mouse and keyboard.
- a touch sensitive user input device such as a touchpad or touch sensitive display may also be used as an input device.
- the display/audio 30 may be used by the user to transmit/receive audio information while engaged in the virtual environment.
- the display/audio 30 may be a display screen having a speaker and a microphone.
- an Avatar is a three dimensional rendering of a person or other creature that represents the user in the virtual environment.
- the user selects the way that their Avatar looks when creating a profile for the virtual environment. This may be implemented by enabling the user to specify particular values of the Avars defining the Avatar's skeleton.
- the user can control the movement of the Avatar in the virtual environment such as by causing the Avatar to walk, run, wave, talk, or make other similar movements.
- the block 34 representing the Avatar in the virtual environment 14 is not intended to show how an Avatar would be expected to appear in a virtual environment. Rather, the actual appearance of the Avatar is immaterial since the actual appearance of each user's Avatar may be expected to be somewhat different and customized according to the preferences of that user.
- FIG. 2 shows the display in greater detail.
- the user display 30 will generally include a three dimensional computer-generated virtual environment 10 within which the user's Avatar 34 may move and interact with objects and other Avatars.
- a simplified Avatar 34 has been shown in this figure for simplicity of illustration. Actual Avatars are generally three dimensional representations of humans or other creatures rather than simple line drawings.
- the user may use control devices such as a computer keyboard and mouse to control the Avatar's motions within the virtual environment.
- keys on the keyboard may be used to control the Avatar's movements and the mouse may be used to control the direction of motion.
- One common set of keys frequently used to control an Avatar is the letters W, A, S, and D, although other keys also generally are assigned particular tasks.
- the user may hold the W key, for example, to cause their Avatar to walk and use the mouse to control the direction in which the Avatar is walking.
- Numerous other input devices have been developed, such as touch sensitive screens, dedicated game controllers, joy sticks, etc. Many different ways of controlling gaming environments and other types of virtual environments have been developed over time.
- the user may have many input devices, including a keypad 37 , keyboard 38 , light pen 39 , mouse 40 , game controller 41 , audio microphone 42 , and other types of input devices.
- the user may also use one or more touch sensitive user input devices to control particular actions of the Avatar within the virtual environment.
- an Avatar is generally defined by Avars that describe the Avatar's skeleton. Skeletal animation also specifies limitations on how the Avatar may move so that the Avatar does not move unrealistically.
- a particular sequence will cause animation by causing the Avars to move through a pre-set series of keyframes. The computer will interpolate between the keyframes to cause the Avatar to appear to move smoothly to execute the desired motion. Multiple motions may be accommodated by executing several sequences at once. This is commonly referred to as animation blending. For example, an Avatar may be caused to raise its arm while walking.
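- As a loose illustration of animation blending (an assumed sketch, not the patent's implementation), the snippet below mixes the Avar values of an overlay sequence, such as an arm raise, into the pose produced by a base sequence, such as a walk cycle; the Avar names and weighting scheme are hypothetical.

```python
# Illustrative sketch of animation blending: a second sequence overrides or
# mixes into the Avar values produced by a base sequence (e.g. raise an arm
# while walking). Names and weighting are assumptions.
def blend(base: dict, overlay: dict, weight: float = 1.0) -> dict:
    """Mix overlay Avar values into the base pose; weight=1.0 fully overrides."""
    pose = dict(base)
    for avar, value in overlay.items():
        pose[avar] = (1.0 - weight) * pose.get(avar, value) + weight * value
    return pose

walk_pose = {"left_hip": 20.0, "right_hip": -20.0, "right_shoulder": 5.0}
wave_pose = {"right_shoulder": 120.0, "right_elbow": 45.0}
print(blend(walk_pose, wave_pose, weight=1.0))
# {'left_hip': 20.0, 'right_hip': -20.0, 'right_shoulder': 120.0, 'right_elbow': 45.0}
```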
- a touch sensitive user input device such as a touch pad 36 or touch sensitive display screen may be used to provide enhanced control over the user's Avatar.
- the display 30 may be touch sensitive such that touching the screen may be used to control particular aspects of the virtual environment.
- a multi-touch touchpad may be used to control the Avatar.
- the touchpad 36 may be used alone or in combination with other inputs, such as a keypad 37 , keyboard 38 , light pen 39 , mouse 40 , game controller 41 , audio microphone 42 , or other available input device.
- FIGS. 3A-3R show several combinations of motions on a touch sensitive surface such as a touchpad or touch screen that may be used to provide enhanced control of an Avatar in a three dimensional computer-generated virtual environment.
- the control inputs are used to directly control the motion of the Avatar, such as the placement of the Avatar's feet, the quickness of the steps that the Avatar is taking in the virtual environment, the location of the Avatar's hand, the angle, direction, and speed with which the Avatar moves his arm, and other similar aspects of the Avatar's motion.
- the way in which the user instructs the virtual environment server provides more control over the precise execution of the canned motion sequence to thereby enhance control over the Avatar's motions. For example, if the user directs the Avatar to raise its arm 20 degrees, the actual rendering of the motion within the virtual environment may be taken from a canned execution sequence. However, the extent of execution of the sequence may be controlled by the user's input to simulate more direct control over the precise motion of the Avatar.
- a particular motion is defined as a set of keyframes that are executed in a sequence
- the user may use the touch pad to specify particular keyframes along the sequence so that the Avatar will move to the specified keyframe rather than all the way through the animation sequence.
- the virtual environment software will progress through the sequence of keyframes, tweening to fill in the fluid animation between keyframes, until it reaches the user-selected keyframe.
- the user may move their fingertip in one direction on the touchpad to cause the animation sequence to move forward to subsequent keyframes or, alternatively, may move their fingertip in the opposite direction on the touchpad to cause the animation sequence to move in the opposite direction to move backward toward previous keyframes.
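- A minimal sketch of this keyframe "scrubbing" behavior is shown below, assuming that drag distance on the touchpad maps to a target keyframe index; the pixel-to-keyframe scale and the class name are hypothetical.

```python
# Hypothetical sketch: a fingertip drag selects the keyframe the animation
# should tween toward; dragging the opposite way moves back toward earlier
# keyframes. Scale factor is an illustrative assumption.
class SequenceScrubber:
    def __init__(self, num_keyframes: int, pixels_per_keyframe: float = 40.0):
        self.num_keyframes = num_keyframes
        self.pixels_per_keyframe = pixels_per_keyframe
        self.position = 0.0   # fractional keyframe index

    def on_drag(self, delta_y: float) -> int:
        """delta_y > 0 advances the sequence; delta_y < 0 rewinds it."""
        self.position += delta_y / self.pixels_per_keyframe
        self.position = max(0.0, min(self.num_keyframes - 1, self.position))
        return round(self.position)   # keyframe to tween toward

scrubber = SequenceScrubber(num_keyframes=10)
print(scrubber.on_drag(+120.0))   # 3 -> tween forward to keyframe 3
print(scrubber.on_drag(-80.0))    # 1 -> tween back toward keyframe 1
```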
- Another way that the user input may be translated to Avatar motion is to enable the user to control an Avar directly, so that the user can manipulate the Avar through a plane of motion in two or three dimensions. This enables the user to control, for example, the position and orientation of the Avar, so that the user may cause the Avatar to move without requiring the Avatar to move through a predefined sequence of keyframes. As the Avar is moved under the control of the user, the computer will cause the other Avars affected by movement of the controlled Avar to also move accordingly. The Avatar will then be rendered as defined by the motion of the Avars.
- the user may simultaneously control two or more sequences.
- the user may move their fingertip up/down on the touchpad to control how far up/down the Avatar raises its arm.
- the user may also move their finger left/right on the touchpad to control the direction the Avatar points its arm. This may be useful, for example, to enable more precise control over the Avatar. For example, where the Avatar is holding a virtual sword, this may enable the user to directly control the Avatar in a sword fight rather than simply instructing the Avatar to make a downward slashing motion that would then be executed by the Avatar.
- the enhanced control enables particular actions to be taken and controlled by the person using a straightforward, easy-to-remember set of controls. Rather than requiring the user to remember particular keystrokes to get their Avatar to perform a particular movement, such as using a combination of keyboard keys, multiple types of movements may be executed simply by touching the touch sensitive surface. This simplifies the user interface for the virtual environment, making it more intuitive and, hence, easier to use than user interfaces requiring memorization of particular keyboard keystrokes and combinations of keystrokes.
- Direct control over a particular feature of Avatar animation may be implemented using the touch sensitive user input device at the same time that conventional Avatar controls are used to control other actions of the Avatar in the virtual environment.
- keys on the keyboard may be used to control the Avatar to cause the Avatar to walk within the virtual environment.
- the touch sensitive user input device may be used to control the Avatar's arm to cause the Avatar to raise its arm and wave to hail another Avatar.
- the touch sensitive user input device may be used in connection with other types of user input devices to control particular features of the Avatar.
- FIGS. 3A-3R show several sequences that may be used to control an Avatar via a touch-sensitive surface such as a touchpad or touch screen.
- Other types of touch sensitive surfaces may be used as well.
- touch sensitive display screens are commonly available on personal data assistants and other handheld electronic devices, and are becoming available on laptop and desktop computers.
- the several touch sequences shown in the Figs. are only intended to be examples.
- In the Figs., a hexagon represents an initial touch of the touch sensitive surface, and arrows are used to show contact motion of the user's fingertip with the touch sensitive surface.
- Wide arrows are used to show fingertip motion that is intended to control an Avatar's leg/foot, and narrow arrows are used to show fingertip motion that is intended to control an Avatar's arm/hand.
- Although a touch-pad is shown in FIGS. 3A-3R, the motion combinations may be used on other touch-sensitive surfaces as well.
- FIG. 3A shows a way that the touchpad may be used to enable the user to control their Avatar to cause their Avatar to walk forward in the three dimensional computer-generated virtual environment.
- the user's finger tip contacting the pad is represented by the hexagon, and the motion of the user's finger on the touchpad is represented by the arrow.
- the user may make their fingers do a walking motion by causing their fingers to serially contact the touchpad and pull down over the surface of the touchpad.
- the computer may interpret the left fingertip contact as representing the Avatar's left foot and the right fingertip contact as representing the Avatar's right foot.
- FIG. 3B shows a similar set of motions that may be used to cause the Avatar to run in the virtual environment.
- the motion used to make the Avatar run in the virtual environment may be similar to the motion used to make the Avatar walk, except that the motion may be increased in some way to signal to the computer that the Avatar's motion should be more dramatic.
- the length of the stroke on the touchpad may be increased by the user, the frequency with which the user strokes the pad may be increased, or both length and frequency may be increased.
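- One possible (assumed) way to turn stroke length and stroke frequency into a walk-versus-run decision is sketched below; the thresholds are illustrative only, not values from the patent.

```python
# Hypothetical sketch: decide between a walk and a run animation from the
# length of the user's strokes and how quickly they arrive.
from typing import Optional
import time

class GaitClassifier:
    def __init__(self, run_length_px: float = 150.0, run_rate_hz: float = 2.5):
        self.run_length_px = run_length_px     # strokes this long imply running
        self.run_rate_hz = run_rate_hz         # strokes this frequent imply running
        self.last_stroke_time: Optional[float] = None

    def on_stroke(self, stroke_length_px: float, now: Optional[float] = None) -> str:
        now = time.monotonic() if now is None else now
        rate = 0.0
        if self.last_stroke_time is not None:
            interval = now - self.last_stroke_time
            rate = 1.0 / interval if interval > 0 else float("inf")
        self.last_stroke_time = now
        if stroke_length_px >= self.run_length_px or rate >= self.run_rate_hz:
            return "run"
        return "walk"

g = GaitClassifier()
print(g.on_stroke(80.0, now=0.0))    # 'walk' - short, first stroke
print(g.on_stroke(80.0, now=0.3))    # 'run'  - strokes arriving quickly
```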
- FIGS. 3C-3D show how the Avatar may be controlled to turn toward the right in the three dimensional computer-generated virtual environment
- FIGS. 3E-3F show how the Avatar may be controlled to turn toward the left in the three dimensional computer-generated virtual environment.
- the direction of the stroke of the user's fingertip on the touch-sensitive surface may be used to control the direction of movement of a portion of the Avatar within the virtual environment.
- If the stroke is angled toward the right, the Avatar will turn to the right.
- FIGS. 3G-3L show several other finger sequences that may be used to enable the Avatar to perform motions other than walking/running.
- FIG. 3G shows an example finger motion that may be used by the user to cause the Avatar to slide in the virtual environment. Assume that the Avatar has been running, and is on a slippery surface. By causing both fingers to simultaneously contact the touchpad and slide them forward, the user may cause the Avatar to execute a sliding motion to skim across the slippery surface. This may cause the Avatar to slide to a stop or simply to slide for a while on the surface.
- Assume that the user would like to cause the Avatar to slow down or stop.
- the user may touch both fingertips to the touchpad and slide the fingers slowly toward the bottom of the touchpad. Executing a motion of this nature may cause the Avatar to slow its rate of travel within the virtual environment.
- FIGS. 3I and 3J show several examples of how the user may control the Avatar to cause the Avatar to turn around in the virtual environment.
- the user will cause their fingertips to contact the touchpad and slide their fingertips in opposite directions.
- this motion will cause the Avatar's left foot to move up and right foot to move down to thereby cause the Avatar to turn around toward the right.
- the converse motion is shown in FIG. 3J which will cause the Avatar to turn toward the left.
- FIGS. 3K and 3L show several additional combinations that may enable the user to cause the Avatar to execute more unusual motions. For example, if the user taps both fingertips on the touchpad in a hopping motion, the user may cause the Avatar to jump in the virtual environment. Similarly, if the user alternates double taps of his fingers on the touchpad, the user may cause the Avatar's left foot to hop, right foot to hop, left foot to hop, etc., to cause the Avatar to skip in the virtual environment.
- the touchpad may be divided into areas 36 A- 36 E, particularly if the touchpad is sufficiently large.
- the different areas may be used for different purposes, such as to control different aspects of the Avatar.
- a touchpad may be divided into quadrants 36 A- 36 D, in which each quadrant controls a different limb of the Avatar.
- particular portions 36 E, 36 F of the touchpad may be designated as buttons such that a tap or other touch in that designated area would be deemed to be a button push rather than as controlling an aspect of Avatar movement.
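- A minimal sketch of such a region layout is given below, assuming a normalized coordinate system and a hypothetical assignment of quadrants to limbs and of a bottom strip to button areas.

```python
# Illustrative sketch (assumed layout): map a normalized touch position to a
# touchpad region - four quadrants for the Avatar's limbs plus two strips
# treated as buttons. Region names and geometry are hypothetical.
def classify_touch(x: float, y: float) -> str:
    """x, y in [0, 1], with (0, 0) at the top-left of the touchpad."""
    if y > 0.85:                      # bottom strip reserved for "buttons"
        return "button_left" if x < 0.5 else "button_right"
    if y < 0.5:
        return "left_arm" if x < 0.5 else "right_arm"
    return "left_foot" if x < 0.5 else "right_foot"

print(classify_touch(0.2, 0.3))   # 'left_arm'
print(classify_touch(0.8, 0.7))   # 'right_foot'
print(classify_touch(0.3, 0.9))   # 'button_left'
```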
- Other ways of using touch sensitive user input devices may be used as well.
- FIGS. 3M-3R show several finger motions that may be used to control the Avatar's other limbs, such as the Avatar's arms and hands. Since the touchpad is not able to discern a touch associated with an arm from a touch associated with a foot, in one embodiment a modifier key such as a button adjacent the touch pad may be depressed to enable the computer to differentiate a touch/stroke sequence associated with an arm movement from a touch/stroke sequence associated with a foot. Thin arrows have been used to show motion of finger strokes that cause arm/hand motion.
- Buttons may be associated with the touch pad 36, such as a left button 37 L and a right button 37 R. If one button is pressed, the user may use one finger to control one arm. If both buttons are pushed, the user may use two fingers to contact the touchpad to control the movement of both arms.
- depressing the left button 37 L will cause the computer to allow the user to control the Avatar's left arm
- depressing the right button 37 R will cause the computer to allow the user to control the Avatar's right arm.
- the button that is depressed causes any movement on the pad to be associated with the corresponding arm.
- this may be reversed depending on the preference of the user. For example, the user may want to control the Avatar's right arm with their right hand, and hold the touchpad button with their left hand.
- In FIGS. 3M and 3N, two arrows have been shown, illustrating that the user may move their fingertip up and down sequentially to cause the Avatar to raise and lower its arm in an up/down motion.
- FIGS. 3O and 3P show two examples of touch/stroke combinations that may be used to enable a user to control their Avatar to cause the Avatar to grasp an item, ungrasp the item, and optionally throw the item.
- the user may touch the touchpad while holding down one of the arm selection buttons and move their fingertip to cause the Avatar to raise its arm toward the item to be grasped.
- the arm may be automatically moved in that direction to make manipulation of the grasping motion easier for the user.
- the user may quickly tap the keypad to control the Avatar to cause the Avatar to grasp the item.
- To cause the Avatar to release the item, the user may tap the keypad a second time, or optionally quickly tap the keypad several times in succession.
- the user may make a downward motion with the Avatar's arm and tap the keyboard in a motion approximately inverse to the grasping motion shown in FIG. 3O .
- the motion associated with letting go of an item may be to simply place the item on the virtual ground, table, or other surface rendered in the virtual environment.
- the Avatar may be allowed to throw the item by causing the Avatar's arm to make a fast throwing motion and then tapping the touchpad to signal the release of the item.
- the trajectory of the item within the virtual environment may be influenced by the direction of the stroke, the speed of the stroke, and the point where the user taps the touch sensitive surface.
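- The snippet below sketches one assumed way to derive an initial velocity for the thrown item from the stroke direction, stroke speed, and release tap; the scaling factors and coordinate convention are illustrative only.

```python
# Hypothetical sketch: convert the throwing stroke into an initial velocity
# for the thrown item. The tap marks the release; scale factors are assumed.
import math

def throw_velocity(stroke_dx: float, stroke_dy: float,
                   stroke_duration_s: float,
                   speed_scale: float = 0.05):
    """Return an (vx, vy, vz) velocity in virtual-world units per second."""
    length = math.hypot(stroke_dx, stroke_dy)
    speed = (length / stroke_duration_s) * speed_scale if stroke_duration_s > 0 else 0.0
    # Map the 2-D stroke direction onto the horizontal plane; give the throw
    # a fixed upward component so the item follows an arc.
    dir_x = stroke_dx / length if length else 0.0
    dir_z = -stroke_dy / length if length else 0.0   # up on the pad = forward
    return (speed * dir_x, 0.3 * speed, speed * dir_z)

# A fast, mostly-forward stroke produces a correspondingly fast forward throw.
print(throw_velocity(stroke_dx=10.0, stroke_dy=-200.0, stroke_duration_s=0.2))
```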
- It may be desirable for the Avatar to shake hands with another Avatar.
- This may be implemented, for example, as shown in FIG. 3Q, by causing the Avatar to extend his arm (using the double tap motion described in connection with FIG. 3R) and then rapidly move his arm up and down.
- the user may signal this motion by depressing the arm selection button, moving their finger in an upward motion on the touchpad, and then quickly moving their finger up and down a very short distance to signal the handshake motion.
- It may be desirable for an Avatar to extend his arm straight in front of him, for example to point. This may be signaled using a particular keystroke, such as a double tap unaccompanied by a sliding motion.
- Other combinations of taps and stroking motions may be used to implement the described gestures and to implement other gestures, and the invention is not limited to these particular sets of motions and gestures.
- the double tap motion was also described above as a possible motion sequence that may cause an Avatar to ungrasp a grasped item.
- the virtual environment server may interpret the touch sequence in context, enabling the same or similar touch/stroke combinations to be used to control unrelated aspects of the Avatar's movement depending on the circumstances.
- the shake hands motion of FIG. 3Q may cause the Avatar to shake hands with another Avatar if the other Avatar is sufficiently close by.
- the same motion may be used to make an Avatar grasp an eraser and erase a virtual white board if the Avatar is standing next to the white board.
- the particular motion performed by the Avatar may be dependent on the surrounding objects within the virtual environment and interpreted by the virtual environment server in the context of these other objects and the Avatar's position within the virtual environment.
- the keyboard may also be used in connection with the touch sensitive user input device to enable enhanced control over which limb or other feature of the Avatar is controlled by the touch sensitive user input device.
- the user may depress a button 37 L to activate the touch pad and then press a button on the keyboard.
- the H key may be used to designate the touch pad to control the Avatar's head.
- Once the H key has been pressed, the touch pad may be used to control the Avatar's head.
- the keyboard may be used in numerous other ways to specify which feature or features of the Avatar's body should be controlled by the keypad.
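- As an illustration, the sketch below maps keyboard keys to the Avatar feature that the touch pad currently controls; only the "H for head" binding comes from the description above, and the remaining bindings are hypothetical.

```python
# Illustrative sketch (assumed key bindings): keyboard keys select which
# feature of the Avatar the touchpad controls.
KEY_TO_FEATURE = {
    "h": "head",        # from the description above
    "l": "left_arm",    # hypothetical
    "r": "right_arm",   # hypothetical
    "f": "feet",        # hypothetical
}

class TouchpadTargetSelector:
    def __init__(self) -> None:
        self.target = "feet"   # assumed default

    def on_key(self, key: str) -> str:
        """Switch the controlled feature; unknown keys leave it unchanged."""
        self.target = KEY_TO_FEATURE.get(key.lower(), self.target)
        return self.target

selector = TouchpadTargetSelector()
print(selector.on_key("H"))   # 'head' - touchpad strokes now move the head
```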
- FIG. 4 shows an example multi-touch touchpad that may enable the user to have four or more touch points active at one time.
- the same motions described above in connection with FIGS. 3A-3R may be used to control hand and foot movement of the Avatar.
- Since there are multiple touchpoints, the computer will need to know how to interpret the various inputs.
- a user may depress a touchpad button whenever a limb such as an arm is to be controlled.
- the user would signal to the computer by pressing down one of the buttons that the next tap on the touchpad will be used to control an arm.
- the computer would then wait for the next touch and associate that area of the touchpad with the arm control.
- the computer could then keep track of where the arm control touchpoint was last located and use this to associate motions in that general area with control of the arm.
- Assume that the user has four touchpoints on the touchpad.
- the user will identify a particular touch point with the Avatar's left arm, another touchpoint with the Avatar's right arm, and other touch points with the Avatar's feet.
- These touch points may be designated as described above, by depressing a button, or may be identified by initially touching particular areas of the touchpad.
- the top left touchpoint may be interpreted as controlling the Avatar's left hand
- the top right touchpoint may be interpreted as controlling the Avatar's right hand
- the lower touch points may be interpreted as controlling the Avatar's feet.
- Many different ways of associating a touchpoint with a controllable feature of the Avatar may be available.
- the touch pad may also be divided into quadrants with each quadrant used to control a particular feature.
- the computer will keep track of that touchpoint and look for future instructions from the user by looking for subsequent contact within the same general area of the touch sensitive surface.
- the computer may maintain a notion of persistence, which is that once a touchpoint has been associated with a controllable feature, the computer will assume that a touchpoint in the same general area is also associated with the controllable feature. Although this will require the user to use different portions of the touchpad for different controllable features, it allows the user to continue control over the controllable feature without re-designating the touchpoint whenever contact with the touchpad is lost.
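- The sketch below illustrates one assumed implementation of this persistence idea: a new contact is associated with the controllable feature whose last known touchpoint is nearest, within a hypothetical radius.

```python
# Hypothetical sketch of touchpoint "persistence": once a touchpoint has been
# bound to a controllable feature, a later contact near the same spot is
# assumed to control that same feature. The radius is an assumption.
from typing import Optional
import math

class TouchpointRegistry:
    def __init__(self, radius: float = 0.2):
        self.radius = radius
        self.last_position = {}   # feature name -> (x, y) of its last touch

    def assign(self, feature: str, x: float, y: float) -> None:
        """Explicitly bind a feature (e.g. 'left_arm') to a touch location."""
        self.last_position[feature] = (x, y)

    def resolve(self, x: float, y: float) -> Optional[str]:
        """Return the feature whose last touch was nearest, within the radius."""
        best, best_dist = None, self.radius
        for feature, (fx, fy) in self.last_position.items():
            dist = math.hypot(x - fx, y - fy)
            if dist <= best_dist:
                best, best_dist = feature, dist
        if best is not None:
            self.last_position[best] = (x, y)   # follow the touchpoint
        return best

reg = TouchpointRegistry()
reg.assign("left_arm", 0.25, 0.25)
print(reg.resolve(0.30, 0.22))   # 'left_arm' - contact near the earlier spot
print(reg.resolve(0.80, 0.80))   # None - no feature associated with this area
```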
- FIG. 5 shows a keyboard 38 that may be used in connection with an embodiment of the invention.
- the keyboard has been shown as having a plurality of keys which generally correspond to a standard QWERTY keyboard. Many different key combinations and computer keyboards have been created, and the invention is not limited to any particular key layout or type of keyboard. Thus, the keyboard may have fewer keys or additional keys, and the term keyboard is not to be construed as limited to a keyboard having only the illustrated keys or to having keys in this particular arrangement.
- the keyboard has a touchpad area that is able to be used to control controllable features of an Avatar.
- the keyboard may include the touchpad 36 such as the touchpad shown in FIG. 4 .
- the user may use the touch pad to control motion of the Avatar.
- the keys of the keyboard may be used to select the limb and the touch pad may be used to control the motion of the selected limb.
- a portion of the display may be designated for controlling the Avatar in a manner similar to that described above in connection with the touch pad.
- the touch screen could be used directly to control the Avatar's actions such as by enabling the user to touch the Avatar's hand to cause the Avatar to raise its hand.
- a dot or other visual feedback may appear on the particular limb being controlled to provide additional visual feedback as to what aspect is being controlled.
- a dot, aura, or other indication may be shown on the Avatar's foot when the user touches the touchpad to show the user that the user is controlling the Avatar's foot. If the user is not intending to control the Avatar's foot, the user may take their finger off the touchpad and retouch a different portion of the touchpad, or otherwise provide input to select a different limb.
- the functions described above may be implemented as one or more sets of program instructions that are stored in a computer readable memory within the network element(s) and executed on one or more processors within the network element(s).
- Alternatively, the functions may be implemented using an Application Specific Integrated Circuit (ASIC), programmable logic used in conjunction with a programmable logic device such as a Field Programmable Gate Array (FPGA) or microprocessor, a state machine, or any other device, including any combination thereof.
- Programmable logic can be fixed temporarily or permanently in a tangible medium such as a read-only memory chip, a computer memory, a disk, or other storage medium. All such embodiments are intended to fall within the scope of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (13)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/344,531 US8232989B2 (en) | 2008-12-28 | 2008-12-28 | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment |
PCT/CA2009/001710 WO2010071980A1 (en) | 2008-12-28 | 2009-11-25 | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/344,531 US8232989B2 (en) | 2008-12-28 | 2008-12-28 | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment |
Publications (2)
Publication Number | Publication Date |
---|---|
US20100164946A1 US20100164946A1 (en) | 2010-07-01 |
US8232989B2 true US8232989B2 (en) | 2012-07-31 |
Family
ID=42284347
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/344,531 Expired - Fee Related US8232989B2 (en) | 2008-12-28 | 2008-12-28 | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment |
Country Status (2)
Country | Link |
---|---|
US (1) | US8232989B2 (en) |
WO (1) | WO2010071980A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140125678A1 (en) * | 2012-07-11 | 2014-05-08 | GeriJoy Inc. | Virtual Companion |
US20160171743A1 (en) * | 2007-10-29 | 2016-06-16 | Julian Michael Urbach | Efficiently implementing and displaying independent 3-dimensional interactive viewports of a virtual world on multliple client devices |
US9865081B2 (en) * | 2008-06-16 | 2018-01-09 | Julian Michael Urbach | Re-utilization of render assets for video compression |
US11631228B2 (en) | 2020-12-04 | 2023-04-18 | Vr-Edu, Inc | Virtual information board for collaborative information sharing |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110296043A1 (en) * | 2010-06-01 | 2011-12-01 | Microsoft Corporation | Managing Shared Sessions in a Shared Resource Computing Environment |
US20120192088A1 (en) * | 2011-01-20 | 2012-07-26 | Avaya Inc. | Method and system for physical mapping in a virtual world |
US20120204120A1 (en) * | 2011-02-08 | 2012-08-09 | Lefar Marc P | Systems and methods for conducting and replaying virtual meetings |
US9159152B1 (en) * | 2011-07-18 | 2015-10-13 | Motion Reality, Inc. | Mapping between a capture volume and a virtual world in a motion capture simulation environment |
US20130159935A1 (en) * | 2011-12-16 | 2013-06-20 | Garrick EVANS | Gesture inputs for navigating in a 3d scene via a gui |
KR101736477B1 (en) * | 2011-12-20 | 2017-05-16 | 인텔 코포레이션 | Local sensor augmentation of stored content and ar communication |
US8854178B1 (en) * | 2012-06-21 | 2014-10-07 | Disney Enterprises, Inc. | Enabling authentication and/or effectuating events in virtual environments based on shaking patterns and/or environmental information associated with real-world handheld devices |
US9262856B1 (en) * | 2012-07-17 | 2016-02-16 | Disney Enterprises, Inc. | Providing content responsive to performance of available actions solicited via visual indications |
CN103218844B (en) * | 2013-04-03 | 2016-04-20 | 腾讯科技(深圳)有限公司 | The collocation method of virtual image, implementation method, client, server and system |
US9542579B2 (en) | 2013-07-02 | 2017-01-10 | Disney Enterprises Inc. | Facilitating gesture-based association of multiple devices |
US9612664B2 (en) * | 2014-12-01 | 2017-04-04 | Logitech Europe S.A. | Keyboard with touch sensitive element |
JP7193015B2 (en) * | 2020-10-14 | 2022-12-20 | 住友電気工業株式会社 | Communication support program, communication support method, communication support system, terminal device and non-verbal expression program |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2351426A (en) | 1999-06-24 | 2000-12-27 | Stephen James Crampton | Method and apparatus for the generation of computer graphic representations of individuals |
US20040075677A1 (en) | 2000-11-03 | 2004-04-22 | Loyall A. Bryan | Interactive character system |
US20060087510A1 (en) * | 2004-09-01 | 2006-04-27 | Nicoletta Adamo-Villani | Device and method of keyboard input and uses thereof |
US20060247046A1 (en) * | 2003-07-26 | 2006-11-02 | Choi Kang-In | Method of synchronizing motion of cooperative game system method of realizing interaction between pluralities of cooperative game system using it and cooperative game method |
CA2659672A1 (en) | 2006-06-26 | 2008-01-03 | Icosystem Corporation | Methods and systems for interactive customization of avatars and other animate or inanimate items in video games |
US20080158232A1 (en) * | 2006-12-21 | 2008-07-03 | Brian Mark Shuster | Animation control method for multiple participants |
US7468728B2 (en) * | 2003-07-22 | 2008-12-23 | Antics Technologies Limited | Apparatus for controlling a virtual environment |
US20090031240A1 (en) * | 2007-07-27 | 2009-01-29 | Gesturetek, Inc. | Item selection using enhanced control |
US20090147008A1 (en) * | 2007-12-10 | 2009-06-11 | International Business Machines Corporation | Arrangements for controlling activites of an avatar |
-
2008
- 2008-12-28 US US12/344,531 patent/US8232989B2/en not_active Expired - Fee Related
-
2009
- 2009-11-25 WO PCT/CA2009/001710 patent/WO2010071980A1/en active Application Filing
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2351426A (en) | 1999-06-24 | 2000-12-27 | Stephen James Crampton | Method and apparatus for the generation of computer graphic representations of individuals |
US20040075677A1 (en) | 2000-11-03 | 2004-04-22 | Loyall A. Bryan | Interactive character system |
US7478047B2 (en) * | 2000-11-03 | 2009-01-13 | Zoesis, Inc. | Interactive character system |
US7468728B2 (en) * | 2003-07-22 | 2008-12-23 | Antics Technologies Limited | Apparatus for controlling a virtual environment |
US20060247046A1 (en) * | 2003-07-26 | 2006-11-02 | Choi Kang-In | Method of synchronizing motion of cooperative game system method of realizing interaction between pluralities of cooperative game system using it and cooperative game method |
US20060087510A1 (en) * | 2004-09-01 | 2006-04-27 | Nicoletta Adamo-Villani | Device and method of keyboard input and uses thereof |
CA2659672A1 (en) | 2006-06-26 | 2008-01-03 | Icosystem Corporation | Methods and systems for interactive customization of avatars and other animate or inanimate items in video games |
US20080158232A1 (en) * | 2006-12-21 | 2008-07-03 | Brian Mark Shuster | Animation control method for multiple participants |
US20090031240A1 (en) * | 2007-07-27 | 2009-01-29 | Gesturetek, Inc. | Item selection using enhanced control |
US20090147008A1 (en) * | 2007-12-10 | 2009-06-11 | International Business Machines Corporation | Arrangements for controlling activites of an avatar |
Non-Patent Citations (1)
Title |
---|
Written Opinion of the International Searching Authority from corresponding PCT application PCT/CA2009/001710. |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160171743A1 (en) * | 2007-10-29 | 2016-06-16 | Julian Michael Urbach | Efficiently implementing and displaying independent 3-dimensional interactive viewports of a virtual world on multliple client devices |
US9659400B2 (en) * | 2007-10-29 | 2017-05-23 | Julian Michael Urbach | Efficiently implementing and displaying independent 3-dimensional interactive viewports of a virtual world on multiple client devices |
US9865081B2 (en) * | 2008-06-16 | 2018-01-09 | Julian Michael Urbach | Re-utilization of render assets for video compression |
US10109101B2 (en) | 2008-06-16 | 2018-10-23 | Julian Michael Urbach | Re-utilization of render assets for video compression |
US10504276B2 (en) | 2008-06-16 | 2019-12-10 | Julian Michael Urbach | Re-utilization of render assets for video compression |
US20140125678A1 (en) * | 2012-07-11 | 2014-05-08 | GeriJoy Inc. | Virtual Companion |
US11631228B2 (en) | 2020-12-04 | 2023-04-18 | Vr-Edu, Inc | Virtual information board for collaborative information sharing |
US11734906B2 (en) | 2020-12-04 | 2023-08-22 | VR-EDU, Inc. | Automatic transparency of VR avatars |
US11756280B2 (en) | 2020-12-04 | 2023-09-12 | VR-EDU, Inc. | Flippable and multi-faced VR information boards |
US11983837B2 (en) | 2020-12-04 | 2024-05-14 | VR-EDU, Inc. | Cheating deterrence in VR education environments |
Also Published As
Publication number | Publication date |
---|---|
WO2010071980A1 (en) | 2010-07-01 |
US20100164946A1 (en) | 2010-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8232989B2 (en) | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment | |
Seinfeld et al. | User representations in human-computer interaction | |
Pei et al. | Hand interfaces: Using hands to imitate objects in ar/vr for expressive interactions | |
US10551993B1 (en) | Virtual reality content development environment | |
JP2018142313A (en) | System and method for touch of virtual feeling | |
Aliprantis et al. | Natural Interaction in Augmented Reality Context. | |
Bleiweiss et al. | Enhanced interactive gaming by blending full-body tracking and gesture animation | |
Sadihov et al. | Prototype of a VR upper-limb rehabilitation system enhanced with motion-based tactile feedback | |
Zhang et al. | Double hand-gesture interaction for walk-through in VR environment | |
Stuerzlinger et al. | The value of constraints for 3D user interfaces | |
Van Veldhuizen et al. | The effect of semi-transparent and interpenetrable hands on object manipulation in virtual reality | |
Yu et al. | Force push: Exploring expressive gesture-to-force mappings for remote object manipulation in virtual reality | |
CN111389003A (en) | Game role control method, device, equipment and computer readable storage medium | |
Jang et al. | Incorporating kinesthetic creativity and gestural play into immersive modeling | |
Oshita | Multi-touch interface for character motion control using example-based posture synthesis | |
WO2022180894A1 (en) | Tactile-sensation-expansion information processing system, software, method, and storage medium | |
Rodriguez et al. | Gestural interaction for virtual reality environments through data gloves | |
Adapa et al. | Multi-player Gaming Application Based on Human Body Gesture Control | |
Yusof et al. | Virtual Block Augmented Reality Game Using Freehand Gesture Interaction | |
Henschke et al. | Wands are magic: A comparison of devices used in 3D pointing interfaces | |
Lee et al. | iSphere: a free-hand 3D modeling interface | |
CN118363465B (en) | Interaction method and device of MR (magnetic resonance) equipment and electronic equipment | |
US12153854B1 (en) | Animation of hand-finger communicator with real-world voice output | |
Kwon et al. | Designing 3D menu interfaces for spatial interaction in virtual environments | |
Wu et al. | Interface design for somatosensory interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NORTEL NETWORKS LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HYNDMAN, ARN;REEL/FRAME:022342/0204 Effective date: 20081218 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC.;REEL/FRAME:023892/0500 Effective date: 20100129 |
|
AS | Assignment |
Owner name: CITICORP USA, INC., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC.;REEL/FRAME:023905/0001 Effective date: 20100129 |
|
AS | Assignment |
Owner name: AVAYA INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTEL NETWORKS LIMITED;REEL/FRAME:023998/0878 Effective date: 20091218 |
|
AS | Assignment |
Owner name: BANK OF NEW YORK MELLON TRUST, NA, AS NOTES COLLATERAL AGENT, THE, PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC., A DELAWARE CORPORATION;REEL/FRAME:025863/0535 Effective date: 20110211 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: BANK OF NEW YORK MELLON TRUST COMPANY, N.A., THE, PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:030083/0639 Effective date: 20130307 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS INC.;OCTEL COMMUNICATIONS CORPORATION;AND OTHERS;REEL/FRAME:041576/0001 Effective date: 20170124 |
|
AS | Assignment |
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128
Owner name: OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL COMMUNICATIONS CORPORATION), CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128
Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 023892/0500;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044891/0564 Effective date: 20171128
Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 025863/0535;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST, NA;REEL/FRAME:044892/0001 Effective date: 20171128
Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128
Owner name: VPNET TECHNOLOGIES, INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128
Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 030083/0639;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A.;REEL/FRAME:045012/0666 Effective date: 20171128 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045034/0001 Effective date: 20171215 |
|
AS | Assignment |
Owner name: SIERRA HOLDINGS CORP., NEW JERSEY Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CITICORP USA, INC.;REEL/FRAME:045045/0564 Effective date: 20171215
Owner name: AVAYA, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CITICORP USA, INC.;REEL/FRAME:045045/0564 Effective date: 20171215 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045124/0026 Effective date: 20171215 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, MINNESOTA Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:053955/0436 Effective date: 20200925 |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, AS COLLATERAL AGENT, DELAWARE Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;INTELLISIST, INC.;AVAYA MANAGEMENT L.P.;AND OTHERS;REEL/FRAME:061087/0386 Effective date: 20220712 |
|
AS | Assignment |
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403
Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403
Owner name: AVAYA HOLDINGS CORP., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 |
|
AS | Assignment |
Owner name: WILMINGTON SAVINGS FUND SOCIETY, FSB (COLLATERAL AGENT), DELAWARE Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA MANAGEMENT L.P.;AVAYA INC.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:063742/0001 Effective date: 20230501 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;REEL/FRAME:063542/0662 Effective date: 20230501 |
|
AS | Assignment |
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: CAAS TECHNOLOGIES, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: HYPERQUALITY II, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: HYPERQUALITY, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: ZANG, INC. (FORMER NAME OF AVAYA CLOUD INC.), NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: VPNET TECHNOLOGIES, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: OCTEL COMMUNICATIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501
Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501
Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501
Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501
Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501 |
|
AS | Assignment |
Owner name: AVAYA LLC, DELAWARE Free format text: (SECURITY INTEREST) GRANTOR'S NAME CHANGE;ASSIGNOR:AVAYA INC.;REEL/FRAME:065019/0231 Effective date: 20230501 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240731 |