WO2013016707A1 - Interactive digital content applications - Google Patents
Interactive digital content applications
- Publication number
- WO2013016707A1 (PCT/US2012/048721)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scene
- dimensional
- annotation
- images
- view
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/303—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
- A63F2300/305—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for providing a graphical or textual hint to the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6009—Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6661—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
- A63F2300/6669—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera using a plurality of virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character change rooms
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6661—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
- A63F2300/6676—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
Definitions
- the present disclosure relates to creating digital content, and more specifically to creating interactive digital content that replicates and simulates interaction with a target interactive digital application.
- Digital applications such as electronic games and other computer executable content, often include a plethora of features.
- To help users learn and use these features, developers generally provide a user guide.
- Many user guides, especially those for games or highly complex applications, can be rather limited in content. In some cases, providing only limited explanatory content is strategic, such as with a game where some of the appeal is derived from the unknown.
- application developers as well as third-party content providers develop and publish supplemental user guides, such as strategy guides.
- Even though user guides contain content regarding the use and features of digital applications, which involve user interaction, user guides are generally entirely flat. That is, if the user guide is available in a digital format, it includes very limited interactivity, if any at all. For example, a user guide can be a PDF full of text and still images. Some newer user guides can contain videos, but even with this more dynamic content, the user still remains a mostly passive observer. This is in stark contrast to the user's activities related to the interactive digital application. Thus, while the user guides are likely informative and teach the user about the interactive digital application, they fail to replicate the actions and activities that a user will perform while interacting with the digital application. Such a limitation can restrict the effectiveness of a user guide.
- An interactive digital content application can be any computer executable application designed to include content about or directly from a target interactive digital application, such as audio, video, text, and/or images.
- a digital content application can include one or more interactive elements that require a user to take an active role with respect to the content presented.
- An advantage of a digital content application is that a user can interact with the content in a risk free environment. For example, a user is able to explore paths through an electronic game, identify best routes, or identify mistakes to avoid using fewer resources, such as time, real or virtual currency, etc. Additionally, a user is able to learn and/or experiment in an environment outside of the view of other users.
- a digital content application can contain one or more content sections. Each section can present the content using one of a variety of different presentation formats.
- a presentation format can be an interactive presentation format. Examples of possible presentation formats can include active view, branching video, effects viewer, concept gallery, and/or multi-way view. Additionally, more passive presentation formats can also be used, such as video trailers, interviews, FAQs, etc.
- the branching video presentation format can be used to include one or more related video segments in a section.
- the video segments can include any content, such as developer commentary or gameplay from a particular section of an electronic game.
- One or more video segments contained in the section can be presented to a user. The user can view the available video segments in any order.
- a branching video presentation format can be multi-level. That is, upon completing a video segment, one or more additional video segments can be presented that are related to the just completed video segment.
- a possible use of a multi-level branching video presentation format can be to present different paths and/or strategies through a section of an electronic game. A user can then follow different paths through the branching video to see the effects of decisions and/or strategies.
- various gameplay statistics can be carried through from one segment to the next so that a user can see which path proved to be most successful.
- the active view presentation format can be used to present a three-dimensional view of one or more scenes in a digital application without the use of a 3D engine.
- the three-dimensional view can allow a user to explore all angles of a scene. Once inside the three-dimensional scene a user can pan, scan, and/or zoom to explore the various aspects of the scene.
- the three-dimensional scene can include audio from the target application.
- the audio can be a subset of the audio from the target application.
- the included audio can be just the ambient sounds, isolated from active sounds such as active game elements, control panel items, animations, etc.
- a three-dimensional scene can be constructed from a set of screenshots.
- Each screenshot can be captured from a location in a scene in a target digital application, but from a different angle.
- the set of screenshots can then be mapped to the inside of a three- dimensional space such as a sphere or cube to create a full three-dimensional view of the scene.
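- The disclosure does not name a projection or stitching algorithm, but as a minimal sketch, assuming the captured screenshots have already been stitched into a single equirectangular panorama, the Python function below performs the pixel lookup a viewer would need when rendering the inside of such a sphere; all names and values are illustrative.

```python
def direction_to_equirect(yaw_deg, pitch_deg, pano_width, pano_height):
    """Map a viewing direction to a pixel in an equirectangular panorama.

    yaw_deg:   rotation around the vertical axis, 0..360 (0 = panorama centre)
    pitch_deg: elevation, -90 (straight down) .. +90 (straight up)
    Returns (x, y) pixel coordinates inside the panorama image.
    """
    u = (yaw_deg % 360.0) / 360.0      # horizontal fraction of the sphere
    v = (90.0 - pitch_deg) / 180.0     # vertical fraction (top of sphere = 0)
    x = int(u * (pano_width - 1))
    y = int(v * (pano_height - 1))
    return x, y

# Example: where to sample when the user looks slightly up and to the right.
print(direction_to_equirect(yaw_deg=30.0, pitch_deg=15.0,
                            pano_width=8192, pano_height=4096))
```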
- a scene in a target digital application can include a variety of content that obstructs the scene, such as heads-up display (HUD) elements, control- related elements, or active animations.
- the three-dimensional scene can be constructed to not include scene obstructing content.
- a rendered three-dimensional scene can include one or more annotations that a user can activate to reveal additional content. In some cases, activating an annotation can reveal audio, video, and/or text content.
- an annotation can also be a transition annotation.
- a transition annotation can be used to connect one three-dimensional scene to another.
- the transition between two three-dimensional scenes can include displaying transition content, such as a video, including captured video or simulations of one or more actual transitions from the digital application.
- the effects viewer presentation format can be used to present the effect of one or more digital application settings, hardware types, and/or hardware configurations.
- the effects viewer presentation format can be used to allow the user to explore the difference in scene rendering when different graphics cards are used.
- a scene can include multiple versions of an image depicting the same scene only with different settings. When the scene is rendered a portion of one or more of the images can be used in the rendering.
- the left half of the rendered scene can be displayed using a first image and the right half can be rendered using a second image.
- the user can then move around a demarcating object, such as a slider bar (or shape or series of shapes), to alter where each image is rendered.
- the concept gallery presentation format can be used to present a gallery of images designed to be explored in detail by a user.
- a user can pan, scan, and/or zoom.
- an image can be associated with audio content that can play while the user is exploring the image.
- the audio component can include narration describing different aspects of the image and/or features of the digital application exposed in the image.
- a user's exploration of an image can be independent of any audio narration.
- the multi-way view presentation format can be used to present multiple content items, each of which illustrates the same feature of a target digital application from a different perspective.
- the multi-way presentation format can be used to illustrate the relative effectiveness of different gameplay strategies through multiple gameplay videos.
- a user can select one or more of the content items to discover information about the feature from the chosen perspective.
- one or more statistics or data points can be revealed.
- a section configured using the multi-way presentation format can also include a voting feature. The voting feature can allow a user to vote on a content item and see the global vote tally.
- the digital content application can include an interactive opening screen that presents the available sections.
- the opening screen can include a section icon for each available section.
- a section icon can include a variety of information about the section, such as a representative image or video, a summary of the section's content, whether the section is locked, a completion rate, number of user "likes," number of user comments, etc.
- the information can be displayed in a pop-up window associated with the section icon.
- Each section icon can be unique and/or of a different size.
- the opening screen can be created automatically based on the number of sections in a digital content application and a representative image, a weight, and/or other information associated with each section. The associated weights can be used in determining the placement and/or size of a section icon.
- a digital content application can include a number of other social networking type features, such as "likes" and comments or achievements.
- the digital content application as a whole can have a comment feature, and each section in the digital content application can have its own associated comment section.
- a user can post a comment specific to the section and see comments posted by other users.
- a comment can be directly linked to content in a section, such as a specific time in a video.
- one or more users can post comments associated with specific times in a video. As the video plays, the comments can be displayed and/or highlighted when the video reaches the specific times. Similar functionality can be included for a "likes" feature.
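- As an illustrative sketch of the time-anchored comment behavior described above (the data model and names below are hypothetical, not taken from the disclosure), comments can be stored with a video timestamp and surfaced as playback approaches that time:

```python
from dataclasses import dataclass

@dataclass
class TimedComment:
    user: str
    seconds: float      # position in the video the comment is anchored to
    text: str
    likes: int = 0

def comments_to_show(comments, playback_seconds, window=2.0):
    """Return comments anchored within `window` seconds of the current position."""
    return [c for c in sorted(comments, key=lambda c: c.seconds)
            if abs(c.seconds - playback_seconds) <= window]

feed = [TimedComment("ana", 12.0, "Watch the left door here"),
        TimedComment("raj", 95.5, "This jump saves about ten seconds")]
print(comments_to_show(feed, playback_seconds=11.3))
```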
- FIG. 1 illustrates an example system embodiment
- FIG. 2 illustrates an exemplary configuration of devices and a network
- FIG. 3 illustrates an exemplary opening screen for a digital content application
- FIG. 4 illustrates an exemplary opening screen in which a pop-up box has been activated
- FIG. 5 illustrates an exemplary opening screen for a section that uses the branching video presentation format
- FIG. 6 illustrates an exemplary section that uses a multi-level branching video format
- FIG. 7 illustrates a first exemplary view in a three-dimensional scene from a section that uses the active view presentation format
- FIG. 8 illustrates a second exemplary view in a three-dimensional scene from a section that uses the active view presentation format
- FIG. 9 illustrates a third exemplary view in a three-dimensional scene from a section that uses the active view presentation format
- FIG. 10 illustrates a fourth exemplary view in a three-dimensional scene from a section that uses the active view presentation format
- FIG. 11 illustrates a first exemplary section using the effects view presentation format
- FIG. 12 illustrates a second exemplary section using the effects view presentation format
- FIG. 13 illustrates a first exemplary section using the concept gallery presentation format
- FIG. 14 illustrates a second exemplary section using the concept gallery presentation format
- FIG. 15 illustrates an exemplary section using the concept gallery presentation format in which the user has zoomed in on a portion of an image
- FIG. 16 illustrates an exemplary section that uses the multi-way presentation format
- FIG. 17 illustrates an exemplary set of multi-way content items
- FIG. 18 illustrates an exemplary comment section associated with a section in a digital content application
- FIG. 19 illustrates an exemplary section with a user control bar that includes a "like" button
- FIG. 20 illustrates an exemplary overall social statistics tally for a section.
- the present disclosure addresses the need in the art for a way to develop digital content that teaches a user about a target interactive digital application while replicating activities that the user will do when interacting with the target digital application.
- the disclosure first sets forth a discussion of a basic general purpose system or computing device in FIG. 1 that can be employed to practice the concepts disclosed herein before turning to a detailed description of the techniques for creating the features of a digital content application.
- an exemplary system 100 includes a general-purpose computing device 100, including a processing unit (CPU or processor) 120 and a system bus 110 that couples various system components including the system memory 130 such as read only memory (ROM) 140 and random access memory (RAM) 150 to the processor 120.
- the system 100 can include a cache 122 connected directly with, in close proximity to, or integrated as part of the processor 120.
- the system 100 copies data from the memory 130 and/or the storage device 160 to the cache for quick access by the processor 120. In this way, the cache provides a performance boost that avoids processor 120 delays while waiting for data.
- These and other modules can control or be configured to control the processor 120 to perform various actions.
- Other system memory 130 may be available for use as well.
- the memory 130 can include multiple different types of memory with different performance characteristics. It can be appreciated that the disclosure may operate on a computing device 100 with more than one processor 120 or on a group or cluster of computing devices networked together to provide greater processing capability.
- the processor 120 can include any general purpose processor and a hardware module or software module, such as module 1 162, module 2 164, and module 3 166 stored in storage device 160, configured to control the processor 120 as well as a special-purpose processor where software instructions are incorporated into the actual processor design.
- the processor 120 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc.
- a multi-core processor may be symmetric or asymmetric.
- the system bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- a basic input/output system (BIOS) stored in ROM 140 or the like may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up.
- the computing device 100 further includes storage devices 160 such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive or the like.
- the storage device 160 can include software modules 162, 164, 166 for controlling the processor 120. Other hardware or software modules are contemplated.
- the storage device 160 is connected to the system bus 110 by a drive interface.
- a hardware module that performs a particular function includes the software component stored in a non-transitory computer-readable medium in connection with the necessary hardware components, such as the processor 120, bus 110, output device 170, and so forth, to carry out the function.
- the basic components are known to those of skill in the art and appropriate variations are contemplated depending on the type of device, such as whether the device 100 is a small, handheld computing device, e.g. a mobile phone, smart phone, or tablet; a desktop computer; an electronic game console; or a computer server.
- Non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
- an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech and so forth.
- An output device 170 can also be one or more of a number of output mechanisms known to those of skill in the art.
- multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100.
- the communications interface 180 generally governs and manages the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
- the illustrative system embodiment is presented as including individual functional blocks including functional blocks labeled as a "processor" or processor 120.
- the functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software and hardware, such as a processor 120, that is purpose-built to operate as an equivalent to software executing on a general purpose processor.
- the functions of one or more processors presented in FIG. 1 may be provided by a single shared processor or multiple processors.
- Illustrative embodiments may include microprocessor and/or digital signal processor (DSP) hardware, read-only memory (ROM) 140 for storing software performing the operations discussed below, and random access memory (RAM) 150 for storing results.
- Very large scale integration (VLSI) hardware embodiments, as well as custom VLSI circuitry in combination with a general purpose DSP circuit, may also be provided.
- the logical operations of the various embodiments are implemented as: (1) a sequence of computer implemented steps, operations, or procedures running on a programmable circuit within a general use computer, (2) a sequence of computer implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits.
- the system 100 shown in FIG. 1 can practice all or part of the recited methods, can be a part of the recited systems, and/or can operate according to instructions in the recited non-transitory computer-readable storage media.
- Such logical operations can be implemented as modules configured to control the processor 120 to perform particular functions according to the programming of the module. For example, FIG. 1 illustrates three modules Mod1 162, Mod2 164 and Mod3 166, which are modules configured to control the processor 120. These modules may be stored on the storage device 160 and loaded into RAM 150 or memory 130 at runtime, or may be stored as would be known in the art in other computer-readable memory locations.
- a digital content application can be any computer executable code in which at least a portion of the content contained in the digital content application is associated with at least one digital application, e.g. an electronic game, a client application, a server application, a digital product, etc.
- a digital content application can include audio, video, text, and/or images, as well as various interactive elements.
- the interactive features can be any features that require a user to take an active role with respect to what content is displayed by the digital content application.
- a digital content application can include one or more sections, where each section can have one or more sub-sections. The content within each section and/or sub-section can include interactive elements. Additionally, interactive elements can connect sections and/or sub-sections.
- a digital content application can include one or more network- based features, such as social networking features.
- the network-based features can include voting, comments, "likes," etc. Through the network-based features, a community can be created around the digital content application.
- the goal of a digital content application can be to include content about or directly from a target digital application that informs a user about the target digital application.
- a digital content application can include scenes or levels from an electronic game.
- a digital content application can include a video of live action within an electronic game.
- the one or more sections can be designed to teach a user about the target digital application in an interactive manner that can replicate the activities a user will engage in when interacting with the actual target digital application.
- FIG. 2 illustrates an exemplary system configuration 200 for distribution and use of digital content applications.
- a configuration is particularly useful for digital applications that include one or more network based features, such as social networking features.
- a digital content application can be created on, distributed by, and/or executed on a general-purpose computing device like system 100 in FIG. 1.
- a digital content application system 208 can communicate with one or more client devices 202₁, 202₂, ..., 202ₙ (collectively "202") connected to a network 204 by direct and/or indirect communication.
- the digital content application system 208 can support connections from a variety of different client devices, such as desktop computers; mobile computers; handheld communications devices, e.g. mobile phones, smart phones, tablets; electronic game consoles, and/or any other network enabled communications device.
- digital content application system 208 can concurrently accept connections from and interact with multiple client devices 202.
- the digital content application system 208 can be configured to manage the distribution of a digital content application.
- the digital content application system 208 can receive one or more digital content applications from one or more digital content application providers 206₁, 206₂, ..., 206ₙ (collectively "206").
- the digital content application system 208 can be configured to store a copy of a digital content application.
- the digital content application system 208 can request a copy of a digital content application from a digital content application provider 206.
- the digital content application system 208 can then send the digital content application to the requesting client device 202.
- the digital content application system 208 can also be configured to manage any data associated with network-based features in a digital application.
- a digital application can include one or more social networking features, such as comments, voting, and/or "likes.”
- the data associated with the feature should be passed along to the other users of the digital application.
- a comment should be made visible to the users.
- the digital content application system 208 can receive the data and distribute it to all active digital content applications.
- the digital content application system 208 can be configured to support push and/or pull data distribution. Additionally, the digital content application system 208 can maintain a database of the data.
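- A minimal, in-memory sketch of the push and pull distribution described above (class and method names are hypothetical; a real system 208 would persist the data in a database):

```python
from typing import Callable, Dict, List

class SocialFeed:
    """In-memory stand-in for the application system's social-data store."""

    def __init__(self):
        self._events: List[Dict] = []                        # comments, votes, "likes"
        self._subscribers: List[Callable[[Dict], None]] = []

    def subscribe(self, callback):
        """Push model: active clients register a callback for new events."""
        self._subscribers.append(callback)

    def publish(self, event):
        """Store an event and push it to every active client."""
        self._events.append(event)
        for notify in self._subscribers:
            notify(event)

    def pull_since(self, cursor):
        """Pull model: clients ask for everything newer than their cursor."""
        return self._events[cursor:], len(self._events)

feed = SocialFeed()
feed.subscribe(lambda e: print("pushed:", e))
feed.publish({"type": "comment", "section": 306, "text": "Great walkthrough"})
events, cursor = feed.pull_since(0)
```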
- While the digital content application system 208 and digital content application providers 206 are presented herein as separate entities, this is for illustrative purposes only. In some cases, the digital content application system 208 and the digital content application providers 206 can be the same entity. Thus, a single entity can create a digital application, distribute the digital application to one or more client devices, and manage any network-based features associated with the use of a digital application on one or more client devices 202.
- a digital content application can include one or more sections.
- Each section can include content associated with the same or different target applications.
- each section can present the content using a different presentation format. Examples of possible presentation formats include active view, branching video, effects viewer, concept gallery, and/or multi-way view. Each of these presentation formats will be discussed in greater detail below. Additional presentation formats are also possible, such as video trailers, interviews, FAQs, etc.
- the digital content application can include an opening screen that presents the available sections.
- Each available section can be presented as an icon, a window, and/or text.
- FIG. 3 illustrates an exemplary opening screen 300 for a digital content application.
- the digital content application illustrated in FIG. 3 includes six sections 302, 304, 306, 308, 310, and 312.
- the section icons can be arranged in a cluster, a linear sequence, and/or any other arrangement.
- Each section icon can be unique.
- each section icon can display a representative image of the section.
- each section icon can include dynamic content, such as an indication as to whether the user has started and/or completed the section, whether the section or a subsection within the section is locked, and the number of associated user comments.
- a section icon can display a percentage representing the user's completion rate for the section, such as completion percentage 314.
- a section icon can be greyed out to represent that a section is locked.
- a section icon can display a number of user comments associated with the section, such as comment indicator 316.
- each section icon can be a different size.
- a user can select a section icon to activate the content in the section.
- a user can mouse or roll over a section icon to reveal additional information, such as a pop-up information box that includes text, audio, and/or video describing the section.
- FIG. 4 illustrates an exemplary opening screen 300 in which a pop-up box 402 has been activated by rolling over section icon 306.
- pop-up box 402 displays information 404 about the digital application featured in the section as well as a description 406 of the content in the section.
- the opening screen can be presented each time the digital application is launched. Additionally, the opening screen can serve as a home screen, where upon completion or exit from a section, control is transferred to the opening screen so that the user can select a new section to use.
- An opening screen can be created automatically for each digital content application based on the number of sections in a digital content application.
- Each section can have an associated content item, such as a representative image, image sequence, video, and/or text.
- the associated content item can be displayed in the section icon or window on the opening screen.
- each section can have an associated weight.
- a section can be assigned a weight based on the value of the information in the section or the level of importance of the section, e.g. a section containing more detailed information or a greater amount of information can be assigned a higher weight.
- a section can be assigned a weight based on the amount of interactivity in the section, e.g. a section with more interactive elements can be assigned a higher weight.
- a digital content application developer can assign the weight.
- the weight can be automatically computed based on the presence or absence of features in a section.
- a weight can be automatically computed based on the number of interactive elements in a section or based on the number of interactive elements in a section relative to the other sections. In another example, a weight can be automatically computed based on the length of a section.
- the arrangement and/or sizes of the section icons can be determined at least in part based on the associated weights. For example, a section with a greater associated weight can have a larger section icon, such as section icon 204. In another example, a section with a greater weight can be placed in a more prominent position on the opening screen. Additionally, the arrangement and/or size can be influenced by a suggested use order. For example, the section icons can be arranged such that the section that is suggested to be used first is most prominent or placed first in a sequence. In some cases, the suggested use order can be part of the weight. Alternatively, a suggested use order can be associated with a section. For example, each section can be assigned a number in a sequence. Then the associated sequence numbers can influence the opening screen arrangement.
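- A small sketch of how associated weights and an optional suggested use order could drive icon size and placement, under the assumption that icon size scales linearly with weight (the disclosure does not prescribe a formula; all names are illustrative):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Section:
    title: str
    weight: float                     # assigned by the developer or computed
    use_order: Optional[int] = None   # optional suggested use order

def layout_opening_screen(sections, min_size=120, max_size=320):
    """Return (section, icon_size_px) pairs ordered by prominence.

    A suggested use order wins first; otherwise heavier sections come first
    and receive proportionally larger icons.
    """
    lo = min(s.weight for s in sections)
    hi = max(s.weight for s in sections)
    span = (hi - lo) or 1.0
    ordered = sorted(sections,
                     key=lambda s: (s.use_order if s.use_order is not None else 10**6,
                                    -s.weight))
    return [(s, int(min_size + (s.weight - lo) / span * (max_size - min_size)))
            for s in ordered]

screen = layout_opening_screen([Section("Walkthrough", 0.9, use_order=1),
                                Section("Concept art", 0.4),
                                Section("Interviews", 0.2)])
for section, size in screen:
    print(section.title, size)
```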
- the layout of the opening screen can be designed to be static. That is, the layout of the section icons on the screen does not change from one execution to another.
- the opening screen can also be created to have a dynamic layout.
- the placement of the sections can rotate with each use of the digital content application, after a specified number of uses, or after a specified elapsed time.
- the size of a section icon can change by changing the weight associated with a section, e.g. a lower weight can be assigned to a completed section or a higher weight can be assigned to a started but not completed section.
- a presentation format can be branching video.
- a section that uses the branching video presentation format can include one or more video segments from which the user can select.
- FIG. 5 illustrates an exemplary opening screen 500 for a section that uses the branching video presentation format.
- the section's opening screen can include an icon, button, link, etc., that a user can click on to select a video segment.
- a video segment icon can include a representative image and/or text, such as video segment icon 502.
- the video segments can contain any content, such as developer commentary or gameplay from a particular section of an electronic game.
- a section that uses the branching video format can be configured to only present a subset of the video segments at any particular time. That is, the section can be configured with multiple levels of video segments. For example, after completing a first video segment, the section can present multiple new video segment options that logically follow from the completed video segment.
- FIG. 6 illustrates an exemplary section 600 that uses a multi-level branching video format. Initially, the section can present the first level video segments. After completing a first level video segment, the section can present the relevant second level video segments. Such a presentation scheme can continue until a leaf video segment has been completed. For example, if a user completes first level video segment 602, the section can present second level video segments 604. If a user selects and completes second level video segment 606, the user has reached a leaf, so no new video segments can be presented. In some cases, different paths from a root video segment to a leaf video segment can have different lengths.
- the video segments presented after completing a leaf video segment can vary with the configuration of the branching video section. In some cases, after completing a leaf video segment, control can be returned to the parent level. If the user has completed all of those videos, then the user can select to be returned to some other level. To decrease the burden on the user, after completing a leaf video segment, control can be returned to the closest parent level that has unwatched and/or uncompleted video segments.
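- The following sketch models the multi-level branching behavior described above, including returning control to the closest parent level that still has unwatched segments; the tree structure and names are illustrative, not taken from the disclosure:

```python
class VideoNode:
    """One video segment; children are the segments that logically follow it."""

    def __init__(self, title, children=None):
        self.title = title
        self.children = children or []
        self.parent = None
        self.watched = False
        for child in self.children:
            child.parent = self

def menu_after_completion(node):
    """After finishing `node`, return the closest ancestor level that still
    has unwatched segments; an empty list means fall back to the root menu."""
    node.watched = True
    level = node.parent
    while level is not None:
        unwatched = [c for c in level.children if not c.watched]
        if unwatched:
            return unwatched
        level = level.parent
    return []

root = VideoNode("Level intro", [
    VideoNode("Go left", [VideoNode("Fight"), VideoNode("Sneak")]),
    VideoNode("Go right"),
])
fight = root.children[0].children[0]
print([n.title for n in menu_after_completion(fight)])   # ['Sneak']
```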
- the section can be configured to indicate to the user which video segments the user has already completed.
- a completed video segment can be greyed out.
- a completed video segment can have a checkmark next to it. Additional methods of indicating that the user has already completed a video segment are also possible.
- a possible use of a multi-level branching video presentation format can be to present different paths and/or strategies through a section of an electronic game.
- a section of an electronic game, such as a level, can be broken up into video segments based on decision points in the electronic game.
- a user can then see the change in the gameplay that can occur when different decisions are made.
- various gameplay statistics can be carried through from one segment to the next so that a user can see which path proved to be most successful.
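- As a sketch of carrying gameplay statistics across segments, per-segment statistics could simply be accumulated along the path a user follows so that different paths can be compared; the statistic names below are hypothetical:

```python
def accumulate_path_stats(path):
    """Sum per-segment gameplay statistics along a chosen path of segments.

    `path` is a list of dicts such as {"time_s": 42, "points": 300}.
    """
    totals = {}
    for segment in path:
        for key, value in segment.items():
            totals[key] = totals.get(key, 0) + value
    return totals

cautious = accumulate_path_stats([{"time_s": 60, "points": 150},
                                  {"time_s": 75, "points": 400}])
aggressive = accumulate_path_stats([{"time_s": 35, "points": 100},
                                    {"time_s": 40, "points": 250}])
print(cautious, aggressive)   # compare which path proved more successful
```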
- a goal of a section using the multi-level branching video approach to illustrate an electronic game can be for the user to explore the video segments to identify a successful path through the illustrated game section. For example, it may be that a certain sequence of decisions will result in the optimal path through the level, such as based on speed of completion, length of survival time, most points or awards earned, most achievements, or some other goal.
- the branching video sequence can be designed to allow the user to explore the different decisions to identify the most successful sequence. After completing the branching video section, the user can use the information learned in the user's own gameplay of the electronic game.
- a branching video sequence can allow a user to learn using fewer resources, such as time, real or virtual currency, etc. Additionally, a branching video sequence can be presented in an environment outside of the view of other users, thus allowing the user to learn and/or explore in a risk-free environment.
- a presentation format can be active view.
- a section that uses the active view presentation format can present a three-dimensional view of a scene without the use of a 3D-graphics-rendering engine.
- the three-dimensional view can allow a user to explore all angles of a scene. Once inside the three-dimensional scene a user can pan, scan, and/or zoom to explore the various aspects of the scene.
- the active view format can be used to present a section of an electronic game. The scene can be presented to the user as if the user was standing at a location in the game. The user can then look up, down, and all around to see everything that is around the user at that location.
- a three-dimensional scene can include audio from the target digital application, such as the audio that would be playing while the user was in the scene.
- the audio used in a three-dimensional scene can be stripped of sounds associated with active or control-related elements present in the scene in the target digital application, such as from characters or inanimate objects speaking, moving or appearing, so that only ambient sounds remain.
- a three-dimensional scene can be constructed from a set of screenshots.
- the three-dimensional scene can be constructed by mapping the set of screenshots to the inside of a three-dimensional space such as a sphere or cube.
- the panoramic screenshots should provide full coverage of the desired scene.
- One possible method for capturing sufficient coverage can be to pick a location in the scene in the target digital application. Point the camera straight down and take the first panoramic screenshot. After the first shot, rotate the camera to the right until only about 25 percent of the previous screenshot is visible on the left of the screen and capture another panoramic screenshot. Continue rotating right, capturing panoramic screenshots, until the camera has returned to the starting position.
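- Assuming a known horizontal field of view for the in-game camera (an assumption; the disclosure only specifies the roughly 25 percent overlap), the capture pass above can be planned as follows:

```python
import math

def capture_plan(horizontal_fov_deg=90.0, overlap_fraction=0.25):
    """Plan the panoramic capture pass described above.

    Each rotation keeps `overlap_fraction` of the previous shot on screen,
    so the camera advances by (1 - overlap) of its field of view per shot.
    Returns (rotation step in degrees, number of shots for a full turn).
    """
    step = horizontal_fov_deg * (1.0 - overlap_fraction)
    shots = math.ceil(360.0 / step)
    return step, shots

step, shots = capture_plan()
print(f"rotate {step:.1f} degrees per shot, {shots} shots to cover 360 degrees")
```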
- a scene in a target digital application can include a variety of content that obstructs the scene.
- many electronic games include a heads-up display (HUD) to display a variety of gameplay statistics, such as available weapons, ammunition, health, score, maps, time, capabilities, game progression, speedometer, compass, etc.
- the HUD elements can obscure the view of the scene. Therefore, it may be desirable to eliminate any HUD elements from the panoramic screenshots prior to constructing the three-dimensional scene.
- the HUD elements can be removed by disabling the HUD elements prior to capturing the screenshots.
- the HUD elements can be removed through a post-processing step.
- the HUD elements can be removed by increasing the amount of overlap between successive screenshots so that when the screenshots are put together, the screenshots are overlapped in such a manner that HUD elements are eliminated.
- some target digital applications can include animations, such as enemies or moving obstacles. Therefore, to create a clear view of the scene it may be necessary to disable and/or freeze any animations prior to capturing the screenshots.
- FIG. 7 illustrates an exemplary view 700 in a three-dimensional scene from a section that uses the active view presentation format.
- An annotation can include audio, video, and/or text.
- a user can mouse over, click on, or otherwise activate the content associated with an annotation.
- FIG. 7 includes an audio annotation 702.
- a user can mouse over the audio annotation 702 to reveal an information pop-up window 704. If the user is interested in the content of the audio annotation, the user can click on the audio annotation 702 to activate the audio.
- FIG. 8 illustrates another exemplary view 800 in the three-dimensional scene.
- In view 800, the scene has been rotated to the right (by the character looking or being navigated to the left). Additionally, the view 800 includes an information annotation 802. The user can mouse over or click on the annotation 802 to reveal the information box 804.
- a section that uses the active presentation format can include multiple three- dimensional scenes.
- one or more three-dimensional scenes can be connected.
- a scene in an electronic game can include a point where a user can enter a building, a tunnel, etc.
- the section can model the electronic game's environment through the use of two three-dimensional scenes.
- the three- dimensional scenes can be connected through a transition annotation.
- FIG. 9 illustrates an exemplary view 900 in a section that uses the active view presentation format.
- the view 900 includes a transition annotation 902.
- a user can mouse over the transition annotation 902, to reveal information about the transition, such as pop-up window 904 that provides a hint about the transition destination.
- a transition can include an effect, such as an explosion that opens a hole into the next room or three-dimensional scene. Once in the new three-dimensional scene, the user can explore in the same manner as the previous three-dimensional scene.
- two three-dimensional scenes in the section may not be directly connected in the target digital application.
- the two three-dimensional scenes may have been selected because they include particularly difficult or interesting features, while the section connecting the two scenes is less interesting.
- the transition between the two three-dimensional scenes can be a video. Therefore, the user is still able to visualize the relationship between the three-dimensional scenes, but is not burdened with having to activate multiple transition annotations to find the interesting three- dimensional scenes.
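- A minimal sketch of how scenes and transition annotations could be linked, with an optional transition video played when an annotation is activated; the data structures and names are illustrative:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TransitionAnnotation:
    label: str                              # hint shown on mouse-over
    destination: "Scene"
    transition_video: Optional[str] = None  # played while moving between scenes

@dataclass
class Scene:
    name: str
    panorama: str                           # path to the stitched panorama image
    transitions: List[TransitionAnnotation] = field(default_factory=list)

def activate(transition):
    """Play any transition content, then hand control to the destination scene."""
    if transition.transition_video:
        print(f"playing {transition.transition_video} ...")
    return transition.destination

courtyard = Scene("courtyard", "courtyard.png")
tunnel = Scene("tunnel", "tunnel.png")
courtyard.transitions.append(
    TransitionAnnotation("Enter the tunnel", tunnel, "courtyard_to_tunnel.mp4"))
current = activate(courtyard.transitions[0])
print("now in", current.name)
```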
- a number of hints and/or user controls can be displayed to the user to reveal the available features and/or how the user can interact with a rendered scene and/or the section.
- the hint window can be disabled after a predefined period so that the user has an unobstructed view of the rendered scene.
- a hint window can be displayed only upon initial entry into a section using the active view format. However, if different scenes in the section require different user actions in order to interact with the scene, then a new hint window can be displayed.
- a digital application can display a control bar, such as control bar 1002 so that the user can control any audio, move to another scene, and/or explore the scene.
- a control bar can remain visible.
- a digital application can be configured such that the control bar is only visible when a mouse pointer is active and/or when a mouse pointer is active in a predefined area, such as in the area of the control bar. Additional techniques for enabling and/or disabling a visual representation of a control bar are also possible.
- Another presentation format can be an effects viewer.
- a section that uses the effects viewer presentation format can include one or more scenes designed such that a user is able to explore the differences between various digital application settings, hardware types, and/or hardware configurations.
- a scene can include multiple versions of an image depicting the same scene only with different settings. For example, a first image can be captured on a device with a first graphics card type and a second image can be captured on a device with a second graphics card type. In another example, a first image can be captured with night vision enabled while the second image can be captured with night vision disabled.
- the images can be layered one on top of the other in a scene.
- the scene can then include functionality that allows a user to view the rendered scene where different layers are revealed for different portions of the rendered scene.
- the left half of the rendered scene can be displayed using the image captured on a device with a first graphics card while the right half can be displayed using the image captured using the second graphics card.
- Which layer is revealed for which portion of the rendered scene can be demarcated using one or more slider bars or any other geometric, graphic or organic shapes, and/or any combination thereof.
- a user can move a slider bar left and right to alter which of portions of two different layers are used in rendering the scene.
- a geometric shape can be based on the viewable area as seen through night vision goggles.
- Additional geometric, graphic, and/or organic shapes are also possible, such as a circle, a thematic shape such as a shield in the case of a medieval game, or peels of an onion.
- a series of shapes can also be used, such as to look like sections of peel.
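- As a sketch of the simplest demarcating object, a single vertical slider, the rendered frame can be composited column-by-column from the two images; the image representation and names below are illustrative:

```python
def composite_with_slider(left_image, right_image, slider_fraction):
    """Render an effects-viewer frame from two same-sized images.

    Columns left of the slider come from `left_image` (first settings),
    columns to the right come from `right_image` (second settings).
    Images are given as 2D lists of pixels (rows of columns).
    """
    width = len(left_image[0])
    split = int(max(0.0, min(1.0, slider_fraction)) * width)
    return [row_a[:split] + row_b[split:]
            for row_a, row_b in zip(left_image, right_image)]

# Tiny 2x4 example: 'A' pixels captured with setting one, 'B' with setting two.
a = [["A"] * 4, ["A"] * 4]
b = [["B"] * 4, ["B"] * 4]
print(composite_with_slider(a, b, slider_fraction=0.5))
# [['A', 'A', 'B', 'B'], ['A', 'A', 'B', 'B']]
```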
- FIG. 11 illustrates an exemplary section 1100 using the effects view presentation format in which a user can explore the difference between two different settings on a rendered scene 1102.
- the rendered scene can include a slider bar 1104 that a user can move right and left.
- On the left side of the slider bar 1104 is the rendered scene 1102 as captured with the first settings 1106 while on the right side of the slider bar 1104 is the rendered scene 1102 as captured with the second settings 1108.
- As the user moves the slider bar 1104 to the right, more of the scene 1102 is rendered using the first settings 1106.
- FIG. 12 illustrates the exemplary section 1100 in which the slider bar 1104 has been moved to the right.
- a scene configured for use in an effects view section can include more than two images.
- a scene can be rendered with multiple slider bars or other demarcating objects that allow the scene to be rendered as a combination of each of the multiple images.
- a scene can include three images and two slider bars, thus creating three different sections in the rendered scene.
- the scene can also be configured such that a user can select two of the settings to compare. In this case, the scene can be rendered using the two images that correspond to the selected settings.
- hint window 1110 lets the user know that the slider bar 1104 can be moved right and left to alter the rendering of the scene.
- the hint window 1110 can be disabled after a predefined period so that the user has an unobstructed view of the rendered scene. For example, in FIG. 12 the hint window is no longer obstructing the display of the rendered scene 1102.
- a hint window can be displayed only upon initial entry into a section using the effects view format. However, if different scenes in the section require different user actions in order to interact with the scene, then a new hint window can be displayed.
- a digital application can display a control bar 1112 so that the user can control any audio, move to another scene, and/or explore the scene.
- a control bar can remain visible.
- a digital application can be configured such that the control bar is only visible when a mouse pointer is active and/or when a mouse pointer is active in a predefined area, such as in the area of the control bar. Additional techniques for enabling and/or disabling a visual representation of a control bar are also possible.
- a presentation format can be concept gallery.
- a section that uses the concept gallery presentation format can include a gallery of images. That is, a section using the concept gallery format can include one or more images from one or more digital applications.
- a concept gallery section can include one or more images from a single digital application, such as images from different levels in an electronic game.
- a concept gallery section can include one or more images from different digital applications, such as a single image from multiple similar digital applications so that a user can see the difference between the applications.
- Each image can be captured so as to present a particular feature of the digital application. For example, an image can capture a particular point in a game that includes a number of interesting and/or challenging features.
- An image can be a two-dimensional image that includes effects to give the appearance of a three-dimensional image.
- an image can have an associated audio component.
- the audio component can include narration describing different aspects of the image and/or features of the digital application exposed in the image.
- FIG. 13 illustrates an exemplary section 1300 using the concept gallery presentation format in which an image 1302 is displayed.
- a user's exploration of an image can be independent of any audio narration. That is, if an audio narration is discussing a particular aspect of the image, the user is able to explore a different aspect of the image.
- a number of hints and/or user controls can be displayed to the user to reveal the available features and/or how the user can interact with an image and/or the section.
- hint window 1304 lets the user know that the image can be explored by panning and scanning across the image.
- the hint window 1304 can be disabled after a predefined period so that the user has an unobstructed view of the image.
- FIG. 14 illustrates an exemplary section 1300 using the concept gallery presentation format in which the hint window is no longer obstructing the display of image 1302.
- a hint window can be displayed only upon initial entry into a section using the concept gallery format. However, if different images in the section require different user actions in order to interact with the image, then a new hint window can be displayed.
- a digital application can display a control bar 1306 so that the user can control any audio, move to another image, and/or explore the image. In some cases, a control bar can remain visible.
- a digital application can be configured such that the control bar is only visible when a mouse pointer is active and/or when a mouse pointer is active in a predefined area, such as in the area of the control bar. Additional techniques for enabling and/or disabling a visual representation of a control bar are also possible.
- FIG. 15 illustrates an exemplary section 1300 using the concept gallery presentation format in which a user has zoomed in on an aspect of the image 1302.
- a user can zoom in using a mouse action, such as double clicking, double tapping, pinching, scrolling, etc.
- the control bar can be configured to include a zoom feature, such as zoom control 1502 in control bar 1306.
- the level of zoom permitted for an image can be based on the image resolution. For example, a digital content application that includes high-resolution images can include greater zoom functionality.
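- One way to tie the permitted zoom to image resolution, as suggested above, is to allow magnification until a single source pixel would span more than a fixed number of screen pixels; the `maxZoom` function and the cap value are assumptions.

```typescript
// Sketch of deriving a zoom limit from image resolution vs. viewport size.

function maxZoom(imageWidth: number, imageHeight: number,
                 viewportWidth: number, viewportHeight: number,
                 maxPixelScale = 2): number {
  // Zoom 1.0 = image fitted to the viewport; larger values magnify.
  const fitScale = Math.min(viewportWidth / imageWidth, viewportHeight / imageHeight);
  // Allow magnification until each image pixel spans maxPixelScale screen pixels.
  return maxPixelScale / fitScale;
}

// Example: a 4096x2304 capture in a 1280x720 viewport allows deeper zoom
// than a 1920x1080 capture in the same viewport.
console.log(maxZoom(4096, 2304, 1280, 720).toFixed(1)); // "6.4"
console.log(maxZoom(1920, 1080, 1280, 720).toFixed(1)); // "3.0"
```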
- Another presentation format can be multi-way view.
- a section that uses the multi-way presentation format can include multiple content items that each illustrate the same feature of a digital application from a different perspective.
- the multi-way presentation format can be used to illustrate the relative effectiveness of different gameplay strategies.
- each content item can be a video.
- a game player can complete a section of a digital application using a particular game play strategy.
- a video can include player commentary.
- FIG. 16 illustrates an exemplary section 1600 that uses the multi-way presentation format.
- Section 1600 includes three illustrative videos 1602, 1604, and 1606.
- a section using the multi-way presentation format can include any number of content items.
- a user can select one or more of the content items to discover information about the feature from the chosen perspective.
- one or more statistics or data points can be revealed. For example, after completing a strategy video for an electronic game, various game play statistics can be displayed, such as time to completion, resources used, achievements accomplished, change in health level, etc.
- FIG. 17 illustrates an exemplary set of multi-way content items 1700 after the user has viewed each content item at least once. After the content items have been viewed, the data points are revealed, such as statistics 1702.
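- The reveal-after-viewing behaviour could be sketched as follows: per-strategy statistics stay hidden until every content item has been viewed at least once. The types and example statistics are illustrative assumptions.

```typescript
// Minimal sketch: stats for each strategy are returned only once all
// content items have been viewed; otherwise they remain hidden (null).

interface StrategyItem {
  title: string;
  viewed: boolean;
  stats: { timeToCompletion: string; resourcesUsed: number; healthChange: number };
}

function visibleStats(items: StrategyItem[]): StrategyItem["stats"][] | null {
  return items.every(i => i.viewed) ? items.map(i => i.stats) : null;
}

const strategyItems: StrategyItem[] = [
  { title: "Stealth", viewed: true,
    stats: { timeToCompletion: "12:40", resourcesUsed: 3, healthChange: -5 } },
  { title: "Assault", viewed: false,
    stats: { timeToCompletion: "08:15", resourcesUsed: 9, healthChange: -40 } },
];
console.log(visibleStats(strategyItems));          // null: not all viewed yet
strategyItems[1].viewed = true;
console.log(visibleStats(strategyItems)?.length);  // 2: stats revealed
```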
- a section configured using the multi-way presentation format can also include a voting feature.
- the voting feature can allow a user to vote on a content item, such as the content item they found most useful or like the best. For example, a user can vote on the gameplay strategy the user liked best.
- the voting can be accomplished through vote buttons associated with each content item, such as vote button 1704. At some point a global vote count can be revealed to the user, such as vote count 1706. In some cases, the overall voting results can be hidden from a user until the user casts a vote. Additionally, the overall vote tally can be updated in real time and/or at periodic intervals.
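- A hedged sketch of the voting behaviour described above: one vote per user, with the overall tally hidden from a user until that user has cast a vote; the `VoteTally` class and its method names are assumptions.

```typescript
// Sketch of per-item vote counts that are only revealed to users who have voted.

class VoteTally {
  private counts = new Map<string, number>();
  private voters = new Set<string>();

  castVote(userId: string, itemId: string): void {
    if (this.voters.has(userId)) return;            // one vote per user
    this.voters.add(userId);
    this.counts.set(itemId, (this.counts.get(itemId) ?? 0) + 1);
  }

  tallyFor(userId: string): Map<string, number> | null {
    // Hide the overall results until this user has voted.
    return this.voters.has(userId) ? new Map(this.counts) : null;
  }
}

const votes = new VoteTally();
console.log(votes.tallyFor("alice"));   // null: results hidden before voting
votes.castVote("alice", "stealth-video");
console.log(votes.tallyFor("alice"));   // Map { 'stealth-video' => 1 }
```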
- a digital content application can also include a number of different social features, such as "likes" and comments. These social features can be in addition to those already mentioned above, such as the voting feature in the multi-way presentation format.
- the digital content application can include one or more comment features.
- the digital content application as a whole can have a comment feature, and each section in the digital content application can have its own associated comment section.
- a user can post a comment specific to the section and see comments posted by other users.
- FIG. 18 illustrates an exemplary comment section 1800 associated with a section in a digital content application.
- a comment can be directly linked to content in a section.
- a user can associate a comment with a particular video segment in a branch video section.
- a commenting feature can be configured so that a comment is associated with a specific time in a video. Then during video playback, the comments can be highlighted or revealed when the video reaches the point at which the comments were made.
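- Time-anchored comments of this kind could be surfaced with a simple filter that returns the comments whose timestamp playback has just reached; the one-second highlight window used below is an illustrative assumption.

```typescript
// Sketch of comments anchored to a specific time in a video, surfaced when
// playback reaches (or has just passed) that time.

interface TimedComment { timeSec: number; author: string; text: string }

function commentsToHighlight(comments: TimedComment[],
                             playbackSec: number,
                             windowSec = 1): TimedComment[] {
  return comments.filter(c => playbackSec >= c.timeSec &&
                              playbackSec < c.timeSec + windowSec);
}

const comments: TimedComment[] = [
  { timeSec: 42, author: "user1", text: "Watch the left flank here." },
  { timeSec: 90, author: "user2", text: "This jump is easier with a double tap." },
];
console.log(commentsToHighlight(comments, 42.5).map(c => c.text));
// -> ["Watch the left flank here."]
console.log(commentsToHighlight(comments, 60));   // -> []
```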
- Each section in the digital content application can also have an associated "likes" feature.
- individual content within a section can have a "likes" feature.
- FIG. 19 illustrates an exemplary section 1900 with a user control bar 1902 that includes a "like" button 1904. A user can click the "like" button to indicate that the user likes the particular content.
- the "likes" feature can include a "dislike" button. Additional methods of indicating user approval or disapproval are also possible.
- a digital content application can be configured with various ways to display the social features, such as likes, dislikes, votes, comments, etc.
- the display can be an overall tally.
- FIG. 20 illustrates an exemplary overall social statistics tally.
- a section icon on the opening screen can have an associated pop-up window with information regarding the section.
- active view section 2002 can have an associated pop-up information window 2004.
- the window 2004 can include a social statistics tally 2006. Additional methods of displaying social statistics are also possible.
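- The pop-up tally could be produced by rolling the section's social counters into a single summary string, roughly as sketched below; the field names and layout are assumptions.

```typescript
// Illustrative aggregation of per-section social statistics for a pop-up window.

interface SectionSocialData { likes: number; dislikes: number; comments: string[]; votes: number }

function socialTally(data: SectionSocialData): string {
  return `${data.likes} likes, ${data.dislikes} dislikes, ` +
         `${data.comments.length} comments, ${data.votes} votes`;
}

const activeViewSection: SectionSocialData = {
  likes: 128, dislikes: 4, comments: ["Great walkthrough", "More like this"], votes: 57,
};
console.log(socialTally(activeViewSection));
// -> "128 likes, 4 dislikes, 2 comments, 57 votes"
```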
- a digital application can include an achievements feature.
- the achievements feature can be configured such that as a user completes the various sections and/or sub-sections, the user is awarded a bonus.
- a bonus can be unlocking additional content, such as new game strategy hints.
- a bonus can also be a badge or some other achievement level indicator that can give the user status within the digital content application's social community.
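- The achievements behaviour might be sketched as a tracker that awards each section's bonus (an unlock or a badge) exactly once on completion; the bonus definitions below are hypothetical examples.

```typescript
// Hedged sketch: completing a section unlocks bonus content or awards a badge.

type Bonus = { kind: "unlock"; content: string } | { kind: "badge"; name: string };

class Achievements {
  private completed = new Set<string>();

  constructor(private bonuses: Record<string, Bonus>) {}

  complete(sectionId: string): Bonus | undefined {
    if (this.completed.has(sectionId)) return undefined;   // award each bonus once
    this.completed.add(sectionId);
    return this.bonuses[sectionId];
  }
}

const achievements = new Achievements({
  "effects-view": { kind: "unlock", content: "Night-vision strategy hints" },
  "multi-way": { kind: "badge", name: "Strategy Scholar" },
});
console.log(achievements.complete("multi-way"));   // { kind: 'badge', name: 'Strategy Scholar' }
console.log(achievements.complete("multi-way"));   // undefined (already completed)
```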
- Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon.
- Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above.
- non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design.
- Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments.
- program modules include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc. that perform particular tasks or implement particular abstract data types.
- Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
- Embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, videogame consoles, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
According to the invention, the effectiveness of user-guide content, such as strategy guides for electronic games, can be improved through the development and distribution of interactive digital content applications. An interactive digital content application can be designed to include content about, or taken directly from, a target digital application, such as audio, video, text, and/or images. In addition, a digital content application can include one or more interactive elements that require a user to take an active role with respect to the presented content. To facilitate interactivity, a digital content application can be designed to include one or more interactive presentation formats, which can include an active view, a branch video, an effects viewer, a concept gallery, and/or a multi-way view.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161512870P | 2011-07-28 | 2011-07-28 | |
US61/512,870 | 2011-07-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013016707A1 (fr) | 2013-01-31 |
Family
ID=46614660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/048721 WO2013016707A1 (fr) | 2011-07-28 | 2012-07-27 | Applications interactives de contenu numérique |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013016707A1 (fr) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109074404A (zh) * | 2016-05-12 | 2018-12-21 | 三星电子株式会社 | 用于提供内容导航的方法和装置 |
WO2021054852A1 (fr) | 2019-09-17 | 2021-03-25 | Акционерное общество "Нейротренд" | Procédé pour déterminer l'efficacité de la présentation visuelle de documents textuels |
US11563915B2 (en) | 2019-03-11 | 2023-01-24 | JBF Interlude 2009 LTD | Media content presentation |
US11997413B2 (en) | 2019-03-11 | 2024-05-28 | JBF Interlude 2009 LTD | Media content presentation |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1103920A2 (fr) * | 1999-11-25 | 2001-05-30 | Sony Computer Entertainment Inc. | Appareil de divertissement, méthode de génération d'images et support d'enregistrement |
WO2008147561A2 (fr) * | 2007-05-25 | 2008-12-04 | Google Inc. | Rendu, visualisation et annotation d'images panoramiques et ses applications |
- 2012-07-27: PCT application PCT/US2012/048721 filed as WO2013016707A1 (fr), status: active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1103920A2 (fr) * | 1999-11-25 | 2001-05-30 | Sony Computer Entertainment Inc. | Appareil de divertissement, méthode de génération d'images et support d'enregistrement |
WO2008147561A2 (fr) * | 2007-05-25 | 2008-12-04 | Google Inc. | Rendu, visualisation et annotation d'images panoramiques et ses applications |
Non-Patent Citations (4)
Title |
---|
ARTFUNKLE: "Skybox (2D)", 10 July 2011 (2011-07-10), XP002683397, Retrieved from the Internet <URL:https://developer.valvesoftware.com/w/index.php?title=Skybox_%282D%29&oldid=154679> [retrieved on 20120913] * |
MILLER G ET AL: "THE VIRTUAL MUSEUM: INTERACTIVE 3D NAVIGATION OF A MULTIMEDIA DATABASE", JOURNAL OF VISUALIZATION AND COMPUTER ANIMATION, XX, XX, vol. 3, no. 3, 1 January 1992 (1992-01-01), pages 183 - 197, XP000925118 * |
SHENCHANG ERIC CHEN ED - COOK R: "QUICKTIME VR - AN IMAGE-BASED APPROACH TO VIRTUAL ENVIRONMENT NAVIGATION", COMPUTER GRAPHICS PROCEEDINGS. LOS ANGELES, AUG. 6 - 11, 1995; [COMPUTER GRAPHICS PROCEEDINGS (SIGGRAPH)], NEW YORK, IEEE, US, 6 August 1995 (1995-08-06), pages 29 - 38, XP000546213, ISBN: 978-0-89791-701-8 * |
WILSON A ET AL: "A video-based rendering acceleration algorithm for interactive walkthroughs", PROCEEDINGS ACM MULTIMEDIA 2000 WORKSHOPS. MARINA DEL REY, CA, NOV. 4, 2000; [ACM INTERNATIONAL MULTIMEDIA CONFERENCE], NEW YORK, NY : ACM, US, vol. CONF. 8, 30 October 2000 (2000-10-30), pages 75 - 84, XP002175632, ISBN: 978-1-58113-311-0, DOI: 10.1145/354384.354431 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109074404A (zh) * | 2016-05-12 | 2018-12-21 | 三星电子株式会社 | 用于提供内容导航的方法和装置 |
EP3443489A4 (fr) * | 2016-05-12 | 2019-04-10 | Samsung Electronics Co., Ltd. | Procédé et appareil permettant une navigation de contenu |
US10841557B2 (en) | 2016-05-12 | 2020-11-17 | Samsung Electronics Co., Ltd. | Content navigation |
US11563915B2 (en) | 2019-03-11 | 2023-01-24 | JBF Interlude 2009 LTD | Media content presentation |
US11997413B2 (en) | 2019-03-11 | 2024-05-28 | JBF Interlude 2009 LTD | Media content presentation |
WO2021054852A1 (fr) | 2019-09-17 | 2021-03-25 | Акционерное общество "Нейротренд" | Procédé pour déterminer l'efficacité de la présentation visuelle de documents textuels |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Linowes | Unity virtual reality projects | |
US9665972B2 (en) | System for compositing educational video with interactive, dynamically rendered visual aids | |
Parisi | Learning virtual reality: Developing immersive experiences and applications for desktop, web, and mobile | |
Blackman | Beginning 3D Game Development with Unity 4: All-in-one, multi-platform game development | |
Herbst et al. | TimeWarp: interactive time travel with a mobile mixed reality game | |
US10970843B1 (en) | Generating interactive content using a media universe database | |
EP2887322B1 (fr) | Développement d'objets holographiques dans la réalité mixte | |
CN103959344B (zh) | 跨越多个设备的增强现实表示 | |
CN102663799B (zh) | 用于利用创作系统创建可播放场景的方法和装置 | |
Linowes | Unity 2020 Virtual Reality Projects: Learn VR development by building immersive applications and games with Unity 2019.4 and later versions | |
EP3422148B1 (fr) | Appareil et procédés associés d'affichage de contenu de réalité virtuelle | |
US20140049559A1 (en) | Mixed reality holographic object development | |
Linowes | Unity virtual reality projects: Learn virtual reality by developing more than 10 engaging projects with unity 2018 | |
CN107590771A (zh) | 具有用于在建模3d空间中投影观看的选项的2d视频 | |
US11513658B1 (en) | Custom query of a media universe database | |
CN106462324A (zh) | 一种用于在虚拟环境内提供交互性的方法和系统 | |
ONG. et al. | Beginning windows mixed reality programming | |
Glover et al. | Complete Virtual Reality and Augmented Reality Development with Unity: Leverage the power of Unity and become a pro at creating mixed reality applications | |
Smith et al. | Unity 5. x Cookbook | |
WO2013016707A1 (fr) | Applications interactives de contenu numérique | |
Khor et al. | AR Mobile Application for Enhancing National Museum Heritage Visualization | |
Schofield | Camera Phantasma: Reframing virtual photographies in the age of AI | |
Christian | Enhancing Virtual Reality Experiences with Unity 2022: Use Unity's latest features to level up your skills for VR games, apps, and other projects | |
Felicia | Getting started with Unity: Learn how to use Unity by creating your very own" Outbreak" survival game while developing your essential skills | |
Jones et al. | What is VR and why use it in research? |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12743636 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12743636 Country of ref document: EP Kind code of ref document: A1 |