US20040066395A1 - Graphical user interface for a motion video planning and editing system for a computer - Google Patents
- Publication number
- US20040066395A1 (U.S. application Ser. No. 10/673,663)
- Authority
- US
- United States
- Prior art keywords
- clip
- shot
- user
- computer
- motion video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/40—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/60—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
- H04N19/61—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/40—Combinations of multiple record carriers
- G11B2220/41—Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/90—Tape-like record carriers
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/36—Monitoring, i.e. supervising the progress of recording or reproducing
Description
- the present invention is related generally to systems for planning and editing motion video information. More particularly, the present invention is related to computer-assisted motion video editing systems, wherein motion video information is stored as a sequence of digital still images in a data file on a computer system.
- the high cost of motion video editing tools for computers is due, in part, to specialized hardware used to capture, digitize, compress, decompress and display motion video information on a computer screen with sufficient detail and resolution.
- the complexity of the graphical user interface of these motion video editing tools is due, in part, to the variety of possible operations which can be performed on motion video and also to a design for professionals familiar with motion video editing, and terms and concepts of that trade, with which the average person is unfamiliar. For example, many systems use a representation of a motion video composition as two tracks, A and B, between which “rolling” occurs.
- A/B rolling is unduly complex and has been simplified in other systems by using a “timeline,” such as in the AVID/1 Media Composer from Avid Technology, Inc., of Tewksbury, Mass.
- Another complexity is the use of multiple windows for controlling the various parameters of the motion video, displaying the motion video itself, sound track controls, and other features.
- a general aim of this invention is to provide a motion video editing system for a computer with reduced cost and complexity.
- Another aim of this invention is to provide a motion video editing system for a computer with tools for assisting creative design and planning of a motion video composition.
- the present invention provides a simplified interface which directs a user through the process of editing a video program.
- the interface also enables a user to plan a video program.
- selectable interfaces, each of which provides a group of planning, capturing, editing, and recording functions, provide an intuitive interface for producing a video program.
- Other simplifications to the user interface can be provided to assist in editing, such as by maintaining the video display window at a fixed position.
- video information can be captured directly into a timeline representation of a video program, rather than a bin. Using a storyboard tied to the capturing process, a user is directed through the process of collecting and capturing the video clips to be used in the video program.
- one aspect of the invention is a graphical user interface for a computer motion video editing system, which has a single window interface including a plurality of alternatively selectable interfaces.
- a first of the plurality of selectable interfaces is an interface for making capturing commands available to a user for receiving motion video information to be edited.
- a second of the plurality of selectable interfaces is an interface for making editing commands available to a user for editing the received motion video information.
- a third of the plurality of selectable interfaces is an interface for making playback commands available to a user for outputting the edited motion video information to an external device.
- a fourth of the plurality of selectable interfaces includes an interface for making storyboarding commands available to a user for preparing a plan describing a motion video program to be edited.
- the second of the plurality of selectable interfaces further includes a second plurality of alternatively selectable interfaces, wherein each selectable interface provides a set of editing functions of a particular type, and wherein each selectable interface has a video region for previewing the motion video program being edited and wherein the video region in each of the selectable interfaces is at an identical position within the single window interface.
- Another aspect of the invention is a graphical user interface for editing computer motion video having a single window interface having a plurality of alternatively selectable interfaces.
- Each selectable interface provides a set of motion video editing functions of a different type.
- Each selectable interface also has a video region for previewing the motion video to be edited such that the video region in each selectable interface is at the same position within the single window interface.
- Another aspect of the invention is a graphical user interface for a computer for assisting editing of a motion video program, having a planner module with inputs for receiving commands from a user descriptive of a plan of shots of video in the video program and providing an output representative of the plan.
- a capture module has a first input for receiving the plan defined by the user, a second input for receiving an input from a user for controlling recording of motion video information, and a third input for receiving a selection of a shot in the received plan.
- the capture module has an output in which captured motion video information is associated directly with the selected shot to provide the motion video program as a sequence of the recorded clips in an order defined by the plan.
- Another aspect of the invention is a computer video capture system which represents a sequence of video clips in a video program. Clips of a video program are captured directly into the represented sequence.
- Another aspect of the invention is a set of a plurality of predefined plans stored in a computer memory.
- One aspect of this invention includes a mechanism for selecting one of the plans, for editing a selected plan, and for capturing video and for automatically generating a video sequence according to the selected plan.
- Another aspect of the invention is a process for capturing motion video information and for generating a video program of a plurality of clips of captured motion video information.
- the process involves selecting a clip of the video program, capturing video information and associating the captured video information with the selected clip of the video program.
- the step of selecting includes the step of selecting a shot from a plan representing and associated with the video program.
- the step of associating includes the step of associating the captured video information with the clip of the video program associated with the selected shot from the plan.
- the process further involves performing the step of indicating whether a clip of the video program has captured motion video information associated thereto.
- FIG. 1 is a block diagram of an example general purpose computer system in which the present invention may be implemented
- FIG. 2 is an example memory system shown in FIG. 1;
- FIG. 3 is a diagram illustrating software layers in one embodiment of the present invention.
- FIG. 4 is a perspective view of a computer system having a display showing one embodiment of the graphical user interface of the present invention
- FIG. 5 is a graphic of a graphical user interface for providing planning functions in accordance with one embodiment of the present invention.
- FIG. 6 is a diagram of a data structure for representing shots in accordance with one embodiment of the present invention.
- FIG. 7 is a diagram of a data structure for representing clips in accordance with one embodiment of the present invention.
- FIG. 8 is a graphic of a graphical user interface for providing capturing functions in accordance with one embodiment of the present invention.
- FIGS. 9 - 13 are graphics of graphical user interfaces for providing editing functions in accordance with one embodiment of the present invention.
- FIG. 14 is a graphic of a graphical user interface for providing recording functions in accordance with one embodiment of the present invention.
- FIG. 15 is a block diagram illustrating interaction between a module for maintaining and displaying a storyboard and a module for creating and maintaining clip description of a composition
- FIG. 16 is a flowchart describing how clip descriptions and shot descriptions are synchronized during capture of motion video information
- FIGS. 17 a - 17 e are a representation of timeline behavior produced in response to a user operation
- FIGS. 18 a - 18 b are a representation of timeline behavior produced in response to a user operation
- FIGS. 19 a - 19 d are a representation of timeline behavior produced in response to a user operation
- FIGS. 20 a - 20 f are a representation of timeline behavior produced in response to a user operation
- FIGS. 21 a - 21 d are a representation of timeline behavior produced in response to a user operation
- FIGS. 22 a - 22 g are a representation of timeline behavior produced in response to a user operation
- FIGS. 23 a - 23 c are a representation of timeline behavior produced in response to a user operation.
- FIGS. 24 a - 24 i and 24 k - 24 m are a representation of timeline behavior produced in response to a user operation.
- the graphical user interface directs a user through the steps of editing a motion video program, including planning (storyboarding), capturing the video information, editing the video information, and exporting the video information to a final data file or a video tape.
- the user is directed through the steps of editing the primary content of the video program, adding effects at transitions between video clips, adding titles and credits, and finally, editing sound.
- a composition is a heterogeneous aggregation of tracks and, in one embodiment of the invention, includes five tracks: one title track, one video track, and three audio tracks.
- the composition is also referred to as a motion video program.
- One of the audio tracks is synchronized and grouped with the video track (the audio track that is captured with the video), one audio track is called a voice-over track, and the third audio track is a music track.
- Each track is a two part entity: a synchronized media subtrack and an effects subtrack.
- Each subtrack consists of a sequence of segments and holes.
- the media subtrack includes media segments, and the effects subtrack includes effects segments.
- a media segment is a portion of a media subtrack with a time-based beginning and ending.
- the interior of a media segment refers to a portion of a media clip.
- a media clip is an independent, playable entity which has duration and possibly multiple pieces of synchronized media associated with it.
- Media clips also have ancillary data associated with them, such as a name and description.
- Media is motion video media, audio media, or text media stored in a data file on a computer, for example, in a QuickTime file.
- a sync-lock group is a group of segments which have been grouped together for editing purposes. Editing operations will not move the components of a sync-lock group relative to each other.
- the video track and its corresponding audio track may be the only sync-lock group and cannot be unlocked or unsynced.
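- As a rough illustration of the track structure just described, the following C++ sketch models a composition with its five tracks, each split into a media subtrack and an effects subtrack; the type names (Composition, Track, Segment) are assumptions made for this sketch and do not come from the patent.

```cpp
#include <string>
#include <vector>

// One span on a subtrack: either a segment or a hole.  Times are in frames.
struct Segment {
    bool isHole = false;     // a hole displays black, plays silence, or is transparent
    long start = 0;          // incoming edge, measured from the start of the composition
    long duration = 0;       // outgoing edge = start + duration
    std::string clipName;    // name of the associated media clip; empty for holes
};

// Each track is a two-part entity: a synchronized media subtrack and an
// effects subtrack, each consisting of a sequence of segments and holes.
struct Track {
    std::vector<Segment> media;
    std::vector<Segment> effects;
};

// A composition (motion video program) with the five tracks described above.
// The video track and its captured-audio track form the only sync-lock group,
// so editing operations never move them relative to each other.
struct Composition {
    Track title;
    Track video;
    Track capturedAudio;   // audio captured with the video
    Track voiceOver;
    Track music;
};
```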
- a media segment is a video media segment, audio media segment, and text media segment, depending on which track the segment resides.
- a hole is a span in a track with a time-based beginning and ending which has no associated segment. On the video track, a hole displays black. On an audio track, a hole plays silence. On the titles track, a hole displays full transparency. Relative to a point or span in the composition, upstream composition elements are located earlier in the composition and downstream composition elements are located later in the composition. The beginning of a media segment is called its incoming edge, and the ending of a media segment is called its outgoing edge. The edges of media segments are also called transition points.
- a transition point has zero length.
- the edges of a group are transition points where a segment on one side of the transition is inside the group and any segment on the other side of the transition is outside of the group.
- a cut is a transition point that does not have an effect segment spanning it.
- the outgoing segment is the segment which displays before the transition point
- the incoming segment is the segment which displays after the transition point.
- the outgoing segment is to the left of a cut in the timeline; the incoming segment is to the right.
- the present invention may be implemented using a digital computer.
- a typical computer system 20 is shown in FIG. 1, and includes a processor 22 connected to a memory system 24 via an interconnection mechanism 26.
- a special-purpose processor 23 may also be used for performing specific functions, such as encoding/decoding of data, or complex mathematical or graphic operations.
- An input device 28 is also connected to the processor and memory system via the interconnection mechanism, as is an output device 30 .
- the interconnection is typically a combination of one or more buses and one or more switches.
- the output device 30 may be a display 32 and the input device may be a keyboard 34 or mouse 36 .
- the processor, interconnection mechanism and memory system typically are embodied in a main unit 38 .
- Example output devices include a cathode ray tube (CRT) display, liquid crystal display (LCD), printers, communication devices, such as a modem, and audio output. To enable recording of motion video information in an analog form, this computer system also may have a video output for providing a video signal to a VCR, camcorder or the like.
- one or more input devices 28 may be connected to the computer system.
- Example input devices include a video capture circuit connected to a VCR or camcorder, keyboard, keypad, trackball, mouse, pen and tablet, communication device, audio input and scanner. The motion video capture circuit may be one of many commercially available boards.
- a video capture card may connect to the PCI interface, and may use Motion-JPEG video compression and pixel averaging to compress images to 320×240 pixels at 30 frames per second.
- the video capture card may receive and may output composite video and S-video. It should be understood that the invention is not limited to the particular input or output devices used in combination with the computer system or to those described herein. The input and output devices may be included within or external to the main unit 38 .
- the computer system 20 may be a general purpose computer system, which is programmable using a high level computer programming language, such as “C++” or “Pascal”.
- the computer system may also be implemented using specially programmed, special purpose hardware.
- the processor is typically a commercially available processor, such as the Power PC 603e RISC microprocessor. It may include a special purpose processor such as a CL540B Motion JPEG compression/decompression chip, from C-Cube of Milpitas, Calif. Many other processors are also available.
- Such a processor executes a program called an operating system, for example the Macintosh operating system (Macintosh System Software, version 7.5.3), which controls the execution of other computer programs and provides scheduling, debugging, input/output control, accounting, compilation, storage assignment, data management, memory management, communication control and related services.
- the processor and operating system define a computer platform for which application programs in high level programming languages are written. It should be understood that the invention is not limited to a particular computer platform, particular operating system, particular processor, or particular high level programming language.
- the computer system 20 may be a multi-processor computer system or may include multiple computers connected over a computer network.
- One embodiment of the present invention is implemented using either a Macintosh Performa computer or Power Macintosh computer, with a PCI expansion slot and the Apple Video System, such as Performa 5400, 5420 or 6400 series computers from Apple Computer of Cupertino, Calif.
- an Apple Power Macintosh computer with a built-in compositor as video input and a PCI expansion slot such as the 7600 or 8500 series computers with audio/video capabilities may be used.
- the computer system may also include an application for managing motion video files, such as the QuickTime 2.5 motion video system of Apple Computer.
- a memory system typically includes a computer readable and writable non-volatile recording medium 40 , of which a magnetic disk, a flash memory, and tape are examples.
- the disk may be removable, known as a floppy disk, and/or permanent, known as a hard drive.
- a PowerPC processor-based Macintosh Performa computer having a gigabyte or more capacity hard disk drive and at least 16 to 24 megabytes of DRAM is preferred.
- the disk should have sufficient size to hold the video information to be edited, which is typically around 830 k bytes per second.
- the disk is part of the memory system shown in FIG. 2.
- the processor 22 causes data to be read from the non-volatile recording medium 40 into an integrated circuit memory element 46 , which is typically a volatile random access memory, such as a dynamic random access memory (DRAM) or static memory (SRAM).
- the system memory may be used as a buffer between the disk and the output device 30 for the video information, as will be described in more detail below.
- the processor generally causes the data to be manipulated within the integrated circuit memory 46 and copies the data to the disk 40 if modified, when processing is completed.
- a variety of mechanisms are known for managing data movement between the disk 40 and the integrated circuit memory 46 , and the invention is not limited thereto. It should also be understood that the invention is not limited to a particular memory system.
- a video capture card is provided as indicated at 62 .
- the QuickTime video system 64 interacts with the video capture card 62 via drivers 66 .
- a video player system 67 such as the Apple Video Player, interacts with QuickTime 2.5.
- the software providing the editing instructions and graphical user interface to access these instructions is also designed to interact with QuickTime in parallel with the video player, as indicated at 68 .
- FIG. 4 shows, on an output device 32 , a perspective view of a graphical user interface in one embodiment of the invention.
- a single window interface 50 is shown, having several selectable interfaces.
- the interfaces 52 , 54 , 56 and 58 are selectable by tabs. It should be understood that many other mechanisms are available, such as so-called “radio buttons” or “check boxes,” which may be used to select a desired interface.
- each interface is selectable in response to a cursor controlled input device, such as a mouse 36 , but may also be keyboard operated.
- the graphical user interface 50 and its functionality will now be described in more detail in connection with FIGS. 5 - 16 .
- the four interfaces in this embodiment include a storyboard interface 52, an interface 54 for bringing in motion video information, an interface 56 for editing a movie, and an interface 58 for sending a movie out, for example, for recording to an external videotape device. Unless a previously stored composition is being opened for editing, the user is presented with either the storyboard interface or the bring video in interface when the editing system is first used.
- the storyboard interface 52 enables a user to plan the motion video program to be prepared.
- storyboards or plans include filming tips and editing tips for common motion video programs, such as a birthday party, graduation or wedding.
- One aspect of the invention is that such storyboards and plans can be produced and distributed separately from the computer program and from actual motion video programs, by storing them on a computer-readable medium such as a floppy disk or CD-ROM or by making them accessible through a computer network.
- the storyboard interface 52 displays a written description of a composition or video program, including the title 70 of the composition and a linear sequence of description 72 of each shot. In one embodiment, these sequences represent the segments present in the video media track only of the composition. Holes are not represented in the storyboard.
- the displayed description of each shot includes the title 74 of the shot, a duration 76 (either actual or estimated), and a description 80 of either a filming tip or an editing tip.
- the duration may be a suggested duration or an actual duration of any media associated with the shot.
- Each shot is assigned a number, sequentially, which is displayed over a still image 78 .
- the still image may be the first frame of an associated media clip or a default image used for all shots. Display of filming or editing tips is performed by selection of these options via an interface 82 .
- a scroll bar 83 enables a user to scroll through the view of the storyboard for the selected video program.
- the “down” arrow key changes the current selection to the group containing the first shot which follows the last shot in the current selection.
- the “up” arrow key changes the current selection to the group containing the last shot which precedes the first shot in the current selection.
- the “Home” key changes the current selection to the group containing the first shot in the storyboard.
- the “End” key changes the current selection to the group containing the last shot in the storyboard.
- “Page Up” and “Page Down” keys may be used to scroll through several shot descriptions at a time.
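- The keyboard navigation above might be handled as in this simplified sketch, which tracks a single selected shot rather than a group; the key names and the function are assumptions made for illustration.

```cpp
#include <algorithm>

enum class Key { Up, Down, Home, End };

// Update the index of the currently selected shot in response to a key press;
// shotCount is the number of shots in the storyboard.
void handleStoryboardKey(Key key, int& selected, int shotCount) {
    switch (key) {
    case Key::Down: selected = std::min(selected + 1, shotCount - 1); break; // next shot
    case Key::Up:   selected = std::max(selected - 1, 0);             break; // previous shot
    case Key::Home: selected = 0;                                     break; // first shot
    case Key::End:  selected = shotCount - 1;                         break; // last shot
    }
}
```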
- the information used for each shot to enable the display of the storyboard interface 52 can be represented by an array or other ordered structure 86 of shot descriptions 87 (see FIG. 6) which stores, for each shot, the title 90, a film tip 94, an editing tip 96, a duration 98 and an indication 100 of a pointer to another structure representing a clip of media data captured in a video data file and associated with the shot.
- Operations which edit, delete, or add information about a shot for a given video program manipulate the data in this data structure 86 .
- the displays shown in FIG. 5 are generated by creating display objects in response to data read from the data structure for a shot. These display objects are regenerated when necessary in response to changes to the data that they represent, as will be described in more detail below.
- a data structure 88 similar to data structure 86 may also be used to represent the motion video program itself, and includes clip descriptions 89 for each clip including a reference to a motion video data file to be used to produce the clip.
- Such a data structure 88 is shown in FIG. 7. It should be understood that the shot descriptions 87 in FIG. 6 and the clip descriptions 89 in FIG. 7 may be combined into one structure to represent the storyboard and motion video data of a motion video program. While the clip descriptions and shot descriptions may have redundant data, the redundant data clearly can be omitted and represented in only one of the structures or only once in a combined structure.
- the clip data structure 88 may be implemented as a QuickTime movie.
- a clip description will have an indication of a file name 102 , indication of start and stop times 104 within the file, and other information 106 about the data file.
- a clip description may have empty fields, i.e., no video data file, yet but have a duration, to indicate a “hole” in a track in the program.
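- As a rough illustration of data structures 86 and 88, the C++ sketch below stores the fields named above; the member names mirror the reference numerals, while the types themselves are assumptions for illustration only.

```cpp
#include <string>
#include <vector>

// One clip description 89 in the clip data structure 88 (FIG. 7).
struct ClipDescription {
    std::string fileName;    // file name 102 of the motion video data file
    long startFrame = 0;     // start time 104 within the file
    long stopFrame = 0;      // stop time 104 within the file
    std::string info;        // other information 106 about the data file
};

// One shot description 87 in the shot data structure 86 (FIG. 6).
struct ShotDescription {
    std::string title;          // title 90
    std::string filmTip;        // film tip 94
    std::string editingTip;     // editing tip 96
    double durationSeconds = 0; // duration 98, estimated or actual
    int clipIndex = -1;         // indication 100: index of the captured clip, -1 if none
};

// The storyboard and timeline are ordered sequences of these descriptions.
using Storyboard = std::vector<ShotDescription>;
using Timeline   = std::vector<ClipDescription>;
```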
- any one of the displayed elements 74 , 76 and 80 may be selected and edited by a user.
- Operations on a shot such as insertion of a shot, deletion of a shot and moving of a shot are also possible.
- a storyboard can also be printed to allow a user to use the filming tips during filming, for example.
- Moving a shot may be performed by the user selecting and dragging a shot to a transition point in the display between shots.
- the computer detects the location of, for example, a mouse cursor, determines the shot which the selected shot should follow, and rearranges the order of the shot descriptions in the data structure 86.
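- The rearrangement of the shot descriptions on such a drag could be implemented along these lines; moveShot is a hypothetical helper and not a name used in the patent.

```cpp
#include <vector>

// Move the shot at index 'from' so that it immediately follows the shot at
// index 'after' (pass after = -1 to move it to the front), rearranging the
// order of the shot descriptions in data structure 86.
template <typename Shot>
void moveShot(std::vector<Shot>& shots, int from, int after) {
    Shot moved = shots[from];
    shots.erase(shots.begin() + from);
    if (after >= from) --after;            // account for the element just removed
    shots.insert(shots.begin() + after + 1, moved);
}
```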
- New shots may be added via a command button 84, through which a new shot description 72 is added with blank fields 74, 76, 78 and 80, either immediately after a selected shot or at the end of a list of shots.
- the new shot exists only in the storyboard and is not added to the timeline of clips until the associated media data is captured.
- a default title, e.g., “Untitled”, and duration, e.g., “0”, and empty strings for filming and editing tips may be used for the new shot.
- shots deleted from the storyboard are deleted from the timeline also, if there is a corresponding clip description, but the associated media is not deleted. Only the reference to the media in the clip description is deleted.
- the operations performed on the clips in the timeline preferably are reflected automatically in the shot descriptions of the storyboard and vice versa. While this feature is easily implemented by representing the shot and clips using a single data structure, when clips and shots are represented separately, each operation on a clip or shot description should also make appropriate modifications to a corresponding shot or clip description, respectively. The process of controlling the clip and shot descriptions for this purpose will be described in more detail below.
- the combination of the shot descriptions and clip descriptions are particularly useful in capturing motion video information from a video storage device, such as a camcorder, into a motion video data file where it can be edited on the computer.
- An interface 54 providing commands for capturing motion video, i.e., bringing motion video data into the computer system is shown in FIG. 8.
- the interface for capturing motion video into the computer includes a display area 120 , which displays motion video information currently being received by the computer as an input. For example, a user may be playing back a videotape on a camcorder connected as an input device through a video capture board to the computer system. If no video is available, the display area 120 can convey an instruction to connect a video source to the computer.
- a control 122 controls recording of the received motion video information.
- a display region 132 also displays available disk area as a function of time of video information which can be captured. In this example, it is assumed that roughly 27.7 k is required for each frame, such that roughly 830 k is required for each second of video information. This value is generated by monitoring the available disk space, dividing the available space value by a target size per frame (resulting in a number of frames which can be stored), and converting that quotient into minutes and seconds using the time resolution of the video, e.g., 30 frames per second.
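- A minimal sketch of that conversion follows, assuming the figures quoted above (roughly 27.7 k bytes per frame at 30 frames per second); the function name is hypothetical.

```cpp
#include <cstdio>

// Convert available disk space into minutes and seconds of video that can
// still be captured, for display in region 132.
void remainingCaptureTime(long long freeBytes, int& minutes, int& seconds) {
    const long long bytesPerFrame = 27700;         // ~27.7 k bytes per frame
    const int framesPerSecond = 30;                // time resolution of the video
    long long frames = freeBytes / bytesPerFrame;  // number of frames that fit on disk
    long long totalSeconds = frames / framesPerSecond;
    minutes = static_cast<int>(totalSeconds / 60);
    seconds = static_cast<int>(totalSeconds % 60);
}

int main() {
    int m = 0, s = 0;
    remainingCaptureTime(500LL * 1000 * 1000, m, s);   // e.g. 500 MB free
    std::printf("%d:%02d of video can still be captured\n", m, s);
    return 0;
}
```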
- a storyboard region is also displayed at 134 to indicate the plan of shots for the selected video program for which data is being captured.
- a timeline 136 is displayed which corresponds to the storyboard region 134 .
- the storyboard region 134 includes, for each shot, its title 138 , an indication 140 of whether or not the video data for the shot has been captured (determined using the reference field 100 , FIG. 6), and an indication 142 of the title of the video program.
- a selection button 144 also allows for the insertion of a new shot, similar to the operation performed using button 84 in FIG. 5.
- shots may be selected, inserted or deleted. Such functionality can be provided using standard techniques. Because operations on this interface affect the data structure shown in FIG. 6, they are also reflected in the storyboard interface 52.
- the timeline 136 has a display object 146 for each clip which is captured.
- the display object has a size, which is calculated as a function of the duration of the clip, and a title 148 , obtained from the title of the corresponding shot description.
- a bar 150 also indicates whether audio is associated with the clip.
- Motion video information is captured using this interface 54 and is tied directly to a selected shot.
- the first shot in the storyboard for which motion video information has not yet been captured is selected.
- the user may select any given shot in the storyboard region for capturing associated motion video information.
- the user may cause motion video information to be input to the computer by playing a portion of a videotape from a camcorder device.
- the input motion video data is displayed in display area 120 .
- the user depresses button 124 to begin capture.
- the captured motion video information is stored in a data file on the hard disk of the computer system.
- the file name of that file is associated with the selected shot, if any, and corresponding clip in the storyboard and timeline. If no shot is selected, then a new media file is created in a library or directory of files.
- the stop button 126 is depressed and the data file on the hard disk is closed.
- the motion video information is automatically and immediately associated with a selected shot.
- the need for a “bin” of motion video data files is eliminated and the user interface is simplified.
- a message may be displayed to the user that tells the user to continue to the next selectable interface, for editing the movie. Nonetheless, the user may still add shots and capture more video.
- a storyboard module 200 is a part of the computer program which handles operations on shot descriptions of a storyboard. It receives as an input, and outputs, shot descriptions 202 .
- User input 204 is processed to change the data in the shot descriptions and to generate the displayed graphics 206 of the storyboard interface 52.
- a capture module 208 processes the shot descriptions and the clip descriptions 216 to provide the display graphics 210 of interface 54 . It also processes user input 212 to perform operations such as capturing data or inserting and deleting shots.
- Video input and output 214 is controlled into data files.
- the clip descriptions 216 are created and modified according to the selected shot and the name of the data file into which the data is captured.
- the capture module 208 modifies the corresponding clip description 216 .
- the corresponding shot is modified via a message passing technique, indicating a clip that is modified and the operation causing the modification.
- FIG. 16 is a flowchart describing an example operation in which the clip descriptions and shot descriptions are synchronized.
- a data file for the video information is created in step 220 .
- Video data is then captured in steps 222 and 224 .
- a clip description is created with a reference to the data file, and start and stop times corresponding to the beginning of the file in step 226 .
- This clip description is stored in a data structure 88 which represents the sequence of clip descriptions which make up the timeline.
- a message is then passed in step 228 to the storyboard indicating that a clip was created, having a duration.
- the selected shot description modifies its duration and pointer to reference the new clip description in step 230 .
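- The flow of FIG. 16 might look roughly like the following sketch; captureVideoToFile() is a hypothetical stand-in for the capture hardware, and the structures are stripped-down versions of the shot and clip descriptions discussed earlier.

```cpp
#include <string>
#include <vector>

struct ClipDescription { std::string fileName; long startFrame = 0, stopFrame = 0; };
struct ShotDescription { std::string title; double durationSeconds = 0; int clipIndex = -1; };

long captureVideoToFile(const std::string& fileName);   // stand-in: returns frames captured

void captureIntoShot(ShotDescription& shot, std::vector<ClipDescription>& timeline) {
    // Step 220: create a data file for the video information.
    std::string fileName = shot.title + ".mov";

    // Steps 222-224: capture video data into the file.
    long framesCaptured = captureVideoToFile(fileName);

    // Step 226: create a clip description referencing the data file, with start
    // and stop times corresponding to the beginning of the file, and store it in
    // the structure representing the timeline.
    timeline.push_back({fileName, 0, framesCaptured});

    // Step 228: a message is passed to the storyboard that a clip was created;
    // step 230: the selected shot updates its duration and clip reference.
    shot.durationSeconds = framesCaptured / 30.0;
    shot.clipIndex = static_cast<int>(timeline.size()) - 1;
}
```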
- the interface for editing a movie has a timeline region 160 , which includes a representation of a timeline 162 , associated title track 164 , an additional audio track 166 , and a soundtrack 168 .
- a timeline is a time-based representation of a composition. The horizontal dimension represents time, and the vertical dimension represents the tracks of the composition. Each of the tracks has a fixed row in the timeline which it occupies. The video track is split into three rows, including the effect subtrack, the video media subtrack, and the audio subtrack.
- the size of a displayed element is determined as a function of the duration of the segment it represents and a timeline scale, described below.
- Each element in the title, audio and soundtrack timelines has a position determined by its start time within the motion video program, a duration, a title, and associated data.
- Each track is thus represented by a structure similar to data structure 88 , but audio tracks have references to data files containing audio information.
- the timeline also has a scale which specifies how much time a certain number of pixels represents.
- To increase the scale means to increase the number of pixels that represent one time unit.
- Providing a mechanism to increase and decrease the time scale allows a user to focus in on a particular location in the composition, or to have more of an overview of the composition.
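- In code, the scale reduces to a single conversion from duration to pixels; pixelsPerSecond is an assumed name for the scale value.

```cpp
// Width in pixels of a displayed timeline element, as a function of the
// duration of the segment it represents and the timeline scale.
int elementWidthPixels(double durationSeconds, double pixelsPerSecond) {
    return static_cast<int>(durationSeconds * pixelsPerSecond + 0.5);  // round to nearest pixel
}

// Increasing pixelsPerSecond zooms in on a particular location in the
// composition; decreasing it gives more of an overview.
```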
- Each of the selectable interfaces of the editing interface has a viewer window 172 , which has the same size and location within each window.
- the viewer window 172 also has an associated timeline 174 , representing the entire video program, which includes a play button 176 , forward and backward skip buttons 178 and 180 , and a position indicator 182 , which points to the present position within the video program which is being played back.
- the indicator 184 is linked to another position indicator 186 in the timeline region 160 .
- the various buttons 176 , 178 , 180 and indicator 182 can be used to control viewing of the video program being edited.
- the program can be played back at a full rate, paused to show a still frame and shuttled to view individual frames to the left and/or right at a number of speeds.
- a display region 188 shows the title and duration of the video program.
- a user can play back the video program, adjust the duration of clips (by trimming), delete clips, insert clips and/or move clips within the video program.
- a segment in the timeline may also be split into two separate segments or clips. These operations can be performed by simple cut and paste operations on the timeline 162 which can be implemented using standard techniques. For example, deletion of a clip from the timeline replaces the clip description with a hole of the same duration. The reference to this clip is removed from the corresponding shot description.
- clips are insertable at transitions and can be performed using a “drag and drop” operation, which can be implemented using standard techniques. Insertion of a clip involves creating a hole the size of the clip, then replacing the hole with the clip to be inserted. The hole may be created after a selected clip, at a transition point nearest the drop or anywhere beyond the end of the last clip in the timeline. It may be desirable to show what the timeline would look like if a drop were to occur when the user has a drop position selected, but prior to the drop operation being performed.
- An inserted clip may be selected by a copy or cut operation, followed by a paste operation; a selection from a library; or by dragging a selected clip to the desired location (which is in essence a combination of cut and paste operations).
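- A sketch of the hole-based delete and insert behavior described above, operating on a sequence of timeline elements; the TimelineElement type and the function names are assumptions for illustration.

```cpp
#include <cstddef>
#include <vector>

// A timeline element is either a clip or a hole, measured in frames.
struct TimelineElement {
    bool isHole = false;
    long durationFrames = 0;
    int clipIndex = -1;        // index into the clip descriptions; -1 for holes
};

// Deleting a clip replaces it with a hole of the same duration, so elements
// downstream of the deletion keep their positions in the composition.
void deleteClip(std::vector<TimelineElement>& timeline, std::size_t pos) {
    timeline[pos].isHole = true;
    timeline[pos].clipIndex = -1;
}

// Inserting a clip creates a hole the size of the clip at the transition
// point, then replaces that hole with the clip being inserted.
void insertClip(std::vector<TimelineElement>& timeline, std::size_t pos,
                const TimelineElement& clip) {
    TimelineElement hole{true, clip.durationFrames, -1};
    timeline.insert(timeline.begin() + pos, hole);
    timeline[pos] = clip;
}
```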
- Trim operations add or remove frames from selected edges of segments in the composition.
- a trim right operation either removes frames from an incoming edge or adds frames to an outgoing edge.
- a trim left operation either removes frames from an outgoing edge or adds frames to an incoming edge. This operation is performed by simply adjusting the start or stop frames in the clip description. A trim operation accordingly cannot add or remove frames beyond the boundary of the data file used by the clip. To provide additional boundary conditions on the trim operation, the start point may be required to precede the stop point and define at least one frame. Trim operations other than edge trims may provide more advanced functionality, but are likely not to be needed by the nonprofessional.
- the selection of a right trim or left trim operation uses some mechanism for the user to select an edge and to indicate that a trim operation is desired.
- One example mechanism which may be used is “trim handles,” which are displayed on the left and right ends of a displayed clip when a user selects the clip. The user may then drag the edge to the desired trim point.
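- The boundary conditions on a trim described above might be enforced as in the following sketch, assuming frame counts stored in the clip description (std::clamp requires C++17).

```cpp
#include <algorithm>

// Adjust the start and stop frames of a clip description for a trim.
// deltaIn moves the incoming (left) edge and deltaOut moves the outgoing
// (right) edge; positive values move an edge to the right.  The edges are
// clamped so the clip never extends beyond its data file and always
// contains at least one frame.
void trimClip(long& startFrame, long& stopFrame,
              long deltaIn, long deltaOut, long fileLengthFrames) {
    startFrame = std::clamp(startFrame + deltaIn, 0L, stopFrame - 1);
    stopFrame  = std::clamp(stopFrame + deltaOut, startFrame + 1, fileLengthFrames);
}
```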
- a timeline behavior specification is provided by FIGS. 17 a - 24 m, which describe in more detail the desired behavior in response to most user operations.
- FIGS. 17 a - 24 m are diagrams which show insertion, deletion, and trimming operations in a timeline which are possible by adding frames or clips and by removing frames or clips. These figures are shown as examples only and many other operations are possible.
- FIG. 17 a shows that a selected clip A may be indicated by a long clip which has, for example, a length of ten frames or a short clip which has a length of five frames.
- Other clips B, C and D are shown in FIG. 17 b .
- holes are generally shown with a length of eleven frames.
- a playout position is indicated in the clip as shown in FIG. 17 c .
- a playout position may be indicated in a hole as shown in FIG. 17 d , while a selected transition may be indicated as shown in FIG. 17 e.
- FIGS. 18 a - b illustrate behavior which occurs when holes are removed from a timeline. As shown in FIG. 18 a , a hole which is eleven frames in length exists between clips B and C. FIG. 18 a then illustrates the timeline after the hole has been removed. Similarly, FIG. 18 b illustrates a hole before clip B and a hole before clip C. FIG. 18 b then illustrates the timeline after removal of the first hole.
- FIGS. 19 a - 19 d illustrate behavior which occurs by adding a hole.
- in FIG. 19 a , a transition is selected between clips B and C.
- FIG. 19 a then illustrates the timeline after a hole has been added between clips B and C.
- FIG. 19 b illustrates a timeline having a hole between clips C and D.
- FIG. 19 b illustrates a timeline after a hole has been added following clip B.
- FIG. 19 c illustrates a transition before clip C
- FIG. 19 c illustrates a hole which is added before the selected transition in clip C. Since a hole seven frames in length already existed before the transition in clip C, as shown in FIG. 19 c , only four frames need to be added to create a hole before the selected transition.
- in FIG. 19 d , there is a seven-frame gap between a selected transition in clip B and clip D.
- FIG. 19 d then illustrates a hole which has been added between clips B and D by adding four frames to the existing seven frames.
- FIGS. 20 a - 20 f illustrate delete/cut behavior.
- FIGS. 20 a - 20 c do not include a hole before or after the deletion.
- FIG. 20 a illustrates three clips in a timeline.
- FIG. 20 a then illustrates deleting a front clip A, which results in a timeline having clips B and C.
- FIG. 20 b illustrates three clips including a middle clip A.
- FIG. 20 b then illustrates a timeline after a middle clip A has been removed.
- FIG. 20 c illustrates three clips with a last clip A, then FIG. 20 c illustrates deleting the last clip A to result in a timeline having only clip B and C.
- FIG. 20 g illustrates a timeline with three clips A,B and C including a hole which is eleven frames in length.
- FIG. 20 h illustrates a timeline after clip A has been removed in which the hole is preserved between clips B and C.
- FIG. 20 i illustrates a timeline having three clips with a hole between clips B and A.
- FIG. 20 j illustrates removing clip A, resulting in a timeline including clips B and C with the hole now between clips B and C.
- FIG. 20 k illustrates three clips wherein clip A is included in the hole between clips B and C.
- FIG. 20 l illustrates a timeline which results after deleting clip A from the hole.
- FIGS. 21 a - d illustrate pasting a clip into a timeline.
- in FIG. 21 a , a timeline includes clips B and C with a transition between them. Clip A is pasted between clips B and C, resulting in the timeline next shown in FIG. 21 a .
- in FIG. 21 b , a timeline is shown with a hole beyond a selected transition.
- FIG. 21 b then illustrates a timeline after a clip A has been inserted between clips B and C.
- FIG. 21 c illustrates a timeline having clips C and D including a hole between a selected transition in clip C.
- FIG. 21 c then illustrates a clip which has been pasted before the selected transition in clip C.
- FIG. 21 d illustrates a hole existing between the transition in clip B and clip D.
- the track shown next in FIG. 21 d illustrates that a clip A has been added after the selected transition in clip B.
- FIGS. 22 a - 22 g illustrate the behavior which results from dragging a clip from a timeline.
- in FIG. 22 a , clip A, which is included in a timeline also including clips B and C, is dragged from the timeline and dropped between clips B and C.
- the third line of FIG. 22 a illustrates clip A which is dragged and dropped at the end of the timeline after clip C.
- the fourth line of FIG. 22 a illustrates dragging clip A and dropping it eleven frames after the end of the timeline.
- FIG. 22 b illustrates a timeline in which clip A exists in the middle of clips B and C. Clip A may be dragged and dropped before clip B. The third line of FIG. 22 b illustrates a timeline which results after clip A is dragged and dropped at the end of the timeline. Clip A may also be dragged and dropped eleven frames after the end of the timeline.
- FIG. 22 c illustrates a timeline including clip A at the end of the timeline.
- the length of the timeline is not preserved after a clip is dragged from the end of a timeline.
- Clip A may be dragged to the beginning of the timeline before clip B or it may be dragged and dropped between clips B and C. Similar to the above examples in FIG. 22, clip A may be dragged to the end of the timeline and dropped eleven frames after clip C.
- FIG. 22 d illustrates a clip A in a timeline which may be dragged so that the length of the timeline is preserved.
- clip A may be dragged and dropped between clip B and a hole.
- clip A may be dragged and dropped seven frames after the start of the hole or it may be dropped after the hole, but before clip C.
- Clip A may also be dropped at the end of the timeline following clip C or it may be dragged and dropped, for example, four frames after the end of the timeline.
- FIG. 22 e illustrates further examples of dragging clip A and preserving a hole and the length of the timeline.
- clip A may be dragged into the middle of a hole, but before clip B or it may be dragged before clip B or between clips B and C.
- Clip A may also be dragged to the end of the timeline and dropped four frames after the end of the timeline.
- FIG. 22 f illustrates dragging a last clip in a timeline without preserving the length of the timeline.
- clip A may be dragged to the beginning of a timeline and as shown, the hole after clip B is not preserved.
- Clip A may be dragged and dropped between clips C and B or may be dragged and dropped at the end of clip B.
- Clip A may also, for example, be dragged and dropped seven or thirteen frames after the end of clip B and after the start of a hole.
- FIG. 22 g illustrates dragging a clip which is surrounded by a hole.
- Clip A may be dragged to the beginning of a timeline and dropped before clip B as shown in FIG. 22 g . These operations preserve the hole and the length of the timeline. For example, clip A may be dragged and dropped after clip B but before the beginning of the first hole. Clip A may also be dragged so that it is dropped four frames after the beginning of the hole or it may be dropped between the end of the hole and the beginning of clip B. In addition, clip A may be dragged to the end of clip B or it may be dragged and dropped four frames after the end of the timeline.
- FIGS. 23 a - 23 c illustrate operations performed by dragging a clip from an outside timeline.
- FIG. 23 a illustrates a timeline with clips B and C. However, as shown in the next line, clip A may be dragged from an outside timeline and dropped before clip B. Clip A may be dropped also between clips B and C, after clip C or, for example, seven frames after the end of clip C.
- FIG. 23 b is similar to FIG. 23 a except that a hole exists in the timeline between clips B and C. The same functions of dragging and dropping clip A may be performed while preserving the hole between clips B and C.
- FIG. 23 c illustrates a timeline including two holes and clips B and C.
- Clip A may be dragged from an outside timeline and dropped four frames from the start of the timeline.
- Clip A may also be dropped before clip B, after clip B, into a second hole after clip B or in a second hole five frames after clip B.
- Clip A may also be dropped at the end of the timeline after clip C or, for example, four frames after the end of the timeline.
- FIGS. 24 a - 24 i and 24 k - 24 m illustrate trim behavior which results from trimming clips in a timeline.
- FIG. 24 a illustrates clips A and B and the result from trimming clip A such that the inpoint is trimmed in five frames by removing five frames from the beginning of clip A. As shown, the length of the other items in the timeline are preserved.
- FIG. 24 b illustrates clips A and B where the inpoint is trimmed out by adding five frames prior to clip A. The result is shown in the second line of FIG. 24 b , with clips A and B being of equal length.
- FIGS. 24 c and 24 d illustrate trimming clip A such that the outpoint is trimmed in five frames by removing five frames from the end of clip A (FIG. 24 c ) and such that the outpoint is trimmed out five frames by adding five frames to the end of clip A (FIG. 24 d ).
- FIG. 24 e illustrates a timeline having a hole before clip A. Clip A is trimmed at its inpoint in five frames and the result is shown in the second line of FIG. 24 e .
- FIG. 24 f illustrates a hole before clip A and a trim operation performed on clip A trimming the inpoint out five frames.
- FIGS. 24 g and 24 h illustrate a timeline having a hole before clip A and trimming the outpoint in by five frames (FIG. 24 g ) and trimming the outpoint out by five frames (FIG. 24 h ).
- FIG. 24 i illustrates a timeline having a hole after a clip A and before clip B.
- FIG. 24 i and FIGS. 24 k through 24 m illustrate a hole after a selected clip A and before a clip B.
- the second line of these examples illustrates the result of the timeline after trimming clip A in and out at its inpoint and outpoint by five frames.
- the next step to perform in the editing process is the addition of special effects at transitions.
- the user may access these special effects at any time and is not required to complete all trims prior to creating any effects.
- a second selectable interface 153 for editing the motion video includes operations for selecting special effects to be applied to transitions between two clips. Given two selected clips, a selected effect can be applied to the transition. A corresponding object 190 in FIG. 9 is displayed on the timeline, describing the transition. A suitable interface for providing selection of an effect and clips is shown in FIG. 10.
- a list of possible effects is provided at region 192 .
- Each effect has a title 193 which refers to a computer program which causes the effect to be made.
- the effect may be applied by selecting button 194 and is applied to the transition closest to the current position on the timeline.
- the effect may be removed, and replaced by a straight cut, by selecting button 196 .
- a selected effect can be previewed in window 198 , for example by “double-clicking” on the name in the effects window 192 .
- Each effect has a corresponding file in which the effect has been created using graphics of the letters A and B, for this purpose.
- an effect segment is created in the effect subtrack of the video track, with a default duration, e.g., one second.
- the effect as applied to the transition is rendered in the background and associated with the segment.
- the effects segment may be trimmed. Such trimming may be implemented in the same manner as a trim on a regular clip.
- buttons 202 allow bold, italic and underline formatting, while buttons 204 adjust justification. Font and size are selected via menu style interfaces 206 and 208 , respectively. Additional options for scrolling are provided at 210 . Scrolling can be made left to right, right to left, top to bottom, or bottom to top.
- a titling effect can be removed or applied through selection buttons 212 and 214 , respectively.
- the information input through this interface is used, using known techniques, to apply the title to the video information and to display the effect in the display region 172 .
- the video data file of the clip to which it is applied is not modified.
- the titling information may be finally applied, for example, only when the video program is output in final form. In this way, titles may be added and removed more easily.
- When no scrolling option is enabled, the dimensions of the space which can contain text are limited to the frame size, which in this case is represented by the canvas area 209. If the vertical scroll option is enabled, then the width of the canvas is the width of the video image, but the height is indefinite. If the horizontal scroll option is enabled, then the height of the canvas is the height of the video image and the width is indefinite.
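- As an illustration of the rule described above, the canvas dimensions can be computed from the frame size and the enabled scroll option. This is a hypothetical sketch; the tuple convention (None meaning an indefinite dimension) is an assumption made here, not part of the described interface:

```python
def title_canvas_size(frame_width, frame_height, scroll=None):
    """Return (width, height) of the title canvas; None means indefinite."""
    if scroll == "vertical":
        return (frame_width, None)      # full frame width, unbounded height
    if scroll == "horizontal":
        return (None, frame_height)     # unbounded width, full frame height
    return (frame_width, frame_height)  # no scrolling: limited to the frame size
```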
- the length of the title may be the length of any associated video clip or the length of a hole over which it is created.
- one of the final steps of the process of making a video program involves “sweetening” of the sound or audio tracks. This involves more detailed editing of the audio tracks.
- Another interface 155, shown in FIG. 12, provides editing functions for sound. Using this interface, all modifications to clips, including creation and deletion of clips, operate on one of the audio tracks.
- a voice can be captured directly into the timeline in a manner similar to the way video is captured, via interface 220 . Such an operation automatically creates a voice-over clip on the voice-over track.
- Music, such as from a CD-ROM, can be imported using interface 222. Such an operation automatically creates a sound clip on the soundtrack.
- The volume of each selected track can be adjusted using interface region 224. It is also possible to select fade-in and fade-out options. Given the inputs provided through this interface, the operations to be performed are implemented using known techniques.
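- Volume adjustment with fade-in and fade-out options of the kind selected here is commonly implemented as a per-sample gain envelope. The following sketch is illustrative only and is not the implementation described above:

```python
def apply_gain_and_fades(samples, gain=1.0, fade_in=0, fade_out=0):
    """Scale audio samples by a gain, with linear fade-in/out over N samples."""
    out = []
    n = len(samples)
    for i, s in enumerate(samples):
        g = gain
        if fade_in and i < fade_in:
            g *= i / fade_in            # ramp up from silence
        if fade_out and i >= n - fade_out:
            g *= (n - i) / fade_out     # ramp down to silence
        out.append(s * g)
    return out
```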
- a library of audio and video information can be provided and accessed through a library interface 156 , as shown in FIG. 13.
- a list operation invoked through button 230 causes a list of the available clips to be displayed in region 232 .
- the available clips are all media clips which have been digitized or imported for use in a composition, for example, through the “Bring Video In” interface or through the “Sound” interface.
- The list operation causes a directory lookup to be performed by the computer on its file system, for example.
- the list view shows clips in a manner similar to the storyboard of interface 52 . For each clip, its date, type, duration and description are displayed. These fields are editable.
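- The list operation described above amounts to a directory lookup that gathers the displayed fields for each media file. A hypothetical sketch follows; the metadata handling is simplified, and a real implementation would read the duration from the media file's header:

```python
import datetime
import os

def list_library(media_dir, extensions=(".mov", ".aif")):
    """Return one record per media clip found in the library directory."""
    records = []
    for name in sorted(os.listdir(media_dir)):
        if not name.lower().endswith(extensions):
            continue
        path = os.path.join(media_dir, name)
        stat = os.stat(path)
        records.append({
            "description": os.path.splitext(name)[0],
            "date": datetime.date.fromtimestamp(stat.st_mtime),
            "type": "video" if name.lower().endswith(".mov") else "audio",
            "size_bytes": stat.st_size,  # duration would come from the media header
        })
    return records
```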
- the preview button 234 allows a user to see one clip at a time from the library instead of an entire list.
- a user can display a selected video clip from the library in region 232 using a viewer which is similar to, but smaller than, the viewer 172 which is reserved for playing back the currently edited video program from the timeline.
- the movie can be saved in a final form as one contiguous video program, using the interface 58 shown in FIG. 14.
- the title of the video program is shown in region 250 .
- Start button 252 and stop button 254 are provided to control starting and stopping of the playback of the video, respectively.
- Selections are provided to the user for previewing the video program on the computer screen, as indicated at 256; for making a videotape by outputting the video information through an encoder to a VCR, for example in VHS format, as indicated at 258; or for saving the video information as a data file in one of several formats, such as QuickTime video, Microsoft video, MPEG video, or Motion-JPEG video, as indicated at 260. Such files could be used for presentations, Internet publishing or CD-ROM publication.
- The format of the final program is selected, for example, by using the drop down menu 262. Given the inputs provided through this interface, the titles are rendered.
- the computer then instructs the user, if appropriate, to ensure that the destination of the data, such as a camcorder, is ready.
- the steps of generating and playing back the video data from the data files into one contiguous stream of video data may be implemented using known techniques.
- a user may want to stop and save the current version of the video program or storyboard. Additionally, the user may want to continue editing a composition that is not yet finished.
- This capability is provided through menu functions which are separate from the selectable interfaces that provide the planning, capturing, editing and recording functions. Menu functions may also be provided for each interface to represent keystrokes used to execute a given command and to set default values for audio and video, input and output, and file and signal formats.
- a composition can be stored in one or both of two formats.
- the first format stores the composition only as a storyboard. Storing a composition as a storyboard involves creating a data file and storing in the data file all of the information about a storyboard, without information about the associated clips.
- the second format stores all of the information about the current video program as well as the state of the editing program, i.e., what interface is being used during the save operation.
- This file format includes an indication of the interface being used, followed by the representations of each track, and the clip descriptions in each track, along with the storyboard shot descriptions including the indications of associated clips. Given a stored composition, when the document is opened again for further editing, the same interface which was last used is presented to the user.
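- A save routine following the two formats described above might be organized as in the sketch below. The JSON layout and field names are illustrative assumptions (shots are treated as plain dictionaries); the described file format is not specified beyond the items listed above:

```python
import json

def save_storyboard_only(path, title, shots):
    """First format: the storyboard alone, with no clip information."""
    data = {"title": title,
            "shots": [{"title": s["title"],
                       "film_tip": s.get("film_tip", ""),
                       "editing_tip": s.get("editing_tip", ""),
                       "duration": s.get("duration", 0)} for s in shots]}
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

def save_full_composition(path, current_interface, tracks, shots):
    """Second format: interface state, per-track clip descriptions, and the storyboard."""
    data = {"interface": current_interface,  # the interface in use when saving
            "tracks": tracks,                # assumed to be plain lists of clip descriptions
            "storyboard": [{"title": s["title"],
                            "duration": s.get("duration", 0),
                            "clip_file": s.get("clip_file")} for s in shots]}
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
```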
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Television Signal Processing For Recording (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
Abstract
A graphical user interface for a computer-assisted motion video editing system directs a user through the process of editing a video program. The graphical user interface may also enable a user to plan a video program. Alternatively selectable interfaces within a single window interface, each of which provides a group of planning, capturing, editing, and recording functions, can provide such an interface for producing a video program. Other simplifications to the user interface can be provided to assist in editing, such as by maintaining a video display window for displaying the edited video program at a fixed position for all available editing operations. Additionally, video information can be captured directly into a timeline representation of a video program, rather than a bin. Using a storyboard tied to the capturing process, a user is directed through the process of collecting and capturing the video clips to be used in the video program.
Description
- This application claims the benefit under 35 U.S.C. §120, and is a continuation of U.S. patent application Ser. No. 08/687,926, filed Jul. 29, 1996, now U.S. Pat. No. 6,628,303, issuing Sep. 30, 2003, which is hereby incorporated by reference.
- The present invention is related generally to systems for planning and editing motion video information. More particularly, the present invention is related to computer-assisted motion video editing systems, wherein motion video information is stored as a sequence of digital still images in a data file on a computer system.
- Computer systems with motion video editing tools have been used to produce major motion picture films, television shows, news broadcasts and in corporate settings to edit motion video. Unlike word processing tools, however, such motion video editing tools are not yet common for home use, primarily due to the cost of motion video editing tools, including hardware and software, and the complexity of the architecture and graphical user interface.
- The high cost of motion video editing tools for computers is due, in part, to specialized hardware used to capture, digitize, compress, decompress and display motion video information on a computer screen with sufficient detail and resolution. The complexity of the graphical user interface of these motion video editing tools is due, in part, to the variety of possible operations which can be performed on motion video and also to a design for professionals familiar with motion video editing, and terms and concepts of that trade, with which the average person is unfamiliar. For example, many systems use a representation of a motion video composition as two tracks, A and B, between which “rolling” occurs. The concept of A/B rolling is unduly complex and has been simplified in other systems by using a “timeline,” such as in the AVID/1 Media Composer from Avid Technology, Inc., of Tewksbury, Mass. Another complexity is the use of multiple windows for controlling the various parameters of the motion video, displaying the motion video itself, sound track controls, and other features.
- Another drawback of many motion video editing tools for computers is that most people have not been taught how to communicate ideas using motion video or how to efficiently produce a motion video program. Schools commonly teach written and oral expression and expression through still graphics but not motion video. While most motion video editing tools for computers are suitable for creating motion video programs, few tools assist in the creative design, planning and production of motion video programs.
- Accordingly, a general aim of this invention is to provide a motion video editing system for a computer with reduced cost and complexity. Another aim of this invention is to provide a motion video editing system for a computer with tools for assisting creative design and planning of a motion video composition.
- The present invention provides a simplified interface which directs a user through the process of editing a video program. In one aspect of the invention, the interface also enables a user to plan a video program. Alternatively selectable interfaces, each of which provide a group of planning, capturing, editing, and recording functions, provides an intuitive interface for producing a video program. Other simplifications to the user interface can be provided to assist in editing, such as by maintaining the video display window at a fixed position. Additionally, video information can be captured directly into a timeline representation of a video program, rather than a bin. Using a storyboard tied to the capturing process, a user is directed through the process of collecting and capturing the video clips to be used in the video program.
- Accordingly, one aspect of the invention is a graphical user interface for a computer motion video editing system, which has a single window interface including a plurality of alternatively selectable interfaces. A first of the plurality of selectable interfaces is an interface for making capturing commands available to a user for receiving motion video information to be edited. A second of the plurality of selectable interfaces is an interface for making editing commands available to a user for editing the received motion video information. A third of the plurality of selectable interfaces is an interface for making playback commands available to a user for outputting the edited motion video information to an external device. In one embodiment, a fourth of the plurality of selectable interfaces includes an interface for making storyboarding commands available to a user for preparing a plan describing a motion video program to be edited.
- In another embodiment, the second of the plurality of selectable interfaces further includes a second plurality of alternatively selectable interfaces, wherein each selectable interface provides a set of editing functions of a particular type, and wherein each selectable interface has a video region for previewing the motion video program being edited and wherein the video region in each of the selectable interfaces is at an identical position within the single window interface.
- Another aspect of the invention is a graphical user interface for editing computer motion video having a single window interface having a plurality of alternatively selectable interfaces. Each selectable interface provides a set of motion video editing functions of a different type. Each selectable interface also has a video region for previewing the motion video to be edited such that the video region in each selectable interface is at the same position within the single window interface.
- Another aspect of the invention is a graphical user interface for a computer for assisting editing of a motion video program, having a planner module with inputs for receiving commands from a user descriptive of a plan of shots of video in the video program and providing an output representative of the plan. A capture module has a first input for receiving the plan defined by the user, a second input for receiving an input from a user for controlling recording of motion video information, and a third input for receiving a selection of a shot in the received plan. The capture module has an output in which captured motion video information is associated directly with the selected shot to provide the motion video program as a sequence of the recorded clips in an order defined by the plan.
- Another aspect of the invention is a computer video capture system which represents a sequence of video clips in a video program. Clips of a video program are captured directly into the represented sequence.
- Another aspect of the invention is a set of a plurality of predefined plans stored in a computer memory. One aspect of this invention includes a mechanism for selecting one of the plans, for editing a selected plan, and for capturing video and for automatically generating a video sequence according to the selected plan.
- Another aspect of the invention is a process for capturing motion video information and for generating a video program of a plurality of clips of captured motion video information. The process involves selecting a clip of the video program, capturing video information and associating the captured video information with the selected clip of the video program. In one embodiment, the step of selecting includes the step of selecting a shot from a plan representing and associated with the video program. In this embodiment, the step of associating includes the step of associating the captured video information with the clip of the video program associated with the selected shot from the plan.
- In another embodiment, the process further involves performing the step of indicating whether a clip of the video program has captured motion video information associated thereto.
- These and other aspects, goals, advantages and features of the invention will be apparent from a reading of the following detailed description.
- In the drawing,
- FIG. 1 is a block diagram of an example general purpose computer system in which the present invention may be implemented;
- FIG. 2 is a diagram of an example memory system shown in FIG. 1;
- FIG. 3 is a diagram illustrating software layers in one embodiment of the present invention;
- FIG. 4 is a perspective view of a computer system having a display showing one embodiment of the graphical user interface of the present invention;
- FIG. 5 is a graphic of a graphical user interface for providing planning functions in accordance with one embodiment of the present invention;
- FIG. 6 is a diagram of a data structure for representing shots in accordance with one embodiment of the present invention;
- FIG. 7 is a diagram of a data structure for representing clips in accordance with one embodiment of the present invention;
- FIG. 8 is a graphic of a graphical user interface for providing capturing functions in accordance with one embodiment of the present invention;
- FIGS. 9-13 are graphics of graphical user interfaces for providing editing functions in accordance with one embodiment of the present invention;
- FIG. 14 is a graphic of a graphical user interface for providing recording functions in accordance with one embodiment of the present invention;
- FIG. 15 is a block diagram illustrating interaction between a module for maintaining and displaying a storyboard and a module for creating and maintaining clip description of a composition;
- FIG. 16 is a flowchart describing how clip descriptions and shot descriptions are synchronized during capture of motion video information;
- FIGS. 17a-17 e are a representation of timeline behavior produced in response to a user operation;
- FIGS. 18a-18 b are a representation of timeline behavior produced in response to a user operation;
- FIGS. 19a-19 d are a representation of timeline behavior produced in response to a user operation;
- FIGS. 20a-20 f are a representation of timeline behavior produced in response to a user operation;
- FIGS. 21a-21 d are a representation of timeline behavior produced in response to a user operation;
- FIGS. 22a-22 g are a representation of timeline behavior produced in response to a user operation;
- FIGS. 23a-23 c are a representation of timeline behavior produced in response to a user operation; and
- FIGS. 24a-24 i and 24 k-24 m are a representation of timeline behavior produced in response to a user operation.
- The present invention will be more completely understood through the following detailed description which should be read in conjunction with the attached drawing in which similar reference numbers indicate similar structures.
- While many computer systems are available which enable a user to edit motion video, the selection of an appropriate interface for making commands available is a complex task due to the large number of possible operations which can be performed on video information. In the present invention, the graphical user interface directs a user through the steps of editing a motion video program, including planning (storyboarding), capturing the video information, editing the video information, and exporting the video information to a final data file or a video tape. In the process of editing, the user is directed through the steps of editing the primary content of the video program, adding effects at transitions between video clips, adding titles and credits, and finally, editing sound. By providing a simple interface which directs a user through these steps, which follow the steps typically used by professional video editors, the ability to edit quality video programs is available to the non-professional.
- One embodiment of this invention will now be described in more detail. In this document, several terms are used to describe a video program and associated information. The following are definitions of these terms. A composition is a heterogeneous aggregation of tracks and, in one embodiment of the invention, includes five tracks: one title track, one video track, and three audio tracks. The composition is also referred to as a motion video program. One of the audio tracks is synchronized and grouped with the video track (the audio track that is captured with the video), one audio track is called a voice-over track, and the third audio track is a music track. Each track is a two part entity: a synchronized media subtrack and an effects subtrack. Each subtrack consists of a sequence of segments and holes. The media subtrack includes media segments, and the effects subtrack includes effects segments. A media segment is a portion of a media subtrack with a time-based beginning and ending. The interior of a media segment refers to a portion of a media clip. A media clip is an independent, playable entity which has duration and possibly multiple pieces of synchronized media associated with it. Media clips also have ancillary data associated with them, such as a name and description. Media is motion video media, audio media, or text media stored in a data file on a computer, for example, in a QuickTime file. A sync-lock group is a group of segments which have been grouped together for editing purposes. Editing operations will not move the components of a sync-lock group relative to each other. The video track and its corresponding audio track may be the only sync-lock group and cannot be unlocked or unsynced. A media segment is a video media segment, audio media segment, and text media segment, depending on which track the segment resides. A hole is a span in a track with a time-based beginning and ending which has no associated segment. On the video track, a hole displays black. On an audio track, a hole plays silence. On the titles track, a hole displays full transparency. Relative to a point or span in the composition, upstream composition elements are located earlier in the composition and downstream composition elements are located later in the composition. The beginning of a media segment is called its incoming edge, and the ending of a media segment is called its outgoing edge. The edges of media segments are also called transition points. A transition point has zero length. The edges of a group are transition points where a segment on one side of the transition is inside the group and any segment on the other side of the transition is outside of the group. A cut is a transition point that does not have an effect segment spanning it. At a transition point between two segments, the outgoing segment is the segment which displays before the transition point, and the incoming segment is the segment which displays after the transition point. Hence, the outgoing segment is to the left of a cut in the timeline; the incoming segment is to the right.
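- As an illustration only, the track/segment/hole vocabulary defined above can be summarized in a small data model. The class and field names below are assumptions made for this sketch, not names used by the described embodiment:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MediaClip:
    """An independent, playable entity backed by a media data file."""
    file_name: str
    start_frame: int
    stop_frame: int
    name: str = ""
    description: str = ""

@dataclass
class Segment:
    """A span on a subtrack; a hole is a span with no associated clip."""
    start: int                         # position in the composition, in frames
    duration: int                      # length in frames
    clip: Optional[MediaClip] = None   # None means this span is a hole

    @property
    def is_hole(self) -> bool:
        return self.clip is None

@dataclass
class Track:
    """A track is a synchronized media subtrack plus an effects subtrack."""
    kind: str                          # "title", "video", or "audio"
    media: List[Segment] = field(default_factory=list)
    effects: List[Segment] = field(default_factory=list)

@dataclass
class Composition:
    """One title track, one video track and three audio tracks."""
    title: str
    tracks: List[Track] = field(default_factory=lambda: [
        Track("title"), Track("video"),
        Track("audio"), Track("audio"), Track("audio"),
    ])
```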
- The present invention may be implemented using a digital computer. A
typical computer system 20 is shown in FIG. 1, and includes a processor 22 connected to a memory system 24 via an interconnection mechanism 26. A special-purpose processor 23 may also be used for performing specific functions, such as encoding/decoding of data, or complex mathematical or graphic operations. An input device 28 is also connected to the processor and memory system via the interconnection mechanism, as is an output device 30. The interconnection is typically a combination of one or more buses and one or more switches. As shown in FIG. 4, the output device 30 may be a display 32 and the input device may be a keyboard 34 or mouse 36. The processor, interconnection mechanism and memory system typically are embodied in a main unit 38. - It should be understood that one or more output devices may be connected to the computer system. Example output devices include a cathode ray tube (CRT) display, liquid crystal display (LCD), printers, communication devices, such as a modem, and audio output. To enable recording of motion video information in an analog form, this computer system also may have a video output for providing a video signal to a VCR, camcorder or the like. It should also be understood that one or
more input devices 28 may be connected to the computer system. Example input devices include a video capture circuit connected to a VCR or camcorder, keyboard, keypad, trackball, mouse, pen and tablet, communication device, audio input and scanner. The motion video capture circuit may be one of many commercially available boards. For example, a video capture card may connect to the PCI interface, and may use Motion-JPEG video compression and pixel averaging to compress images to 320×240 pixels at 30 frames per second. The video capture card may receive and may output composite video and S-video. It should be understood that the invention is not limited to the particular input or output devices used in combination with the computer system or to those described herein. The input and output devices may be included within or external to the main unit 38. - The
computer system 20 may be a general purpose computer system, which is programmable using a high level computer programming language, such as "C++" or "Pascal". The computer system may also be implemented using specially programmed, special purpose hardware. In a general purpose computer system, the processor is typically a commercially available processor, such as the Power PC 603e RISC microprocessor. It may include a special purpose processor such as a CL540B Motion JPEG compression/decompression chip, from C-Cube of Milpitas, Calif. Many other processors are also available. Such a processor executes a program called an operating system, such as the Macintosh operating system, for example Macintosh System Software, version 7.5.3, which controls the execution of other computer programs and provides scheduling, debugging, input/output control, accounting, compilation, storage assignment, data management and memory management, and communication control and related services. The processor and operating system define a computer platform for which application programs in high level programming languages are written. It should be understood that the invention is not limited to a particular computer platform, particular operating system, particular processor, or particular high level programming language. Additionally, the computer system 20 may be a multi-processor computer system or may include multiple computers connected over a computer network. One embodiment of the present invention is implemented using either a Macintosh Performa computer or Power Macintosh computer, with a PCI expansion slot and the Apple Video System, such as Performa 5400, 5420 or 6400 series computers from Apple Computer of Cupertino, Calif. Alternatively, an Apple Power Macintosh computer with a built-in compositor as video input and a PCI expansion slot, such as the 7600 or 8500 series computers with audio/video capabilities may be used. The computer system may also include an application for managing motion video files, such as the QuickTime 2.5 motion video system of Apple Computer. - An
example memory system 24 will now be described in more detail in connection with FIG. 2. A memory system typically includes a computer readable and writable non-volatile recording medium 40, of which a magnetic disk, a flash memory, and tape are examples. The disk may be removable, known as a floppy disk, and/or permanent, known as a hard drive. In particular, a PowerPC processor-based Macintosh Performa computer, having a gigabyte or more capacity hard disk drive and at least 16 to 24 megabytes of DRAM, is preferred. The disk should have sufficient size to hold the video information to be edited, which is typically around 830 k bytes per second. The disk, which is shown in FIG. 2, has a number of tracks, as indicated at 42, in which signals are stored, in binary form, i.e., a form interpreted as a sequence of 1's and 0's, as shown at 44. Such signals may define an application program to be executed by the microprocessor, or information stored on the disk to be processed by the application program, such as video information stored in a data file. Typically, in operation, the processor 22 causes data to be read from the non-volatile recording medium 40 into an integrated circuit memory element 46, which is typically a volatile random access memory, such as a dynamic random access memory (DRAM) or static memory (SRAM). The integrated circuit memory element 46 allows for faster access to the information by the processor and disk 40, and is typically called the system memory. The system memory may be used as a buffer between the disk and output device 30 for the video information, as will be described in more detail below. The processor generally causes the data to be manipulated within the integrated circuit memory 46 and copies the data to the disk 40 if modified, when processing is completed. A variety of mechanisms are known for managing data movement between the disk 40 and the integrated circuit memory 46, and the invention is not limited thereto. It should also be understood that the invention is not limited to a particular memory system. - The different levels of software which interact in this system will now be described in more detail in connection with FIG. 3. Using a computer such as a Power Macintosh 7500 or 8500, using the System 7.5.2 operating system or higher, as indicated at 60, a video capture card is provided as indicated at 62. The
QuickTime video system 64 interacts with the video capture card 62 via drivers 66. A video player system 67, such as the Apple Video Player, interacts with QuickTime 2.5. The software providing the editing instructions and graphical user interface to access these instructions is also designed to interact with QuickTime in parallel with the video player, as indicated at 68. - Such a platform as described in FIGS. 1-3 can be used to implement a graphical user interface in accordance with the invention. FIG. 4 shows, on an
output device 32, a perspective view of a graphical user interface in one embodiment of the invention. Within the display area 48, a single window interface 50 is shown, having several selectable interfaces. In the embodiment shown in FIG. 4, the interfaces are selectable using the mouse 36, but may also be keyboard operated. - The
graphical user interface 50 and its functionality will now be described in more detail in connection with FIGS. 5-16. Referring now to FIG. 5, one of the selected interfaces is shown along with the graphic controls for selection of the other interfaces. The four interfaces in this embodiment include a storyboard interface 52, an interface 54 for bringing in motion video information, an interface 56 for editing a movie, and an interface 58 for sending a movie out, for example, for recording to an external videotape device. Unless a previously stored composition is being opened for editing, the user is presented with either the storyboard interface or the bring video in interface when the editing system is first used. - The
storyboard interface 52 enables a user to plan the motion video program to be prepared. In one aspect of the invention, storyboards or plans include filming tips and editing tips for common motion video programs, such as a birthday party, graduation or wedding. One aspect of the invention is that such storyboards and plans can be produced and distributed separately from the computer program and from actual motion video programs, by storing them on a computer-readable medium such as a floppy disk or CD-ROM or by making them accessible through a computer network. Thestoryboard interface 52 displays a written description of a composition or video program, including thetitle 70 of the composition and a linear sequence ofdescription 72 of each shot. In one embodiment, these sequences represent the segments present in the video media track only of the composition. Holes are not represented in the storyboard. The displayed description of each shot includes thetitle 74 of the shot, a duration 76 (either actual or estimated), and adescription 80 of either a filming tip or an editing tip. The duration may be a suggested duration or an actual duration of any media associated with the shot. Each shot is assigned a number, sequentially, which is displayed over astill image 78. The still image may be the first frame of an associated media clip or a default image used for all shots. Display of filming or editing tips is performed by selection of these options via aninterface 82. - In the storyboard interface, a scroll bar83 enables a user to scroll through the view of the storyboard for the selected video program. The “down” arrow key changes the current selection to the group containing the first shot which follows the last shot in the current selection. Likewise, the “up” arrow key changes the current selection to the group containing the last shot which precedes the first shot in the current selection. The “Home” key changes the current selection to the group containing the first shot in the storyboard. The “End” key changes the current selection to the group containing the last shot in the storyboard. “Page Up” and “Page Down” keys may be used to scroll through several shot descriptions at a time. When any storyboard navigation occurs due to keystrokes, the storyboard view scrolls to display the earliest selected shot. Typing a shot number selects a shot. Numbers typed in less time than a double-click time of a mouse are treated as multidigital numbers for navigation purposes.
- The information used for each shot to enable the display of the
storyboard interface 50 can be represented by an array or other orderedstructure 86 of shot descriptions 87 (see FIG. 6) which stores, for each shot, thetitle 90, afilm tip 94, anediting tip 96, aduration 98 and anindication 100 of a pointer to another structure representing a clip of media data captured in a video data file and associated with the shot. Operations which edit, delete, or add information about a shot for a given video program manipulate the data in thisdata structure 86. The displays shown in FIG. 5 are generated by creating display objects in response to data read from the data structure for a shot. These display objects are regenerated when necessary in response to changes to the data that they represent, as will be described in more detail below. - A
data structure 88 similar todata structure 86 may also be used to represent the motion video program itself, and includes clip descriptions 89 for each clip including a reference to a motion video data file to be used to produce the clip. Such adata structure 88 is shown in FIG. 7. It should be understood that theshot descriptions 87 in FIG. 5 and the clip descriptions 89 in FIG. 7 may be combined into one structure to represent the storyboard and motion video data of a motion video program. While the clip descriptions and shot descriptions may have redundant data, the redundant data clearly can be omitted and can be represented in only one of the structures or only once in a combined structure. Theclip data structure 88 may be implemented as a QuickTime movie. Accordingly, a clip description will have an indication of afile name 102, indication of start and stoptimes 104 within the file, andother information 106 about the data file. A clip description may have empty fields, i.e., no video data file, yet but have a duration, to indicate a “hole” in a track in the program. - Referring again to FIG. 5, using standard techniques for implementation, any one of the displayed
elements data structure 86. - New shots may be added via a
command button 84, through which anew shot description 72 is added withblank field - To delete a selected shot from the storyboard, the user selects the shot using the navigation steps noted above, and indicates a delete operation, for example, by using a <delete> key. Shots deleted from the storyboard are deleted from the timeline also, if there is a corresponding clip description, but the associated media is not deleted. Only the reference to the media in the clip description is deleted.
- The operations performed on the clips in the timeline preferably are reflected automatically in the shot descriptions of the storyboard and vice versa. While this feature is easily implemented by representing the shot and clips using a single data structure, when clips and shots are represented separately, each operation on a clip or shot description should also make appropriate modifications to a corresponding shot or clip description, respectively. The process of controlling the clip and shot descriptions for this purpose will be described in more detail below.
- The combination of the shot descriptions and clip descriptions are particularly useful in capturing motion video information from a video storage device, such as a camcorder, into a motion video data file where it can be edited on the computer. An
interface 54 providing commands for capturing motion video, i.e., bringing motion video data into the computer system is shown in FIG. 8. The interface for capturing motion video into the computer includes adisplay area 120, which displays motion video information currently being received by the computer as an input. For example, a user may be playing back a videotape on a camcorder connected as an input device through a video capture board to the computer system. If no video is available, thedisplay area 120 can convey an instruction to connect a video source to the computer. Acontrol 122 controls recording of the received motion video information. By selecting therecord button 124, motion video information being displayed inregion 120 is captured into a data file until thestop button 126 is selected. Audio levels may be displayed at 128 and output of audio information may be muted usingselection area 130. Adisplay region 132 also displays available disk area as a function of time of video information which can be captured. In this example, it is assumed that roughly 27.7 k is required for each frame, such that roughly 830 k is required for each second of video information. This value is generated by monitoring the available disk space, dividing the available space value by a target size per frame (resulting in a number of frames which can be stored), and converting that quotient into minutes and seconds using the time resolution of the video, e.g., 30 frames per second. - A storyboard region is also displayed at134 to indicate the plan of shots for the selected video program for which data is being captured. A
timeline 136 is displayed which corresponds to thestoryboard region 134. Thestoryboard region 134 includes, for each shot, itstitle 138, anindication 140 of whether or not the video data for the shot has been captured (determined using thereference field 100, FIG. 6), and anindication 142 of the title of the video program. Aselection button 144 also allows for the insertion of a new shot, similar to the operation performed usingbutton 84 in FIG. 5. Using thestoryboard display 134, shots may be selected, inserted or deleted. Such functionality can be provided using standard techniques. Because operations on this interface affect the data structure shown in FIG. 6, changes made to the storyboard through theinterface 54 of FIG. 8 are also reflected in thestoryboard interface 52 shown in FIG. 5, as will be described below. Similarly, thetimeline 136 has adisplay object 146 for each clip which is captured. The display object has a size, which is calculated as a function of the duration of the clip, and atitle 148, obtained from the title of the corresponding shot description. Abar 150 also indicates whether audio is associated with the clip. - Motion video information is captured using this
interface 54 and is tied directly to a selected shot. Upon initiation, the first shot in the storyboard for which motion video information has not yet been captured is selected. However, the user may select any given shot in the storyboard region for capturing associated motion video information. After a user selects a shot, or if no shot is selected, the user may cause motion video information to be input to the computer by playing a portion of a videotape from a camcorder device. The input motion video data is displayed indisplay area 120. The user depressesbutton 124 to begin capture. The captured motion video information is stored in a data file on the hard disk of the computer system. The file name of that file is associated with the selected shot, if any, and corresponding clip in the storyboard and timeline. If no shot is selected, then a new media file is created in a library or directory of files. When the user has finished capturing the selected motion video information, thestop button 126 is depressed and the data file on the hard disk is closed. - By capturing motion video information in this manner, the motion video information is automatically and immediately associated with a selected shot. By capturing video information directly into the timeline representing the motion video program, the need for a “bin” of motion video data files is eliminated and the user interface is simplified. When all shots have been associated with clips, a message may be displayed to the user that tells the user to continue to the next selectable interface, for editing the movie. Nonetheless, the user may still add shots and capture more video.
- The interaction of the clip and shot descriptions will now be described in connection with FIGS. 15 and 16. A
storyboard module 200 is a part of the computer program which handles operations on shot descriptions of a storyboard. It receives as an input, and outputs, shotdescriptions 202.User input 204 is processed to change the data in the shot descriptions and to generated the displayedgraphics 206 of thestoryboard interface 52. Similarly, acapture module 208 processes the shot descriptions and theclip descriptions 216 to provide thedisplay graphics 210 ofinterface 54. It also processesuser input 212 to perform operations such as capturing data or inserting and deleting shots. Video input andoutput 214 is controlled into data files. Theclip descriptions 216 are created and modified according to the selected shot and the name of the data file into which the data is captured. When an operation is performed on a clip in the timeline, thecapture module 208 modifies thecorresponding clip description 216. The corresponding shot is modified via a message passing technique, indicating a clip that is modified and the operation causing the modification. - FIG. 16 is a flowchart describing an example operation in which the clip descriptions and shot descriptions are synchronized. Given a selected shot and a command to begin capturing video data, a data file for the video information is created in
step 220. Video data is then captured insteps step 226. This clip description is stored in adata structure 88 which represents the sequence of clip descriptions which make up the timeline. A message is then passed instep 228 to the storyboard indicating that a clip was created, having a duration. The selected shot description modifies its duration and pointer to reference the new clip description instep 230. - After clips for a movie have been captured, more finely detailed editing of the video program can be started. Accordingly, another of the
selectable interfaces 56 provides functions for editing a movie, as shown in FIG. 9 via several selectable interfaces 152-156. The interface for editing a movie has atimeline region 160, which includes a representation of atimeline 162, associatedtitle track 164, anadditional audio track 166, and asoundtrack 168. A timeline is a time-based representation of a composition. The horizontal dimension represents time, and the vertical dimension represents the tracks of the composition. Each of the tracks has a fixed row in the timeline which it occupies. The video track is split into three rows, including the effect subtrack, the video media subtrack, and the audio subtrack. The size of a displayed element, such aselement 170, is determined as a function of the duration of the segment it represents and a timeline scale, described below. Each element in the title, audio and soundtrack timelines has a position determined by its start time within the motion video program, a duration, a title, and associated data. Each track is thus represented by a structure similar todata structure 88, but audio tracks have references to data files containing audio information. - The timeline also has a scale which specifies how much time a certain number of pixels represents. To increase the scale means to increase the number of pixels that represent one time unit. Providing a mechanism to increase and decrease the time scale allows a user to focus in on a particular location in the composition, or to have more of an overview of the composition.
- Each of the selectable interfaces of the editing interface has a
viewer window 172, which has the same size and location within each window. Theviewer window 172 also has an associatedtimeline 174, representing the entire video program, which includes aplay button 176, forward and backward skipbuttons position indicator 182, which points to the present position within the video program which is being played back. Theindicator 184 is linked to anotherposition indicator 186 in thetimeline region 160. Thevarious buttons indicator 182 can be used to control viewing of the video program being edited. The program can be played back at a full rate, paused to show a still frame and shuttled to view individual frames to the left and/or right at a number of speeds. - In the
viewer interface 152, adisplay region 188 shows the title and duration of the video program. A user can play back the video program, adjust the duration of clips (by trimming), delete clips, insert clips and/or move clips within the video program. A segment in the timeline may also be split into two separate segments or clips. These operations can be performed by simple cut and paste operations on thetimeline 162 which can be implemented using standard techniques. For example, deletion of a clip from the timeline replaces the clip description with a hole of the same duration. The reference to this clip is removed from the corresponding shot description. - For rearranging clips on the timeline, clips are insertable at transitions and can be performed using a “drag and drop” operation, which can be implemented using standard techniques. Insertion of a clip involves creating a hole the size of the clip, then replacing the hole with the clip to be inserted. The hole may be created after a selected clip, at a transition point nearest the drop or anywhere beyond the end of the last clip in the timeline. It may be desirable to show what the timeline would look like if a drop were to occur when the user has a drop position selected, but prior to the drop operation being performed. An inserted clip may be selected by a copy or cut operation, followed by a paste operation; a selection from a library; or by dragging a selected clip to the desired location (which is in essence a combination of cut and paste operations).
- Trim operations add or remove frames from selected edges of segments in the composition. A trim right operation either removes frames from an incoming edge or adds frames to an outgoing edge. A trim left operation either removes frames from an outgoing edge or adds frames to an incoming edge. This operation is performed by simply adjusting the start or stop frames in the clip description. A trim operation accordingly cannot add or remove frames beyond the boundary of the data file used by the clip. To provide additional boundary conditions on the trim operation, the start point may be required to precede the stop point and define at least one frame. Trim operations other than edge trims may provide more advanced functionality, but are likely not to be needed by the nonprofessional. The selection of a right trim or left trim operation uses some mechanism for the user to select an edge and to indicate that a trim operation is desired. One example mechanism which may be used are “trim handles” which are displayed on the left and right ends of a displayed clip when a user selects the clip. The user may then drag the edge to the desired trim point.
- Many other more advanced operations may be performed on timelines. A timeline behavior specification is provided by FIGS. 17a-24 m, and describes in more detail the desired behavior in response to most user operations.
- FIGS. 17a-24 m are diagrams which show insertion, deletion, and trimming operations in a timeline which are possible by adding frames or clips and by removing frames or clips. These figures are shown as examples only and many other operations are possible. FIG. 17a shows that a selected clip A may be indicated by a long clip which has, for example, a length of ten frames or a short clip which has a length of five frames. Other clips B. C and D are shown in FIG. 17b. Generally holes may be shown with their length as eleven frames. A playout position is indicated in the clip as shown in FIG. 17c. A playout position may be indicated in a hole and shown in FIG. 17d, while a selected transition may be indicated as shown in 17 e.
- FIGS. 18a-b illustrate behavior which occurs when holes are removed from a timeline. As shown in FIG. 18a, a hole which is eleven frames in length exists between clips B and C. FIG. 18a then illustrates the timeline after the hole has been removed. Similarly, FIG. 18b illustrates a hole before clip B and a hole before clip C. FIG. 18b then illustrates the timeline after removal of the first hole.
- FIGS. 19a-19 d illustrate behavior which occurs by adding a hole. For instance, in FIG. 19a a transition is selected between clips B and C. FIG. 19a then illustrates the timeline after a hold has been added between clips B and C. FIG. 19b illustrates a timeline having a hole between clips C and D. FIG. 19b then illustrates a timeline after a hole has been added following clip B. FIG. 19c illustrates a transition before clip C, then FIG. 19c illustrates a hole which is added before the selected transition in clip C. Since a hole seven frames in length already existed before the transition in clip C, as shown in FIG. 19c, only four frames need to be added to create a hole before the selected transition. Similarly, in FIG. 19d there is a seven frame gap between a selected transition in clip B and clip D. FIG. 19d then illustrates a hole which has been added between clips B and D by adding four frames to the existing seven frames.
- FIGS. 20a-20 f illustrate delete/cut behavior. FIGS. 20a-20 c do not include a hole before or after the deletion. FIG. 20a illustrates three clips in a timeline.
- FIG. 20a then illustrates deleting a front clip A which results in a timeline as shown in FIG. 20a having clips B and C. FIG. 20b illustrates three clips including a middle clip A. FIG. 20b then illustrates a timeline after a middle clip A has been removed. FIG. 20c illustrates three clips with a last clip A, then FIG. 20c illustrates deleting the last clip A to result in a timeline having only clip B and C.
- FIG. 20g illustrates a timeline with three clips A,B and C including a hole which is eleven frames in length. FIG. 20h illustrates a timeline after clip A has been removed in which the hole is preserved between clips B and C. FIG. 20i illustrates a timeline having three clips with a hole between clip B and A. FIG. 20j illustrates removing clip A, resulting in a timeline including clips B and C with the hole now between clips B and C. FIG. 20k illustrates three clips wherein clip A is included in the hole between clips B and C. FIG. 201 illustrates a timeline which results after deleting clip A from the hole.
- FIGS. 21a-d illustrate pasting a clip into a timeline. For example in FIG. 21a, a timeline includes clips B and C with a transition between them. Clip A is pasted between clips B and C resulting in the timeline next shown in FIG. 21a. In FIG. 21b, a timeline is shown with a hole beyond a selected transition. FIG. 21b then illustrates a timeline after a clip A has been inserted between clips B and C. FIG. 21c illustrates a timeline having clips C and D including a hole between a selected transition in clip C. FIG. 21c then illustrates a clip which has been pasted before the selected transition in clip C. FIG. 21d illustrates a hole which is existing between the transition in clip B and clip D. A track as shown next in FIG. 21d which illustrates a clip A has been added after the selected transition in clip B.
- FIGS. 22a-22 g illustrates the behavior which results from dragging a clip from a timeline. In FIG. 22a, clip A, which is included in a timeline also including clips B and C, is dragged from the timeline and dropped between clips B and C. The third line of FIG. 22a illustrates clip A which is dragged and dropped at the end of the timeline after clip C. The fourth line of FIG. 22a illustrates dragging clip A and dropping it eleven frames after the end of the timeline.
- FIG. 22b illustrates a timeline in which clip A exists in the middle of clips B and C. Clip A may be dragged and dropped before clip B. The third line of FIG. 22b illustrates a timeline which results after clip A is dragged and dropped at the end of the timeline. Clip A may also be dragged and dropped eleven frames after the end of the timeline.
- FIG. 22c illustrates a timeline including clip A at the end of the timeline. In this example, the length of the timeline is not preserved after a clip is dragged from the end of a timeline. Clip A may be dragged to the beginning of the timeline before clip B or it may be dragged and dropped between clips B and C. Similar to the above examples in FIG. 22, clip A may be dragged to the end of the timeline and dropped eleven frames after clip C.
- FIG. 22d illustrates a clip A in a timeline which may be dragged so that the length of the timeline is preserved. For example, clip A may be dragged and dropped between clip B and a hole. In addition, clip A may be dragged and dropped seven frames after the start of the hole or it may be dropped after the hole, but before clip C. Clip a may also be dropped at the end of the timeline following clip C or it may be dragged and dropped, for example, four frames after the end of the timeline.
- FIG. 22e illustrates further examples of dragging clip A and preserving a hole and the length of the timeline. For example, clip A may be dragged into the middle of a hole, but before clip B or it may be dragged before clip B or between clips B and C. Clip A may be also dragged to the end of the timeline and may be dropped four frames after the end of the timeline
- FIG. 22f illustrates dragging a last clip in a timeline without preserving the length of the timeline. For example, clip A may be dragged to the beginning of a timeline and as shown, the hole after clip B is not preserved. Clip A may be dragged and dropped between clips C and B or may be dragged and dropped at the end of clip B. Clip A may also, for example, be dragged and dropped seven or thirteen frames after the end of clip B and after the start of a hole.
- FIG. 22g illustrates dragging a clip which is surrounded by a hole. Clip A may be dragged to the beginning of a timeline and dropped before clip B as shown in FIG. 22g. These operations preserve the hole and the length of the timeline. For example, clip A may be dragged and dropped after clip B but before the beginning of the first hole. Clip A may also be dragged so that it is dropped four frames after the beginning of the hole or it may be dropped between the end of the hole and the beginning of clip B. In addition, clip A may be dragged to the end of clip B or it may be dragged and dropped four frames after the end of the timeline.
- FIGS. 23a-23 c illustrate operations performed by dragging a clip from an outside timeline. FIG. 23a illustrates a timeline with clips B and C. However, as shown in the next line, clip A may be dragged from an outside timeline and dropped before clip B. Clip A may be dropped also between clips B and C, after clip C or, for example, seven frames after the end of clip C.
- FIG. 23b is similar to FIG. 23a except that a hole exists in the timeline between clips B and C. The same functions of dragging and dropping clip A may be performed while preserving the hole between clips B and C.
- FIG. 23c illustrates a timeline including two holes and a clip B and C. Clip A may be dragged from an outside timeline and dropped four frames from the start of the timeline. Clip A may also be dropped before clip B, after clip B, into a second hole after clip B or in a second hole five frames after clip B. Clip A may also be dropped at the end of the timeline after clip C or, for example, four frames after the end of the timeline.
- FIGS. 24a-24 i and 24 k-24 m illustrate trim behavior which results from trimming clips in a timeline. FIG. 24a illustrates clips A and B and the result from trimming clip A such that the inpoint is trimmed in five frames by removing five frames from the beginning of clip A. As shown, the length of the other items in the timeline are preserved. FIG. 24b illustrates a clip A and B where the input is trimmed out by adding five frames prior to clip A. The result is shown in the second line of FIG. 24b with a result of clips A and B being of equal length. FIGS. 24c and 24 d illustrate trimming clip A such that the outpoint is trimmed in five frames by removing five frames from the end of clip A (FIG. 24c) or is trimmed out five frames by adding five frames subsequent to the end of clip A (FIG. 24d). FIG. 24e illustrates a timeline having a hole before clip A. Clip A is trimmed at its inpoint in five frames and the result is shown in the second line of FIG. 24e. FIG. 24f illustrates a hole before clip A and a trim operation performed on clip A trimming the inpoint out five frames. FIGS. 24g and 24 h illustrate a timeline having a hole before clip A and trimming the outpoint in by five frames (FIG. 24g) and trimming the outpoint out by five frames (FIG. 24h). FIG. 24i illustrates a timeline having a hole after a clip A and before clip B. FIG. 24i and FIGS. 24k through 24 m illustrate a hole after a selected clip A and before a clip B. The second line of these examples illustrate the result of the timeline after trimming clip A in and out at its inpoints and outpoints by five frames.
- When a user has edited the clips of the video program in more detail, the next step to perform in the editing process is the addition of special effects at transitions. However, the user may access these special effects at any time and is not required to complete all trims prior to creating any effects.
- A second selectable interface 153 for editing the motion video includes operations for selecting special effects to be applied to transitions between two clips. Given two selected clips, a selected effect can be applied to the transition. A corresponding object 190 in FIG. 9 is displayed on the timeline, describing the transition. A suitable interface for providing selection of an effect and clips is shown in FIG. 10.
- A list of possible effects is provided at region 192. Each effect has a title 193 which refers to a computer program that produces the effect. The effect may be applied by selecting button 194 and is applied to the transition closest to the current position on the timeline. The effect may be removed, and replaced by a straight cut, by selecting button 196. A selected effect can be previewed in window 198, for example by "double-clicking" on its name in the effects window 192. For this purpose, each effect has a corresponding file in which the effect has been created using graphics of the letters A and B.
- When an effect is selected, an effect segment is created in the effect subtrack of the video track, with a default duration, e.g., one second. The effect as applied to the transition is rendered in the background and associated with the segment. When displayed and selected on the timeline, the effect segment may be trimmed. Such trimming may be implemented in the same manner as a trim on a regular clip.
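As a rough illustration of attaching an effect to the transition closest to the current position on the timeline, with a one-second default duration, the sketch below locates the nearest cut between consecutive clips and creates an effect segment centered on it. The `Clip`, `EffectSegment` and `apply_effect` names, and the 30 fps frame rate, are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List

FPS = 30                       # assumed frame rate for the one-second default duration

@dataclass
class Clip:
    name: str
    start: int                 # frame at which the clip begins on the timeline
    length: int

@dataclass
class EffectSegment:
    effect: str                # name of the effect program to apply
    start: int
    length: int

def apply_effect(clips: List[Clip], effect: str, playhead: int,
                 duration: int = FPS) -> EffectSegment:
    """Create an effect segment over the transition closest to `playhead`.

    Transition points are the boundaries between consecutive clips; the
    effect segment is centered on the nearest one with a default duration
    of one second.
    """
    cuts = [clips[i].start for i in range(1, len(clips))]
    if not cuts:
        raise ValueError("a transition requires at least two clips")
    nearest = min(cuts, key=lambda cut: abs(cut - playhead))
    return EffectSegment(effect, start=nearest - duration // 2, length=duration)

# Example: two clips with a cut at frame 90; the current position is frame 100.
clips = [Clip("A", 0, 90), Clip("B", 90, 120)]
print(apply_effect(clips, "cross-dissolve", playhead=100))
```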
- After addition of special effects, such as transition effects, to the motion video program, it is common to add titles next. Operations enabling a user to add titles to the video program are provided through interface 154, such as shown in FIG. 11. While titling operations and how they are performed on motion video are known in this art, this particular interface provides an easy mechanism for adding titles. This interface includes an editing region 200 and format selection buttons. Buttons 202 allow bold, italic and underline formatting, while buttons 204 adjust justification. Font and size are selected via a menu-style interface. A title applied to the video program is displayed over the video in the display region 172; however, the video data file of the clip to which it is applied is not modified. The titling information may be finally applied, for example, only when the video program is output in final form. In this way, titles may be added and removed more easily.
- Using the titling interface, when no scrolling option is enabled, the dimensions of the space which can contain text are limited to the frame size, which in this case is represented by the canvas area 209. If the vertical scroll option is enabled, then the width of the canvas is the width of the video image, but the height is indefinite. If the horizontal scroll option is enabled, then the height of the canvas is the height of the video image and the width is indefinite. The length of the title may be the length of any associated video clip or the length of a hole over which it is created. When this interface is active, all modifications to the timeline are done to the title track.
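The canvas-sizing rule just described reduces to a small decision table. A hypothetical helper such as the one below captures it, with None standing in for an "indefinite" dimension; the function name and the 720x480 frame size are purely illustrative.

```python
from typing import Optional, Tuple

def title_canvas_size(frame_width: int, frame_height: int,
                      scroll: Optional[str] = None) -> Tuple[Optional[int], Optional[int]]:
    """Return (width, height) of the titling canvas; None means indefinite.

    * no scrolling      -> canvas is limited to the frame size
    * vertical scroll   -> width is the frame width, height is indefinite
    * horizontal scroll -> height is the frame height, width is indefinite
    """
    if scroll is None:
        return frame_width, frame_height
    if scroll == "vertical":
        return frame_width, None
    if scroll == "horizontal":
        return None, frame_height
    raise ValueError(f"unknown scroll option: {scroll!r}")

print(title_canvas_size(720, 480))                # (720, 480)
print(title_canvas_size(720, 480, "vertical"))    # (720, None)
print(title_canvas_size(720, 480, "horizontal"))  # (None, 480)
```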
- Typically, one of the final steps of the process of making a video program involves "sweetening" of the sound or audio tracks. This involves more detailed editing of the audio tracks. Another interface 155, shown in FIG. 12, provides editing functions for sound. Using this interface, all modifications to clips, including creation and deletion of clips, operate on one of the audio tracks. Given a selected point in the video program, a voice can be captured directly into the timeline, in a manner similar to the way video is captured, via interface 220. Such an operation automatically creates a voice-over clip on the voice-over track. Similarly, music, such as from a CD-ROM, can be imported using interface 222. Such an operation automatically creates a sound clip on the soundtrack. Given a selected clip of voice or music information, or a clip from the video/audio timeline, the volume of each selected track can be adjusted using interface region 224. It is also possible to select fade-in and fade-out options. Given the inputs provided through this interface, the operations to be performed are implemented using known techniques.
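For the volume and fade-in/fade-out adjustments mentioned above, a minimal sketch of one known technique is shown below: a constant gain with linear ramps at the head and tail of the clip. The function name and sample values are illustrative and not taken from the patent.

```python
from typing import List

def adjust_volume(samples: List[float], gain: float,
                  fade_in: int = 0, fade_out: int = 0) -> List[float]:
    """Scale a clip's audio samples by `gain`, with optional linear fades.

    `fade_in` / `fade_out` give the number of samples over which the level
    ramps up from silence at the head and back down to silence at the tail.
    """
    n = len(samples)
    out = []
    for i, sample in enumerate(samples):
        level = gain
        if fade_in and i < fade_in:
            level *= i / fade_in                 # ramp up at the head
        if fade_out and i >= n - fade_out:
            level *= (n - 1 - i) / fade_out      # ramp down at the tail
        out.append(sample * level)
    return out

# Example: halve the volume of a short clip and fade in/out over 3 samples.
print(adjust_volume([1.0] * 10, gain=0.5, fade_in=3, fade_out=3))
```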
- Finally, a library of audio and video information can be provided and accessed through a library interface 156, as shown in FIG. 13. A list operation invoked through button 230 causes a list of the available clips to be displayed in region 232. The available clips are all media clips which have been digitized or imported for use in a composition, for example, through the "Bring Video In" interface or through the "Sound" interface. The list operation involves a directory lookup performed by the computer on its file system, for example. The list view shows clips in a manner similar to the storyboard of interface 52. For each clip, its date, type, duration and description are displayed, and these fields are editable. The preview button 234 allows a user to see one clip at a time from the library instead of an entire list. In this mode, a user can display a selected video clip from the library in region 232 using a viewer which is similar to, but smaller than, the viewer 172 which is reserved for playing back the currently edited video program from the timeline.
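Since the list operation is described as a directory lookup on the computer's file system, the sketch below shows one hypothetical way such a library listing could be built, returning the date and type fields for each media file. The folder name, file extensions and `LibraryEntry` type are assumptions, and the clip duration (which would have to be read from the media file's own metadata) is omitted.

```python
import os
from dataclasses import dataclass
from datetime import datetime
from typing import List

VIDEO_EXTS = {".mov", ".avi", ".mpg"}   # assumed media types in the library folder

@dataclass
class LibraryEntry:
    name: str
    date: str
    kind: str
    description: str = ""               # editable field, initially blank

def list_library(folder: str) -> List[LibraryEntry]:
    """Enumerate captured or imported media clips by scanning a directory,
    in the spirit of the list operation described for the library interface."""
    entries = []
    for fname in sorted(os.listdir(folder)):
        root, ext = os.path.splitext(fname)
        if ext.lower() not in VIDEO_EXTS:
            continue
        mtime = os.path.getmtime(os.path.join(folder, fname))
        entries.append(LibraryEntry(
            name=root,
            date=datetime.fromtimestamp(mtime).strftime("%Y-%m-%d"),
            kind=ext.lstrip(".").upper(),
        ))
    return entries

# Example (assuming a ./media folder of captured clips exists):
# for entry in list_library("./media"):
#     print(entry)
```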
- When a user has completed editing a movie, the movie can be saved in final form as one contiguous video program, using the interface 58 shown in FIG. 14. The title of the video program is shown in region 250. Start button 252 and stop button 254 are provided to start and stop playback of the video, respectively. The user may preview the video program on the computer screen, as indicated at 256; make a videotape by outputting the video information through an encoder to a VCR, for example in VHS format, as indicated at 258; or save the video information as a data file in one of several formats, such as QuickTime video, Microsoft video, MPEG video, or Motion-JPEG video, as indicated at 260. Such files could be used for presentations, Internet publishing or CD-ROM publication. The format of the final program is selected, for example, using the drop-down menu 262. Given the inputs provided through this interface, the titles are rendered. The computer then instructs the user, if appropriate, to ensure that the destination of the data, such as a camcorder, is ready. The steps of generating and playing back the video data from the data files as one contiguous stream of video data may be implemented using known techniques.
- By providing a simplified interface as described above for accessing several commands for video editing, playback and recording, a user is easily guided through the process of producing a video program.
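The output choices described above (preview on screen, videotape via an encoder, or a data file in a selected format) amount to a dispatch on the user's menu selection. The sketch below is an illustrative stand-in for that dispatch; the function names and the `make_movie` helper are not from the patent.

```python
from typing import Callable, Dict

def preview_on_screen(program: str) -> None:
    print(f"Previewing {program} on the computer screen")

def record_to_videotape(program: str) -> None:
    print(f"Outputting {program} through an encoder to the VCR")

def save_as_file(fmt: str) -> Callable[[str], None]:
    def _save(program: str) -> None:
        print(f"Saving {program} as a {fmt} data file")
    return _save

# Destinations and formats mirroring those listed in the description.
OUTPUTS: Dict[str, Callable[[str], None]] = {
    "preview": preview_on_screen,
    "videotape": record_to_videotape,
    "QuickTime video": save_as_file("QuickTime video"),
    "Microsoft video": save_as_file("Microsoft video"),
    "MPEG video": save_as_file("MPEG video"),
    "Motion-JPEG video": save_as_file("Motion-JPEG video"),
}

def make_movie(program: str, choice: str) -> None:
    """Dispatch on the selection made in the drop-down menu."""
    OUTPUTS[choice](program)

make_movie("My Program", "MPEG video")
```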
- At any time during the editing process, a user may want to stop and save the current version of the video program or storyboard. Additionally, the user may want to continue editing a composition that is not yet finished. This capability is provided through menu functions which are separate from the selectable interfaces that provide the planning, capturing, editing and recording functions. Menu functions may also be provided for each interface to represent keystrokes used to execute a given command and to set default values for audio and video, input and output, and file and signal formats.
- A composition can be stored in one or both of two formats. The first format stores the composition only as a storyboard. Storing a composition as a storyboard involves creating a data file and storing in the data file all of the information about a storyboard, without information about the associated clips. The second format stores all of the information about the current video program as well as the state of the editing program, i.e., what interface is being used during the save operation. This file format includes an indication of the interface being used, followed by the representations of each track, and the clip descriptions in each track, along with the storyboard shot descriptions including the indications of associated clips. Given a stored composition, when the document is opened again for further editing, the same interface which was last used is presented to the user.
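As an illustration of the two storage formats, the sketch below serializes a storyboard-only document and a full composition (the interface in use at save time, each track with its clip descriptions, and the shot descriptions with their associated-clip references). JSON and the field names are purely illustrative assumptions; the patent describes only the contents and ordering of the stored information, not an on-disk encoding.

```python
import json
from typing import Dict, List

def save_storyboard_only(path: str, shots: List[Dict]) -> None:
    """First format: only the storyboard shot descriptions, with no
    information about the associated clips."""
    stripped = [{k: v for k, v in shot.items() if k != "clip"} for shot in shots]
    with open(path, "w") as f:
        json.dump({"shots": stripped}, f, indent=2)

def save_full_composition(path: str, interface: str,
                          tracks: List[Dict], shots: List[Dict]) -> None:
    """Second format: the interface in use at save time, each track with its
    clip descriptions, and the storyboard shots including their associated-clip
    references, so the same interface can be restored when the document is
    reopened."""
    with open(path, "w") as f:
        json.dump({"interface": interface, "tracks": tracks, "shots": shots},
                  f, indent=2)

# Example:
shots = [{"number": 1, "description": "Opening wide shot", "clip": "clip_001.mov"}]
tracks = [{"name": "video", "clips": [{"file": "clip_001.mov", "in": 0, "out": 120}]}]
save_storyboard_only("storyboard.json", shots)
save_full_composition("composition.json", interface="sound editing",
                      tracks=tracks, shots=shots)
```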
- Having now described a few embodiments of the invention, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention as defined by the appended claims and equivalents thereto.
Claims (19)
1. A computer program product, comprising:
a computer readable medium storing computer program instructions that, when processed by a computer, instruct the computer to perform a process for facilitating editing of a motion picture, comprising:
automatically generating in the computer system a sequence of clips representing the motion picture from a description of the motion picture, wherein each clip has an initial duration defined by the description of the motion picture;
receiving input from a user indicating instructions to associate motion video information stored in computer data files with clips in the automatically generated representation of the motion picture, and storing for each clip a reference to the associated data file and a time range of the motion video information from the associated data file; and
updating the duration of each clip to which motion video information is associated to correspond to the duration of the motion video information from the time range.
2. The computer program product of claim 1 , wherein receiving input comprises:
receiving input from the user to select a clip; and
receiving input from the user to specify motion video information to be associated with the selected clip.
3. The computer program product of claim 2 , further comprising displaying to the user an indication of each clip to which motion video information has not been associated.
4. The computer program product of claim 1 , wherein receiving input comprises:
receiving input from the user to select a clip;
receiving input from the user instructing the computer to capture motion video information into a data file on the computer while the selected clip is selected; and
associating the captured data file with the selected clip.
5. The computer program product of claim 4 , further comprising displaying to the user an indication of each clip to which motion video information has not been associated.
6. The computer program product of claim 1 , wherein the description of the motion picture includes a plurality of shot descriptions.
7. The computer program product of claim 6 , wherein each shot description includes a field for storing a reference to a single still image descriptive of the shot.
8. The computer program product of claim 6 , wherein each shot description includes a field for storing a number identifying the shot.
9. The computer program product of claim 6 , wherein each shot description includes a field for storing text providing a tip for filming a shot during production.
10. The computer program product of claim 6 , wherein each shot description includes a field for storing text providing a tip for editing a shot in the motion picture.
11. A computer program product, comprising:
a computer readable medium storing computer program instructions that, when processed by a computer, instruct the computer to perform a process for facilitating editing of a motion picture, comprising:
storing in a computer system a representation of a plan for the motion picture, wherein the plan specifies a sequence of shots, wherein each shot is specified by a shot description including a reference to a textual description of the shot and a duration of the shot, wherein at least one shot lacks a reference to a source of motion video information for the shot;
displaying to a user a storyboard on a display for the computer system according to the sequence of shots specified by the plan;
allowing the user to modify the representation of the plan in the computer system;
automatically generating in the computer system a sequence of clips representing the motion picture from the stored representation of the plan, wherein each clip corresponds to a shot in the sequence of shots and has a duration that corresponds at least initially to the duration of the corresponding shot;
storing motion video information from the sources in data files on the computer system;
associating motion video information stored in the data files on the computer system with each clip in the representation of the motion picture and storing for each clip a reference to the associated data file and a range within the data file, such that the duration of each clip corresponds to the associated motion video information;
displaying to the user the sequence of clips as a timeline and in a video window on a display for the computer system using the associated motion video information; and
allowing the user to modify the sequence of clips in the computer system.
12. The computer program product of claim 11 , wherein associating motion video information stored in the data files on the computer system with each clip comprises:
receiving input from the user to select a clip; and
receiving input from the user to specify motion video information to be associated with the selected clip.
13. The computer program product of claim 12 , further comprising displaying to the user an indication of each clip to which motion video information has not been associated.
14. The computer program product of claim 11 , wherein associating motion video information stored in the data files on the computer system with each clip comprises:
receiving input from the user to select a clip;
receiving input from the user instructing the computer to capture motion video information into a data file on the computer while the selected clip is selected; and
associating the captured data file with the selected clip.
15. The computer program product of claim 14 , further comprising displaying to the user an indication of each clip to which motion video information has not been associated.
16. The computer program product of claim 11 , wherein each shot description includes a field for storing a reference to a single still image descriptive of the shot.
17. The computer program product of claim 11 , wherein each shot description includes a field for storing a number identifying the shot.
18. The computer program product of claim 11 , wherein each shot description includes a field for storing text providing a tip for filming a shot during production.
19. The computer program product of claim 11 , wherein each shot description includes a field for storing text providing a tip for editing a shot in the motion picture.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/673,663 US20040066395A1 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/687,926 US6628303B1 (en) | 1996-07-29 | 1996-07-29 | Graphical user interface for a motion video planning and editing system for a computer |
US10/673,663 US20040066395A1 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/687,926 Continuation US6628303B1 (en) | 1996-07-29 | 1996-07-29 | Graphical user interface for a motion video planning and editing system for a computer |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040066395A1 (en) | 2004-04-08 |
Family
ID=24762419
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/687,926 Expired - Lifetime US6628303B1 (en) | 1996-07-29 | 1996-07-29 | Graphical user interface for a motion video planning and editing system for a computer |
US09/911,145 Expired - Lifetime US6469711B2 (en) | 1996-07-29 | 2001-07-23 | Graphical user interface for a video editing system |
US10/673,902 Abandoned US20040071441A1 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
US10/674,033 Expired - Fee Related US7124366B2 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
US10/673,663 Abandoned US20040066395A1 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/687,926 Expired - Lifetime US6628303B1 (en) | 1996-07-29 | 1996-07-29 | Graphical user interface for a motion video planning and editing system for a computer |
US09/911,145 Expired - Lifetime US6469711B2 (en) | 1996-07-29 | 2001-07-23 | Graphical user interface for a video editing system |
US10/673,902 Abandoned US20040071441A1 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
US10/674,033 Expired - Fee Related US7124366B2 (en) | 1996-07-29 | 2003-09-29 | Graphical user interface for a motion video planning and editing system for a computer |
Country Status (1)
Country | Link |
---|---|
US (5) | US6628303B1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040255251A1 (en) * | 2001-09-06 | 2004-12-16 | Microsoft Corporation | Assembling verbal narration for digital display images |
US20050028103A1 (en) * | 2003-07-30 | 2005-02-03 | Takayuki Yamamoto | Editing device |
US20060041632A1 (en) * | 2004-08-23 | 2006-02-23 | Microsoft Corporation | System and method to associate content types in a portable communication device |
US20060072017A1 (en) * | 2004-10-06 | 2006-04-06 | Microsoft Corporation | Creation of image based video using step-images |
US20060184673A1 (en) * | 2004-03-18 | 2006-08-17 | Andrew Liebman | Novel media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US20060204214A1 (en) * | 2005-03-14 | 2006-09-14 | Microsoft Corporation | Picture line audio augmentation |
US20060203199A1 (en) * | 2005-03-08 | 2006-09-14 | Microsoft Corporation | Photostory 3 - automated motion generation |
US20060218488A1 (en) * | 2005-03-28 | 2006-09-28 | Microsoft Corporation | Plug-in architecture for post-authoring activities |
US20060224778A1 (en) * | 2005-04-04 | 2006-10-05 | Microsoft Corporation | Linked wizards |
WO2010146558A1 (en) * | 2009-06-18 | 2010-12-23 | Madeyoum Ltd. | Device, system, and method of generating a multimedia presentation |
US20110126236A1 (en) * | 2009-11-25 | 2011-05-26 | Nokia Corporation | Method and apparatus for presenting media segments |
US20110125818A1 (en) * | 2004-03-18 | 2011-05-26 | Andrew Liebman | Novel media file for multi-platform non-linear video editing systems |
US20110167036A1 (en) * | 2008-06-19 | 2011-07-07 | Andrew Liebman | Novel media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US20120210230A1 (en) * | 2010-07-15 | 2012-08-16 | Ken Matsuda | Media-Editing Application with Anchored Timeline |
CN102668586A (en) * | 2009-09-25 | 2012-09-12 | 夏普株式会社 | Display device, program, and storage medium |
US20140250055A1 (en) * | 2008-04-11 | 2014-09-04 | Adobe Systems Incorporated | Systems and Methods for Associating Metadata With Media Using Metadata Placeholders |
US8966367B2 (en) | 2011-02-16 | 2015-02-24 | Apple Inc. | Anchor override for a media-editing application with an anchored timeline |
US9626375B2 (en) | 2011-04-08 | 2017-04-18 | Andrew Liebman | Systems, computer readable storage media, and computer implemented methods for project sharing |
US9870802B2 (en) | 2011-01-28 | 2018-01-16 | Apple Inc. | Media clip management |
US9997196B2 (en) | 2011-02-16 | 2018-06-12 | Apple Inc. | Retiming media presentations |
US11747972B2 (en) | 2011-02-16 | 2023-09-05 | Apple Inc. | Media-editing application with novel editing tools |
Families Citing this family (397)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6628303B1 (en) * | 1996-07-29 | 2003-09-30 | Avid Technology, Inc. | Graphical user interface for a motion video planning and editing system for a computer |
US6108637A (en) * | 1996-09-03 | 2000-08-22 | Nielsen Media Research, Inc. | Content display monitor |
WO1998012702A1 (en) * | 1996-09-20 | 1998-03-26 | Sony Corporation | Editing system, editing method, clip management apparatus, and clip management method |
US7055166B1 (en) * | 1996-10-03 | 2006-05-30 | Gotuit Media Corp. | Apparatus and methods for broadcast monitoring |
US20030093790A1 (en) * | 2000-03-28 | 2003-05-15 | Logan James D. | Audio and video program recording, editing and playback systems using metadata |
US20060280437A1 (en) * | 1999-01-27 | 2006-12-14 | Gotuit Media Corp | Methods and apparatus for vending and delivering the content of disk recordings |
WO1998026418A1 (en) * | 1996-12-09 | 1998-06-18 | Sony Corporation | Editing device, editing system, and editing method |
US6340978B1 (en) | 1997-01-31 | 2002-01-22 | Making Everlasting Memories, Ltd. | Method and apparatus for recording and presenting life stories |
US7657835B2 (en) * | 1997-01-31 | 2010-02-02 | Making Everlasting Memories, L.L.C. | Method and system for creating a commemorative presentation |
JP3736706B2 (en) * | 1997-04-06 | 2006-01-18 | ソニー株式会社 | Image display apparatus and method |
US7284187B1 (en) * | 1997-05-30 | 2007-10-16 | Aol Llc, A Delaware Limited Liability Company | Encapsulated document and format system |
GB0225339D0 (en) * | 2002-10-31 | 2002-12-11 | Trevor Burke Technology Ltd | Method and apparatus for programme generation and classification |
US20050039177A1 (en) * | 1997-07-12 | 2005-02-17 | Trevor Burke Technology Limited | Method and apparatus for programme generation and presentation |
GB9714624D0 (en) * | 1997-07-12 | 1997-09-17 | Trevor Burke Technology Limite | Visual programme distribution system |
US7263659B2 (en) * | 1998-09-09 | 2007-08-28 | Ricoh Company, Ltd. | Paper-based interface for multimedia information |
US7596755B2 (en) * | 1997-12-22 | 2009-09-29 | Ricoh Company, Ltd. | Multimedia visualization and integration environment |
US7954056B2 (en) * | 1997-12-22 | 2011-05-31 | Ricoh Company, Ltd. | Television-based visualization and navigation interface |
US6380950B1 (en) * | 1998-01-20 | 2002-04-30 | Globalstreams, Inc. | Low bandwidth television |
US20020008751A1 (en) * | 1998-03-25 | 2002-01-24 | Stephen L. Spurgeon | Decorating system for edible items |
US7266782B2 (en) * | 1998-09-09 | 2007-09-04 | Ricoh Company, Ltd. | Techniques for generating a coversheet for a paper-based interface for multimedia information |
US7215436B2 (en) * | 1998-09-09 | 2007-05-08 | Ricoh Company, Ltd. | Device for generating a multimedia paper document |
US7263671B2 (en) | 1998-09-09 | 2007-08-28 | Ricoh Company, Ltd. | Techniques for annotating multimedia information |
US7835920B2 (en) * | 1998-12-18 | 2010-11-16 | Thomson Licensing | Director interface for production automation control |
US6452612B1 (en) * | 1998-12-18 | 2002-09-17 | Parkervision, Inc. | Real time video production system and method |
US8560951B1 (en) * | 1998-12-18 | 2013-10-15 | Thomson Licensing | System and method for real time video production and distribution |
US7024677B1 (en) | 1998-12-18 | 2006-04-04 | Thomson Licensing | System and method for real time video production and multicasting |
US6909874B2 (en) * | 2000-04-12 | 2005-06-21 | Thomson Licensing Sa. | Interactive tutorial method, system, and computer program product for real time media production |
US6952221B1 (en) * | 1998-12-18 | 2005-10-04 | Thomson Licensing S.A. | System and method for real time video production and distribution |
US9123380B2 (en) | 1998-12-18 | 2015-09-01 | Gvbb Holdings S.A.R.L. | Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution, and multiple aspect ratio automated simulcast production |
US11109114B2 (en) | 2001-04-18 | 2021-08-31 | Grass Valley Canada | Advertisement management method, system, and computer program product |
EP1033718B1 (en) * | 1999-03-02 | 2006-01-11 | Hitachi Denshi Kabushiki Kaisha | Motion picture information displaying method and apparatus |
US7877774B1 (en) * | 1999-04-19 | 2011-01-25 | At&T Intellectual Property Ii, L.P. | Browsing and retrieval of full broadcast-quality video |
US9171545B2 (en) * | 1999-04-19 | 2015-10-27 | At&T Intellectual Property Ii, L.P. | Browsing and retrieval of full broadcast-quality video |
US8266657B2 (en) | 2001-03-15 | 2012-09-11 | Sling Media Inc. | Method for effectively implementing a multi-room television system |
US6263503B1 (en) | 1999-05-26 | 2001-07-17 | Neal Margulis | Method for effectively implementing a wireless television system |
KR100370247B1 (en) * | 1999-08-26 | 2003-01-29 | 엘지전자 주식회사 | Video browser based on character relation |
US7996878B1 (en) | 1999-08-31 | 2011-08-09 | At&T Intellectual Property Ii, L.P. | System and method for generating coded video sequences from still media |
KR100346263B1 (en) * | 1999-11-05 | 2002-07-26 | 엘지전자주식회사 | A multi level position/range designating user interface of a multimedia stream for efficient browsing, editing and indexing of a multimedia stream |
US7653925B2 (en) * | 1999-11-17 | 2010-01-26 | Ricoh Company, Ltd. | Techniques for receiving information during multimedia presentations and communicating the information |
US7299405B1 (en) * | 2000-03-08 | 2007-11-20 | Ricoh Company, Ltd. | Method and system for information management to facilitate the exchange of ideas during a collaborative effort |
US6976032B1 (en) | 1999-11-17 | 2005-12-13 | Ricoh Company, Ltd. | Networked peripheral for visitor greeting, identification, biographical lookup and tracking |
US6976229B1 (en) * | 1999-12-16 | 2005-12-13 | Ricoh Co., Ltd. | Method and apparatus for storytelling with digital photographs |
EP1168838B1 (en) * | 2000-01-26 | 2009-08-12 | Sony Corporation | Information processing device and processing method and program storing medium |
US7085995B2 (en) * | 2000-01-26 | 2006-08-01 | Sony Corporation | Information processing apparatus and processing method and program storage medium |
US7262778B1 (en) | 2000-02-11 | 2007-08-28 | Sony Corporation | Automatic color adjustment of a template design |
US8407595B1 (en) * | 2000-02-11 | 2013-03-26 | Sony Corporation | Imaging service for automating the display of images |
US7810037B1 (en) | 2000-02-11 | 2010-10-05 | Sony Corporation | Online story collaboration |
US6771801B1 (en) | 2000-02-11 | 2004-08-03 | Sony Corporation | Adaptable pre-designed photographic storyboard |
US6925602B1 (en) | 2000-03-20 | 2005-08-02 | Intel Corporation | Facilitating access to digital video |
JP2001290938A (en) * | 2000-03-24 | 2001-10-19 | Trw Inc | Integrated digital production line for full-motion visual product |
EP1273008A2 (en) * | 2000-03-31 | 2003-01-08 | Parkervision, Inc. | Method, system and computer program product for full news integration and automation in a real time video production environment |
EP1300019A2 (en) * | 2000-04-05 | 2003-04-09 | Sony United Kingdom Limited | Audio and/or video generation apparatus and method of generating audio and/or video signals |
JP4660879B2 (en) * | 2000-04-27 | 2011-03-30 | ソニー株式会社 | Information providing apparatus and method, and program |
EP1279119A4 (en) * | 2000-04-28 | 2008-03-19 | D4 Media Inc | Goal seeking engine and method for generating custom media presentations |
US20020138843A1 (en) * | 2000-05-19 | 2002-09-26 | Andrew Samaan | Video distribution method and system |
US6920181B1 (en) * | 2000-09-19 | 2005-07-19 | Todd Porter | Method for synchronizing audio and video streams |
US8006192B1 (en) * | 2000-10-04 | 2011-08-23 | Apple Inc. | Layered graphical user interface |
US7444593B1 (en) * | 2000-10-04 | 2008-10-28 | Apple Inc. | Disk space management and clip remainder during edit operations |
US7478327B1 (en) * | 2000-10-04 | 2009-01-13 | Apple Inc. | Unified capture and process interface |
US7325199B1 (en) * | 2000-10-04 | 2008-01-29 | Apple Inc. | Integrated time line for editing |
US7266767B2 (en) * | 2000-11-27 | 2007-09-04 | Parker Philip M | Method and apparatus for automated authoring and marketing |
US7558781B2 (en) * | 2000-12-12 | 2009-07-07 | Home Box Office, Inc. | Digital asset data type definitions |
US6791548B2 (en) * | 2000-12-14 | 2004-09-14 | International Business Machines Corporation | Method and apparatus for automatically displaying dynamic images as a sequence of still frames arranged in a predetermined order |
US8006186B2 (en) * | 2000-12-22 | 2011-08-23 | Muvee Technologies Pte. Ltd. | System and method for media production |
US20070300258A1 (en) * | 2001-01-29 | 2007-12-27 | O'connor Daniel | Methods and systems for providing media assets over a network |
JP3906031B2 (en) * | 2001-01-31 | 2007-04-18 | 株式会社東芝 | Moving picture reproducing apparatus and program for causing computer to execute moving picture reproducing process |
US20020114613A1 (en) * | 2001-02-16 | 2002-08-22 | Sony Corporation | Audio/video editing in digital network recorders |
US7248778B1 (en) * | 2001-03-16 | 2007-07-24 | Gateway Inc. | Automated video editing system and method |
JP3847098B2 (en) * | 2001-03-29 | 2006-11-15 | アルパイン株式会社 | Audio information display device |
US20050005308A1 (en) * | 2002-01-29 | 2005-01-06 | Gotuit Video, Inc. | Methods and apparatus for recording and replaying sports broadcasts |
US20030052909A1 (en) * | 2001-06-25 | 2003-03-20 | Arcsoft, Inc. | Real-time rendering of edited video stream |
US20020196269A1 (en) * | 2001-06-25 | 2002-12-26 | Arcsoft, Inc. | Method and apparatus for real-time rendering of edited video stream |
US20030048302A1 (en) * | 2001-08-31 | 2003-03-13 | International Business Machines Corporation | Context flags for menus, toolbars, and other UI objects |
JP4670207B2 (en) * | 2001-08-31 | 2011-04-13 | ソニー株式会社 | Information processing apparatus and method, recording medium, and program |
US7307043B2 (en) * | 2001-09-28 | 2007-12-11 | Syngenta Crop Protection, Inc. | Aqueous neonicotinoid compositions for seed treatment |
US20040205479A1 (en) * | 2001-10-30 | 2004-10-14 | Seaman Mark D. | System and method for creating a multimedia presentation |
US7747655B2 (en) * | 2001-11-19 | 2010-06-29 | Ricoh Co. Ltd. | Printable representations for time-based media |
US7495795B2 (en) * | 2002-02-21 | 2009-02-24 | Ricoh Company, Ltd. | Interface for printing multimedia information |
US7788080B2 (en) * | 2001-11-19 | 2010-08-31 | Ricoh Company, Ltd. | Paper interface for simulation environments |
US8539344B2 (en) * | 2001-11-19 | 2013-09-17 | Ricoh Company, Ltd. | Paper-based interface for multimedia information stored by multiple multimedia documents |
US7743347B2 (en) * | 2001-11-19 | 2010-06-22 | Ricoh Company, Ltd. | Paper-based interface for specifying ranges |
US8635531B2 (en) * | 2002-02-21 | 2014-01-21 | Ricoh Company, Ltd. | Techniques for displaying information stored in multiple multimedia documents |
US7861169B2 (en) | 2001-11-19 | 2010-12-28 | Ricoh Co. Ltd. | Multimedia print driver dialog interfaces |
US7703044B2 (en) * | 2001-11-19 | 2010-04-20 | Ricoh Company, Ltd. | Techniques for generating a static representation for time-based media information |
US7149957B2 (en) | 2001-11-19 | 2006-12-12 | Ricoh Company, Ltd. | Techniques for retrieving multimedia information using a paper-based interface |
US6928613B1 (en) * | 2001-11-30 | 2005-08-09 | Victor Company Of Japan | Organization, selection, and application of video effects according to zones |
US6663244B1 (en) * | 2001-12-14 | 2003-12-16 | Infocus Corporation | Illumination field blending for use in subtitle projection systems |
US7146574B2 (en) * | 2001-12-21 | 2006-12-05 | Microsoft Corporation | Systems and methods for interfacing with digital history data |
US20070113250A1 (en) * | 2002-01-29 | 2007-05-17 | Logan James D | On demand fantasy sports systems and methods |
US7188066B2 (en) | 2002-02-04 | 2007-03-06 | Microsoft Corporation | Speech controls for use with a speech system |
US7519589B2 (en) * | 2003-02-04 | 2009-04-14 | Cataphora, Inc. | Method and apparatus for sociological data analysis |
US8374879B2 (en) | 2002-02-04 | 2013-02-12 | Microsoft Corporation | Systems and methods for managing interactions from multiple speech-enabled applications |
US7139713B2 (en) * | 2002-02-04 | 2006-11-21 | Microsoft Corporation | Systems and methods for managing interactions from multiple speech-enabled applications |
US7167831B2 (en) * | 2002-02-04 | 2007-01-23 | Microsoft Corporation | Systems and methods for managing multiple grammars in a speech recognition system |
US8135711B2 (en) | 2002-02-04 | 2012-03-13 | Cataphora, Inc. | Method and apparatus for sociological data analysis |
EP1485825A4 (en) * | 2002-02-04 | 2008-03-19 | Cataphora Inc | A method and apparatus for sociological data mining |
US7603627B2 (en) | 2002-02-05 | 2009-10-13 | Microsoft Corporation | Systems and methods for creating and managing graphical user interface lists |
US7257776B2 (en) * | 2002-02-05 | 2007-08-14 | Microsoft Corporation | Systems and methods for scaling a graphical user interface according to display dimensions and using a tiered sizing schema to define display objects |
US7873260B2 (en) * | 2002-02-15 | 2011-01-18 | Acoustic Technology Llc | Video and audio processing control |
US7587317B2 (en) * | 2002-02-15 | 2009-09-08 | Microsoft Corporation | Word training interface |
US7199805B1 (en) * | 2002-05-28 | 2007-04-03 | Apple Computer, Inc. | Method and apparatus for titling |
JP4065142B2 (en) * | 2002-05-31 | 2008-03-19 | 松下電器産業株式会社 | Authoring apparatus and authoring method |
JP4218264B2 (en) * | 2002-06-25 | 2009-02-04 | ソニー株式会社 | Content creation system, content plan creation program, program recording medium, imaging device, imaging method, imaging program |
US7073127B2 (en) * | 2002-07-01 | 2006-07-04 | Arcsoft, Inc. | Video editing GUI with layer view |
US20040034869A1 (en) * | 2002-07-12 | 2004-02-19 | Wallace Michael W. | Method and system for display and manipulation of thematic segmentation in the analysis and presentation of film and video |
US7231630B2 (en) * | 2002-07-12 | 2007-06-12 | Ensequence Inc. | Method and system automatic control of graphical computer application appearance and execution |
US7653544B2 (en) * | 2003-08-08 | 2010-01-26 | Audioeye, Inc. | Method and apparatus for website navigation by the visually impaired |
US7549127B2 (en) * | 2002-08-01 | 2009-06-16 | Realnetworks, Inc. | Method and apparatus for resizing video content displayed within a graphical user interface |
US7734144B2 (en) * | 2002-10-30 | 2010-06-08 | Koninklijke Philips Electronics N.V. | Method and apparatus for editing source video to provide video image stabilization |
US8009966B2 (en) * | 2002-11-01 | 2011-08-30 | Synchro Arts Limited | Methods and apparatus for use in sound replacement with automatic synchronization to images |
US20040117822A1 (en) * | 2002-12-11 | 2004-06-17 | Jeyhan Karaoguz | Method and system for personal media program production in a media exchange network |
US7694225B1 (en) * | 2003-01-06 | 2010-04-06 | Apple Inc. | Method and apparatus for producing a packaged presentation |
US7546544B1 (en) * | 2003-01-06 | 2009-06-09 | Apple Inc. | Method and apparatus for creating multimedia presentations |
US7319764B1 (en) * | 2003-01-06 | 2008-01-15 | Apple Inc. | Method and apparatus for controlling volume |
US7840905B1 (en) | 2003-01-06 | 2010-11-23 | Apple Inc. | Creating a theme used by an authoring application to produce a multimedia presentation |
US20040130566A1 (en) * | 2003-01-07 | 2004-07-08 | Prashant Banerjee | Method for producing computerized multi-media presentation |
US7882258B1 (en) * | 2003-02-05 | 2011-02-01 | Silver Screen Tele-Reality, Inc. | System, method, and computer readable medium for creating a video clip |
GB2400289A (en) * | 2003-04-04 | 2004-10-06 | Autodesk Canada Inc | Selecting functions in a Context-Sensitive Menu |
GB2402588B (en) * | 2003-04-07 | 2006-07-26 | Internet Pro Video Ltd | Computer based system for selecting digital media frames |
WO2004090900A1 (en) * | 2003-04-07 | 2004-10-21 | Internet Pro Video Limited | Method of enabling an application program running on an electronic device to provide media manipulation capabilities |
JP2005012256A (en) * | 2003-06-16 | 2005-01-13 | Canon Inc | Data processing apparatus |
US7437682B1 (en) * | 2003-08-07 | 2008-10-14 | Apple Inc. | Icon label placement in a graphical user interface |
US7352952B2 (en) * | 2003-10-16 | 2008-04-01 | Magix Ag | System and method for improved video editing |
US20050144305A1 (en) * | 2003-10-21 | 2005-06-30 | The Board Of Trustees Operating Michigan State University | Systems and methods for identifying, segmenting, collecting, annotating, and publishing multimedia materials |
US7689712B2 (en) | 2003-11-26 | 2010-03-30 | Ricoh Company, Ltd. | Techniques for integrating note-taking and multimedia information |
AU2004233453B2 (en) * | 2003-12-03 | 2011-02-17 | Envysion, Inc. | Recording a sequence of images |
GB2409124B (en) * | 2003-12-03 | 2009-03-18 | Safehouse Internat Inc | Processing input data signals |
US7664292B2 (en) * | 2003-12-03 | 2010-02-16 | Safehouse International, Inc. | Monitoring an output from a camera |
US20050163345A1 (en) * | 2003-12-03 | 2005-07-28 | Safehouse International Limited | Analysing image data |
NZ536913A (en) * | 2003-12-03 | 2006-09-29 | Safehouse Internat Inc | Displaying graphical output representing the topographical relationship of detectors and their alert status |
US8732221B2 (en) * | 2003-12-10 | 2014-05-20 | Magix Software Gmbh | System and method of multimedia content editing |
US20050132293A1 (en) * | 2003-12-10 | 2005-06-16 | Magix Ag | System and method of multimedia content editing |
GB2410664B (en) * | 2004-01-31 | 2009-04-08 | Autodesk Canada Inc | Generating a user interface |
TWI255141B (en) * | 2004-06-02 | 2006-05-11 | Imagetech Co Ltd | Method and system for real-time interactive video |
US7882436B2 (en) * | 2004-03-10 | 2011-02-01 | Trevor Burke Technology Limited | Distribution of video data |
US20050216840A1 (en) * | 2004-03-25 | 2005-09-29 | Keith Salvucci | In-timeline trimming |
US7779355B1 (en) | 2004-03-30 | 2010-08-17 | Ricoh Company, Ltd. | Techniques for using paper documents as media templates |
US7512886B1 (en) * | 2004-04-15 | 2009-03-31 | Magix Ag | System and method of automatically aligning video scenes with an audio track |
US7932909B2 (en) * | 2004-04-16 | 2011-04-26 | Apple Inc. | User interface for controlling three-dimensional animation of an object |
US7805678B1 (en) * | 2004-04-16 | 2010-09-28 | Apple Inc. | Editing within single timeline |
US20050231512A1 (en) * | 2004-04-16 | 2005-10-20 | Niles Gregory E | Animation of an object using behaviors |
US20050235198A1 (en) * | 2004-04-16 | 2005-10-20 | Howard Johnathon E | Editing system for audiovisual works and corresponding text for television news |
JP4385974B2 (en) * | 2004-05-13 | 2009-12-16 | ソニー株式会社 | Image display method, image processing apparatus, program, and recording medium |
US8099755B2 (en) | 2004-06-07 | 2012-01-17 | Sling Media Pvt. Ltd. | Systems and methods for controlling the encoding of a media stream |
US7917932B2 (en) | 2005-06-07 | 2011-03-29 | Sling Media, Inc. | Personal video recorder functionality for placeshifting systems |
US8346605B2 (en) | 2004-06-07 | 2013-01-01 | Sling Media, Inc. | Management of shared media content |
US9998802B2 (en) | 2004-06-07 | 2018-06-12 | Sling Media LLC | Systems and methods for creating variable length clips from a media stream |
US7975062B2 (en) | 2004-06-07 | 2011-07-05 | Sling Media, Inc. | Capturing and sharing media content |
WO2005122025A2 (en) | 2004-06-07 | 2005-12-22 | Sling Media, Inc. | Personal media broadcasting system |
US7769756B2 (en) | 2004-06-07 | 2010-08-03 | Sling Media, Inc. | Selection and presentation of context-relevant supplemental content and advertising |
US7375768B2 (en) * | 2004-08-24 | 2008-05-20 | Magix Ag | System and method for automatic creation of device specific high definition material |
US8108776B2 (en) * | 2004-08-31 | 2012-01-31 | Intel Corporation | User interface for multimodal information system |
US20060064642A1 (en) * | 2004-09-22 | 2006-03-23 | Edurite Technologies Pvt. Ltd. | Seamless presentation integrator |
US20060067654A1 (en) * | 2004-09-24 | 2006-03-30 | Magix Ag | Graphical user interface adaptable to multiple display devices |
EP1645944B1 (en) * | 2004-10-05 | 2012-08-15 | Sony France S.A. | A content-management interface |
US7752548B2 (en) * | 2004-10-29 | 2010-07-06 | Microsoft Corporation | Features such as titles, transitions, and/or effects which vary according to positions |
JP2006133891A (en) * | 2004-11-02 | 2006-05-25 | Seiko Epson Corp | Information processing apparatus and program |
GB0426247D0 (en) * | 2004-11-25 | 2004-12-29 | Koninkl Philips Electronics Nv | User interface for content authoring |
EP1666967B1 (en) * | 2004-12-03 | 2013-05-08 | Magix AG | System and method of creating an emotional controlled soundtrack |
AU2004240229B2 (en) * | 2004-12-20 | 2011-04-07 | Canon Kabushiki Kaisha | A radial, three-dimensional, hierarchical file system view |
US7660416B1 (en) | 2005-01-11 | 2010-02-09 | Sample Digital Holdings Llc | System and method for media content collaboration throughout a media production process |
US8489990B2 (en) * | 2005-03-02 | 2013-07-16 | Rovi Guides, Inc. | Playlists and bookmarks in an interactive media guidance application system |
US7450124B2 (en) * | 2005-03-18 | 2008-11-11 | Microsoft Corporation | Generating 2D transitions using a 3D model |
US7669130B2 (en) * | 2005-04-15 | 2010-02-23 | Apple Inc. | Dynamic real-time playback |
GB0509047D0 (en) * | 2005-05-04 | 2005-06-08 | Pace Micro Tech Plc | Television system |
US20060271855A1 (en) * | 2005-05-27 | 2006-11-30 | Microsoft Corporation | Operating system shell management of video files |
US20060282776A1 (en) * | 2005-06-10 | 2006-12-14 | Farmer Larry C | Multimedia and performance analysis tool |
US7663691B2 (en) * | 2005-10-11 | 2010-02-16 | Apple Inc. | Image capture using display device as light source |
US20060284895A1 (en) * | 2005-06-15 | 2006-12-21 | Marcu Gabriel G | Dynamic gamma correction |
US8085318B2 (en) * | 2005-10-11 | 2011-12-27 | Apple Inc. | Real-time image capture and manipulation based on streaming data |
US8805929B2 (en) | 2005-06-20 | 2014-08-12 | Ricoh Company, Ltd. | Event-driven annotation techniques |
US7554576B2 (en) * | 2005-06-20 | 2009-06-30 | Ricoh Company, Ltd. | Information capture and recording system for controlling capture devices |
WO2007005790A2 (en) | 2005-06-30 | 2007-01-11 | Sling Media, Inc. | Firmware update for consumer electronic device |
WO2007005789A2 (en) * | 2005-06-30 | 2007-01-11 | Sling Media, Inc. | Screen management system for media player |
US7639873B2 (en) * | 2005-07-28 | 2009-12-29 | Microsoft Corporation | Robust shot detection in a video |
WO2007089274A2 (en) * | 2005-07-29 | 2007-08-09 | Cataphora, Inc. | An improved method and apparatus for sociological data analysis |
US20070035665A1 (en) * | 2005-08-12 | 2007-02-15 | Broadcom Corporation | Method and system for communicating lighting effects with additional layering in a video stream |
US8977965B1 (en) | 2005-08-19 | 2015-03-10 | At&T Intellectual Property Ii, L.P. | System and method for controlling presentations using a multimodal interface |
US9116989B1 (en) | 2005-08-19 | 2015-08-25 | At&T Intellectual Property Ii, L.P. | System and method for using speech for data searching during presentations |
US9042703B2 (en) * | 2005-10-31 | 2015-05-26 | At&T Intellectual Property Ii, L.P. | System and method for content-based navigation of live and recorded TV and video programs |
US9020326B2 (en) | 2005-08-23 | 2015-04-28 | At&T Intellectual Property Ii, L.P. | System and method for content-based navigation of live and recorded TV and video programs |
US7562099B1 (en) * | 2005-09-09 | 2009-07-14 | Avid Technology, Inc. | Graphical user interface for a media management system for communicating quality, source and accessibility information for media data and media objects |
US7739599B2 (en) * | 2005-09-23 | 2010-06-15 | Microsoft Corporation | Automatic capturing and editing of a video |
US9021424B2 (en) * | 2005-09-27 | 2015-04-28 | Sap Se | Multi-document editor with code inlining |
US7644364B2 (en) * | 2005-10-14 | 2010-01-05 | Microsoft Corporation | Photo and video collage effects |
US20070101278A1 (en) * | 2005-10-31 | 2007-05-03 | Microsoft Corporation | Web site theme designer |
US9026915B1 (en) | 2005-10-31 | 2015-05-05 | At&T Intellectual Property Ii, L.P. | System and method for creating a presentation using natural language |
US7614012B1 (en) * | 2005-12-22 | 2009-11-03 | Adobe Systems Incorporated | Methods and apparatus for graphical object implementation |
US20070162857A1 (en) * | 2006-01-06 | 2007-07-12 | Ralf Weber | Automated multimedia authoring |
US20070162855A1 (en) * | 2006-01-06 | 2007-07-12 | Kelly Hawk | Movie authoring |
US7546532B1 (en) * | 2006-02-17 | 2009-06-09 | Adobe Systems Incorporated | Methods and apparatus for editing content |
US7823056B1 (en) | 2006-03-15 | 2010-10-26 | Adobe Systems Incorporated | Multiple-camera video recording |
US7669128B2 (en) * | 2006-03-20 | 2010-02-23 | Intension, Inc. | Methods of enhancing media content narrative |
US7735101B2 (en) | 2006-03-28 | 2010-06-08 | Cisco Technology, Inc. | System allowing users to embed comments at specific points in time into media presentation |
US20080036917A1 (en) * | 2006-04-07 | 2008-02-14 | Mark Pascarella | Methods and systems for generating and delivering navigatable composite videos |
US7730047B2 (en) * | 2006-04-07 | 2010-06-01 | Microsoft Corporation | Analysis of media content via extensible object |
USD562344S1 (en) * | 2006-04-07 | 2008-02-19 | Microsoft Corporation | Set of icons for a portion of a display screen |
WO2007120694A1 (en) * | 2006-04-10 | 2007-10-25 | Yahoo! Inc. | User interface for editing media assets |
US8438646B2 (en) * | 2006-04-28 | 2013-05-07 | Disney Enterprises, Inc. | System and/or method for distributing media content |
WO2007137240A2 (en) * | 2006-05-21 | 2007-11-29 | Motionphoto, Inc. | Methods and apparatus for remote motion graphics authoring |
US8458595B1 (en) | 2006-05-31 | 2013-06-04 | Adobe Systems Incorporated | Video editing including simultaneously displaying timelines and storyboards |
US7890867B1 (en) | 2006-06-07 | 2011-02-15 | Adobe Systems Incorporated | Video editing functions displayed on or near video sequences |
JP4207981B2 (en) * | 2006-06-13 | 2009-01-14 | ソニー株式会社 | Information processing apparatus, information processing method, program, and recording medium |
US8006189B2 (en) * | 2006-06-22 | 2011-08-23 | Dachs Eric B | System and method for web based collaboration using digital media |
US20070296734A1 (en) * | 2006-06-26 | 2007-12-27 | Frank Edughom Ekpar | Method and apparatus for creating and managing high impact special effects |
WO2008011243A2 (en) * | 2006-07-17 | 2008-01-24 | Video Thang Llc. | Systems and methods for encoding, editing and sharing multimedia files |
US20080034277A1 (en) * | 2006-07-24 | 2008-02-07 | Chen-Jung Hong | System and method of the same |
US7623755B2 (en) | 2006-08-17 | 2009-11-24 | Adobe Systems Incorporated | Techniques for positioning audio and video clips |
US7999810B1 (en) * | 2006-08-30 | 2011-08-16 | Boice Gina L | System and method for animated computer visualization of historic events |
US20080066005A1 (en) * | 2006-09-07 | 2008-03-13 | Mcmullen David | Systems and Methods of Interfacing with Enterprise Resource Planning Systems |
US8954851B2 (en) * | 2006-09-15 | 2015-02-10 | Microsoft Corporation | Adding video effects for video enabled applications |
US7877690B2 (en) * | 2006-09-20 | 2011-01-25 | Adobe Systems Incorporated | Media system with integrated clip views |
EP2091252B1 (en) | 2006-11-07 | 2015-11-04 | Sony Corporation | Communication system, transmitting device, receiving device, communication method, program and communication cable |
JP5386984B2 (en) * | 2006-11-07 | 2014-01-15 | ソニー株式会社 | Transmitting apparatus, video signal transmitting method in transmitting apparatus, receiving apparatus, and video signal receiving method in receiving apparatus |
CN102065262B (en) | 2006-11-07 | 2013-04-03 | 索尼株式会社 | Electronic device and control information reception method |
US8375302B2 (en) * | 2006-11-17 | 2013-02-12 | Microsoft Corporation | Example based video editing |
US20080155627A1 (en) * | 2006-12-04 | 2008-06-26 | O'connor Daniel | Systems and methods of searching for and presenting video and audio |
US8020100B2 (en) * | 2006-12-22 | 2011-09-13 | Apple Inc. | Fast creation of video segments |
US8943410B2 (en) * | 2006-12-22 | 2015-01-27 | Apple Inc. | Modified media presentation during scrubbing |
US7992097B2 (en) | 2006-12-22 | 2011-08-02 | Apple Inc. | Select drag and drop operations on video thumbnails across clip boundaries |
US20080172704A1 (en) * | 2007-01-16 | 2008-07-17 | Montazemi Peyman T | Interactive audiovisual editing system |
US20080178087A1 (en) * | 2007-01-19 | 2008-07-24 | Microsoft Corporation | In-Scene Editing of Image Sequences |
WO2008098259A2 (en) * | 2007-02-09 | 2008-08-14 | Scrollmotion, Inc. | Scrollable media and system and method for play-back and production of scrollable media |
US9177603B2 (en) | 2007-03-19 | 2015-11-03 | Intension, Inc. | Method of assembling an enhanced media content narrative |
US8307287B2 (en) * | 2007-04-13 | 2012-11-06 | Apple Inc. | Heads-up-display for use in a media manipulation operation |
US8122378B2 (en) * | 2007-06-08 | 2012-02-21 | Apple Inc. | Image capture and manipulation |
US9047374B2 (en) * | 2007-06-08 | 2015-06-02 | Apple Inc. | Assembling video content |
US20080303949A1 (en) * | 2007-06-08 | 2008-12-11 | Apple Inc. | Manipulating video streams |
US20090003712A1 (en) * | 2007-06-28 | 2009-01-01 | Microsoft Corporation | Video Collage Presentation |
KR20090001090A (en) * | 2007-06-29 | 2009-01-08 | 삼성전자주식회사 | Video Communication Device and Control Method |
TW200904185A (en) * | 2007-07-05 | 2009-01-16 | Intervideo Digital Thchnology Corp | Video editing method |
US9449648B2 (en) * | 2007-08-06 | 2016-09-20 | Apple Inc. | Arranging audio or video sections |
US20090049409A1 (en) * | 2007-08-15 | 2009-02-19 | Archos Sa | Method for generating thumbnails for selecting video objects |
US20090063496A1 (en) * | 2007-08-29 | 2009-03-05 | Yahoo! Inc. | Automated most popular media asset creation |
US20090064005A1 (en) * | 2007-08-29 | 2009-03-05 | Yahoo! Inc. | In-place upload and editing application for editing media assets |
US20090070371A1 (en) * | 2007-09-12 | 2009-03-12 | Yahoo! Inc. | Inline rights request and communication for remote content |
US20090070370A1 (en) * | 2007-09-12 | 2009-03-12 | Yahoo! Inc. | Trackbacks for media assets |
US8477793B2 (en) | 2007-09-26 | 2013-07-02 | Sling Media, Inc. | Media streaming device with gateway functionality |
US20090094159A1 (en) * | 2007-10-05 | 2009-04-09 | Yahoo! Inc. | Stock video purchase |
US20090100013A1 (en) * | 2007-10-10 | 2009-04-16 | Fein Gene S | Method or apparatus of data processing to compile a digital data media presentation for transferring between one or more computers |
USD590837S1 (en) * | 2007-10-17 | 2009-04-21 | Sony Corporation | Display device showing network status icon |
US8350971B2 (en) | 2007-10-23 | 2013-01-08 | Sling Media, Inc. | Systems and methods for controlling media devices |
US8977958B2 (en) * | 2007-11-20 | 2015-03-10 | Microsoft Technology Licensing, Llc | Community-based software application help system |
US20090132920A1 (en) * | 2007-11-20 | 2009-05-21 | Microsoft Corporation | Community-based software application help system |
US7840661B2 (en) * | 2007-12-28 | 2010-11-23 | Yahoo! Inc. | Creating and editing media objects using web requests |
US8060609B2 (en) | 2008-01-04 | 2011-11-15 | Sling Media Inc. | Systems and methods for determining attributes of media items accessed via a personal media broadcaster |
US20090193034A1 (en) * | 2008-01-24 | 2009-07-30 | Disney Enterprises, Inc. | Multi-axis, hierarchical browser for accessing and viewing digital assets |
KR20090093105A (en) * | 2008-02-28 | 2009-09-02 | 삼성전자주식회사 | Content playing apparatus and method |
US9349109B2 (en) * | 2008-02-29 | 2016-05-24 | Adobe Systems Incorporated | Media generation and management |
US8667279B2 (en) | 2008-07-01 | 2014-03-04 | Sling Media, Inc. | Systems and methods for securely place shifting media content |
US20100001960A1 (en) * | 2008-07-02 | 2010-01-07 | Sling Media, Inc. | Systems and methods for gestural interaction with user interface objects |
US20100031152A1 (en) * | 2008-07-31 | 2010-02-04 | Microsoft Corporation | Creation and Navigation of Infinite Canvas Presentation |
US8108777B2 (en) | 2008-08-11 | 2012-01-31 | Microsoft Corporation | Sections of a presentation having user-definable properties |
US8381310B2 (en) | 2009-08-13 | 2013-02-19 | Sling Media Pvt. Ltd. | Systems, methods, and program applications for selectively restricting the placeshifting of copy protected digital media content |
US8667163B2 (en) | 2008-09-08 | 2014-03-04 | Sling Media Inc. | Systems and methods for projecting images from a computer system |
WO2010033233A1 (en) * | 2008-09-18 | 2010-03-25 | Screen Test Studios, Llc | Interactive entertainment system for recording performance |
US8843375B1 (en) * | 2008-09-29 | 2014-09-23 | Apple Inc. | User interfaces for editing audio clips |
US9191610B2 (en) | 2008-11-26 | 2015-11-17 | Sling Media Pvt Ltd. | Systems and methods for creating logical media streams for media storage and playback |
US20100158391A1 (en) * | 2008-12-24 | 2010-06-24 | Yahoo! Inc. | Identification and transfer of a media object segment from one communications network to another |
US8438602B2 (en) | 2009-01-26 | 2013-05-07 | Sling Media Inc. | Systems and methods for linking media content |
US8380866B2 (en) | 2009-03-20 | 2013-02-19 | Ricoh Company, Ltd. | Techniques for facilitating annotations |
US8171148B2 (en) | 2009-04-17 | 2012-05-01 | Sling Media, Inc. | Systems and methods for establishing connections between devices communicating over a network |
US8359537B2 (en) | 2009-04-30 | 2013-01-22 | Apple Inc. | Tool for navigating a composite presentation |
US8286081B2 (en) * | 2009-04-30 | 2012-10-09 | Apple Inc. | Editing and saving key-indexed geometries in media editing applications |
US8566721B2 (en) * | 2009-04-30 | 2013-10-22 | Apple Inc. | Editing key-indexed graphs in media editing applications |
US8555169B2 (en) | 2009-04-30 | 2013-10-08 | Apple Inc. | Media clip auditioning used to evaluate uncommitted media content |
US8392004B2 (en) * | 2009-04-30 | 2013-03-05 | Apple Inc. | Automatic audio adjustment |
US8418082B2 (en) * | 2009-05-01 | 2013-04-09 | Apple Inc. | Cross-track edit indicators and edit selections |
US8984406B2 (en) * | 2009-04-30 | 2015-03-17 | Yahoo! Inc! | Method and system for annotating video content |
US8881013B2 (en) * | 2009-04-30 | 2014-11-04 | Apple Inc. | Tool for tracking versions of media sections in a composite presentation |
US8522144B2 (en) * | 2009-04-30 | 2013-08-27 | Apple Inc. | Media editing application with candidate clip management |
US8549404B2 (en) | 2009-04-30 | 2013-10-01 | Apple Inc. | Auditioning tools for a media editing application |
US8769421B2 (en) * | 2009-04-30 | 2014-07-01 | Apple Inc. | Graphical user interface for a media-editing application with a segmented timeline |
US8701007B2 (en) * | 2009-04-30 | 2014-04-15 | Apple Inc. | Edit visualizer for modifying and evaluating uncommitted media content |
US9564173B2 (en) | 2009-04-30 | 2017-02-07 | Apple Inc. | Media editing application for auditioning different types of media clips |
US8612858B2 (en) | 2009-05-01 | 2013-12-17 | Apple Inc. | Condensing graphical representations of media clips in a composite display area of a media-editing application |
US8627207B2 (en) * | 2009-05-01 | 2014-01-07 | Apple Inc. | Presenting an editing tool in a composite display area |
US20100299621A1 (en) * | 2009-05-20 | 2010-11-25 | Making Everlasting Memories, L.L.C. | System and Method for Extracting a Plurality of Images from a Single Scan |
US10127524B2 (en) * | 2009-05-26 | 2018-11-13 | Microsoft Technology Licensing, Llc | Shared collaboration canvas |
US20100306018A1 (en) * | 2009-05-27 | 2010-12-02 | Microsoft Corporation | Meeting State Recall |
FR2946823B1 (en) * | 2009-06-10 | 2011-11-25 | Borrego Films | Method for generating and managing a multimedia sequence model, corresponding method and restitution device |
US8406431B2 (en) | 2009-07-23 | 2013-03-26 | Sling Media Pvt. Ltd. | Adaptive gain control for digital audio samples in a media stream |
US20110029865A1 (en) * | 2009-07-31 | 2011-02-03 | Nellcor Puritan Bennett Llc | Control Interface For A Medical Monitor |
US9479737B2 (en) | 2009-08-06 | 2016-10-25 | Echostar Technologies L.L.C. | Systems and methods for event programming via a remote media player |
US8532472B2 (en) | 2009-08-10 | 2013-09-10 | Sling Media Pvt Ltd | Methods and apparatus for fast seeking within a media stream buffer |
US9525838B2 (en) | 2009-08-10 | 2016-12-20 | Sling Media Pvt. Ltd. | Systems and methods for virtual remote control of streamed media |
US8799408B2 (en) | 2009-08-10 | 2014-08-05 | Sling Media Pvt Ltd | Localization systems and methods |
US8966101B2 (en) | 2009-08-10 | 2015-02-24 | Sling Media Pvt Ltd | Systems and methods for updating firmware over a network |
US9565479B2 (en) | 2009-08-10 | 2017-02-07 | Sling Media Pvt Ltd. | Methods and apparatus for seeking within a media stream using scene detection |
US9160974B2 (en) | 2009-08-26 | 2015-10-13 | Sling Media, Inc. | Systems and methods for transcoding and place shifting media content |
US8314893B2 (en) | 2009-08-28 | 2012-11-20 | Sling Media Pvt. Ltd. | Remote control and method for automatically adjusting the volume output of an audio device |
US20110055770A1 (en) * | 2009-08-31 | 2011-03-03 | Hed Maria B | User interface method and apparatus for a reservation departure and control system |
US9265429B2 (en) * | 2009-09-18 | 2016-02-23 | Welch Allyn, Inc. | Physiological parameter measuring platform device supporting multiple workflows |
US8621099B2 (en) | 2009-09-21 | 2013-12-31 | Sling Media, Inc. | Systems and methods for formatting media content for distribution |
US8332757B1 (en) * | 2009-09-23 | 2012-12-11 | Adobe Systems Incorporated | Visualizing and adjusting parameters of clips in a timeline |
US9015225B2 (en) | 2009-11-16 | 2015-04-21 | Echostar Technologies L.L.C. | Systems and methods for delivering messages over a network |
US8799485B2 (en) | 2009-12-18 | 2014-08-05 | Sling Media, Inc. | Methods and apparatus for establishing network connections using an inter-mediating device |
US8626879B2 (en) | 2009-12-22 | 2014-01-07 | Sling Media, Inc. | Systems and methods for establishing network connections using local mediation services |
US9178923B2 (en) | 2009-12-23 | 2015-11-03 | Echostar Technologies L.L.C. | Systems and methods for remotely controlling a media server via a network |
US9275054B2 (en) | 2009-12-28 | 2016-03-01 | Sling Media, Inc. | Systems and methods for searching media content |
US8812538B2 (en) * | 2010-01-29 | 2014-08-19 | Wendy Muzatko | Story generation methods, story generation apparatuses, and articles of manufacture |
US8856349B2 (en) | 2010-02-05 | 2014-10-07 | Sling Media Inc. | Connection priority services for data communication between two devices |
US9237294B2 (en) | 2010-03-05 | 2016-01-12 | Sony Corporation | Apparatus and method for replacing a broadcasted advertisement based on both heuristic information and attempts in altering the playback of the advertisement |
US8631047B2 (en) | 2010-06-15 | 2014-01-14 | Apple Inc. | Editing 3D video |
US8583605B2 (en) | 2010-06-15 | 2013-11-12 | Apple Inc. | Media production application |
US8819557B2 (en) | 2010-07-15 | 2014-08-26 | Apple Inc. | Media-editing application with a free-form space for organizing or compositing media clips |
US8875025B2 (en) | 2010-07-15 | 2014-10-28 | Apple Inc. | Media-editing application with media clips grouping capabilities |
US8744239B2 (en) * | 2010-08-06 | 2014-06-03 | Apple Inc. | Teleprompter tool for voice-over tool |
US8555170B2 (en) * | 2010-08-10 | 2013-10-08 | Apple Inc. | Tool for presenting and editing a storyboard representation of a composite presentation |
US10095367B1 (en) * | 2010-10-15 | 2018-10-09 | Tivo Solutions Inc. | Time-based metadata management system for digital media |
US9832528B2 (en) | 2010-10-21 | 2017-11-28 | Sony Corporation | System and method for merging network-based content with broadcasted programming content |
JP5765927B2 (en) * | 2010-12-14 | 2015-08-19 | キヤノン株式会社 | Display control device and control method of display control device |
US9383888B2 (en) | 2010-12-15 | 2016-07-05 | Microsoft Technology Licensing, Llc | Optimized joint document review |
US9118612B2 (en) | 2010-12-15 | 2015-08-25 | Microsoft Technology Licensing, Llc | Meeting-specific state indicators |
US9864612B2 (en) | 2010-12-23 | 2018-01-09 | Microsoft Technology Licensing, Llc | Techniques to customize a user interface for different displays |
US8745499B2 (en) | 2011-01-28 | 2014-06-03 | Apple Inc. | Timeline search and index |
TWI449407B (en) * | 2011-01-28 | 2014-08-11 | Realtek Semiconductor Corp | Displayer, image processing apparatus and image processing method |
US20120198319A1 (en) | 2011-01-28 | 2012-08-02 | Giovanni Agnoli | Media-Editing Application with Video Segmentation and Caching Capabilities |
US9412414B2 (en) * | 2011-02-16 | 2016-08-09 | Apple Inc. | Spatial conform operation for a media-editing application |
US9271035B2 (en) | 2011-04-12 | 2016-02-23 | Microsoft Technology Licensing, Llc | Detecting key roles and their relationships from video |
US20120263439A1 (en) * | 2011-04-13 | 2012-10-18 | David King Lassman | Method and apparatus for creating a composite video from multiple sources |
US20120331385A1 (en) * | 2011-05-20 | 2012-12-27 | Brian Andreas | Asynchronistic platform for real time collaboration and connection |
USD691620S1 (en) * | 2011-06-03 | 2013-10-15 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD700194S1 (en) * | 2011-06-15 | 2014-02-25 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with a graphical user interface |
US8907957B2 (en) | 2011-08-30 | 2014-12-09 | Apple Inc. | Automatic animation generation |
US8819567B2 (en) | 2011-09-13 | 2014-08-26 | Apple Inc. | Defining and editing user interface behaviors |
US9164576B2 (en) | 2011-09-13 | 2015-10-20 | Apple Inc. | Conformance protocol for heterogeneous abstractions for defining user interface behaviors |
US9536564B2 (en) | 2011-09-20 | 2017-01-03 | Apple Inc. | Role-facilitated editing operations |
US20130080913A1 (en) * | 2011-09-22 | 2013-03-28 | Microsoft Corporation | Multi-column notebook interaction |
US8682973B2 (en) | 2011-10-05 | 2014-03-25 | Microsoft Corporation | Multi-user and multi-device collaboration |
US9544158B2 (en) | 2011-10-05 | 2017-01-10 | Microsoft Technology Licensing, Llc | Workspace collaboration via a wall-type computing device |
US9996241B2 (en) | 2011-10-11 | 2018-06-12 | Microsoft Technology Licensing, Llc | Interactive visualization of multiple software functionality content items |
US10198485B2 (en) | 2011-10-13 | 2019-02-05 | Microsoft Technology Licensing, Llc | Authoring of data visualizations and maps |
US20130047081A1 (en) * | 2011-10-25 | 2013-02-21 | Triparazzi, Inc. | Methods and systems for creating video content on mobile devices using storyboard templates |
US9792955B2 (en) | 2011-11-14 | 2017-10-17 | Apple Inc. | Automatic generation of multi-camera media clips |
USD693364S1 (en) * | 2011-11-29 | 2013-11-12 | Microsoft Corporation | Display screen with icon |
US10496250B2 (en) | 2011-12-19 | 2019-12-03 | Bellevue Investments Gmbh & Co, Kgaa | System and method for implementing an intelligent automatic music jam session |
US8705932B2 (en) * | 2011-12-21 | 2014-04-22 | Pelco, Inc. | Method and system for displaying a timeline |
USD681672S1 (en) * | 2012-01-06 | 2013-05-07 | Microsoft Corporation | Display screen with icon |
KR101952260B1 (en) * | 2012-04-03 | 2019-02-26 | 삼성전자주식회사 | Video display terminal and method for displaying a plurality of video thumbnail simultaneously |
US10226200B2 (en) | 2012-04-05 | 2019-03-12 | Welch Allyn, Inc. | User interface enhancements for physiological parameter monitoring platform devices |
USD916713S1 (en) | 2012-04-05 | 2021-04-20 | Welch Allyn, Inc. | Display screen with graphical user interface for patient central monitoring station |
US9055870B2 (en) | 2012-04-05 | 2015-06-16 | Welch Allyn, Inc. | Physiological parameter measuring platform device supporting multiple workflows |
USD772252S1 (en) * | 2012-04-05 | 2016-11-22 | Welch Allyn, Inc. | Patient monitoring device with a graphical user interface |
US9235682B2 (en) | 2012-04-05 | 2016-01-12 | Welch Allyn, Inc. | Combined episodic and continuous parameter monitoring |
US9367522B2 (en) * | 2012-04-13 | 2016-06-14 | Google Inc. | Time-based presentation editing |
USD754159S1 (en) | 2012-06-11 | 2016-04-19 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US8793582B2 (en) | 2012-08-22 | 2014-07-29 | Mobitv, Inc. | Personalized timeline presentation |
US20140099080A1 (en) * | 2012-10-10 | 2014-04-10 | International Business Machines Corporation | Creating An Abridged Presentation Of A Media Work |
RU2627096C2 (en) * | 2012-10-30 | 2017-08-03 | Сергей Анатольевич Гевлич | Methods for manufacturing multimedia presentation prototypes, devices for manufacturing multimedia presentation prototypes, and methods of using such devices (versions) |
US20140226955A1 (en) * | 2013-02-12 | 2014-08-14 | Takes Llc | Generating a sequence of video clips based on meta data |
USD735227S1 (en) * | 2013-04-01 | 2015-07-28 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
JP2015002417A (en) * | 2013-06-14 | 2015-01-05 | キヤノン株式会社 | Photographing apparatus and method for controlling the same |
JP6093289B2 (en) * | 2013-12-10 | 2017-03-08 | 株式会社フレイ・スリー | Video processing apparatus, video processing method, and program |
USD755201S1 (en) * | 2013-12-30 | 2016-05-03 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
TWD165580S (en) * | 2014-03-07 | 2015-01-21 | 金寶電子工業股份有限公司 | Changeable graphical user interface for display screen |
USD754676S1 (en) * | 2014-04-04 | 2016-04-26 | Adp, Llc | Display screen or portion thereof with graphical user interface |
USD754675S1 (en) * | 2014-04-04 | 2016-04-26 | Adp, Llc | Display screen or portion thereof with graphical user interface |
USD754144S1 (en) * | 2014-04-04 | 2016-04-19 | Adp, Llc | Display screen or portion thereof with graphical user interface |
TWI577227B (en) * | 2014-08-26 | 2017-04-01 | 宏碁股份有限公司 | Method of setting cell broadcast service function of user equipment |
USD777190S1 (en) * | 2015-03-30 | 2017-01-24 | Captioncall, Llc | Display screen of a captioning communication device with graphical user interface |
CN104822092B (en) * | 2015-04-30 | 2018-08-24 | 无锡天脉聚源传媒科技有限公司 | Video gets ready, indexes and subtitle merging treatment method and device |
USD769298S1 (en) * | 2015-05-01 | 2016-10-18 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD768170S1 (en) * | 2015-05-01 | 2016-10-04 | Microsoft Corporation | Display screen with graphical user interface |
USD768171S1 (en) * | 2015-05-01 | 2016-10-04 | Microsoft Corporation | Display screen with graphical user interface |
JP6702657B2 (en) * | 2015-05-29 | 2020-06-03 | キヤノン株式会社 | Video processing device, video processing method, and program |
US20170076752A1 (en) * | 2015-09-10 | 2017-03-16 | Laura Steward | System and method for automatic media compilation |
USD810779S1 (en) | 2015-12-29 | 2018-02-20 | Sony Corporation | Portion of display panel or screen with icon |
US11727195B2 (en) | 2016-03-18 | 2023-08-15 | Audioeye, Inc. | Modular systems and methods for selectively enabling cloud-based assistive technologies |
US10423709B1 (en) | 2018-08-16 | 2019-09-24 | Audioeye, Inc. | Systems, devices, and methods for automated and programmatic creation and deployment of remediations to non-compliant web pages or user interfaces |
US10896286B2 (en) | 2016-03-18 | 2021-01-19 | Audioeye, Inc. | Modular systems and methods for selectively enabling cloud-based assistive technologies |
US10444934B2 (en) | 2016-03-18 | 2019-10-15 | Audioeye, Inc. | Modular systems and methods for selectively enabling cloud-based assistive technologies |
US10867120B1 (en) | 2016-03-18 | 2020-12-15 | Audioeye, Inc. | Modular systems and methods for selectively enabling cloud-based assistive technologies |
KR102028198B1 (en) * | 2017-01-26 | 2019-10-04 | 한국전자통신연구원 | Device for authoring video scene and metadata |
CN107515704B (en) * | 2017-08-04 | 2020-01-07 | 珠海格力电器股份有限公司 | Method and device for previewing compressed file |
US10453496B2 (en) * | 2017-12-29 | 2019-10-22 | Dish Network L.L.C. | Methods and systems for an augmented film crew using sweet spots |
US10834478B2 (en) | 2017-12-29 | 2020-11-10 | Dish Network L.L.C. | Methods and systems for an augmented film crew using purpose |
US10783925B2 (en) | 2017-12-29 | 2020-09-22 | Dish Network L.L.C. | Methods and systems for an augmented film crew using storyboards |
USD873844S1 (en) * | 2018-04-13 | 2020-01-28 | Martell Broadcasting Systems, Inc. | Display screen with transitional graphical user interface |
CN110971957B (en) * | 2018-09-30 | 2022-04-15 | 阿里巴巴集团控股有限公司 | Video editing method and device and mobile terminal |
US11126344B2 (en) * | 2019-01-22 | 2021-09-21 | Facebook, Inc. | Systems and methods for sharing content |
JP7409370B2 (en) * | 2019-03-27 | 2024-01-09 | ソニーグループ株式会社 | Video processing device and video processing method |
US12136445B2 (en) | 2019-04-01 | 2024-11-05 | Blackmagic Design Pty Ltd | User interface for video editing system |
EP3948502A4 (en) | 2019-04-01 | 2022-12-28 | Blackmagic Design Pty Ltd | Media management system |
EP4008102A4 (en) | 2019-08-02 | 2023-07-19 | Blackmagic Design Pty Ltd | Video editing system, method and user interface |
CN110798744A (en) * | 2019-11-08 | 2020-02-14 | 北京字节跳动网络技术有限公司 | Multimedia information processing method, device, electronic equipment and medium |
US11798282B1 (en) | 2019-12-18 | 2023-10-24 | Snap Inc. | Video highlights with user trimming |
US11610607B1 (en) | 2019-12-23 | 2023-03-21 | Snap Inc. | Video highlights with user viewing, posting, sending and exporting |
US11538499B1 (en) | 2019-12-30 | 2022-12-27 | Snap Inc. | Video highlights with auto trimming |
KR102427890B1 (en) * | 2020-11-25 | 2022-08-01 | 네이버 주식회사 | Method and system to provide object for content arrangement |
CN113111220A (en) * | 2021-03-26 | 2021-07-13 | 北京达佳互联信息技术有限公司 | Video processing method, device, equipment, server and storage medium |
US11875764B2 (en) * | 2021-03-29 | 2024-01-16 | Avid Technology, Inc. | Data-driven autosuggestion within media content creation |
US20240205516A1 (en) * | 2021-04-08 | 2024-06-20 | Captuure, Inc. | Video capture, production, and delivery systems |
CN115442538A (en) * | 2021-06-04 | 2022-12-06 | 北京字跳网络技术有限公司 | Video generation method, device, equipment and storage medium |
US11508413B1 (en) * | 2021-08-27 | 2022-11-22 | Verizon Patent And Licensing Inc. | Systems and methods for editing media composition from media assets |
USD1031755S1 (en) * | 2022-08-25 | 2024-06-18 | EMOCOG Co., Ltd. | Display screen or portion thereof with graphical user interface |
JP1775537S (en) * | 2022-08-25 | 2024-07-17 | | Graphical user interface |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1988007719A2 (en) | 1987-03-31 | 1988-10-06 | Aimtech Corporation | Apparatus for iconographically representing and executing a program |
CA2015215C (en) | 1989-06-16 | 1994-01-11 | Bradley James Beitel | Computer-based, audio/visual creation and presentation system and method |
JPH056251A (en) | 1990-07-30 | 1993-01-14 | Farallon Computing Inc | Device for previously recording, editing and regenerating screening on computer system |
JP3028241B2 (en) * | 1990-11-07 | 2000-04-04 | 株式会社サトー | Thermal fixing device in electrophotographic apparatus |
DE69222102T2 (en) * | 1991-08-02 | 1998-03-26 | Grass Valley Group | Operator interface for video editing system for the display and interactive control of video material |
US5355450A (en) | 1992-04-10 | 1994-10-11 | Avid Technology, Inc. | Media composer with adjustable source material compression |
US5999173A (en) * | 1992-04-03 | 1999-12-07 | Adobe Systems Incorporated | Method and apparatus for video editing with video clip representations displayed along a time line |
JP2548497B2 (en) | 1992-10-09 | 1996-10-30 | 松下電器産業株式会社 | Video editing equipment |
EP0613145A2 (en) | 1993-02-26 | 1994-08-31 | Sony Electronics Inc. | Card file graphical user interface with visual representation of video data |
EP0702832B1 (en) | 1993-06-10 | 1998-03-04 | Lightworks Editing Systems Ltd | Video editing systems |
US5546528A (en) | 1994-06-23 | 1996-08-13 | Adobe Systems Incorporated | Method of displaying multiple sets of information in the same area of a computer screen |
US5675752A (en) | 1994-09-15 | 1997-10-07 | Sony Corporation | Interactive applications generator for an interactive presentation environment |
US5826102A (en) * | 1994-12-22 | 1998-10-20 | Bell Atlantic Network Services, Inc. | Network arrangement for development delivery and presentation of multimedia applications using timelines to integrate multimedia objects and program objects |
US5838938A (en) * | 1995-02-15 | 1998-11-17 | Sony Electronics, Inc. | Multimedia user interface employing components of color to indicate the values of variables |
DE69623712T2 (en) | 1995-04-08 | 2003-05-28 | Sony Corp., Tokio/Tokyo | INTERFACE SYSTEM |
US5732184A (en) * | 1995-10-20 | 1998-03-24 | Digital Processing Systems, Inc. | Video and audio cursor video editing system |
US5852435A (en) * | 1996-04-12 | 1998-12-22 | Avid Technology, Inc. | Digital multimedia editing and data management system |
US5781188A (en) * | 1996-06-27 | 1998-07-14 | Softimage | Indicating activeness of clips and applying effects to clips and tracks in a timeline of a multimedia work |
- 1996
  - 1996-07-29 US US08/687,926 patent/US6628303B1/en not_active Expired - Lifetime
- 2001
  - 2001-07-23 US US09/911,145 patent/US6469711B2/en not_active Expired - Lifetime
- 2003
  - 2003-09-29 US US10/673,902 patent/US20040071441A1/en not_active Abandoned
  - 2003-09-29 US US10/674,033 patent/US7124366B2/en not_active Expired - Fee Related
  - 2003-09-29 US US10/673,663 patent/US20040066395A1/en not_active Abandoned
Patent Citations (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4538188A (en) * | 1982-12-22 | 1985-08-27 | Montage Computer Corporation | Video composition method and apparatus |
US4685003A (en) * | 1983-12-02 | 1987-08-04 | Lex Computing & Management Corporation | Video composition method and apparatus for providing simultaneous inputting and sorting of video source material |
US4746994A (en) * | 1985-08-22 | 1988-05-24 | Cinedco, California Limited Partnership | Computer-based video editing system |
US4746994B1 (en) * | 1985-08-22 | 1993-02-23 | Cinedco Inc | |
US5274758A (en) * | 1989-06-16 | 1993-12-28 | International Business Machines | Computer-based, audio/visual creation and presentation system and method |
US5584006A (en) * | 1989-12-22 | 1996-12-10 | Avid Technology, Inc. | Media storage and retrieval system including determination of media data associated with requests based on source identifiers and ranges within the media data |
US5267351A (en) * | 1989-12-22 | 1993-11-30 | Avid Technology, Inc. | Media storage and retrieval system |
US5012334B1 (en) * | 1990-01-29 | 1997-05-13 | Grass Valley Group | Video image bank for storing and retrieving video image sequences |
US5012334A (en) * | 1990-01-29 | 1991-04-30 | Dubner Computer Systems, Inc. | Video image bank for storing and retrieving video image sequences |
US5196933A (en) * | 1990-03-23 | 1993-03-23 | Etat Francais, Ministere Des Ptt | Encoding and transmission method with at least two levels of quality of digital pictures belonging to a sequence of pictures, and corresponding devices |
US5237648A (en) * | 1990-06-08 | 1993-08-17 | Apple Computer, Inc. | Apparatus and method for editing a video recording by selecting and displaying video clips |
US5097351A (en) * | 1990-08-06 | 1992-03-17 | Holotek, Ltd. | Simultaneous multibeam scanning system |
US5513306A (en) * | 1990-08-09 | 1996-04-30 | Apple Computer, Inc. | Temporal event viewing and editing system |
US5214528A (en) * | 1990-09-14 | 1993-05-25 | Konica Corporation | Optical beam scanning apparatus |
US5307456A (en) * | 1990-12-04 | 1994-04-26 | Sony Electronics, Inc. | Integrated multi-media production and authoring system |
US5317732A (en) * | 1991-04-26 | 1994-05-31 | Commodore Electronics Limited | System for relocating a multimedia presentation on a different platform by extracting a resource map in order to remap and relocate resources |
US5442744A (en) * | 1992-04-03 | 1995-08-15 | Sun Microsystems, Inc. | Methods and apparatus for displaying and editing multimedia information |
US5752029A (en) * | 1992-04-10 | 1998-05-12 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using references to tracks in the composition to define components of the composition |
US5568275A (en) * | 1992-04-10 | 1996-10-22 | Avid Technology, Inc. | Method for visually and audibly representing computer instructions for editing |
US5754851A (en) * | 1992-04-10 | 1998-05-19 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using recursively defined components |
US5724605A (en) * | 1992-04-10 | 1998-03-03 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using a tree structure |
US5682326A (en) * | 1992-08-03 | 1997-10-28 | Radius Inc. | Desktop digital video processing system |
US5404316A (en) * | 1992-08-03 | 1995-04-04 | Spectra Group Ltd., Inc. | Desktop digital video processing system |
US5539869A (en) * | 1992-09-28 | 1996-07-23 | Ford Motor Company | Method and system for processing and presenting on-line, multimedia information in a tree structure |
US5659792A (en) * | 1993-01-15 | 1997-08-19 | Canon Information Systems Research Australia Pty Ltd. | Storyboard system for the simultaneous timing of multiple independent video animation clips |
US5488433A (en) * | 1993-04-21 | 1996-01-30 | Kinya Washino | Dual compression format digital video production system |
US5537157A (en) * | 1993-04-21 | 1996-07-16 | Kinya Washino | Multi-format audio/video production system |
US5390138A (en) * | 1993-09-13 | 1995-02-14 | Taligent, Inc. | Object-oriented audio system |
US5515490A (en) * | 1993-11-05 | 1996-05-07 | Xerox Corporation | Method and system for temporally formatting data presentation in time-dependent documents |
US5493568A (en) * | 1993-11-24 | 1996-02-20 | Intel Corporation | Media dependent module interface for computer-based conferencing system |
US5613057A (en) * | 1994-01-14 | 1997-03-18 | International Business Machines Corporation | Method for creating a multimedia application using multimedia files stored in directories that are characteristics of display surface areas |
US5619636A (en) * | 1994-02-17 | 1997-04-08 | Autodesk, Inc. | Multimedia publishing system |
US5664216A (en) * | 1994-03-22 | 1997-09-02 | Blumenau; Trevor | Iconic audiovisual data editing environment |
US5592602A (en) * | 1994-05-17 | 1997-01-07 | Macromedia, Inc. | User interface and method for controlling and displaying multimedia motion, visual, and sound effects of an object on a display |
US5489947A (en) * | 1994-06-17 | 1996-02-06 | Thomson Consumer Electronics, Inc. | On screen display arrangement for a digital video signal processing system |
US5534942A (en) * | 1994-06-17 | 1996-07-09 | Thomson Consumer Electronics, Inc. | On screen display arrangement for digital video signal processing system |
US5983069A (en) * | 1994-09-06 | 1999-11-09 | Stv Asia Ltd. | Point of purchase video distribution system |
US5652714A (en) * | 1994-09-30 | 1997-07-29 | Apple Computer, Inc. | Method and apparatus for capturing transient events in a multimedia product using an authoring tool on a computer system |
USD391558S (en) * | 1994-10-05 | 1998-03-03 | Bell Video Services Company | Set of icons for a display screen of a video monitor |
USD392264S (en) * | 1994-10-05 | 1998-03-17 | Bell Video Services Company | Set of icons for the display screen of a video monitor |
US5659793A (en) * | 1994-12-22 | 1997-08-19 | Bell Atlantic Video Services, Inc. | Authoring tools for multimedia application development and network delivery |
US5659790A (en) * | 1995-02-23 | 1997-08-19 | International Business Machines Corporation | System and method for globally scheduling multimedia stories |
US5684963A (en) * | 1995-03-20 | 1997-11-04 | Discreet Logic, Inc. | System and method for distributing video from a plurality of video providers |
US5680619A (en) * | 1995-04-03 | 1997-10-21 | Mfactory, Inc. | Hierarchical encapsulation of instantiated objects in a multimedia authoring system |
USD392268S (en) * | 1995-04-06 | 1998-03-17 | Avid Technology, Inc. | Icon for a display screen |
US5892507A (en) * | 1995-04-06 | 1999-04-06 | Avid Technology, Inc. | Computer system for authoring a multimedia composition using a visual representation of the multimedia composition |
USD392267S (en) * | 1995-04-06 | 1998-03-17 | Avid Technology, Inc. | Icon for a display screen |
USD392269S (en) * | 1995-04-06 | 1998-03-17 | Avid Technology, Inc. | Icon for a display screen |
USD395291S (en) * | 1995-04-06 | 1998-06-16 | Avid Technology, Inc. | Icon for a display screen |
US5974218A (en) * | 1995-04-21 | 1999-10-26 | Hitachi, Ltd. | Method and apparatus for making a digest picture |
US5712953A (en) * | 1995-06-28 | 1998-01-27 | Electronic Data Systems Corporation | System and method for classification of audio or audio/video signals based on musical content |
US5623308A (en) * | 1995-07-07 | 1997-04-22 | Lucent Technologies Inc. | Multiple resolution, multi-stream video system using a single standard coder |
US5760767A (en) * | 1995-10-26 | 1998-06-02 | Sony Corporation | Method and apparatus for displaying in and out points during video editing |
US5801685A (en) * | 1996-04-08 | 1998-09-01 | Tektronix, Inc. | Automatic editing of recorded video elements synchronized with a script text read or displayed |
US5781435A (en) * | 1996-04-12 | 1998-07-14 | Holroyd; Delwyn | Edit-to-it |
US5959697A (en) * | 1996-06-07 | 1999-09-28 | Electronic Data Systems Corporation | Method and system for detecting dissolve transitions in a video signal |
US6469711B2 (en) * | 1996-07-29 | 2002-10-22 | Avid Technology, Inc. | Graphical user interface for a video editing system |
US6628303B1 (en) * | 1996-07-29 | 2003-09-30 | Avid Technology, Inc. | Graphical user interface for a motion video planning and editing system for a computer |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7725830B2 (en) | 2001-09-06 | 2010-05-25 | Microsoft Corporation | Assembling verbal narration for digital display images |
US20040255251A1 (en) * | 2001-09-06 | 2004-12-16 | Microsoft Corporation | Assembling verbal narration for digital display images |
US20050028103A1 (en) * | 2003-07-30 | 2005-02-03 | Takayuki Yamamoto | Editing device |
US9076488B2 (en) | 2004-03-18 | 2015-07-07 | Andrew Liebman | Media file for multi-platform non-linear video editing systems |
US20110125818A1 (en) * | 2004-03-18 | 2011-05-26 | Andrew Liebman | Novel media file for multi-platform non-linear video editing systems |
US20060184673A1 (en) * | 2004-03-18 | 2006-08-17 | Andrew Liebman | Novel media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US8266283B2 (en) | 2004-03-18 | 2012-09-11 | Andrew Liebman | Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US8751604B2 (en) | 2004-03-18 | 2014-06-10 | Andrew Liebman | Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US20060041632A1 (en) * | 2004-08-23 | 2006-02-23 | Microsoft Corporation | System and method to associate content types in a portable communication device |
US7400351B2 (en) | 2004-10-06 | 2008-07-15 | Microsoft Corporation | Creation of image based video using step-images |
US20060072017A1 (en) * | 2004-10-06 | 2006-04-06 | Microsoft Corporation | Creation of image based video using step-images |
US20060203199A1 (en) * | 2005-03-08 | 2006-09-14 | Microsoft Corporation | Photostory 3 - automated motion generation |
US7372536B2 (en) | 2005-03-08 | 2008-05-13 | Microsoft Corporation | Photostory 3—automated motion generation |
US20060204214A1 (en) * | 2005-03-14 | 2006-09-14 | Microsoft Corporation | Picture line audio augmentation |
US20060218488A1 (en) * | 2005-03-28 | 2006-09-28 | Microsoft Corporation | Plug-in architecture for post-authoring activities |
US20060224778A1 (en) * | 2005-04-04 | 2006-10-05 | Microsoft Corporation | Linked wizards |
US20140250055A1 (en) * | 2008-04-11 | 2014-09-04 | Adobe Systems Incorporated | Systems and Methods for Associating Metadata With Media Using Metadata Placeholders |
US20110167036A1 (en) * | 2008-06-19 | 2011-07-07 | Andrew Liebman | Novel media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
US9552843B2 (en) | 2008-06-19 | 2017-01-24 | Andrew Liebman | Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems |
WO2010146558A1 (en) * | 2009-06-18 | 2010-12-23 | Madeyoum Ltd. | Device, system, and method of generating a multimedia presentation |
CN102668586A (en) * | 2009-09-25 | 2012-09-12 | 夏普株式会社 | Display device, program, and storage medium |
US20110126236A1 (en) * | 2009-11-25 | 2011-05-26 | Nokia Corporation | Method and apparatus for presenting media segments |
WO2011064440A1 (en) * | 2009-11-25 | 2011-06-03 | Nokia Corporation | Method and apparatus for presenting media segments |
US8631436B2 (en) * | 2009-11-25 | 2014-01-14 | Nokia Corporation | Method and apparatus for presenting media segments |
US20120210230A1 (en) * | 2010-07-15 | 2012-08-16 | Ken Matsuda | Media-Editing Application with Anchored Timeline |
US8910046B2 (en) * | 2010-07-15 | 2014-12-09 | Apple Inc. | Media-editing application with anchored timeline |
US9600164B2 (en) | 2010-07-15 | 2017-03-21 | Apple Inc. | Media-editing application with anchored timeline |
US9870802B2 (en) | 2011-01-28 | 2018-01-16 | Apple Inc. | Media clip management |
US8966367B2 (en) | 2011-02-16 | 2015-02-24 | Apple Inc. | Anchor override for a media-editing application with an anchored timeline |
US9997196B2 (en) | 2011-02-16 | 2018-06-12 | Apple Inc. | Retiming media presentations |
US10324605B2 (en) | 2011-02-16 | 2019-06-18 | Apple Inc. | Media-editing application with novel editing tools |
US11157154B2 (en) | 2011-02-16 | 2021-10-26 | Apple Inc. | Media-editing application with novel editing tools |
US11747972B2 (en) | 2011-02-16 | 2023-09-05 | Apple Inc. | Media-editing application with novel editing tools |
US9626375B2 (en) | 2011-04-08 | 2017-04-18 | Andrew Liebman | Systems, computer readable storage media, and computer implemented methods for project sharing |
Also Published As
Publication number | Publication date |
---|---|
US20040056882A1 (en) | 2004-03-25 |
US20010040592A1 (en) | 2001-11-15 |
US6628303B1 (en) | 2003-09-30 |
US6469711B2 (en) | 2002-10-22 |
US20040071441A1 (en) | 2004-04-15 |
US7124366B2 (en) | 2006-10-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6628303B1 (en) | Graphical user interface for a motion video planning and editing system for a computer | |
US11157154B2 (en) | Media-editing application with novel editing tools | |
US7836389B2 (en) | Editing system for audiovisual works and corresponding text for television news | |
JP3892342B2 (en) | Video editing method and apparatus for editing video projects | |
US6400378B1 (en) | Home movie maker | |
JP3067801B2 (en) | Digital audio workstation providing digital storage and display of video information | |
EP1872268B1 (en) | Icon bar display for video editing system | |
EP0916136B1 (en) | Graphical user interface for a motion video planning and editing system for a computer | |
US20030164845A1 (en) | Performance retiming effects on synchronized data in an editing system | |
US8006192B1 (en) | Layered graphical user interface | |
US7242847B1 (en) | Systems and methods for editing video streams using a grid-based representation | |
US6892353B1 (en) | Edit to tape | |
US6272279B1 (en) | Editing method of moving images, editing apparatus and storage medium storing its editing method program | |
JP2000059720A (en) | Apparatus for displaying and editing continuous media information, method for displaying and editing continuous media information, and recording medium recording the processing procedure | |
Meadows | Digital storytelling | |
JP3263443B2 (en) | Animation editing equipment | |
JP2000082278A (en) | Moving image editing method | |
JPH1139849A (en) | Editing apparatus | |
Dixon | How to Use Adobe Premiere 6.5 | |
Brenneis | Final Cut Pro 6: Visual QuickPro Guide | |
JP2004140750A (en) | Image editor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |