US20020126201A1 - Systems and methods for connecting video conferencing to a distributed network - Google Patents
- Publication number
- US20020126201A1 (application US09/984,499)
- Authority
- US
- United States
- Prior art keywords
- data stream
- video
- video conference
- digital audio
- digital
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/152—Multipoint control units therefor
Description
- This invention relates to systems and methods that allow data to be transmitted between a video conference system and a distributed network.
- Conventional video conferencing equipment is generally divided into a number of categories. These categories include data stream display and capture equipment, such as cameras, microphones, televisions and speakers, end point equipment that connects a particular video conference participant to another end point device or to a multi-point control unit, and the multi-point control unit, which allows three or more end point devices to participate in a single video conference session.
- the end point equipment is used by participants in a video conference to convert the audio and video signals from the camera and microphone into data transmittable to another end point device or to the multi-point control unit.
- the end point equipment is also used to convert the transmitted audio and video signals, received at one end point from another end point or from the multi-point control unit, into signals usable by audio and video display devices connected to that end point to play the video and audio signals to the participants.
- the multi-point control unit is a conference bridge that connects the various end points of a single video conference session together when more than two end point devices are to be involved in the video conference.
- two end point devices can be connected directly to each other.
- most video conference sessions, even when only two participants are involved, are routed through a multi-point control device.
- one multi-point control unit can be used for a number of video conference sessions, where each session has two or more participants. In operation, each of the end points contact the multi-point control unit.
- After data sufficient for the multi-point control unit to authenticate a participant's authorization to participate in a video conference is provided to that multi-point control unit, the multi-point control unit connects that participant's end point device to the one or more other end point devices, so that the user of that end point device can participate in that video conference session.
- the multi-point control unit, or a video conference administrator or coordinator, confirms a video conference participant's authorization to participate in the video conference based on the participant supplying a predefined password, or the like.
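- The admission sequence above can be pictured with a short, hedged sketch. The MultipointControlUnit class, its join method, and the password store below are hypothetical names used only for illustration; they are not elements of the patent.

```python
# Illustrative sketch of the admission flow: an end point contacts the
# multi-point control unit, supplies a password, and is bridged into the
# session only if the password matches. All names are hypothetical.

class MultipointControlUnit:
    def __init__(self, conference_passwords):
        self._passwords = conference_passwords    # conference id -> expected password
        self.sessions = {}                        # conference id -> connected end points

    def join(self, conference_id, endpoint_id, password):
        """Connect an end point to a conference session if it is authorized."""
        if self._passwords.get(conference_id) != password:
            return False                          # authorization failed; not connected
        self.sessions.setdefault(conference_id, []).append(endpoint_id)
        return True


mcu = MultipointControlUnit({"weekly-review": "s3cret"})
assert mcu.join("weekly-review", "endpoint-A", "s3cret")
assert not mcu.join("weekly-review", "endpoint-B", "wrong-password")
```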
- the H.320 standard is the standard for ISDN video conferencing.
- the H.323 standard extends the H.320 ISDN video conferencing standards to a standard usable for Internet protocol (IP)-based distributed networks.
- The Session Initiation Protocol (SIP) is a third video conferencing protocol.
- Video conference equipment that uses the H.323 standard uses the standard Internet protocol (IP) handshake and messaging protocols and the data and packet formats that would be used on a standard Internet protocol (IP)-based distributed network, such as the Internet, many wide area networks and local area networks, intranets, extranets, and other distributed networks.
- Porting the audio and video data streams of a video conference to a distributed network, such as the Internet, for distribution as a multimedia data stream is known.
- Conventional video conference broadcasters re-encode the audio and video portions of the video conference through one of two techniques.
- One technique includes capturing the video portion of the video conference separately, by accessing the analog auxiliary audio and video outputs on one of the video conference end point devices that are being used to participate in a particular video conference session. This first technique is illustrated in FIG. 1.
- a second technique uses an entirely different type of video conference equipment, which allows three or more participants to participate in a video conference without needing a multi-point control unit.
- This system connects the end point equipment of the various participants in a peer-to-peer style network, where each end point receives the video and audio data streams directly from each of the other end points.
- This is described as multicasting the video conference across the network.
- an IPTV client, which is a software application available from Cisco Systems, can be connected to the network to view the data packets of a video conference session as the data packets are passed back and forth between the actual end points participating in the video conference.
- the IPTV client sits in the background and monitors all of the packets that are transmitted between the end points of the video conference session.
- One advantage of this second system over the first technique is that the audio and video data streams stay in digital form.
- However, the IPTV client merely listens to the multicast IP addresses, so there is no centralized streaming server able to output a unicast multimedia data stream to a client.
- Rather, the IPTV client creates a multicast.
- multicasts generally cannot be received by conventional local area or wide area networks other than the one on which the video conference originated.
- this peer-to-peer system can only be used within a multi-cast capable network, such as a single local or wide area network.
- the IPTV client can only make the video conference data available to another IPTV client that is also on a multi-cast capable network.
- the system shown in FIG. 1 accesses the digital video and audio streams of the video conference output by a video conference end point device 60 through the analog output streams output by a video conference standard client 70 . These analog output streams are also used to drive the audio and visual display devices used by the actual video conference participants.
- the system shown in FIG. 1 reconverts the audio streams back into digital data streams. As a result, the system shown in FIG. 1 can significantly degrade or otherwise distort the video and audio data.
- the video and audio data, which is originally in digital format, must be converted to analog format and then reconverted to digital format. As a result, there is a significant delay between receiving the digital video and audio streams at the video conference end point device 60 and transmitting the re-encoded digital video and audio streams; this latency can be as long as 40 seconds.
- the system shown in FIG. 1 requires a physical connection between the video conference standard client 70 and a video capture encoding device 80 to transmit analog signals 72 and 74 .
- each video capture encoding device 80 can be connected to at most one client 70 .
- This invention provides systems and methods that allow video conference data streams to be transmitted between the video conference participants and clients on a distributed network.
- This invention separately provides systems and methods that allow the transmitted data streams to remain in digital form as the data streams are transmitted between the video conference participants and the clients on the distributed network.
- This invention separately provides systems and methods that allow the transmitted data streams to be transmitted as a unicast on the distributed network.
- This invention separately provides systems and methods that use a pseudo-end point to receive audio and video data streams transmitted between the end point devices actually participating in the video conference session.
- This invention further provides systems and methods that transmit the audio and video data streams from the pseudo-end point device to clients on a distributed network.
- This invention further provides systems and methods that transmit the audio and video data streams from the pseudo-end point device to clients on a distributed network entirely as digital data.
- This invention further provides systems and methods that recode the digital audio and video data streams received by the pseudo-end point device, while the audio and video data streams remain in digital format.
- This invention separately provides systems and methods that use a pseudo-end point such that the access to the video conference data is controlled in the same way that access is controlled for an actual video conference participant.
- This invention separately provides systems and methods that use a pseudo-end point device to inject audio and video data streams stored on the distributed network into the video conference session.
- a pseudo-end point device can be connected to a multi-point control unit managing a particular video conference session in the same way as the end point device of an actual participant to the video conference session is connected to that multi-point control unit.
- the pseudo-end point device receives the digital video conference data packets in the same way that the end point devices of the actual participants receive the digital video conference data packets.
- that pseudo-end point device is connected to a video conference standard module.
- the video conference standard module can be connected to a plurality of different pseudo-end point devices, each acting as a pseudo-participant in a different video conference session.
- the video conference standard module transmits the received audio and video data packets as a multicast to one or more unicast servers, and zero, one or more multicast clients.
- the unicast servers include servers able to output unicast multimedia data streams using any known or later-developed protocol or software package, such as, for example, the Microsoft® Windows® Media protocol (Windows® MMS), the Apple® QuickTime® protocol, the Real Networks® Real® protocol, the Internet Engineering Task Force (IETF) Real Time Streaming Protocol (RTSP), or the like.
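- The data path just summarized can be sketched in a few lines, assuming a simple in-memory queue per server; the class and method names below are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch: the video conference standard module receives already-digital
# audio/video packets from a pseudo-end point and fans them out to one or more
# unicast servers and zero or more multicast clients, with no analog conversion.

class StubUnicastServer:
    """Stand-in for the front end of an RTSP or MMS unicast server."""
    def __init__(self):
        self.queue = []

    def enqueue(self, packet: bytes) -> None:
        self.queue.append(packet)


class VideoConferenceStandardModule:
    def __init__(self, unicast_servers, multicast_clients=()):
        self.unicast_servers = list(unicast_servers)
        self.multicast_clients = list(multicast_clients)

    def on_packet_from_pseudo_endpoint(self, packet: bytes) -> None:
        for server in self.unicast_servers:
            server.enqueue(packet)        # the packet stays in digital form throughout
        for client in self.multicast_clients:
            client.send(packet)


module = VideoConferenceStandardModule(unicast_servers=[StubUnicastServer()])
module.on_packet_from_pseudo_endpoint(b"rtp-packet-bytes")
```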
- FIG. 1 is a block diagram illustrating one exemplary embodiment of a conventional system for porting video conference audio and video data streams to a distributed network
- FIG. 2 is a block diagram of a first exemplary embodiment of a system for connecting a video conference end point to a distributed network
- FIG. 3 is a second exemplary embodiment of a system for connecting an end point of a video conference to a distributed network according to this invention
- FIG. 4 is a third exemplary embodiment of a system for connecting a video conference end point to a distributed network according to this invention.
- FIG. 5 is a fourth exemplary embodiment of a system for connecting a video conference end point to a distributed network according to this invention.
- FIG. 6 is a flowchart outlining a first exemplary embodiment of a method for distributing the video and audio content of a video conference over a distributed network according to this invention.
- FIG. 7 is a flowchart outlining a second exemplary embodiment of the method for distributing the video and audio content of a video conference over a distributed network according to this invention.
- the video conferencing systems and methods according to this invention allow video conferencing systems and Internet-based media streaming systems to converge.
- the systems and methods according to this invention allow the audio and video content of a video conference to be distributed as a multimedia data stream over a distributed network, such as the Internet.
- the various exemplary embodiments of the systems and methods according to this invention allow a network administrator or video conference coordinator to broadcast a live video conference using standard video streaming techniques and protocols for distributing video streams over distributed networks. This makes use of existing distributed network infrastructures while reducing initial purchase costs, maintenance requirements and installation costs.
- As used herein, the term “video conference standard” encompasses the H.323 standard, the SIP standard, the H.320 standard, and any other known or later-developed standard that provides for the concept of a video conference call.
- Such standards will usually provide for one or more of some form of call routing, some form of call signaling and alerting, some form of negotiation regarding the capabilities of the video conference end points and the parameters to be used during the video conference, and some form of resource releasing of the resources allocated to the video conference.
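- As a rough illustration of those call phases, the sketch below models them as a simple ordered enumeration; the phase names and the CallSession class are assumptions for illustration and are not definitions taken from H.323, SIP, or H.320.

```python
# Hedged sketch of the call lifecycle a video conference standard generally covers.

from enum import Enum, auto

class CallPhase(Enum):
    ROUTING = auto()        # locate the remote end point or multi-point control unit
    SIGNALING = auto()      # call signaling and alerting
    NEGOTIATION = auto()    # agree end point capabilities and conference parameters
    IN_CONFERENCE = auto()  # media flows between the participants
    RELEASED = auto()       # resources allocated to the video conference are freed

class CallSession:
    def __init__(self):
        self.phase = CallPhase.ROUTING

    def advance(self) -> None:
        """Move to the next phase, stopping once resources are released."""
        order = list(CallPhase)
        index = order.index(self.phase)
        self.phase = order[min(index + 1, len(order) - 1)]

session = CallSession()
while session.phase is not CallPhase.RELEASED:
    session.advance()
```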
- a video conference end point device 60 implementing a video conference standard outputs three data streams 62 - 66 to a video conference standard client 70 of the conventional video conference broadcasting system.
- the three streams of data 62-66 include a video conference standard messaging stream 62, a digital video stream 64 and a digital audio stream 66. Each of the data streams 62-66 is bi-directional between the video conference end point device 60 and the client 70.
- Each of the digital video conference standard messaging stream 62, the digital video stream 64 and the digital audio stream 66 is transmitted between the video conference end point device 60 and the client 70 using an Internet protocol (IP) packet transport method. The digital video stream 64 and the digital audio stream 66 are transmitted using the Internet Engineering Task Force (IETF) Real Time Protocol (RTP).
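- Because the digital video and audio streams ride on IETF RTP over IP, they can be inspected at the packet level. The following sketch unpacks the fixed 12-byte RTP header defined by RFC 3550 so fields such as the payload type and timestamp can be read; it is a minimal illustration and not part of the patent.

```python
# Minimal sketch of parsing the fixed 12-byte RTP header (RFC 3550).

import struct

def parse_rtp_header(packet: bytes) -> dict:
    if len(packet) < 12:
        raise ValueError("RTP packet shorter than the 12-byte fixed header")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # should be 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # distinguishes, e.g., audio from video codecs
        "sequence_number": seq,
        "timestamp": timestamp,
        "ssrc": ssrc,
    }

# Example: a synthetic header (version 2, payload type 96, sequence number 1).
header = parse_rtp_header(bytes([0x80, 0x60, 0x00, 0x01]) + b"\x00" * 8)
assert header["version"] == 2 and header["payload_type"] == 96
```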
- the video conference standard client 70 converts the digital video stream 64 into an analog composite video signal 72 .
- the video conference standard client 70 also converts the digital audio signal 66 into an analog line-level audio signal 74. Both analog signals 72 and 74 are output to a video capture and encoding device 80.
- the analog composite video signal 72 and the analog line-level audio signal 74 are unidirectional signals from the video conference standard client 70 to the video capture and encoding device 80 .
- the video capture and encoding device 80 captures the analog video frames within the analog video signal 72 and digitizes the analog audio signal 74 .
- the video capture and encoding device 80 then generates, from the captured analog video frames and the digitized audio signal, digital video signals and audio signals and encodes the digital video and audio signals as video and audio streams, or a combined audio/video data stream, for transmission over a distributed network, such as the Internet.
- the video capture and encoding device 80 encodes and packetizes the digitized audio and video data using different formats, depending on the particular streaming software to be used. For example, Microsoft and Real Networks use proprietary, closed-system encoding and transmission protocols.
- Apple has developed the open system named “QuickTime”, while the Internet Engineering Task Force (IETF) has developed the Real Time Streaming Protocol (RTSP).
- Any of these open-system or closed-system encoding and packetizing methods can be used by the video capture and encoding device 80 to convert the analog data received from the video conference standard client 70 into digital data suitable for transmission over a distributed network.
- the video capture and encoding device 80 then outputs the digitized and packetized video and audio data streams 82 to a streaming media server 84 .
- the streaming media server 84 can, for example, output the digitized and packetized audio and video data as a unicast audio/video data stream 86 using the Microsoft® Windows® Media Protocol (Windows® MMS) or the Real Time Streaming Protocol (RTSP).
- the output video/audio stream 86 can then be received by any number of clients 300 connected to the distributed network over which the audio/video stream 86 is distributed.
- FIG. 2 is a block diagram illustrating a first exemplary embodiment of a video conference access system 100 usable to connect a video conference session to a distributed network according to this invention.
- the video conference access system 100 includes a video conference standard module 110 connected to a plurality of video-conference-standard video conference end point devices 190 and one or more of an MMS (or other proprietary system) server 120 , an RTSP server 130 and a web server 140 .
- the MMS (or other proprietary system) server 120 is connected over a messaging channel 122 and outputs audio/video streams 124 to one or more MMS (or other proprietary system) clients 210 .
- the RTSP server 130 is connected over an RTSP messaging channel 132 , and outputs video streams 134 and audio streams 136 to one or more RTSP clients 220 .
- a web server 140 is connected over a link 141 to an administrator client 230 , which is also connected over a link 143 to a serial console 142 .
- the administrator client 230 and the other clients 210 and 220 are not necessarily part of the video conference access system 100, while the web server 140 and the serial console 142 are generally part of the video conference access system 100.
- one or both of the web server 140 and the serial console 142 can be omitted from the video conference access system 100 .
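- One way to picture the FIG. 2 arrangement in software is as a simple composition of the components. The dataclass below is only an illustrative assumption about how the pieces might be named and wired together; it is not the patent's implementation.

```python
# Hedged sketch of the FIG. 2 components as a configuration object. The
# reference numerals in the comments follow the figure; the field names are invented.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VideoConferenceAccessSystem:
    standard_module: str                                        # video conference standard module 110
    pseudo_endpoints: List[str] = field(default_factory=list)   # end point devices 190
    mms_server: Optional[str] = None                            # MMS (or other proprietary) server 120
    rtsp_server: Optional[str] = None                           # RTSP server 130
    web_server: Optional[str] = None                            # web server 140 (may be omitted)
    serial_console: Optional[str] = None                        # serial console 142 (may be omitted)

system = VideoConferenceAccessSystem(
    standard_module="module-110",
    pseudo_endpoints=["endpoint-190a", "endpoint-190b"],
    rtsp_server="rtsp-130",
    web_server="web-140",
)
```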
- Each of the video-conference-standard video conference end point devices 190 outputs three data streams to the video conference standard module 110 .
- These data streams include a bi-directional digital video conference standard messaging stream 192 , a unidirectional digital video stream 194 and a unidirectional digital audio stream 196 .
- each of the video and audio streams is encoded using the Real Time Protocol (RTP).
- Each of the streams 192-196 is transmitted between the video-conference-standard video conference end point device 190 and the video conference standard module 110 using an Internet protocol (IP) packet transport technique.
- Each of the provided servers 120 and 130 receives unidirectional digital video streams 112 and unidirectional audio streams 114 from the video conference standard module 110. Each of these data streams 112 and 114 is transmitted using an internal digital transport technique.
- the video conference standard module 110 communicates with the web server 140 using a bidirectional digital messaging stream 116 . This digital messaging stream 116 is also transmitted using the internal digital transport method discussed above with respect to the data streams 112 and 114 .
- the bi-directional digital messaging stream 116 uses a proprietary protocol.
- each of the channels and streams disclosed herein as unidirectional can be replaced with one or more bi-directional channels or streams.
- each of the channels disclosed herein as bidirectional can be replaced with one or more unidirectional channels or streams.
- each unidirectional channel or stream can be implemented as two or more unidirectional channels or streams, and each bi-directional channel or stream can be implemented as two or more bidirectional channels or streams.
- FIG. 3 is a block diagram illustrating a second exemplary embodiment of the video conference access system 100 usable to connect a video conference session to a distributed network according to this invention.
- the second exemplary embodiment shown in FIG. 3 is generally the same as the first exemplary embodiment shown in FIG. 2.
- a transcoder 150 has been inserted between the video conference standard module 110 and the MMS server 120 and the RTSP server 130 .
- the transcoder 150 converts the audio and video data streams 112 and 114 received from the video conference standard module 110 from the form output by the video conference standard module 110 to one or more different video and audio streams 152 and 154 usable by various ones of the clients 210 and/or 220 .
- the transcoder 150 decompresses or decodes the video and audio streams 112 and 114 output from the video conference standard module 110 and recompresses or re-encodes them into one or more different forms as the separate video and audio streams 152 and 154, and 156 and 158.
- Each of these different streams 152 - 158 can use a different video or audio compression or encoding technique and/or use a different bit rate. Additionally, one or more of these different streams 152 - 158 can use the same video and audio compression or encoding techniques and bit rate as the corresponding video and/or audio streams 112 and 114 .
- Each of these different forms of the transcoded video and audio streams 152 - 158 are output to one or both of the MMS server 120 and/or the RTSP server 130 .
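- The transcoding step can be sketched as one decode followed by several re-encodes at different bit rates. The use of the ffmpeg command line, the output container, and the bit-rate values below are illustrative assumptions; the patent does not prescribe a particular tool or format.

```python
# Hedged sketch of producing several re-encoded variants of the conference streams.

import subprocess

VARIANTS = [
    {"name": "low",  "video_bitrate": "128k", "audio_bitrate": "32k"},
    {"name": "high", "video_bitrate": "512k", "audio_bitrate": "64k"},
]

def transcode(source_url: str, variants=VARIANTS) -> None:
    for v in variants:
        cmd = [
            "ffmpeg", "-i", source_url,
            "-b:v", v["video_bitrate"],      # video bit rate for this variant
            "-b:a", v["audio_bitrate"],      # audio bit rate for this variant
            f"conference_{v['name']}.mp4",
        ]
        subprocess.run(cmd, check=True)      # one re-encoded stream set per variant
```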
- Each of the different forms of the audio and video streams 152 - 158 provided to the MMS server 120 and the RTSP server 130 can be accessed by the clients by transmitting a unique identifier, such as a specific uniform resource locator (URL), to one of the servers 120 or 130 .
- a user would transmit a specific identifier associated with that particular set to one of the MMS server 120 or the RTSP server 130 .
- the MMS server 120 or the RTSP server 130 would unicast that particular set of video and audio streams 152-158 to that user.
- the specific identifier and the particular set of video and audio streams 152-158 that identifier is associated with are displayed to the user on a web page that is associated with the particular video conference the user wishes to view.
- the user transmits the specific identifier to the MMS server 120 or the RTSP server 130 by selecting and activating an associated hyperlink.
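- A hedged sketch of how such a web page might pair each stream variant with a unique identifier (here an RTSP URL) follows; the host name, path scheme, and variant labels are invented for illustration only.

```python
# Illustrative mapping of stream variants to unique URLs, rendered as hyperlinks.

STREAM_VARIANTS = {
    "low-bandwidth": "rtsp://rtsp-server.example.com/conference42/low",
    "high-quality":  "rtsp://rtsp-server.example.com/conference42/high",
}

def conference_page_html(variants=STREAM_VARIANTS) -> str:
    links = "\n".join(
        f'  <li><a href="{url}">{label}</a></li>' for label, url in variants.items()
    )
    return f"<ul>\n{links}\n</ul>"

print(conference_page_html())
```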
- FIG. 4 is a block diagram illustrating a third exemplary embodiment of the video conference access system 100 usable to connect a video conference session to a distributed network according to this invention.
- the third exemplary embodiment shown in FIG. 4 is generally the same as the first exemplary embodiment shown in FIG. 2.
- a record module 160 and one or more storage devices 170 have been connected to the video conference standard module 110.
- the record module 160 allows the video and audio streams 112 and 114 to be recorded.
- the video and audio streams 112 and 114 can be played back to a client after the video conference has begun, and even after the video conference has ended.
- a portion of the video and audio streams 112 and 114 stored in one or more of the one or more storage devices 170 can be read and played back by the record module 160 to the video conference standard module 110 and through the video conference standard module 110 to the video-conference-standard video conference devices 190 .
- a previous portion of the video conference can be played back to the participants in the video conference. This could be useful if there was a dispute over what had previously occurred during the video conference, or if a participant was absent during a particular portion of the video conference.
- the record module 160 and the one or more storage devices 170 can receive and store other electronic data uploaded by one of the clients 210 or 220 through the MMS server 120 or the RTSP server 130, respectively, to the video conference standard module 110. Then, like a recorded portion of the video conference, this uploaded electronic data can be transmitted by the record module 160 to the video conference standard module 110 and through the video conference standard module 110 to the video conference standard video conference devices 190. In this way, the uploaded electronic data can be displayed to the participants in the video conference.
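- The record and playback behavior can be sketched as below, assuming the standard module exposes a method for re-injecting packets into the live conference; the class, method names, and list-based storage are illustrative assumptions, not the patent's design.

```python
# Hedged sketch of the record module 160: it stores the digital streams (and any
# uploaded electronic data) and can later play a stored portion back into the
# conference through the video conference standard module.

class RecordModule:
    def __init__(self, storage=None):
        self.storage = storage if storage is not None else []   # stands in for device(s) 170

    def record(self, packet: bytes) -> None:
        self.storage.append(packet)

    def store_upload(self, data: bytes) -> None:
        # Electronic data uploaded by a client is stored the same way.
        self.storage.append(data)

    def play_back(self, standard_module, start=0, end=None) -> None:
        # Stored packets are re-injected via the standard module, which forwards
        # them on to the participating end point devices.
        for packet in self.storage[start:end]:
            standard_module.distribute(packet)
```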
- the one or more storage devices 170 can include one or more locally located physical storage devices, such as a hard disk, RAM, flash memory, a writeable or re-writeable optical disk, or any other known or later-developed locally located storage device, that is locally implemented, for example, as part of the video conference standard module 110 and/or the record module 160 .
- the one or more storage devices 170 can include one or more remotely located storage devices, such as a storage server, or any other known or later-developed remotely located storage device that is accessed by the record module 160 over a distributed network.
- the one or more storage devices 170 can include both one or more locally-located storage devices, and one or more remotely-located storage devices.
- FIG. 5 is a block diagram illustrating a fourth exemplary embodiment of the video conference access system 100 usable to connect a video conference session to a distributed network according to this invention.
- the fourth exemplary embodiment shown in FIG. 5 is generally the same as the first exemplary embodiment shown in FIG. 2.
- in this embodiment, the transcoder 150, described above with respect to FIG. 3, has been inserted between the video conference standard module 110 and the MMS server 120 and the RTSP server 130, and the record module 160 and the one or more storage devices 170, described above with respect to FIG. 4, have been connected to the video conference standard module 110.
- the conventional system shown in FIG. 1 piggybacks on the video conference end point device 60 used by one of the video conference participants. That is, the video conference end point device 60 is the end point of one of the video conference participants.
- the video conference standard client 70 is thus used by the video conference participants to convert the digital audio and video streams into analog format, so that the video conference video and audio streams can be presented to the video conference participants.
- the video capture and encoding device 80 piggybacks on these analog signals and reconverts them back into digital format.
- the video-conference-standard video conference end point device 190 of the video conference access system 100 is not the video conference device used by one of the actual participants to the video conference. Rather, the video-conference-standard video conference end point device 190 of the video conference access system 100 according to this invention separately interacts with the multi-point control unit of a particular video conference in the same way that the video conference end point devices 60 of the actual participants interact with the multi-point control unit.
- the video-conference-standard video conference end point device 190 is not an active participant in that particular video conference session, and does not actively transmit video and audio data to the multi-point control unit as is done by the video conference end point devices 60 of the active participants.
- the video-conference-standard video conference end point device 190 acts as a “pseudo-participant” within that particular video conference session.
- the video-conference-standard video conference device 190 can be located anywhere relative to the other video conference participants.
- the video conference standard module 110 unlike the video capture and encoding device 80 , is not limited to being located in the same room, or even the same physical structure, as the video conference equipment of one of the participants to the video conference.
- the video conference access system 100 acts as a video-conference-standard video conferencing network appliance.
- the video conference access system 100 can work with any Internet protocol (IP)-based video conference standard network, or even, via an ISDN to video conference standard gateway, with H.320 video conferencing systems.
- IP Internet protocol
- the video conference access system 100 connects with other video-conference-standard video conferencing equipment like any other end point device. This allows an end point device 60 to connect to one of the video-conference-standard video conference end point devices 190 directly, or for one of the video-conference-standard video conference end point devices 190 to connect to a multi-point conference through the multi-point control unit 70 .
- the video conference standard module 110 of the video conference access system 100 takes advantage of existing encoded video and audio data that is already being transmitted between the participants of the particular video conference session.
- the video-conference-standard video conference end point device 190 acts as a “pseudo-participant” to repackage the existing encoded video data for playback by conventional streaming media players, such as the Microsoft® Windows® Media Player®, the Apple® QuickTime® player, the Real Networks® Real® player, or the like.
- the video conference standard module 110 takes advantage of the high-quality video compression hardware present in the video-conference-standard video conference end point device 190 .
- because the video and audio data remain in digital format from the time the video and audio streams are received by the video-conference-standard conference end point device 190 until the video and audio streams are transmitted to the clients 210 and 220, the video conference access system 100 introduces little to no latency, such as that caused by the software digitizing and encoding used in the conventional system shown in FIG. 1.
- the experience of the users of the clients 210 and 220 is enhanced relative to the experience of the users of the clients 300 that access the system shown in FIG. 1.
- FIG. 6 is a flowchart outlining a first exemplary embodiment of a method for distributing the audio and video content of a video conference as a multimedia data stream over a distributed network according to this invention.
- Beginning in step S100, operation continues to step S200, where a video conference to be distributed as a multimedia data stream over a distributed network is established between two or more video conference end point devices, if a peer-to-peer system is used, or between two or more video conference end point devices and a multi-point control unit.
- in step S300, a video conference pseudo-participant end point unit according to this invention is connected to the established video conference.
- in step S400, the digital video and audio streams of the video conference are supplied from the pseudo-participant end point unit to a streaming module. Operation then continues to step S500.
- in step S500, the digital video and audio streams supplied to the streaming module are resupplied to one or more streaming servers that have one or more different protocols.
- These servers include, but are not limited to, servers able to output unicast multimedia data streams using the Microsoft® Windows® Media protocol (Windows® MMS), the Apple® QuickTime® protocol, the Real Networks® Real® protocol, the Internet Engineering Task Force (IETF) Real Time Streaming Protocol (RTSP), or any other known or later-developed protocol.
- in step S600, each of the streaming servers converts the digital video and audio streams supplied to that particular streaming server into the corresponding protocol implemented by that streaming server.
- in step S700, each different streaming server supplies the converted digital audio and video streams, now in the protocol corresponding to that particular streaming server, to one or more corresponding clients. Operation then continues to step S800.
- in step S800, a determination is made whether the digital video and audio streams should continue to be captured from the video conference and supplied through the pseudo-participant end point and the streaming module to the streaming servers. If so, operation jumps back to step S400. Otherwise, operation continues to step S900, where the method ends.
- in step S500, supplying the digital video and audio streams from the streaming module to the one or more streaming servers can comprise supplying the particular digital video and audio streams to a particular streaming server at different audio and/or video compression rates and/or using different audio and/or video compression and/or encoding techniques.
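- Read as code, the FIG. 6 flow reduces to a capture-and-distribute loop. The sketch below uses placeholder callables for each step; none of the function or method names come from the patent, and the step mapping in the comments is illustrative.

```python
# Hedged sketch of the FIG. 6 loop (steps S200 through S900).

def distribute_conference(establish, connect_pseudo_endpoint, get_streams,
                          streaming_module, streaming_servers, keep_capturing):
    conference = establish()                          # S200: establish the conference
    pseudo = connect_pseudo_endpoint(conference)      # S300: attach the pseudo-participant
    while True:
        streams = get_streams(pseudo)                 # S400: pull the digital streams
        streaming_module.supply(streams)              # S500: hand them to the streaming module
        for server in streaming_servers:
            converted = server.convert(streams)       # S600: convert to that server's protocol
            server.send_to_clients(converted)         # S700: unicast to the clients
        if not keep_capturing():                      # S800: keep capturing?
            break                                     # S900: end
```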
- FIG. 7 is a flowchart outlining a second exemplary embodiment of a method for distributing the audio and video content of a video conference as a multimedia data stream over a distributed network according to this invention.
- operation continues to step S1100, where a video conference is established.
- in step S1200, a pseudo-participant end point is connected to the established video conference.
- in step S1300, the digital video and audio streams from the pseudo-participant end point are supplied to the streaming module. Operation then continues to step S1400.
- in step S1400, the digital video and audio streams from the streaming module are supplied to one or more streaming servers having one or more different protocols, as well as to a storage device that stores the digital video and audio streams.
- in step S1500, the digital video and audio streams received at each different streaming server are converted to the protocol corresponding to that streaming server.
- in step S1600, the converted digital audio and video streams are supplied, from each different streaming server, in the various protocols corresponding to the different streaming servers, to one or more corresponding clients. Operation then continues to step S1700.
- in step S1700, a determination is made whether the video conference continues to supply the video and audio data streams to the streaming module, and thence to the different streaming servers. If so, operation continues to step S1800. Otherwise, operation jumps to step S2000.
- in step S1800, a determination is made whether or not to play back any of the portions of the video and audio streams of this video conference that have been stored in the storage device in step S1400, or to play back any other data that may have been uploaded and/or stored in the storage device. If so, operation continues to step S1900. Otherwise, operation jumps back to step S1300.
- in step S1900, the stored digital video and/or audio streams and/or the uploaded video and/or audio data stored in the storage device are played back into the current video conference. Operation then again jumps back to step S1300. In contrast, in step S2000, the operation of the method ends.
- the various software and hardware elements are supported by a Linux kernel that provides the network resources.
- the small operating system footprint and versatile network stack provided by the Linux kernel work exceptionally well with the video conference standard stack.
- the video conference standard module 110 is able to seamlessly connect the video conference audio and video digital streams to Internet protocol (IP)-based networks.
- IP Internet protocol
- Linux has been proven, in a significant number of embedded devices, to be an extremely functional real time operating system, while still providing necessary system resources.
- the high performance of Linux in a small specialized device provides the ability to ensure that the video conference access system 100 will be able to meet both present and future streaming media requirements in a fully scalable fashion.
- the administrator client 230 allows an administrator to grant or deny permission to a user to view a broadcast. This allows the IT manager or a video conference coordinator to maintain full control over the distribution of proprietary and/or confidential information, while still allowing the transition from conventional media distribution to modern Internet-based content delivery technologies.
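- The grant/deny control described above amounts to a per-conference viewer list consulted before a stream is served. The sketch below is a minimal, hedged version with invented names for the user store and functions.

```python
# Illustrative per-conference viewing permissions, managed by an administrator.

ALLOWED_VIEWERS = {"conference42": {"alice", "bob"}}

def grant(conference_id: str, user: str) -> None:
    ALLOWED_VIEWERS.setdefault(conference_id, set()).add(user)

def deny(conference_id: str, user: str) -> None:
    ALLOWED_VIEWERS.get(conference_id, set()).discard(user)

def may_view(conference_id: str, user: str) -> bool:
    return user in ALLOWED_VIEWERS.get(conference_id, set())

grant("conference42", "carol")
assert may_view("conference42", "carol") and not may_view("conference42", "mallory")
```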
- the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 of the various exemplary embodiments of the video conference access system 100 are, in various exemplary embodiments, implemented on one or more programmed general purpose computers.
- the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 of the various exemplary embodiments of the video conference access system 100 can also be implemented on one or more special purpose computers, one or more programmed microprocessors or microcontrollers and peripheral integrated circuit elements, one or more ASICs or other integrated circuits, one or more digital signal processors, one or more hardwired electronic or logic circuits such as a discrete element circuit, a programmable logic device such as a PLD, PLA, FPGA or PAL, or the like.
- any device capable of implementing a finite state machine that is in turn capable of implementing the flowcharts shown in FIGS. 6 and 7, can be used to implement the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 of the various exemplary embodiments of the video conference access system 100 .
- each of the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 shown in FIGS. 2 - 5 can be implemented as portions of a suitably programmed general purpose computer.
- each of the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients 142, 210 and/or 220 shown in FIGS. 2-5 can be implemented as physically distinct hardware circuits within an ASIC, or using an FPGA, a PLD, a PLA or a PAL, or using discrete logic elements or discrete circuit elements.
- The particular form each of the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients 142, 210 and/or 220 shown in FIGS. 2-5 will take is a design choice and will be obvious and predictable to those skilled in the art.
- the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 can be implemented as software executing on a programmed general purpose computer, a special purpose computer, a microprocessor or the like.
- the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 can be implemented as a resource residing on a server or the like.
- the video conference standard module 110 , the transcoder 150 , the record module 160 and/or the clients 142 , 210 and/or 220 can also be implemented by physically incorporating them into a software and/or hardware system.
- the storage devices 170 can be implemented using any appropriate combination of alterable, volatile or non-volatile memory or non-alterable, or fixed, memory.
- the alterable memory, whether volatile or non-volatile, can be implemented using any one or more of static or dynamic RAM, a floppy disk and disk drive, a writable or re-writeable optical disk and disk drive, a hard drive, flash memory or the like.
- the non-alterable or fixed memory can be implemented using any one or more of ROM, PROM, EPROM, EEPROM, an optical ROM disk, such as a CD-ROM or DVD-ROM disk, and disk drive or the like.
Abstract
Description
- 1. Field of Invention
- This invention relates to systems and methods that allow data to be transmitted between a video conference system and a distributed network.
- 2. Description of Related Art
- Conventional video conferencing equipment is generally divided into a number of categories. These categories include data stream display and capture equipment, such as cameras, microphones, televisions and speakers, end point equipment that connects a particular video conference participant to another end point device or to a multi-point control unit, and the multi-point control unit, which allows three or more end point devices to participate in a single video conference session.
- The end point equipment is used by participants in a video conference to convert the audio and video signals from the camera and microphone into data transmittable to another other end point device or the multi-point control unit. The end point equipment is also used to convert the transmitted audio and video signals, received at one end point from another end point or from the multi-point control unit, into signals usable by audio and video display devices connected to that end point to play the video and audio signals to the participants.
- The multi-point control unit is a conference bridge that connects the various end points of a single video conference session together when more than two end point devices are to be involved in the video conference. In general, two end point devices can be connected directly to each other. In practice, most video conference sessions, even when only two participants are involved, are routed through a multi-point control device. In general, one multi-point control unit can be used for a number of video conference sessions, where each session has two or more participants. In operation, each of the end points contact the multi-point control unit.
- After data sufficient for the multi-point control unit to authenticate the participants' authorization to participate in a video conference is provided to that multi-point control unit, the multi-point control unit connects that end point device to the one or more other end point devices, so that the user of that end point device can participate in that video conference session. The multi-point control unit, or a video conference administrator or coordinator, confirms a video conference participants' authorization to participate in the video conference based on the video conference participant supplying a predefined password, or the like.
- The H.320 standard is the standard for ISDN video conferencing. The H.323 standard extends the H.320 ISDN video conferencing standards to a standard usable for Internet protocol (IP)-based distributed networks. The Session Initiation Protocol (SIP) is a third video conferencing protocol. Video conference equipment, which uses the H.323 standard, uses standard Internet protocol (IP) handshake and messaging protocols and data and packet formats that would be used on a standard Internet protocol (IP)-based distributed network, such as the Internet, many wide area networks and local area networks, intranets, extranets, and other distributed networks.
- Porting the audio and video data streams of a video conference to a distributed network, such as the Internet, for distribution as a multimedia data stream is known. Conventional video conference broadcasters re-encode the audio and video portions of the video conference through one of two techniques. One technique includes capturing the video portion of the video conference separately, by accessing the analog auxiliary audio and video outputs on one of the video conference end point devices that are being used to participate in a particular video conference session. This first technique is illustrated in FIG. 1.
- A second technique uses an entirely different type of video conference equipment, which allows three or more participants to participate in a video conference without needing a multi-point control unit. This system connects the end point equipment of the various participants in a peer-to-peer style network, where each end point receives the video and audio data streams directly from each of the other end points. This is described as multi-tasking the video conference across the network. In this case, an IPTV client, which is a software application available from Cisco Systems, can be connected to the network to view the data packets of a video conference session as the data packets are passed back and forth between the actual end points participating in the video conference. The IPTV client sits in the background and monitors all of the packets that are transmitted between the end points of the video conference session.
- One advantage of this second system over the first technique is that the audio and video data streams stay in digital form. However, the IPTV client merely listens to the multi-task IP addresses. Thus, there is no centralized streaming server that is able to output a unicast multimedia data stream to a client. Rather, the IPTV client creates a multicast. However, multicasts generally cannot be received by most conventional local area or wide area networks that the video conference has not originated on. Thus, this peer-to-peer system can only be used within a multi-cast capable network, such as a single local or wide area network. As a result, the IPTV client can only make the video conference data available to another IPTV client that is also on a multi-cast capable network.
- The system shown in FIG. 1 accesses the digital video and audio streams of the video conference output by a video conference
end point device 60 through the analog output streams output by a video conferencestandard client 70. These analog output streams are also used to drive the audio and visual display devices used by the actual video conference participants. The system shown in FIG. 1 reconverts the audio streams back into digital data streams. As a result, the system shown in FIG. 1 can significantly degrade or otherwise distort the video and audio data. - Additionally, the video and audio data, which is originally in digital format, must be converted to analog format and then reconverted to digital format. As a result, there is a significant delay between receiving the digital video and audio streams at the video conference
end point device 60 and transmitting the re-encoded digital video and audio streams. This latency can be as long as 40 seconds. Finally, the system shown in FIG. 1 requires a physical connection between the video conferencestandard client 70 and a videocapture encoding device 80 to transmitanalog signals capture encoding device 80 can be connected to at most oneclient 70. - This invention provides systems and methods that allow video conference data streams to be transmitted between the video conference participants and clients on a distributed network.
- This invention separately provides systems and methods that allow the transmitted data streams to remain in digital form as the data streams are transmitted between the video conference participants and the clients on the distributed network.
- This invention separately provides systems and methods that allow the transmitted data streams to be transmitted as a unicast on the distributed network.
- This invention separately provides systems and methods that use a pseudo-end point to receive audio and video data streams transmitted between the end point devices actually participating in the video conference session.
- This invention further provides systems and methods that transmit the audio and video data streams from the pseudo-end point device to clients on a distributed network.
- This invention further provides systems and methods that transmit the audio and video data streams from the pseudo-end point device to clients on a distributed network entirely as digital data.
- This invention further provides systems and methods that recode the digital audio and video data streams received by the pseudo-end point device, while the audio and video data streams remain in digital format.
- This invention separately provides systems and methods that use a pseudo-end point such that the access to the video conference data is controlled in the same way that access is controlled for an actual video conference participant.
- This invention separately provides systems and methods that use a pseudo-end point device to inject audio and video data streams stored on the distributed network into the video conference session.
- In various exemplary embodiments of the systems and methods according to this invention, a pseudo-end point device can be connected to a multi-point control unit managing a particular video conference session in the same way as the end point device of an actual participant to the video conference session is connected to that multi-point control unit. The pseudo-end point device receives the digital video conference data packets in the same way that the end point devices of the actual participants receive the digital video conference data packets.
- In various exemplary embodiments, that pseudo-end point device is connected to a video conference standard module. The video conference standard module can be connected to a plurality of different pseudo-end point devices, each acting as a pseudo-participant in a different video conference session. In various exemplary embodiments, the video conference standard module transmits the received audio and video data packets as a multicast to one or more unicast servers, and zero, one or more multicast clients.
- In various exemplary embodiments, the unicast servers include servers able to output unicast multimedia data streams using any known or later-developed protocol or software package, such as, for example, the Microsoft® Windows® Media protocol (Windows® MMS), the Apple® QuickTime® protocol, the Real Networks® Real® protocol, the Internet Engineering Task Force (IETF) Real Time Streaming Protocol (RTSP), or the like.
- These and other features and advantages of this invention are described in, or are apparent from, the following detailed description of various exemplary embodiments of the systems and methods according to this invention.
- Various exemplary embodiments of this invention will be described in detail, with reference to the following figures, wherein:
- FIG. 1 is a block diagram illustrating one exemplary embodiment of a conventional system for porting video conference audio and video data streams to a distributed network;
- FIG. 2 is a block diagram of a first exemplary embodiment of a system for connecting a video conference end point to a distributed network;
- FIG. 3 is a second exemplary embodiment of a system for connecting an end point of a video conference to a distributed network according to this invention;
- FIG. 4 is a third exemplary embodiment of a system for connecting a video conference end point to a distributed network according to this invention;
- FIG. 5 is a fourth exemplary embodiment of a system for connecting a video conference end point to a distributed network according to this invention;
- FIG. 6 is a flowchart outlining a first exemplary embodiment of a method for distributing the video and audio content of a video conference over a distributed network according to this invention; and
- FIG. 7 is a flowchart outlining a second exemplary embodiment of the method for distributing the video and audio content of a video conference over a distributed network according to this invention.
- The video conferencing systems and methods according to this invention allow video conferencing systems and Internet-based media streaming systems to converge. In various exemplary embodiments, the systems and methods according to this invention allow the audio and video content of a video conference to be distributed as a multimedia data stream over a distributed network, such as the Internet. In general, the various exemplary embodiments of the systems and methods according to this invention allow a network administrator or video conference coordinator to broadcast a live video conference using standard video streaming techniques and protocols for distributing video streams over distributed networks. This makes use of existing distributed network infrastructures while reducing initial purchase costs, maintenance requirements and installation costs.
- As used herein, the term “video conference standard” encompasses the H.323 standard, the SIP standard, the H.320 standard, and any other known or later-developed standard that provides for the concept of a video conference call. Such standards will usually provide for one or more of some form of call routing, some form of call signaling and alerting, some form of negotiation regarding the capabilities of the video conference end points and the parameters to be used during the video conference, and some form of resource releasing of the resources allocated to the video conference.
- As mentioned above, conventional video conference broadcasters reencode the audio and video portions of the video conference through one of two techniques. As shown in FIG. 1, a video conference
end point device 60 implementing a video conference standard outputs three data streams 62-66 to a videoconference standard client 70 of the conventional video conference broadcasting system. The three streams of data 62-66 include a video conferencestandard messaging stream 62, adigital video stream 64 and adigital audio stream 66. It should be appreciated that each of the video streams 62-66 are bi-directional between the video conferenceend point device 60 and theclient 70. Each of the digital video conferencestandard messaging stream 62, thedigital video stream 64 and thedigital audio stream 66 are transmitted between the video conferenceend point device 60 and theclient 70 using an Internet protocol (IP) packet transport method. It should also be appreciated that thedigital video stream 64 and thedigital audio stream 66 are transmitted between the video conferenceend point device 60 and theclient 70 using the Internet Engineering Task Force (IETF) Real Time Protocol (RTP). - The video
conference standard client 70 converts thedigital video stream 64 into an analogcomposite video signal 72. The videoconference standard client 70 also converts thedigital audio signal 66 into a analog line-level audio signal 74, which are output to a video capture andencoding device 80. It should be appreciated that the analogcomposite video signal 72 and the analog line-level audio signal 74 are unidirectional signals from the videoconference standard client 70 to the video capture andencoding device 80. - The video capture and
- The video capture and encoding device 80 captures the analog video frames within the analog video signal 72 and digitizes the analog audio signal 74. The video capture and encoding device 80 then generates, from the captured analog video frames and the digitized audio signal, digital video and audio signals, and encodes the digital video and audio signals as video and audio streams, or as a combined audio/video data stream, for transmission over a distributed network, such as the Internet. In particular, the video capture and encoding device 80 encodes and packetizes the digitized audio and video data using different formats, depending on the particular streaming software to be used. For example, Microsoft and Real Networks use proprietary, closed-system encoding and transmission protocols.
- In contrast, Apple has developed the open system named “QuickTime”, while the Internet Engineering Task Force (IETF) has developed the Real Time Streaming Protocol (RTSP). Any of these open-system or closed-system encoding and packetizing methods can be used by the video capture and encoding device 80 to convert the analog data received from the video conference standard client 70 into digital data suitable for transmission over a distributed network. The video capture and encoding device 80 then outputs the digitized and packetized video and audio data streams 82 to a streaming media server 84.
- The streaming media server 84 can, for example, output the digitized and packetized audio and video data as a unicast audio/video data stream 86 using the Microsoft® Windows® Media Protocol (Windows® MMS) or the Real Time Streaming Protocol (RTSP). The output video/audio stream 86 can then be received by any number of clients 300 connected to the distributed network over which the audio/video stream 86 is distributed.
- FIG. 2 is a block diagram illustrating a first exemplary embodiment of a video conference access system 100 usable to connect a video conference session to a distributed network according to this invention. As shown in FIG. 2, the video conference access system 100 includes a video conference standard module 110 connected to a plurality of video-conference-standard video conference end point devices 190 and to one or more of an MMS (or other proprietary system) server 120, an RTSP server 130 and a web server 140. If provided, the MMS (or other proprietary system) server 120 is connected over a messaging channel 122 and outputs audio/video streams 124 to one or more MMS (or other proprietary system) clients 210. The RTSP server 130 is connected over an RTSP messaging channel 132, and outputs video streams 134 and audio streams 136 to one or more RTSP clients 220.
- A web server 140 is connected over a link 141 to an administrator client 230, which is also connected over a link 143 to a serial console 142. In particular, it should be appreciated that the administrator client 230 and the other clients 210 and 220 are generally not part of the video conference access system 100, while the web server 140 and the serial console 142 are generally part of the video conference access system 100. However, one or both of the web server 140 and the serial console 142 can be omitted from the video conference access system 100.
- Each of the video-conference-standard video conference end point devices 190 outputs three data streams to the video conference standard module 110. These data streams include a bi-directional digital video conference standard messaging stream 192, a unidirectional digital video stream 194 and a unidirectional digital audio stream 196. In particular, each of the video and audio streams is encoded using the Real Time Protocol (RTP). Each of the streams 192-196 is transmitted between the video-conference-standard video conference end point device 190 and the video conference standard module 110 using an Internet protocol (IP) packet transport technique.
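- For illustration only (the following sketch is not part of the original patent disclosure), the Python fragment below shows one way a receiving module such as the video conference standard module 110 might accept one of the RTP-encapsulated digital streams 194 or 196 over IP and parse the fixed RTP header defined by RFC 3550; the port number and the handling of the payload are assumptions.

```python
# Illustrative sketch only -- not from the patent. Receives UDP datagrams
# carrying RTP and parses the 12-byte fixed RTP header (RFC 3550).
import socket
import struct

RTP_FIXED_HEADER = struct.Struct("!BBHII")  # V/P/X/CC, M/PT, seq, timestamp, SSRC

def receive_rtp(bind_addr: str = "0.0.0.0", port: int = 5004) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((bind_addr, port))
    while True:
        packet, _ = sock.recvfrom(2048)
        if len(packet) < RTP_FIXED_HEADER.size:
            continue  # too short to be a valid RTP packet
        b0, b1, seq, timestamp, ssrc = RTP_FIXED_HEADER.unpack_from(packet)
        payload = packet[RTP_FIXED_HEADER.size:]
        print(f"RTP v{b0 >> 6} pt={b1 & 0x7F} seq={seq} ts={timestamp} "
              f"ssrc={ssrc:#010x} bytes={len(payload)}")
        # A module like 110 would pass `payload` on toward the streaming
        # servers over its internal transport without decoding it here.
```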
- Each of the provided servers 120 and 130 receives a digital video data stream 112 and a digital audio data stream 114 from the video conference standard module 110. Each of these data streams 112 and 114 is transmitted using an internal digital transport method. In addition, the video conference standard module 110 communicates with the web server 140 using a bi-directional digital messaging stream 116. This digital messaging stream 116 is also transmitted using the internal digital transport method discussed above with respect to the data streams 112 and 114. In various exemplary embodiments, the bi-directional digital messaging stream 116 uses a proprietary protocol.
- It should be appreciated that, while various ones of the channels and streams are variously described herein as bi-directional or unidirectional, in various exemplary embodiments, each of the channels and streams disclosed herein as unidirectional can be replaced with one or more bi-directional channels or streams. Similarly, each of the channels or streams disclosed herein as bi-directional can be replaced with one or more unidirectional channels or streams. Likewise, each unidirectional channel or stream can be implemented as two or more unidirectional channels or streams, and each bi-directional channel or stream can be implemented as two or more bi-directional channels or streams.
- FIG. 3 is a block diagram illustrating a second exemplary embodiment of the video
conference access system 100 usable to connect a video conference session to a distributed network according to this invention. The second exemplary embodiment shown in FIG. 3 is generally the same as the first exemplary embodiment shown in FIG. 2. However, in the second exemplary embodiment, a transcoder 150 has been inserted between the video conference standard module 110 and the MMS server 120 and the RTSP server 130. The transcoder 150 converts the audio and video data streams 112 and 114 received from the video conference standard module 110 from the form output by the video conference standard module 110 into one or more different video and audio streams 152-158 usable by the clients 210 and/or 220.
- In general, there are a number of different encoding techniques that can be used to compress or encode the video and audio streams 112 and 114. However, a particular client 210 and/or 220 may not be able to decompress or decode the particular form of the video and audio streams 112 and 114 output by the video conference standard module 110. In this case, if that client received the compressed or encoded video and audio streams 112 and 114 directly from the video conference standard module 110, that client would not be able to decompress or decode one or both of the video or audio streams 112 and 114. Similarly, the bit rate of the video and audio streams 112 and 114 output by the video conference standard module 110 may not match the bit rate required or desired by various clients 210 and/or 220.
- The transcoder 150 decompresses or decodes the video and audio streams 112 and 114 received from the video conference standard module 110 and recompresses or re-encodes the video and audio streams 112 and 114 into one or more different forms as the separate video and audio streams 152-158. The transcoder 150 supplies these video and audio streams 152-158 to the MMS server 120 and/or the RTSP server 130.
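- As an illustration only (this sketch is not part of the original disclosure), the following Python fragment models the decode-once, re-encode-many behavior described for the transcoder 150: each incoming frame is decoded one time and then re-encoded for several output profiles at different bit rates. The profile names and the stand-in codec callables are hypothetical.

```python
# Illustrative sketch only -- not from the patent. One decode pass feeds
# several re-encode passes, one per output profile.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class OutputProfile:
    name: str            # e.g. "modem", "broadband" (hypothetical labels)
    codec: str           # target encoding, e.g. "H.263", "MPEG-4"
    bit_rate_kbps: int   # target bit rate for that client population

def transcode(frames: Iterable[bytes],
              decode: Callable[[bytes], bytes],
              encode: Callable[[bytes, OutputProfile], bytes],
              profiles: List[OutputProfile]) -> dict:
    """Decode each incoming frame once, then re-encode it for every profile."""
    outputs = {p.name: [] for p in profiles}
    for frame in frames:
        raw = decode(frame)                      # decompress the incoming form
        for profile in profiles:
            outputs[profile.name].append(encode(raw, profile))
    return outputs

if __name__ == "__main__":
    # Trivial stand-in codecs, used only to exercise the pipeline shape.
    profiles = [OutputProfile("modem", "H.263", 56),
                OutputProfile("broadband", "MPEG-4", 768)]
    identity_decode = lambda data: data
    tagged_encode = lambda raw, p: f"{p.codec}@{p.bit_rate_kbps}kbps ".encode() + raw
    streams = transcode([b"frame1", b"frame2"], identity_decode, tagged_encode, profiles)
    print({name: len(frames) for name, frames in streams.items()})
```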
- Each of the different forms of the audio and video streams 152-158 provided to the MMS server 120 and the RTSP server 130 can be accessed by the clients by transmitting a unique identifier, such as a specific uniform resource locator (URL), to one of the servers, i.e., the MMS server 120 or the RTSP server 130. In response, the MMS server 120 or the RTSP server 130 would unicast that particular set of video and audio streams 152-158 to that user. In various exemplary embodiments, the specific identifier and the particular set of video and audio streams 152-158 that identifier is associated with are displayed to the user on a web page that is associated with the particular video conference the user wishes to view. In this case, the user transmits the specific identifier to the MMS server 120 or the RTSP server 130 by selecting and activating an associated hyperlink.
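- For illustration only (not part of the original disclosure), the following minimal Python sketch shows how a unique identifier such as a URL path, activated from the conference web page, could be mapped to one particular set of the transcoded streams 152-158 before a server unicasts them; the paths and profile names shown are hypothetical.

```python
# Illustrative sketch only -- not from the patent. A catalog maps each
# identifier (URL path) to one set of transcoded video and audio streams.
STREAM_CATALOG = {
    "/conference/42/modem":     {"video": "video_56k",  "audio": "audio_16k"},
    "/conference/42/broadband": {"video": "video_768k", "audio": "audio_64k"},
}

def resolve_stream_set(requested_url_path: str) -> dict:
    """Return the stream set associated with the identifier, or raise KeyError."""
    try:
        return STREAM_CATALOG[requested_url_path]
    except KeyError:
        raise KeyError(f"No stream set is registered for {requested_url_path!r}")

if __name__ == "__main__":
    print(resolve_stream_set("/conference/42/modem"))
```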
- FIG. 4 is a block diagram illustrating a third exemplary embodiment of the video conference access system 100 usable to connect a video conference session to a distributed network according to this invention. The third exemplary embodiment shown in FIG. 4 is generally the same as the first exemplary embodiment shown in FIG. 2. However, in the third exemplary embodiment, a record module 160 and one or more storage devices 170 have been connected to the video conference standard module 110. The record module 160 allows the video and audio streams 112 and 114 of the video conference to be stored in the one or more storage devices 170.
- Alternately, a portion of the video and audio streams 112 and 114 stored in the one or more storage devices 170 can be read and played back by the record module 160 to the video conference standard module 110 and, through the video conference standard module 110, to the video-conference-standard video conference devices 190. In this way, a previous portion of the video conference can be played back to the participants in the video conference. This could be useful if there was a dispute over what had previously occurred during the video conference, or if a participant was absent during a particular portion of the video conference.
- Finally, the record module 160 and the one or more storage devices 170 can receive and store other electronic data uploaded by one of the clients 210 or 220 through the MMS server 120 or the RTSP server 130, respectively, to the video conference standard module 110. Then, like a recorded portion of the video conference, this uploaded electronic data can be transmitted by the record module 160 to the video conference standard module 110 and, through the video conference standard module 110, to the video-conference-standard video conference devices 190. In this way, the uploaded electronic data can be displayed to the participants in the video conference.
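- For illustration only (not part of the original disclosure), the following Python sketch approximates the behavior attributed to the record module 160: packets from the conference are appended to a store, and a previously recorded portion, or separately uploaded data, can later be read back and re-injected into the live conference. The class and method names are assumptions.

```python
# Illustrative sketch only -- not from the patent. Records conference packets
# with capture times and plays a stored time range (plus uploads) back out.
import time
from typing import Callable, List, Tuple

class RecordModule:
    def __init__(self) -> None:
        self._store: List[Tuple[float, bytes]] = []   # (capture_time, packet)
        self._uploads: List[bytes] = []

    def record(self, packet: bytes) -> None:
        self._store.append((time.time(), packet))

    def upload(self, data: bytes) -> None:
        self._uploads.append(data)

    def play_back(self, start: float, end: float,
                  send_to_conference: Callable[[bytes], None]) -> None:
        """Re-inject a stored time range, then any uploaded data, into the conference."""
        for captured_at, packet in self._store:
            if start <= captured_at <= end:
                send_to_conference(packet)
        for data in self._uploads:
            send_to_conference(data)
```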
- The one or more storage devices 170 can include one or more locally located physical storage devices, such as a hard disk, RAM, flash memory, a writeable or re-writeable optical disk, or any other known or later-developed locally located storage device, that is locally implemented, for example, as part of the video conference standard module 110 and/or the record module 160. Similarly, the one or more storage devices 170 can include one or more remotely located storage devices, such as a storage server, or any other known or later-developed remotely located storage device that is accessed by the record module 160 over a distributed network. Furthermore, the one or more storage devices 170 can include both one or more locally located storage devices and one or more remotely located storage devices.
- FIG. 5 is a block diagram illustrating a fourth exemplary embodiment of the video conference access system 100 usable to connect a video conference session to a distributed network according to this invention. The fourth exemplary embodiment shown in FIG. 5 is generally the same as the first exemplary embodiment shown in FIG. 2. However, in the fourth exemplary embodiment, the transcoder 150, described above with respect to FIG. 3, has been inserted between the video conference standard module 110 and the MMS server 120 and the RTSP server 130, and the record module 160 and the one or more storage devices 170, described above with respect to FIG. 4, have been connected to the video conference standard module 110.
- As outlined above with respect to FIG. 1, the conventional system shown in FIG. 1 piggybacks on the video conference end point device 60 used by one of the video conference participants. That is, the video conference end point device 60 is the end point of one of the video conference participants. The video conference standard client 70 is thus used by the video conference participants to convert the digital audio and video streams into analog format so that the video and audio streams of the video conference can be presented to the video conference participants. The video capture and encoding device 80 piggybacks on these analog signals and reconverts them back into digital format.
- In contrast, in the various exemplary embodiments of the systems and methods according to this invention, such as those outlined above with respect to FIGS. 2-5, the video-conference-standard video conference end point device 190 of the video conference access system 100 is not the video conference device used by one of the actual participants in the video conference. Rather, the video-conference-standard video conference end point device 190 of the video conference access system 100 according to this invention separately interacts with the particular multi-point control unit of a particular video conference in the same way that the video conference end point devices 60 of the actual participants interact with the multi-point control unit. Thus, in general, although not necessarily, the video-conference-standard video conference end point device 190 is not an active participant in that particular video conference session, and does not actively transmit video and audio data to the multi-point control unit as is done by the video conference end point devices 60 of the active participants. Instead, the video-conference-standard video conference end point device 190 acts as a “pseudo-participant” within that particular video conference session.
- This provides several distinct advantages over the conventional system illustrated in FIG. 1. Initially, like any video conference participant, the video-conference-standard video conference device 190 can be located anywhere relative to the other video conference participants. Thus, the video conference standard module 110, unlike the video capture and encoding device 80, is not limited to being located in the same room, or even the same physical structure, as the video conference equipment of one of the participants to the video conference.
- Additionally, because the video-conference-standard video conference end point device 190 does not have to have any specific relationship to the other video conference participants, multiple video-conference-standard video conference end point devices 190 can be connected to the video conference standard module 110 and act as “pseudo-participants” in a variety of different video conference sessions at the same time. Thus, the video conference access system 100 acts as a video-conference-standard video conferencing network appliance. The video conference access system 100 can work with any Internet protocol (IP)-based video conference standard network, or even, via an ISDN to video conference standard gateway, with H.320 video conferencing systems. The video conference access system 100 connects with other video-conference-standard video conferencing equipment like any other end point device. This allows an end point device 60 to connect to one of the video-conference-standard video conference end point devices 190 directly, or one of the video-conference-standard video conference end point devices 190 to connect to a multi-point conference through the multi-point control unit 70.
- The video conference standard module 110 of the video conference access system 100 takes advantage of the existing encoded video and audio data that is already being transmitted between the participants of the particular video conference session. The video-conference-standard video conference end point device 190 acts as a “pseudo-participant” to repackage the existing encoded video data for playback by conventional streaming media players. In various exemplary embodiments, the unicast servers include servers able to output unicast multimedia data streams usable by the Microsoft® Windows® Media Player®, the Apple® QuickTime® player, the Real Networks® Real® player, or the like.
The video conference standard module 110 takes advantage of the high-quality video compression hardware present in the video-conference-standard video conference end point device 190. In general, because the video and audio data remain in digital format from the time the video and audio streams are received by the video-conference-standard video conference end point device 190 until the video and audio streams are transmitted to the clients 210 and/or 220, there is no loss of audio or video quality within the video conference access system 100, such as that caused by the software digitizing and encoding used in the conventional system shown in FIG. 1.
- Moreover, because the audio and video data remain in digital form, the clients 210 and/or 220 can receive higher-quality audio and video streams than the clients 200 that access the conventional system shown in FIG. 1.
- FIG. 6 is a flowchart outlining a first exemplary embodiment of a method for distributing the audio and video content of a video conference as a multimedia data stream over a distributed network according to this invention. Beginning in step S100, operation continues to step S200, where a video conference to be distributed as a multimedia data stream over a distributed network is established between two or more video conference end point devices, if a peer-to-peer system is used, or between two or more video conference end point devices and a multi-point control unit. Next, in step S300, a video conference pseudo-participant end point unit according to this invention is connected to the established video conference. Then, in step S400, the digital video and audio streams of the video conference are supplied from the pseudo-participant end point unit to a streaming module. Operation then continues to step S500.
- In step S500, the digital video and audio streams supplied to the streaming module are resupplied to one or more streaming servers that have one or more different protocols. These servers include, but are not limited to, servers able to output unicast multimedia data streams using the Microsoft® Windows® Media protocol (Windows® MMS), the Apple® QuickTime® protocol, the Real Networks® Real® protocol, the Internet Engineering Task Force (IETF) Real Time Streaming Protocol (RTSP), or any other known or later-developed protocol. Then, in step S600, each of the streaming servers converts the digital video and audio streams supplied to that particular streaming server into the corresponding protocol implemented by that streaming server. Next, in step S700, each different streaming server supplies the converted digital audio and video streams, now in the protocol corresponding to that particular streaming server, to one or more corresponding clients. Operation then continues to step S800.
- In step S800, a determination is made whether the digital video and audio streams should continue to be captured from the video conference and supplied through the pseudo-participant end point and the streaming module to the streaming servers. If so, operation jumps back to step S400. Otherwise, operation continues to step S900, where the method ends.
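- For illustration only (the patent discloses no source code), the following Python sketch mirrors the control flow of the FIG. 6 flowchart: the pseudo-participant is connected to the established conference, and the digital streams are then repeatedly supplied to the streaming module and servers until capture should stop. All helper functions are hypothetical placeholders.

```python
# Illustrative sketch only -- not from the patent. Control loop shaped like
# steps S200-S900 of FIG. 6; every callable is a placeholder.
def distribute_conference(establish_conference, connect_pseudo_participant,
                          get_streams, supply_to_servers, should_continue):
    conference = establish_conference()                 # S200
    endpoint = connect_pseudo_participant(conference)   # S300
    while True:
        video, audio = get_streams(endpoint)            # S400: streams to streaming module
        supply_to_servers(video, audio)                  # S500-S700: per-protocol servers to clients
        if not should_continue():                        # S800
            break                                        # S900: method ends
```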
- It should be appreciated that, as outlined above with respect to FIGS. 3 and 5, in step S500, supplying the digital video and audio streams from the streaming module to the one or more streaming servers can comprise supplying the particular digital video and audio streams to a particular streaming server at different audio and/or video compression rates and/or using different audio and/or video compression and/or encoding techniques.
- FIG. 7 is a flowchart outlining a second exemplary embodiment of a method for distributing the audio and video content of a video conference as a multimedia data stream over a distributed network according to this invention. As shown in FIG. 7, beginning in step S1000, operation continues to step S1100, where a video conference is established. Then, in step S1200, a pseudo-participant end point is connected to the established video conference. Next, in step S1300, the digital video and audio streams from the pseudo-participant end point are supplied to the streaming module. Operation then continues to step S1400.
- In step S1400, the digital video and audio streams from the streaming module are supplied to one or more streaming servers having one or more different protocols, as well as to a storage device that stores the digital video and audio streams. Next, in
step S1500, the digital video and audio streams received at each different streaming server are converted to the protocol corresponding to that streaming server. Then, in step S1600, the converted digital audio and video streams are supplied, from each different streaming server, in the various protocols corresponding to the different streaming servers, to one or more corresponding clients. Operation then continues to step S1700.
- In step S1700, a determination is made whether the video conference continues to supply the video and audio data streams to the streaming module, and thence to the different streaming servers. If so, operation continues to step S1800. Otherwise, operation jumps to step S2000.
- In step S1800, a determination is made whether or not to play back any of the portions of the video and audio streams of this video conference that have been stored in the storage device in step S1400, or to play back any other data that may have been uploaded and/or stored in the storage device. If so, operation continues to step S1900. Otherwise, operation jumps back to step S1300. In step S1900, the stored digital video and/or audio streams and/or the uploaded data stored in the storage device are played back into the current video conference. Operation then again jumps back to step S1300. In contrast, in step S2000, the operation of the method ends.
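- For illustration only (again, not part of the original disclosure), the following Python sketch extends the previous one with the FIG. 7 additions: the streams are also written to a storage device, and on each pass a playback decision can re-inject stored or uploaded material into the live conference. The helpers are hypothetical placeholders.

```python
# Illustrative sketch only -- not from the patent. Loop shaped like steps
# S1300-S2000 of FIG. 7, with storage (S1400) and playback (S1800/S1900).
def distribute_and_record(get_streams, supply_to_servers, storage,
                          conference_active, playback_requested, play_back):
    while conference_active():                           # S1700
        video, audio = get_streams()                      # S1300
        supply_to_servers(video, audio)                   # S1400-S1600
        storage.append((video, audio))                    # S1400: also store the streams
        if playback_requested():                          # S1800
            play_back(storage)                            # S1900: re-inject stored/uploaded data
```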
- In various exemplary embodiments of the video
conference access system 100 shown in FIGS. 2-5, the various software and hardware elements are supported by a Linux kernel that provides the network resources. The small operating system footprint and versatile network stack provided by the Linux kernel work exceptionally well with the video conference standard stack. Thus, the video conference standard module 110 is able to seamlessly connect the digital audio and video streams of the video conference to Internet protocol (IP)-based networks. - Linux has been proven, in a significant number of embedded devices, to be an extremely functional real-time operating system, while still providing necessary system resources. The high performance of Linux in a small specialized device provides the ability to ensure that the video
conference access system 100 will be able to meet both present and future streaming media requirements in a fully scalable fashion. - In various exemplary embodiments, the
administrator client 230 allows an administrator to grant or deny permission to a user to view a broadcast. This allows the IT manager or a video conference coordinator to maintain full control over the distribution of proprietary and/or confidential information, while still allowing the transition from conventional media distribution to modern Internet-based content delivery technologies.
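- For illustration only (not part of the original disclosure), the following minimal Python sketch shows the kind of permission list an administrator using the administrator client 230 might maintain, with the servers consulting it before unicasting a broadcast to a user; the identifiers and class name are hypothetical.

```python
# Illustrative sketch only -- not from the patent. Grant, deny and check
# viewing permission for a broadcast on a per-user basis.
class BroadcastAccessControl:
    def __init__(self) -> None:
        self._allowed: dict[str, set[str]] = {}   # broadcast id -> permitted user ids

    def grant(self, broadcast_id: str, user_id: str) -> None:
        self._allowed.setdefault(broadcast_id, set()).add(user_id)

    def deny(self, broadcast_id: str, user_id: str) -> None:
        self._allowed.get(broadcast_id, set()).discard(user_id)

    def may_view(self, broadcast_id: str, user_id: str) -> bool:
        return user_id in self._allowed.get(broadcast_id, set())

if __name__ == "__main__":
    acl = BroadcastAccessControl()
    acl.grant("quarterly-review", "alice")
    print(acl.may_view("quarterly-review", "alice"))  # True
    print(acl.may_view("quarterly-review", "bob"))    # False
```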
- The video conference standard module 110, the transcoder 150, the record module 160 and/or the clients of the video conference access system 100 are, in various exemplary embodiments, implemented on one or more programmed general purpose computers. However, the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients of the video conference access system 100 can also be implemented on one or more special purpose computers, one or more programmed microprocessors or microcontrollers and peripheral integrated circuit elements, one or more ASICs or other integrated circuits, one or more digital signal processors, one or more hardwired electronic or logic circuits such as a discrete element circuit, a programmable logic device such as a PLD, PLA, FPGA or PAL, or the like. In general, any device capable of implementing a finite state machine that is in turn capable of implementing the flowcharts shown in FIGS. 6 and 7 can be used to implement the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients of the video conference access system 100.
- It should be understood that each of the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients can be implemented as portions of a suitably programmed general purpose computer. Alternatively, each of the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients can be implemented as physically distinct hardware circuits, or using discrete logic elements or discrete circuit elements. The particular form that each of the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients will take is a design choice that will be apparent to those skilled in the art.
- Moreover, the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients can each be implemented as software executing on a programmed general purpose computer, a special purpose computer, a microprocessor or the like. Alternatively, the video conference standard module 110, the transcoder 150, the record module 160 and/or the clients can each be implemented as one or more routines or resources residing on a server or embedded in a communications network. The video conference standard module 110, the transcoder 150, the record module 160 and/or the clients can also be implemented by physically incorporating them into a larger software and/or hardware system.
- The storage devices 170 can be implemented using any appropriate combination of alterable, volatile or non-volatile memory or non-alterable, or fixed, memory. The alterable memory, whether volatile or non-volatile, can be implemented using any one or more of static or dynamic RAM, a floppy disk and disk drive, a writeable or re-writeable optical disk and disk drive, a hard drive, flash memory or the like. Similarly, the non-alterable or fixed memory can be implemented using any one or more of ROM, PROM, EPROM, EEPROM, an optical ROM disk, such as a CD-ROM or DVD-ROM disk, and disk drive, or the like.
- While this invention has been described in conjunction with the exemplary embodiments outlined above, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the exemplary embodiments of the invention, as set forth above, are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of the invention.
Claims (23)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/984,499 US20020126201A1 (en) | 2001-03-08 | 2001-10-30 | Systems and methods for connecting video conferencing to a distributed network |
US11/161,701 US7043528B2 (en) | 2001-03-08 | 2005-08-12 | Systems and methods for connecting video conferencing to a distributed network |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US27382501P | 2001-03-08 | 2001-03-08 | |
US09/984,499 US20020126201A1 (en) | 2001-03-08 | 2001-10-30 | Systems and methods for connecting video conferencing to a distributed network |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/161,701 Continuation US7043528B2 (en) | 2001-03-08 | 2005-08-12 | Systems and methods for connecting video conferencing to a distributed network |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020126201A1 true US20020126201A1 (en) | 2002-09-12 |
Family
ID=26956454
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/984,499 Abandoned US20020126201A1 (en) | 2001-03-08 | 2001-10-30 | Systems and methods for connecting video conferencing to a distributed network |
US11/161,701 Expired - Fee Related US7043528B2 (en) | 2001-03-08 | 2005-08-12 | Systems and methods for connecting video conferencing to a distributed network |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/161,701 Expired - Fee Related US7043528B2 (en) | 2001-03-08 | 2005-08-12 | Systems and methods for connecting video conferencing to a distributed network |
Country Status (1)
Country | Link |
---|---|
US (2) | US20020126201A1 (en) |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040008635A1 (en) * | 2002-07-10 | 2004-01-15 | Steve Nelson | Multi-participant conference system with controllable content delivery using a client monitor back-channel |
US20040207724A1 (en) * | 2003-04-17 | 2004-10-21 | Siemens Information And Communication Networks, Inc. | System and method for real time playback of conferencing streams |
US20040230655A1 (en) * | 2003-05-16 | 2004-11-18 | Chia-Hsin Li | Method and system for media playback architecture |
US20040236830A1 (en) * | 2003-05-15 | 2004-11-25 | Steve Nelson | Annotation management system |
WO2005048600A1 (en) * | 2003-11-14 | 2005-05-26 | Tandberg Telecom As | Distributed real-time media composer |
US20050243741A1 (en) * | 2004-03-05 | 2005-11-03 | Tandberg Telecom As | Method, apparatus, system, and computer program product for interruption-free conference calling |
US20050289064A1 (en) * | 2002-12-31 | 2005-12-29 | Medialive, A Corporation Of France | Personalized markup for protecting numerical audiovisual streams |
US20060271626A1 (en) * | 2005-05-27 | 2006-11-30 | Microsoft Corporation | Supporting a serial and a parallel invitation protocol |
US20060268753A1 (en) * | 2005-05-27 | 2006-11-30 | Microsoft Corporation | Establishing a multiparty session by sending invitations in parallel |
WO2007035109A1 (en) * | 2005-09-26 | 2007-03-29 | Tandberg Telecom As | Method for gatekeeper streaming |
US20070192486A1 (en) * | 2006-02-14 | 2007-08-16 | Sbc Knowledge Ventures L.P. | Home automation system and method |
US20080077692A1 (en) * | 2006-09-25 | 2008-03-27 | Microsoft Corporation | Application programming interface for efficient multicasting of communications |
US20090181659A1 (en) * | 2006-05-05 | 2009-07-16 | Stalnacke Marika | Method and arrangement for management of virtual meetings |
US7619645B2 (en) | 2002-08-23 | 2009-11-17 | Tandberg Nz Limited | Audio visual media encoding system |
US20090310614A1 (en) * | 2008-06-13 | 2009-12-17 | Cisco Technology, Inc. | System and Method for Establishment of a Multiprotocol Label Switching (MPLS) Tunnel |
US20100095223A1 (en) * | 2008-10-15 | 2010-04-15 | Ted Beers | Reconfiguring a collaboration event |
US20100091687A1 (en) * | 2008-10-15 | 2010-04-15 | Ted Beers | Status of events |
US20100177158A1 (en) * | 2006-04-06 | 2010-07-15 | Walter Edward A | System and Method for Distributing Video Conference Data over an Internet Protocol Television System |
US20100287295A1 (en) * | 2009-05-07 | 2010-11-11 | International Business Machines Corporation | Architecture for building multi-media streaming applications |
US20110069141A1 (en) * | 2008-04-30 | 2011-03-24 | Mitchell April S | Communication Between Scheduled And In Progress Event Attendees |
US20110093590A1 (en) * | 2008-04-30 | 2011-04-21 | Ted Beers | Event Management System |
US20110154417A1 (en) * | 2009-12-22 | 2011-06-23 | Reha Civanlar | System and method for interactive synchronized video watching |
US20110179157A1 (en) * | 2008-09-26 | 2011-07-21 | Ted Beers | Event Management System For Creating A Second Event |
US20110205328A1 (en) * | 2009-08-24 | 2011-08-25 | Hidekatsu Ozeki | Video conferencing system, video conferencing apparatus, video conferencing control method, and video conferencing control program |
JP2011525770A (en) * | 2008-06-23 | 2011-09-22 | ラドヴィジョン リミテッド | System, method and medium for providing a cascaded multipoint video conference device |
US20110279635A1 (en) * | 2010-05-12 | 2011-11-17 | Alagu Periyannan | Systems and methods for scalable composition of media streams for real-time multimedia communication |
US8457614B2 (en) | 2005-04-07 | 2013-06-04 | Clearone Communications, Inc. | Wireless multi-unit conference phone |
US20140059629A1 (en) * | 2012-08-23 | 2014-02-27 | Electronics And Telecommunications Research Institute | Two-way broadcast service providing system and method including media transmission apparatus |
US20140247887A1 (en) * | 2011-12-28 | 2014-09-04 | Verizon Patent And Licensing Inc. | Just-in-time (jit) encoding for streaming media content |
US8832193B1 (en) * | 2011-06-16 | 2014-09-09 | Google Inc. | Adjusting a media stream in a video communication system |
US9124757B2 (en) | 2010-10-04 | 2015-09-01 | Blue Jeans Networks, Inc. | Systems and methods for error resilient scheme for low latency H.264 video coding |
US20150256796A1 (en) * | 2014-03-07 | 2015-09-10 | Zhigang Ma | Device and method for live video chat |
US9300705B2 (en) | 2011-05-11 | 2016-03-29 | Blue Jeans Network | Methods and systems for interfacing heterogeneous endpoints and web-based media sources in a video conference |
US9369673B2 (en) | 2011-05-11 | 2016-06-14 | Blue Jeans Network | Methods and systems for using a mobile device to join a video conference endpoint into a video conference |
EP3329670A4 (en) * | 2015-07-28 | 2019-03-06 | Mersive Technologies, Inc. | VIRTUAL VIDEO CONTROL DEVICE BRIDGE SYSTEM FOR MULTISOURCE COLLABORATION IN A WEB CONFERENCE SYSTEM |
US10264131B2 (en) * | 2014-02-28 | 2019-04-16 | Ricoh Company, Ltd. | Transmission control system, transmission system, and method of transmission control |
US10375088B2 (en) * | 2015-06-04 | 2019-08-06 | Vm-Robot, Inc. | Routing systems and methods |
US10771508B2 (en) | 2016-01-19 | 2020-09-08 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
US10931725B2 (en) * | 2017-09-29 | 2021-02-23 | Apple Inc. | Multiway audio-video conferencing |
US11171938B2 (en) | 2018-12-21 | 2021-11-09 | Wells Fargo Bank, N.A. | Multi-layer user authentication with live interaction |
US11259180B2 (en) * | 2015-06-04 | 2022-02-22 | Vm-Robot, Inc. | Routing systems and methods |
Families Citing this family (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020126201A1 (en) * | 2001-03-08 | 2002-09-12 | Star-Bak Communication Inc. | Systems and methods for connecting video conferencing to a distributed network |
US20020165920A1 (en) * | 2001-04-24 | 2002-11-07 | Alcatel, Societe Anonyme | Facilitating simultaneous download of a multicast file to a plurality of end user download devices |
JP4077330B2 (en) * | 2003-02-06 | 2008-04-16 | 富士通株式会社 | Data generator |
NO318868B1 (en) * | 2003-10-24 | 2005-05-18 | Tandberg Telecom As | Video conferencing with enhanced multimedia capabilities |
TWI238663B (en) * | 2004-02-13 | 2005-08-21 | Era Digital Media Co Ltd | Cross media information integration system |
KR100630897B1 (en) * | 2004-07-05 | 2006-10-04 | 에스케이 텔레콤주식회사 | Interactive multimedia service system and method using mobile terminal |
US7870192B2 (en) * | 2004-12-16 | 2011-01-11 | International Business Machines Corporation | Integrated voice and video conferencing management |
US20070240185A1 (en) * | 2005-08-26 | 2007-10-11 | Weaver Timothy H | Methods, apparatuses, and computer program products for delivering audio content on demand |
US20070250875A1 (en) * | 2005-08-26 | 2007-10-25 | Weaver Timothy H | Methods, apparatuses, and computer program products for delivering one or more television programs for viewing during a specified viewing interval |
US20080320530A1 (en) * | 2005-08-26 | 2008-12-25 | Weaver Timothy H | Methods, apparatuses, and computer program products for delivering video on demand content |
US8760485B2 (en) * | 2006-03-02 | 2014-06-24 | Cisco Technology, Inc. | System and method for displaying participants in a videoconference between locations |
CN101496387B (en) | 2006-03-06 | 2012-09-05 | 思科技术公司 | System and method for access authentication in a mobile wireless network |
US20070250567A1 (en) * | 2006-04-20 | 2007-10-25 | Graham Philip R | System and method for controlling a telepresence system |
US7692680B2 (en) * | 2006-04-20 | 2010-04-06 | Cisco Technology, Inc. | System and method for providing location specific sound in a telepresence system |
US7679639B2 (en) | 2006-04-20 | 2010-03-16 | Cisco Technology, Inc. | System and method for enhancing eye gaze in a telepresence system |
US7707247B2 (en) * | 2006-04-20 | 2010-04-27 | Cisco Technology, Inc. | System and method for displaying users in a visual conference between locations |
US7710448B2 (en) * | 2006-04-20 | 2010-05-04 | Cisco Technology, Inc. | System and method for preventing movement in a telepresence system |
US7532232B2 (en) * | 2006-04-20 | 2009-05-12 | Cisco Technology, Inc. | System and method for single action initiation of a video conference |
US7558823B2 (en) * | 2006-05-31 | 2009-07-07 | Hewlett-Packard Development Company, L.P. | System and method for managing virtual collaboration systems |
US8856371B2 (en) * | 2006-08-07 | 2014-10-07 | Oovoo Llc | Video conferencing over IP networks |
US9635315B2 (en) * | 2006-08-07 | 2017-04-25 | Oovoo Llc | Video conferencing over IP networks |
US8817668B2 (en) * | 2006-09-15 | 2014-08-26 | Microsoft Corporation | Distributable, scalable, pluggable conferencing architecture |
US8085290B2 (en) | 2006-12-06 | 2011-12-27 | Cisco Technology, Inc. | System and method for displaying a videoconference |
US20080168137A1 (en) * | 2007-01-08 | 2008-07-10 | Ray Benza | Continuing education portal |
US8103363B2 (en) * | 2007-01-31 | 2012-01-24 | Hewlett-Packard Development Company, L.P. | Device control system |
US7911955B2 (en) * | 2007-01-31 | 2011-03-22 | Hewlett-Packard Development Company, L.P. | Coordinated media control system |
US8024486B2 (en) | 2007-03-14 | 2011-09-20 | Hewlett-Packard Development Company, L.P. | Converting data from a first network format to non-network format and from the non-network format to a second network format |
WO2008112001A1 (en) * | 2007-03-14 | 2008-09-18 | Hewlett-Packard Development Company, L.P. | Synthetic bridging |
US8203591B2 (en) | 2007-04-30 | 2012-06-19 | Cisco Technology, Inc. | Method and system for optimal balance and spatial consistency |
US11095583B2 (en) | 2007-06-28 | 2021-08-17 | Voxer Ip Llc | Real-time messaging method and apparatus |
US8180029B2 (en) | 2007-06-28 | 2012-05-15 | Voxer Ip Llc | Telecommunication and multimedia management method and apparatus |
EP2091203A1 (en) * | 2008-02-12 | 2009-08-19 | Koninklijke KPN N.V. | Method and system for transmitting a multimedia stream |
US8276195B2 (en) * | 2008-01-02 | 2012-09-25 | Microsoft Corporation | Management of split audio/video streams |
US8379076B2 (en) * | 2008-01-07 | 2013-02-19 | Cisco Technology, Inc. | System and method for displaying a multipoint videoconference |
WO2009100565A1 (en) * | 2008-01-30 | 2009-08-20 | Zte Corporation | Method and system for realizing multi-part conference call in a wireless communication system |
US8797377B2 (en) | 2008-02-14 | 2014-08-05 | Cisco Technology, Inc. | Method and system for videoconference configuration |
US8355041B2 (en) * | 2008-02-14 | 2013-01-15 | Cisco Technology, Inc. | Telepresence system for 360 degree video conferencing |
US8144187B2 (en) | 2008-03-14 | 2012-03-27 | Microsoft Corporation | Multiple video stream capability negotiation |
US8319819B2 (en) | 2008-03-26 | 2012-11-27 | Cisco Technology, Inc. | Virtual round-table videoconference |
US8390667B2 (en) | 2008-04-15 | 2013-03-05 | Cisco Technology, Inc. | Pop-up PIP for people not in picture |
US8694658B2 (en) | 2008-09-19 | 2014-04-08 | Cisco Technology, Inc. | System and method for enabling communication sessions in a network environment |
US20110173263A1 (en) * | 2008-09-26 | 2011-07-14 | Ted Beers | Directing An Attendee Of A Collaboration Event To An Endpoint |
KR100899666B1 (en) * | 2008-12-29 | 2009-05-27 | (주)키스코 | Distributed multistreaming transmission device |
US8477175B2 (en) * | 2009-03-09 | 2013-07-02 | Cisco Technology, Inc. | System and method for providing three dimensional imaging in a network environment |
US8659637B2 (en) | 2009-03-09 | 2014-02-25 | Cisco Technology, Inc. | System and method for providing three dimensional video conferencing in a network environment |
US8659639B2 (en) * | 2009-05-29 | 2014-02-25 | Cisco Technology, Inc. | System and method for extending communications between participants in a conferencing environment |
US9082297B2 (en) | 2009-08-11 | 2015-07-14 | Cisco Technology, Inc. | System and method for verifying parameters in an audiovisual environment |
KR101267621B1 (en) * | 2009-11-20 | 2013-05-23 | 한국전자통신연구원 | Overlay multicasting system for group media transmission application service which is composed of multiplex stream |
US8411129B2 (en) | 2009-12-14 | 2013-04-02 | At&T Intellectual Property I, L.P. | Video conference system and method using multicast and unicast transmissions |
US9225916B2 (en) | 2010-03-18 | 2015-12-29 | Cisco Technology, Inc. | System and method for enhancing video images in a conferencing environment |
USD626102S1 (en) | 2010-03-21 | 2010-10-26 | Cisco Tech Inc | Video unit with integrated features |
USD626103S1 (en) | 2010-03-21 | 2010-10-26 | Cisco Technology, Inc. | Video unit with integrated features |
US8854416B2 (en) * | 2010-04-27 | 2014-10-07 | Lifesize Communications, Inc. | Recording a videoconference using a recording server |
US9313452B2 (en) | 2010-05-17 | 2016-04-12 | Cisco Technology, Inc. | System and method for providing retracting optics in a video conferencing environment |
US20120020374A1 (en) * | 2010-07-26 | 2012-01-26 | Kenneth Jonsson | Method and System for Merging Network Stacks |
US8896655B2 (en) | 2010-08-31 | 2014-11-25 | Cisco Technology, Inc. | System and method for providing depth adaptive video conferencing |
US8599934B2 (en) | 2010-09-08 | 2013-12-03 | Cisco Technology, Inc. | System and method for skip coding during video conferencing in a network environment |
US8599865B2 (en) | 2010-10-26 | 2013-12-03 | Cisco Technology, Inc. | System and method for provisioning flows in a mobile network environment |
US8699457B2 (en) | 2010-11-03 | 2014-04-15 | Cisco Technology, Inc. | System and method for managing flows in a mobile network environment |
US8730297B2 (en) | 2010-11-15 | 2014-05-20 | Cisco Technology, Inc. | System and method for providing camera functions in a video environment |
US9338394B2 (en) | 2010-11-15 | 2016-05-10 | Cisco Technology, Inc. | System and method for providing enhanced audio in a video environment |
US9143725B2 (en) | 2010-11-15 | 2015-09-22 | Cisco Technology, Inc. | System and method for providing enhanced graphics in a video environment |
US8902244B2 (en) | 2010-11-15 | 2014-12-02 | Cisco Technology, Inc. | System and method for providing enhanced graphics in a video environment |
US8542264B2 (en) | 2010-11-18 | 2013-09-24 | Cisco Technology, Inc. | System and method for managing optics in a video environment |
US8723914B2 (en) | 2010-11-19 | 2014-05-13 | Cisco Technology, Inc. | System and method for providing enhanced video processing in a network environment |
US9111138B2 (en) | 2010-11-30 | 2015-08-18 | Cisco Technology, Inc. | System and method for gesture interface control |
US8446455B2 (en) | 2010-12-08 | 2013-05-21 | Cisco Technology, Inc. | System and method for exchanging information in a video conference environment |
US8553064B2 (en) | 2010-12-08 | 2013-10-08 | Cisco Technology, Inc. | System and method for controlling video data to be rendered in a video conference environment |
WO2012079208A1 (en) * | 2010-12-13 | 2012-06-21 | Motorola, Inc. | Sharing media among remote access clients in a universal plug and play environment |
USD682293S1 (en) | 2010-12-16 | 2013-05-14 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD678308S1 (en) | 2010-12-16 | 2013-03-19 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD682864S1 (en) | 2010-12-16 | 2013-05-21 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD682294S1 (en) | 2010-12-16 | 2013-05-14 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD678320S1 (en) | 2010-12-16 | 2013-03-19 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD678894S1 (en) | 2010-12-16 | 2013-03-26 | Cisco Technology, Inc. | Display screen with graphical user interface |
USD682854S1 (en) | 2010-12-16 | 2013-05-21 | Cisco Technology, Inc. | Display screen for graphical user interface |
USD678307S1 (en) | 2010-12-16 | 2013-03-19 | Cisco Technology, Inc. | Display screen with graphical user interface |
US9451319B2 (en) * | 2010-12-17 | 2016-09-20 | Microsoft Technology Licensing, Llc | Streaming digital content with flexible remote playback |
US8832564B2 (en) * | 2011-02-11 | 2014-09-09 | Sony Corporation | Personalized second display browsing experience due to multiple session feature |
US8692862B2 (en) | 2011-02-28 | 2014-04-08 | Cisco Technology, Inc. | System and method for selection of video data in a video conference environment |
US8786667B2 (en) | 2011-04-26 | 2014-07-22 | Lifesize Communications, Inc. | Distributed recording of a videoconference in multiple formats |
US8780166B2 (en) | 2011-04-26 | 2014-07-15 | Lifesize Communications, Inc. | Collaborative recording of a videoconference using a recording server |
US8670019B2 (en) | 2011-04-28 | 2014-03-11 | Cisco Technology, Inc. | System and method for providing enhanced eye gaze in a video conferencing environment |
US8786631B1 (en) | 2011-04-30 | 2014-07-22 | Cisco Technology, Inc. | System and method for transferring transparency information in a video environment |
US8934026B2 (en) | 2011-05-12 | 2015-01-13 | Cisco Technology, Inc. | System and method for video coding in a dynamic environment |
US8947493B2 (en) | 2011-11-16 | 2015-02-03 | Cisco Technology, Inc. | System and method for alerting a participant in a video conference |
US8682087B2 (en) | 2011-12-19 | 2014-03-25 | Cisco Technology, Inc. | System and method for depth-guided image filtering in a video conference environment |
CN102413309A (en) * | 2011-12-27 | 2012-04-11 | 中兴通讯股份有限公司 | Method and device for joining video conference |
US9667513B1 (en) * | 2012-01-24 | 2017-05-30 | Dw Associates, Llc | Real-time autonomous organization |
US9681154B2 (en) | 2012-12-06 | 2017-06-13 | Patent Capital Group | System and method for depth-guided filtering in a video conference environment |
US9843621B2 (en) | 2013-05-17 | 2017-12-12 | Cisco Technology, Inc. | Calendaring activities based on communication processing |
KR102279582B1 (en) | 2017-10-30 | 2021-07-19 | 삼성에스디에스 주식회사 | Conferencing apparatus and method for switching access terminal thereof |
TW201933864A (en) * | 2018-01-26 | 2019-08-16 | 圓展科技股份有限公司 | Conversion device, connection conference system and connection conference method |
US10673913B2 (en) * | 2018-03-14 | 2020-06-02 | 8eo, Inc. | Content management across a multi-party conference system by parsing a first and second user engagement stream and transmitting the parsed first and second user engagement stream to a conference engine and a data engine from a first and second receiver |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5764901A (en) * | 1995-12-21 | 1998-06-09 | Intel Corporation | Record and playback in a data conference |
US6195683B1 (en) * | 1992-06-03 | 2001-02-27 | Compaq Computer Corporation | Video teleconferencing for networked workstations |
US20020033880A1 (en) * | 2000-09-19 | 2002-03-21 | Dong-Myung Sul | Method for performing multipoint video conference in video conferencing system |
US6380968B1 (en) * | 1998-01-06 | 2002-04-30 | Intel Corporation | Method and apparatus for controlling a remote video camera in a video conferencing system |
US6421733B1 (en) * | 1997-03-25 | 2002-07-16 | Intel Corporation | System for dynamically transcoding data transmitted between computers |
US6421706B1 (en) * | 1998-02-25 | 2002-07-16 | Worldcom, Inc. | Multicast and unicast internet protocol content distribution having a feedback mechanism for real-time and store and forward information transfer |
US20020112004A1 (en) * | 2001-02-12 | 2002-08-15 | Reid Clifford A. | Live navigation web-conferencing system and method |
US20030172131A1 (en) * | 2000-03-24 | 2003-09-11 | Yonghui Ao | Method and system for subject video streaming |
US6704769B1 (en) * | 2000-04-24 | 2004-03-09 | Polycom, Inc. | Media role management in a video conferencing network |
US20040071098A1 (en) * | 2000-02-22 | 2004-04-15 | Magnuski Henry S. | Videoconferencing system |
US6760749B1 (en) * | 2000-05-10 | 2004-07-06 | Polycom, Inc. | Interactive conference content distribution device and methods of use thereof |
US7043528B2 (en) * | 2001-03-08 | 2006-05-09 | Starbak Communications, Inc. | Systems and methods for connecting video conferencing to a distributed network |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5625407A (en) * | 1994-07-08 | 1997-04-29 | Lucent Technologies Inc. | Seamless multimedia conferencing system using an enhanced multipoint control unit and enhanced endpoint devices |
US5802281A (en) * | 1994-09-07 | 1998-09-01 | Rsi Systems, Inc. | Peripheral audio/video communication system that interfaces with a host computer and determines format of coded audio/video signals |
US6335927B1 (en) * | 1996-11-18 | 2002-01-01 | Mci Communications Corporation | System and method for providing requested quality of service in a hybrid network |
US6317776B1 (en) * | 1998-12-17 | 2001-11-13 | International Business Machines Corporation | Method and apparatus for automatic chat room source selection based on filtered audio input amplitude of associated data streams |
US6496201B1 (en) * | 1999-09-30 | 2002-12-17 | International Business Machines Corporation | System and user interface for multiparty conferencing |
US6657975B1 (en) * | 1999-10-25 | 2003-12-02 | Voyant Technologies, Inc. | Large-scale, fault-tolerant audio conferencing over a hybrid network |
US6816468B1 (en) * | 1999-12-16 | 2004-11-09 | Nortel Networks Limited | Captioning for tele-conferences |
-
2001
- 2001-10-30 US US09/984,499 patent/US20020126201A1/en not_active Abandoned
-
2005
- 2005-08-12 US US11/161,701 patent/US7043528B2/en not_active Expired - Fee Related
Cited By (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040008635A1 (en) * | 2002-07-10 | 2004-01-15 | Steve Nelson | Multi-participant conference system with controllable content delivery using a client monitor back-channel |
US7362349B2 (en) * | 2002-07-10 | 2008-04-22 | Seiko Epson Corporation | Multi-participant conference system with controllable content delivery using a client monitor back-channel |
US7619645B2 (en) | 2002-08-23 | 2009-11-17 | Tandberg Nz Limited | Audio visual media encoding system |
US20100040228A1 (en) * | 2002-12-31 | 2010-02-18 | Querell Data Limited Liability Company | Personalized marking for protecting digital audiovisual streams |
US8094876B2 (en) | 2002-12-31 | 2012-01-10 | Querell Data Limited Liability Company | Personalized marking for protecting digital audiovisual streams |
US20050289064A1 (en) * | 2002-12-31 | 2005-12-29 | Medialive, A Corporation Of France | Personalized markup for protecting numerical audiovisual streams |
US7639833B2 (en) * | 2002-12-31 | 2009-12-29 | Daniel Lecomte | Personalized markup for protecting numerical audiovisual streams |
WO2004095839A1 (en) * | 2003-04-17 | 2004-11-04 | Siemens Communications, Inc. | System and method for real time playback of conferencing streams |
US20040207724A1 (en) * | 2003-04-17 | 2004-10-21 | Siemens Information And Communication Networks, Inc. | System and method for real time playback of conferencing streams |
US20040236830A1 (en) * | 2003-05-15 | 2004-11-25 | Steve Nelson | Annotation management system |
US20080098295A1 (en) * | 2003-05-15 | 2008-04-24 | Seiko Epson Corporation | Annotation Management System |
US20040230655A1 (en) * | 2003-05-16 | 2004-11-18 | Chia-Hsin Li | Method and system for media playback architecture |
US20080256463A1 (en) * | 2003-05-16 | 2008-10-16 | Seiko Epson Corporation | Method and System for Media Playback Architecture |
US9462228B2 (en) | 2003-11-04 | 2016-10-04 | Cisco Technology, Inc. | Distributed real-time media composer |
US8289369B2 (en) | 2003-11-14 | 2012-10-16 | Cisco Technology, Inc. | Distributed real-time media composer |
WO2005048600A1 (en) * | 2003-11-14 | 2005-05-26 | Tandberg Telecom As | Distributed real-time media composer |
US20050122392A1 (en) * | 2003-11-14 | 2005-06-09 | Tandberg Telecom As | Distributed real-time media composer |
US7561179B2 (en) | 2003-11-14 | 2009-07-14 | Tandberg Telecom As | Distributed real-time media composer |
US8773497B2 (en) | 2003-11-14 | 2014-07-08 | Cisco Technology, Inc. | Distributed real-time media composer |
CN100568948C (en) * | 2003-11-14 | 2009-12-09 | 坦德伯格电信公司 | Distributed real-time media composer |
US20050243741A1 (en) * | 2004-03-05 | 2005-11-03 | Tandberg Telecom As | Method, apparatus, system, and computer program product for interruption-free conference calling |
US7990879B2 (en) | 2004-03-05 | 2011-08-02 | Tandberg Telecom As | Method, apparatus, system, and computer program product for interruption-free conference calling |
US8457614B2 (en) | 2005-04-07 | 2013-06-04 | Clearone Communications, Inc. | Wireless multi-unit conference phone |
US20060268753A1 (en) * | 2005-05-27 | 2006-11-30 | Microsoft Corporation | Establishing a multiparty session by sending invitations in parallel |
US7882176B2 (en) | 2005-05-27 | 2011-02-01 | Microsoft Corporation | Establishing a multiparty session by sending invitations in parallel |
US20060271626A1 (en) * | 2005-05-27 | 2006-11-30 | Microsoft Corporation | Supporting a serial and a parallel invitation protocol |
US7660850B2 (en) * | 2005-05-27 | 2010-02-09 | Microsoft Corporation | Supporting a serial and a parallel invitation protocol |
US20070127463A1 (en) * | 2005-09-26 | 2007-06-07 | Tandberg Telecom As | Method, apparatus, and computer program product for gatekeeper streaming |
WO2007035109A1 (en) * | 2005-09-26 | 2007-03-29 | Tandberg Telecom As | Method for gatekeeper streaming |
US7792063B2 (en) | 2005-09-26 | 2010-09-07 | Tandberg Telecom As | Method, apparatus, and computer program product for gatekeeper streaming |
US20070192486A1 (en) * | 2006-02-14 | 2007-08-16 | Sbc Knowledge Ventures L.P. | Home automation system and method |
US8516087B2 (en) * | 2006-02-14 | 2013-08-20 | At&T Intellectual Property I, L.P. | Home automation system and method |
US20140192143A1 (en) * | 2006-04-06 | 2014-07-10 | At&T Intellectual Property I, Lp | System and method for distributing video conference data over an internet protocol television system |
US9661268B2 (en) * | 2006-04-06 | 2017-05-23 | At&T Intellectual Property I, L.P. | System and method for distributing video conference data over an internet protocol television system |
US8706807B2 (en) * | 2006-04-06 | 2014-04-22 | AT&T Intellectual Protperty I, LP | System and method for distributing video conference data over an internet protocol television system |
US20100177158A1 (en) * | 2006-04-06 | 2010-07-15 | Walter Edward A | System and Method for Distributing Video Conference Data over an Internet Protocol Television System |
US20090181659A1 (en) * | 2006-05-05 | 2009-07-16 | Stalnacke Marika | Method and arrangement for management of virtual meetings |
US20080077692A1 (en) * | 2006-09-25 | 2008-03-27 | Microsoft Corporation | Application programming interface for efficient multicasting of communications |
US7698439B2 (en) | 2006-09-25 | 2010-04-13 | Microsoft Corporation | Application programming interface for efficient multicasting of communications |
US20110069141A1 (en) * | 2008-04-30 | 2011-03-24 | Mitchell April S | Communication Between Scheduled And In Progress Event Attendees |
US20110093590A1 (en) * | 2008-04-30 | 2011-04-21 | Ted Beers | Event Management System |
US8493984B2 (en) * | 2008-06-13 | 2013-07-23 | Cisco Technology, Inc. | System and method for establishment of a multiprotocol label switching (MPLS) tunnel |
US20090310614A1 (en) * | 2008-06-13 | 2009-12-17 | Cisco Technology, Inc. | System and Method for Establishment of a Multiprotocol Label Switching (MPLS) Tunnel |
JP2011525770A (en) * | 2008-06-23 | 2011-09-22 | ラドヴィジョン リミテッド | System, method and medium for providing a cascaded multipoint video conference device |
US20110179157A1 (en) * | 2008-09-26 | 2011-07-21 | Ted Beers | Event Management System For Creating A Second Event |
US20100095223A1 (en) * | 2008-10-15 | 2010-04-15 | Ted Beers | Reconfiguring a collaboration event |
US20100091687A1 (en) * | 2008-10-15 | 2010-04-15 | Ted Beers | Status of events |
US7792901B2 (en) | 2008-10-15 | 2010-09-07 | Hewlett-Packard Development Company, L.P. | Reconfiguring a collaboration event |
US8819258B2 (en) * | 2009-05-07 | 2014-08-26 | International Business Machines Corporation | Architecture for building multi-media streaming applications |
US20100287295A1 (en) * | 2009-05-07 | 2010-11-11 | International Business Machines Corporation | Architecture for building multi-media streaming applications |
US8704868B2 (en) * | 2009-08-24 | 2014-04-22 | Panasonic Corporation | Video conferencing system, video conferencing apparatus, video conferencing control method, and video conferencing control program |
US20110205328A1 (en) * | 2009-08-24 | 2011-08-25 | Hidekatsu Ozeki | Video conferencing system, video conferencing apparatus, video conferencing control method, and video conferencing control program |
US20110154417A1 (en) * | 2009-12-22 | 2011-06-23 | Reha Civanlar | System and method for interactive synchronized video watching |
WO2011087727A1 (en) * | 2009-12-22 | 2011-07-21 | Delta Vidyo, Inc. | System and method for interactive synchronized video watching |
US9055312B2 (en) | 2009-12-22 | 2015-06-09 | Vidyo, Inc. | System and method for interactive synchronized video watching |
US9232191B2 (en) | 2010-05-12 | 2016-01-05 | Blue Jeans Networks, Inc. | Systems and methods for scalable distributed global infrastructure for real-time multimedia communication |
US9143729B2 (en) | 2010-05-12 | 2015-09-22 | Blue Jeans Networks, Inc. | Systems and methods for real-time virtual-reality immersive multimedia communications |
US8514263B2 (en) | 2010-05-12 | 2013-08-20 | Blue Jeans Network, Inc. | Systems and methods for scalable distributed global infrastructure for real-time multimedia communication |
US8482593B2 (en) * | 2010-05-12 | 2013-07-09 | Blue Jeans Network, Inc. | Systems and methods for scalable composition of media streams for real-time multimedia communication |
US8875031B2 (en) | 2010-05-12 | 2014-10-28 | Blue Jeans Network, Inc. | Systems and methods for shared multimedia experiences in virtual videoconference rooms |
US8885013B2 (en) | 2010-05-12 | 2014-11-11 | Blue Jeans Network, Inc. | Systems and methods for novel interactions with participants in videoconference meetings |
US20110279635A1 (en) * | 2010-05-12 | 2011-11-17 | Alagu Periyannan | Systems and methods for scalable composition of media streams for real-time multimedia communication |
US9035997B2 (en) * | 2010-05-12 | 2015-05-19 | Blue Jeans Network | Systems and methods for real-time multimedia communications across multiple standards and proprietary devices |
US9041765B2 (en) | 2010-05-12 | 2015-05-26 | Blue Jeans Network | Systems and methods for security and privacy controls for videoconferencing |
US20110279634A1 (en) * | 2010-05-12 | 2011-11-17 | Alagu Periyannan | Systems and methods for real-time multimedia communications across multiple standards and proprietary devices |
US9124757B2 (en) | 2010-10-04 | 2015-09-01 | Blue Jeans Networks, Inc. | Systems and methods for error resilient scheme for low latency H.264 video coding |
US9369673B2 (en) | 2011-05-11 | 2016-06-14 | Blue Jeans Network | Methods and systems for using a mobile device to join a video conference endpoint into a video conference |
US9300705B2 (en) | 2011-05-11 | 2016-03-29 | Blue Jeans Network | Methods and systems for interfacing heterogeneous endpoints and web-based media sources in a video conference |
US10284616B2 (en) * | 2011-06-16 | 2019-05-07 | Google Llc | Adjusting a media stream in a video communication system based on participant count |
US8832193B1 (en) * | 2011-06-16 | 2014-09-09 | Google Inc. | Adjusting a media stream in a video communication system |
US20140365620A1 (en) * | 2011-06-16 | 2014-12-11 | Google Inc. | Adjusting a media stream in a video communication system |
US20140247887A1 (en) * | 2011-12-28 | 2014-09-04 | Verizon Patent And Licensing Inc. | Just-in-time (JIT) encoding for streaming media content |
US9609340B2 (en) * | 2011-12-28 | 2017-03-28 | Verizon Patent And Licensing Inc. | Just-in-time (JIT) encoding for streaming media content |
US20140059629A1 (en) * | 2012-08-23 | 2014-02-27 | Electronics And Telecommunications Research Institute | Two-way broadcast service providing system and method including media transmission apparatus |
US10264131B2 (en) * | 2014-02-28 | 2019-04-16 | Ricoh Company, Ltd. | Transmission control system, transmission system, and method of transmission control |
US20160050392A1 (en) * | 2014-03-07 | 2016-02-18 | Shenzhen Seefaa Scitech Co., Ltd. | Method for live video chat |
US9531997B2 (en) * | 2014-03-07 | 2016-12-27 | Shenzhen Seefaa Scitech Co., Ltd. | Method for live video chat |
US20150256796A1 (en) * | 2014-03-07 | 2015-09-10 | Zhigang Ma | Device and method for live video chat |
US9219881B2 (en) * | 2014-03-07 | 2015-12-22 | Shenzhen Seefaa Scitech Co., Ltd. | Device and method for live video chat |
US11259180B2 (en) * | 2015-06-04 | 2022-02-22 | Vm-Robot, Inc. | Routing systems and methods |
US10375088B2 (en) * | 2015-06-04 | 2019-08-06 | Vm-Robot, Inc. | Routing systems and methods |
EP3329670A4 (en) * | 2015-07-28 | 2019-03-06 | Mersive Technologies, Inc. | Virtual video driver bridge system for multi-source collaboration within a web conferencing system |
US11489891B2 (en) | 2015-07-28 | 2022-11-01 | Mersive Technologies, Inc. | Virtual video driver bridge system for multi-source collaboration within a web conferencing system |
US10771508B2 (en) | 2016-01-19 | 2020-09-08 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
US11582269B2 (en) | 2016-01-19 | 2023-02-14 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
US10931725B2 (en) * | 2017-09-29 | 2021-02-23 | Apple Inc. | Multiway audio-video conferencing |
US11171938B2 (en) | 2018-12-21 | 2021-11-09 | Wells Fargo Bank, N.A. | Multi-layer user authentication with live interaction |
US11695746B2 (en) | 2018-12-21 | 2023-07-04 | Wells Fargo Bank, N.A. | Multi-layer user authentication with live interaction |
Also Published As
Publication number | Publication date |
---|---|
US20060047750A1 (en) | 2006-03-02 |
US7043528B2 (en) | 2006-05-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7043528B2 (en) | Systems and methods for connecting video conferencing to a distributed network | |
US6944136B2 (en) | Two-way audio/video conferencing system | |
US7171485B2 (en) | Broadband network system configured to transport audio or video at the transport layer, and associated method | |
US7124195B2 (en) | Broadband network system configured to transport audio or video at the transport layer, and associated method | |
US9055312B2 (en) | System and method for interactive synchronized video watching | |
US20050283813A1 (en) | Systems and methods for recording signals from communication devices as messages and making the messages available for later access by other communication devices | |
US9288442B2 (en) | Multicasting a videoconference recording to a plurality of clients | |
US7773581B2 (en) | Method and apparatus for conferencing with bandwidth control | |
EP1578129A1 (en) | Method and apparatus for conferencing with stream selectivity | |
US20040119814A1 (en) | Video conferencing system and method | |
US20030074554A1 (en) | Broadband interface unit and associated method | |
JP2007507190A (en) | Conference system | |
JP2008022552A (en) | Conference method and conference system |
US20040170159A1 (en) | Digital audio and/or video streaming system | |
CN101116306A (en) | On-demand multi-channel streaming sessions over packet-switched networks | |
CN107493453A (en) | System and method for meshed end-to-end video conferencing |
JP4741325B2 (en) | Multipoint conference method and multipoint conference system | |
US20020194606A1 (en) | System and method of communication between videoconferencing systems and computer systems | |
US20100020156A1 (en) | Method and device for simultaneous multipoint distributing of video, voice and data | |
Bassbouss et al. | Streamlining WebRTC and DASH for near-real-time media delivery | |
Willebeek-LeMair et al. | Centralized versus distributed schemes for videoconferencing | |
JP5239756B2 (en) | Media synchronization method for video sharing | |
KR20070018269A (en) | Multi-point video conference controller, and video conference service expansion system and method using the same |
Schulzrinne | Internet media-on-demand: The real-time streaming protocol | |
Cricri et al. | Mobile and Interactive Social Television—A Virtual TV Room |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: STAR-BAK COMMUNICATIONS INC., OHIO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHMITT, MATTHEW R.;REEL/FRAME:012291/0146 Effective date: 20011015 |
|
AS | Assignment |
Owner name: STAR-BAK COMMUNICATION, INC., OHIO Free format text: (1) DECLARATION OF JOHN M. JAMAIL; (2) ASSIGNMENT BY OPERATION OF LAW; (3) NON-COMPETITION AND NON-DISCLOSURE AGREEMENT OF AN AT-WILL EMPLOYEE OF NICHOLAS A. POOLOS;ASSIGNOR:POOLOS, NICHOLAS A.;REEL/FRAME:012385/0944 Effective date: 20000116 |
|
AS | Assignment |
Owner name: SILICON VALLEY BANK DBA SILICON VALLEY EAST, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STARBAK COMMNUNICATIONS, INC.;REEL/FRAME:014218/0125 Effective date: 20031212 |
|
AS | Assignment |
Owner name: STARBAK COMMUNICATIONS, INC., DELAWARE Free format text: CHANGE OF NAME;ASSIGNOR:STAR-BAK, INC.;REEL/FRAME:016799/0300 Effective date: 20000914 |
|
AS | Assignment |
Owner name: SILICON VALLEY BANK, CALIFORNIA Free format text: SECURITY AGREEMENT;ASSIGNOR:STARBACK COMMUNICATIONS, INC.;REEL/FRAME:018194/0182 Effective date: 20060818 |
|
AS | Assignment |
Owner name: SILICON VALLEY BANK, CALIFORNIA Free format text: SECURITY AGREEMENT;ASSIGNOR:STARBAK COMMUNICATIONS, INC.;REEL/FRAME:018194/0667 Effective date: 20060818
Owner name: GOLD HILL VENTURE LENDING 03, L.P., CALIFORNIA Free format text: SECURITY AGREEMENT;ASSIGNOR:STARBAK COMMUNICATIONS, INC.;REEL/FRAME:018194/0667 Effective date: 20060818
Owner name: SILICON VALLEY BANK, CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE STARBACK COMMUNICATIONS, INC. PREVIOUSLY RECORDED ON REEL 018194 FRAME 0182;ASSIGNOR:STARBAK COMMUNICATIONS, INC.;REEL/FRAME:018207/0963 Effective date: 20060818
Owner name: SILICON VALLEY BANK, CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE STARBACK COMMUNICATIONS, INC. PREVIOUSLY RECORDED ON REEL 018194 FRAME 0182. ASSIGNOR(S) HEREBY CONFIRMS THE STARBAK COMMUNICATIONS, INC.;ASSIGNOR:STARBAK COMMUNICATIONS, INC.;REEL/FRAME:018207/0963 Effective date: 20060818 |
|
AS | Assignment |
Owner name: GOLD HILL VENTURE LENDING 03, L.P., CALIFORNIA Free format text: SECURITY AGREEMENT;ASSIGNOR:GULFSTREAM MEDIA CORPORATION;REEL/FRAME:019140/0679 Effective date: 20070315
Owner name: SILICON VALLEY BANK, AS AGENT, CALIFORNIA Free format text: SECURITY AGREEMENT;ASSIGNOR:GULFSTREAM MEDIA CORPORATION;REEL/FRAME:019140/0679 Effective date: 20070315 |
|
AS | Assignment |
Owner name: GULFSTREAM MEDIA CORPORATION, MASSACHUSETTS Free format text: AFFIDAVIT REGARDING LOAN DEFAULT AND TRANSFER OF INTELLECTUAL PROPERTY;ASSIGNORS:SILICON VALLEY BANK;GOLD HILL VENTURE LENDING 03, L.P.;REEL/FRAME:019353/0835 Effective date: 20070315 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |