
WO2018178748A1 - Terminal-to-mobile-device system, wherein a terminal is controlled via a mobile device, and method for remote control of a terminal - Google Patents

Terminal-to-mobile-device system, wherein a terminal is controlled via a mobile device, and method for remote control of a terminal

Info

Publication number
WO2018178748A1
WO2018178748A1 (PCT/IB2017/052026)
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
mobile device
data
video
video data
Prior art date
Application number
PCT/IB2017/052026
Other languages
English (en)
Inventor
Valeriy Vitalievich DUBOV
Original Assignee
Dvr Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dvr Llc filed Critical Dvr Llc
Publication of WO2018178748A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N 21/2343 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N 19/103 Selection of coding mode or of prediction mode
    • H04N 19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/266 Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N 21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/426 Internal components of the client; Characteristics thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N 21/637 Control signals issued by the client directed to the server or network components
    • H04N 21/6377 Control signals issued by the client directed to the server or network components directed to server
    • H04N 21/6379 Control signals issued by the client directed to the server or network components directed to server directed to encoder, e.g. for requesting a lower encoding rate

Definitions

  • This invention relates to terminal-to-mobile-device systems, wherein a terminal is controlled by a mobile device, and to the field of creating virtual reality.
  • More particularly, this invention relates to data transfer from a computing device to a mobile device.
  • VR stands for virtual reality.
  • US 8,965,460 discloses a mobile communication system and intelligent electronic glasses that augment reality based on a network used by a mobile device.
  • This mobile communication system is based on digital content including images and video that may be displayed using a plurality of mobile devices, smartphones, tablet computers, stationary computers, intelligent electronic glasses, smart glasses, headsets, watches, smart devices, vehicles and servers.
  • Digital content can be acquired continuously by input and output devices and displayed in a social network. Images can have additional properties including types of voice, audio, data and other information. This content can be displayed and acted upon in an augmented reality or virtual reality system.
  • The image-based network system may have the ability to learn and form intelligent associations between aspects, people and other entities, and between images and the associated data relating to both animate and inanimate entities, for intelligent image-based communication in a network.
  • The disadvantage of this technology is that the server operation is not defined and the method of video stream transfer is not specified.
  • US 9,274,340 discloses soft head-mounted display goggles for use with mobile computing devices, comprising a main body made entirely of a soft and compressible material.
  • The main body has a retention pocket configured to accept and secure the mobile computing device.
  • A lens assembly in the goggles comprises two lenses configured to focus vision on respective areas of the display screen of the mobile computing device, which is divided into two images.
  • The lens assembly is held within one or more apertures formed in the main one-piece body; the two lenses are mounted independently of each other, so that a split-screen image can be viewed through the two lenses on the display screen.
  • This document describes a mechanism for determining the location of the user and his/her head rotations and for synchronizing this data with the server.
  • However, the data transfer speed in the disclosed technology is low, and the 3D image is formed in the mobile device, which makes the system operation slow.
  • Also known is the Riftcat application, which functions according to the following principle: computer games are launched on a computer and transmitted to a mobile phone connected to a common network with the computer. At the same time, the mobile phone can deliver information to the game from some of its own sensors, for example position sensors, which allows the game to react to such information, for example to the user's head rotations and tilting.
  • The application comes with an embedded game library, so such games can be downloaded and tested without resorting to external sources.
  • A disadvantage of the Riftcat application is that it is available only to users of virtual reality games and headsets based on the Oculus SDK or HTC Vive SDK.
  • Another disadvantage of this application is that the network data transfer is slow and unreliable, which results in a low-quality image.
  • Also known is the Trinus VR software package, which makes it possible to transfer streaming video from a personal computer (PC) to an Android-based device, taking into account the position of the Android-based device in space.
  • The disadvantage of this package is that the system operation speed is low.
  • The object of this invention is to create a universal system having a high rate of interoperation between a computing device and a mobile device.
  • To this end, a terminal-to-mobile-device system is proposed, comprising a terminal controlled by a mobile device, the system further comprising a computing device that implements the functions of the terminal and comprises a computing device processor, a random access memory, a read-only memory, a video adapter, computing device display means and computing device data exchange means.
  • The mobile device comprises a mobile device processor, a decoding module, mobile device display means and control means configured to be manipulated by a user, the mobile device being configured to exchange data with the computing device data exchange means.
  • The computing device is configured to transfer video data from the video adapter to the computing device data exchange means after the video adapter sends a signal to the computing device processor informing it about the readiness of the video data and before sending the video data from the video adapter to the computing device display means.
  • The computing device data exchange means are configured to code the video data by using a coding parameter characterized by a number of frames to which the processed frame refers, and to transfer the coded video data to the mobile device.
  • The mobile device is configured to decode, by means of the decoding module, the video data obtained from the computing device, and to display, by means of the mobile device display means, the decoded video data in a virtual reality format.
  • The mobile device is configured to generate control signals based on signals from the control means and to send the generated control signals to the computing device.
  • The decoding module is configured to use a minimum possible number of buffers and to decode without synchronization with the frame frequency; the decoding module is further configured to use modes designed for a larger resolution or a larger frame frequency than those of the actual video stream.
  • The proposed system provides a high speed of transfer of video data and commands between the computing device and the mobile device. Additionally, it constitutes a universal system that makes it possible to use on a mobile device any application that requires high-speed data transfer and large processing resources.
  • The computing device is configured to transfer video data from the video adapter to the data exchange means after the video adapter sends a signal to the computing device processor informing it about the readiness of the video data and before the video data is sent from the video adapter to the computing device's display means.
  • The decoding module is configured to use a minimum possible number of buffers, with no synchronization with the frame frequency when decoding; furthermore, the decoding module is configured to use modes designed for a higher resolution or frame frequency than those of the actual video stream.
  • The data exchange means of the computing device are configured to establish a wireless and/or wired connection to the mobile device.
  • The computing device is configured to transfer audio data to the mobile device, and the mobile device additionally contains an audio data player, while the data exchange means of the computing device are configured to transfer the audio data to the mobile device together with the video data and/or separately from the video data.
  • The data exchange means of the computing device are configured to transfer encrypted video and/or audio data.
  • The data exchange means of the computing device are configured to ensure connection to the mobile device via a local area network and/or the Internet.
  • The computing device may be remote in relation to the mobile device, in which case the data exchange means of the computing device are configured to exchange data with the mobile device via the Internet.
  • The mobile device is one of the following devices: a mobile phone, a smartphone, a PDA, a navigator, a handheld PC, a laptop, a tablet PC or a portable gaming device.
  • These embodiments improve the functionality of the proposed system, contributing to the universal nature of the proposed invention.
  • Furthermore, a method is proposed for remote control of a terminal by a user, the method comprising: generating video data by the video adapter of the computing device; transferring the video data from the video adapter to the data exchange means after the video adapter sends a signal to the processor of the computing device informing it of the readiness of the video data and before the video data is sent from the video adapter to the display means of the computing device; coding the video data using a coding parameter characterized by a certain number of frames to which the processed frame refers; transferring the coded video data to the mobile device; decoding the video data obtained from the computing device using a decoding parameter characterized by a lower number of frames to which the processed frame refers as compared to the coding parameter; transforming the video data in the mobile device into a virtual reality format; displaying the transformed video data by the mobile device display means; registering control commands from the user by the control means of the mobile device; generating control signals in the mobile device based on said control commands; and sending the control signals from the mobile device to the computing device.
  • The method provides a technical effect in the form of a high speed of transfer of video data and commands between the computing device and the mobile device due to the fact that, in particular, the method transfers the video data from the video adapter to the data exchange means before the video data is sent from the video adapter to the display means of the computing device, and decodes the video data obtained from the computing device using a decoding parameter characterized by a lower number of frames to which the processed frame refers, as compared to the coding parameter.
  • In certain embodiments, the method comprises generating audio data in the computing device, wherein the audio data is transferred to the mobile device together with the video data or separately from it.
  • The video data and the audio data may be transferred to the mobile device in encrypted form.
  • The method may further comprise transforming the video and/or audio data generated in the computing device based on the control signals, and transferring the transformed video and/or audio data to the mobile device.
  • Computer software is also proposed, which is configured to implement the method for remote control of a terminal by a user.
  • The proposed system, method and computer software make it possible to minimize the latency of transfer of data and commands between the computing device and the mobile device and to use on the mobile device applications that require high-speed data transfer and substantial processing resources.
  • Fig. 1 illustrates the layout of the terminal-to-mobile-device system in accordance with one of the implementations of this invention.
  • Fig. 2 illustrates the method for remote terminal control by the user in accordance with one of the implementations of this invention.
  • The term "terminal" refers to any general or special-purpose computing device.
  • A "user" in the singular form means any number of users.
  • System 100 consists mainly of computing device 110 and mobile device 160.
  • In this implementation, the computing device 110 is a personal computer (PC) and the mobile device 160 is a mobile phone that can receive and transfer data through wired and wireless networks.
  • Other implementations can have any other types of computing devices (for example, personal computers, laptops, tablet PCs, game consoles) and mobile devices (for example, smartphones, PDAs, navigators, handheld PCs, laptops, tablet PCs, portable gaming devices).
  • The mobile device being used should preferably have a screen of a size that makes it possible to create virtual reality without causing discomfort to the user.
  • The computing device 110 used as a terminal includes a computing device processor 115, a random access memory 120, a read-only memory 125, a video adapter 130, computing device display means 135 and computing device data exchange means 140.
  • Data exchange means 140 are any wireless and/or wired data exchange means known to specialists and can ensure connection to the mobile device 160 via a local area network and/or the Internet.
  • Other exemplary implementations may use a computing device 110 with additional or other components; in particular, such a device may have no display means.
  • Mobile device 160 contains a mobile device processor 165, a decoding module 170, mobile device display means 175 and control means 180 configured to be manipulated by user 105.
  • The mobile device can exchange data with computing device 110 using data exchange means 140 of computing device 110.
  • In operation, computing device 110 transfers video data from video adapter 130 to data exchange means 140 after video adapter 130 sends a signal to processor 115 of the computing device informing it about the readiness of the video data, and before the video data is sent from video adapter 130 to display means 135 of the computing device or to any other relevant component of computing device 110 in the absence of display means 135.
  • This specific feature is one of the distinctions from the known systems, wherein video data is transferred from a computing device only after it is sent from the video adapter to the display means of the computing device; this makes it possible to considerably minimize the latency of data transfer from the computing device to the mobile device.
  • In particular, this may be implemented using the fact that, in the rendering process, the function by which an object will be shown on the display is determined for each object by a standard API, and the absolute address of that function, i.e. its address in memory, is set. Knowing where the library is located, the offset address of that function within the library is calculated. Thus, if the loading address of the function library for a particular application (a particular game), which becomes known only when the library is loaded on demand by the processor, and the previously calculated offset within that library are known, then the address of the rendering function for each object becomes known for each application.
  • The processor operation is then interrupted, and during the interruption the required functions in the loaded libraries are replaced with functions that make it possible to convert all objects to the format required to display them on a particular target device.
  • A special-purpose library is used that makes it possible to acquire a frame right after it is ready in the video adapter. After this procedure, the frame is coded and sent to the mobile device.
  • A connection of a special audio device is also emulated at the software level in order to transfer audio from the game to the mobile device.
  • Video data may relate, without limitation, to any executed computer program, films, television broadcasting, video clips, augmented reality videos, etc.
  • Prior to transferring video data to mobile device 160, data exchange means 140 code the video data using a compression standard with a coding parameter that sets a certain number of frames to which the currently processed frame refers, and then transfer the coded video data.
  • When receiving video data from data exchange means 140, mobile device 160 decodes the received video data using decoding module 170, which deploys a circular buffer in its operation and uses a decoding parameter that sets a smaller number of frames to which the currently processed frame refers, as compared to the number set by the above-said coding parameter.
  • This specific feature is also not found in the known systems and makes it possible to minimize the latency of data processing in the mobile device, since in the known systems the decoding parameter uses the same number of frames to which the currently processed frame refers as the coding parameter does; as a result, a large number of frames has to be processed by the decoding module for decoding in the mobile device.
  • Additionally, decoding module 170 can use the minimum possible number of buffers, with no synchronization with the frame frequency when decoding, and can use modes designed for a larger resolution or frame frequency than the actual video stream has (an illustrative decoder configuration is sketched below).
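  • The description does not name a specific decoder API. As an illustration only, the sketch below assumes the Android NDK AMediaCodec interface and shows how a hardware H.264 decoder could be configured with a declared resolution larger than that of the actual stream and how decoded buffers could be released for display immediately, without frame-rate synchronization. All names and values here are assumptions, not the actual implementation.

    // Hypothetical sketch: configuring an Android NDK video decoder for low latency.
    #include <android/native_window.h>
    #include <media/NdkMediaCodec.h>
    #include <media/NdkMediaFormat.h>

    AMediaCodec* createLowLatencyDecoder(ANativeWindow* surface) {
        AMediaCodec* codec = AMediaCodec_createDecoderByType("video/avc");
        AMediaFormat* fmt = AMediaFormat_new();
        AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
        // Declare a resolution larger than the real stream (assumption: a 4K mode
        // for a 1080p stream), as suggested in the description above.
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, 3840);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, 2160);
        AMediaCodec_configure(codec, fmt, surface, nullptr /*crypto*/, 0 /*flags*/);
        AMediaCodec_start(codec);
        AMediaFormat_delete(fmt);
        return codec;
    }

    // Each decoded buffer is released for rendering as soon as it is available,
    // without waiting for any frame-frequency synchronization.
    void drainOutput(AMediaCodec* codec) {
        AMediaCodecBufferInfo info;
        ssize_t idx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0 /*no wait*/);
        if (idx >= 0) {
            AMediaCodec_releaseOutputBuffer(codec, idx, true /*render now*/);
        }
    }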
  • The decoding module is configured to create a circular buffer.
  • For example, a mobile device may require 5 different slots for the video buffer of the decoding module; in this case, the buffer is set to use a maximum of 2 frames, but the full buffer still needs to be prepared. Therefore, when initializing this module, a circular buffer is created, i.e. an area in memory that is repeatedly overwritten when new data is received from the computer, as sketched below. Hence, it is possible not to allocate memory anew for new data, but to reuse the existing memory.
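  • A minimal sketch of such a circular buffer follows; the class name, fixed capacity and helper methods are illustrative assumptions rather than the actual implementation.

    // Illustrative circular (ring) buffer: a fixed memory area that is overwritten
    // as new packets arrive from the computer, so no per-frame allocations occur.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    class RingBuffer {
    public:
        explicit RingBuffer(size_t capacity) : buf_(capacity), write_(0) {}

        // Copies an incoming packet into the ring and returns the offset at which
        // it was stored; old data at that position is simply overwritten.
        size_t push(const uint8_t* data, size_t len) {
            if (write_ + len > buf_.size()) write_ = 0;   // wrap around
            size_t offset = write_;
            std::memcpy(buf_.data() + offset, data, len);
            write_ += len;
            return offset;
        }

        const uint8_t* at(size_t offset) const { return buf_.data() + offset; }

    private:
        std::vector<uint8_t> buf_;
        size_t write_;
    };

    // Usage with the frame-buffer size mentioned in the description (512 KB):
    //   RingBuffer rb(512 * 1024);
    //   size_t off = rb.push(packetData, packetLen);
    //   decoderQueue.post(off, packetLen);   // hypothetical notification to the decoder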
  • Additionally, a texture having a format appropriate for the coder may be prepared in advance in the memory of the video adapter so that, at the coding stage, there is no need to copy data from system memory to video memory, since such copying takes a long time. The coder may then be given a reference to this prepared area in memory for coding; a sketch of such preparation is given below.
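  • As an illustration, the sketch below prepares such a texture with Direct3D 11 in the NV12 format commonly accepted by hardware coders; the exact format, usage and bind flags are assumptions and not taken from the actual implementation.

    // Illustrative preparation of a GPU-resident NV12 texture that a hardware coder
    // can reference directly, avoiding a copy from system memory to video memory.
    #include <d3d11.h>

    ID3D11Texture2D* createEncoderInputTexture(ID3D11Device* device,
                                               UINT width, UINT height) {
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width = width;
        desc.Height = height;
        desc.MipLevels = 1;
        desc.ArraySize = 1;
        desc.Format = DXGI_FORMAT_NV12;          // format assumed to suit the coder
        desc.SampleDesc.Count = 1;
        desc.Usage = D3D11_USAGE_DEFAULT;        // lives in video memory
        desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

        ID3D11Texture2D* tex = nullptr;
        HRESULT hr = device->CreateTexture2D(&desc, nullptr, &tex);
        return SUCCEEDED(hr) ? tex : nullptr;    // the coder is later given this reference
    }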
  • Decoded video data is displayed in the virtual reality format using display means 175 of mobile device 160.
  • In particular, the virtual reality format can be created by dividing the display into two areas which, when viewed by the user with or without a virtual reality headset, create a virtual reality effect.
  • Alternatively, any other known virtual reality formats, in which the image based on the video data is otherwise transformed for display, can be used.
  • Mobile device 160 can send to computing device 110 control signals generated in mobile device 160 based on signals from control means 180, for example from a gyroscope or an accelerometer available in mobile device 160, as well as from any known manipulators, joysticks and other devices connected to the mobile phone that allow the user to control or interact in any way with the video data displayed to him/her on display means 175 of mobile device 160. Therefore, computing device 110 is controlled by mobile device 160 in terminal-to-mobile-device system 100. Based on said control signals, computing device 110 can additionally transform the video data generated by it and transfer the transformed video data to mobile device 160.
  • Computing device 110 can transfer to mobile device 160 not only video data but also audio data, which is played by mobile device 160 using an audio data player, while data exchange means 140 of computing device 110 transfer the audio data to mobile device 160 together with the video data and/or separately from the video data. Accordingly, based on the control signals, computing device 110 can additionally transform the audio data generated by it and transfer the transformed video and audio data to mobile device 160.
  • Data exchange means 140 of computing device 110 can transfer encrypted video and/or audio data.
  • Computing device 110 may be physically remote in relation to mobile device 160, in which case data exchange means 140 exchange data with mobile device 160 via a local area network or the Internet.
  • Said terminal-to-mobile-device system 100 makes it possible to transfer data and commands between the computing device and the mobile device with a latency of about 3 ms, which allows the mobile device to use applications executed on the computing device that cannot run on the mobile device itself due to the specific features of the mobile device hardware.
  • According to the method illustrated in Fig. 2, video data is generated by the video adapter of the computing device at stage 210; then, at stage 220, the video data is transferred from the video adapter to the data exchange means after the video adapter sends a signal to the processor of the computing device informing it about the readiness of the video data, and before the video data is sent from the video adapter to the display means of the computing device.
  • Next, the video data is coded using a coding parameter characterized by a certain number of frames to which the processed frame refers (stage 230), and the coded video data is transferred to the mobile device (stage 240).
  • Then stage 250 is executed, in which the video data obtained from the computing device is decoded using a decoding parameter characterized by a lower number of frames to which the processed frame refers as compared to the coding parameter; then stage 260, in which the video data is transformed in the mobile device into the virtual reality format; and then stage 270, in which the transformed video data is displayed by the display means of the mobile device.
  • Control commands from the user are registered by the control means of the mobile device; at stage 285, control signals are generated based on said control commands; and at stage 290, the control signals are sent from the mobile device to the computing device.
  • Thus, the user of the mobile device can remotely control the computing device.
  • The method described above relies on transferring the video data from the video adapter to the data exchange means before the video data is sent from the video adapter to the display means of the computing device, and on decoding the video data obtained from the computing device using a decoding parameter characterized by a lower number of frames to which the processed frame refers, as compared to the coding parameter.
  • Additionally, audio data can be generated on the computing device and transferred to the mobile device together with the video data or separately from it, in non-encrypted or encrypted form.
  • The video and/or audio data generated on the computing device may additionally be transformed based on control signals from the mobile device, and the transformed video and/or audio data is then transferred to the mobile device.
  • The method disclosed above may be implemented, in particular, using computer software that executes the stages of the method when used in combination with the required hardware.
  • Windows-based PC games can be played on Android-based and iOS-based mobile phones.
  • Streaming is registered in the network using Multicast DNS as a VrStreaming tcp service.
  • The protocol is based on Bonjour designed by Apple; in this case the following libraries are used: jmDns for Android - http://jmdns.sourceforge.net/
  • TinySvcMDNS for PC - https://bitbucket.org/geekman/tinysvcmdns
  • Input devices are currently processed on Android only.
  • Connection to input devices through Bluetooth is standardized and handled by the Android system via dispatchKeyEvent.
  • Android reports an input device status change event, including the device type, event code, entry name and value. If it is a system event (volume, home button or back button), it is passed on to the system; other events are forwarded to the streaming module, for example:
  • appStreamer.motionEvent(2, keyCode, 0);
  • appStreamer.motionEvent(2, keyCode, 1);
  • The processed data is transferred to the module responsible for connection to the game.
  • The list of games is provided in JSON format.
  • The game launch method returns a status after the game has been launched and the integration module has been initialized.
  • This library makes it possible to control data integrity during transfer over UDP and to decide which packets are important and which are not.
  • It is used instead of RTSP for configuration and video transfer.
  • The size of a network packet is 1,292 bytes.
  • A buffer for a single frame is accordingly limited to 512 KB, or approximately 406 packets.
  • The H.264 and H.265 compression standards are used for data coding, in which video is split into NAL units that can include video configuration, key frames and data on image changes. Low latency of operation is achieved by using reliable delivery for the data on image changes; an illustrative sketch of this is given below.
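  • The description does not name the UDP library, but the code fragments that follow refer to enet. The sketch below therefore assumes the ENet API and shows how important NAL units (configuration, key frames, image-change data that must arrive) could be sent with the reliable flag, while droppable packets could be sent without it. The per-packet flag choice and the channel layout are assumptions.

    // Hypothetical sketch using the ENet library: important NAL units are sent
    // reliably; packets that may be dropped are sent unreliably. Payloads would
    // already be split to fit the 1,292-byte network packet size mentioned above.
    #include <enet/enet.h>

    void sendNalUnit(ENetPeer* peer, const uint8_t* nal, size_t len, bool important) {
        enet_uint32 flags = important ? ENET_PACKET_FLAG_RELIABLE : 0;
        ENetPacket* packet = enet_packet_create(nal, len, flags);
        // Channel 0 is assumed to carry video; the real channel layout is not disclosed.
        enet_peer_send(peer, 0, packet);
    }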
  • A circular buffer is created, i.e. an area in memory that is repeatedly overwritten when new data is received from the computer.
  • One and the same thread deals with data receipt and transfer in the mobile phone: first, it checks whether any module has sent new data, for example a new angle of rotation, a mouse move or a command from a game controller; second, it checks whether any data has been received from the computer.
  • Each packet from the mobile phone and from the computer is marked with its own flag, by which the code determines what to do with such data. If data with a flag denoting audio has been received, the audio is decoded immediately in the same thread, since this is a fast operation of about 2 ms, whereas starting a new thread for such a task would require extra synchronization and context switches on mobile phones with few cores.
  • The decoder is sent a message that a new frame for decoding and display is saved at a particular address in the circular buffer (a fuller sketch of this dispatch is given below): if (enet.readPacketBuffered(cb, channel, 0) > 0) {
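  • The fragment above is incomplete; as an illustration only, the sketch below shows how a single receive/send thread might dispatch packets by their flag byte, decoding audio inline and merely notifying the video decoder of the new frame's position in the circular buffer. The flag values, packet layout and helper types are assumptions.

    // Hypothetical single-thread dispatch on the mobile side.
    // Assumed packet layout: [1-byte flag][payload]; flag values are illustrative.
    #include <cstddef>
    #include <cstdint>

    // Minimal stand-ins for the real decoder modules (assumptions).
    struct AudioDecoder { void decodeAndPlay(const uint8_t*, size_t) {} };
    struct VideoDecoder {
        void notifyFrameAt(size_t, size_t) {}
        void applyConfiguration(const uint8_t*, size_t) {}
    };
    // RingBuffer: the ring-buffer structure sketched earlier in this description.

    enum PacketFlag : uint8_t { FLAG_VIDEO = 1, FLAG_AUDIO = 2, FLAG_CONFIG = 3 };

    void handlePacket(const uint8_t* pkt, size_t len,
                      RingBuffer& ring, AudioDecoder& audio, VideoDecoder& video) {
        const uint8_t flag = pkt[0];
        const uint8_t* payload = pkt + 1;
        const size_t payloadLen = len - 1;

        switch (flag) {
        case FLAG_AUDIO:
            // ~2 ms of work, so it is done inline in the same thread.
            audio.decodeAndPlay(payload, payloadLen);
            break;
        case FLAG_VIDEO: {
            // Store in the circular buffer and tell the decoder where the frame is.
            size_t offset = ring.push(payload, payloadLen);
            video.notifyFrameAt(offset, payloadLen);
            break;
        }
        case FLAG_CONFIG:
            video.applyConfiguration(payload, payloadLen);
            break;
        }
    }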
  • This module operates based on the class dealing with analysis of NAL units.
  • The system is queried for a list of available decoders.
  • The Opus library is used for audio coding, as it is a library specifically optimized for transferring real-time audio over a network. The PC application is described below.
  • API server: used by the mobile client and the SDK. All logic is processed here.
  • The game is fully handled through this class, with requests primarily passed directly to the game, except for cases when the game has not been initialized yet.
  • An SDK is used.
  • The data stream from the mobile phone is encrypted and reaches this server, where it is decompressed and transferred via a UDP loopback device to the game that uses the SDK.
  • Video capture: there are many methods to acquire a video frame ready for display. References to the BackBuffer in DirectX 9 and DirectX 11 are obtained, and the Present method in Windows is called, which reports to the operating system that the frame render is completed, after which the driver reads this BackBuffer and draws the image on the monitor display. Two frame-ready events are reported: one, as usual, to Windows, and the other to the system's own library.
  • The server application has an API for this purpose: http://127.0.0.1:47000/dxHooks x64
  • This auxiliary application creates a hidden window for DirectX rendering, of the same kind as the game itself uses.
  • A virtual address table (vtable) is read and, guided by the official API, a function number is used: for example, 8 for the Present function (DXGI_Swapchain_present).
  • The function is obtained by this number and its absolute address is found in memory. Then its offset address is calculated by subtracting the library loading address from the absolute address, and this offset address is saved for further transfer via the API.
  • Knowing these offset addresses and having a function for locating the dxgi.dll or d3d.dll library within the real game, the two addresses may be summed, and the place where a particular game starts to show an image can thus be determined; an illustrative sketch of this offset calculation is given below.
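  • The following sketch illustrates the offset calculation for DirectX 11: a hidden device and swap chain are created, entry 8 of the swap chain's virtual function table (Present) is read, and the offset inside dxgi.dll is obtained by subtracting the module's loading address. Error handling is omitted and the exact procedure of the real implementation is not disclosed, so this is only an assumption-based illustration.

    // Illustrative calculation of the Present() offset inside dxgi.dll (DirectX 11).
    #include <d3d11.h>
    #include <dxgi.h>
    #include <windows.h>
    #include <cstdint>

    uintptr_t presentOffsetInDxgi(HWND hiddenWindow) {
        DXGI_SWAP_CHAIN_DESC sd = {};
        sd.BufferCount = 1;
        sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        sd.OutputWindow = hiddenWindow;
        sd.SampleDesc.Count = 1;
        sd.Windowed = TRUE;

        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* ctx = nullptr;
        IDXGISwapChain* swapChain = nullptr;
        D3D11CreateDeviceAndSwapChain(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                      nullptr, 0, D3D11_SDK_VERSION,
                                      &sd, &swapChain, &device, nullptr, &ctx);

        // The vtable is the first pointer-sized field of the COM object;
        // per the official interface layout, Present is entry number 8.
        void** vtable = *reinterpret_cast<void***>(swapChain);
        uintptr_t presentAbsolute = reinterpret_cast<uintptr_t>(vtable[8]);

        // Offset = absolute address minus the loading address of dxgi.dll.
        uintptr_t moduleBase = reinterpret_cast<uintptr_t>(GetModuleHandleA("dxgi.dll"));
        uintptr_t offset = presentAbsolute - moduleBase;

        swapChain->Release();
        ctx->Release();
        device->Release();
        return offset;   // saved, then added to dxgi.dll's base inside the real game
    }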
  • The DXVA2 video processor is used for DirectX 9:
  • HRESULT hr = m_pDXVA2VideoProcessor->VideoProcessBlt(pNV12Dst, &vpblt, &vs, 1, NULL);
  • ID3D11VideoDevice and ID3D11VideoContext are used for DirectX 11:
  • HRESULT hr = d11VideoContext->VideoProcessorBlt(d11VideoProcessor, ...
  • A reference to the prepared area in memory may be given for coding.
  • This data is transferred to the queue for sending to the mobile phone.
  • The bitrate is over 20,000 kbps.
  • A COM object in CoreAudio is used, which allows creating an audio device and activating it in the game.
  • Raw audio from the game reaches this virtual device.
  • Audio coding: the coder settings are optimized for 10 ms of audio latency, for example:
  • const float* inpData = (const float*)data;
  • int len = alphabet_encode(encoder, inpData, samples, output, maxPacket);
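  • The encode call above is shown as in the original. As an illustration only, the sketch below shows how an Opus encoder could be configured for 10 ms frames in a low-delay mode; the sample rate, channel count, bitrate and the restricted-low-delay application are assumptions, and the original alphabet_encode call presumably wraps an encode call of this kind.

    // Hypothetical Opus encoder setup tuned for 10 ms of audio latency.
    #include <opus.h>

    OpusEncoder* createLowLatencyEncoder() {
        int err = 0;
        // 48 kHz stereo; OPUS_APPLICATION_RESTRICTED_LOWDELAY minimizes algorithmic delay.
        OpusEncoder* enc = opus_encoder_create(48000, 2,
                                               OPUS_APPLICATION_RESTRICTED_LOWDELAY, &err);
        if (err != OPUS_OK) return nullptr;
        opus_encoder_ctl(enc, OPUS_SET_BITRATE(128000));   // illustrative bitrate
        return enc;
    }

    // One 10 ms frame at 48 kHz is 480 samples per channel.
    int encodeFrame(OpusEncoder* enc, const float* interleavedPcm,
                    unsigned char* out, int maxBytes) {
        return opus_encode_float(enc, interleavedPcm, 480, out, maxBytes);
    }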
  • Windows messages are used to deliver keyboard and mouse input.
  • This is one of the modules most sensitive to speed.
  • This is the unit that deals with connection to other modules, with the procedure for launching and configuring such modules, and with data transfer between the mobile phone, the game and the API. There is a system of settings that determines with what parameters and in what order the other modules should be launched.
  • Audio is also captured, using CoreAudio; one possible capture approach is sketched below.
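  • The implementation above obtains audio through a virtual device created via a CoreAudio COM object; that driver-level mechanism is not reproduced here. As a related illustration only, the sketch below uses standard WASAPI loopback capture, which also obtains the rendered system audio through the Core Audio APIs; it is an assumption-based alternative, not the patented mechanism.

    // Illustrative WASAPI loopback capture of the audio being rendered on the PC.
    // The captured PCM would then be queued for the Opus coder described above.
    #include <windows.h>
    #include <mmdeviceapi.h>
    #include <audioclient.h>

    void captureSystemAudioOnce() {
        CoInitializeEx(nullptr, COINIT_MULTITHREADED);

        IMMDeviceEnumerator* enumerator = nullptr;
        CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr, CLSCTX_ALL,
                         __uuidof(IMMDeviceEnumerator), (void**)&enumerator);

        IMMDevice* device = nullptr;
        enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);

        IAudioClient* client = nullptr;
        device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, nullptr, (void**)&client);

        WAVEFORMATEX* fmt = nullptr;
        client->GetMixFormat(&fmt);
        // 10 ms buffer (in 100-ns units), matching the coder latency target above.
        client->Initialize(AUDCLNT_SHAREMODE_SHARED, AUDCLNT_STREAMFLAGS_LOOPBACK,
                           10 * 10000, 0, fmt, nullptr);

        IAudioCaptureClient* capture = nullptr;
        client->GetService(__uuidof(IAudioCaptureClient), (void**)&capture);
        client->Start();

        UINT32 packetFrames = 0;
        capture->GetNextPacketSize(&packetFrames);
        while (packetFrames > 0) {
            BYTE* data = nullptr; UINT32 frames = 0; DWORD flags = 0;
            capture->GetBuffer(&data, &frames, &flags, nullptr, nullptr);
            // ... hand `frames` samples of `data` to the encoder queue here ...
            capture->ReleaseBuffer(frames);
            capture->GetNextPacketSize(&packetFrames);
        }

        client->Stop();
        capture->Release(); client->Release(); device->Release(); enumerator->Release();
        CoTaskMemFree(fmt);
        CoUninitialize();
    }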
  • The thread that will receive commands from and send data to the mobile phone is launched. First, it waits indefinitely for the mobile phone to connect. As soon as the mobile phone connects and sends a command to start streaming, the system reads the recommended quality settings and launches the coder with the appropriate parameters: void enetThread() {
  • The EnetWork() function determines whether there has been a message from the mobile phone (for example, a new angle of rotation or a command to move the mouse) or whether there is video/audio data to be sent to the mobile phone; an illustrative sketch of such a loop is given below.
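  • The real EnetWork() implementation is not disclosed; assuming the ENet library is used, the sketch below only illustrates how such a loop could alternate between polling incoming commands and flushing queued outgoing video/audio packets. The helper names and queue structure are assumptions.

    // Hypothetical service loop on the PC side (assumes the ENet library).
    #include <enet/enet.h>
    #include <cstdint>
    #include <deque>
    #include <vector>

    struct Outgoing { std::vector<uint8_t> bytes; bool reliable; };

    void enetWorkOnce(ENetHost* host, ENetPeer* phone, std::deque<Outgoing>& sendQueue) {
        // 1. Poll for messages from the mobile phone (rotation angles, mouse moves, ...).
        ENetEvent event;
        while (enet_host_service(host, &event, 0 /*do not block*/) > 0) {
            if (event.type == ENET_EVENT_TYPE_RECEIVE) {
                // handleCommand() is a hypothetical dispatcher for phone commands.
                // handleCommand(event.packet->data, event.packet->dataLength);
                enet_packet_destroy(event.packet);
            }
        }

        // 2. Send any queued video/audio data to the phone.
        while (!sendQueue.empty()) {
            const Outgoing& out = sendQueue.front();
            ENetPacket* pkt = enet_packet_create(out.bytes.data(), out.bytes.size(),
                                                 out.reliable ? ENET_PACKET_FLAG_RELIABLE : 0);
            enet_peer_send(phone, 0, pkt);
            sendQueue.pop_front();
        }
        enet_host_flush(host);   // push packets onto the wire immediately
    }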
  • This component deals with forming a stereo image directly on the PC.
  • The overlay also includes a displayed copyright notice and text description in a visible area on the monitor display, but it does not transmit them to the mobile client.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates generally to methods and devices for server administration. A terminal-to-mobile-device system (100) comprises a terminal controlled by a mobile device (160), the system (100) further comprising: a computing device (110) comprising a computing device processor (115), a random access memory (120), a read-only memory (125), a video adapter (130), computing device display means (135) and computing device data exchange means (140), and implementing the functions of the terminal. The mobile device (160) comprises a mobile device processor (165), a decoding module (170), mobile device display means (175) and control means (180) configured to be manipulated by a user (105), the mobile device (160) being configured to exchange data with the computing device data exchange means (140). The computing device (110) is configured to transfer video data from the video adapter (130) to the computing device data exchange means (140) after the video adapter (130) has sent a signal to the computing device processor (115) informing it of the readiness of the video data, and before sending the video data from the video adapter (130) to the computing device display means (135). The computing device data exchange means (140) are configured to code the video data using a coding parameter characterized by a number of frames to which the processed frame refers, and to transfer the coded video data to the mobile device (160). The mobile device (160) is configured to decode, by means of the decoding module (170), the video data obtained from the computing device (110), and to display, by means of the mobile device display means (175), the decoded video data in a virtual reality format. The mobile device (160) is configured to generate control signals based on signals from the control means (180) and to send the generated control signals to the computing device (110). The decoding module (170) is configured to use a minimum possible number of buffers and to decode without synchronization with the frame frequency; the decoding module (170) is further configured to use modes designed for a larger resolution or a larger frame frequency than those of an actual video stream. The invention also relates to a method for remote control of a terminal by a user using the proposed system.
PCT/IB2017/052026 2017-03-31 2017-04-07 Terminal-to-mobile-device system, wherein a terminal is controlled via a mobile device, and method for remote control of a terminal WO2018178748A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
RU2017110870 2017-03-31
RU2017110870A RU2017110870A (ru) 2017-03-31 2017-03-31 Terminal-to-mobile-device system, in which the terminal is controlled by means of a mobile device, and method for remote control of the terminal

Publications (1)

Publication Number Publication Date
WO2018178748A1 true WO2018178748A1 (fr) 2018-10-04

Family

ID=58800856

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/052026 WO2018178748A1 (fr) Terminal-to-mobile-device system, wherein a terminal is controlled via a mobile device, and method for remote control of a terminal

Country Status (2)

Country Link
RU (1) RU2017110870A (fr)
WO (1) WO2018178748A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117440209A (zh) * 2023-12-15 2024-01-23 Mudanjiang Normal University Implementation method and system based on a singing scene

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050141620A1 (en) * 2003-10-20 2005-06-30 Sony Corporation Decoding apparatus and decoding method
US8965460B1 (en) 2004-01-30 2015-02-24 Ip Holdings, Inc. Image and augmented reality based networks using mobile devices and intelligent electronic glasses
US20120201308A1 (en) * 2010-11-24 2012-08-09 Texas Instruments Incorporated Method for Low Memory Footprint Compressed Video Decoding
US20140173674A1 (en) * 2012-12-13 2014-06-19 Microsoft Corporation Server gpu assistance for mobile gpu applications
US9274340B2 (en) 2014-02-18 2016-03-01 Merge Labs, Inc. Soft head mounted display goggles for use with mobile computing devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FABRIZIO LAMBERTI ET AL: "A Streaming-Based Solution for Remote Visualization of 3D Graphics on Mobile Devices", IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 12, no. 2, 1 March 2007 (2007-03-01), pages 247 - 260, XP011157909, ISSN: 1077-2626 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117440209A (zh) * 2023-12-15 2024-01-23 Mudanjiang Normal University Implementation method and system based on a singing scene
CN117440209B (zh) * 2023-12-15 2024-03-01 Mudanjiang Normal University Implementation method and system based on a singing scene

Also Published As

Publication number Publication date
RU2017110870A3 (fr) 2018-10-03
RU2017110870A (ru) 2018-10-03

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17726680

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/01/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17726680

Country of ref document: EP

Kind code of ref document: A1
