
CN111563027A - Application operation monitoring method, device and system

Info

Publication number
CN111563027A
Authority
CN
China
Prior art keywords
target frame
frame picture
application
terminal
time
Prior art date
Legal status
Granted
Application number
CN202010360865.4A
Other languages
Chinese (zh)
Other versions
CN111563027B (en)
Inventor
牛长锋
逯海亮
李睿珩
Current Assignee
Beijing Shiboyun Information Technology Co ltd
Original Assignee
Beijing Shiboyun Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Shiboyun Information Technology Co ltd filed Critical Beijing Shiboyun Information Technology Co ltd
Priority to CN202010360865.4A
Publication of CN111563027A
Application granted
Publication of CN111563027B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3466: Performance evaluation by tracing or monitoring
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/3003: Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F 11/302: Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention provides a method, a device, and a system for monitoring the operation of an application. The method is applied to a monitoring server and comprises: acquiring the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of the monitored application, where one complete process cycle refers to the processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is the application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server, and delivered to the terminal; and analyzing the running state of the application according to the operation parameters of the designated sub-processes and outputting the analysis result. The method and device monitor the application's operation efficiently, make it convenient for staff to manage and check the running condition of the application and the user experience, and reduce the data processing load on the cloud rendering server.

Description

Application operation monitoring method, device and system
Technical Field
The invention relates to the technical field of cloud computing, in particular to a method, a device and a system for monitoring operation of an application.
Background
In applications such as cloud games and cloud VR, the application runs on a cloud rendering server: the server renders the scenes generated by the running application, captures the rendered pictures and the audio data the application produces, encodes them, and transmits them to a terminal, which decodes and presents them.
While such applications run, administrators need to know their running condition in time in order to maintain the systems that run them. In the prior art, the terminal display device uploads its running-state data to the cloud rendering server at regular intervals, the cloud rendering server records that data by generating log files, and administrators learn about the operation of the whole system by inspecting the log files.
Generating log files on the cloud rendering server has two drawbacks: it increases the load on the server when many applications are running, and, to understand the system's operating condition, an administrator must download the log files and search them entry by entry for the relevant data before making a judgment, which is inefficient.
Disclosure of Invention
In view of this, the present invention provides a method, an apparatus, and a system for monitoring the operation of an application, so as to monitor the application's running state efficiently and learn about the user experience in time.
Specifically, the invention is realized by the following technical scheme:
in a first aspect, an embodiment of the present invention provides an application operation monitoring method, where the method is applied to a monitoring server, and the method includes:
acquiring operation parameters of a designated sub-process in a complete process cycle of a target frame picture of a monitored application, wherein the complete process cycle refers to a processing process from the time when a terminal acquires first posture information of the terminal to the time when the terminal finishes displaying the received target frame picture, and the target frame picture is an application picture generated by a cloud rendering server according to the first posture information, is coded by the cloud rendering server and is transmitted to the terminal;
and analyzing the running state of the application according to the running parameters of the specified subprocess, and outputting the analysis result of the running state of the application.
In a second aspect, an embodiment of the present invention provides an operation monitoring system for an application, where the system includes: the system comprises a cloud rendering server, a terminal and a monitoring server;
the monitoring server is used for acquiring the operation parameters of the appointed sub-process in the complete process cycle of the target frame picture of the monitored application, wherein one complete process cycle refers to the processing process from the time when the terminal acquires the first posture information of the terminal to the time when the terminal finishes displaying the received target frame picture, and the target frame picture is an application picture generated by the cloud rendering server according to the first posture information, is coded by the cloud rendering server and is transmitted to the terminal;
the monitoring server is further used for analyzing the running state of the application according to the running parameters of the specified subprocess and outputting the analysis result of the running state of the application.
In a third aspect, an embodiment of the present invention provides an apparatus for monitoring an operation of an application, where the apparatus includes:
the acquisition module is used for acquiring the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of a monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is the application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server, and then delivered to the terminal;
and the analysis module is used for analyzing the running state of the application according to the running parameters of the specified subprocess and outputting an analysis result of the running state of the application.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present invention further provides a computer device, including a processor, a communication interface, a memory, and a communication bus, where the processor and the communication interface complete communication between the memory and the processor through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps as described in the first aspect when executing a program stored in the memory.
In the method, device, and system for monitoring the operation of an application provided by the embodiments of the present invention, the monitoring server acquires the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of the monitored application, where a complete process cycle covers the whole processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is the application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server, and delivered to the terminal. The monitoring server then analyzes the running condition of the application according to the operation parameters of those sub-processes and outputs the analysis result. Because a dedicated monitoring server monitors the application's running state centrally, analyzes it from the monitored operation parameters, and outputs the analysis result, the application's operation is monitored efficiently, staff can conveniently manage and check the running condition and the user experience, and the data processing load on the cloud rendering server is reduced.
Drawings
FIG. 1 is a diagram illustrating a complete process cycle for a frame of game play in accordance with an exemplary embodiment of the present invention;
FIG. 2 is a flow chart illustrating a method for monitoring the operation of an application in accordance with an exemplary embodiment of the present invention;
fig. 3 is a flowchart illustrating a method for monitoring the operation of a first application according to an exemplary embodiment of the present invention;
fig. 4 is a schematic view of a first scenario of an operation monitoring method for an application according to an exemplary embodiment of the present invention;
FIG. 5 is a time line schematic of a complete progress period of a frame of a game screen according to an exemplary embodiment of the present invention;
fig. 6 is a schematic view of a second scenario of an operation monitoring method for an application according to an exemplary embodiment of the present invention;
fig. 7 is a flowchart illustrating an operation monitoring method for a second application according to an exemplary embodiment of the present invention;
FIG. 8 is a schematic diagram of an operational monitoring device for an application according to an exemplary embodiment of the present invention;
fig. 9 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the present invention. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
Under the operation architecture of applications such as cloud games, cloud VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality), the application is deployed on a cloud rendering server. Taking a cloud game as an example, during operation the cloud rendering server receives, over the network, the operation instructions the user issues through the terminal to control the cloud game and passes them to the cloud game application; the application responds to the instructions and generates game data; the cloud rendering server renders that data into game pictures, captures the pictures together with the audio data the application outputs to the audio system, encodes them, and transmits them to the terminal in real time as an audio/video stream over the network; the terminal decodes and outputs the received stream. Because rendering and encoding are performed on the cloud rendering server, the cloud game mode greatly reduces the game's dependence on the terminal's computing and storage capacity.
In the prior art, the running-state parameters of applications such as cloud games are generally collected by the cloud rendering server and stored as logs, and administrators look up the data they need by reading those logs, which is inefficient; moreover, the parameters gathered by the existing statistics are limited, making it difficult to evaluate the running state of the application systematically and comprehensively. For this reason, the embodiments of the present invention provide an application operation monitoring method, device, and system.
In the application running process, the terminal periodically collects the attitude information of the terminal and uploads the attitude information to the cloud rendering server, and the cloud rendering server renders an application picture according to the attitude information so that the rendered application picture is consistent with the field angle of a user; in the embodiment of the present invention, a terminal includes: head mounted displays (such as VR helmets, AR helmets, MR helmets), mobile terminals, and the like.
Taking a cloud game as an example, fig. 1 is a schematic diagram illustrating a complete process cycle of one frame of a cloud game picture according to an exemplary embodiment of the present invention. Referring to fig. 1, the complete process cycle of a game frame includes the following sub-processes: the terminal's posture information detection sub-process, the posture information sending sub-process by which the terminal sends the detected posture information to the cloud rendering server, the posture information receiving sub-process, the game picture rendering sub-process, the game picture encoding sub-process, the game picture sending sub-process, the game picture receiving sub-process, the decoding sub-process, and the display sub-process. In the embodiment of the invention, the running state of the cloud game application is analyzed by monitoring the operation data of designated sub-processes within the complete process cycle of a cloud game picture, so that the end user's experience can be known in time.
FIG. 2 is a flow chart illustrating a method for monitoring the operation of an application in accordance with an exemplary embodiment of the present invention; referring to fig. 2, the method includes the steps of:
s10, acquiring the operation parameters of the appointed sub-process in the complete process cycle of the target frame picture of the monitored application; the one complete process cycle refers to a complete processing process from the time when a terminal collects first posture information of the terminal to the time when the terminal finishes displaying a received target frame picture, wherein the target frame picture is an application picture generated by a cloud rendering server according to the first posture information, and is coded by the cloud rendering server and then sent to the terminal.
In this embodiment, one or more sub-processes within the complete process cycle of the target frame picture may be monitored; which sub-processes are monitored can be selected and configured by the staff.
The target frame picture may be every frame of the application, in which case each upload of posture information by the terminal triggers monitoring of the application's running state; alternatively, the target frame picture may be only some frames of the application, for example when the running state is monitored at fixed intervals: once a monitoring period has elapsed, the next detected upload of posture information triggers the monitoring.
With reference to fig. 1, in this embodiment the complete process from when the terminal collects its first posture information to when the terminal finishes displaying the received target frame picture includes the following sub-processes: the terminal's posture information detection sub-process, the posture information sending sub-process, the posture information receiving sub-process, the game picture rendering sub-process, the game picture encoding sub-process, the game picture sending sub-process, the game picture receiving sub-process, the decoding sub-process, and the display sub-process; the operation parameters of each sub-process include the time consumed by running that sub-process.
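A minimal sketch (not taken from the patent) of how such per-frame monitoring data could be modeled on the monitoring server is shown below; the sub-process names follow the list above, while the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict

# Designated sub-processes of one complete process cycle (names are assumptions).
SUB_PROCESSES = [
    "pose_detect",    # posture information detection (terminal)
    "pose_send",      # posture information sending (terminal -> cloud)
    "pose_receive",   # posture information receiving (cloud)
    "frame_render",   # application picture rendering (cloud)
    "frame_encode",   # picture encoding (cloud)
    "frame_send",     # picture sending (cloud -> terminal)
    "frame_receive",  # picture receiving (terminal)
    "frame_decode",   # picture decoding (terminal)
    "frame_display",  # picture display (terminal)
]

@dataclass
class FrameCycleRecord:
    """Operation parameters of one complete process cycle of a target frame picture."""
    app_id: str
    frame_id: int
    # elapsed time of each monitored (designated) sub-process, in milliseconds
    elapsed_ms: Dict[str, float] = field(default_factory=dict)
```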
S20: analyze the running state of the application according to the operation parameters of the designated sub-processes, and output the analysis result of the application's running state.
In this embodiment, the operation parameters of the designated sub-processes may be compared with preset thresholds and the comparison results used as the analysis result of the application's running status; the operation parameters and/or analysis results the user wants to view are then output, according to the user's selection, in the presentation format the user chooses, such as charts or tables.
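A hedged sketch of this threshold comparison is given below; the threshold values and the sub-process key names are assumptions for illustration, not values taken from the patent.

```python
from typing import Dict

# Illustrative per-sub-process thresholds in milliseconds (assumed values).
THRESHOLDS_MS = {"frame_render": 16.0, "frame_encode": 10.0, "frame_decode": 10.0}

def analyze_frame(elapsed_ms: Dict[str, float]) -> Dict[str, str]:
    """elapsed_ms maps a designated sub-process name to its consumed time in ms."""
    result = {}
    for name, elapsed in elapsed_ms.items():
        limit = THRESHOLDS_MS.get(name)
        if limit is None:
            result[name] = "no threshold configured"
        elif elapsed > limit:
            result[name] = f"abnormal: {elapsed:.1f} ms > {limit:.1f} ms"
        else:
            result[name] = "normal"
    return result
```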
In one embodiment of the present invention, the monitoring server monitors the operation parameters of the designated sub-process by receiving the operation parameters uploaded by the terminal.
Fig. 3 is a flowchart illustrating a method for monitoring the operation of a first application according to an exemplary embodiment of the present invention; referring to fig. 3, in an embodiment of the present invention, in the step S10, acquiring the operation parameters of the designated sub-process in the complete process cycle of the target frame picture of the monitored application specifically includes the following steps S10':
s10', acquiring the time consumed by each designated sub-process in the complete process cycle of the target frame picture uploaded by the terminal; wherein the time consumed by all the designated sub-processes is calculated by the terminal and/or the cloud rendering server.
Illustratively, the terminal calculates the time consumed by each sub-process from the timestamp data of the designated sub-processes; alternatively, the terminal obtains from the cloud rendering server the time consumed by the server-side designated sub-processes (calculated by the cloud rendering server) while calculating the time consumed by the terminal-side designated sub-processes itself.
In this embodiment of the application, the terminal uploads the time consumed by every designated sub-process within the complete process cycle of the target frame picture to the monitoring server, which avoids the network congestion that would result from the terminal and the cloud rendering server each uploading separate data.
In the embodiment of the invention, the terminal and the cloud rendering server record the starting timestamp and the ending timestamp of the sub-process in the process of executing the corresponding designated sub-process.
For example, each sub-process may encapsulate its start timestamp and end timestamp in the data packet it processes; finally, the terminal uses the timestamp data of all designated sub-processes encapsulated in the final packet carrying the target frame picture to calculate the time consumed by each designated sub-process.
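The following is a minimal sketch of that recording idea, assuming the timestamps travel as named fields in the packet metadata; the field names and the helper function are illustrative, not part of the patent.

```python
import time

def stamp(packet_meta: dict, key: str) -> None:
    """Record a named timestamp (e.g. 'encode_start', 'encode_end') in the packet metadata."""
    packet_meta[key] = time.time()

# Example: the cloud side stamps the encoding sub-process of one frame packet.
meta = {"frame_id": 42}
stamp(meta, "encode_start")
# ... encode the rendered frame here ...
stamp(meta, "encode_end")
```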
In this embodiment, the designated sub-processes include, by way of example: the posture information sending sub-process, the posture information acquiring sub-process, the target frame picture rendering sub-process, the target frame picture encoding sub-process, the target frame picture sending sub-process, the target frame picture receiving sub-process, and the decoding sub-process. The consumed running time of the designated sub-processes includes: the sum of the time consumed by all designated sub-processes, the receiving time the terminal spends receiving the target frame picture, the decoding time spent decoding the target frame picture, and the network downlink delay; and, on the cloud rendering server side, the first posture information acquisition time (from the server receiving the first posture information until the application takes it away), the rendering time spent producing the target frame picture from the first posture information, the encoding time spent encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
Fig. 4 is a schematic view of a first scenario of an operation monitoring method for an application according to an exemplary embodiment of the present invention. Referring to fig. 4, when uploading the collected first posture information to the cloud rendering server 200, the terminal 100 encapsulates a sending timestamp T1 in the data packet of the first posture information. The cloud rendering server 200 encapsulates a receiving timestamp T2 when it receives the first posture information, and encapsulates a take-away timestamp T3 when it detects that the cloud game application has taken the posture information. The cloud rendering server 200 renders the game picture using the posture information; once rendering is complete, it encapsulates the timestamp data already attached to the first posture information, together with the rendering-completion timestamp T4, into the rendered game picture, and during encoding it further encapsulates the encoding start timestamp T5 and the encoding end timestamp T6. When sending the encoded game picture to the terminal 100, it encapsulates a sending timestamp T7. On the terminal side, the terminal 100 adds the reception start timestamp T8 and the reception end timestamp T9 to the received encoded game picture, and adds the decoding start timestamp T10 and the decoding end timestamp T11 during decoding.
The terminal then calculates the time consumed by each designated sub-process from the timestamp data: the time the cloud game application took to acquire the posture information is calculated from the receiving timestamp T2 and the take-away timestamp T3 of the first posture information; the rendering time consumed by the rendering sub-process is calculated from the take-away timestamp T3 and the rendering-completion timestamp T4; and the encoding time is calculated from the encoding start timestamp T5 and the encoding end timestamp T6 of the game picture.
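A sketch of this calculation on the terminal follows, assuming the timestamps T1..T11 above arrive as a dictionary of seconds-since-epoch values; the key names and the unit conversion are illustrative assumptions.

```python
def elapsed_times_ms(ts: dict) -> dict:
    """Derive per-sub-process elapsed times (ms) from the timestamps T1..T11 described above."""
    return {
        "pose_fetch_ms": (ts["T3"] - ts["T2"]) * 1000.0,   # application takes the posture info away
        "render_ms":     (ts["T4"] - ts["T3"]) * 1000.0,   # rendering of the game picture
        "encode_ms":     (ts["T6"] - ts["T5"]) * 1000.0,   # encoding of the game picture
        "wait_send_ms":  (ts["T7"] - ts["T6"]) * 1000.0,   # encoding end until sending starts
        "receive_ms":    (ts["T9"] - ts["T8"]) * 1000.0,   # terminal receives the encoded picture
        "decode_ms":     (ts["T11"] - ts["T10"]) * 1000.0, # terminal decodes the picture
    }
```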
In an optional embodiment of the present invention, the network uplink delay may be calculated according to a sending timestamp T1 encapsulated in the data packet of the first posture information and a receiving timestamp T2 encapsulated by the cloud rendering server for receiving the first posture information.
In another embodiment of the invention, the terminal sends delay-detection signaling to the cloud rendering server, which returns feedback information upon receiving it; the terminal measures the period from sending the delay-detection instruction to receiving the feedback, divides it by two to obtain the network uplink delay, and reports that delay to the monitoring server.
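A hedged sketch of this round-trip estimate is shown below; the transport used for the delay-detection signaling is an assumption, and any request/response exchange with the cloud rendering server would serve.

```python
import time

def estimate_uplink_delay_ms(send_probe, wait_feedback) -> float:
    """Send delay-detection signaling, wait for the feedback, and take half the RTT as uplink delay."""
    t_sent = time.time()
    send_probe()      # send the delay-detection signaling to the cloud rendering server
    wait_feedback()   # block until the server's feedback information arrives
    round_trip = time.time() - t_sent
    return round_trip / 2.0 * 1000.0
```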
Optionally, after the total time is obtained, the terminal or the monitoring server computes the network downlink delay of the target frame picture over the complete processing cycle by subtracting from that total the receiving time, decoding time, first posture information acquisition time, rendering time, encoding time, waiting-to-send time, and network uplink delay of the target frame picture. If the terminal performs this calculation, it uploads all the timing data to the monitoring server for analysis and presentation once the calculation is complete.
Referring to fig. 5, in this embodiment the operation monitoring data of the application further includes the terminal delay sum (receiving time + decoding time) and the server delay sum (first posture information acquisition time + rendering time + encoding time + waiting-to-send time) of the target frame picture: after obtaining the time consumed by each sub-process, the terminal calculates these two sums and uploads them to the monitoring server.
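The subtraction and the two delay sums described in the preceding paragraphs can be sketched as follows; the key names mirror the earlier timestamp sketch and are assumptions.

```python
def derive_delays(total_ms: float, t: dict) -> dict:
    """total_ms: sum of the time consumed by all designated sub-processes of the frame."""
    measured = (t["receive_ms"] + t["decode_ms"] + t["pose_fetch_ms"] +
                t["render_ms"] + t["encode_ms"] + t["wait_send_ms"] + t["uplink_ms"])
    return {
        # network downlink delay = total minus every measured component
        "downlink_ms": total_ms - measured,
        "terminal_delay_sum_ms": t["receive_ms"] + t["decode_ms"],
        "server_delay_sum_ms": (t["pose_fetch_ms"] + t["render_ms"] +
                                t["encode_ms"] + t["wait_send_ms"]),
    }
```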
The point at which the terminal calculates the time consumed by each sub-process and uploads it may be after the target frame picture is encoded or after the target frame picture is output and displayed; the present invention does not limit this.
It should be noted that the foregoing embodiments only illustrate specific moments at which timestamps are encapsulated, and should not be considered as limiting the present invention.
In another possible embodiment of the present invention, referring to fig. 6, the monitoring server 300 acquires the operation parameters of the designated sub-processes within the complete process cycle of the target frame picture from the terminal 100 and the cloud rendering server 200, respectively.
Fig. 7 is a flowchart illustrating an operation monitoring method for a second application according to an exemplary embodiment of the present invention; referring to fig. 7, in the above step S10 in this embodiment, acquiring the operation parameters of the specified sub-process in the complete process cycle of the target frame picture of the monitored application specifically includes the following steps S101 to S102:
s101, first operation data of a terminal side appointed sub-process of the target frame picture uploaded by the terminal in a complete process period and second operation data of a target frame picture server side appointed sub-process uploaded by a cloud rendering server running the application are respectively obtained.
In this embodiment, the terminal side designating sub-process includes: the terminal attitude information sending sub-process, the terminal attitude information receiving sub-process and the terminal attitude information decoding sub-process are carried out; the server side assigns sub-processes including: the method comprises a terminal attitude information receiving sub-process, a game picture rendering sub-process, a game picture coding sub-process and a sending sub-process.
And S102, calculating to obtain network delay data in the complete processing period based on the first operation data and the second operation data.
In this embodiment, in step S20, analyzing the operation status of the application according to the first operation data and the second operation data, and outputting an analysis result of the operation status of the application includes:
s20', analyzing the operation condition of the application according to the first operation data, the second operation data and the network delay data, and outputting the analysis result of the operation condition of the application.
In this embodiment, the designated sub-processes include: the posture information sending sub-process, the posture information acquiring sub-process, the target frame picture rendering sub-process, the target frame picture encoding sub-process, the target frame picture sending sub-process, the target frame picture receiving sub-process, and the decoding sub-process. Further, in this embodiment, the first operation data includes: the sum of the time consumed by all designated sub-processes, the receiving time consumed by receiving the target frame picture, and the decoding time consumed by decoding the target frame picture.
The second operation data includes:
the first posture information acquisition time, calculated by the cloud rendering server as the time from receiving the first posture information until the application takes it away; the rendering time consumed in producing the target frame picture from the first posture information; the encoding time consumed in encoding the target frame picture; the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal; and the network uplink delay.
Optionally, in this embodiment, calculating the network delay data of the application over the complete processing cycle based on the first operation data and the second operation data specifically includes the following step A10:
Step A10: after the sum of the time consumed by all designated sub-processes is obtained, subtract from that sum the receiving time consumed by receiving the target frame picture, the decoding time consumed by decoding it, the cloud rendering server's first posture information acquisition time, the rendering time, the encoding time, the waiting-to-send time, and the network uplink delay, thereby obtaining the network downlink delay of the target frame picture over the complete processing cycle.
In this embodiment of the present invention, the terminal and the cloud rendering server may also each calculate the time consumed by their corresponding designated sub-processes from the timestamp data and upload those times to the monitoring server separately. The terminal and the cloud rendering server may upload the time consumed by each designated sub-process to the monitoring server as soon as that sub-process finishes, or upload the operation parameters of all designated sub-processes after the last sub-process finishes; the present invention does not limit the specific calculation and upload timing.
In another possible embodiment of the present invention, the above-mentioned sub-processes further include a display sub-process, whose operation parameters include display parameters characterizing the display quality of the target frame picture; for example, the display-quality parameters include the black edge rate.
In this embodiment, while displaying a target frame picture, the terminal obtains the first posture information that the cloud rendering server used when rendering the currently displayed target frame picture, together with the terminal's current second posture information; the terminal then calculates the black edge rate of the target frame picture from the field of view (FOV) of the target frame picture, the terminal's FOV, the first posture information, and the second posture information.
In this embodiment, after calculating the black edge rate of the target frame picture, the terminal uploads it to the monitoring server, which analyzes it so that an administrator can judge the user's experience from the analysis result of the black edge rate.
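The patent does not give the black-edge-rate formula; the sketch below is only a rough illustration under the simplifying assumption that a pure yaw difference between the render pose and the display pose shifts the rendered frame horizontally across the terminal's field of view.

```python
def black_edge_rate(frame_fov_deg: float, terminal_fov_deg: float,
                    render_yaw_deg: float, display_yaw_deg: float) -> float:
    """Fraction of the terminal's horizontal FOV left uncovered by the displayed frame (assumed model)."""
    shift = abs(display_yaw_deg - render_yaw_deg)            # head rotation since the frame was rendered
    covered = max(0.0, min(terminal_fov_deg, frame_fov_deg - shift))
    uncovered = max(0.0, terminal_fov_deg - covered)
    return uncovered / terminal_fov_deg                       # 0.0 = no black edge, 1.0 = fully black
```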
The operation parameters further include the code rate and frame rate on the cloud rendering server side; these are either uploaded to the monitoring server directly by the cloud rendering server, or delivered to the terminal together with the timestamp data and then collected and uploaded to the monitoring server by the terminal.
In another possible embodiment of the present invention, the method further includes acquiring the terminal's working-state parameters and/or application information, where the working-state parameters include information such as battery level and CPU utilization, and the application information includes the configured resolution, code rate, application name, and the like, thereby enabling comprehensive monitoring and statistics of the application's operation while it runs.
In a possible embodiment of the present invention, the monitoring server notifies staff of the application's running-condition analysis result at a set time and in a set manner; illustratively, the monitoring server counts, within a set period, how many times each operation parameter showed abnormal data and how many times the terminal's working state was abnormal, and after the period ends it notifies staff of the monitored application's abnormal data, abnormality counts, and similar information by a set means such as e-mail or SMS, thereby effectively monitoring both the running process of the monitored application and the user experience.
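A hedged sketch of such periodic reporting follows; the SMTP settings, sender address, and message format are assumptions for illustration only.

```python
import smtplib
from collections import Counter
from email.message import EmailMessage

def report_anomalies(anomaly_events: list, mail_to: str, smtp_host: str = "localhost") -> None:
    """anomaly_events: names of parameters that were abnormal during the reporting window."""
    counts = Counter(anomaly_events)     # e.g. ["render_ms", "decode_ms", "render_ms", ...]
    body = "\n".join(f"{name}: {n} abnormal samples" for name, n in counts.most_common())
    msg = EmailMessage()
    msg["Subject"] = "Application operation monitoring report"
    msg["From"] = "monitor@example.com"  # hypothetical sender address
    msg["To"] = mail_to
    msg.set_content(body or "No anomalies in this period.")
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```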
In another embodiment of the present invention, an operation monitoring system for an application is provided, comprising a cloud rendering server, a terminal, and a monitoring server; for the application operation monitoring method performed by this system, reference may be made to the description in the method embodiments above.
In this embodiment, the monitoring server is configured to acquire the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of the monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is the application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server, and then sent to the terminal.
The monitoring server is further configured to analyze the operation status of the application according to the operation parameters of the designated sub-process, and output an analysis result of the operation status of the application.
Optionally, the monitoring server is configured to obtain an operation parameter of a designated sub-process within a complete process cycle of a target frame picture of the monitored application by: acquiring time consumed by each designated sub-process in the complete process period of the target frame picture uploaded by the terminal;
the terminal is used for uploading the time consumed by each designated sub-process in the complete process period of the target frame picture; wherein the time consumed by all the designated sub-processes is calculated by the terminal and/or the cloud rendering server.
Optionally, the time consumed by the designated sub-processes includes: the sum of the time consumed by all designated sub-processes, the receiving time the terminal spends receiving the target frame picture, the decoding time spent decoding the target frame picture, and the network downlink delay; and the cloud rendering server's first posture information acquisition time (from receiving the first posture information until the application takes it away), the rendering time spent producing the target frame picture from the first posture information, the encoding time spent encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
In another embodiment of the present application, the monitoring server is specifically configured to: respectively acquiring first operation data of a terminal side specified sub-process of the target frame picture uploaded by the terminal in a complete process period and second operation data of a target frame picture service side specified sub-process uploaded by a cloud rendering server running the application; calculating to obtain network delay data in the complete processing period based on the first operation data and the second operation data;
and analyzing the operation condition of the application according to the first operation data, the second operation data and the network delay data, and outputting the analysis result of the operation condition of the application.
In this embodiment, the first operation data includes: the sum of the time consumed by all the designated sub-processes, the reception time consumed by receiving a target frame picture, and the decoding time consumed by decoding the target frame picture.
The second operation data includes:
the first posture information acquisition time, calculated by the cloud rendering server as the time from receiving the first posture information until the application takes it away, the rendering time consumed in producing the target frame picture from the first posture information, the encoding time consumed in encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
Optionally, the monitoring server is specifically configured to:
and after the time sum consumed by all the appointed sub-processes is obtained, subtracting the consumed receiving time for receiving the target frame picture, the decoding time consumed for decoding the target frame picture, the first attitude information acquisition time, the rendering time, the encoding time, the waiting sending time and the network uplink time delay of the cloud rendering server from the time sum, and calculating to obtain the network downlink time delay of the target frame picture in the complete processing period.
Optionally, the operation parameters further include: display parameters representing the display quality of the target frame picture and/or the code rate and frame rate corresponding to the target frame picture on the cloud rendering server side.
FIG. 8 is a schematic diagram of an operational monitoring device for an application according to an exemplary embodiment of the present invention; referring to fig. 8, an embodiment of the present invention further provides an application operation monitoring apparatus 800, where the apparatus includes:
an obtaining module 801, configured to acquire the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of a monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is the application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server, and then delivered to the terminal;
an analysis module 802, configured to analyze the operation status of the application according to the operation parameter of the designated sub-process, and output an analysis result of the operation status of the application.
Optionally, the obtaining module 801 is specifically configured to:
acquiring time consumed by each appointed sub-process in the complete process period of the target frame picture uploaded by the terminal;
wherein the time consumed by all the designated sub-processes is calculated by the terminal and/or the cloud rendering server.
Optionally, the time consumed by the designated sub-processes includes: the sum of the time consumed by all designated sub-processes, the receiving time the terminal spends receiving the target frame picture, the decoding time spent decoding the target frame picture, and the network downlink delay;
and the cloud rendering server's first posture information acquisition time (from receiving the first posture information until the application takes it away), the rendering time spent producing the target frame picture from the first posture information, the encoding time spent encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
Optionally, the obtaining module 801 is specifically configured to:
respectively acquire first operation data of the terminal-side designated sub-processes within the complete process cycle of the target frame picture, uploaded by the terminal, and second operation data of the server-side designated sub-processes of the target frame picture, uploaded by the cloud rendering server running the application;
calculating to obtain network delay data in the complete processing period based on the first operation data and the second operation data;
the analysis module 802 is specifically configured to:
and analyzing the running condition of the application according to the first running data, the second running data and the network delay data, and outputting an analysis result of the running condition of the application.
Optionally, the first operation data includes: the sum of the time consumed by all the designated sub-processes, the reception time consumed by receiving a target frame picture, and the decoding time consumed by decoding the target frame picture.
Optionally, the second operation data includes:
the first posture information acquisition time, calculated by the cloud rendering server as the time from receiving the first posture information until the application takes it away, the rendering time consumed in producing the target frame picture from the first posture information, the encoding time consumed in encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
Optionally, the obtaining module 801 is specifically configured to calculate, based on the first operating data and the second operating data, network delay data of the application in the complete processing cycle by the following method:
and if the sum of the time consumed by all the designated sub-processes uploaded by the terminal is received, subtracting the consumed receiving time for receiving the target frame picture, the decoding time consumed for decoding the target frame picture, the first attitude information acquisition time uploaded by the cloud rendering server, the rendering time, the coding time, the waiting and sending time and the network uplink time delay from the overall consumed time, and calculating to obtain the network downlink time delay of the target frame picture in the complete processing period.
Optionally, the operation parameters further include: display parameters representing the display quality of the target frame picture and/or the code rate and frame rate of the target frame picture.
In another embodiment of the present invention, a computer-readable storage medium is provided, on which a computer program is stored; when the program is executed by a processor, it implements the steps of the application operation monitoring method described in any of the above embodiments.
FIG. 9 is a schematic diagram illustrating a configuration of a computer device in accordance with an exemplary embodiment of the present invention; an electronic device provided in an embodiment of the present invention, as shown in fig. 9, includes a processor 501, a communication interface 502, a memory 503, and a communication bus 504, where the processor 501, the communication interface 502, and the memory 503 complete mutual communication through the communication bus 504,
a memory 503 for storing a computer program;
the processor 501 is configured to implement the steps of the method for isolated audio capture according to any of the above embodiments when executing the program stored in the memory 503.
The communication bus mentioned in the above terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the terminal and other equipment.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the device can also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the invention. One of ordinary skill in the art can understand and implement it without inventive effort.
Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., internal magnetic disks or removable disks), magneto-optical disks, and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. In other instances, features described in connection with one embodiment may be implemented as discrete components or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Further, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (12)

1. An operation monitoring method of an application, which is applied to a monitoring server, and comprises the following steps:
acquiring the operation parameters of designated sub-processes within a complete process cycle of a target frame picture of a monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects its first posture information until the terminal finishes displaying the received target frame picture, and the target frame picture is an application picture rendered by a cloud rendering server according to the first posture information, encoded by the cloud rendering server, and then delivered to the terminal;
and analyzing the running state of the application according to the running parameters of the specified subprocess, and outputting the analysis result of the running state of the application.
2. The method according to claim 1, wherein the obtaining of the operation parameters of the designated sub-process within the complete process cycle of the target frame picture of the monitored application comprises:
acquiring time consumed by each designated sub-process in the complete process period of the target frame picture uploaded by the terminal;
wherein the time consumed by all the designated sub-processes is calculated by the terminal and/or the cloud rendering server.
3. The method of claim 2, wherein the time consumed by the designated sub-processes comprises: the sum of the time consumed by all designated sub-processes, the receiving time the terminal spends receiving the target frame picture, the decoding time spent decoding the target frame picture, and the network downlink delay;
and the cloud rendering server's first posture information acquisition time (from receiving the first posture information until the application takes it away), the rendering time spent producing the target frame picture from the first posture information, the encoding time spent encoding the target frame picture, the waiting-to-send time from the end of encoding until the target frame picture is sent to the terminal, and the network uplink delay.
4. The method according to claim 1, wherein acquiring the operation parameters of the designated sub-processes within the complete process cycle of the target frame picture of the monitored application comprises:
acquiring, respectively, first operation data of the terminal-side designated sub-processes of the target frame picture within the complete process cycle, uploaded by the terminal, and second operation data of the server-side designated sub-processes of the target frame picture, uploaded by the cloud rendering server running the application;
calculating network delay data within the complete processing cycle based on the first operation data and the second operation data;
wherein analyzing the running state of the application according to the operation parameters of the designated sub-processes and outputting the analysis result of the running state of the application comprises:
analyzing the running state of the application according to the first operation data, the second operation data and the network delay data, and outputting an analysis result of the running state of the application.
5. The method according to claim 4, wherein the first operation data comprises: the sum of the time consumed by all designated sub-processes, the receiving time consumed to receive the target frame picture, and the decoding time consumed to decode the target frame picture.
6. The method according to claim 5, wherein the second operation data comprises:
the first posture information acquisition time calculated by the cloud rendering server as the time consumed from receiving the first posture information until the application takes the first posture information away, the rendering time consumed to render the target frame picture according to the first posture information, the encoding time consumed to encode the target frame picture, the waiting-to-send time from the end of encoding the target frame picture until the target frame picture is sent to the terminal, and the network uplink delay.
7. The method according to claim 6, wherein calculating the network delay data of the application within the complete processing cycle based on the first operation data and the second operation data comprises:
subtracting, from the sum of the time consumed by all designated sub-processes, the receiving time consumed to receive the target frame picture, the decoding time consumed to decode the target frame picture, and the first posture information acquisition time, rendering time, encoding time, waiting-to-send time and network uplink delay of the cloud rendering server, so as to obtain the network downlink delay of the target frame picture within the complete processing cycle (an illustrative sketch of this calculation follows the claims).
8. The method according to any one of claims 1-7, wherein the operation parameters further comprise: display parameters representing the display quality of the target frame picture, and/or the bit rate and frame rate of the target frame picture at the cloud rendering server side.
9. An operation monitoring system for an application, the system comprising: a cloud rendering server, a terminal and a monitoring server;
wherein the monitoring server is configured to acquire operation parameters of designated sub-processes within a complete process cycle of a target frame picture of a monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects first posture information of the terminal to the moment the terminal finishes displaying the received target frame picture, and the target frame picture is an application picture rendered by the cloud rendering server according to the first posture information, encoded by the cloud rendering server and then delivered to the terminal;
and the monitoring server is further configured to analyze the running state of the application according to the operation parameters of the designated sub-processes, and to output an analysis result of the running state of the application.
10. An operation monitoring device for an application, the device comprising:
an acquisition module, configured to acquire operation parameters of designated sub-processes within a complete process cycle of a target frame picture of a monitored application, wherein one complete process cycle refers to the processing from the moment the terminal collects first posture information of the terminal to the moment the terminal finishes displaying the received target frame picture, and the target frame picture is an application picture rendered by a cloud rendering server according to the first posture information, encoded by the cloud rendering server and then delivered to the terminal;
and an analysis module, configured to analyze the running state of the application according to the operation parameters of the designated sub-processes, and to output an analysis result of the running state of the application.
11. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
12. A computer device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
the memory is configured to store a computer program;
and the processor is configured to carry out the method steps of any one of claims 1 to 8 when executing the program stored in the memory.
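
Editorial note (not part of the patent text): under the assumption that the terminal and the cloud rendering server report their per-frame sub-process times in milliseconds for the same target frame picture, the latency decomposition recited in claims 3 and 7 reduces to simple arithmetic on per-frame timestamps. The sketch below is illustrative only; all names (FrameTimings, downlink_delay, the sample values) are hypothetical.

from dataclasses import dataclass

@dataclass
class FrameTimings:
    """Per-frame sub-process times in milliseconds (illustrative field names)."""
    total: float          # sum of the time consumed by all designated sub-processes (complete process cycle)
    receive: float        # terminal: receiving the encoded target frame picture
    decode: float         # terminal: decoding the target frame picture
    pose_acquire: float   # server: first posture information received until the application takes it away
    render: float         # server: rendering the target frame picture from the first posture information
    encode: float         # server: encoding the target frame picture
    wait_send: float      # server: end of encoding until the picture is sent to the terminal
    uplink: float         # network uplink delay

def downlink_delay(t: FrameTimings) -> float:
    """Network downlink delay in the sense of claim 7: the total minus every other sub-process time."""
    return t.total - (t.receive + t.decode + t.pose_acquire
                      + t.render + t.encode + t.wait_send + t.uplink)

# Example: a frame whose complete process cycle took 48 ms.
frame = FrameTimings(total=48.0, receive=3.0, decode=5.0, pose_acquire=1.0,
                     render=12.0, encode=6.0, wait_send=2.0, uplink=7.0)
print(f"downlink delay: {downlink_delay(frame):.1f} ms")  # prints 12.0 ms

Under this reading, a monitoring server could aggregate such per-frame values (for example, averages or percentiles over a sliding window) into the analysis result of the running state referred to in claims 1 and 4.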
CN202010360865.4A 2020-04-30 2020-04-30 Application operation monitoring method, device and system Active CN111563027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010360865.4A CN111563027B (en) 2020-04-30 2020-04-30 Application operation monitoring method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010360865.4A CN111563027B (en) 2020-04-30 2020-04-30 Application operation monitoring method, device and system

Publications (2)

Publication Number Publication Date
CN111563027A true CN111563027A (en) 2020-08-21
CN111563027B CN111563027B (en) 2023-09-01

Family

ID=72070748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010360865.4A Active CN111563027B (en) 2020-04-30 2020-04-30 Application operation monitoring method, device and system

Country Status (1)

Country Link
CN (1) CN111563027B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130263199A1 (en) * 2011-08-17 2013-10-03 Square Enix Holdings Co., Ltd. Moving image distribution server, moving image reproduction apparatus, control method, program, and recording medium
CN107992392A (en) * 2017-11-21 2018-05-04 国家超级计算深圳中心(深圳云计算中心) A kind of automatic monitoring repair system and method for cloud rendering system
CN111061560A (en) * 2019-11-18 2020-04-24 北京视博云科技有限公司 Cloud rendering resource scheduling method and device, electronic equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022218209A1 (en) * 2021-04-14 2022-10-20 华为技术有限公司 Information processing method, apparatus and system
CN113452944A (en) * 2021-08-31 2021-09-28 江苏北弓智能科技有限公司 Picture display method of cloud mobile phone
CN113452944B (en) * 2021-08-31 2021-11-02 江苏北弓智能科技有限公司 Picture display method of cloud mobile phone

Also Published As

Publication number Publication date
CN111563027B (en) 2023-09-01

Similar Documents

Publication Publication Date Title
CN112291520B (en) Abnormal event identification method and device, storage medium and electronic device
JP6769943B2 (en) Methods and systems for playing recorded videos
JP4363421B2 (en) Monitoring system, monitoring system server and monitoring method
EP3550844A1 (en) Three-dimensional model distribution method and three-dimensional model distribution device
KR101794005B1 (en) Error detection system for network camera
WO2019057034A1 (en) Method, device, storage medium and electronic device for determining video segment
CN103731631B (en) The method, apparatus and system of a kind of transmitting video image
CN111563027B (en) Application operation monitoring method, device and system
CN106713571A (en) Mobile terminal and method for testing performance of game engine application
US11711509B2 (en) Early video equipment failure detection system
CN109327486A (en) Method, system and gateway for uploading data to cloud platform, and machine-readable medium
CN110855947B (en) Image snapshot processing method and device
CN105472385A (en) Video decoding and image output quality detection method and system
CN108012049A (en) Video analysis method and its device
CN112419639A (en) Video information acquisition method and device
EP3371971B1 (en) Network switch
CN111263113B (en) Data packet sending method and device and data packet processing method and device
CN110177024A (en) Monitoring method and client, server-side, the system of hotspot device
CN112419638B (en) Method and device for acquiring alarm video
CN111741247B (en) Video playback method and device and computer equipment
US10944993B2 (en) Video device and network quality evaluation/diagnostic tool
CN112988337A (en) Task processing system, method, device, electronic equipment and storage medium
CN111314743A (en) Interface data playback method and device
CN103019912A (en) Processing monitoring data in a monitoring system
CN109413386B (en) A kind of temperature monitoring method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant