US20150319071A1 - System for providing test environments for executing and analysing test routines - Google Patents
- Publication number
- US20150319071A1 (application US14/668,897)
- Authority
- US
- United States
- Prior art keywords
- software
- wireless communication
- communication devices
- test
- routines
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/14—Arrangements for monitoring or testing data switching networks using software, i.e. software packages
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3698—Environments for analysis, debugging or testing of software
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W24/00—Supervisory, monitoring or testing arrangements
- H04W24/10—Scheduling measurement reports ; Arrangements for measurement reports
Definitions
- API: Application Programming Interface
Abstract
A system is operable to provide one or more test environments for executing and analysing test routines. The system includes one or more user interfaces coupled via a communication network to a server arrangement for hosting a plurality of emulations of wireless communication devices or terminals. The server arrangement is operable to receive one or more software applications for execution upon the wireless communication devices or terminals, and one or more test routines for use in testing the wireless communication devices or terminals. Moreover, the server arrangement is operable to execute the one or more software applications and apply the one or more test routines thereto. Furthermore, the server arrangement is operable to monitor operation of the wireless communication devices or terminals and to provide corresponding test results to the one or more user interfaces.
Description
- The present invention relates to systems for providing test environments for executing and analysing test routines, for example for executing and analysing test routines for execution upon computing hardware of mobile devices and/or terminals. Moreover, the present invention comprises methods of providing test environments for executing test routines, for example for executing and analysing test routines for execution upon computing hardware of mobile devices and/or terminals. Furthermore, the present invention relates to software products recorded on machine-readable data storage media, wherein the software products are executable upon computing hardware, for example in a cloud computing environment, for implementing aforementioned methods.
- There is contemporarily a multitude of manufacturers of mobile wireless communication devices, for example mobile telephones, also known as “cell phones” in certain parts of the World. Each manufacturer potentially manufactures several different models or types of mobile devices and/or terminals. The mobile devices and terminals are often provided to users in several different versions, wherein hardware and/or software can vary between versions. For example, variations between versions concern one or more of:
- (i) different screen resolution;
- (ii) different colour depths;
- (iii) different sets of wireless interfaces, for example 3G and/or 4G and/or Bluetooth and/or Near Field Communication (NFC);
- (iv) different types of keyboards;
- (v) different languages;
- (vi) different configurations of sensors, for example different sensitivities and/or accuracies for the sensors, for example different in-built camera resolutions;
- (vii) different central processing units (CPUs); and
- (viii) different software operating systems.
- It will be appreciated from the foregoing that there are potentially a huge number of permutations of mobile devices and/or terminals. Moreover, it is contemporarily desirable by the end-users to be able to download to the mobile device(s) one or more software applications, for example “apps” and “plug-ins”, which are compatible with a wide range of mobile devices and/or terminals without encountering compatibility issues.
- The huge number of permutations of mobile devices creates practical difficulties for software product developers who are desirous to ensure that their software is correctly executable on a large spectrum of mobile devices and/or terminals. It is impractical for the software developers to purchase examples of each different type of mobile device on which their software products are to be run, for example purchase of hundreds of different mobile devices which is prohibitively expensive.
- It is known that software simulators can be utilized for testing software. However, it would be even more laborious and costly for a given software product developer both to procure a myriad of mobile devices, and then to develop software simulations of the mobile devices after having characterized their operation. In view of the aforesaid difficulties, a situation arises that software products for mobile devices and/or terminals often face incompatibility issues, namely to the frustration of users, or are uncomfortably expensive when provided in versions which have been tested and verified to execute correctly on a broad spectrum of mobile devices and/or terminals.
- The various embodiments of the present invention seek to provide a system which provides a test environment in which software products can be tested for compatibility and performance issues.
- Moreover, the various embodiments of the present invention also seek to provide a method of providing a test environment in which software products can be tested for compatibility and performance issues.
- According to a first aspect, there is provided a system as defined in appended claim 1: there is provided a system for providing one or more test environments for executing and analysing test routines, wherein the system includes one or more user interfaces coupled via a communication network to a server arrangement for hosting a plurality of emulations of wireless communication devices or terminals, wherein:
- (a) the server arrangement is operable to receive one or more software applications for execution upon the wireless communication devices or terminals, and one or more test routines for use in testing the wireless communication devices or terminals;
- (b) the server arrangement is operable to execute the one or more software applications and apply the one or more test routines thereto; and
- (c) the server arrangement is operable to monitor operation of the wireless communication devices or terminals and to provide corresponding test results to the one or more user interfaces.
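The division of labour in clauses (a) to (c) can be sketched as follows. This is an editorial illustration only: the class, method, and device names are invented for the sketch and do not appear in the patent.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the server arrangement's three duties:
# (a) receive applications and test routines, (b) execute them,
# (c) monitor and report results per emulated device.

@dataclass
class ServerArrangement:
    applications: list = field(default_factory=list)
    routines: list = field(default_factory=list)
    results: list = field(default_factory=list)

    def receive(self, application, routine):       # clause (a)
        self.applications.append(application)
        self.routines.append(routine)

    def execute_and_monitor(self, emulations):     # clauses (b) and (c)
        for app in self.applications:
            for routine in self.routines:
                for device in emulations:
                    self.results.append({
                        "app": app, "routine": routine, "device": device,
                        "outcome": routine(app, device),
                    })
        return self.results

server = ServerArrangement()
server.receive("my_app", lambda app, dev: f"{app} ok on {dev}")
results = server.execute_and_monitor(["emulated-phone-A", "emulated-phone-B"])
print(len(results))  # one result per app x routine x device
```

One result record per application, routine, and emulated device reflects the claim's per-terminal monitoring; a real server arrangement would of course execute the application, not a placeholder callable.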
- The embodiment is of advantage in that the plurality of emulated devices or terminals is available to many software developers and provides the developers with a test platform in which compatibility issues arising between different models of devices or terminals, and also between different versions of a model of devices or terminals, can be tested rapidly and conveniently.
- Optionally, in the system, the plurality of emulations and/or simulators of wireless communication devices or terminals includes at least one real physical device or terminal which is connected in communication with the server arrangement. In other words, the system provides for testing on simulated/emulated devices and terminals, as well as verification on real physical devices and terminals coupled to the system for the software developers to investigate.
- Optionally, in the system, the server arrangement is operable to load the software applications from one or more software application stores in response to instructions input at the one or more user interfaces.
- Optionally, in the system, the server arrangement is operable to implement the one or more test routines on the plurality of emulations of wireless communication devices or terminals as a concurrent batch execution operation.
- Optionally, in the system, the plurality of emulations of wireless communication devices or terminals includes simulation of at least one of: data memory capacity, screen size, screen format, one or more sensors of the devices or terminals, temperature, movement of the devices or terminals.
- Optionally, the system is operable to monitor operation of the wireless communication devices or terminals and to provide corresponding test results to the one or more user interfaces by way of analysis for determining at least one of: differences in screenshot detail presented on the wireless communication devices or terminals, software application execution speed on the wireless communication devices or terminals, operating system compatibility for the wireless communication devices or terminals.
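The analyses named above (screenshot differences, execution speed, compatibility) can be illustrated with a minimal comparison pass over monitored results. The result fields and device names are assumptions made for this sketch.

```python
# Illustrative analysis of monitored runs: flag devices whose screenshot
# differs from a reference, and rank devices by execution speed.

runs = [
    {"device": "phone-A", "screenshot_hash": "abc123", "exec_ms": 410},
    {"device": "phone-B", "screenshot_hash": "abc123", "exec_ms": 395},
    {"device": "phone-C", "screenshot_hash": "ffe901", "exec_ms": 520},
]

def analyse(runs, reference_hash):
    """Return (devices with differing screenshots, devices fastest-first)."""
    mismatches = [r["device"] for r in runs
                  if r["screenshot_hash"] != reference_hash]
    by_speed = sorted(runs, key=lambda r: r["exec_ms"])
    return mismatches, [r["device"] for r in by_speed]

mismatches, ranking = analyse(runs, reference_hash="abc123")
print(mismatches)  # ['phone-C']
print(ranking)     # ['phone-B', 'phone-A', 'phone-C']
```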
- Optionally, the system is operable to modify automatically one or more software applications for the wireless communication devices or terminals in response to earlier test results recorded by the system for execution of the one or more software applications on computing hardware of the wireless communication devices or terminals.
- According to a second aspect, there is provided a method of using a system for providing one or more test environments for executing and analysing test routines, wherein the system includes one or more user interfaces coupled via a communication network to a server arrangement for hosting a plurality of emulations of wireless communication devices or terminals, wherein the method includes:
- (a) using the server arrangement to receive one or more software applications for execution upon the wireless communication devices or terminals, and one or more test routines for use in testing the wireless communication devices or terminals;
- (b) executing in the server arrangement the one or more software applications and applying the one or more test routines thereto; and
- (c) using the server arrangement to monitor operation of the wireless communication devices or terminals for providing corresponding test results to the one or more user interfaces.
- Optionally, the method includes arranging for the plurality of emulations of wireless communication devices or terminals to include at least one real physical device or terminal which is connected in communication with the server arrangement.
- Optionally, the method includes using the server arrangement to load the software applications from one or more software application stores in response to instructions input at the one or more user interfaces.
- Optionally, the method includes using the server arrangement to implement the one or more test routines on the plurality of emulations of wireless communication devices or terminals as a concurrent batch execution operation.
- Optionally, when implementing the method, the plurality of emulations of wireless communication devices or terminals includes simulation of at least one of: data memory capacity, screen size, screen format, one or more sensors of the devices or terminals, temperature, movement of the devices or terminals.
- Optionally, the method includes using the system to monitor operation of the wireless communication devices or terminals and to provide corresponding test results to the one or more user interfaces by way of analysis for determining at least one of: differences in screenshot detail presented on the wireless communication devices or terminals, software application execution speed on the wireless communication devices or terminals, operating system compatibility for the wireless communication devices or terminals.
- Optionally, the method includes using the system to modify automatically one or more software applications for the wireless communication devices or terminals in response to earlier test results recorded by the system for execution of the one or more software applications on computing hardware of the wireless communication devices or terminals.
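The automatic-modification option can be sketched as a table mapping earlier failure kinds to configuration adjustments applied before the next run. The failure names and the adjustment table here are invented for illustration; the patent does not specify a mechanism.

```python
# Hedged sketch of result-driven automatic modification: earlier test
# results select a configuration tweak for the next execution.

ADJUSTMENTS = {
    "screenshot_mismatch": {"image_width": 480},  # hypothetical tweak
    "slow_execution":      {"quality": "low"},    # hypothetical tweak
}

def modify_application(app_config, earlier_results):
    """Apply one recorded adjustment per failure kind observed earlier."""
    config = dict(app_config)                     # leave the original intact
    for failure in earlier_results:
        config.update(ADJUSTMENTS.get(failure, {}))
    return config

base = {"image_width": 720, "quality": "high"}
tuned = modify_application(base, ["screenshot_mismatch"])
print(tuned)  # {'image_width': 480, 'quality': 'high'}
```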
- According to a third aspect, there is provided a software product recorded on machine-readable data storage media, wherein the software product is executable upon computing hardware for implementing a method pursuant to the second aspect of the invention.
- It will be appreciated that features of the invention are susceptible to being combined in various combinations without departing from the scope of the invention as defined by the appended claims.
- Embodiments of the present invention will now be described, by way of example only, with reference to the following diagrams wherein:
-
FIG. 1 is an illustration of a system for providing one or more test environments for executing and analysing test routines; -
FIG. 2 is an illustration of steps of a method executable in the system of FIG. 1 for applying test routines to software for mobile wireless devices and terminals; -
FIG. 3 is an illustration of a layout of a user interface for use in the system of FIG. 1. - In the accompanying diagrams, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
- Referring to
FIG. 1, there is shown an illustration of a system for providing test environments for executing and analysing test routines; the system is indicated generally by 10. Moreover, the system 10 includes a test environment 100, wherein the test environment 100 is beneficially implemented by way of a cloud computing service, for example hosted in the Internet, which can be accessed by software developers via user interfaces. Each software developer employs a software development environment provided in a user interface, for example hosted in a laptop computer or workstation 104; optionally, the software development environment is hosted by way of cloud computing services and accessed via a laptop computer or workstation. Optionally, each software developer has at least one mobile wireless communication device 106, which is connectable via wireless communication links to the aforesaid software development environment. - Beneficially, the
test environment 100 includes a server arrangement 110. The server arrangement 110 is configurable to control one or more physical or simulated mobile wireless communication devices 114A to 114E pursuant to the instruction from one or more of the software developers. Optionally, the server arrangement 110 includes a database server 112 for storing test patterns and test results. Beneficially, physical mobile wireless communication devices are connectable to the server arrangement 110 wirelessly or with coupling leads, for example via one or more Universal Serial Buses (USB). - A method of utilizing the
system 10 will now be described with reference to FIG. 2. In a first step 200A, a software developer composes a test routine by way of the laptop computer or workstation 104. Alternatively, in a first step 200B, the software developer sends their developed software (binary and/or source code) to the server arrangement 110, or merely defines a name of the developed software, or any other unique identifier, and the laptop computer or workstation 104 fetches the developed software from various sources or binaries from one or more of: - (i) any software application stored in the Internet or similar data communication network;
- (ii) any source code storage device; and
- (iii) any device connected to the Internet or similar data communication network.
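The fetch-by-identifier step over the source categories above can be sketched as a simple dispatch that tries each known source in turn. The fetcher functions are placeholders invented for the sketch; a real implementation would perform store downloads or repository checkouts.

```python
# Hypothetical dispatch for fetching developed software by identifier
# from the source categories listed above.

def fetch_from_app_store(name):
    return f"binary:{name}"        # stand-in for an application-store download

def fetch_from_source_repo(name):
    return f"source:{name}"        # stand-in for a source-repository checkout

FETCHERS = [fetch_from_app_store, fetch_from_source_repo]

def fetch_developed_software(identifier):
    """Try each known source in turn until one yields the software."""
    for fetcher in FETCHERS:
        artefact = fetcher(identifier)
        if artefact:
            return artefact
    raise LookupError(f"no source provided {identifier!r}")

print(fetch_developed_software("com.example.app"))  # binary:com.example.app
```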
- In a
second step 210, the software developer is able to execute the test routine locally upon computing hardware included in his/her own mobile wireless communication device 106. - In a
third step 220A, the software developer sends both the software to be tested as well as the test routine to the test environment 100. Optionally, the software is binary code, byte code, source code and so forth depending upon operation of the system 10. Moreover, the test routine is optionally a set of instructions for controlling operation of the test environment 100. Alternatively, in a third step 220B, the test environment 100 automatically generates one or more test routines for the software uploaded in step 200B. Optionally, these one or more test routines include routines pertaining to one or more of: - (i) software security;
- (ii) software performance;
- (iii) software usability;
- (iv) software interoperability;
- (v) software stress testing;
- (vi) testing software interactions between large numbers of users and devices, or automatically altering and optimizing the binary, byte or source code of the software to create desired test routines.
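Automatic generation of one routine per category, as in step 220B, can be sketched as follows. The category list mirrors items (i) to (v) above; the generated routines are placeholder callables, since the patent does not define their internals.

```python
# Illustrative generator mapping the routine categories above to
# named (placeholder) callable checks for an uploaded application.

CATEGORIES = ["security", "performance", "usability",
              "interoperability", "stress"]

def generate_test_routines(software_id):
    """Return one named placeholder routine per category."""
    def make_routine(category):
        def routine(device):
            return f"{software_id}/{category} passed on {device}"
        routine.__name__ = f"test_{category}"
        return routine
    return [make_routine(c) for c in CATEGORIES]

routines = generate_test_routines("uploaded_app")
print([r.__name__ for r in routines])
```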
- In a
fourth step 230, the software is uploaded to one or more of the wireless communication device simulations 114A to 114E in the test environment 100, or to any device or emulated or simulated software available on the communication network, for example the Internet or an emulated/simulated environment running in the server arrangement 110. The one or more test routines are thereby capable of being applied to the developed software which is executed by way of simulation on the wireless communication device simulations 114A to 114E in the test environment 100. - In a
fifth step 240, execution of the one or more test routines is controlled by the server arrangement 110, wherein the simulation is optionally timed, random or follows a pattern defined on the server arrangement 110. The server arrangement 110 is thus capable of controlling any of the wireless communication device simulations 114A to 114E as well as emulated/simulated environments. - In a
sixth step 250, the results of applying the developed software and the one or more test routines are stored on the database server 112, and thereafter processed to expose common patterns therein and more detailed data pertaining to the tested software, devices, users, networks and so forth. In other words, the results are analyzed to provide the software developer with valuable insight regarding performance and compatibility aspects of their developed software on the wireless communication device simulations 114A to 114E in the test environment 100. - In a
seventh step 260, the software developer is able to access the test results and therefrom analyse the software and optionally download optimized versions of the source, byte or binary codes of their software. Optionally, at least a portion of the test results acts as a trigger and/or parameters for further automated steps, for example publishing the tested software on the Internet, on application stores or publishing the tested software directly on any wireless communication devices, for example mobile telephones. - In an
eighth step 270, the software developer can execute all or parts of the test routines as many times as he/she desires. Alternatively, the software developer can change parts of the test routines and then re-execute the test routines, for example for iterative developed software testing and/or for implementing developed software adjustments. Optionally, execution of test routines is automated by employing any trigger or criteria that function as automated parameters of the system 10. Such triggers and parameters optionally include, but are not limited to, one or more of: - (i) an amendment to an application source code;
- (ii) a change of hardware/software configuration of any of the connected terminals or emulators/simulators; and
- (iii) time-based triggers or triggers originating from any other systems connected to the
system 10, for example Internet services, software application stores, source code control systems, malware scanners, mobile wireless communication devices. - In addition to implementing the method as outlined with reference to
FIG. 2, the method can alternatively, or additionally, be employed to cause the server arrangement 110 to be configured to run simulated/emulated environments for testing purposes. For example, the aforesaid test routines can be executed automatically, manually or a combination thereof within the test environment 100. - Optionally, the test routines can be configured to provide user feedback to the software developer regarding functionality of developed software, for example to investigate a manner in which screen size and resolution differences between mobile wireless devices, for example mobile telephones, influence software functionality and usability, to analyze software performance and execution speed of the software, to simulate arbitrary numbers of simulated or real users, and so forth. The test routines can be executed concurrently in many wireless terminals or simulated or emulated environments; optionally, inputs may be provided from real users or real physical environments, for example depending upon spatial location, temperature, acceleration, movements, pressure, cadence, altitude, visual information and audio information. When executing the aforementioned test routines, mutually different terminals, mutually different previous test executions, mutually different simulated or emulated environments, mutually different records of real user interactions, mutually different simulations of user interactions and processed comparative data values can be beneficially employed for developed software benchmarking purposes.
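Concurrent execution of a routine across many terminals, as described above, can be sketched with a thread pool. The device names and the trivial routine are placeholders for this sketch; a real run would drive emulators rather than return a string.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of running one test routine concurrently across several
# emulated terminals as a single batch.

DEVICES = ["phone-A", "phone-B", "phone-C", "phone-D"]

def run_routine(device):
    return f"routine passed on {device}"   # stand-in for a real test run

def run_batch(devices):
    with ThreadPoolExecutor(max_workers=4) as pool:
        # pool.map preserves input order, so results pair up with devices
        return dict(zip(devices, pool.map(run_routine, devices)))

results = run_batch(DEVICES)
print(results["phone-C"])  # routine passed on phone-C
```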
- Optionally, the aforesaid test routines include hardware and mobile wireless terminal sensor-related tests. For example, accelerometers included within the wireless devices or terminals can be tested either in a simulated mode or by physical tests, for example via a test bench whereat the simulated
mobile wireless devices 114A to 114E can be coupled to sensors which can be tilted, shaken or turned automatically. Optionally, such mobile wireless terminals can be carried by human users or other living entities, for example canine animals, as a part of executing the test routines. Optionally, there are included in the system 10 several sets of test terminals which can be operated concurrently and arranged to execute mutually different test routines for the same developed software to be tested, thereby enabling more rapid testing of the developed software. - Beneficially, software developers are provided with web-based interfaces to the
server arrangement 110 for enabling the developers to view results after execution of the test routines or in real-time whilst the test routines are being executed. Referring to FIG. 3, a user interface 200 is beneficially presented in the system 10 via a browser window. The software developers can select from a plurality of mutually different mobile wireless devices, for example mobile telephones, 202 to define those which the software developers are desirous to test using one or more appropriate test routines. In an example situation, a software developer has tested his/her developed software on all available models of mobile wireless devices 114A to 114E, namely the mobile telephones 202, and wants to study and analyze associated test results. Selected models of mobile telephones 202 as illustrated in the user interface 200 are to be investigated by the software developer in greater detail, for example for software debugging purposes for these selected models 202. The software developer is able to modify or develop, via the user interface, specific test routines for the selected mobile telephones 202. - When employing the
system 10, software developers are also able to access a service of an Application Programming Interface (API) through which all interactions providing input to the test environment 100 and handling output from the test environment 100 can be automated through use of suitable software. Such inputs include, but are not limited to: - (i) software to be tested;
- (ii) any test parameters; and
- (iii) any usage scenarios and inputs for creating random, semi-random, or model-based test inputs.
- Outputs include, but are not limited to:
- (i) test results in machine-processable format;
- (ii) modified source-, byte- or binary-code;
- (iii) triggers to other systems, for example Internet services, application stores, source code control systems, malware scanners, mobile wireless devices; and
- (iv) e-mail, SMS and any other machine or human-processable format.
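The API's input and output round trip can be sketched with JSON payloads. The endpoint is not specified in the patent, and every field name here is an assumption made for illustration; only the input categories (software, parameters, scenarios) and the machine-processable result format come from the text above.

```python
import json

# Hypothetical request/response shapes for the test-environment API:
# inputs (software, test parameters, usage scenarios) in,
# machine-processable test results out.

def build_submission(software_blob, parameters, scenarios):
    return json.dumps({
        "software": software_blob,      # e.g. an encoded binary
        "parameters": parameters,       # e.g. timeouts, target devices
        "scenarios": scenarios,         # random / semi-random / model-based
    })

def parse_results(payload):
    """Decode machine-processable test results."""
    return json.loads(payload)

req = build_submission("base64-app-bytes", {"timeout_s": 60},
                       ["random", "model-based"])
resp = parse_results('{"passed": 12, "failed": 1}')
print(resp["failed"])  # 1
```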
- The
system 10 can optionally be configured to record screen-shots, video, audio, temperature, movements, power usage, communication network usage, radio frequency emissions and magnetic induction arising from terminals to be tested using the system 10, namely for analyzing performance of developed software and for verifying that all target terminals are rendering, receiving and processing information in a manner as intended by the software developer. - Capturing user interactions for automated test execution will now be described. Beneficially, a specific software application is used to capture all user actions, as well as software and hardware internal states; such internal states include one or more of processes, memory details, processor load, communication network traffic, communication network latency, communication network load, mobile wireless device usage logs, system logs, as well as data pertaining to external environmental variables, for example temperature, radio signal magnitude, spatial location, acceleration, direction of movement, air pressure, ambient illumination, ambient sounds, ambient electromagnetic fields. Such data can be further processed to generate a test script which can be subsequently employed to reproduce exactly similar usage patterns and user environments on physical and simulated/emulated devices controlled by the
server arrangement 110. Such capture is optionally delayed (namely, a capture step, followed by a store-all-information step, followed by a generate-script step, followed by an execute-script-in-all-devices step) or it may be real-time (namely, one user employs a real or simulated/emulated device to direct other real or simulated/emulated devices in real-time). Moreover, the system 10 is also capable of recording interactions between several users and several mobile communication devices for later regeneration of associated interaction patterns. - Results of analysis provided by the
system 10 will next be described. A software developer can inform the system 10 concerning which target terminal 106 he/she is desirous to use for purposes of giving “approved” references to the system 10. Such “approved” references are also optionally generated by the test environment 100. For example, the software developer selects one or more terminals as reference terminals; the system 10 can be configured to change all or part of target software automatically based on tests arising from execution of one or more of aforementioned test routines, for ensuring that the developed software functions in a fluent manner as intended in all target terminals, namely the system 10 is susceptible to being integrated to a software development environment to create desired versions of the software, for example software providing different image widths to suit specific models of mobile telephones. - Test scripts employed for testing target software in conjunction with wireless communication devices can be used as initial values for a next test run after amendments are implemented to the target developed software. Optionally, the
system 10 can also generate test scripts based upon any simulated, recorded, and model-based usage scenarios, or from real-time or delayed inputs from real users as well as inputs for creating random, semi-random or model-based scripts originating from other computer systems. These other systems optionally include, but are not limited to, Internet services, software application stores, source code control systems, malware scanners, and mobile wireless devices. Moreover, these test scripts can also be dynamically changed during test execution based on any of the internal or external inputs created during test routine execution. - The
system 10 is capable of hosting different test routine executions. Different variations of test routine runs can be optionally configured to be executed on computing hardware of certain vendor mobile terminals, for example a given procedure can be configured to be executed with different versions of a nominally same operating system, for example Android v2.01, v2.2, v2.4 or similar, to investigate forward and backward compatibility of developed software. Moreover, for example, an impact of screen resolution in respect of software execution speed is susceptible to being analyzed using the system 10. In general, any variable pertaining to mobile telephones and mobile terminal versions can be used for testing in the system 10; for example, the system 10 can be used to configure and test mobile telephones equipped with memory capacities of 4 Gbytes, 8 Gbytes, 16 Gbytes and so forth of internal data memory to determine an effect of memory size on performance when the terminals execute given developed software. Moreover, physical capabilities of the terminals are also susceptible to being tested in the system 10, for example target terminals can be subjected to different radio environments when being simulated in the test environment 100, for example some target terminals exhibiting good reception with relatively high data rate communication characteristics and other target terminals exhibiting bad reception and relatively low data bandwidth. - Furthermore, other physical parameters of target terminals can be tested in the
test environment 100, for example “look and feel” of user interfaces in different illumination conditions. Yet additionally, target devices for testing in the test environment 100 can be configured before testing, for example to reduce their amount of available data storage memory to investigate whether or not any adverse effects on target device performance are likely to arise in practice, namely by way of simulation. Beneficially, the target devices can be configured to execute on their computing hardware a plurality of software applications concurrently to test within the test environment 100 whether or not there are any conflict problems between computing resources consumed by the plurality of software applications. Optionally, the test environment 100 can be configured to send messages to the target terminals and also receive messages therefrom, for example short messaging service (SMS) communications, multimedia messaging service (MMS) communications, push notifications, push messages, voice calls, video calls, for example messages having open Internet Protocol (IP) connections concurrently as the test routines are executed in the test environment 100. Moreover, during such sending of aforesaid messages, influences from physical characteristics can also be simulated, for example influences of one or more of: temperature, vibration, ambient illumination, acoustic environment, electromagnetic environment. - Test results from executing one or more test routines in the test environment in respect of one or more target terminals can be saved in the database server 112, in the
mobile devices 114A to 114E being simulated, or in any device connected to the Internet, for example a device concurrently simulated in the test environment 100. The test results can be viewed by the software developer or can be provided as raw data to other parties as required. The system 10 can include an automatic arrangement for proposing improvements based upon collated data from the test environment 100; optionally, the developed software is automatically changed based upon test results generated from the test environment 100. Beneficially, the system 10 is operable to create visualizations regarding performance of developed software for various mobile wireless devices, software versions and wireless communication networks. Moreover, the system 10 can also be used to visualize for the software developer a manner in which different use cases, navigation paths and usage load scenarios are handled by the developed software under test conditions. - Beneficially, the
test system 10 can be configured to search for and fetch at least some of its executable software files from defined network addresses, for example all of its executable software files therefrom, or from anywhere such executable software files may be stored. After fetching these executable software files, the server arrangement 110 can automatically create test scripts for each executable software file, and thereafter execute each executable software file and its associated generated test script on all physical and/or simulated/emulated mobile wireless devices whereat execution is possible, thereby providing detailed test results indicative of a manner in which these executable software files behave in each unique hardware/software configuration of the mobile wireless devices. Such an automated manner of operation of the system 10 is beneficially employed as a form of batch process, for instance when systematically testing all software applications recorded in a software application store to ensure compatibility with a defined range of mobile wireless devices, for example mobile telephones or mobile terminals. - The
server arrangement 110 is beneficially configurable to make screen captures of each target device or terminal being tested in the test environment 100. Optionally, the screen capture is achieved via software which accesses internal memory of each target device or terminal, whether physical or simulated/emulated. Optionally, one or more cameras on top of the mobile devices 114A to 114E, when implemented in a physical manner, are employed to record screen activities. Beneficially, the system 10 employs other sensors for recording sound, temperature, movements, electromagnetic induction and so forth. - Optionally, pattern recognition is employed to analyze video content, for example captured using cameras or screen capture, presented on screens of the
mobile devices 114A to 114E, for example to ensure that all devices or terminals are showing similar content simultaneously. Video analysis of the presented content can be utilized to determine whether a given device or terminal is faster or slower than others when presenting video content. Moreover, recorded test results can be compared in the system 10 for each iteration of developed software to determine, for example, improvements in performance between the different versions of the developed software. - Optionally, the
system 10 can be configured to take one or more screenshots of output from developed software execution at different stages of test execution. Such screenshot views enable software developers to compare, side by side, real pixel-to-pixel screenshots from all devices connected to the system 10 or simulated/emulated by the system 10. Such comparison of screenshots provides a significant time saving for software developers when developing software; different screen sizes, resolutions and form factors can be checked, namely parameters which make visual validation of screen layout on each device or terminal essential for software developers. Beneficially, the system 10 provides a screenshot baseline feature which software developers can utilize to define a temporal baseline for every screenshot for every device and/or terminal hosted by the system 10; every time tests are executed in the test environment 100, new screenshots are compared to the software developers' baseline screenshots, and differences therebetween are notified to the software developers via the system 10. Beneficially, the screenshot baseline feature even highlights differing areas of the screenshot to help the software developers identify and appreciate how screenshots mutually vary. - As illustrated in
FIG. 3, the system 10 provides a user interface 300, beneficially provided via a web interface, for example an Internet web interface such as a browser window. The user interface 300 provides one or more fields 310 for presenting test results, one or more fields 320 for defining software applications, one or more fields 330 for defining test scripts, and one or more fields 340 for defining models of devices and/or terminals to be investigated. - “Terminal” can refer to, but is not limited to, for example a mobile terminal, mobile device, mobile phone, laptop, web pad, smart phone, accessory for mobile devices, or a device with embedded software (such as a household appliance, car, robot, vehicle, multimedia device, television, medical device and so forth), namely to anything which has software in it. Emulation/simulation can also include emulation/simulation of any device, namely is not limited to mobile devices.
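The screenshot baseline feature described above (comparing new screenshots against baseline screenshots and highlighting differing areas) can be sketched as follows. This is a minimal illustration only, assuming screenshots are available as 2D grayscale pixel arrays; the tile-based approach and the function name are illustrative choices, not the patented implementation.

```python
def diff_regions(baseline, current, block=8):
    """Compare two equally-sized grayscale screenshots (2D lists of
    0-255 ints) and return the (x, y) top-left corners of block x block
    tiles that differ, so differing areas can be highlighted."""
    if len(baseline) != len(current) or len(baseline[0]) != len(current[0]):
        raise ValueError("screenshots must share one resolution")
    height, width = len(baseline), len(baseline[0])
    regions = []
    for y in range(0, height, block):
        for x in range(0, width, block):
            # A tile differs if any pixel within it differs.
            tile_differs = any(
                baseline[yy][xx] != current[yy][xx]
                for yy in range(y, min(y + block, height))
                for xx in range(x, min(x + block, width))
            )
            if tile_differs:
                regions.append((x, y))
    return regions
```

A user interface such as the one of FIG. 3 could then overlay rectangles at the returned coordinates to show the developer where a new screenshot departs from its baseline.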
- Server system 110 and database 120 can be arranged as a centralized computer system, or they can be distributed as a cloud service. Servers and databases can be physically in the same location or in distributed locations (including hosting by any user connected to the Internet). Physical terminals can be in the same physical place, or they can be distributed, enabling, for example, crowdsourcing of terminals. - Modifications to embodiments of the invention described in the foregoing are possible without departing from the scope of the invention as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have” and “is” used to describe and claim the present invention are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. Numerals included within parentheses in the accompanying claims are intended to assist understanding of the claims and should not be construed in any way to limit subject matter claimed by these claims.
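The automated batch mode described earlier, namely fetching executable software files, generating a test script for each file, and executing each file on every available device configuration, might be orchestrated along the following lines. The `make_script` and `execute` callbacks stand in for the script generator and device runner, which the description does not specify; they are assumptions for illustration.

```python
from itertools import product

def run_batch(apps, device_configs, make_script, execute):
    """Batch-test every app against every device configuration:
    a test script is generated per app, then executed on each
    (physical or simulated) device configuration, and results are
    keyed by the unique app/configuration pair."""
    results = {}
    for app, config in product(apps, device_configs):
        script = make_script(app)
        # Configurations are dicts (e.g. OS version, memory size);
        # sort items so the key is hashable and deterministic.
        key = (app, tuple(sorted(config.items())))
        results[key] = execute(app, script, config)
    return results
```

Run over, say, `{"os": "android-2.2", "mem_gb": 4}` and `{"os": "android-2.4", "mem_gb": 8}`, this yields one result per unique hardware/software configuration, mirroring the systematic store-wide compatibility sweep described in the text.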
Claims (20)
1-15. (canceled)
16. A server system for providing one or more test environments for executing and analyzing test routines for a plurality of wireless communication devices, comprising:
a processor and a memory comprising software, the processor and memory with the software configured to cause the server system to:
receive at least one software application for execution upon the plurality of wireless communication devices, and receive a plurality of test routines for use in testing the wireless communication devices;
control execution of the at least one software application in the wireless communication devices and control application of the plurality of test routines thereto, wherein a set of the plurality of wireless communication devices is operated concurrently and arranged to execute mutually different test routines for the same software application; and
monitor operation of the wireless communication devices and provide corresponding test results to one or more user interfaces.
17. The server system of claim 16 , wherein the processor and memory with the software are further configured to use the plurality of test routines to determine at least one of: software application execution speed on the wireless communication devices and operating system compatibility for the wireless communication devices.
18. The server system of claim 16 , wherein the plurality of wireless communication devices comprise simulated wireless communication devices.
19. The server system of claim 16 , wherein the plurality of wireless communication devices comprise physical wireless communication devices.
20. The server system of claim 16 , wherein the plurality of test routines include routines pertaining to one or more of software security, software functionality, software performance, software usability, software interoperability, software stress testing, and testing software interactions between users and the wireless communication devices.
21. The server system of claim 16 , wherein at least a portion of the test results causes the at least one software application to be published to one or more of the wireless communication devices or one or more application stores.
22. The server system of claim 16 , wherein the processor and memory with the software are further configured to allow modification and re-application of the plurality of test routines for implementing adjustments to the at least one software application.
23. The server system of claim 22 , wherein the modification of the plurality of test routines is determined from previous test results.
24. The server system of claim 16 , wherein the processor and memory with the software are further configured to trigger application of the plurality of test routines automatically upon one or more of an amendment to a software application source code, a change of configuration of any of the wireless communication devices, and a trigger originating from a system connected to the server system.
25. A method of using a server system for providing one or more test environments for executing and analyzing test routines for a plurality of wireless communication devices, comprising:
receiving, using computing hardware of the server system, at least one software application for execution upon the plurality of wireless communication devices, and receiving a plurality of test routines for use in testing the wireless communication devices;
executing the at least one software application and applying the plurality of test routines thereto, via the computing hardware of the server system, wherein a set of the plurality of wireless communication devices is operated concurrently and arranged to execute mutually different test routines for the same software application; and
monitoring operation of the wireless communication devices for providing corresponding test results to one or more user interfaces.
26. The method of claim 25 , comprising using the plurality of test routines to determine at least one of: software application execution speed on the wireless communication devices and operating system compatibility for the wireless communication devices.
27. The method of claim 25 , wherein the plurality of wireless communication devices comprise simulated wireless communication devices.
28. The method of claim 25 , wherein the plurality of wireless communication devices comprise physical wireless communication devices.
29. The method of claim 25 , wherein the plurality of test routines include routines pertaining to one or more of software security, software performance, software usability, software interoperability, software stress testing, and testing software interactions between users and the wireless communication devices.
30. The method of claim 25 , wherein at least a portion of the test results causes the at least one software application to be published to one or more of the wireless communication devices.
31. The method of claim 25 , further comprising allowing modification and re-application of the plurality of test routines for implementing adjustments to the at least one software application.
32. The method of claim 25 , further comprising allowing modification of the plurality of test routines based on previous test results.
33. The method of claim 25 , comprising triggering application of the plurality of test routines automatically upon one or more of an amendment to a software application source code, a change of configuration of any of the wireless communication devices, and a trigger originating from a system connected to the server system.
34. The method of claim 25 , implemented as a software product recorded on non-transitory machine-readable data storage media, wherein the software product is executable upon the computing hardware.
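The concurrent-operation step recited in the claims above, namely a set of devices operated concurrently with each executing a mutually different test routine for the same software application, can be sketched as follows. The threading approach and the `run_on_device` callback are illustrative assumptions, not limitations of the claims.

```python
import threading

def apply_routines_concurrently(app, devices, routines, run_on_device):
    """Run mutually different test routines for the same software
    application on a set of devices concurrently; collects one
    result per (device, routine) pairing."""
    results = {}
    lock = threading.Lock()

    def worker(device, routine):
        outcome = run_on_device(device, app, routine)
        with lock:  # results dict is shared across worker threads
            results[(device, routine)] = outcome

    # One distinct routine per device, all devices operated at once.
    threads = [
        threading.Thread(target=worker, args=(d, r))
        for d, r in zip(devices, routines)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The per-pairing results would then feed the monitoring step, which forwards them to the one or more user interfaces.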
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/668,897 US20150319071A1 (en) | 2012-08-13 | 2015-03-25 | System for providing test environments for executing and analysing test routines |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/584,335 US9015654B2 (en) | 2012-08-13 | 2012-08-13 | System for providing test environments for executing and analysing test routines |
US14/668,897 US20150319071A1 (en) | 2012-08-13 | 2015-03-25 | System for providing test environments for executing and analysing test routines |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/584,335 Continuation US9015654B2 (en) | 2012-08-13 | 2012-08-13 | System for providing test environments for executing and analysing test routines |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150319071A1 true US20150319071A1 (en) | 2015-11-05 |
Family
ID=50067196
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/584,335 Active 2033-01-23 US9015654B2 (en) | 2012-08-13 | 2012-08-13 | System for providing test environments for executing and analysing test routines |
US14/668,897 Abandoned US20150319071A1 (en) | 2012-08-13 | 2015-03-25 | System for providing test environments for executing and analysing test routines |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/584,335 Active 2033-01-23 US9015654B2 (en) | 2012-08-13 | 2012-08-13 | System for providing test environments for executing and analysing test routines |
Country Status (1)
Country | Link |
---|---|
US (2) | US9015654B2 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9450834B2 (en) | 2010-07-19 | 2016-09-20 | Soasta, Inc. | Animated globe showing real-time web user performance measurements |
US9491248B2 (en) | 2010-07-19 | 2016-11-08 | Soasta, Inc. | Real-time analytics of web performance using actual user measurements |
US9720569B2 (en) | 2006-08-14 | 2017-08-01 | Soasta, Inc. | Cloud-based custom metric/timer definitions and real-time analytics of mobile applications |
US9772923B2 (en) | 2013-03-14 | 2017-09-26 | Soasta, Inc. | Fast OLAP for real user measurement of website performance |
US9928163B2 (en) | 2015-06-10 | 2018-03-27 | International Business Machines Corporation | Dynamic test topology visualization |
US10037393B1 (en) | 2016-05-03 | 2018-07-31 | Akamai Technologies, Inc. | Consumer performance index scoring for websites and web-based applications |
US10067850B2 (en) | 2010-07-19 | 2018-09-04 | Akamai Technologies, Inc. | Load test charts with standard deviation and percentile statistics |
US10346431B1 | 2015-04-16 | 2019-07-09 | Akamai Technologies, Inc. | System and method for automated run-time scaling of cloud-based data store |
US10579507B1 (en) | 2006-08-14 | 2020-03-03 | Akamai Technologies, Inc. | Device cloud provisioning for functional testing of mobile applications |
US10586358B1 (en) | 2017-05-10 | 2020-03-10 | Akamai Technologies, Inc. | System and method for visualization of beacon clusters on the web |
US10601674B2 (en) | 2014-02-04 | 2020-03-24 | Akamai Technologies, Inc. | Virtual user ramp controller for load test analytic dashboard |
US10606736B1 (en) | 2017-03-03 | 2020-03-31 | Akamai Technologies Inc. | System and method for automated creation of a load test plan |
US11422911B2 (en) | 2019-03-14 | 2022-08-23 | International Business Machines Corporation | Assisted smart device context performance information retrieval |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9454464B2 (en) * | 2012-08-08 | 2016-09-27 | Cbs Interactive Inc. | Application development center testing system |
US9218269B2 (en) * | 2012-09-07 | 2015-12-22 | Red Hat Israel, Ltd. | Testing multiple target platforms |
US20150277858A1 (en) * | 2012-10-02 | 2015-10-01 | Nec Corporation | Performance evaluation device, method, and medium for information system |
US9189378B1 (en) * | 2012-11-30 | 2015-11-17 | Mobile Labs, LLC | Systems, methods, and apparatuses for testing mobile device applications |
US9208063B1 (en) * | 2013-02-21 | 2015-12-08 | Groupon, Inc. | Method for testing mobile application and associated apparatus and system |
DE102013006012A1 (en) * | 2013-04-09 | 2014-10-09 | Airbus Defence and Space GmbH | Multi-user test environment for a plurality of test objects |
US20140365199A1 (en) * | 2013-06-11 | 2014-12-11 | The Mathworks, Inc. | Pairing a physical device with a model element |
US20150025818A1 (en) * | 2013-07-16 | 2015-01-22 | Azimuth Systems, Inc. | Synchronized testing of multiple wireless devices |
US10360052B1 (en) | 2013-08-08 | 2019-07-23 | The Mathworks, Inc. | Automatic generation of models from detected hardware |
CN104793870B (en) * | 2014-01-22 | 2018-05-22 | 阿里巴巴集团控股有限公司 | Data sharing method and device |
CN105335282A (en) * | 2014-07-30 | 2016-02-17 | 国际商业机器公司 | Method and system for cross-platform test of applications |
US9626282B2 (en) * | 2014-09-17 | 2017-04-18 | Ricoh Company, Ltd. | Data processing apparatus and data processing method |
US10433195B2 (en) | 2014-10-14 | 2019-10-01 | Rohde & Schwarz Gmbh & Co. Kg | Technique for testing wireless network load produced by mobile app-carrying devices |
SG10201406596XA (en) | 2014-10-14 | 2016-05-30 | Rohde & Schwarz Asia Pte Ltd | Technique for testing wireless network load produced by mobile app-carrying devices |
US10678681B2 (en) * | 2015-03-10 | 2020-06-09 | Siemens Aktiengesellshaft | Method and device for automatic testing |
US9756515B1 (en) * | 2015-03-16 | 2017-09-05 | Amazon Technologies, Inc. | Mobile device test infrastructure |
CN104717337B (en) * | 2015-04-03 | 2018-03-02 | 杭州昕云信息科技有限公司 | A kind of method of batch testing mobile phone application and its equipment used |
EP3295311B1 (en) * | 2015-05-12 | 2021-08-11 | Suitest s.r.o. | Method and system for automating the process of testing of software application |
US10429437B2 (en) * | 2015-05-28 | 2019-10-01 | Keysight Technologies, Inc. | Automatically generated test diagram |
US9760476B2 (en) | 2015-10-16 | 2017-09-12 | International Business Machines Corporation | Crowdsourced device cloud for application testing |
US10372587B1 (en) * | 2015-11-09 | 2019-08-06 | The United States Of America As Represented By Secretary Of The Navy | Electronic device monitoring using induced electromagnetic emissions from software stress techniques |
US9348727B1 (en) * | 2015-11-18 | 2016-05-24 | International Business Machines Corporation | Enhancing GUI automation testing using video |
CN105512029B (en) * | 2015-11-27 | 2018-12-25 | 北京奇虎科技有限公司 | A kind of method, server and system for testing intelligent terminal |
CN106970870B (en) * | 2016-01-14 | 2023-02-24 | 腾讯科技(北京)有限公司 | Webpage test platform, webpage test method and webpage test system |
US10296444B1 (en) * | 2016-06-03 | 2019-05-21 | Georgia Tech Research Corporation | Methods and systems for testing mobile applications for android mobile devices |
US10719428B2 (en) * | 2016-07-20 | 2020-07-21 | Salesforce.Com, Inc. | Automation framework for testing user interface applications |
US10878140B2 (en) * | 2016-07-27 | 2020-12-29 | Emerson Process Management Power & Water Solutions, Inc. | Plant builder system with integrated simulation and control system configuration |
JP6845928B2 (en) * | 2016-10-20 | 2021-03-24 | ワイ ソフト コーポレーション アー エスY Soft Corporation,A.S. | General-purpose automated testing of embedded systems |
US10860461B2 (en) | 2017-01-24 | 2020-12-08 | Transform Sr Brands Llc | Performance utilities for mobile applications |
US10620013B2 (en) | 2017-03-09 | 2020-04-14 | Sita Information Networking Computing Usa, Inc. | Testing apparatus and method for testing a location-based application on a mobile device |
CN107122306A (en) * | 2017-05-15 | 2017-09-01 | 网易(杭州)网络有限公司 | Automated testing method and device, storage medium, electronic equipment |
ES2896480T3 (en) * | 2017-09-20 | 2022-02-24 | Hoffmann La Roche | Procedure to validate a medical application, end user device and medical system |
CN107885661A (en) * | 2017-11-08 | 2018-04-06 | 百度在线网络技术(北京)有限公司 | The terminal transparency method of testing and system of Mobile solution, equipment, medium |
DE102017126560A1 (en) * | 2017-11-13 | 2019-05-16 | Airbus Defence and Space GmbH | Test system and robot assembly for performing a test |
KR102005718B1 (en) * | 2018-08-14 | 2019-07-31 | 알서포트 주식회사 | Situation information indexing type actual operation based script generation method for mobile device |
CN109582564A (en) * | 2018-10-29 | 2019-04-05 | 中国电力科学研究院有限公司 | A kind of test method of mobile application software |
WO2020143030A1 (en) * | 2019-01-11 | 2020-07-16 | Entit Software Llc | Test script generation based on event data and video frames |
CN110213234B (en) * | 2019-04-30 | 2022-06-28 | 深圳市腾讯计算机系统有限公司 | Application program file developer identification method, device, equipment and storage medium |
CN110281274B (en) * | 2019-06-28 | 2020-12-22 | 北京云迹科技有限公司 | Robot full-flow test platform |
CN110389903B (en) * | 2019-07-19 | 2023-08-22 | 中国工商银行股份有限公司 | Test environment deployment method and device, electronic device and readable storage medium |
US20220327046A1 (en) * | 2019-07-19 | 2022-10-13 | Nippon Telegraph And Telephone Corporation | Testing system, testing method, and testing program |
CN110677477A (en) * | 2019-09-27 | 2020-01-10 | 京东数字科技控股有限公司 | Processing method for electronic equipment, server and second electronic equipment |
CN110913362B (en) * | 2019-12-26 | 2022-12-27 | 新奥数能科技有限公司 | Method and device for realizing wireless signal test through client and test equipment |
CN111611121B (en) * | 2020-04-09 | 2023-11-07 | 浙江口碑网络技术有限公司 | Hardware simulation test method, device and equipment |
CN111580412A (en) * | 2020-05-11 | 2020-08-25 | 中国人民解放军陆军研究院装甲兵研究所 | Test evaluation system based on semi-physical model |
US11418969B2 (en) | 2021-01-15 | 2022-08-16 | Fisher-Rosemount Systems, Inc. | Suggestive device connectivity planning |
US20240111665A1 (en) * | 2022-10-04 | 2024-04-04 | The Travelers Indemnity Company | Active analytics |
US11914504B1 (en) * | 2023-06-27 | 2024-02-27 | Starbucks Corporation | Performing physical experiments based on automatically-generated testing scripts |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050132382A1 (en) * | 2003-12-15 | 2005-06-16 | Mcguire Thomas D. | System and method for updating files utilizing delta compression patching |
US20060174162A1 (en) * | 2005-02-03 | 2006-08-03 | Satyam Computer Services Ltd. | System and method for self-testing of mobile wireless devices |
US20110197176A1 (en) * | 2010-02-08 | 2011-08-11 | Microsoft Corporation | Test Code Qualitative Evaluation |
US20130072126A1 (en) * | 2011-09-20 | 2013-03-21 | Dimitrios M. Topaltzas | System and Method for Determining Quality of Service of a Mobile Device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020078380A1 (en) * | 2000-12-20 | 2002-06-20 | Jyh-Han Lin | Method for permitting debugging and testing of software on a mobile communication device in a secure environment |
WO2004092982A2 (en) * | 2003-04-07 | 2004-10-28 | Dexterra, Inc. | System and method for context sensitive mobile data and software update |
US7600220B2 (en) * | 2005-01-11 | 2009-10-06 | Worksoft, Inc. | Extensible execution language |
US20060282247A1 (en) * | 2005-05-25 | 2006-12-14 | Brennan James T | Combined hardware and network simulator for testing embedded wireless communication device software and methods |
US20080126862A1 (en) * | 2006-08-25 | 2008-05-29 | Microchip Technology Incorporated | System and Method for Testing Software Code for Use on a Target Processor |
US8589955B2 (en) * | 2008-02-12 | 2013-11-19 | Nuance Communications, Inc. | System and method for building applications, such as customized applications for mobile devices |
US8291408B1 (en) * | 2010-03-10 | 2012-10-16 | Google Inc. | Visual programming environment for mobile device applications |
US20120253745A1 (en) * | 2011-03-28 | 2012-10-04 | Infosys Technologies Limited | System and method for testing performance of mobile application server |
US9563544B2 (en) * | 2012-01-10 | 2017-02-07 | Sap Se | Framework for automated testing of mobile apps |
- 2012-08-13: US application US13/584,335 filed; granted as US9015654B2 (status: Active)
- 2015-03-25: US application US14/668,897 filed; published as US20150319071A1 (status: Abandoned)
Also Published As
Publication number | Publication date |
---|---|
US20140047417A1 (en) | 2014-02-13 |
US9015654B2 (en) | 2015-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9015654B2 (en) | System for providing test environments for executing and analysing test routines | |
US9697108B2 (en) | System, method, and apparatus for automatic recording and replaying of application executions | |
US8612947B2 (en) | System and method for remotely compiling multi-platform native applications for mobile devices | |
CN108415826B (en) | Application test method, terminal device and computer-readable storage medium | |
CN110362490B (en) | Automatic testing method and system for integrating iOS and Android mobile applications | |
Akour et al. | Mobile software testing: thoughts, strategies, challenges, and experimental study | |
US9760472B2 (en) | Tenant code debugging in multi-tenant systems | |
CN112506785A (en) | Automatic testing method, device, equipment and medium for login of Html5 game page | |
CN110610089B (en) | User behavior simulation method and device and computer equipment | |
CN112395184A (en) | Information acquisition method, equipment and computer storage medium | |
CN112214405A (en) | A software testing method, apparatus, electronic device and readable storage medium | |
CN109739704A (en) | An interface testing method, server and computer-readable storage medium | |
CN110688095B (en) | Method and device for constructing unmanned aerial vehicle SDK development platform | |
CN113610242A (en) | Data processing method and device and server | |
CN117076335A (en) | Model test method, system, medium and electronic equipment | |
CN117370203A (en) | Automatic test method, system, electronic equipment and storage medium | |
CN106557411B (en) | Method and system for testing Hybrid application in Android system | |
CN118259922B (en) | Compiling method, compiling device, compiling product, compiling device, compiling equipment and compiling medium for application program | |
CN111949510B (en) | Test processing method, device, electronic equipment and readable storage medium | |
CN118606211A (en) | Test script generation method, device, electronic equipment and vehicle | |
Demmel et al. | Data synthesis is going mobile—on community-driven dataset generation for android devices | |
CN105339974B (en) | Analog sensor | |
CN108536607B (en) | UI test method, device and readable storage medium | |
CN114217874B (en) | Mini-program generation method, device, equipment, readable storage medium and product | |
CN107247661B (en) | Method and system for supporting automatic verification of installation package of application |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BITBAR TECHNOLOGIES OY, FINLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAASILA, JOUKO;POHJANEN, PETRI;KIVELA, HENRI;AND OTHERS;SIGNING DATES FROM 20170320 TO 20170322;REEL/FRAME:042352/0363 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |