Anderstedt et al., 2025 - Google Patents
Benchmarking and Load Testing a Dynamic CRM Architecture
- Document ID
- 14649061780462137493
- Authors
- Anderstedt H
- Wifvesson M
- Publication year
- 2025
- Publication venue
- LU-CS/HBG-EX
Snippet
Ensuring performance and reliability is critical for any modern web application. Continuous performance measurement and evaluation are necessary to detect anomalies, adapt to evolving demands, and prevent degradation before it affects users. In this work, the design …
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G06F11/3414—Workload generation, e.g. scripts, playback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3696—Methods or tools to render software testable
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for programme control, e.g. control unit
- G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
- G06F9/46—Multiprogramming arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/875—Monitoring of systems including the internet
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Error detection; Error correction; Monitoring responding to the occurrence of a fault, e.g. fault tolerance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/22—Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/50—Computer-aided design
- G06F17/5009—Computer-aided design using simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance or administration or management of packet switching networks
- H04L41/14—Arrangements for maintenance or administration or management of packet switching networks involving network analysis or design, e.g. simulation, network model or planning
Similar Documents
| Publication | Title |
|---|---|
| Jiang et al. | A survey on load testing of large-scale software systems |
| Aktas et al. | Provenance aware run‐time verification of things for self‐healing Internet of Things applications |
| Tamilarasi et al. | Research and development on software testing techniques and tools |
| Bozkurt et al. | Testing web services: A survey |
| Shetty et al. | Building ai agents for autonomous clouds: Challenges and design principles |
| Frajtak et al. | Exploratory testing supported by automated reengineering of model of the system under test |
| Frank et al. | Misim: A simulator for resilience assessment of microservice-based architectures |
| Naqvi et al. | On evaluating self-adaptive and self-healing systems using chaos engineering |
| Camacho et al. | Chaos as a Software Product Line—a platform for improving open hybrid‐cloud systems resiliency |
| Jha et al. | Itbench: Evaluating ai agents across diverse real-world it automation tasks |
| Chauvel et al. | Evaluating robustness of cloud-based systems |
| Nawagamuwa | Infrastructure as code frameworks evaluation for serverless applications testing in aws |
| Harsh et al. | Cloud enablers for testing large-scale distributed applications |
| Krichen et al. | A resource-aware model-based framework for load testing of ws-bpel compositions |
| Fu et al. | Runtime recovery actions selection for sporadic operations on public cloud |
| Anderstedt et al. | Benchmarking and Load Testing a Dynamic CRM Architecture |
| Jernberg | Building a Framework for Chaos Engineering |
| Sun et al. | Detecting inconsistencies in microservice-based systems: An annotation-assisted scenario-oriented approach |
| Kruse et al. | Assessing the applicability of a combinatorial testing tool within an industrial environment |
| Galpaya | Stress Testing Tool to check the performance of a Moodle Instance |
| Campbell et al. | Assured Cloud Computing |
| Ilieva et al. | An automated approach to robustness testing of BPEL orchestrations |
| Nguyen et al. | Evaluating the fittest automated testing tools: An industrial case study |
| Peng et al. | Automated Server Testing: an Industrial Experience Report |
| Shariff | Investigating selenium usage challenges and reducing the performance overhead of selenium-based load tests |