The goal of this thesis is to analyse a widely used XForms framework (Chiba), with a focus on profiling it for performance deficiencies and fixing them where possible. To underpin the analysis and to verify the enhancements, a prototype of a largely automated performance-measurement setup has been designed and implemented.
The thesis begins with a theoretical part that defines terms and methodologies. Before the paper turns to its practical part, the analysed project is introduced. A practical performance analysis is then described by example, from the first steps to the narrowing down of the analysed hotspots.
In detail, chapter 1, Performance & Profiling, elaborates on how performance can be quantified and profiled in theory. Chapter 2, Methods, introduces various methods for analysing performance. XForms and the analysed product Chiba are presented in chapter 3, XForms / Chiba Fundamentals. The implemented performance analysis is described in chapter 4, Performance Analysis Iteration, and chapter 5, Tuning XForms Actions.
The Performance Analysis Iteration focuses on the design and implementation of tests to analyse and validate potential performance shortcomings, while Tuning XForms Actions illustrates how profiling techniques can be used not only to analyse but also to remedy a validated shortcoming. Finally, in chapter 6, Closing, the results of the analysis and the enhancements are reviewed.
Table of Contents
1. Performance & Profiling
1.1. Performance
1.2. Performance in the Software Development Life Cycle
1.3. Profiling
2. Methods
2.1. Modelling Approach
2.2. Measurement Approach
2.3. Testing
2.3.1. Volume Testing
2.3.2. Stress Testing
2.3.3. Performance Testing
2.4. Benchmarking
2.5. Monitoring
2.5.1. J2EE Management Specification (JSR-77)
2.5.2. Java Management Extension (JMX)
2.6. Exhaustive Analysis
2.7. Hotspot Analysis Approach
2.8. Exploring the Domain
2.8.1. Community
2.8.2. Third Parties
2.8.3. Master Performance Report
2.9. Verify & Validate
2.9.1. Java Virtual Machine (JVM)
2.9.2. Final JVM Annotations
2.10. Tuning Approach
2.10.1. Common Tuning Pitfalls
2.11. Conclusion
3. XForms / Chiba Fundamentals
3.1. XForms
3.2. Chiba
3.2.1. Chiba Sandbox – The Core
3.2.2. Chiba Web – The Main Distribution
3.3. Conclusion
4. Performance Analysis Iteration
4.1. Utilised System
4.2. Profiling Tools Put into Service
4.2.1. EtmMonitor – Java Execution Time Measurement Library
4.2.2. AspectWerkz – Lightweight Java Code Profiler
4.2.3. YourKit Java Profiler – Comprehensive Code and Memory Profiler
4.2.4. Clover – Comprehensive Java Coverage Testing
4.3. Acquiring Information about Chiba
4.3.1. Community Driven
4.3.2. Third Party Driven
4.4. Verify Hypothesis About Hotspots
4.4.1. Hotspot 1 - Chiba Session
4.4.2. Hotspot 2 - JAXP Document Creation
4.4.3. Hotspot 3 – XForms Actions
4.5. XForms Test Bundles
4.6. Test Setup
4.6.1. Requirements
4.6.2. Infrastructure
4.6.3. Webtest
4.6.4. Handicaps
4.7. Verify Hypothesis About Hotspot 3 – XForms Actions
4.7.1. First Attempt
4.7.2. Second Attempt
4.7.3. Unscripted Mode Profiling
4.7.4. Scripted Mode Profiling
4.7.5. Verify Profiled Data
5. Tuning XForms Actions
5.1. Tuning Preparations
5.2. Analyse Bottleneck
5.3. Confute Hypothesis
5.4. The Tuning Process
5.4.1. Tuning Iteration 1 – Next Generation Refresh
5.4.2. Tuning Iteration 2 – ModelItem
5.4.3. Tuning Iteration 3 – Inserted ModelItems
5.4.4. Tuning Iteration 4 – UI Element State
5.4.5. Tuning Iteration 5 – Output Control
5.4.6. Tuning Iteration 6 – Referencing Non Existing ModelItems
5.5. Prove Enhancement
5.5.1. Small Instance Data
5.5.2. Medium Large Instance
5.5.3. Very Big Instance Data / Repeat
5.6. Finale
6. Closing
Objectives & Core Themes
The primary objective of this thesis is to identify and resolve performance inadequacies within the Chiba XForms framework. By applying a structured hotspot analysis, the author investigates how efficient profiling and performance testing can be implemented to pinpoint bottlenecks, particularly within the XForms engine's core mechanisms.
- Performance profiling methodologies within Java-based architectures.
- Evaluation of automated performance testing and measurement infrastructures.
- Identification of hotspots in XForms processing, specifically regarding event sequencing and Model refresh cycles.
- Implementation and validation of performance-enhancing refactoring for XForms core components.
- Comparative analysis of scripted vs. unscripted mode performance in distributed environments.
Excerpt from the Thesis
2.7. Hotspot Analysis Approach
An article about a unified performance-analysis approach, published in the IBM Systems Journal, Volume 39, No. 1 [ABLU00], describes the common approach of a performance analyst as follows: “Measure performance, find constraints to the level of performance achieved, eliminate or reduce their effects, and then start again; stop when measured performance achieves a previously agreed-to target” (cp. [ABLU00, p. 118]).
This approach is based on a common rule, the Pareto rule, which states that most of the impact on something comes from a few activities, while most activities have only little impact on it (cp. [Gupt03, p. 31]).
Translated to the performance analysis of an application, this means that the application spends most of its execution time in only a small part of the code, the hotspot, while the majority of the code does not affect the execution time heavily. A common metric here is the 80/20 rule, whereby 20% of the source code accounts for 80% of the application's execution time (cp. [Shir03, p. 397]). The exact proportion is not important; it might be only 60/40 in a small, balanced program, or go up to 90/10. Following this rule, an iterative performance-analysis approach that includes fixing the hotspots found, as outlined by IBM, is an efficient and practicable way to obtain meaningful performance metrics about an application without spending time analysing irrelevant parts of the program.
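The measure-fix cycle quoted above can be sketched as a simple loop. This is an illustrative sketch only: the class, the simulated timings, and the "halve the hotspot" fix are invented for demonstration and are not part of Chiba or any real profiling API.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class HotspotLoop {

    /** Pick the method consuming the most time -- the current hotspot. */
    static String findHotspot(Map<String, Double> timings) {
        return Collections.max(timings.entrySet(),
                Map.Entry.comparingByValue()).getKey();
    }

    /**
     * IBM-style cycle: measure, find the constraint, reduce its effect,
     * repeat until the agreed-to target is reached. Here "reducing" is
     * simulated by halving the hotspot's cost.
     */
    public static double tune(Map<String, Double> timings, double targetMillis) {
        double total = timings.values().stream()
                .mapToDouble(Double::doubleValue).sum();
        while (total > targetMillis) {
            String hotspot = findHotspot(timings);
            timings.put(hotspot, timings.get(hotspot) / 2.0); // "fix" the hotspot
            total = timings.values().stream()
                    .mapToDouble(Double::doubleValue).sum();
        }
        return total;
    }

    public static void main(String[] args) {
        // Pareto-style distribution: one method dominates the runtime.
        Map<String, Double> timings = new HashMap<>();
        timings.put("Model.refresh", 800.0);
        timings.put("parseInstance", 150.0);
        timings.put("renderUI", 50.0);
        System.out.println("total after tuning: " + tune(timings, 400.0));
    }
}
```

Because the timings follow a Pareto-like distribution, the loop converges after touching only the single dominant method, mirroring why hotspot analysis avoids wasting effort on irrelevant code.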
Summary of Chapters
Performance & Profiling: Explores theoretical foundations of performance metrics, benchmarking, and the significance of profiling within the software development life cycle.
Methods: Details various performance modelling and measurement approaches, emphasizing a structured hotspot analysis workflow.
XForms / Chiba Fundamentals: Introduces the technical background of the XForms specification and the architecture of the Chiba framework.
Performance Analysis Iteration: Documents the practical application of automated performance tests to detect bottlenecks in Chiba's core processing.
Tuning XForms Actions: Describes the iterative refactoring process to optimize the xforms-refresh and rebuild mechanisms, proving performance enhancements through re-testing.
Closing: Concludes the thesis by reviewing the analysis outcomes and confirming the successful optimization of the framework.
Keywords
XForms, Chiba, Performance Profiling, Software Optimization, Bottleneck Analysis, Java Performance, Hotspot Analysis, Automated Testing, Servlet Container, DOM, JSR-77, JMX, Software Development Life Cycle, Benchmarking, Scalability.
Frequently Asked Questions
What is the core focus of this research?
The research focuses on the performance analysis and optimization of the Chiba XForms framework. The main goal is to identify why certain XForms actions exhibit performance degradation and to implement targeted fixes.
Which primary methodology is employed for performance improvement?
The author uses an iterative hotspot analysis approach, based on the Pareto principle (80/20 rule), to systematically find, verify, and resolve performance bottlenecks.
What are the central challenges in profiling XForms frameworks?
Key challenges include distinguishing between framework-specific overhead and application-level inefficiencies, as well as the complexity of Java environment factors like JVM warmup and garbage collection.
How is the automated test infrastructure structured?
The infrastructure uses Maven for build management, TestNG for executing test suites, and Selenium for simulating user behavior in browsers, all integrated within a headless automated process.
What specific bottlenecks were identified in the Chiba framework?
Significant bottlenecks were found in the Model.refresh() mechanism and the Model.rebuild() function, which caused inefficient iteration over the host document during event processing.
What were the measurable results of the tuning process?
The tuning effort led to significant performance gains, with some operations (like the setvalue action) becoming up to 160 times faster in specific test scenarios compared to the original implementation.
How does unscripted mode differ from scripted mode in Chiba's performance?
Unscripted mode involves re-creating the entire application state on the server for each action, which contributes significantly to performance overhead compared to the more granular updates possible in scripted mode.
What role does the JSR-77 specification play in this work?
JSR-77 and JMX are utilized as the underlying standards for monitoring the performance metrics of the managed Java objects within the framework environment.
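As an illustration of the kind of monitoring JMX enables, the following minimal example registers a standard MBean on the platform MBean server and reads one of its attributes, as a management console such as jconsole would. The object name `chiba.example:type=RequestStats` and the request counter are invented for this sketch and are not part of Chiba or JSR-77.

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxSketch {

    // Standard-MBean convention: management interface named <Class>MBean.
    public interface RequestStatsMBean {
        long getRequestCount();
    }

    public static class RequestStats implements RequestStatsMBean {
        private long count;
        public synchronized void hit() { count++; }
        public synchronized long getRequestCount() { return count; }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName("chiba.example:type=RequestStats");

        RequestStats stats = new RequestStats();
        server.registerMBean(stats, name);

        // Application code updates the managed object ...
        stats.hit();
        stats.hit();

        // ... while a monitoring tool reads the same attribute via the server.
        Long count = (Long) server.getAttribute(name, "RequestCount");
        System.out.println("RequestCount = " + count); // prints 2
    }
}
```

The same mechanism scales up to JSR-77's managed objects: the server exposes named attributes, and any JMX-capable tool can poll them without the application linking against the tool.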
Citation
Diplom-Informatiker Lars Windauer (Author), 2007, Performance analysis of an XForms framework with the main focus on profiling by example of Chiba, Munich, GRIN Verlag, https://www.grin.com/document/85134