Published on ONJava.com (http://www.onjava.com/)


Java Design and Performance Optimization

Performance Planning for Managers

02/22/2001

This article presents a ten-point plan for ensuring that your application will perform adequately. It isn't restricted to Java (point 2 includes extra information for Java projects). I'm very grateful to Alex Rossi of Accenture for his valuable input in discussions about performance planning.

1. The application will need performance tuning

Recognize that your application will need performance tuning at various stages of the project, and plan for the resources and time necessary to handle tuning requirements. The following points may assist you in determining budgetary allocations.

2. Create internal performance experts

Allocate two developers to understand performance tuning. There are external performance experts available, but if you can use internal resources the overall cost is lower. At the very least, your internal experts can handle the basics -- setting up benchmark harnesses, specifying target response times, and so on -- even if you subsequently employ an outside expert to validate the process or provide extended recommendations. For Java projects there is an abundance of interesting Java performance tuning material and tools available, so performance tuning is likely to be seen as a positive task by the developers. The Java Performance Tuning site lists many Java performance tuning resources; have your developers start there.

Include an allocation in the budget for books and magazines, as well as for developer training and web-browsing time. Also budget for evaluating and purchasing performance tools, which your performance experts should assess: profiling tools; benchmark harnesses; and web-load, GUI capture-and-playback, or other client-emulation tools. The choice of tools for measuring performance, and the decision to write the benchmark harness in-house or buy one, make a real difference to the overall cost and time taken for tuning. Allocate for these too, and make sure you have a process in place to make the right choice for your needs.
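If your experts do build a harness in-house, it can start very small. The following is a minimal sketch of the idea in Java; the operation being timed, the iteration counts, and the target response time are all illustrative placeholders:

    // Minimal benchmark harness sketch: times an operation over many
    // iterations and reports whether a target response time was met.
    public class BenchmarkHarness {
        public static void main(String[] args) {
            long targetMillis = 200;          // hypothetical target response time
            int warmups = 1000, runs = 10000;

            Runnable operation = BenchmarkHarness::simulatedOperation;

            for (int i = 0; i < warmups; i++) operation.run();  // let the JIT settle

            long worst = 0, total = 0;
            for (int i = 0; i < runs; i++) {
                long start = System.nanoTime();
                operation.run();
                long elapsed = System.nanoTime() - start;
                total += elapsed;
                worst = Math.max(worst, elapsed);
            }
            long avgMillis = total / runs / 1_000_000;
            long worstMillis = worst / 1_000_000;
            System.out.printf("avg=%dms worst=%dms target=%dms met=%b%n",
                    avgMillis, worstMillis, targetMillis, worstMillis <= targetMillis);
        }

        private static void simulatedOperation() {
            // stand-in for the real operation under test
            try { Thread.sleep(1); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    }

Even something this simple is enough to record response times against targets and to detect regressions from one build to the next.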

Understanding performance tuning and evaluating tools need not be the primary task for these developers. It may never become their primary task if things go well. However, my experience is that the closer the project comes to completion, the more time these internal performance experts will spend on performance tuning.

3. Set performance requirements in the specifications

During the specification stage, the application's performance requirements need to be defined. This is not primarily a developer's task: customers or business experts need to establish what response times are acceptable. It may be more useful to start by specifying what response times would be unacceptable.

This task can be undertaken at a later stage of development. In fact, if a prototype has been written, it can be simpler to use the prototype and other business information to specify acceptable response times. But don't neglect to specify response-time requirements before starting any implementation performance tuning. If code tuning starts before performance requirements are specified, the goals are inadequately defined and tuning effort will be wasted on parts of the application that don't need it.

If your development environment is layered (application layer, component layer, technical architecture layer), try to define performance specifications at each layer so that each team has its own set of performance targets to aim for. If this is not practical, the performance experts will need to be able to tune across all layers and interact with all teams.
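One way to make layered targets concrete is to record them as data that each team's tests can check automatically. The sketch below is purely illustrative; the operation names and budgets are invented for the example:

    // Sketch: performance targets as data, so each team can assert
    // against its own layer's budget in tests.
    import java.util.HashMap;
    import java.util.Map;

    public class ResponseTimeTargets {
        private final Map<String, Long> targetMillis = new HashMap<>();

        public ResponseTimeTargets() {
            targetMillis.put("application:submitOrder", 2000L);  // end-to-end budget
            targetMillis.put("component:priceLookup", 300L);     // component-layer share
            targetMillis.put("architecture:dbRoundTrip", 50L);   // technical-layer share
        }

        public boolean meetsTarget(String operation, long measuredMillis) {
            Long target = targetMillis.get(operation);
            return target != null && measuredMillis <= target;
        }
    }

A unit test for the component layer could then time a call and assert meetsTarget("component:priceLookup", measured), keeping each team honest against its own budget.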

4. Include a performance focus in the analysis

During the analysis stage, the main focus is to analyze the requirements for shared and limited resources in the application. For example, a network connection is both a shared and a limited resource; a database table is a shared resource; threads are a limited resource. These are the resources that will cost the most to fix later in development if they are not designed correctly. Analysis of data volumes and load-carrying capacities should also be carried out to determine the limitations of the system.

This task should fit comfortably into the normal analysis stage. To be on the safe side, or to highlight the requirement for performance analysis, you may wish to allocate 10% of planned analysis time to performance analysis. It's important that the analysis team be aware of the performance impact of different design choices so that they do not miss aspects of the system that need analysis. The team may first need to become familiar with performance-targeted design books such as High Performance Client/Server (see the resources at the end of this article). The analysis should be made in association with the technical-architecture analysis. You should end up with an architecture blueprint that clearly identifies performance aspects.
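To illustrate what a limited resource means in code terms, the sketch below caps concurrent use of a hypothetical database connection pool with a semaphore; requests beyond the cap queue up, which is exactly the saturation behavior the analysis should anticipate. The pool size and names are placeholders:

    // Sketch: treating database connections as the limited resource
    // they are. A semaphore caps concurrent use; extra requests block,
    // making exhaustion of the resource visible and measurable.
    import java.util.concurrent.Semaphore;

    public class BoundedResource {
        private final Semaphore permits;

        public BoundedResource(int maxConcurrent) {
            this.permits = new Semaphore(maxConcurrent, true); // fair queueing
        }

        public void withConnection(Runnable work) throws InterruptedException {
            permits.acquire();           // blocks when the resource is exhausted
            try {
                work.run();              // stand-in for real database work
            } finally {
                permits.release();
            }
        }
    }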

5. Require performance predictions from the design

Progressing from the analysis stage, performance considerations in the design phase should focus on the application's use of shared resources and on the performance consequences of the expected physical architecture of the deployed application.

Make sure the designers are aware of the performance consequences of different decisions by asking for performance-impact predictions to be included with the normal design aspects. The external design review should include design experts familiar with the performance aspects of design choices. Otherwise, a second performance expert who is familiar with design should review the application design. If any significant third-party products will be used -- middleware or database products, for example -- the vendor should have performance experts who can validate the design and identify potential performance problems. A 10% allocation in the budget for performance aspects is normally a safe choice to highlight the emphasis on performance.

The design should include reference to scalability, both for users and for data/object volumes; the amount of distribution possible for the application, depending on the required level of messaging between distributed components; and the transaction mechanisms and modes (pessimistic or optimistic, required locks, durations of transactions and of locks held). The theoretical limit on the performance of many multi-user applications is the number and duration of locks held on shared resources. The designers should also include, where relevant, a section on handling queries against large data sets.
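As a small illustration of the lock-duration point, the sketch below keeps the expensive work outside the synchronized block so that the shared resource is locked only for the brief update itself; all names are invented for the example:

    // Sketch of the lock-duration point: the expensive computation
    // happens outside the synchronized block, so the shared state is
    // locked only for the brief update.
    public class ShortLockExample {
        private final Object lock = new Object();
        private long total;

        public void recordOrder(Order order) {
            long amount = computeAmount(order);  // expensive; no lock held
            synchronized (lock) {                // lock held only for the update
                total += amount;
            }
        }

        private long computeAmount(Order order) {
            // stand-in for validation, pricing, and other slow work
            return order.quantity * order.unitPrice;
        }

        static class Order { long quantity; long unitPrice; }
    }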

6. Create a performance test environment

The performance task for the beginning of the development phase is establishing the performance-testing environment. (Performance tuning of the code itself should be scheduled toward the end of the development phase; see point 9.)

7. Test a simulation or skeleton system for validation


Create a simulation of the system that faithfully represents its main components. The simulation should be implemented so that you can test the scalability of the system, determine how the shared resources respond to increased loads, and determine at which stage limited resources start to become exhausted or bottlenecked. The simulation should allow finished components to be integrated as they become available. If budget or resources are unavailable, skip the initial simulation, but start testing as soon as enough components are available to implement a skeleton version of the system. The goal is to determine the response times and scalability of the system, providing design-validation feedback as early as possible.
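A scalability test against the simulation can be as simple as running the same operation at increasing concurrency and watching where response times degrade. The following is a minimal sketch; the simulated component call and the load levels are placeholders:

    // Sketch of a scaling test: run the same operation at increasing
    // concurrency and report the average response time at each level,
    // to see where the system stops scaling.
    import java.util.concurrent.*;
    import java.util.concurrent.atomic.AtomicLong;

    public class ScalingTest {
        public static void main(String[] args) throws Exception {
            int[] loads = {1, 5, 10, 25, 50, 100};  // concurrent users to try
            for (int users : loads) {
                ExecutorService pool = Executors.newFixedThreadPool(users);
                AtomicLong totalNanos = new AtomicLong();
                int callsPerUser = 100;
                CountDownLatch done = new CountDownLatch(users);
                for (int u = 0; u < users; u++) {
                    pool.execute(() -> {
                        for (int c = 0; c < callsPerUser; c++) {
                            long start = System.nanoTime();
                            simulatedComponentCall();
                            totalNanos.addAndGet(System.nanoTime() - start);
                        }
                        done.countDown();
                    });
                }
                done.await();
                pool.shutdown();
                long avgMillis = totalNanos.get() / ((long) users * callsPerUser) / 1_000_000;
                System.out.println(users + " users: avg response " + avgMillis + "ms");
            }
        }

        private static void simulatedComponentCall() {
            // stand-in for a real or skeleton component
            try { Thread.sleep(2); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    }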

If you have a Proof of Concept stage planned, it could provide the simulation, or at least a good basis for it. Ideally, the validation would take place as part of the Proof of Concept.

8. Integrate performance logging into the application layer boundaries

Integrate performance logging into the application; this logging should be deployed with the released application. Add performance logging at all the main layer boundaries: servlet I/O and marshaling, JVM server I/O and marshaling, database access and update, transaction boundaries, and so on. Performance logging should produce no more than one line of output to a log file per 20 seconds, and should be designed to add less than 1% to the time of any application activity. Ideally, it should be configurable to aggregate variable amounts of data, so that the logging can be deployed to produce one summary line per configurable time unit (for example, one summary line every minute). Ideally, too, the logging framework should produce output that other tools can use for easy manipulation and analysis.
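To make the aggregation idea concrete, here is a minimal sketch of a logger that accumulates measurements in memory and emits one summary line per configurable interval. The layer name and output format are illustrative, and a real implementation would write to a log file rather than standard output:

    // Sketch of an aggregating performance logger: measurements are
    // accumulated cheaply in memory and flushed as one summary line
    // per interval, keeping both overhead and log volume low.
    import java.util.concurrent.atomic.AtomicLong;

    public class PerfLog {
        private final String layer;
        private final long intervalMillis;
        private final AtomicLong count = new AtomicLong();
        private final AtomicLong totalNanos = new AtomicLong();
        private volatile long lastFlush = System.currentTimeMillis();

        public PerfLog(String layer, long intervalMillis) {
            this.layer = layer;
            this.intervalMillis = intervalMillis;   // e.g. 60000 for one line per minute
        }

        // Called at a layer boundary with the measured elapsed time.
        public void record(long elapsedNanos) {
            count.incrementAndGet();
            totalNanos.addAndGet(elapsedNanos);
            long now = System.currentTimeMillis();
            if (now - lastFlush >= intervalMillis) {
                flush(now);
            }
        }

        private synchronized void flush(long now) {
            if (now - lastFlush < intervalMillis) return;  // another thread flushed
            long n = count.getAndSet(0);
            long avgMicros = n == 0 ? 0 : totalNanos.getAndSet(0) / n / 1000;
            lastFlush = now;
            System.out.println(layer + " count=" + n + " avgMicros=" + avgMicros);
        }
    }

A call site at a layer boundary then simply brackets the work with System.nanoTime() and passes the elapsed time to record().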

9. Performance test the system at multiple scales and tune using the resulting information

During code implementation, unit performance testing should be scheduled along with QA. No performance tuning is required at the unit level until the unit is ready for QA. Unit performance tuning proceeds by integrating the unit into the system simulation and running scaling tests with profiling.

It's important to test the full system or simulation as soon as is feasible, even if many of the units are incomplete. Simulated units are acceptable at an early stage of system performance testing. Initially, the purpose of this system performance test is to validate the design and architecture and to identify any parts of the design or implementation that will not scale (see points 7 and 8). Later, the tests should provide detailed logs and profiles that allow developers to target bottlenecks in the system and produce faster versions of the application.

To support performance testing at later stages, it should be possible to configure the testbed to provide performance profiles of any JVM processes, as well as system and network statistics, in addition to the performance logging. Your performance experts should be able to produce the JVM profiles. Budget for a short (three-to-five-day) course on obtaining and analyzing system statistics on your target system; ideally, a system administrator already has those skills.

The performance tests should scale to higher loads of users and data; scale the tests to twice the expected peak load.

User activity should be simulated as accurately as possible, but it is most important that the data is simulated to produce the expected real-world variety; otherwise, cache activity can produce completely misleading results. The numbers of objects should be scaled to realistic amounts: this is especially important for query testing and batch updates. Do not underestimate the complexity of creating large amounts of realistic data for scalability testing.
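As a sketch of the data-variety point, the generator below draws customer IDs from a large, skewed key space instead of recycling a handful of values, so caches see something closer to real traffic. All names and sizes are invented for the example:

    // Sketch: test data with realistic variety. Reusing the same few
    // values lets caches absorb every request and flatters the results;
    // a large, skewed key space is closer to real traffic.
    import java.util.Random;

    public class TestDataGenerator {
        private static final int CUSTOMER_KEYSPACE = 1_000_000;
        private final Random random = new Random();

        public String nextCustomerId() {
            // Skewed: a minority of customers account for most requests,
            // but the long tail still defeats a small cache.
            int id = random.nextInt(10) < 8
                    ? random.nextInt(CUSTOMER_KEYSPACE / 100)   // 80% from the "hot" 1%
                    : random.nextInt(CUSTOMER_KEYSPACE);        // 20% from the full range
            return "CUST-" + id;
        }
    }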

10. Deploy the system with performance logging features

Performance-logging features should be deployed with the released application. Such logging allows remote analysis and constant monitoring of the deployed application. Ideally, you should develop tools that automatically analyze the performance logs. The minimum desirable analysis tool is one that compares the current logs against a set of reference logs and highlights anomalies.
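The comparison tool can start very simply. The sketch below assumes summary metrics have already been parsed out of the current and reference logs, and flags any metric that deviates beyond a tolerance; the tolerance and names are illustrative:

    // Sketch of the minimum log-analysis tool: compare current summary
    // metrics against a reference run and flag anything outside a
    // tolerance or missing entirely.
    import java.util.Map;

    public class LogComparator {
        private static final double TOLERANCE = 0.20;  // flag >20% deviation

        public static void compare(Map<String, Double> reference,
                                   Map<String, Double> current) {
            for (Map.Entry<String, Double> entry : reference.entrySet()) {
                Double now = current.get(entry.getKey());
                if (now == null) {
                    System.out.println("MISSING: " + entry.getKey());
                } else if (Math.abs(now - entry.getValue()) > TOLERANCE * entry.getValue()) {
                    System.out.println("ANOMALY: " + entry.getKey()
                            + " reference=" + entry.getValue() + " current=" + now);
                }
            }
        }
    }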

Other useful tools include one that identifies long-term trends in the performance logs and one that flags performance measurements that move outside defined ranges. A graphical interface, or support for a standard GUI administration tool, is also useful.

Additional resources:

High Performance Client/Server, Chris Loosley and Frank Douglas (John Wiley & Sons)

The Java Performance Tuning website

Java Performance Tuning Strategy

Java Performance Tuning, Jack Shirazi (O'Reilly)

Jack Shirazi is the author of Java Performance Tuning. He was an early adopter of Java, and for the last few years has consulted mainly for the financial sector, focusing on Java performance.

