Support for Confluence 2.9 has ended.
Please check the latest version of the documentation.
Load Testing Confluence
This page contains scripts and hints on load-testing your Confluence installations.
Contents
Introduction
Before making a new Confluence instance available to your users it is useful to get a feel for how it will perform under your anticipated load and where you may need to consider improving your configuration to remove bottlenecks. Likewise, before making changes to your Confluence instance it would again be useful to assess the impact of these changes before making them live in a production context.
This kind of testing is not an exact science but the tools and process described here are intended to be a straightforward, configurable and extensible way of allowing you to begin this kind of load testing.
These scripts will rarely perform representative testing for you 'out of the box', but either through configuration or by extending them it should be possible to build an appropriate load test.
Load testing scripts are not designed for a production environment
The load testing scripts will update the data within the targeted Confluence instance and are not designed to be run against a production server. If you want to load test your production environment you will need to perform these tests on a backup of your data and restore your real data after the tests.
Setup
You will need the following:
- A Confluence server, set up and running with an admin user. The scripts assume a default username and password for this user: 'admin'/'admin'.
- Apache JMeter (currently version 2.3.2).
- The load testing scripts and resources, which are available in our public Maven repository. Download and extract this package (version 0.10).
The test scripts have been updated to work with Confluence 2.9 in version 0.10. Using an older version of the tests will result in errors when running the test.
Quick, Just Tell Me How To Run It.
If you don't want to read the rest of this document, here are the main points:
- Create the test data:
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=<scripts location> -Jspace.zip=<path to a space export> -Jadmin.user=<username> -Jadmin.pass=<password>
- Run the test:
<jmeter location>/bin/jmeter -n -t fixedLoad.jmx -Jscript.base=<scripts location>
The rest of this document describes these two steps in more detail.
Creating the Test Data
A known data set is required to run the testing against. By default this is the Confluence demo space (space key = DS) although this can be changed (more on this later).
The script setUpTest.jmx is used to:
- create a set of users to be used in the test
- import the Confluence demo space for running tests against.
You should first ensure that you don't already have the demo space (key = DS) on your test instance. Trash it if you do.
Run the script from the performance-testing directory as follows:
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=<scripts location> -Jspace.zip=<path to a space export> -Jadmin.user=<username> -Jadmin.pass=<password>
Where:
- <scripts location> is the absolute path to where you expanded the scripts, e.g. /Users/YourName/Download/performanceTest. This is needed for the script to find its external resources, and must be specified absolutely since JMeter occasionally does unexpected things with the working directory when it is running.
- <path to a space export> is the absolute path to the space export zip you want to be used in your testing. For example, the path to demo-site.zip as found in your Confluence distribution or source: <confluence install>/confluence/WEB-INF/classes/com/atlassian/confluence/setup/demo-site.zip
- <username> and <password> are the username and password for an admin user that is able to create Confluence users and to import spaces.
By default the setup process will create 250 users — 50 each of the following formats: tstreader<n>, tstcommentor<n>, tsteditor<n>, tstcreator<n> and tstsearcher<n>. The password for each matches the username.
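The naming scheme above can be illustrated with a quick sketch. This snippet is not part of the test scripts; it simply enumerates the user names the setup script creates by default, assuming the pattern tst<role><n> described above:

```shell
# Enumerate the default user pool: 50 users in each of 5 categories
# (password for each user matches the username).
for role in reader commentor editor creator searcher; do
  for n in $(seq 1 50); do
    echo "tst${role}${n}"
  done
done | wc -l
# The pipeline counts 250 names in total (5 categories x 50 users).
```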
A typical run of the setup script will only take a few seconds.
Removing the Test Data
You can reverse the effects of the setup script by setting the remove.data parameter to true, e.g.
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=<scripts location> -Jremove.data=true -Jadmin.user=<username> -Jadmin.pass=<password>
Setup Script Parameters
You can modify the behaviour of the setup script via JMeter parameters. These are supplied on the command line in the form -J<parameter name>=<parameter value>.
| Parameter | Default | Description |
|---|---|---|
| script.base | N/A | The absolute path to the script. |
| space.zip | N/A | The absolute path to the space export zip file to be imported as test data. |
| remove.data | false | Run the script in reverse (remove all test data). |
| admin.user | admin | The admin user name used to import data and create users. |
| admin.pass | admin | The password for the admin user. |
| confluence.context | confluence | The Confluence webapp context. |
| confluence.host | localhost | The address or host name of the test instance. |
| confluence.port | 8080 | The port of the test instance. |
| space.key | ds | The space key for the space import that will be tested against. |
| space.setup | true | Controls whether the test space will be created (or removed). |
| commentor.max | 50 | The number of users to be created for making comments. |
| creator.max | 50 | The number of users to be created for adding pages. |
| editor.max | 50 | The number of users to be created for editing existing pages. |
| reader.max | 50 | The number of users to be created for viewing existing pages. |
| searcher.max | 50 | The number of users to be created for performing searches. |
Setup Script Output
On the console you will see no obvious indication of success or otherwise. JMeter will output something similar to this:
Created the tree successfully
Starting the test @ Mon Apr 14 17:35:08 EST 2008 (1208158508222)
Tidying up ... @ Mon Apr 14 17:35:08 EST 2008 (1208158508928)
... end of run
The <scripts location>/results directory will contain the file jmeter-result-setuptest.jtl. The run had failures or errors if any assertion in this file has the value true for its failure or error element, e.g.
<assertionResult>
  <name>Manage Users</name>
  <failure>true</failure>
  <error>false</error>
  <failureMessage>Test failed: URL expected to contain /browseusers.action/</failureMessage>
</assertionResult>
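A quick way to check the result file for failed assertions is a grep over those elements. This is a sketch, not part of the scripts; the sample file content below is hypothetical and shaped like the assertion example above:

```shell
# Create a hypothetical sample result file, shaped like the example above
mkdir -p results
cat > results/jmeter-result-setuptest.jtl <<'EOF'
<assertionResult><name>Manage Users</name><failure>true</failure><error>false</error></assertionResult>
EOF

# The actual check: any <failure> or <error> element set to true
# means the setup run did not complete cleanly.
if grep -qE '<(failure|error)>true<' results/jmeter-result-setuptest.jtl; then
  echo "setup FAILED"
else
  echo "setup OK"
fi
# prints "setup FAILED" for the sample file created above
```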
Running the Test
The test script itself will put Confluence under a fixed load. That is to say, the individual samples within the test do not terminate after a period of time; they terminate only once they have finished their prescribed work. This is by design, so that test runs can accurately be compared against each other.
Execute the test as follows:
<jmeter location>/bin/jmeter -n -t fixedLoad.jmx -Jscript.base=<scripts location>
Where:
<scripts location> is the absolute path to where you extracted the scripts, e.g. /Users/YourName/Download/performanceTest. This is needed for the script to find its external resources.
Test Behaviour
The test has a number of parameters to tweak its behaviour, but its general structure is:
- 5 groups of users - readers, commentors, searchers, editors and creators.
- readers simply view a set of individual pages or browse space functionality.
- commentors add comments to a set of pages.
- searchers perform searches on a fixed set of keywords.
- editors make small additions to the end of a set of pages.
- creators add new pages to a particular space.
- Each individual user in each group will repeat a fixed number of times with a variable pause between each repeat.
Note that there is no execution of JavaScript by the client. Keep this in mind if you use this test to gauge Confluence performance in a production environment.
There is also very little use of permissions in these tests. All data involved is accessible to all of the test users.
Test Script Parameters
You can modify the behaviour of the test script via JMeter parameters. These are supplied on the command line in the form -J<parameter name>=<parameter value>.
| Parameter | Default | Description |
|---|---|---|
| script.base | N/A | The absolute path to the script. |
| confluence.context | confluence | The Confluence webapp context. |
| confluence.host | localhost | The address or host name of the test instance. |
| confluence.port | 8080 | The port of the test instance. |
| create.page.prefix | Nihilist | The title prefix for any created page, e.g. Nihilist00001. |
Test Thread Parameters
| Parameter | Default | Description |
|---|---|---|
| threads.reader | 15 | Number of readers. |
| loop.reader | 50 | Number of times each reader will repeat. |
| pause.reader | 2000 | The approximate (within 500ms) pause in milliseconds between reader repeats. |
| threads.searcher | 8 | Number of searchers. |
| loop.searcher | 50 | Number of times each searcher will repeat. |
| pause.searcher | 2000 | The approximate (within 500ms) pause in milliseconds between searcher repeats. |
| threads.creator | 3 | Number of page creators. |
| loop.creator | 50 | Number of times each creator will repeat. |
| pause.creator | 2000 | The approximate (within 500ms) pause in milliseconds between creator repeats. |
| threads.editor | 3 | Number of page editors. |
| loop.editor | 50 | Number of times each editor will repeat. |
| pause.editor | 2000 | The approximate (within 500ms) pause in milliseconds between editor repeats. |
| threads.commentor | 4 | Number of page commentors. |
| loop.commentor | 50 | Number of times each commentor will repeat. |
| pause.commentor | 2000 | The approximate (within 500ms) pause in milliseconds between commentor repeats. |
So with the default parameters you are emulating a load of 33 concurrent users, each hitting the server approximately every 2 seconds (roughly 16 requests per second).
23 of these users are read-only (readers and searchers) and 10 are read/write, or roughly 11 read-only and 5 read/write requests per second.
As a guide, a test run using the above default parameters on a dual core MacBook Pro with no profiling and against HSQLDB will take approximately 20 minutes.
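The arithmetic behind those numbers can be sketched directly from the default thread parameters in the tables above:

```shell
# Default thread counts from the Test Thread Parameters table
readers=15; searchers=8; creators=3; editors=3; commentors=4
total=$((readers + searchers + creators + editors + commentors))
readonly_users=$((readers + searchers))
pause_s=2    # default 2000ms pause between repeats

echo "concurrent users: $total"                    # 33
echo "read-only users: $readonly_users"            # 23
echo "approx. requests/sec: $((total / pause_s))"  # 16
```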
Test Script Output
During the run of the test script JMeter will output progress to the console of the form:
Created the tree successfully
Starting the test @ Fri Apr 18 00:07:39 EST 2008 (1208441259523)
Display Summary Results During Run + 462 in 77.6s = 5.9/s Avg: 1564 Min: 18 Max: 33738 Err: 1 (0.22%)
Display Summary Results During Run + 1338 in 189.9s = 7.0/s Avg: 3596 Min: 24 Max: 34545 Err: 0 (0.00%)
Display Summary Results During Run = 1800 in 257.6s = 7.0/s Avg: 3074 Min: 18 Max: 34545 Err: 1 (0.06%)
Display Summary Results During Run + 1046 in 200.9s = 5.2/s Avg: 4529 Min: 40 Max: 50461 Err: 0 (0.00%)
Display Summary Results During Run = 2846 in 438.2s = 6.5/s Avg: 3609 Min: 18 Max: 50461 Err: 1 (0.04%)
Display Summary Results During Run + 677 in 201.2s = 3.4/s Avg: 6638 Min: 46 Max: 27636 Err: 0 (0.00%)
Display Summary Results During Run = 3523 in 618.1s = 5.7/s Avg: 4191 Min: 18 Max: 50461 Err: 1 (0.03%)
Display Summary Results During Run + 561 in 197.5s = 2.8/s Avg: 8326 Min: 171 Max: 39494 Err: 0 (0.00%)
Display Summary Results During Run = 4084 in 798.3s = 5.1/s Avg: 4759 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 555 in 199.2s = 2.8/s Avg: 8247 Min: 160 Max: 45270 Err: 0 (0.00%)
Display Summary Results During Run = 4639 in 978.0s = 4.7/s Avg: 5177 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 575 in 211.8s = 2.7/s Avg: 4025 Min: 64 Max: 35173 Err: 0 (0.00%)
Display Summary Results During Run = 5214 in 1158.6s = 4.5/s Avg: 5050 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 559 in 186.8s = 3.0/s Avg: 2019 Min: 54 Max: 18541 Err: 0 (0.00%)
Display Summary Results During Run = 5773 in 1338.2s = 4.3/s Avg: 4756 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 472 in 191.2s = 2.5/s Avg: 2149 Min: 67 Max: 20230 Err: 0 (0.00%)
Display Summary Results During Run = 6245 in 1517.9s = 4.1/s Avg: 4559 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 182 in 186.5s = 1.0/s Avg: 3481 Min: 80 Max: 16173 Err: 0 (0.00%)
Display Summary Results During Run = 6427 in 1699.4s = 3.8/s Avg: 4528 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 122 in 190.6s = 0.6/s Avg: 4998 Min: 82 Max: 17724 Err: 0 (0.00%)
Display Summary Results During Run = 6549 in 1880.8s = 3.5/s Avg: 4537 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 118 in 191.3s = 0.6/s Avg: 5360 Min: 93 Max: 18484 Err: 0 (0.00%)
Display Summary Results During Run = 6667 in 2060.0s = 3.2/s Avg: 4552 Min: 18 Max: 50461 Err: 1 (0.01%)
Display Summary Results During Run + 117 in 193.0s = 0.6/s Avg: 5464 Min: 98 Max: 16515 Err: 0 (0.00%)
Display Summary Results During Run = 6784 in 2240.3s = 3.0/s Avg: 4567 Min: 18 Max: 50461 Err: 1 (0.01%)
Display Summary Results During Run + 108 in 193.0s = 0.6/s Avg: 6014 Min: 109 Max: 16905 Err: 0 (0.00%)
Display Summary Results During Run = 6892 in 2421.2s = 2.8/s Avg: 4590 Min: 18 Max: 50461 Err: 1 (0.01%)
Display Summary Results During Run + 91 in 190.5s = 0.5/s Avg: 5228 Min: 119 Max: 15795 Err: 0 (0.00%)
Display Summary Results During Run = 6983 in 2599.2s = 2.7/s Avg: 4598 Min: 18 Max: 50461 Err: 1 (0.01%)
Display Summary Results During Run + 3 in 19.1s = 0.2/s Avg: 4882 Min: 118 Max: 7901 Err: 0 (0.00%)
Display Summary Results During Run = 6986 in 2611.2s = 2.7/s Avg: 4599 Min: 18 Max: 50461 Err: 1 (0.01%)
Tidying up ... @ Fri Apr 18 00:51:13 EST 2008 (1208443873622)
... end of run
For an explanation of this output see the JMeter documentation.
A summary report of the entire run will also be created in the file results/jmeter-summary-fixedload.jtl. You can view this by opening the fixedLoad.jmx script in the JMeter GUI and loading the fixedLoadSummary.jtl into the Summary Report test component (by clicking on 'Browse').
For an explanation of the report see the JMeter documentation.
How To
The remainder of this documentation will hopefully answer questions on how you can configure individual performance tests.
How do I change the number of users emulated?
This is probably quite obvious from the parameters described earlier, but we have put some command lines here for ease of copying and pasting.
First, make sure that when you used the setUpTest.jmx script, you created a big enough pool of users of each category. To increase beyond the default for all user categories:
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=`pwd` -Jspace.zip=demo-site.zip -Jcommentor.max=200 -Jreader.max=200 -Jsearcher.max=200 -Jcreator.max=200 -Jeditor.max=200 -Jadmin.user=<username> -Jadmin.pass=<password>
Then start the test with your required number of threads configured, e.g.
<jmeter location>/bin/jmeter -n -t fixedLoad.jmx -Jscript.base=`pwd` -Jthreads.commentor=50 -Jthreads.reader=200 -Jthreads.searcher=100 -Jthreads.creator=40 -Jthreads.editor=40
Finally, remove all the test data with:
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=`pwd` -Jspace.zip=demo-site.zip -Jcommentor.max=200 -Jreader.max=200 -Jsearcher.max=200 -Jcreator.max=200 -Jeditor.max=200 -Jremove.data=true
How do I make the script run longer?
To run for longer and do more work, increase the loop count for the user categories you want to run longer, e.g.
<jmeter location>/bin/jmeter -n -t fixedLoad.jmx -Jscript.base=`pwd` -Jloop.commentor=8000 -Jloop.reader=8000 -Jloop.searcher=8000 -Jloop.creator=8000 -Jloop.editor=8000
To run for longer with the same amount of work, each thread must pause for longer before it repeats. For example, to make the reader and searcher threads pause for approximately 30 seconds (with proportionally longer pauses for the write-oriented threads):
<jmeter location>/bin/jmeter -n -t fixedLoad.jmx -Jscript.base=`pwd` -Jpause.commentor=120000 -Jpause.reader=30000 -Jpause.searcher=30000 -Jpause.creator=150000 -Jpause.editor=120000
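As a rough sanity check, a lower bound on a thread group's run length is its loop count multiplied by its pause (actual request time adds to this). A sketch using a loop count of 50 and a 30-second pause, matching the reader settings in the example above:

```shell
# Lower bound on run length for one thread group: repeats x pause.
# Values are illustrative: loop=50 repeats, pause=30000ms between repeats.
loop=50
pause_ms=30000
min_seconds=$((loop * pause_ms / 1000))
echo "at least $((min_seconds / 60)) minutes"  # at least 25 minutes
```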
How do I run against a pre-existing space instead of the demo space?
Changing the reader pages
The reader threads iterate over the pages defined in <script.base>/resources/pages/pagesByTitle.csv.
The reader threads also iterate over 'space browse' screens in Confluence as defined in <script.base>/resources/spaces/spaces.csv.
Changing the pages that are edited
The pages that are edited during a test run are defined in <script.base>/resources/pages/pagesToEdit.csv.
Changing the pages that are commented upon
The pages that have comments added are defined in <script.base>/resources/pages/pagesToComment.csv.
Changing the spaces that pages are added to
New pages are created in the spaces with the keys defined in <script.base>/resources/spaces/spaceKeys.csv. Each space key should be on a separate line.
When editing any of the above files, be sure not to leave a blank line at the end of the file; JMeter will give strange errors otherwise.
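If you want to guard against that, a quick check for a trailing blank line might look like the following sketch (the sample file content is hypothetical):

```shell
# Create a hypothetical spaceKeys.csv that ends with a blank line
printf 'DS\nALTKEY\n\n' > spaceKeys.csv

# tail -n 1 returns an empty string when the last line is blank
if [ -z "$(tail -n 1 spaceKeys.csv)" ]; then
  echo "WARNING: trailing blank line in spaceKeys.csv"
fi
```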
Running the setup script without creating a space
Set the space.setup parameter to false, e.g.
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=`pwd` -Jadmin.user=<username> -Jadmin.pass=<password> -Jspace.setup=false
How do I change the search terms used?
If you want to change the words queried for by the searcher threads, you can change the file <script.base>/resources/search/keywords.csv.
How do I test reading pages by ID?
By default all pages are accessed by their space key and page title. There is a different code path in Confluence if you want to access pages by ID. If you know the ID of particular pages you want to hit, you can edit <script.base>/resources/pagesById.csv.
To enable this file to be used, open the fixedLoad.jmx script in the JMeter GUI and enable the 'Reader by id' sampler inside the Readers thread group.
How do I test against a remote Confluence instance?
Ideally, you will be running the test script on a separate machine from the Confluence instance being tested. Both the setUpTest.jmx and fixedLoad.jmx scripts can be run against remote machines with the parameters -Jconfluence.host=<remote machine> -Jconfluence.port=<http port>.
If doing this, be sure that you have low latency and good bandwidth between the two machines.
How do I test a Confluence instance running at the root context of the app server?
I'm afraid this is a bit painful at the moment. You will need to load the fixedLoad.jmx script into the JMeter GUI and change the path on each of the HTTPSampler components. This is highlighted in the diagram.
How do I make the setup script upload a different space?
By default the script is configured to run against the Confluence demo space (space key = DS). If you want to upload a different space export, simply specify it to the setUpTest.jmx script using the space.zip parameter, e.g.
<jmeter location>/bin/jmeter -n -t setUpTest.jmx -Jscript.base=`pwd` -Jadmin.user=<username> -Jadmin.pass=<password> -Jspace.zip=alternate-space.zip
You also need to supply the script with the key for this new space via the space.key parameter, e.g. -Jspace.key=ALTKEY.
Remember to refer to the previous sections on changing the pages that are used in the tests so that they match this new space.
Where is the source?
The JMeter scripts are XML so you have the source if you downloaded the package, as described in the Setup section above.