Jira Service Desk 3.16.x Enterprise release performance report





This page compares the performance of Jira Service Desk 3.9.10 and Jira Service Desk 3.16 Enterprise release.

About Enterprise releases
We recommend upgrading Jira Service Desk regularly. That said, if your organization's process means you only upgrade about once a year, upgrading to an Enterprise release may be a good option, as it provides continued access to fixes for critical security, stability, data integrity, and performance issues until the version reaches end of life.


Jira Service Desk 3.16 was not focused solely on performance, but we aim to provide the same, if not better, performance with each release. In this section, we’ll compare Jira Service Desk 3.9.10 to Jira Service Desk 3.16 Enterprise release, for both Server and Data Center. We ran the same extensive test scenario for both Jira versions. 

The following table presents mean response times of individual actions performed in Jira Service Desk. To check the details of these actions and the Jira instance they were performed in, see Testing methodology.

The performance was measured under a user load we estimate to be peak traffic, on a 5,000 agent instance.

Response times for Jira Service Desk actions (in seconds)

Action 3.9.10 Server 3.16.1 Server 3.9.10 Data Center 3.16.1 Data Center
View workload report (medium) 39.39 3.85 22.74 3.54
View workload report (small) 1.054 0.908 1.094 0.793
View requests 1.691 1.289 1.140 1.099
View portals page 1.650 1.594 1.442 1.564
View welcome guide 0.693 0.620 0.683 0.562
View created vs. resolved report 1.440 1.335 1.320 1.141
View time to resolution report 3.060 2.939 2.769 2.619
Invite team 2.671 2.698 2.623 2.636
View customers page 1.649 1.688 1.603 1.609
View queues with SLAs 19.12 20.15 11.42 11.86
View queue: All open issues 37.85 38.90 33.75 34.44
Create customer request 23.61 24.97 17.72 18.85
View service desk issue 1.780 2.133 1.352 1.469

In summary

It's great news for reports, with some showing huge improvements. Other key actions have also improved substantially. Highlights:

  • Viewing the workload report improved by 80-90%
  • Viewing the created vs. resolved report improved by 5-15%
  • Viewing the time to resolution report improved by 5-15%
  • Viewing requests improved by 20-30%
  • Viewing the portals page improved by 5-15%
  • Viewing the welcome guide improved by 5-15%
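As an illustration, the percentages above can be derived from the mean response times in the table. A minimal sketch (the `improvement` helper is our own, not part of any test tooling):

```python
def improvement(before: float, after: float) -> float:
    """Percentage reduction in mean response time between two releases."""
    return (before - after) / before * 100

# Mean response times (seconds) taken from the Server columns of the table above.
workload_medium = improvement(39.39, 3.85)   # ~90%, matching the 80-90% range
view_requests = improvement(1.691, 1.289)    # ~24%, matching the 20-30% range
```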

For all remaining actions, performance looks similar between the two versions, with slight degradations of around one second or less observed when viewing queues, issues, and the customers page, and when creating a customer request.

We'll continue to invest in improving future performance, so that service desk teams can move with ease through their workspace, and our largest customers can confidently scale.  


Testing methodology

The following sections describe the test environment used in our performance tests, including hardware specifications, and our testing methodology.


Before we started the test, we needed to determine what size and shape of dataset represents a typical large Jira Service Desk instance. To achieve that, we used our analytics data to form a picture of our customers' environments and the difficulties they face when scaling Jira Service Desk in a large organization.

The following table presents the rounded values of the 99th percentile of each data dimension. We used these values to generate a sample dataset with random test data.

Baseline data set

Jira Service Desk data dimension Value
Issues 300,000
Projects 1,000
Agents 1,000
Customers 100,000
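The rounded 99th-percentile values above can be computed with the nearest-rank method. A minimal sketch, using hypothetical per-instance counts rather than Atlassian's actual analytics data:

```python
import math

def nearest_rank_percentile(values, p):
    """Nearest-rank method: the smallest value with at least p% of the data at or below it."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical issue counts per customer instance (illustrative only).
issue_counts = [1_200, 5_400, 30_000, 87_000, 150_000, 290_000, 298_000]
p99 = nearest_rank_percentile(issue_counts, 99)  # 298000, which would round to 300,000
```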

We chose a mix of actions that would represent a sample of the most common user actions. An action in this context is a complete user operation, like opening an issue in the browser window. The following table details the actions that we included in the script, for our testing persona, indicating how many times each action is repeated during a single test run.

Action name Description Times the action was performed during a single test run
Create customer request
Open a customer portal, type in the issue summary and description, then submit the request. ~850
Invite team Select Invite team in the left-hand-side menu, search for an agent on a 1,000 agent instance, choose an agent, click the Invite button, and wait for success confirmation. ~350
View workload report (small) Display the workload report for a project with no open issues. ~85
View workload report (medium)
Display the workload report for a project with 1,000 assigned issues. ~85
View queue: all open issues
Display the default service desk queue, in a project with over 10,000 open issues. ~340
View queue: with SLAs
Display a custom service desk queue, in a project with over 10,000 open issues, with 6 SLA values for each issue.

View customers page Display the Customers page, in a project that has 100,000 customers. ~350
View portals page
Display the help center, with all customer portals, by selecting the unique help center link. ~340
View report: created vs resolved
Display the Created vs Resolved report (in the past year), with over 10,000 issues in the timeline. ~330
View report: time to resolution
Display the Time to resolution report (in the past year), with over 10,000 issues in the timeline. ~340
View requests Display the My requests screen from the customer portal. ~340
View service desk issue Display a service desk issue with 6 SLA values. ~520
View welcome guide Display the Welcome guide from the left-hand-side menu. ~340

The performance tests were all run on a set of AWS EC2 instances. For each test, the entire environment was reset and rebuilt, and then each test started with some idle cycles to warm up instance caches. Below, you can check the details of the environments used for Jira Service Desk Server and Data Center, as well as the specifications of the EC2 instances.

To run the tests, we used 20 scripted browsers and measured the time taken to perform the actions. Each browser was scripted to perform a random action from a predefined list and then immediately move on to the next action. Note that this results in each browser performing substantially more actions than a real user could, so the number of browsers should not be equated with the number of real-world concurrent users.
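The loop each scripted browser runs can be sketched as follows. This is a simplified stand-in: the action names and stub functions are hypothetical, and the real tests drove Google Chrome via WebDriver and Chromedriver, which this sketch does not attempt.

```python
import random
import time

# Stub actions standing in for real browser operations (illustrative only).
ACTIONS = {
    "view_service_desk_issue": lambda: time.sleep(0.001),
    "create_customer_request": lambda: time.sleep(0.001),
}

def run_browser(duration_s: float, timings: dict) -> None:
    """One scripted 'browser': pick a random action, time it, repeat until the deadline."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        name = random.choice(list(ACTIONS))
        start = time.monotonic()
        ACTIONS[name]()
        timings.setdefault(name, []).append(time.monotonic() - start)

timings: dict = {}
run_browser(0.05, timings)  # each real run lasted 40 minutes across 20 browsers
mean_times = {name: sum(t) / len(t) for name, t in timings.items()}
```

The mean of the recorded timings per action is what the response-time table above reports.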

Each test was run for 40 minutes, after which statistics were collected.

Here are the details of our test environment:

Jira Server

The environment consisted of:

  • 1 Jira node
  • A database on a separate node
  • A load generator on a separate node

Jira Data Center

The environment consisted of:

  • 2 Jira nodes
  • A database on a separate node
  • A load generator on a separate node
  • A shared home directory on a separate node
  • A load balancer (AWS ELB HTTP load balancer)
Jira nodes

Hardware
  • EC2 type: c4.8xlarge (see EC2 types)
      Jira Service Desk Server: 1 node
      Jira Service Desk Data Center: 2 nodes
  • CPU: Intel Xeon E5-2666 v3 (Haswell)
  • CPU cores: 36
  • Memory: 60 GB
  • Disk: AWS EBS 100 GB gp2

Software
  • Operating system: Ubuntu 16.04 LTS
  • Java platform: Java 1.8.0
  • Java options: 8 GB heap

Database

Hardware
  • EC2 type: c4.8xlarge (see EC2 types)
  • CPU: Intel Xeon E5-2666 v3 (Haswell)
  • CPU cores: 36
  • Memory: 60 GB
  • Disk:
      Jira Service Desk Server: AWS EBS 100 GB gp2
      Jira Service Desk Data Center: AWS EBS 60 GB gp2

Software
  • Database: MySQL 5.5
  • Operating system: Ubuntu 16.04 LTS

Load generator

Hardware
  • EC2 type: c4.8xlarge (see EC2 types)
  • CPU: Intel Xeon E5-2666 v3 (Haswell)
  • CPU cores: 36
  • Memory: 60 GB
  • Disk: AWS EBS 30 GB gp2

Software
  • Operating system: Ubuntu 16.04 LTS
  • Browser: Google Chrome 62
  • Automation scripts: Chromedriver 2.33, WebDriver 3.4.0, Java JDK 8u131
Last modified on January 29, 2019

