Configure bitbucket-pipelines.yml

The bitbucket-pipelines.yml file defines your Pipelines build configuration. If you're new to Pipelines, see the getting started guide to learn more.


Basic configuration

With a basic configuration, you can do things like writing scripts to build and deploy your projects and configuring caches to speed up builds. You can also specify different images for each step to manage different dependencies across actions you're performing in your pipeline.

A pipeline is made up of a list of steps, and you can define multiple pipelines in the configuration file. In the following example, you can see a pipeline configured under the default section. The pipeline configuration file can have multiple sections identified by particular keywords.

Before you begin

  • At a minimum, the file must contain one pipeline section, with at least one step and one script inside that step.
  • Each step has 4GB of memory available.
  • A pipeline can contain up to 100 steps.
  • Each step in your pipeline runs a separate Docker container. If you want, you can use different types of containers for each step by selecting different images.

Steps

1. To configure the yaml file, in Bitbucket go to your repo > Pipelines. Alternatively, you can configure your yaml file without using Bitbucket's interface.

2. Choose a language.

Note: Pipelines can be configured for building or deploying projects written in any language. See the language guides.

3. Choose an image.

Note: You can edit the file directly from the product when you first get to Pipelines, or anytime from within your pipeline or from your repo.

The file must contain at least one pipeline section, with at least one step and one script inside that step.
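A minimal configuration looks like this (the step name and command are illustrative):

```yaml
pipelines:
  default:
    - step:
        name: Hello
        script:
          - echo "Hello, Pipelines!"
```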

Section                                                                 Description


default - Contains the pipeline definition for all branches that don't match a pipeline definition in other sections.

The default pipeline runs on every push to the repository unless a branch-specific pipeline is defined. You can define a branch pipeline in the branches section.

Note: The default pipeline doesn't run on tags or bookmarks.



branches - Defines a section for all branch-specific build pipelines. The names or expressions in this section are matched against:

  • Branches in your Git repository
  • Named branches in your Mercurial repository

See Branch workflows for more information about configuring pipelines to build specific branches in your repository.

Tip: Check out the glob patterns cheat sheet to define the branch names.



tags - Defines all tag-specific build pipelines. The names or expressions in this section are matched against tags and annotated tags in your Git repository.

Tip: Check out the glob patterns cheat sheet to define your tags.

bookmarks - Defines all bookmark-specific build pipelines. The names or expressions in this section are matched against bookmarks in your Mercurial repository.

image: node:10.15.0
   
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  bookmarks:                      # add the 'bookmarks' section
    release-*:                    # specify the bookmark
      - step:                     # define the build pipeline for the bookmark
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!"

Tip: Check out the glob patterns cheat sheet to define your bookmarks.


pull-requests - A special pipeline that only runs on pull requests initiated from within your repo. It merges the destination branch into your working branch before it runs. Pull requests from a forked repository don't trigger the pipeline. If the merge fails, the pipeline stops.

Important

Pull request pipelines run in addition to any branch and default pipelines that are defined, so if the definitions overlap you may get 2 pipelines running at the same time.

If you already have branches in your configuration, and you want them all to only run on pull requests, replace the keyword branches with pull-requests.

pipelines:
  pull-requests:
    '**': #this runs as default for any branch not elsewhere defined
      - step:
          script:
            - ...
    feature/*: #any branch with a feature prefix
      - step:
          script:
            - ...
  branches:    #these will run on every push of the branch
    staging:
      - step:
          script:
            - ...

Tip: Check out the glob patterns cheat sheet to define the pull-requests.

custom - Defines pipelines that can only be triggered manually or scheduled from the Bitbucket Cloud interface.

image: node:10.15.0
    
pipelines:
  custom: # Pipelines that are triggered manually
    sonar: # The name that is displayed in the list in the Bitbucket Cloud GUI
      - step:
          script:
            - echo "Manual triggers for Sonar are awesome!"
    deployment-to-prod: # Another display name
      - step:
          script:
            - echo "Manual triggers for deployments are awesome!"
  branches:  # Pipelines that run automatically on a commit to a branch
    staging:
      - step:
          script:
            - echo "Auto pipelines are cool too."

With a configuration like the one above, the sonar and deployment-to-prod pipelines appear in the Run pipeline dialog in Bitbucket Cloud.

For more information, see Run pipelines manually.


Example:

image: node:10.15.0
   
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  tags:                         # add the 'tags' section
    release-*:                  # specify the tag
      - step:                   # define the build pipeline for the tag
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!"


Advanced configuration

Use the advanced options for running services and running tests in parallel. You can also configure a manual step, set a maximum time for each step, and configure 2x steps to get 8GB of memory.

Before you begin

  • A pipeline YAML file must have at least one section with a keyword and one or more steps.
  • Each step has 4GB of memory available.
  • A pipeline can contain up to 100 steps.
  • Each step in your pipeline runs a separate Docker container. If you want, you can use different types of containers for each step by selecting different images.

Global configuration options

Keywords list

Keyword                                                                Description


variables - [Custom pipelines only] Contains variables that are supplied when a pipeline is launched. To enable the variables, define them under the custom pipeline that you want to enter when you run the pipeline:

pipelines:
  custom:
    custom-name-and-region: #name of this pipeline
      - variables:          #list variable names under here
          - name: Username
          - name: Region
      - step: 
          script:
            - echo "User name is $Username"
            - echo "and they are in $Region"

Then, when you run a custom pipeline (Branches ⋯ Run pipeline for a branch > Custom:..) you'll be able to fill them in.

The keyword variables can also be part of the definition of a service.


name - When the keyword name is in the variables section of your yaml, it defines variables that you can add or update when running a custom pipeline. Pipelines can also use the keyword name inside a step.

parallel - Parallel steps enable you to build and test faster, by running a set of steps at the same time. The total number of build minutes used by a pipeline will not change if you make the steps parallel, but you'll be able to see the results sooner.

The total number of steps you can run in a pipeline is limited to 100, whether they run in parallel or serially.

Indent the steps that you want to run concurrently:

pipelines:
  default:
    - step: # non-parallel step
        name: Build
        script:
          - ./build.sh
    - parallel: # these 2 steps will run in parallel
        - step:
            name: Integration 1
            script:
              - ./integration-tests.sh --batch 1
        - step:
            name: Integration 2
            script:
              - ./integration-tests.sh --batch 2
    - step:          # non-parallel step
        script:
          - ./deploy.sh

Learn more about parallel steps.

step - Defines a build execution unit. Steps are executed in the order that they appear in the bitbucket-pipelines.yml file. You can use up to 100 steps in a pipeline.

Each step in your pipeline will start a separate Docker container to run the commands configured in the script. Each step can be configured to:

  • Use a different Docker image.
  • Configure a custom max-time.
  • Use specific caches and services.
  • Produce artifacts that subsequent steps can consume.

Steps can be configured to wait for a manual trigger before running. To define a step as manual, add trigger: manual to the step in your bitbucket-pipelines.yml file. Manual steps:

  • Can only be executed in the order that they are configured. You cannot skip a manual step.
  • Can only be executed if the previous step has completed successfully.
  • Can only be triggered by users with write access to the repository.
  • Are triggered through the Pipelines web interface.

If your build uses both manual steps and artifacts, the artifacts are stored for 7 days following the execution of the step that produced them. After this time, the artifacts expire and any manual steps in the pipeline can no longer be executed.

Note: You can't configure the first step of a pipeline as a manual step.

name - Defines a name for a step to make it easier to see what each step is doing in the display.


image - Bitbucket Pipelines uses Docker containers to run your builds.

  • You can use the default image (atlassian/default-image:2) provided by Bitbucket or define a custom image. You can specify any public or private Docker image that isn't hosted on a private network.
  • Images can be defined at the global or step level. You can't define an image at the branch level.

To specify an image, use image: <your_account/repository_details>:<tag>

For more information about using and creating images, see Use Docker images as build environments.
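For example, a global image can be overridden for a single step (the image tags here are illustrative):

```yaml
image: node:10.15.0          # global build image for all steps

pipelines:
  default:
    - step:
        name: Build          # runs in node:10.15.0
        script:
          - npm install
    - step:
        name: Deploy
        image: python:3.7.2  # overrides the global image for this step only
        script:
          - python deploy.py
```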


trigger - Specifies whether a step will run automatically or only after someone manually triggers it. You can define the trigger type as manual or automatic. If the trigger type is not defined, the step defaults to running automatically. The first step cannot be manual. If you want a whole pipeline to run only on a manual trigger, use a custom pipeline.

pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        image: python:3.7.2
        trigger: manual
        script:
          - python deploy.py


deployment - Sets the type of environment for your deployment step; it is used in the Deployments dashboard. Valid values are: test, staging, or production.

The following step will display in the test environment in the Deployments view:

- step:
    name: Deploy to test
    image: aws-cli:1.0
    deployment: test
    script:
      - python deploy.py test

size - You can allocate additional resources to a step, or to the whole pipeline. By specifying a size of 2x, you'll have double the resources available (e.g. 4GB memory → 8GB memory).

At this time, valid sizes are 1x and 2x.

2x pipelines will use twice the number of build minutes.

Example: Overriding the size of a single step
pipelines:
  default:
    - step:
        script:
          - echo "All good things..."
    - step:
        size: 2x # Double resources available for this step.
        script:
          - echo "Come to those who wait."
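To double the resources for every step instead, the size keyword can also be set globally; a sketch, assuming the options-level size setting:

```yaml
options:
  size: 2x          # every step in this file gets double resources
pipelines:
  default:
    - step:
        script:
          - echo "This step has 8GB of memory available."
```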

script - Contains a list of commands that are executed in sequence. Scripts are executed in the order in which they appear in a step. We recommend that you move large scripts to a separate script file and call it from the bitbucket-pipelines.yml.
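For example, a long build script can be committed to the repository as a file and invoked from the step (the path ci/build.sh is hypothetical):

```yaml
pipelines:
  default:
    - step:
        name: Build
        script:
          - chmod +x ci/build.sh   # make the committed script executable
          - ./ci/build.sh          # hypothetical script file in the repo
```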

pipe - Pipes make complex tasks easier, by doing a lot of the work behind the scenes. This means you can just select which pipe you want to use, and supply the necessary variables. You can look at the repository for the pipe to see what commands it is running. Learn more about pipes.

A pipe that sends a message to Opsgenie looks like this:

pipelines:
  default:
    - step:
        name: Alert Opsgenie
        script:
          - pipe: atlassian/opsgenie-send-alert:0.2.0
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: "Danger, Will Robinson!"
              DESCRIPTION: "An Opsgenie alert sent from Bitbucket Pipelines"
              SOURCE: "Bitbucket Pipelines"
              PRIORITY: "P1"

You can also create your own pipes. If you do, you can specify a Docker-based pipe with the syntax:

 pipe: docker://<DockerAccountName>/<ImageName>:<version>
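For instance, a step could call such a custom pipe like this (the account, image name, and variable are hypothetical):

```yaml
pipelines:
  default:
    - step:
        name: Run a custom pipe
        script:
          - pipe: docker://my-account/my-pipe:1.0.0  # hypothetical pipe image
            variables:
              MY_VARIABLE: "value"                   # hypothetical pipe variable
```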

after-script - Commands inside an after-script section will run when the step succeeds or fails. This could be useful for clean-up commands, test coverage, notifications, or rollbacks you might want to run, especially if your after-script uses the value of BITBUCKET_EXIT_CODE.

If a command in the after-script section fails:

  • we don't run any more commands in that section.
  • it will not affect the reported status of the step.
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
        after-script:
          - echo "after script has run!"

artifacts - Defines files that are produced by a step, such as reports and JAR files, that you want to share with a following step.

Artifacts can be defined using glob patterns.

pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy to production
        image: python:3.7.2
        script:
          - python deploy-to-production.py

For more information, see using artifacts in steps.

options - Contains global settings that apply to all your pipelines. The main keyword you'd use here is max-time.

max-time - You can define the maximum number of minutes a step can execute for, at the global level or the step level. Use a whole number greater than 0 and less than 120.

options:
  max-time: 60
pipelines:
  default:
    - step:
        name: Sleeping step
        script:
          - sleep 120m # This step will timeout after 60 minutes
    - step:
        name: quick step
        max-time: 5
        script:
          - sleep 120m #this step will timeout after 5 minutes

If you don't specify max-time, it defaults to 120.

clone - Contains settings for when we clone your repository into a container. Settings here include:

  • lfs - Support for Git LFS
  • depth - the depth of the Git clone.

lfs (Git only) - Enables the download of LFS files in your clone. lfs defaults to false if not specified. Note that this keyword is supported only for Git repositories.

clone:
  lfs: true
   
pipelines:
  default:
    - step:
        name: Clone and download
        script:
          - echo "Clone and download my LFS files!"

depth (Git only) - Defines the depth of Git clones for all pipelines. Note that this keyword is supported only for Git repositories.

Use a whole number greater than 0 to specify the depth. Use full for a full clone. If you don't specify the Git clone depth, it defaults to 50.

clone:
  depth: 5       # include the last five commits
  
pipelines:
  default:
    - step:
        name: Cloning
        script:
          - echo "Clone all the things!"

definitions - Define resources used elsewhere in your pipeline configuration. Resources can include:

services - Pipelines can spin up separate Docker containers for services, which results in faster builds and easy service editing.

Example of a fully configured service

If you want a MySQL service container (a blank database available on localhost:3306 with a default database pipelines, user root and password let_me_in) you could add:

definitions:
  services:
    mysql:
      image: mysql
      variables:
        MYSQL_DATABASE: pipelines
        MYSQL_ROOT_PASSWORD: let_me_in
pipelines:
  default:
   - step:
      services:
        - docker
        - mysql
      script:
        - ...

Learn more about how to use services.

caches - Re-downloading dependencies from the internet for each step of a build can take a lot of time. Using a cache, they are downloaded once to our servers and then locally loaded into the build each time.

definitions:
  caches:
    bundler: vendor/bundle   # custom cache: a name and the directory to cache
pipelines:
  default:
   - step:
      caches:
        - npm                # npm is one of the predefined caches
      script:
        - npm install


YAML anchors - A way to define a chunk of your yaml for easy re-use. See YAML anchors.
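A sketch of how an anchor can cut repetition: an anchor (&) names a block of yaml, and an alias (*) reuses it (the step contents are illustrative):

```yaml
definitions:
  steps:
    - step: &build-test      # '&' defines the anchor
        name: Build and test
        script:
          - npm install
          - npm test

pipelines:
  branches:
    develop:
      - step: *build-test    # '*' reuses the anchored step
    master:
      - step: *build-test
```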







Last updated September 30, 2019
