Test with databases in Bitbucket Pipelines

When testing with a database, we recommend that you use service containers to run database services in a linked container. Docker has a number of official images of popular databases on Docker Hub.

This page has example bitbucket-pipelines.yml files showing how to connect to the following DB types.

You can check your bitbucket-pipelines.yml file with our online validator.

See also Use services and databases in Bitbucket Pipelines.

Alternatively, you can use a Docker image that contains the database you need – see Use a Docker image configured with a database on this page.

MongoDB

Using the Mongo image on Docker Hub.

image: node:6.9.4
pipelines: 
  default: 
    - step: 
        script: 
          - npm install 
          - npm test 
        services: 
          - mongo 

definitions: 
  services: 
    mongo: 
      image: mongo

MongoDB will be available on 127.0.0.1:27017 without authentication. When you connect to a database, MongoDB will create it for you if it does not already exist.

Note that MongoDB's default configuration only listens for connections on IPv4, whereas some platforms (like Ruby) default to connecting via IPv6 if your Mongo connection is configured to use localhost. This is why we recommend connecting on 127.0.0.1 rather than localhost.
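For example, a test suite running on the Node image above might build its connection string like this (`mongoUri` is a hypothetical helper, not part of Pipelines; the official `mongodb` npm package is one driver that would accept the resulting URI):

```javascript
// Hypothetical helper: build the connection URI for the mongo service container.
// Use 127.0.0.1 rather than localhost so clients that prefer IPv6 still reach
// MongoDB's IPv4-only listener.
function mongoUri(dbName) {
  return `mongodb://127.0.0.1:27017/${dbName}`;
}

// With the "mongodb" npm package (an assumption, not required by Pipelines):
//   MongoClient.connect(mongoUri('pipelines'), (err, db) => { /* run tests */ });
```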


MySQL – test user

Using the MySQL image on Docker Hub.

image: node:6.9.4 
pipelines: 
  default: 
    - step: 
        script: 
          - npm install 
          - npm test 
        services: 
          - mysql

definitions: 
  services: 
    mysql: 
      image: mysql:5.7 
      environment: 
        MYSQL_DATABASE: 'pipelines'
        MYSQL_RANDOM_ROOT_PASSWORD: 'yes' 
        MYSQL_USER: 'test_user'
        MYSQL_PASSWORD: 'test_user_password'

Connecting to MySQL

If you use the example above, MySQL (version 5.7) will be available on:

  • Host name: 127.0.0.1 (avoid using localhost, as some clients will attempt to connect via a local "Unix socket", which will not work in Pipelines)
  • Port: 3306
  • Default database: pipelines
  • User: test_user, password: test_user_password. (The root user of MySQL will not be accessible.)

You will need to populate the pipelines database with your tables and schema. If you need to configure the underlying database engine further, refer to the official Docker Hub image for details.
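As a sketch, the settings above could be collected into a connection object for your test code (`mysqlConfig` is a hypothetical helper; the `mysql` npm package is one client that accepts an object of this shape):

```javascript
// Hypothetical helper: connection settings matching the mysql service
// definition above.
function mysqlConfig() {
  return {
    host: '127.0.0.1',   // not "localhost": that would attempt a Unix socket
    port: 3306,
    database: 'pipelines',
    user: 'test_user',
    password: 'test_user_password',
  };
}

// With the "mysql" npm package (an assumption, not required by Pipelines):
//   const connection = require('mysql').createConnection(mysqlConfig());
//   connection.connect();
```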


MySQL – root user

Using the MySQL image on Docker Hub.

image: node:6.9.4
pipelines: 
  default: 
    - step: 
        script: 
          - npm install 
          - npm test 
        services: 
          - mysql 

definitions: 
  services: 
    mysql: 
      image: mysql:5.7 
      environment: 
        MYSQL_DATABASE: 'pipelines' 
        MYSQL_ROOT_PASSWORD: 'let_me_in'

Connecting to MySQL

If you use the example above, MySQL (version 5.7) will be available on:

  • Host name: 127.0.0.1 (avoid using localhost, as some clients will attempt to connect via a local "Unix socket", which will not work in Pipelines)
  • Port: 3306
  • Default database: pipelines
  • User: root, password: let_me_in

You will need to populate the pipelines database with your tables and schema. If you need to configure the underlying database engine further, refer to the official Docker Hub image for details.


PostgreSQL – default user

Using the Postgres image on Docker Hub.

image: node:6.9.4 
pipelines: 
  default: 
    - step: 
        script: 
          - npm install 
          - npm test 
        services: 
          - postgres 

definitions: 
  services: 
    postgres: 
      image: postgres

PostgreSQL will be available on localhost:5432 with the default database 'postgres', user 'postgres' and no password. You will need to populate the postgres database with your tables and schema, or create a second database for your use. If you need to configure the underlying database engine further, refer to the official Docker Hub image for details.
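As a sketch, those defaults can be expressed as a connection object (`postgresConfig` is a hypothetical helper; the `pg` npm package is one client that accepts an object of this shape):

```javascript
// Hypothetical helper: connection settings for the default postgres service.
function postgresConfig() {
  return {
    host: 'localhost',
    port: 5432,
    database: 'postgres',
    user: 'postgres',
    password: '',        // the default image sets no password
  };
}

// With the "pg" npm package (an assumption, not required by Pipelines):
//   const client = new (require('pg').Client)(postgresConfig());
//   client.connect();
```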


PostgreSQL – test user

Using the Postgres image on Docker Hub.

image: node:6.9.4
pipelines: 
  default: 
    - step: 
        script: 
          - npm install
          - npm test
        services: 
          - postgres

definitions: 
  services: 
    postgres: 
      image: postgres 
      environment: 
        POSTGRES_DB: 'pipelines' 
        POSTGRES_USER: 'test_user'
        POSTGRES_PASSWORD: 'test_user_password'

PostgreSQL will be available on localhost:5432 with a default database named 'pipelines', user 'test_user' and password 'test_user_password'. You will need to populate the pipelines database with your tables and schema. If you need to configure the underlying database engine further, refer to the official Docker Hub image for details.


Redis

Using the Redis image on Docker Hub.

image: node:6.9.4 
pipelines: 
  default: 
    - step: 
        script: 
          - npm install 
          - npm test 
        services: 
          - redis 

definitions: 
  services: 
    redis: 
      image: redis

Redis will be available on localhost:6379 without authentication.
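A test suite might address the service with a standard Redis URL (`redisUrl` is a hypothetical helper; the `redis` npm package is one client that accepts such a URL):

```javascript
// Hypothetical helper: build the URL for the unauthenticated redis service.
function redisUrl() {
  return 'redis://localhost:6379';
}

// With the "redis" npm package (an assumption, not required by Pipelines):
//   const client = require('redis').createClient(redisUrl());
```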


Cassandra

Using the Cassandra image on Docker Hub.

image: node:6.9.4 
pipelines: 
  default: 
    - step:
        script: 
          - npm install 
          - sleep 10 # wait for cassandra 
          - npm test 
        services: 
          - cassandra 

definitions: 
  services: 
    cassandra: 
      image: cassandra 
      environment: 
        MAX_HEAP_SIZE: '512M' # Need to restrict the heapsize or else Cassandra will OOM 
        HEAP_NEWSIZE: '128M'

Cassandra will be available on localhost:9042.


Use a Docker image configured with a database

As an alternative to running a separate container for the database (which is our recommended approach), you can use a Docker image that already has the database installed. The following images for Node and Ruby contain databases, and can be extended or modified for other languages and databases.

Last modified on May 1, 2018
