
I'm trying to set up a CI pipeline which uses a Docker container to run tests. The pipeline is supposed to create a container based on an image I already have and remove that container when it's finished.

For my tests I need to mount a few volumes and bind a few ports from my runner to my container, so to simplify things I want to use a docker-compose file stored at a fixed path, /home/runner/docker/docker-compose.yml, on my runner.

The problem is as follows: in my docker-compose.yml I have the following lines, binding the current working directory to the HTML folder in my container:

volumes:
  - .:/var/www/html

When I run the command docker-compose -f "/home/runner/docker/docker-compose.yml" up -d, . should be whichever folder GitLab CI cloned my project to, not /home/runner/docker, as is currently the case.

Is there a way to make it so that . is my cloned project folder (without hardcoding the name), or am I better off just executing a docker run in my GitLab CI script?

  • Does symlinking the docker-compose.yml file into the current directory and running docker-compose from there help? Commented Dec 2, 2019 at 12:47

1 Answer


One option could be to use an environment variable to define the path to the repo, so that instead of

volumes:
  - .:/var/www/html

you have

volumes:
  - ${YOUR_REPO}:/var/www/html

This way you only need to export YOUR_REPO before running docker-compose, and that's it.
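For example, GitLab CI exposes the predefined variable CI_PROJECT_DIR, which already holds the path the project was cloned to, so the job can forward it as YOUR_REPO. A minimal sketch (the fallback path is purely illustrative):

```shell
# Sketch of a CI job step: forward GitLab's clone path to docker-compose.
# CI_PROJECT_DIR is set by GitLab CI; the fallback value here is only
# for illustration outside a CI run.
YOUR_REPO="${CI_PROJECT_DIR:-/builds/group/project}"
export YOUR_REPO

# docker-compose interpolates ${YOUR_REPO} in the volumes entry,
# producing a bind mount of the form shown below.
echo "${YOUR_REPO}:/var/www/html"
```

With the variable exported, the compose file can then be launched as before with docker-compose -f "/home/runner/docker/docker-compose.yml" up -d.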

