Laravel Vapor Docker Runtime with GitLab CI does not work

I use Laravel Vapor for deploying our microservices based on Laravel. This works very well as long as the app and its dependencies are not too large, but beyond that it gets a little tricky. For this case Vapor provides a Docker runtime that allows you to deploy apps up to 10 GB in size.
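For context, switching an environment to the Docker runtime is essentially just a setting in vapor.yml plus an environment-specific Dockerfile next to it; roughly like this (id and name are placeholders):

id: 12345
name: example-service
environments:
  test:
    runtime: docker
    build:
      - 'composer install --no-dev'

With runtime: docker, Vapor builds the image from a test.Dockerfile in the project root (one Dockerfile per environment), if I remember the convention correctly.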

For local development we usually use Laradock.io because it's easy and flexible.

That means when we deploy from our local environment, it is easy to enter the workspace container and run the Vapor deploy commands. After enabling the Docker client for the workspace container, this also works properly with the Vapor Docker runtime.
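(In Laradock, enabling the Docker client is just a flag in the .env file before rebuilding the workspace container; from memory the relevant setting is roughly:

WORKSPACE_INSTALL_DOCKER_CLIENT=true

followed by a docker-compose build workspace.)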

Now we have integrated the deployment process into our GitLab CI pipeline. That works very well for our small services using the Vapor PHP runtime, but with the Docker runtime I am stuck on the CI deployment.

The Docker runtime needs a working Docker installation wherever Vapor is invoked. That means that in .gitlab-ci.yml I have to use an image with both Docker and PHP installed in order to run the Vapor scripts.

So I created a Docker image based on the Laradock workspace container, but the GitLab runner always exits with the error message that no Docker daemon is available.
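For illustration, the idea behind the image is simply a PHP/Composer base with the Docker CLI on top; a simplified sketch (my real image is based on the Laradock workspace, and the package name may differ per base image):

# Sketch: PHP + Composer base image with the Docker client added
FROM laravelphp/vapor:php80

# Vapor shells out to the local docker binary when using the Docker runtime
RUN apk add --no-cache docker-cli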

This is the relevant part of my .gitlab-ci.yml (the image is only available locally):

testing:
  image: 
    name: lexitaldev/vapor-docker-deploy:latest
    pull_policy: never
  securityContext:
    privileged: true
  environment: testing
  stage: deploy
  only:
    - test
  script:
    - composer install
    - php vendor/bin/vapor deploy test

This is the specific output:

  Error Output:                                                                
  ================                                                             
  Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the   
  docker daemon running? 

I've also tried using the standard 'laravelphp/vapor:php80' image and installing Docker via a before_script section:

before_script:
  - apk add docker
  - addgroup root docker

But nothing helped. There seems to be a problem with the docker.sock.
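For what it's worth, a quick way to check whether the job can reach a daemon at all is to probe it at the start of the script:

script:
  - docker info   # fails with the same daemon error if no socket or DOCKER_HOST is reachable
  - composer install
  - php vendor/bin/vapor deploy test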

Has anybody managed to add a Vapor Docker runtime deployment to their CI scripts?

Best, Michael



Solution 1:[1]

I would like to tell you that you only need to add the dind service, but after you do that it will throw an error related to the image that GitLab creates for your pipelines. So you also need to register a runner with volumes, the privileged flag, and tags.

I did it using gitlab-runner on my machine:

sudo gitlab-runner register -n \
    --url {{ your_url }} \
    --registration-token {{your_token}} \
    --executor docker \
    --description "{{ Describe your runner }}" \
    --docker-image "docker:20.10.12-alpine3.15" \
    --docker-privileged \
    --docker-volumes="/certs/client" \
    --docker-volumes="cache" \
    --docker-volumes="/var/run/docker.sock:/var/run/docker.sock"
    --tag-list {{ a_tag_for_your_pipeline }}
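After registering, you should end up with roughly the following in /etc/gitlab-runner/config.toml (a sketch, with the values shortened):

[[runners]]
  name = "{{ Describe your runner }}"
  url = "{{ your_url }}"
  executor = "docker"
  [runners.docker]
    image = "docker:20.10.12-alpine3.15"
    privileged = true
    volumes = ["/certs/client", "cache", "/var/run/docker.sock:/var/run/docker.sock"]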

Once you have done that, you need to use a stable Docker version in your gitlab-ci.yml file. For some reason it didn't work when I tried version 20 or latest:

image: docker:stable
services:
    -   name: docker:stable-dind

before_script:
    - echo $CI_JOB_TOKEN | docker login $CI_REGISTRY -u $CI_REGISTRY_USER --password-stdin

build:
    tags:
        - {{the tag you defined in your runner}}
    variables:
        IMAGE_TAG: $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG
    script:
        - echo $IMAGE_TAG
        - docker build -t $IMAGE_TAG -f {{your Dockerfile}} .
        - docker push $IMAGE_TAG

All of these variables are predefined by GitLab, so don't worry, you can copy and paste. I also included some of the advice from GitLab's documentation for pushing your Docker image to the GitLab Container Registry.
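One extra note from the GitLab documentation: since the runner mounts the /certs/client volume, the dind service is expected to generate its TLS certificates there, which you enable by setting this variable in the job (or globally):

variables:
  DOCKER_TLS_CERTDIR: "/certs"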

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Aarón Cervantes