Bitbucket Pipelines env variables

I'm deploying my React app to an S3 bucket and I have a lot of environment variables, so the question is: how can I handle them in Bitbucket Pipelines?
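
For context, values defined as repository or deployment variables in the Bitbucket UI are injected into every pipeline script as plain shell variables (the deploy step below already relies on this for $AWS_ACCESS_KEY_ID). A minimal sketch, assuming a hypothetical variable REACT_APP_API_URL has been defined in the repository settings:

pipelines:
  default:
    - step:
        name: Build and Test
        script:
          - npm install
          # repository/deployment variables defined in the Bitbucket UI are
          # available as shell variables; REACT_APP_API_URL is a hypothetical name
          - echo "building against $REACT_APP_API_URL"
          - npm run build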

Error in the pipeline

+ npm run build
> [email protected] build /opt/atlassian/pipelines/agent/build
> env-cmd -f .env.prod react-scripts build  && cp build/index.html build/200.html
Error: Failed to find .env file at path: .env.prod
    at getEnvFile (/opt/atlassian/pipelines/agent/build/node_modules/env-cmd/dist/get-env-vars.js:40:19)
    at process._tickCallback (internal/process/next_tick.js:68:7)
    at Function.Module.runMain (internal/modules/cjs/loader.js:834:11)
    at startup (internal/bootstrap/node.js:283:19)
    at bootstrapNodeJSCore (internal/bootstrap/node.js:623:3)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] build: `env-cmd -f .env.prod react-scripts build  && cp build/index.html build/200.html`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR!     /root/.npm/_logs/2022-03-01T10_17_14_844Z-debug.log
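
env-cmd looks for .env.prod relative to the working directory, so the error means the file simply is not present in the Pipelines clone; env files like this are usually gitignored and therefore never reach the build agent. A quick diagnostic sketch, assuming the default clone directory, that could be dropped into the build step before npm run build:

script:
  - npm install
  # diagnostic only: list whichever env files actually made it into the clone;
  # "|| true" keeps the step from failing when nothing matches
  - ls -la .env* || true
  - npm run build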

bitbucket-pipelines.yml

image: node:10

# Workflow Configuration

pipelines:
  default:
    - parallel:
      - step:
          name: Build and Test
          caches:
            - node
          script:
            - npm install
            # CI=true in default variables for Bitbucket Pipelines https://support.atlassian.com/bitbucket-cloud/docs/variables-in-pipelines/
      
  branches:
    master:
      - parallel:
        - step:
            name: Build and Test
            caches:
              - node
            script:
              - npm install
              # CI=true in default variables for Bitbucket Pipelines https://support.atlassian.com/bitbucket-cloud/docs/variables-in-pipelines/
              - npm run build
            artifacts:
              - build/**
        - step:
            name: Security Scan
            script:
              # Run a security scan for sensitive data.
              # See more security tools at https://bitbucket.org/product/features/pipelines/integrations?&category=security
              - pipe: atlassian/git-secrets-scan:0.5.1
      - step:
          name: Deploy to Production
          deployment: Production
          trigger: manual
          clone:
            enabled: false
          script:
            # sync your files to S3
            - pipe: atlassian/aws-s3-deploy:1.1.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_BUCKET
                LOCAL_PATH: 'build'
            # triggering a distribution invalidation to refresh the CDN caches
            - pipe: atlassian/aws-cloudfront-invalidate:0.6.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                DISTRIBUTION_ID: '123xyz'



Solution 1 [1]

Simply add a line to the script, at the point where the variables are needed, to load the file into the environment:

- export $(cat .env | xargs)
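
A sketch of how that line could sit in the Build and Test step from the question's config, assuming the env file is actually committed (or otherwise written) into the clone; note that the xargs trick breaks if any value contains spaces or shell metacharacters:

- step:
    name: Build and Test
    caches:
      - node
    script:
      - npm install
      # load KEY=VALUE pairs from the env file into the shell before building
      # (assumes the file exists in the clone and values contain no spaces)
      - export $(cat .env | xargs)
      - npm run build

If the values live in secured repository or deployment variables rather than a file, they are already exported into the shell and no extra line is needed.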

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: Tim-Erwin