Disable Caching on Google Cloud Storage

I have been using GCS to store my images, and I use the Node.js package to upload these images to my bucket. I have noticed that if I change an image frequently, one of the following happens:

  • It changes
  • It serves an old image
  • It doesn't change

This seems to happen fairly randomly, despite setting all of the options properly and cross-referencing them against GCS.

I upload my images like this:

const options = {
    destination,
    public: true,
    resumable: false,
    metadata: {
      cacheControl: 'no-cache, max-age=0',
    },
  };

  const file = await this.bucket.upload(tempImageLocation, options);
  const { bucket, name, generation } = file[0].metadata;

  const imageUrl = `https://storage.googleapis.com/${bucket}/${name}`;

I have debated whether to use the base URL you see there or use this one: https://storage.cloud.google.com.

I can't seem to figure out what I am doing wrong and how to always serve a fresh image. I have also tried ?ignoreCache=1 and other query parameters.



Solution 1:[1]

As the official API documentation shows, you should not need the await. This might be affecting your uploads sometimes. If you want to use await, you need to declare your function as async, as shown in the second example from the documentation. Your code should look like this:

  const bucketName = 'Name of a bucket, e.g. my-bucket';
  const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();

  async function uploadFile() {
    // Uploads a local file to the bucket
    await storage.bucket(bucketName).upload(filename, {
      // Support for HTTP requests made with `Accept-Encoding: gzip`
      gzip: true,
      // By setting the option `destination`, you can change the name of the
      // object you are uploading to a bucket.
      metadata: {
        // Enable long-lived HTTP caching headers
        // Use only if the contents of the file will never change
        // (If the contents will change, use cacheControl: 'no-cache')
        cacheControl: 'public, max-age=31536000',
      },
    });

    console.log(`${filename} uploaded to ${bucketName}.`);
  }

  uploadFile().catch(console.error);

While this is untested, it should help you avoid the issue of the images not always being uploaded.

Besides that, as explained in the official documentation on editing object metadata, you can change how metadata - which includes cache control - is used and managed by your project. This way, you can change your cache configuration as well.
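For instance, the Cache-Control header of an object that has already been uploaded can be patched with the Node.js client's `setMetadata` call. A minimal, untested sketch with placeholder bucket and file names; it assumes an authenticated `Storage` client from the `@google-cloud/storage` package is passed in:

```javascript
// Cache-Control value matching the one used in the question.
const cacheControl = 'no-cache, max-age=0';

// Sketch: patch only the metadata of an existing object; the object's
// data is left untouched. `storage` is an authenticated Storage client
// created with `new Storage()` from @google-cloud/storage.
async function disableCaching(storage, bucketName, fileName) {
  await storage.bucket(bucketName).file(fileName).setMetadata({ cacheControl });
}
```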

I would also like to include the link below to a complete tutorial on how to send images to Cloud Storage with Node.js, in case you want to check a different approach.

Let me know if the information helped you!

Solution 2:[2]

You can try changing ?ignoreCache=1 to ?ignoreCache=0.
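An arbitrary query parameter like ?ignoreCache only helps if its value actually changes between uploads. A hedged alternative, building on the `generation` value already extracted in the question's upload code: GCS assigns a new generation number every time an object is overwritten, so appending it produces a URL that changes exactly when the content does. The `buildImageUrl` helper below is hypothetical, not part of any library:

```javascript
// Hypothetical helper: build a public URL whose query string changes
// whenever the object is overwritten, because GCS assigns a new
// `generation` on every upload of the same object name.
function buildImageUrl(bucket, name, generation) {
  return `https://storage.googleapis.com/${bucket}/${name}?generation=${generation}`;
}

// Example with placeholder values shaped like file[0].metadata:
console.log(buildImageUrl('my-bucket', 'images/avatar.png', '1650000000000001'));
// → https://storage.googleapis.com/my-bucket/images/avatar.png?generation=1650000000000001
```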

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 gso_gabriel
Solution 2 Quyết Nghiêm