How to get the file size of objects from the google-cloud Python library?

Problem

Hello everyone. I am currently trying to get the file size of an object programmatically using the google-cloud Python library.

Here is what I have right now.

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("example-bucket-name")
blob = bucket.blob("example-object-name.jpg")

print(blob.exists())
# True

print(blob.chunk_size)
# None

It seems to me that the google-cloud library is not loading data into attributes such as chunk_size, content_type, and so on.

Question

How do I force google-cloud to load actual data into the blob's metadata attributes, instead of leaving everything as None?



Solution 1:[1]

Read the size attribute on the Blob returned by get_blob(). Unlike bucket.blob(), which only creates a local reference, get_blob() makes a request to the API and returns a Blob with its metadata populated.

from google.cloud import storage

# create client
client: storage.client.Client = storage.Client('projectname')

# get bucket
bucket: storage.bucket.Bucket = client.get_bucket('bucketname')

size_in_bytes = bucket.get_blob('filename').size
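
If you already have a Blob handle created with bucket.blob(), another option is to call reload() on it, which fetches the object's metadata from the server and fills in the attributes in place. A minimal sketch (the bucket and object names are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket-name")

# bucket.blob() only builds a local reference; no request is made,
# so metadata attributes such as size start out as None.
blob = bucket.blob("example-object-name.jpg")

# reload() issues a GET request for the object's metadata and populates
# size, content_type, updated, etc. on the existing Blob instance.
blob.reload()

print(blob.size)          # size in bytes
print(blob.content_type)  # e.g. "image/jpeg"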

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source

Solution 1: orby