How to transfer a batch of data between GCS buckets using Cloud Functions

Rajathithan Rajasekar
4 min read · Mar 1, 2023

In this post, we will see how to transfer data from one Cloud Storage bucket to another using Eventarc and Cloud Functions.

Consider a scenario where data is uploaded to a source bucket at different times during the day, and all of it needs to be staged together in another GCS bucket for further processing down the data pipeline. Accumulating the data into a single batch enables efficient, cost-saving batch operations.

Batch load of data from GCS to GCS

Step 1:

Enable the required APIs when prompted at each step.

Create your source bucket in the region of your choice, keep the default options, and enforce public access prevention.
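If you prefer to script this step, a minimal sketch with the google-cloud-storage Python client could look like the following (the project ID, bucket name, and region below are placeholders, not values from this post):

```python
from google.cloud import storage

# Placeholder values; replace with your own project, bucket name and region.
PROJECT_ID = "my-project"
SOURCE_BUCKET = "my-source-bucket"
REGION = "us-central1"

client = storage.Client(project=PROJECT_ID)

bucket = client.bucket(SOURCE_BUCKET)
# Keep the bucket private: uniform bucket-level access + public access prevention.
bucket.iam_configuration.uniform_bucket_level_access_enabled = True
bucket.iam_configuration.public_access_prevention = "enforced"

client.create_bucket(bucket, location=REGION)
print(f"Created bucket gs://{SOURCE_BUCKET} in {REGION}")
```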

Step 2:

The data uploaded to the source bucket should follow a lexicographic naming convention, e.g. 03-01-2023-File-001.txt, 03-01-2023-File-002.txt, and so on.
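As an illustration (the bucket and local file names below are hypothetical), objects could be given a date prefix at upload time so that a whole day's files share a common, sortable prefix:

```python
from datetime import datetime
from google.cloud import storage

# Hypothetical bucket name and local files; adjust to your environment.
SOURCE_BUCKET = "my-source-bucket"
local_files = ["report-a.txt", "report-b.txt"]

client = storage.Client()
bucket = client.bucket(SOURCE_BUCKET)

# Prefix each object with the current date so a day's objects sort together,
# e.g. 03-01-2023-File-001.txt, 03-01-2023-File-002.txt, ...
date_prefix = datetime.utcnow().strftime("%m-%d-%Y")
for index, path in enumerate(local_files, start=1):
    object_name = f"{date_prefix}-File-{index:03d}.txt"
    bucket.blob(object_name).upload_from_filename(path)
    print(f"Uploaded {path} as gs://{SOURCE_BUCKET}/{object_name}")
```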

Step 3:

Create an Eventarc trigger for the Cloud Storage event type "google.cloud.storage.object.v1.finalized", with Cloud Functions as the destination, so that the Cloud Function is created along with this trigger…
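The post is truncated here, but as a sketch of what such a triggered function might do (the destination bucket and function name are assumptions, not taken from the original), here is a minimal Python CloudEvent handler that copies each newly finalized object into the staging bucket:

```python
import functions_framework
from google.cloud import storage

# Assumed destination bucket; the original post does not name it.
DESTINATION_BUCKET = "my-staging-bucket"

client = storage.Client()


@functions_framework.cloud_event
def stage_object(cloud_event):
    """Triggered via Eventarc on google.cloud.storage.object.v1.finalized."""
    data = cloud_event.data
    source_bucket_name = data["bucket"]
    object_name = data["name"]

    source_bucket = client.bucket(source_bucket_name)
    destination_bucket = client.bucket(DESTINATION_BUCKET)

    # Copy the newly uploaded object into the staging bucket, keeping its name.
    blob = source_bucket.blob(object_name)
    source_bucket.copy_blob(blob, destination_bucket, object_name)
    print(f"Copied gs://{source_bucket_name}/{object_name} "
          f"to gs://{DESTINATION_BUCKET}/{object_name}")
```

Because the objects follow the date-prefixed naming convention from Step 2, the staging bucket can later be processed a day at a time by listing objects with that date prefix.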
