Introduction
Transferring data from Google Drive to Google Cloud Storage (GCS) is a common need for teams that want to archive large files, integrate with data‑processing pipelines, or simply back up important documents in a more cost‑effective, durable environment. While the Google Cloud Console offers a manual upload interface, automating the process with Google Apps Script gives you the flexibility to move single files, whole folders, or even schedule recurring syncs without leaving the familiar Drive environment. This tutorial walks you through every step required to create a reliable, script‑driven workflow: from preparing your GCS bucket and service account, to writing and testing the Apps Script code that copies content, handles metadata, and reports success or failure. By the end, you’ll have a reusable solution that can be adapted to a variety of business scenarios.
Preparing the Cloud Environment
Before any script can interact with GCS, you must set up a bucket and grant the script the appropriate permissions. Follow these sub‑steps:
- Create a bucket in the Google Cloud Console, choosing a storage class and location that match your performance and cost requirements.
- Enable the Cloud Storage JSON API for the project that owns the bucket; this API is what Apps Script will call.
- Generate a service account with the role Storage Object Admin (or a more restrictive custom role) and download its JSON key.
- Store the key securely—you can paste it into a protected Script Property or use the Properties Service in Apps Script to keep it out of the source code.
These actions give your script a secure, programmatic identity that can read and write objects in the bucket without exposing user credentials.
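As a minimal sketch of the key-storage step, the downloaded JSON key can be written to and read from script properties so it never appears in source code. The property name SERVICE_ACCOUNT_KEY and both function names are arbitrary choices for this example:

```javascript
// Sketch: keeps the service account JSON key in a script property
// instead of hard-coding it. SERVICE_ACCOUNT_KEY is an arbitrary name.
function saveServiceAccountKey(jsonKeyString) {
  PropertiesService.getScriptProperties()
      .setProperty('SERVICE_ACCOUNT_KEY', jsonKeyString);
}

function getServiceAccountKey() {
  var raw = PropertiesService.getScriptProperties()
      .getProperty('SERVICE_ACCOUNT_KEY');
  if (!raw) {
    throw new Error('Service account key not found in script properties.');
  }
  return JSON.parse(raw); // parsed key object: client_email, private_key, etc.
}
```

Run saveServiceAccountKey once (for example from the script editor) with the contents of the downloaded key file, then call getServiceAccountKey from your transfer code.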
Writing the Apps Script to Transfer Files
The core of the solution is a short Apps Script function that authenticates with the service account, reads a Drive file, and streams it to GCS. A typical implementation looks like this:
- Use UrlFetchApp.fetch, passing an OAuth2 access token derived from the service account JSON in an Authorization: Bearer header.
- Read the file’s Blob via DriveApp.getFileById(id).getBlob() to preserve the original MIME type.
- Construct the POST request to https://storage.googleapis.com/upload/storage/v1/b/BUCKET_NAME/o?uploadType=media&name=OBJECT_NAME, inserting the bucket name and desired object path.
- Handle the response, logging success or throwing an error if the HTTP status is not 200.
Encapsulating this logic in a reusable function—uploadToGCS(fileId, destinationPath)—allows you to call it from other scripts, menu items, or even time‑driven triggers.
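Putting the steps above together, a sketch of that function might look like the following. BUCKET_NAME is a placeholder, and getServiceAccountToken() is a hypothetical helper (not shown) that exchanges the stored service account key for an OAuth2 access token, for example via the OAuth2 library for Apps Script:

```javascript
// Sketch only: BUCKET_NAME is a placeholder, and getServiceAccountToken()
// is a hypothetical helper that trades the service account JSON key for
// an OAuth2 access token.
var BUCKET_NAME = 'my-archive-bucket';

function buildUploadUrl(bucketName, objectName) {
  // Simple (media) upload endpoint of the Cloud Storage JSON API.
  return 'https://storage.googleapis.com/upload/storage/v1/b/' +
      encodeURIComponent(bucketName) +
      '/o?uploadType=media&name=' + encodeURIComponent(objectName);
}

function uploadToGCS(fileId, destinationPath) {
  var blob = DriveApp.getFileById(fileId).getBlob(); // preserves original MIME type
  var response = UrlFetchApp.fetch(buildUploadUrl(BUCKET_NAME, destinationPath), {
    method: 'post',
    contentType: blob.getContentType(),
    payload: blob.getBytes(),
    headers: { Authorization: 'Bearer ' + getServiceAccountToken() },
    muteHttpExceptions: true // inspect the status code ourselves
  });
  if (response.getResponseCode() !== 200) {
    throw new Error('Upload failed (' + response.getResponseCode() + '): ' +
        response.getContentText());
  }
  Logger.log('Uploaded %s to gs://%s/%s', fileId, BUCKET_NAME, destinationPath);
  return JSON.parse(response.getContentText()); // object metadata from GCS
}
```

Note that destinationPath may contain slashes to simulate folders in the bucket; encodeURIComponent keeps the full path inside the single name query parameter.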
Extending the Script for Folders, Error Handling, and Automation
Real‑world use cases rarely involve a single file. To support entire folders, the script can recursively enumerate children with DriveApp.getFolderById, building the destination path to mirror the Drive hierarchy inside the bucket. Robust error handling is essential: wrap API calls in try…catch blocks, log detailed messages with Logger.log, and optionally send email alerts when a transfer fails. Finally, automate the workflow by creating a time‑driven trigger (e.g., daily at midnight) or a custom menu entry that lets users select a folder and start the sync with a single click. This turns a manual, error‑prone process into a reliable, hands‑off operation.
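The recursion and error handling described above can be sketched as follows. This assumes the uploadToGCS(fileId, destinationPath) helper from the previous section; failed paths are logged and collected rather than aborting the whole run:

```javascript
// Sketch: recursively mirrors a Drive folder into the bucket. Assumes the
// uploadToGCS(fileId, destinationPath) helper from the previous section.
// Returns the list of paths that failed so a caller can report or retry them.
function uploadFolderToGCS(folderId, prefix) {
  var folder = DriveApp.getFolderById(folderId);
  var failures = [];

  var files = folder.getFiles();
  while (files.hasNext()) {
    var file = files.next();
    var path = prefix + file.getName();
    try {
      uploadToGCS(file.getId(), path);
    } catch (e) {
      Logger.log('Failed to upload %s: %s', path, e.message);
      failures.push(path);
    }
  }

  var subfolders = folder.getFolders();
  while (subfolders.hasNext()) {
    var sub = subfolders.next();
    // Mirror the Drive hierarchy by extending the object-name prefix.
    failures = failures.concat(
        uploadFolderToGCS(sub.getId(), prefix + sub.getName() + '/'));
  }
  return failures; // empty array means everything transferred
}
```

A wrapper that calls uploadFolderToGCS and emails the failure list can then be scheduled with a time-driven trigger, e.g. ScriptApp.newTrigger('dailySync').timeBased().everyDays(1).atHour(0).create(), where dailySync is the name of that wrapper function.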
Conclusion
By combining Google Cloud Storage’s scalable, durable object store with the simplicity of Google Apps Script, you can create a powerful pipeline that moves files and folders from Drive to GCS with minimal effort. The process begins with proper bucket and service‑account setup, proceeds to a concise script that authenticates, streams blobs, and respects metadata, and finishes with enhancements for folder recursion, error reporting, and scheduled execution. Implementing these steps not only saves time but also ensures that your data is safely archived in the cloud, ready for downstream analytics, backup, or sharing across projects. With the foundation laid out in this tutorial, you can adapt the script to fit any workflow, extending its capabilities as your organization’s needs evolve.