# Google Cloud Storage Destination

The Google Cloud Storage (GCS) destination copies the raw logs of the data Segment receives into your GCS bucket. The data is copied into your bucket at least every hour. You might see multiple files over a period of time, depending on how much data is copied.

> [!WARNING]
>
> The Google Cloud Storage destination works differently than other destinations in Segment. Segment sends **all** data from an Engage source to GCS during the sync process, not only the connected audiences and traits. Using a destination selector like the [integrations object](/docs/segment/connections/spec/common/#integrations) doesn't affect the events sent to GCS.

> [!NOTE]
>
> The Google Cloud Storage destination is in Public Beta, and doesn't support product features such as deletions or surfacing errors in the UI.

## Getting started

1. Create a Service Account to allow Segment to copy files into the bucket.
2. Create a bucket in your preferred region.

## Set up Service Account to give Segment access to upload to your bucket

1. Go to [http://cloud.google.com/iam](http://cloud.google.com/iam).
2. Click **VIEW CONSOLE**.

   ![Google Cloud Identity and Access Management overview with documentation and console buttons.](https://docs-resources.prod.twilio.com/9e07bce68317e7ee6d1f00cee075c5f9de20107d7643b464722feccf5a0780c0.png)
3. Select the project to which you'd like to send Segment data.
4. In the sidebar, click **Service Accounts**.
5. Click **CREATE SERVICE ACCOUNT**.

   ![Google Cloud service accounts list for project 'My First Project' showing emails, names, and descriptions.](https://docs-resources.prod.twilio.com/47b0b21d1b6739f5b9ef709e62cb963e740baad29396af7694367704ac0b6d9b.png)
6. In the **Name** field, give your service account a name, for example, `Segment Upload Objects`.
7. In the **Description** field, enter a description that reminds you of the account's purpose. For example, `This role gives Segment access to upload raw data files to our bucket`.
8. Click **CREATE**.

   ![Google Cloud service account setup with roles and key creation options.](https://docs-resources.prod.twilio.com/2005040b433e5a5f117b1866ef5552aee569a6aa9973f97eeeb035390892111a.png)
9. Click **CONTINUE** to skip adding Service Account Permissions. We will add permissions directly to the bucket instead.
10. Click **CREATE KEY**.
11. Select the **JSON** key type.
12. Click **CREATE** to create the key.

    A key downloads to your computer. You'll use this key when creating the Segment Google Cloud Storage Destination. Keep it in a safe place.

    ![Screenshot of the key download tab, a step in Google Cloud's Create service account setup flow.](https://docs-resources.prod.twilio.com/73b1f71ca2da7fa8c335942fa051faec527cca6834375b0680b0911fc4eff375.png)
13. Click **DONE** to finish creating your Service Account.

    ![Screenshot of the Create service account setup flow in Google Cloud.](https://docs-resources.prod.twilio.com/df29c37f52566bcc9e9d888f4c0d3bf5c2d1ebef1615e9d71f2a97f0e8b50be1.png)
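If you prefer the command line, the console steps above can be sketched with the gcloud CLI. This is a minimal sketch, assuming the gcloud CLI is installed and authenticated; `PROJECT_ID` and the account name `segment-upload-objects` are placeholder values you should replace with your own.

```shell
# Placeholder: replace with your own Google Cloud project ID.
PROJECT_ID="my-project"

# Create the service account (equivalent to steps 5-8 above).
gcloud iam service-accounts create segment-upload-objects \
  --project="$PROJECT_ID" \
  --display-name="Segment Upload Objects" \
  --description="Gives Segment access to upload raw data files to our bucket"

# Create and download a JSON key (equivalent to steps 10-12 above).
# Keep this file in a safe place; you'll paste its contents into Segment later.
gcloud iam service-accounts keys create segment-key.json \
  --iam-account="segment-upload-objects@${PROJECT_ID}.iam.gserviceaccount.com"
```

As in the console flow, no project-level permissions are granted here; access is added directly on the bucket in the next section.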

## Set up a Google Cloud bucket for Segment to copy objects

To receive raw data files from Segment, you must first provide a Google Cloud Storage bucket that can store the raw data files:

1. Go to [https://cloud.google.com/storage](https://cloud.google.com/storage)
2. Click **GO TO CONSOLE**.

   ![Google Cloud Storage Browser showing bucket 'testgcscopy' with multi-regional storage class.](https://docs-resources.prod.twilio.com/9d859c9b3e71c7c8d767897a592bcb38696b8ae2f9458d747826bb902b95190b.png)
3. Select a project.
4. Click **CREATE BUCKET**.

   ![Screenshot of the Storage Browser page in Google Cloud.](https://docs-resources.prod.twilio.com/3114b1b9236502f1b9d71b10afcf95685ae5a9f037dbb2c047e0b0ff959585ec.png)
5. In the **Name** field, enter a name for your bucket.
   Any name works here, but we recommend one that includes the word "segment", for example: `my-segment-data`.
6. In the **Storage Class** field, we recommend `Multi-Regional`.
7. In the **Access Control** field, choose `Set object-level and bucket-level permissions`.
8. No **Advanced Options** are necessary. Click **Create** to finish creating the bucket.

   ![Google Cloud bucket setup with encryption and retention policy options.](https://docs-resources.prod.twilio.com/52205f1f71934c86f8180af23ee351fa61c3cdcb14dd27d1c3643c3c75f00958.png)
9. Click the **PERMISSIONS** tab to show bucket permissions.
10. Click **ADD MEMBERS**.
11. In the **NEW MEMBERS** text input, enter the email address associated with the Service Account you created earlier.
12. Click **SELECT A ROLE**.
    Select the **STORAGE OBJECT ADMIN** role. This gives the Service Account read/write access to objects in this bucket *only*. Segment requires `Object Admin` access to overwrite existing files, which is necessary when running replays and when handling failures.
13. Click **SAVE**.

![Google Cloud Storage bucket details page with the add members and roles tab open.](https://docs-resources.prod.twilio.com/5e72e4800ffc8641c6b83afc47e804a0dca74546f0fdf2f1d192a487cdd369d7.png)

Congratulations! You now have a bucket ready to accept Segment data.
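The bucket setup above can also be sketched with the gcloud CLI. This assumes the gcloud CLI is installed and authenticated; `PROJECT_ID`, `BUCKET_NAME`, and the service account email are placeholders that must match the values from your own setup.

```shell
# Placeholders: replace with your own project ID and bucket name.
PROJECT_ID="my-project"
BUCKET_NAME="my-segment-data"

# Create the bucket (equivalent to steps 4-8 above).
# --location=US gives a multi-region bucket, matching the recommended
# Multi-Regional storage class.
gcloud storage buckets create "gs://${BUCKET_NAME}" \
  --project="$PROJECT_ID" \
  --location=US

# Grant the Service Account the Storage Object Admin role on this bucket
# only (equivalent to steps 9-13 above).
gcloud storage buckets add-iam-policy-binding "gs://${BUCKET_NAME}" \
  --member="serviceAccount:segment-upload-objects@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
```

Because the binding is applied to the bucket rather than the project, the Service Account gains no access to other resources in your project.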

## Configure Google Cloud Storage destination

Once the Google Cloud Storage bucket and Service Account are created, configure a destination to send data files to the bucket:

1. In the Segment **Destinations** section, click **Add Destination**.
   You will be redirected to the `Catalog`.
2. Search for "Google Cloud Storage", and click the destination in the catalog.
3. Click **Configure Google Cloud Storage**.
4. Select the source you want to send to this destination.
5. Enter the values for the settings below:
   * **Bucket**: The name of the bucket you created on the Google Cloud Storage Console.
   * **GCS Credentials**: Copy and paste the contents of the credentials (private key) file that downloaded to your computer when you created the Google Cloud Service Account. This grants Segment access to upload raw data files to your bucket.
6. Click the toggle to enable your destination.

Congratulations! You've set up a GCS destination. You'll receive files in your bucket within 60 minutes, assuming the Segment source is regularly producing events.
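Before waiting on the first sync, you can optionally verify that the Service Account can write to the bucket the same way Segment will. This is a sketch, assuming the key file from the Service Account setup is saved locally as `segment-key.json` and `BUCKET_NAME` is your bucket; the test object name is arbitrary.

```shell
# Placeholder: replace with your own bucket name.
BUCKET_NAME="my-segment-data"

# Authenticate as the Service Account Segment will use.
gcloud auth activate-service-account --key-file=segment-key.json

# Upload the same object twice: both the initial write and the overwrite
# must succeed, since Segment overwrites files during replays and
# failure handling.
echo "test" | gcloud storage cp - "gs://${BUCKET_NAME}/segment-write-test.txt"
echo "test" | gcloud storage cp - "gs://${BUCKET_NAME}/segment-write-test.txt"

# Clean up the test object.
gcloud storage rm "gs://${BUCKET_NAME}/segment-write-test.txt"
```

If the second copy fails with a permissions error, the Service Account likely has a narrower role than **Storage Object Admin** on the bucket.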

## Troubleshooting

**Why is data not syncing to the GCS destination?**
Common errors that can cause sync failures include:

* **Bucket not configured**: A bucket for the GCS destination was not provided. Check the GCS destination settings to confirm that a valid bucket is entered.
* **GCS credentials not configured**: Credentials for the GCS destination can't be found. Confirm that you've entered GCS credentials in the destination settings.
* **Invalid GCS credentials**: Credentials for the GCS destination are found, but they aren't correctly formatted. Re-enter valid credentials for the destination to work.
* **Unable to upload files**: Segment can't upload files to GCS due to incorrect credentials, a non-existent bucket, insufficient permissions, or a GCS error. Confirm that credentials and permissions are set correctly.
* **Destination not found**: No GCS destination is connected to the source. Connect one from the workspace overview page.
* **Destination disabled**: The GCS destination for the source is disabled. Enable it on the destination settings page.
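For the permission-related failures above, a quick check from the gcloud CLI can confirm whether the bucket exists and whether the Service Account still holds the expected role. This is a sketch, assuming `BUCKET_NAME` matches the bucket entered in your destination settings.

```shell
# Placeholder: replace with the bucket name from your destination settings.
BUCKET_NAME="my-segment-data"

# Confirm the bucket exists; a "not found" error here explains
# "Unable to upload files" failures.
gcloud storage buckets describe "gs://${BUCKET_NAME}"

# List the bucket's IAM bindings; the Service Account email should
# appear under roles/storage.objectAdmin.
gcloud storage buckets get-iam-policy "gs://${BUCKET_NAME}"
```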
