
Bug in google_storage_transfer_job & google_storage_transfer_project_service_account #10798

Closed
faizan-ahmad-db opened this issue Dec 24, 2021 · 12 comments

Comments

@faizan-ahmad-db

faizan-ahmad-db commented Dec 24, 2021

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.
  • If an issue is assigned to the modular-magician user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If an issue is assigned to a user, that user is claiming responsibility for the issue. If an issue is assigned to hashibot, a community member has claimed the issue already.

Terraform Version

terraform v1.0.11
google provider version 4.5.0

Affected Resource(s)

google_storage_transfer_job
google_storage_transfer_project_service_account

Terraform Configuration Files

data "google_storage_transfer_project_service_account" "default" {
  project = "<my_project_ID>"
}

output "default_account" {
  value = data.google_storage_transfer_project_service_account.default.email
}

Expected Behavior

The config should fetch the default Storage Transfer service account of my project.

Actual Behavior

Instead, it tries to pick the default Storage Transfer service account of the provider project (where our Terraform Cloud is hosted).

This is the same for google_storage_transfer_job: even though we clearly set the project ID in the config, it tries to create the transfer job in the provider project (where TF Cloud is hosted).

We get the following error during terraform plan:

Error: Error when reading or editing Google Cloud Storage Transfer service account not found: googleapi: Error 403: Storage Transfer API has not been used in project xxxxxxxxxxx before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project=xxxxxxxxxxx then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
Details:

[
  {
    "@type": "type.googleapis.com/google.rpc.Help",
    "links": [
      {
        "description": "Google developers console API activation",
        "url": "https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project=xxxxxxxxxxx"
      }
    ]
  },
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "domain": "googleapis.com",
    "metadata": {
      "consumer": "projects/xxxxxxxxxxx",
      "service": "storagetransfer.googleapis.com"
    },
    "reason": "SERVICE_DISABLED"
  }
]
, accessNotConfigured

b/302673113

@unki

unki commented Apr 4, 2022

@faizan-ahmad-db I was hitting this issue too, but apparently it works as designed.

You need to enable the storage-transfer API in both projects:

  • project which homes the service-account
  • project where you want to use the storage-transfer API
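A minimal sketch of doing this in Terraform, assuming hypothetical project IDs (google_project_service is the provider's resource for enabling an API on a project):

```hcl
# Hypothetical project IDs - substitute your own.
# Project that homes the service account Terraform authenticates as:
resource "google_project_service" "sts_home" {
  project = "project-homing-service-account"
  service = "storagetransfer.googleapis.com"
}

# Project where the storage transfer resources are created:
resource "google_project_service" "sts_target" {
  project = "project-using-storage-transfer"
  service = "storagetransfer.googleapis.com"
}
```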

@ambeshsingh

@unki Regarding your statement to enable the storage-transfer API in both projects: I have created Composer, Dataflow, and many other services without enabling their respective APIs in the project that homes the service account; their APIs are enabled only in the project where those resources are created. I can also run Terraform from my local machine with my user credentials. Only when I create a transfer service do I face this issue, where it asks me to enable the storage-transfer API in the home project.

@ghost

ghost commented Jan 25, 2023

Any news on this? It's still happening. Thanks! :)

@github-actions github-actions bot added service/storage forward/review In review; remove label to forward labels Sep 11, 2023
@edwardmedia edwardmedia removed the forward/review In review; remove label to forward label Sep 18, 2023
@calum-github

Still seeing this on version 5.16 of the Google TF provider - this is silly.
If you pass the project ID to the data source, it should return the service agent FOR THE PROJECT THAT YOU HAVE PASSED IN - no ifs, no buts.

@googlyrahman

Hi, we're unable to reproduce this error on our system. I've tried this config on my system:

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
      version = "4.51.0"
    }
  }
}

data "google_storage_transfer_project_service_account" "default" {
  project = "seventhsky"
}

output "default_account" {
  value = data.google_storage_transfer_project_service_account.default.email
}

It works fine when running terraform plan. Is there anything I'm missing here?

@vgelot

vgelot commented Apr 30, 2024

Hi,
I also have the same issue (tried with the same input as @googlyrahman, but changing the project to one of my projects).
I tried on my laptop and also in Cloud Shell.
Both failed, but with different consumer project numbers.

And looking at those project numbers, they don't belong to us - maybe those projects belong to Google?

@arya-harness

Any news on this? I am facing the same issue.
I have an entity that creates TF resources; it sits in project A and creates resources in project B.
I have not seen any other Terraform resource with such behaviour.

@SarahFrench
Member

SarahFrench commented May 21, 2024

@googlyrahman I've been able to reproduce the problem, but it requires some setup. For what it's worth, I believe the problem is a misunderstanding of when and where an API needs to be enabled, and isn't a bug in the provider. I can expand on this, but: using a debugger I've seen that project values provided as arguments in the google_storage_transfer_job resource and google_storage_transfer_project_service_account data source blocks in Terraform config are used by the code and override the provider default project.

OK, onto the reproduction of the error observed in this issue:

You need 2 projects. Project A which has the storagetransfer.googleapis.com API disabled and Project B where the storagetransfer.googleapis.com API is enabled (and so the storage transfer account will exist).

Project A is where the service account exists that Terraform will use as its identity when interacting with Google APIs. Make a service account and a JSON key file to use to configure the Google provider with.

Project B is where we'll be either trying to read storagetransfer-related data from, or create resources in.

You can give the service account from Project A some project-level permissions in Project B (e.g. make them project Owner, seeing as this is just a bug reproduction and everything will be deleted after) but from what I've seen the error disrupts the process before permissions become relevant.
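As a sketch of that permissions step (the project IDs and service account name are placeholders, and roles/owner is deliberately broad because this is only a bug reproduction):

```hcl
# Grant the Project A service account project-level access in Project B.
resource "google_project_iam_member" "cross_project_access" {
  project = "project-b"
  role    = "roles/owner" # reproduction only; use narrower roles in practice
  member  = "serviceAccount:terraform@project-a.iam.gserviceaccount.com"
}
```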

terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "5.30.0"
    }
  }
}

provider "google" {
  credentials = "./path/to/keyfile/for/service-account/in/project-A.json"
  project     = "project-C" // this provider default isn't used when doing plan/apply with this config - feel free to check
}

data "google_storage_transfer_project_service_account" "default" {
  project = "project-B"
}

The result is:

Error: Error when reading or editing Google Cloud Storage Transfer service account not found:
googleapi: Error 403: Storage Transfer API has not been used in project <PROJECT NUMBER CORRESPONDING TO PROJECT A> before or it is disabled.
Enable it by visiting https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project= <PROJECT NUMBER CORRESPONDING TO PROJECT A>  then retry.
If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.

Note that the error is reporting that the Storage Transfer API isn't available for use in project A.

When I set TF_LOG=DEBUG as an environment variable I can see that the data source is attempting to read data from the project set in the configuration:

---[ REQUEST ]---------------------------------------
GET /v1/googleServiceAccounts/<PROJECT_B_ID>?alt=json&prettyPrint=false HTTP/1.1
Host: storagetransfer.googleapis.com
User-Agent: google-api-go-client/0.5 Terraform/1.8.0-dev (+https://www.terraform.io) Terraform-Plugin-SDK/2.33.0 terraform-provider-google/dev
X-Goog-Api-Client: gl-go/1.21.3 gdcl/0.177.0
Accept-Encoding: gzip

Given this information, I can see that there isn't a problem with the google_storage_transfer_project_service_account data source attempting to read data from the wrong project. Instead my mental model is that because the service account identity that Terraform is authenticated as is in project A, any API calls need to be made through APIs that are enabled in project A. I feel this is supported by how I can trigger the same problem using google_bigquery_default_service_account too, as long as the BigQuery API is not enabled in project A, and also by how this Dialogflow documentation describes that both the consumer and resource projects need the Dialogflow API to be enabled when using service accounts in separate projects to manage Dialogflow resources.

The problem appears to be that people believe the API only needs to be enabled where storage transfer resources are being provisioned.

@googlyrahman sorry for the looong comment. I see from your profile that you work at Google - I'm at HashiCorp, so I'm not able to be active on the internal ticket linked to this issue. Could you please comment on how correct my mental model of the problem is, and whether there are any solutions other than ensuring APIs are enabled in the GCP project that contains the service account used by Terraform? Thanks!

@googlyrahman

That's correct - with the steps mentioned above I'm able to reproduce this error. To summarize the comment above:

We need a minimum of two projects to reproduce this error. Let's call them Project A and Project B, and use a service account from Project A (let's call it proj_a_service_account) that has access to Project B. Given that the Storage Transfer API isn't enabled in Project A but is enabled in Project B, using proj_a_service_account to access transfer-related resources in Project B throws the error.

In this case both projects should have the Storage Transfer API enabled; if either project doesn't, the error is thrown, so the only solution is to enable it in both places.

Thanks @SarahFrench for writing such a detailed comment!

@SarahFrench
Member

Just to add to what we've described above, it could be complicated some more by using user_project_override=true when configuring the provider. I'd need to experiment some more to check as I'm unsure.

@SarahFrench
Member


I've not been able to find a different outcome when using user_project_override=true.


I'm closing this GitHub issue because:

  • The mentioned resources and data source are using the correct project value set in the config
  • The error that's being experienced results from APIs not being enabled in the project containing the identity that Terraform is authenticated as. For more information please see this comment and the one above.

I recommend that users enable the storagetransfer.googleapis.com API in the project containing the service account they use to authenticate Terraform to address this issue.

Note: Enabling the Storage Transfer API can be achieved using the google_project_service resource; however, there are a few pitfalls with that resource too. Ensure that the project containing your service account has the Service Usage API enabled, as use of google_project_service depends on that API being enabled. For further information please see this guide.
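For example, a sketch using google_project_service (the project ID is a placeholder, and this assumes the Service Usage API is already enabled in that project, since google_project_service cannot bootstrap it):

```hcl
resource "google_project_service" "storage_transfer" {
  project = "project-homing-terraform-sa"
  service = "storagetransfer.googleapis.com"

  # Leave the API enabled on destroy so other consumers in the
  # project aren't broken when this resource is removed.
  disable_on_destroy = false
}
```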

modular-magician added a commit to modular-magician/terraform-provider-google that referenced this issue Jun 4, 2024
…covery_config, as well as fields to support single-resource mode for big_query_target and cloud_sql_target (hashicorp#10798)

[upstream:02cf34c5dd30da27f4482b65a616c9eac823ec18]

Signed-off-by: Modular Magician <magic-modules@google.com>
modular-magician added a commit that referenced this issue Jun 4, 2024
…covery_config, as well as fields to support single-resource mode for big_query_target and cloud_sql_target (#10798) (#18324)

[upstream:02cf34c5dd30da27f4482b65a616c9eac823ec18]

Signed-off-by: Modular Magician <magic-modules@google.com>

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jun 23, 2024
10 participants