Merge remote-tracking branch 'origin/main' into fix-3171
sfc-gh-jmichalak committed Dec 12, 2024
2 parents 46d6e39 + 933335f commit bf6df94
Showing 75 changed files with 4,299 additions and 449 deletions.
24 changes: 24 additions & 0 deletions MIGRATION_GUIDE.md
@@ -33,6 +33,30 @@ Changes:
- The underlying resource identifier was changed from `<account_locator>` to `<organization_name>.<account_name>`. Migration will be done automatically. Note that this introduces changes in how the `snowflake_account` resource is imported.
- New `show_output` field was added (see [raw Snowflake output](./v1-preparations/CHANGES_BEFORE_V1.md#raw-snowflake-output)).

### snowflake_accounts data source changes
New filtering options:
- `with_history`

New output fields:
- `show_output`

Breaking changes:
- `pattern` renamed to `like`
- `accounts` field now organizes the output of `SHOW` under the `show_output` field and the output of `SHOW PARAMETERS` under the `parameters` field.

Before:
```terraform
output "simple_output" {
  value = data.snowflake_accounts.test.accounts[0].account_name
}
```
After:
```terraform
output "simple_output" {
  value = data.snowflake_accounts.test.accounts[0].show_output[0].account_name
}
```

### snowflake_tag_association resource changes
#### *(behavior change)* new id format
To provide more functionality for tagging objects, we have changed the resource id from `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"` to `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"|TAG_VALUE|OBJECT_TYPE`. This allows grouping tag associations by tag ID, tag value, and object type in one resource.
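
For illustration, here is a hedged sketch of importing an existing tag association under the new id format; the database, schema, tag name, tag value, and object type below are hypothetical, and a matching `snowflake_tag_association.example` resource is assumed to exist in the configuration:
```terraform
# Hypothetical identifiers; the id follows the new "TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"|TAG_VALUE|OBJECT_TYPE format described above.
import {
  to = snowflake_tag_association.example
  id = "\"TAG_DATABASE\".\"TAG_SCHEMA\".\"TAG_NAME\"|TAG_VALUE|DATABASE"
}
```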
78 changes: 75 additions & 3 deletions docs/data-sources/accounts.md
@@ -2,45 +2,117 @@
page_title: "snowflake_accounts Data Source - terraform-provider-snowflake"
subcategory: ""
description: |-
  Data source used to get details of filtered accounts. Filtering is aligned with the current possibilities for SHOW ACCOUNTS https://docs.snowflake.com/en/sql-reference/sql/show-accounts query. The results of SHOW are encapsulated in one output collection accounts.
---

# snowflake_accounts (Data Source)

Data source used to get details of filtered accounts. Filtering is aligned with the current possibilities for [SHOW ACCOUNTS](https://docs.snowflake.com/en/sql-reference/sql/show-accounts) query. The results of SHOW are encapsulated in one output collection `accounts`.

## Example Usage

```terraform
# Simple usage
data "snowflake_accounts" "simple" {
}

output "simple_output" {
  value = data.snowflake_accounts.simple.accounts
}

# Filtering (like)
data "snowflake_accounts" "like" {
  like = "account-name"
}

output "like_output" {
  value = data.snowflake_accounts.like.accounts
}

# With history
data "snowflake_accounts" "with_history" {
  with_history = true
}

output "with_history_output" {
  value = data.snowflake_accounts.with_history.accounts
}

# Ensure that at least one account is returned (with the use of a postcondition)
data "snowflake_accounts" "assert_with_postcondition" {
  like = "account-name"
  lifecycle {
    postcondition {
      condition     = length(self.accounts) > 0
      error_message = "there should be at least one account"
    }
  }
}

# Ensure that exactly one account is returned (with the use of a check block)
check "account_check" {
  data "snowflake_accounts" "assert_with_check_block" {
    like = "account-name"
  }

  assert {
    condition     = length(data.snowflake_accounts.assert_with_check_block.accounts) == 1
    error_message = "accounts filtered by '${data.snowflake_accounts.assert_with_check_block.like}' returned ${length(data.snowflake_accounts.assert_with_check_block.accounts)} accounts where one was expected"
  }
}
```

<!-- schema generated by tfplugindocs -->
## Schema

### Optional

- `pattern` (String) Specifies an account name pattern. If a pattern is specified, only accounts matching the pattern are returned.
- `like` (String) Filters the output with **case-insensitive** pattern, with support for SQL wildcard characters (`%` and `_`).
- `with_history` (Boolean) Includes dropped accounts that have not yet been deleted.

### Read-Only

- `accounts` (List of Object) List of all the accounts available in the organization. (see [below for nested schema](#nestedatt--accounts))
- `accounts` (List of Object) Holds the aggregated output of all accounts details queries. (see [below for nested schema](#nestedatt--accounts))
- `id` (String) The ID of this resource.

<a id="nestedatt--accounts"></a>
### Nested Schema for `accounts`

Read-Only:

- `show_output` (List of Object) (see [below for nested schema](#nestedobjatt--accounts--show_output))

<a id="nestedobjatt--accounts--show_output"></a>
### Nested Schema for `accounts.show_output`

Read-Only:

- `account_locator` (String)
- `account_locator_url` (String)
- `account_name` (String)
- `account_old_url_last_used` (String)
- `account_old_url_saved_on` (String)
- `account_url` (String)
- `comment` (String)
- `consumption_billing_entity_name` (String)
- `created_on` (String)
- `dropped_on` (String)
- `edition` (String)
- `is_events_account` (Boolean)
- `is_org_admin` (Boolean)
- `is_organization_account` (Boolean)
- `managed_accounts` (Number)
- `marketplace_consumer_billing_entity_name` (String)
- `marketplace_provider_billing_entity_name` (String)
- `moved_on` (String)
- `moved_to_organization` (String)
- `old_account_url` (String)
- `organization_name` (String)
- `organization_old_url` (String)
- `organization_old_url_last_used` (String)
- `organization_old_url_saved_on` (String)
- `organization_url_expiration_on` (String)
- `region_group` (String)
- `restored_on` (String)
- `scheduled_deletion_time` (String)
- `snowflake_region` (String)
86 changes: 78 additions & 8 deletions docs/resources/function_java.md
@@ -17,7 +17,6 @@
### Required

- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](/~https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be Java source code. For more information, see [Introduction to Java UDFs](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-introduction). To mitigate permadiff on this field, the provider replaces blank characters with a space. This can lead to false positives in cases where a change in case or run of whitespace is semantically significant.
- `handler` (String) The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class.
- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](/~https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. Use `<result_data_type>` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages).
@@ -29,16 +28,17 @@
- `comment` (String) Specifies a comment for the function.
- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procs. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output).
- `external_access_integrations` (Set of String) The names of [external access integrations](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) needed in order for this function’s handler code to access external networks. An external access integration specifies [network rules](https://docs.snowflake.com/en/sql-reference/sql/create-network-rule) and [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) that specify external locations and credentials (if any) allowed for use by handler code when making requests of an external network, such as an external REST API.
- `imports` (Set of String) The location (stage), path, and name of the file(s) to import. A file can be a JAR file or another type of file. If the file is a JAR file, it can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). Java UDFs can also read non-JAR files. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#java).
- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be Java source code. For more information, see [Introduction to Java UDFs](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-introduction). To mitigate permadiff on this field, the provider replaces blank characters with a space. This can lead to false positives in cases where a change in case or run of whitespace is semantically significant.
- `imports` (Block Set) The location (stage), path, and name of the file(s) to import. A file can be a JAR file or another type of file. If the file is a JAR file, it can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). Java UDFs can also read non-JAR files. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#java). (see [below for nested schema](#nestedblock--imports))
- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value.
- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level).
- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level).
- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs. Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`.
- `packages` (Set of String) The name and version number of Snowflake system packages required as dependencies. The value should be of the form `package_name:version_number`, where `package_name` is `snowflake_domain:package`.
- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`.
- `return_results_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`.
- `runtime_version` (String) Specifies the Java JDK runtime version to use. The supported versions of Java are 11.x and 17.x. If RUNTIME_VERSION is not set, Java JDK 11 is used.
- `secrets` (Block Set) Assigns the names of [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) to variables so that you can use the variables to reference the secrets when retrieving information from secrets in handler code. Secrets you specify here must be allowed by the [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) specified as a value of this CREATE FUNCTION command’s EXTERNAL_ACCESS_INTEGRATIONS parameter. (see [below for nested schema](#nestedblock--secrets))
- `target_path` (String) The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class.
- `target_path` (Block Set, Max: 1) The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class. (see [below for nested schema](#nestedblock--target_path))
- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level).

### Read-Only
@@ -57,6 +57,19 @@ Required:
- `arg_data_type` (String) The argument type.
- `arg_name` (String) The argument name.

Optional:

- `arg_default_value` (String) Optional default value for the argument. For text values use single quotes. Numeric values can be unquoted. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
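
As a hedged illustration of the default-value note above, an `arguments` block such as the following (argument name and type are hypothetical) could be placed inside a `snowflake_function_java` resource:

```terraform
  arguments {
    arg_name          = "message"
    arg_data_type     = "VARCHAR"
    arg_default_value = "'hello'" # text defaults are wrapped in single quotes; numeric defaults may be left unquoted
  }
```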


<a id="nestedblock--imports"></a>
### Nested Schema for `imports`

Required:

- `path_on_stage` (String) Path for import on stage, without the leading `/`.
- `stage_location` (String) Stage location without the leading `@`. To use your user's stage, set this to `~`; otherwise, pass the fully qualified name of the stage (with every part contained in double quotes, or use `snowflake_stage.<your stage's resource name>.fully_qualified_name` if you manage this stage through Terraform).
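
To show the `imports` block in context, below is a minimal, hedged sketch of a Java UDF that references a staged helper JAR; the database, schema, stage, and file names are hypothetical:

```terraform
resource "snowflake_function_java" "example" {
  database    = "MY_DB"
  schema      = "MY_SCHEMA"
  name        = "MY_JAVA_UDF"
  handler     = "MyClass.myMethod"
  return_type = "VARCHAR"

  # Inline handler source; the provider adds the wrapping $$ signs automatically.
  function_definition = <<-EOT
    public class MyClass {
      public static String myMethod() {
        return "hello";
      }
    }
  EOT

  # References a file staged at @"MY_DB"."MY_SCHEMA"."MY_STAGE"/jars/helper.jar
  imports {
    stage_location = "\"MY_DB\".\"MY_SCHEMA\".\"MY_STAGE\"" # fully qualified stage name, without the leading @
    path_on_stage  = "jars/helper.jar"                      # path on the stage, without the leading /
  }
}
```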


<a id="nestedblock--secrets"></a>
### Nested Schema for `secrets`
@@ -67,15 +80,72 @@ Required:
- `secret_variable_name` (String) The variable that will be used in handler code when retrieving information from the secret.


<a id="nestedblock--target_path"></a>
### Nested Schema for `target_path`

Required:

- `path_on_stage` (String) Path for import on stage, without the leading `/`.
- `stage_location` (String) Stage location without the leading `@`. To use your user's stage, set this to `~`; otherwise, pass the fully qualified name of the stage (with every part contained in double quotes, or use `snowflake_stage.<your stage's resource name>.fully_qualified_name` if you manage this stage through Terraform).
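
Analogously, a hedged sketch of a `target_path` block (stage and path are hypothetical) that could be added to the resource sketch above to control where the JAR compiled from the inline definition is written:

```terraform
  target_path {
    stage_location = "\"MY_DB\".\"MY_SCHEMA\".\"MY_STAGE\"" # without the leading @
    path_on_stage  = "compiled/my_java_udf.jar"             # without the leading /
  }
```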


<a id="nestedatt--parameters"></a>
### Nested Schema for `parameters`

Read-Only:

- `enable_console_output` (Boolean)
- `log_level` (String)
- `metric_level` (String)
- `trace_level` (String)
- `enable_console_output` (List of Object) (see [below for nested schema](#nestedobjatt--parameters--enable_console_output))
- `log_level` (List of Object) (see [below for nested schema](#nestedobjatt--parameters--log_level))
- `metric_level` (List of Object) (see [below for nested schema](#nestedobjatt--parameters--metric_level))
- `trace_level` (List of Object) (see [below for nested schema](#nestedobjatt--parameters--trace_level))

<a id="nestedobjatt--parameters--enable_console_output"></a>
### Nested Schema for `parameters.enable_console_output`

Read-Only:

- `default` (String)
- `description` (String)
- `key` (String)
- `level` (String)
- `value` (String)


<a id="nestedobjatt--parameters--log_level"></a>
### Nested Schema for `parameters.log_level`

Read-Only:

- `default` (String)
- `description` (String)
- `key` (String)
- `level` (String)
- `value` (String)


<a id="nestedobjatt--parameters--metric_level"></a>
### Nested Schema for `parameters.metric_level`

Read-Only:

- `default` (String)
- `description` (String)
- `key` (String)
- `level` (String)
- `value` (String)


<a id="nestedobjatt--parameters--trace_level"></a>
### Nested Schema for `parameters.trace_level`

Read-Only:

- `default` (String)
- `description` (String)
- `key` (String)
- `level` (String)
- `value` (String)
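
Because the parameters are exposed as nested objects (each with `key`, `value`, `level`, and related fields), reading a single value requires indexing into the lists. A hedged sketch, assuming the `snowflake_function_java.example` resource from the earlier example:

```terraform
output "function_log_level" {
  # parameters is a single-element list; each parameter is itself a single-element list of objects.
  value = snowflake_function_java.example.parameters[0].log_level[0].value
}
```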



<a id="nestedatt--show_output"></a>
