Support Google Bigquery input option #70

Merged 3 commits on Mar 3, 2025
94 changes: 94 additions & 0 deletions docs/resources/job_definition.md
@@ -747,6 +747,36 @@ resource "trocco_job_definition" "decoder_example" {

### InputOptions

#### BigqueryInputOption

```terraform
resource "trocco_job_definition" "bigquery_input_example" {
input_option_type = "bigquery"
input_option = {
bigquery_input_option = {
bigquery_connection_id = 1
gcs_uri = "test_bucket"
gcs_uri_format = "bucket"
query = "SELECT * FROM `test_dataset.test_table`"
temp_dataset = "temp_dataset"
location = "asia-northeast1"
is_standard_sql = true
cleanup_gcs_files = true
file_format = "CSV"
cache = true
bigquery_job_wait_second = 600

columns = [
{
name = "col1__c"
type = "string"
}
]
}
}
}
```

#### MysqlInputOption

```terraform
@@ -1146,12 +1176,76 @@ Optional:

Optional:

- `bigquery_input_option` (Attributes) Attributes about source bigquery (see [below for nested schema](#nestedatt--input_option--bigquery_input_option))
- `gcs_input_option` (Attributes) Attributes about source GCS (see [below for nested schema](#nestedatt--input_option--gcs_input_option))
- `google_spreadsheets_input_option` (Attributes) Attributes about source Google Spreadsheets (see [below for nested schema](#nestedatt--input_option--google_spreadsheets_input_option))
- `mysql_input_option` (Attributes) Attributes of source mysql (see [below for nested schema](#nestedatt--input_option--mysql_input_option))
- `salesforce_input_option` (Attributes) Attributes about source Salesforce (see [below for nested schema](#nestedatt--input_option--salesforce_input_option))
- `snowflake_input_option` (Attributes) Attributes about source snowflake (see [below for nested schema](#nestedatt--input_option--snowflake_input_option))

<a id="nestedatt--input_option--bigquery_input_option"></a>
### Nested Schema for `input_option.bigquery_input_option`

Required:

- `bigquery_connection_id` (Number) ID of the BigQuery connection
- `columns` (Attributes List) List of columns to be retrieved and their types (see [below for nested schema](#nestedatt--input_option--bigquery_input_option--columns))
- `gcs_uri` (String) GCS URI
- `query` (String) Query
- `temp_dataset` (String) Temporary dataset name

Optional:

- `bigquery_job_wait_second` (Number) Wait time in seconds until the BigQuery job completes
- `cache` (Boolean) Flag indicating whether the query cache is enabled
- `cleanup_gcs_files` (Boolean) Flag indicating whether temporary GCS files should be cleaned up
- `custom_variable_settings` (Attributes List) (see [below for nested schema](#nestedatt--input_option--bigquery_input_option--custom_variable_settings))
- `decoder` (Attributes) (see [below for nested schema](#nestedatt--input_option--bigquery_input_option--decoder))
- `file_format` (String) File format of the temporary GCS files
- `gcs_uri_format` (String) Format of the GCS URI
- `is_standard_sql` (Boolean) Flag indicating whether standard SQL is enabled
- `location` (String) Location of the BigQuery job

<a id="nestedatt--input_option--bigquery_input_option--columns"></a>
### Nested Schema for `input_option.bigquery_input_option.columns`

Required:

- `name` (String) Column name
- `type` (String) Column type.

Optional:

- `format` (String) Format of the column


<a id="nestedatt--input_option--bigquery_input_option--custom_variable_settings"></a>
### Nested Schema for `input_option.bigquery_input_option.custom_variable_settings`

Required:

- `name` (String) Custom variable name. It must start and end with `$`
- `type` (String) Custom variable type. The following types are supported: `string`, `timestamp`, `timestamp_runtime`

Optional:

- `direction` (String) Direction of the diff from context_time. The following directions are supported: `ago`, `later`. Required in `timestamp` and `timestamp_runtime` types
- `format` (String) Format used to replace variables. Required in `timestamp` and `timestamp_runtime` types
- `quantity` (Number) Quantity used to calculate diff from context_time. Required in `timestamp` and `timestamp_runtime` types
- `time_zone` (String) Time zone used to format the timestamp. Required in `timestamp` and `timestamp_runtime` types
- `unit` (String) Time unit used to calculate diff from context_time. The following units are supported: `hour`, `date`, `month`. Required in `timestamp` and `timestamp_runtime` types
- `value` (String) Fixed string which will replace variables at runtime. Required in `string` type
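Put together, a custom variable can parameterize the query at runtime. The sketch below is illustrative, not taken from the PR: the connection ID, dataset, table, and column names are placeholders, and the `$target_date$` variable resolves to "one day before context_time" per the settings documented above.

```terraform
resource "trocco_job_definition" "bigquery_custom_variable_example" {
  input_option_type = "bigquery"
  input_option = {
    bigquery_input_option = {
      bigquery_connection_id = 1
      gcs_uri                = "test_bucket"
      temp_dataset           = "temp_dataset"
      # $target_date$ is replaced at runtime according to the setting below
      query = "SELECT * FROM `test_dataset.test_table` WHERE created_at < '$target_date$'"

      custom_variable_settings = [
        {
          name      = "$target_date$"
          type      = "timestamp"
          direction = "ago"
          quantity  = 1
          unit      = "date"
          format    = "%Y-%m-%d"
          time_zone = "Asia/Tokyo"
        }
      ]

      columns = [
        {
          name = "col1__c"
          type = "string"
        }
      ]
    }
  }
}
```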


<a id="nestedatt--input_option--bigquery_input_option--decoder"></a>
### Nested Schema for `input_option.bigquery_input_option.decoder`

Optional:

- `match_name` (String) Relative path after decompression (regular expression). If not entered, all data in the compressed file will be transferred.
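A minimal sketch of the `decoder` attribute — the regular expression here is only an example, placed inside `bigquery_input_option` alongside the other attributes:

```terraform
# Only files matching the regular expression are transferred after decompression.
decoder = {
  match_name = ".*\\.csv"
}
```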



<a id="nestedatt--input_option--gcs_input_option"></a>
### Nested Schema for `input_option.gcs_input_option`

@@ -0,0 +1,25 @@
resource "trocco_job_definition" "bigquery_input_example" {
input_option_type = "bigquery"
input_option = {
bigquery_input_option = {
bigquery_connection_id = 1
gcs_uri = "test_bucket"
gcs_uri_format = "bucket"
query = "SELECT * FROM `test_dataset.test_table`"
temp_dataset = "temp_dataset"
location = "asia-northeast1"
is_standard_sql = true
cleanup_gcs_files = true
file_format = "CSV"
cache = true
bigquery_job_wait_second = 600

columns = [
{
name = "col1__c"
type = "string"
}
]
}
}
}
28 changes: 28 additions & 0 deletions internal/client/entity/job_definition/input_option/bigquery.go
@@ -0,0 +1,28 @@
package input_option

import (
"terraform-provider-trocco/internal/client/entity"
)

type BigqueryInputOption struct {
BigqueryConnectionID int64 `json:"bigquery_connection_id"`
GcsUri string `json:"gcs_uri"`
GcsUriFormat *string `json:"gcs_uri_format"`
Query string `json:"query"`
TempDataset string `json:"temp_dataset"`
IsStandardSQL *bool `json:"is_standard_sql"`
CleanupGcsFiles *bool `json:"cleanup_gcs_files"`
FileFormat *string `json:"file_format"`
Location *string `json:"location"`
Cache *bool `json:"cache"`
BigqueryJobWaitSecond *int64 `json:"bigquery_job_wait_second"`

Columns []BigqueryColumn `json:"columns"`
CustomVariableSettings *[]entity.CustomVariableSetting `json:"custom_variable_settings"`
}

type BigqueryColumn struct {
Name string `json:"name"`
Type string `json:"type"`
Format *string `json:"format"`
}
3 changes: 3 additions & 0 deletions internal/client/job_definition.go
@@ -92,6 +92,7 @@ type InputOption struct {
SnowflakeInputOption *inputOptionEntities.SnowflakeInputOption `json:"snowflake_input_option"`
SalesforceInputOption *inputOptionEntities.SalesforceInputOption `json:"salesforce_input_option"`
GoogleSpreadsheetsInputOption *inputOptionEntities.GoogleSpreadsheetsInputOption `json:"google_spreadsheets_input_option"`
BigqueryInputOption *inputOptionEntities.BigqueryInputOption `json:"bigquery_input_option"`
}

type InputOptionInput struct {
@@ -100,6 +101,7 @@ type InputOptionInput struct {
SnowflakeInputOption *parameter.NullableObject[input_options.SnowflakeInputOptionInput] `json:"snowflake_input_option,omitempty"`
SalesforceInputOption *parameter.NullableObject[input_options.SalesforceInputOptionInput] `json:"salesforce_input_option,omitempty"`
GoogleSpreadsheetsInputOption *parameter.NullableObject[input_options.GoogleSpreadsheetsInputOptionInput] `json:"google_spreadsheets_input_option,omitempty"`
BigqueryInputOption *parameter.NullableObject[input_options.BigqueryInputOptionInput] `json:"bigquery_input_option,omitempty"`
}

type UpdateInputOptionInput struct {
@@ -108,6 +110,7 @@ type UpdateInputOptionInput struct {
SnowflakeInputOption *parameter.NullableObject[input_options.UpdateSnowflakeInputOptionInput] `json:"snowflake_input_option,omitempty"`
SalesforceInputOption *parameter.NullableObject[input_options.UpdateSalesforceInputOptionInput] `json:"salesforce_input_option,omitempty"`
GoogleSpreadsheetsInputOption *parameter.NullableObject[input_options.UpdateGoogleSpreadsheetsInputOptionInput] `json:"google_spreadsheets_input_option,omitempty"`
BigqueryInputOption *parameter.NullableObject[input_options.UpdateBigqueryInputOptionInput] `json:"bigquery_input_option,omitempty"`
}

type OutputOption struct {
46 changes: 46 additions & 0 deletions internal/client/parameter/job_definition/input_option/bigquery.go
@@ -0,0 +1,46 @@
package input_options

import (
"terraform-provider-trocco/internal/client/parameter"
job_definitions "terraform-provider-trocco/internal/client/parameter/job_definition"
)

type BigqueryInputOptionInput struct {
BigqueryConnectionID int64 `json:"bigquery_connection_id"`
GcsUri string `json:"gcs_uri"`
GcsUriFormat *parameter.NullableString `json:"gcs_uri_format,omitempty"`
Query string `json:"query"`
TempDataset string `json:"temp_dataset"`
IsStandardSQL *bool `json:"is_standard_sql,omitempty"`
CleanupGcsFiles *bool `json:"cleanup_gcs_files,omitempty"`
FileFormat *parameter.NullableString `json:"file_format,omitempty"`
Location *parameter.NullableString `json:"location,omitempty"`
Cache *bool `json:"cache,omitempty"`
BigqueryJobWaitSecond *int64 `json:"bigquery_job_wait_second,omitempty"`
Columns []BigqueryColumn `json:"columns,omitempty"`
CustomVariableSettings *[]parameter.CustomVariableSettingInput `json:"custom_variable_settings,omitempty"`
Decoder *job_definitions.DecoderInput `json:"decoder,omitempty"`
}

type UpdateBigqueryInputOptionInput struct {
BigqueryConnectionID *int64 `json:"bigquery_connection_id,omitempty"`
GcsUri *parameter.NullableString `json:"gcs_uri,omitempty"`
GcsUriFormat *parameter.NullableString `json:"gcs_uri_format,omitempty"`
Query *parameter.NullableString `json:"query,omitempty"`
TempDataset *parameter.NullableString `json:"temp_dataset,omitempty"`
IsStandardSQL *bool `json:"is_standard_sql,omitempty"`
CleanupGcsFiles *bool `json:"cleanup_gcs_files,omitempty"`
FileFormat *parameter.NullableString `json:"file_format,omitempty"`
Location *parameter.NullableString `json:"location,omitempty"`
Cache *bool `json:"cache,omitempty"`
BigqueryJobWaitSecond *int64 `json:"bigquery_job_wait_second,omitempty"`
Columns []BigqueryColumn `json:"columns,omitempty"`
CustomVariableSettings *[]parameter.CustomVariableSettingInput `json:"custom_variable_settings,omitempty"`
Decoder *job_definitions.DecoderInput `json:"decoder,omitempty"`
}

type BigqueryColumn struct {
Name string `json:"name"`
Type string `json:"type"`
Format *string `json:"format"`
}
4 changes: 4 additions & 0 deletions internal/provider/model/job_definition/input_option.go
@@ -12,6 +12,7 @@ type InputOption struct {
SnowflakeInputOption *input_options.SnowflakeInputOption `tfsdk:"snowflake_input_option"`
SalesforceInputOption *input_options.SalesforceInputOption `tfsdk:"salesforce_input_option"`
GoogleSpreadsheetsInputOption *input_options.GoogleSpreadsheetsInputOption `tfsdk:"google_spreadsheets_input_option"`
BigqueryInputOption *input_options.BigqueryInputOption `tfsdk:"bigquery_input_option"`
}

func NewInputOption(inputOption client.InputOption) *InputOption {
@@ -21,6 +22,7 @@ func NewInputOption(inputOption client.InputOption) *InputOption {
SnowflakeInputOption: input_options.NewSnowflakeInputOption(inputOption.SnowflakeInputOption),
SalesforceInputOption: input_options.NewSalesforceInputOption(inputOption.SalesforceInputOption),
GoogleSpreadsheetsInputOption: input_options.NewGoogleSpreadsheetsInputOption(inputOption.GoogleSpreadsheetsInputOption),
BigqueryInputOption: input_options.NewBigqueryInputOption(inputOption.BigqueryInputOption),
}
}

@@ -31,6 +33,7 @@ func (o InputOption) ToInput() client.InputOptionInput {
SnowflakeInputOption: model.WrapObject(o.SnowflakeInputOption.ToInput()),
SalesforceInputOption: model.WrapObject(o.SalesforceInputOption.ToInput()),
GoogleSpreadsheetsInputOption: model.WrapObject(o.GoogleSpreadsheetsInputOption.ToInput()),
BigqueryInputOption: model.WrapObject(o.BigqueryInputOption.ToInput()),
}
}

@@ -41,5 +44,6 @@ func (o InputOption) ToUpdateInput() *client.UpdateInputOptionInput {
SnowflakeInputOption: model.WrapObject(o.SnowflakeInputOption.ToUpdateInput()),
SalesforceInputOption: model.WrapObject(o.SalesforceInputOption.ToUpdateInput()),
GoogleSpreadsheetsInputOption: model.WrapObject(o.GoogleSpreadsheetsInputOption.ToUpdateInput()),
BigqueryInputOption: model.WrapObject(o.BigqueryInputOption.ToUpdateInput()),
}
}
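The conversion wiring above calls `o.BigqueryInputOption.ToInput()` unconditionally even though only one source type is configured per job definition, which in Go is safe as long as each method tolerates a nil receiver (or the wrapper handles nil). A self-contained sketch of that pattern — `option` and `wrap` are hypothetical stand-ins for the model type and `model.WrapObject`, not the provider's actual helpers:

```go
package main

import "fmt"

// option mimics a nested input-option model; a nil *option means "not configured".
type option struct{ Query string }

// ToInput tolerates a nil receiver and returns nil instead of panicking.
func (o *option) ToInput() *string {
	if o == nil {
		return nil
	}
	return &o.Query
}

// wrap mimics a WrapObject-style helper: it turns a possibly-nil value
// into a payload field, omitting it when the source is nil.
func wrap(v *string) string {
	if v == nil {
		return "<omitted>"
	}
	return *v
}

func main() {
	var unset *option // nil: this source type is not configured
	set := &option{Query: "SELECT 1"}
	fmt.Println(wrap(unset.ToInput())) // calling ToInput on nil is safe here
	fmt.Println(wrap(set.ToInput()))
}
```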