Updates to Teradata Provider #39217
base: main
Changes from commits: 1596bee, 73ca455, c28ac1a, ba0e5d9, 0a9c4af, f815fef, 1747e64, 4baf9ab, dd2dce4
@@ -17,11 +17,15 @@
 # under the License.
 from __future__ import annotations

-from typing import Sequence
+from typing import TYPE_CHECKING, Sequence

 from airflow.models import BaseOperator
 from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
 from airflow.providers.teradata.hooks.teradata import TeradataHook

+if TYPE_CHECKING:
+    from airflow.utils.context import Context
+
+
 class TeradataOperator(SQLExecuteQueryOperator):
     """
@@ -41,8 +45,8 @@ class TeradataOperator(SQLExecuteQueryOperator):
     """

     template_fields: Sequence[str] = (
-        "parameters",
         "sql",
+        "parameters",
     )
     template_ext: Sequence[str] = (".sql",)
     template_fields_renderers = {"sql": "sql"}
@@ -62,3 +66,38 @@ def __init__(
         }
         super().__init__(**kwargs)
         self.conn_id = conn_id
+
+
+class TeradataStoredProcedureOperator(BaseOperator):
+    """
+    Executes stored procedure in a specific Teradata database.
+
+    :param procedure: name of stored procedure to call (templated)
+    :param conn_id: The :ref:`Teradata connection id <howto/connection:teradata>`
+        reference to a specific Teradata database.
+    :param parameters: (optional, templated) the parameters provided in the call
+    """
+
+    template_fields: Sequence[str] = (
+        "procedure",
+        "parameters",
+    )
+    ui_color = "#ededed"
+
+    def __init__(
+        self,
+        *,
+        procedure: str,
+        conn_id: str = TeradataHook.default_conn_name,
+        parameters: dict | list | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+        self.conn_id = conn_id
+        self.procedure = procedure
+        self.parameters = parameters
+
+    def execute(self, context: Context):
+        hook = TeradataHook(teradata_conn_id=self.conn_id)
+        return hook.callproc(self.procedure, autocommit=True, parameters=self.parameters)

Review comment (on the conn_id parameter): I thought better to use …

Reply: Modified conn_id to teradata_conn_id. This requires changing the system tests; the system tests were updated as per this modification.
@@ -33,6 +33,8 @@ dependencies:
   - apache-airflow-providers-common-sql>=1.3.1
   - teradatasqlalchemy>=17.20.0.0
   - teradatasql>=17.20.0.28
+  - apache-airflow-providers-microsoft-azure
+  - apache-airflow-providers-amazon

 integrations:
   - integration-name: Teradata

@@ -57,6 +59,14 @@ transfers:
     target-integration-name: Teradata
     python-module: airflow.providers.teradata.transfers.teradata_to_teradata
     how-to-guide: /docs/apache-airflow-providers-teradata/operators/teradata_to_teradata.rst
+  - source-integration-name: Microsoft Azure Blob Storage
+    target-integration-name: Teradata
+    python-module: airflow.providers.teradata.transfers.azure_blob_to_teradata
+    how-to-guide: /docs/apache-airflow-providers-teradata/operators/azure_blob_to_teradata.rst
+  - source-integration-name: Amazon Simple Storage Service (S3)
+    target-integration-name: Teradata
+    python-module: airflow.providers.teradata.transfers.s3_to_teradata
+    how-to-guide: /docs/apache-airflow-providers-teradata/operators/s3_to_teradata.rst

 connection-types:
   - hook-class-name: airflow.providers.teradata.hooks.teradata.TeradataHook

Review comment (on the new dependencies): Setting this means that anyone who is using the teradata provider will be forced to also install azure and amazon. See how the google provider handles this (airflow/providers/google/provider.yaml, lines 167 to 185 at 1074b8e). This will allow users of teradata to choose if they want to add the optional dependencies into their installation.

Reply: @eladkal thank you. Will change it.

Reply: Modified as suggested.
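The optional-dependency layout the reviewer points at could look roughly like this in provider.yaml. This is a hedged sketch modeled on the google provider's `additional-extras` section; the exact extra names and schema here are assumptions, not what the PR necessarily adopted:

```yaml
# Hypothetical sketch: declare the cloud providers as optional extras
# instead of hard dependencies (schema borrowed from the google provider).
additional-extras:
  - name: microsoft.azure
    dependencies:
      - apache-airflow-providers-microsoft-azure
  - name: amazon
    dependencies:
      - apache-airflow-providers-amazon
```

With this layout, `pip install apache-airflow-providers-teradata[amazon]` would pull in the Amazon provider only for users who need the S3 transfer operator.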
@@ -0,0 +1,95 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations

from typing import TYPE_CHECKING, Sequence

from airflow.models import BaseOperator
from airflow.providers.microsoft.azure.hooks.wasb import WasbHook
from airflow.providers.teradata.hooks.teradata import TeradataHook

if TYPE_CHECKING:
    from airflow.utils.context import Context


class AzureBlobStorageToTeradataOperator(BaseOperator):
    """
    Loads CSV, JSON and Parquet format data from Azure Blob Storage to Teradata.

    .. seealso::
        For more information on how to use this operator, take a look at the guide:
        :ref:`howto/operator:AzureBlobStorageToTeradataOperator`

    :param blob_source_key: The URI format specifying the location of the Azure blob object store. (templated)
        The URI format is `/az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/YOUR-CONTAINER/YOUR-BLOB-LOCATION`.
        Refer to
        https://docs.teradata.com/search/documents?query=native+object+store&sort=last_update&virtual-field=title_only&content-lang=en-US
    :param azure_conn_id: The Airflow WASB connection used for Azure Blob credentials.
    :param teradata_table: The name of the Teradata table to which the data is transferred. (templated)
    :param teradata_conn_id: The connection ID used to connect to Teradata
        :ref:`Teradata connection <howto/connection:Teradata>`

    Note that ``blob_source_key`` and ``teradata_table`` are
    templated, so you can use variables in them if you wish.
    """

    template_fields: Sequence[str] = ("blob_source_key", "teradata_table")
    ui_color = "#e07c24"

    def __init__(
        self,
        *,
        blob_source_key: str,
        azure_conn_id: str = "azure_default",
        teradata_table: str,
        teradata_conn_id: str = "teradata_default",
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.blob_source_key = blob_source_key
        self.azure_conn_id = azure_conn_id
        self.teradata_table = teradata_table
        self.teradata_conn_id = teradata_conn_id

    def execute(self, context: Context) -> None:
        self.log.info(
            "transferring data from %s to teradata table %s...", self.blob_source_key, self.teradata_table
        )
        azure_hook = WasbHook(wasb_conn_id=self.azure_conn_id)
        conn = azure_hook.get_connection(self.azure_conn_id)
        # Obtain the Azure client ID and secret in order to access the specified Blob container
        access_id = conn.login if conn.login is not None else ""
        access_secret = conn.password if conn.password is not None else ""
        teradata_hook = TeradataHook(teradata_conn_id=self.teradata_conn_id)
        sql = f"""
            CREATE MULTISET TABLE {self.teradata_table} AS
            (
                SELECT * FROM (
                    LOCATION = '{self.blob_source_key}'
                    ACCESS_ID= '{access_id}'
                    ACCESS_KEY= '{access_secret}'
                ) AS d
            ) WITH DATA
            """
        try:
            teradata_hook.run(sql, True)
        except Exception as ex:
            self.log.error(str(ex))
            raise
        self.log.info("The transfer of data from Azure Blob to Teradata was successful")

Review comment (on ACCESS_KEY= '{access_secret}'): Won't this expose the secret in the logs?
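Both transfer operators compose the same Teradata Native Object Store CTAS statement by interpolating the location and credentials into an f-string. A standalone sketch of that composition (`build_nos_ctas` is a hypothetical name, not part of the PR); since credentials are interpolated directly into the SQL text, the rendered statement must never be logged verbatim, which is exactly the exposure concern raised in review:

```python
def build_nos_ctas(table: str, location: str, access_id: str = "", access_key: str = "") -> str:
    """Compose a Teradata Native Object Store CTAS statement, mirroring the
    f-string SQL used by the transfer operators in this PR."""
    return (
        f"CREATE MULTISET TABLE {table} AS\n"
        "(\n"
        "    SELECT * FROM (\n"
        f"        LOCATION = '{location}'\n"
        f"        ACCESS_ID= '{access_id}'\n"
        f"        ACCESS_KEY= '{access_key}'\n"
        "    ) AS d\n"
        ") WITH DATA"
    )


sql = build_nos_ctas("my_table", "/az/acct.blob.core.windows.net/container/blob")
```

Because plain string interpolation is used, an unescaped table name or location would also be a SQL-injection vector; a hardened version would validate or quote these identifiers before rendering.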
@@ -0,0 +1,100 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations

from typing import TYPE_CHECKING, Sequence

from airflow.models import BaseOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.teradata.hooks.teradata import TeradataHook

if TYPE_CHECKING:
    from airflow.utils.context import Context


class S3ToTeradataOperator(BaseOperator):
    """
    Loads CSV, JSON and Parquet format data from Amazon S3 to Teradata.

    .. seealso::
        For more information on how to use this operator, take a look at the guide:
        :ref:`howto/operator:S3ToTeradataOperator`

    :param s3_source_key: The URI format specifying the location of the S3 object store. (templated)
        The URI format is /s3/YOUR-BUCKET.s3.amazonaws.com/YOUR-BUCKET-NAME.
        Refer to
        https://docs.teradata.com/search/documents?query=native+object+store&sort=last_update&virtual-field=title_only&content-lang=en-US
    :param teradata_table: The name of the Teradata table to which the data is transferred. (templated)
    :param aws_conn_id: The Airflow AWS connection used for AWS credentials.
    :param teradata_conn_id: The connection ID used to connect to Teradata
        :ref:`Teradata connection <howto/connection:Teradata>`.

    Note that ``s3_source_key`` and ``teradata_table`` are
    templated, so you can use variables in them if you wish.
    """

    template_fields: Sequence[str] = ("s3_source_key", "teradata_table")
    ui_color = "#e07c24"

    def __init__(
        self,
        *,
        s3_source_key: str,
        teradata_table: str,
        aws_conn_id: str = "aws_default",
        teradata_conn_id: str = "teradata_default",
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.s3_source_key = s3_source_key
        self.teradata_table = teradata_table
        self.aws_conn_id = aws_conn_id
        self.teradata_conn_id = teradata_conn_id

    def execute(self, context: Context) -> None:
        self.log.info(
            "transferring data from %s to teradata table %s...", self.s3_source_key, self.teradata_table
        )

        s3_hook = S3Hook(aws_conn_id=self.aws_conn_id)
        access_key = (
            s3_hook.conn_config.aws_access_key_id if s3_hook.conn_config.aws_access_key_id is not None else ""
        )
        access_secret = (
            s3_hook.conn_config.aws_secret_access_key
            if s3_hook.conn_config.aws_secret_access_key is not None
            else ""
        )

        teradata_hook = TeradataHook(teradata_conn_id=self.teradata_conn_id)
        sql = f"""
            CREATE MULTISET TABLE {self.teradata_table} AS
            (
                SELECT * FROM (
                    LOCATION = '{self.s3_source_key}'
                    ACCESS_ID= '{access_key}'
                    ACCESS_KEY= '{access_secret}'
                ) AS d
            ) WITH DATA
            """
        try:
            teradata_hook.run(sql, True)
        except Exception as ex:
            self.log.error(str(ex))
            raise
        self.log.info("The transfer of data from S3 to Teradata was successful")

Review comment (on reading credentials from conn_config): This one is not semantically correct in the case of an AWS connection. The connection itself might not contain credentials at all. You should call get_credentials(), which returns a ReadOnlyCredentials namedtuple:

    ReadOnlyCredentials = namedtuple(
        'ReadOnlyCredentials', ['access_key', 'secret_key', 'token']
    )

Please note that in case a token is present, the credentials are temporary, and I can't find how to handle that within Teradata, because the manual doesn't contain such information: https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/Teradata-VantageTM-Native-Object-Store-Getting-Started-Guide-17.20/Authentication-for-External-Object-Stores/Using-AWS-Assume-Role/Setting-Up-Assume-Role-on-Analytics-Database — however, this KB shows that somehow it is supported. And finally, anonymous access is pretty difficult to handle, because there is no out-of-the-box solution for that in the AWS hooks, so I would recommend adding a separate parameter so we could skip obtaining the connection at all if this kind of access is required.

Reply: I would recommend adding a note about the current limitation to the Operator documentation.

Reply: Added documentation as suggested at docs/apache-airflow-providers-teradata/operators/s3_to_teradata.rst — https://github.com/apache/airflow/pull/39217/files#diff-763c72396c181b27414fffe4f0cd1a2c97c01868759a119099b56867a03d9e8b
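Following the review suggestion to use the hook's resolved credentials rather than the raw connection fields, the normalization could be sketched as below. This is a hedged illustration: `resolve_nos_credentials` is a hypothetical helper, and the `ReadOnlyCredentials` shape is taken from the reviewer's quote of the AWS hook API.

```python
from __future__ import annotations

from collections import namedtuple

# Shape of what the AWS hook's get_credentials() returns, per the review comment.
ReadOnlyCredentials = namedtuple("ReadOnlyCredentials", ["access_key", "secret_key", "token"])


def resolve_nos_credentials(creds: ReadOnlyCredentials | None) -> tuple[str, str]:
    """Normalize resolved AWS credentials for the NOS CTAS statement.

    Falls back to empty strings for anonymous (public-bucket) access, and
    rejects temporary credentials because passing a session token through
    Teradata NOS is not handled by this operator.
    """
    if creds is None:
        return "", ""
    if creds.token:
        raise ValueError("temporary AWS credentials (session token) are not supported")
    return creds.access_key or "", creds.secret_key or ""
```

In the operator, this would replace the direct `conn_config.aws_access_key_id` reads, with a separate boolean parameter short-circuiting to the anonymous branch without touching the connection at all.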
Review comment (on the parameter-serialization helper): Not sure that I understand this function.

Reply: This function is used to translate stored-procedure output parameters into a format understandable by the driver. For instance, str will be converted to an empty string (''). Stored procedures can be invoked with output parameters in various ways. Invoking the operator with parameters [3, 1, int, str] results in the statement {CALL TEST_PROCEDURE(?,?,?,?)}, with parameters [3, 1, 0, '']. If we omit this function, the statement would be converted to {CALL TEST_PROCEDURE(?,?,?,?)}, with parameters [3, 1, <class 'int'>, <class 'str'>], which leads to failure with an error. Similarly, another invocation of the TeradataStoredProcedureOperator will translate to the statement {CALL TEST_PROCEDURE(?,?,?,?)}, with parameters [3, 1, ?, ?].
Example DAG: https://github.com/apache/airflow/blob/1747e64f51f53a50a62ed31550be9ecf0c5e4ac7/tests/system/providers/teradata/example_teradata_call_sp.py