
RecursionError: maximum recursion depth exceeded in __instancecheck__ #920

Open

anmolbansal2 opened this issue Apr 23, 2024 · 4 comments

Labels: area:config, area:execution, area:profile, bug, epic-assigned, execution:local, parsing:dbt_ls, profile:snowflake, triage-needed

Stack trace:

[2024-04-19T11:13:13.311+0000] {dagbag.py:346} ERROR - Failed to import: /runner/_work/bi-global-data-orchestration/bi-global-data-orchestration/tests/unit/airflow/dags/../../../../airflow/dags_live/snowflake_reports.py
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/airflow/models/dagbag.py", line 342, in parse
    loader.exec_module(new_module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/runner/_work/bi-global-data-orchestration/bi-global-data-orchestration/tests/unit/airflow/dags/../../../../airflow/dags_live/snowflake_reports.py", line 57, in <module>
    basic_cosmos_dag = DbtTaskGroup(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/airflow/task_group.py", line 26, in __init__
    DbtToAirflowConverter.__init__(self, *args, **specific_kwargs(**kwargs))
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/converter.py", line 244, in __init__
    dbt_graph.load(method=render_config.load_method, execution_mode=execution_config.execution_mode)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/dbt/graph.py", line 176, in load
    self.load_via_dbt_ls()
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/dbt/graph.py", line 251, in load_via_dbt_ls
    with self.profile_config.ensure_profile(use_mock_values=True) as profile_values, environ(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/config.py", line 257, in ensure_profile
    profile_contents = self.profile_mapping.get_profile_file_contents(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/cosmos/profiles/base.py", line 186, in get_profile_file_contents
    return str(yaml.dump(profile_contents, indent=4))
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/__init__.py", line 253, in dump
    return dump_all([data], stream, Dumper=Dumper, **kwds)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/__init__.py", line 241, in dump_all
    dumper.represent(data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 27, in represent
    node = self.represent_data(data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 48, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 207, in represent_dict
    return self.represent_mapping('tag:yaml.org,2002:map', data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 48, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 207, in represent_dict
    return self.represent_mapping('tag:yaml.org,2002:map', data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 48, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 207, in represent_dict
    return self.represent_mapping('tag:yaml.org,2002:map', data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 48, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 207, in represent_dict
    return self.represent_mapping('tag:yaml.org,2002:map', data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 52, in represent_data
    node = self.yaml_multi_representers[data_type](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 342, in represent_object
    return self.represent_mapping(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 52, in represent_data
    node = self.yaml_multi_representers[data_type](self, data)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 342, in represent_object
    return self.represent_mapping(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/representer.py", line 118, in represent_mapping
    node_value = self.represent_data(item_value)

dosubot bot commented Apr 23, 2024

The RecursionError: maximum recursion depth exceeded in __instancecheck__ you're encountering is likely due to a circular dependency or recursive calls within your code, particularly during the YAML serialization process with yaml.dump in the get_profile_file_contents method. This issue can arise if there's a self-referencing structure within the data being serialized, which is common when dealing with complex nested structures that YAML tries to represent.
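As an illustration of the failure mode (not code from the issue itself): PyYAML's default dumper detects cycles by object identity, so a structure that produces a *fresh* object on every traversal defeats that check and recurses until the interpreter limit. `LazyProxy` below is a hypothetical stand-in for a lazy secret-wrapper object:

```python
import yaml

class LazyProxy:
    """Hypothetical stand-in for a lazy secret wrapper.

    __reduce__ returns a brand-new LazyProxy as state on every call,
    so PyYAML's identity-based alias detection never sees a repeated
    object, and represent_object descends forever.
    """
    def __reduce__(self):
        return (LazyProxy, (), {"child": LazyProxy()})

try:
    yaml.dump({"password": LazyProxy()})
except RecursionError as exc:
    print(type(exc).__name__)  # same error class as in the stack trace above
```

A plain self-referencing dict would not fail this way: PyYAML emits `&id001`/`*id001` anchors and aliases for true cycles. The error requires object identity to change on each visit, which is why lazy wrappers are a plausible culprit.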

To address this issue, check for any circular references or recursive patterns in the data structures (profile_vars or the result of self.dbt_config_vars.as_dict()) that are passed to yaml.dump. Ensure that these structures do not contain self-references or recursive links, which can trigger the recursion limit error during serialization.
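One way to make that check concrete (a hypothetical helper, not part of Cosmos) is to coerce the profile mapping down to plain builtins before handing it to `yaml.dump`, so any wrapper object is resolved exactly once:

```python
import yaml

def to_plain(value):
    # Recursively convert mappings/sequences to builtins; anything
    # else (e.g. a lazy secret wrapper) is stringified once, which
    # breaks the unbounded descent before yaml.dump walks it.
    if isinstance(value, dict):
        return {key: to_plain(item) for key, item in value.items()}
    if isinstance(value, (list, tuple)):
        return [to_plain(item) for item in value]
    if isinstance(value, (str, int, float, bool)) or value is None:
        return value
    return str(value)

profile = {"outputs": {"dev": {"user": "me", "password": object()}}}
print(yaml.dump(to_plain(profile), indent=4))
```

Caveat: stringifying a secret wrapper may resolve the real secret into the dumped YAML, which is exactly what `ensure_profile(use_mock_values=True)` tries to avoid, so this is a diagnostic aid rather than a fix.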

If the problem persists, consider reviewing the initialization and usage patterns of your classes to ensure they do not inadvertently create a loop or exceed the maximum recursion depth. This might involve reevaluating the class design, particularly the use of inheritance and composition, to prevent recursive initialization or method calls.

By carefully examining and potentially refactoring the data structures or class designs involved in the serialization process, you should be able to resolve the recursion error and successfully serialize your data to YAML.


@dosubot dosubot bot added the labels area:config, area:execution, area:profile, execution:local, parsing:dbt_ls, profile:snowflake on Apr 23, 2024

anmolb12 commented Apr 24, 2024

@dosubot I am passing a vault secret in the config, which is causing this infinite recursion. Does Cosmos not handle vault secrets?

tatiana (Collaborator) commented Apr 25, 2024

@anmolb12 Cosmos should handle vault secrets without issues.
Is there any chance you could give a step-by-step to reproduce this issue?

@tatiana tatiana added the bug Something isn't working label May 17, 2024
@tatiana tatiana added this to the 1.5.0 milestone May 17, 2024
tatiana (Collaborator) commented May 17, 2024

This was probably seen in Cosmos 1.3.2 or 1.4.0a1; we need more details.

@tatiana tatiana added the triage-needed Items need to be reviewed / assigned to milestone label May 17, 2024