AttributeError: 'PostgresHook' object has no attribute 'schema'
Apache Airflow version: 2.1.0
Kubernetes version (if you are using kubernetes) (use kubectl version): 1.21
Environment:
- Cloud provider or hardware configuration:
- OS (e.g. from /etc/os-release):
- Kernel (e.g. uname -a): Linux workspace
- Install tools:
- Others:
What happened:
Running PostgresOperator errors out with 'PostgresHook' object has no attribute 'schema'. I also tested it with the code from the PostgresOperator tutorial at https://airflow.apache.org/docs/apache-airflow-providers-postgres/stable/operators/postgres_operator_howto_guide.html and got the same error.
This has been happening since upgrading apache-airflow-providers-postgres to version 2.1.0.
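A minimal DAG that reproduces it, adapted from the tutorial linked above (the connection id, start date, and schedule are illustrative placeholders, not part of the original report):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="postgres_operator_dag",
    start_date=datetime(2021, 8, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    create_pet_table = PostgresOperator(
        task_id="create_pet_table",
        postgres_conn_id="postgres_default",  # placeholder connection id
        sql="""
            CREATE TABLE IF NOT EXISTS pet (
                pet_id SERIAL PRIMARY KEY,
                name VARCHAR NOT NULL,
                pet_type VARCHAR NOT NULL,
                birth_date DATE NOT NULL,
                OWNER VARCHAR NOT NULL);
        """,
    )
```

Task log: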
*** Reading local file: /tmp/logs/postgres_operator_dag/create_pet_table/2021-08-04T15:32:40.520243+00:00/2.log
[2021-08-04 15:57:12,429] {taskinstance.py:876} INFO - Dependencies all met for <TaskInstance: postgres_operator_dag.create_pet_table 2021-08-04T15:32:40.520243+00:00 [queued]>
[2021-08-04 15:57:12,440] {taskinstance.py:876} INFO - Dependencies all met for <TaskInstance: postgres_operator_dag.create_pet_table 2021-08-04T15:32:40.520243+00:00 [queued]>
[2021-08-04 15:57:12,440] {taskinstance.py:1067} INFO -
--------------------------------------------------------------------------------
[2021-08-04 15:57:12,440] {taskinstance.py:1068} INFO - Starting attempt 2 of 2
[2021-08-04 15:57:12,440] {taskinstance.py:1069} INFO -
--------------------------------------------------------------------------------
[2021-08-04 15:57:12,457] {taskinstance.py:1087} INFO - Executing <Task(PostgresOperator): create_pet_table> on 2021-08-04T15:32:40.520243+00:00
[2021-08-04 15:57:12,461] {standard_task_runner.py:52} INFO - Started process 4692 to run task
[2021-08-04 15:57:12,466] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', '***_operator_dag', 'create_pet_table', '2021-08-04T15:32:40.520243+00:00', '--job-id', '6', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/test_dag.py', '--cfg-path', '/tmp/tmp2mez286k', '--error-file', '/tmp/tmpgvc_s17j']
[2021-08-04 15:57:12,468] {standard_task_runner.py:77} INFO - Job 6: Subtask create_pet_table
[2021-08-04 15:57:12,520] {logging_mixin.py:104} INFO - Running <TaskInstance: ***_operator_dag.create_pet_table 2021-08-04T15:32:40.520243+00:00 [running]> on host 5995a11eafd1
[2021-08-04 15:57:12,591] {taskinstance.py:1280} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=***_operator_dag
AIRFLOW_CTX_TASK_ID=create_pet_table
AIRFLOW_CTX_EXECUTION_DATE=2021-08-04T15:32:40.520243+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-08-04T15:32:40.520243+00:00
[2021-08-04 15:57:12,591] {postgres.py:68} INFO - Executing:
CREATE TABLE IF NOT EXISTS pet (
pet_id SERIAL PRIMARY KEY,
name VARCHAR NOT NULL,
pet_type VARCHAR NOT NULL,
birth_date DATE NOT NULL,
OWNER VARCHAR NOT NULL);
[2021-08-04 15:57:12,608] {base.py:69} INFO - Using connection to: id: ***_default.
[2021-08-04 15:57:12,610] {taskinstance.py:1481} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1137, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.8/site-packages/airflow/providers/postgres/operators/postgres.py", line 70, in execute
self.hook.run(self.sql, self.autocommit, parameters=self.parameters)
File "/usr/local/lib/python3.8/site-packages/airflow/hooks/dbapi.py", line 177, in run
with closing(self.get_conn()) as conn:
File "/usr/local/lib/python3.8/site-packages/airflow/providers/postgres/hooks/postgres.py", line 97, in get_conn
dbname=self.schema or conn.schema,
AttributeError: 'PostgresHook' object has no attribute 'schema'
[2021-08-04 15:57:12,612] {taskinstance.py:1524} INFO - Marking task as FAILED. dag_id=***_operator_dag, task_id=create_pet_table, execution_date=20210804T153240, start_date=20210804T155712, end_date=20210804T155712
[2021-08-04 15:57:12,677] {local_task_job.py:151} INFO - Task exited with return code 1
What you expected to happen:
How to reproduce it:
Anything else we need to know:
For now, downgrading the Postgres provider (apache-airflow-providers-postgres) to 2.0.0 should fix the problem.
Yep. Confirmed. We have an unintended backwards incompatibility in the Postgres hook.
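If downgrading is not an option, one possible stop-gap (illustrative only, not an official fix) is to call the hook yourself and set the attribute that `get_conn()` reads before running the SQL; `my_postgres_conn` below is a placeholder connection id:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook


def create_pet_table():
    # "my_postgres_conn" is a placeholder; use your own connection id.
    hook = PostgresHook(postgres_conn_id="my_postgres_conn")
    # Provider 2.1.0's get_conn() evaluates `self.schema or conn.schema` (see the
    # traceback above); in this Airflow/provider combination the attribute is never
    # set on the hook, so default it to None to fall back to the connection's schema.
    if not hasattr(hook, "schema"):
        hook.schema = None
    hook.run("CREATE TABLE IF NOT EXISTS pet (pet_id SERIAL PRIMARY KEY);")
```

Setting the attribute before calling `run()` simply avoids the failing `self.schema` lookup in `get_conn()`; downgrading the provider as suggested above remains the simpler option.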