
Connection to Postgres failed due to special characters in a password #38187

@DEworkerDE

Description


Apache Airflow version

Other Airflow 2 version (please specify below)

If "Other Airflow 2 version" selected, which one?

2.7.1

What happened?

The Airflow UI "Test connection" button reports success, but when a task uses PostgresHook I get a connection error (db name = TEST_ETL, for example).
The Airflow metadata database itself works fine, i.e. we are using the same server but a different database.

In the UI: [screenshot: connection test succeeded]

In the task we then get this error:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) invalid dsn: invalid connection option "extra"
(Background on this error at: https://sqlalche.me/e/14/f405)
[2024-03-15, 15:15:08 UTC] {taskinstance.py:1398} INFO - Marking task as FAILED. dag_id=data_ingestion, task_id=set_static_data, execution_date=20240315T151344, start_date=20240315T151431, end_date=20240315T151508
[2024-03-15, 15:15:08 UTC] {standard_task_runner.py:104} ERROR - Failed to execute job 10 for task set_static_data ((psycopg2.ProgrammingError) invalid dsn: invalid connection option "extra"
(Background on this error at: https://sqlalche.me/e/14/f405); 123)
[2024-03-15, 15:15:08 UTC] {local_task_job_runner.py:228} INFO - Task exited with return code 1
[2024-03-15, 15:15:08 UTC] {taskinstance.py:2776} INFO - 0 downstream tasks scheduled from follow-on schedule check

Passwords that consistently failed (100% of the time):

  1. "+&I>[A1FEyxze-P3"
  2. "!&*Rr1(-_=[{<?"

I guess some special characters or combinations of them are not handled correctly, or it fails when one appears at the first position of the password.
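All of the characters in the failing passwords above (`+`, `&`, `>`, `[`, `!`, `?`, …) are reserved or unsafe in a URL, so if a connection URI is built by plain string interpolation they corrupt the DSN. A minimal stdlib-only sketch of the percent-encoding these passwords would need before being embedded in a SQLAlchemy-style URI (the host/user names below are made up for illustration):

```python
from urllib.parse import quote_plus, unquote_plus

# One of the passwords from the report; '+', '&', '>', '[' are all
# URL-reserved/unsafe and must be percent-encoded inside a URI.
password = "+&I>[A1FEyxze-P3"
encoded = quote_plus(password)
print(encoded)  # e.g. '+' becomes '%2B', '&' becomes '%26'

# Hypothetical URI for illustration only (not how the hook builds it):
uri = f"postgresql+psycopg2://etl_user:{encoded}@db.example.com:5432/TEST_ETL"
print(uri)

# Encoding is lossless: decoding restores the original password.
assert unquote_plus(encoded) == password
```

If the raw (unencoded) password were interpolated instead, the resulting URI would parse incorrectly, which is consistent with the kind of DSN corruption seen in the traceback.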

What do you think should happen instead?

If the UI reports success when testing the connection, and Airflow works fine with its own metadata database using the same password, then PostgresHook inside DAGs/tasks should also work, but it seems it does not.

How to reproduce

Just try to create a connection to a database in the UI and test it with one of these passwords:
+&I>[A1FEyxze-P3
!&*Rr1(-_=[{<?

and then run, for example, this code in a task:

import pandas as pd
from airflow.providers.postgres.hooks.postgres import PostgresHook

psql_hook = PostgresHook(postgres_conn_id="postgres_conn_id")
data = {
    "product_name": ["Computer", "Tablet", "Monitor", "Printer"],
    "price": [900, 300, 450, 150],
}
df = pd.DataFrame(data, columns=["product_name", "price"])
df.to_sql(
    name="test_product",
    con=psql_hook.get_sqlalchemy_engine(),
    index=False,
    if_exists="replace",
)
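As a hedged workaround sketch (not a confirmed fix for the hook itself): SQLAlchemy can build the connection URL from its components via `URL.create`, which escapes the password correctly regardless of special characters. The credentials below are hypothetical; only the database name `TEST_ETL` comes from the report:

```python
from sqlalchemy.engine import URL

# Build the URL from parts instead of string interpolation; SQLAlchemy
# handles escaping of special characters in the password itself.
url = URL.create(
    drivername="postgresql+psycopg2",
    username="etl_user",                 # hypothetical
    password="+&I>[A1FEyxze-P3",         # failing password from the report
    host="db.example.com",               # hypothetical
    port=5432,
    database="TEST_ETL",
)
print(url.render_as_string(hide_password=False))
```

An engine created with `sqlalchemy.create_engine(url)` could then be passed as `con=` to `df.to_sql` in place of `psql_hook.get_sqlalchemy_engine()`, sidestepping whatever DSN mangling the hook performs.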

Operating System

Linux

Versions of Apache Airflow Providers

No response

Deployment

Official Apache Airflow Helm Chart

Deployment details

Deployed in k8s; connections were set up and tested both via the REST API and manually.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
