FileNotFoundError: [Errno 2] No such file or directory: 'airflow': 'airflow'

See original GitHub issue

Airflow 1.10.12 on Ubuntu

I am puzzled by this error and am trying to determine what file/directory of ‘airflow’ is not being seen. I initially started off trying to build out Papermill DAGs when I noticed this issue, so I have reduced things down to the basics from the documentation to try to find what is wrong.

I have used most of the defaults.

  • my home directory contains an airflow folder
  • from inside that folder I ran:
airflow initdb

Within the airflow folder I have placed a file called airflow.env, the contents of which are:

AIRFLOW_HOME=/home/ubuntu/airflow/

I have created the systemd file in /etc/systemd/system/airflow-scheduler.service, the contents of which are:

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
EnvironmentFile=/home/ubuntu/airflow/airflow.env
User=ubuntu
Group=ubuntu
Type=simple
ExecStart=/home/ubuntu/anaconda3/envs/data_analysis_lab/bin/airflow scheduler
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target
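
I then reloaded systemd and started the service in the usual way (standard systemctl commands; enable --now both enables the unit at boot and starts it immediately):

sudo systemctl daemon-reload
sudo systemctl enable --now airflow-scheduler.service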

Manually triggering the built-in example example_python_operator fails:

journalctl -u airflow-scheduler.service shows:

Oct 09 17:32:38 aa airflow[27741]: [2020-10-09 17:32:38,759] {scheduler_job.py:1195} INFO - Sending ('example_python_operator', 'print_the_context', datetime.datetime(2020, 10, 9, 17, 26, 57, 296322, tzinfo=<Timezone [UTC]>), 1) to execu
Oct 09 17:32:38 aa airflow[27741]: [2020-10-09 17:32:38,760] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_python_operator', 'print_the_context', '2020-10-09T17:26:57.296322+00:00', '--local', '--pool', 'defau
Oct 09 17:32:38 aa airflow[27741]: [2020-10-09 17:32:38,762] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_python_operator', 'print_the_context', '2020-10-09T17:26:57.296322+00:00', '--local', '--pool'
Oct 09 17:32:38 aa airflow[27741]: [2020-10-09 17:32:38,787] {scheduler_job.py:1401} ERROR - Exception when executing execute_helper
Oct 09 17:32:38 aa airflow[27741]: Traceback (most recent call last):
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1399, in _execute
Oct 09 17:32:38 aa airflow[27741]:     self._execute_helper()
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1470, in _execute_helper
Oct 09 17:32:38 aa airflow[27741]:     if not self._validate_and_run_task_instances(simple_dag_bag=simple_dag_bag):
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1532, in _validate_and_run_task_instances
Oct 09 17:32:38 aa airflow[27741]:     self.executor.heartbeat()
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/site-packages/airflow/executors/base_executor.py", line 134, in heartbeat
Oct 09 17:32:38 aa airflow[27741]:     self.sync()
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/site-packages/airflow/executors/sequential_executor.py", line 57, in sync
Oct 09 17:32:38 aa airflow[27741]:     subprocess.check_call(command, close_fds=True)
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/subprocess.py", line 358, in check_call
Oct 09 17:32:38 aa airflow[27741]:     retcode = call(*popenargs, **kwargs)
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/subprocess.py", line 339, in call
Oct 09 17:32:38 aa airflow[27741]:     with Popen(*popenargs, **kwargs) as p:
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/subprocess.py", line 800, in __init__
Oct 09 17:32:38 aa airflow[27741]:     restore_signals, start_new_session)
Oct 09 17:32:38 aa airflow[27741]:   File "/home/ubuntu/anaconda3/envs/data_analysis_lab/lib/python3.7/subprocess.py", line 1551, in _execute_child
Oct 09 17:32:38 aa airflow[27741]:     raise child_exception_type(errno_num, err_msg, err_filename)
Oct 09 17:32:38 aa airflow[27741]: FileNotFoundError: [Errno 2] No such file or directory: 'airflow': 'airflow'

Interestingly, if I don’t use systemd and just launch the airflow scheduler from a terminal in my home directory, I don’t get these errors - so I assume it has something to do with the systemd configuration?
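
For context, my reading of the traceback above is that the SequentialExecutor hands the task off to subprocess as a bare program name, so resolution depends entirely on the PATH the scheduler process inherited. A minimal sketch of the failure mode (the command list here is illustrative, not the exact one from the logs):

import subprocess

# The executor queues the task as a plain argument list, e.g.
# ['airflow', 'run', 'example_python_operator', ...]. Because the first
# element is a bare name rather than an absolute path, Popen must
# resolve it via the PATH of the scheduler process.
command = ['airflow', 'version']

# Under a login shell, PATH includes the conda env's bin directory, so
# this succeeds; under systemd the service gets a minimal PATH without
# the airflow entry point, and Popen raises
# FileNotFoundError: [Errno 2] No such file or directory: 'airflow'
subprocess.check_call(command, close_fds=True)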

I have not, as of yet, been able to get any more detail on: FileNotFoundError: [Errno 2] No such file or directory: ‘airflow’: ‘airflow’
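
One check that might narrow this down is to compare the PATH a login shell sees with the PATH under systemd (a sketch; the systemd-run call is just a generic way to see the environment a transient service would get, and --pipe needs a reasonably recent systemd):

# PATH as seen by my login shell:
echo "$PATH"
# environment explicitly configured on the unit:
systemctl show airflow-scheduler.service --property=Environment
# environment an actual service process receives:
sudo systemd-run --wait --pipe /usr/bin/env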

Any ideas for additional troubleshooting?

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 10 (1 by maintainers)

Top GitHub Comments

5 reactions
afonit commented, Oct 12, 2020

For others who come across this.

The error was due to the fact that, on Ubuntu, the [Service] section needs to be Environment="PATH=/...." rather than EnvironmentFile=...
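
A sketch of what the corrected [Service] section might look like for the unit above (the PATH value is an assumption - it prepends the conda env's bin directory from the ExecStart line to a typical default PATH):

[Service]
# Inline Environment= entries instead of EnvironmentFile=, so the bare
# 'airflow' command spawned by the executor can be resolved:
Environment="PATH=/home/ubuntu/anaconda3/envs/data_analysis_lab/bin:/usr/local/bin:/usr/bin:/bin"
Environment="AIRFLOW_HOME=/home/ubuntu/airflow/"
User=ubuntu
Group=ubuntu
Type=simple
ExecStart=/home/ubuntu/anaconda3/envs/data_analysis_lab/bin/airflow scheduler
Restart=always
RestartSec=5s

followed by a sudo systemctl daemon-reload and a restart of the scheduler service.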

0 reactions
idnoclip commented, Mar 28, 2022

I had a similar issue on a Raspberry Pi - it seemed that somehow the service used a different PATH variable than the user running it. I managed to work around it by modifying the /etc/systemd/system/airflow-scheduler.service file, replacing the line:

ExecStart=/bin/bash -c '/path/to/airflow scheduler'

with

ExecStart=/bin/bash -c 'PATH=$PATH:/path/to/airflow; /path/to/airflow scheduler'

This way I was sure that the PATH variable contained the path to the airflow executable. Then I had to stop the services, reload the systemd daemon and start them again:

sudo service airflow-webserver stop
sudo service airflow-scheduler stop
sudo systemctl daemon-reload
sudo service airflow-webserver start
sudo service airflow-scheduler start
