I am trying to submit EMR jobs (EMR on EC2). I am using the code given by Airflow. Airflow is installed with Docker, as recommended by Apache Airflow.
This is the automatic-steps example: https://airflow.apache.org/docs/apache-airflow-providers-amazon/2.2.0/_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html
from datetime import timedelta

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
from airflow.providers.amazon.aws.sensors.emr_job_flow import EmrJobFlowSensor
from airflow.utils.dates import days_ago

SPARK_STEPS = [
    {
        'Name': 'calculate_pi',
        'ActionOnFailure': 'CONTINUE',
        'HadoopJarStep': {
            'Jar': 'command-runner.jar',
            'Args': ['/usr/lib/spark/bin/run-example', 'SparkPi', '10'],
        },
    }
]

JOB_FLOW_OVERRIDES = {
    'Name': 'PiCalc',
    'ReleaseLabel': 'emr-5.29.0',
    'Instances': {
        'InstanceGroups': [
            {
                'Name': 'Master node',
                'Market': 'SPOT',
                'InstanceRole': 'MASTER',
                'InstanceType': 'm1.medium',
                'InstanceCount': 1,
            }
        ],
        'KeepJobFlowAliveWhenNoSteps': False,
        'TerminationProtected': False,
    },
    'Steps': SPARK_STEPS,
    'JobFlowRole': 'EMR_EC2_DefaultRole',
    'ServiceRole': 'EMR_DefaultRole',
}

with DAG(
    dag_id='emr_job_flow_automatic_steps_dag',
    default_args={
        'owner': 'airflow',
        'depends_on_past': False,
        'email': ['airflow@example.com'],
        'email_on_failure': False,
        'email_on_retry': False,
    },
    dagrun_timeout=timedelta(hours=2),
    start_date=days_ago(2),
    schedule_interval='0 3 * * *',
    tags=['example'],
) as dag:

    # [START howto_operator_emr_automatic_steps_tasks]
    job_flow_creator = EmrCreateJobFlowOperator(
        task_id='create_job_flow',
        job_flow_overrides=JOB_FLOW_OVERRIDES,
        aws_conn_id='aws_default',
        emr_conn_id='emr_default',
    )

    job_sensor = EmrJobFlowSensor(
        task_id='check_job_flow',
        job_flow_id=job_flow_creator.output,
        aws_conn_id='aws_default',
    )
    # [END howto_operator_emr_automatic_steps_tasks]

    # Task dependency created via `XComArgs`:
    # job_flow_creator >> job_sensor
Issues are:
- from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
  from airflow.providers.amazon.aws.sensors.emr_job_flow import EmrJobFlowSensor
  give an error saying the module cannot be imported, even though the Amazon provider package is installed in my scheduler container. They can be imported using the paths given in the manual-steps example:
from airflow.providers.amazon.aws.operators.emr import (
    EmrAddStepsOperator,
    EmrCreateJobFlowOperator,
    EmrModifyClusterOperator,
    EmrTerminateJobFlowOperator,
)
from airflow.providers.amazon.aws.sensors.emr import EmrJobFlowSensor
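A quick way to confirm which module paths the installed provider version actually exposes, without editing the DAG, is a small stdlib-only probe run inside the scheduler container (a sketch; the "the per-operator modules were consolidated into a single `emr` module in newer provider releases" reading is an inference from the fact that the manual-steps imports work):

```python
import importlib.util

def module_available(dotted_path: str) -> bool:
    """Return True if the dotted module path can be resolved in this environment."""
    try:
        return importlib.util.find_spec(dotted_path) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. airflow itself) also means "not available".
        return False

# Old path from the 2.2.0 example docs vs. the consolidated path:
for path in (
    'airflow.providers.amazon.aws.operators.emr_create_job_flow',
    'airflow.providers.amazon.aws.operators.emr',
):
    print(path, '->', 'available' if module_available(path) else 'missing')
```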
- While submitting the job, 'Args': ['/usr/lib/spark/bin/run-example', 'SparkPi', '10'] gives an error saying:
"Exception in thread "main" java.lang.RuntimeException: java.io.IOException: Cannot run program "/usr/lib/spark/bin/run-example" (in directory "."): error=2, No such file or directory"
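For comparison, a commonly used alternative is to invoke spark-submit through command-runner.jar and point it at the Spark examples jar, rather than calling the run-example script by path. This is a sketch, not a confirmed fix: the examples-jar path below assumes the default EMR Spark install layout and may differ between release labels.

```python
# Alternative step definition: run SparkPi via spark-submit, which is on the
# PATH of EMR primary nodes, instead of the /usr/lib/spark/bin/run-example
# script.  The examples-jar path is an assumption (default EMR layout).
SPARK_STEPS = [
    {
        'Name': 'calculate_pi',
        'ActionOnFailure': 'CONTINUE',
        'HadoopJarStep': {
            'Jar': 'command-runner.jar',
            'Args': [
                'spark-submit',
                '--class', 'org.apache.spark.examples.SparkPi',
                '/usr/lib/spark/examples/jars/spark-examples.jar',
                '10',
            ],
        },
    }
]
```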
What are my issues here? Expecting some help. Thanks.
source https://stackoverflow.com/questions/76017388/how-to-submit-apache-airflow-emr-job-on-ec2