I am new to PySpark. I have a Python script that launches a Spark job which runs 5 queries.
Now, when the Spark job runs and, say, the 3rd query fails, I want the job to close the existing Spark connection, open a new one, and resume from the failed query. In this way, each failed query should be attempted twice.
I also want a logger that reports progress, e.g. a message saying the first query is done once it completes.
How should I proceed?
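Below is a minimal sketch of one way this could look, assuming the 5 queries are plain SQL strings executed via spark.sql. The QUERIES list, the app name, and MAX_ATTEMPTS here are placeholders, not details from the original post.

```python
import logging

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

QUERIES = [
    # Placeholder: put your 5 SQL statements here, in order.
    "SELECT 1",
]
MAX_ATTEMPTS = 2  # each failed query is attempted twice in total


def new_session():
    # Build a fresh SparkSession. After spark.stop(), getOrCreate()
    # creates a new session rather than reusing the stopped one.
    return SparkSession.builder.appName("retryable-job").getOrCreate()


def run_queries(queries):
    spark = new_session()
    for i, sql in enumerate(queries, start=1):
        for attempt in range(1, MAX_ATTEMPTS + 1):
            try:
                # collect() forces execution for SELECT-style queries;
                # adjust to whatever action your job actually performs.
                spark.sql(sql).collect()
                logger.info("Query %d done", i)
                break  # success: move on to the next query
            except Exception:
                logger.exception("Query %d failed on attempt %d", i, attempt)
                # Close the current connection and open a new one before retrying.
                spark.stop()
                spark = new_session()
        else:
            # The inner loop never hit `break`: every attempt failed.
            spark.stop()
            raise RuntimeError(f"Query {i} failed after {MAX_ATTEMPTS} attempts")
    spark.stop()


if __name__ == "__main__":
    run_queries(QUERIES)
```

The outer loop tracks which query is current, so a retry restarts from the failed query rather than from the beginning, and the `for ... else` raises only if both attempts fail.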
Source: https://stackoverflow.com/questions/70806595/run-spark-job-when-failed-automatically-using-python-script