wyb 532d15d381 [Spark Load]Fe submit spark etl job (#3716)
After a user creates a Spark load job whose status is PENDING, FE schedules and submits the Spark ETL job as follows (a rough sketch of the flow is given after the list):
1. Begin a transaction
2. Create a SparkLoadPendingTask for submitting the ETL job
2.1 Create the ETL job configuration according to https://github.com/apache/incubator-doris/issues/3010#issuecomment-635174675
2.2 Upload the configuration file and the job jar to HDFS via the broker
2.3 Submit the ETL job to the Spark cluster
2.4 Wait for the ETL job submission result
3. Update the job state to ETL and log the job update info if the ETL job is submitted successfully
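The sketch below is a minimal Java illustration of the steps above, not the actual FE code in this commit: the class and method names (SparkLoadPendingTaskSketch, BrokerClient, SparkClient, beginTransaction, updateJobStateToEtl) and the HDFS/local paths are assumed placeholders.

```java
// Hypothetical sketch of the pending-task flow described above; names and paths
// are illustrative placeholders, not the real Doris FE API.
import java.util.UUID;

public class SparkLoadPendingTaskSketch {

    // Placeholder collaborators standing in for FE components.
    interface BrokerClient { void upload(String localPath, String remotePath) throws Exception; }
    interface SparkClient  { String submit(String configHdfsPath, String jobJarHdfsPath) throws Exception;
                             String waitForSubmissionResult(String appId) throws Exception; }

    private final BrokerClient broker;
    private final SparkClient spark;

    SparkLoadPendingTaskSketch(BrokerClient broker, SparkClient spark) {
        this.broker = broker;
        this.spark = spark;
    }

    /** Runs the steps listed above for one PENDING load job. */
    void execute(long loadJobId, String etlJobConfigJson) throws Exception {
        // 1. Begin a transaction for the load job (placeholder for the FE transaction manager).
        long txnId = beginTransaction(loadJobId);

        // 2.1 The ETL job configuration is assumed to be serialized to JSON already.
        String localConfigPath = "/tmp/spark_etl_job_" + loadJobId + ".json";
        writeLocalFile(localConfigPath, etlJobConfigJson);

        // 2.2 Upload the configuration file and the ETL job jar to HDFS through the broker.
        String remoteConfigPath = "hdfs://ns1/doris/spark_load/" + loadJobId + "/job.json";
        String remoteJarPath = "hdfs://ns1/doris/spark_load/" + loadJobId + "/spark-dpp.jar";
        broker.upload(localConfigPath, remoteConfigPath);
        broker.upload("/opt/doris/lib/spark-dpp.jar", remoteJarPath);

        // 2.3 Submit the ETL job to the Spark cluster.
        String appId = spark.submit(remoteConfigPath, remoteJarPath);

        // 2.4 Wait for the submission result before moving on.
        String result = spark.waitForSubmissionResult(appId);

        // 3. On success, move the load job to the ETL state and log the update.
        if ("SUBMITTED".equals(result)) {
            updateJobStateToEtl(loadJobId, txnId, appId);
        }
    }

    // --- placeholders so the sketch is self-contained ---
    private long beginTransaction(long loadJobId) { return UUID.randomUUID().getLeastSignificantBits(); }
    private void writeLocalFile(String path, String content) { /* omitted */ }
    private void updateJobStateToEtl(long loadJobId, long txnId, String appId) {
        System.out.println("load job " + loadJobId + " -> ETL, txn=" + txnId + ", app=" + appId);
    }
}
```

In the real FE the transaction and the state change would be handled by the frontend's own managers and persisted, so the placeholders above only mirror the ordering of the steps.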

#3433
2020-06-19 17:44:47 +08:00