
My complete walkthrough of SparkContext, the core of the Spark source code

Published: 2020-07-20 09:29:05  Source: web  Reads: 312  Author: moviebat  Category: Big Data




The main classes touched in this walkthrough, and where they live in the source tree:

Driver Program (SparkConf)   org.apache.spark.SparkConf

Master                       org.apache.spark.deploy.master.Master

SparkContext                 org.apache.spark.SparkContext

Stage                        org.apache.spark.scheduler.Stage

Task                         org.apache.spark.scheduler.Task

DAGScheduler                 org.apache.spark.scheduler.DAGScheduler

TaskScheduler                org.apache.spark.scheduler.TaskScheduler

TaskSchedulerImpl            org.apache.spark.scheduler.TaskSchedulerImpl

Worker                       org.apache.spark.deploy.worker.Worker

Executor                     org.apache.spark.executor.Executor

BlockManager                 org.apache.spark.storage.BlockManager

TaskSet                      org.apache.spark.scheduler.TaskSet
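To make the division of labor concrete: the DAGScheduler cuts a job's lineage into Stages at shuffle (wide) dependencies, and each Stage is handed to the TaskScheduler as a TaskSet of Tasks. The following is a toy sketch of that stage-splitting rule, not Spark's real classes; every name in it is illustrative:

```scala
// Toy model (NOT Spark's actual classes): the DAGScheduler splits a
// linear lineage into stages at shuffle (wide) dependency boundaries.
sealed trait Dep
case object Narrow extends Dep // e.g. map, filter: no shuffle
case object Wide   extends Dep // e.g. groupByKey: shuffle boundary

// Each entry in `lineage` is the dependency linking an RDD to its parent.
// One stage to start with, plus one more per shuffle boundary.
def countStages(lineage: List[Dep]): Int =
  1 + lineage.count(_ == Wide)

// map -> filter -> groupByKey -> map  ==>  2 stages
val lineage = List(Narrow, Narrow, Wide, Narrow)
println(countStages(lineage)) // prints 2
```

The real DAGScheduler walks an arbitrary DAG rather than a linear list, but the rule is the same: a new Stage begins wherever a ShuffleDependency appears.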


// After SparkContext initialization, the scheduler is created:

// Create and start the scheduler

    val (sched, ts) = SparkContext.createTaskScheduler(this, master)

    _schedulerBackend = sched

    _taskScheduler = ts

    _dagScheduler = new DAGScheduler(this)

    _heartbeatReceiver.send(TaskSchedulerIsSet)

 

/**

   * Create a task scheduler based on a given master URL.

   * Return a 2-tuple of the scheduler backend and the task scheduler.

   */

  private def createTaskScheduler(

      sc: SparkContext,

      master: String): (SchedulerBackend, TaskScheduler) = {


master match {
      case LOCAL_CLUSTER_REGEX(numSlaves, coresPerSlave, memoryPerSlave) =>

(Note: the branch shown here is the `local-cluster[N, cores, memory]` case, matched by LOCAL_CLUSTER_REGEX; the plain `"local"` case creates a LocalBackend instead.)

Instantiate the TaskSchedulerImpl:

        val scheduler = new TaskSchedulerImpl(sc)

Spin up an in-process standalone cluster and build the masterUrls (without this line, `localCluster` below would be undefined):

        val localCluster = new LocalSparkCluster(
          numSlaves.toInt, coresPerSlave.toInt, memoryPerSlave.toInt, sc.conf)
        val masterUrls = localCluster.start()

The backend, said to be the crucial piece:

        val backend = new SparkDeploySchedulerBackend(scheduler, sc, masterUrls)

        scheduler.initialize(backend)

        backend.shutdownCallback = (backend: SparkDeploySchedulerBackend) => {

          localCluster.stop()

        }

        (backend, scheduler)
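The dispatch above is driven entirely by pattern matching on the master URL string. As a standalone sketch (not Spark's actual code; the real regexes live alongside SparkContext, and the descriptions returned here are purely illustrative), the shape of that dispatch looks like this:

```scala
// Sketch of how createTaskScheduler dispatches on the master URL.
// The regexes approximate Spark's; the returned strings are illustrative.
val LOCAL_N_REGEX       = """local\[([0-9]+|\*)\]""".r
val LOCAL_CLUSTER_REGEX =
  """local-cluster\[\s*([0-9]+)\s*,\s*([0-9]+)\s*,\s*([0-9]+)\s*\]""".r

def schedulerFor(master: String): String = master match {
  case "local" =>
    "LocalBackend, 1 thread"
  case LOCAL_N_REGEX(threads) =>
    s"LocalBackend, $threads threads"
  case LOCAL_CLUSTER_REGEX(n, cores, mem) =>
    s"SparkDeploySchedulerBackend over LocalSparkCluster($n workers, $cores cores, $mem MB)"
  case url if url.startsWith("spark://") =>
    "SparkDeploySchedulerBackend (standalone master)"
  case other =>
    s"unsupported master URL: $other"
}

println(schedulerFor("local-cluster[2,1,1024]"))
```

Because each regex's `unapplySeq` binds the captured groups, the local-cluster case can pull `numSlaves`, `coresPerSlave`, and `memoryPerSlave` straight out of the URL, exactly as the real code does.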


