
macOS Spark 2.4.3 standalone 搭建的示例分析

Published: 2021-12-13 11:17:30  Source: 亿速云  Views: 121  Author: 柒染  Category: Internet Technology

This article shares a walkthrough of setting up Spark 2.4.3 in standalone mode on macOS. It is quite practical, so it is shared here for your reference; hopefully you will get something out of it. Without further ado, let's get started.

Based on:

JDK 1.8

macOS

1. Extract the archive

nancylulululu:local nancy$ tar -zxvf /Users/nancy/Downloads/spark-2.4.3-bin-hadoop2.7.tar -C /usr/local/opt/

2. Rename and edit the configuration files

nancylulululu:conf nancy$ cp slaves.template slaves

nancylulululu:conf nancy$ cp spark-defaults.conf.template spark-defaults.conf

nancylulululu:conf nancy$ cp spark-env.sh.template spark-env.sh

nancylulululu:conf nancy$ vi slaves

localhost

nancylulululu:conf nancy$ vi spark-env.sh

SPARK_LOCAL_IP=127.0.0.1

SPARK_MASTER_HOST=127.0.0.1

SPARK_MASTER_PORT=7077

SPARK_WORKER_CORES=2

SPARK_WORKER_MEMORY=1G

3. Start Spark

nancylulululu:spark-2.4.3 nancy$ sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /usr/local/opt/spark-2.4.3/logs/spark-nancy-org.apache.spark.deploy.master.Master-1-nancylulululu.out

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/opt/spark-2.4.3/logs/spark-nancy-org.apache.spark.deploy.worker.Worker-1-nancylulululu.out

Use jps to check that one Master and one Worker process are running:

nancylulululu:conf nancy$ jps

3908 CoarseGrainedExecutorBackend

3478 Master

4166 Jps

3896 SparkSubmit

3514 Worker
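
In addition to jps, the standalone master serves a web UI that lists registered workers and running applications; with the defaults above it should be reachable at http://localhost:8080 (the worker UI sits on 8081, and a running application's UI on 4040, as seen in the logs later). As a rough sanity check, the following sketch can be pasted into spark-shell or a plain Scala REPL; the port and the "Spark Master" marker string are assumptions based on Spark's defaults, not something this setup configures explicitly.

import scala.io.Source

// Fetch the master's web UI page (default port 8080; adjust if
// SPARK_MASTER_WEBUI_PORT is set in spark-env.sh).
val masterPage = Source.fromURL("http://localhost:8080").mkString

// The page title normally reads "Spark Master at spark://...",
// so this serves as a crude liveness check.
if (masterPage.contains("Spark Master")) println("Master web UI is up")
else println("Master web UI responded, but the content looks unexpected")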

4. Launching Applications with spark-submit

Once a user application is bundled, it can be launched using the bin/spark-submit script. This script takes care of setting up the classpath with Spark and its dependencies, and can support different cluster managers and deploy modes that Spark supports:

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]

Some of the commonly used options are:

  • --class: The entry point for your application (e.g. org.apache.spark.examples.SparkPi)

  • --master: The  master URL  for the cluster (e.g. spark://23.195.26.187:7077)

  • --deploy-mode: Whether to deploy your driver on the worker nodes (cluster) or locally as an external client (client) (default: client)

  • --conf: Arbitrary Spark configuration property in key=value format. For values that contain spaces wrap “key=value” in quotes (as shown).

  • application-jar: Path to a bundled jar including your application and all dependencies. The URL must be globally visible inside of your cluster, for instance, an hdfs:// path or a file:// path that is present on all nodes.

  • application-arguments: Arguments passed to the main method of your main class, if any

Script:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://localhost:7077 \
 /usr/local/opt/spark-2.4.3/examples/jars/spark-examples_2.11-2.4.3.jar \
100


Result:

Pi is roughly 3.1413047141304715

19/05/14 14:23:06 INFO SparkUI: Stopped Spark web UI at http://localhost:4040

5. spark-shell mode

nancylulululu:spark-2.4.3 nancy$ bin/spark-shell --master spark://localhost:7077

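Once the shell comes up, a few one-liners can confirm that it is actually attached to the standalone master rather than silently falling back to local mode. This is a minimal sketch using only the sc and spark handles that spark-shell predefines:

// Master URL this shell is connected to; should print spark://localhost:7077.
println(sc.master)

// Spark version and default parallelism (with the 2-core worker configured
// above, defaultParallelism should come out as 2).
println(spark.version)
println(sc.defaultParallelism)

// Dump the effective configuration as seen by this application.
sc.getConf.getAll.sortBy(_._1).foreach { case (k, v) => println(s"$k = $v") }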

6. WordCount test

Test file:

nancylulululu:opt nancy$ vi 3.txt

hello scala hello spark hello mysql hello java hello java hello scala

Test result:

scala> sc.textFile("/usr/local/opt/3.txt").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect

res1: Array[(String, Int)] = Array((scala,2), (mysql,1), (hello,6), (java,2), (spark,1))

scala> 
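
The same pipeline can be extended a little without leaving the shell. The sketch below sorts the counts with the most frequent words first and writes them out as text files; the output directory /usr/local/opt/3-wordcount-out is just an illustrative choice and must not exist beforehand, since saveAsTextFile will not overwrite an existing path.

val counts = sc.textFile("/usr/local/opt/3.txt")
  .flatMap(_.split(" "))
  .map((_, 1))
  .reduceByKey(_ + _)

// Most frequent words first.
counts.sortBy(_._2, ascending = false).collect.foreach(println)

// Persist the result as part-* files under the given directory.
counts.saveAsTextFile("/usr/local/opt/3-wordcount-out")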

That concludes this walkthrough of setting up Spark 2.4.3 in standalone mode on macOS. Some of these points are likely to come up in everyday work; hopefully this article has helped you learn something new.
