
Install Spark and Scala

Published: 2020-07-14 22:25:16  Source: Network  Author: 刀刀_高扬  Category: Big Data

1. Download Spark


http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
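
To fetch the package directly on the server, a minimal sketch (assuming wget is installed and the mirror link above is still reachable):

wget http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz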



2. Download Scala


http://www.scala-lang.org/download/2.10.5.html
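
The link above is a download landing page; to pull the tarball from the command line, a sketch (the archive URL below is an assumption based on the usual scala-lang.org layout, adjust if the link has moved):

wget http://www.scala-lang.org/files/archive/scala-2.10.5.tgz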



3. Install Scala

mkdir /usr/lib/scala

tar -zxvf scala-2.10.5.tgz

mv scala-2.10.5 /usr/lib/scala
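
After the move, the binaries should sit under /usr/lib/scala/scala-2.10.5/bin; a quick check:

ls /usr/lib/scala/scala-2.10.5/bin/scala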



4. Set the Scala path

vim /etc/bashrc

export SCALA_HOME=/usr/lib/scala/scala-2.10.5

export PATH=$SCALA_HOME/bin:$PATH


source /etc/bashrc


scala -version
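
If the path was picked up, this prints the interpreter version, something like:

Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL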



5. Distribute Scala to the other nodes

scp -r /usr/lib/scala/ hd2:/usr/lib/scala

scp -r /usr/lib/scala/ hd3:/usr/lib/scala

scp -r /usr/lib/scala/ hd4:/usr/lib/scala

scp -r /usr/lib/scala/ hd5:/usr/lib/scala


scp /etc/bashrc hd2:/etc/bashrc

scp /etc/bashrc hd3:/etc/bashrc

scp /etc/bashrc hd4:/etc/bashrc

scp /etc/bashrc hd5:/etc/bashrc
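
The same distribution can be written as a short loop instead of repeating the command for each host (a sketch, assuming passwordless SSH from hd1 to the other nodes is already set up):

for h in hd2 hd3 hd4 hd5; do
  scp -r /usr/lib/scala/ $h:/usr/lib/scala
  scp /etc/bashrc $h:/etc/bashrc
done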



6. Install Spark

tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz

mkdir /usr/local/spark

mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark



vim /etc/bashrc

export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3

export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH



source /etc/bashrc


cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/

cp spark-env.sh.template spark-env.sh



vim spark-env.sh


export JAVA_HOME=/java

export SCALA_HOME=/usr/lib/scala/scala-2.10.5

export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3

export SPARK_MASTER_IP=192.168.137.101

export SPARK_WORKER_MEMORY=10g

export SPARK_DRIVER_MEMORY=9g

export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop

export SPARK_LIBRARY_PATH=$SPARK_HOME/lib

export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH
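
Adjust these values to the actual cluster: SPARK_MASTER_IP is the address of the master node (hd1 here), SPARK_WORKER_MEMORY caps how much memory each worker can hand out to executors, SPARK_DRIVER_MEMORY sizes the driver process, and HADOOP_CONF_DIR points Spark at the existing Hadoop configuration (assuming Hadoop is installed under /home/hadoop/hadoop as above).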


cp slaves.template slaves



vim slaves


hd1

hd2

hd3

hd4

hd5
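
Every hostname listed in slaves will run a Worker; note that hd1 is included, so the master node also doubles as a worker in this setup.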



7. Distribute Spark to the other nodes

scp /etc/bashrc hd2:/etc

scp /etc/bashrc hd3:/etc

scp /etc/bashrc hd4:/etc

scp /etc/bashrc hd5:/etc


scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/
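
If /usr/local/spark does not yet exist on the target nodes, create it there first (mkdir /usr/local/spark on each node); with a typical OpenSSH scp, copying into a trailing-slash destination that does not exist will fail rather than create it.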



8. Start the cluster

On hd1, run:

cd $SPARK_HOME/sbin

./start-all.sh
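
Once start-all.sh returns, a quick way to confirm the cluster is up (assuming the default standalone ports): jps on hd1 should show a Master plus a Worker, jps on the other nodes a Worker, and the master web UI should answer at http://192.168.137.101:8080. A first shell against the cluster:

$SPARK_HOME/bin/spark-shell --master spark://192.168.137.101:7077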

