This article shows how to configure Spark 2.1.0 in standalone mode and how to submit a jar with spark-submit. The walkthrough is kept brief and easy to follow.
Configuration
spark-env.sh
export JAVA_HOME=/apps/jdk1.8.0_181
export SPARK_MASTER_HOST=bigdata00
export SPARK_MASTER_PORT=7077
slaves
bigdata01
bigdata02
bigdata03
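With spark-env.sh and the slaves file in place on every node, the cluster can be brought up from the master. A minimal sketch, assuming Spark is installed under the same path on all four hosts and bigdata00 has passwordless SSH to the workers:
# Run on bigdata00 from the Spark install directory; starts the master and one worker per host listed in slaves
sbin/start-all.sh
# The master web UI should then be reachable at http://bigdata00:8080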
Start the Spark shell
./spark-shell --master spark://bigdata00:7077 --executor-memory 512M
Run a word count in the Spark shell
scala> sc.textFile("hdfs://bigdata00:9000/words").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
Result:
res3: Array[(String, Int)] = Array((this,1), (is,4), (girl,3), (love,1), (will,1), (day,1), (boreing,1), (my,1), (miss,2), (test,2), (forget,1), (spark,2), (soon,1), (most,1), (that,1), (a,2), (afternonn,1), (i,3), (might,1), (of,1), (today,2), (good,1), (for,1), (beautiful,1), (time,1), (and,1), (the,5))
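For readers new to the RDD API, the one-liner above is the usual chain of transformations; a step-by-step equivalent (the intermediate names are added here purely for illustration) can be pasted into the same shell:
val lines  = sc.textFile("hdfs://bigdata00:9000/words")  // one RDD element per line
val words  = lines.flatMap(_.split(" "))                 // split every line into words
val pairs  = words.map((_, 1))                           // pair each word with a count of 1
val counts = pairs.reduceByKey(_ + _)                    // sum the counts per word
counts.collect                                           // bring the result back to the driver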
// Main class
package hgs.sparkwc
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    // Pass the conf to the context so the app name is applied
    val context = new SparkContext(conf)
    // Read the input, count each word, sort by count, and write the result
    context.textFile(args(0), 1)
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .sortBy(_._2)
      .saveAsTextFile(args(1))
    context.stop()
  }
}
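Here args(0) is the input path and args(1) the output path; both are passed on the spark-submit command line shown at the end. For a quick test without a cluster, a master URL can be hard-coded on the SparkConf; a sketch (the local[*] setting is illustrative, not part of the submitted job):
val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]") // run in-process using all local cores
val context = new SparkContext(conf)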
//------------------------------------------------------------------------------------------
// The pom.xml file follows
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>hgs</groupId>
  <artifactId>sparkwc</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <name>sparkwc</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.8</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.6.1</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <archive>
            <manifest>
              <!-- The main class to run when this jar is executed -->
              <mainClass>hgs.sparkwc.WordCount</mainClass>
            </manifest>
          </archive>
          <descriptorRefs>
            <!-- This identifier must be written exactly as shown -->
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <!-- <arg>-make:transitive</arg> -->
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have classpath issues like NoClassDefFoundError, try enabling: -->
          <!-- <useManifestOnlyJar>false</useManifestOnlyJar> -->
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
When the project was built with assembly:assembly, the following error appeared:
scalac error: bad option: '-make:transitive'
The cause is the <arg>-make:transitive</arg> argument in the scala-maven-plugin configuration; commenting that line out (as already done in the pom above) fixes the build.
Answers found online suggest either deleting <arg>-make:transitive</arg> outright, or adding this dependency (note that ${scala.compat.version} is a property used in the quoted answer and is not defined in the pom above):
<dependency>
  <groupId>org.specs2</groupId>
  <artifactId>specs2-junit_${scala.compat.version}</artifactId>
  <version>2.4.16</version>
  <scope>test</scope>
</dependency>
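With this pom, running the build from the project root should produce the fat jar, whose name follows Maven's <artifactId>-<version> convention; the submit command below assumes it was copied to the server as /home/sparkwc.jar:
mvn clean package
# produces target/sparkwc-1.0.0-jar-with-dependencies.jar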
Finally, submit the job on the server:
./spark-submit --master spark://bigdata00:7077 --executor-memory 512M --total-executor-cores 3 /home/sparkwc.jar hdfs://bigdata00:9000/words hdfs://bigdata00:9000/wordsout2
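No --class flag is needed here because the assembly plugin wrote hgs.sparkwc.WordCount into the jar manifest; with a jar that lacks a Main-Class entry, the class would have to be named explicitly, e.g.:
./spark-submit --master spark://bigdata00:7077 --executor-memory 512M --total-executor-cores 3 --class hgs.sparkwc.WordCount /home/sparkwc.jar hdfs://bigdata00:9000/words hdfs://bigdata00:9000/wordsout2
Note also that the output directory hdfs://bigdata00:9000/wordsout2 must not exist beforehand, or saveAsTextFile will fail.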
The above covers the Spark 2.1.0 standalone-mode configuration and how to submit a jar through spark-submit.
Original article: http://blog.itpub.net/31506529/viewspace-2215620/