
How to Use Spark SQL

Published: 2022-01-14 17:20:26 · Author: iii · Category: Development

This article introduces how to use Spark SQL. Many people run into questions about it in day-to-day work, so this post pulls together a simple, practical example that should help clear things up. Let's walk through it step by step.

pom.xml

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>2.1.0</version>
    </dependency>
</dependencies>
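
A note on the artifact IDs: the suffix (_2.10 here) is the Scala version the Spark build was compiled against, and it must match across all Spark artifacts in the project. If you are on a newer Spark release, the equivalent dependencies would look roughly like the sketch below; the 3.3.0 version number is only an assumption for illustration, so pick the build that matches your cluster.

<!-- Illustrative equivalents for a newer Spark build; _2.12 is the Scala version of the build. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.3.0</version>
</dependency>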

Java:

import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SparkSession;

public class SparkSqlTest {
    public static class Person implements Serializable {
        private static final long serialVersionUID = -6259413972682177507L;
        private String name;
        private int age;

        public Person(String name, int age) {
            this.name = name;
            this.age = age;
        }
        public String toString() {
            return name + ": " + age;
        }
        public String getName() {
            return name;
        }
        public void setName(String name) {
            this.name = name;
        }
        public int getAge() {
            return age;
        }
        public void setAge(int age) {
            this.age = age;
        }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("Test").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        SparkSession spark = SparkSession.builder().appName("Test").getOrCreate();
        JavaRDD<String> input = sc.parallelize(Arrays.asList("abc,1", "test,2"));
        JavaRDD<Person> persons = input.map(s -> s.split(",")).map(s -> new Person(s[0], Integer.parseInt(s[1])));

        // [abc: 1, test: 2]
        System.out.println(persons.collect());

        Dataset<Row> df = spark.createDataFrame(persons, Person.class);

        /*
        +---+----+
        |age|name|
        +---+----+
        |  1| abc|
        |  2|test|
        +---+----+
         */
        df.show();

        /*
         root
           |-- age: integer (nullable = false)
           |-- name: string (nullable = true)
         */
        df.printSchema();

        SQLContext sql = new SQLContext(spark);
        sql.registerDataFrameAsTable(df, "person");

        /*
        +---+----+
        |age|name|
        +---+----+
        |  2|test|
        +---+----+
         */
        sql.sql("SELECT * FROM person WHERE age>1").show();

        sc.close();
    }
}
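
For reference, in Spark 2.x the SQLContext route used above is kept mainly for backward compatibility, and registerDataFrameAsTable is deprecated in favour of temporary views created on the DataFrame itself. The following is a minimal sketch of the same query going through SparkSession only; the class name SparkSqlTempViewTest is illustrative, and it assumes the Person bean from the listing above is available in the same package.

import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Illustrative class name; reuses the Person bean from SparkSqlTest above.
public class SparkSqlTempViewTest {
    public static void main(String[] args) {
        // One SparkSession is enough; no separate JavaSparkContext or SQLContext is needed.
        SparkSession spark = SparkSession.builder()
                .appName("Test")
                .master("local")
                .getOrCreate();

        // createDataFrame also accepts a plain java.util.List of beans.
        List<SparkSqlTest.Person> people = Arrays.asList(
                new SparkSqlTest.Person("abc", 1),
                new SparkSqlTest.Person("test", 2));
        Dataset<Row> df = spark.createDataFrame(people, SparkSqlTest.Person.class);

        // createOrReplaceTempView is the non-deprecated replacement for registerDataFrameAsTable.
        df.createOrReplaceTempView("person");

        // Prints the single row (2, test), matching the SQLContext query above.
        spark.sql("SELECT * FROM person WHERE age > 1").show();

        spark.stop();
    }
}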

That wraps up this look at how to use Spark SQL; hopefully it has answered your questions. Pairing the theory with hands-on practice is the best way to learn, so give the example a try. For more related content, you can keep following the 亿速云 site.
