OS: RHEL6
Software: JDK 1.7 (server-jre-7u80), hadoop-2.8.1.tar.gz, apache-maven-3.3.9, mysql-5.1

| IP address  | Role | Hostname |
| xx.xx.xx.xx | NN   | hadoop01 |
| xx.xx.xx.xx | DN   | hadoop02 |
| xx.xx.xx.xx | DN   | hadoop03 |
| xx.xx.xx.xx | DN   | hadoop04 |
| xx.xx.xx.xx | DN   | hadoop05 |
This walkthrough uses a pseudo-distributed deployment and only involves host hadoop01; for the base software installation, see the pseudo-distributed deployment guide (伪分布式部署终极篇).
Requirements
============
- Java 1.6 or 1.7 (the JDK must be 1.6 or 1.7; 1.8 is not supported)
- Hadoop 1.x, 2.x
mkdir /usr/java && cd /usr/java/
tar -zxvf /tmp/server-jre-7u80-linux-x64.tar.gz
chown -R root:root /usr/java/jdk1.7.0_80/
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_80'>>/etc/profile
source /etc/profile
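A quick check confirms the JDK is in place (PATH is not extended until the Maven step below, so the full path to java is used here):
# verify JAVA_HOME and the JDK version
echo $JAVA_HOME
$JAVA_HOME/bin/java -version   # should report java version "1.7.0_80"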
cd /usr/local/
unzip /tmp/apache-maven-3.3.9-bin.zip
chown root: /usr/local/apache-maven-3.3.9 -R
echo 'export MAVEN_HOME=/usr/local/apache-maven-3.3.9'>>/etc/profile
echo 'export MAVEN_OPTS="-Xms256m -Xmx512m"'>>/etc/profile
echo 'export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH'>>/etc/profile
source /etc/profile
# For JDK and Maven installation details, see the companion article on deploying, compiling and packaging Hadoop (大数据之----部署安装编译打包hadoop终极篇)
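After sourcing /etc/profile, a quick check confirms Maven runs on the JDK installed above:
# mvn -v prints the Maven version and the JVM it is using
mvn -v   # expect Apache Maven 3.3.9 running on Java 1.7.0_80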
yum -y install mysql-server mysql
/etc/init.d/mysqld start
chkconfig mysqld on
mysqladmin -u root password 123456
mysql -uroot -p123456
use mysql;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'127.0.0.1' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '123456' WITH GRANT OPTION;
update user set password=password('123456') where user='root';
delete from user where not (user='root') ;
delete from user where user='root' and password='';
drop database test;
DROP USER ''@'%';
flush privileges;
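Back in the shell after quitting the mysql client, an optional sanity check confirms the credentials work and pre-creates the metastore database (the name vincent_hive matches the ConnectionURL in hive-site.xml below; since that URL sets createDatabaseIfNotExist=true, pre-creation is not strictly required):
# verify the root account and pre-create the Hive metastore database
mysql -uroot -p123456 -e "CREATE DATABASE IF NOT EXISTS vincent_hive; SHOW DATABASES;"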
# http://archive.cloudera.com/cdh5/cdh/5/
# Pick the Hive source package that matches your CDH version:
# hive-1.1.0-cdh5.7.1-src.tar.gz
# After extracting, build the binary distribution with Maven
cd /tmp/
tar -xf hive-1.1.0-cdh5.7.1-src.tar.gz
cd /tmp/hive-1.1.0-cdh5.7.1
mvn clean package -DskipTests -Phadoop-2 -Pdist
# The build output is located at:
# packaging/target/apache-hive-1.1.0-cdh5.7.1-bin.tar.gz
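The build pulls a large number of dependencies, so the host needs network access to the Maven repositories; once it finishes, a quick check confirms the tarball was produced:
# verify the packaged binary tarball exists
ls -lh packaging/target/*-bin.tar.gz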
cd /usr/local/
tar -xf /tmp/apache-hive-1.1.0-cdh5.7.1-bin.tar.gz
ln -s apache-hive-1.1.0-cdh5.7.1-bin hive
chown -R hadoop:hadoop apache-hive-1.1.0-cdh5.7.1-bin
chown -R hadoop:hadoop hive
echo 'export HIVE_HOME=/usr/local/hive'>>/etc/profile
echo 'export PATH=$HIVE_HOME/bin:$PATH'>>/etc/profile
su - hadoop
cd /usr/local/hive
cd conf
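Before editing the configuration, it is worth a quick check that the login shell started by su - picked up the new environment (assuming the exports above were written to /etc/profile):
# confirm HIVE_HOME and the hive launcher are visible to the hadoop user
echo $HIVE_HOME
which hive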
1. hive-env.sh
cp hive-env.sh.template hive-env.sh && vi hive-env.sh
HADOOP_HOME=/usr/local/hadoop
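Instead of editing interactively with vi, the same setting can simply be appended to the copied template (this assumes Hadoop is installed under /usr/local/hadoop, as in the earlier guide):
# point Hive at the local Hadoop installation
echo 'HADOOP_HOME=/usr/local/hadoop' >> hive-env.sh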
2. hive-site.xml
vi hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/vincent_hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
</configuration>
# The hive-site.xml above uses the MySQL JDBC driver for Java,
# so the driver jar must be placed under Hive's lib directory.
# Extract mysql-connector-java-5.1.45.zip and copy the jar as follows:
cd /tmp
unzip mysql-connector-java-5.1.45.zip
cd mysql-connector-java-5.1.45
cp mysql-connector-java-5.1.45-bin.jar /usr/local/hive/lib/
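Optionally, the metastore tables can be created up front with schematool instead of being created lazily on first use (this relies on the JDBC settings from hive-site.xml above):
# initialize the Hive metastore schema in MySQL
$HIVE_HOME/bin/schematool -dbType mysql -initSchema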
If the jar is not copied, Hive fails with an error like:
The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH.
Please check your CLASSPATH specification,
and the name of the driver.
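With the driver jar in place, a quick smoke test (run as the hadoop user, with HDFS already running on hadoop01) confirms Hive can reach the MySQL metastore:
# list databases through the metastore; "default" should appear
hive -e "show databases;"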