Installing and Configuring MySQL, Hive, and Sqoop


    Installing and Configuring MySQL

    1. Check which MariaDB packages are installed on the system

    rpm -qa | grep mariadb

    2. Remove the MariaDB packages found in the previous step; here mariadb-libs-5.5.68-1.el7.x86_64 is the package name returned above

    rpm -e --nodeps mariadb-libs-5.5.68-1.el7.x86_64
    
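The two steps above can also be combined into a single pipeline. Below is a dry run of the filter stage only, with sample package names standing in for the output of `rpm -qa` (the real removal command is shown in the comment and should only be run on the actual host):

```shell
# On the real host, the combined command would be:
#   rpm -qa | grep mariadb | xargs -r rpm -e --nodeps
# (-r makes xargs do nothing when grep finds no matching package.)
# Dry run of the filter stage with sample package names:
printf '%s\n' 'mariadb-libs-5.5.68-1.el7.x86_64' 'zlib-1.2.7-21.el7_9.x86_64' \
  | grep mariadb
# → mariadb-libs-5.5.68-1.el7.x86_64
```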

    3. Download and install MySQL 5.7 with the following commands

    wget http://dev.mysql.com/get/mysql57-community-release-el7-10.noarch.rpm
    yum -y install mysql57-community-release-el7-10.noarch.rpm
    yum install mysql-community-server

    If you see an error saying the public key for mysql-community-client-5.7.40-1.el7.x86_64.rpm has not been installed, import the MySQL GPG key:

    rpm --import https://repo.mysql.com/RPM-GPG-KEY-mysql-2022
    

    then run the install again:

    yum install mysql-community-server
    

    4. Start the MySQL service

    systemctl start mysqld.service
    

    5. Check the MySQL service status

    systemctl status mysqld.service
    

    6. Retrieve the generated initial password: look for the log line containing root@localhost:; the string after the colon is the password

    grep "password" /var/log/mysqld.log
    
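To pull out only the password string, the log line can be piped through awk. A sketch using a sample log line (on a real host, read /var/log/mysqld.log instead, as shown in the comment):

```shell
# The temporary password is the last whitespace-separated field of the log line.
# On the server you would run:
#   awk '/temporary password/{print $NF}' /var/log/mysqld.log
echo 'A temporary password is generated for root@localhost: W#gfyW.y7,v' \
  | awk '/temporary password/{print $NF}'
# → W#gfyW.y7,v
```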


    7. Log in to MySQL

    mysql -u root -p'W#gfyW.y7,v'    # replace with the initial password from your own log
    

    8. Change the password (lower the password-validation policy first, so that a short password like 123456 is accepted)

    mysql> set global validate_password_policy=0;
    mysql> set global validate_password_length=4;
    mysql> set password=password("123456");
    

    9. Allow remote login to MySQL

    mysql> grant all privileges on *.* to 'root'@'%' identified by '123456';
    mysql> flush privileges;
    

    10. Exit

    mysql> exit
    

    11. Log back in to MySQL with the new password

    mysql -u root -p123456
    

    Installing and Configuring Hive

    1. Go to the directory containing the Hive archive and extract it

    cd /opt/packages
    tar -zxvf apache-hive-1.2.2-bin.tar.gz -C /opt/programs/
    

    2. Log in to MySQL and create the hive database

    mysql -u root -p123456
    
    mysql> create database hive character set latin1;
    mysql> exit

    3. Use Xftp to upload the local mysql-connector-java-5.1.48.jar into Hive's lib directory

    cd /opt/programs/apache-hive-1.2.2-bin/lib
    

    4. Go to Hive's conf directory and create hive-site.xml with the following contents

    cd /opt/programs/apache-hive-1.2.2-bin/conf
    

    hive-site.xml

    "1.0" encoding="UTF-8" standalone="no"?>
    type="text/xsl" href="configuration.xsl"?>
    
    
    	javax.jdo.option.ConnectionURL</name>
    	jdbc:mysql://hadoop0:3306/hive?useSSL=false</value>
    </property>
    
    	javax.jdo.option.ConnectionDriverName</name>
    	com.mysql.jdbc.Driver</value>
    </property>
    
    	javax.jdo.option.ConnectionUserName</name>
    	root</value>
    </property>
    
    	javax.jdo.option.ConnectionPassword</name>
    	123456</value>
    </property>
    
    	hive.metastore.schema.verification</name>
    	false</value>
    </property>
    </configuration>
    
    • 1
    • 2
    • 3
    • 4
    • 5
    • 6
    • 7
    • 8
    • 9
    • 10
    • 11
    • 12
    • 13
    • 14
    • 15
    • 16
    • 17
    • 18
    • 19
    • 20
    • 21
    • 22
    • 23
    • 24

    5. Configure environment variables

    vim /etc/profile
    
    export HIVE_HOME=/opt/programs/apache-hive-1.2.2-bin
    export PATH=$PATH:$HIVE_HOME/bin
    export HIVE_CONF_DIR=$HIVE_HOME/conf
    
    source /etc/profile

    6. Initialize the metastore database

    cd /opt/programs/apache-hive-1.2.2-bin/bin
    schematool -initSchema -dbType mysql -verbose

    7. Run the following command to verify the installation

    hive
    

    Installing and Configuring Sqoop

    1. Go to the directory containing the Sqoop archive and extract it

    cd /opt/packages
    tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /opt/programs/  
    

    2. Use Xftp to upload the local mysql-connector-java-5.1.48.jar into Sqoop's lib directory

    cd /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib
    

    3. In Sqoop's conf directory, copy sqoop-env-template.sh and rename the copy to sqoop-env.sh

    cd /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/conf
    cp sqoop-env-template.sh sqoop-env.sh
    

    4. Edit sqoop-env.sh

    vim sqoop-env.sh
    

    Append the following at the end of the file:

    export HADOOP_COMMON_HOME=/opt/programs/hadoop-2.7.2
    export HADOOP_MAPRED_HOME=/opt/programs/hadoop-2.7.2
    export HIVE_HOME=/opt/programs/apache-hive-1.2.2-bin
    

    5. Configure environment variables

    vim /etc/profile
    
    export SQOOP_HOME=/opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0
    export PATH=$PATH:${SQOOP_HOME}/bin
    export CLASSPATH=$CLASSPATH:${SQOOP_HOME}/lib
    
    source /etc/profile

    6. Run the following command to verify the installation

    sqoop version
    

    Output like the following indicates success:

    22/11/13 13:50:59 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
    Sqoop 1.4.7
    git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8
    Compiled by maugli on Thu Dec 21 15:59:58 STD 2017
    

    Importing Data from MySQL into Hive

    1. Log in to MySQL and run the following statements

    mysql -u root -p123456
    
    create database test;
    
    use test;
    
    create table user(user_id int,user_name varchar(64));
    
    insert into user values (1,'Justin');
    insert into user values (2,'Mars');
    insert into user values (3,'Alano');
    insert into user values (4,'Alex');

    2. Start Hive and create the target table

    hive
    
    create table user_mysql(user_id int, user_name varchar(64)) row format delimited fields terminated by ",";

    3. Import the data from MySQL's user table into Hive's user_mysql table

    sqoop import --connect jdbc:mysql://hadoop0:3306/test --username root --password 123456 --table user --target-dir /user/mysql --fields-terminated-by "," --hive-import --hive-table user_mysql -m 1
    
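After a successful import, the table's rows land as plain text files, one row per line with fields joined by the --fields-terminated-by delimiter. A local sketch of that layout, using the four rows inserted earlier (the /tmp path is illustrative only, not the real HDFS location):

```shell
# Recreate the expected contents of an imported file, then split out the
# user_name column with the same "," delimiter declared in the Hive DDL.
printf '%s\n' '1,Justin' '2,Mars' '3,Alano' '4,Alex' > /tmp/user_mysql_sample.txt
cut -d, -f2 /tmp/user_mysql_sample.txt
```

This is why the Hive table's `fields terminated by ","` must match the Sqoop command's `--fields-terminated-by ","`: Hive splits each line on that character when reading the imported files.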

    Exporting Data from Hive to MySQL

    1. Log in to MySQL and create the destination table

    mysql -u root -p123456
    
    create table user2 like user;

    2. Export the data in Hive's user_mysql table into MySQL's user2 table

    sqoop export --connect jdbc:mysql://hadoop0:3306/test --username root --password 123456 --table user2 --fields-terminated-by ',' --export-dir /user/hive/warehouse/user_mysql
    

    Possible Errors

    1. If you see this error:

    ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
    

    the HIVE_CONF_DIR environment variable is missing.
    Solution:

    vim /etc/profile
    
    export HIVE_CONF_DIR=$HIVE_HOME/conf
    
    source /etc/profile

    2. If you see these errors:

    ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
    ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    

    Solution:
    Copy hive-common-1.2.2.jar and hive-exec-1.2.2.jar from Hive's lib directory into Sqoop's lib directory

    cp /opt/programs/apache-hive-1.2.2-bin/lib/hive-common-1.2.2.jar  /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib/
    cp /opt/programs/apache-hive-1.2.2-bin/lib/hive-exec-1.2.2.jar  /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib/
    

    3. If you see this error:

    ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoop0:9000/user/mysql already exists
    

    Solution:
    The /user/mysql directory already exists on HDFS and must be deleted before re-running the import (alternatively, Sqoop's import tool accepts a --delete-target-dir option that removes the target directory automatically)

    hdfs dfs -rm -r  /user/mysql
    
Original article: https://blog.csdn.net/weixin_45942827/article/details/127927558