Preface:
After finally getting the big data cluster installed, Hive still would not start. Below is how the problem was tracked down and fixed.
Step 1: carefully check the Hive database and user created in MySQL.
create database hive default charset utf8 collate utf8_general_ci;
CREATE USER 'hive'@'%' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;

The following must also be added, otherwise local connections will fail with a permission error:

CREATE USER 'hive'@'localhost' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost';
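The statements above differ only in the host pattern. A minimal sketch (the `hive_user_sql` helper name is mine, not from the article) keeps the '%' and 'localhost' variants consistent:

```shell
# Hypothetical helper (not from the article): emit the CREATE USER /
# GRANT / FLUSH statements for one host pattern, so the '%' and
# 'localhost' variants stay in sync.
hive_user_sql() {
  local host="$1" pass="$2"
  printf "CREATE USER 'hive'@'%s' IDENTIFIED BY '%s';\n" "$host" "$pass"
  printf "GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%s';\n" "$host"
  echo "FLUSH PRIVILEGES;"
}
```

Pipe the output into the mysql client, e.g. `hive_user_sql '%' 'Hive-123' | mysql -u root -p`, then repeat for 'localhost'.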
Even after running the statements above, the startup error still appeared (the full log is shown further down). The key line to notice is 'Failed to load driver'.
So the initial diagnosis was that the MySQL driver (com.mysql.jdbc.Driver) was missing. Following the error message, I located Hive's lib directory:
/usr/hdp/3.0.1.0-187/hive/lib

cd /usr/share/java/
scp -r mysql-connector-java.jar slave1:/usr/hdp/3.0.1.0-187/hive/lib
scp -r mysql-connector-java.jar slave2:/usr/hdp/3.0.1.0-187/hive/lib
cp mysql-connector-java.jar /usr/hdp/3.0.1.0-187/hive/lib
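Before restarting, it is worth confirming the jar actually landed in each lib directory. A minimal sketch (the `check_driver` name is mine; the path assumes the HDP 3.0.1.0-187 layout from the article):

```shell
# Hypothetical check (not from the article): report whether a MySQL
# connector jar is present in a given Hive lib directory.
check_driver() {
  local lib_dir="$1"
  if ls "$lib_dir"/mysql-connector-java*.jar >/dev/null 2>&1; then
    echo "driver present in $lib_dir"
  else
    echo "driver missing in $lib_dir"
  fi
}
```

Run it locally with `check_driver /usr/hdp/3.0.1.0-187/hive/lib`, and over ssh on slave1 and slave2.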
Start Hive again: OK, finally fixed.
For reference, this was the (trimmed) error log from Ambari when starting the Hive Metastore:

stderr: Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_metastore.py", line 201, in <module>
    HiveMetastore().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_metastore.py", line 61, in start
    create_metastore_schema() # execute without config lock
(middle omitted)
2020-12-18 15:27:05,098 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2020-12-18 15:27:05,100 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2020-12-18 15:27:05,100 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json
2020-12-18 15:27:05,100 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json'] {'content': Template('input.config-hive.json.j2'), 'mode': 0644}
2020-12-18 15:27:05,101 - Execute['export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/ ; /usr/hdp/current/hive-server2/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED] -verbose'] {'not_if': "ambari-sudo.sh su hive -l -s /bin/bash -c 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/ ; /usr/hdp/current/hive-server2/bin/schematool -info -dbType mysql -userName hive -passWord [PROTECTED] -verbose'", 'user': 'hive'}
Command failed after 1 tries
Separately, Ambari's database connection test kept failing.
After registering the MySQL driver path with ambari-server setup, re-running the connection test succeeded:
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
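To confirm the setup command took effect, you can look for the recorded driver path in Ambari's properties file. A sketch; the property name server.jdbc.driver.path and the /etc/ambari-server/conf/ambari.properties location are assumptions about this Ambari version, so check your own file if they differ:

```shell
# Assumption (not from the article): ambari-server setup records the
# driver under server.jdbc.driver.path in ambari.properties.
# Print the recorded path, if any.
jdbc_driver_path() {
  local props="${1:-/etc/ambari-server/conf/ambari.properties}"
  grep '^server\.jdbc\.driver\.path=' "$props" | cut -d= -f2
}
```

An empty result would suggest the setup command did not record the driver and should be re-run.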
Summary: when an installation fails repeatedly, first rule out memory and cache problems; if those check out, the cause is most likely a missing or incompatible dependency driver (version compatibility issues are the most common).