Connecting to Hive from Java (and with Kerberos)

Posted by: admin · 2023-03-22 12:51:09

This article covers connecting to Hive from Java, along with the related topic of connecting to Hive from Java with Kerberos. We hope it is useful to you; don't forget to bookmark this site.

Contents of this article:

1. Hive | Errors when starting and operating Hive
2. How to execute Hive commands or HiveQL in Java
3. How to connect to hive and hive2 with Kettle
4. How to query the table names under every Hive database from Java
5. Hive JDBC connection fails with org.apache.thrift.transport.TTransportException: Invalid status -128

Hive | Errors when starting and operating Hive

Specific steps:

Step 1: Before editing hive-site.xml, create a tmp directory.

Change into the Hive installation directory; in this article that is /opt/modules/hive/apache-hive-1.2.2-bin.

Run mkdir tmp to create a directory for storing temporary files.

Step 2: Change into the /apache-hive-1.2.2-bin/conf directory and edit hive-site.xml.

Run vim hive-site.xml to open the file for editing.

Search for system:java.io.tmpdir by typing /system:java.io.tmpdir.

This matches several places containing system:java.io.tmpdir (press lowercase n to jump to the next match; press uppercase N to jump to the previous one).

Press i to enter insert mode.

Replace every occurrence of system:java.io.tmpdir with /opt/modules/hive/apache-hive-1.2.2-bin/tmp (this is the temporary-file path I created; substitute the directory you created). A snippet showing the edited result appears after the second procedure below.

Press Esc to leave insert mode, then type :wq to save and quit.

Specific steps:

Change into the /apache-hive-1.2.2-bin/conf directory and edit hive-site.xml.

Run vim hive-site.xml to open the file for editing.

Search for hive.exec.local.scratchdir by typing /hive.exec.local.scratchdir.

Press i to enter insert mode.

Replace ${system:user.name} with ${user.name}, as shown in the snippet below.

Press Esc to leave insert mode, then type :wq to save and quit.
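For reference, this is roughly what the hive.exec.local.scratchdir property looks like before and after the two edits above. The exact set of properties containing these variables differs between hive-site.xml templates, so take this as an illustrative sketch of the pattern rather than a verbatim copy of your file:

Before:

<property>
    <name>hive.exec.local.scratchdir</name>
    <value>${system:java.io.tmpdir}/${system:user.name}</value>
</property>

After:

<property>
    <name>hive.exec.local.scratchdir</name>
    <value>/opt/modules/hive/apache-hive-1.2.2-bin/tmp/${user.name}</value>
</property>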

Step 1: Check the current state of HDFS and YARN.

Run jps.

The jps output here showed that the DataNode process had dropped, so restart HDFS and YARN.

Step 2: Stop HDFS and YARN first.

Run stop-all.sh

or run stop-dfs.sh and stop-yarn.sh separately.

Step 3: Start HDFS and YARN again.

Run start-all.sh

or run start-dfs.sh and start-yarn.sh separately.


How to execute Hive commands or HiveQL in Java

One straightforward approach is to shell out to the hive CLI with ProcessBuilder, pass the statements via hive -e, and collect the standard output:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Build the command line: hive -e "<sql>" runs the statements in a fresh CLI session.
String sql = "show tables; select * from test_tb limit 10";
List<String> command = new ArrayList<String>();
command.add("hive");
command.add("-e");
command.add(sql);

List<String> results = new ArrayList<String>();
ProcessBuilder hiveProcessBuilder = new ProcessBuilder(command);
Process hiveProcess = hiveProcessBuilder.start();

// Read the CLI's standard output line by line and collect the result rows.
BufferedReader br = new BufferedReader(new InputStreamReader(
        hiveProcess.getInputStream()));
String data = null;
while ((data = br.readLine()) != null) {
    results.add(data);
}
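One caveat worth adding (a general ProcessBuilder concern, not something from the original post): each call starts a full hive CLI session, and if the process writes heavily to standard error, the pipe buffer can fill up and block it. Calling hiveProcessBuilder.redirectErrorStream(true) before start(), or draining hiveProcess.getErrorStream() on a separate thread, avoids this; calling hiveProcess.waitFor() afterwards lets you check the exit code.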

How to connect to hive and hive2 with Kettle

How to connect to hive (HiveServer1):

Log in to the server where Hive runs and run: hive --service hiveserver (this starts the Thrift service).

Open Kettle's connection dialog and enter the IP of the Hive server, the Hive database you need, and the port (the Thrift default is 10000).

Test the connection and you are done.
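For what it's worth, this Kettle connection corresponds to a HiveServer1-style JDBC URL of the form jdbc:hive://<host>:10000/<database> using the driver class org.apache.hadoop.hive.jdbc.HiveDriver; the hive2 setup below instead uses jdbc:hive2:// with org.apache.hive.jdbc.HiveDriver.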

How to connect to hive2 (HiveServer2) — you may first run into the following error:


Error connecting to database [Hive] : org.pentaho.di.core.exception.KettleDatabaseException:

Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)

Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

org.pentaho.di.core.exception.KettleDatabaseException:

Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)

Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

at org.pentaho.di.core.database.Database.normalConnect(Database.java:428)

at org.pentaho.di.core.database.Database.connect(Database.java:361)

at org.pentaho.di.core.database.Database.connect(Database.java:314)

at org.pentaho.di.core.database.Database.connect(Database.java:302)

at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)

at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2685)

at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.test(DatabaseDialog.java:109)

at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2.test(CreateDatabaseWizardPage2.java:157)

at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2$3.widgetSelected(CreateDatabaseWizardPage2.java:147)

at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)

at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)

at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)

at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)

at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)

at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)

at org.eclipse.jface.window.Window.open(Window.java:796)

at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizard.createAndRunDatabaseWizard(CreateDatabaseWizard.java:111)

at org.pentaho.di.ui.spoon.Spoon.createDatabaseWizard(Spoon.java:7457)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

at java.lang.reflect.Method.invoke(Unknown Source)

at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)

at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)

at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)

at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)

at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)

at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)

at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)

at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)

at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)

at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)

at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)

at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)

at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)

at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1297)

at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7801)

at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9130)

at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:638)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

at java.lang.reflect.Method.invoke(Unknown Source)

at org.pentaho.commons.launcher.Launcher.main(Launcher.java:151)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)

Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:573)

at org.pentaho.di.core.database.Database.normalConnect(Database.java:410)

... 43 more

Caused by: java.sql.SQLException: Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:107)

at org.apache.hive.jdbc.HiveDriver.callWithActiveDriver(HiveDriver.java:121)

at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:132)

at java.sql.DriverManager.getConnection(Unknown Source)

at java.sql.DriverManager.getConnection(Unknown Source)

at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:555)

... 44 more

Caused by: java.lang.reflect.InvocationTargetException

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

at java.lang.reflect.Method.invoke(Unknown Source)

at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:105)

... 49 more

Caused by: java.lang.RuntimeException: Unable to load JDBC driver of type: hive2

at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:108)

... 54 more

Caused by: java.lang.Exception: JDBC driver of type 'hive2' not supported

at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:104)

... 54 more

The fix for the error above is as follows:

1. Locate the file %KETTLE_HOME%/plugins/pentaho-big-data-plugin/plugin.properties.

2. In plugin.properties, set the value: active.hadoop.configuration=hdp13

3. Restart Kettle after making the change.

4. Once this is configured, you can connect to the corresponding database.

If you want to use hadoop-20 instead, you need to add the following jars (a minimal JDBC sketch follows the list):

hadoop-core-1.2.1.jar

hive-common-0.13.0.jar

hive-jdbc-0.13.0.jar

hive-service-0.13.0.jar

libthrift-0.9.1.jar

slf4j-api-1.7.5.jar

httpclient-4.2.5.jar

httpcore-4.2.5.jar
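With those jars on the classpath, a minimal standalone sketch of connecting to HiveServer2 over JDBC looks like this; the host, port, database, and credentials are placeholders, not values from the original post:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Hive2JdbcDemo {
    public static void main(String[] args) throws Exception {
        // HiveServer2 uses the org.apache.hive.jdbc.HiveDriver class
        // and the jdbc:hive2:// URL scheme (HiveServer1 used jdbc:hive://).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://192.168.xx.xx:10000/default", "user", "password");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("show tables");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}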

How to query the table names under every Hive database from Java

The answer below shows the basic Hive JDBC pattern (here it queries rows from a single table; a sketch that actually enumerates every database and its tables follows the code):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

try {
    // HiveServer1 driver and URL scheme; for HiveServer2 use
    // org.apache.hive.jdbc.HiveDriver and jdbc:hive2:// instead.
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    String selectSql = "select * from db.data where address = '11111111'";
    Connection connect = DriverManager.getConnection(
            "jdbc:hive://192.168.xx.xx:10000/db", "xxx", "xxx");
    PreparedStatement state = connect.prepareStatement(selectSql);
    ResultSet resultSet = state.executeQuery();
    while (resultSet != null && resultSet.next()) {
        System.out.println(resultSet.getString(1) + " " + resultSet.getString(2));
    }
} catch (Exception e) {
    e.printStackTrace();
}
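To answer the question in the heading directly, you can run show databases and then show tables in each database. This is a minimal sketch using the same connection settings as above (the host, credentials, and driver are the placeholders from the original snippet):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
Connection connect = DriverManager.getConnection(
        "jdbc:hive://192.168.xx.xx:10000/default", "xxx", "xxx");
Statement stmt = connect.createStatement();

// Collect every database name first.
List<String> databases = new ArrayList<String>();
ResultSet dbs = stmt.executeQuery("show databases");
while (dbs.next()) {
    databases.add(dbs.getString(1));
}
dbs.close();

// Then list the tables in each database.
for (String db : databases) {
    ResultSet tables = stmt.executeQuery("show tables in " + db);
    while (tables.next()) {
        System.out.println(db + "." + tables.getString(1));
    }
    tables.close();
}
stmt.close();
connect.close();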

Hive JDBC connection fails with org.apache.thrift.transport.TTransportException: Invalid status -128

(The Invalid status -128 error itself usually indicates a client/server mismatch, for example a client speaking the wrong Thrift transport or authentication mode to HiveServer2, so it is worth checking that the driver, URL scheme, and server settings agree before anything else.) As for the connection question: plain JDBC and a connection pool are both sufficient for this scenario, but since you are already managing things with Spring, a connection pool is recommended. Note that Spring does not implement a pool itself; it generally wraps a third-party pool such as C3P0, DBCP, or the recently popular BoneCP, and their configurations are all fairly similar. Taking BoneCP as an example:

<bean id="dataSource" class="com.jolbox.bonecp.BoneCPDataSource"
      destroy-method="close">
    <property name="driverClass" value="${jdbc.driverClass}" />
    <property name="jdbcUrl" value="${jdbc.url}" />
    <property name="username" value="${jdbc.user}" />
    <property name="password" value="${jdbc.password}" />
    <property name="idleConnectionTestPeriod" value="60" />
    <property name="idleMaxAge" value="240" />
    <property name="maxConnectionsPerPartition" value="30" />
    <property name="minConnectionsPerPartition" value="10" />
    <property name="partitionCount" value="2" />
    <property name="acquireIncrement" value="5" />
    <property name="statementsCacheSize" value="100" />
    <property name="releaseHelperThreads" value="3" />
</bean>

<bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="dataSource" />
</bean>
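Finally, since the title also promises Kerberos: when HiveServer2 is secured with Kerberos, the usual pattern is to log in from a keytab via Hadoop's UserGroupInformation and put the server's service principal in the JDBC URL. This is a minimal sketch; the principal, keytab path, host, and realm are placeholders you must replace with your own values:

import java.sql.Connection;
import java.sql.DriverManager;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HiveKerberosDemo {
    public static void main(String[] args) throws Exception {
        // Tell the Hadoop client libraries to authenticate via Kerberos.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in from a keytab; the principal and keytab path are placeholders.
        UserGroupInformation.loginUserFromKeytab(
                "user@EXAMPLE.COM", "/path/to/user.keytab");

        // The URL carries the HiveServer2 service principal.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver2-host:10000/default;principal=hive/hiveserver2-host@EXAMPLE.COM");
        System.out.println("connected: " + !conn.isClosed());
        conn.close();
    }
}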

That wraps up our introduction to connecting to Hive from Java and connecting to Hive from Java with Kerberos. We hope you found the information you needed; if you want to learn more about this topic, remember to bookmark and follow this site.