Error initialized or created transport for authentication


I am trying to connect to a Hive database from Oracle SQL Developer using the Cloudera Hive JDBC drivers.

I keep getting the following error message:

Status : Failure -Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.

My Hadoop environment has

  • Hive version 1.2.1.2.3
  • Kerberos version 1.10.3-10

I am trying to connect to this Hive database from a Windows 7 64-bit machine which has:

  • SQL Developer version 4.2.0.16.356.1154
  • Cloudera Hive JDBC4 driver 2.5.18.1050
  • MIT Kerberos app version 4.1

Important: the Windows machine I am connecting from is on a different domain than the Hadoop cluster.
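
For context, connecting across domains typically means the client's krb5.ini must be able to locate the cluster's realm. A minimal sketch of such a krb5.ini (the KDC host name is a placeholder):

    [libdefaults]
        default_realm = DEV.MYCOMPANY.COM

    [realms]
    # the KDC host below is a placeholder; use your cluster's actual KDC
        DEV.MYCOMPANY.COM = {
            kdc = kdc.dev.mycompany.com
            admin_server = kdc.dev.mycompany.com
        }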

I have followed the instructions from "Using SQL Developer to access Apache Hive with Kerberos authentication", and the steps I have performed are:

  1. Imported all the jar files from the JDBC driver into SQL Developer.
  2. Updated the Java crypto jars (local_policy.jar and US_export_policy.jar in the sqldeveloper\jdk\jre\lib\security folder) with the ones provided in UnlimitedJCEPolicy.zip.
  3. Created an environment variable KRB5CCNAME whose value is set to C:\sqldeveloper\ktfile.keytab
  4. Installed the MIT Kerberos 4.1 64-bit app
  5. Acquired a valid ticket (via kinit or through the app)
  6. The connection details used are shown below.

Hive connection details were:

Host name: machine.test.group
Port: 10010
Database: default

KrbServiceName: hive
AuthMech: 1
KrbHostFQDN: machine.test.group
KrbRealm: dev.mycompany.com
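
For reference, these settings correspond to a Cloudera Hive JDBC URL along the following lines (a sketch, assuming the driver's standard key names):

    jdbc:hive2://machine.test.group:10010/default;AuthMech=1;KrbRealm=dev.mycompany.com;KrbHostFQDN=machine.test.group;KrbServiceName=hive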

Can someone please advise me on what I can do to fix the issue and connect to Hive using the JDBC drivers?

I am trying to connect to a HiveServer2 instance via JDBC with Kerberos authentication. After numerous attempts to make it work, I can't get it to work with the Cloudera driver.

If someone can help me solve the problem, I would greatly appreciate it.

I have this method:

    private Connection establishConnection() {
        final String driverPropertyClassName = "driver";
        final String urlProperty = "url";
        Properties hiveProperties = config.getMatchingProperties("hive.jdbc");
        String driverClassName = (String) hiveProperties.remove(driverPropertyClassName);
        String url = (String) hiveProperties.remove(urlProperty);

        // Point Hadoop's security layer at Kerberos and load core-site.xml
        Configuration hadoopConfig = new Configuration();
        hadoopConfig.set("hadoop.security.authentication", "Kerberos");
        String p = config.getProperty("hadoop.core.site.path");
        Path path = new Path(p);
        hadoopConfig.addResource(path);
        UserGroupInformation.setConfiguration(hadoopConfig);

        Connection conn = null;
        if (driverClassName != null) {
            try {
                // Log in from the keytab before opening the JDBC connection
                UserGroupInformation.loginUserFromKeytab(config.getProperty("login.user"), config.getProperty("keytab.file"));
                Driver driver = (Driver) Class.forName(driverClassName).newInstance();
                DriverManager.registerDriver(driver);
                conn = DriverManager.getConnection(url, hiveProperties);
            } catch (Throwable e) {
                LOG.error("Failed to establish Hive connection", e);
            }
        }
        return conn;
    }

The URL for the server, which I am reading from the properties, is in the format described in the Cloudera documentation.
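
For illustration, the relevant properties might look like this (a sketch; host, realm, principal, and paths are placeholders, and it assumes getMatchingProperties strips the "hive.jdbc" prefix):

    # driver class name as documented for the Cloudera Hive JDBC 4.1 driver
    hive.jdbc.driver=com.cloudera.hive.jdbc41.HS2Driver
    hive.jdbc.url=jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/hiveserver.example.com@EXAMPLE.COM
    login.user=svc_hive@EXAMPLE.COM
    keytab.file=/etc/security/keytabs/svc_hive.keytab
    hadoop.core.site.path=/etc/hadoop/conf/core-site.xml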

I am getting an exception:

2018-05-05 18:26:49 ERROR HiveReader:147 - Failed to establish Hive connection
java.sql.SQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: Peer indicated failure: Unsupported mechanism type PLAIN.
    at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
    at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
...

I thought it was missing the AuthMech attribute, so I added AuthMech=1 to the URL. Now I am getting:

java.sql.SQLNonTransientConnectionException: [Cloudera][JDBC](10100) Connection Refused: [Cloudera][JDBC](11640) Required Connection Key(s): KrbHostFQDN, KrbServiceName; [Cloudera][JDBC](11480) Optional Connection Key(s): AsyncExecPollInterval, AutomaticColumnRename, CatalogSchemaSwitch, DecimalColumnScale, DefaultStringColumnLength, DelegationToken, DelegationUID, krbAuthType, KrbRealm, PreparedMetaLimitZero, RowsFetchedPerBlock, SocketTimeOut, ssl, StripCatalogName, transportMode, UseCustomTypeCoercionMap, UseNativeQuery, zk
    at com.cloudera.hiveserver2.exceptions.ExceptionConverter.toSQLException(Unknown Source)
    at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.checkResponseMap(Unknown Source)
    ...

But KrbHostFQDN is already specified in the principal property as required in the documentation.

Am I missing something or is this documentation wrong?

DBeaver (https://dbeaver.io/) is a powerful, free, open-source SQL editor tool that can connect to 80+ different databases. The procedures below will enable DBeaver to connect to Cloudera Hive/Impala using Kerberos.

I initially tried to use the Cloudera JDBC connection, but it kept giving a Kerberos error:

[Cloudera][ImpalaJDBCDriver] Error initialized or created transport for authentication: [Cloudera][ImpalaJDBCDriver] Unable to connect to server: GSS initiate failed.

So I tried the ODBC route instead by creating a 64-bit Impala ODBC driver DSN. You will need to download the Cloudera Impala or Hive 64-bit ODBC driver from the Cloudera website and install it. After that, create a 64-bit ODBC DSN and make sure the connection to Hive or Impala succeeds using your Kerberos realm and principal.
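
The key DSN settings are typically along these lines (host, realm, and service name here are placeholders; field labels may vary slightly by driver version):

    Host:          impala-host.example.com
    Port:          21050
    Mechanism:     Kerberos
    Realm:         EXAMPLE.COM
    Host FQDN:     impala-host.example.com
    Service Name:  impala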

Check this link on how to create a ODBC DSN using Kerberos: https://plenium.wordpress.com/2019/08/02/connect-microsoft-power-bi-desktop-to-cloudera-impala-with-kerberos/

Once you have a working ODBC DSN that tests successfully against Hive/Impala, create a new ODBC database connection in DBeaver from the Database menu. It may ask you to download the JDBC-ODBC bridge driver. Enter the name of the ODBC DSN in the Database/Schema field: myodbc64bitdsnname. Leave the User Name and Password fields blank.

Test the connection and it should work. After that you can run any SQL query on Impala or Hive.


Problem

When trying to connect to Impala via SSL, an error is thrown by the driver.

Symptom

[Simba][ImpalaJDBCDriver](500164) Error initialized or created transport for authentication: Socket already connected.

Environment

Cloudera Impala JDBC driver version 2.5.32

Resolving The Problem

Solution 1:

Change the JDBC URL format to:

jdbc:impala://servername.domain.com:21050;AuthMech=3;SSL=1;transportMode=sasl

Solution 2:

— Downgrade the Cloudera Impala JDBC drivers to version 2.5.22: http://www.cloudera.com/downloads/connectors/impala/jdbc/2-5-22.html

JDBC url format:

jdbc:impala://servername.domain.com:21050;AuthMech=3;SSL=1;SSLKeyStore=<keystorepath>;SSLKeyStorePwd=<password>

Notes:

— Make sure no other Hive/Impala driver versions are in the same dispatcher drivers location (..\c10\webapps\p2pd\WEB-INF\lib).
— Cognos only supports jdbc4 type drivers (we do not support jdbc3 or jdbc41).


After installing Hadoop, Hive, MySQL, and the Hive command-line interface by following online tutorials, I prepared to use JDBC to connect to Hive for simple queries, but found that this setup is not enough; further configuration is needed. Below are some of my notes.

HiveServer2

HiveServer2 (HS2) is a server interface that enables remote clients to execute Hive queries and return results. The current Thrift RPC-based implementation is an improved version of HiveServer that supports multi-client concurrency and authentication. After starting the HiveServer2 service, you can connect using JDBC, ODBC, or Thrift: Java code and Beeline connect over JDBC, while Hue connects to the Hive service in Thrift mode.
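
As a minimal sketch of such a JDBC connection (assuming HiveServer2 listens on localhost:10000 with authentication set to NONE, and the Apache Hive JDBC driver is on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            // Apache Hive JDBC driver (hive-jdbc on the classpath)
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // With authentication NONE, any user name and an empty password work
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }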

When connecting to Hive using JDBC, authentication comes into play. You need to configure the authentication mode in hive-site.xml ($HIVE_HOME/conf/hive-site.xml):

It can be set to NONE or CUSTOM: the former requires no verification, while the latter enables user name and password authentication.

    <property>
        <name>hive.server2.authentication</name>
        <value>NONE</value> <!-- or CUSTOM -->
    </property>

Further configuration is required when setting to CUSTOM:

1. You need a custom authentication class that implements the org.apache.hive.service.auth.PasswdAuthenticationProvider interface. Here the custom class's package path is org.apache.hadoop.hive.contrib.auth, and the packaged jar must be placed in $HIVE_HOME/lib.

The Maven dependencies to import:

    <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-service -->
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-service</artifactId>
        <version>2.3.5</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>3.1.2</version>
    </dependency>

Implementation class code:

    package org.apache.hadoop.hive.contrib.auth;

    import javax.security.sasl.AuthenticationException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hive.conf.HiveConf;
    import org.slf4j.Logger;

    /**
     * @Author:
     * @Date: 2019-7-30 9:56
     */
    public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

        private Logger LOG = org.slf4j.LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

        private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

        private Configuration conf = null;

        @Override
        public void Authenticate(String userName, String passwd)
                throws AuthenticationException {
            LOG.info("user: " + userName + " try login.");
            // Look up the expected password under hive.jdbc_passwd.auth.<userName>
            String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
            if (passwdConf == null) {
                String message = "user's ACL configuration is not found. user:" + userName;
                LOG.info(message);
                throw new AuthenticationException(message);
            }
            if (!passwd.equals(passwdConf)) {
                String message = "user name and password is mismatch. user:" + userName;
                throw new AuthenticationException(message);
            }
        }

        public Configuration getConf() {
            if (conf == null) {
                this.conf = new Configuration(new HiveConf());
            }
            return conf;
        }

        public void setConf(Configuration conf) {
            this.conf = conf;
        }
    }

2. Add the configuration in hive-site.xml:

    <!-- Configure the custom authentication implementation class above -->
    <property>
        <name>hive.server2.custom.authentication.class</name>
        <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
    </property>
    <!-- Define the user root1 with password 123456789 -->
    <property>
        <name>hive.jdbc_passwd.auth.root1</name>
        <value>123456789</value>
    </property>

After the configuration is done, restart HiveServer2 and run beeline ($HIVE_HOME/bin) to test the connection, as shown below.

HiveServer2 can be started with $HIVE_HOME/bin/hiveserver2 or $HIVE_HOME/bin/hive --service hiveserver2.
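
A quick beeline test might look like this (a sketch, assuming HiveServer2 listens on localhost:10000 and the root1 user configured above):

    $HIVE_HOME/bin/beeline
    beeline> !connect jdbc:hive2://localhost:10000/default root1 123456789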

(Screenshot of the successful test omitted.)

Here are some of the problems I encountered in my own testing:

The first problem: "User: root is not allowed to impersonate anonymous".

Edit the core-site.xml file ($HADOOP_HOME/etc/hadoop/core-site.xml) to configure the corresponding Hadoop proxy user. The root part of the hadoop.proxyuser.root.hosts property name corresponds to the user name reported in the error:

    <property>
        <name>hadoop.proxyuser.root.hosts</name>
        <value>*</value>
    </property>
    <property>
        <name>hadoop.proxyuser.root.groups</name>
        <value>*</value>
    </property>

Hadoop needs to be restarted for the configuration to take effect. I use Hadoop 2.6.5, so the stop-all and start-all scripts are in the $HADOOP_HOME/sbin directory; execute them and check with jps whether startup succeeded. When I ran the stop-all script, I found that some Hadoop processes listed by jps were not shut down, so I killed them manually; after that, startup proceeded normally.
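
For reference, the restart sequence is roughly:

    $HADOOP_HOME/sbin/stop-all.sh
    jps    # check for leftover Hadoop processes and kill any stragglers
    $HADOOP_HOME/sbin/start-all.sh
    jps    # verify the daemons are running again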

The second problem: ErrorCode 500164: Error initialized or created transport for authentication: Peer indicated failure: Error validating the login

The fix is to add AuthMech=3 to the JDBC URL. The AuthMech values are: 0 = no password required, 1 = Kerberos authentication, 2 = user name authentication, 3 = user name and password authentication.
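
With the root1 user defined above, the connection URL might look like this (host and port are placeholders; UID and PWD are the Cloudera driver's user name and password keys):

    jdbc:hive2://localhost:10000/default;AuthMech=3;UID=root1;PWD=123456789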

These are the problems I encountered with the JDBC connection to Hive and how I solved them, recorded here for convenient reference later.

We wish to integrate our Business Objects environment with a Kerberos-enabled Hive database.

Environment Details:

Version: SAP BIP 4.2 SP03 Patch 05

OS: RHEL

Clustered Environment with 4 processing Tiers

Authentication : SAP & Enterprise.

When we create a connection in IDT, it fails with the below error:

[Simba][HiveJDBCDriver](500164) Error initialized or created transport for authentication: Peer indicated failure: Unsupported mechanism type PLAIN.

When ‘AuthMech=1’ is added to JDBC Driver Properties, the error changes to:

I am not sure if an SPN or other configuration needs to be set up on the BO server; we are not using SSO.

Any help would be appreciated.

Thanks and Regards,

Aakash Gupta

After much trial and error and time spent, we've been able to use the following to get it to work. Hopefully this will help someone else.

Things needed for Splunk DB Connect on Windows to work with a Hive 2 Kerberos connection:

  • Kerberos MIT — https://web.mit.edu/kerberos/dist/ (kfw-4.0.1-amd64)
  • Cloudera JDBC driver — https://www.cloudera.com/downloads/connectors/hive/jdbc/2-6-5.html (HiveJDBC41.jar)
  • OpenJDK 8 — https://jdk.java.net/java-se-ri/8-MR3

Other things to mention:

Oracle’s Java 8 did not work for us. 

Krb5.conf configuration files placed in the %JAVA_HOME%\jre\lib\security directory didn't seem to help DB Connect when verifying the connection.

We tested successfully on Windows servers with Splunk v8.2/DB Connect v3.5.1 and Splunk v7.3/DB Connect 3.3.1.

The db_connection_types.conf file needs to exist or be created in the DB Connect local directory, with a config that looks like this for the Cloudera driver:

[cloudera_hive_2]
displayName = Cloudera Hive 2
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = com.cloudera.hive.jdbc41.HS2Driver
jdbcUrlFormat = jdbc:hive2://<host>:<port>/<database>
port = 20500
ui_default_catalog = $database$

The more up-to-date versions of DB Connect are more verbose with any errors generated.

If I were to start the process again, I would install the newest version of DB Connect compatible with the existing version of Splunk. Download OpenJDK v8, extract it, and copy it into a directory (C:\Program Files\Java\java-se-8u41-ri). Then create a JAVA_HOME system environment variable with that directory as the value, and reboot the server. You may then need to manually update the "JRE Installation Path" field in DB Connect's Configuration -> Settings -> General tab and save, then restart Splunk web via Settings -> Server Controls. Once there are no more errors popping up, download the Cloudera driver and move it into the DB Connect "drivers" directory (splunk_app_db_connect\drivers). Go to the Configuration -> Settings -> Drivers tab and click reload; the driver should now appear on the page with a green check mark next to it, along with the version. Install MIT Kerberos. Finally, create a connection in DB Connect and set up a connection string in the "JDBC URL" field with something like…

jdbc:hive2://dbserverhostname:10000/db_table_name;AuthMech=1;KrbRealm=domain_name_here;KrbHostFQDN=dbserverhostname;KrbServiceName=hive;KrbAuthType=2

Click "Save" and see if the connection is successful. If it is, there should be no errors that pop up.
