Last Modified Date: 24 Aug 2022
Issue
After connecting to a Hadoop Hive data source from Tableau Desktop, when you try to drag a field into the view or filter data, the following error might occur:
Error from Hive: error code: ‘1’ error message: Error while processing statement: Failed: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
Environment
- Tableau Desktop
- Hortonworks Hadoop Hive
- MapR Hadoop
- Cloudera Hadoop Hive
- Cloudera Impala
Resolution
Work with your database administrator to troubleshoot the following database configurations:
- Verify that the Hadoop Hive configuration does not contain added properties that point to resources which no longer exist (for example, a stale hive.aux.jars.path entry).
- Verify that the configured properties allow SELECT statements on defined fields from tables, for example, SELECT <field_name> FROM <table_name> (see the example after this list).
- Restart the Hadoop Hive cluster to ensure that the database is not referring to cached configurations or JAR files that no longer exist.
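For example, both checks can be run from a Hive session (a sketch; hive.aux.jars.path is taken from the Cloudera thread linked under Additional Information, and the placeholders follow the same convention as above):
-- SET <property>; with no value prints the current setting; a path that points to
-- JAR files that have been removed is a common source of this error
SET hive.aux.jars.path;
-- Confirm that a SELECT on specific fields works, not only SELECT *
SELECT <field_name> FROM <table_name> LIMIT 10;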
Cause
This error is passed to Tableau Desktop from the database. It indicates that there is a problem with the Hadoop Hive database configuration.
Additional Information
Cloudera: Hive ODBC query fails, works in Hive shell though — hive.aux.jars.path?
Solution 1
I see you are using Hiveserver2. For such aggregations, depending upon configuration, I believe you may need to set the number of reducers prior to execution. With Hive you can use this syntax:
SET mapred.reduce.tasks=1
However, on Hive2 I have noticed that I need to use:
SET mapreduce.job.reduces=1
I hope this helps! Previously I had the same error message, and changing this resolved the problem for me.
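For example, a minimal session might look like this (just a sketch; test_tb is the sample table name from the comments below, and which property applies depends on your Hive/Hadoop version):
-- MapReduce v2 / YARN-era property name
SET mapreduce.job.reduces=1;
-- Older MapReduce v1 equivalent
-- SET mapred.reduce.tasks=1;
-- A global aggregation such as COUNT(*) is the kind of query that needs a reduce phase
SELECT COUNT(*) FROM test_tb;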
Solution 2
This could be a permission issue. Try starting beeline as follows:
beeline
!connect jdbc:hive2://hadoop7:10000/default
Enter your user name and password when prompted.
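Once connected, a quick check is to compare a SELECT * against a query that has to launch a job (a sketch; the placeholders follow the same convention as in the Resolution section):
-- Often served directly from storage, so it can succeed even when permissions are limited
SELECT * FROM <table_name> LIMIT 10;
-- Usually launches a MapReduce task; with a permission problem this is typically where the
-- "return code 1 from ...MapRedTask" error appears
SELECT COUNT(*) FROM <table_name>;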
Comments
-
I have started metastore and hiveserver2
#./hive --service metastore
#./hive --service hiveserver2
When I execute the query below:
#./beeline -u jdbc:hive2://192.168.0.10:10000 -e 'select count(*) from test_tb' --hiveconf hive.root.logger=DEBUG,console --verbose=true
It throws the error below:
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask (state=08S01,code=1)
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:275)
at org.apache.hive.beeline.Commands.execute(Commands.java:736)
at org.apache.hive.beeline.Commands.sql(Commands.java:657)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:804)
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:608)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:630)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:368)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:351)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Beeline version 0.13.1 by Apache Hive
The hiveserver2 log is below:
16/06/14 10:57:32 [main]: WARN common.LogUtils: DEPRECATED: Ignoring hive-default.xml found on the CLASSPATH at /data/offline/apache-hive-0.13.1-bin/conf/hive-default.xml
16/06/14 10:57:32 [main]: INFO metastore.HiveMetaStore: Starting hive metastore on port 9083
16/06/14 10:57:32 [main]: INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/06/14 10:57:32 [main]: INFO metastore.ObjectStore: ObjectStore, initialize called
16/06/14 10:57:33 [main]: INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/06/14 10:57:33 [main]: INFO metastore.ObjectStore: Initialized ObjectStore
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Added admin role in metastore
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Added public role in metastore
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Starting DB backed MetaStore Server
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Started the new metaserver on port [9083]...
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Options.minWorkerThreads = 200
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: Options.maxWorkerThreads = 100000
16/06/14 10:57:34 [main]: INFO metastore.HiveMetaStore: TCP keepalive = true
16/06/14 10:57:40 [pool-3-thread-1]: INFO metastore.HiveMetaStore: 1: source:/10.234.177.127 get_table : db=default tbl=test_tb
16/06/14 10:57:40 [pool-3-thread-1]: INFO HiveMetaStore.audit: ugi=qspace ip=/10.234.177.127 cmd=source:/192.168.0.10 get_table : db=default tbl=test_tb
16/06/14 10:57:40 [pool-3-thread-1]: INFO metastore.HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/06/14 10:57:40 [pool-3-thread-1]: INFO metastore.ObjectStore: ObjectStore, initialize called
16/06/14 10:57:40 [pool-3-thread-1]: INFO metastore.ObjectStore: Initialized ObjectStore
-
I tried, but it still throws the same error. Thanks anyway.
-
It was a permission issue for me! Specifically, I needed to use lowercase letters in my user ID instead of the actual capitals in it. Like Remigiusz mentioned below, it apparently works fine for SELECT * queries but fails for everything else.
Recents
Hi,
Some of the Hive queries started to fail with this error message, so I checked the web and the community, and usually the response is that setting hive.auto.convert.join to false can help.
The error appears very quickly, almost immediately after the query is submitted. I know that Hive is trying to create a map-side join, but I don't understand why it fails when it was running OK for X months. The actual tables are very small (<1000 rows), so I don't think the hash table can cause the out-of-memory.
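For reference, the commonly suggested workaround is a session-level setting (a sketch; hive.auto.convert.join is the full name of the property mentioned above, and the second line is an optional alternative):
-- Fall back to a regular reduce-side join instead of a local map-side join
SET hive.auto.convert.join=false;
-- Alternatively, lower the small-table threshold (bytes) that triggers map-join conversion
-- SET hive.mapjoin.smalltable.filesize=25000000;
The HiveServer2 log from one of the failing runs is below: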
2018-08-29 04:23:46,468 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,468 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,468 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: </PERFLOG method=acquireReadWriteLocks start=1535509426468 end=1535509426480 duration=12 from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: Executing command(queryId=hive_20180829042323_49ab08f3-b5e3-45db-9ca5-94ce91d63b11): INSERT OVERWRITE TABLE `prod_work`.`table` SELECT `esub`.`yesterday` AS `yesterday`, `esub`.`product_group` AS `product_group`, `esub`.`priceplan` AS `priceplan`, `esub`.`brand` AS `brand`, `esub`.`brand_desc` AS `brand_desc`, `esub`.`segment` AS `segment`, `esub`.`product_payment_type` AS `product_payment_type`, `esub`.`esub` AS `esub`, `rnch`.`rnch` AS `rnch`, `gradd`.`gradd` AS `gradd` FROM `prod_work`.`tablea` `esub` LEFT JOIN `prod_work`.`tableb` `rnch` ON `esub`.`priceplan` = `rnch`.`priceplan` LEFT JOIN `prod_work`.`tablec` `gradd` ON `esub`.`priceplan` = `gradd`.`priceplan`
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: Query ID = hive_20180829042323_49ab08f3-b5e3-45db-9ca5-94ce91d63b11
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: Total jobs = 1
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: </PERFLOG method=TimeToSubmit start=1535509426468 end=1535509426480 duration=12 from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,480 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,481 INFO org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: Starting task [Stage-6:MAPREDLOCAL] in serial mode
2018-08-29 04:23:46,481 INFO org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask: [HiveServer2-Background-Pool: Thread-138349]: Generating plan file file:/tmp/hive/dbb7f6a5-a988-466b-8170-58bbcc924fb9/hive_2018-08-29_04-23-46_256_3553242332203703514-2517/-local-10004/plan.xml
2018-08-29 04:23:46,481 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=serializePlan from=org.apache.hadoop.hive.ql.exec.Utilities>
2018-08-29 04:23:46,481 INFO org.apache.hadoop.hive.ql.exec.Utilities: [HiveServer2-Background-Pool: Thread-138349]: Serializing MapredLocalWork via kryo
2018-08-29 04:23:46,483 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: </PERFLOG method=serializePlan start=1535509426481 end=1535509426483 duration=2 from=org.apache.hadoop.hive.ql.exec.Utilities>
2018-08-29 04:23:46,532 WARN org.apache.hadoop.hive.conf.HiveConf: [HiveServer2-Background-Pool: Thread-138349]: HiveConf of name hive.server2.idle.session.timeout_check_operation does not exist
2018-08-29 04:23:46,532 WARN org.apache.hadoop.hive.conf.HiveConf: [HiveServer2-Background-Pool: Thread-138349]: HiveConf of name hive.sentry.conf.url does not exist
2018-08-29 04:23:46,532 WARN org.apache.hadoop.hive.conf.HiveConf: [HiveServer2-Background-Pool: Thread-138349]: HiveConf of name hive.entity.capture.input.URI does not exist
2018-08-29 04:23:46,543 INFO org.apache.hadoop.hdfs.DFSClient: [HiveServer2-Background-Pool: Thread-138349]: Created token for hive: HDFS_DELEGATION_TOKEN owner=hive/ip-10-197-23-43.eu-west-1.compute.internal@DOMAIN.LOCAL, renewer=hive, realUser=, issueDate=1535509426541, maxDate=1536114226541, sequenceNumber=1381844, masterKeyId=457 on ha-hdfs:hanameservice
2018-08-29 04:23:46,544 INFO org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask: [HiveServer2-Background-Pool: Thread-138349]: Executing: /opt/cloudera/parcels/CDH-5.13.3-1.cdh5.13.3.p0.2/lib/hadoop/bin/hadoop jar /opt/cloudera/parcels/CDH-5.13.3-1.cdh5.13.3.p0.2/jars/hive-common-1.1.0-cdh5.13.3.jar org.apache.hadoop.hive.ql.exec.mr.ExecDriver -libjars file:///opt/cloudera/parcels/CDH-5.13.3-1.cdh5.13.3.p0.2/lib/hive/auxlib/hive-exec-1.1.0-cdh5.13.3-core.jar,file:///opt/cloudera/parcels/CDH-5.13.3-1.cdh5.13.3.p0.2/lib/hive/auxlib/hive-exec-core.jar -localtask -plan file:/tmp/hive/dbb7f6a5-a988-466b-8170-58bbcc924fb9/hive_2018-08-29_04-23-46_256_3553242332203703514-2517/-local-10004/plan.xml -jobconffile file:/tmp/hive/dbb7f6a5-a988-466b-8170-58bbcc924fb9/hive_2018-08-29_04-23-46_256_3553242332203703514-2517/-local-10005/jobconf.xml
2018-08-29 04:23:46,583 ERROR org.apache.hadoop.hive.ql.exec.Task: [HiveServer2-Background-Pool: Thread-138349]: Execution failed with exit status: 1
2018-08-29 04:23:46,583 ERROR org.apache.hadoop.hive.ql.exec.Task: [HiveServer2-Background-Pool: Thread-138349]: Obtaining error information
2018-08-29 04:23:46,583 ERROR org.apache.hadoop.hive.ql.exec.Task: [HiveServer2-Background-Pool: Thread-138349]: Task failed!
Task ID: Stage-6
Logs:
2018-08-29 04:23:46,583 ERROR org.apache.hadoop.hive.ql.exec.Task: [HiveServer2-Background-Pool: Thread-138349]: /var/log/hive/hadoop-cmf-CD-HIVE-nRaFPvFN-HIVESERVER2-ip-10-197-23-43.eu-west-1.compute.internal.log.out
2018-08-29 04:23:46,583 ERROR org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask: [HiveServer2-Background-Pool: Thread-138349]: Execution failed with exit status: 1
2018-08-29 04:23:46,584 ERROR org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask
2018-08-29 04:23:46,584 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: </PERFLOG method=Driver.execute start=1535509426480 end=1535509426584 duration=104 from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,584 INFO org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-138349]: Completed executing command(queryId=hive_20180829042323_49ab08f3-b5e3-45db-9ca5-94ce91d63b11); Time taken: 0.104 seconds
2018-08-29 04:23:46,584 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,594 INFO org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-138349]: </PERFLOG method=releaseLocks start=1535509426584 end=1535509426594 duration=10 from=org.apache.hadoop.hive.ql.Driver>
2018-08-29 04:23:46,595 ERROR org.apache.hive.service.cli.operation.Operation: [HiveServer2-Background-Pool: Thread-138349]: Error running hive query: org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:400)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:238)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Is it perhaps related to the HiveServer2 memory configuration? Do I understand it correctly that this failure happens inside the HiveServer2 JVM, when it tries to build a hash table for a map-side join?
Thanks
Hi,
Currently I am working with Hive, and while using a command to check all the records inside the table, it's showing me an error.
Command:
Error:
I don't understand why the error is happening and what the solution could be.
asked May 15, 2019 in Big Data Hadoop by diana • 3,929 views
1 answer to this question.
Hey,
You are getting the error because the query you used should show the number of rows in your table, but there is no data inside your table.
You can use a query to list the elements; just use:
select * from table_name;
I hope it works.
answered May 15, 2019 • 65,910 points