I use Spark 2.1.0.
When I run spark-shell, I encounter this error:
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
What could be the reason? How to fix it?
asked Jun 6, 2017 at 9:24
I was facing the same issue. After investigating, I found a compatibility issue between the Spark version and the winutils.exe of hadoop-2.x.x.
Based on my experiments, I suggest using the hadoop-2.7.1 winutils.exe with spark-2.2.0-bin-hadoop2.7 and the hadoop-2.6.0 winutils.exe with spark-1.6.0-bin-hadoop2.6, and setting the environment variables below:
SCALA_HOME : C:\Program Files (x86)\scala\2.11.7
JAVA_HOME : C:\Program Files\Java\jdk1.8.0_51
HADOOP_HOME : C:\Hadoop\winutils-master\hadoop-2.7.1
SPARK_HOME : C:\Hadoop\spark-2.2.0-bin-hadoop2.7
PATH : %JAVA_HOME%\bin;%SCALA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;
Create the C:\tmp\hive directory and grant access permissions using the command below:
C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive
Remove the local Derby-based metastore directory metastore_db from your computer if it exists:
C:\Users\<User_Name>\metastore_db
Use the command below to start the Spark shell:
C:\>spark-shell
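If the setup above is correct, spark-shell should now start and define the spark value. A quick smoke test inside the shell (a minimal sketch, assuming Spark 2.x):

// Run inside spark-shell; `spark` is the SparkSession the shell creates at startup.
spark.version             // e.g. res0: String = 2.2.0
spark.range(5).count()    // trivial job to confirm the session works
import spark.implicits._  // the import from the question should now resolve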
answered Aug 5, 2017 at 0:15
The reason for the error is that the spark instance (the SparkSession) could not be created due to some earlier issue (which may have happened because you are on Windows and have not installed the winutils.exe binary, or because some other session keeps the local Derby-based metastore locked).
The recommendation is to scroll up and review the entire screen of logs, where you will find the root cause.
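For reference, spark-shell effectively runs something like the following at startup; if any step throws, the spark value is never bound and the imports from the question fail. This is a minimal sketch of the Spark 2.x startup sequence, not the shell's exact code:

// Roughly what spark-shell does for you at startup (Spark 2.x):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("spark-shell")
  .master("local[*]")
  .enableHiveSupport()   // fails if Hive classes are missing or the Derby metastore is locked
  .getOrCreate()

import spark.implicits._ // "not found: value spark" means the val above was never created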
answered Jun 6, 2017 at 10:14
Jacek Laskowski
If you are on Cloudera, the solution from this GitHub issue worked for me (https://github.com/cloudera/clusterdock/issues/30):
The root user (who you’re running as when you start spark-shell) has no user directory in HDFS. If you create one (sudo -u hdfs hdfs dfs -mkdir /user/root followed by sudo -u hdfs hdfs dfs -chown root:root /user/root), this should be fixed.
I.e. create an HDFS user home directory for the user running spark-shell.
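Once the directory exists, you can verify from the shell that the home directory resolves and is accessible; a hedged sketch using the Hadoop FileSystem API (assuming a Spark 2.x shell that started successfully):

// Check the current user's HDFS home directory from the Spark shell.
import org.apache.hadoop.fs.FileSystem

val fs   = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val home = fs.getHomeDirectory    // e.g. hdfs://.../user/root
println(s"home=$home exists=${fs.exists(home)}")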
answered Dec 13, 2017 at 23:04
aaa90210
For Ubuntu users:
I had the exact same error, and I fixed it the following way: if you are running spark-shell from a terminal, close and re-open the terminal, then restart spark-shell.
answered Nov 5, 2017 at 12:37
Abdullah Khan
If you are running Cloudera, please check in Cloudera Manager and make sure the Hive services are on. I had the same issue and found that my Hive service was down (Hive Metastore Server, HiveServer, Hosts).
For Spark, you need to make sure HDFS, YARN and Hive are on; the above error appears if Hive is off.
answered Nov 27, 2017 at 18:32
I had the same error. In my case, the hard disk was almost full. I deleted some large files and ran spark-shell again after a reboot. It worked! But this may not always be the cause.
answered Sep 4, 2019 at 19:36
user3503711
Can someone help me? Here is my spark-shell output:
22/03/23 18:37:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/03/23 18:37:39 INFO spark.SecurityManager: Changing view acls to: hadoop
22/03/23 18:37:39 INFO spark.SecurityManager: Changing modify acls to: hadoop
22/03/23 18:37:39 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
22/03/23 18:37:39 INFO spark.HttpServer: Starting HTTP Server
22/03/23 18:37:39 INFO server.Server: jetty-8.y.z-SNAPSHOT
22/03/23 18:37:39 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:37459
22/03/23 18:37:39 INFO util.Utils: Successfully started service 'HTTP class server' on port 37459.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/
Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.8.0_312)
Type in expressions to have them evaluated.
Type :help for more information.
22/03/23 18:37:42 WARN util.Utils: Your hostname, dakshmeet-virtual-machine resolves to a loopback address: 127.0.1.1; using 192.168.22.131 instead (on interface ens33)
22/03/23 18:37:42 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
22/03/23 18:37:42 INFO spark.SparkContext: Running Spark version 1.6.3
22/03/23 18:37:42 INFO spark.SecurityManager: Changing view acls to: hadoop
22/03/23 18:37:42 INFO spark.SecurityManager: Changing modify acls to: hadoop
22/03/23 18:37:42 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
22/03/23 18:37:42 INFO util.Utils: Successfully started service 'sparkDriver' on port 34715.
22/03/23 18:37:43 INFO slf4j.Slf4jLogger: Slf4jLogger started
22/03/23 18:37:43 INFO Remoting: Starting remoting
22/03/23 18:37:43 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.22.131:40555]
22/03/23 18:37:43 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 40555.
22/03/23 18:37:43 INFO spark.SparkEnv: Registering MapOutputTracker
22/03/23 18:37:43 INFO spark.SparkEnv: Registering BlockManagerMaster
22/03/23 18:37:43 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-830328b1-8f3e-4f51-b08b-d237a3c7f8d9
22/03/23 18:37:43 INFO storage.MemoryStore: MemoryStore started with capacity 511.1 MB
22/03/23 18:37:43 INFO spark.SparkEnv: Registering OutputCommitCoordinator
22/03/23 18:37:43 INFO server.Server: jetty-8.y.z-SNAPSHOT
22/03/23 18:37:43 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:461)
at sun.nio.ch.Net.bind(Net.java:453)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:222)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:85)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2040)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2031)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:15)
at $line3.$read$$iwC.<init>(<console>:24)
at $line3.$read.<init>(<console>:26)
at $line3.$read$.<init>(<console>:30)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/03/23 18:37:43 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@48ae9e8b: java.net.BindException: Address already in use
java.net.BindException: Address already in use
[stack trace identical to the one above, omitted]
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
22/03/23 18:37:43 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
22/03/23 18:37:43 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
22/03/23 18:37:43 INFO server.Server: jetty-8.y.z-SNAPSHOT
22/03/23 18:37:43 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4041: java.net.BindException: Address already in use
java.net.BindException: Address already in use
[stack trace identical to the one above, omitted]
22/03/23 18:37:43 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@3d235635: java.net.BindException: Address already in use
java.net.BindException: Address already in use
[stack trace identical to the one above, omitted]
[the same block of ServletContextHandler 'stopped' INFO lines repeats here]
22/03/23 18:37:43 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
22/03/23 18:37:43 INFO server.Server: jetty-8.y.z-SNAPSHOT
22/03/23 18:37:43 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4042
22/03/23 18:37:43 INFO util.Utils: Successfully started service 'SparkUI' on port 4042.
22/03/23 18:37:43 INFO ui.SparkUI: Started SparkUI at http://192.168.22.131:4042
22/03/23 18:37:43 INFO executor.Executor: Starting executor ID driver on host localhost
22/03/23 18:37:43 INFO executor.Executor: Using REPL class URI: http://192.168.22.131:37459
22/03/23 18:37:43 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39369.
22/03/23 18:37:43 INFO netty.NettyBlockTransferService: Server created on 39369
22/03/23 18:37:43 INFO storage.BlockManagerMaster: Trying to register BlockManager
22/03/23 18:37:43 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:39369 with 511.1 MB RAM, BlockManagerId(driver, localhost, 39369)
22/03/23 18:37:43 INFO storage.BlockManagerMaster: Registered BlockManager
22/03/23 18:37:44 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
22/03/23 18:37:45 INFO hive.HiveContext: Initializing execution hive, version 1.2.1
22/03/23 18:37:45 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
22/03/23 18:37:45 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
22/03/23 18:37:45 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
22/03/23 18:37:45 INFO metastore.ObjectStore: ObjectStore, initialize called
22/03/23 18:37:45 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
22/03/23 18:37:45 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
22/03/23 18:37:45 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
22/03/23 18:37:46 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
22/03/23 18:37:47 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
22/03/23 18:37:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:48 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:48 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:48 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
22/03/23 18:37:48 INFO metastore.ObjectStore: Initialized ObjectStore
22/03/23 18:37:48 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
22/03/23 18:37:48 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
22/03/23 18:37:49 INFO metastore.HiveMetaStore: Added admin role in metastore
22/03/23 18:37:49 INFO metastore.HiveMetaStore: Added public role in metastore
22/03/23 18:37:49 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
22/03/23 18:37:49 INFO metastore.HiveMetaStore: 0: get_all_databases
22/03/23 18:37:49 INFO HiveMetaStore.audit: ugi=hadoop ip=unknown-ip-addr cmd=get_all_databases
22/03/23 18:37:49 INFO metastore.HiveMetaStore: 0: get_functions: db=default pat=*
22/03/23 18:37:49 INFO HiveMetaStore.audit: ugi=hadoop ip=unknown-ip-addr cmd=get_functions: db=default pat=*
22/03/23 18:37:49 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:49 INFO session.SessionState: Created local directory: /tmp/0fe45791-a452-4ac6-898d-4e3387dfa7c8_resources
22/03/23 18:37:49 INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/0fe45791-a452-4ac6-898d-4e3387dfa7c8
22/03/23 18:37:49 INFO session.SessionState: Created local directory: /tmp/hadoop/0fe45791-a452-4ac6-898d-4e3387dfa7c8
22/03/23 18:37:49 INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/0fe45791-a452-4ac6-898d-4e3387dfa7c8/_tmp_space.db
22/03/23 18:37:49 INFO hive.HiveContext: default warehouse location is /user/hive/warehouse
22/03/23 18:37:49 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
22/03/23 18:37:49 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
22/03/23 18:37:49 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
22/03/23 18:37:50 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
22/03/23 18:37:50 INFO metastore.ObjectStore: ObjectStore, initialize called
22/03/23 18:37:50 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
22/03/23 18:37:50 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
22/03/23 18:37:50 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
22/03/23 18:37:50 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
22/03/23 18:37:51 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
22/03/23 18:37:52 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:52 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:52 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:52 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:52 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
22/03/23 18:37:52 INFO metastore.ObjectStore: Initialized ObjectStore
22/03/23 18:37:53 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
22/03/23 18:37:53 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
22/03/23 18:37:53 INFO metastore.HiveMetaStore: Added admin role in metastore
22/03/23 18:37:53 INFO metastore.HiveMetaStore: Added public role in metastore
22/03/23 18:37:53 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
22/03/23 18:37:53 INFO metastore.HiveMetaStore: 0: get_all_databases
22/03/23 18:37:53 INFO HiveMetaStore.audit: ugi=hadoop ip=unknown-ip-addr cmd=get_all_databases
22/03/23 18:37:53 INFO metastore.HiveMetaStore: 0: get_functions: db=default pat=*
22/03/23 18:37:53 INFO HiveMetaStore.audit: ugi=hadoop ip=unknown-ip-addr cmd=get_functions: db=default pat=*
22/03/23 18:37:53 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
22/03/23 18:37:53 INFO session.SessionState: Created local directory: /tmp/920b853e-3aa6-4605-8ea5-1e3e67cbf574_resources
22/03/23 18:37:53 INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/920b853e-3aa6-4605-8ea5-1e3e67cbf574
22/03/23 18:37:53 INFO session.SessionState: Created local directory: /tmp/hadoop/920b853e-3aa6-4605-8ea5-1e3e67cbf574
22/03/23 18:37:53 INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/920b853e-3aa6-4605-8ea5-1e3e67cbf574/_tmp_space.db
22/03/23 18:37:53 INFO repl.SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.
scala> spark.conf.set("spark.sql.sources.default","csv")
<console>:26: error: not found: value spark
       spark.conf.set("spark.sql.sources.default","csv")
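Regarding the comment above: that log shows Spark 1.6.3, which predates SparkSession, so spark-shell only defines sc and sqlContext and the spark value is expected to be undefined there (the "Address already in use" warnings are harmless; the UI simply retries until port 4042 is free). On 1.6.x the equivalent call would be, as a sketch against the sqlContext the shell created:

// Spark 1.6.x: there is no `spark` value; use the SQLContext the shell provides.
sqlContext.setConf("spark.sql.sources.default", "csv")

Alternatively, upgrade to Spark 2.x, where the shell creates the spark session automatically.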