Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext


I built a one-master, two-worker cluster in AWS and installed BigDL under Python 3.4.

I run this code:
import pandas
import datetime as dt

from bigdl.nn.layer import *
from bigdl.nn.criterion import *
from bigdl.optim.optimizer import *
from bigdl.util.common import *
from bigdl.dataset.transformer import *

init_engine()
It reports the following error when I run init_engine(), whether I run it from Jupyter or from the command line.

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Found both spark.executor.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:560)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:558)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:558)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:546)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:236)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)

If I run sc = SparkContext.getOrCreate(conf=create_spark_conf()), it reports the same error.
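The exception itself names the conflict: spark.executor.extraClassPath is set while the deprecated SPARK_CLASSPATH environment variable is also present, and Spark refuses to start with both. As a minimal sketch, assuming SPARK_CLASSPATH is inherited from the shell (for example an export in spark-env.sh or .bashrc), it can be dropped from the driver's environment before the context is created; the cleaner long-term fix is to remove the export on the machines where it is set.

import os

# Sketch: make sure the deprecated SPARK_CLASSPATH variable is not inherited
# by the JVM that pyspark launches, so that only spark.executor.extraClassPath
# remains. This must run before the SparkContext (and the Py4J gateway) starts.
os.environ.pop("SPARK_CLASSPATH", None)

from pyspark import SparkContext
from bigdl.util.common import *

sc = SparkContext.getOrCreate(conf=create_spark_conf())
init_engine()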

I was doing some unit tests with PySpark and came upon the following issue:

java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:236)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)
..........................................................
..........................................................
..........................................................
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)

After debugging, I found out that I had set the wrong master address:

conf = SparkConf().setAll([('spark.executor.memory', '3g'),
                           ('spark.executor.cores', '8'),
                           ('spark.cores.max', '24'),
                           ('spark.driver.memory', '9g'),
                           ("spark.app.name", "simpleApplicationTests"),
                           ("spark.master", "spark://10.0.2.12:7077")])

I corrected the address and the error was gone.
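A wrong spark.master address only surfaces deep inside the JavaSparkContext constructor, which makes the traceback unhelpful. As a quick sanity check, the master's host and port can be probed before building the conf; the sketch below uses only the standard library, and the address is the example value from the conf above, not a known-good one.

import socket

def master_reachable(host, port, timeout=3.0):
    # Return True if a plain TCP connection to the Spark master succeeds.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(master_reachable("10.0.2.12", 7077))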

The same problem was also reported as BigDL issue #1425 ("setting environment: Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext"); the report there is identical to the question above.


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.0
    • Fix Version/s: 1.5.0
    • Component/s: YARN
    • Labels: None
    • Target Version/s: 1.5.0

Description

    The fix is already in master, and it's one line out of the patch for SPARK-5754; the bug is that a Windows file path cannot be used directly to create a URI, so File.toURI() needs to be called.
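    The fix described above is in Java (File.toURI()); purely as an illustration, the same pitfall can be reproduced in Python, the language used elsewhere on this page. The jar path below is made up and the snippet is an analogue, not part of the actual patch.

from pathlib import PureWindowsPath
from urllib.parse import urlparse

raw_path = r"C:\tmp\spark-assembly.jar"   # made-up Windows path, for illustration

# Interpreted directly as a URI, the drive letter is misread as a URI scheme
# and the backslashes are kept, so the result is not a usable file: URL.
print(urlparse(raw_path))

# An explicit path-to-URI conversion (the Python counterpart of File.toURI())
# produces a well-formed file: URI instead.
print(PureWindowsPath(raw_path).as_uri())   # file:///C:/tmp/spark-assembly.jar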

Issue Links

    links to: [Github] Pull Request #8493 (vanzin)

People

    Assignee: vanzin (Marcelo Masiero Vanzin)
    Reporter: vanzin (Marcelo Masiero Vanzin)
    Votes: 0
    Watchers: 4

Dates

    Created: 28/Aug/15 01:37
    Updated: 09/Oct/15 08:27
    Resolved: 28/Aug/15 22:57
