sbt.compiler.EvalException: Type error in expression

Expected behaviour: sbt docker:publish or sbt docker:publishLocal generates a Docker image and publishes it to a private local registry. Actual behaviour: sbt.compiler.EvalException: Type error in expression.

@muuki88 thanks for confirming that the build.sbt looks good. I think the problem is with my registry configuration: the image seems to be built properly (I can run it), but it is not published to the registry.

sbt docker:publishLocal
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
[info] Set current project to spark-k8s (in build file:/Users/dzlab/Projects/spark-k8s/)
[info] Wrote /Users/dzlab/Projects/spark-k8s/target/scala-2.12/spark-k8s_2.12-0.1.pom
[warn] [1] There are no exposed ports for your docker image
[warn]  Configure the `dockerExposedPorts` or `dockerExposedUdpPorts` setting. E.g.
[warn] 
[warn]  // standard tcp ports
[warn]  dockerExposedPorts ++= Seq(9000, 9001)
[warn] 
[warn]  // for udp ports
[warn]  dockerExposedUdpPorts += 4444
[warn]           
[success] All package validations passed
[info] Sending build context to Docker daemon  103.9MB
[info] Step 1/19 : FROM gcr.io/spark-operator/spark:v2.4.5 as stage0
[info]  ---> 775e46820946
[info] Step 2/19 : LABEL snp-multi-stage="intermediate"
[info]  ---> Running in bbc5b5b8e7d1
[info] Removing intermediate container bbc5b5b8e7d1
[info]  ---> 39620874f5ad
[info] Step 3/19 : LABEL snp-multi-stage-id="28f7a183-ff2d-49a9-a7f4-8bf0922cc14a"
[info]  ---> Running in 52724730b27f
[info] Removing intermediate container 52724730b27f
[info]  ---> f09c6c35fe2e
[info] Step 4/19 : WORKDIR /opt/docker
[info]  ---> Running in 57917f47244e
[info] Removing intermediate container 57917f47244e
[info]  ---> ddd5bf51b6f8
[info] Step 5/19 : COPY 1/opt /1/opt
[info]  ---> 89f9087f61e1
[info] Step 6/19 : COPY 2/opt /2/opt
[info]  ---> 2ebbff0a92f1
[info] Step 7/19 : USER root
[info]  ---> Running in c978e3160b49
[info] Removing intermediate container c978e3160b49
[info]  ---> 16a0a7467ee6
[info] Step 8/19 : RUN ["chmod", "-R", "u=rX,g=rX", "/1/opt/docker"]
[info]  ---> Running in eb2c169a5984
[info] Removing intermediate container eb2c169a5984
[info]  ---> fb2828e0e3e5
[info] Step 9/19 : RUN ["chmod", "-R", "u=rX,g=rX", "/2/opt/docker"]
[info]  ---> Running in 7f135d6c8b47
[info] Removing intermediate container 7f135d6c8b47
[info]  ---> 1954043d6805
[info] Step 10/19 : RUN ["chmod", "u+x,g+x", "/1/opt/docker/bin/spark-k8s"]
[info]  ---> Running in 1973d6e1ce5c
[info] Removing intermediate container 1973d6e1ce5c
[info]  ---> 222cfb1efd3d
[info] Step 11/19 : FROM gcr.io/spark-operator/spark:v2.4.5 as mainstage
[info]  ---> 775e46820946
[info] Step 12/19 : USER root
[info]  ---> Running in a40600f3498c
[info] Removing intermediate container a40600f3498c
[info]  ---> a86cc4e1fed5
[info] Step 13/19 : RUN id -u demiourgos728 1>/dev/null 2>&1 || (( getent group 0 1>/dev/null 2>&1 || ( type groupadd 1>/dev/null 2>&1 && groupadd -g 0 root || addgroup -g 0 -S root )) && ( type useradd 1>/dev/null 2>&1 && useradd --system --create-home --uid 1001 --gid 0 demiourgos728 || adduser -S -u 1001 -G root demiourgos728 ))
[info]  ---> Running in 678ff1b1a486
[info] Removing intermediate container 678ff1b1a486
[info]  ---> 410b180d0973
[info] Step 14/19 : WORKDIR /opt/docker
[info]  ---> Running in 88bc2ff2a37d
[info] Removing intermediate container 88bc2ff2a37d
[info]  ---> 6eab9db7dc8a
[info] Step 15/19 : COPY --from=stage0 --chown=demiourgos728:root /1/opt/docker /opt/docker
[info]  ---> 2322c55c1b43
[info] Step 16/19 : COPY --from=stage0 --chown=demiourgos728:root /2/opt/docker /opt/docker
[info]  ---> 34f27fe0790d
[info] Step 17/19 : USER 1001:0
[info]  ---> Running in 5003593f3696
[info] Removing intermediate container 5003593f3696
[info]  ---> acaa9a126038
[info] Step 18/19 : ENTRYPOINT ["/opt/docker/bin/spark-k8s"]
[info]  ---> Running in 9c78002cfa36
[info] Removing intermediate container 9c78002cfa36
[info]  ---> eb79e083769b
[info] Step 19/19 : CMD []
[info]  ---> Running in e1c068a7d142
[info] Removing intermediate container e1c068a7d142
[info]  ---> 684766b9e996
[info] Successfully built 684766b9e996
[info] Successfully tagged spark-k8s:0.1
[info] Successfully tagged localhost:5000/spark-k8s:0.1
[info] Removing intermediate image(s) (labeled "snp-multi-stage-id=28f7a183-ff2d-49a9-a7f4-8bf0922cc14a") 
[info] Deleted Images:
[info] deleted: sha256:222cfb1efd3d9d4b7f51c3367b0622f43ef88f30150cc62b95e5770926240a53
[info] deleted: sha256:612a0fa67059f9ab6cb34bc0b9cf59025b7d6470dd326b1747599610dfb2dbe5
[info] deleted: sha256:1954043d68051ff59b260890e0fa3f42914ea06aaa0fda99715fbca8d5936b8c
[info] deleted: sha256:7aa225573b8863caf132d98142d4209f562e6c77a9b84a42a009647aeee64f15
[info] deleted: sha256:fb2828e0e3e57879651a71151d2965aed43c8e0f43afd4c4e3646812450fd7f8
[info] deleted: sha256:ae7e0b06ff423029c1940557aed93bba09547df3553fb80534de09aa3efe199a
[info] deleted: sha256:16a0a7467ee6f039de96ab4da6f2261a17d55b08989fe607224207392447d75b
[info] deleted: sha256:2ebbff0a92f14fedd9baf02ae157a391e4d78e124519c855d342d441390b93ef
[info] deleted: sha256:0d62c91bf58ed079dc4f4752d1657691c040b36b81db2f661e849fbb20fa8e7a
[info] deleted: sha256:89f9087f61e1968fb4d0521de0a23d7b55deefe81ce69d8e660b9f0d834d8e37
[info] deleted: sha256:aea6cf1ed953e8fcddf08be76cf637aabdc86663f0328596bcf6c3f0f70087a2
[info] deleted: sha256:ddd5bf51b6f8a941fccb81c49bb497f027fffa5bc53a6c3d452f010dc29f8bc6
[info] deleted: sha256:d60fb71c699434417efefef2e731a80f266316ee4292e291b46be3b426b18d6d
[info] deleted: sha256:f09c6c35fe2e5d98d489d58f1fecae5a9b4c2308e701c7049540bd6a5beb6ee0
[info] deleted: sha256:39620874f5adefe3ffdfd73a7572a043fa0b247ab496d7fd18b079f88a7e3a54
[info] Total reclaimed space: 207.6MB
[info] Built image spark-k8s with tags [0.1, 0.1]
[success] Total time: 38 s, completed Jul 20, 2020 12:04:32 PM
$ docker images
REPOSITORY                                     TAG                   IMAGE ID            CREATED             SIZE
spark-k8s                                      0.1                   684766b9e996        2 minutes ago       658MB

I will look into the registry configuration and let you know.
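For context, the localhost:5000/spark-k8s tag in the log above comes from sbt-native-packager's dockerRepository setting; a minimal sketch of that configuration (localhost:5000 assumes a registry container is already listening there):

// build.sbt (sbt-native-packager DockerPlugin)
// docker:publish pushes to whatever registry host prefixes the image tag
dockerRepository := Some("localhost:5000")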

Question:

I am trying to upgrade a legacy Scala/Spark sbt project to sbt 1.1.0. It uses Scala 2.10.6, as specified in build.sbt. It is unclear which sbt version the build.sbt file was written for, but presumably 0.12.x or even earlier.

I cannot migrate the assemblyExcludedJars line in build.sbt:

import sbt.Keys.{libraryDependencies, _}
import sbtassembly.AssemblyKeys.assemblyExcludedJars

[...]

lazy val sparkSettings = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-sql" % "1.6.0",
    [...],
  assemblyExcludedJars in assembly <<= (fullClasspath in assembly) map { cp =>
    val excludes = Set(
      "minlog-1.2.jar",
      [...]
    )
    cp filter { jar => excludes(jar.data.getName) }
  }
)

As stated in the migration guide, <<= is no longer supported, so I changed it to :=.

When I run sbt assembly after the change, the following error occurs:

$ sbt assembly
[info] Loading settings from idea.sbt ...
[info] Loading global plugins from /home/XXX/.sbt/1.0/plugins
[info] Loading settings from assembly.sbt ...
[info] Loading project definition from YYY/project
YYY/build.sbt:49: error: type mismatch;
found   : sbt.Def.Initialize[sbt.Task[Seq[sbt.internal.util.Attributed[java.io.File]]]]
required: sbt.Keys.Classpath
(which expands to)  Seq[sbt.internal.util.Attributed[java.io.File]]
assemblyExcludedJars in assembly := (fullClasspath in assembly) map { cp =>
^
[error] sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?

I could not find much in the sbt documentation about what exactly this line is supposed to do. I find it rather confusing, though the type of assembly seems to have changed.

I also looked at this in IntelliJ IDEA, and it actually flags an error: Cannot resolve symbol assembly.

How can I migrate this build file correctly?

Best answer:

The rewritten assemblyExcludedJars setting should look something like this:

assemblyExcludedJars in assembly := { 
  val cp = (fullClasspath in assembly).value
  val excludes = Set("minlog-1.2.jar", [...])
  cp filter { jar => excludes(jar.data.getName) }
}

Simply replacing the <<= operator fails, because the := operator takes the task's result type on the right-hand side instead of an Initialize[Task[...]].
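For what it's worth, on sbt 1.1+ the same setting can also be written with the unified slash syntax (a sketch, with a single excluded jar for brevity):

// equivalent to `assemblyExcludedJars in assembly := ...`
assembly / assemblyExcludedJars := {
  val cp = (assembly / fullClasspath).value
  val excludes = Set("minlog-1.2.jar")
  cp filter { jar => excludes(jar.data.getName) }
}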

Sun, Mar 29, 2020

As a library author, I’ve been wanting to tag methods in Scala that can trigger custom warnings or compiler errors. Why would I want to intentionally cause a compiler error? One potential use case is displaying a migration message for a removed API.

Restligeist macro: n. A macro that fails immediately to display migration message after implementation has been removed from the API.

— ∃ugene yokot∀ (@eed3si9n)
August 30, 2016

For example, if you try to use <<= in sbt 1.3.8 you’d get the following error on load:

/tmp/hello/build.sbt:13: error: `<<=` operator is removed. Use `key := { x.value }` or `key ~= (old => { newValue })`.
See http://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html
    foo <<= test,
        ^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
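For reference, the two rewrites the error message suggests look roughly like this (a sketch with hypothetical setting keys):

// hypothetical keys, for illustration only
val foo = settingKey[String]("example")
val bar = settingKey[String]("another example")

// sbt 0.13:  foo <<= bar   (no longer compiles)
foo := bar.value                 // `key := { x.value }`: rewire foo to bar
foo ~= (old => old.toUpperCase)  // `key ~= f`: transform foo's previous value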

It’s good that it’s doable, but using a macro for this is too pompous. According to Yoshida-san, you can do this in Haskell just by putting Whoops in the type signature:

-- | This function is being removed and is no longer usable.
-- Use 'Data.IntMap.Strict.insertWith'
insertWith' :: Whoops "Data.IntMap.insertWith' is gone. Use Data.IntMap.Strict.insertWith."
            => (a -> a -> a) -> Key -> a -> IntMap a -> IntMap a
insertWith' _ _ _ _ = undefined

configurable warnings

In March 2019, I sent a pull request #7790 to scala/scala proposing a @compileTimeError annotation. The pull request evolved into a @restricted annotation and the configurable warning option -Wconf. The idea was that @restricted could tag methods with labels, and -Wconf would be able to escalate a tag to either a warning or an error, like -Wconf apiMayChange:foo.*:error.

Unfortunately #7790 was shot down as we were approaching Scala 2.13.0, but -Wconf was resurrected during the summer by Lukas Rytz (@lrytz) as a general-purpose filter (#8373) that can configure any warning by category, message content, source, origin, or the deprecation since field. Using this, library users will be able to escalate, say, deprecation messages from a certain version to errors. #8373 is merged and will be part of Scala 2.13.2.
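Once 2.13.2 is out, a filter like that could be wired into a build along these lines (a sketch of the -Wconf syntax; the version bound is illustrative):

// build.sbt: escalate deprecations whose `since` is newer than 2.12 to errors
scalacOptions += "-Wconf:cat=deprecation&since>2.12:error"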

ApiMayChange annotation

As an example of denoting the "status" of an API, Lightbend's Akka library has a few interesting annotations. For example, ApiMayChange denotes that the tagged APIs are exempt from the normal binary-compatibility guarantees: essentially a beta feature that might evolve in the future.

This would be an interesting tag for any long-supported library. One interesting aspect of this annotation is that it's purely a social convention, meaning that the compiler will not print any warnings if you call a "may change" API.
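For reference, applying the Akka annotation looks like this (a sketch; akka.annotation.ApiMayChange ships with the akka-actor artifact):

import akka.annotation.ApiMayChange

@ApiMayChange // no compiler effect; a documented social convention only
def experimentalFeature(): Unit = ()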

apiStatus annotation (proposal)

-Wconf is useful, but currently the only tool library authors have to trigger a warning without resorting to a macro is the @deprecated annotation. A week ago, I sent #8820 to scala/scala proposing the idea of @apiStatus, which enables user-land compiler warnings and errors.

Here are some examples. Let's say we want to make the <<= method an error.

import scala.annotation.apiStatus, apiStatus._

@apiStatus(
  "method <<= is removed; use := syntax instead",
  category = Category.ForRemoval,
  since = "foo-lib 1.0",
  defaultAction = Action.Error,
)
def <<=(): Unit = ???

Here is how it would look if someone calls this method:

example.scala:26: error: method <<= is removed; use := syntax instead (foo-lib 1.0)
  <<=()
  ^

So the custom compiler message works.

implementing ApiMayChange

Let's try implementing the ApiMayChange annotation.

package foo

import scala.annotation.apiStatus, apiStatus._

@apiStatus(
  "should DSL is incubating, and future compatibility is not guaranteed",
  category = Category.ApiMayChange,
  since = "foo-lib 1.0",
  defaultAction = Action.Silent,
)
implicit class ShouldDSL(s: String) {
  def should(o: String): Unit = ()
}

Following Akka, I chose the default action to be Action.Silent so it won't display a warning. Here's where -Wconf can shine: using the -Wconf:cat=api-may-change&origin=foo..*:warning option, the user can enable the "api-may-change" category just for the foo.* package.

example.scala:28: warning: should DSL is incubating, and future compatibility is not guaranteed (foo-lib 1.0)
  "bar" should "something"
  ^
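Wired into the consuming project's build, that toggle would look something like this (a sketch; the api-may-change category exists only under the proposed #8820):

// build.sbt: surface foo-lib's ApiMayChange-tagged calls as warnings
scalacOptions += "-Wconf:cat=api-may-change&origin=foo..*:warning"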

If you want to make it a warning by default you can also change it to defaultAction = Action.Warning.

user-land warnings and errors

The category field is just a String, so you can use your imagination for the kind of useful tagging you can apply to your classes and methods. (This should also make it easy to backport to older Scala versions for cross building.)

In general, what do you think about the idea of user-land warnings and errors? Please let us know by hitting +1/-1 or commenting on #8820.

I am trying to use scalapb to generate case classes from my protobuf, but I am currently getting compilation errors.

I have my scalapb.sbt as follows:

addSbtPlugin("com.trueaccord.scalapb" % "sbt-scalapb" % "0.5.26")

libraryDependencies ++= Seq(
  "com.trueaccord.scalapb" %% "compilerplugin" % "0.5.26",
  "com.github.os72" % "protoc-jar" % "3.0.0-b2.1"
)

And, my build.sbt is as follows:

// for scalapb

import com.trueaccord.scalapb.{ScalaPbPlugin => PB}

PB.targets in Compile := Seq(
  scalapb.gen() -> (sourceManaged in Compile).value
)

PB.protobufSettings

PB.runProtoc in PB.protobufConfig := (args =>
  com.github.os72.protocjar.Protoc.runProtoc("-v241" +: args.toArray))

libraryDependencies ++= Seq(
    "io.grpc" % "grpc-netty" % "0.14.0",
    "com.trueaccord.scalapb" %% "scalapb-runtime-grpc" % (PB.scalapbVersion in PB.protobufConfig).value
)

Also, I have created a sample .proto file under src/main/protobuf as follows:

syntax = "proto2"

package org.pk.stream.protos

message Tweet {
    required string filter_level = 1;
}

Now, when I am trying to sbt compile, I am getting the following error:

S:\MyRepos\LogStreaming>sbt compile
[info] Loading global plugins from C:\Users\pkumar25\.sbt\0.13\plugins
[info] Loading project definition from S:\MyRepos\RLoggerStreaming\project
S:\MyRepos\LogStreaming\build.sbt:21: error: object trueaccord is not a
member of package com
import com.trueaccord.scalapb.{ScalaPbPlugin => PB}
           ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

Could someone help me in resolving this error?

I am also a little confused between the two scalapb group ids: com.thesamet.scalapb (https://scalapb.github.io/sbt-settings.html) and com.trueaccord.scalapb (https://mvnrepository.com/artifact/com.trueaccord.scalapb). I am curious which one should be used, and how to use it properly?

Much appreciated!
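To the version question: com.trueaccord.scalapb is the legacy group id; the project later moved to com.thesamet.scalapb, and the linked sbt-settings page describes the current setup. A minimal configuration in that style would look roughly like this (a sketch; the version numbers are illustrative, not checked against the latest release):

// project/scalapb.sbt
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.34")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.10.2"

// build.sbt -- with sbt-protoc there is no ScalaPbPlugin import at all
PB.targets in Compile := Seq(
  scalapb.gen() -> (sourceManaged in Compile).value
)

With that in place, the gRPC runtime dependency can track the plugin via "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapb.compiler.Version.scalapbVersion.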

Downloads/play-with-scalajs-example-master/build.sbt:5: error: reference to scalaJSProjects is ambiguous;
it is imported twice in the same scope by
import _root_.webscalajs.WebScalaJS.autoImport._
and import _root_.playscalajs.PlayScalaJS.autoImport._
  scalaJSProjects := Seq(client),
  ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?

i’m using IntelliJ 2016.3.5 enterprise

if i want to use encodeURI() in JS, what package do i need to import in scala?

welcome anyone to solve my problem
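For the encodeURI question: Scala.js exposes the standard JavaScript URI functions through scala.scalajs.js.URIUtils; a minimal sketch:

import scala.scalajs.js.URIUtils

// delegates to the native JS encodeURI()
val encoded: String = URIUtils.encodeURI("https://example.com/path with spaces")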

when i went to an older version of plugins.sbt i got this error:

[error] Some keys were defined with the same name but different types: 'scalaJSDev' (sbt.Task[scala.collection.Seq[scala.Tuple2[java.io.File, java.lang.String]]], sbt.Task[scala.Function1[scala.collection.Seq[scala.Tuple2[java.io.File, java.lang.String]], scala.collection.Seq[scala.Tuple2[java.io.File, java.lang.String]]]])
[error] Use 'last' for the full log.

has anyone seen that before?


//addSbtPlugin("com.vmunier" % "sbt-web-scalajs" % "1.0.3")

addSbtPlugin("com.vmunier" % "sbt-play-scalajs" % "0.2.8")

when i used sbt-play-scalajs it worked…

@zerodrift Please do not mention @ all in this gitter room. You’re sending a mail to 1500+ people.

how can I match a js.| in order to distinguish what it is, like with Either's Right/Left?
is there a conversion to scala's Either?

cast to Any and use pattern matching
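Concretely, the cast-to-Any approach looks something like this (a sketch; the String-or-Int union is hypothetical):

import scala.scalajs.js

// hypothetical union, e.g. from a JS facade that returns String or Int
def describe(value: js.|[String, Int]): String =
  (value: Any) match {
    case s: String => s"string: $s" // runtime check on the underlying JS value
    case i: Int    => s"int: $i"
  }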

@sjrd sorry about that. do you know how to fix that error though?

Not really. Looks like a binary or source incompatibility between different versions of sbt plugins.

It seems more related to sbt-web-scalajs than to Scala.js core, though. Scala.js core doesn't define scalaJSDev. So you have a better chance of getting an answer in the following room:

@sjrd just fyi…wondering if it happens to you just downloading the source from the scala.js example on github?

wondering if it’s my machine perhaps. i just downloaded the scala.js play example from GitHub without any modification

@sjrd right. Could I somehow be notified when it is fixed? Will that already happen because I’m cc’d on the email?

Yes you’ll receive a mail every time something happens on that issue (comment, closed, etc.) as usual for GitHub issues.

Ah, I see. What of this? https://github.com/scala/scala-dev "Please use our official issue tracker to lodge bugs and feature requests, rather than using this issue tracker."?

That applies to "people". I'm me :p

More seriously, that sentence is not that applicable anymore. They’re silently, progressively moving to the GitHub issue tracker ;)

I’m starting to feel like an insider! Okay, thanks @sjrd It’s nice to have my project completely on 2.12 now — access to the latest version of Slick and more.

@zerodrift It looks like you’re using Play and Scala.js, which I’ve been doing for a while now. Feel free to p/m me.

i’m getting this error using scalajs with Silhouette, anyone see this before?

Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'self' clef.io cdnjs.cloudflare.com". Either the 'unsafe-inline' keyword, a hash ('sha256-RKtlAPx86GOjTlPNdiKrIs4dJH23G8D2/SiGldptuRo='), or a nonce ('nonce-...') is required to enable inline execution.
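That error is the browser enforcing the page's Content-Security-Policy, not anything in Scala.js: the policy set by Play's security-headers filter ("script-src 'self' clef.io cdnjs.cloudflare.com") forbids inline <script> tags. The quick but less safe workaround is to allow inline scripts in the policy; a sketch for a Play 2.5-era conf/application.conf (the exact key varies by Play version, and a hash or nonce is the safer fix):

# conf/application.conf
play.filters.headers.contentSecurityPolicy = "script-src 'self' 'unsafe-inline' clef.io cdnjs.cloudflare.com"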
