
Scala 2.12 Release Notes | The Scala Programming Language

We are very happy to announce the availability of Scala 2.12.0!


One important goal of Scala 2.12 was to make optimal use of Java 8's new features (and it thus generates code that requires a Java 8 runtime).

In addition to that, an amazing amount of work has happened to run Scala on other platforms, improve the standard library and the compiler, polish the language, introduce cleanups throughout the codebase, and provide new tools (Scalafmt, Scalafix) as well as services (Scaladex, ScalaFiddle) to Scala developers.

Many of these features were developed in fruitful collaboration with the Dotty team.

Platform status

Scala on the JVM

Scala 2.12 requires a Java 8 runtime to compile code. It requires at least Java 8 to run the generated code. 2.12.1 will be out shortly (by the end of November) to address some known (but rare) issues in 2.12.0. We are planning to add (some) support for Java 9 in the near future. Full Java 9 support will be part of the 2.13 roadmap discussions.

Scala 2.11 requires a Java 6, 7 or 8 runtime to compile code. It requires Java 6, 7 or 8 to run the generated code. 2.11.9 is the last planned 2.11.x release (due by mid December).

In the next few weeks, we at Lightbend will share our plans for Scala 2.13.


Scala.js

Scala.js emits standard JavaScript that runs in all popular browsers (ECMAScript 5.1 Strict Mode or ECMAScript 2015). It has been production-ready since the beginning of 2015.

I have done some work with GWT before and compared to that the first thing you will notice is that Scala.js is FAST. It compiles 100 times faster, yet it still outputs JavaScript that is both faster and smaller.

Baldur Norddahl

Scala.js has amazing interoperability with all JavaScript libraries and can use many Scala libraries, allowing developers to share code between client and server.

Scala.js has very good tooling and very good documentation. If you already write JVM flavoured Scala you can be up and running in no time and write JS flavoured Scala.

Guillaume Belrose

Scala.js supports CommonJS modules and can directly work with npm dependencies.

The current version of Scala.js is 0.6.13 and requires Scala 2.10, 2.11 or 2.12 to compile Scala code to JavaScript.

To use Scala.js, add the plugin to your plugin configuration (project/plugins.sbt):

addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.13")

After that, you can enable Scala-to-JavaScript compilation by adding

enablePlugins(ScalaJSPlugin)

to your build configuration (build.sbt).

The Scala.js website provides further information.

Scala on Android

Scala on Android allows you to write Android apps in Scala. The current version of Scala on Android is 1.7.1.

Scala on Android requires Scala 2.11. Support for Scala 2.12 is pending, as Google's support for converting Java 8 bytecode to Android's native executable format is not yet production-ready.

To start using Scala on Android, add the plugin to your plugin configuration (project/plugins.sbt):

addSbtPlugin("org.scala-android" % "sbt-android" % "1.7.1")

After that, you can enable Android support by adding

enablePlugins(AndroidApp)

to your build configuration (build.sbt).

The custom ProGuard caching provided by the Android plugin integrates with incremental compilation, which allows skipping the ProGuard step and shaving off significant amounts of time during development!

An optional extension of this plugin, sbt-android-protify, provides even further speed-ups by allowing you to make changes to your code and see those changes reflected immediately on your device (or emulator):

Unlike Google's Instant Run feature, this functionality also works on devices running versions earlier than Android 5.0.

Another optional extension of this plugin, sbt-android-gradle, automatically imports an existing build.gradle file, which means you can get started without writing any SBT build configuration.

Writing Android apps with Scala adds a small size overhead that depends on how much functionality from the Scala standard library is used. Currently, the minimal overhead is about 30kB.

A new, dedicated website helps you to get started using Scala on Android devices.


Scala-Native

Scala-Native compiles Scala to native code and provides seamless interoperability with C.

It uses LLVM for code generation and currently supports Linux and macOS. Additional targets are planned and will be added in time.

Even at this early stage performance has been very promising: A simple ray-tracer written in Scala-Native is close in speed to the equivalent code written in C++.

To try Scala-Native, you need to have LLVM and BoehmGC installed.

Then add the Scala-Native plugin to your plugin configuration (project/plugins.sbt):

// Scala-Native releases only snapshot builds
resolvers += Resolver.sonatypeRepo("snapshots")
addSbtPlugin("org.scala-native" % "sbtplugin"  % "0.1-SNAPSHOT")

After that, you can enable native compilation by adding

enablePlugins(ScalaNativePlugin)

to your build configuration (build.sbt).

Scala-Native currently uses Scala 2.11; support for 2.12 will be added with the release of Scala-Native 0.1.

We are extremely grateful for the support Scala-Native has received from many contributors: in only four months since the announcement in May, it has already accepted more than 100 contributions.

The Scala-Native documentation provides further information.


Dotty

Dotty is a platform to try out new language concepts and compiler technologies for Scala. The focus is mainly on simplification: we remove extraneous syntax (e.g. no XML literals) and try to boil down Scala's types into a smaller set of more fundamental constructors.

The Dotty team has already implemented union types, intersection types, and literal singleton types, as well as trait parameters and many more improvements designed to arrive at an even more minimal, orthogonal language design.

The theory behind these constructors is researched in DOT, a calculus for dependent object types.

Dotty's current status as a technology preview means that it is unsupported and may be functionally incomplete or unsuitable for production use.

To try Dotty, add the Dotty plugin to your plugin configuration (project/plugins.sbt):

addSbtPlugin("com.felixmulder" % "sbt-dotty" % "0.1.4")

After that, you can enable compilation with Dotty by adding

enablePlugins(DottyPlugin)

to your build configuration (build.sbt).

If you are interested in more details, visit the Dotty homepage or watch Martin Odersky's talk on Dotty and compiler design:

New features

Here are the most noteworthy pull requests of the 2.12 release. See also the RC2 and RC1 release notes for the most recent changes.

Traits compile to interfaces

With Java 8 allowing concrete methods in interfaces (default methods), Scala 2.12 is able to compile a trait to a single interface.

Before, a trait was represented as an interface and a class that held the method implementations (T$class.class).

Due to technical limitations of default methods, the compiler still has to perform quite a bit of magic behind the scenes, so care must be taken if a trait is meant to be implemented in Java. Briefly, if a trait does any of the following, its subclasses require synthetic code:

- defining fields (val or var; a constant is ok, i.e. a final val without a result type)
- calling super
- having initializer statements in the body
- extending a class
- relying on linearization to find implementations in the right supertrait
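As a minimal sketch (the trait and class names here are hypothetical, not from the release notes), a trait whose concrete members are only methods now compiles to a single Java interface, with the method bodies emitted as default methods:

```scala
// Sketch (hypothetical names): in 2.12 this trait compiles to a single
// Java interface; greet becomes a default method, name stays abstract.
trait Greeter {
  def name: String                     // abstract method
  def greet: String = "Hello, " + name // concrete: emitted as a default method
}

class English extends Greeter {
  def name = "World"
}

// No separate Greeter$class.class is generated anymore.
println(new English().greet) // prints "Hello, World"
```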

Lambda syntax for SAM types

The Scala 2.12 type checker accepts a function literal as a valid expression for any Single Abstract Method (SAM) type, in addition to the FunctionN types from the standard library. This improves the experience of using libraries written for Java 8 in Scala.

scala> val runRunnable: Runnable = () => println("Run!")
runRunnable: Runnable = $$Lambda$1073/754978432@7cf283e1


Note that only lambda expressions are converted to SAM type instances, not arbitrary expressions of FunctionN type:

scala> val f = () => println("Faster!")
f: () => Unit = $$Lambda$1079/110296358@36f1046f

scala> val fasterRunnable: Runnable = f
<console>:12: error: type mismatch;
 found   : () => Unit
 required: Runnable

With the use of default methods, Scala's built-in FunctionN traits are compiled to SAM interfaces. This allows creating Scala functions from Java using Java's lambda syntax:

public class A {
    scala.Function1<String, String> f = s -> s.trim();
}

Specialized function classes are also SAM interfaces and can be found in the package scala.runtime.java8.

Thanks to an improvement in type-checking, the parameter type in a lambda expression can be omitted even when the invoked method is overloaded (see #5307). In the following example, the compiler infers parameter type Int for type checking the lambda:

scala> trait MyFun { def apply(x: Int): String }

scala> object T {
     |   def m(f: Int => String) = 0
     |   def m(f: MyFun) = 1
     | }

scala> T.m(x => x.toString)
res0: Int = 0

Note that both methods are applicable, and overloading resolution selects the one with the Function1 argument type, as explained in more detail below.

Better type inference for Scala.js

The improved type inference for lambda parameters also benefits js.Functions. For example, you can now write:

dom.window.requestAnimationFrame { now => // inferred as Double
  ...
}

without having to specify (now: Double) explicitly. In a similar spirit, the more consistent inference for overriding vals makes it easier to implement Scala.js-defined JS traits with anonymous objects. For example:

trait SomeOptions extends js.Object {
  val width: Double | String // e.g., "300px"
}
val options = new SomeOptions {
  val width = 200 // implicitly converted from Int to the inferred Double | String
}

Java 8-style bytecode for lambdas

Scala 2.12 emits closures in the same style as Java 8, whether they target a FunctionN class from the standard library or a user-defined Single Abstract Method (SAM) type.

For each lambda, the compiler generates a method containing the lambda body and emits an invokedynamic instruction that will spin up a lightweight class for the closure using the JDK's LambdaMetaFactory. Note that in the following situations, an anonymous function class is still synthesized at compile time:

Compared to Scala 2.11, the new scheme has the advantage that, in most cases, the compiler does not need to generate an anonymous class for each closure.

Our backend support for invokedynamic is also available to macro authors.


Java 8 compatibility module

The Java 8 compatibility module for Scala has received an overhaul for Scala 2.12. Even though interoperability of Java 8 SAMs and Scala functions is now baked into the language, this module provides additional convenience for working with Java 8 SAMs. Java 8 streams support was also added during the development cycle of Scala 2.12. Releases are available for both Scala 2.11 and Scala 2.12.

Smaller JAR and class files

Compared to Scala 2.11, the new scheme has the advantage that, in most cases, the compiler does not need to generate an anonymous class for each closure.

Combined with the improvements in compiling traits to interfaces with default methods, this leads to significantly smaller JAR files:

                      2.9    2.10   2.11   2.12
Scala library         9.0    7.1    5.7    5.5
Scala compiler        11.5   14.5   15.5   10.1
ScalaTest (3.0.0)            10.5   10.4   7.0
Akka Actors (2.4.12)                3.3    2.9
Shapeless (2.3.2)            3.6    3.5    2.8

(JAR sizes in MB)

Actual size improvements depend heavily on the style of code you write; code bases that create many lambdas benefit disproportionately from these improvements.

We have received reports ranging from a 5% reduction in code size to decreases of 50%.

Please let us know your experience and report any regressions you find!

Better lazy vals and objects

Local lazy vals and objects, i.e., those defined in methods, now use a more efficient representation (implemented in #5294 and #5374).

In Scala 2.11, a local lazy val is encoded using two heap-allocated objects (one for the value, a second for the initialized flag), and initialization synchronizes on the enclosing class instance. With the new representation for lambdas in 2.12, which emits the lambda body as a method in the enclosing class, this encoding can cause new deadlocks for lazy vals or objects defined in the lambda body.

This has been fixed by creating a single heap-allocated object that holds both the value and the initialized flag and at the same time serves as the initialization lock. A similar implementation already existed in Dotty.
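To illustrate what this encoding backs (the method name below is hypothetical), a local lazy val is initialized at most once, on first access; in 2.12 its value and initialized flag live in a single holder object:

```scala
// Sketch (hypothetical names): a local lazy val inside a method.
// In 2.12 the value and the initialized flag share one heap-allocated
// holder, which also serves as the initialization lock.
def sumTwice(): Int = {
  lazy val result = 21 * 2 // body runs only on first access
  result + result          // second access reuses the cached value
}
println(sumTwice()) // 84
```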

New back end

Scala 2.12 standardizes on the GenBCode back end, which emits code more quickly because it directly generates bytecode from Scala compiler trees, while the previous back end used an intermediate representation called ICode. The old back ends (GenASM and GenIcode) have been removed (#4814, #4838).

New optimizer

The GenBCode back end includes a new inliner and bytecode optimizer, delivered by Lukas Rytz, and built on earlier work by Miguel Garcia.

The optimizer is configured using the -opt compiler option. By default it only removes unreachable code within a method. Check -opt:help to see the list of available options for the optimizer.

The following optimizations are available:

For example, the following code

def f(a: Int, b: Boolean) = (a, b) match {
  case (0, true) => -1
  case _ if a < 0 => -a
  case _ => a
}

produces, when compiled with -opt:l:method, the following bytecode (decompiled using cfr):

public int f(int a, boolean b) {
  int n = 0 == a && true == b ? -1 : (a < 0 ? - a : a);
  return n;
}
The optimizer supports inlining (disabled by default). With -opt:l:project, code from source files currently being compiled is inlined, while -opt:l:classpath also enables inlining code from libraries on the compiler's classpath. Other than methods marked @inline, higher-order methods are inlined if the function argument is a lambda or a parameter of the caller.

Note that:

The Scala distribution is built using -opt:l:classpath, which improves the performance of the Scala compiler by roughly 5% (hot and cold, measured using our JMH-based benchmark suite) compared to a non-optimized build.

In addition to that, the @inline and @noinline annotations can now be added to method calls:

// @inline // we don't want to inline all the time, ...
def addOne(x: Int) = x + 1
addOne(23): @inline // ... but here we want to inline!

Either is now right-biased

Either now supports operations like map, flatMap, contains, toOption, and so forth, which operate on the right-hand side. The .left and .right projections may be deprecated in favor of .swap in a later release.

The changes are source-compatible with old code (except in the presence of conflicting extension methods).

Important libraries like cats have already deprecated their own implementation of a right-biased Either, standardizing on scala.util.Either in the future.
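A quick sketch of the new right bias (the values here are hypothetical): map and flatMap now operate on the Right side, while a Left passes through unchanged.

```scala
// Sketch (hypothetical values): right-biased Either operations in 2.12.
val ok: Either[String, Int] = Right(21)
val err: Either[String, Int] = Left("oops")

println(ok.map(_ * 2))                 // Right(42): map transforms the Right
println(err.map(x => x * 2))           // Left(oops): Left passes through
println(ok.flatMap(x => Right(x + 1))) // Right(22)
println(ok.contains(21))               // true
println(ok.toOption)                   // Some(21)
```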

Thanks, Simon Ochsenreither, for this contribution.

Improved Futures

scala.concurrent.Future gained a few new combinators and utility functions, including flatten, zipWith, transform, transformWith, unit, and never.

In addition, Futures also received further performance tuning and improvements to the behavior of long chains of Futures.
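A short sketch of a few of the new combinators (the values are hypothetical):

```scala
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Sketch (hypothetical values): flatten un-nests a Future[Future[T]],
// and zipWith combines two Futures with a function.
val nested: Future[Future[Int]] = Future(Future(21))
val flat: Future[Int] = nested.flatten
val product: Future[Int] = flat.zipWith(Future(2))(_ * _)

println(Await.result(product, 5.seconds)) // 42
// Future.unit is an already-completed Future[Unit], handy as a chain starter.
val chained = Future.unit.map(_ => "done")
println(Await.result(chained, 5.seconds)) // done
```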

This blog post series by Viktor Klang explores the diverse improvements made to Future for 2.12.

Improved REPL

Scalas REPL (read-evaluate-print-loop) ships with many improvements.

If you like color (who doesn't?), add the -Dscala.color option until it's turned on by default.

The implementation of tab completion in the Scala REPL has been rewritten and now uses the same infrastructure as, for example, Scala IDE and ENSIME, which greatly improves the experience!

There are a number of improvements:

To try Scala's interactive shell, launch it from the command line with the scala script or in sbt using the console task.

Thanks to Jason Zaugg and Andrew Marki for their fruitful collaboration on this work!

These improvements have also been added to Scala 2.11.

Collection improvements

Apart from many general fixes and polishing, the performance of Iterator, Map, Set, ListMap, ListSet, PriorityQueue, OpenHashMap, and ArrayStack has been improved, in some cases significantly.

An implementation of a mutable TreeMap has been added.
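A quick sketch of the newly added mutable TreeMap (the values are hypothetical): it keeps its keys in sorted order while supporting in-place updates.

```scala
import scala.collection.mutable

// Sketch (hypothetical values): mutable.TreeMap keeps keys sorted.
val scores = mutable.TreeMap(3 -> "c", 1 -> "a")
scores += 2 -> "b"          // in-place insertion; ordering is maintained

println(scores.keys.toList) // List(1, 2, 3)
println(scores.head)        // (1,a): smallest key first
```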

Thanks to the many contributors who have worked on this!

Better deprecations and library evolution

As library authors in any programming language know, evolving code is hard: most changes are not considered binary compatible on the JVM.

While Java itself has adopted one approach of addressing this issue (never change anything), this strategy does not work for all library authors.

To help library authors evolve their libraries (while avoiding unexpected breakages and unnecessary warnings for their users), Scala has greatly extended and improved the ways library authors can communicate upcoming changes to their users.

Scala now provides the @deprecatedInheritance and @deprecatedOverriding annotations (SI-6162), in addition to the existing @deprecated and @deprecatedName annotations.

The @deprecatedInheritance annotation signals that inheriting from a class is deprecated.

This is usually done to warn about a non-final class being made final in a future version. Sub-classing such a class then generates a warning:

// Library code
@deprecatedInheritance("this class will be made final", "FooLib 12.0")
class Foo

// User code
val foo = new Foo     // no deprecation warning
class Bar extends Foo
// warning: inheritance from class Foo is deprecated (since FooLib 12.0):
//          this class will be made final
// class Bar extends Foo
//                   ^

No warnings are generated if the subclass is in the same compilation unit.

The @deprecatedOverriding annotation signals that overriding a member is deprecated. Overriding such a member in a sub-class then generates a warning:

// Library code
class Foo {
  @deprecatedOverriding("this method will be made final", "FooLib 12.0")
  def add(x: Int, y: Int) = x + y
}

// User code
class Bar extends Foo // no deprecation warning
class Baz extends Foo {
  override def add(x: Int, y: Int) = x - y
}
// warning: overriding method add in class Foo is deprecated (since FooLib 12.0):
//          this method will be made final
// override def add(x: Int, y: Int) = x - y
//              ^

Additionally, all reports of deprecations (including language deprecations) are now grouped by their since field (#5076), making it easy to tell which changes are upcoming and prioritize accordingly:

// Library code
@deprecated("this method will be removed", "FooLib 12.0")
def oldMethod(x: Int) = ...

// User code
aDeprecatedMethodFromLibraryBar(3, 4)
// warning: there was one deprecation warning (since BarLib 3.2)
// warning: there were two deprecation warnings (since FooLib 12.0)
// warning: there were three deprecation warnings in total;
//          re-run with -deprecation for details

Library authors should document their library's deprecation policy to give developers guidance on how long a deprecated definition will be preserved. We encourage library authors to prepend the name of their library to the version number to help developers distinguish deprecations coming from different libraries.

New Scaladoc look-and-feel

Scaladoc's output is now more attractive, more modern, and easier to use. The unified search makes sure developers find the class, object, method, or value they are looking for.

Take a look at the Scala Standard Library API:

Thanks, Felix Mulder, for leading this effort.

Scaladoc support for Java sources

This allows scaladoc to generate comprehensive documentation from projects with both Scala and Java sources. Thanks, Jakob Odersky, for this fix for SI-4826.

This feature is enabled by default, but can be disabled with:

scalacOptions in (Compile, doc) += "-no-java-comments"

Parameter names available at runtime

Java 8 added a mechanism in JEP-118 to store parameter names in class files and read this information at runtime using reflection.

While javac requires an additional flag to emit parameter names, Scala 2.12 emits them by default:

case class Person(name: String, age: Int)

// Reading the constructor's parameter names via Java reflection:
// classOf[Person].getConstructors.head.getParameters
// result:
// Array(final java.lang.String name, final int age)

Partial unification of type constructors

Compiling with -Ypartial-unification adds partial unification of type constructors, fixing the notorious SI-2712.

For starters, here is the sort of code affected by SI-2712:

def foo[F[_], A](fa: F[A]): String = fa.toString

foo { x: Int => x * 2 }

We're calling the foo function, passing a value of type Int => Int, which is syntactic sugar for Function1[Int, Int]. Meanwhile, foo expects a parameter of type F[A], where F[_] and A are unsolved type variables. This will not compile.

The reason it does not compile is that Function1 takes two type parameters, whereas F[_] takes only one. The compiler tries to instantiate F[_] = Function1[_, _], but the number of parameters doesn't line up, and the build fails.

With this fix, the above snippet compiles without modification and behaves exactly as you would expect. In fact, nearly every example of SI-2712 magically springs to life and just starts working exactly the way you want it to. Specifically in the above example, the compiler will see that we are passing a value of a type constructor with two parameters (Function1) to a function (foo) that expects a value of a type constructor with one parameter (F[_]), and it will handle the situation by assuming that there is some other type constructor which is a partial application from the left of Function1.

The fix assumes that type constructors are always partially applied in a left-to-right order. Whenever the compiler infers a type constructor which has a higher arity than the type constructor it really needs, it will assume that the type constructor it was supposed to infer was some partial application of the larger one. For example:

// required
F[_, _, _]

// provided
Foo[Monad, Int, String, Boolean, Unit]

// inferred
[A, B, C]Foo[Monad, Int, A, B, C]

The compiler will only ever infer these placeholder parameters on the right side of the type constructor. Never on the left. This has some very significant implications for the way in which you design data types.

This improvement has been backported to Scala 2.11 and will ship with the next minor release, 2.11.9.

Thanks to Miles Sabin for implementing this long-standing feature request and to Daniel Spiewak for this description of the issue and how it was addressed.

Use of invokedynamic

In addition to lambda support, caches for structural calls and Symbol literals are also implemented with invokedynamic now.

The invokedynamic instruction can also be used in macros, allowing macro authors to, e.g., hoist and cache expensive computations (like Regex pattern compilation).


The code base has been significantly improved, cleaning up and removing a lot of code. This has delivered great results, a few of which we want to mention:

Fields phase

Scala doesn't provide a way to define fields directly. Nevertheless, they need to be emitted by the compiler when values, lazy values, variables, objects, etc. are defined.

The creation of these fields was spread over multiple phases in the compiler.

This has been cleaned up in #5141 and #5294 by introducing a dedicated phase called fields.

This change was inspired by Dotty and simplified many parts of the compiler, improving type inference along the way.

Ant build replaced with SBT build

Scala itself is now completely built, tested and published with sbt! This makes it easier to get started hacking on the compiler and standard library. All you need on your machine is JDK 8 and sbt - no ant, no environment variables to set, no shell scripts to run.

You can build, use, test and publish Scala like any other sbt-based project. Due to the recursive nature of building Scala with itself, IntelliJ cannot yet import our sbt build directly; use the intellij task instead to generate suitable project files.

Removal of the old compiler backend

The old GenASM backend and the ICode infrastructure have been removed in favor of the GenBCode backend, simplifying and minimizing the last stages of the compilation pipeline.

sun.misc.Unsafe replacement

In preparation for future Java releases, which might limit access to internal classes like sun.misc.Unsafe, all usages of such classes have been replaced by other implementations.

Viktor Klang's blog post gives some insights based on his work on Futures.

Removal of ForkJoin library

Earlier releases of the Scala standard library included the ForkJoin library.

As Java 8 has finally included the ForkJoin library in its standard library, and Scala 2.12 requires Java 8, this code has been dropped from the Scala standard library, reducing its size.

Reduced memory usage during compilation

The parts of the compiler that read class files during compilation have been improved, substantially reducing the amount of memory used.

Fewer dependencies for scaladoc

The scaladoc tool drops its dependency on the parser-combinators library, making it easier to evolve, build and publish.


Breaking changes

Except for the changes listed below, code that compiles on 2.11.x without deprecation warnings should also compile on 2.12.x, unless you use experimental APIs such as reflection.

If you find an incompatibility not listed below, please file an issue.

Thanks to source compatibility, cross-building is a one-line change to most sbt builds.

SAM conversion precedes implicits

The SAM conversion built into the type system takes priority over implicit conversion of function types to SAM types. This can change the semantics of existing code relying on implicit conversion to SAM types:

trait MySam { def i(): Int }
implicit def convert(fun: () => Int): MySam = new MySam { def i() = 1 }
val sam1: MySam = () => 2 // Uses SAM conversion, not the implicit
sam1.i()                  // Returns 2

To retain the old behavior, you may compile under -Xsource:2.11, use an explicit call to the conversion method, or disqualify the type from being a SAM (e.g. by adding a second abstract method).

Note that SAM conversion only applies to lambda expressions, not to arbitrary expressions with Scala FunctionN types:

val fun = () => 2     // Type Function0[Int]
val sam2: MySam = fun // Uses implicit conversion
sam2.i()              // Returns 1

SAM conversion in overloading resolution

In order to improve source compatibility, overloading resolution has been adapted to prefer methods with Function-typed arguments over methods with parameters of SAM types. The following example is identical in Scala 2.11 and 2.12:

scala> object T {
     |   def m(f: () => Unit) = 0
     |   def m(r: Runnable) = 1
     | }

scala> val f = () => ()

scala> T.m(f)
res0: Int = 0

In Scala 2.11, the first alternative is chosen because it is the only applicable method. In Scala 2.12, both methods are applicable, therefore overloading resolution needs to pick the most specific alternative. The specification for compatibility has been updated to consider SAM conversion, so that the first alternative is more specific.

Note that SAM conversion in overloading resolution is always considered, even if the argument expression is not a function literal (as in the example). This is unlike SAM conversion of expressions themselves; see the previous section. See also the discussion in scala-dev#158.

While the adjustment to overloading resolution improves compatibility, there also exists code that compiles in 2.11, but is ambiguous in 2.12:

scala> object T {
     |   def m(f: () => Unit, o: Object) = 0
     |   def m(r: Runnable, s: String) = 1
     | }
defined object T

scala> T.m(() => (), "")
<console>:13: error: ambiguous reference to overloaded definition

More consistent type inference for vals

Type inference for val and lazy val has been aligned with def, fixing assorted corner cases and inconsistencies (#5141 and #5294). Concretely, when computing the type of an overriding field, the type of the overridden field is used as the expected type. As a result, the inferred type of a val or lazy val may change in Scala 2.12.

In particular, implicit vals that didn't need explicitly declared types before may need them now. (This is always good practice anyway.)

You can get the old behavior with -Xsource:2.11. This may be useful for testing whether these changes are responsible if your code fails to compile.

Improvements to method conversions

Scala considers adapting a method without an argument list in multiple steps to figure out whether some requested type can be satisfied.

One of those steps is eta-expansion, which happens before empty application. In some cases considering eta-expansion before empty application would cause the compiler to reject code that could have been accepted if empty application happened before eta-expansion.

Eta-expansion before empty application has been deprecated in 2.12, and the order will be swapped in 2.13.
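To make the two adaptations concrete (the method below is hypothetical), a no-argument method can be adapted either by eta-expansion, turning the method itself into a function value, or by empty application, invoking it to produce its result. Which one applies depends on the expected type:

```scala
// Sketch (hypothetical names): two ways a no-argument method can be
// adapted to satisfy an expected type.
def answer() = 42

val f: () => Int = answer _ // eta-expansion: the method becomes a function value
val n: Int = answer()       // empty application: the method is invoked

println(f()) // 42
println(n)   // 42
```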

Improvements to internal syntax trees

PR #4794 improved the syntax trees for selections of statically accessible symbols.

For example, a selection of Predef no longer has the shape q"scala.this.Predef" but simply q"scala.Predef". Macros and compiler plugins matching on the old tree shape need to be adjusted.

This change only affects authors of macros and compiler plugins.

Object initialization locks and lambdas

In Scala 2.11, the body of a lambda is in the apply method of the anonymous function class generated at compile time. The new lambda encoding in 2.12 lifts the lambda body into a method in the enclosing class. An invocation of the lambda will therefore indirect through the enclosing class, which may cause deadlocks that did not happen before.

For example, the following code

import scala.concurrent._
import scala.concurrent.duration._
object O { Await.result(Future(1), 5.seconds) }

compiles to (simplified):

public final class O$ {
  public static O$ MODULE$;
  public static final int $anonfun$new$1() { return 1; }
  public static { new O$(); }
  private O$() {
    MODULE$ = this;
    Await.result(Future.apply(LambdaMetaFactory(Function0, $anonfun$new$1)), DurationInt(5).seconds);
  }
}

Accessing O for the first time initializes the O$ class and executes the static initializer (which invokes the instance constructor). Class initialization is guarded by an initialization lock (Chapter 5.5 in the JVM specification).

The main thread locks class initialization and spawns the Future. The Future, executed on a different thread, attempts to execute the static lambda body method $anonfun$new$1, which also requires initialization of the class O$. Because initialization is locked by the main thread, the thread running the future will block. In the meantime, the main thread continues to run Await.result, which will block until the future completes, causing the deadlock.

One example of this surprised the authors of ScalaCheck (now fixed).

Lambdas capturing outer instances

Because lambda bodies are emitted as methods in the enclosing class, a lambda can capture the outer instance in cases where this did not happen in 2.11. This can affect serialization.

The Scala compiler analyzes classes and methods to prevent unnecessary outer captures: unused outer parameters are removed from classes (#4652), and methods not accessing any instance members are made static (#5099). One known limitation is that the analysis is local to a class and does not cover subclasses.

class C {
  def f = () => {
    class A extends Serializable
    class B extends A
    serialize(new A)
  }
}

In this example, the classes A and B are first lifted into C. When flattening the classes to the package level, A obtains an outer pointer to the C instance. Because A has a subclass B, the class-level analysis of A cannot conclude that the outer parameter is unused (it might be used in B).

Serializing the A instance attempts to serialize the outer field, which causes a NotSerializableException: C.


SBT, the Scala Build Tool, is the backbone of every Scala project.

While Scala is well-supported by Maven, Ant and other build tools, SBT provides some additional features to make your life much easier:

It manages and resolves dependencies, configures the build, incrementally compiles your source code, runs tests, provides a project-specific console, creates artifacts and publishes them to repositories.

The recommended version of SBT is 0.13.13, which provides a glimpse of many upcoming improvements in SBT 1.0, such as auto plugins, launcher enhancements for the SBT server (defined in the sbt-remote-control project), and other necessary API changes.

The 0.13.x releases maintain binary compatibility with plugins that are published against SBT 0.13.0, but add new features in preparation for SBT 1.0. This allows us to test new ideas like auto plugins and performance improvements on dependency resolution; the build users can try new features without losing the existing plugin resources; and plugin authors can gradually migrate to the new plugin system before SBT 1.0 arrives.

Visit the documentation or the changelog for further information!


Scaladex

The Scala Center team is very proud to announce that the Scala Package Index, or Scaladex for short, has reached beta v1! This means we're now confident that Scaladex is in a state where it is ready for widespread use.

Scaladex now supports most of the major features we planned for it, in addition to serving as an index of the known Scala ecosystem: self-updating, reindexing, user editing of published libraries (keywords, documentation links, and artifact deprecation), and more, even as we work to iron out the user experience. In particular, we'd like to thank contributors Ronald Marske, Rafa Paradela and Israel Pérez from 47 Degrees for helping us reach this milestone!

Here's a quick walkthrough we put together of some of Scaladex's main features:

Have a look at the announcement for further details!


ScalaFiddle

ScalaFiddle, an online playground for exploring and sharing Scala code, has been released!

Head over to ScalaFiddle to get started!


Scalafmt

Scalafmt is a formatter for Scala source code that emits readable, idiomatic and consistently formatted Scala code. It can be used as a stand-alone tool, integrated into your SBT build or as a library.

Check out the Scalafmt documentation for more details!


Scalafix

Scalafix is a newly released tool to automatically adapt existing Scala source code to future releases. It takes care of easy, repetitive and tedious code transformations so you can focus on the changes that truly deserve your attention.

In a nutshell, scalafix reads a source file, transforms usage of unsupported features into newer alternatives, and writes the final result back to the original source file. Scalafix aims to automate the migration of as many unsupported features as possible. There will be cases where scalafix is unable to perform automatic migration. In such situations, scalafix will point to the offending source line and provide instructions on how to refactor the code.
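As an illustration, one of the rewrites shipped with the initial scalafix release removes the deprecated procedure syntax; the before/after shape looks roughly like this (rule details per the scalafix announcement):

```scala
object ProcedureSyntaxDemo {
  // Before -- Scala 2.11-era procedure syntax, deprecated in 2.12:
  //   def log(msg: String) { println(msg) }
  // After the rewrite, with an explicit `: Unit =`:
  def log(msg: String): Unit = { println(msg) }
}
```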

Please have a look at the scalafix announcement for more information.

Migration Manager (MiMa)

The Migration Manager is a tool that can scan different versions of a library and point out binary incompatible changes. This helps library authors of all JVM languages make sure that their libraries follow their compatibility rules, for example that minor releases introduce no binary incompatible changes.

MiMa can be used as a stand-alone tool or integrated into your SBT build. It has been updated to take the new capabilities of Java 8 bytecode into account.
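A minimal SBT integration might look like the following (plugin version and setting names are illustrative; check MiMa's GitHub page for the current ones):

```scala
// project/plugins.sbt
addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.13")

// build.sbt: declare the previous release to diff against, then run
// `sbt mimaReportBinaryIssues` to list binary incompatible changes.
mimaPreviousArtifacts := Set(organization.value %% name.value % "1.0.0")
```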

MiMa's GitHub page provides more details.

IDEs and editors


IntelliJ IDEA

JetBrains provides a Scala plugin which works in both the community and the commercial versions of their IntelliJ IDE.

It turns IntelliJ into a fully featured Scala IDE offering syntax highlighting, warning/error markers, auto-completion, refactorings, inspections and intentions.

Just place a *.scala file in your project and IntelliJ will offer to download and install the IntelliJ Scala plugin.

Have a look at the official documentation for more information.

The Scala plugin for IntelliJ supports Scala 2.12.


ScalaIDE

ScalaIDE, the Scala plugin for Eclipse, uses Scala 2.11 by default. Scala 2.10 and 2.12 can be configured manually.

In the coming weeks a new release of ScalaIDE will switch this default to Scala 2.12.

Have a look at the features page and the release notes of ScalaIDE 4.0, 4.1, 4.2, 4.3 and 4.4 for more information about new features since Scala 2.11.


ENSIME

ENSIME is a tool that provides IDE-like features (auto-completion, showing inferred types, jump to definition, …) to editors like Emacs, Atom, Vim and Sublime.

Support for Scala 2.12 is tracked in #1414. ENSIME is free/libre software developed by the Scala community, please consider contributing or supporting it!


The list of open-source libraries released for Scala 2.12 is growing quickly!


Minor releases of Scala are binary compatible with each other: every 2.12.x release will be binary compatible with 2.12.0.

Although Scala 2.11 and 2.12 are mostly source compatible to facilitate cross-building, they are not binary compatible. This allows us to keep improving the Scala compiler and standard library.

Cross-building is a one-line change to most SBT builds, and even provides support for version-specific source folders out of the box, when necessary to work around incompatibilities (e.g. macros).
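For a typical build, the one-line change is adding a crossScalaVersions setting (version numbers illustrative):

```scala
// build.sbt
crossScalaVersions := Seq("2.11.8", "2.12.0")
// Prefix a task with `+` (e.g. `sbt +test`) to run it for each version.
// Version-specific sources can live in src/main/scala-2.11 and
// src/main/scala-2.12 when needed, e.g. to work around macro changes.
```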

The documentation on compatibility explains in detail how binary compatibility works in Scala.

Known issues

There are some known issues with this release that will be resolved in 2.12.1, due out by the end of November.

Due to performance issues in HotSpot, heavy use of the default methods emitted when compiling traits can cause performance regressions in startup time. Note that steady-state performance is not affected according to our measurements.

We addressed this in #5429 by emitting additional forwarder methods in the bytecode. This increases the size of class files compared to earlier release candidates of 2.12, but is still a sizable improvement over 2.11. If necessary, you can use the -Xmixin-force-forwarders option to optimize for size instead of performance.
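If class-file size matters more than startup time in your project, the flag can be set from SBT; a sketch, assuming the `false` mode of the flag disables the extra forwarders:

```scala
// build.sbt
scalacOptions += "-Xmixin-force-forwarders:false"
```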

We will continue to tweak the bytecode we emit during the 2.12.x cycle to get the best performance out of the JVM. Please let us know if you notice any performance regressions.

We hope to address further known issues in a future 2.12.x release.


A big thank you to everyone who's helped improve Scala by reporting bugs, improving our documentation, kindly helping others on our forums and at meetups, and submitting and reviewing pull requests! You are all magnificent.

Scala 2.12.0 is the result of merging over 500 of the roughly 600 pull requests we received. The contributions to 2.12.x over the last 2 years were split 64/32/4 between the Scala team at Lightbend (lrytz, retronym, adriaanm, SethTisue, szeiger), the community and EPFL.

The new encodings of traits, lambdas and lazy vals were developed in fruitful collaboration with the Dotty team at EPFL.

Thank you very much to all the contributors who helped make this Scala release possible!


Please note that Scala 2.12 requires an installed Java 8 runtime (Java 9 is not yet supported).
