
µPickle 0.4.3

uPickle (pronounced micro-pickle) is a lightweight serialization library for Scala. Its key features are:

Getting Started

Add the following to your SBT config:

libraryDependencies += "com.lihaoyi" %% "upickle" % "0.4.3"

And then you can immediately start writing and reading common Scala objects to strings:

import upickle.default._

write(1)                          ==> "1"

write(Seq(1, 2, 3))               ==> "[1,2,3]"

read[Seq[Int]]("[1, 2, 3]")       ==> List(1, 2, 3)

write((1, "omg", true))           ==> """[1,"omg",true]"""

type Tup = (Int, String, Boolean)

read[Tup]("""[1, "omg", true]""") ==> (1, "omg", true)


For ScalaJS applications, use this dependency instead:

libraryDependencies += "com.lihaoyi" %%% "upickle" % "0.4.3"

Other than that, everything is used the same way. Note that uPickle 0.2.8 and later are only compatible with ScalaJS 0.6.x.

Scala 2.10

If you are using Scala 2.10 (instead of 2.11) be sure to add this dependency as well:

libraryDependencies += "org.scalamacros" %% "quasiquotes" % "2.0.0" % "provided"

This applies both for ScalaJVM and ScalaJS.

Supported Types

Out of the box, uPickle supports writing and reading the following types:

Readability/writability is recursive: a container such as a Tuple or case class is only readable if all its contents are readable, and only writable if all its contents are writable. That means that you cannot serialize a List[Any], since uPickle doesn't provide a generic way of serializing Any. Case classes are only serializable up to 22 fields.
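The recursive rule falls out of typeclass resolution. Here is a minimal model of it in plain Scala (the `W` trait and `write` helper are made-up stand-ins, not uPickle's real API), showing why `Seq[Int]` is writable but `Seq[Any]` cannot be:

```scala
// Sketch of typeclass-based writer derivation: a writer for Seq[T]
// exists only if a writer for T exists, so Seq[Any] has no writer.
trait W[T] { def write(t: T): String }

implicit val intW: W[Int] = new W[Int] { def write(t: Int) = t.toString }

implicit def seqW[T](implicit w: W[T]): W[Seq[T]] = new W[Seq[T]] {
  def write(ts: Seq[T]) = ts.map(w.write).mkString("[", ",", "]")
}

def write[T](t: T)(implicit w: W[T]): String = w.write(t)

assert(write(Seq(1, 2, 3)) == "[1,2,3]")
// write(Seq[Any](1, "x"))  // does not compile: no implicit W[Any]
```

The failure happens at compile time, which is why unsupported types are a compile error in uPickle rather than a runtime exception.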

Case classes are serialized using the apply and unapply methods on their companion objects. This means that you can make your own classes serializable by giving them companions apply and unapply. sealed hierarchies are serialized as tagged unions: whatever the serialization of the actual object, together with the fully-qualified name of its class, so the correct class in the sealed hierarchy can be reconstituted later.
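As a sketch of the apply/unapply requirement in plain Scala (`Wrapper` is a made-up class, not part of uPickle): giving a non-case class a case-class-shaped companion provides all the structure the derivation relies on:

```scala
// A plain class made "case-class-shaped": the companion supplies the
// apply/unapply pair that uPickle uses to take the class apart and
// put it back together.
class Wrapper(val i: Int, val s: String)
object Wrapper {
  def apply(i: Int, s: String): Wrapper = new Wrapper(i, s)
  def unapply(w: Wrapper): Option[(Int, String)] = Some((w.i, w.s))
}

// unapply drives pattern matching, which is how the fields get extracted
val fields = Wrapper(3, "x") match { case Wrapper(a, b) => (a, b) }
assert(fields == (3, "x"))
```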

That concludes the list of supported types. Anything else is not supported.

Default Picklers

This is a non-comprehensive list of what the most commonly-used types pickle to using uPickle. To begin, let's import upickle:

import upickle.default._

Booleans are serialized as JSON booleans

write(true: Boolean)              ==> "true"
write(false: Boolean)             ==> "false"

Numbers are serialized as JSON numbers

write(12: Int)                    ==> "12"
write(12: Short)                  ==> "12"
write(12: Byte)                   ==> "12"
write(12.5f: Float)               ==> "12.5"
write(12.5: Double)               ==> "12.5"

Except for Longs, which are too large for Javascript. These are serialized as JSON strings, keeping the interchange format compatible with the browser's own JSON parser, which provides the best performance in Scala.js

write(12: Long)                   ==> "\"12\""
write(4000000000000L: Long)       ==> "\"4000000000000\""
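The reason for the string encoding can be demonstrated without any JavaScript: a JSON number is an IEEE-754 double in the browser, and doubles carry only 53 bits of integer precision:

```scala
// A Long above 2^53 cannot survive a trip through a Double, which is
// what the browser's JSON.parse would hand back for a bare number.
val big = (1L << 60) + 1
assert(big.toDouble.toLong != big)   // precision lost in the Double
assert(big.toString.toLong == big)   // the string round-trip is exact
```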

Special values of Doubles and Floats are also serialized as Strings

write(1.0/0: Double)              ==> "\"Infinity\""
write(Float.PositiveInfinity)     ==> "\"Infinity\""
write(Float.NegativeInfinity)     ==> "\"-Infinity\""

Both Chars and Strings are serialized as Strings

write('o')                        ==> "\"o\""
write("omg")                      ==> "\"omg\""

Arrays and most immutable collections are serialized as JSON lists

write(Array(1, 2, 3))             ==> "[1,2,3]"

// You can pass in an `indent` parameter to format it nicely
write(Array(1, 2, 3), indent = 4)  ==>
  """[
    |    1,
    |    2,
    |    3
    |]""".stripMargin

write(Seq(1, 2, 3))               ==> "[1,2,3]"
write(Vector(1, 2, 3))            ==> "[1,2,3]"
write(List(1, 2, 3))              ==> "[1,2,3]"
import collection.immutable.SortedSet
write(SortedSet(1, 2, 3))         ==> "[1,2,3]"

Options are serialized as JSON lists with 0 or 1 element

write(Some(1))                    ==> "[1]"
write(None)                       ==> "[]"

Tuples of all sizes (1-22) are serialized as heterogeneous JSON lists

write((1, "omg"))                 ==> """[1,"omg"]"""
write((1, "omg", true))           ==> """[1,"omg",true]"""

Case classes of sizes 1-22 are serialized as JSON dictionaries with the keys being the names of each field

case class Thing(myFieldA: Int, myFieldB: String)
case class Big(i: Int, b: Boolean, str: String, c: Char, t: Thing)
import upickle._
write(Thing(1, "gg"))             ==> """{"myFieldA":1,"myFieldB":"gg"}"""
write(Big(1, true, "lol", 'Z', Thing(7, ""))) ==>
  """{"i":1,"b":true,"str":"lol","c":"Z","t":{"myFieldA":7,"myFieldB":""}}"""

write(Big(1, true, "lol", 'Z', Thing(7, "")), indent = 4) ==>
  """{
    |    "i": 1,
    |    "b": true,
    |    "str": "lol",
    |    "c": "Z",
    |    "t": {
    |        "myFieldA": 7,
    |        "myFieldB": ""
    |    }
    |}""".stripMargin

Sealed hierarchies are serialized as tagged values, the serialized object being tagged with the fully-qualified name of the instance's class:

sealed trait IntOrTuple
case class IntThing(i: Int) extends IntOrTuple
case class TupleThing(name: String, t: (Int, Int)) extends IntOrTuple
write(IntThing(1)) ==> """{"$type":"example.Sealed.IntThing","i":1}"""

write(TupleThing("naeem", (1, 2))) ==>
  """{"$type":"example.Sealed.TupleThing","name":"naeem","t":[1,2]}"""

// You can read a tagged value without knowing its
// type in advance: just use the type of the sealed trait
read[IntOrTuple]("""{"$type":"example.Sealed.IntThing","i": 1}""") ==> IntThing(1)

Serializability is recursive; you can serialize a type only if all its members are serializable. That means that collections, tuples and case-classes made only of serializable members are themselves serializable

case class Foo(i: Int)
case class Bar(name: String, foos: Seq[Foo])
write((((1, 2), (3, 4)), ((5, 6), (7, 8)))) ==>
  """[[[1,2],[3,4]],[[5,6],[7,8]]]"""

write(Seq(Thing(1, "g"), Thing(2, "k"))) ==>
  """[{"myFieldA":1,"myFieldB":"g"},{"myFieldA":2,"myFieldB":"k"}]"""

write(Bar("bearrr", Seq(Foo(1), Foo(2), Foo(3)))) ==>
  """{"name":"bearrr","foos":[{"i":1},{"i":2},{"i":3}]}"""

Nulls serialize into JSON nulls, as you would expect

write(Bar(null, Seq(Foo(1), null, Foo(3)))) ==>
  """{"name":null,"foos":[{"i":1},null,{"i":3}]}"""

uPickle only throws exceptions on unpickling; if a pickler is properly defined, serializing a data structure to a String should never throw an exception.

On unpickling, uPickle throws one of two subclasses of upickle.Invalid: upickle.Invalid.Json if the incoming string is not valid JSON, or upickle.Invalid.Data if the JSON is valid but does not match the structure of the target type.

Manual Sealed Trait Picklers

Due to a bug in the Scala compiler, SI-7046, automatic sealed trait pickling can fail unpredictably. This can be worked around by instead using the macroRW and merge methods to manually specify which subtypes of a sealed trait to consider when pickling:

sealed trait TypedFoo
object TypedFoo{
  import upickle.default._
  implicit val readWriter: ReadWriter[TypedFoo] =
    macroRW[Bar] merge macroRW[Baz] merge macroRW[Quz]

  case class Bar(i: Int) extends TypedFoo
  case class Baz(s: String) extends TypedFoo
  case class Quz(b: Boolean) extends TypedFoo
}

By placing it in the companion object, we ensure that the pickler is cached and always used by default, even if not explicitly imported. From there you can use it manually:

  assert(implicitly[upickle.default.Reader[TypedFoo]] eq TypedFoo.readWriter)
  assert(implicitly[upickle.default.Writer[TypedFoo]] eq TypedFoo.readWriter)
  assert(implicitly[upickle.default.ReadWriter[TypedFoo]] eq TypedFoo.readWriter)

  rw(TypedFoo.Bar(1): TypedFoo, """{"$type": "upickle.TypedFoo.Bar", "i": 1}""")
  rw(TypedFoo.Baz("lol"): TypedFoo, """{"$type": "upickle.TypedFoo.Baz", "s": "lol"}""")
  rw(TypedFoo.Quz(true): TypedFoo, """{"$type": "upickle.TypedFoo.Quz", "b": true}""")


If a field is missing upon deserialization, uPickle uses the default value if one exists

case class FooDefault(i: Int = 10, s: String = "lol")

read[FooDefault]("{}")                ==> FooDefault(10, "lol")
read[FooDefault]("""{"i": 123}""")    ==> FooDefault(123, "lol")

If a field at serialization time has the same value as the default, uPickle leaves it out of the serialized blob

  write(FooDefault(i = 11, s = "lol"))  ==> """{"i":11}"""
  write(FooDefault(i = 10, s = "lol"))  ==> """{}"""
  write(FooDefault())                   ==> """{}"""

This allows you to make schema changes gradually, assuming you have already pickled some data and want to add new fields to the case classes you pickled. Simply give the new fields a default value (e.g. "" for Strings, or wrap it in an Option[T] and make the default None) and uPickle will happily read the old data, filling in the missing field using the default value.
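The mechanism here is just Scala's default arguments. A sketch of the schema change described above, in plain Scala (`Message` and `tags` are made-up names):

```scala
// v1 of the schema had only `text`; v2 adds `tags` with a default of
// None. Reading old v1 data is equivalent to calling the constructor
// without the new field, so the default fills the gap.
case class Message(text: String, tags: Option[Seq[String]] = None)

val fromOldData = Message(text = "hi")   // as if read from v1 JSON
assert(fromOldData.tags == None)

val fromNewData = Message("hi", Some(Seq("greeting")))
assert(fromNewData.tags == Some(Seq("greeting")))
```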

Custom Keys

uPickle allows you to specify the key that a field is serialized with via a @key annotation

import derive.key
case class KeyBar(@key("hehehe") kekeke: Int)
write(KeyBar(10))                     ==> """{"hehehe":10}"""
read[KeyBar]("""{"hehehe": 10}""")    ==> KeyBar(10)

Practically, this is useful if you want to rename a field within your Scala code while still maintaining backwards compatibility with previously-pickled objects. Simply rename the field and add a @key("...") annotation with the old name so uPickle can continue to work with the old objects correctly.

You can also use @key to change the name used when pickling the case class itself. Normally case classes are pickled without their name, but an exception is made for members of sealed hierarchies which are tagged with their fully-qualified name. uPickle allows you to use @key to override what the class is tagged with:

import derive.key
sealed trait A
@key("Bee") case class B(i: Int) extends A
case object C extends A
write(B(10))                          ==> """{"$type":"Bee","i":10}"""
read[B]("""{"$type":"Bee","i":10}""") ==> B(10)

This is useful in cases where:

Custom Picklers

Apart from customizing the keys used to store the fields of a class, uPickle also allows you to completely replace the default Reader/Writer used for that class. For classes you control, you need to provide an implicit Reader/Writer pair in the companion object:

import upickle.Js
class CustomThing2(val i: Int, val s: String)
object CustomThing2{
  implicit val thing2Writer = upickle.default.Writer[CustomThing2]{
    case t => Js.Str(t.i + " " + t.s)
  }
  implicit val thing2Reader = upickle.default.Reader[CustomThing2]{
    case Js.Str(str) =>
      val Array(i, s) = str.split(" ")
      new CustomThing2(i.toInt, s)
  }
}

In this example, instead of pickling to a normal Js.Obj, we pickle to a Js.Str, storing both i and s as part of that single string.

Note that when writing custom picklers, it is entirely up to you to get it right, e.g. making sure that an object that gets round-trip pickled/unpickled comes out the same as when it started.
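One way to convince yourself the format above round-trips is to check it in plain Scala. Here `encode` and `decode` are standalone stand-ins for the Writer and Reader bodies, not uPickle API:

```scala
// Stand-ins for the custom Writer/Reader bodies: pack (i, s) into one
// space-separated string and split it back apart.
def encode(i: Int, s: String): String = i + " " + s

def decode(str: String): (Int, String) = {
  val Array(i, s) = str.split(" ")
  (i.toInt, s)
}

assert(decode(encode(42, "hello")) == (42, "hello"))
// Caveat: if `s` itself contained a space, split(" ") would yield three
// tokens and the Array(i, s) pattern would throw a MatchError. Edge
// cases like this are exactly what you own with custom picklers.
```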

Custom Configuration

Often, there will be times when you want to customize something on a project-wide level. uPickle provides hooks for this: you can subclass the upickle.Api trait to create your own bundles apart from the built-in upickle.default and upickle.legacy. You have multiple levels of possible customization:

If you are using uPickle to convert JSON from another source into Scala data structures, you may find the following encoding of Option[T] more convenient than the defaults:

object OptionPickler extends upickle.AttributeTagged {
  override implicit def OptionW[T: Writer]: Writer[Option[T]] = Writer {
    case None    => Js.Null
    case Some(s) => implicitly[Writer[T]].write(s)
  }
  override implicit def OptionR[T: Reader]: Reader[Option[T]] = Reader {
    case Js.Null     => None
    case v: Js.Value => Some(implicitly[Reader[T]].read.apply(v))
  }
}

This custom configuration allows you to treat nulls as Nones and anything else as Some(...)s. Simply import OptionPickler._ instead of the normal uPickle import throughout your project and you'll have the customized reading/writing available to you.
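The wire-level mapping this configuration implements can be modeled without uPickle at all (`fromWire`/`toWire` are illustrative names, not part of the library):

```scala
// null on the wire becomes None; any other value becomes Some(...).
// Option(x) already encodes exactly this rule in the standard library.
def fromWire[T](v: T): Option[T] = Option(v)

def toWire[T](o: Option[T]): T = o.getOrElse(null.asInstanceOf[T])

assert(fromWire("x") == Some("x"))
assert(fromWire(null: String) == None)
assert(toWire(None: Option[String]) == null)
```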


uPickle is a work in progress, and doesn't currently support:

Most of these limitations are inherent in the fact that ScalaJS does not support reflection, and are unlikely to ever go away. In general, uPickle is designed to serialize statically-typed, tree-shaped, immutable data structures. Anything more complex is out of scope.


Although uPickle's object reading/writing API does not expose you to it, under the hood it uses a nice JSON serialization format. Despite being less compact than binary formats, JSON allows very fast serializing and deserializing from Strings on both Scala-JVM (which has other alternatives) and ScalaJS, where JSON is really your only choice. The JSON API is minimal but nonetheless very convenient, and can be used directly.

uPickle bundles two very-fast JSON parsers, which it uses for parsing strings into structured-trees, before then marshalling them into typed objects.

That makes uPickle's JSON library competitive with the highest performance JSON libraries both on the JVM (GSON, Jackson, etc.) as well as in Javascript.

uPickle's JSON API is exposed in two places: in our upickle.Js.* AST:

object Js {

  sealed trait Value extends Any {
    def value: Any

    /**
      * Returns the `String` value of this [[Js.Value]], fails if it is not
      * a [[Js.Str]]
      */
    def str = this match{
      case Str(value) => value
      case _ => throw Invalid.Data(this, "Expected Js.Str")
    }
    /**
      * Returns the key/value map of this [[Js.Value]], fails if it is not
      * a [[Js.Obj]]
      */
    def obj = this match{
      case Obj(value @ _*) => value.toMap
      case _ => throw Invalid.Data(this, "Expected Js.Obj")
    }
    /**
      * Returns the elements of this [[Js.Value]], fails if it is not
      * a [[Js.Arr]]
      */
    def arr = this match{
      case Arr(value @ _*) => value
      case _ => throw Invalid.Data(this, "Expected Js.Arr")
    }
    /**
      * Returns the `Double` value of this [[Js.Value]], fails if it is not
      * a [[Js.Num]]
      */
    def num = this match{
      case Num(value) => value
      case _ => throw Invalid.Data(this, "Expected Js.Num")
    }
    /**
      * Looks up the [[Js.Value]] as a [[Js.Arr]] using an index, throws
      * otherwise if it's not a [[Js.Arr]]
      */
    def apply(i: Int): Value = this.arr(i)
    /**
      * Looks up the [[Js.Value]] as a [[Js.Obj]] using a key, throws
      * otherwise if it's not a [[Js.Obj]]
      */
    def apply(s: java.lang.String): Value = this.obj(s)
  }
  case class Str(value: java.lang.String) extends AnyVal with Value
  case class Obj(value: (java.lang.String, Value)*) extends AnyVal with Value
  case class Arr(value: Value*) extends AnyVal with Value
  case class Num(value: Double) extends AnyVal with Value
  case object False extends Value{
    def value = false
  }
  case object True extends Value{
    def value = true
  }
  case object Null extends Value{
    def value = null
  }
}

As well as in the upickle.json.read and upickle.json.write functions:

def read(s: String): Js.Value
def write(v: Js.Value): String

Which you use to convert between structured Js.* trees and unstructured Strings. As described earlier, the implementation of these functions differs between ScalaJVM/ScalaJS.
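To see how the apply lookups compose, here is a self-contained miniature of the AST above (re-declared locally with short made-up names so the snippet runs without uPickle on the classpath):

```scala
// Miniature of the Js AST: just enough structure to show that the two
// apply overloads let you navigate nested objects and arrays.
sealed trait V {
  def apply(i: Int): V = this.asInstanceOf[A].xs(i)
  def apply(k: String): V = this.asInstanceOf[O].kvs.find(_._1 == k).get._2
}
case class S(v: String) extends V
case class N(v: Double) extends V
case class A(xs: V*) extends V
case class O(kvs: (String, V)*) extends V

val tree = O("name" -> S("demo"), "xs" -> A(N(1), N(2)))
assert(tree("xs")(1) == N(2))    // object lookup, then array index
assert(tree("name") == S("demo"))
```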

You can use this JSON data structure for simple tasks:

json.read("[1]").arr        ==> Seq(Js.Num(1))
json.read("1").num          ==> 1
json.read("\"1\"").str      ==> "1"
json.read("{\"1\": 1}").obj ==> Map("1" -> Js.Num(1))
Or round-trip a larger document (here parsed is a Js.Value previously read from a standard JSON test file):

val unparsed = json.write(parsed)
val reparsed = json.read(unparsed)
for (json <- Seq(parsed, reparsed)){
  assert(
    json(0).value == "JSON Test Pattern pass1",
    json(8)("real").value == -9876.54321,
    json(8)("comment").value == "// /* <!-- --",
    json(8)("jsontext").value == "{\"object with 1 member\":[\"array with 1 element\"]}",
    json(19).value == "rosebud"
  )
}
(parsed(19), reparsed(19))

uPickle does not provide the other JSON utilities that some libraries do (zippers, lenses, combinators, ...). If you're looking for a compact JSON AST that you can construct or pattern match on, together with fast serializing and deserializing, it may do the trick.

uPickle's Js.Value type and subclasses are all themselves serializable, and they serialize to themselves.

Caching Picklers

Synthesizing case class picklers at every callsite where you're pickling something can get expensive: at compile time the compiler is deriving the correct combination of readers and writers over and over at every callsite, and at runtime the JVM is instantiating that same combination over and over.

To speed things up both at compile time and runtime, you can pre-generate picklers inside the case classes' companion objects. This should greatly speed up compilation in large codebases with lots of different case classes being pickled, and improve runtime performance. Simply add a macroRW implicit to your case class's companion object:

case class CachedCaseClass(b: String, a: Double)
object CachedCaseClass {
  implicit val pkl = upickle.default.macroRW[CachedCaseClass]
}

And then the next time you read and write the case class, things should work as expected:

import upickle.default.{read, write}
val res = read[CachedCaseClass](write(CachedCaseClass("aaa", 42.0)))
assert(res == CachedCaseClass("aaa", 42.0))

Except that the Reader and Writer are simply taken from the companion object each time, instead of being synthesized again and again at every callsite:

import upickle.default.{Reader, Writer}
// Each time you ask for an implicit reader or writer, it's the same one
assert(implicitly[Reader[CachedCaseClass]] eq implicitly[Reader[CachedCaseClass]])
assert(implicitly[Writer[CachedCaseClass]] eq implicitly[Writer[CachedCaseClass]])

Why uPickle

I wrote uPickle because I needed a transparent serialization library that worked in both Scala-JVM and Scala-JS, and I was dissatisfied with existing solutions:

uPickle on the other hand aims much lower: by limiting the scope of the problem to statically-typed, tree-like, immutable data structures, it greatly simplifies both the internal implementation and the external API and behavior of the library. uPickle serializes objects using a very simple set of rules ("Does it have an implicit? Is it a class with apply/unapply on the companion?") that makes its behavior predictable and simple to understand.

Version History