Scalar Conf 2017 – A quick visit to Warsaw


And so I got a last-minute opportunity to attend Scalar Conf 2017. It was the first time I attended this event, and also the first time I went to Warsaw – which is where the event took place.

All in all, the event was good. I went with a few other Zalandos, and we had a booth there. One of the most interesting parts overall was the conversations I had with the many people who visited us. Among the topics discussed were our Tech Radar (which we were displaying at our booth), curiosity about the Eff monad, and what it is like to work at Zalando.

I did not watch all the talks since, as hinted above, the discussions were the most interesting part of the conference for me, and I wanted to spend time at our booth. I did nonetheless watch a few of them, which I’ll discuss a bit below.

The first talk I watched was the first one of the conference: Dave Gurnell’s Adventures in Meta Programming. Metaprogramming is a topic he seems to have been diving deep into over the last few years, as can be seen from other conferences where he talked about Shapeless, for example. This time the focus was not only Shapeless but also macros, and when to use one instead of the other. I especially liked the terms “typey stuff” and “syntaxy stuff” that he coined to point out which tool fits which kind of scenario. In summary: use macros for the syntaxy things, and Shapeless for the typey things.


Next there was a talk about type classes from Andrea Lattuada: Typeclasses, a Typesystem Construct. I watched just part of it, since it was targeted at beginners. If you are curious, the examples moved back and forth between Scala and Haskell – and if this topic is new to you, you should probably watch the video.
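For the record, the core idea can be shown in a few lines of Scala (the `Show` name and the instances below are my own illustration, not taken from the talk):

```scala
// A type class is a trait parameterized on a type...
trait Show[A] {
  def show(a: A): String
}

// ...with instances provided as implicit values per type
implicit val intShow: Show[Int] = new Show[Int] {
  def show(a: Int): String = s"Int($a)"
}

// Instances can be derived from other instances
implicit def listShow[A](implicit s: Show[A]): Show[List[A]] = new Show[List[A]] {
  def show(as: List[A]): String = as.map(s.show).mkString("[", ", ", "]")
}

// Functions then require an instance, without the type itself knowing about Show
def display[A](a: A)(implicit s: Show[A]): String = s.show(a)

println(display(42))          // Int(42)
println(display(List(1, 2))) // [Int(1), Int(2)]
```

The point the talk makes is that this pattern is a library-level encoding in Scala, while in Haskell it is a first-class language construct.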

Another interesting talk I watched was from John de Goes: Quark: A Purely-Functional Scala DSL for Data Processing & Analytics. The main message of this talk was that functional programming is a better way to deal with data analytics in general, in contrast with the way Apache Spark does things, which causes lots of problems – even though it is productive to start with. Instead of having computational lambdas, we should decouple the description of a computation from its actual execution. A nice quote from him in this direction: “We are lazy functional programmers”. If you are curious about what he is doing there, other than watching the video of his talk you can also check the project’s GitHub page.
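The idea of decoupling description from execution can be sketched in a few lines of plain Scala (this is my own toy illustration, not Quark’s actual API):

```scala
// The program is a *description*: a plain data structure with no side effects
sealed trait Calc
final case class Lit(n: Int) extends Calc
final case class Add(a: Calc, b: Calc) extends Calc

// Executing is a separate step, done by an interpreter. This is also where you
// could optimize the description or target a different backend before running.
def run(c: Calc): Int = c match {
  case Lit(n)    => n
  case Add(a, b) => run(a) + run(b)
}

val program = Add(Lit(1), Add(Lit(2), Lit(3)))
println(run(program)) // 6
```

Nothing happens when `program` is built; the work only occurs when an interpreter is applied to it – the “lazy functional programmers” attitude from the quote.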


The conference also featured an interesting talk about monad transformers from Gabriele Petronella: Practical Monad Transformers. It was very interesting for anyone who wants to understand what such tools are for and what problems come with using them. A highlight of this talk came towards the end, when he talked about alternative tools and mentioned the Eff monad – a framework that is quite new, but one we are already using in my team at Zalando. Perhaps this also explains why so many people came to the Zalando booth curious to ask about Eff.


There were other interesting talks, but the last one I want to mention is Gatling Distilled, from Andrzej Ludwikowski. I had been wanting to have a look at Gatling for a while, so this talk came in handy. He introduced what Gatling is and gave a few tips. A few takeaways for me:

  • Gatling can also be used for integration tests;
  • it has a nice DSL;
  • you shouldn’t use the recorder;
  • assertions can include response time limits;
  • several different data sources can be used;
  • remember to turn on logging, to understand what is going on.
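To give an idea of what the DSL looks like, here is a rough sketch of a minimal simulation with a response-time assertion. This is my own illustration, not from the talk: the class name, URL and numbers are made up, and the exact assertion API varies between Gatling versions (this follows the 2.x style).

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

// A hypothetical smoke-test simulation
class SmokeSimulation extends Simulation {
  val httpProtocol = http.baseURL("https://example.com")

  val scn = scenario("smoke")
    .exec(http("home").get("/"))

  // As mentioned above, assertions can include response time limits
  setUp(scn.inject(atOnceUsers(10)))
    .protocols(httpProtocol)
    .assertions(global.responseTime.max.lessThan(800))
}
```

Running this through sbt with the Gatling plugin produces a report and fails the build if the assertion does not hold.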


During the whole conference, the organizers had a couple of flip charts up with questions such as which frameworks people use for persistence, among other things. If you are curious, there is a post on the conference blog where they go through all the questions asked and the results.


In summary, the conference was interesting and certainly worth the time. The downside for me was that there were too many Akka-related talks, and as you can see from my selection in this post, I’m not exactly interested in seeing too many things in that direction. I understand that Akka is important and cannot be left aside, but five talks about it, plus a couple of others that were also indirectly related, was a bit too much.

That being said, I hope I’m able to attend Scalar Conf again next year. If you want to see a bit of how it was, I uploaded my pictures to Flickr. And you can check all the conference talks in this YouTube playlist.

Posted in scala

Six things I learned at Scala eXchange 2016


In December 2016 I attended Scala eXchange 2016 – a traditional and quite interesting Scala conference that takes place every year in London, UK. It was my 4th time attending, and like every previous one, it was well worth the effort. In this post, we will take a pragmatic look at six things I learned during the conference. We won’t talk about the beer, though – you should go there next time if you want in on that 😉

By the way, all talks are available on the conference web site linked above, so be sure to check it out. After you are done with this post, of course!

Number 1: Compilation time is still an issue, but it is being tackled

Compilation time was the subject of at least three talks during the event. The first of these was the keynote by Adriaan Moors, lead of the compiler team at Lightbend, where he made it clear that they plan to spend half of the team’s effort during 2017 on improving the situation.

Next up was the controversial “Compilation time: a bigger hammer”, from Iulian Dragos and Mirko Dotta of the recently created company Triplequote. They presented the tool they are building to speed up compilation, based on parallelizing over compilation units. The whole topic was a bit controversial because the tool is a commercial effort. On the one hand, the compiler should be fast by itself, without the support of external tools. On the other hand, the Triplequote developers are investing their own resources, so it is only fair that they get something out of the effort.

Finally, there was the talk “Can scalac be 10x faster”, from Rory Graves. This one was very interesting and full of tips you can apply right now to improve your projects’ compilation times. We actually tried a couple of them in our project and got some improvements. We specifically changed two things:

  • replaced several wildcard imports (._) with specific imports;
  • split traits and classes that shared the same .scala source file into separate files.

The first change helped a lot with the total compilation time (after a clean). This is most likely because, with more specific imports, there are fewer places for the compiler to look for implicit conversions – and we use implicit conversions quite a lot.
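As an illustration of that first change (the collection names here stand in for whatever your project actually imports), a wildcard import becomes a selective one:

```scala
// Before – a wildcard import pulls everything in the package into scope,
// including any implicits living there:
// import scala.collection.mutable._

// After – specific imports narrow what the compiler has to search through
// during implicit resolution:
import scala.collection.mutable.{ArrayBuffer, ListBuffer}

val numbers = ArrayBuffer(1, 2, 3)
val words   = ListBuffer("a", "b")
println(numbers.sum) // 6
```

The code behaves identically; only the set of names (and implicits) in scope shrinks.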

The second change was more useful in reducing incremental compilation time – i.e., it lowered the chances of a given file being invalidated and having to be recompiled.

In terms of compilation time, there was a third point we looked at briefly: macros. We wanted to start using a certain macro to reduce code repetition, but gave up because it increased compilation times quite a lot. The best tip we can give here is to always pay attention to your build times when you decide to try new features. In our case, the extra compilation time seemed to come from a combination of using macros and using Shapeless in the macro-generated code. Unfortunately, we never had time to properly isolate and fix the problem.

In all cases the plan was to share some numbers, but since, as mentioned above, we didn’t have a properly isolated test scenario, the numbers would be a bit biased. So I’ll just leave the message: do try those tips yourself and see if they change anything for your own projects.

Number 2: Scala compiler fork: a positive thing to have!

You probably already know that there is a group called Typelevel, and that they forked the Scala compiler. When this happened, there was some commotion in the community. During Scala eXchange 2016 we had an interesting talk from Miles Sabin about all of this, and about what Typelevel has been up to, especially in 2016.

The biggest takeaway here is that the Typelevel Scala compiler fork is a very positive thing, and has made a positive impact on the Scala community. For example, a couple of bug fixes for the Scala compiler are available in the Typelevel version. Moreover, you can use those fixes with the standard Scala compiler via a compiler plugin the team also makes available. Finally, such fixes are also being sent back to the Lightbend Scala compiler as pull requests, and some have been accepted!

One such case is the fix for the Scala issue 2712. It was fixed by the Typelevel guys and sent as a Pull Request to the standard compiler.

The fact that the above is possible confirms what Miles said during the presentation: Typelevel has an interesting rule – whatever improvement someone wants to make to the Typelevel Scala compiler must also be submitted as a PR to the standard compiler.

Finally, if you want to use the Typelevel Scala compiler in your projects, it is as simple as adding an sbt plugin to your build, as can be seen here.

Number 3: Shapeless is awesome and doesn’t have to be scary!

I’ll not dive too deep into this topic, but if you have ever heard of Shapeless and felt it is too complex, you have to watch Dave’s talk. It was a very gentle introduction to an awesome library that lets you do some great generic programming.

We use Shapeless in one of our libraries, Grafter, which offers a nice, generic, “new old” way of dealing with constructor-based dependency injection, so this talk has already proven useful in practice.

One note before closing this topic: Shapeless is really useful, but it is intended more as support for library authors than for application developers. So don’t watch the talk and start using it everywhere! That said, knowing how it works will help you deal with libraries built on it, which include the above-mentioned Grafter and the now famous Circe JSON library, among others.

Number 4: The Future of Scala: it is moving forward and will keep doing so

Martin Odersky’s keynote, “From DOT to Dotty”, was an interesting one for anybody who still has doubts about the future of the Scala language. Is it going to keep evolving?

Dotty is the future of Scala. The interesting thing is that this doesn’t mean it will replace Scala, at least not in the short or even medium term. Instead, it is a place where innovation can be tried out without too many restrictions, before changes are implemented in the Scala language itself, where the stakes are much higher. Quoting directly from Martin’s keynote: “The plan is that Dotty should support future iterations of the Scala programming language”.

Dotty might one day become Scala 3, but this is not yet written in stone – right now, some features that work well in Dotty are actually being backported to scalac, confirming Martin’s words again when he said that Dotty is there to “support the next iteration of Scala”.

The diagram below, extracted directly from Martin’s talk, shows the currently planned roadmap – obviously highly subject to change:


A nice example of a feature still to come in the (hopefully) near future is type-safe equality. It is being implemented in Dotty; right now you have to use a library like cats or scalaz to get something similar in Scala, but in the future this will most likely be part of the standard language.
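The library encoding of that idea can be hand-rolled in a few lines (a sketch of the pattern; cats and scalaz ship a polished `Eq` type class with more machinery):

```scala
// An equality type class: instances only exist for types we opt in
trait Eq[A] {
  def eqv(a: A, b: A): Boolean
}

implicit val intEq: Eq[Int] = new Eq[Int] {
  def eqv(a: Int, b: Int): Boolean = a == b
}

// Syntax: === is only available when an Eq instance is in scope,
// and both sides must have the same type A
implicit class EqOps[A](a: A)(implicit e: Eq[A]) {
  def ===(b: A): Boolean = e.eqv(a, b)
}

println(1 === 2) // false
// 1 === "two"   // does not compile: the types don't line up
```

Unlike the universal `==`, comparing an `Int` to a `String` here is a compile-time error rather than a silent `false`.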

Number 5: Zalando growing the Scala community

This is a bit self-serving, but it is nice to see Zalando growing as a company within the Scala community. At Scala eXchange, for example, we were present with six developers, two of whom gave talks at the conference. The first was by Joachim, who spoke about Akka Streams, and the second was by Eric, with a talk about practical Eff. Our participants also represented two different locations: four came from the Berlin office and two from the Dublin office.

You can find Joachim’s talk here and Eric’s here. And we also had a booth. And lots of fun!

Number 6: Scala is open source. And needs us all

Let us close this post with a call to action, mirroring the message from Heather Miller’s keynote. This call to action could be summarized in a single sentence, quoted directly from Heather’s talk: “we don’t want your money, we want your PRs”.

Scala is open source and lives, grows, and evolves with help from the community. Its future is not solely in Lightbend’s hands; to that end, they have even created the Scala Center, an independent, not-for-profit organization focused on the future of Scala, community participation, and the like.

In the context of community initiatives, a lot was said about the SIP process and its improvements. This is another point where Scala’s future seems promising. They are also trying to move community discussions away from mailing lists to a friendlier platform, in the form of a Discourse forum – modern and pretty 🙂

In the end, all of these initiatives aren’t appearing out of thin air. One source she referenced was the book “Social Architecture” – already added to my to-read list.

So, in summary: Scala is still hot, and will be hot for a long time to come! And when this time passes, there will come Dotty!

Edit: I just uploaded some pictures here, if you are interested 😉

Posted in scala

A Scala Book… in Portuguese

It took me a while – a very long while, actually – but it is finally out: I have written a Scala book! The only gotcha is that it is in Portuguese. And as far as I know (I might be wrong, though), it is the first Scala book in this language. You can find it for sale on the Casa do Código page.

The book was written with beginners in mind, i.e. people who have never written any Scala code before. One of the main reasons I decided it was worth writing in Portuguese was the lack of resources for Portuguese speakers starting to learn the Scala language – there are already plenty of resources covering those topics in English.

Writing a book is a huge task. When I started, I never imagined how consuming it could be. But it also ended up being quite fun and full of learning opportunities, so it was well worth the effort. Now it is time to breathe and perhaps consider another book… but not in the short term 😉

Posted in misc, scala

Implicit conversions in Specs2 gone mad


In this resurrection post I want to talk a little bit about a problem we faced recently at CarJump (how I ended up there is a story for another post) with specs2 and Mockito. The issue I want to address is subtle and appeared in a very specific scenario. Let’s start by describing that scenario.

Disclaimer: the scenario below is valid for specs2 version 3.7.x – with specs2 2.x everything was fine. The implicit conversions defined by specs2 changed quite a bit between those two versions.

First, we have a specs2 test specification. Something as simple as the following:

class ImplicitsSpec extends Specification {
  "my spec" should {
    "do something" in {
      1 must beEqualTo(1) // any simple expectation
    }
  }
}

This works just fine – it is just a simple specs2 Specification. Next comes adding Mockito. In specs2, that is nothing more than mixing the Mockito trait into the test suite. Or – and this is where it gets you – into wherever you define your mocks. This is the pattern we started to use recently:

object ImplicitsSpec extends Mockito {
  // my common test vals here
}

Putting that code into words: we create a companion object to hold all the common vals used in the tests. We have actually started doing this kind of thing in lots of places, and this was the first time it caused a problem.

So, how are we to use those common values? Just import the companion object members. Applying this strategy to the spec presented earlier, the result would be something like:

class ImplicitsSpec extends Specification {
  import ImplicitsSpec._

  "my spec" should {
    "do something" in {
      1 must beEqualTo(1)
    }
  }
}

Pretty simple, and nothing can go wrong there, right? Well… wrong. If you try to compile the code above, you will get an error like the following:

/src/test/scala/com/jcranky/specs2/implicits/ImplicitsSpec.scala:11: type mismatch;
[error]  found   : org.specs2.specification.core.Fragment
[error]  required: org.specs2.matcher.Matcher[String]
[error]     "do something" in {

Wait, what?

What happens is that the specs2 Mockito trait brings several other implicit conversions into scope, not only mock-related ones. What’s more, imported conversions take precedence over conversions obtained through inheritance. In our case, this means that whatever comes from

import ImplicitsSpec._

comes before what we get from extending Specification. In practice, we lose a conversion from Fragment to Matcher. I couldn’t find exactly where this conversion is defined, but there are a few workarounds. The first is to change the companion object declaration to:

object ImplicitsSpec extends Specification with Mockito

This brings all the relevant implicit conversions into the same scope. Another common solution is to declare the test specification as below, and remove Mockito from the companion object:

class ImplicitsSpec extends Specification with Mockito

The problem with this solution is that you then cannot create common mock objects in the companion object. There are obviously other solutions, but they usually get more complicated. Still, if you know the root cause of the problem, please leave a comment!

Posted in scala

What is the Best forum software out there?

And I’m back! It has been a long while since I last blogged, so let’s get going right away!

What is the best forum software available out there?

Not the best question to start with, so the best answer is not great either: it depends. It depends on what you want, on what you are looking for. What I want is something that:

  • is simple and easy to manage – I won’t have much time to manage stuff;
  • is free, or at least has good entry-level pricing – I want to create a community, but it has no direct commercial goals;
  • is modern looking and easy to use.

That is perhaps too much to ask, but let’s find out.

The options

Below are the forum systems I found – some I already knew about, others were recommended by friends. The grouping is not random: we will quickly analyse similar systems together.

These are the simpler ones. phpBB is probably the best known of all the software I looked at, and both jforum and Simple Machines seem highly inspired by it. Both phpBB and jforum are open source, but I couldn’t really find that out about Simple Machines.

Those three got ruled out because of my third requirement: they look old. They have very dated UIs, but could be a good option if you are looking for a well-known formula.


Now things get visually much better. All three options above are modern-looking and seem quite interesting. NodeBB and Discourse also bring something new to our comparison: they both offer commercial / hosted solutions, which is great when you don’t have the expertise or time to set up your own installation.

I discarded Mamute because it is not much more than a Stack Overflow clone, and I wanted something more community-focused and less Q&A-focused. Discussion should be fine and encouraged for what I’m looking for.

NodeBB and Discourse were a different matter. I was really close to choosing one of them when I found out about the winner – more on that later. They both have a small problem for me, though: they are based on technologies I’m not totally familiar with, which means I would have to spend some time learning. That would not be a big deal, but the winner is really a killer. Finally, the hosted versions seemed a bit expensive to me.


A fully commercial option. Good-looking and feature-rich, but it lost me on the price point. They base their price on the number of online users – but how am I to know that, considering I’m just starting?

The winner.

Another commercial option. Two features really got me: first, it is embeddable – it can be part of your site, instead of a different or separate thing altogether. Second, it has a great starting price: zero. You can use the free version for as long as you want, and if you decide to pay, the service levels are not based on the number of users you have, which makes everything that much simpler.

Also, the way Muut organizes information is exactly what I wanted: community-focused. And to make things even better, the setup is ridiculously easy: just drop an HTML snippet into your site and you are done. They also have an API and other really cool features!

There are only two features I miss in Muut: sticky posts and locked posts. If the community is well behaved, this shouldn’t be a problem, but we will find out.


What is it that I am doing, you might wonder… Well, I’ve been working on EasyForger for a while now. It is starting to get some users, and I would like a way to connect them. You can see our forums there – only in Portuguese for now, though =].

Posted in misc, web development

An Amazon S3 Script, a Video and a question

Hi guys and gals! It’s been a long time since I last posted something. I’ll talk a bit about the reasons at the end of this post, but first the main course: the S3 script.

I had to write a script to fix some images I have in my small store (Lojinha); the images are all uploaded to Amazon S3. I figured I should do it in Scala, and also that I should create a video explaining how I did it. You can find the script in a gist here.

The video is here and, with it, I bring you a question. The video is in Portuguese – which obviously is a problem if you don’t speak the language. So the question: are you interested in this content in English?

This is an important question, and it goes back to my point in the first paragraph – the reason I’m writing less here: I’m focusing a lot more on my YouTube channels (Dev, Games and Personal). Now, if you are a non-Portuguese speaker and you tell me that you want to be able to understand the videos anyway, this might give me some energy to solve this problem.

The solution I have in mind is not creating whole new videos in English – that would just be too much work. But YouTube supports closed captions – which means I could translate whatever I’m saying in Portuguese and add subtitles to the videos. Would that be compelling for you?

Thanks for sharing your opinions!


Posted in scala

Akka Essentials Book Review

Recently I had the opportunity to read the Akka Essentials book. This post is a brief review of this publication. Hope you enjoy.

The only good thing about this book is that it is short. OK, that’s an exaggeration, but don’t expect too much from it. To keep things in perspective: I read the Kindle edition this last month, so perhaps some of the problems mentioned below are specific to that version.

The first thing I need to say about this book is that, a lot of the time, it has awkward (not to say wrong) English sentences. It could really use some serious revision. Some formatting review would also be good: in the “About the Reviewers” section there is even a whole duplicated paragraph.

The book covers lots of Akka topics – so it might be a good first glance at what the framework can do. Just don’t trust the code samples too much – more on that in a moment. Another point, which might be good for some people and bad for others, is that it presents examples in both Scala and Java.

In the sample code, the author often shows slices of code that he uses to explain something, and then goes on to repeat the entire code again in a single unit. This is another point some people will like and others won’t. I didn’t – it feels like wasting space that could be used to better explain the concepts.

Now to my pet peeve: a sample message, from an early chapter of the book. Try to guess what’s wrong with the code below (extracted from the book) before reading on:

import java.util.List;

public final class MapData {
  private final List<WordCount> dataList;

  public List<WordCount> getDataList() {
    return dataList;
  }

  public MapData(List<WordCount> dataList) {
    this.dataList = dataList;
  }
}

We have a class named MapData, whose objects are going to be used as messages to actors. It has a java.util.List attribute, which is not guaranteed to be immutable, and no defensive copies are made. The problem is that such mutable messages can corrupt the actors, making them unreliable – this is even mentioned in the official Akka documentation (take a look at the “Messages and immutability” section), and is a serious error.

In Scala, writing immutable messages is a lot easier, so at least there the author should have a correct message, right? Unfortunately not. This is his Scala version of the message:

case class MapData(dataList: ArrayBuffer[WordCount])

Oh man… even the Scala version of the message is wrong! ArrayBuffer is a mutable data structure – you really have to go out of your way to make this error. It is normal for people starting out with actors to write this kind of code, but I really expect more from a book trying to TEACH Akka. I wrote a bit about this subject here.
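For reference, here is a sketch of what an immutable version of that message could look like (the WordCount definition is invented here just to keep the snippet self-contained):

```scala
final case class WordCount(word: String, count: Int)

// case class + Scala's immutable List: safe to share between actors,
// since a receiver cannot mutate it
final case class MapData(dataList: List[WordCount])

val msg = MapData(List(WordCount("akka", 2)))

// any "modification" builds a new value instead of mutating the message
val updated = msg.copy(dataList = WordCount("actor", 1) :: msg.dataList)
println(msg.dataList.size)     // 1 – the original message is untouched
println(updated.dataList.size) // 2
```

With messages like this, two actors can hold references to the same value without any possibility of one corrupting the other’s state.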

Another somewhat strange or off-putting thing about the book is that it sometimes feels like the author is just throwing information at you, randomly. Also, the text often says one thing about the code while the code itself differs from what is described – another clear lack of revision. In one example, when talking about routers for the first time, the author says we could pass messages in round-robin fashion using routers (which is correct). Then he goes on to show sample code using a… BroadcastRouter! That kind of router has completely different semantics.

All in all, since I love Akka, I would be happy to recommend this book. The later chapters even got better than the first ones. But I can’t. You are probably better off just reading the official documentation.

Posted in scala