It's the de facto standard for publishing a Scala library/framework:
the artifact id is expected to include the Scala binary version the
artifact is compatible with. That makes it transparent to users what is
expected and supported, without digging through the POM.
Here, this is a point in favor of a naming convention (not cross-compiling).
The downside is that upgrading will be more complex for Maven or Gradle
users, as they will have to update not only the version but also the
artifactId.
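To make the convention concrete, here is a small sbt sketch. The coordinates use Gatling's own group and artifact ids as an example, and the version numbers are illustrative; sbt's `%%` operator appends the Scala binary suffix automatically, while Maven and Gradle users spell it out (which is exactly the artifactId they must update on a Scala upgrade):

```scala
// sbt build definition sketch (version numbers are illustrative).
scalaVersion := "2.11.7"

// Explicit artifact id with the Scala binary version suffix,
// as a Maven or Gradle user would reference it:
libraryDependencies += "io.gatling" % "gatling-core_2.11" % "2.1.7"

// Equivalent declaration: sbt's %% appends the _2.11 suffix
// derived from scalaVersion for you:
libraryDependencies += "io.gatling" %% "gatling-core" % "2.1.7"
```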
Any software, regardless of the language it's written in, has inherent
inertia. Upgrading the dependencies/stack of any project, Scala projects
included, is a process, and one is lucky if migration is only a technical
issue. Scala code has one more downside compared to Java code: the Scala
binary version compatibility requirement. To ease that burden and give
more breathing room for migration to take place, Scala libraries are
typically published for multiple Scala binary versions. This broadens
their reach and eases adoption and maintenance for users.
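A minimal sbt sketch of such cross-publishing (the library name and version numbers are made up for illustration):

```scala
// sbt build definition sketch: publish the same library for
// several Scala binary versions in one go.
name := "mylib"

crossScalaVersions := Seq("2.10.5", "2.11.7")

// Running `sbt +publish` then builds and publishes both
// mylib_2.10 and mylib_2.11 artifacts.
```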
We all know how organizations work. The only thing that makes them move is
a deadline / end of life. See how things work with Windows and the JDK...
The more time you give them, the slower they move.
IMHO, organizations shouldn't be picking Scala as their language of choice
if they're not willing to adapt to (and benefit from) the innovation pace.
But that's just a personal opinion.
Scala 2.11 has been out for one year.
Gatling 2.0 targeted 2.10 and, IIRC, we moved to 2.11 with 2.1.
This looks to me like a decent time window.
I'm a newbie to Gatling. If I understood correctly, before/after hooks are
there for initialization/cleanup. I found one thread on the Gatling mailing
list (mentioning there are many more on the same topic) where Gatling users
are advised not to use those hooks, but to write initialization/cleanup
code in separate scripts and use other tools to orchestrate the execution
of everything. IMO it would then be fairer to completely remove support
for before/after hooks from the Gatling APIs.
So now you're talking about breaking compatibility?!
People have been asking for them for a loooooong time, so we finally added
them. We can't give and then take back.
But while the hooks are there, I prefer to use them, since they offer an
easier and typesafe means of sharing state between initialization/cleanup
code and the actual simulation(s). Btw, I've seen Gatling simulations being
used to initialize/clean up other Gatling simulations; I'm interested in
what you think about that approach. Personally, I don't like it.
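To illustrate the typesafety argument, here is a simplified, self-contained sketch. This is not the actual Gatling API: `HookDemo`, `TestAccount`, and the method names are all made up. It only models why sharing typed state between hooks and the simulation body beats handing data over via files or environment variables from an external script:

```scala
// Simplified model of sharing typed state between before/after hooks
// and the simulation body; NOT Gatling's real Simulation class.
object HookDemo {
  // Hypothetical typed state created by the before hook.
  final case class TestAccount(id: Long, token: String)

  var account: Option[TestAccount] = None

  // What a before hook would do: set up state, fully typed.
  def before(): Unit = account = Some(TestAccount(42L, "secret"))

  // What an after hook would do: clean up.
  def after(): Unit = account = None

  // The simulation body can read the state directly, with
  // compile-time checking, instead of parsing hand-over files
  // produced by an external orchestration script.
  def simulation(): String = {
    val acc = account.getOrElse(sys.error("before() not run"))
    s"running as ${acc.id}"
  }
}
```

With external orchestration, any typo in the hand-over format only surfaces at runtime; here the compiler checks the shared state.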
I'm personally more in favor of database restore scripts. For example, if
you massively insert into and then delete from an RDBMS through a REST API
without any further care, you might end up with irrelevant statistics, and
your explain plan might be completely wrong. This kind of issue happens
with other databases too; e.g. with Cassandra, you'd get tons of tombstones.
Anyway, since Gatling is 2.11-only now, the code I can write in before/after
hooks is limited: I can only use libraries that are Scala 2.11 compatible.
Unfortunately, at the moment I need to use a client library whose only
2.10 variant is the one compatible with the server behind the service I'm
testing. There is a new 2.11 (and 2.10) client, but it requires the
just-released new server as well, and as stated previously, an upgrade is
a process.
You could maybe use Gatling 2.0 as a temporary solution. Or don't use the
hooks at all.
Then, as I said in my previous email, we're very compelled to drop JDK7
compatibility in the not-so-far future, as other libraries we already use
or plan to use are making the same move.