Discussion:
[DISCUSS] How many binary combos do we want to release?
Trevor Grant
2017-07-10 17:30:46 UTC
In 0.13.1 we had one binary tarball.

A full spread would look something like this in 0.13.2:

Spark-1.6, Scala-2.10
Spark-2.0, Scala-2.10
Spark-2.1, Scala-2.10
Spark-1.6, Scala-2.11
Spark-2.0, Scala-2.11
Spark-2.1, Scala-2.11

Spark-1.6, Scala-2.10, viennacl
Spark-2.0, Scala-2.10, viennacl
Spark-2.1, Scala-2.10, viennacl
Spark-1.6, Scala-2.11, viennacl
Spark-2.0, Scala-2.11, viennacl
Spark-2.1, Scala-2.11, viennacl

Spark-1.6, Scala-2.10, viennacl-omp
Spark-2.0, Scala-2.10, viennacl-omp
Spark-2.1, Scala-2.10, viennacl-omp
Spark-1.6, Scala-2.11, viennacl-omp
Spark-2.0, Scala-2.11, viennacl-omp
Spark-2.1, Scala-2.11, viennacl-omp

Spark-1.6, Scala-2.10, viennacl, viennacl-omp
Spark-2.0, Scala-2.10, viennacl, viennacl-omp
Spark-2.1, Scala-2.10, viennacl, viennacl-omp
Spark-1.6, Scala-2.11, viennacl, viennacl-omp
Spark-2.0, Scala-2.11, viennacl, viennacl-omp
Spark-2.1, Scala-2.11, viennacl, viennacl-omp

That's 24 tarballs of pre-compiled binaries.
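
(That 24 is just the full cross product: 3 Spark versions x 2 Scala versions x 4 ViennaCL variants (none, viennacl only, viennacl-omp only, or both). A throwaway Scala sketch of the enumeration, in case it helps anyone sanity-check the count:)

val sparkVersions  = Seq("1.6", "2.0", "2.1")
val scalaVersions  = Seq("2.10", "2.11")
val viennaVariants = Seq(
  Seq.empty[String], Seq("viennacl"), Seq("viennacl-omp"), Seq("viennacl", "viennacl-omp"))

// Full cross product: 3 Spark x 2 Scala x 4 ViennaCL variants = 24 combos.
val combos = for {
  sparkVer <- sparkVersions
  scalaVer <- scalaVersions
  extras   <- viennaVariants
} yield (Seq(s"Spark-$sparkVer", s"Scala-$scalaVer") ++ extras).mkString(", ")

combos.foreach(println)
println(combos.size)   // 24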

The main thing I'm concerned about is getting all combos of Spark/Scala,
viennacl/Scala, and viennacl-omp/Scala into Maven repositories. This can be
accomplished with 6 tarballs (sketch below the list):

Spark-1.6, Scala-2.10, viennacl, viennacl-omp
Spark-2.0, Scala-2.10, viennacl, viennacl-omp
Spark-2.1, Scala-2.10, viennacl, viennacl-omp
Spark-1.6, Scala-2.11, viennacl, viennacl-omp
Spark-2.0, Scala-2.11, viennacl, viennacl-omp
Spark-2.1, Scala-2.11, viennacl, viennacl-omp
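
Purely as a sketch of what cutting those six might look like; the profile and property names used here (viennacl, viennacl-omp, spark.version, scala.compat.version) are assumptions and may not match the actual 0.13.2 POM:

import scala.sys.process._

// Hypothetical driver for the six "everything" builds. Profile and
// property names are assumptions; adjust to what the POM really uses.
val targets = Seq(
  ("1.6", "2.10"), ("2.0", "2.10"), ("2.1", "2.10"),
  ("1.6", "2.11"), ("2.0", "2.11"), ("2.1", "2.11"))

for ((sparkVer, scalaVer) <- targets) {
  val cmd = Seq(
    "mvn", "clean", "package",
    "-Pviennacl,viennacl-omp",            // assumed profile names
    s"-Dspark.version=$sparkVer",         // assumed property names
    s"-Dscala.compat.version=$scalaVer",
    "-DskipTests")
  println(cmd.mkString(" "))
  require(cmd.! == 0, s"build failed for Spark-$sparkVer / Scala-$scalaVer")
}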


Not all users want ViennaCL (I would imagine), so a compromise might be the
first and last six combinations, i.e. 12 tarballs:

Spark-1.6, Scala-2.10
Spark-2.0, Scala-2.10
Spark-2.1, Scala-2.10
Spark-1.6, Scala-2.11
Spark-2.0, Scala-2.11
Spark-2.1, Scala-2.11

Spark-1.6, Scala-2.10, viennacl, viennacl-omp
Spark-2.0, Scala-2.10, viennacl, viennacl-omp
Spark-2.1, Scala-2.10, viennacl, viennacl-omp
Spark-1.6, Scala-2.11, viennacl, viennacl-omp
Spark-2.0, Scala-2.11, viennacl, viennacl-omp
Spark-2.1, Scala-2.11, viennacl, viennacl-omp

Thoughts?
Trevor Grant
2017-07-10 23:56:02 UTC
From the Spark website:

"Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
Scala 2.10 users should download the Spark source package and build with
Scala 2.10 support."

Given that, the minimum set (imho) would be:

Spark-1.6, Scala-2.10, viennacl, viennacl-omp
Spark-2.0, Scala-2.11, viennacl, viennacl-omp
Spark-2.1, Scala-2.11, viennacl, viennacl-omp

It has been pointed out that our Spark-2.0 build may cover all of Spark 2.x,
but I haven't tested that.
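
For whoever picks that up, a rough smoke test might be to run a trivial Samsara job end to end using the Spark-2.0 Mahout build with Spark 2.1 jars on the classpath. Untested sketch, assuming the usual shell imports:

import org.apache.mahout.math._
import org.apache.mahout.math.scalabindings._
import org.apache.mahout.math.drm._
import org.apache.mahout.math.drm.RLikeDrmOps._
import org.apache.mahout.sparkbindings._

// Run with Spark 2.1 jars on the classpath against the Spark-2.0 Mahout build.
implicit val ctx = mahoutSparkContext(masterUrl = "local[2]", appName = "spark-2.x-compat-check")

// Tiny end-to-end Samsara job: parallelize a matrix and compute A'A.
val inCoreA = dense((1.0, 2.0, 3.0), (3.0, 4.0, 5.0))
val drmA    = drmParallelize(inCoreA, numPartitions = 2)
val ata     = (drmA.t %*% drmA).collect

println(ata)
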
Awesome!

INFRA may have an issue here, and we may need to move some of the older
releases to the archives... We have a waiver for exceeding the standard
200 MB cap, which should still be in place. But if you start to notice that
you're having trouble uploading artifacts to the staging area, it may be
that we've blown past the cap. Please let me know if this happens, and I'll
figure out what needs to be done.

Thanks,
--andy