Build failed in Jenkins: Mahout-Quality #3367
Apache Jenkins Server
2016-06-11 02:01:40 UTC
See <https://builds.apache.org/job/Mahout-Quality/3367/changes>

Changes:

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration
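The three [smarthi] commit messages above are the standard footprint of the maven-release-plugin. As a hedged illustration (these invocations are assumed from the commit messages, not taken from this build's console output), a release attempt and its rollback look like:

```shell
# Assumed maven-release-plugin sequence behind the commits above:
# "prepare release ..." and "prepare for next development iteration"
# are both committed by release:prepare; "rollback the release ..."
# is committed by release:rollback after a failed/abandoned attempt.
mvn release:prepare
mvn release:rollback
```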

------------------------------------------
[...truncated 86049 lines...]
06/11/2016 02:00:49 DataSink (org.apache.flink.api.java.Utils$***@612a1ba1)(1/1) switched to SCHEDULED
06/11/2016 02:00:49 DataSink (org.apache.flink.api.java.Utils$***@612a1ba1)(1/1) switched to DEPLOYING
06/11/2016 02:00:49 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to FINISHED
06/11/2016 02:00:49 DataSink (org.apache.flink.api.java.Utils$***@612a1ba1)(1/1) switched to RUNNING
06/11/2016 02:00:49 DataSink (org.apache.flink.api.java.Utils$***@612a1ba1)(1/1) switched to FINISHED
06/11/2016 02:00:49 Job execution switched to status FINISHED.
(40,40)
06/11/2016 02:00:49 Job execution switched to status RUNNING.
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to SCHEDULED
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to RUNNING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to RUNNING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to RUNNING
06/11/2016 02:00:49 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to RUNNING
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to RUNNING
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to RUNNING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(1/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(1/4) switched to DEPLOYING
06/11/2016 02:00:49 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to FINISHED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(2/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(2/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(3/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(3/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(4/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(4/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/4) switched to DEPLOYING
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/4) switched to SCHEDULED
06/11/2016 02:00:49 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/4) switched to DEPLOYING
06/11/2016 02:00:49 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(4/4) switched to SCHEDULED
06/11/2016 02:00:49 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(4/4) switched to DEPLOYING
06/11/2016 02:00:49 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(3/4) switched to SCHEDULED
06/11/2016 02:00:49 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(3/4) switched to DEPLOYING
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(1/4) switched to SCHEDULED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(1/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(1/4) switched to RUNNING
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(2/4) switched to SCHEDULED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(2/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(3/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(2/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(4/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/4) switched to RUNNING
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(4/4) switched to RUNNING
06/11/2016 02:00:50 DataSource (at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:50) (org.apache.flink.api.java.io.Collec)(1/1) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(3/4) switched to RUNNING
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(1/4) switched to RUNNING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(3/4) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(4/4) switched to FINISHED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(2/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(4/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(4/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(1/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(4/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(1/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/4) switched to FINISHED
06/11/2016 02:00:50 MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52))(2/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(2/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37))(3/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(1/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(2/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(2/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(2/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(3/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(3/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(4/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpTimesRightMatrix$.drmTimesInCore(FlinkOpTimesRightMatrix.scala:52)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(3/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(4/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(2/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (***@1d0ea52c)(3/4) switched to FINISHED
06/11/2016 02:00:50 Job execution switched to status FINISHED.
06/11/2016 02:00:50 Job execution switched to status RUNNING.
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to RUNNING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to RUNNING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to RUNNING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(2/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(2/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(2/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(1/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(1/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(2/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(4/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(4/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(4/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(1/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(2/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(4/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(4/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(3/4) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(3/4) switched to DEPLOYING
06/11/2016 02:00:50 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat)(3/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(3/4) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@1727a10)(3/4) switched to FINISHED
06/11/2016 02:00:50 Job execution switched to status FINISHED.
06/11/2016 02:00:50 Job execution switched to status RUNNING.
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/4) switched to SCHEDULED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/4) switched to SCHEDULED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/4) switched to SCHEDULED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/4) switched to SCHEDULED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/4) switched to DEPLOYING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/4) switched to RUNNING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/4) switched to RUNNING
06/11/2016 02:00:50 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to SCHEDULED
06/11/2016 02:00:50 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to DEPLOYING
06/11/2016 02:00:50 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to RUNNING
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/4) switched to FINISHED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@7356f959)(1/1) switched to SCHEDULED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@7356f959)(1/1) switched to DEPLOYING
06/11/2016 02:00:50 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to FINISHED
06/11/2016 02:00:50 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.java.io.TypeSerializerInputFormat) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/4) switched to FINISHED
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@7356f959)(1/1) switched to RUNNING
06/11/2016 02:00:50 DataSink (org.apache.flink.api.java.Utils$***@7356f959)(1/1) switched to FINISHED
06/11/2016 02:00:50 Job execution switched to status FINISHED.
(10,500)
pca:
{
0 => {0:-2.4676501314475323,1:2.3392385416128043,2:-1.07966550519966,3:0.324107496035375,4:-0.43076503305327585,5:-0.16050059381906018,6:-0.011625098335560276,7:-0.02114996564706478,8:-0.0019022091061633639,9:-5.887889855267425E-4}
1 => {0:-6.2384346379967965,1:-5.837402474461583,2:-0.26421069096656624,3:0.358871149895974,4:0.11917423155376326,5:0.039795519679992536,6:-0.019376628406741363,7:0.0034526645557725185,8:-0.002475888909454384,9:0.002958508025835741}
2 => {0:-2.409701343568667,1:2.045550981950236,2:-0.922420359695994,3:0.425092068859569,4:0.06848598136114856,5:0.03416358663339173,6:-0.01786828253904686,7:0.008134547520365168,8:0.0074575387159514905,9:-9.007604854956488E-4}
3 => {0:14.879272938363401,1:-0.6835217798001592,2:-0.9499847971245348,3:0.4154457539281164,4:0.07086305507263854,5:0.02013274797480125,6:-0.019539947138102153,7:0.007128112393755682,8:-0.002512863810441518,9:-4.432551502623816E-4}
4 => {0:15.074742008207947,1:-0.3872434396038523,2:-0.9656295929319691,3:0.33331288153410094,4:0.08353894945431983,5:0.04984685245983895,6:-0.010765896631021047,7:0.00691657173831421,8:-0.0036607291124231747,9:-8.790357385018697E-4}
5 => {0:-2.88983478896473,1:2.4663868265998667,2:-0.8317830924101163,3:0.39234796309193254,4:0.11974588183134893,5:0.06445750710189539,6:0.06619523299152227,7:0.002278590000506337,8:-8.154547974691237E-4,9:-0.0013545695796418576}
6 => {0:-1.2238181653233113,1:2.3552086958478795,2:-1.0332412649823794,3:0.3400937853531089,4:0.15553624995112758,5:-0.14954098878312005,6:0.07155963877233722,7:0.007818232752146355,8:0.008932400378976043,9:-8.605147168435362E-4}
7 => {0:-0.49157129251159043,1:1.990660857013911,2:-1.0355820412260153,3:0.42588992633543693,4:0.0714373164440338,5:0.06276317531773816,6:-0.018198994264645163,7:0.00524686807010663,8:0.007299372708089596,9:-4.7790453829370786E-4}
8 => {0:-2.487599623354213,1:3.8683805808675693,2:2.7992822985796053,3:0.08723683123810211,4:0.06904415798702852,5:0.019529787949794833,6:-0.01867577482805713,7:-0.021921492846054248,8:-0.002240807443707811,9:-0.001173128748762315}
9 => {0:-2.825320931626675,1:1.5334654828508436,2:-0.7001664056395671,3:0.17953361471371412,4:0.10588146251012728,5:0.03274579527078207,6:-0.014174966881087053,7:-0.024173237928535013,8:-0.002652539599749571,9:0.0027349207773151767}
... }
pcaControl:
{
0 => {0:-2.4676501354324984,1:2.339238527608115,2:-1.079665481678566,3:0.3241075469420286,4:0.43076501534848266,5:-0.16050065556653328,6:0.011625035067942737,7:0.021149974012791707,8:0.0019022155112964117,9:-5.888714160842349E-4}
1 => {0:-6.238434637300225,1:-5.837402471251088,2:-0.26421069957738574,3:0.3588711400705909,4:-0.11917422258725482,5:0.039795531794454365,6:0.019376644553382713,7:-0.00345267621862331,8:0.002475885143280255,9:0.0029585232002020297}
2 => {0:-2.4097013463571404,1:2.0455509781736803,2:-0.9224203570973676,3:0.42509208665362724,4:-0.06848597969319296,5:0.03416357072860101,6:0.017868291947823634,7:-0.008134568259737423,8:-0.007457548036212235,9:-9.007604128603458E-4}
3 => {0:14.879272938440515,1:-0.6835217804671135,2:-0.9499847935016623,3:0.4154457637409966,4:-0.07086305590146709,5:0.02013274403518007,6:0.019539950340211623,7:-0.007128125587585067,8:0.002512850904742119,9:-4.432120291409837E-4}
4 => {0:15.074742009821314,1:-0.3872434366704478,2:-0.9656296034369475,3:0.3333128770902902,4:-0.08353897050572598,5:0.049846851976557575,6:0.010765911714280057,7:-0.006916535548964641,8:0.003660744872873759,9:-8.790865835565427E-4}
5 => {0:-2.8898347866691068,1:2.4663868258361274,2:-0.8317830826668942,3:0.3923479534316219,4:-0.11974587934712937,5:0.06445751182599561,6:-0.06619526057911647,7:-0.0022785732288538984,8:8.154550871409613E-4,9:-0.001354569600192572}
6 => {0:-1.2238181624881417,1:2.3552086863617196,2:-1.033241240941881,3:0.3400938169029464,4:-0.1555362675288111,5:-0.1495410254156162,6:-0.07155968893924457,7:-0.00781821427496282,8:-0.008932411102702318,9:-8.6052594769804E-4}
7 => {0:-0.4915712916709655,1:1.990660873432375,2:-1.0355820725894433,3:0.4258898625286445,4:-0.07143727592332669,5:0.06276325750607233,6:0.01819905680730904,7:-0.005246900012523334,8:-0.007299397368936163,9:-4.7778111205563E-4}
8 => {0:-2.487599623105936,1:3.8683806043364752,2:2.799282253721486,3:0.0872367581334604,4:-0.06904412981499462,5:0.019529874447667787,6:0.018675859543171656,7:0.02192148138421062,8:0.0022408087248061443,9:-0.0011730558725945904}
9 => {0:-2.825320923323882,1:1.533465481184772,2:-0.7001664023689562,3:0.1795336168945971,4:-0.1058814911313629,5:0.0327457866015237,6:0.014174953410710895,7:0.02417328723242143,8:0.0026525556188113225,9:0.0027348795747106753}
... }
- dspca
spectrum:{0:300.0,1:110.3638323514327,2:40.60058497098381,3:14.936120510359183,4:5.494691666620254,5:2.02138409972564,6:0.7436256529999076,7:0.2735645896663549,8:0.10063878837075356,9:0.037022941226003865,10:0.013619978928745457,11:0.005010510237073698,12:0.001843263705998463,13:0.001,14:0.001,15:0.001,16:0.001,17:0.001,18:0.001,19:0.001,20:0.001,21:0.001,22:0.001,23:0.001,24:0.001,25:0.001,26:0.001,27:0.001,28:0.001,29:0.001,30:0.001,31:0.001,32:0.001,33:0.001,34:0.001,35:0.001,36:0.001,37:0.001,38:0.001,39:0.001}
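For reference, the `spectrum` printed by the dspca test above is a synthetic singular-value spectrum: the values decay as 300·e^(−k) and are floored at 0.001 from k = 13 onward. This is a reconstruction inferred from the printed numbers, not code taken from the Mahout test suite; a minimal sketch:

```python
import math

def synthetic_spectrum(n=40, scale=300.0, floor=1e-3):
    """Exponentially decaying spectrum with a floor.

    Reconstructed from the dspca test output above (300, 110.36, 40.60, ...,
    then flat 0.001); the function name and signature are illustrative.
    """
    return [max(scale * math.exp(-k), floor) for k in range(n)]

spectrum = synthetic_spectrum()
# spectrum[0] is 300.0 and the tail is clamped to the 0.001 floor,
# matching the 40-entry vector in the log.
```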
06/11/2016 02:00:54 Job execution switched to status RUNNING.
06/11/2016 02:00:54 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to SCHEDULED
06/11/2016 02:00:54 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to DEPLOYING
06/11/2016 02:00:54 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to RUNNING
06/11/2016 02:00:55 RangePartition: LocalSample(1/1) switched to SCHEDULED
06/11/2016 02:00:55 RangePartition: LocalSample(1/1) switched to DEPLOYING
06/11/2016 02:00:55 RangePartition: LocalSample(1/1) switched to RUNNING
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x000000076cd00000, 515899392, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 515899392 bytes for committing reserved memory.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Mahout-Quality/ws/flink/hs_err_pid10574.log>
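The failure here is a native allocation error (errno=12) in the JVM running the Flink test suite, not a test assertion: the process asked the OS to commit about 492 MB and the build host refused. One common mitigation is to cap the test JVM's heap so it never requests that much native memory at once. The flag value below is illustrative, not taken from the Mahout POM, and assumes the scalatest run shares the Maven JVM; if the plugin forks a separate JVM, its own `argLine` setting would apply instead:

```shell
# Illustrative only: cap the Maven/test JVM heap so os::commit_memory
# does not ask the host for more memory than it can provide.
export MAVEN_OPTS="-Xmx2g"
# Then resume the reactor from the failed module, as Maven suggests below:
# mvn test -rf :mahout-flink_2.10
```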
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Mahout
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Mahout Build Tools
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Mahout Build Tools ................................. SUCCESS [ 5.521 s]
[INFO] Apache Mahout ...................................... SUCCESS [ 0.338 s]
[INFO] Mahout Math ........................................ SUCCESS [01:21 min]
[INFO] Mahout HDFS ........................................ SUCCESS [ 11.568 s]
[INFO] Mahout Map-Reduce .................................. SUCCESS [12:25 min]
[INFO] Mahout Integration ................................. SUCCESS [ 56.902 s]
[INFO] Mahout Examples .................................... SUCCESS [ 23.696 s]
[INFO] Mahout Math Scala bindings ......................... SUCCESS [04:24 min]
[INFO] Mahout H2O backend ................................. SUCCESS [03:26 min]
[INFO] Mahout Spark bindings .............................. SUCCESS [02:18 min]
[INFO] Mahout Flink bindings .............................. FAILURE [01:49 min]
[INFO] Mahout Spark bindings shell ........................ SKIPPED
[INFO] Mahout Release Package ............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27:27 min
[INFO] Finished at: 2016-06-11T02:00:57+00:00
[INFO] Final Memory: 70M/417M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project mahout-flink_2.10: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :mahout-flink_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-11 06:37:55 UTC
See <https://builds.apache.org/job/Mahout-Quality/3368/changes>

Changes:

[smarthi] Rolling back 0.12.2 Release candidate 2

------------------------------------------
[...truncated 136038 lines...]
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
281226 [CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:93)) (3/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:93)) (3/16)
java.lang.RuntimeException: Requesting the next InputSplit failed.
at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:91)
at org.apache.flink.runtime.operators.DataSourceTask$1.hasNext(DataSourceTask.java:342)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:137)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at scala.concurrent.Await.result(package.scala)
at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:71)
... 4 more
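The 10000 ms figure in the TimeoutException is Flink's default ask timeout, which TaskInputSplitProvider uses when requesting the next input split from the JobManager; on an overloaded executor the JobManager simply does not answer in time. A sketch of the relevant knob in flink-conf.yaml, with an illustrative value (whether raising it would help on this build host is an assumption, since the root cause here is likely the memory pressure seen above):

```yaml
# flink-conf.yaml -- illustrative; the era's default of 10 s is the
# 10000 ms that expired in the stack traces above.
akka.ask.timeout: 60 s
```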
[...identical "Requesting the next InputSplit failed" stack traces from the remaining DataSource subtasks (1/16-2/16, 4/16-11/16, 13/16-15/16) truncated; each fails with the same TimeoutException after 10000 milliseconds...]
281248 [CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:93)) (12/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:93)) (12/16)
java.lang.RuntimeException: Requesting the next InputSplit failed.
at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:91)
at org.apache.flink.runtime.operators.DataSourceTask$1.hasNext(DataSourceTask.java:342)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:137)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at scala.concurrent.Await.result(package.scala)
at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:71)
... 4 more
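The three traces above all wrap the same root failure: Flink's `TaskInputSplitProvider` blocks on the JobManager's reply to an input-split request with a fixed 10-second timeout, and that wait expires (`Futures timed out after [10000 milliseconds]`). A minimal, self-contained sketch of that blocking pattern — the never-completed future here is a stand-in for a JobManager that does not answer in time, and the 100 ms timeout is shortened from the 10 s seen in the log:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class AwaitTimeoutDemo {
    public static void main(String[] args) throws Exception {
        // A future that is never completed stands in for a JobManager
        // that never answers the input-split request.
        CompletableFuture<Integer> pendingReply = new CompletableFuture<>();
        try {
            // TaskInputSplitProvider blocks the same way: a bounded wait
            // on the reply (10 s in the log above; 100 ms here for speed).
            pendingReply.get(100, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            // This is what surfaces in the log, wrapped as
            // "java.lang.RuntimeException: Requesting the next InputSplit failed."
            System.out.println("request timed out");
        }
    }
}
```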
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-11 10:21:24 UTC
See <https://builds.apache.org/job/Mahout-Quality/3369/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

------------------------------------------
[...truncated 58262 lines...]
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(15/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(5/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(11/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(13/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(12/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(14/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(7/16) switched to FINISHED
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(9/16) switched to FINISHED
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(6/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(7/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(9/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(6/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(3/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(12/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(8/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(16/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(14/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(15/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(8/16) switched to RUNNING
06/11/2016 07:12:28 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(16/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(3/16) switched to RUNNING
06/11/2016 07:12:28 DataSink (***@36356e30)(14/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(3/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(11/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(5/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(9/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(4/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(10/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(6/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(8/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(12/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(13/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(2/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(15/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(7/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(16/16) switched to FINISHED
06/11/2016 07:12:28 DataSink (***@36356e30)(1/16) switched to FINISHED
06/11/2016 07:12:28 Job execution switched to status FINISHED.
06/11/2016 07:12:29 Job execution switched to status RUNNING.
06/11/2016 07:12:29 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to SCHEDULED
06/11/2016 07:12:29 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to DEPLOYING
06/11/2016 07:12:29 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to RUNNING
06/11/2016 07:12:29 RangePartition: LocalSample(1/1) switched to SCHEDULED
06/11/2016 07:12:29 RangePartition: LocalSample(1/1) switched to DEPLOYING
06/11/2016 07:12:29 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to FINISHED
06/11/2016 07:12:29 RangePartition: PreparePartition(1/1) switched to SCHEDULED
06/11/2016 07:12:29 RangePartition: PreparePartition(1/1) switched to DEPLOYING
06/11/2016 07:12:29 RangePartition: LocalSample(1/1) switched to RUNNING
06/11/2016 07:12:29 RangePartition: PreparePartition(1/1) switched to RUNNING
06/11/2016 07:12:29 RangePartition: GlobalSample(1/1) switched to SCHEDULED
06/11/2016 07:12:29 RangePartition: GlobalSample(1/1) switched to DEPLOYING
06/11/2016 07:12:29 RangePartition: LocalSample(1/1) switched to FINISHED
06/11/2016 07:12:29 RangePartition: GlobalSample(1/1) switched to RUNNING
06/11/2016 07:12:29 RangePartition: Histogram(1/1) switched to SCHEDULED
06/11/2016 07:12:29 RangePartition: Histogram(1/1) switched to DEPLOYING
06/11/2016 07:12:29 RangePartition: GlobalSample(1/1) switched to FINISHED
06/11/2016 07:12:29 RangePartition: Histogram(1/1) switched to RUNNING
06/11/2016 07:12:29 RangePartition: Histogram(1/1) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(1/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(2/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(3/16) switched to SCHEDULED
06/11/2016 07:12:29 RangePartition: PreparePartition(1/1) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(1/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(4/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(5/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(6/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(7/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(8/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(3/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(6/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(9/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(10/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(12/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(11/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(8/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(13/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(14/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(10/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(16/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(2/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(16/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(15/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(14/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(13/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(11/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(12/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(9/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(7/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(5/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(4/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(15/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(1/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(1/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(1/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(10/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(10/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(16/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(16/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(2/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(2/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(6/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(6/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(14/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(14/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(3/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(3/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(11/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(13/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(9/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(11/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(7/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(5/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(9/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(5/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(13/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(7/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(15/16) switched to SCHEDULED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(6/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(8/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(16/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(13/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(11/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(12/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(9/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(14/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(7/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(10/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(2/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(3/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(15/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(12/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(8/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(12/16) switched to DEPLOYING
06/11/2016 07:12:29 DataSink (***@1210c01c)(8/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(5/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(4/16) switched to SCHEDULED
06/11/2016 07:12:29 DataSink (***@1210c01c)(4/16) switched to DEPLOYING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(4/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(15/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(1/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(1/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(2/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(14/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(10/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(10/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(16/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(3/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(11/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(6/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(16/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(9/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(13/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(2/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(7/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(12/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(15/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(8/16) switched to FINISHED
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(12/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(8/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(4/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(4/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(6/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(11/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(15/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(3/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(13/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(7/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(5/16) switched to RUNNING
06/11/2016 07:12:29 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dfsWrite(CheckpointedFlinkDrm.scala:231))(5/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(14/16) switched to RUNNING
06/11/2016 07:12:29 DataSink (***@1210c01c)(2/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(1/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(16/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(6/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(3/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(12/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(8/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(15/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(14/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(5/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(4/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(10/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(7/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(13/16) switched to FINISHED
06/11/2016 07:12:29 DataSink (***@1210c01c)(11/16) switched to FINISHED
06/11/2016 07:12:39 DataSink (***@1210c01c)(9/16) switched to FAILED
java.lang.Exception: Failed to send ExecutionStateChange notification to JobManager
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage$3$$anonfun$apply$2.apply(TaskManager.scala:398)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage$3$$anonfun$apply$2.apply(TaskManager.scala:382)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka://flink/user/jobmanager_1#667045242]] after [10000 ms]
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)

06/11/2016 07:12:39 Job execution switched to status FAILING.
java.lang.Exception: Failed to send ExecutionStateChange notification to JobManager
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage$3$$anonfun$apply$2.apply(TaskManager.scala:398)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage$3$$anonfun$apply$2.apply(TaskManager.scala:382)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka://flink/user/jobmanager_1#667045242]] after [10000 ms]
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
06/11/2016 07:12:39 Job execution switched to status FAILED.
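The failure above is the TaskManager's ExecutionStateChange notification to the JobManager timing out after the default 10000 ms ask timeout, which flips the job to FAILING and then FAILED. One plausible mitigation on a heavily loaded build host (an assumption about this environment, not something verified against this build) is to raise the Akka ask timeout in Flink's flink-conf.yaml:

```yaml
# flink-conf.yaml sketch: give RPC calls between TaskManager and
# JobManager more headroom on a contended CI machine.
# (60 s is an illustrative value, not a recommendation from this thread.)
akka.ask.timeout: 60 s

# Optional: also extend the lookup timeout used when actors resolve
# each other at startup.
akka.lookup.timeout: 30 s
```

This only papers over the symptom if the real issue is a hung or overloaded JobManager; the 223-minute build timeout reported below suggests the job may have stalled outright rather than merely responded slowly.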
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-11 14:04:55 UTC
See <https://builds.apache.org/job/Mahout-Quality/3370/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

------------------------------------------
[...truncated 59078 lines...]
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(14/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(9/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(13/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(7/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(8/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(16/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(11/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(15/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(10/16) switched to RUNNING
06/11/2016 10:56:55 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(12/16) switched to RUNNING
06/11/2016 10:56:56 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to SCHEDULED
06/11/2016 10:56:56 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(5/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(14/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(7/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(12/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(13/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(9/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(16/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(15/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(8/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(11/16) switched to FINISHED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(6/16) switched to FINISHED
06/11/2016 10:56:56 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to RUNNING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/16) switched to FINISHED
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@754a88b9)(1/1) switched to SCHEDULED
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@754a88b9)(1/1) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(10/16) switched to FINISHED
06/11/2016 10:56:56 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to FINISHED
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@754a88b9)(1/1) switched to RUNNING
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@754a88b9)(1/1) switched to FINISHED
06/11/2016 10:56:56 Job execution switched to status FINISHED.
(1,1)
06/11/2016 10:56:56 Job execution switched to status RUNNING.
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(1/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(1/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(2/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(2/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(3/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(3/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(5/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(5/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(6/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(6/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(7/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(7/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(8/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(8/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(9/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(9/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(10/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(10/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(11/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(11/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(12/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(12/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(13/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(13/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(14/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(14/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(15/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(15/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(16/16) switched to SCHEDULED
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(16/16) switched to DEPLOYING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(1/16) switched to RUNNING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(2/16) switched to RUNNING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(3/16) switched to RUNNING
06/11/2016 10:56:56 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to RUNNING
[...12 similar lines omitted: CHAIN DataSource -> Map subtasks (5/16) through (16/16) switched to RUNNING...]
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@a36d42c)(9/16) switched to SCHEDULED
[...78 similar lines omitted: DataSink subtasks (1/16) through (16/16) switched to SCHEDULED, DEPLOYING, RUNNING, then FINISHED; CHAIN DataSource -> Map subtasks (1/16) through (16/16) switched to FINISHED...]
06/11/2016 10:56:56 DataSink (org.apache.flink.api.java.Utils$***@a36d42c)(5/16) switched to FINISHED
06/11/2016 10:56:56 Job execution switched to status FINISHED.
06/11/2016 10:56:57 Job execution switched to status RUNNING.
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/16) switched to SCHEDULED
[...37 similar lines omitted: CHAIN DataSource -> Map -> Map -> Combine subtasks (1/16) through (16/16) switched to SCHEDULED and DEPLOYING; subtasks (1/16) through (6/16) switched to RUNNING...]
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(10/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(12/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(13/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(14/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(15/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(7/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(16/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(9/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(8/16) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(11/16) switched to RUNNING
06/11/2016 10:56:57 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to SCHEDULED
06/11/2016 10:56:57 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to DEPLOYING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(2/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(12/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(5/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(11/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(8/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(13/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(4/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(14/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(6/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(15/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(7/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(10/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(3/16) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/16) switched to FINISHED
06/11/2016 10:56:57 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to RUNNING
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(16/16) switched to FINISHED
06/11/2016 10:56:57 DataSink (org.apache.flink.api.java.Utils$***@2a4343ab)(1/1) switched to SCHEDULED
06/11/2016 10:56:57 DataSink (org.apache.flink.api.java.Utils$***@2a4343ab)(1/1) switched to DEPLOYING
06/11/2016 10:56:57 Reduce (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(1/1) switched to FINISHED
06/11/2016 10:56:57 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75)) -> Map (Map at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:79)) -> Combine (Reduce at org.apache.mahout.flinkbindings.drm.CheckpointedFlinkDrm.dim$lzycompute(CheckpointedFlinkDrm.scala:83))(9/16) switched to FINISHED
06/11/2016 10:56:57 DataSink (org.apache.flink.api.java.Utils$***@2a4343ab)(1/1) switched to RUNNING
06/11/2016 10:56:57 DataSink (org.apache.flink.api.java.Utils$***@2a4343ab)(1/1) switched to FINISHED
06/11/2016 10:56:57 Job execution switched to status FINISHED.
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 87.7% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-11 18:47:52 UTC
See <https://builds.apache.org/job/Mahout-Quality/3371/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

------------------------------------------
[...truncated 63686 lines...]
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(7/16) switched to RUNNING
06/11/2016 15:40:02 DataSink (***@38784680)(4/16) switched to SCHEDULED
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/16) switched to RUNNING
06/11/2016 15:40:02 DataSink (***@38784680)(4/16) switched to DEPLOYING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/16) switched to RUNNING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(2/16) switched to FINISHED
06/11/2016 15:40:02 DataSink (***@38784680)(2/16) switched to SCHEDULED
06/11/2016 15:40:02 DataSink (***@38784680)(2/16) switched to DEPLOYING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(16/16) switched to FAILED
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (0)(1) - 09bf0280e80b04bbb5f206c11daef115 @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/16) switched to RUNNING
06/11/2016 15:40:02 Job execution switched to status FAILING.
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (0)(1) - 09bf0280e80b04bbb5f206c11daef115 @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(8/16) switched to FINISHED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(11/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(12/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(13/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(14/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(15/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(16/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(8/16) switched to RUNNING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(1/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(2/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(5/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(6/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(7/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(8/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(9/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(10/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(11/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(12/16) switched to CANCELING
436346 [flink-akka.actor.default-dispatcher-15] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
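The `java.lang.OutOfMemoryError: unable to create new native thread` above usually means the JVM was refused a native thread by the operating system — typically a per-user process/thread limit or native-memory exhaustion on the build agent — rather than Java heap exhaustion. A minimal diagnostic sketch, assuming a Linux build agent with standard coreutils (commands shown are generic checks, not taken from this build's configuration):

```shell
#!/bin/sh
# Show the per-user limit on processes/threads (RLIMIT_NPROC);
# hitting this limit produces "unable to create new native thread".
ulimit -u

# Rough count of threads currently owned by the current user,
# to compare against the limit above (one line per thread with -L).
ps -eLf | awk -v u="$(id -un)" '$1 == u' | wc -l
```

If the thread count is near the `ulimit -u` value, raising the limit for the Jenkins user or reducing the test suite's parallelism would be the usual remedies.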
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(13/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(14/16) switched to CANCELING
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(11/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(15/16) switched to CANCELING
06/11/2016 15:40:02 DataSink (***@38784680)(1/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(2/16) switched to CANCELING
06/11/2016 15:40:02 DataSink (***@38784680)(3/16) switched to CANCELING
06/11/2016 15:40:02 DataSink (***@38784680)(4/16) switched to CANCELING
06/11/2016 15:40:02 DataSink (***@38784680)(5/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(6/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(7/16) switched to CANCELING
06/11/2016 15:40:02 DataSink (***@38784680)(8/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(9/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(10/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(11/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(12/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(13/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(14/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(15/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(16/16) switched to CANCELED
06/11/2016 15:40:02 DataSink (***@38784680)(7/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(13/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(16/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(14/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(3/16) switched to CANCELED
436348 [flink-akka.actor.default-dispatcher-7] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 15:40:02 DataSink (***@38784680)(4/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(15/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN RangePartition: Partition -> Partition(12/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(7/16) switched to CANCELED
06/11/2016 15:40:02 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(4/16) switched to CANCELED
442314 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (15/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (15/16)
org.apache.flink.runtime.io.network.partition.PartitionNotFoundException: Partition ***@4b58fed97961f6963e2fadeaf97ae34e not found.
at org.apache.flink.runtime.io.network.partition.ResultPartitionManager.createSubpartitionView(ResultPartitionManager.java:76)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel.requestSubpartition(LocalInputChannel.java:103)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel$1.run(LocalInputChannel.java:136)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
442315 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (15/16)] ERROR org.apache.flink.runtime.taskmanager.Task - FATAL - exception in task resource cleanup
java.lang.IllegalStateException: Memory manager has been shut down.
at org.apache.flink.runtime.memory.MemoryManager.releaseAll(MemoryManager.java:468)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:661)
at java.lang.Thread.run(Thread.java:745)
442317 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (6/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (6/16)
org.apache.flink.runtime.io.network.partition.PartitionNotFoundException: Partition ***@5e4270569522bc026a0525ada9101c46 not found.
at org.apache.flink.runtime.io.network.partition.ResultPartitionManager.createSubpartitionView(ResultPartitionManager.java:76)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel.requestSubpartition(LocalInputChannel.java:103)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel$1.run(LocalInputChannel.java:136)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
442318 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (6/16)] ERROR org.apache.flink.runtime.taskmanager.Task - FATAL - exception in task resource cleanup
java.lang.IllegalStateException: Memory manager has been shut down.
at org.apache.flink.runtime.memory.MemoryManager.releaseAll(MemoryManager.java:468)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:661)
at java.lang.Thread.run(Thread.java:745)
442329 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (11/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (11/16)
org.apache.flink.runtime.io.network.partition.PartitionNotFoundException: Partition ***@404e1b3c0f37f3b015ed3827da6163ba not found.
at org.apache.flink.runtime.io.network.partition.ResultPartitionManager.createSubpartitionView(ResultPartitionManager.java:76)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel.requestSubpartition(LocalInputChannel.java:103)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel$1.run(LocalInputChannel.java:136)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
442330 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (11/16)] ERROR org.apache.flink.runtime.taskmanager.Task - FATAL - exception in task resource cleanup
java.lang.IllegalStateException: Memory manager has been shut down.
at org.apache.flink.runtime.memory.MemoryManager.releaseAll(MemoryManager.java:468)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:661)
at java.lang.Thread.run(Thread.java:745)
442332 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (5/16)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (5/16)
org.apache.flink.runtime.io.network.partition.PartitionNotFoundException: Partition ***@6854e73192d275d789fa43d97afd9b23 not found.
at org.apache.flink.runtime.io.network.partition.ResultPartitionManager.createSubpartitionView(ResultPartitionManager.java:76)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel.requestSubpartition(LocalInputChannel.java:103)
at org.apache.flink.runtime.io.network.partition.consumer.LocalInputChannel$1.run(LocalInputChannel.java:136)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
442333 [Combine (Reduce at org.apache.mahout.flinkbindings.blas.FlinkOpAt$.sparseTrick(FlinkOpAt.scala:61)) (5/16)] ERROR org.apache.flink.runtime.taskmanager.Task - FATAL - exception in task resource cleanup
java.lang.IllegalStateException: Memory manager has been shut down.
at org.apache.flink.runtime.memory.MemoryManager.releaseAll(MemoryManager.java:468)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:661)
at java.lang.Thread.run(Thread.java:745)
06/11/2016 15:40:42 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(14/16) switched to CANCELED
06/11/2016 15:40:42 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(12/16) switched to CANCELED
06/11/2016 15:40:42 CHAIN MapPartition (MapPartition at org.apache.mahout.flinkbindings.drm.RowsFlinkDrm.asBlockified(FlinkDrm.scala:52)) -> Map (Map at org.apache.mahout.flinkbindings.blas.FlinkOpMapBlock$.apply(FlinkOpMapBlock.scala:37)) -> FlatMap (FlatMap at org.apache.mahout.flinkbindings.drm.BlockifiedFlinkDrm.asRowWise(FlinkDrm.scala:93))(15/16) switched to CANCELED
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Mahout
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Mahout Build Tools
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Mahout Build Tools ................................. SUCCESS [ 2.424 s]
[INFO] Apache Mahout ...................................... SUCCESS [ 0.144 s]
[INFO] Mahout Math ........................................ SUCCESS [01:19 min]
[INFO] Mahout HDFS ........................................ SUCCESS [ 4.194 s]
[INFO] Mahout Map-Reduce .................................. SUCCESS [13:05 min]
[INFO] Mahout Integration ................................. SUCCESS [ 46.705 s]
[INFO] Mahout Examples .................................... SUCCESS [ 24.518 s]
[INFO] Mahout Math Scala bindings ......................... SUCCESS [05:17 min]
[INFO] Mahout H2O backend ................................. SUCCESS [03:32 min]
[INFO] Mahout Spark bindings .............................. SUCCESS [02:26 min]
[INFO] Mahout Flink bindings .............................. FAILURE [ 03:15 h]
[INFO] Mahout Spark bindings shell ........................ SKIPPED
[INFO] Mahout Release Package ............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:42 h
[INFO] Finished at: 2016-06-11T18:47:34+00:00
[INFO] Final Memory: 62M/787M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project mahout-flink_2.10: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :mahout-flink_2.10
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-11 22:31:15 UTC
See <https://builds.apache.org/job/Mahout-Quality/3372/changes>

Changes:

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

------------------------------------------
[...truncated 55780 lines...]
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to CANCELING
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(10/16) switched to CANCELING
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(12/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(1/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(2/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(3/16) switched to CANCELING
257738 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(4/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(5/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(6/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(7/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(8/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(9/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(10/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(12/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(13/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(14/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(15/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(16/16) switched to CANCELING
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(6/16) switched to CANCELED
257739 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(9/16) switched to CANCELED
257740 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(8/16) switched to CANCELED
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(12/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(1/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(2/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(16/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(5/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(13/16) switched to CANCELED
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(10/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(7/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(3/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(14/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(15/16) switched to CANCELED
06/11/2016 19:20:02 DataSink (org.apache.flink.api.java.Utils$***@3cebfc69)(12/16) switched to CANCELED
06/11/2016 19:20:02 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to CANCELED
06/11/2016 19:20:02 Job execution switched to status FAILED.
- Model DFS Serialization *** FAILED ***
 org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:717)
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:663)
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:663)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
 at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
 at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
 at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
 at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
 at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
 ...
 Cause: java.lang.Exception: Failed to deploy the task to slot SimpleSlot (10)(1) - 33fa336ad0e9aedc7c4bce2e6990b9ee @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
 at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
 at akka.dispatch.OnComplete.internal(Future.scala:247)
 at akka.dispatch.OnComplete.internal(Future.scala:244)
 at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
 at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
 at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
 at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
 at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
 at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
 at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
 ...
06/11/2016 19:20:02 Job execution switched to status RUNNING.
06/11/2016 19:20:02 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to SCHEDULED
06/11/2016 19:20:02 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to DEPLOYING
06/11/2016 19:20:02 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to RUNNING
06/11/2016 19:20:02 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to FINISHED
06/11/2016 19:20:02 RangePartition: LocalSample(1/1) switched to SCHEDULED
06/11/2016 19:20:02 RangePartition: LocalSample(1/1) switched to DEPLOYING
06/11/2016 19:20:02 RangePartition: PreparePartition(1/1) switched to SCHEDULED
06/11/2016 19:20:02 RangePartition: PreparePartition(1/1) switched to DEPLOYING
06/11/2016 19:20:02 RangePartition: LocalSample(1/1) switched to RUNNING
06/11/2016 19:20:02 RangePartition: PreparePartition(1/1) switched to RUNNING
06/11/2016 19:20:02 RangePartition: GlobalSample(1/1) switched to SCHEDULED
06/11/2016 19:20:02 RangePartition: LocalSample(1/1) switched to FINISHED
06/11/2016 19:20:02 RangePartition: GlobalSample(1/1) switched to DEPLOYING
06/11/2016 19:20:02 RangePartition: GlobalSample(1/1) switched to RUNNING
06/11/2016 19:20:02 RangePartition: Histogram(1/1) switched to SCHEDULED
06/11/2016 19:20:02 RangePartition: Histogram(1/1) switched to DEPLOYING
06/11/2016 19:20:02 RangePartition: GlobalSample(1/1) switched to FINISHED
06/11/2016 19:20:02 RangePartition: Histogram(1/1) switched to RUNNING
06/11/2016 19:20:02 RangePartition: Histogram(1/1) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to SCHEDULED
06/11/2016 19:20:02 RangePartition: PreparePartition(1/1) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(10/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(10/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to SCHEDULED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to DEPLOYING
06/11/2016 19:20:02 Reduce (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/1) switched to SCHEDULED
06/11/2016 19:20:02 Reduce (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/1) switched to DEPLOYING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(10/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to FINISHED
06/11/2016 19:20:02 Reduce (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/1) switched to RUNNING
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to FINISHED
06/11/2016 19:20:02 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to FINISHED
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-12 02:14:35 UTC
See <https://builds.apache.org/job/Mahout-Quality/3373/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

------------------------------------------
[...truncated 55412 lines...]
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(1/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(4/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(4/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(13/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(9/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(13/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(5/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(9/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(11/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(5/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(11/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(7/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(14/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(2/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(14/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(2/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(6/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(3/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(10/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(3/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(10/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(8/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(8/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(6/16) switched to DEPLOYING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(7/16) switched to DEPLOYING
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(14/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(1/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(10/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(3/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(4/16) switched to FINISHED
Exception in thread "ForkJoinPool-898-worker-17" java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2037)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(9/16) switched to FINISHED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(15/16) switched to SCHEDULED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(15/16) switched to DEPLOYING
Exception in thread "ForkJoinPool-898-worker-7" java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2037)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(15/16) switched to FINISHED
Exception in thread "ForkJoinPool-898-worker-29" java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2037)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Exception in thread "ForkJoinPool-898-worker-21" java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2037)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(6/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(2/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(11/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(5/16) switched to FINISHED
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(7/16) switched to FINISHED
259395 [flink-akka.actor.default-dispatcher-8] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
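For what it's worth, all of the `unable to create new native thread` traces above fail inside `Thread.start0`, which means the build node ran out of OS-level threads/processes, not JVM heap. Each Flink/Scala ForkJoinPool that tries to add a worker must spawn a real native thread, so many concurrently running test suites can exhaust the per-user limit. A minimal sketch (the pool size of 4 is an arbitrary illustration, not a recommended Flink setting) of how a bounded executor avoids this: excess tasks queue instead of forcing new `Thread.start0` calls.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class BoundedPool {
    public static void main(String[] args) throws Exception {
        // A fixed-size pool caps the number of native threads it will
        // ever create; submitting more tasks than threads queues them
        // rather than calling Thread.start0 again (the call that throws
        // the OutOfMemoryError in the traces above when the OS limit
        // is exhausted).
        ExecutorService pool = Executors.newFixedThreadPool(4);
        CountDownLatch done = new CountDownLatch(100);
        for (int i = 0; i < 100; i++) {
            pool.submit(done::countDown);
        }
        done.await();   // all 100 tasks ran on at most 4 worker threads
        pool.shutdown();
        System.out.println("completed with at most 4 worker threads");
    }
}
```

The default Scala `ForkJoinPool` instead grows workers on demand, which is why the failure surfaces there first on a loaded slave.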
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(3/16) switched to FAILED
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (5)(1) - a2c5d8333c3e7bdd7dd75e2211837f1c @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

06/11/2016 23:03:16 Job execution switched to status FAILING.
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (5)(1) - a2c5d8333c3e7bdd7dd75e2211837f1c @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(8/16) switched to CANCELING
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(12/16) switched to CANCELING
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(13/16) switched to CANCELING
06/11/2016 23:03:16 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(16/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(1/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(2/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(4/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(5/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(6/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(7/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(8/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(9/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(10/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(11/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(12/16) switched to CANCELED
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(13/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(14/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(15/16) switched to CANCELING
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(16/16) switched to CANCELED
259400 [flink-akka.actor.default-dispatcher-8] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(7/16) switched to CANCELED
259400 [flink-akka.actor.default-dispatcher-8] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
	... (stack trace identical to the first SubmitTask failure above)
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(2/16) switched to CANCELED
259401 [flink-akka.actor.default-dispatcher-8] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
	... (stack trace identical to the first SubmitTask failure above, differing only in the final ForkJoinPool frames)
06/11/2016 23:03:16 DataSink (org.apache.flink.api.java.Utils$***@66501fcd)(15/16) switched to CANCELED
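The repeated `OutOfMemoryError: unable to create new native thread` above means the build agent hit an OS-level limit on threads/processes, not a heap limit. As a minimal diagnostic sketch (using only the standard `java.lang.management` API; `ThreadPressure` is a hypothetical class name, not part of the build), the JVM's live and peak thread counts can be monitored to see how close a test run gets to the per-user limit reported by `ulimit -u`:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadPressure {
    public static void main(String[] args) {
        // ThreadMXBean reports threads in this JVM only; on a shared
        // Jenkins agent the OS limit applies to the sum across all JVMs.
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println("live=" + threads.getThreadCount()
                + " peak=" + threads.getPeakThreadCount());
        // When the cross-JVM total approaches `ulimit -u`, any
        // Thread.start() can fail with "unable to create new native thread".
    }
}
```

A snapshot like this, logged at the start of each test suite, helps distinguish a genuine thread leak in the Flink bindings tests from contention with other builds on the same agent.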
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-12 05:57:55 UTC
See <https://builds.apache.org/job/Mahout-Quality/3374/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

------------------------------------------
[...truncated 58214 lines...]
... 4 more
348219 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
348228 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-52]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
348229 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
348230 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-53]
348230 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
348231 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-54]
348232 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
348232 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-55]
348232 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
348233 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-56]
348233 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
348234 [flink-akka.actor.default-dispatcher-2] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-57]
	... (each of the errors above carries the same IllegalStateException / NullPointerException stack trace shown in full earlier)
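The `IllegalStateException: problem in scala.concurrent internal callback` entries above all wrap the same `NullPointerException` thrown from the JobManager's failure reporter (`JobManager.scala:133`) while timed-out asks complete during teardown. As a hedged illustration of that failure mode only (`SafeReporter` and its `loggerThread` field are hypothetical, not Flink's actual code), a reporter that dereferences state already torn down during shutdown throws exactly this kind of NPE, and a null-guard with a fallback avoids it:

```java
public class SafeReporter {
    // Hypothetical state that a shutdown path may null out concurrently.
    private volatile Thread loggerThread;

    public String reportFailure(Throwable t) {
        Thread logger = loggerThread;       // read the volatile field once
        if (logger == null) {
            // After shutdown the reporting machinery is gone: fall back
            // instead of throwing NPE from inside an internal callback.
            return "fallback: " + t.getMessage();
        }
        return "reported via " + logger.getName();
    }

    public static void main(String[] args) {
        SafeReporter r = new SafeReporter();
        System.out.println(r.reportFailure(
                new IllegalStateException("ask timed out")));
    }
}
```

The pattern (read once, guard, degrade gracefully) matters because a reporter that itself throws turns one timeout into a flood of secondary errors, which is what makes this log so noisy.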
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 87.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-12 09:41:17 UTC
See <https://builds.apache.org/job/Mahout-Quality/3375/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] [maven-release-plugin] rollback the release of mahout-0.12.2

------------------------------------------
[...truncated 58167 lines...]
... 4 more
353136 [flink-akka.actor.default-dispatcher-26] ERROR akka.actor.ActorSystemImpl - exception on LARS? timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
[The same pair of messages — "Uncaught error from thread [flink-scheduler-NN]" and "exception on LARS' timer thread", each with the identical IllegalStateException stack trace caused by the NullPointerException at JobManager.scala:133 — repeats for flink-scheduler-51 through flink-scheduler-56.]
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 88.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-12 22:37:56 UTC
Permalink
See <https://builds.apache.org/job/Mahout-Quality/3376/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

------------------------------------------
[...truncated 58952 lines...]
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/12/2016 19:29:17 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(11/16) switched to CANCELING
06/12/2016 19:29:17 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(15/16) switched to CANCELING
06/12/2016 19:29:17 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(16/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(1/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(2/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(3/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(4/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(5/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(6/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(9/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(7/16) switched to FAILED
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (13)(1) - ac5064bff43db4138bcc5494eda323a4 @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(10/16) switched to CANCELING
357474 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(11/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(12/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(13/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(12/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(14/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(15/16) switched to CANCELING
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(16/16) switched to CANCELED
357475 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
[identical java.lang.OutOfMemoryError: unable to create new native thread stack trace as the preceding SubmitTask failure]
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(13/16) switched to CANCELED
357475 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
[identical java.lang.OutOfMemoryError: unable to create new native thread stack trace as the preceding SubmitTask failure]
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(6/16) switched to CANCELED
357476 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
[identical java.lang.OutOfMemoryError: unable to create new native thread stack trace as the preceding SubmitTask failure]
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(1/16) switched to CANCELED
357476 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(9/16) switched to CANCELED
357477 [flink-akka.actor.default-dispatcher-11] ERROR org.apache.flink.runtime.taskmanager.TaskManager - SubmitTask failed
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.flink.runtime.taskmanager.Task.startTaskThread(Task.java:401)
at org.apache.flink.runtime.taskmanager.TaskManager.submitTask(TaskManager.scala:1043)
at org.apache.flink.runtime.taskmanager.TaskManager.org$apache$flink$runtime$taskmanager$TaskManager$$handleTaskMessage(TaskManager.scala:411)
at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$handleMessage$1.applyOrElse(TaskManager.scala:265)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.taskmanager.TaskManager.aroundReceive(TaskManager.scala:119)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(15/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(3/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(10/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(14/16) switched to CANCELED
06/12/2016 19:29:17 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(11/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(4/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(2/16) switched to CANCELED
06/12/2016 19:29:17 DataSink (org.apache.flink.api.java.Utils$***@5ac9c802)(5/16) switched to CANCELED
06/12/2016 19:29:17 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(16/16) switched to CANCELED
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 87.9% relative to #3363
Recording test results
Publishing Javadoc
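
The build above died with repeated `java.lang.OutOfMemoryError: unable to create new native thread` in the Flink TaskManager. Despite the class name, this error usually means the JVM hit an OS-level limit on threads or processes for the build user, not that the Java heap was exhausted. A minimal sketch for checking the relevant limits on a Linux build agent follows; the commands are standard Linux tools, and the idea that the Jenkins host's per-user limit was the culprit is an assumption, not something the log confirms:

```shell
# Check the limits that bound native thread creation on a Linux host.
# Hitting any of these produces "unable to create new native thread"
# even when plenty of heap remains.
ulimit -u                         # soft limit on processes/threads for this user
cat /proc/sys/kernel/threads-max  # system-wide ceiling on threads
ps -eLf | wc -l                   # rough count of threads currently alive
```

If the thread count from `ps -eLf` is near `ulimit -u`, raising the limit (e.g. via `/etc/security/limits.conf`) or reducing test parallelism on the agent would be the usual remedies; lowering the JVM's per-thread stack size with `-Xss` also lets more threads fit under the same limit.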
Apache Jenkins Server
2016-06-13 02:21:18 UTC
See <https://builds.apache.org/job/Mahout-Quality/3377/changes>

Changes:

[smarthi] Rolling back Mahout 0.12.2 Release candidate, thanks github connectivity

------------------------------------------
[...truncated 55968 lines...]
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(13/16) switched to RUNNING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(12/16) switched to RUNNING
06/12/2016 23:10:37 Job execution switched to status FAILING.
java.lang.Exception: Failed to deploy the task to slot SimpleSlot (0)(1) - df67b19955b334462482d095752b6ddb @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
at akka.dispatch.OnComplete.internal(Future.scala:247)
at akka.dispatch.OnComplete.internal(Future.scala:244)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
06/12/2016 23:10:37 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(5/16) switched to CANCELING
06/12/2016 23:10:37 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(6/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(1/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(2/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(3/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(4/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(5/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(6/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(7/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(8/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(9/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(10/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(11/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(11/16) switched to RUNNING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(12/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(13/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(14/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(15/16) switched to CANCELING
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(10/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(7/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(9/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(12/16) switched to CANCELED
06/12/2016 23:10:37 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(5/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(13/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(8/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(1/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(2/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(4/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(3/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(15/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(5/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(11/16) switched to CANCELED
06/12/2016 23:10:37 DataSink (org.apache.flink.api.java.Utils$***@4846fae7)(14/16) switched to CANCELED
06/12/2016 23:10:37 CHAIN DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:396) (org.apache.flink.api.scala.hadoop.mapred.HadoopInputFo) -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.drmDfsRead(FlinkEngine.scala:75))(6/16) switched to CANCELED
06/12/2016 23:10:37 Job execution switched to status FAILED.
- Model DFS Serialization *** FAILED ***
 org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:717)
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:663)
 at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:663)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
 at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
 at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
 at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
 at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
 at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
 ...
 Cause: java.lang.Exception: Failed to deploy the task to slot SimpleSlot (0)(1) - df67b19955b334462482d095752b6ddb @ localhost - 16 slots - URL: akka://flink/user/taskmanager_1 - ALLOCATED/ALIVE: Response was not of type Acknowledge
 at org.apache.flink.runtime.executiongraph.Execution$2.onComplete(Execution.java:395)
 at akka.dispatch.OnComplete.internal(Future.scala:247)
 at akka.dispatch.OnComplete.internal(Future.scala:244)
 at akka.dispatch.japi$CallbackBridge.apply(Future.scala:174)
 at akka.dispatch.japi$CallbackBridge.apply(Future.scala:171)
 at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
 at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
 at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
 at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
 at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
 ...
06/12/2016 23:10:38 Job execution switched to status RUNNING.
06/12/2016 23:10:38 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to SCHEDULED
06/12/2016 23:10:38 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to DEPLOYING
06/12/2016 23:10:38 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to RUNNING
06/12/2016 23:10:38 RangePartition: LocalSample(1/1) switched to SCHEDULED
06/12/2016 23:10:38 DataSource (at org.apache.mahout.flinkbindings.FlinkEngine$.parallelize(FlinkEngine.scala:273) (org.apache.flink.api.java.io.CollectionInputFormat))(1/1) switched to FINISHED
06/12/2016 23:10:38 RangePartition: LocalSample(1/1) switched to DEPLOYING
06/12/2016 23:10:38 RangePartition: PreparePartition(1/1) switched to SCHEDULED
06/12/2016 23:10:38 RangePartition: PreparePartition(1/1) switched to DEPLOYING
06/12/2016 23:10:38 RangePartition: LocalSample(1/1) switched to RUNNING
06/12/2016 23:10:38 RangePartition: PreparePartition(1/1) switched to RUNNING
06/12/2016 23:10:38 RangePartition: GlobalSample(1/1) switched to SCHEDULED
06/12/2016 23:10:38 RangePartition: GlobalSample(1/1) switched to DEPLOYING
06/12/2016 23:10:38 RangePartition: LocalSample(1/1) switched to FINISHED
06/12/2016 23:10:38 RangePartition: GlobalSample(1/1) switched to RUNNING
06/12/2016 23:10:38 RangePartition: Histogram(1/1) switched to SCHEDULED
06/12/2016 23:10:38 RangePartition: Histogram(1/1) switched to DEPLOYING
06/12/2016 23:10:38 RangePartition: GlobalSample(1/1) switched to FINISHED
06/12/2016 23:10:38 RangePartition: Histogram(1/1) switched to RUNNING
06/12/2016 23:10:38 RangePartition: Histogram(1/1) switched to FINISHED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to DEPLOYING
06/12/2016 23:10:38 RangePartition: PreparePartition(1/1) switched to FINISHED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(10/16) switched to SCHEDULED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(10/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(13/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(15/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(16/16) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(4/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to RUNNING
06/12/2016 23:10:38 Reduce (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/1) switched to SCHEDULED
06/12/2016 23:10:38 Reduce (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/1) switched to DEPLOYING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(5/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(14/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(2/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(11/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(12/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(9/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(8/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(3/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(7/16) switched to RUNNING
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(1/16) switched to FINISHED
06/12/2016 23:10:38 CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194))(6/16) switched to FINISHED
261715 [CHAIN RangePartition: Partition -> Partition -> Map (Map at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:192)) -> Combine (Reduce at org.apache.mahout.flinkbindings.FlinkEngine$.colSums(FlinkEngine.scala:194)) (12/16)] ERROR org.apache.flink.runtime.taskmanager.Task - FATAL - exception in task resource cleanup
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.fullExternalPush(ForkJoinPool.java:1905)
at scala.concurrent.forkjoin.ForkJoinPool.externalPush(ForkJoinPool.java:1834)
at scala.concurrent.forkjoin.ForkJoinPool.execute(ForkJoinPool.java:2955)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:387)
at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
at akka.dispatch.Dispatcher.registerForExecution(Dispatcher.scala:118)
at akka.dispatch.Dispatcher.dispatch(Dispatcher.scala:59)
at akka.actor.dungeon.Dispatch$class.sendMessage(Dispatch.scala:123)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.Cell$class.sendMessage(ActorCell.scala:290)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.RepointableActorRef.$bang(RepointableActorRef.scala:166)
at akka.actor.ActorRef.tell(ActorRef.scala:123)
at org.apache.flink.runtime.instance.AkkaActorGateway.tell(AkkaActorGateway.java:79)
at org.apache.flink.runtime.taskmanager.Task.notifyFinalState(Task.java:735)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:670)
at java.lang.Thread.run(Thread.java:745)
261715 [flink-akka.actor.default-dispatcher-19] ERROR org.apache.flink.runtime.executiongraph.ExecutionGraph - Error while notifying execution graph of execution state transition.
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(ForkJoinPool.java:1072)
at scala.concurrent.forkjoin.ForkJoinTask.fork(ForkJoinTask.java:654)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:386)
at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
at akka.dispatch.Dispatcher.registerForExecution(Dispatcher.scala:118)
at akka.dispatch.Dispatcher.dispatch(Dispatcher.scala:59)
at akka.actor.dungeon.Dispatch$class.sendMessage(Dispatch.scala:123)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.Cell$class.sendMessage(ActorCell.scala:290)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.RepointableActorRef.$bang(RepointableActorRef.scala:166)
at akka.actor.ActorRef.tell(ActorRef.scala:123)
at org.apache.flink.runtime.instance.AkkaActorGateway.tell(AkkaActorGateway.java:79)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.notifyExecutionChange(ExecutionGraph.java:1217)
at org.apache.flink.runtime.executiongraph.ExecutionVertex.notifyStateTransition(ExecutionVertex.java:627)
at org.apache.flink.runtime.executiongraph.Execution.transitionState(Execution.java:984)
at org.apache.flink.runtime.executiongraph.Execution.transitionState(Execution.java:966)
at org.apache.flink.runtime.executiongraph.Execution.markFinished(Execution.java:658)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.updateState(ExecutionGraph.java:1091)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$4.apply$mcV$sp(JobManager.scala:518)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$4.apply(JobManager.scala:517)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$4.apply(JobManager.scala:517)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
261718 [flink-akka.actor.default-dispatcher-4] ERROR org.apache.flink.runtime.taskmanager.TaskManager -
==============================================================
====================== FATAL =======================
==============================================================

A fatal error occurred, forcing the TaskManager to shut down: FATAL - exception in task resource cleanup
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool.fullExternalPush(ForkJoinPool.java:1905)
at scala.concurrent.forkjoin.ForkJoinPool.externalPush(ForkJoinPool.java:1834)
at scala.concurrent.forkjoin.ForkJoinPool.execute(ForkJoinPool.java:2955)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:387)
at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
at akka.dispatch.Dispatcher.registerForExecution(Dispatcher.scala:118)
at akka.dispatch.Dispatcher.dispatch(Dispatcher.scala:59)
at akka.actor.dungeon.Dispatch$class.sendMessage(Dispatch.scala:123)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.Cell$class.sendMessage(ActorCell.scala:290)
at akka.actor.ActorCell.sendMessage(ActorCell.scala:369)
at akka.actor.RepointableActorRef.$bang(RepointableActorRef.scala:166)
at akka.actor.ActorRef.tell(ActorRef.scala:123)
at org.apache.flink.runtime.instance.AkkaActorGateway.tell(AkkaActorGateway.java:79)
at org.apache.flink.runtime.taskmanager.Task.notifyFinalState(Task.java:735)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:670)
at java.lang.Thread.run(Thread.java:745)
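The `java.lang.OutOfMemoryError: unable to create new native thread` above is not a heap exhaustion: it is thrown when the OS refuses to create another native thread, typically because the per-user process/thread limit (`ulimit -u`) or native memory on the build slave is exhausted. A minimal diagnostic sketch (not part of this build, purely illustrative) that reports the JVM's thread counts via the standard `ThreadMXBean`, which can be compared against the slave's OS limits:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadDiag {
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        // "unable to create new native thread" fires when the OS denies a new
        // thread; compare these JVM-side counts against `ulimit -u` and
        // /proc/sys/kernel/threads-max on the build machine.
        System.out.println("live threads:  " + mx.getThreadCount());
        System.out.println("peak threads:  " + mx.getPeakThreadCount());
        System.out.println("total started: " + mx.getTotalStartedThreadCount());
    }
}
```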
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 88.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-13 18:18:07 UTC
Permalink
See <https://builds.apache.org/job/Mahout-Quality/3378/changes>

Changes:

[smarthi] [maven-release-plugin] prepare release mahout-0.12.2

[smarthi] [maven-release-plugin] prepare for next development iteration

------------------------------------------
[...truncated 62034 lines...]
... 4 more
428080 [flink-akka.actor.default-dispatcher-22] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
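This trace is a failure-reporting path itself failing: a scheduler timer task dies, and the `reportFailure` callback registered at `JobManager.scala:132-133` throws a `NullPointerException` while reporting it (plausibly by dereferencing a field already nulled during shutdown), which the executor then wraps as `IllegalStateException: problem in scala.concurrent internal callback`. A hedged Java sketch of that nesting, where the null `log` field and all names are illustrative assumptions rather than Flink's actual code:

```java
public class ReporterNpeSketch {
    // Illustrative stand-in for a field that has been nulled during shutdown.
    static StringBuilder log = null;

    static void reportFailure(Throwable original) {
        // The failure reporter itself dereferences the null field, so it
        // throws NPE while trying to record the original failure.
        log.append(original.toString());
    }

    public static void main(String[] args) {
        try {
            reportFailure(new RuntimeException("original task failure"));
        } catch (NullPointerException npe) {
            // The executor wraps the reporter's NPE, producing the nesting
            // seen in the log: IllegalStateException caused by NPE.
            IllegalStateException ise = new IllegalStateException(
                    "problem in scala.concurrent internal callback", npe);
            System.out.println(ise.getMessage());
            System.out.println("Caused by: " + ise.getCause().getClass().getName());
        }
    }
}
```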
428080 [flink-akka.actor.default-dispatcher-22] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-73]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
[...identical IllegalStateException ("problem in scala.concurrent internal callback") / Caused by: java.lang.NullPointerException stack-trace pairs, alternating "exception on LARS’ timer thread" and "Uncaught error from thread [flink-scheduler-74]" through [flink-scheduler-78] at timestamps 428081 to 428084, elided...]
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.05 MB of artifacts by 86.9% relative to #3363
Recording test results
Publishing Javadoc
Apache Jenkins Server
2016-06-16 03:47:52 UTC
Permalink
See <https://builds.apache.org/job/Mahout-Quality/3379/changes>

Changes:

[apalumbo] MAHOUT-1837: Sparse/Dense Matrix analysis for Matrix Multiplication.

------------------------------------------
[...truncated 23602 lines...]
37273 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37273 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-57]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
[...identical IllegalStateException / Caused by: java.lang.NullPointerException stack-trace pair, "exception on LARS’ timer thread" and "Uncaught error from thread [flink-scheduler-58]" at timestamps 37273 to 37274, elided...]
37274 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37275 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-59]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37275 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37275 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-60]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37276 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37276 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-61]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37276 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - exception on LARS’ timer thread
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
37277 [flink-akka.actor.default-dispatcher-32] ERROR akka.actor.ActorSystemImpl - Uncaught error from thread [flink-scheduler-62]
java.lang.IllegalStateException: problem in scala.concurrent internal callback
at scala.concurrent.Future$InternalCallbackExecutor$.reportFailure(Future.scala:592)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:471)
at akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)
at akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)
at akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:133)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$4.apply(JobManager.scala:132)
at scala.concurrent.impl.ExecutionContextImpl.reportFailure(ExecutionContextImpl.scala:125)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)
at akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)
at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)
at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)
... 4 more
Build timed out (after 223 minutes). Marking the build as failed.
Build was aborted
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 171.06 MB of artifacts by 86.8% relative to #3363
Recording test results
Publishing Javadoc
Updating MAHOUT-1837
Apache Jenkins Server
2016-06-24 22:16:33 UTC
Permalink
See <https://builds.apache.org/job/Mahout-Quality/3380/changes>