Discussion:
[jira] [Created] (MAHOUT-1758) mahout spark-shell - get illegal acces eror at startup
JP Bordenave (JIRA)
2015-07-17 21:29:04 UTC
Permalink
JP Bordenave created MAHOUT-1758:
------------------------------------

Summary: mahout spark-shell - get illegal acces eror at startup
Key: MAHOUT-1758
URL: https://issues.apache.org/jira/browse/MAHOUT-1758
Project: Mahout
Issue Type: Bug
Environment: Linux Ubuntu 14.04; cluster of 1 master PC and 2 slave PCs, 16 GB RAM per node.
Hadoop 2.6
Spark 1.4.1
Mahout 0.10.1
R 3.0.2/RHadoop
Scala 2.10


Reporter: JP Bordenave
Priority: Critical


Hello,

I installed Hadoop 2.6 and Spark 1.4; SparkR and PySpark are working fine, no issues (Scala 2.10).

Now I am trying to configure Mahout with my Spark/Hadoop cluster, but when I start Mahout I get an IllegalAccessError. If I try to start in local mode, I get the same error. Spark 1.4.x looks to be incompatible with Mahout 0.10.1.
Can you confirm? Is there a patch?
Edit: I saw in the Mahout 0.10.1 release notes that it is compatible with Spark 1.2.2 or lower.
Thanks for your info,
JP

I set my variables for my Spark cluster:
export SPARK_HOME=/usr/local/spark
export MASTER=spark://stargate:7077

{noformat}
***@stargate:~$ mahout spark-shell
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-examples-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-mr-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/07/17 23:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

                         _                 _
         _ __ ___   __ _| |__   ___  _   _| |_
        | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
        | | | | | | (_| | | | | (_) | |_| | |_
        |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.10.0


Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
java.lang.IllegalAccessError: tried to access method org.apache.spark.repl.SparkIMain.classServer()Lorg/apache/spark/HttpServer; from class org.apache.mahout.sparkbindings.shell.MahoutSparkILoop
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:42)
at $iwC$$iwC.<init>(<console>:11)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(MahoutSparkILoop.scala:63)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.initializeSpark(MahoutSparkILoop.scala:62)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)

Mahout distributed context is available as "implicit val sdc".
{noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
JP Bordenave (JIRA)
2015-07-17 21:29:05 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

JP Bordenave updated MAHOUT-1758:
---------------------------------
Summary: mahout spark-shell - get illegal acces error at startup (was: mahout spark-shell - get illegal acces eror at startup)
Dmitriy Lyubimov
2015-07-17 22:33:39 UTC
Permalink
We don't support Spark 1.4 yet. The 0.10.x branch is compatible with Spark 0.9 through 1.2, and 0.11.0 (master) is compatible with 1.3. I don't think anyone has looked at 1.4 yet. So, OK, we can start working on it under this issue.
Thank you.
Suneel Marthi (JIRA)
2015-08-09 02:40:45 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14663227#comment-14663227 ]

Suneel Marthi commented on MAHOUT-1758:
---------------------------------------

Mahout 0.10.1 is not compatible with Spark 1.3+. Please try 0.11.0, which was released on Aug 7, 2015. Resolving this as a non-issue for now; please feel free to file a JIRA if you see a similar issue with 0.11.0.
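Following that suggestion, switching the environment to Mahout 0.11.0 might look like the sketch below; the install paths are assumptions mirroring the directory layout visible in the reporter's logs, not verified locations.

```shell
# Assumed paths, patterned after the layout in the logs above
export MAHOUT_HOME=/usr/local/apache-mahout-distribution-0.11.0
export SPARK_HOME=/usr/local/spark-1.4.1-bin-hadoop2.6
export MASTER=spark://stargate:7077        # Spark master from the report
export PATH="$MAHOUT_HOME/bin:$PATH"

mahout spark-shell                          # should now start without IllegalAccessError
```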
Suneel Marthi (JIRA)
2015-08-09 02:40:45 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Suneel Marthi updated MAHOUT-1758:
----------------------------------
Assignee: Suneel Marthi
Affects Version/s: 0.10.1
Fix Version/s: 0.11.0
Suneel Marthi (JIRA)
2015-08-09 02:41:45 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Suneel Marthi resolved MAHOUT-1758.
-----------------------------------
Resolution: Not A Problem
JP Bordenave (JIRA)
2015-08-09 08:10:46 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14663320#comment-14663320 ]

JP Bordenave commented on MAHOUT-1758:
--------------------------------------


Hello,

Thanks for the answer.

I have already worked around the issue by removing Mahout from my global Hadoop 2.7.1/Spark ecosystem installation, since I was not able to resolve the incompatibility. I will continue with the rest of my ecosystem (R, Spark, and Pig on Hadoop, plus other tools), which all work fine together.

KR
JP
JP Bordenave (JIRA)
2015-08-09 08:35:45 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

JP Bordenave closed MAHOUT-1758.
--------------------------------
Post by JP Bordenave (JIRA)
mahout spark-shell - get illegal acces error at startup
-------------------------------------------------------
Key: MAHOUT-1758
URL: https://issues.apache.org/jira/browse/MAHOUT-1758
Project: Mahout
Issue Type: Bug
Affects Versions: 0.10.1
Environment: linux unbuntu 14.04, cluster 1pc master 2pc slave, 16GB ram by node.
Hadoop 2.6
Spark 1.4.1
Mahout 10.1
R 3.0.2/Rhadoop
scala 2.10
Reporter: JP Bordenave
Assignee: Suneel Marthi
Priority: Critical
Fix For: 0.11.0
[...]
Chomon (JIRA)
2016-11-09 05:45:58 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15649888#comment-15649888 ]

Chomon commented on MAHOUT-1758:
--------------------------------

When I type bin/mahout spark-shell, I get the following error:
***@ubuntu:~/mahout$ bin/mahout spark-shell
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
16/11/08 21:38:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

_ _
_ __ ___ __ _| |__ ___ _ _| |_
| '_ ` _ \ / _` | '_ \ / _ \| | | | __|
| | | | | | (_| | | | | (_) | |_| | |_
|_| |_| |_|\__,_|_| |_|\___/ \__,_|\__| version 0.12.2


Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/11/08 21:38:39 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Created spark context..
Spark context is available as "val sc".
Mahout distributed context is available as "implicit val sdc".
16/11/08 21:39:00 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/11/08 21:39:00 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/11/08 21:39:00 WARN : Your hostname, ubuntu resolves to a loopback/non-reachable address: 127.0.0.1, but we couldn't find any external IP address!
16/11/08 21:39:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/08 21:39:10 WARN : Your hostname, ubuntu resolves to a loopback/non-reachable address: 127.0.0.1, but we couldn't find any external IP address!
SQL context available as "val sqlContext".
mahout>
How can I solve it?
The versions I used: Spark 1.5.2, Hadoop 2.6, Java 7u79 JDK, and Scala 2.10.4.
Post by JP Bordenave (JIRA)
mahout spark-shell - get illegal access error at startup
[...]
Suneel Marthi (JIRA)
2016-11-09 18:33:59 UTC
Permalink
[ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15651683#comment-15651683 ]

Suneel Marthi commented on MAHOUT-1758:
---------------------------------------

I don't see an error in your message; what you are seeing is a warning, and you should still be good to go.
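Both messages Suneel refers to are warnings, and both can be silenced if the noise is unwanted. A sketch along these lines should do it — the paths and property names assume a standard Spark 1.x binary distribution (conf/log4j.properties) and are not from this thread; SPARK_HOME falls back to a scratch directory here only so the snippet runs anywhere:

```shell
# Sketch, not from the thread: quiet the two startup warnings.
# On a real install SPARK_HOME already points at the Spark distribution;
# the mktemp fallback just keeps this snippet self-contained.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
CONF_DIR="$SPARK_HOME/conf"
mkdir -p "$CONF_DIR"

# 1. "Your hostname ... resolves to a loopback/non-reachable address":
#    bind the driver to an explicit address instead of letting Spark guess.
export SPARK_LOCAL_IP=127.0.0.1

# 2. "Unable to load native-hadoop library": logged at WARN by
#    org.apache.hadoop.util.NativeCodeLoader; raise that logger's threshold.
echo "log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR" \
    >> "$CONF_DIR/log4j.properties"

grep NativeCodeLoader "$CONF_DIR/log4j.properties"
```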
Post by JP Bordenave (JIRA)
mahout spark-shell - get illegal access error at startup
[...]
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
