Support for Spark 2.0

Support for Spark 2.0

Phadnis, Varun
Hello,

Can someone please tell me when support for Spark 2.0 is planned?

Currently Ignite cannot be built against Spark 2.0. Attempting this yields the following error:


[INFO] ------------------------------------------------------------------------
[INFO] Building ignite-spark 1.8.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ ignite-spark ---
[INFO] Deleting /home/spark/code/ignite/modules/spark/target
[INFO]
[INFO] --- flatten-maven-plugin:1.0.0-beta-3:clean (flatten.clean.before) @ ignite-spark ---
[INFO] Deleting /home/spark/code/ignite/modules/spark/pom-installed.xml
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (default) @ ignite-spark ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ ignite-spark ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (ignite-dependencies) @ ignite-spark ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ ignite-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/spark/code/ignite/modules/spark/src/main/resources
[INFO] Copying 4 resources
[INFO] Copying 4 resources
[INFO]
[INFO] --- flatten-maven-plugin:1.0.0-beta-3:flatten (flatten) @ ignite-spark ---
[INFO] Generating flattened POM of project org.apache.ignite:ignite-spark:jar:1.8.0-SNAPSHOT...
[WARNING] Ignoring multiple XML header comment!
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:add-source (scala-compile-first) @ ignite-spark ---
[INFO] Add Source directory: /home/spark/code/ignite/modules/spark/src/main/scala
[INFO] Add Test Source directory: /home/spark/code/ignite/modules/spark/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ ignite-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.7
[WARNING] org.apache.ignite:ignite-spark:1.8.0-SNAPSHOT requires scala version: 2.11.7
[WARNING] com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
[WARNING] org.apache.spark:spark-core_2.11:2.0.0 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/spark/code/ignite/modules/spark/src/main/scala:-1: info: compiling
[INFO] Compiling 8 source files to /home/spark/code/ignite/modules/spark/target/classes at 1474028222935
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:25: error: object Logging is not a member of package org.apache.spark
[ERROR] import org.apache.spark.{Logging, SparkContext}
[ERROR]        ^
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:37: error: not found: type Logging
[ERROR]     ) extends Serializable with Logging {
[ERROR]                                 ^
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:50: error: not found: value logInfo
[ERROR]         logInfo("Will start Ignite nodes on " + workers + " workers")
[ERROR]         ^
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:129: error: not found: value logInfo
[ERROR]             logInfo("Setting IGNITE_HOME from driver not as it is not available on this worker: " + igniteHome)
[ERROR]             ^
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:146: error: not found: value logError
[ERROR]                 logError("Failed to start Ignite.", e)
[ERROR]                 ^
[ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:164: error: not found: value logInfo
[ERROR]                 logInfo("Will stop Ignite nodes on " + workers + " workers")
[ERROR]                 ^
[ERROR] 6 errors found
[INFO] ------------------------------------------------------------------------

Thanks!
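[Editor's note: for context on the errors above, Spark 2.0 removed the public org.apache.spark.Logging trait that IgniteContext mixes in, which is what makes the import and the logInfo/logError calls fail. A common workaround, shown here only as a sketch and not as the eventual Ignite fix, is to define a local drop-in trait with the same method names, backed by SLF4J (which Spark itself depends on), and let the existing call sites resolve against it:]

```scala
// Hypothetical local replacement for the removed org.apache.spark.Logging
// trait. The method names (logInfo, logError) deliberately mirror the call
// sites in IgniteContext.scala so the rest of the file compiles unchanged.
import org.slf4j.{Logger, LoggerFactory}

trait Logging {
  // Transient so a Serializable class mixing this in never tries to
  // serialize the logger itself.
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName)

  // By-name parameter: the message string is only built if INFO is enabled.
  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logError(msg: => String, e: Throwable): Unit =
    log.error(msg, e)
}
```

With a trait like this on the compile classpath, `extends Serializable with Logging` would resolve against the local definition under both Spark 1.x and 2.0.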

Re: Support for Spark 2.0

Sergey Kozlov
Hi,

It's a known issue: IGNITE-3596, "Hadoop edition can't be compiled against
Spark 2.0.0" (https://issues.apache.org/jira/browse/IGNITE-3596).

Unfortunately, there's no progress on it yet.

On Mon, Sep 19, 2016 at 1:36 PM, Phadnis, Varun <[hidden email]>
wrote:




--
Sergey Kozlov
GridGain Systems
www.gridgain.com

RE: Support for Spark 2.0

Phadnis, Varun
Hello,

Just to be clear, I am not explicitly building the Hadoop edition. I get the described error when I simply specify the Spark version:

  mvn clean package -Dspark.version=2.0.0 -DskipTests

Thanks for the quick response!


-----Original Message-----
From: Sergey Kozlov [mailto:[hidden email]]
Sent: 19 September 2016 05:56
To: [hidden email]
Subject: Re: Support for Spark 2.0


Re: Support for Spark 2.0

Richard Siebeling
I wish someone could fix this; unfortunately, I can't do anything about it
in the near future...

On Mon, Sep 19, 2016 at 3:20 PM, Phadnis, Varun <[hidden email]>
wrote:


Re: Support for Spark 2.0

Vladimir Ozerov
I think we can expect Spark 2.0 support to be added in one of the next
releases (not sure whether it will be 1.8 or 2.0).

On Mon, Sep 19, 2016 at 10:20 PM, Richard Siebeling <[hidden email]>
wrote:


Re: Support for Spark 2.0

Denis Magda
Is this the only issue we have with Spark 2.0?
https://issues.apache.org/jira/browse/IGNITE-3596


Denis

> On Sep 20, 2016, at 11:54 AM, Vladimir Ozerov <[hidden email]> wrote:
>
> I think we can expect Spark 2.0 support to be added in one of the nearest
> releases (not sure if it will be 1.8 or 2.0).
>
> On Mon, Sep 19, 2016 at 10:20 PM, Richard Siebeling <[hidden email]>
> wrote:
>
>> I'd wish someone could fix this, unfortunately I can't do anything about it
>> in the near future...
>>
>> On Mon, Sep 19, 2016 at 3:20 PM, Phadnis, Varun <[hidden email]>
>> wrote:
>>
>>> Hello,
>>>
>>> Just to be clear, I am not explicitly building the Hadoop edition. I get
>>> the described error when I simply specify the Spark version:
>>>
>>>  mvn clean package -Dspark.version=2.0.0 -DskipTests
>>>
>>> Thanks for the quick response!
>>>
>>>
>>> -----Original Message-----
>>> From: Sergey Kozlov [mailto:[hidden email]]
>>> Sent: 19 September 2016 05:56
>>> To: [hidden email]
>>> Subject: Re: Support for Spark 2.0
>>>
>>> Hi
>>>
>>> It's a known issue IGNITE-3596 Hadoop edition can't be compiled against
>>> spark 2.0.0 <https://issues.apache.org/jira/browse/IGNITE-3596>
>>>
>>> Unfortunately there's no progress yet
>>>
>>> On Mon, Sep 19, 2016 at 1:36 PM, Phadnis, Varun <
>> [hidden email]>
>>> wrote:
>>>
>>>> Hello,
>>>>
>>>> Can someone please tell me when is the support for Spark 2.0 planned?
>>>>
>>>> Currently Ignite cannot be built for Spark 2.0. Attempting this yields
>>>> the following error :
>>>>
>>>>
>>>> ----------------------------------------------------------------------
>>>> --
>>>>
>>>> [INFO] Building ignite-spark 1.8.0-SNAPSHOT [INFO]
>>>>
>>>> ----------------------------------------------------------------------
>>>> --
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ ignite-spark
>>>> --- [INFO] Deleting /home/spark/code/ignite/modules/spark/target
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- flatten-maven-plugin:1.0.0-beta-3:clean
>>>> (flatten.clean.before) @ ignite-spark --- [INFO] Deleting
>>>> /home/spark/code/ignite/ modules/spark/pom-installed.xml
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- maven-enforcer-plugin:1.4:enforce (default) @ ignite-spark
>>>> --- [INFO] [INFO] --- maven-remote-resources-plugin:1.5:process
>>>> (default) @ ignite-spark --- [INFO] [INFO] ---
>>>> maven-remote-resources-plugin:1.5:process
>>>> (ignite-dependencies) @ ignite-spark --- [INFO] [INFO] ---
>>>> maven-resources-plugin:2.6:resources (default-resources) @
>>>> ignite-spark
>>>> --- [INFO] Using 'UTF-8' encoding to copy filtered resources.
>>>>
>>>> [INFO] skip non existing resourceDirectory /home/spark/code/ignite/
>>>> modules/spark/src/main/resources
>>>>
>>>> [INFO] Copying 4 resources
>>>>
>>>> [INFO] Copying 4 resources
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- flatten-maven-plugin:1.0.0-beta-3:flatten (flatten) @
>>>> ignite-spark --- [INFO] Generating flattened POM of project
>>>> org.apache.ignite:ignite-spark:jar:1.8.0-SNAPSHOT...
>>>>
>>>> [WARNING] Ignoring multiple XML header comment!
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- scala-maven-plugin:3.2.0:add-source (scala-compile-first) @
>>>> ignite-spark --- [INFO] Add Source directory:
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala
>>>>
>>>> [INFO] Add Test Source directory:
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/test/scala
>>>>
>>>> [INFO]
>>>>
>>>> [INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @
>>>> ignite-spark --- [WARNING]  Expected all dependencies to require Scala
>>>> version: 2.11.7 [WARNING]
>>>> org.apache.ignite:ignite-spark:1.8.0-SNAPSHOT
>>>> requires scala
>>>>
>>>> version: 2.11.7
>>>>
>>>> [WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
>>>> [WARNING]  org.apache.spark:spark-core_2.11:2.0.0 requires scala
>>> version:
>>>>
>>>> 2.11.8
>>>>
>>>> [WARNING] Multiple versions of scala libraries detected!
>>>>
>>>> [INFO] /home/spark/code/ignite/modules/spark/src/main/scala:-1: info:
>>>>
>>>> compiling
>>>>
>>>> [INFO] Compiling 8 source files to
>>>>
>>>> /home/spark/code/ignite/modules/spark/target/classes at 1474028222935
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:25:
>>>>
>>>> error: object Logging is not a member of package org.apache.spark
>>>> [ERROR] import org.apache.spark.{Logging, SparkContext}
>>>>
>>>> [ERROR]        ^
>>>>
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:37:
>>>>
>>>> error: not found: type Logging
>>>>
>>>> [ERROR]     ) extends Serializable with Logging {
>>>>
>>>> [ERROR]                                 ^
>>>>
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:50:
>>>>
>>>> error: not found: value logInfo
>>>>
>>>> [ERROR]         logInfo("Will start Ignite nodes on " + workers + "
>>>>
>>>> workers")
>>>>
>>>> [ERROR]         ^
>>>>
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:129:
>>>>
>>>> error: not found: value logInfo
>>>>
>>>> [ERROR]             logInfo("Setting IGNITE_HOME from driver not as it
>> is
>>>>
>>>> not available on this worker: " + igniteHome)
>>>>
>>>> [ERROR]             ^
>>>>
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:146:
>>>>
>>>> error: not found: value logError
>>>>
>>>> [ERROR]                 logError("Failed to start Ignite.", e)
>>>>
>>>> [INFO]                 ^
>>>>
>>>> [ERROR]
>>>>
>>>> /home/spark/code/ignite/modules/spark/src/main/scala/
>>>> org/apache/ignite/spark/IgniteContext.scala:164:
>>>>
>>>> error: not found: value logInfo
>>>>
>>>> [ERROR]                 logInfo("Will stop Ignite nodes on " + workers
>> +
>>> "
>>>>
>>>> workers")
>>>>
>>>> [ERROR]                 ^
>>>>
>>>> [ERROR] 6 errors found
>>>>
>>>> [INFO]
>>>>
>>>> ----------------------------------------------------------------------
>>>> --
>>>>
>>>> Thanks!
>>>>
>>>
>>>
>>>
>>> --
>>> Sergey Kozlov
>>> GridGain Systems
>>> www.gridgain.com
>>>
>>
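
The root cause of the six compilation errors above is that Spark 2.0 made the `org.apache.spark.Logging` trait private to Spark, so `IgniteContext.scala` can no longer import it or call its `logInfo`/`logError` helpers. One common workaround is a small local trait backed by slf4j, which is already on the classpath as a Spark dependency. A minimal sketch; the trait name `IgniteLogging` is illustrative and not necessarily what the actual fix in Ignite uses:

```scala
import org.slf4j.{Logger, LoggerFactory}

// Drop-in stand-in for the removed org.apache.spark.Logging trait.
// The logger is @transient and lazy so it is not serialized into
// Spark closures and is re-created on each worker as needed.
trait IgniteLogging extends Serializable {
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName.stripSuffix("$"))

  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logError(msg: => String, e: Throwable): Unit =
    log.error(msg, e)
}
```

With such a trait in place, `IgniteContext` would declare `extends Serializable with IgniteLogging` instead of importing `Logging` from the `org.apache.spark` package, and the existing `logInfo`/`logError` call sites compile unchanged.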


Re: Support for Spark 2.0

Roman Shtykh
There are a number of requests from users for this upgrade.
https://issues.apache.org/jira/browse/IGNITE-3822 and related.

-Roman
 

    On Wednesday, September 21, 2016 5:14 AM, Denis Magda <[hidden email]> wrote:
 

 Is this the only issue we have with Spark 2.0?
https://issues.apache.org/jira/browse/IGNITE-3596 <https://issues.apache.org/jira/browse/IGNITE-3596>


Denis

> On Sep 20, 2016, at 11:54 AM, Vladimir Ozerov <[hidden email]> wrote:
>
> I think we can expect Spark 2.0 support to be added in one of the nearest
> releases (not sure if it will be 1.8 or 2.0).
>
> On Mon, Sep 19, 2016 at 10:20 PM, Richard Siebeling <[hidden email]>
> wrote:
>
>> I'd wish someone could fix this, unfortunately I can't do anything about it
>> in the near future...
>>
>> On Mon, Sep 19, 2016 at 3:20 PM, Phadnis, Varun <[hidden email]>
>> wrote:
>>
>>> Hello,
>>>
>>> Just to be clear, I am not explicitly building the Hadoop edition. I get
>>> the described error when I simply specify the Spark version:
>>>
>>>  mvn clean package -Dspark.version=2.0.0 -DskipTests
>>>
>>> Thanks for the quick response!
>>>
>>>
>>> -----Original Message-----
>>> From: Sergey Kozlov [mailto:[hidden email]]
>>> Sent: 19 September 2016 05:56
>>> To: [hidden email]
>>> Subject: Re: Support for Spark 2.0
>>>
>>> Hi
>>>
>>> It's a known issue IGNITE-3596 Hadoop edition can't be compiled against
>>> spark 2.0.0 <https://issues.apache.org/jira/browse/IGNITE-3596>
>>>
>>> Unfortunately there's no progress yet
>>>
>>> On Mon, Sep 19, 2016 at 1:36 PM, Phadnis, Varun <
>> [hidden email]>
>>> wrote:
>>>
>>>> Hello,
>>>>
>>>> Can someone please tell me when is the support for Spark 2.0 planned?
>>>>
>>>> Currently Ignite cannot be built for Spark 2.0. Attempting this yields
>>>> the following error :
>>>>
>>>>
>>>> [... Maven build log elided; identical to the output quoted at the start of the thread ...]
>>>>
>>>> Thanks!
>>>>
>>>
>>>
>>>
>>> --
>>> Sergey Kozlov
>>> GridGain Systems
>>> www.gridgain.com
>>>
>>


   

Re: Support for Spark 2.0

dmagda
I upgraded the Ignite Spark module to the latest Spark version. The transition was trivial.

However, I couldn't fully execute the Spark test suite on TeamCity due to a Maven dependency issue. The issue never occurs on my local machine.

Anton V., please have a look at it and suggest how to get rid of it. The details are in the ticket:
https://issues.apache.org/jira/browse/IGNITE-3710 <https://issues.apache.org/jira/browse/IGNITE-3710>
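
For Maven dependency issues like the one described above, the `dependency:tree` goal usually shows which version of each artifact Maven actually resolved, and with `-Dverbose` it also shows the conflicting versions that were mediated away; filtering for Scala artifacts would surface mixed versions like the 2.11.7/2.11.8 clash in the build log earlier in the thread. A sketch; the module path `modules/spark` matches the paths in the log but may need adjusting for a given checkout:

```shell
# Print the resolved dependency tree for the spark module only,
# including conflicts Maven mediated away, filtered to Scala artifacts.
mvn -pl modules/spark dependency:tree -Dverbose | grep -i scala
```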


Denis

> On Sep 20, 2016, at 6:46 PM, Roman Shtykh <[hidden email]> wrote:
>
> There is a bunch of requests from users for this upgrade.
> https://issues.apache.org/jira/browse/IGNITE-3822 and related.
>
> -Roman
>
>
>    On Wednesday, September 21, 2016 5:14 AM, Denis Magda <[hidden email]> wrote:
>
>
> Is this the only issue we have with Spark 2.0?
> https://issues.apache.org/jira/browse/IGNITE-3596 <https://issues.apache.org/jira/browse/IGNITE-3596>
>
> —
> Denis
>
>> On Sep 20, 2016, at 11:54 AM, Vladimir Ozerov <[hidden email]> wrote:
>>
>> I think we can expect Spark 2.0 support to be added in one of the nearest
>> releases (not sure if it will be 1.8 or 2.0).
>>
>> On Mon, Sep 19, 2016 at 10:20 PM, Richard Siebeling <[hidden email]>
>> wrote:
>>
>>> I'd wish someone could fix this, unfortunately I can't do anything about it
>>> in the near future...
>>>
>>> On Mon, Sep 19, 2016 at 3:20 PM, Phadnis, Varun <[hidden email]>
>>> wrote:
>>>
>>>> Hello,
>>>>
>>>> Just to be clear, I am not explicitly building the Hadoop edition. I get
>>>> the described error when I simply specify the Spark version:
>>>>
>>>>   mvn clean package -Dspark.version=2.0.0 -DskipTests
>>>>
>>>> Thanks for the quick response!
>>>>
>>>>
>>>> -----Original Message-----
>>>> From: Sergey Kozlov [mailto:[hidden email]]
>>>> Sent: 19 September 2016 05:56
>>>> To: [hidden email]
>>>> Subject: Re: Support for Spark 2.0
>>>>
>>>> Hi
>>>>
>>>> It's a known issue IGNITE-3596 Hadoop edition can't be compiled against
>>>> spark 2.0.0 <https://issues.apache.org/jira/browse/IGNITE-3596>
>>>>
>>>> Unfortunately there's no progress yet
>>>>
>>>> On Mon, Sep 19, 2016 at 1:36 PM, Phadnis, Varun <
>>> [hidden email]>
>>>> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> Can someone please tell me when is the support for Spark 2.0 planned?
>>>>>
>>>>> Currently Ignite cannot be built for Spark 2.0. Attempting this yields
>>>>> the following error :
>>>>>
>>>>>
>>>>> [... Maven build log elided; identical to the output quoted at the start of the thread ...]
>>>>>
>>>>> Thanks!
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Sergey Kozlov
>>>> GridGain Systems
>>>> www.gridgain.com
>>>>
>>>
>
>