Eclipse (set up with Scala environment): object apache is not a member of package org


[screenshot: Eclipse editor flagging the Spark imports with a compile error]

As shown in the image, Eclipse gives an error when importing the Spark packages. Please help. Hovering over the import shows "object apache is not a member of package org". Searching on the error suggests that the Spark jars have not been imported, so I imported "spark-assembly-1.4.1-hadoop2.2.0.jar" as well, but I still get the same error. Below is what I want to run:

    import org.apache.spark.{SparkConf, SparkContext}

    object ABC {
      def main(args: Array[String]) { // Scala main method
        println("Spark configuration")
        val conf = new SparkConf()
        conf.setAppName("My First Spark Scala Application")
        conf.setMaster("spark://ip-10-237-224-94:7077")
        println("Creating Spark Context")
      }
    }
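For reference, a minimal sketch of what the completed program presumably builds toward once the classpath is fixed. The SparkContext construction and the stop() call are my additions; the configuration values come from the snippet above:

    import org.apache.spark.{SparkConf, SparkContext}

    object ABC {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("My First Spark Scala Application")
          .setMaster("spark://ip-10-237-224-94:7077")

        println("Creating Spark Context")
        // This constructor is what requires spark-core on the classpath;
        // without it, even the import above fails to resolve.
        val sc = new SparkContext(conf)

        println("Spark context created for app: " + sc.appName)
        sc.stop() // release cluster resources when done
      }
    }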

Adding the spark-core jar to the classpath should resolve the issue. If you are using a build tool such as Maven or Gradle (and if you are not, you should be, because spark-core has a lot of dependencies and you will keep hitting this kind of problem with different jars), try using the Eclipse task provided by these tools to set the classpath in your project.
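As a concrete sketch, assuming sbt (the standard Scala build tool; the Maven and Gradle declarations are analogous), with the project name hypothetical and the versions inferred from the question's "spark-assembly-1.4.1-hadoop2.2.0.jar":

    // build.sbt -- minimal sketch
    name := "my-first-spark-app"

    scalaVersion := "2.10.5" // Spark 1.4.1 artifacts are published for Scala 2.10

    // Pulls in spark-core and all of its transitive dependencies,
    // so individual jars never have to be added by hand.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

With that in place, a plugin such as sbteclipse (or `mvn eclipse:eclipse` on the Maven side) can regenerate the Eclipse project files with the full dependency classpath, which is the Eclipse integration the answer refers to.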

