Getting a connection error while reading data from Elasticsearch using Apache Spark & Scala


I used the following code:

val conf = new org.apache.spark.SparkConf()
  .setMaster("local[*]")
  .setAppName("es-example")
  .set("es.nodes", "search-2meoihmu.us-est-1.es.amazonaws.com")
val sc = new org.apache.spark.SparkContext(conf)
val resource = "index/data"
val count = sc.esRDD(resource).count()
println(count)

with the following versions:

Elasticsearch version = 1.5.2, Spark version = 1.5.2, Scala version = 2.10.4

and this library dependency:

libraryDependencies += "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.1.3"

I get the following error while running the program:

Exception in thread "main" org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings) - all nodes failed

How can I read data from Elasticsearch using Spark and Scala?

Please look at the option "es.nodes.wan.only". By default, the value for this key is set to "false", and when I set it to "true", the exception went away. The current documentation for the configuration values is here: https://www.elastic.co/guide/en/elasticsearch/hadoop/current/configuration.html.

val conf = new org.apache.spark.SparkConf()
  .setMaster("local[*]")
  .setAppName("es-example")
  .set("es.nodes", "search-2meoihmu.us-est-1.es.amazonaws.com")
  .set("es.nodes.wan.only", "true")

Note that while the documentation says to flip this value to true for environments on AWS, the exception also happened to me when attempting to point to a VM running Elasticsearch.
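Putting the fix together with the original program, a minimal end-to-end sketch might look like the following. The endpoint and index name are the ones from the question; note that the esRDD method is an implicit extension on SparkContext, so the import of org.elasticsearch.spark._ is required for the original code to compile at all.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // brings esRDD into scope via implicit conversion

object EsExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("es-example")
      .set("es.nodes", "search-2meoihmu.us-est-1.es.amazonaws.com")
      .set("es.nodes.wan.only", "true") // the fix: do not try to discover other nodes

    val sc = new SparkContext(conf)
    try {
      // "index/data" is the resource string from the question: index name / type name
      val count = sc.esRDD("index/data").count()
      println(count)
    } finally {
      sc.stop()
    }
  }
}
```

This will not run without a reachable Elasticsearch cluster and the elasticsearch-spark connector on the classpath, so treat it as a template rather than a self-contained program.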
