Create a JavaRDD from a list in Spark 2.0 -
The usual way of creating a JavaRDD from a list is to use JavaSparkContext.parallelize(list).
However, in Spark 2.0, SparkSession is used as the entry point, and I don't know how to create a JavaRDD from a list.
Solution: In the spark-shell (Spark 2.0), you can wrap the existing SparkContext (sc) in a JavaSparkContext and call parallelize on it:
    import org.apache.spark.api.java.JavaSparkContext

    val jsc = new JavaSparkContext(sc)
    val javaList: java.util.List[Int] = java.util.Arrays.asList(1, 2, 3, 4, 5)
    val javaRDD = jsc.parallelize(javaList)
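If you are writing a standalone Java application rather than using the spark-shell, the same idea applies: obtain the underlying SparkContext from the SparkSession and wrap it in a JavaSparkContext. A minimal sketch (the class name, app name, and local master are arbitrary choices for illustration):

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class ParallelizeExample {
    public static void main(String[] args) {
        // In Spark 2.0, SparkSession is the unified entry point
        SparkSession spark = SparkSession.builder()
                .appName("ParallelizeExample") // arbitrary app name
                .master("local[*]")            // local mode for illustration
                .getOrCreate();

        // Wrap the session's SparkContext in a JavaSparkContext
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // Now parallelize works as it did in earlier Spark versions
        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> rdd = jsc.parallelize(data);

        System.out.println(rdd.count());

        spark.stop();
    }
}
```

Note that JavaSparkContext here is only a thin wrapper; it shares the same underlying SparkContext as the SparkSession, so no second context is created.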