Error Summary


20/12/12 15:49:47 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
	at apple.TFIDF$.main(TFIDF.scala:11)
	at apple.TFIDF.main(TFIDF.scala)
20/12/12 15:49:47 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
	at apple.TFIDF$.main(TFIDF.scala:11)
	at apple.TFIDF.main(TFIDF.scala)
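The two byte counts in the message come from Spark's minimum-memory check in UnifiedMemoryManager.getMaxMemory. A minimal sketch of that check, assuming Spark 2.x's behavior of reserving 300 MB and requiring 1.5 times that as the minimum system memory (this is a simplified re-implementation for illustration, not the actual Spark source):

```scala
object MemoryCheck {
  def main(args: Array[String]): Unit = {
    // Assumption: Spark 2.x reserves 300 MB (RESERVED_SYSTEM_MEMORY_BYTES)
    // and requires the JVM heap to be at least 1.5x that amount.
    val reservedSystemMemoryBytes = 300L * 1024 * 1024        // 314572800
    val minSystemMemory = (reservedSystemMemoryBytes * 1.5).toLong

    // systemMemory is essentially the JVM's max heap (Runtime.maxMemory)
    def sufficient(systemMemory: Long): Boolean = systemMemory >= minSystemMemory

    println(minSystemMemory)          // 471859200 -- the figure in the error
    println(sufficient(259522560L))   // false -- the heap from the log is too small
  }
}
```

This is why the message demands "at least 471859200": 300 MB * 1.5 = 450 MiB = 471859200 bytes, while the failing JVM had only about 247 MB of heap.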

The code that triggered the error created a session with the default driver memory:

val ss = SparkSession.builder().master("local").appName("hello").getOrCreate()
val sc = ss.sparkContext
sc.setLogLevel("ERROR")
The error means the heap the JVM granted to the driver (259522560 bytes, about 247 MB) is below the minimum Spark needs to start a SparkContext (471859200 bytes, 450 MiB). When submitting a job, increase the driver heap with the --driver-memory option or spark.driver.memory. For local testing, you can instead set spark.testing.memory in the SparkConf directly in code:
// ~2 GB, comfortably above the 471859200-byte minimum
val sparkConf = new SparkConf().set("spark.testing.memory", "2147480000")
val ss = SparkSession.builder().config(sparkConf).master("local").appName("tfidf").getOrCreate()
val sc = ss.sparkContext
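Since the check is driven by the JVM's max heap, you can also verify the heap yourself before building the session. A small plain-Scala sketch (no Spark required; the 471859200 threshold is taken from the error message above):

```scala
object HeapCheck {
  def main(args: Array[String]): Unit = {
    // Runtime.maxMemory is roughly what Spark treats as "system memory"
    val maxHeapBytes = Runtime.getRuntime.maxMemory()
    val minRequired  = 471859200L // 450 MiB, the minimum from the error message

    if (maxHeapBytes < minRequired)
      println(f"Heap too small: ${maxHeapBytes / 1024.0 / 1024.0}%.0f MB; " +
        "raise -Xmx (or --driver-memory) or set spark.testing.memory")
    else
      println(f"Heap OK: ${maxHeapBytes / 1024.0 / 1024.0}%.0f MB")
  }
}
```

If you run inside an IDE, an equivalent fix is adding a VM option such as -Xmx1g to the run configuration, since the driver shares the IDE-launched JVM's heap.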


Original post: https://www.cnblogs.com/ShyPeanut/p/14125001.html

Copyright notice: posted by 完美者 on 2020-12-17 12:52:27.
When reposting, please credit: Error Summary | 完美导航
