Error when submitting a Flume job with spark-submit

I uploaded the compiled jar to the server and started Flume first:

flume-ng agent \
  --name simple-agent \
  --conf $FLUME_HOME/conf \
  --conf-file $FLUME_HOME/conf/flume_pull_streaming.conf \
  -Dflume.root.logger=INFO,console

Flume started with no errors.
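
The flume_pull_streaming.conf itself wasn't posted; for the pull-based approach it has to declare Spark's custom SparkSink, so it presumably looks roughly like the sketch below (the netcat source and its port are assumptions; the sink class and hadoop000:41414 follow from the commands in this post):

# assumed example source; replace with the actual source definition
simple-agent.sources = netcat-source
simple-agent.sinks = spark-sink
simple-agent.channels = memory-channel

simple-agent.sources.netcat-source.type = netcat
simple-agent.sources.netcat-source.bind = hadoop000
simple-agent.sources.netcat-source.port = 44444

# custom sink from the Spark Streaming + Flume integration guide
simple-agent.sinks.spark-sink.type = org.apache.spark.streaming.flume.sink.SparkSink
simple-agent.sinks.spark-sink.hostname = hadoop000
simple-agent.sinks.spark-sink.port = 41414

simple-agent.channels.memory-channel.type = memory
simple-agent.sources.netcat-source.channels = memory-channel
simple-agent.sinks.spark-sink.channel = memory-channel

Note that this sink requires the spark-streaming-flume-sink jar (plus scala-library and commons-lang3) on Flume's classpath, per the Spark Streaming + Flume integration guide.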

Then I submitted the jar:

spark-submit \
  --class com.derek.spark.FlumePullWordCount \
  --master local[2] \
  --packages org.apache.spark:spark-streaming-flume_2.11:2.2.0 \
  /home/hadoop/lib/sparktrain-1.0.jar \
  hadoop000 41414

My question: when I run the code in IDEA and pass these same two arguments at launch, it works fine. Running it on the server by the steps above fails. The log is below.
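
For context, here is a minimal sketch of what such a pull-based FlumePullWordCount typically looks like (an assumption, since the actual source wasn't posted); FlumeUtils.createPollingStream is the standard API for the pull-based approach and is where the receiver in the stack trace comes from:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

// Sketch only: the real com.derek.spark.FlumePullWordCount may differ in details.
object FlumePullWordCount {
  def main(args: Array[String]): Unit = {
    if (args.length != 2) {
      System.err.println("Usage: FlumePullWordCount <hostname> <port>")
      System.exit(1)
    }
    val Array(hostname, port) = args

    val sparkConf = new SparkConf().setAppName("FlumePullWordCount")
    val ssc = new StreamingContext(sparkConf, Seconds(5))

    // Pull-based receiver: Spark polls the Flume SparkSink at hostname:port.
    // Its startup (FlumePollingReceiver.onStart) is where the log below fails.
    val flumeStream = FlumeUtils.createPollingStream(ssc, hostname, port.toInt)

    flumeStream.map(event => new String(event.event.getBody.array()).trim)
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}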

Error log:

19/12/09 22:21:43 ERROR TaskSetManager: Task 0 in stage 402.0 failed 1 times; aborting job
19/12/09 22:21:43 ERROR ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 402.0 failed 1 times, most recent failure: Lost task 0.0 in stage 402.0 (TID 388, localhost, executor driver): java.lang.AbstractMethodError
        at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.initializeLogIfNecessary(FlumePollingInputDStream.scala:61)
        at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.log(FlumePollingInputDStream.scala:61)
        at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.logInfo(FlumePollingInputDStream.scala:61)
        at org.apache.spark.streaming.flume.FlumePollingReceiver$$anonfun$onStart$2.apply(FlumePollingInputDStream.scala:88)
        at org.apache.spark.streaming.flume.FlumePollingReceiver$$anonfun$onStart$2.apply(FlumePollingInputDStream.scala:87)
        at scala.collection.immutable.Range.foreach(Range.scala:160)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.onStart(FlumePollingInputDStream.scala:87)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:601)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:591)
        at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:2212)
        at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:2212)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
Caused by: java.lang.AbstractMethodError
        at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.initializeLogIfNecessary(FlumePollingInputDStream.scala:61)
        at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.log(FlumePollingInputDStream.scala:61)
        at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.logInfo(FlumePollingInputDStream.scala:61)
        at org.apache.spark.streaming.flume.FlumePollingReceiver$$anonfun$onStart$2.apply(FlumePollingInputDStream.scala:88)
        at org.apache.spark.streaming.flume.FlumePollingReceiver$$anonfun$onStart$2.apply(FlumePollingInputDStream.scala:87)
        at scala.collection.immutable.Range.foreach(Range.scala:160)
        at org.apache.spark.streaming.flume.FlumePollingReceiver.onStart(FlumePollingInputDStream.scala:87)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:601)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:591)
        at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:2212)
        at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:2212)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[The identical java.lang.AbstractMethodError stack trace then repeats verbatim for stages 403 through 407 (TIDs 389 to 393); the remaining log output is omitted here.]


1 Answer

First check whether the Spark version installed on your server matches the Spark version you specify in --packages. Judging from the exception, the dependency versions probably don't line up: a java.lang.AbstractMethodError thrown inside org.apache.spark.internal.Logging usually means the spark-streaming-flume connector was compiled against a different Spark release than the one actually running the job (the Logging trait changed between Spark versions, so a connector built for 2.2.0 can break on a newer runtime).
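
For example, first print the Spark version that spark-submit actually runs, then make the connector coordinate match it (the placeholder below is deliberate; substitute whatever version the first command reports):

spark-submit --version

spark-submit \
  --class com.derek.spark.FlumePullWordCount \
  --master local[2] \
  --packages org.apache.spark:spark-streaming-flume_2.11:<version reported above> \
  /home/hadoop/lib/sparktrain-1.0.jar \
  hadoop000 41414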

  • zhangyulei (asker) #1
    Thank you very much!
    2019-12-09 23:49:10
  • zhangyulei (asker) #2
    My Spark package wasn't the CDH build; after switching to the CDH one it worked.
    2019-12-09 23:49:54