Saturday, November 28, 2015

Apache Spark : how to start worker across cluster

Previous post: working on RPostgreSQL

This assumes an existing Apache Spark cluster setup; refer to Spark over YARN if you run into any issues there.

How do you start the Worker daemon on newly added Spark slaves?
Spark has two slave start-up scripts under the sbin/ directory:
sbin/start-slaves.sh - starts a Worker on all the slave machines; this should be run from the Master node.
sbin/start-slave.sh - starts the Worker daemon on an individual slave; this should be run from each slave node. Ex:

sbin/start-slave.sh spark://<master-host>:7077
The above command needs to be run from the slave machine; <master-host>:7077 is where the Spark Master is running (7077 is the default Master port).
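For the cluster-wide script, here is a minimal sketch of the Master-side setup, assuming passwordless SSH from the Master to each slave; slave1 and slave2 are placeholder hostnames:

# conf/slaves on the Master node - one worker hostname per line (placeholders)
slave1
slave2

# run from the Master node: starts a Worker on every host listed in conf/slaves
sbin/start-slaves.sh

Once the workers are up, they should appear in the Workers list on the Master web UI (port 8080 by default).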
Error: shutting down Netty transport
sbin/start-slave.sh spark://<master-host>:7077
15/11/27 19:53:53 ERROR NettyTransport: failed to bind to /, shutting down Netty transport
This error is usually due to an improper hostname mapping in /etc/hosts. Set SPARK_LOCAL_IP to point to the local worker machine's address.
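As a rough sketch of the fix, assuming the slave's address is 192.168.1.10 and its hostname is slave1 (both placeholders):

# /etc/hosts on the slave - map the hostname to its real, reachable IP,
# not only to 127.0.0.1 (values below are placeholders)
192.168.1.10   slave1

# conf/spark-env.sh on the slave - pin the address Spark should bind to
export SPARK_LOCAL_IP=192.168.1.10

After that, re-run sbin/start-slave.sh spark://<master-host>:7077 from the slave and the Netty bind error should go away.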

