How to start a Worker node from newly added Spark slaves?
Spark ships two slave start-up scripts under the sbin/ directory:
start-slaves.sh : starts Workers on all slave machines at once. Run this from the Master node.
start-slave.sh : starts the Worker daemon on an individual slave. Run this on each slave node. Example:
sbin/start-slave.sh spark://10.184.48.55:7077
The above command must be run on the slave machine; 10.184.48.55 is the address where the Spark Master is running.
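For reference, a minimal start-up sequence for a standalone cluster might look like the sketch below. It assumes the Master runs on 10.184.48.55, the slave's IP is 10.184.48.183 (taken from the error log further down), and passwordless SSH is configured from the Master to each slave, which start-slaves.sh relies on.

# On the Master node: list every slave host in conf/slaves, one per line
echo "10.184.48.183" >> conf/slaves

# Start the Master, then all Workers in one go (start-slaves.sh SSHes into each slave)
sbin/start-master.sh
sbin/start-slaves.sh

# Alternatively, start a single Worker manually on each slave node:
sbin/start-slave.sh spark://10.184.48.55:7077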
Error
shutting down Netty transport
sbin/start-slave.sh spark://10.184.48.55:7077
15/11/27 19:53:53 ERROR NettyTransport: failed to bind to /10.184.48.183:0, shutting down Netty transport
15/11/27 19:53:53 ERROR NettyTransport: failed to bind to /10.184.48.183:0, shutting down Netty transport
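Before changing any configuration, it can help to check what address the slave's hostname actually resolves to; in the log above the Worker is trying to bind to 10.184.48.183, which /etc/hosts may be mapping incorrectly. A quick check on a typical Linux slave (commands assumed available there):

hostname            # name of the slave machine
hostname -i         # IP the name resolves to; compare with the address in the error
cat /etc/hosts      # verify the hostname maps to the correct interface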
Solution
The error is due to an improper configuration in /etc/hosts. Set SPARK_LOCAL_IP to point to the local worker system:
export SPARK_LOCAL_IP=127.0.0.1
OR
export SPARK_LOCAL_IP=IP_ADDR_OF_THE_SYSTEM
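To make the setting persist across restarts, one option is to put the export in conf/spark-env.sh on the worker node and restart the Worker. A minimal sketch, assuming the worker's own address is 10.184.48.183 (from the error log above):

# On the slave machine: record the binding address in spark-env.sh
echo "export SPARK_LOCAL_IP=10.184.48.183" >> conf/spark-env.sh

# Restart the Worker so it picks up the new binding
sbin/stop-slave.sh
sbin/start-slave.sh spark://10.184.48.55:7077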