Friday, August 17, 2012

Hadoop : Bad connection to FS. command aborted.

I had successfully configured Hadoop on Debian, but I got the following error message while running this command:
hduser@boss:/opt/hadoop-0.21.0/bin$ ./hadoop fs  -ls /

12/08/16 15:46:12 INFO ipc.Client: Retrying connect to server: localhost/ Already tried 0 time(s).
12/08/16 15:46:12 INFO ipc.Client: Retrying connect to server: localhost/ Already tried 1 time(s).
Bad connection to FS. command aborted.
The solution for the above issue:

First, check whether all the expected ports are up and listening:
hduser@boss:/opt/hadoop-0.21.0/bin$ netstat -nltp

tcp   0   0*   LISTEN   9469/java
tcp   0   0*   LISTEN   9879/java
tcp   0   0*   LISTEN   9879/java
tcp   0   0*   LISTEN   9469/java
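To relate the netstat output to the configuration, it helps to pull the host and port out of the filesystem URI set in core-site.xml. A minimal sketch, assuming the value has the usual hdfs://host:port form (the URI below is illustrative, not the one from this setup):

```shell
# Hypothetical fs.default.name value -- substitute the one from your core-site.xml.
FS_URI="hdfs://localhost:9000"

# Strip the scheme, then split into host and port with shell parameter expansion.
HOSTPORT=${FS_URI#hdfs://}
HOST=${HOSTPORT%%:*}
PORT=${HOSTPORT##*:}

echo "host=$HOST port=$PORT"
```

If that port does not appear in the LISTEN lines of `netstat -nltp`, the NameNode is not up on the address clients will try to reach.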

In my case, the port mentioned in core-site.xml was not listening. Then I figured out why: the same property was defined twice, once in hdfs-site.xml and once in core-site.xml, with conflicting values.

After I removed the duplicate property from hdfs-site.xml, it worked fine.
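For reference, the default filesystem URI belongs in core-site.xml only. In Hadoop 0.21 that property is fs.default.name; the port shown here is just an example, not necessarily the one from this setup:

```xml
<!-- core-site.xml: define the default filesystem URI here, and only here. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

After fixing the configuration, restart HDFS (bin/stop-dfs.sh, then bin/start-dfs.sh) so the NameNode binds to the intended port.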

