Sunday, November 11, 2012

Build the Hadoop Eclipse plugin from the source code

1) install Eclipse

2) build Hadoop from the source code.

3) edit build-contrib.xml to enable Eclipse plugin building

vi $Hadoop_sr_home/src/contrib/build-contrib.xml

check the version number of the Hadoop you built and add two lines to the file. For example:
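The two lines set the Eclipse install path and the plugin version string. The sketch below is based on the fix quoted later in the comments; /usr/lib/eclipse/ and 1.1.3-SNAPSHOT are examples to adjust for your own Eclipse install and the version of Hadoop you actually built:

```xml
<!-- Properties added for compiling the eclipse plugin.
     Adjust eclipse.home to your Eclipse install directory and
     version to the version string of the Hadoop you built. -->
<property name="eclipse.home" location="/usr/lib/eclipse/"/>
<property name="version" value="1.1.3-SNAPSHOT"/>
```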


4) go to the directory $Hadoop_sr_home/src/contrib/eclipse-plugin/

5) run the ant command

6) get hadoop-eclipse-plugin-1.1.3-SNAPSHOT.jar under $Hadoop_sr_home/build/contrib/eclipse-plugin
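Steps 4 through 6 can be sketched as the following commands, assuming $Hadoop_sr_home points at your Hadoop source checkout and that steps 2 and 3 are already done:

```shell
# Assumes Hadoop was already built (step 2) and build-contrib.xml
# was edited to set eclipse.home and version (step 3).
cd $Hadoop_sr_home/src/contrib/eclipse-plugin

# Build the plugin against the freshly built Hadoop.
ant

# The resulting jar lands under the build tree.
ls $Hadoop_sr_home/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-*.jar
```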


  1. I have Hadoop 1.0.4.
    By "check the version number of built hadoop", do you mean the version of Hadoop?

    What do you mean by "ant command"?

    Will this process work on Ubuntu?

    1. I mean checking the version number of the Hadoop built by yourself. In my post, the version is 1.1.3-SNAPSHOT (not just 1.1.3).

      You can copy the Hadoop jar files from a binary distribution to a directory and modify the Ant build file to let Ant know where these jar files are. Then you can build the plugin without building Hadoop. But it is really not difficult to build Hadoop.
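      One way to point the build at the copied jars might look like the sketch below. The path id, the directory, and the include patterns here are illustrative, not the exact names build-contrib.xml uses; you would wire such a path into the contrib compile classpath yourself:

```xml
<!-- Hypothetical sketch: make copied Hadoop jars visible to the build.
     "hadoop.jars" and the directory path are example names only. -->
<path id="hadoop.jars">
  <fileset dir="/path/to/copied/hadoop/jars">
    <include name="hadoop-core-*.jar"/>
    <include name="lib/*.jar"/>
  </fileset>
</path>
```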

      I did not try Ubuntu. I think it will be no problem. The only thing you need to pay attention to is installing the prerequisite libraries and tools on your Linux platform.
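      On Ubuntu, the prerequisites can usually be installed with apt. The package list below is a hedged guess for Ubuntu 12.04-era releases; adjust it to what your release provides:

```shell
# Hypothetical prerequisite list for building Hadoop 1.x on Ubuntu.
# A JDK and Ant are the essentials; the autotools are needed only
# if you also build the native code.
sudo apt-get install openjdk-6-jdk ant autoconf automake libtool
```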

  2. Hi Yiyu,

    I made my own plugin for Hadoop 1.0.4. The process was smooth, but it doesn't work in my Eclipse. The error was "An internal error occurred during: "Connecting to DFS testHDFS".org/apache/commons/configuration/Configuration"

    I thought it was because some jars were lacking. Then I added some jars from the lib folder of Hadoop. Then Ant can't build it. The error was:

    [echo] contrib: eclipse-plugin
    [javac] /home/doraemon/Downloads/hadoop/hadoop-1.0.4/src/contrib/eclipse-plugin/build.xml:61: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 45 source files to /home/doraemon/Downloads/hadoop/hadoop-1.0.4/build/contrib/eclipse-plugin/classes
    [javac] /home/doraemon/Downloads/hadoop/hadoop-1.0.4/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/ error: package org.apache.hadoop.fs does not exist
    [javac] import org.apache.hadoop.fs.FileStatus;
    [javac] ^
    [javac] /home/doraemon/Downloads/hadoop/hadoop-1.0.4/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/ error: package org.apache.hadoop.fs does not exist
    [javac] import org.apache.hadoop.fs.Path;
    [javac] ^
    [javac] /home/doraemon/Downloads/hadoop/hadoop-1.0.4/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/ error: package org.apache.hadoop.hdfs does not exist
    [javac] import org.apache.hadoop.hdfs.DistributedFileSystem;
    [javac] ^

    It seems that it's saying I should have a package under org.apache.hadoop.hdfs and other packages, but I checked the folder and they do not exist. I just don't know why this is happening. I have tried to build the plugin for 2 days with no success..

    If you can give me some suggestions, it would be a great help. Thanks.
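    One hedged way to narrow down a "package org.apache.hadoop.fs does not exist" compile error is to check whether the jar on the classpath actually contains the Hadoop core classes; the jar path below is only an example to adjust to where your build put hadoop-core:

```shell
# If this grep prints nothing, the jar being compiled against is not
# the one that contains the Hadoop core classes. Path is an example.
jar tf /home/doraemon/Downloads/hadoop/hadoop-1.0.4/build/hadoop-core-1.0.4.jar \
  | grep 'org/apache/hadoop/fs/FileStatus.class'
```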

  3. Forgot to mention: my Java version is 1.7.0_15, OS is xubuntu 12.04.2 LTS (GNU/Linux 3.2.0-38-generic i686)
    Apache Ant(TM) version 1.8.2 compiled on December 3 2011
    Hadoop is 1.0.4, which has already been successfully configured.

    1. In this case, I suggest you check your NameNode's log as well. On CentOS, I run the command "tail -n 200 /var/log/secure" to check the access log. You may try to use your plugin and monitor this log as well. Possibly the plugin sends a wrong user account name when setting up the connection?

      Just a guess. Hope it will be a little bit helpful.

  4. I checked the log of the NameNode. Here is something interesting, but I don't know how to deal with it.

    2013-02-24 23:52:06,340 INFO Adding a new node: /default-rack/
    The ratio of reported blocks 1.0000 has reached the threshold 0.9990. Safe mode will be turned off automatically in 29 seconds.
    2013-02-24 23:52:06,555 INFO org.apache.hadoop.hdfs.StateChange: *BLOCK* NameSystem.processReport: from, blocks: 14, processing time: 36 msecs
    2013-02-24 23:52:08,930 ERROR PriviledgedActionException as:hduser cause:org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /app/hadoop/tmp/hadoop-hduser/mapred/system. Name node is in safe mode.
    2013-02-24 23:52:08,931 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 54310, call delete(/app/hadoop/tmp/hadoop-hduser/mapred/system, true) from error: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /app/hadoop/tmp/hadoop-hduser/mapred/system. Name node is in safe mode.


    1. I don't know. But I would do the following if I were you:

      1) check the Hadoop logs to see more info. Also check the HDFS and JobTracker web interfaces to see if Hadoop started up without problems.

      2) disable the firewall on your machine if it is enabled, and see if the problem is caused by firewall settings.

      3) check the user privilege settings, and see if the user account sent from the plugin has the privilege to operate on the files in HDFS.

      4) clean the HDFS file system if a file in HDFS is corrupted. Or just configure a fresh Hadoop and format the file system again.

      good luck.
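      For the safe-mode error specifically, the log above says safe mode will turn off automatically, but if it persists you can check the state and force it off with the Hadoop 1.x admin command:

```shell
# Check whether the NameNode is still in safe mode.
hadoop dfsadmin -safemode get

# Force the NameNode to leave safe mode.
hadoop dfsadmin -safemode leave
```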

  5. Hello Yiyu Jia Sir,

    My System specification:

    Ubuntu 12.04 i686 LTS, Hadoop 1.0.4.deb (I installed from the deb file), Eclipse Juno 4.2.2, OpenJDK 6 and 7

    I have downloaded your plugins for both JDK 6 and 7.

    I followed your steps and set up the Java env the same way, both on the system as well as in Eclipse.

    Issues are

    1. When I try to install the jar file for OpenJDK version 1.7, it gives the error:

    org/apache/hadoop/eclipse/preferences/MapReducePreferencePage : Unsupported major.minor version 51.0

    And for OpenJDK 1.6, it displays nothing related to MapReduce in the New Project window in Eclipse.

    I have also set the classpath and JAVA_HOME in ~/.bashrc and /etc/environment.


    What's going wrong? Could you please help?

    1. I did not run into this issue. But it looks like a JDK version issue, though you think you configured the JDK env correctly. But who knows. Also, you will get version issues if you include some library jar files which were compiled targeting a higher-version JDK. So my suggestion is to clean the env and build the plugin in your own environment. Good luck!
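      The "Unsupported major.minor version 51.0" message means a class compiled for Java 7 (class-file major version 51) is being loaded by a Java 6 VM (which accepts at most 50). Bytes 6-7 of any .class file hold that major version, big-endian, so you can check a suspect class directly. A small sketch; the demo writes a synthetic class-file header rather than assuming a real plugin class on disk:

```shell
# Print a .class file's major version (50 = Java 6, 51 = Java 7).
class_major_version() {
  # bytes 6-7 of the class file, read as hex, big-endian
  hex=$(od -An -j6 -N2 -tx1 "$1" | tr -d ' \n')
  echo $((16#$hex))
}

# Demo on a synthetic header: magic CA FE BA BE, minor 0, major 0x33 (= 51).
printf '\xca\xfe\xba\xbe\x00\x00\x00\x33' > /tmp/demo.class
class_major_version /tmp/demo.class   # prints 51
```

Run it against the class named in the error (extracted from the plugin jar) to confirm which JDK it was compiled for.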

  6. Hi,

    In the XML code you have given above, is the name property nested inside the location property, or are the two independent?

  7. This comment has been removed by the author.

  8. Getting this error after the ant command:

    Buildfile: /home/arunaryan/hadoop-1.1.2/src/contrib/eclipse-plugin/build.xml

    /home/arunaryan/hadoop-1.1.2/src/contrib/eclipse-plugin/build.xml:22: The following error occurred while executing this line:
    /home/arunaryan/hadoop-1.1.2/src/contrib/build-contrib.xml:96: property doesn't support the nested "property" element.

    please help.

    1. You may try:
      <!-- Property added for compiling eclipse plugin -->
      <property location="/usr/lib/eclipse/" name="eclipse.home"></property>
      <property name="version" value="1.1.3-SNAPSHOT"></property>

  9. While I build Hadoop (1.0.4, 1.0.3, 1.1.2, any version) I get the following blocking error:
    [ivy:resolve] problem while downloading module descriptor: invalid sha1: expected=f2d3a6ada0e1cad7236e1f2a06ec5d4c811f3681 computed=34692b48f039619794854fdd75213a920896f44d (865ms)
    [ivy:resolve] io problem while parsing ivy file: Impossible to load parent for file:/C:/Users/cheyi/.ivy2/cache/org.apache.commons/commons-math/ivy-2.1.xml.original. Parent=org.apache.commons#commons-parent;14
    [ivy:resolve] module not found: org.apache.commons#commons-math;2.1
    [ivy:resolve] ==== maven2: tried
    [ivy:resolve] ==== apache-snapshot: tried
    [ivy:resolve] -- artifact org.apache.commons#commons-math;2.1!commons-math.jar:
    [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
    [ivy:resolve] :: UNRESOLVED DEPENDENCIES ::
    [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
    [ivy:resolve] :: org.apache.commons#commons-math;2.1: not found
    [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
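    An "invalid sha1" from ivy:resolve usually means a corrupted download sitting in the local Ivy cache. A hedged fix: delete the cached module the checksum error names so Ivy re-fetches a fresh copy, then rerun the build. (On Windows, as in the log above, the cache lives under C:\Users\<name>\.ivy2 instead of $HOME/.ivy2.)

```shell
# Remove the cached commons-math module (the one the checksum error
# names) so Ivy downloads it again on the next build.
rm -rf "$HOME/.ivy2/cache/org.apache.commons/commons-math"
```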

  10. This comment has been removed by the author.

  11. Hi
    I built hadoop-eclipse-plugin-1.0.4 from the source code; it built successfully and created hadoop-eclipse-plugin-1.0.4.jar inside Hadoop_sr_home/build/contrib/eclipse-plugin. I copied this to the eclipse_home/plugins folder and restarted Eclipse. When adding to the perspective list (Other >> Perspective), it gives an error box: Problem opening perspective 'org.apache.hadoop.eclipse.perspective'. Could you please suggest what the problem is?