Sunday, October 28, 2012

Eclipse MapReduce plugin build for Hadoop 1.0.4



Here is an Eclipse plugin built with Eclipse Juno and Hadoop 1.0.4.

Download the JDK 1.7-compatible build from www.idatamining.org

Download the JDK 1.6-compatible build from www.idatamining.org

Copy hadoop-eclipse-plugin-1.0.4.jar into your Eclipse plugins directory and enjoy it.
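
For example, a minimal sketch, assuming Eclipse Juno is unpacked under /opt/eclipse and the jar was downloaded to the current directory (adjust both paths to your setup):

cp hadoop-eclipse-plugin-1.0.4.jar /opt/eclipse/plugins/
/opt/eclipse/eclipse -clean   # restart Eclipse with -clean so it picks up the new plugin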

[Edit] See how to build the Eclipse plugin from the source code here: build hadoop eclipse plugin from the source code

40 comments:

  1. Thanks man, you're a day saver :)

  2. Can you please create this jar using JDK 6? Our project is using JDK 6.0, and using this jar throws an exception:
    Caused by: java.lang.UnsupportedClassVersionError: org/apache/hadoop/eclipse/view/servers/ServerView : Unsupported major.minor version 51.0

  3. If it is not possible to create the jar, please share the steps to create the jar file.

    Thanks in Advance.

  4. Yeah, same issue. Can you please let us know how you compiled it?


  5. Here is the JDK 1.6-compatible build: http://www.idatamining.org/resources/hadoop/hadoop-eclipse-plugin-1.0.4-jdk1.6.jar

    You can check out the Hadoop source code; the plugin code is under the src/contrib directory. It looks like it was created at IBM. Compile the Hadoop core code first, then compile the plugin code. You need to set the Eclipse home environment variable in the ant script. It is not difficult; just be patient: run the ant script, check the error messages, and fix the ant script by adding any missing property values.

    It is better to do this in a Linux environment. On Windows, there is more work to do.
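
    On Linux, starting from the unpacked hadoop-1.0.4 source tree, it looks roughly like this (a rough sketch, not the exact commands; the Eclipse path is an assumption and must point at your own installation):

    cd hadoop-1.0.4
    ant jar                                        # build the Hadoop core code first (the core jar lands under build/)
    cd src/contrib/eclipse-plugin
    ant jar -Declipse.home=/opt/eclipse -Dversion=1.0.4
    # the plugin jar should show up under build/contrib/eclipse-plugin/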

    I have some draft posts about the Hadoop stack. I will publish them later when I have more free time. Good luck.




  6. Thanks for your reply. I have created the plugin using the steps and it's working fine.

  7. Hi Yiyu Jia,

    While using this jar I am getting the error message:

    An internal error occurred during: "Connecting to DFS localhost".
    org/apache/commons/configuration/Configuration

    Have you encountered the same problem?

  8. No, I did not. Check your DFS configuration, Hadoop installation, and firewall configuration.
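
    A quick sanity check from the command line (just a sketch; the host and port must match the fs.default.name value in your core-site.xml, often hdfs://localhost:9000 in tutorials):

    $HADOOP_HOME/bin/hadoop fs -ls /        # does HDFS answer at all?
    telnet localhost 9000                   # is the NameNode port reachable, or is a firewall blocking it?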

  9. You are a real day saver; I have been trying to get this for over 2 months. Finally! Have you tried the Karmasphere Studio Eclipse plugin? It has more features, but even that has problems.

    Replies
    1. Thanks for liking my post. I have heard about Karmasphere Studio, but it looks like commercial software, so I did not spend time on it.

  10. I'm getting the following exception. I'm using JDK 6 update 38:

    Unable to create the selected preference page.
    org/apache/hadoop/eclipse/preferences/MapReducePreferencePage : Unsupported major.minor version 51.0

  11. This comment has been removed by the author.

  12. Thanks a lot for providing this plugin! I get the following error, though, when trying to create a new MapReduce project:
    The selected wizard could not be started.
    Plug-in org.apache.hadoop.eclipse was unable to load class org.apache.hadoop.eclipse.NewMapReduceProjectWizard.
    org/apache/hadoop/eclipse/NewMapReduceProjectWizard : Unsupported major.minor version 51.0

    Replies
    1. Any ideas on how to solve this?

    2. I think you need to check the JDK version setting of your project, especially when you have multiple versions of the JDK/JRE installed on your machine.

    3. Many thanks for your reply. The error message pops up before I can create a MapReduce project or switch to the MapReduce perspective, so I cannot change the project-specific settings for the JDK. I am running this on Ubuntu 64-bit; my Java version is:
      java -version
      java version "1.6.0_26"
      Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
      Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

      There are certainly more versions installed, but this is the default one. I use several other Eclipse plugins, which work fine.
      Please let me know if there is something else I can try to fix this. Thanks.


    4. I don't know, but I suggest you install a higher version of the JDK and check the JVM version your Eclipse is running on.

      1) Launch your Eclipse from the command line:
      ./eclipse -vm /path/to/your/JDKhome/bin/java

      2) Or, add a -vm entry to your eclipse.ini file (the option and the path go on two separate lines, before any -vmargs line):
      -vm
      /path/to/your/JDKhome/bin/java

    5. Many thanks for your help! Ok, I tried this: ./eclipse -vm /usr/lib/jvm/java-6-sun/jre/bin/java, but I still get exactly the same error message. I am working with Eclipse Juno ... is the plugin compatible with this version?

    6. Yes, I use Juno too. Are you using a JDK or just a JRE? You may want to update your JDK 1.6 or try JDK 1.7.

    7. Ok, this helped. Everything is working now. Thanks a lot!

  13. No luck. I have installed the latest Eclipse Juno, OpenJDK 1.7, and Hadoop 1.0.4 on Ubuntu 12.10. After adding your plugin to my Eclipse, many exceptions were thrown, such as a failure to connect to DFS and a Jackson mapper error. Finally I made some changes to your package: I moved commons-cli-1.2.jar, commons-lang-2.4.jar, commons-configuration-1.6.jar, jackson-mapper-asl-1.8.8.jar, jackson-core-asl-1.8.8.jar, and commons-httpclient-3.0.1.jar into lib, altered the MANIFEST.MF, and merged all the jars' org packages into the classes; then the plugin worked! But looking back at my Eclipse error logs, errors such as "The command ("dfs.browser.action.*") is undefined" still remain. The interesting thing is that HDFS browsing, uploading, etc. all work fine. I found someone who ran into the same situation on the hadoop-general mailing list; maybe it's an OpenJDK bug or an Eclipse bug.
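
    In outline, the lib and MANIFEST.MF part of that looks something like this (a rough sketch; the jar versions are what ships in the Hadoop 1.0.4 lib directory, and the paths are assumptions):

    mkdir /tmp/plugin && cd /tmp/plugin
    unzip $ECLIPSE_HOME/plugins/hadoop-eclipse-plugin-1.0.4.jar
    cp $HADOOP_HOME/lib/commons-cli-1.2.jar $HADOOP_HOME/lib/commons-lang-2.4.jar \
       $HADOOP_HOME/lib/commons-configuration-1.6.jar $HADOOP_HOME/lib/commons-httpclient-3.0.1.jar \
       $HADOOP_HOME/lib/jackson-core-asl-1.8.8.jar $HADOOP_HOME/lib/jackson-mapper-asl-1.8.8.jar lib/
    # add the new jars to the Bundle-ClassPath line in META-INF/MANIFEST.MF, then repackage:
    zip -r ../hadoop-eclipse-plugin-1.0.4.jar .
    cp ../hadoop-eclipse-plugin-1.0.4.jar $ECLIPSE_HOME/plugins/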

  14. I have suffered from the exception "The command ("dfs.browser.action.*") is undefined" for 3 days. I changed the DFSActions.java and DFSActionImpl.java code, altered the plugin.xml, and rebuilt hadoop-core-1.0.4.jar and hadoop-eclipse-plugin-1.0.4.jar myself, but the errors still remain. Maybe I should change the OpenJDK version or the Eclipse version; I hope I can find a solution to end my suffering.

    Replies
    1. Maybe you need a fresh install of Eclipse? I don't know, just a guess. I am not using 1.0.4; I am using Hadoop 1.1 as my dev environment now.

  15. Hello, thanks for the plugin. I got it running by first downloading the jar file hadoop-eclipse-plugin-1.0.4-jdk1.6.jar, then adding commons-cli-1.2.jar, commons-lang-2.4.jar, commons-configuration-1.6.jar, jackson-mapper-asl-1.8.8.jar, jackson-core-asl-1.8.8.jar, and commons-httpclient-3.0.1.jar to lib, and updating MANIFEST.MF to include the new jars (Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.0.1.jar,lib/jackson-mapper-asl-1.0.1.jar,commons-cli-1.2.jar). Then I installed Eclipse Juno, and I had to get JDK 7 because it doesn't work with version 6.

    But in the end, when I try to look into the file system, I get a permission error: org.apache.hadoop.security.AccessControlException: Permission denied: user=...,access=READ_EXECUTE, inode="system":root:supergroup:rwx-------
    Also, I can't create new files.

    Is there any way to solve it? I want to connect from another machine.

  16. To define a new Hadoop location, what are the port numbers?

  17. I am working on Ubuntu 12.10, JDK 7, Eclipse Juno, and Hadoop 1.0.4. I set up Hadoop and it runs word count successfully. I copied this Hadoop Eclipse plugin into the Eclipse plugins folder, but when I defined a new Hadoop location in Eclipse, it shows the error: failure to login. Would you please help me?

    Replies
    1. Hi, you do not need to use the Eclipse plugin if you only want to program for Hadoop. You can program your app as a normal Java app (see the sketch below). Probably you can spend some time configuring Hadoop; if you are tired, you can do something else first.

      I am not kidding. Our purpose is not knowing how to use a plugin; our purpose is to program for Hadoop and to use Hadoop. Today there are already some good books about Hadoop on the market. We can learn from them. That will be very helpful.
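
      For example, a minimal sketch of the "normal Java app" route (WordCount here is a placeholder for your own driver class, and the paths are made up):

      javac -classpath $HADOOP_HOME/hadoop-core-1.0.4.jar -d classes WordCount.java
      jar cf wordcount.jar -C classes .
      $HADOOP_HOME/bin/hadoop jar wordcount.jar WordCount /user/me/input /user/me/output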

      Good luck.

  18. I downloaded the plugin for 1.6, but the class files in the jar are compiled with JDK 7 (version 51).
    - You can check this by inspecting the first 8 bytes of a class file.
    - Here is a reference: http://en.wikipedia.org/wiki/Java_class_file
    I wish to compile it myself; can I have the source code?
    Or can you compile it with a 1.6 compiler?
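
    For instance, to do the check mentioned above, something like this prints the version bytes (a sketch; the class path inside the jar is taken from the error messages above, with the classes/ prefix assumed from the plugin's Bundle-ClassPath):

    unzip -p hadoop-eclipse-plugin-1.0.4-jdk1.6.jar classes/org/apache/hadoop/eclipse/NewMapReduceProjectWizard.class | head -c 8 | od -An -t u1
    # the last two bytes give the major version: 0 50 means Java 6, 0 51 means Java 7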

  19. This comment has been removed by the author.

  20. Has anyone seen this issue before?

    An internal error occurred during: "Map/Reduce location status updater".
    org/codehaus/jackson/map/JsonMappingException

  21. I fixed all the ports and updated the plugin jar with the missing libraries.

    Now I see these errors:
    The command ("dfs.browser.action.reconnect") is undefined
    The command ("dfs.browser.action.disconnect") is undefined
    The command ("dfs.browser.action.delete") is undefined
    The command ("dfs.browser.action.refresh") is undefined
    The command ("dfs.browser.action.download") is undefined

    Does anyone know what should be done to fix this?

    Replies
    1. I did not run into this error when I compiled my plugin. It looks like some jar files are missing. My steps for building the Eclipse plugin are written down here: http://yiyujia.blogspot.com/2012/11/build-hadoop-eclipse-plugin-from-source.html

  22. This comment has been removed by the author.

  23. Hi Yiyu

    I am running into a similar issue. I have Hadoop 1.1.1. I tried your plugin, but it even makes my DFS disappear. I also asked my question on Stack Overflow and no one has replied yet. Here are the details of my question on Stack Overflow. I would really appreciate it if you know what might be going on.

    http://stackoverflow.com/questions/15673701/hadoop-eclipse-setup-on-mac

    Thanks

  24. /home/ankur/Downloads/hadoop-1.0.4/src/contrib/build-contrib.xml:21: Unexpected element "{}property" {antlib:org.apache.tools.ant}property

    I am getting this error after pasting the above-mentioned code into build-contrib.xml.

    Please help me as soon as possible.

  25. Hi,
    Thanks for the plugin. It worked. Hadoop books like the Definitive Guide do not say much about connecting Hadoop to Eclipse.

  26. May I have the plugin to connect Eclipse Juno with Hadoop 1.1.2? I tried creating the JAR file, and it gives the "Connecting to DFS test HDFS" org/apache/commons/configuration/Configuration error. I tried multiple times and it still fails. Can you please help?

  27. Follow these steps to create the Eclipse plugin for any Hadoop version:
    https://docs.google.com/document/d/1yuZ4IjlquPkmC1zXtCeL4GUNKT1uY1xnS_SCBJHps6A/edit?pli=1

  28. Hi,
    I am getting the exception below.
    I am using Eclipse Indigo.

    The selected wizard could not be started.
    Plug-in org.apache.hadoop.eclipse was unable to load class org.apache.hadoop.eclipse.NewMapReduceProjectWizard.
    org/apache/hadoop/eclipse/NewMapReduceProjectWizard : Unsupported major.minor version 51.0


    Please help me to resolve it.
