
Hadoop 2.8.3 Eclipse Plugin



Windows Server 2012 R2 Standard
hadoop 2.8.3
eclipse-jee-mars-2-win32-x86_64

  1. Download the source code from https://github.com/winghc/hadoop2x-eclipse-plugin to a local directory.
  2. Download ant.
  3. Edit two files:
    3.1. hadoop2x-eclipse-plugin-master/ivy/libraries.properties
    Change the jar package versions inside it to the versions matching your hadoop release.
    3.2. hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml: adjust the jar package names, and write the parameters that would normally go on the command line directly into this file. Done.
  4. On the command line, cd into the hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin directory and run the "ant jar" command (a consolidated command sketch follows this list):
    (image)
  5. Copy hadoop-eclipse-plugin-2.8.3.jar from the hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin directory into the plugins directory of eclipse.
  6. Start eclipse and you will see the following icon:
    (image)
  7. Then click Window -> Show View -> MapReduce Tools -> Map/Reduce Locations.
    (image)
  8. Click the little elephant in the upper-right corner of this view to configure a location:
    (image)
  9. Start hadoop and type on the command line: hadoop fs -mkdir /Administrator
    (Administrator is my user name)
  10. OK
    (image)
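For reference, steps 1 to 5 can be condensed into the following command sketch. The directories (C:\hadoop2x-eclipse-plugin-master and C:\eclipse-jee-mars-2-win32-x86_64\eclipse) are only assumptions matching the environment above; adjust them to your own paths.

rem step 1: fetch the plugin sources (or download and unpack the zip from GitHub)
git clone https://github.com/winghc/hadoop2x-eclipse-plugin.git C:\hadoop2x-eclipse-plugin-master

rem step 3: edit ivy\libraries.properties and src\contrib\eclipse-plugin\build.xml (both files are listed at the end of this post)

rem step 4: build the plugin jar
cd C:\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
ant jar

rem step 5: install the resulting jar into the eclipse plugins directory
copy ..\..\..\build\contrib\eclipse-plugin\hadoop-eclipse-plugin-2.8.3.jar C:\eclipse-jee-mars-2-win32-x86_64\eclipse\plugins\

For step 8, the values entered in the Map/Reduce location dialog (host, Map/Reduce Master port, DFS Master port) have to match your cluster configuration; in particular, the DFS Master port should be the port used by fs.defaultFS in core-site.xml.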

Plugin download addresses
Eclipse Mars: http://download.csdn.net/download/familyshizhouna/10172408
Eclipse Oxygen: http://download.csdn.net/download/familyshizhouna/10172755



The build failed at first. Based on the error messages, libraries.properties and build.xml were modified and the compilation then succeeded. The plugin does show up in Eclipse (the Eclipse version at this point was Oxygen), but the following error occurred (check the Eclipse error log):
(image)



!MESSAGE An internal error occurred during: 'Map/Reduce location status updater'.
!STACK 0
java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
	at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2806)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2849)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2831)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
	at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:360)
	at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:739)
	at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:734)
	at org.apache.hadoop.mapred.LocalClientProtocolProvider.create(LocalClientProtocolProvider.java:42)
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:121)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:108)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:101)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:454)
	at org.apache.hadoop.eclipse.server.HadoopServer.getJobClient(HadoopServer.java:488)
	at org.apache.hadoop.eclipse.server.HadoopServer$LocationStatusUpdater.run(HadoopServer.java:103)
	at org.eclipse.core.internal.jobs.Worker.run(Worker.java:56)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder cannot be found by org.apache.hadoop.eclipse_0.18.0
	at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:484)
	at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:395)
	at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:387)
	at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:150)
	at java.lang.ClassLoader.loadClass(Unknown Source)
	... 18 more

My reading is that the htrace jar package cannot be found, yet that jar is included in the plugin, and I did not know how to fix it.
If you run into this problem and solve it, please let me know, thank you!
It should be a jar packaging issue: I modified the plugin package directly, recompiled, and then started eclipse like this: eclipse -clean -consolelog -debug, and that was it.
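If you hit the same NoClassDefFoundError, one sanity check before rebuilding is to confirm that the htrace jar really ended up inside the plugin jar and on its Bundle-ClassPath, and only then clear Eclipse's cached bundle state. A minimal sketch, assuming the JDK's jar tool is on the PATH and the plugin was installed as described above:

rem list the bundled libraries; lib/htrace-core4-4.0.1-incubating.jar should show up
jar tf C:\eclipse-jee-mars-2-win32-x86_64\eclipse\plugins\hadoop-eclipse-plugin-2.8.3.jar | findstr htrace

rem extract the manifest and check that the same jar is listed on Bundle-ClassPath
jar xf C:\eclipse-jee-mars-2-win32-x86_64\eclipse\plugins\hadoop-eclipse-plugin-2.8.3.jar META-INF/MANIFEST.MF
findstr htrace META-INF\MANIFEST.MF

rem then start eclipse with a clean OSGi cache and console logging, as above
eclipse -clean -consolelog -debug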



I then changed the Eclipse version and used Mars. Mars ran into a different problem:

!MESSAGE An internal error occurred during: 'Map/Reduce location status updater'.
!STACK 0
java.lang.NullPointerException
	at org.apache.hadoop.mapred.JobClient.getAllJobs(JobClient.java:851)
	at org.apache.hadoop.mapred.JobClient.jobsToComplete(JobClient.java:827)
	at org.apache.hadoop.eclipse.server.HadoopServer$LocationStatusUpdater.run(HadoopServer.java:119)
	at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)

This problem can be solved with the following method: hadoop dfs -mkdir /Administrator
Reference: http://blog.csdn.net/l1028386804/article/details/52665022
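This appears to be the same point as step 9: the HDFS directory has to exist and match the user name the plugin connects as (Administrator on this machine, an assumption specific to this setup). With HDFS already running, a quick check looks like this:

hadoop fs -mkdir /Administrator
hadoop fs -ls /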

libraries.properties



# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

#This properties file lists the versions of the various artifacts used by hadoop and components.
#It drives ivy and the generation of a maven POM

# This is the version of hadoop we are generating
hadoop.version=2.8.3
#hadoop-gpl-compression.version=0.1.0

#These are the versions of our dependencies (in alphabetical order)
#apacheant.version=1.9.6
#ant-task.version=2.0.10
hadoop-ant.version=2.8.3
asm.version=3.2
avro.version=1.7.4
#aspectj.version=1.6.5
#aspectj.version=1.6.11
#checkstyle.version=4.2
commons-cli.version=1.2
commons-codec.version=1.4
commons-beanutils-core.version=1.8.0
commons-beanutils.version=1.7.0
commons-collections.version=3.2.2
commons-configuration.version=1.6
commons-compress.version=1.4.1
commons-daemon.version=1.0.13
commons-digester.version=1.8
commons-httpclient.version=3.1
commons-lang.version=2.6
commons-logging.version=1.1.3
#commons-logging-api.version=1.0.4
commons-math.version=2.2
commons-math3.version=3.1.1
#commons-el.version=1.0
#commons-fileupload.version=1.2
commons-io.version=2.4
commons-net.version=3.1
#core.version=3.1.1
#coreplugin.version=1.3.2
hsqldb.version=2.0.0
#htrace.version=4.0.1
htrace-core4.version=4.0.1
ivy.version=2.1.0
jasper.version=5.5.12
jackson.version=1.9.13
#not able to figureout the version of jsp & jsp-api version to get it resolved throught ivy
# but still declared here as we are going to have a local copy from the lib folder
#jsp.version=2.1
jsp-api.version=2.1
#jsp-api-2.1.version=6.1.14
#jsp-2.1.version=6.1.14
jets3t.version=0.9.0
jetty.version=6.1.26
jetty-util.version=6.1.26
jersey-core.version=1.9
jersey-json.version=1.9
jersey-server.version=1.9
junit.version=4.11
#jdeb.version=0.8
#jdiff.version=1.0.9
#json.version=1.0
json-smart.version=1.1.1
#kfs.version=0.1
log4j.version=1.2.17
#lucene-core.version=2.3.1
mockito-all.version=1.8.5
jsch.version=0.1.51
#oro.version=2.0.8
#rats-lib.version=0.5.1
#servlet.version=4.0.6
guice-servlet.version=3.0
servlet-api.version=2.5
slf4j-api.version=1.7.10
slf4j-log4j12.version=1.7.10
#wagon-http.version=1.0-beta-2
xmlenc.version=0.52
#xerces.version=2.9.1
xercesImpl.version=2.9.1
protobuf.version=2.5.0
guava.version=11.0.2
netty.version=3.6.2.Final
paranamer.version=2.3
xz.version=1.0
zookeeper.version=3.4.6
jsr305.version=3.0.0
snappy-java.version=1.0.4.1
hamcrest-core.version=1.3

build.xml

<project default="jar" name="eclipse-plugin">

  <property name="version" value="2.8.3"/>
  <property name="eclipse.home" value="C:/eclipse-jee-mars-2-win32-x86_64/eclipse"/>
  <property name="hadoop.home" value="c:/hadoop-2.8.3"/>

  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">
    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>
      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset>
  </path>

  <path id="hadoop-sdk-jars">
    <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/hdfs">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/common">
      <include name="hadoop*.jar"/>
    </fileset>
  </path>

  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>
  </path>

  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

  <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
    <echo message="contrib: ${name}"/>
    <javac encoding="${build.encoding}" srcdir="${src.dir}" includes="**/*.java"
           destdir="${build.classes}" debug="${javac.debug}" deprecation="${javac.deprecation}">
      <classpath refid="classpath"/>
    </javac>
  </target>

  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/common">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/hdfs">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/yarn">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/classes" verbose="true">
      <fileset dir="${root}/src/java">
        <include name="*.xml"/>
      </fileset>
    </copy>
    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core4-${htrace-core4.version}-incubating.jar" todir="${build.dir}/lib" verbose="true"/>

    <jar jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar" manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
        <attribute name="Bundle-ClassPath"
                   value="classes/,
 lib/hadoop-hdfs-client-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration-${commons-configuration.version}.jar,
 lib/httpclient-${httpclient.version}.jar,
 lib/commons-lang-${commons-lang.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/servlet-api-${servlet-api.version}.jar,
 lib/htrace-core4-${htrace-core4.version}-incubating.jar,
 lib/commons-io-${commons-io.version}.jar"/>
      </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

</project>
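A note on step 3.2: instead of hard-coding version, eclipse.home and hadoop.home in build.xml as above, the same values can be passed to ant on the command line, since properties given with -D take precedence over <property> declarations in the file. With the paths from this environment, that would look like:

cd C:\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
ant jar -Dversion=2.8.3 -Declipse.home=C:/eclipse-jee-mars-2-win32-x86_64/eclipse -Dhadoop.home=C:/hadoop-2.8.3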