Compiling Hadoop from source turned out to be quite a bumpy ride, but after overcoming one obstacle after another the build finally succeeded. Here is a write-up of the whole process.

1. Download the Hadoop 2.6.5 source code.
2. After unpacking, read the BUILD.txt file first; it spells out the build prerequisites and caveats in detail:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

Building distributions:

Create binary distribution without native code and without documentation:
$ mvn package -Pdist -DskipTests -Dtar

Create binary distribution with native code and with documentation:
$ mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:
$ mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:
$ mvn package -Pdist,native,docs,src -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site):
$ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
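Before kicking off the build, it is worth a quick sanity check that the required tools are installed and on the PATH (not part of BUILD.txt, just a convenience; all of these are standard commands):

java -version      # JDK 1.6 or later
mvn -version       # Maven 3.0 or later
protoc --version   # must report libprotoc 2.5.0
cmake --version    # CMake 2.6+ (only needed for native code)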
3. Install the CentOS build environment.
yum install lzo-devel zlib-devel gcc gcc-c++
yum install openssl-devel
yum install ncurses-devel
yum install autoconf automake libtool cmake
The packages below are a supplement. Various errors during compilation may be related to missing packages, so it is best to install them all as well; anything already installed is skipped by yum automatically.
sudo yum install kernel-devel
sudo yum -y install gcc*
sudo yum -y install cmake
sudo yum -y install glibc-headers
sudo yum -y install gcc-c++
sudo yum -y install zip-devel
sudo yum -y install openssl-devel
sudo yum -y install svn
sudo yum -y install git
sudo yum -y install ncurses-devel
sudo yum -y install lzo-devel
sudo yum -y install autoconf
sudo yum -y install libtool
sudo yum -y install automake
4. Install Maven.
tar -zvxf maven-3.0.1.tar.gz
vim /root/.bashrc ==> configure the global environment variables
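The post does not show the exact .bashrc entries for Maven; the following is a minimal sketch, assuming the archive was unpacked to /usr/local/maven-3.0.1 (adjust to your actual install path):

export MAVEN_HOME=/usr/local/maven-3.0.1   # assumed install location
export PATH=$PATH:$MAVEN_HOME/bin

Then run source /root/.bashrc and verify with mvn -version.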
5. Install protobuf (must be installed as the root user).
tar -zvxf protobuf-2.5.0.tar.gz
./configure
make && make check && make install
To uninstall later: make uninstall
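A quick way to confirm protobuf installed correctly (Hadoop 2.6.x requires exactly 2.5.0):

protoc --version   # expected output: libprotoc 2.5.0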
6. Install FindBugs.
tar -zvxf findbugs-3.0.0.tar.gz
vim /root/.bashrc ==> configure the global environment variables
export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
export PATH=$PATH:$FINDBUGS_HOME/bin
7. Install Ant (supplementary).
tar -zvxf ant.1.9.9.tar.gz
vim /root/.bashrc ==> configure the global environment variables
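As with Maven, the exact entries are omitted above; a minimal sketch, assuming Ant was unpacked to /usr/local/ant-1.9.9 (a hypothetical path, adjust as needed):

export ANT_HOME=/usr/local/ant-1.9.9   # assumed install location
export PATH=$PATH:$ANT_HOME/bin

Reload with source /root/.bashrc and verify with ant -version.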
8. The mvn build command.
mvn clean package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-pipes
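The -rf :hadoop-pipes flag tells Maven to resume the reactor from the hadoop-pipes module, which is handy when re-running after a partial failure. For a clean first build, drop it:

mvn clean package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true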
9. Errors encountered during the build and their solutions.
(1) [ERROR] Unresolveable build extension: Plugin org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could not be resolved: The following artifacts could not be resolved: biz.aQute.bnd:bndlib:jar:2.1.0, org.osgi:org.osgi.core:jar:4.2.0, org.apache.felix:org.apache.felix.bundlerepository:jar:1.6.6, org.easymock:easymock:jar:2.4, org.codehaus.plexus:plexus-interpolation:jar:1.15, org.apache.maven.shared:maven-dependency-tree:jar:2.1, org.codehaus.plexus:plexus-component-annotations:jar:1.5.5, org.eclipse.aether:aether-util:jar:0.9.0.M2: Could not transfer artifact biz.aQute.bnd:bndlib:jar:2.1.0 from/to central (http://repo.maven.apache.org/maven2): Connection to http://repo.maven.apache.org refused: connection timed out -> [Help 2]
The connection to repo.maven.apache.org times out; from mainland China this URL routinely fails to connect, and it is only reachable over a VPN that gets past the Great Firewall.
Solution:
Edit /path-to-maven/conf/settings.xml (see http://maven.apache.org/guides/mini/guide-mirror-settings.html) and add a mirror inside the <mirrors> section:

<mirror>
  <id>UK</id>
  <name>UK Central</name>
  <url>http://uk.maven.org/maven2</url>
  <mirrorOf>central</mirrorOf>
</mirror>
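To confirm the mirror is actually picked up, you can dump the effective settings (help:effective-settings is a standard maven-help-plugin goal):

mvn help:effective-settings   # the UK mirror should appear in the output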
(2) Exit code: 1 - /home/lpf/devTool/hadoop-2.6.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/interfaceStability.java:27: error: unexpected end tag: </ul>
Solution (found on Stack Overflow): append -Dmaven.javadoc.skip=true to the command:
mvn package -Pdist,native,docs -DskipTests -Dtar -Dmaven.javadoc.skip=true
(3) Error: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: input file /opt/hadoop-2.2.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml does not exist
Solution:
Drop the docs profile from the command: mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-pipes
(4) Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-hdfs-httpfs: An Ant BuildException has occured: exec returned: 2
Solution:
In hadoop-hdfs-project/pom.xml, comment out the hadoop-hdfs-httpfs module entry, turning it into <!--module>hadoop-hdfs-httpfs</module-->, as shown below.
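For reference, the modules section of hadoop-hdfs-project/pom.xml then looks roughly like this (module list abridged; check your own source tree):

<modules>
  <module>hadoop-hdfs</module>
  <!--module>hadoop-hdfs-httpfs</module-->
  <module>hadoop-hdfs-nfs</module>
</modules>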
(5) [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-kms: An Ant BuildException has occured: Can't get http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz to /home/liuwl/opt/datas/hadoop-2.5.0-cdh5.3.6/hadoop-common-project/hadoop-kms/downloads/apache-tomcat-6.0.43.tar.gz [ERROR] around Ant part ...<get dest="downloads/apache-tomcat-6.0.43.tar.gz" skipexisting="true" verbose="true" src="http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz"/>... @ 5:182 in /home/liuwl/opt/datas/hadoop-2.5.0-cdh5.3.6/hadoop-common-project/hadoop-kms/target/antrun/build-main.xml
Solution:
The apache-tomcat-6.0.43.tar.gz archive was not downloaded completely. Download it manually into the hadoop-common-project/hadoop-kms/downloads directory; the download URL is listed in hadoop-common-project/hadoop-kms/target/antrun/build-main.xml. Note: use exactly the Tomcat version named in build-main.xml.
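For example, using the URL quoted in the error message above:

cd hadoop-common-project/hadoop-kms/downloads
wget http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz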
10. After the build completes, the distribution package is at hadoop-2.6.5-src/hadoop-dist/target/hadoop-2.6.5.tar.gz.
libhadoop.so.1.0.0 and libhdfs.so.0.0.0 are under hadoop-2.6.5-src/hadoop-dist/target/hadoop-2.6.5/lib/native.
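To confirm the native libraries actually load, unpack the freshly built tarball and run Hadoop's built-in native check (the checknative command is available in Hadoop 2.x):

bin/hadoop checknative -a   # should report hadoop: true with the path to libhadoop.so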