Tools involved: hadoop-2.4.0-src.tar.gz, Ant, Maven, JDK, GCC, CMake, and OpenSSL.
Step 1: install/upgrade the packages needed for the build (latest versions):
yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz  (source release)
tar -zxvf hadoop-2.4.0-src.tar.gz
wget http://apache.fayea.com/apache-mirror//ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar -xvf apache-ant-1.9.4-bin.tar.gz
wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
tar -xvf apache-maven-3.0.5-bin.tar.gz
vi /etc/profile  (the paths below assume the extracted Ant and Maven directories were renamed to /home/hadoop/ant and /home/hadoop/maven, and that FindBugs 2.0.3 is unpacked under /home/hadoop)
export JAVA_HOME=/usr/java/jdk1.7.0_55
export JAVA_BIN=/usr/java/jdk1.7.0_55/bin
export ANT_HOME=/home/hadoop/ant
export MVN_HOME=/home/hadoop/maven
export FINDBUGS_HOME=/home/hadoop/findbugs-2.0.3
export PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MVN_HOME/bin:$FINDBUGS_HOME/bin
Apply the profile:
source /etc/profile
Verify the configuration:
ant -version
mvn -version
findbugs -version
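The checks above can be wrapped in one small script; a minimal sketch (the tool list is an assumption drawn from this guide's requirements):

```shell
# Report whether each build tool required by this guide is on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING"
  fi
}

for t in java ant mvn findbugs gcc; do
  check_tool "$t"
done
```

Any line printing MISSING means the corresponding install or PATH entry above still needs attention.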
Install protobuf (log in as root):
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
protoc --version
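If `protoc --version` fails to start after `make install`, the usual cause is that /usr/local/lib (protobuf's default install location) is not on the dynamic linker path; adding it to /etc/ld.so.conf.d/ and running `ldconfig` typically fixes that. A hedged sketch of a version guard, since Hadoop 2.4 builds against protobuf 2.5.x (the function name is ours):

```shell
# Check that a `protoc --version` string is from the 2.5.x series,
# which the Hadoop 2.4 build expects.
require_protoc_25() {
  case "$1" in
    "libprotoc 2.5."*) echo ok ;;
    *) echo "unexpected protoc: $1" ;;
  esac
}

# e.g. require_protoc_25 "$(protoc --version 2>/dev/null)"
```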
Install CMake (log in as root). Note: building with ./bootstrap requires the source tarball, not the Linux binary archive:
wget http://www.cmake.org/files/v2.8/cmake-2.8.12.2.tar.gz
tar -zxvf cmake-2.8.12.2.tar.gz
cd cmake-2.8.12.2
./bootstrap
make
make install
cmake --version
(Alternatively, just install it with: yum install cmake)
Build Hadoop
mvn package -DskipTests -Pdist,native -Dtar
At this point Maven downloads all required dependency jars and plugins, which takes a while.
Once the build succeeds, check whether the native libraries were compiled:
[root@master hadoop-2.4.1-src]# cd hadoop-dist/target/hadoop-2.4.1/lib/native/
[root@master native]# file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=0xba68c7f46259525c3aae4ebd99e1faf3b6c7e7a6, not stripped
This output indicates a successful build.
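The `file` check can be scripted; a sketch that inspects the `file` output for the 64-bit ELF signature shown above (the helper name is ours):

```shell
# Decide from `file` output whether a native library is a 64-bit ELF object,
# i.e. usable by a 64-bit JVM.
is_64bit_elf() {
  case "$1" in
    *"ELF 64-bit"*) echo ok ;;
    *) echo "not 64-bit ELF" ;;
  esac
}

# e.g. is_64bit_elf "$(file hadoop-dist/target/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0)"
```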
Alternatively, a reactor summary like the following also indicates success:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [5.731s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.215s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.122s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.548s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.271s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.020s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [7.431s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.517s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.727s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:53.800s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [16.696s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.042s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [6:07.368s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [48.810s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [20.154s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [9.709s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.049s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.138s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:00.295s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:00.256s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.076s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [21.974s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [28.986s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [6.791s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [13.558s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.431s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [2.644s]
[INFO] hadoop-yarn-client ................................ SUCCESS [12.729s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.102s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.878s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.103s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.055s]
[INFO] hadoop-yarn-project ............................... SUCCESS [6.390s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.211s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [39.919s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [34.197s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [5.716s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [18.761s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [17.226s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [7.617s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.211s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.571s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.483s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [10.180s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.514s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [5.243s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [12.533s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [8.247s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [6.091s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [5.339s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [13.666s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [14.356s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.354s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.145s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [39.951s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [8.662s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:45.654s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:27.228s
[INFO] Finished at: Tue Sep 16 23:09:45 HKT 2014
[INFO] Final Memory: 67M/179M
[INFO] ------------------------------------------------------------------------
Error 1
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 46.796s
[INFO] Finished at: Wed Jun 04 13:28:37 CST 2014
[INFO] Final Memory: 36M/88M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.4.0: Failure to find org.apache.commons:commons-compress:jar:1.4.1 in https://repository.apache.org/content/repositories/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots.https has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
Solution:
The log above says it cannot find "org.apache.commons:commons-compress:jar:1.4.1".
Copying the jar from a local (Windows) Maven repository into the Linux repository resolved it.
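An alternative to copying the jar by hand is to delete the stale cached entry and re-run Maven with `-U`, which forces an update check on failed downloads. A hedged helper for locating a coordinate inside a local repository (standard Maven repository layout; the function name is ours):

```shell
# Map a Maven group:artifact:version coordinate to its path under a
# local-repository root, so a stale cached entry can be removed.
gav_to_path() {
  repo="$1"
  IFS=: read -r g a v <<EOF
$2
EOF
  echo "$repo/$(echo "$g" | tr . /)/$a/$v"
}

# e.g.:
#   rm -rf "$(gav_to_path "$HOME/.m2/repository" org.apache.commons:commons-compress:1.4.1)"
#   mvn package -DskipTests -Pdist,native -Dtar -U
```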
Error 2
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:16.693s
[INFO] Finished at: Wed Jun 04 13:56:31 CST 2014
[INFO] Final Memory: 48M/239M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
[ERROR] around Ant part ...<exec dir="/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:133 in /home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
Solution:
cmake was not installed; install it as described in Section 5.3.1, "Build environment preparation".
Error 3
The error complains about missing files and failure to create directories, and no matching report turned up online. Based on experience, changing the build tree's permissions to 775 (so files and directories can be created) resolves it; also make sure the build directory has 2.5-4 GB of free space:
chmod -Rf 775 ./hadoop-2.4.0-src
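The free-space requirement mentioned above can be checked before building; a minimal sketch using the 4 GB upper bound quoted here (the helper name is ours):

```shell
# Check that the build directory has enough free space. Input is free
# space in KB, e.g. from: df -k . | awk 'NR==2 {print $4}'
enough_space() {
  need_kb=$((4 * 1024 * 1024))   # 4 GB expressed in KB
  if [ "$1" -ge "$need_kb" ]; then
    echo ok
  else
    echo "low: only $1 KB free"
  fi
}
```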
main:
[mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-pipes ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native
[exec] -- The C compiler identification is GNU 4.4.7
[exec] -- The CXX compiler identification is GNU 4.4.7
[exec] -- Check for working C compiler: /usr/bin/cc
[exec] -- Check for working C compiler: /usr/bin/cc -- works
[exec] -- Detecting C compiler ABI info
[exec] -- Detecting C compiler ABI info - done
[exec] -- Check for working CXX compiler: /usr/bin/c++
[exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
[exec] -- Detecting CXX compiler ABI info
[exec] -- Detecting CXX compiler ABI info - done
[exec] CMake Error at /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
[exec] Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
[exec] system variable OPENSSL_ROOT_DIR (missing: OPENSSL_LIBRARIES
[exec] OPENSSL_INCLUDE_DIR)
[exec] Call Stack (most recent call first):
[exec] /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
[exec] /usr/local/share/cmake-2.8/Modules/FindOpenSSL.cmake:313 (find_package_handle_standard_args)
[exec] CMakeLists.txt:20 (find_package)
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!
[exec] See also "/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native/CMakeFiles/CMakeOutput.log".
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [13.745s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5.538s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [7.296s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.568s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [5.858s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.541s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [8.337s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.348s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.926s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:35.956s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [18.680s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.059s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:03.525s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [38.335s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [23.780s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [8.769s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.159s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.134s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:07.657s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:10.680s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.165s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [24.174s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [27.293s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [5.177s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [11.399s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.384s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [1.346s]
[INFO] hadoop-yarn-client ................................ SUCCESS [12.937s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.108s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.303s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.212s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.050s]
[INFO] hadoop-yarn-project ............................... SUCCESS [8.638s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.135s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [43.622s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [36.329s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [6.058s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [20.058s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [16.493s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [11.685s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.222s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.656s]
[INFO] hadoop-mapreduce .................................. SUCCESS [8.060s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [8.994s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.886s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [6.659s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [15.722s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [11.778s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [5.953s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [6.414s]
[INFO] Apache Hadoop Pipes ............................... FAILURE [3.746s]
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:43.155s
[INFO] Finished at: Wed Jun 04 17:40:17 CST 2014
[INFO] Final Memory: 79M/239M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:123 in /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
According to a hint found online (openssl-devel must also be installed, via yum install openssl-devel; skipping this step produces the following error:
[exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):
[exec] Could NOT find OpenSSL
[exec] Call Stack (most recent call first):
[exec] CMakeLists.txt:20 (find_package)
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluen ... oExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-pipes
)
Reference link: http://f.dataguru.cn/thread-189176-1-1.html
Cause: openssl-devel had been mistyped during installation (one "l" was missing), so it never actually installed.
Solution: reinstall openssl-devel:
yum install openssl-devel
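To confirm openssl-devel actually installed the headers that CMake's FindOpenSSL module searches for, a quick check (/usr/include/openssl is the standard CentOS location; the helper name is ours):

```shell
# Verify the OpenSSL development headers are present; CMake's FindOpenSSL
# needs ssl.h (among others) to set OPENSSL_INCLUDE_DIR.
have_openssl_headers() {
  dir="${1:-/usr/include/openssl}"
  if [ -f "$dir/ssl.h" ]; then
    echo ok
  else
    echo "missing: $dir/ssl.h"
  fi
}
```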
5.3.3 Build summary
1. The base packages must be installed (yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel).
2. The protobuf and CMake build tools must be installed.
3. Ant, Maven, and FindBugs must be configured.
4. Point the Maven repository at the OSChina mirror to speed up the build, i.e. to download dependency jars faster.
5. When the build fails, read the error log carefully, analyze the cause, then search Baidu and Google for a fix.
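For item 4, one way to point Maven at the OSChina mirror is a `<mirror>` entry in ~/.m2/settings.xml (the mirror URL below is an assumption from that era; substitute any reachable Maven mirror):

```xml
<!-- ~/.m2/settings.xml: route central-repository requests through a mirror -->
<settings>
  <mirrors>
    <mirror>
      <id>oschina</id>
      <mirrorOf>central</mirrorOf>
      <name>OSChina central mirror (example URL, verify before use)</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>
  </mirrors>
</settings>
```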