
Mounting HDFS with FUSE: Installation and Configuration

 
  1. FUSE must be configured and started as root once it is installed. To allow another (non-root) account to mount the directory:
    echo user_allow_other >> /etc/fuse.conf
    chmod +rx /bin/fusermount
    Then add a few options to the mount command; the numbers are the user id and group id of that account: -d -o uid=2001 -o gid=2001
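     For example, a minimal sketch (the account name "hadoop" and the NameNode address are illustrative placeholders, not from the original):
    # look up the numeric uid/gid of the target account
    id hadoop                      # e.g. uid=2001(hadoop) gid=2001(hadoop)
    # later, when mounting (step 11), pass those ids so that account owns the mounted files
    ./fuse_dfs_wrapper.sh dfs://namenode:9000 /mnt/dfs -d -o uid=2001 -o gid=2001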
  2. Download the FUSE source package to the target directory:
    #wget "http://sourceforge.net/projects/fuse/files/fuse-2.X/2.8.4/fuse-2.8.4.tar.gz/download?use_mirror=jaist&r=&use_mirror=jaist" -P /usr/local
     
  3. Extract fuse-2.8.4.tar.gz:
    #tar -zxvf fuse-2.8.4.tar.gz
     
  4. Build, install, and configure FUSE:
    #cd fuse-2.8.4
    #./configure 
    #make 
    #make install 
    #modprobe fuse
     Note: make may fail with a "makefile not found" error. This usually means gcc is not installed; check for it and, if it is missing, install it (on CentOS):
    #yum -y install gcc
     After gcc is installed, run make and make install again, followed by modprobe fuse.
     To load the fuse module automatically at boot, create a module script and make it executable:
    vim /etc/sysconfig/modules/my.modules
    #!/bin/sh 
    modprobe fuse >/dev/null 2>&1 
    chmod +x /etc/sysconfig/modules/my.modules
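     To confirm the kernel module is actually loaded (a quick sanity check, not part of the original steps):
    # the fuse module should appear in the loaded-module list
    lsmod | grep fuse
    # and the fuse character device should exist
    ls -l /dev/fuse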
     
  5. Build Hadoop's libhdfs component. Compiling it requires the Ant tool to build the Java sources, so install Ant first:
    [root@localhost hadoop]#cd /usr/local/
    [root@localhost hadoop]#wget http://www.meisei-u.ac.jp/mirror/apache/dist/ant/binaries/apache-ant-1.7.1-bin.tar.gz
     
    [root@localhost hadoop]#tar zxvf apache-ant-1.7.1-bin.tar.gz
    [root@localhost hadoop]#mv apache-ant-1.7.1 ant
     
  6. Add ant to the system environment variables (details skipped in the original; a short sketch follows).
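     A minimal sketch, assuming ant was unpacked to /usr/local/ant as in step 5 (append to /etc/profile or ~/.bashrc, then re-login or source the file):
    export ANT_HOME=/usr/local/ant
    export PATH=$ANT_HOME/bin:$PATH
    # verify the installation
    ant -version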
  7. Build the libhdfs component itself:
    #cd $HADOOP_HOME/ 
    #ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1 
    #ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
     This build takes a while. $OS_ARCH and $OS_BIT in the ln command above stand for the machine architecture and word size; run:
    #file /bin/ls
     to check the local system, and substitute the corresponding values into the ln command above.
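     For example, on a 64-bit x86 machine (where file /bin/ls reports a 64-bit x86-64 ELF executable), the link would typically be as follows; the exact directory name depends on your build output:
    #ln -s c++/Linux-amd64-64/lib build/libhdfs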
  8. Build fuse-dfs:
    #cd $HADOOP_HOME
    #ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
     The build may fail with an error like the following:
    create-native-configure:
    
    BUILD FAILED
    /usr/local/hadoop/build.xml:634: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "/usr/local/hadoop/src/native"): java.io.IOException: error=2, No such file or directory
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
            at java.lang.Runtime.exec(Runtime.java:593)
            at org.apache.tools.ant.taskdefs.launcher.Java13CommandLauncher.exec(Java13CommandLauncher.java:41)
            at org.apache.tools.ant.taskdefs.Execute.launch(Execute.java:428)
            at org.apache.tools.ant.taskdefs.Execute.execute(Execute.java:442)
            at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:628)
            at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:669)
            at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:495)
            at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
            at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:597)
            at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
            at org.apache.tools.ant.Task.perform(Task.java:348)
            at org.apache.tools.ant.Target.execute(Target.java:435)
            at org.apache.tools.ant.Target.performTasks(Target.java:456)
            at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1393)
            at org.apache.tools.ant.Project.executeTarget(Project.java:1364)
            at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
            at org.apache.tools.ant.Project.executeTargets(Project.java:1248)
            at org.apache.tools.ant.Main.runBuild(Main.java:851)
            at org.apache.tools.ant.Main.startAnt(Main.java:235)
            at org.apache.tools.ant.launch.Launcher.run(Launcher.java:280)
            at org.apache.tools.ant.launch.Launcher.main(Launcher.java:109)
    Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
            at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
            at java.lang.ProcessImpl.start(ProcessImpl.java:65)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
            ... 23 more
    
     The error shows that the autoreconf tool cannot be found, so it needs to be installed:
    [root@localhost hadoop]# yum -y install automake autoconf
     Then rerun the ant build:
    [root@localhost hadoop]# ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
     Watch the build log; the following error may appear:
        [javac] Compiling 2 source files to /usr/local/hadoop/build/classes
    
    compile-mapred-classes:
    Trying to override old definition of task jsp-compile
        [javac] /usr/local/hadoop/build.xml:549: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    
    create-native-configure:
         [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
         [exec] ../../lib/autoconf/specific.m4:386: AC_USE_SYSTEM_EXTENSIONS is expanded from...
         [exec] ../../lib/autoconf/specific.m4:332: AC_GNU_SOURCE is expanded from...
         [exec] configure.ac:42: the top level
         [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
         [exec] ../../lib/autoconf/specific.m4:386: AC_USE_SYSTEM_EXTENSIONS is expanded from...
         [exec] ../../lib/autoconf/specific.m4:332: AC_GNU_SOURCE is expanded from...
         [exec] configure.ac:42: the top level
         [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
         [exec] ../../lib/autoconf/specific.m4:386: AC_USE_SYSTEM_EXTENSIONS is expanded from...
         [exec] ../../lib/autoconf/specific.m4:332: AC_GNU_SOURCE is expanded from...
         [exec] configure.ac:42: the top level
         [exec] configure.ac:48: error: possibly undefined macro: AC_PROG_LIBTOOL
         [exec]       If this token and others are legitimate, please use m4_pattern_allow.
         [exec]       See the Autoconf documentation.
         [exec] autoreconf: /usr/bin/autoconf failed with exit status: 1
    
    BUILD FAILED
    /usr/local/hadoop/build.xml:634: exec returned: 1
    
    Total time: 18 seconds
     The error "possibly undefined macro: AC_PROG_LIBTOOL" above points to missing autotools components. First make sure m4 is installed; to be safe, just run the install command:
    [root@localhost hadoop]# yum -y install m4
     If the error persists after rerunning the ant build, install the libtool system component as well:
    [root@localhost hadoop]# yum -y install libtool
     Once libtool is installed, rerun the ant build and the problem should be resolved.
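     To avoid this install-and-retry loop, the usual build prerequisites can also be installed up front in one command (the package names below are the standard CentOS ones; adjust for your distribution):
    yum -y install gcc gcc-c++ automake autoconf libtool m4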
  9. Edit the fuse-dfs wrapper script:
    [root@localhost fuse-dfs]# vim $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
    
     The file contents are as follows:
    #
    # Copyright 2005 The Apache Software Foundation
    #
    # Licensed under the Apache License, Version 2.0 (the "License");
    # you may not use this file except in compliance with the License.
    # You may obtain a copy of the License at
    #
    #     http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.
    #
    
    if [ "$HADOOP_HOME" = "" ]; then
    export HADOOP_HOME=/usr/local/hadoop
    fi
    
    export PATH=$HADOOP_HOME/contrib/fuse_dfs:$PATH
    
    for f in $HADOOP_HOME/lib/*.jar $HADOOP_HOME/*.jar ; do
    export  CLASSPATH=$CLASSPATH:$f
    done
    
    if [ "$OS_ARCH" = "" ]; then
    export OS_ARCH=amd64
    fi
    
    if [ "$JAVA_HOME" = "" ]; then
    export  JAVA_HOME=/usr/local/jdk6
    fi
    
    if [ "$LD_LIBRARY_PATH" = "" ]; then
    export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/usr/local/hadoop/build/libhdfs:/usr/local/lib
    fi
    
    ./fuse_dfs $@
     Change the default paths in the script (HADOOP_HOME, OS_ARCH, JAVA_HOME, LD_LIBRARY_PATH) to match your own environment, then save and exit.
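     Before running the wrapper it is worth checking that the paths referenced in LD_LIBRARY_PATH actually contain the native libraries (a sanity check, not part of the original; adjust the paths to your own layout):
    ls $JAVA_HOME/jre/lib/$OS_ARCH/server/libjvm.so
    ls /usr/local/hadoop/build/libhdfs/libhdfs.so*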
  10. Set permissions:
    chmod +x /data/soft/hadoop-2.20.1/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh 
    chmod +x /data/soft/hadoop-2.20.1/build/contrib/fuse-dfs/fuse_dfs 
    # create symlinks to make the commands easier to run
    ln -s /data/soft/hadoop-2.20.1/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin 
    ln -s /data/soft/hadoop-2.20.1/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/ 
    # create the mount point
    mkdir /mnt/dfs
     
  11. Manually mount HDFS onto the local /mnt/dfs:
    [root@localhost fuse-dfs]# ./fuse_dfs_wrapper.sh  dfs://192.168.170.248:9000 /mnt/dfs  port=9000,server=192.168.170.248
    fuse-dfs didn't recognize /mnt/dfs,-2 
     The warning "fuse-dfs didn't recognize /mnt/dfs,-2" can be ignored; it does not affect use.
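     Once mounted, the filesystem can be checked and unmounted with standard tools (a quick sanity check, not part of the original steps):
    # the HDFS mount should show up as a fuse filesystem
    df -h /mnt/dfs
    mount | grep fuse
    # unmount when finished
    umount /mnt/dfs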
  12. Add the mount to the boot-time configuration:
    vi /etc/fstab 
    fuse_dfs_wrapper.sh dfs://192.168.1.11:54310 /mnt/dfs    fuse rw,auto 0 0
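     This assumes fuse_dfs_wrapper.sh is reachable on root's PATH (the symlink created in step 10). After the next boot the mount can be verified the same way as a manual mount:
    df -h /mnt/dfs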
     
  13. Upload files to HDFS:
    [hadoop@localhost ~]$ hadoop fs -mkdir test
    [hadoop@localhost data]$ hadoop fs -put /usr/local/hadoop/data/input* test
     
  14. Check the contents of the local mount point:
    [root@localhost fuse-dfs]# ll /mnt/dfs/
     Closing note: this FUSE installation went fairly smoothly. The article follows the installation order; depending on your own system environment, some of the auxiliary tool installations can be skipped.
Comments
duguyiren3476 (2014-12-12): Using hadoop 1.2.X.
kkgoing (2014-11-12): Which hadoop version is this? I keep getting errors when compiling the libhdfs component and the build never succeeds. Also, don't current hadoop releases such as 1.2.1 already ship libhdfs.so, so there is no need to compile it yourself?
