(8) Implementing a simple Hadoop-based network disk, part 4
File structure
(1) index.jsp, the home page
<%@ include file="head.jsp"%>
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<%@ page import="org.apache.hadoop.fs.FileStatus"%>
<body style="text-align:center;margin-bottom:100px;">
  <div class="navbar">
    <div class="navbar-inner">
      <a class="brand" href="#" style="margin-left:200px;">网盘</a>
      <ul class="nav">
        <li><a href="LogoutServlet">退出</a></li>
      </ul>
    </div>
  </div>
  <div style="margin:0px auto; text-align:left;width:1200px; height:50px;">
    <form class="form-inline" method="POST" enctype="MULTIPART/FORM-DATA" action="UploadServlet">
      <div style="line-height:50px;float:left;">
        <input type="submit" name="submit" value="上传文件"/>
      </div>
      <div style="line-height:50px;float:left;">
        <input type="file" name="file1" size="30"/>
      </div>
    </form>
  </div>
  <div style="margin:0px auto; width:1200px;height:500px; background:#fff">
    <table class="table table-hover" style="width:1000px;margin-left:100px;">
      <tr style="border-bottom:2px solid #ddd">
        <td>文件名</td>
        <td style="width:100px">类型</td>
        <td style="width:100px;">大小(KB)</td>
        <td style="width:100px;">操作</td>
        <td style="width:100px;">操作</td>
      </tr>
      <%
        // The listing produced by the servlets is passed in as the "list" attribute
        FileStatus[] list = (FileStatus[]) request.getAttribute("list");
        if (list != null)
          for (int i = 0; i < list.length; i++) {
      %>
      <tr style="border-bottom:1px solid #eee">
        <%
          if (list[i].isDir()) {
            // Directories link to DocumentServlet so the user can browse into them
            out.print("<td><a href=\"DocumentServlet?filePath=" + list[i].getPath() + "\">"
                + list[i].getPath().getName() + "</a></td>");
          } else {
            out.print("<td>" + list[i].getPath().getName() + "</td>");
          }
        %>
        <td><%= (list[i].isDir() ? "目录" : "文件") %></td>
        <td><%= list[i].getLen() / 1024 %></td>
        <td><a href="DeleteFileServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">x</a></td>
        <td><a href="DownloadServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">下载</a></td>
      </tr>
      <%
          }
      %>
    </table>
  </div>
</body>
</html>
(2) document.jsp
<%@ include file="head.jsp"%>
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<%@ page import="org.apache.hadoop.fs.FileStatus"%>
<body style="text-align:center;margin-bottom:100px;">
  <div class="navbar">
    <div class="navbar-inner">
      <a class="brand" href="#" style="margin-left:200px;">网盘</a>
      <ul class="nav">
        <li class="active"><a href="#">首页</a></li>
        <li><a href="#">Link</a></li>
        <li><a href="#">Link</a></li>
      </ul>
    </div>
  </div>
  <div style="margin:0px auto; text-align:left;width:1200px; height:50px;">
    <form class="form-inline" method="POST" enctype="MULTIPART/FORM-DATA" action="UploadServlet">
      <div style="line-height:50px;float:left;">
        <input type="submit" name="submit" value="上传文件"/>
      </div>
      <div style="line-height:50px;float:left;">
        <input type="file" name="file1" size="30"/>
      </div>
    </form>
  </div>
  <div style="margin:0px auto; width:1200px;height:500px; background:#fff">
    <table class="table table-hover" style="width:1000px;margin-left:100px;">
      <tr>
        <td>文件名</td><td>属性</td><td>大小(KB)</td><td>操作</td><td>操作</td>
      </tr>
      <%
        FileStatus[] list = (FileStatus[]) request.getAttribute("documentList");
        if (list != null)
          for (int i = 0; i < list.length; i++) {
      %>
      <tr style="border-bottom:2px solid #ddd">
        <%
          if (list[i].isDir()) {
            out.print("<td><a href=\"DocumentServlet?filePath=" + list[i].getPath() + "\">"
                + list[i].getPath().getName() + "</a></td>");
          } else {
            out.print("<td>" + list[i].getPath().getName() + "</td>");
          }
        %>
        <td><%= (list[i].isDir() ? "目录" : "文件") %></td>
        <td><%= list[i].getLen() / 1024 %></td>
        <td><a href="DeleteFileServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">x</a></td>
        <td><a href="DownloadServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">下载</a></td>
      </tr>
      <%
          }
      %>
    </table>
  </div>
</body>
</html>
(3) DeleteFileServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DeleteFileServlet
 */
public class DeleteFileServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // The JSP URL-encoded the path as GB2312; Tomcat decodes query strings as
        // ISO-8859-1 by default, so re-decode the raw bytes as GB2312
        String filePath = new String(request.getParameter("filePath")
                .getBytes("ISO-8859-1"), "GB2312");
        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        hdfs.rmr(filePath);                         // recursively delete the file or directory
        System.out.println("====" + filePath + "====");
        FileStatus[] list = hdfs.ls("/user/root/"); // re-list the root directory
        request.setAttribute("list", list);
        request.getRequestDispatcher("index.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
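A note on the character-set round trip in this servlet: the JSP pages URL-encode the HDFS path as GB2312 before placing it in the query string, while Tomcat (through version 7) decodes query parameters as ISO-8859-1 by default. Taking the mis-decoded string back to raw bytes and re-decoding them as GB2312 recovers Chinese file names. The two halves side by side (status here is a stand-in for one FileStatus entry from the listing):

// In the JSP: encode the HDFS path for the query string
String href = "DeleteFileServlet?filePath="
        + java.net.URLEncoder.encode(status.getPath().toString(), "GB2312");

// In the servlet: undo Tomcat's default ISO-8859-1 query decoding
String filePath = new String(request.getParameter("filePath")
        .getBytes("ISO-8859-1"), "GB2312");

Alternatively, setting URIEncoding="GB2312" on the Tomcat connector in server.xml would make the manual re-decoding unnecessary.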
(4) UploadServlet
package com.controller;

import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import java.util.List;

import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class UploadServlet
 */
public class UploadServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doPost(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        request.setCharacterEncoding("UTF-8");
        File file;
        int maxFileSize = 50 * 1024 * 1024; // maximum upload size: 50 MB
        int maxMemSize = 50 * 1024 * 1024;  // in-memory threshold: 50 MB
        ServletContext context = getServletContext();
        // Local staging directory, configured as the "file-upload" context parameter in web.xml
        String filePath = context.getInitParameter("file-upload");
        System.out.println("source file path:" + filePath);

        // Only handle multipart/form-data requests
        String contentType = request.getContentType();
        if (contentType != null && contentType.indexOf("multipart/form-data") >= 0) {
            DiskFileItemFactory factory = new DiskFileItemFactory();
            // Items up to this size are kept in memory
            factory.setSizeThreshold(maxMemSize);
            // Larger items are spooled to this local directory
            factory.setRepository(new File("c:\\temp"));
            // Create a new file upload handler
            ServletFileUpload upload = new ServletFileUpload(factory);
            // Cap the maximum upload size
            upload.setSizeMax(maxFileSize);
            try {
                // Parse the multipart request
                List fileItems = upload.parseRequest(request);
                Iterator i = fileItems.iterator();
                System.out.println("begin to upload file to tomcat server");
                while (i.hasNext()) {
                    FileItem fi = (FileItem) i.next();
                    if (!fi.isFormField()) {
                        // Upload metadata
                        String fieldName = fi.getFieldName();
                        String fileName = fi.getName();
                        // Strip any Windows-style path prefix sent by the browser
                        String fn = fileName.substring(fileName.lastIndexOf("\\") + 1);
                        System.out.println(fn);
                        boolean isInMemory = fi.isInMemory();
                        long sizeInBytes = fi.getSize();
                        // First write the file into the local staging directory
                        if (fileName.lastIndexOf("\\") >= 0) {
                            file = new File(filePath, fileName.substring(fileName.lastIndexOf("\\")));
                        } else {
                            file = new File(filePath, fileName.substring(fileName.lastIndexOf("\\") + 1));
                        }
                        fi.write(file);
                        System.out.println("upload file to tomcat server success!");

                        // Then copy the staged file from Tomcat into HDFS,
                        // under the logged-in user's directory
                        System.out.println("begin to upload file to hadoop hdfs");
                        String username = (String) request.getSession().getAttribute("username");
                        JobConf conf = HdfsDAO.config();
                        HdfsDAO hdfs = new HdfsDAO(conf);
                        hdfs.copyFile(filePath + "\\" + fn, "/" + username + "/" + fn);
                        System.out.println("upload file to hadoop hdfs success!");
                        System.out.println("username-----" + username);

                        // Re-list the user's directory and forward back to the home page.
                        // Note: this forwards inside the loop, so only one uploaded file
                        // per request is effectively handled.
                        FileStatus[] list = hdfs.ls("/" + username);
                        request.setAttribute("list", list);
                        request.getRequestDispatcher("index.jsp").forward(request, response);
                    }
                }
            } catch (Exception ex) {
                System.out.println(ex);
            }
        } else {
            System.out.println("No file uploaded");
        }
    }
}
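One portability caveat in UploadServlet: the staged file path is joined with a hard-coded "\\" separator, so the HDFS copy only works when Tomcat runs on Windows. A separator-neutral join (an adjustment, not part of the original code) could look like this:

// Let java.io.File supply the platform's path separator instead of hard-coding "\\"
File staged = new File(filePath, fn);
hdfs.copyFile(staged.getAbsolutePath(), "/" + username + "/" + fn);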
(5) DownloadServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DownloadServlet
 */
public class DownloadServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Hard-coded local target directory on the server
        String local = "C:/";
        // Same GB2312 re-decoding as in DeleteFileServlet
        String filePath = new String(request.getParameter("filePath")
                .getBytes("ISO-8859-1"), "GB2312");
        System.out.println(filePath);
        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        hdfs.download(filePath, local);             // copy from HDFS to the server's C:/
        FileStatus[] list = hdfs.ls("/user/root/");
        request.setAttribute("list", list);
        request.getRequestDispatcher("index.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
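As written, DownloadServlet only copies the file out of HDFS into C:/ on the machine running Tomcat; nothing is actually returned to the browser. A real network disk would stream the HDFS file into the HTTP response instead. A minimal sketch of such a doGet body, assuming the standard org.apache.hadoop.fs.FileSystem API (this replaces the download-to-C:/ logic and is not part of the original code):

// filePath is the full hdfs:// URI decoded above
org.apache.hadoop.fs.Path path = new org.apache.hadoop.fs.Path(filePath);
org.apache.hadoop.fs.FileSystem fs = path.getFileSystem(HdfsDAO.config());

// Tell the browser to save the file under its original name
response.setContentType("application/octet-stream");
response.setHeader("Content-Disposition",
        "attachment; filename=\"" + path.getName() + "\"");

// Copy the HDFS stream into the servlet response
java.io.InputStream in = fs.open(path);
java.io.OutputStream out = response.getOutputStream();
byte[] buf = new byte[4096];
int len;
while ((len = in.read(buf)) > 0) {
    out.write(buf, 0, len);
}
in.close();
out.flush();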
(6) DocumentServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DocumentServlet
 */
public class DocumentServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String filePath = new String(request.getParameter("filePath")
                .getBytes("ISO-8859-1"), "GB2312");
        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        FileStatus[] documentList = hdfs.ls(filePath);
        request.setAttribute("documentList", documentList);
        request.getRequestDispatcher("document.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
(7) LogoutServlet

package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

/**
 * Servlet implementation class LogoutServlet
 */
public class LogoutServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Drop the logged-in user from the session and return to the login page
        HttpSession session = request.getSession();
        session.removeAttribute("username");
        request.getRequestDispatcher("login.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
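All of the servlets above delegate HDFS access to the HdfsDAO class from the earlier installments of this series. For readers who don't have those parts handy, here is a minimal sketch of just the five methods used in this post (config, ls, rmr, copyFile, download), assuming a pseudo-distributed cluster; the NameNode address is a placeholder, and the real class from the series may differ in detail:

package com.model;

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

public class HdfsDAO {

    // Placeholder NameNode address; change to match your cluster
    private static final String HDFS = "hdfs://localhost:9000/";

    private final Configuration conf;

    public HdfsDAO(Configuration conf) {
        this.conf = conf;
    }

    // JobConf that the servlets pass into the constructor
    public static JobConf config() {
        JobConf conf = new JobConf(HdfsDAO.class);
        conf.setJobName("HdfsDAO");
        return conf;
    }

    // List the entries of a directory (feeds the index.jsp / document.jsp tables)
    public FileStatus[] ls(String folder) throws IOException {
        FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
        try {
            return fs.listStatus(new Path(folder));
        } finally {
            fs.close();
        }
    }

    // Recursive delete, like "hadoop fs -rmr" (DeleteFileServlet)
    public void rmr(String folder) throws IOException {
        FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
        try {
            fs.delete(new Path(folder), true);
        } finally {
            fs.close();
        }
    }

    // Copy a local file into HDFS (UploadServlet)
    public void copyFile(String local, String remote) throws IOException {
        FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
        try {
            fs.copyFromLocalFile(new Path(local), new Path(remote));
        } finally {
            fs.close();
        }
    }

    // Copy an HDFS file to a local directory (DownloadServlet)
    public void download(String remote, String local) throws IOException {
        FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
        try {
            fs.copyToLocalFile(new Path(remote), new Path(local));
        } finally {
            fs.close();
        }
    }
}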
That completes a simple Hadoop-based network disk application. If you want it to feel more like a real network drive, it is worth spending more time implementing the remaining features.
Source code download: http://download.csdn.net/detail/wen294299195/7779949