LineRecordReader.next(LongWritable key, Text value)
LineReader.readLine(Text str, int maxLineLength, int maxBytesToConsume)
DataInputStream.read(byte b[]) /* DFSDataInputStream inherits this method */
DFSInputStream.read(long position, byte[] buffer, int offset, int length)
DFSInputStream.fetchBlockByteRange(LocatedBlock block, long start, long end, byte[] buf, int offset)
BlockReader.readAll(byte[] buf, int offset, int len)
FSInputChecker.readFully(InputStream stm, byte[] buf, int offset, int len)
BlockReader.read(byte[] buf, int off, int len)
FSInputChecker.read(byte[] b, int off, int len)
FSInputChecker.read1(byte b[], int off, int len)
FSInputChecker.readChecksumChunk(byte b[], final int off, final int len)
BlockReader.readChunk(long pos, byte[] buf, int offset, int len, byte[] checksumBuf)
IOUtils.readFully(InputStream in, byte buf[], int off, int len)
DataInputStream.read(byte b[], int off, int len)
BufferedInputStream.read(byte b[], int off, int len)
BufferedInputStream.read1(byte[] b, int off, int len)
org.apache.hadoop.net.SocketInputStream.read(byte[] b, int off, int len)
org.apache.hadoop.net.SocketInputStream.read(ByteBuffer dst)
org.apache.hadoop.net.SocketIOWithTimeout.doIO(ByteBuffer buf, int ops)
org.apache.hadoop.net.SocketInputStream.Reader.performIO(ByteBuffer buf)
sun.nio.ch.SocketChannelImpl.read(ByteBuffer buf)
sun.nio.ch.IOUtil.read(FileDescriptor fd, ByteBuffer dst, long position, NativeDispatcher nd, Object lock)
sun.nio.ch.IOUtil.readIntoNativeBuffer(FileDescriptor fd, ByteBuffer bb, long position, NativeDispatcher nd, Object lock)
sun.nio.ch.SocketDispatcher.read(FileDescriptor fd, long address, int len)
sun.nio.ch.SocketDispatcher.read0(FileDescriptor fd, long address, int len) /* native method; implementation differs by JDK */
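A pattern worth noting in the chain above is IOUtils.readFully: a single InputStream.read call may return fewer bytes than requested (especially over a socket), so the HDFS client loops until the full range is filled. The sketch below reproduces that loop semantics in plain Java against a ByteArrayInputStream; the class name ReadFullyDemo is my own, and this is an illustration of the pattern, not the actual Hadoop source.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Mirrors the semantics of org.apache.hadoop.io.IOUtils.readFully:
    // keep calling read() until `len` bytes land in buf[off..off+len),
    // because one read() may legally return fewer bytes than asked for.
    public static void readFully(InputStream in, byte[] buf, int off, int len)
            throws IOException {
        int read = 0;
        while (read < len) {
            int n = in.read(buf, off + read, len - read);
            if (n < 0) {
                throw new IOException("Premature EOF from inputStream");
            }
            read += n;
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes();
        byte[] buf = new byte[data.length];
        readFully(new ByteArrayInputStream(data), buf, 0, buf.length);
        System.out.println(new String(buf));
    }
}
```

The same loop appears (with timeout handling added) further down the stack in SocketInputStream.read, which is why a short read from the DataNode connection never surfaces as a partial checksum chunk.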