sillycat
HBase(2)JAVA Client

 

First of all, here is how to run a jar built with the Maven jar plugin.
pom.xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <mainClass>com.sillycat.easyhbase.ExecutorApp</mainClass>
          </manifest>
        </archive>
      </configuration>
    </plugin>
  </plugins>
</build>
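Note that maven-jar-plugin only writes the Main-Class into the manifest; it does not bundle dependencies into the jar. The example app below only prints to the console, so it runs fine, but a client that actually talks to HBase would normally need a fat jar. One common way to build one (a sketch, not part of the original setup) is the maven-shade-plugin; the ServicesResourceTransformer merges the META-INF/services files that Hadoop's FileSystem loader depends on:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.sillycat.easyhbase.ExecutorApp</mainClass>
              </transformer>
              <!-- Merge META-INF/services entries (e.g. Hadoop FileSystem providers) -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```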

ExecutorApp is just the simplest possible Java application.
package com.sillycat.easyhbase;

public class ExecutorApp {
    public static void main(String[] args) {
        System.out.println("Nice try.");
    }
}

Build the jar:
>mvn clean install

Run the jar:
>java -jar target/easyhbase-1.0.jar
Nice try.


Error Message
08-05 17:15:46 [WARN] org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:106) - Failed to identify the fs of dir hdfs://ubuntu-master:9000/hbase/lib, ignored
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:202)

Solution:
After adding this dependency, the problem is solved.
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>${hadoop.version}</version>
</dependency>
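If the error still appears after the hadoop-hdfs jar is on the classpath (for example after repackaging into a single jar merges the Hadoop service files), another widely used workaround, offered here as an assumption about your setup rather than something from the original fix, is to bind the hdfs scheme to its implementation class explicitly, either in core-site.xml or via Configuration.set on the client:

```xml
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>
```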

Error Message
An internal error occurred during: "Updating Maven Project".
Unsupported IClasspathEntry kind=4

Solution:
Because I am using Eclipse, I used the command mvn eclipse:eclipse to generate the jar dependencies.
Right click on the project, then [Properties] -> [Java Build Path] -> [Libraries].
Remove all the blue entries starting with "M2_REPO"; after that I can use Maven -> Update Project again, and I can even download the source code.

All the code is in the project easyhbase.

The dependencies are as follows:
<properties>
  <hadoop.version>2.4.1</hadoop.version>
  <hbase.version>0.98.4-hadoop2</hbase.version>
</properties>

 

<dependencies>
  <dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
    <version>1.1.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-hadoop2-compat</artifactId>
    <version>${hbase.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-common</artifactId>
    <version>${hbase.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>${hbase.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
  </dependency>
</dependencies>

The Java Client Sample Code
package com.sillycat.easyhbase;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.ZooKeeperConnectionException;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseMain {

    private static Configuration conf = null;

    // setting
    static {
        conf = HBaseConfiguration.create();
    }

    // create table
    public static void creatTable(String tableName, String[] familys) throws Exception {
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (admin.tableExists(tableName)) {
            System.out.println("table already exists!");
        } else {
            TableName tableNameObject = TableName.valueOf(tableName);
            HTableDescriptor tableDesc = new HTableDescriptor(tableNameObject);
            for (int i = 0; i < familys.length; i++) {
                tableDesc.addFamily(new HColumnDescriptor(familys[i]));
            }
            admin.createTable(tableDesc);
            System.out.println("create table " + tableName + " ok.");
        }
    }

 

    // delete table
    public static void deleteTable(String tableName) throws Exception {
        try {
            HBaseAdmin admin = new HBaseAdmin(conf);
            admin.disableTable(tableName);
            admin.deleteTable(tableName);
            System.out.println("delete table " + tableName + " ok.");
        } catch (MasterNotRunningException e) {
            e.printStackTrace();
        } catch (ZooKeeperConnectionException e) {
            e.printStackTrace();
        }
    }

 

    // insert one row
    public static void addRecord(String tableName, String rowKey, String family, String qualifier, String value) throws Exception {
        try {
            HTable table = new HTable(conf, tableName);
            Put put = new Put(Bytes.toBytes(rowKey));
            put.add(Bytes.toBytes(family), Bytes.toBytes(qualifier), Bytes.toBytes(value));
            table.put(put);
            System.out.println("insert record " + rowKey + " to table " + tableName + " ok.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // delete one row
    public static void delRecord(String tableName, String rowKey) throws IOException {
        HTable table = new HTable(conf, tableName);
        List<Delete> list = new ArrayList<Delete>();
        Delete del = new Delete(rowKey.getBytes());
        list.add(del);
        table.delete(list);
        System.out.println("del record " + rowKey + " ok.");
    }

 

    // query for one row
    public static void getOneRecord(String tableName, String rowKey) throws IOException {
        HTable table = new HTable(conf, tableName);
        Get get = new Get(rowKey.getBytes());
        Result rs = table.get(get);
        for (Cell cell : rs.rawCells()) {
            System.out.print(new String(CellUtil.cloneRow(cell)) + " ");
            System.out.print(new String(CellUtil.cloneFamily(cell)) + ":");
            System.out.print(new String(CellUtil.cloneQualifier(cell)) + " ");
            System.out.print(cell.getTimestamp() + " ");
            System.out.println(new String(CellUtil.cloneValue(cell)));
        }
    }

 

    // list all data
    public static void getAllRecord(String tableName) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan s = new Scan();
            ResultScanner ss = table.getScanner(s);
            for (Result r : ss) {
                for (Cell cell : r.rawCells()) {
                    System.out.print(new String(CellUtil.cloneRow(cell)) + " ");
                    System.out.print(new String(CellUtil.cloneFamily(cell)) + ":");
                    System.out.print(new String(CellUtil.cloneQualifier(cell)) + " ");
                    System.out.print(cell.getTimestamp() + " ");
                    System.out.println(new String(CellUtil.cloneValue(cell)));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

 

    // scan a range of row keys
    public static void getRangeRecord(String tableName, String startRowKey, String endRowKey) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan s = new Scan(startRowKey.getBytes(), endRowKey.getBytes());
            ResultScanner ss = table.getScanner(s);
            for (Result r : ss) {
                for (Cell cell : r.rawCells()) {
                    System.out.print(new String(CellUtil.cloneRow(cell)) + " ");
                    System.out.print(new String(CellUtil.cloneFamily(cell)) + ":");
                    System.out.print(new String(CellUtil.cloneQualifier(cell)) + " ");
                    System.out.print(cell.getTimestamp() + " ");
                    System.out.println(new String(CellUtil.cloneValue(cell)));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

 

    public static void main(String[] args) {
        try {
            String tablename = "scores";
            String[] familys = { "grade", "course" };
            HBaseMain.creatTable(tablename, familys);

            // add record sillycat
            HBaseMain.addRecord(tablename, "sillycat-20140723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20140723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20140723", "course", "art", "87");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "course", "art", "87");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "course", "art", "87");

            // add record kiko
            HBaseMain.addRecord(tablename, "kiko-20140723", "grade", "", "4");
            HBaseMain.addRecord(tablename, "kiko-20140723", "course", "math", "89");

            System.out.println("===========get one record========");
            HBaseMain.getOneRecord(tablename, "sillycat");
            System.out.println("===========show all record========");
            HBaseMain.getAllRecord(tablename);
            System.out.println("===========del one record========");
            HBaseMain.delRecord(tablename, "kiko");
            HBaseMain.getAllRecord(tablename);
            System.out.println("===========show all record========");
            HBaseMain.getAllRecord(tablename);
            System.out.print("=============show range record=======");
            HBaseMain.getRangeRecord(tablename, "sillycat-20130101", "sillycat-20141231");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
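The range scan at the end works because HBase sorts row keys lexicographically as bytes: a scan returns every key k with startRowKey <= k < endRowKey. With keys of the form name-YYYYMMDD, that byte order matches date order, which is why "sillycat-20130723" and "sillycat-20140723" fall inside the range while "sillycat-20120723" does not. A plain-Java sketch of that ordering (no HBase cluster needed; for ASCII keys String.compareTo agrees with byte order):

```java
public class RowKeyOrder {
    public static void main(String[] args) {
        String start = "sillycat-20130101";
        String stop = "sillycat-20141231";
        String[] keys = { "sillycat-20120723", "sillycat-20130723", "sillycat-20140723" };
        for (String key : keys) {
            // A key is inside the scan range when start <= key < stop, lexicographically.
            boolean inRange = start.compareTo(key) <= 0 && key.compareTo(stop) < 0;
            System.out.println(key + " in range: " + inRange);
        }
    }
}
```

Running this prints false for sillycat-20120723 and true for the other two keys, matching what the getRangeRecord scan above would return.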

References:
http://www.cnblogs.com/panfeng412/archive/2011/08/14/hbase-java-client-programming.html
http://lirenjuan.iteye.com/blog/1470645
http://hbase.apache.org/book/hbase_apis.html

http://blog.linezing.com/?tag=hbase
http://f.dataguru.cn/thread-226503-1-1.html
http://www.cnblogs.com/ggjucheng/p/3379459.html
http://blog.sina.com.cn/s/blog_ae33b83901016azb.html
http://blog.nosqlfan.com/html/3694.html
http://www.infoq.com/cn/news/2011/07/taobao-linhao-hbase
http://blog.csdn.net/xunianchong/article/details/8995019

client performance
http://blog.linezing.com/?p=1378
design row key
http://san-yun.iteye.com/blog/1995829
