mvn clean compile package install -Phadoop-2 -DskipTests
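A note on the goals above: in Maven's default lifecycle, `install` already implies `compile` and `package`, so the command can be trimmed without changing what gets built. A minimal equivalent sketch (same profile and flags as the command above; the variable is only used here for illustration):

```shell
# install runs the full default lifecycle, so compile and package are
# implied; -Phadoop-2 activates the Hadoop 2 shims profile and
# -DskipTests skips unit test execution.
MVN_CMD="mvn clean install -Phadoop-2 -DskipTests"
echo "$MVN_CMD"
```

Either form produces the same artifacts; the longer command simply repeats phases Maven would run anyway.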
main:
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
     [copy] Copying 11 files to /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-packaging ---
[INFO]
[INFO] --- maven-gpg-plugin:1.4:sign (sign-artifacts) @ hive-packaging ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-packaging ---
[INFO] Installing /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/pom.xml to /root/.m2/repository/org/spark-project/hive/hive-packaging/1.2.1.spark/hive-packaging-1.2.1.spark.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive ............................................... SUCCESS [  2.563 s]
[INFO] Hive Shims Common .................................. SUCCESS [  3.779 s]
[INFO] Hive Shims 0.20S ................................... SUCCESS [  1.568 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  5.433 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  2.011 s]
[INFO] Hive Shims ......................................... SUCCESS [  1.557 s]
[INFO] Hive Common ........................................ SUCCESS [  5.571 s]
[INFO] Hive Serde ......................................... SUCCESS [  5.134 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 15.928 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  0.552 s]
[INFO] Spark Remote Client ................................ SUCCESS [  6.468 s]
[INFO] Hive Query Language ................................ SUCCESS [ 48.084 s]
[INFO] Hive Service ....................................... SUCCESS [  5.605 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [  4.734 s]
[INFO] Hive JDBC .......................................... SUCCESS [ 13.971 s]
[INFO] Hive Beeline ....................................... SUCCESS [  3.101 s]
[INFO] Hive CLI ........................................... SUCCESS [  2.993 s]
[INFO] Hive Contrib ....................................... SUCCESS [  2.797 s]
[INFO] Hive HBase Handler ................................. SUCCESS [  5.414 s]
[INFO] Hive HCatalog ...................................... SUCCESS [  0.950 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  4.163 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  3.001 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [  3.124 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  3.362 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 12.030 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  3.114 s]
[INFO] Hive HWI ........................................... SUCCESS [  3.020 s]
[INFO] Hive ODBC .......................................... SUCCESS [  2.443 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.211 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.227 s]
[INFO] Hive Packaging ..................................... SUCCESS [  3.342 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:57 min
[INFO] Finished at: 2015-12-17T17:02:52+08:00
[INFO] Final Memory: 393M/11096M
[INFO] ------------------------------------------------------------------------