Compiling the Hadoop Source on 64-bit Ubuntu
Views: 4678
Published: 2019-06-09

This article is about 6,341 characters long; estimated reading time: 21 minutes.

I installed the latest 64-bit Ubuntu desktop release (14.04). When setting up Hadoop 2.6.0 I found that the official Hadoop binaries are built on 32-bit machines, so the Hadoop source has to be recompiled for a 64-bit system.

Prerequisites: hadoop-2.6.0-src

           jdk1.7.0_75 (the newest JDK at the time was 1.8.0_31, but compiling Hadoop with it ran into problems; JDK 1.8 post-dates this Hadoop code, so parts of the build do not support it. I therefore recommend a 1.7.0 release.)

 

1. Install the JDK

There are plenty of JDK installation guides online; for example, see http://www.cnblogs.com/yulijunzj/p/4253748.html. The steps are the same.
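Given the JDK version problem noted above, it is worth confirming which Java the shell actually resolves to before building. A minimal sketch (the helper names `java_major_minor` and `require_java17` are mine, not from the post):

```shell
#!/bin/sh
# Extract the major.minor version ("1.7", "1.8", ...) from the first
# line of `java -version` output, which looks like: java version "1.7.0_75"
java_major_minor() {
    printf '%s\n' "$1" | sed -n 's/.*"\([0-9]*\.[0-9]*\)\..*".*/\1/p'
}

# Warn unless the default java is a 1.7 release, which is what
# Hadoop 2.6.0 builds cleanly against.
require_java17() {
    ver=$(java_major_minor "$(java -version 2>&1 | head -n 1)")
    if [ "$ver" = "1.7" ]; then
        echo "OK: building with JDK $ver"
    else
        echo "WARNING: default JDK is $ver, not 1.7 -- the Hadoop 2.6.0 build may fail" >&2
        return 1
    fi
}
```

Running `require_java17` before step 6 catches the wrong-default-JDK problem described at the end of this post before forty minutes of compilation are wasted.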

2. Install Maven

sudo apt-get install maven

After installation, check the Maven version:

mvn --version

3. Install OpenSSH

sudo apt-get install openssh-server

4. Install the dependency libraries

sudo apt-get install g++ autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev

5. Install protoc

sudo apt-get install protobuf-compiler

After installation, check the version:

protoc --version
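This version check matters: the Hadoop 2.6.0 build is pinned to Protocol Buffers 2.5.0 and aborts if `protoc --version` reports anything else, and the Ubuntu 14.04 package happens to ship exactly that version. A small check, with helper names of my own choosing:

```shell
#!/bin/sh
# `protoc --version` prints e.g.: libprotoc 2.5.0
# Pull out the version number (the second field).
protoc_version() {
    printf '%s\n' "$1" | awk '{print $2}'
}

# Hadoop 2.6.0's build hard-codes protoc 2.5.0; verify before building.
check_protoc() {
    v=$(protoc_version "$(protoc --version)")
    if [ "$v" = "2.5.0" ]; then
        echo "OK: protoc $v"
    else
        echo "ERROR: protoc is $v, but Hadoop 2.6.0 requires 2.5.0" >&2
        return 1
    fi
}
```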

6. Start the build. Change into the Hadoop source directory, hadoop-2.6.0-src, and run the following (the dist,native profiles build the full distribution with the native libraries, -DskipTests skips the test suite, and -Dtar also packages the result as a tar.gz):

mvn clean package -Pdist,native -DskipTests -Dtar

7. On success, the end of the build output looks like this:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.205s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.187s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [2.856s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.256s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.726s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [2.863s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [3.064s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4.360s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [3.036s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:35.206s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [7.385s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [11.518s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.037s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [2:17.988s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [15.917s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [8:54.814s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.273s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.039s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.029s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:21.677s]
[INFO] hadoop-yarn-common ................................ SUCCESS [2:04.882s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.040s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [59.159s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [2:39.110s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.738s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [5.060s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [16.724s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [4.537s]
[INFO] hadoop-yarn-client ................................ SUCCESS [6.404s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.031s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.486s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [1.715s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.060s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [4.597s]
[INFO] hadoop-yarn-project ............................... SUCCESS [4.524s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.053s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [20.050s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [15.055s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.437s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [8.152s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.760s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [54.547s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.622s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [4.799s]
[INFO] hadoop-mapreduce .................................. SUCCESS [4.062s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [15.407s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [36.599s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.880s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [5.047s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [4.041s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.439s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [2.007s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [2.676s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.342s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.124s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [15:56.705s]
[INFO] Apache Hadoop Client .............................. SUCCESS [6.766s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.111s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [5.307s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [21.215s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.024s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [40.475s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 42:16.388s
[INFO] Finished at: Fri Jan 30 16:30:42 CST 2015
[INFO] Final Memory: 148M/691M
[INFO] ------------------------------------------------------------------------
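The whole point of the native profile is the 64-bit native libraries, so after a successful build it is worth confirming they were actually produced. A sketch, assuming the default hadoop-dist output layout (the `list_native_libs` helper name is mine):

```shell
#!/bin/sh
# List the native libraries in a built Hadoop distribution directory.
# On a successful 64-bit build, lib/native should contain libhadoop.so
# (and friends) built for x86-64, rather than the 32-bit binaries the
# official tarball ships.
list_native_libs() {
    dir="$1/lib/native"
    if [ -d "$dir" ]; then
        ls "$dir"
    else
        echo "no native directory at $dir" >&2
        return 1
    fi
}

# Typical usage after step 6 (run from hadoop-2.6.0-src):
#   list_native_libs hadoop-dist/target/hadoop-2.6.0
#   file hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
```

`file` on the shared objects should report "ELF 64-bit" if the rebuild achieved its purpose.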

Problems encountered:

I ran into three main problems while compiling Hadoop, each of which caused failed builds.

1) After installing Maven, its default Java was not the JDK I had installed earlier; worse, the JDK I had installed was 1.8.0_31, which is incompatible with Hadoop 2.6.0. Working around that would mean modifying the Hadoop source, which is troublesome.

Fix: switch to JDK 1.7.0_65. After installing it, update the system's default Java tools:

sudo update-alternatives --config java
sudo update-alternatives --config javac
sudo update-alternatives --config javadoc
sudo update-alternatives --config javah

 

2) An error complaining that the target folder could not be deleted.

Fix:

chown <username> <directory>/ -R
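Concretely, wrapped up for the source tree used in this post (the `fix_ownership` helper name is my own):

```shell
#!/bin/sh
# Hand ownership of the whole source tree (including stale target/
# directories created under another user, e.g. root) back to the
# current user, so that `mvn clean` can delete them.
fix_ownership() {
    chown -R "$(whoami)" "$1"
}

# e.g.  fix_ownership hadoop-2.6.0-src/   (prefix with sudo if the
# files are currently owned by root)
```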

3) Intermittent build failures caused by a slow network connection.

Fix:

Re-run step 6 until the build succeeds. Maven keeps the dependencies it has already downloaded in its local cache (~/.m2), so each attempt gets a little further.
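Rather than re-running the command by hand, a small retry wrapper can do it automatically; the function name and retry count below are my own choices, not from the post:

```shell
#!/bin/sh
# Retry a command up to N times, for builds that die on flaky networks.
#   usage: retry <max-attempts> <command> [args...]
retry() {
    n="$1"; shift
    i=1
    while ! "$@"; do
        if [ "$i" -ge "$n" ]; then
            echo "giving up after $n attempts" >&2
            return 1
        fi
        i=$((i + 1))
        echo "retrying: attempt $i of $n..." >&2
    done
}

# Usage for step 6:
# retry 5 mvn clean package -Pdist,native -DskipTests -Dtar
```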

Reproduced from: https://www.cnblogs.com/yulijunzj/p/4255214.html
