Table of contents

1. Introduction
2. Download
3. Deployment
Pseudo-distributed mode
1. Deploy the JDK
2. Deploy Hadoop
3. Configure HDFS
4. Passwordless SSH
5. Start HDFS
6. View the NameNode web UI
7. Deploy YARN
8. Start YARN
9. View the RM web UI
10. Start/stop commands
1. Introduction
Broad sense: the ecosystem built around the Apache Hadoop software: Hive, Flume, HBase, Kafka, Spark, Flink.
Narrow sense: the Apache Hadoop software itself, made up of:
hdfs        stores massive amounts of data
mapreduce   computation and analysis
yarn        resource and job scheduling

1. HDFS stores massive amounts of data:
namenode            directs where data is stored
datanode            actually stores the data
secondarynamenode   assists the namenode

2. YARN schedules resources and jobs:
resourcemanager     directs resource allocation
nodemanager         provides the actual resources
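As a quick illustration of what "HDFS stores data" means in practice, here is a minimal sketch of the HDFS file-system shell once a cluster (deployed below) is running; the paths are just examples:
hdfs dfs -mkdir -p /user/hadoop/input      # create a directory in HDFS
hdfs dfs -put ./1.log /user/hadoop/input   # upload a local file; blocks land on the datanodes
hdfs dfs -ls /user/hadoop/input            # list files stored in HDFS
hdfs dfs -cat /user/hadoop/input/1.log     # read the file back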
2. Download
1. Official site: hadoop.apache.org / project.apache.org
2. https://archive.apache.org/dist
3. Deployment
3.1 Pseudo-distributed mode
All daemons run on a single machine; all operations are performed as the hadoop user.
1. Deploy the JDK
tar -zxvf ./jdk-8u45-linux-x64.gz -C ~/app/    // unpack the archive
ln -s ./jdk1.8.0_45/ java                      // create a symlink; makes later configuration easier
// Directory layout
drwxr-xr-x. 2 hadoop hadoop     4096 Apr 11  2015 bin      java executables and scripts
drwxr-xr-x. 3 hadoop hadoop     4096 Apr 11  2015 include  native (JNI) header files
drwxr-xr-x. 5 hadoop hadoop     4096 Apr 11  2015 jre
drwxr-xr-x. 5 hadoop hadoop     4096 Apr 11  2015 lib      jars and libraries needed at runtime
-rw-r--r--. 1 hadoop hadoop 21099089 Apr 11  2015 src.zip  the JDK source code
Configure environment variables so the java executables can be run from anywhere on this machine:
vim ~/.bashrc
export JAVA_HOME=/home/hadoop/app/java
export PATH=${JAVA_HOME}/bin:$PATH
source ~/.bashrc
java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

2. Deploy Hadoop
tar -zxvf ./hadoop-3.3.4.tar.gz -C ~/app/
ln -s ./hadoop-3.3.4/ hadoop
// Directory layout
drwxr-xr-x. 2 hadoop hadoop 4096 Jul 29 21:44 bin      hadoop executables
drwxr-xr-x. 3 hadoop hadoop 4096 Jul 29 20:35 etc      hadoop configuration files
drwxr-xr-x. 2 hadoop hadoop 4096 Jul 29 21:44 include
drwxr-xr-x. 3 hadoop hadoop 4096 Jul 29 21:44 lib
drwxr-xr-x. 3 hadoop hadoop 4096 Jul 29 20:35 sbin     start/stop scripts for the hadoop daemons
drwxr-xr-x. 4 hadoop hadoop 4096 Jul 29 22:21 share    hadoop examples and libraries
Configure environment variables:
vim ~/.bashrc
#HADOOP_HOME
export HADOOP_HOME=/home/hadoop/app/hadoop
export PATH=${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:$PATH
source ~/.bashrc
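An optional quick check that the hadoop command is now on the PATH (not required, just a sanity check):
which hadoop        # should point at ~/app/hadoop/bin/hadoop
hadoop version      # prints the Hadoop version (3.3.4 here)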
Configure parameters:
vim hadoop-env.sh
export JAVA_HOME=/home/hadoop/app/java

3. Configure HDFS
// 1. core-site.xml
// fs.defaultFS specifies the machine the namenode runs on
cd ~/app/hadoop/etc/hadoop
vim core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://fang02:9000</value>
  </property>
</configuration>

// 2. hdfs-site.xml
vim hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
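An optional sanity check that these settings are being picked up (assumes the environment variables configured above):
hdfs getconf -confKey fs.defaultFS      # should print hdfs://fang02:9000
hdfs getconf -confKey dfs.replication   # should print 1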
4. Passwordless SSH (ssh to the localhost without a passphrase)
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
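To confirm passwordless login works before continuing (fang02 is the hostname used later in this setup; ssh localhost works as well):
ssh fang02 date    # should print the date without prompting for a password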
Format the file system:
hdfs namenode -format
2022-11-11 22:25:33,783 INFO common.Storage: Storage directory /tmp/hadoop-hadoop/dfs/name has been successfully formatted.

5. Start HDFS
start-dfs.sh    // start the daemons
Check the HDFS processes:
jps    (or: ps -ef | grep hdfs)
4642 NameNode
4761 DataNode
4974 SecondaryNameNode

6. View the NameNode web UI
http://fang02:9870/
http://192.168.41.12:9870/

7. Deploy YARN
vim mapred-site.xml:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
  </property>
</configuration>

vim yarn-site.xml:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>

8. Start YARN
start-yarn.sh

9. View the RM web UI
http://fang02:8088/
http://192.168.41.12:8088/
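With HDFS and YARN both running, one end-to-end check is to submit an example job bundled with the distribution and watch it appear in the RM web UI (the jar path below assumes the hadoop-3.3.4 layout installed above):
hadoop jar ~/app/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.4.jar pi 2 10   # small pi-estimation job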
10. Start/stop commands
start-all.sh    // start Hadoop
stop-all.sh     // stop Hadoop

3.2 Fully distributed mode
1. Cluster layout
hdfs:
    namenode           nn
    datanode           dn
    secondarynamenode  snn
yarn:
    resourcemanager    rm
    nodemanager        nm

bigdata32 : nn  dn      nm
bigdata33 :     dn  rm  nm
bigdata34 : snn dn      nm

2. Prepare the machines
Three machines, each with 4 GB RAM, 2 CPUs and 40 GB disk. After cloning, change on each machine:
(1) IP:          vim /etc/sysconfig/network-scripts/ifcfg-ens33
(2) hostname:    vim /etc/hostname
(3) IP mapping:  vim /etc/hosts
[hadoop@bigdata32 ~]$ mkdir app software data shell project
[hadoop@bigdata32 ~]$ ssh-keygen -t rsa
// Copy the public key [do this on all three machines]
ssh-copy-id bigdata32
ssh-copy-id bigdata33
ssh-copy-id bigdata34

4. JDK deployment [do on all three machines]: tools for copying files between machines
// 1. scp:
scp [[user@]host1:]file1 ... [[user@]host2:]file2
scp bigdata32:~/1.log bigdata33:~
//2.rsync:
rsync [OPTION]... SRC [SRC]... [USER@]HOST:DEST
rsync ~/1.log bigdata34:~
// When the contents of bigdata32:~/1.log are updated, re-sync:
rsync -av ~/1.log bigdata34:~

5. Write a file sync script (used later as xsync; save it under ~/shell)
#!/bin/bash
# Distribute the given files to all three machines
if [ $# -lt 1 ]; then
    echo "not enough arguments"
    echo "eg: $0 filename..."
    exit 1
fi
# Loop over the three machines and send the files
for host in bigdata32 bigdata33 bigdata34
do
    echo "=============$host=================="
    # 1. Loop over the files to send
    for file in $@
    do
        # 2. Check that the file exists
        if [ -e ${file} ]; then
            pathdir=$(cd $(dirname ${file}); pwd)
            filename=$(basename ${file})
            # 3. Sync the file
            ssh $host "mkdir -p $pathdir"
            rsync -av $pathdir/$filename $host:$pathdir
        else
            echo "${file} does not exist"
        fi
    done
done

Put the script directory on the PATH so it can be run from anywhere:
vim ~/.bashrc
export SHELL_HOME=/home/hadoop/shell
export PATH=${PATH}:${SHELL_HOME}
source ~/.bashrc
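A quick way to try the script before relying on it (assumes it was saved as ~/shell/xsync; the test file is just a throwaway example):
chmod +x ~/shell/xsync
touch ~/test.txt
xsync ~/test.txt    # should rsync test.txt to bigdata32, bigdata33 and bigdata34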
6. JDK deployment [install on all three machines]
// 1. Install the JDK on bigdata32 first
[hadoop@bigdata32 software]$ tar -zxvf jdk-8u45-linux-x64.gz -C ~/app/
[hadoop@bigdata32 app]$ ln -s jdk1.8.0_45/ java
[hadoop@bigdata32 app]$ vim ~/.bashrc
#JAVA_HOME
export JAVA_HOME=/home/hadoop/app/java
export PATH=${PATH}:${JAVA_HOME}/bin
[hadoop@bigdata32 app]$ which java
~/app/java/bin/java
[hadoop@bigdata32 app]$ java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
[hadoop@bigdata32 app]$ xsync java/
[hadoop@bigdata32 app]$ xsync jdk1.8.0_45
[hadoop@bigdata32 app]$ xsync ~/.bashrc
// On all three machines: source ~/.bashrc
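Optionally confirm the JDK reached the other nodes (a small check using the absolute java path, in case the remote login shell has not re-read ~/.bashrc):
ssh bigdata33 "~/app/java/bin/java -version"
ssh bigdata34 "~/app/java/bin/java -version"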
7. Deploy Hadoop
bigdata32 : nn  dn      nm
bigdata33 :     dn  rm  nm
bigdata34 : snn dn      nm
[hadoop@bigdata32 software]$ tar -zxvf hadoop-3.3.4.tar.gz -C ~/app/
[hadoop@bigdata32 app]$ ln -s hadoop-3.3.4/ hadoop
[hadoop@bigdata32 app]$ vim ~/.bashrc
#HADOOP_HOME
export HADOOP_HOME=/home/hadoop/app/hadoop
export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
[hadoop@bigdata32 app]$ source ~/.bashrc
[hadoop@bigdata32 app]$ which hadoop
~/app/hadoop/bin/hadoop
// [Do on all three machines] create the Hadoop data directory
[hadoop@bigdata32 data]$ mkdir hadoop
[hadoop@bigdata32 hadoop]$ pwd
/home/hadoop/data/hadoop

8. Configure HDFS
vim core-site.xml:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://bigdata32:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/data/hadoop</value>
  </property>
</configuration>

vim hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>bigdata34:9868</value>
  </property>
  <property>
    <name>dfs.namenode.secondary.https-address</name>
    <value>bigdata34:9869</value>
  </property>
</configuration>

[hadoop@bigdata32 hadoop]$ pwd
/home/hadoop/app/hadoop/etc/hadoop
[hadoop@bigdata32 hadoop]$ cat workers
bigdata32
bigdata33
bigdata34

// Sync the bigdata32 installation to bigdata33 and bigdata34
[hadoop@bigdata32 app]$ xsync hadoop
[hadoop@bigdata32 app]$ xsync hadoop-3.3.4
[hadoop@bigdata32 app]$ xsync ~/.bashrc
// On all three machines: source ~/.bashrc
// Format: do this only once, at deployment time, on the machine where the namenode will run
[hadoop@bigdata32 app]$ hdfs namenode -format
// Start HDFS:
start-dfs.sh    // run on the machine where the NameNode lives (bigdata32)
Visit the NameNode web UI: http://bigdata32:9870/
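An optional check that all three DataNodes registered with the NameNode:
hdfs dfsadmin -report    # the report should list three live datanodes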
9. Configure YARN
// Configure on bigdata32 first, then sync
vim mapred-site.xml:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
  </property>
</configuration>

vim yarn-site.xml:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,HADOOP_MAPRED_HOME</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>bigdata33</value>
  </property>
</configuration>

// From bigdata32, distribute the config files to bigdata33 and bigdata34:
[hadoop@bigdata32 app]$ xsync hadoop-3.3.4
// Start YARN:
start-yarn.sh    // run on the machine where the ResourceManager lives (bigdata33)
Visit the RM web UI: http://bigdata33:8088/
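An optional check that the three NodeManagers registered with the ResourceManager:
yarn node -list    # should show bigdata32, bigdata33 and bigdata34 in RUNNING state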
3.3 Starting and stopping Hadoop
1. Pseudo-distributed
hdfs: start-dfs.sh
yarn: start-yarn.sh
start-all.sh    // start Hadoop
stop-all.sh     // stop Hadoop

2. Fully distributed
Write a cluster start/stop script:
[hadoop@bigdata32 ~]$ vim shell/hadoop-cluster
#!/bin/bash
if [ $# -lt 1 ];then
echo "Usage:$0 start|stop"
exit
fi
case $1 in
"start")
echo "======== Starting the Hadoop cluster ========"
echo "======== Starting HDFS ========"
ssh bigdata32 "/home/hadoop/app/hadoop/sbin/start-dfs.sh"
echo "======== Starting YARN ========"
ssh bigdata33 "/home/hadoop/app/hadoop/sbin/start-yarn.sh"
;;
"stop")
echo "======== Stopping the Hadoop cluster ========"
echo "======== Stopping YARN ========"
ssh bigdata33 "/home/hadoop/app/hadoop/sbin/stop-yarn.sh"
echo "======== Stopping HDFS ========"
ssh bigdata32 "/home/hadoop/app/hadoop/sbin/stop-dfs.sh"
;;
*)
echo "Usage:$0 start|stop"
;;
esac
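Make the script executable; because ~/shell was added to PATH earlier, it can then be run from anywhere (usage sketch):
chmod +x ~/shell/hadoop-cluster
hadoop-cluster start    # bring up HDFS and YARN across the cluster
hadoop-cluster stop     # shut everything down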
Write a script that shows the Java processes on all three nodes:
[hadoop@bigdata32 ~]$ vim shell/jpsall
#!/bin/bash
for host in bigdata32 bigdata33 bigdata34
do
    echo "==========$host========="
    ssh $host "/home/hadoop/app/java/bin/jps | grep -v Jps"
done
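As with the other scripts, a short usage sketch (assumes ~/shell is still on the PATH):
chmod +x ~/shell/jpsall
jpsall    # prints the NameNode/DataNode/ResourceManager/NodeManager processes on each host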