Contents
- 1. Environment preparation
- 2. Configure the hostname and host mapping
- 3. Create the directory
- 4. Passwordless SSH login
- 5. Install the JDK
- 6. Install Hadoop
- Extract
- Configure Hadoop
- Environment variables
- Format the filesystem
- Start Hadoop
- Verify the installation
1. Environment preparation
A clean CentOS 7 virtual machine with a static IP already configured.
(See the environment configuration guide for the static-IP setup.)
2. Configure the hostname and host mapping
hostnamectl set-hostname hadoop1
vi /etc/hosts
192.168.25.131 hadoop1
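A quick way to confirm the mapping took effect (the IP assumes the entry above):

```shell
# hadoop1 should now resolve to 192.168.25.131
ping -c 1 hadoop1
```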
3. Create the directory
Create a soft directory under /opt:
cd /opt
mkdir soft
Upload the JDK and Hadoop installation packages into this directory.
4. Passwordless SSH login
ssh-keygen -t rsa -P ""
cp /root/.ssh/id_rsa.pub /root/.ssh/authorized_keys
chmod 600 /root/.ssh/authorized_keys
ssh hadoop1
exit
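If `ssh hadoop1` still prompts for a password, the usual culprit is loose permissions on the .ssh directory; sshd silently ignores the key when they are too open:

```shell
# sshd refuses the key file unless permissions are strict
chmod 700 /root/.ssh
chmod 600 /root/.ssh/authorized_keys
```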
5. Install the JDK
tar zxvf jdk-8u221-linux-x64.tar.gz -C /opt/soft/
cd /opt/soft
mv jdk1.8.0_221 jdk1.8
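A sanity check before moving on (the path assumes the rename above; JAVA_HOME is not set yet, so call the binary directly):

```shell
# Should report version 1.8.0_221 if the extract and rename succeeded
/opt/soft/jdk1.8/bin/java -version
```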
6. Install Hadoop
Extract
tar zxvf hadoop-2.6.0-cdh5.14.2.tar.gz -C /opt/soft/
cd /opt/soft
mv hadoop-2.6.0-cdh5.14.2 hadoop260
cd /opt/soft/hadoop260/etc/hadoop
Configure Hadoop
vi hadoop-env.sh
export JAVA_HOME=/opt/soft/jdk1.8
vi core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://192.168.25.131:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/opt/soft/hadoop260/tmp</value>
</property>
<property>
<name>hadoop.proxyuser.root.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.root.groups</name>
<value>*</value>
</property>
</configuration>
vi hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.permissions.enabled</name>
<value>false</value>
</property>
</configuration>
cp mapred-site.xml.template mapred-site.xml
vi mapred-site.xml
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
vi yarn-site.xml
<configuration>
<property>
<name>yarn.resourcemanager.hostname</name>
<value>hadoop1</value>
</property>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
</configuration>
Environment variables
vi /etc/profile
#java environment
export JAVA_HOME=/opt/soft/jdk1.8
export CLASSPATH=.:${JAVA_HOME}/jre/lib/rt.jar:${JAVA_HOME}/lib/tools.jar:${JAVA_HOME}/lib/dt.jar
export PATH=$PATH:${JAVA_HOME}/bin
#hadoop environment
export HADOOP_HOME=/opt/soft/hadoop260
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME
source /etc/profile
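After sourcing the profile, both tools should be reachable from the new PATH entries; a quick check:

```shell
# Reports 1.8.0_221 if the java PATH entry is correct
java -version
# Reports Hadoop 2.6.0-cdh5.14.2 if HADOOP_HOME and PATH are correct
hadoop version
```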
Format the filesystem
hadoop namenode -format
Start Hadoop
start-all.sh
Verify the installation
jps
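On a single node started with start-all.sh, jps should list five Hadoop daemons besides Jps itself; the web UIs are another quick check (ports are the Hadoop 2.x defaults, IP taken from core-site.xml above):

```shell
jps
# Expected processes:
#   NameNode, DataNode, SecondaryNameNode   (HDFS)
#   ResourceManager, NodeManager            (YARN)

# Web UIs:
#   HDFS NameNode:      http://192.168.25.131:50070
#   YARN ResourceManager: http://192.168.25.131:8088
```

If any daemon is missing, check its log under /opt/soft/hadoop260/logs for the startup error.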