Published March 24, 2024

Nutch-Related Frameworks Video Tutorial

Lecture 13

1. Changing the Load Distribution

Redistribute the daemon roles across the three machines:

host2 (NameNode, DataNode, TaskTracker)

host6 (SecondaryNameNode, DataNode, TaskTracker)

host8 (JobTracker, DataNode, TaskTracker)

Designate host6 as the SecondaryNameNode:

vi conf/masters (set the file contents to host6)

scp conf/masters host6:/home/hadoop/hadoop-1.1.2/conf/masters

scp conf/masters host8:/home/hadoop/hadoop-1.1.2/conf/masters

vi conf/

s

host2:50070

s

host6:50090

scp conf/ host6:/home/hadoop/hadoop-1.1.2/conf/

scp conf/ host8:/home/hadoop/hadoop-1.1.2/conf/
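The truncated edits above set the web-UI addresses host2:50070 (NameNode) and host6:50090 (SecondaryNameNode). In Hadoop 1.x these settings normally live in conf/hdfs-site.xml under the dfs.http.address and dfs.secondary.http.address properties; since the transcript truncates the file and property names, treat this fragment as a reconstruction:

```xml
<!-- conf/hdfs-site.xml (file and property names assumed; only the
     host:port values appear in the transcript) -->
<configuration>
  <property>
    <name>dfs.http.address</name>
    <value>host2:50070</value>  <!-- NameNode web UI on host2 -->
  </property>
  <property>
    <name>dfs.secondary.http.address</name>
    <value>host6:50090</value>  <!-- SecondaryNameNode web UI on host6 -->
  </property>
</configuration>
```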

Designate host8 as the JobTracker:

vi conf/

r

host8:9001

scp conf/ host6:/home/hadoop/hadoop-1.1.2/conf/

scp conf/ host8:/home/hadoop/hadoop-1.1.2/conf/
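The single-character remnant above together with host8:9001 most likely corresponds to the mapred.job.tracker property in conf/mapred-site.xml; again, the file and property names here are my assumption, only host8:9001 comes from the transcript:

```xml
<!-- conf/mapred-site.xml (names assumed) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>host8:9001</value>  <!-- JobTracker RPC address -->
  </property>
</configuration>
```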

vi conf/

/home/hadoop/dfs/filesystem/namesecondary

scp conf/ host6:/home/hadoop/hadoop-1.1.2/conf/


scp conf/ host8:/home/hadoop/hadoop-1.1.2/conf/
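The path /home/hadoop/dfs/filesystem/namesecondary is where the SecondaryNameNode stores its checkpoint data. In Hadoop 1.x this is controlled by fs.checkpoint.dir in conf/core-site.xml; the truncated file and property names are reconstructed here as an assumption:

```xml
<!-- conf/core-site.xml (file and property names assumed;
     the directory path is from the transcript) -->
<configuration>
  <property>
    <name>fs.checkpoint.dir</name>
    <value>/home/hadoop/dfs/filesystem/namesecondary</value>
  </property>
</configuration>
```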

Configure host8:

The startup scripts on host8 will launch the TaskTrackers on host2 and host6, so run the following on host8:

ssh-keygen -t rsa (empty passphrase, default key path)

ssh-copy-id -i .ssh/id_ hadoop@host2

ssh-copy-id -i .ssh/id_ hadoop@host6

ssh-copy-id -i .ssh/id_ hadoop@host8

You can now log in from host8 to host2 and host6 over ssh without a password:

ssh host2

ssh host6

ssh host8

Append the following to /home/hadoop/.bashrc:

export PATH=/home/hadoop/hadoop-1.1.2/bin:$PATH
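Prepending the Hadoop bin directory to PATH makes the hadoop command and the start/stop scripts resolvable from any shell. A quick sanity check (the cut pipeline below is just an illustration, not part of the tutorial):

```shell
# Prepend the Hadoop bin directory, as in the .bashrc line above
export PATH=/home/hadoop/hadoop-1.1.2/bin:$PATH

# The first PATH entry should now be the Hadoop bin directory
echo "$PATH" | cut -d: -f1
```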

On host2, run:

On host8, run:

2. SecondaryNameNode

ssh host6

Stop the SecondaryNameNode:

hadoop-1.1.2/bin/ stop secondarynamenode

Force a checkpoint, merging the fsimage and edits files:

hadoop-1.1.2/bin/hadoop secondarynamenode -checkpoint force

Start the SecondaryNameNode:

hadoop-1.1.2/bin/ start secondarynamenode

3. Enabling the Trash

al

10080
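The value 10080 is in minutes: 10080 / 60 / 24 = 7 days before trashed files are permanently purged. The standard Hadoop property for this is fs.trash.interval in conf/core-site.xml; since the transcript truncates the property name, treat this fragment as a reconstruction:

```xml
<!-- conf/core-site.xml (file and property names assumed;
     the value 10080 is from the transcript) -->
<configuration>
  <property>
    <name>fs.trash.interval</name>
    <value>10080</value>  <!-- minutes to retain deleted files: 7 days -->
  </property>
</configuration>
```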
