This is a communication problem:
check the firewall;
check whether the cluster is actually up, and so on.
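To make the firewall check concrete, a small probe like the sketch below can confirm whether this client can even open a TCP connection to a DataNode's data-transfer port. The host and port are illustrative placeholders (one of the DataNodes from the log further down, and the default DataNode port 50010), not values this thread prescribes; substitute your own.

import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal connectivity probe for a DataNode's data-transfer port.
// Host/port are placeholders; use the addresses your own cluster reports.
public class ProbeDataNode {
    public static void main(String[] args) {
        String host = "172.18.4.47"; // one of the DataNodes in the log below
        int port = 50010;            // default DataNode transfer port
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 10_000); // 10 s timeout
            System.out.println("Reachable: " + host + ":" + port);
        } catch (java.io.IOException e) {
            // A timeout here matches the ConnectException in the log:
            // a firewall is dropping packets, or the DataNode is down.
            System.out.println("Cannot connect to " + host + ":" + port + ": " + e);
        }
    }
}

If this times out from the client machine but succeeds from a cluster node, the problem is almost certainly the firewall or routing rather than Hadoop itself.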
Below is another solution.
Cause: the local user administrator (a local Windows user) is trying to operate the remote Hadoop cluster and does not have permission to do so.
Fixes:
1. If this is a test environment, you can simply disable HDFS permission checking: open conf/hdfs-site.xml, find the dfs.permissions property, and set it to false (the default is true). (On version 1.2.1 this is the only method that works.) For the detailed steps see the first question; a sketch of the change follows this list.
2. Change the Hadoop location parameters: in the Advanced Parameters tab, find hadoop.job.ugi and set it to the user name that started Hadoop.
3. Rename the Windows machine's user account to match the Hadoop user name.
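For fix 1, the hdfs-site.xml change would look roughly like the sketch below. This is for test clusters only; the dfs.permissions property name applies to Hadoop 1.x, where it defaults to true, and the NameNode normally needs a restart to pick the change up.

<!-- conf/hdfs-site.xml: disable HDFS permission checking (test environments only) -->
<property>
    <name>dfs.permissions</name>
    <value>false</value> <!-- default: true -->
</property>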
If the above still does not solve it, look at this:
These are all strong answers. I will keep working at it.
16:29:58,077 INFO [org.apache.hadoop.mapreduce.Job] - map 0% reduce 0%
16:30:18,620 WARN [org.apache.hadoop.hdfs.BlockReaderFactory] - I/O error constructing remote block reader.
java.net.ConnectException: Connection timed out: no further information
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:2884)
        at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:747)
        at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:662)
        at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:326)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:570)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.fillBuffer(LineReader.java:180)
        at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
        at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:143)
        at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:183)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
16:30:18,623 WARN [org.apache.hadoop.hdfs.DFSClient] - Failed to connect to /172.18.4.47:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
        (identical java.net.ConnectException stack trace as above)
16:30:39,626 WARN [org.apache.hadoop.hdfs.BlockReaderFactory] - I/O error constructing remote block reader.
        (identical stack trace as above)
16:30:39,627 WARN [org.apache.hadoop.hdfs.DFSClient] - Failed to connect to /172.18.4.218:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
        (identical stack trace as above)
16:30:39,628 INFO [org.apache.hadoop.hdfs.DFSClient] - Could not obtain BP--172.18.4.200-5:blk__2239481 from any node: java.io.IOException: No live nodes contain current block. Block locations: 172.18.4.47:50010 172.18.4.218:50010. Dead nodes: 172.18.4.218:50010 172.18.4.47:50010. Will get new block locations from namenode and retry...
16:30:39,628 WARN [org.apache.hadoop.hdfs.DFSClient] - DFS chooseDataNode: got # 1 IOException, will wait for 437374 msec.
16:31:02,536 WARN [org.apache.hadoop.hdfs.BlockReaderFactory] - I/O error constructing remote block reader.
        (identical stack trace as above)
16:31:02,537 WARN [org.apache.hadoop.hdfs.DFSClient] - Failed to connect to /172.18.4.218:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
        (identical stack trace as above)
16:31:23,536 WARN [org.apache.hadoop.hdfs.BlockReaderFactory] - I/O error constructing remote block reader.
        (identical stack trace as above)
16:31:23,537 WARN [org.apache.hadoop.hdfs.DFSClient] - Failed to connect to /172.18.4.47:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
        (identical stack trace as above)
16:31:23,537 INFO [org.apache.hadoop.hdfs.DFSClient] - Could not obtain BP--172.18.4.200-5:blk__2239481 from any node: java.io.IOException: No live nodes contain current block. Block locations: 172.18.4.218:50010 172.18.4.47:50010. Dead nodes: 172.18.4.218:50010 172.18.4.47:50010. Will get new block locations from namenode and retry...
16:31:23,537 WARN [org.apache.hadoop.hdfs.DFSClient] - DFS chooseDataNode: got # 2 IOException, will wait for 710906 msec.
Asking for help: what is causing this?