
I want to execute a Pig statement that displays the data in my txt file. I am running in mapreduce mode, but I get an error. Can someone please help me solve this?! Error when trying to execute the Pig statement:

[root@localhost ~]# pig -x mapreduce 
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL 
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE 
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType 
    2017-04-19 17:42:34,853 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0 (r1746530) compiled Jun 01 2016, 23:10:49 
    2017-04-19 17:42:34,853 [main] INFO org.apache.pig.Main - Logging error messages to: /root/pig_1492603954851.log 
    2017-04-19 17:42:34,907 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /root/.pigbootup not found 
    2017-04-19 17:42:36,060 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost 
    2017-04-19 17:42:37,130 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-default-f60d05c3-9fee-4624-9aa8-07f1584e6165 
    2017-04-19 17:42:37,130 [main] WARN org.apache.pig.PigServer - ATS is disabled since yarn.timeline-service.enabled set to false 
    grunt> dump b; 
    2017-04-19 17:42:41,135 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=EXECUTE, inode="/tmp/temp1549818457":dead:supergroup:drwx------ 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1720) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1704) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1692) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3894) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:983) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 

    2017-04-19 17:42:41,136 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b 
    Details at logfile: /root/pig_1492603954851.log 
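
For context, the alias b must have been defined earlier in the grunt session. The question does not show that part; a hypothetical definition would look roughly like this, with a placeholder path and schema rather than the asker's actual data:

    grunt> -- hypothetical LOAD; replace the path, delimiter, and schema with your own
    grunt> b = LOAD '/user/root/data.txt' USING PigStorage(',') AS (id:int, name:chararray);
    grunt> dump b;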

You can check this: http://stackoverflow.com/questions/7194069/apache-pig-permissions-issue


When I changed the /tmp directory permissions to allow everyone, it then gave me these errors: Input(s): Failed to read data from "/temp". Output(s): Failed to produce result in "hdfs://localhost/tmp/temp1691370991/tmp-1112412323". Counters: Total records written: 0, Total bytes written: 0, Spillable Memory Manager spill count: 0, Total bags proactively spilled: 0, Total records proactively spilled: 0. Job DAG: null. org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b


Check whether you have the proper access rights to read the file from that folder. If not, grant access to the HDFS folder as well.
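
One quick way to run that check from the command line (a sketch; if root is denied, run it as the user that owns the HDFS directories):

    # Show owner and mode of the HDFS staging area. Per the error above,
    # /tmp/temp1549818457 is owned by dead:supergroup with mode drwx------,
    # so user 'root' cannot traverse into it.
    hadoop fs -ls /
    hadoop fs -ls /tmp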

Answer


It looks like you do not have the proper permissions on the pig.temp.dir location, hence the problem. By default, Pig writes intermediate results to /tmp on HDFS. Override it with -Dpig.temp.dir.


How can I override it? Which file do I open to override it? Please explain.


Start Pig with -Dpig.temp.dir set on the command line.
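
Spelled out, that looks roughly like the following (a sketch; the HDFS path is an assumed example, and the -D flag has to come before any other Pig arguments so it reaches the JVM):

    # point Pig's intermediate output at a directory root can write to
    pig -Dpig.temp.dir=/user/root/pig_tmp -x mapreduce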


You can try hadoop fs -chmod -R 777 /tmp/* and then re-run the Pig statement.
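
One caveat with that command: the directory in the error is owned by the HDFS user 'dead', so if root is not the HDFS superuser the chmod itself may be denied. In that case run it as the owning user, roughly like this (assuming 'dead' also exists as a local account):

    # chmod as the HDFS owner of the offending directory
    sudo -u dead hadoop fs -chmod -R 777 /tmp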


You can try this:

    pig -Dpig.temp.dir=<temp_location_hdfs> -x mapreduce 

<temp_location_hdfs> should have either 775 or 777 permissions.

Then you can try: hadoop fs -chmod -R 777 /tmp/*
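
Putting the two steps together into one sequence (sketch only; assumes /user/root already exists on HDFS):

    # create a world-writable temp dir for Pig, then start grunt pointing at it
    hadoop fs -mkdir -p /user/root/pig_temp
    hadoop fs -chmod 777 /user/root/pig_temp
    pig -Dpig.temp.dir=/user/root/pig_temp -x mapreduce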