[ https://jira.fiware.org/browse/HELP-6073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

FW External User updated HELP-6073:
-----------------------------------

> FIWARE.Request.Tech.Data.BigData-Analysis.COSMOS BigData Analysis Write Permission
> ----------------------------------------------------------------------------------
>
>                 Key: HELP-6073
>                 URL: https://jira.fiware.org/browse/HELP-6073
>             Project: Help-Desk
>          Issue Type: extRequest
>          Components: FIWARE-TECH-HELP
>            Reporter: FW External User
>            Assignee: Francisco Romero
>         Attachments: Jose.png
>
> Hello,
>
> I am trying to run a simple analysis using the Hadoop examples on Cosmos,
> but it seems that my user (jvidal) does not have permission to do so.
> The logs are shown below.
>
> [jvidal@cosmosmaster-gi ~]$ hadoop jar /usr/lib/hadoop-0.20/hadoop-examples.jar wordcount /user/jvidal/def_serv/def_servpath/556dcfc2a5333eff5d19c8c4_product/556dcfc2a5333eff5d19c8c4_product.txt /home/jvidal/countwords
> 16/03/08 13:47:32 WARN snappy.LoadSnappy: Snappy native library is available
> 16/03/08 13:47:32 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> 16/03/08 13:47:32 INFO snappy.LoadSnappy: Snappy native library loaded
> 16/03/08 13:47:32 INFO mapred.FileInputFormat: Total input paths to process : 1
> 16/03/08 13:47:33 INFO mapred.JobClient: Running job: job_201603041134_0059
> 16/03/08 13:47:34 INFO mapred.JobClient:  map 0% reduce 0%
> 16/03/08 13:47:39 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_m_000003_0, Status : FAILED
> org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=jvidal, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
>     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
>     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1297)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:323)
>     at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1314)
>     at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:52)
>     at org.apach
> 16/03/08 13:47:44 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_r_000010_0, Status : FAILED
> 16/03/08 13:47:49 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_m_000003_1, Status : FAILED
> 16/03/08 13:47:54 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_r_000010_1, Status : FAILED
> 16/03/08 13:47:59 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_m_000003_2, Status : FAILED
> 16/03/08 13:48:04 INFO mapred.JobClient: Task Id : attempt_201603041134_0059_r_000010_2, Status : FAILED
> (each of these failed attempts logged the same AccessControlException and stack trace as above)
> 16/03/08 13:48:12 INFO mapred.JobClient: Job complete: job_201603041134_0059
> 16/03/08 13:48:12 INFO mapred.JobClient: Counters: 4
> 16/03/08 13:48:12 INFO mapred.JobClient:   Job Counters
> 16/03/08 13:48:12 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=22353
> 16/03/08 13:48:12 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 16/03/08 13:48:12 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 16/03/08 13:48:12 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=14040
> 16/03/08 13:48:12 INFO mapred.JobClient: Job Failed: NA
> java.io.IOException: Job failed!
>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1300)
>     at org.apache.hadoop.examples.WordCount.run(WordCount.java:149)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.examples.WordCount.main(WordCount.java:155)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>     at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
>
> Could you grant write access to this user?
> If I manage more users, should I report each of them specifically, or is this a general issue that can be fixed for all users at once?
>
> Thanks in advance.
> _______________________________________________
> Fiware-tech-help mailing list
> Fiware-tech-help@lists.fiware.org
> https://lists.fiware.org/listinfo/fiware-tech-help
> [Created via e-mail received from: Jose Benítez <jose@secmotic.com>]

--
This message was sent by Atlassian JIRA (v6.4.1#64016)
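Editor's note on the failure: the exception reports a WRITE attempt on the HDFS root inode "/" (owned by hdfs:supergroup, mode drwxr-xr-x) because the output path /home/jvidal/countwords does not exist in HDFS, so the job's FileOutputCommitter tries to create its parent directories starting at "/". Besides an administrator granting permissions, a likely workaround (a sketch, assuming the cluster provisions the usual per-user HDFS home directory /user/jvidal writable by jvidal) is to place the job output under that home directory:

```
hadoop jar /usr/lib/hadoop-0.20/hadoop-examples.jar wordcount \
    /user/jvidal/def_serv/def_servpath/556dcfc2a5333eff5d19c8c4_product/556dcfc2a5333eff5d19c8c4_product.txt \
    /user/jvidal/countwords
```

Note the output directory must not already exist, or the job will fail for a different reason.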
You can get more information about our cookies and privacy policies clicking on the following links: Privacy policy Cookies policy
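For context, HDFS evaluates the denied check above with plain POSIX-style owner/group/other permission bits. The following minimal sketch (hypothetical helper, not Hadoop code) reproduces why user=jvidal is refused WRITE on "/" with owner hdfs, group supergroup, and mode drwxr-xr-x (0755):

```python
def can_write(user, user_groups, owner, group, mode):
    """Return True if `user` may write, per POSIX owner/group/other bits."""
    if user == owner:
        return bool(mode & 0o200)   # owner write bit
    if group in user_groups:
        return bool(mode & 0o020)   # group write bit
    return bool(mode & 0o002)       # "other" write bit

# jvidal is neither the owner (hdfs) nor a member of supergroup,
# and the "other" bits of 0755 are r-x, so the write is refused.
print(can_write("jvidal", {"jvidal"}, "hdfs", "supergroup", 0o755))  # False

# The owner itself would be allowed (owner bits rwx).
print(can_write("hdfs", {"supergroup"}, "hdfs", "supergroup", 0o755))  # True
```

This is why the reporter's two options in the ticket amount to either the administrator widening the mode / changing ownership on the target directory, or the user writing under a directory they already own.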