[Backlogmanager] [FIWARE-JIRA] (HELP-8860) [fiware-stackoverflow] Cosmos Hive error entering and using map reduce

José Ignacio Carretero Guarde (JIRA) jira-help-desk at jira.fiware.org
Mon May 29 11:36:00 CEST 2017


     [ https://jira.fiware.org/browse/HELP-8860?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

José Ignacio Carretero Guarde reassigned HELP-8860:
---------------------------------------------------

    Assignee: Francisco Romero

> [fiware-stackoverflow] Cosmos Hive error entering and using map reduce
> ----------------------------------------------------------------------
>
>                 Key: HELP-8860
>                 URL: https://jira.fiware.org/browse/HELP-8860
>             Project: Help-Desk
>          Issue Type: Monitor
>          Components: FIWARE-TECH-HELP
>            Reporter: Backlog Manager
>            Assignee: Francisco Romero
>              Labels: fiware, fiware-cosmos, hadoop, hive, mapreduce
>
> Created question in FIWARE Q/A platform on 29-01-2016 at 13:01
> {color: red}Please, ANSWER this question AT{color} https://stackoverflow.com/questions/35084224/cosmos-hive-error-entering-and-using-map-reduce
> +Question:+
> Cosmos Hive error entering and using map reduce
> +Description:+
> I have a couple of problems executing Hive on the Cosmos FIWARE Lab instance.
> First, after logging into the machine, I enter the Hive command line and get the following error (I saw other questions related to this, but I couldn't find a solution):
> $ hive
> log4j:ERROR Could not instantiate class [org.apache.hadoop.hive.shims.HiveEventCounter].
> java.lang.RuntimeException: Could not load shims in class org.apache.hadoop.log.metrics.EventCounter
>     at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:123)
>     at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:115)
>     at org.apache.hadoop.hive.shims.ShimLoader.getEventCounter(ShimLoader.java:98)
>     at org.apache.hadoop.hive.shims.HiveEventCounter.<init>(HiveEventCounter.java:34)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at java.lang.Class.newInstance0(Class.java:357)
>     at java.lang.Class.newInstance(Class.java:310)
>     at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:330)
>     at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
>     at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
>     at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>     at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
>     at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
>     at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>     at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:354)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:127)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:77)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:641)
>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.log.metrics.EventCounter
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:171)
>     at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:120)
>     ... 27 more
> log4j:ERROR Could not instantiate appender named "EventCounter".
> Logging initialized using configuration in jar:file:/usr/local/apache-hive-0.13.0-bin/lib/hive-common-0.13.0.jar!/hive-log4j.properties
> However, I'm able to run a query like SELECT * FROM table;
> On the other hand, if I try to run a more specific query, such as selecting only one column, a MapReduce job starts and it fails with the following error:
> hive> SELECT table.column FROM table;
> Total jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks is set to 0 since there's no reduce operator
> Starting Job = job_201507101501_40071, Tracking URL = http://cosmosmaster-gi:50030/jobdetails.jsp?jobid=job_201507101501_40071
> Kill Command = /usr/lib/hadoop-0.20/bin/hadoop job  -kill job_201507101501_40071
> Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
> 2016-01-29 12:49:45,518 Stage-1 map = 0%,  reduce = 0%
> 2016-01-29 12:50:08,642 Stage-1 map = 100%,  reduce = 100%
> Ended Job = job_201507101501_40071 with errors
> Error during job, obtaining debugging information...
> Job Tracking URL: http://cosmosmaster-gi:50030/jobdetails.jsp?jobid=job_201507101501_40071
> Examining task ID: task_201507101501_40071_m_000002 (and more) from job job_201507101501_40071
> Task with the most failures(4): 
> -----
> Task ID:
>   task_201507101501_40071_m_000000
> URL:
>   http://cosmosmaster-gi:50030/taskdetails.jsp?jobid=job_201507101501_40071&tipid=task_201507101501_40071_m_000000
> -----
> Diagnostic Messages for this Task:
> java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:386)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
>     at org.apache.hadoop.mapred.Child.main(Child.java:260)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.jav
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> MapReduce Jobs Launched: 
> Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 0 msec
> Any help or suggestion is welcome.
> Thanks.
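
A note on the first error: it is logging noise rather than the cause of the failed query, since the Hive CLI still starts and SELECT * works. The ClassNotFoundException shows the HiveEventCounter shim in Hive 0.13.0 resolving the Hadoop 2.x class name (org.apache.hadoop.log.metrics.EventCounter) while the Cosmos node runs Hadoop 0.20, which ships its counter as org.apache.hadoop.metrics.jvm.EventCounter. A possible workaround, sketched under the assumption that a hive-log4j.properties can be placed in the Hive conf directory (the log above shows the defaults are currently read from inside hive-common-0.13.0.jar), is to point the appender at the class the old Hadoop jars actually provide:

    # hive-log4j.properties (copy the version bundled in hive-common-0.13.0.jar into the Hive conf directory first)
    # Replace the shim-based appender ...
    #log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter
    # ... with the EventCounter class shipped with Hadoop 0.20:
    log4j.appender.EventCounter=org.apache.hadoop.metrics.jvm.EventCounter

This only silences the log4j errors; the MapReduce failure below is a separate problem.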

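On why SELECT * works while SELECT table.column does not: with hive.fetch.task.conversion at its Hive 0.13 default of "minimal", a plain SELECT * is served by a local fetch task and never touches MapReduce, whereas projecting a single column launches a job on the shared cluster, and that job is what dies with "Error in configuring object". The stack trace above is truncated before the root cause, so the task attempt log behind the taskdetails.jsp URL quoted above is where the real error will be. A quick way to confirm the split, assuming the standard Hive CLI commands are available on the instance:

    hive> SET hive.fetch.task.conversion;   -- expected to report "minimal": plain SELECT * bypasses MapReduce
    hive> SELECT COUNT(*) FROM table;       -- also forces a MapReduce job, so it should fail the same way

If the count fails too, the problem is on the cluster side of the job (task-level configuration) rather than in the query itself, and the per-task log is needed before guessing further.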


--
This message was sent by Atlassian JIRA
(v6.4.1#64016)

