[FINESCE] COSMOS : Shark error

FRANCISCO ROMERO BUENO francisco.romerobueno at telefonica.com
Mon Sep 21 10:40:43 CEST 2015


Hi Massimiliano,

Regarding the Shark server, it was down. Now it should be up and running.

Regarding the Hive CLI issue, as explained to Dario some days ago, your user (FINESCE_WP4) should be using Hive 0.9.0 instead of Hive 0.13.0. The reason is that Hive 0.9.0 is the version used by the Shark server on port TCP/9999, the service you prefer over the default HiveServer2 (which uses Hive 0.13.0) on port TCP/10000 that any other user may access.

Thus, your PATH and your HIVE_HOME should be pointing to:

  *   export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
  *   export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

If the above exports are not made permanent, but only set per SSH session, you will end up pointing to Hive 0.13.0 (like any other user) instead of the Hive 0.9.0 you need.
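As a quick sanity check of that PATH ordering, here is a minimal, self-contained sketch (the throwaway directories and dummy hive stubs are invented for illustration; on Cosmos you would simply run `which hive` after logging in). It shows that PATH lookup is first-match-wins, which is why the 0.9.0 bin directory must come before the default Hive 0.13.0 one:

```shell
# Illustration with throwaway dirs and dummy stubs (invented for this sketch):
# PATH lookup is first-match-wins, so whichever hive bin dir comes first
# in PATH is the one the shell actually runs.
tmp=$(mktemp -d)
mkdir -p "$tmp/hive-0.9.0/bin" "$tmp/hive-0.13.0/bin"
printf '#!/bin/sh\necho hive 0.9.0\n'  > "$tmp/hive-0.9.0/bin/hive"
printf '#!/bin/sh\necho hive 0.13.0\n' > "$tmp/hive-0.13.0/bin/hive"
chmod +x "$tmp/hive-0.9.0/bin/hive" "$tmp/hive-0.13.0/bin/hive"
# With the 0.9.0 dir first in PATH, 'hive' resolves to the 0.9.0 stub:
resolved=$(env PATH="$tmp/hive-0.9.0/bin:$tmp/hive-0.13.0/bin:$PATH" hive)
echo "$resolved"   # hive 0.9.0
rm -rf "$tmp"
```

The same first-match rule is why the exports must be permanent: a login shell that skips them falls back to a PATH where the 0.13.0 directory wins.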

I’m copying you the email I originally sent to Dario:

“… As promised, I’ve set up a Spark/Shark deployment just for you. This required the installation of a whole new Hive metastore, since the existing one was recently tuned for Hive 0.13.0 (due to HiveServer2) and the Shark we had installed was compiled for Hive 0.9.0. (In any case, I’ve looked for a more recent version of Shark, and the latest one before the project was discontinued was designed to work with Hive 0.11.0, so installing a newer version would not solve the problem.)

A couple of remarks:

  *   The Shark server now runs on port TCP/9999; don’t forget to change this in your client.
  *   As for any other user, your default Hive home within the Cosmos instance is /usr/local/apache-hive-0.13.0-bin . Nevertheless, your Hive metastore is tied to Hive 0.9.0, so my recommendation is that you locally change both your PATH and your HIVE_HOME so that you always refer to Hive 0.9.0 and not Hive 0.13.0 when using, for instance, the CLI. Basically, add these lines to your /<your_user>/.bash_profile file:
     *   export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
     *   export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
  *   Finally, the new metastore specifically created for you is empty: there are no tables or databases (except for the default one and my personal db, named “frb"). Don’t panic! As its name denotes, it is a store for metadata; it does not contain the real data, which remains stored in your HDFS space. So you just need to recreate your tables by executing the command "create external table etc etc location ‘/path/to/data/in/hdfs/…’"; I guess you know the command because you created the old tables on your own. If you don’t remember some detail regarding a table, you can ask Hive (0.13.0) for it: “describe extended|formatted <table_name>"

…”
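For reference, recreating one of the lost table definitions against the new metastore would look roughly like the fragment below. The table name, columns, and row format are placeholders invented for this sketch; the real definitions are recoverable via "describe extended <table_name>" on the old metastore, and the HDFS path placeholder is kept as in the email above:

```sql
-- Hypothetical recreation of one external table (name, columns, and row
-- format are placeholders; substitute your real definitions).
-- Because the table is EXTERNAL, Hive only writes metadata: the files
-- already sitting in HDFS under LOCATION become queryable again, and
-- dropping the table later would not delete them.
CREATE EXTERNAL TABLE IF NOT EXISTS my_table (
  sector_id INT,
  load_kw   DOUBLE,
  ts        STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/path/to/data/in/hdfs/';
```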

Regards,
Francisco

From: Massimiliano Nigrelli <massimiliano.nigrelli at eng.it>
Date: Monday, 21 September 2015, 10:28
CC: Pellegrino Dario <dario.pellegrino at eng.it>, Pasquale Andriani <pasquale.andriani at eng.it>, Leandro Lombardo <Leandro.Lombardo at eng.it>, "fiware-lab-help at lists.fi-ware.org" <fiware-lab-help at lists.fi-ware.org>, Francisco Romero Bueno <francisco.romerobueno at telefonica.com>, SERGIO GARCIA GOMEZ <sergio.garciagomez at telefonica.com>, MIGUEL CARRILLO PACHECO <miguel.carrillopacheco at telefonica.com>
Subject: [FINESCE] COSMOS : Shark error

To whom it may concern,
since yesterday our applications have been getting the following error when trying to connect to Hive/Shark:

INFO: Getting Historical Load Data by Sector
Sep 21, 2015 10:16:31 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
SEVERE: HIVE Connection Error
java.sql.SQLException: Could not establish connection to 130.206.80.46:9999/default?user=FINESCE-WP4&password=******************: java.net.ConnectException: Connection refused
        at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:117)
        at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:233)

Just to give you more details, when we try to launch hive from the CLI, we get the following (although the CLI does start in the end):
login as: FINESCE-WP4
FINESCE-WP4 at 130.206.80.46's password:
Last login: Mon Sep 21 10:15:40 2015 from 89-97-237-254.ip19.fastwebnet.it
-bash-4.1$ hive
log4j:ERROR Could not instantiate class [org.apache.hadoop.hive.shims.HiveEventCounter].
java.lang.RuntimeException: Could not load shims in class org.apache.hadoop.log.metrics.EventCounter
        at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:123)
        at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:115)
        at org.apache.hadoop.hive.shims.ShimLoader.getEventCounter(ShimLoader.java:98)
        at org.apache.hadoop.hive.shims.HiveEventCounter.<init>(HiveEventCounter.java:34)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at java.lang.Class.newInstance0(Class.java:357)
        at java.lang.Class.newInstance(Class.java:310)
        at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:330)
        at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
        at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:354)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:127)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:77)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:641)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.log.metrics.EventCounter
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:171)
        at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:120)
        ... 27 more
log4j:ERROR Could not instantiate appender named "EventCounter".

Logging initialized using configuration in jar:file:/usr/local/apache-hive-0.13.0-bin/lib/hive-common-0.13.0.jar!/hive-log4j.properties


Could someone from the COSMOS team help us sort this issue out, please?

Regards,

Massimiliano

--
===============================================================

Massimiliano Nigrelli
Direzione Ricerca e Innovazione - R&D Lab
massimiliano.nigrelli at eng.it

Engineering Ingegneria Informatica S.p.A.
Viale Regione Siciliana, 7275 - 90146 Palermo (Italy)

Phone: +39 091.75.11.847

============================================================

________________________________

The information contained in this transmission is privileged and confidential information intended only for the use of the individual or entity named above. If the reader of this message is not the intended recipient, you are hereby notified that any dissemination, distribution or copying of this communication is strictly prohibited. If you have received this transmission in error, do not read it. Please immediately reply to the sender that you have received this communication in error and then delete it.

