[Fiware-fiche-coaching] Medbravo - Cosmos GE HDFS Quota

Aurelia Bustos aurelia at medbravo.org
Sat Dec 26 13:02:46 CET 2015


Dear All,

Medbravo continuously processes big data on clinical research in cancer.
In order to accommodate Medbravo's growing data processing volume within
the resources provided by the Cosmos Global Instance, we would kindly
like to ask for a larger HDFS quota.
Specifically, *our current quota of 5 GB* is limiting both the input and
output data size of Medbravo's Hadoop MapReduce jobs. Our current input
for big data processing is 8.5 GB, and it will keep growing in the
future. Our output is larger still, as it is composed of the input data
plus additional information.
We therefore ask you to increase our HDFS quota: we need *at least 30 GB*
to safely store the input, the output and any intermediate data produced
during processing.
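
For transparency, here is a minimal sketch of the arithmetic behind the
30 GB figure; the output and intermediate-data multipliers are our own
working assumptions for illustration, not measured values:

    # Rough HDFS quota estimate (all figures in GB).
    # The multipliers below are working assumptions, not measurements.
    input_gb = 8.5                # current MapReduce input size
    output_gb = input_gb * 1.5    # assumed: input data plus additional information
    intermediate_gb = input_gb    # assumed: temporary data of similar size during jobs

    required_gb = input_gb + output_gb + intermediate_gb
    print(f"Estimated HDFS space needed: {required_gb:.1f} GB")  # ~29.8 GB, hence "at least 30 GB"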

We would really appreciate a favourable response.

We wish you a Merry Christmas and a Prosperous New Year,
Aurelia

-- 


     Aurelia Bustos MD
     Cofounder

     tel: (+34) 618 453 214
     www.medbravo.org

