[ https://jira.fiware.org/browse/HELP-9279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Fernando Lopez updated HELP-9279:
---------------------------------
HD-Chapter: Data
Description:
Created question in FIWARE Q/A platform on 09-06-2015 at 08:06
{color: red}Please, ANSWER this question AT{color} https://stackoverflow.com/questions/30724933/remote-connection-to-fiware-cosmos-returning-authentication-error
+Question:+
Remote connection to fiware-cosmos returning authentication error
+Description:+
We have a COSMOS account on cosmos.lab.fi-ware.org and can load files locally onto the cluster.
However, we are having trouble loading files remotely. The instructions we followed on the guide site show the following:
However, using the WebHDFS/HttpFS RESTful API will allow you to upload files from outside the global instance of Cosmos in FI-LAB. The following example uses HttpFS instead of WebHDFS (TCP/14000 instead of TCP/50070), with curl as the HTTP client (though your applications should implement their own HTTP client):
[remote-vm]$ curl -i -X PUT "http://cosmos.lab.fi-ware.org:14000/webhdfs/v1/user/$COSMOS_USER/input_data?op=MKDIRS&user.name=$COSMOS_USER"
[remote-vm]$ curl -i -X PUT ..etc
[remote-vm]$ curl -i -X PUT -T etc..
As you can see, uploading data is a two-step operation, as stated in the WebHDFS specification: the first API call talks directly to the Head Node, specifying the creation of the new file and its name; the Head Node then replies with a temporary redirect naming the Data Node, among all those in the cluster, where the data is to be stored, and that Data Node is the endpoint of the second step. The HttpFS gateway implements the same API, but its internal behaviour differs: the redirect points back to the Head Node itself.
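The two-step flow described above can be sketched in Python using only the standard library (a minimal sketch, not the official client: the gateway URL is taken from the guide, while the user and file names are placeholders, and only the guide's `user.name` pseudo-authentication is assumed):

```python
import urllib.error
import urllib.request

BASE = "http://cosmos.lab.fi-ware.org:14000/webhdfs/v1"  # HttpFS gateway, TCP/14000

def webhdfs_url(user, path, op):
    # user.name= is the pseudo-authentication parameter; if it is lost,
    # the request arrives anonymously and the server answers HTTP 401.
    return f"{BASE}/user/{user}/{path}?op={op}&user.name={user}"

def upload(user, local_file, remote_path):
    """Two-step CREATE: the first PUT carries no data and is answered with a
    307 redirect; the second PUT sends the bytes to the redirect target
    (with HttpFS, that target is the gateway itself)."""
    first = urllib.request.Request(webhdfs_url(user, remote_path, "CREATE"),
                                   method="PUT")
    try:
        urllib.request.urlopen(first)
        raise RuntimeError("expected a 307 redirect from the first PUT")
    except urllib.error.HTTPError as err:
        # urllib refuses to auto-follow a 307 for PUT, so the redirect
        # surfaces here as an HTTPError carrying the Location header.
        if err.code != 307:
            raise
        location = err.headers["Location"]
    with open(local_file, "rb") as f:
        second = urllib.request.Request(
            location, data=f.read(), method="PUT",
            headers={"Content-Type": "application/octet-stream"})
    with urllib.request.urlopen(second) as resp:
        return resp.status
```

Following the returned `Location` verbatim keeps the sketch valid for both plain WebHDFS (redirect to a Data Node) and HttpFS (redirect back to the gateway).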
However, when we run these commands we get server errors back. One example is:
~ kari$ -bash: user.name=kdempsey: command not found
HTTP/1.1 100 Continue
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
Set-Cookie: hadoop.auth=""; Expires=Thu, 01-Jan-1970 00:00:10 GMT; Path=/
Content-Type: text/html;charset=utf-8
Content-Length: 1275
Date: Fri, 05 Jun 2015 12:58:20 GMT
Apache Tomcat/6.0.32 - Error report
HTTP Status 401 - org.apache.hadoop.security.authentication.client.AuthenticationException: Anonymous requests are disallowed
type Status report
message org.apache.hadoop.security.authentication.client.AuthenticationException: Anonymous requests are disallowed
description This request requires HTTP authentication (org.apache.hadoop.security.authentication.client.AuthenticationException: Anonymous requests are disallowed).
Apache Tomcat/6.0.32
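The first line of the transcript (`-bash: user.name=kdempsey: command not found`) hints at a likely cause: if the curl URL is not enclosed in quotes, the shell treats `&` as a command separator, so the request reaches the server without the `user.name` parameter and is rejected as anonymous. A small sketch of what the server then sees:

```python
from urllib.parse import parse_qs, urlsplit

url = ("http://cosmos.lab.fi-ware.org:14000/webhdfs/v1/user/kdempsey/"
       "input_data?op=MKDIRS&user.name=kdempsey")

# Unquoted in a shell, the URL is split at '&': curl only receives the part
# before the ampersand, while 'user.name=kdempsey' runs as a separate command
# (hence the "command not found" line in the transcript).
truncated = url.split("&")[0]

full_params = parse_qs(urlsplit(url).query)       # both op and user.name present
broken_params = parse_qs(urlsplit(truncated).query)  # user.name is gone
```

With `user.name` missing, the gateway has no identity to attach to the request, which matches the `Anonymous requests are disallowed` 401 above.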
Another was a 500 server error. Could you please provide the commands for remotely loading a file into the COSMOS shared resource?
Ultimately we want to take data from our InfluxDB and load it into COSMOS; we would like to do it via a REST call if possible (otherwise Python).
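One way the InfluxDB side could look (a hedged sketch, assuming an InfluxDB instance exposing the 0.9-style HTTP `/query` endpoint on the default port 8086; the host, database, and measurement names are placeholders):

```python
import json
import urllib.parse
import urllib.request

def influx_query_url(host, db, query):
    """Build a URL for InfluxDB's HTTP /query endpoint (0.9-style API)."""
    params = urllib.parse.urlencode({"db": db, "q": query})
    return f"http://{host}:8086/query?{params}"

def export_series(host, db, measurement):
    # Pull the measurement as JSON; db and measurement are placeholders here.
    url = influx_query_url(host, db, f"SELECT * FROM {measurement}")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# The JSON dump can then be written to a local file and pushed to Cosmos
# with the two-step WebHDFS/HttpFS PUT described in the guide above.
```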
Many thanks,
Kari
HD-Enabler: Cosmos
> [fiware-stackoverflow] Remote connection to fiware-cosmos returning authentication error
> ----------------------------------------------------------------------------------------
>
> Key: HELP-9279
> URL: https://jira.fiware.org/browse/HELP-9279
> Project: Help-Desk
> Issue Type: Monitor
> Components: FIWARE-TECH-HELP
> Reporter: Backlog Manager
> Assignee: Francisco Romero
> Labels: cosmos, filab, fiware
>
--
This message was sent by Atlassian JIRA
(v6.4.1#64016)