[Fiware-miwi] SceneAPI features: original intent of the EPIC vs. current plans?

Philipp Slusallek Philipp.Slusallek at dfki.de
Thu Jan 2 15:30:16 CET 2014


Hi,

On 02.01.2014 15:08, Toni Alatalo wrote:
> 1. use the library to connect to the scene server
> 2. use the WebTundra Scene API (in-memory JS) to examine the scene, for example to do pathfinding: the whole scene data is replicated to this AI client, so the object positions etc. are all there automatically (just the EC data though, not assets, so it's not that heavy)
> 3. move a character, either by modifying the position directly in the client (again using the scene api there) or by sending commands to the server
>
> I think what I’m saying in your terms is that the scene model and the client code in general in WebTundra is not pure synchronization (which is just the network messaging part) but, well, a client with scene replication.
>
> As mentioned before, with the Second Life protocol used with OpenSimulator there's LibOMV (OpenMetaverse), which gives the same for e.g. AI bots: a headless client that connects to a server, gets the scene state automatically and provides it as an easy-to-use in-memory API that e.g. AI code can use to query and modify the scene. AFAIK people have been happy and productive with that.
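
(For concreteness, steps 1-3 above would look roughly like the following with such a headless client. The names used below, SceneClient, entityByName, exec and so on, are only illustrative, not the actual WebTundra or LibOMV API.)

    // Hypothetical sketch only; none of these names are the real API.
    var client = SceneClient.connect("ws://example.org:2345/scene"); // 1. connect; the scene replicates
    client.onSceneReady(function (scene) {
        var npc  = scene.entityByName("npc_1");       // 2. query the replicated in-memory scene;
        var goal = scene.entityByName("waypoint_3");  //    positions etc. are already local
        // ... run pathfinding over the local data here ...
        npc.placeable.position = goal.placeable.position; // 3a. modify locally, the change syncs back
        npc.exec("MoveTo", goal.placeable.position);      // 3b. or send a command to the server instead
    });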

But this is exactly what I would like NOT to do. Why should the simulation server have the full scene? This might be necessary in some cases, but there are many where it is not.

But I agree that the SceneAPI may be something that is more tailored towards queries and isolated changes. One way to combine the two approaches could be to use the Scene API to set up selective synchronization with the simulation server for just the interesting aspects, in cases where continuous updates are necessary. The simulation server could then decide to send back changes either through the sync channel (where appropriate) or through the Scene API.
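
A very rough sketch of what I have in mind, with all names invented purely for illustration (nothing is specified or implemented like this yet):

    // Invented names throughout; this only illustrates the shape of the idea.
    var api = SceneAPI.open("ws://example.org/scene");

    // Isolated query and change: no full scene replication on this side.
    api.query({ component: "Placeable", near: [10, 0, 5], radius: 20 }, function (entities) {
        api.update(entities[0].id, { "Placeable.position": [12, 0, 5] });
    });

    // Selected synchronization: subscribe only to the aspects the simulation
    // actually needs; the server can push its changes back over this channel
    // (or over the regular sync channel where that is more appropriate).
    var sub = api.subscribe({ entities: "car_*", components: ["Placeable"] },
                            function (change) { /* continuous updates arrive here */ });
    // sub.unsubscribe() once continuous updates are no longer needed.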

>> One obvious issue that Toni already talked about is the direction. Contacting a browser instance is not possible without a server as we know from Server-Based Rendering. But then we could design an interface that allows for querying and changing a scene.
>
> Again the Sync GE does provide that, both in the form of a WebSocket protocol and a JS client lib made on top of that: you can query the scene and modify it.
>
> But I think it is a good idea for us (Lasse perhaps, but I'm probably too curious to skip it too :) to check that article to understand more of what you are after.

But WebSocket is again a server-only transport mechanism (and it is what 
we are currently using as well). That may be fine for some cases, but for 
others a P2P connection through WebRTC might be required if all we have 
are two Web applications.
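
For two Web applications the transport itself already exists in the standard WebRTC data channel API (modulo vendor prefixes), roughly as below. The signalling step, i.e. exchanging the offer/answer and ICE candidates, still needs some rendezvous mechanism and is left out here, and the message format is just made up for illustration.

    var pc = new RTCPeerConnection();
    var channel = pc.createDataChannel("scene");
    channel.onopen = function () {
        // Once the P2P channel is up, scene queries/changes can go directly
        // between the two applications, with no server in between.
        channel.send(JSON.stringify({ op: "query", component: "Placeable" }));
    };
    channel.onmessage = function (event) {
        var msg = JSON.parse(event.data); // replies and pushed changes arrive here
        // ... apply msg to the local scene representation ...
    };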

BTW, the paper used REST, which is quite limited connection-wise.


Best,

	Philipp
> Cheers,
> ~Toni
>
>>
>> There already is such a suggestion by Behr et al. (http://dl.acm.org/citation.cfm?id=1836057&CFID=394541370&CFTOKEN=82263824) that may be a good place to start. It is based on HTTP/REST, though, which would not work for a connection towards the browser, so we should use WebRTC instead.
>>
>>
>> Best,
>>
>> 	Philipp
>>
>> On 21.11.2013 22:24, Toni Alatalo wrote:
>>> On 21 Nov 2013, at 10:54, Lasse Öörni <lasse.oorni at ludocraft.com> wrote:
>>>> If there is a good concrete plan for how it should be done instead, it's
>>>> not at all too late to change (as no implementation has begun yet), and
>>>> if it's administratively OK; for example, our architecture pictures now
>>>> include the REST scene API as described.
>>>
>>> I talked today with a guy who is working on the virtual world / visualization front in the university project with the traffic sensors in the city.
>>>
>>> We agreed preliminarily that we could use their data and system as a use case for this SceneAPI biz on the fi-ware side, if you and others here find it's a good idea.
>>>
>>> Their data currently updates once per hour, though, so http would work :)
>>>
>>> But he had already proposed as a next step a visualisation where traffic is simulated / visualized as actual individual cars. We could have that simulation service use the Scene API / sync biz to control the cars, so we'd get streaming data and much harder requirements for the usage (in the spirit of the EPIC).
>>>
>>> I still have to confirm with the prof who's leading that project that this all would be OK. We could do it so that the actual implementation of the visualization and even the integration comes from the uni, and fi-ware (Ludocraft) only provides the API. I can use some of my university time for this, as the integration of the city model and the traffic data is good to get done there.
>>>
>>> This is not a must and I don't mean to overcomplicate things, but I just figured that a real use case would help to make the *concrete plan* that you called for above.
>>>
>>> The experience with the quick and simple POI & 3DUI integration (completed yesterday) was great; I'll hopefully post about it tomorrow (the POI guys checked the demo today and OK'd it as this first minimal step). So I hope more integrations and usage of the GEs take us well forward.
>>>
>>>> Lasse Öörni
>>>
>>> Cheers,
>>> ~Toni
>>>
>>
>>
>


-- 

-------------------------------------------------------------------------
Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) GmbH
Trippstadter Strasse 122, D-67663 Kaiserslautern

Geschäftsführung:
   Prof. Dr. Dr. h.c. mult. Wolfgang Wahlster (Vorsitzender)
   Dr. Walter Olthoff
Vorsitzender des Aufsichtsrats:
   Prof. Dr. h.c. Hans A. Aukes

Sitz der Gesellschaft: Kaiserslautern (HRB 2313)
USt-Id.Nr.: DE 148646973, Steuernummer:  19/673/0060/3
---------------------------------------------------------------------------

