Hi,

Let's discuss this today as well.

Best,
Philipp

On 15.01.2014 07:32, Toni Alatalo wrote:
> a brief note about this, which Jonne noted in the meeting yesterday when
> it was discussed a little:
>
> we already can do selective sync — exactly due to the existing IM things
> in Tundra. there's no protocol for it, but apps can do it with app-level
> logic: for example a server-side script that the client can command to
> configure the collection of entities of interest for that client. so the
> mechanisms for simulation servers to participate in scenes are there,
> just not tested that way yet (no demo / use case). thanks to Lasse's
> recent (this & last week or so) refactoring in Tundra, the kNet and
> WebSocket sync things are now integrated so that the same IM mechanisms
> work for WS connections too.
>
> however, HTTP interfacing can easily be useful too, so it might be a
> good idea to implement the original SceneAPI plan for different cases
> than realtime sync.
>
> that does not help the browser-to-browser p2p case, though, which
> Philipp has also brought up.
>
> ~Toni
>
> On 03 Jan 2014, at 08:46, toni at playsign.net wrote:
>
>> Yes, I agree that a selective form of how the sync / client works by
>> default might be good for e.g. AI or simulation nodes. But that is
>> basically just an optimization which does not change the architecture
>> fundamentally. That is, if there's a use case for that kind of
>> SceneAPI (like the traffic data visualization I discussed with the
>> Uni Oulu folks), we could just implement it right away on top of what
>> we have now.
>>
>> Support for selective sync is required anyhow for scalability, for
>> large scenes / scenes with many constant changes and for large
>> numbers of users. We have a few such Interest Management strategies,
>> and support for IM in general, implemented in Tundra by CIE/Chiru a
>> while back. I think improving that is in Ludocraft's FI-WARE plans.
>>
>> In those usages it is typically the server which decides what updates
>> are of interest for a certain client, utilizing info that the client
>> has sent (in Chiru's Tundra implementation the client sends the
>> active camera view info for the server to use -- Second Life does the
>> same, I think).
>>
>> For these headless server-clients, we should support the client
>> subscribing explicitly to certain changes -- either changes to
>> certain objects, or perhaps changes of a certain type to any object
>> in the scene. That should be easy to add when the support for IM in
>> general is well in place. Perhaps a good use case to keep in mind
>> when working on IM. The Verse protocol, made many years ago for live
>> sync of changes between 3D authoring apps for realtime collaboration,
>> works like that: after connecting, clients subscribe to the kinds of
>> changes they want to get from the scene (quite normal pub-sub style).
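As a rough illustration of that subscription idea, a headless client could
send something like the following over the existing WebSocket connection.
All message names, fields and the endpoint here are invented for the sake
of the example; this is not the current Tundra or Sync GE protocol, just a
sketch of the pub-sub style described above, in TypeScript against the
plain browser WebSocket API:

    // Illustrative only: "subscribe" / "entityUpdate" are invented message
    // names, not part of the actual Tundra / Sync GE WebSocket protocol.
    interface SubscribeMsg {
      type: "subscribe";
      // Only entities carrying these components interest this client.
      componentTypes: string[];
      // Optional spatial filter, in the spirit of camera-based IM.
      area?: { x: number; y: number; z: number; radius: number };
    }

    interface EntityUpdateMsg {
      type: "entityUpdate";
      entityId: number;
      components: Record<string, unknown>;
    }

    const ws = new WebSocket("ws://localhost:2345/scene"); // hypothetical endpoint

    ws.onopen = () => {
      const sub: SubscribeMsg = {
        type: "subscribe",
        componentTypes: ["Placeable", "RigidBody"],
        area: { x: 0, y: 0, z: 0, radius: 500 },
      };
      ws.send(JSON.stringify(sub));
    };

    ws.onmessage = (ev: MessageEvent) => {
      const msg = JSON.parse(ev.data as string) as EntityUpdateMsg;
      if (msg.type === "entityUpdate") {
        // Only updates matching the subscription arrive; feed them to the
        // AI / simulation logic instead of replicating the whole scene.
        console.log("entity", msg.entityId, "changed:", msg.components);
      }
    };

The server-side counterpart would be exactly the kind of app-level script
mentioned above: it keeps the filter per connection and forwards only the
matching entity changes to that client.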
>> For isolated queries and changes HTTP is fine; that would be
>> implemented by the current SceneAPI plan, and it is simple. It just is
>> not suitable for more realtime simulation visualisations, for which
>> the WebSocket impl from the Sync GE is better.
>>
>> WebRTC is certainly cool for direct browser-to-browser comms; it works
>> well for us in the original browser-code-only n-player Pong
>> implementation (which was later ported to Tundra + WebTundra too). One
>> limitation there is that support had been in Chrome only; I don't know
>> how that situation has developed. Lasse mentions WebRTC as a possible
>> alternative transport in the continuation plans for Sync (in the
>> FI-CORE proposal). I think it is totally feasible to allow
>> peer-to-peer sync with the generic scene replication messages over
>> WebRTC.
>>
>> ~Toni
>>
>> From: Philipp Slusallek <Philipp.Slusallek at dfki.de>
>> Sent: Thursday, January 2, 2014 4:30 PM
>> To: Toni Alatalo <toni at playsign.net>
>> Cc: FI-WARE, MiWi <fiware-miwi at lists.fi-ware.eu>
>>
>> Hi,
>>
>> On 02.01.2014 15:08, Toni Alatalo wrote:
>> > 1. use the library to connect to the scene server
>> > 2. use the WebTundra Scene API (in-memory JS) to examine the scene,
>> > for example to do pathfinding — the whole scene data is replicated
>> > to this AI client, so the object positions etc. are all there
>> > automatically (just the EC data though, not assets, so it's not that
>> > heavy)
>> > 3. move a character, either by modifying the position directly in
>> > the client (again using the scene API there) or by sending commands
>> > to the server
>> >
>> > I think what I'm saying, in your terms, is that the scene model and
>> > the client code in general in WebTundra is not pure synchronization
>> > (which is just the network messaging part) but, well, a client with
>> > scene replication.
>> >
>> > As mentioned before, with the Second Life protocol used with
>> > OpenSimulator there's LibOMV (OpenMetaverse), which gives the same
>> > for e.g. AI bots — a headless client which connects to a server,
>> > gets the scene state automatically and provides it as an easy-to-use
>> > in-memory API that e.g. AI code can use to query and modify the
>> > scene. AFAIK people have been happy and productive with that.
>>
>> But this is exactly what I would like NOT to do. Why should the
>> simulation server have the full scene? This might be necessary in some
>> cases, but there are many where it is not.
>>
>> But I agree that the SceneAPI may be something that is more tailored
>> towards queries and isolated changes. One way to combine the two
>> things could be to use the Scene API to set up selective
>> synchronization to the simulation server for just the interesting
>> aspects, for cases where continuous updates are necessary. The
>> simulation server could then decide to send back changes either
>> through the sync channel (where appropriate) or the Scene API.
>>
>> >> One obvious issue that Toni already talked about is the direction.
>> >> Contacting a browser instance is not possible without a server, as
>> >> we know from Server-Based Rendering. But then we could design an
>> >> interface that allows for querying and changing a scene.
>> >
>> > Again, the Sync GE does provide that, both in the form of a
>> > WebSocket protocol and a JS client lib made on top of that — you can
>> > query the scene and modify it.
>> >
>> > But I think it is a good idea for us (Lasse perhaps, but I'm
>> > probably too curious to skip it too :) to check that article to
>> > understand more of what you are after.
>>
>> But WebSocket is again a server-only transport mechanism (and it is
>> what we are currently using as well). That may be fine for some cases,
>> but for others a P2P connection through WebRTC might be required if
>> all we have are two Web applications.
>>
>> BTW, the paper used REST, and this is quite limited connection-wise.
>>
>> Best,
>>
>> Philipp
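To make the P2P point above a bit more concrete, here is a rough sketch of
what peer-to-peer replication over a WebRTC data channel could look like on
the browser side. The channel name and the message shape are made up for
illustration; the real payload would be the generic scene replication
messages, and the offer/answer signalling still has to go through some
server once:

    // Illustrative only: the channel name and the message shape are
    // placeholders, not the actual Sync GE wire format.
    interface ReplicationMsg {
      kind: "createEntity" | "updateComponent" | "removeEntity";
      entityId: number;
      data?: Record<string, unknown>;
    }

    const pc = new RTCPeerConnection();
    const channel = pc.createDataChannel("sceneSync");

    channel.onopen = () => {
      const msg: ReplicationMsg = {
        kind: "updateComponent",
        entityId: 42,
        data: { Placeable: { position: { x: 1, y: 0, z: 3 } } },
      };
      channel.send(JSON.stringify(msg));
    };

    channel.onmessage = (ev: MessageEvent) => {
      const msg = JSON.parse(ev.data) as ReplicationMsg;
      // Apply the change to the local in-memory scene exactly as the
      // WebSocket client does today; only the transport differs.
      console.log("peer change:", msg.kind, msg.entityId);
    };

    // Offer/answer and ICE candidates are exchanged once over some
    // signalling channel (e.g. the existing WebSocket server); after that
    // the replication messages flow directly browser-to-browser.
    async function createOfferForPeer(): Promise<RTCSessionDescriptionInit> {
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      return offer; // hand this to the other peer via the signalling channel
    }

After that one-time signalling step, the same replication logic that runs
on top of the WebSocket today could run unchanged on top of the data
channel.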
>> > Cheers,
>> > ~Toni
>>
>> >> There already is such a suggestion by Behr et al.
>> >> (http://dl.acm.org/citation.cfm?id=1836057&CFID=394541370&CFTOKEN=82263824)
>> >> that may be a good place to start. It is based on HTTP/REST,
>> >> though, which would not work for a connection towards the browser,
>> >> so we should use WebRTC instead.
>> >>
>> >> Best,
>> >>
>> >> Philipp
>> >>
>> >> On 21.11.2013 22:24, Toni Alatalo wrote:
>> >>> On 21 Nov 2013, at 10:54, Lasse Öörni <lasse.oorni at ludocraft.com> wrote:
>> >>>> If there is a good concrete plan for how it should be done
>> >>>> instead, it's not at all too late to change (as no implementation
>> >>>> has begun), and if it's administratively OK; for example, our
>> >>>> architecture pictures now include the REST scene API described.
>> >>>
>> >>> I talked today with a guy who is working on the VW / visualization
>> >>> front in the university project with the traffic sensors in the
>> >>> city.
>> >>>
>> >>> We agreed preliminarily that we could use their data and system as
>> >>> a use case for this SceneAPI biz on the FI-WARE side — if you and
>> >>> others here find it a good idea.
>> >>>
>> >>> Their data currently updates once per hour, though, so HTTP would
>> >>> work :)
>> >>>
>> >>> But he had already proposed as a next step a visualisation where
>> >>> traffic is simulated / visualized as actual individual cars. We
>> >>> could have that simulation service as a user of the scene API /
>> >>> sync biz to control the cars, so we'd get a streaming nature for
>> >>> the data and much harder reqs for the usage (in the spirit of the
>> >>> EPIC).
>> >>>
>> >>> I still have to confirm with the prof who's leading that project
>> >>> that this all would be OK. We could do it so that the actual
>> >>> implementation of the visualization and even the integration comes
>> >>> from the uni, and FI-WARE (Ludocraft) only provides the API. I can
>> >>> use some of my university time for this, as the integration of the
>> >>> city model and the traffic data is good to get there.
>> >>>
>> >>> This is not a must and I don't mean to overcomplicate things, but
>> >>> I just figured that a real use case would help to make the
>> >>> *concrete plan* that you called for above.
>> >>>
>> >>> The experience with the quick and simple POI & 3DUI integration
>> >>> (completed yesterday) was great; I'll post about it tomorrow
>> >>> hopefully (the POI guys checked the demo today and OK'd it as this
>> >>> first minimal step). So I hope more integrations and usage of the
>> >>> GEs take us well forward.
>> >>>
>> >>>> Lasse Öörni
>> >>>
>> >>> Cheers,
>> >>> ~Toni
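To give the traffic use case above a concrete shape: with an hourly update
rate the service could push car states over plain HTTP, assuming the
planned REST SceneAPI existed. The URL, payload and helper function below
are entirely hypothetical, since that API has not been implemented yet:

    // Purely hypothetical: the SceneAPI is only planned at this point, so
    // the route and payload are invented, not an implemented interface.
    interface CarState {
      id: string;
      lat: number;
      lon: number;
      heading: number; // degrees
    }

    // Crude placeholder for projecting geographic coordinates into the
    // local frame of the city model; the real mapping depends on how the
    // model is georeferenced.
    function mapToScene(lat: number, lon: number): { x: number; y: number; z: number } {
      return { x: lon * 111320, y: 0, z: lat * 110540 };
    }

    async function pushCarUpdates(cars: CarState[]): Promise<void> {
      for (const car of cars) {
        // One entity per car, updated whenever the source data refreshes.
        await fetch(`http://scene-server.example/scene/entities/car-${car.id}`, {
          method: "PUT",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            components: {
              Placeable: {
                position: mapToScene(car.lat, car.lon),
                rotation: { x: 0, y: car.heading, z: 0 },
              },
            },
          }),
        });
      }
    }

For the later per-car realtime visualisation, the same updates would
instead go through the WebSocket / selective sync path discussed earlier in
the thread.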