Quick comments on the responsibilities and status of the parts here. The DOM & JS APIs talk in the later posts I still need to digest (I think Jonne has misconceptions there, but also very important points) — but now to this:

On 30 Oct 2013, at 15:31, Jonne Nauha <jonne at adminotech.com> wrote:

> We are going to need a bit more structure than just a "3D client/UI" object.

Yes, certainly the client is not just the 3D UI - we are just focusing on that in this particular effort (the meeting with the users of the 3D UI) as it's our responsibility. (Note: I think it's clear enough to use the term 'client' even though it can be used standalone, without a synchronisation server - I'll check how this is in the glossary.)

> I think mimicking the Tundra core APIs (the ones we need at least) is a good choice.

It is one of the 3 APIs I've proposed for analysis in the requirements doc, along with xml3d.js and three.js, in "3. Requirements breakdown - 3.1. Requirements for application functionality development - 3.1.1. Existing 3d application APIs" in https://docs.google.com/document/d/1P03BgfEG1Ly2dI2Cs9ODVDmBhH4A438Ynlaxc4vXn1o/edit?pli=1#heading=h.us4ergchk5k0

> Something like what I've scribbled below. The rest of the GEs will then interact in some kind of manner with these core APIs, in most cases with renderer, scene and ui.
>
> renderer : Object, // API for 3D rendering engine access, creating scene nodes, updating their transforms, raycasting etc.
> // Implemented by 3D UI (Playsign).

I think scene nodes are typically created with the scene API (note: talking about in-browser, in-memory stuff here, not to be confused with the server-side REST SceneAPI) — like in Tundra — not directly via the renderer. The same goes for transforms: in Tundra, too, you manipulate the Placeable component; you don't access the renderer to move an object.

I think the renderer API is needed for these kinds of things:

1. Custom drawing, either in a component's implementation or just with direct drawing commands from application code.
For example, a component that is a procedural tree, or volumetric terrain. Or some custom code to draw aiming helpers in a shooting game, for example some kind of curves.

2. Complex scene queries. Typically these might be better done via the scene, though. But perhaps something that goes really deep into the renderer, for example to query which areas are in shadow? Or visibility checks?

3. Things that need to hook into the rendering pipeline — perhaps for things like render-to-texture or post-process compositing. Perhaps how XFlow integrates with rendering? Hm, now I think I actually answered Philipp's later question about this (didn't mean to do that yet :p)

> scene : Object, // API for accessing the Entity-Component-Attribute model.
> // Implemented by ???

Yes, the lack of clarity about the responsibility here is why I started the thread on 'the entity system' some weeks ago. I think the situation is not catastrophic, as we already have 3 implementations of that: 2 in the 'WebTundras' (Chiru-WebClient and WebRocket) and also xml3d.js. Playsign can worry about this at least for now (say, the coming month). We are not writing a renderer from scratch, as there are many good ones out there already, so we can spend resources on this too. Let's see whether we can just adopt one of those 3 systems for MIWI — and hence probably as realXtend's future official WebTundra — or whether we need to write a 4th from scratch for some reason.

There are, however, many complex issues with rendering itself too: for example the asset pipeline, which that meeting was about, and the material system, which we haven't really addressed at all yet. So we do need to be able to focus in peace on that as well. As we learned again on Monday, though, the DFKI folks are continuously advancing the state of the art in rendering, for example with the upcoming ability to write custom shaders in JavaScript — and they've thought about complex material systems for a long time, so we have great help there.
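[Editor's sketch] To make the scene-vs-renderer split discussed above concrete, here is a minimal, self-contained sketch. All names are hypothetical — they loosely mimic Tundra's core API conventions (Scene, Entity, Placeable), not any actual WebTundra, WebRocket or xml3d.js code:

```javascript
// Hypothetical sketch: entities are created and moved on the scene level
// via components; the renderer API is reserved for things the
// Entity-Component-Attribute model cannot express, such as raycasting.

function Entity(id) {
  this.id = id;
  this.components = {};
}
Entity.prototype.createComponent = function (type, data) {
  this.components[type] = data || {};
  return this.components[type];
};

function Scene() {
  this.entities = {};
  this.nextId = 1;
}
Scene.prototype.createEntity = function () {
  var ent = new Entity(this.nextId++);
  this.entities[ent.id] = ent;
  return ent;
};

// Application code works on the scene level: create an entity and move it
// by writing to its Placeable component -- not by calling the renderer.
var scene = new Scene();
var tree = scene.createEntity();
tree.createComponent("Mesh", { ref: "tree.mesh" });
var placeable = tree.createComponent("Placeable", { position: { x: 0, y: 0, z: 0 } });
placeable.position.x = 10; // the renderer observes this change

// Renderer level: deep queries like raycasting (stubbed here, just to show
// where such a call would live in the proposed API split).
var renderer = {
  raycast: function (x, y) {
    // a real implementation would intersect a ray with the 3D scene;
    // this stub just pretends the tree entity was hit
    return { entity: tree, point: { x: x, y: y, z: 0 } };
  }
};
var hit = renderer.raycast(100, 200);
console.log(hit.entity.id); // 1
```

The point of the sketch is only the division of labour: application and GE code stays on the component level and remains renderer-independent, while the renderer object exposes the deep queries listed in points 1-3 above.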
We are definitely open to participation here, and to someone else taking the lead on this if it fits on their plate. We already started good talks with Lasse yesterday, and he'll actually check how Chiru-WebClient's entity system implementation looks from the synchronisation GE's point of view.

Again, one particular question here is the 'IComponent': how do we define new components — i.e. XML elements? As mentioned before, Web Components may be relevant here, so that is to be analysed. Adminotech is already using Web Components in the 2D UI work, so perhaps you could help with understanding this: would the way they are used there work for defining reX components? Philipp's comments about KIARA, which has a way to define things, are also related.

> asset : Object, // Not strictly necessary for xml3d as it does asset requests for us, but for three.js this is pretty much needed.
> // Implemented by ???

I think this belongs to the 3D UI, so it falls under Playsign's responsibility. Again, there are the existing implementations in the WebTundras and xml3d.js — and obviously the browser does much of the work, but I think we still need to track dependencies during loading and keep track of usages for releasing resources etc. (the reason why you have the asset system in WebRocket and the resource manager in xml3d.js).

Thanks for the draft! Code speaks louder than words (that's why I wrote the txml <-> xml3d converter), and at least for me this kind of code-like API definition was very clear and helpful to read :)

~Toni

> ui : Object, // API to add/remove widgets correctly on top of the 3D rendering canvas element, window resize events etc.
> // Implemented by 2D/Input GE (Adminotech).
>
> input : Object // API to hook to input events occurring on top of the 3D scene.
> // Implemented by 2D/Input GE (Adminotech).
> };
>
>
> Best regards,
> Jonne Nauha
> Meshmoon developer at Adminotech Ltd.
> www.meshmoon.com
>
>
> On Wed, Oct 30, 2013 at 9:51 AM, Toni Alatalo <toni at playsign.net> wrote:
> Hi again,
>
> new angle here: calling devs *outside* the 3D UI GE: POIs, real-virtual interaction, interface designer, virtual characters, 3d capture, synchronization etc.
>
> I think we need to proceed rapidly with integration now, and propose that one next step towards that is to analyze the interfaces between 3D UI and the other GEs. This is because it seems to be a central part with which many others interface: that is evident in the old 'arch.png' where we analyzed GE/Epic interdependencies. It is embedded in section 2 of the Winterthur arch discussion notes, which hopefully work for everyone to see: https://docs.google.com/document/d/1Sr4rg44yGxK8jj6yBsayCwfitZTq5Cdyyb_xC25vhhE/edit
>
> I propose a process where we go through the usage patterns case by case. For example, so that me & Erno visit the other devs to discuss it. I think a good goal for those sessions is to define and plan the implementation of the first tests / minimal use cases where the other GEs are used together with 3D UI to show something. I'd like this first pass to happen quickly, so that within 2 weeks from the planning the first case is implemented. So if we get to have the sessions within 2 weeks from now, in a month we'd have demos with all the parts.
>
> Let's organize this so that those who think this applies to their work contact me by private email (to not spam the list); we meet, collect the notes to the wiki and inform this list about that.
>
> One question of particular interest to me here is: can the users of 3D UI do what they need well on the entity system level (for example just add and configure mesh components), or do they need deeper access to the 3D scene and rendering (spatial queries, somehow affecting the rendering pipeline etc.)? With Tundra we have the Scene API and the (Ogre)World API(s) to support the latter, and also access to the renderer directly.
> OTOH, the entity system level is renderer-independent.
>
> Synchronization is a special case which requires good two-way integration with 3D UI. Luckily it's something that we, and especially Lasse himself, already know from how it works in Tundra (and in the WebTundras). Definitely to be discussed and planned now too, of course.
>
> So please, if you agree that this is a good process, do raise your hands and let's start working on it! We can discuss this in the weekly too if needed.
>
> Cheers,
> ~Toni
>
> _______________________________________________
> Fiware-miwi mailing list
> Fiware-miwi at lists.fi-ware.eu
> https://lists.fi-ware.eu/listinfo/fiware-miwi
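[Editor's sketch] The asset-tracking need mentioned above — tracking dependencies during loading and usage counts for releasing resources — can be illustrated with a small, self-contained sketch. The names and structure are invented for illustration; this is not the actual WebRocket asset system or the xml3d.js resource manager:

```javascript
// Hypothetical sketch of dependency and usage tracking for an asset API.
// The browser does the actual transfers; the manager's job is only to know
// which assets depend on which, and when an asset can be freed.

function AssetManager() {
  this.assets = {};   // url -> { refCount, deps }
}

// Request an asset; 'deps' lists other assets it needs (e.g. a mesh
// referencing textures). Dependencies are requested recursively so the
// whole chain is tracked before signalling completion to the user.
AssetManager.prototype.request = function (url, deps) {
  var asset = this.assets[url];
  if (!asset) {
    asset = this.assets[url] = { refCount: 0, deps: deps || [] };
    // a real implementation would start the HTTP transfer here
  }
  asset.refCount++;
  var self = this;
  asset.deps.forEach(function (dep) { self.request(dep); });
  return asset;
};

// Release one usage; when nothing references the asset any more,
// free it and release its dependencies too.
AssetManager.prototype.release = function (url) {
  var asset = this.assets[url];
  if (!asset) return;
  asset.refCount--;
  if (asset.refCount <= 0) {
    var deps = asset.deps;
    delete this.assets[url];   // the resource can now be freed
    var self = this;
    deps.forEach(function (dep) { self.release(dep); });
  }
};

var assets = new AssetManager();
assets.request("scene.txml", ["tree.mesh"]);  // pulls in tree.mesh too
assets.request("tree.mesh");                  // a second user of the mesh
assets.release("scene.txml");                 // mesh survives: one user left
console.log("tree.mesh" in assets.assets);    // true
assets.release("tree.mesh");
console.log("tree.mesh" in assets.assets);    // false
```

The design point is the one made in the thread: even though the browser caches and fetches for us, something above it still has to know that releasing the scene does not yet free the mesh another entity is using.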