[Fiware-miwi] 3D UI Usage from other GEs / epics / apps

Torsten Spieldenner torsten.spieldenner at dfki.de
Thu Oct 31 10:23:23 CET 2013


Hello,

let me quickly comment on the topic of the DOM API. When developing an 
XML3D application, you are not limited to the raw DOM API; you are free 
to use jQuery or any other tool that makes writing the application 
faster and more efficient. XML3D works in such a way that changing the 
DOM, by adding or manipulating nodes, results in changes in the rendered 
scene. How you build the DOM, query it or manipulate it is completely 
up to you.
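
A minimal sketch of what this looks like in practice, assuming an 
xml3d.js page with an <xml3d> element (the mesh URL, the ids and the 
referenced <transform> element are placeholders):

    // Adding nodes to the DOM makes them appear in the rendered scene.
    var xml3dElement = document.querySelector("xml3d");

    var group = document.createElement("group");
    group.setAttribute("transform", "#myTransform"); // refers to a <transform> element by id

    var mesh = document.createElement("mesh");
    mesh.setAttribute("src", "assets/teapot.xml#mesh"); // external mesh resource (placeholder)

    group.appendChild(mesh);
    xml3dElement.appendChild(group); // xml3d.js observes the DOM and renders the new group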

We make extensive use of jQuery to work on our XML3D scenes. We have 
also had very successful experiments building entire 3D scenes with the 
Backbone.js model-view framework. The framework does all the work of 
querying the scene from the database and automatically creates DOM 
elements as views, which, in our case, were XML3D group nodes that 
automatically appeared at the right position.
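
A rough sketch of that Backbone pattern (all names here are 
hypothetical, not the actual code we used):

    // The model holds the entity state; the view materializes it as an XML3D <group>.
    var EntityView = Backbone.View.extend({
        tagName: "group", // the view's root element is an XML3D group node

        initialize: function () {
            this.listenTo(this.model, "change:transformId", this.render);
            document.querySelector("xml3d").appendChild(this.el);
            this.render();
        },

        render: function () {
            // Point the group at the <transform> element that belongs to this entity.
            this.el.setAttribute("transform", "#" + this.model.get("transformId"));
            return this;
        }
    });

    // Usage: var view = new EntityView({ model: someEntityModel });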

On top of the capabilities of the DOM API and the additional power of 
sophisticated JavaScript libraries, XML3D introduces an API extension of 
its own to provide a convenient way to access the DOM elements as 
XML3D elements, for example retrieving a translation as an XML3DVec3 or 
a rotation as an XML3DRotation. To retrieve the rotation part of an 
XML3D transformation, for instance, you can use jQuery to query the 
transformation node from the DOM and access the rotation on the 
underlying element: var r = $("#my_transformation")[0].rotation;
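
Spelled out a bit more (the element id is a placeholder; note that 
jQuery returns a wrapper object, so the underlying DOM element is taken 
via [0]):

    var transform = $("#my_transformation")[0];

    var r = transform.rotation;     // XML3DRotation (axis + angle)
    var t = transform.translation;  // XML3DVec3 (x, y, z)

    // For writing, setting the attribute as a string always works:
    transform.setAttribute("translation", t.x + " " + (t.y + 1.0) + " " + t.z);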

And if you want to do even more complex computations that you don't want 
to code entirely in JavaScript, you have Xflow on top, which lets you 
express these complex computations declaratively in the DOM as well.
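
As a hedged sketch of the idea, built here through the DOM API (the 
operator and field names follow the published Xflow morphing example and 
may differ in the current operator set; the <defs> element is assumed to 
exist in the scene):

    // Declare a dataflow that morphs vertex positions; a mesh would
    // reference it via src="#morphedPositions".
    var data = document.createElement("data");
    data.setAttribute("id", "morphedPositions");
    data.setAttribute("compute", "position = xflow.morph(position, posAdd, weight)");

    var weight = document.createElement("float");
    weight.setAttribute("name", "weight");
    weight.textContent = "0.5"; // changing this value re-evaluates the dataflow

    data.appendChild(weight);
    // the base position and posAdd arrays would be provided as further
    // children or pulled in via a nested <data src="...">
    document.querySelector("defs").appendChild(data);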

So, in conclusion, XML3D is far more than just a renderer: it gives you 
plenty of options to conveniently operate on your scene.

> If we think that XML3D (or the DOM, and XML3D acts on those manipulations)
> is already this perfect API, I'm not sure what we are even trying to
> accomplish here? If we are not building a nice-to-use 3D SDK, what's the
> target here?
I totally agree that we still need to build this easily programmable 3D 
SDK. But XML3D makes it very simple to maintain the 3D scene in the DOM 
according to the scene state of the application.
You may want to have a look at our example web client for our FiVES 
server (https://github.com/rryk/FiVES). Although I admit that the code 
needs some refactoring, the example of how entities are created shows 
this nicely: as soon as you create a new Entity object, the DOM 
representation of its scene graph and its transformations is created 
automatically and maintained as a view of the entity model. As a 
developer, you only need to operate on the client application's API.
This could serve as an example of how an SDK could operate on the XML3D 
representation of the scene.
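
A simplified, hypothetical sketch of that pattern (the actual FiVES 
client code is structured differently; the <defs> element and the ids 
are placeholders):

    // Creating an entity automatically creates and maintains its DOM "view".
    function Entity(id) {
        this.id = id;

        this.transformElement = document.createElement("transform");
        this.transformElement.setAttribute("id", "transform-" + id);
        document.querySelector("defs").appendChild(this.transformElement);

        this.groupElement = document.createElement("group");
        this.groupElement.setAttribute("transform", "#transform-" + id);
        document.querySelector("xml3d").appendChild(this.groupElement);
    }

    // The application only talks to the entity API, never to the DOM directly.
    Entity.prototype.setPosition = function (x, y, z) {
        this.transformElement.setAttribute("translation", x + " " + y + " " + z);
    };

    var avatar = new Entity("avatar");
    avatar.setPosition(0, 1.5, -3);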


~ Torsten

> On Wed, Oct 30, 2013 at 11:35 PM, Philipp Slusallek <
> Philipp.Slusallek at dfki.de> wrote:
>
>> Hi Jonne, all,
>>
>> I am not sure that applying the Tundra API in the Web context is really the
>> right approach. One of the key differences is that we already have a
>> central "scene" data structure, and it already handles rendering, input
>> (DOM events), and other aspects. Also, an API-oriented approach may not be
>> the best option in this declarative context either (even though I
>> understand that it feels more natural when coming from C++; I had the same
>> issues).
>>
>> So let me be a bit more specific:
>>
>> -- Network: So, yes we need a network module. It's not something that
>> "lives" in the DOM but rather watches it and sends updates to the server to
>> achieve sync.
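
As a side note from my side: a minimal sketch of what such a DOM-watching 
sync module could look like, using the standard MutationObserver API 
(sendUpdate() here is just a stub for whatever the sync GE actually 
provides):

    // Forward scene-graph changes to the server.
    function sendUpdate(change) {
        // placeholder: e.g. serialize and send over a WebSocket
        console.log("sync", change);
    }

    var sceneRoot = document.querySelector("xml3d");

    var observer = new MutationObserver(function (mutations) {
        mutations.forEach(function (mutation) {
            sendUpdate({
                type: mutation.type,                 // "childList" or "attributes"
                target: mutation.target.id || null,
                attribute: mutation.attributeName || null
            });
        });
    });

    observer.observe(sceneRoot, {
        childList: true,   // nodes added or removed
        attributes: true,  // e.g. transform or shader references changing
        subtree: true      // watch the whole scene graph
    });
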
>>
>> -- Renderer: Why do we need an object here? It is part of the DOM model. The
>> only aspect is that we may want to set renderer-specific parameters. We
>> currently do so through the <xml3d> DOM element, which seems like a good
>> approach. The issue to be discussed here is what the advantages of a
>> three.js-based renderer would be, and to implement it if really needed.
>>
>> -- Scene: This can be done nicely in the DOM, and with WebComponents it is
>> even more elegant. The scene objects are simply part of the same DOM, but
>> only some of them get rendered. I am not even sure that we need anything here
>> in addition to the DOM and suitable mappings for the components.
>>
>> -- Asset: As you say, this is already built into the XML3D DOM. I see it a
>> bit like the network system in that it watches for missing resources in the
>> DOM (plus attributes on priority and such?) and implements a sort of scheduler
>> that executes requests in some priority order. A version that only loads
>> missing resources is already available; one that goes even further and deletes
>> unneeded resources could probably be ported from your resource manager.
>>
>> -- UI: That is why we are building on top of HTML, which is a pretty good
>> UI layer in many respects. We have the 2D-UI GE to look into for missing
>> functionality.
>>
>> -- Input: This is also already built in, as events traverse the DOM. It is
>> widely used in all web-based UIs and has proven quite useful there. Here we
>> can nicely combine it with the 3D scene model, where events are not only
>> delivered to the 3D graphics elements but can be handled by the elements or
>> components even before that.
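
As a small illustration of this point: standard DOM event listeners can 
be attached directly to scene elements (the element id is a placeholder):

    // Clicking the rendered mesh triggers an ordinary DOM event.
    document.getElementById("teapotMesh").addEventListener("click", function (event) {
        console.log("mesh picked at screen position", event.clientX, event.clientY);
    });
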
>>
>> But maybe I am misunderstanding you here?
>>
>>
>> Best,
>>
>>          Philipp
>>
>>
>> On 30.10.2013 14:31, Jonne Nauha wrote:
>>
>>> var client = {
>>>     network  : Object, // Network sync, connect, disconnect etc. functionality.
>>>                        // Implemented by scene sync GE (Ludocraft).
>>>
>>>     renderer : Object, // API for 3D rendering engine access, creating scene
>>>                        // nodes, updating their transforms, raycasting etc.
>>>                        // Implemented by 3D UI (Playsign).
>>>
>>>     scene    : Object, // API for accessing the Entity-Component-Attribute model.
>>>                        // Implemented by ???
>>>
>>>     asset    : Object, // Not strictly necessary for xml3d as it does asset
>>>                        // requests for us, but for three.js this is pretty much needed.
>>>                        // Implemented by ???
>>>
>>>     ui       : Object, // API to add/remove widgets correctly on top of the
>>>                        // 3D rendering canvas element, window resize events etc.
>>>                        // Implemented by 2D/Input GE (Adminotech).
>>>
>>>     input    : Object  // API to hook to input events occurring on top of
>>>                        // the 3D scene.
>>>                        // Implemented by 2D/Input GE (Adminotech).
>>> };
>>>
>>>
>>> Best regards,
>>> Jonne Nauha
>>> Meshmoon developer at Adminotech Ltd.
>>> www.meshmoon.com <http://www.meshmoon.com>
>>>
>>>
>>>
>>> On Wed, Oct 30, 2013 at 9:51 AM, Toni Alatalo <toni at playsign.net
>>> <mailto:toni at playsign.net>> wrote:
>>>
>>>      Hi again,
>>>      new angle here: calling devs *outside* the 3D UI GE: POIs,
>>>      real-virtual interaction, interface designer, virtual characters, 3d
>>>      capture, synchronization etc.
>>>      I think we need to proceed rapidly with integration now and propose
>>>      that one next step towards that is to analyze the interfaces between
>>>      3D UI and other GEs. This is because it seems to be a central part
>>>      with which many others interface: that is evident in the old
>>>      'arch.png' where we analyzed GE/Epic interdependencies; it is embedded
>>>      in section 2 of the Winterthur arch discussion notes, which hopefully
>>>      everyone can see:
>>>      https://docs.google.com/document/d/1Sr4rg44yGxK8jj6yBsayCwfitZTq5Cdyyb_xC25vhhE/edit
>>>      I propose a process where we go through the usage patterns case by
>>>      case, for example so that Erno and I visit the other devs to discuss
>>>      it. I think a good goal for those sessions is to define and plan the
>>>      implementation of first tests / minimal use cases where the other
>>>      GEs are used together with 3D UI to show something. I'd like this
>>>      first pass to happen quickly so that within 2 weeks from the
>>>      planning the first case is implemented. So if we get to have the
>>>      sessions within 2 weeks from now, in a month we'd have demos with
>>>      all parts.
>>>      Let's organize this so that those who think this applies to their
>>>      work contact me with private email (to not spam the list), we meet
>>>      and collect the notes to the wiki and inform this list about that.
>>>      One question of particular interest to me here is: can the users of
>>>      3D UI do what they need well on the entity-system level (for example
>>>      just add and configure mesh components), or do they need deeper
>>>      access to the 3D scene and rendering (spatial queries, somehow
>>>      affecting the rendering pipeline, etc.)? With Tundra we have the
>>>      Scene API and the (Ogre)World API(s) to support the latter, and also
>>>      access to the renderer directly. OTOH the entity-system level is
>>>      renderer independent.
>>>      Synchronization is a special case which requires good two-way
>>>      integration with 3D UI. Luckily it's something that we, and
>>>      especially Lasse himself, already know from how it works in Tundra
>>>      (and in the WebTundras). Definitely to be discussed and planned now
>>>      too, of course.
>>>      So please if you agree that this is a good process do raise hands
>>>      and let's start working on it! We can discuss this in the weekly too
>>>      if needed.
>>>      Cheers,
>>>      ~Toni
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>> --
>>
>> ---------------------------------------------------------------------------
>> Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) GmbH
>> Trippstadter Strasse 122, D-67663 Kaiserslautern
>>
>> Management Board:
>>    Prof. Dr. Dr. h.c. mult. Wolfgang Wahlster (Chairman)
>>    Dr. Walter Olthoff
>> Chairman of the Supervisory Board:
>>    Prof. Dr. h.c. Hans A. Aukes
>>
>> Registered office: Kaiserslautern (HRB 2313)
>> VAT ID: DE 148646973, tax number: 19/673/0060/3
>> ---------------------------------------------------------------------------
>>
>
>
> _______________________________________________
> Fiware-miwi mailing list
> Fiware-miwi at lists.fi-ware.eu
> https://lists.fi-ware.eu/listinfo/fiware-miwi
