[Fiware-miwi] 3D UI Usage from other GEs / epics / apps

Philipp Slusallek Philipp.Slusallek at dfki.de
Sun Nov 3 10:14:13 CET 2013


Hi,

Just catching up on this thread.

On 31.10.2013 08:16, Toni Alatalo wrote:
> I think the renderer API is needed for these kinds of things:
>
> 1. Custom drawing, either in a component’s implementation or just with
> direct drawing commands from application code. For example a component
> that is a procedural tree — or volumetric terrain. Or some custom code
> to draw aiming helpers in a shooting game or so, for example some kind
> of curves.

While I think that it might be necessary to have an escape mechanism for 
drawing stuff directly, we should see that we can cover most of these 
features within the declarative part as well. This will allow us to use 
other renderers too (e.g. with XML3D and our shade.js, everything also 
works with a real-time ray tracer that we are developing, again in other 
projects, and which seems to be coming to mobile devices in hardware, 
e.g. via Imagination and Samsung).
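
As a rough illustration of the difference: a procedural tree could produce 
plain geometry data that the declarative layer ingests, instead of issuing 
renderer-specific draw calls. In the sketch below, declarativeScene.addMesh 
is a hypothetical placeholder for however XML3D or the entity system would 
accept generated geometry; only the data-producing part is real code.

// Sketch: procedural content expressed as data handed to the declarative
// layer, not as direct drawing commands. `declarativeScene.addMesh` is
// hypothetical, standing in for the actual ingestion API.
function buildTreeGeometry(seed) {
  var positions = [];   // filled by the generation algorithm (x, y, z triples)
  var indices = [];
  // ... procedural generation based on `seed` goes here ...
  return {
    positions: new Float32Array(positions),
    indices: new Uint16Array(indices)
  };
}

var tree = buildTreeGeometry(42);
declarativeScene.addMesh(tree); // hypothetical call; the same data could
                                // feed a rasterizer or a ray tracer backend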

> 2. Complex scene queries. Typically they might be better done via the
> scene, though. But perhaps something that goes really deep into the
> renderer, for example to query which areas are in shadow, or
> visibility checks?

It would be good to know what is needed here. We obviously have ray 
casting queries, bounding boxes, and such, but others could be added as 
well.
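
To make the intended granularity concrete, this is roughly what such a 
query looks like today on the three.js side (one of the renderers 
discussed in this thread); just a sketch with illustrative names, not a 
proposed 3D UI API:

// Sketch of a pick/visibility query using three.js's Raycaster (current API).
// The `camera` and `scene` objects are whatever the 3D UI exposes.
var raycaster = new THREE.Raycaster();

function entityUnderCursor(event, camera, scene) {
  // Convert the mouse position to normalized device coordinates (-1..1).
  var ndc = new THREE.Vector2(
    (event.clientX / window.innerWidth) * 2 - 1,
    -(event.clientY / window.innerHeight) * 2 + 1
  );
  raycaster.setFromCamera(ndc, camera);
  // Closest-first list of { distance, point, object } hits, searched recursively.
  var hits = raycaster.intersectObjects(scene.children, true);
  return hits.length ? hits[0] : null;
}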

> 3. Things that need to hook into the rendering pipeline — perhaps for
> things like Render-To-Texture or post-process compositing. Perhaps how
> XFlow integrates with rendering?

Kristian and Felix are working on a general mechanism for expressing 
dependencies within the rendering process (e.g. the need to generate a 
texture before using it). It still seems to be in flux right now.
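
To illustrate the kind of dependency meant here, this is roughly what 
"generate the texture before using it" looks like when written out by hand 
in three.js terms; a sketch only, not the Xflow/XML3D mechanism that 
Kristian and Felix are working on:

// Sketch of an explicit render-to-texture ordering (three.js terms, illustrative only).
// A declarative dependency mechanism would derive this ordering automatically.
var target = new THREE.WebGLRenderTarget(512, 512);

function renderFrame(renderer, rttScene, rttCamera, mainScene, mainCamera, screenMaterial) {
  // 1. Produce the texture in an offscreen pass.
  renderer.setRenderTarget(target);
  renderer.render(rttScene, rttCamera);

  // 2. Consume it in the main pass.
  screenMaterial.map = target.texture;
  renderer.setRenderTarget(null);
  renderer.render(mainScene, mainCamera);
}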

>>     scene     : Object, // API for accessing the
>> Entity-Component-Attribute model.
>>                         // Implemented by ???
>
> Yes, the lack of clarity about the responsibility here is why I started
> the thread on ‘the entity system’ some weeks ago.
>
> I think the situation is not catastrophic as we already have 3
> implementations of that: 2 in the ‘WebTundras’ (Chiru-WebClient and
> WebRocket) and also xml3d.js.
>
> Playsign can worry about this at least for now (say, the coming month).
> We are not writing a renderer from scratch as there are many good ones
> out there already so we can spend resources on this too. Let’s see
> whether we can just adopt one of those 3 systems for MIWI (and hence
> probably as realXtend’s future official WebTundra) or whether we need
> to write a 4th from scratch for some reason.

As I wrote this morning, a WebComponent version based on XML3D would 
seem like a good way to address this. Jonne's existing work seems to go 
a long way here.
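
To sketch what that could look like with the Custom Elements part of 
WebComponents (the element name, its attributes, and the mapping onto 
XML3D are all made up for illustration):

// Sketch of a reX-style component registered as a Custom Element.
// 'rex-mesh' and its attributes are invented for illustration.
class RexMesh extends HTMLElement {
  connectedCallback() {
    // A real component would map its attributes onto the declarative
    // scene graph (e.g. XML3D elements) or onto the renderer's scene API.
    var mesh = document.createElement('mesh');
    mesh.setAttribute('src', this.getAttribute('src') || '');
    this.appendChild(mesh);
  }
}
customElements.define('rex-mesh', RexMesh);

The application author would then just write something like 
<rex-mesh src="assets/tree.xml"></rex-mesh> in the page (the src value is 
of course made up here too).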

> Again, one particular question here is the ‘IComponent’: how do we
> define new components, i.e. XML elements? As mentioned before,
> WebComponents may be relevant here, so that is to be analysed.
> Adminotech is already using WebComponents in the 2D UI work, so perhaps
> you could help with understanding this: would the approach they have
> there work for defining reX components? Philipp’s comments about KIARA,
> which has a way to define things, are also related.

That would be a great step forward. As I wrote to Jonne before (without 
mentioning KIARA by name), a WebComponent could easily include the KIARA 
definition as well and thus interface to the network behind the scenes 
without the user having to worry about it.
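
Continuing the same sketch, the network side could hang off the element's 
attribute callbacks; the connection object and message shape below are 
placeholders for illustration, not KIARA's actual interface:

// Sketch: attribute changes are forwarded to a (hypothetical) connection
// object, so the interface described via KIARA stays hidden from the
// application author.
class RexPlaceable extends HTMLElement {
  static get observedAttributes() { return ['position', 'rotation']; }

  attributeChangedCallback(name, oldValue, newValue) {
    if (this.connection) {
      this.connection.send({ entity: this.id, attribute: name, value: newValue });
    }
  }
}
customElements.define('rex-placeable', RexPlaceable);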

>>     asset     : Object, // Not strictly necessary for xml3d as it does
>> asset requests for us, but for three.js this is pretty much needed.
>>                         // Implemented by ???
>
> I think this belongs to the 3D UI so falls in Playsign’s responsibility.
> Again there are the existing implementations in WebTundras and xml3d.js
> — and obviously the browser does much of the work but I think we still
> need to track dependencies in the loading and keep track of usages for
> releasing resources etc. (the reason why you have the asset system in
> WebRocket and the resource manager in xml3d.js).

I realized that there might be a misconception about memory management 
in the previous emails on this (with Kristian). When he was talking 
about not releasing resources, he did not mean that our resource manager 
is leaking memory or anything like that.

The point Kristian was making is that we are simply not concerned with 
removing objects that are far away from us. Instead, we assume that all 
referenced resources are supposed to be loaded and do fit into memory; 
we just load them asynchronously. This approach was simply not designed 
for the case where only a fraction of the scene fits into memory. While 
you can use the loading part of our resource manager, the delete part is 
not there and has to be added.
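
For what adding that delete part could look like, one simple shape is 
reference counting layered over the existing asynchronous loading; purely 
a sketch of the idea with invented function names, not the xml3d.js 
resource manager's interface:

// Sketch of reference-counted resource management on top of asynchronous loading.
var cache = {}; // url -> { promise, refs, dispose }

function acquire(url, loadFn, disposeFn) {
  if (!cache[url]) {
    cache[url] = { promise: loadFn(url), refs: 0, dispose: disposeFn };
  }
  cache[url].refs += 1;
  return cache[url].promise; // resolves to the loaded resource
}

function release(url) {
  var entry = cache[url];
  if (!entry) return;
  entry.refs -= 1;
  if (entry.refs === 0) {
    // Last user gone: free the resource once loading has finished.
    entry.promise.then(function (resource) { entry.dispose(resource); });
    delete cache[url];
  }
}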


Thanks,

	Philipp

> Thanks for the draft! Code speaks louder than words (that’s why I wrote
> the txml (<)-> xml3d converter), and at least for me this kind of
> code-like API def was very clear and helpful to read :)
>
> ~Toni
>
>>
>>     ui        : Object, // API to add/remove widgets correctly on top
>> of the 3D rendering canvas element, window resize events etc.
>>                         // Implemented by 2D/Input GE (Adminotech).
>>
>>     input     : Object // API to hook to input events occurring on top
>> of the 3D scene.
>> // Implemented by 2D/Input GE (Adminotech).
>> };
>>
>>
>> Best regards,
>> Jonne Nauha
>> Meshmoon developer at Adminotech Ltd.
>> www.meshmoon.com <http://www.meshmoon.com/>
>>
>>
>> On Wed, Oct 30, 2013 at 9:51 AM, Toni Alatalo <toni at playsign.net
>> <mailto:toni at playsign.net>> wrote:
>>
>>     Hi again,
>>
>>     New angle here: calling devs *outside* the 3D UI GE: POIs,
>>     real-virtual interaction, interface designer, virtual characters,
>>     3D capture, synchronization etc.
>>
>>     I think we need to proceed rapidly with integration now and
>>     propose that one next step towards that is to analyze the
>>     interfaces between 3D UI and other GEs. This is because it seems
>>     to be a central part with which many others interface: that is
>>     evident in the old 'arch.png' where we analyzed GE/Epic
>>     interdependencies; it is embedded in section 2 of the Winterthur
>>     arch discussion notes, which hopefully everyone can see:
>>     https://docs.google.com/document/d/1Sr4rg44yGxK8jj6yBsayCwfitZTq5Cdyyb_xC25vhhE/edit
>>
>>     I propose a process where we go through the usage patterns case by
>>     case, for example so that Erno and I visit the other devs to
>>     discuss it. I think a good goal for those sessions is to define
>>     and plan the implementation of first tests / minimal use cases
>>     where the other GEs are used together with 3D UI to show
>>     something. I'd like this first pass to happen quickly, so that
>>     within 2 weeks from the planning the first case is implemented. So
>>     if we get to have the sessions within 2 weeks from now, in a month
>>     we'd have demos with all parts.
>>
>>     Let's organize this so that those who think this applies to their
>>     work contact me by private email (to not spam the list); we meet,
>>     collect the notes to the wiki and inform this list about that.
>>
>>     One question of particular interest to me here is: can the users
>>     of 3D UI do what they need well on the entity system level (for
>>     example just add and configure mesh components), or do they need
>>     deeper access to the 3D scene and rendering (spatial queries,
>>     somehow affecting the rendering pipeline etc.)? With Tundra we have
>>     the Scene API and the (Ogre)World API(s) to support the latter,
>>     and also access to the renderer directly. OTOH the entity system
>>     level is renderer independent.
>>
>>     Synchronization is a special case which requires good two-way
>>     integration with 3D UI. Luckily it's something that we, and
>>     especially Lasse himself, know already from how it works in Tundra
>>     (and in the WebTundras). Definitely to be discussed and planned now
>>     too, of course.
>>
>>     So please, if you agree that this is a good process, do raise hands
>>     and let's start working on it! We can discuss this in the weekly
>>     too if needed.
>>
>>     Cheers,
>>     ~Toni
>>


-- 

-------------------------------------------------------------------------
Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) GmbH
Trippstadter Strasse 122, D-67663 Kaiserslautern

Management Board:
   Prof. Dr. Dr. h.c. mult. Wolfgang Wahlster (Chairman)
   Dr. Walter Olthoff
Chairman of the Supervisory Board:
   Prof. Dr. h.c. Hans A. Aukes

Registered office: Kaiserslautern (HRB 2313)
VAT ID: DE 148646973, tax number: 19/673/0060/3
---------------------------------------------------------------------------

