And yes, this is entirely another discussion :) Sorry about the off-topic spam.

On Fri, Oct 25, 2013 at 11:34 AM, Tomi Sarni <tomi.sarni at cyberlightning.com> wrote:

> Yes, I agree in general. Just a thought that in some use cases this could be thought of as an option. It has been difficult to design the API in a way that it would be highly dynamic, in the sense that it would suit a wide variety of application development needs. The NGSI-9/NGSI-10 <http://forge.fi-ware.eu/plugins/mediawiki/wiki/fiware/index.php/NGSI-9/NGSI-10_information_model> development in the earlier GE seemed difficult to adapt and, in my personal opinion, does not allow passing the interaction interface clearly enough to the application developer.
>
> On Fri, Oct 25, 2013 at 11:28 AM, Philipp Slusallek <Philipp.Slusallek at dfki.de> wrote:
>
>> Hi,
>>
>> With interaction I mean the user interaction. Yes, it eventually gets mapped to REST (or such) calls to the device. But how you map the device functionality to user interaction is a big step where different applications will have very different assumptions and interaction metaphors. Mapping them all to a generic sensor model seems very difficult.
>>
>> Using a semantic annotation avoids having to create such a mapping when you design the sensor, avoids having to store the model on each sensor, and pushes the mapping to the software/application side, which is (in my opinion) in a much better position to decide on that mapping. A fallback mapping may still be provided by the sensor for the most basic cases.
>>
>> Best,
>>
>> Philipp
>>
>> Am 25.10.2013 09:05, schrieb Tomi Sarni:
>>
>>> /It becomes more of an issue when we talk about interactivity, when the visual representation needs to react to user input in a way that is consistent with the application and calls functionality in the application. In other words, you have to do a mapping from the sensor to the application at some point along the pipeline (and back for actions to be performed by an actuator)./
>>>
>>> Currently, when a client polls a device (containing sensors and/or actuators), it will receive all interaction options that are available for the particular sensor or actuator. These options can then be accessed via an HTTP POST to the service, so there is the logical mapping. I can see your point though: in a way it would seem logical to have the XML3D model contain states (e.g. button-up and button-down 3D model states), and I have no idea whether this is supported by XML3D, as I have been busy on the server/sensor side. This way, when a sensor is accessed by an HTTP POST call to change its state to either on or off, for instance, the XML3D model could contain transition logic to change its appearance from one state to another. Alternatively, there can be two models for the two states: when the actuator is queried, it returns the model that corresponds to its current state.
>>>
>>> On Fri, Oct 25, 2013 at 9:29 AM, Philipp Slusallek <Philipp.Slusallek at dfki.de> wrote:
>>>
>>> Hi Tomi,
>>>
>>> Yes, this is definitely an interesting option, and when sensors offer RESTful interfaces it should be almost trivial to add (once a suitable and standardized way of finding that data is specified). At least it would provide a kind of default visualization in case no other is available.
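To make the poll-and-POST model Tomi describes above a bit more concrete, here is a minimal client-side sketch (TypeScript). Everything in it, the endpoint paths, field names and the toggleOn helper, is invented for illustration only and does not reflect any agreed device API:

    // Illustrative sketch only: a device poll returns its interaction options,
    // and an option is invoked via HTTP POST, as described in the thread above.
    interface InteractionOption {
      name: string;                    // e.g. "setState" (hypothetical)
      method: "POST";                  // options are accessed via HTTP POST
      href: string;                    // URI to POST to
      params: Record<string, string>;  // e.g. { state: "on" | "off" }
    }

    interface DevicePollResponse {
      deviceId: string;
      sensors: { id: string; type: string; value: number }[];
      actuators: { id: string; options: InteractionOption[] }[];
    }

    // Poll the device, then invoke a "setState" option to switch the actuator on.
    async function toggleOn(deviceUrl: string): Promise<void> {
      const device: DevicePollResponse = await (await fetch(deviceUrl)).json();
      const option = device.actuators[0]?.options.find(o => o.name === "setState");
      if (!option) return;
      await fetch(option.href, {
        method: option.method,
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ state: "on" }),
      });
    }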
>>> It becomes more of an issue when we talk about interactivity, when the visual representation needs to react to user input in a way that is consistent with the application and calls functionality in the application. In other words, you have to do a mapping from the sensor to the application at some point along the pipeline (and back for actions to be performed by an actuator).
>>>
>>> Either we specify the sensor type through some semantic means (a simple tag in the simplest case, a full RDF/a graph in the best case) and let the application choose how to represent it, or we need to find a way to map generic behavior of a default object to application functionality. The first seems much easier to me, as application functionality is likely to vary much more than sensor functionality. And semantic sensor descriptions have been worked on for a long time and are available on the market.
>>>
>>> Of course, there are hybrid methods as well: a simple one would be to include a URI/URL to a default model in the semantic sensor description that then gets loaded either from the sensor through REST (given some namespace there) or via the Web (again using some namespace or search strategy). Then the application can always inject its own mapping to what it thinks is the best mapping.
>>>
>>> Best,
>>>
>>> Philipp
>>>
>>> Am 25.10.2013 07:52, schrieb Tomi Sarni:
>>>
>>> *The following is completely on a theoretical level:*
>>>
>>> To mix things up a little further, I've been thinking about the possibility of storing the visual representation of sensors within the sensors themselves. Many sensor types allow HTTP POST/GET or even PUT/DELETE methods (wrapped in SNMP/CoAP communication protocols, for instance), which in theory would allow sensor subscribers to also publish information in sensors (e.g. upload an XML3D model). This approach could be useful in cases where these sensors have different purposes of use. But the sensor may have very little space to use for the model, on the order of 8-18 KB. Also, the web service can attach the models to IDs through the use of a database. This is really just a pointer; perhaps there would be use cases where the sensor visualization could be stored within the sensor itself. I think specifically some AR solutions could benefit from this. But do not let this mix things up; this perhaps reinforces the fact that there need to be overlaying middleware services that attach a visual representation based on their own needs. One service could use a different 3D representation for a temperature sensor than another one.
>>>
>>> On Thu, Oct 24, 2013 at 9:49 PM, Philipp Slusallek <Philipp.Slusallek at dfki.de> wrote:
>>>
>>> Hi,
>>>
>>> OK, now I get it. This does make sense -- at least in a local scenario, where the POI data (in this example) needs to be stored somewhere anyway, and storing it in a component and then generating the appropriate visual component does make sense. Using web components or a similar mechanism we could actually do the same via the DOM (as discussed for the general ECA sync before).
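A small sketch of the hybrid idea above, i.e. a semantic sensor description that carries a fallback model URI which the application can override with its own mapping. The field names (semantics, defaultModel) and the helper are made up for illustration, not part of any existing description format:

    // Illustrative only: a possible shape for a sensor description that carries
    // a fallback visualization URI; names are assumptions, not a spec.
    interface SensorDescription {
      id: string;
      type: string;           // e.g. "TemperatureSensor", ideally from a shared vocabulary
      semantics?: string;      // URI of a richer semantic (RDF) description, if available
      defaultModel?: string;   // URL of a fallback XML3D model offered by the sensor
    }

    // Pick a representation: prefer the application's own mapping,
    // fall back to the model referenced by the sensor itself.
    function resolveModelUrl(
      sensor: SensorDescription,
      appMappings: Map<string, string>  // sensor type -> application-chosen model URL
    ): string | undefined {
      return appMappings.get(sensor.type) ?? sensor.defaultModel;
    }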
>>> But even then you might actually not want to store all the POI data but only the part that really matters to the application (there may be much more data -- maybe not for POIs, but potentially for other things).
>>>
>>> Also, in a distributed scenario I am not so sure. In that case you might want to do that mapping on the server and only sync the resulting data, maybe with a reference back so you can still interact with the original data through a service call. That is the main reason why I generally think of POI data and POI representation as separate entities.
>>>
>>> Regarding terminology, I think it does make sense to differentiate between the 3D scene and the application state (that is not directly influencing the 3D rendering and interaction). While you store them within the same data entity (but in different components), they still refer to quite different things and are operated on by different parts of your program (e.g. the renderer only ever touches the "scene" data). We do the same within the XML3D core, where we attach renderer-specific data to DOM nodes, and I believe three.js also does something similar within its data structures. In the end, you have to store these things somewhere and there are only so many ways to implement it. The differences are not really that big.
>>>
>>> Best,
>>>
>>> Philipp
>>>
>>> Am 24.10.2013 19:24, schrieb Toni Alatalo:
>>>
>>> On 24 Oct 2013, at 19:24, Philipp Slusallek <Philipp.Slusallek at dfki.de> wrote:
>>>
>>> Good discussion!
>>>
>>> I find so too -- thanks for the questions and comments and all! Now briefly about just one point:
>>>
>>> Am 24.10.2013 17:37, schrieb Toni Alatalo:
>>>
>>> integrates to the scene system too - for example if a scene server queries POI services, does it then only use the data to manipulate the scene using other non-POI components, or does it often make sense also to include POI components in the scene so that the clients get it too automatically with the scene sync and can for example provide POI-specific GUI tools. Ofc clients can query POI services directly too, but this server-centric setup is also one scenario and there the scene integration might make sense.
>>>
>>> But I would say that there is a clear distinction between the POI data (which you query from some service) and the visualization or representation of the POI data. Maybe you are more talking about the latter here. However, there really is an application-dependent mapping from the POI data to its representation. Each application may choose to present the same POI data in a very different way, and it is only this resulting representation that becomes part of the scene.
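One possible way to picture the split discussed above, with the POI data and its visual representation living in separate components of the same synced entity, so the renderer only ever touches the mesh part. All type and field names below are illustrative assumptions, not the actual reX component definitions:

    // Sketch of the entity-component split (names invented for illustration).
    interface Component { typeName: string; }

    interface PoiComponent extends Component {
      typeName: "POI";
      poiId: string;                        // reference back to the POI service
      category: string;                     // e.g. "restaurant"
      location: { lat: number; lon: number };
    }

    interface MeshComponent extends Component {
      typeName: "Mesh";
      meshRef: string;                      // URL of the mapped/generated 3D asset
    }

    interface Entity {
      id: string;                           // entity UUID
      components: Component[];              // POI data + its representation, synced together
    }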
>>> No, I was not talking about visualization or representations here but about the POI data.
>>>
>>> 'non-POI' in the above tried to refer to the whole, which covers visualisations etc. :)
>>>
>>> Your last sentence may help to understand the confusion: in these posts I've been using the reX entity system terminology only -- hoping that it is clear to discuss that way and not mix terms (like I've tried to do in some other threads).
>>>
>>> There, 'scene' does not refer to a visual / graphical or any other type of scene. It does not refer to e.g. something like what xml3d.js and three.js, or Ogre, have as their Scene objects.
>>>
>>> It simply means the collection of all entities. There it is perfectly valid to have any kind of data which does not end up in e.g. the visual scene -- many components are like that.
>>>
>>> So in the above, 'only use the data to manipulate the scene using other non-POI components' was referring to, for example, the creation of Mesh components if some POI is to be visualised that way -- the mapping that you were discussing.
>>>
>>> But my point was not about that but about the POI data itself -- and the example about some end-user GUI with a widget that manipulates it. It then gets automatically synchronised along with all the other data in the application in a collaborative setting etc.
>>>
>>> Stepping out of the previous terminology, we could perhaps translate: 'scene' -> 'application state' and 'scene server' -> 'synchronization server'.
>>>
>>> I hope this clarifies something -- my apologies if not.
>>>
>>> Cheers,
>>> ~Toni
>>>
>>> P.S. I sent the previous post from a foreign device and accidentally with my gmail address as sender, so it didn't make it to the list -- so thank you for quoting it in full, I don't think we need to repost it :)
>>>
>>> This is essentially the Mapping stage of the well-known Visualization pipeline (http://www.infovis-wiki.net/index.php/Visualization_Pipeline), except that here we also map interaction aspects to an abstract scene description (XML3D) first, which then performs the rendering and interaction. So you can think of this as an additional "Scene" stage between "Mapping" and "Rendering".
>>>
>>> I think this is a different topic, but also with real-virtual interaction, for example, how to facilitate nice and simple authoring of e.g. the real-virtual object mappings seems a fruitful enough angle to think about a bit, perhaps as a case to help in understanding the entity system & the different servers etc.
>>> For example, if there's a component type 'real world link', the Interface Designer GUI shows it automatically in the list of components, people can just add them to their scenes, and somehow then the system just works..
>>>
>>> I am not sure what you are getting at. But it would be great if the Interface Designer would allow choosing such POI mappings from a predefined catalog. It seems that Xflow can be used nicely for generating the mapped scene elements from some input data, e.g. using the same approach we use to provide basic primitives like cubes or spheres in XML3D. Here they are not fixed, built-in tags as in X3D, but can actually be added by the developer as best fits.
>>>
>>> For generating more complex subgraphs we may have to extend the current Xflow implementation. But it's at least a great starting point to experiment with. Experiments and feedback would be very welcome here.
>>>
>>> I don't think these discussions are now hurt by us (currently) having alternative renderers - the entity system, formats, sync and the overall architecture are the same anyway.
>>>
>>> Well, some things only work in one branch and others only in the other. So the above mechanism could not be used to visualize POIs in the three.js branch, but we do not have all the features to visualize Oulu (or whatever city) in the xml3d.js branch. This definitely IS greatly limiting how we can combine the GEs into more complex applications -- the ultimate goal of the orthogonal design of this chapter.
>>>
>>> And it does not even work within the same chapter. It will be hard to explain to Juanjo and others from FI-WARE (or the commission, for that matter).
>>>
>>> BTW, I just learned today that there is a smaller FI-WARE review coming up soon. Let's see if we already have to present things there. So far they have not explicitly asked us.
>>>
>>> Best,
>>>
>>> Philipp
>>>
>>> -Toni
>>>
>>> From an XML3D POV things could actually be quite "easy". It should be rather simple to directly interface to the IoT GEs of FI-WARE through REST via a new Xflow element. This would then make the data available through <data> elements. Then you can use all the features of Xflow to manipulate the scene based on the data. For example, we are discussing building a set of visualization nodes that implement common visualization metaphors, such as scatter plots, animations, you name it. A new member of the lab starting soon wants to look into this area.
>>>
>>> For acting on objects we have always used Web services attached to the XML3D objects via DOM events. Eventually, I believe we want a higher-level input handling and processing framework, but no one knows so far what this should look like (we have some ideas but they are not well baked; any input is highly welcome here). This might or might not reuse some of the Xflow mechanisms.
>>>
>>> But how to implement Real-Virtual Interaction is indeed an interesting discussion. Getting us all on the same page and sharing ideas and implementations is very helpful.
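Until such an Xflow element exists, one way to picture the data flow is a plain DOM-level sketch: poll a (hypothetical) REST endpoint and write the value into an XML3D <data> element so downstream consumers can pick it up. The endpoint, element id and field name are invented, and this deliberately avoids guessing at the Xflow operator API:

    // DOM-level sketch only; assumes the declarative XML3D runtime reacts to
    // DOM changes on its value elements. All names here are hypothetical.
    async function pushSensorValueToData(endpoint: string, dataElementId: string) {
      const reading = await (await fetch(endpoint)).json();   // e.g. { value: 21.5 }
      const data = document.getElementById(dataElementId);    // a <data> element in the XML3D tree
      const field = data?.querySelector('float[name="temperature"]');
      if (field) {
        field.textContent = String(reading.value);            // update the published value
      }
    }

    // Poll every few seconds (interval picked arbitrarily for the sketch).
    setInterval(() => pushSensorValueToData("/iot/sensors/42/latest", "sensorData"), 5000);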
>>> Doing this on the same SW platform (without the fork that we currently have) would facilitate a powerful implementation even more.
>>>
>>> Thanks,
>>>
>>> Philipp
>>>
>>> Am 23.10.2013 08:02, schrieb Tomi Sarni:
>>>
>>> ->Philipp
>>> /I did not get the idea why POIs are similar to ECA. At a very high level I see it, but I am not sure what it buys us. Can someone sketch that picture in some more detail?/
>>>
>>> Well, I suppose it becomes relevant at the point when we are combining our GEs together. If the model can be applied at the level of the scene, then down to a POI in a scene, and further down to the sensor level, things can be more easily visualized -- not just in terms of painting 3D models, but in terms of handling big data as well, more specifically handling relationships/inheritance. It also makes it easier to design a RESTful API, as we have a common structure to follow, and it also provides more opportunities for 3rd-party developers to make use of the data for their own purposes.
>>>
>>> For instance:
>>>
>>> ->Toni
>>>
>>> From the point of view of sensors, the entity-component becomes device-sensors/actuators. A device may have a unique identifier and IP by which to access it, but it may also contain several actuators and sensors that are components of that device entity. Sensors/actuators themselves are not aware of whom they are interesting to; one client may use the sensor information differently from another client. The sensor/actuator service allows any other service to query, using a request/response method, either by geo-coordinates (circle, square or complex shape queries) or perhaps through type+maxresults, and the service will return entities and their components, from which the requester can form logical groups (arrays of entity UUIDs) and query more detailed information based on that logical group.
>>>
>>> I guess there needs to be similar thinking done on the POI level. I guess a POI does not know which scene it belongs to. It is up to the scene server to form a logical group of POIs (e.g. restaurants of the Oulu 3D city model). Then again, the problem is that the scene needs to wait for the POI service to query for sensors and form its logical groups before it can pass information to the scene. This can lead to long wait times. But this sequencing problem is also something that could be thought about. Anyway, this is a common problem with everything on the web at the moment, in my opinion: services become intertwined. When a client loads a web page there can be queries to 20 different services for advertisements and other stuff. The web page handles it by painting stuff to the client on a receive basis. I think this could be applied in the Scene as well.
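A rough sketch of the query model described above (geo-shape or type+maxresults queries returning entities and their components, from which the requester forms logical groups of entity UUIDs). The field names are illustrative assumptions only, not a specified API:

    // Illustrative request/response shapes for the sensor/actuator service.
    type GeoQuery =
      | { shape: "circle"; center: [number, number]; radiusMeters: number }
      | { shape: "box"; southWest: [number, number]; northEast: [number, number] }
      | { shape: "polygon"; vertices: [number, number][] };   // "complex shape" query

    interface EntityQuery {
      geo?: GeoQuery;        // circle, square/box or complex-shape queries
      type?: string;         // e.g. "temperature"
      maxResults?: number;   // the type+maxresults style of query
    }

    interface EntityResult {
      entityId: string;      // device UUID
      components: { componentType: string; attributes: Record<string, unknown> }[];
    }

    // The requester can then form a logical group (an array of entity UUIDs)
    // and ask for more detail about just that group.
    interface GroupDetailQuery {
      entityIds: string[];
    }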
>>> On Wed, Oct 23, 2013 at 8:00 AM, Philipp Slusallek <Philipp.Slusallek at dfki.de> wrote:
>>>
>>> Hi,
>>>
>>> First of all, it's certainly a good thing to also meet locally. I was just a bit confused whether that meeting would somehow involve us as well. Summarizing the results briefly for the others would definitely be interesting.
>>>
>>> I did not get the idea why POIs are similar to ECA. At a very high level I see it, but I am not sure what it buys us. Can someone sketch that picture in some more detail?
>>>
>>> BTW, what is the status of the rendering discussion (three.js vs. xml3d.js)? I still have the feeling that we are doing parallel work here that should probably be avoided.
>>>
>>> BTW, as part of our shading work (which is shaping up nicely) Felix has lately been looking at a way to describe rendering stages (passes) essentially through Xflow. It is still very experimental, but he is using it to implement shadow maps right now.
>>>
>>> @Felix: Once this has converged into a bit more stable idea, it would be good to post it here to get feedback. The way we discussed it, this approach could form a nice basis for a modular design of advanced rasterization techniques (reflection maps, advanced face rendering, SSAO, lens flare, tone mapping, etc.), and (later) maybe also describe global illumination settings (similar to our work on LightingNetworks some years ago).
>>>
>>> Best,
>>>
>>> Philipp
>>>
>>> Am 22.10.2013 23:03, schrieb toni at playsign.net:
>>>
>>> Just a brief note: we had some interesting preliminary discussion triggered by how the data schema that Ari O. presented for the POI system seemed at least partly similar to what the Real-Virtual interaction work had resulted in too -- and in fact about how the proposed POI schema was basically a version of the entity-component model which we have already been using for scenes in realXtend (it is inspired by / modeled after it, Ari told). So it can be very much related to the Scene API work in the Synchronization GE too.
>>> As the action point, we agreed that Ari will organize a specific work session on that.
>>>
>>> I was now thinking that it perhaps at least partly leads back to the question: how do we define (and implement) component types, i.e. what was mentioned in that entity-system post a few weeks back (with links to reX IComponent etc.). I mean: if functionality such as POIs and real-world interaction makes sense as somehow resulting in custom data component types, does it mean that a key part of the framework is a way for those systems to declare their types .. so that it integrates nicely into the whole we want? I'm not sure, too tired to think it through now, but anyhow I just wanted to mention that this was one topic that came up.
>>>
>>> I think Web Components is again something to check -- as in XML terms reX Components are xml(3d) elements .. just ones that are usually in a group (according to the reX entity <-> xml3d group mapping). And Web Components are about defining & implementing new elements (as Erno pointed out in a different discussion about xml-html authoring in the session).
>>>
>>> BTW, thanks Kristian for the great comments in that entity system thread -- it was really good to learn about the alternative attribute access syntax and the validation in XML3D(.js).
>>>
>>> ~Toni
>>>
>>> P.S. for (Christof &) the DFKI folks: I'm sure you understand the rationale of these Oulu meets -- the idea is of course not to exclude you from the talks, it just makes sense for us to meet live too as we are in the same city after all -- naturally with the DFKI team you also talk there locally. Perhaps it is a good idea that we make notes so that we can post them e.g. here (I'm not volunteering though! 😜). Also, the now agreed bi-weekly setup on Tuesdays luckily works so that we can then summarize fresh in the global Wed meetings and continue the talks etc.
>>>
>>> *From:* Erno Kuusela
>>> *Sent:* Tuesday, October 22, 2013 9:57 AM
>>> *To:* Fiware-miwi
>>>
>>> Kari from CIE offered to host it this time, so see you there at 13:00.
>>>
>>> Erno
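On the Web Components point above: a minimal sketch of declaring a component type as a custom element, so that a tool like the Interface Designer could discover and list it. It uses the standard customElements API purely for illustration; the element name ('real-world-link') and its attribute are invented, not an agreed reX/XML3D mapping:

    // Sketch of a custom element standing in for a 'real world link' component type.
    class RealWorldLink extends HTMLElement {
      static get observedAttributes() { return ["sensor-uri"]; }

      connectedCallback() {
        // A tool could discover this component type and offer it in its component list.
        console.log("real-world-link attached, sensor:", this.getAttribute("sensor-uri"));
      }

      attributeChangedCallback(name: string, _oldVal: string, newVal: string) {
        if (name === "sensor-uri") {
          // Re-subscribe to the referenced sensor here (left out of the sketch).
          console.log("sensor-uri changed to", newVal);
        }
      }
    }

    customElements.define("real-world-link", RealWorldLink);
    // Usage inside an entity's group, e.g.:
    // <real-world-link sensor-uri="http://example.invalid/sensors/42"></real-world-link>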
>>
>> --
>> ---------------------------------------------------------------------------
>> Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) GmbH
>> Trippstadter Strasse 122, D-67663 Kaiserslautern
>>
>> Geschäftsführung:
>> Prof. Dr. Dr. h.c. mult. Wolfgang Wahlster (Vorsitzender)
>> Dr. Walter Olthoff
>> Vorsitzender des Aufsichtsrats:
>> Prof. Dr. h.c. Hans A. Aukes
>>
>> Sitz der Gesellschaft: Kaiserslautern (HRB 2313)
>> USt-Id.Nr.: DE 148646973, Steuernummer: 19/673/0060/3
>> ---------------------------------------------------------------------------