When forwarding / replicating a REGISTER, is it possible (without using DMQ) for the receiving host to save the contact with the original received parameter?
REGISTER > server 1 (received set) > forward > server 2 (received not set)
add_rcv_param will set received to server 1's IP, but I'm looking for the original value as it was set on server 1.
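One way to approach this (a sketch only, not verified; availability depends on your Kamailio version and loaded modules) is to have server 1 encode the client's source address into the Contact URI itself before forwarding, for example with nathelper's add_contact_alias(), so that the replicated REGISTER carries the original source as an ordinary contact parameter that server 2 will store:

```cfg
# -- server 1, sketch; assumes loadmodule "nathelper.so" --
if (is_method("REGISTER")) {
    # embed the source address of the REGISTER (the original
    # "received" information) as an ";alias=..." parameter in
    # the Contact URI before replicating
    add_contact_alias();
    # placeholder address for the second host
    $du = "sip:server2.example.com:5060";
    t_relay();
    exit;
}
```

Server 2 would then see the original source inside the stored contact rather than in a received parameter, which may or may not fit your use case.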
-dan
Dear Kamailio mailing list,
I have a problem with a simple Kamailio-Asterisk setup. I tried to find a solution and found several posts, but was not able to fix my problem.
I'm using the following setup:
- SIP device registered to Kamailio (IP 10.40.6.188)
- Kamailio (port 5070) and Asterisk (port 5060) on the same host (IP 10.40.8.104)
- I tried to forward calls from Kamailio to Asterisk with both standard PSTN call routing and the dispatcher module, but the result was the same
- Asterisk has a trunk to Kamailio
- SIP device registered to Asterisk (IP 10.40.6.214)
Calls from the Asterisk phone towards the Kamailio phone work without any issue.
Calls from the Kamailio phone towards the Asterisk phone are successfully established and voice works in both directions. The problem here is that the Kamailio phone sends an ACK upon the 200 OK, but the ACK is not forwarded to Asterisk. After 6 seconds, the call is terminated.
If I remove record_route() or add a second IP to the same interface for Asterisk (10.40.8.106), then the call signaling works without any issue.
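For reference, the loose-routing logic that decides where such an in-dialog ACK goes is the WITHINDLG block of the default configuration; a condensed sketch is below (your config may differ):

```cfg
route[WITHINDLG] {
    if (!has_totag()) return;
    # sequential request within a dialog: the Route headers set up
    # via record_route() should drive the routing decision
    if (loose_route()) {
        route(RELAY);
        exit;
    }
    if (is_method("ACK")) {
        if (t_check_trans()) {
            # ACK without Route header but matching a known
            # transaction: relay to the transaction's target
            route(RELAY);
            exit;
        }
        # ACK without matching transaction: discard
        exit;
    }
    sl_send_reply("404", "Not here");
    exit;
}
```

In your failing trace the ACK ends up with the Record-Route URI as Request-URI and is sent from port 5070 back to port 5070, which points at the loose_route() step when Kamailio and Asterisk share one IP.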
Does anyone have an idea what I'm doing wrong here?
I'm using more or less a standard Kamailio config file without rtpproxy. Please let me know if I shall post the full config or just some relevant snippets.
Best regards
Mathias
Following are two snippets of the relevant ACK handling. The first shows the case where the ACK was not forwarded to Asterisk; the second, with a different IP for Asterisk, shows the working case.
No. Time Source Destination Protocol Length Info
752 17.269019 10.40.8.104 10.40.8.104 SIP/SDP 1135 Status: 200 OK |
Frame 752: 1135 bytes on wire (9080 bits), 1135 bytes captured (9080 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.104, Dst: 10.40.8.104
User Datagram Protocol, Src Port: 5060, Dst Port: 5070
Session Initiation Protocol (200)
Status-Line: SIP/2.0 200 OK
Status-Code: 200
[Resent Packet: False]
[Request Frame: 304]
[Response Time (ms): 2093]
Message Header
Via: SIP/2.0/UDP 10.40.8.104:5070;branch=z9hG4bKda4f.48918e0ef9372ba1a38655eb93742e46.0;received=10.40.8.104
Via: SIP/2.0/UDP 10.40.6.188:56960;received=10.40.6.188;rport=56960;branch=z9hG4bKPjVtFCvs26liH-aJoGn8CaLg4kWqml5LH.
Record-Route: <sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61>
From: <sip:proxydevice@10.40.8.104>;tag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC
To: <sip:004112345@10.40.8.104>;tag=as5e5c287c
Call-ID: a6T0GAwBL69m5gsc215EgXi61Oorwxik
CSeq: 10909 INVITE
Server: Asterisk PBX 12.3.2
Allow: INVITE, ACK, CANCEL, OPTIONS, BYE, REFER, SUBSCRIBE, NOTIFY, INFO, PUBLISH, MESSAGE
Supported: replaces, timer
Session-Expires: 1800;refresher=uas
Contact: <sip:004112345@10.40.8.104:5060>
Content-Type: application/sdp
Require: timer
Content-Length: 271
Message Body
No. Time Source Destination Protocol Length Info
764 17.270668 10.40.8.104 10.40.6.188 SIP/SDP 1025 Status: 200 OK |
Frame 764: 1025 bytes on wire (8200 bits), 1025 bytes captured (8200 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.104, Dst: 10.40.6.188
User Datagram Protocol, Src Port: 5070, Dst Port: 56960
Session Initiation Protocol (200)
Status-Line: SIP/2.0 200 OK
Status-Code: 200
[Resent Packet: False]
[Request Frame: 259]
[Response Time (ms): 2115]
Message Header
Via: SIP/2.0/UDP 10.40.6.188:56960;received=10.40.6.188;rport=56960;branch=z9hG4bKPjVtFCvs26liH-aJoGn8CaLg4kWqml5LH.
Record-Route: <sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61>
From: <sip:proxydevice@10.40.8.104>;tag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC
To: <sip:004112345@10.40.8.104>;tag=as5e5c287c
Call-ID: a6T0GAwBL69m5gsc215EgXi61Oorwxik
CSeq: 10909 INVITE
Server: Asterisk PBX 12.3.2
Allow: INVITE, ACK, CANCEL, OPTIONS, BYE, REFER, SUBSCRIBE, NOTIFY, INFO, PUBLISH, MESSAGE
Supported: replaces, timer
Session-Expires: 1800;refresher=uas
Contact: <sip:004112345@10.40.8.104:5060>
Content-Type: application/sdp
Require: timer
Content-Length: 271
Message Body
No. Time Source Destination Protocol Length Info
786 17.402987 10.40.6.188 10.40.8.104 SIP 486 Request: ACK sip:004112345@10.40.8.104:5060 |
Frame 786: 486 bytes on wire (3888 bits), 486 bytes captured (3888 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.6.188, Dst: 10.40.8.104
User Datagram Protocol, Src Port: 56960, Dst Port: 5070
Session Initiation Protocol (ACK)
Request-Line: ACK sip:004112345@10.40.8.104:5060 SIP/2.0
Method: ACK
Request-URI: sip:004112345@10.40.8.104:5060
[Resent Packet: False]
[Request Frame: 259]
[Response Time (ms): 2247]
Message Header
Via: SIP/2.0/UDP 10.40.6.188:56960;rport;branch=z9hG4bKPjM2GQWjVep2XhpLxJD-cxNh75Qtep-sN3
Max-Forwards: 70
From: <sip:proxydevice@10.40.8.104>;tag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC
To: <sip:004112345@10.40.8.104>;tag=as5e5c287c
Call-ID: a6T0GAwBL69m5gsc215EgXi61Oorwxik
CSeq: 10909 ACK
Route: <sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61>
Content-Length: 0
No. Time Source Destination Protocol Length Info
787 17.404480 10.40.8.104 10.40.8.104 SIP 561 Request: ACK sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61 |
Frame 787: 561 bytes on wire (4488 bits), 561 bytes captured (4488 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.104, Dst: 10.40.8.104
User Datagram Protocol, Src Port: 5070, Dst Port: 5070
Session Initiation Protocol (ACK)
Request-Line: ACK sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61 SIP/2.0
Method: ACK
Request-URI: sip:10.40.8.104:5070;lr;ftag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC;did=17a.4f61
[Resent Packet: False]
Message Header
Via: SIP/2.0/UDP 10.40.8.104:5070;branch=z9hG4bKda4f.f914ac9def098ece9a50d53389574f85.0
Via: SIP/2.0/UDP 10.40.6.188:56960;received=10.40.6.188;rport=56960;branch=z9hG4bKPjM2GQWjVep2XhpLxJD-cxNh75Qtep-sN3
Max-Forwards: 69
From: <sip:proxydevice@10.40.8.104>;tag=NCP2N5E7OZuppblpkCCNZziEmpiTRsQC
To: <sip:004112345@10.40.8.104>;tag=as5e5c287c
Call-ID: a6T0GAwBL69m5gsc215EgXi61Oorwxik
CSeq: 10909 ACK
Content-Length: 0
In this case I used a second IP on the same interface for Asterisk and this call works without any issue:
No. Time Source Destination Protocol Length Info
939 8.849431 10.40.8.106 10.40.6.214 SIP 473 Request: ACK sip:004112345@10.40.6.214:49316;ob |
Frame 939: 473 bytes on wire (3784 bits), 473 bytes captured (3784 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.106, Dst: 10.40.6.214
User Datagram Protocol, Src Port: 5060, Dst Port: 49316
Session Initiation Protocol (ACK)
No. Time Source Destination Protocol Length Info
946 8.850293 10.40.8.106 10.40.8.104 SIP/SDP 1136 Status: 200 OK |
Frame 946: 1136 bytes on wire (9088 bits), 1136 bytes captured (9088 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.106, Dst: 10.40.8.104
User Datagram Protocol, Src Port: 5060, Dst Port: 5070
Session Initiation Protocol (200)
No. Time Source Destination Protocol Length Info
956 8.851703 10.40.8.104 10.40.6.188 SIP/SDP 1026 Status: 200 OK |
Frame 956: 1026 bytes on wire (8208 bits), 1026 bytes captured (8208 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.104, Dst: 10.40.6.188
User Datagram Protocol, Src Port: 5070, Dst Port: 60223
Session Initiation Protocol (200)
No. Time Source Destination Protocol Length Info
989 9.095859 10.40.6.188 10.40.8.104 SIP 485 Request: ACK sip:004112345@10.40.8.106:5060 |
Frame 989: 485 bytes on wire (3880 bits), 485 bytes captured (3880 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.6.188, Dst: 10.40.8.104
User Datagram Protocol, Src Port: 60223, Dst Port: 5070
Session Initiation Protocol (ACK)
No. Time Source Destination Protocol Length Info
991 9.097228 10.40.8.104 10.40.8.106 SIP 516 Request: ACK sip:004112345@10.40.8.106:5060 |
Frame 991: 516 bytes on wire (4128 bits), 516 bytes captured (4128 bits)
Linux cooked capture
Internet Protocol Version 4, Src: 10.40.8.104, Dst: 10.40.8.106
User Datagram Protocol, Src Port: 5070, Dst Port: 5060
Session Initiation Protocol (ACK)
Hello,
wondering if we should do a new IRC devel meeting in the near future
to sync on development plans.
One of the decisions we should take is whether we should target
releasing the new major version (5.3) before the summer holidays or
leave it for the autumn.
If many want to do it, then a first proposal for a date would be: March
07, 2019, at 15:00 UTC (16:00 Berlin time).
As usual, I created a wiki page to track the availability and the topics
that should be approached:
* https://www.kamailio.org/wiki/devel/irc-meetings/2019a
Feel free to add yourself there, propose topics, etc...
Cheers,
Daniel
--
Daniel-Constantin Mierla -- www.asipto.com -- www.twitter.com/miconda -- www.linkedin.com/in/miconda
Kamailio World Conference - May 6-8, 2019 -- www.kamailioworld.com
Kamailio Advanced Training - Mar 4-6, 2019 in Berlin; Mar 25-27, 2019, in Washington, DC, USA -- www.asipto.com
Hello,
Kamailio SIP Server v5.2.2 stable release is out.
This is a maintenance release of the latest stable branch, 5.2, that
includes fixes since the release of v5.2.1. There is no change to the
database schema or configuration language structure, so nothing has to
be done on previous installations of v5.2.x. Deployments running
previous v5.2.x versions are strongly recommended to upgrade to v5.2.2.
For more details about version 5.2.2 (including links and guidelines to
download the tarball or fetch from the GIT repository), visit:
* https://www.kamailio.org/w/2019/03/kamailio-v5-2-2-released/
RPM, Debian/Ubuntu packages will be available soon as well.
Many thanks to all who contribute to and use Kamailio! Looking forward
to meeting many of you in Berlin, at Kamailio World Conference 2019!
Cheers,
Daniel
--
Daniel-Constantin Mierla -- www.asipto.com -- www.twitter.com/miconda -- www.linkedin.com/in/miconda
Kamailio World Conference - May 6-8, 2019 -- www.kamailioworld.com
Kamailio Advanced Training - Mar 25-27, 2019, in Washington, DC, USA -- www.asipto.com
Daniel, would it be possible to backport the rtpengine start/stop forwarding
and play/stop media features into 5.2.2? I realize they are not technically
"fixes" -- I'm not sure how obtrusive these changes would be. Either way, thank
you. -A
--
Anthony - https://messinet.com
F9B6 560E 68EA 037D 8C3D D1C9 FF31 3BDB D9D8 99B6
Hello,
I am considering releasing a new version from branch 5.2, namely
v5.2.2, next Monday, March 11.
As usual, check if any fixes done on master for issues you reported were
backported to 5.2 already or not (I just pushed a bunch of backports
earlier today).
Also, if you are aware of issues not reported to the bug tracker,
report them there so they get a chance to be reviewed and possibly fixed.
Cheers,
Daniel
--
Daniel-Constantin Mierla -- www.asipto.com -- www.twitter.com/miconda -- www.linkedin.com/in/miconda
Kamailio World Conference - May 6-8, 2019 -- www.kamailioworld.com
Kamailio Advanced Training - Mar 25-27, 2019, in Washington, DC, USA -- www.asipto.com
FYI. Sorry for cross-posting.
Sent from my Samsung Galaxy smartphone.
-------- Original message --------
From: Christoph Valentin <christoph.valentin(a)gmx.at>
Date: 09.03.19 01:33 (GMT+01:00)
To: Valentin Christoph <Christoph.Valentin(a)kapsch.net>
Subject: Fw: Re: [x3d-public] x3d-public Digest, Vol 120, Issue 34
F.y.i......
--
This message was sent from my Android mobile phone with GMX Mail.
On 08.03.19, 22:49, Christoph Valentin <christoph.valentin(a)gmx.at> wrote:
Hi Andreas,
Still some neurons firing, so I send another update. Please feel free to break the "event cascade" when it starts to become boring :-)
Everything else inline.
Have a nice weekend
Christoph
P.S.: Today I deleted the "kamailio" clone from my github account http://github.com/christoph-v, because that account is now solely intended for private use (SP-ARK, PS-ARK).
I created another account for professional purposes: http://github.com/christoph-v-kapsch
Sent: Friday, 08 March 2019 at 16:07
From: "Andreas Plesch" <andreasplesch(a)gmail.com>
To: "X3D Graphics public mailing list" <x3d-public(a)web3d.org>
Subject: Re: [x3d-public] x3d-public Digest, Vol 120, Issue 34
Thanks.
I looked at the network sensor, and BS Collaborate nodes. I think the idea is to explicitly forward all events which need sharing to the server which then distributes those to connected clients. This leaves the definition of the shared state to the scene, and therefore requires careful design and code for MU. Perhaps there is a way that the browser can better assist with making a scene MU capable.
What is the complete state a client needs when it is admitted to a shared scene ?
Since most fields of most nodes accept input and can therefore potentially change, complete state probably means the values of all fields of all root nodes, which recursively means all nodes. The complete state may only need to be transferred when a client joins.
Plus the offsets of avatars from Viewpoints. And the clock. Perhaps other values.
[Christoph:] I think what you are considering here is the tradeoff between "MU is done completely by the browser" and "MU is done completely by the author".
Let's imagine the Web3D Consortium decided "MU is up to the author, everything should be done by EAI/SAI". Then every author would have to decide which nodes of his scene needed synchronization, and he would have free choice of the network protocol and of the server he would use. His effort would be maximal, but on the other hand his freedom would be maximal, too.
The opposite extreme would be that the browser vendor did all the MU inherently within the X3D nodes. The effort for the author to create MU scenes from SU scenes would be zero, but his freedom would be zero, too (given that each browser implemented its own MU strategy).
So the Network Sensor is the ABSOLUTE MINIMUM, which MUST be standardized (INCLUDING a standard for the network protocol), in order to keep independence from the browser vendor and from the server vendor.
I'm with you when you doubt that the Network Sensor is enough. I am with you when you insist that some additional nodes MUST be standardized to keep the authors' effort small. The Network Sensor is not enough (I think I already said that here).
If you are interested, I can introduce you to the SMUOS/C3P idea, which I developed throughout my hobby project. It would be a waste of ten years if everybody had to repeat those experiences. And I guess there are some projects, products and people around here who can help, too, much better than I can.
I was thinking about avatar initiated event cascades since the state of everything else should be deterministic and only depend on time. These other things should update themselves. There are exceptions such as scripts which generate random values.
Avatar initiated event cascades generally start with environment sensors. Not sure if there are other ways. The idea would be to just have to transmit this event cascade to other clients which then can replay it to update their scene instances.
[Christoph:] I think it will not be necessary to transmit event cascades; only single events and states need to be networked.
I also was reading some of the github open-dis documentation. Interesting background, and pretty accurate discussion on coordinate systems. It may be possible to set up somewhere a very simple distributed simulation to join and test clients with.
[Christoph:] I do not want to comment on HLA/DIS here.
Single browser MU: I think an additional touchedByUser field would be required but perhaps there is a way to generalize tracking of Users.
[Christoph:] Probably. Each problem can be solved, if you want. Some solutions are beautiful, others are ugly.
Cheers,
-Andreas
Date: Thu, 7 Mar 2019 18:11:15 +0100
From: "Christoph Valentin" <christoph.valentin(a)gmx.at>
To: "x3d-public(a)web3d.org" <x3d-public(a)web3d.org>
Subject: Re: [x3d-public] Multiplayer strategies
Hi Andreas,
Please find another 2c inline :-)
All the best
Christoph
Sent: Thursday, 07 March 2019 at 14:02
From: "Andreas Plesch" <andreasplesch(a)gmail.com>
To: "X3D Graphics public mailing list" <x3d-public(a)web3d.org>
Subject: Re: [x3d-public] Multiplayer strategies
Hi Christoph,
I admit that it is somewhat uncomfortable to think about single browser MU requirements but exploring this could be a fertile exercise as it seems orthogonal to sharing by multiple scene instances.
[Christoph:] Agree. Maybe I am just prejudiced, because I started with my project from the MU example on Bitmanagement's Homepage (is it still there?), where they used the "sessionId"(SFInt32) to identify the users and their avatars.
Now, when we think about single browser MU requirements, then the users should rather be identified by "sessionId + userId" (where userId would be a local identifier of a user of the local scene instance).
Next step would be to investigate environment sensors. E.g. touch sensor should not deliver "touchTime"(SFTime) but "touchedByUser"(SFInt32) instead, true?
What constitutes the state of a scene which needs synchronization across instances? The main changes which need updating are the progress of time, and avatar state.
[Christoph:] This depends on the use case. If you had a museum-like world (with predefined animation), where only the avatars moved interactively, then you would be right.
I prefer to dream of arbitrarily animated and/or simulated interactive worlds/universes with moving models (cars, trains, houses, doors, ...), where anything could be an avatar (even a locomotive could be an avatar).
Perhaps it suffices to synchronize avatar generated event cascades and replay those in the instances ? Would that require an event queue for recording ?
[Christoph:] What do you mean by "avatar generated event cascade"? I do not understand this term.
When you talk about recording, do you mean LI requirements? I think LI will be a topic, in particular with respect to avatar position, and in particular when it comes to mixed reality.
Another way to think about state synchronization is to have a very complete description of state and then work only with deltas for synchronization and conflict resolution across instances.
[Christoph:] I think the complete state must be stored persistently on a server, because at any time a new scene instance can join the session and must receive the complete state for initialization (do you know how the Network Sensor of Bitmanagement works? I recommend checking it out).
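The snapshot-plus-delta idea discussed above can be sketched as follows (illustrative Python; all names are invented for the example, none come from an X3D API):

```python
# Sketch: a joining client receives one full state snapshot,
# then only deltas afterwards.

def compute_delta(old_state, new_state):
    """Return only the fields that changed between two state snapshots."""
    return {key: value for key, value in new_state.items()
            if old_state.get(key) != value}

def apply_delta(state, delta):
    """Merge a delta into a client's local copy of the shared state."""
    merged = dict(state)
    merged.update(delta)
    return merged

# A joining client receives the complete state once...
server_state = {"door1.open": True, "train1.position": 12.5}
client_state = dict(server_state)

# ...then only deltas afterwards.
server_state["train1.position"] = 13.0
delta = compute_delta(client_state, server_state)
client_state = apply_delta(client_state, delta)
assert client_state == server_state
```

A real implementation would still need a conflict-resolution rule for concurrent deltas from different clients, which this sketch leaves out.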
another 2c :)
Andreas
-Andreas
Date: Thu, 28 Feb 2019 16:40:24 +0000
From: Valentin Christoph <Christoph.Valentin(a)kapsch.net>
To: Andreas Plesch <andreasplesch(a)gmail.com>
Cc: X3D Graphics public mailing list <x3d-public(a)web3d.org>
Subject: Re: [x3d-public] Multiplayer strategies
Message-ID: <VI1PR03MB47038C7653CE09362D433D3EE4750(a)VI1PR03MB4703.eurprd03.prod.outlook.com>
Content-Type: text/plain; charset="utf-8"
Hi Andreas,
Maybe another 2 cents from my side.
If we want to keep it simple, we should keep a 1:1 relationship between user and scene graph (I call this the "personal scene instance", PSI).
Why? Two reasons.
(1) Maybe the scene consists of many "modules", which might span a large section of the Virtual Universe, and which are loaded and unloaded into each scene instance on demand.
One user is, for some time, only interested in module A, so other modules need not be loaded in "his" scene instance, saving memory and CPU resources.
Another user is interested in several modules at the same time, so he will need higher performance in his scene instance.
So the matter of scalability will be easier to handle if we keep user : PSI = 1 : 1.
(2) The scene might provide different "views" to different "users". One user might get a photo-realistic 3D graphic, another user might receive a topographic illustration of the scene with only symbolic content. Only the "shared state" is the same for all scene instances of a multiuser session.
As I said, just my two cents.
All the best.
From: x3d-public <x3d-public-bounces(a)web3d.org> On Behalf Of Andreas Plesch
Sent: Thursday, February 28, 2019 4:25 PM
To: X3D Graphics public mailing list <x3d-public(a)web3d.org>
Subject: Re: [x3d-public] Multiplayer strategies
Thanks for all the thoughtful responses. Various ideas were offered. The DIS component is dedicated to communication and synchronization between browsers in a peer to peer fashion but has its own limitations. Outside of X3D various web technologies such as webRTC, websocket or socket.io exist which can be used with ad hoc protocols and SAI or DOM based scene updating. I think Firebase is designed to push realtime updates of a json store to all connected clients, and could fit well. Synchronization of multiple avatars and persistent avatar registration on a dedicated service was suggested.
It is a wide field. To narrow the domain, let's perhaps consider the local, single game console/browser with multiple controllers and split screen/multiple headsets case, say up to 4 actors, no servers.
Using a projector or a large TV, there is natural sharing which is eliminated with HMDs. Replicating the screen, one mode is just mirroring of a master render to other HMDs but this is very unpleasant in VR. Another mode is one actor with sensing, and other viewers, passive but still moving and looking. Another mode is full access to the scene for all locally connected.
WebXR allows for multiple HMDs and controllers. I am not sure if web browsers can deal with multiple mice/keyboards but I suspect they can; there is a gamepad API.
Brainstorming a multiple avatar, client only design:
- a list of render surfaces, for each avatar, perhaps layout, layer related
- a list of active avatars, linked to a render surface
- a way to add and remove an avatar
- an active viewpoint per avatar
- touchsensor, other sensors linked to a list of avatars
Very fuzzy but perhaps a start for thinking about such a case.
I was looking around castle engine for inspiration from a gaming perspective but could not find much.
It may be that dealing with multiple avatars in a single browser is actually more complicated than a local one browser per avatar, plus a synchronized scene from a scene server design which has to deal with updates to the shared scene, the synchronization, and distribution.
Was there a VRML approach to shared experiences ?
From a practical standpoint, simply mirroring from a master HMD to a second connected HMD using webVR in x3dom would be a first step to explore.
And perhaps exploring Firebase, eg. if it can store a json X3D scene, and how multiple X3DOM or X-ITE instances would receive it, perhaps in an inline.
-Andreas
On Tue, Feb 26, 2019, 5:33 AM Andreas Plesch <andreasplesch(a)gmail.com> wrote:
With VR it may become more common to share a live, dynamic experience using multiple headsets and controllers. At first glance this seems to call for multiple, active viewpoints rendered by a single browser. The layering and layout components seem relevant.
Another strategy would be having multiple browsers with identical scenes and keeping scenes in sync with an additional process and SAI methods.
What are the strategies offered by X3D to support sharing a live, dynamic world ?
This came up as a x3dom GitHub issue and I thought may be more generally interesting.
Andreas
The information contained in this e-mail message is privileged and confidential and is for the exclusive use of the addressee. The person who receives this message and who is not the addressee, one of his employees or an agent entitled to hand it over to the addressee, is informed that he may not use, disclose or reproduce the contents thereof, and is kindly asked to notify the sender and delete the e-mail immediately.
------------------------------
Message: 2
Date: Thu, 7 Mar 2019 18:29:24 +0000
From: "Brutzman, Donald (Don) (CIV)" <brutzman(a)nps.edu>
To: "semantics(a)web3d.org" <semantics(a)web3d.org>, "X3D Graphics public mailing list" <x3d-public(a)web3d.org>
Subject: [x3d-public] X3D Semantic Web Working Group 7 MAR 2019:
references, geometric properties, MPEG-7 Descriptors
Message-ID: <6aa55efc-497c-1813-4bdb-c5843ae2486c(a)nps.edu>
Content-Type: text/plain; charset="utf-8"
8.0. Eighth meeting of the Semantic Web Working Group
Attendees Jakub Flotynski, Athanasios Malamos, Anita Havele, Don Brutzman.
Web3D Teleconference Information
http://www.web3d.org/member/teleconference-information
Prior minutes, Jakub and Athanasios:
[x3d-public] X3D Semantic Web Working Group minutes, 17 JAN 2019: structural and conceptual semantics
http://web3d.org/pipermail/x3d-public_web3d.org/2019-January/009898.html
All information in these minutes is approved for public release.
=================================================================
7.0 Last week's meeting included Thanos, Jakub, Nicholas, Anita and Don.
Essentially we reviewed website links and kept discussing/improving slides.
Unfortunately my minutes were mistakenly deleted. Sorry about that.
=================================================================
8.1 *Working group information*
The X3D Semantic Web Working Group is a Web3D Consortium member-only group that does most of its business openly on the x3d-public mailing list.
X3D Semantic Web Working Group Charter
http://www.web3d.org/working-groups/x3d-semantic-web/charter
X3D Semantic Web Working Group
http://www.web3d.org/working-groups/x3d-semantic-web
"The X3D Semantic Web Working Group mission is to publish models to the Web using X3D in order to best gain Web interoperability and enable intelligent 3D applications, feature-based 3D model querying, and reasoning over 3D scenes."
semantics(a)web3D.org
http://web3d.org/mailman/listinfo/semantics_web3d.org
=================================================================
8.2 *Working group assets*
We have started work on the following website pages to record resources. Many are now present, more will follow.
X3D Semantic Web Public Assets
http://www.web3d.org/x3d-semantic-web-public-assets
X3D Semantic Web Member Assets
http://www.web3d.org/member/wiki/x3d-semantic-web-member-assets
Working group co-chairs have permission to edit the member-assets page, then results are reviewed and copied over to public-assets page.
Inputs welcome to keep building and structuring these important lists of assets.
=================================================================
8.3 *Workshop opportunities*
We think that our current activity can likely be a contribution at the upcoming
First Eurographics-EuroVR Workshop on Semantic 3D Content
6 May 2019 in Genova Italy as part of EuroGraphics 2019
http://semantic3d.org/workshop
Paper/poster/demonstration submission deadline extended: March 10, 2019
We discussed what a good follow-on might be for Web3D 2019 Conference. Perhaps another workshop, or simply a meeting, on X3D Semantic Web Working Group. This could build on the EuroGraphics 2019 momentum, disseminate progress among participants and set us up for much expected work to emerge in the coming year.
WEB3D 2019: 24th International ACM Conference on 3D Web Technology
26-28 July 2019, Colocated with SIGGRAPH2019, Los Angeles California USA
http://www.web3d.org/event/web3d-conference-2019
http://web3d2019.web3d.org
The current work will not be sufficiently mature to be a paper submission by the deadline (~10 days away).
Instead we plan to submit a Web3D workshop proposal for this new work. This can build upon the Eurographics-EuroVR Workshop products.
Possibly a poster is also appropriate, especially if it describes coherent existing work.
Web3D workshop submission deadline: 1 April 2019. This will be next week's topic.
=================================================================
8.4 Primary topic: *slideset Semantic X3D - thoughts and ideas*
Much detailed architectural work is in progress, distilled in these slides.
https://docs.google.com/presentation/d/1fCMu0V-zRAfJqFId7QIMyLh2EOCr5Qgl63M…
_Geometric properties_ (slides 9-10) updated:
Thanos said "sitting in my corner" ... Can we define a property for "corner" based on geometric relationships? Good to think about.
We might suppose multiple candidate geometric properties:
* Primitive shapes: Rectangular, Conical, Cylindrical, Spherical, Ellipsoid, Point, Line, Mesh
* Sides: TopSide, BottomSide, LeftSide, RightSide, FrontSide, BackSide
* ParametricSurface, NURBS, BREP, other types?
* Characteristics: Irregular, Open, Closed (Watertight), Corner, Seam, Wall
* Angular relationships: Perpendicular, Acute, Obtuse
------
Thanos: corner rdfs:subClassOf
str:includes(2) str:triangles;
If normal(A) CROSS-PRODUCT normal(B) >0 -> createCorner(C) and includes(C,A) and includes(C,B)
------
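Thanos's corner rule above could be prototyped roughly as follows (illustrative Python; the angle test between face normals is my interpretation, since the cross product of two normals is a vector rather than a scalar, and the 30-degree threshold is an arbitrary choice):

```python
import math

def normal(tri):
    """Unit normal of a triangle given as three (x, y, z) vertices."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
    u = (bx - ax, by - ay, bz - az)
    v = (cx - ax, cy - ay, cz - az)
    n = (u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0])
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

def forms_corner(tri_a, tri_b, threshold_deg=30.0):
    """Two faces 'form a corner' if their normals differ by more than the threshold."""
    na, nb = normal(tri_a), normal(tri_b)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(na, nb))))
    return math.degrees(math.acos(dot)) > threshold_deg

# the floor and a wall meet at 90 degrees, well above the threshold
floor = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
wall  = ((0, 0, 0), (1, 0, 0), (0, 0, 1))
```

In an ontology-driven pipeline the geometric test would feed the inference step, e.g. asserting createCorner(C) with includes(C, A) and includes(C, B) whenever forms_corner returns true for adjacent faces A and B.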
TODO: does there already exist a set of 3D property classes related to shape of models?
TODO: should we next compare existing 3D ontologies of interest?
TODO: build examples that help us determine the best, most reusable elements of an X3D ontology?
We discussed relationships and 3D functions for extraction of semantic information from geometric shapes, so we could specify possible goals of our work:
* Extraction of semantic information from X3D models (re-visit MPEG7)
* Generating X3D models on the basis of semantic 3D models (conceptual)
* Likely an iterative process, OWL inference can generate more RDF properties
* Semantic annotation (description) of X3D (without representation)
=================================================================
8.5 MPEG-7 overview
We briefly discussed prior work with MPEG-7. Background:
MPEG-7 - Wikipedia
https://en.wikipedia.org/wiki/MPEG-7
"MPEG-7 is a multimedia content description standard. It was standardized in ISO/IEC 15938 (Multimedia content description interface). This description will be associated with the content itself, to allow fast and efficient searching for material that is of interest to the user. MPEG-7 is formally called Multimedia Content Description Interface. Thus, it is not a standard which deals with the actual encoding of moving pictures and audio, like MPEG-1, MPEG-2 and MPEG-4. It uses XML to store metadata, and can be attached to timecode in order to tag particular events, or synchronise lyrics to a song, for example.
It was designed to standardize:
* a set of Description Schemes ("DS") and Descriptors ("D")
* a language to specify these schemes, called the Description Definition Language ("DDL")
* a scheme for coding the description
The combination of MPEG-4 and MPEG-7 has been sometimes referred to as MPEG-47."
TODO Thanos will look up whether licensing or patents have been declared.
MPEG Licensing Authority
https://www.mpegla.com
=================================================================
8.6 MPEG-7 Visual Descriptors: More than MPEG-7
These are draft slides that Thanos has prepared for our discussions. They will be exposed publicly once further developed.
https://docs.google.com/presentation/d/11VSFHriBnOXJzsHfYX0XDUMVqO5X9HePUvQ…
Back to shared slideset, follow-on slide 11:
--------------------------------------
Inclusion of Visual Descriptors in X3D
Visual Descriptors are available for color and shape. They seem quite analogous to structure provided by RDF properties.
It is an interesting question whether the X3D Specifications are "ready" for inclusion of visual descriptors. Visual-descriptor properties are primarily metadata about a scene, not directions for rendering. Indeed our current effort is to create such a conceptually coherent ontology for X3D.
Thus if we define how to include visual descriptor properties in a scene,
* Authors could include Metadata nodes with RDF properties,
* Tools could perform geometric inference and similarly add Metadata nodes
Attaching semantic information to X3D scenes
This working group needs to identify X3D Ontology mappings as
* embedded MetadataSet structures
* embedded (multi-namespace?) and external RDF files
* norms and best practices for including such descriptor files
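As a sketch of the embedded-MetadataSet option above, an author (or a geometric-inference tool) might attach RDF-style property values to a Shape using the standard X3D Metadata nodes. The `name` values and the vocabulary URI below are hypothetical placeholders, not an agreed mapping:

```xml
<Shape>
  <!-- containerField='metadata' attaches this set to the Shape's metadata field -->
  <MetadataSet containerField='metadata' name='semantics'
               reference='http://example.org/x3d-ontology#'>
    <!-- children of MetadataSet populate its value field -->
    <MetadataString containerField='value' name='geometricProperty'
                    value='"Corner" "Perpendicular"'/>
  </MetadataSet>
  <Box size='2 2 2'/>
</Shape>
```

The same structure could also carry a pointer to an external RDF file rather than inline values, which is the second option listed above.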
======================================
8.7 *W3C Ontology for Media Resources*
Really important reference that Jakub identified:
Ontology for Media Resources 1.0
W3C Recommendation 09 February 2012
https://www.w3.org/TR/mediaont-10/
This ontology has an amazing number of metadata correlations. Further, aligning an X3D Ontology with this approach would immediately give us a broad number of terms and mappings, all compatible with HTML5. Excerpt from Table of Contents:
5.2.2 Multimedia metadata formats mapping tables
5.2.2.1 CableLabs 1.1
5.2.2.2 DIG35
5.2.2.3 Dublin Core
5.2.2.4 EBUCore
5.2.2.5 EXIF 2.2
5.2.2.6 ID3
5.2.2.7 IPTC
5.2.2.8 LOM 2.1
5.2.2.9 Media RSS
5.2.2.10 MPEG-7
5.2.2.11 OGG
5.2.2.12 QuickTime
5.2.2.13 DMS-1
5.2.2.14 TTML
5.2.2.15 TV-Anytime
5.2.2.16 TXFeed
5.2.2.17 XMP
5.2.2.18 YouTube
5.2.3 Multimedia container formats mapping tables
5.2.3.1 3GP
5.2.3.2 Flash
5.2.3.2.1 FLV
5.2.3.2.2 F4V
5.2.3.3 QuickTime
5.2.3.4 MP4
5.2.3.5 OGG
5.2.3.6 WebM
We definitely need to review and study this work further.
Wondering if we can invite an expert from that group to give a presentation at Web3D 2019?!
=================================================================
8.8 *Planning Ahead*
We plan to meet 14 MAR and 21 MAR. No meeting 28 MAR.
Of interest:
Protégé Short Course: March 27-29, 2019 at Stanford, CA
https://protege.stanford.edu/short-courses.php
=================================================================
Steady interesting progress, thanks colleagues! 8) 8) 8)
all the best, Don
--
Don Brutzman Naval Postgraduate School, Code USW/Br brutzman(a)nps.edu<mailto:brutzman@nps.edu>
Watkins 270, MOVES Institute, Monterey CA 93943-5000 USA +1.831.656.2149
X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman
------------------------------
Subject: Digest Footer
_______________________________________________
x3d-public mailing list
x3d-public(a)web3d.org<mailto:x3d-public@web3d.org>
http://web3d.org/mailman/listinfo/x3d-public_web3d.org
------------------------------
End of x3d-public Digest, Vol 120, Issue 34
*******************************************