Hi Sebastian,
You should use the 1.2 SVN branch (or maybe the trunk). There are
several fixes already present in the tree, so there is no point in
testing against the original 1.2 tarball.
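For reference, a checkout roughly along these lines should work (I am
writing the repository URL and branch name from memory, so please check
them against the download instructions on openser.org):

  svn checkout https://openser.svn.sourceforge.net/svnroot/openser/branches/1.2 openser-1.2
  cd openser-1.2
  make all include_modules="presence mysql"

The include_modules part is only needed if those modules are not built
by default on your system.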
Regards,
Ovidiu Sas
On 5/17/07, Schumann Sebastian <Sebastian.Schumann(a)t-com.sk> wrote:
Alright, I just used the OpenSER packages from Debian 4.0. I will
compile it myself now and send you the logs again.
Thanks so far.
Sebastian
-----Original Message-----
From: Anca-Maria Vamanu [mailto:anca@voice-system.ro]
Sent: Thursday, May 17, 2007 5:37 PM
To: Schumann Sebastian
Cc: users(a)openser.org
Subject: Re: AW: [Users] OpenSER crashed with Segmentation Fault using
PresenceModule
I have not found the source either. It would really help if you could
compile openser with debug symbols enabled (they are enabled in the
default Makefile), so we can see exactly on which line the segmentation
fault occurs. Then, please post the backtrace again.
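In case it helps, a rough sketch of the rebuild and core dump setup (run
from your openser source tree; the exact steps may differ slightly on
your system):

  # the default CFLAGS already contain -g, so a plain rebuild is enough
  make clean
  make all
  make install

  # allow core files to be written before starting openser again
  ulimit -c unlimited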
Thanks,
Anca
Schumann Sebastian wrote:
Hello,
here is the stderr console output (only the last lines, which should be
the important ones, since I saw there is a message size limit), and the
gdb backtrace.
Do you see any hint about the cause of the crash?
Best regards
Sebastian
________________________________
From: Anca-Maria Vamanu [mailto:anca@voice-system.ro]
Sent: Thu 5/17/2007 16:57
To: Schumann Sebastian
Cc: users(a)openser.org
Subject: Re: [Users] OpenSER crashed with Segmentation Fault using
PresenceModule
Hello,
Please post the log and the gdb backtrace.
regards,
Anca Vamanu
Schumann Sebastian wrote:
>Dear all
>
>I am currently writing my final thesis and need to set up a presence
core for it. I am using OpenSER as Presence Agent (first with integrated
registrar function, now with a separate registrar; both configurations
show the problem). It crashes after a certain time and a certain number
of watcher entries. I cannot reproduce it reliably or narrow it down to
a specific action, but it always happens, no matter whether I clean up
the tables, restart the machine or anything else.
>
>My configuration is one XCAP server (XCAP Lite 1.0 from ag-projects,
working properly), one registrar (OpenSER with MySQL DB, which forwards
SUBSCRIBE/PUBLISH to the PA), one Presence Agent (OpenSER with Presence
Module, MySQL DB and XMLRPC Module) and Eyebeam 1.5 clients. Everything
runs on VMware virtual machines.
>
>I have just three clients that try to subscribe to each other. Without
any subscriptions everything works properly: XML data gets stored,
PUBLISH is processed, and so on. Subscribing one user to another is
possible and works properly.
>
>After a watcher has been allowed to subscribe (via XCAP), the state in
the watcher table changes from pending to active the next time the
watcher logs in. Now and then, when a user goes online and one of his
subscribed presentities is not online, the OpenSER PA crashes.
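(For comparison, the stored state can also be checked directly in the
database. The table and column names below are the ones that show up in
the debug output further down; the database name and MySQL user are just
placeholders for whatever your installation uses:

  mysql -u openser -p openser -e "SELECT to_user, from_user, status FROM active_watchers;"
)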
>
>Analyzing the DEBUG output and the backtrace was not successful: I
found no information or hints about the cause, only the segmentation
fault message itself.
>
>Does anyone have this problem as well, or does anybody have an idea
about it? I can provide logs, the core dump or traces if you need them.
Thanks for your help.
Sebastian
------------------------------------------------------------------------
0(4591) SIP Request:
0(4591) method: <SUBSCRIBE>
0(4591) uri: <sip:bob@10.96.115.125>
0(4591) version: <SIP/2.0>
0(4591) parse_headers: flags=2
0(4591) Found param type 232, <branch> = <z9hG4bK0a73.6d721e06.0>;
state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=2
0(4591) parse_headers: this is the first via
0(4591) After parse_msg...
0(4591) preparing to run routing scripts...
0(4591) parse_headers: flags=100
0(4591) Found param type 232, <branch> =
<z9hG4bK-d87543-666f5148e9613c52-1--d87543->; state=6
0(4591) Found param type 235, <rport> =
<32800>; state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=100
0(4591) parse_headers: this is the second via
0(4591) DEBUG:maxfwd:is_maxfwd_present: value = 69
0(4591) DEBUG: t_newtran: T on entrance=0xffffffff
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) DEBUG:parse_to:end of header reached, state=10
0(4591) DBUG:parse_to: display={}, ruri={sip:bob@10.96.115.125}
0(4591) DEBUG: get_hdr_field: <To> [25]; uri=[sip:bob@10.96.115.125]
0(4591) DEBUG: to body [<sip:bob@10.96.115.125>
]
0(4591) get_hdr_field: cseq <CSeq>: <1> <SUBSCRIBE>
0(4591) DEBUG: get_hdr_body : content_length=0
0(4591) found end of header
0(4591) parse_headers: flags=78
0(4591) t_lookup_request: start searching: hash=14240, isACK=0
0(4591) DEBUG: RFC3261 transaction matching failed
0(4591) DEBUG: t_lookup_request: no transaction found
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) PRESENCE: handle_subscribe:Missing or unsupported event header field value
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) check_via_address(10.96.115.125, 10.96.115.125, 0)
0(4591) PRESENCE:handle_subscribe: ERROR occured
0(4591) DEBUG:tm:UNREF_UNSAFE: after is 0
0(4591) DEBUG:destroy_avp_list: destroying list (nil)
0(4591) receive_msg: cleaning up
0(4591) SIP Request:
0(4591) method: <SUBSCRIBE>
0(4591) uri: <sip:bob@10.96.115.125>
0(4591) version: <SIP/2.0>
0(4591) parse_headers: flags=2
0(4591) Found param type 232, <branch> = <z9hG4bKc989.6f967b16.0>;
state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=2
0(4591) parse_headers: this is the first via
0(4591) After parse_msg...
0(4591) preparing to run routing scripts...
0(4591) parse_headers: flags=100
0(4591) Found param type 232, <branch> =
<z9hG4bK-d87543-4e0baf255d4f5c21-1--d87543->; state=6
0(4591) Found param type 235, <rport> =
<32800>; state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=100
0(4591) parse_headers: this is the second via
0(4591) DEBUG:maxfwd:is_maxfwd_present: value = 69
0(4591) DEBUG: t_newtran: T on entrance=0xffffffff
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) DEBUG:parse_to:end of header reached, state=10
0(4591) DBUG:parse_to: display={}, ruri={sip:bob@10.96.115.125}
0(4591) DEBUG: get_hdr_field: <To> [25]; uri=[sip:bob@10.96.115.125]
0(4591) DEBUG: to body [<sip:bob@10.96.115.125>
]
0(4591) get_hdr_field: cseq <CSeq>: <1> <SUBSCRIBE>
0(4591) DEBUG: get_hdr_body : content_length=0
0(4591) found end of header
0(4591) parse_headers: flags=78
0(4591) t_lookup_request: start searching: hash=39068, isACK=0
0(4591) DEBUG: RFC3261 transaction matching failed
0(4591) DEBUG: t_lookup_request: no transaction found
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) PRESENCE: handle_subscribe:Missing or unsupported event header field value
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) check_via_address(10.96.115.125, 10.96.115.125, 0)
0(4591) PRESENCE:handle_subscribe: ERROR occured
0(4591) DEBUG:tm:UNREF_UNSAFE: after is 0
0(4591) DEBUG:destroy_avp_list: destroying list (nil)
0(4591) receive_msg: cleaning up
0(4591) SIP Request:
0(4591) method: <SUBSCRIBE>
0(4591) uri: <sip:sebastian@10.96.115.125>
0(4591) version: <SIP/2.0>
0(4591) parse_headers: flags=2
0(4591) Found param type 232, <branch> = <z9hG4bKe413.3434b775.0>;
state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=2
0(4591) parse_headers: this is the first via
0(4591) After parse_msg...
0(4591) preparing to run routing scripts...
0(4591) parse_headers: flags=100
0(4591) Found param type 232, <branch> =
<z9hG4bK-d87543-2a56156bb30b720d-1--d87543->; state=6
0(4591) Found param type 235, <rport> =
<32800>; state=16
0(4591) end of header reached, state=5
0(4591) parse_headers: Via found, flags=100
0(4591) parse_headers: this is the second via
0(4591) DEBUG:maxfwd:is_maxfwd_present: value = 69
0(4591) DEBUG: t_newtran: T on entrance=0xffffffff
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) DEBUG:parse_to:end of header reached, state=10
0(4591) DBUG:parse_to: display={}, ruri={sip:sebastian@10.96.115.125}
0(4591) DEBUG: get_hdr_field: <To> [31];
uri=[sip:sebastian@10.96.115.125]
0(4591) DEBUG: to body
[<sip:sebastian@10.96.115.125>
]
0(4591) get_hdr_field: cseq <CSeq>: <1> <SUBSCRIBE>
0(4591) DEBUG: get_hdr_body : content_length=0
0(4591) found end of header
0(4591) parse_headers: flags=78
0(4591) t_lookup_request: start searching: hash=12622, isACK=0
0(4591) DEBUG: RFC3261 transaction matching failed
0(4591) DEBUG: t_lookup_request: no transaction found
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) PRESENCE: handle_subscribe: 'expires' found
0(4591) PRESENCE: handle_subscribe: lexpire= 3600
0(4591) PRESENCE: handle_subscribe: 'To' header ALREADY PARSED:
<sip:sebastian@10.96.115.125>
0(4591) PRESENCE:handle_subscribe: 'From' header not parsed
0(4591) DEBUG: add_param: tag=e035dc2e
0(4591) DEBUG:parse_to:end of header reached, state=29
0(4591) DBUG:parse_to: display={"Bob"}, ruri={sip:bob@10.96.115.125}
0(4591) PRESENCE:handle_subscribe: generating to_tag
0(4591) PRESENCE :proces_rr: out rr
[<sip:10.96.115.125;lr=on;ftag=e035dc2e>]
0(4591)
[p_user]= sebastian [p_domain]= 10.96.115.125
[w_user]= bob [w_domain]= 10.96.115.125
0(4591) [event]= presence
[staus]= active
[expires]= 3600
0(4591) [to_tag]= 10.4591.1179421870.7
[from_tag]= e035dc2e
0(4591) PRESENCE: update_subscribtion ...
0(4591)
[p_user]= sebastian [p_domain]= 10.96.115.125
[w_user]= bob [w_domain]= 10.96.115.125
0(4591) [event]= presence
[staus]= active
[expires]= 3600
0(4591) [to_tag]= 10.4591.1179421870.7
[from_tag]= e035dc2e
0(4591) expires: 3600
0(4591) PRESENCE:update_subscribtion:Inserting into database:
n_query_cols:14
0(4591) [0] = to_user sebastian
0(4591) [1] = to_domain 10.96.115.125
0(4591) [2] = from_user bob
0(4591) [3] = from_domain 10.96.115.125
0(4591) [4] = event presence
0(4591) [5] = event_id
0(4591) [6] = callid YTUwN2U3MmViZDM3NDNmZDdkMTllNTQ4MDgyN2ExMDk.
0(4591) [7] = to_tag 10.4591.1179421870.7
0(4591) [8] = from_tag e035dc2e
0(4591) [9] = contact sip:bob@10.96.100.128:32800
0(4591) [10] = status active
0(4591) [11] = cseq 1
0(4591) parse_headers: flags=ffffffffffffffff
0(4591) check_via_address(10.96.115.125, 10.96.115.125, 0)
0(4591) PRESENCE:get_subs_dialog:querying database table = active_watchers
0(4591) PRESENCE: get_subs_dialog:The query for subscribtion for [user]= sebastian,[domain]= 10.96.115.125 for [event]= presence.winfo returned no result
0(4591) PRESENCE:query_db_notify: Could not get subs_dialog from database
0(4591) PRESENCE:update_subscribtion:Could not send notify for presence.winfo
0(4591) PRESENCE:notify:dialog informations:
0(4591)
[p_user]= sebastian [p_domain]= 10.96.115.125
[w_user]= bob [w_domain]= 10.96.115.125
0(4591) [event]= presence
[staus]= active
[expires]= 3600
0(4591) [to_tag]= 10.4591.1179421870.7
[from_tag]= e035dc2e
0(4591) PRESENCE:get_xcap_tree:The query in table xcap for [username]=sebastian , domain=10.96.115.125 returned result
0(4591) PRESENCE:get_xcap_tree: xcap body:
<?xml version="1.0" encoding="UTF-8"?>
<ruleset xmlns="urn:ietf:params:xml:ns:common-policy"
xmlns:pr="urn:ietf:params:xml:ns:pres-rules"
xmlns:cp="www.counterpath.com/privacy-lists">
<rule id="whitelist">
<conditions>
<identity>
<id entity="sip:bob@10.96.115.125"/>
</identity>
</conditions>
<actions>
<pr:sub-handling>allow</pr:sub-handling>
</actions>
<transformations>
<pr:provide-services>
<pr:all-services/>
</pr:provide-services>
<pr:provide-persons>
<pr:all-persons/>
</pr:provide-persons>
<pr:provide-devices>
<pr:all-devices/>
</pr:provide-devices>
<pr:provide-all-attributes/>
</transformations>
</rule>
<rule id="blacklist">
<conditions>
</conditions>
<actions>
<pr:sub-handling>block</pr:sub-handling>
</actions>
<transformations/>
</rule>
<cp:privacy-lists>
<cp:list name="whitelist_uri_phone">
</cp:list>
<cp:list name="blacklist_uri_phone">
</cp:list>
<cp:list name="whitelist_domain_phone">
</cp:list>
<cp:list name="blacklist_domain_phone">
</cp:list>
<cp:list name="whitelist_uri_IM">
</cp:list>
<cp:list name="blacklist_uri_IM">
</cp:list>
<cp:list name="whitelist_domain_IM">
</cp:list>
<cp:list name="blacklist_domain_IM">
</cp:list>
</cp:privacy-lists>
</ruleset>
0(4591) presence:uandd_to_uri: uri=sip:bob@10.96.115.125
0(4591) PRESENCE:is_watcher_allowed:node1->name= rule
0(4591) PRESENCE:is_watcher_allowed:cond_node->name= conditions
0(4591) PRESENCE:is_watcher_allowed:actions_node->name= actions
0(4591) PRESENCE:is_watcher_allowed:sub_handling_node->name= sub-handling
0(4591) PRESENCE:is_watcher_allowed:sub_handling_node->content= allow
0(4591) PRESENCE:notify: [status]=active
0(4591) PRESENCE:get_p_notify_body: querying presentity
0(4591) PRESENCE: get_p_notify_body: The query returned no result
There is no presence information recorded forthe user
0(4591) PRESENCE:notify: Could not get the notify_body
0(4591) presence:uandd_to_uri: uri=sip:sebastian@10.96.115.125
0(4591) PRESENCE: notify: build notify to user= bob domain= 10.96.115.125 for event= presence
0(4591)
[p_user]= sebastian [p_domain]= 10.96.115.125
[w_user]= bob [w_domain]= 10.96.115.125
0(4591) [event]= presence
[staus]= active
[expires]= 3600
0(4591) [to_tag]= 10.4591.1179421870.7
[from_tag]= e035dc2e
0(4591) PRESENCE:build_str_hdr: expires = 3600
0(4591) PRESENCE:build_str_hdr: subs_expires : 3600
0(4591) PRESENCE: build_str_hdr: headers:
Event: presence
Contact: <sip:10.96.115.128:5060>
Subscription-State: active;expires=3600
Content-Type: application/pidf+xml
0(4591) PRESENCE:notify: headers:Event: presence
Contact: <sip:10.96.115.128:5060>
Subscription-State: active;expires=3600
Content-Type: application/pidf+xml
0(4591) CONTACT = sip:bob@10.96.100.128:32800
0(4591) presence:uandd_to_uri: uri=sip:bob@10.96.115.125
0(4591) DEBUG:tm:t_uac:
next_hop=<sip:10.96.115.125;lr=on;ftag=e035dc2e>
0(4591) DEBUG: mk_proxy: doing DNS lookup...
0(4591) DEBUG:destroy_avp_list: destroying list (nil)
0(4591) DEBUG: dlg2hash: 12619
0(4591) print_request_uri: sip:bob@10.96.100.128:32800
0(4591) DEBUG:tm:set_timer: relative timeout is 500000
0(4591) DEBUG: add_to_tail_of_timer[4]: 0xb5aaefec (41700000)
0(4591) DEBUG:tm:set_timer: relative timeout is 30
0(4591) DEBUG: add_to_tail_of_timer[0]: 0xb5aaf008 (71)
1(4592) DEBUG: timer routine:4,tl=0xb5aa825c next=0xb5aaefec,
timeout=41700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
1000000
1(4592) DEBUG: add_to_tail_of_timer[5]: 0xb5aa825c (42700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:4,tl=0xb5aaefec next=(nil),
timeout=41700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
1000000
1(4592) DEBUG: add_to_tail_of_timer[5]: 0xb5aaefec (42700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:5,tl=0xb5aa825c next=0xb5aaefec,
timeout=42700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
2000000
1(4592) DEBUG: add_to_tail_of_timer[6]: 0xb5aa825c (44700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:5,tl=0xb5aaefec next=(nil),
timeout=42700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
2000000
1(4592) DEBUG: add_to_tail_of_timer[6]: 0xb5aaefec (44700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:6,tl=0xb5aa825c next=0xb5aaefec,
timeout=44700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (48700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:6,tl=0xb5aaefec next=(nil),
timeout=44700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (48700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:2,tl=0xb5aa9bc8 next=(nil), timeout=46
1(4592) DEBUG: wait_handler : removing 0xb5aa9b80 from table
1(4592) DEBUG: delete transaction 0xb5aa9b80
1(4592) DEBUG: wait_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=48700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (52700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=48700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (52700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=52700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (56700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=52700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (56700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=56700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (60700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=56700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (60700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=60700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (64700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=60700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (64700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=64700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (68700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=64700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (68700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec,
timeout=68700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aa8110, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aa825c (72700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil),
timeout=68700000
1(4592) DEBUG: retransmission_handler : request
resending
(t=0xb5aaeea0, NOTIFY si ... )
1(4592) DEBUG:tm:set_timer: relative timeout is
4000000
1(4592) DEBUG: add_to_tail_of_timer[7]: 0xb5aaefec (72700000)
1(4592) DEBUG: retransmission_handler : done
1(4592) DEBUG: timer routine:0,tl=0xb5aa8278 next=0xb5aaf008, timeout=71
1(4592) DEBUG: final_response_handler:stop retr. and send CANCEL (0xb5aa8110)
1(4592) DEBUG:tm:t_should_relay_response: T_code=0, new_code=408
1(4592) DEBUG:tm:t_pick_branch: picked branch 0, code 408
1(4592) DEBUG:tm:local_reply: branch=0, save=0, winner=0
1(4592) DEBUG:tm:local_reply: local transaction completed
1(4592) DBG: trans=0xb5aa8110, callback type 256, id 0 entered
1(4592) PRESENCE:p_tm_callback: completed with status 408 [watcher_id:0xb5aaa9cc/10.4591.1179421870.4]
1(4592) DEBUG: add_to_tail_of_timer[2]: 0xb5aa8158 (76)
1(4592) DEBUG: final_response_handler : done
1(4592) DEBUG: timer routine:0,tl=0xb5aaf008 next=(nil), timeout=71
1(4592) DEBUG: final_response_handler:stop retr. and send CANCEL (0xb5aaeea0)
1(4592) DEBUG:tm:t_should_relay_response: T_code=0, new_code=408
1(4592) DEBUG:tm:t_pick_branch: picked branch 0, code 408
1(4592) DEBUG:tm:local_reply: branch=0, save=0, winner=0
1(4592) DEBUG:tm:local_reply: local transaction completed
1(4592) DBG: trans=0xb5aaeea0, callback type 256, id 0 entered
1(4592) PRESENCE:p_tm_callback: completed with status 408 [watcher_id:0xb5aaacf4/10.4591.1179421870.7]
1(4592) DEBUG: add_to_tail_of_timer[2]: 0xb5aaeee8 (76)
1(4592) DEBUG: final_response_handler : done
1(4592) DEBUG: timer routine:7,tl=0xb5aa825c next=0xb5aaefec, timeout=72700000
1(4592) DEBUG: timer routine:7,tl=0xb5aaefec next=(nil), timeout=72700000
1(4592) DEBUG: timer routine:2,tl=0xb5aa8158 next=0xb5aaeee8, timeout=76
1(4592) DEBUG: wait_handler : removing 0xb5aa8110 from table
1(4592) DEBUG: delete transaction 0xb5aa8110
1(4592) DEBUG: wait_handler : done
1(4592) DEBUG: timer routine:2,tl=0xb5aaeee8 next=(nil), timeout=76
1(4592) DEBUG: wait_handler : removing 0xb5aaeea0 from table
1(4592) DEBUG: delete transaction 0xb5aaeea0
1(4592) DEBUG: wait_handler : done
1(4592) PRESENCE: msg_watchers_clean:cleaning pending subscriptions
1(4592) PRESENCE: msg_active_watchers_clean:cleaning expired watcher information
1(4592) PRESENCE:msg_presentity_clean:cleaning expired presentity information
1(4592) PRESENCE: msg_watchers_clean:cleaning pending subscriptions
1(4592) PRESENCE: msg_active_watchers_clean:cleaning expired watcher information
1(4592) PRESENCE:msg_presentity_clean:cleaning expired presentity information
------------------------------------------------------------------------
...
(no debugging symbols found)
Core was generated by `openser -f openser.cfg -w .'.
Program terminated with signal 11, Segmentation fault.
#0 0x0809e332 in fm_free ()
(gdb) bt
#0 0x0809e332 in fm_free ()
#1  0xb7b7d67a in handle_subscribe () from /usr/lib/openser/modules/presence.so
#2 0x08051224 in do_action ()
#3 0x08052f8c in run_action_list ()
#4 0x080521af in do_action ()
#5 0x08052f8c in run_action_list ()
#6 0x080525a1 in do_action ()
#7 0x08052f8c in run_action_list ()
#8 0x08053247 in run_top_route ()
#9 0x08074e04 in receive_msg ()
#10 0x0809832d in udp_rcv_loop ()
#11 0x080664f4 in main_loop ()
#12 0x08067f25 in main ()
(gdb) quit
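Once a binary with debug symbols is in place, a gdb session roughly like
the one below should show the exact line inside handle_subscribe() (the
commands are standard gdb; the binary and core file paths are only
examples):

  gdb /usr/local/sbin/openser core
  (gdb) bt full
  (gdb) frame 1
  (gdb) list
  (gdb) info locals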
_______________________________________________
Users mailing list
Users(a)openser.org
http://openser.org/cgi-bin/mailman/listinfo/users