Dear all,
We have been testing this module with the following setup:
Kamailio 5.3.2
evapi params:
modparam("evapi", "workers", 4)
modparam("evapi", "netstring_format", 0)
modparam("evapi", "bind_addr", "127.0.0.1:8448")
modparam("evapi", "max_clients", 32)
Then, in the configuration, we call evapi_relay() on an AVP containing JSON
data (which can be quite long), like this:
{"key" : "aarp2q0tcpqhs0cpucuhukjs2ah2j00q(a)10.18.5.64" ,
"msg" :
{"rg_in":"701","ani_init":{"ani_source":"pai",
....... }}}
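For reference, a minimal sketch of how we invoke it (the AVP name
$avp(evjson), the $var(json) helper and the JSON content are simplified
placeholders here, not our real config):

# build the JSON payload (real payload is built elsewhere and is much larger)
$var(json) = '{"key" : "' + $ci + '" , "msg" : {"rg_in":"701"}}';
$avp(evjson) = $var(json);
# relay the AVP content to the connected evapi clients
evapi_relay("$avp(evjson)");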
We have an application listening on the TCP socket that writes those
messages to a Kafka cluster. This works fine, and no issue was found in the
manual tests we did previously.
But when running some load tests and passing some live traffic, we see some
issues.
It seems that sometimes, when there are several messages to be sent to the
TCP socket at the same time, they are delivered together in the same
message, whereas normally each piece of data sent with evapi_relay() is
delivered as one message.
We sometimes see something like this on the application consuming from the
TCP socket:
2020-11-25 15:20:01.744 UTC [error]
<0.706.0>@evapi_kafka_listener:handle_info:167 body "{\"key\" :
\"
6142651aa63616c6c04a783cd@72.21.24.130\" , \"msg\" :
{\"rg_in\":\"677\",\"ani_init\":{\"ani_source\":\"fro\",.......}}}{\"key\"
: \"isbc7caT4001915251VabcGhEfHdNiF0i(a)172.16.120.1\" , \"msg\" :
{\"rg_in\":\"22\",\"ani_init\":{\"ani_source\":\"pai\",
.......
,\"translate" not valid json; error = {691,invalid_trailing_data}
2020-11-25 15:20:01.745 UTC [error]
<0.706.0>@evapi_kafka_listener:handle_info:167 body
"dPartition\":\"-1\",......}}}" not valid json; error =
{1,invalid_json}
And we see that the application cannot parse the JSON message, because
there are two JSON objects concatenated together:
......{\"ani_source\":\"fro\",.......}}}{\"key\" :
\"isbc7caT4001915251Vabc............
This happens when 2 different UDP receiver processes handle messages and
call evapi_relay() at the same time, but I don't think it happens every
time. It looks like an issue when several processes try to use the evapi
workers at the same time.
We tried increasing the number of evapi workers and the result is the same.
I think we also saw another issue: when the AVP sent to the evapi socket is
bigger than ~1680 characters, the JSON is also truncated. This also happens
when we use the socket on the lo interface, which has an MTU of 65535.
Could you please take a look to see if there is any problem or limitation,
or whether we are using it wrong?
Thanks and best regards,
David
--
David Escartín Almudévar
VoIP/Switch Engineer
descartin@sonoc.io
*SONOC*
C/ Josefa Amar y Borbón, 10, 4ª · 50001 Zaragoza, España
Tlf: +34 917019888 ·
www.sonoc.io