On Sunday 07 August 2011, Yaron Nachum wrote:
> I have downloaded sr 3.1.4 and have been playing around with it. I am considering implementing Number Translation services (1800...) and Number Portability services on it. We have around 20K entries for the Number Translation service and around 3M entries for the Number Portability service. The volume is around 100 CAPS.
Hello Yaron,
> I have started working with the carrierroute module - it seems perfect for the Number Translation service, but I am not sure about the Number Portability. I have several questions:
> 1. Is it possible to use this module with several million entries?
> 2. What is the impact of loading the table? Is it possible to do it during traffic hours?
It should be possible to use cr with several million entries, just keep in mind to increase the shared memory setting somewhat; 1024 MB is probably a good start. I know from reports that people use it successfully in scenarios with millions of entries as well.
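A minimal cr setup could look like this; the db url is a placeholder and the "default" carrier and domain names are just examples, so adjust them to your data (and start the server with more shared memory, e.g. -m 1024 on the command line):

    loadmodule "db_mysql.so"
    loadmodule "tm.so"
    loadmodule "sl.so"
    loadmodule "carrierroute.so"
    modparam("carrierroute", "config_source", "db")
    modparam("carrierroute", "db_url", "mysql://ser:secret@localhost/ser")

    route {
        # rewrite the request URI from the in-memory routing data;
        # hash over the Call-ID for an even load distribution
        if (!cr_route("default", "default", "$rU", "$rU", "call_id")) {
            sl_send_reply("404", "Not found");
            exit;
        }
        t_relay();
    }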
It's of course possible to do the data reloading during traffic, you'll just need double the amount of RAM. ;-) The reason for this is that the module discards the old data only after the new data has been loaded successfully.
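The reload can then be triggered at runtime, e.g. (assuming you use sercmd to talk to the RPC interface):

    sercmd cr.reload_routes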
> I have read about the PDB module, but I haven't understood how to set up the PDB server. I would appreciate any guidance. Would you recommend using the PDB module for the Number Portability, or carrierroute?
The PDB module is a bit more complicated to set up, but it has lower performance requirements and can be placed on a dedicated server. It was written especially for number portability. Basically it works like this: you set up the server daemon (pdbd) on some machine and feed it with a precompiled data file. There is a small compiler (pdbt) to create this mapping file from text data. You then use the pdb module in your configuration to get the carrier ID for a number and do your routing accordingly in the script, e.g. with cr. You'll find the tools in the utils/pdbt directory in sr.
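The script side could look roughly like this; the server address and the AVP name are only examples (the pdb module just needs the host:port your pdbd listens on):

    loadmodule "pdb.so"
    modparam("pdb", "server", "localhost:5574")
    # give up after 50 ms so a dead pdbd does not stall call setup
    modparam("pdb", "timeout", 50)

    route {
        # ask pdbd for the carrier id of the dialled number
        if (pdb_query("$rU", "$avp(s:carrier)")) {
            # then route according to that carrier, e.g. with cr
            if (!cr_route("$avp(s:carrier)", "default", "$rU", "$rU", "call_id")) {
                sl_send_reply("404", "Not found");
                exit;
            }
        }
        t_relay();
    }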
But for the performance requirements you quoted, I'd also look into another scenario - what about just storing this data in a "normal" database and querying it with sqlops? If you give the DB enough RAM, it should also be fast enough.
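For completeness, a sqlops sketch under the same assumptions; the connection name, table and column are made up, so replace them with your schema:

    loadmodule "db_mysql.so"
    loadmodule "sqlops.so"
    modparam("sqlops", "sqlcon", "npdb=>mysql://ser:secret@localhost/np")

    route {
        # escape the user part before putting it into the query
        sql_query("npdb", "select carrier from ported where number='$(rU{s.escape.common})'", "ra");
        if ($dbr(ra=>rows) > 0) {
            # first row, first column holds the carrier id
            $avp(s:carrier) = $dbr(ra=>[0,0]);
        }
        sql_result_free("ra");
    }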
Best regards,
Henning