Clear now, thanks!
a) When you have a lot of SMPP binds and high message traffic, the performance of the server running Diafaan SMS Server matters most, specifically the performance of its hard disk. The performance of the server running SQL Server (Express) is not important for the performance of Diafaan SMS Server, but the network connection between Diafaan SMS Server and MS SQL Server must be fast and reliable, preferably within the same LAN.
b) Yes, but only if the '...RetryOnError' properties in the advanced settings of the SQL Connector are set to 'True' (the default is 'False') and the SQL Connector is not stopped or restarted while the database is offline.
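The '...RetryOnError' behaviour described above amounts to a store-and-forward pattern: failed inserts stay queued and are retried once the database is reachable again. A minimal sketch of that pattern, purely illustrative (the `insert_into_sql` function is a hypothetical stand-in, not Diafaan's actual internals):

```python
import time

def insert_into_sql(row, db_online):
    # Hypothetical stand-in for the real database insert.
    if not db_online:
        raise ConnectionError("SQL Server unreachable")
    return True

def flush_with_retry(pending, db_online, max_attempts=3):
    """Try to write queued messages; messages stay queued on failure."""
    delivered = []
    while pending:
        row = pending[0]
        for attempt in range(max_attempts):
            try:
                insert_into_sql(row, db_online)
                delivered.append(pending.pop(0))
                break
            except ConnectionError:
                time.sleep(0)  # real code would back off between attempts
        else:
            # Database still offline: keep the queue and retry later.
            break
    return delivered
```

While the database is offline nothing is lost, the queue simply keeps growing; when it comes back, the whole backlog is flushed in order.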
c) I can't give you specific requirements. For Diafaan SMS Server a fast (SSD) hard disk helps performance a great deal; memory and CPU are less important. The specifications of the SQL Server machine are not very important, but the connection between Diafaan SMS Server and the database server should be reliable and fast.
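To see why the network link is rarely the bottleneck at 30-40 MPS, a quick back-of-envelope calculation helps. The 2 KB per message is an assumed average row size (the real figure depends on the message log schema and message length):

```python
# Assumed figures, not measured values:
mps = 40                       # peak messages per second
bytes_per_message = 2 * 1024   # assumed average stored row size

throughput_bps = mps * bytes_per_message * 8   # bits per second on the wire
gigabit = 1_000_000_000

print(throughput_bps)             # 655360 bits/s
print(throughput_bps / gigabit)   # a tiny fraction of gigabit capacity
```

Even with generous overhead, 40 MPS is well under a megabit per second, so any healthy LAN link is sufficient; latency and reliability matter more than bandwidth. The disk, by contrast, must absorb many small random writes per message, which is why an SSD helps Diafaan much more than extra CPU or RAM.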
In our tests with Diafaan we log the message log to SQL Server Express. It works fine with both on the same machine running on an SSD. No customer is using the SQL Connector.
For maintenance reasons I would like to run the SQL Server on a separate server and have a few questions:
a) Which component needs the HD performance when we have high-volume SMPP binds: Diafaan, SQL Server, or both?
b) If the SQL Server is shut down for maintenance/backup overnight, when traffic is small, will Diafaan store the messages and update the SQL DB when it is back online?
c) What is required when we have 30-40 MPS and run SQL and Diafaan on separate machines:
i) Connectivity: Gigabit?
ii) HD for Diafaan: Enterprise disk SATA vs enterprise disk SAS vs good SSD?
iii) HD for SQL: Enterprise disk SATA vs enterprise disk SAS vs good SSD?
iv) Memory required?
v) CPU required?
I need general guidelines, nothing too precise.
Thanks for the help