When you change the confidence interval on the Erlang C model, is there a simple mathematical relationship that relates the new interval to the average hold time? That is, if you specify, for example, 95% of calls to be answered in 10 seconds, is there a simple formula that allows you to deduce the time within which 50% of the calls are answered? I am thinking along the lines of the normal distribution, where you can link them in terms of the standard deviation.
Yes. In the Erlang C model the waiting time of those calls which queue is exponentially distributed (rather than normal), so the two service levels are linked by a simple formula:

ServiceTime = QTime * ln(ErlangC / (1 - Confidence))

Where ServiceTime is the number of seconds within which the required percentage of calls will be answered. QTime is the average time queueing for those calls which queue (i.e. it excludes calls which go straight through to an agent). Confidence is the required percentage for the ServiceTime, expressed as a fraction. ErlangC is the likelihood of the call being queued, also expressed as a fraction.
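To make that concrete, here is a minimal Python sketch of the relationship. The function name, the assumed ErlangC figure of 0.6, and the variable names are all my own, chosen purely for illustration; the formula itself is the one above.

```python
import math

def service_time(qtime, confidence, erlang_c):
    # P(answered within t) = 1 - ErlangC * exp(-t / QTime);
    # solving for t gives t = QTime * ln(ErlangC / (1 - Confidence)).
    if confidence <= 1 - erlang_c:
        return 0.0  # target already met by the calls that never queue
    return qtime * math.log(erlang_c / (1 - confidence))

# Example: suppose 60% of calls queue (ErlangC = 0.6, an assumed figure).
# A 95%-in-10-seconds target then implies a QTime of:
qtime_s = 10 / math.log(0.6 / (1 - 0.95))  # roughly 4 seconds
print(service_time(qtime_s, 0.50, 0.6))    # time to answer 50% of calls
```

Note the guard clause: if the target percentage is no higher than the percentage of calls that never queue at all, the answer time is simply zero.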
QTime can be calculated by:

QTime = 3600 / ((Agents * 3600/AHT) - CallsPerHour)

Where Agents = number of agents available, AHT = average handle time in seconds, and CallsPerHour = calls per hour! The bracketed term is the spare handling capacity in calls per hour, so the 3600 on top converts the result from hours into seconds, keeping QTime in the same units as ServiceTime.
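Putting the two pieces together, here is a short Python sketch (again with names and example figures of my own choosing) that computes the Erlang C queueing probability and QTime from the staffing inputs, then reports the answer time for a few target percentages:

```python
import math

def erlang_c(agents, calls_per_hour, aht):
    # Offered traffic in erlangs.
    a = calls_per_hour * aht / 3600.0
    if a >= agents:
        return 1.0  # overloaded: every call queues
    # Erlang C: (a^n/n!) / (a^n/n! + (1 - a/n) * sum_{k=0}^{n-1} a^k/k!)
    top = a ** agents / math.factorial(agents)
    rest = sum(a ** k / math.factorial(k) for k in range(agents))
    return top / (top + (1 - a / agents) * rest)

def qtime(agents, calls_per_hour, aht):
    # Average wait in seconds for those calls which queue.
    return 3600.0 / (agents * 3600.0 / aht - calls_per_hour)

# Example: 10 agents, 100 calls per hour, 300 s AHT (assumed figures).
c = erlang_c(10, 100, 300)
q = qtime(10, 100, 300)
for pct in (0.50, 0.80, 0.95):
    t = max(0.0, q * math.log(c / (1 - pct)))
    print(f"{pct:.0%} answered within {t:.0f} s")
```

With these inputs roughly half the calls go straight to an agent, so the 50% level comes out at about zero seconds while the 95% level is several minutes, which shows how steeply the tail of the exponential stretches the higher targets.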
Lester.