Queuing has been in the news lately. First, the Wall Street Journal’s most recent The Numbers column was on queuing theory (The Science of Standing in Line, Oct 7). The story is in some ways disappointing since it emphasizes the history of queuing over its current applications or general insights. However, it does feature this rather spiffy graphic contrasting service systems in which several servers pull from a common queue with systems in which each server has a separate line.
OK, so this may not be the slickest queuing graphic the Journal has ever run, but it emphasizes a common point in queuing: Service benefits from pooling. That is, a system in which multiple servers pull from the same queue will offer a shorter wait at the same level of utilization as a system in which customers stand in separate lines. So just rearranging how people stand can offer shorter waits without hiring more workers or adding more capacity.
Choose a single line that leads to several cashiers
Not all lines are structured this way, but research has largely shown that this approach, known as a serpentine line, is the fastest. The person at the head of the line goes to the first available window in a system often seen at airports or banks.
Getting into a single line also provides a sense of psychological relief because it eliminates the choice of where to go and second-guessing about the best line to choose, said Julie Niederhoff, an assistant professor of supply chain management at Syracuse University.
Still, most people prefer to take their chances with parallel lines — individual lines dedicated to a single cashier — even though most of the time they end up picking a slower line, Professor Marsden said.
Douglas E. Norton, a professor of mathematics and statistics at Villanova University, said studies had typically shown that with three tellers, each serving his or her own line of customers, the wait time was three times longer on average than a single line leading to an array of tellers.
So why is the less efficient parallel line model used at grocery stores? “Essentially, nobody wants a huge line of folks with full grocery carts winding (like a serpent) around their store,” he said in an email.
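Norton’s three-to-one figure is easy to sanity-check with standard queueing formulas. Here is a small sketch, with illustrative numbers of my own choosing (three servers at 90% utilization, not parameters from the study), comparing the expected wait in a pooled single line (an M/M/c queue, via the Erlang C formula) against three separate M/M/1 lines, each receiving a third of the arrivals:

```python
import math

def erlang_c(c, a):
    """Probability an arriving customer must wait in an M/M/c queue
    (Erlang C formula), where a = lam / mu is the offered load."""
    num = a**c / math.factorial(c) * (c / (c - a))
    den = sum(a**k / math.factorial(k) for k in range(c)) + num
    return num / den

lam, mu, c = 2.7, 1.0, 3          # arrivals/min, service rate, servers
rho = lam / (c * mu)              # utilization: identical in both systems

# Pooled: one FIFO line feeding all c servers (M/M/c expected queue wait).
wq_pooled = erlang_c(c, lam / mu) / (c * mu - lam)

# Separate: c independent M/M/1 queues, each seeing lam/c arrivals.
wq_separate = (lam / c) / (mu * (mu - lam / c))

print(f"utilization {rho:.0%}: pooled wait {wq_pooled:.2f} min, "
      f"separate wait {wq_separate:.2f} min")
```

At these parameters the separate-line wait comes out at a bit over three times the pooled wait, consistent with the studies Norton describes; the exact ratio depends on utilization.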
Of course, there are some caveats here that we are conveniently ignoring. Perhaps most importantly, we are assuming that the arrival rate to individual queues in a system with separate queues is independent of the state of the system. That is, there are customers who are destined to go to Server 1, and they will get in line for Server 1 even if Server 1 has five customers waiting while Server 2 sits idle. So the straw man of separate queues is more appropriate for, say, a bank having customers call a national call center as opposed to their local branches. It is not really appropriate for a setting in which customers see the state of each queue and can choose the one that is shortest. Indeed, if we allow customers to see the queues before they join and then jump between queues if one is moving faster, then there is no difference in the average wait between the systems (although issues of fairness and stress are relevant).
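To see how much work the independent-arrivals assumption is doing, here is a small simulation sketch (illustrative parameters of my choosing, not from either article) comparing a pooled line against separate lines in which each customer is pre-committed to a randomly chosen server. Note that if customers could instead join the server with the least remaining work, the separate-line system would reproduce the pooled system’s start times exactly, which is the no-difference point above in its starkest form:

```python
import heapq, random, statistics

random.seed(42)

def simulate(policy, n=200_000, lam=2.7, mu=1.0, c=3):
    """Average time in queue (before service starts) over n customers.

    policy 'pooled'   : one FIFO line; next customer takes the first free server.
    policy 'separate' : each customer is pre-assigned a server at random and
                        waits in that server's line no matter what.
    """
    t = 0.0
    waits = []
    if policy == "pooled":
        free = [0.0] * c              # heap of times at which servers free up
        heapq.heapify(free)
        for _ in range(n):
            t += random.expovariate(lam)
            start = max(t, heapq.heappop(free))
            waits.append(start - t)
            heapq.heappush(free, start + random.expovariate(mu))
    else:
        free = [0.0] * c              # per-server "free at" times
        for _ in range(n):
            t += random.expovariate(lam)
            i = random.randrange(c)   # this customer is destined for server i
            start = max(t, free[i])
            waits.append(start - t)
            free[i] = start + random.expovariate(mu)
    return statistics.mean(waits)

print("pooled   :", round(simulate("pooled"), 2))
print("separate :", round(simulate("separate"), 2))
```

At 90% utilization the simulated separate-line wait comes out roughly three times the pooled wait, in line with the analytical comparison; the gap is entirely an artifact of customers being locked into a line regardless of the system's state.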
There is another interesting point raised in the Times article.
Beware of lines with obstructions
If you find yourself in a line that snakes around a corner or where the cashier’s view of the number of customers is obstructed by a wall or a shelf, be prepared for a longer wait, one study found. The study, by Professors Niederhoff and Masha Shunko of the University of Washington and Yaroslav Rosokha of Purdue University, released in June, noted that obstructions hinder the feedback cashiers get from seeing how their work thins the line.
The paper in question, Humans Are Not Machines: The Behavioral Impact of Queueing Design on Service Time, actually goes beyond what is described above. (See here for a synopsis of the work in Scientific American.) In particular, they run experiments and show that when servers are responsible for their own line (i.e., Server 1 can see how many customers are waiting just for him), they work faster. Similar results have been found in non-laboratory settings — like hospital emergency departments.
We thus have two countervailing effects. Pooling leads to shorter waits assuming that the service rate doesn’t change, BUT pooling can lower the service rate. Taken together with the observation that the benefits of pooling can be overstated when customers can choose their line strategically, this suggests that running a service system with separate lines may not be so bad after all.
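One rough way to size this trade-off: hold arrivals fixed and ask how much faster servers would have to work under dedicated lines to match the pooled system’s wait. A sketch with illustrative numbers of my own choosing (three servers at 90% utilization, baseline service rate of one customer per minute), using the Erlang C formula for the pooled side:

```python
import math

def erlang_c(c, a):
    """Erlang C: probability of waiting in an M/M/c queue, a = lam / mu."""
    num = a**c / math.factorial(c) * (c / (c - a))
    return num / (sum(a**k / math.factorial(k) for k in range(c)) + num)

lam, mu, c = 2.7, 1.0, 3          # arrivals/min, baseline service rate, servers

# Pooled wait with servers working at the baseline rate mu.
wq_pooled = erlang_c(c, lam / mu) / (c * mu - lam)

# Separate M/M/1 wait as a function of a (possibly faster) service rate m.
wq_sep = lambda m: (lam / c) / (m * (m - lam / c))

# Smallest speedup at which dedicated lines match the pooled wait,
# found by bisection on the service rate (wq_sep is decreasing in m).
lo, hi = mu, 3 * mu
for _ in range(60):
    mid = (lo + hi) / 2
    if wq_sep(mid) > wq_pooled:
        lo = mid
    else:
        hi = mid

print(f"separate lines break even if servers speed up by {hi / mu - 1:.0%}")
```

At these parameters, dedicated lines break even if visibility of one’s own queue speeds servers up by a bit under 20%; whether the behavioral effect is actually that large is the empirical question the paper’s experiments speak to.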