Friday, August 15, 2014

Scheduling Instability

Fellow OR blogger Laura McLay recently wrote a post "in defense of model simplicity", which is definitely worth the read. It contains a slew of links to related material. As I read it, though, my contrarian nature had me thinking "yes ... as long as the model is not too simple".

A recent piece in the NY Times ("Working Anything but 9 to 5") had me thinking again about model simplicity. The gist of the Times article is that irregular (erratic, chaotic) work schedules for hourly employees, coupled with short notice of those schedules, create stress for some families, particularly single-parent families where daycare needs to be arranged. Fairly deep into the article, I found someone saying what I was thinking:
Legislators and activists are now promoting proposals and laws to mitigate the scheduling problems. But those who manufacture and study scheduling software, including Mr. DeWitt of Kronos, advocate a more direct solution: for employers and managers to use the software to build in schedules with more accommodating core hours.

“The same technology could be used to create more stability and predictability,” said Zeynep Ton, a professor at M.I.T. who studies retail operations.
The multiperiod nature of scheduling models tends to make them a bit chewy, especially when you need to coordinate schedules of multiple individuals (or machines, or venues) across blocks of time while dealing with multiple constraints and possibly multiple conflicting criteria. As with most discrete optimization problems, simplicity of the model tends to pay dividends in reduced execution time. That said, models that take into account the cumulative schedules of workers do exist. In the U.S., the Federal Aviation Administration has rules limiting consecutive hours in the cockpit and rest times for pilots flying passenger routes. (If you think this is another example of the "nanny state", have a look at this article about an Indian airliner imitating a roller coaster.)

There are at least four dimensions in play that may be contributing to the high variance in schedules for some employees:
  1. Is the employer concerned enough about scheduling chaos to make it a policy to mitigate variance in schedules of individual employees?
  2. If so, is the policy being implemented?
  3. How does the employer assess tradeoffs between smoothing of schedules and other criteria (adequate staffing, staffing cost, matching staff skills with needs, ...)?
  4. Can/does the scheduling software (assuming software is used) implement variance-mitigation either as constraints or in the objective function?
The second dimension, in my opinion, is one best left to the management folks (my erstwhile colleagues). The fourth dimension clearly belongs to the OR crowd: we need to make provision for those sorts of constraints/criteria, even at the risk of making models more complicated (cough Laura cough), and we need to communicate clearly to the users that those provisions exist (and how to use them). I'm inclined to think that the first and third dimensions straddle the disciplines of OR (or, if you prefer, analytics) and management. The management folks can provide evidence-based arguments of how more worker-friendly schedules can improve employee performance and retention, while those of us in OR can perhaps help assess the trade-offs and quantify costs and benefits.
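To make the fourth dimension concrete: one way software can mitigate variance is to add a penalty on schedule instability to the objective, weighted against staffing cost. The sketch below is a toy brute-force illustration of that trade-off, not any commercial package's actual method; all costs, hours, and the weight are made-up numbers.

```python
from itertools import product
from statistics import pvariance

# Hypothetical staffing cost of starting one worker's shift at a given
# hour, for four consecutive weeks. The cheapest hour alternates, so a
# pure cost minimizer would bounce the worker between 8 a.m. and 4 p.m.
week_costs = [
    {8: 100, 12: 95, 16: 90},
    {8: 90, 12: 95, 16: 100},
    {8: 100, 12: 95, 16: 90},
    {8: 90, 12: 95, 16: 100},
]

def best_schedule(stability_weight):
    """Choose one start hour per week, minimizing staffing cost plus a
    penalty proportional to week-to-week variance in start times."""
    best, best_obj = None, float("inf")
    for sched in product(*week_costs):  # one start hour per week
        cost = sum(wc[h] for wc, h in zip(week_costs, sched))
        obj = cost + stability_weight * pvariance(sched)
        if obj < best_obj:
            best, best_obj = sched, obj
    return best

print(best_schedule(0))   # weight 0: chase the cheapest hour each week
print(best_schedule(10))  # positive weight: a steady start hour wins
```

With a zero weight the model picks the erratic, cheapest-per-week schedule; with a modest weight it settles on a constant start hour at slightly higher cost. A real formulation would express the same idea as linear constraints or penalty terms in a MIP rather than enumeration, but the trade-off is the same.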

[Update: Starbucks, at least, appears to be implementing policy changes. Thanks to Tallys Yunes for this link: "Starbucks to Change Scheduling as Some Employees Struggle".]

I'll conclude with a wonderful anecdote, told to me long ago by a colleague from our statistics department, about model simplicity. He attended a colloquium by what would now be called a biomathematician (I don't think the category existed back then). The presenter had developed a model for arterial blood flow in the human body, using a system of nonlinear differential equations. The right-hand sides of the equations represented a forcing function. The system was too complex for the presenter to solve, so he simplified things by solving the homogeneous case (setting the right sides of the equations to zero). At the conclusion of the talk, after the usual round of applause, the presenter asked if there were any questions. He got at least one: "If the forcing terms are all zero, doesn't that mean the patient is dead?" It does -- the simpler model that he had solved required that the body's heart not beat.
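The talk's actual equations were not recorded, but the punch line can be illustrated with a generic forced linear system:

```latex
\dot{\mathbf{x}}(t) = A\,\mathbf{x}(t) + \mathbf{f}(t)
```

Here \(\mathbf{x}(t)\) would be the state of the blood flow and \(\mathbf{f}(t)\) the forcing function, i.e., the pumping of the heart. Solving the homogeneous case means setting \(\mathbf{f}(t) \equiv 0\), which in this context amounts to assuming the heart never beats.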

Simplicity in models is good, but arguably not worth dying for.
