In April 2001, Cisco Systems wrote off $2.2 billion of excess inventory and cut 8,500 workers, but it wasn't enough to keep the Internet infrastructure giant from turning in a dismal third-quarter report. After making $800 million in the first quarter, the San Jose, Calif.-based company had a $2.7 billion loss -- a shocking turnaround for one of the world's fastest-growing corporations. What happened?
In the words of Business Week, "Cisco had flattened the corporate pyramid, outsourced capital-intensive manufacturing, and forged strategic alliances with suppliers that were supposed to eliminate inventory almost entirely. Sophisticated information systems gave its managers real-time data, allowing them to detect the slightest change in current market conditions and to forecast with precision." But the highly hyped systems failed to account for frustrated customers and resellers, tired of long waits for products, who began to order from multiple distributors. Cisco began to stockpile components, added workers, and helped contract manufacturers buy more parts. The backlog evaporated as customers canceled duplicate orders and new orders failed to materialize.
Although many factors affected Cisco's bottom line, its losses would have been much smaller if management had recognized the double orders and adjusted its sales forecast accordingly, says Stanford Graduate School of Business faculty member Erica Plambeck. "Cisco failed to account for duplicates in the order backlog and, therefore, although the tech economy had begun to slow down, Cisco anticipated continued high demand for its products," says the assistant professor of operations, information and technology.
An examination of what happens when management fails to correctly estimate demand for a product or customers' sensitivity to delay is the crux of a recent paper by Plambeck and co-author Mor Armony, assistant professor at Stern School of Business, New York University. They show how these factors can cause companies to build too much capacity or not enough. "Cisco is not the only company with difficulties in estimating demand because of duplicate orders," Plambeck says. "Intel and other semiconductor companies believe that data on bookings is irrelevant because it is too difficult to distinguish between duplicate orders and true demand."
Yet it is important to account for double orders and the reasons for them, she says. "Otherwise, by counting duplicate orders as true demand, you overestimate the demand rate, and by counting the cancellations of duplicate orders as lost sales, you overestimate customers' sensitivity to delay, and then you wind up with excess capacity."
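The arithmetic behind those two biases can be made concrete with a minimal sketch. The numbers below are hypothetical, chosen only to illustrate the mechanism Plambeck describes, not taken from the paper: phantom duplicate orders inflate the apparent demand rate, and their later cancellations inflate the apparent rate at which impatient customers are lost.

```python
# Hypothetical rates (not from the paper), per day:
true_demand_rate = 100.0   # genuine orders
dup_fraction = 0.20        # share of delayed customers who place a duplicate order
true_reneging = 5.0        # genuine cancellations by customers tired of waiting

duplicates = true_demand_rate * dup_fraction          # phantom orders per day
observed_orders = true_demand_rate + duplicates       # what bookings data shows

# When the first copy of an order ships, the duplicate is canceled; a naive
# manager counts those cancellations as sales lost to delay.
observed_cancellations = true_reneging + duplicates

print(f"naive demand estimate:   {observed_orders:.0f}/day "
      f"(+{100 * (observed_orders / true_demand_rate - 1):.0f}% vs. true)")
print(f"naive reneging estimate: {observed_cancellations:.0f}/day "
      f"({observed_cancellations / true_reneging:.0f}x the true rate)")
```

With these illustrative numbers, bookings overstate demand by 20 percent, and the cancellation stream overstates customers' sensitivity to delay fivefold — both errors pushing the manufacturer toward excess capacity.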
Plambeck says she became interested in the subject of double ordering after reading about Cisco's problems. "If you read the business press, you hear only about overestimating demand. I had not read anything in the business press about overestimating customers' sensitivity to delay, the rate at which sales are lost when customers are forced to wait for the product. The optimal level of capacity increases with customers' sensitivity to delay, so estimating customers' sensitivity to delay is a very important part of the puzzle."
She and Armony showed how to tackle the problem using observable data that varies over time, such as the stock distributors have on hand, the number of outstanding orders, the number of orders placed and the number of cancellations per day per distributor, and the length of time that customers wait for the product. In a simple model with one manufacturer and just two distributors, they calculated the most likely values, in technical terms the "maximum likelihood estimates," of the true demand rate, the average amount of time that a customer will wait before canceling his order, and the rate at which customers can be expected to double order when forced to wait for the item they want.
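A toy illustration may help show what maximum likelihood estimation does with this kind of waiting-time data. The sketch below is far simpler than the Plambeck–Armony model — it assumes a single stream of back-ordered customers whose patience is exponentially distributed with an unknown reneging rate, and ignores duplicates entirely. Under that standard assumption, the maximum likelihood estimate of the reneging rate is the number of cancellations divided by the total time customers spent waiting.

```python
import random

random.seed(42)

theta_true = 0.5    # assumed true reneging rate: mean patience of 2 days
n = 10_000          # back-ordered customers observed

cancellations = 0
total_wait = 0.0
for _ in range(n):
    lead_time = random.uniform(0.5, 4.0)        # wait this customer faces (days)
    patience = random.expovariate(theta_true)   # how long they will tolerate it
    if patience < lead_time:
        cancellations += 1                      # customer reneges...
        total_wait += patience                  # ...after waiting only this long
    else:
        total_wait += lead_time                 # customer waits out the full delay

# MLE for an exponential rate with censored observations:
# events divided by total exposure time.
theta_mle = cancellations / total_wait
print(f"true reneging rate {theta_true:.2f}, MLE {theta_mle:.2f}")
```

With enough observations the estimate lands close to the true rate — but only because this toy data contains no duplicate orders; the paper's point is that when duplicates are present, the same machinery converges to the wrong answer unless they are modeled explicitly.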
What Plambeck and Armony found is that errors in estimating duplicate orders and cancellations are common even in stable supply-and-demand environments. "Even if customers are rarely back-ordered, the manufacturer will make a significant error in estimating the reneging rate unless it accounts for double orders," the paper concludes. "Typically, this results in excess production capacity." In one of the modeled examples, the manufacturer's capacity is 20 percent greater than optimal because of overestimated demand. Conversely, the researchers say, a manufacturer can invest too little in capacity, again by miscalculating the key factors. "Our analysis serves to warn manufacturers: Watch out for double orders or you might make a grave mistake," the paper says.
But Plambeck cautions that the model is "not very realistic -- it's stylized and stripped down to teach a lesson." In a setting with many distributors, where buyers seek out alternative distributors in response to long lead times and the incidence of duplicate orders therefore evolves over time, the estimation problem becomes far more complex: maximum likelihood estimation demands substantially more computing power and is no longer effective. Plambeck notes, however, that Stanford professors J. Darrell Duffie and Peter Glynn have developed efficient estimators for complex financial applications. "Ongoing research with Peter Glynn will develop similar estimators to handle industrial-sized problems with duplicate orders," she says.
For more information, visit www.stanford.edu.