Monday, March 23, 2009

Risk Mismanagement brought into the classroom

Joe Nocera, "Risk Mismanagement," New York Times, January 2, 2009


THERE AREN’T MANY widely told anecdotes about the current financial crisis, at least not yet, but there’s one that made the rounds in 2007, back when the big investment banks were first starting to write down billions of dollars in mortgage-backed derivatives and other so-called toxic securities. This was well before Bear Stearns collapsed, before Fannie Mae and Freddie Mac were taken over by the federal government, before Lehman fell and Merrill Lynch was sold and A.I.G. saved, before the $700 billion bailout bill was rushed into law. Before, that is, it became obvious that the risks taken by the largest banks and investment firms in the United States — and, indeed, in much of the Western world — were so excessive and foolhardy that they threatened to bring down the financial system itself. On the contrary: this was back when the major investment firms were still assuring investors that all was well, these little speed bumps notwithstanding — assurances based, in part, on their fantastically complex mathematical models for measuring the risk in their various portfolios.

I'm teaching a class in Database Design. Part of my scouting expedition into the world of academia. And tonight's class was on Reporting. Now, I think what everyone expects here is tips on building reports using software packages (e.g. MS Access) and general design advice. Instead, I spent the session talking about why you're doing reporting in the first place, and about keeping the purpose of the report in focus while doing the design.

And I've always had this fantasy that I would bring real-world examples into my teaching. For my midterm I put in an actual data collection form, which elicited a complaint that it was vague and ambiguous. A complaint gladly voiced by the people who actually do the data collection using the form in question.

So over the weekend, while thinking about what to do for this class (which was not looking very interesting), I thought about the financial crisis and all those articles I had been reading about the role of quants. And I realized that this was a reporting issue: the math was not the problem, the problem was in the communication, interpretation and use.

The next question: how to do it. I treated it as a case study. Operational environment. Problem description. Then I led them through the thought process behind the metric to be reported.

Next was the actual VaR metric. First I explained the textbook definition, then we discussed where there was room for danger. Next, what the real definition was, and what the people to whom VaR was reported thought it meant. Then a discussion of what happens when you measure something and use it as an evaluation metric (the metric got gamed), and what happened when you did not pay attention (there was an asset whose characteristics were hidden when viewed through the lens of VaR).
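(For anyone who wants the nuts and bolts: here is a rough sketch, not something I handed out in class, of how a 99% one-day VaR comes out of a simple historical simulation, and why an asset with rare but severe losses can look tame through that lens. The portfolio, the returns and the numbers are all made up for illustration.)

import numpy as np

def historical_var(daily_returns, portfolio_value, confidence=0.99):
    """One-day Value at Risk by historical simulation.

    daily_returns: past daily returns (e.g. 0.01 == +1%)
    portfolio_value: current dollar value of the portfolio
    confidence: 0.99 means the loss exceeded on only 1% of past days
    """
    losses = -np.asarray(daily_returns) * portfolio_value  # positive = loss
    return np.percentile(losses, confidence * 100)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Hypothetical "ordinary" asset: two years of modest daily moves.
    normal_returns = rng.normal(0.0, 0.01, 500)

    # Hypothetical asset with hidden tail risk: looks identical most days,
    # but has a few rare, severe losses that a 99% cutoff barely registers.
    tail_returns = normal_returns.copy()
    tail_returns[rng.choice(500, 3, replace=False)] -= 0.20

    for name, r in [("ordinary", normal_returns), ("hidden tail", tail_returns)]:
        var = historical_var(r, portfolio_value=1_000_000)
        worst = -r.min() * 1_000_000
        print(f"{name:12s} 99% one-day VaR: ${var:>10,.0f}   worst day: ${worst:>10,.0f}")

Run it and the two assets report almost the same VaR, while the worst single-day loss on the second one is several times larger. That gap, between what the report says and what the portfolio can actually do to you, was the point of the class.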

Finally, the payoff. The wrong and right ways of using reports, cribbed from the New York Times article, which conveniently had both in one place.

While I won't say it was a free-wheeling discussion, there was some discussion. Not the opinionated risk-taking you would get out of an MBA or policy course, but good enough. And it felt good to actually teach something.
