The using statement in C# is pretty nifty. Wrap an initializer in the parentheses of a using, and, no matter how the code in the body exits, the object will be disposed of through its IDisposable interface. .NET defines a lot of such managed object classes for processes, mutexes, and so on.
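Here's a quick sketch of the "no matter how the body exits" part (Noisy is a throwaway class invented just for this demo):

using System;

class Noisy : IDisposable {
    public void Dispose() {
        Console.WriteLine("Disposed!");
    }
}

class Demo {
    static void Main() {
        try {
            using (Noisy n = new Noisy()) {
                throw new InvalidOperationException("bail out early");
            }
        } catch (InvalidOperationException) {
            // By the time we get here, "Disposed!" has already printed:
            // the using block's implicit finally ran during unwinding.
            Console.WriteLine("Caught, after disposal already ran.");
        }
    }
}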
Well, I was stealing, er, borrowing, some code from an example that looked like:
using (Process p = new Process()) {
    . . .
    p.WaitForExit();
}
but I needed to return the process and wait for it somewhere else instead, outside this snippet. So I wrote:
using (Process p = new Process()) {
    . . .
    return p;
}
As soon as the caller called WaitForExit() on the process, it blew up, because the process had already been disposed by the time it was returned!
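The fix, of course, is to let the caller own the disposal: drop the using from the method that creates the process and wrap the call site instead. A hedged sketch (RunTool and the executable name are made up for illustration):

using System.Diagnostics;

class ToolRunner {
    static Process RunTool() {
        // No using here: ownership of the Process transfers to the caller.
        Process p = new Process();
        p.StartInfo.FileName = "sometool.exe";  // hypothetical executable
        p.Start();
        return p;
    }

    static void Main() {
        // The caller takes responsibility for disposal.
        using (Process p = RunTool()) {
            p.WaitForExit();
        }
    }
}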
While I suppose there could be reasons for returning disposed objects — after all, they are still valid objects — this kind of thing is usually a braino. It would be nice if the compiler issued a warning like:
Hey, bub, do you really want to return a disposed object?