Debugging process considered harmful

The following observations are the result of personal experience on a number of projects:

* First of all, you generally cannot install a debugger where the application you are developing will ultimately run. If you are writing a desktop application, there is no way you will be able to install a decent debugger on your users' machines. If you are writing a server application, the operations team will probably not let you install the debugger of your choice on production servers! If you are lucky, you will be able to install a lightweight debugger with almost no side effects, like WinDbg. But so far WinDbg only works for unmanaged code, and I do not know of any good standalone .NET debugger yet (one that does not require installing Visual Studio, that is). This is a real issue: if you have put all your debugging efforts into the Visual Studio debugger, putting your application in production is like sending a spaceship toward Mars with no way to communicate with it. Welcome to the unknown.

* Now, let's get back to the development phase. Keep in mind that when someone debugs, he works only for himself: debugging benefits at most one person. There is no way to transfer to someone else either the work he has done (setting breakpoints, stepping through code, ...) or the knowledge he has acquired (if any...) about the program he was debugging.
In a team setting, the time one individual spends debugging is simply lost to the rest of the team.

* When someone debugs, he does the same things (stepping) over and over again, losing precious time. He is exchanging development (creativity) time for mindless "stepping" time. How much time depends on the developer's skill: a good developer will use well-chosen watches, well-placed breakpoints, and advanced debugging techniques, and will therefore lose less time than a beginner. Hence the next point:

* To debug efficiently, a developer needs to master a debugger, or worse, several debuggers. Debuggers do not follow any real standard, apart from some de facto UI conventions. Proper training would be required to debug efficiently.

* If you manage developers, just take a look at the time the average developer in your team spends in the debugger: you should do something about it! But you hardly can, because debugging training is very, very rare, and books about debugging are rare too. Here are two decent ones: https://www.amazon.com/exec/obidos/tg/detail/-/0735615365/ and https://www.amazon.com/exec/obidos/tg/detail/-/1590590503 (not only for VB.NET).

This explains why I almost never debug, and I think my productivity is greatly enhanced as a result. Instead, I trace everything. I agree this is not ideal (some call it "printf debugging"...), but that is only because no technical progress has been made in this domain. Today, putting traces everywhere by hand is really prehistoric work. That does not mean traces are bad; it means nothing exists to help us add traces to our code automatically. I would like to be able to tell the C# compiler: "put traces here and here, remember the values of the parameters automatically, just like you do when I debug, add a hierarchical trace of this object, and record all of that so I can replay it later, add filters, and so on...". That service should be provided by the CLR, IMHO.
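To make the "prehistoric" part concrete, here is a minimal sketch of the kind of hand-written tracing I mean, using System.Diagnostics.Trace; the OrderService class, its ProcessOrder method, and the orders.log file name are invented for illustration:

```csharp
using System;
using System.Diagnostics;

public class OrderService
{
    public void ProcessOrder(int orderId, string customer)
    {
        // Hand-written tracing: every interesting method has to repeat
        // this boilerplate and format its own parameter values.
        Trace.WriteLine(string.Format(
            "ProcessOrder(orderId={0}, customer={1})", orderId, customer));
        try
        {
            // ... actual work ...
            Trace.WriteLine("ProcessOrder: order processed");
        }
        catch (Exception e)
        {
            // Trace the failure with its context before letting it bubble up.
            Trace.WriteLine("ProcessOrder failed: " + e);
            throw;
        }
    }
}

public static class Program
{
    public static void Main()
    {
        // Route traces to a log file so they can be read back from production.
        // (Trace.WriteLine is active when TRACE is defined, which Visual
        // Studio does by default; with csc, compile with /d:TRACE.)
        Trace.Listeners.Add(new TextWriterTraceListener("orders.log"));
        Trace.AutoFlush = true;
        new OrderService().ProcessOrder(42, "ACME");
    }
}
```

Every method that should be observable in production has to repeat this boilerplate, and the parameter values have to be captured and formatted explicitly. That is exactly the work I would like the compiler or the CLR to do for me automatically.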