Update: Here's a little article that shows locking with Blocking Queues (an idea I refer to at the bottom of the article)
I've used it several times since to learn about interesting and useful things, such as how to properly use ReaderWriterLocks, when Thread.SpinWait() makes sense (almost never), and the various ways to synchronize access to shared resources in a multi-threaded, multi-CPU environment (lots).
The thing about .NET threading is that not only is it scary to the uninitiated (as it should be), but every time you check it out, you find something you didn't know.
One thing I've noticed a lot about threading, and locking specifically, is that people tend to use it where it's not really needed, or where other approaches make more sense. For example, consider a case where you are trying to synchronize commands from multiple threads into a single resource, which performs a "heavy" task for each request.
Many people approach this with WaitHandles and whatnot, when one of the most scalable approaches is to put a request queue at the receiving end and simply "pop" messages off the queue as you finish handling the last one. It makes the locking scheme much easier to understand, and makes for a very scalable system (on a multi-CPU machine, for example, you could easily have multiple threads reading from the queue to handle more requests at the same time).
So, threading *is* important, especially for understanding when you need *other means* to accomplish your tasks, and how threading can help make those other means run smoothly. In my example, synchronizing the reads and writes to the queue may be in order, but the queue is the "other means" by which you impose a logical "first in, first out" ordering on the requests in the system.
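To make the request-queue idea concrete, here is a minimal sketch. It uses Java's `java.util.concurrent.BlockingQueue` rather than .NET (the pattern is identical; in .NET you'd use a `BlockingCollection` or hand-roll a synchronized queue). The class name `RequestQueueDemo`, the `processAll` method, and the poison-pill shutdown are all illustrative choices, not anything from the original article:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the request-queue pattern: producers put work on a blocking
// queue, and worker threads simply take() the next request when they finish
// the last one -- the queue itself handles the locking.
public class RequestQueueDemo {
    private static final Runnable POISON = () -> {}; // sentinel that tells a worker to stop

    static int processAll(int requestCount, int workerCount) throws InterruptedException {
        BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(requestCount + workerCount);
        AtomicInteger handled = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(workerCount);

        // Multiple readers scale naturally on a multi-CPU machine:
        // each worker blocks on take() until a request is available.
        for (int i = 0; i < workerCount; i++) {
            new Thread(() -> {
                try {
                    while (true) {
                        Runnable request = queue.take(); // blocks; no manual locking needed
                        if (request == POISON) break;    // shutdown signal
                        request.run();                   // the "heavy" task
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            }).start();
        }

        // Producers on any thread just enqueue; no WaitHandle juggling required.
        for (int i = 0; i < requestCount; i++) {
            queue.put(() -> handled.incrementAndGet());
        }
        for (int i = 0; i < workerCount; i++) {
            queue.put(POISON); // one pill per worker
        }
        done.await();
        return handled.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("handled = " + processAll(10, 2));
    }
}
```

Note that the only synchronization the application code sees is the queue itself; the first-in-first-out ordering falls out of the data structure rather than out of explicit locks.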