Creating Passionate Users has a discussion on tool selection that I mostly agree with, and promote in all the companies I consult with. My only quibble with the piece is a semantic one ... the use of the phrase "best tool".
"Best" is the superlative of "good", and if you hit the dictionary to look at the term, "good", you will see that it mostly deals with the nature of a thing. So describing a tool as "the best" implies an evaluation of its inherent quality. And this is why you get the religious wars about programming tools, because a thing is not intrinsically good or bad, just good or bad in a specific context for a specific purpose (a computer might be designed so poorly as to be totally useless for coding, but might make a wonderful and truly artistic doorstop.) The only way to argue about a thing being intrinsically good or bad is to fabricate an abstract ideal, and compare the item being debated to that manufactured ideal.
I would prefer the term, "appropriate" ... suitable for a particular person, condition, occasion, or place.
Selection of a tool (or language, or platform, or whatever) involves a multitude of issues: not merely the obvious ones mentioned in the piece, or the subtler ones a manager must consider, such as retaining talented programmers by giving them greater variety (as mentioned in the comments), but also far more indirect issues such as tool longevity, the likelihood of the producing company being acquired, demand for the relevant knowledge in the marketplace, and so on.
For instance, when I was working in Baltimore, the marketplace was transitioning from VB.Net to C#. Why? I could never get a straight answer. The consequence was that suddenly a lot of VB programmers either had to learn C# (starting at the bottom of the experience curve again) or find one of the rare remaining VB gigs. If I were a manager in Baltimore, I'd lean heavily toward using VB rather than C# if I could get a senior VB developer for the same price as a novice C# developer.