The small installed base of Macs makes them an unexciting, low-visibility target for the bad guys, and so the weaknesses don’t get exploited much

The marketshare argument only goes so far. This seems to be a version of the “Macs have no software” argument. It is indeed true that Macs are targeted less for this reason, but the argument that it’s straight cause-and-effect is disingenuous. If this principle were true, the Apache web server would have far more vulnerabilities than IIS, since Apache is by far the most widely used web server on the internet. But not only does Apache not have more vulnerabilities than IIS, the disparity in the other direction is laughable. This is a perfect example of greater exposure not necessarily equating to more vulnerabilities.

And if marketshare dictates the rate at which vulnerabilities are discovered, then why would any of you trust Linux, Solaris, or AIX for mission-critical applications? They all have low “marketshare”. How are vulnerabilities ever discovered in any of these operating systems if marketshare is the determining factor?

In the case of Unix, the vulnerabilities are greater – even in the Mach kernel underlying Mac OS X – but once again the installed base makes for an uninteresting target

This seems to suggest that the vulnerabilities in Unix are potentially even greater than those in Windows. I am sure that many security experts would take great issue with that statement.

If suddenly Macs were much more widely used, they’d rapidly become an interesting target, and we’d see more bad-guy action

We’d definitely see more bad-guy action; whether any of it would be fruitful remains to be seen. This also ignores the intensive and continuous peer review that open source software enjoys, which is a prime component of its ongoing security. While not all of Mac OS X is open source, almost every piece that would be exposed to the network is. And Apple’s track record on security, even during the heyday of Mac marketshare on campuses, has been nothing short of stellar.