I use this example to introduce formal and functional approaches to topics in the social sciences. Any argument you try to make within the debate ends up including a variant of “…because sandwiches [abstraction about what formally defines a sandwich]”, which itself presumes that the “right” way to carve up the world is in categories of form. You could also conceive of sandwiches functionally, where something isn’t a sandwich if we (some cultural or linguistic group) just don’t think of it that way.
From a functional view, the very fact the debate exists at all means hot dogs aren’t sandwiches, cereal isn’t soup, pop tarts aren’t ravioli, etc.
Then I have my students think about it in contexts like language, Durkheim, and policy making and watch their little minds explode.
FWIW, academia is utterly dominated by Macs. In the last 10 years I have known exactly one colleague who chose to use a PC, and her openly stated reason was that she thinks it’s fun to be contrarian. A lot of (psychology) labs will have one dusty PC stashed away in a corner somewhere running that one weird piece of Windows-only proprietary software for the eye-tracker or a super niche stats program or something, and then IT gets called in to keep it alive, because the idea of putting any effort into using it or replacing it is horrible.
I was a little curious whether losing the ability to dual boot with Boot Camp (the new M-series chips can’t run it, and I personally used dual booting all the time for video games) would change anything, but my university’s response was to start paying for Parallels for anyone who wants it.
I really didn’t understand why people still acted like anybody at all uses Windows until my husband moved from academia to industry a few years ago and we were totally floored by the PC culture (heh) he found himself in (though he’s personally pretty anti-Mac and not complaining). Now the only Mac he sees is mine and the only PC I see is his. It’s wild.