Some complain about the inhumanity of computers and systems. This is usually fallacious: all such systems are created in the image of humans, and what people lament as inhuman usually isn’t so in the sense of being unlike people (say, lacking emotion) but in the sense of lacking empathy and ethics. Spend any time with humans and you will learn just how fond some people are of creating inhuman systems; the only difference, in the realm of computers, is that the machines give people a place to hide.
So Postel’s Law of Robustness, which computers find much easier to obey than we do, is actually very human, and Postel, a man very fond of machines, a great humanist. Thus, when I recommended, above, being a little more like our machines, I meant only the extent to which they are faithful, diligent, and disinterested; characteristics which, combined with very human traits like public-spiritedness and a sense of community, we might call virtue. With this as our guiding principle, we might find conversation a little easier.
* I have not found reference to the philosophy-smell idea anywhere, and so believe it to be original. I will, of course, hand it to its rightful owner if corrected.