there is a difference between epistemological rationality and instrumental rationality. epistemological rationality relates to discovering what is and is not true about reality given the available evidence. instrumental rationality, on the other hand, relates to finding the best strategy given your goals and the rules of the game you’re playing, and very often will require you to abandon beliefs that are epistemologically rational.
most people do not know what decisions or decision-making processes (“intrapsychic institutions” or “self-regulations”) are in their long-term best interest. they don’t know how decisions made today will affect tomorrow’s preferences, or how a series of achieved goals will affect their internal reward systems and future well-being. there is ample evidence that human beings aren’t very good at managing their internal “pleasure economies” or “well-being marketplaces”, given the inherent uncertainty that comes with such complexity. we seek our own happiness (much of the time) in the dark.
with that in mind, what makes anyone think they’re in a position to advocate alternative social institutions that will (ostensibly) increase the well-being of the entire population? what evidence is there that having no government accomplishes the weighted goals of the population better than having government (most evidence, including government’s very existence, strongly suggests otherwise)? how do we know that unregulated markets produce more desirable (tricky word) outcomes than regulated markets? how would you even answer the god damn questions (well-being isn’t measurable; the relative importance of different utility functions isn’t determinable; the goals are constantly changing, and some goals may not be in the long-term best interests of those defining them; etc.)? like, words fail me. we can’t even define the problems we’re trying to solve with any degree of precision.
it’s funny: the people best at assessing what is and is not true are very rarely among the best at convincing others of their assessment. this is almost always the case in any domain where the truth or falsity of statements cannot be precisely determined (convincing people we don’t know very much spoils the fun by forestalling the signalling games that inevitably spring up around the unknown; it is actually, i think, in everyone’s best interest to ignore/fight those who advocate neutrality, but this would be a curious place for such a tangent), and it is frequently the case even in domains where we can.
PB is basically saying ACists don’t have solutions because they don’t know what they need to know to have solutions. he isn’t saying he has solutions: he’s saying no one has solutions. that is a very reasonable position because it is obviously true. whether or not believing it is in your best interest is best left to... you (although, if you ask me, it probably isn’t).