Rankian Uncertainty

Whither art?

Is art a "practice?" It is routinely spoken of as such, at least by intellectuals.

Does art change? It is always changing, on the whole, though certain details may be held constant. Or, art may stay the same while the world changes around it, and that is more or less the same thing.

Does art progress? That is a value judgment. Better to say that art changes.

Does art matter? Its sheer pervasiveness, viewed from outer space, suggests that it matters deeply. Meanwhile, from the global level on down to the nation, the community, the affinity group and finally, the individual, all becomes confounded very precipitously and very badly. Here on planet earth, "art" is too broad a term. One should be much more specific. A painting is not a poem, etc.

This is only one direction of travel, though, beginning with the creative impulse and the social ceremonial which is observable across many cultural activities, and then diverging out into the particular art form, the métier, perhaps the genre, style, school-of-thought, national or local identity, and on and on until such boutique self-ascriptions as "Steampunk Big Band" and "Aesthetics of Failure" are landed upon. What is it, then, that propels people on their journeys from person to artist to technician to stylist to aesthete and finally, perhaps, to nonrepeatable individual? What have people done to arrive here, all artists but each on their own patch of ground? And, what has been done to them to keep things moving steadily in this direction?

Otto Rank was nothing if not a "social theorist of art," in literal terms, but not the kind who faces this problem from just this one direction, and then goes to lunch. Rather, Rank looks the other way too, and what he sees is that the "artist-type" does not always become an artist at all, let alone a particular type of artist. Although "artist" is already too broad a term to be very useful, Rank also sees that it is not broad enough to encompass the psychological type(s) which it suggests, because many (perhaps most) of those people do not become artists at all, not even as hobbyists.

Rank does not much linger on this, not as a parochial argument about aesthetics or psychology. Instead, he explodes it into a complete theory of culture and a prophetic account of "modern" art and artists. What his account suggests, all the same, is that "artist-type" is not a valid scientific category vis-à-vis art itself. The artist-type per se, Rank's own term, is for him also a misnomer. The type needs another name, one which captures its contents more accurately. There is no other name which does this any better, though, because this type of person may appear in culture as any among dozens or hundreds of métiers; hence any of those métiers could be used this way, but all would merely recapitulate the original "problem." Artist-type, for Rank, includes myriad non-artists, perhaps hundreds or thousands of times more non-artists than artists. To distinguish the artists, then, requires something further. This is Rank's "problem" of art and artist.

What is so hard about this? If this much can already be said, it would seem that the difference is already known. Someone already has the answer. Otherwise there would be no distinction, already existing, between artists and every other sort of culture hero. Rank's point all the same (and not in these words) is that the observations which permit such a distinction to be made, at the end of a long process of individuation, do not really permit of causal ascription retrospectively. It is clear enough, for example, that the priests of literature have done something different (if only subtly different) from what the priests of religion have done. It is not hard to see the differences, nor is it hard to see (not to be undersold here) the similarities either. But then, what of it?

At this stage in the game the retrospective view still has plenty of moves available to it. Surely there is something in the backstories of the litterateurs which distinguishes them from the liturgists? Surely there is something in common also? Of course such things can always be found, somewhere; Rank, however, is not interested in this kind of cherry-picking. He sees that it obfuscates more than it reveals. He sees that there is some correlation, perhaps much, but precious little causation. He sees (it is not worth his time to say quite so baldly) that some novelists come from broken families whereas others come from whole ones; some write for themselves while others write for the drawer; some have lived through momentous and drastic historical change while others were blissfully ignorant of it or were otherwise cloistered away at a safe distance.

But what about, say, bourgeois white European female novelists who lived in the second half of the nineteenth century? What about those from broken homes, as opposed to those from whole ones? What about those who married young, married late, married often, or not at all? What about . . . ? By now it should be obvious, without torturously following the retrospective view lemming-like over the cliff of sophistry, that this does not solve the "problem" of art and artist, no matter how far it is pursued. Indeed, even in the terms of the best contemporary social science methodology and statistical analysis, what constitutes an adequate sample size or a significant correlation here? Formal sociology has a hard enough time untangling these sorts of problems with the benefit of highly refined methodology. Without this methodology? Why so much as bother? What really can be achieved by locating and comparing the writers who overlap on the Venn diagram, if more are ignored than accounted for this way?

The problem with the open-ended naming of correlations is that non-correlations too can be named open-endedly. Differences among very similar artists can always be found, same as with similarities among painters, priests and politicians.

This launches Rank into his project, launches him back to the beginning of recorded history and then forward, slowly and in great detail, eventually to land back in his own time. The institution he refers to as mechanistic science, however, has not taken this journey with him, even now. It probably never will, for reasons to be explained.


Once history is viewed as contingent rather than predestined, a rational or "analytic" form of discourse on art becomes very difficult to carry out. The ground of "contingency" is always shifting beneath artworks' feet, beneath everybody's feet, always. Predestination holds that things could only have worked out exactly as they have; simple enough! Under contingency, meanwhile, it can be impossible to distinguish true causes from among everything and anything else that was also happening. It is difficult (rationally, at least) to put a "frame" around causality, because this requires the assumption that literally everything else that was happening near the time of the causal event(s) had no effect at all. If it is found, say, that two composers of great modern string quartets both were raised in the country, both were avid chess players, and both experienced the death of an adolescent chum, that is a notable coincidence; does it also explain their common educational achievements, love lives or audience reception? And if they evince differing political ideologies, technical compositional abilities, and degrees of business acumen, then as many questions have been posed as answered.

As Taleb puts it,

If you have the right models...you can predict with great precision how the ice cube will melt—this is a specific engineering problem devoid of complexity,... However, from the pool of water you can build infinite possible ice cubes, if there was in fact an ice cube there at all. The first direction, from the ice cube to the puddle, is called the forward process. The second direction, the backward process, is much, much more complicated. The forward process is generally used in physics and engineering; the backward process in nonrepeatable, nonexperimental historical approaches.

"In a way, the limitations that prevent us from unfrying an egg also prevent us from reverse engineering history."

What If Not?

What about a good old-fashioned counterfactual? Can this succeed where simple biography and taxonomy fail?

A jejune example: Shostakovich's Fourth Symphony. (Not his Fifth. His Fourth.) The score had to be reconstructed from the instrumental parts, which sat idle for decades after the symphony's premiere was nixed. The piece was lost, and then, one day, it was found. If the parts had never resurfaced, though, there would be no Fourth Symphony, at least not for contemporary audiences.

Given all of the known circumstances, it is easy enough to imagine the parts failing to resurface and the piece thus enjoying the ride to oblivion. It is impossible to say how the world would then be different, at least if the so-called "butterfly effect" is taken into account. It is very easy, nonetheless, to entertain the counterfactual, because most everyone has lost track of some important papers at some point in their lives, and it is easy enough to see the profile of that general human problem; it is very easy to project it out in one's mind, even if one did not live in Soviet Russia during the first decades of Bolshevik hegemony.

In formal historiography, it is enough merely to relate the story of the parts and to take things from there. The notion that there would be no symphony if there were no parts is too obvious. It is a sophistry, most of the time. It is an answer in search of a question, unless and until a particular question is being asked. It is a "counterfactual" rather than "factual" statement. To play out the consequences from there is risky business. It is idle speculation, entertaining but hardly enlightening.

It seems, then, that the way of the rationalist might be, again, to simply accept history exactly as it is thought to have unfolded and to take things from there. That is the way, certainly, if the solution of some problem-at-hand depends upon ideal fidelity to the barest past facts. It will be seen straightaway, however, that this just-the-facts orientation is not an orientation which prevails very much of the time, not in art history, and certainly not in criticism; it does not even prevail all of the time in artists' own shop-talk, although that is, on the whole, probably the best place to look if one is intent on finding some of it, somewhere, just to confirm that it ever happens at all.

As for Shostakovich, what does it mean, then, historically speaking, that the parts were lost for [##] years? What does it mean that Soviet officials were not happy with the Fourth, as far as they knew it at all? What does it mean that Shostakovich had ten fingers? (If he indeed had ten fingers?)

Does this last question seem less relevant, or salient, than the previous two? Well then, so it is! And if not? Then not!


Mauceri—this belongs with "biography," etc.

John Mauceri's recent book The War on Music furnishes a particularly egregious example. At one point, Mauceri, who has been both a Yale professor and a very successful practical musician, admonishes his readers to "Look carefully at a photo of Cologne in 1945 and imagine a motherless seventeen-year-old," surrounded by devastation and "self-punitive architecture that was being constructed" amongst the recovery effort. "The horrors of war made beauty inappropriate," hence "this dissonant music" of Stockhausen "is redolent of loss." Almost anyone can perform this exercise, and everyone who does will form some image of the prompt. For Mauceri, what matters about such images is, for lack of better ways of speaking, the motherlessness, seventeenness and Cologneness of the principal; also the self-punitiveness of the architecture, the horror of the war, and, last but hardly least, the dissonance of the music. That is a lot! It is already too much for many people. Try, even so, to parse this motley as a historian or a practical musician might. For those purposes there is not very much here at all. The amount is closer to nothing than to something, actually. Life does not unfold this way. This is no account of life or of work, but rather of Mauceri's own memory. It is a frame chosen in order to produce a comprehensible morsel of infotainment, not to produce historical understanding or practical refinement. Not even the life of a seventeen-year-old proto-atonalist can be reduced this drastically and retain any explanatory power at all. However one feels about the principals here, this will not do.

It will be objected that Mauceri is a bad example. His example (not just here but on nearly every page of his book) is not good criticism, not good history, not even good self-directed learning, not for a practical musician of great accomplishment, as he indisputably is. A better example must be found, and plenty can be found, not just anywhere, but perhaps at the pinnacle of professional accomplishment in the fields of criticism, history, musicology, and so on. So goes the metacritical rejoinder to any pointing out of a bad example. Incomplete evidence is presented here. That is absolutely true. There are other writers who have not erred as badly in their framing of an artist's life and work.

These good writers have not confined themselves to mental images of teenagers projected (mentally) against a photographic prompt. Rather (or also), they have brought in more and better granular details of "context," of politics, culture, religion, ideology, intellectual history, race, nationality, gender, sexual orientation, and so on, and on and on . . . Has Mauceri anything more to do than simply to be more comprehensive? Has he merely failed to provide sufficient "political, social and cultural context?" Did he carelessly omit two or three other relevant episodes in Stockhausen's life? Need he merely include those episodes in his account, and then go to lunch? Did the groundhog show his face the year the war ended? Was Mercury in retrograde during the composition of certain Klavierstücke?

Does all of this sum up, in toto, the shortcomings of the motherless-teenager parable?

It not only does not sum up these shortcomings, it is wholly irrelevant. Mere "context" will not resolve Mauceri's frame problem. What would? The entire universe.

Already, by the time "framing" has so much as been noticed, things are very dire for critics and criticism. That is because the critic's frame (really anyone's "frame of reference" on anything) is arbitrary. It is personal and "unrevealable," as Becker says broadly of the "insides" of things and of people. This point will sound absurd and sophistic to those who have been reared on transactional scholarship and criticism of the arts, so it is necessary to elaborate upon it in some detail.

First, none of this is to say that "there is no reality," or that people can never hold anything at all in common. People can hold certain things in common, but there is a peculiar and specific way that this works, and it does not (cannot) ever land upon very much "reality."

Disparities of salience are no problem on the unrevealable inside of a human being. Quite literally, that is life. It is not always possible, though, to project one's own saliences out and onto the world one-to-one, as Mauceri does above. That is not to say that Mauceri will not find a few others (probably very many, actually) who resonate with his particular framing. It is not to say that anyone really does or ought to care about how many fingers or toes a composer has. Who ever could care? Produce at least one person who does care at all! And, how to know with certainty just how many fingers and toes these gentlemen actually had?

These rejoinders rely, perhaps unwittingly, upon what Ittelson calls "the nexus of contemporary social structure and practice." Granted the context of that remark is very different, but the contours of the problem are identical here. Generally, the topic of art under Soviet political repression is far more salient to present-day Westerners than is the topic of composerly fingers or toes. It may be ultimately proven, perhaps, that there is no interest in the latter whatsoever, or not in these terms. If so, then both the social nexus and the people who comprise it are biased.

To understand the full import of this point, it is necessary, first, to let go entirely of the B-word's colloquial usages, and to let go of the notion that "bias" is something that some people have in excess while other people have very little of it. In philosophy and communications theory, all that "bias" means is that some objects from among the total social or perceptual field have salience for some subject, while other objects do not; or, more likely, the salience is finely arrayed and gradated from ultimate to high to low to nil, and for who knows what reason.

In Soviet music history, political repression per se tends to be on the high side of salience, and fingers-and-toes on the low side. It is easy to imagine why; indeed, it is easy only to imagine, but not to know for certain. The ways of the social "nexus" are always right on the surface, certainly. Ascertaining them is not difficult. But as Ittelson suggests, this is "pragmatic" and not "principled." It can fail anywhere there is incoherence or "slippage" in the semantic field. The example of gender, touchy as always, is always close at hand if needed.

It ought to be apparent by now that there is a truly ultimate problem underlying all of this jejune talk of "bias." The problem, namely, is that facts-of-history, like any observations about the world, can be multiplied ad infinitum. The parts to the Fourth were not just stowed and unstowed: say that the second clarinet part has a stray mark in the bottom margin of the second page; this mark is made in ink; the ink is black; the top of the mark is thin while the termination some ways below it is a thick blot; this blot shows through the paper to appear also at the bottom of page three, with myriad differences of appearance compared with page two; and so on, ad infinitum.

This just could not ever be salient, though, could it? Something must have caused the inkblot, but the inkblot could not possibly have caused anything at all, right? All of this Devil's Advocacy is precisely the denseness and overthinking of a non-problem which has "killed" philosophy, right? Not exactly. The question is, "What could potentially make something salient in the future, although it is not the least bit salient now?" The answers to this prospective question also are infinite. That is the problem to be confronted.

What is more, it is very easy to see salience shifting over time, even if it is rarely possible to say exactly why. No epoch in recorded art history has reinforced this point more strongly than the epoch which is ongoing as this study is being written. For a very long time, no one much cared, for example, that more men than women made European and Euro-American art; with the emergence of more women into the public sphere, the issue became more salient; by the 1960s it is a frequent topic of discussion; by the 1980s the discussion reaches a fever pitch; and in the 2020s it is, so to speak, ingested with one's mother's milk. When one makes art in the 2020s, any art at all, one's gender is more salient to more people than ever before. Who is to say to what ends? The point, for now, is that this has changed very severely and very quickly, within living memory. This, also, is merely a particularly visible and fraught example. There is, again, a limit only to the salience of the available examples; there is no limit to the human ability to multiply nominal examples until one's rhetorical opponent simply gives up. And, as always, salience of examples does not ensure their explanatory validity.


Why not simply let sleeping dogs lie? The entire universe does not really have something to say about each and every artwork, does it?

The dog of historical contingency is a light sleeper. That is history's blowback against all of the facile, one-sided art criticism which purports to show that things can only be just as they are. Critics always are sure to tie one hand behind history's back before challenging it to a fight. That is the only way they can so much as draw even.

It is easy enough to see, in Danto's parable of the Sistine ceiling, for example, not only that "feminism" had nothing to do with Michelangelo's life, times or process, but also that myriad social agents in Danto's own midst (and, for better or worse, right up to today also) have not seen the feminist light. Some even double down as a symbolic gesture of dissent. When Danto says that his time has been "saturated by feminism," then, he is overstating the case so drastically as to be duplicitous. Pose to any woman the thesis that "saturation" has been reached in the culture at large and see what she says.

Presumably, Eve-in-the-middle is not quite as salient to the unreformed chauvinist as it was to Danto in that moment. Or, perhaps it is quite salient to the backlasher, who sees exactly what Danto sees but with a contrary valence. All the same, plenty of these people live in the same time and place. The demand to provide deeper "context" for the assessment of each one then meets with identical answers unless mere time and place are abandoned for more specific frames.

How to resolve this problem? Tighten the microscope until the known differences of "context" between the feminist and the antifeminist come into plainer view? Provide exactly that context which explains the feminism of one or the antifeminism of the other? (By the way . . . now that exactitude is sought, what exactly is "feminism?" What is "saturation?") By this time, then, the view is no longer comprehensive. There now is a very long and precious subordinate clause after the colon in the title of the monograph; prepositions and hyphens abound, desperate to qualify and narrow down the purview of the study so that no one mistakenly takes it to apply to everyone, everywhere. The frame is thus tightened and filled with even more one-sided stuff; but there now is even more of the world outside of the frame than there was before, and there is even less room to bring any more of it in.

What are called here "bias" and "salience" really are complementary concepts. Each makes the other possible. A perfectly "unbiased" rendering of the world is one in which the shape of a composer's left big toenail has perfectly equivalent salience to their compositional techniques, to the social and political context in which they live(d), to anything and everything not just about them but in the entire universe. That is why social creatures are biased, always, certainly in the narrow sense, and indeed probably in the colloquial sense too: because there is scarcely a world at all, let alone a "social" world, for them to experience through their perceptions if those perceptions do not narrow the field of "reality" significantly. That is what perception is. Perception is a narrowing, not a broadening. But this includes not just discrete sense perception; it includes the integral kind, too. It applies all the same to the purely idealistic pondering of past and future, which is all based in sense perception anyway. Speculating upon past and future is also biased, not because it is less than it could or should be, but because that is all that it can be.

The demand to be unbiased, then, is always nonsensical philosophically, even if it is understood, in some other context, as a mere figure of speech entailing far, far less ambitious ends than it literally denotes. Therefore, the point of calling into question even the most seemingly obvious, settled points of salience is not to demand remediation. That is nonsensical. The point, rather, is to suggest that salience is "pragmatic but not principled" in Ittelson's sense. If some piece of salience "works," if it helps to make a prediction which is later validated, or if it prevents a fatal car crash by drawing attention to an oncoming vehicle, even here it is not so easy to assume that future predictions based on this piece of salience will also come true, or that the secret to safe driving, elsewhere and everywhere, has been revealed. Moreover, the stakes are not always quite this high. Nobody dies (perhaps contrary to popular belief) when all but one unfortunate person in the audience for the Fourth has knowledge of its harrowing provenance. If one person who does know does not care, the earth does not spin off its axis.

The really pernicious part of twentieth-century transactional criticism, then, is its assumption of a uniformity of bias, if not in society at large, then certainly among the audience for a given publication or institution. Even as social science, history, and various parochial academic specialties were becoming fully constructivist and (for lack of a better word) contingentist, much of the discourse on art was paying mere lip service to constructivist discoveries, all while refusing to make the requisite evolution at all. All the proof of that is in Mauceri's book. It is still possible, in the 2020s, for a writer like Mauceri, a man of impeccable practical and academic credentials, to "practice" the most vulgar reductionism, to later contradict himself on the very point he seeks to make this way (he says later: "you already know everything you need to know"; except, perhaps, about Stockhausen, Respighi, and about 200 additional pages' worth of other information), and then, all the same, to go to press with {{{}}}, to receive favorable notices, and to have an audience with an enthusiastic John McWhorter; all on account of work that he himself would be loath to accept from a student. How is this possible? Because almost everyone who cares at all about Mauceri's subject matter, who has thought about it at all, thinks about it the same way he does. There is near uniformity here. If a composer's mother has gone insane and died, this is poignant and salient. What if a composer's father, one day, got hung up haggling at the dry cleaners, got stuck in traffic, had to stop for gas, and arrived home so late that the budding composer had an unexpected extra hour or two to himself? What if the composer frittered away this unexpected alone time, not knowing what exactly was afoot, by paging through a few books and pacing the house? What if nothing the least bit poignant or salient happened during this time? What if this time has long been forgotten by all concerned? And what if this was the time when some germ of something first sparked in the composer's mind, for whatever reason? Is this a priori the moment when the prior death of the mother came home to roost? If it has already been declared, before the fact, that the death will be the decisive factor, then of course that is the one available answer to this question. Even then, however, this cannot be the only answer, because it did not happen just anytime or anyplace.

Though it is by now risky to lay any more weight on the Danto/Maes interview than has already been laid, it must be pointed out, also, that Danto speaks here of making a break with a "profession" which "here in the United States was basically logical positivist. There wasn't a lot of room for art." Really there is plenty of room for "art" in logical positivism, in so-called analytic philosophy, in any so-called rationalist discourse. Practitioners of these disciplines have remained hard at work making good on this. There is very little room here for "criticism," the métier, however, because criticism's raison d'être assumed a uniformity of bias in its audience; again, that does not mean a uniformity of political "slant," as the word "bias" might be parsed on a TV news program; what it means is that critics, those absolute pragmatist evangelists for "the nexus of contemporary social structure and practice," assume that Shostakovich's brush with the authorities, e.g., is salient, whereas the shape of his toenails is not.

Superficially, the further these hypotheticals are multiplied, the further the conversation seems to get from proving anything important. To show that the hypotheticals are infinite, one tack taken above is to offer a series of them which start salient and become progressively less so; by the end, then, the conversation which began with political repression has landed in stray ink blots and toenails. That is really the whole point. Just to be thorough about it, though, here is another, obverse tack.

Shostakovich himself is an instance of what is technically referred to, nowadays, as a white male, and the fact that he was a white male is the single most important thing about him and his work in the eyes of a great many people living today. All the same, in the 1930s (and perhaps as late as the 1970s) his whiteness and maleness, per se, did not have quite the same salience, or not as broadly. Matthew Frye Jacobson, for example, shows just what a moving target the category "white" has been in the United States over the decades. Many of the hyphenate-American ethnicities now subsumed under "white" were explicitly excluded from it, viciously and for a long time. This, however, has changed; so too, then, has the salience of myriad ethnic markers and personal histories. Obviously, to point out to a proud ethnic that they have been merely "white" from the beginning, or ever since a very certain date on the calendar, this may not go over too well, and indeed it should not. The same can be said, though, for precisely the obverse scenario before the fact rather than after it.

What used to be salient, in the 1930s and 1940s, about Shostakovich's identity? Probably not this, at least not with the same valences and semantics as today. Perhaps the topline item was that he was Russian, or maybe Soviet? (One or both of these; but at least one.) In any case, the selective constructivism of the mavens of context bites them in the rear end here, once such questions as this one are posed. The rage for context is based, explicitly, on the realization that something very obvious, like a person's race or gender, was almost never mentioned, or not mentioned causally, whereas other vectors of identity (e.g. nationality, as here) assumed outsize importance. And then, the inevitable backlash: these things were not remarked upon because they simply did not matter in this way. Once again, both sides have dug in their heels and declared that things could only have turned out exactly as they did; but each side has a different idea of what, exactly, has happened (or not happened).

This implicates basic methods involved in writing history, but that is not the point being pursued here. Indeed, the thinking of a 2010s person must not be facilely ascribed to a 1960s person, "ahistorically" as it is said, not unless it is seen, incontrovertibly, that there is some commonality which permits the two to be spoken of near-identically. This is the point: America had abolitionists, for example, before it was America; what, then, of the contention that any given historical figure, say as late as the 1840s, could not be expected to be unprejudiced because they (and everyone, ever) are merely "a product of their time?" Were the abolitionists not products of the same time? Place? Did all these people not burgeon forth from much the same broth of "political, social and cultural context?" Then why did they land on such different sides of an issue?

Sociologists have been drilling down ever further on this type of question for a long time without turning up very much.

from "The Frame Problem," Stanford Encyclopedia of Philosophy:

the challenge of representing the effects of action in logic without having to represent explicitly a large number of intuitively obvious non-effects.

In other words,

To many philosophers, the AI researchers' frame problem is suggestive of a wider epistemological issue, namely whether it is possible, in principle, to limit the scope of the reasoning required to derive the consequences of an action.
...
Using mathematical logic, how is it possible to write formulae that describe the effects of actions without having to write a large number of accompanying formulae that describe the mundane, obvious non-effects of those actions?
...
What we need, it seems, is some way of declaring the general rule-of-thumb that an action can be assumed not to change a given property of a situation unless there is evidence to the contrary. This default assumption is known as the common sense law of inertia. The (technical) frame problem can be viewed as the task of formalising this law.
...
The puzzle, according to Dennett, is how “a cognitive creature … with many beliefs about the world” can update those beliefs when it performs an act so that they remain “roughly faithful to the world”?
...
the question of how to compute the consequences of an action without the computation having to range over the action's non-effects. ...the “sleeping dog” strategy... not every part of the data structure representing an ongoing situation needs to be examined when it is updated to reflect a change in the world. ...

...the epistemological question is not so much how the computational challenge can be met, but rather how the robot could ever be sure it had sufficiently thought through the consequences of its actions to know that it hadn't missed anything important.

Fodor suggestively likens this to “Hamlet's problem: when to stop thinking” (Fodor 1987, p.140). The frame problem, he claims, is “Hamlet's problem viewed from an engineer's perspective”. But to warrant the award of depth, an epistemological problem must at least resist the most obvious attempts to resolve it. In the case of Hamlet's problem, the obvious appeal is to the notion of relevance. Only certain properties of a situation are relevant in the context of any given action, and consideration of the action's consequences can be conveniently confined to those.

... Fodor's claim is that when it comes to circumscribing the consequences of an action, just as in the business of theory confirmation in science, anything could be relevant (Fodor 1983, p.105). There are no a priori limits to the properties of the ongoing situation that might come into play. Accordingly, in his modularity thesis, Fodor uses the frame problem to bolster the view that the mind's central processes — those that are involved in fixing belief — are “informationally unencapsulated”, meaning that they can draw on information from any source (Fodor 1983; Fodor 2000).
...
solutions to the logical frame problem developed by AI researchers typically appeal to some version of the common sense law of inertia, according to which properties of a situation are assumed by default not to change as the result of an action.
...
According to Fodor, this metaphysical justification is unwarranted. To begin with, some actions change many, many things. ... But a deeper difficulty presents itself when we ask what is meant by “most properties”. What predicates should be included in our ontology for any of these claims about “most properties” to fall out?
...
These questions and the argument leading to them are very reminiscent of Goodman's treatment of induction... Goodman showed that inductive inference only works in the context of the right set of predicates, and Fodor demonstrates much the same point for the common sense law of inertia.
...
An intimate relationship of a different kind between the frame problem and the problem of induction is proposed by Fetzer (1991), who writes that “The problem of induction [is] one of justifying some inferences about the future as opposed to others. The frame problem, likewise, is one of justifying some inferences about the future as opposed to others. The second problem is an instance of the first.” This view of the frame problem is highly controversial, however (Hayes 1991).
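
For readers who want the AI researchers' version of this in concrete terms, the "common sense law of inertia" and the "sleeping dog" strategy can be glossed with a minimal sketch in Python. (The state representation, fluent names and action below are invented for illustration; they are not drawn from the encyclopedia entry.) An action declares only its explicit effects, and the update procedure leaves every other property of the situation untouched by default, rather than enumerating the non-effects one by one.

# A minimal, illustrative sketch of the "common sense law of inertia":
# an action lists only its explicit effects; every other fluent is
# assumed, by default, to persist unchanged (the "sleeping dog" strategy).
# The fluents and the action here are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    effects: dict  # the fluents this action explicitly changes

def apply(state: dict, action: Action) -> dict:
    """Write only the declared effects; let every other sleeping dog lie."""
    new_state = dict(state)           # copy: untouched fluents persist
    new_state.update(action.effects)  # apply only the explicit effects
    return new_state

state = {"door_open": False, "light_on": True, "score_on_shelf": True}
open_door = Action("open_door", {"door_open": True})

print(apply(state, open_door))
# {'door_open': True, 'light_on': True, 'score_on_shelf': True}

Nothing was said about the light or the shelf, and yet they carry over unchanged; that default is exactly the assumption whose justification the epistemological reading of the frame problem calls into question: how could the reasoner ever be sure the action changed nothing else?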



Learning by practice requires feedback in order to complete its circle, but even practicing people never really observe absolute contingency firsthand. This is merely guessed at, first, as people are thrust blindly into whatever culture they inherit; then, practice proceeds on the basis of those guesses; and from there, as certain radicals would say, all things are possible; or so it may appear, for a while.

As for the present study, there is no pretense to having gained any more complete understanding of the cosmos, or of the individual, than is already implicit in existing cultures, practices, academic disciplines, communities. The effort here can only be as any human effort is: incomplete, slanted, half of a whole problem which amounts in the end to nothing. There are aspects of the problem, however, as with anything that has already been labeled a "problem," which move in and out of human beings' field of direct observation. It is not all for the better that events, very occasionally, reveal their partial answers very unequivocally, while the rest of the time, life is a motley. To human beings it can only seem as if all degrees short of perfect knowledge are equally hazardous, since there is then no more true certainty for the adept than for the novice; it is logical, from here, if this is indeed the assumption that has been made, to assume ultimate uncertainty at all times.

The field of "science" consists in the ascertaining of near-certainties. It can seek these answers, of course, only from problems which seem amenable to offering them up; the process of assessment of a problem as amenable or unamenable is very well codified and will not be entered into in any detail here, for now, because it is relevant only to say that this seems to work very well within the range of its own purview. But this range of inquiry is limited, profoundly so; this all the same regardless of the import in terms of human life, which can be ultimate, in rapture and destruction alike. Human beings can glimpse ultimate ecstasy and ultimate terror through their own works, scientific and otherwise. They have not yet succeeded in returning all of humanity to the earth, nor in ascending as one species-group up and out into the cosmic light. Suffice it to say: they are working on it. Both things. Concurrently. Never one without the other.

The two ultimate outcomes are not commensurable, however. Here on earth, these are not merely two sides of the same coin. There is no dialectic here, not in terms of human life. This study is humanist, then, in the too-literal sense of assuming that humanity ought to continue to exist, or most likely will in any case; and it is antihumanist in the basic sense of opposing most of what has fallen under the heading of "humanism" for the last 500 years. Antihumanism here on earth is humanism when it is viewed from outer space. This is the one and only "national prejudice" which is tolerated here.

Mostly human beings will not sniff anything scientifically actionable unless they are very adept in science itself. It is rare happenstance indeed which grants people a front row seat for the revelation of something profound, concrete, ecstatic, and perhaps also terrifying. These people become true believers in their own certainty on the matter at hand, even where they remain contentedly uncertain of very much else and not particularly troubled or bolstered by the fact. This then becomes just another form of difference between and among human beings, a difference of practical knowledge rather than of formal education, but with some of the same contours. This is why human beings have killed each other over their art, even though art hardly matters at all to anyone, seemingly. Also, it is why so many among those who have glimpsed the absolute truth of "great" artworks could remain or become among the most violent and destructive people in history. These are the two sides of the pragmatic problem to be faced here, and it is an ultimate problem not in terms of any artistic absolutism but rather of human life; this quite literally; this foremost; and then, secondarily, also in the broader, looser, artsier sense.

On this point, the hope is merely to reveal something, anything at all, that is humanly actionable but has not been acted upon because it has not been understood, or not understood by and for people, in a time and place and in mundane circumstances which permit of any such actionability. That is all. Nothing more. Nothing less.


How has art been practiced through the ages? In many ways, and in many incommensurate ways.

How has change unfolded here? Presumably it was very slow or non-existent for a very long time, until it began to accelerate out of control. Have these recent jolts of innovation been validated by practice? Or have they merely made people angry? Inflicted suffering? Threatened to lead humanity ever further down the road of self-annihilation?

Has innovation already completed its expansion? Has it already filled the entire art-universe, leaving nothing further to expand into?

These latter questions are jejune ones, once again, rephrased in the language of the preceding sections such as to reflect this study's central theses. They have been discussed endlessly without being answered satisfactorily.

On these points, unlike with vectors of identity, the broadest causal accounts are not too controversial. When formerly isolated peoples make contact, the more so when they are thrown together to live in close quarters, "ideas have sex," and change is accelerated. This is how the pace of change begins to outpace human beings' ability to keep up. The glacially slow material process of practical discovery and validation cannot keep pace with ideal reproduction. All of that kind of thing is thrown into chaos and confusion as more and more new ideas are born than can be provided for. As Becker explains so well, no one ever knew very many of all the things they really needed to know about their world, but they had enough communal protection against the bugbears that mere survival, at least, was still possible. Now, slowly, and eventually quickly, all of that was rolled back to make way for the "innovations" that would deliver humanity, advance it, to some new-and-vastly-improved way of life, the way of complete knowledge of and oneness with the cosmos.

For this process to take its course, it could not be regulated; not fully regulated, at least, not in the broadest sense, nor in the later technical sense. It is not even clear, even now, how certain branches of it would or could be regulated. And so it has run away and has to be chased down by regulators, if they have a visual on it at all. Government regulators are just one example. Any political relation, any "institutionalized inequality of power," is a latent conduit of regulation, broadly construed. There are plenty of these conduits around, perhaps more than ever, too many; but one cannot even see innovation by looking through them, so far has it advanced out into the expanses.

Individuals are also self-regulators, autonomically and willfully both, and of course they also have regulation done to them by all the circumstances unfolding outside of their own bodies. When Rank writes of artists meeting their own dynamic needs of equilibration, he is homing in on one very peculiar type of self-regulation. As is clear enough by now, this is just one side, and merely to quote it out of context, as above, is already risky. The risk is well-known to any artist who has received more than cursory critical attention: if the art has been a violent lurch away from prevailing norms, something previously must have lurched at the artist just that violently; or, just as speciously but not quite as baldly aggressive, if the art has been so conventional as to be worthy of its inevitable short ride to oblivion, then clearly the artist must be a fat-and-happy bourgeois with plenty of time, money and intimacy to keep them in the mainstream, in art as in life. Input: Output. Image: Reflection.

There is no need to abandon this folk wisdom, not yet, nor to abandon the many witticisms and anecdotes based upon it. It is probably true on the whole. But what does "on the whole" actually mean? What can really be said about the whole world, and then applied so directly on smaller scales, all the way down to group or individual concerns? There is not much at all of this kind, and that is why there are two absolutely opposed theories of "society"; that is why some people have questioned whether society per se exists at all in reality or is merely the cultural illusion du jour. The two theories have been called "ontological nominalism" and "ontological realism," but this is unwieldy. For ease of use, colloquial discourse received the ideal reductionist phraseology when Margaret Thatcher declared of society that, "There is no such thing!" Ever since, there have been Thatcherites and Reaganites, and there have been Social Democrats; and around the fringes there have been the timeless boutique centrisms and extremist death spirals.

The position taken here, tersely condensed, is that believers in society are onto something, but they have one humungous detail all mixed up: much of what they are after is simply untenable on the scale of the post-industrial nation-state. It is not even possible in a city of millions. Information technology, designed to overcome precisely this pessimism of scale, has merely validated it beyond a shadow of a doubt. When people are not accountable to each other by way of mutual integration into elemental social structures, it does not matter what kind of choice architecture or content moderation prevails on a given internet channel; there will be truly antisocial behavior. (Antisocial, not misanthropic. There is always some of that too, but that is something else. Rather, antisocial, as in "society.")

On a scale of individual-to-society, there is a Puritan waystation, community, which best names the price. Communities are not as individualistic as contemporary Western democracies, but nor are they any more "societies." Not at all. Communities are highly independent, perhaps self-governing and isolationist, confined to a patch of ground, tending more towards stability than innovation. Cloistered bourgeois neighborhoods and recreational affinity groups are referred to, these days, as "communities," but really they are just what they are: cloistered away or escaped into. They quite literally get their sustenance from farflung sources: from food grown on the other side of the world by people they will never meet, or from a parasocial relationship with a celebrity who may well turn out never to have existed at all. That is not all bad, but it is not community. Actually it is the opposite.

As with anything and everything human, the ideal of community is an ideal. Real life is never so simple. Any and all of that kind of thing is happily conceded here. All the same, this is how the present study aligns itself on the matter: as a work of left conservatism, as Russell Arben Fox has proposed such tendencies be labeled.

What of art, then, if both contemporary social democracy and classical individualism are abandoned for this thing called community? Is this not the death knell for modernism? For experimentalism, for activist ostentation, and for hyperminimalism too? Is Pure Art out and Functional Art in? Are all the artists who have pursued all of this so doggedly throughout the tumultuous and long twentieth century simply to be abandoned by the side of the road? Perhaps in the end, but not yet. For now this work appears as just another body of practice, groping in the wilderness for feedback, receiving some, going wanting in other ways, and then dropping into the postmodern grab-bag of intertextual reference and filmic leitmotif, from which it is fished out only for very calculated reasons; indeed, it is trotted out most often as a form of critique and rarely at all as living practice.

Rightly or wrongly, modernist and experimentalist art always has, from the get-go, been emblematic of the Great Acceleration. It changed too much and too quickly for audiences to get a bead on it; therefore audiences not only rejected it, largely, but were (and still are) for the most part repulsed. People still react to early modernism with the same deep distaste and even aggression with which its very first audiences did. That is said to settle the matter right there. The reason people still encounter this work, however, is that yet other people, a tiny elite perhaps, have worked tremendously hard to preserve it and keep it in circulation. This cohort is tiny and their quality of commitment is ultimate. That is a delicate problem for a "society" to confront, even in the sphere of leisure.

In a very famous series of lectures, published as Art and Technics, Lewis Mumford gives voice to the everyman's experience of modernism.

the symbols that most deeply express the emotions and feelings of our age are a succession of dehumanized nightmares, transposing into esthetic form either the horror and violence or the vacuity and despair of our time
(p. 7)

This towards the beginning of the lecture series. And towards the end,

The healthy art of our time is either the mediocre production of people too fatuous or complacent to be aware of what has been happening to the world--or it is the work of spiritual recluses,...artists who bathe tranquilly in the quiet springs of traditional life, but who avoid the strong, turbid currents of contemporary existence, which might knock them down or carry them away. These artists no doubt gain in purity and intensity by that seclusion; but by the same token, they lose something in strength and general breadth of appeal.

(p. 147)

Few see fit to disagree with this outlook whenever it is offered up, at least as concerns their own sensibilities; nevertheless, the argumentation here is tragicomically wrong, and it stands out as such because there is not very much that Mumford ever got wrong, not like this. It stands out particularly to a Rankian, too, because Rank's book was twenty years in the rear-view by this time; Rank had already shown just how dubious this kind of socio-determinism really is; also, Rank and Mumford indeed evince much the same eclectic polymathy on some of the same subjects. Mumford, however, reveals here that he had not thought very hard at all about the relationship between creation and experience; despite his encyclopedic grasp of the creations, his account of the creators will not do.

The purely procedural gaffe Mumford begins with is to take Picasso's Guernica to be a representative work. He takes it to be such because it is considered "great," perhaps the greatest. But that is precisely why it is not representative. Mumford had his reasons for doing so, it seems; unfortunately they are not good ones. Guernica is a one-off among one-offs; so one can only hope, given the event which it portrays. Elsewhere in the world, even at this fraught historical moment, many other people were painting landscapes, reviving Shakespeare, and singing lieder. It is not so easy to simply round out the tautology and insist that these people, if this is what they were doing, must not have been too disturbed over the bombing of Guernica, or perhaps did not even know about it. That is a difficult point to litigate granularly, it must be conceded; but it is dubious anyway on a high level. Indeed, all things are not possible at all times, but much remained possible in 1937 besides "vacuity" and "nightmares." Much of what had already been done in Europe and America remained doable, and people did it. If Picasso or Faulkner was too much, there was always Benny Goodman or Charlie Chaplin. What was not doable? That which had been forgotten, disused, or wilfully eradicated.
[Mumford on the death of "polytechnics"!!]

The sin of modernism, then, is not that it succeeded or failed in its totalitarian ambitions. It would be just as easy to conclude, ahistorically, that modernism "comforts the disturbed and disturbs the comfortable," nothing more nor less, and that the world actually and always needs just a little bit of that kind of thing (but only a little bit . . .). The reality, though, even for the advocate, is that certain parts of the world, at certain times, did get much more modernism than was good for them. Why? Because many modernists socially transacted in their work such as to impose it on people rather than to allow people to take it or leave it. As concerns music, John Mauceri has recently put forth an account of this sordid history, an account which gets at many underdiscussed aspects of it, but unfortunately suffers from a condemnably vulgar form of socio-determinism which puts Mumford in the shade.
[link to some Gann material would be really helpful here...but which post(s) is it in?]

The reality, then, is that even the ostensibly Puritan "high modernism" of the mid-twentieth century was not pure at all.
[note on Kavolis]
Rather, it was transacted viciously and absolutely. It was imposed upon the people by a tiny elite. By that time the aesthetics of it all do not matter a lick. The psychology of the creative process and the ramifications of biography do not matter anymore. There is a more basic problem.

Artists of advanced sensibility must confront this, openly and honestly, and they must do it with and for their audiences, not to their audiences. They must afford audiences the fundamental democratic right of abstention, without which there is no true consent. That is what was not done when modernism was fresh. And then, just as Lasch says, somewhere, that those calling most loudly for revolutionary liberation later are found at the head of some new system of oppression, the experimentalists and conceptualists took the helm. They are not quite as vicious as the modernists and are more in tune with their own times, but they are human beings thrust into much the same motley of global expansionism and ideological chaos as were their radical forebears. They have committed all of the same sins of impure social transactionality and top-down imposition; and so, where it once was atonality or nonrepresentationalism which was pushed on people, now all has become conceptual and didactic, all the same if its subject matter is soap or citizenship, food fights or class war.

No human matter ever unfolds ideally or according to plan, and art history is no different. But history can only unfold. It can be refolded and shipped to oblivion, postage due, only by being forgotten and disused.

Modernism's sins of transaction must not be forgotten, and neither must the work itself be forgotten.

The work and the transaction (and the backstories and the forestories) are not simple reflections or antitheses of each other. Their relationships are infinitely complicated. Those relationships are intractable. Communities can only make judgment calls on such matters; they cannot penetrate to the absolute truth of who was thinking what, why, and how.

The way forward, as Mauceri says in his brightest moment, is simply to present the work. Practice modernism within the social and legal norms of the time-and-place. Transact in purity and benevolence. Do not lure people into traps and then hold them there. Let them wander past, stop and look, come and go as they please. Let them stay for as long or as short a time as they wish. Let them think what they will.

Do not read the act of leaving a concert midway through in the manner it has always been read, not even if there are other hints to this effect. Do not read this as a Cognitivist does. Read it as a Behaviorist does.

Foremost of all: do not hold grudges. Artists must continue to practice community self-government and self-transcendence with and for all of their own concert-leavers. Otherwise they have excluded themselves from the community. They have not been excluded by their leavers; they have excluded themselves.

In other words: Turn absolutely against the radical platform of art-life monism. Compartmentalize, work, play, commune. In other words, practice community. Community can be stifling. It does not have to be that way.

Continue to give concerts and see who stays, and who comes back. Someone always does, though the door is always open for them to leave. This may mean that they need to be there. The reason for their need is not important. What is important is that it be met, and that the needs of the artist to have at least a few people watching them are also thereby met. These are basic needs. If either set of needs appears unreasonable, something larger, elsewhere, has gone very wrong.

This is how those subsumed within an insular, monolithic community life can find "their people" all the same. They can have just enough individualism to keep things fresh without needing to integrate the entire universe with itself.

By that time all is well except for one thing, which Becker has spoken to eloquently and extensively: the parties on either side of a social transaction must be wary of landing, first, in sacralization of each other, and later in scapegoating of each other. When that happens (it almost always happens, given long enough), practice has taken its course, the previous experiment has come to an end, and a new experiment must be launched. The community, nominally, remains unchanged. The body replaces its cells so gradually that it remains identifiably the same person; in fact it becomes only more like itself with age. But the cells must replace themselves, or else the organism will quickly die. That is what modernism must become: cell replacement as against amputation, implantation or prosthesis.

NOTES

. . . "An Open Letter to Open-Minded Artists" . . .

The author confesses that he has not actually read the work whose title he riffs on here. He promises to do so soon.

Lewis Mumford, The Myth of the Machine, p. 255 (Google Books preview): "A genuine polytechnics was in the making..."










More Is Different

With Rank, Becker saw that asceticism and gluttony, potlatch and hoarding, moralism and hedonism, are not just culturally relative "human problems" (though they are that too), but also solutions to an ultimate human problem. They fail as solutions as easily as they succeed, of course, but this does not prevent their independent discovery elsewhere in time and space.

Moralism especially has failed because it requires a diversionary cover story that becomes ever more difficult to sustain as knowledge advances. Moralism is always standing astride the knowledge-cognition gap, and so it is always falling in. Someone always goes down to the bottom and fishes it out, though, now a bit worse for wear, and sets it back astride. And so it goes.

Morality itself is culturally relative, which creates intergroup conflict; but moralism can fail within a culture if the leapfrog of advancing knowledge is feeling frisky. At root, morality is always more ideal than real, so it is constantly running up against "real" events that it cannot account for.

It is easy to imagine that material circumstances have something to say about the formation of morals, but morality does not really come from reality. As Becker says, "Man's answers to the problem of his existence are in large measure fictional."

Thus it is that a comparative anthropologist is always already a theorist of civilizational collapse, and Becker is no exception.

anthropology has taught us that when a culture comes up against reality on certain critical points of its perceptions, and proves them fictional, then that culture is indeed eliminated by what we could call "natural selection." When the Plains Indians hurled themselves against White man's bullets thinking themselves immune due to the protection of Guardian Spirits in the invisible world, they were mowed down pitilessly.

But the curious fact is that reality rarely tests a culture on salient points of its hero-system. ... man seems to have been permitted by natural bounty to live largely in a world of playful fantasy. Whole societies have been able to persist with central beliefs that bore little relation to reality. About the only time a culture has had to pay has been in the encounters with conquerors superior in numbers, weapons, and immunity to certain diseases.

As for more recent times,

One of the terrifying things about living in the late decades of the twentieth century is that the margin that nature has been giving to cultural fantasy is suddenly being narrowed down drastically. The consequence is that for the first time in history man, if he is to survive, has to bring down to near zero the large fictional element in his hero-systems.

This latter point is arguable, and it has been argued over, certainly. For art people, the knee-jerk rejoinder is to say that not very many people, now or ever, have much enjoyed living without some "large fictional element," in "society" or otherwise; while that element has never been wholly or even mostly a matter of art or aesthetics, these nonetheless are the best places to find it, and they are harmless diversions anyway. It simply is not worth living without fictions. To live this way is to underbid life's price by a factor of ten.

In answer to this: contrary to the well-worn saying, what people do not know can hurt them; in fact it can destroy them, along with everything they think they ever knew.

The question is, then: What illusion can a society afford? Or, to return again to the Game Show theory of existence: What is life's price, and can a society name it to the nearest dollar without going over?

Becker's marshaling of psychoanalytic theory, though he rejects and revises far more of it than he merely serves up, is just another aspect of his books that has become ever more unsightly in polite scholarly company. Yet it is not so easy to reject passages such as the following simply because of the sourcing:

Since his choice of mechanisms of defense, of a style of life, is the child's adaptation to superior powers, this choice does not reflect his own real feelings, his own true perceptions. In fact, it would be difficult to determine what these might be since, in large part, the child was not given the chance to have them. This means that the child's denial of his burdens is "dishonest," not fully under his control, unknown to him: his character, in a word, is an urgent lie about the nature of reality.

The child thus becomes the prototype for the later adult. At a certain point Freud's "mechanistic" account of this became untenable: the former Freudian reality was itself relegated to "cultural fantasy" when it ran up against later, better empirical observations. But it is easy enough to see, above, that Becker is really talking about everybody. It has only been confirmed, as knowledge has advanced, that when any human being is "not given the chance to have" any of their "own real feelings," one likely result is a "dishonest denial of burdens" and a certain unwittingness vis-a-vis one's own lie of "character." Those observations have held up just fine, and so there is more than a hint of primitive animism in rejections of Becker, in toto, merely on the basis of purported Freudian contagion. Freudianism may be very wrong, ultimately, but it has shaken loose all manner of precious stones from the attic of the scholarly mind, along with a few wasps, and so it is derelict to simply declare a taboo on any author who appeals to it.

Becker is, finally, the kind of anthropologist who is an even more astute observer of his own time than of past times. He is a pantheon moral philosopher and political theorist, and even a naive but brilliant philosopher of science, who is still being discovered as such. And so, he is not above turning the telescope around and looking in the thick end.

Modern man is denying his finitude with the same dedication as the ancient Egyptian pharaohs, but now whole masses are playing the game, and with a far richer armamentarium of techniques.

Hence,

Life in contemporary society is like an open-air lunatic asylum with people cutting and spraying their grass..., beating trails to the bank with little books of figures that worry them around the clock... This is truly obsessive-compulsiveness on the level of the visible and the audible, so overpowering in its total effect that it seems to make of psychoanalysis a complete theory of reality. I mean that in this kind of normal cultural neurosis man's natural animal spontaneity is almost wholly stifled: the material-technological character-lie is so ingrained in modern man, for the most part, that his natural spontaneity, his urges toward mystery, awe, and beauty show up only minimally, if at all, or in forms that are so swallowed up in culturally-standardized perceptions that they are hardly recognizable:... Modern man is closed off, tightly, against dimensions of reality and perceptions of the world that would threaten or upset his standardized reactions: he will have it his way if he has to strangle the segment of reality that he has equipped himself to cope with

Such declarations have by now become tiresome, and indeed there is some garden-variety pessimism here; but as with the Freudian account of childhood, it is not so easy simply to close the book and go do something else. Becker here has got a bead on something more than the quotidian depravity of lawns and bank books.

When man found that certain ways of doing things worked to bring him satisfaction and survival, these ways became true and right; ways that didn't work became false and wrong. And so moral codes grew up around the interrelationships of things, theories of good and evil that tried to separate the real from the illusory.

The curious thing about this long search for reality, as anthropologists have long known, is that a large part of it was accidental. Primitive man did not know the interrelationships of things in many areas of his life, and he thought these interrelationships were primarily invisible and spiritual. As a result, when something important did not work, he looked for any clues he could get,...

And so, to bring things up to date,

The second curious thing about accidental causal explanations is that they did not vanish from the earth with prehistoric evolution, but remained an intimate part of human beliefs all through human history, right up to yesterday, so to speak.

In sum: a human being may survive and thrive, and indeed feel good, and yet unwittingly be in grave and constant danger all the same. There always seems to be too much suffering in the world and not enough joy; but there is never very much "reality" even in the best of times; and finally, if times are just that good, nobody is going to be particularly keen on upsetting the apple cart.

What is to be done about this, if anything must or can be done about it?

While we can agree that the task of social science is nothing less than the uncovering of social illusion, we must also right away admit that we understand that man can never securely know what absolute reality is. ...we have to rephrase our problem to put it in the more pragmatic terms proper to our talents. We cannot ask in any ultimate sense, What is Real? but we can ask experientially, What is False?—what is illusory, what prevents the health, the coping with new problems, the life and survival of a given society? What are its real possibilities within the web of fictions in which it is suspended?

Notice that Becker articulates something which is elsewhere called "negative empiricism," the method of ruling out causes, the task with which GPs and CEOs so often busy themselves; this as opposed to the "positive" feeling about in the dark for whatever one might hit one's head on.

Notice also that he grounds the "possibilities" and the "fictions," grounds them in each other, if not quite dialectically then at least relatively and dynamically. In this he makes a true leap towards something profound. This is Becker's visionary statement. It does not regress into mere prejudice, at least if it is understood. It regresses as such only if the grounding is undone by a refusal to accept limits on human life.

Now, who wants to live their life within someone else's "limits?" Who wants to be so "negative" all the time? Are these not humanity's prisons, from which people have only just begun to escape? That has been the gist of much hopeful conjecture; it has been a revolutionary battle cry on and off for a long time now. But the arrogation has had to be walked back in shame very shortly thereafter, every time it has been made.

And so, finally, one must face up to the political side, and as always, it is the most unpleasant side to face.

The Oedipus complex, understood simply, is the gulf that exists between one's early training, one's basic perceptions, one's primary sense of self, and the choices, opportunities, experiences and challenges of the adult world. If the Oedipus is heavy this gulf can be great enough to completely cripple the person's ability to live in a changing adult world. But democracy needs adults more than anything, especially adults who bring something new to the perception of the world, cut through accustomed categories, break down rigidities. We need open, free, and adaptable people precisely because we need unique perceptions of the real, new insights into it so as to disclose more of it. In a democracy the citizens are the artists who open up new reals. The genius of the theoreticians of democracy is that they understood this, that we must have as many different individuals as possible so as to have as varied a view of reality as possible, for only in this way can we get a rich approximation of it. Twisted perspectives then get corrected easily because each person serves as part of a corrective on the others. Totalitarianism is a form of government that inevitably loses in the longer run because it represents the view of one person on reality... When the pressure of reality becomes too much, all go down together.

And so Becker, probingly but farsightedly, comes out as a diversity theorist, a Talebian antifragilist, and a Laschian populist, all at once. Perhaps this is not really that remarkable, since even these far-flung cohorts have some common intellectual heritage. It is worth noting, though, and what Becker says here, quite literally at the end of his oeuvre and his life, can only color, retrospectively, everything else he wrote prior. He articulates something here which can only appear, fifty years on, as some kind of eclectic cherrypicking from left, center, and right, with pinches of Marxism and Nihilism to taste, perhaps as a cocktail party ruse or social media stunt. This, however, is not what this thought represented fifty years ago; that is not where this thought came from. And while it would be literally true to label Becker "eclectic," this customarily implies superficiality also, and that will not do in this case.

An intellect of this caliber, even, cannot have thought of everything, and in any case no short stack of books can touch upon anything-and-everything. And so it remains for others, confronted with this opening up of space, to begin to fill it in, one page at a time. This is welcome in an epoch when it seems as if everything has already been done; and yet this task begins, now and perhaps always, with pessimism.

To start, contemporary diversity theory has not brought any new perceptions into the world. It has not broken down any rigidities but rather is itself a rigid and ever more rigidifying totalitarian ideology. It arises from good intentions, seemingly; the very best, in fact. The problem is that it is not what it seems to be: it is a Beckerian "cultural illusion" to the letter, clothed in pseudoscience and hand-waving. It is not "empirically true." In fact most of its basic assumptions had already been ruled out before it was even conceived. As such it is already (was always) an affront to the very reality it seeks to act upon.

The ideal of diversity per se which Becker has in mind here had already been practiced, accidentally, in a few isolated times and places; otherwise later diversity theorists would not have had much scholarly ammunition to work with. This historical data, however, has scarcely been heeded in contemporary policy making. In its place, revolutionary cognitivism and idealism carry the day. "Diversity" has come down to the real-life aesthetic practice of leitmotif, to a protracted fit of "larping," and to a total work of art as life, one that Wagner himself could only envy. Obviously, that will not do.

Many of its detractors refer to the contemporary diversity ideology (and pretty much everything else left-of-center) as "Marxist," but there really is not much of Marx or Engels to be found in it, and there is not a whiff of Plekhanov, Kautsky or Luxemburg. What there is, rather, is a crucial element of Leninism, or Marxism-Leninism, and it is the all-important element: the use of state power (and later of soft power) "to keep down one’s enemies by force," its use for "keeping down" any and all class enemies as the ideologists du jour see fit to declare them. That is what is truly ominous about the events of the 2010s and 2020s. The rest of it is just people being people, and it will settle into something better with time.

(It will of course be argued that diversity measures lift people up and hold no one down. On this, see all above re: the dual aspect of causality.)

There is something transcendent to be done with and through "diversity." It has already been done, accidentally and in spurts; one day it will be done intentionally and always. It is possible. That is an unshakable article of faith which this study seeks to affirm, not to deny. No artist or aesthete of any sophistication at all can fail to grasp the ideal of diversity that Becker lands on above. The contemporary diversity ideology ain't it.

As for the matter of fragility, what Becker says above about democracy, while it is a characteristically searing point, cannot really be projected onto the global political stage. Becker seems to have studied the entire world, but he cannot be talking about the entire world here. His wisdom and his subject matter both are "cosmic," but cosmic diversity and community diversity are absolutely incommensurable notions. All such concerns are modulated by scale and scope. That is why desperate efforts to recover some kind of democratic community life always come down to excluding someone, even if that someone has some claim (or many claims) to have been included; and this of course has mostly ended in crippling acrimony and violence rather than in community self-rule. That is how contemporary citizens of the world are introduced to the problem of fragility and its relationship to the problem of diversity.

These problems are only getting worse. By this time, then, existential contemplation is a bit of a luxury. Something must be done, something which is cognizant rather than ignorant of the fact that scale and scope are dynamic modulators. More is neither more nor less. More is different. And yet the stomachs of the world will never be as big as the eyes. That much will never be any different for human beings, and their feigning of ignorance here has long since grown tiresome.

Sometimes a human being who has ended up feeling bad after they once felt good will have a lucid moment. They will tell Carruthers The Elder to hide the cake before they have a chance to grab a second or third piece. Back home, they will pour all the booze down the sink, cut up their credit cards, or declare that they no longer enjoy listening to a certain style of music and perhaps never really did. People need help to arrive here, it is true; and people need people generally, indeed; but the help that comes from other people does not need to be hand delivered. Sometimes it is enough for someone simply to notice how bad things have gotten for them, compared to other people at least, and to notice that this was not always the case; and from there, this person can initiate a well-controlled experiment on themselves and follow it out to some slightly happier resolution. They will say they have done some thinking, or have been working on themselves like an auto mechanic, but that is not quite accurate. It may be an outright lie. And many of them will say they did it all by themselves, which is a bald-faced lie that none of their friends and relations will fall for.

People have known all about this for a long time, but when there are eight billion of them things are a bit more complex. Specifically, if someone at the party says "Hide the cake before I eat any more!", inevitably another will chime in, snidely: "What, you don't like cake?!" What happens next is anyone's guess.

And if there are people who find out that they were never even invited? Perish the thought. "What, you don't like us?!"

One can still, now, go to Freud and his discredited "orthodox" followers for an understanding of this very specific scenario. They describe it all perfectly. The reason their descriptions of it are so good, in spite of their vast hubris and ignorance of other matters, is that this is one thing they were not ignorant of. They lived it. They observed it on the very small scale of domestic and community life, without podcasts or sexbots to enervate and distract them.

When there are eight billion people at the party, no one can observe very much or really have any idea how they might manage to do so. That is a state of severe global "fragility" in Taleb's sense. And that is the whole point of this section.

As for what Becker says above specifically about democracy, it is best to pick up that thread a couple of decades later, where it is taken up by another visionary thinker who was being summoned away just as his work was falling into place.

Asking The Right Questions

Whither populism?

According to the European Center for Populism Studies, "the ideational approach" to populism defines it as

an ideology which presents “the people” as a morally good force and contrasts them against “the elite”, who are portrayed as corrupt and self-serving. Populists differ in how “the people” are defined, but it can be based along class, ethnic, or national lines. . . . the populist decides who the real people are; and whoever does not want to be unified on the populist’s terms is completely and utterly excluded . . .

To this it can only be said: corruption and self-dealing exist. Even if it is denied that they (or any other human construct) exist as ultimate realities or essences, a Real Populism nonetheless rises to meet them on the pragmatic and phenomenal levels. Individuals, groups, institutions, worlds: all may be or become corrupt.

It is a bankable assumption, by now, that the possibility of "class, ethnic, or national" divisions will be emphasized in dictionary definitions of populism, and there is plenty of real-life "practice" for this "theory" to point to in that regard. This "can" happen. Indeed. It does not have to happen, but it can.

Such statements thus put populists of all stripes on the hook for vulgar populism and its associated demagogies. That is one problem, for the other populists at least, though it is the opposite of a problem for the issuer of the statements. There is a second problem created too: such high-level reductions also, somehow, let the class, the ethnic group, and the nation off the hook too easily. In 2024 these themselves so often appear as the most prolific breeding grounds for "corrupt and self-serving" behavior; this precisely to the extent that groups qua groups find themselves conscious of their difference and forced to confront it, be that in geopolitics, trade, internal political division, sports rivalries, or even in mutual "personal" disdain at the atomic level of social life. Profoundly "corrupt" dealings with outgroups are tolerated or even valorized while a strict moral code prevails among the ingroup. "Elite" corruption is merely one instance of this, the instance which most people are most comfortable talking about: after all, most people are not "elites" and hence do not implicate themselves with such talk.

The ECPS article grants that populists may be onto something, sometimes, but it equivocates on the matter of elite corruption, which is itself more or less another "nationalist" ideology, one which has developed, spread and thrived among an insular group of people who just happen to live in the same time but not in the same space. If these people had to live with themselves, in a real community rather than a virtual one, the problem would be resolved quite parsimoniously. To speak of their insularity, though, is already a misleading reduction. "The elites" are insulated, physically, from "the people," and yet the elites are integrated with the people in a few specific ways. It has been averred, for example, that the present paradigm, "late capitalism" per se, aims to maximize the flow of capital across borders while keeping most of the people contained. The problems with that are obvious by now, but it does not have to be just capital and bodies. It could be anything. What is called "globalism" is actually imperialism in a velvet glove. This has been understood for a long time, since at least Rosa Luxemburg's Junius Pamphlet. It has never been possible for the people to do very much about it, however. "Imperialism" does not operate only by way of "capital," either. It is to the great credit of the radical left to have understood and emphasized the comprehensive nature of imperialism, to have noted its destruction not only of subsistence economies but of street-level culture.

A Real Populism, then, is a populism which is able to hold two quite dissonant ideas in its head at once. First: as globalism advances, nationalism becomes ever more toxic as it is diverted into all manner of base prejudices and outrageous platforms aimed at turning back the advance. Second: at the same time, the demands of nationalists, no matter what outrageous prejudices accompany these demands, in the end come down to the assertion of a basic human right: the right to membership in a community and to everything that comes along with that.

When an interviewer suggested that he had "become a man of the right," Christopher Lasch replied,

if I have to be labelled I would prefer to be called a populist.

Acknowledging that "populism can be reactionary," Lasch enumerates its "values."

a sense of limits, a respect for the accomplishments and aspirations of ordinary people, a realistic appraisal of life's possibilities, genuine hope without utopianism which trusts life without denying its tragic character.

"Above all," he concludes,

it is connected to a moral tradition. For this reason alone we cannot let it go out of fashion.

In contrast to this insistence on maintaining connection to a moral tradition, the ECPS article does not seem too concerned with the question of what, exactly, is "morally good" and what is "corrupt and self-serving." The problem with "unifi[cation] on the populist’s terms" seems not to be the nature of the terms but the mere fact that there are terms. That, however, is precisely what community is. Few recent thinkers give more eloquent voice to this point than does Lasch.

The ECPS does concede that "not everyone who criticizes elites is automatically a populist"; also that "keeping a close eye on elites can in fact plausibly be seen as a sign of good democratic engagement." It is even "completely normal" to hold certain "values" which don't align with one's community; yet to "personalize and moralize political conflict" is to exit "productive democracy" and descend into "anti-pluralism." Between this kind of "elite" equivocation and the close-at-hand caricature of "populist" vulgarity with which it comforts itself, it is hard to say which is the greater threat to peace and stability.

It is hardly novel that a loaded term such as "populism" could be thought of differently by observers separated in time and space. In this case, however, the juxtaposition of disparate standpoints poses a crucial and substantive question. As Lasch insists on staying "connected to a moral tradition," so for the ECPS "populism inevitably involves a claim to a moral monopoly," leaving all others "completely and utterly excluded." Where is the room for compromise here? What might it then mean to somehow stay "connected" to a morality which has no "monopoly?" Most likely it means to be driven to madness, slowly and excruciatingly, as one continuously confronts moral others who are just as unamenable to change as oneself and yet are structurally integrated into one's very own political, economic and cultural institutions. It means entering a high-stakes negotiation without agreeing upon definitions of any terminology. That is contemporary American politics in capsule. (Lakoff's "contested concepts" again.)

Lasch's later work fills out his position. "The impending crisis of competence and civic trust . . . casts a heavy pall of doubt over" the notion that "it is liberal institutions, not the character of citizens, that make democracy work." "Formally democratic institutions do not guarantee a workable social order." Further, if fellow citizens "are never held up to any kind of judgment," then "the question that really matters—How should I live?" becomes a mere "matter of taste."

Indeed,

this deeper and more difficult question, rightly understood, requires us to speak of impersonal virtues... If we believe in these things, moreover, we must be prepared to recommend them to everyone, as the moral preconditions of a good life. To refer everything to a "plurality of ethical commitments" means that we make no demands on anyone and acknowledge no one's right to make any demands on ourselves.

Hence,

it is our reluctance to make demands on each other, much more than our reluctance to help those in need, that is sapping the strength of democracy today.

By that time, "we can enjoy only the most rudimentary kind of common life," for "we have no basis on which either to demand respect or to grant it." For Lasch, "liberal democracy has lived off the borrowed capital of moral and religious traditions antedating the rise of liberalism," hence confounding the historical data. Only now that this "capital" has dried up is democratic formalism fending for itself; the result has been the recapitulation of social ills rather than their remedy.

Whatever objections this line may be open to, it plainly enough isolates the crucial hidden premise of the ECPS article, which is precisely this "liberal" faith in institutions and their attendant formalism to deliver not merely a "workable social order" but much else besides. For the varying breadths of moral "pluralism" entailed by several hundred, hundred thousand, or hundred million citizens there is prescribed, ostensibly, the same set of procedures.

And yet . . . "If we believe in these things, we must be prepared to recommend them to everyone." In other words, we must be moral creatures, because that is all that we can be. This entails the very real possibility of exclusion, including self-exclusion. It matters not whether morality is relative in a philosophical or culturally relativist sense. It matters not what roles "nature" and "nurture" have played in retrospect. Morality can be severely strained by circumstance, it can be explained or explained away, it can ossify or evolve, slowly; but if it is amenable to the quotidian bargaining or suggestion of practical politics, it is not morality anymore.

It is not so easy, then, to talk past the fundamental populist demands, even where it is very easy to detect incoherent or destructive aims alongside them. The right to community entails the right to "recommend to everyone" that which the community stands for and has worked for. The informed moral relativist position, then, is not to simply point out the abstract incoherence of the moral sensibility when this sensibility is viewed, once again from outer space, as a mere artifact of cultural relativity. Planet Earth can perfectly well tolerate a wide range of human moral impulses; she simply cannot tolerate them if they live all together, cheek-by-jowl, in a single global "community" of Billions-with-a-B.

Life as a dyed-in-the-wool moral relativist is tough going, even if it only gets easier to live as an occasional one the further globalism advances. The elites, too, who promote this expectation of rare tolerance, are reduced to plebeians anytime they are confronted with irreconcilable moral conflict. In this they truly are just like anyone else. One can only assume that they do not actually face this problem all that much, though, in their daily lives, or perhaps not at all; otherwise they would not constantly be making things worse rather than better with every word that comes out of their mouths. The word "irreconcilable" is used here for a reason; and if it is ultimately seen, through far more detailed and refined inquiry than can be offered here, that there is not very much which is truly irreconcilable, then it stands to reason that the present regime of globalism is doing a very poor job in this regard, because there has not been very much reconciliation lately and there has been much nationalism, demagoguery and violence. That is the bare minimum of skepticism required.

It is hardly surprising that so much vulgar populism of the 2010s and 2020s has been an anti-globalist populism. Yet the pop-psychology that is deployed to account for the evident racism of these movements can seemingly produce solutions only in the negative, so harshly do these solutions grate against the creed of democratic formalism. There is of course much boilerplate talk of scapegoating, but with none of the context, learnedness or rigor which Becker and his antecedents applied to that problem. There is much talk of people under threat reacting the way people under threat always do, whether in a psychology lab, a work performance review, or a school playground. Often it is even granted to be a real threat, such as economic strife. But pluralism too is a threat. Pluralism is a threat to the personal moral absolutism which is the bedrock of the self-concept. As Stephenson puts it, "almost all that we are in selfhood respects is given to us in relation to social controls." People adapt easily to different brands of soap, but destabilizing the edifice of "social control" strikes at our very "selfhood." What centrist formalists tend to miss about this is that loosening the control too much and too suddenly is not very different psychologically from overtightening it to the same degree. Both are highly destructive of settled "internalizations," and when people's lifeways are destroyed, as Becker sums it up, those people are "as good as dead."

Exasperated globalists throw their hands up at the historical moving targets of race and nationality: it seems to matter not at all who the immigrants actually are or where they come from; they will be opposed regardless. It seems it hardly matters whether they really are morally compatible or incompatible with their hosts, though antiglobalist commentators fixate endlessly on this as the decisive question and are well-prepared to explain away the fact that the same line was taken against their own immigrant ancestors. It seems America has learned nothing from its own history and Europe learned its own too well. The exasperated are onto something here that the sanguine overlook. They should take their own exasperated observations more seriously. They apply tortuous depth psychology to observed behavior while taking words at face value. They have got it backwards.

What is at issue here, again, is merely the universal right to form and maintain communities of moral consensus. Whatever their other shortcomings, populists refuse to abdicate responsibility for the reconciliation of disparate moralities to the formal vicissitudes of a neo-liberal "politics of accommodation" ill-equipped to handle such a task. That is to their enduring credit.

Populists (and some others) are notably unmoved by the line that globalism has "lifted millions out of poverty." Globalists ask rhetorically, What on earth could take precedence over that? They should ask their own exasperated question more sincerely. It would elicit the sincere answers which they had already decided do not exist.

In exalting the label, Lasch concedes that populism "does not offer a ready made solution to our multiple ills." Nonetheless, "it asks the right questions."

As Taleb says,

people invoke an expression, "Balkanization," about the mess created by fragmented states, as if fragmentation was a bad thing,...but nobody uses "Helvetization" to describe its successes.

Prejudiced or not in whatever direction, vulgar populists may in fact prove above all to be poor judges of their own line of exclusion. They hurt themselves, too, by refusing community with foreigners who may ultimately prove like-minded and up to the enormous task facing them. This is not an easy problem to solve, but at least it is theoretically amenable to some technical refinement in a way that moral bedrock cannot be. There is every reason to think that eventual assimilation and acceptance (in some order) are just as inevitable as initial opposition. But responsibility for this cannot simply be imposed upon the rooted any more than assimilation itself can be imposed upon the uprooted.

This is the pragmatic view. A blind faith in institutions is not.




NOTES

. . . "More is Different" . . .

P.W. Anderson, "More Is Different: Broken symmetry and the nature of the hierarchical structure of science" (p. 393).
https://www.ias.ac.in/article/fulltext/reso/025/05/0735-0740

the reductionist hypothesis does not by any means imply a "constructionist" one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more the elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the rest of science, much less to those of society.

The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other. That is, it seems to me that one may array the sciences roughly linearly in a hierarchy, according to the idea: The elementary entities of science X obey the laws of science Y. ... But this hierarchy does not imply that science X is "just applied Y." At each stage entirely new laws, concepts, and generalizations are necessary, requiring inspiration and creativity to just as great a degree as in the previous one. Psychology is not applied biology, nor is biology applied chemistry.




. . . "in large measure fictional" . . .

Ernest Becker, Escape from Evil (p. 126).




. . . "when a culture comes up against reality" . . .

. . . "One of the terrifying things" . . .

ibid (pp. 127-129).




. . . "character, in a word, is an urgent lie" . . .

ibid (p. 148).




. . . "now whole masses are playing the game" . . .

. . . "like an open-air lunatic asylum" . . .

ibid (pp. 149-151).




. . . "moral codes grew up" . . .

. . . "right up to yesterday" . . .

ibid (pp. 155-156).




. . . "What is false?" . . .

ibid (p. 159).




. . . "negative empiricism" . . .

Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable.

a series of corroborative facts is not necessarily evidence. Seeing white swans does not confirm the nonexistence of black swans. There is an exception, however: I know what statement is wrong, but not necessarily what statement is correct. If I see a black swan I can certify that all swans are not white! ...

We can get closer to the truth by negative instances, not by verification. It is misleading to build a general rule from observed facts. Contrary to conventional wisdom, our body of knowledge does not increase from a series of confirmatory observations... But there are some things I can remain skeptical about, and others I can safely consider certain. This makes the consequences of observations one-sided. It is not much more difficult than that.

This asymmetry is immensely practical. It tells us that we do not have to be complete skeptics, just semiskeptics. The subtlety of real life over the books is that, in your decision making, you need be interested only in one side of the story: if you seek certainty about whether the patient has cancer, not certainty about whether he is healthy, then you might be satisfied with negative inference, since it will supply you the certainty you seek. So we can learn a lot from data—but not as much as we expect. Sometimes a lot of data can be meaningless; at other times one single piece of information can be very meaningful. It is true that a thousand days cannot prove you right, but one day can prove you to be wrong.

(pp. 56-57).

And, of course,

the sources of Black Swans today have multiplied beyond measurability.* In the primitive environment they were limited to newly encountered wild animals, new enemies, and abrupt weather changes. These events were repeatable enough for us to have built an innate fear of them. This instinct to make inferences rather quickly, and to "tunnel"...remains rather ingrained in us. This instinct, in a word, is our predicament.

The footnote:

*Clearly, weather-related and geodesic events (such as tornadoes and earthquakes) have not changed much over the past millennium, but what have changed are the socioeconomic consequences of such occurrences. Today, an earthquake or hurricane commands more and more severe economic consequences than it did in the past because of interlocking relationships between economic entities and the intensification of the "network effects" that we will discuss [elsewhere]...

(p. 61).




. . . "democracy needs adults more than anything" . . .

Becker, Escape (pp. 163-164).




. . . "eclectic" . . .

ibid (pp. xviii-xix).

Becker concludes his preface,

it goes without saying that this is a large project for one mind to try to put between two covers; I am painfully aware that I may not have succeeded, that I may have bitten off too much and may have tried to put it too sparely so that it could all fit in. As in most of my other work, I have reached far beyond my competence and have probably secured for good a reputation for flamboyant gestures. But the times still crowd me and give me no rest, and I see no way to avoid ambitious synthetic attempts; either we get some kind of grip on the accumulation of thought or we continue to wallow helplessly, to starve amidst plenty. So I gamble with science and write, but the game seems to me very serious and necessary.

Similarly, from the Preface to The Denial of Death,

One of the reasons, I believe, that knowledge is in a state of useless overproduction is that it is strewn all over the place, spoken in a thousand competitive voices. Its insignificant fragments are magnified all out of proportion, while its major and world-historical insights lie around begging for attention. There is no throbbing, vital center. Norman O. Brown observed that the great world needs more Eros and less strife, and the intellectual world needs it just as much. There has to be revealed the harmony that unites many different positions, so that the "sterile and ignorant polemics" can be abated.

(p. xviii).




. . . "to keep down one’s enemies by force" . . .

"Engels to August Bebel in Zwickau" (March, 1875).
https://www.marxists.org/archive/marx/works/1875/letters/75_03_18.htm

Grammatically speaking, a free state is one in which the state is free vis-à-vis its citizens, a state, that is, with a despotic government. All the palaver about the state ought to be dropped, especially after the Commune, which had ceased to be a state in the true sense of the term. The people’s state has been flung in our teeth ad nauseam by the anarchists, although Marx’s anti-Proudhon piece and after it the Communist Manifesto declare outright that, with the introduction of the socialist order of society, the state will dissolve of itself and disappear. Now, since the state is merely a transitional institution of which use is made in the struggle, in the revolution, to keep down one’s enemies by force, it is utter nonsense to speak of a free people’s state; so long as the proletariat still makes use of the state, it makes use of it, not for the purpose of freedom, but of keeping down its enemies and, as soon as there can be any question of freedom, the state as such ceases to exist.

Indeed, this is Engels himself writing, presumably not for the first time. All the same, Lenin and his successors were the ones who finally made good on it, infamously. Also, it is transparent enough that most contemporary activists have not read a book in a very long time, and so calling them "Marxists" actually gives them too much credit.




. . . "the ideational approach" . . .

European Center for Populism Studies, "Populism."
https://www.populismstudies.org/Vocabulary/populism/

First accessed by the author on 20 January, 2023, on which date this was the third-from-top Google search result for the query "populism."




. . . "Junius Pamphlet" . . .

Rosa Luxemburg, The Junius Pamphlet: The Crisis of German Social Democracy.
https://www.marxists.org/archive/luxemburg/1915/junius/




. . . "if I have to be labelled" . . .

"On the Moral Vision of Democracy (A Conversation With Christopher Lasch)."
https://chamberscreek.net/library/Christopher%20Lasch/car_interview.html

There is no bibliographical information at all on this page, but elsewhere on the same site, the following is given:

Civic Arts Review Vol. 4, No. 4, Fall 1991.




. . . "liberal institutions, not the character of citizens ... Formally democratic institutions ... How should I live?"

Christopher Lasch, The Revolt of the Elites (pp. 85-87).




. . . "to speak of impersonal virtues" . . .

ibid (pp. 87-88).




. . . "our reluctance to make demands on each other" . . .

ibid (p. 107).




. . . "the most rudimentary kind of common life" . . .

ibid (p. 88).




. . . "lived off borrowed capital" . . .

ibid (p. 86).




. . . "irreconcilable moral conflict" . . .

. . . "Pluralism is a threat" . . .

. . . "politics of accommodation" . . .

Noam Chomsky, American Power and the New Mandarins.

Chomsky, in his first political work, is onto these problems, albeit obliquely, when he observes that "American politics is a politics of accommodation that successfully excludes moral considerations." Rather, "only pragmatic considerations of cost and utility guide our actions."

It is deplorable, but nonetheless true, that what has changed American public opinion and the domestic political picture is not the efforts of the "peace movement"—still less the declarations of any political spokesmen—but rather the Vietnamese resistance, which simply will not yield to American force. What is more, the "responsible" attitude is that opposition to the war on grounds of cost is not, as I have said, deplorable, but rather admirable...

(p. 10).

Faced with atrocity,

By entering into the arena of argument and counterargument, of technical feasibility and tactics, of footnotes and citations, by accepting the presumption of legitimacy of debate on certain issues, one has already lost one's humanity.

(p. 9).

In these respects, Vietnam certainly stands as a lurid historical monument to a democratic formalism that has lost its moral compass. Unfortunately, even Chomsky here seems to take it for granted that most people's public-facing morals would fall into alignment on such a grave matter were morals per se simply deemed admissible, a questionable assumption and one which itself evinces a vulgar rather than a Real populism.

William Stephenson's remarks on the formation of selfhood, its relation to social control, and the change-resistant nature of the deepest "internalizations" are also apt here. (See 0-2, notes.)

Writing at the twilight of American consensus politics and without much recourse to sophisticated laboratory psychology, Stephenson the opinion researcher and methodologist has already glimpsed the social psychology of total polarization that by the early 2000s had become part of the national discourse in the US. Already Stephenson notes that Senator McCarthy's approval rating was virtually unaffected by his censure; also that there are two completely contradictory definitions of "democracy" in the U.S., each of which perseveres in rather total ignorance of the other, precisely as Lakoff later concludes about "contested concepts" generally.

Perhaps it takes a populist to see that already Stephenson is catching a pretty good glimpse of the line beyond which people cease to be amenable to democratic compromise. Meanwhile, as concerns aesthetics, a populist takes note also of the parable of the New Yorkers in Texas, who find "self-expansion" and "self-expression" by way of "the trivia of modern consumer goods," all while their "early internalizations remain untouched." This suggests a certain pragmatic boundary between the aesthetic and the moral; it also suggests a nexus, as indeed there must be, somewhere or other.


. . . "as good as dead" . . .

Ernest Becker, The Denial of Death (p. 189).

Man needs a "second" world, a world of humanly created meaning, a new reality that he can live, dramatize, nourish himself in. "Illusion" means creative play at its highest level. Cultural illusion is a necessary ideology of self-justification, a heroic dimension that is life itself to the symbolic animal. To lose the security of heroic cultural illusion is to die—that is what "deculturation" of primitives means and what it does. It kills them or reduces them to the animal level of chronic fighting and fornication. ... Many of the older American Indians were relieved when the Big Chiefs in Ottawa and Washington took control and prevented them from warring and feuding. It was a relief from the constant anxiety of death for their loved ones, if not for themselves. But they also knew, with a heavy heart, that this eclipse of their traditional hero-systems at the same time left them as good as dead.




. . . "it asks the right questions" . . .

"On the Moral Vision of Democracy (A Conversation With Christopher Lasch)."

Populism, however ideally we might want to reconstruct it, does not offer a ready made solution to our multiple ills. I think, however, it asks the right questions. And it comes closest to answering the question about civic virtue. Above all, it is connected to a moral tradition. For this reason alone we cannot let it go out of fashion.


UPDATED: 29 June 2024

The Evolution of Cultural Evolution
Joseph Henrich and Richard McElreath
https://henrich.fas.harvard.edu/files/henrich/files/henrich_mcelreath_2003.pdf

[123, abstract] it seems certain that the same basic genetic endowment produces arctic foraging, tropical horticulture, and desert pastoralism, a constellation that represents a greater range of subsistence behavior than the rest of the Primate Order combined. The behavioral adaptations that explain the immense success of our species are cultural in the sense that they are transmitted among individuals by social learning and have accumulated over generations.

The paper begins:

[123] In 1860, aiming to be the first Europeans to travel south to north across Australia, Robert Burke led an extremely well-equipped expedition of three men (King, Wills and Gray) from their base camp in Cooper’s Creek in central Australia with five fully loaded camels (specially imported) and one horse. Figuring a maximum round trip travel time of three months, they carried twelve weeks of food and supplies. Eight weeks later they reached tidal swamps on the northern coast and began their return. After about ten weeks their supplies ran short and they began eating their pack animals. After twelve weeks in the bush, Gray died of illness and exhaustion, and the group jettisoned most of their remaining supplies. A month later, they arrived back in their base camp, but found that their support crew had recently departed, leaving only limited supplies. Still weak, the threesome packed the available supplies and headed to the nearest outpost of “civilization,” Mt. Hopeless, 240km south. In less than a month, their clothing and boots were beyond repair, their supplies were again gone, and they ate mostly camel meat.

Faced with living off the land, they began foraging efforts and tried, unsuccessfully, to devise means to trap birds and rats. They were impressed by the bountiful bread and fish available in aboriginal camps, in contrast to their own wretched condition. They attempted to glean as much as they could from the aboriginals about nardoo, an aquatic fern bearing spores they had observed the aboriginals using to make bread. Despite traveling along a creek and receiving frequent gifts of fish from the locals, they were unable to figure out how to catch them. Two months after departing from their base camp, the threesome had become entirely dependent on nardoo bread and occasional gifts of fish from the locals. Despite consuming what seemed to be sufficient calories, all three became increasingly fatigued and suffered from painful bowel movements. Burke and Wills soon died, poisoned and starved from eating improperly processed nardoo seeds. Unbeknown to these intrepid adventurers, nardoo seeds are toxic and highly indigestible if not properly processed. The local aboriginals, of course, possess specialized methods for detoxifying and processing these seeds. Fatigued and delusional, King wandered off into the desert where he was rescued by an aboriginal group, the Yantruwanta. He recovered and lived with the Yantruwanta for several months until a search party found him.

The planning for this expedition could not have been more extensive, and these men were not unprepared British schoolboys out on holiday. However, despite their big brains, camels, specialized equipment, training, and seven months of exposure to the desert environment prior to running out of supplies, they failed to survive in the Australian desert. This bit of history makes a simple point: Humans, unlike other animals, are heavily reliant on social learning to acquire large and important portions of their behavioral repertoire. No evolved cognitive modules, “evoked [124] culture,” or generalized cost-benefit calculators delivered to these men the knowledge of how to detoxify nardoo spores or how to make and use rat traps, bird snares, or fishing nets from locally available materials. Unlike social learning in other animals, human cultural abilities generate adaptive strategies and bodies of knowledge that accumulate over generations. Foraging, as it is known ethnographically, would be impossible without technologies such as kayaks, blowguns, bone tools, boomerangs, and bows. These technological examples embody skills and know-how that no single individual could figure out in his lifetime. Nonmaterial culture, such as seed processing techniques, tracking abilities, and medicinal plant knowledge, reveals similar locally adaptive accumulations. Interestingly, this adaptive information is often embodied in socially learned rules, techniques, and heuristics that are applied with little or no understanding of how or why they work.

Thus, understanding a substantial amount of human adaptation requires understanding the cultural learning processes that assemble our behavioral repertoires over generations. This is not, however, a call to separate humans from the rest of nature. A productive approach should seat humans within the broader context of mammalian and primate evolution while at the same time being able to explain how and why humans are so different in the diversity and nature of their behavioral adaptations.