joculum: (Default)
The inconvenience of reading posts in reverse chronological order has led me to overcome my dread that no one will read four thousand words in succession in an online format. As I indicate herein, I have no idea when, if ever, Parts Three and Four will come into being, but this sums up and rethinks a good many topics I have tried to argue are intrinsically linked to one another. There are a good many that are not so linked; I am not likely to incorporate Patti Smith's introduction to the volume presenting the fashion designs of Ann Demeulemeester, for example, although other aspects of the topic might relate to the subject matter herein. Although the topics I seem to incorporate into this essay are numerous, they are not only finite but fewer than they might seem to a sufficiently impatient reader.

—Jerry Cullum (who is asserting his Creative Commons rights if not an outright copyright, as I expect the piece to evolve quite a bit before it reaches finished form)


Tipping Points in the Anthropocene Era, Part One


I have the increasing sense that the world has reached what can only be called (in spite of Malcolm Gladwell’s obnoxious use of the term) several tipping points—I use the term to denote points from which there is no going back, whether the change is for better, for worse, or indifferent but irreversibly different. These are to be distinguished from those all too common points in personal and global lives in which failure to reach a tipping point means that everything will degrade back to the unsatisfactory way it used to be; the world has an ample quantity of those, too, but what has been reversed once can be reversed again, in the opposite direction. Some tipping points are culturally inflected; it is still possible to live without the digital revolution, and even to live without electricity and contemporary medical knowledge (entire societies live that way, not always by choice), but to choose a technology involves choosing a cognitive package that comes with it, even if it destabilizes things you would prefer not to have rendered unstable. Other tipping points are environmental, and are truly irreversible: extinction is forever, even if we genetically engineer a reasonable simulacrum of the original.

Perhaps some environmental tipping points will be avoided by way of cultural or technological tipping points. Perhaps the monarch butterflies will be brought back from near-extinction by the cessation of illegal logging, alteration of pesticide use, and planting of the right (North American) rather than wrong (tropical) species of milkweed on the monarchs’ migration routes. Perhaps colony collapse disorder will be corrected in the bee population, and we won’t have to use robots or impractically immense numbers of underpaid farm workers to pollinate fields and orchards fifty years from now (with the concomitant die-off of large quantities of natural flora not self-pollinated or pollinated by moths or other intermediaries...with cascading consequences for other species). Perhaps a century or so from now solar-powered container ships will deliver, at a slower pace, sweatshop-manufactured goods from remote parts of the globe, the absurdity of just-in-time inventory replacement having gone the way of the dodo bird. (Or perhaps solar-powered drones will have developed to transoceanic capacities, or some other way of getting supercheap goods from point A to point B will have been developed.) Or perhaps the countries hosting the sweatshops will have all fallen prey to demodernizing revolutions, replacing global banking with traditional methods of exchange and imposing legal systems that are not in sync with global standards of acceptable commercial conduct, leading to the rise in the rest of the world of some other, perhaps more technologically rather than exploitatively based methods of making and distributing inexpensive goods (with still further worker displacement....). Perhaps the exhaustion of a number of critically short raw materials will be worked around, as it always has been. Nobody worries about the current shortage of whale oil (to the disadvantage of the world’s whales), and as the Saudi oil minister said once, the Stone Age did not end because of a shortage of stone. (Speaking of the environment in which those whales live, perhaps sustainable fisheries will become a reality, so that fifty years from now we will not have a world in which the only fish available is farm-raised tilapia from polluted waters.) As the multiplying number of parentheticals in this paragraph indicates, the outcome of one inevitable change affects a good many other outcomes.


The future, in all those regards, is unknowable, but the future is arriving rather faster than anybody expected, and there are, at present, inbuilt social structures that keep anybody from being in a position to change things fast enough to meet its challenges unless the solutions increase short-term profits for a very specific set of asset managers in capitalist and putatively socialist countries.

The problem is that many, perhaps most, of the people who can keep up with the technology involved tend to think that the human problems will be solved by the advent of smart machines of one sort or another. The age of the transhuman is a popular notion among some self-styled futurists. Other futurists can write that we should modify William Gibson’s (no tech optimist he) remark that the future is already here, it’s just not very evenly distributed; the future is already here, but in many ways it’s not distributed at all. This small problem does not seem to bother the transhumanists, who presumably will not have to worry about rioting human mobs interrupting the electricity supply by blowing up the grid or the fuel pipeline or the solar and wind farms when the machines take over and the futurists upload their digitized selves onto servers. (Presumably the machines will have devised foolproof defenses for all these parts of the infrastructure, putting humans in general in the position of missile-armed tribesmen fighting against drone aircraft.)

The cultural tipping points are much harder to define than the environmental or technological. The technological, just as the half-delusional prophets of the 1960s predicted, has been a major force in the alterations of the cultural. The whole nexus of events, however, has not unfolded quite in the way that McLuhan or the others expected, as we all become textually visual in the immediacy of a financier-governed global village.

The cultural issues are so confusingly simultaneous that to discuss them one at a time is to fall prey to the misrepresentations that keep us from realizing quite what is happening. But to discuss all of them in the same simultaneous (dis)order with which they arrive in our lives and on our digital devices is likely to leave us with a headache and no greater degree of comprehension than we had previously.

As H. P. Lovecraft put it, in an opening sentence I was quoting before it became fashionable or even acceptable to do so, “The most merciful thing in the world is the inability of the human mind to correlate all its contents.” Thomas Pynchon concurred, of course, but he too did not completely realize just how much there is that needs to be correlated. (William Gibson sort of did, and still does, as do Douglas Coupland and, maybe, Okey Ndibe—Foreign Gods, Inc. is a fairly amazing first novel, as much in its own way as its nearly polar opposite Neuromancer was.) There are frightening gulfs of time and space that have nothing to do with Lovecraft’s space monsters, and globe-spanning sets of interactions that go beyond Pynchon’s intricate economic conspiracies (which is not to say there are not economic conspiracies, just that they are not a sufficient organizing principle to explain what is happening to the planet and the human society that sprawls across the face of it, changing it inexorably and unconsciously as it goes).

Widen the area of consciousness, Allen Ginsberg wrote; but he meant something psychedelic, and while it is good to be aware of the gulfs at the margins of consciousness (where something very interesting might be trying to get us to notice it), we need to widen awareness of things that are much more central to human survival. But we live in an age of multiple centers, or an era in which the notion of the center has been inexorably weakened by the awareness of the networks of meaning and the networks of physical force that we approach one at a time, when in fact we need to be conscious of how they interact.

George Steiner, who prophesied all this some forty years ago in In Bluebeard’s Castle, has become yet another version of the Last European (in the sense of someone defending a culture that already has become, as Pope Francis recently implied, sclerotic at best and moribund at worst). There is something ludicrous about the style of hatch-battening being undertaken by those who feel that it is time to batten down the hatches of the glocal ship against the storms to come and the storms that are already here; this is so not least because the measures are so inadequate and so shortsighted, and because the ships with their unbattened hatches (let us ride in the vessel of this nautical metaphor for as long as we can) are already sinking in the crosscurrents of global forces.

That was a way of putting it; not very satisfactory, to quote T. S. Eliot’s Four Quartets, which is one of the fragments I shore against my personal ruins. The boy from Saint Louis did all right in mythicizing the British culture into which he inserted himself as an immigrant, even if, as Wyndham Lewis put it, he had to disguise himself as Westminster Abbey in order to do it. Derek Walcott and V. S. Naipaul did it differently, coming from the Caribbean instead of the middle of North America, and they have been succeeded by younger generations of immigrant writers, predominantly female.

The curious thing is that Steiner not only noticed this, decades ago, he celebrated the fact that the English deployed by Commonwealth writers was far richer than the language as it was spoken and written by the putative indigenes of a little fog-haunted island off the coast of the peninsula of Asia that we call Europe. In other words: the ex-colonials were doing it much better than the longstanding Brits, and frequently doing it in the home country of their colonizers. A similar phenomenon obtained in France, where African and Caribbean francophones often outdid the proud originators of the language, and did it in Paris, too.

What happens, however, when the new arrivals begin not just to enrich but to supplant the cultural assumptions of the previous generation? The Brits learned in school that their culture came about because after holding out against an onslaught of Danish immigration, Celtic and Saxon-immigrant culture pretty much collapsed under the weight of Norman French occupation, resulting in the hybrid we call England that subsequently pulled Scotland, Wales and Ireland into its orbit. The French...well, the French learned about their ancestors, the Gauls; just how a welter of differently-languaged regions were fused into the hybrid entity of La Belle France tended to be passed over in silence until very recently. Cultures function by mythicizing themselves; cf. the relatively recent books titled The Invention of France, The Invention of Scotland, The Invention of Argentina—everywhere cultures are created by arbitrary denials of difference for the sake of creating a fictionalized national unity that eventually turns into a functioning reality.

As I was implying a few paragraphs back, the problem today is that the new arrivals everywhere are increasingly less interested in mixing and mingling with the existing cultures, not least because the historically defined local cultures are increasingly in a state of collapse, and the long-resident locals are not particularly inspiring exemplars of them. (Whittaker Chambers sixty years ago, in his usual tone of histrionic rhetoric: “It is futile to talk of preventing the wreck of Western civilization; it is already a wreck from within.” Gandhi, some decades earlier than that, when asked his opinion of the European code of civilized conduct: “I think it would be a good idea.” Both quotes are cited from memory, and probably slightly wrong. Don’t quote me.) The fact that a good many readers no longer understand Gandhi’s joke illustrates Chambers’ point; there are habits of irony that never quite took comfortable hold in America, and are disappearing from the Europe that originated them, and for which Chinese and Indian and African intellectuals found analogues in their own cultures. Sometimes the non-European intellectuals are now more adept practitioners of the ironic cast of mind than the heavy-handed literalists of one-dimensional Europe and America—and has anyone noticed how Australia and New Zealand get cast to one side in this discussion, while South American cultures are so tangential to it that it is necessary to reframe the terms of debate in order to include them? Although of course “non-European intellectuals” is a term that includes Jorge Luis Borges, who was more European in Buenos Aires than most intellectuals in Paris or London, and entire subsequent generations of South American writers in Spanish and Portuguese have more in common with their contemporaries in Prague or Dresden than with their contemporaries in, say, Chicago, Toronto, or Sydney.

This essay has yet to get to the cultural consequences of the digital divide, and of the prevalence of cultures of demodernization in large parts of the world, including enclaves in a world that has gone from modern to liquid-culture post-postmodern. But in my experience, fifteen hundred or two thousand words is about all I am good for, and all that anybody can deal with online or in downloads, other than the dwindling ranks of New York Review of Books and New Yorker readers in America, and their counterparts in other countries and other continents for whom ten thousand words at a stretch is as nothing, even onscreen. (The quality of the screen matters a great deal, however.)

So I shall have to take up the difficult aspects of shifts in sensibility some other time, which is just as well since, as all of the foregoing implies, the shifts are so different from one subculture to another that it will take a fair number of paragraphs even to delineate them as inadequately as I have thus far delineated the other stuff. I like to allude rather than spell things out, which, by the way, is a characteristic that both Michel Houellebecq and Graham Harman have ascribed to H. P. Lovecraft in their reassessments of the value of his oft-derided rhetoric and his transhumanism avant la lettre. Onward to eldritch revelations of disturbing fragments of evidence under a gibbous moon, then, but later.

[Regarding the habit of allusion without footnotes—that is why we have Wikipedia and websearches, not necessarily on Google. I look up most of my allusions these days (except when I get really lazy), and sometimes have to search for days before I find the reliable version of the quotation if I don’t have the book on my shelves. But in the old days of interlibrary loans, it sometimes took even longer, and sometimes I still hadn’t tracked it down, even years later.

But it would be fun to annotate this essay; littered with superscript numbers midway through every sentence, it would resemble an idea-convoluted story by John Barth, Donald Barthelme, David Foster Wallace, or their generational relatives and successors, with whom I haven’t kept up.
]


-----------------------------------------------------------------------------

Tipping Points in the Anthropocene Era, Part Two


I have said remarkably little about the American context, and the purportedly postracial American context most of all. But that is because I do not want to sink irredeemably into a dispute over whether the American experience east of the Mississippi prior to 1920 is best represented by Huck Finn, Moby-Dick, The Souls of Black Folk, all of the foregoing plus Emily Dickinson (and maybe Madame C. J. Walker as a practical economic exemplar), or over whether it is pointless to argue about literary and economic history as long as today’s policemen armed with army-grade weapons are killing black men and women for selling cigarettes or standing the wrong way in the street.

So for the nonce, let’s not go there, although we must someday visit how all this affects or is affected by the forces that depopulate drought-ridden agricultural regions while creating boom towns in resource-rich ones, and how much of what is happening is unfolding more or less as it always has whether times in America are deemed to be good or bad.

Better to look for now, very briefly, at the discontents of demodernizing forces across the planet, and their relationship to the insights that Peter L. Berger articulated some forty years ago: that modernity travels with cognitive packages that are difficult to detach from it, and that those cognitive packages are intrinsically destabilizing even when they don’t automatically offer the culturally specific forms of stability that advanced industrial societies had worked out, in the era just before the Arab Oil Embargo first changed everything.

The world’s societies have been relatively settled for so long that their mores have become mythicized as much in the centers of postindustrially developed power as in the cultures closest to the level of hunting and gathering. The successful evolution of ways of routinizing the rights of the individual (and of the corporation defined as a collectively composed individual) has become so codified into law that it is forgotten how much the law has to be transmuted into social habit. One might argue that Edmund Burke and Antonio Gramsci were not irredeemably far apart in their insights into human nature; the difference is that Burke felt that settled tradition reinforced by state power was the only thing keeping humanity’s natural tendency towards rascality from devolving into chaos, whereas Gramsci more incisively saw that the rascals running things were perfectly capable of bending settled traditions and state power to their own ends and of learning how to convince others to believe in versions of settled tradition that were against their own interest and the interest of all their neighbors. It is not necessary to grant angelic status to human beings to support their right to act with full awareness and as much capacity to act as is not instrumentally harmful to other human beings. (But what if the harm is psychological? And what if the instrumental harm is indirect, by way of subtle aspects of the physical environment, whether that environment be the ecological balance of natural forces or the social forces of architecture and urban space? Politics may be based on power, but it also rests on how communities define the nature of nature and the nature of the human.)

In liquid modernity (I confess that I do like Zygmunt Bauman’s locution, for its metaphoric power) the mix of individuality and social support structures is shifting in ways that bring their interplay into clear visibility for those who choose to see. The problem is that scarcely anyone really wants to see; regardless of one’s political or social position, certain opinions that are grounded in reality are going to sound like transgressions against one’s received pieties, and like granting aid and comfort to one’s ideological and political enemies.

We are not yet at irreversible points in the definition of humanity, in spite of the irreversible flood of knowledge about our condition; not so long as bodies of knowledge can be obliterated or occulted by force or disorder. The definitions, of course, remain irreversible in and of themselves; what is reversible is how much is known by individuals in a society, how it is known, whether the exact definition of the knowledge is open to argument and refinement, and whether the knowledge can be acted upon.

So before we enter upon the question of the nature of humanity and the nature of human institutions and the nature of nature and the future of all three together, which may end up as a never-to-be-written Part Three of this, we shall have to consider a couple of present-day (in 2015) examples of the demodernizing and anti-modernizing forces (not the same thing) that have been explored from many different perspectives since I first read about them in Peter L. Berger and read about their practical consequences in the unintentionally mischievous reportage of Robert D. Kaplan.

Boko Haram is, of course, actually named after the assertion that non-African modes of education are heresy in the brand of Islam that the group espouses. Never mind that the brand is itself an opinion open to dispute, or that parallel demodernizing groups have sought to wipe out the intellectual heritage of African Islam itself and a large part of traditional African Islamic creative practice; what matters on an immediate basis is that the group has the imported firepower to enforce its opinion and supplant state power in so doing. (To combine the insights of Berger and Kaplan, when technology is considered just another object in the town marketplace, akin to salt or vegetables, the question of cognitive packages does not arise—or rather, the cognitive package is the category “useful object for sale.” You do not have to think about the world that produced a weapon in order to use it, or even to maintain or repair it, any more than you need to understand the relationship between software, hardware, and network in order to use the internet.)

That’s a demodernizing movement, although the modernity it opposes incorporates a large part of its own indigenous history. Anti-modernizing currents, or movements that oppose a specifically Western European modernity, are not at all the same thing, and we could look at a couple of the world’s largest societies if we had a few thousand words to spare and a few hundred hours to do the research beyond the headlines.

China’s directive to schools and universities not to teach “Western ideas” (other than socialism, of course) is the example du jour. The assertion could be spelled out more precisely, and probably has been by someone other than the European and American academicians who insist that since science and civilization in China got along fine independently prior to the intrusion of European and American would-be colonizers, it is perfectly feasible and perhaps desirable to revert to a uniquely Chinese way of organizing technology and the state that makes no obeisance to notions imported from Europe, other than Marxism, of course.

Others have argued that on that view, it is possible and necessary to look at the irruptions and eruptions of intellectual and political forces in Chinese history that were effectively analogous parallels to the opinions being disparaged as alien, and to argue that since almost all societies generate similar displacements of tradition (it’s just that the displacements succeed better in some cultural circumstances than in others), we might as well consider certain parts of the contemporary human condition (e.g. individual rights) as universals, regardless of who was the first to articulate them.

This is where the disputes of present-day anthropology become relevant to the disputes of present-day politics, but it would take another ten thousand words to unpack that problem.

Although the notions of exceptionalism in such places as China and Russia and the United States of America are certainly worth respecting and examining, if only because in many ways each is an exception, all cultures insist upon the rightness and superiority of their own ways of doing things, and treat the barbarians or just the rubes from someplace else with a certain amount of suspicion.

Problems arise when enough rubes from someplace else arrive with the same unexamined reverence for their way of doing things that the local rubes have for theirs, and we are talking about events probably dating back five or ten thousand years here. In good circumstances, invigorating hybridities flourish in newly enriched cultures; in bad circumstances, people start killing one another even without benefit of invading armies to render the job more efficient.

It is difficult enough when the simple facts of intermingling of cultures have reached a point of irreversibility and something is going to change dramatically, so that the point at issue is what sort of change is going to happen. When the culture on whose turf the drama is being enacted is itself going through internal upheavals created by economics, ecology, and/or technology, the difficulties are doubled and quadrupled, or multiplied by whatever factor your capacity for dubious rhetoricizing will allow.

Right now, globalized complex societies throughout the world are suffering from a combination of increasingly irreversible circumstances of this sort. Differently complex societies are in some cases simply collapsing, and generating instabilities that impact almost every other society on earth. (A few isolated societies are affected only by one or two forces, such as that their continued existence constitutes a hindrance to resource extraction.)

The dynamics of economic exploitation, cultural insensitivity, ingrained dislike for other ways of life, kneejerk responses applying universal principles to particular cases, and so on, is a topic that can scarcely be discussed without offending so many passionately held beliefs that no one really wants to undertake the dialogue.

To cite only one example, recent patterns of disease suggest that we need to consider the difficulties caused by the fact that some people really like to kill and eat forest animals even when other food options are available, and that some people like to consume cheeseburgers no matter how many cautionary calorie charts are posted or less artery-clogging food options placed on the menu. But there are also ample numbers of West African and Midwestern American young professionals lining up at the same low-calorie food purveyors in spite of their grandparents’ preference for bush meat or burgers. Some of them even share the same opinions regarding the ecological and epidemiological consequences of socially mediated choices, as well as the consequences for their personal health. And many of them probably hold divergent opinions on ashé (including simply knowing the term) and on the legitimacy of banging away at deer in hunting season in America.

But for those who try to discuss such topics as this outside a work of fiction, the topic of practical consequences will quickly be derailed by inquiries and objections about the problem of intercultural communication and who possesses the right to pass judgment on what sort of behavior.

In the meantime, the tipping points for a host of interconnected problems continue to arrive, and the choices narrow accordingly.

There is a clear need to move to Part Three here, but that is the larger area in which I need to discuss topics on which I have been accumulating books I was going to get around to reading someday, even as the topics themselves can barely be kept up to date in daily news reports on the consequences of the historical forces, without regard to the theories that might allow us to make sense of them. So I may never get round to writing that one, although miracles have been known to happen.

A bibliography of interesting titles, with a frank admission that I have only a fragmentary idea of the overall value of the books’ contents, may be the interim solution.

Part Three, Beta Version

Actually, parts one and two are very much beta versions, with their own share of operating problems. But I hope I have made my point that human survival depends on the concomitant consideration of a variety of rapidly changing situations, despite all the reasons why this sort of consideration is not very likely to happen.

What we do about this is the topic of a Part Four that is even less likely to get written than Part Three, although I have implied in past essays some possible routes to addressing the problem.
joculum: (cupid in the tropics)
“The Universe...Is Stranger Than We Can Suppose.” “—Play it again, Sam”?


Friend Grady Harris and I (I won’t, for once, append the a.k.a.’s for his many online pseudonyms, of which “Grady Harris” sounds like one and on one level actually is) have long traded comments on how it comes to be that all famous quotations are parceled out erroneously on the internet: Wryly ironic observations are said to be by Mark Twain unless they are by Woody Allen, and people who think that Twain must be the originator of everything vaguely ironic sometimes compel Twain to use slang terms from the 1960s. Slightly gnomic but lyrical lines of verse are almost all said to be by Emily Dickinson but occasionally by a handful of others, although the lines usually contain enough internal clues to warn a diligent websearcher that they were not written by any of the poets cited. Observations about science are typically assigned to Albert Einstein; for a significant exception, see the discussion that follows these longwinded general observations.

There are subcategories of generic aphorisms that end up being ascribed to whatever other authors and thinkers most internet users already know, so that on a more arcane level of discourse, the persons most often cited in introductory college courses in anthropology or sociology become associated with authentic quotes (sometimes by their academic opponents) that express opinions they never held. This shades off into the more common phenomenon of supposed quotations of recent remarks by the Dalai Lama or Pope Francis that neither of them ever uttered but that sound sort of right; the “Play it again, Sam” syndrome, one might call it, and probably someone already does.

The game of quotational distortion and erroneous ascription is, of course, much older than the internet, as the “Play it again, Sam” reference indicates; and longtime readers of this blog who also have long memories will recall my patient tracking down of who first reported on an experience with ether in which the user was granted the revelation that the universe is permeated with a strong smell of turpentine, or was it that an odor of petroleum obtains throughout? Actually, it was neither one, not exactly, although very similar expressions occur in the retellings that now include this one.

What happens, often enough, is that writers remember an anecdote or a quotation pertinent to their argument; if they are intellectually responsible, they don’t want to put a loose paraphrase in quotation marks, so if they either trust the reader to be familiar with the quotation in question or omit the name because they themselves are not sure who said it, they will later find the quotation ascribed to them as the originator, and not always in the exact words they used in paraphrasing.

I bring all this up because recently I saw on Facebook one of those familiar overlays of an edifying quotation on a semi-appropriate photograph, in this case one saying, “The universe is not only stranger than we think, it is stranger than we can think. —Werner Heisenberg”. This annoyed me because I was convinced that Heisenberg couldn’t have written that; and that some other well-known physicist had said it; and that the correct quotation was the more elegantly phrased “The universe is not only stranger than we imagine, it is stranger than we can imagine.”

It took a fairly short time to find that the more elegant version is ascribed, wrongly, to Arthur Eddington, who never wrote such a sentence even if he expressed vaguely similar opinions. The preponderant opinion of commenters was that this is a distortion of J. B. S. Haldane’s 1927 aphorism, “The universe is not only queerer than we suppose, it is queerer than we can suppose.” Since no one gives a page reference to Haldane's Possible Worlds, even when they identify it as the source, I must withhold judgment on what Haldane actually wrote, but he does seem to be the locus classicus for this particular thread of morphing aphorisms.

Subsequently I found that the Heisenberg quote often comes attached to a book title, his 1974 Across the Frontiers, but the page reference hasn’t migrated along with the quotation, and I haven’t searched long enough to find the book online. If the entire passage from which the quotation comes has ever been cited by someone, it is buried so deep in the pages of search results that I haven’t had time to locate it.

Citations of the “imagine” instead of “think” version never seemed to get beyond the misascription to “Sir Arthur Eddington,” which bothered me, as I thought I had seen it ascribed otherwise. A more comprehensive websearch found that it is now often ascribed to Richard Feynman. A slightly different set of search terms turned up a mind-numbingly exact citation of Richard Feynman’s The Feynman Lectures on Physics, Volume II, Section 41, page 12, but unfortunately in that passage Feynman is actually quoting J. B. S. Haldane’s original remark, “The universe is not only queerer than we suppose, it is queerer than we can suppose.”

At long last, I discovered that a source purporting to impart “the deepteachings of Merlyn” actually transcribes the passage in which Feynman says “Not only is the universe stranger than we imagine, it is stranger than we can imagine,” and it is in a popularizing lecture on the mysteries of quantum mechanics in which he makes offhanded allusions and, in the transcribed passage at least, doesn’t give oral citation of his sources. The writer cites the URL www.youtube.com/watch?v=0XgmrMZ0h54, but unfortunately this page has been taken down for copyright infringement so the accuracy of the transcription can’t be verified.

I am left with the feeling that Feynman thinks he is quoting somebody else quite accurately, and considers the quotation too well known to require him to cite the name in a lecture in which his rhetoric is on a roll. He imagines a firmly established identity for the author of an aphorism that turns out to be far more fluidly attributed than he supposes.

One webpage of quotes about science wisely ascribes the “imagine” version to that prolific producer, Anonymous. Somebody is the actual author.
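[A toy illustration for the programmatically inclined, and an assumption-laden one: nothing below is anything the quote-collectors actually do, and the choice of Haldane’s version as baseline is mine alone. But the comparison step of this sort of quote-sleuthing can be sketched in a few lines of Python, using only the standard library:

    from difflib import SequenceMatcher

    # The variants discussed above, keyed by their usual (mis)attributions.
    variants = {
        "Haldane, Possible Worlds (1927)":
            "The universe is not only queerer than we suppose, "
            "it is queerer than we can suppose.",
        "misascribed to Eddington":
            "The universe is not only stranger than we imagine, "
            "it is stranger than we can imagine.",
        "Facebook overlay, ascribed to Heisenberg":
            "The universe is not only stranger than we think, "
            "it is stranger than we can think.",
        "transcribed Feynman lecture":
            "Not only is the universe stranger than we imagine, "
            "it is stranger than we can imagine.",
    }

    def similarity(a, b):
        # Ratio in [0, 1] of how much two strings overlap, ignoring case.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    baseline = variants["Haldane, Possible Worlds (1927)"]
    for label, text in variants.items():
        print(f"{similarity(baseline, text):.2f}  {label}")

Every variant scores far closer to Haldane’s wording than two unrelated sentences would, which is exactly why a websearch for any one form keeps dredging up all the others.]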
joculum: (Default)
The Time Between Is Not Quite Over, Part Two


I have written to the point of boredom that one of my favorite cartoon punch lines is Charlie Brown’s “I don’t even understand what it is I don’t understand.”

This is indisputably (a rhetorical device that indicates that the concept under discussion is quite open to dispute) the case with our present historical moment.

We know (sort of) that we are biologically/psychologically inclined to focus on the little details most of the time, to compose the big picture from a few little details misleadingly turned into general principles, and in general to jump to conclusions, just as I am about to jump to conclusions in this essay.

A major difficulty of the present moment is the fatal concatenation of the aforementioned inbuilt mental tendencies. We are confronted with a planet-wide confluence of hyperobjects, systems of nature too large and too independently variable to permit easy focus on more than a fraction of their cumulative effects. We are confronted with hypersystems in global economics that make it easy to direct individuals towards the interpretation of small grievances, while insisting that these are all that there are, that there are no large consequences playing themselves out as a result of the many small economic distortions made possible by structurally similar legislative choices the world over.

We are discovering that human health evolves according to similar principles: Plagues derive from large environmental alterations that turn previously minor diseases into major ones through the creation of new disease vectors. Major breakdowns in individual health derive from the combination of a large number of minor fluctuations in the functioning of systems of the body we have learned to think of as separate from one another. (Paul the Apostle already knew better than that, sociologically as well as materially: cf. I Corinthians, chapter 12.)

One of the major problems in confronting this concatenation of obvious facts (a rhetorical device that indicates that there is indisputably a dispute over whether they are obvious, or even their status as potentially debatable facts) is that the previously available conceptual language with which to discuss them has been sloppily sentimental, philosophically misleading, or simply unintelligible. “Holistic” when applied to any topic is, for many people, only a cuss word signifying “superstition masquerading as science.” “Object oriented ontology” is a phrase that signifies just as much to most people as you would think it does, and a topic that seems to lead to informal discussions of what a glass might experience when it breaks, which is not at all what a revived focus on the relationship between objects and consciousness ought to be discussing. The point is that objects do interact in ways we cannot quite wrap our heads around; consciousness does not quite determine being, but “being” is not what Karl Marx thought it was, so being does not quite determine consciousness, either. We need a dialectic that is neither old-school materialist nor idealist, and old philosophical conundrums are not going to help get us there.

Whether there are neglected aspects of past thought and practice that might prove useful in getting us to where we need to be is a topic that happens to fascinate me, but focusing on possible historical antecedents is only going to give rise to unnecessary and unhelpful arguments. We need to find new ways of expressing our present condition, of overcoming our predilection towards systematic misinterpretations of that condition, and of bestirring ourselves to get off our butts and do something about that condition.

So there. (Discuss.)
joculum: (Default)
Nails and Hammers and Unmixed Metaphors


Some time ago, when I was telling a friend about the neuroscience students who were looking at Bethany Collins’ blackboard-like panel of white-lettered words breaking apart and collecting into piles of letters, which reminded them of how memory and language-formation function, he said, “To the man who only has a hammer, everything looks like a nail,” to which I retorted, “No, no, they got it! That was why I put it in the ‘From Cosmology to Neurology and Back Again’ show!—because it wasn’t just about the political concepts in the words, it was about how the concepts form and fall apart in our minds and our societies!”

Actually, I didn’t say much beyond “No, no, they got it!” but we all know about pensées de l’escalier. (Or “That was what I shoulda said.”)

I do, however, keep having new and ever more horrifying realizations of how we actually do interpret the whole world in terms of the tools we have with which to interpret it; more accurately, we interpret and judge other people’s tools in terms of the ones we know how to use, and we interpret other people’s interpretations in terms of the tools we know how to use.

The computational model of consciousness is a case in point. Anyone who studies humankind’s cultural creations realizes that there is a much more complex set of responses to the environment than pure computation, but how germane to actual consciousness are the complex responses? We are, after all, getting better at creating systems like Siri in which algorithms mimic at least the standard tropes for reacting to reports of others’ emotions and sensations, from pain and hunger to fear and sexual desire. (“I am sorry to hear that. I am sad that there is nothing I can do to improve your situation.”)

So a good many everyday behavioral or pragmatic tools for navigating existence are purely mechanical, a more sophisticated form of “How are you?” “Fine, thanks, how are you?” “Fine, thanks.” or “Thank you.” “You’re welcome.” And as the linguistic philosophers pondered three-quarters of a century ago, it is quite possible for the “Fine” exchange to contain not a syllable of accuracy concerning the respondents’ inner emotional or physical condition, since it exists for other purposes than information.
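[To make the purely mechanical character of such exchanges concrete, here is a minimal sketch in Python, entirely of my own devising (no actual assistant works from so crude a lookup table). The program returns the socially expected response without consulting any inner state, because it has none to consult:

    # Illustrative lookup table of socially expected responses.
    CANNED_REPLIES = {
        "how are you?": "Fine, thanks, how are you?",
        "fine, thanks, how are you?": "Fine, thanks.",
        "thank you.": "You're welcome.",
    }

    def phatic_reply(utterance):
        # Return the expected pleasantry; fall back on generic sympathy.
        default = ("I am sorry to hear that. I am sad that there is "
                   "nothing I can do to improve your situation.")
        return CANNED_REPLIES.get(utterance.strip().lower(), default)

    print(phatic_reply("How are you?"))  # Fine, thanks, how are you?

Its replies are exactly as accurate about its inner condition as ours usually are about our own, which is to say that accuracy is not what the exchange is for.]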

But we end up, vis-à-vis such questions as the computational model of consciousness, in messy issues of what it means to have a body or to be a body and what it would mean not to be a body or to exist as a conscious being without having a body. And people whose particular emotions and mental skills lead them to acquire expertise in one academic field are likely to have a completely different way of putting the problem into words, or of understanding the problem intuitively, from those whose expertise is formed from different skill sets.

In principle, we should all be able to comprehend what it would mean to understand a problem using a different set of acquired skills. But because our comprehension of that question is partially determined by who we are as embodied beings with a personal history, we don’t even understand what it is we don’t understand, as I have quoted so often from The Wisdom of Charlie Brown.
joculum: (Default)
Neither Here Nor There: An Exceptionally Hasty Note on the Concept of Between-ness



I have been unable to visit the exhibition at Atlanta’s Gallery 72 titled “Middle,” but I have read the curator’s statement (or an extract from it in the press release) by Candice Greathouse that states, “The works included in this exhibition serve as a visual dialogue of ideas that investigate notions of this middleness - inbetweenness and potentiality through material and process. The artists and works featured in this exhibition represent the middle through a variety of strategies conceptually and aesthetically. They resist concrete categorization and definition, offering instead a provoking ambiguity that prolongs and redefines the ‘middle’.”

Receipt of this press release happened to coincide with the arrival of my essay “Oscillations and Interstices,” written a year ago for the catalogue of the “Oscillations” show at the Steffen Thomas Museum of Art.

This has got me to thinking about the shifts in the concepts of in-between-ness from Zwischenheit to Zwischenzeit in German (Zwischenzeit just means “the meantime,” as in “in the meantime,” while Zwischenheit means the condition of “in-between-ness”) and what the existentialists of fifty years ago made of the meantime. The meantime became a literal “time between,” Martin Heidegger’s “time of the No-More of the god that has fled and the Not-Yet of the god that is coming.” Less polytheistically, “the time between” implied a moment of fundamental historical change, a transitional moment in which no one could feel at home: “each torpid turn of the world has such disinherited children, to whom no longer what’s been, and not yet what’s coming, belongs.” (Rainer Maria Rilke, Duino Elegies, in the familiar Leishman-Spender translation.)

It feels like we are in the condition of between-ness but no longer in the time between; we have crossed some kind of dividing line, and the no-longer is receding into a rapidly aging history while the not-yet is rapidly becoming the already-here. (“The future is already here, but is unevenly distributed.”--William Gibson)

I have no time (no pun intended) to expand upon these thoughts at the moment as I have to go catch a plane to somewhere else.
joculum: (Default)
Having been reminded, nearly two years later, to look it up, I am delighted to find that I have once again created the only Google-search citation for a quotation, one which I acknowledged at the time was probably an ill-favored misquotation, but mine own.

Item: Samuel Johnson's "Depend upon it, sir, when a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully."

Which I rendered, on January 13, 2013, as "The sure knowledge that one is to be hanged in a fortnight concentrates the mind most wonderfully."

It was pleasant to rediscover that post, and to delete some wondrously unconcentrated spamming comments that didn't add enough surreal content to make them worth retaining.
joculum: (Default)
It’s a kind of [devastated-landscape] ugliness that can be achieved anywhere, I suppose, but it’s most easily found on the borders where cultures clash....

...among the handlers, I had learned not to dismiss anything as meaningless. Mystery, I’d read somewhere, isn’t the absence of meaning but the presence of more meaning than we can comprehend.
---Dennis Covington, Salvation on Sand Mountain: Snake Handling and Redemption in Southern Appalachia

I wonder what he meant by that? ---punch line of joke in which the psychoanalyst is referring to a colleague’s pleasant “Good morning.”

Why haven’t I seen this before? ---Walter Pidgeon as Edward Morbius in Forbidden Planet



I find myself reading Salvation on Sand Mountain after twenty years in which I never quite felt the need to do so, and am finding it unexpectedly resonant.

The remark about the ugliness of borders where cultures clash (in Appalachia, places where mountains are leveled both for strip mining and to provide a space for a new Holiday Inn) suddenly illuminates for me—because different cultural expectations have real material effects in life and landscape—the dubiousness of most attempts to disentangle material and spiritual/psychological factors. (This last remark would be easier to make in German, where the word Geist serves for both “spirit” and “mind.” German, however, has had to go to “spirituelle” rather than “geistig” to translate “I’m spiritual but not religious,” if Google Translate is to be believed. “Geistig” still means both “spiritual” and “mental,” however.)

As I have written so often before, it is pointless to try to derive cultural characteristics solely from economic substructures, as pointless as to try to insist that only the spirit matters, matter doesn’t matter. (As in the intrinsically untranslatable old British joke, “What is matter? Never mind. What is mind? No matter.”)

As Election Day nears in a couple of places, and has just passed in a few others, I find myself thinking about the persistence of cultural preferences in the midst of changing economic circumstances, and how cynics can play upon regional psychologies to attain their own ends, ends which may be either economically or culturally based. Leaders, too, are prepared to sacrifice their own best material interests for the sake of the ideals that stir their souls most deeply.

The best outcome would be one in which material and spiritual goals were not muddled up together, or mistaken one for the other.

As Captain Obvious said once, I believe.

The problem, and I have written more than a few thousand muddled words about this, is how to extract meaning from the muddle.

I wish I could remember which character in fiction said, “I love mystery, but I hate muddles.” It seems like it came out of a Charles Williams novel, but it is such a British-English thing to say that it could equally well be something as embarrassing as Agatha Christie or as differently embarrassing as Robertson Davies.

None of the above...a very modest amount of websearching attributes the quotation to Mrs. Moore in A Passage to India, which is rather appropriate to my original topic of borders where cultures clash. Adela: “I dislike [mysteries] not because I’m English, but from my own personal point of view.” Mrs. Moore: “I like mysteries, but I rather dislike muddles.” Fielding: “A mystery is a muddle. ... A mystery is only a high-sounding term for a muddle.”

“Boum,” said the Marabar Caves.
joculum: (Default)
I have written a few essays that seem so irrelevant to world history of the past six months that I have not posted them anywhere. The day may come when they seem relevant to something.

Or as a very young Bob Dylan said as an introduction to one of his compositions, "It must be good for somebody, this here song. If not for me, it's good for somebody."
joculum: (asleep)
Karl Kraus withheld publication of Die Dritte Walpurgisnacht for fear of recrimination against friends, and famously wrote only that "Gewalt kein Objekt der Polemik, Irrsinn kein Gegenstand der Satire sei" (roughly: violence is no object of polemic, madness no subject of satire). (Those who distrust my parsing could paste the German into a reputable translation program, as I did to make sure I was understanding it rightly.)

The times are not propitious for writing about many of my favorite topics, and have not been for some time now. So I have written a number of posts and then chosen not to post them.

But I must remark that I feel at the moment as though I have fallen into a condensed version of John Crowley's Ægypt cycle: we have learned thanks to current events that the Gnostics are beleaguered both as supporters of repression and victims of it. The secretive Cult of the Peacock Angel is a topic of daily newspaper headlines, except that many of its practitioners have cellphones and wear T-shirts bearing contemporary slogans, making one speculate whether secularity has eroded religious passion among its practitioners as it has in so many other religions. The sense of social instability that accompanies this, when reinforced by economic instability, makes a revived fundamentalism seem plausible to multitudes. (And this insight also is found in Crowley's four Ægypt novels, exercises in fantastic realism with an emphasis on the "realism.")

I would say, as Kraus did, that all this brings nothing else to mind, but it would not be true, as he knew it was not for him when he made the statement.
joculum: (Default)
More Mildly Entertaining (Or More Likely Not) Notes on the Human Imagination


Re-reading various theoreticians who quarrel fatally with one another, I wish more than ever that we had a more comprehensive model of how the cultures into which we are born shape our psychological preferences. A concurrent perusal of my Facebook news feed, Wes Anderson’s new movie The Grand Budapest Hotel, my cousin Mary Stricker’s blog about fantasy, Grimmella, and Patrick Leigh Fermor’s reconstruction of his youthful self’s peregrinations round the Black Sea in The Broken Road reveals parallel narrative and emotional structures that are modified by the time frame and gender-and-class conditions in which the parallel structures are being constructed by the human imagination. In other words, very different kinds of folks fall again and again for the same sorts of things, only different. And it matters very much whether we look more at the sameness or the difference, when we ought to be looking at both together, and looking both at what people love and what they hate.

We have a huge amount of academic fustian and intellectual obscuration taking place in unreadable journals, all discussing phenomena of the human condition that the academicians are examining in too small a sample, in too limited a geographical and historical circumstance, looking at too few variables.

Likewise, people in general like what they like, and know a great deal about the stuff they like, and do not think very much about why they like what they like, and why, under different circumstances, they happen to like something else.

I could try to struggle on for a few thousand more words on this topic, but you would stop reading after about one more paragraph, anyway.

This re-realization (it’s another one of those topics I rediscover about twice a week with the same sense of surprise each time) makes me feel like finally taking the time to work all the way through D. Fox Harrell’s Phantasmal Media, a new book to which I refer with monotonous regularity, simply because Harrell is trying to synthesize a good many theoretical approaches. It is possible to extract a number of different lessons from Harrell’s narrative even though his primary interest is how to make relationships of power and ethnic identity visible through digital media—or how to create politically and socially efficacious video games. Since he focuses on “how to understand and create evocative story worlds, poetic metaphors, critical conceptual blends, haunting visions, and empowering cultural experiences using the computer,” we can extrapolate beyond the “using the computer” part and look more broadly at that piquant juxtaposition of “haunting visions,” “empowering cultural experiences,” and so forth. Most of the people writing about one or two of those topics wouldn’t know the other topics if they performed the usual American-slang cliché on their posterior regions. (For my non-U.S. readership, that’s “if they came up and bit ‘em in the ass.”)
joculum: (cupid in the tropics)
I have found another convert for the fiction of John Crowley—Joe Elias Tsambiras (http://www.kailinart.com/2014/02/the-art-of-joe-elias-tsambiras/), whose artwork is Crowleyan without knowing it, but using a different visual vocabulary from the art John Crowley most admires (which means Mr. Crowley may not approve of it). Although I recommended Little, Big and the Ægypt cycle without further citation or explanation, it was the 2009 interview in The Believer (http://www.believermag.com/issues/200905/?read=interview_crowley) that made a believer out of him, perhaps because so much that Mr. Crowley says in that interview would also apply to the art of Joe Elias Tsambiras.

Unfortunately Tsambiras is acquiring particular editions according to the cover art he finds most relevant and appealing to him, so I can’t guarantee the purchase of new copies, which of course matters a great deal to a living writer.
joculum: (cupid in the tropics)
I am stunned to find that the annual conference of the International Association for the Fantastic in the Arts has a roster of academic papers and participating academicians that puts the American Academy of Religion to shame in terms of quantity: and this is only on one interdisciplinary topic, even if it is explored from many different perspectives.

http://iafa.highpoint.edu/wp-content/uploads/2011/08/35th-Annual-ICFA-Program-Draft-v2.pdf
joculum: (cupid in the tropics)
Fans of Ronald Hutton Should Be Patient. This Will Eventually Turn into a Review Essay About His New Book Pagan Britain

It is extremely unlikely that anyone encountering this on their LiveJournal feed will feel like stopping everything and reading eighteen hundred words about methodology and degrees of certainty in topics beginning with meteorology and cosmology and ending up with visionary folk art, while spending a great deal of time en route dealing with Ronald Hutton’s book on prehistoric religions and their possible survivals in Britain. I recommend clicking on the LJ-cut mark and then downloading or copying the whole thing for later perusal, if you find it of the slightest interest.

John Crowley might be particularly interested in a quotation from Hutton that I have situated at the very end of this much too far ranging essay. —Jerry Cullum


joculum: (cupid in the tropics)
hyperobjects and hyperobjectives: notes towards an exhibition I do not actually plan to curate, but wish I did


I have the dubious distinction of being an increasingly elderly white male who still owns a considerably battered copy of the Marshall McLuhan issue of Aspen, “the magazine that comes in a box” that was the forerunner of numerous deconstructed pieces of print media (mostly artists’ books, though a 1968 issue of a college literary magazine I edited and an entire 1986 “bagazine” architecture issue of Art Papers took their cues from Aspen’s example). I assume Aspen took its inspiration from Fluxus’s intermingling of the highly aesthetic concept and the commonplace object, since Fluxus itself was the topic of one of the later issues. (All the contents of the seven years of the magazine can be viewed via UbuWeb, but of course the point was to handle this incredible variety of objects and textures, and that sensory experience can’t be communicated online—yet.)

I bought said volume with considerable excitement because it was clearly attempting to extend McLuhan’s insights regarding the impact of media by defeating our expectations of what a magazine ought to be while forcing us to think, via what a subsequent generation would call the deconstructed print medium, about the new electronic media. In so doing, it questioned the limits and the legitimacy of both.

Or so we thought, anyway, even if we didn’t quite understand what McLuhan was telling us about the dialectic in question.

The reason we didn’t quite understand was that a great deal of what McLuhan was saying was nonsense. And this is a problem I have encountered again and again over my lifetime: the writers who perceive the full dimensions of a previously unconsidered question almost always articulate their perceptions unintelligibly, with explications of the topic that are frequently just plain wrong when they can be deciphered at all. But the fundamental perceptions behind the wrongly conceived articulations are completely valid.

This generalized insight could be pursued in so many different directions that for once I do not wish to attempt to analyze all of them in a single blog post. However, I do want to present references to two or three ideas that I may never get around to pursuing beyond these preliminary notes. (My blog seems to be turning into a series of prefaces to projects that are never begun, something that in itself has a distinguished history, Walter Benjamin’s Arcades Project being a classic example of the larger category of books for which there exists an immense amount of preparatory material but no actual product beyond a few preliminary fragments.)

Anyone who wishes to pursue this topic on the other side of an LJ-cut is welcome to click here.
joculum: (cupid in the tropics)
“our hearts are restless...and we don’t know why,” part two


I am afraid that, based on my earlier musings on why we should ever have become lovers of impossible wonders, I am going to inflict some reflections on the imaginal and the imaginary on my readers without benefit of citations of most of the theoretical volumes that have probably influenced my ideas.

Lonnie Holley’s performance “Six Space Shuttles and 144,000 Elephants” is an excellent test case, because a good many people will react to it with dismissive irritation simply because for them, the whole premise of six space shuttles and 144,000 elephants celebrating Queen Elizabeth’s birthday is arrant nonsense. Those of us who delight in it may find pleasure in its incongruity plus its intuitive formalism: after all, space shuttles and elephants are logical opposites when it comes to soaring versus being difficult to get off the ground, but both possess the quality of being large and attention-getting, hence obscurely appropriate for a celebration of royal power (think ancient Roman processions and contemporary Air Force flyovers). The off-the-wall numerology borrowed from the Book of Revelation signifying the population of redeemed souls further reinforces the notion that something very important and dignified is being symbolized by a juxtaposition that, nevertheless, is incongruous enough to be hugely amusing. Whether this leads to serious reflections on what constitutes incongruity and why we find it funny—that depends on who we are.

The amusement or disgust at violations of the rules of “things that go together” and “how things ought to be done” is functional enough in terms of maintaining social order, but it does raise the question of why “we find him, as far back as we can trace, making this thing other” (misquotation of David Jones somewhere in The Anathemata, I think, and in Norman O. Brown’s Love’s Body). We can understand why our remotest ancestors buried objects with their dead (whether they did it because they expected the dead to use them in the afterlife or because objects once used by the newly dead made the living nervous). The motive for metaphor is a little more obscure: the part for the whole, or the abstract image that apparently conveyed as much meaning as the exquisitely rendered animal or cartoon shaman(?) in the cave painting.

It’s curious that Aby Warburg should have intuited the relevance of the question when he used his researches among the Hopi to ponder the implications of Renaissance iconography and image-making in general—a career choice that horrified both his relatives who still practiced Judaism and his assimilationist family who had found a language-centered Protestantism an easy enough leap from a militantly aniconic Jewish tradition. (I started out with the Warburg Institute folks when I was twenty-two, but this latest meditation is based on Michael P. Steinberg’s “Aby Warburg and the Secularization of the Image” in the 2013 survey volume Weimar Thought: A Contested Legacy, eds. Peter E. Gordon and John P. McCormick.)

I can see how the capacity to pursue multiple lines of thought that turn out to be useless dead ends would actually be evolutionarily useful—the solution that no one had yet thought of comes out of all that maundering and idea-mongering, more so than from simple trial and error. But how we as a species got so fundamentally wedded to dysfunctional pursuits, behaviors that make individuals and sometimes whole societies less likely to survive—well, that was a question being discussed in anthropology half a century ago, as belief in the dogmas of functionalism waned. That a dwindling number of economists and far too many evolutionary psychologists still operate as though functionalism were the default position is nothing short of dismaying. There are enough human beings who actually do operate according to what seems to them the most immediately advantageous course of action, and who have no use for any aspect of their own culture that doesn’t offer immediate sensory rather than intellectual gratification, to make us wonder why the more functionless imaginative options have survived, and why the excessive social rules that such folks treat with cynical disdain ever became so excessively codified in the first place. As in the once-contemporary idiom that comes to mind at the notion of six space shuttles and 144,000 elephants celebrating the Queen’s birthday: “that’s just wrong.”
joculum: (cupid in the tropics)
“Inquietum est cor nostrum...nescimusque cur” (“our heart is restless...and we don’t know why”), Augustine did not write.

Why are we lovers of wonders?

Lovers of unheard-of luxuries, yes. The imaginations of the powerful in all ages have led to the creation of objects that fulfill the most extravagant fantasies, all of which fantasies are no more than refined versions of the Land of Cockaigne, where food and drink and comfort arrive effortlessly (the Big Rock Candy Mountain being a more recent American version of this). The sufficiently well-to-do in more imaginative times have created earthly paradises far beyond ordinary wish-fulfillment and paid scholars to make them immortal within their alternative worlds. (Today, luxury objects and luxury environments are merely fancier versions of the common folk’s gewgaws and getaway pleasures, and the quest for immortality involves flatfooted genetic manipulation, but all that is another story.)

We need not agree with Augustine or C. S. Lewis about the God-shaped blank or the notion that there must be a fulfillment for the wish for an absent Paradise just as there is a fulfillment for the wish for sex or for food. (Strange that a man who wrote about allegorical fantasies and composed them himself did not think, as he constructed the argument-from-Sehnsucht, that his argument was identical to “we have a desire to see unicorns, and....” But for him, Aslan and Narnia were symbolic parallels to the shape of the real world that was there but that could not be seen or spoken of openly.)

Evolutionary psychologists seem curiously oblivious to this dysfunctionality of the human condition. (D. Fox Harrell’s new book Phantasmal Media happily confronts the computational model of consciousness with the imaginal model—although his focus is the potential for social manipulation and social liberation through such images. But the larger point is that the human mind is doing more than operating probability-calculating wetware.)

For survival, we must visualize more or less adequately as well as compute probabilities, but even so, it seems highly unlikely that ancestors inclined to sit down and fantasize about all the wonderful things that could or ought to be on the other side of that grove of trees, rather than finding out one way or another and/or figuring out how to make the desired result happen, would have passed along their genes in sufficient quantity to make fantasy and the lust for wonders into such a major human capacity.

We may be a storytelling species, but why aren’t our stories more consistently humdrum? There certainly are enough people for whom the humdrum is sufficient to make us curious as to why it isn’t a universal trait.

Is it just that once the capacity for imaginative solutions has been inherited as a genetic trait, there is no stopping it at functional limits? Imagining the impossible, and enjoying imagining the impossible, would have their own desirable outcomes.

But why should we enjoy imagining the completely impossible in the first place? As distinct from imagining the not-yet-possible, or the world that might very well exist out there beyond our immediate perception...but here we are getting into the problem of whether all fantastic stories begin as tales of belief or, alternatively, as conscious lies. Our capacity to state the counterfactual even as we know that it is counterfactual obviously comes into play in the initial creation of fantasies...adults’ fantasies and children’s fantasies both. But why the capacity for the counterfactual hypothesis should stretch from childhood to adult tale-telling...this takes us into realms of psychology that presumably have been researched to the point of boredom, but who has written the definitive study of how narrative is birthed from childhood’s early imagination? The famous studies of children from sixty years ago were so grounded in local cultures and social classes as to make their conclusions hopelessly suspect, though we have a good many collections of stories from around the world that offer evidence with which to supplement them.

Obviously there is something evolutionarily desirable in the extension of the childhood fluidity between the real and the unreal...but if the ability to fantasize leads to ritual and social order, it also leads in more or less equal measure to pointless pathology and to productive (even when functionless...) art. I presume that this dual outcome is a structural constant of the capacity to imagine and the influence exerted upon it by early childhood experience. (This allows guardians of social order to equate functionless art with personal pathology, as we know very well from the past hundred years or so of history.)

I am plodding painfully through what is very familiar territory for various academic disciplines, because I now have trouble making the academic disciplines line up satisfactorily. Every one of them picks up at a different point in the human story, and I cannot quite discover how some imaginative faculties can remain dormant for so long until refined by circumstance (this is nothing mystical...I mean such things as the ability to see, to understand preverbally, what is going on in a painting, for example, something of which I was incapable even after years of graduate study in verbal disciplines...something of which some would accuse me of still being incapable, but my life as an art critic depended on having developed a certain amount of capability in this department).
joculum: (magi from Ravenna mosaic)
Fresh from a triumphal tour of music venues all over Europe (smaller ones, but with extremely enthusiastic audiences), the African-American self-taught artist Lonnie Holley is the subject of an extended profile in the New York Times Magazine: http://www.nytimes.com/2014/01/26/magazine/lonnie-holley-the-insiders-outsider.html?hpw&rref=magazine&_r=0

Holley has been a legend in the folk or vernacular or outsider art world (whatever you prefer to call it) at least since his spectacular site-specific installation in the "Souls Grown Deep" exhibition presented in Atlanta during the 1996 Olympics. Now his recordings from Dust-to-Digital (which is an enterprise that deserves a post in and of itself) have been named by more than one music critic as among the best releases of recent years.

Having written about Holley for many years, I am gratified at this breakthrough in terms of international attention to his unique oeuvre.
joculum: (magi from Ravenna mosaic)
As I mentioned, I'm looking forward to tackling D. Fox Harrell's Phantasmal Media for its perspectives on how things can or cannot be changed for the better, and by whom and under what circumstances. (The relationship to such movements as Afrofuturism is a side topic of the larger issue, which I state as baldly and stupidly as possible.) Anthony Dunne and Fiona Raby's Speculative Everything: Design, Fiction, and Social Dreaming approaches the same problem from the direction of design rather than digital media, and with less awareness of the perspectives of specific ethnic communities; but part of the point is that those perspectives are altering with increasing speed in advanced digital society, anyway. Part of the problem is that those perspectives are being reinforced in other quarters by the dislocations created by advanced digital society, as it is presently constituted. (Joe Nocera's summation, in his January 7 New York Times column, of Jaron Lanier's Who Owns the Future? cites Lanier's point-blank observation that corporations have engaged in technological activities that result in the impoverishment of their own customer base—never mind whether the bottom-rung employees still left are being paid fairly for their services, which is a separate issue from whether there will be enough bottom-rung jobs to keep the potential employment pool from scrambling in desperation to have them. Whether Lanier's core idea makes sense—that of renewing the middle class by paying people for clicks on such things as essays like this one—is a side issue, however important. And discussing how corporations maintain their own sustainability in the face of dwindling demand by investing capital in financial transactions and decreasing employment is most decidedly a side issue en route to where I am going.)

Dunne & Raby may be hopeless utopians in their insistence that keeping imaginative alternatives in play is itself an activity that makes possible a different future. That leads back into some of the questions Harrell is dealing with in Phantasmal Media, and a good many other questions.

What brought me up short, early in the book, was their offhand discussion of defining possibility by showing how little is genuinely impossible, rather than merely improbable. Citing a book (and previous TV series) by Michio Kaku, they state that scientifically, there are only two genuine impossibilities: perpetual motion and precognition. Either one would require a complete reformulation of our present knowledge, whereas some of the most impossible-seeming of other eventualities would not.

Dunne & Raby may well have misconstrued an observation meant to grab the attention of a mass audience, but the claim sheds a different light on any number of topics I have written about previously, from the chequered legacy of Ernst Bloch's Das Prinzip Hoffnung to Jeffrey Kripal's attempts to revalorize the only presumptively impossible.
joculum: (magi from Ravenna mosaic)
I belong to a generation that cannot read or hear “Dinka and Nuer” without thinking “Evans-Pritchard,” probably because the only core-course lectures on anthropology we heard had to do with Malinowski and Evans-Pritchard (lectures that led some of us to be delighted when we encountered the line from the Fugs’ song “Nothing,” “social anthropology, a heckuva lot of nothing”).

Having long since left behind Evans-Pritchard and his ilk (though his student Mary Douglas’ Natural Symbols was one of those books we swore by rather than at in my youth), I was shocked to learn from the Wikipedia entry that Evans-Pritchard’s youthful fieldwork among the Nuer and the Dinka had begun as recently as 1930. The difference is less than two decades, but I had vaguely placed him with the generation of Malinowski, legendarily stranded in New Guinea as an obviously harmless enemy alien, unable to return to England but allowed by the Australian colonial authorities to potter about with the Trobriand natives. (Incidentally, how many great moments of modernity depended on would-be humdrum intellectual careers being blocked by war and shunted off in different, more consequential directions? I can think of several, but that would be a monumental digression.)

Instead, Evans-Pritchard belongs to that generation of the colonial ’30s that then had intriguing adventures with folks whose descendants also show up in more recent history (he was an administrator in British-occupied Cyrenaica, where he wrote about the Sanusi resistance to Italian colonization, and before that, he had been facilitating guerrilla activity with the Anuak people of South Sudan against the Italian occupation forces in Ethiopia).

“They do say that all things are connected,” goes the line in a traditional teaching story, and although many of the connections are ridiculously inconsequential, some are not.

I cringe at the thought that the Guinea worm eradication program is being put in jeopardy by the mass migration of refugees in Mali and South Sudan, just at the point when eradication seems possible. A few freshly contaminated bodies of water in the adjoining countries, and the disease is off and (almost literally) running again.
joculum: (magi from Ravenna mosaic)
For the record, I remembered several pertinent details of the NPR story wrongly. (Q.V.) This is how variant versions of tales make their way around the globe with alarming speed, and have to be batted down on Wikipedia with great regularity. (It may say something about my search interests that almost every Wikipedia entry I pull up begins with "This article has issues.")