joculum: (asleep)
The Paul Nash exhibition just concluded at Tate Britain is a visual summation of some of the great traumas and struggles of the twentieth century: the fundamental disruption created by the First World War, obviously, followed by the exploration of what Gerard Manley Hopkins had already called the “cliffs of fall” in the human psyche that made possible not only the deluded self-destruction of 1914-18 but the even more delusional destructiveness that led to the same thing, only worse, in 1939-45.

Nash turned all this into landscape considered as psychic projection, from his paintings of eerily devastated forests immediately following the First World War to his dead ocean of wrecked warplanes in Totes Meer. His photographs, which initially he thought of as visual notes for details of paintings, today seem like some of his most profound work: the Monster Field series of distorted-looking fallen trees, when juxtaposed with his studies of uncanny aspects of the built environment, suggests that human beings continuously find themselves in an interpreted world in which they are not very securely at home (to paraphrase Rilke’s line from the Duino Elegies).

The insecurity of interpretation is something that Friedrich Nietzsche made much of in the nineteenth century, but so did the Society for Psychical Research. I’ve been paging through Alex Owen’s The Place of Enchantment: British Occultism and the Culture of the Modern, years after I first acquired it, and am reminded that while theosophists were off making visionary astral journeys to distant planets, F. W. H. Myers and the Society he co-founded were engaged in anything but what today’s skeptics like to call “woo”—they simply assumed that there was a reality at the margins of normal consciousness that we do not understand, and took it as their responsibility to approach that marginal experience with all the empirical rigor they could muster. (Whether they could muster enough is in dispute, as noted below.) Accordingly, they undertook, among many other investigations, a Census of Hallucinations. Myers made assumptions about the structure of reality that semiotic philosopher C. S. Peirce, for one, regarded as based on embarrassingly insufficient evidence, but Myers was using what he regarded as empirically available data to construct his models of the several possible selves and the invisible interconnection of selves with other selves. He was not depending, like Madame Blavatsky, on inwardly perceived revelations from Tibetan mahatmas, however dubiously metaphysical his own conclusions may seem to us today.

The publication in 1899 of Sigmund Freud’s The Interpretation of Dreams was the inaugural salvo in an assault on what Freud called “the black mud of occultism,” an assault in which, nevertheless, Freud refused to exclude the possibility of telepathy, and indeed worked it into his theories about the nature of the Primal Horde at the beginnings of civilization. In subsequent decades, while Freudians battled it out with behaviorists about the nature of the body in which consciousness might or might not be an accidental epiphenomenon, Myers’ speculations based on collections of data ran up against a singular difficulty: that the marginal experiences under investigation proved as difficult to replicate under laboratory conditions as, say, moments of innovation in the writing of poetry (not to mention ballet, mountain climbing, or gymnastics). Moments of creative insight, of course, can be considered heightened examples of thoroughly mundane processes; the thesis of Myers’ successors was that the so-called paranormal is likewise a heightened instance of mundane lesser processes, processes to which we do not pay attention because we believe that they cannot exist.

Myers felt obligated to create hypotheses to explain “genius” as much as to explain telepathy and precognition, a reminder that the 1890s represented the efflorescence of many strands of thought that the intellectual mainstream of the twentieth century would dismiss as intrinsically unworthy of serious consideration. Most of the decade’s cultural manifestations were irrational in and of themselves; the efforts to analyze and explain phenomena perhaps wrongly associated with irrationality tend to be forgotten.

Written towards the end of his life, Oliver Sacks’ recent book on hallucinations cites a few cases that go beyond his glancing reminder that we are prone to impose meaning where there is none. Sacks doesn’t censor these cases; neither does he offer hypotheses, in contradistinction to his accounts of neurological disorders in which the categories that our perceptions and our language construct become hopelessly muddled.

Owen’s book on occultism and modernism also examines Aleister Crowley’s empiricist approach to appropriating magical practice, another case of a skeptical intellect determined to discern the factual basis underlying imaginatively constructed ancient interpretations, and, if possible, to make use of their practices. Crowley’s theatricality makes it difficult to decide just how much he shared with the S.P.R. researchers the knowledge that the consciousness of the practitioner creates a scrim of illusion that makes perception of the reality being encountered particularly difficult.

The twentieth century became so aware of the scrim of illusion that by the twenty-first century the predominant assumption was that the notion that there were any invisible connections among individual selves was as absurd as the assumptions underlying magical practice, not least because there was no self and no perceiving embodied consciousness that could not be reduced to uploadable algorithms. For those who took note of the embodied nature of consciousness, making it not reducible to mathematically based algorithms, the very nature of embodiment meant that there could be no invisible communication between bodies other than the physically limited effect of ordinarily imperceptible materials such as pheromones.

Those inclined to believe otherwise have to accept the fact that they are in the position of a Galileo asserting that “eppur si muove” without having something parallel to Galileo’s theory to explain why it moves. Their only comfort is that Joseph Campbell’s assertion seventy years ago that “[humanity itself] is now the crucial mystery” has proven more true than he ever believed, as nearly all theories have been called sufficiently into question that the only way to construct a comprehensive theory of human existence is to truncate large parts of it by declaring them out of bounds. It would be possible, for example, to explicate the roots of large parts of Paul Nash’s oeuvre in terms of the origins and cultural shaping of the emotions projected into his visual structures, but the transhumanist movement would ask why anyone would want to waste their time on something like that, and quite a few other bodies of theory can present reasons why this is not a line of investigation worth pursuing, or even a valid line of investigation at all.
joculum: (Default)
I’m aware that my most recent Facebook posts have been making elementary observations that would elicit pitying smiles among my few academic friends, so possessed are these observations of what academicians regard as “a firm grasp on the obvious.” But the obvious is exactly what nobody seems to have much of a grasp on in today’s America.

One problem, of course, is that my “obvious” is not your “obvious,” and vice versa. I just realized that 2017 is not only the fiftieth anniversary of the album The Velvet Underground and Nico, it is the fiftieth anniversary of George Steiner’s Language and Silence. The two had a more or less equal impact on me at age twenty-one; the one showing that it was possible to create astonishing songs about heroin and Sacher-Masoch’s Venus in Furs, the other introducing me to the work of Claude Lévi-Strauss, Marshall McLuhan, Ernst Bloch, and Georg Lukács, all of which was pretty heady stuff for a boy from a little town in Florida who was also absorbing that year’s release of the Beatles’ Sgt. Pepper’s Lonely Hearts Club Band and Jefferson Airplane’s Surrealistic Pillow, from which “White Rabbit” became an anthem of the Haight-Ashbury’s Summer of Love.

All this comes to mind because yesterday I read the 115 pages of George Steiner’s A Long Saturday: Conversations (with Laure Adler, translated from the French by Teresa Lavender Fagan). In these conversations, Steiner evinces a familiarity with Alain Resnais’ classic films and regrets that he never engaged in systematic study of cinema alongside his magisterial grasp of European philosophy and literature. He also expresses regret that he has never been able to get his head or his emotions around popular music from rock to hip-hop and beyond. (And yet he was one of the first to grasp the implications of the shift in consciousness that accompanied the ubiquity of electronic media that would eventually become the digital revolution, and pretty much had all that figured out by age forty. The fact that he was only five years older than Leonard Cohen left me feeling that I was going to have to run as fast as possible if a kid from Small Town South were to catch up with modern and contemporary culture before age forty. Leonard Cohen’s first album also appeared in 1967.)

Although I didn’t know it at the time, 1967 was also the fiftieth anniversary of the publication of Paul Valéry’s epoch-making poem “La Jeune Parque.” I just read that poem in translation for very nearly the first time a few months ago, and was flabbergasted to discover that although it tackles the big issues of love and death, it is also a poem about a young woman trying to figure out how to avoid becoming just a decorative status symbol for a husband. Given that he spent four years writing it while the First World War was going on, it makes me want to go out and read a biography of Valéry, who was even more of a polymath than Steiner.

I dwell on this because the translation in the new book of conversations calls the poem “The Young Park,” and while “Parque” looks like “parc,” the French word for “park,” here it refers to one of the Parcae, the Three Fates. The poem is always cited by its original French title, because “The Young Parca” sounds silly and “The Young Fate” isn’t much better.

I know translators don’t get paid much for their largely unappreciated job (Steiner published an entire book about translation, After Babel, so he would sympathize). But I feel like someone translating a book so steeped in European culture should know a poem that is considered one of the two or three greatest French poems of the twentieth century.

Of course, nobody can know everything, although Steiner has done a good job of giving the appearance that he does. One of the refreshing things about these conversations at almost the age of ninety is the extent to which Steiner admits to the things he doesn’t know. Some of us felt back in the day that George Steiner and Susan Sontag were all we needed, because everything Steiner didn’t know, Sontag did, and vice versa—but both of them gave the impression that of course they knew more than they were writing about, they just couldn’t be bothered to address the topics.

In practice, we needed a number of writers about popular culture as well, because popular culture was already as arcane as, say, the visuals of Beyoncé’s Lemonade, to understand which one has to know both Pipilotti Rist and the orisha Oshun, along with a good deal of African-American and feminist thought and imagery.
joculum: (Default)
For reasons too complicated to bother to explain, I have had to import my LiveJournal blog entries into my Dreamwidth account, thus creating something of a hash.

The advantage is that everything worth saving is now in one secure location. The disadvantage is that a great deal of incidental dross has come with it.

Apologies to anyone who has been following the few Dreamwidth entries, which were originally supposed to provide a clearer path through the labyrinth. I no longer care much whether I find my own way out of the labyrinth and thus cannot claim to be providing much of a way out for anyone else, either. There are ways out, but I cannot follow most of them or teach others how to follow them.
joculum: (an ordinary evening in new haven?)
Am feeling melancholy about the various LiveJournals likely to disappear in the not-so-distant future, some of which (not just mine) were designed to possess a degree of permanence. I should think the desirable outcome would be a version of public posts in which all the irrelevant or no longer relevant posts would be reset to private and the remainder produced as a downloadable e-book and/or print version. For now, I have simply moved them to Dreamwidth for later editing.

I have concluded that I am not going to produce the definitive explanation of our historical moment (or even my confused version of same) and have now restricted myself to topical posts on counterforces.blogspot.com and joculum.dreamwidth.org, and was thinking of writing some notations about such topics as Saint-John Perse's poems fifty years later, and what is and is not relevant in what has lately been termed The Age of Earthquakes (regarding which, see my review essay on Counterforces), but these proposed LiveJournal posts have now been consigned to the realm of might-have-beens.
joculum: (mughal virgin and child)
I let my commentary migrate away from LiveJournal (and, for that matter, languish on Dreamwidth, as I note in the edited version of this 2015 post) because the conversation I started with myself seems to have been evolving in parallel strands in the worlds of academia, and since none of the other scholars clawing their way up Mount Analogue ever read this journal (and if they did, I invite them to e-mail me to that effect or private-message me on some other social media), it is difficult to continue to bore my handful of friends with my usual topics. The friends-only summary I offered of my essay for a European university press met, as I expected, with zero response. (It doesn't help, of course, that it was so tedious in its condensed version that I myself couldn't get through it.)

I find essay-length posts on my friends page that cry out to be collected into an eminently publishable book; but as Basar, Coupland, and Obrist write in The Age of Earthquakes, their update for the digital age of McLuhan and Fiore's The Medium Is the Massage, "Your blog is now one of seven billion blogs." Which is a witticism, since it only seems like every human being now alive on earth has a blog. In fact, some of them only use Snapchat.

I have been somewhat more active on counterforces.blogspot.com, which people tell me they actually follow (and thank me for).
joculum: (Default)
T. M. Luhrmann has an extraordinary capacity for redefining issues about which folks have been muddled for generations, if not centuries. “Faith vs. Facts,” in the April 19, 2015 New York Times (see http://www.nytimes.com/2015/04/19/opinion/sunday/t-m-luhrmann-faith-vs-facts.html?ref=international&_r=0) clarifies fifty years of muddle by redefining religious behavior versus religious content; once we suspect that holding beliefs religiously is not the same thing as believing in a divinity of any description, then of course Communism and atheism and vegetarianism and environmentalism and free-market economics are religions. Ordinary language already knew this; we do things “religiously.”

So if in fact “sacred values are immune to the normal cost-benefit trade-offs that govern other dimensions of our lives,” and “sacred values may even have different neural signatures in the brain,” as Scott Atran and his colleagues* would argue, then the mysteries of mysticism are suddenly clarified: why mystics are so often at odds with their religions, to the point of having no interest in dogmatic tenets and traditional practice alike. They are taking a completely different approach to relating to the forces that govern their lives (never mind what those forces really are) because they are not religious. They often aren’t “spiritual,” either; they frequently take something like what I have been calling a (w)hol(l)y agnostic stance, operating empirically and rationally within their definitions of reason. (“Mysticism” being one of those cluttered categories in which we park everything that doesn’t fit into some other category, rather like “genre fiction,” there are of course mystics who are totally emotional and anti-verbal as well as ones who are so concerned with the limits of language that they might as well be Ludwig Wittgenstein, who seems to have understood in the Tractatus what mysticism really is.)

But this means that we need to discard the concept of mysticism even as we apply the term “religion” to any set of beliefs in which total identification with particular god-terms (see, we’ve had the concepts for two generations, we just haven’t known how literally correct they were) is more important than fidelity to demonstrable fact—a position that is easily defensible, since “fact” is so often mediated by social position and psychological predisposition. We’ve known for a very long time that being a rationalist is frequently the opposite of being rational, but now we have better grounds on which to argue that. As with so many false binaries, the opposite does not apply; being an irrationalist is not the opposite of being irrational, although there have been ample numbers of writers who have devoted considerable rational analysis to demonstrating why abstract reason just doesn’t cut it when it comes down to cases.

Getting back to those folks who don’t have much truck with god-terms and a great deal of truck with what some people call God but those particular folks often don’t—we have no terms with which to distinguish belief-inclined agnostics who both think and believe that there is something out there but we don’t quite know what (one species of mystic) from emotion-based believers who, however, don’t have much use for religion as opposed to direct experience. As good ol’ Ludwig would say, “Don’t say they must have something in common or they would not both be called ‘mystics.’ Don’t think, but look!” How do they behave, and why do they behave that way? What is their relationship to that imperceptible thing we like to call reality? (College dorm exchanges of conceptual nonsense really do have a grasp on fundamental problems; the more we look at everyday reality, the less real it appears to be. We survive because we don’t ask ourselves how we navigate through this mess of physical circumstances that our self-aware primate species has been handed.)

Is there a bumper sticker that says “Anyone who has a firm grasp on reality is delusional”? I seem to recall something similar from the ‘60s, when nobody had a firm grasp on much of anything, but everyone of all political stripes believed passionately that they did. (We can thank the transgender activists for forcing grammarians to declare that “everyone” can now be construed as plural rather than awkwardly singular, by the way.)


*Luhrmann, who is a professor of anthropology at Stanford, doesn’t bother to identify the anthropologist Scott Atran, and as so often in web searches, an effort to learn more turned up something else, an astonishing book of essays from Edge edited by John Brockman, titled This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works—wherein a variety of academic professionals answer the question posed to them by Steven Pinker, “What is your favorite deep, elegant, or beautiful explanation?” http://www.brainpickings.org/2013/01/22/this-explains-everything-brockman-edge-question/ But as always, you don’t have to fill up your bookshelves to read the 192 answers, all of which are published online: http://edge.org/annual-question/what-is-your-favorite-deep-elegant-or-beautiful-explanation

As so often in such matters, Dr. Atran’s explication of his ideas makes me feel more uncertain of their validity rather than more convinced, illustrating how much one’s choice of rhetoric determines the plausibility of one’s opinions.
joculum: (cupid in the tropics)
Melancholy Reflections on the Rapid Demise of Vehicles of Information


I should start, as a certified Old Fart, by paying homage to antique academic proprieties (George Steiner would once have started such an essay as this with such a meditation) but my heart’s not in it. However, since I just read an essay by the founder/editor of the online journal n+1 surveying the onetime range of Partisan Review while bemoaning the decay of the idea of the public intellectual, I’ll begin (sigh) by noting the assorted print quarterlies that have come and gone over the decades, and mostly gone as the twenty-first century has come on apace.

But in fact my subject isn’t quarterly print journals, which I peruse these days via Arts & Letters Daily (www.aldaily.com) when I seek out their contents at all. Academic libraries are hard to get into for people outside the university, the public libraries carry few such titles, and the surviving magazine stands in this part of the world stock a dwindling number of such journals. I am astonished, or was astonished as of a few years ago, to find that some of my favorite topics are now covered by new specialized journals, which keep their contents online firmly behind JSTOR walls, as do a growing number of the surviving general-audience intellectual journals—if that isn’t an oxymoron, which at one time it wasn’t, there having been many “general audiences” in the sense of what was once meant by “middlebrow.” Today there are more niche “general audiences” than ever, many of them quite scholarly in pursuing their own particular obsessions, too. “Lowbrow” is a field of intellectual activity with its own hyperserious publications, galleries, and online discussion groups. There is also an online publication that proudly calls itself Hilobrow (hilobrow.com), which is singlehandedly passing on the information once provided by middlebrow quarterlies of various sorts, although not the sorts in which the New York intellectuals published their pontifications on public events (he said Peter-Piper-ishly).

There, I got that out of the way. And I got, however obliquely, into my real subject, which is a typically meandering complaint about the condition of rapidly evolving digital information sources.

I don’t know why I should be bothered. In the bygone days of print, “little magazines” were launched and died after a couple of issues, and did so with monotonous regularity, and most of them were even harder to locate than the most interesting online information sources—which is typically one reason they died off so quickly. A more common reason was that the editor lost interest or surplus income, which also is the reason that a good many online journals disappear completely. The difference is that as far as I know, some of the online journals genuinely disappear—when the servers that held them are wiped or the accounts are deleted, semi-decayed back issues are not offered for sale on eBay. One once-popular repository of photographs was recently completely obliterated, after a decent interval in which individual account holders could recover their own material if they saw fit. Presumably something like that will happen to Flickr someday, and to Instagram after or before it. (Snapchat has figured out how to self-destruct, or rather pretend to self-destruct, moment by moment.) The deletion of blog or photo hosts is not like shutting down a magazine; it is like burning down a library, only the library is the equivalent of the library Richard Brautigan once imagined, in which unpublished authors deposited the manuscripts of their unpublished books for perusal by library visitors. In the era of print-on-demand, we can foresee similar events of destruction—growing numbers of image-heavy books and periodicals depend on online publishers, which means that instead of such titles being available in the future from booksellers for one American cent, as is the case today with many secondhand titles for which there is only a small market, the half-dozen hard copies of some print-on-demand titles will be worth thousands of dollars. This is, by the way, already the case with recently published hard-copy exhibition catalogues with a short press run and no digital availability—there appear to be only two copies of one such catalogue for sale anywhere on the planet, both of them going for a few thousand dollars to whichever library or well-to-do connoisseur was too negligent to acquire them two years ago before the supply was exhausted.

This is a tedious topic, but I am struck by the fact that it appears to be so tedious that no one is paying particular attention to it, at least not in widely distributed discussion groups. There are ample numbers of library sites, I'm sure, that write about it all the time.

I continue to badger publishers to produce at least PDF-format versions of out-of-print books (e-books migrate among incompatible platforms, which is why I am happy that some people produce pirated online editions of books that were published in now-defunct electronic formats—bad for the authors’ royalties, but good for the accessibility of books that otherwise can’t be acquired, period. So far, the PDF has been a lasting multi-platform format.) —I continue to do that, I say; so I am not completely averse to the digital revolution. In fact, I have benefited from it beyond my wildest dreams, in terms of the coming of the universal library, and to the point that I feel deprived when I can’t locate some obscure title at least among the lists of the world’s antiquarian booksellers.

But I am disturbed by, among many other phenomena, the thoughtless wiping out of things like the popular websites of 2005. I’m sure they exist on the backup servers of entities that I shall not enumerate, but scholars can’t get at them. There is quite enough on the internet that its creators wish could be wiped out, but which cannot, not quite, so “the right to be forgotten” has become a popular topic. But things that ought not to be forgotten are also sent down the memory hole, as those who grew up on George Orwell's novel are wont to write.

Arts & Letters Daily is such a useful aggregator that the academic community stepped in to maintain it when its wonderfully opinionated editor died. (But of course it provides links, not copies of articles, and the links go dead.) And perhaps the ephemera of popular websites are too voluminous to be kept accessible over the long term; some years ago, the now-endangered film company Kodak established a program to collect donated home movies, snapshots and snapshot negatives, but most family photo albums end up in antique shops when they aren’t hauled off to the dumpster, and home movies simply decay beyond recovery, like digital information on 1980s diskettes. Historians of social trends would like to have access to every letter ever written, but few archives have the space to collect them en masse. The difference is that as far as I know, it has never occurred to a public archive to copy everything on Flickr or Pinterest. (If there is such an archive, I'd like to know about it.)

But archiving is, as is usual for me, not what I intended to grumble about, although I am glad I downloaded certain essays while they could still be downloaded. The fact that ten years from now I may find them as impossible to open as certain essays I wrote and stored on obsolete media in discontinued programs—that is a separate topic, also.

Actually, I am writing this because the modes in which information is produced are shifting so rapidly that it becomes difficult to know where best to look for it, or it is not being produced at all in the formats and lengths in which it was once produced.

Facebook is excellent as a crowdsourced aggregator, for people who accumulate the right sorts of Facebook friends—a whole range of links to essays in specialized topics appear in every hour’s news feed, and that makes it worth wading through the posts from otherwise highly intelligent friends obsessed with the strange habits of their cats. We all have our kinks, and now all of us can let the whole planet know about them.

But people’s Tumblr accounts are usually merely frustrating; Pinterest is an intermittently excellent if insufficiently catalogued visual resource for many things; and many excellent specialized blogs still exist on Blogger, a few on LiveJournal, a few on WordPress—overall, so many of them that even when I discover them via someone’s link to a specific post, I can’t keep up with them and doubt that I could even if I put them all on RSS feeds. I don’t open many of the innumerable press releases I find in my inbox, and I don’t look at a good many blogs I should be reading.

So I am not grumbling about lack of information per se (“at last he is getting to the point,” you say, but I have been covering, as usual, points I had long intended to make about other interconnected topics). I am feeling melancholy about the shift in our modes of attention themselves.

It’s a Twitter world, and unless it was an ironic aside in an advice column that seemed too genuinely earnest for that, someone’s not having a Twitter account is regarded as a major plus for certain millennials when it comes to making initial judgments as to who might be worth pursuing for more than a hookup. (One of my Facebook friends writes for Bustle and summarizes lots of stuff, but usually the satire is easier to tell from the real thing than it is on political websites.)

But for those of us who are a generation or so behind the curve, Facebook seems to have become as good as it gets for the mix of ideas, information, and images that blogs once gave us in greater profusion than they currently do. So many LiveJournal friends, I among them, now limit themselves to random outbursts where once they would have gone on for pages (or very long scrolldowns) in far greater depth. Some have apparently said what they had to say in this format, and moved on to the immediate gratifications of posts in which they know from the sheer lack of “like”s whether a post has gone over like the antique metaphor of the lead balloon. LiveJournal statistics don’t indicate whether a post was read by an interested person too busy to compose a comment or by a bot searching for a place to park an irrelevant spam message written in a Slavic or East Asian language.

I have moved or copied some of my most serious posts to another site (joculum.dreamwidth.org) that I try to keep clear of offhand remarks like this one, but I miss what once was a profusion of similar ambitious but not-ready-for-prime-time lucubrations by people who have given up on such pursuits because they get their spur-of-the-moment ideas out elsewhere. The elsewheres are too transient or quick-paced to be entirely useful; I sometimes remember to click through to someone’s Facebook timeline to see what I missed while I was having a life or writing for publication, but more often I forget, and more often what I missed consists of an enigmatic paragraph instead of a few thousand interesting words. (People increasingly write entirely for the people whom they private-message or see on a daily basis, making their posts unintelligible to 90% of their actual audience.)

Perhaps all this bloggy meandering, the written equivalent of thinking out loud for an audience, always was a bad idea. But I am feeling its absence severely, as my LJ friends feed has shrunk to almost nothing. The more so in that many of them never post to Facebook, either, and I refuse to migrate to Twitter even though the ill-chosen and ultimately unretractable 140 characters is the wave of the (near) future, or rather of a future so short-lived that it is already mostly becoming the past. (Don’t try to remind me that this has always been the case—as has recently been noted in some essays linked to in Facebook, the future is arriving much faster than it used to, which means the past is piling up at an accelerated rate, also. The ruins viewed by the Angel of History who is being blown forward by the storm from Paradise get bigger with considerably more rapidity as the wind picks up.)
joculum: (Default)
Workpoints towards “Tipping Points in the Anthropocene Era, Part Four”

There are no workpoints yet for Part Three, about human interaction and human creative practice. The problems are almost impossible to phrase correctly, never mind keep in mind simultaneously. Despite the common pressures imposed by globalized economies, human beings differ in background, environmental situation, and a multitude of other factors. It is easy to point out how minimum-wage discount-store employees in a Midwestern American city might have some financial stresses in common with corporation-employed copra dryers in a Pacific coral-atoll culture, but difficult to explicate the anxiety-provoking similarities and equally crucial differences between landlord-owned housing threatened by high rents and worker-owned housing threatened by rising sea levels literally lapping at the foundations. A particular minority-religion group in parts of Central Asia is being assisted into modernity by a global foundation, while members of the same minority group a few hundred miles to the south are threatened with physical extermination. Members of relatively disempowered ethnicities in advanced industrial societies are currently pointing out that the same structures of privilege operate throughout the social order: historically oppressed ethnicities receive only adequate rewards for doing the same work for which historically dominant ethnicities receive disproportionately great rewards, for example—even though all of them receive compensation that the less well employed of all ethnicities can only envy. There are disputes as to whether these ethnic disproportions deserve more recognition than gender-based disproportions, which are also a matter of historically disenfranchised and historically dominant categories of human beings. It is very difficult to talk about all these things at the same moment, while keeping in mind that environmentally caused cancers and lung diseases also affect different parts of the social order in different degrees, just as rising sea levels have a more immediate impact on those who live by the water—who on some coastlines are very rich, and on others, extremely poor. Discussion of these variables is rendered even more difficult by the inevitability of emotional turmoil over perceived offense, and even the workpoints for later discussions become unwieldy. So we are going directly to workpoints for Part Four, the part that is intended to address some of the problems discussed in Parts One through Three.

None of the phenomena described in these twelve workpoints plays out in isolation, as is implied by the contortions of Parts One and Two as previously posted and the probable contortions of the as yet unwritten Part Three. It is useful to separate them out, however, for the simple purpose of achieving a bit of transient clarity before plunging back into the fatal muddle that is the labyrinth of the world. There are many permutations that could turn into further workpoints, but that would land us back in the middle of the labyrinth instead of in the—well, not quite the paradise of the heart, but hopefully something of which Comenius might approve. —Jerry Cullum, still asserting some version of Creative Commons right of ascription of authorship in this text’s subsequent uses



1) Under ordinary circumstances, we are not adequately equipped by our bodies’ biological wiring to consider concurrently all the tipping points in the Anthropocene era, and how they interact with, alter, reinforce or diminish the impact of one another.

2) Even if we can learn how to expand both the length of our attention span and the number of topics we can keep in mind simultaneously, we still run up against the problem of the fields of knowledge in which we simply aren’t any good. All of these fields need to be deployed together merely to understand the interacting tipping points—never mind do something about them that doesn’t suffer fatally from the law of unintended consequences.

3) So we need to figure out how to coordinate our collective information in a way to which our ordinary condition does not predispose us. If we entrust the task to machines, we need to know the consequences of how the information was presented to the machines, which at this point is determined by the human beings writing the algorithms that input and analyze the data. As we know from the websites that recommend books and merchandise and vacation sites to us, algorithms and programmers do err. (That’s a joking reference to Martin Luther’s “popes and councils do err,” by the way, which piece of information illustrates a subsequent tipping point of communally shared knowledge that must be dealt with.)

4) Machines can understand the world, but the point is to change it. (Can a machine be altruistic? Can a human being? Points for debate currently.)

5) The cultures within which we operate are not particularly capable of changing it in the ways it needs to be changed not just to maximize human happiness, but to ensure human survival.

6) The cultures within which we operate are currently under stress from the increasingly random collisions—for many different reasons—of populations with no more than semi-compatible value systems. While hybridity seems possible because it has been the way of the earth for millennia, right now in many situations a sophisticated hybridity is the exception, and uncomfortable accommodation the rule when there is not outright mutual rejection. This is a simple empirically verifiable reality regardless of what we wish were the case. What to do about it is the question that must be answered if we are to maintain societies in which most human beings would want to live—and there is a wide variety of opinion as to what sorts of societies particular human beings would prefer to inhabit, once we get beyond a few very widespread preferences.

7) A tiny minority of humanity is engaged in scientific discoveries and technological breakthroughs that are unsettling habits formed over millennia even more thoroughly than the technological breakthroughs of the previous hundred and fifty years put together. Yet the system within which these technologies are being deployed randomly and uncontrollably is governed by economic maxims developed a century and a quarter ago and applied as though they were immutable universal truths. The tension between the two forces is creating a level of social and economic insecurity among the vast majority of humankind that is giving rise to demodernizing movements, xenophobic mythologies, and other pathologies that occur when human beings are pushed to their limit culturally, financially, and environmentally, and mostly all at the same moment.

8) The stresses of global industrial society, including industrialized agriculture and resource extraction, are leading to mass extinctions of species, intensifying already existing cycles of climate variation and thus creating destructive alterations in food production and survival of natural systems (a.k.a. ecological networks), and also creating urban environments that perpetuate personal stress among an already socially and financially beleaguered global citizenry. This is so whether the governmental entity within which the stresses are being generated calls itself capitalist or socialist, whether the form of global capitalism being practiced is corporate or state-controlled and regardless of the philosophical background of the putatively socialist economy.

9) The vast majority of human beings are too busy with other matters to absorb even a stripped-down understanding of these problems, and even if they did absorb such an understanding, many would be inclined to reject solutions that are anathema to the crumbling cultures in which they were brought to adulthood.

10) The rapidity of change increases the level of incomprehension not only between cultures and generations, but between individuals. Fewer starting points than ever are available from which to analyze and choose among the possible outcomes of global difficulties.

11) Specialized subgroups of professionals whose collective body of knowledge comes as close as we can get at this point to solving the planet-wide crisis of our epoch hold each other in mutual contempt most of the time, and misunderstand each other’s points when they overcome their contempt long enough to attempt collaboration.

12) What is to be done?
joculum: (Default)
The inconvenience of reading posts in reverse chronological order has led me to overcome my dread that no one will read four thousand words in succession in an online format. As I indicate herein, I have no idea when, if ever, Parts Three and Four will come into being, but this sums up and rethinks a good many topics I have tried to argue are intrinsically linked to one another. There are a good many that are not so linked; I am not likely to incorporate Patti Smith's introduction to the volume presenting the fashion designs of Ann Demeulemeester, for example, although other aspects of the topic might relate to the subject matter herein. Although the topics I seem to incorporate into this essay are numerous, they are not only finite but fewer than they might seem to a sufficiently impatient reader.

—Jerry Cullum (who is asserting his Creative Commons rights if not an outright copyright, as I expect the piece to evolve quite a bit before it reaches finished form)


Tipping Points in the Anthropocene Era, Part One


I have the increasing sense that the world has reached what can only be called (in spite of Malcolm Gladwell’s obnoxious use of the term) several tipping points—I use the term to denote points from which there is no going back, whether the change is for better, for worse, or indifferent but irreversibly different. These are to be distinguished from those all too common points in personal and global lives in which failure to reach a tipping point means that everything will degrade back to the unsatisfactory way it used to be; the world has an ample quantity of those, too, but what has been reversed once can be reversed again, in the opposite direction. Some tipping points are culturally inflected; it is still possible to live without the digital revolution, and even to live without electricity and contemporary medical knowledge (entire societies live that way, not always by choice) but to choose a technology involves choosing a cognitive package that comes with it even if it destabilizes things you would prefer not to have rendered unstable. Other tipping points are environmental, and are truly irreversible: extinction is forever, even if we genetically engineer a reasonable simulacrum of the original.

Perhaps some environmental tipping points will be avoided by way of cultural or technological tipping points. Perhaps the monarch butterflies will be brought back from near-extinction by the cessation of illegal logging, alteration of pesticide use, and planting of the right (North American) rather than wrong (tropical) species of milkweed on the monarchs’ migration routes. Perhaps colony collapse disorder will be corrected in the bee population, and we won’t have to use robots or impractically immense numbers of underpaid farm workers to pollinate fields and orchards fifty years from now (with the concomitant die-off of large quantities of natural flora not self-pollinated or pollinated by moths or other intermediaries...with cascading consequences for other species). Perhaps a century or so from now solar powered container ships will deliver at a slower pace sweatshop-manufactured goods from remote parts of the globe, the absurdity of just-in-time inventory replacement having gone the way of the dodo bird. (Or perhaps solar powered drones will have developed to transoceanic capacities, or some other way of getting supercheap goods from point A to point B will have been developed.) Or perhaps the countries hosting the sweatshops will have all fallen prey to demodernizing revolutions, replacing global banking with traditional methods of exchange and imposing legal systems that are not in sync with global standards of acceptable commercial conduct, leading to the rise in the rest of the world of some other, perhaps more technologically rather than exploitatively based methods of making and distributing inexpensive goods (with still further worker displacement....). Perhaps the exhaustion of a number of critically short raw materials will be worked around, as it always has been. Nobody worries about the current shortage of whale oil (to the disadvantage of the world’s whales) and as the Saudi oil minister said once, the Stone Age did not end because of a shortage of stone. (Speaking of the environment in which those whales live, perhaps sustainable fisheries will become a reality, so that fifty years from now we will not have a world in which the only fish available is farm-raised tilapia from polluted waters.) As the multiplying number of parentheticals in this paragraph indicates, the outcome of one inevitable change affects a good many other outcomes.


The future, in all those regards, is unknowable, but the future is arriving rather faster than anybody expected, and there are, at present, inbuilt social structures that keep anybody from being in a position to change things fast enough to meet its challenges unless the solutions increase short-term profits for a very specific set of asset managers in capitalist and putatively socialist countries.

The problem is that many, perhaps most, of the people who can keep up with the technology involved tend to think that the human problems will be solved by the advent of smart machines of one sort or another. The age of the transhuman is a popular notion among some self-styled futurists. Other futurists can write that we should modify William Gibson’s (no tech optimist he) remark that the future is already here, it’s just not very evenly distributed; the future is already here, but in many ways it’s not distributed at all. This small problem does not seem to bother the transhumanists, who presumably will not have to worry about rioting human mobs interrupting the electricity supply by blowing up the grid or the fuel pipeline or the solar and wind farms when the machines take over and the futurists upload their digitized selves onto servers. (Presumably the machines will have devised foolproof defenses for all these parts of the infrastructure, putting humans in general in the position of missile-armed tribesmen fighting against drone aircraft.)

The cultural tipping points are much harder to define than the environmental or technological. The technological, just as the half-delusional prophets of the 1960s predicted, has been a major force in the alterations of the cultural. The whole nexus of events, however, has not unfolded quite in the way that McLuhan or the others expected, as we all become textually visual in the immediacy of a financier-governed global village.

The cultural issues are so confusingly simultaneous that to discuss them one at a time is to fall prey to the misrepresentations that keep us from realizing quite what is happening. But to discuss all of them in the same simultaneous (dis)order with which they arrive in our lives and on our digital devices is likely to leave us with a headache and no greater degree of comprehension than we had previously.

As H. P. Lovecraft put it, in an opening sentence I was quoting before it became fashionable or even acceptable to do so, “The most merciful thing in the world is the inability of the human mind to correlate all its contents.” Thomas Pynchon concurred, of course, but he too did not completely realize just how much there is that needs to be correlated. (William Gibson sort of did, and still does, as do Douglas Coupland and, maybe, Okey Ndibe—Foreign Gods, Inc. is a fairly amazing first novel, as much in its own way as its nearly polar opposite Neuromancer was.) There are frightening gulfs of time and space that have nothing to do with Lovecraft’s space monsters, and globe-spanning sets of interactions that go beyond Pynchon’s intricate economic conspiracies (which is not to say there are not economic conspiracies, just that they are not a sufficient organizing principle to explain what is happening to the planet and the human society that sprawls across the face of it, changing it inexorably and unconsciously as it goes).

Widen the area of consciousness, Allen Ginsberg wrote; but he meant something psychedelic, and while it is good to be aware of the gulfs at the margins of consciousness (where something very interesting might be trying to get us to notice it), we need to widen awareness of things that are much more central to human survival. But we live in an age of multiple centers, or an era in which the notion of the center has been inexorably weakened by the awareness of the networks of meaning and the networks of physical force that we approach one at a time, when in fact we need to be conscious of how they interact.

George Steiner, who prophesied all this some forty years ago in In Bluebeard’s Castle, has become yet another version of the Last European (in the sense of someone defending a culture that already has become, as Pope Francis recently implied, sclerotic at best and moribund at worst). There is something ludicrous about the style of hatch-battening being undertaken by those who feel that it is time to batten down the hatches of the glocal ship against the storms to come and the storms that are already here; this is so not least because the measures are so inadequate and so shortsighted, and because the ships with their unbattened hatches (let us ride in the vessel of this nautical metaphor for as long as we can) are already sinking in the crosscurrents of global forces.

That was a way of putting it; not very satisfactory. To quote T. S. Eliot’s Four Quartets, which is one of the fragments I shore against my personal ruins. The boy from Saint Louis did all right in mythicizing the British culture into which he inserted himself as a foreign immigrant, even if, as Wyndham Lewis put it, he had to disguise himself as Westminster Abbey in order to do it. Derek Walcott and V. S. Naipaul did it differently, coming from the Caribbean instead of the middle of North America, and they have been succeeded by younger generations of immigrant writers, predominantly female.

The curious thing is that Steiner not only noticed this, decades ago, he celebrated the fact that the English deployed by Commonwealth writers was far richer than the language as it was spoken and written by the putative indigenes of a little fog-haunted island off the coast of the peninsula of Asia that we call Europe. In other words: the ex-colonials were doing it much better than the longstanding Brits, and frequently doing it in the home country of their colonizers. A similar phenomenon obtained in France, where African and Caribbean francophones often outdid the proud originators of the language, and did it in Paris, too.

What happens, however, when the new arrivals begin not just to enrich but to supplant the cultural assumptions of the previous generation? The Brits learned in school that their culture came about because after holding out against an onslaught of Danish immigration, Celtic and Saxon-immigrant culture pretty much collapsed under the weight of Norman French occupation, resulting in the hybrid we call England that subsequently pulled Scotland, Wales and Ireland into its orbit. The French...well, the French learned about their ancestors, the Gauls; just how a welter of differently-languaged regions were fused into the hybrid entity of La Belle France tended to be passed over in silence until very recently. Cultures function by mythicizing themselves; cf. the relatively recent books titled The Invention of France, The Invention of Scotland, The Invention of Argentina—everywhere cultures are created by arbitrary denials of difference for the sake of creating a fictionalized national unity that eventually turns into a functioning reality.

As I was implying a few paragraphs back, the problem today is that the new arrivals everywhere are increasingly less interested in mixing and mingling with the existing cultures, not least because the historically defined local cultures are increasingly in a state of collapse, and the long-resident locals are not particularly inspiring exemplars of them. (Whittaker Chambers sixty years ago, in his usual tone of histrionic rhetoric: “It is futile to talk of preventing the wreck of Western civilization; it is already a wreck from within.” Gandhi, some decades earlier than that, when asked his opinion of the European code of civilized conduct: “I think it would be a good idea.” (Both quotes are cited from memory, and probably slightly wrong. Don’t quote me.) The fact that a good many readers no longer understand Gandhi’s joke illustrates Chambers’ point; there are habits of irony that never quite took comfortable hold in America, and are disappearing from the Europe that originated them, and for which Chinese and Indian and African intellectuals found analogues in their own cultures. Sometimes the non-European intellectuals are now more adept practitioners of the ironic cast of mind than the heavy-handed literalists of one-dimensional Europe and America—and has anyone noticed how Australia and New Zealand get cast to one side in this discussion, while South American cultures are so tangential to it that it is necessary to reframe the terms of debate in order to include them? although of course “non-European intellectuals” is a term that includes Jorge Luis Borges, who was more European in Buenos Aires than most intellectuals in Paris or London, and entire subsequent generations of South American writers in Spanish and Portuguese have more in common with their contemporaries in Prague or Dresden than with their contemporaries in, say, Chicago, Toronto, or Sydney.)

This essay has yet to get to the cultural consequences of the digital divide, and of the prevalence of cultures of demodernization in large parts of the world, including enclaves in a world that has gone from modern to liquid-culture post-postmodern. But in my experience, fifteen hundred or two thousand words is about all I am good for, and all that anybody can deal with online or in downloads, other than the dwindling ranks of New York Review of Books and New Yorker readers in America, and their counterparts in other countries and other continents for whom ten thousand words at a stretch is as nothing, even onscreen. (The quality of the screen matters a great deal, however.)

So I shall have to take up the difficult aspects of shifts in sensibility some other time, which is just as well since, as all of the foregoing implies, the shifts are so different from one subculture to another that it will take a fair number of paragraphs even to delineate them as inadequately as I have thus far delineated the other stuff. I like to allude rather than spell things out, which, by the way, is a characteristic that both Michel Houellebecq and Graham Harman have ascribed to H. P. Lovecraft in their reassessment of the value of his oft-derided rhetoric and transhumanism avant la lettre. Onward to eldritch revelations of disturbing fragments of evidence under a gibbous moon, then, but later.

[Regarding the habit of allusion without footnotes—that is why we have Wikipedia and websearches, not necessarily on Google. I look up most of my allusions these days (except when I get really lazy), and sometimes have to search for days before I find the reliable version of the quotation if I don’t have the book on my shelves. But in the old days of interlibrary loans, it sometimes took even longer, and sometimes I still hadn’t tracked it down, even years later.

But it would be fun to annotate this essay; littered with superscript numbers midway through every sentence, it would resemble an idea-convoluted story by John Barth, Donald Barthelme, David Foster Wallace, or their generational relatives and successors, with whom I haven’t kept up.]


-----------------------------------------------------------------------------

Tipping Points in the Anthropocene Era, Part Two


I have said remarkably little about the American context, and the purportedly postracial American context most of all. But that is because I do not want to sink irredeemably into a dispute over whether the American experience east of the Mississippi prior to 1920 is best represented by Huck Finn, Moby-Dick, The Souls of Black Folk, all of the foregoing plus Emily Dickinson (and maybe Madame C. J. Walker as a practical economic exemplar), or that it is pointless to argue about literary and economic history as long as today’s policemen armed with army-grade weapons are killing black men and women for selling cigarettes or standing the wrong way in the street.

So for the nonce, let’s not go there, although we must someday visit how all this affects or is affected by the forces that depopulate drought-ridden agricultural regions while creating boom towns in resource-rich ones, and how much of what is happening is unfolding more or less as it always has whether times in America are deemed to be good or bad.

Better to look for now, very briefly, at the discontents of demodernizing forces across the planet, and their relationship to the insights that Peter L. Berger articulated some forty years ago: that modernity travels with cognitive packages that are difficult to detach from it, and that those cognitive packages are intrinsically destabilizing even when they don’t automatically offer the culturally specific forms of stability that advanced industrial societies had worked out, in the era just before the Arab Oil Embargo first changed everything.

The world’s societies have been relatively settled for so long that their mores have become mythicized as much in the centers of postindustrially developed power as in the cultures closest to the level of hunting and gathering. Ways of routinizing the rights of the individual (and of the corporation defined as a collectively composed individual) have evolved so successfully, and become so codified into law, that it is forgotten how much the law has to be transmuted into social habit. One might argue that Edmund Burke and Antonio Gramsci were not irredeemably far apart in their insights into human nature; the difference is that Burke felt that settled tradition reinforced by state power was the only thing keeping humanity’s natural tendency towards rascality from devolving into chaos, whereas Gramsci more incisively saw that the rascals running things were perfectly capable of bending settled traditions and state power to their own ends, and of learning how to convince others to believe in versions of settled tradition that were against their own interest and the interest of all their neighbors. It is not necessary to grant angelic status to human beings to support their right to act with full awareness, and with as much capacity to act as is not instrumentally harmful to other human beings. (But what if the harm is psychological? And what if the instrumental harm is indirect, by way of subtle aspects of the physical environment, whether that environment be the ecological balance of natural forces or the social forces of architecture and urban space? Politics may be based on power, but it also rests on how communities define the nature of nature and the nature of the human.)

In liquid modernity (I confess that I do like Zygmunt Bauman’s locution, for its metaphoric power) the mix of individuality and social support structures is shifting in ways that bring their interplay into clear visibility for those who choose to see. The problem is that scarcely anyone really wants to see; regardless of one’s political or social position, certain opinions that are grounded in reality are going to sound like transgressions against one’s received pieties, and like granting aid and comfort to one’s ideological and political enemies.

We are not yet at irreversible points in the definition of humanity, in spite of the irreversible flood of knowledge about our condition; not so long as bodies of knowledge can be obliterated or occulted by force or disorder. The definitions, of course, remain irreversible in and of themselves; what is reversible is how much is known by individuals in a society, how it is known, whether the exact definition of the knowledge is open to argument and refinement, and whether the knowledge can be acted upon.

So before we enter upon the question of the nature of humanity and the nature of human institutions and the nature of nature and the future of all three together, which may end up as a never to be written Part Three of this, we shall have to consider a couple of present-day (in 2015) examples of the demodernizing and anti-modernizing forces (not the same thing) that have been explored from many different perspectives since I first read about them in Peter L. Berger and read about their practical consequences in the unintentionally mischievous reportage of Robert D. Kaplan.

Boko Haram is, of course, actually named after the assertion that non-African modes of education are heresy in the brand of Islam that the group espouses. Never mind that the brand is itself an opinion open to dispute, or that parallel demodernizing groups have sought to wipe out the intellectual heritage of African Islam itself and a large part of traditional African Islamic creative practice; what matters on an immediate basis is that the group has the imported firepower to enforce its opinion and supplant state power in so doing. (To combine the insights of Berger and Kaplan, when technology is considered just another object in the town marketplace, akin to salt or vegetables, the question of cognitive packages does not arise—or rather, the cognitive package is the category “useful object for sale.” You do not have to think about the world that produced a weapon in order to use it, or even to maintain or repair it, any more than you need to understand the relationship between software, hardware, and network in order to use the internet.)

That’s a demodernizing movement, although the modernity it opposes incorporates a large part of its own indigenous history. Anti-modernizing currents, or movements that oppose a specifically Western European modernity, are not at all the same thing, and we could look at a couple of the world’s largest societies if we had a few thousand words to spare and a few hundred hours to do the research beyond the headlines.

China’s directive to schools and universities not to teach “Western ideas” (other than socialism, of course) is the example du jour. The assertion could be spelled out more precisely, and probably has been by someone other than the European and American academicians who insist that since science and civilization in China got along fine independently prior to the intrusion of European and American would-be colonizers, it is perfectly feasible and perhaps desirable to revert to a uniquely Chinese way of organizing technology and the state that makes no obeisance to notions imported from Europe, other than Marxism, of course.

Others have argued that, on that view, it is possible and necessary to look at the irruptions and eruptions of intellectual and political forces in Chinese history that were effective analogues of the opinions now being disparaged as alien, and to argue that since almost all societies generate similar displacements of tradition (it’s just that the displacements succeed better in some cultural circumstances than in others), we might as well consider certain parts of the contemporary human condition (e.g. individual rights) as universals, regardless of who was the first to articulate them.

This is where the disputes of present-day anthropology become relevant to the disputes of present-day politics, but it would take another ten thousand words to unpack that problem.

Although the notions of exceptionalism in such places as China and Russia and the United States of America are certainly worth respecting and examining, if only because in many ways each is an exception, all cultures insist upon the rightness and superiority of their own ways of doing things, and treat the barbarians or just the rubes from someplace else with a certain amount of suspicion.

Problems arise when enough rubes from someplace else arrive with the same unexamined reverence for their way of doing things that the local rubes have for theirs, and we are talking about events probably dating back five or ten thousand years here. In good circumstances, invigorating hybridities flourish in newly enriched cultures; in bad circumstances, people start killing one another even without benefit of invading armies to render the job more efficient.

It is difficult enough when the simple facts of intermingling of cultures have reached a point of irreversibility and something is going to change dramatically, so that the point at issue is what sort of change is going to happen. When the culture on whose turf the drama is being enacted is itself going through internal upheavals created by economics, ecology, and/or technology, the difficulties are doubled and quadrupled, or multiplied by whatever factor your capacity for dubious rhetoricizing will allow.

Right now, globalized complex societies throughout the world are suffering from a combination of increasingly irreversible circumstances of this sort. Differently complex societies are in some cases simply collapsing, and generating instabilities that impact almost every other society on earth. (A few isolated societies are affected only by one or two forces, such as that their continued existence constitutes a hindrance to resource extraction.)

The dynamics of economic exploitation, cultural insensitivity, ingrained dislike for other ways of life, kneejerk responses applying universal principles to particular cases, and so on, is a topic that can scarcely be discussed without offending so many passionately held beliefs that no one really wants to undertake the dialogue.

To cite only one example, recent patterns of disease suggest that we need to consider the difficulties caused by the fact that some people really like to kill and eat forest animals even when other food options are available, and that some people like to consume cheeseburgers no matter how many cautionary calorie charts are posted or less artery-clogging food options placed on the menu. But there are also ample numbers of West African and Midwestern American young professionals lining up at the same low-calorie food purveyors in spite of their grandparents’ preference for bush meat or burgers. Some of them even share the same opinions regarding the ecological and epidemiological consequences of socially mediated choices, as well as the consequences for their personal health. And many of them probably hold divergent opinions on ashé (including simply knowing the term) and on the legitimacy of banging away at deer in hunting season in America.

But for those who try to discuss such topics as this outside a work of fiction, the topic of practical consequences will quickly be derailed by inquiries and objections about the problem of intercultural communication and who possesses the right to pass judgment on what sort of behavior.

In the meantime, the tipping points for a host of interconnected problems continue to arrive, and the choices narrow accordingly.

There is a clear need to move to Part Three here, but that is the larger area in which I need to discuss topics on which I have been accumulating books I was going to get around to reading someday, even as the topics themselves can barely be kept up to date in daily news reports on the consequences of the historical forces, without regard to the theories that might allow us to make sense of them. So I may never get round to writing that one, although miracles have been known to happen.

A bibliography of interesting titles, with a frank admission that I have only a fragmentary idea of the overall value of the books’ contents, may be the interim solution.

Part Three, Beta Version

Actually, parts one and two are very much beta versions, with their own share of operating problems. But I hope I have made my point that human survival depends on the concomitant consideration of a variety of rapidly changing situations, despite all the reasons why this sort of consideration is not very likely to happen.

What we do about this is the topic of a Part Four that is even less likely to get written than Part Three, although I have implied in past essays some possible routes to addressing the problem.
joculum: (cupid in the tropics)
“The Universe...Is Stranger Than We Can Suppose.” “Play it again, Sam”?


Friend Grady Harris and I (I won’t, for once, append the a.k.a.s of his many online pseudonyms, of which “Grady Harris” sounds like one and on one level actually is) have long traded comments on how it comes to be that all famous quotations are parceled out erroneously on the internet: wryly ironic observations are said to be by Mark Twain unless they are by Woody Allen, and people who think that Twain must be the originator of everything vaguely ironic sometimes compel Twain to use slang terms from the 1960s. Slightly gnomic but lyrical lines of verse are almost all said to be by Emily Dickinson, but occasionally by a handful of others, although the lines usually contain enough internal clues to warn a diligent websearcher that they were not written by any of the poets cited. Observations about science are typically assigned to Albert Einstein; for a significant exception, see the discussion that follows these longwinded general observations.

There are subcategories of generic aphorisms that end up being ascribed to whatever other authors and thinkers most internet users already know, so that on a more arcane level of discourse, the persons most often cited in introductory college courses in anthropology or sociology become associated with authentic quotes (sometimes by their academic opponents) that express opinions they never held. This shades off into the more common phenomenon of supposed quotations of recent remarks by the Dalai Lama or Pope Francis that neither of them ever uttered but that sound sort of right; the “Play it again, Sam” syndrome, one might call it, and probably someone already does.

The game of quotational distortion and erroneous ascription is, of course, much older than the internet, as the “Play it again, Sam” reference indicates; and longtime readers of this blog who also have long memories will recall my patient tracking down of who first reported on an experience with ether in which the user was granted the revelation that the universe is permeated with a strong smell of turpentine, or was it that an odor of petroleum obtains throughout? Actually, it was neither one, not exactly, although very similar expressions occur in the retellings that now include this one.

What happens, often enough, is that writers remember an anecdote or a quotation pertinent to their argument; if they are intellectually responsible, they don’t want to put a loose paraphrase in quotation marks, and if they either trust the reader to be familiar with the quotation in question or omit the name because they themselves are not sure who said it, they may later find the quotation ascribed to them as its originator, and not always in the exact words they used in paraphrasing it.

I bring all this up because recently I saw on Facebook one of those familiar overlays of an edifying quotation on a semi-appropriate photograph, in this case one saying, “The universe is not only stranger than we think, it is stranger than we can think. —Werner Heisenberg”. This annoyed me because I was convinced that Heisenberg couldn’t have written that; and that some other well-known physicist had said it; and that the correct quotation was the more elegantly phrased “The universe is not only stranger than we imagine, it is stranger than we can imagine.”

It took a fairly short time to find that the more elegant version is ascribed, wrongly, to Arthur Eddington, who never wrote such a sentence even if he expressed vaguely similar opinions. The preponderant opinion of commenters was that this is a distortion of J. B. S. Haldane’s 1927 aphorism, “The universe is not only queerer than we suppose, it is queerer than we can suppose.” Since no one gives a page reference to Haldane's Possible Worlds, even when they identify it as the source, I must withhold judgment on what Haldane actually wrote, but he does seem to be the locus classicus for this particular thread of morphing aphorisms.

Subsequently I found that the Heisenberg quote often comes attached to a book title, his 1974 Across the Frontiers, but the page reference hasn’t migrated along with the quotation, and I haven’t searched long enough to find the book online. If the entire passage from which the quotation comes has ever been cited by someone, it is buried so deep in the pages of search results that I haven’t had time to locate it.

Citations of the “imagine” instead of “think” version never seemed to get beyond the misascription to “Sir Arthur Eddington,” which bothered me, as I thought I had seen it ascribed otherwise. A more comprehensive websearch found that it is now often ascribed to Richard Feynman. A slightly different set of search terms turned up a mind-numbingly exact citation of Richard Feynman’s The Feynman Lectures on Physics, Volume II, Section 41, page 12, but unfortunately in that passage Feynman is actually quoting J. B. S. Haldane’s original remark, “The universe is not only queerer than we suppose, it is queerer than we can suppose.”

At long last, I discovered that a source purporting to impart “the deepteachings of Merlyn” actually transcribes the passage in which Feynman says “Not only is the universe stranger than we imagine, it is stranger than we can imagine,” and it comes from a popularizing lecture on the mysteries of quantum mechanics in which he makes offhanded allusions and, in the transcribed passage at least, doesn’t give an oral citation of his sources. The writer cites the URL www.youtube.com/watch?v=0XgmrMZ0h54, but unfortunately that page has been taken down for copyright infringement, so the accuracy of the transcription can’t be verified.

I am left with the feeling that Feynman thinks he is quoting somebody else quite accurately, and considers the quotation too well known to require him to cite the name in a lecture in which his rhetoric is on a roll. He imagines a firmly established identity for the author of an aphorism that turns out to be far more fluidly attributed than he supposes.

One webpage of quotes about science wisely ascribes the “imagine” version to that prolific producer, Anonymous. Somebody is the actual author.
joculum: (Default)
The Time Between Is Not Quite Over, Part Two


I have written to the point of boredom that one of my favorite cartoon punch lines is Charlie Brown’s “I don’t even understand what it is I don’t understand.”

This is indisputably (a rhetorical device that indicates that the concept under discussion is quite open to dispute) the case with our present historical moment.

We know (sort of) that we are biologically/psychologically inclined to focus on the little details most of the time, to compose the big picture from a few little details misleadingly turned into general principles, and in general to jump to conclusions, just as I am about to jump to conclusions in this essay.

A major difficulty of the present moment is the fatal concatenation of the aforementioned inbuilt mental tendencies. We are confronted with a planet-wide confluence of hyperobjects, systems of nature too large and too independently variable to permit easy focus on more than a fraction of their cumulative effects. We are confronted with hypersystems in global economics that make it easy to direct individuals towards the interpretation of small grievances, while insisting that these are all that there are, that there are no large consequences playing themselves out as a result of the many small economic distortions made possible by structurally similar legislative choices the world over.

We are discovering that human health evolves according to similar principles: Plagues derive from large environmental alterations that turn previously minor diseases into major ones through the creation of new disease vectors. Major breakdowns in individual health derive from the combination of a large number of minor fluctuations in the functioning of systems of the body we have learned to think of as separate from one another. (Paul the Apostle already knew better than that, sociologically as well as materially: cf. I Corinthians, chapter 12.)

One of the major problems in confronting this concatenation of obvious facts (a rhetorical device that indicates that there is indisputably a dispute over whether they are obvious, or even their status as potentially debatable facts) is that the previously available conceptual language with which to discuss them has been sloppily sentimental, philosophically misleading, or simply unintelligible. “Holistic” when applied to any topic is, for many people, only a cuss word signifying “superstition masquerading as science.” “Object oriented ontology” is a phrase that signifies just as much to most people as you would think it does, and a topic that seems to lead to informal discussions of what a glass might experience when it breaks, which is not at all what a revived focus on the relationship between objects and consciousness ought to be discussing. The point is that objects do interact in ways we cannot quite wrap our heads around; consciousness does not quite determine being, but “being” is not what Karl Marx thought it was, so being does not quite determine consciousness, either. We need a dialectic that is neither old-school materialist nor idealist, and old philosophical conundrums are not going to help get us there.

Whether there are neglected aspects of past thought and practice that might prove useful in getting us to where we need to be is a topic that happens to fascinate me, but focusing on possible historical antecedents is only going to give rise to unnecessary and unhelpful arguments. We need to find new ways of expressing our present condition, of overcoming our predilection towards systematic misinterpretations of that condition, and of bestirring ourselves to get off our butts and do something about that condition.

So there. (Discuss.)
joculum: (Default)
Nails and Hammers and Unmixed Metaphors


Some time ago, when I was telling a friend about the neuroscience students who were looking at Bethany Collins’ blackboard-like panel of white-lettered words breaking apart and collecting into piles of letters, which reminded them of how memory and language-formation function, he said, “To the man who only has a hammer, everything looks like a nail,” to which I retorted, “No, no, they got it! That was why I put it in the ‘From Cosmology to Neurology and Back Again’ show!—because it wasn’t just about the political concepts in the words, it was about how the concepts form and fall apart in our minds and our societies!”

Actually, I didn’t say much beyond “No, no, they got it!” but we all know about pensées de l’escalier. (Or “That was what I shoulda said.”)

I do, however, keep having new and ever more horrifying realizations of how we actually do interpret the whole world in terms of the tools we have with which to interpret it; more accurately, we interpret and judge other people’s tools in terms of the ones we know how to use, and we interpret other people’s interpretations in terms of the tools we know how to use.

The computational model of consciousness is a case in point. Anyone who studies humankind’s cultural creations realizes that there is a much more complex set of responses to the environment than pure computation, but how germane to actual consciousness are the complex responses? We are, after all, getting better at creating systems like Siri in which algorithms mimic at least the standard tropes for reacting to reports of others’ emotions and sensations, from pain and hunger to fear and sexual desire. (“I am sorry to hear that. I am sad that there is nothing I can do to improve your situation.”)

So a good many everyday behavioral or pragmatic tools for navigating existence are purely mechanical, a more sophisticated form of “How are you?” “Fine, thanks, how are you?” “Fine, thanks.” or “Thank you.” “You’re welcome.” And as the linguistic philosophers pondered three-quarters of a century ago, it is quite possible for the “Fine” exchange to contain not a syllable of accuracy concerning the respondents’ inner emotional or physical condition, since it exists for other purposes than information.
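To make the “purely mechanical” point concrete: an exchange of this kind can be reproduced by nothing more than a lookup table, with no inner state and no information about anyone’s actual condition. A minimal sketch in Python, purely illustrative (the phrases, names, and fallback are my own invention, not anyone’s actual system):

# Phatic exchanges need no inner state: a table mapping conventional
# prompts to conventional replies is enough to keep the ritual going.
PHATIC_REPLIES = {
    "how are you?": "Fine, thanks, how are you?",
    "fine, thanks, how are you?": "Fine, thanks.",
    "thank you.": "You're welcome.",
    "i am in terrible pain.": "I am sorry to hear that. I am sad that there is nothing I can do to improve your situation.",
}

def reply(utterance):
    # Normalize and look up; the fallback is as contentless as the rest.
    return PHATIC_REPLIES.get(utterance.strip().lower(), "I see.")

print(reply("How are you?"))   # Fine, thanks, how are you?
print(reply("Thank you."))     # You're welcome.

Nothing in the table knows or cares how anyone actually feels, which is precisely the linguistic philosophers’ point about what such exchanges are for.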

But we end up, vis-à-vis such questions as the computational model of consciousness, in messy issues of what it means to have a body or to be a body and what it would mean not to be a body or to exist as a conscious being without having a body. And people whose particular emotions and mental skills lead them to acquire expertise in one academic field are likely to have a completely different way of putting the problem into words, or of understanding the problem intuitively, from those whose expertise is formed from different skill sets.

In principle, we should all be able to comprehend what it would mean to understand a problem using a different set of acquired skills. But because our comprehension of that question is partially determined by who we are as embodied beings with a personal history, we don’t even understand what it is we don’t understand, as I have quoted so often from The Wisdom of Charlie Brown.
joculum: (Default)
Neither Here Nor There: An Exceptionally Hasty Note on the Concept of Between-ness



I have been unable to visit the exhibition at Atlanta’s Gallery 72 titled “Middle,” but I have read the curator’s statement (or an extract from it in the press release) by Candice Greathouse that states, “The works included in this exhibition serve as a visual dialogue of ideas that investigate notions of this middleness - inbetweenness and potentiality through material and process. The artists and works featured in this exhibition represent the middle through a variety of strategies conceptually and aesthetically. They resist concrete categorization and definition, offering instead a provoking ambiguity that prolongs and redefines the ‘middle’.”

Receipt of this press release happened to coincide with the arrival of my essay “Oscillations and Interstices,” written for the catalogue of the “Oscillations” show at Steffen Thomas Museum of Art a year ago.

This has got me to thinking about the shift in the concept of in-between-ness from Zwischenzeit to Zwischenheit in German (the former just means “the meantime,” as in “in the meantime,” while the latter means the condition of “in-between-ness”) and what the existentialists of fifty years ago made of the former. The meantime became a literal “time between,” Martin Heidegger’s “time of the No-More of the god that has fled and the Not-Yet of the god that is coming.” Less polytheistically, “the time between” implied a moment of fundamental historical change, a transitional moment in which no one could feel at home: “each torpid turn of the world has such disinherited children, to whom no longer what’s been, and not yet what’s coming, belongs.” (Rainer Maria Rilke, Duino Elegies, in the familiar Leishman-Spender translation.)

It feels like we are in the condition of between-ness but no longer in the time between; we have crossed some kind of dividing line, and the no-longer is receding into a rapidly aging history while the not-yet is rapidly becoming the already-here. (“The future is already here, but is unevenly distributed.”--William Gibson)

I have no time (no pun intended) to expand upon these thoughts at the moment as I have to go catch a plane to somewhere else.
joculum: (Default)
Having been reminded, nearly two years later, to look it up, I am delighted to find that I have once again created the only Google-search citation for a quotation, one which I acknowledged at the time was probably an ill-favored misquotation, but mine own.

Item: Samuel Johnson's "Depend upon it, sir, when a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully."

Which I rendered, on January 13, 2013, as "The sure knowledge that one is to be hanged in a fortnight concentrates the mind most wonderfully."

It was pleasant to rediscover that post, and to delete some wondrously unconcentrated spamming comments that didn't add enough surreal content to make them worth retaining.
joculum: (Default)
It’s a kind of [devastated-landscape] ugliness that can be achieved anywhere, I suppose, but it’s most easily found on the borders where cultures clash....

...among the handlers, I had learned not to dismiss anything as meaningless. Mystery, I’d read somewhere, isn’t the absence of meaning but the presence of more meaning than we can comprehend.
---Dennis Covington, Salvation on Sand Mountain: Snake Handling and Redemption in Southern Appalachia

I wonder what he meant by that? ---punch line of joke in which the psychoanalyst is referring to a colleague’s pleasant “Good morning.”

Why haven’t I seen this before? ---Walter Pidgeon as Edward Morbius in Forbidden Planet



I find myself reading Salvation on Sand Mountain after twenty years in which I never quite felt the need to do so, and am finding it unexpectedly resonant.

The remark about the ugliness of borders where cultures clash (in Appalachia, places where mountains are leveled both for strip mining and to provide a space for a new Holiday Inn) suddenly illuminates for me—because different cultural expectations have real material effects in life and landscape—the dubiousness of most attempts to disentangle material and spiritual/psychological factors. (This last remark would be easier to make in German, where the word Geist serves for both “spirit” and “mind.” German, however, has had to go to “spirituelle” rather than “geistig” to translate “I’m spiritual but not religious,” if Google Translate is to be believed. “Geistig” still means both “spiritual” and “mental,” however.)

As I have written so often before, it is pointless to try to derive cultural characteristics solely from economic substructures, as pointless as to try to insist that only the spirit matters, matter doesn’t matter. (As in the intrinsically untranslatable old British joke, “What is matter? Never mind. What is mind? No matter.”)

As Election Day nears in a couple of places, and has just passed in a few others, I find myself thinking about the persistence of cultural preferences in the midst of changing economic circumstances, and how cynics can play upon regional psychologies to attain their own ends, ends which may be either economically or culturally based. Leaders, too, are prepared to sacrifice their own best material interests for the sake of the ideals that stir their souls most deeply.

The best outcome would be one in which material and spiritual goals were not muddled up together, or mistaken one for the other.

As Captain Obvious said once, I believe.

The problem, and I have written more than a few thousand muddled words about this, is how to extract meaning from the muddle.

I wish I could remember which character in fiction said, “I love mystery, but I hate muddles.” It seems like it came out of a Charles Williams novel, but it is such a British-English thing to say that it could equally well be something as embarrassing as Agatha Christie or as differently embarrassing as Robertson Davies.

None of the above...a very modest amount of websearching attributes the quotation to Mrs. Moore in A Passage to India, which is rather appropriate to my original topic of borders where cultures clash. Adela: “I dislike [mysteries] not because I’m English, but from my own personal point of view.” Mrs. Moore: “I like mysteries, but I rather dislike muddles.” Fielding: “A mystery is a muddle. ... A mystery is only a high-sounding term for a muddle.”

“Boum,” said the Marabar Caves.
joculum: (Default)
I have written a few essays that seem so irrelevant to world history of the past six months that I have not posted them anywhere. The day may come when they seem relevant to something.

Or as a very young Bob Dylan said as an introduction to one of his compositions, "It must be good for somebody, this here song. If not for me, it's good for somebody."
joculum: (asleep)
Karl Kraus withheld publication of Die Dritte Walpurgisnacht for fear of recrimination against friends, and famously wrote only that "Gewalt kein Objekt der Polemik, Irrsinn kein Gegenstand der Satire sei." (Those unable to parse this could paste it into a reputable translation program as I did to make sure I was understanding it rightly.)

The times are not propitious for writing about many of my favorite topics, and have not been for some time now. So I have written a number of posts and then chosen not to post them.

But I must remark that I feel at the moment as though I have fallen into a condensed version of John Crowley's Ægypt cycle: we have learned thanks to current events that the Gnostics are beleaguered both as supporters of repression and victims of it. The secretive Cult of the Peacock Angel is a topic of daily newspaper headlines, except that many of its practitioners have cellphones and wear T-shirts bearing contemporary slogans, making one speculate whether secularity has eroded religious passion among its practitioners as it has in so many other religions. The sense of social instability that accompanies this, when reinforced by economic instability, makes a revived fundamentalism seem plausible to multitudes. (And this insight also is found in Crowley's four Ægypt novels, exercises in fantastic realism with an emphasis on the "realism.")

I would say, as Kraus did, that all this brings nothing else to mind, but it would not be true, as he knew it was not for him when he made the statement.
joculum: (Default)
More Mildly Entertaining (Or More Likely Not) Notes on the Human Imagination


Re-reading various theoreticians who quarrel fatally with one another, I wish more than ever that we had a more comprehensive model of how the cultures into which we are born shape our psychological preferences. A concurrent perusal of my Facebook news feed, Wes Anderson’s new movie The Grand Budapest Hotel, my cousin Mary Stricker’s blog about fantasy, Grimmella, and Patrick Leigh Fermor’s reconstruction of his youthful self’s peregrinations round the Black Sea in The Broken Road reveals parallel narrative and emotional structures that are modified by the time frame and the gender-and-class conditions in which the parallel structures are being constructed by the human imagination. In other words, very different kinds of folks fall again and again for the same sorts of things, only different. And it matters very much whether we look more at the sameness or the difference, when we ought to be looking at both together, and looking both at what people love and at what they hate.

We have a huge amount of academic fustian and intellectual obscuration taking place in unreadable journals, all discussing phenomena of the human condition that the academicians are examining in too small a sample, in too limited a geographical and historical circumstance, looking at too few variables.

Likewise, people in general like what they like, and know a great deal about the stuff they like, and do not think very much about why they like what they like, and why, under different circumstances, they happen to like something else.

I could try to struggle on for a few thousand more words on this topic, but you would stop reading after about one more paragraph, anyway.

This re-realization (it’s another one of those topics I rediscover about twice a week with the same sense of surprise each time) makes me feel like finally taking the time to work all the way through D. Fox Harrell’s Phantasmal Media, a new book to which I refer with monotonous regularity, simply because Harrell is trying to synthesize a good many theoretical approaches. It is possible to extract a number of different lessons from Harrell’s narrative even though his primary interest is how to make relationships of power and ethnic identity visible through digital media—or how to create politically and socially efficacious video games. Since he focuses on “how to understand and create evocative story worlds, poetic metaphors, critical conceptual blends, haunting visions, and empowering cultural experiences using the computer,” we can extrapolate beyond the “using the computer” part and look more broadly at that piquant juxtaposition of “haunting visions,” “empowering cultural experiences,” and so forth. Most of the people writing about one or two of those topics wouldn’t know the other topics if they performed the usual American-slang cliché on their posterior regions. (For my non-U.S. readership, that’s “if they came up and bit ‘em in the ass.”)
joculum: (cupid in the tropics)
I have found another convert for the fiction of John Crowley—Joe Elias Tsambiras (http://www.kailinart.com/2014/02/the-art-of-joe-elias-tsambiras/), whose artwork is Crowleyan without knowing it, but using a different visual vocabulary from the art John Crowley most admires (which means Mr. Crowley may not approve of it). Although I recommended Little, Big and the Ægypt cycle without further citation or explanation, it was the 2009 interview in The Believer (http://www.believermag.com/issues/200905/?read=interview_crowley) that made a believer out of him, perhaps because so much that Mr. Crowley says in that interview would also apply to the art of Joe Elias Tsambiras.

Unfortunately Tsambiras is acquiring particular editions according to the cover art he finds most relevant and appealing to him, so I can’t guarantee the purchase of new copies, which of course matters a great deal to a living writer.
joculum: (cupid in the tropics)
I am stunned to find that the annual conference of the International Association for the Fantastic in the Arts has a roster of academic papers and participating academicians that puts the American Academy of Religion to shame in terms of quantity: and this is only on one interdisciplinary topic, even if it is explored from many different perspectives.

http://iafa.highpoint.edu/wp-content/uploads/2011/08/35th-Annual-ICFA-Program-Draft-v2.pdf
