Overweening Generalist

Tuesday, October 29, 2013

Synthetic Biology: Potentials Perilous and Promising

"Synbio," or synthetic biology, is here. It's alive!:


It's already been three years since Craig Venter's team made a self-replicating species whose parents were not a mom and dad, but a computer.

In 2003 the human genome was sequenced. It cost billions of dollars and took up the energies of people in over 160 labs. Now you can buy a sequencing machine for a few thousand and sequence your own genome overnight. Or pay 23andMe $99. By this time next year it'll probably be half that.

Synthetic biology, according to Venter, will change everyone's life at some point. Its upside: we can make microbes that eat carbon dioxide. We can generate flu vaccines almost overnight. Tiny critters that generate clean biofuels, cheaper than and as efficient as fossil fuels, seem possible. The brilliant Drew Endy of Stanford is gung-ho about genetic engineering and synthetic biology, claiming it already constitutes 2% of the Unistat economy and is growing at 12% annually.

Venter commissioned a panel to study the potential issues in public health and national security arising from synbio. Two big problems jumped out:

1.) Synthetic biological work had become so cheap that most of the people doing it weren't even trained biologists, so there was no consensus about standards, ethics or safety.

2.) What standards governments and international bodies did have were ten years old, and so may as well have been 100 years old.

You're probably wondering what I'm wondering: when will someone get hold of some genome of a relatively benign virus or bacteria, tweak it using known methods, then use it as a bioweapon?

You can email a genetic sequence to someone else. You need to buy a few things to tinker with, but it's doable. I'm trying to spook you for Halloween. Is it working yet?

In the 18th century, Giambattista Vico, countering Rene Descartes, asserted that humans can only know what they have made. Only true understanding can come from something the mind makes, and Descartes's notion about "distinct ideas" in the mind as a basis for philosophy was flawed because we did not make the mind; Descartes was doing metaphysics. Vico called his principle, verum factum. That which is true and that which is made convert into each other; anything else is an abstraction. (I linked Vico's idea to Niels Bohr's Copenhagen Interpretation of quantum mechanics HERE, in case anyone wants to see how bent I can get.)

Back to biology: there's GOF, short for Gain Of Function, which is also growing at an exponential rate, or at least ultra-quickly. Here's how it applies to the Pandora's Box of synbio: biologists attempt to combat some potentially horrific pathogen by creating it in the lab, so they can then figure out a way to develop a vaccine for it. We can only know what we have made, as Vico said.

At a conference for scientists, a researcher said that he'd tinkered with the H5N1 virus then being talked about as a potential killer of millions, if it mutated. It's an influenza virus, but he tinkered with it so that one host could infect another through the air. Then another researcher said he'd done the same thing. They both published their papers, in the bigtime journals Science and Nature. They knew what they had done could be interpreted as reckless, and indeed: both journals were persuaded to omit the part of each biologist's work that detailed the techniques by which they took a dangerous virus and made it far more dangerous, because who knows which band of deranged and sick mo fos would read this stuff and get ideas? And carry it off? (Besides The State, of course, by which I mean: Google "Tuskegee Syphilis Study.")

But...can you really keep info under wraps? ("Paging Mr. Snowden! Mr. Edward Snowden; Please come to the white courtesy phone...")

In reading about the uncooperative governments (SARS in China, anyone?), the paranoia about Western governmental power (read up on Indonesia and its lethal avian flu outbreaks), governmental snafus, differences between countries, and just how hopelessly behind the curve biosecurity experts are in Unistat alone...I'm not sanguine, friends. It's only a matter of time. Let us pray the international bioterrorists make a crucial mistake and the deaths are limited.

However, when it does happen? There's nothing more paranoia-inducing than a massively-social-mediated group of people terrified of the invisible death-bringing entities that may be in the very air they're breathing. All bets are off, and it seems just the thing to get Ted Cruz elected President. (Then: watch out, "liberals": all that NSA data could be gunnin' fer ya!)

With seven billion on the planet now, even if a pandemic arose "naturally" and killed off 3-5% of the population (like the Spanish Flu of 1918 did), how much more paranoid are we now than then? Many people who didn't die will go to their grave convinced the Other was responsible...

I hope you're scared now, or I'm not doing my job, on this, October 29...

The old Biology: you observed life from outside that life, wondered about details and behavior and then dissected to see how it worked, or placed the life in some environment and observed.

The new Biology: you're an engineer: you know the life-form because you created it, from genomic information and computer models. Now you watch to see how it plays out. If it moves, eats, respires and replicates, you've created a new species!

So...yea. The scary part is anyone with a serious political beef, or simple hatred, can align with others and send away for stuff and do what's called 4-D printing: those microbes that were just info on a screen are now ready to be released into your enemy's territory. You send away for stuff, you use steganography (al-Qaeda once left a code in a porn video). Sequencers are cheap. The data is there. One speed bump for the bad guys: many biotech companies keep track of "nucleotides of concern": any known dangerous sequences are flagged, and somebody asks: who is it that wants this info?
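If you're curious what that kind of screening looks like in the abstract, here's a minimal toy sketch (everything in it is hypothetical: the "flagged" sequences below are made up, the names are mine, and real screening services do fuzzy matching against curated pathogen databases, not exact lookups like this):

```python
# Toy illustration of screening a DNA-synthesis order against a
# blocklist of "sequences of concern." The entries are invented
# placeholders, not real pathogen DNA.

SEQUENCES_OF_CONCERN = {
    "ATGCGTACGTTAGC",   # hypothetical 14-mer
    "GGGTTTAAACCCGG",   # hypothetical 14-mer
}

def flag_order(order_seq: str, window: int = 14) -> bool:
    """Return True if any window of the order matches a flagged sequence."""
    order_seq = order_seq.upper()
    return any(
        order_seq[i:i + window] in SEQUENCES_OF_CONCERN
        for i in range(len(order_seq) - window + 1)
    )

print(flag_order("TTATGCGTACGTTAGCAA"))  # True: contains a flagged 14-mer
print(flag_order("CCCCCCCCCCCCCCCCCC"))  # False: nothing matches
```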

So: we have bioterror security experts who aren't sure how to determine threats, or whether a threat is all that important; they don't know how to surveil those who'd go the whole nine and release something unspeakable, and they're not sure how to combat the pathogens anyway. Supposedly the international community is getting its act together along these lines. But...let's recall some sobering facts: in 2002 at SUNY Stony Brook, researchers took the genetic code for polio and made that virus. Because...verum factum, and Gain Of Function (GOF). If we truly know these bad boys we stand a chance of combating them when they come at us.

And let's not forget that in 2005 researchers sequenced the 1918 Spanish flu virus that killed 50 million people. They sequenced it...and then of course they made it. And the speed and cost of doing this is becoming ever-quicker and ever-cheaper. Just think: the Spanish Flu killed 50 million, but its case fatality rate was only around 2.5%. H5N1, on the other hand, has killed 59% of the people known to have been infected by it. Can you imagine a huge batch of H5N1 tweaked (as two researchers have already done) to become transmissible via air?
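A quick back-of-the-envelope with just those two numbers (mine, nothing fancier):

\[
\frac{59\%}{2.5\%} \approx 24
\]

An airborne bug with H5N1's observed case fatality rate would be roughly 24 times as lethal, per infection, as the 1918 flu.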

(By the way: now is not the time to read this article about how Unistat labs are insecure. Just don't read it, or it might even bum out your Halloween.)

Other Bad Signs: in Unistat the CDC and NIH don't have the infrastructure to develop massive amounts of vaccine for something that might appear. How many would need that magic shot or pill? Not as many as the hundreds of millions who'd take Lipitor or Viagra, paying for it all and making investors happy. Big Pharma is in the Big Money game; they cannot afford to spend an estimated $700 million to $1 billion to develop a vaccine or pill when, maybe, after the bioterror attack, quarantine and international cooperation stop the spread. There's no money in that! (SARS was stifled largely because of quarantine and cooperation.)

To sum up: synbio offers incredible promise, but just one really "successful" bioterror attack by angry young men who take their own version of a merciless God and some old border dispute very seriously...and life on Earth will have truly changed, and not in a good way. Because we have cops and monitors on one hand, but cheap technology, sheer fluid-like information and motivated ingenuity on the other hand. (Please make sure you wash both hands, thoroughly, when you're done reading this morose report.)

Dr. Frankenstein's imperative makes every day from here on out all the more fraught with drama, eh?

Happy Halloween! Muahahahahaha<cough>ahamuahahaha! Okay that's it: I may have failed to scare you, but in writing this - consulting 13 articles and taking notes - I've grown pallid, anemic and weak in my anxiety attack, and it further sickens me to say, "Well, I just hope that all happens after I'm dead and gone, 'cuz..." What kind of morality is that? It's like saying, "I hope all-out nuclear war happens after I'm dead, while your children are still around to experience it."

Now if you'll excuse me. I need to go rest. Oy! (No, but seriously: don't drink and drive on Halloween.)



Wednesday, October 23, 2013

Metaphysics and Overspecialization: A Meander

Woody Allen once talked about the time when he was expelled from college because he was caught cheating during the final exam in Metaphysics, when he "looked into the soul of the boy sitting next to me."

Aristotle
The subject of metaphysics is something I will never truly understand, but I'm cool with it. It's a blast to try to understand. There are many, many metaphysical roads to take from Aristotle, who is generally credited with being the first to tackle metaphysics as a "science" or a topic in philosophical thought, even though he didn't use the word, apparently. Along with a lot of hand-waving and trying to field answers from students about "ultimate things," he variously called what his translators have labeled metaphysics: "theology," "first philosophy," "first science," and "wisdom." When I read him, he wants to get at the "first cause" of things. He wants to talk about ontology, or the Being-ness of stuff. (Kant put epistemology as the "first philosophy," but I'm getting ahead of myself.) The part that has most intrigued me lately is the search for that which does not change. Given my understanding of physics, I'm not sure there "is" anything that does not change, but it's an interesting idea to think with, and I guess Aristotle was influenced by Plato here, at least a little.

The semantics of "metaphysics" among non-professional philosophers (like the OG) has always seemed a mess, but as I get older, that bothers me less and less. Just think of what metaphysics implies: thinking about things that are above physics. Or beyond physics. It's supposedly a topic that addresses those things that have no mass, no atomic or subatomic structure and no energy. I usually see the topic as what Max Stirner called a "spook": we humans can make up all kinds of things and ideas that simply don't exist, and then reify those "things." And yet, as some sort of humanist type, this notion of metaphysics goes back so far...it's a part of us. And therefore, it can't be negligible, even if it's just made out of words.

Aristotle's origin of all things was with the Prime Mover. For a good time over the next month, mentally insert "Prime Mover" in place of "God" every time anyone says it or writes it...or you think of It. Report your results! ("For Prime Mover's sake! Put the toilet seat down once every blue moon!")

Words
I remember where I was when it happened. I was sitting in a room a few blocks from the Pacific Ocean, near the Los Angeles harbor. I was half-awake and listening to some scientist answer questions on the radio. He said something about language and neuroscience and I started to perk up and listen attentively: it turns out that language, our words, have physical status. It's hard to pin down, but neural imaging, studies of brain-damaged people, and our understanding of synapses and learning all suggest that the words we use are tied up with larger neural clusters (made of atomic goo and having some weight and mass) that have to do with our being human beings with bodies and living in a world with language...but the words themselves have some physical, ontological status, even if it's hazy and difficult to pin down. They're taking up neural space. It made sense: language does not Speak from On High to us. It's not "out there" and emanating from some Superior Being. It's a biological property, and for abstruse evolutionary reasons we developed it to a very high degree, compared with our other-ape cousins. And a lot of it seems localized in the brain - language, that is - and it's so enchanting to other parts of our brain that most of us seem to think that language "really does" reflect "reality." It "actually" maps anything "out there" into words, in a perfect fit. If we're stunned and "can't find the words to express...," it's only due to some temporary imbalance. Possibly of the humors. Or not enough coffee. Too much wine. Not enough sleep. You're freakin' stoned again?

Topics
What are our thought-chains that lead us back to privately pondering the origin of matter, what happens after death, whether there's a superior, even transcendent intelligence inhering somewhere, or a perennial favorite: why is there something rather than nothing?...or why we find ourselves getting all worked up over other peoples' answers to these questions? For me, often: mortality thoughts. And, especially in groups of friends and acquaintances of "curious, breathing, laughing flesh"(Whitman), we're already outside our ordinary "reality": drinking some wine or other inebriant accomplices, jousting with witticisms, stoned on weed, euphoric in music, coming down softly from a whirl of flirtation. I know these states get me going on some metaphysical topic, but often I keep it to myself. Although I do like to hear where you stand. And why.

[image: Habermas]

Jürgen Habermas (and Marx)
I see Habermas as the Noam Chomsky of the European Union, committed to rationality and saving Europe from monied interests. (To my German and other European readers: I apologize in advance for the paltry riffs on Habermas I'm about to play.) Habermas, now in his 80s, is still fighting for something saner. He's made splashes in legal theory, political theory, sociology, psychology. (Here's a blogger-champion to read on him.)

I first became interested in Habermas when I heard a lecture about his idea of an "ideal speech situation," an idea that seemed to come out of his historical views on the rise of literacy and media, coffee houses and newspapers. Everyone should be allowed to speak their view, without fear of recrimination. Metaphysical appeals are the wrong way: we should talk about what's demonstrably "real." Only rational thought will save us. With enough of this massively democratic speech, the better ideas will out. I'm simplifying to an absurd degree.

Anyway, since the early 1980s - when Habermas was advocating no metaphysics in public speech about our life conditions - he's gradually softened up. He now believes that the discourse of religion has its place in public speech in his massively democratic ideology. Even though he confesses he's "unmusical" when it comes to religion (borrowing a phrase from one of his biggest influences, Max Weber), Habermas thinks there's no getting around the impulse to religious thought. Even though he still seems to be an atheist, he's made his peace with religion while trying to maintain his Kantian-Enlightenment rationalistic ideal speech situation idea.

Peter Berger reviews Habermas's slow move toward allowing religion/metaphysics. I agree that Habermas is like Edward Gibbon's magistrate, who finds the various religious beliefs of the populace "useful."

Being a carrier of the Critical Theory tradition (even though he had some cogent critiques of Adorno, Horkheimer, et al. and their opposition to "instrumental rationality"), Habermas is thoroughly steeped in Marx's ideas about religion, that it was "the sigh of an oppressed creature, the heart of a heartless world, and the soul of soulless conditions," and that it was the "opiate of the people." Marx: "The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up on a condition that requires illusions. The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo." - Contribution To The Critique of Hegel's Philosophy of Right, 1843.

A common thumbnail reading goes something like this: Religion is a conspiracy, mostly by the Ruling Class, to conceal from Workers the actual reasons for our unhappiness. I think anyone pondering Marx (or The New Atheists, for that matter) ought realize the ambiguity here in Marx. For he might also be saying: religion has the correct insight that our suffering must be overcome and that what we really desire ought to be satisfied, but in looking toward religion we make a fundamental error, deriving our happiness through metaphysics rather than through the nitty-gritty mundane, materialist world, which requires knowledge and action.

Anyway, one of the most renowned thinkers in the European Union had at the center of his social ideas the rejection of appeals to metaphysics as a basis for rational understanding, and now he's allowing metaphysics into that program. Which now seems primarily aimed at saving the idea of a relatively sane Europe.

I've often wondered, in the years since I read Habermas's Theory of Communicative Action (I read volume 1 and only thumbed through volume 2), whether the ideal speakers in his ideal democracy might need to know about more than what they've specialized in, because I've been in rooms with far too many well-educated specialists who can't understand why the other guy ain't seein' it all from his angle. About which more below...

Very Brief Take on Philosopher Kings
I still get a charge out of the far-more-ancient-than-Aristotle Chinese view of metaphysics in the Tao Te Ching: "That which is above matter is the Tao." Hey: it's a decent take. Or at least it works for me.

There was a time when the Schoolmen, the Scholastics, doing philosophy as theologians, were The Cheese intellectuals in the West. When they decided to hold the Renaissance, starting on January 1, 1500, some Humanists, artists, engineers, poets and political philosophers began to get a piece of the action. By around 1860, Natural Philosophy (AKA "science") began to rack up win after win. And this held sway through the Roaring 20th century.

Richard Rorty said the Philosophers had always insisted that, no matter what others thought, theirs was The Cheese all along. They had constructed a bunch of elaborate systems that placed something between the individual and the world: Mind. Language. History's Laws. But theirs was the discipline ne plus ultra.

Whatever; it had always been assumed, says Rorty in his Philosophy and the Mirror of Nature, that the role of philosophers was meta-cultural criticism, the assumption being that only philosophers had a "God's Eye View" on all the other sub-disciplines and fields of study. It was even up to philosophers to tell the lesser historians or economists or psychologists or anthropologists to do more of this, less of that. We're never going to arrive at a One True Real Copy of Reality if you keep doing that sort of fieldwork! Do something else. The very picture of Plato's Philosopher Kings. As the renegade Marxist sociologist Alvin Gouldner called it: a Platonic Complex.

Rorty says, enough with the idea of architectonic disciplines: the true role of philosophers is to live up to the name: love wisdom. And we do that by being Generalists: we read about popular culture and wonder what it means for sociology. We talk to some historians about medicine and get some ideas there...how can this all "hang together"? We read the philosophy of science and then about actual conditions in labs and see if we find something there. We look at marginalized discourses and books and authors and then make conversations about what they may have to offer that is being missed by those not being marginalized. We wonder about happiness and political power and economics and language and quantum mechanics and Dark Matter...and how it all hangs together. Or might.

Rorty says: enough with the Philosopher King role. It never worked and was pretentious and it alienated philosophers from a more valuable role: as messengers between disciplines. Generalists.

(Right now witness the Third Culturalists trying to assume the role of Philosopher Kings, and attacks from the traditional Humanities and other places. Maybe start HERE. How much of it has to do with funding and prestige?)

Contra people like Pinker and Dawkins and (what I see as) their sophisticated scientism, I do not abide by the idea that there exists any meta-discourse, anywhere. (As of October, 2013)

[image: rendering of Margaret Fuller]

The Fascinating Case of Buckminster Fuller's Metaphysics
Talk about a Generalist! And yet, as I parse Fuller's books, I always get the feeling that, as much as he paid lip service to economics, sociology, poetry, and the humanities, he thinks (he died in 1983, but his ideas are still alive for me) Science is a meta-discourse. And metaphysics actuates science.

So what is metaphysics, according to Fuller? Scientific laws that express a tremendous amount of generalization from a dizzying welter of individual cases. Or, an example in Bucky-speak:

Humans are unique in respect to all other creatures in that they also have minds that can discover constantly varying interrelationships existing only between a number of special case experiences as individually apprehended by their brains, which covarying interrelationship rates can only be expressed mathematically. For example, human minds discovered the law of relative interattractiveness of celestial bodies, whose initial intensity is the product of masses of any two such celestial bodies, while the force of whose interattractiveness varies inversely as the second power of the arithmetical interdistancing increases.
-Critical Path, p.63
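In plainer notation (my gloss, not Bucky's): that "law of relative interattractiveness" is Newton's law of universal gravitation, the force between two bodies being proportional to the product of their masses and inversely proportional to the square of the distance between them:

\[
F = G \, \frac{m_1 m_2}{r^2}
\]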

Fuller thinks that humans, constantly looking into Nature, using their Minds (different than the brain), discover generalities expressed in the language of math. As time goes on, these generalities get honed and become evermore exact and interaccomodative. (<----I just used a word that I'm not sure even exists, but every time I study Bucky I get infected with his unique verb-ifying language style, so I say what the hell and let 'er fly.)

But this bit about the Mind not being the same as the brain? Well, first let's get to Fuller's conception of God:

Acknowledging the mathematically elegant intellectual integrity of eternally regenerative Universe is one way of identifying God. 

Ohhh...another Platonist. Hey, whatever floats your Dymaxion House!

God may also be identified as the synergy of the interbehavioral relationships of all the principles unpredicted by the behaviors of characteristics of any of the principles considered only separately. 

Recall: Fuller is the grandnephew of American Transcendentalist Margaret Fuller. There's something genetic. Nevertheless, Fuller seemed Leonardo enough for the 20th century.

Oh yea: Mind does not equal Brain:

Brains always and only coordinate the special case information progressively apprehended in pure principle by the separate senses operating in pure mathematical-frequency principle. Brain then sorts out the information to describe and identify whole-system characteristics, storing them in the memory bank as system concepts for single or multiple recall for principle-seeking consideration and reconsideration as system integrities by searching and ever reassessing mind. 

Okay, this brain sounds pretty impressive to me. How can anything be better than that? Well, here's how Bucky conceived mind:

Only minds have the capability to discover principles. Once in a very great while scientists' minds discover principles and put them to rigorous physical test before accepting them as principle. More often theologists or others discover principles but do not subject them to the rigorous physical-special-case testing before accepting and employing them as working-assumption principles.
-pp.159-160, Critical Path

The mind, unlike the brain, is weightless, massless, colorless, and not detected by any instrument that I know of. Furthermore, for Fuller, the physical principles that actually work to run our world of technics and know-how are also weightless, massless, odorless, colorless, and they don't take in or emit energy, etc.:

Mind and general physical principles, generalized, are metaphysical entities. And their synergy runs the world.

Fuller, in book after book, is able to think about our lives and educations and be somewhat dispassionate about the way we were trained to think of inquiry and knowledge as separate entities. At other times he sees this as something like a conspiracy against Mind by powerful interests. Why so much at stake? Because specialization gets you extinct. And we need as many people to think in creative, generalistic ways as possible if we are to avert catastrophe. Think of his God, his idea of Mind, your Mind. Does it make sense? In the introductory chapter to Synergetics he sees specialization as fostering isolation, futility and confused feelings. Humanity is "deprived of comprehensive understanding." Understanding based on the soundest metaphysical principles. Because most of us in the West were educated to specialize, we tend to abandon personal responsibility for thinking of the Big Picture and taking social action. We let others deal with the big stuff. He doesn't say it, but he seems to equate specialization with marginalization. "Specialization's preoccupation with parts deliberately forfeits the opportunity to apprehend and comprehend what is provided exclusively by synergy."

Fuller sees art, science, economics and "ideology" all as having separate "drives" and "complexedly interacting trends" which could be understood via synergetics, but hardly anyone "in" one of those fields seems to believe this. This is threatening to the survival of the species. Giant pandas only eat bamboo. When the bamboo is gone, the panda is gone. 99% of all species that ever existed are extinct (or a like number; it's not good news), and not all extinctions were due to specialization or overspecialization, but there have been enough extinctions, presumably, due to this short-sightedness. And we're supposed to have all the tools! We live at the equator and near the Arctic Circle, in rain forests and deserts, savannas and at 10,000 feet above sea level.

Related to this, here may be one of the brainiest conspiracy theories you'll ever read:

We have also noted how the power structures successively dominant over human affairs had for aeons successfully imposed a "specialization" upon the intellectually bright and physically talented members of society as a reliable means of keeping them academically and professionally divided - ergo, "conquered," powerless. The separate individuals' special, expert glimpses of the separate, invisible reality increments became so infinitesimally fractionated and narrow that they gave no hint of the significant part their work played in the omni-integrating evolutionary flow of total knowledge and its power-structure exploitability in contradistinction to its omni-humanity-advancing potentials. Thus the few became uselessly overadvantaged instead of the many becoming regeneratively ever more universally advantaged.
--p.162, Critical Path

In a slim, criminally underrated and under-read book, GRUNCH of Giants, Fuller goes into the history of this conspiracy by the very few to use the "wizards" for their own control of wealth and power. And if you can get with the prose style, you might find it very rewarding.

With this I abandon my typing with the idea that we've specialized too much; we've been marginalized, the survival of our species is at stake, and the deepest synergetic nexus of survival and real wealth is metaphysical know-how. I had no idea I'd end up here. Adieu!

P.S.: Not long ago I was delving around in the philosopher Willard Van Orman Quine, and in 1948 he seems to have thought very much like Fuller on these topics. I wonder if Fuller influenced him, or Quine influenced Fuller, or this is another of those convergences that Charles Fort described as "It's steam engines when it comes steam engine time." In 1948, in the essay "On What There Is," Quine said that our best scientific theories "carry an ontological commitment" to objects whose existence is incompatible with nominalism.

[image: Buckminster Fuller]

Thursday, October 17, 2013

The Demonic Powers of (Some) Books: A Take or Three

A while back one of my intellectual colleagues urged me to read Fritz Leiber's novel Our Lady of Darkness, and if you haven't read it yet, it's October and the perfect time to get down to the library and read this thing. It's even better if you live near San Francisco, as it's set there. Leiber, influenced by Lovecraft and Clark Ashton Smith and Montague Rhodes James, uses some Jungian riffs and gets off a tremendous work I couldn't put down. It's weird, realistic, creepy, and destabilizing, somewhat artsy in style and yet a page-turner. Because I'm all out of breath I'll just say Yo This Is An Amazing Book. It's perfect for Halloween-times. (As I link the title to Amazon - I take no money from them - I noted the reviews were less than 4.5 stars...which is simply absurd, trust me on this.)

Have you ever been in a Big City and felt like It had something to say? As if there were signs all around, but you didn't quite have the key to read the language?

Leiber posits a secret art of reading Cities, and predicting and manipulating the future, via Megapolisomancy, and a dark character named Thibault De Castries literally wrote the book on this art. Everything that makes up the metropolis: steel, wire, and cement; paper, rubber and bricks...has always had effects on humans throughout history. The effects are physiological, psychological, and, perhaps most importantly: hyper-psychological. I'd say "parapsychological" but this could be misconstrued. It's creepier than that. Castries also wrote The Grand Cipher, but I don't want to say too much here. Ever since I finished Leiber, my forays in the City - always an expedition in psychogeography - have never felt the same. It's those damned...elementals emanating from the stuff the City is made from. But I won't go into it. Save for the utterly demonic aspect of Leiber's novel.

[image: Fritz Leiber]

"Demonic"? Aye, but not in the American evangelical's sense. The word's had a peculiar evolution. Everyone who's studied any philosophy knows that Socrates attributed whatever he "knew" to his daemon: a voice that spoke to him. This demonic voice was associated with Divine Knowledge. And I remember reading how Goethe was so blown away by JS Bach he said Bach was demonic.

In late 18th and early 19th century Europe, Goethe, highly influenced by Hamann and Herder, saw uncanny creative genius as "demonic." Goethe seems fairly demonic his own self, but that reminds me of one of his books, The Sorrows of Young Werther. It made the demonic Goethe a huge celebrity writer-star at age 24, and was based on autobiographical elements that Goethe later regretted sharing with the world: a very romantic young man's unrequited love leads him to suicide. And the book was responsible for "copycat" suicides in real life. Is it Goethe's fault? The book's fault? The culture's fault?

I used to say it's a combination of all three, but mostly the culture. Now I prefer to attribute the suicides to the book more than the culture or Goethe. I have my reasons. It seems to me the demonic in the 19th century sense is probably at large in every culture, almost everywhen. And while the demonic powers resided in Goethe's nervous system, those books, once disseminated throughout Germany and then the rest of Europe, went out of Goethe's hands. If the culture's "right," then you get readers who succumb to something irrational they see in the book. But the Book actuated the suicides. Goethe's writing resonated so strongly with young people who saw in themselves aspects of the fictional character. And they killed themselves.

Other books are linked to killers. Demonic?



A confession: Here's where I realize I'm a bit...off: I'm bibliomane enough to admit to a Walter Mitty thrill that books can have such powers over humans.

Stephen King voluntarily pulled his novel Rage, a work he started while in high school, because it might prove an "accelerant" to school gun violence, already notorious in Unistat. I can see his point. Already it looks like maybe there was a copycat killing. And yet: is it a publicity stunt? Something to garner a heavier demand for the novel? Am I being cynical? King says guns aren't the problem in Unistat; it's the Kardashianization of culture that's the problem, and King himself owns guns and is a big 2nd Amendment guy.

Now hold on, wait a minute: if I assert the absurdity of blaming Marilyn Manson for the Columbine killers, or Judas Priest or Ozzy Osbourne for other self-inflicted deaths of Unistatian teens, why do I grant the book medium powers I deny those musical texts? Good Question. Here's how I've negotiated it: in reading interviews and seeing the rock stars talk about their work - and I'm thoroughly acquainted with their music, by the way - I believe the musicians when they say they're writing that music for the joy and fun of it, and Ozzy liked to ask: who believes Vincent Price was an actor who meant harm to his audience?

The writer of a book is working with the nature of the book, the reading of which is almost exclusively solitary, and silent. Reading a novel makes demands on the nervous system that are unique to the act of reading and certainly different from the apprehension of auditory musical texts. But it's the intent and subjectivity of the Author, combined with the phenomenology and physiology of reading books, that makes some of them...demonic.



It seems only fair to ask the author of a book that might possibly cause untoward (or desirable) effects on its readers to warn them in some way, but the very nature of fiction and the unheimlich aspects of the demonic...seem to violate the rules of the game. However, a warning or notice is issued from time to time. The fair warning. For example, in a series of putatively "non-fiction" postscripts to a 700+-page surrealistic novel, Robert Anton Wilson tells his readers:

This book, being part of the only serious conspiracy it describes [...] has programmed the reader in ways that he or she will not understand for a period of months (or perhaps years) [...] Officials at Harvard thought Dr. Timothy Leary was joking when he warned that students should not be allowed to indiscriminately remove dangerous, habit-forming books from the library unless each student proves a definite need for each volume.
-Illuminatus! Trilogy, p.774, omnibus ed.

Who among us can withhold admiration for the author who embeds a non-existent text within the actual text so overwhelmingly vividly that later generations are moved to actually produce a "real" version of the once-embedded imaginary book? One might think immediately of the Necronomicon. But this has been going on for some time. Here's Francis King:

Someone has only to announce the existence of a mysterious book, or an even more mysterious occult fraternity, and there will always be those who are prepared to produce the required article or organization - usually for a suitably large fee. For example, no one had heard of any alchemical writings of the early English St. Dunstan until the Elizabethan magician Edward Kelly stated that he had found a strange red powder of projection and The Book of St. Dunstan, describing how to use this same red powder for the purpose of transmuting base metals into gold, in the ruins of Glastonbury Abbey. Nevertheless, within fifty years of Kelly first making his claim to this discovery no less than half a dozen alchemical tracts had been printed, all of them differing one from another, and each claiming to be the sole authentic Book of St. Dunstan.
-Sexuality, Magic and Perversion, pp.5-6

But these wild, inspired imaginings that go viral: they act as palimpsests; they infuse and infect and imbue the gesticulations and ideation of far-flung gens, dead ignorant of their originations. Fer crissake: look at the abominable life of The Protocols of the Learned Elders of Zion. Now the Priory of Sion has momentum. The Gemstone File and The Octopus will fuel conspiracy thinking for a long while yet. These works might be thought of as "non-fiction," but they seem somehow like hyperfiction to me. They are demonic, but not in Goethe's sense. And there are too many to name.

The prolific historian Philip Jenkins traced the origin of satanic panics in 1980s Unistat to a 1926 novel written by Herbert S. Gorman titled The Place Called Dagon. Lovecraft himself was influenced by this novel. What's sorta odd (a digression!) to me: Gorman was the first biographer of James Joyce, his 1924 book receiving much help from Joyce himself, and now thought to be a wonderful source for how Joyce wanted to be perceived. Gorman was a busy writer and he could have had no inkling that, 55 years later, a strain of high-strung xtian PTA types would read his novel and get ideas. So to wrap up this digression: we have a bizarre synchro-mesh of a newspaper reporter and novelist, Lovecraft, Joyce, and the McMartin preschool debacle, among others...

Demonic?



Peter Lamborn Wilson: "The world of apocrypha is a world of books made real, which may well be understood and appreciated by readers of Borges, Calvino, Lewis Carroll - or certain sufis. The apocryphal imagination turns 'Tibet' or 'Egypt' into an amulet or mantram with which to unlock an 'other world', most real in dreams and books and dreams of books, visions induced by holy fasting or noxious alchemic fumes."
-Sacred Drift: Essays on the Margins of Islam, p.22

More PLW: "According to the Manicheans, books might be Angels, living personifications of the Word from On High - or from elsewhere, from another reality. There exist angelic alphabets. The British magus and alchemist, John Dee, received angelic transmissions in the Enochian alphabet, and Jewish magicians used angelic letters in their amulets and Kabbalistic meditations."
-The Little Book of Angels, p.6

A final thought from PLW: "The crude truth is perhaps that texts can only change reality when they inspire readers to see and act, rather than merely see. [...] Just as there exist books which have inspired earthshaking crimes we would like to broadcast texts which cause hearers to seize (or at least make a grab for) the happiness God denies us. Exhortations to hijack reality. But even more we would like to purge our lives of everything which obstructs or delays us from setting out - not to sell guns or slaves in Abyssinia - not to be either robbers or cops - not to escape the world or to rule it but to open ourselves to difference. I share with the most reactionary moralists the presumption that art can really affect reality in this way, and I despise the liberals who say all art should be permitted because - after all - it's only art."
-Immediatism, Essays by Hakim Bey, pp.57-58

Maybe I ought remember my William James and think about the predispositions of readers who might "allow" a book to take hold of them, influencing but not causing them to act in a way a contemporary evangelist would deem "demonic." It would seem James's "tender-minded" might be more prone to the lure of such books than his "tough-minded." Maybe Erik Davis is right when he writes of Lovecraft's doomed protagonists, bookish types (like some people we know?) whose "intellectual curiosity drives them to pore through forbidden books or local folklore."

"district attorneys hunt for books so evil they are not protected by the First Amendment..." - RAW, p.8, Everything Is Under Control

Okay, for today I'm ready to call this a wash; suffice it to say that only some books are demonic, as are some authors (only they might not know it); culture has some skin in this demonic game, and I'm not sure how much. Writing has always been associated with magic, danger, the demonic. Let us try not to forget it...

[image: Thoth, who seems to have started this whole damned thing.]


Friday, October 11, 2013

Euclidean Quotidian: 90 Degree Angles and the Semantic Unconscious

Ten Scattershot Ideas, One For Each Finger and Two Thumbs

1.) Supposedly the medieval Europeans thought the author of Euclid's works was the same man we know as Eucleides of Megara, so olde books about geometry in Europe were attributed to "Megarensis." They weren't the same dude: "Megarensis" was a contemporary of Plato; the great Euclid of high school geometry was closer to being a contemporary of some of Plato's early students.

The Arabs got hold of Euclid and thought the name was made of ucli (the key) and dis (measure). At any rate, his Elements was the model of rationality ne plus ultra, and I'm writing this piece after pondering Euclid's influence on two philosophers, Vico and Spinoza, who were not the first to mimic the potent rhetorical form and structure of Euclid.

2.) In Peter Thonemann's review of three books for the TLS, note the story of the Malawi girl who, charged with learning how to set a dinner table English-style, experienced a steep learning curve, because the world she grew up in was curvilinear; there were no right angles. We all had to learn, at some point, the "order of things" we take for granted as "the way things are done." I also thought it interesting that as the Romans rolled through the peoples of Europe, they brought right angles and rectangles and ideas about straight lines and order with them, the Irish being the last to "convert," and it went along with Christianity.

There's a question of the "reading" of artefacts from the long-dead: if they built with right angles, was their social structure more authoritarian? Some think so. Others think what matters is the initial posit and then the iterated forms that grew from there. Mikhail Okhitovich, a Soviet sociological thinker of the 1930s, asserted that right angles originated with private land ownership, then extended to architectural forms, and represent a non-communistic mode of thought; because of this, curvilinear forms in architecture were the best and most egalitarian.



Before rigid hierarchical forms of State, what was often found were circular forms, which have a center but seem to resist hierarchy...on some level. Do Euclidean forms give rise to a form of thought that permeates a culture, and if so, is this idea mostly unconscious, part of the paideuma?

Many non-communist Left-ish thinkers have assumed that dwellings based on rectangles and 90 degree angles were somehow metaphors for artificiality, non-organicism, or simply convention, and living in "boxes" tended to encourage conformist social ideas and a stifling of creativity. Look at any fat book on great 20th century architects and buildings. Look at Buckminster Fuller.

3.) A pop kulch example of a leftist strain in American thought is found in this folk song: "Little Boxes." Boxes and conformity. Boxes and restraint. Boxes and the suburbs, Levittowns.

4.) The distaste for "boxes" runs in countless intellectual and aesthetic fields. While Nietzsche lays out with this probe: "Mathematics would certainly not have come into existence if one had known from the beginning that there was no exactly straight line, no actual circle, no absolute magnitude," and we are left to wonder, our contemporary Nassim Nicholas Taleb writes in his Bed of Procrustes, "They are born, then put in a box; they go home to live in a box; they study by ticking boxes; the go to what is called 'work' in a box, where they sit in their cubicle box; they drive to the grocery store in a box to buy food in a box; they go to the gym in a box to sit in a box; they talk about thinking 'outside the box'; and when they die they are put in a box. All boxes, Euclidean, smooth boxes." (p.31)

5.) Art critic Jed Perl wonders about the state of painting and painters in today's art world. At one time the rectangular frame of the painting was a given. The artist played an outre role in society. But now practically all competing media are either rectangle-shaped (iPod/iPad/iPhone?), or text is read within a rectangular-ish frame (the screen you're using now?); further: images in the most popular media are dynamic inside a rectangular frame: TV, films, the camera frame. Could it be that the "degree of stabilizing supremacy of that rectangle has been undermined by the technology that surrounds us?" Perl asks. He knows painters. It's his milieu. And Perl asserts that today's painter, because of the static image inside a rectangle, has been forced to go on the defensive or offensive, which presents a new hindrance. At the same time, Perl asserts that painting is not dead.

6.) In what appears to be an untitled poem, Tony Quagliano writes:

I read this poem about geometry
or shadows
or was it poetics, or
some analogy among the three---
that sounds right
a poem about science and art
itself some artful connection
opting for the poem of course (being a poem) slyly
saying math's impure
or at least not pure enough
for one geometer not impressed by Euclid
or more impressed by non-Euclid
or some such twist
and what gets me, why I mention this at all, is
that the poem was good

though no one bled directly in it
words were clean, scientific
stitched in artful lines for the anthologist
and while a slashed wrist would have to wait
this poem of shadows, or math
or some connection in the courtyard of art
this fragile suture, poet to geometer, takes life
over your dead body
and mine

and it was good
which is why I mention this at all.
-p.65, Language Matters: Selected Poetry

7.) I remember reading about some hotshot engineering students - probably at CalTech? - and the problem of stacking oranges at the grocery store. Because of their roundness, there's a lot of non-used-up space (AKA "air") between oranges. How to maximize the number of oranges stackable? Well, you obviously make square oranges, using the Lego-mind. Easier said than done.
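For the curious, the geometry behind all that "air" is standard sphere-packing fare; a quick sketch with the textbook packing fractions (my numbers, not from the half-remembered article):

```python
from math import pi, sqrt

# Fraction of crate volume actually filled, under different stackings.
simple_cubic = pi / 6        # spheres stacked directly atop one another: ~52.4%
fcc = pi / (3 * sqrt(2))     # the grocer's staggered pyramid (face-centered cubic): ~74.0%
cubes = 1.0                  # Lego-mind "square oranges": 100%, no air at all

print(f"simple cubic:         {simple_cubic:.1%}")
print(f"grocer's stack (FCC): {fcc:.1%}")
print(f"square oranges:       {cubes:.0%}")
```

Kepler guessed, and Thomas Hales eventually proved, that the grocer's ~74% is the best you can do with round oranges; squaring the orange buys you the last 26%.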


I hadn't thought much about shipping containers and how they have made the world seem far smaller and distance irrelevant until I read Andrew Curry's fine piece in Nautilus not long ago. "Invisible to most people, [shipping containers] are fundamental to how practically everything in our consumer-driven lives works." As for packing as much stuff into a space as efficiently as possible, it doesn't get much better than shipping containers. ("Invisible to most people...")

Score one for rectilinearity.

8.) One of the Prophets of Euclidean space and modern consciousness, Marshall McLuhan, in 1968:

The visual sense, alone of our senses, creates the forms of space and time that are uniform, continuous and connected. Euclidean space is the prerogative of visual and literate man. With the advent of electric circuitry and the instant movement of information, Euclidean space recedes, and the non-Euclidean geometries emerge. Lewis Carroll, the Oxford mathematician, was perfectly aware of this change in our world when he took Alice through the looking-glass into the world where each object creates its own space and conditions. To the visual or Euclidean man, objects do not create time and space. They are merely fitted into time and space. The idea of the world as an environment that is more or less fixed is very much the product of literacy and visual assumptions. In his book The Philosophical Impact of Contemporary Physics Milic Capek explains some of the strange confusions in the scientific mind that result from the encounter of the old non-Euclidean spaces of preliterate man with the Euclidean and Newtonian spaces of literate man. The scientists of our time are just as confused as the philosophers, or the teachers, and it is for the reason that Whitehead assigned: they still have the illusion that the new developments are to be fitted into the old space or environment.
-p. 347, Essential McLuhan, from an essay, "The Emperor's New Clothes," originally in Through the Vanishing Point: Space in Poetry and Painting, co-written with Harley Parker. McLuhan asserted in 1968 that "the artist is a person who is especially aware of the challenge and dangers of new environments presented to human sensibility." McLuhan thought artists were subversive because society expected the replication of existing orders and forms, but artists violated these expectations.

Three thoughts:
a.) In 1968 McLuhan may have been far more prophetic than he thought: scientists are still trying to come to terms with non-Euclidean findings in astrophysics, materials science, microbiology and subatomic physics (though I do see some inroads). And going back to Jed Perl's essay on the "state of the art" in painting 45 years later, with McLuhan's "advent of electric circuitry" in mind, I think maybe painting, contra Perl, may be, if not dead, in the ICU. Condition: critical.

b.) When I do that mental yoga which allows me into McLuhan's thought-space, I realize how intensely Euclidean my assumptions seem, as based on the idea of Gutenberg Man and the space of the literate reader of texts, for hours every day, decades on end, eyes decoding 26 symbols with punctuation, left to right, linear left to right, left to right (THIS), left to right, punctuation. In my conditioned assumptions of quotidian reality, objects "really do" fit inside of space and time. I want them to create space and time themselves, by power of their sheer Being, capital be. But most of the time: no. I have to work on it. How do I get out of Gutenberg Euclidean head space? Cannabis, film, walks in nature, animation, humor and surrealism, reading Joyce or Pound, get into the Korzybski-Zen level of the phenomenal event-level, pre-language, observing without hypnotizing and misleading "woids," and then careful consciousness of abstracting, watching myself abstract until It all melts, or something strange in science. You have your ways.

c.) For such an overwhelmingly "straight" Euclidean man, Prof. McLuhan (whose personal politics were a sort of conservative Catholicism with tinges of anarchy?) had a mind that was, to me, reliably non-Euclidean and psychedelic. His deep immersion in James Joyce and Ezra Pound was probably a significant influence here, but there was so so so so much more. He was an absolute virtuoso at playing with metaphors and combining those ideas with others, if only just to see if they were thrilling and made anyone else want to think about some idea in some new way. I find this an anarchist strain in McLuhan's thought. (How about I take a catholic idea about the senses and think about the new electronic media, like radio or TV? I can add ideas I copped from Thomas Nashe, Wyndham Lewis, Ezra Pound, Harold Innis, and anthropologists. And Finnegans Wake! And mythology, Poe, Einstein, and painting's figure/ground and the rise of the Renaissance's vanishing point? And then: Vico! And commercials and comic strips!? And Walter J. Ong...and and and...)




9.) Robert Anton Wilson's extensions of Timothy Leary's ideas of the evolution of "circuits" in the human mind drew heavily on Euclid for the first three "domesticated primate" aspects of all of us: the oral/biosurvival circuit is about approach/avoidance and is represented in Euclidean metaphor as "forward-back." The second circuit stage of development (according to the theory, we "imprint" all of these circuits), the anal/territorial circuit, is about up/down, and represents the deeper levels of any thinking about politics, whether within the family, local city, national, or international. Notice up/down fits well in Euclidean space-thought.

The third circuit is about right/left and, for mammals like us, is based on the bilateral symmetry of the body and the nervous system, in which nature has seen fit to encourage a dominance of one side over the other, most people's left-hemisphere motor cortex encouraging right-handedness. Conceptual thought and left-right equations (think: algebra!) and logic all fall under the third circuit.

Although neuroscientific ideas about hemispheric specialization in evolution and discrete modules in each of the brain's two hemispheres have moved away from the once-popular notion of the "holistic" right hemisphere and the "linear" left, these metaphors still seem to resonate. For Wilson, right-handedness and math and literacy in symbolic humans indicate a left-hemisphere domination (the left hemisphere controls the right side of the body) which has unconsciously biased "linear" and hierarchical forms in human history, which begins with writing. The right hemisphere remains relatively "silent," seemingly subdued by assumptions about "reality" made by the left hemisphere (especially in industrialized Western humans), and we have yet to harness the intuitive genius housed there.

So much ink has been spilled over these ideas, once extremely popular but now seemingly in a slow descent. Nevertheless, these ideas live, as you may have noticed from a conversation within the past few months. Why?

Well, I think it's because there's still some truth to the right/left brain modularity-of-function idea, although it's not as simple as those who popularized the findings of the Sperry and Gazzaniga "split brain" experiments made it seem. Also: I think Wilson was on to something: "Right-hand dominance, and associated preferences for the linear left-lobe functions of the brain, determine our normal modes of artifact-manufacture and conceptual thought, i.e., third circuit 'mind.' It is no accident, then, that our logic (and our computer-design) follows the either-or, binary structure of these circuits. Nor is it an accident that our geometry, until the last century, has been Euclidean. Euclid's geometry, Aristotle's logic, and Newton's physics are meta-programs synthesizing and generalizing first brain forward-back, second brain up-down and third brain right-left programs." - Cosmic Trigger vol 1, pp.199-200

For Wilson (and Leary) there were relatively "new" circuits that have appeared in human evolution over the last 11,000 years or so. And they seem non-Euclidean, more organic, curvilinear, and more inclusive of a holistic, total-floating body sense, as if we were meant to move through space/time.

To be clear: Euclid and his forebears the Pythagoreans wormed their way into our paideuma due to the natural evolution of mammals on a rocky, watery planet with an atmosphere conducive to carbon-based replicative life forms, under the purview of an energy-source star at a Goldilocks distance. We got Euclidean forms because that's the way we evolved. Which may Beg the Q, but it's one of my favored narratives, and my entire brain, both hemispheres, seems to harmonically resonate with it.

[Further extrapolations from Wilson on this complex of ideas: see Illuminatus! Trilogy, pp.793-795; Prometheus Rising, pp.97-100; Schrodinger's Cat Trilogy, pp.342-347.]

10.) I grew up in boxy architecture, and when I first encountered this idea - about rectangles and 90 degree angles and conformity - I also found out that we forget how we did it, but at some point we all had to learn to see in 3-D spatial terms. Supposedly some cultural anthropologists had gone into deepest, darkest rain forest Africa and lived with and studied pygmies, whose complete environment was always giant trees and vines and moving through those living, breathing green spaces, always canopied by jungle thickness as "ceiling." And when they were taken to a clearing at the edge of the forest and the anthropologists pointed to a man and a jeep far off in the distance, the natives thought they were seeing a tiny man. They had not learned to see over vistas of "open space."

So, I lay in bed and looked at the place where the ceiling meets the walls. Two walls and the ceiling meet at a "point" in space. And I tried to remember what it was like not to see that as a point in space. It's akin to the many visual illusions - the Necker Cube - you've all seen. It was fruitless. Until, one day...O! Such little things that thrill me. Aye: the corner lay on a flat plane. And then it pointed out toward me...

I attest, I assert that when I enter buildings of a non-Euclidean build, my consciousness is altered. An inventory of memories and anecdotes would bore you and me, but I wonder if you have felt the same? I love round rooms. A spiral staircase can really get me going. On and on. But here's the thing: if I had grown up in a non-Euclidean house, I strongly suspect that entering a Euclidean "tiny box" house would alter my consciousness too. Because I think each represents an unfamiliar structuring of space...

I hope I didn't come off like some un-hep "square" in this blogspew.


Saturday, October 5, 2013

I Didn't Build This Blog

In the Blizzard of Memes, you can't see what hit you. Metaphors land softly in the snow of your neural fluids, only dimly noted, at best. Other cleaning systems, borne by fluids electrically discharging, move the thought-stuff out of your system, and what remains...remains. Other memes are noted...as "memes" because, well, here you are: noting them. But what's inside?

And what's forgotten?

When Obama made a now-famous speech in his campaign against Romney (who, ironically, pioneered the model for Obamacare), he riffed on "You didn't build that." I immediately recognized this riff as probably stolen (watch my loaded words here!), or borrowed, from Elizabeth Warren. I'd read or heard or seen Warren give a variation, well-fleshed-out, years earlier. But she didn't build that.

I just finished reading the Wikipedia article on "You Didn't Build That." I thought it was a decent Wiki, although as I read it I found - as I usually do - that I'd wanted it to link to...something older. Because Elizabeth Warren didn't build "You didn't build that." (But I've always believed she had long internalized the conceptual framework of the idea; and I think it gets near to the heart of our central tragedy in Unistat that hardly anyone understands that conceptual framework. When Obama used the idea, I had the strong feeling he had not fully internalized that conceptual framework; he was merely riffing and playing his role.) Let me explain. Try to...

The Wikipedia link above? If you clicked on it and only skimmed it for two seconds? All the iterations that article has gone through? The contributors? They didn't build that. Jimmy Wales didn't build it. The infrastructure of the Internet? The infrastructure that supports that infrastructure? The history of architecture, design, craftsmanship, planning, industrial works, mathematical, chemical, and physical ideas? As the Jewish comedian said, "Don't get me started!"

I assume you're reading this in some sort of environment. My guess is, you're "indoors." (I find it taxes my imagination to visualize anyone reading OG outside, walking down a street, on a mobile device, but who knows?) Anyway, indoors or out: look away from the screen for 30 seconds or three breaths and note your surroundings. Did you build that? As I look around this cramped, book-packed room, I find I assembled most of it. The bones - walls, ceiling, the wiring inside the walls, the paint, those little screws that hold the plate around the light switch on the wall - those I most definitely did not build.

(I was thinking just now of tough guy and revolutionary Modernist Ezra Pound who, barely scraping by himself, went out of his way, tirelessly taking pains, to make sure that rich lady patrons of the arts knew about his friends - the relatively unknown Joyce, cummings, Hemingway, Frost, Eliot, on and on - so that those soon-to-be "important" artists would be supported, subsidized, noticed. Meanwhile, Pound the poet-revolutionary made his own furniture and got by on a bowl of soup. He could look on the serviceable chair with pride. Did he build that?)

In 1919, a writer writing about a very, very old idea:

The now dead inventor of the steam engine could not have produced his ingenious invention except by using the living powers of other dead men - except by using the material and spiritual or mental wealth created by those who had gone before. In the inventor's intellectual equipment there was actively present the kinetic use-value of 'bound-up-time,' enabling him to discover the laws of heat, water, and steam; and he employed both the potential and kinetic use-values of mechanical instruments, methods of work, and scientific knowledge of his time and generation - use-values of wealth created by the genius and toil of by-gone generations.
-pp.121-122, Manhood of Humanity, Alfred Korzybski

Who knows to what degree this idea has sunk in. I know that when I first encountered it, it felt totally revolutionary. And yet, I found I kept forgetting it. Growing up in Unistat you very easily become brainwashed to believe, without reservation, that everything someone has, they..."made that." For what it's worth, I now find the idea completely preposterous and feel embarrassment when I remember how naive I'd been to believe it. (And I'm embarrassed that so many of Us still believe it.)

I'd encounter the idea articulated by Korzybski (NB: he didn't build that) again and again, and it was wondrous and seemed "truer" than what my conditioning had led me to work with. It was very much like when I learned as a young boy that the sun didn't "set" but instead we were on a much smaller body, rotating away from the sun. I knew intellectually this was true, but my natural, naive experience of the sun moving and not us...held. It took practice to get over this. Now when the sun "sets" I can feel us moving on Earth, from my relatively inertial standpoint.

"We" can be utterly profoundly liberating as a concept internalized. Or so I assert.

You can take in history in an embodied way that seems to me qualitatively richer than what was dished to you by cultural conditioning. And, to take a Poet out of context, this "makes all the difference."

"We" goes back a long time, to the most inchoate use of tools by our deepest ancestors.

The words on the screen you're looking at right now are made of letters, and people helped you learn to read them - the phonemes, the sounds, the poetry of language and its resonances. They helped you learn to decode these, as others had done for them. (And on those mornings they taught you, they ate a breakfast they had merely assembled...)

Countless tinkerers throughout human time added incrementally to the sum totals of technics. The ones who tried a new approach that didn't work, but others took note? This too created value: we now know what doesn't seem to work. Let's go more in this direction. And hey: why not keep notes?

Assembling, let me be clear, is not to be sneezed at. It is a creative act. But it seems thoroughly encompassed within building.

You have built much. Probably far more than you realize. You don't realize the many things you have built because of categorical accounting schemes you assumed were true. You have built neural circuitry in other mammals, for one thing...You have played a part in building me. (How? Just think about it. Hint: maybe it has something to do with one of Korzybski's triumvirate "material and spiritual or mental wealth"?)

This laptop I'm using? I built none of it. (But "We" built it all.) The silicon, the plastic, the glass, the mathematics "inside" it? I didn't build that. The router, the insulation on the thingumbob that plugs into the whatsit that gives me the juice? Me no build-a dat ting, no. Let's not even get into the Server, or satellites, or the stuff that goes into the foundation of the building that houses the thing that supports the dealio that runs on the doohickey, artifact, and article that goes into that gadget over there, that thing made of metal but's really some alloy of some sort? Who the hell mined that? But I digress...

My ideas at Overweening Generalist? An absurdly complex agglomeration and concatenation of metaphors and names that I didn't build, but I may have used a form, a syntax, a display, an array of combined ideas that may have spurred something within you. I got that from those who went before me. It's my understanding that almost all knowledge percolates constantly in these fashions. But I didn't build it all ex nihilo, of course. The details seem fabulous but true...

No: It's more like We the culture threw out an ungawdly amount of mindstuff, and some of it stuck inside my head! (I didn't build any of those metaphors, I merely borrowed them, so if you have any complaints, please see The Mgt.) The quasi-hidden form of the desktop? The icons? The people at Blogger? The coders that made Blogger so easy for a dunderhead like myself to use, so I can write this crap so you can read it? Me no build zees zing, neethuh! (The farmers that grew the food that allowed most thinkers and tinkerers and laborers to get off the farm and do weirder things with knowledge, like build engines, roads, algorithms, surgical steel, Etsy?)

WE built it, like Korzybski says. Take what he's saying about the steam engine and just extrapolate, and give yourself credit for doing so, for it's a creative act to do so, and who knows what brilliant and novel ways you're envisioning this idea, but if you make something of this idea (did I do that, just now, today?), give yourself some credit. Just not all the credit.

Because giving yourself all the credit just seems to me...childish. Or, I'll be charitable: child-like. Naive, and, as the Philosopher said, "Human, all-too human." Other times I say: greedy and pretentious and stupid.

There seems very much I have not said here.

Here's Prairie Populist Elizabeth ("Betty") Warren, with a variation on a very, very olde idea. Is she right? If not, how is she wrong?


Wednesday, October 2, 2013

The Drug Report: Crisis In Psychopharmacology

It's been at least 30 years since a truly new drug has hit the market that addresses the needs of patients suffering from depression, anxiety, manic depression (now rather bloodlessly called "bipolar disorder"), and schizophrenia. Any "new" drug in the last 30 years has been basically some variation on an older, established drug (a "Me Too" drug), brought out so a company could keep up with its competitors. These non-new "new" drugs are almost always marketed as "blockbuster" or "revolutionary" therapeutics, touting fewer side effects than older, competing drugs. They are not new, and the side effects are just different, not fewer. 50 or so psychiatric drugs bring in $25 billion a year in Unistat alone. And they're pretty lousy.

(I know, I know: you'd be far worse off without the one that worked for you. Hey: they do some good. For some people. I want better drugs for you, is all. And we were promised them with the 2000 mapping of the human genome. So...where are they? Later.)

                                                        [image: serotonin]

The drugs people use - by every estimate I've seen, between 20% and 25% of the Unistat population takes at least one of these - were discovered by accident. By serendipity. In the 15 years after 1945. In 1952 a tuberculosis drug didn't work for TB, but iproniazid sure elicited euphoria when tested! Bingo: the first antidepressant. The drug that became Tofranil was supposed to work for schizophrenics, but it didn't help them; it only made them run naked into town, laughing. Another antidepressant. In 1949 lithium was discovered, by accident, to treat manic depression. In 1957 Leo Sternbach was about ready to give up his research into a class of antihistamines - things were looking like a dead end - when he stumbled onto the benzodiazepines: your Valium, Xanax, Ativan, Klonopin, etc.: an empire of anti-anxiety drugs, and a huge influence on the tonality of culture in the West in the latter half of the 20th century.

With better technics, we learned much more about neurons and neurotransmitters. The SSRIs seemed to treat depression and anxiety. They were really the last big breakthrough. Ever since then, clinical trials that have made it to Phase III have been nothing but huge, sad, very expensive wastes. And so Novartis, GlaxoSmithKline, AstraZeneca, Pfizer, Sanofi and Merck have by and large quit trying. They've halted clinical trials and moved on to research that shows more promise. The pipeline for new psychopharmacological drugs is dry.

                                    [image: psilocybin, very much like serotonin in structure]

Wait a minute: with more neuroscientists than ever before, far better imaging devices, a tremendous acceleration of knowledge about the human brain over the past 30 years...why the drought? And mental illness takes an increasing toll on us. If not you, someone you know. Why is this so difficult? Is it because what R.D. Laing called "the medical model" finally showed its hand? (A pair of nines?)

Again: our technology to map our cells, genes, and organs at ever finer grain is greater than ever. We now have a deeper understanding of the human genome, an explosive discovery of the complexity of the epigenome, increasing understanding of how our environment and microbes interact with us...why don't we have a drug that will cure depression by now? Are we simply too complex to understand? Were we destined to be granted a brief window of time in which a few "happy accidents" would yield up as good as it gets, and it all ended 30 years ago? What about our computing power and pharmacological knowledge? Aren't they also subject to Moore's Law: a doubling roughly every 18 months? Shouldn't we have had a bevy of breakthroughs by now?

What are we doing wrong?

In 2011 Eli Lilly thought they had a breakthrough for schizophrenia. They'd given PCP to mice, then their new drug, and...the mice calmed down! Everything went well. They got to Phase III clinical trials (humans), and 18 months later the drug was dead. Placebos worked just as well. Lilly, too, has now all but given up.

                                    [image: LSD - like psilocybin and serotonin, structurally]

Some New Ways of Thinking and Genuine Promise 
Steven Hyman of Harvard and M.I.T. knows this field well. In an article I read, he was quoted admitting of his colleagues, "People are tired of curing mice."

Let's go back to the last breakthrough: Prozac and all its cousins.

It had been assumed that, when those happy accidents occurred, there must be a theoretical basis. Pharmacologists have always acted like they were on top of what was going on, but the trade secret was they were faking it: when a drug worked, it went on the market, people used it, and it "worked" well enough, but at first the chemists and psychiatrists had no idea why. With better understanding of the brain, they reached back to the ancient model of imbalanced humors as an explanatory scheme. Only they juiced it up: they found these drugs altered neurotransmitters. Therefore, the lack of the neurotransmitter caused the disease! It seemed quite plausible, and very much like the hardcore finding that insulin works for diabetics.



Nassim Nicholas Taleb says this is a classic case of the "reverse-engineering problem": drop an ice cube on the floor and then go play cards with your friends in the other room. Can you visualize the cube melting down into a tiny pool of water? Of course you can. You walk back into the kitchen and see a tiny pool of water where you had dropped the cube. It's pretty straightforward. Now: imagine walking down the street and coming upon a tiny pool of water. A little spot of wet. How many ways can you dream up the cause of this spot?

A cop comes upon a drunken man looking for his keys, at night, under a streetlight. The cop asks the drunk why he keeps looking under the streetlight, and the drunk says it's because the light is so much better there.

Obviously, even our best researchers have been looking where the light was bright. And the reverse-engineered explanation of our not-all-that-great/we-can-do-better psychopharmacological drugs? Human. All-too human.



The neurotransmitters are not the cause of mental illness. They merely point at the underlying cause; neurotransmitters (dopamine, serotonin, norepinephrine, etc.) are tangential and partial. Reverse-engineering to allow more serotonin to remain in the synaptic gap between neurons was a genius move; too bad a handful of studies show SSRIs work little better than placebos. (For some people they have worked well enough; I don't want to slight this!) All in all, there's a "truthiness" about depression drugs.

We treat everyone the same in studies, while knowing they have variable epigenomes. This is receiving some major research attention and seems quite promising, to my eyes. We also have a semantic problem: experts deal with a patient, make observations and tests, then name the disease they "have" - a major problem, because people and diseases do not fall into our socially-constructed, convenient categories as neatly as we'd like. This problem is now far more acknowledged than ever, which seems promising to me. One example is the Research Domain Criteria: we map behavioral abnormalities and symptoms and link them to specific causes in the brain, without the label of "schizophrenia" or "panic disorder." Why is this approach better? Because it's more targeted. Instead of looking for one or two neurotransmitters that "cause" schizophrenia, we try to find out specifically what causes people to hear voices, or become catatonic.
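For the code-minded: here's a crude toy in Python of what that re-indexing amounts to. The symptom-to-mechanism mappings below are placeholders I invented for illustration, not actual Research Domain Criteria content:

```python
# Toy illustration of the RDoC-style shift: research keyed by symptom and
# suspected mechanism rather than by diagnostic label. All mappings invented.

# The old indexing: one diagnostic label, one or two "imbalanced" transmitters.
by_diagnosis = {
    "schizophrenia": ["dopamine"],
    "panic disorder": ["serotonin", "norepinephrine"],
}

# The newer indexing: specific symptoms linked to candidate brain-level causes.
by_symptom = {
    "hearing voices": ["auditory cortex activity", "prediction-error signaling"],
    "catatonia": ["motor circuit dysfunction", "GABA-system abnormality"],
    "panic attacks": ["amygdala hyperreactivity", "interoceptive miscuing"],
}

# A drug hunt then aims at a mechanism, not a label:
for symptom, mechanisms in by_symptom.items():
    print(f"{symptom}: candidate targets -> {', '.join(mechanisms)}")
```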

The idea that a drug must take 18 years from conception through clinical trials is being re-thought. Even more crucially for mental disease: non-human animal studies long ago reached diminishing returns. Now the idea is that small-scale, carefully controlled studies on humans will speed up the process and may yield breakthroughs in shorter periods.

Another area of promise: when a drug failed, it had often worked for a few people. But our gold standard of drug testing: double-blind and placebo-controlled? The rules were that if the placebo worked as well as the drug, throw out the drug. But the people who were helped probably should have told us something.
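If you want to see why averaging can bury those people, here's a toy simulation in Python. Every number is invented, purely to illustrate the point; the "marker" is known only to the simulation, which is precisely the real-world rub:

```python
# Toy simulation: a drug that "fails" on average because only a small,
# unidentified subgroup responds. All numbers here are invented.
import random
from statistics import mean

random.seed(42)

def improvement(on_drug):
    """Return (is_responder, symptom-improvement score) for one patient."""
    responder = random.random() < 0.15      # ~15% carry some unknown marker
    effect = 8.0 if (on_drug and responder) else 0.0
    noise = random.gauss(0, 5)              # placebo response + measurement noise
    return responder, 10.0 + effect + noise

drug = [improvement(True) for _ in range(200)]
placebo = [improvement(False) for _ in range(200)]

# Averaged over everyone, the drug barely beats placebo - and gets thrown out:
print(mean(s for _, s in drug) - mean(s for _, s in placebo))        # ~1 point
# Among the (invisible-to-us) responders, it's a different story:
print(mean(s for r, s in drug if r) - mean(s for _, s in placebo))   # ~8 points
```

On average the drug looks like a dud; among the responders it's a blockbuster. Which is why restoring and re-mining old trial data seems so promising.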

Along those lines, there is a strong call to restore abandoned or "invisible" clinical trials to correct the scientific record. We may learn some very interesting things from "failed" trials.

The techniques surrounding stem cells have accelerated at a dizzying pace, and for the better: now researchers can test cells and drugs in a dish and make very good guesses as to whether a compound would have some efficacy.

With the mapping of the human genome in 2000, hundreds of utopian promises were made that now seem embarrassing, or like outright quackery. But there was reason to be optimistic. We thought that because we were very complex, we'd have the most genes, but instead of 100,000 we only had about 21,000. Grapes have more genes than us: this was nothing like what we'd expected. Worse: 13 years later we now know that a "bigger" system - in terms of complexity - governs the genome: the epigenome. It turns out that RNA plays a far, far bigger part than we'd thought. The complexity can seem overwhelming.

In 2002 the researcher Andrew Hopkins came up with an eye-opening paper on the "druggable genome." Okay: we'd thought we had 100,000 genes. We have closer to 21,000. He estimated that only about 10% of those genes coded for proteins that could bind to small molecules, which is basically how drugs work. So: about 2,100 genes. But he estimated that, of those, only about 20% would be likely to be involved in disease. So now we're down to about 420 possible targets. And then he guessed we'd already discovered 50% of those (probably accidentally?). We only had 210 targets left? For all diseases, not just mental illnesses? Not exactly a rosy scenario. But...
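For the arithmetically inclined, here's that funnel as a back-of-the-envelope sketch in Python. The percentages are Hopkins's rough estimates as I've paraphrased them above; the variable names are mine:

```python
# Hopkins's "druggable genome" funnel, as back-of-the-envelope arithmetic.
genes = 21000                               # post-2000 estimate of human genes
druggable = genes * 0.10                    # ~10% code for proteins small molecules can bind
disease_linked = druggable * 0.20           # ~20% of those likely involved in disease
undiscovered = disease_linked * 0.50        # he guessed half were already found

print(int(druggable))       # 2100
print(int(disease_linked))  # 420
print(int(undiscovered))    # 210 targets left - for ALL diseases
```

Of course, the precision here is fake: it's an estimate multiplied by an estimate multiplied by a guess. But the shape of the funnel is the sobering part.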

Cheminformatics! This is a burgeoning discipline using the aforementioned computational doubling: there are tens of thousands of compounds in digitized libraries. Do you test them all? Two guys wrote an algorithm to teach a computer to sift through a welter of data on TB, which is becoming antibiotic-resistant. A Big Deal, quite threatening to all of us, potentially. Their algorithm said: find all compounds that are like the drugs that used to work on tuberculosis. So you get that data set. Then the algorithm says: throw out every compound known to be toxic to mammalian cells. You have a smaller set, but a safer one to work with. The algorithm rediscovered a 40-year-old drug that had been shown to have anti-TB properties but had been forgotten.
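Here's a crude sketch of that two-step filter in Python. The compounds, "features," and threshold below are all made-up stand-ins, not the actual TB study's data or code; the Tanimoto similarity formula itself (shared features over total features) is the standard one:

```python
# Toy version of the TB drug-hunting algorithm described above.
# Compounds are represented as sets of structural "features."
# All data here is invented for illustration.

def tanimoto(a, b):
    """Tanimoto similarity between two feature sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b)

# Hypothetical feature sets for drugs that used to work on TB:
known_anti_tb = [
    {"nitro", "furan", "amide"},
    {"isonicotinoyl", "hydrazide"},
]

# A digitized "library" of candidate compounds (also invented):
library = {
    "compound_A": {"nitro", "furan", "ester"},
    "compound_B": {"steroid", "hydroxyl"},
    "compound_C": {"isonicotinoyl", "hydrazide", "methyl"},
}

# Step 2: throw out anything known to be toxic to mammalian cells.
known_toxic = {"compound_A"}

hits = [
    name for name, features in library.items()
    if name not in known_toxic
    and max(tanimoto(features, ref) for ref in known_anti_tb) >= 0.5
]
print(hits)  # ['compound_C'] - a candidate worth a second look
```

The real thing runs over vastly larger libraries, using hashed molecular fingerprints rather than hand-listed features, but the shape of the idea - similarity filter, then toxicity filter - is the same.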




Even more interesting and promising: researchers in Cambridge, MA have taken messenger RNA (mRNA) - an ultra-fragile molecule which, when injected, activates the body's immune response - tweaked a couple of "letters" in its nucleotide sequence, and made a non-fragile mRNA that does not turn on the immune system. What this could do is take the information from the DNA in a gene and use it to "fix" missing or broken proteins in another cell, in effect causing a patient with a (probably inherited?) protein abnormality to make a drug inside their own cells!

Nessa Carey, a gifted explainer of how epigenetics works in our bodies, has urged us to be cautious about getting too excited over drugs based on DNA-RNA, because so far, "One of the major problems with this kind of approach therapeutically may sound rather mundane. Nucleic acids, such as RNA-DNA, are just difficult to turn into good drugs. Most good existing drugs - ibuprofen, Viagra, antihistamines - have certain characteristics in common. You can swallow them, they get across your gut wall, they get distributed around your body, they don't get destroyed too quickly by your liver, they get taken in by cells, and they work their effects on the molecules in or on the cells. Those all sound like really simple things, but they're often the most difficult things to get right when developing a new drug."

Finally, there is a very real call to combine all our new technologies with an active search for happy accidents, as in the 1945-60 period. We find as many compounds as possible that might have efficacy, get people willing to be guinea pigs to try them (we have far better ways to guess at what's likely to have horrendous side effects or death-dealing qualities, but we're by no means "covered" here), and see what happens! Yes, the dark side is that the poor will probably be the ones to sign up...How do we find new things to try? "Scientists Map All Possible Drug-Like Chemical Compounds." It turns out the drunk looking for his keys was a far more accurate analogy than we might've guessed. Or wanted to guess. Check out all the unexplored chemical "space" yet to be charted! It reminds me of the incredible number of phenethylamines and tryptamines that Alexander Shulgin mapped: but a drop in the ocean? (Shulgin deserved the Nobel Prize for Chemistry: just read up on his career! It's almost criminal he didn't get the Prize.) It's like looking for signs of life in the Milky Way! Or more prosaically: like geologists learning how to drill for oil more profitably. It's also about algorithms and possibilities and adventure and hellacious mistakes yet to be made.

To all of us looking for better living through chemistry: Bon appétit! I do think we may make it through this bottleneck to a whole new world of more sophisticated drugs that will make all the ones we've had since 1945 look primitive. Maybe?

Some Of The Works Consulted:
The Epigenetics Revolution by Nessa Carey
"No New Meds," by Laura Sanders:
http://www.sciencenews.org/view/feature/id/348115/description/No_New_Meds
Happy Accidents: Serendipity In Modern Medical Breakthroughs, by Morton A. Meyers
"The Psychiatric Drug Crisis" by Gary Greenberg:
http://www.newyorker.com/online/blogs/elements/2013/09/psychiatry-prozac-ssri-mental-health-theory-discredited.html
PIHKAL: A Chemical Love Story, by Alexander and Ann Shulgin
"Where Are All The Miracle Drugs?" by Brian Palmer:
http://www.slate.com/articles/health_and_science/human_genome/2013/09/human_genome_drugs_where_are_the_miracle_cures_from_genomics_did_the_genome.single.html
"Messenger RNAs Could Create a New Class of Drugs," by Susan Young:
http://www.technologyreview.com/news/512926/messenger-rnas-could-create-a-new-class-of-drugs/
"Faster, Smarter and Cheaper Drug Discovery":
http://www.sciencedaily.com/releases/2013/03/130321131920.htm
Serendipity: Accidental Discoveries In Science, by Royston Roberts
Hope or Hype: The Obsession With Medical Advances and the High Cost of False Promises, by Richard A. Deyo and Donald L. Patrick
"Experts Propose Restoring Invisible and Abandoned Trials to 'Correct the Scientific Record'":
http://www.sciencecodex.com/experts_propose_restoring_invisible_and_abandoned_trials_to_correct_the_scientific_record-114055
The Black Swan: The Impact of the Highly Improbable, by Nassim Nicholas Taleb