Carl Zimmer | Phenomena | National Geographic

The initials in BRCA1 stand for breast cancer. Its name reflects how it was discovered: scientists found it as they were searching for the cause of the disease. But such names are really misnomers. After all, genes don’t simply sit in our DNA so that they can mutate in some people and make them sick. Normally, they have a job to do. In the case of BRCA1, there are many jobs. For one thing, it protects DNA from harmful mutations that can arise as it’s getting replicated. And if DNA does get damaged, the BRCA1 protein helps fix it. It joins together with several teams of other proteins, and each team carries out a different part of the complex task of DNA repair.





Corey Robin | The Nation

. . . to understand that text and its influence, it’s necessary to turn away from contemporary America to fin de siècle Vienna. The seedbed of Hayek’s arguments is the half-century between the “marginal revolution,” which changed the field of economics in the late nineteenth century, and the collapse of the Habsburg monarchy in 1918. It is by now a commonplace of European cultural history that a dying Austro-Hungarian Empire gave birth to modernism, psychoanalysis and fascism. Yet from the vortex of Vienna came not only Wittgenstein, Freud and Hitler but also Hayek, who was born and educated in the city, and the Austrian school of economics.

Friedrich Nietzsche figures critically in this story, less as an influence than a diagnostician. This will strike some as an improbable claim: Wasn’t Nietzsche contemptuous of capitalists, capitalism and economics? Yes, he was, and for all his reading in political economy, he never wrote a treatise on politics or economics. And despite the long shadow he cast over the Viennese avant-garde, he is hardly ever cited by the economists of the Austrian school.

Yet no one understood better than Nietzsche the social and cultural forces that would shape the Austrians: the demise of an ancient ruling class; the raising of the labor question by trade unions and socialist parties; the inability of an ascendant bourgeoisie to crush or contain democracy in the streets; the need for a new ruling class in an age of mass politics. The relationship between Nietzsche and the free-market right—which has been seeking to put labor back in its box since the nineteenth century, and now, with the help of the neoliberal left, has succeeded—is thus one of elective affinity rather than direct influence, at the level of idiom rather than policy.





James Cartwright | It’s Nice That





Judith Shulevitz | The New Republic

. . . natural selection favored people who needed people. Humans are vastly more social than most other mammals, even most primates, and to develop what neuroscientists call our social brain, we had to be good at cooperating. To raise our children, with their slow-maturing cerebral cortexes, we needed help from the tribe. To stoke the fires that cooked the meat that gave us the protein that sustained our calorically greedy gray matter, we had to organize night watches. But compared with our predators, we were small and weak. They came after us with swift strides. We ran in a comparative waddle.

So what would happen if one of us wandered off from her little band, or got kicked out of it because she’d slacked off or been caught stealing? She’d find herself alone on the savanna, a fine treat for a bunch of lions. She’d be exposed to attacks from marauders. If her nervous system went into overdrive at perceiving her isolation, well, that would have just sent her scurrying home. Cacioppo thinks we’re hardwired to find life unpleasant outside the safety of trusted friends and family, just as we’re pre-programmed to find certain foods disgusting. “Why do you think you are ten thousand times more sensitive to foods that are bitter than to foods that are sweet?” Cacioppo asked me. “Because bitter’s dangerous!”





Joshua Rothman | The Page Turner Blog | The New Yorker

Baz Luhrmann’s “The Great Gatsby” is lurid, shallow, glamorous, trashy, tasteless, seductive, sentimental, aloof, and artificial. It’s an excellent adaptation, in other words, of F. Scott Fitzgerald’s melodramatic American classic. Luhrmann, as expected, has turned “Gatsby” into a theme-park ride. But he’s done it in exactly the right way. He hasn’t tried to make the novel more respectable, intellectual, or realistic. Instead, he’s taken “The Great Gatsby” very seriously just as it is.

. . . And there’s another sense in which I think Luhrmann gets “Gatsby” exactly right. His movie, which is presented in 3-D, seems streamlined and pre-packaged—it’s presented, self-consciously, as mass entertainment—and his characters feel flat, smoothed-out, uncomplicated. Many critics have charged the movie with flatness, too. In his excellent essay on the film, my colleague Richard Brody writes that “there’s no roughness whatsoever to [DiCaprio’s] character, none of life’s burrs or scrapes, no tinge of real power”; Carey Mulligan, similarly, “doesn’t invest the character with style or with substance.” The director, he concludes, is “unable to take society seriously, to recognize the extraordinary character that extraordinary manners both hide and (for those attuned to them) display.” These are legitimate, discerning objections, and yet I can’t help but feel that the film’s flatness is a deliberate choice; that what seems like a failure of Luhrmann’s imagination is actually a faithfulness to Fitzgerald’s. The characters are like that in the novel, too; that’s why Lionel Trilling, in “The Liberal Imagination,” compared them to “ideographs.” Flatness, after all, is the state to which all of Fitzgerald’s characters aspire. Even Gatsby, whose life thrums with secret ambition and desire, manages to be the cool man in the pink suit. “You always look so cool,” Daisy tells him. In a moment of admiration, she says that he resembles “an advertisement” of a man.

The flatness of the characters in “Gatsby” is, I think, part of what makes it so insightful. . .


Michael Nordine | Los Angeles Review of Books

Malick went on to study philosophy under Stanley Cavell at Harvard University, where he graduated summa cum laude and Phi Beta Kappa in 1965 before crossing the Atlantic as a Rhodes scholar at Magdalen College, Oxford. Prior to completing his PhD, he left the school over a dispute with his thesis advisor. The details of this argument are largely unknown, though The Harvard Crimson claims it had to do with “the contrasting worldviews of Kierkegaard, Heidegger, and Wittgenstein.”[3] Upon arriving back in the United States, Malick taught philosophy at MIT and published a translation of Martin Heidegger’s Essence of Reasons. (While at Harvard, he also translated Heidegger’s Holzwege and met the philosopher during a year abroad in Germany.) For a number of years following his return, he worked as a journalist for Newsweek and Life, where he wrote about Latin America, and The New Yorker, where he had an office from 1968 to 1969. Malick then enrolled at the AFI Conservatory as part of its inaugural class, graduating two years later with David Lynch. The decision to apply to the program was apparently an easy one: “I’d always liked the movies in a kind of naïve way,” Malick once said, for the simple reason that “they seemed no less improbable a career than anything else.”

It thus seems safe to say that Malick’s interests have always extended beyond cinema and that his life doesn’t appear to revolve around filmmaking. Moreover, he did not simply arrive in Hollywood on the back of a turnip truck one day and attempt to make it big. Malick had a built-in network of colleagues and friends from the above institutions (namely, his agent Mike Medavoy, who eventually became a producer of The Thin Red Line) and had already proven himself, not only by his enviable intellect, but also through his success in different, sometimes overlapping fields. Perhaps the most prescient of his early projects, however, was an unfinished one: a “huge piece on the death of Che Guevara for The New Yorker” said to have “piled up to six feet of copy. He got obsessed, and he overwrote, and he went past it. He never finished it.” Prior to completing his third film, Malick would start, but not finish, a great many more projects.





Spark Pictures | Vimeo

Illuminating the culture of Burning Man – the annual pilgrimage to Nevada’s Black Rock Desert – as a catalyst for community, innovation and the actualization of dreams, this film offers a glimpse into the art and culture of this dynamic community in the hope of sparking a dream within you.

DREAM premiered at the Sonoma International Film Festival. Directed and produced by Rich Van Every. Rich also was a cinematographer for “Spark: A Burning Man Story,” a feature film on the inside story of Burning Man during a year of unprecedented challenges and growth. “Spark: A Burning Man Story” premiered at SXSW 2013 and will be in general release Summer 2013. See for more information and to sign up for email updates on upcoming screenings.





Charles Pierce | Grantland

The National Football League is just now stirring from the approximately 11 seconds of hibernation that it allows itself each year. As usual, it is coming awake slowly, wiping the initials out of its eyes, OTAs and the like. The players are all in T-shirts and shorts. The coaches do not yet look like fugitives from a Turkish prison. By and large, everyone is upright. Everyone is smiling. Somewhere in the bowels of the medical facilities around the league are unopened boxes of tape and racks of shiny syringes that have not yet been used. This is how the NFL begins its year, every year. This is how the savagery begins, in T-shirts and shorts, in the sunshine and the spring breezes and the lilacs hanging over the chain-link fence. This is how the savagery begins.

Euphemism is no longer adequate. Euphemism is, in fact, an insult to our collective intelligence and a cruel mockery of our collective morality. At its highest and most lucrative level — at the very apex of its slow dance with the institutions of American corporate power — it is at its most savage, which is fitting, because we are in an age of organized (if somewhat more polite) corporate savagery unseen since the days of Jay Gould and the rest of the Gilded Age brigands. Euphemism is no longer sufficient camouflage. Football is about breaking human beings, preferably on the cheap, and replacing them with human beings who, for the moment, are unbroken — or, at least, less broken. The only even remotely interesting question remaining is whether the nation can simply admit this to itself and — having done so honestly — whether it can live as an ethical and moral culture in which the savagery is allowed to prosper.

There really are no other questions left. For a long while, the league was able to mask the fact that the destruction of the human body was as central to its fundamental structure as that destruction ever was to, say, boxing. For a long while, the libertarian argument seemed to prevail; yes, the argument went, we concede the savagery and the destruction but, to paraphrase Hyman Roth, this is the business they have chosen. Both of those strategies have run their course. Scientific evidence continues to overwhelm any attempts to spin what happens to a human being over the course of a career playing football. And there comes a point at which the libertarian argument runs headlong into the question of whether it is moral for a society to allow people to commit slow-motion suicide for the purposes of mass entertainment. That leaves us with the question of what we will tolerate as an ethical and moral culture, and why. And that is the question that the NFL must answer in a whole host of areas regarding the safety and health of its employees, lest one day it get an answer that it will not like very much.


Ben Lindbergh | Grantland

Take a look at these two pitches from 2012:
They’re both four-seam fastballs thrown by right-handed pitchers to left-handed hitters. They both pass through the strike zone 21 inches off the ground, between 11.7 and 12.9 inches from the center of home plate. They both hit their targets, so the catchers know where they’re headed and have time to prepare. And they’re both called by the same umpire, Sam Holbrook.

In fact, the two pitches are similar in just about every respect but their outcomes. The top one, thrown by James Shields last July, is a strike, but the bottom one, thrown by Liam Hendriks last June, is a ball.

We can’t say for sure why only one was a strike; maybe Holbrook was just feeling generous when the first pitch crossed the plate. But we do know one important variable that differs between the two pitches — the catcher. The pitch on the top was caught by the Rays’ Jose Molina, one of baseball’s best receivers. The pitch on the bottom was caught by the Twins’ Ryan Doumit, one of the worst. And that may have made all the difference.

Focus on how catchers “frame” pitches to make them look more like strikes, or talk to guys who are good at it, and the distinction between players like Molina and Doumit starts to stand out. Depending on how they’re caught, two pitches that are almost identical on their way to the plate can look a lot different once they get to the glove.





Jeff Hamadi | BOOOOOOOM!





Paul Krugman | The New York Review of Books

Everyone loves a morality play. “For the wages of sin is death” is a much more satisfying message than “Shit happens.” We all want events to have meaning.

When applied to macroeconomics, this urge to find moral meaning creates in all of us a predisposition toward believing stories that attribute the pain of a slump to the excesses of the boom that precedes it—and, perhaps, also makes it natural to see the pain as necessary, part of an inevitable cleansing process. When Andrew Mellon told Herbert Hoover to let the Depression run its course, so as to “purge the rottenness” from the system, he was offering advice that, however bad it was as economics, resonated psychologically with many people (and still does).

By contrast, Keynesian economics rests fundamentally on the proposition that macroeconomics isn’t a morality play—that depressions are essentially a technical malfunction. As the Great Depression deepened, Keynes famously declared that “we have magneto trouble”—i.e., the economy’s troubles were like those of a car with a small but critical problem in its electrical system, and the job of the economist is to figure out how to repair that technical problem. Keynes’s masterwork, The General Theory of Employment, Interest and Money, is noteworthy—and revolutionary—for saying almost nothing about what happens in economic booms. Pre-Keynesian business cycle theorists loved to dwell on the lurid excesses that take place in good times, while having relatively little to say about exactly why these give rise to bad times or what you should do when they do. Keynes reversed this priority; almost all his focus was on how economies stay depressed, and what can be done to make them less depressed.

I’d argue that Keynes was overwhelmingly right in his approach, but there’s no question that it’s an approach many people find deeply unsatisfying as an emotional matter. And so we shouldn’t find it surprising that many popular interpretations of our current troubles return, whether the authors know it or not, to the instinctive, pre-Keynesian style of dwelling on the excesses of the boom rather than on the failures of the slump.

David Stockman’s The Great Deformation should be seen in this light. It’s an immensely long rant against excesses of various kinds, all of which, in Stockman’s vision, have culminated in our present crisis. History, to Stockman’s eyes, is a series of “sprees”: a “spree of unsustainable borrowing,” a “spree of interest rate repression,” a “spree of destructive financial engineering,” and, again and again, a “money-printing spree.” For in Stockman’s world, all economic evil stems from the original sin of leaving the gold standard. Any prosperity we may have thought we had since 1971, when Nixon abandoned the last link to gold, or maybe even since 1933, when FDR took us off gold for the first time, was an illusion doomed to end in tears. And of course, any policies aimed at alleviating the current slump will just make things worse.


David Stuckler & Sanjay Basu | The New York Times

EARLY last month, a triple suicide was reported in the seaside town of Civitanova Marche, Italy. A married couple, Anna Maria Sopranzi, 68, and Romeo Dionisi, 62, had been struggling to live on her monthly pension of around 500 euros (about $650), and had fallen behind on rent.

Because the Italian government’s austerity budget had raised the retirement age, Mr. Dionisi, a former construction worker, became one of Italy’s esodati (exiled ones) — older workers plunged into poverty without a safety net. On April 5, he and his wife left a note on a neighbor’s car asking for forgiveness, then hanged themselves in a storage closet at home. When Ms. Sopranzi’s brother, Giuseppe Sopranzi, 73, heard the news, he drowned himself in the Adriatic.

The correlation between unemployment and suicide has been observed since the 19th century. People looking for work are about twice as likely to end their lives as those who have jobs.

In the United States, the suicide rate, which had slowly risen since 2000, jumped during and after the 2007-9 recession. In a new book, we estimate that 4,750 “excess” suicides — that is, deaths above what pre-existing trends would predict — occurred from 2007 to 2010. Rates of such suicides were significantly greater in the states that experienced the greatest job losses. Deaths from suicide overtook deaths from car crashes in 2009.





Michael Pollan | The New York Times

I can tell you the exact date that I began to think of myself in the first-person plural — as a superorganism, that is, rather than a plain old individual human being. It happened on March 7. That’s when I opened my e-mail to find a huge, processor-choking file of charts and raw data from a laboratory located at the BioFrontiers Institute at the University of Colorado, Boulder. As part of a new citizen-science initiative called the American Gut project, the lab sequenced my microbiome — that is, the genes not of “me,” exactly, but of the several hundred microbial species with whom I share this body. These bacteria, which number around 100 trillion, are living (and dying) right now on the surface of my skin, on my tongue and deep in the coils of my intestines, where the largest contingent of them will be found, a pound or two of microbes together forming a vast, largely uncharted interior wilderness that scientists are just beginning to map.





Web Urbanist

These layered creations are surprisingly realistic, even in black and white, thanks in part to their scale, reinforced by their shadows, but also due to the ordinary nature of the sidewalk scenes being depicted.

Strøk (Anders Gjennestad) is a stencil artist and mural maker from Norway with works in various contexts, from city streets to suburban galleries.






James Bartolacci | Architizer

Building a new home in a forest can often place contemporary structures in contention with their environment. To blend in with the secluded natural surroundings, the home is defined by curvy lines and organic wood tiles. Rusty exterior cladding gives the home a weathered look.

But perhaps the most captivating features of the Wilkinson Residence are the circular windows that grace the walls of the living areas. Like the lens of a camera, these windows frame the views of treetops, placing the residents in direct dialogue with nature. Although the home is not a treehouse by definition, it definitely makes us feel like kids again!





Sammy Medina | Co.Design

Sited just outside of Austin, Texas, on a rehabilitated brownfield, the Edgeland House is embedded in a grassy clearing. An old oil pipeline cut through the site, which the architects removed in an attempt to “heal the land” of its past pestilence and “recreate the original prairie,” says co-principal Thomas Bercy. But rather than leveling the site and crowning it with an unimaginative single-family home, Bercy Chen proposed building the house in the void vacated by the pipeline. “Nestling the house in the excavation and covering it with a green roof completed the site remediation.”








Lindsey Zoladz | Pitchfork

Savages really show promise and range on the slow-burners. The moody dirge “Waiting for a Sign” and goth-cabaret closer “Marshall Dear” aren’t the most immediate songs on the record, but over repeated listens, they bloom. If Hassan and Faye Milton’s punishing rhythm section takes the helm on the more frantic numbers, Savages’ downtempo moments allow Gemma Thompson and her scuzzy Fender to shine. On the excellent “Strife,” she holds back as often as she strikes, underscoring Beth’s most brutal lines with perfectly timed jolts and filling the song’s winding corridors with thick plumes of distortion.

The mix allows each band member’s contribution to smolder with equal intensity and lends a palpable physicality to Savages’ sound. Milton handles her toms and bass drum like a boxer going at a punching bag; Hassan’s bass strings pulsate like throbbing tendons; Thompson’s guitar cuts with a goosebump-inducing tone that recalls a chainsaw; and Beth shrieks like she’s resetting her own bones. Combining in a constant pendulum swing between tension and release, it all provides the perfect atmosphere for the darkly sensual themes that Silence Yourself explores.



Design Boom


Richard Brody | The Front Row | The New Yorker

So it is with “Sun Ship: The Complete Session,” a newly released two-CD set of recordings by John Coltrane and his “classic quartet” of McCoy Tyner (piano), Jimmy Garrison (bass), and Elvin Jones (drums). It was made at New York’s RCA Victor Studios on August 26, 1965—that epochal group’s penultimate recording—and features multiple takes, plus inserts meant for splicing (as well as some studio banter), of the five numbers that the original “Sun Ship” album, released in 1971 (four years after Coltrane’s death at the age of forty), comprises. Tyner and Jones were gone from the band by the end of 1965 after a five-year run, and this album—especially in its complete unfurling—makes clear the divergent musical directions that they and Coltrane were taking. But, even more important, it highlights Coltrane’s tense and increasingly conflict-torn contention with his own musical heritage, style, and material.

That tension is evident as well in the original album release, but as important as the sheer quantitative addition to Coltrane’s discography (the complete session features alternate takes for four of the five tracks plus four potent solos recorded as inserts) is the shift in emphasis resulting from the chronological document of the session. The album opens with the furious title track, its wailing rapid-fire four-note theme repeated and broken down to three notes; then, after a long, swirling and percussive solo by Tyner, Coltrane enters with a vortex of obsessively involuted streaks of chordal fragments that yield to furious, sound-shredded shrieks and bellows that suggest the will to break through the stuff of harmonic investigation to sheer expressive sound, the swinging patterns of pounding rhythm to shifting biocentric undulations. It’s radical enough, ecstatically musical, and imbued with the spirit of the new thing—of so-called free jazz and, in particular, of the ideas and ways of the ne plus ultra sonic innovator on the tenor saxophone, Albert Ayler, whose playing owed nothing to the bebop and post-bop ways of Charlie Parker and Miles Davis and tore through the very framework of modern jazz to link it to primordial New Orleans and African traditions by way of Ayler’s self-made ecstatic spirituality.
