When I heard that Jan Berenstain, the co-creator of the Berenstain Bears, had passed away, I thought I should abstain from comment. As some of you know I'm not a fan of the Berenstain Bears--I've accused them of encouraging false consciousness. But Hanna Rosin marked the occasion by offering an in-depth critique:

I have loved many a midcentury book starring the retrograde housewife. Most great Dr. Seuss books were written around the same time. The Frances books are some of my favorites, and Mother in that book never changes out of her apron. And I can read Richard Scarry all day. But usually you need humor to soften the blow. Stan and Jan, sadly, were allergic to humor. This is the only thing I hold against Theodor Geisel, aka Dr. Seuss. He was the one who apparently approved the Berenstain Bears books and yet he never pushed them to write one funny line. Papa Bear, for example, is a bumbling oaf. The usual plotline in the books involves Papa trying to fix some problem but screwing it up, so that Mama has to swoop in and save the day. In defter hands, Papa could have been a prototype Homer Simpson. But in these books he just bangs on the table and shouts things like “pinheaded fiddlebrain!”

She notes that Charles Krauthammer of the Washington Post is also a Berenstain critic. "I hate the Berenstain Bears," he once wrote. Hate is too strong a word--the Berenstain Bears mean well--but critics don't need to give people a free pass just because they're writing for kids; children's art and literature can make an impact on people, as all of this cryptozoological literary analysis suggests. Rosin was pretty glib about Berenstain's death, but her critique of the books is oddly timed rather than vicious. As for Mrs Berenstain, she clearly helped create a world that a lot of people enjoyed, even if it frustrated others. There's room for all kinds.
 
 
Ahem:

It was Mary's own fault that she died. Elizabeth had to protect her position and Mary did have support in her belief that she should be queen. Elizabeth gave her ample opportunity to renounce her claim on the throne. However, Elizabeth had experienced imprisonment in the Tower of London and living with the threat of execution (from her own sister Mary) and she did not want to have Mary to have to die.
 
 
The Washington Post has an excerpt from Craig Timberg and Daniel Halperin's forthcoming book Tinderbox: How the West Sparked the AIDS Epidemic and How the World Can Finally Overcome It:

We now know where the epidemic began: a small patch of dense forest in southeastern Cameroon. We know when: within a couple of decades on either side of 1900. We have a good idea of how: A hunter caught an infected chimpanzee for food, allowing the virus to pass from the chimp’s blood into the hunter’s body, probably through a cut during butchering.

As to the why, here is where the story gets even more fascinating, and terrible.

Timberg and Halperin explain that the hunter was infected around 1900, during Africa's colonial era--a propitious moment for transmission, as thousands of people were trundling through previously isolated areas of West Africa trying to open and secure trade lines. From Cameroon, they explain, the strain in question made its way to Kinshasa, where it exploded:

Most of this colonial world didn’t have enough potential victims for such a fragile virus to start a major epidemic. HIV is harder to transmit than many other infections...

So the improbable journey of the killer strain of HIV was feasible for only a few hectic decades, from the 1880s to the 1920s. Without “The Scramble for Africa,” it’s hard to see how HIV could have made it out of southeastern Cameroon to eventually kill tens of millions of people. Even a delay might have caused the killer strain of HIV to die a lonely death deep in the forest.

But as it happened, "the West forced its will on an unfamiliar land, causing the essential ingredients of the AIDS epidemic to combine." This is fascinating as history and as a scientific detective story. I'm not sure what to make of the implied comment on colonialism. As the authors explain, it's tragic that this crucial early infection happened at a time of expanded interaction; if not for that, the HIV strain in question might have wreaked its havoc on a handful of people rather than tens of millions around the world.

But we know that interactions between previously separate groups can have catastrophic consequences as well as productive ones. Spain devastated the Aztecs with smallpox, for example, and the Mongol armies brought gunpowder to the West. It's no defense of colonialism to say that with regard to transmission (of disease, technology, or ideas) the behavior matters more than the motive. That seems like a salient point here, because while people now reject colonialism, some of its constituent activities--global trade and travel--have obviously only increased. For more on the relationship between economic interchange and the rate of HIV infection, see this TED talk from economist Emily Oster--more exports mean more AIDS, as she puts it (starting around minute 10).
 
 
This is predictable, but I really like the BBC series Sherlock, which just finished its second season over there. As Michael Dirda explains in the New York Review of Books, Arthur Conan Doyle's Sherlock Holmes stories have always had their devotees, but the past decade has brought a lot of adaptations, explorations, and expansions. (I would add to his list House, Michael Chabon's novella The Final Solution, and Julian Barnes's Arthur and George as works in the shadows of Sherlock.) Some of these aren't very inspired. The recent movie version (with Robert Downey Jr.) was pretty tedious.

The BBC series, however, is really fun. Benedict Cumberbatch is great in the title role; he brings a bit of boyishness that we don't usually get from the Holmes character. Martin Freeman is very good as Watson. Once I finish the dowager countess master class in bitchface, I'm going to practice his characteristic I'm-exasperated-at-circumstances-and-myself-as-much-as-you expression. The series is also really beautiful--not a Benedict Cumberbatch joke--which is an underrated quality in a detective series. I was watching an Inspector Morse the other night and the greatest mystery was what could possibly be happening on screen. This series has rich color and texture, and a good eye for design; 221B is my favorite bachelor flat on film since Count Almásy's.
As a fan of the stories, although not a very diligent one, I find it interesting to contemplate how the series departs from them. Conan Doyle, as I recall, shows Holmes to be a lot more affable than this series does. In the stories, Holmes is aware that his mind outpaces those of the people around him, but his attitude is relatively tolerant. "You see I have a lot of special knowledge which I apply to the problem, and which facilitates matters wonderfully," he explains to Watson soon after they meet. "Those rules of deduction laid down in that article which aroused your scorn are invaluable to me in practical work. Observation with me is second nature." And Holmes is comparatively solicitous of other people--a bit wry about their problems, but not particularly vicious. The overall impression is eccentric but not offensive.

The TV Sherlock, by contrast, makes a point of offending. He thinks poorly of most people, and often volunteers that to them. Even when he's dealing with someone who's clearly in distress, he scolds them for boring him. It's suggested that he doesn't understand other people's emotional reactions; most of his interactions with the shy mortician played by Louise Brealey end with her on the verge of tears, even though Sherlock theoretically has no problem with her. At one point, when Watson scolds him about it--Sherlock having just harshed her by explaining that her new boyfriend is a confirmed bachelor--Sherlock protests that he was trying to be kind. On other occasions, though, he's clearly able to anticipate people's feelings; he manages this whenever he wants someone to tell him or give him something. On the evidence, he's less a psychopath than just kind of a dick. There are, however, some indications in the second series that his attitude is thawing.

Despite this, the character is clearly Sherlock Holmes. This suggests that the core of the character has little to do with his disposition. The essential Sherlock Holmes quality, across the successful iterations, is his empirical stance. His devotion is to science, reason, data, logic, deduction, induction. From that it follows that he'll have little use for institutions, with all their pathologies; little facility with convention; and little insight into human foibles like love.

Whether that yields an avuncular Sherlock or an aggressive one may be a function of the era. Interestingly, Hugh Laurie's House character (inspired by Holmes) is also unpleasant. It might be all these months I've spent writing about the Republican primary, but I wonder if this is because people don't like overtly clever people these days. In Conan Doyle's day, people either admired Holmes or dismissed him as a harmless oddball. Today, we have a handful of people who admire Sherlock, another group who apparently respect him but seem to dislike him personally, and a third group who are openly hostile. They may, of course, be hostile because he's a dick. But there's probably also a bit of a self-reinforcing feedback loop at work. In Victorian England, science could still be seen as a harmless pastime. Norms were established by the church, by tradition, by convention, or even by philosophy. All of those are to some extent based on things unseen, and over the intervening century their authority has been eroded. Scientists, empiricists, technocrats, and super-logical types have become more powerful, and their epistemological stance sometimes makes people uneasy. Their rules imply that conclusions that aren't falsifiable can't be accepted as true. A modern Sherlock Holmes would surely be controversial, even if his manners were better.
 
 
My pick for best-dressed last night: Gwyneth Paltrow in Tom Ford. What is this dress? Is it intelligent or innocent, classic or futuristic, inviting or aloof, lush or austere? Or somehow, improbably, all of the above?

I also liked Shailene Woodley's long-sleeved 70s look. Although the bow at Emma Stone's neck was kind of ridiculous, it turned out to be a droll frame for her satirical presentation speech; she looked tall, mischievous, and clever.

My least favorite dress was Melissa McCarthy's. She has really pretty coloring--rich hair, greenish eyes--and the color, landing somewhere between pink and camel, didn't do anything to bring that out. Anna Faris is adorable and funny, but her dress was drowning her. I didn't like Angelina Jolie's dress. Setting aside the leg (as she did), it was way too bunchy. It was like she had a whole curtain around her waist. Or like when you buy a floor lamp and you have to scrunch the plastic sleeve to guide the pieces of the pole into place.

The best Oscar commentary, meanwhile, comes from Anthony Lane:

Over the years, I have come to prefer the gibberish of these experts, lightly powdered with panic, to the coarse-ground rhetoric that prevails inside the auditorium. When one of the resident style queens, epistemologists to their nail lacquer, gazed at Tina Fey and said, in tones of unfeigned awe, “I’ve never seen this hair on her,” we were vouchsafed a genuine insight into the unreliable surface of reality, as it shimmers on Pacific shores.
 
 
Some more new things this week:

In the print edition, a story on efforts to boost job creation by lending a helping hand to start-ups.

Also in the print edition, a story asking whether Mormonism has a Mitt Romney problem.

And at DiA, a post inspired by Rick Santorum, about the roots of social conservatism in America.
 
 
Just as the world of sports has recently been subsumed by Linsanity, the tech world has, over the past few weeks, been gripped by Pinsanity--a sudden pash for Pinterest, an image-sharing social media site that allows users to create pinboards, or collections of images organised around themes of their choosing. As discussed in last week's Babbage podcast, the site gets more than 10m unique visitors a month, making it one of the fastest-growing sites in history, if not the fastest. It's also become a major driver of referral traffic, behind Facebook and Tumblr. It's been beribboned with awards, and attracted some $37m in venture funding last year.

Notably, Pinterest is not actually that new. Millions of women--the vast majority of its users are women--have been using it for months. The fact that it had a low profile until this month may be an idiosyncrasy that explains itself; as I wrote last year, social media has some hidden gender biases. Another explanation would be that industry observers are simply overwhelmed by social media startups. "Once you’re on Facebook and Twitter and Foursquare and Google Plus and Tumblr and LinkedIn and Instagram and Reddit and Path — when, exactly, do you have time left over for a life?" asks David Pogue, in a generally positive review at the New York Times. A third explanation would be that people haven't been taking Pinterest seriously because it doesn't seem like a serious business. As the Wall Street Journal explains, although at least one investor informally valued the site at about $200m, its plan for making money is not clear. "Pinterest's monetization strategy isn't in the oven and it's not even off the baking table," said one board member, earning a tetchy rebuke from Alexis Madrigal at The Atlantic, who points out that Pinterest could potentially make an awful lot of money by taking a cut of any sales inspired by its toothsome pinboards.

In any case, Pinterest is here, and it is in a good position, being both innovative and useful. As with most social media sites, Pinterest has echoes of its antecedents. It encourages users to connect on the basis of common interests, like Facebook; like Twitter or Tumblr, it encourages people to pass along each other's thoughts, in this case with a one-click 'repinning' feature. But Pinterest is interesting because of the ways that it builds on those earlier experiments, and breaks from previous assumptions about online behaviour. 

One of Pinterest's innovations is that it is forward-looking. Social media typically has a performative dimension, as indeed does most social behaviour. Often, however, the positioning is based on past experience: this is where I went to school, these are my friends, this is what I said about it at the time. Facebook's new Timeline feature, for example, reinforces that site's tendency to act as an autobiography or social CV. Pinterest, by contrast, is largely uninterested in what users have done. It's geared toward what they are interested in and what they might do. The suggestions that the site offers about what it can be used for--redecorate your home, perhaps, or plan a wedding--emphasise plans, goals, daydreams. A related point is that Pinterest dispenses with concerns about privacy and sharing. There is no privacy on Pinterest: anyone can look at anyone else's pinboards. But there is arguably no need for any, since you can upload your own photos but most of the content is simply pinned from other sites.

The greatest difference between Pinterest and other social media sites is that most of its content is visual. This is unusual because, although it's clearly possible to communicate without words, people typically mediate their thoughts through language, especially if they're trying to explain themselves to other people. Pinterest doesn't bother. While you can describe what a pinboard is for, the images you've posted are more important. That may, actually, be a minor reason that Pinterest has been far more popular with women than with men. It's not that women are more visual than men; in some contexts, neurologists say the opposite is true. But women probably have more experience with ad-hoc semiotics. Fashion spreads are often structured as mood boards, for example; another big site that you rarely hear people talk about is Polyvore, which allows people to put together outfit ideas and solicit feedback from other visitors. To give another example, women's magazines sometimes advise readers to keep a "vision board" as a way to unearth subconscious feelings or aspirations.

The more significant reason for the gender imbalance on Pinterest, however, probably has to do with the fact that women control the majority of the world's consumer spending. So if Pinterest is presenting itself as a way to help you plan a party, or to bookmark cooking ideas you want to try, or to highlight looks you liked from London Fashion Week, you would expect it to be more popular with women than with men. Women make more of those decisions than men do. That being the case, Pinterest is actually offering quite a useful service, and one that hasn't really been done before. If its founders can figure out the business side of the operation, the site should be around for a while.
 
 
Funny:

But what, I ask, about your great-great-great-grandchildren? What do they get? How can our laws be so heartless as to deny them the benefit of your hard work in the name of some do-gooding concept as the "public good", simply because they were born a mere century and a half after the book was written? After all, when you wrote your book, it sprung from your mind fully-formed, without requiring any inspiration from other creative works – you owe nothing at all to the public. And what would the public do with your book, even if they had it? Most likely, they'd just make it worse.
 
 
Philosopher Ruth Marcus (Ruth Barcan Marcus, not the Ruth Marcus who writes for the Washington Post) has died. A few things related to Marcus that are worth reading:

"A Philosopher's Calling," by Ruth Barcan Marcus.
Brian Leiter links to Marcus's 2010 Dewey Lecture, which she approached as an intellectual autobiography. In addition to briefly describing some of her work, she gives some sense of her personal history and what it was like to be a woman in the philosophy department: "Yale had a philosophy club open to undergraduate and graduate students. I was elected president but then received a letter from the chair of the department suggesting that I decline. The reasons given were that Yale was predominantly and historically a male institution and that my election may have been a courtesy. Also, the club's executive committee met at Mory’s which was closed to women. I did not respond to the letter and did not decline. It was, to me, obviously unreasonable."

"Actualism," Stanford Encyclopedia of Philosophy.
Marcus is best known for her Barcan Formula (Barcan being her maiden name), which holds, roughly, that whatever possibly exists actually exists. (Or as she put it in the Dewey lecture: "There is, on my account, no inflated metaphysics of possible worlds, except as a façon de parler. Possibility is about the way the actual world might be.") The Stanford Encyclopedia of Philosophy gives a primer on actualism and possibilism.
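For the notation-minded, the formula itself is compact. Here is one standard modern rendering, in LaTeX (a sketch in contemporary symbols, not Marcus's original 1946 formulation, which was stated using strict implication):

% The Barcan Formula (BF): if it is possible that something is F,
% then there (actually) exists something that is possibly F.
\[ \Diamond \exists x\, Fx \;\rightarrow\; \exists x\, \Diamond Fx \]

Read aloud: if it is possible that something is F, then something that actually exists is possibly F--which is how mere possibility comes to entail actual existence.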

"Whose Idea Is It, Anyway?: A Philosopher's Feud" by Jim Holt in Lingua Franca (via Graeme, @gcaw).
In 1994 philosopher Quentin Smith argued that Saul Kripke's New Theory of Reference actually draws heavily on Marcus's early work and should be credited accordingly. Within the academy, this was considered an explosive and borderline libelous accusation. Holt's article is an amusing exposition of a philosophical controversy and--despite the fact that the central dispute concerns the origin of several fairly esoteric ideas--it helps put Marcus's work in context with that of Kripke and other logicians.

"Moral Dilemmas and Consistency" (PDF) by Ruth Marcus, the Journal of Philosophy.
In later years Marcus focused on moral philosophy (it's not a huge departure from her previous work if you consider that a lot of moral questions have to do with how things are and how things might be). This paper about moral dilemmas is thoughtful and tolerant. She argues that moral dilemmas are real and difficult, and we shouldn't try to wriggle out of them with sophistry: "For dilemmas, when they occur, are data of a kind. They are to be taken into account in the future conduct of our lives." She also advises an ecumenical approach with regard to their resolution: "Not all questions of value are moral questions, and it may be that not all moral dilemmas are resolvable by principles for which moral justification can be given."