Before I get started, for any JK Rowling fans who missed it, The Ickabog has now been fully published online. Though it was written for "children between seven and nine," I found it a delightful read and highly recommend it.
----- 4 stars -----
Black Death, COVID, and Why We Keep Telling the Myth of a Renaissance Golden Age and Bad Middle Ages / Ex Urbe
This blog post (by a UChicago history professor) is a bit less polished than the typical 4-star article, but the content is exceptional:
As a Renaissance historian, I feel it’s my job to shoulder the other half of the load by talking about what the Renaissance was like, confirming that our Medievalists are right, it wasn’t a better time to live than the Middle Ages, and to talk about where the error comes from, why we think of the Renaissance as a golden age, and where we got the myth of the bad Middle Ages. [...] Why was life in the Renaissance so bad? [...] Let’s look at life expectancy: In Italy, average life expectancies in the solidly Medieval 1200s were 35-40, while by the year 1500 (definitely Renaissance) life expectancy in Italian city states had dropped to 18. Pause to think on that one a moment—by the time you turn 18, half the kids your age are dead. It’s striking how consistently, when I use these numbers live, the shocked and mournful silence is followed by a guy objecting: those numbers are deceptive, you’re including infant mortality—voiced as if this observation should discredit them. Yes, the average of 18 does include infant mortality, but the Medieval average of 35 includes it too, so the drop is just as real. [...] As for how an age so terrible to live through produced the masterpieces and innovations we still hold in awe, my ultrashort answer is that Renaissance art and culture was also a gradual ramp-up from ever-changing Medieval art and culture, and that the leaps we seem to see in the later period are the desperate measures of a desperate time. [...] After the Renaissance, in the period vaguely from 1700 to 1850, everyone in Europe agreed the Renaissance had been a golden age of art, music, and literature specifically. 
Any nation that wanted to be seen as powerful had to have a national gallery showing off Renaissance (mainly Italian) art treasures, and capital buildings with Renaissance neoclassical motifs, while an individual with elite ambitions had to know classicizing Latin, and a bit of Greek, and have opinions about Raphael, Titian, Petrarch, and the polyphonic motets of Lassus. Seriously: in the original Holmes stories, so 1850-1910, Doyle, after having Watson establish Holmes’s “Knowledge of literature—nil. Philosophy—nil.” still has Holmes carry a pocket Petrarch and write a monograph on the polyphonic motets of Lassus, because that’s what a smart, impressive person did in 1850. This also meant that Renaissance art treasures were protected and preserved more than Medieval ones—if you’re valorizing the Renaissance you’re usually criticizing the Middle Ages in contrast, so these generations learned to think of Renaissance art as good taste and the periods on both sides (Medieval and baroque) as bad taste, and a lot of great Medieval art was left to gather dust, or rot, or was even actively destroyed, since nothing invokes the Renaissance like sweeping away the “bad” medieval. As a result, the Renaissance became a self-fulfilling source base: go to a museum today and you see much more splendid Renaissance art than Medieval, leading to the natural conclusion that the Renaissance produced more art in general, but the Middle Ages did make splendid art, it’s just that later centuries didn’t preserve it as carefully, so less survives, and what survives is more likely to be in storage than in the main gallery.
----- 3 stars -----
The True Cost of Dollar Stores / New Yorker
You may protest, particularly in the first half, that this piece suffers from poor statistical reasoning (and I think to some extent you'd be right); however, there's a lot of worthwhile and high-quality reporting here that more than makes up for it:
Even the most image-conscious public corporations tend to acknowledge, in their required disclosures to investors and in their quarterly calls with market analysts, the challenges facing them. So it was startling to find no mention of the prevalence of crime and violence in recent filings for either Dollar General or Family Dollar and Dollar Tree. Company executives make occasional reference to “shrink,” the industry euphemism for stock lost mainly to shoplifting or employee theft. But the steady stream of violence at the stores, much of it directed against employees, was omitted. Dollar General emphasized its efforts to keep costs down. In its disclosures for the third quarter of 2019, Dollar General lamented the rise in nationwide hourly wages, and said that it was aiming to shift to self-checkout in many stores. The company hopes not to have to increase security at stores, since its “financial condition could be affected adversely” by doing so. “Our ability to pass along labor costs to our customers is constrained by our everyday low price model,” Dollar General concluded, “and we may not be able to offset such increased costs elsewhere in our business.” Similarly, Dollar Tree executives told analysts in a quarterly call in March that they were pushing “productivity initiatives” in stores, which would help get more from fewer workers. “We are well positioned in the most attractive sector of retail to deliver continued growth and increase value for our shareholders,” Gary Philbin, the company’s C.E.O., said. In the past five years, the share price of Dollar General has nearly tripled, outpacing the broader stock market by some eighty per cent and vastly outperforming traditional grocery stores and retailers such as Kroger and Macy’s. In 2018, Vasos, Dollar General’s C.E.O., received more than ten million dollars in total compensation, nearly eight hundred times the median pay for workers at the company. Philbin, at Dollar Tree, was paid about the same amount. 
Asked about the hundreds of incidents of violent crime at their stores, the companies said that they took security concerns seriously, but they did not elaborate on preventive measures at the stores. [...] Frustration was rising at City Hall, too. When Mayor Whaley entered city government, in 2005, she viewed the dollar chains as serving a useful purpose, but over time she saw how the chains’ stores in urban neighborhoods contrasted with the ones in rural areas. Residents often sent her photos of dangerously cluttered aisles, and she asked fire marshals to issue warnings. “The more and more ubiquitous they’ve gotten, they’ve gotten less and less caring,” she said. “I came to see them as glorified check-cashing and payday lenders, for the way they prey off the poor but don’t really care about the poor.”
Life hasn’t yet returned to normal in America’s smallest state, but it’s at least no longer crazy to think about life returning to normal, because coronavirus deaths, hospitalizations and infections have been plummeting since April. Rhode Island is leading the nation in testing, with nearly a quarter of its population tested so far, and its rate of positive tests has dropped from over 18 percent to under 2 percent. While a national debate rages over school schedules, weighing concerns about education and convenience against concerns about safety, Governor Gina Raimondo has already announced that classrooms will reopen this fall—not because parents have no other child care options, or because President Donald Trump is insisting there’s nothing to worry about, but because she’s confident Rhode Island can do it safely. It is a testament to the viciousness of the virus that Rhode Island can be seen as a success story despite the deaths of nearly 1,000 of its citizens, or nearly 0.1 percent of its population. It was a uniquely vulnerable state, featuring the nation’s second-densest and ninth-oldest population, nestled between the pandemic hot spots of New York City and Boston. Rhode Island was hit early and hard by the outbreak. But now, while COVID-19 is still winning in much of America, with infections on the rise and outbreaks in states like Florida, Texas and Arizona forcing reversals of some efforts to reopen economies, Raimondo and her public health team have beaten back the pandemic in Rhode Island. It is now one of only four U.S. states rated a “low” risk level by the Covid Act Now project. So how did Rhode Island do it? The short answer is Raimondo copied the playbook of nations like South Korea and New Zealand that have fared much better than the United States in battling the virus—intensive testing, tracing and isolation plus wear-your-damn-mask policy and messaging—while adding innovative twists through uniquely American public-private partnerships.
The period from the 1870s to the start of the First World War saw a steep rise in working-class living standards in Britain, much of it underpinned by a vast array of cheap imported foods. Thanks to new refrigerated steamships and a growing railway network, such items as butter, eggs and meat could be transported from as far afield as New Zealand and Argentina. The British started to eat butter from Denmark; oranges and grapes from Spain; mutton from Argentina; bacon and cheese from the United States; wheat from Canada. The percentage of meat consumed in Britain that was imported rose from 13.6 per cent in 1872 to 42.3 per cent in 1912. The influx of these new cheap food imports gave many in the working classes a much more varied and tasty diet than before. Eggs were no longer a luxury and as the price of imported fruit fell, many in the cities started eating oranges and bananas for the first time. They could only afford to buy these foods because the costers who sold them kept the prices too low to allow themselves a decent life. By the same token, big shopkeepers kept food prices down by forcing employees to work long hours for low pay. A ninety-hour week was not uncommon for a clerk in a Victorian grocer’s shop, but these hours still might not deliver a wage large enough to live on, despite the cheapness of food. [...] Cheap food is one thing; cheapened food is another. When a Victorian costermonger could not earn enough money to buy fresh food, he or she might starve. When a modern-day worker doesn’t earn enough for fresh food, he or she eats ultra-processed food instead. [...] Hunger has been a bestseller in Spain, and it is not hard to see why. It is one of the most unsettling books I have ever read. In clear, direct and angry prose, Caparrós asks the upsetting questions that most of us spend our days avoiding.
The question he keeps coming back to, in a series of interludes between chapters, is “How the hell do we manage to live knowing these things are happening?” By “these things” he means the fact that in a world where there is more than enough food to go around, there are still 800 million people who experience absolute hunger every day and 2 billion people, many of them overweight, who suffer from food insecurity. His subtitle refers to hunger as the world’s “oldest problem”, yet it is also a new and more brutal scourge in the sense that “for centuries, there was no solution to famine”; today, “feeding the hungry depends only on will”.
The Theory of History That Guides Xi Jinping / Palladium
Tanner Greer is fast becoming one of my favourite analysts; some excellent insights here:
Does it matter that Xi Jinping believes that there is a telos to the times? It is possible this is all just talk. If you cannot convince the world to like you, you can at least convince the world that you are inevitable. And for a General Secretary worried about the flagging “faith” of Party cadres, asserting that the Party has mastered the laws of history has ideological allure. But dismissing all of this as mere rhetoric is hard to square with the settings in which Xi appeals to “the pulse of times.” One does not call every ambassador to Beijing just to bore them with the latest propaganda hacks. Xi calls these meetings because he has an exact idea of how he wants his diplomats, bureaucrats, and generals to do their job. Addresses like these are less like stump speeches on the campaign trail than they are like instruction manuals. [...] Xi Jinping has called his solution to this problem “the path of peaceful development.” Neither the phrase nor the strategy it evokes is new to the Xi era—indeed, every instance Xi used the phrase in the 2010s was a subtle reminder that China had not gone to war in forty years. But “the path of peaceful development” is not just a decision to avoid war with foreign powers. It is a quest to build up Chinese power and reshape the global order through “interconnected” or “win-win” development. This formula implicitly rejects the revolutionary agitation of China’s Maoist days and sets itself in opposition to America’s military interventions in the Middle East. But why has the Party decided that military tools are of limited use in bringing about their preferred world order? [...] As that last item suggests, the Party’s commitment to a strategy of peaceful development is not a commitment to abandon coercion. [...] What we do know: China’s commitment to peaceful development rests on its leader’s belief that globalization and economic integration is an irrevocable historical law. 
In days of depression and pandemic this is an unsettling thought.
Over the past two decades, inequality in Latin America had fallen to the lowest point in its recorded history. The pandemic threatens to reverse that. We traveled 1,000 miles across Colombia to document this critical moment. [...] By the time the pandemic hit, top earners made an average 22 times what the poorest made. So while inequality clung to the region, it had fallen to a record low, he said. Now, the pandemic could push poverty and inequality back to what they were at the turn of the 21st century in Colombia, according to an analysis by professors at the Universidad de los Andes. “A setback of two decades,” they called it. Economists are predicting similar regressions around the region, with the World Bank warning that more than 50 million people in Latin America and the Caribbean could fall into poverty this year alone. “The current crisis is probably the biggest threat to inequality that we have experienced,” Mr. Busso said. In Medellín, we watched hundreds of single mothers line up outside a food bank that expanded significantly when the crisis began. One woman, María Camila Salazar, 22, said her mother, María Eugenia Carvalho, 53, had become so dangerously malnourished that her thin shoulders now jutted from her frame. “We go to bed without eating, without giving anything to our children,” she said.
On June 22nd, visitors to Slate Star Codex, a long-standing blog of considerable influence, discovered that the site’s cerulean banner and graying WordPress design scheme had been superseded by a barren white layout. In the place of its usual catalogue of several million words of fiction, book reviews, essays, and miscellanea, as well as at least as voluminous an archive of reader commentary, was a single post of atypical brevity. “So,” it began, “I kind of deleted the blog. Sorry. Here’s my explanation.” The farewell post was attributed, like virtually all of the blog’s entries since its inception, in 2013, to Scott Alexander, the pseudonym of a Bay Area psychiatrist—the title “Slate Star Codex” is an imperfect anagram of the alias—and it put forth a rationale for this online self-immolation. [...] Alexander’s appeals to the reporter’s conscience had been ineffective. “He said that it was New York Times policy to include real names, and he couldn’t change that,” he wrote. “After considering my options, I decided on the one you see now. If there’s no blog, there’s no story.” Alexander had taken his own work, and his site as a community gathering place, temporarily hostage in the hope that the Times would either cancel the story or permit the use of his pseudonym. [...] Alexander’s appeal elicited an instant reaction from members of the local intelligentsia in Silicon Valley and its satellite principalities. Within a few days, a petition collected more than six thousand signatories, including the cognitive psychologist Steven Pinker, the economist Tyler Cowen, the social psychologist Jonathan Haidt, the cryptocurrency oracle Vitalik Buterin, the quantum physicist David Deutsch, the philosopher Peter Singer, and the OpenAI C.E.O. Sam Altman. Much of the support Alexander received was motivated simply by a love for his writing. 
The blogger Scott Aaronson, a professor of computer science at the University of Texas at Austin, wrote, “In my view, for SSC to be permanently deleted would be an intellectual loss on the scale of, let’s say, John Stuart Mill or Mark Twain burning their collected works.”
Astrology, private equity, a $1.1 billion gender discrimination lawsuit, and a precariously built bangle behemoth
President Trump’s obstructions of justice were broader than those of Richard Nixon or Bill Clinton, and the special counsel’s investigation proved it. How come the report didn’t say so?
My sex life as a fat woman was a trickle of accumulated humiliations and loneliness, so I decided to try enjoying my own company instead. [...] This moment — a handsome man kissing my bare back, his lips and his hands moving with deft tenderness over the rolls of fat that have anguished me since I was a teenager — is the moment I’ve been dreaming of since I first started splashing and floundering in the dating pool. Finally, after years of being the girl who rarely gets a swipe right, the ghost in a low-cut black dress who will remain alone at the end of the bar unless she settles for some crude 2 a.m. assignation, I’m enjoying the kind of intimacy my thinner friends have long bragged about. I can stop sucking in my stomach and holding my breath: For once, I’ll have an answer to “what’s new and exciting?” that isn’t, “Oh, you know, work.” Finally, a partner who tells me, in honeyed word and sweet deed, that he digs my body as is — without, mercifully, using the phrase “big girls.” He’s the first hookup I’ve really liked in a long time — or, possibly, ever. He gives good email and he knows how to touch me. Once we’ve finished, and I’m lying with my head on his chest, inhaling the gentle musk of his dried sweat, I feel bold. I ask if he’d like to go out on a proper date to a museum. Out of the gauzy dark of the bedroom and into the light of day. Suddenly, his face assumes an apologetic tension; I’ve seen this look on so many men’s faces over the years: “Cool, cool,” he says, in a tone that betrays that it is not cool, cool. Then he adds that he just wants me to know that he doesn’t hold hands. “But it’s not because of, I mean, you know.” I do know. I know exactly why he won’t hold my hand in public, why this night went from an electric hum of potential to a dull, familiar drone of humiliation. I know that I am in my early thirties and I’m too tired to play the cool, cool girl. 
The girl who will still hook up with him, if only to say that she’s hooking up — because if she’s hooking up, well, then, somebody wants her, and if somebody wants her, then she’s not too weird or too ugly; she’s not alone. But I am lonely. I’m in this man’s arms and I’m lonelier than I’ve ever been. Once he’s gone — never to email or DM again, let alone take a walk outside — I decide that I’ll take a year away from dating, from thinking about sex. I delete all my apps. I stop drinking, going to bars. That year soon becomes two years. Two years slurs gently, almost imperceptibly, into five years. Five years yawns into nearly 10 years.
Heyerdahl challenged the scientific community’s view that evidence pointed instead to the peopling of Polynesia by people travelling east from Asia, and his idea that Polynesia was initially populated by South Americans was generally criticized by scholars. The same scientific community nevertheless discussed cultural contacts between the two regions, because a South American plant, the sweet potato, has a long history of cultivation in eastern Polynesia. The idea that Polynesians voyaged to South America and introduced the plant on their return to Polynesia became the accepted explanation for this. [...] When Ioannidis and colleagues searched for similarities between the genetic signatures of Native South Americans found in Polynesia and those of Indigenous populations in northern coastal areas of South America, the connection to Colombian populations was especially strong. The earliest genetic signal of Native South Americans found by the authors in Polynesia was from people of the Southern Marquesas islands, and the authors argue that Colombians mixed with Polynesians there around AD 1150. This date is so early that it could even suggest that South Americans reached the islands before Polynesians arrived, which would make Heyerdahl partly right, if South Americans were indeed the first to settle at least the area of eastern Polynesia that shows signs of early admixture.
In recent weeks President Trump has railed against tearing down statues across the country — and has been particularly dogged in his defense of Confederate monuments. But his argument that they are benign symbols of America’s past is misleading. An overwhelming majority of Confederate memorials weren’t erected in the years directly following the Civil War. Instead, most were put up decades later. Nor were they built just to commemorate fallen generals and soldiers; they were installed as symbols of white supremacy during periods of U.S. history when Black Americans’ civil rights were aggressively under attack.
For the past few weeks, I have been obsessed with a mystery emerging in the national COVID-19 data. Cases have soared to terrifying levels since June. Yesterday, the U.S. had 62,000 confirmed cases, an all-time high—and about five times more than the entire continent of Europe. Several U.S. states, including Arizona and Florida, currently have more confirmed cases per capita than any other country in the world. But average daily deaths are down 75 percent from their April peak. Despite higher death counts on Tuesday and Wednesday, the weekly average has largely plateaued in the past two weeks. The gap between spiking cases and falling-then-flatlining deaths has become the latest partisan flashpoint.
In practical terms, this meant that most communities were siloed off from each other. While a very few people with national prominence might be blogging to the masses, most people were blogging and commenting for a more select community. People thought of themselves as part of a select community. There were many incentives for treating each other as community members should. Because these communities were relatively small and contained, everybody involved was soon familiar with everybody else. There was nothing to be gained but much to be lost by turning disagreements personal. Bloggers and forum moderators had full control over their own sites, and did not shy from banning and censoring people they believed were poisoning the conversation. Each community had its norms, and people were happy to shun those who did not follow them. This leads to the second big difference between the internet of the aughts and the internet of the 2010s: the standards for participation were different—in some ways the barrier to entry was both higher and lower than on Twitter. In the old days people used to say "if you don't like it, make your own blog!" That directive was easy to follow. It is nearly impossible for someone de-platformed from Twitter to create some new Twitter to replace it; in contrast, anybody really could create their own blog (and forums were not hard to stand up either).
I reached out to nine of my former allies and rivals who still consult for Republican candidates at the highest levels of Senate and House races, some who have gone full MAGA and others for whom the president is not their cup of tea. I asked them to speak candidly, without their names attached, to learn about the real behind-the-scenes conversations about the state of affairs. [...] What I found in their answers was one part Stockholm Syndrome, one part survival instinct. They all may not love the president, but most share his loathing for his enemies on the left, in the media, and the apostate Never Trump Republicans with a passion that engenders an alliance with the president, if not a kinship. [...] As one put it: “There are two options, you can be on this hell ship or you can be in the water drowning.”
What happens when an African grey parrot goes head-to-head with 21 Harvard students in a test measuring a type of visual memory? Put simply: The parrot moves to the head of the class.
Kanye West’s Fourth of July declaration, via Tweet, that he was running for president lit the internet on fire, even as pundits were trying to discern how serious he was. Over the course of four rambling hours of interviews on Tuesday, the billionaire rapper turned sneaker mogul revealed: That he’s running for president in 2020 under a new banner—the Birthday Party—with guidance from Elon Musk and an obscure vice presidential candidate he’s already chosen. “Like anything I’ve ever done in my life,” says West, “I’m doing to win.” That he no longer supports President Trump. “I am taking the red hat off, with this interview.” That he’s okay with siphoning off Black votes from the Democratic nominee, thus helping Trump. “I’m not denying it, I just told you. To say that the Black vote is Democratic is a form of racism and white supremacy.” That he’s never voted in his life.
Paris-based writer and photographer Carrie Solomon set out to document the contents of the refrigerators of some of the world’s most renowned toques. Co-written with Adrian Moore, Chefs’ Fridges: More Than 35 World-Renowned Cooks Reveal What They Eat at Home is the end result of that mission.
A study finds that a significant share of partisans will support a re-run if they don’t get their way
Hydrogen and helium are plentiful in the Sun, and for other elements it is probably cheaper to transport mass to the Sun than to transport energy away from it. Probably mostly from Mercury for a long while. Some say computers are more efficient when run at low temperatures, but I don’t see that. So it seems to me that once our descendants go beyond merely clumping around Earth to be near activity there, the main place they will want to go is near the Sun.
When the Fuji-Q Highland theme park reopened on 1 June after a three-month closure due to the pandemic, it asked visitors to follow the recommendations of the amusement park association and not to shout or scream. Some customers complained it was impossible to stay quiet on rides, particularly the two-kilometre-long Fujiyama rollercoaster, which reaches speeds of 130km/h and drops 70 metres at one point. Named after nearby Mount Fuji, the rollercoaster was the fastest and tallest in the world when it opened in 1996. In response, the park released a video of two stony-faced senior executives riding Fujiyama without uttering a peep, urging visitors to imitate them and “Keep your screams inside.”