Update: This article has been rewritten to take advantage of a new feature in Bigfoot 2.0.3.
This website uses a script called Bigfoot to make fancy pop-up footnotes. It also uses MathJax to render attractive mathematical expressions.
If you try to include math markup in a Bigfoot footnote, though, then the math will just disappear; you need to do some extra configuration to get the two scripts to play nicely. Here’s what I came up with.
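In brief, a sketch of one way to do it: Bigfoot 2.0.3 added an `activateCallback` option that fires whenever a footnote popover opens, and that callback can ask MathJax to typeset the popover's contents. This sketch assumes MathJax 2's `MathJax.Hub.Queue` API and jQuery (which Bigfoot requires anyway); adapt it to your own setup:

```javascript
$(document).ready(function() {
  bigfoot({
    // Re-run MathJax on each footnote popover as it is activated,
    // so that math markup inside the footnote actually gets rendered
    // instead of disappearing.
    activateCallback: function($popover, $button) {
      MathJax.Hub.Queue(["Typeset", MathJax.Hub, $popover.get(0)]);
    }
  });
});
```

Queueing the typeset call (rather than invoking it directly) lets MathJax defer the work until its own machinery has finished loading.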
Version 6.0 of the Unicode standard, released in October 2010, added support for emoji. Aside from the classics like 😃 (SMILING FACE WITH OPEN MOUTH), 👍 (THUMBS UP SIGN), and 💩 (PILE OF POO),1 the standard also included several national flags like these:
🇺🇸 🇩🇪 🇬🇧 🇯🇵 🇮🇹
In fact, the standard supports every national flag, in a way that won't require any changes to the standard when new countries come into being. How did the Unicode Consortium pull this off?
What they did is both crazy and genius. Instead of assigning a codepoint to each flag, which is the obvious way to do it (and the way the rest of the emoji are encoded), the standard defines twenty-six “regional indicator symbols”, from U+1F1E6 REGIONAL INDICATOR SYMBOL LETTER A to U+1F1FF REGIONAL INDICATOR SYMBOL LETTER Z. In order to include a country’s flag in your text, you first look up the country’s two-letter ISO 3166-1 code and then write the two regional indicator symbols corresponding to those letters. A font with support for that flag treats the two-codepoint sequence as a ligature, replacing the combination with a single pictogram.
Let’s take the United States as an example. Its ISO 3166-1 two-letter code is “us”, so we need to use the codepoints U+1F1FA REGIONAL INDICATOR SYMBOL LETTER U and U+1F1F8 REGIONAL INDICATOR SYMBOL LETTER S. Combining these gives a symbol that renders in your browser as 🇺🇸.
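The mapping is mechanical enough to sketch in a few lines of JavaScript (the function name is mine; the offset arithmetic is the standard's):

```javascript
// Build a flag emoji from a two-letter ISO 3166-1 code by mapping each
// letter onto the corresponding regional indicator symbol.
function flagEmoji(isoCode) {
  // U+1F1E6 is REGIONAL INDICATOR SYMBOL LETTER A, so each ASCII capital
  // letter shifts up by the same fixed offset.
  const OFFSET = 0x1F1E6 - "A".codePointAt(0);
  return Array.from(isoCode.toUpperCase(), ch =>
    String.fromCodePoint(ch.codePointAt(0) + OFFSET)
  ).join("");
}

console.log(flagEmoji("us")); // U+1F1FA U+1F1F8, which renders as 🇺🇸 where supported
```

Note that the function just emits the two-codepoint sequence; whether you see a flag or two boxed letters is, as described below, entirely up to the font.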
Note well, though, that it’s entirely up to the font designer to decide which flags will be supported. As of this writing, Canada’s flag, which would be encoded as U+1F1E8 U+1F1E6 (for “ca”), is not included in any font available on my computer. (In fact, only ten countries’ flags are available: Japan, South Korea, Germany, China, the United States, France, Spain, Italy, Russia, and the United Kingdom.) Trying to include an unsupported flag in your text gives you some ugly placeholder instead, like “🇨🇦” for Canada.
This encoding scheme seems a little wacky at first, but it lets the Unicode Consortium completely avoid the issue of who gets to be a country and who doesn’t. Some ISO committee is responsible for assigning the two-letter codes and the type foundry is responsible for drawing the flag and actually including it in the font. If your brand-new nation—or Canada, I guess—doesn’t get its own twee icon, that’s not Unicode’s fault.
In an IAmA interview on Reddit, OK Go lead singer Damian Kulash was asked, “Do you worry that your amazing video work could overshadow your audio work?” His answer is a thoughtful take on how creative people work today.
See, the question you’re asking basically masks a way of thinking about creativity (or, more accurately, people’s creative careers) that seems… stuck in another century, I guess. That’s the idea that creativity and creative people are supposed to stay in particular boxes that were defined by the way our products used to be distributed. It used to be that music and film and video games and journalism were actually very different physical objects with industries built around selling and distributing them. Now all of us make ones and zeros.
I meant to link to this months ago: Lee Hutchinson at Ars Technica has written a gripping account of how the rescue of Columbia might have played out, had NASA realized how much damage had been done to the shuttle’s wing. His source material is the report of the Columbia Accident Investigation Board, which apparently includes a detailed analysis of how the shuttle Atlantis might have been scrambled to retrieve the crew of Columbia. Someday, this could make a thrilling movie.
Apple announced a lot of cool stuff at yesterday’s WWDC keynote, but for me the most immediately exciting one was a new programming language called Swift, intended to replace Objective-C for OS X and iOS development. It has been in the works since 2010, according to its primary author, Chris Lattner. He says that it draws ideas from “Objective-C, Rust, Haskell, Ruby, Python, C#, CLU, and far too many others to list”. I’m reading through the iBook version of the manual now, and I’m excited to start playing around with the language!
After Man of Steel came out last summer, David Chen at Slashfilm made this video comparing Hans Zimmer’s main theme for that movie with John Williams’s main theme for 1978’s Superman. I listened to the Man of Steel soundtrack last week at work when I needed some understated background music—I couldn’t have done that with the Superman soundtrack! The two themes take totally different approaches musically, but then the two movies take totally different approaches as well.
In “The Internet with a Human Face”, a talk he gave at Beyond Tellerrand, Maciej Cegłowski discusses computers’ perfect memories, government surveillance and private data collection, the bullshit business model most web startups are using, and the need for decentralization. I’m going to go into more detail on this latter topic.
Cegłowski is talking about decentralization in the context of moving away from “one size fits all” social services like Facebook and Twitter. His main objection to all of us putting all of our eggs in the same few baskets is that it makes government surveillance trivially easy.
If these vast databases are valuable enough, it doesn’t matter who they belong to. The government will always find a way to query them. Who pays for the servers is just an implementation detail.
He’s right that decentralization is something we need to aim for. But decentralization requires interoperability, and that’s going to require a sea change in the way social networks operate.
I’ve moved this site from www.bdesham.info to the more svelte esham.io—it’s 50% fewer characters! All old URLs (including the Atom feed) should continue to work; they’ll just redirect to the new domain. If you notice anything wonky please let me know.
Paul Ford has spent a lot of time trawling through public presentations from the U.S. military. Like any other massive bureaucracy, they have made some truly inscrutable diagrams.
This is a graphic that defines a way of describing anything that has ever existed and everything that has ever happened, in any situation. The United States Military is operating at a conceptual level beyond every other school of thought except perhaps academic philosophy, because it has a much larger budget.
Mikal Gilmore at Rolling Stone has a great interview with George R. R. Martin. Unsurprisingly, Martin has some smart things to say about war, power, and the good and evil in human nature.
Look at a figure like Woodrow Wilson, one of the most fascinating presidents in American history. He was despicable on racial issues. He was a Southern segregationist of the worst stripe, praising D.W. Griffith and The Birth of a Nation. He effectively was a Ku Klux Klan supporter.
But in terms of foreign affairs, and the League of Nations, he had one of the great dreams of our time. The war to end all wars—we make fun of it now, but God, it was an idealistic dream. If he’d been able to achieve it, we’d be building statues of him a hundred feet high, and saying, “This was the greatest man in human history: This was the man who ended war.” He was a racist who tried to end war. Now, does one cancel out the other? Well, they don’t cancel out the other. You can’t make him a hero or a villain. He was both. And we’re all both.
I’m firmly against the death penalty, but Conor Friedersdorf has a perversely logical point here:
I can imagine one objection: that the guillotine is barbaric. But to me, that’s a point in its favor. Let’s have no illusions about what we’re doing when the state carries out the killing of captive prisoners. I imagine support for the death penalty would decline rather quickly once heads started rolling.
Peter Welch seems to have had a more frustrating programming career than I have—so far, anyway—but this all still rings true:
Would you drive across this bridge? No. If it somehow got built, everybody involved would be executed. Yet some version of this dynamic wrote every single program you have ever used, banking software, websites, and a ubiquitously used program that was supposed to protect information on the internet but didn’t.
At the Philosophy Stack Exchange site, David Zhang asks a question I have also asked myself, especially when I was deciding what to do after college.1 I implicitly answered “no” by continuing to study physics and then becoming a software engineer, but I’m not sure I ever really looked the question square in the face and said, “No, I don’t need to become a doctor to do good in the world”. Luckily for me, the answers to David’s question tend to support that idea.
It didn’t help that I was spending three hours a day watching House. ↩
Marcial Losada wrote three journal articles—in 1999, 2004, and 2005—which seem like they should have revolutionized the fields of psychology and sociology. In the first paper he demonstrated how the Lorenz equations (which describe the behavior of fluids) can also describe human emotions, with the spatial dimensions x, y, and z replaced with “inquiry–advocacy”, “other–self”, and “emotional space”. Of course, people who know how math works have alarm bells going off in their heads right about now, and correctly so, according to this paper:
We shall demonstrate that each one of the three articles is completely vitiated by fundamental conceptual and mathematical errors, and above all by the total absence of any justification for the applicability of the Lorenz equations to modeling the time evolution of human emotions. (Furthermore, although the second and third articles rely entirely for their validity on the presumed correctness of their predecessors—which, as we shall demonstrate, is totally lacking—we nevertheless invite the reader, at each stage, to assume for the sake of argument the correctness of the preceding article(s); in this way we are able to explain more clearly the independent flaws of each of the three subject articles.)
The article, which was published in American Psychologist in December, is well worth the short read. The authors exercise the utmost professionalism as they calmly kick the shit out of this pseudoscientific blather.
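For the record, the Lorenz system Losada borrowed is the following set of coupled differential equations (σ, ρ, and β are parameters of the fluid model); his move was simply to relabel the three variables as the emotional quantities above:

```latex
\begin{aligned}
\dot{x} &= \sigma\,(y - x) \\
\dot{y} &= x\,(\rho - z) - y \\
\dot{z} &= x y - \beta z
\end{aligned}
```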
The U.S. Court of Appeals for the D.C. Circuit recently decided against the FCC in a case about net neutrality. (Nilay Patel of The Verge wrote a good introduction to the issue and summary of how we got here.) Reading through the article, it sounded like internet access was going to get worse before it got better. But I wasn’t truly alarmed until I read this article from Matt Drance:
The privacy implications are just as chilling. A discriminatory model bakes surveillance into the way ISPs do business. Sure, your provider can snoop on your traffic right now, but nothing in the fundamental concept of delivery requires or justifies that they do. With this environment in place, the implications for privacy and anonymity tools like Tor should be obvious: they would be banned in the provider’s terms of service (how else can they know how much to charge and what to block?) and lobbyists would waste no time making them illegal.
If we had all been honest with ourselves—and by “we” I mean the FCC—the internet providers would be regarded as dumb pipes: commodities on the same level as the electrical companies. Since that wasn’t politically feasible, though, we now have to worry about the loss of online anonymity at a time when that prospect has never been more chilling.
[Democracy] means that we enjoy equal rights versus the government, and in relation to each other. Having equal rights does not mean having equal talents, equal abilities, or equal knowledge. It assuredly does not mean that “everyone’s opinion about anything is as good as anyone else’s.” And yet, this is now enshrined as the credo of a fair number of people despite being obvious nonsense.
One of the values I try to live up to is that there’s no shame in saying, “Uh, actually I don’t know enough about that to have an opinion”.
According to Kwame Opam at The Verge, Phil Schiller, Apple’s senior vice president of global marketing, has taken “a principled stand against Mountain View” (and it “isn’t the first time Schiller has taken a hard line against Google”). Wow—what extreme action has Apple taken now?
9to5Mac reports that Schiller has opted to unfollow both [Nest] and Nest CEO Tony Fadell on Twitter.
This is middle-school drama wrapped in hyperbole. Why would The Verge think this is worth reporting?
Spotify and Rdio probably work really well for people who see music as a transient background interest. But I’m difficult and picky, and music is extremely important to me.
This is a good summary of why you might want to keep a local library of music. I’ll add one more reason: historical metadata. I’ve been using iTunes for twelve years and I’ve accumulated twelve years’ worth of information on when songs were added and how many times they’ve been played. (My total play count is just over 133,000 right now.) I love being able to ask, “What was I listening to in the summer of 2009?” and getting an answer just by sorting by “Date Added”. Khoi Vinh believes that streaming services should be able to do even cooler things with metadata, but right now iTunes is the clear winner at furnishing that kind of information.1
It’s great that I can export this information from iTunes, but will I ever be able to import it into another application? Apple has made some questionable interface decisions with iTunes, but they’ve never been problematic enough for me to leave my metadata and jump ship. I’m resigned to the fact that if I ever leave iTunes I’ll probably have to leave my history with it. ↩
The Washington Post has a long interview with Edward Snowden and a discussion of the effects of his disclosures. It’s a shame that Snowden grants so few interviews, because he’s thoughtful and articulate:
“Dianne Feinstein elected me when she asked softball questions” in committee hearings, he said. “Mike Rogers elected me when he kept these programs hidden.… The FISA court elected me when they decided to legislate from the bench on things that were far beyond the mandate of what that court was ever intended to do. The system failed comprehensively, and each level of oversight, each level of responsibility that should have addressed this, abdicated their responsibility.”
“It wasn’t that they put it on me as an individual—that I’m uniquely qualified, an angel descending from the heavens—as that they put it on someone, somewhere,” he said. “You have the capability, and you realize every other [person] sitting around the table has the same capability but they don’t do it. So somebody has to be the first.”
In most written language, the period is a neutral way to mark a pause or complete a thought; but digital communications are turning it into something more aggressive. “Not long ago, my 17-year-old son noted that many of my texts to him seemed excessively assertive or even harsh, because I routinely used a period at the end,” Mark Liberman, a professor of linguistics at the University of Pennsylvania, told me by email.