Bifurcated Rivets: From FB

Moo glow

Bifurcated Rivets: From FB

Serious virtuosity

Bifurcated Rivets: From FB

Great image

Slashdot: 'New California' Movement Wants To Create a 51st State

PolygamousRanchKid, Ayano, and an anonymous reader all shared the same story. Tribune Media reports: A group has launched a campaign to divide California into two states. It isn't the first attempt to split California, but unlike a failed campaign in 2016 to divide California into six states, the campaign to create New California would split the state into one made up of rural counties and another made up of coastal counties. USA Today provides some context: Breaking up California remains no easy task: A formal secession means getting approval from both Congress and California's legislature itself. But that hasn't stopped folks from trying. Hundreds of times... Monday's declaration of "the State of New California" marked the latest in more than 200 long-shot efforts to split the Golden State. All so far have failed.

Read more of this story at Slashdot.

Disquiet: What Sound Looks Like

A doorbell button sends a variety of signals. It’s an instruction, an invitation, a place-marker. When lit at night, it can suggest habitation, even when no one is home. Often, especially in dense urban settings, the doorbell’s inherent messages aren’t sufficient to the task, however. There may be numbers and letters to clarify the association of address and interface. There may be arrows directing the visitor’s eye and finger. There may be redirects for postal services. There may be cameras that, intentionally or not, create an interactional moat, a digitally mediated divide between visitor and host — the host in such circumstances has an access to, a vantage on, a control over the visitor before the visitor has ever stepped foot inside. There’s lore of the vampire, who in some tellings must have permission before crossing such a threshold; digital vampires of the opposite persuasion — the ones on the recording end of the camera — have no significant restraints on their ability to capture, to collect and collate. They need not even cross the divide to have a presence. Sometimes the additional message is simply a bit of text, like here, where the instruction to “push hard” is neatly appended below the button. This modest device has no internet-era or even multi-functional connectivity, but it does speak messages, even beyond its literal one. For context, understand that there is also an array of buttons hung on that perpendicular metal gate. This button is an add-on, perhaps a replacement for one of the earlier ones. There is personality to the writing, in particular the swirl in the numeral 2 and the playful vitality of that “a” in “hard,” its schoolbook charm somehow both youthful and old-fashioned. This writing wasn’t done quickly, or haphazardly, or out of anger. It doesn’t appear to contain a subtext of antipathy toward a landlord, or toward technology for that matter. 
The writing is welcoming, reducing any emotional strain that such an instruction might have introduced in other circumstances. Still, the button itself shows little wear, which can be read generously as the resilience of something well-constructed, or more likely as evidence of it having been pushed with limited frequency over the years. The genteel stroke of the pen, upon reflection, takes on a kind of neediness, the entreating smile of an urban entity that knows the loneliness of the crowd all too well.

An ongoing series cross-posted from

Instructables: exploring - featured: Temperature, Relative Humidity, Atmospheric Pressure Logger Using Raspberry Pi and TE Connectivity MS8607-02BA01

Introduction: In this project I will show you how to build, step by step, a logging system for temperature, humidity, and atmospheric pressure. This project is based on the Raspberry Pi 3 Model B and the TE Connectivity environmental sensor chip MS8607-02BA01. This chip is really tiny, so I suggest you get i...
By: Noreddine Kessa

Continue Reading »
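The teaser above describes a Pi-based environmental logger. As a rough illustration of the logging side (not the author's actual code), here is a minimal Python sketch that timestamps readings and appends them to a CSV file. The `read_ms8607` function is a stand-in: on a real Pi it would talk to the sensor over I2C via a driver library, while here it returns simulated values so the sketch runs anywhere.

```python
import csv
import random
import time
from datetime import datetime, timezone

def read_ms8607():
    """Placeholder for a real MS8607-02BA01 driver call (hypothetical).

    On an actual Raspberry Pi this would read the sensor over I2C; here we
    return simulated values so the sketch is self-contained.
    """
    return {
        "temperature_c": round(random.uniform(18.0, 24.0), 2),
        "humidity_pct": round(random.uniform(30.0, 60.0), 2),
        "pressure_hpa": round(random.uniform(990.0, 1030.0), 2),
    }

def log_samples(path, n_samples=3, interval_s=0):
    """Write n_samples timestamped readings to a CSV file."""
    fieldnames = ["timestamp", "temperature_c", "humidity_pct", "pressure_hpa"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for _ in range(n_samples):
            row = {"timestamp": datetime.now(timezone.utc).isoformat()}
            row.update(read_ms8607())
            writer.writerow(row)
            time.sleep(interval_s)

log_samples("environment_log.csv")
```

In a real deployment the interval would be minutes rather than zero, and appending (mode `"a"`) across restarts is usually preferable to rewriting the file.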

Instructables: exploring - featured: Awesome Binder Wings

These wings may not help you fly, but they look pretty cool in motion! I discovered this project while thinking about what I could make with a binder. Pretty weird, right? Let me show you how to make them! Materials: Paracord. One piece, 145cm, two pieces, about 80cm, a binder, and an X-Acto knif...
By: awalston887

Continue Reading »

Instructables: exploring - featured: Concrete Octagonal Driveway Pavers

I had a problem. I had this squalid little section of property on the far side of my driveway. Nothing would grow there, and the dirt was mounded up a foot above my driveway, so not only would it prevent drainage when it rained, the run-off would leave silt deposits in my driveway deep enough to plan...
By: monickingbird

Continue Reading »

Instructables: exploring - featured: Stackable Hexagon Infinity Mirrors

So I got an Arduino and this is the first project I made. I got my inspiration while I was looking on this site and tried to make a simple project for myself. Coding isn’t my strong part so I had to keep it simple and wanted to make it more complex with the analog part. What I used: -Arduino Uno -Ne...
By: NoaV

Continue Reading »

Instructables: exploring - featured: How to Make Scented Candles?

Everyone loves things that smell good, especially during winter when we spend a lot of time indoors. This recipe is totally customizable, and you might just have almost all you need for this one around the house right now. The fragrance is totally up to you. You can use any essential oils you like....
By: Dom Bloggs

Continue Reading »

MetaFilter: "Hi, I'm Max Keller. This is how I usually leave a bar."

Thirty-four years ago today, NBC premiered The Master (publicity still), a ninja action series starring Western film veteran Lee Van Cleef (as the ninja) and Timothy Van Patten, half-brother of Dick Van Patten, as his hot-headed young sidekick. Each week, Max and The Master drive into a new town (in Max's custom van) and end up protecting/rescuing a damsel in distress from greedy land developers, union-busters, crimelords and their thugs, a surprisingly high number of other ninjas, and the occasional terrorist. (Here's Van Cleef promoting the show on Carson.) Fans of Mystery Science Theater 3000 know the show from Master Ninja I and II, repackaged video versions of the first four episodes. The series was cancelled after 13 episodes, and the entire run is viewable on YouTube:

Premise (from Wikipedia): John Peter McAllister (Van Cleef), an American veteran who stayed in Japan following World War II and became a ninja master, leaves Japan for the United States in search of a daughter he did not know he had. This flight from his ninja life is seen as dishonorable by his fellow ninjas, including his former student, Okasa (Sho Kosugi), who attempts to assassinate him. Escaping with a minor wound, McAllister finds himself in the small town of Ellerston, where he believes his daughter resides. Along the way, he meets a drifter named Max Keller (Van Patten). Max desires to learn to fight like a ninja, but McAllister is reluctant to train him, feeling him to be too emotional.

(MST3K fans take note: The first few links include the original opening credits and theme song, which Film Ventures International removed for their video re-releases.)

1. "Max" (January 20, 1984) – After meeting McAllister, Max gets involved in a dispute between Mr. Christensen, a ruthless developer, and the Trumbulls, a father and daughter who run an airport targeted by Christensen.
NOTABLE GUEST STARS: Clu Gulager (San Francisco International Airport*) as the villain; Claude Akins and Demi Moore as the Trumbulls. Robert Clouse (director of Enter the Dragon) directed this episode.
* = Sorry MiSTies, none of the six episodes of this series appear to be on YouTube. --Ed.

2. "Out-of-Time Step" (January 27, 1984) – A ninja-guarded crime lord mistakes Max and McAllister for bodyguards hired by a nightclub owner the crime lord is trying to control.
NOTABLE GUEST STAR: Brian Tochi (Revenge of the Nerds, the Star Trek: The Next Generation episode "Night Terrors") as the crime lord.

3. "State of the Union" (February 3, 1984) – Max befriends a "biker chick" who is trying to organize a union at the cannery where she works; he and McAllister strike back when the cannery owner tries to strong-arm the girl and the union.
NOTABLE GUEST STAR: Crystal Bernard (of Wings fame) as the union organizer.

4. "Hostages" (February 10, 1984) – McAllister is accused by a secret agent of helping a band of terrorists; to prove his innocence, he must help rescue the hostages that the terrorists have taken.
NOTABLE GUEST STARS: George Lazenby (the second James Bond) as "Mallory"; David McCallum (Man from U.N.C.L.E.) as the head terrorist; Monte Markham (The Second Hundred Years, Hawaii Five-O, Blanche's brother on The Golden Girls) as the head of the CIA.

5. "High Rollers" (March 2, 1984) – A former girlfriend of Max's becomes a pawn in a Las Vegas heist when her daughter is held hostage to ensure her cooperation. The resulting adventure leads Max and McAllister to a deserted western movie set, where the Master makes himself very much at home.
NOTABLE GUEST STAR: Terri Treas (Alien Nation, Deathstalker and the Warriors from Hell) as Max's former girlfriend.

6. "Fat Tuesday" (March 9, 1984) – During Mardi Gras in New Orleans, a reporter uses Teri McAllister's name as a cover for her own sources, hoping to bring down a respected local citizen who is secretly running guns to Arab terrorists. Max and McAllister become entangled as a result.
NOTABLE GUEST STAR: Robert Pine (Veep, It's Always Sunny in Philadelphia, Jim's dad on The Office).

7. "Juggernaut" (March 16, 1984) – Max and McAllister help a mother and daughter organize the local farmers against an evil land baron. McAllister has more success romancing the mother than Max does with the daughter. At the close of this episode, Max and McAllister are joined on their van-voyage by Cat Sinclair (Tara Buckman), a potential love interest for Max.
NOTABLE GUEST STAR: Diana Muldaur (Dr. Ann Mulhall and Dr. Miranda Jones on Star Trek: The Original Series, Dr. Pulaski on Star Trek: The Next Generation) as McAllister's love interest.

8. "The Good, the Bad and the Priceless" (March 23, 1984) – Caught between a criminal mastermind and an FBI agent posing as McAllister's daughter, the two leads find themselves forced to steal the Crown Jewels of England. The Cat Sinclair character is mentioned by Max in the opening narration as if she is a new regular cast member, but this is the last episode she appears in; the character is never mentioned again and no explanation for her disappearance is given.
NOTABLE GUEST STARS: Janine Turner (Northern Exposure); George Maharis (Route 66).

9. "Kunoichi" (April 6, 1984) – With the help of a female pupil, Okasa puts in motion a plan to frame McAllister for the murder of an old friend, who is now a prominent government official in Washington.
NOTABLE GUEST STARS: William Campbell (Koloth from Star Trek: The Original Series and Star Trek: Deep Space Nine); Jack Kelly (Maverick).

10. "The Java Tiger" (April 13, 1984) – Max and McAllister take a break from the search for Teri to help out a friend of McAllister's: a bumbling PI, based in Hawaii, who is on a quest for a legendary tiger made of gold. Unfortunately, a Bond-villain-like crime lord with a penchant for karate is also interested in the Java Tiger.
NOTABLE GUEST STAR: Anthony de Longis (Maje Culluh from Star Trek: Voyager); Kabir Bedi (Octopussy).

11. "Failure to Communicate" (May 4, 1984) – Max reunites with his estranged father Patrick, who is a pawn in a kidnapping scheme.
NOTABLE GUEST STARS: Marc Alaimo (Gul Dukat from Star Trek: Deep Space Nine) as one of the villains; Mark Goddard (Lost in Space); Rebecca Holden (Knight Rider, one-time "Breck Girl"); Doug McClure (The Virginian, and multiple films featured on MST3K) as Max's father.

12. "Rogues" (August 10, 1984) – A high school friend of Max's is now a cop, on the run from a band of crooked cops. A woman who runs a gym harasses McAllister about being out of shape.
NOTABLE GUEST STAR: Spice Williams (Klingon bridge officer in Star Trek V: The Final Frontier).

13. "A Place to Call Home" (August 31, 1984) – Max and McAllister protect an orphanage from greedy land developers, with Max playing surrogate father to a troubled teen.
NOTABLE GUEST STAR: James Gammon (Major League, Nash Bridges); Sho Kosugi's son Kane.

Creator Michael Sloan (who'd had a hit in the '70s with Quincy M.E.) had more success in 1985 with The Equalizer, starring Edward Woodward and several more notable guest stars than The Master had.

Lee Van Cleef never acted on television again, but appeared in a few more films prior to his death in 1989. Further reading: - A Tribute to Lee Van Cleef

Timothy (now credited as Tim) Van Patten now has a successful career as a television director, winning Emmys for his work on The Sopranos and Boardwalk Empire. He has also directed episodes of Rome, Game of Thrones, and Black Mirror.

Previously on FanFare: the MST3K episodes Master Ninja I and Master Ninja II. And here's the "Master Ninja Theme Song" sketch from the end of the first one.

Recent additions: collection-json

Added by alunduil, Sat Jan 20 19:02:19 UTC 2018.

Collection+JSON—Hypermedia Type Tools
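The two Hackage additions above are tooling for hypermedia media types. For readers unfamiliar with the format, here is roughly what a minimal Collection+JSON document (media type `application/vnd.collection+json`) looks like, sketched as a Python dict; the URLs and field names are invented for illustration:

```python
import json

# A minimal Collection+JSON document: one item plus a write template
# that clients fill in and POST back to the collection. The hrefs and
# data fields are placeholders, not from any real API.
doc = {
    "collection": {
        "version": "1.0",
        "href": "http://example.org/friends/",
        "items": [
            {
                "href": "http://example.org/friends/jdoe",
                "data": [
                    {"name": "full-name", "value": "J. Doe", "prompt": "Full Name"},
                    {"name": "email", "value": "jdoe@example.org", "prompt": "Email"},
                ],
            }
        ],
        "template": {
            "data": [
                {"name": "full-name", "value": "", "prompt": "Full Name"},
                {"name": "email", "value": "", "prompt": "Email"},
            ]
        },
    }
}

print(json.dumps(doc, indent=2))
```

The appeal of the format is that every response carries its own read/write affordances (items, queries, template), so a generic client can navigate it without out-of-band documentation.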

Recent additions: siren-json

Added by alunduil, Sat Jan 20 18:55:04 UTC 2018.

Siren Tools for Haskell

Slashdot: New Antifungal Provides Hope in the Fight Against Superbugs

dryriver shares news about the ongoing war against drug-resistant fungus. ScienceDaily reports: Microscopic yeast have been wreaking havoc in hospitals around the world -- creeping into catheters, ventilator tubes, and IV lines -- and causing deadly invasive infection. C. auris is particularly problematic because it loves hospitals, has developed resistance to a wide range of antifungals, and once it infects a patient doctors have limited treatment options. But in a recent Antimicrobial Agents and Chemotherapy study, researchers confirmed a new drug compound kills drug-resistant C. auris, both in the laboratory and in a mouse model that mimics human infection. The drug works through a novel mechanism. Unlike other antifungals that poke holes in yeast cell membranes or inhibit sterol synthesis, the new drug blocks how necessary proteins attach to the yeast cell wall. This means C. auris yeast can't grow properly and have a harder time forming drug-resistant communities that are a stubborn source of hospital outbreaks... The drug is first in a new class of antifungals, which could help stave off drug resistance.

Read more of this story at Slashdot.

MetaFilter: From where the church used to be, 2 blocks south, 1 block east

Let's talk about addresses in Nicaragua.

MetaFilter: the eucharist is lit

traditional Catholicism | vaporwave | aesthetics | tradwave dot com | Twitter | Facebook | Instagram | Pinterest

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: The dollar

By Guest Blogger Ryan Lewenza

One of the key aspects of financial analysis is to determine and isolate the main drivers of a specific asset class. For equities it’s earnings and valuations. For bonds it’s inflation and interest rates. For the Canadian dollar it’s oil prices and the interest rate differential between Canadian and US bond yields. What do these factors portend for the Canadian dollar in 2018?

First let’s start with the outlook for interest rates in Canada and the US. This week we saw the Bank of Canada hike rates by 25 bps to 1.25% – the third hike in a year. They did this in response to the strength seen in the economy which grew at over 3% and added 423,000 jobs in 2017.

I see the US economy growing above 3% this year driven by a strong consumer and the recent US tax reform which could provide an additional boost to their economy. For Canada I see slower growth of roughly 2.5% as higher interest rates weigh on consumer spending and uncertainty over NAFTA weighs on businesses and exports.

Given this outlook I see the Fed hiking rates at a faster clip than the BoC which has implications for our dollar.

Fed and BoC Overnight Rates

Source: Bloomberg, Turner Investments

Below I illustrate why this matters. In the chart I show the strong relationship between the Canadian dollar and the interest rate differential between Canadian and US bond yields. Since 2014 there has been a near perfect correlation between these two. Put simply, as the Fed hikes rates faster than the BoC this year, US bond yields should move higher relative to Canadian bond yields, which would be negative for the Canadian dollar.

Interest Rate Differential and CAD/USD

Source: Bloomberg, Turner Investments

The second key driver of the Canadian dollar is commodity prices and, in particular, oil prices. I’ve been bullish on oil prices over the last year and I see more upside. I’m targeting WTI to close the year at US$66/bl. If I’m correct this would be bullish for the Canadian dollar.

So of the two key drivers for the Canadian dollar, one is bearish (interest rate differential) and one is bullish (oil prices).

Another tool we can look at to help determine the direction of the Canadian dollar is my financial model for the dollar, which uses current oil prices and interest rates to determine “fair value”. Based on these inputs and current levels, this suggests a fair value of 81 cents. With the dollar currently at 80 cents the model suggests limited upside from here.

CAD Model Points to Fair Value at 81 Cents

Source: Bloomberg, Turner Investments
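The column doesn't publish the model's actual form or coefficients, but structurally it's a function of the two drivers just discussed. A hypothetical linear sketch, with illustrative coefficients chosen only so that plausible inputs land near the quoted 81-cent fair value:

```python
def cad_fair_value(wti_usd, rate_diff_bps,
                   intercept=0.55, oil_beta=0.004, rate_beta=0.0004):
    """Hypothetical linear fair-value model for CAD/USD.

    Not the author's actual model: the coefficients here are invented
    for illustration. They encode the direction of the relationships
    described in the text: the loonie rises with oil prices and with
    the Canada-minus-US yield differential.
    """
    return intercept + oil_beta * wti_usd + rate_beta * rate_diff_bps

# e.g. WTI at US$64/bbl and a roughly flat Canada-US yield differential
fv = cad_fair_value(64, 0)
print(f"fair value: {fv:.2f} USD per CAD")  # ~0.81
```

With these toy numbers, a rise in oil toward the US$66 target nudges fair value up, while a widening US-over-Canada rate gap (a negative differential) pulls it down, matching the offsetting forces described above.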

Finally, we need to see what the technicals look like for the Canadian dollar so we have the complete picture. The Canadian dollar has improved technically over the last year with the CAD now above its rising 200-day moving average. However, it remains trapped in a clear trading range of roughly 74 cents to 84 cents. We would need to see the Canadian dollar breakout from this range before we would be willing to change our outlook.

CAD in a Technical Range

Source: Bloomberg, Turner Investments

Putting this analysis together, I see the Canadian dollar trading range bound this year as higher oil prices potentially push up the exchange rate at times, and the rate differential favouring US bond yields pushes it lower at times.

This long and boring analysis points to a boring trading year for the Canadian dollar where our client portfolios are neither negatively impacted by a higher Canadian dollar, nor benefit from a materially weaker Canadian dollar.

Ryan Lewenza, CFA, CMT is a Partner and Portfolio Manager with Turner Investments, and a Senior Vice President, Private Client Group, of Raymond James Ltd.


Recent additions: fltkhs

Added by deech, Sat Jan 20 17:48:56 UTC 2018.

FLTK bindings

Slashdot: Red Hat Reverts Spectre Patches to Address Boot Issues

An anonymous reader quotes BleepingComputer: Red Hat is releasing updates for reverting previous patches for the Spectre vulnerability (Variant 2, aka CVE-2017-5715) after customers complained that some systems were failing to boot. "Red Hat is no longer providing microcode to address Spectre, variant 2, due to instabilities introduced that are causing customer systems to not boot," the company said yesterday. "The latest microcode_ctl and linux-firmware packages are reverting these unstable microprocessor firmware changes to versions that were known to be stable and well tested, released prior to the Spectre/Meltdown embargo lift date on Jan 3rd," Red Hat added. Instead, Red Hat is recommending that each customer contact their OEM hardware provider and inquire about mitigations for CVE-2017-5715 on a per-system basis. Besides Red Hat Enterprise Linux, other RHEL-based distros like CentOS and Scientific Linux are also expected to be affected by Red Hat's decision to revert previous Spectre Variant 2 updates, so these users will also have to contact CPU/OEM vendors. At least one site "characterized the move as Red Hat washing its hands of the responsibility to provide customers with firmware patches," writes Data Center Knowledge, arguing instead that Red Hat "isn't actually involved in writing the firmware updates. It passes the microcode created by chipmakers to its users 'as a customer convenience.'" "What I would have said if they'd asked us ahead of time is that microcode is something that CPU vendors develop," Jon Masters, chief ARM architect at Red Hat, told Data Center Knowledge in a phone interview Thursday. "It's actually an encrypted, signed binary image, so we don't have the capability, even if we wanted to produce microcode. It's a binary blob that we cannot generate. The only people who can actually generate that are the CPU vendors."

Read more of this story at Slashdot.
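For admins wondering where their own systems stand after a revert like this: Linux kernels since the disclosure expose mitigation state under `/sys/devices/system/cpu/vulnerabilities/`. A minimal check (falling back to "unknown" on kernels or platforms that don't expose the file):

```python
from pathlib import Path

def spectre_v2_status():
    """Return the kernel's reported Spectre Variant 2 mitigation status.

    Modern Linux kernels expose one-line status files under
    /sys/devices/system/cpu/vulnerabilities/. Returns "unknown" when
    the file isn't present (older kernel, or a non-Linux system).
    """
    p = Path("/sys/devices/system/cpu/vulnerabilities/spectre_v2")
    try:
        return p.read_text().strip()
    except OSError:
        return "unknown"

print(spectre_v2_status())
```

Note that this reflects kernel-side mitigation (e.g. retpolines) and loaded microcode as the kernel sees it; it cannot tell you whether your OEM has newer firmware available, which is exactly why Red Hat is pointing customers back to their hardware vendors.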

Recent additions: tuple-ops

Added by JiasenWu, Sat Jan 20 17:42:00 UTC 2018.

various operations on n-ary tuples via GHC.Generics

Slob-0.002002

Read .slob dictionaries (as used by Aard 2)

Slashdot: Apple Might Discontinue the iPhone X This Summer

BGR shares a startling prediction from Ming-Chi Kuo, the Apple analyst at KGI Securities: Kuo -- who we should note has an exemplary track record with respect to iPhone rumors -- adds that Apple may opt to discontinue the current iPhone X entirely if sales are underwhelming. "KGI also expects a trio of iPhone models in the fall of 2018," AppleInsider notes. "He predicts the iPhone X will be 'end of life' in the summer of 2018, instead of being retained as a lower-cost option in the following year." If Kuo's projection pans out, this would represent a marked shift in Apple's iPhone sales strategy. Going back nearly a decade, Apple has always kept older iPhone models around as a wallet-friendly alternative for users who weren't keen on paying a premium for Apple's latest and greatest.

Read more of this story at Slashdot.

Music-Tension-1.02

music tension analysis

Recent additions: debug-pp 0.1.1

Added by PepeIborra, Sat Jan 20 16:09:12 UTC 2018.

A preprocessor for the debug package

Planet Haskell: Mark Jason Dominus: I forgot there are party conventions

Yesterday I made a huge mistake in my article about California's bill requiring presidential candidates to disclose their tax returns. I asked:

Did I miss anything?

Yes, yes I did. I forgot that party nominees are picked not by per-state primary elections, but by national conventions. Even had Ronnie won the California Republican primary election, rather than Trump, that would not have been enough to get him on the ballot in the general election.

In the corrected version of my scenario, California sends many Ronnie supporters to the Republican National Convention, where the other delegates overwhelmingly nominate Trump anyway. Trump then becomes the Republican party candidate and appears on the California general election ballot in November. Whoops.

I had concluded that conceding California couldn't hurt Trump, and could actually be a huge benefit to him. After correcting my mistake, most of the benefit evaporates.

I wonder if Trump might blow off California in 2020 anyway. The upside would be that he could spend the resources in places that might give him some electoral votes. (One reader pointed out that Trump didn't blow off California in the 2016 election, but of course the situation was completely different. In 2016 he was not the incumbent and he was in a crowded field.)

Traditionally, candidates campaign even in states they expect to lose. One reason is to show support for candidates in local elections. I can imagine many California Republican candidates would prefer that Trump didn't show up. Another reason is to preserve at least a pretense that they are representatives of the whole people. Newly-elected Presidents have often upheld the dignity of the office by stating the need for unity after a national election and promising (however implausibly) to work for all Americans. I don't care to write the rest of this paragraph.

The major downside that I can think of (for Trump) is that Republican voters in California would be infuriated, and while they can't directly affect the outcome of the presidential election, they can make it politically impossible for their congressional representatives to work with Trump once he is elected. A California-led anti-Trump bloc in Congress would probably cause huge problems for him.

Wild times we live in.

Slashdot: Car Manufacturers Sued Over Rodents Eating Soy-Insulated Wires

An anonymous reader writes about "a little-known problem plaguing many newer vehicles from the likes of Honda, Toyota, and Kia." The car makers used soy-insulated wiring to cut costs and "Go Green", but owners in rural areas are finding that the local wildlife considers the wiring irresistible; thousands of dollars in damage has been done by rats and other critters eating wiring harnesses. Hackaday is asking their community to brainstorm solutions to this unique problem, as owners of affected vehicles have had to resort to sprinkling their driveway with coyote urine and putting rat traps on the wheels. Hackaday reports that "It isn't just one or two cases either, it's enough of a problem that some car manufacturers are getting hit with class-action lawsuits." Back in 2010 Slashdot reported that rabbits had already discovered the joys of eating soy-insulated wires, and were turning the parking lot at the Denver International Airport into their own personal buffet. There's even a web site which reports that Honda has already manufactured a special wire-wrapping tape that's infused with the active ingredient from chili peppers.

Read more of this story at Slashdot.

XML-Saxtract-1.04

Streaming parse XML data into a result hash based upon a specification hash

ScreenAnarchy: Watch Delightful Sci-Fi Short EINSTEIN-ROSEN, Start Your Day with a Smile

Children are naturally curious about the world around them. And sometimes, the world that's not around them, a world that can't be seen. Sometimes, this gets them into the most hilarious trouble. In Olga Osorio's quirky and light-hearted short Einstein-Rosen, that curiosity leads to a metaphorical shower of sibling rivalry, and a literal shower of, well, you'll just have to watch it. It's the summer of 1982 in A Coruña. Young Teo claims he has found a wormhole. His brother Óscar does not believe him - at least not for now. Osorio perfectly blends the recklessness of children with their sense of wonder in science and the world around them, all with knowing nods to relationships between siblings in both love and frustration. Screened at...

[Read the whole post on ScreenAnarchy]

MetaFilter: An Homage to You Rudy

Two Tone was both a record label and a movement that combined imported Jamaican Ska with homegrown British punk to form a uniquely British multi-racial, multi-ethnic musical (and sartorial) style that has gone on to have a worldwide impact. They also had a complicated relationship with another UK youth culture at the time: skinheads.

List-Flatten-XS-0.03

L with XS

WebService-BitFlyer-0.01

one line description

Open Culture: 10,000 Classic Movie Posters Getting Digitized & Put Online by the Harry Ransom Center at UT-Austin: Free to Browse & Download

Who hasn’t pinned one of Saul Bass’s elegant film posters on their wall—with either thumbtacks above the dormroom bed or in frame and glass in grown-up environs? Or maybe it’s 70s kitsch you prefer—the art of the grindhouse and sensationalist drive-in exploitation film? Or 20s silent avant-garde, the cool noir of the 30s and 40s, 50s B-grade sci-fi, 60s psychedelia and French new wave, or 80s popcorn flicks…? Whatever kind of cinema grabs your attention probably first grabbed your attention through the design of the movie poster, a genre that gets its due in novelty shops and specialist exhibitions, but often goes unheralded in popular conceptions of art.

Despite its utilitarian and unabashedly commercial function, the movie poster can just as well be a work of art as any other form. Failing that, movie posters are at least always essential archival artifacts, snapshots of the weird collective unconscious of mass culture: from Saul and Elaine Bass’s minimalist poster for West Side Story (1961), “with its bright orange-red background over the title with a silhouette of a fire escape with dancers” to more complex tableaux, like the baldly neo-imperialist Africa Texas Style! (1967), “which features a realistic image of the protagonist on a horse, lassoing a zebra in front of a stampede of wildebeest, elephants, and giraffes.”

These two descriptions only hint at the range of posters archived at the University of Texas Harry Ransom Center—upwards of 10,000 in all, “from when the film industry was just beginning to compete with vaudeville acts in the 1920s to the rise of the modern megaplex and drive-in theaters in the 1970s.” So writes Erin Willard in the Ransom Center’s announcement of the digitization of its massive collection, expected to reach completion in 2019. So far, around 4,000 posters have been photographed and are becoming available online, downloadable in “Large,” “Extra Large,” and “High-Quality” resolutions.

The bulk of the collection comes from the Interstate Theater Circuit—a chain that, at one time, “consisted of almost every movie theater in Texas”—and encompasses not only posters but film stills, lobby cards, and press books from “the 1940s through the 1970s with a particular strength in the films of the 1950s and 60s, including musicals, epics, westerns, sword and sandal, horror, and counter culture films.” Other individual collectors have made sizable donations of their posters to the center, and the result is a tour of the many spectacles available to the mid-century American mind: lurid, violent excesses, maudlin moralizing, bizarre erotic fantasies, dime-store adolescent adventures....

Some of the films are well-known examples from the period; most of them are not, and therein lies the thrill of browsing this online repository, discovering obscure oddities like the 1956 film Barefoot Battalion, in which “teen-age wolf packs become heroes in a nation’s fight for freedom!” The number of quirks and kinks on display offer us a prurient view of a decade too often flatly characterized by its penchant for grey flannel suits. The Mad Men era was a period of institutional repression and rampant sexual harassment, not unlike our own time. It was also a laboratory for a libidinous anarchy that threatened to unleash the pent-up energy and cultural anxiety of millions of frustrated teenagers onto the world at large, as would happen in the decades to come.

What we see in the marketing of films like Five Branded Women (1960) will vary widely depending on our orientations and political sensibilities. Is this cheap exploitation or an empowering precursor to Mad Max: Fury Road? Maybe both. For cultural theorists and film historians, these pulpy advertisements offer windows into the psyches of their audiences and the filmmakers and production companies who gave them what they supposedly wanted. For the ordinary film buff, the Ransom Center collection offers eye candy of all sorts, and if you happen to own a high-quality printer, the chance to hang posters on your wall that you probably won’t see anywhere else. Enter the online collection here.

Related Content:

40,000 Film Posters in a Wonderfully Eclectic Archive: Italian Tarkovsky Posters, Japanese Orson Welles, Czech Woody Allen & Much More

The Film Posters of the Russian Avant-Garde

A Look Inside Martin Scorsese’s Vintage Movie Poster Collection

40 Years of Saul Bass’ Groundbreaking Title Sequences in One Compilation

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

10,000 Classic Movie Posters Getting Digitized & Put Online by the Harry Ransom Center at UT-Austin: Free to Browse & Download is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

MetaFilter: Martin Luther King's intellectual, ethical, and political commitments

MLK Now - "In the year before King's death, he faced intense isolation owing to his strident criticisms of the Vietnam War and the Democratic Party, his heated debates with black nationalists, and his headlong quest to mobilize the nation's poor against economic injustice. Abandoned by allies, fearing his death was near, King could only lament that his critics 'have never really known me, my commitment, or my calling.' " (via)
They ignore his indictment of the United States as the "greatest purveyor of violence in the world," his critique of a Constitution unjustly inattentive to economic rights and racial redress, and his condemnation of municipal boundaries that foster unfairness in housing and schooling... This is a tragedy, for King was a vital political thinker. Unadulterated, his ideas upset convention and pose radical challenges—perhaps especially today, amidst a gathering storm of authoritarianism, racial chauvinism, and nihilism that threatens the future of democracy and the ideal of equality. What follows is an effort to recover those unsettling ideas by shedding light on three of the most important and misunderstood elements of King's mature thought: his analysis of racism; his political theory of direct action and civil disobedience; and his understanding of the place of ethical virtues in activism and social life...

King's interests in fear, ideology, and politics led him to believe, as he expressed in "The Power of Nonviolence" (1958), that we must "attack the evil system rather than individuals who happen to be caught up in the system."

This systemic focus, crucially, does not inflate "racism" to make it explain all racial disparities, but understands that such inequalities are outcomes of many phenomena that interact with racism, yet cannot be reduced to only racism. These include technology, political economy, and cultural patterns. As early as 1964, for example, King presciently warned in Why We Can't Wait that "if automation is a threat to Negroes, it is equally a menace to organized labor." Arguing for an alliance between civil rights and labor activists, King foresaw how capital investments in "efficiency" would dislocate middle-class jobs, stagnate wages, and devastate unions' political power. Granted, discrimination and historical disadvantage would cause these burdens to fall hardest on poor blacks—yet it still opened the possibility of broader political alliances...

In the face of accelerating automation and the elimination of living-wage jobs, King endorsed a number of egalitarian policies, including basic income and a full-employment guarantee, which have once again become rallying cries. Our present-day interest in these policies, however, remains too tethered to technocratic governance. King thought only mass civil disobedience would create, shape, and sustain such transformative goals...

For King, such persistent failures of reciprocity—political, social, and economic—made civil disobedience legitimate... The mass dimension of protest allows for people of all walks of life to be more than spectators, and instead be transformed by their resistance to oppression, rediscovering courage and self-respect in the face of assaults on their dignity... its nonviolent aspect remained crucial, in part because of its unique ability to throw racist ideology off balance. On King's account, the racist worldview predicts that the humiliation and disregard dispensed in its name will bring back more of the same... For King, "adherence to nonviolence—which also means love in its strong and commanding sense," politically performed a feat of redirection. By unsettling racist expectations and disclosing new possibilities for living together, nonviolence and an ethic of love became vehicles for staging grievance, disrupting distrust and retaliation, and envisioning new forms of cooperation...

King's blindness to the gendered dimensions of charismatic authority and hierarchical leadership within protest organizations—and the black church—is surely reason enough to be critical of his example... Any retrieval of King's legacy has to amend his triple evils [racism, militarism and poverty] to include a fourth: sexism...

Still King's call to internationalize nonviolent social justice movements continues to matter in at least one important respect. We face global existential challenges of climate change, nuclear weaponry, war and terrorism, and wealth inequality (abetted by offshore tax havens and attacks on capital controls). Yet the institutions that exercise the most power over these circumstances remain insulated from democratic action and accountability to citizens. If there is any hope to prevent disempowered citizens' rage and resentment from being exploited by demagogues and reactionaries, it must be channeled into coordinated, enduring social movements that force electoral and economic reckonings while fostering respect for our shared "garment of destiny."

King was hopeful, but not blind to the difficulty and costs of these aspirations. Members of such movements will face repression, scorn, prison, and sacrifice. Racism and sexism will threaten solidarity, violence will injure our faith in cooperation, and inequality will breed its rationalizations. But when threats are mortal, retreat and accommodation are avenues to self-destruction. As we scour for exemplars of struggle, we must not write off the United States' most peculiar radical and his enduring intellectual and political challenge. King calls on us to think and argue publicly about the crises of our present, and collectively determine the broadest range of nonviolent coercive powers at our disposal. "Our very survival," King wrote in Where Do We Go From Here, "depends on our ability to stay awake, to adjust to new ideas, to remain vigilant and to face the challenge of change."

Disquiet: Building on “Fever Pitch”

The Disquiet Junto has been going on since the first week of January 2012, and though I have moderated the Junto from the start (we’re currently on the 316th consecutive weekly project, and the mailing list has over 1,200 subscribers from around the world), I myself have participated fewer than a handful of times, most recently this past week, for project 0315.

I hadn’t recorded the piece of music, “Fever Pitch,” as part of Junto 0315 initially. I recorded “Fever Pitch,” in fact, for an entirely different weekly music project series, one called Weekly Beats. When I subsequently recognized that the simple track, just a guitar line filtered by a modular synthesizer, fit the constraints of Junto project 0315, I posted it for that as well. There is a lot of cross-pollination among online compositional series. For example, I wrote a poem for the great Naviar Haiku series on the occasion of its 40th weekly project, and some people have cross-posted pieces of music between Naviar and the Junto, which share a bit of the same roster in general, and we have collaborated once or twice.

In any case, the point of project 0315, “First Chair,” was for musicians to make short pieces of music that would serve as one third of a trio, with the idea that in the following weeks other musicians would, in turn, flesh out the trio. It’s an exercise in asynchronous collaboration, which is a central theme of all Junto projects. The sequence originating with Junto 0315 is simply a reinforcement through emphasis of that concept.

Well, as part of Junto 0316, which is currently ongoing and will close at 11:59pm on Monday night, a Brooklyn-based musician named Joseph Branciforte did me a great honor. He added a second part to “Fever Pitch,” which he simply titled after the day he recorded it, “January 18, 2018.” It’s a marvel of simpatico consideration, his Fender Rhodes, coaxed by some effects pedals, filling in the blanks left by my guitar. I’ve been fiddling with a modular synthesizer since 2014, when I started to assemble one after marveling at a performance by Marcus Fischer at Powell’s Books in Portland at an event for my then just published book on Aphex Twin’s album Selected Ambient Works Volume II, part of the Bloomsbury 33 1/3 series. Since last July, when I started taking guitar lessons weekly, my synthesizer has gotten less attention, but I recently got into using the synth as an oversized effects pedal, which is how this piece came about.

All of which is to say, I’m writing this evening to thank Branciforte for the great pleasure his piece — that is, his piece and my piece in tandem — has brought me. There is a misunderstanding that music critics are frustrated musicians. I’m in no way a frustrated musician. I have such low expectations for what I might accomplish musically that learning guitar and synthesizer is just a sequence of pleasurable discoveries fed by curiosity and reinforced by the steady pace of practice.

As I write this, there are already 21 tracks by almost as many musicians in the 0316 Junto, “El Segundo,” several others of which have also built on my “Fever Pitch.” I’m just beginning to work my way through the accumulating duets, listening for the space they leave for what will soon be trios.

Track originally posted online. More from Joseph Branciforte, who is based in Brooklyn, New York, on YouTube and elsewhere.

Penny Arcade: News Post: Paragone

Tycho: Out here, up in the Game Spire, where we buy games and then play them, Public G is a big deal, if not the biggest deal.  But a lot - and I mean a lot - of people play Fortnite.  To give you entrée into a world you may not be aware of, the Youngs play Fortnite almost exclusively.  It doesn’t cost anything.  That’s, uh…  that’s a pretty good price.  There’s nothing even approximating a Queue on the PC.  Once you’ve decided you want to play, that’s it.  And from the sounds of it, that kind of community support is having…

ScreenAnarchy: Frontières 2018: Amsterdam Titles Announced, Submissions for Cannes And Montreal Open

With the new year well on its way, it is time once again to bring focus to Frontières and its ongoing and awesome work to help create more international genre films. While we would hesitate to call it a divine calling, there is certainly something heavenly about their work. Which brings us to their first project of the year, the Frontières Finance & Packaging Forum, happening in Amsterdam in February from the 22nd through the 24th. For the first time, a project from South America will participate in a Frontières program. The Monster Within, by filmmaker Rodrigo Susarte (Ventana 2014), is a joint effort between his native Chile and a partner in Denmark. Prior to this, two projects from Mexico were...

[Read the whole post on]

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: The leap

Photo by Andy Seliverstoff

People with wasted lives who follow social media and (ugh) blogs were scandalized a few days ago when a Toronto rowhouse which deserves to be burned hit the market for $750,000. Yes, it’s in an area favoured by trendoids and hipsters, guys with beards and buns and women wearing tats but, sheesh, three-quarters of a mill for a place with a kitchen like this?

If the house sells, according to the Tweets and posts, it will ‘prove’ the local market has resiliency and duration, that a piffle like rising mortgage rates or a stress test cannot stop it. This place is an utter, complete, unqualified, horrific, likely diseased dump. Makes you wonder if the people on either side of the space a local real estate site called a “howling shithole” know what’s but a few inches away.

Anyway, let’s all celebrate the grandly-named Allister John Sinclair, the listing agent from Re/Max, who put pen to paper and came up with a glowing description. I think a Pulitzer is in order, or certainly a Governor-General’s award for fiction:

Builders Delight Perched In The Highly Desired Neighbourhood Of Trinity Bellwoods! Attention Renovators & Builders! 9 Foot Ceilings. This 2 Storey Home Just Needs Tlc, Renovations And Remodeling To Become Your Dream Home! Just Steps To Queen Street, Trinity Bellwoods Park, Ttc, Subway, Shops And So Much More. Prime Location! Don’t Miss This Opportunity! **** EXTRAS **** Property Being Sold As Is Where Is, Street Permit Parking Available, Roof Is In Good Condition. Upgraded Furnace – Recently Serviced With New Heat Exchanger And Motor. Hot WaterTank (Rental – 2016)

As of Friday afternoon it was still on the market – not yet picked off by some smarty moister with a B of Mom loan and an e-bike. Maybe there’s hope after all. BTW, here is the listing for 15 Rebecca. Please be responsible, and don goggles and other protective gear before you click on the link.

Well, there’ll be serious discussion over the next two weeks about the direction of the market as a whole. Despite sliding detached prices in Vancouver the natives seem convinced things are just ducky, unless Comrade Horgan, the new NDP premier, comes down with a vicious new anti-speculation law in his February budget. He’s rejected an outright ban on non-locals owning property, which is entirely correct and reasonable, but he’s only in power thanks to the Greens, whose leader is a flaming xenophobe. So, we’ll see. The combination of the existing anti-Chinese tax, the empty houses tax and a new spec tax – plus higher mortgage rates and B20 – should be enough to finish off YVR.

Toronto’s a more interesting case, since what happens here will impact about 8 million people in the GTA and its commutershed. The big news will come about February 5th, when the realtor-gathered sales numbers for January are released. If they suck, you can pretty much write off the next two or three months as market sentiment runs negative. Already homeowners believe conditions are worsening thanks to interest rates while buyers are stressed over the stress test. It’s a perfect storm as banks pull in their horns and credit is seriously restricted.

But if the January numbers are close to those of last year (5,188 transactions, of which 2,261 were detached) and if the HS at 15 Rebecca finds a buyer, the bubble will continue to inflate – certainly leading to a more catastrophic outcome down the road.

And where are we now?

As reported the other day on this blog by a veteran broker, a mere 77 detached houses found buyers in 416 in the first couple of weeks in January. Bad. Now Zoocasa has some new numbers. Also bad. So far condo sales are down 21% year/year and 20% lower for detacheds. This has created a buyer’s market, as the ratio of sales to listings plunges below the 40% mark – in stark contrast to early last year when it oscillated between 90% and 100%.

The ratio now is somewhat shocking: 24% for detached houses and 26% for condos. It means as new listings flood in for spring, buyers will probably have a large and growing pool to choose from.

But these aren’t ‘official’ numbers. The weather was appalling after Christmas. News about the mortgage changes and rate hikes was everywhere. And two weeks do not a market make – that’s too big a leap to take. But there’s no denying the cost of money has shot higher than anyone expected a year ago. The new borrowing rules are, by historic measure, extreme. Tens of thousands of potential buyers will not be approved. As many more will qualify to borrow less. If listings swell as they do every March and April, peak house will truly be over.

Meanwhile, some deluded kid will buy the horror on Rebecca. Pray for her.

new shelton wet/dry: Dumbest movie ever with a predictable dumb plot, bad acting, worse script, straight up ridiculous

…America’s system of government. The bureaucracy is so understaffed that it is relying on industry hacks to draft policy. They have shaped deregulation and written clauses into the tax bill that pass costs from shareholders to society. { Economist | Continue reading } graphite pencil, crayon and collage on paper { Jasper Johns, Green Flag, 1956 [...]

Quiet Earth: CLOVERFIELD Sequel Viral Marketing Begins

The next film in the unconnected Cloverfield cinematic universe is due out April 20, 2018. Originally titled God Particle, the film will likely receive a new name since it was earmarked by J.J. Abrams to enter the Cloververse.

While we haven't seen any footage or images from the mysterious sci-fi film, the viral marketing has officially begun with the appearance of an update to the Tagruato website that includes the following cryptic message:

"Tokyo – January 18 2018: Tagruato has begun development on a revolutionary new ene [Continued ...]

new shelton wet/dry: The river that swallows all rivers

More recently we have supertasks such as Benardete’s Paradox of the Gods, A man decides to walk one mile from A to B. A god waits in readiness to throw up a wall blocking the man’s further advance when the man has travelled ½ a mile. A second god (unknown to the first) waits in readiness [...]

ScreenAnarchy: Sundance 2018 Review: THE GUILTY, a Tense Phone Call That Changes Everything

If Larry Cohen lived in Copenhagen, he might have written The Guilty. Veteran filmmaker Cohen, of course, has written dozens of screenplays that start with a clever idea and then expound on it with a wicked, pulp sense of humor and drama. Two good examples that spring immediately to mind are Phone Booth and Cellular, which both revolve around a phone call forcing the recipient into a suspenseful course of action by the caller. Putting that aside, however, director Gustav Möller says that a real-life incident inspired him to make The Guilty (original title: Den skyldige), which follows what happens when an emergency response operator receives a call from a woman who's apparently been kidnapped. Written by Möller and Emil Nygaard Albertsen, the screenplay takes...

[Read the whole post on]


A selection of recent work by artist Jesus Perea from Madrid. Click here for previous posts. See more images below.


Jesus Perea

Quiet Earth: Albert Pyun Confirms CYBORG Bonus Features

Shout! Factory have announced they will release Albert Pyun's post-apocalyptic, cyberpunk hybrid Cyborg on Blu-ray in April, but they haven't announced any bonus features yet. We previously reported the release was scheduled for January 30, but it was pushed back to April. Perhaps now we know why.

The film's director Albert Pyun confirmed on Twitter today that he is scheduled to record a new commentary for the film as well as partake in a series of interviews (a [Continued ...]


Montana-born, New York-based photographer Suzanne Saroff combines commonplace objects with different tools, techniques and colours to create alternative avenues of perception and expression. As she shared with us:

“Taking shape via shadows or fragmentations, my subjects often become more than the singular and expected version of themselves. By utilizing clear forms like cylinders and glasses of water, a watermelon can be multiplied, stretched and flipped as it dances and contorts within the walls of overlapping glass.”

See more of Saroff’s experimental explorations below!


 Suzanne Saroff

Planet Haskell: Brandon Simmons: In defense of partial functions in the haskell Prelude

…because I’m trying to blog more, and this sounds like a fun argument to try to make.

One of the most universally-maligned parts of Haskell is the inclusion of partial functions in its standard library, called Prelude. These include head and tail, which are undefined for the empty list:

head :: [a] -> a
head (a:_) = a
head [] = error "empty list"

It’s generally understood that the inclusion of these sorts of functions is a wart (that the type of head should be [a] -> Maybe a), one that has motivated the proliferation of Prelude alternatives, few of which are used by anyone besides their authors (and fewer still have enthusiastic advocates).
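For illustration, here is a minimal sketch of that total alternative. The name safeHead is mine (since Prelude already claims head); it mirrors what the safe package exports as headMay:

```haskell
-- A total version of head: the Maybe in the type forces callers
-- to handle the empty-list case explicitly, instead of crashing.
safeHead :: [a] -> Maybe a
safeHead (a:_) = Just a
safeHead []    = Nothing

main :: IO ()
main = do
  print (safeHead "abc")          -- Just 'a'
  print (safeHead ([] :: [Int]))  -- Nothing
```

The trade-off, of course, is that every call site now has to pattern-match or reach for Data.Maybe combinators, which is exactly the friction the partial version spares you when you already know the list is non-empty.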

I’ve heard a handful of allusions to tricky production bugs that involved some deep dive to find the source of a "*** Exception: Prelude.head: empty list", but personally I can recall only one instance of such a bug in the code I’ve worked on professionally and it was trivial to track down. I can’t recall flagging a use of head in a code review either, or raising an eyebrow at some use of the function in some library source I was perusing.

But most of the time the argument goes that partial functions should be removed for the sake of new users, who will become quickly disillusioned when their first function blows up with an exception. But you said haskell was safe!

It would be unfortunate if this caused a new user to give up, and maybe this is a real problem for the community, but here’s what I think really happens to most of us:

  • Your homework doesn’t work; this doesn’t matter.
  • You use Google and quickly learn that partial functions (will forever) exist, and that they’re bad
  • You ask yourself “Hm, come to think of it what did I expect to happen…?”

And so you learn an important lesson, early on and in the most painless way possible: you acquire a nose for inferring which functions must be partial, an appreciation for compiler warnings that help prevent accidentally-partial functions, etc.

Would I recommend designing a standard library around this weird sort of tough-love? Probably not, but I think the haskell library ecosystem and pedagogy have benefited from this wart.

The problems that get the most (and most passionate) attention are usually not the ones that are the most important, but the ones that are the most easily understood. I think in the proliferation of Preludes and the discussion around partial functions (and the fact that they haven’t been excised yet) we see evidence of both the Law of Triviality, and a healthy language pedagogy.

ScreenAnarchy: Slamdance 2018 Review: ROCK STEADY ROW Rewrites the College Experience

The elitism and sexism that tend to accompany higher learning take a beating in Travis Stevens' delightfully anarchic free-for-all.

[Read the whole post on]

Open Culture: How a Virtual Reality Model of Auschwitz Helped Convict an SS Concentration Camp Guard: A Short Documentary on a High Tech Prosecution

In 2016, Reinhold Hanning, a former SS guard at the Auschwitz concentration camp, was tried and convicted for being an accessory to at least 170,000 deaths. In making their case, prosecutors did something novel--they relied on a virtual reality version of the Auschwitz concentration camp, which helped undermine Hanning's claim that he wasn't aware of what happened inside the camp. The virtual reality headset let viewers see the camp from almost any angle, and established that "Hanning would have seen the atrocities taking place all around him."

The high-tech prosecution of Hanning gets well documented in "Nazi VR," the short documentary above. It comes from MEL Films, and will be added to our collection of online documentaries.

Follow Open Culture on Facebook and Twitter and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox. 

If you'd like to support Open Culture and our mission, please consider making a donation to our site. It's hard to rely 100% on ads, and your contributions will help us provide the best free cultural and educational materials.

Related Content:

Auschwitz Captured in Haunting Drone Footage (and a New Short Film by Steven Spielberg & Meryl Streep)

Carl Jung Psychoanalyzes Hitler: “He’s the Unconscious of 78 Million Germans.” “Without the German People He’d Be Nothing” (1938)

From Caligari to Hitler: A Look at How Cinema Laid the Foundation for Tyranny in Weimar Germany

Watch World War II Rage Across Europe in a 7 Minute Time-Lapse Film: Every Day From 1939 to 1945

How a Virtual Reality Model of Auschwitz Helped Convict an SS Concentration Camp Guard: A Short Documentary on a High Tech Prosecution is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

ScreenAnarchy: Criterion in April 2018: VIRGIN SUICIDES, DEAD MAN and a Bevy of Bergman

Apologies for the lateness of this posting, but since it's just you and me here, devoted fans of classy and extremely well-presented home video, allow me to say: the Criterion Collection's lineup is getting more and more exciting! In April 2018, the company plans to release two strikingly different black and white films: Leo McCarey's wonderful comedy The Awful Truth, starring Cary Grant, Irene Dunne and Ralph Bellamy; and Jim Jarmusch's Dead Man, his first period picture, starring Johnny Depp. Also coming: Sofia Coppola's strikingly subdued The Virgin Suicides, Sergei Parajanov's The Color of Pomegranates -- about which I know nothing -- and a bevy of Bergman. The latter is part of Criterion's no-frills Eclipse line and will allow fans of the fab Ingmar Bergman to...

[Read the whole post on]

Colossal: Walk Inside a Warehouse-Sized Kaleidoscopic Painting by Katharina Grosse

The newest work by German artist Katharina Grosse encompasses an entire warehouse, transforming its raw interior into a soft maze of kaleidoscopic color. The installation, titled The Horse Trotted Another Couple of Metres, Then it Stopped, responds to the architecture of Sydney’s contemporary art center Carriageworks, filling the industrial space with nearly 90,000 square feet of painted fabric.

“I was fascinated by the thought of folding space,” explained Grosse in a statement about the work. “I was interested in taking this vast surface and shrinking it by folding or, actually, hiding the entirety of what’s there. I understand a painting as something that, as we view it, travels through us and realigns our connections with the world.”

To produce the piece Grosse first suspended the multitude of fabric from Carriageworks’ ceiling, creating a series of drapes and folds. The artist then used a spray gun to paint the work in a series of gestural strokes, creating an immersive site-specific environment that obscures the historic building’s architecture in a dense mass of swirling color.

The work was mounted as a part of Sydney Festival 2018, and is on view through April 8, 2018. You can view more of Grosse’s large-scale paintings (including this 2016 in situ installation at Rockaway Beach) on her website.  (via Juxtapoz)

Ideas from CBC Radio (Highlights): Travels through Trump's America one year later

It’s been one year since Donald Trump’s inauguration. His official swearing-in compelled many Americans to reflect on what America actually is now, politically, socially and culturally. Contributor David Zane Mairowitz is originally from America, and has been living in Europe for over fifty years. He returned to the U.S. in the spring of 2017 to travel through six southern states, where he recorded his encounters with everyday people at restaurants, churches -- and gun shows. His aim: to gain insight into an America he’s now struggling to comprehend.

Planet Haskell: Mark Jason Dominus: Presidential tax return disclosure

The California state legislature passed a bill that would require presidential candidates to disclose their past five tax returns in order to qualify for California primary elections. The bill was vetoed by Governor Brown, but what if it had become law?

Suppose Donald Trump ran for re-election in 2020, as seems likely, barring his death or expulsion. And suppose he declined once again to disclose his tax returns, and was excluded from the California Republican primary election. I don't see how this could possibly hurt Trump, and it could benefit him.

It doesn't matter to Trump whether he enters the primary or wins the primary. Trump lost California by 30% in 2016. Either way he would be just as certain to get the same number of electors: zero. So he would have no incentive to comply with the law by releasing his tax returns.

Most candidates would do it anyway, because they try to maintain a pretense of representing the entire country they are campaigning to lead, but Trump is really different in this way. I can easily imagine he might simply refuse to campaign in California, instead dismissing the entire state with some vulgar comment. If there is a downside for Trump, I don't see what it could be.

Someone else (call them “Ronnie”) would then win the California Republican primary. Certainly Ronnie is better-qualified and more competent than Trump, and most likely Ronnie is much more attractive to the California electorate.

Ronnie might even be more attractive than the Democratic candidate, and might defeat them in the general election, depriving Trump's challenger of 55 electoral votes and swinging the election heavily in Trump's favor.

Did I miss anything?

[ Addendum 20180120: Yeah, I forgot that after the primary there is a convention that nominates a national party candidate. Whooops. Further discussion. ]

Penny Arcade: Comic: Paragone

New Comic: Paragone

Open Culture: Celebrate the Women’s March with 24 Goddess GIFs Created by Animator Nina Paley: They’re Free to Download and Remix

As millions of women, men, and friends beyond the binary gear up for Women's March events around the world this weekend, we can’t help but draw strength from the Venus of Willendorf in Graphics Interchange Format, above.

Like the pussy hats that became the most visible symbol of last year’s march, there’s a strong element of humor at play here.

Also respect for the female form.

As Dr. Bryan Zygmont notes in his Khan Academy essay on the Venus of Willendorf, her existence is evidence that “nomadic people living almost 25,000 years ago cared about making objects beautiful. And … that these Paleolithic people had an awareness of the importance of the women.”

Animator Nina Paley has taken up our Paleolithic ancestors’ baton by creating two dozen early goddess GIFs, including the Venus.

As further proof that sisterhood is powerful, Paley is sharing her unashamedly bouncy pantheon with the public. Visit her blog to download all 24 individual goddess GIFs. Disseminate them widely. Use them for good! No permission needed.

Paley is no stranger to goddesses, having previously placed the divine heroine of the Ramayana front and center in her semi-autobiographical feature length animation, Sita Sings the Blues.

She’s also incredibly familiar with rights issues, following massive complications with some vintage recordings her Betty Boop-ish Sita lip-synchs in the film. (She had previously believed them to be in the public domain.) Unable to pay the huge sum the copyright holders demanded to license the tunes, Paley ultimately decided to relinquish all legal claims to her own film, placing Sita Sings the Blues in the public domain, to be freely shared, exhibited, or even remixed.

If Paley's the poster child for copyright issues she’s also a shining example of deriving power from unlikely sources.

As she wrote on her website nearly ten years ago:

My personal experience confirms audiences are generous and want to support artists. Surely there's a way for this to happen without centrally controlling every transaction. The old business model of coercion and extortion is failing. New models are emerging, and I'm happy to be part of that. But we're still making this up as we go along. You are free to make money with the free content of Sita Sings the Blues, and you are free to share money with me. People have been making money in Free Software for years; it's time for Free Culture to follow. I look forward to your innovations.

As for Paley's own plans for her goddesses, they’ll be a part of her upcoming animated musical, Seder-Masochism; as she notes, “all early peoples conceived the divine as female.”

Download Nina Paley’s Goddess GIFs here. Watch Sita Sings the Blues here. March ever onward!

Related Content:

3D Scans of 7,500 Famous Sculptures, Statues & Artworks: Download & 3D Print Rodin’s Thinker, Michelangelo’s David & More

How Ancient Greek Statues Really Looked: Research Reveals their Bold, Bright Colors and Patterns

The Goddess: A Classic from the Golden Age of Chinese Cinema, Starring the Silent Film Icon Ruan Lingyu (1934)

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine. Join her on February 8 for Necromancers of the Public Domain, when a host of New York City-based performers and musicians will resurrect a long-forgotten work from 1911 as a low-budget variety show. Follow her @AyunHalliday.

Celebrate the Women’s March with 24 Goddess GIFs Created by Animator Nina Paley: They’re Free to Download and Remix is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

CreativeApplications.Net: Die With Me – The chat app for when you have less than 5% battery

Created by Dries Depoorter & David Surprenant and presented in collaboration with IDFA Doclab, Die With Me is a chat app available for iOS and Android that you can only use when you have less than 5% battery.

things magazine: See you around

Locrating presents huge quantities of mapping data about commuting, schools, etc. ‘Outstanding’ schools are green, ‘good’ ones are yellow, and woe betide your property prices if you live in close proximity to the little bomb-like red ones / … Continue reading

Open Culture: Ian McKellen Chokes Up While Reading a Poignant Coming-Out Letter

"In 1977, Armistead Maupin wrote a letter to his parents that he had been composing for half his life," writes the Guardian's Tim Adams. "He addressed it directly to his mother, but rather than send it to her, he published it in the San Francisco Chronicle, the paper in which he had made his name with his loosely fictionalised Tales of the City, the daily serial written from the alternative, gay world in which he lived." The late 1970s saw a final flowering of newspaper-serialized novels, the same form in which Charles Dickens had grown famous nearly a century and a half before. But of all the zeitgeisty stories then told a day at a time in urban centers across America, none has had anything like the lasting impact of San Francisco as envisioned by Maupin.

Much of Tales of the City's now-acknowledged importance comes from the manner in which Maupin populated that San Francisco with a sexually diverse cast of characters — gay, straight, and everything in between — and presented their lives without moral judgment.

He saved his condemnation for the likes of Anita Bryant, the singer and Florida Citrus Commission spokeswoman who inspired Maupin to write that veiled letter to his own parents when she headed up the anti-homosexual "Save Our Children" political campaign. When Michael Tolliver, one of the series' main gay characters, discovers that his folks back in Florida have thrown in their lot with Bryant, he responds with an eloquent and long-delayed coming-out that begins thus:

Dear Mama,

I'm sorry it's taken me so long to write. Every time I try to write you and Papa I realize I'm not saying the things that are in my heart. That would be OK, if I loved you any less than I do, but you are still my parents and I am still your child.

I have friends who think I'm foolish to write this letter. I hope they're wrong. I hope their doubts are based on parents who love and trust them less than mine do. I hope especially that you'll see this as an act of love on my part, a sign of my continuing need to share my life with you. I wouldn't have written, I guess, if you hadn't told me about your involvement in the Save Our Children campaign. That, more than anything, made it clear that my responsibility was to tell you the truth, that your own child is homosexual, and that I never needed saving from anything except the cruel and ignorant piety of people like Anita Bryant.

I'm sorry, Mama. Not for what I am, but for how you must feel at this moment. I know what that feeling is, for I felt it for most of my life. Revulsion, shame, disbelief — rejection through fear of something I knew, even as a child, was as basic to my nature as the color of my eyes.

You can hear Michael's, and Maupin's, full letter read aloud by Sir Ian McKellen in the Letters Live video above. In response to its initial publication, Adams writes, "Maupin had received hundreds of other letters, nearly all of them from readers who had cut out the column, substituted their own names for Michael’s and sent it verbatim to their own parents. Maupin’s Letter to Mama has since been set to music three times and become 'a standard for gay men’s choruses around the world.'"

Those words come from a piece on Maupin's autobiography Logical Family, published just last year, in which the Tales of the City author tells of his own coming out as well as his friendships with other non-straight cultural icons, one such icon being McKellen himself. "I have many regrets about not having come out earlier," McKellen told BOMB magazine in 1998, "but one of them might be that I didn't engage myself in the politicking." He'd come out ten years before, taking a stand in opposition to Section 28 of the Local Government Bill, then under consideration in the British Parliament, which prohibited local authorities from depicting homosexuality "as a kind of pretended family relationship."

McKellen entered the realm of activism in earnest after choosing that moment to reveal his sexual orientation on the BBC, which he did on the advice of Maupin and other friends. A few years later he appeared in the television miniseries adaptation of Tales of the City as Archibald Anson-Gidde, a wealthy real-estate and cultural impresario (one, as Maupin puts it, of the city's "A-gays"). In the novels, Archibald Anson-Gidde dies closeted, of AIDS, provoking the ire of certain other characters for not having done enough for the cause in life — a charge that, thanks in part to the words of Michael Tolliver, neither Maupin nor McKellen will ever face.

Related Content:

Benedict Cumberbatch Reads a Letter Alan Turing Wrote in “Distress” Before His Conviction For “Gross Indecency”

Allen Ginsberg Talks About Coming Out to His Family & Fellow Poets on 1978 Radio Show (NSFW)

Ian McKellen Reads a Passionate Speech by William Shakespeare, Written in Defense of Immigrants

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Ian McKellen Chokes Up While Reading a Poignant Coming-Out Letter is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email.

Comic for 2018.01.19

New Cyanide and Happiness Comic

TheSirensSound: Review for 52 by Orellana

Orellana is a neo-classical/post-rock collective hailing from Bristol, UK. Their new album “52”, released in late December, brought in the new year with its explosive and intricate sound. The project’s music transcends genre definitions in order to focus on a broad, diverse concept that is more emotional than tangible. This particular release is full of rich and diverse arrangements, but it is also a powerful exercise in minimalism, one that showcases the strength of very few notes placed in the right spots. The simplicity of the arrangement is actually one of the strongest aspects of this entire release: there’s a palpable stillness created by the long drone notes in the background, which almost makes you feel like the world is happening in slow motion. When the chords and notes change, it feels quite monumental due to the beautiful contrast between the stillness of the background textures and the expressive sound of the guitar-based melodies.

“Waterbirth” is the first track on the album, and it begins with haunting cinematic melodies on a carpet of lush drones and textures. As the name suggests, this number flows into a great crescendo, and then it blooms with powerful and vigorous harmonies that are reminiscent of contemporary post-rock artists such as "This Will Destroy You" or "Explosions in the Sky", among others. This track serves as a perfect opening number, allowing listeners to dive into the aesthetics of the record fully.

“I See You ft. Mahesh Raghunandan” is another stand-out track from this release, as it takes the listener into a more ethereal and melancholic plane. This track defies listeners' expectations: instead of being another purely instrumental track, it features vocals contributed by Mahesh Raghunandan. The singer’s delicate tone makes for a truly special vibe. The song is similar in sound to the dreamy, otherworldly style of artists such as “Sigur Rós”, where the same minimalistic approach reaches new heights and depths.

“Is That ... You?” is a great closing number, a tune that echoes the immense sonic experience of bands such as “Mono” or “Mogwai”. This song is a stirring and moving end to a multi-layered and complex record.

Check out “52” on Orellana’s Bandcamp, where you can listen to the album for free, or “name your price” to download it and get exclusive bonus tracks!

new shelton wet/dry: I know the boy will well usurp the grace, Voice, gait and action of a gentlewoman

This study documents that men and women experience and perform consumer shopping differently. […] There is an abundant literature on sex differences in spatial abilities and object location that follow from the specific navigational strategies associated with hunting and gathering in the ancestral environment. In addition to sex differences in navigational strategies, the unique features [...]

Planet Haskell: FP Complete: Signs Your Business Needs a DevOps Consultant

How to Know if Hiring a DevOps Consultant is Right for Your Business

DevOps is a set of tools and methods to get your applications from Dev through Deployment to Operations. It automates a reliable path from “my app runs on one machine” to “my app is online for all users and data: secure, scalable, managed, and maintainable.”


A selection of work by graphic artist Antonio Carrau from Maldonado, Uruguay. See more images below.


Antonio Carrau

new shelton wet/dry: Loading the BRICKS from my FRONT YARD into a DUMPSTER because my neighbor TODD is a FUCKHEAD

…the “Trump Carousel” in New York’s Central Park. The problem there: “It was never named Trump Carousel,” said Crystal Howard of the New York City parks department. She said the Trump Organization — which had a contract to operate the attraction, whose name is the Friedsam Memorial Carousel — had simply put up a sign that renamed [...]

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: What it means

Only in Canada would a quarter-point rate increase grip the nation. It’s weird. The folks in Venezuela (inflation 2,350% this year) would laugh at us. Such first-world problems we have.

However, there’s a good reason this week’s news matters. Consider those lines of credit secured by real estate that people have binged on. We owe an historic $211 billion in HELOCs. Astonishingly (says the federal government) four in ten people are paying nothing on them. Nada. Zilch. Another quarter pay just the interest. So, two-thirds of borrowers haven’t been reducing their debt load by a penny – even during a time when the cost of money’s been incredibly low. What a losing strategy. Everyone should have known rates would rise. Now they are.

Home equity lines float along with the prime. A quarter point increase means families will have to fund $1.6 billion more in bank charges in 2018, or try to absorb that amount of additional debt. That’s what a lousy quarter point means. When surveys show us almost half the households in Canada have less than $200 after paying the monthlies, how can this not matter?

This week realtors went into overdrive trying to minimize the impact of central bank tightening, aided and abetted by their pimps in the media. “In the near-term, it will likely mean some belt-tightening among those with variable rate mortgages and lines of credit, and with more increases expected, some consumers will be scrimping further as the year goes on,” said the country’s largest newspaper. “But the demand for housing remains so strong that the higher mortgage rates aren’t expected to have a big impact on home sales.”

We’ll see. Higher rates have pushed the benchmark five-year rate to 5.14%. The B20 stress test now requires borrowers to qualify at that rate, or their offered bank rate plus 2%, whichever is greater. Last January, rates were 2% and there was no stress test. So, obviously, somebody is lying to you. And it’s not this blog.
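
The qualification rule is simple arithmetic; a minimal sketch (the sample offered rates below are illustrative, not actual quotes):

```python
# B20 stress test: borrowers qualify at the benchmark five-year rate
# (5.14% at the time of writing) or their offered rate plus 2 points,
# whichever is greater.

def qualifying_rate(offered_rate, benchmark=5.14):
    """Rate (in %) a borrower must qualify at under the B20 stress test."""
    return max(benchmark, offered_rate + 2.0)

assert qualifying_rate(2.5) == 5.14    # benchmark dominates a low offered rate
assert qualifying_rate(3.25) == 5.25   # offered + 2 points exceeds the benchmark
```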

It will take time for the new reality to sink in, and be translated into changes in market supply and demand. But it seems a no-brainer that listings will increase and the pool of buyers shrink as credit is restricted and tens of thousands of purchasers are shut out. More supply, less demand. Prices fall.

Here are some interesting thoughts from the capital markets guys at brokerage Macquarie. Because Canadians have self-pickled in historic heaps of debt, they say, this cycle of rate hikes by the Bank of Canada is “far graver” than most people believe. If rates rise just one more itsy-bitsy weeny quarter point, the impact will be 65% to 80% as severe as the one that crashed Toronto real estate by 32% in the early 1990s.

Those of you old enough to recall those days may remember house prices declining year after year after year. Nobody was buying. Properties turned illiquid. Real estate was seen as a risk-laden asset after a huge price run-up in the previous decade. I sold a commercial building to a dude who ran into trouble because of the recession and couldn’t pay the financing I’d extended to him. So we swapped out two condos in a brand new building on the waterfront, cancelling his mortgage. I sat on those for six years until I could recoup the money, all the while leased for negative cash flow because rents had tumbled along with prices. Once the storm hit, it took 14 years for prices to recover.

2018 is not 1991, of course. It might be worse. Canadians have never owed so much, nor been so leveraged into real estate. Macquarie points out that 30% of the whole economy comes from selling houses or cars, which is 50% greater than in the past. “The wealth effect from rising home prices has driven nearly 40 per cent of nominal growth in gross domestic product over the past three years, about two to four times the amount experienced previously when the BoC was hiking rates. Even as this has occurred, fixed business investment and exports have struggled, limiting the ability for a virtuous domestic growth cycle to unfold. This again is in sharp contrast to similar periods in the past when these were accelerating.”

In fact, 2% of all of Canada’s GDP has come from realtor commissions alone. Yikes. Imagine all the homeless Audis that will suffer as a result.

Well, add in the stress test and Macquarie warns the effects will be exaggerated. Buyers are expected to have 17% less purchasing power, “which jumps to 23% after incorporating the rise in mortgage rates since mid-2017.” So, all things being equal, houses should cost 23% less than they were when the rates started to move – which was mid-summer.
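
The implied price drop is simple arithmetic; a quick sketch using the figures in the post:

```python
# Macquarie's estimate: buyers have roughly 23% less purchasing power once
# the stress test and the rise in mortgage rates since mid-2017 are combined.
avg_detached_mid_2017 = 1_304_000   # average Toronto detached price, mid-2017
reduction = 0.23

implied_price = avg_detached_mid_2017 * (1 - reduction)
implied_decline = avg_detached_mid_2017 - implied_price

# Roughly the $300,000 decline the post asks about.
assert round(implied_decline) == 299_920
```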

At that time the average Toronto detached was changing hands for $1.304 million. So are we on the way to a $300,000 decline as the market reflects the changes to credit and the cost of money? If so, why would anyone buy now?

Beats me. I’ve seen this movie before.

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: Influential Voices: An Interview with Artist Shantell Martin

Colossal: A Relaxing Video Demonstrates the Detailed Steps of Making Paper by Hand

Chinese vlogger Li Ziqi films her videos in the serene countryside of China, demonstrating step-by-step instructions for making traditional recipes such as fresh pomelo honey and Lanzhou beef noodles. In one of her most recent videos Li presents the days-long process of traditional Chinese paper making, a process which can be traced back to the early years of the Han Dynasty, sometime within the 2nd century BC.

The soothing video weaves together the necessary steps for making paper from scratch. During the video Li strictly adheres to the ancient process, using only basic tools such as fire and a mortar and pestle to transform the raw bark. After cutting down a few trees for the paper, Li then cuts and mashes the trunks into pulp, solidifying the consistency of the solution through several rounds of soaking and drying. You can watch the entirety of the demonstration above (along with a surprising twist ending), and view more of Li’s relaxing instructionals on her YouTube channel. (via Laughing Squid)

Planet Haskell: Comonad Reader: Computational Quadrinitarianism (Curious Correspondences go Cubical)

Back in 2011, in an influential blog post [1], Robert Harper coined the term "computational trinitarianism" to describe an idea that had been around a long time — that the connection between programming languages, logic, and categories, most famously expressed in the Curry-Howard-Lambek correspondence — should guide the practice of researchers into computation. In particular "any concept arising in one aspect should have meaning from the perspective of the other two". This was especially satisfying to those of us trying to learn categorical semantics and often finding it disappointing how little it is appreciated in the computer science community writ large.

1. Categories

Over the years I've thought about trinitarianism a lot, and learned from where it fails to work as much as where it succeeds. One difficulty is that learning to read a logic like a type theory, or vice versa, is almost a definitional trick, because it occurs at the level of reinterpretation of syntax. With categories it is typically not so easy. (There is a straightforward version of categorical semantics like this — yielding "syntactic categories" — but it is difficult to connect it to the broader world of categorical semantics, and often it is sidestepped in favor of deeper models.)

One thing I came to realize is that there is no one notion of categorical semantics — the way in which the simply typed lambda calculus takes models in cartesian closed categories is fundamentally unlike the way in which linear logics take models in symmetric monoidal categories. If you want to study models of dependent type theories, you have a range of approaches, only some of which have been partially unified by Ahrens, Lumsdaine and Voevodsky in particular [2]. And then there are the LCCC models pioneered by Seely for extensional type theory, not to mention the approach that takes semantics directly in toposes, or in triposes (the latter having been invented to unify a variety of structures, and in the process giving rise to still more questions). And then there is the approach that doesn't use categories at all, but multicategories.

Going the other way, we also run into obstacles: there is a general notion, opposite to the "syntactic category" of a type theory, which is the "internal logic" of a category. But depending on the form of category, "internal logic" can take many forms. If you are in a topos, there is a straightforward internal logic called the Mitchell–Bénabou language. In this setting, most "logical" operations factor through the truth-lattice of the subobject classifier. This is very convenient, but if you don't have a proper subobject classifier, then you are forced to reach for other interpretations. As such, it is not infrequently the case that we have a procedure for deriving a category from some logical theory, and a procedure for constructing a logical theory from some category, but there is no particular reason to expect that where we arrive, when we take the round-trip, is close to, much less precisely, where we began.

2. Spaces, Logics

Over the past few years I've been in a topos theory reading group. In the course of this, I've realized at least one problem with all the above (by no means the only one) — Harper's holy trinity is fundamentally incomplete. There is another structure of interest — of equal weight to categories, logics, and languages — which it is necessary to understand to see how everything fits. This structure is spaces. I had thought that it was a unique innovation of homotopy type theory to consider logics (resp. type theories) that took semantics in spaces. But it turns out that I just didn't know the history of constructive logic very well. In fact, in roughly the same period that Curry was exploring the relationship of combinatory algebras to logic, Alfred Tarski and Marshall Stone were developing topological models for intuitionistic logic, in terms of what we call Heyting Algebras [3] [4]. And just as, as Harper explained, logic, programming and category theory give us insights into implication in the form of entailment, typing judgments, and morphisms, so to, as we will see, do spaces.

A Heyting algebra is a special type of distributive lattice (partially ordered set, equipped with meet and join operations, such that meet and join distribute over one another) which has an implication operation that satisfies curry/uncurry adjointness — i.e. such that c ∧ a ≤ b <-> c ≤ a → b. (Replace meet here by "and" (spelled "*"), and ≤ by ⊢ and we have the familiar type-theoretic statement that c * a ⊢ b <-> c ⊢ a → b).

If you haven't encountered this before, it is worth unpacking. Given a set, we equip it with a partial order by specifying a "≤" operation, such that a ≤ a, if a ≤ b and b ≤ a, then a = b, and finally that if a ≤ b and b ≤ c, then a ≤ c. We can think of such things as Hasse diagrams — a bunch of nodes with some lines between them that only go upwards. If a node b is reachable from a node a by following these upwards lines, then a ≤ b. This "only upwards" condition is enough to enforce all three conditions. We can define ∨ (join) as a binary operation that takes two nodes, and gives a node a ∨ b that is greater than either node, and furthermore is the uniquely least node greater than both of them. (Note: A general partial order may have many pairs of nodes that do not have any node greater than both of them, or may that may have more than one incomparable node greater than them.) We can define ∧ (meet) dually, as the uniquely greatest node less than both of them. If all elements of a partially ordered set have a join and meet, we have a lattice.

It is tempting to read meet and join as "and" and "or" in logic. But these logical connectives satisfy an additional important property — distributivity: a & (b | c) = (a & b) | (a & c). (By the lattice laws, the dual property with and swapped with or is also implied). Translated for lattices this reads: a ∧ (b ∨ c) = (a ∧ b) ∨ (a ∧ c). Rather than thinking just about boolean logic, we can think about lattices built from sets — with meet as intersection, join as union, and ≤ given by inclusion. It is easy to verify that such lattices are distributive. Furthermore, every distributive lattice can be given (up to isomorphism) as one built out of sets in this way. While a partially ordered set can have a Hasse diagram of pretty arbitrary shape, a lattice is more restrictive — I imagine it as sort of the tiled diamonds of an actual lattice like one might use in a garden, but with some nodes and edges possibly removed.
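
The set-based picture can be checked by brute force. A quick sketch (assuming nothing beyond the standard library), with meet as intersection and join as union over all subsets of a three-element universe:

```python
from itertools import combinations, product

# The lattice of all subsets of {1, 2, 3}: meet is &, join is |, ≤ is inclusion.
universe = {1, 2, 3}
subsets = [frozenset(c) for r in range(len(universe) + 1)
           for c in combinations(universe, r)]

# Distributivity holds in any lattice built from sets, in both dual forms:
for a, b, c in product(subsets, repeat=3):
    assert a & (b | c) == (a & b) | (a & c)
    assert a | (b & c) == (a | b) & (a | c)
```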

Furthermore, there's an amazing result that you can tell if a lattice is distributive by looking for just two prototypical non-distributive lattices as sublattices. If neither is contained in the original lattice, then the lattice is distributive. These tell us how distribution can fail in two canonical ways. The first is three incomparable elements, all of which share a common join (the top) and meet (the bottom). The join of anything but their bottom element with them is therefore the top. Hence if we take the meet of two joins, we still get the top. But the meet of any two non-top elements is the bottom and so, if we take the join of any element with the meet of any other two, we get back to the first element, not all the way to the top, and the equality fails. The second taboo lattice is constructed by having two elements in an ordered relationship, and another incomparable to them — again augmented with a bottom and top. A similar argument shows that if you go one way across the desired entity, you pick out the topmost of the two ordered elements, and the other way yields the bottommost. (The Wikipedia article on distributive lattices has some very good diagrams to visualize all this). So a distributive lattice has even more structure than before — incomparable elements must have enough meets and joins to prevent these sublattices from appearing, and this forces even more the appearance of a tiled-diamond like structure.
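
The first taboo lattice (three incomparable elements sharing a common top and bottom) can be exhibited directly. A small sketch, with an encoding of my own choosing, that hunts for triples where distributivity fails:

```python
from itertools import product

# The "diamond": a bottom, three incomparable middle elements, and a top.
elems = ['bot', 'x', 'y', 'z', 'top']
leq = lambda a, b: a == b or a == 'bot' or b == 'top'

def lub(a, b):
    uppers = [u for u in elems if leq(a, u) and leq(b, u)]
    return next(u for u in uppers if all(leq(u, v) for v in uppers))

def glb(a, b):
    lowers = [l for l in elems if leq(l, a) and leq(l, b)]
    return next(l for l in lowers if all(leq(v, l) for v in lowers))

# Hunt for triples where meet fails to distribute over join:
violations = [(a, b, c) for a, b, c in product(elems, repeat=3)
              if glb(a, lub(b, c)) != lub(glb(a, b), glb(a, c))]

# x ∧ (y ∨ z) = x ∧ top = x, but (x ∧ y) ∨ (x ∧ z) = bot ∨ bot = bot.
assert ('x', 'y', 'z') in violations
```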

To get us to a Heyting algebra, we need more structure still — we need implication, which is like an internal function arrow, or an internal ≤ relation. Recall that the equation we want to satisfy is "c ∧ a ≤ b < -> c ≤ a → b". The idea is that we should be able to read ≤ itself as an "external implication" and so if c and a taken together imply b, "a implies b" is the portion of that implication if we "only have" c. We can see it as a partial application of the external implication. If we have a lattice that permits infinite joins (or just a finite lattice such that we don't need them), then it is straightforward to see how to construct this. To build a → b, we just look at every possible choice of c that satisfies c ∧ a ≤ b, and then take the join of all of them to be our object a → b. Then, by construction, a → b is necessarily greater than or equal to any c that satisfies the left hand side of the equation. And conversely, any element that a → b is greater than is necessarily one that satisfies the left hand side, and the bi-implication is complete. (This, by the way, gives a good intuition for the definition of an exponential in a category of presheaves). Another way to think of a → b is as the greatest element of the lattice such that a → b ∧ a ≤ b (exercise: relate this to the first definition). It is also a good exercise to explore what happens in certain simple cases — what if a is 0 (false)? What if it is 1? The same as b? Now ask the same questions of b.
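
The "join of all candidate contexts" construction above can be carried out mechanically in a finite lattice. A sketch, using the opens of a small toy topology of my own choosing, that builds a → b this way and verifies the curry/uncurry adjunction:

```python
from itertools import product

# Opens of a small topological space on {1, 2, 3} (a toy example):
# closed under union and finite intersection, with top and bottom.
opens = [frozenset(s) for s in [set(), {1}, {2}, {1, 2}, {1, 2, 3}]]

def imp(a, b):
    """a → b: the join (here, union) of every open c with c ∧ a ≤ b."""
    out = frozenset()
    for c in opens:
        if c & a <= b:
            out |= c
    return out

# The adjunction: c ∧ a ≤ b iff c ≤ a → b, for all opens.
for a, b, c in product(opens, repeat=3):
    assert (c & a <= b) == (c <= imp(a, b))
```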

So why is a Heyting algebra a topological construct? Consider any topological space as given by a collection of open sets, satisfying the usual principles (including the empty set and the total set, and closed under union and finite intersection). These open sets have a partial ordering, given by containment. They have unions and intersections (all joins and meets), a top and bottom element (the total space, and the empty set). Furthermore, they have an implication operation as described above. As an open set, a → b is given by the join of all opens c for which a ∧ c ≤ b. (We can think of this as "the biggest context, for which a ⊢ b"). In fact, the axioms for open sets feel almost exactly like the rules we've described for Heyting algebras. It turns out this is only half true — open sets always give Heyting algebras, and we can turn every Heyting algebra into a space. However, in both directions the round trip may take us to somewhere slightly different than where we started. Nonetheless it turns out that if we take complete Heyting algebras where finite meets distribute over infinite joins, we get something called "frames." And the opposite category of frames yields "locales" — a suitable generalization of topological spaces, first named by John Isbell in 1972 [5]. Spaces that correspond precisely to locales are called sober, and locales that correspond precisely to spaces are said to have "enough points" or be "spatial locales".
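
For opens of a space, the implication also has a pleasantly concrete form: a → b is the interior of (complement of a) ∪ b. A sketch verifying, on a small toy topology of my own choosing, that this agrees with the join-of-contexts definition:

```python
from itertools import product

# Opens of a small space on X = {1, 2, 3}; implication computed two ways.
X = frozenset({1, 2, 3})
opens = [frozenset(s) for s in [set(), {1}, {2}, {1, 2}, {1, 2, 3}]]

def interior(s):
    """Largest open set contained in s: the union of all opens inside it."""
    out = frozenset()
    for c in opens:
        if c <= s:
            out |= c
    return out

def imp(a, b):
    """a → b as the join of all opens c with c ∧ a ≤ b."""
    out = frozenset()
    for c in opens:
        if c & a <= b:
            out |= c
    return out

# The two definitions agree: a → b = interior(complement(a) ∪ b).
for a, b in product(opens, repeat=2):
    assert imp(a, b) == interior((X - a) | b)
```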

In fact, we don't need to fast-forward to 1972 to get some movement in the opposite direction. In 1944, McKinsey and Tarski embarked on a program of "The Algebra of Topology" which sought to describe topological spaces in purely algebraic (axiomatic) terms [6]. The resultant closure algebras (these days often discussed as their duals, interior algebras) provided a semantics for S4 modal logic. [7] A further development in this regard came with Kripke models for logic [8] (though arguably they're really Beth models [9]).

Here's an easy way to think about Kripke models. Start with any partially ordered set. Now, for each object, consider instead all morphisms into it. Since each morphism from any object a to any object b exists only if a ≤ b, and we consider such paths unique (if there are two "routes" showing a ≤ b, we consider them the same in this setting), this amounts to replacing each element a with the set of all elements ≤ a. (The linked pdf does this upside down, but it doesn't really matter). Even though the initial setting may not have been a Heyting algebra, this transformed setting is a Heyting algebra. (In fact, by a special case of the Yoneda lemma!). This yields Kripke models.
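
The downset construction can be sketched directly: start from a poset that is not itself a Heyting algebra (here, a toy example of my own choosing: two incomparable elements beneath a common top, with no bottom) and check that its downsets are closed under union and intersection and carry an implication.

```python
from itertools import chain, combinations, product

# A poset that is not itself a lattice: a and b incomparable, both below c.
elems = ['a', 'b', 'c']
leq = lambda x, y: x == y or (y == 'c' and x in ('a', 'b'))

def powerset(s):
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def is_downset(s):
    # Closed under going downward: if x is in s and y ≤ x, then y is in s.
    return all(y in s for x in s for y in elems if leq(y, x))

downsets = [frozenset(s) for s in powerset(elems) if is_downset(s)]

# The downsets form a lattice: closed under union and intersection.
for s, t in product(downsets, repeat=2):
    assert s | t in downsets and s & t in downsets

# And implication exists: s → t is the join of all downsets d with d ∧ s ≤ t.
def imp(s, t):
    out = frozenset()
    for d in downsets:
        if d & s <= t:
            out |= d
    return out

for s, t, d in product(downsets, repeat=3):
    assert (d & s <= t) == (d <= imp(s, t))
```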

Now consider "collapsings" of elements in the initial partial order — monotone downwards maps taken by sending some elements to other elements less than them in a way that doesn't distort orderings. (I.e. if f(a) ≤ f(b) in the collapsed order, then that means that a ≤ b in the original order). Just as we can lift elements from the initial partial order into their downsets (sets of elements less than them) in the kripkified Heyting Algebra, we can lift our collapsing functions into collapsing functions in our generated Heyting Algebra. With a little work we can see that collapsings in the partial order also yield collapsings in the Heyting Algebra.

Furthermore, it turns out, more or less, that you can generate every closure algebra in this way. Now if we consider closure algebras a bit (and this shouldn't surprise us if we know about S4), we see that we can always take a to Ca, that if we send a → b, then we can send Ca → Cb, and furthermore that CCa → Ca in a natural way (in fact, they're equal!). So closure algebras have the structure of an idempotent monad. (Note: the arrows here should not be seen as representing internal implication — as above they represent the logical turnstile ⊢ or perhaps, if you're really in a Kripke setting, the forcing turnstile ⊩).

Now we have a correspondence between logic and computation (Curry-Howard), logic and categories (Lambek-Scott), and logic and spaces (Tarski-Stone). So maybe, instead of Curry-Howard-Lambek, we should speak of Curry-Howard-Lambek-Scott-Tarski-Stone! (Or, if we want to actually bother to say it, just Curry-Howard-Lambek-Stone. Sorry, Tarski and Scott!) Where do the remaining correspondences arise from? A cubical Kan operation, naturally! But let us try to sketch in a few more details.

3. Spaces, Categories

All this about monads and Yoneda suggests that there's something categorical going on. And indeed, there is. A poset is, in essence, a "decategorified category" — that is to say, a category where any two objects have at most one morphism between them. I think of it as if it were a balloon animal that somebody let all the air out of. We can pick up the end of our poset and blow into it, inflating the structure back up, and allowing multiple morphisms between each object. If we do so, something miraculous occurs — our arbitrary posets turn into arbitrary categories, and the induced Heyting algebra from their opens turns into the induced category of set-valued presheaves of that category. The resultant structure is a presheaf topos. If we "inflate up" an appropriate notion of a closure operator we arrive at a Grothendieck topos! And indeed, the internal language of a topos is higher-order intuitionistic type theory [10].

4. Spaces, Programming Languages

All of this suggests a compelling story: logic describes theories via algebraic syntax. Equipping these theories with various forms of structural operations produces categories of one sort or another, in the form of fibrations. The intuition is that types are spaces, and contexts are also spaces. And furthermore, types are covered by the contexts in which their terms may be derived. This is one sense in which it seems possible to interpret the Melliès/Zeilberger notion of a type refinement system as a functor [11].

But where do programming languages fit in? Programming languages, difficult as it is to sometimes remember, are more than their type theories. They have a semantics of computation as well. For example, a general topos does not have partial functions, or a fixed point combinator. But computations, often, do. This led to one of the first applications of topology to programming languages — the introduction of domain theory, in which terms are special kinds of spaces — directed-complete partial orders — and functions obey a special kind of continuity (preservation of directed suprema) that allows us to take their fixed points. But while the category of dcpos is cartesian closed, the category of pointed dcpos with strict continuous morphisms is not. Trying to resolve this gap, one way or another, seems to have been a theme of research in domain theory throughout the 80s and 90s [12].
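To make the fixed-point idea concrete, here is a small Python sketch (mine, not from the article). The Kleene chain, bottom, F(bottom), F(F(bottom)), and so on, computes the least fixed point of a continuous functional by iterated approximation; the example functional, whose fixed point is the factorial function, is invented for illustration:

```python
# Hedged sketch: domain theory lets us take least fixed points of
# continuous maps by iterating from bottom. The domain here is partial
# functions on the naturals, represented as dicts and ordered by graph
# inclusion; {} is the nowhere-defined bottom element, and each
# application of F defines factorial on one more input.

def F(f):
    """One unfolding of the factorial recursion, given a partial f."""
    g = {0: 1}
    for n, v in f.items():
        g[n + 1] = (n + 1) * v
    return g

def lfp_approx(steps):
    """Kleene chain: bottom <= F(bottom) <= F(F(bottom)) <= ..."""
    f = {}  # bottom: defined nowhere
    for _ in range(steps):
        f = F(f)
    return f

fact = lfp_approx(7)   # factorial is now defined on inputs 0..6
print(fact[5])         # prints 120
```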

Computations can also be concurrent. Topological and topos-theoretic notions again can play an important role. In particular, to consider two execution paths to be "the same" one needs a notion of equivalence. This equivalence can be seen, stepwise, as a topological "two-cell" tracing out at each step an equivalence between the two execution paths. One approach to this is in Joyal, Nielsen and Winskel's treatment of open maps [13]. I've also just seen Patrick Schultz and David I. Spivak's "Temporal Type Theory" which seems very promising in this regard [14].

What is the general theme? Computation starts somewhere, and then goes somewhere else. If it stayed in the same place, it would not "compute". A computation is necessarily a path in some sense. Computational settings describe ways to take maps between spaces, under a suitable notion of topology. To describe the spaces themselves, we need a language — that language is a logic, or a type theory. Toposes are a canonical place (though not the only one) where logics and spaces meet (and where, to a degree, we can even distinguish their "logical" and "spatial" content). That leaves categories as the ambient language in which all this interplay can be described and generalized.

5. Spaces, Categories

All the above only sketches the state of affairs up to roughly the mid '90s. The connection to spaces starts in the late 30s, going through logic, and then computation. But the categorical notion of spaces we have is in some sense impoverished. A topos-theoretic generalization of a space still only describes, albeit in generalized terms, open sets and their lattice of subobject relations. Spaces have a whole other structure built on top of that. From their topology we can extract algebraic structures that describe their shape — this is the subject of algebraic topology. In fact, it was in axiomatizing a branch of algebraic topology (homology) that category theory was first invented. And the "standard construction" of a monad first arose in the study of homology groups (as the Godement resolution).

What happens if we turn the tools of categorical generalization of algebraic topology on categories themselves? This corresponds to another step in the "categorification" process described above. Where to go from "0" to "1" we took a partially ordered set and allowed there to be multiple maps between objects, to go from "1" to "2" we can now take a category, where such multiple maps exist, and allow there to be multiple maps between maps. Now two morphisms, say "f . g" and "h" need not merely be equal or not, but they may be "almost equal" with their equality given by a 2-cell. This is just as two homotopies between spaces may themselves be homotopic. And to go from "2" to "3" we can continue the process again. This yields n-categories. An n-category with all morphisms at every level invertible is an (oo,0)-category, or an infinity groupoid. And in many setups this is the same thing as a topological space (and the question of which setup is appropriate falls under the name "homotopy hypothesis" [15]). When morphisms at the first level (the category level) can have direction (just as in normal categories) then those are (oo,1)-categories, and the correspondence between groupoids and spaces is constructed as an equivalence of such categories. These too have direct topological content, and one setting in which this is especially apparent is that of quasi-categories, which are (oo,1)-categories that are built directly from simplicial sets — an especially nice categorical model of spaces (the simplicial sets at play here are those that satisfy a "weak" Kan condition, which is a way of asking that composition behave correctly).

It is in these generalized (oo,1)-toposes that homotopy type theory takes its models. And, it is hypothesized that a suitable version of HoTT should in fact be the initial model (or "internal logic") of an "elementary infinity topos" when we finally figure out how to describe what such a thing is.

So perhaps it is not that we should be computational trinitarians, or quadrinitarians. Rather, it is that the different aspects which we examine — logic, languages, categories, spaces — only appear as distinct manifestations when viewed at a low dimensionality. In the untruncated view of the world, the modern perspective is, perhaps, topological pantheism — spaces are in all things, and through spaces, all things are made as one.

Thanks to James Deikun and Dan Doel for helpful technical and editorial comments


Quiet Earth: Cyberpunk in Cinema: BLADE RUNNER from 2019 to 2049 [Part 2]

[Editor's note: Be sure to read part one of Rochefort's series here.]

Cyberspace, artificial intelligence, street tech, cyborgs, shady megacorporations. These are just some of the staples of Cyberpunk, a science-fiction subgenre that began as a literary movement spearheaded by authors like William Gibson (whose “Neuromancer” won the Hugo, Nebula, and Philip K. Dick awards in 1984), Bruce Sterling, Rudy Rucker, Lewis Shiner, John Shirley and Neal Stephenson.

Cyberpunk reshaped the way we see the future, and has informed our understanding of the age of big data. Its pervasive and frequently subversive influence can be found in everything from the way w [Continued ...]

Open Culture: Watch David Byrne Lead a Massive Choir in Singing David Bowie’s “Heroes”

Throughout the years, we've featured performances of Choir!Choir!Choir!--a large amateur choir from Toronto that meets weekly and sings their hearts out. You've seen them sing Prince's "When Doves Cry," Soundgarden's "Black Hole Sun" (to honor Chris Cornell) and Leonard Cohen's "Hallelujah."

The product of that whimsy is now evident in this footage, almost seven minutes of exceptional sonic transformation, as the tape loop is mixed with dense oscillations, all of which is shifted, looped, glitched, and warped. There are terse bell tones and effluent white noise, lens-flare grace notes and ecstatic birdsong to “Blossom,” which true to its name expands as it proceeds — what starts as loose and gentle gets more chaotic and rambunctious as time passes. The beauty of the video isn’t merely the color and framing, but how active Annie’s left hand is, adjusting settings on various synthesizer modules, tweaking the balance of the tape deck, and lending a conductor-like visual narration to the piece.

This is the latest video I’ve added to my YouTube playlist of recommended live performances of ambient music. Video originally posted on Ann Annie’s YouTube channel.

Jesse Moynihan: Swords and Cups

Been assembling some notes on Swords and Cups. Still need to lock down a cup design to start from. Getting close to something, but I thought I’d just share what I have so far.

things magazine: Ruined forms

Casa Sperimentale, by Giuseppe Perugini, at Architizer and Dear Magazine, where these photos by Marco Ponzianelli were published. One of the internet’s favourite bits of ‘lost Brutalism’. See also the Visual News essay by Oliver Astrologo / twisted pop, slowed … Continue reading

Quiet Earth: New on Blu-ray and DVD! January 16, 2018

Thirty years after the events of the first film, a new Blade Runner, LAPD Officer K (Ryan Gosling), unearths a long-buried secret that has the potential to plunge what’s left of society into chaos. K’s discovery leads him on a quest to find Rick Deckard (Harrison Ford), a former LAPD blade runner who has been missing for 30 years.

If you want to really dive into the history of Blade Runner and a look at the new film, read Rochefort's exhaustive article [Continued ...]

TheSirensSound: New track "You Will Go On My First Whistle" by Arms That Fit Like Legs

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: The trigger

Up she goes! Central bank raises rates: Link

One more sleep until the nobs at the central bank take back another bit of stimulus. With the BoC rate increasing a quarter point this week, that’s three increases in a handful of months, with more on the way. The implications could be epic, especially if you borrowed against your house to buy Bitcoins.

But what are the odds Stephen Poloz will pull the trigger? Yuge, apparently.

Of the 26 know-it-all economists surveyed by Bloomberg this week, 23 said it’ll happen. Overnight swaps are giving it 92% odds. And the thinking is this hike on Wednesday will be one of three that we get in 2018.

So that would increase the bank prime to a hair under 4%, and boost the cheapest HELOCs to 4.5%, with five-year mortgages advancing to 4.25%. Add in the stress test and borrowers must qualify at 6.25% later this year. Yes, that’s a 300% increase in 12 months. Our bankers have turned into rate-normalizing hawks. Or buzzards. Opening the door for vultures.

Rising rates and B20 pose a combination that cannot be ignored. Yesterday this axe-wielding blog told you a mere 77 detached houses sold in Toronto (416) in the first two weeks of January. Yeah, it was brutally cold and this month so far is a bucket of suck, but that’s a disaster by any measure. Prices for detacheds fell over $90,000, or 9.1%.

A similar story is being told by Zolo (recall that realtors suspended their mid-month reports last year, lacking the stones shown by Calgary). According to that site, monthly prices have declined in Toronto overall by 8.8%, and over the last three months the erosion is almost 14%. So imagine what happens when a five-year mortgage breaks 4%. Ouch.

Of course there are reasons the rate escalation might stop. Commodity prices (oil) plop. NAFTA blows up. Hawaii blows up. Trump blows up. Massively indebted Canadian households blow up. But in the absence of that, everybody should expect the normalization of the cost of money to continue. Never again in your lifetime will 2% five-year mortgages be available. Everything changes. Bond prices fall. Real estate comes back to earth. All variable-rate and demand loan costs rise. Mortgage credit restricts. And yet that heaping pile of debt is undiminished.

Who will pay the wages of sinful borrowing?

Many. For example, 14% of all households in the GTA, the nation’s largest market by far, own multiple properties. Typically that’s a principal residence which rose in value like a horny fungus, plus an ‘investment’ condo financed with a HELOC against the main house. The trouble now is the condo will be dropping in value and can’t be rented for enough to carry it. So it eventually gets dumped at a loss, bringing down its peers.

There are untold numbers of flippers, speckers and dumbass amateur landlords who have poured all their liquid net worth into leveraged properties, now facing a tough 2018. They may have bought at peak house last year and have already lost 10-15% of their capital. They may have purchased before selling an existing home – now turned illiquid. They may have financed condo-lusty spawn through Bank of Mom loans, but junior can’t afford the condo fees, plus the place is worth less than the financing. They may have taken mortgages from some dodgy company that, thanks to B20, won’t be renewing them. Now they face a stress test they may not pass.

Or perchance they’re one of the 49,000 realtors in Toronto with leased Audis and no income, whose idea of financial diversification is to own two condos instead of one. Of course, there are the 48% of Canadians (as mentioned yesterday) who would be in deep trouble if their monthly expenses increase by $200 – almost guaranteed to happen as Poloz turns the screws. Unless he doesn’t. But Mr. Market says, at 92% odds, that he will.

So, in summary, it’s not different this time. It never is.

Ask the moisters who bought Bitcoin on Visa or with lines of credit against their condos a few months ago at almost $20,000. As I write this the cryptocurrency is worth less than $12k, and the chart suggests it has a good chance of sinking further. Zero might be a reasonable bottom. As you may know, China is cracking down on digital coin trading. South Korea used the hammer. The SEC has halted ETFs based on Bitcoin. Central banks, regulators and governments globally, as mentioned, will not allow an unregulated currency to undermine state-issued money. Like, duh. How is that possibly a surprise?

Bitcoin’s unearned ascent and worthy descent exemplify what a bubble is. Uninformed demand and greed propelled it. Common sense and fear wreck it. Faerie dust.

At least you could paper your bathroom with worthless Nortel certificates.

You get to live in a house that financially ruins you.

Bitcoin? Just a hole seared in your judgment.

Ideas from CBC Radio (Highlights): First Nation, Second Nation: A discussion about the state of Indigenous people in Canada today

Canadians like to pretend that Indigenous peoples have some special place, that they shape our society in some significant way, but history -- as well as contemporary actions and attitudes -- might suggest otherwise. In a country where just about all of us are immigrants, Indigenous people are creating new structures and rediscovering old values. A discussion from the Stratford Festival featuring Leanne Betasamosake Simpson, Jarrett Martineau and Alexandria Wilson.

The Shape of Code: First use of: software, software engineering and source code

While reading some software related books/reports/articles written during the 1950s, I suddenly realized that the word ‘software’ was not being used. This set me off looking for the earliest use of various computer terms.

My search process consisted of using pdfgrep on my collection of pdfs of documents from the 1950s and 60s, and looking in the index of the few old computer books I still have.

Software: The Oxford English Dictionary (OED) cites an article by John Tukey published in the American Mathematical Monthly during 1958 as the first published use of software: “The ‘software’ comprising … interpretive routines, compilers, and other aspects of automotive programming are at least as important to the modern electronic calculator as its ‘hardware’.”

I have a copy of the second edition of “An Introduction to Automatic Computers” by Ned Chapin, published in 1963, which does a great job of defining the various kinds of software. Earlier editions were published in 1955 and 1957. Did these earlier editions also contain various definitions of software? I cannot find any reasonably priced copies on the second-hand book market. Do any readers have a copy?

Software engineering: The OED cites a 1966 “letter to the ACM membership” by Anthony A. Oettinger, then ACM President: “We must recognize ourselves … as members of an engineering profession, be it hardware engineering or software engineering.”

The June 1965 issue of COMPUTERS and AUTOMATION, in its Roster of organizations in the computer field, has the list of services offered by Abacus Information Management Co.: “systems software engineering”, and by Halbrecht Associates, Inc.: “software engineering”. This pushes the first use of software engineering back by a year.

Source code: The OED cites a 1965 issue of Communications of the ACM: “The PUFFT source language listing provides a cross reference between the source code and the object code.”

The December 1959 Proceedings of the EASTERN JOINT COMPUTER CONFERENCE contains the article: “SIMCOM – The Simulator Compiler” by Thomas G. Sanborn. On page 140 we have: “The compiler uses this convention to aid in distinguishing between SIMCOM statements and SCAT instructions which may be included in the source code.”

Running pdfgrep over the archive of documents on bitsavers would probably turn up all manner of early uses of software-related terms.

Planet Lisp: Quicklisp news: The Quicklisp local-projects mechanism

Quicklisp provides a lot of software, but there's also a simple way to load things that Quicklisp doesn't provide. That same mechanism can be used to override libraries Quicklisp does provide.

The local projects mechanism sets up a special directory that is automatically scanned for software to load. Here are a few quick examples.

Trying a library not in Quicklisp

First, imagine that you just heard about a great new library and want to try it. However, it's not available through Quicklisp yet, only through a git repository on One easy way to try it:
$ cd ~/quicklisp/local-projects
$ git clone
After the git command completes, and there is a fun-project subdirectory with a fun-project/fun-project.asd file present, the system is visible to ASDF and can be loaded either with ql:quickload or asdf:find-system. When loaded through ql:quickload, Quicklisp will automatically fetch and load any prerequisites as well.

Overriding a library in Quicklisp

Second, imagine that you want to hack on a library that Quicklisp already provides. You don't want to load and hack on the version from Quicklisp - that software is not under version control, and just represents a snapshot of the project at a particular point in time.

Once again, the procedure is to put the software in the ~/quicklisp/local-projects/ directory:
$ cd ~/quicklisp/local-projects/
$ git clone
After the git command completes, (ql:quickload "vecto") will load the library from local-projects rather than from the standard Quicklisp release.

How it works

The local-projects mechanism is relatively automatic. Here's how it works underneath, and how to fix problems that might crop up.

ASDF has an extensible mechanism (the asdf:*system-definition-search-functions* variable) for searching for system files. Quicklisp extends this mechanism with a function that does the following, all in the context of the local-projects directory.
  1. If there is no file named system-index.txt, it is created by scanning the directory tree for system files (matching "*.asd"). Each pathname is added to the file.
  2. If the system-index.txt file exists, but its timestamp is older than its containing directory, the directory is rescanned and the index recreated.
  3. The system-index.txt is searched for any entry with a pathname-name that matches the desired system name. If there's a match, the matching pathname is probed. If it still exists, it is returned. If it has disappeared, the system-index.txt is recreated as in step 1 and the search is retried.
  4. Otherwise the system search is deferred to the remaining ASDF system search functions.
When there are multiple system files with the same name in the directory tree, the one with the shortest full pathname is returned. In the case of a tie in pathname length, the one that sorts first under #'string< is returned.
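For illustration, here is a rough Python analogue of this lookup. This is my sketch of the algorithm as just described, not Quicklisp's actual code, and the function names are invented:

```python
# Sketch of the local-projects search: scan the tree for .asd files
# (step 1), then resolve a system name to the matching pathname,
# preferring the shortest path and breaking ties lexicographically.

import os

def build_index(root):
    """Walk the directory tree and record every .asd pathname."""
    index = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".asd"):
                index.append(os.path.join(dirpath, name))
    return index

def find_system(index, system_name):
    """Return the pathname whose file name matches system_name,
    shortest full pathname first, then lexicographic order."""
    matches = [p for p in index
               if os.path.splitext(os.path.basename(p))[0] == system_name]
    if not matches:
        return None  # defer to the remaining ASDF search functions
    return min(matches, key=lambda p: (len(p), p))
```

The real mechanism also caches this index in system-index.txt and rebuilds it on timestamp changes, as described above.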

Timestamp problems can sometimes crop up with step 2 above. For example, if you have a directory local-projects/my-project/ and you create local-projects/my-project/supporting-system.asd, the timestamp of local-projects/ is not updated and supporting-system.asd won't be automatically added to the system index file.

There are a couple of ways to force an update of the system index file. Within Lisp, you can use (ql:register-local-projects) to immediately regenerate system-index.txt. Outside of Lisp, you can use the touch command (or an equivalent) to update the timestamp of the local-projects directory, which will trigger a rebuild of the index on the next attempt to find systems.

Because of how the system index file is created (and recreated as needed), Quicklisp must have write access to the local-projects directory to make use of it.


The local-projects mechanism is configured through a special variable ql:*local-project-directories*. By default, it includes only the local-projects subdirectory in the Quicklisp install directory, but you can add or remove directories at any time to have more places scanned for systems.
To disable the local-projects mechanism entirely, set ql:*local-project-directories* to NIL.

New Humanist Blog: The science of belief: a conversation

Scientists Colin Blakemore and Tom McLeish examine how the cognitive impetus that drove the emergence of science might be considered to be the same impetus that fostered religion and other metaphysical beliefs.

Planet Lisp: Quicklisp news: Build failures with ASDF 3.3.1

SBCL 1.4.3 ships with ASDF 3.3.1, and a number of Quicklisp projects have build problems as a result. Linedit, mgl, micmac, cl-string-match, and others are affected.

Here is a build failure report for yesterday. (You can ignore the gendl failures - it's a special case.) If anyone has ways to fix these projects, please do so as soon as you can - otherwise they will be removed from the January Quicklisp dist update in a few weeks.

Tea Masters: A lesson in Hung Shui Oolongs

How does one learn tea? As with wine, by tasting it, and by knowing exactly what one is tasting. A beginner will start by tasting teas from different families (white, green, Oolong, red, puerh, for example). But to progress, one focuses the lesson on more specific teas, closer to one another, in order to appreciate the small details that change between leaves that are different yet similar.
For this lesson focused on Taiwan's Hung Shui Oolongs, I prepared 2 Chaxi. The one above is for my student (it has a qinghua porcelain tray under the cups so as not to stain the Chabu). My Chaxi below has no tray. My challenge will therefore be not to spill any tea on the brown cloth during my brews.
We begin with this winter's top Hung Shui Jinxuan from Dong Ding. I show how a gentle, slow pour (during the first 2 infusions) gives depth and purity to a recently roasted Oolong. One could simply have called this tea a Dong Ding Oolong, since it comes from Dong Ding and is roasted the way they traditionally are there. If I prefer the name "Hung Shui" (red water), it's because that is the technical name for Oolongs roasted in the Dong Ding manner. It's a more accurate name, because 2 phenomena blur the meaning of the name Dong Ding Oolong:
1. with the popularity of fresh, unroasted Oolongs, unroasted Oolongs are now also produced in the Dong Ding region (this one is a good example of a well-oxidized but unroasted Oolong),
2. the Dong Ding competition is open to all farmers on the island as long as the tea is produced in the Hung Shui style, so that the winners of this 'Dong Ding' competition often come from the high mountains of Shan Lin Xi or Lishan.
Next, we brewed the top Hung Shui from Dong Ding, also from this winter. Its roast was a little lighter, its leaves greener and, above all, smaller. This reduced size is due to the qingxin Oolong cultivar and to the fact that the leaves were harvested relatively early. In the high mountains, by contrast, we noticed that the qingxin Oolong leaves of this competition Hung Shui Oolong from Shan Lin Xi are more mature and longer. We also noticed that the 'competition' style is more heavily roasted than my winter 2017 Dong Ding batch, whose freshness is particularly well preserved.
The Hung Shui Oolong family covers a wide spectrum of roasts and oxidations. With these two variables, it is possible to produce teas with very different aromas: a blend of ripe fruit, honey and molasses, or of roasted grains like malt or puffed rice, sometimes so intense that they bring whisky to mind (without the alcohol!). Sometimes a certain underlying freshness remains when the roasting is well done.
The other particularity of Hung Shui Oolongs is that their leaves keep well over time. That is why it is interesting to taste 2 styles of aged Hung Shui:
Hung Shui Oolong from 2003 and 1979
1. (Left) A Hung Shui Oolong from Dong Ding from 2003. This one has not been re-roasted since 2003, which is why its leaves look greener than those of the
2. Hung Shui Oolong from Dong Ding from 1979, which has a heavier roast and whose dry leaves have had time to open up.
Although these 2 aged teas have lost their roasted scents, it is best to brew them very gently to accentuate their length in the mouth. We can clearly see that the 2003 leaves open up very well and that the golden infusion shows a very fine balance. On the nose, this roughly fifteen-year-old Hung Shui Oolong has completely lost its roasted scents. In their place are scents reminiscent of the duty-free perfume shops in airports! The dominant scent is osmanthus. The taste is dancing and pure. This is an Oolong that doesn't show its age, only its quality!
For the 1979 tea, the first infusion is very surprising. It has a good dose of acidity (Wuyi suan is the Chinese technical term), which fortunately finishes on sweeter notes. It resembles plum wine in both acidity and aromas. The following infusions are softer, less acidic, but the notes remain powerful. Now we smell noble, ancient wood instead. The roasting notes, on the other hand, have also disappeared.
Meanwhile, I met my challenge: I did not stain my Chabu despite the many infusions of these 5 Hung Shui Oolongs!

OCaml Weekly News: OCaml Weekly News, 16 Jan 2018

  1. Bioinformatics with OCaml: Prohlatype
  2. MariaDB 1.0.1
  3. µDNS - an opinionated DNS server library
  4. Uri 1.9.6 available: with improved ARM support and build times
  5. new release of vpt (a Vantage-point tree library)
  6. OCaml 2017 videos?
  7. Other OCaml News

Tea Masters: Dong Ding finesse

Winter 2017 Top Hung Shui Oolong from Dong Ding 
It's difficult to brew tea perfectly. I confess that it doesn't happen as often as I'd like. It's a constant challenge and I'm the first to notice when I merely brewed a good rather than a great cup of tea. But that's also what makes tea so interesting. And that's what I try to show to people who come to have tea with me, that when we are brewing the same tea we get different results. If my cup is better, it's because I have spent a lot of time improving my brewing technique. I even wrote a guide to share what I've learned!

When I notice in the blog that my cup is better, my point is not to show off but to make my readers realize that the brewing is essential. This is something that is difficult to experience when you brew alone, but it's something I taste at every tea class: a tea can go from flat and average to deep and excellent just through the proper way of pouring water into the teapot!
But sometimes the problem lies somewhere else. Usually, for gongfucha, tea prepared in a small teapot, the more (roasted Oolong or puerh) leaves, the better, because the tea will taste more concentrated and intense. In that spirit, I brewed this top Hung Shui Oolong from Dong Ding with a fair amount of leaves last week. I found the result quite disappointing (too rough and too strong), even when brewing the leaves for a very short time with this same Duanni teapot.
This time, I drastically reduced the amount of leaves for my brews (see above). Instead of weighing the leaves, I used a very small antique qinghua plate to display the leaves I was about to brew. Usually, the rule is to roughly cover the bottom of the teapot with dry leaves. This time it was just half covered.
While pouring water in the teapot, this small qinghua plate can also be used to place the lid on it. This small plate must have had a different purpose when it was made some 100 years ago. Using it to display/measure the leaves and for the lid is what the Japanese would call a Mitate (見立て) whereby the tea master finds a new, tea related purpose in a Chaxi for an ordinary object.
With fewer leaves and slow brews, this Dong Ding Oolong was able to express all its finesse, sweetness combined with its underlying freshness. Now it's close to perfection, I thought! The roasting aromas and the tea aromas are in harmony and not overshadowing each other. And the tea feels alive, dancing on the palate, sweet in the throat and slowly melting away.
I also had a very beautiful Chaxi for my failed brews last week. But I was glad that this one is even nicer and that it includes the latest tea postcard that you helped me choose on Facebook as my newest gift for purchases on
On this postcard we can see Qilin lake and Oolong plantations that are part of the Dong Ding village where Dong Ding Oolong started!

Addendum: the open leaves after the brews. They are quite small and therefore well concentrated.

Comic for 2018.01.16

New Cyanide and Happiness Comic

Daniel Lemire's blog: Microbenchmarking calls for idealized conditions

Programmers use software benchmarking to measure the speed of software. We need to distinguish system benchmarking, where one seeks to measure the performance of a whole system, from microbenchmarking, where one seeks to measure the performance of a small, isolated operation.

For example, if you are building a browser, you might want to know how quickly the iPhone 7 can retrieve and display the Wikipedia home page. That’s what I call system benchmarking.

Microbenchmarking is much more narrow in scope. So maybe you are interested in how quickly your processor can divide two numbers. Or maybe you want to know how quickly your processor can generate a random number.

Microbenchmarking is almost entirely distinct from system benchmarking, even if it looks superficially the same.

If you are trying to find out how quickly your processor can divide two integers, you want to know that, nothing else. You don’t want to measure how quickly the processor can divide two numbers while loading a website and cleaning out its registry.

Microbenchmarking is a form of science. Think about the physicist trying to measure the speed of light in some medium. It usually calls for idealized conditions. Why would you want to use idealized conditions when you know that they will not occur in the normal course of system operations?

  1. Idealized conditions make it easier to reason about the results. You are much more likely to know about the factors that influence your code if you have far fewer factors.
  2. Idealized conditions are more consistent and reproducible. Real systems are not stationary. Their performance tends to fluctuate. And to reproduce even your own results can be a challenge when working with real systems. A small idealized scenario is much easier to reproduce.

Though microbenchmarking can be used as part of an engineering project, the primary purpose is to gain novel insights.

You might object to idealized conditions… what if you want to know how fast your code would run “in practice”? The problem is that “in practice” is ill-defined. You need to tell us what the “practice” is. That is, you need to describe a system, and then you are back into system benchmarking.

System benchmarking is more typically an engineering activity. You seek to make a system that people care about better. You can and should be able to get reproducible results with system benchmarking, but it is much more costly. For one thing, describing a whole system is much harder than describing a tiny function.

In the type of microbenchmarking I like to do, CPU microbenchmarking, the idealized conditions often take the following form:

  • As much as possible, all the data is readily available to the processor. We often try to keep the data in cache.
  • We make sure that the processor runs at a flat clock frequency. In real systems, processors can run slower or faster depending on the system load. As far as I know, it is flat out impossible to know how many CPU cycles were spent on a given task if the CPU frequency varies on Intel processors. Let me quote Wikipedia on time-stamp counters (TSC):

    Constant TSC behavior ensures that the duration of each clock tick is uniform and makes it possible to use the TSC as a wall clock timer even if the processor core changes frequency. This is the architectural behavior for all later Intel processors.

    Your processor can run faster or slower than the advertised frequency, unless you specifically forbid it.

  • If the code performance depends on branching, then we make sure that the branch predictor has had the chance to see enough data to make sound predictions. An even better option is to avoid branching as much as possible (long loops are ok though).
  • When using a programming language like Java, we make sure that the just-in-time compiler has done its work. Ideally, you’d want to avoid the just-in-time compiler entirely because it is typically not deterministic: you cannot count on it to compile the code the same way each time it runs.
  • You keep memory allocation (including garbage collection) to a minimum.
  • You keep system calls to a minimum.
  • You must benchmark something that lasts longer than ~30 cycles, but you can use loops.
  • You have to examine the assembly output from your compiler to make sure that your compiler is not defeating your benchmark. Optimizing compilers can do all sorts of surprising things. For example, your compiler could figure out that it does not need to compute the trigonometric functions you are calling very accurately, and so it could fall back on a cheaper approximation. If you can’t examine the assembly output, you should be very careful.
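To make the list above concrete, here is a minimal, hypothetical timing harness in Python (Lemire's own benchmarks are in C; this sketch only illustrates the same discipline): warm the code path first, keep allocation and I/O out of the timed region, amortize timer overhead over a long inner loop, and keep the minimum over many trials.

```python
import time

def measure_min_ns(fn, warmup=100, trials=200, inner=1000):
    """Report the best-case per-call time of `fn` in nanoseconds."""
    for _ in range(warmup):              # warm caches and branch predictors
        fn()                             # (and, in other runtimes, the JIT)
    best = float("inf")
    for _ in range(trials):
        start = time.perf_counter_ns()
        for _ in range(inner):           # long loop amortizes timer overhead
            fn()
        elapsed = time.perf_counter_ns() - start
        best = min(best, elapsed / inner)
    return best
```

Taking the minimum over many short trials is the estimator defended later in the post; the warmup phase and the long inner loop stand in for the cache, branch-predictor and just-in-time conditions in the list.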

Of course, these are not requirements for a good microbenchmark, it is just a set of requirements that I have often found useful.

If you are lucky and can meet the above requirements, then you can run reliable CPU microbenchmarks in microseconds.

What do I mean by reliable? Last week, I reported that standard C code can interleave two 32-bit integers into a 64-bit integer using about 3.6 cycles on a Skylake processor. In my original post, I only ran 5 trials, picked the minimum, and divided the total clock time by the number of pairs. This was criticized: according to some, my benchmark did not last long enough to “stabilize” the results; according to others, my test was too short to have good accuracy.
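For reference, the operation being measured here (bit-interleaving two 32-bit integers into one 64-bit integer, i.e. a Morton code) can be written naively as follows. This plain-loop Python version is only for illustration; the benchmarked C code uses much faster bit tricks.

```python
def interleave(x: int, y: int) -> int:
    """Interleave the bits of two 32-bit integers into a 64-bit integer:
    bits of x land in the even positions, bits of y in the odd positions."""
    z = 0
    for i in range(32):
        z |= ((x >> i) & 1) << (2 * i)       # even bit positions
        z |= ((y >> i) & 1) << (2 * i + 1)   # odd bit positions
    return z
```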

I returned to this Skylake processor and computed a histogram. I ran my short function (which computes 1000 interleaves) 2048 times, recording each duration before dividing it by the number of pairs (1000). Running this whole experiment takes less than a second and only requires a simple script. So let us look at the histogram:

The bulk of the results are in the 3.6 to 3.65 range (72% of all data points). The median is 3.636. As a first approximation, the noise follows a log-normal distribution. You might expect the measurement noise to follow a normal distribution (a bell curve), but normal distributions are uncommon when measuring computer performance.

From this histogram, one could argue that the “real” time is maybe slightly above 3.6 cycles per pair. It is maybe somewhere between 3.6 and 3.7 cycles. But that is a reasonably small uncertainty.

Even though 72% of data points are between 3.6 and 3.65, the average is 3.6465… but that can be explained by the presence of outliers (for example, one measure was 17.2).

I like to report the minimum because it is easy to reason about: it is close to the best-case scenario for the processor. We know a lot about what processors can do in the best case… Yes, it is idealized but that’s fine.

My minimum is still the minimum of large averages (1000 pairs)… so it is more robust than you might expect. Also, I can reasonably expect the noise to be strictly positive, so the minimum makes sense as an estimator.

If you want to know how fast a human being can run, you look at the world’s records and pick the smallest time. When I run microbenchmarks, I want to know how fast my processor can run my code in ideal conditions.

If you had a normal distribution (bell curve), then taking the minimum would be a terrible idea because you would tend to track unlikely events (also called sigma events). But that’s not the case in performance: with a proper testing methodology, your minimum will be consistent from run to run.
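The claim that the minimum is a stable estimator under strictly positive noise is easy to check with a small, self-contained simulation (the 3.6-cycle figure and the log-normal noise model come from the text; the specific noise parameters are made up for illustration):

```python
import random
import statistics

random.seed(42)
TRUE_COST = 3.6  # idealized cycles per pair, as in the text

def one_measurement():
    # Measurement noise is strictly positive and roughly log-normal.
    return TRUE_COST + random.lognormvariate(-3, 1)

runs = [[one_measurement() for _ in range(2048)] for _ in range(5)]
minima = [min(r) for r in runs]
averages = [statistics.mean(r) for r in runs]

# The minima hug the true cost; the averages are pulled up by outliers.
print(minima)
print(averages)
```

With this positive noise model, every run's minimum lands within a fraction of a percent of the true cost, while every run's average sits visibly above it, which is exactly the consistency argument made in the paragraph above.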

Is the minimum necessarily a good metric? I think it is in CPU-bound microbenchmarks. If you can check that the average (on a quiet machine) is close to the minimum, then the minimum is surely sound. Intel recommends looking at the variance of the minimum from run to run. If the variance is small (e.g., 0), then the minimum is likely a sound metric.

There is nothing magical about the average; since your distributions are almost never normal, it is merely an easily computed statistic. If you are doing service benchmarks, percentiles (1%, 20%, 50%, 80%, 99%) are very useful, and the minimum and maximum are just instances of percentiles.
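For service-style benchmarks, percentiles are a one-liner in most languages; this small illustration (made-up numbers, including one 17.2 outlier like the one mentioned above) uses Python's statistics.quantiles:

```python
import statistics

# Hypothetical per-run timings, with one large outlier.
samples = [3.62, 3.61, 3.64, 3.63, 3.60, 3.65, 3.62, 17.2, 3.61, 3.63]

cuts = statistics.quantiles(samples, n=100)  # 99 percentile cut points
p50 = cuts[49]                               # the median

# The minimum and maximum are just the extreme percentiles.
print(min(samples), p50, max(samples))
```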

When in doubt regarding which metric to use, just plot your data. If you can’t script it, just throw the numbers into a spreadsheet (like Excel) and generate a graph.

Notice how I returned to this benchmark a week later, after the machine was updated and rebooted, and was able, without effort, to reproduce exactly my prior results.

Why don’t I include normality tests, standard errors and all that? Because I don’t need to. The best statistician is the one you never need. It is something of a myth that scientific results need fancy statistics: great scientific results require very little statistical sophistication. You just need to control the factors involved so that the numbers can speak for themselves.

A good analogy is how you weigh people. If you do things right, that is, you weigh the individual naked at always the same time at the beginning of the day, before any food has been eaten, a single (short) measurement each day is more than good enough. The “error” is probably small and irrelevant. If you weigh people randomly during the day, after random meals, wearing random clothes, then you may need many measurements to accurately estimate someone’s weight. Of course, once you throw in the complexity of having to deal with a whole population, it becomes much harder to control all the factors involved and you may need fancy statistics again.

Wouldn’t I get more accurate results if I repeated my tests more often? Maybe not. If you pick up a textbook, you might learn that averaging repeated measures brings you closer to a true value… but there are hidden assumptions behind this result that typically do not hold. Moreover, it is very hard to keep a modern system busy doing just one small thing for a long time. You can do it, but it requires a lot more care.

So long-running benchmarks are often not good idealized microbenchmarks because they also measure all sorts of irrelevant operations like unnecessary cache misses, context switches and so forth. The numbers you get often depend on how quiet your machine was at the time. You can run the same benchmark for three days in a row and get different numbers (with error margins far above 1%). It is certainly possible to get good long-running microbenchmarks, it is just harder.

Think about the surface of the Earth. If you move very little, then you can safely assume that the ground is flat and that we live on a planar surface. If you move a lot, you will encounter mountains and seas, and so forth.

(Of course, system benchmarks can and maybe should be run over a long time… but that’s different!)

Can it hurt to run really exhaustive tests? I think it can. The computer might not tire of waiting for seconds and minutes, but the human mind fares a lot better when answers keep coming quickly. I favor many small and cheap microbenchmarks over a few long-running ones, even when the long-running benchmarks are perfectly clean.

The idea that if a benchmark takes longer to run, it ought to be better might seem intuitively self-evident, but I think we should be critical of this view. Fast is good.

Penny Arcade: News Post: NFL DLC

Tycho: I’ll talk to Gabe later about changing the first i to a capital I, but since it’s in the original script I suspect it’s gonna have to be something I do myself with tools I fashion from wood and stone.  Sometimes I can convince Kiko to do it in secret.  I might be getting him in trouble, there. I have to admit that I’ve been quite startled by the viewership on Twitch for a few recent events, but this is what happens when a piece of content floods a new viewership into the platform.  When I saw the OWL running on Wed - didn’t see the new one on…

things magazine: Clatter

The Vestaboard, when you want text messages delivered on old-fashioned train announcement boards / Superseventies is a tumblr devoted to said decade, includes the occasional item that might reflect badly in the modern era / a London charity shop scavenger hunt (via …) Continue reading

Perlsphere: Maintaining Perl 5 (Tony Cook): November 2017 Report

This is a monthly report by Tony Cook on his grant under the Perl 5 Core Maintenance Fund. We thank the TPF sponsors for making this grant possible.

Two tickets were worked on.

[Hours]         [Activity]
 17.98          #127743 work out a practical fix, work on implementation
                #127743 finish implementation (with some side trips –
                found a new bug), testing
                #127743 commit, work on fix for network retrieves of large
                objects, fix some -DDEBUGGING build issues (amongst a maze
                of massive macro expansions)
                #127743 build issues, portability work, testing
                #127743 run entire test suite, find out -DDEBUGGING builds
                of Storable are painfully slow for large arrays, hashes
                #127743 fix a huge.t failure I introduced, more testing,
                fix flagged hash bug
                #127743 rebase on blead (complicated by ad2ec6b54c),
                testing, rebase fixes
                #127743 cross platform testing, fixes
 13.44          #132506 netbsd in-place edit failures, reproduce, testing,
                #132506 re-working in-place finalization code
                #132506 re-working in-place finalization code
                #132506 re-working in-place finalization code
                #132506 debugging, more fixes
                #132506 debugging fork test leaving work files behind
                #132506 fix unlink on backup failure breakage, testing,
                update hints for netbsd
                #132506 cross platform testing
 31.42 hours total

new shelton wet/dry: To flame in you. Ardor vigor forders order.

America’s largest city, 8.5 million strong, is taking decisive action on two separate fronts. We are demanding compensation from those who profit from climate change. And we plan to withdraw our formidable investment portfolio from an economic system that is harmful to our people, our property and the city we love and invest it in [...]

TheSirensSound: "Texas" by maxwell cabana

Riding on soulful vocals and picked up in the second half by woodwind flourishes, “Texas” is about loneliness and redemption, the soundtrack to a pause for some much needed reflection. The Portland, OR-based act Maxwell Cabana was born in a hazy ‘02 Corolla, in the parking lot of a fried chicken joint up in Seattle. It was there that his journey began, and with a basket of chicken and a mix CD of classic Soul tunes, he got to work. Nowadays, his psychedelic R&B quartet includes Seattle native Murray McCulloch on guitar and vocals, Portland's own Noah Puggarana on keys, and Chicago-born brothers Sean and Jamie Higgins on bass and drums, respectively.

Years in the making, Maxwell’s band began to take shape in early 2016, when Murray McCulloch approached Sean and Jamie Higgins with the concept for the project. The three had played together before in the funky hip-hop outfit Sack Lunch, along with frequent collaborator HB, so they knew the chemistry was already there. Soon afterwards, the three began writing and rehearsing their first material in a damp North Portland basement.

After releasing their first project, a self-titled EP recorded at Cloud City Sound Studios in Portland, the band met keyboardist Noah Puggarana, whose soulful melodies and heady harmonies seemed to effortlessly fill in spaces that the music didn’t even know it had. Since then, Maxwell Cabana and his band have been playing shows around the Portland area, studying, jamming, and refining their sound.

In the summer of 2017, the group linked with producer Joey Cox to record Nothing Changed, their forthcoming release. The EP combines the group's love for old-school R&B and Soul with an affinity for Jazz, Hip-Hop and psychedelia. Maxwell Cabana’s influences range from Tim Maia to Madlib, Donny Hathaway to Gorillaz, and beyond.

TheSirensSound: Pretend for Another by Old Smile

The glue that binds it all together is Herman himself, who wrote, recorded, mixed, and played every instrument in this collection. Who is Tom Herman Jr. anyway? A mythic hermit from south Jersey? A well-honed bedroom savant? Listen and let us know. He’s still trying to find out. 

TheSirensSound: New album Rise by Echoes Across The Sky

"Echoes Across The Sky" was formed in late 2011 as an alternate project for musician Craig Siegelbaum. Craig is known mostly for his guitar work in the rock bands Supergenius and Half Circle Drive, as well as for his solo instrumental guitar project. With "Echoes Across The Sky," Craig broadens his musical landscape; in fact, there is very little guitar used in the music. It is more of a piano- and string-based sound with thunderous drums and bass. The debut EP was released in October 2012 and was very well received, with downloads from all over the world. Echoes Across The Sky is just starting to take off. The debut full-length album "The Growing Distance" was released on 5.5.15 and is available at all digital retailers. The sophomore album "Rise" was released 1.1.18 and is also available at all digital retailers and streaming services.

CreativeApplications.Net: Synthetic Pollenizer – Robotic interventions in the real-world ecological systems

Created by Michael Candy, "Synthetic Pollenizer" is a conceptual intervention in real-world ecological systems using artificial flowers. Inspired by natural pollenizers, these robotic replicas artificially pollinate bees, integrating into the reproductive cycle of local flora; an initiative into a cybernetic ecology.

OCaml Planet: 2017 at OCamlPro

With 2017 just over, now is probably the best time to review what happened during this hectic year at OCamlPro… Here are our big 2017 achievements in the world of OCaml (with OPAM 2, flambda 2, etc.), of blockchains (Tezos and the Tezos ICO, Liquidity, etc.), and of formal methods (Alt-Ergo, etc.)

In the World of OCaml

Towards OPAM 2.0

OPAM was born at Inria/OCamlPro with Frederic, Thomas and Louis, and is still maintained here at OCamlPro. Now, thanks to Louis Gesbert’s thorough efforts and the OCaml Labs contribution, OPAM 2.0 is coming!

  • opam is now compiled with a built-in solver, improving the portability, ease of access and user experience (`aspcud` no longer a requirement)
  • new workflows for developers have been designed, including convenient ways to test and install local sources, more reliable ways to share development setups
  • the general system has seen a large number of robustness and expressivity improvements, like extended dependencies
  • it also provides better caching, and many hooks enabling, among others, setups with sandboxed builds, binary artifacts caching, or end-to-end package signature verification.

More details: on and releases on

This work is allowed thanks to JaneStreet’s funding.

Flambda Compilation

* Work of Pierre Chambart, Vincent Laviron

Pierre and Vincent’s considerable work on Flambda 2 (the optimizing intermediate representation of the OCaml compiler – on which inlining occurs), in close cooperation with JaneStreet’s team (Mark, Leo and Xavier) aims at overcoming some of flambda’s limitations. This huge refactoring will help make OCaml code more maintainable, improving its theoretical grounds. Internal types are clearer, more concise, and possible control flow transformations are more flexible. Overall a precious outcome for industrial users.

This work is allowed thanks to JaneStreet’s funding.

OCaml for ia64-HPUX

In 2017, OCamlPro also worked on porting OCaml to HPUX-ia64. This came from a request by CryptoSense, a French startup working on an OCaml tool to secure cryptographic protocols. OCaml had a port on Linux-ia64 that was deprecated before 4.00.0 and, even earlier, a port on HPUX, but not on ia64. So, we expected the easiest part would be to get the bytecode version running, and the hardest part to get access to an HPUX-ia64 computer: it was quite the opposite. HPUX is an awkward system where most tools (shell, make, etc.) have uncommon behaviors, which made even compiling a bytecode version difficult. On the contrary, it was actually easy to get access to a low-power virtual machine running HPUX-ia64 on a monthly basis. Also, we found a few bugs in the former OCaml ia64 backend, mostly caused by the scheduler, since ia64 uses explicit instruction parallelism. Debugging such code was quite a challenge, as instructions were often re-ordered and interleaved. Finally, after a few weeks of work, we got both the bytecode and native code versions running, with only a few limitations.

This work was mandated by CryptoSense.

The style-checker Typerex-lint

* Work of Çagdas Bozman, Michael Laporte and Clément Dluzniewski.

In 2017, typerex-lint was improved and extended. Typerex-lint is a style checker that analyzes the sources of OCaml programs and can be extended using plugins. It makes it possible to automatically check the conformance of a code base to a set of coding rules. We added some analyses to look for code that doesn’t comply with the recommendations made by the SecurOCaml project members. We also built an interactive web output that provides an easy way to navigate typerex-lint results.

Build systems and tools

* Work of Fabrice Le Fessant

Every year in the OCaml world, a new build tool appears. 2017 was no different, with the rise of jbuild/dune. jbuild came with some very nice features, some of which were already in our home-made build tool, ocp-build, like the ability to build multiple packages at once in a composable way; others were new, like the ability to build multiple versions of a package in one run or the wrapping of libraries using module aliases. We have started to incorporate some of these features into ocp-build. Nevertheless, from our point of view, the two tools belong to two different families: jbuild/dune belongs to the “implicit” family, like ocamlbuild and oasis, with minimal project description; ocp-build belongs to the “explicit” family, like make and omake. We prefer the explicit family, because the build file acts as a description of the project, an entry point to understand the project and its modules. Also, we have kept working on improving the project description language for ocp-build, something that we think is of the utmost importance. Latest release: ocp-build 1.99.20-beta.

Other contributions and softwares

In the World of Blockchains

Tezos and the Tezos ICO

* Work of Grégoire Henry, Benjamin Canou, Çagdas Bozman, Alain Mebsout, Michael Laporte, Mohamed Iguernlala, Guillem Rieu, Vincent Bernardoff (for DLS) and at times all the OCamlPro team in animated and joyful brainstorms.

Since 2014, the OCamlPro team had been co-designing the Tezos prototype with Arthur Breitman based on Arthur’s White Paper, and had undertaken the implementation of the Tezos node and client. A technical prowess and design achievement we have been proud of. In 2017, we developed the infrastructure for the Tezos ICO (Initial Coin Offering) from the ground up, encompassing the web app (back-end and front-end), the Ethereum and Bitcoin (p2sh) multi-signature contracts, as well as the hardware Ledger based process for transferring funds. The ICO, conducted in collaboration with Arthur, was a resounding success — the equivalent of 230 million dollars (in ETH and BTC) at the time were raised for the Tezos Foundation!

This work was allowed thanks to Arthur Breitman and DLS’s funding.

The Liquidity Language for smart contracts

* Work of Alain Mebsout, Fabrice Le Fessant, Çagdas Bozman, Michaël Laporte


OCamlPro develops Liquidity, a high level smart contract language for Tezos. Liquidity is a human-readable language, purely functional, statically-typed, whose syntax is very close to the OCaml syntax. Programs can be compiled to the stack-based language (Michelson) of the Tezos blockchain.

To garner interest and adoption, we developed an online editor called “Try Liquidity”. Smart-contract developers can design contracts interactively, directly in the browser, compile them to Michelson, run them and deploy them on the alphanet network of Tezos.

Future plans include a full-fledged web-based IDE for Liquidity. Worth mentioning is a neat feature of Liquidity: decompiling a Michelson program back to its Liquidity version, whether or not it was generated from Liquidity code. In practice, this makes it easy to read somewhat obfuscated contracts already deployed on the blockchain.

In the World of Formal Methods


* By Mohamed Iguernlala


For Alt-Ergo, 2017 was the year of floating-point arithmetic reasoning. Indeed, in addition to the publication of our results at the 29th International Conference on Computer Aided Verification (CAV), July 2017, we polished the prototype we started in 2016 and integrated it into the main branch. This is joint work with Sylvain Conchon (Paris-Saclay University) and Guillaume Melquiond (Inria) in the context of the SOPRANO ANR project. Another big piece of work in 2017 consisted in investigating a better integration of an efficient CDCL-based SAT solver into Alt-Ergo. In fact, although modern CDCL SAT solvers are very fast, their interaction with the decision procedures and quantifier instantiation must be finely tuned to get good results in the context of Satisfiability Modulo Theories. This new solver should be integrated into Alt-Ergo in the next few weeks. This work has been done in the context of the LCHIP FUI project.

We also released a new major version of Alt-Ergo (2.0.0) with a modification in the licensing scheme. Alt-Ergo@OCamlPro’s development repository is now made public. This will allow users to get updates and bugfixes as soon as possible.

Towards a formalized type system for OCaml

* Work of Pierrick Couderc, Grégoire Henry, Fabrice Le Fessant and Michel Mauny (Inria Paris)

OCaml is known for its rich type system and strong type inference; unfortunately, such a complex type engine is prone to errors, and it can be hard to come up with a clear idea of how typing works for some features of the language. For 3 years now, OCamlPro has been working on formalizing a subset of this type system and implementing a type checker derived from this formalization. The idea behind this work is to help the compiler developers ensure some form of correctness of the inference. This type checker takes a Typedtree, the intermediate representation resulting from the inference, and checks its consistency. Put differently, this tool checks that each annotated node in the Typedtree can indeed be given such a type according to the context, its form and its sub-expressions. In practice, we could check and catch some known bugs resulting from unsound programs that were accepted by the compiler.

This type checker is only available for OCaml 4.02 for the moment, and the document describing this formalized type system will be available shortly in a PhD thesis, by Pierrick Couderc.

Around the World

OCamlPro’s team members attended many events throughout the world:

As a member committed to the OCaml ecosystem’s animation, we’ve organized OCaml meetups too (see the famous OUPS meetups in Paris!).

A few hints about what’s ahead for OCamlPro

Let’s keep up the good work!


New Humanist Blog: Warp-speed capitalism

The latest sci-fi imagines what society will look like if we colonise space – a universe in which might is right and there are no good guys.

Planet Lisp: Victor Anyakin: Reading a file line-by-line revisited

One of the frequent questions is how do you read a file line by line using Common Lisp?

A canonical answer, as formulated in Practical Common Lisp, Chapter 14 (Files and File I/O), is essentially the same as the one provided by the Common Lisp Cookbook (Reading a File one Line at a Time):

(let ((in (open "/some/file/name.txt" :if-does-not-exist nil)))
  (when in
    (loop for line = (read-line in nil)
        while line do (format t "~a~%" line))
    (close in)))

And basically it does the job.

But what happens if you deal with a log that has captured random bytes from a crashing application? Let’s simulate this scenario by reading from /dev/urandom. SBCL will give us the following result:

debugger invoked on a SB-INT:STREAM-DECODING-ERROR in thread
#:  :UTF-8 stream decoding error on
#:   the octet sequence #(199 231) cannot be decoded.

Type HELP for debugger help, or (SB-EXT:EXIT) to exit from SBCL.

restarts (invokable by number or by possibly-abbreviated name):
  0: [ATTEMPT-RESYNC   ] Attempt to resync the stream at a character boundary
                         and continue.
  1: [FORCE-END-OF-FILE] Force an end of file.
  2: [INPUT-REPLACEMENT] Use string as replacement input, attempt to resync at
                         a character boundary and continue.
  3: [ABORT            ] Exit debugger, returning to top level.


The same will be reported on other Lisp implementations. However, dealing with this problem is not really portable, and requires platform-specific switches and boilerplate code.

For example, on SBCL it is possible to specify a replacement character in the external-format specification:

(with-open-file (in "/dev/urandom"
                    :if-does-not-exist nil
                    :external-format '(:utf-8 :replacement "?"))
  ;; read lines as usual
  )

Other Lisps require a different and incompatible external format specification.
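As a cross-language aside (not part of the original post): the replacement policy that SBCL spells :replacement is spelled errors="replace" in, for example, Python, which also makes it easy to reproduce the failing octets #(199 231) from the backtrace above:

```python
import os
import tempfile

# Write the two octets (199 231 = #xC7 #xE7) that broke UTF-8 decoding above.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"hello \xc7\xe7 world\n")

# A replacement policy turns undecodable octets into U+FFFD instead of erroring.
with open(path, encoding="utf-8", errors="replace") as f:
    line = f.readline()
os.remove(path)

print(line)  # the bad octets show up as replacement characters
```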

But there are actually other ways to read a file line-by line. cl-faster-input looks into some of them. Namely:

  • A standard read-line.
  • read-line-into-sequence suggested by Pascal Bourguignon in a cll discussion. Unlike the standard read-line this function reads lines into a pre-allocated buffer, reducing workload on the garbage collector.
  • read-ascii-line that is the part of the COM.INFORMATIMAGO.COMMON-LISP.CESARUM library.
  • ub-read-line-string from the ASCII-STRINGS package that is a part of the CL-STRING-MATCH library

Please check the src/benchmark-read-line.lisp in the sources repository.

Benchmarks show that the ub-read-line-string outperforms the standard read-line approach, does not require platform-specific switches, and allows trivial character substitution on the fly (like up/down casing the text, replacing control characters etc.)

Sample usage (from the sources):

(with-open-file (is +fname+ :direction :input :element-type 'ascii:ub-char)
    (loop with reader = (ascii:make-ub-line-reader :stream is)
       for line = (ascii:ub-read-line-string reader)
       while line
       count line))

On developer’s desktop it takes 1.71 seconds to complete the benchmark with the standard read-line, and 1.076 seconds with the ub-read-line-string benchmark. Memory consumption is on the same level as the standard read-line, though significantly higher than the read-line-into-sequence.

On Clozure CL 1.9 the read-ascii-line benchmark fails. The ub-read-line-string falls into an infinite loop.

On Embeddable CL 16.0 all functions work, but the ub-read-line-string takes almost 10 times more time to complete than any of the alternatives.

Conclusion: it might be reasonable to look at different approaches to reading files line-by-line if you plan to deal with large volumes of text data that may contain malformed characters. Check the sources of cl-faster-input for different ideas, and tweak and run the benchmarks as suits your tasks.

P.S. This post was written in September 2015 but never published. As it appeared to be pretty complete, I decided to post it now, in January 2018. Stay tuned…

Comic for 2018.01.15

New Cyanide and Happiness Comic

Ideas from CBC Radio (Highlights): Decoding pre-historic art with Jean Clottes

Neil Sandell introduces us to the French archaeologist Jean Clottes, a man who's devoted his lifetime trying to decipher the rich, enigmatic world of cave art.

Trivium: 14jan2018

Comic for 2018.01.14

New Cyanide and Happiness Comic

Comic for 2018.01.13

New Cyanide and Happiness Comic

OCaml Planet: my 2018 contains robur and starts with re-engineering DNS


At the end of 2017, I resigned from my PostDoc position at University of Cambridge (in the rems project). Early December 2017 I organised the 4th MirageOS hack retreat, with which I'm very satisfied. In March 2018 the 5th retreat will happen (please sign up!).

In 2018 I moved to Berlin and started to work for the (non-profit) Centre for the cultivation of technology with our project "At robur, we build performant bespoke minimal operating systems for high-assurance services". robur is only possible thanks to generous donations in autumn 2017, enthusiastic collaborators, supportive friends, and a motivated community; thanks to all. We will receive funding from the prototypefund to work on a CalDAV server implementation in OCaml targeting MirageOS. We're still looking for donations and further funding, please get in touch. Apart from CalDAV, I want to start the year by finishing several projects which I discovered on my hard drive. This includes DNS, opam signing, TCP, ... . My personal goal for 2018 is to develop a flexible mirage deploy, because after configuring and building a unikernel, I want to get it smoothly up and running (spoiler: I already use albatross in production).

To kick off (3% of 2018 is already used) this year, I'll talk in more detail about µDNS, an opinionated from-scratch re-engineered DNS library, which I've been using since Christmas 2017 in production for and The development started in March 2017, and continued over several evenings and long weekends. My initial motivation was to implement a recursive resolver to run on my laptop. I had a working prototype in use on my laptop for 4 months in the summer of 2017, but that code was not in good shape, so I went down the rabbit hole and (re)wrote a server (and learned more about GADTs). A configurable resolver usually needs a server as a local overlay anyway. Furthermore, dynamic updates are standardised and thus a configuration interface exists inside the protocol, even with hmac-signatures for authentication! Coincidentally, I started to solve another issue, namely automated management of let's encrypt certificates (see this branch for an initial hack). On my journey, I also reported a cache poisoning vulnerability, which was fixed in Docker for Windows.

But let's get started with some content. Please keep in mind that while the code is publicly available, it is not yet released (mainly because test coverage is not yet high enough and documentation is lacking). I appreciate early adopters: please let me know if you find any issues or have a use case which is not straightforward to solve. This won't be the last article about DNS this year - persistent storage, the resolver, and Let's Encrypt support are still missing.

What is DNS?

The domain name system is a core Internet protocol, which translates domain names to IP addresses. A domain name is easier for human beings to memorise than an IP address. DNS is hierarchical and decentralised. It was initially "specified" in Nov 1987 in RFC 1034 and RFC 1035. Nowadays it spans more than 20 technical RFCs, 10 security-related ones, 5 best current practices, and another 10 informational documents. The basic encoding and mechanisms did not change.

On the Internet, there is a set of root servers (administered by IANA) which provide the information about which name servers are authoritative for which top-level domain (such as ".com"). Those in turn provide the information about which name servers are responsible for which second-level domain name, and so on. There are at least two name servers for each domain name, in separate networks, so that if one is unavailable the other can still be reached.

The building blocks of DNS are: the resolver, either a stub (gethostbyname provided by your C library) or a caching forwarding resolver (at your ISP) which sends DNS packets to another resolver, or a recursive resolver which, once seeded with the root servers, finds out the IP address of a requested domain name. The other building block is the authoritative server, which replies to requests for its configured domains.

To establish some terminology: a DNS client sends a query, consisting of a domain name and a query type, and expects a set of answers, which are called resource records and contain a name, a time to live, a type, and data. The resolver iteratively requests resource records from authoritative servers until the requested domain name is resolved or fails (name does not exist, server failure, server offline).

DNS usually uses UDP as transport, which is not reliable and is limited to 512-byte payloads on the Internet (due to various middleboxes). DNS can also be transported via TCP, and even via TLS over UDP or TCP. If a DNS packet transferred via UDP is larger than 512 bytes, it is cut at the 512-byte mark, and the truncation bit in its header is set. The receiver can decide whether to use the 512 bytes of information, or to throw them away and attempt a TCP connection.
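The truncation handling above can be sketched in a few lines of OCaml. This is a minimal sketch, not the µDNS API (the function name is mine): the truncation bit lives in the flags word, the second 16-bit big-endian field of the packet.

```ocaml
(* Check the TC (truncation) bit of a raw DNS packet received over UDP.
   The flags are the second 16-bit big-endian word; TC is bit 0x0200. *)
let is_truncated (buf : bytes) : bool =
  if Bytes.length buf < 4 then invalid_arg "short DNS packet";
  let flags = Bytes.get_uint16_be buf 2 in
  flags land 0x0200 <> 0
```

A client seeing `is_truncated` return `true` would either use the partial 512 bytes or retry the query over TCP.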

DNS packet

The packet encoding starts with a 16-bit identifier, followed by a 16-bit header (containing operation, flags, and status code), and four counters, each 16 bits, specifying the number of resource records in the body: questions, answers, authority records, and additional records. The header starts with a one-bit operation (query or response) and a four-bit opcode, followed by various flags (recursion, authoritative, truncation, ...); the last four bits encode the response code.
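As a hedged illustration of this layout (field and function names are mine, not the library's), the header can be decoded with plain bit operations:

```ocaml
(* Decode the fixed 12-byte DNS header: id (16 bit), flags (16 bit: QR,
   opcode, AA, TC, RD, RA, rcode), then four 16-bit counts. *)
type header = {
  id : int;
  response : bool;  (* QR bit: false = query, true = response *)
  opcode : int;     (* 4 bits *)
  rcode : int;      (* 4 bits *)
  qdcount : int; ancount : int; nscount : int; arcount : int;
}

let decode_header (buf : bytes) : header =
  if Bytes.length buf < 12 then invalid_arg "DNS header needs 12 bytes";
  let u16 off = Bytes.get_uint16_be buf off in
  let flags = u16 2 in
  { id = u16 0;
    response = flags land 0x8000 <> 0;
    opcode = (flags lsr 11) land 0xF;
    rcode = flags land 0xF;
    qdcount = u16 4; ancount = u16 6; nscount = u16 8; arcount = u16 10 }
```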

A question consists of a domain name, a query type, and a query class. A resource record additionally contains a 32-bit time to live, a length, and the data.

Each domain name is a case-sensitive string of up to 255 bytes, separated by '.' into labels of up to 63 bytes each. A label is encoded either by its length followed by the content, or by an offset to the start of a label earlier in the current DNS frame (poor man's compression). Care must be taken during decoding to avoid cycles in offsets. Common operations on domain names - equality, ordering, and testing whether one domain name is a subdomain of another - should be efficient. My initial, naïve representation was a list of strings; now it is an array of strings in reverse order. This speeds up common operations by a factor of 5 (see the benchmarks in test/).
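A small sketch of this reversed-array representation (the real Dns_name module differs in its API): storing "foo.example.com" as `[| "com"; "example"; "foo" |]` turns the subdomain test into a simple prefix check from the root side.

```ocaml
(* Split a name on '.' and reverse, so the root-most label comes first. *)
let of_string s =
  Array.of_list (List.rev (String.split_on_char '.' s))

(* In this sketch a name also counts as a subdomain of itself. *)
let is_subdomain ~subdomain ~domain =
  let n = Array.length domain in
  Array.length subdomain >= n
  && (let rec eq i =
        i >= n || (String.equal subdomain.(i) domain.(i) && eq (i + 1))
      in
      eq 0)
```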

The only class in real use is IN (for Internet), as mentioned in RFC 6895. Various query types (MD, MF, MB, MG, MR, NULL, AFSDB, ...) are barely or never used. There is no need to convolute the implementation and its API with these legacy options (if you have a use case and see those in the wild, please tell me).

My packet decoder performs decompression, only allows valid Internet domain names, and may return a partial parse, in order to use as many resource records of a truncated packet as possible. No exceptions are raised; the parser uses monadic-style error handling. Since label decompression requires the parser to know absolute offsets, the original buffer and the offset are passed around manually at all times, instead of using smaller views on the buffer. The decoder does not allow gaps: if the outer resource data length specifies a byte length which is not completely consumed by the specific resource data subparser, decoding fails (an A record must always consume exactly four bytes). Failing to check this can provide a way to exfiltrate data without getting noticed.

Each zone (a served domain name) contains a SOA "start of authority" entry, which includes the primary nameserver name, the hostmaster's email address (both encoded as domain names), a serial number of the zone, and a refresh, retry, expiry, and minimum interval (all encoded as 32-bit unsigned numbers of seconds). Common resource records include A, whose payload is a 32-bit IPv4 address. A nameserver (NS) record carries a domain name as payload. A mail exchange (MX) record carries a 16-bit priority and a domain name. A CNAME record is an alias to another domain name. These days, there are even certificate authority authorisation (CAA) records containing a flag (critical), a tag ("issue"), and a value.
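For illustration, the SOA payload described above could be modelled as an OCaml record like the following. The field names and the example values are mine, not µDNS's actual type:

```ocaml
(* Shape of a SOA payload; all intervals are 32-bit counts of seconds. *)
type soa = {
  nameserver : string;  (* primary nameserver, encoded as a domain name *)
  hostmaster : string;  (* hostmaster's email, also encoded as a domain name *)
  serial : int32;
  refresh : int32;
  retry : int32;
  expiry : int32;
  minimum : int32;
}

(* A hypothetical zone's SOA entry. *)
let example_soa = {
  nameserver = "ns1.example.com";
  hostmaster = "hostmaster.example.com";
  serial = 2018011501l;
  refresh = 86400l; retry = 7200l; expiry = 1048576l; minimum = 3600l;
}
```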


Server

The operation of a DNS server is to listen for requests and serve replies. The data to be served can be canonically encoded in a zone file (the RFC describes the format). Apart from insecurity in DNS server implementations, another attack vector is amplification: an attacker crafts a small UDP frame with a fake source IP address, and the server answers with a large response to that address, which may lead to a DoS attack. Various mitigations exist, including rate limiting and serving large replies only via TCP.

Internally, the zone data is stored in a tree (module Dns_trie implementation), where each node contains two maps: sub, whose keys are labels and whose values are subtrees, and dns_map (module Dns_map), whose keys are resource record types and whose values are the resource records. Both use the OCaml Map ("also known as finite maps or dictionaries, given a total ordering function over the keys. All operations over maps are purely applicative (no side-effects). The implementation uses balanced binary trees, and therefore searching and insertion take time logarithmic in the size of the map").
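A toy version of that tree shape (assuming nothing about the real Dns_trie API): each node holds an optional payload, standing in for the Dns_map, and a string-keyed Map of subtrees.

```ocaml
module SM = Map.Make (String)

(* Each node: an optional payload plus the [sub] map from label to subtree. *)
type 'a trie = Node of 'a option * 'a trie SM.t

let empty = Node (None, SM.empty)

(* Look up a name given as labels in root-first order,
   e.g. ["com"; "example"; "www"]. *)
let rec lookup labels (Node (value, sub)) =
  match labels with
  | [] -> value
  | l :: rest ->
    (match SM.find_opt l sub with
     | None -> None
     | Some t -> lookup rest t)

let rec insert labels v (Node (value, sub)) =
  match labels with
  | [] -> Node (Some v, sub)
  | l :: rest ->
    let t = Option.value (SM.find_opt l sub) ~default:empty in
    Node (value, SM.add l (insert rest v t) sub)
```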

The server looks up the queried name, and then the queried type in the returned Dns_map. The found resource records are sent as the answer, which also includes the question, authority information (the NS records of the zone), and additional glue records (IP addresses of names mentioned earlier that belong to the same zone).


Dns_map

Dns_map is the data structure that has resource record types as keys and collections of matching resource records as values. In OCaml, a map's value type must be homogeneous; using a normal sum type leads to an unnecessary unpacking step (or lost type information):

let lookup_ns t =
  match Map.find_opt NS t with
  | None -> Error `NotFound
  | Some (NS nameservers) -> Ok nameservers
  | Some _ -> Error `NotFound

Instead, in my current rewrite I use generalized algebraic data types (read the OCaml manual, Mads Hartmann's blog post about use cases for GADTs, and Andreas Garnæs' post about using GADTs for GraphQL type modifiers) to preserve the relation between key and value (an A record has a list of IPv4 addresses and a TTL as its value). This is similar to hmap, but different: a closed key-value mapping (the GADT), no int per key, and no mutable state. Thanks to Justus Matthiesen for helping me with GADTs and this code. Look into the interface and implementation.

(* an ordering relation, I dislike using int for that *)
module Order = struct
  type (_,_) t =
    | Lt : ('a, 'b) t
    | Eq : ('a, 'a) t
    | Gt : ('a, 'b) t
end

module Key = struct
  (* The key and its value type *)
  type _ t =
    | Soa : (int32 * Dns_packet.soa) t
    | A : (int32 * Ipaddr.V4.t list) t
    | Ns : (int32 * Dns_name.DomSet.t) t
    | Cname : (int32 * Dns_name.t) t

  (* we need a total order on our keys *)
  let compare : type a b. a t -> b t -> (a, b) Order.t = fun t t' ->
    let open Order in
    match t, t' with
    | Cname, Cname -> Eq | Cname, _ -> Lt | _, Cname -> Gt
    | Ns, Ns -> Eq | Ns, _ -> Lt | _, Ns -> Gt
    | Soa, Soa -> Eq | Soa, _ -> Lt | _, Soa -> Gt
    | A, A -> Eq
end

type 'a key = 'a Key.t

(* our OCaml Map with an encapsulated constructor as key *)
type k = K : 'a key -> k
module M = Map.Make(struct
    type t = k
    (* the price I pay for not using int as three-state value *)
    let compare (K a) (K b) = match Key.compare a b with
      | Order.Lt -> -1
      | Order.Eq -> 0
      | Order.Gt -> 1
  end)

(* v contains a key and value pair, wrapped by a single constructor *)
type v = V : 'a key * 'a -> v

(* t is the main type of a Dns_map, used by clients *)
type t = v M.t

(* retrieve a typed value out of the store *)
let get : type a. a Key.t -> t -> a = fun k t ->
  match M.find (K k) t with
  | V (k', v) ->
    (* this comparison is superfluous, just for the types *)
    match Key.compare k k' with
    | Order.Eq -> v
    | _ -> assert false

This helps me to programmatically retrieve tightly typed values from the cache, which is important when code depends on concrete values (i.e. when there are domain names, look these up as well and add them as additional records). Look into server/
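To show the pattern end to end, here is a self-contained miniature of the Dns_map idea with plain value types, so it runs without the µDNS dependencies (the Key constructors and value types are simplified stand-ins, not the real ones):

```ocaml
module Key = struct
  (* each key fixes the type of its value *)
  type _ t =
    | A : string list t  (* addresses, as strings for the sketch *)
    | Cname : string t   (* alias target *)

  type (_, _) order =
    | Lt : ('a, 'b) order
    | Eq : ('a, 'a) order
    | Gt : ('a, 'b) order

  let compare : type a b. a t -> b t -> (a, b) order = fun t t' ->
    match t, t' with
    | Cname, Cname -> Eq
    | Cname, _ -> Lt
    | _, Cname -> Gt
    | A, A -> Eq
end

(* existentially wrapped key, so it fits an ordinary Map *)
type k = K : 'a Key.t -> k

module M = Map.Make (struct
    type t = k
    let compare (K a) (K b) =
      match Key.compare a b with
      | Key.Lt -> -1
      | Key.Eq -> 0
      | Key.Gt -> 1
  end)

(* a key paired with a value of exactly the key's value type *)
type v = V : 'a Key.t * 'a -> v
type t = v M.t

let add key value t = M.add (K key) (V (key, value)) t

(* [M.find] raises [Not_found] for absent keys, as in plain Map. *)
let get : type a. a Key.t -> t -> a = fun key t ->
  match M.find (K key) t with
  | V (key', value) ->
    (match Key.compare key key' with
     | Key.Eq -> value
     | _ -> assert false)
```

Because the key's type index fixes the result type, `get Key.A` returns a `string list` and `get Key.Cname` a `string`, with no catch-all unpacking at the call site.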

Dynamic updates, notifications, and authentication

Dynamic updates specify in-protocol record updates (supported for example by nsupdate from ISC bind-tools); notifications are used by primary servers to notify secondary servers about updates, which then initiate a zone transfer to retrieve the up-to-date data. Shared HMAC secrets are used to ensure that the transaction (update, zone transfer) was authorised. These are all protocol extensions; there is no need for out-of-protocol solutions.

The server logic for update and zone transfer frames is slightly more complex, and includes a dependency upon an authenticator (implemented using the nocrypto library, and ptime).

Deployment and Let's Encrypt

To deploy servers without much persistent data, an authentication schema is hardcoded in the dns-server: shared secrets are also stored as DNS entries (DNSKEY), and special names (such as _transfer) are introduced to encode the permissions. A _transfer key also needs to encode the IP address of the primary (to know where to request zone transfers) and of the secondary (to know where to send notifications).

Please have a look at the examples for more details. The shared secrets are provided as boot parameters of the unikernel.

I hacked maker's ocaml-letsencrypt library to use µDNS and send update frames to a given IP address. I have already used this to have Let's Encrypt issue various certificates for my domains.

There is no persistent storage of updates yet, but this can be realised by implementing a secondary (which is notified on update) that writes every new zone to persistent storage (e.g. disk or git). I also plan to have an automated Let's Encrypt certificate unikernel which listens for certificate signing requests and stores signed certificates in DNS. Luckily the year has only started and there's plenty of time left.

I'm interested in feedback, either via twitter or an issue on the data repository.

CreativeApplications.Net: CarbonScape – Pollution soundscape by h0nh1m

Created by h0nh1m (Chris Cheung) CarbonScape is a kinetic data soundscape installation consisting of 18 tracks of granular synthesized sound samples. They are collected from the sound sources where carbon footprints are left – sound from the jet engine, steam from the factory or horn of the ship – all composed into a single soundscape.

bit-player: Flipping Wyoming

state border signs for the dozen states from MA to CA on I-80 (there is no "Welcome to Nebraska" sign, so I made do with "Welcome to Omaha")

Last week I spent five days in the driver’s seat, crossing the country from east to west, mostly on Interstate 80. I’ve made the trip before, though never on this route. In particular, the 900-mile stretch from Lincoln, Nebraska, across the southern tier of Wyoming, and down to Salt Lake City was new to me.

Driving is a task that engages only a part of one’s neural network, so the rest of the mind is free to wander. On this occasion my thoughts took a political turn. After all, I was boring through the bright red heart of America. Especially in Wyoming.

Based on the party affiliations of registered voters, Wyoming is far and away the most Republican state in the union, with the party claiming the allegiance of two-thirds of the electorate. The Democrats have 18 percent. A 2013 Gallup poll identified Wyoming as the most “conservative” state, with just over half those surveyed preferring that label to “moderate” or “liberal.”

The other singular distinction of Wyoming is that it has the smallest population of all the states, estimated at 579,000. The entire state has fewer people than many U.S. cities, including Albuquerque, Milwaukee, and Baltimore. The population density is a little under six people per square mile.

I looked up these numbers while staying the night in Laramie, the state’s college town, and I was mulling them over as I continued west the next morning, climbing through miles of rolling grassland and sagebrush with scarcely any sign of human habitation. A mischievous thought came upon me. What would it take to flip Wyoming? If we could somehow induce 125,000 liberal voters to take up legal residence here, the state would change sides. We’d have two more Democrats in the Senate, and one more in the House. Berkeley, California, my destination on this road trip, has a population of about 120,000. Maybe we could persuade everyone in Berkeley to give up Chez Panisse and Moe’s Books, and build a new People’s Republic somewhere on Wyoming’s Medicine Bow River.

Let me quickly interject: This is a daydream, or maybe a nightmare, and not a serious proposal. Colonizing Wyoming for political purposes would not be a happy experience for either the immigrants or the natives. The scheme belongs in the same category as a plan announced by a former Mormon bishop to build a new city of a million people in Vermont. (Vermont has a population of about 624,000, the second smallest among U.S. states.)

Rather than trying to flip Wyoming, maybe one should try to fix it. Why is it the least populated state, and the most Republican? Why is so much of the landscape vacant? Why aren’t entrepreneurs with dreams of cryptocurrency fortunes flocking to Cheyenne or Casper with their plans for startup companies?

The experience of driving through the state on I-80 suggests some answers to these questions. I found myself wondering how even the existing population of a few hundred thousand manages to sustain itself. Wikipedia says there’s some agriculture in the state (beef, hay, sugar beets), but I saw little evidence of it. There’s tourism, but that’s mostly in the northwest corner, focused on Yellowstone and Grand Teton national parks and the cowboy-chic enclave of Jackson Hole. The only conspicuous economic activity along the I-80 corridor is connected with the mining and energy industries. My very first experience of Wyoming was olfactory: Coming downhill from Pine Bluffs, Nebraska, I caught a whiff of the Frontier oil refinery in Cheyenne; as I got closer to town, I watched the sun set behind a low-hanging purple haze that might also be refinery-related. The next day, halfway across the state, the Sinclair refinery announced itself in a similar way.

Sinclair refinery in Sinclair, Wyoming

Still farther west, coal takes over where oil leaves off. The Jim Bridger power plant, whose stacks and cooling-tower plumes are visible from the highway, burns locally mined coal and exports the electricity.

Jim Bridger power plant 5582

As the author of a book celebrating industrial artifacts, I’m hardly the one to gripe about the presence of such infrastructure. On the other hand, oil and coal are not much of a foundation for a modern economy. Even with all the wells, the pipelines, the refineries, the mines, and the power plants, Wyoming employment in the “extractive” sector is only about 24,000 (or 7 percent of the state’s workforce), down sharply from a peak of 39,000 in 2008. If this is the industry that will build the state’s future, then the future looks bleak.

Economists going all the way back to Adam Smith have puzzled over the question: Why do some places prosper while others languish? Why, for example, are Denver and Boulder so much livelier than Cheyenne and Laramie? The Colorado cities and the Wyoming ones are only about 100 miles apart, and they share similar histories and physical environments. But Denver is booming, with a diverse and growing economy and a population approaching 700,000—greater than the entire state of Wyoming. Cheyenne remains a tenth the size of Denver, and in Cheyenne you don’t have to fight off hordes of hipsters to book a table for dinner. What makes the difference? I suspect the answer lies in a Yogi Berra phenomenon. Everybody wants to go to Denver because everyone is there already. Nobody wants to be in Cheyenne because it’s so lonely. If this guess is correct, maybe we’d be doing Wyoming a favor by bringing in that invasion of 125,000 sandal-and-hoodie–clad bicoastals.

One more Wyoming story. At the midpoint of my journey across the state, near milepost 205 on I-80, I passed the sign shown at left, marking the continental divide at elevation 7,000 feet. I am an aficionado of continental divide crossings, and so I took particular note. Then, 50 miles farther along, I passed another sign, shown at right, marking a crossing at elevation 6,930 feet. On seeing this second crossing, I put myself on high alert for a third such sign. This is a matter of simple topology, or so I thought. If a line—perhaps a very wiggly one—divides an area into two regions, then if you start in one region and end up in the other, you must have crossed the line an odd number of times. Shown below are three possible ways of crossing a wiggly continental divide. In each case the red line is the path of the continental divide, and the dashed blue line is the road’s trajectory across it. At far left the situation is simple: The road intersects the divide in a single point. The middle diagram shows three crossings; it’s easy to see how further elaboration of the meandering path could yield five or seven or any odd number of crossings. An arrangement that might seem to generate just two crossings is shown at right. One of the “crossings” is not a crossing at all but a point of tangency. Depending on your taste in such matters, the tangent intersection could be counted as crossing the divide twice or not at all; in either case, the total number of crossings remains odd.

In the remainder of my trip I never saw a sign marking a third crossing of the divide. The explanation has nothing to do with points of tangency. I should have known that, because I’ve actually written about this peculiarity of Wyoming topography before. Can you guess what’s happening? Wikipedia tells all.

Daniel Lemire's blog: Science and Technology links (January 12th, 2018)

  1. A few years ago, researchers in Denmark expressed concerns regarding high concentrations of pesticides showing up in urine samples of Danish mothers and children. Last time I was in Denmark, a newspaper was reporting that there are surprising amounts of pesticides in the products sold in Denmark. A recent research article found that

    the adverse health effects of chronic pesticide residue exposure in the Danish population are very unlikely. The Hazard Index for pesticides for a Danish adult was on level with that of alcohol for a person consuming the equivalent of 1 glass of wine every seventh year.

  2. For the first time in American history, healthcare is the largest source of jobs, ahead of retail, the previous largest source.
  3. Farmers use laser beams to stop eagles from attacking their animals.
  4. Last year, we saw many new therapies based on gene editing. The most exciting gene editing technique right now is CRISPR-Cas9. A paper just reported that most of us are probably immune to CRISPR-Cas9, which means that we can’t receive an in vivo gene therapy based on CRISPR-Cas9 without concern that our immune system will fight it. This suggests that new techniques are needed.
  5. Gary Marcus caused quite a debate online by posting a paper entitled Deep Learning: A Critical Appraisal. I believe that Marcus’s main point is that deep learning is maybe not the golden path toward human-like intelligence. I’m not exactly sure why this should be so controversial, given that I have heard one of the founders of the deep-learning school, Hinton, say exactly the same thing.

    Deep Learning is the dominant paradigm in artificial intelligence right now. And it works. It is not all hype. It solves real problems people care about.

    But we don’t know how far it can go. Yet it seems that there are fairly obvious limits. For example, nobody knows how to use deep-learning alone to prove theorems. Yet we have pretty good tools (that work right now) to automatically prove theorems.

  6. Many people are concerned about how our climate could change in the near future. The Atlantic has a piece on how we can use technology to regulate the climate (something they call “geo-engineering”).

Tea Masters: Dark Oolong for a dark hour

Despite the title of this article, this isn't a Chaxi devoted to the movie about Winston Churchill. In my previous article I found inspiration in a book by Alexandre Dumas, and I could also do one about Churchill. The Last Lion: Winston Spencer Churchill, Alone by William Manchester is one of the few books I brought along when I moved to Taiwan in 1996... For a Churchillian Chaxi I would definitely use my teapot with a painted lion that dates back to a different time!

But this Chaxi is literally and simply about a tea after sunset. Chinese New Year is approaching, and the cold outside temperatures call for some red color and fire to warm us up here in Taipei. That's also why I chose a roasted Oolong (from Wuyi) for this special Chaxi, since I'm using charcoal to heat the water in my silver kettle.
A small Yixing zhuni teapot from the 1980s is perfect for this tea. 
The clarity and purity of this tea is simply amazing.
I get to use my celadon ewer when I'm using charcoal, because it's faster not to fill the kettle to the top, but to add just enough water for each brew.
Once the Nilu is up to speed with the glowing charcoal, the water comes quickly to a boil again. The trick is that the kettle should generate steam while the charcoal shouldn't smoke.
The result is that we are transported back in time to ancient China with this traditional tea and accessories.
And now I feel so warm that I'm taking off my jacket! A good roasted Oolong will make you feel warm...

Michael Geist: Insider Access: Secret Advisory Groups Damage the Credibility of Canada’s NAFTA Negotiations

The Canadian government has frequently touted its commitment to transparency and consultation with respect to its trade negotiations, citing a steady stream of open events and its receptiveness to public feedback. Indeed, since the renegotiation of NAFTA was placed back on the table, officials say they have talked to nearly 1,000 stakeholders and received more than 44,000 public submissions.

While the openness to public comment represents a notable shift in approach, my Globe and Mail op-ed reports that the government has been far less forthcoming about the creation of secret NAFTA industry advisory groups. According to documents obtained under the Access to Information Act, as of last October, members of those groups had signed 116 confidentiality and non-disclosure agreements that pave the way for access to secret information about the status of the negotiations. Those stakeholders are in addition to the dozen NAFTA Advisory Council members, most of whom have also signed the non-disclosure documents.

The industry advisory groups cover some of the negotiations’ most contentious areas, including agriculture, intellectual property, services, auto, culture, and energy. There are also groups for newer trade issues such as women’s rights, labour, and Indigenous concerns. In all, the government supports at least 12 previously undisclosed advisory groups.

The size of each advisory group varies. The government documents indicate there are at least 14 members in the services group, 12 members in the auto group, and seven in the intellectual property group. The composition of the advisory groups remains a secret, though officials acknowledge that they consist primarily of businesses and their industry associations with few independent voices and no academic experts.

Officials maintain the NDAs are needed to allow for disclosure of the state of the talks and the negotiating positions of the U.S. and Mexico delegations. While revealing Canadian positions would not be subject to confidentiality restrictions, an agreement between the three countries allows for private disclosure of the dynamics of the negotiations and specific country positions. The advisory groups are not provided with copies of the draft text, but are given sufficiently detailed information to assess the likely impact of the proposed provisions.

The willingness to disclose NAFTA negotiating details should not come as a surprise as the government undoubtedly wants to limit the possibility of unanticipated harms from the final text. Yet the entire process remains shrouded in secrecy (an official responded that the groups were not secret but no one had previously asked about them), a far cry from the promises of transparency promoted by the government. Moreover, while the number of NDA agreements and the existence of the advisory groups was revealed as part of the Access to Information request, any identifying information about which groups or individuals signed the agreements was fully redacted.

The secret two-tier approach damages the credibility of an otherwise open consultation. Encouraging Canadians to provide their views on Canada’s trade priorities makes sense given that trade has emerged as perhaps the single biggest economic issue facing the country and all Canadians have a stake in the outcome of the talks. However, the use of secret advisory groups creates an uneven playing field with some stakeholders positioned to provide better informed feedback than their competitors as well as many other interested parties and independent experts.

If the insider access approach is to continue, Global Affairs Minister Chrystia Freeland and International Trade Minister François-Philippe Champagne should move quickly to lift the veil of secrecy behind the process, openly disclosing the nature and membership of each advisory group. Moreover, the advisory groups should be expanded to include a wider diversity of voices, including badly needed independent perspectives.

The government has emphasized its willingness to engage with the U.S. on NAFTA, even if President Donald Trump chooses to start the process of walking away from the deal. Its engagement with Canadians should be similarly robust, marked by a transparent, public advisory process, a clear commitment to balanced advice, and strict limits on the creation of privileged insider access.

The post Insider Access: Secret Advisory Groups Damage the Credibility of Canada’s NAFTA Negotiations appeared first on Michael Geist.

The Shape of Code: Computer books your great grandfather might have read

I have been reading two very different computer books written for a general readership: Giant Brains or Machines that Think, published in 1949 (with a retrospective chapter added in 1961) and LET ERMA DO IT, published in 1956.

‘Giant Brains’ by Edmund Berkeley, was very popular in its day.

Berkeley marvels at a computer performing 5,000 additions per second; performing all the calculations in a week that previously required 500 human computers (i.e., people using mechanical adding machines) working 40 hours per week. His mind staggers at the “calculating circuits being developed” that can perform 100,000 additions a second; “A mechanical brain that can do 10,000 additions a second can very easily finish almost all its work at once.”

The chapter discussing the future, “Machines that think, and what they might do for men”, sees Berkeley struggling for non-mathematical applications; a common problem with all new inventions. Automatic translators and automatic stenographers (typists who transcribe dictation) are listed. There is also a chapter on social control, which is just as applicable today.

This was the first widely read book to promote Shannon‘s idea of using the algebra invented by George Boole to analyze switching circuits symbolically (THE 1940 Master’s thesis).

The ‘ERMA’ book paints a very rosy picture of the future with computer automation removing the drudgery that so many jobs require; it is so upbeat. A year later the USSR launched Sputnik and things suddenly looked a lot less rosy.

Tea Masters: La Reine Margot and a concubine tea

There is no tea in Dumas' 'La Reine Margot', but that doesn't keep it from being a captivating historical novel from the first page to the last. I wanted to create this Chaxi as a tribute to this book! Indeed, inspiration for a Chaxi, a tea ceremony, can be found everywhere. Most of the time, the season or the chosen tea offers itself as the starting point. This time, I wanted to commemorate this book on my blog, because I like to share everything that gives me pleasure. Tea is so good at extending intellectual pleasure through the pleasure of the senses (smell, taste, and sight) that the composition of a Chaxi for this book follows the same logic as the story of the novel.

At the center of the novel is Marguerite de Valois, the wife of Henri de Navarre. She is a very free and amorous woman. She has two lovers and a husband, which is why I chose a semi-wild Concubine Oolong aged a few years. Indeed, for a historical novel I thought it best to taste a tea that has had time to improve with age, to the point of becoming a classic!
For the Chabu, I chose a pink cloth, the feminine color of love. White Dehua porcelain historically corresponds to the porcelain destined for the royal court (about a century later, with the arrival of tea in Europe). (Note: there is no tea in La Reine Margot, but Dumas mentions Japanese porcelain in the novel 'La Dame de Monsoreau', the sequel to La Reine Margot. Yet Japan did not yet produce porcelain during the reign of Henri III. If Dumas made this mistake, it is because Japan was very fashionable in the mid-nineteenth century, when Dumas wrote that book.) White Dehua porcelain works very well with roasted Oolong and gives it an even more intense, warm color.
Using my Yixing zisha teapot from the Qing dynasty was the obvious choice. Its blue and white falangcai decoration notably depicts a lion, a symbol of royalty. And its zisha clay is perfect with a roasted Oolong!
The chiseled Chatuo reminds me of the armor of gentlemen riding into battle.
But the accessory that best matches the sixteenth century is my qinghua jar from the Ming dynasty (1368-1644). This Chaxi is therefore the ideal occasion to use this jar, which Catherine de Médicis, Margot's mother, could have used to store one of her many poisons! Catherine de Médicis and the perfumer René were as dangerous as a spider!
But the most fatal poison in this story is a timeless sentiment: hatred. Indeed, the novel opens with the massacre of the Huguenots on St. Bartholomew's Day! Fortunately, there is also much love and many noble sentiments in this novel, which brings the history of France back to life without hiding its dark sides. / 2018-01-20T20:41:21