Bifurcated Rivets: From FB

New horrors here

Bifurcated Rivets: From FB

Lucero Tena

Bifurcated Rivets: From FB


Bifurcated Rivets: From FB

Business wetsuits...

Bifurcated Rivets: From FB


Computer Science: Theory and Application: Data flow analysis - Lattice theory, never needed it

While implementing a few DFA problems, I always wondered why this whole lattice thing was needed. I mean, theoretically to give bound, I get that.

But I've never needed it.

DFA(CFG, transferFunc, directory, mergeOperator, equalsFunc) { ... }

That's how I've always done DFA. I'm wondering if I'm missing something.

I'm about to start a new job where I'm sure I'll have to do more optimizations stuff, but I was wondering if anyone else is in the same situation, i.e. lattice top/bottom goes over their head.

submitted by halivingston
[link] [comment]

Recent additions: edit-distance

Added by AdamBergmark, Mon May 4 09:23:49 UTC 2015.

Levenshtein and restricted Damerau-Levenshtein edit distances

Recent additions: feed

Added by AdamBergmark, Mon May 4 09:22:34 UTC 2015.

Interfacing with RSS (v 0.9x, 2.x, 1.0) + Atom feeds.

Slashdot: WikiLeaks' Anonymous Leak Submission System Is Back After Nearly 5 Years

Sparrowvsrevolution writes: On Friday, WikiLeaks announced that it has finally relaunched a beta version of its leak submission system after a 4.5 year hiatus. That file-upload site, which once served as a central tool in WIkiLeaks' leak-collecting mission, runs on the anonymity software Tor to allow uploaders to share documents and tips while protecting their identity from any network eavesdropper, and even from WikiLeaks itself. In 2010 the original submission system went down amid infighting between WikiLeaks' leaders and several of its disenchanted staffers, including several who left to create their own soon-to-fail project called OpenLeaks. WikiLeaks founder Julian Assange says that the new system, which was delayed by his legal troubles and the banking industry blockade against the group, is the final result of "four competing research projects" WikiLeaks launched in recent years. He adds that it has several less-visible submission systems in addition to the one it's now revealed. "Currently, we have one public-facing and several private-facing submission systems in operation, cryptographically, operationally and legally secured with national security sourcing in mind," Assange writes.

Read more of this story at Slashdot.

TwitchFilm: Learning From The Masters Of Cinema: Sidney Lumet's THE OFFENCE

It is no secret that Sean Connery grew to hate James Bond long before he stopped playing the character. In fact, he was so reluctant to return as 007 for Diamonds Are Forever, after George Lazenby walked away from the franchise after just one film, that United Artists offered the Scottish actor an unprecedented fee of US$1.25 million, and also agreed to produce two subsequent films of Connery's choosing if he'd pick up the Walther PPK one last time.. The first of these was The Offence, a bleak and brutal British police drama, directed by acclaimed American filmmaker Sidney Lumet. Connery and Lumet had previously collaborated on The Hill (1965) and The Anderson Tapes (1971), and would work together again on Murder on the Orient Express...

[Read the whole post on]

Open Culture: 125 MOOCs Getting Started in May: Enroll in One Today

Just a quick note to let you know that 125 MOOCS are getting started this month. You can find them all on our comprehensive list, curated with the help of our friends at Class Central. As always, the MOOCs cover many different topics — everything from Poetry in America: The Civil War and Its Aftermath, to World War 1: Trauma and Memory, to Women in Leadership: Inspiring Positive Change and Writing American Food — but the one I’m curious to check out is The Rise of Superheroes and Their Impact On Pop Culture, co-taught by Stan Lee. It starts on May 5th. You can enroll in the course today. Find more free May MOOCs here.

Follow us on Facebook, Twitter and Google Plus and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox.

Hackaday: Making Servos Spin Right Round Without Stopping

[Brian B] found a handful of servos at his local hackerspace, and like any good hacker worth his weight in 1N4001’s, he decided to improve upon their design. Most servos are configured to spin only so far – usually 180 degrees in either direction. [Brian B’s] hack makes them spin 360 degrees in continuous rotation.

He starts off by removing the top most gear and making a small modification with a razor. Then he adds a little super glue to the potentiometer, and puts the thing back together again. A few lines of code and an arduino confirms that the hack performs flawlessly.

We’ve seen ways to modify other types of servos for 360 rotation. There’s a lot of servos out there, and every little bit of information helps. Be sure to check your parts bin for any Tower Pro SG90 9g servos and bookmark this article. It might come in handy on a rainy day.

Filed under: Arduino Hacks

Penny Arcade: Comic: Gavinophilia

New Comic: Gavinophilia Math-Prime-Util-0.50

Utilities related to prime numbers, including fast sieves and factoring

programming: I wish somebody had told me this when I started.

submitted by ayswl
[link] [5 comments]

Recent additions: step-function

Added by petterb, Mon May 4 06:19:46 UTC 2015.

Step functions, staircase functions or piecewise constant functions

Slashdot: Two Gunman Killed Outside "Draw the Prophet" Event In Texas

cosm writes: ABC news reports that two armed gunman were shot and killed outside a "Draw the Prophet" event hosted in Garland Texas. From the article: "The event, sponsored by the American Freedom Defense Initiative, featured cartoons of the Prophet Muhammad, and scheduled speakers included Dutch MP Geert Wilders, who has campaigned to have the Quran banned in the Netherlands. The winner of the contest was to receive $10,000." In light of the Charlie Hebdo terrorist attacks, the Lars Vilks Muhammad drawing controversies, and the American show South Park's satirical depiction of the state of Muhammad phobia in the US and elsewhere, is there an end in sight to the madness associated with the representation of this religious figure?

Read more of this story at Slashdot. App-screenorama-0.05

Application output to websocket stream

programming: Irregular Expression: Three Tales of Second System Syndrome

submitted by szabgab
[link] [2 comments] Comic for 2015.05.04

New Cyanide and Happiness Comic

TwitchFilm: SUICIDE SQUAD: Here's Will Smith As Deadshot

One of the immediate questions that came up from the cast of Suicide Squad in costume shot released earlier this evening was if that was the complete costume of Will Smith as Deadshot. Where was his cybernetic eye, asked the masses? A couple hours later director David Ayer responded with a solo shot of Smith with the eye, via his Twitter account.Now everyone is asking for for Harley Quinn. With those... shorts(?)... cannot say that I blame them. The films is still in production here in Toronto and still scheduled for release on August 5th, 2016. ...

[Read the whole post on]

Recent additions: Diff 0.3.2

Added by SterlingClover, Mon May 4 04:32:43 UTC 2015.

O(ND) diff algorithm in haskell. HiD-1.95_93-TRIAL

Static website publishing framework

MetaFilter: Vending machine tally: This rule can go to hell.

Nearly losing sanity in a 28-hour Marvel Marathon.

Slashdot: VA Tech Student Arrested For Posting Perceived Threat Via Yik Yak

ememisya writes: I wonder if I posted, "There will be another 12/7 tomorrow, just a warning." around December, would people associate it with Pearl Harbor and I would find myself arrested, or has enough time passed for people to not look at the numbers 12 and 7 and take a knee jerk reaction? A student was arrested for "Harassment by Computer" (a class 1 misdemeanor in the state of Virginia) due to his post on an "anonymous" website [Yik Yak]. Although the post in and of itself doesn't mean anything to most people in the nation, it managed to scare enough people locally for law enforcement agencies to issue a warrant for his arrest. "Moon, a 21-year-old senior majoring in business information technology, is being charged with Harassment by Computer, which is a class one misdemeanor. Tuesday night, April 28, a threat to the Virginia Tech community was posted on the anonymous social media app Yik Yak. Around 11:15 p.m., an unknown user posted 'Another 4.16 moment is going to happen tomorrow. Just a warning (sic).' The Virginia Tech Police Department released a crime alert statement Wednesday morning via email informing students that VTPD was conducting an investigation throughout the night in conjunction with the Blacksburg Police Department."

Read more of this story at Slashdot. Dist-Zilla-Plugin-Test-PodSpelling-2.006009

Author tests for POD spelling

Embedded in Academia: What afl-fuzz Is Bad At

American fuzzy lop is a polished and effective fuzzing tool. It has found tons of bugs and there are any number of blog posts talking about that. Here we’re going to take a quick look at what it isn’t good at. For example, here’s a program that’s trivial to crash by hand, that afl-fuzz isn’t likely to crash in an amount of time you’re prepared to wait:

#include <stdlib.h>
#include <stdio.h>

int main(void) {
  char input[32];
  if (fgets(input, 32, stdin)) {
    long n = strtol(input, 0, 10);
    printf("%ld\n", 3 / (n + 1000000));
  return 0;

The problem, of course, is that we’ve asked afl-fuzz to find a needle in a haystack and its built-in feedback mechanism does not help guide its search towards the needle. Actually there are two parts to the problem. First, something needs to recognize that divide-by-zero is a crash behavior that should be targeted. Second, the search must be guided towards inputs that result in a zero denominator. A finer-grained feedback mechanism, such as how far the denominator is from zero, would probably do the job here. Alternatively, we could switch to a different generation technology such as concolic testing where a solver is used to generate test inputs.

The real question is how to get the benefits of these techniques without making afl-fuzz into something that it isn’t — making it worse at things that it is already good at. To see why concolic testing might make things worse, consider that it spends a lot of time making solver calls instead of actually running tests: the base testing throughput is a small fraction of what you get from plain old fuzzing. A reasonable solution would be to divide up the available CPU time among strategies; for example, devote one core to concolic testing and seven to regular old afl-fuzz. Alternatively, we could wait for afl-fuzz to become stuck — not hitting any new coverage targets within the last hour, perhaps — and then switch to an alternate technique for a little while before returning to afl-fuzz. Obviously the various search strategies should share a pool of testcases so they can interact (afl-fuzz already has good support for communication among instances). Hopefully some concolic testing people will spend a bit of time bolting their tools onto afl-fuzz so we can play with these ideas.

A variant of the needle-in-a-haystack problem occurs when we have low-entropy input such as the C++ programs we would use to test a C++ compiler. Very few ascii strings are C++ programs and concolic testing is of very little help in generating interesting C++. The solution is to build more awareness of the structure of C++ into the testcase generator. Ideally, this would be done without requiring the somewhat elaborate effort we put into Csmith. Something like LangFuzz might be a good compromise. (Of course I am aware that afl-fuzz has been used against clang and has found plenty of ways to crash it! This is great but the bugs being found are not the kind of semantic bugs that Csmith was designed to find. Different testcase generators solve different problems.)

So we have this great fuzzing tool that’s easy to use, but it also commonly runs up against testing problems that it can’t solve. A reasonable way forward will be for afl-fuzz to provide the basic framework and then we can all build extensions that help it reach into domains that it couldn’t before. Perhaps Valgrind is a good analogy: it provides not only an awesome baseline tool but also a rich infrastructure for building new tools.

[Pascal Cuoq gave me some feedback on this post including a bunch of ideas about afl-fuzz-related things that he’ll hopefully blog about in the near future.]

MetaFilter: African-American migrants to the Soviet Union

"My father felt that the U.S.S.R. treated him better than America. He was happy here."

Recent additions: pandoc-crossref

Added by lierdakil, Mon May 4 02:03:49 UTC 2015.

Pandoc filter for cross-references

Hackaday: Improving Active Loads

[Texane]’s job requires testing a few boards under a set of loads, and although the lab at work has some professional tools for this it seemed like a great opportunity to try out the Re:load 2. It’s a nifty little active load that’s available can of course be improved with an injection of solder and silicon.

While the Re:load 2 is a nice, simple device that can turn up to 12 Watts directly into heat, it’s not programmable. The ability to create and save load profiles would be a handy feature to have, so [Texane] took a Teensy 3.1 microcontroller and installed a resistor divider in front of the Re:load’s amplifier. A simple script running on a computer allows [Texane] to set the amount of current dumped and automate ramps and timers.

There is a more fundamental problem with the Re:load; the lowest possible current that can be dumped into a heat sink is 90mA. [Texane] replace the amplifier with a zero-drift amp that brought that 90mA figure down to 7mA.

Of course the Re:load and Teensy 3.1 are sold in the Hackaday store, but if you’re looking for a ready-built solution for a computer-controlled active load you can always check out the Re:load Pro, a fancy-smanchy model that has an LCD. The Pro costs more, and [Texane] just told you how to get the same features with the less expensive model we’re selling, though…

Filed under: tool hacks

programming: Pony - High Performance Actor Programming

submitted by fadeddata
[link] [18 comments]

TwitchFilm: SUICIDE SQUAD: Here Is The First Pic Of The Cast In Costume

Here you go. The first pic of the cast of Suicide Squad in costume. Director David Ayer tweeted the pic around an hour ago with the caption, "Task Force X assembled and ready."LtoR: Adam Beech as Slipknot, Jai Courtney as Capt. Boomerang, Cara Delevingne as The Enchantress, Joel Kinnaman as Rick Flagg, Margot Robbie as Harley Quinn, Will Smith as Deadshot, Adewale Akinnuoye-Agbaje as Killer Croc. I presume that is Jay Hernandez as El Diablo with the skullface at the end. And you have Katana in the crouch spam position up front. Production is underway here in Toronto. Suicide Squad will release in cinemas on August 5, 2016....

[Read the whole post on] Hash-Ordered-0.006

A fast, pure-Perl ordered hash class

Slashdot: Tesla's Household Battery: Costs, Prices, and Tradeoffs

Technologist Ramez Naam (hat tip to Tyler Cowen's "Marginal Revolution" blog) has taken a look at the economics of Tesla's new wall-mounted household battery system, and concludes that it's "almost there," at least for many places in the world -- and seems to already make sense in some. From his analysis: For some parts of the US with time-of-use plans, this battery is right on the edge of being profitable. From a solar storage perspective, for most of the US, where Net Metering exists, this battery isn’t quite cheap enough. But it’s in the right ballpark. And that means a lot. Net Metering plans in the US are filling up. California’s may be full by the end of 2016 or 2017, modulo additional legal changes. That would severely impact the economics of solar. But the Tesla battery hedges against that. In the absence of Net Metering, in an expensive electricity state with lots of sun, the battery would allow solar owners to save power for the evening or night-time hours in a cost effective way. And with another factor of 2 price reduction, it would be a slam dunk economically for solar storage anywhere Net Metering was full, where rates were pushed down excessively, or where such laws didn’t exist. That is also a policy tool in debates with utilities. If they see Net Metering reductions as a tool to slow rooftop solar, they’ll be forced to confront the fact that solar owners with cheap batteries are less dependent on Net Metering. ... And the cost of batteries is plunging fast. Tesla will get that 2x price reduction within 3-5 years, if not faster.

Read more of this story at Slashdot.

Instructables: exploring - featured: How To Make Man Cave Signs

Every guy needs man cave signs don't they? I've seen the metal printed ones, some of which go for pretty hefty prices! I wondered how effective painted signs, on wood, wood look. Hence this instructable. Finally got around to giving it a shot. Signs look good and from just a little distance, the...
By: Creativeman

Continue Reading »

Instructables: exploring - featured: R3: Rolling Red Robot

My first robot! I built this from the ground up, designing the platform, drive system, sensor array, and programming. I designed it with the idea it would be a flexible platform to experiment with and to be able to add accessories, like the blue LED headlights, which is what the small green breadb...
By: pvowell

Continue Reading »

MetaFilter: Be it Pill, Patch, Shot, Ring

Health insurance companies are illegally charging for birth control, according to studies conducted by the National Women's Law Center.

Jezebel reports further on the matter:

"The issue isn't just birth control: the report found that health insurers are also illegally excluding transition-related medical care for transgender people from their plans and illegally charging women for preventative services like 'well woman' exams. The insurers claim they're using 'reasonable management techniques,' which they're entitled to do under the law. It'll probably take years of scoldings from a variety of federal agencies before they cut it out."

Instructables: exploring - featured: Battery Eliminator from a recycled wall wart

If you are anything like me you hate to throw anything away...and as devices die, you end up with an endless supply of 110 VAC to DC at some voltage/amp rating "wall warts".A few weeks ago trout season started and I needed an aerator to keep the minnows alive longer. I ended up with a battery power...
By: doncrush

Continue Reading »

Computer Science: Theory and Application: Why does it take longer for a computer to recognize an incorrect password than the correct password?

I've noticed this multiple times, on both OS X and Ubuntu. If I ever mistype my system password, it sits there for a second before telling me I got it wrong. But, when I type the correct password, there is no wait, and it immediately lets me in.

Does it actually take less time to confirm a password match than to recognize a failure? Is this some sort of rate limiting for an attack vector like a usb device which masquerades as a keyboard and tries to brute force the password?

submitted by appletreker
[link] [25 comments]

Computer Science: Theory and Application: Models for Parallel Computation (Hitchhiker's Guide to Massively Parallel Universes)

submitted by yaroslavtsev
[link] [comment]

MetaFilter: Women in Science Fiction & Fantasy Month, 2015

Every April for the past several years, Fantasy Cafe has published a series of guest posts for Women in Science Fiction & Fantasy Month. This year, the article that generated the most discussion was "'I am ... ?': Representation of Mature Women in Fantasy" by Mieneke from A Fantastical Librarian, who asked, "So where are the older women in fantasy? Mature women who are the hero of their own story?" The many other guest posts this year offered an interesting range of questions, observations, and reflections--often by well-known names in the field.

Week 1 Week 2 Week 3 Week 4 Week 5

Slashdot: Facebook Wants to Skip the Off-Site Links, Host News Content Directly

The Wall Street Journal, in a report also cited by The Next Web and others, reports that Facebook is to soon begin acting not just as a conduit for news links pasted onto users' timelines (and leading to articles hosted elsewhere) but also as a host for the articles themselves. From the WSJ article: To woo publishers, Facebook is offering to change its traditional revenue-sharing model. In one of the models under consideration, publishers would keep all of the revenue from ads they sell on Facebook-hosted news sites, the people familiar with the matter said. If Facebook sells the advertisement, it would keep roughly 30% of the revenue, as it does in many other cases. Another motivation for Facebook to give up some revenue: It hopes the faster-loading content will encourage users to spend more time on its network. It is unclear what format the ads might take, or if publishers will be able to place or measure the ads they sell within Facebook. It seems likely Facebook would want publishers to use its own advertising-technology products, such as Atlas and LiveRail, as opposed to those offered by rivals such as Google Inc.

Read more of this story at Slashdot.

Instructables: exploring - featured: Tripod Mount for an iPad

Taking photos for Instructables is quick and easy with an iPad, but I sometimes feel like I might drop it, or I need both hands in the photo. For those times a tripod (and the self-timer built into the software) would be very helpful. This Instructable will show a simple tripod mount for a full-size...
By: Phil B

Continue Reading »

Hackaday: Hackaday Links: May 3, 2015

Everybody loves How It’s Made, right? How about 3D printers? The third greatest thing to come out of Canada featured Lulzbot in their most recent episode. It’s eight minutes of fun, but shame the puns weren’t better. Robertson drives and the Avro Arrow, if you’re wondering.

Speaking of 3D printers, a lot of printers are made of aluminum extrusion. Has anyone tried something like this? It’s an idea that’s been around for a while but we can’t seem to find anyone actually using 3D printed extrusion.

CastARs are shipping out, and someone made a holodeck with retroreflective material. It’s an inflatable dome that’s attached to a regular ‘ol tent that works as a positive pressure airlock. If you’re looking to replicate this, try it with hexagons and pentagons. That should be easier than the orange-slice gores.

For some reason we can’t comprehend, USB ports are now power ports. There’s still a lot of stuff that uses 9 and 12V, and for that there’s the USB 912. It’ll work better with one of those USB battery packs.

Want to see what the Raspberry Pi 2 looks like with a Flir? NOQ2 has you covered.

Remember the Speccy? In the manual, there was an exercise left to the reader: reproduce [Mahler]’s first symphony with the BEEP command. It took a Raspberry Pi (only for synchronizing several Speccys), but it’s finally done.

Filed under: Hackaday Columns, Hackaday links

MetaFilter: OMG! 美语

Pick-up Line Do's & Don'ts! Order Food At A Restaurant! What In The World?! NYC & LA!

周一到周五白洁都会播出一个节目!一起来学最新的,最地道的美语!Monday thru Friday "Bai Jie" posts one show discussing the newest and most authentic American English slang terms and phrases!

The star of this popular Voice of America program is Jessica Beinecke (Bái Jié 白洁).

programming: TIL Slovenia's IRS has an API

submitted by Audioburn
[link] [17 comments]

Planet Haskell: Keegan McAllister: Modeling garbage collectors with Alloy: part 1

Formal methods for software verification are usually seen as a high-cost tool that you would only use on the most critical systems, and only after extensive informal verification. The Alloy project aims to be something completely different: a lightweight tool you can use at any stage of everyday software development. With just a few lines of code, you can build a simple model to explore design issues and corner cases, even before you've started writing the implementation. You can gradually make the model more detailed as your requirements and implementation get more complex. After a system is deployed, you can keep the model around to evaluate future changes at low cost.

Sounds great, doesn't it? I have only a tiny bit of prior experience with Alloy and I wanted to try it out on something more substantial. In this article we'll build a simple model of a garbage collector, visualize its behavior, and fix some problems. This is a warm-up for exploring more complex GC algorithms, which will be the subject of future articles.

I won't describe the Alloy syntax in full detail, but you should be able to follow along if you have some background in programming and logic. See also the Alloy documentation and especially the book Software Abstractions: Logic, Language, and Analysis by Daniel Jackson, which is a very practical and accessible introduction to Alloy. It's a highly recommended read for any software developer.

You can download Alloy as a self-contained Java executable, which can do analysis and visualization and includes an editor for Alloy code.

The model

We will start like so:

open util/ordering [State]

sig Object { }
one sig Root extends Object { }

sig State {
pointers: Object -> set Object,
collected: set Object,

The garbage-collected heap consists of Objects, each of which can point to any number of other Objects (including itself). There is a distinguished object Root which represents everything that's accessible without going through the heap, such as global variables and the function call stack. We also track which objects have already been garbage-collected. In a real implementation these would be candidates for re-use; in our model they stick around so that we can detect use-after-free.

The open statement invokes a library module to provide a total ordering on States, which we will interpret as the progression of time. More on this later.


In the code that follows, it may look like Alloy has lots of different data types, overloading operators with total abandon. In fact, all these behaviors arise from an exceptionally simple data model:

Every value is a relation; that is, a set of tuples of the same non-zero length.

When each tuple has length 1, we can view the relation as a set. When each tuple has length 2, we can view it as a binary relation and possibly as a function. And a singleton set is viewed as a single atom or tuple.

Since everything in Alloy is a relation, each operator has a single definition in terms of relations. For example, the operators . and [] are syntax for a flavor of relational join. If you think of the underlying relations as a database, then Alloy's clever syntax amounts to an object-relational mapping that is at once very simple and very powerful. Depending on context, these joins can look like field access, function calls, or data structure lookups, but they are all described by the same underlying framework.

The elements of the tuples in a relation are atoms, which are indivisible and have no meaning individually. Their meaning comes entirely from the relations and properties we define. Ultimately, atoms all live in the same universe, but Alloy gives "warnings" when the type system implied by the sig declarations can prove that an expression is always the empty relation.

Here are the relations implied by our GC model, as tuple sets along with their types:

Object: {Object} = {O1, O2, ..., Om}
Root: {Root} = {Root}
State: {State} = {S1, S2, ..., Sn}

pointers: {(State, Object, Object)}
collected: {(State, Object)}

first: {State} = {S1}
last: {State} = {Sn}
next: {(State, State)} = {(S1, S2), (S2, S3), ..., (S(n-1), Sn)}

The last three relations come from the util/ordering library. Note that a sig implicitly creates some atoms.


The live objects are everything reachable from the root:

fun live(s: State): set Object {

*(s.pointers) constructs the reflexive, transitive closure of the binary relation s.pointers; that is, the set of objects reachable from each object.

Of course the GC is only part of a system; there's also the code that actually uses these objects, which in GC terminology is called the mutator. We can describe the action of each part as a predicate relating "before" and "after" states.

pred mutate(s, t: State) {
t.collected = s.collected
t.pointers != s.pointers
all a: Object - |
t.pointers[a] = s.pointers[a]

pred gc(s, t: State) {
t.pointers = s.pointers
t.collected = s.collected + (Object -
some t.collected - s.collected

The mutator cannot collect garbage, but it can change the pointers of any live object. The GC doesn't touch the pointers, but it collects any dead object. In both cases we require that something changes in the heap.

It's time to state the overall facts of our model:

fact {
no first.collected
first.pointers = Root -> (Object - Root)
all s: State - last |
let t = |
mutate[s, t] or gc[s, t]

This says that in the initial state, no object has been collected, and every object is in the root set except Root itself. This means we don't have to model allocation as well. Each state except the last must be followed by a mutator step or a GC step.

The syntax all x: e | P says that the property P must hold for every tuple x in e. Alloy supports a variety of quantifiers like this.

Interacting with Alloy

The development above looks nice and tidy — I hope — but in reality, it took a fair bit of messing around to get to this point. Alloy provides a highly interactive development experience. At any time, you can visualize your model as a collection of concrete examples. Let's do that now by adding these commands:

pred Show {}
run Show for 5

Now we select this predicate from the "Execute" menu, then click "Show". The visualizer provides many options to customise the display of each atom and relation. The config that I made for this project is "projected over State", which means you see a graph of the heap at one moment in time, with forward/back buttons to reach the other States.

After clicking around a bit, you may notice some oddities:

Diagram of a heap with an object pointing to the root

The root isn't a heap object; it represents all of the pointers that are reachable without accessing the heap. So it's meaningless for an object to point to the root. We can exclude these cases from the model easily enough:

fact {
all s: State | no s.pointers.Root

(This can also be done more concisely as part of the original sig.)

Now we're ready to check the essential safety property of a garbage collector:

assert no_dangling {
all s: State | no (s.collected &

check no_dangling for 5 Object, 10 State

And Alloy says:

Executing "Check no_dangling for 5 Object, 10 State"
8338 vars. 314 primary vars. 17198 clauses. 40ms.
Counterexample found. Assertion is invalid. 14ms.

Clicking "Counterexample" brings up the visualization:

Diagram of four states. A single heap object is unrooted, then collected, but then the root grows a new pointer to it!

Whoops, we forgot to say that only pointers to live objects can be stored! We can fix this by modifying the mutate predicate:

pred mutate(s, t: State) {
t.collected = s.collected
t.pointers != s.pointers
all a: Object - |
t.pointers[a] = s.pointers[a]

// new requirement!
all a: |
t.pointers[a] in

With the result:

Executing "Check no_dangling for 5 Object, 10 State"
8617 vars. 314 primary vars. 18207 clauses. 57ms.
No counterexample found. Assertion may be valid. 343ms.

SAT solvers and bounded model checking

"May be" valid? Fortunately this has a specific meaning. We asked Alloy to look for counterexamples involving at most 5 objects and 10 time steps. This bounds the search for counterexamples, but it's still vastly more than we could ever check by exhaustive brute force search. (See where it says "8617 vars"? Try raising 2 to that power.) Rather, Alloy turns the bounded model into a Boolean formula, and feeds it to a SAT solver.

This all hinges on one of the weirdest things about computing in the 21st century. In complexity theory, SAT (along with many equivalents) is the prototypical "hardest problem" in NP. Why do we intentionally convert our problem into an instance of this "hardest problem"? I guess for me it illustrates a few things:

  • The huge gulf between worst-case complexity (the subject of classes like NP) and average or "typical" cases that we encounter in the real world. For more on this, check out Impagliazzo's "Five Worlds" paper.

  • The fact that real-world difficulty involves a coordination game. SAT solvers got so powerful because everyone agrees SAT is the problem to solve. Standard input formats and public competitions were a key part of the amazing progress over the past decade or two.

Of course SAT solvers aren't quite omnipotent, and Alloy can quickly get overwhelmed when you scale up the size of your model. Applicability to the real world depends on the small scope hypothesis:

If an assertion is invalid, it probably has a small counterexample.

Or equivalently:

Systems that fail on large instances almost always fail on small instances with similar properties.

This is far from a sure thing, but it already underlies a lot of approaches to software testing. With Alloy we have the certainty of proof within the size bounds, so we don't have to resort to massive scale to find rare bugs. It's difficult (but not impossible!) to imagine a GC algorithm that absolutely cannot fail on fewer than 6 nodes, but is buggy for larger heaps. Implementations will often fall over at some arbitrary resource limit, but algorithms and models are more abstract.


It's not surprising that our correctness property

all s: State | no (s.collected & Root.*(s.pointers))

holds, since it's practically a restatement of the garbage collection "algorithm":

t.collected = s.collected + (Object - Root.*(s.pointers))

Because reachability is built into Alloy, via transitive closure, the simplest model of a garbage collector does not really describe an implementation. In the next article we'll look at incremental garbage collection, which breaks the reachability search into small units and allows the mutator to run in-between. This is highly desirable for interactive or real-time apps; it also complicates the algorithm quite a bit. We'll use Alloy to uncover some of these complications.
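Concretely, the reachability that Alloy expresses with transitive closure is an ordinary graph traversal. A minimal mark-and-sweep sketch in Python (the dict-based heap encoding is my own, not taken from the Alloy model):

```python
def collect(pointers, root):
    """Naive stop-the-world mark-and-sweep: return the set of objects
    to collect.  `pointers` maps each object to the set of objects it
    points to; `root` is the distinguished root object."""
    # Mark: compute the transitive closure of `pointers` from `root`.
    reachable = {root}
    worklist = [root]
    while worklist:
        obj = worklist.pop()
        for target in pointers.get(obj, set()):
            if target not in reachable:
                reachable.add(target)
                worklist.append(target)
    # Sweep: everything not reachable is garbage.
    return set(pointers) - reachable

# A tiny heap: Root -> A -> B, and C points to B but nothing reaches C.
heap = {
    "Root": {"A"},
    "A": {"B"},
    "B": set(),
    "C": {"B"},
}
print(collect(heap, "Root"))  # {'C'}
```

The incremental collectors of the next article break exactly this loop into interruptible pieces, which is where the interesting bugs live.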

In the meantime, you can play around with the simple GC model and ask Alloy to visualize any scenario you like. For example, we can look at runs where the final state includes at least 5 pointers, and at least one collected object:

pred Show {
  #(last.pointers) >= 5
  some last.collected
}

run Show for 5

Thanks for reading! You can find the code in a GitHub repository which I'll update if/when we get around to modeling more complex GCs.

Hackaday: Weekend Proves Hardware Wins Hackathons

Teams hacking on hardware won big this weekend in New York. Ten teams answered Hackaday’s call as we hosted the first-ever hardware hackathon at TechCrunch Disrupt NYC. These teams were thrown into the mix with all of the software hackers TC was hosting and rose to the top. Eight out of our ten teams won!

As we suspected, having something physical to show off is a huge bonus compared to those showing apps and webpages alone. Recipe for awesome: Mix in the huge talent pool brought by the hardware hackers participating, then season with a dash of experience from mentors like [Kenji Larson], [Johngineer], [Bil Herd], [Chris Gammell], and many more.

Out of over 100 teams, first runner-up went to PicoRico, which built a data collection system for the suspension of a mountain bike. The Twilio prize went to Stove Top Sensor for Paranoid, Stubburn Older Parents, which adds cellphone and web connectivity to the stove, letting you check whether you remembered to turn off the burners. The charismatic duo of fifteen-year-olds [Kristopher] and [Ilan] stole the show with their demonstration of Follow Plants, which gives your produce a social media presence that you can then follow.

We recorded video and got the gritty details from everyone building hardware during the 20-hour frenzy. We’ll be sharing those stories throughout the week so make sure to check back!

Filed under: cons, Featured

Hackaday: TubeNetRadio Project Modernizes 1959 Tube Radio

Years ago, [Luk] came across an old tube radio. He’s wanted to convert it to an internet radio ever since, but never really got around to it. Now that we are living in the age when a microcomputer can be had for a mere $35, [Luk] decided it was time to finish his long-lost project.

He chose a Raspberry Pi for the brains of his project because it is an inexpensive and well-documented product, perfect for what he wanted to do. [Luk] had a goal: to modify the radio as little as possible while getting it to play both internet radio and locally stored MP3s. The radio from 1959 is certainly old, but it had a feature you may not expect: an AUX input with a separate volume knob out front. Like the radio itself, the input was mono. To connect the Raspberry Pi to the radio, [Luk] had to make a 1/8th inch stereo to banana plug adapter, a great solution that did not require any modification to the original radio.

WiFi is accessed through an off-the-shelf USB wireless module. After evaluating tapping into a 5 VDC source somewhere in the radio, [Luk] decided to use a wall wart to power the Raspberry Pi. A plug for the wall wart was spliced in after the radio’s main on/off switch, so the radio and Raspberry Pi both turn on and off together. There is plenty of room for all of these added components inside the radio’s case.

The RaspPi can be fully controlled over the WiFi network but has a couple buttons wired up to the GPIO pins for limited manual control. The buttons for these controls fit perfectly in the round vent holes in the back panel of the radio’s case. Although the buttons are visible, no permanent modifications had to be made! [Luk] reports that everything works great, as do the original functions of the radio.

Filed under: home entertainment hacks

programming: Software Engineering vs. Product Manager Salaries

submitted by maus80
[link] [205 comments]

Instructables: exploring - featured: DIY SPEEDOMETER AND ODOMETER

Hey guys, this is an Instructable on how to make a bike speedometer. Yeah, you read it right: the one that we use in cars, but for only $10. The first thing for you to know is that this is a collaborative project of Mr_DIY_Electrician and paurushthemaker. So, back to topic, our project is ...
By: electroguyz

Continue Reading »

Computer Science: Theory and Application: How do I get into research?

I don't know if this belongs here because this is not really a CS career question, but I'm more interested in the mathematical and "theoretical" parts of computer science than in the practice of writing code. I want to get into research (Masters or PhD?), but I'm not sure how to go about it. The university I currently attend is ranked in the 80s to 110s by the US News ranking (for comp sci). I was wondering if it's worth pursuing this path even though I did not get into an Ivy League school like Columbia and can't afford the decent schools that have good comp sci programs. Thanks guys!

submitted by GucciVersaceGucci
[link] [21 comments]

Computer Science: Theory and Application: A Quadratic Probing Question

Hello everybody. When hashing and doing quadratic probing, I understand (I'm pretty sure I understand, at least) how to insert elements and what to do if there is a collision, but what happens when I run out of "space"?

i.e. my example in the picture only has 10 slots, thus only space from 0-9. My understanding is that if I have 10 slots, I mod (%) the number by 10 and insert.

13 mod 10 is 3, 3 is empty so I can insert 13 no problem.

23 mod 10 is 3 again, so I get 3 and add 1² = 1 to it, so 3+1=4; insert 23 into 4.

33 mod 10 is once again 3 and a collision. 1² = 1, so 3+1=4, but there is another collision. So 2² = 4, 4+3 = 7; no collision, so insert 33 into 7.

Now my problem is what happens when my element is 43 and beyond? 43 mod 10 is once again 3 and a collision. 1² = 1, so 3+1=4, but there is another collision. So 2² = 4, 4+3 = 7 is a collision. 3² = 9, 9+3 = 12, but there is no 12th spot.

Would I use chaining somehow or loop around?

Thank you

EDIT: I think I might have just found my problem. I think I missed a step. After I get my new number from adding in i², I'm supposed to mod 10 again, correct? So in the case of 43, where I get 12, I'm supposed to do 12 mod 10, which gives me 2, so I insert 43 in the 2nd spot?
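The EDIT has it right: every probe, not just the first, is reduced mod the table size, so the i-th probe for key k goes to slot (k + i²) mod m. A quick Python sketch (a hypothetical minimal implementation, ignoring deletion and resizing):

```python
def quadratic_insert(table, key):
    """Insert `key` using quadratic probing: try slot (key + i*i) % m."""
    m = len(table)
    for i in range(m):
        slot = (key + i * i) % m   # the mod applies to EVERY probe
        if table[slot] is None:
            table[slot] = key
            return slot
    raise RuntimeError("no free slot found")  # would need resize/rehash

table = [None] * 10
for key in (13, 23, 33, 43):
    print(key, "->", quadratic_insert(table, key))
```

This reproduces the example: 13 → 3, 23 → 4, 33 → 7, and 43 wraps around to 2. One caveat: with m = 10 the offsets i² mod 10 only ever take the values {0, 1, 4, 5, 6, 9}, so quadratic probing can miss some slots entirely; that's why textbooks usually require a prime table size (and often a load factor of at most ½) to guarantee a free slot is found.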

submitted by tinktinkdotorg
[link] [2 comments]

Greater Fool - Authored by Garth Turner - The Troubled Future of Real Estate: The lucky guy

Stan worked on the line at GM’s Oshawa plant for thirty years. “Last of the breed,” he says. “Man, look at the news.” Indeed. GM just punted a thousand workers, who will be gone by November. When Stan started there, 15,000 guys crowded the gates. Now there are 3,600. Soon, a third less. “This place is doomed,” he prophesies.

Pensions are one reason, which is why I was talking to the guy. GM Canada has about 30,000 retirees drawing monthly cheques. It also has an unfunded pension liability estimated to be more than $2 billion. That’s despite a $3.2-billion cash gift the company received from the government when GM was bailed out in ‘09. It effectively means most company pensioners today are drawing taxpayer money. Yep, just like civil servants. Except most ex-GM workers get more.

Stan never saved a nickel, has no RRSPs, no TFSA, no investment portfolio and $12,500 in his TD Canada Trust daily savings account earning 0.10%. But he does have a house east of Toronto he paid $220,000 for, plus a wife who works at Loblaws.

But Stan’s one lucky dude. He has a gold-plated pension from the olden days when automakers secretly sweated as the union’s brass swaggered to the negotiating table. He also has a big choice to make. He can collect a monthly cheque until he dies. Or he can commute it – taking over the pension himself with a lump-sum payment. In his case, it will be just under $1 million – some of it rolled into a tax-free registered account, some of it in taxable cash.

“I’m scared,” he said. “I can’t sleep, and now all I do is worry.” That’s normal, I told him. People lacking money worry occasionally about being poor. People who have money obsess about losing it. It’s why rich people never smile.

Well, Stan made his choice finally. He took the money, will have it invested privately and get his monthly allowance that way. Here’s why.

“I don’t trust them.” These are the words of a guy who’s watched the ranks of the employed decimated, seen his company rescued from colossal failure by the government, and knows there’s not enough money in the pot to fund his pension for the next 35 years. In fact, unfunded pension liabilities are a ticking timebomb with the potential to blow up the lives of many unsuspecting people.

For example, Canada Post has an unfunded pension liability of $6.5 billion, which should explain why it’s trying hard to get out of the mail delivery business and laying off armies of people. Across Canada it’s estimated there are $300 billion worth of pensions that public sector workers are expecting that actually have no dollars allocated to them. Some bitter surprises are in store.

Anyway, Stan’s smart. Why even take a chance when you can take the money now?

Then there’s this: “What if they screw up again?” Governments struggling with their own debts and deficits might not be so generous with GM the next time it hits the rocks. Pensioners in Canada could live through the same experience as cops and firefighters have in American cities and states where pension benefits are arbitrarily cut. Already teachers in Ontario have been forced to pay more into their massive pension plan and will be receiving less, just to keep it solvent.

By taking the money and putting it to work, hopefully matching long-term investment returns, Stan need never deplete it, and can harvest a monthly amount equal to what the pension administrators were promising.

Most importantly he said, “I have to do this for Brenda.” Smart. If Stan took the company pension the way most of his greying buddies are, with its stress-free payments, then died in a few years, Brenda would get a small and temporary survivor benefit. But by commuting the pension amount, Stan’s family owns 100% of the money – forever. If he passes first (“Like that won’t happen…”) then Brenda gets every cent, to support her and help the kids as they get established.

Besides, there couldn’t be a better time for the guy to be doing this, since interest rates have cratered. Low rates make a commuted pension worth more in today’s dollars, since the present value of it rises. If current rates were a couple of percentage points higher then the autoworker’s pension value would be at least $300,000 lower. In fact, his commuted value jumped enough to buy a new RV with the tiny quarter-point bank rate drop in January.
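The rate sensitivity here is just present-value arithmetic: the commuted value is the sum of the promised payments discounted at current rates, so lower rates mean a bigger lump sum today. A rough Python sketch with invented numbers (a hypothetical $55,000-a-year pension over 35 years — not Stan's actual figures):

```python
def present_value(annual_payment, rate, years):
    """PV of a level annual payment stream, discounted at `rate`."""
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

payment, years = 55_000, 35   # hypothetical pension and horizon
low, high = 0.03, 0.05        # discount rates two points apart

pv_low = present_value(payment, low, years)    # lump sum at low rates
pv_high = present_value(payment, high, years)  # lump sum at higher rates
print(round(pv_low), round(pv_high), round(pv_low - pv_high))
```

With these made-up inputs, two percentage points on the discount rate moves the lump sum by roughly $280,000 — the same order of magnitude as the $300,000 swing described above.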

Like I said. Lucky dude.

Finally, Stan can take his wad, invest it reasonably for growth and stability, and end up paying less tax than his pension-collecting pals. That’s because a portion of his income can be deemed return of capital, which means it’s not reportable, keeping him in a lower tax bracket.

Of course, in return for these benefits, he worries. He has to trust someone with his million. And that is the highest hurdle.

Trivium: 03may2015

Climate Resistance: Wake Up and Smell the Coffee!

Another day, another apocalyptic story in the Guardian

Coffee catastrophe beckons as climate change threatens arabica plant
Study warns that rising temperatures pose serious threat to global coffee market, potentially affecting livelihoods of small farmers and pushing up prices


Coffee, as we all now know, is grown by poor people. And, as we all know, climate change is worse for the poor. Never mind that environmentalists — who claim to care for the poor — hate coffee shops (unless they’re in Amsterdam), and hate global trade and hate the vehicles that global trade depends on, and hate even more the fuels that make advanced agriculture and global shipping possible…

Cultivation of the arabica coffee plant, staple of daily caffeine fixes and economic lifeline for millions of small farmers, is under threat from climate change as rising temperatures and new rainfall patterns limit the areas where it can be grown, researchers have warned.

This is surely a disaster.

With global temperatures forecast to increase by 2C-2.5C over the next few decades, a report predicts that some of the major coffee producing countries will suffer serious losses, reducing supplies and driving up prices.

2.5 degrees over the next few decades? Really? Over the course of my coffee-drinking career — i.e. my adult life — the globe has warmed by approximately no degrees centigrade. But let’s not worry about that right now. What exactly is the claim?

The joint study, published by the International Center for Tropical Agriculture (CIAT) under the CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS), models the global suitability of arabica cultivation to see how production will be affected in 2050.

It predicts that Brazil, Vietnam, Indonesia and Colombia – which between them produce 65% of the global market share of arabica – will find themselves experiencing severe losses unless steps are taken to change the genetics of the crops as well as the manner and areas in which it is grown.

Well, we can all agree that adaptation is a Good Thing, and is likely a good way of responding to climate change. But there’s adaptation and there’s adaptation. Most adaptation is a decision that can be taken at the level of the farm. The implication of the study, however, is that coffee growers will have to move ever upwards to cope with the changing climate, demanding the intervention of national and global carbon bureaucracies.

But is this true? What’s the evidence for it?

It doesn’t exist in the statistics relating to the production of coffee provided by the UN. Here is a chart showing coffee production in the countries named by the Guardian in the passage above, and for the world total.


World coffee production has doubled since 1980. Coffee production has tripled in Brazil since 1995, and output is less volatile. Vietnam has emerged as a coffee superpower in just two decades. Indonesia’s coffee production has shown slow but steady and sure growth. This picture is hard to marry with the story that coffee production is getting harder. The only loser here is Colombia, whose output seemed to peak in the early 1990s. For this we turn to Wikipedia for the standard explanation:

Regional climate change associated with global warming has caused Colombian coffee production to decline since 2006 from 12 million 132-pound bags, the standard measure, to 9 million bags in 2010. Average temperatures have risen 1 degree Celsius between 1980 to 2010, with average precipitation increasing 25 percent in the last few years, disrupting the specific climatic requirements of the Coffea arabica bean.[13]

Well, that’s one explanation for Colombia’s coffee production decline. But there are at least two others… The fair trade organisation Equal Exchange offers this account:

The global coffee [price] crisis hit Colombia’s small producers hard. Twenty-three percent of producers were not meeting production costs in the nineteen nineties. The affect on producer families varied by region, but overall the crisis sent people further into poverty and debt. Malnutrition among small children in farm families went up significantly, while coffee production across the country fell 44% as farmers could no longer afford to harvest and process their crops. Many farmers were forced to migrate for work in urban areas leading to increased unemployment and more poverty.

The article is not without its own tendency to sustainabollocks. And this journal article offers a third perspective, which it also attempts to link to climate change…

Coffee rust is a leaf disease caused by the fungus, Hemileia vastatrix. Coffee rust epidemics, with intensities higher than previously observed, have affected a number of countries including: Colombia, from 2008 to 2011; Central America and Mexico, in 2012–13; and Peru and Ecuador in 2013. There are many contributing factors to the onset of these epidemics e.g. the state of the economy, crop management decisions and the prevailing weather, and many resulting impacts e.g. on production, on farmers’ and labourers’ income and livelihood, and on food security. Production has been considerably reduced in Colombia (by 31 % on average during the epidemic years compared with 2007) and Central America (by 16 % in 2013 compared with 2011–12 and by 10 % in 2013–14 compared with 2012–13). These reductions have had direct impacts on the livelihoods of thousands of smallholders and harvesters. For these populations, particularly in Central America, coffee is often the only source of income used to buy food and supplies for the cultivation of basic grains. As a result, the coffee rust epidemic has had indirect impacts on food security. The main drivers of these epidemics are economic and meteorological. All the intense epidemics experienced during the last 37 years in Central America and Colombia were concurrent with low coffee profitability periods due to coffee price declines, as was the case in the 2012–13 Central American epidemic, or due to increases in input costs, as in the 2008–11 Colombian epidemics. Low profitability led to suboptimal coffee management, which resulted in increased plant vulnerability to pests and diseases. 
A common factor in the recent Colombian and Central American epidemics was a reduction in the diurnal thermal amplitude, with higher minimum/lower maximum temperatures (+0.1 °C/-0.5 °C on average during 2008–2011 compared to a low coffee rust incidence period, 1991–1994, in Chinchiná, Colombia; +0.9 °C/-1.2 °C on average in 2012 compared with prevailing climate, in 1224 farms from Guatemala). This likely decreased the latency period of the disease. These epidemics should be considered as a warning for the future, as they were enhanced by weather conditions consistent with climate change. Appropriate actions need to be taken in the near future to address this issue including: the development and establishment of resistant coffee cultivars; the creation of early warning systems; the design of crop management systems adapted to climate change and to pest and disease threats; and socio-economic solutions such as training and organisational strengthening.

But the link between climate change — whether it be natural or anthropogenic — and reduced coffee bean production is speculation. The research only suggests it as a ‘likely’ part-cause of an epidemic, given relatively modest changes in temperature extremes, which itself had a much more profound effect on production, which was again much more likely an economic consequence — low price and poverty. Let us not forget that greens are hostile to interventions which could have prevented the disease — pesticides — and campaign to abolish their use, and have persuaded Fair Trade organisations to make ‘sustainability’ a condition of trade. In other words, it is not implausible that the demands of ‘sustainability’ could have caused the very problem which its advocates now attribute to climate change.

A broader picture of climate change’s effect on coffee production can be gained by looking at each country’s yield.


Again, we can see that the story of environmental decline doesn’t fit the statistics. We can see no signal corresponding to climate change in any country except Colombia, which we have an explanation for. Moreover, in the case of Vietnam we can see a dramatic shift in yield between the late 1990s and the mid 2000s, which the environmentalist might be tempted to explain as a consequence of climate change. But he would be wrong. The producer price of coffee fell between 1997 and 2004, before rising again, as this graph of Colombian production statistics shows. (The data for producer prices in Vietnam do not exist over this time range.)


Economics accounts for changes in production yield much better than climate. When the price is low, the yield is low.

The Guardian article continues, quoting one of the study’s authors…

“If you look at the countries that will lose out most, they’re countries like El Salvador, Nicaragua and Honduras, which have steep hills and volcanoes,” he said. “As you move up, there’s less and less area. But if you look at some South American or east African countries, you have plateaus and a lot of areas at higher altitudes, so they will lose much less.”

So do these countries show any sign of being vulnerable to climate change yet? Here are the production and yield stats for those countries.



We can see coffee production increase in Honduras and Nicaragua, and yield increase in Honduras, with wobbly increase for yield in Nicaragua. The case of El Salvador is very different. Coffee production fell, and has not recovered since 1979, and its yield has fallen since 1969. Is this the result of climate change?

No. In the cases of both Nicaragua and El Salvador, conflict explains changes in production statistics much better than climate change does. In Nicaragua, civil war affected production through the 1980s, amplified by US sanctions, and the reduction in yield from the late 1990s through the mid 2000s is explained by the lower prices that affected Vietnam. Civil war affected El Salvador through the 1980s as well, and the El Salvadorian economy has not recovered from it.

The report‘s abstract reads as follows…

Regional studies have shown that climate change will affect climatic suitability for Arabica coffee (Coffea arabica) within current regions of production. Increases in temperature and changes in precipitation patterns will decrease yield, reduce quality and increase pest and disease pressure. This is the first global study on the impact of climate change on suitability to grow Arabica coffee. We modeled the global distribution of Arabica coffee under changes in climatic suitability by 2050s as projected by 21 global circulation models. The results suggest decreased areas suitable for Arabica coffee in Mesoamerica at lower altitudes. In South America close to the equator higher elevations could benefit, but higher latitudes lose suitability. Coffee regions in Ethiopia and Kenya are projected to become more suitable but those in India and Vietnam to become less suitable. Globally, we predict decreases in climatic suitability at lower altitudes and high latitudes, which may shift production among the major regions that produce Arabica coffee.

This seems to me to reproduce the same old trick, of plugging in worst-case scenario projections into modelled assumptions of sensitivity of this-or-that to climate, to reveal, hey-presto, a sound prediction of what life will be like a few decades hence. Yet we can see that climate has had very little impact on agricultural production, if any negative impact at all. And we can see that economics plays a much bigger role in agricultural production than any environmental effect.

These kind of studies claim to want to protect the interests of producers. Yet their futures don’t seem to be at all dependent on the interventions of climate bureaucracies, if there is any lesson to be had from the past. The weather is simply the weather, whereas price volatility and conflict are the real enemies of farmers in poorer economies. Wealth allows for the proper management of crops, as well as adaptation to any kind of weather. The study does not appear to have attempted to isolate climate and its Nth-order effects from economic effects and conflict in its estimation of coffee-production’s sensitivity to climate. Why not?

This doesn’t exclude the possibility, of course, that dramatic shifts in climate could create problems for coffee producers. Of course it could. Yet even extreme weather, such as the El Niño that caused widespread damage in coffee-producing economies in the late 1990s, doesn’t seem to have affected coffee production. In fact, the price of coffee fell following the 1997-8 El Niño, no doubt amplifying the consequences for recovery.

To link agricultural production and climate change in this way — as seems to be the greens’ wont — is to make instrumental use of the plight of producers in poorer economies. It does not aim to intervene in any way that would improve their condition. The purpose is to inflate an already engorged bureaucracy and add to its powers. A genuine discussion about how to improve the conditions of producers in poorer economies would be about how best to bring about a situation in which fewer farmers produced more goods, leaving more people to produce the machines and chemicals those wealthier farmers would use in their work, the other services they would use in their lives, and the books, films and music they would enjoy in their leisure time.

But bloated, ambitious green bureaucracies and their academic organs like the CGIAR Research Program on Climate Change, Agriculture and Food Security, which produced this report don’t want such lifestyles for poorer producers.

No single research institution working alone can address the critically important issues of global climate change, agriculture and food security. The CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS) will address the increasing challenge of global warming and declining food security on agricultural practices, policies and measures through a strategic collaboration between CGIAR and Future Earth.

Food security is not an ‘increasing challenge’. It is a challenge which has reduced dramatically over just the timespan of anthropogenic global warming. More people have more access to better quality food than ever before. Only in the minds of bureaucrats and climate impact models is the world a worse place than it ever has been. The reasons for this are obvious.

Lambda the Ultimate - Programming Languages Weblog: BER MetaOCaml -- an OCaml dialect for multi-stage programming

BER MetaOCaml -- an OCaml dialect for multi-stage programming
Oleg Kiselyov

BER MetaOCaml is a conservative extension of OCaml for ``writing programs that generate programs''. BER MetaOCaml adds to OCaml the type of code values (denoting ``program code'', or future-stage computations), and two basic constructs to build them: quoting and splicing. The generated code can be printed, stored in a file -- or compiled and linked-back to the running program, thus implementing run-time code optimization. A well-typed BER MetaOCaml program generates only well-scoped and well-typed programs: The generated code shall compile without type errors. The generated code may run in the future but it is type checked now. BER MetaOCaml is a complete re-implementation of the original MetaOCaml by Walid Taha, Cristiano Calcagno and collaborators.

Introduction to staging and MetaOCaml

The standard example of meta-programming -- the running example of A.P.Ershov's 1977 paper that begat partial evaluation -- is the power function, computing x^n. In OCaml:

let square x = x * x

let rec power n x =
  if n = 0 then 1
  else if n mod 2 = 0 then square (power (n/2) x)
  else x * (power (n-1) x)


In MetaOCaml, we may also specialize the power function to a particular value n, obtaining the code which will later receive x and compute x^n. We re-write power n x annotating expressions as computed `now' (when n is known) or `later' (when x is given).

let rec spower n x =
  if n = 0 then .<1>.
  else if n mod 2 = 0 then .<square .~(spower (n/2) x)>.
  else .<.~x * .~(spower (n-1) x)>.;;
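If you don't have MetaOCaml at hand, the staging idea can be imitated crudely in plain Python by generating source text and eval-ing it. This analogue is my own (including the `square` helper mirroring the OCaml one); unlike MetaOCaml's typed code values, it pastes strings:

```python
def spower_src(n, x="x"):
    """Build the source text of x**n with the same square/multiply recursion."""
    if n == 0:
        return "1"
    if n % 2 == 0:
        return f"square({spower_src(n // 2, x)})"
    return f"({x} * {spower_src(n - 1, x)})"

def specialize(n):
    """'Compile' the generated expression into an ordinary function of x."""
    src = spower_src(n)
    return eval(f"lambda x: {src}", {"square": lambda y: y * y})

power7 = specialize(7)
print(spower_src(7))  # the generated 'code', with n recursed away
print(power7(2))      # 128
```

Nothing here is checked before it runs, of course; MetaOCaml's guarantee that generated code is well-scoped and well-typed is precisely what the string-pasting approach lacks.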

A brief history of (BER) MetaOCaml

As MetaOCaml was being developed, new versions of the mainline OCaml were released with sometimes many fixes and improvements. The MetaOCaml team tracked new OCaml releases and merged the changes into MetaOCaml. (The MetaOCaml version number has as its base OCaml's release version.) The merge was not painless. For example, any new function in the OCaml compiler that dealt with Parsetree (AST) or Typedtree has to be modified to handle MetaOCaml extensions to these data structures. The merge process became more and more painful as the two languages diverged. For instance, native code compilation that first appeared in MetaOCaml 3.07 relied on SCaml, a large set of patches to OCaml by malc at to support dynamic linking. OCaml 3.08 brought many changes that were incompatible with SCaml. Therefore, in MetaOCaml 3.08 the native compilation mode was broken. The mode was brought back in the Summer 2005, by re-engineering the SCaml patch and implementing the needed parts of dynamic linking without any modification to the OCaml code. The revived native compilation has survived through the end.


BER MetaOCaml has been re-structured to minimize the amount of changes to the OCaml type-checker and to separate the `kernel' from the `user-level'. The kernel is a set of patches and additions to OCaml, responsible for producing and type-checking code values. The processing of built code values -- so-called `running' -- is user-level. Currently the user-level metalib supports printing, type-checking, and byte-compiling and linking of code values. Users have added other ways of running the code, for example, compiling it to machine code, C or LLVM -- without any need to hack into (Meta)OCaml or even recompile it.


By relying on attributes, a feature of OCaml 4.02, BER N102 has become much more closely integrated with OCaml. It is instructive to compare the amount of changes BER MetaOCaml makes to the OCaml distribution. The previous version (BER N101) modified 32 OCaml files. The new BER N102 modifies only 7 (that number could be further reduced to only 2; the only file with nontrivial modifications is in typing/.) It is now a distinct possibility that — with small hooks that may be provided in future OCaml versions — MetaOCaml becomes just a regular library or a plug-in, rather than being a fork.

Planet Haskell: apfelmus: GUI - Release of the threepenny-gui library, version

I am pleased to announce the release of threepenny-gui version 0.6, a cheap and simple library to satisfy your immediate GUI needs in Haskell.

Want to write a small GUI thing but forgot to sacrifice to the giant rubber duck in the sky before trying to install wxHaskell or Gtk2Hs? Then this library is for you! Threepenny is easy to install because it uses the web browser as a display.

The library also has functional reactive programming (FRP) built-in, which makes it a lot easier to write GUI applications without getting caught in spaghetti code. For an introduction to FRP, see for example my slides from a tutorial I gave in 2012. (The API is slightly different in Reactive.Threepenny.)

In version 0.6, the communication with the web browser has been overhauled completely. On a technical level, Threepenny implements an HTTP server that sends JavaScript code to the web browser and receives JSON data back. However, this is not the right level of abstraction to look at the problem. What we really want is a foreign function interface for JavaScript, i.e. we want to be able to call arbitrary JavaScript functions from our Haskell code. As of this version, Threepenny implements just that: The module Foreign.JavaScript gives you the essential tools you need to interface with the JavaScript engine in a web browser, very similar to how the module Foreign and related modules from the base library give you the ability to call C code from Haskell. You can manipulate JavaScript objects, call JavaScript functions and export Haskell functions to be called from JavaScript.

However, the foreign calls are still made over an HTTP connection (Threepenny does not compile Haskell code to JavaScript). This presents some challenges, which I have tried to solve with the following design choices:

  • Garbage collection. I don’t know of any FFI that has attempted to implement cross-runtime garbage collection. The main problem is cyclic references, which happen very often in a GUI setting, where an event handler references a widget, which in turn references the event handler. In Threepenny, I have opted to leave garbage collection entirely to the Haskell side, because garbage collectors in current JavaScript engines are vastly inferior to what GHC provides. The module Foreign.RemotePtr gives you the necessary tools to keep track of objects on the JavaScript (“remote”) side where necessary.

  • Foreign exports. Since the browser and the HTTP server run concurrently, there is no shared “instruction pointer” that keeps track of whether you are currently executing code on the Haskell side or the JavaScript side. I have chosen to handle this in the following way: Threepenny supports synchronous calls to JavaScript functions, but Haskell functions can only be called as “asynchronous event handlers” from the JavaScript side, i.e. the calls are queued and they don’t return results.

  • Latency, fault tolerance. Being a GUI library, Threepenny assumes that both the browser and the Haskell code run on localhost, so all network problems are ignored. This is definitely not the right way to implement a genuine web application, but of course, you can abuse it for writing quick and dirty GUI apps over your local network (see the Chat.hs example).

To see Threepenny in action, have a look at the following applications:

Daniel Austin’s FNIStash
Editor for Torchlight 2 inventories.
Chaddai’s CurveProject
Plotting curves for math teachers.

Get the library here:

Note that the API is still in flux and is likely to change radically in the future. You’ll have to convert frequently or develop against a fixed version.

Paper Bits: "Here’s the big news flash for people who don’t vaccinate their kids: you don’t live on an island in..."

“Here’s the big news flash for people who don’t vaccinate their kids: you don’t live on an island in the middle of the woods in the middle of whatever century Laura Ingalls Wilder was born in which, if you wanted pork chops, you had to fatten the hog first. Having a cartoon drawing of your family on the back window of your Honda Element doesn’t make you and them the only people in the world. Look around you. Those things with the heads and the arms and the legs are other human beings.
Also: you see those tiny little things that some of those people are carrying around? Those are what we call Other People’s Infants. (I know you know what Your Infant looks like because you have a picture of it on the same phone you use to read stupid crap written by absolute morons like Jennie McCarthy and Melanie Phillips while taking up a space in the Whole Foods parking lot.) Anyway, infants also can’t get vaccinated. This means that, if your children aren’t vaccinated, they could infect an infant (not your infant though, of course! Your infant is safe in your phone!) and it could die, and it would be your fault.
I have said this many times to people – “an infant could die, and it would be your fault” – and they look at me like I just told them it’s raining. And then they go back to the “my private choice” thing, and I am left chilled to the bone with the knowledge that whatever kind of anti-vaxxer freak they are – whether they’re the hippie “I think bone broth cures everything” kind or the urban “I’m so hypereducated that I’ve lost touch with reality” kind – they really just don’t care that their actions might hurt other people.”

- Laura Miller
(via mysharona1987)

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - The Monster Under the Bed

Hovertext: Twist ending: The kid's eyes are pure white because he's a monster too! Spooooooky!

New comic!
Today's News:

Better Embedded System SW: Counter Rollover Bites Boeing 787

Counter rollover is a classic mistake in computer software. And it has just bitten the Boeing 787.

The Problem:

The Boeing 787 aircraft's electrical power control units shut down if powered without interruption for 248 days (a bit over 8 months). In the likely case that all the control units were turned on at about the same time, that means they all shut down at the same time -- potentially in the middle of a flight. Fortunately, the power is usually not left on for 8 continuous months, so apparently this has not actually happened in flight.  But the problem was seen in a long-duration simulation and could happen in a real aircraft. (There are backup power supplies, but do you really want to be relying on them over the middle of an ocean?  I thought not.) The fix is turning off the power and turning it back on every 120 days.

That's right -- the FAA is telling the airlines they have to do a maintenance reboot of their planes every 120 days.

(Sources: NY Times; FAA)


Just for fun, let's do the math and figure out what's going on.
248 days * 24 hours/day * 60 minutes/hour * 60 seconds/minute = 21,427,200 seconds
Hmmm ... what if those systems keep time as a 32-bit signed integer counting hundredths of a second? The maximum positive value for such a counter gives:
0x7FFFFFFF = 2,147,483,647 hundredths of a second = 21,474,836 seconds; divided by 86,400 seconds/day, that is 248.55 days.

If they had used a 32-bit unsigned integer, it would still overflow, just after twice as long: 497.1 days.
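As an illustration (not Boeing's actual code), here is a minimal Haskell sketch of how a 32-bit signed counter of hundredths of a second hits its limit right around the 248.55-day mark and then wraps to a large negative value:

```haskell
import Data.Int (Int32)

main :: IO ()
main = do
  let counterMax = maxBound :: Int32   -- 2147483647 hundredths of a second
      days = fromIntegral counterMax / 100 / 86400 :: Double
  print days                           -- roughly 248.55
  -- incrementing past the limit wraps around to minBound,
  -- a large negative number -- the classic rollover failure
  print (counterMax + 1)
```

Any code that compares such a counter against a "later" timestamp suddenly sees time running backwards after the wrap, which is the kind of condition that can trip a shutdown path.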

Other Examples:

This is not the first time a counter rollover has caused a problem.  Some examples are:

  • IBM: Interface adapters hang after 497 days of uptime [IBM]
  • Windows 95: hang after 49.7 days without reboot, counting in milliseconds [Microsoft]  
There are also plenty of date roll-over bugs:
  • Y2K: on 1 January 2000 (overflow of 2-digit year from 99 to 00)   [Wikipedia]
  • GPS: 1024 week rollover on 22 August 1999 [USCG]
  • Year 2038: Unix time will roll over on 19 January 2038 [Wikipedia]

There are also somewhat related capacity overflow issues such as 512K day for IPv4 routers.

If you want to dig further, there is a "zoo" of related problems on Wikipedia:  "Time formatting and storage bugs"

Perlsphere: Blog-Battle #14: Gift

After a few rather unpopular terms, this week's Blog-Battle gets easier again, because the topic of the week is: Gift. This is - I believe, at least - the first week in which an English word was given as the prompt.

TwitchFilm: Review: A FOOL, A Stark Reminder That In China Nice Guys Finish Last

Based on Hu Xuewen's novel Running Moonlight, actor Chen Jianbin's directorial debut is a harsh reminder of humanity's predatory nature, as an honest farmer's efforts to help a young homeless man set off a chain of calamitous events. Chen Jianbin was awarded both the Best Actor and Best New Director prizes for A Fool at last November's Golden Horse Awards in Taiwan (as well as the Best Supporting Actor prize for Paradise in Service), yet the film, for which he also wrote the screenplay, had its domestic release pulled earlier this year, after supporting performer Wang Xuebing was arrested for drug possession. A Fool is the latest in a string of gritty noirish dramas emerging from China, portraying the country as an almost Wild West frontier of...

[Read the whole post on]

Cowbirds in Love: Stranded

Last weekend, I decided a fun thing to do would be to walk 20 miles from Claymont, Delaware, where I live, to Newark, Delaware.

It’s about a seven hour walk.

On hour 1, I thought of this comic.

On hour 5, my leg started hurting a lot and I had to stop walking.

I’m not saying I can predict the future, but maybe my comics can?????

My leg is feeling fine now, by the way. Comic for 2015.05.03

New Cyanide and Happiness Comic

TwitchFilm: Stanley Film Festival 2015 Review: SUN CHOKE Brings Ambiguous Back

Writer-director Ben Cresciman's second feature, Sun Choke, premiered at the Stanley Film Festival this weekend. It's a film that's not easy to describe: imagine a fever dream sprung from the mind of an unreliable narrator who is at once a murderer and an epileptic, and who figures somewhere on the autism or Asperger's spectrum, and you get Sun Choke. Starring Sarah Hagan (Buffy The Vampire Slayer, Freaks and Geeks), Barbara Crampton (You're Next, Re-Animator), and Sara Malakul Lane (12/12/12, Sharktopus), Sun Choke takes place in the Hollywood Hills of Los Angeles. Hagan's Janie is being tenuously led back to health by Crampton's Irma while her father stays overseas for months working. We see a few flashbacks involving what looks to be a murder scene at home as well as a...

[Read the whole post on]

Perlsphere: Call For Grant Proposals (May 2015 Round)

Contribute to Perl and get some $$!

The Grants Committee is accepting grant proposals all the time. We evaluate them every two months and another evaluation period has come.

If you have an idea for doing some Perl work that will benefit the Perl community, consider sending a grant application. The application deadline for this round is 23:59 May 15th UTC. We will publish the received applications, get community feedback and conclude acceptance by May 30th.

The format will be the same as the previous rounds in 2014-2015.

To apply, please read How to Write a Proposal. Rules of Operation will also help you understand how the grant process works.

We will confirm the receipt of your application within 24 hours.

If you have further questions, please comment here. If your comment does not show up here within 24 hours, the chances are that the spam filter did something bad. Get in touch with me at tpf-grants-secretary at

Matt Might's blog: Discovering new diseases with the internet: How to find a matching patient

Genome and exome sequencing are the greatest diagnostic breakthroughs in the history of rare disease.

When sequencing identifies a genotype already associated with human disease, it can short-circuit years of costly and painful one-off disease tests.

But, if sequencing turns up “variants/mutations of uncertain clinical significance,” then a new kind of diagnostic odyssey unfolds.

Narrowing down which variant is responsible for a disorder may require “functional studies”: going to the lab to study cells or genetically modifying organisms in an attempt to link the mutations to the presentation of the disorder.

(Functional studies are not and are unlikely to ever be covered by insurance.)

Alternatively – and preferably – you can find a second patient to confirm discovery of the disorder.

This article describes how to use the internet to find a second case for a previously unknown genetic disorder.

If you find success with this approach, please email me to let me know how it worked out for you.

Click here to read the rest of the article

All Content: Ride


Helen Hunt's second film as a director, "Ride," is about a brittle book editor who heals old wounds and her strained relationship with her son by learning to surf. From its opening scenes, it settles into what seems like a familiar if very specific comic mode exemplified by such filmmakers as Woody Allen, Nancy Meyers ("Something's Gotta Give") and James L. Brooks ("Broadcast News" and "As Good as it Gets," for which Hunt won an Oscar).  The characters in these movies tend to be white, American, upper-middle class to rich, educated, and nervously talkative, and the whole thing tends to be pitched somewhere between a TV sitcom and a laid-back American indie film in which nothing too terribly upsetting happens onscreen. Lessons are learned, loves lost or found, quotable lines uttered, roll credits. None of which is meant to denigrate the movies as unworthy: well-done, they can be incredibly satisfying, and it's questionable whether they're any more or less trivial than a superhero film or "Godzilla" in the greater scheme (at least you get to see a version of the actual world on a movie screen). Just that usually you go in having a pretty good idea what you're in for, and you're rarely proven wrong.

"Ride," though, is a somewhat different animal. Hunt, who directs her own original screenplay, nails the Allen-Meyers-Brooks aspect of the film from the opening scene, in which her heroine Jackie banters with her college-bound son Angelo (Brenton Thwaites) about a short story he's written. The rat-a-tat rhythm of the dialogue, complete with people talking over and past each other, is highly structured, every word and pause carefully chosen to put across a particular rhythmic or emotional effect. Nobody's just winging it here. The rest of the setup for the story is similarly meticulous, with dialogue scenes and bits of physical comedy blocked as meticulously as anything you might see in a Broadway play (one of the cleverest is the scene where Angelo allays his mom's fears that he's moving too far away from her by walking her across the park, counting each step along the way). We deduce that this is a movie about learning to let go, with an empty nest story at its center, and we figure it'll manifest itself by showing us that Jackie is a control freak, like a lot of older rom-com heroines, and that her son chafes at that even though he obviously has a touch of it himself, and that they're both going to have to get used to the idea of not being in each other's lives 24/7.

But then Angelo goes to Venice, California to visit his father and his new family one last time before college and impulsively decides to drop out and just hang around the beach. Suddenly the whole temperament of the film changes, and you start to figure out that the tight-knit relationship between a single mother and her only son is not the main story here, but a gateway into the real main story. Suffice to say that once "Ride" gets around to explaining why Jackie and Angelo are so incredibly, perhaps unhealthily close (bickering and swapping in-jokes like a brother and sister, or an old married couple) and then has Jackie decide to stay in Venice for a while and learn to surf and open some long-locked doors in her psyche, the film goes darker than you expected it to go; but it explores that darkness with a plainspokenness that goes up to the edge of agonized psychodrama but somehow never punctures that feeling that Hunt established early on. This is a remarkable feat that required keen instincts to pull off. It's as if somebody had spliced bits of a French art house drama about grief, suffering and repression into "Spanglish" or "It's Complicated," without one aspect canceling out the other.

The film is high-strung, nervous and slightly chilly in the New York scenes, but once the action shifts to the beaches of Venice, it slows down considerably, and fittingly; the rhythm of the waves dictates the pace and color of the movie, and it makes sense to apply the brakes to a story about a woman who's always talking and working and running all over the place and defining and ranking and describing everything in her life. (Complaining about LA, she says, "By the time you drive to a museum, you've lost the will to look at a painting, so it's hardly a decision based on culture").

The slowing-down effect of Venice makes it possible for Jackie to actually live in reality rather than hurrying through it or avoiding it. This in turn encourages her to confront the real reason she's so controlling towards Angelo and so incapable of letting him go. Without giving away the movie's big reveal (it's horrifying, in an everyday sort of way, but not "surprising" in a movie way) I can say that once we face it along with Jackie and Angelo and their loved ones, the film seems to go through much the same revelatory experience that Jackie has on the beach as she learns how to ride a board with help from a handsome, slightly younger instructor and soon-to-be-love-interest named Ian (Luke Wilson). It chills out and loosens up. And then it goes deep—deeper than you expected—into pain. There's a long, wordless sequence out in the ocean, just Hunt on a surfboard, that has the ritualistic power of a sacrament.

"Ride" isn't going to win any prizes for cinematic innovation. Its comfort in privilege has an irritatingly unexamined quality (Jackie's got the sort of job where she can fly cross-country for several days and drive to and from the beach in a limo and not be fired immediately). And there are aspects that feel more generous than dramatically wise, such as most of the scenes involving Angelo in LA; I think it would've been all right to let him be a supporting character after the film leaves New York and slip toward the margins of the story, which is mainly Jackie's. Still, this is an unusual movie, especially when you look back and realize how usual it seemed going in.

Certain actors seem to understand their screen persona better than almost any director they've worked with. Hunt might be one of them. She's often been cast as a killjoy, a micro-manager, a voice of reason, or a somewhat abstract figure of mystery and longing, and she's written a part for herself that incorporates all of those shards. ("It sounds like hubris, but I know that once I get out there I am going to be better than average at intuiting how to do this," she says, explaining why she doesn't want surfing lessons at first.) But there's an earthier, warmer figure buried just beneath Jackie's surface, and the movie digs her out in a disarmingly natural way. Jackie can get high and laugh endlessly at a completely innocuous question, or march into her ex-husband's house and have a tearful meltdown that's been a long time coming while being keenly aware of how inappropriate and selfish it is, to the point of preemptively describing what she thinks the other characters in the room are thinking and feeling about her.

The movie is funny until suddenly it's not funny at all. Then it's funny again while discussing the fact that some things really aren't funny. And near the end, it's affecting without seeming to try too hard to be affecting. A lot of this has to do with the way Hunt invests ordinary words with strong emotion by having the characters deliver them in an offhand way. "Life is long," Ian tells Jackie. "It is," she replies. I hope a lot of people see this movie. It's not great, but it has greatness in it, and as a filmmaker, Hunt is the real deal.

Planet Haskell: Roman Cheplyaka: Smarter validation

Today we’ll explore different ways of handling and reporting errors in Haskell. We shall start with the well-known Either monad, proceed to a somewhat less common Validation applicative, and then improve its efficiency and user experience.

The article contains several exercises that will hopefully help you better understand the issues that are being addressed here.

Running example

{-# LANGUAGE GeneralizedNewtypeDeriving, KindSignatures, DataKinds,
             ScopedTypeVariables, RankNTypes, DeriveFunctor #-}
import Text.Printf
import Text.Read
import Control.Monad
import Control.Applicative
import Control.Applicative.Lift (Lift)
import Control.Arrow (left)
import Data.Functor.Constant (Constant)
import Data.Monoid
import Data.Traversable (sequenceA)
import Data.List (intercalate, genericTake, genericLength)
import Data.Proxy
import System.Exit
import System.IO
import GHC.TypeLits

Our running example will consist of reading a list of integer numbers from a file, one number per line, and printing their sum.

Here’s the simplest way to do this in Haskell:

printSum1 :: FilePath -> IO ()
printSum1 path = print . sum . map read . lines =<< readFile path

This code works as expected for a well-formed file; however, if a line in the file can’t be parsed as a number, we’ll get an unhelpful Prelude.read: no parse exception.

Either monad

Let’s rewrite our function to be aware of possible errors.

parseNum
  :: Int -- line number (for error reporting)
  -> String -- line contents
  -> Either String Integer
     -- either parsed number or error message
parseNum ln str =
  case readMaybe str of
    Just num -> Right num
    Nothing -> Left $
      printf "Bad number on line %d: %s" ln str

-- Print a message and exit
die :: String -> IO ()
die msg = do
  hPutStrLn stderr msg
  exitFailure

printSum2 :: FilePath -> IO ()
printSum2 path =
  either die print .
  liftM sum .
  sequence . zipWith parseNum [1..] .
  lines =<< readFile path

Now, upon reading a line that is not a number, we’d see something like

Bad number on line 2: foo

This is a rather standard usage of the Either monad, so I won’t get into details here. I’ll just note that there are two ways in which this version is different from the first one:

  1. We call readMaybe instead of read and, upon detecting an error, construct a helpful error message. For this reason, we keep track of the line number.
  2. Instead of throwing a runtime exception right away (using the error function), we return a pure Either value, and then combine these Eithers together using the Monad instance of Either.

The two changes are independent; there’s no reason why we couldn’t use error and get the same helpful error message. The exceptions emulated by the Either monad have the same semantics here as the runtime exceptions. The benefit of the pure formulation is that the semantics of runtime exceptions is built-in; but the semantics of the pure data is programmable, and we will take advantage of this fact below.
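To see these exception-like semantics concretely (a minimal standalone sketch): sequence over a list of Eithers aborts at the first Left, just as a thrown exception would abandon the rest of the computation:

```haskell
main :: IO ()
main =
  -- Only the first error survives; "e2" is never even examined,
  -- mirroring how a runtime exception cuts the computation short.
  print (sequence [Right 1, Left "e1", Left "e2", Right 4] :: Either String [Int])
  -- prints: Left "e1"
```

This short-circuiting is exactly what the Validation applicative below replaces with a programmable, error-accumulating semantics.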

Validation applicative

You get a thousand-line file with numbers from your accountant. He asks you to sum them up because his enterprise software mysteriously crashes when trying to read it.

You accept the challenge, knowing that your Haskell program won’t let you down. The program tells you

Bad number on line 378: 12o0

— I see! Someone put o instead of zero. Let me fix it.

You locate line 378 in your editor and replace 12o0 with 1200. Then you save the file, exit the editor, and re-run the program.

Bad number on line 380: 11i3

— Come on! There’s another similar mistake just two lines below. Except now 1 got replaced by i. If you told me about both errors from the beginning, I could fix them faster!

Indeed, there’s no reason why our program couldn’t try to parse every line in the file and tell us about all the mistakes at once.

Except now we can’t use the standard Monad and Applicative instances of Either. We need the Validation applicative.

The Validation applicative combines two Either values in such a way that, if they are both Left, their left values are combined with a monoidal operation. (In fact, even a Semigroup would suffice.) This allows us to collect errors from different lines.

newtype Validation e a = Validation { getValidation :: Either e a }
  deriving Functor

instance Monoid e => Applicative (Validation e) where
  pure = Validation . Right
  Validation a <*> Validation b = Validation $
    case a of
      Right va -> fmap va b
      Left ea -> either (Left . mappend ea) (const $ Left ea) b

The following example demonstrates the difference between the standard Applicative instance and the Validation one:

> let e1 = Left "error1"; e2 = Left " error2"
> e1 *> e2
Left "error1"
> getValidation $ Validation e1 *> Validation e2
Left "error1 error2"

A clever implementation of the same applicative functor exists inside the transformers package. Ross Paterson observes that this functor can be constructed as

type Errors e = Lift (Constant e)

(see Control.Applicative.Lift).

Anyway, let’s use this to improve our summing program.

printSum3 :: FilePath -> IO ()
printSum3 path =
  either (die . intercalate "\n") print .
  liftM sum .
  getValidation . sequenceA .
  map (Validation . left (\e -> [e])) .
  zipWith parseNum [1..] .
  lines =<< readFile path

Now a single invocation of the program shows all the errors it can find:

Bad number on line 378: 12o0
Bad number on line 380: 11i3

Exercise. Could we use Writer [String] to collect error messages?

Exercise. When appending lists, there is a danger of incurring quadratic complexity. Does that happen in the above function? Could it happen in a different function that uses the Validation applicative based on the list monoid?

Smarter Validation applicative

Next day your accountant sends you another thousand-line file to sum up. This time your terminal gets flooded by error messages:

Bad number on line 1: 27297.
Bad number on line 2: 11986.
Bad number on line 3: 18938.
Bad number on line 4: 22820.

You already see the problem: every number ends with a dot. This is trivial to diagnose and fix, and there is absolutely no need to print a thousand error messages.

In fact, there are two different reasons to limit the number of reported errors:

  1. User experience: it is unlikely that the user will pay attention to more than, say, 10 messages at once. If we try to display too many errors on a web page, it may get slow and ugly.
  2. Efficiency: if we agree it’s only worth printing the first 10 errors, then, once we gather 10 errors, there is no point processing the data further.

Turns out, each of the two goals outlined above will need its own mechanism.

Bounded lists

We first develop a list-like datatype which stores only the first n elements and discards anything else that may get appended. This primarily addresses our first goal, user experience, although it will be handy for achieving the second goal too.

Although for validation purposes we may settle with the limit of 10, it’s nice to make this a generic, reusable type with a flexible limit. So we’ll make the limit a part of the type, taking advantage of the type-level number literals.
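For readers new to type-level number literals, here is a minimal self-contained sketch of the technique (the function name limitOf is made up for illustration): the limit lives in the type, and natVal recovers it at the value level, exactly as BoundedList will do below:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, ScopedTypeVariables #-}
import Data.Proxy (Proxy(..))
import GHC.TypeLits (KnownNat, Nat, natVal)

-- Read a type-level Nat back at the value level.
limitOf :: forall (n :: Nat) proxy. KnownNat n => proxy n -> Integer
limitOf _ = natVal (Proxy :: Proxy n)

main :: IO ()
main = print (limitOf (Proxy :: Proxy 10))  -- 10
```

Because the limit is part of the type, two BoundedLists with different limits are different types, and the compiler keeps them from being mixed up by accident.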

Exercise. Think of the alternatives to storing the limit in the type. What are their pros and cons?

On the value level, we will base the new type on difference lists, to avoid the quadratic complexity issue that I allude to above.

data BoundedList (n :: Nat) a = BoundedList
    !Integer -- current length of the list
    (Endo [a])

Exercise. Why is it important to cache the current length instead of computing it from the difference list?

Once we’ve figured out the main ideas (encoding the limit in the type, using difference lists, caching the current length), the actual implementation is straightforward.

singleton :: KnownNat n => a -> BoundedList n a
singleton a = fromList [a]

toList :: BoundedList n a -> [a]
toList (BoundedList _ (Endo f)) = f []

fromList :: forall a n . KnownNat n => [a] -> BoundedList n a
fromList lst = BoundedList (min len limit) (Endo (genericTake limit lst ++))
  where
    limit = natVal (Proxy :: Proxy n)
    len = genericLength lst

instance KnownNat n => Monoid (BoundedList n a) where
  mempty = BoundedList 0 mempty
  mappend b1@(BoundedList l1 f1) (BoundedList l2 f2)
    | l1 >= limit = b1
    | l1 + l2 <= limit = BoundedList (l1 + l2) (f1 <> f2)
    | otherwise = BoundedList limit (f1 <> Endo (genericTake (limit - l1)) <> f2)
    where
      limit = natVal (Proxy :: Proxy n)

full :: forall a n . KnownNat n => BoundedList n a -> Bool
full (BoundedList l _) = l >= natVal (Proxy :: Proxy n)

null :: BoundedList n a -> Bool
null (BoundedList l _) = l <= 0


Now we will build the smart validation applicative which stops doing work when it doesn’t make sense to collect errors further anymore. This is a balance between the Either applicative, which can only store a single error, and Validation, which collects all of them.

Implementing such an applicative functor is not as trivial as it may appear at first. In fact, before reading the code below, I recommend doing the following

Exercise. Try implementing a type and an applicative instance for it which adheres to the above specification.

Did you try it? Did you succeed? This is not a rhetorical question, I am actually interested, so let me know. Is your implementation the same as mine, or is it simpler, or more complicated?

Alright, here’s my implementation.

newtype SmartValidation (n :: Nat) e a = SmartValidation
  { getSmartValidation :: forall r .
      Either (BoundedList n e) (a -> r) -> Either (BoundedList n e) r }
  deriving Functor

instance KnownNat n => Applicative (SmartValidation n e) where
  pure x = SmartValidation $ \k -> k <*> Right x
  SmartValidation a <*> SmartValidation b = SmartValidation $ \k ->
    let k' = fmap (.) k in
    case a k' of
      Left errs | full errs -> Left errs
      r -> b r

And here are some functions to construct and analyze SmartValidation values.

-- Convert SmartValidation to Either
fatal :: SmartValidation n e a -> Either [e] a
fatal = left toList . ($ Right id) . getSmartValidation

-- Convert Either to SmartValidation
nonFatal :: KnownNat n => Either e a -> SmartValidation n e a
nonFatal a = SmartValidation $ (\k -> k <+> left singleton a)

-- like <*>, but mappends the errors
(<+>)
  :: Monoid e
  => Either e (a -> b)
  -> Either e a
  -> Either e b
a <+> b = case (a,b) of
  (Right va, Right vb) -> Right $ va vb
  (Left e,   Right _)  -> Left e
  (Right _,  Left e)   -> Left e
  (Left e1,  Left e2)  -> Left $ e1 <> e2

Exercise. Work out what fmap (.) k does in the definition of <*>.

Exercise. In the definition of <*>, should we check whether k is full before evaluating a k'?

Exercise. We developed two mechanisms — BoundedList and SmartValidation, which seem to do about the same thing on different levels. Would any one of these two mechanisms suffice to achieve both our goals, user experience and efficiency, when there are many errors being reported?

Exercise. If the SmartValidation applicative was based on ordinary lists instead of difference lists, would we be less or more likely to run into the quadratic complexity problem compared to simple Validation?


Although the Validation applicative is known among Haskellers, the need to limit the number of errors it produces is rarely (if ever) discussed. Implementing an applicative functor that limits the number of errors and avoids doing extra work is somewhat tricky. Thus, I am happy to share my solution and curious about how other people have dealt with this problem.

Disquiet: Selected Ambient Kremlinology

Now at 190 total, the feed of tracks being uploaded by Aphex Twin (aka user48736353001) from his personal archive has expanded in recent days, after a two-month break, to include a wide range of sounds, among them industrial grinding and freeform experimentation. This piece, “6 Gear Smudge,” is the first in a while to come close to the level of ambient simplicity that is represented by the (currently) 10-track Selected Ambient Works Volume 3 playlist I made from the uploads on the user48736353001 account. With its prominent synth melody, it’s arguably too poppy to make that cut, but it’s still a great nugget.

My favorite in this recent batch, by far, is “3 Slothscrape,” which one commenter (user1789670, an account with no tracks associated with it) perceptively likens to the work of Pierre Bastien. It indeed has the warped, melting pacing that Bastien shares with Kid Koala and Gavin Bryars. The “6 Gear Smudge” track is dated “94 ish,” but there is no date associated with “3 Slothscrape.”

Aphex Twin, aka Richard D. James, isn’t just posting tracks from his relative seclusion. He’s also joining in the comments section. He explains to one commenter in the “6 Gear Smudge” thread that he created the hip-hop (“btw its all me inc. the scratchin”) heard in the introduction to the Chris Cunningham–directed “Windowlicker” video, and expresses disappointment about a longstanding beef with a fellow musician. At one point his generic account name appears not as “user 48736353001” but as the French “utilisateur 48736353001,” which suggests he was logged in from a different computer or location.

Tracks originally posted at

new shelton wet/dry: Langsome heels and langsome toesis

A paper on gender bias in academia was recently rejected by an academic journal, whose reviewer told the two female authors to “find one or two male biologists to work with” if they wanted to get their work published. That work, by the way, was a scientific survey of how and why men in academia tend [...]

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Cause and Effect

Hovertext: Ahhhhhhhh. Heh. Hehehe. HAHAHAHAHAHAHA!

New comic!
Today's News:

Planet Lisp: Dimitri Fontaine: Quicklisp and debian

Common Lisp users are very happy to use Quicklisp when it comes to downloading and maintaining dependencies between their own code and the libraries it uses.

Sometimes it is pointed out to me that, compared to other programming languages, Common Lisp is lacking a lot in the batteries-included area. After having had to package about 50 Common Lisp libraries for debian, I can tell you that I politely disagree with that.

And this post is about the tool and process I use to maintain all those libraries.

Quicklisp is good at ensuring a proper distribution of all those libs it supports and actually tests that they all compile and load together, so I've been using it as my upstream for debian packaging purposes. Using Quicklisp here makes my life much simpler as I can grovel through its metadata and automate most of the maintenance of my cl related packages.

It's all automated in the ql-to-deb software which, unsurprisingly, has been written in Common Lisp itself. It's a kind of Quicklisp client that fetches Quicklisp's current list of releases, with version numbers, and compares it to the list of packages managed for debian, in order to build new versions automatically.

My current workflow begins with using `ql-to-deb` to `check` for the work to be done today:

$ /vagrant/build/bin/ql-to-deb check
Fetching ""
Fetching ""
update: cl+ssl cl-csv cl-db3 drakma esrap graph hunchentoot local-time lparallel nibbles qmynd trivial-backtrace
upload: hunchentoot

After careful manual review of the automatic decision, let's just `update` everything that `check` decided needs it:

$ /vagrant/build/bin/ql-to-deb update
Fetching ""
Fetching ""

Updating package cl-plus-ssl from 20140826 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-plus-ssl.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/cl+ssl-20150302-git.tgz"
      md5: 61d9d164d37ab5c91048827dfccd6835
Building package cl-plus-ssl

Updating package cl-csv from 20140826 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-csv.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/cl-csv-20150302-git.tgz"
      md5: 32f6484a899fdc5b690f01c244cd9f55
Building package cl-csv

Updating package cl-db3 from 20131111 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-db3.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/cl-db3-20150302-git.tgz"
      md5: 578896a3f60f474742f240b703f8c5f5
Building package cl-db3

Updating package cl-drakma from 1.3.11 to 1.3.13.
     see logs in "//tmp/ql-to-deb/logs//cl-drakma.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/drakma-1.3.13.tgz"
      md5: 3b548bce10728c7a058f19444c8477c3
Building package cl-drakma

Updating package cl-esrap from 20150113 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-esrap.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/esrap-20150302-git.tgz"
      md5: 8b198d26c27afcd1e9ce320820b0e569
Building package cl-esrap

Updating package cl-graph from 20141106 to 20150407.
     see logs in "//tmp/ql-to-deb/logs//cl-graph.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/graph-20150407-git.tgz"
      md5: 3894ef9262c0912378aa3b6e8861de79
Building package cl-graph

Updating package hunchentoot from 1.2.29 to 1.2.31.
     see logs in "//tmp/ql-to-deb/logs//hunchentoot.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/hunchentoot-1.2.31.tgz"
      md5: 973eccfef87e81f1922424cb19884d63
Building package hunchentoot

Updating package cl-local-time from 20150113 to 20150407.
     see logs in "//tmp/ql-to-deb/logs//cl-local-time.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/local-time-20150407-git.tgz"
      md5: 7be4a31d692f5862014426a53eb1e48e
Building package cl-local-time

Updating package cl-lparallel from 20141106 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-lparallel.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/lparallel-20150302-git.tgz"
      md5: dbda879d0e3abb02a09b326e14fa665d
Building package cl-lparallel

Updating package cl-nibbles from 20141106 to 20150407.
     see logs in "//tmp/ql-to-deb/logs//cl-nibbles.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/nibbles-20150407-git.tgz"
      md5: 2ffb26241a1b3f49d48d28e7a61b1ab1
Building package cl-nibbles

Updating package cl-qmynd from 20141217 to 20150302.
     see logs in "//tmp/ql-to-deb/logs//cl-qmynd.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/qmynd-20150302-git.tgz"
      md5: b1cc35f90b0daeb9ba507fd4e1518882
Building package cl-qmynd

Updating package cl-trivial-backtrace from 20120909 to 20150407.
     see logs in "//tmp/ql-to-deb/logs//cl-trivial-backtrace.log"
Fetching ""
Checksum test passed.
     File: "/tmp/ql-to-deb/archives/trivial-backtrace-20150407-git.tgz"
      md5: 762b0acf757dc8a2a6812d2f0f2614d9
Building package cl-trivial-backtrace

Quite simple.

To be totally honest, I first had a problem with the parser generator library esrap, whose README documentation changed to be a file, and I had to tell my debian packaging about that. See the 0ef669579cf7c07280eae7fe6f61f1bd664d337e commit to ql-to-deb for details.

What about trying to install those packages locally? That's usually a very good test. Sometimes some dependencies are missing at the dpkg command line, so another apt-get install -f is needed:

$ /vagrant/build/bin/ql-to-deb install
sudo dpkg -i /tmp/ql-to-deb/cl-plus-ssl_20150302-1_all.deb /tmp/ql-to-deb/cl-csv_20150302-1_all.deb /tmp/ql-to-deb/cl-csv-clsql_20150302-1_all.deb /tmp/ql-to-deb/cl-csv-data-table_20150302-1_all.deb /tmp/ql-to-deb/cl-db3_20150302-1_all.deb /tmp/ql-to-deb/cl-drakma_1.3.13-1_all.deb /tmp/ql-to-deb/cl-esrap_20150302-1_all.deb /tmp/ql-to-deb/cl-graph_20150407-1_all.deb /tmp/ql-to-deb/cl-hunchentoot_1.2.31-1_all.deb /tmp/ql-to-deb/cl-local-time_20150407-1_all.deb /tmp/ql-to-deb/cl-lparallel_20150302-1_all.deb /tmp/ql-to-deb/cl-nibbles_20150407-1_all.deb /tmp/ql-to-deb/cl-qmynd_20150302-1_all.deb /tmp/ql-to-deb/cl-trivial-backtrace_20150407-1_all.deb
(Reading database ... 79689 files and directories currently installed.)
Preparing to unpack .../cl-plus-ssl_20150302-1_all.deb ...
Unpacking cl-plus-ssl (20150302-1) over (20140826-1) ...
Selecting previously unselected package cl-csv.
Preparing to unpack .../cl-csv_20150302-1_all.deb ...
Unpacking cl-csv (20150302-1) ...
Selecting previously unselected package cl-csv-clsql.
Preparing to unpack .../cl-csv-clsql_20150302-1_all.deb ...
Unpacking cl-csv-clsql (20150302-1) ...
Selecting previously unselected package cl-csv-data-table.
Preparing to unpack .../cl-csv-data-table_20150302-1_all.deb ...
Unpacking cl-csv-data-table (20150302-1) ...
Selecting previously unselected package cl-db3.
Preparing to unpack .../cl-db3_20150302-1_all.deb ...
Unpacking cl-db3 (20150302-1) ...
Preparing to unpack .../cl-drakma_1.3.13-1_all.deb ...
Unpacking cl-drakma (1.3.13-1) over (1.3.11-1) ...
Preparing to unpack .../cl-esrap_20150302-1_all.deb ...
Unpacking cl-esrap (20150302-1) over (20150113-1) ...
Preparing to unpack .../cl-graph_20150407-1_all.deb ...
Unpacking cl-graph (20150407-1) over (20141106-1) ...
Preparing to unpack .../cl-hunchentoot_1.2.31-1_all.deb ...
Unpacking cl-hunchentoot (1.2.31-1) over (1.2.29-1) ...
Preparing to unpack .../cl-local-time_20150407-1_all.deb ...
Unpacking cl-local-time (20150407-1) over (20150113-1) ...
Preparing to unpack .../cl-lparallel_20150302-1_all.deb ...
Unpacking cl-lparallel (20150302-1) over (20141106-1) ...
Preparing to unpack .../cl-nibbles_20150407-1_all.deb ...
Unpacking cl-nibbles (20150407-1) over (20141106-1) ...
Preparing to unpack .../cl-qmynd_20150302-1_all.deb ...
Unpacking cl-qmynd (20150302-1) over (20141217-1) ...
Preparing to unpack .../cl-trivial-backtrace_20150407-1_all.deb ...
Unpacking cl-trivial-backtrace (20150407-1) over (20120909-2) ...
Setting up cl-plus-ssl (20150302-1) ...
dpkg: dependency problems prevent configuration of cl-csv:
 cl-csv depends on cl-interpol; however:
  Package cl-interpol is not installed.

dpkg: error processing package cl-csv (--install):
 dependency problems - leaving unconfigured
dpkg: dependency problems prevent configuration of cl-csv-clsql:
 cl-csv-clsql depends on cl-csv; however:
  Package cl-csv is not configured yet.

dpkg: error processing package cl-csv-clsql (--install):
 dependency problems - leaving unconfigured
dpkg: dependency problems prevent configuration of cl-csv-data-table:
 cl-csv-data-table depends on cl-csv; however:
  Package cl-csv is not configured yet.

dpkg: error processing package cl-csv-data-table (--install):
 dependency problems - leaving unconfigured
Setting up cl-db3 (20150302-1) ...
Setting up cl-drakma (1.3.13-1) ...
Setting up cl-esrap (20150302-1) ...
Setting up cl-graph (20150407-1) ...
Setting up cl-local-time (20150407-1) ...
Setting up cl-lparallel (20150302-1) ...
Setting up cl-nibbles (20150407-1) ...
Setting up cl-qmynd (20150302-1) ...
Setting up cl-trivial-backtrace (20150407-1) ...
Setting up cl-hunchentoot (1.2.31-1) ...
Errors were encountered while processing:

Let's make sure that our sid users will be happy with the update here:

$ sudo apt-get install -f
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Correcting dependencies... Done
The following packages were automatically installed and are no longer required:
  g++-4.7 git git-man html2text libaugeas-ruby1.8 libbind9-80
  libclass-isa-perl libcurl3-gnutls libdns88 libdrm-nouveau1a
  libegl1-mesa-drivers libffi5 libgraphite3 libgssglue1 libisc84 libisccc80
  libisccfg82 liblcms1 liblwres80 libmpc2 libopenjpeg2 libopenvg1-mesa
  libpoppler19 librtmp0 libswitch-perl libtiff4 libwayland-egl1-mesa luatex
  openssh-blacklist openssh-blacklist-extra python-chardet python-debian
  python-magic python-pkg-resources python-six ttf-dejavu-core ttf-marvosym
Use 'apt-get autoremove' to remove them.
The following extra packages will be installed:
The following NEW packages will be installed:
0 upgraded, 1 newly installed, 0 to remove and 51 not upgraded.
3 not fully installed or removed.
Need to get 20.7 kB of archives.
After this operation, 135 kB of additional disk space will be used.
Do you want to continue? [Y/n] 
Get:1 sid/main cl-interpol all 0.2.1-2 [20.7 kB]
Fetched 20.7 kB in 0s (84.5 kB/s)
debconf: unable to initialize frontend: Dialog
debconf: (Dialog frontend will not work on a dumb terminal, an emacs shell buffer, or without a controlling terminal.)
debconf: falling back to frontend: Readline
Selecting previously unselected package cl-interpol.
(Reading database ... 79725 files and directories currently installed.)
Preparing to unpack .../cl-interpol_0.2.1-2_all.deb ...
Unpacking cl-interpol (0.2.1-2) ...
Setting up cl-interpol (0.2.1-2) ...
Setting up cl-csv (20150302-1) ...
Setting up cl-csv-clsql (20150302-1) ...
Setting up cl-csv-data-table (20150302-1) ...

All looks fine; time to sign those packages. There's a trick here: you want to be sure you're using a GnuPG setup that lets you enter your passphrase only once. See the ql-to-deb vm setup for details, and the usual GnuPG documentation if you're interested in the details.

$ /vagrant/build/bin/ql-to-deb sign
 signfile /tmp/ql-to-deb/cl-plus-ssl_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-plus-ssl_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-csv_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-csv_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-db3_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-db3_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-drakma_1.3.13-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-drakma_1.3.13-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-esrap_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-esrap_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-graph_20150407-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-graph_20150407-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/hunchentoot_1.2.31-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/hunchentoot_1.2.31-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-local-time_20150407-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-local-time_20150407-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-lparallel_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-lparallel_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-nibbles_20150407-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-nibbles_20150407-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-qmynd_20150302-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-qmynd_20150302-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files
 signfile /tmp/ql-to-deb/cl-trivial-backtrace_20150407-1.dsc 60B1CB4E
 signfile /tmp/ql-to-deb/cl-trivial-backtrace_20150407-1_amd64.changes 60B1CB4E
Successfully signed dsc and changes files

Ok, with everything tested and signed, it's time to upload our packages to the debian servers so that our dear debian users can enjoy newer and better versions of their beloved Common Lisp libraries:

$ /vagrant/build/bin/ql-to-deb upload
Trying to upload package to ftp-master (
Checking signature on .changes
gpg: Signature made Sat 02 May 2015 05:06:48 PM MSK using RSA key ID 60B1CB4E
gpg: Good signature from "Dimitri Fontaine <>"
Good signature on /tmp/ql-to-deb/cl-plus-ssl_20150302-1_amd64.changes.
Checking signature on .dsc
gpg: Signature made Sat 02 May 2015 05:06:46 PM MSK using RSA key ID 60B1CB4E
gpg: Good signature from "Dimitri Fontaine <>"
Good signature on /tmp/ql-to-deb/cl-plus-ssl_20150302-1.dsc.
Uploading to ftp-master (via ftp to
  Uploading cl-plus-ssl_20150302-1.dsc: done.
  Uploading cl-plus-ssl_20150302.orig.tar.gz: done.
  Uploading cl-plus-ssl_20150302-1.debian.tar.xz: done.
  Uploading cl-plus-ssl_20150302-1_all.deb: done.
  Uploading cl-plus-ssl_20150302-1_amd64.changes: done.
Successfully uploaded packages.

Of course, much the same text is then repeated for all the other packages.

Enjoy using Common Lisp in debian!

Oh, and remember: the only reason I've written ql-to-deb and signed myself up to maintain those umpteen Common Lisp libraries as debian packages is to be able to properly package pgloader in debian, as you can see at and in particular in the Other Packages Related to pgloader section of the debian source package for pgloader at

That level of effort goes toward respecting the Debian Social Contract, wherein debian assures its users that it's possible to rebuild anything from the sources found in the debian repositories.

s mazuk: Video

Open Culture: Read Noam Chomsky & Sam Harris’ “Unpleasant” Email Exchange

In 2013, we documented the acrimonious exchange between Noam Chomsky and Slavoj Žižek, which all started when Chomsky accused Žižek of “posturing–using fancy terms like polysyllables and pretending [to] have a theory when you have no theory whatsoever.” To which Žižek responded: “Chomsky, … always emphasizes how one has to be empirical, accurate… well I don’t think I know a guy who was so often empirically wrong in his descriptions…” And so it continued.

Two years later, Chomsky now finds himself in another fraught exchange — this time, with Sam Harris, author of The End of Faith and Letter to a Christian Nation. It’s a little hard to pin down when the dust-up first began. But, it at least goes back to January, when Harris took Chomsky to task (hear an excerpt of a longer podcast above) for drawing a moral equivalence between U.S. military action and the violence committed by some of America’s historical foes (e.g., the Nazis during WWII and later Al-Qaeda).

Over the past week, Chomsky and Harris continued the debate, trading emails back and forth. Their correspondence runs some 10,000 words, but it only amounts to what Harris ultimately calls “an unpleasant and fruitless encounter” that demonstrates the “limits of discourse.” It’s an exchange that Chomsky seemingly preferred to keep private (his permission to print the emails was grudging at best), and Harris saw some virtue in making public. The final email by Harris reads:

May 1, 2015

From: Sam Harris
To: Noam Chomsky


I’ve now read our correspondence through and have decided to publish it ( I understand your point about “exhibitionism,” but I disagree in this case.

You and I probably share a million readers who would have found a genuine conversation between us extremely useful. And I trust that they will be disappointed by our failure to produce one, as I am. However, if publishing this exchange helps anyone to better communicate about these topics in the future, our time won’t have been entirely wasted.


Whether Sam is right about that (is there something particularly instructive here?), you can decide. Here’s the entire exchange.

Dan Colman is the founder/editor of Open Culture. Follow us on Facebook, Twitter, Google Plus and LinkedIn and share intelligent media with your friends. Or better yet, sign up for our daily email and get a daily dose of Open Culture in your inbox.

Related Content:

Noam Chomsky Slams Žižek and Lacan: Empty ‘Posturing’

Slavoj Žižek Responds to Noam Chomsky: ‘I Don’t Know a Guy Who Was So Often Empirically Wrong’

Clash of the Titans: Noam Chomsky & Michel Foucault Debate Human Nature & Power on Dutch TV, 1971

Read 9 Free Books By Noam Chomsky Online

Comic for 2015.05.02

New Cyanide and Happiness Comic

Planet Haskell: Edward Z. Yang: Width-adaptive XMonad layout

My usual laptop setup is I have a wide monitor, and then I use my laptop screen as a secondary monitor. For a long time, I had two XMonad layouts: one full screen layout for my laptop monitor (I use big fonts to go easy on the eyes) and a two-column layout when I'm on the big screen.

But I had an irritating problem: if I switched a workspace from the small screen to the big screen, XMonad would still be using the full screen layout, and I would have to Alt-Tab my way into the two column layout. To add insult to injury, if I moved it back, I'd have to Alt-Tab once again.

After badgering the fine folks on #xmonad, I finally wrote an extension to automatically switch layout based on screen size! Here it is:

{-# LANGUAGE FlexibleInstances, MultiParamTypeClasses #-}

-- |
-- Module      :  XMonad.Layout.PerScreen
-- Copyright   :  (c) Edward Z. Yang
-- License     :  BSD-style (see LICENSE)
-- Maintainer  :  <>
-- Stability   :  unstable
-- Portability :  unportable
-- Configure layouts based on the width of your screen; use your
-- favorite multi-column layout for wide screens and a full-screen
-- layout for small ones.

module XMonad.Layout.PerScreen
    ( -- * Usage
      -- $usage
      ifWider
    ) where

import XMonad
import qualified XMonad.StackSet as W

import Data.Maybe (fromMaybe)

-- $usage
-- You can use this module by importing it into your ~\/.xmonad\/xmonad.hs file:
-- > import XMonad.Layout.PerScreen
-- and modifying your layoutHook as follows (for example):
-- > layoutHook = ifWider 1280 (Tall 1 (3/100) (1/2) ||| Full) Full
-- Replace any of the layouts with any arbitrarily complicated layout.
-- ifWider can also be used inside other layout combinators.

ifWider :: (LayoutClass l1 a, LayoutClass l2 a)
               => Dimension   -- ^ target screen width
               -> (l1 a)      -- ^ layout to use when the screen is wide enough
               -> (l2 a)      -- ^ layout to use otherwise
               -> PerScreen l1 l2 a
ifWider w = PerScreen w False

data PerScreen l1 l2 a = PerScreen Dimension Bool (l1 a) (l2 a) deriving (Read, Show)

-- | Construct new PerScreen values with possibly modified layouts.
mkNewPerScreenT :: PerScreen l1 l2 a -> Maybe (l1 a) ->
                      PerScreen l1 l2 a
mkNewPerScreenT (PerScreen w _ lt lf) mlt' =
    (\lt' -> PerScreen w True lt' lf) $ fromMaybe lt mlt'

mkNewPerScreenF :: PerScreen l1 l2 a -> Maybe (l2 a) ->
                      PerScreen l1 l2 a
mkNewPerScreenF (PerScreen w _ lt lf) mlf' =
    (\lf' -> PerScreen w False lt lf') $ fromMaybe lf mlf'

instance (LayoutClass l1 a, LayoutClass l2 a, Show a) => LayoutClass (PerScreen l1 l2) a where
    runLayout (W.Workspace i p@(PerScreen w _ lt lf) ms) r
        | rect_width r > w    = do (wrs, mlt') <- runLayout (W.Workspace i lt ms) r
                                   return (wrs, Just $ mkNewPerScreenT p mlt')
        | otherwise           = do (wrs, mlt') <- runLayout (W.Workspace i lf ms) r
                                   return (wrs, Just $ mkNewPerScreenF p mlt')

    handleMessage (PerScreen w bool lt lf) m
        | bool      = handleMessage lt m >>= maybe (return Nothing) (\nt -> return . Just $ PerScreen w bool nt lf)
        | otherwise = handleMessage lf m >>= maybe (return Nothing) (\nf -> return . Just $ PerScreen w bool lt nf)

    description (PerScreen _ True  l1 _) = description l1
    description (PerScreen _ _     _ l2) = description l2

I'm going to submit it to xmonad-contrib, if I can figure out their darn patch submission process...
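To show how this slots into a configuration, here is a minimal xmonad.hs sketch (a config fragment, not standalone runnable: it assumes xmonad plus a build that includes the PerScreen module above). The 1280-pixel threshold and the Tall parameters are just the example values from the $usage comment:

```haskell
-- Hypothetical ~/.xmonad/xmonad.hs wiring ifWider into the layoutHook:
-- two-column Tall (with Full as an alternative) on screens wider than
-- 1280 pixels, plain Full otherwise.
import XMonad
import XMonad.Layout.PerScreen (ifWider)

main :: IO ()
main = xmonad def
    { layoutHook = ifWider 1280 (Tall 1 (3/100) (1/2) ||| Full) Full
    }
```

Because the width test runs inside runLayout, moving a workspace between the two monitors picks the right layout automatically, with no Alt-Tabbing required.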

Paper Bits: bunabbit: becausebirds: Dutch “Cuddly Owl” finally caught on...



Dutch “Cuddly Owl” finally caught on video. This bird has been cuddling the citizens of this town for a while. It likes to land and stomp on people’s heads.

Watch the video

Paper Bits: abcnews: BREAKING UPDATE: State attorney: Warrant issued for...


BREAKING UPDATE: State attorney: Warrant issued for arrest of 6 officers in Freddie Gray case; various charges including murder, manslaughter, assault being leveled:

Paper Bits: vandisa: Thank you. Someone said it.


Thank you. Someone said it.

Re: Factor: File Server

Python has a neat feature that lets you serve files from the current directory.

# Python 2
python2 -m SimpleHTTPServer

# Python 3
python3 -m http.server

I always thought this was a quick and useful way to share files on a local network. Given that Factor has an HTTP server, we should be able to implement this!

We already have support for serving static content and serving CGI scripts, so we can quite simply implement a script that creates and launches an HTTP server for the current directory (or the one specified on the command line), logging HTTP connections to stdout.

This is available in the file-server vocabulary; now you can run:

factor -run=file-server [--cgi] [path]

Currently, this defaults to serving files on port 8080 from all available network interfaces. In the future, it would be nice to add the ability to specify the port and the network interfaces to bind.

s mazuk:


In the last few days before I stop being an ~independent developer~, I want to write about the qualities that pushed me to attempt an indie career. Maybe some of this is applicable to indie in general, maybe this is way off base anyways, I’m not trying to make some totally 100% applicable statement….

I’m gonna be starting a new job in a few days so I wrote about what made me try to be indie in the first place. 

Greater Fool - Authored by Garth Turner - The Troubled Future of Real Estate: The honourable FSBO

SIGN modified

Would you buy a used house from a used MP?

Tim’s not convinced, but he sure was surprised at what Maurice Vellacott stuffed in his door. “No words can describe the feeling when you get home,” he says, “and find a sitting MP has come to your house to drop off a flyer that he’s selling his house and will give you a deal in a private sale.

“That’s a sure sign housing is going to crash……. I know you’re against private sales, but surely it’s safe to buy a house from a guy with “D. Min” and “MP” at the end of his name, right? Right???”

Velacott’s been a Member of Parliament since first elected as a Reformer back in 1997. He and I sat in Parliament together for a while, but as a pro-life MP and bigtime theological dude, he had little in common. I think he may even has trashed my butt once or twice after the Conservatives punted me. But then, who didn’t?

In any case, Vellacott lives in Tim’s Ottawa hood and just blitzed it, handing this out:

MV modified

How does any of this possibly matter? Actually it wouldn’t if Maurice wasn’t trying to impress the neighbours by flashing his Parliamentary designation and Godly credentials. Or including his MP’s cellphone number. But because he’s not going to run in the coming election, the politician is clearly trying to bail out of the market when the bailing’s good – and also being cheap while he’s at it.

But cheap is what FSBO is all about. In this instance Vellacott says he’ll be listing his digs for $450,000 in two weeks, “but will sell for $435,000 without any agents involved.” Of course, a house listed for $450,000 in a soggy market like S’toon would probably fetch $410,000 or $420,000, and after commission that would give the MP about $395,000. No wonder he’s passing around flyers.

Even if he found a greater fool to pony up the full price, his after-commission net would be $427,500 – eight grand less than his ‘discounted’ special offer to his beloved constituents. So here’s more proof that FSBO means greed. The DIY seller wants to pay the agent nothing so he can pay himself more. If you think, as a buyer, you’re getting a deal because the commission saving is being passed on to you, think again.

Meanwhile, as Tim asks, is the divine MV’s planned exit a harbinger of tough times ahead?

Well, facts are facts. Sales in Ottawa are stagnant from a year ago, and so are prices. Even CMHC has been warning that “soft employment conditions will temper real estate demand”.

At the same time, stats from back home are also grim. Saskabush sales year/year have plunged by 10% while listings have swollen 16%. Values are already being affected, with the average down 4%. As you may know, this is consistent with a lot of secondary markets in Canada, despite the fact mortgage rates have never been at these levels before, CMHC is still encouraging people without money to buy houses and the media’s turned into endless house porn.

There’s a limit for most people as to how much debt they can choke down in an economy where incomes are stagnant, commodities are in the ditch, full-time jobs are elusive and yet house prices are bloated. Unlike the latte-sipping, hedonistic metrosexual urban butterflies in YVR or the GTA, good God-fearin’ prairie folks (and even those in Ottawa) know their limits. Apparently this is it.

Of course Maurice is bailing. How can you blame him? He’s done with the feds. It’s the smart move.

But don’t be tempted to buy his house and subsidize the guy. You already did that.

new shelton wet/dry: Damn your yellow stick. Where are we going?

Marketers often seek to minimize or eliminate interruptions when they deliver persuasive messages in an attempt to increase consumers’ attention and processing of those messages. However, in five studies conducted across different experimental contexts and different content domains, the current research reveals that interruptions that temporarily disrupt a persuasive message can increase consumers’ processing of [...]

new shelton wet/dry: (The mirage of the lake of Kinnereth with blurred cattle cropping in silver haze is projected on the wall.)

An ambitious effort to replicate 100 research findings in psychology ended last week — and the data look worrying. Results posted online on 24 April, which have not yet been peer-reviewed, suggest that key findings from only 39 of the published studies could be reproduced. […] The results should convince everyone that psychology has a replicability [...]

new shelton wet/dry: Kicky Lacey, the pervergined, and Bianca Mutantini, her conversa

Eight studies explored the antecedents and consequences of whether people locate their sense of self in the brain or the heart. In Studies 1a–f, participants’ self-construals consistently influenced the location of the self: The general preference for locating the self in the brain rather than the heart was enhanced among men, Americans, and participants primed [...]

Open Culture: Hear the “Seikilos Epitaph,” the Oldest Complete Song in the World: An Inspiring Tune from 100 BC

Last summer, we featured a Sumerian hymn considered the oldest known song in the world. Given the popularity of that post, it seems we may have long underestimated the number of ancient-musicophiles on the internet. Therefore, we submit today for your approval the Seikilos epitaph, the oldest known complete musical composition — that is to say, a song that our 21st-century selves can still play and hear in its intended entirety, more or less as did the ancient Greeks who lived during the first-century (or thereabouts) era of its composition.

Seikilos epitaph

The Seikilos epitaph’s survival in one piece, as it were, no doubt owes something to its shortness. The Greeks could carve the entire thing onto the surface of a tombstone, exactly the medium on which the modern world rediscovered it in 1885 near Aidin, Turkey. Its lyrics, liberally brought into English, exhort us as follows:

While you live, shine

have no grief at all

life exists only for a short while

and time demands its toll.

The surface also bears an explanatory inscription about — and written in the voice of — the artifact itself:  “I am a tombstone, an image. Seikilos placed me here as an everlasting sign of deathless remembrance.” The Greeks, like many peoples in the ancient world of unvarnished mortality, relished a good memento mori, and this oldest complete song in the world offers one whose message still holds today, and which we can trace all the way to more recent words, like those of William Saroyan, when he said, “In the time of your life, live — so that in that good time there shall be no ugliness or death for yourself or for any life your life touches.”

Or for another interpretation, you can hear a modern, guitar-driven cover of the Seikilos epitaph by Vlogbrother and famous internet teacher Hank Green, in a truly striking example of two eras colliding. But of course, the YouTube era has also made everyone a critic. As one commenter perfectly put it, “I prefer his earlier stuff.”

Related Content:

Listen to the Oldest Song in the World: A Sumerian Hymn Written 3,400 Years Ago

What Ancient Greek Music Sounded Like: Hear a Reconstruction That is ‘100% Accurate’

Hear The Epic of Gilgamesh Read in the Original Akkadian and Enjoy the Sounds of Mesopotamia

Learn Latin, Old English, Sanskrit, Classical Greek & Other Ancient Languages in 10 Lessons

Colin Marshall writes on cities, language, Asia, and men’s style. He’s at work on a book about Los Angeles, A Los Angeles Primer, and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Paper Bits: Living with Everyday Violence in Black Baltimore

Living with Everyday Violence in Black Baltimore

Open Culture: Do Not Track: Interactive Film Series Reveals the Personal Information You’re Giving Away on the Web

If Facebook knows everything about you, it’s because you handed it the keys to your kingdom.  You posted a photo, liked a favorite childhood TV show, and willingly volunteered your birthday. In other words, you handed it all the data it needs to annoy you with targeted advertising.

(In my case, it’s an ancient secret that helped a middle aged mom shave 5 inches off her waistline. Let me save you a click: acai berries.)

Filmmaker Brett Gaylor (a “lefty Canadian dad who reads science fiction”) seeks to set the record straight regarding the web economy’s impact on personal privacy.

Watching his interactive documentary web series, Do Not Track, you’ll inevitably arrive at a crossroads where you must decide whether or not to share your personal information. No biggie, right? It’s what happens every time you consent to “log in with Facebook.”

Every time you choose this convenience, you’re allowing Google and other big time trackers to stick a harpoon (aka cookie) in your side. Swim all you want, little fishy. You’re not exactly getting away, particularly if you’re logged in with a mobile device with a compulsion to reveal your whereabouts.

You say you have nothing to hide? Bully for you! What you may not have considered is the impact your digital easy-breeziness has on friends. Your network. And vice versa. Tag away!

In this arena, every “like”—from an acquaintance’s recently launched organic skincare line to Star Trek—helps trackers build a surprisingly accurate portrait, one that can be used to determine how insurable you are, how worthy of a loan. Gender and age aren’t the only factors that matter here. So does your demonstrated extraversion, your degree of openness.

(Ha ha, and you thought it cost you nothing to “like” that acquaintance’s smelly strawberry-scented moisturizer!)

To get the most out of Do Not Track, you’ll want to supply its producers with your email address on your first visit. It’s a little counter-intuitive, given the subject matter, but doing so will provide you with a unique configuration that promises to lift the veil on what the trackers know about you.

What does it say about me that I couldn’t get my Facebook log-in to work? How disappointing that this failure meant I would be viewing results tailored to Episode 3’s star, German journalist Richard Gutjahr?

(Your profile… says that your age is 42 and your gender is male. But the real gold mine is your Facebook data over time. By analyzing the at least 129 things you have liked on Facebook, we have used our advanced algorithm techniques to assess your personality and have found you scored highest in Openness which indicates you are creative, imaginative, and adventurous. Our personality evaluation system uses Psycho-demographic trait predictions powered by the Apply Magic Sauce API developed at the University of Cambridge Psychometrics Centre.)

I think the takeaway is that I am not too on top of my privacy settings. And why would I be? I’m an extrovert with nothing to hide, except my spending habits, browsing history, race, age, marital status…

Should we take a tip from our high school brethren, who evade the scrutiny of college admissions counselors by adopting some ridiculous, evocative pseudonym? Expect upcoming episodes of Do Not Track to help us navigate these and other digital issues.

Tune in to Do Not Track here. You can find episodes 1, 2 and 3 currently online. Episodes 4-6 will roll out between May 12 and June 9.

Related Content:

The Internet’s Own Boy: New Documentary About Aaron Swartz Now Free Online

A Threat to Internet Freedom: Filmmaker Brian Knappenberger Explains Why Net Neutrality Matters

How Brewster Kahle and the Internet Archive Will Preserve the Infinite Information on the Web

Ayun Halliday, an author, illustrator, and Chief Primatologist of the East Village Inky zine, invites you to look into her very soul @AyunHalliday.

Quiet Earth: First Trailer for Rodney Ascher's New Doc THE NIGHTMARE!

Rodney Ascher broke onto the scene with his documentary Room 237 (review), which was all about the crazy theories and interpretations of Stanley Kubrick's horror classic "The Shining" that some folks have obsessed about over the years. His latest doc, The Nightmare, takes a look at a frightening condition that plagues thousands: sleep paralysis.

The movie chronicles eight different people who suffer from night terrors and sleep paralysis, which leave them trapped in awakened states of semi-consciousness, where they witness truly horrific visions but are unable to move.

"The Nightmare" opens in limited release and hits VOD on June 5th.

[Continued ...]

OUR VALUED CUSTOMERS: Seriously though...

Disquiet: via

Greetings #soundstudies #ui #ux

Cross-posted from

Perlsphere: Kindersalat

Zoe has a play kitchen in her room. Occasionally she also cooks tasty things there, most recently a salad as a side for dinner. We liked the idea, so she got to move right away: from the play kitchen into the real one.

Penny Arcade: News Post: Post Traumatic

Tycho: I had the weird-ass Brown And Green Zune, so I am not really in a position to defend myself; I made tremendous use of the device.  I think it’s my job to act as a halfway house for orphaned, exiled things; I have always had a fear that a preconception or the preconception of another will foreclose realities in some way.  I thought it was solid.  IP wrangling being what it is, some of its cool tricks couldn’t manifest.  It was a pretty fucked up time - you’ll recall that you couldn’t buy MP3s directly, yet.  Microsoft has a strange, almost…

Quiet Earth: Eli Roth's Student Film RESTAURANT DOGS is a Bit of Wonderful Madness

Crypt TV have exclusively released Eli Roth's student film Restaurant Dogs, a very strange affair and an oddly fascinating look inside the creative mind of a filmmaker who went on to become one of the biggest names in horror.

The film is obviously inspired by Tarantino, though this is pre-Pulp Fiction. Roth obviously knew where the industry would be going.

Perhaps not surprisingly, Roth's professors hated the film and were skeptical about graduating him after viewing it. It, of course, went on to win top honors at the Student Academy Awards, and the film was screened at MoMA. Take that, professors!

I won't spoil the proceedings with a synopsis. Just let it all rain down on you.

Watch the film on [Continued ...]

Quiet Earth: Insane Look Behind The Scenes of MAD MAX: FURY ROAD

"George may ask you to do things that seem insane..." you think?

The countdown to Mad Max: Fury Road is now in full swing (though around these parts, we've been counting down for what seems like nearly a year). There's already been quite a bit of coverage of the upcoming George Miller classic-in-the-making, but we've yet to see much of Miller himself talking about the movie. This new featurette fixes that.

Miller shares where the idea for the movie came from, Charlize Theron and Tom Hardy chime in on the insanity of the production and we get a ridiculously impressive glimpse behind the scenes of the coolest looking movie of the summer.

Mad Max: Fury Road opens May 15.

[Continued ...]

Better Embedded System SW: How To Report An Unintended Acceleration Problem

Every once in a while I get e-mail from someone concerned about unintended acceleration that has happened to them or someone they know.  Commonly they go to the car dealer and get told (directly or indirectly) that it must have been the driver's fault. I'm sure that must be a frustrating experience.

Fortunately, you can do more than just get blown off by the dealer (if you feel like that is what happened to you).  Visit the US Dept. of Transportation's complaint system and file a complaint with their Office of Defects Investigation (ODI):

What this does is put information into the database that DoT uses to look for unsafe trends in vehicles. ODI conducts defect investigations and administers safety recalls, and this database is a primary source of information for them.  Putting in an entry does not mean that anyone will necessarily get back to you about your particular complaint, but eventually if enough drivers have similar problems with a particular vehicle type, ODI is supposed to investigate. You should be sure to use several different words and phrases to thoroughly describe your situation since often this database is searched via key words. (That means that if they are looking for a particular trend they might only look at records that contain a specific word or phrase, not all records for that vehicle.)  You should include specifics, and in particular things that you can recall that would suggest it is not simply driver error. But, realize that the description you type in will be publicly available, so think about what you write. 

To be sure, this should not be the only thing you do.  If you believe you have a problem with your vehicle, you should talk to the dealer and perhaps escalate things from there. (If it happened to me I would at a minimum demand that a written problem report be created with the manufacturer's central defects office and demand a written response from the manufacturer's customer relations office to leave a record.)  But, if you skip the DoT database then one of the important feedback mechanisms independent of the car companies that triggers recalls won't have the data it needs to work. If you had an incident that did not result in a police report or insurance claim, reporting is especially important, since there is no other way for DoT or the manufacturer to even know it happened.

Even if you haven't suffered unintended acceleration, you might be interested to look at complaints others have filed for your vehicle type, which are publicly available. And of course you can report any defect you like, not just acceleration issues. 

 (For those who are interested in how the keyword search might be done, you can see a NHTSA Document for an example from the Toyota UA investigations.)

All Content: Thumbnails 5/1/15



"How Hollywood Keeps Out Women": An infuriating, essential investigation from Jessica P. Ogilvie at L.A. Weekly

“Women are not tapped for power jobs in Hollywood. Their numbers trail far behind the percentage of females in executive positions in other heavily male-dominated endeavors, including the military, tech, finance, government, science and engineering. In 2013, 1.9 percent of the directors of Hollywood's 100 top-grossing films were female, according to a study conducted by USC researcher Stacy L. Smith. In 2011, women held 7.1 percent of U.S. military general and admiral posts, 20 percent of U.S. Senate seats and more than 20 percent of leadership roles at Twitter and Facebook — and both companies now face gender-discrimination lawsuits. In the wake of the Sony email-hacking scandal, and following Patricia Arquette's rallying cry at the Oscars, some well-known Hollywood figures are openly saying that an ugly bias grips the liberal, charitable, Democrat-dominated movie industry. ‘You have to be protective and arrogant’ to direct, says screenwriter Diablo Cody, winner of the Academy Award for 2007's ‘Juno’. ‘Those are great qualities, but people hate it in women. We talk about this on set all the time. You'll hear about a male director throwing stuff; we just laugh, because a woman would never work again, no matter who it was. It would screw her in every aspect of her life. I just think that there's a deep, rotten core in society. To me, it is just straight-up misogyny.’”


"Attention, Filmmakers: Top Indie Directors on Why Color Correction Matters": Indiewire's Rick Castañeda chats with filmmakers Josephine Decker, Ryan Piers Williams and Sarah Adina Smith.

“In the color sessions that I've sat in on, one thing I've always felt self-conscious about is the language of color, and when to say things like ‘crunch the blacks’ and ‘warm up those highlights.’ Decker says she often uses non-color terminology. ‘Make it scary, make it beautiful,’ she said. One of her references for the film was actually Super Mario Kart. ‘You know that level that is all black where you're on a rainbow track? Ultimately, we didn't go with ‘rainbow,’ but we did get super saturated,’ she said. ‘Thou Wast Mild and Lovely,’ a sensual thriller starring ‘love, death, guns, goats, and a farm in the wilds of Kentucky,’ had a very vibrant look, one that Decker describes as dark, brooding, with a brilliance. ‘Almost like beauty is lurking under the surface of the darkness,’ she said. Smith has a really great way of describing how she talks about color. ‘I speak in a weird language of feeling with everyone when I'm working,’ she said. ‘You're always trying to get at things you don't have words for, so you have to often dance around the meaning until it reveals itself.’”


"Spike Lee On Digital Viewing of Movies: 'It's Heartbreaking'": Variety's Brian Steinberg reports on the director's comments made at the publication's Entertainment and Technology conference.

“‘I know I’m a dinosaur,’ Lee said while holding forth at Variety’s Entertainment and Technology conference in New York, but ‘there’s something still for me actually being in a movie theater’ and seeing a film with a group of people. The director said he was taken aback by the notion that some people might watch ‘West Side Story,’ ‘Star Wars’ or ‘Apocalypse Now’ on a smartphone screen. ‘I know it’s not a popular view, but as a filmmaker — we kill ourselves with editing. With lighting. With sound,’ he said, adding: ‘It’s heartbreaking.’ Lee appeared with Marc Ecko, the media and fashion entrepreneur. Lee recently agreed to join Ecko’s Complex media and marketing business, becoming a board adviser for video products and branded content. Lee is no stranger to the world of advertising, having long overseen ad work at Spike DDB, part of the large DDB Worldwide unit of ad holding company Omnicom Group. The director said he felt at home tackling commercial work and filmmaking. ‘It’s just storytelling,’ he said. ‘That’s the way I approach it.’”


"All Things Shining in Terrence Malick's 'The Tree of Life'": At Indie Outlook, I've reposted my 2011 essay on Malick's divisive, exhilarating masterpiece, which premiered at Cannes four years ago this month.

“Once the film settles into the tale of Jack’s own evolution, the film reveals itself to be an ideal companion piece for Malick’s under-appreciated 1998 masterwork, ‘The Thin Red Line.’ Both films take place on a battlefield, though in the case of ‘Life,’ the war is one of ideas and the opposing armies are within the form of a mother and father. Mr. O’Brien’s worldview is both jaded and practical, whereas Mrs. O’Brien believes in the enriching power of grace. The father’s rigidity causes him to treat his children as if they were soldiers preparing for battle. He warns them of the dangers of being ‘too good,’ whereas Chastain teaches them to find the good in all things. To Malick, their warring philosophies seem to represent the ultimate dichotomy of the human spirit. It’s hard to watch scenes of their verbal altercations and not be reminded of the similar discussions in ‘Line’ that take place between Sean Penn and Jim Caviezel. In some ways, the adult Jack is an extension of Penn’s character in ‘Line,’ as he finds himself becoming like the father he had always loathed. He floats through life like a ghost, searching for the spark that had once ignited his soul.”


"Togetherness": Tyler Smith reviews Joss Whedon's "Avengers: Age of Ultron" at More Than One Lesson.

“These ‘Avengers’ movies are incredibly difficult to pull off, as they literally need to be greater than the sum of their parts. This can’t be ‘Iron Man and the Avengers.’ It has to be a group of equals. Whedon not only understands this, but underscores it in this film with the emphasis placed on the Hawkeye character. I remember when the first film came out, after watching the trailer, my wife incredulously said to me, ‘What’s Bow-and-Arrow Guy gonna do?’ It’s a good question; when we’ve got demigods and hulks running around, what can one man, albeit an extremely skilled one, contribute to the team? When it comes to the physical fighting, Hawkeye holds his own; he often realizes that he is outmatched but goes ahead anyway, and the film champions this as true heroism. However, we see that the Hulk, possibly the most powerful member of the team, brings with him tremendous liability and danger. Tony Stark, the smartest of the team, brings tremendous amounts of ego and hubris, to the extent that the primary threat in the film is literally created by him. While these men are remarkably powerful, they need somebody like Hawkeye and Black Widow to keep an emotional and philosophical balance. So, ultimately, everybody has a part to play, and every part is important.”

Image of the Day

Peter Jackson posted a lovely remembrance of the late cinematographer Andrew Lesnie on Facebook.

Video of the Day

BLACK FRIDAY: an appropriated video from Nelson Carvajal on Vimeo.

This brilliant 2013 video is one of many works from the acclaimed, Webby-nominated video essayist, writer and filmmaker Nelson Carvajal that will be screened for free as part of Oracle's Aperture Series in Chicago. Carvajal will participate in post-film discussion May 1st and 2nd, and I'll be on hand to discuss the films on May 3rd.

Colossal: Simon Stålenhag’s Retro Sci-Fi Images of a Dystopian Swedish Countryside Published In Two New Books


Across the backdrop of an expansive retro-Scandinavian landscape, Swedish illustrator Simon Stålenhag has spent the last few years imagining a world of science fiction inhabited by roaming mech robots, dinosaurs, and other technological innovations plopped right onto the Swedish countryside. The digitally painted images spread far and wide across the internet over the last few years, capturing the imagination of legions of fantasy and sci-fi fans who clamoured for comic books and even a feature film. For now, we’ll have to make do with old-fashioned art books.

Stålenhag and Free League Publishing just announced a Kickstarter project for two new books featuring Stålenhag’s dystopian vision of the future that will pair illustrations with short stories written in English. You can explore many more illustrations on his website (just start scrolling), and some are available as individual prints.

The Rhizome Frontpage RSS: Lyfe, Labor, Lunch

Lunch Bytes began as a series of panel discussions on the topic of art and digital culture in Washington D.C. in 2011 and 2012. Curated by Melanie Bühler and supported primarily by the Goethe Institut, it expanded across the European continent from 2013 to 2015, partnering with local institutions in nine cities and bringing together 112 “artists and experts” for 24 events. As a final hurrah to conclude the series, in March of this year 24 past participants were invited to a conference at the Haus der Kulturen der Welt in Berlin, with four panel discussions each corresponding to one of the overarching themes that quartered the series: Medium, Structures and Textures, Society, and Life. Bookending the conference panels were keynotes by art historians David Joselit and Melissa Gronlund, plus a summary panel at the end.

While in previous events each of these large headlines possessed a sub-heading and a detailed focus text for the panelists to address, the conference took much broader strokes, allowing freer interpretation of those topics. Content therefore took shape horizontally through the confluence of individual perspectives, geared by participants rather than through top-down direction.

I moderated the final themed panel of the conference, “Life”; interpreting this title became a springboard into the discussion. In my introduction I emphasized the fact that anything “life”-like can also be construed as a form of labor: particularly the practices of artistic representation, self-representation, and representational politics presumably at stake in this conference on digital society.

The first speaker, Cornelia Sollfrank, provided a historical context to those practices in relation to cyberfeminism, while simultaneously critiquing the generational position she felt she represented and the ahistoricism of contemporary practice that could imply. Second, Cecile B. Evans re-routed the expectation of artistic self-narrative by converting the platform into a “live” version of one of her multimedia projects, an oblique approach to representing both her work and her subject position. Lastly, Jesse Darling, who refrained from showing any images of her artworks, questioned the so-called authority of artistic production over other kinds of image making, and therefore the presumed authority (and pigeonholing) inherent in a speaking gig.

Darling posed the bind like this: “I was asked to this panel, ‘Life’, to talk about subjectivity and the self in digital technology. It’s unfortunate for me, since some years ago I fled into abstraction to escape the sidelining of my entire self and politics into the digital feminist discourse, whatever that is. On the other hand, it is my own fault; I should have stuck to 140 characters.”

Meta-discussion of the format in which a discussion takes place holds as much potential for circularity as for critique. In this case, the Life framework, which we all felt the need to address, functioned as a necessary provocation to trigger and propel the conversation beyond itself. To repeat something I once wrote about an artwork I like: “the work provides an outlet from the sphere of the art world within the scope of its influence.” Try replacing “the work” with “life.”

Here we are in the echo chamber.

All videos can be seen at

Open Culture: Miles Davis Opens for Neil Young and “That Sorry-Ass Cat” Steve Miller at The Fillmore East (1970)

miles fillmore east

The story, the many stories, of Miles Davis as an opening act for several rock bands in the 1970s make for fascinating reading. Before he blew the Grateful Dead’s minds as their opening act at the Fillmore West in April 1970 (hear both bands’ sets here), Davis and his all-star Quintet—billed as an “Extra Added Attraction”—did a couple nights at the Fillmore East, opening for Neil Young and Crazy Horse and The Steve Miller Band in March of 1970. The combination of Young and Davis actually seems to have been rather unremarkable, but there is a lot to say about where the two artists were individually.

Nate Chinen in at Length describes their meeting as a “minimum orbit intersection distance”—the “closest point of contact between the paths of two orbiting systems.” Both artists were “in the thrall of reinvention,” Young moving away from the smoothness of CSNY and into free-form anti-virtuosity with Crazy Horse; Davis toward virtuosity turned back into the blues. Miles, suggested jazz writer Greg Tate, was “bored fiddling with quantum mechanics and just wanted to play the blues again.” The story of Davis and Young at the Fillmore East is best told by listening to the music both were making at the time. Hear “Cinnamon Girl” below and the rest of Neil Young and Crazy Horse’s incredible set here. The band had just released their beautifully ragged Everybody Knows This Is Nowhere.

When it comes to the meeting of Davis and Steve Miller, the story gets juicier, and much more Miles: the difficult performer, not the impossibly cool musician. (It sometimes seems like the word “difficult” was invented to describe Miles Davis.) The trumpeter’s well-earned egotism lends his legacy a kind of rakish charm, but I don’t relish the positions of those record company executives and promoters who had to wrangle him, though many of them were less than charming individuals themselves. Columbia Records’ Clive Davis, who does not have a reputation as a pushover, sounds alarmed in his recollection of Miles’ reaction after he forced the trumpeter to play the Fillmore dates to market psychedelic jazz-funk masterpiece Bitches Brew to white audiences.

According to John Glatt, Davis remembers that Miles “went nuts. He told me he had no interest in playing for ‘those fu*king long-haired kids.’” Particularly offended by The Steve Miller Band, Davis refused to arrive on time to open for an artist he deemed “a sorry-ass cat,” forcing Miller to go on before him. “Steve Miller didn’t have his shit going for him,” remembers Davis in his expletive-filled autobiography, “so I’m pissed because I got to open for this non-playing motherfu*ker just because he had one or two sorry-ass records out. So I would come late and he would have to go on first and then when we got there, we smoked the motherfu*king place, and everybody dug it.” There is no doubt Davis and Quintet smoked. Hear them do “Directions” above from an Early Show on March 6, 1970.

“Directions,” from unreleased tapes, is as raw as they come, “the intensity,” writes music blog Willard’s Wormholes, “of a band that sounds like they were playing at The Fillmore to prove something to somebody… and did.” The next night’s performances were released in 2001 as It’s About That Time. Hear the title track above from March 7th. As for The Steve Miller Blues Band? We have audio of their performance from that night as well. Hear it below. It’s inherently an unfair comparison between the two bands, not least because of the vast difference in audio quality. But as for whether or not they sound like “sorry-ass cats”… well, you decide.

Related Content:

The Night When Miles Davis Opened for the Grateful Dead in 1970: Hear the Complete Recordings

Miles Davis’ Entire Discography Presented in a Stylish Interactive Visualization

Bill Graham’s Concert Vault: From Miles Davis to Bob Marley

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - This is a Fry-Jacking

Hovertext: "If we hadn't used real bullets, would people have paid attention?!"

New comic!
Today's News:

All Content: Book Review: "In the Company of Legends" by Joan Kramer and David Heeley


As a sideline to my regular gig as a film critic, I have done my fair share of interviews with actors and filmmakers over the years, either in print or in on-stage Q&A's, and like anyone who does that sort of thing on a semi-regular basis, I have a number of anecdotes that I am perfectly willing to share at the drop of a hat—going completely fanboy talking to legends like Roger Corman, Dick Miller and Russ Meyer, the incredibly awkward 10 minutes or so I spent with a non-English-speaking filmmaker while waiting for his interpreter to show up, the insane number of hoops I was forced to jump through in order to speak to Lara Flynn Boyle, and so on. Joan Kramer and David Heeley have done an incredible number of interviews with some of the most notable names in screen history and have acquired plenty of behind-the-scenes anecdotes along the way, and in "In The Company Of Legends" (Beaufort Books, $24.95), they share many of them in print for the first time.

While their names may not be immediately familiar to even the most ardent of film buffs, the work that Kramer and Heeley have done over the years certainly is. Having met when they were both working for New York's public television station WNET, they would eventually partner up and, between 1980 and 2005, go on to create a number of acclaimed documentary programs on some of the most famous names in Hollywood history—their subjects would include Fred Astaire, Katharine Hepburn, Paul Newman & Joanne Woodward, Spencer Tracy, James Stewart, Henry Fonda, Universal Studios, Columbia Pictures and the Group Theatre. This may not sound like a big deal today, when elaborate Blu-ray features of this type are common, but when they were starting out, there was not much interest in this kind of programming from the potential subjects (who could scuttle the entire thing if they disagreed with how things were going), the studios (who could withhold the all-important film clips that they controlled) and financiers (who were needed to foot the bills). Nevertheless, they persevered and their documentaries—which would be seen on such networks as WNET, TCM and Starz—would allow them to collect rave reviews, awards and a lot of stories along the way.

The anecdotes cover the gamut from legal minutiae (such as the occasional struggles to convince people to allow key film clips) to pure gossip (including Frank Sinatra turning up late and drunk for an appearance at a tribute to Spencer Tracy, and Katharine Hepburn's appalled reaction to a Michael Jackson concert that she attended) and some of them are undeniably fascinating—the discoveries of priceless home movies and behind-the-scenes footage found in closets, attics and even in a barn out back; Olivia de Havilland talking about her love for longtime co-star Errol Flynn; Hepburn at the MGM studios pointing out the very table where she and Greta Garbo tried in vain to convince Louis B. Mayer to produce a film version of Eugene O'Neill's "Mourning Becomes Electra" as a vehicle for them. The occasional rough spots that Kramer and Heeley hit also make for some entertaining moments in the narrative—an attempt to do a program with Bette Davis goes weirdly awry and Harvey Keitel turns up to be interviewed for a film on John Garfield in such a surly mood that exactly one comment could be used in the final product.

The trouble is that the truly memorable anecdotes are few and far between here, and after a while the background stories for the projects start to blend into each other, especially in a book running nearly 400 pages. With the rare exception of the likes of Davis and Keitel, virtually everyone winds up being as friendly and approachable as can be, and when a hiccup does occur, it is usually dealt with quickly and with little fuss. At worst, the stars they encounter sometimes take a little while to warm up, but they almost always do, and many of them wind up becoming friends of the filmmakers and contributors to later projects. Outside of the chapter regarding the film about the Group Theatre, a passion project for Joanne Woodward, there is precious little talk about the actual nuts and bolts of putting the films together. The biggest flaw, however, is an admittedly unavoidable one—we hear about the wonderful and revelatory interviews that the stars give but, of course, none of that material is included in the book.

Whether you will enjoy "In The Company of Legends" will depend in large part on what exactly it is you are hoping to get from it. If all you want is a bunch of stories along the lines of Katharine Hepburn dealing with the sight of a raspberry stain on her sofa or James Stewart dealing with an oddball who turns up at his house with the dream of meeting his idol (hint: it isn't Stewart), then it should fit the bill—it is breezily written and contains a number of rare behind-the-scenes photos of many of the participants (and even sketches by Stewart of his most beloved co-star, Harvey) as well. However, if you are in the mood for a more penetrating look at the lives and careers of these icons that helps to explain exactly what it was that made them so fascinating and unique, you are advised to look elsewhere—the actual films of Kramer and Heeley being an excellent first stop.

Planet Lisp: Quicklisp news: A small step in the right direction: https for

I had a joke slide at ELS last week that explained why Quicklisp was so easy to install: just use curl | sudo sh.  (Don't try this.) Although Quicklisp's installation isn't as risky as piping random code into a root shell, it does have its own problems. Several people at the conference asked me when I would add more security features to Quicklisp.

As of this week, the Quicklisp website is available through an https connection. Any requests that come in over http are redirected to the equivalent https location. That means you can have some confidence that the information there is provided by me, rather than intercepted and replaced by a third party.

The main Quicklisp website is only part of the story. The software to install and use Quicklisp is hosted on another domain. That domain now has optional https access, so that any URL may be accessed through either https or http.

That means the bootstrap file quicklisp.lisp is available via https, and so is the PGP key I use to sign client software and dist metadata. (That key is also available via various PGP keyservers.) If you have programs that fetch quicklisp.lisp or software archives directly, I encourage you to update them to use https instead of http.
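As a small illustration of that advice, here is a sketch in Python of upgrading a fetch script's URL scheme before downloading. The URL shown is a placeholder for illustration only, not the actual hosting location:

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    """Rewrite an http:// URL to https://, leaving the rest of the URL intact."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

# Placeholder URL for illustration; substitute the real quicklisp.lisp location.
print(force_https("http://example.org/quicklisp.lisp"))
```

Already-https URLs pass through unchanged, so the rewrite is safe to apply unconditionally.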

Why not use https exclusively? Unfortunately, the Quicklisp client code itself does not know how to connect via https, so turning off http access would break Quicklisp completely. It will take more time to update the Quicklisp client code to use https.

Implementing https is a small, but important, first step toward making the use of Quicklisp safer. If you have any questions or concerns, please get in touch.

CreativeApplications.Net: Rhythm Necklace

The latest instalment in a long tradition of compact and innovative iOS musical tools is Rhythm Necklace, a new app for generating and modulating melodies. Capitalizing on super legible circular step sequencing and a restrained interface, the app offers a tangible method for composing, iterating, and exporting audio. Rhythm necklaces are circular representations of repeating […]

BOOOOOOOM!: Music Video: Tahiti Boy & The Palmtree Family “All That You Are”


Love this video – flawless styling and art direction, and great storytelling from a pair of directors I’d never heard of before, Barrere & Simon. Watch “All That You Are” below.

View the whole post: Music Video: Tahiti Boy & The Palmtree Family “All That You Are” over on BOOOOOOOM!.

BOOOOOOOM!: Music Video: Phantoms “Voyeur”


This is one of my favourite videos of the year so far. Ace Norton’s marvelous black and white video for Phantoms is a masterpiece. Hats off to choreographers Tracy Phillips and Dominic Carbone, and the director of photography, Alexander Alexandrov. Watch “Voyeur” below.

View the whole post: Music Video: Phantoms “Voyeur” over on BOOOOOOOM!.

All Content: Kurt Cobain: Montage of Heck


I'll never forget the day I heard "Smells Like Teen Spirit". I had a show on my high school radio station at the time and once a week my co-host and I would sit in a booth and just randomly listen to what came in that day. Most of it was garbage, of course. The early ‘90s were not as wonderful a time for music as VH1 would have you believe. We knew of Nirvana and so the arrival of Nevermind grabbed our attention quickly. But neither of us were prepared for the first track. No one in the world was. I remember the look on my co-host's face. It’s that rare feeling when you know what you’re hearing is not only going to be popular but influential. We put it on the air immediately. And my obsession with grunge music began.

I offer this anecdote to make clear that my personal relationship with Nirvana is a very strong one (even if I have listened to Pearl Jam and Alice in Chains way more in the decades since)—I can also still picture a friend of mine openly weeping when she heard about Kurt Cobain’s suicide, too-few years later. However, even with my card-carrying fan status in mind, I think people on the other end of the grunge spectrum of taste would find something special in “Kurt Cobain: Montage of Heck”. Just as Nirvana took elements of music we had heard before and made them sound new, filmmaker Brett Morgen deconstructs the music documentary and makes it feel new again. In fact, this is one of the best music documentaries ever made.

“Montage of Heck” is the product of eight years of production by Morgen, the genius behind “The Kid Stays in the Picture”, "Chicago 10", and “Crossfire Hurricane,” and the first Nirvana doc authorized by the estate. It’s almost as remarkable for what it’s not as for what it is. It is not hagiography. A VAST majority of music documentaries are made “by fans for fans.” They merely gild the pedestal on which music history has already placed someone or argue for their underrated importance. There are also documentaries that serve the opposite purpose—to pull a Rock God from the heavens down to Earth. “Montage of Heck” doesn’t do that either. It is more intimate, more emotional, and more personal than either of those easy extremes. It is a true peek into the life of a private superstar. How did he become a rock icon? How did he turn his childhood pain into art? How did his emotional demons overtake him? These are much more difficult questions for a filmmaker to answer than “Nirvana vs. Pearl Jam” or other such garbage of the traditional rock doc. 

Morgen’s approach is as multi-layered as Nirvana’s music. There are interviews (including Courtney Love and Krist Novoselic, but, sadly, not Dave Grohl) but it’s FAR from a talking head doc as Morgen focuses on archival footage more than anything else. And it’s not a performance doc although Nirvana’s music can be heard through almost all two-hours-plus of the piece. Morgen varies styles—going from home movies to animated recreations of Kurt’s own autobiographical recordings to concert footage to interviews and back again. It all fits tonally with Nirvana. There’s something about listening to the music that Kurt would write later in life while watching home movies of a hyperactive young Cobain that almost feels like a music video that the band itself would have produced. At one point, Morgen plays a version of "All Apologies" that almost sounds like it's coming out of a child's music box, almost as if it's a melody that Cobain has in his head from childhood.

“Montage of Heck” is layered with emotion throughout. Morgen holds a shot after discussing Kurt’s emotional abuse as a child—“The sad part of the whole thing is that Kurt just really wanted to be with his mom”—and then those words lead into the opening lines of “Something in the Way”; it’s difficult not to cry at the little boy lost who turned that into art later in life. Morgen uses still photos, drawings Kurt made, animations, and then works with his editor to cut them together in rhythm with Nirvana’s music in ways that are completely mesmerizing.

“Kurt Cobain: Montage of Heck” gets slightly repetitive in the second half as Cobain’s issues with fame and relationship with Love dominate the narrative, but even those commonly-discussed chapters of the Cobain legacy have a different, almost tragic energy here. Watching Kurt joke around with Courtney (mocking Axl Rose and Chris Cornell) reveals the friendship dynamic of the two in a different way than we've seen before. And the filmmaking always crackles. Every song choice, every intimate home movie, every personal moment—they have been carefully chosen from eight years of research for maximum impact. There’s an important critical dictum when one reviews documentaries that we need to be careful to focus on the form of the filmmaking as much as the content. Filmmaking matters more than subject matter. In this case, both are brilliant.

"Kurt Cobain: Montage of Heck" premieres on HBO on Monday, May 4th at 9pm EST.

BOOOOOOOM!: Short Film: “Young Love” by Ariel Kleiman


A charming short film from a few years ago (only recently put online), written and directed by Ariel Kleiman. My kind of humour. Watch “Young Love” below.

View the whole post: Short Film: “Young Love” by Ariel Kleiman over on BOOOOOOOM!.

All Content: Welcome to Me


These are hyper-sensitive times. One wrongly worded tweet about an issue such as mental illness can unleash a torrent of social-media tongue lashings within seconds of posting.

Therefore, I must salute the efforts of director Shira Piven, who somehow manages to walk the fine line between mawkish and mocking with “Welcome to Me,” a humorous if occasionally horrific pitch-black satire about an unstable lottery winner from Palm Desert, Calif., who goes off her meds and invests her $86 million jackpot into a vanity talk show. Her one and only topic? Herself.

“Network,” “The King of Comedy,” “Being There,” “The Truman Show” and its cousin “EDtv”—“Welcome to Me” owes a debt to each and every one as it comments on the skewed state of celebrity and the ongoing popularity of reality shows. But those films arrived long before validating every living moment with a selfie became a national pastime and YouTube was deemed a legit outlet to discover fresh talent. Now everyone is their own brand.

What those films lacked, however, was a woman protagonist—an essential ingredient in “Welcome to Me,” with its distinctly feminine style of inward reflection and lashing out. Much early praise has been focused on its star, Kristen Wiig, who has always leaned towards the funny strange rather than the funny ha-ha side of comedy as exhibited by such “Saturday Night Live” alter egos as malevolent grade-school imp Gilly and baby-handed Lawrence Welk singer Dooneese. Since moving onto the big screen, however, she has added a nuanced dramatic edge to her skill set that finds her digging deeper inside the damaged psyches of her characters.

Wiig impressed when she headed an all-female ensemble as a sad-sack maid of honor in “Bridesmaids” and paired with fellow SNL alum Bill Hader as suicidal siblings in “The Skeleton Twins.” But “Welcome to Me” basically lives and dies by her performance, and, luckily, her Alice Klieg is a carefully and cunningly crafted creation, which exposes an undercurrent of pain and sorrow beneath her often placid, pixilated state. She is like Blanche DuBois crossed with Amy Adams’ cartoon princess come to life in “Enchanted,” a depressed half-aware naïf who doesn’t know the meaning of TMI and only cares about her own pain.

This Oprah-idolizing social misfit festooned with quirks (she likes to preface her remarks by announcing, “I have a prepared statement,” before hauling out a sheet of paper) and kinks (she wears a fanny pack, binges on string cheese, wears multi-hued bobby socks and keeps her TV on constantly even when she isn’t home) decides to use her sudden windfall to pay people to indulge her every whim no matter how bonkers they might be. Money, as they say, can’t buy happiness but it can grant your wish to make a grand entrance on TV in a swan boat while stiffly waving your arms like a demented Vanna White.

But one must give a goodly chunk of credit to the director (as well as to screenwriter Eliot Laurence) for allowing us to be amused by Alice’s ridiculously watchable self-actualizing journey in the public eye as well as being appalled by it. Certainly, Piven does a better job milking chuckles out of a touchy subject than filmmaker husband Adam McKay and producing partner Will Ferrell—both part of the team behind “Welcome to Me”—when they tackled prison rape in the recent “Get Hard.”

One of the wisest choices Piven made is to hire serious actors, not a gang of comedians, to support Wiig and stabilize the subject matter. Alice essentially provides all the loony tunes antics we need. While some fine performers like Jennifer Jason Leigh get lost in the shuffle, others manage to stand out: Tim Robbins as Alice’s long-suffering if naggy pill-pushing shrink; Linda Cardellini as her one and only friend; Wes Bentley as the on-air infomercial spokesman whose company produces Alice’s show and who becomes her lover; and James Marsden as his opportunistic brother who serves as the film’s Faye Dunaway counterpart as he encourages Alice’s crackpot decisions no matter the consequences.

Leave it to Joan Cusack—has she ever been less than terrific?—to be the one person to be able to divert our attention from Wiig as the show’s disgusted director who nevertheless occasionally engages in a lively on-air back and forth with Alice as a kind of unseen God-like persona from beyond. 

Somehow it makes sense that Piven, who coached her then-2-year-old daughter Pearl on the art of berating Ferrell in “The Landlord,” the viral short that became an Internet sensation and put Daddy’s comedy website Funny or Die on the map back in 2007, should be behind “Welcome to Me.” Alice is essentially playing “let’s pretend” as she slowly consumes a cake on camera made of meat loaf and frosted with potatoes (part of her regimen of regulating her moods with a high protein lifestyle), does a segment titled “Smelling Things Before They Happen” and stages re-enactments of wrongs done to her at the hands of other women from her past.

Of course, Alice becomes a late-night sensation, complete with a besotted grad-school fan who wants to do term papers on her comparing her art to Cindy Sherman. Naturally, she will have a massive public meltdown around the time she decides to neuter dogs live on camera. As for her illness, we never do get a real handle on the extent of Alice’s problem—the better to give us permission to laugh. And, eventually, the film will strike one of its lone false notes by delivering a happy ending instead of a satisfying one.  

BOOOOOOOM!: “Edifice” by Rogerio Silva


A beautiful dance choreographed and performed by Carmine De Amicis and Harriet Waghorn, and directed by Rogerio Silva. Watch “Edifice” below.

View the whole post: “Edifice” by Rogerio Silva over on BOOOOOOOM!.

BOOOOOOOM!: Documentary: Soulwax “Part of the Weekend Never Dies”


Director Saam Farahmand’s 2008 documentary following iconic Belgium-based musicians Soulwax on their worldwide tour is now on Vimeo (not sure why they waited until now). If you’ve never seen it, you should. They filmed 120 shows with a single camera. Just a heads up there is a little nudity here and there in case you’re at work blah blah blah. Watch “Part of the Weekend Never Dies” below.

View the whole post: Documentary: Soulwax “Part of the Weekend Never Dies” over on BOOOOOOOM!.

Blog: High Voltage Buck-Boost Regulator


The LTM8056 from Linear Technology is a 58 VIN, buck-boost μModule® (micromodule) regulator which requires just a few external passive components to complete the regulator design. Included in the package are the switching controller, power switches, inductor and support components. The basic external components needed are a single resistor to set the switching frequency, a resistor divider network to set the output voltage together with input and output capacitors. Other features such as input and output average current regulation may be implemented with just a few additional components. The LTM8056 operates with an input voltage ranging from 5 V to 58 V and can supply a regulated output voltage between 1.2 V and 48 V. The SYNC input and CLKOUT signal output provide clock synchronization options.
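As a rough illustration of sizing that output-voltage divider, here is a sketch. The reference voltage below is an assumed placeholder, not a datasheet value; consult the LTM8056 datasheet for the actual feedback reference and recommended resistor values.

```python
# Feedback-divider sizing for an adjustable regulator:
#   Vout = Vref * (1 + R_top / R_bottom)
V_REF = 0.8  # volts; ASSUMED placeholder value, check the datasheet

def top_resistor(v_out, r_bottom, v_ref=V_REF):
    """Return the top resistor value needed to hit a target output voltage."""
    return r_bottom * (v_out / v_ref - 1)

# Example: a 12 V output using a 10 kOhm bottom resistor.
print(round(top_resistor(12.0, 10e3)))  # value in ohms
```

The same relation, rearranged, lets you verify what output voltage a given pair of resistors will produce.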

High Voltage Buck-Boost Regulator – [Link]

Planet Lisp: ECL News: ECL Quarterly - Volume I



Hello everyone!

From time to time there are misleading signals that ECL is "floating dead". This FUD is mostly spread by IRC newcomers with little orientation, but it's useful! It is a signal that this particular implementation of Common Lisp needs to make a bit more noise, instead of silently fixing stuff (though it obviously needs a bit of work on the stuff as well ;-)).

Some projects make a new release each month to prevent such disinformation. That solves this particular problem but introduces a new one: it becomes unclear whether a new release brings any significant improvements or is just a version-number bump. Meanwhile, updating requires recompiling personal projects, which, depending on the machine, might take a while and is surely a bit annoying.

This is how the ECL Quarterly idea was born. To show a bit of activity from time to time, a post will be published every three months. I want to make it an e-zine containing info about development, tutorials, comparisons, benchmarks and anything that is at least slightly related to ECL. Everyone is welcome to contribute - such material will also be published on the wiki, added to the git tree, and, if appropriate, incorporated into the documentation. If you have such material, don't hesitate to contact me at dkochmanski[at]turtle-solutions[dot]eu.

This chapter highlights:

  • Changes, development status etc.
  • Contributing to ECL

If you have any suggestions regarding ECL Quarterly - whether you like it or maybe hate it - please tell me, either by commenting on this post or by writing an e-mail. Thank you!

Daniel Kochmański
Poznań, Poland
May 2015

Changes and development status

After the 15.3.7 release there are a few changes going on. We are now hosted at a new location (though it would be nice to move again, just not now), a few wiki entries have been added, and there is an official site. The domain is already bought (though not configured properly yet).

Until now we have updated libraries that we depend on. Namely:

  • libffi to version 3.2.1,
  • asdf to version 3.1.4,
  • bdwgc to version 7.4.2.

There was also an update of GMP to a more recent version, but we won't include it - yes, we are staying with LGPLv2, to remain GPLv2-compatible. All changes were tested on Windows with MSVC2012.

For now we have also reworked the ROTATEF, SHIFTF and PSETF macros to conform to the ANSI standard with regard to places (multiple values weren't handled properly).

The experimental branch contains the Android port (as the name suggests - experimental), which is preview-only for now (this branch is LGPLv3, because the included GMP enforces it, and it doesn't build on Windows).

Loads of things remain to be done, and we're slowly making progress. We won't change the release versioning scheme. There is no ETA for the next release, but I assure you - if you contribute, it will come sooner ;-). Which leads us to the next part.

Contributing to ECL

Documentation, tutorials, wiki, others

You can contribute in numerous ways. One way is to write code and fix bugs - and that's a very important part. But equally important is to find bugs and report them, so developers can improve the codebase. Feature requests are also welcome (however, we are focused now on fixing stuff rather than adding functionality, so even a good proposition might wait in a queue behind something less exciting).

Testing on various platforms and architectures is essential. Thanks to Anton Vodonosov and cl-test-grid it is as easy as setting up the environment, tweaking two configuration files and invoking the ./ script. A run may take a while (depending on the computer) and is limited to operating systems and architectures supported by Clozure Common Lisp.

If ECL doesn't build, crashes, or behaves contrary to the specification, please do report it. Reporting requires an account on GitLab, but setting one up is free. I'm still struggling to find the time to move all tickets from the old tracker to the above-mentioned issue tracking site - volunteers are more than welcome to do it.

If you encounter a problem, please write to the mailing list. There are many kind souls there more than willing to help - and usually they do. On the other hand, if someone asks for help and you know the answer - act! :-)

The wiki is a place where many resources are gathered. It is incomplete and barely usable, because I don't have time to improve it. If you successfully build ECL on Android - great! Please add a page to the wiki so others may reproduce your work! Do you have some nifty idea that you believe is worth keeping there? Do it. See a typo? A bug? Outdated information? You know what to do! Did you spot a nice blog post about ECL? Please share it with others - the wiki is the best place to do so. Did you complete a successful project using ECL? Share that information! All this, and so much more, may be done here:

You can also write something for ECL Quarterly (e-mail me at dkochmanski[at]turtle-solutions[dot]eu).

Source code contributions

Development takes place in the git repository located at

If you want your code in the Embeddable Common-Lisp project, please send a patch to the mailing list with the additional tag [PATCH] in the subject. Generally we want to follow the convention used by the U-Boot development team, which borrows a lot from Linux policy. Please read their guide; it's really worthwhile reading. If you submit a significant change, please note it in the CHANGELOG located in the top directory.

Basically, if you want to contribute code, you have at least two choices. You may improve the C code (which is probably less interesting for Lispers). Most of the sources are written in Lisp, however, and require no knowledge of C. Problems vary from fairly easy to hard conceptual riddles, so everyone can find something interesting. Please consult the appropriate directories under the src/ sub-directory (i.e. src/lsp, src/clos etc.) - it's pure Lisp! And it's fun to hack. Improving both the C and Lisp sources can be a great learning experience. Figuring out what's wrong often requires getting your hands dirty, and then cleaning up the proposed solution. This, combined with peer review, might be the next step to becoming a better programmer.

There is also a third part, which is tiresome (for me at least) - improving the build system, which is buggy and inconsistent. The first person to fix it earns a free dinner when we meet in person. Remember, however, that we support many operating systems, so it might be tricky to do this properly without introducing new problems.

If you are a library or application developer, please test against as many implementations as possible - it's hard and it takes time, but in my humble opinion it is essential for the CL ecosystem. Each implementation has its weak and strong sides, and you never know when you'll need one, or who is using your code and for what purpose :-).

Blog: Tiny real-time clock consumes only 240 nA

Micro Crystal RV8803C7

by Susan Nordyk @

Furnished in a ceramic surface-mount package that is just 3.2×1.5×0.8 mm, the RV-8803-C7 real-time clock module from Swiss manufacturer Micro Crystal consumes 240 nA and operates from a supply voltage as low as 1.5 V to increase the life of backup supplies. The device gives designers the option to replace expensive batteries and supercapacitors with low-cost multilayer ceramic capacitors for battery backup.

The temperature-compensated real-time clock is accurate to within ±3.0 ppm (±0.26 seconds/day) over a temperature range of -40°C to +85°C. In addition to low current consumption and high accuracy, the RV-8803-C7 has one of the smallest ceramic packages in the industry with an integrated 32.768-kHz quartz crystal. It operates from a supply voltage ranging from 1.5 V to 5.5 V and employs an I2C interface.
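The ppm figure and the seconds-per-day figure are two views of the same number; converting between them is simple arithmetic:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

def ppm_to_seconds_per_day(ppm):
    """Convert a clock accuracy in parts per million to drift in seconds per day."""
    return ppm * 1e-6 * SECONDS_PER_DAY

# +/-3.0 ppm works out to roughly +/-0.26 seconds per day, matching the quoted spec.
print(round(ppm_to_seconds_per_day(3.0), 2))
```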

Tiny real-time clock consumes only 240 nA – [Link]

Tudor Girba's blog: Talk at NDC Oslo 2015 on "Don't demo facts. Demo stories!"

This year, I will again have the pleasure of going to NDC Oslo 2015. This time I will talk about the demo-driven approach and the importance of storytelling in software engineering.

Here is the abstract of the talk:

Feedback is the central source of agile value. The most effective way to obtain feedback from stakeholders is a demo. That is what reviews are about. If a demo is the means to value, shouldn’t preparing the demo be a significant concern? Should the preparation of demos really be left for the last minute? Should it not be part of the definition of done?

Good demos engage. But, there is more to a good demo. A good demo tells a story about the system. This means that you can tell the story. And it also means that the system is made to tell that story, too. Not a user story full of facts. A story that makes users want to use the system.

Many things go well when demos come out right. Your system looks different. Stakeholders are in sync. Marketing does not have to lie. And even sales can sell better.

This talk tells stories of successful demos and distills demo-driven lessons from both working in research and in industry. These lessons are meant to be used in every day projects.

Penny Arcade: Comic: Post Traumatic

New Comic: Post Traumatic

Volatile and Decentralized: Flywheel: Google's Data Compression Proxy for the Mobile Web

Next week, we'll be presenting our work on the Chrome Data Compression proxy, codenamed Flywheel, at NSDI 2015. Here's a link to the full paper. Our wonderful intern and Berkeley PhD student Colin Scott will be giving the talk. (I'm happy to answer questions about the paper in the comments section below.)

It's safe to say that the paper would have never happened without Colin -- most of us are too busy building and running the service to spend the time it takes to write a really good paper. Colin's intern project was specifically to collect data and write a paper about the system (he also contributed some features and did some great experiments). It was a win-win situation since we got to offload most of the paper writing to Colin, and he managed to get a publication out of it!

Rather than summarize the paper, I thought I'd provide some backstory on Flywheel and how it came about. It's a useful story to understand how a product like this goes from conception to launch at a company like Google.

(That said, standard disclaimer applies: This is my personal blog, and the opinions expressed here are mine alone.)

Backstory: Making the mobile web fast

When I moved to Seattle in 2011, I was given the mission to start a team with a focus on improving mobile Web performance. I started out by hiring folks like Ben Greenstein and Michael Piatek to help figure out what we should do. We spent the first few months taking a very academic approach to the problem: Since we didn't understand mobile Web performance, of course we needed to measure it!

We built a measurement tool, called Velodrome, which allowed us to automate the process of collecting Web performance data on a fleet of phones and tablets -- launching the browser with a given URL, measuring a bunch of things, taking screenshots, and uploading the data to an AppEngine-based service that monitored the fleet and provided a simple REST API for clients to use. We built a ton of infrastructure for Velodrome and used it on countless experiments. Other teams at Google also started using Velodrome to run their own measurements and pretty soon we had a few tables full of phones and tablets churning away 24/7. (This turned out to be a lot harder than we expected -- just keeping them running continuously without having to manually reboot them every few hours was a big pain.)

At the same time we started working with the PageSpeed team, which had built the gold standard proxy for optimizing Web performance. PageSpeed was focused completely on desktop performance at the time, and we wanted to develop some mobile-specific optimizations and incorporate them. We did a bunch of prototyping work and explorations of various things that would help.

The downside to PageSpeed is that sites have to install it -- or opt into Google's PageSpeed Service. We wanted to do something that would reach more users, so we started exploring building a browser-based proxy that users, rather than sites, could turn on to get faster Web page load times. (Not long after this, Amazon announced their Silk browser for the Kindle Fire, which was very much the same idea. Scooped!)

Starting a new project

Hence we started the Flywheel project. Initially our goal was to combine PageSpeed's optimizations, the new SPDY protocol, and some clever server-side pre-rendering and prefetching to make Web pages load lightning fast, even on cellular connections. The first version of Flywheel, which we built over about a year and a half, was built on top of PageSpeed Service.

Early in the project, we learned of the (confidential at the time) effort to port Chrome to Android and iOS. The Chrome team was excited about the potential for Flywheel, and asked us to join their team to launch it as a feature in the new browser. The timing was perfect. However, the Chrome leadership was far more interested in a proxy that could compress Web pages, which is especially important for users in emerging markets, on expensive mobile data plans. Indeed, many of the original predictive optimizations we were using in Flywheel would have resulted in substantially greater data usage for the user (e.g., prefetching the next few pages you were expected to visit). It also turned out that compression is way easier than performance, so we decided to focus our efforts on squeezing out as many bytes as possible. (A common mantra at the time was "no bytes left behind".)

Rewriting in Go

As we got closer to launching, we were really starting to feel the pain of bolting Flywheel onto PageSpeed Service. Originally, we planned to leverage many of the complex optimizations used by PageSpeed, but as we focused more on compression, we found that PageSpeed was not well-suited to our needs, for a bunch of reasons. In early 2013, Michael Piatek convinced me that it was worth trying to rewrite the service, from scratch, in Go -- as a way of both doing a clean redesign from scratch but also leveraging Go's support for building Google-scale services. It was a big risk, but we agreed that if the rewrite wasn't bearing fruit in just a couple of months that we'd stop work on it and go back to PageSpeed.

Fortunately, Michael and the rest of the team executed at lightning speed and in just a few months we had substantially reimplemented Flywheel in Go, a story documented elsewhere on this blog. In November 2013 I submitted a CL to delete the thousands of lines of the PageSpeed-based Flywheel implementation, and we switched over entirely to the new, Go-based system in production.

PageSpeed Service in C++ was pushing 270 Kloc at the time. The Go-based rewrite was just 25 Kloc, 13 Kloc of which were tests. The new system was much easier to maintain, faster to develop, and gave our team sole ownership of the codebase, rather than having to negotiate changes across the multiple teams sharing the PageSpeed code. The bet paid off. The team was much happier and more productive on the new codebase, and we managed to migrate seamlessly to the Go-based system well before the full public launch.


We announced support for Flywheel in the M28 beta release of Chrome at Google I/O in 2013, and finally launched the service to 100% of Chrome Mobile users in January 2014. Since then we've seen tremendous growth of the service. More than 10% of all Chrome Mobile users have Flywheel enabled, with percentages running much higher in countries (like Brazil and India) where mobile data costs are high. The service handles billions of requests a day from millions of users. Chrome adoption on mobile has skyrocketed over the last year, and is now the #1 mobile browser in many parts of the world. We also recently launched Flywheel for Chrome desktop and ChromeOS. Every day I check the dashboards and see traffic going up and to the right -- it's exciting.

We came up with the idea for Flywheel in late 2011, and launched in early 2014 -- about 2.5 years of development work from concept to launch. I have no idea if that's typical at Google or anywhere else. To be sure, we faced a couple of setbacks which delayed launch by six months or more -- mostly factors out of our control. We decided to hold off on the full public release until the Go rewrite was done, but there were other factors as well. Looking back, I'm not sure there's much we could have done to accelerate the development and launch process, although I'm sure it would have gone faster had we been doing it as a startup, rather than at Google. (By the same token, launching as part of Chrome is a huge opportunity that we would not have had anywhere else.)

What's next?

Now that Flywheel is maturing, we have a bunch of new projects getting started. We still invest a lot of energy into optimizing and maintaining the Flywheel service. Much of the work focuses on making the service more robust to all of the weird problems we face proxying the whole Web -- random website outages causing backlogs of requests at the proxy, all manner of non-standards-compliant sites and middleboxes, etc. (Buy me a beer and I'll tell you some stories...) We are branching out beyond Flywheel to build some exciting new features in Chrome and Android to improve the mobile experience, especially for users in emerging markets. It'll be a while until I can write a blog post about these projects of course :-)

Comic for 2015.05.01

New Cyanide and Happiness Comic

s mazuk: munchanka: Last night, Paul Mendoza and I spoke to our...


Last night, Paul Mendoza and I spoke to our animation class about the concept of “Simple vs Complex,” and I thought some of you might find this useful. The idea is to balance a strong pose by contrasting simple and complex forms. The simple (stretching) side of a pose is usually your main line of action, while the complex (squashing) side is where you get most of the interest and the focal points of the pose. “Simple vs Complex” also works for individual parts like a flexing arm or a hand pose. This concept enhances clarity, appeal, and energy in any pose.

Bruce Timm’s style illustrates this concept best, but great actors (like the Python troupe) display this kind of posing all the time.

Have at it!

things magazine: Spot the ball

Fine art embroidery by Stephanie Clark / The Art of Forgery: The Minds, Motives and Methods of Master Forgers. See also Made in China, a deliberate forgery on display at Dulwich Picture Gallery / data crunching: 40 years of the American home vs porn data: visualising fetish space / what’s the equivalent of carrying coals to Newcastle?

Planet Haskell: Danny Gratzer: Bracket Abstraction: The Smallest PL You've Ever Seen

Posted on May 1, 2015
Tags: types, haskell

It’s well known that lambda calculus is an extremely small, Turing-complete language. In fact, most programming languages over the last 5 years have grown some (typed and/or broken) embedding of lambda calculus with aptly named lambdas.

This is wonderful and everything, but lambda calculus is actually a little complicated. It’s centred around binding and substituting for variables; while this is elegant, it’s a little difficult to formalize mathematically. It’s natural to wonder whether we can avoid dealing with variables by building up all our lambda terms from a special privileged few.

These systems (sometimes called combinator calculi) are quite pleasant to model formally, but how do we know that our system is complete? In this post I’d like to go over translating any lambda calculus program into a particular combinator calculus, SK calculus.

What is SK Combinator Calculus?

SK combinator calculus is a language with exactly 3 types of expressions.

  1. We can apply one term to another, e e,
  2. We have one term, s,
  3. We have another term, k.

Besides the obvious ones, there are two main rules for this system:

  1. s a b c = (a c) (b c)
  2. k a b = a

And that’s it. What makes SK calculus so remarkable is how minimal it is. We now show that it’s Turing complete by translating lambda calculus into it.
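To make the two rules concrete, here is a small evaluator sketch (mine, not part of the original post); it encodes SK terms as a Haskell datatype, matching the encoding used later in the post, and applies the two equations leftmost-outermost first.

```haskell
-- SK terms, with Eq/Show derived so results can be compared and printed.
data SK = S | K | SKAp SK SK deriving (Eq, Show)

-- Leftmost-outermost reduction of SK terms.
-- (Loops forever on terms with no normal form, as any faithful evaluator must.)
reduce :: SK -> SK
reduce (SKAp (SKAp K a) _)          = reduce a                             -- k a b = a
reduce (SKAp (SKAp (SKAp S a) b) c) = reduce (SKAp (SKAp a c) (SKAp b c)) -- s a b c = (a c) (b c)
reduce (SKAp l r) =
  let l' = reduce l
  in if l' == l then SKAp l (reduce r) else reduce (SKAp l' r)
reduce t = t

-- s k k behaves as the identity:
-- reduce (SKAp (SKAp (SKAp S K) K) K)  ==  K
```

For instance, reducing (s k k) k fires the s rule to give (k k) (k k) and then the k rule to give back k, which is exactly the identity behaviour exploited below.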

Bracket Abstraction

First things first, let’s just define how to represent both SK calculus and lambda calculus in our Haskell program.

    data Lam = Var Int | Ap Lam Lam | Lam Lam
    data SK  = S | K | SKAp SK SK

Now we begin by defining a translation from a simplified lambda calculus to SK calculus. This simplified calculus is just SK supplemented with variables. By defining this step, the actual transformation becomes remarkably crisp.

    data SKH = Var' Int | S' | K' | SKAp' SKH SKH

Note that SKH has variables, but no way to bind them. In order to remove a variable, we have bracket. bracket has the property that substituting a term, e', for Var 0 in a term, e, gives the same result as the application SKAp' (bracket e) e'.

    -- Remove one variable
    bracket :: SKH -> SKH
    bracket (Var' 0)    = SKAp' (SKAp' S' K') K'
    bracket (Var' i)    = SKAp' K' (Var' (i - 1))
    bracket (SKAp' l r) = SKAp' (SKAp' S' (bracket l)) (bracket r)
    bracket x           = SKAp' K' x

If we’re at Var 0 we replace the variable with the term s k k. This has the property that (s k k) A = A. It’s traditional to abbreviate s k k as i (leading to the name SKI calculus), but as we can see, i is strictly unnecessary.

Any other leaf (a different variable, shifted down by one, or one of the constants s and k) doesn’t mention the variable we’re removing, so we wrap it in k: since k A B = A, the substituted term is simply discarded, which is exactly what the specification of bracket demands for a term with no occurrence of the variable.

If we’re at an application, we do something really clever. We have two terms which both have a free variable, so we bracket them and use s to supply the free variable to both of them! Remember that

s (bracket A) (bracket B) C = ((bracket A) C) ((bracket B) C)

which is exactly what we require by the specification of bracket.

Now that we have a way to remove free variables from an SKH term, we can close off a term with no free variables to give back a normal SK term.

    close :: SKH -> SK
    close (Var' _) = error "Not closed"
    close S' = S
    close K' = K
    close (SKAp' l r) = SKAp (close l) (close r)

Now our translator can be written nicely.

    l2h :: Lam -> SKH
    l2h (Var i) = Var' i
    l2h (Ap l r) = SKAp' (l2h l) (l2h r)
    l2h (Lam h) = bracket (l2h h)

    translate :: Lam -> SK
    translate = close . l2h

l2h is the main worker here. It works over SKH because it needs to deal with open terms during the translation. Every time we go under a binder, we call bracket afterwards, removing the free variable we just introduced.

This means that if we call l2h on a closed lambda term we get back a closed SKH term. This justifies using close after the toplevel call to l2h in translate which wraps up our conversion.
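Putting the pieces together, here is a self-contained sketch of the whole pipeline that can be checked against a small reducer. (Two caveats: the leaf cases of bracket are wrapped in k here, since the specification of bracket requires (bracket e) e' to behave like e even when e doesn’t mention the variable; and reduce is an assumed helper of mine, not part of the post.)

```haskell
data Lam = Var Int | Ap Lam Lam | Lam Lam
data SK  = S | K | SKAp SK SK deriving (Eq, Show)
data SKH = Var' Int | S' | K' | SKAp' SKH SKH

bracket :: SKH -> SKH
bracket (Var' 0)    = SKAp' (SKAp' S' K') K'   -- the bound variable: s k k = i
bracket (Var' i)    = SKAp' K' (Var' (i - 1))  -- other variables ignore the argument
bracket (SKAp' l r) = SKAp' (SKAp' S' (bracket l)) (bracket r)
bracket x           = SKAp' K' x               -- constants likewise ignore it

close :: SKH -> SK
close (Var' _)    = error "Not closed"
close S'          = S
close K'          = K
close (SKAp' l r) = SKAp (close l) (close r)

l2h :: Lam -> SKH
l2h (Var i)  = Var' i
l2h (Ap l r) = SKAp' (l2h l) (l2h r)
l2h (Lam h)  = bracket (l2h h)

translate :: Lam -> SK
translate = close . l2h

-- Leftmost-outermost reduction, to check that translations behave correctly.
reduce :: SK -> SK
reduce (SKAp (SKAp K a) _)          = reduce a
reduce (SKAp (SKAp (SKAp S a) b) c) = reduce (SKAp (SKAp a c) (SKAp b c))
reduce (SKAp l r) =
  let l' = reduce l
  in if l' == l then SKAp l (reduce r) else reduce (SKAp l' r)
reduce t = t

-- λx. λy. x should act like k: applied to two arguments, it returns the first.
-- reduce (SKAp (SKAp (translate (Lam (Lam (Var 1)))) S) K)  ==  S
```

As a quick spot check, translate (Lam (Var 0)) produces s k k, and translating λx. λy. x yields s (k k) (s k k), which indeed hands back its first argument under reduction.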

For funsies I decided to translate the Y combinator and got back this mess

(s ((s ((s s) ((s k) k))) ((s ((s s) ((s ((s s) k)) k))) ((s ((s s) k)) k))))
((s ((s s) ((s k) k))) ((s ((s s) ((s ((s s) k)) k))) ((s ((s s) k)) k)))

Completely useless, but kinda fun to look at. More interestingly, the canonical nonterminating lambda term is λx. x x which gives back s i i, much more readable.

Wrap Up

Now that we’ve performed this translation, we have a very nice proof of the Turing completeness of SK calculus. This has some nice upshots: folks who study things like realizability models of constructive logics use Partial Combinatory Algebras as a model of computation, and these are essentially algebraic models of SK calculus.

If nothing else, it’s really quite crazy that such a small language is capable of simulating any computable function over the natural numbers.


Disquiet: Disquiet Junto Project 0174: Glove Songs


Each Thursday in the Disquiet Junto group on and at, a new compositional challenge is set before the group’s members, who then have just over four days to upload a track in response to the assignment. Membership in the Junto is open: just join and participate.

Tracks will be added to this playlist for the duration of the project.

This assignment was made in the early evening, California time, on Thursday, April 30, 2015, with a deadline of 11:59pm wherever you are on Monday, May 4, 2015.

These are the instructions that went out to the group’s email list (at

Disquiet Junto Project 0174: Glove Songs
Play something on your favorite instrument — wearing gloves.

Step 1: Think of the instrument on which you are most proficient. (Touch screen interfaces such as iOS devices are excluded from this project.)

Step 2: Choose a piece of music you’ve recorded that employs that instrument.

Step 3: Re-record that same piece of music, this time wearing gloves.

Step 4: Upload your track to the Disquiet Junto group on SoundCloud. When posting, if possible include a link to the original (non-gloves) version of the piece.

Step 5: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Deadline: This assignment was made in the early evening, California time, on Thursday, April 30, 2015, with a deadline of 11:59pm wherever you are on Monday, May 4, 2015.

Length: The length of your finished work should be roughly the length of the original.

Upload: Please when posting your track on SoundCloud, only upload one track for this assignment, and include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Title/Tag: When adding your track to the Disquiet Junto group on, please include the term “disquiet0174-glovesongs” in the title of your track, and as a tag for your track.

Download: It is preferable that your track is set as downloadable, and that it allows for attributed remixing (i.e., a Creative Commons license permitting non-commercial sharing with attribution).

Linking: When posting the track, please be sure to include this information:

More on this 174th Disquiet Junto project — “Play something on your favorite instrument — wearing gloves” — at:

More on the Disquiet Junto at:

Join the Disquiet Junto at:

Disquiet Junto general discussion takes place at:

Photo associated with this project by Myxi, used via Creative Commons license:

Greater Fool - Authored by Garth Turner - The Troubled Future of Real Estate: Reality

BEAR BIKE modified

And now, a reality check or two.

Despite a 20% heave in the price of oil, house sales in Cowtown are down 23% (much better than last month), while days-on-market have doubled and the median price has declined 2%. This is tame, of course, compared to Fort McMoney, but it’s a disaster compared to Calgary one year ago, or the GTA today.

Is there more to come? Probably. The best outcome would seem to be a flatlining market – but even that is awful news for those who swallowed the buy-now-or-buy-never flotsam local realtors were tossing around in 2014.

Fearless housing analyst and CREA Public Enemy #1, Ross Kay, is on it. He tells me that the last 36,000 Calgarians who bought a home in that one-horse city, “have all lost money since the day the offer was written.” Even if they somehow found a buyer and sold immediately in today’s modest dip, they’d be in the hole.

“Worse,” he adds, “by this time tomorrow when April’s sales stats hit, another 4,000 families are set to go negative, taking the number to 40,000.  To put this in simple terms, 9% of all owner-occupied homes in Calgary would create a loss if they sold at today’s appraised values.”

And here’s a factoid for all those discussing CMHC insurance on this blog (remember, this is insurance buyers pay to protect the lender, not themselves): there are now 10,000 CMHC-insured properties in Calgary which are underwater, where the mortgaged amount exceeds the appraised value. That is almost double the current number of active listings (5,800). “Luckily Canadians fear foreclosure,” says Kay, “even in Calgary.”

But maybe we should fear CMHC even more.

Did you see the 12-footer house in YVR that sold this week for $1.35 million? Think two-storey garden shed avec granite and a heated bidet. Meanwhile the average detached house in that city is now $1.4 million, and in the East, where all the poor people live, beater houses have just topped seven figures. I yakked yesterday with a psychiatrist who lives on the Westside, whose house has doubled in five years, to $2.7 million. “They’re all crazy,” he said. And he should know.

So houses in Van cost 11 times more than the people who live there earn in a year. Compared to the rent a house generates, the over-valuation by international standards is more than 80%. As you know, there was a near riot a weekend ago as people tried to get their hands on unbuilt micro-condos of 300 square feet in a distant burb for between $100,000 and $200,000. And Twitter lit up recently with a campaign from pouty, entitled local Millennials who want 1982 to come back, called #DontHave1Million.

But here’s the reality check. Don’t worry about YVR. It’s okay. Cheap, even. It’s Winnipeg we should be sweating over.

That, believe it or not, is the official position of Canada Mortgage and Housing, the almost-proud owner of ten thousand underwater houses in Calgary. This is the federal agency which lets children without savings buy properties they can’t afford, permits zero-down financings and has saddled the taxpayers with a liability equal in size to the federal debt. Now, astonishingly, these noobs tell us this: “high house prices do not necessarily imply overvaluation.” In isolation, true enough. And if the average YVR family was making $200,000, then an average house could reasonably change hands for about $800,000 to a million. But, of course, household income is barely above $70,000, family debt is increasing and BC residents have an overall negative savings rate.

The bureaucrats say YVR has the fundamentals to support higher prices, including land supply (a reasonable constraint), income (fail), population (60% less than the GTA while houses cost 40% more. Yeah, right) and interest rates (not unique to YVR, and soon to change). The CMHC conclusion: “Vancouver’s market is at low overall risk for a correction.”

Now, Regina and Winnipeg are way riskier – examples of “overvaluation in a lower priced market.” Izzatso? The average Peg family earns $76,000 – 7% more than in delusional Van – and yet the average Winnipeg house sells for just $277,000 – which is a quarter the price.

OK, so it’s Winnipeg. I get that.

But there are just two reasons why a speculative bubble has coated the left coast (and Toronto) with risk. Cheap money. And, especially, CMHC. The Canadian Moral Hazard Corporation.







Perlsphere: Curating old releases on CPAN

There are some distributions on CPAN that were last released 20 years or so ago. Understandably many of them don't follow many of the conventions that we expect today, and some of them fail all their tests, and have for a while. I think we should do something about these dists: either update them to be well-behaved modern distributions, or remove them from CPAN. They'll continue to be available on BackPAN. Here I'll go through a batch of the oldest.

Quiet Earth: Prometheus Scribe Jon Spaihts Enters CUBE Remake

The remake of Vincenzo Natali's CUBE is moving along quickly now at Lionsgate. Director Saman Kesh, who caught everyone's eye with his short film Controller (below), is now attached to direct the film, called "Cubed," described as "a re-imagining of the original movie."

Jon Spaihts, the original writer of Prometheus and the in-development scifi film Passengers (which I read and is awesome), is on board to produce.

The film is described as a sci-fi survival thriller about artificial intelligence, humanity and the birth of a new “digital” race. Interesting.

Phil Gawthorne is writing the script based on Kesh’s original take.

Vincenzo Natali's original Canadian thriller is simply one of the best genre exercises and has inspired coun [Continued ...]

The Rhizome Frontpage RSS: The Final Post: Computer Evolution on Law and Order


 The first computer on the show (1,1).

Combining endurance performance art and media studies, artist Jeff Thompson captured over 11,000 images of the show Law & Order while watching the complete original series over the course of 18 months, often at an increased frame rate in order to save time. Through these images, he tracked the computer's changing role on the show from its debut as a static background prop to its starring role as the focus of characters' attention and the basis of plotlines. Those images have been published to a Tumblr on an ongoing basis since the launch of the project; the final post went up today.

Clunky monitors slowly move to the front of the desk (5, 89)

Computers on Law & Order was created as a Rhizome commission in 2012 — an apt time to analyze the ongoing interdependence of technology and daily life. As Thompson immersed himself in television drama's interpretation of the rise of the internet and the appearance of the Blackberry, real-world consumers were being influenced by appification and the mainstream prevalence of the cloud. The ability to “binge watch”—coupled with the power to stream comfortably with near immediacy—is what ignited Thompson’s initial interest in the project; after obsessive detailing and the creation of logical infographics, his ultimate findings sound like an anthropological study of American culture:

“Law & Order is an even more interesting cultural artifact than I could have ever expected. The show forms a unique database of images and speech, and one that reflects the fascinations, fears, and biases of its time. Law & Order's long run and its ‘ripped from the headlines’ content makes it a useful lens through which to look at a period of great political and economic change in the United States.”

In addition to the Tumblr, a curated book of image selections was self-published by the artist last year, along with an accompanying essay on Rhizome.

A still from Law and Order episode #456, 2010