Penny Arcade: Comic: Success And Its Opposite

New Comic: Success And Its Opposite

Bifurcated Rivets: From FB


Bifurcated Rivets: From FB

Rather fine

Bad Science: Sarepta: anecdote, data, surrogate outcomes, and the FDA

Hackaday: A Beautiful Turntable With A Heart Of Concrete

On the face of it, playing a vinyl record is a simple process: you mount the record on a turntable rotating at the right speed, and lower a needle into the groove. A learning exercise for youngsters used to be building a passable record player on the kitchen table from a pencil, a large cork, a sharpened matchstick, and a piece of paper. It sounded awful, but it demonstrated well how the audio was recorded.

If you have ever looked into the operation of a more conventional turntable, though, you’ll know that a little more care and attention is needed. There are many factors which affect the quality of the sound, and you quickly become obsessive about tracking and about sources of the tiniest vibration. Someone who has followed this path is [Mjhara], who has made a very high quality turntable. There is an unusual choice in this project: the tonearm is part of the build, rather than being a commercial item fitted as in most turntable projects.

The platter is machined from a piece of rosewood, weighted and balanced with lead shot, and laminated between two sheets of brass. It sits on a bearing aided by a ring of opposing magnets, and is belt driven by a two-phase induction motor. The base of the turntable is cast as a single piece of concrete, the idea being that the extra weight will aid the damping of vibrations. The tonearm is machined from a piece of wood, and its pivot from brass. The tonearm bearing is a ballpoint pen, a surprising yet inspired choice.

Sometimes audiophiles take their quest for better sound to extremes, and justification for their expenditure can be very subjective. But [Mjhara] assures us that this turntable has an exceptionally good sound, and it is certainly a thing of beauty. Full details are in the Imgur gallery embedded below the break.

We’ve featured surprisingly few homemade audiophile turntables here at Hackaday, probably because classic examples aren’t hard to come by. This layered plywood example is probably the most striking we’ve shown you.

Thanks [Itay Ramot] for the tip.

Filed under: home entertainment hacks

BOOOOOOOM!: Photographer Spotlight: Vincent van de Wijngaard


A selection of photos by Vincent van de Wijngaard. More images below.

Recent additions: dynamic-plot

Added by leftaroundabout, Fri Sep 30 10:44:55 UTC 2016.

Interactive diagram windows

Recent additions: haxl

Added by algoriddle, Fri Sep 30 10:28:58 UTC 2016.

A Haskell library for efficient, concurrent, and concise data access.

BOOOOOOOM!: Collection of 80 High-Res Textless Movie Posters


Not sure how this exists??? Someone has uploaded a collection of 80 high-resolution film posters stripped of any text. Can’t tell if they have next-level Photoshop skills or some other kind of voodoo that made this possible. In any case, thank you, Internet. Have a look at them all here. I included a bunch of my favourites below (there’s a common thread to the ones I picked).

*Edit – just discovered that the link I posted was actually a repost and this was originally posted by Reddit user Join_You_In_The_Sun, so I’ve updated the link in the post.

Slashdot: Oscar Winners, Sports Stars and Bill Gates Are Building Lavish Bunkers

turkeydance quotes a report from Hollywood Reporter: Given the increased frequency of terrorist bombings and mass shootings and an underlying sense of havoc fed by divisive election politics, it's no surprise that home security is going over the top and hitting luxurious new heights. Or, rather, new lows, as the average depth of a new breed of safe haven that occupies thousands of square feet is 10 feet under or more. Those who can afford to pull out all the stops for so-called self-preservation are doing so -- in a fashion that goes way beyond the submerged corrugated metal units adopted by reality show "preppers" -- to prepare for anything from nuclear bombings to drastic climate-change events. Gary Lynch, GM at Rising S Bunkers, a Texas-based company that specializes in underground bunkers and services scores of Los Angeles residences, says that sales at the most upscale end of the market -- mainly to actors, pro athletes and politicians (who require signed NDAs) -- have increased 700 percent this year compared with 2015, and overall sales have risen 150 percent. "Any time there is a turbulent political landscape, we see a spike in our sales. Given this election is as turbulent as it is, we are gearing up for an even bigger spike," says marketing director Brad Roberson of sales of bunkers that start at $39,000 and can run $8.35 million or more (FYI, a 12-stall horse shelter is $98,500). Adds Mike Peters, owner of Utah-based Ultimate Bunker, which builds high-end versions in California, Texas and Minnesota: "People are going for luxury [to] live underground because they see the future is going to be rough. Everyone I've talked to thinks we are doomed, no matter who is elected."
Robert Vicino, founder of Del Mar, Calif.-based Vivos, which constructs upscale community bunkers in Indiana (he believes coastal flooding scenarios preclude bunkers being safely built west of the Rockies), says, "Bill Gates has huge shelters under every one of his homes, in Rancho Santa Fe and Washington. His head of security visited with us a couple years ago, and for these multibillionaires, a few million is nothing. It's really just the newest form of insurance."

Read more of this story at Slashdot.

Recent additions: manifolds

Added by leftaroundabout, Fri Sep 30 09:55:43 UTC 2016.

Coordinate-free hypersurfaces

Recent additions: hsparql 0.2.9

Added by RobStewart, Fri Sep 30 09:31:14 UTC 2016.

A SPARQL query generator and DSL, and a client to query a SPARQL server.

Recent additions: introduction-test

Added by Norfair, Fri Sep 30 09:27:11 UTC 2016.

A prelude for the tests of safe new projects

Dancer2-Plugin-Auth-Extensible-Provider-DBIC-0.602

authenticate via the Dancer2::Plugin::Auth::Extensible plugin

Blog – free electrons: Free Electrons at the Developer Conference 2016

Every year around September, the X.Org Foundation hosts the X.Org Developer Conference which, despite what its name suggests, is not limited to X.Org developers: it gathers developers from across the Linux graphics stack, including Mesa, Wayland, and other graphics stacks like ChromeOS, Android or Tizen.

This year’s edition was held last week at the Haaga-Helia University of Applied Sciences in Helsinki. At Free Electrons, we’ve been doing more and more work on the graphics stack recently, through the work we do on Atmel platforms and NextThing Co’s C.H.I.P., so it made sense to attend.

XDC 2016 conference

There were a lot of very interesting talks during those three days, as can be seen in the conference schedule, but we especially liked a few of them:

DRM HWComposer – Slides, Video

The opening talk was made by two Google engineers from the ChromeOS team, Sean Paul and Zach Reizner. They talked about the work they did on the drm_hwcomposer they wrote for the Pixel C, on Android.

The hwcomposer is one of the HALs in Android; it interfaces between SurfaceFlinger, the display manager, and the underlying display driver. It aims at providing hardware composition features, so that Android can leverage the capabilities of the display engine to perform composition (through planes and sprites) without having to use the CPU or the GPU for this work.

The drm_hwcomposer started out as yet another hwcomposer library implementation, for the tegra-drm driver in Linux. As they implemented it, it turned into an implementation generic enough to be useful for all DRM drivers, and they even introduced some particularly nice features, such as splitting the final screen content into several planes based on the actual displayed content rather than on windows, as is usually done.

Their work also helped to point out a few flaws in the hwcomposer API, that will eventually be fixed in a new revision of that API.

ARC++ – Slides, Video

The next talk was once again from a ChromeOS engineer, David Reveman, who came to present his work on ARC++, the component in ChromeOS that allows running Android applications. He mostly covered the display side.

In order to achieve that, he had to implement an hwcomposer that acts simply as a proxy between SurfaceFlinger and the Wayland compositor used on the ChromeOS side. GL rendering is still direct though, and each Android application talks directly to the GPU, as usual. Only the composition is forwarded to the ChromeOS side.

In order to minimize that composition process, whenever possible, ARC++ tries to back each application with an overlay so that the composition would happen directly in hardware.

This also led to some interesting challenges, especially since some of the assumptions of both systems are in contradiction. For example, any application can be resized in ChromeOS, while it’s not really a thing in Android where all the applications run full screen.

HDR Displays in Linux – Slides, Video

The next talk we found interesting was Andy Ritger from NVIDIA explaining how HDR displays are supposed to be handled in Linux.

He first started by explaining what HDR is exactly. While HDR itself is just about having a wider range of luminance than a regular display, HDR-capable displays often also offer a wider gamut. This means that on those screens you can display a wider range of colors, with better range and precision in their intensity. And while applications have been able to generate HDR content for more than 10 years, the rest of the display stack wasn’t really ready, meaning that you had to convert the HDR colors to colors your monitor was able to display, using a technique called tone mapping.

He then explained that the standard, non-HDR colorspace, sRGB, is not a linear colorspace. This means that by doubling the encoded luminance of a color, you will not get a color twice as bright on your display. It was designed this way because the human eye is much more sensitive to the various shades of a color when they are dark than when they are bright, which essentially means that the darker the color is, the more precision you want.
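
To make that non-linearity concrete, here is a minimal Python sketch of the standard sRGB transfer function (the piecewise gamma curve from the sRGB specification, IEC 61966-2-1); note how doubling the encoded value does far more than double the physical luminance:

```python
# A minimal sketch of the standard sRGB transfer function (IEC 61966-2-1).
def srgb_to_linear(v):
    """Decode an sRGB-encoded channel value in [0, 1] to linear luminance."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode linear luminance in [0, 1] with the sRGB transfer curve."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * l ** (1 / 2.4) - 0.055

# Doubling the encoded value does far more than double the luminance:
print(srgb_to_linear(0.5))                        # ~0.214, not 0.5
print(srgb_to_linear(1.0) / srgb_to_linear(0.5))  # ~4.7x for a 2x code change
```

On a linear colorspace such as scRGB, the encode/decode steps above simply disappear.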

However, the luminance “resolution” of an HDR display is so good that you actually don’t need that anymore, and you can use a linear colorspace, which is in our case scRGB.

But blindly drawing in scRGB in all your applications is obviously not a good solution either. You have to make sure that your screen supports it (which is exposed through its EDID), but also actually tell your screen to switch to it (through the infoframes). And that requires some support in the kernel drivers.

The Anatomy of a Vulkan Driver – Slides, Video

This talk by Jason Ekstrand was something of a war story of the bring-up Intel did of a Vulkan implementation on their GPUs.

He first pointed out that it was actually not such a long project, especially when you consider that they wrote it from scratch: it took roughly three full-time engineers eight months to come up with a fully compliant and open source stack.

He then explained why Vulkan was needed. While OpenGL did amazingly well at coping with hardware evolution, it was still designed over 20 years ago, and it has some core characteristics that are no longer really relevant and that hold application developers back. For example, he mentioned that at its core OpenGL is based on a singleton state machine, which obviously doesn’t scale well anymore on our SMP systems. He also mentioned that it was too abstracted, and that people just wanted a lower-level API, or wanted to render things off screen without X or any context.

Vulkan fixes this by effectively removing the state machine, which allows it to scale, and by pushing things like error checking and synchronization directly into the applications, making the implementation much simpler and less layered, which also simplifies development and debugging.

He then went on to discuss how code could still be shared between the two implementations: implementing OpenGL on top of Vulkan (an option that was discarded), having some kind of lighter intermediate language in Mesa to replace Gallium, or simply sharing the common bits through a library used by both the OpenGL and Vulkan implementations.

Motivating preemptive GPU scheduling for real-time systems – Slides, Video

The last talk that we want to mention is the talk on preemptive scheduling by Roy Spliet, from the University of Cambridge.

More and more industries, and especially the automotive industry, offload some computations to the GPU, for example to implement computer vision. In a car, this is used for autonomous driving, to make the car recognize signs or stay in its lane. And obviously this kind of computation is supposed to be handled by a real-time system, since you probably don’t want the rendering of your shiny heating user interface to take so long that your car crashes into the car in front of it.

He first started by explaining what real time means and what the usual metrics are, which should come as no surprise to people used to “CPU-based” real-time systems: latency, deadline, execution time, and so on.

He then showed a bunch of benchmarks he used to test his preemptive scheduler, in a workload that was basically running OpenArena while performing some computations, on various Nouveau-based platforms (both desktop-grade GPUs and embedded SoCs).

This led to some expected conclusions, like the fact that a preemptive scheduler does indeed add some overhead but is worth it on average, while some results were quite interesting. He observed, for example, some rare (0.3%) worst-case latencies that were actually interference from the display engine filling up its empty FIFOs and creating contention on the memory bus.
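
As a rough illustration of why such rare outliers matter for real-time analysis (and vanish in averages), here is a small Python sketch; the workload and all the latency numbers are made up for illustration, not Roy Spliet's data:

```python
import random

random.seed(1)

# Hypothetical frame-render latencies in ms: mostly ~5 ms, plus ~0.3% of
# frames inflated to stand in for memory-bus contention from the display
# engine refilling its FIFOs. These numbers are made up for illustration.
samples = [random.gauss(5.0, 0.5) for _ in range(10000)]
for i in random.sample(range(len(samples)), 30):  # 30/10000 = 0.3%
    samples[i] += random.uniform(10, 20)

def percentile(data, p):
    """Nearest-rank percentile of a list of samples."""
    s = sorted(data)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

# The average barely moves, but the high percentiles expose the tail:
print(f"mean : {sum(samples) / len(samples):.2f} ms")
print(f"p99  : {percentile(samples, 99):.2f} ms")
print(f"p99.9: {percentile(samples, 99.9):.2f} ms")
```

This is why real-time work is judged on worst-case or high-percentile latency against a deadline, not on mean throughput.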


Overall, this has been a great experience. The organisation was flawless, and the one-track-only format makes it easy to meet both the speakers and the attendees. The content was also highly technical, as you might expect, which taught us a lot and led us to think about some interesting developments we could do on our various projects in the future, such as NextThing Co’s C.H.I.P.

Number-ZipCode-JP-0.20160930

Validate Japanese zip-codes

Math-GSL-0.38

Interface to the GNU Scientific Library using SWIG

Hackaday: Amazon Offers $2.5M To Make Alexa Your Friend

Amazon has unveiled the Alexa Prize, a $2.5 million purse for the first team to turn Alexa, the voice service that powers the Amazon Echo, into a ‘socialbot’ capable of “conversing coherently and engagingly with humans on popular topics for 20 minutes”.

The Alexa Prize is only open to teams from colleges or universities; the winning team takes home $500,000 USD, with a further $1M awarded to the team’s college or university in the form of a research grant. Of course, the Alexa Prize grants Amazon a perpetual, irrevocable, worldwide, royalty-free license to make use of the winning socialbot.

It may be argued the Alexa Prize is a competition to have a chat bot pass a Turing Test. This is a false equivalency; the Turing Test, as originally formulated, requires a human evaluator to judge between two conversation partners, one of which is a human and one of which is a computer. Additionally, the method of communication is text-only, whereas the Alexa Prize will make use of Alexa’s Text to Speech functionality. The Alexa Prize is not a Turing Test, but only because of semantics. If you generalize the phrase ‘Turing Test’ to mean any test of natural language conversation, the Alexa Prize is a Turing Test.

This is not the first prize offered for a computer program that is able to communicate with a human in real time using natural language. Since 1990, the Loebner Prize, cosponsored by AI god Marvin Minsky, has offered a cash prize of $100,000 (and a gold medal) to the first computer that is indistinguishable from a human in conversation. Since 1991, yearly prizes have been awarded to the computer that is most like a human as part of the competition.

For any team attempting the enormous task of developing a theory of mind and consciousness, here are a few tips: don’t use Twitter as a dataset. Microsoft tried that, and their chatbot predictably turned racist. A better idea would be to copy Hackaday and our article-generating algorithm. Just use Markov chains and raspberry pi your way to arduino this drone.
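
For the curious, the Markov-chain approach the joke alludes to can be sketched in a few lines of Python; this toy version (not Hackaday’s actual generator, and with a made-up corpus) just records which word follows which and random-walks the table:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, n=8, seed=0):
    """Random-walk the chain for up to n words, starting from `start`."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < n and chain.get(out[-1]):
        out.append(rng.choice(chain[out[-1]]))
    return " ".join(out)

# A made-up corpus in the spirit of the joke:
corpus = "use a raspberry pi to fly your drone and use an arduino to drive your drone"
print(generate(build_chain(corpus), "use"))
```

Real chatbots need vastly more than this, which is rather the point of the prize.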

Filed under: news, robots hacks

Open Culture: When an Octopus Caused the Great Staten Island Ferry Disaster (November 22, 1963)

Where were you on November 22, 1963?

I had yet to be born, but am given to understand that the events of that day helped shape a generation.

Documentarian Melanie Juliano knows this too, though she’s still a few months shy of the legal drinking age. The 2014 recipient of the New Jersey Filmmakers of Tomorrow Festival’s James Gandolfini Best of Fest Award uses primary sources and archival footage to bring an immediacy to this dark day in American history, the day a giant octopus—“a giant fuckin’ octopus” in the words of maritime expert Joey Fazzino—took down the Cornelius G. Kolff and all 400 souls aboard.

What did you think I was talking about, the Kennedy assassination?


Image via the Facebook page of the Staten Island Ferry Octopus Disaster Memorial Museum

Those who would question this tragedy’s authenticity need look no further than a recently dedicated bronze memorial in Lower Manhattan’s Battery Park. To require more proof than that is unseemly, nay, cruel. If an estimated 90% of tourists stumbling across the site are willing to believe that a giant octopus laid waste to a Manhattan-bound Staten Island ferry several hours before John F. Kennedy was shot, who are you to question?

The memorial’s artist, Joe Reginella, of the Staten Island-based Super Fun Company, is finding it hard to disengage from a disaster of this magnitude. Instead the craftsman, whose previous work includes a JAWS tribute infant crib, lingers nearby, noting visitors’ reactions and handing out literature for the (non-existent) Staten Island Ferry Disaster Memorial Museum.

(New York 1 reports that an actual museum across the street from the address listed on Reginella’s brochures is not amused, though attendance is up.)

A Staten Island Octopus Disaster website is there for the edification of those unable to visit in person. Spend time contemplating this horrific event and you may come away inspired to learn more about the General Slocum disaster of 1904, a real-life New York City ferry tragedy that time has virtually erased from public consciousness.

(The memorial for that one is located in an out of the way section of Tompkins Square Park.)

H/T to reader Scott Hermes/via Colossal

Related Content:

The Dancer on the Staten Island Ferry

“Moon Hoax Not”: Short Film Explains Why It Was Impossible to Fake the Moon Landing

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Her play Zamboni Godot is opening in New York City in March 2017. Follow her @AyunHalliday.

When an Octopus Caused the Great Staten Island Ferry Disaster (November 22, 1963) is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Electronics-Lab: LTC3623 – Switching regulator doubles as Class-D audio amplifier


Clemens Valens discusses the LTC3623 switching regulator, which can be used as a Class-D audio amplifier.

Sure thing: Elektor has published several designs of adjustable power supplies based on switching regulators, so we know that doing this properly, in a reproducible way and without making things overly complex, requires some serious head scratching. The anxiety may be vastly reduced, though, by a new adjustable synchronous buck regulator which uses a single resistor to set its output voltage anywhere between 0 and 14.5 volts. Using the device is very simple; you can even use it as an audio amplifier.

LTC3623 – Switching regulator doubles as Class-D audio amplifier – [Link]

The post LTC3623 – Switching regulator doubles as Class-D audio amplifier appeared first on Electronics-Lab.

s mazuk: I think I want to do more frequent and completely mundane posting, and each post has a queue delay...

I think I want to do more frequent and completely mundane posting, and each post has a queue delay of a year. that way, I can avoid being embarrassed about my thoughts. those aren’t mine anymore, they’re from some fool that was in this angle around the sun 365 days ago. not me. I got a whole year on them.

Electronics-Lab: Chronio – Low power Arduino based (smart)watch


Max.K designed his own impressive watch based on an ATmega328P with the Arduino bootloader, a Maxim DS3231 RTC (<2 min per year deviation), and a 96×96 pixel Sharp Memory LCD (LS013B4DN04), all powered by a CR2025 160mAh coin cell battery.

Chronio is an Arduino-based 3D-printed Watch. By not including fancy Wifi and BLE connectivity, it gets several months of run time out of a 160mAh button cell. The display is an always-on 96×96 pixel Sharp Memory LCD. If telling the time is not enough, you can play a simplified version of Flappy Bird on it.
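
As a quick sanity check on that runtime claim, here is a back-of-envelope Python calculation; the four-month runtime is our assumption (the project only says "several months"), and self-discharge and cutoff voltage are ignored:

```python
def average_current_ua(capacity_mah, runtime_hours):
    """Average current draw in microamps implied by a battery capacity and a
    runtime. A back-of-envelope estimate: self-discharge, the coin cell's
    usable-capacity derating and cutoff voltage are all ignored."""
    return capacity_mah / runtime_hours * 1000

# "Several months" from a 160 mAh CR2025; assume four months as a guess:
runtime_h = 4 * 30 * 24  # 2880 hours
print(f"{average_current_ua(160, runtime_h):.0f} uA average draw")  # ~56 uA
```

An average draw in the tens of microamps is plausible for a sleeping AVR plus an always-on Sharp Memory LCD, which is exactly why that display was chosen.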

Chronio – Low power Arduino based (smart)watch – [Link]

The post Chronio – Low power Arduino based (smart)watch appeared first on Electronics-Lab.

Electronics-Lab: 16×32 RGB Matrix Panel Driver Arduino Shield


Raj has revised his RGB Matrix Display Shield into an improved version.

The shield now also carries the DS1307 RTC chip on board, along with a CR1220 coin cell battery holder on the back. It can drive the popular 16×32 RGB matrix panels with HUB75 (8×2 IDC) connectors. The row and column driver circuits are already built into the back side of these matrix panels, and the data and control signal pins for driving rows and columns are accessible through the HUB75 connector. Full color control requires 12 digital I/O pins of an Arduino Uno.

16×32 RGB Matrix Panel Driver Arduino Shield – [Link]

The post 16×32 RGB Matrix Panel Driver Arduino Shield appeared first on Electronics-Lab.

Electronics-Lab: Test application for the FPGA Tibbit in the smart LED controller configuration

This application example shows how to connect and use an RGBW LED strip with the TPS hardware platform. The main difficulty is that the LEDs have their own color-generation circuit inside. The new FPGA Tibbit #57 can generate the fast PWM signal needed for proper LED operation. The example also shows the main advantage of FPGA technology: it lets the user create any external interface and connect it easily to the TPS platform.

Test application for the FPGA Tibbit in the smart LED controller configuration – [Link]

The post Test application for the FPGA Tibbit in the smart LED controller configuration appeared first on Electronics-Lab.

Slashdot: Rosetta Spacecraft Prepares To Land On Comet, Solve Lingering Mysteries

sciencehabit writes from a report via Science Magazine: All good things must come to an end, and so it will be tomorrow when the Rosetta spacecraft makes its planned soft landing onto the surface of comet 67P/Churyumov-Gerasimenko, the culmination of 2 years of close-up studies. Solar power has waned as 67P's orbit takes it and Rosetta farther from the sun, and so the mission team decided to go on a last data-gathering descent before the lights go out. This last data grab is a bonus after a mission that is already changing theorists' views about how comets and planets arose early in the solar system. Several Rosetta observations suggest that comets form not from jolting mergers of larger cometesimals, meters to kilometers across, but rather from the gentle coalescence of clouds of pebbles. And the detection of a single, feather-light, millimeter-sized particle -- preserved since the birth of the solar system -- should further the view of a quiet birth. The report concludes: "A slew of instruments will keep gathering data as Rosetta approaches the surface at the speed of a gentle stroll. For team members whose instruments have already been turned off to conserve power, the ending is bittersweet -- but their work is far from over. Most instrument teams have only examined their own data, and are just now thinking about combining data sets. "We've just started collaborating with other teams," [Holger Sierks of the Max Planck Institute for Solar System Research in Gottingen, Germany, chief of Rosetta's main camera,] says. "This is the beginning of the story, not the end."

Read more of this story at Slashdot.

Penny Arcade: News Post: We Need A Back-End Developer/SysAdmin

Tycho: Here’s what you know: Critical: Apache, CDN management, MySQL, Varnish, web development experience, e-commerce experience, knowledge of PHP, jQuery, Javascript, HTML. Important, but you could learn it here: Expression Engine experience, Shopify experience. Bonus: computer science background, familiarity with CSS, developing sites for mobile, UX design, a cool hacker name like “Isolde” or “Matrix”. Here’s who you are: When you encounter a new challenge, it’s exciting. You can always think of something that could be a little, or even a lot better, and work to…

App-CdUtils-0.003

CLI utilities related to changing directories

MetaFilter: The World Passes 400 PPM Threshold. Permanently.

2016 will be the year that carbon dioxide officially passed the symbolic 400 ppm mark, never to return below it in our lifetimes. In the centuries to come, history books will likely look back on September 2016 as a major milestone for the world's climate.

As noted previously, xkcd provides a helpful infographic to illustrate what kind of temperature change we're talking about here (long scroll, the end is worth it).

Penny Arcade: News Post: More Thornwatch Art

Gabe: The big bad monster in the Thornwatch print and play is the Swamp Choir. I thought I’d share some of the progress pics I took while I was drawing it. Rodney said he was looking for a monster with multiple heads. So I went to work on the art side of it. I did a few different sketches but this is the one I really liked. I cooked up this crazy four-headed turtle and sent it over to Tycho. He came back with the name Swamp Choir and went to work on the lore while I started inking. As I was inking Tycho kept sending me info about this thing. How the different heads sing songs to lure…

Mojolicious-Plugin-RussianPost-0.01

Mojolicious Plugin Russian Post

Hackaday: Hackaday Prize Entry: Explore M3 ARM Cortex M3 Development Board

Even a cursory glance through a site such as this one will show you how many microcontroller boards there are on the market these days. It seems that every possible market segment has been covered, and then some, so why on earth would anyone want to bring another product into this crowded environment?

This is a question you might wish to ask of the team behind Explore M3, a new ARM Cortex M3 development board. It’s based around an LPC1768 ARM Cortex M3 with 64k of RAM and 512k of Flash running at 100MHz, and with the usual huge array of GPIOs and built-in peripherals.

The board’s designers originally aimed for it to be usable either as a bare-metal ARM or with the Arduino and Mbed tools. In the event, the response to their enquiries with Mbed led them to abandon that support. They point to their comprehensive set of tutorials as what sets their board apart from its competition, and in turn they deny trying to produce merely another Arduino or Mbed. Their chosen physical format is a compact dual-in-line board for easy breadboarding, not unlike the Arduino Micro or the Teensy.

If you read the logs for the project, you’ll find a couple of videos explaining the project and taking you through a tutorial. They are however a little long to embed in a Hackaday piece, so we’ll leave you to head on over if you are interested.

We’ve covered a lot of microcontroller dev boards here in our time. If you want to see how far we’ve come over the years, take a look at our round up, and its second part, from back in 2011.

Filed under: ARM, Microcontrollers, The Hackaday Prize

Comic for 2016.09.30

New Cyanide and Happiness Comic

Slashdot: The Americas Are Now Officially 'Measles-Free'

An anonymous reader quotes a report from The Verge: The Americas are now free of measles and we have vaccines to thank, the Pan American Health Organization said earlier this week. This is the first region in the world to be declared measles-free, despite longtime efforts to eliminate the disease entirely. The condition -- which causes flu-like symptoms and a blotchy rash -- is one of the world's most infectious diseases. It's transmitted by airborne particles or direct contact with someone who has the disease and is highly contagious, especially among small children. To be clear, there are still people with measles in the Americas, but the only cases develop from strains picked up overseas. Still, the numbers are going down: in the U.S. this year, there have been 54 cases, down from 667 two years ago. The last case of measles that developed in the Americas was in 2002. (It took such a long time to declare the region measles-free because of various bureaucratic issues.) Health officials say that credit for this victory goes to efforts to vaccinate against the disease. Though the measles, mumps, and rubella (MMR) vaccine is recommended for all children and required by many states, anti-vaxxers have protested it due to since-discredited claims that vaccines can cause autism. NPR interviewed Dr. Seth Berkley, the CEO of GAVI, a Geneva-based nonprofit organization whose mission is to improve and provide vaccine and immunization coverage to children in the world's poorest countries. He says that 90 to 95 percent of people in a given region need to be vaccinated in order to stop transmission in a region. The rate worldwide is about 80 percent for measles, which means that 20 percent of people around the world are not covered.
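
That 90-to-95-percent figure follows from the classic herd-immunity threshold, 1 - 1/R0, where R0 is the disease's basic reproduction number; a quick Python sketch, assuming the commonly cited measles R0 of 12 to 18:

```python
def herd_immunity_threshold(r0):
    """Fraction of a population that must be immune to stop sustained
    transmission -- the classic SIR-model result 1 - 1/R0."""
    return 1 - 1 / r0

# Measles' basic reproduction number (R0) is commonly cited as 12 to 18:
for r0 in (12, 15, 18):
    print(f"R0 = {r0}: {herd_immunity_threshold(r0):.0%} must be immune")
# prints 92%, 93% and 94% -- consistent with the 90-95% figure quoted above
```

Because measles is so contagious (compare R0 of roughly 2-3 for influenza), its threshold is among the highest of any vaccine-preventable disease.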

Read more of this story at Slashdot.

Hackaday: Earliest Recorded Computer Music Restored

You want old skool electronic music? How about 1951?

Researchers at the University of Canterbury in New Zealand have just restored what is probably the oldest piece of recorded, computer-generated music. Recorded in 1951, the rendition of “God Save The King”, “Baa-Baa Black Sheep” and “In The Mood” was produced by a computer built by none other than Alan Turing and other researchers at the Computing Machine Research Laboratory in Manchester.

These phat beats were captured by the BBC for broadcast on an acetate disc that the researchers found in an archive. They sampled and restored it, correcting the rather poor-quality recording to reproduce the squawky tones that the computer played. You can hear the restored recording after the break.

It halts apparently unexpectedly in the middle of a stanza, sounds essentially horrible, and goes out of tune on the higher notes. But you gotta learn to crawl before you can walk, and these are the equivalent of the grainy 8mm films of baby’s first steps. And as such, the record is remarkable.

Via ABC News

Filed under: musical hacks, news

Slashdot: The Psychological Reasons Behind Risky Password Practices

Orome1 quotes a report from Help Net Security: Despite high-profile, large-scale data breaches dominating the news cycle -- and repeated recommendations from experts to use strong passwords -- consumers have yet to adjust their own behavior when it comes to password reuse. A global Lab42 survey, which polled consumers across the United States, Germany, France, New Zealand, Australia and the United Kingdom, highlights the psychology around why consumers develop poor password habits despite understanding the obvious risk, and suggests that there is a level of cognitive dissonance around our online habits. When it comes to online security, personality type does not inform behavior, but it does reveal how consumers rationalize poor password habits. My personal favorite: password paradox. "The survey revealed that the majority of respondents understand that their digital behavior puts them at risk, but do not make efforts to change it," reports Help Net Security. "Only five percent of respondents didn't know the characteristics of a secure password, with the majority of respondents understanding that passwords should contain uppercase and lowercase letters, numbers and symbols. Furthermore, 91 percent of respondents said that there is inherent risk associated with reusing passwords, yet 61 percent continue to use the same or similar passwords anyway, with more than half (55 percent) doing so while fully understanding the risk." The report also found that when attempting to create secure passwords, "47 percent of respondents included family names or initials," while "42 percent contain significant dates or numbers and 26 percent use the family pet."
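
The character-class recommendations the respondents describe are easy to state as code; here is a minimal Python sketch (a toy check, not a real password-strength meter, and the sample passwords are of course illustrative):

```python
import string

def password_character_classes(pw):
    """Report which of the commonly recommended character classes appear
    in a password. A toy check, not a real strength estimator."""
    return {
        "uppercase": any(c in string.ascii_uppercase for c in pw),
        "lowercase": any(c in string.ascii_lowercase for c in pw),
        "digits":    any(c in string.digits for c in pw),
        "symbols":   any(c in string.punctuation for c in pw),
    }

print(password_character_classes("hunter2"))      # lowercase and digits only
print(password_character_classes("Tr0ub4dor&3"))  # all four classes present
```

Of course, as the survey shows, the real problem is not knowing these rules but reusing the same password everywhere.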

Read more of this story at Slashdot.

MetaFilter: What A Horrible Year To Have A Curse

In honor of the 30th anniversary of Konami's iconic horror series Castlevania, USGamer has put together a retrospective of the series' history and influence and the AV Club has picked its favorite songs from the soundtrack (YouTube link). If you want a trip down memory lane, VG Junk has a loving review of the first game, Dracula X, and a collection of Symphony of the Night ephemera. Or refresh yourself on what made the series so mechanically great with Tim Rogers' essay In Praise of Sticky Friction.

MetaFilter: Meerkats - the most murderous mammal!

Cute but deadly
Which mammal is most likely to be murdered by its own kind? It's certainly not humans—not even close. Nor is it a top predator like the grey wolf or lion, although those at least are #11 and #9 in the league table of murdery mammals. No, according to a study led by José María Gómez from the University of Granada, the top spot goes to... the meerkat.

Gauge your risk based on your species with this fun graph!

Slashdot: IBM Buys Promontory Financial Group

An anonymous reader quotes a report from ZDNet: IBM said Thursday it plans to acquire compliance consulting firm Promontory Financial Group to bring more financial regulatory expertise to Watson's cognitive computing platform. Promontory is a global consulting operation with an aim of helping banks manage the ever-increasing regulation and risk management requirements in the financial sector. With that in mind, IBM wants to use the industry expertise of Promontory's workforce -- which is made up of ex-regulators and banking executives -- to teach Watson all about regulation, risk and compliance. IBM is also using the deal to create a new subsidiary called Watson Financial Services, which will build cognitive tools for things like tracking regulatory obligations, financial risk modeling, surveillance, and anti-money laundering detection systems. "This is a workload ideally suited for Watson's cognitive capabilities intended to allow financial institutions to absorb the regulatory changes, understand their obligations, and close gaps in systems and practices to address compliance requirements more quickly and efficiently," IBM said in a press release.

Read more of this story at Slashdot.

MetaFilter: The Evolution of Pepe

In light of the Clinton campaign calling Pepe the Frog "a symbol associated with white supremacy" (which the ADL has now added to its online hate symbols database), The Atlantic interviews Matt Furie, the creator of Pepe: "My feelings are pretty neutral, this isn't the first time that Pepe has been used in a negative, weird context. I think it's just a reflection of the world at large. The internet is basically encompassing some kind of mass consciousness, and Pepe, with his face, he's got these large, expressive eyes with puffy eyelids and big rounded lips, I just think that people reinvent him in all these different ways, it's kind of a blank slate. It's just out of my control, what people are doing with it, and my thoughts on it, are more of amusement."

ScreenAnarchy: Netflix Nabs Duncan Jones' MUTE

Warcraft may not have performed as expected, but that hasn't deterred director Duncan Jones from getting on with his passion project, Mute, and finding a pretty big distribution partner in Netflix.  In a recent appearance on the Empire podcast, Alexander Skarsgard spilled the beans on the deal. He explained that the film, which is currently in production, will get a Netflix day-and-date theatrical run. "I think they’ll do what they did with Beasts Of No Nation where they do a theatrical simultaneously to a Netflix release," said Skarsgard. "I just got back from Dublin where Duncan showed me all the renderings and the visuals of it. I'm very, very excited about it." And so are we! The actor went on to explain a bit about his...

[Read the whole post on]

MetaFilter: Inside the Chicago Police Department's secret budget

Through numerous Freedom of Information Act requests, the Chicago Reader, working with the Chicago-based transparency nonprofit Lucy Parsons Labs and the public records website MuckRock, obtained more than 1,000 pages of Chicago Police Department documents—including the department's deposit and expenditure ledgers, internal e-mails, and purchasing records—that offer an unprecedented look into how Chicago police and the Cook County state's attorney's office make lucrative use of civil asset forfeiture.

The Reader found that CPD uses civil forfeiture funds to finance many of the day-to-day operations of its narcotics unit and to secretly purchase controversial surveillance equipment without public scrutiny or City Council oversight. (The Cook County state's attorney's office, for its part, clearly indicates narcotics-related forfeiture income in its annual budget. According to its 2016 budget, the office will use this year's expected forfeiture revenue of $4.96 million to pay the salaries and benefits of the 41 full-time employees of its forfeiture unit.)

The amount of money seized from any given individual is, by itself, negligible to police and prosecutors' budgets—the median value of a forfeiture in Illinois is $530, according to the Institute for Justice, a nonprofit Libertarian public-interest law firm. But losing this sum of money or access to a vehicle can be devastating to the impoverished people civil forfeiture often affects. And in Chicago the millions of dollars accumulated through so many individual seizures don't go toward public services like schools or roads, but are used to fund the operations of the police division that carries out civil forfeiture.

A peek behind the investigative reporting process:
Under Illinois law, if you request a large number of records, an agency may deny your request for being "unduly burdensome." Going on the assumption that CPD's most significant purchases would be greater than $5,000, we requested only purchase orders corresponding to checks above that threshold.

But we ran into another roadblock. Again, under Illinois law, a government agency may take longer to respond if a person sends multiple requests in a short period of time. To get over this hurdle, Lucy Parsons Labs launched a collaboration with MuckRock, a FOIA and transparency website, asking ordinary users to send FOIA requests on our behalf.
Civil forfeiture on Metafilter previously:
Last Week Tonight with John Oliver
The Use and Abuse of Civil Forfeiture

Hackaday: Weatherproof Circuits With a Pouch Laminator

[Nick Poole] over at SparkFun was playing with some force resistive strips. He wanted to use them as a keyboard input. It occurred to him that the office laminator could feasibly laminate a sheet of paper and the resistor into one sealed piece.

He put the assembly inside the pouch, ran it through the laminator, and it worked! After this success he built on it to make a full resistive keyboard. Then it occurred to him to ask, as it would to any good hacker with access to expendable company property: “What else can I laminate?” Basically everything.

His next experiment was an LED throwie. No problem. Bolstered by the battery not exploding, he got more creative. The next victim was one of SparkFun’s Arduino-compatible boards and his business card. Success again.

Finally he went full out. Since the input rollers of the laminator are soft silicone, it can apparently accommodate a fair amount of variance in height. He threw a full noise-maker keyboard with resistive pads and a USB cable into the assembly. No issue.

It seems like a pretty good technique for making keyboards, weatherproof circuits, and more.

Filed under: tool hacks

OCaml Planet: Topkg

OCaml Planet: Asetmap, Bos, Hmap and Webbrowser

Disquiet: Disquiet Junto Project 0248: Galactic Tick


Each Thursday in the Disquiet Junto group, a new compositional challenge is set before the group’s members, who then have just over four days to upload a track in response to the assignment. Membership in the Junto is open: just join and participate. A SoundCloud account is helpful but not required. There’s no pressure to do every project. It’s weekly so that you know it’s there, every Thursday through Monday, when you have the time.

This project was posted in the afternoon, California time, on Thursday, September 29, 2016, with a deadline of 11:59pm wherever you are on Monday, October 3, 2016.

These are the instructions that went out to the group’s email list (at

Disquiet Junto Project 0248: Galactic Tick
Celebrate the new celestial holiday in music.

Project Steps:

Step 1: Read up on the Galactic Tick, a new proposed holiday exploring, as described by Popular Mechanics, “how people’s perceptions would change if they really realized the one fixed point in their celestial understanding, the mighty sun, was also in flux.”

Step 2: Devise a short piece of music in celebration of the Galactic Tick. Perhaps you’ll explore the distance of 225 million years, which is how often the Earth fully circles the center of the Milky Way. Perhaps you’ll find cosmic meaning in 1/129,600,000, which is the “centi-arcsecond” employed by the Galactic Tick planners to make the period of time more human-comprehensible. Perhaps you’ll find meaning in 633.7, which is the number of days between celebrations of the Galactic Tick here on Earth, or 1.74, which is the number of years.

Five More Important Steps When Your Track Is Done:

Step 1: Per the instructions below, be sure to include the project tag “disquiet0248” (no spaces) in the name of your track. If you’re posting on SoundCloud in particular, this is essential to my locating the tracks and creating a playlist of them.

Step 2: Upload your track. It is helpful but not essential that you use SoundCloud to host your track.

Step 3: In the following discussion thread at please consider posting your track. (Assuming you post it on SoundCloud, a search for the tag will help me construct the playlist.)

Step 4: Annotate your track with a brief explanation of your approach and process.

Step 5: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Deadline: This project was posted in the afternoon, California time, on Thursday, September 29, 2016, with a deadline of 11:59pm wherever you are on Monday, October 3, 2016.

Length: The length is up to you. One minute and 44 seconds seems like a good length (that’s roughly 1.74 minutes).

Title/Tag: When posting your track, please include “disquiet0248” in the title of the track, and where applicable (on SoundCloud, for example) as a tag.

Upload: When participating in this project, post one finished track with the project tag, and be sure to include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Download: It is preferable that your track is set as downloadable, and that it allows for attributed remixing (i.e., a Creative Commons license permitting non-commercial sharing with attribution).

Linking: When posting the track online, please be sure to include this information:

More on this 248th weekly Disquiet Junto project — “Celebrate the new celestial holiday in music” — at:

More on the Disquiet Junto at:

Subscribe to project announcements here:

Project discussion takes place on

There’s also a Junto Slack. Send your email address to for Slack inclusion.

The image associated with this project is borrowed from the website

OCaml Planet: Odig 0.0.1

CreativeApplications.Net: Deltu by Alexia Léchot – iPad as a ‘mirror interface’ between humans and robots

Created by Alexia Léchot at ECAL, Deltu is a delta robot with a personality that interacts with humans using two iPads. Built with the arm technology normally found in 3D printers, Deltu uses three different iPad applications Alexia built for it, using symmetry as an interpretation, a mirror, and a reflection of our own image.

All Content: Miss Peregrine's Home for Peculiar Children


Walking back to the car after a recent screening of “Miss Peregrine’s Home for Peculiar Children,” my movie-savvy, nearly-seven-year-old son took my hand and asked me sweetly: “Mommy, what was that about?”

Um … er … well …

The short answer (which probably wasn’t terribly helpful to him) was: It’s “X-Men” meets “Groundhog Day.” The real answer, which required a lot of stumbling and bumbling and twists and turns, was far more lengthy (and probably not terribly helpful, either). Because even though I’d just seen the exact same movie my son had, I wasn’t sure I completely understood it, either.

The latest adventure from Tim Burton would seem tailor-made for his tastes but it’s a convoluted slog, dense in mythology and explanatory dialogue but woefully lacking in thrills. It’s been a matter of diminishing returns with Burton for the past several years now between “Alice in Wonderland,” “Dark Shadows” and “Big Eyes” (although the animated “Frankenweenie” found the director in peak retro form). “Miss Peregrine’s Home for Peculiar Children” allows him to show only brief glimmers of the gleefully twisted greatness of his early work such as “Pee-wee’s Big Adventure” and “Beetlejuice.” The characters here are supposed to be delightful—or at least interesting—simply because they’re superficially odd, and it just isn’t enough anymore. Too often, it feels like we’ve seen this movie before—and seen it done better.

Although the film (based on the novel by Ransom Riggs) is populated by an assortment of peculiars, as they’re known—kids born with unusual abilities that make it difficult for them to live in the outside world—precious few of them feel like actual human beings whose lonely plight might carry some emotional resonance. There’s Emma (Ella Purnell), the pretty blonde who has to wear lead shoes so she doesn’t fly away. There’s Olive (Lauren McCrostie), the redhead who has to wear gloves so she doesn’t accidentally set things on fire. There’s the girl with a ravenous maw hidden on the back of her head. The invisible boy who likes to play tricks. The girl who can make things grow super fast. The boy who can project images through his eyeball. The creepy, masked twins. They flit in and out, do the thing they do, and ta da! Then they’re gone without leaving much impact.

Their leader is the stylish and formidable Miss Alma LeFay Peregrine, played by Eva Green, who nearly saves the day simply by showing up with that vampy, riveting screen presence of hers. With a shocking swoop of midnight-blue hair and an array of gorgeous gowns from frequent Burton costume designer Colleen Atwood, she has the ability to manipulate time (and turn into a bird, which seems unrelated). But that isn’t enough. She also has to be extra quirky by smoking a pipe.

And the seemingly regular kid who stumbles upon all these freaks and geeks is the incredibly boring Jake, played by “Hugo” star Asa Butterfield. He’s our wide-eyed conduit, so of course he has to function as the straight man in such a wildly fanciful world. But there’s just nothing to him, and the young British actor’s American accent seems to flatten him further.

You may have noticed I haven’t tried to describe the plot yet. Yes, I am procrastinating. 

Shy, teenage Jake lives in a bland tract house in suburban Florida (on the same street as Edward Scissorhands, possibly). He dreams of being an explorer, he says, but he would seem to lack the requisite get-up-and-go. All his life, he’s heard his beloved grandfather (Terence Stamp, who departs far too quickly) tell him outlandish stories about his own youth on an island off the coast of Wales, where he grew up at an orphanage for misfits with magical powers.

After Grandpa dies under mysterious circumstances, Jake convinces his parents (Chris O’Dowd and a frustratingly underused Kim Dickens), with the help of his grief counselor (Allison Janney), that he should visit the island and try to find this mysterious home in hopes of achieving closure. Dad tags along to take photographs of birds and drink beer at the pub full of crotchety locals. (Cinematographer Bruno Delbonnel, whose work includes the Coen brothers’ luscious “Inside Llewyn Davis,” does make the foggy Welsh setting look severe and dramatic.)

When Jake finally does find the stately, gothic home his grandfather had told him about, he discovers it’s in ruins, the result of a bombing decades earlier during World War II. But once he steps inside and begins investigating, the inhabitants dare to pop their heads out and the place comes colorfully to life. Seems they’re stuck in a time loop, doomed to repeat the same day in September 1943 right up until the moment the Nazi bomb fell on them. The time-conscious Miss Peregrine explains that she winds the clock back 24 hours at the end of each night, just before the moment of destruction, allowing everyone to relive that day all over again.

Doesn’t that sound fun? Are you still paying attention?

Anyway, for some reason, all the kids want Jake to stick around, ostensibly because they haven’t seen a fresh face in about 70 years, and his will do. But they’re all in danger, you see, because just as there are good mutants in the “X-Men” world, there are also bad ones. Here, they’re the peculiars who use their powers to take over other time loops, or something. And the way they stay alive is by eating people’s eyeballs, or something. Their leader is the courtly yet menacing Mr. Barron, whom Samuel L. Jackson plays with the kind of scenery chewing he could do in his sleep. But what they want is never clear, so they’re never truly frightening.

The supposedly epic collision between good and evil results in exactly one exciting action set piece. It involves stop-motion animated skeletons battling an army of long-limbed, eye-gouging mercenary giants at a boardwalk amusement park, and it’s the only scene that vividly recalls the kind of artistry and absurd humor that long have been Burton’s trademarks. And the peculiar who makes it all happen has the most useful—and the most ethically intriguing—ability of all. Enoch (Finlay MacMillan) can bring things back to life—a person, a creepy doll—by inserting a beating heart into it. Unfortunately, though, he ends up being just another cog in the particularly dull machinery.

BOOOOOOOM!: Watch: “The Junction: A-Trak & Nick Catchdubs”

Here’s episode #3 of our animated series for Red Bull Music Academy, and it’s a peek into the pop culture-filled minds of Fool’s Gold co-founders A-Trak and Nick Catchdubs. Hopefully you’ll find it to be an enjoyable mix of the absurd and the profound (would love to see these two start their own podcast).

The visuals here are a collaborative effort from animator Brandon Blommaert and illustrator Josh Holinaty. The sound design and original music were created by Luigi Allemano.

Make sure you hit full screen on the episode above or watch it nice and big over on Booooooom TV.

Stay tuned for the rest of the episodes! If you missed the first two, watch Episode #1: Chilly Gonzales and Peaches, and watch Episode #2: Kaytranada and River Tiber.

Open Culture: Watch Benedict Cumberbatch Sing Pink Floyd’s “Comfortably Numb,” with David Gilmour Live on Stage

Around here, when we talk about Benedict Cumberbatch, we usually talk about his knack for reading classic texts–Kafka’s Metamorphosis, Melville’s Moby-Dick, a poignant letter by Alan Turing, even passages from a Guantánamo prisoner’s diary. But today we’re putting another one of his talents on display.

Above, watch Cumberbatch join David Gilmour live on stage to perform Pink Floyd’s 1979 song, “Comfortably Numb.” The performance took place last night at London’s Royal Albert Hall. Enjoy.

Note: You can download free audiobooks read by Benedict Cumberbatch if you sign up for a 30-Day Free Trial with  That includes readings of Sherlock Holmes, Jane Austen and Neil Gaiman. Find more information on Audible’s Free Trial program here.

via Rolling Stone

Related Content:

Pink Floyd’s David Gilmour Sings Shakespeare’s Sonnet 18

Ultra Orthodox Rabbis Sing Pink Floyd’s “Wish You Were Here” on the Streets of Jerusalem

Hear Lost Recording of Pink Floyd Playing with Jazz Violinist Stéphane Grappelli on “Wish You Were Here”

Watch Benedict Cumberbatch Sing Pink Floyd’s “Comfortably Numb,” with David Gilmour Live on Stage is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Colossal: Artist Leon Tarasewicz Covers the Poland National Gallery’s Great Hall Staircase in Splatter Paint


The Great Hall during the exhibition “Polish Painting of the 21st Century,” Leon Tarasewicz, 2006, photo: Sebastian Madejski. All images via We Are Museums



Back in 2006, Warsaw’s National Gallery of Art, Zachęta, held a group exhibition titled “Polish Painting of the 21st Century.” Painter Leon Tarasewicz contributed a site-specific work to the 60-artist exhibition, redoing the museum’s Great Hall in a bath of red, yellow, blue, and green splatter paint. The work splattered the stairs and crept up the surrounding walls, creating a dramatic entrance for anyone entering the exhibition. (via ArchAtlas which was inexplicably deleted by Tumblr last week?)

Jesse Moynihan: End of Adv Time Production

A few days ago I had coffee with my former AT board partner Ako Castuera and she dropped the deets that Adventure Time production would end with Season 9. Even though I’ve been off the show since July 2015, this news came as a weird relief for me. I told Ako this and she laughed […]

BOOOOOOOM!: Artist Spotlight: Greg Ito


A selection of recent work by Los Angeles-based artist Greg Ito. See more images below or on display at Steve Turner gallery until October 8th.

Open Culture: Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car”

Last May, we told you about Flow Machine, an artificial intelligence-driven music composer that analyzes composers’ styles and then creates new works from that data. Developed by François Pachet at Sony CSL-Paris, the initial experiments demonstrated Beethoven’s “Ode to Joy” as played in the style of bossa nova, the Beatles’ “Penny Lane,” and Ennio Morricone’s romantic work. Admittedly, it wasn’t the most stunning moment in A.I.—a computer was now doing what arrangers have been doing for years, applying genre rules to a melody created in another genre.

However, Flow Machine has returned with an interesting development: two upcoming albums of A.I.-created songs, from which two tunes have been released to give you a taste of computer creativity. French composer and musician Benoît Carré helped out with the arrangements and production of the songs, and also wrote the lyrics, so it’s not completely an A.I. creation, we should note.

So what should we make of “Daddy’s Car,” above, an attempt to create an A.I. song in the style of the Beatles? The opening seconds feature the three-part harmony of “Because,” but when the band kicks in, it’s closer to the Beach Boys’ Pet Sounds than the Fab Four. (If anything, it’s closer to the High Llamas.)

But does it sound like it was written by a human? Yes.

For something stranger, try the other song released so far: “Mr. Shadow,” written “in the style of American songwriters such as Irving Berlin, Duke Ellington, George Gershwin and Cole Porter.”

Now this is much odder, a mix of country twang, Daniel Lanois-style ambience, along with a vocal that sounds like a corrupted audio file. If you are looking for a true glimpse of the future, wrap your ears and sanity around this one. Musicians and music fans, let us know in the comments what you think about this brave new world that has such hit singles in it.

Related Content:

Two Artificial Intelligence Chatbots Talk to Each Other & Get Into a Deep Philosophical Conversation

Noam Chomsky Explains Where Artificial Intelligence Went Wrong

Stephen Hawking Wonders Whether Capitalism or Artificial Intelligence Will Doom the Human Race

Ted Mills is a freelance writer on the arts who currently hosts the FunkZone Podcast. You can also follow him on Twitter at @tedmills, read his other arts writing at and/or watch his films here.

Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car” is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Colossal: New Lace Street Art Created with Ceramic, Textile, and Spray Paint by NeSpoon


Polish artist NeSpoon (previously here and here) focuses on lace motifs that cover the walls, streets, and public parks found in urban environments. The lace works are either painted directly onto the surface or formed from clay, each handmade either by the artist herself or by the traditional folk artists with whom she works.

“In lace there is an aesthetic code which is deeply embedded in every culture,” says NeSpoon. “In every lace we find symmetry, and some kind of order and harmony. Isn’t that what we all seek for instinctively?”

Recently NeSpoon has created work in Wroclaw, Auckland, Pont-l’Abbé, and Warsaw. You can see more of her public murals and installations on Behance.

ScreenAnarchy: SLASH: Alien Erotica In Trailer For The Indie Hit

Teen Wolf star Michael Johnston takes a dive into a very different part of online culture than afforded by his regular gig with a starring role in Clay Liford's Slash. The indie comedy following a young man dipping into the world of erotic fan fiction was a big hit at the SXSW Film Festival before launching on an extensive international festival run and with the US release of the film now scheduled for early December the first proper trailer has arrived online. Neil is a questioning high school freshman whose main social outlet is the steamy erotic fan fiction he writes about Vanguard, the brawny, galaxy-hopping hero of a popular sci-fi franchise. When Neil's stories are exposed in class he is mortified, but the effortlessly-cool...

[Read the whole post on]

ScreenAnarchy: San Diego International Film Festival 2016 Exclusive Interview: Branko Tomovic on RED

The San Diego International Film Festival kicked off this week and among the fantastic line-up is the intense and atmospheric thriller Red by Branko Tomovic (24, Fury, City of Tiny Lights), which also stars Dervla Kirwan (Ondine, Ballykissangel, The Silence, Doctor Who) and Francesca Fowler (Doctor Who, Rome, Closure). It's Branko's directing debut and tells the story of Niklas, a surgeon who lives a life of solitude and is tormented by self-hatred. He performs regular illegal surgeries for the red market and works together with Mia, a young prostitute who lures her clients in and drugs them. Niklas is looking for a way out of this dark world, but owes his life to their violent crime boss Ed, who would rather kill him than let him go....

[Read the whole post on]

Colossal: A Macro Timelapse Highlights the Micro Movements of Spectacularly Colored Coral

Interested in documenting one of the oldest animals on Earth, Barcelona-based production company myLapse set out to capture the minimal movements of brightly colored coral, recording actions rarely seen by the human eye. The short film required nearly 25,000 individual images of the marine invertebrates, and photographing species such as Acanthophyllia, Trachyphyllia, Heteropsammia cochlea, and Physogyra took over a year.

The production team hopes the film attracts attention to the Great Barrier Reef, encouraging watchers to take a deeper interest in one of the natural wonders of the world that is being rapidly bleached due to climate change. You can see more up-close images of the coral species featured in this film on Flickr. (via Sploid)

Quiet Earth: GOOD TIDINGS To Come from XLrator, Blue Fox Partnership

XLrator Media and Blue Fox Entertainment have entered into a distribution agreement to jointly distribute 15 films a year across all North American platforms, including a number of co-acquisitions.

The first film under the new XLrator Media-BFE distribution agreement will be the holiday-themed thriller Good Tidings, to be released on December 6th, followed in 2017 by the supernatural thrillers Dark Signal and A Demon Within. All of the films will be released on XLrator Media’s acclaimed “MACABRE” genre label.

The two companies began their relationship earlier this year when XLrator Media acquired the North American distribution rights from BFE to the international horror festival hit The Windmill, which opens in theaters October 28th and [Continued ...]

ScreenAnarchy: Review: DEEPWATER HORIZON, Heroes At the Mouth of Hell

Six years ago, nightmare images of an oil rig burning at night in the Gulf Coast seared themselves into memory. The fire raged high and furious against a pitch-black landscape, leaving one to wonder just what had happened ... and who, if anyone, survived. Now comes Peter Berg's Deepwater Horizon, a dramatization of events in the hours leading up to the explosion that claimed 11 lives. Its direct source is a magazine article by David Rohde and Stephanie Saul; the screenplay is credited to Matthew Michael Carnahan (Berg's The Kingdom, and also Lions for Lambs and others) and Matthew Sand. In its early scenes, the movie feels very much like a staged reading of a magazine article, filled with a copious amount of information that...

[Read the whole post on]

Daniel Lemire's blog: Can Swift code call C code without overhead?

Swift is the latest hot new language from Apple. It is becoming the standard programming language on Apple systems.

I complained in a previous post that Swift 3.0 has only about half of Java’s speed in tests that I care about. That’s not great for high-performance programming.

But we do have a language that produces very fast code: the C language.

Many languages like Objective-C, C++, Python and Go allow you to call C code with relative ease. C++ and Objective-C can call C code with no overhead. Go makes it very easy, but the performance overhead is huge. So it is almost never a good idea to call C from Go for performance. Python also suffers from a significant overhead when calling C code, but since native Python is not so fast, it is often a practical idea to rewrite performance-sensitive code in C and call it from Python. Java makes it hard to call C code, so it is usually not even considered.

What about Swift? We know, as per Apple’s requirements, that Swift must interact constantly with legacy Objective-C code. So we know that it must be good. How good is it?

To put it to the test, I decided to call from Swift a simple Fibonacci recurrence function:

void fibo(int * x, int * y) {
  int c = *y;
  *y = *x + *y;
  *x = c;
}

(Note: this function can overflow and that is undefined behavior in C.)

How does it fare against pure Swift code?

let c = j;
j = i &+ j;
i = c;

To be clear, this is a really extreme case. You should never rewrite such a tiny piece of code in C for performance. I am intentionally pushing the limits.

I wrote a test that calls these functions 3.2 billion times. The pure Swift takes 9.6 seconds on a Haswell processor… or about 3 nanoseconds per call. The C function takes a bit over 13 seconds, or about 4 nanoseconds per iteration. Ok. But what if I rewrote the whole thing into one C function, called only once? Then it runs in 11 seconds (it is slower than pure Swift code).

The numbers I have suggest that calling C from Swift is effectively free.

In these tests, I do not pass to Swift any optimization flag. The way you build a swift program is by typing “swift build” which is nice and elegant. To optimize the binary, you can type “swift build --configuration release”. Nice! But benchmark code is part of your tests. Sadly, swift seems to insist on only testing “debug” code for some reason. Typing “swift test --configuration release” fails since the test option does not have a configuration flag. (Calling swift test -Xswiftc -O gives me linking errors.)

I rewrote the code as a pure C program, without any Swift. Sure enough, the program runs in about 11 seconds without any optimization flag. This confirms my theory that Swift tests the code with all optimizations turned off. What if I turn on all C optimizations? Then I go down to 1.7 seconds (about half a nanosecond per iteration).

So while calling C from Swift is very cheap, ensuring that Swift properly optimizes the code might be trickier.

It seems odd that, by default, Swift runs benchmarks in debug mode. It is not helping programmers who care about performance.

Anyhow, a good way around this problem is simply to build binaries in release mode and measure how long they take to run. It is crude, but it gets the job done in this case:

$ swift build --configuration release
$ time ./.build/release/LittleSwiftTest

real       0m2.030s
user       0m2.028s
sys        0m0.000s
$ time ./.build/release/LittleCOverheadTest

real       0m1.778s
user       0m1.776s
sys        0m0.000s

$ clang -Ofast -o purec  code/purec.c
$ time ./purec

real       0m1.747s
user       0m1.744s
sys        0m0.000s

So there is no difference between a straight C program and a Swift program that calls a C function billions of times. They are both just as fast.

The pure Swift program is slightly slower in this case, however. It suggests that using C for performance-sensitive code could be beneficial in a Swift project.

So I have solid evidence that calling C functions from Swift is very cheap. That is very good news: it means that if, for whatever reason, Swift is not fast enough for your needs, you stand a very good chance of being able to rely on C instead.

My Swift source code is available (works under Linux and Mac).

Credit: Thanks to Stephen Canon for helping me realize that I could lower the call overhead by calling the C function directly instead of wrapping it first in a tiny Swift function.

ScreenAnarchy: Review: DANNY SAYS, More Than a Remarkable Musical Story

“Danny says we gotta go / Gotta go to Idaho / But we can’t go surfin’ / ’Cause it’s 20 below.” Those words open The Ramones’ song “Danny Says,” from their woefully underappreciated, Phil Spector-produced 1980 album, End of the Century. That was what I knew of Danny Fields before seeing Brendan Toller’s illuminating documentary Danny Says. The film traces Fields’ roots back to his childhood in Queens, and follows his almost unbelievable journey alongside American pop culture in the late ’60s and ’70s as he shepherds one incredible movement after another into the limelight during a time in American history that would change everything. You name it, Danny was there, and he was crucial. Brendan Toller’s film is more than just a bunch of talking...

[Read the whole post on]

Open Culture: Sci-Fi Icon Robert Heinlein Lists 5 Essential Rules for Making a Living as a Writer


So you want to be a writer? Good, you’ll find plenty of advice from the best here at Open Culture. Oh, you want to be a science fiction writer? The great Ursula K. Le Guin has offered readers a wealth of writing advice, though she won’t tell us “how to sell a ship, but how to sail one.” But wait, you also want to know how to publish, and make a living? For that, you’d better see Robert Heinlein, one of the acknowledged masters of the Golden Age of science fiction and a hugely prolific author who pioneered both popular hard sci-fi and what he called “speculative fiction,” a more serious, literary form incorporating social and political themes.

In his 1947 essay “On the Writing of Speculative Fiction,” Heinlein refers to these “two types” of science fiction as “the gadget story and the human interest story.” The latter kind of story, writes Heinlein “stands a better chance with the slicks than a gadget story does” because it has wider appeal. This advice sounds rather utilitarian, doesn’t it? What about passion, inspiration, the muse? Eh, you don’t have time for those things. If you want to be successful like Robert Heinlein, you’ve got to write stories, lots of ‘em, stories people want to publish and pay for, stories people want to read.

Heinlein spends the bulk of his essay advising us on how to write such stories, with a proviso, in an epigram from Rudyard Kipling, that “there are nine-and-sixty ways / Of constructing tribal lays / And every single one of them is right.” After, however, describing in detail how he writes a “human interest” science fiction story, Heinlein then gets down to business. He assumes that we can type, know the right formats or can learn them, and can spell, punctuate, and use grammar as our “wood-carpenter’s sharp tools.” These prerequisites met, all we really need to write speculative fiction are the five rules below:

1. You must write.

2. You must finish what you start.

3. You must refrain from rewriting except to editorial order.

4. You must put it on the market.

5. You must keep it on the market until sold.

You might think Heinlein has lapsed into the language of the realtor, not the writer, but he is deadly serious about these rules, which “are amazingly hard to follow—which is why there are so few professional writers and so many aspirants.” Anyone who has tried to write and publish fiction knows this to be true. But what did Heinlein mean in giving us such an austere list? For one thing, as he notes many times, there are perhaps as many ways to write sci-fi stories as there are people to write them. What Heinlein aims to give us are the keys to becoming professional writers, not theorists of writing, lovers of writing, dabblers and dilettantes of writing.

Award-winning science fiction writer Robert J. Sawyer has interpreted Heinlein’s rules with commentary of his own, and added a sixth: “Start Working on Something Else.” Good advice. Heinlein’s rule number three, however—“the one that got Heinlein in trouble with creative-writing teachers”—seems to contradict what most every other writer will tell us. Sawyer suggests we take it to mean, “Don’t tinker endlessly with your story.” Writer Patricia C. Wrede agrees, but also suggests that “Heinlein was of the school of thought that felt that ‘good enough’ was all that was necessary, ever.”

Like 19th century writers who churned out novels as serialized stories for the papers and magazines, Heinlein and his fellow Golden Age writers made their living selling story after story to the “pulps” and the “slicks” (preferably the slicks). One had to be prolific, and being “’prolific enough’ often involved not having time to polish and revise much (if at all).” So rule number three may or may not apply, depending on our constraints. The literary market has changed dramatically since 1947, but the rest of Heinlein’s rules still seem nonnegotiable if we intend not only to write—speculative fiction or otherwise—but also to make a career doing so.

Related Content:

Writing Tips by Henry Miller, Elmore Leonard, Margaret Atwood, Neil Gaiman & George Orwell

Ray Bradbury Gives 12 Pieces of Writing Advice to Young Authors (2001)

Stephen King’s Top 20 Rules for Writers

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Sci-Fi Icon Robert Heinlein Lists 5 Essential Rules for Making a Living as a Writer is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

All Content: NYFF 2016: Preview of the 54th New York Film Festival


Founded in 1963, the New York Film Festival predated both the Sundance and Toronto festivals, which eventually loomed much larger in terms of size, media attention and industry influence (both serve as sales markets). New York’s main claim to fame was as a tastemaker and gatekeeper: it very selectively imported the most acclaimed films from abroad as well as introducing important new voices in American filmmaking.

Under the 25-year leadership of the festival’s first director, Richard Roud, the French definition of auteur cinema ruled, and the consequent influence of the Cannes Film Festival made for programming heavy on European films. In the next quarter-century, Roud’s successor, Richard Peña, broadened the festival’s scope in various ways, offering more documentaries and American independents as well as showcasing auteurs from cinemas as diverse as China, Taiwan, Iran, Egypt and Burkina Faso.

Even before Kent Jones took over from Peña four years ago, the Film Society of Lincoln Center, which sponsors the festival, had been undergoing various changes, some of which were reflected in a continually expanding range of festival offerings: these now include “Spotlight on Documentary” (a section I consider the most important addition of recent years) and “Convergence” (about VR and various forms of “immersive” storytelling); revivals, retrospectives and restorations; filmmaker talks, salutes to actors, etc.

The festival’s Main Slate, though, remains its chief calling card, and rightly so: it aims to gather the best of world cinema into a program of only two to three dozen films (this year’s edition includes 25 features and five programs of shorts). In recent years, one notable departure from the festival’s traditional emphases entailed opening the doors to major-studio movies, which led to Opening Night slots for the likes of Paul Greengrass’ “Captain Phillips,” David Fincher’s “Gone Girl” and Robert Zemeckis’ “The Walk.”

Whether such inclusions were intended to boost the festival’s media or industry profiles or attract a broader base of donors—all very understandable objectives—this year’s edition reverses the trend very decisively. Its sole big-ticket item, Ang Lee’s “Billy Lynn’s Long Halftime Walk,” comes from a great director with a NYFF history and is a special presentation near the festival’s end. The Opening Night slot, meanwhile, will be occupied by “The 13th,” Ava DuVernay’s documentary about the mass incarceration of African-American men. While the film wasn’t available in time to be previewed for this article, I’m ready to lead the cheers for the festival including a documentary as its Opening Night offering (first time ever), especially one by such an estimable filmmaker.

I’ve been covering the NYFF for more than half of its history, and perhaps the highest compliment I can pay its programmers is to note that more often than not, as many as half the films on my annual 10-best list appeared in the festival’s Main Slate. Granted, there are always reasons to quibble: trendy or inconsequential films that are inexplicably included, and anticipated titles that aren’t (this year my list of most regretted no-shows is led by Asghar Farhadi’s “The Salesman” and Damien Chazelle's “La La Land”; both directors are NYFF veterans). Yet it’s also been my experience that the films that really define the festival’s special flavor every year are not just the laureates from Cannes and other festivals but some of its less obvious, more idiosyncratic offerings.

This year, the latter category is led by “Son of Joseph,” the latest by the New York-born, Paris-based director Eugène Green. In recent years, my greatest debt to the NYFF might have been the discovery of Green’s sublime “La Sapienza” (his first film distributed in the U.S. and number one on my ten-best list of last year) at the 2014 festival. Having now seen much of Green’s earlier work, I count myself lucky to have encountered “La Sapienza” first, since it strikes me as his one unqualified masterpiece and thus the film that provides the best introduction to a body of work that is decidedly eccentric.

An American who went to France and helped revive its tradition of Baroque theater before turning to filmmaking, Green is an anti-modernist who wages war against pretty much everything that contemporary French intellectual culture represents. Like the Dardenne brothers (who co-produced “Son of Joseph” and whose exquisite “The Unknown Girl” is also in this year’s festival) he owes obvious debts to the cinema of Robert Bresson in both his essentially religious outlook and his pared-down visual style.

Like many of his films, “Son of Joseph” is rife with Biblical references. Given Green’s Christian orientation, it’s no surprise that the Joseph evoked here is he of the New Testament not the Old. The film’s protagonist is a Parisian boy whose pervasive unhappiness seems to stem in large part from the refusal of his mother (named, of course, Marie) to reveal the identity of his father. In his probings, the teen discovers evidence that his dad is a celebrated literary publisher (a wonderfully droll turn by Mathieu Amalric) who’s as big a creep as he is a cultural icon. While pursuing him, the boy also encounters and develops a friendship with the publisher’s brother (Green regular Fabrizio Rongione), a kindly man who wants to return to his family’s farm in Normandy.

In “La Sapienza,” the characters left the dry intellectualism of Paris for Italy’s warmth, verdant landscapes and Baroque architectural masterpieces; the film’s story describes a personal and philosophical journey toward wisdom and healing. “Son of Joseph” is much different in large part because it mostly remains planted in Paris, where its religious symbolism vies against Green’s broad satiric swipes at the city’s literary scene. Some of this is obviously intended as comedy, but other parts are less clearly jocular in their intent, though many people in the press screening I attended laughed like they were in a Mel Brooks movie. While Green certainly has his whimsical side, I would say that the reactions his latest provokes mark it a mixed success by a very unusual artist.

An expatriate from a different hemisphere, Alison Maclean first visited the NYFF with “Kitchen Sink,” a 1989 short made in her native New Zealand. Resident in New York since 1992, she returned to her homeland to make the sharp, keenly observed feature “The Rehearsal,” which follows a group of aspiring actors through their first year of drama school.

Adapted from a novel by Eleanor Catton, the film inevitably provokes the term “multi-layered” in part because we are always watching actors playing actors learning about acting. But Maclean never makes this an exercise in cerebral self-consciousness. With great subtlety and incisiveness, she concentrates on drawing very detailed, believable characters and then follows the process they undergo in gradually immersing themselves in a discipline that’s at once seductive and demanding, vainglorious and ego-crushing.

Her main character, Stanley (charismatic Kiwi teen star James Rolleston), is a country boy who, despite his good looks, approaches his new life with the tentativeness of someone who lacks the confidence his urban classmates have come by naturally. He meets his match in the domineering personality of his main teacher (a strong performance by Kerry Fox). She sees her role as breaking her students down before she can build them up again, and at first Stanley seems like he just might not survive the ordeal.

The film, though, ably chronicles the personal transformations and growth that students must undergo to discover the powers their instructors see in the best of them. At the same time, they all have lives outside the classroom, and “The Rehearsal” is very good at interweaving the ups and downs of first-year-at-college, the roommate snarls, newfound romances and hedonistic detours, even the unexpected tragedy. All in all, with its sure sense of narrative and Maclean’s cool, restrained style, it’s a film that makes good on the promise of her first two features, “Crush” and “Jesus’ Son.”

The festival dependably presents films about New York cultural icons that are sometimes made by filmmakers who qualify as the same. This year’s example, “Gimme Danger” by Jim Jarmusch, opens with a gaunt, lank-haired guy sitting down in front of the camera and being identified: “Jim Osterberg.” The name, of course, is the real-life moniker of the protean performer the world knows as Iggy Pop, erstwhile lead vocalist of the Stooges.

With Iggy eloquently narrating, Jarmusch initially throws viewers a curveball in his documentary about the band by beginning the chronicle in 1973, when they were at a low point, seemingly already washed up. But soon enough the filmmaker doubles back and gives us the story chronologically, starting with the Ann Arbor-based Stooges being discovered by influential Elektra Records PR man Danny Fields. (This part of the story is also recounted in Brendan Toller’s excellent doc about Fields, “Danny Says,” which opens this week.)

Their signing with Elektra led to the wide-eyed Midwestern boys being introduced to the fleshpots of the two coasts. In New York, Iggy works with John Cale, begins a romance with Nico (who had recently broken up with Lou Reed) and meets David Bowie, who would have a huge influence on his subsequent career. In L.A., he encounters Andy Warhol and takes to wearing dog collars. The Stooges’ first two albums, which were recorded in these separate sojourns, would later be regarded as classics, but they didn’t set the world on fire at a time when hippiedom and pot-flavored acoustic rock still ruled. Elektra dropped them after the second disc, which sent the band into the usual rock-star spiral of drugs, drink, fights, personnel changes and eventual dissolution.

Commendably, neither Jarmusch nor Iggy ever uses the hackneyed phrase “ahead of their time,” but the Stooges surely were—perhaps more so than any American rock act apart from the Velvet Underground. The raw, aggressive sonic assaults they unleashed on an unsuspecting public in 1970 influenced the Ramones in America, the Sex Pistols and the Clash in the U.K., as well as perhaps half the bands who would dominate the college radio airwaves and U.S. club scene a decade later.

Though Iggy, whose Greek-god physique remains pristine after nearly a half-century, disavows all labels including “punk,” his fearless stage dives and utter disregard of injuries (which were manifold) set the parameters for many future mosh pits and lead-singer derring-do. He is still an amazing performer, as well as a thoughtful and articulate teller of his and the Stooges’ tale (the film also includes interviews with other surviving band members). With a style that’s more straightforward and less quirky than one might expect from Jarmusch, “Gimme Danger” provides an expert and engrossing overview of one of America’s seminal rock acts.

Cream of the Crop

As noted above, the NYFF dependably provides a first look at films that end up on my annual ten-best lists. Of the titles screened so far for the press, four seem likely candidates for such honors, whether they go into general release this year or next. I offer them here for festival-goers looking for the best of this year’s Main Slate. All have been reviewed in previous festivals. In order of preference:

Cristian Mungiu's “Graduation” is a brilliantly crafted drama that weds stylistic rigor to a story that interrogates moral choices in present-day Romania; this one concerns a doctor and the lengths he will go to to get his daughter to a British university. (Our review from Cannes)

Cristi Puiu’s “Sieranevada,” also from Romania, offers a stylistic tour de force in which a ritual family gathering, presented with an almost anthropological avidity for details, as well as drolly understated wit, involves conversations touching on everything from 9/11 conspiracy theories to Romania’s Communist past. (Our review from Cannes)

“I, Daniel Blake,” the winner of the 2016 Palme d’Or at Cannes and the best film in many years from British stalwart Ken Loach, sums up the bureaucratic miseries of digital-era England in this tale of a disabled worker waging a Kafkaesque battle to get his health benefits. (Our review from Cannes)

“The Unknown Girl,” the latest masterpiece by the Dardenne brothers, takes place in their native Belgium and concerns a young doctor trying to deal with both her own guilt and an unsolved mystery after a young African immigrant is killed near her office. (Our review from Cannes)

The 54th New York Film Festival runs from September 30 to October 16. Click here for more information.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Invasive Species

I knew I should've gotten hamsters instead.

New comic!
Today's News:

My geeks tell me we've already sold 20% of BAHFest tickets in just a few days. These are definitely going to sell out. So, if you want tickets, please book soon!

CreativeApplications.Net: Scribb by Mylène Dreyer – Pen-Paper-Mouse Interface

Created by Mylène Dreyer at ECAL, Scribb is a computer game in which the physical area scanned by the mouse is an integral part of the interaction. The player must draw black areas, detected by the mouse, to be able to evolve in the game, simultaneously managing the position of the mouse and the surface on which it is placed.

Planet Haskell: Neil Mitchell: Full-time Haskell jobs in London, at Barclays

Summary: I'm hiring 9 Haskell programmers. Email neil.d.mitchell AT to apply.

I work for Barclays, in London, working on a brand new Haskell project. We're looking for nine additional Haskell programmers to come and join the team.

What we offer

A permanent job, writing Haskell, using all the tools you know and love – GHC/Cabal/Stack etc. In my first two weeks in the role I've already written parsers with attoparsec and both Haskell and HTML generators, and I am currently generating a binding to C with lots of Storable/Ptr stuff. Since it's a new project you will have the chance to help shape it.

The project itself is to write a risk engine – something that lets the bank calculate the value of the trades it has made, and how things like changes in the market change their value. Risk engines are important to banks and include lots of varied tasks – talking to other systems, overall structuring, actual computation, streaming values, map/reduce.

We'll be following modern but lightweight development principles – including nightly builds, no check-ins to master which break the tests (enforced automatically) and good test coverage.

These positions have attractive salary levels.

What we require

We're looking for the best functional programmers out there, with a strong bias towards Haskell. We have a range of seniorities available to suit even the most experienced candidates. We don't have anything at the very junior end; instead we're looking for candidates that are already fluent and productive. That said, a number of very good Haskell programmers think of themselves as beginners even after many years, so if you're not sure, please get in touch.

We do not require any finance knowledge.

The role is in London, Canary Wharf, and physical presence in the office on a regular basis is required – permanent remote working is not an option.

How to apply

To apply, email neil.d.mitchell AT with a copy of your CV. If you have any questions, email me.

The best way to assess technical ability is to look at code people have written. If you have any packages on Hackage or things on GitHub, please point me at the best projects. If your best code is not publicly available, please describe the Haskell projects you've been involved in.

OCaml Planet: Full Time: Software Developer (Functional Programming) at Jane Street in New York, NY; London, UK; Hong Kong

Software Developer

Jane Street is a proprietary quantitative trading firm, focusing primarily on trading equities and equity derivatives. We use innovative technology, a scientific approach, and a deep understanding of markets to stay successful in our highly competitive field. We operate around the clock and around the globe, employing over 400 people in offices in New York, London and Hong Kong.

The markets in which we trade change rapidly, but our intellectual approach changes faster still. Every day, we have new problems to solve and new theories to test. Our entrepreneurial culture is driven by our talented team of traders and programmers. At Jane Street, we don't come to work wanting to leave. We come to work excited to test new theories, have thought-provoking discussions, and maybe sneak in a game of ping-pong or two. Keeping our culture casual and our employees happy is of paramount importance to us.

We are looking to hire great software developers with an interest in functional programming. OCaml, a statically typed functional programming language with similarities to Haskell, Scheme, Erlang, F# and SML, is our language of choice. We've got the largest team of OCaml developers in any industrial setting, and probably the world's largest OCaml codebase. We use OCaml for running our entire business, supporting everything from research to systems administration to trading systems. If you're interested in seeing how functional programming plays out in the real world, there's no better place.

The atmosphere is informal and intellectual. There is a focus on education, and people learn about software and trading, both through formal classes and on the job. The work is challenging, and you get to see the practical impact of your efforts in quick and dramatic terms. Jane Street is also small enough that people have the freedom to get involved in many different areas of the business. Compensation is highly competitive, and there's a lot of room for growth.

You can learn more about Jane Street and our technology from our main site, You can also look at a talk given at CMU about why Jane Street uses functional programming (, and our programming blog (

We also have extensive benefits, including:

  • 90% book reimbursement for work-related books
  • 90% tuition reimbursement for continuing education
  • Excellent, zero-premium medical and dental insurance
  • Free lunch delivered daily from a selection of restaurants
  • Catered breakfasts and fresh brewed Peet's coffee
  • An on-site, private gym in New York with towel service
  • Kitchens fully stocked with a variety of snack choices
  • Full company 401(k) match up to 6% of salary, vests immediately
  • Three weeks of paid vacation for new hires in the US
  • 16 weeks fully paid maternity/paternity leave for primary caregivers, plus additional unpaid leave

More information at

Open Culture: A Master List of 1,200 Free Courses From Top Universities: 40,000 Hours of Audio/Video Lectures

Image by Carlos Delgado, via Wikimedia Commons

For the past ten years, we’ve been busy rummaging around the internet and adding courses to an ever-growing list of Free Online Courses, which now features 1,200+ courses from top universities. Let’s give you the quick overview: The list lets you download audio & video lectures from schools like Stanford, Yale, MIT, Oxford and Harvard. Generally, the courses can be accessed via YouTube, iTunes or university web sites, and you can listen to the lectures anytime, anywhere, on your computer or smart phone. We haven’t done a precise calculation, but there’s about 40,000 hours of free audio & video lectures here. Enough to keep you busy for a very long time.

Right now you’ll find 146 free philosophy courses, 88 free history courses, 125 free computer science courses, 78 free physics courses and 55 free literature courses in the collection, and that’s just beginning to scratch the surface. You can peruse sections covering Astronomy, Biology, Business, Chemistry, Economics, Engineering, Math, Political Science, Psychology and Religion.

Here are some highlights from the complete list of Free Online Courses. We’ve added a few unconventional/vintage courses in the mix just to keep things interesting.

The complete list of courses can be accessed here: 1,200 Free Online Courses from Top Universities

Related Content:

700 Free Audio Books: Download Great Books for Free.

800 Free eBooks for iPad, Kindle & Other Devices.

1,150 Free Movies Online: Great Classics, Indies, Noir, Westerns, etc.

Learn 48 Languages Online for Free: Spanish, Chinese, English & More.

A Master List of 1,200 Free Courses From Top Universities: 40,000 Hours of Audio/Video Lectures is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

BOOOOOOOM!: Photographer Spotlight: Tom Craig


A selection of images from “The Bigger Picture” by photographer Tom Craig. More images below.

Planet Haskell: Don Stewart (dons): Haskell dev roles with Strats @ Standard Chartered

The Strats team at Standard Chartered is growing. We have 10 more open roles currently, in a range of areas:

  • Haskell dev for hedging-effectiveness analytics and building hedging services.
  • Haskell devs for derivatives pricing services. Generic roles using Haskell.
  • Web-experienced Haskell devs for frontends to analytics services written in Haskell. PureScript and/or data-viz and user-interface skills desirable
  • Haskell dev for trading algorithms and strategy development.
  • Dev/ops role to extend our continuous integration infrastructure (Haskell+git)
  • Contract analysis and manipulation in Haskell for trade formats (FpML + Haskell).
  • Haskell dev for low latency (< 100 microsecond) components in soft real-time non-linear pricing charges service.

You would join an existing team of 25 Haskell developers in Singapore or London. Generally our roles involve working directly with traders to automate their work and improve their efficiency. We use Haskell for all tasks, either GHC Haskell or our own (“Mu”) implementation, and this is a rare chance to join a large, experienced Haskell dev team.

We offer permanent or contractor positions, at Director and Associate Director level, with very competitive compensation. Demonstrated experience in typed FP (Haskell, OCaml, F#, etc.) is required.

All roles require some physical presence in either Singapore or London, and we offer flexibility within these constraints (with work from home available). No financial background is required or assumed.

More info about our development process is in the 2012 PADL keynote, and a 2013 HaskellCast interview.

If this sounds exciting to you, please send your PDF resume to me – donald.stewart <at>

Tagged: jobs

s mazuk: "All I can say is, it took me about ten years to learn how to write a story I knew was something like..."

“All I can say is, it took me about ten years to learn how to write a story I knew was something like what I wanted to write. In the sixty years since then I’ve learned how to do some more of what I’d like to do. But never all.”

- Ursula K. Le Guin, October 18th, 2015, Gollancz Festival (via ellenkushner)

Disquiet: Greg Davis & Keith Fullerton Whitman Live in 2002

Greg Davis and Keith Fullerton Whitman have posted a half-hour set from 2002, recorded on WVUM in Miami, Florida, during a spring tour from March and April of that year. It’s an early document for both. Davis released his first album, Arbor, in 2002, on the Carpark Records label. And while Whitman had been since the late 1990s making recordings (generally self-released), 2002 would also see the release of his breakthrough, Playthroughs, on Kranky.

According to the brief note on Davis’ SoundCloud account, “the live set was recently unearthed.” The two are heard “trading off playing live tracks.” These veer between gentle folktronic material from Davis (ruminative field recordings, guitar above a pixelated beat), and more frenetic, often IDM-flavored material from Hrvatski (rubbery breakbeats, scattered metric logistics). The tag-team approach is emblematic of their camaraderie.

I was fortunate to have seen them play when they hit New Orleans later that month at a show at the Mermaid Lounge. The full tour itinerary is archived at the microsound discussion list. It started at Bard College mid-way through March and ended in Montreal at La Sala Rossa toward the end of April. The microsound-list announcement humorously depicts the tour as a trio, splitting Whitman between his given name and his Hrvatski moniker. Here’s part of the announcement:

hello microsound listers. i’m going on tour starting tomorrow, will most likely end up in a town near you some time over the next few weeks. if you’re in the area come and say hello… -k

starting very soon: spring tour.

hrvatski (planet-mu, reckankreuzungsklankewerkzeuge).
greg davis (carpark, autumn).
keith fullerton whitman (kranky, apartment b).

hrvatski will be performing material from his forthcoming album swarm & dither (planet-mu).
greg davis will be performing material from his recently released debut album arbor (carpark).

keith fullerton whitman will be performing material from his forthcoming debut album playthroughs (kranky) on select dates.
greg davis and keith fullerton whitman will be performing material together as a duo on select dates (as they see fit).

If you want something to read while listening to the performances, I interviewed Davis later that year (“Woodshedding”), and Whitman in mid-2001 (“Army of One”).

Track originally posted at

Planet Haskell: Well-Typed.Com: Sharing, Space Leaks, and Conduit and friends

TL;DR: Sharing conduit values leads to space leaks. Make sure to disable the full laziness optimization in the module with your top-level calls to runConduit or ($$) (skip to the end of the conclusion for some details on how to do this). Similar considerations apply to other streaming libraries and indeed any Haskell code that uses lazy data structures to drive computation.


We use large lazy data structures in Haskell all the time to drive our programs. For example, consider

main1 :: IO ()
main1 = forM_ [1..5] $ \_ -> mapM_ print [1 .. 1000000]

It’s quite remarkable that this works and that this program runs in constant memory. But this stands on a delicate cusp. Consider the following minor variation on the above code:

ni_mapM_ :: (a -> IO b) -> [a] -> IO ()
{-# NOINLINE ni_mapM_ #-}
ni_mapM_ = mapM_

main2 :: IO ()
main2 = forM_ [1..5] $ \_ -> ni_mapM_ print [1 .. 1000000]

This program runs, but unlike main1, it has a maximum residency of 27 MB; in other words, this program suffers from a space leak. As it turns out, main1 was running in constant memory because the optimizer was able to eliminate the list altogether (due to the fold/build rewrite rule), but it is unable to do so in main2.

But why is main2 leaking? In fact, we can recover constant space behaviour by recompiling the code with -fno-full-laziness. The full laziness transformation is effectively turning main2 into

longList :: [Integer]
longList = [1 .. 1000000]

main3 :: IO ()
main3 = forM_ [1..5] $ \_ -> ni_mapM_ print longList

The first iteration of the forM_ loop constructs the list, which is then retained to be used by the next iterations. Hence, the large list is retained for the duration of the program, which is the aforementioned space leak.

The full laziness optimization is taking away our ability to control when data structures are not shared. That ability is crucial when we have actions driven by large lazy data structures. One particularly important example of such lazy structures that drive computation are conduits or pipes. For example, consider the following conduit code:

import qualified Data.Conduit as C

countConduit :: Int -> C.Sink Char IO ()
countConduit cnt = do
    mi <- C.await
    case mi of
      Nothing -> liftIO (print cnt)
      Just _  -> countConduit $! cnt + 1

getConduit :: Int -> C.Source IO Char
getConduit 0 = return ()
getConduit n = do
    ch <- liftIO getChar
    C.yield ch
    getConduit (n - 1)

Here countConduit is a sink that counts the characters it receives from upstream, and getConduit n is a conduit that reads n characters from the console and passes them downstream. Suppose we connect these two conduits and run them inside an exception handler that retries when an error occurs:

retry :: IO a -> IO a
retry io = catch io (\(_ :: SomeException) -> retry io)
main :: IO ()
main = retry $ C.runConduit $ getConduit 1000000 C.=$= countConduit 0

we again end up with a large space leak, this time of type Pipe and ->Pipe (conduit’s internal type):

Although the values that stream through the conduit come from IO, the conduit itself is fully constructed and retained in memory. In this blog post we examine what exactly is being retained here, and why. We will also suggest a simple workaround: it usually suffices to avoid sharing at the very top-level calls to runConduit or ($$). Note that these problems are not specific to the conduit library, but apply equally to all other similar libraries.

We will not assume any knowledge of conduit but start from first principles; however, if you have never used any of these libraries before, this blog post is probably not the best starting point; you might, for example, first want to watch my presentation Lazy I/O and Alternatives in Haskell.


Before we look at the more complicated case, let’s first consider another program using just lists:

main :: IO ()
main = retry $ ni_mapM_ print [1..1000000]

This program suffers from a space leak for similar reasons to the example with lists we saw in the introduction, but it’s worth spelling out the details here: where exactly is the list being maintained?

Recall that the IO monad is effectively a state monad over a token RealWorld state (if that doesn’t make any sense to you, you might want to read ezyang’s article Unraveling the mystery of the IO monad first). Hence, ni_mapM_ (just a wrapper around mapM_) is really a function of three arguments: the action to execute for every element of the list, the list itself, and the world token. That means that

ni_mapM_ print [1..1000000]

is a partial application, and hence we are constructing a PAP object. Such a PAP object is a runtime representation of a partial application of a function; it records the function we want to execute (ni_mapM_), as well as the arguments we have already provided. It is this PAP object that we give to retry, and which retry retains until the action completes because it might need it in the exception handler. The long list in turn is being retained because there is a reference from the PAP object to the list (as one of the arguments that we provided).

Full laziness plays no role in this example; whether or not the [1 .. 1000000] expression gets floated out makes no difference.

Reminder: Conduits/Pipes

Just to make sure we don’t get lost in the details, let’s define a simple conduit-like or pipe-like data structure:

data Pipe i o m r =
    Yield o (Pipe i o m r)
  | Await (Either r i -> Pipe i o m r)
  | Effect (m (Pipe i o m r))
  | Done r

A pipe or a conduit is a free monad which provides three actions:

  1. Yield a value downstream
  2. Await a value from upstream
  3. Execute an effect in the underlying monad.

The argument to Await is passed an Either; we give it a Left value if upstream terminated, or a Right value if upstream yielded a value.1

This definition is not quite the same as the one used in real streaming libraries and ignores various difficulties (in particular exception safety, as well as other features such as leftovers); however, it will suffice for the sake of this blog post. We will use the terms “conduit” and “pipe” interchangeably in the remainder of this article.
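To make the free-monad structure concrete, here is a sketch of a Monad instance. Note that it is written against a simplified variant in which Await receives a Maybe i (Nothing on upstream termination): with the Either version above, the upstream return type is entangled with the pipe’s own result type, which is precisely why the real conduit Pipe carries the extra type argument mentioned in footnote 1. The yields helper and example are illustrative only, not part of the original post:

```haskell
import Control.Monad (ap, liftM)

-- Simplified variant: Await receives Nothing when upstream terminates.
data Pipe i o m r =
    Yield o (Pipe i o m r)
  | Await (Maybe i -> Pipe i o m r)
  | Effect (m (Pipe i o m r))
  | Done r

instance Monad m => Functor (Pipe i o m) where
  fmap = liftM

instance Monad m => Applicative (Pipe i o m) where
  pure  = Done
  (<*>) = ap

instance Monad m => Monad (Pipe i o m) where
  -- bind grafts the continuation onto every Done leaf
  Yield o k >>= f = Yield o (k >>= f)
  Await k   >>= f = Await ((>>= f) . k)
  Effect m' >>= f = Effect (fmap (>>= f) m')
  Done r    >>= f = f r

-- the three primitive actions
yield :: o -> Pipe i o m ()
yield o = Yield o (Done ())

await :: Pipe i o m (Maybe i)
await = Await Done

lift' :: Functor m => m r -> Pipe i o m r
lift' act = Effect (Done <$> act)

-- collect the outputs of a pure, effect-free source (illustrative)
yields :: Pipe i o m r -> [o]
yields (Yield o k) = o : yields k
yields (Done _)    = []
yields _           = error "yields: not a pure source"

example :: Pipe i Char Maybe ()
example = yield 'a' >> yield 'b'
```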


The various Pipe constructors differ in their memory behaviour and the kinds of space leaks that they can create. We therefore consider them one by one. We will start with sources, because their memory behaviour is relatively straightforward.

A source is a pipe that only ever yields values downstream.2 For example, here is a source that yields the values [n, n-1 .. 1]:

yieldFrom :: Int -> Pipe i Int m ()
yieldFrom 0 = Done ()
yieldFrom n = Yield n $ yieldFrom (n - 1)

We could “run” such a pipe as follows:

printYields :: Show o => Pipe i o m () -> IO ()
printYields (Yield o k) = print o >> printYields k
printYields (Done ())   = return ()

If we then run the following program:

main :: IO ()
main = retry $ printYields (yieldFrom 1000000)

we get a space leak. This space leak is very similar to the space leak we discussed in section Lists above, with Done () playing the role of the empty list and Yield playing the role of (:).


A sink is a conduit that only ever awaits values from upstream; it never yields anything downstream.2 The memory behaviour of sinks is considerably more subtle than the memory behaviour of sources and we will examine it in detail. As a reminder, the constructor for Await is

data Pipe i o m r = Await (Either r i -> Pipe i o m r) | ...

As an example of a sink, consider this pipe that counts the number of characters it receives:

countChars :: Int -> Pipe Char o m Int
countChars cnt =
    Await $ \mi -> case mi of
      Left  _ -> Done cnt
      Right _ -> countChars $! cnt + 1

We could “run” such a sink by feeding it a bunch of characters; say, 10000000 of them:

feed :: Char -> Pipe Char o m Int -> IO ()
feed ch = feedFrom 10000000
  where
    feedFrom :: Int -> Pipe Char o m Int -> IO ()
    feedFrom _ (Done r)  = print r
    feedFrom 0 (Await k) = feedFrom 0     $ k (Left 0)
    feedFrom n (Await k) = feedFrom (n-1) $ k (Right ch)

If we run this as follows and compile with optimizations enabled, we once again end up with a space leak:

main :: IO ()
main = retry $ feed 'A' (countChars 0)

We can recover constant space behaviour again by disabling full laziness; however, the effect of full laziness on this example is a lot more subtle than the example we described in the introduction.

Full laziness

Let’s take a brief moment to describe what full laziness is, exactly. Full laziness is one of the optimizations that ghc applies by default when optimizations are enabled; it is described in the paper “Let-floating: moving bindings to give faster programs”. The idea is simple; if we have something like

f = \x y -> let e = .. -- expensive computation involving x but not y
            in ..

full laziness floats the let binding out over the lambda to get

f = \x -> let e = .. in \y -> ..

This potentially avoids unnecessarily recomputing e for different values of y. Full laziness is a useful transformation; for example, it turns something like

f x y = ..
  where
    go = .. -- some local function

into

f x y   = ..
f_go .. = ..

which avoids allocating a function closure every time f is called. It is also quite a notorious optimization, because it can create unexpected CAFs (constant applicative forms; top-level definitions of values); for example, if you write

nthPrime :: Int -> Int
nthPrime n = allPrimes !! n
  where
    allPrimes :: [Int]
    allPrimes = ..

you might expect nthPrime to recompute allPrimes every time it is invoked; but full laziness might move that allPrimes definition to the top-level, resulting in a large space leak (the full list of primes would be retained for the lifetime of the program). This goes back to the point we made in the introduction: full laziness is taking away our ability to control when values are not shared.
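Concretely, after the float-out, all calls to nthPrime share a single top-level list. The sketch below (using a naive, hypothetical allPrimes definition purely for illustration) shows the shape ghc may effectively produce:

```haskell
-- After full laziness, allPrimes is effectively a top-level CAF:
-- it is computed once and retained for the lifetime of the program,
-- instead of being recomputed on each call to nthPrime.
allPrimes :: [Int]
allPrimes = sieve [2 ..]
  where
    sieve (p : xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

nthPrime :: Int -> Int
nthPrime n = allPrimes !! n
```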

Full laziness versus sinks

Back to the sink example. What exactly is full laziness doing here? Is it constructing a CAF we weren’t expecting? Actually, no; it’s more subtle than that. Our definition of countChars was

countChars :: Int -> Pipe Char o m Int
countChars cnt =
    Await $ \mi -> case mi of
      Left  _ -> Done cnt
      Right _ -> countChars $! cnt + 1

Full laziness is turning this into something more akin to

countChars' :: Int -> Pipe Char o m Int
countChars' cnt =
    let k = countChars' $! cnt + 1
    in Await $ \mi -> case mi of
                        Left  _ -> Done cnt
                        Right _ -> k

Note how the computation of countChars' $! cnt + 1 has been floated over the lambda; ghc can do that, since this expression does not depend on mi. So in memory the countChars 0 expression from our main function (retained, if you recall, because of the surrounding retry wrapper), develops something like this. It starts off as a simple thunk:

Then when feed matches on it, it gets reduced to weak head normal form, exposing the top-most Await constructor:

The body of the await is a function closure pointing to the function inside countChars (\mi -> case mi ..), which has countChars $! (cnt + 1) as an unevaluated thunk in its environment. Evaluating it one step further yields

So where for a source the data structure in memory was a straightforward “list” consisting of Yield nodes, for a sink the situation is more subtle: we build up a chain of Await constructors, each of which points to a function closure which in its environment has a reference to the next Await constructor. This wouldn’t matter of course if the garbage collector could clean up after us; but if the conduit itself is shared, then this results in a space leak.

Without full laziness, incidentally, evaluating countChars 0 yields

and the chain stops there; the only thing in the function closure now is cnt. Since we don’t allocate the next Await constructor before running the function, we never construct a chain of Await constructors and hence we have no space leak.

Depending on values

It is tempting to think that if the conduit varies its behaviour depending on the values it receives from upstream the same chain of Await constructors cannot be constructed and we avoid a space leak. For example, consider this variation on countChars which only counts spaces:

countSpaces :: Int -> Pipe Char o m Int
countSpaces cnt =
    Await $ \mi ->
      case mi of
        Left  _   -> Done cnt
        Right ' ' -> countSpaces $! cnt + 1
        Right _   -> countSpaces $! cnt

If we substitute this conduit for countChars in the previous program, do we fare any better? Alas, the memory behaviour of this conduit, when shared, is in fact far, far worse.

The reason is that both the expression countSpaces $! cnt + 1 and the expression countSpaces $! cnt can be floated out by the full laziness optimization. Hence, every Await constructor will now have a function closure in its payload with two thunks, one for each alternative way to execute the conduit. What’s more, both of these thunks are retained for as long as we retain a reference to the top-level conduit.

We can neatly illustrate this using the following program:

main :: IO ()
main = do
    let count = countSpaces 0
    feed ' ' count
    feed ' ' count
    feed ' ' count
    feed 'A' count
    feed 'A' count
    feed 'A' count

The first feed ' ' explores a path through the conduit where every character is a space; so this constructs (and retains) one long chain of Await constructors. The next two calls to feed ' ' however walk over the exact same path, and hence memory usage does not increase for a while. But then we explore a different path, in which every character is a non-space, and hence memory behaviour will go up again. Then during the second call to feed 'A' memory usage is stable again, until we start executing the last feed 'A', at which point the garbage collector can finally start cleaning things up:

What’s worse, there is an infinite number of paths through this conduit. Every different combination of space and non-space characters will explore a different path, leading to combinatorial explosion and terrifying memory usage.


The precise situation for effects depends on the underlying monad, but let’s explore one common case: IO. As we will see, for the case of IO the memory behaviour of Effect is actually similar to the memory behaviour of Await. Recall that the Effect constructor is defined as

data Pipe i o m r = Effect (m (Pipe i o m r)) | ...

Consider this simple pipe that prints the numbers [n, n-1 .. 1]:

printFrom :: Int -> Pipe i o IO ()
printFrom 0 = Done ()
printFrom n = Effect $ print n >> return (printFrom (n - 1))

We might run such a pipe using3:

runPipe :: Show r => Pipe i o IO r -> IO ()
runPipe (Done r)   = print r
runPipe (Effect k) = runPipe =<< k

In order to understand the memory behaviour of Effect, we need to understand how the underlying monad behaves. For the case of IO, IO actions are state transformers over a token RealWorld state. This means that the Effect constructor actually looks rather similar to the Await constructor. Both have a function as payload; Await a function that receives an upstream value, and Effect a function that receives a RealWorld token. To illustrate what printFrom might look like with full laziness, we can rewrite it as

printFrom :: Int -> Pipe i o IO ()
printFrom n =
    let k = printFrom (n - 1)
    in case n of
         0 -> Done ()
         _ -> Effect $ IO $ \st -> unIO (print n >> return k) st

If we visualize the heap (using ghc-vis), we can see that it does indeed look very similar to the picture for Await:

Increasing sharing

If we cannot guarantee that our conduits are not shared, then perhaps we should try to increase sharing instead. If we can avoid allocating these chains of pipes, but instead have pipes refer back to themselves, perhaps we can avoid these space leaks.

In theory, this is possible. For example, when using the conduit library, we could try to take advantage of monad transformers and rewrite our feed source and our count sink as:

feed :: Source IO Char
feed = evalStateC 1000000 go
  where
    go :: Source (StateT Int IO) Char
    go = do
      st <- get
      if st == 0
        then return ()
        else do put $! (st - 1) ; yield 'A' ; go

count :: Sink Char IO Int
count = evalStateC 0 go
  where
    go :: Sink Char (StateT Int IO) Int
    go = do
      mi <- await
      case mi of
        Nothing -> get
        Just _  -> modify' (+1) >> go

In both definitions go refers back to itself directly, with no arguments; hence, it ought to be self-referential, without any long chain of sources or sinks ever being constructed. This works; the following program runs in constant space:

main :: IO ()
main = retry $ print =<< (feed $$ count)

However, this kind of code is extremely brittle. For example, consider the following minor variation on count:

count :: Sink Char IO Int
count = evalStateC 0 go
  where
    go :: Sink Char (StateT Int IO) Int
    go = withValue $ \_ -> modify' (+1) >> go

    withValue :: (i -> Sink i (StateT Int IO) Int)
              -> Sink i (StateT Int IO) Int
    withValue k = do
      mch <- await
      case mch of
        Nothing -> get
        Just ch -> k ch

This seems like a straightforward variation, but this code in fact suffers from a space leak again4. The optimized core version of this variation of count looks something like this:

count :: ConduitM Char Void (StateT Int IO) Int
count = ConduitM $ \k ->
    let countRec = modify' (+ 1) >> count
    in unConduitM await $ \mch ->
         case mch of
           Nothing -> unConduitM get      k
           Just _  -> unConduitM countRec k

In the conduit library, ConduitM is a codensity transformation of an internal Pipe datatype; the latter corresponds more or less to the Pipe datastructure we’ve been describing here. But we can ignore these details: the important point here is that this has the same typical shape that we’ve been studying above, with an allocation inside a lambda but before an await.

We can fix it by writing our code as

count :: Sink Char IO Int
count = evalStateC 0 go
  where
    go :: Sink Char (StateT Int IO) Int
    go = withValue goWithValue

    goWithValue :: Char -> Sink Char (StateT Int IO) Int
    goWithValue _ = modify' (+1) >> go

    withValue :: (i -> Sink i (StateT Int IO) Int)
              -> Sink i (StateT Int IO) Int
    withValue k = do
      mch <- await
      case mch of
        Nothing -> get
        Just ch -> k ch

Ironically, it would seem that full laziness here could have helped us by floating out that modify' (+1) >> go expression for us. The reason that it didn’t is probably related to the exact way the k continuation is threaded through in the compiled code (I simplified a bit above). Whatever the reason, tracking down problems like these is difficult and incredibly time consuming; I’ve spent many, many hours studying the output of -ddump-simpl and comparing before-and-after pictures. Not a particularly productive way to spend my time, and this kind of low-level thinking is not what I want to do when writing application level Haskell code!

Composed pipes

Normally we construct pipes by composing components together. Composition of pipes can be defined as

(=$=) :: Monad m => Pipe a b m r -> Pipe b c m r -> Pipe a c m r
{-# NOINLINE (=$=) #-}
_         =$= Done   r   = Done r
u         =$= Effect   d = Effect $ (u =$=) <$> d
u         =$= Yield  o d = Yield o (u =$= d)
Yield o u =$= Await    d = u =$= d (Right o)
Await   u =$= Await    d = Await $ \ma -> u ma =$= Await d
Effect  u =$= Await    d = Effect $ (=$= Await d) <$> u
Done  r   =$= Await    d = Done r =$= d (Left r)

The downstream pipe “is in charge”; the upstream pipe only plays a role when downstream awaits. This mirrors Haskell’s lazy “demand-driven” evaluation model.

Typically we only run self-contained pipes that don’t have any Awaits or Yields left (after composition), so we are only left with Effects. The good news is that if the pipe components don’t consist of long chains, then their composition won’t either; at every Effect point we wait for either upstream or downstream to complete its effect; only once that is done do we receive the next part of the pipeline and hence no chains can be constructed.

On the other hand, of course composition doesn’t get rid of these space leaks either. As an example, we can define a pipe equivalent to the getConduit from the introduction

getN :: Int -> Pipe i Char IO Int
getN 0 = Done 0
getN n = Effect $ do
           ch <- getChar
           return $ Yield ch (getN (n - 1))

and then compose getN and countChars to get a runnable program:

main :: IO ()
main = retry $ runPipe $ getN 1000000 =$= countChars 0

This program suffers from the same space leaks as before because the individual pipeline components are kept in memory. As in the sink example, memory behaviour would be much worse still if there were different paths through the conduit network.


At Well-Typed we’ve been developing an application for a client to do streaming data processing. We’ve been using the conduit library to do this, with great success. However, occasionally space leaks arise that are difficult to fix, and even harder to track down. Of course, we’re not the first to suffer from these problems; see for example ghc ticket 9520.

In this blog post we described how such space leaks arise. Similar space leaks can arise with any kind of code that uses large lazy data structures to drive computation, including other streaming libraries such as pipes or streaming, but the problem is not restricted to streaming libraries.

The conduit library tries to avoid these intermediate data structures by means of fusion rules; naturally, when this is successful the problem is avoided. We can increase the likelihood of this happening by using combinators such as folds etc., but in general the intermediate pipe data structures are difficult to avoid.

The core of the problem is that in the presence of the full laziness optimization we have no control over when values are not shared. While it is possible in theory to write code in such a way that the lazy data structures are self-referential and hence keeping them in memory does not cause a space leak, in practice the resulting code is too brittle and writing code like this is just too difficult. Just to provide one more example, in our application we had some code that looked like this:

go x@(C y _) = case y of
         Constr1 -> doSomethingWith x >> go
         Constr2 -> doSomethingWith x >> go
         Constr3 -> doSomethingWith x >> go
         Constr4 -> doSomethingWith x >> go
         Constr5 -> doSomethingWith x >> go

This worked and ran in constant space. But after adding a single additional clause to this pattern match, suddenly we reintroduced a space leak again:

go x@(C y _) = case y of
         Constr1 -> doSomethingWith x >> go
         Constr2 -> doSomethingWith x >> go
         Constr3 -> doSomethingWith x >> go
         Constr4 -> doSomethingWith x >> go
         Constr5 -> doSomethingWith x >> go
         Constr6 -> doSomethingWith x >> go

This was true even when that additional clause was never used; it had nothing to do with the change in the runtime behaviour of the code. Instead, when we added the additional clause some limit got exceeded in ghc’s bowels and suddenly something got allocated that wasn’t getting allocated before.

Full laziness can be disabled using -fno-full-laziness, but sadly this throws out the baby with the bathwater. In many cases, full laziness is a useful optimization. In particular, there is probably never any point allocating a thunk for something that is entirely static. We saw one such example above; it’s unexpected that when we write

go = withValue $ \_ -> modify' (+1) >> go

we get memory allocations corresponding to the modify' (+1) >> go expression.

Fortunately, there is a simple workaround. Any internal sharing in the conduit is (usually) fine, as long as we don’t retain the conduit from one run to the next. So it’s the argument to the top-level calls to runConduit or ($$) that we need to worry about (or the equivalent “run” functions from other libraries). This leads to the following recommendation:

Conduit code typically looks like

runMyConduit :: Some -> Args -> IO r
runMyConduit some args =
    runConduit $ stage1 some
             =$= stage2 args
             =$= stageN

You should put all top-level calls to runConduit into a module of their own, and disable full laziness in that module by declaring

{-# OPTIONS_GHC -fno-full-laziness #-}

at the top of the file. This means the computation of the conduit (stage1 =$= stage2 .. =$= stageN) won’t get floated to the top, and the conduit will be recomputed on every invocation of runMyConduit (note that this relies on runMyConduit having some arguments; if it doesn’t, you should add a dummy one).

It is not necessary to disable full laziness anywhere else. In particular, the conduit stages themselves (stage1 etc.) can be defined in modules where full laziness is enabled as usual.

There is a recent proposal for adding a pragma to ghc that might make it possible to disable full laziness on specific expressions, but for now the above is a reasonable workaround.

Addendum 1: ghc’s “state hack”

Let’s go back to the section about sinks; if you recall, we considered this example:

countChars :: Int -> Pipe Char o m Int
countChars cnt =
    let k = countChars $! cnt + 1
    in Await $ \mi -> case mi of
                        Left  _ -> Done cnt
                        Right _ -> k

feedFrom :: Int -> Pipe Char o m Int -> IO ()
feedFrom n (Done r)  = print r
feedFrom 0 (Await k) = feedFrom 0 $ k (Left 0)
feedFrom n (Await k) = feedFrom (n - 1) $ k (Right 'A')

main :: IO ()
main = retry $ feedFrom 10000000 (countChars 0)

We explained how countChars 0 results in a chain of Await constructors and function closures. However, you might be wondering, why would this be retained at all? After all, feedFrom is just an ordinary function, albeit one that computes an IO action. Why shouldn’t the whole expression

feedFrom 10000000 (countChars 0)

just be reduced to a single print 10000000 action, leaving no trace of the pipe at all? Indeed, this is precisely what happens when we disable ghc’s “state hack”; if we compile this program with -fno-state-hack it runs in constant space.

So what is the state hack? You can think of it as the opposite of the full laziness transformation; where full laziness transforms

     \x -> \y -> let e = <expensive> in ..    
~~>  \x -> let e = <expensive> in \y -> ..

the state hack does the opposite

     \x -> let e = <expensive> in \y -> ..
~~>  \x -> \y -> let e = <expensive> in ..    

though only for arguments y of type State# <token>. In general this is not sound, of course, as it might duplicate work; hence, the name “state hack”. Joachim Breitner’s StackOverflow answer explains why this optimization is necessary; my own blog post Understanding the RealWorld provides more background.

Let’s leave aside the question of why this optimization exists, and consider the effect on the code above. If you ask ghc to dump the optimized STG (-ddump-stg), and translate the result back to readable Haskell, you will realize that it boils down to a single line change. With the state hack disabled the last line of feedFrom is effectively:

feedFrom n (Await k) = IO $
    unIO (feedFrom (n - 1) (k (Right 'A')))

where IO and unIO just wrap and unwrap the IO monad. But when the state hack is enabled (the default), this turns into

feedFrom n (Await k) = IO $ \w ->
    unIO (feedFrom (n - 1) (k (Right 'A'))) w

Note how this floats the recursive call to feedFrom into the lambda. This means that

feedFrom 10000000 (countChars 0)

no longer reduces to a single print statement (after an expensive computation); instead, it reduces immediately to a function closure, waiting for its world argument. It’s this function closure that retains the Await/function chain and hence causes the space leak.

Addendum 2: Interaction with cost-centres (SCC)

A final cautionary tale. Suppose we are studying a space leak, and so we are compiling our code with profiling enabled. At some point we add some cost centres, or use -fprof-auto perhaps, and suddenly find that the space leak disappeared! What gives?

Consider one last time the sink example. We can make the space leak disappear by adding a single cost centre:

feed :: Char -> Pipe Char o m Int -> IO ()
feed ch = feedFrom 10000000
  where
    feedFrom :: Int -> Pipe Char o m Int -> IO ()
    feedFrom n p = {-# SCC "feedFrom" #-}
      case (n, p) of
        (_, Done r)  -> print r
        (0, Await k) -> feedFrom 0     $ k (Left 0)
        (_, Await k) -> feedFrom (n-1) $ k (Right ch)

Adding this cost centre effectively has the same result as specifying -fno-state-hack; with the cost centre present, the state hack can no longer float the computations into the lambda.


  1. The ability to detect upstream termination is one of the characteristics that sets conduit apart from the pipes package, in which this is impossible (or at least hard to do). Personally, I consider this an essential feature. Note that the definition of Pipe in conduit takes an additional type argument to avoid insisting that the type of the upstream return value matches the type of the downstream return value. For simplicity I’ve omitted this additional type argument here.

  2. Sinks and sources can also execute effects, of course; since we are interested in the memory behaviour of the individual constructors, we treat effects separately.

  3. runPipe is (close to) the actual runPipe we would normally use; we connect pipes that await or yield into a single self-contained pipe that does neither.

  4. For these simple examples actually the optimizer can work its magic and the space leak doesn’t appear, unless evalStateC is declared NOINLINE. Again, for larger examples problems arise whether it’s inlined or not.

Embedded in Academia: Advanced Compilers Weeks 3-5

This continues a previous post.

We went through the lattice theory and introduction to dataflow analysis parts of SPA. I consider this extremely good and important material, but I’m afraid that the students looked pretty bored. It may be the case that this material is best approached by first looking at practical aspects and only later going into the theory.

One part of SPA that I’m not super happy with is the material about combining lattices (section 4.3). This is a useful and practical topic but the use cases aren’t really discussed. In class we went through some examples, for example this function that cannot be optimized by either constant propagation or dead code elimination alone, but can be optimized by their reduced product: conditional constant propagation. Which, as you can see, is implemented by both LLVM and GCC. Also, this example cannot be optimized by either sign analysis or parity analysis, but can be optimized using their reduced product.

We didn’t go into them, but I pointed the class to the foundational papers for dataflow analysis and abstract interpretation.

I gave an assignment to implement subtract and bitwise-and transfer functions for the interval abstract domain for signed 5-bit intervals. Their subtract had to be correct and maximally precise — about half of the class accomplished this. Their bitwise-and had to be correct and more precise than always returning top, and about half of the class accomplished this as well (a maximally precise bitwise-and operator for intervals is not at all easy — try it!). Since not everyone got the code right, I had them fix bugs (if any) and resubmit their code for this week. I hope everyone will get it right this time! Also I will give prizes to students whose bitwise-and operator is on the Pareto frontier (out of all submitted solutions) for throughput vs precision and code size vs precision.
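To make the subtract half of the exercise concrete, here is a minimal Python sketch (my own, not the assignment's reference solution) of a sound and maximally precise subtract transfer function for signed 5-bit intervals, assuming wrapping two's-complement semantics and representing an interval as a (lo, hi) pair over [-16, 15]; the helper name wrap5 is mine:

```python
def wrap5(x):
    """Wrap an integer into the signed 5-bit range [-16, 15] (two's complement)."""
    return ((x + 16) % 32) - 16

TOP = (-16, 15)  # the interval covering every 5-bit value

def sub_transfer(a, b):
    """Transfer function for wrapping subtraction on signed 5-bit intervals.

    a and b are (lo, hi) pairs with lo <= hi. Returns the tightest interval
    containing {wrap5(x - y) | x in a, y in b}.
    """
    lo = a[0] - b[1]          # smallest possible unwrapped difference
    hi = a[1] - b[0]          # largest possible unwrapped difference
    if hi - lo >= 32:         # the differences cover all 32 values
        return TOP
    wlo, whi = wrap5(lo), wrap5(hi)
    if wlo <= whi:            # the wrapped range is still contiguous
        return (wlo, whi)
    return TOP                # the range wraps around; an interval can't express that
```

Exhaustively comparing this against a brute-force enumeration of all concrete differences is the natural way to check both soundness and maximal precision.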

We looked at the LLVM implementation of the bitwise domain (“known bits”, they call it), which lives in ValueTracking.cpp. This analysis doesn’t have a real fixpoint computation; rather, it simply walks up the dataflow graph recursively, which is a bit confusing since it is a forward dataflow analysis that looks at nodes in the backward direction. The traversal stops at depth 6 and isn’t cached, so the code is really very easy to understand.
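The known-bits transfer function for a bitwise AND is simple enough to state in a few lines. Here is a sketch of the idea in Python rather than LLVM's actual C++ (the pair of known-zero/known-one masks is how the LLVM analysis represents its facts, but the function and variable names here are mine): a result bit is known zero if it is known zero in either operand, and known one only if it is known one in both.

```python
def and_known_bits(zero1, one1, zero2, one2):
    """Known-bits transfer function for bitwise AND.

    zero*/one* are bitmasks: a set bit in zero_i means that bit of operand i
    is definitely 0; a set bit in one_i means it is definitely 1.
    Returns (known_zero, known_one) for the AND of the two operands.
    """
    known_zero = zero1 | zero2   # a 0 in either operand forces a 0 in the result
    known_one = one1 & one2      # both operands must be 1 for the result bit to be 1
    return known_zero, known_one
```

Soundness is easy to check by enumerating every concrete value consistent with each mask pair and verifying that each concrete AND agrees with the computed known bits.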

We started to look at how LLVM works, I went partway through some lecture notes by David Chisnall. We didn’t focus on the LLVM implementation yet, but rather looked at the design, with a bit of focus on SSA, which is worth spending some time on since it forms the foundation for most modern compilers. I had the students read the first couple of chapters of this drafty SSA book.

Something I’d appreciate feedback on is what (besides SSA) have been the major developments in ahead-of-time compiler technology over the last 25 years or so. Loop optimizations and vectorization have seen major advances of course, as have verified compilers. In this class I want to steer clear of PL-level innovations.

Finally, former Utah undergrad and current Googler Chad Brubaker visited the class and gave a guest lecture on UBSan in production Android: very cool stuff! Hopefully this motivated the class to care about using static analysis to remove integer overflow checks, since they will be doing assignments on that very topic in the future.

Penny Arcade: News Post: Want to work on PAX?

Tycho: There are a couple opportunities to PAX it up with our partner ReedPOP.  Here are two jobs in Norwalk, Connecticut, which you should absolutely check out: PAX Content Manager PAX Content Coordinator What is the difference between a Content Manager and a Content Coordinator?  Good question; it’s all in the links.  This next job is not in Norwalk, CT.  In fact it is very far away from there, because it is for PAX Aus (which is fast approaching and you should buy tickets!): PAX Aus Sales Manager (Sydney) Let’s make some shows. (CW)TB

All Content: A Satire, Serious Melodrama, Rock Musical, Comedy, Violent Exploitation Picture, Skin Flick and Moralistic Expose: On "Beyond the Valley of the Dolls"


“When I’m with you, pussycat, who needs grass?” “Porter, you have an unending capacity for counterfeit astonishment.” “Come into my den, said the spider, et cetera.” “This is my happening, and it freaks me out!” 

If nothing else, Russ Meyer’s 1970 exploitation epic "Beyond the Valley of the Dolls," about a trio of young female rock-and-rollers who move to Los Angeles and get embroiled in romance, drugs and a fight over a will, is a treasure trove of quotable lines, most of them courtesy of this site’s founder, Roger Ebert. It’s out on Criterion Blu-ray this week in a disc loaded with extras, including an essay by contributor Glenn Kenny, a 2003 audio commentary by Ebert from the regular-resolution disc, a set of five documentaries about the making of the movie, and a 1988 episode of "The Incredibly Strange Film Show" about the career of Russ Meyer.

Ebert wrote the screenplay for "Beyond the Valley of the Dolls" at the request of Meyer, a mammary-obsessed filmmaker he’d gotten to know personally. Ebert was just 27 when he finished the script. It was a tricky assignment because he had to devise something in the spirit of the original "Valley of the Dolls" (which not-coincidentally is being released simultaneously on Blu-ray by Criterion, with extras that include a video essay by contributor Kim Morgan). 

The original is an alternately scuzzy, solemn and ridiculous trash-fest based on Jacqueline Susann’s bestselling novel. The original adaptation of the book made a mint; the film’s releasing studio, Twentieth Century Fox, wanted a sequel, and had the rights to make one whether or not Susann had a source novel to base it on, and regardless of whether she approved of the story. Like a lot of the major studios at that time, Fox was hemorrhaging money and pinned its survival to the youth market. The surprise success of “Easy Rider,” an independent film that grossed 30 times its production cost, sent Fox and its competitors scrambling to find the newest hippie-flattering exploitation flick. Of course, most of the studios’ efforts were tone-deaf and did not connect with their intended viewers; but that didn’t stop them from greenlighting more projects in that vein. 

The biggest was "Beyond the Valley of the Dolls," a CinemaScope spectacular filmed on a series of huge sets and stocked with gyrating actresses, models, agents, managers, musicians and uncategorizable oddballs. The lapels are wide, the skirts short, the pneumatic breasts enormous. Meyer and Ebert screened the 1967 film of "Valley of the Dolls" as preparation but without reading the novel; their mandate was to work with the same formula and tell a new-ish story about three young women (in this case, rock musicians played by Dolly Read, Cynthia Myers and Marcia McBroom) who go to Hollywood and are plunged into a world of decadence. There are a few characters from "Beyond the Valley of the Dolls" who are essentially "Valley of the Dolls" characters with names and key details changed, but for the most part the milieu is original—and the tone definitely is. The dialogue’s super-arch, "All About Eve"-like tone, coupled with the story’s knowing blend of cruelty and innocence, is more reminiscent of Meyer’s "Faster, Pussycat! Kill! Kill!" than anything Susann wrote. 

Ebert said Meyer wanted a movie that could “simultaneously be a satire, a serious melodrama, a rock musical, a comedy, a violent exploitation picture, a skin flick and a moralistic expose.” No lack of ambition there. The result stays dirty but essentially light right up to the end, a ghastly-brutal home invasion improvised during production and modeled on the Manson murders that happened shortly before production began, in locations not too far from where "Beyond the Valley of the Dolls" was shot. Susann was not amused. After "Beyond the Valley of the Dolls" hit screens, she sued and won a settlement, arguing (improbably, given the spectacular nastiness of her own fiction) that Ebert and Meyer’s sequel had ruined the integrity of her brand.

Despite period details that were already behind-the-curve when the movie came out (Hollywood is always late to the party when it panders to youth culture) and a certain moralistic undertone characteristic of Meyer, "Beyond the Valley of the Dolls" is still a gas, thanks mainly to its heedless energy and its “are they kidding or not?” tone. According to Ebert, Meyer instructed the actors to deliver the script’s outrageous dialogue with a straight face, and apparently succeeded in convincing them that he was trying to make a serious statement about something or other. As Roger Ebert wrote in a 1980 appreciation of the movie published on its tenth anniversary: 

Meyer directed his actors with a poker face, solemnly, discussing the motivations behind each scene. Some of the actors asked me whether their dialogue wasn't supposed to be humorous, but Meyer discussed it so seriously with them that they hesitated to risk offending him by voicing such a suggestion. The result is that "BVD" has a curious tone all of its own. There have been movies in which the actors played straight knowing they were in satires, and movies that were unintentionally funny because they were so bad or camp. But the tone of "BVD" comes from actors directed at right angles to the material.

The right angles proved to be the right angle. The peculiar, slightly baffling tone means that the movie works even when it isn’t working—and there are many elements that shouldn’t work in any environment save a David Lynch film, in particular the bloody finale and a related, totally unmotivated twist which reveals that a character you thought was a man was actually a woman. It helps that this is a terrifically well-made piece of trash. The light-speed editing of "Beyond the Valley of the Dolls" (some of which was meant to cover up bad performances) feels shockingly modern. Ebert’s knowingly unnatural dialogue lodges in the brain like song lyrics from a satirical musical. The whole contraption knows what it is—an experience, not a statement—and what pop cultural vein it’s operating in (Ebert’s screenwriting credit appears over a shot of a big-breasted woman with a gun in her mouth). 

The characters manage to be excitingly weird even when the performances are hit-and-miss. Some of them are conceived at right angles to reality. Ebert and Meyer based many of the key personages, including the rock band’s manager, Ronnie "Z-Man" Barzell (John LaZar) and heavyweight champ Randy Black (James Ingelhart) on major celebrities (Phil Spector and Muhammad Ali respectively) without ever meeting them; it was more a loopy notion of send-up than an attempt at substantive commentary. (Z-Man speaks in iambic pentameter that mashes up at least a dozen Shakespeare plays.) A certain bruised innocence comes through even when the movie is at its most facetious. It’s as if Meyer is in denial that there’s a Pollyanna buried underneath the layers of raunch that he wore as protective armor; in odd, fleeting moments, this movie let us see it.

To purchase your Criterion Collection Blu-ray or DVD of "Beyond the Valley of the Dolls," click here

Ideas from CBC Radio (Highlights): Changing the System

Artists are visionary, and their work often anticipates tectonic shifts in the future social landscape. But what relationship does art have with social change? What obligations, if any, do artists have to foster social justice? An AGO Creative Minds event

Penny Arcade: News Post: Success And Its Opposite

Tycho: Gabe is playing an Xbox game at home, alone, and without compulsion - specifically, Forza Horizon 3.  I don’t know what caused this to happen, but it’s happening and we have to deal with that.  Specifically, I have to deal with that. I grabbed a copy after being coated in his effusions, and with a jaunty spirit, named my international racing mogul Sloth.  In my mind I can see his business card, and underneath where it says Sloth is a curl of italicized cursive that says “It’s Ironic.” I want to know way, way more before I talk about it, but the…

Planet Haskell: Michael Snoyman: Respect

As I'm sure many people in the Haskell community have seen, Simon PJ put out an email entitled "Respect". If you haven't read it yet, I think you should. As is usually the case, Simon shows by example what we should strive for.

I put out a Tweet referring to a Gist I wrote two weeks back. At the time, I did not put the content on this blog, as I didn't want to make a bad situation worse. However, especially given Simon's comments, now seems like a good time to put out this message in the same medium (this blog) that the original inflammatory messaging came out in:

A few weeks back I wrote a blog post (and a second clarifying post) on what I called the Evil Cabal. There is no sense in repeating the content here, or even referencing it. The title is the main point.

It was a mistake, and an offensive one, to use insulting terms like evil in that blog post. What I said is true: I have taken to using that term when discussing privately some of the situation that has occurred. I now see that that was the original problem: while the term started as a joke and a pun, it set up a bad precedent for conversation. I should not have used it privately, and definitely should not have publicized it.

To those active members in projects I maligned, I apologize. I should not have brought the discourse to that level.

All Content: Amazon's "Crisis in Six Scenes" is for Woody Allen Completists Only


If it weren't already clear where Woody Allen stands on the current debate of film vs. TV, you'd just need to sit through an episode of his new Amazon series, "Crisis in Six Scenes." In the six-part venture available on Friday, the writer/director also stars as an insignificant novelist named Sidney in the 1960s who is now slumming it by pitching a TV show. His character later refers to the project as "that idiotic television thing," and even his barber (Max Casella) agrees that TV isn't as highbrow as his bad novels, but suggests that Sidney should let himself be "humiliated all the way to the bank." This low standard for TV informs the entirety of “Crisis in Six Scenes,” which is for Woody Allen completists only. In an ugly, miserable relationship, the narrative requirements of TV hate him back by reducing his masterful skills of character, dialogue and pacing to staggering weaknesses.

Continuing Allen's parallels with author Philip Roth, the series is like a rinky-dink American Pastoral, sharing the novel's prevalent themes of a revolutionized 1960s America, a dysfunctional household, and a young woman who shatters everyone's life with her radicalism. The thin plot involves Allen's Sidney character receiving an unexpected visit from a family friend's daughter, Lennie (Miley Cyrus), who is hiding from the cops after committing domestic terrorism. While at the home of Sidney and his marriage therapist wife Kay (Elaine May), she starts to intrigue another family friend, John Magaro’s uptight Alan, who is set to marry his fiancée Ellie (Rachel Brosnahan). Lennie gives Alan and Kay some Communist literature that widens their political perspectives in ways Allen likes to joke about—the punchline being in privileged capitalists becoming unwittingly radicalized—which has Alan falling for Lennie, and Kay talking lovingly about Mao and Marx with her book club (which includes actresses like Joy Behar).

Particularly by the first episode’s anticlimactic cliffhanger, the initially intriguing element of seeing a modern Allen placed in the period that his career started in soon fades. It becomes apparent that Allen has made a 140-minute show, but not focused on what gives a series its oxygen—characters. He relies instead on the same amount of plot you’d get from an 85-minute script, filled with lots of wasted gabbing. As the pacing gets worse, you start to see how characters bicker for longer than usual about nothing, or how everything seems aimed to lose a few minutes here while debating whether to get out of bed, or a few seconds there to talk about clam chowder. 

Allen embodies this with his own character, who spends a lot of time on-screen but has very little arc. Like the people he shares numerous empty exchanges with, he only becomes more and more slippery—never more interesting. Other performances suffer, like intriguing turns from Elaine May and John Magaro, the latter offering an amusing wink playing a character named Alan whose plans blow up in his face. But grating performances, like from Miley Cyrus’ very turgid line-reading, become even worse. Allen’s revealing failure is in treating TV narratives as merely a longer-than-feature runtime to be stretched out, instead of offering the scope and depth of characters that makes TV viewers invested. This is certainly a show made by a film director who still thinks like his character Alvy from 1977's "Annie Hall," who spoke about how people in Beverly Hills don't throw away their garbage—“they turn it into television shows." 

There's an air of irritated obligation that can be felt in every single shot of “Crisis of Six Scenes.” It’s in how each scene plays out with the longest take possible, showing off capable improvisation far more than nuanced character; in how when two people have their own medium shots during a flat conversation they’re connected with a call-and-response energy. Though Allen is no stranger to this type of straightforward directing, he allows TV to make it look entirely plain; vindictively knocking us back from the work of previous cinematography collaborator Vittorio Storaro and his sumptuous “Café Society” cinematography to Eigil Byrd’s point-and-shoot work here, the series is further transparent in its anti-inspiration. It’s the production design by Carl Sprague that offers any type of texture, with great pieces of furniture and costuming in Sidney and Kay's house, providing what character this series often never has. 

But there is an unfortunate nature to this transparent crisis. In 2015, when Allen talked about the project at Cannes he called it “a catastrophic mistake.” That quote essentially functions as the project's thesis, from a filmmaker who is no stranger to off-films or shoulder-shrugging titles for them ("Whatever Works," "Anything Else") but has never felt before like he’s merely trying to make a product. Though it may have helped Allen get financing from the streaming giant for his current film projects, “Crisis in Six Scenes” is a definitive waste of existence, for its talent and its audience. I’m certain that Allen would be the first to agree with me. 

Quiet Earth: Fantastic Fest 2016: SCIENCE FICTION, VOL. 1 THE OSIRIS CHILD Review

If you ask a modern filmmaker to name some of the inspirations for their latest science fiction opus, chances are high he or she won't mention the likes of Circuitry Man, Space Truckers, Slipstream, or Fortress. Chances are also high that if you have any affinity whatsoever for 80's and 90's sci-fi B-movies you will, while watching the latest film from director Shane Abbess, flash back to at least one of these or a host of other hovering-just-below-the-radar flicks, the kind that once dominated the shelves of rental stores.

Science Fiction Vol.1: The Osiris Child takes place in a future where mankind has begun colonizing the galaxy, establishing corporate/military bases on each newly terraformed planet. Lt. Kane Sommerville (Daniel MacPherson), demot [Continued ...]

Colossal: This Solemn Forest Chapel in Japan Imitates Two Hands Clasped in Prayer


Located in a forest just beyond a nondenominational cemetery sits the Sayama Forest Chapel, a three-year-old building designed by Hiroshi Nakamura & NAP (previously). From a bird’s eye view the chapel appears to form both a star and two hands pressed together in prayer, which is a traditional Japanese structural form called “Gassho-zukuri.”

“For those who are in deep grief and inconsolable, how can architecture nurture them? With this in mind, I designed buildings that gently surround them and support their intentions,” explained Nakamura to Yellowtrace.

The building was also built in a way that promotes growth around its exterior, with walls tilted inward to leave room for the forest to grow around its shape. The chapel’s floor and the patterns of its slate also lean toward the forest, subtly asking visitors to concentrate their minds on the surrounding elements of nature.

The chapel was named as a winner in the religious buildings and memorials category in this year’s Architizer A+Awards, an awards program that celebrates the year’s best in architecture and products. (via Yellowtrace)


Quiet Earth: Fantastic Fest 2016: THE CREW is One of the Best Crime Movies of the Modern Era [Review]

Guns. Drugs. Passports. All common currency on the black market. Getting any of those in your hands is going to require the services of a professional like Yanis: a slender, laconic man of exemplary discipline who plays the game wisely and launders his money through legal fronts. Keeping a low profile means making sure no attention gets drawn to his operation, which proves to be difficult when in the company of a pillhead and a fast-and-loose little brother.

One careless misstep gets them in deep with the wrong people and they have to use their exceptional skills to dig their way back out. It’s a showdown of opposite camps of criminals: finesse and restraint versus chaotic brutality. Every decision they make takes them farther down a path of increasingly ruthless peril.

Easily [Continued ...]

Quiet Earth: Fantastic Fest 2016: MISS PEREGRINE'S HOME FOR PECULIAR CHILDREN Review

A remote, gorgeous mansion that offers safe haven to children with special powers. A kindly guardian who protects these children, not only from the outside world but from hostile adults with nefarious intentions and their own blend of special powers. A young man who seeks out this collection of beautiful freaks in the hopes of learning more about his own past. Sound familiar? "Tim Burton's X Men" aka Miss Peregrine's Home for Peculiar Children puts the ole Burtonesque spin on a different superhero formula (the first being Batman, of course), and if it sounds like I'm bein [Continued ...]

OUR VALUED CUSTOMERS: Fall is here! (From the OVC Archive!)

Colossal: A Skeleton of Found Roots and Tree Limbs Heralds the Beginning of Fall in Italy


In this 2012 installation, street artist Never2501 assembled a variety of found vegetation to form an eerie skeleton at the base of some steps in the idyllic gardens of the Museo Archeologico Paolo Giovio in Como, Italy. The piece was titled “In Cammino Per Trasformarsi Nell’istante Presente” (Moving to Transform into the Present) and could be interpreted as a harbinger of the seasons with the decaying root stumps and limbs pulled from a nearby forest, fit together without aid of any additional materials. Or maybe it’s just an incredibly disturbing thing to stumble onto when walking through the woods? You can see more photos of the temporary piece here, and follow Never2501’s more recent work on Instagram. (via This Isn’t Happiness, StreetArtNews)



Quiet Earth: UNDER THE SHADOW is Confident and Chilling [Review]

This feature debut from Iranian-born writer/director Babak Anvari is a remarkably accomplished and confident supernatural thriller, full of tension, fear and anxiety. The film uses the setting of 1980s Tehran, during the Iraq-Iran conflict, to examine a family in crisis, attempting to cope during Saddam’s missile bombing campaign of the city as something unearthly and sinister begins haunting their apartment.

Working on several levels and featuring excellent performances, it’s a striking, genuinely chilling piece of work, and a very easy film to recommend. While the backdrop is one of chaos, war and conflict, the underlying motifs are of guilt, anxiety and frustration, all manifest in the appearance of something more insidious than any invading army or dictator, something [Continued ...]

Daniel Lemire's blog: Sorting already sorted arrays is much faster?

If you are reading a random textbook on computer science, it is probably going to tell you all about how good sorting algorithms take linearithmic time. To arrive at this result, they count the number of operations. That’s a good model to teach computer science, but working programmers need more sophisticated models of software performance.

On modern superscalar processors, we expect in-memory sorting to be limited by how far ahead the processor can predict where the data will go. Though moving the data in memory is not free, it is a small cost if it can be done predictably.

We know that sorting “already sorted data” can be done in an easy-to-predict manner (just do nothing). So it should be fast. But how much faster is it than sorting randomly shuffled data?

I decided to run an experiment.

I use arrays containing one million distinct 32-bit integers, and I report the time in CPU cycles per value on a Haswell processor. I wrote my code in C++.

function     sorted data   shuffled data   sorted in reverse
std::sort    38            200             30

For comparison, it takes roughly n log(n) comparisons to sort an array of size n in the worst case with a good algorithm. In my experiment, log(n) is about 20.

The numbers bear out our analysis. Sorting an already-sorted array takes a fraction of the time needed to sort a shuffled array. One could object that sorting already-sorted arrays is fast simply because we do not have to move the data so much. So I also included initial arrays that were sorted in reverse. Interestingly, std::sort is even faster with reversed arrays! This is clear evidence for our thesis.

(The C++ source code is available. My software includes timsort results if you are interested.)
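For readers who want to reproduce the shape of the experiment without the C++ harness, here is a rough Python sketch (my own, not the linked code). One caveat: Python's list.sort is Timsort, which detects pre-sorted runs and finishes in roughly linear time on sorted or reversed input, so the gap you see here mixes that algorithmic effect with the branch-prediction effect discussed above:

```python
import random
import time

def best_sort_time(data, repeats=3):
    """Best-of-N wall-clock time to sort a fresh copy of `data`."""
    best = float("inf")
    for _ in range(repeats):
        copy = list(data)            # sort a copy so every run sees the same input
        start = time.perf_counter()
        copy.sort()
        best = min(best, time.perf_counter() - start)
    return best

n = 500_000
ascending = list(range(n))
descending = ascending[::-1]
shuffled = list(range(n))
random.shuffle(shuffled)

for name, data in [("sorted", ascending), ("reversed", descending), ("shuffled", shuffled)]:
    print(f"{name:>8}: {best_sort_time(data):.4f} s")
```

On a typical machine the sorted and reversed inputs come out far faster than the shuffled one, echoing the table above.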

All Content: Deepwater Horizon


When Peter Berg finds a subject he's passionate about, one can usually see that passion reflected in the final product. Whatever issues one might have with “Friday Night Lights” or “Lone Survivor,” they’re good examples of films that Berg put his all into. His latest, “Deepwater Horizon,” which just debuted at the Toronto International Film Festival, about the horrendous 2010 oil rig disaster that cost eleven lives and stands as the worst ecological incident in this country’s history, started production with a different director, J.C. Chandor (“All is Lost”). After mysterious creative differences forced him to leave the project, Berg jumped in, and the result is a film that too often functions like a contractual obligation (and it’s telling that Berg jumped immediately to another project coming out in only a few months, “Patriots Day,” about the 2013 Boston bombing). “There had to eventually be a movie about this tragedy, so let’s just get this thing over with.” As is often the case with Berg’s films, it’s technically accomplished, but it’s lacking the depth of a project that comes from a creative spark. Everything here feels routine—more like an inevitability than a work of art or even a piece of entertainment.

“Deepwater Horizon” is essentially a two-act piece: “Act 1: Meet the Crew,” “Act 2: Watch the Disaster.” So, we get a little bit of time with Mike Williams (Mark Wahlberg) and his family, including wife Felicia (Kate Hudson), before he heads off to work at his admittedly very unusual job on an oil-drilling rig in the Gulf of Mexico. We meet one of the elder statesmen of the rig, “Mr. Jimmy” (Kurt Russell), and one of its younger employees, Andrea Fleytas (Gina Rodriguez). Of course, there are also a few nefarious BP executives floating around, including the remarkably villainous Donald Vidrine (John Malkovich).

The first half of “Deepwater Horizon” is filled with a surprising amount of tech speak—lots of conversations about PSI and arguments over the horrendous state of the equipment on the rig. There’s something admirable about getting the technical details right in a film like this one, but it makes for a very dry set-up. We don’t really feel like we’re getting to know the characters in any memorable way outside of what they were doing the day everything went wrong. When things get intense later on, our connection to them runs no deeper than a standard hope for their survival. One never feels like they’re watching Mike and Mr. Jimmy try to get people to safety so much as watching Mark Wahlberg and Kurt Russell navigate some impressive stunt work.

And it is impressive. When things go wrong on the Deepwater Horizon, it’s not a minor problem—it is really a vision of Hell with water, mud, oil and eventually fire literally everywhere. To be honest, the fact that anyone survived is pretty remarkable, and Berg and his technical team know how to make an effective disaster movie that will pin viewers to their seats long enough that they may more easily forgive the film’s flaws. Fires boom, metal creaks and bodies get tossed around in terrifying ways. It’s impressive, but ultimately hollow. “Deepwater Horizon” too often feels like a relative of “Battleship” in that it’s more concerned with set pieces and action blockbuster stunt work than it is with the people involved. It’s also a film that really muddles the geography and timing of what happened and when. The best section of the movie is when the explosion first happens, when water and mud spray the chamber, and one can still sense committed men trying to fix it before it goes horribly wrong. After they’re unable to do that, it’s too often an indistinct blur of noise and fire.

BP cut corners, and people died. Now how do we tell that story in an entertaining way while also making a piece of art that gives back to a community impacted by the tragedy? Tough question, and one the film never pulls off. There’s a more ambitious version of “Deepwater Horizon” that goes deeper into the lives of those on that rig (maybe more than just the fateful day would have helped). “Deepwater Horizon” gets the explosions just right, but it’s everything around them—the people, the aftermath, the tragedy—that it misses. That’s the more challenging story—the one that places the fire and oil in a context that allows us to see its impact beyond the cost of human life. Peter Berg could have made that film. But maybe he needed to be there from the beginning to do so.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - A New Debate Format

Networks: I am prepared to direct this pilot at a moment's notice.

New comic!
Today's News:



I had to stop the experiment for a bit because I wasn't getting much reading done while finishing up a project. But, I'm back on track, and I'm going to try to run weekly book reviews from here on out.


For transparency: If you click the links below, for the next 24 hours, if you buy something on amazon, SMBC gets a small payment. So, basically, if one of the books below looks tempting, we'd appreciate if you clicked the link before buying.


This is a new thing we're doing, so if you have any thoughts on how it can be improved, please let us know.



Aug 17 - Shocked (Casarett)


A quick bit of history and science on the topic of bringing humans back from the brink of death. It’s a quick read, with a lot of dorky humor injected. It’s not bad, and there are a lot of neat stories and weird science, but it kind of felt like it was Not Quite Mary Roach. Still, fun if you’re interested in the topic.


Verdict: 3/5


Aug 20 - Red Notice (Browder)


An INSANE memoir about Browder’s life in high finance, going on adventures making crazy deals as the Soviet Union collapses and breaks apart. Making deals required him to go to strange places and repeatedly risk being assassinated. He sounds like he’s totally nuts, but it’s a hell of a story.


Verdict: 4/5


Aug 25 - The Conundrum (Owen)


An interesting book. It starts with the economic observation that efficiency (which theoretically is good for the environment) often leads to increased consumption (which is probably bad for the environment). Owen suggests that if efficiency tends to lead us to consume more, the only way to save the environment is to reduce our lifestyles.


I don’t buy everything he’s saying, but I think it’s a very interesting argument that’s worth confronting.


Verdict: 3.5/5


Aug 27 - Woe to Live On (Woodrell)


A great novel about a group of Confederate soldiers, written with a lot of realism and depth. An interesting feature of this book is that it’s stylistically very 19th century, yet it contains the violence and sex that tend to be elided in actual books from that period.


Verdict: 5/5


Sept 1 - How Everything Became War and the Military Became Everything (Brooks)


This was a pretty excellent book. It’s a sort of combination of a memoir of Rosa Brooks’ time in the Pentagon, along with the history and psychology of how (according to her) our relationship to war has changed for the worse in recent times.


Verdict: 3.5/5


Sept 3 - White Trash (Isenberg)


A good historiography of poor whites in the history of the United States, and how they tend to be viewed by cultural elites. I felt like it got a bit less compelling as we got to modern times, but maybe that’s just because I’m more familiar with life now than life 300 years ago.


Verdict: 3/5


Sept 5 - Eye in the Sky (Dick)


Like a lot of these early Philip K Dick novels, I feel like it’s a cool idea and a well-developed world, but the execution is really hokey.


The plot is about a bunch of people who get zapped by a high-energy beam and then somehow start serially inhabiting each other’s consciousnesses; each resulting universe is shaped by the strange biases of the consciousness hosting it. It’s fun, but it’s really just 1950s pulp stuff, despite the clever premise.


Verdict: 2/5


Sept 6 - Messy (Harford)


A fun little book on how interesting ideas often come from what Harford refers to as “messy” situations, in the broad sense of (supposedly) non-ideal creative environments. It contains a lot of fun stories ranging from musicians to scientists.


Verdict: This book is written by an author I have met personally, and I’ve decided my policy on such reviews is that I won’t list a number verdict.


Sept 7 - The Man Who Japed (Dick)


A dystopia story about a sort of puritanical world, only the puritanical culture stems from something like suburban American cultural norms circa 1950. It’s all right. Dick has so many stories about controlling prudish middle-aged women that I’m starting to wonder if the character isn’t based on someone real.


Verdict: 2.5


Sept 9 - The Broken Bubble (Dick)


Now this is something interesting! It’s Philip K Dick, but not science fiction, and it wasn’t published until after his death. Early on in his career, Dick wrote literary fiction, and I would say this is the best one I’ve read so far. It’s not perfect, but it’s pretty good. It’s a story about two couples - one very young and accidentally pregnant, and one divorced because they were unable to conceive. The dynamics are really interesting, but it lacks the subtlety and depth that Dick would later develop. I found myself wishing he had come back to this novel later in life.


Verdict: 3/5


Sept 14 - The End of the Cold War (Service)


A very enjoyable and in-depth history of the last six years of the USSR. The only part that was slightly odd was that now and then he seemed to really fanboy over Ronald Reagan. I certainly don’t mind reading history books from people with political views (because the alternative doesn’t exist), but it felt out of place in such a long, meticulous work of history. Still, quite good!



Bad Science: The Cancer Drugs Fund is producing dangerous, bad data: randomise everyone, everywhere!

CreativeApplications.Net: Ugly Dynamics – About controlling simulations / Ugly Film

Ugly Dynamics is a personal exploration/documentation by Nikita Diakur of the effects and control of simulated dynamics in computer software, specifically in the work produced by his studio Ugly Films.

Electronics-Lab: Arduino Wiring, A New Choice for Developers On Windows 10 IoT Core

Last year, Arduino and Microsoft announced a strong partnership, and Windows 10 became the world’s first Arduino-certified operating system. This partnership makes creation and innovation much easier by combining the hardware capability of Arduino with the software capabilities of Windows.


Earlier this month, Arduino Wiring became a new programming language for Windows 10 IoT Core, joining C#, C++, Visual Basic, JavaScript, Python and Node.js. This means developers can now run Arduino Wiring sketches on all supported IoT Core devices, including Raspberry Pi 2 and 3.

The popularity of Arduino in the maker community, alongside the simplicity of using and programming its boards, was probably the main reason for this step.


Installing and setting up may differ between devices, but all information and getting-started guides are available online on Microsoft’s website.

The post Arduino Wiring, A New Choice for Developers On Windows 10 IoT Core appeared first on Electronics-Lab.

Tea Masters: Thoughts on China and tea

Shanghai skyline by night
There are many things I could criticize about China, and my first criticism would be that it's not possible to write and publish a blog post like this one as long as you are behind the "Great Firewall"! But let me start with some positive words about what I saw. The development of the cities I visited (Shanghai, Suzhou) is more than impressive. In the space of a generation, 30 years, the country has gone from third world to first world. The ride across Shanghai took an hour and there were only recent, big buildings along the way! It's one thing to read about the economic rise of China; it's another to actually see how modern and affluent the country and many of its people have become.
Old town of Luzhi, near Suzhou
A lot has been destroyed and rebuilt anew. However, I was pleasantly surprised that some ancient parts of town have been well preserved and restored. This was the case in the old streets of Luzhi and in the Humble Administrator Garden of Suzhou. Beautiful places!
Old street/canal in Luzhi
I haven't spent a lot of time in China (3 short stays in 3 different provinces in the last 3 years), so I give you these impressions without pretending to be an expert on China's rapid development.
Humble Administrator Garden
That's why I would like to also give you the perspective of my Taiwanese brother-in-law, who has spent the last 15 years as an engineer in the booming construction industry. His major criticism is the lack of quality in all this development. Fake products abound: cheap USB sticks that only work for a couple of days (!); fake luggage, sold in an official shop (!), that rips after 2 months (it's when he wanted to use the warranty that Samsonite realized the bag he bought in their shop was a fake); recent buildings that collapse a few weeks after inspectors declared them safe... Appearances are everything. Quality is always low compared to our standards. The house we stayed in is only 15 years old, but requires a lot of maintenance every year, because its original quality is so poor. That's one reason why buyers systematically prefer to build a new house rather than renovate an old one.
Humble Administrator Garden
This low quality makes sense for a country where the majority of people still have low incomes. The problem is that low quality has become a way of doing business there. Even high prices don't mean better quality, only better packaging. Looking expensive matters more than actual quality and value.

2003 wild Yiwu puerh
All my Taiwanese tea friends strongly dislike going to China, but often their jobs don't give them a choice. The teas they bring back and share are getting more and more expensive, and their packaging is now much nicer than what we're used to in Taiwan. However, the teas are usually very disappointing.

In our most recent class, Teaparker let us taste a young raw puerh he received from a rich Chinese connection, from a luxurious tea room filled with (real) antiques and overlooking a beautiful lake. The cake looked nice, with lots of rather big tips and a strong flowery scent. Did I mention it's supposedly made from leaves from a single tree?
When we brewed this puerh, nobody in the class liked it. The taste wasn't smooth or pure at all. I felt a shrinking of my throat, the natural reaction of the body when it rejects something. The smell was still very strong and reminded me of scented jasmine tea. After everybody expressed his tasting opinion, Teaparker confirmed that this puerh was indeed a highly priced and deceptive puerh: the buds were not old arbor and the leaves had been artificially scented! It's a good example of how price and quality don't necessarily increase in parallel in China. 
Back at home, I'm brewing one of my favorite puerhs: the wild raw spring 2003 Yiwu cake. Its taste is sweet, smooth and pure. The dry scents are faint, but they come alive in the brew. The aftertaste is very long and harmonious. It's naturally simple and delicious. It's getting more expensive with time, but here the rise is justified by an improvement in taste. That's a solid foundation for real growth. Tea teaches you to disregard packaging, stories and price. What counts is the intrinsic quality of the taste and scents.
Always be cautious in China and when drinking tea!

New Humanist Blog: Book review: Anger and Forgiveness

As the century rages on, we clearly need reminding that anger is no good.

explodingdog: topherchris: Everybody okay?


Everybody okay?

Disquiet: The Drone as Amber

Bryan Hilyard’s “Asleep in Amber” is a gentle drone. To say it’s a drone is to say it subsists on waveforms. To say it subsists on waveforms is to say that it pulses. The pace at which it pulses provides an internal contrast to the sounds themselves, which are gentle, hazy, soft, true to the slumber made explicit by the track’s title. The pace of the pulse, in contrast, is fairly quick, the main waveform moving more like water pushing at a pier as the tides shift than like a ripple on an otherwise still pond. It’s insistent for much of the track’s first third, at which point it dives. The pulse remains, but it’s deep, shadowed by the dense shimmer that Hilyard takes artful pains to accumulate. There seem to be voices in the mix, though they’re never remotely intelligible. They’re trapped even deeper down than is the pulse — like the title says, “Asleep in Amber.”

Track originally posted at More from Hilyard, who is based in Mariaville, Maine, at and (Track found via a repost by, aka Ivan Ujevic of Zagreb.)

Comic for 2016.09.28

New Cyanide and Happiness Comic

Ideas from CBC Radio (Highlights): Darkwave - Underwater languages at the brink of extinction

Whales are threatened by us, their language eroding through noise and climate change. Carrie Haber explores how marine scientists around the world are thinking about our evolutionary courtship with these magnificent mammals of the sea.

CreativeApplications.Net: F3 [Form From Function] – Playful and powerful 3D design using signed distance functions

Created by Reza Ali, F3 [Form From Function] is a playful and powerful 3D design app that enables you to live code 3D form, rapidly iterate on its design, and export it for 3D printing, rendering and animation. F3 uses signed distance functions (SDFs) to build forms – designing 3D forms using 2D image cross sections.

Disquiet: This Week in Sound: Sounds of and for the Cosmos +

A lightly annotated clipping service:

Bedside Manner: Medicine X is Stanford University’s initiative to explore “the future of medicine and healthcare.” As summarized by Andrea Ford at the school’s Scope publication, MedX has an artist in residence, Yoko Sen, who is addressing issues of noise pollution in hospitals: “She played the audience a track of beeps, buzzes, alarms, and mumbled voices; other hospital sounds include patients screaming, and the empty silence after bad news is delivered.” 

Phone Hum: Much of the yap about the iPhone 7 is its haptic (touch) improvements, but as Matthew Hughes reports at the Next Web it “makes an audible hissing noise whenever under intense strain.” Hughes credits the detection to Stephen Hackett, who “eventually realized that the noise wasn’t coming from the speaker, but rather from the logic board itself.” Hughes quotes Marco Arment correctly likening it to the sound a laptop fan makes when the CPU is being overly taxed. Other theories exist, too, the most colorfully named being “coil whine.”
(via Warren Ellis’ Sunday email newsletter)

Always Listening: “How we learned to talk to computers, and how they learned to answer back” — those are the questions that Charles McLellan seeks to answer in his detailed TechRepublic piece, tracing it from the dissection of human speech through computer recognition, the role of neural networks in passing the WER (or “word error rate”) test, on through natural language understanding, and a sense of where AI is headed.

V’ger’s Greatest Hits: The Voyager space probes carried a “Golden Record,” conceived by Carl Sagan, containing exemplary sounds of our planet intended for hypothetical intelligent civilizations far beyond our modest solar system. David Pescovitz of Boing Boing is leading a Kickstarter project to make the record available closer to home: a gorgeous box set with three vinyl LPs and a collection of images from the probes.
(via Rob Walker, Bruce Levenstein, others)

Smule’s Pitch: At, Murry Newlands interviews Smule’s CEO and co-founder, Jeff Smith, about the business side of the social-oriented music-app developer, looking at matters of profitability, misperceptions about the scope of the music market, and the unique nature of the sounds they produce. Says Smith, “For example, because our community is creating the music, we’re not using that master recording, and we’re not licensing the master recording from the label. Instead, we’re licensing the copyright to the composition from the publisher, from the writer. And we pay royalties out to all the writers.” 

Sound Awards: At least three of this week’s announced MacArthur Grant winners work in sound and music: Daryl Baldwin, a linguist working on cultural preservation in a culture that “lost its last native speaker in the mid-twentieth century”; Josh Kun, a cultural historian of popular music (I helped out on the pop-up Tikva Records store Kun and others at the Idelsohn Society put together in 2011); and Julia Wolfe, composer and co-founder of Bang on a Can.

Tome On: While physical and ebook sales are slipping (paperbacks have risen), Alexandra Alter reports in the New York Times that audiobook sales are up.

Olde Tyme: The Internet Archive (which is housed walking distance from my home, and just a block from where I first lived when I moved to San Francisco in 1996, 20 years ago) reports on the process of saving 78-rpm records in collaboration with New York’s ARChive of Contemporary Music.
(via Joseph Witek and Michael Rhode)

This first appeared, in slightly different form, in the September 23, 2016, edition of the free Disquiet “This Week in Sound” email newsletter.

OCaml Planet: The fixpoint combinator

Consider the following recursive definition of the factorial function. \[ FAC = \lambda n.\;IF \left(=\;n\;0\right)\;1\;\left(*\;n\;\left(FAC\;\left(-\;n\;1\right)\right)\right) \nonumber \] The definition relies on the ability to name a $\lambda$-abstraction and then to refer to this name inside the $\lambda$-abstraction itself. No such facility is provided by the $\lambda$-calculus. $\beta$-abstraction is applying $\beta$-reduction backwards to introduce new $\lambda$-abstractions, thus $+\;4\;1\leftarrow \left(\lambda x.\;+\;x\;1\right)\; 4$. By $\beta$-abstraction on $FAC$, its definition can be written \[ FAC = \left(\lambda fac.\;\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(fac\;\left(-\;n\;1\right)\right)\right)\right)\right) FAC \nonumber \] This definition has the form $FAC = g\;FAC$, where $g = \left(\lambda fac.\;\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(fac\;\left(-\;n\;1\right)\right)\right)\right)\right)$ is without recursion. We also see that $FAC$ is a fixed point ("fixpoint") of $g$. This fixed point can depend only on $g$, so, supposing there were a function $Y$ that takes a function and delivers a fixpoint of that function as its result, we would have $FAC = Y\;g = g\;(Y\;g)$. Assuming such a function exists, we will compute $FAC\;1$ to build confidence that this definition of $FAC$ works.
Recall \[ \begin{eqnarray} &FAC& = Y\;g \nonumber \\ &g& = \lambda fac.\;\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(fac\;\left(-\;n\;1\right)\right)\right)\right) \nonumber \end{eqnarray} \] So, \[ \begin{eqnarray} FAC\;1 &\rightarrow& (Y\;g)\; 1 \nonumber \\ &\rightarrow& (g\;(Y\;g))\;1 \nonumber \\ &\rightarrow& (\left(\lambda fac.\;\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(fac\;\left(-\;n\;1\right)\right)\right)\right)\right) (Y\;g))\; 1 \nonumber \\ &\rightarrow& \left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(\left(Y\;g\right)\;\left(-\;n\;1\right)\right)\right)\right)\; 1 \nonumber \\ &\rightarrow& *\;1\;\left(\left(Y\;g\right)\;0\right) \nonumber \\ &\rightarrow& *\;1\;\left(\left(g\;\left(Y\;g\right)\right)\;0\right) \nonumber \\ &\rightarrow& *\;1\;\left(\left(\left(\lambda fac.\;\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(fac\;\left(-\;n\;1\right)\right)\right)\right)\right)\;\left(Y\;g\right)\right)\;0\right) \nonumber \\ &\rightarrow& *\;1\;\left(\left(\lambda n.\;IF\left(=\;n\;0\right)\;1\;\left(*\;n\;\left(\left(Y\;g\right)\;\left(-\;n\;1\right)\right)\right)\right)\;0\right) \nonumber \\ &\rightarrow& *\;1\;1 \nonumber \\ &=& 1 \nonumber \end{eqnarray} \]

The $Y$ combinator of the $\lambda$-calculus is defined as the $\lambda$-term $Y = \lambda f.\;\left(\lambda x.\;f\;\left(x\;x\right)\right)\left(\lambda x.\;f\;\left(x\;x\right)\right)$. $\beta$ reduction of this term applied to an arbitrary function $g$ proceeds like this: \[ \begin{eqnarray} Y\;g &\rightarrow& \left(\lambda f.\;\left(\lambda x.\;f\;\left(x\;x\right)\right) \left(\lambda x.\;f\;\left(x\;x\right)\right)\right)\;g \nonumber \\ &\rightarrow& \left(\lambda x.\;g\;\left(x\;x\right)\right) \left(\lambda x.\;g\;\left(x\;x\right)\right) \nonumber \\ &\rightarrow& g\;\left(\left(\lambda x.\;g\;\left(x\;x\right)\right)\;\left(\lambda x.\;g\;\left(x\;x\right)\right)\right) \nonumber \\ &=& g\;\left(Y\;g\right) \end{eqnarray} \] The application of this term has produced a fixpoint of $g$. That is, we are satisfied that this term will serve as a definition for $Y$ having the property we need and call it the "fixpoint combinator".

In the untyped $\lambda$-calculus, $Y$ can be defined, and that is sufficient for expressing all computable functions without adding a special construction for recursion. In a typed $\lambda$-calculus, $Y$ cannot be defined, because the term $\lambda x.\;f\;(x\;x)$ does not have a finite type. Thus, when implementing recursion in a functional programming language, it is usual to provide $Y$ as a built-in function with the reduction rule $Y\;g \rightarrow g\;(Y\;g)$ or, in a strict language, $(Y\; g)\;x \rightarrow (g\;(Y\;g))\;x$ to avoid infinite unfolding.
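To make the typing obstacle concrete, here is a sketch in OCaml (the type `mu` and the function `y` are my own names, not from the text or any library): the self-application $x\;x$ becomes typeable once it is hidden behind a recursive datatype, which is in effect the special construction the typed setting requires.

```ocaml
(* The term (fun x -> f (x x)) is rejected by OCaml's type checker:
   x would need an infinite type. Wrapping the self-application in a
   recursive datatype makes the type finite, so a fixpoint combinator
   becomes definable without let rec. *)
type ('a, 'b) mu = Roll of (('a, 'b) mu -> 'a -> 'b)

(* y : (('a -> 'b) -> 'a -> 'b) -> 'a -> 'b
   The eta-expansion (fun a -> ...) delays the unfolding, matching the
   strict rule (Y g) x -> (g (Y g)) x. *)
let y f =
  let g (Roll x as w) = fun a -> f (x w) a in
  g (Roll g)

(* Factorial with no recursion anywhere except in the type mu. *)
let fac = y (fun fac n -> if n = 0 then 1 else n * fac (n - 1))
```

With the `-rectypes` compiler option OCaml accepts the infinite type directly and the wrapper type becomes unnecessary; either way, the recursion has merely moved from terms into types.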

For an OCaml-like language, the idea then is to introduce a built-in constant $\mathbf{Y}$ and to denote the function defined by $\mathbf{let\;rec}\;f\;x = e$ as $\mathbf{Y}(\mathbf{fun}\;f\;x \rightarrow e)$. Intuitively, $\mathbf{Y}$ is a fixpoint operator that associates a functional $F$ of type $\left(\alpha \rightarrow \beta\right) \rightarrow \alpha \rightarrow \beta$ with a fixpoint of type $\alpha \rightarrow \beta$, that is, a value having the property $\mathbf{Y}\;F = F\;\left(\mathbf{Y}\;F\right)$. The relevant deduction rules involving this constant are: \[ \begin{equation} \frac{\vdash f\;(\mathbf{Y}\;f)\;x \Rightarrow v} {\vdash (\mathbf{Y}\;f)\;x \Rightarrow v} \tag{App-rec} \end{equation} \] \[ \begin{equation} \frac{\vdash e_{2}\left[\mathbf{Y}(\mathbf{fun}\;f\;x \rightarrow e_{1})/f\right] \Rightarrow v} {\vdash \mathbf{let\;rec}\;f\;x=e_{1}\;\mathbf{in}\;e_{2} \Rightarrow v} \nonumber \tag{Let-rec} \end{equation} \]
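These rules can be sketched in OCaml itself (the name `fix` is mine; a real implementation would make $\mathbf{Y}$ a primitive of the evaluator): one `let rec` defines the operator once, and every other recursive function can then be written without `let rec`, exactly as the Let-rec rule prescribes.

```ocaml
(* A strict fixpoint operator: the extra argument x implements
   (Y g) x -> (g (Y g)) x, so Y g is unfolded only on demand and
   evaluation does not loop. *)
let rec fix f x = f (fix f) x

(* Per the Let-rec rule, `let rec fac n = e` is denoted
   Y (fun fac n -> e); e here is the factorial body from the text. *)
let fac = fix (fun fac n -> if n = 0 then 1 else n * fac (n - 1))
```

Evaluating `fac 1` unfolds exactly as in the hand reduction of $FAC\;1$ above, one App-rec step per recursive call.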

[1] The Implementation of Functional Programming Languages, Simon Peyton Jones, 1987.
[2] The Functional Approach to Programming, Guy Cousineau, Michel Mauny, 1998.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Motivation

Terribly sorry. The fourth wall is being reconstructed tonight and should be ready to go tomorrow morning.

New comic!
Today's News:

Tickets are on sale for BAHFest West! This is our third year doing this, and tickets have sold out the previous two times. To guarantee a spot, please buy soon!


things magazine: Steam library

Everything is Broken / fix it yourself with the Piper Raspberry Pi Computer Kit / Line Wobbler, a ‘one-dimensional dungeon crawler game with a unique wobble controller made out of a door-stopper spring and a several meter long ultrabright LED’ …

OCaml Weekly News: OCaml Weekly News, 27 Sep 2016

  1. bindings for xz compression and decompression ?
  2. BuckleScript on Windows
  3. Encoding "links" with the type system
  4. otags reloaded 4.03.1 for OCaml 4.03
  5. opkg v0.0.1 - Documentation access improvements
  6. Other OCaml News

Planet Haskell: FP Complete: Updated Hackage mirroring

As we've discussed on this blog before, FP Complete has been running a Hackage mirror for quite a few years now. In addition to a straight S3-based mirror of raw Hackage content, we've also been running some Git repos providing the same content in an arguably more accessible format (all-cabal-files, all-cabal-hashes, and all-cabal-metadata).

In the past, we did all of this mirroring using Travis, but had to stop doing so a few months back. Also, a recent revelation showed that the downloads we were making were not as secure as I'd previously believed (due to lack of SSL between the Hackage server and its CDN). Finally, there's been off-and-on discussion for a while about unifying on one Hackage mirroring tool. After some discussion among Duncan, Herbert, and myself, all of these goals ended up culminating in this mailing list post.

This blog post details the end result of these efforts: what code is running, where it's running, how secret credentials are handled, and how we monitor the whole thing.


One of the goals here was to use the new hackage-security mechanism in Hackage to validate the package tarballs and cabal file index downloaded from Hackage. This made it natural to rely on Herbert's hackage-mirror-tool code, which supports downloads, verification, and uploading to S3. There were a few minor hiccups getting things set up, but overall it was surprisingly easy to integrate, especially given that Herbert's code had previously never been used against Amazon S3 (it had been used against the Dreamhost mirror).

I made a few downstream modifications to the codebase to make it compatible with officially released versions of Cabal, Stackify it, and in the process generate Docker images. I also included a simple shell script for running the tool in a loop (based on Herbert's README instructions). The result is the snoyberg/hackage-mirror-tool Docker image.

After running this image (we'll get to how it's run later), we have a fully populated S3 mirror of Hackage guaranteeing a consistent view of Hackage (i.e., all package tarballs are available, without CDN caching issues in place). The next step is to use this mirror to populate the Git repositories. We already have all-cabal-hashes-tool and all-cabal-metadata-tool for updating the appropriate repos, and all-cabal-files is just a matter of running tar xf on the tarball containing .cabal files. Putting all of this together, I set up the all-cabal-tool repo, containing:

  • will:
    • Grab the 01-index.tar.gz file from the S3 mirror
    • Update the all-cabal-files repo
    • Use git archive in that repo to generate and update the 00-index.tar.gz file*
    • Update the all-cabal-hashes and all-cabal-metadata repos using the appropriate tools
  • uses the hackage-watcher to run each time a new version of 01-index.tar.gz is available. It's able to do a simple ETag check, saving on bandwidth, disk IO, and CPU usage.
  • Dockerfile pulls in all of the relevant tools and provides a commercialhaskell/all-cabal-tool Docker image
  • You may notice some other code in that repo. I did intend to rewrite the Bash scripts and other Haskell code into a single Haskell executable for simplicity, but haven't gotten around to it yet. If anyone's interested in taking up the mantle on that, let me know.

* About this 00/01 business: 00-index.tar.gz is the original package format, without hackage-security, and is used by previous cabal-install releases, as well as Stack and possibly some other tools too. hackage-mirror-tool does not mirror this file since it has no security information, so generating it from the known-secure 01-index.tar.gz file (via the all-cabal-files repo) seemed the best option.

In setting up these images, I decided to split them into two pieces instead of combining them so that the straight Hackage mirroring bits would remain unaffected by the rest of the code, since the Hackage mirror (as we'll see later) will be available for users outside of the all-cabal* set of repos.

At the end of this, you can see that we're no longer using the original hackage-mirror code that powered the FP Complete S3 mirror for years. Unification achieved!


As I mentioned, we previously ran all of this mirroring code on Travis, but had to move off of it. Anyone who's worked with me knows that I hate being a system administrator, so it was a painful few months where I had to run this code myself on an EC2 machine I set up personally. Fortunately, FP Complete runs a Kubernetes cluster these days, and that means I don't need to be a system administrator :). As mentioned, I packaged up all of the code above in two Docker images, so running them on Kubernetes is very straightforward.

For the curious, I've put the Kubernetes deployment configurations in a Gist.


We have a few different credentials that need to be shared with these Docker containers:

  • AWS credentials for uploading
  • GPG key for signing tags
  • SSH key for pushing to Github

One of the other nice things about Kubernetes (besides allowing me to not be a sysadmin) is that it has built-in secrets support. I obviously won't be sharing those files with you, but if you look at the deployment configs I shared before, you can see how they are being referenced.


One annoyance I've had in the past is, if there's a bug in the scripts or some system problem, mirroring will stop for many hours before I become aware of it. I was determined to not let that be a problem again. So I put together the Hackage Mirror status page. It compares the last upload date from Hackage itself against the last modified time on various S3 artifacts, as well as the last commit for the Git repos. If any of the mirrors fall more than an hour behind Hackage itself, it returns a 500 status code. That's not technically the right code to use, but it does mean that normal HTTP monitoring/alerting tools can be used to watch that page and tell me if anything has gone wrong.

If you're curious to see the code powering this, it's available on Github.

Official Hackage mirror

With the addition of the new hackage-security metadata files to our S3 mirror, one nice benefit is that the FP Complete mirror is now an official Hackage mirror, and can be used natively by cabal-install without having to modify any configuration files. Hopefully this will be useful to end users.

And strangely enough, just as I finished this blog post, I got my first "mirrors out of sync" 500 error message ever, proving that the monitoring itself works (even if the mirroring had a bug).

What's next?

Hopefully nothing! I've spent quite a bit more time on this in the past few weeks than I'd hoped, but I'm happy with the end result. I feel confident that the mirroring processes will run reliably, I understand and trust the security model from end to end, and there's less code and machines to maintain overall.

Thank you!

Many thanks to Duncan and Herbert for granting me access to the private Hackage server to work around CDN caching issues, and to Herbert for the help and quick fixes with hackage-mirror-tool.

Comic for 2016.09.27

New Cyanide and Happiness Comic

Disquiet: This Is “Glisten”

There’s likely no actual guitar on “Glisten,” a track off R Beny’s superb new collection, Full Blossom of the Evening. There are, in the extensive list of equipment, some potential sources for the sharp, high-pitched, tightly plucked, string-like sounds that are the focus of the track. We may be hearing a synthesized string from the Teenage Engineering OP-1, for example, or a sample played from his iPad. The sharpness of those sounds, the brittle, fiercely resistant tautness, brings to mind the artificial guitar heard on Oval’s 2010 album O. Both Oval’s record and this track off Beny’s new one explore the textures of the guitar in a digital space, a seeming simulacrum of the familiar, rendered as a kind of sonic fiction, and the Oval reference gets additionally rich when a certain glitchiness is applied to the recording, when it temporarily seems as if the playback is failing. What makes the track distinct from Oval’s work is the core of the music. Oval’s reference was largely rock, pop, and folk. Beny has created a synthesizer chamber music, something that feels like it was plucked from the Renaissance. When it glistens, per the title, and it does so quite often, it’s like light hitting stained glass — virtual stained glass, perhaps, but the beauty is real.

The full album is at “Glisten” is simply a recommended entry point. More from R Beny, aka Austin Cairns of the San Francisco Bay Area, at

Ideas from CBC Radio (Highlights): Designing Life: The Brave New World of Gene Editing

CRISPR is a revolutionary new development in gene editing. It has the potential to eliminate genetically transmitted diseases. But it could also be used to wage biological warfare or for eugenics. A panel discussion hosted by McGill University.

The Geomblog: Axiomatic perspective on fairness, and the power of discussion

Sorelle Friedler, Carlos Scheidegger and I just posted a new paper on the arXiv where we try to lay out a framework for talking about fairness in a more precise way.

The blog post I link to says more about the paper itself. But I wanted to comment here on the tortuous process that led to this paper. In some form or another, we've been thinking about this problem for two years: how do we "mathematize" discussions of fairness and bias? 

It's been a long and difficult slog, more than any other topic in "CS meets X" that I've ever bumped into. The problem here is that the words are extremely slippery, and refract completely different meanings in different contexts. So coming up with notions that aren't overly complicated, and yet appear to capture the different ways in which people think about fairness was excruciatingly difficult. 

We had a basic framework in mind a while ago, but then we started "testing" it in the wild, on papers, and on people. The more conversations we had, the more we refined and adapted our ideas, to the point where the paper we have today owes deep debts to many people that we spoke with and who provided nontrivial conceptual challenges that we had to address. 

I still have no idea whether what we've written is any good. Time will tell. But it feels good to finally put out something, however half-baked. Because now hopefully the community can engage with it. 

new shelton wet/dry: Every day, the same, again

Canadian Mint employee accused of smuggling $180K of gold in his rectum
Sperm delivery by mail? There’s an app for that
Riding Roller Coasters Can Help Dislodge Kidney Stones
MIT researchers developed a device that uses radio waves to detect whether someone is happy, sad, angry or excited.
Why do more men than women commit suicide?
The [...]

new shelton wet/dry: Didn’t they tell you that I was a savage? Fuck your white horse and a carriage.

Recently, experimental psychologists at Oxford University explored the function of kissing in romantic relationships. Surprise! It’s complicated. After conducting an online survey with 308 men and 594 women, mostly from North America and Europe, who ranged in age from 18 to 63, the researchers have concluded that kissing may help people assess potential mates and then maintain [...]

CreativeApplications.Net: After Urban Screens – Dave Colangelo on Massive Media

Dave Colangelo is a researcher and artist focused on the role media plays in the city. An Assistant Professor in the School of Theatre + Film at Portland State University and a member of the Public Visualization Studio, Colangelo chatted with CAN about media façades, public art, and Pokémon Go.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Rough Sex

Please oh please don't let the Internet take this the wrong way.

New comic!
Today's News:

New Humanist Blog: Reforming Pakistan's madrasas

Numbering in the tens of thousands, are Pakistan’s infamous religious schools really beyond reform?

Comic for 2016.09.26

New Cyanide and Happiness Comic

Trivium: 25sep2016

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Natural Scientists

Though, hopefully your kids don't drink as much as research scientists.

New comic!
Today's News:

Comic for 2016.09.25

New Cyanide and Happiness Comic

Planet Lisp: Hans Hübner

Berlin Lispers Meetup: Tuesday September 27th, 2016, 8.00pm

You are kindly invited to the next "Berlin Lispers Meetup", an informal gathering for anyone interested in Lisp, beer or coffee:

Berlin Lispers Meetup
Tuesday, September 27th, 2016
8 pm onwards

St Oberholz, Rosenthaler Straße 72, 10119 Berlin
U-Bahn Rosenthaler Platz

We will try to occupy a large table on the first floor, but in case you don't see us,
please contact Christian: 0157 87 05 16 14.

Please join us for another evening of parentheses!

things magazine: Weekend plans

Music: Jude Woodhead; Black Angel Drifter; The Cure, live in Glasgow in 1984; Ruth and Martin’s Album Club, introducing people to works they’ve never (often inexplicably) heard; Coffee and Riffs; The Caretaker’s new album series is a three year project … Continue reading

OUR VALUED CUSTOMERS: While discussing 90's nostalgia... (From the OVC Archive!)

Gnuplotting: Circular heat map

Suppose you have a large circular container filled with sand and measure its density at different positions. The goal is to display those measurements as a heat map extrapolated from the measured points, but limited to the interior of the container as shown in Fig. 1.


Fig. 1 Sand density measured at different positions in a circular container (code to produce this figure, sand.pal, data)

The underlying measurements are provided in the following format:

# sand_density_orig.txt
#1      2        3        4      5       6
#prob   x        y        z      density description
"E01"   0.00000 -1.14161 -0.020  0.7500  "dense"
"E02"  -0.94493 -0.81804 -0.020  0.5753  "normal"
"E03"   0.75306 -0.72000 -0.020  0.7792  "dense"

Those data points have to be extrapolated onto a grid for the heat map, which can be achieved by the following commands.

set view map
set pm3d at b map
set dgrid3d 200,200,2
splot "sand_density1.txt" u 2:3:5

Fig. 2 shows the result, which has two problems: the grid data is limited to the bounding box of the measurement points, and the grid is always rectangular rather than circular.


Fig. 2 Sand density measured at different positions in a circular container (code to produce this figure, sand.pal, data)

To overcome the first problem you have to add four additional points to the original data in order to stretch the grid boundary to the radius of the container. For that you need some reasonable extrapolation from the existing points. I did this in a very simple way, using a mixture of linear interpolation and the value of the nearest point. If you want to do the same with your own data set, you should maybe spend a little more effort on this step.

# sand_density.txt
#1      2        3        4      5       6
#prob   x        y        z      density description
"E01"   0.00000 -1.14161 -0.020  0.7500  "dense"
"xmin" -1.50000  0.00000 -0.050  0.5508  "dummy"
"xmax"  1.50000  0.00000 -0.050  0.6634  "dummy"
"ymin"  0.00000 -1.50000 -0.050  0.7500  "dummy"
"ymax"  0.00000  1.50000 -0.050  0.6315  "dummy"

If you plot this modified data set you will get Fig. 3.


Fig. 3 Sand density measured at different positions in a circular container (code to produce this figure, sand.pal, data)

In order to limit the heat map to a circle, you first extrapolate the grid using dgrid3d and store the result in a temporary file.

set table "tmp.txt"
set dgrid3d 200,200,2
splot "sand_density2.txt" u 2:3:5
unset table

Afterwards a function is defined that limits the points to the interior of the circle, and the data from the temporary file is plotted through it.

circle(x,y,z) = sqrt(x**2+y**2)>r ? NaN : z
plot "tmp.txt" u 1:2:(circle($1,$2,$3)) w image

Finally, a few labels and the original measurement points are added. Choosing a radius slightly smaller than the container excludes the manually added points such as xmin. The result is the nice circular heat map in Fig. 1.

r = 1.49 # make radius smaller to exclude interpolated edge points
set label 'normal' at -1,0.2 center front tc ls 1
set label 'dense' at 0.5,0.75 center front tc ls 1
set label 'very dense' at 0.3,-0.3 center front tc ls 1
plot "sand_density.txt" \
         u (circle($2,$3,$2)):(circle($2,$3,$3)) w p ls 1

The Shape of Code: The wind is not yet blowing in software engineering research

An article by Andrew Gelman is getting a lot of well-deserved publicity at the moment. The topic of discussion is sloppy research practices in psychology and how researchers are responding to criticism (head in the sand and blame the messenger).

I imagine that most software developers think this is an intrinsic problem in the ‘soft’ sciences that does not apply to the ‘hard’ sciences, such as software; I certainly thought this until around 2000 or so. Writing a book containing a detailed analysis of C convinced me that software engineering was mostly opinion, with a tiny smattering of knowledge in places.

The C book tried to apply results from cognitive psychology to what software developers do. After reading lots of books and papers on cognitive psychology I was impressed with how much more advanced, and rigorous, their experimental methods were, compared to software engineering.

Writing a book on empirical software engineering has moved my views on to the point where I think software engineering is the ideal topic for the academic fraudster.

The process of cleaning up their act that researchers in psychology are going through now is something that software engineering will have to go through. Researchers have not yet reached the stage of directly pointing out that software engineering research is a train wreck. Instead, they write parody papers and polite articles showing how dirty popular datasets are.

Of course, industry is happy to keep quiet. The development of software systems is still a seller's market (the packaged market is dog eat dog), or at least that is still the prevalent mentality. I doubt much will change until the production of software systems becomes a buyer's market, which will create a real incentive for finding out what actually works.