Bifurcated Rivets: From FB

More

Bifurcated Rivets: From FB

Lovely

Bifurcated Rivets: From FB

Big Banjo

Bifurcated Rivets: From FB

Interesting version

Bifurcated Rivets: From FB

Fred Geiger - great 5-string player almost nobody knows about.

MetaFilter: No need to peel off the stickers

Self-solving Rubik's Cube

MetaFilter: An Ill Wind

While the number of opioid overdose deaths nationwide has doubled since 2008, the number of those victims who have become organ donors has quadrupled. Partially as a result of the newly available organs from overdose deaths, the list of people waiting for transplants — nearly 124,000 at its peak in 2014 — has begun to shrink for the first time, after 25 years of continuous growth.

Recent additions: ngx-export 1.4.2

Added by lyokha, Mon Sep 24 12:43:02 UTC 2018.

Helper module for Nginx haskell module

Recent additions: net-spider 0.1.0.0

Added by debugito, Mon Sep 24 12:30:17 UTC 2018.

A graph database middleware to maintain a time-varying graph.

Recent additions: aeson-compat 0.3.9

Added by phadej, Mon Sep 24 11:58:04 UTC 2018.

Compatibility layer for aeson

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: Society6 — Ones to Watch: An Interview with Kim Leutwyler


Kim Leutwyler’s Website

Kim Leutwyler on Instagram

Recent additions: conduit-zstd 0.0.1.1

Added by luispedro, Mon Sep 24 11:12:38 UTC 2018.

Conduit-based ZStd Compression

Recent additions: indexation 0.6.3

Added by NikitaVolkov, Mon Sep 24 11:04:53 UTC 2018.

Tools for entity indexation

Open Culture: In 1900, a Photographer Had to Create an Enormous 1,400-Pound Camera to Take a Picture of an Entire Train

Cameras are small, and getting smaller all the time. This development has helped us all document our lives, sharing the sights we see with an ease difficult to imagine even twenty years ago. A hundred and twenty years ago, photography faced an entirely different set of challenges, but then as now, much of the motivation to meet them came from commercial interests. Take the case of Chicago photographer George R. Lawrence and his client the Chicago & Alton Railway, who wanted to promote their brand-new Chicago-to-St. Louis express service, the Alton Limited. This product of the golden age of American train travel demanded some respectable photography, a technology then still in its thrilling, possibility-filled emergence.

A truly elegant piece of work, the Alton Limited would, during its 72-year lifespan, boast such features as a post office, a library, a Japanese tea-room, and a striking maroon-and-gold color scheme that earned it the nickname "the Red Train."

Even from a distance, the Alton Limited looked upon its introduction in 1899 like nothing else on the railroads, with its six identical Pullman cars all designed in perfect symmetry — the very aspect that so challenged Lawrence to capture it in a photograph. Simply put, the whole train wouldn't fit in one picture. While he could have shot each car separately and then stitched them together into one big print, he rejected that technique for its inability to "preserve the absolute truthfulness of perspective."

Only a much bigger camera, Lawrence knew, could capture the whole train. And so, in the words of Atlas Obscura's Anika Burgess, he "quickly went to work designing a camera that could hold a glass plate measuring 8 feet by 4 1/2 feet. It was constructed by the camera manufacturer J.A. Anderson from natural cherry wood, with bespoke Carl Zeiss lenses (also the largest ever made). The camera alone weighed 900 pounds. With the plate holder, it reached 1,400 pounds. According to an August 1901 article in the Brooklyn Daily Eagle, the bellows was big enough to hold six men, and the whole camera took a total of 15 workers to operate." Transporting the camera to Brighton Park, "an ideal vantage point from which to shoot the waiting train," required another team of men, and developing the eight-foot long photo took ten gallons of chemicals.

The advertisements in which Lawrence's photograph appeared practically glowed with pride in the Alton Limited, billing it as "a train for two cities," as "the only way between Chicago and St. Louis," as "the handsomest train in the world." The whole-train picture beggared belief: though it went on to win Lawrence the Grand Prize for World Photographic Excellence at the 1900 Paris Exposition, Burgess notes, it looked so impossible that both the photographer and Chicago & Alton "had to submit affidavits to verify that the photograph had been made on one plate." We in the 21st century, of course, have no reason to doubt its authenticity, or even to marvel at its ingenuity until we know the story of the immense custom camera with which Lawrence shot it. Today, what awes us are all those smaller shots of the Alton Limited's interior, exuding a luxuriousness that has long vanished from America's railroads. If we were to find ourselves on such a train today, we'd surely start Instagramming it right away.

via Atlas Obscura

Related Content:

Behold a Beautiful Archive of 10,000 Vintage Cameras at Collection Appareils

19-Year-Old Student Uses Early Spy Camera to Take Candid Street Photos (Circa 1895)

See the First Photograph of a Human Being: A Photo Taken by Louis Daguerre (1838)

The History of Photography in Five Animated Minutes: From Camera Obscura to Camera Phone

Darren’s Big DIY Camera

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

In 1900, a Photographer Had to Create an Enormous 1,400-Pound Camera to Take a Picture of an Entire Train is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

MetaFilter: Man, machine or beast

Police have called off the three-year investigation into the 'Croydon Cat Killer' as they believe they now know the answer. But some locals are not happy and want the hunt to continue (Video possibly nsfw, other links have potentially disturbing descriptions)

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: Immensely Satisfying Animated Gifs by Nicolas Fong

Slashdot: Alcohol Causes One In 20 Deaths Worldwide, Says WHO

An anonymous reader quotes a report from The Guardian: Alcohol is responsible for more than 5% of all deaths worldwide, or around 3 million a year, new figures have revealed. The data, part of a report from the World Health Organization, shows that about 2.3 million of those deaths in 2016 were of men, and that almost 29% of all alcohol-caused deaths were down to injuries -- including traffic accidents and suicide. The report, which comes out every four years, reveals the continued impact of alcohol on public health around the world, and highlights that the young bear the brunt: 13.5% of deaths among people in their 20s are linked to booze, with alcohol responsible for 7.2% of premature deaths overall. It also stresses that harm from drinking is greater among poorer consumers than wealthier ones. While the proportion of deaths worldwide that have been linked to alcohol has fallen to 5.3% since 2012, when the figure was at 5.9%, experts say the findings make for sobering reading.

Read more of this story at Slashdot.

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: STORYHIVE’s 2018 Music Video Edition is Open!

MetaFilter: Mr. Rogers vs. the Superheroes

One of the few things that could raise anger — real, intense anger — in Mister Rogers was the willful misleading of children. Superheroes, he thought, were the worst culprits. An excerpt adapted from The Good Neighbor: The Life and Work of Fred Rogers by Maxwell King, on Longreads.

Open Culture: Watch Andy Warhol’s Screen Tests of Three Female Muses: Nico, Edie Sedgwick & Mary Woronov

Artist Andy Warhol shot over 500 silent, black-and-white screen-tests in his famous Factory between 1964 and 1966, documenting the beautiful youth who were drawn to the scene. Sometimes he would chat with the subject beforehand, offering suggestions to help them achieve the type of performance he was looking for. More frequently he took a passive role, to the point of leaving the room during the filming.

The opposite of a people person, he preferred to engage with his subjects by scrutinizing the finished screen tests, projecting them in slow motion to imbue them with an added element of glamour and amplify every nuance of expression. As Warhol wrote in The Philosophy of Andy Warhol:

That screen magnetism is something secret. If you could figure out what it is and how you make it, you'd have a really good product to sell. But you can't even tell if someone has it until you actually see them up there on the screen. You have to give screen tests to find out.

The screen tests are less auditions for roles in Warhol films than pieces of an ongoing project. Warhol played with them, assembling and reassembling them into collections which he screened under such fluid titles as 13 Most Beautiful Women and 13 Most Wanted Men. Some of his test subjects went on to achieve real stardom: Lou Reed, Dennis Hopper, and Bob Dylan among them.

Others’ fame is forever tied to the Factory.

Edie Sedgwick, above, one of his best known muses, was a troubled girl from a wealthy family. Unlike some of the moodier screen tests, Sedgwick’s is fully lit. She displays a genuine movie star’s poise, barely moving as the camera drinks her in. Her beauty appears untouched by the addictions and eating disorders that were already a driving force in her life.

Actress and painter Mary Woronov emerged unscathed from her time at the Factory. Like Sedgwick, she seemed comfortable with the idea of being observed doing nothing for an extended period. Recalling her screen test experience in an interview with Bizarre, she made it clear that the subjects were far from the center of attention:

Andy put you on a stool, then puts the camera in front of you. There are lots of people around usually. And then he turns the camera on, and he walks away, and all the people walk away too, but you're standing there in front of this camera.

I saw Salvador Dali do one, it was really funny. It's a very interesting film, because it's a way of cracking open your personality and showing what's underneath—only in a visual way, because there's no talking, nothing. You just look at the camera. Salvador made this gigantic pose with his moustache blaring and everything, and he couldn't hold the pose. Not for five minutes. And so at about minute four, he suddenly started looking very, very real.

The camera loves stillness, something model and singer Nico was unable to deliver in her screen test. Perhaps not such a problem when the director has plans to project in slow motion.

As he stated in POPism: The Warhol '60s:

What I liked was chunks of time all together, every real moment… I only wanted to find great people and let them be themselves… and I'd film them for a certain length of time and that would be the movie.

Factory regular/interior decorator/photographer Billy Name told punk historian Legs McNeil in an interview that the screen tests served another purpose: to identify the fellow travelers from among the poor fits:

… it's always cool to meet other artists, you know, to see if it's somebody who's going to be a peer or a compatriot, who you can play with and hang around with or not. Andy was doing a series of screen tests for his films, and we wanted everybody to do one: Dylan, Nico, Dennis Hopper, Susan Sontag, Donovan—everyone famous that came up to the Factory. We'd just film 16mm black-and-white portraits of the person sitting there for a few minutes. So our purpose was to have Dylan come up and do a screen test, so he could be part of the series. That was enough for us. But Dylan didn't talk at all when we filmed him. I don't think he liked us, ha, ha, ha!

Revolver Gallery, devoted exclusively to Warhol, has a gallery of screen-tests on their YouTube channel.

Related Content:

The Velvet Underground & Andy Warhol Stage Proto-Punk Performance Art: Discover the Exploding Plastic Inevitable (1966)

Andy Warhol’s 15 Minutes: Discover the Postmodern MTV Variety Show That Made Warhol a Star in the Television Age (1985-87)

The Big Ideas Behind Andy Warhol’s Art, and How They Can Help Us Build a Better World

Andy Warhol’s ‘Screen Test’ of Bob Dylan: A Classic Meeting of Egos

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in NYC on Monday, September 24 for another monthly installment of her book-based variety show, Necromancers of the Public Domain. Follow her @AyunHalliday.

Watch Andy Warhol’s Screen Tests of Three Female Muses: Nico, Edie Sedgwick & Mary Woronov is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

MetaFilter: "...Now we have a glue gun"

Pastry Chef Attempts To Make Gourmet Twizzlers (SLYT)

Slashdot: Famed Mathematician Claims Proof of 160-Year-Old Riemann Hypothesis

Slashdot reader OneHundredAndTen writes: Sir Michael Atiyah claims to have proved the Riemann hypothesis. This is not some internet crank, but one of the towering figures of mathematics in the second half of the 20th century. The thing is, he's almost 90 years old. According to New Scientist, Atiyah is set to present his "simple proof" of the Riemann hypothesis on Monday at the Heidelberg Laureate Forum in Germany. Atiyah has received two awards often referred to as the Nobel prizes of mathematics, the Fields medal and the Abel Prize; he also served as president of the London Mathematical Society, the Royal Society and the Royal Society of Edinburgh. "[T]he hypothesis is intimately connected to the distribution of prime numbers, those indivisible by any whole number other than themselves and one," reports New Scientist. "If the hypothesis is proven to be correct, mathematicians would be armed with a map to the location of all such prime numbers, a breakthrough with far-reaching repercussions in the field."

Read more of this story at Slashdot.

Lambda the Ultimate - Programming Languages Weblog: The Little Typer

A new introductory book about dependent types, involving some familiar names:

The Little Typer

by Daniel P. Friedman and David Thrane Christiansen.

Foreword by Robert Harper.

Afterword by Conor McBride.

An introduction to dependent types, demonstrating the most beautiful aspects, one step at a time.

A program's type describes its behavior. Dependent types are a first-class part of a language, and are much more powerful than other kinds of types; using just one language for types and programs allows program descriptions to be as powerful as the programs they describe. The Little Typer explains dependent types, beginning with a very small language that looks very much like Scheme and extending it to cover both programming with dependent types and using dependent types for mathematical reasoning. Readers should be familiar with the basics of a Lisp-like programming language, as presented in the first four chapters of The Little Schemer.

The first five chapters of The Little Typer provide the needed tools to understand dependent types; the remaining chapters use these tools to build a bridge between mathematics and programming. Readers will learn that tools they know from programming—pairs, lists, functions, and recursion—can also capture patterns of reasoning. The Little Typer does not attempt to teach either practical programming skills or a fully rigorous approach to types. Instead, it demonstrates the most beautiful aspects as simply as possible, one step at a time.
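The book's own examples are written in Pie, the small language it develops, but a rough flavor of "types that describe behavior" can be sketched even in Haskell's approximation of dependent types (an illustration of the idea, not code from the book):

{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

data Nat = Z | S Nat

-- The length of the vector is part of its type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- Only nonempty vectors are accepted, so "head of an empty list"
-- is a compile-time error rather than a runtime crash.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x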

Slashdot: Meet the World's First Self-Driving Car From 1968

Qbertino writes: The German Web industry magazine T3N (think of it as the German TechCrunch) has an article about a test circuit and a test vehicle -- a modified Mercedes Benz limousine of the time -- that was set up by the German tire manufacturer Continental in order to test tires in a precisely reproducible set of tests. Hence the self-driving mechanism provided by a wire in the test track to send and receive signals from the car and to record data on the test runs on magnetic tape and other high-tech stuff from the time. Here's a short video, erm, film clip showing the setup in action -- driverless seat included. Today's artificial intelligence is nowhere to be seen of course, but the entire setup itself seems pretty impressive and sophisticated.

Read more of this story at Slashdot.

Slashdot: How Qualcomm Tried and Failed To Steal Intel's Crown Jewel

An anonymous reader shares an article from Bloomberg: In early November, Qualcomm Chairman Paul Jacobs stood on a stage in the heart of Silicon Valley and vowed to break Intel's stranglehold on the world's most lucrative chip business. The mobile internet and cloud computing were booming and the data centers running this digital economy had an insatiable thirst for computer servers -- and especially the powerful, expensive server chips that Intel churns out by the million. Qualcomm had spent five years and hundreds of millions of dollars designing competing processors, trying to expand beyond its mobile business. Jacobs was leading a coming-out party featuring tech giants like Microsoft and HP, which had committed to try the new gear. "That's an industry that's been very slow moving, very complacent," Jacobs said on stage. "We're going to change that." Less than a year later, this once-promising business is in tatters, according to people familiar with the situation. Most of the key engineers are gone. Big customers are looking elsewhere or going back to Intel for the data center chips they need. Efforts to sell the operation -- including a proposed management buyout backed by SoftBank -- have failed, the people said. Jacobs, chief backer of the plan and the son of Qualcomm's founder, is out, too. The demise is a story of debt-fueled dealmaking and executive cost-cutting pledges in the face of restless investors seeking quick returns -- exactly the wrong environment for the painstaking and expensive task of building a new semiconductor business from scratch. It leaves Qualcomm more reliant on a smartphone market that's plateaued. And Intel's server chip boss is happy.

Read more of this story at Slashdot.

Disquiet: Stasis Report: Loscil ✚ Grouper ✚ More

The latest update to my Stasis Report ambient-music playlist. It started out just on Spotify. It’s now also on Google Play Music. The following five tracks were added on Sunday, September 23. Four of the tracks are fairly new, with the exception of one from 2012:

✚ “Quiet Midnight” off Spring by ioflow (aka Joshua Saddler of San Diego, California): ioflow.bandcamp.com. Released on the Gohan Tapes label.

✚ “Laramie” off Ghost Box (Expanded) by SUSS, whose members are Bob Holmes, Gary Leib, Pat Irwin, Jonathan Gregg, and William Garrett: suss.bandcamp.com. Interview at brooklynvegan.com. Album on the Northern Spy label.

✚ “Imprints,” a single by Loscil, working with Field Works, aka Stuart Hyatt, based on Hyatt’s own field recordings. It’s from a new collection, Born in the Ear, also featuring Paul de Jong, Eluvium, Forrest Lewinger, the Album Leaf, Greg Davis, Juana Molina, and Hyatt: fieldworks.bandcamp.com. Released on the Temporary Residence label.

✚ “Blouse” from Grid of Points by Grouper, aka Liz Harris: grouper.bandcamp.com. Released on the Yellow Electric label.

✚ “Brûlez ce coeur,” the title track off a 2012 album by Les Momies de Palerme (on Constellation Records), aka Xarah Dion and Marie Davidson: cstrecords.com, soundcloud.com/constellation-records. It’s also on Bandcamp, though minus a few tracks: lesmomiesdepalerme.bandcamp.com. Davidson has a new album, Working Class Woman, due out October 5 on Ninja Tune.

Some previous Stasis Report tracks were removed to make room for these, keeping the playlist length to roughly two hours. Those retired tracks (Leila Abdul-Rauf, Lucrecia Dalt, M. Geddes Gengras, and Jake Muir, as well as Kara-Lis Coverdale remixing Úlfur) are now in the Stasis Archives playlist (currently only on Spotify).

Slashdot: Japan's Two Hopping Rovers Successfully Land On Asteroid Ryugu

sharkbiter shares a report from Space.com: The suspense is over: Two tiny hopping robots have successfully landed on an asteroid called Ryugu -- and they've even sent back some wild postcards from their new home. The tiny rovers are part of the Japan Aerospace Exploration Agency's Hayabusa2 asteroid sample-return mission. Engineers with the agency deployed the robots early Friday (Sept. 21), but JAXA waited until today (Sept. 22) to confirm the operation was successful and both rovers made the landing safely. In order to complete the deployment, the main spacecraft of the Hayabusa2 mission lowered itself carefully down toward the surface until it was just 180 feet (55 meters) up. After the rovers were on their way, the spacecraft raised itself back up to its typical altitude of about 12.5 miles above the asteroid's surface (20 kilometers). The agency still has two more deployments yet to accomplish before it can rest easy: Hayabusa2 is scheduled to deploy a larger rover called MASCOT in October and another tiny hopper next year. And of course, the main spacecraft has a host of other tasks to accomplish during its stay at Ryugu -- most notably, to collect a sample of the primitive world to bring home to Earth for laboratory analysis. JAXA tweeted on Saturday: "We are sorry we have kept you waiting! MINERVA-II1 consists of two rovers, 1a & 1b. Both rovers are confirmed to have landed on the surface of Ryugu. They are in good condition and have transmitted photos & data. We also confirmed they are moving on the surface."

Read more of this story at Slashdot.

explodingdog: Photo



Disquiet: “Module Learning: Swoop (Bounds Parameters)”

I recently reworked about half my modular synthesizer, with some key priorities in mind, among them (1) introducing some modules with their own strong personalities, notably the 4MS Spectral Multiband Resonator and the Ieaskul F. Mobenthey Swoop, and (2) having a means to be always recording. I added an Expert Sleepers Disting MK4 between the mixer and the output to use as an always-on recorder. This track is a first attempt I made at using the IFM Swoop. Here the Swoop is producing a triangle wave, bounds of which are changing as the track proceeds (among other variables I’m just beginning to wrap my head around). It’s secondarily being sent through a low-pass filter, to take the upper edge off.

The attached photo shows the patch, though some of the knobs were fiddled with as it ran. (Perhaps the main thing I learned today was that it’s sort of a hassle to get the SD card in and out of the Disting. The always-on recording was nice, though. This is the end of a longer segment, the opening part of which was even lower on the learning curve. When I was done, I just edited off the opening half and introduced a fade-in. The fade-out was done manually with my in-rack mixer.)

Planet Haskell: ERDI Gergo: CPU modeling in CλaSH

My entry for RetroChallenge 2018/09 is building a CHIP-8 computer. Previously, I've talked in detail about the video signal generator and the keyboard interface; the only part still missing is the CPU.

The CHIP-8 instruction set

Since the CHIP-8 was originally designed to be an interpreted language run as a virtual machine, some of its instructions are quite high-level. For example, the framebuffer is modified via a dedicated blitting instruction; there is a built-in random number generator; and there are instructions to manipulate two 60 Hz timers. Other instructions are more in line with what one would expect to see in a CPU, and implement basic arithmetic such as addition or bitwise AND. There is also a generic escape hatch instruction but that doesn't really apply to hardware implementations.

The CPU has 16 general-purpose 8-bit registers V0…VF; register VF is also used to report flag results like overflow from arithmetic operations, or collision during blitting. Most instructions operate on these general registers. Since the available memory is roughly 4K, these 8-bit registers wouldn't be too useful as pointers. Instead, there is a 12-bit Index register that is used as the implicit address argument to memory-accessing instructions.

For flow control, the program counter needs 12 bits as well; the CHIP-8 is a von Neumann machine. Furthermore, it has CALL / RET instructions backed by a call-only stack (there is no argument passing or local variables).

Modeling the CPU's internal state

We can collect all of the registers described above into a single Haskell datatype. I have also added two 8-bit registers for the high and low byte of the current instruction, but in retrospect it would be enough to just store the high byte, since the low byte is coming from RAM exactly when we need to dispatch on it anyway. The extra phase register is to distinguish between execution phases such as fetching the first byte of the next instruction, or for instructions that are implemented in multiple clock cycles, like clearing the frame buffer (more on that below).

type Addr = Unsigned 12
type Reg = Index 16

data CPUState = CPUState
    { opHi, opLo :: Word8
    , pc, ptr :: Addr
    , registers :: Vec 16 Word8
    , stack :: Vec 24 Addr
    , sp :: Index 24
    , phase :: Phase
    , timer :: Word8
    , randomState :: Unsigned 9
    }
    

I implemented the random number generator as a 9-bit linear-feedback shift register, truncated to its lower 8 bits; this is because a maximal 8-bit LFSR wouldn't generate 0xFF.

-- One step of the 9-bit LFSR: rotate right by one, then XOR bit 4
-- with the complement of the register's least significant bit.
lfsr :: Unsigned 9 -> Unsigned 9
lfsr s = (s `rotateR` 1) `xor` b4
  where
    b = fromIntegral $ complement . lsb $ s  -- complemented feedback bit
    b4 = b `shiftL` 4                        -- moved to the tap position

Input and output "pins"

Similar to how a real chip has various pins to interface with other parts, our CPU description will also have multiple inputs and outputs. The input consists of the data lines read from main memory and the framebuffer; the events coming from the keypad, and the keypad state; and the 60 Hz VBlank signal from the video generator. This latter signal is used to implement the timer register's countdown. The keypad's signals are fed into the CPU both as events and statefully; I've decided to do it this way so that only the peripheral interface needs to be changed to accommodate devices that are naturally either parallel (like a keypad matrix scanner) or serial (like a computer keyboard on a PS/2 connector).

type Key = Index 16
type KeypadState = Vec 16 Bool

data CPUIn = CPUIn
    { cpuInMem :: Word8
    , cpuInFB :: Bit
    , cpuInKeys :: KeypadState
    , cpuInKeyEvent :: Maybe (Bool, Key)
    , cpuInVBlank :: Bool
    }      
    

The output is even less surprising: there's an address line and a data out (write) line for main memory and the video framebuffer.

type VidX = Unsigned 6
type VidY = Unsigned 5

data CPUOut = CPUOut
    { cpuOutMemAddr :: Addr
    , cpuOutMemWrite :: Maybe Word8
    , cpuOutFBAddr :: (VidX, VidY)
    , cpuOutFBWrite :: Maybe Bit
    }
    

So, what is a CPU?

As far as CλaSH is concerned, the CPU is extensionally a circuit converting input signals to output signals, just like any other component:

extensionalCPU :: Signal dom CPUIn -> Signal dom CPUOut
    

The internal CPU state is of no concern at this level. Internally, we can implement the above as a Mealy machine with a state transition function that describes behaviour in any given single cycle:

intensionalCPU :: (CPUState, CPUIn) -> (CPUState, CPUOut)

-- curried to fit mealy's expected s -> i -> (s, o) shape
extensionalCPU = mealy (curry intensionalCPU) initialState

As far as a circuit is concerned, a clock cycle is a clock cycle is a clock cycle. If we want to do any kind of sequencing, for example to fetch two-byte instruction opcodes from the byte-indexed main memory in two steps, we need to know in intensionalCPU which step is next. This is why we have the phase field in CPUState, so we can read out what we need to do, and store what we want to do next. For example, in my current version the video framebuffer is bit-indexed (addressed by the 6-bit X and the 5-bit Y coordinate), and there is no DMA to take care of bulk writes; so to implement the instruction that clears the screen, we need to write low to all framebuffer addresses, one by one, from (0, 0) to (63, 31). This requires 2048 cycles, so we need to go through the Phase that clears (0, 0), to the one that clears (0, 1), all the way to (63, 31), before fetching the first byte of the next opcode to continue execution. Accordingly, one of the constructors of Phase stores the (x, y) coordinate of the next bit to clear, and we'll need to add some logic so that if phase = ClearFB (x, y), we emit (x, y) on the cpuOutFBAddr line and Just low on the cpuOutFBWrite line. Blitting proceeds similarly, with two sub-phases per phase: one to read the old value, and one to write back the new value (with the bitmap image xor'd to it). (A sketch of the ClearFB case follows the Phase definition below.)

data Phase
    = Init
    | Fetch1
    | Exec
    | StoreReg Reg
    | LoadReg Reg
    | ClearFB (VidX, VidY)
    | Draw DrawPhase (VidX, VidY) Nybble (Index 8)
    | WaitKeyPress Reg
    | WriteBCD Word8 (Index 3)      
    
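To make the screen-clearing phase concrete, here is a minimal sketch (mine, not from the post) of the ClearFB step, written against the CPUState, CPUOut and Phase definitions above; low is the CλaSH bit constant:

-- One cycle of ClearFB: write a low bit at (x, y), then move on to the
-- next coordinate, or resume instruction fetch after (63, 31).
clearFBStep :: CPUState -> (VidX, VidY) -> (CPUState, CPUOut)
clearFBStep s (x, y) = (s{ phase = phase' }, out)
  where
    phase'
      | x == maxBound && y == maxBound = Fetch1              -- whole screen cleared
      | y == maxBound                  = ClearFB (succ x, 0) -- next column
      | otherwise                      = ClearFB (x, succ y)
    out = CPUOut
        { cpuOutMemAddr = pc s, cpuOutMemWrite = Nothing
        , cpuOutFBAddr = (x, y), cpuOutFBWrite = Just low
        }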

So how should we write intensionalCPU? We could do it in direct style, i.e. something like

intensionalCPU (s0, CPUIn{..}) = case phase s of
    Fetch1 ->
        let s' = s{ opHi = cpuInMem, pc = succ $ pc s, phase = Exec }
            out = CPUOut{ cpuOutMemAddr = pc s', cpuOutMemWrite = Nothing
                        , cpuOutFBAddr = minBound, cpuOutFBWrite = Nothing
                        }
        in (s', out)
    WaitKeyPress reg ->
        let s' = case cpuInKeyEvent of
                Just (True, key) -> s{ registers = replace reg key (registers s), phase = Fetch1 }
                _ -> s
            out = CPUOut{ cpuOutMemAddr = pc s', cpuOutMemWrite = Nothing
                        , cpuOutFBAddr = minBound, cpuOutFBWrite = Nothing
                        }
        in (s', out)                   
    -- and lots of other cases as well, of course
  where
    s | cpuInVBlank = s0{ timer = fromMaybe 0 $ predIdx $ timer s0 }
      | otherwise = s0
    

If you think this is horrible and unreadable and unmaintainable, then yes! I agree! Which is why I've spent most of this RetroChallenge (when not fighting synthesizer crashes) thinking about nicer ways of writing this.

This post is getting long, so let's end on this note. Next time, I am going to explain how far I've gotten in this quest for nicely readable, composable descriptions of CPUs.

new shelton wet/dry: ‘I think that God in creating Man somewhat overestimated his ability.’ –Oscar Wilde

[W]hy are some societies more religious than others? One answer is religious coping: Individuals turn to religion to deal with unbearable and unpredictable life events. To investigate whether coping can explain global differences in religiosity, I combine a global dataset on individual-level religiosity with spatial data on natural disasters. Individuals become more religious if an [...]

ScreenAnarchy: Fantastic Fest 2018 Review: FP2: BEATS OF RAGE Is One For The Fans

FP2: Beats of Rage is the kind of sequel that seems both impossible and inevitable at the same time. 2012's The FP presented such a unique, idiosyncratic vision of a dystopian future in which the course of human existence was determined by video game combat in the form of "Beat Offs", duels on a Dance Dance Revolution styled arcade game renamed Beat Beat Revelation. The film was so odd, the tone so unique, and the humor so specifically calibrated that you either love or hate the film, and initial reactions were very split. Writer/director/star Jason Trost managed to round up a team and run a couple of successful crowdfunding campaigns to get FP2 off the ground, and the result is a film that is both more ambitious than...

[Read the whole post on screenanarchy.com...]

Planet Haskell: Edward Z. Yang: HIW’18: Let’s Go Mainstream with Eta!

My name is Rahul Muttineni, CTO of TypeLead, working on building services around a language named Eta. To get started, I'll give an overview of how the project started, and where it is now.

It started as an HSOC project. It was called GHCVM; back then we had plans of making it work on both the JVM and CLR... we don't think about CLR anymore. I was mentored by Edward Kmett. We got a pretty good response on this, so Jo and I decided to take the risk and work on this full time.

Big thanks to the GHC team, really good work. We've worked with the codebase for two years, and the more and more we work with it, we see how much awesome stuff there is. I've learned a lot by working with the code.

What is Eta? Eta is a fork of GHC. During the GSOC project, it started off as a Haskell program that used the GHC API. Midway in the program, I found that there were certain things that I wanted to do that I couldn't do, and I spent 3-4 days setting up a fork. I'll talk about what those limitations are. Like Haskell, it's a ... language, but the key difference is that it runs on the JVM. That is its own set of challenges, primarily with respect to tail calls. The nice thing about Eta is that it runs on the JVM, and it can run a good chunk of projects just like that. lens... recently, in the last month, we got Yesod working... it's in good shape. The next really great part of Eta is the strongly typed FFI. That works really well with the subtyping in the JVM. A good chunk of the talk is about how we got that working. One of the main focuses of Eta is industrial use. GHC is focused on industrial use, and research, both. There's a tension between the two... the nice thing we have for Eta is we don't have to face that tension; it's easy to make decisions on how to add new features, because: will it help companies? If the answer is yes, we add it; otherwise we don't. (SPJ: That can be a hard question to answer!)

Haskell: Avoid success at all costs. We're not going to sacrifice core principles of language for benefit. Pursue success, at minimum cost. We want to make it successful as much as possible, but we want to make as little sacrifice as possible. That will be a little tricky...

What is Eta? What language features does it support? It started off as a fork of GHC 7.10.3. All extensions that work there work with Eta as well. The only thing was TemplateHaskell and QuasiQuotes didn't work for a long time. We got it working 3-4 months ago. Biggest change is the Java FFI. GHC 7.10.3 is MINUS C FFI. We could have supported it: Java has JNI, but we tried to avoid it because we didn't want to make platform-specific bindings to all the libraries.

Joe backported a bunch of GHC 8 features: StrictData, ApplicativeDo, OverloadedLabels. Backpack was added recently. There's a very particular reason we had to do it: it has to do with the fact that we don't have green threads by default, and we wanted to give the user a choice of threaded runtime versus blocking runtime.

The compiler? It's a fork of GHC, so all the compiler passes are the same. We just chopped off everything after STG; e.g., C-- is gone. We generate bytecode from STG. We don't do any optimizations right now, and won't need to for some time. We don't have to because on the JVM it's JIT compiled, so we don't have to optimize as much, since the JVM will remove a lot of the code that's not used anyway. And the driver: GHC generates object files... we decided to use JAR files. They're just zip files that package up a bunch of class files that store Java bytecodes. We also added one more mode for Uberjars. These are JAR files that are packaged up into one giant package.

I'll talk a little bit about how we implemented the REPL and Template Haskell. It works through the external-interpreter architecture. In GHC that's called iserv: the process, what it does, is handle running the code. So the compiler will still do the typechecking and everything, but once it's done with all that stuff, GHC will generate a specific bytecode set for interpreting Haskell efficiently. Because we already generate JVM bytecodes, we didn't need that custom bytecode set; we just compile with optimizations off; that gives us JVM bytecodes, then we send it to the external process, load it up, and execute them. Getting the REPL working this way was pretty easy. The JVM has a mechanism called classloading, which is very flexible. You can download bytecodes from the network, get code at runtime. Once you load the class, it's statically compiled code, it's optimized the same, etc.
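The classloading mechanism referred to here is ordinary JVM API; roughly like this (a sketch, not Eta's actual iserv code; names are hypothetical):

// Hypothetical sketch: load freshly generated bytecodes into a running JVM.
import java.net.URL;
import java.net.URLClassLoader;

class ReplLoader {
    static Class<?> loadChunk(URL jarUrl, String className) throws Exception {
        // Each REPL compilation unit can get its own loader.
        URLClassLoader loader = new URLClassLoader(new URL[]{ jarUrl });
        return loader.loadClass(className);  // defined at runtime, JIT-optimized as usual
    }
}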

The build tool we use is Etlas. We didn't want to move too far off of GHC, we stuck with Cabal. At the point we started using it, we forked off of Cabal 2.0. Main difference is that it lets you manage Eta versions. Etlas is almost like Stack, but it's much much closer to Cabal. We took the nice features of Stack and added them to Cabal. The other thing is that it does patch management. What we've been finding as we add more features and backport, Eta is not exactly GHC 7.10, nor is it GHC 8.0, it's a weird intermediate state, so certain packages that won't exactly compile without small changes, so we needed some system to apply those changes before we actually run the build. So we setup a GitHub repository that stores all the patch files. What etlas will do, it will get you the most recent set of patches. Then if you install a package, lens or something, it will download lens, apply the patch, and then it will build. Just recently, we were using base 4.8, and recently we upgraded to base 4.11. But we couldn't update to the new Generics infrastructure, because it slowed down compile times. So there were a bunch of packages that would check if they were GHC 8... and then use new generics. So we had to do a bunch of patching for that. But that's the kind of stuff we have to deal with.

The title of this talk is let's go mainstream with Eta. I want to take a moment and say, what does that mean? "The ideas, attitudes, or activities that are shared by most people and regarded as normal or conventional." So at what point does a programming language become considered normal or conventional? It has to be used by a big company, solve a big real world problem, and people have to believe it works. That's a very complicated question, multifaceted; one part of that answer is, it should make it easier to solve real world problems than the status quo. Take for example PHP. PHP came out when there was nothing better to program dynamic web applications. It had just the minimum features required to make it useful to build these. Now everyone here is asking the question: Haskell clearly solves a lot of problems better than the status quo. So why isn't it moving forward? That's a big question; I'm going to talk about how we're approaching it.

The strategy we're using internally is we put on a "Big Company Hat": we pretend we're a big company with a lot of employees and millions or billions of lines of code, and try to figure out what problems they'll face. Some problems are crazy long build times when trying to build huge software; dynamics, where you have to make sure junior developers get up to speed... etc. That's a couple of examples to get this conversation started.

After thinking about this a long time, we boiled it down to three basic principles, how we will develop Eta.

1. User Experience
2. Performance
3. Safety

User Experience is mainly, an emotional thing, how you feel when you use Eta technology, how you interact with it, what you feel when you get an error, psychologically. When something has good UX, we feel good. That's a very subjective thing, it can vary between different people, we have to figure out a way to standardize / make it universal. Something we forget as software and tool developers, the person developing the software is human. If they get errors persistently over time, they'll get frustrated. Machines will do what you tell them over and over again.

So what have we done in Eta in this area? We've done something very recently; it's not in master yet. Jo and I spent a week refactoring the error reporting part of the typechecker. Internally, GHC stores errors as a list of pretty-printed documents; the problem is we can't do postprocessing on that. So, what Jo did was make a giant data type with three hundred data constructors, one for every error message in GHC. That refactor took a week (SPJ: only a week?!). How it is now, it's decoupled: instead of storing strings in the typechecking monad, you store a data type that holds the relevant data to print out that error message. And then at the final point, you can traverse the data type; based on the presence of other errors, you can decide what to do. Now it's pattern matching on certain error patterns and reporting them nicely. This is one example. We talked about simple errors: refactoring, adding an argument, changing the type; that's one of the most common errors you'll get working with Haskell. So we focused on those first. This shows an example of a type error... 'checker', it's an IO action.
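The idea, roughly (constructor and field names here are hypothetical, not Eta's actual ones): keep each diagnostic as structured data, and only render it to text at the very end, where you can still pattern match on it.

-- Hypothetical sketch: structured errors instead of pretty-printed strings.
data TypeError
  = CouldNotMatch   { expected :: String, actual :: String }
  | MissingArgument { function :: String, missingType :: String }
  -- ... one constructor per kind of diagnostic

render :: TypeError -> String
render (CouldNotMatch e a)   = "Couldn't match " ++ e ++ " with " ++ a
render (MissingArgument f t) = f ++ " looks like it is missing an argument of type " ++ t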

GHC would tell you: couldn't match Int -> IO () with IO (). The problem is, for people who don't know how the typechecker works, they won't be able to understand what the typechecker is doing: going argument by argument. Because of the refactor we've done, it was easy to pattern match on this particular case and say: hey, if the user forgot to put an argument, you can print out an error message of this form. You print that an argument is missing, you highlight it. (SM: You might have been missing the first argument, in this case!) That's true. It's tricky; sometimes the suggestion you give might not be right. We don't tell people what they did exactly wrong, because we don't know. This is not a perfect thing, but we try to give the best suggestion that we can. An important input into how we decided this layout: we studied languages like Elm and PureScript, which have done good work in this area. What PureScript and Elm both do, for a certain type of error where you're not sure what to do (e.g., our info is not complete), is let you go to a particular link and see other things that could have happened. So we don't have to flood the user with every suggestion, we just have to show the user what probably is the cause for it. And if it's a tricky case, not what we posted, the link will have that case as well.

(BG: There's other information that might be relevant; expanding type synonyms, etc. Do you have this info?) We're still thinking about that. Probably we'll have extra flags and stuff. Eventually, we'll have a mode that prints out JSON for IDEs, then it's easier to parse on the IDE side. (BG: Incidentally, there's a ticket, a student working with Richard, trying to figure out something similar).

Another aspect of UX is we added the REPL. Tried to simplify the entry point, try to make it easy. You want types, kinds, and where to find out more information. This is a statically typed language: you always have to be thinking about types. So we :set +t: always print out the types when you print things. One more thing: one of the former Scala engineers, who has been learning Haskell, made a critique of one aspect of the REPL experience. f is a function of two arguments. In a second statement of the REPL, I applied 1. Find instance, show instance, for a goes to a. He said that... no show instance found; just say that this is a function, and you can't print it. That's a change we made. This was very easy for us to do.

Performance: it can mean many things. We're talking about a fast developer feedback loop: compile time and develop time, reducing that feedback loop. Some work we've done in this direction is reproducible builds. As of now, we have bit-for-bit reproducibility in Eta. That amounted to... Nikita already did lots of work on reproducibility; he made Haskell interfaces reproducible; but the last mile of bit-for-bit is hard, there's many places. For our code generator, it was a lot simpler, we didn't have to do as much. It was 20 lines of code to make it deterministic. The main source of nondeterminism in GHC is the Unique data type, that changes between different runs depending on environment. What we did was add a counter. We used to print the uniques in the Java class name; that would make it nondeterministic. So we made a counter: the order in which the bindings make it to STG is the same.

GHCi is known to take up lots of memory, esp. with IDE. Simon Marlow has a bunch of fixes to that; we also backported those.

Another aspect of performance is the actual runtime performance. We're on the JVM; that puts us at a huge disadvantage. We don't have control over many things. The runtime system... this is Java. It's OO, so the runtime system is implemented in Java. We set up a hierarchy for values that are defined in Eta. We have Closure, a class that is the parent class of all values: thunks and WHNF values. The Closure class has two methods: evaluate, which evaluates to WHNF, and enter, which will actually enter... it's similar to the GHC runtime system. The initial version was modeled exactly after GHC, except for tail calls. The terminology is similar. Enter is primarily used when you run the body of a function. The main subclasses of Closure are Thunk and Value. Value will be the parent class of things like functions, partially applied functions, and data constructors. Thunk will be the superclass of things like CAFs, single entry thunks, and updatable thunks. CAFs don't have free variables, so there's a special case for that, and you create a blackholing entry every time, to avoid two threads evaluating the same thunk. UpdatableThunk pushes an update frame, and when it's finished evaluating, it will update the thunk to point to the newly computed value. And SingleEntryThunks are evaluated only once, so you can just evaluate them directly without pushing an update frame. This terminology is borrowed from GHC as well.
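In Java terms, the hierarchy described here looks roughly like this (a sketch; Eta's actual runtime classes carry much more machinery):

// Hypothetical sketch of the Eta runtime value hierarchy.
abstract class Closure {
    abstract Closure evaluate();   // reduce this closure to WHNF
    abstract Closure enter();      // execute the code body
}

abstract class Value extends Closure {
    @Override Closure evaluate() { return this; }   // values are already in WHNF
}

abstract class Thunk extends Closure {
    // subclasses: CAF, SingleEntryThunk, UpdatableThunk
}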

Values: DataCon, Function and PAPs. In the early days, and even now, every function call that was a tail call is just a method call. This is the only way to make it remotely efficient. (More on the stack soon.) Static tail recursive calls, singly recursive or mutually recursive, get compiled to loops. In most cases, they get a nice tight loop. In the mutual case, what will happen is, we collect all of the SCC, and we make one giant method that goes into a loop. Let's say you're in the even/odd example: what will happen is, when even calls odd, there's a variable called target, an integer. Even will be assigned 0, odd is assigned 1, so then you set 1 and restart. (BG: Do you always have unfoldings available for functions you compiled?) This is mutually recursive functions defined in the same module. (SPJ: They might have very different type arguments.) We concat all the arguments into one. The main problem with this approach is parsers generated with Happy and Alex: we hit limits. (BG: Crash?) Not stack blowup. The JVM has a method size limit, so you can only have 65000 bytecodes. That's Eta compiled with itself. That's the only thing that's preventing us from using Eta with Eta. But all you need to do is split the method into smaller chunks.

So how do we handle tail calls when we don't know they're tail recursive? Let's say you're using CPS. It's so common in Haskell; any fast parser uses CPS. In the early days, Aeson would just blow the stack; it was pretty bad. So, we explored trampolining by default, and it was just awful; it was slow, super slow. What we did is turn it off, and let the stack blow up. We found a better solution. The JVM has... the only way to unwind the stack is throwing an exception, or returning, and keep on returning until you return all the way down. It turns out, with exceptions, you can turn off the feature that captures the stack trace: that's the most expensive part of an exception. So we have a general exception. So this trampoline mechanism is optional. What we do is have a function 'trampoline :: a -> a', a runtime primitive; what it does is activate a boolean in the context which says, I'm going to trampoline now, and it activates a codepath that bumps a counter, and once you reach a certain number, which is configurable, it will unwind the stack, and then continue where it needed to go. Our limit is 400, and then we unwind. It used to be in the 1000s, but with Happy and Alex, we needed a smaller number. (BG: Inside that context, how much does it cost? But observably, it's faster. A couple months ago, we got PureScript to work in Eta, and it wasn't bad by default?) (SPJ: So you could turn it on by default: all you're doing is counting.) The counting is how we know how big the stack is. In your main function, you could call trampolineIO, and trampoline your whole program. (SPJ: Maybe it's low overhead, and you can do it all the time.) If it's low, we will do it. (How do you resume? Once you raise the exception, what do you store?) The counter happens at the entry point, and it's guarded by the boolean. So, if the limit is exceeded, it will call another function that takes the context. We store all the arguments in a context variable that gets passed to every Eta function. We stash all the arguments into a function that has the state; then, when it unwinds, marked by this function, it will call that, with that function and those arguments.

As I mentioned, it's guarded by a boolean. The JVM has an optimization where, if it observes the boolean takes the same value a lot of the time, it won't even compile that branch in the native code. So if you don't use trampolining, it doesn't affect you at all; the code for the counter will just not be there.
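In plain Java the cheap-unwinding trick looks roughly like this (a sketch; Eta's actual runtime classes are named differently): the four-argument Throwable constructor accepts writableStackTrace = false, skipping the stack capture that makes exceptions expensive.

// Hypothetical sketch: a control-flow exception for unwinding the JVM stack.
final class Trampoline extends RuntimeException {
    final Object continuation;  // the stashed function + arguments to resume with

    Trampoline(Object continuation) {
        // message = null, cause = null, no suppression, and crucially
        // writableStackTrace = false: don't capture a stack trace.
        super(null, null, false, false);
        this.continuation = continuation;
    }
}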

One nice thing I like about Eta is that you actually get stack traces for exceptions. This is because, to get good perf for Eta, you have to implement most primitives on the JVM stack. This is a sample stack. You have a schedule loop, and you are evaluating some IO action. applyV/applyN, these are partial applications. Execute an IO action. And another nice part: we've tried to encode it close to the original name. So you can tell this function call happened in statistics.Regression, rnfAll. If you look, you notice there are line numbers. This is not perfect, and we can definitely make it better later... GHC gives you a lot of debugging info at STG time, but because the JVM doesn't have much flexibility, we can only attach one line number to code, so we have to discard all that info. This will get better; we'll stash that debug information in the classfile itself, and then access it and render a better stack trace. (BG: This is Source Notes?) Yeah.

Concurrency: One nice part is, it's nice or not. If you're evaluating a long chain of thunks, you're going to blow the stack. This happily coincides with GHC also having a space leak. Neil Mitchell wrote a blog post about how to detect space leaks: restrict stack size and then figure out which thunk was being evaluated. If you see a stack trace like this, and you see a huge chain of evaluates, in a long chain, you probably have a space leak.

How do I do interop? The way we did interop was to make a thing called the Java monad. It's supposed to give you the experience of programming Java. The basic implementation is inspired by the IO monad. Object# c is "this", the object that is being threaded through. Because of this encoding, you get the Java experience: you can call dot on the Java object. It's almost like working with Java inside. The argument is called... that's the type constructor that forced us to fork, instead of using the API. You can't declare primitive types in the API. And we had to introduce a new low level representation. You declare wrapper types, wrapping the Iterable interface in Java. We've stolen better syntax, which were type applications... resolve it somehow. I'm declaring an Eta type that wraps a Java type, @java.lang.Iterable.
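The wrapper-type declarations and imports look something like this (a sketch based on Eta's documented FFI syntax; the exact details may differ from what was shown in the talk):

-- Sketch of Eta's Java FFI; exact syntax may differ.
-- Declare an Eta type that wraps a Java class.
data JInteger = JInteger @java.lang.Integer
  deriving Class

-- Import a constructor and a method into the Java monad.
foreign import java unsafe "@new"     newInteger :: Int -> Java a JInteger
foreign import java unsafe "intValue" intValue   :: Java JInteger Int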

You use the java function to run the Java monad. All of these have to be imported. newArrayList, newInteger, but we brought in some combinators that let you call methods. It worked out with the monad. This is sample code that does the same thing as Java code; it just uses standard monadic combinators. If it's a fixed c, it's an instance.

You can use Eta as a better Java, with referential transparency! Unlike Kotlin or Scala.

How do we handle subtyping? We define built-in type families. We have a typeclass named Extends. Any time you declare a function that takes a given class and any subtype of that class, instead of actually subtyping, we do it with constraints. Extends' takes the info from Inherits and figures it out. You can use the dot operator on anything that is a subclass of Iterator. We had to extend the typechecker just a little bit: a lot of times the type gets stuck in the form Extends' (List JString) (List a) where a is unconstrained.
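The general pattern, illustrated in ordinary Haskell (just the shape of the encoding, not Eta's actual classes): subtyping becomes a typeclass constraint plus an explicit upcast.

{-# LANGUAGE MultiParamTypeClasses, FlexibleContexts #-}

-- Illustration only: encode "sub is a subtype of super" as a constraint.
class Extends sub super where
  upcast :: sub -> super

newtype Animal = Animal String
newtype Dog    = Dog String

instance Extends Dog Animal where
  upcast (Dog name) = Animal name

-- Accepts a Dog, or anything else declared a subtype of Animal.
greet :: Extends a Animal => a -> String
greet x = let Animal name = upcast x in "hello, " ++ name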

Imports are tiresome, so we're setting up direct Java interop: we use Java reflection to get info from class files, and generate imports. "import java java.lang.Math" works, but doesn't scale. Biggest priority for the rest of the year is Java interop, really good IDE support, documentation, language extensions: UnboxedSums, TypeApplications, DerivingVia, QuantifiedConstraints. We have some new language extensions in mind, AnonymousRecords, RowTypePolymorphism... We'll see how that goes.

I was thinking about ways... we work on the same codebase, how to collaborate? We're interested in compile performance, support for unboxed sums. Worker wrapper has some glitch, and no one got around to fixing it. At some point, maybe not any time soon, that and mutable fields. Pretty important for us. (BG: Do unboxed sums get a lot of usage? Why unboxed sums? Does Eta code make a lot of use?) No. But a lot of people on the JVM are annoyed that Maybe is boxed all the time. But if you have unboxed sums, you can represent it as null. (SPJ: Or you can say, just box it, and you won't notice it. If it's fast enough all the time, focus on what's going to make a difference.)

Q: Did you consider using Graal (it's a new virtual machine that supports partial evaluation and partial escape analysis, good for functional languages)?

A: We have looked into it; it's not completely there yet to use, and we're not sure if it's something we can invest time in. We're keeping up with it. (BG: But you lose the JVM!) That's what's preventing us from going there. Maybe if it gets integrated into a mainline VM we might look at it. (Mainline Java is planning to integrate Graal.)

Q: (SM) Are you keeping the fork up to date with master GHC?

A: One thing that is out of bounds for us, and for a long time, is all the dependent Haskell work. Everything else, we keep up. If there's any nice bugfixes... (SM: So you're selectively backporting).

Q: (BG) Have you considered unforking?

A: Not yet, no.

ScreenAnarchy: Fantastic Fest 2018 Interview: Gareth Evans Talks APOSTLE

Serving up a big bowl of angry stew, Gareth Evans' Apostle is a kinetic exercise in British folk horror. Full disclosure: I confess that I'm only vaguely familiar with the sub-genre. Watching the filmmaker's latest effort unfold on a big screen during its world premiere at Austin's Fantastic Fest -- sorry, everyone else, you'll have to wait until October 12, when it debuts globally on Netflix -- I was reminded simultaneously of Joko Anwar's Modus Anomali and Ben Wheatley's A Field in England, movies that are atypically creepy and disturbing. British folk horror is "a very specific sub-genre," Evans told me the day after the screening. "What I love about it is that there's enough that's grounded in reality, but then there's just something about...

[Read the whole post on screenanarchy.com...]

things magazine: The Atlas of Remote Islands revisited, post 2 of 4

Part two of our virtual voyage around the world (part 1, the source material: Judith Schalansky’s Atlas of Remote Islands, ‘Fifty Islands I have not visited and never will’ – oddly the US edition is subtitled ‘Fifty Islands I Have …

Perlsphere: Perl foreach loops

A foreach loop runs a block of code for each element of a list. No big whoop; “perl foreach” continues to be one of the most popular Google searches for the language. So we thought we’d see what’s happened in 20 years. I expand on Tom Christiansen’s slide that’s part of his longer presentation, then add a new but experimental feature at the end. If you want more, there’s plenty to read in perlsyn or my book Learning Perl.

Going through a list

Unless you say otherwise, foreach aliases the current element to the topic variable $_. You can specify that list directly in the parentheses after foreach, use an array variable, or use the result of a subroutine call (amongst other ways to get a list):

foreach ( 1, 3, 7 ) {
	print "\$_ is $_";
	}
my @numbers = ( 1, 3, 7 );
foreach ( @numbers ) {
	print "\$_ is $_";
	}
sub numbers { return ( 1, 3, 7 ) }
foreach ( numbers() ) {
	print "\$_ is $_";
	}

my %some_hash = ( one => 1, three => 3, seven => 7 );
sub hash_keys { keys %some_hash }
foreach ( hash_keys() ) {
	print "\$_ is $_";
	}

Some people like to use the synonym for. There’s a proper C-style for that has three semicolon-separated parts in the parentheses. If Perl doesn’t see the two semicolons it treats for just like a foreach:

for ( my $i = 0; $i < 5; $i++ ) {  # C style
	print "\$i is $i";
	}

for ( 0 .. 4 ) {  # foreach synonym
	print "\$_ is $_";
	}

Element source gotchas

The aliasing is only temporary. After the foreach the topic variable returns to its original value:

$_ = "Original value";
my @numbers = ( 1, 3, 7 );
print "\$_ before: $_\n";
foreach ( @numbers ) {
	print "\$_ is $_\n";
	$_ = $_ * 2;
	}
print "\$_ after: $_\n";

The output shows that $_ appears unaffected by the foreach:

$_ before: Original value
$_ is 1
$_ is 3
$_ is 7
$_ after: Original value

This is an alias instead of a copy, a shortcut that allows your program to be a little faster by not moving data around. If the list source is an array and you change the topic variable, you change the original value (otherwise the values are read-only and you’ll get an error):

my @numbers = ( 1, 3, 7 );
print "Before: @numbers\n";  # Before: 1 3 7
foreach ( @numbers ) {
	print "\$_ is $_\n";
	$_ = $_ * 2;
	}
print "After: @numbers\n";   # After: 2 6 14

Not only that, but if you change the source by adding or removing elements you can screw up the foreach. This one loops infinitely, processing the same element, because each pass through the block shifts the array elements over one position; when the iterator moves on to the next position it finds the same value it just saw:

my @numbers = ( 1, 3, 7 );
foreach $number ( @numbers ) {
	print "\$number is $number\n";
	unshift @numbers, "Added later";
	}

This output will go on forever:

$number is 1
$number is 1
$number is 1
$number is 1
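
If you do need to add or remove elements while processing, one safe approach is to iterate over a copy of the array. This sketch (an illustrative addition, not from the original article) modifies the original while the loop’s list stays untouched, so the loop still ends:

my @numbers = ( 1, 3, 7 );
my @copy    = @numbers;
foreach my $number ( @copy ) {
	print "\$number is $number\n";
	unshift @numbers, "Added later";  # @copy is unaffected
	}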

Naming your own topic variable

The $_ is often handy because it’s the default variable for several Perl functions, such as chomp or split. You can use your own name by specifying a scalar variable between the foreach and the parentheses. Usually you don’t want to use that variable for something other than the loop so the usual style declares it inline with the foreach:

foreach my $number ( 1, 3, 7 ) {
	print "\$number is $number";
	}

Since Perl flattens lists into one big list, you can use more than one list source in the parentheses:

my @numbers      = ( 1, 3, 7 );
my @more_numbers = ( 5, 8, 13 );
foreach my $number ( @numbers, @more_numbers ) {
	print "\$number is $number";
	}

Or a mix of source types:

my @numbers = ( 1, 3, 7 );
foreach my $number ( @numbers, numbers(), keys %hash ) {
	print "\$number is $number";  # numbers() and %hash are defined elsewhere
	}

Using your own named topic variable acts just like what you saw with $_:

my @numbers      = ( 1, 3, 7 );

my $number = 'Original value';
say "Before: $number";
foreach $number ( @numbers ) {
	say "\$number is $number";
	}
say "After: $number";

The output shows the aliasing effect and that the original value is restored after the foreach:

Before: Original value
$number is 1
$number is 3
$number is 7
After: Original value
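
The aliasing works through a named topic variable too. This short sketch (my addition, mirroring the earlier $_ example) doubles the array through $number:

my @numbers = ( 1, 3, 7 );
foreach my $number ( @numbers ) {
	$number = $number * 2;
	}
print "After: @numbers\n";   # After: 2 6 14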

Controlling

There are three keywords that let you control the operation of the foreach (and other looping structures): last, next, and redo.

The last stops the current iteration. It’s as if you immediately jumped past the last statement in the block and then broke out of the loop. It does not look at the next item. You often use this with a postfix conditional:

foreach $number ( 0 .. 5 ) {
	say "Starting $number";
	last if $number > 2;
	say "\$number is $number";
	say "Ending $number";
	}
say 'Past the loop';

You start the block for element 3 but end the loop there and continue the program after the loop:

Starting 0
$number is 0
Ending 0
Starting 1
$number is 1
Ending 1
Starting 2
$number is 2
Ending 2
Starting 3
Past the loop

The next stops the current iteration and moves on to the next one. This makes it easy to skip elements that you don’t want to process:

foreach my $number ( 0 .. 5 ) {
	say "Starting $number";
	next if $number % 2;
	say "\$number is $number";
	say "Ending $number";
	}

The output shows that you run the block with each element but only the even numbers make it past the next:

Starting 0
$number is 0
Ending 0
Starting 1
Starting 2
$number is 2
Ending 2
Starting 3
Starting 4
$number is 4
Ending 4
Starting 5

The redo restarts the current iteration of a block. You can use it with a foreach although it’s more commonly used with looping structures that aren’t meant to go through a list of items.

Here’s an example where you want to get three “good” lines of input. You iterate through the number of lines that you want and read standard input each time. If you get a blank line, you restart the same iteration with redo:

my $lines_needed = 3;
my @lines;
foreach my $line_number ( 1 .. $lines_needed ) {
	say "Reading line $line_number";
	chomp( my $line = <STDIN> );
	redo if $line =~ /\A \s* \z/x;  # skip "blank" lines
	push @lines, $line;
	}

say "Lines are:\n\t", join "\n\t", @lines;

The output shows that the loop effectively ignores the blank lines and goes back to the top of the loop, but it does not move on to the next item in the list. After getting a blank line when it tries to read the second line, it tries the second line again:

Reading line 1
First line
Reading line 2

Reading line 2

Reading line 2
Second line
Reading line 3

Reading line 3

Reading line 3
Third line
Lines are:
    First line
    Second line
    Third line

That’s not very Perly, but this is an article about foreach. A better style might be to read lines with while until @lines is large enough:

my $lines_needed = 3;
my @lines;
while( <STDIN> ) {
	next if /\A \s* \z/x;
	chomp;
	push @lines, $_;
	last if @lines == $lines_needed;
	}
say "Lines are:\n\t", join "\n\t", @lines;

There’s more that you can do with these. They work with labels and nested loops, as in the sketch below. You can read more about them in perlsyn or Learning Perl.
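
For instance, a label lets next or last apply to an outer loop from inside a nested one. This is a minimal sketch of my own, not from the original article:

OUTER: foreach my $i ( 1 .. 3 ) {
	foreach my $j ( 1 .. 3 ) {
		next OUTER if $j > $i;  # jump to the next $i, skipping the rest of the inner loop
		print "$i $j\n";
		}
	}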

A common file-reading gotcha

Since foreach goes through each element of a list, some people reach for it when they want to go through each line in a file:

foreach my $line ( <STDIN> ) { ... }

This is usually not a good idea. The foreach needs to have the entire list all at once; it isn’t a lazy construct like you’d see in some other languages. This means that the foreach reads in all of standard input before it does anything. And, if standard input doesn’t close, the program appears to hang. Or worse, it tries to completely read terabytes of data from that filehandle. Memory is cheap, but not that cheap.

A suitable replacement is the while idiom that reads and processes one line at a time:

while( <STDIN> ) { ... }

This is really a shortcut for an assignment in scalar context. That reads only one line from the filehandle:

while( defined( $_ = <STDIN> ) ) { ... }
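
If you’d rather avoid $_ entirely, the same idiom works with a lexical variable; Perl adds the defined check for you when the condition is just this assignment. A small variation of my own:

while( my $line = <STDIN> ) {
	chomp $line;
	print "Read: $line\n";
	}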

An experimental convenience

Perl v5.22 added an experimental refaliasing feature. Assigning to a reference makes the thing on the left an alias for the thing on the right. Here’s a small demonstration where you assign an anonymous hash to a reference to a named hash variable. Now %h is another name (an alias) for that anonymous hash:

use v5.22;
use feature qw(refaliasing);
use Data::Dumper;

\my %h = { qw(a 1 b 2) };
say Dumper( \%h );

This is handy in a foreach where the elements of the list are hash references. First, here’s how you might do this without the feature. Inside the block you interact with $hash as a reference; you must dereference it to get to a value:

my @mascots = (
	{
		type => 'camel',
		name => 'Amelia',
	},
	{
		type => 'butterfly',
		name => 'Camelia',
	},
	{
		type  => 'go',
		name  => 'Go Gopher',
	},
	{
		type  => 'python',
		name  => 'Monty',
	},
	);
foreach my $hash ( @mascots ) {
	say $hash->{'name'}
	}

With v5.22’s refaliasing feature you can use a named hash variable as the topic. Inside the block you interact with the current element as a named hash. There’s no -> for a dereference:

use v5.22;
use feature qw(refaliasing);
use Data::Dumper;

my @mascots = (
	{
		type => 'camel',
		name => 'Amelia',
	},
	{
		type => 'butterfly',
		name => 'Camelia',
	},
	{
		type  => 'go',
		name  => 'Go Gopher',
	},
	{
		type  => 'python',
		name  => 'Monty',
	},
	);

foreach \my %hash ( @mascots ) {
	say $hash{'name'}
	}

The output is the same in both programs:

Amelia
Camelia
Go Gopher
Monty
Aliasing via reference is experimental at ...

There’s a warning from this experimental feature (as with all such features). The feature might change or even disappear according to Perl’s feature policy. Disable the warning if you are comfortable with that:

no warnings qw(experimental::refaliasing);
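
Putting the pragmas together at the top of a program that uses the feature might look like this (my arrangement, assuming you accept the experimental status):

use v5.22;
use feature qw(refaliasing);
no warnings qw(experimental::refaliasing);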

Conclusion

The foreach is a handy way to go through a list an element at a time. Use it when you already have the list completely constructed (and not to process a filehandle). Define your own topic variable to choose a descriptive name.

Explosm.net: Comic for 2018.09.23

New Cyanide and Happiness Comic

Planet Haskell: ERDI Gergo: Back in the game!

For most of this week, it seemed I would have to throw in the towel. As I mentioned in my previous entry last Saturday, I ran into what at first seemed like a CλaSH bug. However, further investigation showed that the error message was actually pointing at an internal bug in the Xilinx ISE synthesizer. The same generated VHDL didn't cause any problems when fed into the Yosys open source synthesizer, Altera Quartus, or the newer Xilinx Vivado. But the Papilio Pro I'm using is based on the Spartan 6 FPGA, which is not supported by the newer Xilinx tools, so I am stuck with ISE 14.7 from 2013. So the conclusion is, just like all other closed-source proprietary software from FPGA vendors, the Xilinx ISE is simply a piece of shit that falls over under its own weight on perfectly valid VHDL.

I was thinking of ordering a new FPGA board, but I only have until next Wednesday to finish this (I won't be able to put in any work on the last Retrochallenge weekend), so it's not even a given it would get here in time. Also, I'd like to do a bit more research on what board I should get -- on one hand, both Altera and Xilinx have nice, more modern dev boards with good IO ports for my retro-computing-oriented needs, but on the other hand, it feels a bit like throwing good money after bad, since these would still be programmed with proprietary shitty software, with no way forward when (not if!) they break.

Then there's Lattice's ICE40 line, which is fully supported by the open source toolchain IceStorm, but the largest ICE40 is still quite small compared to the Spartan 7 or the Cyclone V series; not to mention that even the nicest ICE40 board I could find doesn't have a USB connector on board, so you have to play around with an Arduino and plug jumper wires into this weird connector to get anything working. Also, while I'm ranting, of course the Lattice ICE40 open source toolchain is not from Lattice themselves; instead, its bitstream format had to be reverse-engineered by awesome free software hackers.

So anyway, I had a perfectly functioning board betrayed by its software toolchain. I tried some desperate ideas like generating Verilog instead of VHDL or getting rid of the unguarded block statements, but nothing made any difference. Then on Thursday night I had an even wilder idea: if the Xilinx ISE is crashing because the generated VHDL triggers some weird corner case in the synthesizer, then maybe using the same ISE version but changing the target FPGA model would get over the hurdle? And that's when I remembered I still have my first ever FPGA board: the Papilio One, based on the Spartan 3E. Luckily, the Spartan 3 series is also supported by the 14-series ISE, so the same toolchain can serve both boards.

On Friday morning, I did the necessary changes to my code to target the Papilio One. The clock generator is different between the models, so I needed to replace that; the other difference was that the Spartan 3 doesn't seem to have wide enough blocks for 64-bit arithmetic. This shouldn't be a problem for the CHIP-8, but CλaSH generates code that converts everything into 64 bits. I initially overcame that by post-processing CλaSH's output with sed, but then I discovered that there is a flag -fclash-intwidth to set that properly.

With these changes, I was able to get it through the Xilinx ISE's synthesizer, and all the way through the rest of the pipeline! As before, the code is on GitHub.

[Embedded video: http://www.youtube.com/v/eqpMFACw-B4]

And with this, I am where I was supposed to be a week ago at half-time. I probably won't have time to work on this project next weekend since we'll be travelling; this looks like a good time to take inventory of the project.

  • I am very happy with how the video and keyboard peripheral interfaces turned out; the CλaSH code is nice and clean.
  • I still need to write a blog post about the design I came up with for implementing the CPU. I'm convinced it should scale to more complex processors; but of course the proof of that pudding will be in implementing a real retro-CPU like a 6502 or a Z80.
  • The font ROM is not hooked up yet; I plan to finish that tomorrow.
  • The plan was to get it working first, and make it more performant later. I'm afraid I won't have time before the RetroChallenge finishes to implement improvements. The biggest deficiency here is that the framebuffer is accessed one bit at a time, so clearing the screen (a single opcode in CHIP-8!) takes 64⨯32 = 2048 cycles!
  • The signal-less high-level simulation makes debugging the CPU implementation very convenient and fast. Being able to run the CHIP-8 machine in real-time, with interactive graphics and input, is immensely satisfying.
  • I prefer the code I was able to write in CλaSH, when compared to Kansas Lava. Of course, I don't know how much of that is simply because this is the second time I was implementing these circuits.
  • There's a lot to be improved in CλaSH itself. Significant improvements to compilation time are already coming, which will be welcome given that even this tiny CHIP-8 circuit is now taking about 5 minutes to compile; this issue is exacerbated by the fact that compilation cannot use multiple cores. The other problem is that CλaSH generates very inefficient HDL, putting big pressure on the vendor-specific synthesis tools. As we've seen, this pressure can be too big for crappy software like the old version of the Xilinx ISE I am forced to use with my FPGA.
  • Writing these posts takes a lot of time! I love reading the posts of other RetroChallenge entrants, but there is no way I could have written more frequent updates. I wouldn't have had time to do the actual work then!

ScreenAnarchy: Fantastic Fest 2018 Review: MADAM YANKELOVA'S FINE LITERATURE CLUB Beguiles, Strangely

The air of a fairy tale blows throughout, swirling hither and thither, as Sophie and Hannah search for a suitable man. It is not just any man they seek, however. He must be good-looking, someone their peers will look upon favorably. He must be unattached. And they are looking for a man of a certain size; the bigger his ... head, the better. The opening moments of Madam Yankelova's Fine Literature Club (original title: HaMoadon LeSafrut Yaffa Shel Hagveret Yanlekova) suggest a Tim Burton-style dark comedy. Reserved Sophie and her outgoing friend Hannah stand on a road in an isolated area at night, pretending to be hitchhikers so they can get a man to stop. The intent of their actions is diabolical rather than romantic...

[Read the whole post on screenanarchy.com...]

Daniel Lemire's blog: Science and Technology links (September 22nd, 2018)

  1. Apple benefits from the chip-making technology of a company called TSMC. This company has surpassed Intel in transistor density. Thus, in some sense, the microprocessors in Apple’s latest iPhone are more advanced than the microprocessors you find in brand-new PCs.
  2. Here is a provocative opinion piece: For decades, the medical community has ignored mountains of evidence to wage a cruel and futile war on fat people, poisoning public perception and ruining millions of lives.
  3. Narcolepsy might be an autoimmune condition (source: Nature).
  4. A major insurer wants to include the use of fitness trackers (like the Apple Watch) as part of its policies (source: BBC).
  5. Economics might go against our deeply held instincts (source: New Scientist).
  6. Medical researchers in universities receive grants to conduct clinical trials, but they do not register the results, contrary to what the regulations stipulate:

    (…) compliance is particularly lacking at universities conducting drug trials. These also include German universities such as Berlin (Charité), Heidelberg and Cologne (0%).

  7. The number of ruminant animals in the United States is roughly the same today as it was 200 years ago (Source: Nutrition Today).
  8. Open-source software is an international phenomenon that enables free collaboration from people all over the world. Code from Japanese and Canadian developers is more likely to be accepted into projects, whereas code from German and Brazilian developers is less likely to be accepted, with Americans in the middle.
  9. Rangel found in her thesis that, at least some of the time, people rate software code more highly when they are told it is from a female developer:

    Respondents were asked to score source code written by a fictive male or female developer, (…) participants were randomly assigned one of four code examples (…) the fictive female author was scored higher than the fictive male author. These unexpected results support the need for further understanding of the complexities of gender related to software engineering, and should not provide a foundation for complacency in regard to improving female participation in software engineering.

    (Source: Northcentral University)

  10. People who consume one diet drink a day are ‘three times more likely to suffer stroke or dementia’ (Source: The Independent)
  11. Wheat gluten intake increases weight gain according to an article in Nature. (Source: Nature)
  12. Treating fever may lead to a higher mortality rate. (Source: Surg. Infect.)
  13. Physical activity does not influence obesity risk. (Source: International Journal of Epidemiology)
  14. Despite claims by the World Health Organization (WHO) that eating processed meat causes colon cancer and red meat probably causes cancer, the observational data used to support the claims are weak, confounded by multiple unmeasured factors, and not supported by other types of research needed for such a conclusion. (Source: Animal Frontiers)
  15. We might be able to prevent cognitive decline and Alzheimer’s by clearing the brain from senescent cells, using new drugs called senolytics. (Source: Nature)

ScreenAnarchy: Fantastic Fest 2018 Review: APOSTLE, Ambitious Folk Horror That Lacks Focus But Mostly Delivers Anyway

There is a lot to like in Gareth Huw Evans' ambitious folk horror tale, Apostle. The problem might be that there is just too much of everything to really fall in love. The film takes huge swings at slightly incongruous tones and while it works to create a sense of unease throughout the running time, it does leave this viewer wondering exactly what kind of film this is meant to be. That lack of focus, an element that Evans managed so perfectly in The Raid: Redemption, often gives the audience a bit too much to chew on, however, the payoffs are big, and the choices are bold in this brutal tale of rescue, revenge, and redemption. Upon arriving at his family's home following a long,...

[Read the whole post on screenanarchy.com...]

new shelton wet/dry: THREAT TO ‘SHOOT THE PLACE UP’

Vyacheslav Molotov (1890 – 1986) was a Soviet politician and diplomat, and a leading figure in the Soviet government from the 1920s, when he rose to power as a protégé of Joseph Stalin. […] Molotov served as Minister of Foreign Affairs from 1939 to 1949 and from 1953 to 1956. […] The Winter War was [...]

things magazine: Complexity and Contradiction

Denise Scott Brown’s Wayward Eye at Betts Project. Sadly related, two obituaries for Robert Venturi / Found Sounds from the Edge of Earth and Explaining the ‘Mystery’ of Numbers Stations, both via tmn / art and music by Kitty Finer … Continue reading

Explosm.net: Comic for 2018.09.22

New Cyanide and Happiness Comic

Penny Arcade: News Post: Miniscule

Tycho: So much stuff going on today.  Yeesh!  Okay.  Let’s go. I don’t consider the PlayStation retro yet, largely because it features a ton of 3D stuff, but I’m open to the idea that the event horizon has claimed this portion of the medium’s history.  I suspect if I asked my son he would say that the PS1 is Retro because it is Old, but he’s fucking twelve.  Everything is old compared to him.   We imported Jumping Flash back in the day, so we were overjoyed to see it on the Classic - obviously FF7 is a draw, that’s gonna sell a machine…

Penny Arcade: News Post: Want To Watch Mike &amp; I Get Our New Tattoos?

Tycho:

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: Artist Spotlight: Conor Harrington

New work from Irish painter Conor Harrington (previously featured here).

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington

 

 

Conor Harrington’s Website

Conor Harrington on Instagram

ScreenAnarchy: Blu-ray Review: Criterion's ANDREI RUBLEV Is A Stacked Disc

I've noticed a trend in recent Criterion releases: the paucity of the special features. The default mode seems to have become one or two newly-recorded interviews, an archival video feature, and the in-jacket essay. This is a sign of the times. There was a time (Criterion, itself, helped invent it) when home video became as fascinated with the making of the film as by the presentation of the film itself; this trend bounced along from laserdisc to DVD to blu-ray. On the latter two formats, I suppose it was presumed that if you picked up a disc in an HMV and saw that it was $30, you'd immediately flip it over to see what else you were getting for your money, since thirty bucks tends...

[Read the whole post on screenanarchy.com...]

Colossal: Community: Over 500,000 Preserved and Local Flowers Suspended in the Toledo Museum of Art

Floral artist Rebecca Louise Law (previously) travels widely to install her beloved cascading flower showers around the world. Most recently, the UK-based artist worked with residents of Toledo, Ohio to install Community, her largest work to date. The exhibition incorporates over 500,000 flowers, installed with substantial help from local volunteers. Community is comprised of dried flowers preserved from previous exhibitions as well as over 150,000 locally sourced native plants. The exhibit is on view at the Toledo Art Museum through January 13, 2019. You can see a time-lapse of the installation in the video below, and explore more of Law’s work on Instagram and Facebook.

BOOOOOOOM! – CREATE * INSPIRE * COMMUNITY * ART * DESIGN * MUSIC * FILM * PHOTO * PROJECTS: Booooooom TV Best of the Week: Music Videos, Short Films & Animation

Colossal: Digitally Altered Portraits Superimposed with Flowers, Antique Patterns, and Wildlife Illustrations by Tawny Chatmon

Maryland-based artist Tawny Chatmon combines traditional portraiture with digital collage, layering elements of antique patterns, vintage botanicals, and wildlife illustrations onto images of her children and other relatives. Once printed, Chatmon often revisits the digital textures she has superimposed, physically adding layers of gold ornamental elements or paint.

“My camera remains my primary tool of communication, while my constant exploration of diverse ways of expression moves me to add several different layers using a variety of mediums,” explains Chatmon to Colossal, “After a portrait session is complete, I typically digitally manipulate my portraits and unite them with other photographic components to achieve a work that is a new expression—often lending to them the eyes of someone their elder and more wise, and almost always exaggerating their hair.”

Her children not only serve as her models, but also her greatest source of inspiration while making work. Chatmon further explains that the layered portraits are driven by her “desire to contribute something important to a world I want my children to thrive in.” The artist’s work will be on display as part of The Art of Blackness Exhibition in Chicago, which opens at Block 37 on October 12, 2018. You can see more of her work on her website and Instagram. (via Beautiful Bizarre Magazine)

Open Culture: The Hieronymus Bosch Demon Bird Was Spotted Riding the New York City Subway the Other Day…

To me, the great promise of homeschooling is that one day your child might, on their own initiative, ride the New York City subways dressed in a homemade, needlefelted costume modeled on the ice-skating bird messenger from Hieronymus Bosch’s The Temptation of St. Anthony.

Rae Stimson, aka Rae Swon, a Brooklyn-based artist who did just that a little over a week ago, describes her upbringing thusly:

Growing up I was home schooled in the countryside by my mom who is a sculptor and my dad who is an oil painter, carpenter, and many other things. Most of my days were spent drawing and observing nature rather than doing normal school work. Learning traditional art techniques had always been very important to me so that I can play a role in keeping these beautiful methods alive during this contemporary trend of digital, nonrepresentational, and conceptual art. I make traditional artwork in a wide variety of mediums, including woodcarving, oil painting, etching, needle felting, and alternative process photography.

Not every homeschooler, or, for that matter, Waldorf student, is into needle felting. It only seems that way when you compare the numbers to their counterparts in more traditional school settings…

Even the tiniest creature produced by this method is a labor intensive proposition, wherein loose woolen fibers are soaked, soaped, and jabbed with a needle until they come together in a rough mat, suitable for shaping into the whimsical—or demonic—figure of its creator’s choosing.

Stimson matched her full-head bird mask to the one in the painting by equipping it with gloves, a blanket cloak, long velvet ears, and a leafless twig emerging from the spout of its hand-painted funnel hat.

An accomplished milliner, Stimson was drawn to her subject’s unusual headgear, telling HuffPo’s Priscilla Frank how she wished she could ask Bosch about the various elements of his “beautiful demon-bird” and “what, if any, symbolic significance they hold.”

The answer lies in art history writer Stanley Meisler’s Smithsonian magazine article, "The World of Bosch":

…a monster on ice skates approaches three fiends who are hiding under a bridge across which pious men are helping an unconscious Saint Anthony. The monster, wearing a badge that Bax says can be recognized as the emblem of a messenger, bears a letter that is supposedly a protest of Saint Anthony's treatment. But the letter, according to (Bosch scholar and author Dirk) Bax, is in mirror writing, a sure sign that the monster and the fiends are mocking the saint. The monster wears a funnel that symbolizes intemperance and wastefulness, sports a dry twig and a ball that signify licentious merrymaking, and has lopping ears that show its foolishness. All this might have been obvious to the artist's contemporaries when the work was created, but the average modern viewer can only hope to understand the overall intent of a Bosch painting, while regarding the scores of bizarre monsters and demons as a kind of dark and cruel comic relief.

A field guide to Bosch’s bizarre images in the same article gives viewers leave to interpret any and all funnels in his work as a coded reference to deceit and intemperance... perhaps at the hands of a false doctor or alchemist!

Not every subway rider caught the arty reference. Unsurprisingly, some even refused to acknowledge the strange being in their midst. Those folks must not share Stimson’s dedication to examining “that which is unfamiliar, seeking out all that is yet unknown to you in both art and life.”

Within 24 hours of its Metropolitan Transit Authority adventure, the one-of-a-kind demon-bird costume was sold on Etsy.

(Holler if you wish Stimson had kept it around long enough to take a spin on the ice at Rockefeller Center or Bryant Park, where the majority of patrons would no doubt be gliding around in ignorance that, as per Meisler, Bosch equated skates with folly.)

See more of Rae Stimson’s needle-felted creations, including a full-body alien robot costume and a sculpture of author Joyce Carol Oates with her pet chicken in her Etsy shop.

via Hyperallergic

Related Content:

Figures from Hieronymus Bosch’s “The Garden of Earthly Delights” Come to Life as Fine Art Piñatas

Hieronymus Bosch Figurines: Collect Surreal Characters from Bosch’s Paintings & Put Them on Your Bookshelf

Take a Virtual Tour of Hieronymus Bosch’s Bewildering Masterpiece The Garden of Earthly Delights

Ayun Halliday is a New York City-based homeschooler, author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her at The Tank NYC on Monday, September 24 for another monthly installment of her book-based variety show, Necromancers of the Public Domain. Follow her @AyunHalliday.

The Hieronymus Bosch Demon Bird Was Spotted Riding the New York City Subway the Other Day… is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Penny Arcade: Comic: Miniscule

New Comic: Miniscule

Colossal: Glass Insects Small Enough to Balance on the Tip of Your Finger by Wesley Fleming

Glass sculptor Wesley Fleming creates life-size and anatomically correct sculptures of a variety of bizarre and well-known insects. The colorful creatures are small enough to balance gently on the tip of his finger, like a neon orange spider barely larger than his nail. The artist began working with the medium more than 15 years ago at the MIT Glass Lab and has pushed his technique ever since, learning flameworking, sculpting with borosilicate, and the Italian technique of sculpting soft glass on the Venetian island of Murano in 2005. You can see more of his work with insects and other creatures on his website and Instagram, and view glass sculptures for sale on Etsy. (thnx, Diana)

Open Culture: When Led Zeppelin Reunited and Crashed and Burned at Live Aid (1985)

I’ve tended to avoid reunion shows from my favorite bands of old, and I’ve missed some great performances because of it, I’m told, and also a few clunkers and forgettable nostalgia trips. But sometimes it really doesn’t matter how good or bad the band is ten or twenty years past their prime—or that one or more of their original members has left their mortal coil or shuffled off into retirement. It’s such a thrill for fans to see their heroes that they’ll overlook, or fail to notice, serious onstage problems.

The crowd of thousands at Philly’s JFK Stadium exploded  after “Rock and Roll,” Led Zeppelin’s opener to their 1985 Live Aid reunion gig (above), with Phil Collins and Chic’s Tony Thompson doubling on drum duties (because it takes two great drummers to equal one John Bonham, I guess). But according to the musicians themselves, the show was an absolute fail—so much so that Collins nearly walked offstage in the middle of the 20-minute set. “It was a disaster really,” he said in a 2014 interview, “It wasn’t my fault it was crap.”

Collins expands on the problems in his candid autobiography:

I know the wheels are falling off from early on in the set. I can’t hear Robert clearly from where I’m sat, but I can hear enough to know that he’s not on top of his game. Ditto Jimmy. I don’t remember playing 'Rock and Roll,' but obviously I did. But I do remember an awful lot of time where I can hear what Robert decries as ‘knitting’: fancy drumming…. you can see me miming, playing the air, getting out of the way lest there be a train wreck. If I’d known it was to be a two-drummer band, I would have removed myself from proceedings long before I got anywhere near Philadelphia.

As for the Zeppelin members proper, Plant and Page had no fond memories of the gig. “It was horrendous,” said Plant in 1988. “Emotionally, I was eating every word that I had uttered. And I was hoarse. I’d done three gigs on the trot before I got to Live Aid.” Page, writes Rolling Stone, “was handed a guitar right before walking onstage that was out of tune.” “My main memories,” he later recalled, “were of total panic.” Apparently, no one thought to ask John Paul Jones about the show.

Barely rehearsed (Jones arrived “virtually the same day as the show”) and with failing monitors ensuring the band could hardly hear themselves, they struggled through “Rock and Roll,” “Whole Lotta Love,” and “Stairway to Heaven.” The footage, which the band scrapped from the 2004 DVD release, doesn’t show them at their best, for sure, but it’s maybe not quite as bad as they remembered it either (see the full concert above).

In any case, Plant was so inspired that he tried to reunite the band, with Thompson back on drums, in secret rehearsals a few months later. The attempt was “embarrassing,” he’s since said. “We did about two days…. Jonesy played keyboards, I played bass. It sounded like David Byrne meets Hüsker Dü.” Now that is a reunion I’d pay good money to see.

22 years later, at London's O2 Arena, the band was confident and totally on top of their game once again for the Ahmet Ertegun Tribute Concert, with Jason Bonham behind the kit. Probably their last performance ever, and it's damned good. See "Black Dog" above and buy the full concert film here.

The clip below lets you see more than 90 minutes of Led Zeppelin reunion concerts. Beyond their Live Aid show, it includes performances at Atlantic Records' 40th anniversary (1988) and at the Rock & Roll Hall of Fame (1995).

Related Content:

Hear Led Zeppelin’s First Recorded Concert Ever (1968)

What Makes John Bonham Such a Good Drummer? A New Video Essay Breaks Down His Inimitable Style

Led Zeppelin Plays One of Its Earliest Concerts (Danish TV, 1969)

Jimmy Page Describes the Creation of Led Zeppelin’s “Whole Lotta Love”

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

When Led Zeppelin Reunited and Crashed and Burned at Live Aid (1985) is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Michael Geist: The Internet is not an ATM: My Appearance at the Senate Transport and Communications Committee on Broadcast and Telecom Reform

Earlier this week, I appeared before the Senate Standing Committee on Transportation and Communications alongside Carleton professor Dwayne Winseck to discuss broadcast and telecom reform. The Senate study, which largely mirrors the government’s broadcast and telecommunications reform panel, is expected to run into 2019 with a broad mandate that covers everything from affordable access to net neutrality. The discussion was similarly wide ranging with discussion on the failings of the CRTC, the lack of telecom competition, and on the need for real data in assessing the impact of the Internet on the cultural sector.

My opening statement focused on the danger of treating the Internet as equivalent to the broadcast system, the realities of how the Canadian cultural sector is succeeding online, and how policy makers ought to respond to the changing landscape for communications in Canada. It is posted below.

Appearance before the Senate Standing Committee on Transport and Communications, September 18, 2018

Good morning. My name is Michael Geist.  I am a law professor at the University of Ottawa, where I hold the Canada Research Chair in Internet and E-commerce Law, and I am a member of the Centre for Law, Technology, and Society. My areas of speciality include digital policy, intellectual property, privacy and the Internet. I appear in a personal capacity representing only my own views.

This committee’s study places the spotlight on an exceptionally important question: as the Internet increasingly serves as the foundation for telecommunications, broadcasting, commerce, and culture, what reforms are needed to the current communications laws and regulations? As you no doubt know, this issue is also at the heart of a current broadcast-telecom review commissioned by ISED and Canadian Heritage. I think both efforts will be valuable and I hope that there are ways to ensure synergies between them.

My opening remarks will focus on three issues: the danger of treating the Internet as equivalent to the broadcast system, the realities of how the Canadian cultural sector is succeeding online, and how policy makers ought to respond to the changing landscape for communications in Canada.

1.    The Danger of Treating the Internet as Equivalent to the Broadcast System

With the remarkable popularity of services such as Netflix and YouTube, there is a widely held view that the internet has largely replaced the conventional broadcast system. Industry data suggests the business of broadcasters and broadcast distributors such as cable and satellite companies won’t end anytime soon, but it is undeniable that a growing number of Canadians access broadcast content through the internet. Yet while it may be true that the broadcasting system is (or will soon be) the internet, the internet is not the broadcasting system. Indeed, any decision to treat the internet as indistinguishable from broadcast for regulatory purposes would send us down a deeply troubling path that is likely to result in less competition, increased consumer costs, and dubious regulation.

For example, the CRTC recently issued a report maintaining that internet access is “almost wholly driven by demand for audio and video content.” However, its own data contradicted that conclusion since it also noted that 75 per cent of wireless internet traffic is not audio or video. The reality is that internet use is about far more than streaming videos or listening to music. Those are obviously popular activities, but numerous studies point to the fact that they are not nearly as popular as communicating through messaging and social networks, electronic commerce, internet banking, or searching for news, weather, and other information.

Why is this important?

There are several significant problems with viewing the internet through the prism of a broadcasting system. Most notably, the approach leads to the view that if (a) we regulate broadcast and (b) broadcast is now the internet, then (c) we must now regulate the internet. However, given that the internet is much more than just broadcast, such efforts would by definition regulate far more than the broadcasting sector. This is not to say that there should be no Internet related regulation.  Of course there should. However, targeted regulation is not the same as regulating the Internet as if it were the broadcast system.

2.    The realities of how the Canadian cultural sector is succeeding online

Some of the impetus for communications law reform in Canada stems from concerns that existing regulations are failing to adequately support the Canadian cultural sector and that the Internet places its future at risk. Yet the data points to a very different reality, namely that much of the sector is experiencing unprecedented growth in the Internet era without the need for a regulatory overhaul.

For example, the days of worrying whether consumers would pay for music are largely over with the Canadian music market growing much faster than the world average, streaming revenues more than doubling in 2017, the Canadian digital share of revenues of 63 per cent exceeding the global average of 50 per cent, and Canada leaping past Australia to become the 6th largest music market in the world.

In fact, since the 2012 copyright reforms, music collective SOCAN’s Internet streaming revenues have grown more than tenfold, reaching nearly $50 million last year. By comparison, in 2013, Internet streaming revenues were just over $3 million.

The success story is particularly notable with respect to film and television production in Canada. According to the latest data from the Canadian Media Producers Association, the total value of the Canadian film and television sector exceeded $8 billion last year, over a billion more than has been recorded in any year over the past decade. In fact, last year everything increased: Canadian television, Canadian feature film, foreign location and service production, and broadcaster in-house production.

Canadian content production hit an all-time high last year at $3.3 billion, rising by 16.1%. Notably, the increased expenditures do not come from broadcasters. In fact, the private broadcasters now contribute only 11% of the total financing for English-language television production. Their contribution is nearly half of what it was just three years ago in an industry that is growing. Yet despite the private broadcaster decline, money is pouring into the sector from distributors, who see the benefits of global markets, and from foreign financing, which has grown by almost $200 million in the past four years. It should be noted, however, that the sector remains heavily supported by the public, with federal and provincial tax credits now accounting for almost 30% of financing.

The increase in foreign investment in production in Canada is staggering. When Netflix began investing in original content in 2013, total foreign investment, including foreign location and service production, Canadian theatrical, and Canadian television, was $2.2 billion. That number has more than doubled in the last five years, now standing at nearly $4.7 billion. While much of that stems from foreign location and service production that supports thousands of jobs, foreign investment in Canadian television production has also almost doubled in the last five years. In sum, the data confirms that there has never been more money invested in film and television production in Canada and, far from representing a threat, the digital environment has provided new opportunities for Canadians to thrive.

3.    What Next for Broadcasting and Telecommunications Legislation?

Given the risks of treating the Internet as the broadcasting system and the success of the cultural sector in Canada, what next for broadcasting and telecommunications legislation?

I’ll quickly point to five issues to consider. First, ensure affordable access for all. As the committee works through its study, it must keep in mind that all of these benefits of the Internet depend on all Canadians having affordable access. The imposition of new taxes or fees for Internet access will invariably mean that Canadians will pay more for those services. With a quarter of low income Canadians still without access – often due to affordability concerns – and Canada with some of the highest wireless prices in the world, imposing new costs would risk increasing the digital divide.

Second, maintain a level playing field through strong net neutrality rules. Existing rules have been interpreted to include net neutrality, but we would still benefit from an unequivocal legislative direction to support and enforce net neutrality. Some commentators have raised the possibility that Canadian cultural policy might benefit from zero rating Canadian content. In other words, rather than rely on net neutrality rules to ensure that Canadian content benefits from a level playing field, perhaps it would be even better to tilt the rules in favour of Cancon by mandating that domestic content not count against monthly data caps. Canadian content can compete with the best in the world and our regulatory rules should ensure a level playing field to allow it to compete fairly with content from around the world.

Third, Canadian broadcasting and telecommunications law must keep pace with the changing digital environment. Rules that grant the CRTC the power to determine which channels may operate in Canada should be repealed. Instead, the Commission should concentrate on consumer protection and marketplace competition. The consumer protection issues include regulations maintaining maximum consumer choice through pick-and-pay models, truth in advertising on communications services, and tough action against deceptive practices.

Fourth, we should reject new fees or taxes on Internet access and services. An Internet or ISP tax is largely premised on the argument that ISPs and Internet companies owe their revenues to the cultural content accessed by subscribers and they should therefore be required to contribute to the system much like broadcasters and broadcast distributors. As previously discussed, however, Internet use is about far more than streaming videos or listening to music. Governments can (and do) support the creation of Canadian content through grants, tax credits, and other subsidies, but foisting support on a monthly internet or wireless bill stretches the definition of the conventional broadcast system beyond recognition.

Fifth, we should reject calls for website or content blocking. Recent proposals along those lines to the CRTC have been disproportionate, harmful, inconsistent with international standards, and in violation of Canadian norms. Indeed, website blocking would bring major costs and negative implications for freedom of expression, net neutrality, affordable and competitive consumer Internet access, and the balanced enforcement of intellectual property rights.

I look forward to your questions.

The post The Internet is not an ATM: My Appearance at the Senate Transport and Communications Committee on Broadcast and Telecom Reform appeared first on Michael Geist.

explodingdog: Photo



things magazine: The Atlas of Remote Islands revisited, post 1 of 4

Judith Schalansky’s Atlas of Remote Islands, ‘Fifty Islands I Have Never Set Foot On and Never Will’, remains a bit of an obsession. We decided to use it to do some armchair traveling, courtesy of the book’s expansive Wikipedia page … Continue reading

Tea Masters: Drinking alone under the moon

The moon festival has links to the Shang dynasty over 3000 years ago, at a time when people worshiped celestial objects. Later, during the Tang dynasty, poets found their night-time inspiration in the moon. One of the most famous Tang poems is 月下独酌, "Drinking alone under the moon", by Li Bai (701-762). And despite its politically incorrect content (drinking wine alone), it is still taught to children in Taiwan and China: my children learned it in kindergarten!

Last month, I had a similar experience drinking tea at nightfall near Lishan. The moon was rising from behind the mountain and I got to enjoy my tea as the sunlight dimmed.
Aged Hung Shui Oolong felt like the closest substitute to wine and an excellent fit for the cooler mountain air! The tall ivory flower cups helped keep the brew hot and give it a more intense hue. The Yixing zhuni teapot also helped to brew at a higher temperature (than with porcelain) and obtain a warmer cup. Not using a gongdao bei (pitcher) serves the same purpose: the tea cools during each transfer from one vessel to another, so pouring directly into the cups means a hotter tea.
While it's kind of pathetic to get wasted (with alcohol) alone if you're not a poet, it feels peaceful and pure to drink tea alone under the moon! And it got even nicer when my wife finally joined me!
Happy autumn full moon festival!

Open Culture: William Shatner Is Releasing a Christmas Album with Iggy Pop & Henry Rollins : Get a First Listen to “Jingle Bells”

You know what they say: each year the Christmas season seems to start a little earlier. Here it's not yet October, and already we're hearing "Jingle Bells" — but then, this version doesn't sound quite like any we've heard before. The song comes as the opening number on Shatner Claus: The Christmas Album, which promises exactly what it sounds like it does. Officially dropping on October 26th, it will contain, according to Consequence of Sound, William Shatner's "unique take on 13 holiday staples," and feature guest contributors like Iggy Pop on "Silent Night," ZZ Top's Billy Gibbons on “Rudolph the Red-Nosed Reindeer,” and former Black Flag frontman and all-around provocateur Henry Rollins on "Jingle Bells," a collaboration you can stream just above.

You may not describe Shatner's distinctive half-singing-half-speaking style as possessed of a great "range," technically speaking, but who can doubt the formidable cultural range of his musical career? On his debut album The Transformed Man fifty years ago he covered the Beatles, ten years later he took on "Rocket Man," and more recently he appeared on Dr. Demento's punk album singing The Cramps' "Garbage Man" with Weird Al Yankovic.

Shatner Claus demonstrates that the former Captain Kirk's interest in punk rock hasn't dissipated, and the pairing of him with no less an icon of that genre than Rollins makes a certain kind of sense, seeing as both of them have spent decades blurring the performative line between singing and the spoken word, each in his own distinctive way.

Perhaps it comes as no surprise, then, that Shatner and Rollins are friends, and have been since they first recorded together on Shatner's album Has Been in 2004. Rollins once described Shatner to rock site Blabbermouth as "extraordinarily friendly, a very energized guy" despite being three decades the  middle-aged Rollins' senior. "He impresses me in that he's a guy who's really figured out what he likes," especially football: "I've been to the Shatner house many times for dinner, for Super Bowl Sunday, for football games. I don't watch football, but I like his friends. I'm a shy person. I don't really go out of my way to hang out but I like him and his wife... and I like all the food he lays out." The vast game-day spreads at chez Shatner have also given Rollins stories to tell at his spoken-word shows, and listening to Shatner Claus, you have to wonder: what must they have for Christmas dinner?

via Consequence of Sound

Related Content:

Dr. Demento’s New Punk Album Features William Shatner Singing The Cramps, Weird Al Yankovic Singing The Ramones & Much More

A Cult Classic: William Shatner Sings Elton John’s “Rocket Man” at 1978 SciFi Awards Show

William Shatner Sings Nearly Blasphemous Version of “Lucy in the Sky with Diamonds” (1968)

Stream a Playlist of 68 Punk Rock Christmas Songs: The Ramones, The Damned, Bad Religion & More

Hear the 20 Favorite Punk Albums of Black Flag Frontman Henry Rollins

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

William Shatner Is Releasing a Christmas Album with Iggy Pop & Henry Rollins : Get a First Listen to “Jingle Bells” is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Planet Haskell: Holden Karau: Bringing the Jewels of the Python world (including Spacy) to Scala with Spark @ Lambda World Seattle

Thanks for joining me on 2018-09-18 at Lambda World Seattle 2018 in Seattle, WA, USA for Bringing the Jewels of the Python world (including Spacy) to Scala with Spark. The talk covered:

With the new Apache Arrow integration in PySpark 2.3, it is now starting to become reasonable to look to the Python world and ask “what else do we want to steal besides tensorflow?”, or as a Python developer look and say “how can I get my code into production without it being rewritten into a mess of Java?”. Regardless of your specific side(s) in the JVM/Python divide, collaboration is getting a lot faster, so let’s learn how to share! In this brief talk we will examine sharing some of the wonders of Spacy with the JVM world, which still has a somewhat lackluster set of options for NLP & deep learning.

I'll update this post with the slides soon. Comment below to join in the discussion :). Talk feedback is appreciated at http://bit.ly/holdenTalkFeedback

Explosm.net: Comic for 2018.09.21

New Cyanide and Happiness Comic

Planet Haskell: FP Complete: Deploying Postgres based Yesod web application to Kubernetes using Helm

Yesod is one of the most popular web frameworks in the Haskell land. This post will explore creating a sample Postgres based Yesod web application and then deploying it to Kubernetes cluster. We will create a Helm chart for doing our kubernetes release. Note that the entire source lives here:

Colossal: ReActor: a Tilting House That Shifts and Spins Based on its Inhabitants’ Movements

Photography: Richard Barnes & Dora Somosi

In the rolling hills of upstate New York at the outdoor sculpture park Art Omi, artist duo Alex Schweder and Ward Shelley (previously) created a fully functional house with a special slant. The project, called ReActor, is a 42 by 8-foot rotating home that balances on a single 14-foot tall concrete column. Movements inside the dwelling, as well as outside forces like gusts of wind, cause the structure to gently tilt and rotate. In the summer of 2016, Schweder and Shelley inhabited the home for five days, and their movements toward or away from the house’s fulcrum caused constant motion. Because the home is constructed with Philip Johnson-esque levels of floor-to-ceiling windows, the artists’ interior activities were visible to Omi attendees.

Schweder and Shelley have collaborated since 2007, focusing on “performance architecture,” a practice of designing, building, and living in structures for the purpose of public observation and dialogue.  Though the artists are currently residing in (presumably) more stable housing, the tilting house remained on view at Omi until August 2018. (via Yellowtrace)

Disquiet: Disquiet Junto Project 0351: Selected Insomniac Works Volume II

Each Thursday in the Disquiet Junto group, a new compositional challenge is set before the group’s members, who then have just over four days to upload a track in response to the assignment. Membership in the Junto is open: just join and participate. (A SoundCloud account is helpful but not required.) There’s no pressure to do every project. It’s weekly so that you know it’s there, every Thursday through Monday, when you have the time.

Deadline: This project’s deadline is Monday, September 24, 2018, at 11:59pm (that is, just before midnight) wherever you are on planet Earth. It was posted just before noon, California time, on Thursday, September 20, 2018.

Tracks will be added to the playlist for the duration of the project.

These are the instructions that went out to the group’s email list (at tinyletter.com/disquiet-junto):

Disquiet Junto Project 0351: Selected Insomniac Works Volume II
The Assignment: Rework some very quiet music by making it even more sedate.

Step 1: Last week, about 60 members of the Disquiet Junto recorded ambient music for the middle of the night. The specific request was to “Make very quiet music for very late at night for very fragile psyches.” This week, we’ll each select a track from last week and proceed to dial it down even further.

Step 2: Listen through the tracks from last week’s project, and choose the one whose ambience you want to employ in your track:

https://soundcloud.com/disquiet/sets/disquiet-junto-project-0350

In addition, there may be some other tracks from the project in the Lines discussions, here:

https://llllllll.co/t/disquiet-junto-project-0350-selected-insomniac-works/

Step 3: Having chosen a track in Step 2 above, confirm that your chosen track is downloadable. If it isn’t, either get in touch with the musician who made it, or choose another track.

Step 4: Listen closely to the track you selected in Step 3. Consider what edges it has that might be smoothed out, what drama it has that might be subsumed. Consider how you might do such things while retaining something that is inherently listenable, should someone choose to turn up the volume and focus on it.

Step 5: Rework the track you selected in Steps 2 and 3 to achieve the goals that arose from Step 4.

Six More Important Steps When Your Track Is Done:

Step 1: Include “disquiet0351” (no spaces or quotation marks) in the name of your track.

Step 2: If your audio-hosting platform allows for tags, be sure to also include the project tag “disquiet0351” (no spaces or quotation marks). If you’re posting on SoundCloud in particular, this is essential to the subsequent location of tracks for the creation of a project playlist.

Step 3: Upload your track. It is helpful but not essential that you use SoundCloud to host your track.

Step 4: Post your track in the following discussion thread at llllllll.co:

https://llllllll.co/t/disquiet-junto-project-0351-selected-insomniac-works-volume-ii/

Step 5: Annotate your track with a brief explanation of your approach and process.

Step 6: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Other Details:

Deadline: This project’s deadline is Monday, September 24, 2018, at 11:59pm (that is, just before midnight) wherever you are on planet Earth. It was posted just before noon, California time, on Thursday, September 20, 2018.

Length: The length of your track is up to you.

Title/Tag: When posting your track, please include “disquiet0351” in the title of the track, and where applicable (on SoundCloud, for example) as a tag.

Upload: When participating in this project, post one finished track with the project tag, and be sure to include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Download: Please consider setting your track as downloadable and allowing for attributed remixing (i.e., a Creative Commons license permitting non-commercial sharing with attribution, allowing for derivatives).

Linking: When posting the track online, please be sure to include the following information — as well as the name of the individual whose original track you’re reworking, and a link to that source track:

More on this 351st weekly Disquiet Junto project (Selected Insomniac Works Volume II / The Assignment: Rework some very quiet music by making it even more sedate) at:

https://disquiet.com/0351/

More on the Disquiet Junto at:

https://disquiet.com/junto/

Subscribe to project announcements here:

http://tinyletter.com/disquiet-junto/

Project discussion takes place on llllllll.co:

https://llllllll.co/t/disquiet-junto-project-0351-selected-insomniac-works-volume-ii/

There’s also a Junto Slack. Send your email address to twitter.com/disquiet to join in.

Image associated with this project is by Chris, used thanks to Flickr and a Creative Commons license:

https://flic.kr/p/yfhgm

https://creativecommons.org/licenses/by-nc-sa/2.0/

Colossal: Domestic Sculptures Formed With Wood Grown at the United States and Mexico Border by Hugh Hayden

"America" (2018), 
Sculpted mesquite (Prosopis glandulosa) on plywood, 
overall dimensions: 43 1/4 x 81 x 81 in
, © Hugh Hayden; Courtesy Lisson Gallery

“America” (2018), 
Sculpted mesquite (Prosopis glandulosa) on plywood, 
overall dimensions: 43 1/4 x 81 x 81 in
, © Hugh Hayden; Courtesy Lisson Gallery

Texas-born sculptor Hugh Hayden (previously) combines different varieties of wood to create furniture and other domestic objects with protruding spikes and branches. For his latest exhibition Border States at Lisson Gallery in New York City, Hayden addresses notions of citizenship and boundaries with sculptures made using wood indigenous to the United States and Mexico border. The traditional family ideals of the American Dream are evoked in objects such as a dining room table, picket fence, and child’s stroller, yet their source material speaks to the contentious practices upheld at our nation’s border.

Eastern Red Cedar, a wood from Texas with a pinkish interior, composes The Jones Part 3, a fence covered in branches which reach out at the audience from its vertical slats. “Texas Ebony,” a dark wood that grows at the border, composes another sculpture, while the weed-like Mesquite forms a kitchen table and chairs titled America.

Hayden currently lives and works in New York City. Border States runs at Lisson Gallery through October 27, 2018. You can see more of his politically-minded sculptures on his website and Instagram.

"America" (detail) (2018), 
Sculpted mesquite (Prosopis glandulosa) on plywood, 
overall dimensions: 43 1/4 x 81 x 81 in
, © Hugh Hayden; Courtesy Lisson Gallery

“America” (detail) (2018), 
Sculpted mesquite (Prosopis glandulosa) on plywood, 
overall dimensions: 43 1/4 x 81 x 81 in
, © Hugh Hayden; Courtesy Lisson Gallery

Installation view of Hugh Hayden: Border States at Lisson Gallery, New York (15 September – 27 October 2018). © Hugh Hayden; Courtesy Lisson Gallery.

"Cable News" (2018
), Sculpted post cedar (Juniperus ashei) with mirror and hardware, 
101 x 31 1/2 x 19 1/2 in, © Hugh Hayden; Courtesy Lisson Gallery

“Cable News” (2018
), Sculpted post cedar (Juniperus ashei) with mirror and hardware, 
101 x 31 1/2 x 19 1/2 in, © Hugh Hayden; Courtesy Lisson Gallery

Installation view of Hugh Hayden: Border States at Lisson Gallery, New York (15 September – 27 October 2018). © Hugh Hayden; Courtesy Lisson Gallery.

"
The Jones Part 3" (2018), 
Sculpted eastern red cedar (Juniperus virginiana) with steel
, 78 1/2 x 180 x 26 3/4 in, © Hugh Hayden; Courtesy Lisson Gallery

“
The Jones Part 3” (2018), 
Sculpted eastern red cedar (Juniperus virginiana) with steel
, 78 1/2 x 180 x 26 3/4 in, © Hugh Hayden; Courtesy Lisson Gallery

Installation view of Hugh Hayden: Border States at Lisson Gallery, New York (15 September – 27 October 2018). © Hugh Hayden; Courtesy Lisson Gallery.

"Untitled (Wagon)" (2018), 
Sculpted post cedar (Juniperus ashei)
, 100 x 89 x 65 in, © Hugh Hayden; Courtesy Lisson Gallery

“Untitled (Wagon)” (2018), 
Sculpted post cedar (Juniperus ashei)
, 100 x 89 x 65 in, © Hugh Hayden; Courtesy Lisson Gallery

"Untitled (French gothic picket)" (2018), 
Sculpted post cedar (Juniperus ashei) on plywood
, 
68 x 98 x 59 in, © Hugh Hayden; Courtesy Lisson Gallery

“Untitled (French gothic picket)” (2018), 
Sculpted post cedar (Juniperus ashei) on plywood
, 
68 x 98 x 59 in, © Hugh Hayden; Courtesy Lisson Gallery

Tea Masters: A Study of Taiwan's High Mountain Oolongs

From left to right: Feng Huang, Alishan, DYL 90K
Antonio loves high mountain Oolongs and asked me for another lesson about them, in particular about the characteristics of the different mountains. So I prepared two comparisons of three spring Oolongs each. To compare with identical parameters, we brew each tea with 3 grams, water just off the boil, and a 6-minute infusion in standard white porcelain competition sets.

For the first series, I chose a lightly oxidized Qingxin Oolong from Feng Huang (Dong Ding), whose elevation of only 700 meters does not qualify as high mountain (at least 1,000 m). This tea resembles a high mountain Oolong, but isn't one. It was therefore interesting to taste the difference against two true high mountain Oolongs, to better understand the impact of elevation on the aromas. Its brew is a little more yellow and dark, and so are its leaves, especially once opened. The aromas are less floral and more ripe, and the taste has less finesse and lightness.
The second tea, in the middle, is my Alishan from Changshu Hu. It surprised us, because its floral aromas are more pronounced than those of the Da Yu Ling! Its taste has a lot of lightness, but less complexity and depth than the Da Yu Ling. The latter also has more freshness, and you can sense that it comes from a cooler place. Its open leaves are the largest (though when dry it isn't so obvious that they are that large).
From left to right: Shan Lin Xi, Qilai shan, Tian Chi
For the second comparison, I again arranged the teas in order of increasing elevation. The first tea is this Shan Lin Xi. It is very floral and soft. Then, in the middle, there is this Oolong from Qilai shan. It has more mineral flavors and has refined nicely since this spring. In terms of quality, this puts it on a similar level to the DYL and to the Tian Chi that closes this comparison. The Tian Chi gives the most concentrated brew, and the colors of its leaves are the most vivid. Its aromas are the fruitiest. Its plantation is beautiful enough to make you weep, and I have already devoted a whole article to all the good things I think of it. In a word, it is a high mountain Oolong that is exceptional on more than one count, but we found that the Shan Lin Xi and the Qilai held their own! Likewise, the size of comparable leaves increases across these three Oolongs, which confirms that the elevation of their plantations increases as well.
To finish the lesson, Antonio brewed the Oolong of his choice in a zhuni teapot from Yixing. He chose the Qilai. Since the teapot's volume is similar to that of the competition set, I measured out 3 grams, to better compare with the porcelain infusions. However, I let him infuse the leaves for as long as he wanted. And since his infusion lasted much less than 6 minutes, we got a much lighter tea, with less taste and less length in the mouth!
So this first infusion was less flavorful than the competition-style one in porcelain! That's quite surprising, but it is also a good sign for the quality of these high mountain Oolongs that they remain so pleasant when brewed in such a demanding way! It also shows that porcelain can be a better accessory than a zhuni teapot when you brew the leaves so as to maximize their potential!
Taiwan's high mountain Oolongs never cease to surprise and delight us with their finesse, their lack of bitterness, their freshness...
Tian Chi Oolong plantation

Michael Geist: Cuts Like a Knife: Bryan Adams Calls for Stronger Protections Against One-Sided Record Label Contracts

Canadian artist Bryan Adams placed copyright in the spotlight on Tuesday, appearing before the Canadian Heritage committee to make his case for copyright reform. Adams attracted widespread media coverage, though the big music industry groups such as Music Canada were conspicuously silent with not even a tweet to mark the appearance. Why the cold shoulder from the Canadian music industry to one of Canada’s best known artists? The obvious answer is that Adams sang from a far different songbook than the industry lobby groups. While those groups have been pushing for copyright term extension and a so-called “value gap” that bears little reality to Canadian law, Adams expressed artist frustration with the industry and one-sided contracts, noting that “I don’t even want to start naming the names of people who have had their copyright whisked from underneath their feet from contracts that they’ve signed as youngsters.”

His proposed reform stems from Section 14(1) of the Copyright Act, which provides for the reversion of copyrights to a creator’s heirs 25 years after their death. The provision is designed to address concerns that creators are often the weaker party when they enter into agreements with publishers or record labels. The reversion provision seeks to remedy the bargaining imbalance by reverting the rights many years later. As Adams pointed out, however, creators never experience the benefit of reversion since it applies decades after their death. Adams instead proposed that the reversion take effect 25 years after the copyright is first assigned to the company:

25 years is plenty of time for copyright to be exploited by an assignee. The second point was that an author or composer can see a further potential financial benefit of their work in their lifetime, and reinvest in new creation. It won’t happen by having reversion. It’s an incentive. This is the single and probably the most efficient subsidy to Canadian creators at no additional costs to the taxpayers at all.

Adams’ pitch opens the door to an important conversation on copyright policy in Canada. First, by calling for a shorter copyright term for reversion in contractual agreements, he reminded policy makers that extending the term of copyright for years after the creator has died does little for creators. Indeed, the industry push for copyright term extension of up to 70 years after death does not create new incentives to create or give creators what they need today. If the government is considering amendments to copyright terms, it would do far better to examine shortening the term of copyright reversion as Adams suggests, rather than locking down the public domain by extending the general copyright term in a manner that leaves creators with little value or incentive today.

Second, Adams’ concern is fundamentally about the unfair bargaining power that frequently exists between creators and music labels, publishers, or other corporate copyright interests. The reversion approach is one mechanism to address the copyright imbalance, though shortening the term raises its own set of concerns, including the likelihood that publishers and labels will look for other ways to generate revenues from the artists over a shorter period of time. My colleague Professor Jeremy de Beer has identified other mechanisms that could be used to strengthen the bargaining power of artists when confronted with one-sided deals from record labels and publishers.

Third, the problem with copyright and unfair contracts is not limited to one-sided deals with record labels. The interaction between copyright and contracts can create challenges in many areas, including efforts by publishers to limit fair dealing rights through contract, contractual limits on authors to use their own works as they see fit, and the use of contracts and digital locks to leave consumers with few rights when their digital purchases are rendered inaccessible.

There has long been a need to ensure that copyright policies are not overridden by unfair contracts, ideally through legislative amendments that restrict the ability to contract out of fundamental creator and user rights. With his high-profile appearance before the Heritage Committee, Adams has provided an important reminder that copyright lobby groups do not speak for all creators, and he has redirected the copyright conversation toward some of the copyright concerns that the industry has been reluctant to address.

The post Cuts Like a Knife: Bryan Adams Calls for Stronger Protections Against One-Sided Record Label Contracts appeared first on Michael Geist.

The Shape of Code: A 1948 viewpoint on developer vs. computer time

For a long time now developer time has been a lot more expensive than computer time. The idea that developers should organize what they do, so as to maximize the efficiency of computer time rather than their own time, is considered to be an echo from a bygone age.

Until recently, I thought the transition from this bygone age, when computer time was considered more important than developer time, started in the late 1960s. Don’t ask me why I thought this, put it down to personal bias.

I was recently reading A Survey of Eniac Operations and Problems: 1946-1952, published in 1952, and what did I find:

“Early in 1948, R. F. Clippinger and some of his associates, in the course of coding the solution of …, were forced to adopt a different method of using the Eniac in order to fit their problem on the machine. …. The experience with this method (first discussed in reference 1), led J. von Neumann to suggest the use of a serial code for control of the Eniac. Such a code was devised and employed with the Eniac beginning in March 1948. Operation of the Eniac with this code was several times slower than either the original method of direct programming or the code for parallel operation. However, the resulting simplification of coding techniques and other advantages far outweighed this disadvantage.”

In other words, in 1948, the people using one of the few computers in the world, which was clocked at 100 kHz, considered developer time to be more important than computer time.

things magazine: Fighting through it

Can there ever be a big-budget action game without violence? See also The Right to Roam, our earlier look at walking simulators and photo modes / Origami simulator / Spectrum Nostalgia / art by Adam Robinson / RANKED: 10 Paintings … Continue reading

Penny Arcade: News Post: The Craw Eternal

Tycho: I feel like a sweatshop that the worker willingly enters constitutes a kind of labor market endgame. Like the opt-in Surveillance State we acknowledge and worship every time we tag a photo, we really have to take a second to think about this.  But nobody has a second.  I deeply resent the fact that one can accurately and succinctly describe the actual world we live in and sound like a fucking AM radio whackjob. It’s true that these are choices people are making, ostensibly with the knowledge that this type of “employment” has none of the strictures commonly…

Daniel Lemire's blog: On the state of virtual-reality gaming

For nearly two years, I have been trying a wide range of video games in a virtual-reality setting. Our lab in Montreal has some permanent space dedicated to the HTC Vive, so I was also able to test out games with a wide range of people. I must have tried several dozen different games so far.

Gaming in virtual reality is a disappointment. I am surprised that Sony sold millions of virtual-reality headsets. To my knowledge, there are no big studios betting on virtual reality. It is mostly independent developers making small bets.

To be clear, I am not disappointed at virtual reality per se. However, it seems clear that two years ago, I greatly underestimated how much work we collectively need to do to get “virtual reality right”.

What works? A few games are quite good. I have two favorite games.

One of them is Superhot VR. In Superhot, you are an assassin moving from one minimalist sandbox to another, killing people as best you can (with a knife, your fist, a bottle, a gun, …). It would be quite bland if not for the trick that time flows only as fast as you move. As long as you remain immobile, time remains still. The game is a “port” to virtual reality of a conventional game, but virtual reality makes it shine.

My other favorite game is Beat Saber. As the name suggests, you use (light) sabers to cut coloured boxes coming at you (not unlike a Star Wars Jedi) at the rhythm of some music. It is probably my favorite virtual-reality game so far.

Both of them are so good that they provide an unforgettable experience. However, they are both modest games.

What might we say about virtual-reality gaming?

  1. Both of these games are highly immersive. Once you are in the game, you feel as if you were teleported elsewhere and you forget where your body is. Yet they are not, in any way, realistic. That is, you are teleported in an artificial world that looks nothing like our everyday world.

    A few years ago, many people assumed that photorealism was required for immersion. That is entirely false.

  2. As a related, but distinct point, neither of these games was particularly expensive to build, or technically challenging. I could probably write cheap clones of these games in a few months, and I am not a video-game programmer. That is, of course, a consequence of the fact that there are seemingly no major investments.
  3. These games require “six degrees of freedom” and handheld commands. That is, they work because you can really move in the environment (forward, backward) while looking in all directions, and using your hands freely.

    However, they only require you to move within a small space. This last step is important since your actual body is still limited to a relatively small space.

    Many games allow you to travel vast distances through various tricks such as teleports, or by moving from within a vehicle. For example, you can point to a far location and click a button to appear there. Even though teleports “work” technically, they are disappointing. I almost invariably get frustrated at such games.

    Other games offer only restricted degrees of freedom. Some games only require you to look around, without having to move. I find these games disappointing as well.

  4. My impression is that simply carrying over existing video games is almost always going to be a futile exercise.

What might come around the corner?

  1. Multiplayer virtual-reality gaming might be great. There are games like Rec Room that offer decent experiences already, with a lot of frustration thrown in. However, we will need better hardware with features like eye tracking. It is almost here.
  2. I still haven’t seen any “long-form” game. That is, playing for hours in a deep and involved game is not possible, right now. What is worse: I cannot even imagine what such a game might look like.

Quiet Earth: FantasticFest: First Look at Dystopian Thriller LEVEL 16 [Trailer]

One of the projects I'm most excited to see this autumn is writer/director Danishka Esterhazy's latest tale of horror and woe: Level 16.


One of the hallmarks of Esterhazy's work is strong female leads and it seems that she is carrying on that tradition with her latest which stars Sara Canning (who previously starred in Esterhazy's excellent Black Field) as Miss Brixil, the exacting headmistress of Vestalis Academy, a place where girls go for training before being adopted into the upper-class families of this dystopian world.


Level 16 is the final level of training but, as our protagonist Vivien (Katie Douglas) is starting to suspect, there's something far more nefarious at play here than simple adoption.


It's difficult to watch a trailer for any pr [Continued ...]

MattCha's Blog: 2017 Zheng Si Long Gedeng & Thoughts On the Gedeng Producing Area


Well, I lost this 2017 Zheng Si Long Gedeng sample about a month ago.  Tiago, the owner of Tea Encounter, kindly included this in a recent order.  It goes for $164.32 for a 400g cake, or $0.41/g, but is currently sold out.  Tiago assured me that his stock gets regularly replenished, so watch for its return if this review piques your interest…

Dry leaves smell of mellow cherry fruits and of distant mountain dew, a rocky and almost forest-like odour.

The first infusion has a mineral, rock-like taste, initially almost like literally eating a rock, with a slippery, almost sticky mouthfeeling.  There is a mild cooling and slight sugar and distant fruit.  The fruit element slowly expands in the mouth and shows subtleties of a more tropical fruit taste.  The mouthfeeling is reasonably long and still carries the mineral rock taste that is a touch forest-like.

The second infusion has a very nice and full mouthfeel that is like a dense coating of slightly astringent paint over the tongue and mouth.  The throat takes note and opens to such suggestions.  The mineral rock taste is there, along with distinct florals in the background as well as subtle fruits.  The feeling in the mouth and throat is really nice right off the go here.  Subtle fruits and florals stretch long into the breath.

The third infusion has much of the second: its mouthfeel is nice and strong, a thick-feeling liquor in the mouth.  The mineral, rock, mountain-top taste is distinct and dominating throughout.  The high notes linger in the background, distant wildflowers and almost tropical fruit suggestions.

The fourth infusion has a nice full mineral, rock taste with edges of forest and opens up to a more distinct menthol taste with a hollow sugar finish with slight wildflower and honey.  The returning sweetness is a nice exaggeration of this, with touches of tropical fruit.

The fifth and sixth were much the same: the astringent, up-front mineral and forest base taste is interesting and a signature of Gedeng.  The sweetness is all on the back end, in the form of almost buttery wildflowers and edges of clear tropical fruit tastes.

The seventh infusion has fruit tastes mixed with forest.  The mineral, rock taste is less now, and the fruity taste, with a bit of slight astringency, is found throughout.  The returning sweetness swells with a touch of cooling in the throat, where tropical fruits appear.  This infusion is much more sweet and fruity now.

The eighth and ninth are more mellow fruit with slight cooling and edges of astringency.  The fruity taste becomes more dominant now.  This tea is becoming very fruity and approachable, with a distinct cooling aftertaste and long fruitiness.  The fruitiness is not that vibrant, overpowering thing; instead it's a mellow, almost foresty thing.  The qi of this tea is not groundbreaking but soft and gentle: you can feel a fluffiness in the head, but nothing too much.  A touch relaxing, a touch alerting - nothing too strong, a mellow qi.

The tenth has a creaminess and sweetness to the fruity flavours, which now dominate.  The eleventh still has a thicker, viscous, slightly astringent feeling.

In infusions 11 through 14 it starts to weaken, so I push harder, but mainly enjoyable fruity tastes are pushed out.  There is still a mild menthol, and a mineral rock-forest taste is mainly found in the aftertaste now; still worthwhile and tasty.

There is lots to enjoy about this puerh, mainly in its taste progression throughout the session.  There is interesting depth in this Gedeng due to its astringent mineral-forest taste, which at times is almost or barely bitter and helps to balance the interesting mild fruits and florals that wriggle themselves out, especially later in the session where they dominate.  It offers a mild, mellow qi sensation.  Another thing that might interest people is its very characteristic Gedeng profile.  This might be worth a sample for those out there who want to get familiar with this famous (but not that common) classic six mountain puerh producing area.  The thing is, I have never really been a big fan of this area.  Either way, this is a great example, even if just for education purposes.

Peace
 


Quiet Earth: Keanu Reeves in Sci-fi REPLICAS Trailer

Keanu Reeves is back in another cyberpunk movie called Replicas. We first reported on the film last year with a previous trailer, but since then the film has reportedly been undergoing some changes and this new trailer has emerged in quite a different form. Watch both if you can.

The film stars Keanu Reeves, Alice Eve, Thomas Middleditch, and John Ortiz and will be released "soon".


Synopsis:
In this sci-fi thriller, neuroscientist William Foster is on the verge of successfully transferring human consciousness into a computer when his family is tragically killed in a car crash.

Desperate to resurrect them, William recruits fellow scientist Ed Whittle to help hi [Continued ...]

Quiet Earth: Family Visit Turns Deadly in KNUCKLEBALL [Trailer]

Writer/director Michael Peterson's feature film debut Lloyd the Conqueror made a bit of a splash in its festival run a few years ago, and now the director is back with a tale far darker and more menacing than that film ever was: Knuckleball.


The movie stars Michael Ironside as Jacob, a grouchy granddad tasked with minding his grandson for the weekend. He's a tough SOB but he's determined to teach his grandson Henry (Luca Villacis) a few things before his parents pick him up.


When Jacob suddenly dies during the night and a snowstorm rolls into the secluded farm, Henry turns to Dixon (Munro Chambers of Turbo Kid and "Degrassi" fame), a nearby neighbour and farm hand, for help and so begins the trouble.


The [Continued ...]

explodingdog: Photo



Quiet Earth: New on Blu-ray and DVD for September 18, 2018

In the latest adaptation of Ray Bradbury's classic dystopian novel, a young man, Guy Montag, whose job as a fireman is to burn all books, questions his actions after meeting a young woman... and begins to rebel against society.

This version of the story comes courtesy of HBO and stars Michael Shannon, Michael B. Jordan and Sofia Boutella.

You'll find the trailer below.


ONWARDS!





It's been three years since theme park and luxury resort, [Continued ...]

CreativeApplications.Net: Horror Vacui – Exploring Earth’s (un)real geological formations

Horror Vacui – Exploring Earth’s (un)real geological formations
Created by Matteo Zamagni, "Horror Vacui" is a non-narrative film that explores geological formations of Earth and the frenetic hyper-development attained by humankind.

Penny Arcade: Comic: The Craw Eternal

New Comic: The Craw Eternal

OCaml Planet: opam 2.0.0 release and repository upgrade

We are happy to announce the final release of opam 2.0.0.

A few weeks ago, we released a last release candidate to be later promoted to 2.0.0, synchronised with the opam package repository upgrade.

You are encouraged to update as soon as you see fit, to continue to get package updates: opam 2.0.0 supports the older formats, and 1.2.2 will no longer get regular updates. See the Upgrade Guide for details about the new features and changes.

The website opam.ocaml.org has been updated, with the full 2.0.0 documentation pages. You can still find the documentation for the previous versions in the corresponding menu.

Package maintainers should be aware of the following:

  • the master branch of the opam package repository is now in the 2.0.0 format
  • package submissions must accordingly be made in the 2.0.0 format, or using the new version of opam-publish (2.0.0)
  • anything that was merged into the repository in 1.2 format has been automatically updated to the 2.0.0 format
  • the 1.2 format repository has been forked to its own branch, and will only be updated for critical fixes

For custom repositories, the advice remains the same.


Installation instructions (unchanged):

  1. From binaries: run
    sh <(curl -sL https://raw.githubusercontent.com/ocaml/opam/master/shell/install.sh)

    or download manually from the GitHub “Releases” page to your PATH. In this case, don’t forget to run opam init --reinit -ni to enable sandboxing if you had version 2.0.0~rc manually installed.

  2. From source, using opam:
    opam update; opam install opam-devel

    (then copy the opam binary to your PATH as explained, and don’t forget to run opam init --reinit -ni to enable sandboxing if you had version 2.0.0~rc manually installed)

  3. From source, manually: see the instructions in the README.

We hope you enjoy this new major version, and remain open to bug reports and suggestions.

NOTE: this article is cross-posted on opam.ocaml.org and ocamlpro.com. Please head to the latter for the comments!

Explosm.net: Comic for 2018.09.19

New Cyanide and Happiness Comic

Which Lights?: itscolossal: Watch How Steel Ribbons Are Shaped into Cookie...





itscolossal:

Watch How Steel Ribbons Are Shaped into Cookie Cutters

Tag ur porn pls

LaForge's home page: Wireshark dissector for 3GPP CBSP - traces wanted!

I recently was reading 3GPP TS 48.049, the specification for CBSP (Cell Broadcast Service Protocol), which is the protocol between the BSC (Base Station Controller) and the CBC (Cell Broadcast Centre). It is how the CBC, according to the spec, instructs the BSCs to broadcast the various cell broadcast messages to their respective geographic scopes.

While OsmoBTS and OsmoBSC do have support for SMSCB on the CBCH, there is no real interface in OsmoBSC yet through which an external application could instruct it to send cell broadcasts. The only existing interface is a VTY command, which is nice for testing and development, but hardly a scalable solution.

So I was reading up on the specs, discovered CBSP, and thought one good way to get familiar with it would be to write a Wireshark dissector for it. You can find the result at https://code.wireshark.org/review/#/c/29745/

Now my main problem is that, as usual, there appear to be no open source implementations of this protocol, so I cannot generate any traces myself. More surprising is that it's not even possible to find any real-world CBSP traces out there. So I'm facing a chicken-and-egg problem: I can only test / verify my Wireshark dissector if I find some traces.
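One way around the missing traces, at least for exercising the dissector, is to hand-craft synthetic PDUs and replay them over TCP while capturing on loopback. Here is a minimal Python sketch of that idea; the 4-octet header layout (1-octet message type, then a 3-octet big-endian length over the body) is my reading of TS 48.049, and the message type value, body bytes, and port number are placeholders rather than values taken from the spec.

    import socket

    def cbsp_pdu(msg_type, body):
        # Header as I read TS 48.049: a 1-octet message type followed by
        # a 3-octet big-endian length covering the body; verify against
        # the spec before relying on this.
        return bytes([msg_type]) + len(body).to_bytes(3, "big") + body

    # Placeholder values purely to light up the dissector; a real message
    # would carry proper information elements in TLV form.
    pdu = cbsp_pdu(0x01, bytes.fromhex("0a00010000"))

    # Replay towards any listener (e.g. `nc -l -p 48049`) while Wireshark
    # captures on the loopback interface; the port is an arbitrary choice.
    with socket.create_connection(("127.0.0.1", 48049)) as s:
        s.sendall(pdu)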

So if you happen to have done any work on cell broadcast in 2G networks and have a CBSP trace around (or can generate one): please send it to me, thanks!

Alternatively, you can of course also use the patch linked above, build your own wireshark from scratch, test it and provide feedback. Thanks in either case!

LLVM Project Blog: Announcing the new LLVM Foundation Board of Directors

The LLVM Foundation is pleased to announce its new Board of Directors:


Chandler Carruth
Mike Edwards (Treasurer)
Hal Finkel
Arnaud de Grandmaison
Anton Korobeynikov
Tanya Lattner (President)
Chris Lattner
John Regehr (Secretary)
Tom Stellard

Two new members and seven continuing members were elected to the nine-person board.

We want to thank David Kipping for his two terms on the board. David has been actively involved with the LLVM Developer Meetings and was the treasurer for the past 4 years. The treasurer is a time-demanding position: he supported the day-to-day operation of the foundation, balanced the books, and generated monthly treasurer reports.

We also want to thank all the applicants to the board. When voting on new board members, we took into consideration all contributions (past and present) and current involvement in the LLVM community. We also tried to create a balanced board of individuals from a wide range of backgrounds and locations, to provide a voice to as many groups within the LLVM community as possible. Given these criteria and the strong applicants, we increased the board from 8 members to 9.

About the board of directors (listed alphabetically by last name):


Chandler Carruth:

Chandler Carruth has been an active contributor to LLVM since 2007. Over the years, he has worked on LLVM’s memory model and atomics, Clang’s C++ support, the GCC-compatible driver, the initial profile-aware code layout optimization pass, the pass manager, IPO infrastructure, and much more. He is the current code owner of inlining and SSA formation.

In addition to his numerous technical contributions, Chandler has led Google’s LLVM efforts since 2010 and shepherded a number of new efforts that have positively and significantly impacted the LLVM project. These new efforts include things such as adding C++ modules to Clang, adding address and other sanitizers to Clang/LLVM, making Clang compatible with MSVC and available to the Windows C++ developer community, and much more.

Chandler works at Google Inc. as a technical lead for their C++ developer platform and has served on the LLVM Foundation board of directors for the last 4 years.

Mike Edwards:

Mike Edwards is a relative newcomer to the LLVM community, beginning his involvement just a few years ago while working for Sony Playstation. Finding the LLVM community to be an incredibly amazing and welcoming group of people, Mike knew he had to find a way to contribute. Mike’s previous work in DevOps led him to get involved in the llvm.org infrastructure. Last year, with the help of the Board and several community members, Mike was able to get the llvm.org infrastructure moved onto a modern compute platform at Amazon Web Services. Mike is one of the maintainers of our llvm.org infrastructure.

Mike is currently a Software Engineer at Apple, Inc., working on the Continuous Integration and Quality Engineering efforts for LLVM and Clang development.

Hal Finkel:

Hal Finkel has been an active contributor to the LLVM project since 2011. He is the code owner for the PowerPC target, the alias-analysis infrastructure, and other components.

In addition to his numerous technical contributions, Hal has chaired the LLVM in HPC workshop, which is held in conjunction with Super Computing (SC), for the last five years. This workshop provides a venue for the presentation of peer-reviewed HPC-related research in LLVM from both industry and academia. He has also been involved in organizing an LLVM-themed BoF session at SC and LLVM socials in Austin.

Hal is Lead for Compiler Technology and Programming Languages at Argonne National Laboratory’s Leadership Computing Facility.

Arnaud de Grandmaison:

Arnaud de Grandmaison has been hacking on LLVM projects since 2008. In addition to his open source contributions, he has worked for many years on private out-of-tree LLVM-based projects at Parrot, DiBcom, and Arm. He has also been a leader in the European LLVM community by organizing the EuroLLVM Developers’ Meeting and Paris socials, and by chairing or participating in numerous program committees for the LLVM Developers’ Meetings and other LLVM-related conferences.

Arnaud has attended numerous LLVM Developers’ Meetings, volunteering as a moderator and presenting as well. He also moderates several LLVM mailing lists. Arnaud is also very involved in community-wide discussions and decisions such as re-licensing and the code of conduct.

Arnaud is a Senior Principal Engineer at Arm.

Anton Korobeynikov:

Anton Korobeynikov has been an active contributor to the LLVM project since 2006. Over the years, he has made numerous technical contributions to areas including Windows support, ELF features, debug info, exception handling, and backends such as ARM and x86. He was the original author of the MSP430 backend and the original System Z backend.

In addition to his technical contributions, Anton has maintained LLVM’s participation in Google Summer of Code by managing applications, deadlines, and overall organization. He also supports the LLVM infrastructure and has been on numerous program committees for the LLVM Developers’ Meetings (both US and EuroLLVM).

Anton is currently an associate professor at the Saint Petersburg State University and has served on the LLVM Foundation board of directors for the last 4 years.

Tanya Lattner:

Tanya Lattner has been involved in the LLVM project for over 14 years. She began as a graduate student who wrote her master's thesis using LLVM, and continued using and extending LLVM technologies at various jobs during her career as a compiler engineer.

Tanya has been organizing the US LLVM Developers’ meeting since 2008 and has attended every developer meeting. She was the LLVM release manager for 3 years, moderates the LLVM mailing lists, and helps administer the LLVM infrastructure servers, mailing lists, bugzilla, etc. Tanya has also been on the program committee for the US LLVM Developers’ meeting (4+ years) and the EuroLLVM Developers’ Meeting.

With the support of the initial board of directors, Tanya created the LLVM Foundation, defined its charitable and educational mission, and worked to get 501(c)(3) status.

Tanya is the Chief Operating Officer and has served as the President of the LLVM Foundation board for the last 4 years.

Chris Lattner:

Chris Lattner is well known as the founder of the LLVM project and has a lengthy history of technical contributions to the project over the years. He drove much of the early implementation, architecture, and design of LLVM and Clang.

Chris has attended every LLVM Developers’ meeting, and presented at many of them. He helped drive the conception and incorporation of the LLVM Foundation, and has served as its secretary. Chris also grants commit access to the LLVM Project, moderates mailing lists, moderates and edits the LLVM blog, and drives important non-technical discussions and policy decisions related to the LLVM project.

Chris manages a team building machine learning infrastructure at Google and has served on the LLVM Foundation board of directors for the last 4 years.

John Regehr:

John Regehr has been involved in LLVM for a number of years. As a professor of computer science at the University of Utah, his research specializes in compiler correctness and undefined behavior. He is well known within the LLVM community for the hundreds of bugs his group has found and reported in LLVM/Clang.

John was a project lead for IOC, a Clang-based integer overflow checker that eventually became the basis for the integer parts of UBSan. He was also the primary developer of C-Reduce, which utilizes Clang as a library and is often used as a test-case reducer for compiler issues.

In addition to his technical contributions, John has served on several LLVM-related program committees. He also has a widely read blog about LLVM and other compiler-related issues (Embedded in Academia).

Tom Stellard:

Tom Stellard has been contributing to the LLVM project since 2012. He was the original author of the AMDGPU backend and was also an active contributor to libclc. He has been the LLVM project’s stable release manager since 2014.

Tom is currently a Software Engineer at Red Hat and is the technical lead for emerging toolchains, including Clang/LLVM. He also maintains the LLVM packages for the Fedora project.

Perlsphere: Maintaining the Perl 5 Core (Dave Mitchell): Grant Report for August 2018

This is a monthly report by Dave Mitchell on his grant under Perl 5 Core Maintenance Fund. We thank the TPF sponsors to make this grant possible.

I didn't spend all that many hours during August on perl work.

I spent most of my time looking at a bug related to restoring of captures
within regex repeats. During the course of that, I took the opportunity to
simplify and cleanup some of the code in S_regmatch() which deals with
captures, and in particular, make it consistently use macros which produce
debugging output when opening or closing or restoring capture indices.

SUMMARY:
     18:03 RT #133352 Ancient Regex Regression
      1:30 RT #133429 Time-HiRes/t/itimer.t: intermittent failures
      1:36 RT #133441 no assignment to "my" variable  
    ------
     21:09 TOTAL (HH::MM)  


 254.7 weeks
3151.4 total hours
  12.4 average hours per week

There are 315 hours left on the grant

Michael Geist: Supreme Court of Canada on Copyright Notices: Identification of IP Address “Not Conclusive of Guilt”

The initial emphasis on last week’s Supreme Court of Canada copyright notice decision has focused on how Internet providers can pass along to rights holders the specific costs associated with subscriber disclosures beyond those required for the notice-and-notice system. The ruling rightly restores the notice system to its intended approach, but it is not the only takeaway with implications for the recent flurry of file sharing lawsuits. While there has been a huge number of claims filed in Canada (with some surprisingly large settlements), the Supreme Court acknowledged important limitations in notice claims, noting that merely being associated with an IP address is not conclusive of guilt.

The full quote from the majority:

It must be borne in mind that being associated with an IP address that is the subject of a notice under s. 41.26(1)(a) is not conclusive of guilt.  As I have explained, the person to whom an IP address belonged at the time of an alleged infringement may not be the same person who has shared copyrighted content online. It is also possible that an error on the part of a copyright owner would result in the incorrect identification of an IP address as having been the source of online copyright infringement. Requiring an ISP to identify by name and physical address the person to whom the pertinent IP address belonged would, therefore, not only alter the balance which Parliament struck in legislating the notice and notice regime, but do so to the detriment of the privacy interests of persons, including innocent persons, receiving notice.

The passage is critically important since it lends support to many notice recipients who maintain that they have been misidentified or the notice has been sent in error. While some may feel that they have little alternative but to settle, the Supreme Court’s language sends a reminder that IP address alone may be insufficient evidence to support a claim of copyright infringement. Those that fight back against overly aggressive notices may find the claims dropped. Alternatively, contesting a claim would require copyright owners to tender more evidence than just an allegation supported by an identifiable IP address.

Moreover, the finding reinforces the need for the government to act by stopping the inclusion of settlement demands within copyright notices. The system was never intended to be used to send legal claims and pressure recipients to settle unproven infringement allegations. Indeed, with the court recognizing the privacy interests at stake and that an IP address alone is not conclusive of guilt, the decision increases the pressure on government to address the inappropriate conduct by rights holders who are using the system to pressure thousands of Canadians into paying hundreds of dollars despite the obvious limitations associated with their claims.

The post Supreme Court of Canada on Copyright Notices: Identification of IP Address “Not Conclusive of Guilt” appeared first on Michael Geist.

Tea Masters: 30 years old '8582' puerh

Wrapping paper of the 1988 8582.
 Marco P. left this nice review about the 1988 '8582' raw puerh cake from the Menghai factory: "Possibly the best leaves I’ve ever tasted: they have a disarming simplicity in conciliating opposites such as thickness vs cleanliness, sweetness vs persistency, one prevailing note vs innumerable shades... I felt like making love." 

It's not every day that I get such a comment! So I thought I'd share it and give more background information about this tea. Let's start with the news about its latest auction price. Last June, in China, the price for 1 cake has reached a new high of 16,000 USD (which means 45 USD per gram)! This is very expensive for a tea, but when you consider it's 30 years old and mostly gushu, it's not that unreasonable. If you assume that a new gushu is priced at 2 USD per gram, then the return rate comes out at 11% for 30 years of storage (during which you have flooding risks... as we can see recently) (2 x 1.11^30 = 45).
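For anyone who wants to verify the compounding arithmetic, here is a quick back-of-the-envelope check; the 357 g standard cake weight is my assumption, since the post only quotes the per-cake and per-gram prices:

    # Back-of-the-envelope check of the auction math above.
    cake_price_usd = 16000
    cake_weight_g = 357          # standard full-size cake weight (assumption)
    price_per_gram = cake_price_usd / cake_weight_g   # ~44.8 USD/g

    new_gushu_per_gram = 2.0     # assumed price of new gushu, USD per gram
    years = 30
    annual_return = (price_per_gram / new_gushu_per_gram) ** (1 / years) - 1
    print(f"{price_per_gram:.1f} USD/g implies {annual_return:.1%} per year")
    # -> 44.8 USD/g implies 10.9% per year, matching the ~11% in the text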
The 8582 cake was created as a special order for Hong Kong tea stores. Despite its recipe name, it wasn't made in 1985, but it was first made in 1988! That's why it is also known as 88 qing bing (88 raw cake). 88 is a very auspicious number in Chinese. Pronounced in Cantonese, it sounds like 'making a fortune'. But despite this clever marketing, the 8582 wasn't a big success when it started to be sold in 1988. Why? Because it wasn't very good tasting then, and because its leaves appeared very big and rough. The concept of gushu puerh didn't exist, yet. 30 years ago, tea drinkers were looking for lots of buds as a sign of finesse and high notes. These big leaves with few buds tasted very bitter then. And, objectively, compared to a green mark or an aged red mark puerh, these new 8582 simply tasted inferior. Now, however, they are just as popular as the similar aged 7542. Their taste is thicker, while the 7542 have more refined notes.

The first picture above shows the wrapper. The design remained standard for CNNP productions until 2000. So, let me repeat that it's not useful to look at the wrapper to determine the year or authenticity of the puerh. Colors and font sizes would vary because of the manual printing method employed at the time. Paper thickness was also not necessarily consistent within the same year. And such a wrapper is very easy to copy. What's more difficult to replicate are the right color, size and shape of the leaves of the cake. Their aromas and taste are also unique. That's why it's so important to educate your palate with the right stuff. But how do you know if it's real? Well, if the taste lives up to its sterling reputation and inspires you to write a comment similar to Marco's, it likely is real!

This comment is also remarkable, because it offers a good definition of what all great teas have in common: "they have a disarming simplicity in conciliating opposites such as thickness vs cleanliness, sweetness vs persistency, one prevailing note vs innumerable shades... I felt like making love." Grazie Marco. It took your Italian sensibility to explain it so well!

things magazine: Enduring problems

Random things and thoughts, forgive crashing changes in tone / starting with a fine piece about Eileen Gray’s E.1027 house in Menton. Some more and necessary information about Gray’s life and work: “‘E’ for ‘Eileen’, 10 for the letter ‘J’, … Continue reading

OCaml Weekly News: OCaml Weekly News, 18 Sep 2018

  1. Multicore OCaml continuous benchmarking & call for benchmarks
  2. Dune 1.2.0
  3. Dune 1.2.1
  4. Other OCaml News

Explosm.net: Comic for 2018.09.18

New Cyanide and Happiness Comic

Disquiet: Taking Mount Vision by Strategy

The musician Emily A. Sprague has shifted her topographic imagination. At the very end of 2017, she released Water Memory, a widely praised collection of five modular-synthesizer tracks, the origins of which were likened in the album’s liner notes to “floating through the natural world around us.” Now, just shy of a year later, Sprague returns with Mount Vision, due out this coming Saturday, September 22 — barely a week after the reported two-day recording session that yielded the music. However rapid the production and delivery, a pre-release cut, “Piano 2 (Mount Vision),” suggests Mount Vision to be just as sedate and reflective as the best of Water Memory.

The one Mount Vision track available for streaming now, while the album racks up its cassette and download pre-orders, is “Piano 2 (Mount Vision).” It opens with a brief acoustic piano chord — reminiscent of Aphex Twin’s “Avril 14th” — that repeats for the full length of the track. Initially the apparent automation of that repetition strikes a contrast with the chord’s natural composure. It is closely mic’d, like a Nils Frahm solo — you can almost picture the inner workings of the keyboard. Field recordings of room tone and birdsong lend a naturalism the piano might lack otherwise.

And as the piece moves forward, a wafting synthesizer line comes if not to the fore then to the slow-burn middle ground. The development of that line adds to the Celtic lilt of the piano part, and brings to mind the Evening Star collaborations between Robert Fripp and Brian Eno — the synth akin to Fripp’s infinite-hold guitar, and the piano to Eno’s swaying ambient groove. All that quiet beauty notwithstanding, the real marvel of “Piano 2 (Mount Vision)” is how that synth line reveals subtle harmonic and rhythmic components of the piano part. It’s a small miracle of a piece.

The map up above is a detail of Mount Vision just north of San Francisco. There are several Mount Visions on this planet, but judging by the album’s liner notes, which list Northern California as where the recordings took place in mid-September, this spot is likely the one the album was titled after. For further reading, here’s a bit about a podcast interview with Sprague: “The Self-Education of Synthesist Emily Sprague.” And here’s a bit about her previous album: “The Water Memory of Emily A. Sprague.”

Album available at mlesprg.bandcamp.com. More from Sprague, who is originally from the Catskills and now lives in Los Angeles, at mlesprg.info.

Quiet Earth: New PROSPECT Trailer, Poster & Release Details!

Explore uncharted territory on the hunt for intergalactic treasures with the premiere of sci-fi western Prospect launching November 2 exclusively in Regal theaters nationwide from Gunpowder & Sky’s sci-fi label DUST.

Written and directed by Zeek Earl and Christopher Caldwell (In The Pines), Prospect follows a father and daughter on a journey to a jungle moon in search of the ultimate payday, where the two quickly fall prey to unknown terrain and dangerous encounters as their mission turns deadly.

PROSPECT also features Andre Royo (The Spectacular Now), Sheila Vand (Argo) and Anwan Glover (“The Wire”).


Prospect, based on a short by Earl and Caldwell, had its world premiere at SXSW 2018 to critical acclaim, where the film took home [Continued ...]

Michael Geist: Notice the Difference?: Supreme Court Rules ISPs Can Be Compensated for Copyright Costs

Policy makers have long struggled to strike a fair balance in crafting rules to address allegations of copyright infringement on the Internet. Copyright owners want to stop infringement and the right to pursue damages, Internet subscribers want their privacy and freedom of expression rights preserved in the face of unproven allegations, and Internet providers want to maintain their neutrality by resolving the disputes expeditiously and inexpensively.

My Globe and Mail op-ed notes that the Canadian system for online infringement was formally established in 2012 and came into effect in 2015. The so-called “notice-and-notice” approach grants rights holders the ability to send notifications of alleged infringement to Internet providers, who are required by law to forward the notices to the relevant subscriber and to preserve the data in the event of future legal action. The system does not prevent rights holders from pursuing additional legal remedies, but Internet providers cannot reveal the identity of their subscribers without a court order.

While the system has proven helpful in educating users on the boundaries of copyright, some rights holders have used it as a launching pad for further lawsuits. In fact, thousands of lawsuits have now been filed, with rights holders seeking to piggyback on the notice-and-notice system by obtaining the necessary subscriber information directly from Internet providers at no further cost.

The question of costs lies at the heart of an important Supreme Court of Canada copyright ruling released on Friday. Voltage Pictures sought subscriber information from Rogers Communications for the purposes of pursuing individual lawsuits. When Rogers advised that it wanted compensation of $100 per hour for the costs associated with fulfilling the request, Voltage responded that Internet providers could not pass along their costs since the notice-and-notice system already required them to identify subscribers and preserve the data without compensation.

The particular incident may have involved only a few hundred dollars, but the broader principle had the potential to dramatically alter the Canadian approach. If Internet providers were required to disclose subscriber information without passing along the costs, Canadian courts faced the prospect of an avalanche of lawsuits and Internet providers might be dissuaded from carefully ensuring that the privacy of their subscribers was properly protected.

The Supreme Court understood the broader implications of the case, ruling that Internet providers can pass along the specific costs associated with subscriber disclosures beyond those required for the notice-and-notice system. Indeed, the court recognized the importance of accurate data to safeguard against reputational harm and wrongful lawsuits.

While the ruling rightly restores the notice system to its intended approach, there is still more work to be done to ensure that the balance the government sought to achieve between rights holders, subscribers, and Internet providers is maintained.

First, the Canadian approach recognizes that with great rights come great responsibilities for Internet providers. For these guardians of highly sensitive personal information, including the browsing habits, social contacts, and location data for millions of Canadians, disclosing subscriber information as part of a litigation process raises significant privacy issues.

The courts have determined that there may be situations where disclosure is appropriate, but doing so requires ensuring that the data is accurate and only revealed for specific, limited purposes. Friday’s ruling reinforces that Internet providers will be compensated for the costs associated with meeting those obligations. It now falls to them to ensure they exercise care and caution for any subscriber disclosures.

Second, the government must ensure that the notice-and-notice system remains in place, despite considerable pressure from the United States to change it as part of the NAFTA renegotiations. The U.S. would like Canada to adopt its notice-and-takedown system, which encourages the removal of content online without a court review or order. But that approach raises freedom of expression risks, since it can lead to the removal of lawful content. The U.S. previously acknowledged during the Trans Pacific Partnership negotiations that the Canadian system provided an equivalent deterrent against online infringement. Despite renewed U.S. trade pressures, undoing the Canadian copyright balance should be taken off the table.

Third, assuming that notice-and-notice survives the NAFTA renegotiation, Innovation, Science and Economic Development Minister Navdeep Bains should follow through with a prior commitment to fix the loopholes in the Canadian approach. The system was designed to educate Canadians and avoid expensive litigation, but the hundreds of thousands of notices that include settlement demands, and the steady stream of class action claims filed against individuals in cases such as Rogers, suggest that the system needs tinkering.

The government previously indicated that long-overdue changes prohibiting the inclusion of settlement demands in notices would be forthcoming. Now that the Supreme Court has settled the question of costs, it falls to the government to complete the job of addressing the shortcomings of a system designed to fairly balance the rights of all stakeholders.

The post Notice the Difference?: Supreme Court Rules ISPs Can Be Compensated for Copyright Costs appeared first on Michael Geist.

new shelton wet/dry: ‘A fun thing to do at parties is stay home and masturbate.’ –Eden Dranger

In April 2018, the servers of the popular video game “Fortnite” crashed for 24 hours. During this period, Pornhub (a popular pornographic website) analyzed trends in pornography access, finding that: (a) the percentage of gamers accessing Pornhub increased by 10% and (b) searches for pornographic videos using the key term “Fortnite” increased by 60%. { Journal [...]

new shelton wet/dry: Every day, the same, again

Sex doll brothel raided by police, closed down
She thought she had candles to burn during a power outage — but it was dynamite
Japan starts space elevator experiments
Amazon has patented a system that would put workers in a cage, on top of a robot
The Secret Drug Pricing System Middlemen Use to [...]

CreativeApplications.Net: Encounter – Suspiciously curious robotics

Encounter – Suspiciously curious robotics
Created by Piet Schmidt during the summer semester at UdK Berlin (New Media / Digital Class), Encounter is a robotic arm with a mirror that curiously observes its surroundings.

Perlsphere: Mojolicious 8.0 released: Perl real-time web framework

I’m excited to announce the release of Mojolicious 8.0 (Supervillain).

This release marks the culmination of a 2-year development cycle that reached its conclusion last week at Mojoconf in Norway, where we were fortunate enough to have the whole core team present for many very productive discussions and some crazy fun experiments to get async/await working with Perl and Mojolicious.

The project has been growing steadily in the past few years, with many companies relying on Mojolicious to develop new code bases, and even 20-year-old behemoths like Bugzilla getting ported to Mojolicious. To support the continued growth we’ve decided to make a few organizational changes. From now on all new development will be consolidated in a single GitHub organization. And our official IRC channel (say hi!) with almost 200 regulars will be moving to Freenode (#mojo on irc.freenode.net), to make it easier for people not yet part of the Perl community to get involved.

When it comes to new features, this is probably our biggest release yet with 26 major features. For a complete overview you can take a look at the slides from my Mojoconf talk or watch the video. Here’s a list of the highlights:

As usual there is a lot more to discover, see Changes on GitHub for the full list of improvements.

Have fun!

TheSirensSound: New EP "Harbors" by Lanterns of Hope


TheSirensSound: New EP "Peach Blossom Spring" by Astral Harmonies


Disquiet: Stasis Report: Less Bells ✚ Distant Fires Burning ✚ More

The latest update to my Stasis Report ambient-music playlist. It started out just on Spotify. As of three weeks ago, it’s also on Google Play Music. The following five tracks were added on Sunday, September 16. All the tracks are fairly new, with the exception of one from a recent reissue.

✚ “Seashore Story” from Ambient Hamlet by Eashwar Subramanian, of Bangalore, India: bandcamp.com.

✚ “Golden Storm” from Solifuge by Less Bells (aka Julie Carpenter) of Joshua Tree, California, on the Kranky label: bandcamp.com.

✚ “Rosalie” from the score to Green Days by the River, composed by Laura Karpman, who is based in Los Angeles: cdbaby.com.

✚ “Any” from For the Love of… by Distant Fires Burning (aka Gert De Meester of Mechelen, Belgium), on the Audiobulb label: bandcamp.com.

✚ “Second Lens” from A Turn of Breath – Extended by Ian William Craig of Vancouver, British Columbia, on the Recital Program label: recitalprogram.com. It’s a recently expanded reissue of an earlier record.

Some previous Stasis Report tracks were removed to make room for these, keeping the playlist length to roughly two hours (up from what was originally an hour and a half, when the playlist first launched). Those retired tracks (Goldmund, Pariah, C. Diab, and Meg Bowles) are now in the Stasis Archives playlist (currently only on Spotify).

Perlsphere: Call for Grant Proposals (September 2018 Round)

The Grants Committee is accepting grant proposals all the time. We evaluate them every two months and another evaluation period is upon us.

If you have an idea for doing some Perl work that will benefit the Perl community, please consider submitting a grant application. The application deadline for this round is 23:59 September 30th UTC. We will publish the received applications, get community feedback through October 7th, and conclude acceptance by October 14th.

To apply, please read How to Write a Proposal. GC Charter, Rules of Operation and Running Grants List will also help you understand how the grant process works. We also got some grant ideas from the community.

We will confirm receipt of applications by October 1st.

If you have further questions, please contact me at tpf-grants-secretary at perl-foundation.org.

Update! Many of the links on this page were just fixed to match the recent TPF site refresh, sorry about that!

Perlsphere: Rakudo Perl 6 performance analysis tooling - Grant Report #4

The first public release! Code is now hosted on GitHub. Please see the instructions on how to install and run.

The release features a renewed “Routines” tab. Please read Timo’s blog post, “The first public release!”, to see how it compares to the previous profiler.

s mazuk: Sottsass & Grawunder’s models

communedesign:

Ettore Sottsass, born in Innsbruck in 1917, became one of the most influential designers and architects of the 20th century. Fond of radical thoughts and movements, he founded the energetic and flamboyant Memphis group in 1981 and completely changed the design scene in Italy - and in the world.

As Memphis attracted attention, Ettore Sottsass founded an architectural practice and a major design consultancy with Aldo Cibic, Marco Marabelli, Matteo Thun and Marco Zanini. He named it Sottsass Associati. He provided cultural guidance to work conducted by its many young associates. 

Johanna Grawunder, born in San Diego in 1961, joined the firm in 1985 and became a partner in 1989. At the Sottsass Studio, she was involved primarily with architecture and interiors, co-designing many of the firm’s most prestigious projects with Sottsass. The models they worked on together between 1988 and 2003 are incredible objects. Made of plexiglas and polychrome marble, they look like little sculptures and were produced by Ultima Edizione, a marble company located near Carrara.

In 2001, Johanna eventually left the firm to found her own studio in San Francisco and Milan. Sottsass died in 2007 at the age of 90 years but Sottsass Associati continue to carry on the philosophy and culture of the studio. The studio still works with former members of Memphis, including Johanna Grawunder.


Trivium: 16sep2018

explodingdog: Photo



TheSirensSound: At The Piano by Alex Tiuniaev

This is a collection of solo piano pieces inspired by the nighttime ambience and the starless skies of his hometown. Also, by Blade Runner and Erik Satie.

Jesse Moynihan: Star Revisions Part 1

I was so unhappy with my composition for the Star that I decided to try taking it in a radically different direction: accentuating the cosmological ideas of Athanasius Kircher, Robert Fludd, Dante Alighieri, etc. This was something I was trying to get at, but I was too shackled by the Marseille layout. These are depictions […]

Daniel Lemire's blog: Science and Technology links (September 15th, 2018)

  1. I was told repeatedly throughout my life that the normal body temperature was 37.5°C. This estimate is over a hundred years old and flawed. It is off by one degree: a better “normal” is 36.5°C.
  2. According to Malhotra et al., heart disease is a chronic inflammatory condition (meaning that it is related to a dysfunction of your immune system), not something caused by saturated fat clogging the arteries.
  3. Apple has released a watch with government-approved ECG (heart monitoring) capabilities.
  4. Could Alzheimer’s be an infectious disease?
  5. Drinking beer does not lead to weight gains in obese people.
  6. Boys tend to be both the lowest and the highest performers in terms of their reasoning abilities.
  7. Technological progress does not require better understanding; it may more often be the result of the accumulation of many small improvements. That is, technological progress is more about evolution than about science and knowledge gathering.
  8. Higher personal and corporate income taxes negatively affect the quantity, quality, and location of inventive activity.
  9. The latest iPhone processor (the A12) has 6.9 billion transistors.
  10. Many researchers publish at least one paper every five days. They are described as being hyperprolific. Several of them have published hundreds of articles in the same two journals. Some of them work under a system where the more you publish, the more you get paid.

    Einstein published about 300 papers in his (relatively long) life. These people publish as much as Einstein did every five years.

    To be fair, if Einstein were alive today and had access to the Internet and to computers, he might publish 300 papers a year.

The Shape of Code: Major players in evidence-based software engineering

Who are the major players in evidence-based software engineering?

How might ‘majorness’ of players be calculated? For me, the amount of interesting software engineering data they have made publicly available is the crucial factor. Any data published in a book, paper or report is enough to be considered interesting. How interesting is data published on a web page? This is a tough question, so let’s dodge it to start with, and consider the decades before the start of 2000.

In the academic world, performance is based on the number of papers published, the impact factor of where they were published, and the number of citations of those papers. This skews the results in favor of those with lots of students (who tack their advisor’s name on the end of papers published) and those who are good at marketing.

Historians of computing have primarily focused on the evolution of hardware and are slowly moving to discuss software (perhaps because microcomputers have wiped out nearly every hardware vendor). So we will have to wait perhaps a decade or two for a tentative/definitive answer from historians.

The 1950s

Computers and Automation is a criminally underused resource (a couple of PhDs’ worth of primary data here). A lot of the data is hardware related, but software gets a lot more than a passing mention.

The US military published lots of hardware data, but software did not get mentioned much.

The 1960s

Computers and Automation is still publishing.

The US military is still publishing data; again, mostly hardware related.

Datamation, a weekly news magazine, published a lot of substantial material on the software and hardware ecosystems as they evolved.

Kenneth Knight’s analysis of computer performance is an example of the kind of data analysis that many people undertook for hardware, which was rarely done for software.

The 1970s

The US military are still leading the way; we are in the time of Rome. Air Force officers studying for a Master’s degree publish more software engineering data than all academics combined over this and the next two decades.

“Data processing technology and economics” by Montgomery Phister is 720 A4 pages packed with graphs and tables of numbers. Despite citing earlier sources, it has become the primary source for a lot of subsequent researchers, which is understandable in a pre-internet age. Now we have Bitsavers and the Internet Archive, and the cited primary sources can be downloaded.

NASA’s output is surprisingly low volume.

The 1980s

Rome falls (i.e., the work gets outsourced to a university) and the false prophets (i.e., academics doing non-evidence-based work) multiply and prosper. There are hushed references to troublemakers performing unclean acts (experiments) in the wilderness.

A few people keep working in the wilderness, and the quantity of data being produced drops by at least an order of magnitude.

The 1990s

Enough time has passed for people to be able to refer to the wisdom of the ancients.

There are still people in the wilderness howling at the moon, and performing unclean acts (experiments).

The 2000s

Repositories of open source code and bug reports grow and prosper. Evidence-based software engineering research starts to become mainstream.

There are now groups of people doing software engineering research.

What about individuals as major players? A vaguely scientific way of rating individual impact on evidence-based software engineering is to count the number of papers they have published that are cited by a book claiming to discuss all the important/interesting publicly available software engineering data (code+data).

The 1,521 papers cited by such a book had 3,716 authors, of which 3,095 were distinct. The authors who appeared most often are listed below (count on the right, and yes, at number 2 is a theoretician; I have cited myself nine times, but two of those are to web sites hosting data).

Magne Jorgensen 17
Anne Chao 11
Dag I. K. Sjoberg 10
Massimiliano Di Penta 10
Ahmed E. Hassan 8
Christian Bird 8
Stanislas Dehaene 8
Giuliano Antoniol 7
Thomas Zimmermann 7
Alexander Serebrenik 6
Dror G. Feitelson 6
Gregorio Robles 6
Krzysztof Czarnecki 6
Lutz Prechelt 6
Victor R. Basili 6

The number of authors/papers follows the usual pattern of many people writing one paper.
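A minimal sketch of the tally described above (mine, not the book’s code+data; the author lists are placeholder values, not the real citation data), counting how often each author appears across the cited papers’ author lists:

(* tally.ml: count author occurrences across papers' author lists *)
let tally papers =
  let counts = Hashtbl.create 64 in
  List.iter
    (fun authors ->
      List.iter
        (fun a ->
          let n = try Hashtbl.find counts a with Not_found -> 0 in
          Hashtbl.replace counts a (n + 1))
        authors)
    papers;
  (* sort authors by descending paper count *)
  Hashtbl.fold (fun a n acc -> (a, n) :: acc) counts []
  |> List.sort (fun (_, n1) (_, n2) -> compare n2 n1)

let () =
  let papers = [ ["Magne Jorgensen"; "Dag I. K. Sjoberg"];
                 ["Magne Jorgensen"];
                 ["Anne Chao"] ] in
  List.iter (fun (a, n) -> Printf.printf "%s %d\n" a n) (tally papers)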

[Figure: number of evidence-based papers written per author]

Who might I have missed? The business school researchers don’t get a mention because their data is often covered by a confidentiality agreement. The machine learning crowd are just embarrassing.

Suggestions for major players welcome.

MattCha's Blog: 2015 Zheng Si Long Wa Long & Productive Qi



When you say “Wa Long” my mind thinks immediately of intensely sweet Yiwu material with basically no bitterness and miles of deliciousness…. Mmmmmm… I wonder if this Wa Long can satisfy my presumption about this growing area?  This Wa Long goes for $156.24 for a 400g cake, or $0.39/g.

The dry leaves smell of creamy, intensely sweet, woody Yiwu-ness.  The odour has a fruity cherry character to it as well as a candy-like sweet smell…

The first infusion delivers an intensely creamy, icing-sugar-sweet, fluffy cotton candy deliciousness.  There is a long, cool, faint menthol that hangs out mildly in the background so as not to disturb the intensity of the sweetness.  This is intensely and beautifully sweet stuff.

By the second infusion at least half of the sweetness has vaporized in its volatility, but there is still enough of it to go around.  The very distant wood note, like a rainforest, lingers throughout.  The mouthfeel is light and stimulates the edges of the tongue; it has a tingling feeling to it, not at all vacuous.

The third develops some dragon fruit and pear taste as a layer to its intense sweetness.  A very faint woodiness is almost completely overlooked under the draping, very distinct Yiwu sweetness.  The Qi is big in the head, very weighty and muddling, happy and energizing.  In the body it can be felt in the heart.

The fourth infusion has a fruitier than sweet onset- pear, plum, distant tropical.  There is a slight, almost sour/bitter wood taste underneath.  The mouthfeel is thin but slightly sticky on the tongue.  It’s more on the tongue than the throat.  The aftertaste is a continuation of sweet fruits.

The fifth almost has a pungent menthol initial taste which swells in the mouth and tongue.  There is a subtle woody, rainforest taste throughout.  It has a sweet bready yeasty finish indicating a few years of more humid storage.

The sixth infusion has a woody, plum, and slightly sour taste to it.  The tongue develops a chalky bitterness to it, which is mild.  The aftertaste is bready, fruity, woody and has a yam note in there as well.  Nice mild menthol finish, more fruitiness trails off.

The seventh has a woody plum and tropical-edged taste presenting initially.  The intensity of the first few infusions can’t be found any more, only a faint trace in the aftertaste.  What remains are classic Yiwu woody, plum, foresty tastes.  Slight bitter and sour but very faint.  A sweet bread finish in the mouth.

The eighth has more of a malty woody plumy fruitiness.  The tastes of this tea have some depth to them in the stimulating but mild tongue coating.  The throat only opens mildly to welcome these flavours in.  The Qi is heady, alerting, cloudy.  In the body it races the heart slightly and you can still feel it in the stomach.  It makes for a very productive day…. This is that Qi that makes you want to get stuff done.  It gives you a sort of clarity and focus but also a nice surge of energy especially mental energy.

The ninth is nicely woody and plum, an almost soapy sweetness with a ghostly edge of that intense sweetness, almost gone now as it lingers with fruits in the aftertaste.

Tenth has a nice deep mellow fruity woody Yiwu thing going on.  The fruit flavor is complex enough in the mouth.  Slight menthol lingers.  A good Yiwu profile, yummy!

The eleventh and twelfth are steeped a good 15 seconds longer than flash and much more tropical fruit is pushed out; the wood is mellow in the background now.  Tropical fruity with a menthol background.

The 13th & 14th are pushed longer and give a woody dryness with fruity edges, so very Yiwu.

This tea has such a wonderfully productive Qi to it.  Its effects leave the mind in a profoundly focused state.  I imagine I could easily have gotten a few more steeps out of this one, but instead I was way, way too busy getting stuff done!

Peace

MattCha's Blog: 2015 Zheng Si Long Yi Shan Mo


And back to my onslaught of Zheng Si Long samples.  I hope you have been enjoying my current deep plunge into Tea Encounter's puerh catalogue...

This is a new tea area for me, although it’s entirely possible I have tried something like this before… It seems like the Yi Shan Mo area is pretty far off the beaten path.  This is one of two 2015 Zheng Si Long cakes at Tea Encounter; this one sells for $156.24 for a 400g cake, or $0.39/g.

The dry leaf delivers deep rich pungent dried apricot odours in a heavy whiff of sweetness.

The first infusion has a watery floral, icing sugar-like opening, faint wood, then a cooling, returning finish, ending on a melon taste on the tongue.

The second infusion starts on a caramel note transitioning to a slightly pungent, dry woody taste.  The cool menthol is noticeable.  The mouthfeel is elegant and flows to the edges of the mouth and tongue.  The quaint throat sensation is deep, where the menthol flows.  Slight melon aftertaste.

The third infusion starts off as a dry woody, fruity melon taste; then the dry wood fades, the melon swells, and menthol arrives.  The mouthfeeling and throatfeeling are very gentle but nice.  There is a long melon and floral sweetness that lingers minutes later.

The fourth starts with significantly pungent, dry woody notes which give way to melon then menthol.  There is a prominent sweet-sour, almost mango-like taste in there as well.  A long melon and floral note lingers along with wood.

The fifth infusion is much the same as the fourth but much more pronounced.  The mouthfeel becomes thicker and stickier in the mouth.  With no bitterness around, there is an almost cloying melon sweetness while the woody taste is becoming more pronounced.

The sixth continues to build slowly into a slightly syrupy, woody melon fruit taste with a long camphor wood cooling finish.  The mouth- and throatfeel slowly build and become sticky, almost dry.  The finish is long melon and wood.  The aftertaste is quite nice.

The seventh infusion is more menthol and pungent now.  That has to be the dominant flavour.  The pungent taste runs throughout, and it’s long.  Initially it shares space with wood and slight sweetness; the aftertaste carries melon, sweetness, slight honey, and florals.

The eighth infusion has an interesting soapy floral taste as the dominant taste now.  It tastes like Thrills gum.  A sticky, almost grapey taste is there in the mouth as well.

The ninth shares this interesting taste.  There are woods and florals and sweet melon fruits in there as well, with suggestions of a faint cotton candy sweetness under the fruit sweetness.  A nice sticky mouth coating.  The Qi of this tea is very mellow; it strolls throughout the body without much fuss.

The tenth has a woody melon initial taste with a distinct menthol underpinning.  A long floral melon stays on in the aftertaste.  The eleventh has a more powerful creamy cotton candy sweetness compared to previous infusions.  This taste seems to be the dominant one now.

The twelfth has a more pronounced fruity sweetness thing going on with the cotton candy floss underneath.  Layers of light nuanced sweetness. Faint wood. No bitterness.

The 13th and 14th are given ten seconds over flash infusions and offer a dense, thicker broth of fruits and thicker florals, a long creamy sweetness, light menthol.  The mouthfeel is nice but not standoffish.  This tea has great stamina of flavour and gives off a lot of depth when pushed a bit more.

I push it a bit harder to 30-90 seconds and it gives off thick, fruity, creamy menthol wood tastes.  Nice.  I feel a bit bad about stopping this session at 18 infusions because I think it has a lot more to offer- great stamina.  It’s getting late in the afternoon and I have no other choice, lest I be up too late.  I put it through a few more days of overnight steeping and the result is very thick fruity tastes.

This puerh has the Yiwu fruitiness and woodiness, with more of a border-tea melon, almost green tea-like taste at times.  The flavours are very nice; this is a flavour tea to me, with a certain thickness to it.  This Yi Shan Mo is a slow-moving puerh with lots of stamina.  The Qi sensation is very mild in both the body and mind.

Peace

Charles Petzold: Retirement and Realignment

Effective today, I have resigned my employment at Microsoft, concluding an engaging and delightful 4½ years as part of the Xamarin documentation team. I will miss my co-workers immensely, and I hope to keep in touch with them on Facebook.

... more ...

CreativeApplications.Net: Face Trade – Art vending machine that trades mugshots for “free” portraits

Face Trade – Art vending machine that trades mugshots for “free” portraits
Face Trade is an Art Vending Machine created by Matthias Dörfelt that dispenses unique prints of computer generated face drawings. Instead of paying with money, buyers trade a mugshot that is taken on the spot in order to be permanently stored in the Ethereum Blockchain, consequently turning the transaction into a semi-permanent Face Swap.

Jesse Moynihan: Hermit Revised

High-resolution link HERE. Earliest versions of the Hermit depicted him as Father Time, or Saturn and “the triumph of Death” as Gertrude Moakley puts it. What was hiding inside the Fool’s body is now cloaking the Hermit. The Hermit is clothed in the unknown, and now illuminates what is within, from high up on […]

explodingdog: Photo



OCaml Planet: Continuous Benchmarking & Call for Benchmarks

Over the past few weeks, at OCaml Labs, we’ve deployed continuous benchmarking infrastructure for Multicore OCaml. Live results are available at http://ocamllabs.io/multicore. Continuous benchmarking has already enabled us to make informed decisions about the impact of our changes, and should come in handy over the next few months as we polish and tune the multicore runtime.

Currently, the benchmarks are all single-threaded and run on x86-64. Our current aim is to quantify the performance impact of running single-threaded OCaml programs using the multicore compiler. Moving forward, this would include multi-threaded benchmarks and other architectures.

The benchmarks and the benchmarking infrastructure were adapted from OCamlPro’s benchmark suite aimed at benchmarking Flambda optimisation passes. The difference with the new infrastructure is that all the data is generated as static HTML and CSV files with data processing performed on the client side in JavaScript. I find the new setup easier to manage and deploy.

Quality of benchmarks

If you observe the results, you will see that multicore is slower than trunk OCaml on menhir-standard and menhir-fancy. But if you look closely:

[Figure: binary tree benchmark timings]

these benchmarks complete in less than 10 milliseconds. This is not enough time to faithfully compare the implementations, as constant factors such as runtime initialisation and the cost of a single untimely major GC dominate any useful work. In fact, almost half of the benchmarks complete within a second. The quality of this benchmark suite ought to be improved.
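To see why such short runs mislead, here is a minimal sketch (mine, not part of the benchmark suite) that times the same small workload several times in one process; whether a major GC happens to fire inside a given run can shift its time by more than the cost of the work itself:

(* noise.ml -- illustrative only.
   Build with: ocamlfind ocamlopt -package unix -linkpkg noise.ml *)
let time_ms f =
  let t0 = Unix.gettimeofday () in
  f ();
  (Unix.gettimeofday () -. t0) *. 1000.

(* An allocation-heavy workload in the low-millisecond range. *)
let work () =
  let r = ref [] in
  for i = 1 to 200_000 do r := i :: !r done;
  ignore !r

let () =
  (* Repeated timings of identical work vary noticeably at this scale. *)
  for _ = 1 to 5 do Printf.printf "%.3f ms\n" (time_ms work) done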

Call for benchmarks

While we want longer running benchmarks, we would also like those benchmarks to represent real OCaml programs found in the wild. If you have long running real OCaml programs, please consider adding them to the benchmark suite. Your contribution will ensure that performance-oriented OCaml features such as multicore and flambda are evaluated on representative OCaml programs.

How to contribute

Make a PR to the multicore branch of ocamllabs/ocamlbench-repo. The packages directory contains many examples of how to prepare programs for benchmarking. Among these, numerical-analysis-bench and menhir-bench are simple and illustrative.
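By way of illustration only (my sketch, not one of the packaged benchmarks), a good candidate is a self-contained executable that runs for seconds rather than milliseconds, with the problem size settable from the command line:

(* fib_bench.ml -- hypothetical benchmark candidate: stdlib only,
   no ppx and no C-stub dependencies, so it should build on both
   trunk and multicore compilers. *)
let rec fib n = if n < 2 then n else fib (n - 1) + fib (n - 2)

let () =
  let n = if Array.length Sys.argv > 1 then int_of_string Sys.argv.(1) else 40 in
  Printf.printf "fib %d = %d\n" n (fib n)

Keeping to the stdlib also sidesteps the compatibility caveats listed below.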

The benchmarks themselves are run using these scripts.

Dockerfile

There is a handy Dockerfile to test the benchmarking setup:

$ docker build -t multicore-cb -f Dockerfile . #takes a while; grab a coffee

This builds the docker image for the benchmarking infrastructure. You can run the benchmarks as:

$ docker run -p 8080:8080 -it multicore-cb bash
$ cd ~/ocamlbench-scripts
$ ./run-bench.sh --nowait --lazy #takes a while; grab lunch

You can view the results by:

$ cd ~/logs/operf
$ python -m SimpleHTTPServer 8080

Now on your host machine, point your browser to localhost:8080 to interactively visualise the benchmark results.

Caveats

Aim to get your benchmark compiling with OCaml 4.06.1. You might have trouble getting your benchmark to compile with the multicore compiler for several reasons:

  • The multicore compiler has syntax extensions for algebraic effect handlers, which break packages that use ppx.
  • The multicore compiler has a different C API, which breaks core dependencies such as Lwt.
  • Certain features such as marshalling closures and custom tag objects are unimplemented.

If you encounter trouble submitting benchmarks, please make an issue on the kayceesrk/ocamlbench-scripts repo.

Michael Geist: Compromising on Culture?: Why a Blanket Culture Exception in NAFTA is Unnecessary

As the NAFTA negotiations continue to inch along, one of the remaining contentious issues is the inclusion of a full cultural exception that would largely exclude the Canadian culture industries from the ambit of the agreement. The government has not been shy about speaking out against compromising on culture, noting the perceived risks of provisions that might permit foreign ownership of media organizations. Indeed, the culture issue has attracted considerable attention, with coverage pointing to media ownership rules and simultaneous substitution policies as hot button concerns. Yet as cultural groups cheer on the government’s insistence that cultural policy should be taken off the NAFTA table, the reality is that there remains plenty of room for compromise. This post focuses on three of the biggest issues: foreign ownership, simultaneous substitution, and the TPP culture exceptions.

Foreign Ownership of Media Organizations

For Prime Minister Justin Trudeau, the worst case scenario is the prospect of U.S. networks taking over the Canadian market. Last week, he stated “it is inconceivable to any Canadian that an American network might buy Canadian media affiliates, whether it is a newspaper or television stations or TV networks.” Yet beyond the fact that Canadian networks often act as little more than U.S. affiliates by retransmitting their programs with Canadian commercials, and that networks such as Fox News are already available on Canadian cable television packages, the fears associated with foreign ownership of broadcasters are largely overblown, as the connection between Canadian broadcasting ownership and Canadian culture is tenuous at best. In fact, those arguing that foreign ownership imperils Canadian content and culture through lost regulation are often the same groups who maintain that foreign online video services such as Netflix should be fully regulated in Canada.

Canadian law currently features both foreign ownership restrictions and content requirements. The foreign ownership rules generally limit licensees to 20 percent foreign ownership (up to 33 percent for a holding company). This covers all types of broadcasters including television, radio, and broadcast distributors. Many observers appear to assume that Canadian ownership and content requirements go hand-in-hand, fearing that a foreign owned broadcaster would be less likely to comply with Canadian content requirements. Yet there is little reason to believe this to be so.

The Canadian Radio-television and Telecommunications Commission’s active involvement in setting Canadian content requirements is a direct result of Canadian-owned broadcasters regularly seeking to limit the amount of Canadian content they are required to broadcast. Producing original Canadian content is simply more expensive than licensing foreign (largely U.S.) content. These fiscal realities – and the regulations that have arisen as a response – remain true regardless of the nationality of the broadcaster.

Foreign owned businesses face Canadian-specific regulations all the time – provincial regulations, tax laws, environmental rules, or financial reporting – and there is little evidence that Canadian businesses are more likely to comply with the law than foreign operators. Cultural businesses may raise particular sensitivities, but broadcasters that are dependent upon licensing from a national regulator can ill-afford to put that licence at risk by violating its terms or national law.

In fact, a review of other developed countries reveals that many have eliminated foreign ownership restrictions in their broadcasting sector but retain local content requirements. For example, Australia has no foreign ownership restrictions on broadcasters (Canwest was once the majority owner of one of its television networks), yet employs a wide range of local content requirements. The same is true in many European countries – Germany has eliminated foreign ownership restrictions but retains daily regional programming requirements, Ireland has no foreign ownership restrictions but establishes programming requirements for each broadcast licensee, and the Czech Republic has dropped its foreign ownership restrictions but relies on broadcasting licences to mandate local programming.  In other words, opening the door to greater foreign ownership of media assets does not mean that Cancon regulations will be lost in the process.

Simultaneous Substitution

The issues associated with culture and NAFTA are not limited to foreign ownership. The U.S. has also oddly been arguing both for and against simultaneous substitution: for it in the context of the Super Bowl broadcast, and against it with respect to compensation for U.S. stations situated near the border. Canada is unlikely to change simultaneous substitution via a trade deal, but the policy is nevertheless steadily declining in importance.

The growth of specialty channels, which now represent a far bigger slice of the broadcasting revenue pie than conventional channels, heralded the decreasing importance of simultaneous substitution, with fewer programs substituted and subscription revenue surpassing conventional television advertising revenue. Moreover, consumers’ increasing control over what they watch and when they watch it contributes to its declining importance. Recording television shows or watching them on demand eliminates the simultaneous substitution issue. Sports leagues now package their seasons for full streaming, and many watch streamed versions of shows directly from broadcasters or through services like Netflix, Amazon and CraveTV. With the arrival of even more streaming options – including U.S. broadcasters such as CBS – simultaneous substitution matters less and less every year.

Not only has the relevance of simultaneous substitution declined in recent years, but the policy has arguably harmed the long-term success of the Canadian system. It effectively trades some additional revenue for loss of control over the Canadian programming schedule and turns the Canadian system into a country-wide U.S. affiliate with hundreds of millions of dollars spent on the rights to non-Canadian programming. The CRTC recognized several years ago that eliminating simultaneous substitution altogether would still create a shock to the system. Limiting the elimination to the Super Bowl had the practical benefit of starting to move the industry off the addiction to U.S. programming and toward competition rather than regulatory protection. The Canadian policy will continue to evolve (including through the courts), but resolving the issue through NAFTA makes little sense since the U.S. is arguing both for and against the practice.

TPP Culture Provisions

While these issues have captured the headlines, the more obvious target for the U.S. is the provisions found in the TPP, which Canadian officials sidelined in side letters with the remaining TPP countries once the U.S. dropped out. The TPP provision stated:

Canada reserves the right to adopt or maintain any measure that affects cultural industries and that has the objective of supporting, directly or indirectly, the creation, development or accessibility of Canadian artistic expression or content, except:

a) discriminatory requirements on services suppliers or investors to make financial contributions for Canadian content development; and 

b) measures restricting the access to on-line foreign audiovisual content. 


In other words, the full culture exception was limited in two ways that could resurface as part of the NAFTA negotiations. Canada’s starting position is that it opposes any exceptions to a full cultural exception, but neither TPP exception was bad policy on its own (whether any of this belongs in a trade deal is open to debate, however).

The first – a ban on discriminatory requirements to support Cancon development – raises a legitimate concern about the possibility of mandated Cancon payments by foreign providers. While Canadian groups have actively lobbied to require foreign providers such as Netflix to make payments similar to those paid by Canadian broadcasters and broadcast distributors, they have been less supportive of Netflix benefiting from those Cancon funding mechanisms. Payment mandates without the same benefits would likely (and rightly) be viewed as discriminatory. The TPP would have blocked such an approach. The government maintains it opposes a Netflix tax. While there is a healthy debate about whether there should be mandated payments (I’ve written here about the uneven playing field that grants significant advantages to Canadians), asking for equal treatment should a mandated payment system be established seems fair.

The second exception – a ban on instituting restrictions to foreign online video services such as Netflix – would be a non-starter in Canada. Given its commitment to net neutrality – the principle of treating all content and applications equally – the government is unlikely to require Internet providers to block access to foreign services. Yet in the name of the cultural exception, the Canadian government is arguing that it cannot even agree to a no-blocking mandate. Reversing on that issue – as with several others – would have no practical policy cost to Canada, would allow Canada to focus on other digital policy issues such as copyright term (which would have a far more significant impact on access to Canadian culture), and would not put Canadian culture or identity at risk.

The post Compromising on Culture?: Why a Blanket Culture Exception in NAFTA is Unnecessary appeared first on Michael Geist.

MattCha's Blog: Making Sense of Your Tea Drinking


I recently read a comment on TeaDB that made me reflect on how to think about my puerh drinking.  In the comment section of this article on justifying the purchase of shu puerh, James places his tea drinking into logical categories, with a rationale for which teas make the most sense in each category.  He states:

For me tea drinking falls into three basic categories. (1) Casual brews I drink/make for my wife. (2) Teas I drink gong-fu throughout the day. (3) Teas I drink with other people.

Ripe pu’erh tends to do very well in category 1 and depending on the audience category 3. It doesn’t make sense for me to be brewing something fancy for category 1 and for whatever reason I just about never want to drink ripe as my gong-fu session for the day. That just leaves category 3, and I’m not sure I’m at the point where I can justify fancier boutique ripe sheerly to serve guests. I’ll admit to having considered but I’m not quite there for myself. I also certainly wouldn’t fault the person who chooses to buy it.

For me, even a few years ago, my tea drinking was very, very different, but for the last year or two it has been pretty consistent, mainly due to stable life circumstances.  My tea drinking falls into (1) morning gongfu I drink/make for my wife and family, (2) stored productions that I bring out of storage to drink with my family on rare occasions, (3) teas I drink with other people, (4) everyday drinkers I one-cup steep at work, and (5) better teas I gongfu at work.

Over the last while category 1 tends to be aged sheng of increasingly decent quality, but it can also include shu puerh, Korean Balhyocha, or Oolong.  My children regularly drink tea with us, so I make sure it is of a certain base level of quality.  My wife will not tolerate anything overly harsh or unusual, and if it’s sheng, it had better be aged.  She has an increasingly discerning palate when I’m gongfu brewing.

Category 2 tends to be sheng puerh that I have in lesser quantities and am trying to hold on to: expensive or cheap, old or young.  Usually it has some quality of rareness to it, preventing me from putting a cake into my regular rotation and drinking through it on a day-to-day basis.  It also has some level of quality to it; otherwise I would just drink through the cake in Category 4.

Category 3 tends to be similar to category 2 but is sometimes Darjeeling, which my wife also enjoys but which I rarely consume these days.

Category 4 tends to be a lot of factory sheng that I have acquired over the last year.  If I’m simply looking for caffeine after lunch and my day is too busy to deeply appreciate such things, it could be some lesser quality sheng that I have a sample cake of, or some cheaper Menghai Factory stuff.  If I’m feeling like something of better quality, I go up the quality ladder without hesitation.  I will even drink fresh sheng samples at work.

Category 5 tends to be nicer aged sheng or samples that I can spend some time with, enjoy, and often blog or write about.

Anyways, I think that categorizing your tea drinking, alongside measuring your consumption, is another way to help guide your future purchases.  This is especially true if you consider yourself more of a puerh drinker than a collector.

In my case, I have amassed enough tea over the past year and from years before to satisfy categories 2-4.  So right now my buying is focused more on high quality drinkers that satisfy Category 1, and maybe some more special stuff that satisfies Categories 2 & 5.  Currently, my generous onslaught of samples is satisfying these categories nicely without me dipping into my stored cakes.  I am also wondering if I should take the plunge into buying more shu puerh.  I really prefer sheng, but my wife also enjoys shu and doesn’t really pay too much attention to whether it’s sheng or shu anyway, as long as it’s good.  I also feel that my purchasing is slowing down because I have enough to last me many, many years.


I hope that this reflection has helped you evaluate your own drinking needs.  I wonder what your drinking categories are and how that influences your purchasing, if at all?

Hummmm…. Something to meditate on…

Peace

Tea Masters: The Taiwan of the aboriginal tribes, and a link with OB tea


What is this fall 2010 Fan Zhuang Oriental Beauty Oolong? And what is Fan Zhuang? "Actually, Fan Zhuang is a place in the Hsinchu countryside, where these leaves come from. If the name sounds strange in Chinese, it's because it is an aboriginal place name," the Hakka producer of this Oriental Beauty explained to me.

"My ancestors settled on land where plains aborigines (Pingpu) lived." Long ago, before the immigration of Chinese from Fujian and Chaozhou, before roughly 1600, Taiwan was an island populated mostly by tribes originating from the Pacific.

On the map below, you can see how these many tribes were distributed:
With the arrival of Chinese immigrants, better armed and more technically advanced, these aboriginal tribes mostly took refuge in the mountains and on the east coast, or else assimilated. The image below (taken at the aboriginal cultural centre near Sun Moon Lake) shows that these aborigines had warrior traditions, and it was not with a light heart that they accepted this cohabitation!
Nowadays, these tribes are entirely pacified and often Christian, because Western missionaries found them easier to convert than the Chinese! Their cultural contributions are remembered above all through the patterns of their textiles. Here is something that would make a beautiful Chabu:

Their pottery is also interesting, because it takes us back to the pottery of late prehistory.

But it is above all through their dances and songs that the aborigines distinguish themselves in Taiwan. Countless popular singers first learned to sing in their tribe, with the mountains as an echo chamber.

And if there is one thing we can learn from them, it is their sense of celebration! Even at the height of a downpour they kept dancing, during a show presenting the traditional dances and costumes of different tribes of Formosa:
Theirs is a culture of living in the moment, without projecting too far into the future. That is a strength for appreciating the present, but a weakness for tea culture (in the literal sense of cultivation)! The Taiwanese government sometimes tries to help these aboriginal communities, which are often poorer than average because they are very rural. With high mountain tea, they were well placed to reap some profit. But culturally it has rarely worked, because tea plantations take at least 3 years to start producing a little tea. That horizon is too distant for them, and all attempts to turn aborigines into tea growers have ended in failure for this reason. Growing rice, and cereals in general, does not demand as much patience and long-term planning.
According to the marshmallow test, it is the capacity to defer gratification that is the best predictor of a child's future success: managing to give up a little now in order to receive more later. It is the basis of our whole long education: rather than starting work at 15 or 16, when we begin to have an adult's strength, we study in high school and beyond without producing anything tangible, in order to acquire more rewarding and better-rewarded knowledge. And throughout our careers, we continue to invest part of our time in training.
Planting trees, vines, or tea shows a capacity to think long term. We find the same quality in the aging of certain teas, and in knowing how to wait until a tea has reached its peak before drinking it. That is exactly the case with this fall 2010 Oriental Beauty. Well roasted 8 years ago, this tea has had time to refine itself. Its scents already display smooth notes of brandy, malt, and molasses, but also of ripe fruit!
This Chaxi is thus a concentrate of civilization. It rests on a product that demands patience in its cultivation. It demanded patience and apprenticeship from its producer, who learned his technique from his father. It took 8 years to improve. And the making of the accessories, in porcelain, Yixing zisha, tin, cast iron, and textile, drew on many kinds of know-how... And finally, the brewing of this tea also rests on 15 years of lessons and as many years of practice.
To enjoy tea more fully, you must spend time learning it, but you must also manage to live in the moment and savour each cup as if tomorrow did not exist!

new shelton wet/dry: ‘I am not young enough to know everything.’ –Oscar Wilde

Knowing yourself requires knowing not just what you are like in general (trait self-knowledge), but also how your personality fluctuates from moment to moment (state self-knowledge). We examined this latter form of self-knowledge. […] People had self-insight into their momentary extraversion, conscientiousness, and likely neuroticism, suggesting that people can accurately detect fluctuations in some aspects of [...]

OCaml Weekly News: OCaml Weekly News, 11 Sep 2018

  1. callipyge 0.2 and eqaf 0.1
  2. Be Sport is hiring (engineers, interns)
  3. Sedlex moved to ocaml-community
  4. An implementation of the Noise Protocol Framework
  5. Release of Bindlib 5.0
  6. Ocaml Github Pull Requests

The Shape of Code: Business school research in software engineering is some of the best

There is a group of software engineering researchers who don’t feature as often as I would like in my evidence-based software engineering book: academics working in business schools.

Business school academics have written some of the best papers I have read on software engineering; the catch is that the data they use is confidential. For somebody writing a book that only discusses a topic if there is data publicly available, this is a problem.

These business school researchers show that it is possible for academics to obtain ‘interesting’ software engineering data from industry. My experience with talking to researchers in computing departments is that most are too involved in their own algorithmic bubble to want to talk to anybody else.

One big difference between the data analysis papers written by academics in computing departments and those from business schools is statistical sophistication. Computing papers are still using stone-age, pre-computer-age techniques; the business papers use a wide range of sophisticated techniques (sometimes cutting edge).

There is one aspect of software engineering papers written by business school researchers that grates with me: many of the authors obviously don’t understand software engineering from a developer’s perspective; but then, they are business oriented people.

The person who has done the largest amount of interesting software engineering research, whose work I don’t (yet; I will find a way) discuss, is Chris Kemerer: a researcher with a long list of empirical papers going back to the late 1980s, who rarely gets cited in papers by people in computing departments (I am the only person I know who limits themself to papers where the data is publicly available).

TheSirensSound: New album "Smooth Sailing" by Mike Pace and the Child Actors

The first thing you notice about Smooth Sailing, Mike Pace and the Child Actors' new LP, is that it gleams. Picture a studio stuffed with synth whizzes, session bassists, forty or fifty world-class audio engineers. Picture some label accountant rubbing his temples, grilling a Child Actor over some outrageous line item ("ten thousand dollars for 'vibe maintenance'??"). Picture, of course, the man himself, Mike Pace: stomping around in a speedo and bathrobe, refusing sleep, verbally abusing children, sinking periodically into morose funks, instantaneously emerging from those funks with gnomic yet emotionally lucid career highlights like this album's "Troubleshooting," etc.

The reality is, in its way, even more outlandish. In the years since Pace adopted his Child Actors moniker and released Best Boy, he's had no fewer than two children, acquired a mortgage, and settled fully into a consuming job in production music. Smooth Sailing, then, was written and recorded in the cracks of a full and meaningful life: in those minutes or hours most of us use to watch bad TV, or stare blankly into the middle distance. And yet in terms of scope and lushness of sound, and in the way it updates and personalizes a whole slew of classic rock reference points, it stands with the best of War on Drugs or Father John Misty. Like those guys, Pace is first and foremost a nerd, the good kind: someone who cares passionately and unpretentiously about something most people never think about, specifically progressive rock and big-tent singer-songwriter stuff from the 1970s, and puts that care to productive artistic use.

On some level Smooth Sailing is its own classic rock radio station, diverse enough to appeal to a whole jammed freeway's worth of commuters. Some might prefer the Randy Newman/10cc-style "Senior Statesman" (one of Pace's full-fledged story songs, which some enterprising movie producer should option ASAP), others the perfect power-pop of "Blaster" (think Sugar, or Matthew Sweet). Undoubtedly some will cry right there in their cars to "Disconnected Heart," a ballad so beautiful you could picture a Xanax-addicted SoundCloud rapper sampling it. I personally love "Americana Manhasset"—a pink-sunset ambient-instrumental track which harkens back to at least four imagined pasts, only one of which I lived through. (Credit goes as well to Matt LeMay, the producer/multi-instrumentalist who embellished, shaped and mixed each of the songs on Smooth Sailing.)

If you've ever listened to Pace's music you know this already, but just to be clear: this is no kind of bloodless genre exercise. As always with Pace, the cherished albums are all mixed up with the memories of those cherished albums, and with the memories those albums soundtracked, so that the result—filtered through Pace's well-established interest in nostalgia, time's passing, etc.—is on the one hand new and idiosyncratically Pace-imprinted and, on the other, familiar and comforting and kind of pleasantly sad—pop sad.

This stuff might not sound much like Mike's last band, Oxford Collapse—possibly New York's last great indie rock band, before the whole operation shipped over to Philadelphia—but it definitely feels like Oxford Collapse, because all of Pace's songs yearn in this totally unique way. And as ever these songs are set in places built for yearning: beach towns, high school hallways, commuter trains. The yearning has something to do with growing up, with putting away childish things. A song like "Escape the Noise," with lyrics about giving up on guitars and "ragged nights," has a ton of parallels in Pace's discography, but this one's his best—for many reasons, but especially because we now know for sure that he doesn't actually mean it—that he'll be writing about this stuff for a long while to come. - Daniel Kolitz


TheSirensSound: New Album "Nina" by Jonathan Lear

"Nina" is an instrumental math rock journey. Although the few lyrics on the album give away very little, conceptually speaking, the record is in fact Jonathan's attempt to keep up with his landlord's dog: an utterly gigantic, supremely energetic, deaf pit bull / Dalmatian mix named Nina, who he is tasked with walking, feeding, and occasionally babysitting. Long story short, Jonathan peers into Nina's soul and realizes that there is a fully-formed human being inside, and this record is his attempt to capture that humanity in musical form. In doing so, Jonathan realizes that is actually dogs that make us human.

The Geomblog: Hello World: A short review

A short review of Hannah Fry's new book 'Hello World'

Starting with Cathy O'Neil's Weapons of Math Destruction, there's been an onslaught of books sounding the alarm about the use of algorithms in daily life. My Amazon list that collects these together is even called 'Woke CS'. These are all excellent books, calling out the racial, gender, and class inequalities that algorithmic decision-making can and does exacerbate, and the role of Silicon Valley in perpetuating these biases.

Hannah Fry's new book "Hello World" is not in this category. Not exactly, anyway. Her take is informative as well as cautionary. Her book is as much an explainer of how algorithms get used in contexts ranging from justice to medicine to art as it is a reflection on what this algorithmically enabled world will look like from a human perspective.

And in that sense it's a far more optimistic take on our current moment than I've read in a long time. In a way it's a relief: I've been mired for so long in the trenches of bias and discrimination, looking at the depressing and horrific ways in which algorithms are used as tools of oppression, that it can be hard to remember that I'm a computer scientist for a reason: I actually do marvel at and love the idea of computation as a metaphor, as a tool, and ultimately as a way to (dare I say it) do good in the world.

The book is structured around concepts (Power, data) and domains (justice, medicine, cars, crime and art). After an initial explainer on how algorithms function (and also how models are trained using machine learning), and how data is used to fuel these algorithms, she very quickly gets into specific case studies of both the good and the bad in algorithmically mediated decision making. Many of the case studies are from the UK and were unknown to me before this book. I quite liked that: it's easy to focus solely on examples in the US, but the use (and misuse) of algorithms is global (Vidushi Mardia's article on AI policy in India has similar locally-sourced examples).

If you're a layman looking to get a general sense of how algorithms tend to show up in decision-making systems, how they hold out hope for a better way of solving problems, and where they might go wrong, this is a great book. It uses a minimum of jargon, while still being willing to wade into the muck of false positives and false negatives in a very nice illustrative example in the section on recidivism prediction and COMPAS, and also attempting to welcome the reader into the "Church of Bayes".
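
To make the false-positive/false-negative point concrete, here is a minimal sketch of the kind of base-rate calculation that discussion turns on. All numbers are hypothetical, chosen for illustration; they are not COMPAS's actual performance figures, and the code is not from the book.

    # Illustrative base-rate arithmetic for a risk-scoring tool.
    # All figures below are hypothetical, not COMPAS's actual numbers.

    def positive_predictive_value(base_rate: float, sensitivity: float,
                                  specificity: float) -> float:
        """P(actually reoffends | flagged high-risk), via Bayes' rule."""
        true_positives = base_rate * sensitivity
        false_positives = (1 - base_rate) * (1 - specificity)
        return true_positives / (true_positives + false_positives)

    # Suppose 30% of defendants reoffend, and the tool flags 70% of those
    # who do while correctly clearing 70% of those who don't (assumed).
    ppv = positive_predictive_value(base_rate=0.30, sensitivity=0.70,
                                    specificity=0.70)
    print(f"Chance a 'high-risk' flag is correct: {ppv:.0%}")  # prints 50%

Even a tool that is "right" 70% of the time in both directions produces high-risk flags that are wrong half the time at this base rate, which is exactly the trap the "Church of Bayes" framing warns the reader against.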

If you're a researcher in algorithmic fairness, like me, you start seeing the deeper references as well. Dr. Fry alludes to many of the larger governance issues around algorithmic decision making that we're wrestling with now in the FAT* community. Are there better ways to integrate automated and human decision-making that take advantage of what we are good at? What happens when the systems we build start to change the world around them? Who gets to decide (and how) what level of error in a system is tolerable, and who might be affected by it? As a researcher, I wish she had called out these issues a little more, and there are places where issues she raises in the book have already been addressed (and in some cases, answered) by researchers.

While the book covers a number of different areas where algorithms might be taking hold, it takes very different perspectives on the appropriateness of algorithmic decision-making in these domains. Dr. Fry is very clear (and rightly so) that criminal justice is one place where we need very strong checks and balances before we can countenance the use of any kind of algorithmic decision-making. But I feel that maybe she lets the medical profession off a little easy in the chapter on medicine. While I agree that biology is complex enough that ML-assistance might lead us to amazing new discoveries, I think some caution is needed, especially since there's ample evidence that the benefits of AI in medicine might only accrue to the (mostly white) populations that dominate clinical trials.

Similarly, the discussion of creativity in art and what it means for an algorithm to be creative is fascinating. The argument Dr. Fry arrives at is that art is fundamentally human in how it exists in transmission -- from artist to audience -- and that art cannot be arrived at "by accident" via data science. It's a bold claim, and of a kind with many claims about the essential humanness of certain activities that have been pulverized by advances in AI. Even so, I find it very appealing to posit that art is, by definition, a human endeavour.

But why not extend the same courtesy to the understanding of human behavior or biology? Algorithms in criminal justice are predicated on the belief that we can predict human behavior and how our interventions might change it. We expect that algorithms can pierce the mysterious veil of biology, revealing secrets about how our body works. And yet the book argues not that these systems are fundamentally flawed, but that precisely because of their effectiveness they need governance. I for one am a lot more skeptical about the basic premise that algorithms can predict behavior to any useful degree beyond the aggregate (and perhaps Hari Seldon might agree with me).

Separately, I found it not a little ironic, in a time when Facebook is constantly being yanked before the US Congress, Cambridge Analytica might have swayed US elections and the Brexit vote, and YouTube is a dumpster fire of extreme recommendations, that I'd read a line like "Similarity works perfectly well for recommendation engines" in the context of computer-generated art.

The book arrives at a conclusion that I feel is JUST RIGHT. To wit, algorithms are not authorities, and we should be skeptical of how they work. And even when they might work, the issues of governance around them are formidable. But we should not run away from the potential of algorithms to truly help us, and we should be trying to frame the problem away from the binary of "algorithms good, humans bad" or "humans good, algorithms bad" and towards a deeper investigation of how human and machine can work together. I cannot read
Imagine that, rather than exclusively focusing our attention on designing our algorithms to adhere to some impossible standard of perfect fairness, we instead designed them to facilitate redress when they inevitably erred; that we put as much time and effort into ensuring that automatic systems were as easy to challenge as they are to implement.
without wanting to stand up and shout "HUZZAH!!!". (To be honest, I could quote the entire conclusions chapter here and I'd still be shouting "HUZZAH").

It's a good book. Go out and buy it - you won't regret it.

This review refers to an advance copy of the book, not the released hardcover. The advance copy had a glitch where a fragment of LaTeX math remained uncompiled. This only made me happier to read it.

s mazuk: talesfromweirdland: Covers by Kazuaki Saitō for Japanese sci-fi magazine, SF. 1970s.


Better Embedded System SW: Different types of risk analysis: ALARP, GAMAB, MEM and more

When we talk about how much risk is enough, it is common to do things like compare the risk to current systems, or argue about whether something is more (or less) likely than events such as being killed by lightning. There are established ways to think about this topic, each with tradeoffs.

[Image: Tightrope Walker]

The next time you need to think about how much risk is appropriate in a safety-critical system, try these existing approaches on for size instead of making up something on your own:

ALARP: "As Low As Reasonably Practicable"  Some risks are acceptable. Some are unacceptable. Some are worth taking in exchange for benefit, but if that is done the risk must be reduced to be ALARP.

GAMAB: "Globalement Au Moins Aussi Bon"  Offer a level of risk at least as good as the risk offered by an equivalent existing system. (i.e., no more dangerous than what we have already for a similar function)

MEM: "Minimum Endogenous Mortality"  The technical system must not create a significant risk compared to globally existing risks. For example, this should cause a minimal increase in overall death rates compared to the existing population death rates.

MGS: "Mindestens Gleiche Sicherheit"   (At least the same level of safety) Deviations from accepted practices must be supported by an explicit safety argument showing at least the same level of safety. This is more about waivers than whole-system evaluation.

NMAU: "Nicht Mehr Als Unvermeidbar"  (Not more than unavoidable)  Assuming there is a public benefit to the operation of the system, hazards should be avoided by reasonable safety measures implemented with reasonable cost.

Each of these approaches has pros and cons.  The above terms were paraphrased from this nice discussion:
Kron, On the evaluation of risk acceptance principles,
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.455.4506&rep=rep1&type=pdf

There is an interesting set of slides that covers similar ground and works some examples; in particular, the graphs showing whether risks are taken voluntarily in different scenarios are thought-provoking:
http://agse3.informatik.uni-kl.de/teaching/suze/ws2014/material/folien/SRES_03_Risk_Acceptance.pdf

In general, if you want to dig deeper into this area, a search on
    gamab mem alarp
will bring up a number of hits.

Also note that legal and other types of considerations exist, especially regarding product liability.

churchturing.org / 2018-09-24T13:24:46