ScreenAnarchy: Tribeca 2017 VR Roundup: A Killer New Crop of VR Debuts

It's unclear if it's the New York setting or where it falls on the calendar, but Tribeca has emerged as one of the very best places to debut new VR and 360 content, and the 2017 program is no exception. Undoubtedly the hard work of the VR programming team is a big part of this, and they've certainly outdone themselves, with world premieres ranging from many of the top VR content creators like Penrose and Baobab Studios to very interesting indie artists. We've had the pleasure of experiencing the majority of what's on hand at this year's "Virtual Arcade." Let's dig in. Penrose Studios' Allumette impressed everyone last year with its emotional story, highly detailed 3D modeling, and fully inhabitable world design. Their even more ambitious...

[Read the whole post on screenanarchy.com...]

Open Culture: How to Make the World’s Smallest Cup of Coffee, from Just One Coffee Bean

The Finnish coffee company, Paulig, has been around for a good long while–since 1876, to be precise. But only in 2017 did they get around to doing this–enlisting Helsinki designer Lucas Zanotto “to make...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

Bifurcated Rivets: From FB

Pancakes

Bifurcated Rivets: From FB

Interesting

Bifurcated Rivets: From FB

The Flashing

Bifurcated Rivets: From FB

Read

Bifurcated Rivets: From FB

Been here before, but still wonderful

ScreenAnarchy: Mysterious THE MIST AND THE MAIDEN Teaser

Based on a novel by Lorenzo Silva, The Mist and the Maiden (La niebla y la doncella) investigates the mystery of a young man whose corpse is found in a forest. As the synopsis explains, the forest is located "on the island of La Gomera. The case closes with the accusation to a local politician who is exonerated in the later judgment. Three years later, Sergeant Bevilaqua and his assistant, Corporal Chamorro, are sent to the island to revive the investigation. Corporal Anglada accompanies them, the last one who saw the young man alive." Quim Gutiérrez stars as Sergeant Bevilaqua, with Aura Garrido as Corporal Chamorro and Verónica Echegui as Corporal Anglada. Andrés M. Koppel wrote the screenplay and directed. Watch the intriguing teaser below,...

[Read the whole post on screenanarchy.com...]

Slashdot: No Longer a Dream: Silicon Valley Takes On the Flying Car

Last year, Bloomberg reported that Google co-founder Larry Page had put money in two "flying car" companies. One of those companies, Kitty Hawk, has published the first video of its prototype aircraft. From a report on The Verge: The company describes the Kitty Hawk Flyer as an "all-electric aircraft" that is designed to operate over water and doesn't require a pilot's license to fly. Kitty Hawk promises people will be able to learn to fly the Flyer "in minutes." A consumer version will be available by the end of this year, the company says. The video is part commercial and part test footage, starting with a lakeside conversation between friends about using the Flyer to meet up before switching to what The New York Times says are shots of an aerospace engineer operating the craft in Northern California.

Read more of this story at Slashdot.

MetaFilter: Concise and austere but not necessarily brief

Postal Pieces is a series of 11 musical compositions (on 10 postcards) written by James Tenney between 1965 and 1971. Details and images from an essay by Larry Polansky. I'm particularly fond of the look and sound of Cellogram.

Hackaday: Official Launch Of The Asus Tinker Board

Earlier this year, a new single board computer was announced, and subsequently made its way onto the market. The Tinker Board was a little different from the rest of the crop of Raspberry Pi lookalikes: it didn’t come from a no-name company or a crowdfunding site, but instead from a trusted name, Asus. As a result, it is a very high quality piece of hardware, upon which we remarked when we reviewed it.

Unfortunately, though we were extremely impressed with the board itself, we panned Asus’s software and support offering at the time, because it was so patchy as to be non-existent. We had reached out to Asus while writing the review but received no answer; subsequently they contacted us with a sorry tale of some Tinker Boards finding their way onto the market early, before their official launch and before they had put together their support offering. We updated our review accordingly; after all, it is a very good product, and we didn’t like having to pan it.

This week then, news has come through from Asus that they have now launched the board officially. There is a new OS version based on Debian 9, which features hardware acceleration for both the Chromium web browser and the bundled UHD media player. There is also an upcoming Android release, though it is still in beta at the time of writing and there is little more information.

The Tinker Board is one of the best of the current crop of Raspberry Pi-like single board computers, and it easily trounces the Pi itself on most counts. To see it launched alongside a meaningful software and support offering will give it a chance to prove itself. In our original review we urged tech-savvy readers to buy one anyway; now that it has some of the backup it deserves, we’d urge you to buy one for your non-technical family members too.


Filed under: computer hacks

Slashdot: Billionaire Jack Ma Says CEOs Could Be Robots in 30 Years, Warns of Decades of 'Pain' From AI

Self-made billionaire and Alibaba chairman Jack Ma warned on Monday that society could see decades of pain thanks to disruption caused by the internet and new technologies across different areas of the economy. From a report: In a speech at a China Entrepreneur Club event, the billionaire urged governments to bring in education reform and outlined how humans need to work with machines. "In the coming 30 years, the world's pain will be much more than happiness, because there are many more problems that we have come across," Ma said in Chinese, speaking about potential job disruptions caused by technology. [...] Ma also spoke about the rise of robots and artificial intelligence (AI) and said that this technology will be needed to process the large amount of data being generated today, something that a human brain can't do. But machines shouldn't replace what humans can do, Ma said; instead the technology community needs to look at making machines do what humans cannot. This would make the machine a "human partner" rather than an opponent.

Read more of this story at Slashdot.

Recent additions: servant-auth-token-api 0.4.2.1

Added by NCrashed, Mon Apr 24 15:16:12 UTC 2017.

Servant based API for token based authorisation

MetaFilter: 1941 State Fair

Rare color photos of a 1941 State Fair in Vermont.

ScreenAnarchy: Our Favorite Faces Of Rosario Dawson

This weekend saw the première of famous producer Denise Di Novi's first film as a director: Unforgettable, starring Rosario Dawson as a woman terrorized by the deranged ex-wife of her soon-to-be-husband. Reviews have been mixed (and puns regarding the title have been rife) but everyone seems to agree that lead actress Rosario Dawson does a good job in it. But with her that's basically a given, as she often manages to impress even when given little material to work with. So tell us: what's your favorite performance by Rosario Dawson? Chime in, in the comments below!...

[Read the whole post on screenanarchy.com...]

search.cpan.org: String-Tagged-0.14

string buffers with value tags on extents

All Content: Video: 35mm at Ebertfest 2017

Ebertfest 2017 included two movies that were shown on 35mm film—opening night selection "Hair" and Thursday afternoon's "Hysteria." Ebert Fellow Jason Yue was in the Virginia Theater's projection booth to check out the technology and process required for these special presentations. 

All Content: Ebertfest 2017, Day 5: Every Time We Say Goodbye

Milos Forman’s “Hair” was a fitting choice to open the 19th installment of Ebertfest, with its topical recurring themes of tolerance, pluralism and sexuality. On Sunday, the festival closed on an appropriately wistful note with a screening of Irwin Winkler’s 2004 portrait of Cole Porter, “De-Lovely.” Chaz Ebert’s introduction was an achingly emotional one, as she reflected on Roger’s love of singing. When he saw a great musical—even ones populated by untrained singers—he found the melodies infectious. One of his recent favorites was Robert Altman’s final film, 2006’s “A Prairie Home Companion,” a picture that serves as a celebration of life as well as a farewell to it. The same could be said of “De-Lovely,” which also happens to star the scene-stealer of Altman’s swan song, Kevin Kline. After Chaz brought out festival director Nate Kohn and assistant festival director Casey Ludwig to take a final bow, they presented her with a bouquet of flowers, a tradition Roger observed at the end of every festival. Overcome with emotion, Chaz revealed that she still receives flowers several times a year that Roger had ordered for her before his death in 2013. This revelation left the audience in tears well before the lights went down.

“De-Lovely” is the perfect film to end a critic’s film festival, since it invites Porter to be a critic of his own biopic. An inspired framing device allows him to comment on his life as it unfolds in front of him, barking out directions that none of the “performers” can hear (“It’s too early for another song!” he protests). There are even moments during his memories where he appears to be directing the people around him, asking his wife mid-conversation, “How do you want to play this?” Ashley Judd portrays his wife, Linda Lee, a woman whose abusive first husband left her with little interest in sex, which makes the outwardly gay Porter a natural fit for her. He often tells her, “I love you,” but admits that “the words always work better with music under them.” Kline’s trademark exuberance proves to be a major asset in the film’s dance sequences, such as the comedic showstopper, “Be a Clown,” the song that was later rewritten into “Make ‘Em Laugh” (performed by previous Ebertfest guest Donald O’Connor in “Singin’ in the Rain”). There are also several cameos by major pop singers providing their own reinterpretations of Porter classics. I’ll admit that when Elvis Costello showed up, I initially thought he was Michael Barker, the Sony Pictures Classics co-founder and co-president who interviewed Isabelle Huppert at the festival this year (if he’s back next year, I encourage him to belt out “Be a Clown” prior to screening “Toni Erdmann”). Robbie Williams’ lovely rendition of the title tune was also used for Michael Mirasol’s marvelous festival trailer preceding several of this year’s features, earning applause each time it materialized. Most moving of all was “Every Time We Say Goodbye,” as sung by the late Natalie Cole.

“Access Hollywood” producer Scott Mantz joined Kohn onstage to moderate a post-film discussion with Winkler and his son, Charles (who was the film’s co-producer). At age 85, Winkler appears as unstoppable as Norman Lear did when he graced the stage earlier in the weekend. As the Oscar-winning producer of “Rocky,” Winkler has enough immortal credits to fill a festival of his own, and Mantz couldn’t resist asking him about his other Best Picture-nominated films, “The Right Stuff” and “Goodfellas.” Many of Winkler’s most acclaimed pictures are his collaborations with Martin Scorsese, and he mentioned that their most recent project, “The Irishman,” aims to re-team Robert De Niro with his former co-stars Harvey Keitel, Al Pacino and Joe Pesci, who will all appear at varying ages, courtesy of a “de-aging process” developed by Industrial Light and Magic. The costliness of the technique combined with the film’s lack of Marvel superheroes will likely cause it to wind up on Netflix in addition to limited theatrical engagements. Winkler and his cinematographer Tony Pierce-Roberts (“The Remains of the Day”) paid homage to Scorsese in “De-Lovely” with some spectacular Steadicam shots, one of which was set to the tune of “Love for Sale.” The shot marks the passage of time in a single take, requiring no fewer than four costume changes from Kline, and it was completed over the course of one day. Winkler’s technique of allowing character movement to inform the camera movement in this shot was borrowed from Vincente Minnelli.

Winkler’s son recalled how Kline was miserable throughout the production, unable to understand the director’s vision until he screened a cut of the film and it all clicked. Neither of the Winklers had seen the movie since its 2004 premiere at the Cannes Film Festival, and they were overjoyed to show it at a festival founded by one of their greatest champions. “I never saw Roger happier than when he was with Chaz,” Winkler noted, while acknowledging his own wife of 58 years, actress Margo Winkler, in the audience. Chaz voiced Roger’s belief that Winkler was “one of the smartest producers in Hollywood,” while Winkler remembered how he flew out a print of his 1991 directorial debut, “Guilty by Suspicion,” to ensure that Roger saw it before anyone else. Bringing the festival to a rousing finale were three songs performed by Jimmy and Donnie Demers. They have acquired many fans, including former president Bill Clinton, with their splendidly old-fashioned selections, but neither had tackled a Cole Porter song until Sunday afternoon. Each sought to find their favorite tune for the occasion, and both chose the same one. Here’s the final verse: “There’s no love song finer / But how strange the change from major to minor / Every time we say goodbye.”

If you have suggestions for screenings or events at next year’s 20th anniversary of Ebertfest, send an e-mail to ebertfest@yahoo.com.


All Content: Ebertfest 2017, Day 4: Being Human is Hard

Contrary to the ordinary, there is no festival fatigue at Ebertfest. Even as day four settled in at the Virginia Theater, audiences were characteristically responsive and engaged. They seemed to be enlivened by the slate of diverse films.

Saturday’s lineup began with a unique screening of “Mind/Game,” a complex portrait of women’s basketball superstar Chamique Holdsclaw. Despite the film’s short running time (56 min.), director Rick Goldsmith manages to get to the heart of Holdsclaw. She wasn’t merely a dominant ball player at the University of Tennessee, where she put together the NCAA’s first ever “three-peat” of national championships. She’s a layered, motivated woman who suffers from bipolar disorder. After years of personal and medical neglect, the film finds Holdsclaw grappling with her biological realities. Equal parts moving and insightful, the film veers away from trivializing this serious issue (see: “Fatal Attraction”). With the help of Glenn Close (serving as the narrator), “Mind/Game” aims to illuminate.

After the film, Goldsmith explained that the film has not only screened on the festival circuit, but “at various mental health gatherings.” It’s played for sports psychologists, high school students, college athletes, professional basketball players, and beyond. “Mind/Game” is the rare Important Film that is also a Good Film. To further educate the audience, Eric Pierson (a professor at the University of San Diego) led a panel of mental health professionals after the Q&A. It did not oversimplify the issues this country faces when it comes to treating patients. It did not take shortcuts, or avoid approaching the pain that accompanies mental illness. Instead, Pierson engendered the type of dialogue we need more of in America, 2017. If only Donald were watching the live stream.

Next up was an original comedy that Roger believed “reassures us that there is hope.” Released in 1998, “Pleasantville” delivers on Roger’s promise. With two hours of humor and warmth, director Gary Ross (“Big,” “Seabiscuit”) knocked it out of the park on his first at bat. It’s one of the more assured directorial debuts in recent memory. Quickly jettisoning its '90s setting, Ross transports his main characters, David (Tobey Maguire) and Jennifer (Reese Witherspoon), into a '50s sitcom. It’s fantastical fiction at its finest. For Ross, who hadn’t seen the film in “15 years” until Ebertfest, it holds up. “I really like the movie!” he said laughing. “This is a movie you generate in your youth. It belongs in the tumult of youth.” And for Ross, “Pleasantville” represents his “critical view of nostalgia”—while still reluctantly embracing the past. Ross was eager to open up, and Q&A moderators Nate Kohn and Brian Tallerico did an especially efficient job of gently prodding him. Autobiographically, Ross spoke of his mom (“a ‘50s housewife”) and his father (“a blacklisted screenwriter”) in relation to the Joan Allen and William H. Macy characters. The film is (and was) a personal project to him. “I’m interested in embracing untidiness and lack of symmetry,” says Ross. “That’s what makes life worth living.”

“Being human is hard,” says Norman Lear. “You heard it here first.” Profundity followed by comedy, which seems to be how Lear has lived 93 years of life. Although perhaps not always in that order. With their documentary “Norman Lear: Just Another Version of You,” directors Heidi Ewing and Rachel Grady have made something truly remarkable. It’s a documentary that isn’t bland or sycophantic—it doesn’t sheepishly contribute to mindless mythologizing. Turning the myth to man, Ewing and Grady craft a well-researched primer for those interested in learning about the person who created hit shows like “All in the Family,” “Good Times,” “Maude,” “The Jeffersons” and countless more. More is the operative word here. Lear has made the most of existing for nearly a century on this Earth. When he no longer wanted to be a prodigious Hollywood show-runner and writer, Lear went into activism and philanthropy. After that he found himself as an older father. The show continues.

Before breaking for dinner, Lear and company strutted onto the stage to talk shop. Five minutes into questioning, Lear’s phone began to ring. He couldn’t help but perform. He took the call. It was Grady, ringing to see how the film played. The audience erupted in applause. The joke—and the movie—tore the house down. Lear continued to proffer wisdom. How comedy probably can’t save us, but it can momentarily alleviate the pain. He spoke of the swinging pendulum of existence. Big highs and big lows, love and loss, success and failure. 93 years in, he assured us that no one has ever quite figured out how to live a life. In part because there is no wrong way to live. We simply do.

Chauncey Gardiner is precisely who we wish Donald Trump was: a good-hearted, TV-crazed man-child who has unwittingly found himself in a position of power. Watching “Being There” in 2017, amid the Trump administration, is equally hilarious and horrifying. The parallels between Gardiner (Peter Sellers) and the Donald are unnerving. It’s hard to tell whether director Hal Ashby was a filmmaker or a fortune teller. A hippie renegade whose career was later derailed by a variety of drugs, Ashby may have peaked with this 1979 masterpiece. Cinematographer Caleb Deschanel described “Being There” as “two worlds colliding together.” Gardiner and DC politicians—the latter undone by the former’s unvarnished candor. There’s truth and hilarity here. I just wish Ashby’s fiction didn’t feel so much like our reality. 

Slashdot: Amazon Launches Marketplace For Digital Subscriptions

Amazon said on Monday it is launching a platform for companies with subscription services -- from newspapers and magazines to TV streaming. The "Subscribe with Amazon" marketplace allows consumers to buy subscriptions to products like SlingTV streaming, Headspace meditation, Dropbox Plus, as well as workout videos, online classes, meal plans and even matchmakers. The marketplace also features more traditional subscriptions, similar to those that have become popular on Amazon's Kindle tablets, including the Chicago Tribune, LA Times, Wall Street Journal and New Yorker.

Read more of this story at Slashdot.

All Content: Ebertfest 2017, Day 3: A Special Short, the World's Greatest Actress and More

Day three included two of Ebertfest’s most anticipated events: the return of fest regulars The Alloy Orchestra, presenting their latest live musical accompaniment to a silent film classic, and the arrival of the woman who is arguably the greatest actress working in the cinema today, Isabelle Huppert, to present her latest triumph, the shocking Paul Verhoeven thriller “Elle.” The expectations may have been elevated for this particular day, but those two events, in addition to the fine and stirring documentary that kicked things off, more than lived up to the hype.

The day’s program started off with a special treat, “July and Half of August,” a short film directed by Brandeaux Tourville and written by RogerEbert.com contributor Sheila O’Malley. It observes Neve (Annika Marks) and Jack (Robert Baker), two former lovers who reunite at a bar one night a few years after the end of their relationship—whose length gives the film its title—and hash over things in a conversation that becomes increasingly fraught with doubt and denial as it goes on. Because, as I mentioned, O’Malley is a contributor to this site, giving the film any sort of proper review might not seem very proper, and I will therefore avoid doing so. In other words, I am not going to tell you that the screenplay is quite strong in the way it offers us two fully fleshed-out characters and convincingly depicts the way that the two lie to each other and themselves about what happened in the past and how it continues to affect them to this day. I am not going to mention that Tourville manages to make the film visually striking despite taking place entirely inside a dingy bar—a redressed Green Bay Packers bar, no less—by knowing just the exact moments when to cut from one person to the other, and through the striking black-and-white photography. Most of all, I am certainly not going to mention that O’Malley mentioned that the short is actually part of a proposed feature film chronicling the entire relationship of Neve and Jack and that, based on the portion shown here, I can’t wait to see the rest of it.

This was followed by the documentary “They Call Us Monsters,” which was presented at the festival by director Ben Lear and co-producer Sasha Alpert. The film follows three young men in a juvenile detention facility—Jarad, Antonio and Juan—who are participating in a screenwriting program taught by teacher Gabriel Cowan and tracks their fates as they make their way through the court system that is contemplating trying them as adults for crimes that, while undeniably heinous in nature, were committed when they were teenagers. From the sentence I have just written, you have no doubt already created a picture in your mind of what the film is like, and I assure you that no matter what you are thinking, it is almost certainly incorrect. Instead of focusing entirely on Cowan teaching his students about the beauty of art as an alternative to the violence and cruelty that they have known for most of their lives, it uses the screenwriting sessions as a leaping-off point to examine their lives and delve into the reasons why they landed where they are. (Scenes from the screenplay they come up with, staged with actors by Cowan, are sprinkled throughout the film.) At the same time, the film does not necessarily let them off the hook either by depicting them solely as misunderstood kids in a bad situation—at one point, there is an interview with a victim of Jarad’s drive-by shooting who is now paralyzed as a result, reminding us in the starkest possible terms that while our subjects had been dealt a number of bad breaks in their lives, their actions did hurt others. We also get a look at how the outside system affects their fate, whether it be through the debate for Senate Bill 260, which is designed to give people who committed crimes as minors more opportunities to work towards rehabilitation and parole, or through the actions of a lawyer who brags about her abilities going in but who proves to be incompetent at her job.

During the post-screening Q&A, Lear (the son of television producer and 2017 Ebertfest guest Norman Lear) mentioned that he first became inspired to make this film while researching a conventional feature project that he then put aside. I don’t know what, if anything, will become of that, but it is hard to imagine any feature conveying the same degree of power as “They Call Us Monsters.” (The film is currently scheduled to appear on PBS in May. Additionally, the Chaz and Roger Ebert Foundation provided a donation that went towards the film’s social action campaign.)

Next up was The Alloy Orchestra, a three-man ensemble that has made a name for themselves creating and performing original scores for silent movie classics utilizing an array of off-beat instruments, which includes everything from synthesizers to banjos to musical saws. To see them perform live is an extraordinary experience—watching them in the orchestra pit as they use their instruments to create any number of vividly depicted soundscapes is so mesmerizing that they are as compelling as the films they are scoring, not an insignificant quality since they have added their talents to some of the greatest silent films ever made. This year’s offering was “Variete,” an extraordinary 1925 film that combined the talents of three of the most celebrated members of the German film industry of the time—director and co-writer Ewald Andre Dupont, actor Emil Jannings and cinematographer Karl Freund—in a visually stunning melodrama of such outsized emotions that it could only have been from the silent era, because films today are just too timid to go to the lengths that this one does. Told in a series of flashbacks from the confines of prison, it tells the strange and sad story of Boss Huller (Jannings) and how he wound up where he is. When we first see him, he is a former acrobat who is now a reasonably happy family man with a wife and an infant child whose life is turned upside down with the arrival of Bertha-Marie (Lya de Putti), a beautiful young acrobat who has just arrived from parts unknown. Boss is instantly besotted and, after putting up a token resistance at best, he decides to leave his wife and kid and go off with Bertha-Marie to begin a new acrobatic act. Along the way, they pick up a third partner in Artinelli (Warwick Ward) and for a while, things go swimmingly, but it quickly becomes obvious to everyone but Boss that something is developing between Artinelli and Bertha-Marie.
When Boss does finally discover what has been going on under his nose, let it be said that working without a net is no longer the most dangerous thing that any of the three have to face.

As stated, the film is pure and unabashed melodrama, and just a dry recitation of the facts might make it sound like a silly soap opera along the lines of the not-exactly-classic Burt Lancaster-Tony Curtis-Gina Lollobrigida drama “Trapeze” (1956), but “Variete” is anything but that. This is one of the most visually stunning films of the entire silent era, as Freund conjures up a dizzying array of visual astonishments that make today’s lavish CGI spectacles seem somehow puny by comparison. And yet, the film works for more reasons than just the incredible technical achievements that Freund (who would go on to have a career that would see him shoot such classics as “Dracula” and “Key Largo” (1948), direct cult favorites like “The Mummy” (1932) and “Mad Love” (1935) and essentially invent the visual grammar of television with his revolutionary work on “I Love Lucy”) was able to pull off. Jannings, who would go on to win the very first Oscar for Best Actor, is a volcanic yet strangely touching presence as Boss, and Lya de Putti is such a mesmerizing presence as Bertha-Marie that one has no trouble believing the ease with which she leads the men in her life along. Keeping it all humming along beautifully is Dupont, who effectively goes for broke in scene after scene without letting things go completely off the rails. Of the three key participants, his was the only career that didn’t really make the transition to the sound era, a shame because, based on his achievements here, one gets the sense that he could do anything. At one point during the post-screening Q&A, one of the members of the Alloy Orchestra commented on the use of a musical saw as a recurring motif in the film by describing it as being “romantic and disturbing,” a description that fits “Variete” perfectly.

Topping off the day was “Elle,” and as so much has been written about it since it startled audiences at its premiere at the Cannes Film Festival last year, I won’t go into an extended rehash of the plot of this Hitchcockian thriller about a tough and controlling businesswoman who, as the film opens, is violently raped by a masked attacker and, for reasons that gradually become clear and understandable, decides not to contact the police, instead beginning her own cat-and-mouse pursuit that takes some very strange and unexpected twists and turns along the way. If you have seen it, then you know what a masterful work it is. If you haven’t, I don’t want to say anything about it that might spoil it in any way. Instead, I offer only two personal observations after watching it again at the Virginia:

  1. Last year, when I put up my list of the 10 best films of 2016, I listed “Elle” at #2, coming in just behind “La La Land.” While my love for “La La Land” has not abated at all, I must confess that if I were to present that list today, I would have the positions reversed and give “Elle” the #1 spot that it so richly deserves.
  2. Isabelle Huppert has delivered more great screen performances than pretty much any other actress working today, far too many to possibly be listed here. As great as those turns have been—and she is one of those actresses who can be great even when the films as a whole are on the dodgy side—her work in “Elle” is the pinnacle of her astonishing career. If there were any justice in the world, it would have been her name in the envelope inadvertently handed to Warren Beatty on Oscar night.

Led by Sony Pictures Classics co-president Michael Barker, who preceded the screening with a clip reel of some of her most memorable on-screen moments and an appreciation for her work in the masterful “Heaven’s Gate,” the post-screening Q&A with Huppert quickly became a celebration of her extraordinary career. Although her on-screen persona can come across as cool and resoundingly unsentimental, Huppert in person is an absolute delight, and she charmed the entire crowd as she discussed her career. Regarding “Elle,” she praised Verhoeven as someone she had always wanted to work with (“I could have been Robocop or a Starship Trooper!”) and revealed that she doesn’t overly prepare for her roles because for her, “moviemaking is always about the present time.” From there, the conversation expanded to encompass her entire career and covered topics such as “Rosebud,” the weird 1975 terrorism thriller from Otto Preminger that was one of her very first major films, the infamous production and reception of “Heaven’s Gate” (which she described as “an expensive dream”), working with filmmakers like Michael Haneke and the late Claude Chabrol, and even touched on how impressed she was with the trained cat that provides some of the more memorable moments in “Elle.” Afterwards, she not only braved the occasionally oddball questions offered by the audience but stuck around for the post-screening party even after many of the other guests had left, a fitting end to a fairly unforgettable day.

search.cpan.org: Lingua-RU-Money-XS-0.05

Perl extension for converting digits to the corresponding money sum in Russian.

search.cpan.org: MooX-Options-Actions-0.001

Instant one-class CLI app

Open Culture: 11 Rules for the Perfect Italian Futurist Meal: F.T. Marinetti, Theorist of Futurism, Attempts to Turn Italian Cuisine into Modern Art (1930)

With the coming savage cuts in arts funding, perhaps we’ll return to a system of noblesse oblige familiar to students of The Gilded Age, when artists needed independent wealth or patronage, and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

Colossal: Hand-Painted Planetary Push Pins by Duncan Shotton

Tokyo-based industrial designer Duncan Shotton (previously) is known for his unique approach to houseware and stationery design, where he takes common objects from pencils to bookmarks and conceives of a novel twist. His latest creation is a series of push pins designed to look like the solar system called Planet Pins. The set includes the 8 planets (sorry Pluto fans) and an optional moon pin cast in concrete. Planet Pins just launched on Kickstarter and 100 sets are available as a signed limited edition.

Michael Geist: CRTC’s Zero Rating Ruling Kills Proposals for Preferential Treatment for Cancon Online

There is much to like about the CRTC’s differential pricing decision last week (also referred to as zero rating), with recent posts from Dwayne Winseck, Timothy Denton, and Peter Nowak providing some helpful analysis. My initial post focused on the CRTC’s key findings and the new framework that will govern differential pricing plans. In addition to those rules, however, there are several additional findings that will have significant implications.

One notable aspect of the decision is that the CRTC has effectively killed proposals to create Internet-style Cancon regulations. While there may still be efforts to impose requirements on companies such as Netflix, the ruling ends the possibility of granting preferential treatment to Canadian content in the provision of Internet services. Columnists such as Kate Taylor have speculated about new regulations and the Canadian Media Producers Association promoted the proposal in its submission to the CRTC:

the Commission should be open to considering ways in which differential pricing practices related to Internet data plans could be used to promote the discoverability of and consumer access to Canadian programming. There are different ways this might be accomplished. For example, the Commission could allow service providers to eliminate data usage charges for accessing trailers and other promotional materials specific to Canadian programs. This would assist the discoverability of Canadian programs. A broader and deeper approach would be to eliminate usage charges for accessing any qualified Canadian programs. Such an approach would not only promote discoverability but actual viewership.

It concludes:

In closing, the CMPA submits that high speed broadband presents a wealth of opportunities for Canadian audiences to access high quality informative and entertaining programming made by Canadians on the platform(s) of their choice. For this reason, we submit that the Commission should permit – or, if necessary, mandate – retail Internet access services to adopt practices, including differential pricing practices, which will serve to promote and improve these opportunities.

The possibility of mandating zero rating for Canadian content is directly addressed by the CRTC, which puts an end to the idea.  Its analysis acknowledges that differential pricing could be used to support Canadian content, but that the implementation of such a plan raises problems:

The creation, support, and discoverability of programming made by Canadians underscore many of the policy objectives set out in subsection 3(1) of the Broadcasting Act. Those objectives could be supported by differential pricing practices that would make that content available on Internet platforms in an easy and inexpensive way. However, the conception and implementation of such practices would be problematic for the same reasons that differential pricing practices based on content categories would pose a problem. For instance, while longstanding Canadian content recognition procedures are in place, the reliable identification by ISPs of this content, as well as the regulation and enforcement of the differential pricing practice, would be difficult.

When the parties who suggested such use of differential pricing practices were asked how they would implement their suggestion, they did not provide details at a practical or technical level. The record does not provide any basis to demonstrate that differential pricing practices could be fully and reliably implemented in such a way as to ensure that all programming made by and transmitted to Canadians in the online space would be properly captured.

In light of this analysis, the CRTC concludes:

Given all the drawbacks and limitations of using differential pricing practices as a way to support and promote Canadian programming, the Commission considers that any benefits to the Canadian broadcasting system would generally not be sufficient to justify the preference, discrimination, and/or disadvantage created by such practices.

The decision effectively means that efforts to establish regulations or policies designed to grant preferences to Canadian content on basic Internet services are likely to violate the differential pricing rules. As Canadian Heritage Minister Melanie Joly develops policy plans coming out of the Cancon in the digital age consultation, there is at least one tool that is now out of the toolkit.

The post CRTC’s Zero Rating Ruling Kills Proposals for Preferential Treatment for Cancon Online appeared first on Michael Geist.

Recent additions: Hastodon 0.0.1

Added by syucream, Mon Apr 24 14:08:04 UTC 2017.

mastodon client module for Haskell

Recent additions: StockholmAlignment 1.0.3

Added by FlorianEggenhofer, Mon Apr 24 14:04:56 UTC 2017.

Library for Stockholm alignment format

Hackaday: Neural Networks: You’ve Got It So Easy

Neural networks are all the rage right now with increasing numbers of hackers, students, researchers, and businesses getting involved. The last resurgence was in the 80s and 90s, when there was little or no World Wide Web and few neural network tools. The current resurgence started around 2006. From a hacker’s perspective, what tools and other resources were available back then, what’s available now, and what should we expect for the future? For myself, a GPU on the Raspberry Pi would be nice.

The 80s and 90s

Neural network 80s/90s books and mags

For the young’uns reading this who wonder how us old geezers managed to do anything before the World Wide Web, hardcopy magazines played a big part in making us aware of new things. And so it was Scientific American magazine’s September 1992 special issue on Mind and Brain that introduced me to neural networks, both the biological and artificial kinds.

Back then you had the option of writing your own neural networks from scratch or ordering source code from someone else, which you’d receive on a floppy diskette in the mail. I even ordered a floppy from The Amateur Scientist column of that Scientific American issue. You could also buy a neural network library that would do all the low-level, complex math for you.  There was also a free simulator called Xerion from the University of Toronto.
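
The “low-level, complex math” those libraries handled was mostly this: a weighted sum of inputs pushed through a squashing function. A minimal sketch in Python (a modern stand-in for the code you’d have typed in from that floppy; the weights here are hand-picked for illustration):

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs, then sigmoid squashing."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation, output in (0, 1)

# Two inputs with weights chosen so the neuron behaves roughly like an AND gate.
print(neuron([1.0, 1.0], [10.0, 10.0], -15.0))  # close to 1
print(neuron([1.0, 0.0], [10.0, 10.0], -15.0))  # close to 0
```

A whole network is just layers of these neurons, with each layer’s outputs feeding the next layer’s inputs.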

Keeping an eye on the bookstore Science sections did turn up the occasional book on the subject. The classic was the two-volume Explorations in Parallel Distributed Processing, by Rumelhart, McClelland et al. A favorite of mine was Neural Computation and Self-Organizing Maps: An Introduction, useful if you were interested in neural networks controlling a robot arm.

There were also short courses and conferences you could attend. The conference I attended in 1994 was a free two-day one put on by Geoffrey Hinton, then of the University of Toronto, both then and now a leader in the field. The best-reputed annual conference at the time was the Neural Information Processing Systems conference, still going strong today.

And lastly, I recall combing the libraries for published papers. My stack of conference papers and course handouts, photocopied articles, and handwritten notes from that period is around 3″ thick.

Then things went relatively quiet. While neural networks had found use in a few applications, they hadn’t lived up to their hype and from the perspective of the world, outside of a limited research community, they ceased to matter. Things remained quiet as gradual improvements were made, along with a few breakthroughs, and then finally around 2006 they exploded on the world again.

The Present Arrives

We’re focusing on tools here but briefly, those breakthroughs were mainly:

  • new techniques for training networks that go more than three or four layers deep, now called deep neural networks
  • the use of GPUs (Graphics Processing Units) to speed up training
  • the availability of training data containing large numbers of samples

Neural Network Frameworks

There are now numerous neural network libraries, usually called frameworks, available for free download under various licenses, many of them open source. Most of the more popular ones allow you to run your neural networks on GPUs, and are flexible enough to support most types of networks.

Here are most of the more popular ones. They all have GPU support except for FANN.

TensorFlow

Languages: Python, C++ is in the works

TensorFlow is Google’s latest neural network framework. It’s designed for distributing networks across multiple machines and GPUs. It can be considered a low-level framework, offering great flexibility but also a steeper learning curve than high-level ones like Keras and TFLearn, both discussed below. However, work is underway on a version of Keras integrated into TensorFlow.

We’ve seen this one in a hack on Hackaday already in this hammer and beer bottle recognizing robot and even have an introduction to using TensorFlow.

Theano

Languages: Python

This is an open source library for doing efficient numerical computations involving multi-dimensional arrays. It’s from the University of Montreal, and runs on Windows, Linux and OS-X. Theano has been around for a long time, 0.1 having been released in 2009.

Caffe

Languages: Command line, Python, and MATLAB

Caffe is developed by Berkeley AI Research and community contributors. Models are defined in plain text files and processed with a command line tool; there are also Python and MATLAB interfaces. For example, you define your model in one plain text file, describe how to train it in a second plain text file called a solver, and pass both to the caffe command line tool, which trains the network. You can then load the trained net from a Python program and use it to do something, image classification for example.
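
As a rough, hedged sketch of that plain-text workflow (layer names, file names, and values here are invented; check the field names against the Caffe documentation for your version):

```
# net.prototxt -- the model definition, one layer block per layer
layer {
  name: "fc1"
  type: "InnerProduct"     # Caffe's name for a fully connected layer
  bottom: "data"           # input blob
  top: "fc1"               # output blob
  inner_product_param { num_output: 10 }
}

# solver.prototxt -- how to train the model above
net: "net.prototxt"
base_lr: 0.01              # starting learning rate
max_iter: 10000            # how many training iterations to run
solver_mode: GPU

# then train from the command line, roughly:
#   caffe train -solver solver.prototxt
```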

CNTK

Languages: Python, C++, C#

This is the Microsoft Cognitive Toolkit (CNTK) and runs on Windows and Linux. They’re currently working on a version to be used with Keras.

Keras

Languages: Python

Written in Python, Keras uses either TensorFlow or Theano underneath, making it easier to use those frameworks. There are plans to support CNTK as well. Work is underway to integrate Keras into TensorFlow, resulting in a separate TensorFlow-only version of Keras.

TF Learn

Languages: Python

Like Keras, this is a high-level library built on top of TensorFlow.

FANN

Languages: Supports over 15 languages, no GPU support

This is a high-level open source library written in C. It’s limited to fully connected and sparsely connected neural networks. However, it’s been popular over the years, and has even been included in Linux distributions. It’s recently shown up here on Hackaday in a robot that learned to walk using reinforcement learning, a machine learning technique that often makes use of neural networks.

Torch

Languages: Lua

Open source library written in C. Interestingly, they say on the front page of their website that Torch is embeddable, with ports to iOS, Android and FPGA backends.

PyTorch

Languages: Python

PyTorch is relatively new; their website says it’s in early-release beta, but there seems to be a lot of interest in it. It runs on Linux and OS-X and uses Torch underneath.

There are no doubt others that I’ve missed. If you have a particular favorite that’s not here then please let us know in the comments.

Which one should you use? Unless the programming language or OS is an issue, another factor to keep in mind is your skill level. If you’re uncomfortable with math or don’t want to dig deeply into the neural network’s nuances then choose a high-level one. In that case, stay away from TensorFlow, where you have to learn more about the API than with Keras, TFLearn or the other high-level ones. Frameworks that emphasize their math functionality usually require you to do more work to create the network. Another factor is whether or not you’ll be doing basic research. A high-level framework may not allow you to access the innards enough to start making crazy networks, perhaps with connections spanning multiple layers or within layers, and with data flowing in all directions.

Online Services

Are you you’re looking to add something a neural network would offer to your hack but don’t want to take the time to learn the intricacies of neural networks? For that there are services available by connecting your hack to the internet.

We’ve seen countless examples making use of Amazon’s Alexa for voice recognition. Google also has its Cloud Machine Learning Services which includes vision and speech. Its vision service have shown up here using Raspberry Pi’s for candy sorting and reading human emotions. The Wekinator is aimed at artists and musicians that we’ve seen used to train a neural network to respond to various gestures for turning things on an off around the house, as well as for making a virtual world’s tiniest violin. Not to be left out, Microsoft also has its Cognitive Services APIs, including: vision, speech, language and others.

GPUs and TPUs

Iterating through a neural network

Training a neural network requires iterating through the neural network, forward and then backward, each time improving the network’s accuracy. Up to a point, the more iterations you can do, the better the final accuracy will be when you stop. The number of iterations could be in the hundreds or even thousands. With 1980s and 1990s computers, achieving enough iterations could take an unacceptable amount of time. According to the article Deep Learning in Neural Networks: An Overview, in 2004 a 20-fold speedup was achieved with a GPU for a fully connected neural network. In 2006 a 4-fold speedup was achieved for a convolutional neural network. By 2010, training was as much as 50 times faster on a GPU than on a CPU. As a result, accuracies were much higher.
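
To make “iterating forward and then backward” concrete, here is a toy sketch in plain Python (no framework, purely illustrative): a single weight learning y = 3x, where each iteration runs a forward pass, measures the error, and nudges the weight along the gradient:

```python
# Toy gradient descent: learn w so that w * x approximates y = 3 * x.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # (input, labeled output) pairs
w = 0.0           # start with a bad guess
lr = 0.02         # learning rate: how big each nudge is

for iteration in range(200):
    for x, y in data:
        pred = w * x                  # forward pass
        grad = 2 * (pred - y) * x     # backward pass: d(error^2)/dw
        w -= lr * grad                # update: nudge w to reduce the error

print(round(w, 3))  # prints 3.0
```

Real training does the same forward/backward/update loop, just over millions of weights at once, which is why the hardware speedups below matter so much.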

Nvidia Titan Xp graphics card. Image Credit: Nvidia

How do GPUs help? A big part of training a neural network involves doing matrix multiplication, something which is done much faster on a GPU than on a CPU. Nvidia, a leader in making graphics cards and GPUs, created an API called CUDA which is used by neural network software to make use of the GPU. We point this out because you’ll see the term CUDA a lot. With the spread of deep learning, Nvidia has added more APIs, including CuDNN (CUDA for Deep Neural Networks), a library of finely tuned neural network primitives, and another term you’ll see.
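
The matrix multiplication in question is the forward pass of a fully connected layer: each output is the dot product of the input vector with one row of the weight matrix, exactly the kind of independent arithmetic GPUs parallelize well. A small pure-Python illustration (the shapes and values are invented):

```python
def layer_forward(inputs, weights):
    """One fully connected layer: output[i] = dot(weights[i], inputs)."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

# 3 inputs feeding 2 neurons: the weight matrix is 2 rows x 3 columns.
inputs = [1.0, 2.0, 3.0]
weights = [[0.5, 0.0, -0.5],   # weights into neuron 0
           [1.0, 1.0, 1.0]]    # weights into neuron 1

print(layer_forward(inputs, weights))  # [-1.0, 6.0]
```

Every row’s dot product is independent of the others, so a GPU can compute them all at once instead of looping as this sketch does.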

Nvidia also has its own single board computer, the Jetson TX2, designed to be the brains for self-driving cars, selfie-snapping drones, and so on. However, as our [Brian Benchoff] has pointed out, the price point is a little high for the average hacker.

Google has also been working on its own hardware acceleration in the form of its Tensor Processing Unit (TPU). You might have noticed the similarity to the name of Google’s framework above, TensorFlow. TensorFlow makes heavy use of tensors (think of single and multi-dimensional arrays in software). According to Google’s paper on the TPU it’s designed for the inference phase of neural networks. Inference refers not to training neural networks but to using the neural network after it’s been trained. We haven’t seen it used by any frameworks yet, but it’s something to keep in mind.

Using Other People’s Hardware

Do you have a neural network that’ll take a long time to train but don’t have a supported GPU, or don’t want to tie up your resources? In that case there’s hardware you can use on other machines accessible over the internet. One such service is FloydHub which, for an individual, costs only pennies per hour with no monthly payment. Another is Amazon EC2.

Datasets

Training neural network with labeled data

We said that one of the breakthroughs in neural networks was the availability of training data containing large numbers of samples, in the tens of thousands. Training a neural network using a supervised training algorithm involves giving the data to the network at its inputs but also telling it what the expected output should be. In that case the data also has to be labeled. If you give an image of a horse to the network’s inputs, and its outputs say it looks like a cheetah, then it needs to know that the error is large and more training is needed. The expected output is called a label, and the data is ‘labeled data’.
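
The horse-versus-cheetah check described above amounts to comparing the network’s output vector against the label and measuring the gap. A minimal illustrative sketch (the class names and numbers are invented):

```python
def squared_error(predicted, label):
    """Sum of squared differences between an output vector and a one-hot label."""
    return sum((p - t) ** 2 for p, t in zip(predicted, label))

# Outputs over two classes: [horse, cheetah]. The labeled answer is "horse".
label = [1.0, 0.0]             # one-hot label: this image is a horse
confident_wrong = [0.1, 0.9]   # network says "cheetah"
confident_right = [0.95, 0.05] # network says "horse"

print(squared_error(confident_wrong, label))   # large error -> more training needed
print(squared_error(confident_right, label))   # small error
```

Training algorithms feed this error back through the network to adjust the weights, which is why every sample needs its label.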

Many such datasets are available online for training purposes. MNIST is one such for handwritten character recognition. ImageNet and CIFAR are two different datasets of labeled images. Many more are listed on this Wikipedia page. Many of the frameworks listed above have tutorials that include the necessary datasets.

That’s not to say you absolutely need a large dataset to get a respectable accuracy. The walking robot we previously mentioned that used the FNN framework, used the servo motor positions as its training data.

Other Resources

Unlike in the 80s and 90s, you’re no longer limited to hardcopy books about neural networks; there are numerous ones online. Two online books I’ve enjoyed are Deep Learning by the MIT Press and Neural Networks and Deep Learning. The frameworks listed above all have tutorials to help you get started. And then there are countless other websites and YouTube videos on any topic you search for. I find YouTube videos of recorded lectures and conference talks very useful.

The Future

Raspberry Pi 7 with GPU

Doubtless the future will see more frameworks coming along.

We’ve long seen specialized neural chips and boards on the market but none have ever found a big market, even back in the 90s. However, those aren’t designed specially for serving the real growth area, the neural network software that everyone’s working on. GPUs do serve that market. As neural networks with millions of connections for image and voice processing, language, and so on make their way into smaller and smaller consumer devices the need for more GPUs or processors tailored to that software will hopefully result in something that can become a new component on a Raspberry Pi or Arduino board. Though there is the possibility that processing will remain an online service instead. EDIT: It turns out there is a GPU on the Raspberry Pi — see the comments below. That doesn’t mean all the above frameworks will make use of it though. For example, TensorFlow supports Nvidia CUDA cards only. But you can still use the GPU for your own custom neural network code. Various links are in the comments for that too.

There is already competition for GPUs from ASICs like the TPU and it’s possible we’ll see more of those, possibly ousting GPUs from neural networks altogether.

As for our new computer overlords, neural networks as a part of our daily life are here to stay this time, but the hype that is artificial general intelligence will likely quieten until someone makes significant breakthroughs, only to explode onto the scene once again, but for real this time.

In the meantime, which neural network framework have you used and why? Or did you write your own? Are there any tools missing that you’d like to see? Let us know in the comments below.


Filed under: Featured, Interest, software hacks

ScreenAnarchy: Watch First ZOMBIOLOGY: ENJOY YOURSELF TONIGHT Trailer, Just Because

Heading for release in Hong Kong on June 29, Zombiology: Enjoy Yourself Tonight follows what happens when "a monster from Lung's favorite animation appears in the city ... and turns people into zombies." That's from the rather lengthy plot summary at IMDb. These are the running kind of zombies, not George A. Romero's slow walkers, so this may not be for traditionalists. But if you can roll with that, it looks promising. Here's the entire description (spoilers may follow): Lung (starring Michael Ning) and Chi-Yeung (starring Louis Cheung) are two eccentric hot-blooded young men leading a devil-may-care life. They deem themselves as heroes that can save the earth. However, Lung can do nothing about things in life that don't work out as he wishes: he has...

[Read the whole post on screenanarchy.com...]

Slashdot: Unroll.me 'Heartbroken' After Being Caught Selling User Data To Uber

The chief executive of email unsubscription service Unroll.me has said he is "heartbroken" that users felt betrayed by the fact that his company monetises the contents of their inbox by selling their data to companies such as Uber. Over the weekend, The New York Times published a profile of Uber CEO Travis Kalanick, in which, among other things, it reported that following an acquisition by shopping app Slice in 2014, Unroll.me developed a side-business: selling aggregated data about users to the very apps they were unsubscribing from. Uber was one of Slice's big data arm Slice Intelligence's customers. CNET adds: While Unroll.me did not specifically admit to selling data to Uber, it has apologised for not being "explicit enough" in explaining how its free service worked. "It was heartbreaking to see that some of our users were upset to learn about how we monetize our free service," CEO Jojo Hedaya said on the Unroll.me blog. While reiterating that "all data is completely anonymous and related to purchases only," Hedaya admitted, "we need to do better for our users" by offering clearer information on its website.

Read more of this story at Slashdot.

search.cpan.org: Dancer2-Plugin-Captcha-0.11

Dancer2 add-on for CAPTCHA.

MetaFilter: Reports of Her Death Have Been Greatly Exaggerated

Emily Gould covers author Cat Marnell in her piece Cat Marnell is Still Alive for NY Magazine. Gould writes "There's always a fine line between appreciating the art that someone's making out of her fucked-up life and feeling like your attention makes you complicit in her self-destruction."

The NY Times echoes that sentiment in a book review of Marnell's new book How to Murder Your Life. In her review Tales From the Personal Essay Industrial Complex, Anne Helen Petersen writes, "Marnell treads a knife edge between glamorizing her own despair and rendering it with savage honesty."

Cat Marnell for Janexo
Cat Marnell for Vice
Previously

Both Gould and Petersen make it a point to mention their own forays into the "first-person industrial complex" (as Gould refers to it). In Reinventing Emily Gould from NYT, Ruth La Ferla writes, "Indeed, a case could be made that Ms. Gould's warts-and-all brand of self-exposure anticipated a wave of confessional writing that paved the way for "Girls," Lena Dunham's quasi-autobiographical hit on HBO."

The Hairpin Jia Tolentino interviews Anne Helen Petersen on her departure from academia (and the Hairpin) to write for Buzzfeed. Petersen says "There's this internet myth that you have to make this low-hanging cheap fruit in order to subsidize the "real" reporting/writing that no one reads, and that those "real" writers should never have to worry about the fact that no one reads them."

search.cpan.org: App-Netdisco-2.035003

An open source web-based network management tool.

Recent additions: sarsi 0.0.4.0

Added by aloiscochard, Mon Apr 24 13:34:53 UTC 2017.

A universal quickfix toolkit and its protocol.

MetaFilter: Japan Made Secret Deals with the NSA that Expanded Global Surveillance

Ryan Gallagher of The Intercept provides a fascinating look at the complex relationship between the US and Japanese surveillance organizations who have been cooperating and surveilling each other since the end of the second World War.

Much of this information was made public by the Snowden Archive but this piece pulls together information Snowden revealed as well as historical information over the decades since Japan began to host the NSA in Japan post-war. That the Japanese public has little information about their own government's surveillance programs is interesting. That The Intercept was able to work with Japan's national broadcaster, NHK, on this story is also of interest.

MetaFilter: Lifestyles of the Rich and Tasteless

No 18th Century Estate Was Complete Without a Live-in Hermit

Recent additions: invariant 0.4.2

Added by ryanglscott, Mon Apr 24 13:04:14 UTC 2017.

Haskell98 invariant functors

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Dadbucks



Click here to go see the bonus panel!

Hovertext:
Later, the kid agrees to do chores, but only in exchange for precious metals.

New comic!
Today's News:

Thank you so much MIT. Congrats to James Propp and Olivia Walch. We hope to see you again next year.

ScreenAnarchy: Pretty Packaging: Carlotta's PHANTOM OF THE PARADISE Is A Stunner

When the French distributor Carlotta Films started with their line of "Ultra Collector" boxsets back in 2015, we sat up and took notice. Each of these featured a newly remastered Blu-ray and DVD of a chosen title, fitted on the inside of a stellar 200-page hardcover book. What also caught the attention was the incredible cover-art of their first release. That was for Brian De Palma's Body Double. Since then, every three or four months we've seen a new addition to this line-up, each with a great hardcover book and great art attached. In this series, Brian De Palma may perhaps be a wee bit over-represented: of the six "Ultra Collector" releases so far, three have been for his films. The latest of these is...

[Read the whole post on screenanarchy.com...]

OCaml Planet: Full Time: Front-end Developer at issuu in Copenhagen

Fulltime, Copenhagen

issuu is the world's fastest-growing digital publishing platform. We are looking for a new member to join our fantastic team. With great people, unique ideas and stunning technology, we're changing the future of publishing today. Can you be the best at what you do? Join us!

About this job

As a Front-end Developer at issuu, you will be joining a team of highly skilled web enthusiasts building web applications in an agile environment. We currently develop for desktops, tablets and mobile web. This requires great responsive sites that ensure the best possible user experience on all platforms.

We run a NodeJS server in a production system that serves billions of pages every month. Our client-side applications are a mix of vanilla JavaScript (transpiled with Babel from ES2015) and React/Redux apps, with an in-house developed, modular, BEM-styled styling framework. SASS enables us to keep our CSS well structured.

As our ideal candidate, you are excited about HTML5 features, like SVG and canvas, and have a feel for the pulse of new developments in browser technology. In a system the size of issuu’s, code maintainability is just as important as accessibility. That also means that you are willing to share your code with other people or sit together for pair-programming sessions.

What we Like

  • We think browsers are awesome.
  • We love HTML5, CSS3 and all the new exciting web APIs.
  • We use React, Redux, ES2015 and CSS Modules.
  • We draw with Canvas, WebGL and SVGs.
  • We build modern progressive web apps.
  • We like declarative and functional programming.
  • We used to like jQuery and Backbone, but now we sort of grew apart.
  • We also like it if our front-end developers aren’t afraid of touching our backends, which we make with Python, NodeJS, OCaml and Erlang.

Qualifications

  • Demonstrated ability to deliver results in a fast-paced environment as a self-starter and quick learner
  • Expert understanding of browsers, web technologies (object-oriented JavaScript, ES2015, HTML, CSS), and a passion for the latest and greatest web standards
  • Experience in collaborating with UI Designers
  • Experience architecting and building robust, high-performance production web applications
  • Used to working with Design and Product teams in an agile fashion
  • Experience with building web applications for mobile web is a plus
  • Experience with MVC-based web applications is a plus
  • Experience with server-side languages like Python is a plus
  • Solid communications skills in English

What we Offer

  • You’ll be a part of issuu – an amazing place with room for parents, foodies, geeks, odd birds and the occasional normal person
  • Informal, inclusive and very flexible workplace
  • Competitive compensation package
  • Flat hierarchy – every opinion matters regardless of team and position
  • Regular hackathons
  • A sleek Mac and a pretty sweet desk setup
  • Great offices in Copenhagen, Palo Alto and Berlin

About the Team and Office

You will be joining our DK engineering team in Copenhagen consisting of approx. 27 developers. We are a diverse office with members of all parts of issuu, including Customer Support, Product, Design, Data Analytics and Management.

In a company the size of issuu, code maintainability is just as important as site accessibility. That also means that you are willing to share your code with other people or sit together for pair programming sessions.

Our office is conveniently located next to the Copenhagen Central Station. We provide nice, catered lunches each day, regular outings and monthly Friday bars with drinks, snacks and activities.

About issuu

issuu connects the world’s publishers to an audience of active consumers, on a global scale. Every day millions of people find, read and share billions of pieces of content they find meaningful, from every device. Millions of magazine, newspaper and catalog creators are empowered to distribute, measure and monetize their work through the issuu platform. If you are interested in a sneak peek of who we are as issuuees, have a look at our brandbook.

Details

It’s a prerequisite that you have a valid EU work permit

Slashdot: Aurora Enthusiasts Discover A Strange New Light In The Sky And Named It Steve

An anonymous reader quotes the BBC: A group of aurora enthusiasts have found a new type of light in the night sky and named it Steve. Eric Donovan from the University of Calgary in Canada spotted the feature in photos shared on a Facebook group. He did not recognise it as a catalogued phenomenon and although the group were calling it a proton arc, he knew proton auroras were not visible. Testing showed it appeared to be a hot stream of fast-flowing gas in the higher reaches of the atmosphere. The European Space Agency sent electric field instruments to measure it 300km (190 miles) above the surface of the Earth and found the temperature of the air was 3,000C (5,400F) hotter inside the gas stream than outside it. Inside, the 25km-wide ribbon of gas was flowing at 6 km/s (13,000mph), 600 times faster than the air on either side. One official at the European Space Agency made sure to thank the "army of citizen scientists" who helped with the discovery, saying "It turns out that Steve is actually remarkably common, but we hadn't noticed it before." The name apparently came from a scene in the movie "Over the Hedge."

Read more of this story at Slashdot.

Open Culture: Milton Glaser’s 10 Rules for Life & Work: The Celebrated Designer Dispenses Wisdom Gained Over His Long Life & Career

“None of us has really the ability to understand our path until it’s over,” the celebrated graphic designer Milton Glaser muses less than a minute into the above video. The 86-year-old Glaser’s many...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

Hackaday: White-hat Botnet Infects, Then Secures IoT Devices

[Symantec] reports that Hajime seems to be a white hat worm that spreads over Telnet in order to secure IoT devices instead of doing anything malicious.

[Brian Benchoff] wrote a great article about the Hajime worm just as the story broke when it was first discovered back in October last year. At the time, it looked like the beginnings of a malicious IoT botnet out to cause some DDoS trouble. In a crazy turn of events, it now seems that the worm is actually securing devices affected by another major IoT botnet, dubbed Mirai, which has been launching DDoS attacks. More recently a new Mirai variant has been launching application-layer attacks since its source code was uploaded to a GitHub account and adapted.

Hajime is a much more complex botnet than Mirai: it is controlled peer-to-peer, propagating commands through infected devices, whilst the latter uses hard-coded addresses for the command and control of the botnet. Hajime can also cloak itself better, managing to hide itself from the list of running processes and to hide its files on the device.

The author can open a shell on any infected machine in the network at any time, and the code is modular, so new capabilities can be added on the fly. It is apparent from the code that a fair amount of development time went into designing this worm.

So where is this all going? So far this is beginning to look like a cyber battle of Good vs Evil. Or it’s a turf war between rival cyber-mafias. Only time will tell.


Filed under: news, security hacks

CreativeApplications.Net: Sonic Pendulum – AI soundspace of tranquility

Created by Yuri Suzuki Design Studio and presented at the recent Milan Design Week, Sonic Pendulum is a sound installation in which artificial intelligence imagines and materialises an endless soundscape.

BOOOOOOOM!: Artist Spotlight: Jaime Angelopoulos

Sculptures and collages by artist Jaime Angelopoulos. More images below.

: Life and death of a tree


This is a picture from my friend L, who is visiting Yiwu again this year. He’s been going for some years now, his first visit dating from 2007. He said that when he first went to Yiwu, this tree was supposed to be 600 years old. It was just growing in the wild, one of the older trees, but certainly nothing too special. A few years later, in 2012, when he visited this spot again, the tree was now 1400 years old, not 600. By then, it had been “protected” with the metal cage you see surrounding it, along with some concrete poured around it to protect it from, presumably, falling off the slope or something. Fast forward a few more years to today – as you can see in the picture, the tree is either dead or about to die, with no leaves and no real sign of life. It’s not the first tree like this and won’t be the last. Nannuo mountain had a similar, physically much bigger tree that was also “protected” and died in the process.

But fear not – there’s already a newly crowned “1000 years old” tree at the front of the village, with a sign hanging from it proclaiming so. Tourists entering the region need not worry – they will still be able to see a 1000-year-old tree and buy magical leaves from it!

Now, aside from the utter absurdity of the story and the sadness of it all, I think it’s safe to say that those of us who have watched the puerh market for a decade or more know this sort of thing has been going on for some time now. The ever-increasing age of certain trees is not surprising – it’s been that way since at least 2005, when people first started getting crazy about older trees. Prices for the leaves have never really fallen since then, and ever-fancier things are now happening, with single-tree cakes being pressed, etc. Just look at this tree though – how much tea do you think it can realistically produce? It’s no taller than a person and a half. Even if you chopped down the entire tree and took all the leaves when it was in full bloom, chances are it’s no more than a couple of kilos when fried and dried.

That brings us to a more salient point – this area of China has never, ever been rich. For pretty much its entire history, the human beings living in these mountains have led a subsistence lifestyle – they produce enough to sustain their lives, but not much more. When tea traders first visited these areas in the early 2000s, conditions were primitive. Huts were shabby, sanitation was basic, and food, while it existed, was not exactly free-flowing. In earlier decades many farmers actually chopped down their tea trees to plant rubber, because rubber trees offered a steadier income. Old tree tea was cheaper – it was considered less good back then, and more troublesome to harvest. Prices only really reversed starting somewhere around 2003, and haven’t looked back since.

So in the face of this sudden rush of fortune, it is no surprise that farmers in this area would want to exploit it to the full. This is, after all, their one chance of getting comfortable, even rich if they are among the lucky ones living in a famous village like Banzhang. You can finally make some decent money, send your kids to school comfortably, buy some creature comforts, build a new, better house, get a motorcycle or even a pickup truck. You can have some money in the bank, and enjoy life a little more. If the cost of all that is, say, the over-harvesting of some trees on the slopes above your house… that’s ok, no? These trees will finally pull them out of poverty, and with an endless supply of newcomers who don’t know that much about tea, business is good.

In the last few years, as tea tourism has increased exponentially (I read one account saying that 500,000 people are visiting the tea mountains during this year's harvest season), there is an increasing number of people who really have no business going to the mountains there to buy tea. If you are a rich city professional interested in tea, and are spending a couple of weeks in Yiwu looking at things, well, you would want some of your own tea, no? Here, here’s some tea from my 800-year-old tea tree. That bag there? It’s from the 600-year-old one. If you are visiting only that one time, you’ll want to get your hands on some of these things. What’s a few thousand RMB for half a kilo of tea? It’s the memory that counts, and you can press it into a cake or two and store it forever, knowing that you personally went up the mountain to press these unique, old, single-tree cakes.

At that point, does it actually matter what trees these leaves are from? These guys are just buying tour souvenirs. It can be trash tea and it won’t matter. And a lot of it is indeed trash tea, sold to people who really don’t know what they’re doing when buying maocha. When you compare a few bags of tea, one of them will always be better than the others. That doesn’t mean the bag is good, unless you really know what you’re doing. Most people have never tried enough really fresh maocha to know the difference.

Eager customers from faraway places who can't easily get to Yunnan are also lured in by the same promise. Just like this tree that magically went from 600 to 1400 years old, outlandish claims exist even among vendors whose primary customers are in Western countries – and people buy from them hoping that they, too, can experience these amazing teas. Let it sink in for a moment how old those trees really are, and think about how likely it is that these claims have any semblance of truth. Meanwhile, spare a thought for this tree that perished in the process.

BOOOOOOOM!: Artist Spotlight: Ayumu Arisaka

A selection of drawings and animations by Ayumu Arisaka. More below!

Planet Haskell: Well-Typed.Com: Upcoming courses and events 2017

We are excited to be teaching Haskell courses once again – in June 2017, at Skills Matter’s CodeNode venue in London.

We are offering three courses:

Fast Track to Haskell (26-27 June)

Compact two-day course aimed at software developers with no prior experience in Haskell who want to learn the fundamentals of Haskell and functional programming.

Guide to the Haskell Type System (28 June)

One-day course covering most of GHC’s extensions to the Haskell type system, such as GADTs, data kinds, and type families. Suitable for Haskellers who want to get the most out of Haskell’s powerful type system and understand the trade-offs involved.

Guide to Haskell Performance and Optimization (29-30 June)

Two-day course focusing on the internals of GHC, the evaluation strategy, how programs are compiled and executed at run-time. Explains how to choose the right data structure for your program in a lazy functional language, what kind of optimizations you can expect the compiler to perform, and how to write beautiful programs that scale.

Each of these courses is open for registration, with reduced rates available if you book soon.

The courses will also be held again in October 2017, in connection with the Haskell eXchange.

We also provide on-site (and remote) courses tailored to your specific needs. If you want to know more, have a look at our training page or contact us.

Other upcoming events

The following are some other events in 2017 we are planning to participate in (we may be at other events, too):

ZuriHac (9-11 June 2017)

As in previous years, we’ll be at ZuriHac again, which is the largest European Haskell Hackathon. Whether you’re a newcomer who wants to try Haskell for the first time or a Haskeller with many years of experience, we are looking forward to meeting you there.

ICFP + Haskell Symposium + HIW + CUFP (3-9 September 2017)

The annual International Conference on Functional Programming will take place in Oxford this year. A full week of events focused on functional programming, including the two-day Haskell Symposium and the Haskell Implementors Workshop. There’s also the Commercial Users of Functional Programming conference which features several tutorials on various programming languages and techniques.

We will certainly be there and participate actively.

Haskell eXchange (12-13 October 2017 + 14-15 October 2017)

The two-day Haskell developer conference organized by us and Skills Matter in London is back for another year. We are currently looking for talk proposals for this conference, so if you have anything you would like to present, please submit! Registration is also open already, and tickets are cheaper the earlier you book.

There’s also going to be a two-day hackathon / unconference on the weekend after the Haskell eXchange.

If you would be interested in sponsoring the Haskell eXchange, please let us know.

BOOOOOOOM!: Artist Spotlight: Luke Pelletier

A selection of work by artist Luke Pelletier. More images below.

Planet Haskell: Ken T Takusagawa: [vqoxpezv] Omitting named function arguments

Consider a programming language whose syntax for function calls requires naming each passed argument, but as a benefit for this extra verbosity, allows specifying the arguments in any order:

f { foo = 3, bar = True, baz = "hello" }

If an argument is omitted, there are several things that could happen, depending on how the language is defined.

  1. Compile time error.
  2. It becomes a partial function application, a lambda function of the missing arguments.  Haskell does this with currying when trailing arguments are omitted.  (Tangentially, in Haskell, creating a lambda for a missing argument that is not the last one requires a little bit more work.)
  3. The missing arguments get silently assigned a lazy "undefined" value, which results in a run-time error if the "undefined" is ever evaluated.
  4. The language permits the function definition to provide default values to some omitted arguments.  If there is no default value, then compile-time error.  C++ does this.
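For comparison, Python happens to support named (keyword) arguments natively and exhibits several of these options directly: omitting a required argument is an error (option #1, though Python reports it at run time rather than compile time), `functools.partial` gives partial application over named arguments (option #2), and default values cover option #4. A rough sketch, not the hypothetical language above:

```python
from functools import partial

def f(*, foo, bar, baz="hello"):  # baz has a default value (option 4)
    return (foo, bar, baz)

# Omitting an argument with no default raises an error (option 1):
try:
    f(foo=3)
except TypeError as e:
    print("missing argument:", e)

# Partial application over the supplied named arguments (option 2):
g = partial(f, foo=3)
print(g(bar=True))         # (3, True, 'hello')

# The default fills in the omitted argument (option 4):
print(f(foo=3, bar=True))  # (3, True, 'hello')
```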

It would be nice if a language could provide all of these options, even though strictly speaking they are mutually exclusive.

An imperfect solution is to have special keywords invoking #2, #3 and #4, perhaps something like f { foo = 3, REST UNDEFINED } or REST DEFAULT or REST LAMBDA, explicitly indicating what to do with the rest of the arguments omitted at a call site.

I have seen default values implemented in Haskell using its typeclass mechanism, e.g., configuration values in xmonad.  Default values are overridden using record modification syntax.

A hybrid between #1 and #4 would have the compiler always produce an error if arguments are missing, but indicate in the compile error that a default value is available via an explicit keyword (as above) when one is.

A list of named parameters and their values looks kind of like a record or struct.  Make such a list a real type and allow variables of that type to be declared and assigned.  A special syntax allows invoking a function with a record instead of a named argument list.  If two functions have the same named parameters, are their parameter record types the same?  Duck typing.

This scheme also gets messy when arguments may be omitted; we need to be able to define record types with only a subset of the parameters of a function, as well as possibly allowing REST UNDEFINED or REST DEFAULT as dynamic values in the record. If using REST LAMBDA, and whether a record field is defined is only known dynamically, then type checking and kind checking have to be postponed until run-time.

One of the things that makes me uneasy about Haskell record syntax is the following: REST UNDEFINED (#3) occurs implicitly when records are constructed whereas REST LAMBDA (#2) occurs when omitting trailing arguments when using positional constructors.  The latter will usually cause a type checking error if the argument was omitted accidentally whereas the former waits for a run-time error.

Previously: Same idea.

Having to specify a function name as well as all its parameter names might become tedious for the programmer.  Perhaps have mechanisms for a programmer to define alternate parameter names or omit a name.  Vaguely related: by type.

Open Culture: What Makes a Coen Brothers Movie a Coen Brothers Movie? Find Out in a 4-Hour Video Essay of Barton Fink, The Big Lebowski, Fargo, No Country for Old Men & More

What could movies as different as Barton Fink, The Big Lebowski, No Country for Old Men, and True Grit have in common? Even casual cinephiles will take that as a silly question, knowing full well...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

Hackaday: Laser Surgery: Expanding the Bed of a Cheap Chinese Laser Cutter

Don’t you just hate it when you spend less than $400 on a 40-watt laser cutter and it turns out to have a work area the size of a sheet of copy paper? [Kostas Filosofou] sure did, but rather than stick with that limited work envelope, he modified his cheap K40 laser cutter so it has almost five times the original space.

The K40 doesn’t make any pretenses — it’s a cheap laser cutter and engraver from China. But with new units going for $344 on eBay now, it’s almost a no-brainer. Even with its limitations, you’re still getting a 40-watt CO2 laser and decent motion control hardware to play with. [Kostas] began the embiggening by removing the high-voltage power supply from its original space-hogging home to the right of the work area. With that living in a new outboard enclosure, a new X-Y gantry of extruded aluminum rails and 3D-printed parts was built, and a better exhaust fan was installed. Custom mirror assemblies were turned, better fans were added to the radiator, and oh yeah — he added a Z-axis to the bed too.

We’re sure [Kostas] ran the tab up a little on this build, but when you’re spending so little to start with, it’s easy to get carried away. Speaking of which, if you feel the need for an even bigger cutter, an enormous 100-watt unit might be more your style.

Thanks to stalwart tipster [George Graves] for the heads up on this one.


Filed under: laser hacks, tool hacks

BOOOOOOOM!: Artist Spotlight: Natasha Bieniek

Paintings by Melbourne, Australia-based artist Natasha Bieniek. More images below.

Instructables: exploring - featured: Arduino Tailpipe Filter and Sensor

This tutorial shows how to make a tailpipe sensor and filter that connects to an app via bluetooth. A Tinyduino stack measures carbon monoxide from a tailpipe and sends the data via BLE to an App. Acquire Parts Parts Needed:Tinyduino Starter KitTinyduino Nordic BLESparkfun MQ-7 Carbon Monoxide Se...
By: CheongS

Continue Reading »

BOOOOOOOM!: Photographer Spotlight: Tom Joseph Wilson

A selection of photos from “Levels” shot in Malawi, by photographer Tom Joseph Wilson. Have a look at the images below.

Instructables: exploring - featured: LoRa IOT Home Environment Monitoring System

The LoRa IOT Home Environmental Monitoring System consists of an Arduino Mega based IOT-to-Internet gateway and Arduino Feather based remote stations with environmental sensors. The remote stations communicate wirelessly with the gateway using LoRa radios.The system enables a homeowner to monitor th...
By: RodNewHampshire

Continue Reading »

Instructables: exploring - featured: Jammafying Daytona USA Arcade PCB

Daytona USA is a top-grossing driving simulator that I really loved playing back in the day. The game is fitted with custom controls (wheel, pedals, gear shifter and VR buttons). In order to play this wonderful game using a standard Supergun, follow the steps in this instruction set. Stuff you need...
By: matan79

Continue Reading »

Electronics-Lab: Simple circuit indicates health of lithium-ion batteries

Fritz Weld @ edn.com proposes a simple circuit to check li-ion battery health. He writes:

Lithium-ion batteries are sensitive to bad treatment. Fire, explosions, and other hazardous conditions may occur when you charge the cell below the margin that the manufacturer defines. Modern battery chargers can manage the hazardous conditions and deny operation when illegal situations occur. This fact doesn’t mean, however, that all cells are bad. In most cases, you can replace the discharged battery and increase your device’s lifetime. Figure 1 shows the circuit for testing battery packs.

Simple circuit indicates health of lithium-ion batteries – [Link]

The post Simple circuit indicates health of lithium-ion batteries appeared first on Electronics-Lab.

Instructables: exploring - featured: Wooden Guitar Picks

Most guitar picks are made with a CNC, but unfortunately I don't have one (yet...), so my way of making guitar picks is very simple and effective, using more common tools. My picks are normally made with veneer, which has yielded me the best results. These picks are very easy and fun to make, great proj...
By: JakeR91

Continue Reading »

Electronics-Lab: Boost Converters and Buck Converters: Power Electronics

Boost Converters and Buck Converters: Power Electronics  – [Link]

The post Boost Converters and Buck Converters: Power Electronics appeared first on Electronics-Lab.

Electronics-Lab: Exploring Eagle CAD ULPs #6 – Group-aps_v4.ULP Autoplace by Group

Welcome to the 6th post of the “Exploring Eagle CAD ULPs” series. Each post will be discussing one useful ULP in Eagle CAD.

“ULP” (User Language Program) is a plain text file written in a C-like syntax that can be used to access the EAGLE data structures and to create a wide variety of output files. You can think of it as a plug-in for Eagle.

You can reach the posts published in this series using the following link.

In the previous post we explored the Place50 ULP, which places all parts of the board according to their positions in the schematic. Place50 moves all parts of the board, but sometimes we need to do this auto-placement for just a certain group of parts. Besides that, we can’t change the position scaling factor in Place50. The Group-aps_v4 ULP overcomes these two limitations of Place50 by doing the auto-placement by group and by offering a user-defined position scale and offset.

To use the Group-aps_v4 ULP, first download it from the Autodesk website. Before running it in the schematic editor, you need to define a group of parts.

Group-aps_v4 has a simple dialog to enter scale and offset values.

Scale is used to scale the original position (X and Y) of each part in the defined group in the schematic, while the X,Y offset is added to the part's position on the board after scaling. For example, if the scale is 0.5 and the position (in mil) of a part is (500,100), then it will be treated as (250,50).
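As a sketch of that transform (the helper name `place` is mine, not part of the ULP), each coordinate is multiplied by the scale and the offset is then added:

```python
def place(position, scale, offset):
    """Map a schematic position (x, y) in mil to a board position:
    scale each coordinate, then add the user-defined offset."""
    x, y = position
    ox, oy = offset
    return (x * scale + ox, y * scale + oy)

print(place((500, 100), 0.5, (0, 0)))      # (250.0, 50.0), as in the example
print(place((500, 100), 0.5, (100, 100)))  # (350.0, 150.0) with an offset
```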

Group-aps_v4 originally places the whole group at the calculated position of the first part, so as an output all parts end up with the same X and Y, which isn't useful. So I made a simple edit to the ULP to solve this issue. You can download the updated version, N_group-aps_v4.ulp.

The post Exploring Eagle CAD ULPs #6 – Group-aps_v4.ULP Autoplace by Group appeared first on Electronics-Lab.

Electronics-Lab: Fast Single-Pixel Camera

Compressed sensing is a new computational technique for extracting large amounts of information from a signal. Researchers from Rice University, for example, have built a camera that can generate 2D images using only a single light sensor (‘pixel’) instead of the millions of pixels in the sensor of a conventional camera.

This compressed sensing technology is rather inefficient for forming images: such a single-pixel camera needs to take thousands of pictures to produce a single, reasonably sharp image. Researchers from the MIT Media Lab, however, have developed a new technique that makes image acquisition using compressed sensing fifty times more efficient. In the example of the single-pixel camera, that means the number of exposures can be reduced to a few dozen.

One intriguing aspect of compressed sensing is that no lens is required – again in contrast with a conventional camera. That makes this technique also particularly interesting for applications at wavelengths outside of the visible spectrum.

In compressed sensing, use is made of the time differences between the reflected light waves from the object to be imaged. In addition, the light that strikes the sensor has a pattern – as if it passed through a checkerboard with irregularly positioned transparent and opaque fields. This could be obtained with a filter or with a micro-mirror array where some mirrors are directed towards the sensor and others are not.

Each time, the sensor measures only the cumulative intensity of the incoming light. But when this measurement is repeated often enough, each time with a different pattern, the software can derive the intensity of the light that is reflected from different points of the subject.
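That derivation can be sketched in miniature. The toy example below uses a hand-picked, easily invertible set of masks and plain back-substitution; a real single-pixel camera uses random patterns and a sparsity-exploiting solver, which is what lets compressed sensing get away with far fewer measurements than pixels:

```python
# A 4-"pixel" scene the sensor cannot observe directly.
scene = [3.0, 1.0, 4.0, 1.5]

# Four mask patterns (1 = transparent field, 0 = opaque field).
# Chosen upper-triangular here so the system is trivially invertible.
masks = [
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]

# One cumulative intensity reading per mask -- all the sensor ever records.
readings = [sum(m * s for m, s in zip(mask, scene)) for mask in masks]

# Recover the per-point intensities by back-substitution.
recovered = [0.0] * 4
for i in reversed(range(4)):
    recovered[i] = readings[i] - sum(
        masks[i][j] * recovered[j] for j in range(i + 1, 4))

print(recovered)  # [3.0, 1.0, 4.0, 1.5] -- the scene, reconstructed
```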

Source: Elektor

The post Fast Single-Pixel Camera appeared first on Electronics-Lab.

Planet Haskell: Michael Snoyman: Haskell Success Stories

I've probably blogged, spoken, Tweeted, and commented on a variation on this theme many times in the past, so please excuse me for being a broken record. This is important.

I think we have a problem in the Haskell community. We all know that using Haskell to create a simple web service, a CRUD app, a statically linked command line tool, or a dozen other things is not only possible, but commonplace, trivial, and not even noteworthy. So we don't bother commenting when we create general purpose reverse proxy tools with prebuilt Docker images for auth-enabling arbitrary webapps. It's boring. Unfortunately, people outside our community don't know this. By not bothering to talk about this (for us) boring topic, we're hiding away the fact that Haskell is a practical language for creating real things.

Instead, we like to talk about better preludes, optimizing common functions, or dangers in our standard libraries. I'm picking on myself here with these examples, but my comments apply far more generally.

I personally know of at least 10-15 Haskell success stories that have never been talked about publicly. And I have to apologize for not leading by example here; unfortunately most of my work in the past few years has either been under NDA, or been of absolutely no interest to people outside the Haskell community (usually open source infrastructure and libraries). So I'm hoping to inspire others to step up to the plate.

I'm not trying to tell anyone to stop talking about the things we find interesting. I just want to point out that even if we, within the Haskell community, may not find an "I launched a web service, and it's running, and it's not as buggy as we would have expected v1 to be" kind of blog post noteworthy, I think others will. These kinds of blog posts are also a much easier way to get started talking publicly about Haskell, since not all of us can explain zygohistomorphic prepomorphisms (I know I certainly can't).

As I was batting the idea for this post around with my wife last night, she pointed out that, most likely, the people best suited to write these kinds of posts may not have dedicated blogs at all right now. If you fall into that category, but would still be interested in writing up a post about your Haskell success story, I'd like to offer assistance. I'm happy to let guests write posts on the Yesod blog. Articles may also be relevant to haskell-lang.org. And we've run Haskell experience reports on FP Complete's website many times in the past.

I hope this time around this message had a bit of a different twist, and maybe can hit a different group of readers.

MattCha's Blog: The "NEW" MattCha's Blog Is A Puerh Blog!!! Ok?

Well my favourite tea is raw puerh cha of course!

It has always been (since well before the start of this blog at least).  All my tea friends and all the tea people I have ever been close to enjoy raw puerh much more than any other kind of tea.  I wonder what your favourite tea is?

When looking back at the posts on this blog, the puerh posts are really overshadowed by the heavy Korean tea content. In reality, I have drunk much more puerh tea than Korean, but you would probably never know it from reading this blog. When going back through the blog I realized that I published more on samples and group tastings than on my own purchases! I guess I drink a lot of tea, and you can't publish on everything, else you would be posting daily, and that's no fun at all!

The future of this blog will be puerh focused.  I hope to post more on what I buy, issues of puerh drinkers, and other original insights on puerh tea.  Don't worry, I will still post about Korean tea as well.  Since being out of the epicentre of Korean tea for so long, it seems natural to focus more on what I'm doing now and not as much what is going on thousands of miles away in Korea.

One of the first things I did to mark the shift to a puerh blog was go back to all the old puerh posts and tidy up the labels to reflect the vendor, factory, area, and town/mountain of the puerh. Then I realized that the new mobile/tablet version of Blogger doesn't even use these anymore. You have to go to "view web version" to see them. Does anyone even use labels anymore?

Anyhow.. So the upcoming posts are going to feature mainly puerh ... yay!

Peace

Explosm.net: Comic for 2017.04.24

New Cyanide and Happiness Comic

Hackaday: Arbitrary Code Execution is in Another Castle!

When one buys a computer, it should be expected that the owner can run any code on it that they want. Often this isn’t the case, though, as most modern devices are sold with locked bootloaders or worse. Older technology is a little bit easier to handle, however, but arbitrary code execution on something like an original Nintendo still involves quite a lot of legwork, as [Retro Game Mechanics Explained] shows with the inner workings of Super Mario Brothers 3.

While this hack doesn’t permanently modify the Nintendo itself, it does allow for arbitrary code execution within the game, which is used mostly by speedrunners to get to the end credits scene as fast as possible. To do this, values are written to memory by carefully manipulating on-screen objects. Once the correct values are entered, a glitch in the game involving a pipe is exploited to execute the manipulated memory as an instruction. The instruction planted is most often used to load the Princess’s chamber and complete the game, with the current record hovering around the three-minute mark.

If you feel like you’ve seen something like this before, you are likely thinking of the Super Mario World exploit for the SNES that allows for the same style of arbitrary code execution. The Mario 3 hack, however, is simpler to execute. It’s also worth checking out the video below, because [Retro Game Mechanics Explained] goes into great depth about which values are written to memory, how they are executed as an instruction, and all of the other inner workings of the game that allows for an exploit of this level.


Filed under: nintendo hacks

TheSirensSound: New single Not Over You by Memoryy

Two years in the making, the new Memoryy album SKELETONS finally dropped on Friday. The new single is "Not Over You", and Memoryy has saved the best for last. The stand-out single is already creating a buzz with its universally emotive chorus "I'm not over not getting over you." Flourished with violins, breathy choirs, glitching vocals and pulsing synths, "Not Over You" is a knock-out from start to finish and shows that Memoryy's knack for crafting emotionally driven electronic pop is right up there with the best of them. After writing the Netflix Chelsea Handler theme song and dropping remixes for Body Language, Bridgit Mendler, Paperwhite and more last year, Memoryy wraps up five of his recent HypeMachine hits in the new album SKELETONS. It's a genre-spanning pop accomplishment featuring collaborations with artists and producers including The Golden Pony, Brothertiger, Brain Tan, Frances Cone, Yeasayer producer Abe Seiferth, Saint Motel producer Joe Napolitano, and Grammy-nominated Katy Perry mixer Abe Seiferth.

Ansuz - mskala's home page: Mastodon WTF timeline

In the last few days I've been fortunate to witness an interesting chapter in the Internet's history, and I'm trying to compile a timeline of what has happened while the memories are still reasonably fresh. This is incomplete and a work in progress; I'll be updating it, and not necessarily in chronological order, as I dig up other things worth including. Some of my TODO markers may remain. But here goes.

TheSirensSound: New album Dark by You Are a Soul


MattCha's Blog: MattCha's Blog's Many Accomplishments... Horay!

The Old MattCha's Blog was started at a time with very little English knowledge about:

1- Korean tea history and written classics

2- the various types of Korean teas and their production

3- where to purchase Korean teas

4- Korean teawear

English information on Korean tea history and classics changed dramatically in 2007 with Brother Anthony's book Korean Way of Tea; it was further advanced in 2011 with the publication of Korean Tea Classics, and new and important insights were added in 2012 with a publication in Transactions by Brother Anthony and Steven Owyoung. The readers of MattCha's Blog held a book club on Korean Tea Classics to celebrate this introduction of knowledge!

Korean green teas and other types of Korean teas are featured throughout this blog and the production is outlined here.

An extensive list of Korean tea vendors is published on MattCha's Blog (I better update this).

Korean teawear is also featured throughout and can be viewed with this label here.

Overall, MattCha's Blog gives a complete picture of Korean tea culture.

However, reflecting what is popular in Korea, the blog also features many posts on other teas.  This is especially true of raw puerh from Yunnan, and Japanese matcha.

Many people are very surprised when they ask me what my favorite tea is... (any guesses?)...

Peace

Daniel Lemire's blog: The real lesson of the human genome project

When I was pursuing my PhD, the human genome project was often regarded as both overly ambitious (maybe even impossible) and full of possibilities. To many people’s surprise, the project was declared complete back in 2003, much earlier than expected. Today, we have a “roughly” complete map of the human genome.

Many announced, too soon as it turns out, that it would quickly bring new cures and new therapies that were previously unthinkable. Fifteen years later, the concrete results are somewhat thin on the ground.

But it made one very important difference. Before we could track down people’s genes, it was often believed that our destiny was written out in our DNA. You were what your genes said you were.

So people quickly moved to cataloging the genes of the super smart, the genes of the athletes, the genes of the centenarians. And what have they found? Not a whole lot.

There are a few genes that might help you live a bit longer. There are a few genes more frequent in athletes. And so forth. But the effects are often quite weak. There are exceptions here and there. Some genes give you a much higher chance of getting some cancers, or Alzheimer’s and so forth… but they affect relatively few of us in aggregate.

Chances are that if you have your genome sequenced, you will find that you are slightly lucky in some ways and slightly unlucky in others… but it is all quite boring.

If we could build a human body from the DNA up, selecting only the best genes, the result would be nothing like a superman.

If you have a toned body, some of your genes are activated, but it turns out that your nerdy friend who barely looks like a man has more or less the same genes, they are just silenced.

And so, I think that the main lesson is that human biology is far more malleable than many people thought. And this means that the badly dressed kid next to you, who looks like an unhealthy idiot? To a close approximation, he probably has a comparable genetic legacy to yours.

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Victims

Apparently every eighth property sale in the GTA last year (and there were 113,200 of them) went to a person who already has real estate. This ratio exploded in the last decade according to government stats. More than 120,000 locals now own multiple places.

So what?

So if foreign buyers (according to local realtors) equal 4.9% of all deals, if 50% of all condos (according to Urbanation) are bought by people with no intention of moving in and if 14% of total buyers (according to the province) already own homes, then the changes announced last Thursday are doomed. The centrepiece of Ontario’s big douse-the-fire program was a 15% tax on foreign buyers (who don’t move here), but the numbers show the real culprit for runaway pricing is clearly old stock speckers.

By bringing in an anti-foreigner tax, spanking realtors, extending rent controls and opening the door to an empty-houses levy, the province missed stomping out the hot coals responsible for this conflagration. Forget Chinese dudes, assignment clauses or rich people with a downtown condo for game nights – prices have romped because GTA properties (ditto in Vancouver) are now an asset class, and part of the futures market.

The mania to acquire real estate is unlikely to be abated by anything last week delivered. Nothing Ontario did will immediately reduce sales or prices. A detached house will be just as unaffordable to the average moister couple in July as it was in March. That won’t change until government has the backbone to create a serious speculation tax of the kind last seen in the 1970s.

On April 9th, 1974, out of the clear blue came a bolt of lightning that sautéed the rear end of every speculator and multiple-property owner in Ontario. The province imposed a 50% tax on any and all profits an investor might realize from the sale of any piece of real estate. The only exceptions – your farm or your principal residence. And this was on top of the federal capital gains tax.

It was an astonishing thing for a conservative government to do, but it worked. Sales collapsed overnight. Within days, prices followed. The 30% year/year price gain which triggered this draconian action (currently the bloat is at 33%) was arrested, then interest rates started to rise and the party was truly over. Real estate remained relatively affordable until the next bubble formed in the mid-1980s (burst by mortgage rate hikes in the early 1990s).

The hate mail this pathetic blog has garnered over the past three years of suggesting locals, not dudes from Guangdong, were responsible for peak house, is impressive. I’ve been told what to do with literally every orifice on my bronzed, taut body. Most Canadians have bought into the meme that shadowy foreigners and traitor realtors have conspired to steal houses so they can launder their stolen fortunes. They want to believe it. They hate people who reject it. Life’s so much more understandable when you’re a victim.

Well, victims they are. Of their own frenzied obsession with dirt.

Renters are discriminated against. Household debt levels are reckless. Our media’s obsessed (“Buy now or risk saying bye-bye to affordable Montreal home ownership,” said the Gazette on the weekend). Our kids have turned into condo junkies. Financial balance has been sacrificed on the altar of potlights and polished cement. Worse, this bubble we’ve created for ourselves has turned many of us into xenophobes, racists and generally despicable, envious, venomous people. So we get the government actions we deserve – a tax on foreigners and collars on agents. Tomorrow houses will cost a little more. Risk on.

Frenzied buyers line up outside the sales office of Brad Lamb’s latest condo development on the weekend in downtown Toronto.

Perlsphere: Specifying the type of your CPAN dependencies

This is the third article in a series on CPAN distribution metadata. The first article was a general introduction, and the second article looked at dependencies, and in particular the different phases that you can specify dependencies for (configure, build, runtime, test, and develop). In this article, we'll cover the different types of dependencies and how you combine these with the phases (described in the previous article) to specify the dependencies (or prereqs) for your CPAN distribution.

This article is brought to you by MaxMind, a gold Sponsor for this year's Toolchain Summit, being held next month (May) in Lyon, France. The summit is only possible with the support of our sponsors.

Dependency Types

Most of the time, dependencies for CPAN distributions are required. This means that your distribution won't be installed unless all required dependencies can first be installed, as it's assumed that your module(s) won't work if any of the dependencies is missing.

But sometimes you can end up with optional dependencies. This might be to provide an optional feature, such as support for a number of export formats, or it might be an optional XS implementation for extra speed, falling back on a pure-perl implementation.

There are four different dependency types: requires, recommends, suggests, and conflicts. We'll look at each of them in turn. In the discussions below, we'll refer to the target distribution: this is the distribution being installed, and for which prereqs are being checked. To keep the examples below simple, we'll assume that the target distribution has a Makefile.PL based on ExtUtils::MakeMaker.

You should also be aware that the spec refers to these as dependency relationships, rather than types.
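To make the combination concrete, here is a hypothetical prereqs section showing how phases and relationship types nest in META.json. The module names and versions are invented purely for illustration:

```json
"prereqs" : {
    "runtime" : {
        "requires" : {
            "Some::Module" : "1.23"
        },
        "recommends" : {
            "Some::Faster::Module" : "0"
        }
    },
    "test" : {
        "suggests" : {
            "Test::SomeExtra" : "0"
        }
    }
}
```

Each phase key (configure, build, runtime, test, develop) can hold any of the relationship types, and each type maps module names to minimum versions, with "0" meaning any version will do.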

Requires

Required dependencies must be installed before the associated phase can be run for the target distribution. So if the target distribution has a required configure module, then that module must be installed before running

perl Makefile.PL

And if the required configure dependency can't be installed, then the installation of the target distribution must fail at that point.

Our example target module has a required configure dependency on ExtUtils::MakeMaker, so you'll see the following in META.json:

"prereqs" : {
    "configure" : {
        "requires" : {
            "ExtUtils::MakeMaker" : "6.3"
        }
    },
    ...
}

The great bulk of dependencies on CPAN have type "requires", so we'll not dwell on these any further.

Recommends

The spec says:

Recommended dependencies are strongly encouraged and should be satisfied except in resource constrained environments.

The interpretation of this is left up to individual CPAN clients:

  • For the CPAN module, and the cpan script front-end, it depends on whether you've set the configuration option "recommends_policy". If it's not set, or set to a false value, then any recommended prereqs are just ignored. If it's set to a true value, then CPAN will treat any recommended prereqs like requires: if they can't be installed, then the target distribution won't be installed. The default used to be for recommends_policy to not be set, which is the same as a false value. In recent versions of CPAN it now defaults to true, but only if you're installing CPAN for the first time; if you've already got CPAN installed, updating to a more recent version won't change this config setting.
  • By default, cpanm will similarly ignore any recommended prereqs, unless you give the --with-recommends command-line option. If you do give the option, cpanm will try to install recommended prereqs, but if they fail, installation of the target dist will continue.

So by default, recommended prerequisites are ignored. More on that below.

Let's have a look at some distributions with recommended dependencies.

JSON::MaybeXS is a Perl module which will use the best available module for processing JSON. First it will use Cpanel::JSON::XS if that's available, then it will try JSON::XS, and if neither of those are available, it will fall back on JSON::PP, a pure-perl implementation that has been a core module since Perl 5.14.0.

If you look at the runtime prereqs in META.json, you'll see:

"runtime" : {
   "recommends" : {
      "Cpanel::JSON::XS" : "2.3310"
   },
   "requires" : {
      ...
      "JSON::PP" : "2.27300",
      ...
   }
},

Cpanel::JSON::XS is a recommended, but optional, dependency, and JSON::PP is a hard requirement. JSON::XS isn't listed as any kind of dependency: Cpanel::JSON::XS is a fork of JSON::XS. So, if the former couldn't be installed, it's unlikely a CPAN client will be able to install JSON::XS (but if JSON::XS is already installed, then it will be used in preference to JSON::PP).

But this isn't the whole picture! If you look at META.json, you'll see

"dynamic_config": 1,

As we covered in the first article of this series, this tells the CPAN client that dependencies need to be generated on the target platform, so it must first run:

perl Makefile.PL

And then use the metadata generated in MYMETA.json, rather than the META.json included in the release. And if you look at Makefile.PL, you'll see that it checks to see whether the target platform can handle XS modules. If it can, and JSON::XS isn't already installed, then Cpanel::JSON::XS is changed to be a required dependency.
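The idiom looks roughly like the following. This is a hedged sketch, not the actual JSON::MaybeXS Makefile.PL; can_xs() here is a stand-in for whatever compiler-detection helper the distribution really uses:

```perl
use strict;
use warnings;
use ExtUtils::MakeMaker;

# Stand-in for the distribution's real compiler check; many dists
# bundle a helper like this (for example one based on
# ExtUtils::HasCompiler).
sub can_xs {
    return eval {
        require ExtUtils::HasCompiler;
        ExtUtils::HasCompiler::can_compile_loadable_object(quiet => 1);
    } ? 1 : 0;
}

my %WriteMakefileArgs = (
    NAME      => 'JSON::MaybeXS',
    PREREQ_PM => {
        'JSON::PP' => '2.27300',
    },
);

# If the target platform can build XS modules and JSON::XS isn't
# already installed, promote Cpanel::JSON::XS from a recommended
# prereq to a required one.
if ( can_xs() && !eval { require JSON::XS; 1 } ) {
    $WriteMakefileArgs{PREREQ_PM}{'Cpanel::JSON::XS'} = '2.3310';
}

WriteMakefile(%WriteMakefileArgs);
```

The CPAN client then reads the resulting MYMETA.json, in which the promoted dependency appears under "requires" rather than "recommends".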

David Golden's CPAN::Visitor is an implementation of the visitor pattern, for iterating over a CPAN repository. In its test prereqs, we see:

"recommends" : {
    "CPAN::Meta" : "2.120900"
},

This is an optional requirement for the test t/00-report-prereqs.t, which will still run without CPAN::Meta, but won't do everything. This test is generated by the Test::ReportPrereqs plugin, which is included in DAGOLDEN's Dist::Zilla plugin bundle. When you run "make test", this test runs first and lists the distribution's prereqs.

Suggests

The spec says:

These dependencies are optional, but are suggested for enhanced operation of the described distribution.

Again, the interpretation is left up to CPAN clients, with the two main clients having the same behaviour as for recommends:

  • By default, CPAN/cpan won't try and install these dependencies, but you can change that behaviour with the "suggests_policy" configuration option.
  • cpanm similarly won't try and install these dependencies, unless you give the --with-suggests command-line option.

Let's look at some suggested dependencies.

The Time::Stamp module provides functions for generating and parsing timestamps. If you have Time::HiRes installed, then you'll get fractional seconds in timestamps, but integer seconds otherwise. So Time::HiRes is specified as a suggested runtime prerequisite. It's been a core module since Perl 5.8.0, so for most people the suggested dependency is irrelevant (though an appropriate use of "suggests").

Module::Pluggable provides a mechanism for a module to easily support plugins. This works with App::FatPacker (a module which lets you roll up all the module dependencies for a script into a "packed" version of the script), and has a test that confirms this. So you can see in the metadata that App::FatPacker is a suggested test dependency.

Conflicts

This one is different. All the previous relationships are for expressing a module that will, or may, be used by the target distribution. A positive assertion, if you like. But a conflicts prerequisite says that the module must not be present. Here's what the spec says:

These libraries cannot be installed when the phase is in operation. This is a very rare situation, and the conflicts relationship should be used with great caution, or not at all.

With the regular types of prerequisite, it's common to see no version specified (a "0" given instead of a version). But with conflicts, you will almost always see it with a version, as it is generally used to identify a version of a dependency that had a critical bug.

A good example is DBI's metadata, where you'll see the following:

"runtime" : {
    "conflicts" : {
        "DBD::Amazon" : "0.10",
        "DBD::AnyData" : "0.110",
        "DBD::CSV" : "0.36",
        "DBD::Google" : "0.51",
        "DBD::PO" : "2.10",
        "DBD::RAM" : "0.072",
        "SQL::Statement" : "1.33"
    }
}

Normally when you see a version number in prereqs, it means "this version or later", but here it means "not this version, or earlier".

The spec doesn't mention it, but you can also use expressions, rather than just a simple version number. For example, in META.json for Catmandu-PICA you'll see:

"runtime" : {
    "conflicts" : {
        "Catmandu::SRU" : "< 0.032"
    },
    ...
}

This says that version 0.032 and later are ok.

And in MooX-ClassAttribute's META.json you'll see:

"runtime" : {
    "conflicts" : {
        "Moo" : "== 1.001000",
        "MooseX::ClassAttribute" : "<= 0.26"
    }
},

This identifies a conflict with one specific release of Moo, and with all releases of MooseX::ClassAttribute up to and including 0.26.

There are a number of problems with the "conflicts" type, though:

  • In a prereqs statement, a version number normally means "this version or later", but here it's assumed to mean "this version or earlier".
  • When DBI is installed, the user might not have DBD::Google installed. If the user later installs DBD::Google 0.51, will the CPAN client let the user know about the conflict? I'm pretty sure the answer is currently "no".
  • I don't know if all CPAN clients handle expressions instead of version numbers.

The "conflicts" mechanism is now seen as an experiment that didn't work out. It was discussed at the Lancaster QA Hackathon, and again at the Berlin QA Hackathon. The currently proposed solution is an x_breaks key in distribution metadata. No CPAN client implements this yet, but the [Test::CheckBreaks] plugin for Dist::Zilla lets a distribution enforce it, and the Moose distribution includes a moose-outdated script that checks for conflicts.

Conclusion

At the time of writing, there are just over 35,450 distributions on CPAN. Of those, only 2,994 (8.4%) have a recommended dependency, and just 436 (1.2%) have a suggested dependency. And a mere 24 distributions (0.07%) have a "conflicts" dependency.

Given the definition of "recommends" in CPAN::Meta::Spec ("strongly encouraged and should be satisfied except in resource constrained environments"), I think the behaviour of the two most popular CPAN clients (CPAN and cpanm) could be improved. I think a better default behaviour would be:

  • Try to install recommended dependencies, but don't abort the phase if they can't be installed.
  • Don't try to install suggested dependencies, unless the user has explicitly requested that.

I wondered whether there is some reason for the current behaviour, or whether it's just that, when this was spec'd out, no-one was quite sure exactly how these types would be used. I asked Tatsuhiko Miyagawa, the author of cpanm, and he said that circular dependencies were a problem, which he solved with this behaviour.

One approach to avoiding circular dependencies is to install recommended dependencies after the target distribution has been installed. This was also discussed at the Berlin QA Hackathon.

Writing this article has been a real journey. I discovered things I didn't know about, or didn't fully understand. And in some cases where I thought I'd finally got the full picture, it turned out I hadn't! And no doubt still haven't. I'll try to fill some of my remaining gaps at the toolchain summit, and possibly submit some pull requests for documentation updates!

My thanks to everyone who helped with this article, particularly David Golden, Andreas König, Tatsuhiko Miyagawa, and Karen Etheridge.

Specifying dependencies for a CPAN distribution

In the next article in this series, we'll look at the different ways you can specify dependencies for a distribution, depending on which builder you're using.

About MaxMind

Founded in 2002, MaxMind is an industry-leading provider of IP intelligence and online fraud detection services. Thousands of companies of all sizes rely on MaxMind's GeoIP geolocation and end-user data for fraud prevention, ad targeting, content customization, digital rights management, and more. MaxMind's minFraud Network leverages this data along with IP and email reputations established from over 100 million transactions per month to help merchants prevent fraudulent online transactions and account registrations.

A number of CPAN authors work at MaxMind, including Olaf Alders (OALDERS), Florian Ragwitz (FLORA), Mark Fowler (MARKF), Chris Weyl (RSRCHBOY), TJ Mather (TJMATHER), and Mateu Hunter (MATEU).

Olaf Alders, who was the founder of the MetaCPAN project, will be attending the summit.

Our thanks again to MaxMind for supporting the Toolchain Summit.

Instructables: exploring - featured: OUIJA and Heart Etched Pendants

For my birthday one of my presents was a small NEJI laser engraver from GearBest (http://www.gearbest.com/3d-printers-3d-print…/pp_343187.html). Despite being a little tool, and small in comparison to the big machines used in fabrication laboratories, this little engraver is incredibly easy to use and seems ...
By: world of woodcraft

Continue Reading »

Planet Haskell: Roman Cheplyaka: traverse-with-class 1.0 release

I have released the 1.0 version of traverse-with-class. This library generalizes many Foldable and Traversable functions to heterogeneous containers such as records.

For instance, you can apply Show to all fields and collect the results:

{-# LANGUAGE TemplateHaskell, MultiParamTypeClasses, FlexibleInstances,
             ConstraintKinds, UndecidableInstances, TypeApplications #-}

import Data.Generics.Traversable
import Data.Generics.Traversable.TH

data User a = User
 { name :: String
 , age  :: Int
 , misc :: a
 }

deriveGTraversable ''User

allFields = gfoldMap @Show (\x -> [show x]) $ User "Alice" 22 True
-- ["\"Alice\"","22","True"]

You also get a free zipper for your data types.

The main change in version 1.0 is that the constraint with which the traversal is conducted is now specified via a visible type application. Type applications weren’t available when I originally wrote this library in 2013, so back then I used implicit parameters to pass around the annoying proxies.

Thanks to Hao Lian for his help with this transition.


Right after I published this blog post, I saw a tweet about the lack of a sensible Foldable-like instance for tuples.

Guess what, traverse-with-class provides the sensible Foldable-like instance for tuples:

{-# LANGUAGE FlexibleInstances, TypeApplications #-}

import Data.Generics.Traversable

-- U is a trivial constraint satisfied by all types
class U a
instance U a

tupleLength = gfoldl' @U (\c _ -> c + 1) 0 (1::Int, True)
-- returns 2

Trivium: 23apr2017

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Fables



Click here to go see the bonus panel!

Hovertext:
Really, you want to have her in space for this sort of thing, but the detection apparatus is worse when it's dead.

New comic!
Today's News:

Last full day to buy tickets!

Better Embedded System SW: SCAV 2017 Keynote: Challenges in Autonomous Vehicle Validation


Challenges in Autonomous Vehicle Testing and Validation from Philip Koopman


Challenges in Autonomous Vehicle Validation
Keynote Presentation Abstract
Philip Koopman
Carnegie Mellon University; Edge Case Research LLC
ECE Dept. HH A-308, 5000 Forbes Ave., Pittsburgh, PA, USA
koopman@cmu.edu

Developers of autonomous systems face distinct challenges in conforming to established methods of validating safety. It is well known that testing alone is insufficient to assure safety, because testing long enough to establish ultra-dependability is generally impractical. That’s why software safety standards emphasize high quality development processes. Testing then validates process execution rather than directly validating dependability.

Two significant challenges arise in applying traditional safety processes to autonomous vehicles. First, simply gathering a complete set of system requirements is difficult because of the sheer number of combinations of possible scenarios and faults. Second, autonomy systems commonly use machine learning (ML) in a way that makes the requirements and design of the system opaque. After training, usually we know what an ML component will do for an input it has seen, but generally not what it will do for at least some other inputs until we try them. Both of these issues make it difficult to trace requirements and designs to testing as is required for executing a safety validation process. In other words, we’re building systems that can’t be validated due to incomplete or even unknown requirements and designs.

Adaptation makes the problem even worse by making the system that must be validated a moving target. In the general case, it is impractical to validate all the possible adaptation states of an autonomy system using traditional safety design processes.

An approach that can help with the requirements, design, and adaptation problems is basing a safety argument not on correctness of the autonomy functionality itself, but rather on conformance to a set of safety envelopes. Each safety envelope describes a boundary within the operational state space of the autonomy system.

A system operating within a “safe” envelope knows that it’s safe and can operate with full autonomy. A system operating within an “unsafe” envelope knows that it’s unsafe, and must invoke a failsafe action. Multiple partial specifications can be used as an envelope set, with the intersection of safe envelopes permitting full autonomy, and the union of unsafe envelopes provoking validated, and potentially complex, failsafe responses.

Envelope mechanisms can be implemented using traditional software engineering techniques, reducing the problems with requirements, design, and adaptation that would otherwise impede safety validation. Rather than attempting to prove that autonomy will always work correctly (which is still a valuable goal to improve availability), the envelope approach measures the behavior of one or more autonomous components to determine if the result is safe. While this is not necessarily an easy thing to do, there is reason to believe that checking autonomy behaviors for safety is easier than implementing perfect, optimized autonomy actions. This envelope approach might be used to detect faults during development and to trigger failsafes in fleet vehicles.

Inevitably there will be tension between simplicity of the envelope definitions and permissiveness, with more permissive envelope definitions likely being more complex. Operating in the gap areas between “safe” and “unsafe” requires human supervision, because the autonomy system can’t be sure it is safe.

One way to look at the progression from partial to full autonomy is that, over time, systems can increase permissiveness by defining and growing “safe” envelopes, shrinking “unsafe” envelopes, and eliminating any gap areas.

ACM Reference format:
P. Koopman, 2017. Challenges in Autonomous Vehicle Validation. In
Proceedings of 1st International Workshop on Safe Control of Connected
and Autonomous Vehicles, Pittsburgh, Pennsylvania, USA, April 2017
(SCAV 2017), 1 page.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is  granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.
Copyright is held by the owner/author(s).
SCAV'17, April 21-21 2017, Pittsburgh, PA, USA
ACM 978-1-4503-4976-5/17/04.
http://dx.doi.org/10.1145/3055378.3055379

Disquiet: What Sound Looks Like


Belated image for Record Store Day. This is a detail of a 1948 photo by Todd Webb (1905-2000) of 6th Avenue in Manhattan. The full image, a semi-panorama of sorts showing the complete block between 43rd and 44th Streets, including a second record store, is on display currently at the Curator Gallery on West 23rd as part of the exhibit Down Any Street: Todd Webb’s Photographs of New York, 1945-1960, curated by Bill Shapiro. Note the window advertisement above for Brown’s Talking Picture Operating School. That sharp line to the right of the store, between it and the bar newly listing “television” among its attractions, is a cut where two images were placed next to each other to allow Webb to achieve the effect of showing the entire stretch of 6th Avenue as if viewed from across the street.

An ongoing series cross-posted from instagram.com/dsqt.

things magazine: Birthday celebrations

All of the genuinely great works of the 21st century have been acts of digital humanism, the analogue version of which was once a driving force behind mystic intellectualism (via Kottke, which is celebrating turning 19) / a case in point: … Continue reading

Planet Haskell: mightybyte: Talk: Real World Reflex

I recently gave a talk at BayHac about some of the things I've learned in building production Reflex applications. If you're interested, you can find it here: video slides github

Explosm.net: Comic for 2017.04.23

New Cyanide and Happiness Comic

Jesse Moynihan: Turn of Events in the World of Animation

So yesterday I had a second meeting with the guys at Starburns Industries. They produce Rick and Morty and other animated projects past and present (Anomalisa, Moral Orel, etc…). I met up with them about a month ago to see if they wanted to do something with me: either co-producing Manly episodes with Frederator, or […]

Jesse Moynihan: Tower Sketch

Here’s my consolidated notes and an initial sketch. The crown is not quite where I want it but it’s a good first pass. I debated putting the door in there as I think it is a Jodorowsky addition to the Marseille. I can’t find any early evidence of a door in the tower. However I […]

Quiet Earth: Robert Davi, Julie Benz & More Join DARK/WEB Series

Screen legends Robert Davi (Die Hard) and Julie Benz (Training Day) have joined the sprawling cast of Dark/Web as guest stars in the upcoming series from Felt Films.

Davi, who has appeared in over 100 films in his career, plays an ethically challenged doctor in the episode “Transplant”, written and directed by Mario Miscione (“The Vault”) from a story by Tim Nardelli. The episode centers on a terminally ill man in need of a heart transplant. When time runs out, he turns to a black market organ operation on the deep web run by an ethically challenged doctor (Davi); the heart saves his life, but the dark side effects that come with it lead to a search for its true owner.

Julie Benz will appear in the episode "Rideshare", which is [Continued ...]

Paper Bits: Build a Better Monster

Build a Better Monster

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Bad timing

DOUG By Guest Blogger Doug Rowat

Brad Barber and Terrance Odean. Know of them? Perhaps not, but you should.

Mr. Barber and Mr. Odean were researchers at the University of California who conducted a landmark study of investor behaviour during the 1990s. They looked at the trading activity of roughly 78,000 households at a large US discount brokerage over a six-year period. What they discovered was revealing.

Households that traded their stock portfolios the most had, by far, the worst performance. The highest-turnover portfolios had an average annualized return of 11.4%, but the lowest-turnover portfolios had a significantly better return of 18.5%. (If these returns seem impressive, it’s because these were the bull-market years of the 1990s.) The annualized blended-benchmark return (buy and hold) was roughly 18%. A conservative stock portfolio might have annual turnover of 20%—the highest-turnover portfolios in the Barber and Odean study were realizing this every MONTH.

So what were these frequent traders doing wrong? The mistakes were numerous. Certainly overestimating their ability to determine market direction, an enormously complex task, was a key factor in their underperformance. Overconfidence led them astray, essentially. But there were other errors. They also traded emotionally. For instance, they held stocks that had recently underperformed the market and sold their winners, which was “consistent with the evidence that individual investors tend to hold their losers and sell their winning investments”. Our experience with our own clients is similar. When a client wants to raise money from their portfolio, often they suggest selling only the positions that have performed well, ignoring the fact that positive fundamentals are likely the reason for the strong performance and that it could easily continue. Repeatedly selling your winners usually only serves to blunt momentum.

Another amazing revelation of the Barber and Odean study was how poorly diversified most of these stock portfolios were. The mean household in their study held only 4.3 stocks! Recall the blog post I wrote a few months ago (http://www.greaterfool.ca/?s=hail+marys) highlighting that the odds of any one stock suffering a catastrophic loss (a 70% drop) is about 40%. If you have only a four-stock portfolio, you’re living dangerously.

Much of what Barber and Odean concluded has been supported by other analysis. For instance, Blackrock notes that you only need to miss a handful of strong market days to absolutely cripple your long-term portfolio performance (see chart). Over thousands of trading days do you believe that you’ll be able to accurately determine the few strongest market days and, by corollary, the few weakest days? Get over yourself.

Investment of $100,000 in S&P 500 (1995-2014): Market Timing is Pointless.

Source: Blackrock

Interestingly, Barber and Odean also determined that men trade 45% more often than women. And, lo and behold, they underperformed women in terms of net returns. Sadly, gentlemen, overconfidence once again gets the better of us. As Barber and Odean describe it:

Overconfident investors believe more strongly in their own valuations, and concern themselves less about the beliefs of others. Overconfident investors…hold unrealistic beliefs about how high their returns will be and how precisely these can be estimated.

Put another way, even though your buddies have advised you that you’re an ugly slob, you still decide to ask Scarlett Johansson out on a date. You can, of course, do this, but just be aware that she’ll tell you to get lost every time (or call the police).

I don’t highlight the Barber and Odean study to suggest that portfolios should never be traded. They should. It makes sense, for instance, to periodically rebalance your portfolio and also to subtly shift asset and geographic weightings in response to broad changes in economic conditions or interest-rate outlook. But having 200% portfolio turnover because you view yourself to be a human algorithm? This is a poor strategy.

Dial it back, flash boys.

Doug Rowat, FCSI® is Portfolio Manager with Turner Investments and Senior Vice President, Private Client Group, Raymond James Ltd.

Daniel Lemire's blog: Science and Technology links (April 21st, 2017)

Can we trust software? Lance Fortnow, a famous computer scientist, answers:

Sometimes I feel we put too much pressure on the machines. When we deal with humans, for example when we hire people, we have to trust them, assume they are fair, play by the rules without at all understanding their internal thinking mechanisms. And we’re a long way from figuring out cause and effect in people.

Luc Charlebois is a real-life Deus Ex character. He lost his leg ten years ago in a bike accident; the leg was simply torn off. He went to Australia, where his leg was remade from scratch: he now has an artificial leg that is directly connected to his skeleton. He can walk again and says that when he does so, it “feels” like a true leg. That last part is critically important: we are a long way from wooden pegs. But he had to spend a quarter of a million dollars to get it done. That’s not crazily expensive, but it is clearly out of reach for too many people. Given that this is now possible, how long before we all ask for such high-quality artificial limbs?

Apple (the company) is allowed to test self-driving cars in California. “Siri, bring me back home.”

Daniel Lakens, an academic with a brilliant publication record, writes that blog posts are of higher scientific quality than journal articles. What he actually demonstrates is that it is quite easy to outdo scientific articles with something as silly as a blog post.

Using genetic engineering, we could one day selectively kill just one type of bacteria. This would make modern-day antibiotics look like blunt tools by comparison.

Want a growing industry? What about plastic surgery?

Somewhat mysteriously, obesity is a risk factor for poor bone health. Beyond a certain point, the fatter you are, the more likely it is that your bones will break. That’s true even though heavier individuals tend to have higher bone mineral density. We simply do not know why.

Currently, anesthesia is full of negative side-effects. Scientists are finding out that we could design better drugs that are free of these inconvenient side-effects.

The lenses in our eyes tend to darken with age, leading to a loss of vision called cataracts. It is considered more or less unavoidable: if you are old enough, you will have cataracts. We can simply replace the natural lens of the eye, so it is not a crippling condition. There are several risk factors, such as diabetes, age, and sunlight, but wearing sunglasses, avoiding donuts, and staying inside won’t prevent cataracts. So what causes them? According to Beebe et al., it is the exposure of the lens to oxygen. Under normal conditions, in healthy young individuals, there is very little oxygen around the lens. But a liquefaction of the vitreous body of the eye, or hyperbaric oxygen therapy, can expose the lens to oxygen, leading to cataracts. Basically, your lenses don’t interact well with oxygen. There is some evidence that if we stop the exposure to oxygen, the lens can recover. Interestingly, we know how to repair the vitreous body with synthetic gels, so it is conceivable that we could one day develop preventive therapies against cataracts. At a high level, it might be as simple as keeping the vitreous body of the eye intact. The problem right now is that no doctor can tell you how much oxygen your lenses are exposed to, so it is a difficult problem to study and catalog.

Netflix is reaching 100 million subscribers. This is far, far ahead of any cable TV company. In fact, cable TV companies are losing subscribers. Television is dying.

Following speculative claims that bees were being wiped out by the newest pesticides, neonicotinoids, the European Union banned these pesticides. Contrary to pesticides that you spray on the field, neonicotinoids are applied specifically on the seeds. These pesticides remain in wide use in Australia and North America, where both wild and honey bees are doing fine. Here is Matt Ridley on the consequences of the European ban in the Times:

In Britain, (…) farmers have more than quadrupled the number of insecticide applications on oil-seed rape (from 0.7 to 3.4 per growing season), but pest pressure has increased. Meanwhile, recent studies have demonstrated that declines among wild bees are driven mostly by land use changes and have not increased since neonics were introduced in the 1990s (…) This makes sense because neonics are mostly used as seed dressings, absorbed into the plant from germination, rather than sprayed on a growing crop. This makes them more lethal to pests such as flea beetles that eat the crop but less dangerous to innocent bystanders, including bees that collect pollen and encounter lower doses.

We ought to be critical of new technologies, but there is a difference between rejecting progress and being cautious. Some people do not want technological progress and they have much clout.

In diseases like Alzheimer’s, the environment of the brain deteriorates to the point that brain cells die. This leads to brain shrinkage and to cognitive decline. We are having a really hard time fixing the environment of the brain. However, some clever scientists found that we could stop the cells from dying. The brain might still be full of bad proteins, but the brain cells can still be coerced into surviving. In early work, the researchers achieved this effect using a drug that would be toxic to human beings. However, Halliday et al. have now shown that we can get the same good effect with safe drugs that are already used in human beings. Speculatively, this means that if we can detect in time that the environment of your brain is becoming toxic for some cells, we could give you one of these drugs to prevent your cells from dying. This would not qualify as a cure for, say, Alzheimer’s, but it could turn it from a death sentence into something we can manage with medication and monitoring.

For a time, it looked like the blood (actually the plasma) of young people could rejuvenate older people. This led to a good deal of unwarranted mockery. There is still much ongoing work in this direction, but I expect that it will end up being a dead end because, as suggested by work from the Conboy laboratory in Berkeley, it seems that we have “aging factors” in our blood… and not so much “youth factors”. Specifically, as we age, we might get too much of some factors in our blood. Thus, we could age younger people with the blood of older folks, but the other way around is unlikely to work. What we might need to do instead is identify and normalize the aging signals in our blood.

The classic game Starcraft is now available for free, for both PCs and Macs.

We have this model of reality where we are one person throughout time. So I remember the teenager I once was, and I think of him as “me”. This may actually be true only in a very tenuous manner. Psychologists have found that there is essentially no correlation between your personality as a kid and your personality as an elderly person. In the short term, you remain who you are, but your personality progressively changes, and there does not seem to be any solid long-term anchor. The teenager I was? In a very real sense, he is dead. I no longer think like he did, not in any meaningful way. It also means that even if I remain healthy for a very long time, who I am today will die over time and be replaced by someone else. Thus you cannot endure as an individual. I view this as a good thing.

Apple co-founder, Steve Wozniak, predicts that by 2075, we will have a colony on Mars.

Naked mole rats are long-lived tiny mammals. We still do not quite understand why they live so long. Interestingly, they can survive without any oxygen at all:

When the oxygen was completely removed and replaced with nitrogen, the mice died after 45 seconds. The naked mole rats passed out. But even after 18 minutes of no oxygen, they recovered when they were put back in normal air.

Why should you care about naked mole rats? Well, they are mammals not very different from us. They mostly have the same genes we do, plus or minus a few. If we can better understand how the cells of naked mole rats survive without oxygen, we could use technology to mimic them. So, eventually, our cells could be taught to survive with very little oxygen. This would make us far more robust.

All Content: Tribeca 2017: “Gilbert,” “Hounds of Love,” “Aardvark”


A great film festival offers movie lovers a wide array of offerings on its opening weekend. Fests like Sundance and Toronto are downright overwhelming in just their first 72 hours, presenting dozens of films from around the world and from every possible genre. Tribeca, which started this week in New York City, has a similar model, presenting an extremely diverse slate of films. The first three I’ve seen from this year’s iteration couldn’t have less in common, and yet they all now share the Tribeca seal of approval.

The best of the three is Neil Berkeley’s “Gilbert,” a documentary about the singular Gilbert Gottfried, a stand-up comic and actor like no other. The love for Gottfried in both the filmmaking and the interview subjects assembled to discuss his life is palpable. This is not just a piece of fan service—it’s something of a love letter, a piece that humanizes a very private celebrity, and reveals the complexity and daring of his art. Gottfried is one of those comedians who is stunningly unafraid on stage. He does not care if you hate what he’s doing. In fact, he may like it more. And yet he’s not an abrasive, aggressive loudmouth off stage. He’s a father and a husband, an often-shy man—he just happens to be willing to tell the jokes that others won’t. Sometimes that willingness gets him into trouble, and “Gilbert” captures every side of Gottfried, including the one who sometimes fucks up.

At its best, “Gilbert” feels like a tribute to Gottfried, in which his colleagues, friends, and family members tell stories about the man. Legends show up—including Bill Burr, Artie Lange, and Jay Leno—but Berkeley wisely keeps returning to the people who know Gilbert best, including his sister and his wife, Dara. It’s often a love story about Dara and Gilbert, a man no one thought would settle down. “Gilbert” hits all the expected beats—“Aladdin,” “The Aristocrats,” the Aflac firing—but it also offers depth to a man who has stayed out of the spotlight. We learn about his never-approving father, who died when Gilbert was only 18. The idea that his pop never saw him become successful arguably fueled much of his passion to succeed. He’s trying to impress someone who’s gone. And the film is personal and moving when discussing the death of Gottfried’s mother. I have to believe that “Gilbert” proves, without question, that both of them would be incredibly proud.

On a totally different wavelength than possibly anything else you’ll see this year is Ben Young’s directorial debut, the harrowing and disturbing “Hounds of Love.” There are certain films that a critic can respect for the quality of filmmaking on display and still never want to see again. “Hounds of Love” falls into that category, artistically shining a light on a very disturbing, violent scene in an Australian suburb in the ‘80s. It’s a film about rape, kidnapping, and abuse that will shock pretty much anyone who sees it, but it’s also a film that makes it very clear that its auteur has a great eye for composition and a daring approach to storytelling.

That approach could be called unapologetic. “Hounds of Love” is about a couple—John and Evelyn White (Stephen Curry & Emma Booth)—who have a sociopathic hobby. In early scenes with them, Young peppers the audio with the sounds of clanking handcuffs in the background and a woman’s muffled cries. He regularly returns to shots of blood splatter on the floor, allowing your imagination to fill in the horrifying gaps not shown on screen. They’re serial killers. However, we quickly come to discern that Evelyn is a victim too. When they kidnap Vicki (Ashleigh Cummings) and begin their torturous routine, she realizes this as well, trying to drive a further wedge between the two of them. In a sense, both Vicki and Evelyn are victims of John’s abuse, just to different physical degrees. Evelyn’s handcuffs are emotional and mental.

All three of the leads here are very committed, but Emma Booth captivates as a woman who was clearly once one of John’s victims and is now stuck in his spiral of madness. It’s a daring, physical performance, as we watch a woman unravel. Young also displays a strong use of space, shooting long takes down hallways and through doors, making the home itself menacing. And he uses music well, particularly a Tangerine Dream-y score that heightens the tension and a great use of Joy Division near the end. However, no Kate Bush.

Finally, there’s the relatively star-studded “Aardvark,” a film that hits more of the “Festival Expectation” beats than the other two but doesn’t really deliver anything memorable. This is an odd little dramedy about brothers, fame, and finding love that never comes together as more than a series of quirky moments. A few of the performances elevate it slightly, but it’s shockingly forgettable and too tonally inconsistent to register as more than an interesting failure.

Josh Norman (Zachary Quinto) is a troubled man. He’s an introvert who struggles to get through daily life, but he’s started seeing a therapist named Emily (Jenny Slate) for help. He reveals to her that he has a famous brother named Craig (Jon Hamm), who just happens to be in town for a visit. Or is he? Josh “sees” Craig everywhere, convinced that his actor sibling could be anyone he meets, including a homeless woman and a police officer. In other words, he hallucinates that his brother is everywhere. Josh is an introvert; his brother works in the very extroverted profession of acting. How will these two impact each other?

The answer is not much at all. Writer/director Brian Shoaf spins the story in another direction, allowing Craig and Emily to develop a love story while Josh also gets a romantic arc with a mysterious woman (Sheila Vand), whom he meets in such a way that we have to wonder if she’s real or not. The quartet of actors give it their all, and I really hope to see the charismatic Vand (“A Girl Walks Home Alone at Night”) in more soon, but “Aardvark” is comically and emotionally limp. It’s a film that feels as uncertain about where it’s going as its protagonist.


Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Dungeon Classes



Click here to go see the bonus panel!

Hovertext:
Anyone caught emailing me in regards to the accuracy of today's comic shall be tarred, feathered, and made to carry a sign that reads 'No fun.'

New comic!
Today's News:

Last full day to get your BAHFest East tickets! We moved over a bunch of cheap tickets, but after these are gone, there are no more!

Also, in case you missed it, I'll be signing books prior to the show at MIT Press Bookstore, from 3-430. If you don't want to wait in line after the show, this is the way to go. <3

 

Open Culture: Hear Four Hours of Music in Jim Jarmusch’s Films: Tom Waits, Iggy Pop, Neil Young, Screamin’ Jay Hawkins & More

“I gotta say — not to rant, but — one thing about commercial films is, doesn’t the music almost always really suck?” Jim Jarmusch, director of films like Stranger Than Paradise, Mystery Train, Broken...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

Explosm.net: Comic for 2017.04.22

New Cyanide and Happiness Comic

Penny Arcade: News Post: Evolution

Tycho: I’m still on the prowl for a Switch, so I can’t speak to it personally, but Gabe is having a premium experience with the new Wonder Boy.  That just reinforces two things I thought already: one, I need to steal a Switch from one of the other people here, and two, a lot of the stuff I’m going to like best for it isn’t going to be on a cartridge.  I guess this all leads into a corollary, or maybe even a Point Three, where if you don’t have a plump SD card slotted in that thing it’s entirely possible that ur doin it rong. Mike…

Perlsphere: Perl 6 Performance and Reliability Engineering: Grant Report

This is a grant report by Jonathan Worthington on his grant under the Perl 6 Core Development Fund. We thank the TPF sponsors for making this grant possible.

I have completed the second 200 hours of my Perl 6 performance and reliability engineering grant, funded by the Perl 6 core development fund. This report summarizes the work done during those 200 hours. In accordance with community feedback, the vast majority of effort has been put into reliability rather than performance.

Concurrency robustness

The main area of focus in this grant period has been making Perl 6's concurrency support more robust. While work remains to be done, the improvement over the last several months has been noticeable. It is also an important area for me to focus on, given the small number of people in the community with the skills, time, and patience (or, perhaps, stubbornness) to track down and resolve these problems. Here is a summary of the issues resolved.

  • Fixed a bug affecting use of callwith in multiple threads
  • Fixed RT #128809 (closure-related bug involving s/// construct, which showed up in concurrent scenarios)
  • Fixed RT #129213 (synchronous socket accept could block GC in other threads, thus blocking program progress)
  • Determined RT #128694 fixed, added test (zip-latest on two intervals would hang)
  • Eliminated use of in-place rope flattening, which violated the immutability of strings and could thus cause various crashes (especially in hash access); this resolved many failures, amongst them the one reported in RT #129781, and also made hash lookups using rope string keys more efficient as a bonus
  • Fixed warnings due to over-sharing of $/ between threads when using grammars in parallel (mostly fixed RT #128833)
  • Fixed a Lock.protect bug when we unwound the stack due to control exceptions (discovered independently of, but also resolved, RT #126774)
  • Fixed RT #129949 (GC crash resulting from missing rooting of sent value in concurrent blocking queue)
  • Fixed RT #129834 (sporadic behavior when concurrently creating Proc::Async objects and obtaining handles)
  • Audited and fixed vulnerable cases of the once construct
  • Fixed RT #129994 (long-lived native call on one thread could block GC in other threads)
  • Fixed RT #125782 (uninformative error reporting when a Promise is broken)
  • Examined RT #127960; concluded it is fixed, but added it as a stress test since it's a bit distinct from the other test for the same underlying bug
  • Fixed a bug where method caches could be revealed to other threads before they were fully deserialized, causing falsely missed lookups
  • Fixed a data race inside of the NativeCall setup code
  • Fixed RT #130064 (trying to rethrow an exception that was never thrown before leaked an internal error; this showed up in Promise.break("foo"))
  • Fixed scoping/cloning problem with LAST/NEXT/QUIT phasers in supply, react, whenever, and for constructs
  • Fixed a bug with QUIT phasers mishandling exceptions thrown synchronously with the .tap
  • Switched to using Supplier::Preserving on the taps of stdout/stderr in Proc::Async, to avoid various innocent-looking usage patterns losing output
  • Fixed RT #128991 (messages could seep out of a supply block even after it was considered done)
  • Fixed a GC corruption bug involving Proc::Async that caused occasional crashes
  • Tracked down and fixed two data races in the supply/whenever implementation and in Supply.interval
  • Fixed RT #130168 (Supply.interval(...) with very small interval would only ever emit 1 value)
  • Fixed interaction of native callbacks, GC blocking, and embedding, which afflicted Inline::Perl6
  • Fixed use-after-free that could occur as part of inlining fixups when in a multi-threaded program
  • Fixed precompilation of the OO::Monitors module
  • Fixed RT #130266 (premature frees of handles in various async socket error handling cases)
  • Fixed SEGVs when GC stressing was applied to S15-nfg/many-threads.t and S17-supply/syntax.t
  • Fixed incorrect reporting of some errors on threads, which could show up as if they were compile-time errors
  • Fixed thread safety issue in the >>.foo implementation
  • Fixed a miscompilation of ||=, &&=, and //=, making them a good bit more efficient along the way
  • Added various bits of missing concurrency control in precompilation management, thus fixing parallel use of precompilation (this will help towards faster p6doc builds)

String decoding improvements in asynchronous I/O

Previously, decoding of bytes to strings for both IO::Socket::Async and Proc::Async was done at the VM level. This created a number of fragilities with regard to decoding errors. Due to time constraints, different encodings besides UTF-8 had not been implemented for these classes either, leaving users of them to do decoding manually if they needed anything else.

To rectify these issues, I first made the VM-backed decoders directly available to userland. These will, in the future, be exposed as a Perl 6-level API, and we'll support user-space encodings. For now, it meant I could move the code that orchestrates the decoding of strings in async I/O into Perl 6 space, fixing the robustness issues. This also means that string decoding for different spawned processes and socket connections can be done in the thread pool, rather than using the event-processing thread. Along the way, I added support for different encodings.
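The core fragility is that a multi-byte character can be split across two reads, so the decoder must carry state between chunks. A minimal analogy in Python, using its incremental UTF-8 decoder (this illustrates the general problem, not the Rakudo/MoarVM code):

```python
import codecs

# A two-byte UTF-8 character split across two "reads"
raw = "é".encode("utf-8")
chunks = [raw[:1], raw[1:]]  # neither chunk is valid UTF-8 on its own

# The incremental decoder buffers the partial lead byte between chunks
decoder = codecs.getincrementaldecoder("utf-8")()
out = "".join(decoder.decode(chunk) for chunk in chunks)
out += decoder.decode(b"", final=True)  # flush any trailing state
print(out)
```

A stateless per-chunk decode would raise an error (or emit replacement characters) on each half, which is precisely the kind of failure that pushing decoding into a stateful, language-level layer avoids.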

Finally, there were some issues around the way async sockets and processes worked with regard to NFG. I resolved these issues and made sure there was test coverage of the various edge cases.

Non-blocking await and react support

I did the initial round of work to provide support for non-blocking await and react. At present, these constructs will block a real OS thread, even if used on a thread in the thread pool. The changes, available via use v6.d.PREVIEW, mean that thread-pool threads will be returned to the pool to do other work, and the code following the await or react will be scheduled once the result is available or processing is complete. This is implemented using continuations (much like gather/take, except in this case the continuation may be resumed on a different OS thread). The result is that Perl 6 programs will be able to have hundreds or thousands of outstanding reacts and awaits, with just a handful of real OS threads required to process them.

This is just the initial implementation; further work will be required to make this feature ready to be the default in Perl 6.d.
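The behavior described above is analogous to what Python's asyncio does with coroutines: many outstanding awaits are multiplexed onto very few OS threads, with each continuation scheduled when its result arrives. A sketch of the analogy (Python, not Perl 6):

```python
import asyncio

async def worker(i):
    # The await suspends this coroutine; the OS thread is free to run others
    await asyncio.sleep(0.01)
    return i * 2

async def main():
    # A thousand outstanding awaits, multiplexed on a single event loop
    return await asyncio.gather(*(worker(i) for i in range(1000)))

results = asyncio.run(main())
print(len(results), results[:3])
```

The thousand sleeps overlap rather than queue, so the whole batch completes in roughly the time of one sleep; a design that blocked a real thread per await would need a thousand threads to do the same.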

Memory leak fixes and memory use improvements

The highlight of the memory management improvements was a simplification to the lifetime management of register working sets in MoarVM. This resulted from the elimination of a couple of speculative features that were not yet being utilized by Rakudo, and in one case never would have been anyway. Coupled with a range of cleanups and some code streamlining, the result was a 10% reduction in peak memory use for CORE.setting compilation, and 20% off the compilation runtime. I also:

  • Fixed a bug that caused bogus multi-dispatch cache misses for calls with many named arguments, leading to the cache growing indefinitely with duplicate entries
  • Fixed a regex interpolation memory leak; it boiled down to unclaimed entries left behind in the serialization context weakhash
  • Fixed leaks of asynchronous task handles
  • Fixed a leak in decode stream cleanup
  • Improved memory allocation measurement in I/O, meaning that full GC collection decisions are made more accurately in I/O-heavy programs
  • Fixed a memory leak involving Proc::Async
  • Fixed a memory leak when a synchronous socket failed to connect
  • Tracked down and resolved the last remaining leaks that showed up in perl6-valgrind-m -e '', meaning it is now clean. (Previously, some cleanup was missed at VM shutdown)

Unicode-related work

I did a number of Unicode improvements, as well as discussing with and reviewing pull requests from a new contributor who is now doing a bunch of Unicode work for Perl 6. My own code contributions were:

  • Initial support for Unicode 9 (updating the character database, basic NFG tweaks)
  • A rewrite of the UTF8-C8 encoding to eliminate various bugs (some falling under RT #128184), including a buffer overrun and not properly round-tripping valid but non-NFC input

Other assorted bugs

I also took care of a range of other bugs, which don't fit into any of the previously mentioned areas of work.

  • Fixed RT #128703 (1 R, 2 R, 3 lost values)
  • Fixed RT #129088 (lack of backtraces for sprintf and friends)
  • Fixed RT #129249 (mis-compilation of /$<cat>=@(...)/)
  • Fixed RT #129306 (error reporting bug involving sub-signatures)
  • Partially fixed and tested RT #129278 (native attributive parameter binding broken) and noted on the RT the less common cases that remain to be fixed
  • Fixed RT #129430 (sigilless parameters were declared too late to use in where clauses)
  • Fixed RT #129827 (sub { 42.return }() ended up being code-gen'd without a return handler)
  • Fixed RT #129772 (poor error reporting when you tried to invoke a native parameter; it blew up code-gen and gave no location info)
  • Tracked down and fixed a pre-comp management bug on Windows due to a file not being closed and then trying to rename over it
  • Fixed RT #129968 (error-reporting crash in the case of redeclaration errors in nested packages)
  • Fixed a bug with augmenting nested classes
  • Fixed RT #129921 (internal warning when producing exception for my $!a)
  • Hunted down a bug in the JIT-compilation of the nqp::exception() op and fixed it
  • Fixed RT #130107 (accidentally treated nread == 0 as an error in a couple of places in MoarVM)
  • Fixed RT #130081 (did not backtrack into a regex TOP in a grammar to try and make it match until the end of the string)
  • Fixed RT #130294 (SEGV that occasionally occurred during some cases of deep recursion)
  • Fixed RT #128516 (SEGV when composing meta-object held in an attribute)
  • Fixed RT #130465 (ignoremark not applied with backslashed literals)
  • Fixed RT #130208 (putting multi-line Pod documentation on a role would pun it)
  • Fixed RT #130615 (code-gen of $a++ in sink context for native $a was a lot slower than in non-sink context)
  • Fixed RT #130637 (two grammar constructs produced malformed NFAs, which gave wrong results or could even SEGV in MoarVM; MoarVM was made to validate NFAs more strongly, which shook out the second issue besides the reported one)
  • Investigated RT #129291 (SEGV involving processes with output of one fed into input of the other); applied likely fix
  • Investigated and fixed an issue with $/ setting, arising from changes to the implementation of match
  • Fixed a bug that occasionally caused spectest crashes; was an interaction between dynamic lexical caching, inlining, deoptimization, and garbage collection
  • Fixed a rare crash related to assignment when the type constraint was a refinement type
  • Fixed MoarVM #120 and #426, in which a failed debug annotation lookup led to boxing a NULL string
  • Fixed a couple of places where dynamic optimization could accidentally trigger garbage collection; the optimizer assumes this can never happen
  • Fixed RT #123989 and RT #125135 (callsame et al could sometimes have the dispatcher stolen by the wrong target invokee)

Other tasks

On top of this, some time was spent reviewing pull requests to Rakudo, NQP, and MoarVM, providing feedback, and merging them when appropriate. I also commented on a range of RT tickets besides those I fixed myself. Various other small cleanups and additions resulted from this work, ranging from typo fixes in comments up to improvements to GC debugging macros added while finding bugs.

Disquiet: What Sound Looks Like


People who try to express information on an XY grid eventually learn this lesson, often the hard way: sometimes the only option for accuracy is to access the third dimension. That realization was made, as well, by whoever was tasked at some point in the distant past with adding a fifth button (yes, fifth — note the semi-obscured circle at the bottom) to this already beleaguered assemblage. It’s unclear if this location is home to two or five individual addresses, or somewhere in between. The bottom set, if you perceive them as a set, could be three iterations of fixing a doorbell’s serial failures: first the main, boxy unit; then the second narrow sliver; then the side button. Then again these could be incremental sublets, the most recent an overpriced closet with the benefit of being near a major public transportation hub. The fact that none of the five buttons is labeled lends some mystery. While we may not know what the landlord is up to, clearly the next logical step is to go full tesseract.

An ongoing series cross-posted from instagram.com/dsqt.

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: The war on wealthy

Jim’s a lawyer and, by several accounts, a good one. But he’s not one of those $700-an-hour dudes with the dedicated parking spot below a bank tower for his Panamera. Instead, he’s an entrepreneur with a partner, one employee and an office above a dry cleaner in the west end. Last year he scored with a personal injury case and made a bundle.

“That kinda thing only comes around once,” he said. “Now it’s back to making way less than my plumber.” Jim operates through a small business corporation, of course. He and Brenda (wife) are the sole shareholders, so they try to be tax-efficient by both taking a combination of salary and dividends from the business.

Sadly, Jim reads this blog. Some posts a few months ago speculating the T2 gang wants to Hoover guys like him got his juices going. So he sent this brief note to his local MP, Arif Virani:

Hello Honourable MP Virani:

Could you please advise as to the government’s position (future plans) regarding the tax treatment of retained earnings for professional corporations as well as your stance on the government’s future plans for same.

Actually he sent it five times. No response. The sixth time he got lucky. The reply below was received yesterday. It’s important enough to every self-employed person that it is reprinted in full.

Thank you for writing to me about the tax treatment of retained earnings for professional corporations.

A recent review of federal tax expenditures conducted by the Government has highlighted a number of issues regarding tax planning strategies using private corporations, which can result in high-income individuals gaining unfair tax advantages. A variety of tax reduction strategies are available to these individuals that are not available to other Canadians. These strategies include:

  • Sprinkling income using private corporations, which can reduce income taxes by causing income that would otherwise be realized by an individual facing a high personal income tax rate to instead be realized (e.g., via dividends or capital gains) by family members who are subject to lower personal tax rates (or who may not be taxable at all).

  • Holding a passive investment portfolio inside a private corporation, which may be financially advantageous for owners of private corporations compared to otherwise similar investors. This is mainly due to the fact that corporate income tax rates, which are generally much lower than personal rates, facilitate accumulation of earnings that can be invested in a passive portfolio.

  • Converting a private corporation’s regular income into capital gains, which can reduce income taxes by taking advantage of the lower tax rates on capital gains. Income is normally paid out of a private corporation in the form of salary or dividends to the principals, who are taxed at the recipient’s personal income tax rate (subject to a tax credit for dividends reflecting the corporate tax presumed to have been paid). In contrast, only one-half of capital gains are included in income, resulting in a significantly lower tax rate on income that is converted from dividends to capital gains.
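
The arithmetic behind that last bullet is easy to sketch. The rates below are illustrative round numbers (a flat 50% top rate is an assumption for the example, not an actual combined federal/provincial bracket):

```python
# Illustrative only: a 50% top marginal rate is an assumed round number,
# not an actual tax bracket. The point is the inclusion-rate mechanics.
TOP_RATE = 0.50
INCLUSION_RATE = 0.50  # only one-half of a capital gain is taxable

def tax_on_salary(amount):
    """Ordinary income: fully included, taxed at the marginal rate."""
    return amount * TOP_RATE

def tax_on_capital_gain(amount):
    """Capital gain: only the inclusion-rate portion is taxable."""
    return amount * INCLUSION_RATE * TOP_RATE

income = 100_000
print(tax_on_salary(income))        # 50000.0
print(tax_on_capital_gain(income))  # 25000.0
```

Under these assumed rates, converting $100,000 of regular income into a capital gain halves the tax bill, which is exactly the incentive the letter describes.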

The Government is further reviewing the use of tax planning strategies involving private corporations that inappropriately reduce personal taxes of high-income earners… The Government intends to release a paper in the coming months setting out the nature of these issues in more detail as well as proposed policy responses…

Yours sincerely,
Arif Virani, MP
Parkdale-High Park

This letter is not written by Virani, of course. Jim’s question was sent by Virani’s staff to the constituent communications division of the Finance Department where the response was crafted then returned to the MP to put into letter form. It comes straight out of the fine print in the 2017 budget. No government backbench MP has the authority or permission to freelance on government fiscal policy – thus what you have in this missive is the Trudeau-Morneau party position. So, Jimbo, look out!

The war against self-employed professionals, small business operators, high-income earners, commissioned workers and anyone who doesn’t toil for a salary is definitely heating up. While Trumphobia kept Ottawa from diddling with the tax laws (plus capital gains and dividends) in this year’s budget, it’s a safe bet the hit’s coming next spring. What can you expect?

  • No more income-splitting with spouses, relatives or children by making them shareholders of a professional corporation.
  • Possible restrictions on the ability of the self-employed to pay themselves through dividends rather than salary (even though there’s no real tax advantage).
  • A tax on retained earnings within a corporation, by ensuring funds invested in a corporate account are taxed at the highest personal level – eliminating the advantage of keeping them there.
  • Hiking the capital gains inclusion rate in selected circumstances to 100%, and in all others to (possibly) 75%.

The appetite of the current government is as insatiable as its spending is profligate. In less than two years we’ve seen a ‘temporary’ deficit to create jobs become one which will be structural for decades. With salaried high-income types already handing over 50% of their incomes, the guns will soon be leveled at the others – doctors, lawyers, sales execs, IT contractors and all who operate through incorporations for a host of reasons, usually getting by with no benefits, no pensions and dubious security.

Record household debt. Record house prices. Record spending. Record tax.

Is it just me, or does it feel like we’re going bigly in the wrong direction?

Perlsphere: Final Grant Report : Migrating blogs.perl.org - April 2017

Work on the blogs.perl.org grant, started in November 2015, has stalled. With no progress reports from the grantee since November 2016, and after a number of attempts on all sides to jumpstart the work, the Grants Committee has voted to cancel the grant, as provided in the rules of operation.

Many on the Committee and in the community would like to see a successful update of blogs.perl.org. With that in mind, the Grants Committee encourages interested parties to consider applying for blogs.perl.org improvement grants in upcoming rounds. The next round of decisions will happen in May. See How to write a proposal for tips, and feel free to reach out to the committee via tpf-grants-secretary at perl-foundation.org.


Perlsphere: Programming language popularity - Stack Overflow

There are many ways to compare the popularity of programming languages, and none of them is perfect. Let's look at one of those imperfect comparisons: tags on Stack Overflow.
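
As a rough illustration of the tag-counting approach, here is a sketch that ranks languages from a payload shaped like the Stack Exchange API's `/2.3/tags` response. The counts below are invented for the example, and the article's real figures will differ:

```python
import json

# Sample payload shaped like a Stack Exchange API /2.3/tags response.
# The counts here are made up for illustration, not real figures.
sample = json.loads("""
{"items": [
  {"name": "javascript", "count": 1400000},
  {"name": "java", "count": 1300000},
  {"name": "python", "count": 900000},
  {"name": "perl", "count": 65000}
]}
""")

def rank_tags(payload):
    """Return (tag, count) pairs sorted by question count, descending."""
    return sorted(
        ((t["name"], t["count"]) for t in payload["items"]),
        key=lambda pair: pair[1],
        reverse=True,
    )

for name, count in rank_tags(sample):
    print(f"{name:>12}: {count:,} questions tagged")
```

The obvious caveat, which the article itself flags: tag volume measures questions asked, not code written, so the ranking is a proxy at best.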

For the full article visit Programming language popularity - Stack Overflow

Quiet Earth: A Pug's Life: GETAWAY DRIVER Burns Rubber [Short in Full]

Writer/director Abner Pastoll has debuted a high-octane short called Getaway Driver. The stylish film, shot over 10 hours in a multi-storey car park in New Malden, south-west London, features vintage cars, one cool pug, and an ending you won’t expect.

“I’ve always loved car chases,” said Abner Pastoll. “Some of my favourite movies, The Driver, Breakdown, To Live and Die in LA, all have seriously epic chases. I wanted to capture something in that vein, a little old-school but unique and quirky.”

The cars used in the film are an orange Ford Capri MkIII [Continued ...]

Colossal: How to Make the World’s Smallest Cup of Coffee

For their brand-new advertisement, Finnish coffee roaster Paulig asked director and animator Lucas Zanotto to brew a cup of coffee from a single bean. Using a nail file to create the grounds, Zanotto then boils water over a single tea light, and finally pours the freshly brewed java one drop at a time into a thimble-sized mug. The video has a direct relationship to the recently popularized miniature cooking videos on YouTube, which have produced everything from miniature deep-fried chicken to tiny shrimp tempura. You can watch more of the Helsinki-based director’s videos on his Instagram and Vimeo, and take a look at Zanotto’s miniature coffee-brewing techniques above.

Quiet Earth: THE X-FILES Returns. Again.

Last year's "The X-Files" revival brought with it both good and bad. It was great to see Scully and Mulder back at it, solving cases, asking questions and generally just being awesome, more seasoned agents dragged once again into the weird shenanigans that brought them together in the first place.


Though I enjoyed the limited series, the run felt a bit vacant, as if Carter brought the characters together to test the waters but didn't have a clear sense of an overall story arc. As a result, the episodes felt, to this fan at least, disjointed and superficial, missing the secret ingredient that made the original series so great.


That said, it seems the return of the series performed above expectation for the studio and Fox has [Continued ...]

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - The Interpretation of Hats



Click here to go see the bonus panel!

Hovertext:
Now to spray sweet white flowers out of this big long wand.

New comic!
Today's News:

Hey geeks! I'll be signing this Sunday at the MIT Press Bookstore from 3:00 to 4:30, and then again after BAHFest. Come say hi!

Penny Arcade: Comic: Evolution

New Comic: Evolution

OUR VALUED CUSTOMERS: Said one high schooler to the other...


Colossal: Dual Bowls: Striking Mixed Metal Bowls Forged With the Ancient Art of Sand-Casting

Fusing ancient techniques with a contemporary aesthetic, Dual Bowls are one-of-a-kind vessels forged from a mixture of recycled brass, copper, zinc, or nickel in this new project from artist Kawther Al Saffar. The bowls are made in partnership with the Alwafi Foundry in Kuwait, which uses a variety of sand-casting methods with sand acquired from the nearby Nile River. Instead of masking or eliminating imperfections left behind by the casting process, Saffar chose to highlight them, giving each bowl a unique design while referencing the inherent complexity of forging a single object from two different materials.

Saffar was born and raised in Kuwait and attended the Rhode Island School of Design where she studied industrial design, and you can see more of her work in her portfolio. Dual Bowls are currently funding on Kickstarter, and it looks like they smashed their funding goal almost immediately.

CreativeApplications.Net: GLASS II – 3D printing glass structures at architectural scale

Created by the Mediated Matter Group at the MIT Media Lab, GLASS II is the group's most recent work in 3D printing optically transparent glass, now at architectural scale.

Electronics-Lab: Raspberry Pi Publishing MQTT Messages to ESP8266

Rui @ randomnerdtutorials.com tipped us with his latest tutorial. He writes:

In this project you’ll create a standalone web server with a Raspberry Pi that can toggle two LEDs from an ESP8266 using MQTT protocol. You can replace those LEDs with any output (like a relay that controls a lamp).

Raspberry Pi Publishing MQTT Messages to ESP8266 – [Link]
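
In practice a project like this would use an MQTT client library (paho-mqtt on the Pi is a common choice), but the wire format being exchanged is simple enough to sketch directly. The following stdlib-only Python builds a minimal MQTT 3.1.1 PUBLISH packet; the topic name is hypothetical, not necessarily the one the tutorial uses:

```python
def mqtt_publish_packet(topic: str, payload: str) -> bytes:
    """Build a minimal MQTT 3.1.1 PUBLISH packet (QoS 0, no flags).

    Sketch only: assumes topic + payload stay small enough that the
    Remaining Length field fits in a single byte (< 128).
    """
    t = topic.encode("utf-8")
    p = payload.encode("utf-8")
    remaining = 2 + len(t) + len(p)  # 2-byte topic-length prefix
    if remaining > 127:
        raise ValueError("single-byte Remaining Length only in this sketch")
    fixed_header = bytes([0x30, remaining])        # 0x30 = PUBLISH, QoS 0
    variable_header = len(t).to_bytes(2, "big") + t
    return fixed_header + variable_header + p

# Hypothetical topic name -- the tutorial's actual topics may differ.
pkt = mqtt_publish_packet("esp8266/led1", "on")
print(pkt.hex())
```

The ESP8266 side simply subscribes to the same topic on the broker and toggles a GPIO pin when a matching payload arrives, which is what makes MQTT a good fit for this kind of LED/relay control.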

The post Raspberry Pi Publishing MQTT Messages to ESP8266 appeared first on Electronics-Lab.

Explosm.net: Comic for 2017.04.21

New Cyanide and Happiness Comic

Ideas from CBC Radio (Highlights): Children of the Fatherland: The Rise of the Extreme Right in France, Part 1

Philip Coulter explores the rise of the right-wing Front National party as France gets ready to elect their next president.

Perlsphere: Grant Extension Request: Maintaining the Perl 5

Tony Cook has requested an extension of $20,000 for his Maintaining the Perl 5 grant. This will allow him to dedicate another 400 hours to this work. During this grant he sent regular reports to the p5p mailing list as well as providing monthly summary reports that have been published on this site, the most recent of which are linked below:

Before we make a decision on this extension, we would like to have a period of community consultation. Please leave feedback in the comments field below or, if you prefer, send email with your comments to makoto at perlfoundation.org. We request that all feedback be sent by April 25th.

If successful, this extension will be funded from the Perl 5 Core Maintenance Fund.

Note: The request was received in February and TPF's internal process took time to post this. Apologies.

Quiet Earth: Monks Embark on a Dangerous PILGRIMAGE [Trailer]

RLJ Entertainment have released the trailer for Pilgrimage. The film will have its world premiere at Tribeca Film Festival on April 23, 2017 and will be in theaters and available on VOD and Digital HD August 11, 2017.

Directed by Brendan Muldowney (Savage) and written by Jamie Hannigan, Pilgrimage stars Tom Holland (Spider-Man: Homecoming), Jon Bernthal (Sicario) and Richard Armitage (The Hobbit).


SYNOPSIS:

Ireland, 1209. A small group of monks begin a reluctant pilgrimage across an island torn between centuries of tribal warfare [Continued ...]

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Cold comfort

It may be no coincidence that Home Capital melted down on the stock market the same day Ontario was trying to hammer housing. Canada’s premier alt lender lost big – down as much as 20% in a single session – after regulators charged the company with fibbing to investors and breaking securities laws. You might recall Home Capital had to shed dozens of brokers who were falsifying mortgage docs, handing out billions to borrowers who didn’t qualify.

Well, the company may not survive. Ditto for a whole whack of small-time speculators who bought condos they plan to flip or rent. At first blush, it looks like the condo market was the big loser in Thursday’s political assault on the free market. Rent controls on new units will virtually guarantee consistent long-term negative cash-flow for investor-owned apartments. Ouch. And since half of recent condo sales have gone to investors, you can imagine the impact.

The condo trade also relies heavily on assignment clauses – allowing buyers to sell their interest in a unit prior to closing. Given the fact it can be three or four years between making a deposit on an unbuilt unit and actually seeing it registered, assignments make sense. There are whole brokerages dealing in nothing but. So now with intense scrutiny and the CRA involved, any gains are likely to end up being taxed as income. Double ouch.

In case you missed it, Ontario did about what was forecast here. A non-resident (foreign buyer) tax of 15% – or $240,000 on the average detached house. Rent controls on every unit, ensuring landlords cannot stay ahead of inflation. Letting cities tax under-used properties – about $1,400 a month on a 416 SFH. And nebulous new regulation for realtors, perhaps ending an agent’s ability to represent both buyer and seller.

So what happens as a result?

In Vancouver sales plunged after similar moves were made and the prices of high-end properties sank. But residential values are sticky, and the consensus seems to be that none of it worked. Average houses remain unaffordable for average families, even after the Chinese dudes took a hike. But the GTA is not YVR. All markets are local. And Van was already rolling over when the province locked out foreign buyers and taxed empty houses.

It’s reasonable to assume the condo market will erode, since rent controls make investing in concrete boxes a losing prop. Controls also discourage purpose-built rental housing, so what the province has just done may actually limit the amount of new units coming to market. As far as the foreign buyer tax goes, realtor stats show only 4.9% of GTA deals were done by non-residents. So logic tells us the impact will be half what it was in Van, where the city had a 9.9% non-resident penetration.

As stated here so often, real estate’s emotional. People get horny to have it (whatever the cost) when markets rise and run screaming from perceived risk when it falls. Rent controls, realtor handcuffs and a tax on foreign guys may not be enough to persuade house-lusty moisters that a slanty semi on a dodgy street isn’t the holy grail. Which is why we could use that tax on stupid.

If you ever need evidence it’s insatiable demand – fueled by the cheap money government has provided – and not greedy, tax-cheating, money-laundering, speculating foreigners, just look at this chart. Personal debt’s over $2 trillion. Two-thirds of it is mortgages. All those orange bars are going wild.

This still ends badly. They know that tonight at Home Capital.

Quiet Earth: Neil Marshall Produced Horror DARK SIGNAL Finally Lands Stateside [Trailer]

Nearly a year after its release in the UK, Edward Evers-Swindell's supernatural horror film Dark Signal will finally be available in the US via the folks at XLrator Media.


The horror movie, about a radio station that is taken over by the ghost of a murdered woman, was of interest to us for a number of reasons, among them the cool concept and an original trailer that suggested a good-looking film. But what really got us excited was the fact that genre master Neil Marshall was on board as executive producer.


The film stars Siwan Morris, Gareth David-Lloyd, Joanna Ignaczewska, Duncan Pow, Eleanor Gecks, Cinzia Monreale and James Cosmo.

[Continued ...]

Michael Geist: Net Neutrality Alive and Well in Canada: CRTC Crafts Full Code With Zero Rating Decision

The CRTC today released the final chapter (for now) in its net neutrality governance framework, creating policy that establishes strong safeguards against net neutrality violations and severely restricts the ability for providers to engage in zero rating practices. When combined with the federal government’s clear support for net neutrality, the Canadian framework is now one of the strongest in the world, providing guidance for the providers and appropriate protections for users and innovative services.

The Commission established its first net neutrality policy response in 2009 with its Internet traffic management practices (ITMP) framework. The rules restrict content blocking or slowdowns and require ISPs to disclose how they manage their networks. The issue expanded into zero rating in 2013, when Ben Klass, a graduate student in telecommunications, filed a complaint with the CRTC over Bell's approach to its Mobile TV product. In January 2015, the CRTC released its decision in the case, siding with Klass. The Commission expressed concern that the service “may end up inhibiting the introduction and growth of other mobile TV services accessed over the Internet, which reduces innovation and consumer choice.”

Today’s decision largely completes the process by providing a framework for examining future zero rating or differential pricing cases (and rejecting Videotron’s music service plan in an accompanying decision). The ruling opens by examining whether differential pricing (of which zero rating is a form) raises concerns regarding preferences or disadvantages.  The Commission concludes that it often does:

differential pricing practices, generally speaking, result in (a) a preference toward certain subscribers over others, (b) a preference toward certain content providers over others, (c) a disadvantage to subscribers who are not eligible for, or interested in, a differential pricing practice offering, and (d) a disadvantage to content providers that are not eligible for, or included in, an offering.

The impact is significant as the Commission notes that it can affect competition, innovation, consumer choice, access and affordability as well as privacy in a section of the decision that comprehensively makes the case for the harms associated with zero rating. For example, with respect to competition, the CRTC states:

The Commission considers that competition in the retail Internet access services sector is best served, and the telecommunications policy objectives set out in the Act are best achieved, when ISPs compete and differentiate their services based on their networks and the attributes of the services on those networks, such as price, speed, volume, coverage, and the quality of their networks.

The Commission also believes that differential pricing practices that favour particular services, technology, or content would generally negatively affect innovation.  On consumer choice, the CRTC is mindful of what consumer groups and pro-net neutrality advocates have warned:

The Commission considers that any short-term benefits of differential pricing practices would be greatly outweighed by the negative long-term impacts on consumer choice if ISPs were to act as gatekeepers of content through their use of such practices

Moreover, given that differential pricing is typically offered for higher tier services, it finds that there was no evidence that it meaningfully increases access.  Interestingly, the Commission also expresses support for the use of VPNs and is reluctant to embrace policies that might discourage their use. The decision states:

The Commission would be concerned, however, if differential pricing practices affected the use of VPNs. The Commission recognizes that VPNs are a legitimate tool to protect sensitive information, as recommended by security firms. While the Commission does not find differential pricing practices to have a direct negative impact on privacy per se, it is concerned that their adoption could discourage the use of VPNs and thus compromise the privacy and/or security of consumers.

Given the concerns and harms associated with zero rating, how to address the issue?

The CRTC rejects category style approaches advocated by some groups, concluding that they would not solve the concerns.  It also rejected calls from some cultural groups for preferences for Canadian content, noting:

Given all the drawbacks and limitations of using differential pricing practices as a way to support and promote Canadian programming, the Commission considers that any benefits to the Canadian broadcasting system would generally not be sufficient to justify the preference, discrimination, and/or disadvantage created by such practices.

Instead, the CRTC has established a framework that bears considerable similarity to its 2009 ITMP approach.  It will allow for a complaints-based mechanism that can lead to an evaluation of whether the differential pricing is compliant with the law.  Given that the Commission rejected many of the proposed categories and exceptions, this will be a difficult standard to meet and there is now considerable guidance for providers.

The evaluation involves four key criteria: agnostic treatment of data, exclusiveness of the offering, impact on Internet openness and innovation, and whether financial compensation is involved. Agnostic treatment is viewed as the most important, though none on its own is determinative. The Commission will also consider exceptional circumstances, which allow for public interest considerations, and a minimal harm analysis (which effectively expands the criteria to six possible grounds). The details on the four main criteria:

The agnostic treatment of data. The Commission will consider the extent to which data traffic is priced or rated equally or agnostically by an ISP with regard to its customers’ retail Internet access services, while having regard to the amount of data involved. Offerings that rate or price data non-agnostically, such as by zero-rating data traffic from certain content providers (including affiliated entities), are likely to raise concerns under subsection 27(2). Differential pricing practices that treat data traffic agnostically (e.g. time-of-day offerings) are not likely to raise the same level of concern.

The exclusiveness of the offering. The Commission will consider the extent to which a differential pricing practice is exclusive to a particular class or group of subscribers, or to a particular content provider or class or group of content providers, while also having regard to the number of subscribers or content providers affected. For example, differential pricing practices that are exclusive to subscribers to a particular data plan are likely to raise concerns under subsection 27(2).

The impact on Internet openness and innovation. The Commission will consider the extent to which a differential pricing practice inhibits or compromises the openness of the Internet for Canadians and the choices available to Canadians. In particular, this analysis will consider (a) whether a differential pricing practice affects the ability of content providers or innovators to enter the market by creating barriers to entry, and (b) the extent to which a differential pricing practice affects innovation. For example, differential pricing practices that require content providers to conform to administrative and technical requirements that are burdensome, costly, or time-consuming to meet are likely to raise concerns under subsection 27(2). Differential pricing practices that favour large, established content providers over smaller ones and new entrants are also likely to raise concerns.

Whether there is financial compensation involved. The Commission will consider whether a differential pricing practice results in financial compensation or other financial benefits between a content provider and an ISP or third-party sponsor (including affiliated entities), having regard to the amount of compensation involved and the extent of the financial interest with any affiliated entity. For example, sponsored data arrangements, where an ISP receives payment from a content provider in exchange for zero-rating the data traffic to and from that provider, are likely to raise concerns under subsection 27(2).
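
Read as a decision procedure, the four criteria can be sketched schematically. This is one reader's paraphrase of the decision, not a legal test, and the field names are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Offering:
    """Schematic model of a differential pricing practice (illustrative only)."""
    agnostic: bool        # is data priced the same regardless of source?
    exclusive: bool       # limited to a class of subscribers/providers?
    harms_openness: bool  # barriers to entry or reduced innovation?
    compensated: bool     # payment flows between content provider and ISP?

def concerns(o: Offering) -> list:
    """Flag which of the four criteria an offering is likely to trip.
    A paraphrase of the decision's text, not an authoritative test."""
    flags = []
    if not o.agnostic:
        flags.append("non-agnostic treatment of data")
    if o.exclusive:
        flags.append("exclusive offering")
    if o.harms_openness:
        flags.append("impact on openness and innovation")
    if o.compensated:
        flags.append("financial compensation involved")
    return flags

# A sponsored-data zero-rating plan trips all four criteria at once:
sponsored = Offering(agnostic=False, exclusive=True,
                     harms_openness=True, compensated=True)
# A time-of-day discount applied to all traffic trips none:
time_of_day = Offering(agnostic=True, exclusive=False,
                       harms_openness=False, compensated=False)
print(concerns(sponsored))
print(concerns(time_of_day))  # []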

The Commission expects all providers to follow these guidelines and – like the ITMP regime – will investigate complaints. Given that the Commission rejected the Videotron service, had already rejected the Bell Mobile TV service, and rejected many compromise proposals that were raised during the hearing, it is clear that the bar for approval of a zero rating or differential pricing plan is very high. Time-of-day differences are permitted, as are plans that treat data in an agnostic manner. In other words, the CRTC goes back to first net neutrality/common carriage principles of treating data equally.

It is worth noting that the CRTC decision also addresses the issue of data caps, declining to ban the practice and merely monitor the situation for now. Several groups (and many Canadians) had asked the Commission to address the practice.

In sum, this is a huge win for net neutrality in Canada, as the CRTC was ultimately guided by its longstanding principle that telecom regulation should restrict the ability of ISPs to determine winners and losers through their power as the Internet’s gatekeepers. When combined with the ITMP framework and the decisions involving Bell Mobile TV and Videotron, the CRTC has crafted a reasonable, pro-net neutrality framework that provides carriers with guidance and users – whether innovative businesses or consumers – with assurances that net neutrality is the law of the land. As a complaints-based mechanism, there is considerable onus placed on consumers to monitor practices and to seek enforcement, but the right framework is in place for long-term benefits to innovation and consumers.

The post Net Neutrality Alive and Well in Canada: CRTC Crafts Full Code With Zero Rating Decision appeared first on Michael Geist.

Colossal: Fictional Butterflies Animated as Illuminated GIFs by Vladimir Stankovic

Australia-based illustrator Vladimir Stankovic has created several series of GIFs depicting his fantastical portrayal of the natural world, animating subjects such as Cepharthropoda (animals with characteristics of both cephalopods and arthropods), Cephalopodoptera (his cross between mollusks and insects), and the Lepiodoptera Obscura (seen here). Within this series he illustrates the lifecycle of a “hidden butterfly,” extravagantly colored insects that exist in some of the most remote areas of tropical rainforests.

You can see more of his fictional additions to natural history on his Instagram and Behance, and find fine art prints of his subjects on his Etsy.

Disquiet: Disquiet Junto Project 0277: Chew Concrète

Each Thursday in the Disquiet Junto group, a new compositional challenge is set before the group’s members, who then have just over four days to upload a track in response to the assignment. Membership in the Junto is open: just join and participate. A SoundCloud account is helpful but not required. There’s no pressure to do every project. It’s weekly so that you know it’s there, every Thursday through Monday, when you have the time.

Tracks will be added to this playlist for the duration of the project:

This project’s deadline is 11:59pm wherever you are on Monday, April 24, 2017. This project was posted in the morning, California time, on Thursday, April 20, 2017.

These are the instructions that went out to the group’s email list (at tinyletter.com/disquiet-junto):

Disquiet Junto Project 0277: Chew Concrète
The Assignment: Make music inspired by C. Reider’s Chew Cinders album procedures.

Step 1: This week’s project is inspired by the manner in which C. Reider recorded his recent album, Chew Cinders (Midnight Circles). We aren’t remixing his album. We’re remixing/repurposing his approach to the album. You can check it out here:

https://midnightcircles.bandcamp.com/album/chew-cinders

Step 2: This instruction is adapted, with Reider’s input, from the manner in which he recorded the album:

Process a sequence of standalone “chunks” of pre-recorded sound — voice, field recordings, noise — with an emphasis on the manipulation of time and pitch. Speed things up, slow them down, and explore the opportunity to use cutup techniques. Pay particular attention to segues between the chunks.

Step 3: Make a piece of music inspired by the approach delineated in Step 2.
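
As a rough sketch of the time/pitch manipulation described in Step 2, here is a crude nearest-neighbour "varispeed" resampler in Python: like playing tape at a different speed, it changes pitch and duration together. Real entries would of course use whatever audio tools the participant prefers:

```python
def varispeed(samples, rate):
    """Resample by index-stepping: rate > 1 speeds up (raising pitch),
    rate < 1 slows down (lowering pitch). Crude nearest-neighbour
    resampling, like running tape at a different speed."""
    if rate <= 0:
        raise ValueError("rate must be positive")
    out, pos = [], 0.0
    while pos < len(samples):
        out.append(samples[int(pos)])
        pos += rate
    return out

chunk = [0, 1, 2, 3, 4, 5, 6, 7]
print(varispeed(chunk, 2.0))  # twice as fast: [0, 2, 4, 6]
print(varispeed(chunk, 0.5))  # half speed: each sample repeated
```

Chaining several chunks through `varispeed` at different rates, then concatenating the results, is one crude way to experiment with the cut-up segues the instructions emphasize.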

Five More Important Steps When Your Track Is Done:

Step 1: If your hosting platform allows for tags, be sure to include the project tag “disquiet0277” (no spaces) in the name of your track. If you’re posting on SoundCloud in particular, this is essential to my locating the tracks and creating a playlist of them.

Step 2: Upload your track. It is helpful but not essential that you use SoundCloud to host your track.

Step 3: In the following discussion thread at llllllll.co please consider posting your track:

http://llllllll.co/t/chew-some-concrete-sounds-disquiet-junto-project-0277/

Step 4: Annotate your track with a brief explanation of your approach and process.

Step 5: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Deadline: This project’s deadline is 11:59pm wherever you are on Monday, April 24, 2017. This project was posted in the morning, California time, on Thursday, April 20, 2017.

Length: The length is entirely up to the participant.

Title/Tag: When posting your track, please include “disquiet0277″ in the title of the track, and where applicable (on SoundCloud, for example) as a tag.

Upload: When participating in this project, post one finished track with the project tag, and be sure to include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Download: It is preferable that your track is set as downloadable, and that it allows for attributed remixing (i.e., a Creative Commons license permitting non-commercial sharing with attribution).

Linking: When posting the track online, please be sure to include this information:

More on this 277th weekly Disquiet Junto project — “Chew Concrète: Make music inspired by C. Reider’s Chew Cinders album procedures” — at:

http://disquiet.com/0277/

More on the Disquiet Junto at:

http://disquiet.com/junto/

Subscribe to project announcements here:

http://tinyletter.com/disquiet-junto/

Project discussion takes place on llllllll.co:

http://llllllll.co/t/chew-some-concrete-sounds-disquiet-junto-project-0277/

There’s also a Junto Slack. Send your email address to twitter.com/disquiet for Slack inclusion.

Image associated with this project is adapted from Thomas Jung’s art for the cover of the album that inspired the project, C. Reider’s Chew Cinders:

https://midnightcircles.bandcamp.com/album/chew-cinders

OUR VALUED CUSTOMERS: While discussing the DAREDEVIL tv series...


Colossal: A Stained Glass Cabin Hidden in the Woods by Neile Cooper

Stained glass artist and jeweler Neile Cooper had a vision for a sanctuary: a small cabin behind her home in Mohawk, New Jersey that would feature her glass designs on every available surface. The result is Glass Cabin, a structure built almost entirely from repurposed window frames and lumber that features dozens of panels of her stained glass work, depicting flowers, birds, butterflies, mushrooms and other scenes from nature. Cooper explores many of these same motifs in her popular jewelry designs. You can see more photos of Glass Cabin on Instagram. (via This Isn’t Happiness)

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Satan-Fingers



Click here to go see the bonus panel!

Hovertext:
Discovered in the delightful though occasionally frustrating Moscow Puzzles

New comic!
Today's News:

We're running out of most tickets for BAHFest MIT, so we've moved over some of the more expensive ones to lower levels. Buy soon or feel shame!

Michael Geist: The Reel Story: Why Changing How We Measure a “Canadian Film” is Long Overdue

National Canadian Film Day 150, described as the world’s largest film festival, was held yesterday with events that showcased Canadian feature films at hundreds of venues from coast to coast. The event had a large number of sponsors (the Prime Minister promoted it) that helped place the spotlight on Canadian film. Yet a day devoted to Canadian feature film might also have called attention to the struggles of the Canadian feature film category and considered whether significant policy reforms are needed. This year’s Canadian Media Producers Association Profile 2016, which chronicles the industry (I used it earlier this year to discuss how foreign financing – not regulated contributions – is now the top source of English-language television production in Canada), tells a story of a feature film industry that relies on public dollars to finance the majority of its costs, has hit a decade low in the number of films produced, and is experiencing declining budgets.

In the last reported year, the average English-language feature film budget declined to $2.2 million and the percentage of films with budgets over $10 million dropped to just 2%.  There were a total of 94 feature films made, the lowest figure in the past decade. The average budget for a Canadian English-language fiction feature film was also its lowest in the past ten years.

Funding for these films comes primarily from tax dollars with public sources accounting for $146 million or 57% of the total financing of Canadian theatrical feature film production. The total budget is small: $178 million for English-language films and $76 million for French-language films. The chart below highlights how little Canadian private sources spend on making feature films. After accounting for public dollars through CFFF-Telefilm, tax credits, and foreign money, less than one-third of funding comes from Canadian private sources. Note that this data is focused on Canadian feature film production to the theatres and does not include co-productions with other countries, which add an additional 26 productions (11 in English and 15 in French) with larger average budgets.

CMPA Profile 2016, Page 67, http://www.cmpa.ca/sites/default/files/documents/industry-information/profile/Profile%202016%20EN.pdf

 

The audience for Canadian feature films isn’t great either. While going to the movies is a billion dollar industry in Canada, Canadian feature films garnered just 0.6% of box office receipts in the English-language market (the number is better in French at 10.7%). The revenues are truly tiny: a total of $4.9 million in revenue for English-language feature films out of a box office of $857.1 million. The low revenue is notable since there were over 100 Canadian films shown, constituting 7.9% of all English-language films.

 

CMPA Profile 2016, Page 118, http://www.cmpa.ca/sites/default/files/documents/industry-information/profile/Profile%202016%20EN.pdf

 

There are surely many factors behind the performance, not the least of which is the popularity of U.S. films, which typically have bigger budgets and more promotion associated with them. But if Canada deems feature film important, is willing to spend millions in tax dollars and credits to support the industry, and wants to ensure that Canadian stories make it onto the big screen, then considering other policy issues is needed (Simon Houpt did so in an excellent piece in 2015).

Topping the list of considerations might be how Canada defines a “Canadian film.” This issue was the subject of debate at the annual CMPA conference in February that was also covered by Houpt. While the debate and the Houpt piece focus on the virtues and problems of the Canadian 10-point system for determining whether a film qualifies as “Canadian”, the reality is that the Canadian approach is an outlier when compared with many other countries.

I recently obtained a study conducted by the Department of Canadian Heritage under the Access to Information Act that compared approaches in ten countries: Canada, Australia, the UK, Ireland, Hungary, New Zealand, Mexico, Germany, France, and Colombia. The study noted that point systems are common, but Canada stands alone in focusing exclusively on the nationality of personnel involved in the production.

The majority of countries allow for points for three main criteria: cultural content (the cultural contribution of the film itself), production (the degree to which the film is nationally produced), and personnel. Some countries emphasize one criterion more than another, but only Canada considers a film to be Canadian based strictly on the nationality of personnel. Canada is also the only country to require the company to maintain worldwide copyright.

The report notes that Canada’s focus on process may come at the expense of cultural outcomes:

In its pursuit of cultural goals, Canada maintains a distinct focus on a process rather than outcome based approach relative to other countries being examined. The Canadian system focuses solely on ensuring the creators behind the production are Canadian. Not only do other countries have lower requirements relating to the number of key staff that must have their countries’ nationality, they also allow low scores in this category to be compensated by strong scores in cultural content and production…

When trying to achieve cultural goals, focusing on outcomes rather than process has potential advantages and disadvantages. Traditional policy literature encourages focusing on outcomes, as this is the clearest way to connect policies to the mandate of government. For example, a film made entirely by Canadian producers and key creative staff could still be based on American source material, be set in the United States, and consist only of American characters. This would not necessarily be achieving the goals of producing distinctly Canadian cultural content.

An internal presentation that accompanied the report highlighted the limitations of the Canadian approach, noting a film based on a Canadian novel, starring Canadian actors, and filmed and produced in Canada might still not qualify as Canadian if written, directed, and produced by an American.

 

Canadian Heritage Slide Presentation, obtained under ATIP

 

The outlier cultural approach – when combined with the financial struggles of the Canadian feature film industry and the significant public investment in the sector – suggests that it is time to reconsider the Canadian system. The Canadian industry is enormously successful once foreign location and service production is taken into account. That side of the industry – in which foreign producers use Canadian locations for filming and services – was a $2.6 billion industry with 128 feature films last year. However, the industry often cites the cultural importance of Canadian feature films, in part to justify the significant public support. If the goal of the feature film industry in Canada is primarily cultural with box office success or film budgets deemed secondary, then changing the way we measure what constitutes a “Canadian film” is long overdue.

The post The Reel Story: Why Changing How We Measure a “Canadian Film” is Long Overdue appeared first on Michael Geist.

The Shape of Code: Average maintenance/development cost ratio is less than one

Part of the (incorrect) folklore of software engineering is that more money is spent on maintaining an application than was spent on the original development.

Bossavit’s The Leprechauns of Software Engineering does an excellent job of showing that the probable source of the folklore did not base its analysis on any cost data (I’m not going to add to an already unwarranted number of citations by listing the source).

I have some data, actually two data sets, each measuring a different part of the problem, i.e., 1) system lifetime and 2) maintenance/development costs. Both sets of measurements apply to IBM mainframe software, so a degree of relatedness can be claimed.

Analyzing this data suggests that the average maintenance/development cost ratio, for IBM applications, is around 0.81 (code+data). The data also provides a possible explanation for the existing folklore in terms of survivorship bias, i.e., most applications do not survive very long (and have a maintenance/development cost ratio much less than one), while a few survive a long time (and have a maintenance/development cost ratio much greater than one).

At any moment around 79% of applications currently being maintained will have a maintenance/development cost ratio greater than one, 68% a ratio greater than two and 51% a ratio greater than five.

Another possible cause of incorrect analysis is the fact that we are dealing with ratios; the harmonic mean has to be used, not the arithmetic mean.
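A minimal sketch of why the choice of mean matters, using hypothetical ratios (not the post's actual data): most applications are retired early with ratios well below one, while a few long-lived survivors have ratios well above one. The arithmetic mean is pulled above one by the outliers, while the harmonic mean stays below one.

```python
from statistics import harmonic_mean

# Hypothetical per-application maintenance/development cost ratios:
# a short-lived majority (ratios well below 1) plus a couple of
# long-lived survivors (ratios well above 1).
ratios = [0.2, 0.3, 0.4, 0.5, 0.6, 3.0, 8.0]

arith = sum(ratios) / len(ratios)   # dominated by the large outliers
harm = harmonic_mean(ratios)        # dominated by the small majority

print(f"arithmetic mean: {arith:.2f}")  # about 1.86 -- above one
print(f"harmonic mean:   {harm:.2f}")   # about 0.47 -- below one
```

The two summaries tell opposite stories about the same data, which is one way naive averaging of ratios can manufacture the "maintenance costs more than development" folklore.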

Existing industry practice of not investing in creating maintainable software probably has a better cost/benefit than the alternative because most software is not maintained for very long.

CreativeApplications.Net: Studio Drift – Free floating concrete monolith & HoloLens Artwork

On display at the recent Armory Show in New York was the work of Amsterdam-based Studio Drift, featuring 'Drifter', a free-floating concrete monolith, together with 'Concrete Storm', a HoloLens experience comprising mixed-reality art objects.

New Humanist Blog: Why do we use reason to reach nonsensical conclusions?

Q&A with Hugo Mercier and Dan Sperber, authors of a new book about the evolution of reason.

Penny Arcade: News Post: The Litany

Tycho: Gabe has already waxed poetical about Overwatch: Uprising, and in private I have seen him wipe away those tears which demand release.  When I look at the official page, that language doesn’t strike me as being in the service of a one-off initiative.  They can play with this forever, each time with a delectable new mess of crates, just continue driving in this potent spike of ambient narrative expression.  I want to emphasize that Overwatch is what it looks like when Blizzard fails to release a product.  Failing is a skill.  Can you imagine how it must have felt…

TheSirensSound: New video for 40 (Without Her Love) by Kilmanjaro

TheSirensSound: New album Before I Return To Dust by Slowburner

Ideas from CBC Radio (Highlights): The Rise of the Anti-Establishment: Where do we go from here?

Robert Reich, former U.S. Secretary of Labor and Professor of Public Policy at University of California at Berkeley, details how understanding the circumstances that led to the election of Donald Trump can help shape a new democratic political sensibility

churchturing.org / 2017-04-24T16:40:26