Bifurcated Rivets: From FB

Interesting

Bifurcated Rivets: From FB

Water

Bifurcated Rivets: From FB

This is rather fine

Bifurcated Rivets: From FB

Hmmm

Bifurcated Rivets: From FB

Some here I hadn't seen.

BOOOOOOOM!: Reader Submission: Katie Neece

Paintings by artist Katie Neece from South Bend, Indiana. Found via our monthly Reader Submissions (click here to participate). See more images below.

Hackaday: Loop Antenna is Portable

We don’t know if [OH8STN] has a military background, but we suspect he might since his recent post is about a “DIY Man Portable Magnetic Loop Antenna.” “Man-portable” is usually a military designation, and — we presume — he wouldn’t object to a woman transporting it either.

[OH8STN] started with a Chameleon antenna starter kit. This costs about $100 and is primarily a suitable variable capacitor with a 6:1 reduction drive premounted and soldered. Of course, you could source your own, but finding variable capacitors that can handle transmit duty (admittedly, these can apparently handle about 10 W continuous or 25 W on single sideband) can be tricky, especially these days. Although he started with a kit, he did modify the antenna to switch between two different sets of ham radio bands. You can see the antenna in the video below.

Loop antennas aren’t ideal, but neither is any other small antenna. Because the loop is tightly tuned to a particular frequency, it requires retuning for even relatively small frequency changes, even though it can operate on many different frequencies. If you want more technical details, you might enjoy this recent presentation from [W4RAX]. The links at the end are worth checking out, too.
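
That narrow tuning is a consequence of the loop and its capacitor forming a high-Q resonant LC circuit. As a rough back-of-the-envelope sketch (our own, with invented ballpark component values, not figures from [OH8STN]’s build):

sub resonant-freq($inductance, $capacitance) {
    # f = 1 / (2π √(LC)), in hertz
    return 1 / (2 * pi * sqrt($inductance * $capacitance));
}

say resonant-freq(2e-6, 180e-12) / 1e6;  # ≈ 8.4 MHz for a 2 µH loop and 180 pF

Nudge that capacitor even slightly and the resonant point, and with it the narrow window of usable SWR, moves out from under you.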

Of course, in addition to the starter kit, you need some other components, and the video shows it all. You also need some tools, but we were amused to see that for bending the aluminum loop, [OH8STN] simply wrapped it around a tree trunk of suitable circumference.

Loop antennas are popular with apartment- and other city-dwellers. But if you are just looking for exotic radiators, perhaps try sea water. Or you could have a look at a very short dipole-like antenna called the Poynting vector antenna.


Filed under: radio hacks

MetaFilter: The truly villainous font is the ubiquitous Times New Roman.

It turns out the ongoing joke about the idiocy of Comic Sans is ableist.

Recent additions: kafka-device-joystick 0.2.1.0

Added by BrianBush, Sun Feb 26 04:45:42 UTC 2017.

Linux joystick events via a Kafka message broker

Recent additions: kafka-device 0.2.1.0

Added by BrianBush, Sun Feb 26 04:44:51 UTC 2017.

UI device events via a Kafka message broker

Recent additions: kafka-device-spacenav 0.2.1.0

Added by BrianBush, Sun Feb 26 04:44:29 UTC 2017.

Linux SpaceNavigator events via a Kafka message broker

search.cpan.org: code-UnifdefPlus-0.005.003

processes conditional parts of makefiles, C/C++ and Kconfig files

Slashdot: The Videogame Industry Is Fighting 'Right To Repair' Laws

An anonymous reader quotes Motherboard: The video game industry is lobbying against legislation that would make it easier for gamers to repair their consoles and for consumers to repair all electronics more generally. The Entertainment Software Association, a trade organization that includes Sony, Microsoft, Nintendo, as well as dozens of video game developers and publishers, is opposing a "right to repair" bill in Nebraska, which would give hardware manufacturers fewer rights to control the end-of-life of electronics that they have sold to their customers... Bills making their way through the Nebraska, New York, Minnesota, Wyoming, Tennessee, Kansas, Massachusetts, and Illinois statehouses will require manufacturers to sell replacement parts and repair tools to independent repair companies and consumers at the same price they are sold to authorized repair centers. The bill also requires that manufacturers make diagnostic manuals public and requires them to offer software tools or firmware to revert an electronic device to its original functioning state in the case that software locks that prevent independent repair are built into a device. The bills are a huge threat to the repair monopolies these companies have enjoyed, and so just about every major manufacturer has brought lobbyists to Nebraska, where the legislation is currently furthest along... This setup has allowed companies like Apple to monopolize iPhone repair, John Deere to monopolize tractor repair, and Sony, Microsoft, and Nintendo to monopolize console repair... Motherboard's reporter was unable to get a comment from Microsoft, Apple, and Sony, and adds that "In two years of covering this issue, no manufacturer has ever spoken to me about it either on or off the record."

Read more of this story at Slashdot.

Slashdot: How Cable Monopolies Hurt ISP Customers

"New York subscribers have had to overpay month after month for services that Spectrum deliberately didn't provide," reports Backchannel -- noting these practices are significant because together Comcast and Charter (formerly Time Warner Cable) account for half of America's 92 million high-speed internet connections. An anonymous reader quotes Backchannel: Based on the company's own documents and statements, it appears that just about everything it has been saying since 2012 to New York State residents about their internet access and data services is untrue...because of business decisions the company deliberately made in order to keep its capital expenditures as low as possible... Its marketing department kept sending out advertising claims to the public that didn't match the reality of what consumers were experiencing or square with what company engineers were telling Spectrum executives. That gives the AG's office its legal hook: Spectrum's actions in knowingly saying one thing but doing another amount to fraudulent, unfair, and deceptive behavior under New York law... The branding people went nuts, using adjectives like Turbo, Extreme, and Ultimate for the company's highest-speed 200 or 300 Mbps download offerings. But no one, or very few people, could actually experience those speeds...because, according to the complaint, the company deliberately required that internet data connections be shared among a gazillion people in each neighborhood... [T]he lawsuit won't by itself make much of a difference. But maybe the public nature of the attorney-general's assault -- charging Spectrum for illegal misconduct -- will lead to a call for alternatives. Maybe it will generate momentum for better, faster, wholesale fiber networks controlled by cities and localities themselves. If that happened, retail competition would bloom. We'd get honest, straightforward, inexpensive service, rather than the horrendously expensive cable bundles we're stuck with today. The article says Spectrum charged 800,000 New Yorkers $10 a month for outdated cable boxes that "weren't even capable of transmitting and receiving wifi at the speeds the company advertised customers would be getting," then promised the FCC in 2013 that they'd replace them, and then didn't. "With no competition, it had no reason to upgrade its services. Indeed, the company's incentives went exactly in the other direction."

Read more of this story at Slashdot.

MetaFilter: Safety Pins and Swastikas

The frameworks of liberal identity politics and "alt-right" white nationalism are proving curiously compatible. Jacobin's Shuja Haider explores the co-opting of progressive methods by the alt right.

Hackaday: Decorate Your 3D Prints with Detailed Hydrographic Printing

It’s like the old quip from [Henry Ford]: You can have your 3D prints in any color you want, as long as it’s one. Some strides have been made to bringing more color to your extruded goodies, but for anything beyond a few colors, you’re going to need to look at post-print processing of some sort. For photorealistic 3D prints, you might want to look into a simple hydrographic printing method that can be performed right on a printer.

If some of the prints in the video below look familiar, it’s because we covered the original method when it was presented at SIGGRAPH 2015. [Amos Dudley] was intrigued enough by the method, which uses computational modeling of complex surfaces to compose a distorted image that will be stretched back into shape when the object is dipped, to contact the original authors for permission to use the software. He got a resounding, “Nope!” – it appears that the authors’ institution isn’t big into sharing information. So, [Amos] hacked the method.

In place of the original software, [Amos] used Blender to simulate the hydrographic film as a piece of cloth interacting with the 3D-printed surface. This allowed him to print an image on PVA film that will “un-distort” as the object is dipped. He built a simple tank with overflow for the printer bed, used the Z-axis to dip the print, and voilà! Photorealistic frogs and globes.

[Amos]’ method has its limitations, but the results are pretty satisfying already. With a little more tweaking, we’re sure he’ll get to the point that the original authors did, and without their help, thank you very much.

 


Filed under: 3d Printer hacks

TheSirensSound: New album - Vanity Mirror by Interns

Interns aim to frame the ambient, spacey goodness of post-rock around a structure that is easily digestible for listeners of any background. Like many of their peers, they forgo lyrics in an attempt to let their instruments convey their thoughts and emotion.

search.cpan.org: Dancer2-Template-Caribou-1.0.0

Template::Caribou wrapper for Dancer2

Slashdot: GitHub Invites Contributions To 'Open Source Guides'

An anonymous reader quotes InfoQ: GitHub has recently launched its Open Source Guides, a collection of resources addressing the most common scenarios and best practices for both contributors and maintainers of open source projects. The guides themselves are open source and GitHub is actively inviting developers to participate and share their stories... "Open source is complicated, especially for newcomers. Experienced contributors have learned many lessons about the best way to use, contribute to, and produce open source software. Everyone shouldn't have to learn those lessons the hard way." Making a successful first contribution is not the exclusive focus of the guides, though, which also strive to make it easier to find users for a project, start a new project, and build healthy open source communities. Other topics the guides dwell on are best practices, getting financial support, metrics, and legal matters. GitHub's Head of Open Source says the guides create "the equivalent of a water cooler for the community."

Read more of this story at Slashdot.

TheSirensSound: New track - End Of You by MAD ONES

“End Of You” is an existential meltdown played out over the course of a 3-minute rock song. Initially it was set during the Industrial Revolution and written from the viewpoint of someone contemplating the value of human life and likely speculating on the existence of an afterlife. The song is also appropriately suited to the recent rise of right-wing populism in the US, the strength of the opposition to it, and the sinking feeling that we are, as a species, doomed to repeat our mistakes.

Instructables: exploring - featured: Easy Duck Tape Penguin

Making a cute Duck Tape penguin in 6 easy steps! Materials: yellow and black Duck Tape, white fabric, an X-Acto knife, a pencil. Cut the Fabric: trace a circle the size of the roll of duct tape on the fabric, and cut out two of those with an X-Acto knife. Making the Back: lay out 3 pieces of duck ...
By: cpfischer01

Continue Reading »

Instructables: exploring - featured: TWIX (CARAMEL SHORTBREAD) CHOCOLATE BAR CAKE

TWIX (CARAMEL SHORTBREAD) CHOCOLATE BAR CAKE. Please Support Me On Patreon! https://www.patreon.com/TreatFactory Youtube: https://www.patreon.com/TreatFactory Instagram: https://instagram.com/treat.factory/ Tumblr: http://treatfactory.tumblr.com/ Facebook: https://www.facebook.com/TreatFactory/ ...
By: TreatFactory

Continue Reading »

TheSirensSound: Album - UnFinetude by LeVant

According to LeVant, “UnFinetude” is a rather special work, and probably the most radical and genre-bending among the ones he has made so far. The impetus for its genesis came from the subconscious challenge posed by various people asking him: what is that? So he thought of creating a digimodern work, and it slowly took shape thanks to some hours of expanding fantasy, unrelenting encounters with the realm of Illusion, and a handful of plugins.

search.cpan.org: Dancer-Template-Caribou-1.0.0

Template::Caribou wrapper for Dancer

search.cpan.org: Group-Git-v0.6.0

Base module for operations on groups of git repositories.

MetaFilter: High-def, close-up, like you're RIGHT THERE with the nachos

Are you bored staring at a blank wall all day? Stupid landlord won't let you hang pictures? Has life hidden your stud finder? It's okay, you don't need them anyway! Fall asleep beneath the disappointed gaze of this blue-eyed grey kitten. There's no way you *won't* lose weight with this fresh, hot pizza in a box placed strategically above your television. Nachos not for you? Enjoy your refined-palate lifestyle surrounded with locally-grown laughter and Pinterest-worthy paella decals. Still hungry? Not lactose intolerant? Jazz up your Wednesday dinner routine and impress your date with omnipresent cow's milk cheeses. And if that's not enough, you ungrateful monster, ruin your brain with this cloud of programming languages that may actually be used to create The Cloud OMG what is even real anymore.

search.cpan.org: Template-Caribou-1.2.0

class-based HTML-centric templating system

Recent additions: azubi 0.2.0.0

Added by palo, Sun Feb 26 01:22:59 UTC 2017.

A simple DevOps tool which will never "reach" enterprise level.

ScreenAnarchy: Review: IN DUBIOUS BATTLE Allows James Franco and His Cast to Shine

The figure of James Franco as a director is under constant scrutiny; this is his 15th film, yet he still hasn’t made one that has made the ripples he no doubt wishes it had. His directing style is still undistinguished, mixing some experimental elements with a direct approach to performance that makes him look amateurish, almost film-school-like. He’s been constantly mocked, especially after many adaptations of various works of literary masters, some deemed un-adaptable, and with In Dubious Battle he dares once again, but with a more conventionally structured novel. By adapting more agreeable material that approaches his political sensitivities, he manages to get closer to a more conventionally made film that becomes at times great because of the passion involved in the...

[Read the whole post on screenanarchy.com...]

Instructables: exploring - featured: Programmable Pipe Lamp

Want to make a cool lamp you can program to illuminate how you want? If so, check out this Arduino-controlled lamp! The goal behind this project was to make an aesthetically appealing lamp out of resources that I had laying around. The main components of this lamp include Kombucha bottles, some pipin...
By: jaschenbach

Continue Reading »

things magazine: Things old and new

Histories of places and things. The story of the Trump Princess, the 128m yacht that was never built / an introduction to synthesis / photographs by John Maher / Blue Crow Media introduce their Brutalist Map of Paris, a city … Continue reading

Recent additions: gnss-converters 0.2.3

Added by markfine, Sun Feb 26 00:39:29 UTC 2017.

GNSS Converters.

Slashdot: Ask Slashdot: How Are You Responding To Cloudbleed?

An anonymous IT geek writes: Cloudflare-hosted web sites have been leaking data as far back as September, according to Gizmodo, which reports that at least Cloudflare "acted fast" when the leak was discovered, closing the hole within 44 minutes, and working with search engines to purge their caches. (Though apparently some of it is still lingering...) Cloudflare CEO Matthew Prince "claims that there was no detectable uptick in requests to Cloudflare-powered websites from September of last year...until today. That means the company is fairly confident hackers didn't discover the vulnerability before Google's researchers did." And the company's CTO also told Reuters that "We've seen absolutely no evidence that this has been exploited. It's very unlikely that someone has got this information... We do not know of anybody who has had a security problem as a result of this." Nevertheless, Fortune warns that "So many sites were vulnerable that it doesn't make sense to review the list and change passwords on a case-by-case basis." Some sites are now even resetting every user's password as a precaution, while site operators "are also being advised to wipe their sites' cookies and security certificates, and perform their own web searches to see if site data leaked." But I'd like to know what security precautions are being taken by Slashdot's readers. Leave your own answers in the comments. How did you respond to Cloudbleed?

Read more of this story at Slashdot.

Instructables: exploring - featured: Non volatile RAM Upgrade for SNES & SFC cartridges

At their peak, game cartridges used RAM and a backup battery to save game advances and player profiles. Over time those batteries tend to run out and leak hazardous materials that may cause personal injuries, data loss and permanent damage to the cartridge. With this upgrade you can solve all those pro...
By: crowndelorean

Continue Reading »

Hackaday: Octosonar is 8X Better than Monosonar

The HC-SR04 sonar modules are available for a mere pittance and, with some coaxing, can do a pretty decent job of helping your robot measure the distance to the nearest wall. But when sellers on eBay are shipping these things in ten-packs, why would you stop at mounting just one or two on your ‘bot? Octosonar is a hardware and Arduino software library that’ll get you up and running with up to eight sonar sensors in short order.

Octosonar uses an I2C multiplexer to send the “start” trigger pulses, and an eight-way OR gate to return the “echo” signal back to the host microcontroller. The software library then sends the I2C command to select and trigger a sonar module, and a couple of interrupt routines watch the “echo” line to figure out the time of flight, and thus the distance.
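
The distance arithmetic at the end is the easy part: the echo pulse width is the ping’s round trip, so halve it and multiply by the speed of sound. A minimal sketch of that step (our own illustration, not code from the Octosonar library, and the pulse width is a made-up value):

sub echo-to-distance($echo-us) {
    my $speed-of-sound = 343;  # m/s in air at roughly 20 °C
    return ($echo-us / 1e6) * $speed-of-sound / 2;  # one-way distance in metres
}

say echo-to-distance(1165);  # ≈ 0.2 m: a wall about 20 cm away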

Having two sonars on each side of a rectangular robot allows it to move parallel to a wall in a straightforward fashion: steer toward or away from the wall until the two readings match. Watch the video below for a demo of this very simple setup. (But also note where the robot’s 45-degree blind spot is: bump-bump-bump!)

With three address bits available on the I2C multiplexer, you could add another eight sonars to a project while only demanding one more interrupt pin from your host microcontroller. Heck, at the prices these things go for, why not go for the full 64?!

The comments for this project suggest a couple of alternative ways to go: a different pair of standard multiplexers instead of I2C, a CPLD to handle the logic, or whatever. But if you just want a simple setup, Octosonar has got you covered. If you’d like to dive deep into the HC-SR04 modules, check out this complete reverse engineering and brain transplant or this similar dissection of the modules with great references.


Filed under: robots hacks

Perlsphere: Configuring NGINX for SSL with Let's Encrypt

The Mojolicious Core Team has decided to try group blogging on Tumblr. As such I’m trying out posting there. I recently posted on how I configure NGINX, read more at https://asyncthoughts.tumblr.com/post/157702503315/configuring-nginx-for-lets-encrypt

Slashdot: Machine-Learning AI Now Beats Humans At Super Smash Bros. Melee

"The AI is definitely godlike," one professional player told Quartz. "I am not sure if anyone could beat it." An anonymous reader quotes their report about an AI's showdown with the best players of Super Smash Bros. Melee: Of 10 professionals that faced the bot, each one was killed more than they could kill the bot... But the bot was once only as good as a mere mortal. At first, Vlad Firoiu, creator and a competitive Smash player himself, couldn't train 'Phillip' to be as strong as the in-game bot, which he says even the worst players can beat fairly easily. Firoiu's solution? He started making the bot play itself over and over again, slowly learning which techniques fail and which succeed, called reinforcement learning. Then, he left it alone. "I just sort of forgot about it for a week," said Firoiu, who coauthored an unreviewed paper with William F. Whitney, the NYU student [who helped him] on the work. "A week later I looked at it and I was just like, 'Oh my gosh.' I tried playing it and I couldn't beat it." Business Insider points out that their AI read the players positions, velocities, and states directly from the game's memory, so the AI responds six times faster than a human player. To compensate it played as Captain Falcon, the game's slowest character, but there was one crucial glitch. "One particularly clever player found that the simple strategy of crouching at the edge of the stage caused the network to behave very oddly, refusing to attack and eventually KOing itself by falling off the other side of the stage."

Read more of this story at Slashdot.

MetaFilter: Call me, Ishmael.

Station 51000, a buoy, came unmoored in 2013. It's still reporting, and some Eddystone Light-hearted genius has hybridized the data with Moby Dick. (Or possibly it isn't lost at all? NOAA still lists it with lat-long.)

MetaFilter: "It's imperative to make sure that these manuscripts are safe"

The monk who saves manuscripts from ISIS "Rescuing the world's most precious antiquities from destruction is a painstaking project—and a Benedictine monk may seem like an unlikely person to lead the charge. But Father Columba Stewart is determined. Soft-spoken, dressed in flowing black robes, this 59-year-old American has spent the past 13 years roaming from the Balkans to the Middle East in an effort to save Christian and Islamic manuscripts threatened by wars, theft, weather—and, lately, the Islamic State."

Perlsphere: Perl 6 By Example: Functional Refactorings for Directory Visualization Code

This blog post is part of my ongoing project to write a book about Perl 6.

If you're interested, please sign up for the mailing list at the bottom of the article, or here. It will be low volume (less than an email per month, on average).


In the last installment we've seen some code that generated tree maps and flame graphs from a tree of directory and file sizes.

There's a pattern that occurs three times in that code: dividing an area based on the size of the files and directories in the tree associated with the area.

Extracting such common code into a function is a good idea, but it's slightly hindered by the fact that there is custom code inside the loop that's part of the common code. Functional programming offers a solution: Put the custom code inside a separate function and have the common code call it.

Applying this technique to the flame graph and tree map looks like this:

sub subdivide($tree, $lower, $upper, &todo) {
    my $base = ($upper - $lower) / $tree.total-size;
    my $var  = $lower;
    for $tree.children -> $child {
        my $incremented = $var + $base * $child.total-size;
        todo($child, $var, $incremented);
        $var = $incremented;
    }
}

sub flame-graph($tree, :$x1!, :$x2!, :$y!, :$height!) {
    return if $y >= $height;
    take 'rect' => [
        x      => $x1,
        y      => $y,
        width  => $x2 - $x1,
        height => 15,
        style  => "fill:" ~ random-color(),
        title  => [$tree.name ~ ', ' ~ format-size($tree.total-size)],
    ];
    return if $tree ~~ File;
    subdivide( $tree, $x1, $x2, -> $child, $x1, $x2 {
        flame-graph( $child, :$x1, :$x2, :y($y + 15), :$height );
    });
}

sub tree-map($tree, :$x1!, :$x2!, :$y1!, :$y2!) {
    return if ($x2 - $x1) * ($y2 - $y1) < 20;
    take 'rect' => [
        x      => $x1,
        y      => $y1,
        width  => $x2 - $x1,
        height => $y2 - $y1,
        style  => "fill:" ~ random-color(),
        title  => [$tree.name],
    ];
    return if $tree ~~ File;

    if $x2 - $x1 > $y2 - $y1 {
        # split along the x-axis
        subdivide $tree, $x1, $x2, -> $child, $x1, $x2 {
            tree-map $child, :$x1, :$x2, :$y1, :$y2;
        }
    }
    else {
        # split along the y-axis
        subdivide $tree, $y1, $y2, -> $child, $y1, $y2 {
            tree-map $child, :$x1, :$x2, :$y1, :$y2;
        }
    }
}

The newly introduced subroutine subdivide takes a directory tree, a start point and an end point, and finally a code object &todo. For each child of the directory tree it calculates the new coordinates and then calls the &todo function.

The usage in subroutine flame-graph looks like this:

subdivide( $tree, $x1, $x2, -> $child, $x1, $x2 {
    flame-graph( $child, :$x1, :$x2, :y($y + 15), :$height );
});

The code object being passed to subdivide starts with ->, which introduces the signature of a block. The code block recurses into flame-graph, adding some extra arguments, and turning two positional arguments into named arguments along the way.

This refactoring shortened the code and made it overall more pleasant to work with. But there's still quite a bit of duplication between tree-map and flame-graph: both have an initial termination condition, a take of a rectangle, and then a call or two to subdivide. If we're willing to put all the small differences into small, separate functions, we can unify it further.

If we pass all those new functions as arguments to each call, we create an unpleasantly long argument list. Instead, we can use those functions to generate the previous functions flame-graph and tree-map:

sub svg-tree-gen(:&terminate!, :&base-height!, :&subdivide-x!, :&other!) {
    sub inner($tree, :$x1!, :$x2!, :$y1!, :$y2!) {
        return if terminate(:$x1, :$x2, :$y1, :$y2);
        take 'rect' => [
            x      => $x1,
            y      => $y1,
            width  => $x2 - $x1,
            height => base-height(:$y1, :$y2),
            style  => "fill:" ~ random-color(),
            title  => [$tree.name ~ ', ' ~ format-size($tree.total-size)],
        ];
        return if $tree ~~ File;
        if subdivide-x(:$x1, :$y1, :$x2, :$y2) {
            # split along the x-axis
            subdivide $tree, $x1, $x2, -> $child, $x1, $x2 {
                inner($child, :$x1, :$x2, :y1(other($y1)), :$y2);
            }
        }
        else {
            # split along the y-axis
            subdivide $tree, $y1, $y2, -> $child, $y1, $y2 {
                inner($child, :x1(other($x1)), :$x2, :$y1, :$y2);
            }
        }
    }
}

my &flame-graph = svg-tree-gen
    terminate   => -> :$y1, :$y2, | { $y1 > $y2 },
    base-height => -> | { 15 },
    subdivide-x => -> | { True },
    other       => -> $y1 { $y1 + 15 };

my &tree-map = svg-tree-gen
    terminate   => -> :$x1, :$y1, :$x2, :$y2 { ($x2 - $x1) * ($y2 - $y1) < 20 },
    base-height => -> :$y1, :$y2 {  $y2 - $y1 },
    subdivide-x => -> :$x1, :$x2, :$y1, :$y2 { $x2 - $x1 > $y2 - $y1 },
    other       => -> $a { $a },
    ;

So there's a new function svg-tree-gen, which returns a function. The behavior of the returned function depends on the four small functions that svg-tree-gen receives as arguments.

The first argument, terminate, determines under what condition the inner function should terminate early. For tree-map that's when the area is below 20 pixels, for flame-graph when the current y-coordinate $y1 exceeds the height of the whole image, which is stored in $y2. svg-tree-gen always calls this function with the four named arguments x1, x2, y1 and y2, so the terminate function must ignore the x1 and x2 values. It does this by adding | as a parameter, which is an anonymous capture. Such a parameter can bind arbitrary positional and named arguments, and since it's an anonymous parameter, it discards all the values.
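
Here is a tiny standalone illustration of that discarding behavior (the argument values are invented):

my &check = -> :$y1, :$y2, | { $y1 > $y2 };
say check(:x1(0), :x2(100), :y1(160), :y2(150));  # True; x1 and x2 are silently discarded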

The second configuration function, base-height, determines the height of the rectangle in the base case. For flame-graph it's a constant, so the configuration function must discard all arguments, again with a |. For tree-map, it must return the difference between $y2 and $y1, as before the refactoring.

The third function determines when to subdivide along the x-axis. Flame graphs always divide along the x-axis, so -> | { True } accomplishes that. Our simplistic approach to tree graphs divides along the longer axis, so only along the x-axis if $x2 - $x1 > $y2 - $y1.

The fourth and final function we pass to svg-tree-gen calculates the coordinate of the axis that isn't being subdivided. In the case of flame-graph, that means increasing the previous value by the height of the bars; for tree-map it's the unchanged coordinate, so we pass the identity function -> $a { $a }.

The inner function only needs a name because we need to call it from itself recursively; otherwise an anonymous function sub ($tree, :$x1!, :$x2!, :$y1!, :$y2!) { ... } would have worked fine.

Now that we have very compact definitions of flame-graph and tree-map, it's a good time to play with some of the parameters. For example we can introduce a bit of margin in the flame graph by having the increment in other greater than the bar height in base-height:

my &flame-graph = svg-tree-gen
    base-height => -> | { 15 },
    other       => -> $y1 { $y1 + 16 },
    # rest as before

Another knob to turn is to change the color generation to something more deterministic, and make it configurable from the outside:

sub svg-tree-gen(:&terminate!, :&base-height!, :&subdivide-x!, :&other!,
                 :&color=&random-color) {
    sub inner($tree, :$x1!, :$x2!, :$y1!, :$y2!) {
        return if terminate(:$x1, :$x2, :$y1, :$y2);
        take 'rect' => [
            x      => $x1,
            y      => $y1,
            width  => $x2 - $x1,
            height => base-height(:$y1, :$y2),
            style  => "fill:" ~ color(:$x1, :$x2, :$y1, :$y2),
            title  => [$tree.name ~ ', ' ~ format-size($tree.total-size)],
        ];
        # rest as before
}

We can, for example, keep state within the color generator and return a slightly different color during each iteration:

sub color-range(|) {
    state ($r, $g, $b) = (0, 240, 120);
    $r = ($r + 5) % 256;
    $g = ($g + 10) % 256;
    $b = ($b + 15) % 256;
    return "rgb($r,$g,$b)";
}

state variables keep their values between calls to the same subroutine and their initialization runs only on the first call. So this function slightly increases the lightness in each color channel for each invocation, except when it reaches 256, where the modulo operator % resets it back to a small value.
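
As a minimal standalone example of state, separate from color-range, here is a counter whose value survives between calls:

sub next-id() {
    state $counter = 0;
    ++$counter;
}

say next-id() for ^3;  # prints 1, 2, 3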

If we plug this into our functions by passing color => &color-range to the calls to svg-tree-gen, we get much less chaotic looking output:

Tree map with deterministic color generation

Flame graph with deterministic color generation and one pixel margin between bars

More Language Support for Functional Programming

As you've seen in the examples above, functional programming typically involves writing lots of small functions. Perl 6 has some language features that make it very easy to write such small functions.

A common task is to write a function that calls a particular method on its argument, as we've seen here:

method total-size() {
    $!total-size //= $.size + @.children.map({.total-size}).sum;
    #                                        ^^^^^^^^^^^^^
}

This can be abbreviated to *.total-size:

method total-size() {
    $!total-size //= $.size + @.children.map(*.total-size).sum;
}

This works for chains of method calls too, so you could write @.children.map(*.total-size.round) if total-size returned a fractional number and you wanted to call the .round method on the result.

There are more cases where you can replace an expression with the "Whatever" star * to create a small function. To create a function that adds 15 to its argument, you can write * + 15 instead of -> $a { $a + 15 }.
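
For instance, with some made-up numbers, each of these lines passes such a small function to map:

say (1.4, 2.7, 3.6).map(*.round);  # (1 3 4)
say (1, 2, 3).map(* + 15);         # (16 17 18)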

If you need to write a function that just calls another function, but passes more arguments to the second function, you can use the assuming method. For example -> $x { f(42, $x) } can be replaced with &f.assuming(42). This also works for named arguments, so -> $x { f($x, height => 42) } can be replaced with &f.assuming(height => 42).
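
Here is a short self-contained sketch, with a hypothetical function f of our own invention:

sub f($factor, $x, :$height = 1) { $factor * $x * $height }

my &g = &f.assuming(42);
say g(2);     # 84, same as f(42, 2)

my &h = &f.assuming(height => 2);
say h(3, 4);  # 24, same as f(3, 4, height => 2)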

Summary

Functional programming offers techniques for extracting common logic into separate functions. The desired differences in behavior can be encoded in more functions that you pass in as arguments to other functions.

Perl 6 supports functional programming by making functions first class, so you can pass them around as ordinary objects. It also offers closures (access to outer lexical variables from functions), and various shortcuts that make it more pleasant to write short functions.
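
As a quick closure example, the inner block below keeps access to $n even after make-adder has returned:

sub make-adder($n) { -> $x { $x + $n } }

my &add-ten = make-adder(10);
say add-ten(32);  # 42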


Hackaday: Do you trust your hard drive indication light?

Researchers in the past have exfiltrated information through air gaps by blinking all sorts of lights, from LEDs in keyboards to the main display itself. However, all of these methods have one problem in common: they are extremely noticeable. If you worked in a high-security lab and your computer screen started to blink at a rapid pace, you might be a little concerned. But fret not, a group of researchers has found a new light to blink (PDF warning). Conveniently, this light blinks “randomly” even without the help of a virus: it’s the hard drive activity indication light.

All jokes aside, this is a massive improvement over previous methods in more ways than one. Since the hard drive light can be activated without kernel access, this exploit can be enacted without root access. Moreover, the group’s experiments show that “sensitive data can be successfully leaked from air-gapped computers via the HDD LED at a maximum bit rate of 4000 bit/s (bits per second), depending on the type of receiver and its distance from the transmitter.” Notably, this speed is “10 times faster than the existing optical covert channels for air-gapped computers.”
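
To put that bit rate in perspective, here's a little arithmetic of our own (not from the paper):

sub leak-time($payload-bits, $rate = 4000) {
    return $payload-bits / $rate;  # seconds at the paper's peak bit rate
}

say leak-time(4096);       # ~1 second for a 4096-bit key
say leak-time(8 * 2**20);  # ~2097 seconds, about 35 minutes, for one megabyte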

We weren’t born last night, and this is not the first time we’ve seen information transmission over air gaps. From cooling fans to practical uses, we’ve seen air gaps overcome. However, there are also plenty of “air gaps” that contain more copper than air, and require correspondingly less effort.

[via /r/hacking]


Filed under: computer hacks

s mazuk: Photo

s mazuk: archiveofaffinities: Kazuhiro Ishii, 54 Windows, Soya Clinic...

archiveofaffinities:

Kazuhiro Ishii, 54 Windows, Soya Clinic and Residence, Hiratsuka, Kanagawa, Japan, 1975

Instructables: exploring - featured: Concrete Slump Test (The Cheap Way)

Concrete cones can be expensive and hard to find. For the home DIY'er, you rarely need a 12 foot cone to do tests. Supplies needed: a 1 quart mixing cup (can be found in the painting section of any hardware store), a tape measure, a tamping rod (this is used to settle the mix, can be any roun...
By: Designer Of

Continue Reading »

Paper Bits: fucktheoryquestions: …And Here My Troubles Began

fucktheoryquestions:

…And Here My Troubles Began

OUR VALUED CUSTOMERS: While discussing the intricacies of STAR WARS...


Hackaday: Ask Hackaday: Bitten by the Crocodile Clip

I have a love/hate relationship with the crocodile clip. Nothing is so quick to lash together a few half-baked prototype boards on your desk, but nothing ends up in such a tangle so quickly, either. I love the range of pretty colors that crocodiles come in, as well as the easy ability to just clip on to the side of a PCB, or any old loose wire. But they come loose, they can have intermittent contacts, and we’re not even sure if there is such a thing as a current rating for them.

When [WarriorRocker] wrote in asking what we use instead of crocodile clips, he included a photo that sent a chill down my spine, from a review of some clips on Amazon. I’ve seen this one in real life. And what’s worse is the one with the loose wires that sometimes make contact with the spring-clip body and sometimes not.

After an hour-long debugging session about twelve years ago now, such an intermittent croc caused us to make a lifelong vow. All of our croco-clips have been disassembled, manually inspected, and many of them soldered together. When I buy new ones, I check them all before mixing them in with the known-goods. Even thinking about this now makes me want to pull back their little rubber booties just to make sure.

But intermittents are not the only source of trouble. How thick is the wire inside your crocs? Are you sure that it’s beefy enough to take the current you’re passing through it? Are you sure the pointy teeth are making enough contact with whatever you’re clipping them to? Of course not. How would you be?

So we’re passing [WarriorRocker]’s questions on to you. Do you have any rules of thumb for how much current to pass through crocodile clips? What do you clip them to and what do you avoid? How do you manage the tangle? Do you just trust them when they come from the factory or have you been bitten too? What alternatives have you considered, and how’s it working? Or is there some ultra-premium crocodile clip manufacturer out there that lets us trade off pain for mere money?


Filed under: Ask Hackaday

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Complexities

DOUG By Guest Blogger Doug Rowat

ETFs are wonderful. There’s a reason why we use them in all our client portfolios. However, it’s not like we’ve discovered a well-kept secret: ETFs are now more than a $100 billion industry in Canada, which has been growing almost 20% annually over the past decade. It’s obvious why: ETFs provide more transparency, tax efficiency and, particularly, lower costs than mutual funds.

However, they’re not perfect.

Below are my concerns, which also highlight the pitfalls to avoid when investing with ETFs.

1. Take the market-mastery sales pitches of smart beta and actively managed ETF providers with a grain of salt. Because plain-vanilla ETFs are now so low cost that they’re, for all intents and purposes, free (a 5 basis point fee for an ETF amounts to a cost of only 50 cents per $1,000 invested), ETF providers have issued more and more higher-cost ETFs with an ‘active’ overlay to juice their margins. The Canadian ETF space is now about 1/3 actively managed and this area continues to grow. The average fee for actively managed ETFs is 0.85% vs. passive ETFs at 0.57%—an almost 50% premium. Certainly some of these ETFs have merit, but investors must be aware that a ‘strategy’ is being applied to the ETF and that this strategy will, at times, fail. Also, investors must ask themselves if the active management is necessary. For instance, why pay for a pricey ‘low volatility’ ETF when portfolio volatility can be controlled simply by adjusting bond weightings in the asset allocation?

2. ETFs have become highly specialized. With more than 6,000 exchange-traded products listed globally, it stands to reason that there will be many that are narrowly focused. On the market currently are ETFs that specialize in robotics, cyber security, livestock and beef futures, Catholic values (no pornography amongst other sinful businesses) and Nashville (yes, as in the home of the Grand Ole Opry). The ETF market is becoming very…what’s the word? Granular. Investors must be honest and ask themselves if they truly understand the highly specific areas that they’re investing in. Keep this in mind as you eagerly await the first medical marijuana ETF (it’s coming; Horizons just filed its prospectus).

3. Timing of new ETF issuance is not always advantageous for investors. ETF providers are in the business of sales. Fair enough, but just as a company typically only launches an IPO when its industry is red hot (read: expensive), so too do ETF providers with their product launches. For example, a new Bitcoin ETF is seeking approval from the Securities and Exchange Commission and may be released shortly. I’m not going to argue the merits of Bitcoin, but it should come as no surprise that Bitcoin relative to the US dollar is challenging its all-time highs (see chart). Also, the potential volatility of many new ETFs is not always clearly disclosed. It might surprise some investors, for instance, to learn that Bitcoin has been roughly 20x more volatile than the S&P 500 over the past five years.

Bitcoin ETF may be released near all-time highs

Deviation: Bitcoin 20x more volatile than S&P 500

Source: Turner Investments, Bloomberg. Standard deviation measures the amount of variation or dispersion for a particular index or security. In other words, it measures the risk of owning that index or security.


5. Leveraged and derivative-based ETFs, even now, aren’t fully understood by investors. This is not entirely the fault of the ETF as it is only doing what it was designed to do, but many investors are still unaware of the downside of these derivative-based products even after all the media coverage of ‘daily rolling contracts’ and ‘volatility drag’. For example, we have new clients arrive all the time holding VIX (volatility) ETFs that have sat in their portfolios for years. A VIX ETF is meant to be tactically traded (a virtually impossible task, by the way), not bought and held. The most popular VIX instrument, the iPATH S&P 500 VIX Short-Term Futures (VXX), held over five years has lost 99% of its value! So, ensure that you fully understand how an ETF should be used before buying. Derivative-based/leveraged ETFs also aren’t cheap, having an average MER of almost 1.5%—more than double the industry average.

ETFs are wonderful products and have driven costs down across the entire investment industry, but they come with their own baggage and complexities. There are now more than 15 ETF providers in Canada. Fifteen years ago there was essentially one. And mutual fund companies are getting more involved in the space: AGF, Dynamic and Manulife are just a few of the fund companies launching ETFs this year. So, ETFs are only going to get more complex with more providers promising that they’ve built a better mouse trap.

Take care that you don’t get your fingers caught. Or find a financial advisor who’s learned better and safer ways to get the cheese.

Doug Rowat, FCSI®, is Portfolio Manager with Turner Investments and Senior Vice President, Private Client Group, Raymond James Ltd.

 

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - The Last Potion



Click here to go see the bonus panel!

Hovertext:
Zach Weinersmith was known for his brilliant combinations of deep science, philosophy, mathematics, and literature.

New comic!
Today's News:

Big announcement Monday. Maybe the biggest of my life. Hope you geeks like it!

Penny Arcade: News Post: The Banner Saga 3 Promotion Thingy

Tycho: Hey!  I backed Banner Saga hard, like, so hard that I got to write a Godstone for the sequel and create an item and stuff.  I was asked to help with a promotion for The Banner Saga 3, and I know the second one was a rough ride for the studio so I said yes several times in rapid succession. They said that I could make three to five items for Banner Saga 3, but if I could make five items instead of three, why wouldn’t I do that?  So let’s go with five.  Also: Twitch Prime is setting up a thing where members can download the original Banner Saga for free through…

Explosm.net: Comic for 2017.02.25

New Cyanide and Happiness Comic

Penny Arcade: News Post: Up Top

Tycho: Horizon: Zero Dawn comes out on the 28th here in the States, and March 1st in Europe.  My feeling is that you should buy it. The introduction has a few very interesting moments, and my daughter Ronia was there for all of them; when the game begins, there are many games it could be.  I like the one that it chooses, but they take their time to let things breathe in this unformed space.  They allow the center to bake through. Horizon is clung-to by incredibly strange ideas; Ronia doesn’t know much about the game, other than the fact that the protagonist looks like…

ScreenAnarchy: Hong Kong goes West - When Hong Kong film makers attempt to break the Western market - part 2

Moving into the 1990s, Golden Harvest would once again make an attempt for American success. Unfortunately their first American-made film of the decade was the poor China O Brien (1990), an attempt by Golden Harvest to launch Cynthia Rothrock as a star in her native country. Golden Harvest had previously worked well with Rothrock on the Hong Kong productions of Yes Madam (1985), Millionaires Express (1986), Above the Law (1986), Inspector Wears Skirts (1988) and The Blond Fury (1989). From a business standpoint it makes sense why they would choose Rothrock to front China O Brien. The main issue is how director Robert Clouse chooses to shoot the action, showing that he hadn’t learned anything since his Enter the Dragon days, if...

[Read the whole post on screenanarchy.com...]

All Content: Rock Dog


You may have already heard that “Rock Dog,” a new animated movie from an outfit that’s not Pixar and not Disney and not Dreamworks and not the People Who Gave You “Ice Age” and not the People Who Gave You “Minions” and “Despicable Me” is, besides not being from a proven provider of child-distracting content, also a movie that features a character who’s a yak and is named Fleetwood Yak.

All these things are true. “Rock Dog” is a Korean/Chinese co-production crafted by largely American animation artists. Portions of its scenario may look like they’ve been inspired by the likes of “Kung Fu Panda” but the actual source material for the movie is a Chinese graphic novel called “Tibetan Rock Dog” by Zheng Jun, whose day job is as a Chinese rock musician. And yes, there is a character in it who’s a yak named Fleetwood Yak.

But to be honest that’s really the most objectionable joke in the whole feather-light, primary-color-filled, shorter-than-90-minute movie. What’s interesting about “Rock Dog” is just how very unapologetically a kid’s movie it is. True, the voice cast for the American version is designed to have some adult appeal. The aforementioned Fleetwood Yak is voiced by Sam Elliott, and the role of the yak is structurally a nod to Mr. Elliott’s work in “The Big Lebowski.” But this movie, despite being about a dog who wants to play rock music, has no winky pop-culture references besides that. The comic actors doing voice work, who also include Luke Wilson, Lewis Black, Matt Dillon, J.K. Simmons, Kenan Thompson, and Eddie Izzard, all do their jobs with relish and dispatch, but there’s nothing clever-clever about it.

The story is dopily simple. In the Asian mountains (Tibet is not prominently mentioned here), an alliance of guard dogs and self-shearing lambs has managed to form a society that happily keeps predatory wolves at bay. The mastiff patriarch Khampa (Simmons) has come to believe music has no part in this society—too distracting—and has banned it. His good-natured but lacking-in-fire son Bodi (Wilson), slouching around in a very neat wool cap that nonetheless manages to scream “stoner,” is traipsing through a valley one day when a radio falls from a plane. Once he tunes into a particular station, he can’t believe what he hears at all. He starts dancing to that fine, fine music, and so on. He breaks out a koto/guitar from the village storeroom and gets to strumming, which gets to annoying his dad. But the old man’s a softie, and gives him a bus ticket to the big city, where he hopes to go to Rock Park and learn from his new idol, rotter rock star Angus Scattergood. The arrival of Bodi attracts the notice of a wolf pack, led by Linnuxx (Black), who, having been banished from his mountain source of lamb chops, now runs mixed-martial-arts events in the local sports arena and thugs around ineffectually with his gang. So Linnuxx here sees an opportunity to get back at Khampa. In the meantime, charmingly naïve Bodi flops big in Rock Park, but is reluctantly adopted by pompous Angus (Izzard), who’s having a devil of a time writing his long-overdue new single, and is intrigued by Bodi, whose burgeoning talent sees him creating a very nice blue fire when he strums his guitar.

All this is every bit as silly and innocuous as it sounds, and it goes down, if not like honey, then like very finely spun cotton candy. I had to pay to see this movie like any other civilian. The distributors were not screening it for “digital” press, so despite my protestations that I liked rock, and that while I’m generally more of a cat person I also like dogs, so how bad was my review likely to be, I couldn’t get headway. So I trudged to my local multiplex at two this afternoon. On the way up to the theater I saw a nanny with a gaggle of six-year-olds, and I was like, “Are you guys going to see ‘Rock Dog’? Me too?” And I realized maybe this was not a good look for an overweight man in his fifties by himself to be sporting. In any event, despite all the many things that “Rock Dog” is not—one of which is “filled with rock music;” there are snippets of a Foo Fighters song and snippets of a Radiohead song and one original song and that’s kind of it—I can report that it enraptured and delighted, and most importantly, made quiet, the houseful of little kids and their nannies with which I watched it. If this is an experience your current moviegoing habits call for, see “Rock Dog” with confidence.


Planet Haskell: Functional Jobs: Software Engineer/Researcher at Galois, Inc. (Full-time)

We are currently seeking software engineers/researchers to play a pivotal role in fulfilling our mission to make critical systems trustworthy.

Galois engineers participate in one or more projects concurrently, and specific roles vary greatly according to skills, interests, and company needs. Your role may include technology research and development, requirements gathering, implementation, testing, formal verification, infrastructure development, project leadership, and/or supporting new business development.

Skills & Requirements

  • Education: Minimum of a Bachelor’s degree in computer science or equivalent. MS or PhD in CS or a related field desirable but optional, depending on specific role.
  • Required Technical Expertise: Must have hands-on experience developing software and/or performing computer science research. Demonstrated expertise in aspects of software development mentioned above.
  • Required General Skills: Must work well with customers, including building rapport, identifying needs, and communicating with strong written, verbal, and presentation skills. Must be highly motivated and able to self-manage to deadlines and quality goals.

Our engineers use tools such as functional programming languages (including Haskell) and formal verification techniques to design and develop advanced technologies for safety- and security-critical systems, networks, and applications. Our research areas include computer security, cyber physical systems, identity management, security risk analysis, machine learning, systems software, and networking. Engineers work in small team settings and must successfully interact with clients, partners, and other employees in a highly cooperative, collaborative, and intellectually challenging environment.

We’re looking for people who can invent, learn, think, and inspire. We reward creativity and thrive on collaboration. If you are interested, please submit your cover letter and resume via our website.

Get information on how to apply for this position.

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Why saving sucks

Poor, sodden, misguided savers. Your nation had bad news for you Friday.

The current inflation rate has bloated to 2.1%, thanks to higher energy costs – mostly gas. For many working people, as well as risk-averse savers, this is a disaster. In fact Stats Can has told us wages in many low-end occupations are now declining. Average weekly earnings for retail workers are lower year/year by more than 2% – which sucks, since almost two million people toil in this business (the most of any sector in the entire economy). Down, too, in hotels, tourism and restaurants, by 1.9%.

Overall, wages in Canada have risen in the past year by only 1.2%. Adjusted for the swelling cost of living, it means these millions are earning less now than they were a year ago. This is another reminder of what a fake economy we live in, how fantasy real estate values have become detached from economic reality, and why a growing hunk of our $2 trillion in household debt is absolutely, totally, and forever unrepayable.

Worse, this is creaming seniors with GICs, wuss investors with whimsical “high-yield” savings accounts and anyone misguided enough to keep big bucks in their chequing or regular savings account. In fact, to earn just 2% on a guaranteed investment at one of the big banks, you have to buy a non-cashable five-year GIC. That’s locking up your money for half a decade, and you’re still losing ground. What a dilemma for people too emotional or financially illiterate to properly deploy their capital.

In fairness, the main reason some folks don’t invest is fear. These days they’re afraid of “the markets” which, in popular parlance, means the Dow.

Three important thoughts to keep in mind.

First, the Dow Jones Industrial Index (and the S&P 500, and the TSX) are in record territory in terms of their numeric indices. But that’s not the measure to look at – rather the P/E ratios. The P stands for the price of stocks and the E represents the earnings of the underlying companies composing the index. The ratio traditionally (for US markets) is in the 17 range, and lately has floated up to around 21. So, yeah, stocks are not cheap.

But neither are they dangerous. The P/E went to 44 during the dot-com era, or more than twice today’s level. And what were retail investors doing then? You bet – feasting on technology stocks and stuffing their RRSPs with Nortel. Kinda like buying a house in the GTA these days.

Second, a balanced and diversified portfolio with 60% of growth assets divided between Canada, the US and international markets will have maybe 6% exposure to big US companies, and another 7% or 8% invested in medium and smaller enterprises. So when you invest this way you’re hardly “playing the market”, especially when also owning preferred shares, bonds, real estate investment trusts and having exposure across the globe.

Third, there’s a strong case to be made that as robust as markets are now, we ain’t seen nothing yet – with this bull as tawny and pumped as, say, my own glistening abs. The reason is simple. If Trump makes good on his promise to slice corporate tax rates from the current 35% level to 15% (or even 20%, or 25%), then the E in the P/E ratio could erupt. A tax reduction of that size would drop huge amounts to corporate bottom lines, suddenly making stocks look cheap.

Will he do it?

We might know next week, when the Trumpster makes a key and widely-anticipated speech to Washington lawmakers. Said an economist at Barclays on Friday: “We think that the presentation to Congress will be a good opportunity for the president to more clearly flesh out his policy priorities and goals, especially on trade, taxes, and public investment.” So far this guy has a track record of doing (or trying to do) what he promised on the campaign trail, and slashing both corporate and middle-class taxes was a key plank in getting elected. (The contrast with Canadian politicians is becoming comical.)

By the way, while GICs were collecting between 1% and 2% last year, a balanced and diversified portfolio delivered well north of 8%. Sure, an all-equity account did better, but it came with a ton of volatility. And, after all, what most people want is simple – no losses, and predictable growth.

There are no guarantees financial assets will continue to perform as they have in the past. But neither are there many reasons to think they won’t. The deflationary years that doomers so love have passed. We’re now on the other side. Savers will pay dearly.

All Content: If We Picked the Winners 2017: Best Picture


In anticipation of the Academy Awards, we polled our contributors to see what they thought should win the Oscar. Once we had our winners, we asked various writers to make the case for our selection in each category. Here, Matt Zoller Seitz makes the case for the Best Picture of 2016: "Moonlight."


"Moonlight" is ecstatic filmmaking. Co-writer/director Barry Jenkins tells his tale of a young man's emotional and sexual coming-of-age through sharp dialogue and clearly articulated scenes that build to small or large epiphanies. But unlike too many American independent films, this one isn't essentially a PDF of the script acted out for a cinematographer who knows how to light faces. No, "Moonlight" is truly a movie, ultimately a symphony of images that punches in the same weight class as such visionary narrative experiments as "2001: A Space Odyssey," "Badlands," "All That Jazz," "The Limey," "Three Times," and perhaps most obviously the work of Wong Kar-Wai, possibly the world's greatest practitioner of sensual melancholy.

The structure has more in common with music than theater or the novel. Jenkins breaks the story into three sections, almost like symphonic movements, titled "Little," "Chiron" and "Black," each dealing with the protagonist, a gay man from Miami, at different phases of his development. In the first section, the character is a smart but persecuted boy known as "Little" (Alex Hibbert); we watch as he's befriended by a drug dealer named Juan (Mahershala Ali), whose client list includes Little's mom Paula (Naomie Harris). Juan and his partner (Janelle Monae) take a parental interest in the life of this boy, who plaintively asks, "Am I a faggot?" From there the film moves through the hero's adolescence, where he's known as Chiron (and played by Ashton Sanders), struggling to balance adolescent macho role-playing with his own suppressed awareness that he's something different than that, someone else, a particular someone who doesn't fit in. The third section revisits Little, who has transformed himself into Black (Trevante Rhodes), a thick-necked tough guy who feels a bit like Juan without the sensitivity, or hiding it; but although he presents a rock-hard facade to the world, he's really a flower that would open if sensitively tended.

Every moment of the hero's development is thoughtfully and sensitively handled. But the space between the three ages of the hero, coupled with Jenkins' elliptical style, means that "Moonlight" leaves plenty of room to imagine the emotional connective tissue and project ourselves into the story, however similar or different our own journeys may be. This is a terrific example of finding the universal in the specific (Miami natives have already testified to the movie's documentary-accurate attention to local dialects, music, food, even the quality of the light), but even more so, it's a visceral, at times sensual experience, a sound and light show with a beating human heart at its center.

All Content: If We Picked the Winners 2017: Best Director


In anticipation of the Academy Awards, we polled our contributors to see what they thought should win the Oscar. Once we had our winners, we asked various writers to make the case for our selection in each category. Here, Brian Tallerico makes the case for the Best Director of 2016: Barry Jenkins for "Moonlight."


It’s not only that Barry Jenkins directed the best film of 2016. The winners for Best Director and Best Picture don’t necessarily need to match up. It’s not only that his competition in this category could have been stronger (no Martin Scorsese? Really?). It’s not only the important message it would send in 2017 to award a black man an Oscar for Best Director for the first time in film history. Sure, all of these factors played a role in our decision to assert that Jenkins should win on Sunday, but it is primarily because of the work right up there on the screen—the fluid storytelling, the blend of the lyrical & the realistic, the ability to make the specific feel universal, the pitch-perfect work with the ensemble, the use of music, the tactile sense of setting. It’s not just one of the best-directed films of 2016 but of the decade.

Consider, if you will, all the places that “Moonlight” could have gone wrong. First and foremost, it is a film divided into three chapters, in which a different performer plays the same character, and yet it never feels episodic. We believe that Little (Alex Hibbert), Chiron (Ashton Sanders) and Black (Trevante Rhodes) are the same person, and we do so because of Jenkins’ direction—his visual choices that tie the chapters together and the way he directs his performers to echo each other without mimicking. He connects the chapters with a visual and emotional language that never lets us doubt or question that these three performers are playing the same person. The movie falls apart without that skill.

“Moonlight” strikes that amazing balance between personal memory and traditional storytelling. Jenkins never lets the specificity of it all push us away, knowing that the film is stronger if it brings us into Tarell McCraney’s personal story instead of trying to make it “something for everyone.” This is Chiron’s story. That you can see yourself in it or feel its emotions is because of the truth of it and Jenkins’ ability to turn truth into poetry.


ScreenAnarchy: NYC Weekend Picks, Feb. 24-26: Jordan Peele Curates, Oscar Nominated Shorts and Best Picture Winners, and Doc Fortnight 2017

This Oscar weekend's offerings include:  At BAM Rose Cinemas, the continuation of the series "The Art of the Social Thriller," curated by Jordan Peele (half of the brilliant sketch duo Key & Peele), of films that inspired his feature directorial debut Get Out, which opens today; Metrograph continues their series "Oscar: Our Favorite Best Picture Winners," which runs through March 3; The 2017 Oscar-nominated short films (live action, documentary, and animation) screen at multiple venues; The Museum of Modern Art wraps up "Doc Fortnight 2017," their eclectic and essential survey of some of the best work in nonfiction filmmaking being made today. More details are in the gallery below....

[Read the whole post on screenanarchy.com...]

ScreenAnarchy: Hong Kong goes West - When Hong Kong filmmakers attempt to break the Western market - part 1

Throughout the 1980s and early 1990s, Hong Kong cinema produced many films that are still considered among the best action films ever made. Films like Police Story (1985), The Killer (1989), Once Upon a Time in China (1991), Hard Boiled (1992) and Full Contact (1992) continue to impress new audiences, so it is no surprise that Hollywood producers began to take notice of their popularity. It was only a matter of time before filmmakers like John Woo, Tsui Hark and Ringo Lam would be brought to Hollywood to incorporate their skills into Hollywood productions. Unfortunately a number of these films never lived up to the directors' Hong Kong work, with Hollywood studios...

[Read the whole post on screenanarchy.com...]

CreativeApplications.Net: Everything old is new again – DiMoDA, a ‘digital native’ platform for digital art

Gallery view of DiMoDA 2.0: Morphe Presence, on view at the RISD Museum January 6th through May 14th, 2017. Courtesy of the RISD Museum, Providence, RI.

DiMoDA is a VR-based ‘digital museum for digital art’ initiated in 2015. After a busy 2016, the museum’s second iteration is currently showing at the RISD Museum in Rhode Island. The museum’s co-founder Alfredo Salazar-Caro sheds a little light on where the platform has been, and where it is going.

Open Culture: How James Joyce’s Daughter, Lucia, Was Treated for Schizophrenia by Carl Jung

The life of James Joyce’s schizophrenic daughter Lucia requires no particular embellishment to move and amaze us. The “received wisdom” about Lucia, writes Sean O’Hagan, is that she lived a “blighted life,” as a “sickly second child” after her brother Giorgio. As a teenager, she “pursued a career as a modern dancer and was an accomplished illustrator. At 20, having abandoned both, she fell hopelessly in love with [Samuel] Beckett, a 21-year-old acolyte of her father’s.” He soon ended their one-sided relationship, an incident that may have triggered a psychotic break. Beckett was one of the few people to visit her later in the mental hospital where she died in 1982 after decades of institutionalization.

Before succumbing to her illness, Lucia was a highly accomplished artist who worked “with a succession of radically innovative dance teachers,” notes Hermione Lee in a review of a recent biography that “prove[s]… Lucia had talent.” (See her above in Paris in 1929.) Her promise renders her fall that much more dramatic, and her tragedy has inspired various sensationalized biographies, plays, a novel and a graphic novel. Lucia also inspired an unflattering portrait in Beckett’s Dream of Fair to Middling Women and, most famously, perhaps provided a model for the language of Finnegans Wake. As Joyce once remarked, “People talk of my influence on my daughter, but what about her influence on me?”


The relationship between father and daughter has provided a subject of disturbing speculation, possibly warranted by Lucia’s “father-fixated… mental agonies,” as Stanford’s Robert M. Polhemus writes, and by “eroticized father-daughter, man-girl relationships” in Finnegans Wake that weave in Freud and Jung “with sexy nymphets on the couches of their secular confessionals.” At least in the excerpt Polhemus cites, Joyce uses the prurient language of psychoanalysis to seemingly express guilt, writing, “we grisly old Sykos who have done our unsmiling bits on ‘alices, when they were yung and easily freudened….”

Without inferring the worst, we can see the rest of this unsettling passage as a parody of Jung and Freud’s ideas, of which, Louis Menand writes, Joyce was “contemptuous.” And yet Joyce sent Lucia to see Carl Jung, “the Swiss Tweedledee,” he once wrote, “who is not to be confused with the Viennese Tweedledee.” His daughter’s behavior had become “increasingly erratic,” Lee writes, “she vomited up her food at table; she threw a chair at Nora [Barnacle, her mother] on Joyce’s 50th birthday… she cut the telephone wires on the congratulatory calls that friends were making about the imminent publication of ‘Ulysses’ in America; she set fire to things….”

After a succession of doctors and diagnoses and an “unwilling incarceration,” Jung agreed to analyze her. He had become acquainted with Joyce’s work, having written an ambivalent 1932 essay on Ulysses (calling it “a devotional book for the object-besotted white man”), which he sent to Joyce with a letter. Jung believed that both Lucia Joyce and her father were schizophrenics, but that Joyce, Menand writes, “was functional because he was a genius.” As Jung told Joyce biographer Richard Ellmann, Lucia and Joyce were “like two people going to the bottom of a river, one falling and the other diving.” Jung also, writes Lee, “thought her so bound up with her father’s psychic system that analysis could not be successful.” He was unable to help her, and Joyce reluctantly had her committed.

Much of the relationship between Joyce and his daughter remains a mystery because of the destruction of nearly all of their correspondence by Joyce’s friend Maria Jolas. (Likewise Beckett burned all of his letters from Lucia). This has not stopped her biographer Carol Loeb Schloss from writing about them as “dancing partners,” who “understood each other, for they speak the same language, a language not yet arrived into words….” What is clear is that “Joyce’s art surrounded” his daughter, “haunted her from birth,” and was part of the circumstances that led to her and her brother often living in extreme poverty and instability.

Lucia resented her father but was never able to fully separate herself from him after several failed relationships with other prominent figures, including American artist Alexander Calder. Whether we characterize her story as one of abuse or, as Lee writes of Schloss’s biography, one of “love and creative intimacy,” depends on what we make of the limited evidence available to us. The erasure of Lucia from her father’s life began not long after his death, and hers “is a story that was not supposed to be told,” writes Schloss. But it deserves to be, as best as it can. Had her life been different, she would doubtless be well-known as an artist in her own right. As one critic wrote of her skills as a performer, linguist, and choreographer in 1928, James Joyce “may yet be known as his daughter’s father.”

Related Content:

Carl Jung Writes a Review of Joyce’s Ulysses and Mails It To The Author (1932)

James Joyce: An Animated Introduction to His Life and Literary Works

When James Joyce & Marcel Proust Met in 1922, and Totally Bored Each Other

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How James Joyce’s Daughter, Lucia, Was Treated for Schizophrenia by Carl Jung is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

s mazuk: archiveofaffinities: Justus Dahinden, Ferro House on the Lake...



archiveofaffinities:

Justus Dahinden, Ferro House on the Lake of Zurich, Zurich, Switzerland, 1970

Quiet Earth: First Look at Crew of ALIEN: COVENANT in Short Film PROLOGUE: LAST SUPPER

I realize this has been floating around the web for a few days but as you may have noticed from the lack of posting this last week, it's been a busy couple of days in the QE virtual office. Were this any other project, I'd likely let it slide but we're talking about a new Alien movie so... we'll make an exception.


Conceived by Ridley Scott and directed by Luke Scott (of Morgan fame), this bit of marketing for Alien: Covenant is a short film titled "Prologue: Last Supper" which features the cast of the upcoming film celebrating their last dinner before going into cryosleep.


As the prologue moniker suggests, this is a precursor to the movie which focuses on the [Continued ...]

Quiet Earth: Survival Tested in PA Thriller HERE ALONE [Trailer]

We had our first promising look at Rod Blackhurst's PA thriller Here Alone last year, shortly after the movie won the Audience Award at the Tribeca Film Festival.


At the time, the movie didn't have a release date but after a successful festival run, Here Alone is finally getting a theatrical release.


Lucy Walters stars as Ann, a young woman living alone in the wilderness and trying to survive after an unexplained outbreak has turned most of the population into zombies. Her somewhat peaceful survival is put into jeopardy when she encounters and helps a pair of survivors.


Yeah, it's another zombie movie and though I don't expect it to re-invent [Continued ...]

Embedded in Academia: Teaching Python Informally to Kids

For the last few months I’ve been running a “coding club” for my son’s sixth-grade class. Once a week, the interested students (about 2/3 of the class) stick around for an hour after school and I help them learn to program. The structure is basically like the lab part of a programming class, without the lecture. I originally had planned to do some lecturing but it soon became clear that at the end of a full day of school (these kids get on the bus around 7:00am, ugh) there was very little attention span remaining. So instead I decided to give them a sheet of problems to work on each week and I spend the hour walking around helping whoever needs help.

One of my main goals is to avoid making anyone hate programming. There are three parts to this. First, I’m not forcing them to do anything, but rather providing suggestions and guidance. Second, there’s no assessment going on. I’ve never been too comfortable with the grading part of teaching, actually. It can really get in the way of learning. Third, I’m encouraging them to work together. I’ve long noticed (starting when I was a CS student) that most learning occurs between students in the lab. Not as much learning happens in the lecture hall.

For a curriculum, we started out with turtle graphics, which I’ve always found to be wonderful. Python’s turtle library is clunkier than Logo and also is abysmally slow, but the advantages (visual debugging, many kids enjoy graphics) outweigh the problems. Fractals (Sierpinski triangle, dragon curve) were a good way to introduce recursion. We spent three or four weeks doing turtle stuff before moving on to simple crypto, which didn’t work as well. On the other hand, the students did really well with mathy code, here’s the handout from one of the math weeks.

Some observations:

  • One of my favorite moments so far has been when a couple of students implemented rot13 functions and used it to send rude messages to each other.
  • It’s pretty important to take a snack break.
  • Although the rockstar / 10x programmer idea is out of favor, it is absolutely clear that there is a huge amount of variation (much more than 10x) in how easily a group of twenty 10- and 11-year-olds learns programming. Some kids are teaching themselves to open files and parse text while others are stuck on basic syntax issues.
  • Syntax, which computer professionals more or less learn to overlook, is hugely important.
  • I’ve always disliked Python’s significant whitespace and watching kids struggle with it has made me hate it even more.

Anyhow, this has been a fun teaching exercise and hopefully it is benefiting the kids.

BOOOOOOOM!: Artist Spotlight: Lauren Matsumoto

Paintings by Brooklyn-based artist Lauren Matsumoto. More images below.

BOOOOOOOM!: Artist Spotlight: Lauren Gallaspy

Sculptures by artist Lauren Gallaspy, currently based in Helena, Montana. See more images below.

Quiet Earth: Bloody New Look at AMERICAN GODS [Trailer]

I must admit that, with the release of "American Gods" just around the corner, I'm just not that excited about the project. I'm having a hard time putting my finger on exactly why that is, but I have a feeling it may have something to do with the fact that I'm still not really sure what the show is about, beyond the plot that has a guy named Shadow taking the job of bodyguard to a guy named Wednesday.


Based on Neil Gaiman's much-beloved novel, the show comes to us from Bryan Fuller (of "Hannibal," "Dead Like Me" and "Wonderfalls" fame) and Michael Green (of the short-lived but amazing "Kings," "Heroes" and "Everwood") and stars Ricky Whittle ("The 100") as Shadow, Ian McShane as Wednesday as well as Gillian Anderson, Emily Browning, Dane Cook, Peter Stormare, Or [Continued ...]

CreativeApplications.Net: Go Rando – a big FU to Facebook sentiment analysis

Go Rando is a Chrome and Firefox extension by Ben Grosser that allows Facebook users to obfuscate their emotional reactions to prevent them from being surveilled and analyzed.

Open Culture: Marcel Proust Plays Air Guitar on a Tennis Racket (1891)

Was “air guitar” a thing back in 1891, when a photographer captured young Marcel Proust in this playful photograph? Probably not. Maybe it’s anachronistic to read the photograph this way. But you have to admit, it’s worth suspending disbelief for a moment and imagining what song Marcel was playing. Any clever guesses?

via The Atlantic

Related Content:

The First Known Footage of Marcel Proust Discovered: Watch It Online

An Introduction to the Literary Philosophy of Marcel Proust, Presented in a Monty Python-Style Animation

When James Joyce & Marcel Proust Met in 1922, and Totally Bored Each Other

16-Year-Old Marcel Proust Tells His Grandfather About His Misguided Adventures at the Local Brothel

Marcel Proust Fills Out a Questionnaire in 1890: The Manuscript of the ‘Proust Questionnaire’

Marcel Proust Plays Air Guitar on a Tennis Racket (1891) is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Colossal: Handmade Ceramic Blooms and Succulents by Owen Mann

Self-taught artist Owen Mann creates ceramic blooms from dozens, and sometimes hundreds, of petals, each hand-formed to mimic the appearance of peonies, dahlias, and spiraling succulents. Simply painted in cool shades of blues and greens, the porcelain flowers look as if they were freshly plucked from the garden. You can see more of Mann’s faux flora on his Instagram, and purchase the pieces on his Etsy shop. (via So Super Awesome)

OUR VALUED CUSTOMERS: While discussing K-2SO in ROGUE ONE...


Planet Haskell: FP Complete: The typed-process library

In October of last year, I published a new library - typed-process. It builds on top of the venerable process package, and provides an alternative API (which I'll explain in a bit). It's not the first time I've written such a wrapper library; I first did so when creating Data.Conduit.Process, which is just a thin wrapper around Data.Streaming.Process.

With this proliferation of APIs, why did I go for another one? With Data.(Conduit/Streaming).Process, I tried to stay as close as possible to the underlying process API. And the underlying process API is rigid for (at least) two reasons:

  • It's one of the most used APIs in the Haskell ecosystem, so breaking changes carry a very high cost
  • Since process is a dependency of GHC itself (a boot library), we're limited in adding dependencies

After I got sufficiently fed up with limitations in the existing APIs, I decided to take a crack at doing it all from scratch. I made a small announcement on Twitter, and have been using this library regularly since its release. In addition, a few people have raised questions on the process issue tracker whose simplest answer is IMO "use typed-process." Therefore, I think now's a good time to discuss the library more publicly and get some feedback as to what to do with it.

Overview of typed-process

There is both a typed-process tutorial and Haddock documentation available. If you want details, you should read those. This section is intended to give a little taste of typed-process to set the stage for the rest of the post.

Everything starts with the ProcessConfig datatype, which specifies all the rules for how we're going to run an external process. This includes all of the common settings from the CreateProcess type in the process package, like changing the working directory or environment variables. Importantly (and the source of the "typed" in the library name), ProcessConfig takes three type parameters, representing the type of the three standard streams (input, output, and error). For example, ProcessConfig Handle Handle Handle indicates that all three streams will have Handles, whereas ProcessConfig () (STM ByteString) () indicates that input and error will be unit, but output can be accessed as an STM action which returns a ByteString. (Much more on this later.)

There are multiple helper functions - like withProcess or readProcess - to take a ProcessConfig and turn it into a live, running process. These running processes are represented by the Process type, which like ProcessConfig takes three type parameters. There are underscore variants of these launch functions (like withProcess_ and readProcess_) to automatically check the exit code of a process and, if unsuccessful, throw a runtime exception.

You can access the exit code of a process with waitExitCode and getExitCode, which are blocking and non-blocking, respectively. These functions also come in STM variants to more easily work with processes from atomic sections of code.
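For instance, here's a minimal sketch (in the same script style as the examples below; sleep is just a stand-in command):

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package typed-process
import System.Process.Typed

main :: IO ()
main = withProcess (proc "sleep" ["2"]) $ \p -> do
    -- Non-blocking: Nothing while the process is still running
    getExitCode p >>= print
    -- Blocking: waits for the process to finish
    waitExitCode p >>= print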

Alright, enough overview, let's start talking about motivation.

Downsides of process

The typed-process tutorial identifies five limitations in the process library that I wanted to overcome. (There's also a sixth issue I'm aware of, a race condition, which I've added as a bonus section.) Let's dive into these more deeply, and see how typed-process addresses them.

Type variables

I've made a big deal about type variables so far. I believe this is the biggest driving force behind the more usable API in typed-process. Let's consider some idiomatic process-based code.

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc
import Control.Exception
import System.Process
import System.IO
import System.Exit

main :: IO ()
main = do
    (Just inh, Just outh, Nothing, ph) <- createProcess
        (proc "cat" ["-", "/usr/share/dict/words"])
            { std_in = CreatePipe
            , std_out = CreatePipe
            }
    hPutStrLn inh "This is the list of all words:"
    hClose inh
    out <- hGetContents outh
    evaluate $ length out -- lazy I/O :(
    mapM_ putStrLn $ take 100 $ lines out
    ec <- waitForProcess ph
    if (ec == ExitSuccess)
        then return ()
        else error $ "cat process failed: " ++ show ec

The fact that std_in and std_out specify the creation of a Handle is not reflected in the types at all. If we left those changes out, our program would still compile, but our pattern match of (Just inh, Just outh, Nothing, ph) would fail. By moving this information into the type system, we can catch bugs at compile time. Here's the equivalent of the code above:

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package typed-process
import Control.Exception
import System.Process.Typed
import System.IO

main :: IO ()
main = do
    let procConf = setStdin createPipe
                 $ setStdout createPipe
                 $ proc "cat" ["-", "/usr/share/dict/words"]
    withProcess_ procConf $ \p -> do
        hPutStrLn (getStdin p) "This is the list of all words:"
        hClose $ getStdin p
        out <- hGetContents $ getStdout p
        evaluate $ length out -- lazy I/O :(
        mapM_ putStrLn $ take 100 $ lines out

If you leave off the setStdin or setStdout calls, the program will not compile. But this is only the beginning. Instead of being limited to either generating a Handle or not, we now have huge amounts of flexibility in how we configure our streams. For example, here's an alternative approach to providing standard input to the process:

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package typed-process
{-# LANGUAGE OverloadedStrings #-}
import Control.Exception
import System.Process.Typed
import System.IO

main :: IO ()
main = do
    let procConf = setStdin (byteStringInput "This is the list of all words:\n")
                 $ setStdout createPipe
                 $ proc "cat" ["-", "/usr/share/dict/words"]
    withProcess_ procConf $ \p -> do
        out <- hGetContents $ getStdout p
        evaluate $ length out -- lazy I/O :(
        mapM_ putStrLn $ take 100 $ lines out

There are functions in the process package that allow specifying standard input this easily, but they are not as composable as this approach (as we'll discuss below).

There's much more to be said about these type parameters, but hopefully this taste, plus the further examples in this post, will demonstrate their usefulness.

Proper concurrency

Functions like readProcessWithExitCode use some pretty hairy (IMO) lazy I/O tricks internally to read the output and error streams from a process. For the most part, you can simply use these functions without worrying about the crazy innards. However, consider if you want to do something off the beaten track, like capture the error stream while allowing the output stream to go to the parent process's stdout. There's no built-in function in process to handle that, so you'll be stuck implementing that behavior yourself. And this functionality is far from trivial to get right.

By contrast, typed-process does not use any lazy I/O. And while it provides a readProcess function, there's nothing magical about it; it's built on top of the byteStringOutput stream config, which uses proper threading under the surface and provides its output via STM for even nicer concurrent coding.

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package typed-process
{-# LANGUAGE OverloadedStrings #-}
import Control.Concurrent.STM (atomically)
import System.Process.Typed
import qualified Data.ByteString.Lazy.Char8 as L8

main :: IO ()
main = do
    let procConf = setStdin closed
                 $ setStderr byteStringOutput
                 $ proc "stack" ["path", "--verbose"]
    err <- withProcess_ procConf $ atomically . getStderr
    putStrLn "\n\n\nCaptured the following stderr:\n\n"
    L8.putStrLn err

STM

I won't dwell much on this one, since the benefits are less commonly useful. Since many functions in typed-process provide both IO and STM alternatives, it can significantly simplify some concurrent algorithms by letting you keep more logic within an atomic block. This is similar to (and inspired by) the design choices in the async library, which is my favorite library of all time.

Binary I/O

All input and output in typed-process works on binary data as ByteStrings, instead of textual String data. This avoids the character-encoding pitfalls of String-based I/O, and better matches the raw bytes the operating system actually gives us.

More composable

A major goal of this library has been to be as composable as possible. I've been frustrated by two issues in the process package:

  1. Many common changes to the API necessitate a breaking API change (e.g., the addition of the child_group setting or NoStream constructor)
  2. There is a big split between helper functions that work on CreateProcess values (like readCreateProcess) and those that work on raw command/argument pairs (like readProcess). The situation has improved in recent releases, but in older process releases, the lack of CreateProcess variants of many functions made it very difficult to both modify the environment/working directory for a process and capture its output or error.

For (1), I've gone the route of smart constructors throughout the API. You cannot access the ProcessConfig data constructor, but instead must use proc, shell, or OverloadedStrings. Instead of record accessors, there are setter and getter functions. And instead of a hard-coded list of stream types via a set of data constructors, you can create arbitrary StreamSpecs via the mkStreamSpec function. I hope this turns out to be an API that is resilient to breaking changes.

For (2), the solution is easy: all launch functions in typed-process work exclusively on ProcessConfig. Problem solved. We now have a very clear breakdown in the API: first you configure everything you want about your process, and then you choose whichever launch function makes the most sense to you.
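Here's a small sketch of that configure-then-launch flow (the command and directory are just placeholders), combining a working directory change with captured output - exactly the combination that used to be awkward:

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package typed-process
import System.Process.Typed
import qualified Data.ByteString.Lazy.Char8 as L8

main :: IO ()
main = do
    -- First: configure everything about the process
    let procConf = setWorkingDir "/tmp"
                 $ proc "ls" ["-l"]
    -- Then: pick a launch function. readProcess_ captures stdout and
    -- stderr, and throws if the exit code is non-zero.
    (out, _err) <- readProcess_ procConf
    L8.putStr out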

Bonus: Race condition

There's a long-standing issue in process - which will hopefully be resolved soon - that introduces a race condition on waiting for child processes. In typed-process, we've avoided this entirely with a different approach to child process exit codes. Namely: we fork a separate thread to wait for the process and fill an STM TMVar, which both ensures no race condition and makes it possible to observe the process exiting from within an atomic block.
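In miniature, the technique looks something like this (a sketch of the approach only, not the library's actual internals):

#!/usr/bin/env stack
-- stack --install-ghc --resolver lts-8.0 runghc --package stm
import Control.Concurrent (forkIO)
import Control.Concurrent.STM
import System.Process (createProcess, proc, waitForProcess)

main :: IO ()
main = do
    (_, _, _, ph) <- createProcess (proc "sleep" ["1"])
    ecVar <- newEmptyTMVarIO
    -- Exactly one thread ever calls waitForProcess...
    _ <- forkIO $ waitForProcess ph >>= atomically . putTMVar ecVar
    -- ...so any number of threads can observe the exit code atomically,
    -- as often as they like, with no race on the wait itself.
    ec <- atomically (readTMVar ecVar)
    print ec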

As a side benefit, this also avoids the possibility of accidentally creating zombie processes by not getting the process's exit code when it finishes. Similarly, by encouraging the bracket pattern (via withProcess) when interacting with a process, killing off child processes in the case of exceptions happens far more reliably.

Limitations

For the most part, I have not run into significant limitations with typed-process so far. The biggest annoyances I have with it are those inherited from process, specifically that command line arguments and environment variables are specified as Strings, leading to some character encoding issues.

I'm certain there are limitations of typed-process versus process. And for some users, there may be a higher learning curve with typed-process than with process. I haven't received enough feedback to assess that yet, however.

The other downside is dependencies, for those who worry about such things. In addition to depending on process itself (and therefore inheriting its dependencies), typed-process depends on async, bytestring, conduit, conduit-extra, exceptions, stm, and transformers. The conduit deps could easily be moved out; they exist only to provide a convenience function that could live elsewhere. Regarding the others:

  • transformers is only needed for MonadIO. Now that MonadIO has moved into base, I could make that dependency conditional.
  • The exceptions dependency makes withProcess more general, and would be a shame to lose.
  • Dropping async and stm could be done by inlining their code here, which would work, but is a bad idea IMO.

The only reason for considering these changes would be the next section...

What's next?

I'm left with the question of what to do with this package, especially as more people ask questions that can be answered with "just use typed-process."

  • Do nothing. The package can live on Hackage/Stackage as-is, people who want to use it can use it, and that's it.
  • Add a note to the process package mentioning it as a potential, alternative API. Even though I'm currently the process package maintainer, I feel it would be inappropriate for me to make such a decision myself.
  • Even more radically: if there is strong support for this API, we could consider merging it back into the process package. I wouldn't be in favor of modifying the System.Process module (we should keep it as-is for backwards compatibility), but adding a new module with this API is certainly doable (sans the dependency issues mentioned above).

At the very least, this library has scratched a personal itch. If it helps others, that's a great perk :).

Open Culture: Walt Disney Creates a Frank Animation That Teaches High School Kids All About VD (1973)

The comically plainspoken, tough-guy sergeant is a heaven-sent assignment for character actors.

Think R. Lee Ermey in Full Metal Jacket

Louis Gossett Jr. in An Officer and a Gentleman

Even Stripes’ Warren Oates.

Keenan Wynn, who strove to keep America safe from “deviated perverts” in 1964’s Dr. Strangelove, was awarded the role of a lifetime nine years later, when Disney Studios was seeking vocal talent for VD Attack Plan, above, a 16-minute animation intended to teach high schoolers about the scourge of venereal disease.

Wynn (son of Ed) threw himself into the part with gusto, imbuing his badly-complected, Kaiser-helmeted germ commander with the sort of straight-talking charisma rarely seen in high school Health class.

A risky maneuver, given that Vietnam-era teens did not share their parents’ generation’s respect for military authority, and VD Attack Plan was the first educational short specifically aimed at the high school audience. Prior to that, such films were geared toward soldiers. (Disney waded into those waters in 1944 with the training film A Few Quick Facts No. 7—Venereal Disease, the same year Mickey Mouse appeared in LOOK magazine, waging war on gonorrhea with sulfa drugs.)

Gonorrhea was well represented in Wynn’s Contagion Corps. The ranks were further swelled by Syphilis. Both platoons were outfitted with paramilitary-style berets.

The Sarge pumped them up for the coming sneak attack by urging them to maim or, better yet, kill their human enemy. Shaky recruits were reassured that Ignorance, Fear, and Shame would have their backs.

Scriptwriter Bill Bosche had quite the knack for identifying what sort of sugar would make the medicine go down. The Sarge intimates that only a few of the afflicted are “man enough” to inform their partners, and while Ignorance and Shame cause the majority to put their faith in ineffectual folk remedies, the “smart ones” seek treatment.

Elementary psychology, but effectual nonetheless.

Today’s viewers can’t help but note that HIV and AIDS had yet to assert their fearsome hold.

On the other hand, the Sarge’s matter of fact delivery regarding the potential for same sex transmission comes as a pleasant surprise. His primary objective is to set the record straight. No, birth control pills won’t protect you from contracting the clap. But don’t waste time worrying about picking it up from public toilet seats, either.

A word of caution to those planning to watch the film over breakfast: there are some truly gnarly graphic photos of rashes, sores, and skin eruptions. Helpful to teens seeking the straight dope on their worrisome symptoms. Less so for anyone trying to enjoy their breakfast links sans the specter of burning urination.

So here’s to the sergeants of the silver screen, and the hardworking actors who embodied them, even those whose creations resembled Pillsbury’s Funny Face drink mix mascots. Let’s do as the Sarge says, and make every day V-D Day!

VD Attack Plan will be added to the animation section of our collection, 1,150 Free Movies Online: Great Classics, Indies, Noir, Westerns, etc..

Related Content:

Watch Family Planning, Walt Disney’s 1967 Sex Ed Production, Starring Donald Duck

The Story Of Menstruation: Watch Walt Disney’s Sex Ed Film from 1946

Salvador Dalí Creates a Chilling Anti-Venereal Disease Poster During World War II

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Her play Zamboni Godot is opening in New York City next week. Follow her @AyunHalliday.

Walt Disney Creates a Frank Animation That Teaches High School Kids All About VD (1973) is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Birth Strategy



Click here to go see the bonus panel!

Hovertext:
I'm just saying, from an evolutionary perspective, this is the best move.

New comic!
Today's News:

Big announcement Monday! Stay tuned...

Penny Arcade: Comic: Up Top

New Comic: Up Top

ScreenAnarchy: Dublin 2017 Review: NAILS Slam The Coffin Shut In This Horror Hospital

The current decade has been a very good one for Irish horror. While they haven't quite reached the same kind of transgressive peak that extreme French horror did in the last decade with Frontiere(s), Inside, and Martyrs, Ireland's current horror boom is putting the island nation back on the genre film map in a very noticeable way. Dennis Bartok's Nails takes this trend and many of its recurring themes, and builds on them with a solid addition to the canon of modern Irish horror. Nails is the story of health nut, wife, and mother Dana (The Descent's Shauna MacDonald), who is horribly injured after being hit by a car and ends up in long-term care in the hospital. Completely bed-ridden while attempting to heal, Dana...

[Read the whole post on screenanarchy.com...]

Electronics-Lab: Hack Your Car With Macchina M2

Car hacking applications have been growing during the last few years, making it faster and cheaper to get into automotive tinkering. A new device called M2, by Macchina, was launched recently on Kickstarter.

M2 is an open-source, versatile development platform which can be wired under the hood for a more permanent installation or plugged into the OBD2 port, enabling you to do virtually anything with your vehicle’s software.

It is a tiny device (56.4mm x 40.6mm x 15.7mm): compact, modular, wirelessly connectable, and based on the popular Arduino Due. It consists of a processor board with a SAM3X8E Cortex-M3 MCU, a USB port, some LEDs, an SD card slot, and built-in EEPROM, as well as an interface board with two channels of CAN, two channels of LIN/K-LINE, a J1850 VPW/PWM, and even a single-wire (GMLAN) interface.

M2 is universal: its libraries and protocols are compatible with any car that isn’t older than Google. Macchina also aims to make the M2 compatible with as many existing open source software packages as possible. It is already compatible with SavvyCAN, CanCAT, MetaSploit, and CANtact.

Working with M2 is easy for Arduino users. Here is a summary of the steps needed to duplicate Macchina’s shift-light project on a CAN bus-equipped manual-transmission car, which also illustrates the basic workflow when car hacking with M2:

  • Step 1: Download the latest Arduino IDE and install the Macchina boards add-on; test that everything is working by blinking an LED.
  • Step 2: Download and install one of several open source “Sniffer” applications to your computer and upload the corresponding “sketch” to M2.
  • Step 3: Use the “Sniffer” application to identify the piece of data you are looking for; in this case, engine RPM.
  • Step 4: Write a “sketch” to watch for RPM data, light up some LEDs proportionally, and flash when it is time to shift.

You can also check this video to see an example of simple car hacking:

Macchina has partnered with Arduino, Digi and Digi-Key to develop M2, and it believes that its highly-adaptable hardware will most benefit hot rodders, mechanics, students, security researchers, and entrepreneurs by providing them access to the inner workings of their rides.

As it is an open source project, you can get its 3D files, schematics, BOM, and source files in the GitHub repository. M2 will be available for $79, and it may cost about $110 if you build it yourself. Visit Macchina’s Kickstarter page to learn more or pre-order yours today. You can also check out Hackaday’s review of M2.

Macchina M2 tutorial introduction:

The post Hack Your Car With Macchina M2 appeared first on Electronics-Lab.

Daniel Lemire's blog: Tech jobs are already largely automated

Lately, I have been reading a lot about the threat to computer jobs from automation. For example, we have AI systems that can write their own code. And billionaire Mark Cuban predicts that “software will soon begin writing itself, which will ultimately eliminate those lucrative software development jobs”.

I think that people like Cuban fundamentally misunderstand the reality of the tech industry. How do they think a couple of college graduates can build a tool that can be used by millions of people?

We work on layers upon layers of automation. We have been adding layers for decades.

Have you looked around lately? More of us than ever are computer programmers, IT specialists and so forth. More automation has led to more jobs.

Will software write its own code? It does so all the time. The optimizing compilers and interpreters we rely upon generate code for us constantly. This is not trivial automation. Very few human beings would be capable of taking modern-day JavaScript and writing efficient machine code to run it. I would certainly be incapable of doing such work in any reasonable manner.

Programmers work actively to make themselves obsolete. That is, they work hard to solve problems so that nobody else ever has to worry about them again. The solutions to those problems become automated.

Naively, one might think that software automation makes computer jobs less important. This makes some kind of sense if you think that all computer jobs will get automated.

But that’s not how the world tends to work. As automation increases, more and more new jobs are created higher up the tree. We no longer need many people to write C data structures, but we have seen an explosion in the number of web developers, data scientists and so forth.

Planet Haskell: Brent Yorgey: Signed sets and ballots, part 2

Recall, from my previous post, that our goal is to find a combinatorial proof showing the correspondence between signed sets and signed ballots, where a signed set is just a set of n elements, considered positive or negative according to the parity of n, and a signed ballot is an ordered list of sets, considered positive or negative according to the parity of the number of sets.

So, how should such a proof look? For a given number of labels n, there is a single signed set structure, which is just the set of labels itself (with a sign depending on the parity of n). On the other hand, there are lots of ballots on n labels; the key is that some are positive and some are negative, since the sign of the ballots depends on the parity of the number of parts, not the number of labels. For example, consider n = 3. There is a single (negative) signed set structure:

(I will use a dashed blue line to indicate negative things, and a solid black line for positive things.)

On the other hand, as we saw last time, there are 13 ballot structures on 3 labels, some positive and some negative:

In this example, it is easy to see that most of the positives and negatives cancel, with exactly one negative ballot left over, which corresponds with the one negative set. As another example, when n = 4, there is a single positive set, and 75 signed ballots:

This time it is not quite so easy to tell at a glance (at least not the way I have arranged the ballots in the above picture!), but in fact one can verify that there are exactly 37 negative ballots and 38 positive ones, again cancelling to match the one positive set.

What we need to show, then, is that we can pair up the ballots in such a way that positive ballots are matched with negative ballots, with exactly one ballot of the appropriate sign left to be matched with the one signed set. This is known as a signed involution: an involution is a function which is its own inverse, so it matches things up in pairs; a signed involution sends positive things to negative things and vice versa, except for any fixed points.

In order to do this, we will start by assuming the set of labels is linearly ordered. In one sense this is no big deal, since for any finite set of labels we can always just pick an arbitrary ordering, if there isn’t an “obvious” ordering to use already. On the other hand, it means that the correspondence will be specific to the chosen linear ordering. All other things being equal, we would prefer a correspondence that depends solely on the structure of the ballots, and not on any structure inherent to the labels. I will have quite a bit more to say about this in my third and (probably) final post on the topic. But for today, let’s just see how the correspondence works, given the assumption of a linear order on the labels. I came up with this proof independently while contemplating Anders Claesson’s post, though it turns out that the exact same proof is already in a paper by Claesson and Hannah (in any case it is really just a small lemma, the sort of thing you might give as a homework problem in an undergraduate course on combinatorics).

Given some ballot, find the smallest label. For example, if the labels are {1, …, n} as in the examples so far, we will find the label 1.

  • If the smallest label is contained in some part together with at least one other label, separate it out into its own part by itself, and put it to the right of its former part. Like this:

  • On the other hand, if the smallest label is in a part by itself, merge it with the part on the left (if one exists). This is clearly the inverse of the above operation.

  • The only case we haven’t handled is when the smallest label is in a part by itself which is the leftmost part in the ballot. In that case, we leave that part alone, switch to considering the second-smallest label, and recursively carry out the involution on the remainder of the ballot.

    For example:

    In this case we find the smallest label (1) in a part by itself in the leftmost position, so we leave it where it is and recurse on the remainder of the ballot. Again, we find the smallest remaining label (2) by itself and leftmost, so we recurse again. This time, we find the smallest remaining label (3) in a part with one other label, so we separate it out and place it to the right.

This transformation on ballots is clearly reversible. The only ballots it doesn’t change are ballots with each label in its own singleton part, sorted from smallest to biggest, like this:

In this case the algorithm recurses through the whole ballot and finds each smallest remaining label in the leftmost position, ultimately doing nothing. Notice that a sorted ballot of singletons has the same sign as the signed set on the same labels, namely, (-1)^n. In any other case, we can see that the algorithm matches positive ballots to negative and vice versa, since it always changes the number of parts by 1, either splitting one part into two or merging two parts into one.

Here’s my implementation of the involution in Haskell:

import Data.List (delete, sort)

type Ballot = [[Int]]

ballotInv :: Ballot -> Ballot
ballotInv = go 1
  where
    go _ [] = []
    go s ([a]:ps)
      | s == a = [a] : go (s+1) ps
    go s (p:ps)
      | s `elem` p = delete s p : [s] : ps
    go s (p:[a]:ps)
      | s == a = sort (a:p) : ps
    go s (p:ps) = p : go s ps

(The call to sort is not strictly necessary, but I like to keep each part canonically sorted.)
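A quick sanity check in GHCi, with the definitions above in scope:

ghci> ballotInv [[1,3],[2]]
[[3],[1],[2]]
ghci> ballotInv [[3],[1],[2]]
[[1,3],[2]]
ghci> ballotInv [[1],[2],[3]]  -- the sorted all-singleton ballot is fixed
[[1],[2],[3]]

Notice that the first two ballots have two and three parts respectively, so they carry opposite signs, exactly as the matching requires.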

Here again are the 13 signed ballots for n = 3, this time arranged so that the pair of ballots in each row correspond to each other under the involution, with the leftover, sorted ballot by itself at the top.

If you’d like to see an illustration of the correspondence for n = 4, you can find it here (I didn’t want to include it inline since it’s somewhat large).

This completes the proof that signed sets and signed ballots correspond. But did we really need that linear order on the labels? Tune in next time to find out!


Open Culture: A Free Short Course on How Pixar Uses Physics to Make Its Effects

A new computer-animated spectacle that makes us rethink the relationship between imagination and technology seems, now, to come out every few months. Audiences have grown used to various computer animation studios all competing to wow them, but not so long ago the very notion of entertaining animation made with computers sounded like science fiction. All that changed in the mid-1980s when a young animator named John Lasseter breathed life into the CGI stars of such now simple-looking but then revolutionary shorts as The Adventures of André and Wally B. and Luxo Jr., the latter being the first independent production by a certain Pixar Animation Studios.

We know Pixar today as the outfit responsible for Toy Story, The Incredibles, WALL-E, and other groundbreaking computer-animated features, each one more impressive than the last. How do they do it? Why, with ever-larger and more highly skilled creative and technological teams, of course, all of whom work atop a basic foundation laid by Lasseter and his predecessors in the art of computer animation, in the search for answers to one question: how can we get these digital machines to convincingly simulate our world?


After all, even imaginary characters must emote, move around, and bump into one another with conviction, and do it in a medium of light, wind, water, and much else at that, all ultimately undergirded by the laws of physics.

Thanks to Pixar and their competition, not a few members of the past couple generations have grown up dreaming of mastering computer animation themselves. Now, in partnership with online educational organization Khan Academy, they have a place to start: Pixar in a Box, a series of short interactive courses on how to “animate bouncing balls, build a swarm of robots, and make virtual fireworks explode,” which vividly demonstrates that “the subjects you learn in school — math, science, computer science, and humanities — are used every day to create amazing movies.” The effects course gets deeper into the nitty-gritty of just how computer animators have found ways of taking real physical phenomena and “breaking them down into millions of tiny particles and controlling them using computer programming.”

It all comes down to developing and using particle systems, programs designed to replicate the motion of the real particles that make up the physical world. “Using particles is a simplification of real physics,” says Pixar Effects Technical Director Matt Wong, “but it’s an effective tool for artists. The more particles you use, the closer you get to real physics. Most of our simulations require millions and millions of particles to create believable water,” for instance, which requires a level of computing power scarcely imaginable in 1982, when Pixar’s own effects artist Bill Reeves (who appears in one of these videos) first used a particle system for a visual effect in Star Trek II. These effects have indeed come a long way, but as anyone who takes this course will suspect, computer animation has only begun to show us the worlds it can realize.
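To make the idea concrete, here is a toy particle system in a few lines of Haskell (a sketch of the general technique only; it has nothing to do with Pixar's actual tools): each particle carries a position and a velocity, and one Euler step of simulated gravity advances the whole swarm.

-- Toy particle system: Euler integration of point masses under gravity.
data Particle = Particle
  { position :: (Double, Double)  -- (x, y)
  , velocity :: (Double, Double)  -- (vx, vy)
  } deriving Show

-- Advance one particle by a time step dt under constant gravity.
step :: Double -> Particle -> Particle
step dt (Particle (x, y) (vx, vy)) =
  Particle (x + vx * dt, y + vy * dt) (vx, vy - 9.8 * dt)

-- A tiny swarm; a film-scale effect would use millions of these per frame.
swarm :: [Particle]
swarm = [Particle (0, 0) (vx0, 10) | vx0 <- [1 .. 5]]

main :: IO ()
main = mapM_ print (iterate (map (step 0.1)) swarm !! 10)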

For more Pixar/Khan Academy courses, please see the items in the Relateds below.

Related Content:

Pixar & Khan Academy Offer a Free Online Course on Storytelling

Take a Free Online Course on Making Animations from Pixar & Khan Academy

Pixar’s 22 Rules of Storytelling … Makes for an Addictive Parlor Game

Free Online Physics Courses

A Rare Look Inside Pixar Studios

The Beauty of Pixar

The First 3D Digital Film Created by Ed Catmull, Co-Founder of Pixar (1970)

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on a book about Los Angeles, A Los Angeles Primer, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.

A Free Short Course on How Pixar Uses Physics to Make Its Effects is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.

All Content: Get Out


This review was originally published on January 24, 2017, as a part of our Sundance Film Festival coverage.

With the ambitious and challenging “Get Out,” which premiered in a secret screening at the 2017 Sundance Film Festival, Jordan Peele reveals that we may someday consider directing to be the greatest talent of this fascinating actor and writer. We knew from his days on “Key & Peele” and in feature comedies that he was a multiple threat, but his directorial debut is a complex, accomplished genre hybrid that should alter his business card. “Get Out” feels fresh and sharp in a way that studio horror movies almost never do. It is both unsettling and hysterical, often in the same moment, and it is totally unafraid to call people on their racist bullshit. When he introduced the film in Park City, he revealed that it started with an attempt to write a movie he hadn’t seen before. We need more directors willing to take risks with films like “Get Out.”

To be fair, Peele is clearly riffing on some films he has seen before, including “The Stepford Wives” and “Rosemary’s Baby,” although with a charged, racial twist. His film is essentially about that unsettling feeling when you know you don’t belong somewhere; when you know you’re unwanted or perhaps even wanted too much. Peele infuses the age-old genre foundation of knowing something is wrong behind the closed doors around you with a racial, satirical edge. What if going home to meet your girlfriend’s white parents wasn’t just uncomfortable but downright life-threatening?

“Get Out” opens with a fantastic tone-setter. A young man (the great Keith Stanfield, in two other movies at this year’s Sundance and fantastic on FX’s “Atlanta”) is walking down a suburban street, joking with someone on the phone about how he always gets lost because all the streets sound the same. A car passes him, turns around, and slowly starts following him. It’s an otherwise empty street, so the guy knows something is wrong. Suddenly, and perfectly staged in terms of Peele’s direction, the intensity of the situation is amplified and we are thrust into a world in which the safe-looking suburbs are anything but.

Cut to our protagonists, Chris (Daniel Kaluuya) and his girlfriend Rose (Allison Williams of “Girls”), preparing to go home to meet her parents. Rose hasn’t told them he’s black, which she blows off as no big deal, but he’s wary. His TSA Agent buddy (a hysterical LilRel Howery) warns him against going too, but Chris is falling in love with Rose. He’ll have to meet them eventually. And Rose swears her dad would have voted for Obama a third time if he could have.

From the minute that Chris and Rose arrive at her parents’ house, something is unsettling. Sure, Dean (Bradley Whitford) and Missy (Catherine Keener) seem friendly enough, but almost too much so, like they’re looking to impress Chris. More unnerving is the demeanor of a groundskeeper named Walter (Marcus Henderson) and a housekeeper named Georgina (Betty Gabriel), who almost appear to be like the pod people from “Invasion of the Body Snatchers.” There’s just something wrong. But, as we so often do in social or racial situations, Chris keeps trying to excuse their behavior—maybe Walter is jealous and maybe Georgina has an issue with Chris being with a white woman. The lurking presence of Rose’s odd brother (Caleb Landry Jones), who often looks like he’s auditioning for a remake of “A Clockwork Orange,” doesn’t help. Chris goes out to have a smoke one night, and, well, things start to get even stranger in ways I won’t spoil—in fact, the preview gives away way too much. Avoid it if you can.

“Get Out” is a slow-burn of a film for its first half as Peele piles up the clues that something is wrong. Or could Chris just be overreacting to everyday racial tension? Peele’s greatest gift here is in the way he walks that fine line, staging exchanges that happen all the time but imbuing them with a greater degree of menace. As white partygoers comment on Chris’ genetically-blessed physical gifts, the mind races as to what exactly the greater purpose of this visit is for this young man, a minority in a sea of white people who seem to want to own him, which is itself a razor-sharp commentary on the way we often seek to possess aspects of cultures other than our own.

Then Peele drops his hammer. The final act of “Get Out” is an unpredictable thrill ride. As a writer, Peele doesn’t quite bring all of his elements together in the climax in the way I wish he would, but he proves to be a strong visual artist as a director, finding unique ways to tell a story that goes increasingly off the rails. The insanity of the final act allows some of the satirical, racially-charged issues to drop away, which is slightly disappointing. He’s playing with so many interesting ideas when it comes to race that I wish the film felt a bit more satisfying in its payoff, even if that disappointment is amply offset by the pure intensity of the final scenes, during which Peele displays a skill with horror action that I didn’t know he had.

Peele works well with actors too, drawing a great leading man turn from Kaluuya, letting Williams essentially riff on her “Girls” persona, and knowing exactly what to do with Whitford & Keener, both of whom have always had that dangerous edge to their amiability. They’re excellent at working something sinister into their gracious host routines.

Most importantly, Peele knows how to keep his concept front and center. “Get Out” is not a film that takes breaks for comedy routines (even if Howery allows a little relief, it's often in the context of how he's convinced all white people want black sex slaves), keeping us on edge and uncertain from the opening scene to the final one. He understands that every time a black man goes home to visit his white girlfriend’s parents, there is uncertainty and unease. He’s merely turning that up, using an easily identifiable racial tension to make a horror movie. Many of our greatest genre filmmakers have done exactly the same thing—amplifying fears already embedded in the human condition for the purpose of movie horror. We just don’t often see something quite so ambitious from a February horror flick or a first-time director. Even if the second half doesn’t quite fulfill the promise of the first, Peele doesn’t just deserve credit for trying something so daring, he should have producers knocking down his door to see what else he’s never seen before.


All Content: I Don't Feel at Home in This World Anymore

You can honestly say of this film by writer/director Macon Blair that they don't make 'em like they used to. "I Don't Feel at Home in This World Anymore" is an American independent film from the 1990s that just happens to have been released this year.

The stakes are small compared to what tends to happen in American movies now; the story is rather slight; the filmmakers pay closer attention to the small details of character interaction than to the fine points of plot. The whole thing is mainly situational: we get to watch the heroine, Ruth Kimke (Melanie Lynskey), as she reacts to having her house broken into, and follow along as she decides to locate and punish the people who did it with help from an oddball neighbor (Elijah Wood's Tony), who's enamored with morning stars and nunchucks.

Really, though, this is a film about the utter indifference and outright hostility that people encounter every day, and how essentially decent people like Ruth suffer and suffer through it, almost always silently, until they finally snap. The break-in is the culmination of a series of unfortunate encounters: she has to deal with an old racist at the nursing home where she works. She gets stuck in traffic and spies a jerk in a pickup truck at the head of the lane whose tailpipe spews inky smoke as he revs his engine. In a scene that will break the hearts of many regulars who read reviews, Ruth enjoys a drink at a neighborhood bar while reading a new book, only to have a plot twist casually spoiled by another customer whom she initially mistakes for a nice guy (played by Blair himself).

The movie never escalates beyond a high simmer, though, and once we get to the inevitable (if welcome and satisfying) climax, you may start to tally up all the missed opportunities. Although "I Don't Feel At Home" has been compared to "Falling Down" for its interest in an ordinary citizen pushed to the breaking point through an accumulation of indignities, the movie it most reminded me of, at least in terms of aspiration, is Jonathan Demme's "Something Wild," a movie which, like this one, is all over the map tonally, veering from sly comedy to knockabout slapstick to much darker passages that verge on thumbscrews-tightening thriller atmospherics. (The initial scene of Ruth discovering her home—which has, as she puts it, been "violated"—is ultimately creepier than the finale, even though nothing much happens beyond agonized reactions by Ruth.)

The film is worth seeing for its interest in eccentric but realistic people, in particular Ruth, who's played with great intelligence and exactness by Lynskey. Lynskey, who first came to moviegoers' attention in "Heavenly Creatures," is one of those actresses I'm never not glad to see, and it's a treat to see her front-and-center here, carrying an entire movie mainly with her eyes, face and shoulders. A performance like this one can be quite tricky—you're essentially reactive a lot of the time, more of a sponge for the film than the motor driving it along—but Lynskey makes everything active by letting you feel Ruth's emotions and sense her train of thought as she puts various pieces together in her head, drawing correct or wrongheaded conclusions. She's also just a terrific audience surrogate. When she snarls or snaps, I wanted to cheer.


BOOOOOOOM!: Photographer Spotlight: Taylor Thomas Galloway

A selection of work by photographer Taylor Thomas Galloway. More images below.

Open Culture: Watch Earth, a Landmark of Soviet Cinema (1930)

Today we’re adding to our list of Free Movies a 1930 Soviet silent film by director Alexander Dovzhenko. It’s called Earth, and it’s the third installment in Dovzhenko’s “Ukraine Trilogy.”

When The Guardian created its list of the Top 10 Silent Movies of all time, it put Earth in the #9 slot. About the film, writer Pamela Hutchinson said:

Earth, capped by that avowedly secular title, is a lyrical, carnal movie about birth, death, sex and rebellion. Officially, this Soviet-era Ukrainian silent is a paean to collective farming, crafted around a family drama, but its director, Alexander Dovzhenko, was a born renegade, for whom plots were far less important than poetry…

Earth is the final part of Dovzhenko’s silent trilogy (following the nationalist fantasy Zvenigora (1928) and the avant-garde anti-war film Arsenal (1929)), and is brimming with exuberant youth, but haunted by the shadow of death….

Sketched as tribute to the boons of collectivisation, but released as those schemes were falling out of favour, Earth was condemned on its home turf on political grounds. It was also snipped by censors who objected to the nudity, and the infamous scene in which farmers urinate into their tractor’s radiator. But while there was dismay and censure in the Soviet Union, critics elsewhere were overawed…

It’s the latter impression that endures. Dovzhenko’s symbolism is both rich and audacious. His scope comprises vast pastoral landscapes, and intimate fleshy nakedness. Perhaps its most celebrated sequence is the magnificent opening scene: the painful counterpoint between a dying man, his infant grandchildren and the bursting fruit of his orchard. This is living cinema, as refreshing and vital as the film’s own climactic downpour.

You can watch Earth above, and find it listed in our collection of Free Silent Films, a subset of our meta collection, 1,150 Free Movies Online: Great Classics, Indies, Noir, Westerns, etc. Below you can watch a version of Earth with a recent soundtrack provided by the amazing Ukrainian ensemble, DakhaBrakha. Enjoy.

Blog – free electrons: Linux 4.10, Free Electrons contributions

After 8 release candidates, Linus Torvalds released the final 4.10 Linux kernel last Sunday. A total of 13029 commits were made between 4.9 and 4.10. As usual, LWN had a very nice coverage of the major new features added during the 4.10 merge window: part 1, part 2 and part 3. The KernelNewbies Wiki has an updated page about 4.10 as well.

Of those 13029 commits, 116 were made by Free Electrons engineers, which, interestingly, is exactly the same number of commits we made for the 4.9 kernel release!

Our main contributions for this release have been:

  • For Atmel platforms, Alexandre Belloni added support for the securam block of the SAMA5D2, which is needed to implement backup mode, a deep suspend-to-RAM state for which we will be pushing patches over the next kernel releases. Alexandre also fixed some bugs in the Atmel dmaengine and USB gadget drivers.
  • For Allwinner platforms
    • Antoine Ténart enabled the 1-wire controller on the CHIP platform.
    • Boris Brezillon fixed an issue in the NAND controller driver that prevented the use of ECC chunks of 512 bytes.
    • Maxime Ripard added support for the CHIP Pro platform from NextThing, together with many feature additions for the underlying SoC, NextThing’s GR8.
    • Maxime Ripard implemented audio capture support in the sun4i-i2s driver, bringing capture support to Allwinner A10 platforms.
    • Maxime Ripard added clock support for the Allwinner A64 to the sunxi-ng clock subsystem, and implemented numerous improvements for this subsystem.
    • Maxime Ripard reworked the pin-muxing driver on Allwinner platforms to use a new generic Device Tree binding, and deprecated the old platform-specific Device Tree binding.
    • Quentin Schulz added a MFD driver for the Allwinner A10/A13/A31 hardware block that provides ADC, touchscreen and thermal sensor functionality.
  • For the RaspberryPi platform
    • Boris Brezillon added support for the Video Encoder IP, which provides composite output. See also our recent blog post about our RaspberryPi work.
    • Boris Brezillon made a number of improvements to clock support on the RaspberryPi, which were needed for the Video Encoder IP support.
  • For the Marvell ARM platform
    • Grégory Clement enabled networking support on the Marvell Armada 3700 SoC, a Cortex-A53 based processor.
    • Grégory Clement did a large number of cleanups in the Device Tree files of Marvell platforms, fixing DTC warnings, and using node labels where possible.
    • Romain Perier contributed a brand new driver for the SPI controller of the Marvell Armada 3700, and therefore enabled SPI support on this platform.
    • Romain Perier extended the existing i2c-pxa driver to support the Marvell Armada 3700 I2C controller, and enabled I2C support on this platform.
    • Romain Perier extended the existing hardware random number generator driver for OMAP to also be usable with the SafeXcel EIP76 from Inside Secure. This allows the driver to be used on the Marvell Armada 7K/8K SoC.
    • Romain Perier contributed support for the Globalscale EspressoBin board, a low-cost development board based on the Marvell Armada 3700.
    • Romain Perier did a number of fixes to the CESA driver, used for the cryptographic engine found on 32-bit Marvell SoCs, such as Armada 370, XP or 38x.
    • Thomas Petazzoni fixed a bug in the mvpp2 network driver, currently only used on Marvell Armada 375, but in the process of being extended to be used on Marvell Armada 7K/8K as well.
  • As the maintainer of the MTD NAND subsystem, Boris Brezillon did a few cleanups in the Tango NAND controller driver, added support for the TC58NVG2S0H NAND chip, and improved the core NAND support to accommodate controllers that have some special timing requirements.
  • As the maintainer of the RTC subsystem, Alexandre Belloni did a number of small cleanups and improvements, especially to the jz4740 RTC driver.

Here is the detailed list of our commits to the 4.10 release:

BOOOOOOOM!: Artist Spotlight: Katherine Tzu-Lan Mann

Paintings by artist Katherine Tzu-Lan Mann. More images below.

Electronics-Lab: Installing The Micronucleus Bootloader To An ATtiny Via Arduino

To make it possible to upload Arduino sketches directly to the ATtiny84 over USB, without needing a dedicated programming device, Shawn Hymel, an electrical engineer at SparkFun Electronics, has published a guide showing how to install the micronucleus bootloader, which supports virtual USB (V-USB), onto an ATtiny84 using an Arduino.

The Atmel AVR ATtiny84 is a tiny $3 8-bit processor with 8K of program space, 12 I/O lines, and an 8-channel 10-bit ADC. It runs at up to 20MHz with an external crystal and can be programmed in-circuit.

To start following the tutorial, you will need these parts:

Micronucleus is a bootloader designed for AVR ATtiny microcontrollers, with a minimal USB interface, a cross-platform libusb-based upload tool, and a strong emphasis on bootloader compactness. It has V-USB built in, so you can send compiled firmware over a virtual USB connection.
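
Once the bootloader is on the chip, day-to-day uploads go through micronucleus’s own command-line tool rather than avrdude. A typical invocation (the firmware file name here is just a placeholder) looks like:

    micronucleus --run firmware.hex

The tool waits for the device to be plugged in, uploads the firmware over V-USB, and, with --run, starts the new program immediately afterwards.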

The process uses an Arduino as a programmer: the ArduinoISP sketch is loaded onto it in order to install the micronucleus bootloader on the ATtiny84. The next steps are enabling USB programming on the ATtiny84 by manually changing its fuses, then creating a board definition for the ATtiny84 and installing any necessary USB drivers.

The hardware components should be connected as shown in the circuit diagram in the tutorial. First, with the reset capacitor out of circuit, connect an FTDI breakout to the Arduino Pro Mini and upload the ArduinoISP firmware.

Before installing micronucleus, a 10μF capacitor is added between the RESET and GND pins of the Arduino. It prevents the Arduino from resetting into its own bootloader, so that it passes the compiled firmware along to the connected ATtiny rather than trying to program itself.

AVRDUDE is then used to change the ATtiny fuses, setting them as follows (an example invocation appears after the list):

  • No clock divider
  • Brown-out detection at 2.7V (not necessary, but useful if running off battery)
  • Self-programming
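
A fuse-burning invocation along these lines should work. Note that the fuse bytes and the serial port below are my own reading of those settings for the ATtiny84, not values quoted from the tutorial, so verify them against the datasheet before burning:

    avrdude -c stk500v1 -P /dev/ttyUSB0 -b 19200 -p t84 \
        -U lfuse:w:0xE2:m -U hfuse:w:0xDD:m -U efuse:w:0xFE:m

Here -c stk500v1 speaks the ArduinoISP protocol at 19200 baud, 0xE2 keeps the internal oscillator with the clock divider disabled, 0xDD sets brown-out detection to 2.7V, and 0xFE programs the self-programming (SELFPRGEN) bit.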

This tutorial should also work with ATtiny85, ATtiny841, and ATtiny167. You can find the detailed steps with a blink example on the main tutorial page.

The post Installing The Micronucleus Bootloader To An ATtiny Via Arduino appeared first on Electronics-Lab.

Explosm.net: Comic for 2017.02.24

New Cyanide and Happiness Comic

LaForge's home page: Manual testing of Linux Kernel GTP module

In May 2016 we got the GTP-U tunnel encapsulation/decapsulation module developed by Pablo Neira, Andreas Schultz and myself merged into the 4.8.0 mainline kernel.

During the second half of 2016, the code basically stayed untouched. In early 2017, several patch series by (at least) three authors were published on the netdev mailing list for review and merge.

This raises the very valid question of how we test those (sometimes quite intrusive) changes. Setting up a complete cellular network with either GPRS/EGPRS or even UMTS/HSPA is possible using OsmoSGSN and related Osmocom components. But it's of course a luxury that not many Linux kernel networking hackers have, as it involves the availability of a supported GSM BTS or UMTS hNodeB. And even if that is available, there's still the issue of having a spectrum license, or a wired setup with coaxial cable.

So as part of the recent discussions on netdev, I tested and described a minimal test setup using libgtpnl, OpenGGSN and sgsnemu.

This setup will start a mobile station + SGSN emulator inside a Linux network namespace, which talks GTP-C to OpenGGSN on the host, as well as GTP-U to the Linux kernel GTP-U implementation.
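
For the curious, the namespace plumbing behind such a setup looks roughly like the iproute2 sketch below. The interface names and addresses are my own illustrative choices, and the sgsnemu invocation is only hinted at, since the wiki page linked below carries the authoritative recipe:

    # namespace that will hold the mobile station + SGSN emulator
    ip netns add sgsn
    ip link add veth-ggsn type veth peer name veth-sgsn
    ip link set veth-sgsn netns sgsn
    ip addr add 172.16.0.1/24 dev veth-ggsn
    ip link set veth-ggsn up
    ip netns exec sgsn ip addr add 172.16.0.2/24 dev veth-sgsn
    ip netns exec sgsn ip link set veth-sgsn up
    ip netns exec sgsn ip link set lo up
    # then start sgsnemu inside the namespace, pointed at OpenGGSN on
    # the host (see the wiki page for the exact sgsnemu options)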

In case you're interested, feel free to check the following wiki page: https://osmocom.org/projects/linux-kernel-gtp-u/wiki/Basic_Testing

This is of course just for manual testing, and for functional (not performance) testing only. It would be great if somebody would pick up on my recent mail containing some suggestions about an automatic regression testing setup for the kernel GTP-U code. I have way too many spare-time projects in desperate need of some attention to work on this myself. And unfortunately, none of the telecom operators (who are the ones benefiting most from a Free Software accelerated GTP-U implementation) seems to be interested in at least co-funding or otherwise contributing to this effort :/

Quiet Earth: Get Into GET OUT [Review]

Jordan Peele is best known as a comedian and one-half of the hilarious duo Key & Peele, but as it turns out, Peele is also a big horror buff, and Get Out, a writing exercise he never figured would get made, is a perfect example of what can happen when an artist is given the opportunity to try something different.


Daniel Kaluuya ("The Fades," "Black Mirror," Sicario) stars as Chris, a talented photographer who is going away for the weekend with his girlfriend Rose (Allison Williams of "Girls" fame) to meet her parents for the first time. He's very black; she's very white, and she hasn't told mom and dad. Rose argues race is a non-issue, and though at first reluctant, Chris goes along with the weekend plans. But rather quickly, even before the pair arrives at the posh family [Continued ...]

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Parabolic

Spring, 2017. Ah, rutting season. The saps are flowing. Molting and moaning everywhere. Twenty-one degrees in the Big Smoke. Warm zephyrs virtually coast-to-coast. House lust is on the rise along with the temps and the collective anxiety of our leaders. And the next two or three months may prove to be critical times for you and your real estate wealth.

First, we’re entering the horniest time of the year in an already-aroused state – at least in the one remaining puddle of froth in the nation. GTA prices are ridiculous, according to the local realtor stats. In real, on-the-street terms, they’re insane. Raymond told me yesterday about his $540,000 house (bought in 2009) which sold last week for $2.1 million. “Ugly, 1980s subdivision thing with a garage stuck on the front,” he says. “I did nothing to it for years.” Typical. Values have suddenly gone parabolic. A house bought 20 months ago in the northern burbs for $1.7 million was just sold for $2.8 million – by people who ended up sitting across from me, giggling.

“Talk about your greater fool,” Luigi said. “We just met him. And his fool wife.”

This is reminiscent of Vancouver exactly a year ago when, in the final stages of the orgiastic eruption of hormonal and speculative excess, the market went vertical before it croaked. Within months, sales of detached homes were plunging, buyers recoiling from fanciful prices and conditions set up for the coup de grace – the Chinese Dudes tax.

Could the pattern be repeating in the GTA (and now, all of southern Ontario), or is it different there? Silly question. It’s never different. People just believe it to be so, until it isn’t.

Supporting this pathetic blog’s oft-trashed position (do not put all your eggs in the RE basket) is Sun Life chief investment officer Sadiq Adatia who this week told the hated BNN network: “If you look at Vancouver, we’ve already seen the bubble burst there. A lot of people think it’s just the foreign players and that tax that came through, but it actually started before the tax actually got implemented. So, what we’re seeing is probably a further extension of that downturn as a result of the foreign tax.”

As for Toronto, same story. Only a matter of time, says Adatia – echoing virtually all major economists. Off a cliff, baby…

“Eventually we are going to see this market kind of stop and then come off a cliff,” he said. “The longer we stay in this run-up, the bigger the downturn is going to be. Toronto for many years had been an under-valued real estate market. The difference is, though, that we’ve moved up so significantly in a short amount of time. And so, what will likely happen is that we’ll start seeing selling pressure down the road, we’ll see sales coming off. Right now demand is still there but demand is slowly coming off.”

What could be the catalyst?

A foreign buyer tax, maybe. While TPTB have so far dismissed the idea (along with the realtors, of course), it keeps bubbling up, with widespread support among Toronto city council members. This is still a definite possibility after the shocking price appreciation of late. Also possible are further actions from Ottawa, driven by Wild Bill Morneau (who lives in a $5 million house just off Bayview), the man who last October tried to slow down the train with a series of reforms, including the Moister Stress Test. Apparently, they didn’t work.

Then we have the Fed to worry about. While another rate hike in a couple of weeks at the March meeting is still a long shot, the odds of central bank action in the US are growing weekly. Count on at least two increases in 2017, and potentially three. Then three more in 2018. And, yes, the Bank of Canada will be unable to resist for too long. There’s no viable economic scenario now being floated in which the cost of money does not rise, for which you can heap thanks on the shoulders of the new Inflation President. His pro-business, pro-growth, trade-restricting policies are guaranteed to increase wages, prices, the US$, markets and rates.

Finally, real estate rutters should also be a tad worried about our own glorious leader. The Trudeau assault on rich people – already manifested in a special tax bracket – has apparently just started. By upping capital gains, diddling with dividends and bringing in a corporation-bashing Doctor Tax, Ottawa is ramming it to the very people who are out there buying $2-5 million houses, and keeping the market juices flowing.

Or, maybe it’s just a subtle combination of some of these factors, combined with the fact nervous bankers have started to pull back credit in a country with $2 trillion in outstanding personal debt, stagnant incomes, and more house porn than the rest of the planet combined.

We won’t know until it’s happened. But I don’t think Luigi cares. There’s no wiping the smile off that dude’s face.

Colossal: Wire Mesh Figures of Children Appear to Dissolve into Thin Air

Norwegian artist Lene Kilde seeks inspiration in the emotions of children, deftly capturing brief moments in their lives distilled into minimalistic wire mesh sculptures. The pieces focus almost entirely on her subjects’ hands and feet, the rest of each figure dissolving into nothingness as the children go about various activities. This is not to suggest anything is inherently missing, but rather to invite the viewer to complete the rest of each sculpture in their mind, perhaps substituting the missing fragments with their own memories or stories.

Kilde completed a master’s degree in product design in 2012 and was subsequently awarded a three-year work scholarship from the Norwegian Arts Council. She is currently represented by Galleri Ramfjord, where you can find more of her figurative sculptures.

Disquiet: Disquiet Junto Project 0269: Duet Portion

Each Thursday in the Disquiet Junto group, a new compositional challenge is set before the group’s members, who then have just over four days to upload a track in response to the assignment. Membership in the Junto is open: just join and participate. A SoundCloud account is helpful but not required. There’s no pressure to do every project. It’s weekly so that you know it’s there, every Thursday through Monday, when you have the time.

This project’s deadline is 11:59pm wherever you are on Monday, February 27, 2017. This project was posted in the late morning, California time, on Thursday, February 23, 2017.

These are the instructions that went out to the group’s email list (at tinyletter.com/disquiet-junto):

Disquiet Junto Project 0269: Duet Portion
Record half of a live duet.

Step 1: This week’s Junto will be the first in an occasional series allowing for asynchronous collaboration. You will be recording something with the understanding that it will be unfinished.

Step 2: The plan is for you to record a short and original piece of music, on any instrumentation of your choice, live, with no post-production edits or overdubbing. You can do as many takes as you’d like, but the final recording should be a document of a wholly live performance. Conceive it as something that leaves room for something else — another instrument, performed by another person — to join in.

Step 3: Record a short piece of music, roughly two to three minutes in length, as described in Step 2. If possible, it would be great if you could make a video of your live performance as well.

Step 4: Also be sure, when complete, to make the track downloadable, because it will be used by someone else in a future Junto project.

Five More Important Steps When Your Track Is Done:

Step 1: If your hosting platform allows for tags, be sure to include the project tag “disquiet0269” (no spaces) in the name of your track. If you’re posting on SoundCloud in particular, this is essential to my locating the tracks and creating a playlist of them.

Step 2: Upload your track. It is helpful but not essential that you use SoundCloud to host your track.

Step 3: In the following discussion thread at llllllll.co please consider posting your track:

http://llllllll.co/t/record-half-a-duet-disquiet-junto-project-0269/6652

Step 4: Annotate your track with a brief explanation of your approach and process.

Step 5: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Deadline: This project’s deadline is 11:59pm wherever you are on Monday, February 27, 2017. This project was posted in the late morning, California time, on Thursday, February 23, 2017.

Length: The length is up to you, though about two to three minutes feels right.

Title/Tag: When posting your track, please include “disquiet0269” in the title of the track, and where applicable (on SoundCloud, for example) as a tag.

Upload: When participating in this project, post one finished track with the project tag, and be sure to include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Download: Please set your track for download and with a license that allows for attributed reworking (i.e., a Creative Commons license permitting non-commercial sharing with attribution).

Linking: When posting the track online, please be sure to include this information:

More on this 269th weekly Disquiet Junto project, “Duet Portion: Record half of a live duet” at:

http://disquiet.com/0269/

More on the Disquiet Junto at:

http://disquiet.com/junto/

Subscribe to project announcements here:

http://tinyletter.com/disquiet-junto/

Project discussion takes place on llllllll.co:

llllllll.co/t/record-half-a-duet-disquiet-junto-project-0269/6652

There’s also a Junto Slack. Send your email address to twitter.com/disquiet for Slack inclusion.

Image associated with this project is by Tony Tsang. It’s used thanks to a Creative Commons license:

flic.kr/p/6565DF

creativecommons.org/licenses/by-nc-sa/2.0/

Colossal: Street Kintsugi: Artist Rachel Sussman Repairs the Roads with Gold

“Study for Sidewalk Kintsukuroi #01 (New Haven, Connecticut),” photograph with enamel paint and metallic dust.

As part of an ongoing series titled Sidewalk Kintsukuroi, artist Rachel Sussman (previously) brings the Japanese art of kintsugi to the streets. We’ve long been enamored of the ancient technique, which traditionally involves fixing broken pottery with a lacquer dusted or mixed with powdered gold, resulting in a repair that pays homage to the object’s history. In the same way, Sussman’s kintsugi series highlights the history under our feet, bringing attention to the imperceptible changes that take place over time in the world around us. Even these repairs are impermanent, though, and will eventually be lost to wear and tear.

Several photos from Sidewalk Kintsukuroi are currently on view as part of the Alchemy: Transformations in Gold exhibition at the Des Moines Art Center through May 5, 2017. (via Hyperallergic)

“Study for Sidewalk Kintsukuroi #09 (SoHo, New York),” photograph with enamel paint and metallic dust.

“Study for Sidewalk Kintsukuroi #02 (MASS MoCA),” photograph with enamel paint and metallic dust.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Swearing



Click here to go see the bonus panel!

Hovertext:
It's interesting how caveman can speak a language that won't exist for 50,000 years, but they still have trouble with article usage.

New comic!
Today's News:

Turbo-geeks of London! BAHFest London tickets are about 40% sold. We usually sell 70% of tickets in the last week, so this one will definitely sell out. Book now, or dwell in sorrow.

The Shape of Code: NWIP for Monochrome inkjet yield

As a member of IST/5, the British Standards’ programming language committee, I receive a daily notification of relevant documents that have arrived at BSI. The email arrives just before midnight and contains a generous helping of acronyms, such as: N13344 SC 28 ISO-IECJTC1-SC28 N2051 NWIP for Monochrome inkjet yield.

The line break on the above line resulted in “Monochrome inkjet yield” appearing at the start of a line and it caught my attention, so I downloaded the document.

SC28 is the ISO committee for office equipment and this NWIP (New Work Item Proposal) is for WG2 (the Working Group responsible for consumables) to create a new ISO Standard with the title: “Method for the Determination of Ink Cartridge Yield for Monochrome Inkjet Printers and Multifunction Devices that Contain Printer Components”. Voting, on whether or not work should start on this proposal, closes on July 12.

Why was information about inkjet yield sent to a programming language list? Are SC28/WG2 having a membership drive, and have they been tipped off that our workload is declining? More importantly, are they following the C++ model of having regular meetings in Hawaii? The paperwork does not say. The standard for color inkjet printers appeared in 2009; was the production of this document such a traumatic event that it decimated committee membership, and has it taken eight years to put together a skeleton group?

Attached to the proposal is a 20-page draft document; somebody has been busy.

So how is it proposed that monochrome inkjet yield be calculated? You need at least nine inkjet cartridges, three printers and a room at a temperature of 23 degrees (plus/minus 2 degrees, with readings taken every 15 minutes and an hourly running average calculated; “… temperature can have a profound effect on test results.”). Load “… a common medium weight paper and must conform to the printer’s list of approved papers.” into the three printers that have been “… temperature acclimated to the test room environment.” and count the number of pages printed by each printer (using at least three cartridges in each printer) before “…an end of life judgement.” Divide total number of pages printed by total number of cartridges used and there you go.

End of life? “The cartridge yield is determined by an end of life judgement, or signalled with either of two phenomena: fade, caused by depletion of ink in the cartridge or automatic printing stop caused by an Ink Out detection function.”

What is fade?
“3.1 Fade
A phenomenon where a significant reduction in uniformity occurs due to ink depletion.
NOTE In this test, fade is defined as a noticeably lighter, 3 mm or greater, gap located in the text, in the bar chart, or in the boxes around the periphery of the test page. The determination of the change in lightness is to be made referenced to the 25th page printed for each cartridge in testing. For examples of fade, please consult Annex A.”

And Annex A?
“Examples of Fade <future edit: add picture>”

Formulas for calculating the standard deviation and a 90% confidence interval are given (the 90% confidence interval formula assumes a Normal distribution; I would have thought that the distribution of pages printed by a cartridge might be skewed, and that a bootstrap procedure would be more reliable).
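
To make the arithmetic concrete, here is a minimal Python sketch of the calculation as I read the draft: mean pages per cartridge, sample standard deviation, the 90% interval under the Normal assumption, and the bootstrap alternative I would prefer. The page counts are invented, and the draft may define its interval slightly differently:

    import random
    import statistics

    # pages printed per cartridge before the end-of-life judgement
    # (invented data; the draft requires at least nine cartridges
    # spread across three printers)
    pages = [620, 585, 570, 640, 600, 615, 595, 630, 575]

    n = len(pages)
    yield_mean = statistics.mean(pages)  # total pages / total cartridges
    sd = statistics.stdev(pages)         # sample standard deviation

    # 90% confidence interval assuming a Normal distribution (z = 1.645)
    half = 1.645 * sd / n ** 0.5
    print(f"yield {yield_mean:.1f}, 90% CI ({yield_mean - half:.1f}, {yield_mean + half:.1f})")

    # bootstrap alternative: resample with replacement and take the
    # 5th and 95th percentiles of the resampled means
    means = sorted(statistics.mean(random.choices(pages, k=n)) for _ in range(10000))
    print(f"bootstrap 90% CI ({means[500]:.1f}, {means[9500]:.1f})")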

It is daylight now and my interest in inkjet yield is satiated. But if you, dear reader, have a longing for more, then Ms. Michelle Pangborn (Hewlett-Packard), USA or Mr. Nobuaki Hamada (Epson), Japan are the people to contact.

Some printer test pages to add to your link collection.

OCaml Planet: Moving from ocaml.io to ocamllabs.io

We are pleased to announce that the new and improved OCaml Labs website is here!

This wiki will remain active while we transition our content to ocamllabs.io, but it will be retired eventually. The new site will have all the recent news and exciting developments from OCaml Labs together with links to related projects and people, so it will be easier than ever to keep up to date with everything we are doing.

We hope you enjoy the new site!

New Humanist Blog: "Without the evolution of locomotion there would be no sex, no photosynthesis, no ecology"

Q&A with evolutionary biologist Matt Wilkinson.

The Universe of Discourse: Miscellaneous notes on anagram scoring

My article on finding the best anagram in English was well-received, and I got a number of interesting comments about it.

  • A couple of people pointed out that this does nothing to address the issue of multiple-word anagrams. For example it will not discover “I, rearrangement servant / Internet anagram server”. True, that is a different problem entirely.

  • Markian Gooley informed me that “megachiropteran / cinematographer” has been long known to Scrabble players, and Ben Zimmer pointed out that A. Ross Eckler, unimpressed by “cholecystoduodenostomy / duodenocholecystostomy”, proposed a method almost identical to mine for scoring anagrams in an article in Word Ways in 1976. M. Eckler also mentioned that the “remarkable” “megachiropteran / cinematographer” had been published in 1927 and that “enumeration / mountaineer” (which I also selected as a good example) appeared in the Saturday Evening Post in 1879!

  • The Hacker News comments were unusually pleasant and interesting. Several people asked “why didn't you just use the Levenshtein distance”? I don't remember that it ever occurred to me, but if it had I would have rejected it right away as being obviously the wrong thing. Remember that my original chunking idea was motivated by the observation that “cholecystoduodenostomy / duodenocholecystostomy” was long but of low quality. Levenshtein distance measures how far every letter has to travel to get to its new place and it seems clear that this would give “cholecystoduodenostomy / duodenocholecystostomy” a high score because most of the letters move a long way.

    Hacker News user tyingq tried it anyway, and reported that it produced a poor outcome. The top-scoring pair by Levenshtein distance is “anatomicophysiologic physiologicoanatomic”, which under the chunking method gets a score of 3. Repeat offender “cholecystoduodenostomy / duodenocholecystostomy” only drops to fourth place.

    A better idea seems to be Levenshtein score per unit of length, suggested by lobste.rs user cooler_ranch.
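
    Here is a minimal Python sketch of that normalized metric, assuming a plain dynamic-programming edit distance; this is my own illustration rather than tyingq’s or cooler_ranch’s actual code:

        def levenshtein(a, b):
            # classic dynamic-programming edit distance
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1,                  # deletion
                                   cur[j - 1] + 1,               # insertion
                                   prev[j - 1] + (ca != cb)))    # substitution
                prev = cur
            return prev[-1]

        def anagram_score(a, b):
            # edit distance per unit of length, so long pairs are not
            # rewarded for sheer length alone
            return levenshtein(a, b) / len(a)

        print(anagram_score("cholecystoduodenostomy", "duodenocholecystostomy"))
        print(anagram_score("megachiropteran", "cinematographer"))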

  • A couple of people complained about my “notaries / senorita” example, rightly observing that “senorita” is properly spelled “señorita”. This bothered me also while I was writing the article. I eventually decided that although “notaries” and “señorita” are certainly not anagrams in Spanish (even supposing that “notaries” was a Spanish word, which it isn't), the spelling of “senorita” without the tilde is a correct alternative in English. (Although I found out later that both the Big Dictionary and American Heritage seem to require the tilde.)

    Hacker News user ggambetta observed that while ‘é’ and ‘e’, and ‘ó’ and ‘o’ feel interchangeable in Spanish, ‘ñ’ and ‘n’ do not. I think this is right. The ‘é’ is an ‘e’, but with a mark on it to show you where the stress is in the word. An ‘ñ’ is not like this. It was originally an abbreviation for ‘nn’, introduced in the 18th century. So I thought it might make sense to allow ‘ñ’ to be exchanged for ‘nn’, at least in some cases.

    (An analogous situation in German, which may be more familiar, is that it might be reasonable to treat ‘ö’ and ‘ü’ as if they were ‘oe’ and ‘ue’. Also note that in former times, “w” and “uu” were considered interchangeable in English anagrams.)

    Unfortunately my Spanish dictionary is small (7,000 words) and of poor quality and I did not find any anagrams of “señorita”. I wish I had something better for you. Also, “señorita” is not one of the cases where it is appropriate to replace “ñ” with “nn”, since it was never spelled “sennorita”.

    I wonder why sometimes this sort of complaint seems to me like useless nitpicking, and other times it seems like a serious problem worthy of serious consideration. I will try to think about this.

  • Mike Morton, who goes by the anagrammatic nickname of “Mr. Machine Tool”, referred me to his Higgledy-piggledy about megachiropteran / cinematographer, which is worth reading.

  • Regarding the maximal independent set algorithm I described yesterday, Shreevatsa R. suggested that it might be conceptually simpler to find the maximal clique in the complement graph. I'm not sure this helps, because the complement graph has a lot more edges than the original. Below right is the complement graph for “acrididae / cidaridae”. I don't think I can pick out the 4-cliques in that graph any more than the independent sets in the graph on the lower-left, and this is an unusually favorable example case for the clique version, because the original graph has an unusually large number of edges.

    But perhaps the cliques might be easier to see if you know what to look for: in the right-hand diagram the four nodes on the left are one clique, and the four on the right are the other, whereas in the left-hand diagram the two independent sets are all mixed together.
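
    For anyone who wants to poke at the equivalence itself, here is a toy networkx sketch; the conflict graph is a small example of my own, not the actual “acrididae / cidaridae” graph:

        import networkx as nx

        # a small conflict graph: nodes stand for candidate chunk mappings,
        # edges join mappings that are mutually incompatible
        G = nx.Graph([(1, 2), (1, 3), (2, 4), (3, 4), (5, 6)])

        # maximal independent sets of G are exactly the maximal cliques
        # of its complement graph
        H = nx.complement(G)
        for clique in nx.find_cliques(H):
            print(sorted(clique))  # each line is a maximal independent set of G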

  • An earlier version of the original article mentioned the putative 11-pointer “endometritria / intermediator”. The word “endometritria” seemed pretty strange, and I did look into it before I published the article, but not carefully enough. When Philip Cohen wrote to me to question it, I investigated more carefully, and discovered that it had been an error in an early WordNet release, corrected (to “endometria”) in version 1.6. I didn't remember that I had used WordNet's word lists, but I am not surprised to discover that I did.

    A rare printing of Webster's 2¾th American International Lexican includes the word “endometritriostomoscopiotomous” but I suspect that it may be a misprint.

  • Philippe Bruhat wrote to inform me of Alain Chevrier’s book notes / sténo, a collection of thematically related anagrams in French. The full text is available online.

  • Alexandre Muñiz, who has a really delightful blog, and who makes and sells attractive and clever puzzles of his own invention, pointed out that soapstone teaspoons are available. The perfect gift for the anagram-lover in your life! They are not even expensive.

  • Thanks also to Clinton Weir, Simon Tatham, Jon Reeves, Wei-Hwa Huang, and Philip Cohen for their emails about this.

Electronics-Lab: PIC16F15386, A New PIC Family Announced By Microchip

Microchip, the well-known manufacturer of microcontrollers and semiconductors, announced this week a new family of 8-bit PIC microcontrollers, the ‘PIC16F15386’.

The new PIC16F15386 family features an 8 MIPS CPU, with 2KB of RAM and up to 28KB of flash memory, offered in 8- to 48-pin packages. It also has dual UART, dual SPI and dual I²C interfaces, one 8-bit timer and two 16-bit timers.

PIC16F15386 Features

  • Enhanced Mid-range Core with 49 Instructions, 16 Stack Levels
  • Flash Program Memory with self read/write capability
  • eXtreme Low Power (XLP)
  • IDLE and DOZE low power modes
  • Peripheral Module Disable (PMD)
  • Peripheral Pin Select (PPS)
  • 4x 10-bit PWMs
  • 2x Capture, Compare, PWM (CCP)
  • Complementary Waveform Generator (CWG)
  • Numerically Controlled Oscillator (NCO)
  • 4x Configurable Logic Controller (CLC)
  • 43 Channels 10-bit ADC with Voltage Reference
  • 5-bit Digital to Analog Converter (DAC)
  • 2x Comparators
  • 1x 8-bit Timer (TMR0/TMR2)
  • 2x 16-bit Timers (TMR1)
  • Window Watchdog Timer (WWDT)
  • Enhanced Power-On/Off-Reset
  • Low-Power Brown-Out Reset (LPBOR)
  • Programmable Brown-Out Reset (BOR)
  • In Circuit Serial Programming (ICSP)
  • PIC16LF15386 (1.8V – 3.6V)
  • PIC16F15386 (2.3V – 5.5V)

The PIC16F15386 family comes with essential peripherals like intelligent analog, Core Independent Peripherals (CIPs) and communications, combined with eXtreme Low-Power (XLP) technology for a wide range of low-power applications. The family features PWMs, multiple communication interfaces, a temperature sensor, and memory features like Memory Access Partition (MAP) and Device Information Area (DIA).

“We’ve always offered a diverse portfolio of products with large market appeal,” said Steve Drehobl, vice president of Microchip’s 8-bit MCU division. “With the combination of the most requested features and peripherals by our large base of PIC MCU users, the flexibility in memory size and package options and the availability of MPLAB Xpress with MCC, we expect the PIC16F15386 family to be popular with experienced and first-time PIC MCU designers.”

The PIC16F15386 is also compatible with the MPLAB Xpress IDE and the MPLAB Code Configurator, a graphical programming environment. The family includes 13 unique products that are offered in various package options including PDIP, SOIC, DFN, UDFN, UQFN and SSOP.

All products are available now for sampling and in volume production. Volume pricing starts at $0.33 for the product family.

The post PIC16F15386, A New PIC Family Announced By Microchip appeared first on Electronics-Lab.

Electronics-Lab: gen4 3.2”, The New Intelligent Display Modules

4D Systems, the manufacturer of intelligent graphics solutions, has announced a new 3.2” smart display module as part of the ‘gen4’ series, which has been designed specifically for ease of integration and use, with careful consideration for space requirements and functionality.

These modules feature a 3.2” color TFT display, with options for a Cover Lens Bezel (CLB), Resistive Touch and Capacitive Touch. The display is capable of touch detection, microSD memory storage, GPIO and communications, along with multiple millisecond-resolution timers and audio generation. gen4 modules have a 30-pin ZIF socket for a 30-pin FPC cable, for easy and simple connection to an application or a motherboard.

The gen4 display modules are powered by the 4D Systems Diablo16 graphics processor that offers an array of functionality and options for any Designer / Integrator / User. Diablo16 is a custom embedded 4DGL graphics controller designed to interface with many popular OLED and LCD display panels.

gen4 display modules features:

  • Powerful 3.2” Intelligent LCD-TFT display module powered by DIABLO16.
  • 240 x 320 Resolution, RGB 65K true to life colours, TFT Screen with integrated 4-wire Resistive Touch Panel (on DT model only).
  • 6 banks of 32750 bytes of Flash memory for User Application Code and Data.
  • 32Kb of SRAM purely for the User.
  • 16 General Purpose I/O pins for user interfacing, which include 4 configurable Analog Inputs.
  • The GPIO is variously configurable for alternative functions such as:
    • 3x I2C channels available.
    • 1x SPI dedicated for SD Card and 3x configurable SPI channels available.
    • 1x dedicated and 3x configurable TTL Serial comm ports available.
    • Up to 6 GPIO can be used as Pin Counters.
    • Up to 6 GPIO for PWM (simple and Servo).
    • Up to 10 GPIO for Pulse Output.
    • Up to 14 GPIO can be configured for Quadrature Encoder Inputs (2 channels).
  • 30pin FPC connection, for all signals, power, communications, GPIO and programming.
  • On-board latch type micro-SD memory card connector for multimedia storage and data logging purposes.
  • DOS compatible file access (FAT16 format) as well as low level access to card memory.
  • Dedicated PWM Audio pin driven by WAV files from micro-SD card, and for sound generation, for an external amplifier.
  • Display full colour images, animations, icons and video clips.
  • Supports all available Windows fonts.
  • 4.0V to 5.5V range operation (single supply).
  • Module dimensions:
    • (D): 95.7 x 57.1 x 6.3mm.
    • (D-CLB): 98.8 x 72.6 x 7.4mm.
    • (DT): 95.7 x 57.1 x 7.5mm.
    • (DCT-CLB): 98.8 x 72.6 x 8.3mm.
  • 4x mounting tabs with 3.2mm holes for mechanical mounting using M3 screws.
  • RoHS and REACH compliant.
  • CE Compliant – please ask for CE declarations from our Support Team.

The intelligent gen4 displays can be programmed via the Workshop4 IDE. It provides an integrated software development platform for all of the 4D family of processors and modules. The IDE combines the Editor, Compiler, Linker and Downloader to develop complete 4DGL application code.

gen4 modules are available in 4 models:

  • gen4-uLCD-32D (non Touch, without Cover Lens Bezel)
  • gen4-uLCD-32DT (Resistive Touch, without Cover Lens Bezel)
  • gen4-uLCD-32D-CLB (non Touch, Cover Lens Bezel)
  • gen4-uLCD-32DCT-CLB (Capacitive Touch, with Cover Lens Bezel)

The module is available on the official website at prices ranging from $55 to $79, including an interface board, a 150mm FFC cable, and a quick start guide. Starter kits are also available from $75 to $99.

The post gen4 3.2”, The New Intelligent Display Modules appeared first on Electronics-Lab.

Disquiet: No Insects Were Harmed

Clara Iannotta’s “Dead Wasps in the Jam-Jar (ii)” (2016) intrigues with its title’s promise of quotidian decay and, perhaps, with a bit of telegraphed moralizing about the price paid for sweetness. The suspense builds even before you hit play, thanks to its list of components: “for string orchestra, objects, and sine waves.” Now technically, virtually all music contains sine waves, since those are a major component of sound, but clearly the sine waves heard here are of the electronically generated variety. As for the objects, the brush held by the composer in the accompanying photo provides a hint at the untraditional instruments. What unfolds as “Dead Wasps in the Jam-Jar (ii)” proceeds is a study in controlled energies. In programmatic terms, the wasps seem to meet their fate as the four-minute mark arrives, a sharp swirling hitting hard, and more loudly than anything that preceded it. Then warping torques and sudden jitters evidence struggle before the piece settles into an extended if anxious stillness. That final period, from about six minutes until the end, at eleven and a half minutes, is where concentrated listening is especially rewarded, thanks to Iannotta’s expert mix of textures, of held strings and fluttering percussion.

Track originally posted at soundcloud.com/claraiannotta. More from Iannotta, who is from Italy and is based in both Berlin, Germany, and Boston, Massachusetts, at claraiannotta.com. She is currently working on compositions for Duo 2KW and Arditti Quartet, among other ensembles.

Penny Arcade: News Post: Update On That Stolen PS4 in New Zealand

Tycho: A torrent of generosity from the Internet washed the whole situation away, essentially - but there was an opportunity to make a lasting improvement to their Play Therapy department by supporting the playrooms and stocking a new teen room they’d planned. Travis over at Child’s Play did a write-up that includes the overwhelmed response from the hospital; you can find it all here. (CW)TB

Ideas from CBC Radio (Highlights): Downloading Decision: Could machines make better decisions for us?

Humans like to let others make decisions for them. But what happens when those decisions are made by machines or artificial intelligence? Can we trust them to make the right choices?

Penny Arcade: News Post: Return Of The Jeed

Tycho: I like Star Wars, not as much as some but I do like it; there’s always new words in there for me to learn.  “The Last Jedi,” as a piece of nomenclature in English, has a lot of potential weight and finality to it because of how “the” works.  “The” abuts singular or plural nouns, and Jedi can refer to one or more completely mad star wizards.  It is ambiguity adorned with uncertain, enigmatic fixin’s; you’re getting it coming and going. When versions of the movie’s title came out translated into other languages,…

IEEE Job Site RSS jobs: Postdoctoral Fellowship Available in Electrical and Computer Engineering

Montreal, Quebec, Canada Concordia University Wed, 22 Feb 2017 19:16:26 -0800

Greater Fool – Authored by Garth Turner – The Troubled Future of Real Estate: Squandered

Now that tax-free accounts have room for $52,000 – and a couple can shelter $104,000 – they deserve your respect. TFSAs are not glorified savings accounts. They’re not for dicking around with a few loser stocks your BIL pumped. They have nothing to do with your next vacation. They should form the cornerstone of a decades-long financial plan. So if you can find a hundred bucks a week, do the right thing.

But TFSAs are also flexible, malleable, twisty vehicles that can be bent to serve many purposes. Blog dog John has an example.

“Long time reader!   My question is regarding my parents, recently retired, no savings to speak of… CPP/OAS/GIS… some calculations still being finalized… I am helping them out financially.

Assuming I have a good relationship with them, does it make sense to put $100K from checking into their own TFSA – with the trust and understanding that the money eventually gets back to me? I know you cannot comment on the trust factor, but what about the idea? Tiny risk, assuming they do not go crazy and burn all the money somehow, but otherwise is it a sound strategy? Would CRA complain because how can you be on GIS if you have TFSA?  Thanks, John.”

First, John, learn from thine parents. You can be young and poor and happy. Nobody can be old, broke and content. If your folks own real estate, it should be sold and the money invested for cash flow. If recently retired, these people are likely in their 60s, with twenty or thirty years left to finance. Your hundred grand won’t do much to help them in the long run. It’s always a sad thing when, after six decades on this earth, a couple cannot look after themselves. Learn, John. Prepare.

As for the TFSA strategy, it works. You can gift parents (or a spouse, or adult children) money to stuff into a tax-free account. There it can generate income for your folks which will not impact their OAS pogey, since it isn’t counted as taxable income. Win.

And unlike RRSP contributions, which hit a wall at age 71, money can go into TFSAs every year they are alive. In fact, for most wrinklies, it makes great sense to transfer assets owned in a non-registered account annually into the TFSA, where taxless growth can occur and income flow out. And no need to ever convert into a RRIF. (In fact, all retirees should be taking the mandatory RRIF payments and plunking them inside a TFSA to mitigate the tax bite.)

But there is a wrinkle. So long as your folks keep the assets you financed inside the TFSA, things are cool. If they withdraw the money, any future growth or income will be attributed back to you (assuming the CRA knows where the cash came from).

The big question: do you trust your parents? If not (and recall they’ve saved nothing to date, and apparently lack financial discipline) then set this up as a POA account (power of attorney), and give them an allowance. Tough love. Badly required.

And speaking of TFSAs, there are almost 12 million of the little suckers opened in Canada, still with 80% of the assets idling in interest-bearing assets like HISAs (the jumbo shrimp of the investing world) plus comatose GICs. Most people with these accounts think of them as a place to temporarily store shoe money or save for new hardwood. And when T2 sliced the contribution limit in half, the nation let out a giant collective yawn. Most folks have absolutely no idea what money machines these can become, and a major reason is the banks. Like Pete learned.

“Garth – I’ve taken a ton of your advice over the years. I’ve had an iTrade account since Scotia purchased Etrade so many years ago. For the first time I was cornered by TNL@TB when moving some assets back into Scotia and opening another investment account.

“Long story short – I’ve had a TFSA sitting with cash in it with them for a while. So TNL@TB got all smiles and cheeks telling me I should look at a laddered GIC with them.  Little did I know the bank would set you up with a laddered system – NEVER promising a return but instead talking about 33% re-investment plans “COMPOUNDING” returns. TNL@TB – “very common practice these days among TFSA account holders, Mr. Smith”

“I did the backstroke out of there Olympic style, however felt the need to share the sell tactic with you.  I can see how people end up in these dead end situations with investments.  Cheers and bless you for changing my views on investing.  You are a man among boys and a dog lover so you win. Blog is amazing and….Hazel also thanks you.  (2 year old rescue Dober – Sharpei mix).”

For the record, a laddered GIC is just that – a string of interest-bearing deposits strung together with modestly appreciating rates where the interest earned on one is dumped into the principal for the next. Hence the ‘compounding.’ Money is divided into equal hunks, with each put into GICs with maturity dates ranging from one to five years – that means each year a piece of your dough matures and becomes available as cash, to be dumped into a new 5-year GIC. The idea is to earn more than you would with a one-year deposit, but not to lock all the money up for half a decade.

And what do GICs pay these days at the bank? A one-year deposit yields 1.35%, and the five-year model returns 2.01%. The inflation rate is about 1.6%, which means just about every stage of a laddered GIC is guaranteed to lose you money, even inside a TFSA where you’re shielded from paying tax on that asset. So a GIC held outside a tax-free account may be a sign you have recently died. Have a trusted friend check.
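To see how thin the margin is, here is a quick back-of-the-envelope check in C. The 1-year and 5-year rates are the ones quoted above; the middle rungs are linearly interpolated, which is my assumption, not a quoted bank rate.

#include <stdio.h>

int main(void) {
    /* 1- and 5-year GIC rates from the post; middle rungs linearly
       interpolated (my assumption, not quoted bank rates). */
    double rates[5] = {0.0135, 0.0152, 0.0168, 0.0185, 0.0201};
    double inflation = 0.016;               /* roughly current CPI */

    for (int term = 1; term <= 5; term++) {
        double real = rates[term - 1] - inflation;
        printf("%d-year rung: %.2f%% nominal, %+.2f%% real\n",
               term, rates[term - 1] * 100.0, real * 100.0);
    }
    return 0;
}

Only the longest rungs clear 1.6% inflation, and only barely, which is exactly the point.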

The point of today’s post?

Simple. This vehicle is flexible and effective, yet squandered. If you max it, invest for growth, and do so for 25 years, you’ll have about $325,000. If you buy GICs, then empty it and buy a floor, you’ll have, well, a floor.
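That $325,000 figure checks out under plausible inputs. Assuming a $5,500 annual contribution (the current limit) and a 6.5% average annual return, both my assumptions rather than numbers from the post, the future value is 5,500 x ((1.065^25 - 1) / 0.065), or roughly $324,000. A minimal sketch in C:

#include <stdio.h>

int main(void) {
    /* Assumptions (mine, not the post's): $5,500 contributed at each
       year-end, 6.5% average annual growth, 25 years. */
    double balance = 0.0;
    for (int year = 1; year <= 25; year++)
        balance = balance * 1.065 + 5500.0;
    printf("After 25 years: $%.0f\n", balance);   /* about $324,000 */
    return 0;
}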

new shelton wet/dry: In the stillness of remembering what you had, and what you lost

Heterosexual men were most likely to say they usually-always orgasmed when sexually intimate (95%), followed by gay men (89%), bisexual men (88%), lesbian women (86%), bisexual women (66%), and heterosexual women (65%). Compared to women who orgasmed less frequently, women who orgasmed more frequently were more likely to: receive more oral sex, have longer [...]

new shelton wet/dry: Every day, the same, again

Women are getting freckles tattooed on their faces
There are a thousand ways to buy weed in New York City, but the Green Angels devised a novel strategy for standing out: They hired models to be their dealers.
Man wins back girlfriend’s love after she forgets him due to amnesia
Wikipedia is teaming up with Google to work [...]

Perlsphere: Maintaining Perl 5: Grant Report for January 2017

This is a monthly report by Tony Cook on his grant under the Perl 5 Core Maintenance Fund. We thank the TPF sponsors who make this grant possible.

Approximately 38 tickets were reviewed, and 8 patches were applied.

[Hours]         [Activity]
 13.82          #122490 (sec) more merge conflicts
                #122490 (sec) more merge conflicts, track down warning
                sources
                #122490 (sec) track down warning sources, start merging
                test changes
                #122490 (sec) more test merging, testing, debugging
                #122490 (sec) debugging
                #122490 (sec) debugging
  0.97          #126228 build testing, apply to blead
 16.08          #127663 testing, apply hash seed env suppression patch,
                back to in-place changes
                #127663 work on chdir test, testing, debugging, make
                mg_freeext() API and fix docs
                #127663 cleanup, threads handling, threads test
                #127663 more threads testing, try to make it fail with
                fork
                #127663 more try to make it fail with fork and succeed,
                work on fix, code polish
                #127663 hoist some work back up, testing
                #127663 uncombine thread/fork child handling which I
                combined by accident, work on more tests and find a couple
                of cleanup issues
                #127663 more tests
                #127663 post patch to ticket
  0.22          #128528 (sec) review and comment
  0.88          #128998 track down when it was fixed, ticket management
  0.30          #129012 make public, comment and close
  1.88          #129024 review, make public, check fix backports to 5.24,
                non-trivial backport to 5.22, comment
  1.30          #129125 check, testing, apply to blead
  1.65          #129149 apply patch, test #130557 case, testing, make
                public apply to blead, comment on #130557
  0.08          #129187 check and merge into #129149
  0.95          #129190 rebase with some conflicts, testing, make public,
                apply to blead
  0.17          #129199 make public, comment and close
  2.62          #129274 (sec) try to find an alternative attack
                #129274 more trying to break it, write regression test,
                testing, make public, apply to blead
  2.12          #129292 review code, debugging, make public and comment
  1.77          #129298 review patches, research, consider whether changes
                improve perl
                #129298 more consideration, ask khw
  4.32          #129340 (sec) review code, think about solutions
                #129340 (sec) work on a solution, testing
                #129340 (sec) write a regression test, testing
                #129340 (sec) suggested changes, testing
                #129340 (sec) research, comment with updated patch
  0.50          #129342 (sec) test provided patch, create a test case and
                comment
  0.45          #129377 (sec) review patch, look for similar issues,
                comment
  1.32          #129840 (sec) review, testing
                #129840 get it to test, merge into 129377
  0.40          #129848 review and make public
  1.53          #129861 (sec) debugging
  0.42          #129887 (sec) review and comment
  0.82          #129963 research, make public and link to stack-not-
                refcounted meta ticket
  0.92          #129975 debugging, make public and link to stack-not-
                refcounted meta ticket
  0.28          #130100 make public and point at list discussion on
                removal
  0.73          #130256 debugging, make public and link to stack-not-
                refcounted meta ticket
  1.67          #130262 apply patch with noise, test #130558 case,
                testing, make public, push to blead, comment on #130558
  0.18          #130321 (sec) debugging
  0.68          #130504 review, testing, apply to blead
  0.43          #130560 comment
  0.90          #130567 reproduce, suspect 94749a5ed was bad, ask khw on
                #p5p
                #130567 irc discussion
  1.35          #130569 (sec) comment
  2.85          #130578 debugging
                #130578 debugging, comment
  0.58          #130591 review discussion and comment
  0.33          #130614 research and comment
  1.57          #130635 review changes, check memory use, testing, comment
                #130635 comment
  1.55          #130675 debugging, #p5p discussion
                #130675 debugging, #p5p comment, ticket comment on #130679
  0.42          comment on deprecations thread
======
 69.01 hours total

Planet Lisp: Nicolas Hafner: Portacle - Adventures in Cross-Platform Deployment - Confession 72

As announced in my previous post, I've decided to do a write-up that illustrates my adventures in developing Portacle, a cross-platform, portable, installer-less, self-contained Common Lisp development environment.

The reason why I started with Portacle in the first place is that, some day, probably in the distant future, I want to write a long series of long articles that should eventually accumulate into a book that introduces complete novices and interested people to programming in Common Lisp. As a prerequisite for that, I deem it absolutely necessary that the readers have an easy way to try out examples. Forcing them to perform a strange setup process that has absolutely nothing to do with the rest of the book and is likely to be error-prone in multiple ways is completely out of the question to me. They should be able to download an archive, extract it, and then just click on an icon to get a fully-featured IDE with which they can start out and play around.

So, right off the bat this sets a few constraints that are rather hard to deal with.

  • It needs to be cross-platform. I can't constrain the users to a certain operating system. Requiring them to set up a virtual machine would be madness.
  • It needs to be cross-distribution. I can't know which distribution of Linux, or which version of it, users are going to have. Requiring a specific one or a specific version would, too, be madness.
  • It needs to be self-contained and should not poison the rest of the environment unless necessary or intended. This is necessary for it to be truly portable and work from a single folder.
  • It needs to be reproducible and upgradable in order to be future-proof. This means I cannot simply assemble a package by hand once and call it a day once it works. It needs to be automatically buildable.
  • It needs to be able to run on all platforms simultaneously as a single distribution. Otherwise, you would not be able to put it onto a USB stick and use it from any machine.
  • It needs to be fully-featured enough to provide for both quick experiments and larger projects. As such, it will need to package a few different components and make them all work alongside, with the above constraints on top.

Since I want the eventual article series to be for absolute beginners as well, I also had some serious concerns about the usability aspects. It shouldn't be too confusing or hindering to use. Despite this, I still settled for Emacs+SLIME for the IDE, simply because there isn't really any alternative out there that is free and supports Lisp to a sufficient degree. I'm still not entirely set on the way the Emacs customisation is currently laid out, though. I might have to make more or less severe changes to make it easier for people to get started with. But, that's something for another time.

Now, considering the last constraint listed above, I decided on the following packages to be included in Portacle:

  • Emacs, for the primary editor.
  • Emacs customisation files, since the default is horrid.
  • SBCL, for a capable Lisp implementation.
  • ASDF, for building and system management.
  • Quicklisp, for package management and library access.
  • Git, for version control and project sharing.

ASDF was included specifically so that I would not have to rely on the implementation updating its internally shipped version and could instead deliver one that is possibly ahead of it, and thus less buggy/more feature-rich.

The first goal was to figure out how to build everything on OS X, Linux, and Windows. This alone took its fair share of experimentation and scouring as I had to look for the right combination of feature flags and build flags to make things work minimally.

In order to make this process work, I've developed a series of bash scripts that take care of downloading the source files of the various parts, configuring them, building them, and installing them in the proper locations in the resulting Portacle distribution tree. I settled on the following layout:

portacle      -- Base Portacle directory
  build       -- Build scripts and artefacts
    $package  -- Build artefacts for the package
  config      -- Configuration files and user files
  projects    -- User projects directory
  all         -- Cross-platform packages and resources
    emacsd    -- Emacs configuration package
    quicklisp -- Quicklisp installation
  $platform   -- Platform-specific packages and resources
    $package  -- Installed package files
    lib       -- Shared library files
    bin       -- Shared executable files

The layout was different at first: the $platform and $package directories were flipped around in the hierarchy. That proved to be less than ideal, however. I won't go into the details as to why; suffice it to say that complications that cropped up along the way ended up poisoning the hierarchy. Having a platform directory for all the platform-specific files means you can create a Portacle distribution that works on all platforms simultaneously, too.

Now, in order to properly illustrate the problems that cropped up, I'm going to talk about the development process for each platform separately. Pretty much every one of them had unique problems and complications that led to a lot of wasted time and questions about the universe.

Linux

Linux was the first platform I started developing for, and ironically enough the last one I finished for good. The first phase of just getting everything built was comparatively painless. Things just worked.

However, as soon as I tried to run a complete installation on another Linux setup, things just segfaulted left and right. No wonder, too. After all, every component has shared library dependencies. Those are going to have different versions, and possibly even different names on different distributions. The first idea to deal with this mess was to simply copy every shared library the applications touched to a directory and set LD_LIBRARY_PATH in a wrapper script before the respective application was launched.

In order to do this, I extended the build scripts with tools that would statically analyse all the executables that were shipped and recursively trace a full list of shared library dependencies. They would then copy them all over to the shared library files directory.
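The real scripts are bash, but the core of the idea fits in a few lines of C. This toy version (mine, with illustrative details only) shells out to a glibc-style ldd, which already reports the transitive closure of dependencies, and prints a deduplicated list of libraries that a build script would then copy into the lib directory:

/* Toy dependency collector: run ldd over each shipped binary and
   print a deduplicated list of shared libraries to copy. A sketch
   only; assumes a glibc-style ldd is on PATH.                     */
#include <stdio.h>
#include <string.h>

#define MAX_LIBS 1024

static char seen[MAX_LIBS][512];
static int nseen;

static void collect(const char *binary) {
    char cmd[1024], line[1024];
    snprintf(cmd, sizeof cmd, "ldd '%s' 2>/dev/null", binary);
    FILE *p = popen(cmd, "r");
    if (p == NULL) return;
    while (fgets(line, sizeof line, p)) {
        /* Lines look like: "libz.so.1 => /usr/lib/libz.so.1 (0x...)" */
        char *arrow = strstr(line, "=> /");
        if (arrow == NULL) continue;
        char *path = arrow + 3;
        char *end = strchr(path, ' ');
        if (end) *end = '\0';
        int dup = 0;
        for (int i = 0; i < nseen && !dup; i++)
            dup = (strcmp(seen[i], path) == 0);
        if (!dup && nseen < MAX_LIBS) {
            snprintf(seen[nseen++], sizeof seen[0], "%s", path);
            puts(path);   /* the build script would cp this file */
        }
    }
    pclose(p);
}

int main(int argc, char **argv) {
    for (int i = 1; i < argc; i++)
        collect(argv[i]);
    return 0;
}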

After doing this, things worked a bit better. But only a tiny bit. SBCL worked now, at least. Everything else still crashed. Unfortunately, however, SBCL only worked on my particular system. Much later I would find out that on others it would still fail. The root of the problem lay in the fact that LD_LIBRARY_PATH does not cover everything. The system will still pick some shared libraries from elsewhere. Particularly libc and so forth.

So I scoured the web in search of a solution that would somehow let me constrain where Linux looked for shared libraries. After a long while, I finally found something: ld-linux.so. This file, which lives on every Linux system, is responsible for setting up the shared libraries an application depends on, and then running the application. It can be called directly, and it accepts a --library-path argument, with which I can force it to look in a particular place first. So I changed the wrappers around to also start things with ld-linux.so. Doing this allowed me to get Emacs working.

However, now SBCL didn't work anymore. For some reason that is still completely beyond me, if I try to run SBCL under ld-linux.so, it acts as if the argument that is the SBCL binary path was simply not there. You can try it for yourself: /lib64/ld-linux.so $(which sbcl) will show you the help text, as if that argument was just not there. If you pass it the path twice it does something, but it won't work right either. Fortunately, since SBCL doesn't have many dependencies to speak of, I've been able to get by without wrapping it in ld-linux.so. I can't wait for the day for that to break as well, though.

Anyway, back to where I was. Emacs now worked. Kind of. Running other binaries from it did not, because LD_LIBRARY_PATH would poison their loading: regular binaries elsewhere on the system would not execute. So I had to do away with that. Thanks to the ld-linux.so trick, though, it wasn't really needed, at least for Emacs. For Git, the situation was worse. Git does some weird stuff where it has a multitude of git-* binaries that each perform certain tasks. Some of those tasks call other Git binaries. Some of them call bash or things like that. I couldn't set LD_LIBRARY_PATH because that would crash bash and others, and I couldn't use ld-linux.so because that doesn't work when a new process is created. Back to Google, then.

After even more endless searching I finally found a solution. Using LD_PRELOAD I could inject a shared library to be loaded before any other. This library could then replace the exec* family of libc system functions and ensure that the process you wanted to call was actually called under ld-linux.so. You can do this kind of symbol replacement by using libdl, fetching the locations of the functions you want to replace using dlsym in the library's init function, and then defining your own versions that call out to the original functions. You can find a few interesting blog articles out there that use LD_PRELOAD and this kind of technique for malicious purposes. This culminated in something I called ld-wrap.so. Getting all the exec* calls to work again was its own adventure. On that path I discovered that execv will actually call out to a shell if it finds a shell script, and, even worse than that, the execvp* variants will actually call a shell any time they can't seem to execute the file regularly. I think it's pretty insane for a "low-level system call" to potentially invoke a shell to run a script file, rather than "just" executing an ELF binary. In fact, there is no way to "just" execute a binary without implementing your own system calls directly. That's bonkers.
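To make the technique concrete, here is a heavily condensed C sketch of the idea behind ld-wrap.so. This is my reconstruction, not the actual Portacle source; the paths and the "is this one of ours" test are hypothetical placeholders.

/* ld-wrap.c: a minimal LD_PRELOAD exec interposer (sketch only).
   Build with: gcc -shared -fPIC -o ld-wrap.so ld-wrap.c -ldl      */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <string.h>

typedef int (*execv_fn)(const char *, char *const[]);
static execv_fn real_execv;

__attribute__((constructor))
static void init(void) {
    /* Look up the libc execv that our definition below shadows. */
    real_execv = (execv_fn)dlsym(RTLD_NEXT, "execv");
}

int execv(const char *path, char *const argv[]) {
    /* Hypothetical test: only redirect binaries from our own tree.
       The real ld-wrap.so also checks for static binaries, which
       segfault when started under ld-linux.so. */
    if (strstr(path, "/portacle/") != NULL) {
        size_t n = 0;
        while (argv[n]) n++;                   /* count the arguments */
        char *newargv[n + 5];
        newargv[0] = (char *)"ld-linux.so";
        newargv[1] = (char *)"--library-path";
        newargv[2] = (char *)"/portacle/lin/lib"; /* hypothetical dir */
        newargv[3] = (char *)path;             /* real target binary  */
        for (size_t i = 1; i <= n; i++)        /* args + closing NULL */
            newargv[3 + i] = argv[i];
        return real_execv("/lib64/ld-linux.so", newargv);
    }
    return real_execv(path, argv);             /* everything else: as-is */
}

The real library additionally wraps the rest of the exec* family and handles the shell-script cases described above, which is where most of the weeks went.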

Anyway, getting ld-wrap.so to work right involved multiple iterations, the last of which finally got Git working properly. I had to incorporate tests to check whether the binary was in the Portacle distribution, as well as checks to see whether the binary was static or not. Apparently launching a static binary under ld-linux.so just results in a segfault. How very C-like. Anyway, it has not been the least bit of fun to go through this, and pretty much every step of getting all of this worked out cost me weeks.

Aside from the SBCL problem mentioned above, there's an outstanding issue regarding the usage of fonts in Emacs. Since there's absolutely no guarantee for any particular font to exist on the system, and since I'd like to provide a nice-looking default, I need to ship and install fonts automatically. I haven't gotten that to work quite yet, but I've been very disappointed to find that Emacs apparently has no support for loading a font from a file, and that Linux and OS X don't really support "temporarily" loading a font either. You have to install it into a user directory for it to be visible, so I have to touch things outside of the Portacle distribution folder.

Finally, apparently if you try to compile Emacs on some Linux systems, it will fail to run under ld-linux.so and just crash immediately with a "Memory Exhausted" warning. I have no clue what that is about and haven't received any feedback at all from the Emacs mailing lists either. So far this is not a problem, since the virtual system I build on works, but it is a major hassle because it means that building a distribution is out of the question for a lot of people. You can find out more about this bug on the issue tracker.

Windows

Windows has been an interesting system to work with. On one hand it has given me a lot of problems, but on the other it has also avoided a lot of them. The biggest pleasure was that shared library deployment "just worked". No need for complicated ld-linux shenanigans, Windows is just consistent and faithful to loading the first library that it can find on its PATH. Given that most of the libraries I need are already either provided by the Windows system or from the self-contained MSYS installation, there really haven't been any issues with getting things running on a deployed system.

However, it makes up for this in other areas.

While Emacs and SBCL have been very easy to get compiled and running, Git has once again proven to be a problem child. First, Git insists on keeping around git-* prefixed hard links to its binary because apparently a lot of things, both internal and external, still directly reference those. Hard links are difficult to ship because most archiving systems don't support them and instead store each link as a full copy, making the resulting archive humongous. The current size of ~80 megabytes is already a lot by my measures, but the hard link issue exploded it to well over 100. Thus I had to go hunt for an archiver that allowed both self-extracting SFX executables (after all, I couldn't ask people to install an archiver first) and was able to compress well and handle hard links. 7zip was up to the task, but it required complicating the deployment quite a bit in order to get the SFX working.

Next, Git is primarily a system written for Linux. As such, it has some "interesting" ideas about what the filesystem really looks like. I'm still not entirely sure how the path translation happens, but apparently Git interprets the root of the filesystem somewhere relative to its application location. This is fortunate for me, but took a long while to figure out. It is fortunate, because Git needs CURL to work, and CURL needs a certificate file to be able to do HTTPS. Naturally, Windows does not provide this by itself, so I have to do it. Git does allow you to set the path of the certificate file through a configuration variable, but it was one hell of a journey to get the entire setup working right. I'll try to explain why.

Because of, or rather thanks to, Git's weird interpretation of the filesystem root, I can create a "fake" Git system configuration. Usually, Git looks up /etc/gitconfig as part of the configuration files. Now, since the root is sort of relative to the Git binary, I can create an etc subdirectory with the configuration file in the Git platform directory. This allows me to specify the certificate file path without having to disturb any of the other platforms. Then, thanks again to this root interpretation, I can specify the path to the certificate file as an absolute path within the configuration file. Since the Git-interpreted root moves with the Git binary, it becomes effectively relative. Naturally this would not work on Linux or OS X, but thankfully there I don't need to resort to such tricks.
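In other words, the shipped system config can be as small as something like this (the CA-bundle path here is a made-up example, not Portacle's actual layout), read by Git as /etc/gitconfig relative to its own root:

[http]
	sslCAInfo = /ssl/certs/ca-bundle.crt

Because Git resolves that "absolute" path against its relocated root, the certificate file travels with the distribution.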

Finally, if I try to compile Git without gettext on Windows, it fails to run properly and just exits with "vsnprintf is broken". I did find some issues related to vsnprintf on Google, but nothing conclusive. Either way, if I do compile with gettext it seems to work fine. It doesn't work with gettext on OS X though, so I can't just enable it everywhere either.

Last but not least, Windows is primarily responsible for me not shipping a spell checker system. I really wanted to include that, as a spell checker is a really great thing to have when you're writing lots of comments and documentation, or just using Emacs for prose. However, I was simply not able to get ispell, aspell, or hunspell to compile and run no matter how much I tried. I was also unable to find any other compatible alternatives to those three that could be used as a replacement. If anyone else knows of a solution that I can compile successfully, and for which I can generate working dictionaries, I'd be very grateful.

Actually, I suppose it's also worth mentioning that for a while, before I wrote the launcher, I was trying to use batch scripts to launch things. Please, don't ever try to do that. Batch files are unbelievably cryptic to read and a huge pain to work with. There's no easy way to pass the argument list on to another program, for example, as batch will just reinterpret spaces within an argument as argument separators. If you can, use PowerShell, which is supposedly much better, or just write a proper launcher application that does that kind of logic.

Mac OS X

Finally, OS X. This is a bit of a problematic system for me because Apple has, for reasons beyond me, decided to make it as difficult as possible to test for. Since I needed Portacle to work on multiple versions of OS X if possible, and even beyond that just ensure that it works outside of a development environment, I had to get my hands on virtual machines somehow. I first bought a MacBook Air just for this purpose (that's a thousand dollars thrown to the wind), only to realise that trying to run any kind of virtual machine on it is futile because it's just unbearably slow. Fortunately there are ways to get VMWare Workstation on Linux to run OS X as well, if you use specially prepared images and some nefarious tools to unlock OS X as an option. However, only versions 10.11+ seem to run at any usable speed. 10.10 is so slow that you can pretty much forget trying to use it for anything.

Anyway, even just setting up a suitable build and test environment proved to be a major hassle. Thanks to Homebrew and MacPorts the building aspect wasn't too bad, though there too I've found weird inconsistencies like the aforementioned gettext problem. Aside from the compilation though, it really bothers me a lot that the OS X coreutils lack so many useful features. The build scripts have several special cases just for OS X because some Unix utility lacks some option that I need to get it done succinctly or efficiently.

Aside from the virtualisation thing, Apple also seems to try their hardest to prevent anyone from developing software for their system without also paying them a substantial fee every year. If you want to launch Portacle as a normal user, you just get a "security" error message that blocks the application from running. To get it to launch, you have to start the system settings application, navigate to the security tab, unlock it, then click on "run this application" in order to finally run it. They completely removed the option to allow foreign apps by default, too, so there's no option visible to me that would make it shut up. Windows has a similar security popup, but at least there you can immediately tell it to launch the application, rather than having to waste multiple minutes in menus to do so.

In addition to the "security" popup, Apple has also recently decided to activate a feature that makes it virtually impossible for me to properly ship additional libraries, or specific versions of the libraries that Portacle requires, thus forever constraining the possible versions it can run on. OS X will now automatically clear out DYLD_LIBRARY_PATH whenever you execute a new application. You can only disable this lunatic behaviour by booting into recovery mode and changing a security option, which is definitely not something I can ask people to do just to use Portacle. Thus, it seems it is impossible for me to, in the long term, cover a wider version range than a single one. This is very annoying, especially given that lots of people seem to stop upgrading now that Apple is screwing up ever more with every new version.

Ah well.

Future Outlook

That about sums up most of the major issues that I can remember. You can find out more stories of me going borderline insane if you browse around in the issues or the commit history.

Now that most of the platform problems are dealt with, Portacle is almost done. There are a few more things left to do, however. The Emacs configuration as it is right now is somewhat passable, but I'd like to add and change a few more things to make it more comfortable, especially for beginners. I don't want to change too much either, though, as that would make it hard for beginners to find help for a specific issue online.

Otherwise, I'd also like to work a bunch more on the included documentation and help file. As it stands, it gives an overview of almost everything one needs to know to get started, but I haven't had anyone run through it yet, so I can't really give any estimates as to whether it's comprehensible, readable, or even useful in the first place.

I might write about Portacle again another time, hopefully when it has finally reached something I can call "version 1.0". Until then, I'll try to write some more articles about other subjects.

Professor Fish: The Haskell Road to Software Language Engineering and Metaprogramming

FP talk at Chalmers, Gothenburg, Sweden

The Haskell Road to Software Language Engineering and Metaprogramming

2017-02-24, 10.00, conference room 8103, Rännvägen 6, Johanneberg.  

Speaker: Ralf Lämmel, University of Koblenz-Landau

Abstract:
In this talk, I would like to sketch my upcoming textbook on software languages http://www.softlang.org/book while putting on the hat of a Haskell programmer. Overall, the book addresses many issues of software language engineering and metaprogramming: internal and external DSLs, object-program representation, parsing, template processing, pretty printing, interpretation, compilation, type checking, software analysis, software transformation, rewriting, attribute grammars, partial evaluation, program generation, abstract interpretation, concrete object syntax, and a few more. Haskell plays a major role in the book in that Haskell is used for the implementation of all kinds of language processors, even though some other programming languages (Python, Java, and Prolog) and domain-specific languages (e.g., for syntax definition) are leveraged as well. I hope the talk will be interactive and help me to finish the material, and possibly give the audience some ideas about FP- and software language-related education and knowledge management.

Slides: in preparation

Colossal: The Rise of the Image: Every NY Times Front Page Since 1852 in Under a Minute

The New York Times published its first issue on September 18, 1851, but the first photos wouldn’t appear on the cover until the early 1900s, decades later. This visual timeline by self-described data artist Josh Begley captures the storied newspaper’s approach to layout and photography by incorporating every NY Times front page ever published into a single one-minute video. The timelapse captures decades of text-only front pages before the newspaper began to incorporate illustrated maps and wood engravings. The liberal usage of black and white photography begins a century later, and finally the first color photo appears in 1997. What a fascinating way to view history through images, over 60,000 front pages in all. If you liked this, don’t miss Farewell — ETAOIN SHRDLU. (via Kottke)

Colossal: A Homemade Multipoint Pinhole Camera Made from 32,000 Drinking Straws

Using 32,000 black drinking straws, collaborators Michael (Mick) Farrell and Cliff Haynes created the Straw Camera, a homemade camera they began experimenting with in 2007. Despite the connection one might draw to a pinhole camera, the Straw Camera actually functions quite differently, producing a multipoint perspective from an array rather than a single point perspective.

The direct analogue process records the light collected from each straw onto a piece of paper secured to the back of the camera. The camera gives a direct 1:1 view of the subject placed before it; however, it translates the image into one resembling a pointillist painting, breaking the subject into thousands of little dots.

“In a world beset by selfies with their immediate gratification, and HD television in all its glory feeding our visual appetite, a Straw Camera image of an individual, with its engineering projection and disappearance of the subject into the near fog of visual capture, gives the viewer a glimpse of just how transitory perception is,” said Cliff about the camera.

To read more about the project, check out the photography duo’s website for the Straw Camera, or their book which was published earlier this month. (via PetaPixel)

CreativeApplications.Net: DOBOTONE – VIDEOGAMO’s parametric party machine

Powered by a dizzying array of parametric meta-controls, VIDEOGAMO’s ‘party console’ DOBOTONE invites (up to) four players to cycle through a strange and fiercely competitive selection of lo-fi videogames.

Michael Geist: The Copyright Lobby’s IIPA Report: Fake News About the State of Canadian Copyright

The International Intellectual Property Alliance (IIPA), a lobby group that represents the major lobbying associations for music, movie, software, and book publishing in the United States, has released its submission to the U.S. government as part of the Special 301 process. The Special 301 process leads to an annual report invariably claiming that intellectual property rules in the majority of the world do not meet U.S. standards. The U.S. process has long been rejected by the Canadian government, which has consistently (and rightly) stated that the exercise produces little more than a lobbying document on behalf of U.S. industry. The Canadian position, as described to a House of Commons committee in 2007 (and repeated regularly in internal government documents):

In regard to the watch list, Canada does not recognize the 301 watch list process. It basically lacks reliable and objective analysis. It’s driven entirely by U.S. industry. We have repeatedly raised this issue of the lack of objective analysis in the 301 watch list process with our U.S. counterparts.

The lack of credibility stems in part from the annual IIPA submission. While the submission generates some media attention, this year’s falls squarely into the category of fake news. The IIPA focuses on three concerns: piracy rates in Canada, the notice-and-notice system for allegations of infringement, and fair dealing. None of the concerns withstand even mild scrutiny and each is addressed below.

1.    State of Canadian Piracy

Throughout the Canadian copyright reform process that led to the 2012 law, the IIPA and rights holder groups claimed that Canada was a piracy haven in need of copyright reform. Despite getting what it asked for – tough anti-circumvention rules similar to those found in the U.S., an ISP liability system, an enabler provision that makes it easy to target websites that primarily facilitate infringement, and retention of some of the biggest statutory damages for commercial infringement in the world – the IIPA has returned to the same playbook in advance of the review of Canadian copyright law scheduled for later this year.

The IIPA claims are presented without much evidence, presumably because it isn’t available. The real Canadian story is that infringement rates have consistently declined in recent years. For example, the Business Software Alliance’s most recent annual report showed Canada at its lowest software piracy rate ever and well below the global and European averages. The decline will not come as a surprise to anyone following the explosive growth of digital services in Canada. As many predicted, the availability of affordable, convenient services is easily the best method to counter infringement. In the case of Canada, Netflix is seemingly too popular for many in the cultural community, as the millions of subscribers have transformed the sector and conclusively demonstrated that Canadian consumers are willing to pay for good entertainment services. The growth of these services is not limited to video. SOCAN, Canada’s largest music copyright collective, recently reported record earnings from Internet streaming services, which increased by more than 460 percent (following previous records), again confirming that Canadian consumers are paying for music online too.

But wait, says the IIPA. While it admits that Canadian law has been used to shut down piracy sites such as isoHunt and KickAss Torrents, it identifies a few other sites that it says have a Canadian connection. However, the IIPA neglects to mention that the U.S. government’s most recent report on notorious markets makes no reference to Canada. In fact, it identifies what it says are the most problematic online markets and sites in the world and the word “Canada” does not appear anywhere. More importantly, the IIPA acknowledges that the Canadian enabler provision has been effective in shutting down sites of this kind. The failure is not a function of Canadian law, but rather a failure of the IIPA and its members to use the very legal tools they demanded.

2.    Notice and Notice

The IIPA is also unhappy with Canada’s notice-and-notice system, which it says is inadequate, is not receiving full compliance from ISPs, and hurts licensed services. As noted above, licensed services are experiencing record revenues and growth in Canada. Further, there has been no public evidence that ISPs are not compliant with the law. It would be surprising if there were, given that ISPs face financial penalties for failure to comply.

With respect to whether the notice-and-notice system meets U.S. standards, it is worth noting that the U.S. government itself has acknowledged that it does. As part of the Trans Pacific Partnership treaty, the Canadian system was treated as equivalent to the U.S. system for the purposes of complying with ISP liability and safe harbour rules. All parties, including the U.S. and Canadian governments, asserted that no reforms would be needed in Canada to meet the TPP requirements. Moreover, promoting the U.S. system raises serious concerns, particularly since it is receiving increased scrutiny with reports that it generates millions of fake DMCA notices that have massively inflated claims of online infringement. In fact, Google has advised the Register of Copyrights that 99.95% of the processed URLs from Google’s trusted submitter program regarding search are machine-generated URLs that do not involve actual pages in the search index. In other words, the notice-and-takedown system is filled with fake notices and rife with abuse.

The Canadian notice-and-notice system needs amendment, but not for the reasons articulated by the IIPA. The Canadian government never intended for notice-and-notice to be used by rights holders to send thousands of settlement demands and scare recipients into paying settlements. The Canadian government’s own public documents make it clear that there is no obligation to settle and even the movie industry has established a website that tries to set the record straight. The misuse of the notice-and-notice system is the real story and one that requires reform when the government turns to copyright.  Notice-and-notice should not be used by rights holders to trick or scare users into paying hundreds of dollars for settlements as part of ethically questionable anti-piracy business tactics. Addressing the notice-and-notice loopholes in the system should be at the top of the 2017 reform list.

3.    Fair Dealing

The IIPA comments on Canada also focus on Canadian fair dealing law, as it points to the 2012 reforms and states “that none has had a more concrete and negative impact than the addition of the word ‘education’ to the list of purposes (such as research and private study) that qualify for the fair dealing exception.” Given that it is fair dealing/fair use week, it is essential to correct the record yet again.

i.    Fair Dealing Practices

First, the attempt to link fair dealing practices in Canada with the 2012 legislative reforms is false. Fair dealing includes multiple purposes that can be relied upon by educational institutions, including research and private study. The addition of education in 2012 was always evolutionary rather than revolutionary. Indeed, the proof is in the Supreme Court of Canada’s fair dealing copyright decisions, which ruled against Access Copyright without the benefit of an education fair dealing purpose.

The widely used fair dealing guidelines are based primarily on decisions from the Supreme Court of Canada, the Federal Court of Appeal, and the Copyright Board of Canada. Despite claims that fair dealing guidelines went beyond the law, Access Copyright has lost every legal attempt to challenge them. The courts and board have provided detailed guidance on the scope of fair dealing, the appropriate test, and the applicability of insubstantial copying. Current practices have been influenced by what courts and tribunals have ruled, not what the government implemented in 2012. In fact, Canadian educators could rely far more on the 2012 reforms, including the Internet exception for education and the exception for non-commercial user-generated content.

It is important to note that Canadian fair dealing practices are not inconsistent with many jurisdictions around the world. For example, the U.S. fair use provision is far broader than fair dealing with recent fair use decisions involving the legality of university copying, digitization practices, and use of APIs. Fair use can be found in other countries, some of which have practices that involve far more generous copying than Canada. For instance, copying 20% of a book is viewed as fair use in Israel, double the Canadian guideline. Most recently, the Australian Productivity Commission, a government-backed think-tank, recommended the adoption of fair use in that country.

ii.    The State of Canadian Educational Publishers

The IIPA repeats the oft-stated claim that Canadian educational publishers are struggling and seeks to draw a direct link to fair dealing. The claim is false. Publishers may be facing new challenges, but copyright is a minor part of the story as disclosed in their own corporate and legal filings. Pearson PLC, the world’s largest education company, recently warned of an unprecedented decline in the North American education publishing market. This primarily reflects U.S. developments and highlights how Canada is not an outlier in educational publishing.

Pearson is not alone. Ariel Katz has previously debunked claims regarding Oxford University Press, whose recent annual reports acknowledge changing market conditions around the world, with the company noting:

“the Higher Education textbook market shrank in important markets such as the UK, Canada, and the US, illustrating the contrasting array of market conditions to which OUP needed to adapt in 2014.”

Nelson Education is the largest Canadian educational publisher and its President and CEO Geoff Nordal identified the primary economic challenges in an affidavit:

In Canada, each province and territory has authority over curriculum development and education funding for the K-12 Market. Following a historic high in Canada in 2006 with respect to new curriculum development and spending, the K-12 Market contracted. The K-12 Market has been negatively affected by reduced spending on new curriculum by Canadian schools over the last five years, and in particular the spending decline in Ontario which represents the largest proportion of educational spending in Canada.

In the higher education market, Nordal focused on the following issues:

The Higher Education Market has been negatively affected by, among other things: a lack of clarity at universities with respect to ‘ancillary fees’; with certain institutions banning digital homework solutions with added fees; increased traction in the open textbook movement due in part to government funding in a number of provinces; and the use of used books, rental books and peer-to-peer sharing, impacting the demand for new textbooks at universities and colleges in Canada. The impact caused by used books and rental books is mitigated by revisions cycles and new textbook editions, the adoption of digital materials and increased use of custom and indigenous products. In addition, the Higher Education Market is in transition from traditional books to digital products, which is having a transformative effect on the business.

Nordal’s emphasis on reduced provincial spending (for K-12) and the digital shift (for higher education) is consistent with the data from other sources. The 2010 report on K-12 publishing commissioned by Canadian Heritage also pointed to the long pilot periods delaying purchasing decisions and the increased use of alternative and digital resources.

These findings are also consistent with a 2015 study prepared for Creative BC and the Association of Book Publishers of British Columbia. The study characterizes the challenge for educational publishing as follows:

Scholarly and educational publishers share some of the same issues as trade publishers, but they face other unique challenges. Tablet and other nonprint use will increase in the school systems here and abroad, changing how educational materials are bought, used and updated. Scholarly publishers and trade publishers that sell into the academic market are struggling with the impact on their sales of Open Access and fair use policies, tailored subscription services such as Scribd’s Edelweiss, used book sales, student piracy and increased library use for class reading lists.

None of this will surprise anyone on campuses or in schools in Canada. As the B.C. study on the publishing industry notes, open access and free online alternatives do represent a business threat to the conventional publishing industry. Several provinces have invested heavily in developing quality, peer-reviewed online materials that can be freely used by any school. For example, Open School BC, backed by the province, has modules in the sciences, social sciences, and languages. The B.C. Open Textbook Project has over 150 open textbooks that have saved students millions of dollars. E-learning Ontario has an online resource bank featuring thousands of resources for students from kindergarten to Grade 12.

Meanwhile, Canadian post-secondary institutions continue to spend hundreds of millions of dollars each year on licensing from publishers. As the Canadian Association of Research Libraries (CARL) noted at the start of this academic year:

The 31 member libraries of the Canadian Association of Research Libraries (CARL) spent $293 million on information resources in 2014-15, demonstrating a clear commitment to accessing print and digital content legally and rewarding content owners accordingly. Universities are actively engaged in outreach to their faculty, staff, and students, educating them on their rights and responsibilities under the Copyright Act and ensuring that uses of material under copyright fall well within the provisions of the law. Where educational uses are more substantive and therefore fall outside of fair dealing, the content is either purchased to be added to licensed collections, or rights clearances are obtained and royalties are paid for these uses. Trained, knowledgeable library staff support these activities.

The IIPA and its allies have engaged in a fake news effort to malign fair dealing in Canada. The actual numbers and evidence tell a far different story: paying for content remains by far the largest method of acquiring access to content for educational institutions. In fact, the spending from just the 31 CARL libraries on information resources is more than 14 times the total revenues for Access Copyright across all its licences.

The Future of Canadian Copyright Reform

The issue of copyright reform will unquestionably be on the policy radar screen starting later this year and continuing into 2018. Changes are needed: as discussed above, the government should address the misuse of notice-and-notice. With the Canadian recording industry now admitting that the WIPO Internet treaties were a wrong guess, the government should fix the fair dealing gap by creating a clear exception in the anti-circumvention rules for fair dealing.  Further, it should consider expanding fair dealing to a fair use model (by adding “such as” to the list of fair dealing purposes), which would be more consistent with the intent of the law and create the necessary pro-innovative policies that we see in places like the U.S., Singapore, and Israel. As the government moves forward with the review process, it will be essential that the debate focus on the real state of Canadian copyright, not the fictional one portrayed by the IIPA.

The post The Copyright Lobby’s IIPA Report: Fake News About the State of Canadian Copyright appeared first on Michael Geist.

Saturday Morning Breakfast Cereal: Saturday Morning Breakfast Cereal - Solving Sophie's Choice



Click here to go see the bonus panel!

Hovertext:
My love for you is boundless, but it's not a desideratum in this context.

New comic!
Today's News:

Some fresh and tasty book reviews have been posted at The Weinerworks.

Tea Masters: Terra Sigillata, a 2500 years old pottery technique

The terra sigillata method dates back to the Greeks and Romans some 2500 years ago. The main characteristic of this pottery technique is that the slip, the glaze on the earthenware, uses the finest particles, those that float at the surface of the clay. (We can also find this technique in Chinese pottery: ni jiang you. It is sometimes used on Shantou teapots to make them less porous and appear redder.)
The brown cup in this article is a terra sigillata cup made by French potter Dalloun. He uses this ancient method while adding a touch of unexpected natural beauty: wild mud for his slip. Because this slip is obtained directly from nature, it contains different elements that produce different colors. And since only the lightest and smallest particles are used, this glaze is very soft to the touch and has a natural gloss.
It's also interesting to see that Dalloun threw the cups on a wheel and later de-centered them to make them slightly uneven, using a wabi approach.

The body of the cup is made of white earthenware. It is fired between 950 and 1070 degrees Celsius. He usually uses a gas kiln, but the cups he sent me were wood fired.
My job was to test these cups, give him suggestions for future productions and establish which tea would be most suitable for his works. The first thing that struck me is that each cup had a different reaction to tea. Using different soils to make the slip means that the content of each slip is different and interacts differently with the teas. One color in particular (which I won't sell) produces very sour notes!
Terra sigillata cups aren't as thoroughly glazed as porcelain. They are much less porous than unglazed earthenware, but there's still some absorption and interaction between the tea and the slip and clay. My tests have shown that these two cups are a good fit for puerh, especially shu, and for heavily roasted Oolong.
You may also check my 'Before spring specials'. For the first time, I have discounted the spring 2016 Da Yu Ling Oolongs! And I have increased the discounts on several teas up to 20% off!
The brush marks are visible on the surface of the cup!

Michael Geist: Bogus Claims: Google Submission Points to Massive Fraud in Search Index Takedown Notices

The U.S. DMCA notice-and-takedown system has generated heated debate for many years, with supporters arguing that the safe harbour is essential, while rights holder critics counter that the growing number of takedown notices sent to Google illustrates mounting piracy concerns. In recent months, there have been several reports that raise questions about the reliability of takedown notices. A study released last year by the University of California, Berkeley and Columbia University found that approximately 30% of notices were questionable, while a TorrentFreak report this week identified tens of millions of fake DMCA takedown notices sent to Google on behalf of a website with virtually no traffic. An earlier report also raised questions about dubious takedown practices.

Yet those reports pale in comparison to data just released by Google in its submission to the Register of Copyrights as part of the review of the DMCA notice-and-takedown system. Google reports that the overwhelming majority of takedown notices sent to Google Search through its Trusted Copyright Removal Program do not involve pages that are actually in its search index. The submission states:

a significant portion of the recent increases in DMCA submission volumes for Google Search stem from notices that appear to be duplicative, unnecessary, or mistaken. As we explained at the San Francisco Roundtable, a substantial number of takedown requests submitted to Google are for URLs that have never been in our search index, and therefore could never have appeared in our search results. For example, in January 2017, the most prolific submitter submitted notices that Google honored for 16,457,433 URLs. But on further inspection, 16,450,129 (99.97%) of those URLs were not in our search index in the first place. Nor is this problem limited to one submitter: in total, 99.95% of all URLs processed from our Trusted Copyright Removal Program in January 2017 were not in our index.

These numbers are simply staggering, with only a tiny fraction of the millions of requests reflecting actual pages in the search index. Rather, 99.95% of the processed URLs from Google’s trusted submitter program are machine-generated URLs that do not involve actual pages in the search index. Given that data, Google notes that claims that the large number of requests correlates to infringing content on the Internet are incorrect:

Nor is the large number of takedown requests to Google a good proxy even for the volume of infringing material available on the Internet. Many of these submissions appear to be generated by merely scrambling the words in a search query and appending that to a URL, so that each query makes a different URL that nonetheless leads to the same page of results.

The incredible volume of fake claims regarding allegedly infringing pages represents a serious problem. Indeed, the Google data points to massive fraud in search index takedown requests, calling into question claims about the scope of infringing material on the Internet. The Register of Copyrights review of the DMCA continues, with written submissions on empirical research due next month.

The post Bogus Claims: Google Submission Points to Massive Fraud in Search Index Takedown Notices appeared first on Michael Geist.

Electronics-Lab: Measuring seismic activity using ProtoCentral OpenPressure

Seismic activity or “Vibrations of the earth” is measured using ProtoCentral’s OpenPressure 24-bit DAQ System.

A geophone is a magnetic device used to measure the Earth’s normal vibrations, as well as abnormal ones during events such as earthquakes. These movements are also present when there is a small explosion (commonly used for mining and exploration purposes).

Measuring seismic activity using ProtoCentral OpenPressure – [Link]

The post Measuring seismic activity using ProtoCentral OpenPressure appeared first on Electronics-Lab.

LLVM Project Blog: 2016 LLVM Developers' Meeting - Experience from Johannes Doerfert, Travel Grant Recipient

This blog post is part of a series of blog posts from students who were funded by the LLVM Foundation to attend the 2016 LLVM Developers' Meeting in San Jose, CA. Please visit the LLVM Foundation's webpage for more information on our Travel Grants program.

This post is from Johannes Doerfert:
2016 was my third time attending the US LLVM developers' meeting, and for the third year in a row I was impressed by the quality of the talks, the organization, and the diversity of attendees. The hands-on experiences that are presented, combined with innovative ideas and cutting-edge research, make it a perfect venue for me as a PhD student. The honest interest in the presented topics and the lively discussions that include students, professors, and industry people are two of the things that stood out most strongly for me at these developer meetings.

For the last two years I attended mainly as a Polly developer who talked about new features and possible applications of Polly. This year, however, my roles were different. First, I attended as part of the organization team of the European LLVM developers' meeting 2017 [0], together with my colleagues Tina Jung and Simon Moll. In this capacity I answered questions about the venue (Saarbruecken, Germany [1,2]) and about the changes from prior meetings. More importantly, though, I advertised the meeting to core developers who do not usually attend the European version. Second on my agenda was the BoF on a parallel extension to the LLVM-IR, which I organized with Simon Moll. In this BoF, but also during the preparatory discussion on the mailing list [3], we tried to collect motivating examples, requirements, and known challenges for a parallel extension to LLVM. These insights will be used to draft a proposal that can be discussed in the community.

Finally, I attended as a 4th-year PhD student who is interested in contributing his work to the LLVM project (not only Polly). As my current research required a flexible polyhedral value (and iteration-space) analysis, I used the opportunity to implement one with an interface similar to scalar evolution. The feedback I received on this topic was strictly positive. I will soon post a first version of this standalone analysis and start a public discussion. Since I hope to finish my studies at some (not too distant) point in time, I also seized the opportunity to inquire about potential options for the time after my PhD.

As a final note I would like to thank the LLVM Foundation for their student travel grant that allowed me to attend the meeting in the first place.


[0] http://llvm.org/devmtg/2017-03/
[1] http://sic.saarland/
[2] https://en.wikipedia.org/wiki/Saarbr%C3%BCcken
[3] http://lists.llvm.org/pipermail/llvm-dev/2016-October/106051.html

Explosm.net: Comic for 2017.02.22

New Cyanide and Happiness Comic

Ideas from CBC Radio (Highlights): The Proper Role of Science: Peter Gluckman

The Harper government muzzled scientists. Donald Trump's administration is now doing the same. But a better relationship between science and government is possible. Highlights from a talk by Sir Peter Gluckman.

The Universe of Discourse: Moore's law beats a better algorithm

Yesterday I wrote about the project I did in the early 1990s to find the best anagrams. The idea is to give a pair of anagram words a score, which is the number of chunks into which you have to divide one word in order to rearrange the chunks to form the other word. This was motivated by the observation that while “cholecysto-duodeno-stomy” and “duodeno-cholecysto-stomy” are very long words that are anagrams of one another, they are not interesting because they require so few chunks that the anagram is obvious. A shorter but much more interesting example is “aspired / diapers”, where the letters get all mixed up.

I wrote:

One could do this with a clever algorithm, if one were available. There is a clever algorithm, based on finding maximal independent sets in a certain graph. I did not find this algorithm at the time; nor did I try. Instead, I used a brute-force search.

I wrote about the brute-force search yesterday. Today I am going to discuss the clever algorithm.

The plan is to convert a pair of anagrams into a graph that expresses the constraints on how the letters can move around when one turns into the other. The graph for comparing acrididae (grasshoppers) with cidaridae (sea urchins) has eight nodes, one for each way a pair of adjacent letters in one word can be matched to the same pair in the other: (2,4) for ri; (3,1), (3,5), (5,1), and (5,5) for the two ids; (6,2) and (6,6) for da; and (7,7) for ae.

The “2,4” node means that the letters ri at position 2 in acrididae match the letters ri at position 4 in cidaridae; the “3,1” node is for the match between the first id in one word and the first id in the other. The two nodes are connected by an edge to show that the two matchings are incompatible: if you map the ri to the ri, you cannot also map the first id to the first id; instead you have to map the first id to the second one, represented by the node “3,5”, which is not connected to “2,4”. A maximal independent set in this graph is a maximal selection of compatible matchings in the words, which corresponds to a division into the minimum number of chunks.
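
To make the construction concrete, here is a minimal sketch of the graph builder in Python, written from the description above (the author's own code is Perl and lives on GitHub; the names and representation here are mine). A node (i, j) records that the adjacent letter pair at position i of the first word also occurs at position j of the second; an edge joins two nodes whose implied letter-for-letter assignments contradict each other.

def matching_graph(a, b):
    # Nodes: every way a length-2 substring of a also occurs in b.
    nodes = [(i, j)
             for i in range(len(a) - 1)
             for j in range(len(b) - 1)
             if a[i:i+2] == b[j:j+2]]

    def conflict(n1, n2):
        # Merge the letter assignments of both nodes and check that
        # together they still form a consistent partial bijection
        # between positions of a and positions of b.
        a_to_b, b_to_a = {}, {}
        for (i, j) in (n1, n2):
            for d in (0, 1):
                if a_to_b.setdefault(i + d, j + d) != j + d:
                    return True   # a letter of a sent to two places in b
                if b_to_a.setdefault(j + d, i + d) != i + d:
                    return True   # a position of b claimed by two letters
        return False

    edges = [(m, n) for k, m in enumerate(nodes)
             for n in nodes[k + 1:] if conflict(m, n)]
    return nodes, edges

On acrididae/cidaridae this yields the eight nodes listed above, with (2,4) and (3,1) in conflict and (2,4) and (3,5) compatible, as described.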

Usually the graph is much less complicated than this. For simple cases it is empty and the maximal independent set is trivial. This one has two maximal independent sets: one (3,1; 5,5; 6,6; 7,7) corresponding to the obvious minimal splitting a·c·r·id·idae (reassembled as c·id·a·r·idae), and the other (2,4; 3,5; 5,1; 6,2) to the equally good splitting a·c·rid·ida·e (reassembled as c·ida·rid·a·e).

In an earlier draft of yesterday's post, I wrote:

I should probably do this over again, because my listing seems to be incomplete. For example, it omits “spectrum / crumpets” which would have scored 5, because the Webster's Second list contains crumpet but not crumpets.

I was going to leave it at that, but then I did do it over again, and this time around I implemented the “good” algorithm. It was not that hard. The code is on GitHub if you would like to see it.

To solve the maximum independent set instances, I used a guided brute-force search. Finding a maximum independent set is NP-complete, and so the best known algorithms for it run in exponential time. But the instances we are interested in here are small enough that this doesn't matter. The example graph above has 8 nodes, so one needs to check at most 2⁸ = 256 subsets to find the largest independent set.
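
At this scale the search can be an entirely naive sketch (again mine, and simpler than the author's “guided” version): try subsets of nodes from largest to smallest, keep the first one that spans no edge, and convert it to a chunk count. Each chosen node glues one pair of adjacent letters together, so a word of length n with a largest independent set of size k falls into n − k chunks. This reuses matching_graph from the sketch above.

from itertools import combinations

def score(a, b):
    # Chunk score: the minimum number of pieces a must be cut into
    # so that the pieces can be rearranged to spell b.
    nodes, edges = matching_graph(a, b)
    edgeset = {frozenset(e) for e in edges}
    for k in range(len(nodes), 0, -1):
        for subset in combinations(nodes, k):
            if all(frozenset(pair) not in edgeset
                   for pair in combinations(subset, 2)):
                return len(a) - k   # k adjacent pairs survive as units
    return len(a)   # no pair survives: every letter is its own chunk

# e.g. score("acrididae", "cidaridae") == 5
#      score("aspired", "diapers") == 7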

I collated all the dictionaries I had handy. (I didn't know yet about SCOWL.) These totaled 275,954 words, somewhat more than Webster's Second by itself. One of the new dictionaries did contain crumpets, so the result does include “spectrum / crumpets”.

The old scored anagram list that I made in the 1990s contained 23,521 pairs. The new one contains 38,333. Unfortunately, most of the new stuff is of poor quality, as one would expect. Most of the new words that were missing from my dictionary the first time around are obscure. Perhaps some people would enjoy discovering that “basiparachromatin” and “Marsipobranchiata” are anagrams, but I find it of very limited appeal.

But the new stuff is not all junk. It includes:

10 antiparticles paternalistic
10 nectarines transience
10 obscurantist subtractions

11 colonialists oscillations
11 derailments streamlined

which I think are pretty good.

I wasn't sure how long the old program had taken to run back in the early nineties, but I was sure it had been at least a couple of hours. The new program processes the 275,954 inputs in about 3.5 seconds. I wished I knew how much of this was due to Moore's law and how much to the improved algorithm, but as I said, the old code was long lost.

But then just as I was finishing up the article, I found the old brute-force code that I thought I had lost! I ran it on the same input, and instead of 3.5 seconds it took just over 4 seconds. So almost all of the gain since the 1990s was from Moore's law, and hardly any was from the “improved” algorithm.

I had written in the earlier article:

In 2016 [ the brute force algorithm ] would probably still [ run ] quicker than implementing the maximal independent set algorithm.

which turned out to be completely true, since implementing the maximal independent set algorithm took me a couple of hours. (Although most of that was building out a graph library because I didn't want to look for one on CPAN.)

But hey, at least the new program is only twice as much code!

[ Addendum: The program had a minor bug: it would disregard capitalization when deciding if two words were anagrams, but then compute the scores with capitals and lowercase letters distinct. So for example Chaenolobus was considered an anagram of unchoosable, but then the Ch in Chaenolobus would not be matched to the ch in unchoosable, resulting in a score of 11 instead of 10. I have corrected the program and the output. Thanks to Philip Cohen for pointing this out. ]

[ Addendum 20170223: More about this ]
