Wednesday, April 25, 2018

Prufrock: The Injustices of Algorithms, the End of the Professional Songwriter, and Court Rules ...

In this sad, absurd world, a ray of sanity shone from the Ninth Circuit Court of Appeals on Monday: A three-judge panel ruled that Naruto, a crested macaque that took a selfie with a photographer's camera, does not own the picture because, well, she's a monkey, and monkeys lack "statutory standing under the Copyright Act."

Is this the end of the line? Will we ever see you again? Will our heart be a vacant house when you're gone? Who knows? Still, it's rough out there for songwriters, who are having a hard time making ends meet in a world where everyone streams music: "In 2013, Andre Lindal, having finally climbed to the pinnacle of pop songwriter success, surveyed his works and despaired. The dubstep-tinged heart-throbber he had written for Justin Bieber, 'As Long as You Love Me,' was a smash hit, reaching No. 3 on Billboard's Top 40 rankings. The music video, essentially a short film starring actor Michael Madsen, had garnered over 34 million plays on YouTube in its first year. Pandora users had listened to the track more than 38 million times. Streaming numbers were Lindal's friend that year, at least until he got paid. When he did, he found that the 34 million YouTube views had earned him $218, and the 38 million Pandora streams had netted him only $278."

Speaking of streaming, a new service for streaming theatrical plays will launch this summer.

More regulation is not the answer to today's problems. We need a skin-in-the-game approach according to Nassim Nicholas Taleb: "Taleb contends that you shouldn't put someone else's skin in the game without putting your own in it as well. Taleb shows that this idea is at least as old as the Code of Hammurabi, which declared that if a builder constructed a house that collapsed and killed the occupant, the builder should be put to death. The 'eye-for-an-eye' equivalence behind Hammurabi's code does not need to be taken literally: we don't need to cut off the leg of the surgeon who accidentally amputated the wrong leg of a patient. Taleb assures us that it is probably enough to 'cut off' the surgeon's golf club membership with a large lawsuit settlement, so that the patient is not the only one at risk during the operation."

Jerry Z. Muller writes against metrics.

There is no such thing as "suburbia," writes Addison Del Mastro. Even in the same area, you can find more than one type of suburb.

The end of The Avengers: Why is Marvel ending the most lucrative movie franchise ever?

Alan Jacobs—a Christian thinker for the Internet age. (HT: John Wilson)

Essay of the Day:

In a long but important piece at The New Atlantis, Tafari Mbadiwe explains why algorithms created to help judges determine fair sentences fail:

"An algorithm is, in its most basic form, a well-defined procedure for solving a problem or making a decision. In this sense, the use of algorithms in the judicial system long predates not only the Big Data era but even the widespread integration of computers into our workplaces. The U.S. Federal Sentencing Guidelines — a manual of rules for helping judges efficiently determine fair sentences — are essentially a set of hyper-detailed algorithms meant to simplify and standardize judicial sentencing. And just as we've progressed from printing the original 1987 guidelines in bound volumes to making them easily accessible online, so too have we begun to employ computers to guide judicial choices using more advanced, software-based algorithms.

"But the rough continuity between the earlier guidelines and computerized algorithms shouldn't obscure just how much we as a society have in recent years ceded judicial agency to algorithms. Unlike the earlier guidelines, not only are the new algorithms executed by a computer rather than a person, but in many cases their workings are likely too abstruse to be understood, much less fairly administered, by a judge.

"Courts across America have adopted a patchwork of different algorithms for a wide variety of tasks, and as a result there is no single algorithmic system that's as ubiquitous as the Federal Sentencing Guidelines. One of the more prominent ones, however, is the Correctional Offender Management Profiling for Alternative Sanctions system, a mouthful that condenses into the catchier acronym, COMPAS. It was first developed in 1998 by the for-profit consulting company Northpointe Institute for Public Management (recently rebranded as part of Equivant, a software company based in Ohio).

"Northpointe's aim was to develop a product that made accurate judgments on a defendant's risk of recidivism, or re-offending. Often, judges use the risk scores produced by these algorithms to determine whether a person awaiting trial should be sent to jail or released, and whether or not a person convicted of a crime should be sentenced to prison, granted parole, or given the opportunity to receive the help he might need to get his life back on track.

"Reporting differs on the extent to which judges can or do use the risk scores as a factor in determining the actual length of a prison sentence. The creators of risk assessment algorithms have been quoted in news articles insisting that their systems are intended only to suggest which defendants might be most eligible for alternatives to incarceration. The Assistant Attorney General for the state of Wisconsin, offering a more ambiguous picture, argued before the state Supreme Court that 'the risk score alone should not determine the sentence of an offender.' Yet, judges have repeatedly cited high risk scores in their sentencing decisions, and admitted to reporters that they gave longer sentences than they would have had they not known the score."

Read the rest.

Photos: Kotisaari Island

Poem: Cecily Parks, "Front Yard Rhyme"

Get Prufrock in your inbox every weekday morning. Subscribe here.



