
Death by Algorithm

Posted by Kevin Townsend on August 10, 2015.

Philosophy is a wonderful thing – it’s a way of transforming life-and-death situations into a string of emotionless logical arguments. But we remove emotion from life at our peril, because emotion lies at the heart of our humanity; and argument without humanity leads us astray.

But to the point. Synchronicity has drawn two events together that compel me to comment. Firstly, philosopher John Danaher has re-posted his essay titled The Philosophical Importance of Algorithms. This is a fascinating introduction to a fascinating subject; and I couldn’t possibly comment on the veracity of his philosophical arguments. Here’s an example:

The classic example of a subjectivist ontology in action is money. Modern fiat currencies have no intrinsic value: they only have value in virtue of the collective system of belief and trust. But those collective systems of belief and trust often work best when the underlying physical reality of our currency systems is hard to corrupt. As I noted before, the algorithmic systems used by cryptocurrencies like Bitcoin might provide the ideal basis for a system of collective belief and trust. Thus, algorithmic systems can be used to add to or alter our social ontology.

It’s that last sentence that is important: algorithms can and do shape our lives.

Historically, the majority of algorithms we use every day are self-taught. Importantly, they are qualified by humanity, conscience and experience. Danaher says as much, and illustrates the idea with a methodology for sorting books on a bookshelf. I have another, one that illustrates the futility of relying on algorithms and the fundamental flaw inherent in logic – and, by extension, in all philosophy and indeed all algorithms built on logic.

Every day most of us cross busy roads without getting hit by a motor vehicle. That is because experience, logic and training have taught us to apply our own algorithms for safety. Humanity teaches us to do so without endangering others. Mostly we can rely on these algorithms – but ultimately we cannot. That is because our algorithm, like philosophy, is built on logic. But for logic to provide an accurate output it has to know and analyse all possible inputs. And what we don’t know we cannot algorithmically process.

So, largely based on the known speeds, sounds and appearance of motor vehicles, we use a personally developed and applied algorithm to avoid getting knocked down. But that algorithm will not protect us from the new military stealth vehicle that travels at the speed of sound, makes no noise and is invisible – because we knew nothing about it. Without total knowledge our algorithms are ultimately false, and we can never know absolutely everything. The problem we always shy away from is that this applies to all algorithms and to all knowledge assumed by combining separate ‘facts’. Unless we know everything, we cannot accurately deduce outcomes; and it is absurd to think we will ever know everything.

Algorithms can help us, but should never be relied upon. The danger we now face – in the shorter term for society, and in the longer term for the very existence of our species – is that we are developing computerised algorithms that will control us, rather than serve us.

And that leads to the second part of this synchronicity. TorrentFreak told us on Saturday about one particular type of algorithm that is already imposed upon us. On numerous sites around the world, perfectly ‘innocent’ videos have been censored by algorithmic process. In an apparent attempt to protect its copyright in the Columbia film Pixels, an algorithm (or algorithms) scoured the internet looking for videos with the word Pixels and automatically despatched DMCA takedown demands, which were equally automatically enforced. All without human intervention. Whether these videos actually are in breach of copyright is irrelevant to any of the algorithms involved.
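To see how little it takes to produce this kind of collateral damage, here is a minimal, purely illustrative sketch – not Columbia’s actual system, and all titles are invented – of the naive keyword matching described above:

```python
# Hypothetical illustration of keyword-only takedown flagging.
# A matcher like this "knows" nothing about copyright; it only
# knows whether a title contains a string.

videos = [
    {"title": "Pixels (2015) official trailer", "year": 2015},
    {"title": "pixels - my experimental short film", "year": 2006},
    {"title": "Cat compilation", "year": 2014},
]

def flag_for_takedown(video, keyword="pixels"):
    # No check of content, date, or ownership -- a video made
    # nine years before the film is flagged just the same.
    return keyword in video["title"].lower()

flagged = [v["title"] for v in videos if flag_for_takedown(v)]
print(flagged)
```

Two of the three titles are flagged, including the 2006 short film that predates the Columbia release entirely – exactly the failure mode the article goes on to describe.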

Here’s an example of a 2006 video being removed from Vimeo.

[Screenshot: the takedown notice shown on the 2006 ‘Pixels’ video’s Vimeo page]

(It’s back up now. Ironically, it looks to me as if the original video maker would have a pretty good breach of copyright claim against Columbia.)

The point, however, is that this is an early example of an algorithm that controls us rather than serves us. It does exactly what it is supposed to do. However, the cost of making it work perfectly is considered too high (and would be impossible anyway), and the resulting collateral damage is accepted.

This is just the beginning. The bottom line will always take precedence over accuracy and justice. There will always be collateral damage with algorithms – it is inevitable. This example is not serious, and is easily redressed. But in the future world of big data, bureaucracy will increasingly rely on and trust in algorithms. Innocent people will be classified as terrorists by algorithm; innocent companies will be labelled as tax dodgers and investigated into oblivion; insurance will be allowed or disallowed by algorithms searching our medical records; travellers will be placed on watchlists because the algorithm says so.

I fear it is too late to stop it. We are already too far down the road. Society will be taken over by algorithms that will first reshape society, then control it, and ultimately destroy it.




Source: itsecurity.co.uk/2015/08/death-by-algorithm/
