Crosbie Fitch, in the Atom feed summary for this post looking at how 'freedom' can and should be defined, says:
You see copyright’s suspension of your freedom to perform particular activities, and so for each activity you demand a specific freedom. This is how the GPL arose. This is an inverted perspective from which to define ‘free culture’ (and free software). To define freedom you define its constraints – you do not enumerate the freedoms you want. This is because freedom is what we start off with in the first place. We constrain it to make it better. It is when we under or over-constrain it that we make it worse.
It's the phrase "To define freedom you define its constraints – you do not enumerate the freedoms you want" that especially stands out to me. This seems such an important principle, yet one which so many politicians entirely ignore when they talk about their commitments to 'human rights'.
Am I being overly simplistic to equate this to the contrast between a 'planned' society - where everything is banned unless specifically permitted in an enumerated list of freedoms - and an 'evolving' society - where everything is permitted unless specifically banned? (Also: how does the contrast between codified Roman law and 'evolving' common law compare to this?)
Whatever the political and legal comparisons might be, the principle is certainly pertinent to the rise of architectures of control in technology. Until just a few years ago, most technology was effectively 'open', assuming you could get hold of it. All of us had the freedom to do what we wanted with it - take it apart, modify it, repurpose it, improve it, break it - even if the originators had never expressly intended anything like this, and even if it were 'illegal'.

Now, though, we have (some) technology into which intentions can be codified. We have products with hyper-restrictive End-User Licence Agreements which we must accept before we use them, and which can report back if we don't abide by them. We have products which are intended to provide one-function-and-nothing-but-that-function, and are designed to frustrate or punish users who try anything different. We have politicians seeking to specify exactly what technology can and can't do. How do I know what freedoms I want until I've experimented? How can I even explain them until I've experienced them? Should the progress of tomorrow really be shackled by registering as law the prejudices and errors of today?
Of course, in the context of this blog, I'm merely striking the key-note once again, and that can make for a very dull tune. But that phrase, "you do not enumerate the freedoms you want," will stay with me. It's important.