Saturday, 17 March 2012

"Fabric theory": talking about cultural and computational diversity with the same words

In recent months, I have been pondering certain similarities between human languages, cultures, programming languages and computing platforms: they are all abstract constructs capable of giving a unique form or flavor to anything that is made with them or stems from them. Different human languages encourage different types of ideas, ways of expression, metaphors and poetry while discouraging others. Different programming languages encourage different programming paradigms, design philosophies and algorithms while discouraging others. The different characteristics of different computing platforms, musical instruments, human cultures, ideologies, religions or subcultural groups all similarly lead to specific "built-in" preferences in expression.

I'm sure this sounds quite meta, vague or superficial when explained this way, but I'm convinced that the similarities are far more profound than most people assume. In order to bring these concepts together, I've chosen to use the English word "fabric" to refer to the set of form-giving characteristics of languages, computers or just about anything. I've picked this word partly because of its dual meaning, i.e. you can consider a fabric a separate, underlying, form-giving framework just as well as an actual material from which the different artifacts are made. You may suggest a better word if you find one.

Fabrics

The fabric of a human language stems (primarily) from its grammar and vocabulary. The principle of linguistic relativity, also known as the Sapir-Whorf hypothesis, suggests that language defines a lot about what our ways of thinking end up being like, and there is even some experimental support for this idea. The stronger, classical version of the hypothesis, stating that languages build hard barriers that actually restrict what kinds of ideas are possible, is very probably false, however. I believe that all human languages are "human-complete", i.e. they are all able to express the same complete range of human thoughts, although the expression may become very cumbersome in some cases. In most Indo-European languages, for example, it is very difficult to talk about people without mentioning their real or assumed genders all the time, and it may be very challenging to communicate mathematical ideas in an Aboriginal language that has a very rudimentary number system.

Many programmers seem to believe that the Sapir-Whorf hypothesis also works with programming languages. Edsger Dijkstra, for example, was definitely quite Whorfian when stating that teaching BASIC programming to students made them "mentally mutilated beyond hope of regeneration". The fabric of a programming language stems from its abstract structure, not unlike those of natural languages, although a major difference is that the fabrics of programming languages tend to be much "purer" and more clear-cut, as they are typically geared towards specific application areas, computation paradigms and software development philosophies.

Beyond programming languages there are computer platforms. In the context of audiovisual computer art, the fabric of a hardware platform stems both from its "general-purpose" computational capabilities and the characteristics of its special-purpose circuitry, especially the video and sound hardware. The effects of the fabric tend to be the clearest in the most restricted platforms, such as 8-bit home computers and video game consoles. The different fabrics ("limitations") of different platforms are something that demoscene artists have traditionally been concerned about. Nowadays, there is even an academic discipline with an expanding series of books, "Platform Studies", that asks how video games and other forms of computer art have been shaped by the fabrics of the platforms they've been made for.

The fabric of a human culture stems from a wide memetic mess including things like taboos, traditions, codes of conduct, and, of course, language. In modern societies, a lot stems from bureaucratic, economic and regulatory mechanisms. Behavior-shaping mechanisms are also very prominent in things like video games, user interfaces and interactive websites, where they form a major part of the fabric. The fabric of a musical instrument stems partly from its user interface and partly from its different acoustic ranges and other "limitations". It is indeed possible to extend the "fabric theory" to quite a wide variety of concepts, even though it may get a little bit far-fetched at times.

Noticing one's own box

In many cases, a fabric can become transparent or even invisible. Those who only speak one language can find it difficult to think beyond its fabric. Likewise, those who only know about one culture, one worldview, one programming language, one technique for a specific task or one just-about-anything need some considerable effort to even notice the fabric, let alone expand their horizons beyond it. History shows that this kind of mental poverty leads even some very capable minds into quite disastrous thoughts, ranging from general narrow-mindedness and false sense of objectivity to straightforward religious dogmatism and racism.

In the world of computing, difficult-to-notice fabrics come out as standards, de-facto standards and "best practices". Jaron Lanier warns about "lock-ins", restrictive standards that are difficult to outthink. MIDI, for example, enforces a specific, finite formalization of musical notes, effectively narrowing the expressive range of a lot of music. A major concern raised by "You are not a gadget" is that technological lock-ins of on-line communication (e.g. those prominent in Facebook) may end up trivializing humanity in a way similar to how MIDI trivializes music.
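To make the MIDI lock-in concrete, here is a small Python sketch (the function names are mine, for illustration) of the standard mapping between frequencies and MIDI note numbers, which pins note 69 to A4 = 440 Hz. Any pitch that falls between the grid lines of twelve-tone equal temperament simply has no note number of its own:

```python
import math

def freq_to_midi(freq_hz):
    # MIDI note numbers quantize pitch onto twelve-tone equal temperament,
    # with note 69 fixed at A4 = 440 Hz. Rounding is where the lock-in
    # happens: in-between pitches get snapped to the nearest grid line.
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_freq(note):
    # The inverse mapping: each note number names exactly one frequency.
    return 440.0 * 2 ** ((note - 69) / 12)

# A microtonal pitch of 450 Hz has no note number of its own:
# it collapses onto note 69, i.e. plain A4.
print(freq_to_midi(450))                 # 69
print(midi_to_freq(freq_to_midi(450)))   # 440.0
```

Round-tripping 450 Hz through the mapping returns 440 Hz: the microtonal detail is not representable and silently disappears, which is exactly the kind of narrowing Lanier warns about.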

Of course, there's nothing wrong with standards per se. Standards, also including constructs such as lingua francas and social norms, can be very helpful or even vital to humanity. However, when a standard becomes an unquestionable dogma, there's a good chance for something evil to happen. In order to avoid this, we always need individuals who challenge and deconstruct the standards, keeping people aware of the alternatives. Before we can think outside the box, we must first realize that we are in a box in the first place.

Constraints

In order to make a fabric more visible and tangible, it is often useful to introduce artificial constraints to "tighten it up". In a human language, for example, one can adopt a form of constrained writing, such as a type of poetry, to bring up some otherwise-invisible aspects of the linguistic fabric. In normal, everyday prose, words are little more than arbitrary sequences of symbols, but when working under tight constraints, their elementary structures and mutual relationships become important. This is very similar to what happens when programming in a constrained environment: previously irrelevant aspects, such as machine code instruction lengths, suddenly become relevant.

Constrained programming has long traditions in a multitude of hacker subcultures, including the demoscene, where it has obtained a very prominent role. Perhaps the most popular type of constraint in all hacker subcultures in general is the program length constraint, which sets an upper limit to the size of either the source code or the executable. It seems to be a general rule that working with ever smaller program sizes brings the programmer ever closer to the underlying fabric: in larger programs, it is possible to abstract away a lot of it, but under tight constraints, the programmer-artist must learn to avoid abstraction and embrace the fabric the way it is. In the smallest size classes, even such details as the ordering of sound and video registers in the I/O space become form-giving, as seen in the sub-32-byte C-64 demos by 4mat of Ate Bit, for example.

Mind-benders

Sometimes a language or a platform feels tight enough even without any additional constraints. A lot of this feeling is subjective, caused by the inability to express oneself in the previously learned way. When learning a new human language that is completely different from one's mother tongue, one may feel restricted when there's no counterpart for a specific word or grammatical construct. When encountering such a "boundary", the learner needs to rethink the idea in a way that goes around it. This often requires some mind-bending. The same phenomenon can be encountered when learning different programming languages, e.g. learning a declarative language after only knowing imperative ones.
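As an illustrative sketch (in Python, chosen only because it can express both styles side by side), here is the same trivial computation written first imperatively and then declaratively. The point is not the task but the shift in mindset: from prescribing steps that mutate state to describing the result as a whole.

```python
# Summing the squares of the even numbers below 10, two ways.

# Imperative style: say *how*, step by step, mutating an accumulator.
total = 0
for n in range(10):
    if n % 2 == 0:
        total += n * n

# Declarative style: say *what*, as a single expression over the data.
total_decl = sum(n * n for n in range(10) if n % 2 == 0)

print(total, total_decl)  # 120 120
```

A programmer who has only ever written the first form may at first feel that the second form "hides the loop", until the declarative way of framing problems starts to feel natural in its own right.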

Among both human and programming languages, there are experimental languages that have been deliberately constructed as "mind-benders", having the kind of features and limitations that force the user to rethink a lot of things when trying to express an idea. Among constructed human languages, a good example is Sonja Elen Kisa's minimalistic "Toki Pona" that builds everything from just over 120 basic words. Among programming languages, the mind-bending experiments are called "esoteric programming languages", with the likes of Brainfuck and Befunge often mentioned as examples.
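To give a taste of how small the fabric of such a language can be, here is a minimal Brainfuck interpreter sketched in Python (my own illustrative version; real implementations differ in details such as tape size and end-of-input behavior). The entire language is eight single-character commands operating on a tape of byte cells:

```python
def brainfuck(program, stdin=""):
    """Interpret a Brainfuck program and return its output as a string."""
    tape, ptr, pc = [0] * 30000, 0, 0
    out, inp = [], iter(stdin)
    jumps, stack = {}, []
    for i, c in enumerate(program):        # pre-match the [ ] loop pairs
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == '>':   ptr += 1                       # move tape head right
        elif c == '<': ptr -= 1                       # move tape head left
        elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256  # increment cell
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256  # decrement cell
        elif c == '.': out.append(chr(tape[ptr]))     # output current cell
        elif c == ',': tape[ptr] = ord(next(inp, '\0'))   # read one input byte
        elif c == '[' and tape[ptr] == 0:
            pc = jumps[pc]                            # skip loop if cell is 0
        elif c == ']' and tape[ptr] != 0:
            pc = jumps[pc]                            # repeat loop if nonzero
        pc += 1
    return ''.join(out)

# This program computes 8 * 8 + 1 = 65 and prints the corresponding
# ASCII character.
print(brainfuck("++++++++[>++++++++<-]>+."))  # A
```

Even printing a single letter forces the programmer to think in terms of cell arithmetic and loops, which is precisely the mind-bending effect such languages are built for.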

In computer platforms, there's also a lot of variance in "objective tightness". Large amounts of general-purpose computing resources make it possible to accurately emulate smaller computers; that is, a looser fabric may sometimes completely engulf a tighter one. Because of this, the experience of learning a "bigger" platform after a "smaller" one is not usually very mind-bending compared to the opposite direction.

Nothing is neutral

Now, would it be possible to create a language or a computer that would be totally neutral, objective and universal? I don't think so. Trying to create something that lacks fabric is like trying to sculpt thin air, and fabrics are always built from arbitrarities. Whenever something feels neutral, the feeling is usually deceptive.

Popular fabrics are often perceived as neutral, although they are just as arbitrary and biased as the other ones. A tribe that doesn't have very much contact with other tribes typically regards its own language and culture as "the right one" and everyone else as strange and deviant. When several tribes come together, they may choose one language as their supposedly neutral lingua franca, and a sufficiently advanced group of tribes may even construct a simplified, bland mix-up of all of its member languages, an "Esperanto". But even in this case, the language is by no means universal; the fabric that is common between the source languages is still very much present. Even if the language is based on logical principles, i.e. a "Lojban", the chosen set of principles is arbitrary, not to mention all the choices made when implementing those principles.

Powerful computers can usually emulate many less powerful ones, but this does not make them any less arbitrary. On the contrary, modern IBM PC compatibles are full of arbitrary design choices stacked on one another, forming a complex spaghetti of historical trials and errors that would make no sense at all if designed from scratch. The modern IBM PC platform therefore has a very prominent fabric, and the main reason why it feels so neutral is its popularity. Another reason is that the other platforms share a lot of the same design choices, making today's computer platforms much less diverse than they were a couple of decades ago. For example, how many modern platforms can you name that use something other than RGB as their primary colorspace, or something other than a power of two as their word length?

Diversity is diminishing in many other areas as well. In countries with astounding linguistic diversity, like Papua New Guinea, many groups are abandoning their unique native languages and cultures in favor of bigger and more prestigious ones. I see some of that even in my own country, where many young and intelligent people take pride in "thinking in English", erroneously assuming that second-language English would be somehow more expressive for them than their mother tongue. In a dystopian vision, the diversity of millennia-old languages and cultures is getting replaced by a global English-language monoculture where all the diversity is subcultural at best.

Conclusion

It indeed seems to be possible to talk about human languages, cultures, programming languages, computing platforms and many other things with similar concepts. These concepts also seem so useful at times that I'm probably going to use them in subsequent articles as well. I also hope that this article, despite its length, gives some food for thought to someone.

Now, go to the world and embrace the mind-bending diversity!

Thursday, 21 August 2008

Past tense is evil

When talking about old video games and mature technology, it is quite common to use the past tense rather than the present one. People are more likely to say, for instance, that Uridium WAS a Commodore 64 game, than to state that it IS one.


In most cases, of course, this supposed mistake can be explained by the subjectivity of the point of view. The C-64 hardware and software have obviously disappeared from the speakers' subjective world and are now something that belongs entirely to their past.


Objectively speaking, however, stating that "Uridium was a Commodore 64 game" is just as wrong as saying that "George W. Bush was an American man". The statement about Dubya will not become valid until he either dies or changes his gender or nationality, and likewise, the statement about Uridium will not become valid until it is no longer possible to reconstruct the binary sequence that constitutes the Commodore 64 version of the game.


Yes, yes, but isn't this whole topic just meaningless nitpicking? How is the choice of temporal form supposed to matter so much?


Well, it is a known fact that language shapes the world, and a different choice of words promotes a different set of ideas and values. In my personal opinion, referring to a still-existing cultural artifact in the past tense promotes some quite ugly ideas, such as cultural disposability, planned obsolescence, pre-dictated consumption patterns and a general narrow-mindedness.


So, unless you deliberately want to promote these despicable and dangerous values of the dirty imperialists, you should definitely avoid referring to Uridium in the past tense.



Finding the sinners


Now that we have declared the sin, we need to find some sinners to bash and/or evangelize.


The first place I looked for sinful use of the past tense was, of course, the good old Wikipedia, the supposed mecca of objective and neutral knowledge.


In Wikipedia, I expected to find a lot of inconsistent use of temporal forms as well as many articles completely written in the past tense. However, I was positively surprised to find that it actually seems to prefer the present tense, at least for the more popular titles. For example, the Commodore 64 and Spectrum articles discuss the actual hardware platforms nearly uniformly in the present tense. There are still some problems, however, such as the VIC-20 article, which uses temporal forms inconsistently, but the overall state of the articles isn't nearly as bad as I expected.


Much more of this "past tense sin" can be found, surprisingly, on sites operated by people who actually play the old games, sometimes even with the original hardware. These people, some of whom are clueless enough to refer to disk and tape images as "ROMs", often use the past tense even when referring to the favorite platforms they still use. I call these people "retro-morons".



The retro-morons


As I have already mentioned in an earlier post, "retro" is a swearword. Reading an old book does not make me a "retrobooker", nor does watching an old movie make me a "retrofilmer". Still, playing an old video game is supposed to make me a "retrogamer". In my opinion, the mere existence of such a category just reflects how immature the computing culture still is compared to many other forms of culture.


By using the past tense, retro-morons not only admit that they consider themselves to be reliving the past, but also promote the kind of thinking where objects are bound to their respective "golden ages", that is, the periods of time when they were commercially exploited. Whenever the commercial lifeline of a product ends (after the magical finger-snap of some dirty capitalist pig), it moves from the "present tense world" into some kind of "past tense" or "retro" world. In this strange imaginary world, time is frozen still. No new things are possible with the "obsolete" technology anymore, and no fresh aspects can be found in anyone's creations. There's no creativity left, only endless collection, recollection and preservation.


The optimist in me hopes to see the "retro-moron" phenomenon as a temporary intermediate stage, a step towards the maturation of computing and gaming culture. In the future, I hope, old games and hardware platforms will become an integral part of our cultural heritage without being exclusively associated with certain periods of history. Just as it is possible to read a book written in the fifties without being a "fifties freak" or something, it will be possible for ordinary people to play an "eighties game" or to use an "eighties computer" without dwelling in eighties nostalgia at all.


There's still much work to do before this stage can be reached, however. So, fire up your Interweb browser and start the holy crusade!