Sunday, March 30, 2008

Spiffy LaTeX script for Blogger!

I love the internet these days! A quick Google search can provide you with all sorts of nifty (although probably in the long run not so useful) information and tools. In this particular case (the one that inspired this blog post), I suddenly developed a hankering for the ability to use LaTeX for mathematical and logical notation on Blogger--not that I'll probably ever use it, mind you, and not that I'm even that familiar with LaTeX--but I digress. Blogger doesn't support LaTeX natively, but one swift search later and I'd discovered this Greasemonkey script which, after a very painless installation, sticks a nice friendly button onto the Blogger post editor. This button converts LaTeX notation into images and automatically uploads them into your post for manipulation. (This is all provided that you use Firefox. But of course you use Firefox! Just like any other self-respecting net citizen these days.)

Multipurpose example (to prove that it's working; to see how it looks with my blog colors; to increase my familiarity with LaTeX... or should I say, \LaTeX):
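
In text form, the notation fed to the button looks something like this (one plausible way of writing the formula, at any rate):

    \forall x \, \forall y \, \big( x = y \rightarrow \forall P \, ( P(x) \leftrightarrow P(y) ) \big)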



Or something. Loosely a formulation of Leibniz' Indiscernibility of Identicals. This isn't an ideal solution to notational problems (for example, if you want to make a change to a really lengthy expression, you'd better have copied the text version somewhere, because the script doesn't convert images back into text. It's also perhaps not so useful when writing away from my home computer, at least not if I want to see how the results turn out). But man, it's still pretty awesome to have something this convenient. Très cool!

As a thanks to the author, here's a link to his/her blog (linking there is what he/she recommends by way of thanks): http://servalx02.blogspot.com/

Thursday, March 20, 2008

So, what, it's just a fractal after all?

[Image: a stained mouse-brain cell shown side by side with a simulated view of the universe]

Click the above picture for a larger version with more details about what's actually being shown. Still, you should be able to make out the words labeling the left image "Brain cell" and the right image "The Universe," and you should be immediately struck by the startling similarity between the two. The brain cell is from a stained slide of a mouse brain, and the universe picture is from a computer simulation portraying how we think the universe developed (it must be a simulation, of course, since we can't very well photograph the universe from outside, nor travel back in time to witness its formation—or so we think).

These two pics were apparently put together by a David Constantine for a New York Times article (or something? that's where the image is hosted, at any rate), although I cannot find the article itself. The "universe" screenshot is from the Millennium Simulation, an international project meant to visually model the universe's development at an unprecedented level of detail and realism. The aforementioned site includes download links for the simulation video, but here's a YouTube link for those who want it.

What do we make of this striking resemblance betwixt neuron and universe? Well, part of me wants to immediately go off on an excited rant about how this confirms all these things I keep noticing about looping/circularity/recursion/self-reference/things-building-upon-themselves, and perhaps there's some sort of mystic significance to be drawn from it all.

But my more cautious side—which, generally speaking, holds more sway for me in these matters—has a few things to say. First and foremost, this is just a simulation; we have no genuine fact of the matter about what the universe, taken as a whole, looked like then or looks like now. Second, this was a simulation designed by humans, creatures who thrive on centralized, hierarchical methods of understanding nature. Seeking out (and, for that matter, imposing) structured hierarchies in nature is one of our most revered conceptual tools, sometimes to the hindrance of our own knowledge—for example, we spent the longest time trying to identify "pacemaker" or leading/guiding cells to account for the slime mold's self-organizational abilities before Evelyn Fox Keller and Lee Segel showed how slime molds group together without centralized direction (see Steven Johnson's Emergence: The Connected Lives of Ants, Brains, Cities, and Software for a very readable description of this and many related topics). I find it very likely that, to whatever extent the universe simulation falls short of perfect accuracy, its designers will have incorporated their own biases into their work, favoring simulations that mimic centralization and hierarchy.

Now, for all that, I still grant that the universe demonstrates this kind of arrangement in a number of different non-biological places as well: atoms have nuclei around which electrons orbit, planets and stars form from molecules accumulating around one point, planets orbit about stars, stars orbit about black holes—or whatever-the-hell is in the center of our galaxy. (Note that it pays to be cautious with these analogies: thinking of an atom in terms of planets revolving around a sun can lead to a number of unfortunate misconceptions).

Yes, systematized and centralized thinking is often very helpful.

The problem is, much like the search for theoretical unification and the willful employment of Occam's razor, perhaps it causes us to overlook other things, and see hierarchies where they may not necessarily exist.

(As a final point, there are quite a few images from the universe simulation one can select; given that there are probably thousands of neuron images out there too and given the, ahem, nebulous nature of the astronomical simulations, it can't be that hard to find a few that coincide fairly well.)

Wednesday, March 19, 2008

Discouraging Thoughts

I recently began reading Roger Penrose's The Emperor's New Mind. It's a very fascinating book that covers (directly, tangentially, or in passing) many of the topics that I'm interested in these days: artificial intelligence, Turing machines, Gödel's theorem, formalism in mathematics, quantum mechanics, fractals, complexity, predictability, etc., etc. In fact, it's a lot like another excellent book that I'm trying to get through at the moment, Douglas Hofstadter's Gödel, Escher, Bach, both in terms of related topics and in how the author draws from many different sources to drive toward an eventual point about the mind (the impossibility of strong AI and/or the irreducibility of the mental to the physical in the case of Penrose, and the self-referential nature of consciousness in the case of Hofstadter. Or so I believe, since I haven't gotten too terribly far in either of them yet, and I'm only dimly aware of what their conclusions will be).

Anyway, the problem: I was skimming through the latter parts of The Emperor's New Mind to see what lies ahead, and for a pop-science book, it contains a bewildering number of frightening equations (aghh! Look at all the ψ's, ω's, and other intimidating esoterica!). Now, I'm not as put off by this as the average reader might be, but still, I'm worried about my ability to adequately grasp this kind of thing, particularly given my currently less-than-pleasing math knowledge. And, moreover, I'd genuinely like to be able to understand these sorts of things. But do I have the patience? Do I even care enough? How can I learn these things outside of a classroom environment?

I'm also slightly concerned that the math-heavy sections of the book will be like roadblocks for my reading: that is, I will procrastinate getting through them since I'll keep thinking, "I want to understand this, so I'll just come back to it when I'm not feeling so tired, when I'm more in the mood to concentrate, etc., etc." And then, six months later, I still won't have made any progress in the book. Just some things for me to watch out for.


Etymological fact of the day: the word 'rapt'—to be engrossed or enraptured—derived from the past participle of 'rape,' which originally denoted 'seizing' or 'carrying off' without necessarily denoting non-consensual sex acts. Indeed, 'rapt' was occasionally used to indicate being carried from Earth to Heaven (as, perhaps, a prophet or mystic might be taken by God; or as one might be struck rapt in the midst of a vision). 'Rapture,' unsurprisingly, shares a very similar derivation, and this, I presume, is why the modern Christian doctrine of The Rapture has the name it does. The word 'ravish' also derives from the same distant root (Latin rapere); it has a near-identical meaning—to carry off forcibly, particularly in the case of a woman. But, 'ravish' has also been used in the sense of transport from Earth to Heaven, much the way 'rapt' has been, and so 'ravishing' is a complimentary term because it means something like 'enchanting' or 'entrancing.'

The frightening thing is that 'ravish' is still a synonym for our modern sense of 'rape' today. And so, telling a woman that she looks 'ravishing' seems to indicate that she looks rape-able. The Oxford English Dictionary (OED) does not list anything like this in its usage history of the word, and I generally consider the OED the final authority on all things language-related, but still... kinda makes you think.

Monday, March 17, 2008

Another note about content

Even my more in-depth posts do not spend a lot of time directly addressing specific scientific or philosophical issues. General trends are great fun to write about, but I would like to write at least a few essays that focus on some particular problem, or that respond directly to some contemporary writer. Also, my posts still wander all over the place--this is good in that I see connections between disparate subjects, but it must be frustrating for the reader to follow.

I find it vastly amusing that I am currently thoroughly uninterested and unmotivated in my classes, yet I have such an interest in writing my own academic-quality work. Or something.

I'll see what I can do.

The Aftermath of Absurdity: H?

The title of this post ("The Aftermath of Absurdity: H?") contains an oblique reference to the futurist philosophy known as transhumanism. Transhumanism, as the Wiki link shows at the time of this writing, is occasionally represented/symbolized by the characters >H or H+ to indicate progressing beyond humanity in its current form (which is presumably symbolized by H). I will use H?, then, to indicate a curiosity or uncertainty about what exactly a human is in the first place, before we go about augmenting it.

What does it mean to be human?

It means to be a finite being {limited to subjectivity; prey to irrational impulses; hampered by the physical world} with aspirations toward divinity {sub specie aeternitatis; pure rationality; transcending physicality}. Even those of an atheistic persuasion frequently seek this divinity in one way or another--and, in fact, the atheist strives to see through God's eyes much more often than the theist, since the theist normally considers the very thought blasphemous {Lucifer went astray when he desired God's position; humanity was punished when it built a tower meaning to ascend to the heavens; and, of course, original sin is the very product of seeing through God's eyes--the serpent tells Eve that knowledge of Good and Evil will make her godlike, and this is indeed what condemns humanity}. When the scientist desires to understand the operations of the universe beyond our immediate ability to perceive {knowledge of particles; the constituents of stars; DNA; the functioning of the body}, when she desires a simple system of laws from which everything else may be derived {a Grand Unified Theory}, she is desiring to transcend her senses (and all that which is given immediately and simply) to discover the true nature of reality--something which, presumably, only a god would have direct access to. What was the pre-human world like? How was it formed? What "makes" a plant grow? Does the universe exhibit counterfactual definiteness? If not, is Laplace's demon an impossibility? Would a god be subject to Heisenberg's uncertainty principle, or to the observer effect?

The aspiration to know this, coupled with the apparent impossibility of truly knowing, is one of the things that makes the human absurd. So many humans for so long have wanted to know what goes on "behind the scenes," and wanted to transcend this paltry, unreliable chunk of biological flesh and bones. Science/technology is presumably our best bet to facilitate this transcendence: it allows us to cleverly sneak around the limitations placed upon us by Nature, augmenting our vision through microscopy and telescopy, detecting and analyzing electromagnetic waves beyond our senses' ability to register, measuring quantities which we could never observe unaided. And, with the deepening of our knowledge, the greater becomes our ability to construct devices which manipulate nature for our own ends. This is the transhumanist goal, and, to a lesser extreme, the intention of nearly every technological endeavor since the dawn of time: harness natural forces so that we may prolong our life, ease our suffering, enable our own enjoyment.

Ever since we first realized that we could more regularly and readily find food if we planted seeds in the right kind of earth, that we can use sticks and rocks and other things to help us hunt and defend ourselves, that we can warm ourselves with animal skins and leaves, we have been on this path. The path which will make us God, perhaps? The path toward perfection?

Perhaps not. Perhaps I again assume too much about the rest of humanity, and I ascribe lofty, grandiose ambitions where they may not be entertained. Perhaps most people want simple, material/social comforts; they are not concerned with knowledge, or even with "transcending" the physical body through virtual reality and cybernetic augmentation. But I think they must be, otherwise why would the promise of heaven be so enticing? Why else would television and computer/console games enjoy the intense, sometimes addicting popularity that they have?

But then, perhaps it is only philosophers who dream this way.

What does it mean to be a philosopher (what does 'P?' mean)?

Among other things, the philosopher examines presuppositions which underlie our most fundamental beliefs. She makes the implicit explicit. She wanders the borders of human thought, heroically grappling with those speculative concepts which lie at the outer limits of our ability to reason, attempting to "make sense" of it all. ("Make sense" is an appropriate way to describe the process: humans often rely on metaphors derived from our senses when attempting to apprehend an abstract concept. See Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being by George Lakoff and Rafael E. Núñez for some fascinating examples of how humans tend to map abstract mathematical concepts onto familiar experiential concepts.)

Now, the curious thing about philosophizing is its notorious "arm-chair" method of inquiry. In contrast with the empirical sciences, philosophy presumes to discover knowledge in a peculiar manner: reasoning built off of common intuitions, supplemented and refined by 1) the arguments of other philosophers, both those from the past and those contemporaneous; and 2) the discoveries handed down from the sciences. Rather than going out into the world and poking about, setting up controlled environments and acquiring measurements to discover regularities, the philosopher sits atop a mountain of academia, arguing vociferously about the ultimate truths of possibility and necessity. Truths applicable, presumably, to all modes of knowledge and all realms of inquiry--that is to say, truths applicable universally.

Early science - natural philosophy

This has not always been the case: during the Modern Era, "natural philosophy" gained prominence with its new-fangled focus on experimentation. A number of noteworthy philosophers either contributed directly to what we now call science or influenced it strongly with their theories, such as Descartes, Kant, Sir Francis Bacon, and Leibniz. Isaac Newton's legendary tome that laid the groundwork for classical physics was titled Philosophiae Naturalis Principia Mathematica, or "Mathematical Principles of Natural Philosophy." Newton and the other members of the Royal Society certainly considered their work to be "natural philosophy," and the continued use of "Philosophical Transactions of the Royal Society" as the name of the longest-running science journal in existence is a testament to that attitude. This was by no means exclusive to the Modern period: Aristotle may have been one of the world's first biologists, for all that his conclusions were rife with what we now know are inaccuracies; Copernicus and Galileo no doubt considered themselves philosophers; etc., etc.

However, with the rise of natural philosophy and its subsequent successes came a devaluation of regular philosophy. By the 19th century (or perhaps the early 20th at the latest), "natural philosophy" had separated even further from traditional philosophy; it was hereafter known as "science." Strong borders began to appear between the two, spurred on by the anti-metaphysical, pro-empirical agenda of the logical positivists. Since then, science, along with every other field in academia, has undergone a radical process of specialization: we have the natural sciences of physics, chemistry, and biology, then the social sciences of psychology and sociology (and perhaps economics and political science, depending on where one draws the line). Finally, we have mathematics and computer science, which are hardly empirical, yet they are of such a systematic nature and of such relevance to science proper that they often fall under the general category of "science." This segregation of subject matter seems to have arisen as a method of shared labor, a divide-and-conquer strategy: as scientific knowledge accumulates, it becomes inefficient--perhaps impossible--for one person to stay abreast of current research, and impossible furthermore to devote one's own time to experimentation and theorizing across the many facets of science at once. Specialization in academia, not surprisingly, mirrored the socio-economic specialization that sprang forth during the Industrial Revolution in the form of division of labor, mechanization, and streamlined factory assembly. Not that it was a new concept: Plato's Republic, Hume's Treatise, and Adam Smith's Wealth of Nations (and no doubt other sources) advocated specialization as the key to efficiency; and, indeed, natural selection itself preceded every thinker through the specialization of cells within an organism and the specialization of individuals within a pack or colony. However, the exponential, near-simultaneous growth of technology, population, science, and the humanities in the last two hundred years exquisitely highlights the role specialization has played, and it is doubtful we would have made the progress we have without it.

Specialization, in conjunction with cooperation, is a wonderful thing which enables synergy--a mysterious emergent property resulting from the pooling and interaction of individual components. Unfortunately, specialization has its drawbacks too: namely, the walls which develop between the expert and the non-expert. Mathematics is a perfect example; from what I hear, mathematics is perhaps the most inaccessible field even to expert mathematicians: at the highest levels of specialization, there might be some ten or fifteen people in the world capable of fully understanding what a given paper tries to prove. A mathematician who studies one niche branch of mathematics may be completely lost when faced with another.

So what does this have to do with philosophy? Well, philosophy was the mother of all inquiry--rational speculation began here, but academic subjects splintered off into child fields that have since gained their own prominence. In the case of the sciences, that prominence now dwarfs philosophy, such that philosophy is the "handmaiden of science" at best, and useless dialectical gobbledygook at worst. And the inaccessibility that is a byproduct of specialization exacerbates the divide by making it difficult for science to communicate with non-scientific disciplines (see C. P. Snow's The Two Cultures for a notorious take on the gulf between the science culture and the non-science culture).

So, I suppose I am interested in what relevance philosophy has to non-moral matters. Is there any point to philosophers talking about science and mathematics when many scientists and mathematicians pretty much ignore us? If we have ascertained that philosophy does not give results the way that science does, what role does it play for us? I am interested in science and mathematics, but I have neither the training nor the time required to get a full grasp on what the experts are doing. Should this concern me? Is there anything that can be done about it?

Is there a way to be a better/improved/augmented philosopher ('P+'), and what is the relationship between 'P?' and 'H?'?

Friday, March 14, 2008

New Site Banner

For some reason I thought it would be a grand idea to make a new banner for this blog, even though I've got all the design ability (and sensibility) of an orangutan. Or maybe less. Maybe orangutans would actually make fabulous web designers, who knows? Anyway, I'll probably keep tweaking it a bit (and the site's background colors) since they're a bit less than perfect at the moment. It looks kinda amateur right now, but at least it's more original than a default Blogger template, right?

I always marvel at this peculiarity: when I evaluate something for aesthetic cohesion, I'm quite adept at making judgments about (what are for my tastes) good or bad color choices, good or bad fonts, relative spacing, clarity, composition, etc. However, when I'm actually making a new piece of design (or visual art), I find it so difficult to get things to cohere properly. I can't quite find that right set of colors which will make everything gorgeous, I can't quite find the right shapes, etc., etc. Same problem as any artistic endeavor, I suppose--lacking the necessary experience, I'm simply bad at it until I get a chance to develop more.

Anyway, the painting on the left side is from the talented Stella Im Hultberg, used without permission... because I'm too lazy to email and ask, and I don't know that it matters that much on a blog that no one visits.

Also, about the site in general: I'm concerned at the growing number of "frivolous" posts I'm making here. I really intend for this blog to stay on topic (whatever that topic happens to be), so I promise I'll get out some new posts soon that make serious efforts to be worthwhile.

Wednesday, March 12, 2008

A Completely Irrelevant Observation

From the on-campus coffee/convenience store, I just bought a bottle of Guayaki Yerba Mate mint-flavored tea, a pint of Ben & Jerry's oh-so-maddeningly-delicious Mint Chocolate Cookie, and a few bags of Moroccan mint tea.

Apparently, I'm experiencing some kind of intense need for mint lately, and I hadn't even realized it. Mostly I was amused that these were the only items I bought, and I didn't even notice their prominent theme until a few hours later.

I also keep buying these little honey sticks with added mint flavor that the store sells.


Monday, March 10, 2008

Something About Pride

One of the reasons, I think, that adults often find it more challenging to learn new things than children do is that adults generally possess quite a number of expectations about what should be easy for them. There are certainly other factors: adults already have a solidified framework for their preexisting knowledge, so any new information must be fit into this framework somehow, and sometimes the fit may not be perfect. There are also brain growth and adaptation differences between the age groups—the brain culls unused neurons as a child develops, so certain capacities get cut if they are never utilized. Consequently, an adult who never, say, learned to speak as a child may lose that ability forever (there are numerous cases of so-called feral children who suffer severe language and social impairments when returned to society).1 Today, however, I'm interested in pride and how pride interferes with learning.

My own experience: I am currently taking a statistics class, and I am very aware that the mathematics and reasoning involved are quite rudimentary—the prerequisite is just one semester of calculus, and there have been only one or two places where knowledge about, say, integrals has been helpful. It's not as though basic calculus is a very difficult subject anyway, compared to God-knows-what-else higher-level mathematicians study (hyperbolic geometry? abstract and linear algebra? combinatorics?). Now, most of the time I do find it easy to understand and use the concepts in statistics, but every once in a while I get stuck on a particular problem or issue. And, at these points, my venomous pride comes into play: knowing that the material is simple, I berate myself all the more for not picking it up easily. I have to wonder, though: is this ever helpful? Such self-castigation is counterproductive: I feel more frustrated and upset with myself, which in turn makes it more difficult to focus on the work, which again makes me upset and frustrated — and before you know it, we have a vicious cycle (a positive feedback loop) spiraling out of control.
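
(The sort of place where an integral does show up: with a continuous distribution, a probability is just an area under the density curve, P(a ≤ X ≤ b) = ∫_a^b f(x) dx. For the standard normal curve, f(x) = (1/√(2π)) e^(−x²/2), whose antiderivative has no elementary closed form, which is why those probabilities come from tables rather than by hand.)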

Now, I began this post by speaking as though pride is a bigger hindrance for adults than for children (and it no doubt is). But, as it turns out, children are not always free from its detrimental effects either.

Studies by Claudia M. Mueller and Carol S. Dweck indicate that complimenting children on their intelligence can actually have damaging rather than beneficial effects on their self-esteem and willingness to address difficult problems. This happens if the praise emphasizes ability rather than effort put forth.

From their article's abstract:

Contrary to this popular belief, six studies demonstrated that praise for intelligence had more negative consequences for students' achievement motivation than praise for effort. Fifth graders praised for intelligence were found to care more about performance goals relative to learning goals than children praised for effort. After failure, they also displayed less task persistence, less task enjoyment, more low-ability attributions, and worse task performance than children praised for effort. Finally, children praised for intelligence described it as a fixed trait more than children praised for hard work, who believed it to be subject to improvement.2

Part of the very real problem is that children (and, of course, adults too) want to feel and look smart, particularly if they have been led to believe that they are smart. Sometimes this will incline a child to favor those activities which are easiest, in order to maintain their smart image. After all, surely only ungifted plebeians have to actually struggle in order to learn things — not like those brainiacs who breeze through the most abstruse material with the greatest of ease. Here's another quote from the article's introductory discussion:

For example, Dweck and her associates have demonstrated that children who hold performance goals are likely to sacrifice potentially valuable learning opportunities if these opportunities hold the risk of making errors and do not ensure immediate good performance (Elliott & Dweck, 1988). That is, "being challenged" and "learning a lot" are rejected in favor of "seeming smart" by children who subscribe to a performance orientation (Mueller & Dweck, 1997).

In other words, children who are overly focused on doing well (rather than trying hard) are reluctant to try new things, particularly if failure might compromise their image as "intelligent." Unfortunately, opening oneself to the possibility of failure is a very necessary part of trying any new activity: closing oneself off from potentially valuable but unfamiliar experiences may wind up stunting one's intellectual growth. (I plan to write more on this subject in the near future.)

Another fascinating point that Mueller and Dweck bring up is how the way children are praised can affect how they conceptualize intelligence. Children praised for their good performance (rather than their hard work) tend to see intelligence as a fixed, static trait rather than something which can be developed and improved. Something along the lines of, "either you're born with it or you ain't" — which seems to be an unfortunately all-too-common view for the average person. Failure and success are then seen in the light of ability rather than effort.

Society really doesn't help with this. We venerate the myth of genius, the prodigy, the wonder-child who's "always right" through some innate, Nature-given powers. However, I think the reality is that these cases (if they can even be said to exist) are very, very, very rare. So rare, in fact, that I strongly suspect that the vast majority of people who are thought of as "geniuses" really did not possess much more in the way of "raw brainpower" or "talent" than the average person. Thomas Edison said, "Genius is one percent inspiration and ninety-nine percent perspiration" — and if you don't believe a prolific inventor of his caliber, who can you believe? Vincent van Gogh worked like a fanatic at honing his painting skills: he may have been talented to begin with, but his brilliance owed at least as much to his hard work and perseverance. If one examines Albert Einstein's and Isaac Newton's lives, one does not necessarily find early displays of talent so much as early displays of intense, self-motivated interest in understanding the world.

To return to my own problems with my statistics course: clearly it does not benefit me to get worked up over my own failings. High standards can be very helpful as something to aim toward, but I do not see any compelling reason to allow rampant self-flagellation in the process. The key is to dedicate energy and thought toward growth: be aware when you fall short of your target, but why be cruel to yourself about it? Does needless criticism help you to grow more? I doubt it. When I worry that I am dense for not comprehending a subject easily, I am distracting myself from what is really important — namely, that I engage the material, attempt to absorb it with an open mind, and enjoy the experience.

The same ought to apply to children: maintain high expectations, but only if you can present these expectations in a friendly, encouraging manner. Teach them to enjoy struggling — putting forth effort — rather than to chase gold stars and A+'s. Teach them not to fret so much about intellectual pride.


1 Actually, feral children are sometimes able to develop language skills later in life, but the process can be slow and difficult. Children returned to society before the age of about 7 seem to have the best chances of recovery. http://www.feralchildren.com provides more information, with a number of references to published sources.

2 "Praise for Intelligence Can Undermine Children's Motivation and Performance". Journal of Personality and Social Psychology, 75, No. 1 (1998) : 33-52.

Monday, March 3, 2008

This Is Beauty

Theo Jansen's kinetic sculptures:

[Embedded videos: Jansen's wind-powered creatures in motion]

Here's a clip of a talk he gave for TED (Technology, Entertainment, Design).

[Embedded video: Theo Jansen's TED talk]

Theo Jansen is an engineer who produces the most astonishingly beautiful "creatures": ambulatory and wholly wind-powered, they possess the most rudimentary self-preservation abilities. Through ingenious design and cunning craftsmanship, they glide about on many cycling limbs and are able to respond (in a limited way) to changes in their environment — such as the encroaching high tide or an approaching storm. All this without the use of electronics or a guiding computer. Well, no "computer" in the popular sense of the word. Actually, any device which exhibits the proper computational functionality should indeed be considered a computer; so it is perfectly accurate to say that Jansen's automata are guided by computers, albeit very simple ones. See, for example, the latter of the above videos, where he describes a mechanism which monitors the automaton's distance from the sea — the device counts in binary, just as an electronic computer does.
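
For the curious, here is a toy sketch in Python of what "counting in binary" amounts to in this setting. Every name and number in it is my own invention, of course; Jansen's actual mechanism is built from air and plastic tubing, not code.

    # Toy model of a binary step-counter, loosely in the spirit of
    # Jansen's tide-avoidance mechanism. All names and numbers here
    # are hypothetical; this is an illustration, not his design.

    class BinaryStepCounter:
        """Counts steps using a fixed number of on/off valves (bits)."""

        def __init__(self, bits=8):
            self.bits = bits
            self.count = 0

        def step(self):
            # Advance one step, wrapping around like a mechanical counter.
            self.count = (self.count + 1) % (2 ** self.bits)

        def valve_states(self):
            # The on/off pattern of the valves, most significant bit first.
            return format(self.count, f"0{self.bits}b")

    # A creature walks inland, counting its steps; once the count reaches
    # a threshold, it "decides" it is far enough from the water to turn back.
    counter = BinaryStepCounter(bits=8)
    SAFE_DISTANCE = 50  # steps; an arbitrary figure for this sketch

    while counter.count < SAFE_DISTANCE:
        counter.step()

    print(f"Turned back after {counter.count} steps; valves read {counter.valve_states()}")

The analogy is loose, but the point stands: a row of two-state valves is a register, and ticking it forward step by step is computation.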

I must confess to feeling some perhaps foolish sentiments regarding these automata. I want them to "live" as though they were cells, somehow. Perhaps not fulfilling the usual biological criteria for life, but I want to see them developed so that they can better guide themselves, perhaps repair and sustain themselves. I find their spindly-yet-graceful movements so wondrously, eerily beautiful — perhaps all the more so knowing that it is the wind which propels them.

They consume so little and they are more or less harmless; they are gently mesmeric and captivating in their geometric splendor.

I want to see colonies of them deployed onto an otherwise barren planet, where they could simply exist and wander about, free from potential biological scavengers/predators. They would still have to contend with the corrosion of the elements — perhaps it would be possible for them to reproduce, to some degree? I don't even need to see them there, I just want to know that they are out there, exhibiting life-like behavior while powered so clearly by a non-living source. Perhaps extraterrestrials or some distant descendants of the human race would eventually find, marvel at, and speculate about them in curious awe.

Forgive me, for I am but an uneducated philistine in the realm of art; yet I cannot help but pronounce that Theo Jansen must be one of the great artists of our time.
