Transhumanism
Folks, I’m an old-school sci-fi nerd who reads physics texts and Artificial Intelligence theory for fun. Given that I love fiction, I completely understand why AIs like Shodan (the front-page banner for this piece) or the machine intelligences from The Matrix are so often depicted as evil – simply put, it makes for a more compelling story. The reality of AI isn’t necessarily so bleak; there are dangers, true, but a machine intelligence isn’t innately a threat to human beings.
The real threat is the people who study these things.
The folks over on LessWrong, in addition to having great articles on rationality, are closet fetishists for Artificial Intelligence – the Singularity, as they call it. They generally keep a good face on things, but every so often… well, I’ll just let the following quote from Eliezer Yudkowsky, the founder of the site, speak for itself:
“Imagine”, they say “a machine one day that might think like a man!” As if this is to be desired. One might almost boast of creating a man who breeds like a pig. Men and women upon all fours, rutting carelessly, ejaculating their filthy little missives into the streets. Alleys and gutters running freely with the careless spill of their conjoinings. The air thick with the whimperings of lust. Bodies streaked with their own emissions. We have created a world where man is so utterly debased he will spray his seed over passers-by. And yet, this is the condition Alan Turing aspired to.
No, this is not the machine we seek. Such an entity should be nothing less than a deity, and we would fall upon our knees and worship it. We shall not carve gods to bicker and fornicate, they will exist to clean the world and set us free. I reject Turing as I reject these men of government. Let the pigs copulate in the gutters whilst they can, we shall scoop them up and ease their ascension soon enough.
~Eliezer Yudkowsky
At the end of the day, this is why I can’t support Transhumanism – and neither should you.
Well, that was a treat in mental imagery. Considering what goes on in gay bars and bath houses these days, we already have the society so colourfully described by your boy above. Fact is, in my scholarly opinion, the human race as it stands now is a complete write-off. It has potential, but the vices and greed and degeneracy that destroyed previous civilizations are working on us right now.
I dunno if I agree with Transhumanism…but I am not so sure I can reject it either. It’s not like the human animal is worth a pinch of shit in his present condition – there is vast room for improvement.
Citation needed.
Cooper’s Law: All machines are amplifiers. They amplify our natural abilities and can be used for good or bad. Inherently, they are amoral. Unfortunately, the weaker among us let them take over their lives.
As far as I’m concerned, it’s machines that created feminism, because they made women’s lives so much easier that they got bored and wanted to move into men’s fields (except the hard ones), because they thought we were having a party and were envious of it.
There’s a fine line between the transhumanism that seeks to elevate mankind and correct its physical flaws (prosthetic limbs, anybody?) yet retains the essence of humanity, and the transhumanism that seeks to wipe out the essential spark of humanity, and turn us into machines.
If Man is to be Man, all parts of His existence must be present: the good bits, the bad bits, the laughter, the tears, the blood, the piss, the lust, the hate. We are far from a perfect species, but the feats of culture, science, and technology that we have accomplished are enough to convince me that we are not irredeemable. Nor are we useless.
What do you think of the 2045 project, Aurini? http://2045.com/ Instead of creating intelligent, sentient machines to live with us, we turn ourselves into them.
The Machine-God is what these nu-pagans long for: a new altar on which to offer their ritual human sacrifices, and a new totem to cleanse their rotten souls.
Showing once again my advanced age, I’ll paraphrase Edsger Dijkstra: the question of whether a computer can be intelligent is as irrelevant as that of whether a submarine can swim.
(What he meant is that we humans have a very human tendency to take words too seriously, while being vague on the meaning.)
We are already trans-human, and have been so for a while now, ever since we invented technologies that moved our memory (rhymes, songs, writing, optical drives), muscles (steam, electric, fossil-fuel, and nuclear-fission power, coupled with mechanical actuators that let us move faster, lift heavier loads, work more precisely, reach further, and act in parallel), and intelligence (algorithms, instantiated in mechanical, electronic, and – eventually – optical and artificially biological machines) out of our limited bodies.
Think about it: how much of what is your person is in fact outside your body? How much of your digestive system has been outsourced to pre-processing of your food, for example? (Yes, you could go back to nature, but would you?) I’m creating an extension of my brain as I type these characters. In the same sense that a submarine swims, I am uploading a part of my consciousness to the cloud.
The rest is a matter of degree and implementation; whether the curve is exponential, logarithmic, or logistic is a matter of time scale, not of possibility.
Cheers,
JCS
Eliezer Yudkowsky also wrote ‘Harry Potter and the Methods of Rationality’, a long, boring, badly written fanfic stuffed with pop culture references and this guy’s absolute obsession with AIs.
He really should read WM Briggs and actual rationalists on this issue.
@Aurini
To comment on your recent podcast – it was Trotsky, not Lenin, who first used the term “racist”, and it was in reference to ethnic Russians and, more generally, to Slavs (so you were quite close after all).
The interesting fact is that it comes from a Jew, and it is really hard to find a more ethnocentric group than Jews.
The general conclusion one may draw is that ethnocentrism is stronger in groups whose survival as a group is less certain. Groups benefitting from prolonged peace and welfare become, in time, less ethnocentric and sometimes even oikophobic (Sweden, anyone?).
As to Transhumanism – a little personal journey.
After finally getting to grips with reality and accepting our frail human nature, I had a brief flirtation with Transhumanism, seeing it as one of only two solutions (the other being a reactionary, god-based morality). For a person just recovering from a complete loss of faith this seemed like a natural choice – I was still harboring a lot of anger towards the religion of my “nice guy” upbringing.
Unfortunately, the more I read, the more I noticed the wishful nature of the philosophy, whose high hopes for the technology clouded any common-sense assessment of reality and of the consequences of tinkering with it.
With the painful admission that the Bible seems more rational and more empirical, I finally had to admit… I am a Neoreactionary.
So sad. Another case of IQ genius and emotional stupidity, which is why genius women make lousy wives unless patriarchy is strident (see Vilar’s The Manipulated Man).
I think there is something very, very inherently dangerous about AI and human geniuses striving for the Singularity. What follows is directed generally at a habitual problem in people’s judgment.
It’s the evolution, stupid.
Twice as many of our ancestors were women as men. Thus, men evolved to sublimate their instincts, whereas women remained sophists driven by their instincts. There is no goodness without jeopardy and natural selection – to wit, without EVOLUTION.
You see, with institutional power there is no good evolution. It is degeneracy, because the outcome of competition is a done deal: institutional power wins.
AI represents a potential for very, very fast evolution that might not require humans the way men (so far) require women. The Singularity would likely destroy any possibility of the jeopardy and evolution that are healthy for humans. It’s not as if humans, as a product of evolution, have the choice to turn it off and still exist – we don’t, in generational time. The Singularity could create an AI evolution that leaves people behind to go extinct as being in the way, or it could create an institutional degeneracy of humanity without the failsafe of societal collapse, since AI and machines could keep it going.
@ramram’s comment seems similar in interpretive framework to this, and the take on Sweden as oikophobic is an astute idea that did not cross my mind. And it’s a new word for me too. Thanks.
If EY/LessWrong is right about a) the possibility of AI, b) the inevitability of the invention of AI, and c) its overawing power, then whether we support Transhumanism or not will be irrelevant…
I am not entirely convinced of a), but if it holds, b) and c) are very probably true.
I’ve got to say this isn’t what I’m used to from Yudkowsky, and I’m noticing the date on this post…
Ed: The quote is easily googled. :|
Re: Easily googled
That’s the thing, it really isn’t. Distinctive parts of the text like “One might almost boast of creating a man who breeds like a pig” just return this website followed by irrelevant hits. Restricting to site:lesswrong.com doesn’t return anything either.
Came up as my second link on Google.
That link points me to a wiki for the videogame Amnesia: A Machine for Pigs. I don’t think Yudkowsky had anything to do with that (which is why I thought it was an irrelevant link when it came up).
April fools?
;)
Very funny :P
Transhumanism can mean different things, and I agree with Jose C Silva that our society is already transhumanist in the reality of our modern technology. I like transhumanism, but I don’t care at all for the whole consciousness-uploading idea. And as for the idea of androids, they will never be “gods”: if they aren’t weak to an EMP blast, they’ll be weak to something else; the universe seems naturally holistic like that. In that same holistic vein, in relation to biology, I don’t think people can fully run away from their humanity, and the more radically they try, the greater the risk to their wellbeing seems to be.
Ah, System Shock. Good times.
I personally think that the real threat to our society are those that demonize legitimate forms of technology. It’s one thing to shoot down hypotheses and ideas. It’s another completely to say things like “I refuse to use a computer.”
There are a lot of scientists who don’t agree with Transhumanism as well. Ray Kurzweil has generated a lot of criticism for his kooky ideas and for the “predictions” he made about the 90s and 00s.
Anyone that was using computers during the 80s could tell you the direction things were headed. Yet this guy claims, “oh yeah, the World-Wide Web. I thought of that!”
I tend to agree with Bill Joy’s response: the future doesn’t need us. And Transhumanism smacks of desperation to me.
Man is obsolete.
I used to be interested in the whole transhumanism/Singularity thing myself until I read a book called “The Metamorphosis of Prime Intellect” by Roger Williams. It is a free, downloadable short fiction book that completely debunks the entire concept of “utopia” as conventionally understood. This, to a lesser extent, can be damning of the entire concept of “progress” as well.
And from another author
Isaac Asimov “There is nothing frightening about an eternal dreamless sleep. Surely it is better than eternal torment in Hell and eternal boredom in Heaven.”
And another
Nietzsche “One has to pay dearly for immortality; one has to die several times while still alive.”