
"The Singularity is Near"

Started by Ultima22689, September 01, 2009, 04:14:59 AM


Ultima22689

So I'm curious to know what fellow atheists think about the future and the colorful theories of the likes of Ray Kurzweil and other futurists. Oddly enough, I'm aware that my nature is to be very optimistic about things. For example, I recently moved to Jacksonville, FL, had no choice in the matter, and have hated every moment of it, but I'm optimistic because my benefactors hate it here as much as I do and are seriously considering making the move to Southern California. But I digress. To simplify, Ray's theory is that technology advanced at a fairly steady rate until some 40 to 60 years ago, when the rate of advance began increasing rapidly, and that the 21st century is supposedly when things get really crazy with the development of self-aware A.I., which will then push us to blend our biology with our own technology, leading to the "Singularity," where everyone will be so smart that we begin advancing in ways we could never imagine. Here, this will explain it better, I'm sure:


http://www.kurzweilai.net/articles/art0 ... rintable=1

The funny thing is that people have equated his theories to religion, which I thought was peculiar. Sure, they're a tad grandiose, but he's using empirical data to explain them.

-43-

The singularity is really a giant "what if." As someone who participates in robotics, I get far too many questions that link the singularity with Sarah Connor, and as a (rusty) programmer I can state fairly confidently that Asimov-style robotics is the realm of fantasy. Real robots deal with their intended application, nothing more, and as long as programmers avoid trying to replicate morality, such a scenario is unlikely. Sorry, got into tangent mode.

Ultima22689

So do you think it's possible that, as we develop more advanced algorithms which lead to yet more advanced algorithms, we might be able to create self-aware A.I.? What separates this from other science fiction that has come true or is beginning to emerge? A.I. isn't a matter of hardware but of advanced software, right? Correct me if I'm wrong; I'm merely curious about it. What reason could there be for our software to remain as it is?

Heretical Rants

CADIE will now manage your email for you



QuoteTwo Gmail accounts can happily converse with each other for up to three messages each. Beyond that, our experiments have shown a significant decline in the quality ranking of Autopilot's responses and further messages may commit you to dinner parties or baby namings in which you have no interest.

Whitney

I think that if intelligence can evolve from non-intelligence, then we can eventually develop A.I. I just don't have an opinion on how soon that will be possible. I think A.I. would lead to very complicated moral questions as it became more and more "human."

Heretical Rants

Quote from: "Whitney"I think A.I. would lead to very complicated moral questions as it became more and more "human."
This statement makes me think of Chobits and Bicentennial Man.

Ultima22689

I see, that makes sense. I guess that is the best way to approach it, huh? It's just an opinion, and I don't have any facts to back it up, but I think it will happen sooner or later; I'm going to go ahead and attribute that to my optimistic nature.


As for Heretical Rants' post, what does that have to do with Ray Kurzweil's theories? Last I checked, he isn't saying that if you don't send him X amount of money then it won't happen, or anything of that nature. I don't see the "scam" here coming from the singularity theory itself. If there is one, would you mind indulging me?


EDIT: If I misunderstood your post I apologize.

Thom Phelps

I think we'll destroy ourselves long before the singularity can be achieved. Not a very "happy atheist" thought, so accept my apologies, but I am cynical regarding mankind's willingness to destroy everything for the sake of ideology.

It only takes a few people who want to bring about the end of the world to actually accomplish it. And the means to do so are growing.

Ultima22689

Quote from: "Thom Phelps"I think we'll destroy ourselves long before the singularity can be achieved. Not a very "happy atheist" thought, so accept my apologies, but I am cynical regarding mankind's willingness to destroy everything for the sake of ideology.

It only takes a few people who want to bring about the end of the world to actually accomplish it. And the means to do so are growing.


Sadly, this is very plausible; the world is in a very volatile state right now. However, we aren't necessarily guaranteed to destroy each other, nor is it particularly likely; it is very possible, but not outright likely. Once again, that is an opinion and can't really be backed up (I can't back it up, at least). Assuming all the crazies out there don't kill us all, I think the world will only get better. It seems that as the years pass, the population slowly grows a tad more intelligent as older generations die off (not to sound callous). This seems to be marked by the increase in atheism and the way theism is dropping off.

AlP

I think Kurzweil is exploiting the properties of the exponential curve. You could make pretty much any point in history look like a "Singularity" was about to happen with the right scale. If anyone doesn't understand, let me know and I'll try to plot some tomorrow.
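
Roughly the kind of plot I have in mind (just a sketch, assuming Python with numpy and matplotlib; the 5% growth rate is an arbitrary number I picked, not anything taken from Kurzweil):

[code]
# Plot the same exponential over three different time windows.
# Whatever window you pick, the curve looks like a flat stretch followed by
# a "knee" that is about to blow up -- the shape is scale-invariant, so any
# era can be made to look like it sits just before a Singularity.
import numpy as np
import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
windows = [(0, 10), (0, 100), (0, 1000)]  # three different "historical" spans

for ax, (lo, hi) in zip(axes, windows):
    x = np.linspace(lo, hi, 500)
    ax.plot(x, np.exp(0.05 * x))  # the same growth rate in every panel
    ax.set_title(f"time 0 to {hi}")
    ax.set_xlabel("time")
    ax.set_ylabel("capability")

plt.tight_layout()
plt.show()
[/code]

Every panel shows the same hockey-stick shape; only the y-axis scale changes, so the "knee" always sits near the right-hand edge, whichever era you plot.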
"I rebel -- therefore we exist." - Camus

Will

It's nearly impossible to speculate about how AI will choose to think. A common assumption is that AI will think like us, but that would seem to defeat the point.

I just really hope we don't end up being one of these:
I want bad people to look forward to and celebrate the day I die, because if they don't, I'm not living up to my potential.

curiosityandthecat

If you want a feasible (at least, in my opinion) idea of how AI would think, read Peter Watts' Rifters trilogy. Amazing sci-fi and a really good look at extremely well-researched theoretical AI construction.

When AI comes around, it's most certainly not going to be Wintermute or Data. Actually, a few years back, the military thought they were developing AI by teaching automated tanks to fire on enemy tanks. It seemed to be working rather well; the computers were "learning." It turns out they weren't teaching them to attack enemy tanks at all, just to recognize ambient light conditions. One small shed that is lit up and looks like a tank, and... boom.
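
For what it's worth, here's a toy sketch of that failure mode (the data, numbers, and features are entirely made up, nothing to do with the actual system): if the lighting happens to correlate with the label in the training photos, a simple classifier will happily learn the lighting and ignore the tank.

[code]
# Toy demonstration of a classifier latching onto a spurious feature
# (brightness) instead of the intended one. Assumes numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Training set: tank photos were all taken on dark days, empty scenes on bright days.
has_tank = rng.integers(0, 2, n)
brightness = np.where(has_tank == 1,
                      rng.normal(0.3, 0.1, n),   # cloudy-day tank photos
                      rng.normal(0.7, 0.1, n))   # sunny-day empty scenes
other_features = rng.normal(0, 1, (n, 5))        # stand-in for "real" image features
X_train = np.column_stack([brightness, other_features])

clf = LogisticRegression().fit(X_train, has_tank)

# Test set: same task, but now the lighting no longer tracks the label.
has_tank_test = rng.integers(0, 2, n)
X_test = np.column_stack([rng.normal(0.5, 0.2, n), rng.normal(0, 1, (n, 5))])

print("train accuracy:", clf.score(X_train, has_tank))      # near-perfect
print("test accuracy: ", clf.score(X_test, has_tank_test))  # roughly a coin flip
[/code]

Training accuracy comes out near perfect and test accuracy near chance, because the only thing the model ever learned was the weather.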

I do love me some Kurzweil, though. The Age of Spiritual Machines is a mindbender.  :headbang:
-Curio

karadan

Quote from: "Thom Phelps"I think we'll destroy ourselves long before the singularity can be achieved. Not a very "happy atheist" thought, so accept my apologies, but I am cynical regarding mankind's willingness to destroy everything for the sake of ideology.

It only takes a few people who want to bring about the end of the world to actually accomplish it. And the means to do so are growing.

We've had nuclear capability since the forties. We're still here, aren't we?

I don't just think AI is possible, I think it is inevitable, given the pace at which we are coming to understand the nuances of consciousness and to mimic how the brain works. Besides, how will we distinguish a real thinking machine from one which can pass a Turing test 100% of the time?

I'm an optimistic futurist :)
QuoteI find it mystifying that in this age of information, some people still deny the scientific history of our existence.

karadan

Oh, and this is awesome:

[youtube]http://www.youtube.com/watch?v=EzjkBwZtxp4[/youtube]
QuoteI find it mystifying that in this age of information, some people still deny the scientific history of our existence.

jbeukema

Quote from: "karadan"Oh, and this is awesome:

[youtube]http://www.youtube.com/watch?v=EzjkBwZtxp4[/youtube]


If you move the bow, can it compensate based on the sounds coming from the instrument?