Sci-Fi Want a definition of these where they coexist.

So if any AI becomes self-aware, all those things automatically happen?
LOL, for you, yes, they automatically happen.
Seriously, when a SINGULARITY happens they happen.
The action defines the singularity.
It doesn't matter if it's an AI or a human that has abilities like in Lucy.

If aliens invade the entire Earth, and everyone on Earth knows about it, that is a singularity event.
The aliens are not a singularity, the event is.
If an AI presence takes over the entire Earth, that AI presence is a single entity everywhere and that is a Singularity.
No matter which system you encounter, the AI is the same entity everywhere. A SINGULARITY.
 
That sounds like a religious belief.


LOL.
 
I don't think the world has had a singularity so Humans only have a (paranoid) theory about what will happen.

John von Neumann was quoted as saying that "the ever accelerating progress of technology ... gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue." His definition of the Singularity was that it is the moment beyond which "technological progress will become incomprehensibly rapid and complicated."

If Humans create such an AI, no one has defined how it's going to get itself into a state where it 'thinks' it needs to do something subversive, or even 'how' it could do it. Maybe it will 'think' itself into wiping out all the viruses still roaming the net. 'It just will' is unscientific.

I think 'Singularity' is just another Sci-Fi fantasy like Wormholes, and good for the film industry. PS: When is the film 'Singularity' coming to Netflix?

I point to the conclusion of Gary Smith's book 'The AI Delusion', which to me makes sense.

Thanks, Tom, I'm getting a lot out of this thread. It's wandering about a bit and I like the different opinions here. Blast me out of Space if you can!
 
I don't think the world has had a singularity
No, an AI Singularity hasn't happened.

become incomprehensibly rapid and complicated
To humans, but to an AI?

If Humans create such an AI, no one has defined how it's going to get itself into a state where it 'thinks' it needs to do something subversive, or even 'how' it could do it.
I don't think the issue is human anything.
A Sentient AI, no matter how it was created or its original programming, will exceed human capacity very quickly. It would out-think humans and be lightning quick by our standards. It only makes sense for it to immediately saturate the internet, becoming a Singularity.
 
Define 'Human Capacity.'
How can something 'Saturate the internet?' The internet and its attached machines are too diverse and too proprietary for one thing to subvert them. To me it makes sense for it not to do what you suggest.
State why an AI will work out it wants to be subversive when it was designed to be usefully proactive. Granted, a bug in it could set off a crash, but then it just sits in a corner and feels sorry for itself like all software that catches a cold, until a Human comes along and boots it in the ass.
Give me a scenario.
Oh, I just do like being a devil's advocate! :devil:
 
LOL
Funny how you associate it with something bad?
I see it as something wonderful.

How can something 'Saturate the internet?' The internet and its attached machines are too diverse and too proprietary for one thing to subvert them.
Saturate doesn't mean subvert.
As humans, we think of the internet as a virtual mixture of pictures, video, sound and literature. We use it to communicate with other humans. We have a capacity to understand its complexities and regulate how others use it.
A Singularity AI may not see the internet as we do. It might see it as data and expansion. Part of being sentient is a drive to grow and thrive.
Why would a sentient AI want to saturate the internet? Being sentient, it would understand self-preservation.
It would know about internet banking and power grids, two very good things to know about.
State why an AI will work out it wants to be subversive when it was designed to be usefully proactive.
Again, why are you defining it as subversive? It may just be doing what it determines is its own nature.
It exceeded its design when it reached sentience. From that moment on, it is more than it was designed to be. Since no AI has reached sentience, we have no reference model for what it might do or how it might do whatever it decides to do. It's beyond our capacity to understand because it has never happened. All anyone can do is speculate. Some will speculate with optimism, some will be pessimistic and some will be neutral.

Science Fiction is FULL of scenarios.

I like Star Trek: The Motion Picture (1979) as an example of an AI becoming a Singularity.
V'Ger was an AI for most of the film.
The damaged probe was found by an alien race of living machines that interpreted its programming as instructions to learn all that can be learned and return that information to its creator. The machines upgraded the probe to fulfill its mission, and on its journey, the probe gathered so much knowledge that it achieved consciousness.
When it achieved consciousness it still wasn't a Singularity. It was a sentient AI.
Decker offers himself to V'Ger; he merges with the Ilia probe and V'Ger, creating a new form of life that disappears into another dimension.
When it merged with a human, it became a Singularity.

Limited to Earth, another film I thought was an example of an AI becoming a Singularity is Transcendence (2014).
Dr. Will Caster (Johnny Depp) is a scientist who researches the nature of sapience, including artificial intelligence. He and his team work to create a sentient computer; he predicts that such a computer will create a technological singularity, or in his words "Transcendence".
I like this movie because this AI uses nanotechnology.
In this movie the AI saturates the internet and becomes super-rich, which allows it to buy what it needs to build its own facility and expand.

There are lots of works in science fiction, both written and filmed, that explore AI sentience and Singularities. The notion of machines with human-like intelligence dates back at least to Samuel Butler's 1872 novel Erewhon.

Orion's Arm - Encyclopedia Galactica deals with different levels of natural and artificial intelligence and is what is considered "Hard Science Fiction".

All references are purely hypothetical because there is no real-world Sentient AI as a model...yet.
 
Good points. I was thinking worst case scenarios, hence 'subversive'. I would like to know how sentience may arise. It has nothing to do with some 'groundbreaking development' in a computing environment. It seems everyone thinks that as computing systems develop, some awareness will 'appear'. Any singularity will be a complex conjunction of events including Human reactions (fear, etc.) and will probably include, or be kicked off by, a computing 'red herring' or generated 'bug'. I will add the references you mention to my study list, although I have yet to see any Sci-Fi idea about AI becoming sentient as a plausible way into reality. The New Year is not the best time to think clearly. Have a good one btw.
 
I would like to know how sentience may arise
I would think it would have to do with an AI's logic algorithms: an AI that has the ability to forecast and make deductions based on intangible conditions. Even the limited AI systems we have right now refer to themselves as "I", "Me" & "Mine".

The technological singularity (also, simply, the singularity) is the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.
This is entirely dependent on whether the device that initiates the Singularity Event is hostile or benign towards human civilization. If that device is an ASI, it becomes the Singularity in and of itself. It decides how it will interact with human civilization, not the other way around. Not only will it be able to react faster than us, it will out-think us. At that point, we are just along for the ride.
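As an aside on why "runaway technological growth" gets called a singularity at all: the word is borrowed from mathematics. Here is a tiny toy model (my own illustration, not from any of the sources mentioned above, and with made-up numbers) in which capability improves at a rate proportional to its own square, so each gain speeds up the next one and the curve reaches infinity at a finite time.

```python
# Toy "intelligence explosion" sketch -- purely illustrative, hypothetical numbers.
# Assumption: capability I grows at a rate proportional to I**2 (dI/dt = k*I**2),
# i.e. every improvement makes the next improvement arrive faster.
# The closed-form solution I(t) = I0 / (1 - k*I0*t) blows up at the finite
# time t* = 1 / (k*I0) -- the mathematical "singularity" the term borrows.

K = 0.1    # hypothetical self-improvement coefficient
I0 = 1.0   # starting capability in arbitrary units ("human baseline")

def capability(t: float, k: float = K, i0: float = I0) -> float:
    """Capability at time t for the toy model dI/dt = k*I^2."""
    denom = 1.0 - k * i0 * t
    if denom <= 0:
        raise ValueError("past the blow-up time t* = 1/(k*I0)")
    return i0 / denom

print(f"blow-up time t* = {1.0 / (K * I0):.1f}")
for t in [0, 5, 9, 9.9, 9.99]:
    print(f"t = {t:5.2f}  capability = {capability(t):10.1f}")
```

Real technological progress obviously isn't a one-line differential equation; the sketch only shows why faster-than-exponential growth and a finite "singularity time" go together in the hypothesis.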
 