
ollypenrice

Artificial intelligence?

Recommended Posts

Arising from another thread, I wonder what folks think of this concept? I am very hostile to the idea. I think we have so confused ourselves with the metaphorical language with which we discuss computers that anthropomorphizing them is almost inevitable.

Does a filing cabinet remember in its 'memory' the names and addresses, facts and figures contained in it?

We have devised computers to assist us in our mental tasks. That we have invented them in our own image is hardly surprising. That we see ourselves in them is less surprising still.

(I once had a fair ding-dong with a computer whiz on computer intelligence and he said, 'Your trouble is that you don't know enough about computers.' Of course, I was hoping for this since it allowed me the predictable response, '...and you don't know enough about intelligence.')

However, a sensible discussion would be interesting. Roger Penrose left me completely bemused on the subject, even though I think I agree with him.

Olly

Edited by ollypenrice


Surely for something to have artificial intelligence it would need to feel and understand the full range of human emotions involved in making real and informed decisions.

Maybe I am wrong, but A.I. can only happen through evolution, the same path that all lifeforms had to take.

Supposed A.I. can only really happen through human involvement, as in computer programming and telling it that it has to 'learn' from its mistakes. To me that is not true A.I., but just a simulation.

Does a computer have the ability now or in the future to feel anger, guilt, love and sadness?


Note: My previous statement is in no way meant to offend those few intelligent people who have degrees; there are, after all, always exceptions to the rules.

As for "real" artificial intelligence, well, I doubt it, but then I don't know enough about computers to comment further, other than to say that I dread the time coming when they do create it and the machines take over the world, enslaving the human race and using our bodily fluids as exotic lubricants.

Edited by yeti monster

Does a computer have the ability now or in the future to feel anger, guilt, love and sadness?

Wouldn't that be artificial emotion?


That's the whole point: it would only 'feel' it because someone programmed it to do so. As humans, we know when we are happy or angry; we know what it feels like. We don't rely on a computer programme to tell us what we should be feeling in given situations.

Edited by insomnia


In its original form 'intelligence' meant 'awareness of.' In a computer, what is aware of anything?

Olly


Ooo, this thread should get interesting and on a Sunday too.

It depends if you mean intelligence or sentience. Our brains, after all, are simple chemical computers; they work in a similar way to the computers we have created, but with a certain limit on storing information. We tend to forget the mundane things in order to free up storage for what we regard as important things to remember.

I know this is a simplistic look at the human mind, but bear with me. Computers, on the other hand, now have virtually no limit to memory capacity; a quick link to the internet will provide any info to learn from, and with this wealth of information to tap into, a computer would not need to remember much at all.

Which brings us to sentience: what is it? How did it come about? We still cannot answer that for ourselves. Will man-made computers become sentient? We are now getting to the point where we are relying on computers to create better computers and better code, and it will only be a matter of time (re: evolution) before the computer will say "NO" (a little nod to Planet of the Apes). Smirk.

As for feelings (anger, guilt, love or sadness), these are human traits and don't show whether a thing is sentient.

When the time comes, as they say in the movies "God help us all".


Well, a computer is aware that once the on button is pressed it's time to run its operating system. After that it's all human-controlled as to what it does next.

But saying that, it only knows that one thing because it's programmed to do so.

We don't rely on a computer programme to tell us what we should be feeling during given situations.

That's not entirely clear. There is a lot of information in our DNA which may well represent some sort of programming for the brain to follow, or at the very least form structures in the brain which correspond to the hardware required to run programs like "intelligence" and "emotion".

I'm still not sure we can define "intelligence" in any meaningful way, or even "self awareness" / "consciousness". You can train a flatworm - with a 9 neuron brain - to run a maze, provided the maze is sufficiently simple (one branch). Is it "intelligent" or even "self aware"? Rather less so than a modern computer running appropriate programs, IMHO. Is a dog "intelligent" or "self aware"? Undoubtedly. Is its "intelligence" similar to ours? No ...


I don't think that memory capacity is essential for A.I. or consciousness. It just needs to realise that it's alive and has the ability for independent thought and decisions.

This also leads on to sensory awareness, fight or flight. And once something has intelligence, surely it has a right to life, which means that if you turn it off then you are guilty of murder?

That's not entirely clear. There is a lot of information in our DNA which may well represent some sort of programming for the brain to follow, or at the very least form structures in the brain which correspond to the hardware required to run programs like "intelligence" and "emotion". ...

Did someone in a lab coat and coke bottle glasses insert this into our structure or was this as a result of human evolution?

What evolutionary stage is the computer currently at? Single-celled first lifeform?


It's a tough one and it all depends on your definition of intelligence. Would a Turing test satisfy your definition, or would you define intelligence as 'that which is man'? A computer could pass a Turing test based on what's called the Chinese room argument. Inside the room is a man with a huge filing card library. You ask questions by putting your written questions into a slot, and the man inside looks up the answers. To an outside observer the room seems pretty smart. That's AI as far as our technology can get as well: a series of rules or codexes that define operation.
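The filing-card room above can be sketched in a few lines of Python. This is purely illustrative: the rule book, questions and replies are all made up, and the point is that a bare lookup table produces apparently sensible answers while understanding nothing.

```python
# A toy "Chinese room": a pure lookup table mapping questions to
# canned replies. Nothing in here understands anything; the rule
# book does all the apparent work. (All entries are invented for
# illustration.)

RULE_BOOK = {
    "what is the capital of france?": "Paris.",
    "are you intelligent?": "That is for you to judge.",
    "how are you?": "Very well, thank you.",
}

def room_reply(question: str) -> str:
    """The 'man in the room' follows the rules blindly: normalise
    the slip of paper, look it up, hand back whatever card matches."""
    return RULE_BOOK.get(
        question.strip().lower(),
        "I'm afraid I don't have a card for that.",
    )

print(room_reply("What is the capital of France?"))
```

To an outside observer the answers look informed, yet the system is nothing but string matching, which is exactly the intuition the Chinese room argument trades on.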

I don't think we as yet can even define what intelligence is with any certainty, and that's quite probably always going to be so. It kind of fits in with Gödel's incompleteness theorems (which to my mind are the one universal constant :D ).

Programming? Aren't we all 'programmed' in many ways? School, parents, societal rules etc. act as the programming for us in many ways. I think the answer is yes, but what gives us that leap to create our own rules and internal logic? My answer would be 'the Other': something we can sense and see the evidence for, but cannot understand.

I think the anthropomorphic tendency is present in our relationships to all machines, and our 'science' does tend to elevate complex machinery to a level of mysticism at times. To me the most complex hardware is but a tool, but the mind of the dullest human is a miracle, touched by the divine.

Unless we understand what the 'God Stuff', 'The Other' or 'The Spark of Divinity' actually is, we are not likely to make progress.

Edited by Astro_Baby


Perhaps any artificially intelligent entity would (quickly learn to) APPEAR stupid. In human society, I sense it's much the best way... ;)

Perhaps we would know more about intelligence, if we understood how certain individuals were able to perform incredible feats of routine mathematical calculation i.e. replicate the actions of a typical computer. If we could acquire this technique, maybe the need for all this... "electronica" would lessen? :D

I was quite disturbed to learn that Willem Klein (see Wikipedia) had been murdered. Symptomatic? :D

Edited by Macavity

was this as a result of human evolution?

Yes. If a zebra can evolve a striped coat or a giraffe can evolve a long neck I don't see why we shouldn't have evolved programs enabling us to execute the responses we categorize as "intelligence". Uncomfortably for us, it does of course follow that the higher primates have much the same capabilities, at any rate in embryonic form.

What evolutionary stage is the computer currently at? Single-celled first lifeform?

A long long way ahead of that. At least equal to social insects ... and out-pacing the natural rate of evolution by many orders of magnitude.

it does of course follow that the higher primates have much the same capabilities, at any rate in embryonic form.

A fair while ago I watched a documentary in which a certain primate (can't remember the name), one which hadn't been in contact with humans, was discovered using primitive tools they sharpened on stones to pry open seeds or nuts, if I remember correctly.

Primates are fairly intelligent and, given another 300,000 years or more, may perhaps develop into a species we could sit in a bar and drink and chat with, or they could still be an evolutionary dead end.

The trouble with evolution is that it takes a long time and is the result of the need to survive: adapt or die.

I sometimes wonder if we were the lucky ones that got an evolutionary push in the right direction somehow otherwise we could be the apes in the zoo.

Oh, last time I heard we now have a computer with the intelligence of a cat. Can't wait until we develop one with the capacity of a dog. Work would never get done and it would spend all day licking its .... (ok, cheap joke) :D

Edited by astromerlin

it has been found that a certain primate (can't remember the name) which hasn't been in contact with humans was discovered using primitive tools they sharpened on stones to pry open seeds or nuts, if I remember correctly.

Chimpanzees have a variety of different methods of opening nuts with tools which they fabricate for the purpose. There are cultural differences between different groups, implying that the skill is learned rather than instinctive.

Captive bonobos have been trained to make stone tools; they have poor manual skills compared with humans, but the stone tools they make are indistinguishable from those attributed to our Australopithecine ancestors.

Don't forget that, if you roll your family tree and that of a living chimpanzee back 6 million years, all our ancestors are common.


'Our brains after all are simple chemical computers...'

Do we know this? Here we go again, assuming that our brains work like computers. I'm most unlikely to be tempted by Astro_Baby's touch of the divine, but I do wonder if there is not something we are missing in the brain, something she is hinting at in that phrase.

Looking through the thread, the 'back and forth' in language between computer and brain is all-pervasive. I think this is incredibly counterproductive. Evolution in computers? There isn't any, in the Darwinian sense. Darwinian theory leads to the idea of the blind watchmaker. The watchmakers of Silicon Valley are far from blind. The analogy is deeply flawed and reductionist.

Olly


Consciousness is still one of the biggest mysteries for science. There's still no real hard definition, and that makes it a very hard thing to test for. We all know we are conscious (well, most of the time!) and we assume that other people are too. But all the examples given above (computers, nematode worms, dogs, cats, amoebas, etc.) show that we're not quite sure where to draw the line... which is understandable if we don't yet know quite how to define it.

And consciousness and intelligence seem to be related...though different. So again we have a problem in giving a hard definition.

Now back to the original question...."Does a filing cabinet remember in its 'memory' the names and addresses, facts and figures contained in it?"

Answer: no, that's daft. It has things in it, but it doesn't have a memory, and anyway to 'remember' something is a quality of consciousness... unless of course we are anthropomorphising the filing cabinet, in which case we might as well ask if the filing cabinet is happy that we are having this discussion about it behind its back.

There are cultural differences between different groups, implying that the skill is learned rather than instinctive.

Agreed. Even though the idea is unthinkable, if you took a human newborn and isolated him/her from all human contact, would the child grow up to be similar to our primate cousins with basic skills, or just an uneducated child?

(no screams of horror here, just hypothetical)

'Our brains after all are simple chemical computers...'

I beg to differ: simple in what sense? Chemically, maybe, but in structure vastly more complex. More neurons than stars in the galaxy. IMHO the brain is subtle, perhaps too subtle for us to ever understand. Unlike other organs in the body it shows no trace of its purpose in its design (I use the word 'design' very loosely). Why should 2lbs of jelly-like flesh be able to think, feel pain, emote, worry, plan etc.?

When an evolutionist or behaviourist can demonstrate a reason for the human brain to be able to write and appreciate the music of Bach, the poetry of Wordsworth et al., maybe I'll believe them; that's what I meant about the divine. Sometimes when I hear Bach's music I can perceive the hand of God (or, if you prefer, the basic fabric of the universe; call it what you will). And there's the rub really: why and how?

Simple? I think not myself, and I don't think the answer lies in evolution either. There's no evolutionary requirement for intelligence at all as far as I can see. Evolution would push for bigger muscles, teeth and sharp claws. Put it this way: if a group of gorillas got together to design a super-ape, do you really think they would create humans? Early life on Earth would have been a competition for brute strength and sharp teeth, not the requirement to eventually be able to write an opera.

Actually I doubt computers are even at the stage of amoebas; they are constructs. They can't replicate by themselves, and personally I wonder if it's the replication ability which creates intelligence as we know it. There's some research that suggests our genes may pass on the parents' programming to us, a kind of inherited memory you might say.

I used to work in computers, and when I was younger I really thought we'd at least get to a HAL 9000 type of computer in my lifetime, even if it was done using the Chinese room. These days I tend to doubt it. Like lots of science it seemed easy at the time, but experience is always chastening (remember that in the 50s they thought they were going to get nuclear fusion working in double-quick time).

By the way, if you want a very deep read on AI and the issues, I'd suggest 'The Closed World' by Paul Edwards. It's heavy-going stuff but quite illuminating, and shows the folly of various scientific endeavours with regard to AI.

Edited by Astro_Baby


There are cases where this has happened... not intentionally! There are cases of wolf-boys in India where children were orphaned as babies and literally raised by wolves. (Maybe this is where the Jungle Book story came from?) I seem to remember an article about this in New Scientist a few years back. These cases have been studied for exactly the reasons you asked the question. Unfortunately I can't remember what the findings/conclusions were... d'uh!

Evolution in computers? There isn't any, in the Darwinian sense.

How about genetic algorithms? They are a direct result of Darwinian theory and have been shown to generate very good results when solving complex problems with many variables... And of course you could look at Moore's Law as a kind of evolution, with only the newest, fastest machines surviving. But all are ultimately replaced by an improved new generation.
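A genetic algorithm really is just selection, crossover and mutation in a loop. Below is a minimal sketch (the toy problem, parameter values and function names are all invented for illustration): a population of bit-strings evolves toward the all-ones string, a classic benchmark sometimes called OneMax.

```python
import random

random.seed(42)  # deterministic run for the example

GENOME_LEN = 20      # bits per individual (illustrative value)
POP_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.02  # per-bit flip probability

def fitness(genome):
    """OneMax fitness: the count of 1-bits; the optimum is all ones."""
    return sum(genome)

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def tournament(pop):
    """Darwinian selection pressure: of two random individuals,
    the fitter one gets to reproduce."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover: splice the parents at a random point."""
    point = random.randrange(1, GENOME_LEN)
    return p1[:point] + p2[point:]

def mutate(genome):
    """Occasionally flip a bit, keeping variation in the population."""
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in genome]

def evolve():
    pop = [random_genome() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "/", GENOME_LEN)
```

No individual is "designed"; fitter variants simply out-reproduce the rest, which is why these methods do well on messy many-variable problems where the search space is too large to enumerate.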

But I think perhaps the initial question here has been forgotten about... Olly asked what we thought of AI.

I think it's a fantastically interesting area of research, with some great potential, but it has suffered from many years of hype about all the amazing human-like things that could be created with it. In reality it's a lot more about making things efficient, optimising searches, and helping computers to classify things with "fuzzy" data. This is something that human beings can do with ease, but computers are all about black and white, and as we all know the world is rather grey...

So I like AI, for what it is. Not what sci-fi has made it out to be :D

And as others have already said, the definition of "intelligence" is varied depending on where you look. I found the following:

"capacity for learning, reasoning, understanding, and similar forms of mental activity; aptitude in grasping truths, relationships, facts, meanings, etc."

Based on that definition, modern implementations of AI tick a lot of boxes. But thinking that we're creating some kind of new life form, well, that's a whole other discussion...

Theres no evolutionary requirement for intelligence

Surely you don't mean that? An ability to solve problems is usually seen as quite a good evolutionary advantage.

Theres no evolutionary requirement for intelligence at all as far as I can see. Evolution would push for bigger muscles, teeth and sharp claws

I beg to differ. As evidence I present to you the human race :D Intelligence has meant that we don't need big claws and muscles, we think our way out of harms way. We survive and thrive, and go forth and multiply. A little too well perhaps, it's getting crowded around here...

