Science Forums

If Consciousness Is A Function Of Neurons ?



I cannot agree that cerebral waves are all we are.

Not only waves, but also the means to transform them, as we already have for other kinds of waves. When we analyze the information contained in the light from galaxies, we use specific apparatus to reflect it, refract it, record it, diffract it, absorb it, concentrate it, and so on. If brain waves contain the information, then the brain should possess the corresponding specific apparatus to analyze its own waves. When we concentrate on an idea, a specific apparatus could do the equivalent of what a telescope or a microscope does. When we think, a specific apparatus could reflect an idea while comparing it to others by interference.

 

Waves might be all we have in mind, but it would also take a very complex machine to analyze their information.
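
To play with that analogy for a moment: here is a minimal, purely illustrative sketch in Python (the signal and numbers are invented, and this is not a claim about how the brain actually works) of how an "apparatus", here a Fourier transform, can pull hidden components out of a composite wave, much as a prism separates the light from a galaxy:

import numpy as np

# A composite "wave": two hidden components, 8 Hz and 21 Hz,
# standing in for information carried by an oscillating signal.
sample_rate = 1000                      # samples per second
t = np.arange(0, 2.0, 1.0 / sample_rate)
signal = 1.0 * np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 21 * t)

# The "apparatus": a Fourier transform that separates the wave
# into its component frequencies.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)

# Report the two strongest components the analysis recovered.
strongest = freqs[np.argsort(np.abs(spectrum))[-2:]]
print("Recovered component frequencies (Hz):", sorted(strongest))

Running it prints the 8 Hz and 21 Hz components back out; the point is only that extracting information from a wave takes a dedicated analysis step, exactly the kind of "machine" being described here.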


I'm not sure how I can say that I "feel" the same as you do. What evidence do you have that different people "feel" the same way, evidence that you couldn't also find in an artificial brain?

I don't have any evidence to support the idea that different people feel the same way about anything. However, I can surmise that a particular set of circumstances, say being shot at by an unseen enemy, might elicit the thought to duck. Would the artificial brain immediately assess the situation the same way, or would it analyze the situation a fraction longer, maybe to figure out a way to find the shooter, and thus get its artificial brain exploded all over its nice blue synthetic suit!


Not only waves, but also the means to transform them, as we already have for other kinds of waves. When we analyze the information contained in the light from galaxies, we use specific apparatus to reflect it, refract it, record it, diffract it, absorb it, concentrate it, and so on. If brain waves contain the information, then the brain should possess the corresponding specific apparatus to analyze its own waves. When we concentrate on an idea, a specific apparatus could do the equivalent of what a telescope or a microscope does. When we think, a specific apparatus could reflect an idea while comparing it to others by interference.

 

Waves might be all we have in mind, but it would also take a very complex machine to analyze their information.

And that's my point: it would take an operator's manual that contained the very essence of life in all its glory. What are we talking about here? An AI with an artificial brain, as pgrmdave suggests, or an entity that can feel spiritual nuances (whatever that is)?

 

I can understand that "we have the technology...", and that we might use it to create a being with synthetic skin or something better, body parts that are very much like a human being's, and an artificial brain that functions every bit like a human brain, but what do we want to create? Do we want to prove that no GOD exists, and that, given the knowledge, we can create a being just as well, except that it won't have a soul, if there is such a thing?

 

The question asked at the beginning was: is consciousness a function of neurons? I think that what we have determined so far is that neurons and cerebral waves constitute brain activity and awareness to a degree, but is there not a spiritual connection? I say there is. I can't prove it, but neither can you disprove it. And if that is the case, what are we trying to create?


By creating an artificial intelligence that works the same as a natural one, we would prove that we have understood how a natural one works. We would thus be better able to repair our natural one, and I think we would know better how to behave, because we would finally know why we have the behaviors we do.


In an ideal world, yes. But I've seen really good ideas that would benefit mankind get taken and adapted for military purposes (better killing machines), which corrupts the purpose.

 

Aside from that, I think creating an AI that understands the human process to its fullest is a great idea, but it would still be flawed. It would be flawed because the programming would be flawed. We're not perfect, so could we actually create a perfect AI? And say we could, how far can we go with this perfect AI?

 

I think some of the same problems with this arose with clones.


By creating an artificial intelligence that works the same as a natural one, we would prove that we have understood how a natural one works. We would thus be better able to repair our natural one, and I think we would know better how to behave, because we would finally know why we have the behaviors we do.

 

"Works the same" is the problem. Any minor difference in how it works may have a major impact on the perceived behavior of the system.

 

If you duplicate it exactly though, you just have a copy and it's as mysterious as the original, because there are no differences: your focus was solely on making the copy exact, not understanding what makes it work.

 

Moreover, the exact same "equipment" produces geniuses and ignoramuses, wise men and charlatans, saints and murderers. There's something more there than just the sum of its parts.

 

Of course we computer scientists call that "software".... 
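
To put that point in concrete terms, here is a minimal illustrative sketch in Python (the tiny network and the numbers are invented, not anyone's model of a brain) of how the exact same "equipment", an identical architecture, gives completely different behavior once the "software", the weights, differs:

import numpy as np

def tiny_brain(weights, stimulus):
    # Same "equipment" every time: one fixed two-layer architecture.
    hidden = np.tanh(weights["w1"] @ stimulus)   # first layer of "neurons"
    return np.tanh(weights["w2"] @ hidden)       # second layer

stimulus = np.array([1.0, -0.5, 0.25])

rng = np.random.default_rng(0)
# Two sets of "software": identical shapes, different values.
individual_a = {"w1": rng.normal(size=(4, 3)), "w2": rng.normal(size=(2, 4))}
individual_b = {"w1": rng.normal(size=(4, 3)), "w2": rng.normal(size=(2, 4))}

# Identical hardware, identical input, very different responses.
print("A responds:", tiny_brain(individual_a, stimulus))
print("B responds:", tiny_brain(individual_b, stimulus))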

 

 

 

If the human mind was simple enough to understand, we'd be too simple to understand it, :phones:

Buffy


By creating an artificial intelligence that works the same as a natural one, we would prove that we have understood how a natural one works. We would thus be better able to repair our natural one, and I think we would know better how to behave, because we would finally know why we have the behaviors we do.

What does it mean to "work the same", though? How would you know that it worked the same? What would you measure to know whether you had an AI that worked the same way our brains work?

 

 

 

In an ideal world, yes. But I've seen really good ideas that would benefit mankind get taken and adapted for military purposes (better killing machines), which corrupts the purpose.

 

Aside from that, I think creating an AI that understands the human process to its fullest is a great idea, but it would still be flawed. It would be flawed because the programming would be flawed. We're not perfect, so could we actually create a perfect AI? And say we could, how far can we go with this perfect AI?

 

I think some of the same problems with this arose with clones.

What does "perfect AI" even mean?  Something that's never wrong? Something that thinks every thought that could be thunk?  Something that predicts the future with 100% accuracy? 


Hi Buffy,

 

I think that intelligence is bound to understand itself, because it is meant to solve problems that may affect its survival, and as we can see with pollution and climate change, it is currently menacing its own survival.

"Works the same" is the problem. Any minor difference in how it works may have a major impact on the perceived behavior of the system.

If this kind of AI worked like a natural intelligence, it would have the same differences in capacities, the same differences in character, the same differences due to environment.

If you duplicate it exactly though, you just have a copy and it's as mysterious as the original, because there are no differences: your focus was solely on making the copy exact, not understanding what makes it work.

Duplicating the neurons and the synapses is one thing, but duplicating the mechanism that can produce brain waves is another, and finding and duplicating the mechanisms that can manipulate those waves in the brain is yet another.

Moreover, the exact same "equipment" produces geniuses and ignoramuses, wise men and charlatans, saints and murderers. There's something more there than just the sum of its parts.

An AI brain doesn't need to be more intelligent than a natural one to help us understand intelligence; it only has to be built the same way.


What does it mean to "work the same", though? How would you know that it worked the same? What would you measure to know whether you had an AI that worked the same way our brains work?

If you build an AI with artificial neurons and synapses, connect them the same way they are connected in the brain, and it becomes intelligent, then it means that it works the same. But you have to know how the brain works first, because the artificial neurons and synapses have to work the same as the natural ones, which are biological, not electronic. For instance, if artificial neuron pulses traveled at the speed of light, it might not work, because it might be far too fast for the AI brain to perceive what we want it to learn. If you think at the speed of light, how can you learn what is presented to you at the speed of a natural brain? Remember, if we want an AI individual to become intelligent, we have to be able to teach it. A child without any human language does not become as intelligent as one who has been taught one.
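
As a purely illustrative sketch of why the speed of the pulses matters, here is a textbook leaky integrate-and-fire neuron in Python (the parameters are invented, and this is not a model anyone in this thread has proposed). The very same neuron, given the same five input pulses, fires or stays silent depending only on how quickly they arrive:

import numpy as np

def lif_spike_count(pulse_times, sim_time=0.2, dt=1e-4,
                    tau=0.02, threshold=1.0, pulse=0.4):
    # Leaky integrate-and-fire neuron: count spikes for a given pulse schedule.
    steps = int(round(sim_time / dt))
    inputs = np.zeros(steps)
    for t in pulse_times:                      # drop each pulse into its time bin
        inputs[int(round(t / dt))] += pulse
    v, spikes = 0.0, 0
    for i in range(steps):
        v += (-v / tau) * dt + inputs[i]       # leak plus incoming pulses
        if v >= threshold:                     # membrane crosses threshold: spike, then reset
            spikes += 1
            v = 0.0
    return spikes

# Same five pulses, same total input; only the spacing differs.
fast = [0.010, 0.012, 0.014, 0.016, 0.018]     # pulses 2 ms apart
slow = [0.010, 0.040, 0.070, 0.100, 0.130]     # pulses 30 ms apart
print("tightly spaced pulses ->", lif_spike_count(fast), "spike(s)")
print("widely spaced pulses  ->", lif_spike_count(slow), "spike(s)")

The tightly spaced pulses make the neuron fire; the widely spaced ones leak away before the threshold is ever reached. Timing is part of how such a network works, which is why simply running everything faster is not guaranteed to give the same behavior.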


If you build an AI with artificial neurons and synapses, connect them the same way they are connected in the brain, and it becomes intelligent, then it means that it works the same, but you have to know how the brain works first

 

 

By creating an artificial intelligence that works the same as a natural one, we would prove that we have understood how a natural one works

 

This is circular, though: you're saying that if we know how the brain works, then we can prove we know how the brain works by making something that works the way we think it works. That logic will lead to the same conclusions whether you have a correct or a flawed theory of the mind.


I did not say that we knew how the brain worked, but that we could test a hypothesis about it by building one.

 

If, after seeing a bird fly, you think that you could fly if you had wings, and you build something with wings that can fly with you on it, is that circular reasoning? In other words, aren't we able to recognize what is intelligent and what is not?


I think so, apart from the retractable wings, but those are not what flying depends on. In fact, we have succeeded in flying faster than birds, so why wouldn't it be possible to build an AI that is faster than our own intelligence?

 

Do you really think that a bird's flight and an airplane's flight are the same? Building an airplane tells you nothing about how a bird flies, any more than building a hot air balloon teaches you how a propeller works. Building an AI that's faster than our brains isn't a problem, but saying that *because it's faster* it *must be similar* simply isn't true. We've got really good AI now (Watson comes to mind), but that doesn't mean it in any way, shape, or form shows that we know how brains work.


What do you think of my proposal about brain waves containing the information, and of the comparison with light carrying the information from galaxies, which is a kind of memory that lasts for billions of years?

I think it's potentially possible that different configurations of firing neurons could produce enough slight but measurable variance in the electromagnetic waves generated for them to be recorded and interpreted, but the waves don't "contain information" any more than a computer's hard disk "contains information". The bits on a hard disk are completely meaningless until there's a program designed to read and interpret them in a particular way. Electromagnetic waves generated through brain activity might be able to be interpreted and read, but they only contain meaningful information once we add in the context.
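
A minimal, purely illustrative way to see that hard-disk point in Python (the four bytes are arbitrary) is that the very same bits mean entirely different things depending on the program reading them:

import struct

# Four arbitrary bytes sitting on a disk: no meaning on their own.
raw = b"\x41\x42\x43\x44"

# Three different "programs", three different interpretations of identical bits.
as_text = raw.decode("ascii")             # read as characters: 'ABCD'
as_int = struct.unpack("<I", raw)[0]      # read as a 32-bit unsigned integer
as_float = struct.unpack("<f", raw)[0]    # read as a 32-bit floating-point number
print(as_text, as_int, as_float)

Nothing in the bytes themselves says which reading is "the" information; the meaning comes from the interpreting program, which is exactly the context being talked about here.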


If images can be carried by light waves, and words can be carried by sound waves, why couldn't this information be carried by another type of wave? Why couldn't the images that we see and the sounds that we hear be carried by brain waves, if those waves circulate endlessly in loops in the brain?
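
For what it's worth, here is a minimal illustrative sketch in Python of the general principle (a textbook amplitude-modulation example with invented numbers, not a claim about what the brain does): a slow "message" can ride on a much faster carrier wave and be recovered at the other end:

import numpy as np

sample_rate = 10_000
t = np.arange(0, 1.0, 1.0 / sample_rate)

# A slow "message" (5 Hz) riding on a fast carrier (500 Hz): amplitude modulation.
message = 0.5 * np.sin(2 * np.pi * 5 * t)
carrier = np.sin(2 * np.pi * 500 * t)
transmitted = (1.0 + message) * carrier

# Crude receiver: rectify, then smooth with a moving average to recover the envelope.
window = sample_rate // 100                      # 10 ms smoothing window
envelope = np.convolve(np.abs(transmitted), np.ones(window) / window, mode="same")
recovered = envelope - envelope.mean()           # strip the constant offset

# The recovered envelope tracks the original 5 Hz message closely.
print("correlation with the original message:",
      round(float(np.corrcoef(message, recovered)[0, 1]), 3))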

