So basically learning on their own is a possibility. That means, given enough time, it could gain the ability to think in abstract ways like humans can. It kinda sounds like genetic programming to me. I'm new to this whole computer thing, but it seems like it's a way to tell a machine how to do something without actually telling it how. It sorta sounds like giving it problem-solving abilities.
Yes, that's basically it. IMO learning is mostly associations. The difference between a human's ability to do 3 x 7 and a computer's is that the human can associate it with 3 groups of 7 apples, while the computer simply treats it as a number. Current machines cannot associate anything without being specifically programmed to do it, and even then they would simply associate it with the strings "apple" and "group". Humans, on the other hand, associate apples with a variety of things such as the color, shape, taste, price, and maybe even genealogy of the fruit, all depending on what we know. And we can associate it with more things as we learn (you might later associate apples with gravity once you hear the story of how Newton supposedly discovered the law of universal gravitation). Therefore, in my opinion, a machine that could freely associate different concepts would be A.G.I. (It would of course have to have a way to learn those concepts in the first place, presumably from sensors or the internet.)
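The "learning is associations" idea can be sketched in a few lines of code. This is only a toy (the `ConceptGraph` class and all the concept names are invented for illustration), but it shows how adding one new link lets a system reach concepts it was never directly told about:

```python
# Toy sketch of "learning as association": a concept graph where
# every new fact just links two concepts together.
from collections import defaultdict

class ConceptGraph:
    def __init__(self):
        self.links = defaultdict(set)

    def associate(self, a, b):
        # Associations are symmetric: apple <-> red, red <-> apple.
        self.links[a].add(b)
        self.links[b].add(a)

    def related(self, concept, depth=1):
        # Everything reachable within `depth` hops of the concept.
        frontier, seen = {concept}, {concept}
        for _ in range(depth):
            frontier = {n for c in frontier for n in self.links[c]} - seen
            seen |= frontier
        return seen - {concept}

g = ConceptGraph()
g.associate("apple", "red")
g.associate("apple", "fruit")
g.associate("apple", "Newton")
g.associate("Newton", "gravity")

print(g.related("apple"))           # direct associations only
print(g.related("apple", depth=2))  # now includes "gravity" via "Newton"
```

Learning the Newton story is one `associate` call, and "gravity" suddenly becomes reachable from "apple", which is roughly the free-association behavior described above.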
Disgust isn't an emotion, but rather an acquired disposition towards certain events. It depends on past experiences. Some people are disgusted by the sight of blood, but many are not.
Emotions do have purposes. Take frustration, for example. What stops you repeating a task endlessly? Frustration typically kicks in when you repeatedly fail at a task, pushing you to give it up and do something else. This has obvious evolutionary and practical uses. Let's pretend you're God for a moment: you don't want your little humans spending all their days trying to push that rock over. You want them to get frustrated and do something else if the task is deemed fruitless.
Other emotions have similar practical uses. Sadness, joy and empathy, for example, are mental tools to push us to communicate and work together. You can obviously see the advantages of working together, and hopefully see why removing the 3 listed emotions from the human psyche would result in less co-operation.
Trust and mistrust are based on past experiences with people we have worked with, pushing us to work with people we trust, and avoid people we mistrust. Throw in optimism, disappointment and anger and you can see why emotions and the ability to learn are closely linked.
TL;DR: Stuff is complicated and nearly everything has evolved the way it has for a VERY good reason.
So what about this? An internet data bank for all the basic stuff is connected to the system: math, science, language, etc. It is then automatically accessed whenever the machine hears or sees something, relating the input to the information online, and the result is stored on the system itself. Some basics would need to be pre-installed, for example language and facial recognition abilities. Things like understanding someone's name would be something a learning program could probably figure out with enough input.
Another way I have thought about it is to give it anything and everything to start off with. It would then connect images and gained data with the pre-stored info and then the original data would be automatically deleted. Therefore the only info it would have would be the connections and what it learned.
Hmm, if I was skilled at this kind of thing, here is how I would try it out. This is kind of a what-if fun project, since I still need to study programming and stuff.
People are naturally curious. That is a general rule, at least in young children. Making a machine able to learn is one thing, but giving it curiosity is another. The want to learn makes learning more of an experience, at least in humans. People also learn better when they enjoy the subject. Making a machine want to learn as much as possible is probably the best start.
People are different from computers, obviously, and as such we have a base platform to learn from. Even a newborn baby isn't just a whiny blob (although it sure as hell seems like it). In fact, evolution has given the child the ability to cry as well as the knowledge that crying will make it feel better and get it what it wants. Even the feeling of want itself has been ingrained into human evolution. If something starts with ZERO, it is able to gain ZERO. A machine, though, knows NOTHING if you don't tell it ANYTHING. Metal has no feelings, plastic has no thoughts. A computer is the same as a brick if nobody tells it anything.
The physical shape of the machine also matters. A brick can't do anything; a doll with mechanisms can. I would actually build something to work with first, then program it later. It's hard to engineer a perfectly lifelike program if you don't know the capabilities of the machine you want to make. I would also need to differentiate between conscious and subconscious actions. When you scratch your head because it itches, do you do it knowingly, or do you just naturally do it without thinking about it? Humans have dedicated systems for survival and dedicated systems for questioning and experiencing that are separate but work very well together.
The body would need to be riddled with sensors that all work together and are connected to the same thing. Pain in your eye is the same thing as pain in your leg; the only difference is that different nerves sent the signals. What would a replica of a nerve even be? It would need to be far more advanced than a switch or a wire. Actually, each nerve would need programs of its own. Electronics aren't even half as sensitive and accurate as their biological counterparts.
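The "same signal, different nerve" point can be sketched as many sensors feeding one shared handler, distinguished only by a source tag. Everything here (names, the event format) is invented for illustration, and real nerves are of course vastly more complicated:

```python
# Sketch of "pain in the eye is the same signal as pain in the leg":
# many sensors, one central handler, distinguished only by a source tag.
def make_sensor(location, events):
    # Each "nerve" is just a closure that tags its signal with its source.
    def fire(intensity):
        events.append({"source": location, "signal": "pain", "level": intensity})
    return fire

events = []
eye = make_sensor("eye", events)
leg = make_sensor("leg", events)

eye(0.8)
leg(0.3)

# One handler treats every pain signal identically, whatever the source.
for e in events:
    print(f"{e['signal']} from {e['source']} at level {e['level']}")
```

The design point is that the handler never needs to know how many sensors exist; adding a sensor is just another `make_sensor` call, which matches the "all connected to the same thing" idea.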
As far as the seemingly-alive part goes, things like motors and sensors would need to be linked with actions. The machine would need to be able to make the connection that when someone smiles, it means they are feeling good. There are different types of smiles too: the happy smile, the "I have an evil plan" smile, and the "I'll smile to hide the fact that I feel like crap" smile. Giving the machine the ability to tell the difference would be a giant step in the right direction.
So is anything I just said half decent or doable? I certainly can't do it now, but I would love to know how.
I didn't read all of the posts up until Reaper's last one thoroughly and I'm too lazy to quote right now.
Anyway, disgust is technically an emotion. I don't much like psychology because I think it's mostly a bunch of poor guesswork (according to a psychologist I saw, I'm apparently schizophrenic and have moderate to severe depression. I assure you I have neither... unless this isn't real... I can no longer trust ANY of you!). Just like love is an emotion, hate would be as well (if you like someone, your friends maybe, that would be more aligned with love, although not to the same extent as, say, your spouse or child; likewise, disgust would be associated with hate). It's more complex than the happiness-and-fear line we had in high school (or at least I did. My high school sucked).
Anyway, back to the robot thing. I really don't think you can give them actual emotions and things like curiosity. You can make them simulate it to some extent, but they won't actually care (nor will they not care; they won't have any disposition towards anything, really, unless you make them work like that. But it still won't matter to it if its happy function doesn't work. It would just seem like an angry robot all the time, but it won't know that). Our brains don't work like computers; they don't deal wholly with numbers. We deal with far more complex sets of data and a totally different system of logic. When you think of a house, you think of something you live in. A computer doesn't know that; "house" is just 686f757365 or something like that (well, that's hex; technically it would be binary). The only associations it would have are ones made by index. House might be tagged with something like "living", and therefore it may be associated with an apartment because of the similar tags, but that's it. It won't make those on its own.
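That tag-index idea can be sketched in a few lines. The tags and objects below are made up for the example; the thing to notice is that every association comes from a tag a programmer assigned up front, so anything outside the table matches nothing:

```python
# Minimal sketch of a tag index: objects relate only through
# programmer-assigned tags, so an unknown object matches nothing.
index = {
    "house":     {"living", "building"},
    "apartment": {"living", "building"},
    "couch":     {"furniture", "sitting"},
}

def best_match(tags):
    # Pick the known object sharing the most tags, or None if nothing overlaps.
    score, obj = max((len(index[o] & tags), o) for o in index)
    return obj if score > 0 else None

print(best_match({"furniture", "legs"}))   # "couch" via one shared tag
print(best_match({"living", "building"}))  # house/apartment tie on tags
print(best_match({"quadruped"}))           # None: no tags in common
```

A chair never tagged into the table can share any number of properties with a couch in the real world; here it simply returns `None`, which is exactly the "it won't make those on its own" limitation.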
Which goes back to simulation: you can make a thorough index using speech recognition and whatnot, and force it to associate things with what you say, like a couch, and with a good enough program you might be able to make it look at a different couch and know what it is based on its shape and the people sitting on it and whatnot, but it's not going to just do it. It's going to do whatever it does based on its functional set. If it sees a chair instead, it won't know what it is.
So you can't program a computer to be able to self-program?
Quote from Varine:
When you think of house, you think of something you live in.
----
That is just an association. The difference between humans and machines is that the baseline for computers is math (specifically binary math), whereas the baseline for humans is something like sensory input (I say "something like" because no one knows exactly how the brain works). However, it's still all just abstractions and associations. For example, to a human the idea of matter being solid is obvious, and the idea of matter being made up mostly of empty space is abstract (even though the latter more closely represents reality). For a computer, math is the most real, bottom-line thing, but that does not mean it cannot understand, say, language or the concept of a home, just as I can understand chemistry (well, at least I can pretend to understand it).
@Eiviyn:
Emotions are obviously very useful for humans, and they have been programmed quite well by evolution to help us do the things evolution requires, specifically to survive and reproduce. However, the primary objective of robots built by humans would presumably be to help us with some specific or general task, not to survive and reproduce. Also, evolution is a blind watchmaker and is in many ways suboptimal. Emotions are very useful in the circumstances in which we have spent most of our evolution (we have only had advanced civilization for 100 years or so), but they can be very cumbersome at times because they adapt very slowly. Your example of frustration is a good one. If a human spent all of its time working on a problem without much success (say, trying to cure cancer), then it would be evolutionarily beneficial to give up, because that would increase its probability of reproducing. However, we would not want the machines we might build to find a cure for cancer to "give up"; we would rather they keep trying. Therefore, IMO, although emotions are useful, they are not required, especially for a task-oriented machine (which would be more useful anyway).
@Reaper872:
You can program a computer to self-program; there is an entire field dedicated to it, called machine learning.
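To make that concrete, here is machine learning in miniature, just a toy sketch: nobody writes the rule into the code; the program adjusts its own parameters until they fit the examples it is shown. The learning rate and loop count are hand-picked for this toy:

```python
# Machine learning in miniature: the rule y = 2x + 1 never appears
# in the logic; the program tunes w and b from examples alone.
examples = [(x, 2 * x + 1) for x in range(10)]  # "hidden" rule: y = 2x + 1

w, b = 0.0, 0.0          # the "self-written" part of the program
lr = 0.01                # learning rate, chosen by hand for this toy
for _ in range(2000):    # repeatedly nudge w and b to shrink the error
    for x, y in examples:
        err = (w * x + b) - y
        w -= lr * err * x
        b -= lr * err

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

This is ordinary gradient descent on a straight-line model, about the simplest instance of the field; the Stanford-style courses mentioned below start from essentially this and build up to vision and robot control.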
What about making a machine that automatically turns what it sees into 3D models? That would seem to solve the whole "think in math" problem, wouldn't it?
That has already been done to some extent. If you have iTunes, go to iTunes U and search for "machine learning". You will find a Stanford course (you can watch the entire thing for free). In the first episode the professor explains some of the really cool things that have already been done with machine learning, including turning 2D images into 3D models, teaching robots how to move, and teaching cars to follow roads. However, I don't think thinking in math is a problem. I was just noting that it's not how humans think.
Which means that teaching a machine to think in a non-logic-based way is basically impossible, as the machine it is built around is still based in math. Now I get it. Unless there is another way to make a machine run without using ones and zeros, it'd be impossible, I suppose :(
We are imperfect, and that is what gives rise to what cannot be replicated in a machine (or at least not plausibly; see HAL in 2001: A Space Odyssey).
However, this has made us crave perfection and want to create perfect "things": ergo, machines
(well, laziness (sloth) and pride, and all the other well-known sins have their way ;^) of pushing us there too)
We are chemically made and magnetically/electrically powered... and in that regard we are unique snowflakes.
Not to mention, we have too much time to think (a great portion of us, and for a significant part of our lives at least),
and
we are social like no other mammal... we are the social beasts.
Case study: Brave New World, the book by Huxley: kids in the maternity ward have speeches coming from their pillows telling them what's what since birth... no one likes a perfect world... we would not need, then we would not dream, then we would wither...
Case study (close to home): I am the "second" kid in my family and I am raising a "first" kid = what the fuck do I do? Who's he gonna become? What is this experience going to change in me? etc... introspection/interest in others... so many variables to account for...
...we are able to maintain sanity while "not knowing" and even while "knowing that we will never know everything, or indeed enough"...
...while a computer needs to know everything beforehand, we allow imperfection to be acceptable... while it lives to expunge it (see the two seconds of zen in the latest Tron)...
By that I mean: why marvel at supercomputers that can do very complex applications of sophisticated knowledge we have fed them?... That is still peanuts compared to the miracle of thermodynamics as applied to the human race (see Dr. Manhattan in Watchmen)... (well, every flower or dolphin is unique too, but you get my drift, no?)... and that's just contemplating a basic enough variable: birth... then jump ahead to choosing a mate or a career... lol, the variables are more than any brain can master (and some people master 16 or 20 languages (and I do mean human speech languages :) not Java, BASIC, or Ubuntu ;)
A human being is capable of the best and the worst;
machines are well thought out or not, well built or not, well used or not...
so huge a difference that any AI is far away still.
(Did any of you see Blade Runner?
"...[it] would be like tears in the rain..." [Rutger Hauer's last line, in his best acting job ever, as the replicant dying in the rain who will not kill the hero (his hunter)... because he can't stand the thought of being alone when he dies... poetry and humanity (OK, it's by a Scott brother (Ridley), but it's his best work :) ) My sci-fi favorite <3)
Yeah, OK guys. Movies like Tron: Legacy and Transformers are just really getting to me, I guess. I'm not interested in perfection or making a person out of a computer. I would like to be able to talk to something other than a person and have it understand me, though. That much I know I want.
Even animals don't really understand humans, though; they just hear the same thing over and over, so they do what they think you want. It's not like they can go "How was your day?" or "I'm bored, let's play!" They act like that's what they want, though. Meh, not enough communication for my tastes.
I know that. People are just boring. I don't mind people, I really don't. That's not my issue. It's just that when you get thinking about being able to make something behave in certain ways and being able to actually control it, it sounds really cool. I also like mechanical things so yeah... there you have it! I guess I just have to accept the fact that there isn't any fun in it for now.
There is no way to teach anyone anything except with logic. Sometimes the logic, or the information it is based on, is incorrect, but all human knowledge is based on logic, directly or indirectly. Also, humans are probably mathematical operating systems deep down as well, since our brains are composed of neurons, each of which can form about 1,000 connections to nearby neurons. Although we have around 1,000 states to work with rather than two, and a greater level of noise and randomness in our programming, I would argue that we are probably still based on math and logic, just with so many layers of programming (some of which are junk) on top that we cannot tell anymore. Of course, I don't know that for sure (no one does), but I don't see what else we could be based on (our brains do comply with the natural laws of physics, after all).
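For what it's worth, that "math underneath" view is exactly what the classic artificial-neuron model assumes. This is a drastic simplification of a real neuron (no 1,000 states, no noise), but it shows that a fire-or-don't-fire element can be pure arithmetic:

```python
# The classic artificial neuron: a weighted sum plus a threshold.
# Real neurons are far messier; this only shows the math-at-the-bottom idea.
def neuron(inputs, weights, threshold):
    # Fire (1) iff the weighted sum of inputs crosses the threshold.
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three "synapses" with different strengths; the cell fires only when
# the strongly weighted inputs are active together.
weights = [0.9, 0.9, 0.1]
print(neuron([1, 1, 0], weights, threshold=1.5))  # 1.8 >= 1.5 -> fires
print(neuron([0, 1, 1], weights, threshold=1.5))  # 1.0 <  1.5 -> silent
```

Stack enough of these and adjust the weights from experience, and you get the layered-programming picture described above, all of it arithmetic at the bottom.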
Can one ever get too far.. when discussing i mean?
"There is no way to teach anyone anything except with logic." = I respectfully disagree...
I 100% agree that people's brains mostly work with logic, dubious logic, passive-aggressive counter-logic, wrong logic, etc... but that's just "the norm"... and it forgets about arts and crafts... about history... introspection... about poetry... No offense meant to anyone, but that's all I personally am interested in; anything else feels like a chore ;^p
Furthermore, human beings are made of chemicals... when you meet someone, you meet them: your eyes glance, you smell this person, you hear that person, rarely even touch that person... It's the most complex interaction you can imagine...
Again, emotion is a chemical imbalance, and I would not have survived my teens without it (dare I say it, I would not have passed the "age of reason" (around 7 for most people))... I loved being chemically imbalanced... it is my karma... and yes, I am sad that so few human beings like me roam the earth freely :(
Don't mind me... I'll go cry in someone's robe now... too sensitive :)
The whole concept of AI is based on the Turing Test; whether the program is sentient or not does not necessarily make it an AI. The only criterion for AI is that it can fool people, in a blind test of communication, into thinking it is another human (of dubious intellect).
That test doesn't fly with me. What if you have a computer programmed with thousands of responses, each voice-activated by one of thousands of things someone can say to it? It won't know what it's doing; it just passes because it mimics what a human would do. I'm not talking about a "blind test of communication." I'm talking about making ACTUAL intelligence, thought, and reason. A machine can be told what to say, that's one thing, but having it know what it all means is something altogether different. There is really no point making something mimic a human without it knowing what it's doing.
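The canned-response machine described above is easy to sketch (the phrase table below is invented for the example). It looks conversational right up until it meets anything it wasn't given:

```python
# A canned-response "chatbot": pattern-matching with zero understanding.
canned = {
    "how are you":       "Fine, thanks! And you?",
    "what is your name": "I'm Alex.",
    "do you like music": "I love music!",
}

def reply(utterance):
    u = utterance.lower().strip("?!. ")
    for phrase, answer in canned.items():
        if phrase in u:        # substring match against known phrases
            return answer
    return "Interesting, tell me more."  # stock dodge for anything novel

print(reply("How are you?"))          # looks like conversation...
print(reply("Explain what you are"))  # ...but novelty exposes the trick
```

With a big enough table this kind of thing can fool people for a while, which is exactly the objection: fooling the judge and knowing what "how are you" means are two different problems.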
Think of the Queen of Blades, for example. She was controlled by the Zerg so heavily that she really had no free will. She didn't want to be infested; she didn't want to look like a monster. Her lack of independence doesn't make her not alive. A program is basically something that heavily influences the decisions of the machine; in fact, the machine is unable to make decisions without programs. With programs, though, it isn't really its own decision, because it is told what to do. I am frustrated by this more than anything else. How do you make a computer have at least a basic understanding of what it means to exist?
So basically learning on their own is a possibility. Therefore it can self-gain the ability to think in abstract ways like humans can given enough time. It kinda sounds like genetic programing to me. I'm new to this whole computer thing but it seems like it's a way to tell a machine how to do something without actually telling it how. It sorta sounds like giving it problem solving abilities.
Yes that's basically it. Imo learning is mostly associations. The difference between humans ability to do 3 x 7 and computers ability to do it is that the human can associate that with 3 groups of 7 apples while the computers simply thinks of it as a number. Current machines cannot associate anything without specifically being program ed to do it and even then they would simply associate it with the strings "apple" and "group". Humans on the other hand associate apples with a variety of things such as the color, shape, taste, price, and maybe even genealogy of the fruit. All depending on what we know. And we can associate it with more things as we learn things (you might latter associate it with gravity as you learn the story of how Newton supposedly discovered the law of universal gravitation). Therefor in my opinion a machine that could freely associate different concepts would be A.G.I. (It would ofc have to have a way to learn those concepts in the first place presumably from sensors or the internet).
Disgust isn't an emotion, but rather an acquired disposition towards certain events. It depends on past experiences. Some people are disgusted by the sight of blood, but many are not.
Emotions do have purposes. Take frustration for example. What stops you repeating a task endlessly? Frustration is typically when you repeatedly fail a task, pushing you to give the task up and do something else. This has obvious evolutionary and practical uses. Let's pretend you're God for a moment, you don't want your little humans spending all their days trying to push that rock over. You want them to get frustrated and do something else if it's deemed fruitless.
Other emotions have similar practical uses. Sadness, joy and empathy, for example, are mental tools to push us to communicate and work together. You can obviously see the advantages of working together, and hopefully see why removing the 3 listed emotions from the human psyche would result in less co-operation.
Trust and mistrust are based on past experiences with people we have worked with, pushing us to work with people we trust, and avoid people we mistrust. Throw in optimism, disappointment and anger and you can see why emotions and the ability to learn are closely linked.
TL:DR; Stuff is complicated and nearly everything has evolved the way it has for a VERY good reason.
so what about this? An internet data bank for all basic stuff is connected to the system: math, science, language, etc. etc. It is then automatically accessed by hearing or seeing something and relating what it gets with the information online. It is then stored on the system itself. Some basics would need to be pre-installed, language and facial recognition abilities for example. Things like understanding someone's name would be something that a learning program could probably figure out with enough input.
Another way I have thought about it is to give it anything and everything to start off with. It would then connect images and gained data with the pre-stored info and then the original data would be automatically deleted. Therefore the only info it would have would be the connections and what it learned.
Hmm if I was skilled at this kind of thing here is how I would try it out. This is kinda a what-if fun project since I need to study programing and stuff still.
People are naturally curious. That is a general rule, at least in young children. Making a machine able to learn is one thing, but giving it curiosity is another. The want to learn makes learning more of an experience, at least in humans. People also learn better when they enjoy the subject. Making a machine want to learn as much as possible is probably the best start.
People are different from computers obviously and as such we have a base platform to learn from. Even a new born baby isn't just a whiny blob (although it sure as hell seems like it). In fact, evolution has given the child the ability to cry as well as the knowledge that crying will make it feel better and get it what it wants. Even the feeling of want itself has been ingrained into human evolution. If someone starts with ZERO, it is able to gain ZERO. A machine though knows NOTHING if you don't tell it ANYTHING. Metal has no feelings, plastic has no thoughts. A computer is the same as a brick if it doesn't have anyone telling it anything.
The physical shape of the machine also matters. A brick can't do anything. A doll with mechanisms can. I would actually build something to work with first, then program it later. It's hard to engineer a perfect life resemblance program if you don't know the capabilities of that machine you want to make. I would also need to differentiate between conscious and subconscious actions. When you scratch your head because it itches do you do it knowingly or do you just naturally do it without thinking about it? Humans have dedicated systems for survival and dedicated systems for questioning and experiencing that are separate but they work very well together.
The body would need to be riddled with sensors that all work together and are connected to the same thing. Pain in your eye is the same thing as pain in your leg. The only difference is that different nerves sent the signals. What would a replica of a nerve be anyway? It would need to be far more advanced than a switch or a wire. Actually each nerve would need programs of it's own. Electronics aren't even 1/2 as sensitive and accurate as their biological counterparts.
As far as the seemingly alive part goes, things like motors and sensors would need to be linked with actions. The machine would need to be able to make the connections of when someone smiles it means that they are feeling good. There are different types of smiles too. There is the happy smile, the "I have an evil plan" smile, and the I'll smile to hide the fact that I feel like crap smile. Giving the machine the ability to tell the difference is a giant step in the right direction.
So is anything I just said 1/2 decent or doable? I certainly can't do it now but I would love to know how.
I didn't read all of the posts up until Reaper's last one thoroughly and I'm too lazy to quote right now.
Anyway disgust is technically an emotion. I don't much like psychology because I think it's mostly a bunch of poor guesswork (according to a psychologist I saw I'm apparently schizophrenic and have moderate to severe depression. I assure I do not have either of those... unless this isn't real... I can no longer trust ANY of you!). Just like love is an emotion, hate would be as well (if you like someone, your friends maybe, that would be more aligned with love, although not to the same extent as say your spouse or child typically, likewise disgust would be associated with hate). It's more complex than the happiness and fear line we had in high school (or at least I did. My high school sucked).
Anyway back to the robot thing, I really don't think you can give them actual emotions and things like curiosity. You can make them simulate it to some extent, but they won't actually care (nor will they not care, they won't have any disposition towards anything really, unless you make them work like that. But it still won't matter to it if it's happy function doesn't work. It would just seem like an angry robot all the time, but it won't know that). Our brains don't work like computers, they don't deal wholly with numbers like a computer. We deal with far more complex sets of data and a totally different system of logic. When you think of house, you think of something you live in. A computer doesn't know that, house is just 686f757365 or something like that (well that's hex, technically it would be binary). The only associations it would have are ones made by index. House might be tagged with something like living, and therefore it may be associated with an apartment because of the similar tags, but that's it. It won't make those on it's own.
Which goes back to simulation: You can make a thorough index using speech recognition and whatnot, and force it to associate things with what you say like a couch, and with a good enough program you might be able to make it look at a different couch and it would know what it is based on it's shape and people sitting on it and whatnot, but it's not going to just do it. It's going to do whatever it does based on it's functional set. If it sees a chair instead it won't know what it is.
so you can't program a computer to be able to self program?
@Varine:
Quote from Varine:
When you think of house, you think of something you live in.
----
That is just an association. The difference between humans and machines is that the baseline for computers is math (specifically binary math) whereas the baseline for humans is something like sensory input (we assume no one knows exactly how the brain works). However its still all just abstractions and associations. For example to a human the idea of matter being solid is obvious and the idea of matter being made up mostly of empty space is abstract (even though the latter more closely represents reality). For a computer math is the most real bottom line thing but that does not mean it cannot understand say language or the concept of a home just as I can understand chemistry (well I at least I can pretend to understand it).
@Eiviyn:
Emotions are obviously very useful for humans and they have been programmed quite well by evolution to help us do things necessarily for evolution specifically to survive and reproduce. However the primary objective of robots built by humans would presumably be to help us with some specific of general task not to survive and reproduce. Also evolution is a blind watchmaker and is in many ways sub optimal. Emotions are very useful in the circumstances which we have spent most of our evolution (we have only had advanced civilization for 100 years or so) but they can be very cumbersome at times because they adapt very slowly. Your example of frustration is a good one. If a human spent all of its time trying to work on a problem without much success (say trying to cure cancer) than it would be evolutionary beneficial to give up because that would increase its probability of reproducing. However we would not want the machines we might build to attempt to find a cure to cancer to "give up" we would rather they keep trying. Therefor imo although emotions are useful they are not required especially for a task oriented machine. (Which would be more useful anyway)
@Reaper872:
You can program a computer to self program there is an entire field dedicated to it called machine learning.
What about making a machine that automatically turns what it sees into 3d models? That would seem to solve the whole "think in math" problem wouldn't it?
That has already been done to some extent. If you have I-Tunes go to I-Tunes University and search for machine learning. You will find a Stanford course (you can watch the entire thing for free) If you watch the first episode the professor will explain some of the really cool things that have already been done with machine learning (including turning 2d images into 3d models, teaching robots how to move, teaching cars to follow roads). However I don't think thinking in math is a problem. I was just noting that its not how humans think.
@DirectorOfTheUED: Go
Which means that teaching a machine to think in a non-logic based way is basically impossible as the machine it is built around is still based in math. Now I get it. Unless there is another way to make a machine run not using ones and zeros it'd be impossible I suppose :(
We are imperfect and this is what sires what is not replicable to a machine (or at least not probable, see "2001 a space odyssey's Hal")
however, this has made us crave for perfection and want to create perfect "things": ergo machines (well laziness (sleuth) and pride, and all the other over-known sins have their way ;^) of pushing us there too)
We are chemical made and magnetic/electrically powered.. and in that regard we are unique snowflakes not to mention, we have too much time to think (a great portion of us, and a significant part of our lives at least) and we are social like no other mammal.. we are the social beasts
case study: brave new world, book by orwell: kids in maternity ward have speeches coming from their pillows telling them what's what since birth... no one likes a perfect world.. we would not need then we would not dream then we would wither..
case study (close to home): i am the "second" kid in my family and i am raising a "first" kid = whatthefuck do i do? who's he gonna become? what is going to change me from this experience? etc.. introspection/interest for others... so many variables to account for..
..we are able to maintain sanity while "not knowing" and even "knowing that we will never know everything or indeed enough"...
..while a computer needs to know everything beforehand, we allow imperfection to be acceptable.. while it lives to expunge it (see the 2 seconds of zen in the last Tron)...
by that i mean: why marvel at supercomputers that can run very complex applications of sophisticated knowledge we have fed them?... That is still peanuts compared to the miracle of thermodynamics as applied to the human race (see Dr. Manhattan in Watchmen)... (well, every flower or dolphin is unique too, but you get my drift, no?)... and that's just contemplating one basic enough variable: birth.. then jump ahead to choosing a mate or a career... lol, the variables are more than any brain can master (and some people master 16/20 languages (and i do mean human speech languages :) not Java, BASIC, or Ubuntu ..;)
a human being is capable of the best and the worst,
machines are well thought out or not, well built or not, well used or not..
so huge a difference that any ai is far away still
(did any of you see blade runner?
"..[it] would be like tears in rain..." [Rutger Hauer's last line in his best acting job ever, as the replicant dying in the rain, who will not kill the hero (his hunter).. because he can't stand the thought of being alone when he dies.. .. poetry and humanity (ok it's by one of the Scott brothers (Ridley) but it's his best work :) )
my scifi favorite <3 )
yeah ok guys. Movies like Tron Legacy and Transformers are just really getting to me I guess. I'm not interested in perfection or making a person out of a computer. I would like to be able to talk to something other than a person and have it understand me though. That much I know I want.
Even animals, though, don't really understand humans; they just hear the same thing over and over, so they do what they think you want. It's not like they can go "how was your day?" or "I'm bored, let's play!" They act like that's what they want, though. Meh, not enough communication for my tastes.
just talk to humans dude.. you can do it..honest.. all it takes is trying
again and again and again, until you get the hang of it… i bet you'd like talking to 1 in 10 random people you meet :)
(well forums are sort of mehhh, i mean in real life :) )
@houndofbaskerville:
I know that. People are just boring. I don't mind people, I really don't. That's not my issue. It's just that when you get thinking about being able to make something behave in certain ways and being able to actually control it, it sounds really cool. I also like mechanical things so yeah... there you have it! I guess I just have to accept the fact that there isn't any fun in it for now.
People suck. That's why I use Microsoft Sam to talk to.
@Reaper872:
There is no way to teach anyone anything except with logic. Sometimes the logic, or the information it is based on, is incorrect, but all human knowledge is based on logic, directly or indirectly. Also, humans are probably mathematical operating systems deep down as well, as our brains are composed of neurons, each of which can form about 1,000 connections to nearby neurons. Although we have on the order of 1,000 states to work with rather than two, and a greater level of noise and randomness in our programming, I would argue that we are probably still based on math and logic, just with so many layers of programming (some of which are junk) on top that we cannot tell anymore. Ofc I don't know that for sure (no one does), but I don't see what else we could be based on (our brains do comply with the natural laws of physics, after all).
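To illustrate the "logic layered on math" point, here's a hedged sketch of a cartoon neuron: just a weighted sum plus a threshold. Real neurons are vastly messier (the ~1,000-connection figure above is only an order of magnitude), but even this toy version shows logic gates, and then XOR from two layers of them, emerging from plain arithmetic.

```python
# Cartoon neuron: fires (returns 1) when the weighted input sum
# crosses a threshold. Pure math underneath, logic on top.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def AND(a, b):
    return neuron([a, b], [1, 1], 2)

def OR(a, b):
    return neuron([a, b], [1, 1], 1)

def XOR(a, b):
    # A second "layer" of the same math gives behavior no single neuron can:
    # XOR(a, b) = OR(a, b) AND NAND(a, b)
    nand = neuron([a, b], [-1, -1], -1)
    return AND(OR(a, b), nand)
```

XOR famously cannot be done by one such unit alone, which is exactly the "layers of programming on top of math" idea: stack enough simple mathematical units and the behavior stops looking like arithmetic.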
Getting too far into psychology again....
Can one ever get too far.. when discussing i mean?
"There is no way to teach anyone anything except with logic." = i respectfully disagree...
i 100% agree that people's brains mostly work with logic, dubious logic, passive-aggressive counter-logic, wrong logic etc... but just because that's "the norm".. that's forgetting about arts and crafts.. about history.. introspection.. about poetry... No offense meant to anyone, but that's all i personally am interested in; anything else feels like a chore ;^p
Furthermore, human beings are made of chemicals ... when you meet someone, you meet them.. your eyes glance, you smell this person, you hear that person, rarely even touch that person... It's the most complex interaction you can imagine...
Again, emotion is a chemical imbalance, and i would not have survived my teens (dare i say it, would not have passed the "age of reason" (around 7 for most people)) without it.. i loved being chemically imbalanced.. it is my karma.. and yes i am sad that so few human beings like me roam the earth freely :(
Don't mind me... i'll go cry in someone's robe now ... too sensitive :)
The whole concept of AI is based on the Turing Test; whether the program is sentient or not does not necessarily make it an AI. The only criterion for AI is that it can fool people, in a blind test of communication, into thinking it is another human (of dubious intellect).
@DrSuperEvil:
that test doesn't fly with me. What if you have a computer programmed with thousands of responses, each voice-activated by thousands of things someone can say to it? It won't know what it's doing; it just passes because it mimics what a human would do. I'm not talking about a "blind test of communication." I'm talking about making ACTUAL intelligence, thought, and reason. A machine can be told what to say, that's one thing, but having it know what it all means is something altogether different. There is really no point making something mimic a human without it knowing what it's doing.
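The "thousands of canned responses" objection can be sketched in a few lines (the phrases and replies here are made up for illustration): a lookup table that sounds conversational while understanding nothing at all.

```python
# Canned-response "chatbot": mimicry with zero understanding.
canned = {
    "how are you": "Not bad, yourself?",
    "what is your name": "People call me Sam.",
    "do you understand me": "Of course I understand you!",
}

def mimic(line):
    # Strip case and punctuation, then look the phrase up.
    # No meaning is involved anywhere in this process.
    key = line.lower().strip(" ?!.")
    # Stock dodge for anything not in the table:
    return canned.get(key, "Interesting, tell me more.")

print(mimic("Do you understand me?"))   # claims understanding it doesn't have
print(mimic("Why does frustration exist?"))  # the dodge hides total ignorance
```

Scale that table up far enough and it might fool a casual judge for a while, which is exactly why a pure "blind test of communication" feels like it's measuring the wrong thing.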
Think of the Queen of Blades, for example. She was controlled by the zerg so heavily that she really had no free will. She didn't want to be infested; she didn't want to look like a monster. Her lack of independence doesn't make her not alive. A program is basically something that heavily influences the decisions of the machine. It is, in fact, unable to make decisions without programs. With programs, though, it isn't really its own decision, because it is told what to do. I am frustrated by this more than anything else. How do you make a computer have at least a basic understanding of what it means to exist?