
State machines

Cognisant

Prolific Member
Local time
Today, 06:22
Joined
Dec 12, 2009
Messages
8,601
A very simple state machine would be a cat toy that meows until you pet it and then purrs until you stop petting it. These states are predefined sets of behaviours that the machine transitions between based on various predefined parameters. The cat toy could hiss and growl when you shake it, and require more petting than usual to enter the purring state for a while after being shaken. Using such parameters we can define state transitions that give the impression that the cat toy has thoughts and emotions, that it can take a liking to some people and hold grudges against others. We could even give the toy some apparent degree of autonomy by having it enter different states based on randomised timers.
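The cat toy above can be sketched as a small finite state machine. This is a toy illustration, with made-up state names and thresholds:

```python
class CatToy:
    """Finite state machine for the cat toy: meows until petted,
    purrs until petting stops, and hisses after being shaken,
    needing extra petting to calm down for a while."""

    def __init__(self):
        self.state = "meowing"
        self.pets_needed = 1      # pets required to reach purring
        self.pets_received = 0

    def pet(self):
        self.pets_received += 1
        if self.pets_received >= self.pets_needed:
            self.state = "purring"
            self.pets_needed = 1  # the grudge wears off once calmed
            self.pets_received = 0

    def stop_petting(self):
        if self.state == "purring":
            self.state = "meowing"
            self.pets_received = 0

    def shake(self):
        self.state = "hissing"
        self.pets_needed = 3      # holds a grudge: more petting required
        self.pets_received = 0
```

The toy is always in exactly one named state; the "grudge" is just an extra counter that shapes which transition fires next.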

How can you know for sure that someone's not a very sophisticated state machine?

If such sophisticated state machines existed would you consider them people or simply very realistic depictions of people?
 

Blarraun

straightedgy
Local time
Today, 18:22
Joined
Nov 21, 2013
Messages
4,215
Location
someplace windswept
A state machine, by definition, can be in only one state at any given time. It's the same question as "how can you know someone isn't a computer?", except it's pretty clear that we hold more than one state at any time, so we are more comparable to computers than to finite state machines.
 

Serac

A menacing post slithers
Local time
Today, 17:22
Joined
Jun 7, 2017
Messages
2,496
Location
Stockholm
There’s a concept in mathematical modeling called a Markov chain, which is similar to what you describe: a process whose next state depends only on its current state. I would say there are many aspects of biological organisms that are inconsistent with the Markov property, e.g. homeostasis. For example, if you have a certain amount of muscle on your body after having gone to the gym for the past few months and then stop going, your future muscle level depends not only on your current level but also on what you had in the past: you’ll probably mean-revert to your historical level.
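The contrast can be sketched in a few lines (the numbers are invented for illustration): a Markov step uses only the current value, while the mean-reverting step is also pulled toward a historical baseline.

```python
import random

def markov_step(current):
    # Markov property: the next value depends only on the current one
    return current + random.gauss(0, 1)

def mean_revert_step(current, baseline, rate=0.2, noise=0.1):
    # Also pulled toward a historical baseline, so the observable
    # level alone doesn't determine where the process is heading
    return current + rate * (baseline - current) + random.gauss(0, noise)

# e.g. muscle mass drifting back toward its long-run level
level, baseline = 80.0, 70.0   # hypothetical kg figures
for _ in range(100):
    level = mean_revert_step(level, baseline)
# level ends up near the baseline regardless of the peak it started from
```

(Strictly speaking, the process becomes Markov again if you count the baseline as part of the state, which is arguably what homeostasis does: the body's visible level alone isn't the whole state.)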
 

ZenRaiden

One atom of me
Local time
Today, 17:22
Joined
Jul 27, 2013
Messages
698
Location
Between concrete walls
I mean, we have big brains and all, and we like to think we are sophisticated because there is no android yet, but on the whole the human brain is rather simple. Not the whole integrated system, but if you look at the overall pattern of behaviour, most people follow a lot of common patterns. If you saw a human for the first time in your life, chances are a close imitation could easily pass and not be recognised as an imitation. I mean, how do you know the person next to you on a bus, train or plane is human? You just assume that, because they breathe, have some micro-expressions and body language, make some sounds and generally have some odour. All of that could be easily simulated in the future if people put in the effort.
 

mr_darker

Member
Local time
Today, 11:22
Joined
May 8, 2018
Messages
29
Pretty sure this is basically a "what is consciousness" debate, brought up from thinking about state machines. For me, I don't know if it counts as a state machine, but I similarly got thinking about it while thinking about FPGAs (something you can program to be a state machine), how it's all similar to how neurons in a brain work, and how it'd be pretty straightforward to just simulate a ton of neurons and their connections on a giant FPGA, if you knew exactly where to make the connections (there are projects going on that will make a virtual map of a brain; they're basically 3D-un-printing it with an extreme level of detail while scanning/cutting it. Sounded like a multi-year scan). Anyways, it just kinda gets you thinking: you could some day upload code that is someone's consciousness to a computer. They'd likely not know they ceased to exist just a second ago.
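The "simulate a ton of neurons and their connections" idea can be sketched with leaky integrate-and-fire units. All constants here are made up for illustration, nothing like real cortical parameters:

```python
class LIFNeuron:
    """Leaky integrate-and-fire unit: voltage decays each step,
    input accumulates, and crossing the threshold emits a spike
    and resets the voltage."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.v = 0.0
        self.threshold = threshold
        self.leak = leak

    def step(self, input_current):
        self.v = self.v * self.leak + input_current
        if self.v >= self.threshold:
            self.v = 0.0
            return True   # spike
        return False

def step_network(neurons, weights, external):
    """One tick: feed external input to each neuron, then route any
    spikes along weighted connections as extra input for the next tick."""
    spikes = [n.step(i) for n, i in zip(neurons, external)]
    carry = [0.0] * len(neurons)
    for (src, dst), w in weights.items():
        if spikes[src]:
            carry[dst] += w
    return spikes, carry
```

An FPGA version would run the same update for every unit in parallel; the hard part, as the brain-mapping projects show, is knowing the connection map.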
There are also crazy side-thoughts, like: if you made 100 of the same person from the same version of consciousness, would they all be one person or would each be their own person?
The answer is each would be its own unique individual.
Which ties into the question: if a single clone of a deceased body is made, is it the old person or is it someone new?
The answer is it's someone new.
Which brings up the question: is there a way to not die and be cloned, but transfer consciousness?
The answer is yes: you must have consciousness on your old body/hardware at the same time as on your new body/hardware, prior to losing consciousness on your old body/hardware. That is to say, at some point in time, a single consciousness must be the sole consciousness for two separate bodies. Alternatively, you may have the old consciousness and a new clone merge consciousnesses, then delete the old one and its hardware/body. But this wouldn't be 100% clean/pure transference, and at one point in time it would be two separate entities merged, rather than one entity expanded and then shrunk.

But what is consciousness? Computers, state machines, complex and extremely abstract ones. The only thing that separates us from an emotionless computer without wants or needs is the false, hardwired belief/experience that good and bad exist. All meaning in life, and all sense of good and bad, are complex abstractions of perceptions of good and bad. Which can lead you to realise that nothing in life really matters or has purpose. Yet you're human and can't choose to unwire that false belief, at least not the sad/negative experiences, so trying to fight nature and "become a computer" is futile, and a road to depression. Also, why would you want to? It'd be like losing your humanity, comparable to suicide, and that kind of indicates to me that consciousness is what makes us feel/experience. Religious people would call it the soul. Which brings the question: if someone has bad "wiring" in their brain and lacks the ability to feel or want, and just does things without reason or personality, are they just a fleshy machine? Are they unconscious? Like the soulless people on Supernatural, minus the negative/violent desires, as those are still desires. Kinda high rn tho so might not make sense. Reading replies is too much work for now.
 

JansenDowel

Active Member
Local time
Tomorrow, 06:22
Joined
Sep 7, 2014
Messages
240
Location
New Zealand
So basically you're asking if the Turing Test works? No, it does not! To know whether this cat has emotions, you need to do more than observe it. As you have already explained, appearances can be deceiving. In order to distinguish real emotion from emulated emotion, one must have an explanation of how the cat actually works. It's through this explanation that one can determine whether the emotions are real or emulated. Without an explanation, one can never know. This is another reason induction does nothing to help us understand reality: appearances are always deceiving.
 
Local time
Tomorrow, 04:22
Joined
Jul 23, 2018
Messages
52
people, shmeeple.
Just because you have a state machine doesn't mean it can't show interesting unpredictable behaviour. See chaotic adaptive systems and modelling turbulence using cellular automata as examples.
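Rule 30 is the classic example: a one-dimensional cellular automaton whose update rule is trivial, yet whose centre column looks statistically random (Wolfram famously used it as a pseudo-random generator). A minimal sketch on a ring of cells:

```python
def rule30_step(cells):
    """One step of Wolfram's Rule 30 on a ring of 0/1 cells:
    new cell = left XOR (centre OR right)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

# A single seeded cell grows into an irregular, chaotic-looking triangle
row = [0] * 31
row[15] = 1
for _ in range(10):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

The machine is fully deterministic and finite, yet predicting the pattern in practice means running it, which is the point about state machines showing interesting behaviour.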

Ultimately, it's through your interaction with something that you determine whether you want to give it the status of being a person. Different frameworks of personhood, embedded in different cultures and entangled with the individual's own nuances over time, ultimately address this question.

Basically what I'm saying is that the word "person" is a loaded term that means slightly different things to different people. Arguing about it is pretty pointless.
 