Sunday, July 17, 2022

The necessity of consciousness

Why do living organisms experience consciousness? I hypothesize that living beings acquired consciousness because it was the only way they could learn what is good and what is bad for their concerns.

Organisms have concerns: staying alive, staying well-nourished, staying healthy, reproducing, and so on.

Consciousness as I use it here is the mental experience of information: seeing, smelling, touching, feeling. Not the physical aspect of neurons firing, but the mental aspect of experiencing.

Learning always requires feedback. This feedback can come from an external source, such as a teacher, or from yourself, when you set a goal to mimic someone or something else.

Experience

But how do you learn that sprouts taste awful? Or that being yelled at hurts? Or that a painting by Picasso is strange? Of course someone may tell you, and you may copy that opinion. But what if copying someone else's opinion is not good enough, and it's better to find out for yourself?

In these cases you learn by trying. You learn from experience.

Can this learning be done by mental processes alone? Is the firing of neurons, the strengthening of synapses, or, a bit higher up, the creation of a neural representation enough to learn about the taste of pepper, the smell of poop, or the bewilderment caused by an Escher painting?

It isn't. What's missing is the judgement. Of good and bad, of awful and sweet. Of interest and boredom. This judgement is not in the representation, and it cannot be induced from the representation. The judgement is in the experience of the representation.

Smell a rotten egg. Certain chemicals enter the nose. The nose fires certain neurons. Certain areas in the cortex are activated. The amygdala is involved. Neural patterns are formed. So what? Why would this instinctively cause you to recoil? It doesn't. Not unless experience is involved. And no matter the kind of representation, it is no substitute for experience.

This experience is important. It connects your sensory input to your inner concerns. It teaches you what is good for you and what is bad for you.

Dualism

But if experience is different from a neuronal representation, what is it? I have to admit that my idea is not far from the view that Descartes described 400 years ago: dualism, with a neuronal mind/body on the one hand and an experiencing psyche on the other. Experience (consciousness) is a different substance than matter.

For this hypothesis to work, the neural mind must influence the conscious substance. The idea that comes to mind is the way electricity influences magnetism, and vice versa: two different substances that turn out to be part of the same continuum. In this case matter and consciousness would both be part of a matter-consciousness universe.


This diagram (neural representation, conscious experience, perception of conscious experience, concerns) attempts to show how mind and consciousness interact. The mind creates a representation. Certain representations cause a change in consciousness. In order for this experience to be functional, it needs to be noticed by the mind again. There needs to be perception of the conscious experience.

Necessity

But why is this necessary? Can't experience be emulated? It can be faked, for sure. But it can't be emulated, because experience is what it's all about. Experience drives existence. You would not buy a painting unless it was beautiful. You would not eat all those potato chips unless they tasted so damn good. Experience is the messenger of the deep concerns that make each of us a living being.




Saturday, January 15, 2022

Processing SHRDLU demo sentence "Why?" with NLI-GO

For several years I have been working on a natural language understanding program called NLI-GO.  It allows the user to interact with a database through natural language. Since I needed example sentences to extend its functionality, and SHRDLU is a great inspiration, I thought it would be a good idea to recreate SHRDLU's famous sample dialog in NLI-GO. And it is a good idea. Each sentence creates new challenges and puzzles that need to be solved, and this allows NLI-GO to grow.

I am still working on it. The dialog has 44 interactions, and the last few months I have been working on interaction 25. It goes like this:

Person: Why?
Computer: TO GET RID OF IT

As usual, this interaction proved more complicated than I had envisioned at first. Now that I have finished it, I would like to explain some of the techniques I used.

Some dialog context

The question "why?" follows a series of interactions. Of these interactions the ones that matter are:

Person: Will you please stack up both of the red blocks 
  and either a green cube or a pyramid?
Computer: OK.

This is followed by the sequence

Person: Had you touched any pyramid before you put the 
  green one on the little cube?
Computer: YES, THE GREEN ONE

Person: When did you pick it up?
Computer: WHILE I WAS STACKING UP THE RED CUBE, A LARGE RED BLOCK AND A LARGE GREEN CUBE

Person: Why?
Computer: TO GET RID OF IT

So the person asks the computer when it picked up the green pyramid, and then why it picked up that pyramid. SHRDLU remembers that it was stacking up some blocks and needed to get rid of this pyramid in order to move the block that was below it.

SHRDLU's actions

This is a lot to take in. These are not your average QA questions one would ask a database in an NLI system. So this is a good time to go into the action structure of SHRDLU.

When the user tells SHRDLU to perform some command ("build a stack", "pick up a block"), the actions needed to perform this command are executed. While they are executing, SHRDLU also stores the fact that such and such an action was performed in its database.

This is how a "pick up" action is recorded in NLI-GO:

pick_up(EventId, Subject, Object);
start_time(EventId, Start);
end_time(EventId, End);
parent_event(EventId, ParentEventId)

Note that this doesn't require an event-based database, if such a thing even exists. A normal relational database can keep track of these events, but they need to be inserted explicitly by the application.

I am currently using an internal database (a data structure) to store these actions, but they could be stored in a relational database as well. `pick_up`, `start_time`, `end_time`, and `parent_event` would then be the names of tables.

Most commands are not simple. They consist of a hierarchy of actions. If a "pick up" action needs to move some object aside, it starts a "get rid of" action. This new action receives a pointer to the action that originated it. This is stored as the `parent_event`.

Because these actions are stored, SHRDLU has a memory of them. Because each action has a reference to the action that started it, SHRDLU can tell why it performed an action, simply by following the `parent_event` link.

SHRDLU has a simple discrete sense of time. Each time a basic action is performed, the internal time is updated by 1.
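The event store described above, with its discrete clock and `parent_event` traversal, can be sketched in a few lines of Go (the language NLI-GO is written in). The types and names here are my own illustration, not NLI-GO's actual internals:

```go
package main

import "fmt"

// Event mirrors the pick_up / start_time / end_time / parent_event relations.
type Event struct {
	ID     int
	Action string // e.g. "pick up", "get rid of", "stack up"
	Object string
	Start  int
	End    int
	Parent int // 0 means: started directly by a user command
}

// EventStore is a toy in-memory action memory with a discrete clock.
type EventStore struct {
	clock  int
	nextID int
	events map[int]Event
}

func NewEventStore() *EventStore {
	return &EventStore{nextID: 1, events: map[int]Event{}}
}

// Record logs an action, stamping it with the internal time,
// which advances by 1 per basic action.
func (s *EventStore) Record(action, object string, parent int) int {
	s.clock++
	id := s.nextID
	s.nextID++
	s.events[id] = Event{ID: id, Action: action, Object: object,
		Start: s.clock, End: s.clock, Parent: parent}
	return id
}

// Why answers "why did you perform this action?" by following
// the parent_event link one step up.
func (s *EventStore) Why(id int) string {
	e := s.events[id]
	if e.Parent == 0 {
		return "because you asked me to"
	}
	p := s.events[e.Parent]
	return fmt.Sprintf("to %s %s", p.Action, p.Object)
}

func main() {
	s := NewEventStore()
	stack := s.Record("stack up", "the red blocks", 0)
	rid := s.Record("get rid of", "the green pyramid", stack)
	pick := s.Record("pick up", "the green pyramid", rid)
	fmt.Println(s.Why(pick)) // to get rid of the green pyramid
}
```

The key design point is that answering "why" requires no reasoning at all once the hierarchy is stored: it is a single link lookup.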

Ellipsis

To understand the question "Why?", an understanding system needs to find out what exactly is meant, as if asking: "Why what?" Clearly part of the question is left out. This is called ellipsis.

NLI-GO treats this problem by declaring the ellipsis explicitly in the grammar. Here is the grammar rule that deals with it:

{ rule: interrogative_clause(P1) -> 'why',
  ellipsis: [prev_sentence]//mem_vp(P1),
  sense: go:intent(why, P1) $mem_vp }

The rule `interrogative_clause(P1) -> 'why'` rewrites the clause to the single word "why". The value of `ellipsis`, `[prev_sentence]//mem_vp(P1)`, is a path that leads to the missing part of the sentence.

In trying to match this rule, NLI-GO follows the path, starting at the current `interrogative_clause` node of the active syntax tree:

  • [prev_sentence] tells it to visit the previous sentence in the dialog; this is "When did you pick it up". NLI-GO is now at the root of this sentence
  • //mem_vp tells it to visit all mem_vp nodes anywhere below the current node 
The syntax is somewhat similar to that of XPath (used to navigate XML).

If such a `mem_vp` is found, this node is copied into the active sentence. The syntax tree of "Why?" is now extended, and the complete sentence looks like this:

Why did you pick it up?
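The two path steps can be illustrated with a toy syntax tree. This Go sketch shows the `//mem_vp` step (collect all matching descendants) and the copy into the elliptical sentence; the `Node` type is my own simplification, not NLI-GO's real representation:

```go
package main

import "fmt"

// Node is a toy syntax-tree node.
type Node struct {
	Cat      string
	Word     string
	Children []*Node
}

// findAll performs the "//cat" step: a depth-first search for the node
// itself and every descendant whose category matches.
func findAll(n *Node, cat string) []*Node {
	var out []*Node
	if n.Cat == cat {
		out = append(out, n)
	}
	for _, c := range n.Children {
		out = append(out, findAll(c, cat)...)
	}
	return out
}

func main() {
	// Previous sentence: "When did you pick it up?", with an embedded mem_vp.
	prev := &Node{Cat: "interrogative_clause", Children: []*Node{
		{Cat: "wh_word", Word: "when"},
		{Cat: "mem_vp", Children: []*Node{
			{Cat: "vp", Word: "did you pick it up"},
		}},
	}}

	// Current sentence: just "why". The ellipsis declaration says:
	// fetch a mem_vp from the previous sentence and copy it in.
	current := &Node{Cat: "interrogative_clause", Children: []*Node{
		{Cat: "wh_word", Word: "why"},
	}}

	if matches := findAll(prev, "mem_vp"); len(matches) > 0 {
		current.Children = append(current.Children, matches[0])
	}
	fmt.Println(len(current.Children)) // 2: the wh-word plus the borrowed mem_vp
}
```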

Resolving "it"

Something that may seem self-evident, but isn't really, is that "it" in the new sentence must refer to the same green pyramid that "it" in the previous sentence referred to.

NLI-GO should not try to resolve this "it" anew; it should simply inherit the value of "it" that was resolved in the "When" question. If it didn't, "it" might resolve to another object, which would be counter-intuitive.

(And yes, this is of course exactly what happened at an earlier stage of the software: "it" first resolved to SHRDLU, which is not even an "it", but NLI-GO had no idea.)
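The inheritance itself is conceptually simple: the elliptical sentence starts from the resolved bindings of the sentence it borrows from, instead of running reference resolution again. A minimal sketch, with hypothetical names:

```go
package main

import "fmt"

// inheritBindings copies the pronoun bindings of the previous sentence
// into the new sentence, so "it" keeps pointing at the same entity.
func inheritBindings(prev map[string]string) map[string]string {
	next := make(map[string]string, len(prev))
	for pronoun, referent := range prev {
		next[pronoun] = referent
	}
	return next
}

func main() {
	// After "When did you pick it up?", "it" was resolved to the green pyramid.
	prev := map[string]string{"it": "green_pyramid_1"}

	// "Why did you pick it up?" inherits that binding rather than re-resolving.
	cur := inheritBindings(prev)
	fmt.Println(cur["it"]) // green_pyramid_1
}
```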

The response: "it" again

The response to the question should be

To get rid of it

However, the response that NLI-GO initially gave was this:

To get rid of the green pyramid

It proved not so simple to get NLI-GO to just use a pronoun.

Centering theory

Interaction 25 is the first interaction that uses a pronoun in its response. Why would SHRDLU suddenly use a pronoun here? To the reader this is obvious: the object had been referred to as "it" before, so it is natural to keep referring to it that way. The fact that an object may be in the spotlight of a dialog has been worked out in centering theory. If an object was the preferred center of the previous sentence, it remains the preferred center in the current sentence. Also, a sentence's subject is more likely to become a new center than an object, which in turn is more likely than any other entity.
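The rule that produced "TO GET RID OF IT" instead of "TO GET RID OF THE GREEN PYRAMID" can be sketched as a single check: is the entity being mentioned the preferred center carried over from the previous sentence? This Go fragment is an illustration of that idea, not NLI-GO's actual generation code:

```go
package main

import "fmt"

// describe picks a surface form for an entity: a pronoun if the entity
// is the preferred center inherited from the previous sentence,
// otherwise its full description.
func describe(entity, previousCenter, fullDescription string) string {
	if entity == previousCenter {
		return "it"
	}
	return fullDescription
}

func main() {
	// "It" (the green pyramid) was the center of "When did you pick it up?",
	// so the response keeps referring to it with a pronoun.
	answer := "to get rid of " +
		describe("green_pyramid_1", "green_pyramid_1", "the green pyramid")
	fmt.Println(answer) // to get rid of it
}
```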

Concluding remarks

This seemingly innocuous sentence has kept me busy for four months. I had not seen it coming. It has been both a source of joy, for learning new concepts, and a continuous source of frustration. I needed to rewrite existing structures to enable these new features, and this broke several existing interactions. Still, the new framework is not at all robust; it needs a lot of work to make it simple to use. At the same time, I still feel this is a great field to work in, and it will eventually provide a level of determinism that machine learning will never reach. So it's worth it.



On SQLAlchemy

I've been using SQLAlchemy and reading about it for a few months now, and I don't get it. I don't mean I don't get SQLAlchem...