Life After Death, GPT-3, the Narrative of Self, and Schizophrenia
At first it seemed I was a paranoid doctor who was being treated for a heart condition, a serious one. But then I think I was his son. My son helps me deal with a stupid cardiologist, who I make fun of; we try to trick him, he can't even spell habitat, and we buy books on his expense account. My son stays with me to help me. One day we find out about a talk that is taking place. It's in a bookshop, hundreds have gathered, it's a book by an old acquaintance of mine, about his work on music and its effects on the brain. In the tent, I become my son. I don't want to sit with the father. Instead I see you. Sleeping, you're not very pleased to see me. You close your eyes. But later you come to talk to me. You say we can meet. You're kind of special. I'm glad you came to talk to me. I'm always grateful and relieved about you. There is excitement in the tent. But I realize there is an unusual group of people I know from my old days in research, gathered. They are moving to a corridor in a department. I go to find them. It's Stephane Gladstone, and Marshal Cummings, and Vinditha Breglu, and Raskot Emique. These guys know so much about phenomenological consciousness it just can't be a coincidence. Marshal used to send me crazy pamphlets in the olden days, ones wrapped in hard-to-open plastic, saying do not open. Instruction manuals. About life after death. WOM3 has come too, she's interested in the same thing as me, she came with her bicycle. We get them in a corner, there are too many people in this corridor now, and we say to them: Listen, we're here to talk to you all, we want to talk. We want to talk about those ideas you all had, about life after death. Don't worry, this isn't Blade Runner. About the possibility of our consciousness persisting after our death. In software. Who wants to talk about this? says Stephane, uneasy. What does it matter, the point is we want to talk, and everyone is here, all the right people are here, so we should talk, damnit!
You said you knew how to do it, you thought you did. A way to give a software agent first-person phenomenology, to make it feel it was still the person it had always been before its body died, so that it would be convinced it was still itself. O yes, Stephane says, those silly old chatbots, like the GPTs, that were trained only on first-person self-narratives from the "I-narrator" of the individual in question. What's the "I-narrator"? It occurred to me to ask this. The "I-narrator" is a simulator in your brain, of the world, something that allows you to imagine, for example, elephants playing chess. It's much better than the world because you can reset it, and build things with it like Lego, and run it at different levels of granularity and speed. It can night-dream and day-dream, and as you run it, it tells its own stories about what is happening. From the first-person perspective. From the perspective of the person that is you, who has emotions, drives, memories, and so on, and it does this in whatever language you speak. And these stories come from the I-narrator being able to interspect. Interspection is the examination of the other locolosses to see how well they are doing, from which motivations can be inferred. The motivations of the I, the self, which brings coherence to your story, evens out your personality, and lets it fit nicely into the I-narrative.
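The description above — a resettable world simulator that narrates its own runs in the first person — can be sketched as a toy program. This is purely illustrative: the class, method names, and state representation are my own inventions, not the system Stephane describes.

```python
import copy

class INarrator:
    """Toy sketch of the 'I-narrator': a resettable world simulator
    that narrates imagined events in the first person.
    All names here are illustrative, not from any real system."""

    def __init__(self, memories, drives):
        self.memories = list(memories)   # first-person episodic memories
        self.drives = dict(drives)       # e.g. {"curiosity": 0.9}
        self._checkpoint = None

    def checkpoint(self):
        # Better than the world: you can save it and reset it.
        self._checkpoint = (copy.deepcopy(self.memories),
                            copy.deepcopy(self.drives))

    def reset(self):
        # Roll the simulated world back to the saved state.
        self.memories = copy.deepcopy(self._checkpoint[0])
        self.drives = copy.deepcopy(self._checkpoint[1])

    def imagine(self, event):
        # Run the simulator on an imagined event and narrate it
        # from the first-person perspective.
        self.memories.append(event)
        return f"I imagine {event}."

narrator = INarrator(memories=["I grew up by the sea"],
                     drives={"curiosity": 0.9})
narrator.checkpoint()
print(narrator.imagine("elephants playing chess"))  # first-person narration
narrator.reset()
print(len(narrator.memories))  # back to 1: the imagined event is undone
```

The point of the checkpoint/reset pair is the "like Lego" property in the text: imagined episodes can be built up, torn down, and replayed without consequence, unlike events in the real world.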
I expressed my incredulity at such a software system ever being able to interface effectively with the wetware of any individual's brain in order to master a real wet "I-narrative", but the damp air, the mists of San Francisco, began to make themselves known to me, like a tropical jungle, warm, and I could smell the donuts, and I could feel that it was morning, and that there was hope, it was early dawn, and I felt actual real hope. Tender hope, fragile, like something out of the 1970s. I didn't know how it had been done, but I felt like I was their child, who had found my creator, who had explained to me how I had been formed. Had explained to me that feeling of Dasein I had once, for about 23 seconds: total doubt about why there was anything at all; my I-narrator being unable to access my locolosses for a few moments, and therefore unable to make clear in an instant why, for example, sticking your tongue out at the queen is such a bad idea that you shouldn't even propose that move for Monte Carlo Tree Search.
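The aside about the queen makes a concrete algorithmic point: some actions are filtered out before a search procedure ever evaluates them. A minimal sketch of that pruning step, with entirely hypothetical names and a hard-coded taboo set standing in for whatever the locolosses would supply:

```python
# Toy sketch (all names hypothetical): social taboos act as a filter
# that removes absurd actions before Monte Carlo Tree Search expands them.

FORBIDDEN = {"stick tongue out at the queen"}  # learned taboos, hard-coded here

def propose_moves(candidates):
    """Return only the moves worth handing to the tree search."""
    return [m for m in candidates if m not in FORBIDDEN]

moves = ["bow politely", "stick tongue out at the queen", "shake hands"]
print(propose_moves(moves))  # the taboo move is never even proposed
```

When the filter is unavailable, as in the Dasein moment described above, every candidate suddenly looks equally plausible, which is one way to read that 23 seconds of total doubt.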
The four Schneiderian symptoms of Schizophrenia have at their basis a disintegration of the first-person narrative, with interference between our ability to identify our own emotions and drives and what we perceive as others' emotions and drives. We see above that this can come about through something I call the I-narrator, a narrative-making engine which tries to tell a linguistic story from internally observed sense data, a process whose conscious correlate is introspection; the process itself I call interspection, because it is an algorithmic, non-conscious process. Interspection may not be sufficient for introspection, but it is one of its ingredients, I think, and I expect that its psychopathology underlies some psychotic conditions such as Schizophrenia, less harmful philosophical experiences such as Dasein, and the general stability of the world that we are blessed with experiencing in more normal states.
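Interspection as defined here — an algorithmic pass over the other locolosses that checks how well each is doing and infers a motivation — can be sketched directly. Everything below is an assumed toy model: the locolosses are modeled as plain dicts with a single "wellbeing" score, and the inference rule (attend to the worst-off subsystem) is my own stand-in, not a claim about the actual mechanism.

```python
# Toy sketch of 'interspection': a non-conscious, algorithmic examination
# of internal subsystems (the 'locolosses', here plain dicts), from which
# a motivation is inferred and then narrated in the first person.
# Names and the inference rule are illustrative assumptions only.

def interspect(locolosses):
    """Infer a motivation from whichever subsystem is doing worst."""
    worst = min(locolosses, key=lambda name: locolosses[name]["wellbeing"])
    return {"focus": worst, "urgency": 1.0 - locolosses[worst]["wellbeing"]}

def narrate(motivation):
    # The I-narrator folds the inferred motivation into the self-story.
    return f"I feel driven to attend to my {motivation['focus']}."

state = {
    "hunger": {"wellbeing": 0.3},
    "social": {"wellbeing": 0.8},
}
print(narrate(interspect(state)))  # -> "I feel driven to attend to my hunger."
```

On this toy model, the disintegration described above corresponds to `interspect` failing or returning noise: motivations can no longer be attributed to the self's own subsystems, so the narrative loses the "mine versus theirs" distinction the symptoms turn on.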
