Nothing less than the problem of meaning, in a holistic sense, surfaces when language is algorithmically reproducible. (Location 109)
The wager I am making in what follows is that this problem can be addressed only if linguistics is extended to include poetics—and, in fact, poetics encompassing linguistics and much else—reversing the assumption that reference is the primary function of language, grasping it rather as an internally structured web of signs. That this idea of language is computationally tractable—especially before logic, mathematical reasoning, or other features of cognition could be reproduced by machines—should surprise engineers and cultural theorists alike. A theory of meaning for a language that somehow excludes cognition—or at least, what we have often taken for cognition—is required. (Location 110)
GPT stands for “generative pretrained transformer,” but it can hardly be an accident that the more common acronym used to be “general-purpose technology,” a phrase awarded by economists to things like the printing press and the steam engine. (Location 171)
The Owl of Minerva can no longer afford to be nocturnal. (Location 192)
In 1990, the cognitive scientist Stevan Harnad coined the phrase the “symbol grounding problem.” He argued that symbolic systems threatened to float free from any base, because their arbitrary nature meant that combination alone could not ensure semantic validity. (Location 260)
Note: What makes them arbitrary? This is with reference to metaphor: symbolic representations as appropriate rather than sensually accurate. Consistency and coherence - see Metaphor, p. 21
But it is simply not clear how language is grounded and what role sensory information plays for humans, let alone for symbol-systems running on statistical pattern recognition. (Location 274)
Only in abandoning a putatively known or future-tense “empirical” form of reference-first language do we do away with what I call remainder humanism and enter the strange world of computational signs. (Location 395)
“final station” of all meaning—last-instance semiology is my term. Language is not the sole source of meaning—far from it—but an always-present and usually necessary choke point through which other forms of meaning must pass. (Location 406)
“the reported achievements of LLMs [are] often heralded for their ability to perform a wide array of language-based tasks with unprecedented proficiency”; (Location 422)
Note: The amazing thing is how all tasks can potentially be language-based tasks, apart from e.g. counting
In the case of culture machines, the impression left by reading in a wide variety of fields we feel called to investigate is that we do not know what we are looking at, exactly. (Location 431)
On one hand, we oppose culture to art. (Location 466)
Note: We differentiate - not sure we oppose
On the other hand, not until culture took on its modern meaning could it be opposed to something like cognition. And only then—now well into the twentieth century—would it be possible for it to be a question of evolutionary theory as well as cognitive science. Culture had to be global before it could become posthuman. (Location 469)
Note: I don't understand this
I think that the semiotics of these machines means that they can contribute to meaning construction noncognitively, (Location 482)
Clifford Geertz’s elegant phrase is that “man is an animal suspended in webs of significance he himself has spun… I take culture to be those webs.” (Location 509)
Meaning and culture come together in semiotics, not in psychology or its philosophical and scientific counterparts. This creates a strange affinity between computer and data science and literary theory, two discourses in which all claims must be justified by reference to chains of signifiers, no matter what one “believes” in the background. (Location 519)
And yet, if cognition is the molten core of the mind, then how far can any artifact the mind makes really deviate from it? (Location 529)
Note: OK, I think I see what this means: that cognition is to a degree independent of the culture to which it contributes, vs. the idea that “playing in cultural signs” may produce valid cultural artifacts
It is not clear that any learning system can be freed from the “corpus” of data from which it learns—even if some data are more easily assumed to be reflective of long-term steady states in the world. (Location 530)
Note: Because data is selected and corrected, algorithmically optimised for outputs (incidentally, determined by culture). And still, again, it is literally possible to separate corpus from “cognition” (algorithm - the rules or guidelines for what constitutes optimal thought and output - plus compute)