At the very least, semiotics is a good idea. Trying to understand how humans make sense of their lives and their environment, and how they interact with and influence one another through signs, is a fundamental endeavour. Semiotics, under whatever name, was bound to emerge once early humans evolved the cognitive capacity to encompass more than a single point of view, to represent others as sources of signs, and to discriminate between intentional and non-intentional signs, with considerable adaptive consequences. Now known as Theory of Mind (ToM) in evolutionary and developmental psychology, this competence is double-edged, as all adaptations are: it enhanced the deliberate sharing of information toward cooperation and enabled us to guess and interpret each other’s agendas; but it also drove humans to construe natural phenomena as intentional agencies and to elaborate cosmologies in the form of conspiracy theories whose consequences were not always benign.
The urge to understand the nature and range of signs is all the more pressing at a time when quantitative changes in representation and communication technologies are creating qualitative transformations in what it means to be human. In the wake of Darwin’s theory, James Mark Baldwin (1861-1934) called attention to the impact of the built environment on human evolution itself. Culture is indeed a major evolutionary force. Over the last few centuries, human knowledge has increased at an ever faster pace. Notwithstanding the despondent philosophers who construe science as a mere succession of paradigms that cancel one another out as time goes on, scientific knowledge has proven itself cumulative, even if some models and theories had to be discarded along the way. After all, theories are what we build conceptually to make up for our ignorance. For a long time, painstakingly observed and recorded facts – what we now call data – were relatively sparse, and when they reached a critical mass their handling became a conceptual and computational challenge. When “facts” were only suggestive, theories were elaborated to make up for this lack of certainty and to fill the gaps with speculations and generalizations, in the hope that new data would prove them right and practical applications would vindicate them. Nineteenth-century semiotics was no exception. Is it not time for semiotics to move on to the twenty-first century and leave behind the ad hoc examples and thought experiments carried over from its scholastic past?
As we enter the age of transformation and witness the emergence of a new epistemological frontier, a time when research becomes data-intensive, new methods of inquiry are called for. This is the theme of a recent collection of articles written through the collaborative efforts of scientists and Microsoft researchers, The Fourth Paradigm: Data-Intensive Scientific Discovery (2009), edited by Tony Hey, Stewart Tansley, and Kristin Tolle. The opening chapter, which sets the theme of the book, is the transcript of a talk on e-science given by the visionary Jim Gray (1944-2007) shortly before he disappeared at sea. The book’s four parts deal successively with “earth and environment”, “health and wellbeing”, “scientific infrastructure”, and “scholarly communication”. New means of monitoring and recording processes in these domains, notably the exponential proliferation of sensors that gather information 24/7, create such an abundance of data that our ways of making sense of it risk falling behind. Research must move from a situation in which, in centuries past, the quantity of data was manageable within the scope of human working memory, to an age of data availability that exceeds the capacity of the relatively simple algorithms of nineteenth-century methods of scientific discovery. When data reach the level of terabytes and keep increasing in quantity and quality, new ways of progressing on the path of human knowledge are needed.
The Fourth Paradigm [www.fourthparadigm.org] selectively addresses some of the most pressing issues: environmental science, health care, scientific infrastructure, and the communication of knowledge. Much in this collective work is relevant to the concerns of twenty-first-century semioticians. In this age of transformation, semiotics must move from a data-poor and speculation-rich endeavour to full engagement with its new epistemological environment. Recoiling into mediaeval models on the pretext that the human mind must preserve itself against the assault of technology is self-defeating. As Andy Clark’s provocative book Supersizing the Mind: Embodiment, Action, and Cognitive Extension (2008) convincingly suggests, the elusive human mind is coterminous with the tools it creates. The data stored in the clouds are neither more nor less a part of us than those recorded in books and individual memories. They are as liable to be lost as those preserved for most of our lifetime within the neuronal clusters of our brains. But social distribution and connectivity are key features of human cognitive resilience and biological survival. As technology expands the reach of our organs of perception well below and beyond the thresholds of the natural phenomena for which natural selection fine-tuned our brains, the resulting information overload exceeds both the storage capacity and the computational capability of individual brains. The challenge is to create powerful new algorithms able to mine this rising tide of data intelligently and make sense of it. Evolution never rests.