What Has Become of Time? Temporal Smearing and Media Theory

By Gary Genosko and Paul Hegarty

University of Ontario Institute of Technology and University of Nottingham.

Gary Genosko

Since 1972 a positive leap second has periodically been introduced into global time standardization systems, not exactly an addition, nor quite a subtraction, owing to the discrepancy between Coordinated Universal Time (UTC) and International Atomic Time (TAI). Because the Earth’s rotation is slowing, leaving the diurnal day a few milliseconds beyond 24 hours, UTC lags behind the hyper-accurate and stable mean time of the hundreds of atomic clocks located around the world. The machinic insertion of compensatory time on New Year’s Eve 2016 marked the 27th occasion on which such positivity was required. It will not be the final one. There is now a difference of -37 seconds between UTC and TAI: 27 insertions beyond the original difference of 10 seconds established in 1972.

Google’s explanation of its handling of the leap second on its Network Time Protocol (NTP) servers is couched in terms of a temporal smear: a 20-hour window that crosses the midnight of New Year’s Eve and thus spans two calendar years. Within the machine worlds of its servers, Google’s smear began at 2pm UTC on 31 December 2016 and continued until 10am UTC on 1 January 2017.
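The arithmetic of that window is worth pausing over. A back-of-the-envelope check in Python (the boundary times are the ones Google published; the rest is simple date arithmetic):

```python
from datetime import datetime, timezone

# Google's published 2016-17 smear window, in UTC.
start = datetime(2016, 12, 31, 14, 0, tzinfo=timezone.utc)
end = datetime(2017, 1, 1, 10, 0, tzinfo=timezone.utc)

window = end - start
print(window)                       # 20:00:00 -> a 20-hour window
print(window.total_seconds())       # 72000.0 seconds in which to absorb one second
print(start.year != end.year)       # True: the smear spans two calendar years
```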

Until recently, the leap second has been a consensual, if mildly uncanny, adjustment, a para-governmental temporal wobble. But Google is intent on taking ownership of the smear and transducing it into a technologically stabilised change. The demands and complexification of IT networks and systems made the Great Leap Backward of a whole second a risk factor. So instead of one ‘massive’ jump, Google sought and realized a long, dispersed leap that became a chronological coating or frosting: a smear.

Although there are a number of different strategies for smearing time, Google advocates a standard smear that it wants other digital giants like Bloomberg, Amazon and Microsoft to replicate. The next smear would be across 2018-19. A longer smear further minimizes frequency change, keeps it within the range of error for quartz oscillators, and has the added benefits of simplicity, calculability, and stability. It is known within the corporation as a cool workaround. Ironically, though, what Google has actually achieved is the insertion of error across a long period of time rather than taking a one-second hit. One second is, of course, epochal in code-time, but the scale of error-insertion is properly colossal. This is why Google has had to disable the capacity of its servers to detect leaps: any smear must not attract the correction that normally follows abruptness of any sort. It seems that ‘real time’ is inherently drift-based, and the smear may constitute a neo-temporal reality principle. Watchmakers have known this for some time, since the so-called ‘quartz revolution’, first in clocks in the 1920s and later in wristwatches in the late 1960s. Quartz crystals were the triumph over drift (variation) because, literally, they themselves had no moving works. Stone time became the basis of the clock as digital device, which is ultimately where the drive to smear began.

Paul Hegarty

If I had enough time I would analyze Google’s temporal strategy in terms of its affinities with, and departures from, the classical view of time in Aristotle’s core considerations in Physics Book IV, that is, in terms of a consonant enumeration, but in our example at variable speeds/intervals. Google seems to have settled on a linear as opposed to a cosine smear. The difference is subtle: in a cosine smear, smearing picks up speed very slowly, accelerates through the middle of the window, and slows again near the end; in a linear smear, the server clocks run slower than usual at a constant rate, so that relative to UTC they run behind before the leap and ahead after it, the offset shrinking over the remainder of the window until smeared time equals leaped UTC time. Both seem strange in Aristotelian terms, as they involve temporal speeds and slownesses, yet they reflect in a radical manner the enumerative aspect of time. Instead of acceleration as the change of velocity over time, time itself becomes a function, a derivative of an acceleration process that needs, in some ways, to be thought of as anti-time.
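The two curves are easier to see than to paraphrase. A minimal sketch, assuming the offset formulas Google has described for its smears (the window length is the 2016-17 one; the function names are mine):

```python
import math

WINDOW = 72_000.0  # the 20-hour smear window, in seconds

def linear_offset(t: float) -> float:
    """Seconds of leap absorbed after t elapsed seconds: a constant rate."""
    return t / WINDOW

def cosine_offset(t: float) -> float:
    """Seconds absorbed after t elapsed: slow at the edges, fastest mid-window."""
    return (1.0 - math.cos(math.pi * t / WINDOW)) / 2.0

# Both absorb the full second by the end of the window,
# but they distribute it very differently along the way.
for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    t = frac * WINDOW
    print(f"{frac:4.0%}  linear={linear_offset(t):.3f}s  cosine={cosine_offset(t):.3f}s")
```

A quarter of the way through the window, the linear smear has already absorbed 0.25 seconds while the cosine smear has absorbed only about 0.146; the cosine curve then catches up through the middle, which is exactly the mid-window acceleration described above.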

I would then shift from the classical framework and consider the introduction of leap smears within the Google chronosphere as an example of Wolfgang Ernst’s conception of time-critical media in his book Chronopoetics. Leap seconds conform to Ernst’s sense of kairotic time, an auspicious micro-moment that is both techno-mathematically pre-defined and decisive for ensuring operationality. Google’s execution of time-critical processes establishes its mastery over the measurement and manipulation of humanly imperceptible micro-temporal events. Measurement is crucial to time-criticality and a leap second can be further divided into smaller units and precisely distributed across the smear. Google reasserts the primacy of the relation between time and number and the enduring legacy of Aristotle for media archaeology, which distances itself from Bergsonian duration in order to embrace techno-mathematical time, a point underlined by Ernst first in relation to Bergson’s rejection of Aristotelian countable time. In fact, Bergsonian duration is shown by Google’s smear tactics to be a vacuous, humanist comfort temporality, an extended philosophical spa treatment.

Since time is short, I will simply say that smearing introduces a neo-temporal principle that surpasses Ernst’s sense of the time-criticality intrinsic to media, not by re-introducing history in the form of a mass consciousness of the effects of computer time, as with Y2K, but as a 21st-century corporate power flexing its leadership muscles: reformatting not damage but ontology itself, making it a matter of technological control through the plasticization of time’s passage. What does it mean to master the drift of ‘real time’ by imposing a management strategy on the whole of the cybersphere?

The focus on the deconstructed leap second itself is at the heart of Google’s reasoning. Whether or not this will be taken up generally will be seen in the years to come, but it is thought to be an improvement over the backward-jump approach, which may disrupt synchronicity between software and server clocks. Already in 2011 the official Google Blog published this statement by site reliability engineer Christopher Pascoe:

Computers traditionally accommodate leap seconds by setting their clock backwards by one second at the very end of the day. But this “repeated” second can be a problem.

For Google, problem avoidance required telling what is called a “small lie” to its servers, preventing them from activating the leap indicator field that requires the final minute of the day to have either 61 or 59 seconds; instead, milliseconds are smeared across every update over a set duration, maintaining the lie that no corrective response is required. Google has already begun to mess with time’s arrow. The corporation’s engineers think the servers remain in this way “blissfully unaware.” Servers in most locations are time-determining, not time-subject, so it is hard to see how they could usefully be ‘fooled’; but it should be recalled that this anthropomorphic talk is for public consumption.
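Conceptually, the “small lie” can be sketched as a clock that answers every query with a slightly doctored reading and never raises the leap flag. What follows is not Google’s code, only a guess at the shape of the logic, assuming a linear smear; the constants and names are mine:

```python
import time

SMEAR_START = 1_483_192_800.0  # 2016-12-31 14:00:00 UTC as a Unix timestamp
WINDOW = 72_000.0              # the 20-hour window, in seconds

def smeared_now() -> float:
    """A hypothetical smearing clock: it never leaps and never signals one.

    Rather than setting the NTP leap indicator (which would announce a
    61-second final minute), each reading is nudged back by the fraction
    of the leap second absorbed so far, so clients drift instead of jumping.
    """
    t = time.time()  # pretend this is an unsmeared upstream reference
    elapsed = min(max(t - SMEAR_START, 0.0), WINDOW)
    return t - elapsed / WINDOW  # the offset grows smoothly from 0 to 1 second
```

The design point is that clients see nothing to correct: no 61-second minute, no flag, only a clock that is imperceptibly slow for twenty hours.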

If this seems abstract, remember that the most powerful part of the global economy runs on this sector of machine time. Entrepreneurial time distorters reminded us of the materiality of time when they began building (in 2009) a super-fast cable between Chicago and New York to gain some 4 milliseconds in trading time.

Although smeared time suggestively evokes the soft clocks of Dalí and asks to be narrativized, clock smears are a digital upgrade of this fascination with, and paranoia about, time’s progress. Google’s time lies play with the countability of time, one of Aristotle’s core considerations about time in Physics Book IV: “time is not movement, but only movement insofar as it admits of enumeration” (219b2-3). Time is marked by befores and afters, the non-simultaneity of nows, and admits of neither speeding up nor slowing down: “change is always faster or slower, whereas time is not” (218b14-15). Google’s leap smear highlights the countability of time by managing discrepancies, but it does so in a manner that would shock Aristotle: it manages the speed of time, manipulating server clocks to achieve agreement.

Google’s smear action tells us that time is no longer universal, despite the existence of, and insistence on, UTC, which incorporates the ‘leap second’ precisely in order to manage a totalized time system. The smear reveals that this time is in fact malleable at the very moment when institutions, organizations, cults or followers of professional football’s transfer deadlines try to mesh with universal time. Google attempted to fix time through its long fudge, and in so doing tried to help fixed time stay that way. But in fact, time is now being generated in multiple locations. When Google spoke of regularizing time for distributed systems, it could have looked just a little further and seen ‘distributed time’, a concept that networked informatics may find problematic even as it seeds it.

The leap second is not the first adjustment of time that looks like human intervention into ‘real’ time. Its 27 iterations in 45 years mean that, at this rate, an entire day would take well over 100,000 years to accumulate. This is still nothing compared to the transition (in Europe) from the Julian to the Gregorian calendar, first operational in several countries in 1582. This change was designed to correct the gradual drift between the human and astronomical calendars, and ended up moving everyone who adopted it forward by ten days (eleven by the time Britain adopted it in 1752). Whilst it seems that Britain did not riot about this, many did believe they were eleven days closer to death as a result of the change. Maybe in a few years that increasingly marginalized island will have its own Brexit time, one that is not really running properly, a historical calendar made all the more strange by the effects of time-critical media operations. That will be a sad and defining irony, given Britain’s role in pioneering standard time to enable the co-ordination of rail travel (1840-48), leading into the spacetime land grab of the zero meridian that is Greenwich Mean Time, accepted internationally in 1884.
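The extrapolation behind that opening figure is simple division, assuming (rashly, given the Earth’s uneven slowing) that the insertion rate since 1972 holds constant:

```python
SECONDS_PER_DAY = 86_400
leaps_per_year = 27 / 45                  # insertions since 1972, per year
print(SECONDS_PER_DAY / leaps_per_year)   # 144000.0 -> years to accrue a full day
```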

The question is not so much whether Google has power similar to that of the British Parliament in introducing the ‘Calendar (New Style) Act’ in 1750, with its premonitory declaration of the future loss of eleven days, programmed two years ahead. Rather, it is to ask what kind of power that would be. What would its modality be, once it was in and of time? This is not just about controlling time from above, but about a Foucauldian network of time as potency. Smearing time is a disciplinary measure: it is about saving time that would otherwise have to be devoted to “inspecting and refactoring code,” a recurring exercise and expense that Google wants to avoid. Smearing requires technical adjustments of temporal speed that are relational to unsmeared time (UTC) and distinct from TAI. This is not a question of the constancy of the speed of light, but of parameters of stretchiness in the service of synchronization. It is a good example of what Ernst dubs chronopoesis: technically ordered passages framed by a variety of international standards and measures, for the sake of operationality, all of it highly pre-defined, but not inevitable, as the question of further insertions of leap seconds was tabled until a later date at a recent meeting of the International Earth Rotation and Reference Systems Service (IERS). And with this deferral we slip out of media time into the historical-cultural time of a scientific bureaucracy. But we need to take this further. Might the international time-keepers find themselves susceptible to a process that is less kairotic and more in the sphere of convulsive flux? Or is this precisely what they have always wanted to prevent?

In 1950 Alan Turing posed a cryptic challenge: are discrete state machines with discontinuous binary encodings really possible? His answer was that there is continuous movement between binary states. He defined discrete state machines as “machines which move by sudden jumps or clicks from one quite definite state to another. … Strictly speaking there are no such machines. Everything really moves continuously. But there are many kinds of machine [i.e., digital computers] which can profitably be thought of as being discrete state machines.” The time smear demonstrates why this is so, as it seeks to avoid the abruptness of the digital, the raggedness of leaps, attention-grabbing and alert-triggering irregularities. Remember that Turing was wondering in his paper whether a discrete state machine could pass as a continuous one in the imitation game that is still being played with today’s androids. The answer is yes, as long as the servers are dumbed down, a point that would not have occurred to Turing.

Playing with the temporal continuum of clocks positions Google’s strategy as focused on the leap second itself, smeared over 72,000 seconds rather than tacked onto the beginning or end of the day, ameliorating the machinic incomprehensibility (non-representation) of a 61st second. Sub-second smearing is a form of fractional dilation of each second. The Google slow slew of time is a bold step in its capacity to exercise sovereignty over innovations in mediatizations of time. Carl Schmitt argued in Political Theology that the ‘sovereign is he who decides on the exception’. In other words, to decide on exceptionality is also to decide when we are in the non-exception. So if Google can decide on the exception that is the leap second, with its concatenation of smoothed errors, it will also decide when the exception is not current, and will control all other time.
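To put numbers on the fractional dilation mentioned a moment ago (a quick sketch; the variable names are mine):

```python
WINDOW = 72_000.0  # the 20-hour smear, in seconds

stretch = 1.0 / WINDOW
print(f"each smeared second lasts {1 + stretch:.7f} s")   # ~1.0000139 s
print(f"i.e. it gains {stretch * 1e6:.1f} microseconds")  # ~13.9 microseconds

# Midnight falls 10 hours into the window, so at the stroke of the new year
# the smeared clocks sit exactly half a second away from unsmeared time.
print(f"offset at midnight: {36_000 / WINDOW:.1f} s")     # 0.5 s
```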

All non-exceptional time, all the models of that time run by hold-out agencies, will slowly become nothing other than non-smeared time. Google’s vision is of a future populated with its own corporate ontology, worked out in its mini-city-states like Toronto’s Quayside. Here is a version of what Foucault might have called networks of mutually assured exceptionalization, in the form of carceral data neighbourhoods built on strategic alignments of power/knowledge in one of the fundamental categories of existence. One rather pressing question remains: what would a Google spacetime look like? Stay tuned.
