For Grosz, time can only be thought of “when we are jarred out of our immersion in its continuity, when something untimely disrupts our expectations” (5). Her insistence that we think of ourselves as subjects immersed in time, coupled with the idea that we can be jarred out of it, seems to suggest that we have a choice in the matter: that thinking of ourselves as immersed in it is what immerses us in it, and that in being “surprised” we actually pop out of it, or at least our consciousness does (5). She problematizes her own language, however, when she says that
We can think of it only in passing moments, through ruptures, nicks, cuts, in instances of dislocation, though it contains no moments or ruptures and has no being or presences, functioning only as a continuous becoming.
revealing, on one hand, a certain disingenuousness of language (she doesn’t really think that any of this is actually happening), and, on the other, a curious idealism: in this model the locus of what is being apprehended is the mind of the perceiver, not the thing, not anything material. Granted, this is precisely the sort of slipperiness that all considerations of time must deal with, but Grosz’s text never clearly delineates what is philosophy – apprehending the subject and/or the object – from what is concept work – ways of thinking about something to a certain end.
An analogue that might describe the phenomenon that Grosz calls a “nick,” and that presents a metaphorically opposing view, can be found by thinking about processing power, or speed. Let’s say that in our story the main character is a robot that has attained sentience: she is just like a human in all of the necessary systemic connections, only she is able to process the data she apprehends at speeds far faster than her flesh-and-blood creators. One day she goes for a walk, and at a particularly busy intersection of the vast and technologically advanced metropolis in which she lives she hears the horn of a large truck honking frantically. Her cybernetically improved neck swivels her face in the direction of the sound and the beautiful brass gears of her impressive CPU shift into high gear: she sees three men standing on three different corners; two are pointing towards a large red truck coming from the north (one with his left hand, one with his right), while with their opposite hands they are waving frantically in the direction of three people spread out at various distances along a sidewalk (almost immediately she calculates a ranked probability of whom the men are waving at). There are six other vehicles, five pedestrians, four sets of streetlights, three quadrupedal quasi-domesticated canines, eight buildings, and ten tree-like obstacles in the immediate vicinity, and an exponentially multiplying number of possible outcomes and explanations for the scene as it unfolds. It might seem that the above scenario defies anything like an inquiry interested in intents or purposes, and we might object, as Howard the hard-nosed engineer did, that no one will ever be able to think that fast, so it is useless even to think about. But consider the simple linguistic difference between a surprised robot and a surprised human, each later asked to give a statement to an officer of the peace.
I heard the honking of a truck’s horn, and when I looked up there were three men standing on three different corners; two were pointing towards a large red truck that was coming from the north (one with his left hand, one with his right), while with their opposite hands they were waving frantically in the direction of three young people spread out at various distances along a sidewalk. There were also six other vehicles, five pedestrians, and three dogs that were actually in the road.
I heard a horn, and I looked up, and there were people waving and yelling, and these dogs were barking.
What the nick of the surprised robot would look like on a timeline, then, would be more like a balloon-like growth (fig. 1). The story of the robot’s morning would be a dot at which she woke up, a straight line while she was walking to work, and then the bulb, wherein all the text processing everything that happened filled out the moment of being greatly aware of time via the events that unfolded before her; the line would straighten again once she recorded her statement and trotted duly off to perform the labor that humans were increasingly shunning, where there would be another dot. Language, then, has the capability to conceptually – not actually – mimic the acceleration of processing data, an analogue to becoming more aware of something as it happens “in time.” What this suggests, I hope, is that it is possible to think of or about time in moments other than these so-called “nicks.”