Barriers To Entry

My wife Lindsay attends a women-only fitness club here in Toronto. Like many other buildings with intermediate-level security requirements, the club features a database-controlled electronic swipe card system that automatically monitors access.

Deleuze:

The conception of a control mechanism, giving the position of any element within an open environment at any given instant (whether animal in a reserve or human in a corporation, as with an electronic collar), is not necessarily one of science fiction. Felix Guattari has imagined a city where one would be able to leave one's apartment, one's street, one's neighborhood, thanks to one's (dividual) electronic card that raises a given barrier; but the card could just as easily be rejected on a given day or between certain hours; what counts is not the barrier but the computer that tracks each person's position–licit or illicit–and effects a universal modulation.

Linds inadvertently found a vulnerability in the system recently when she realized that the club hadn't been charging her for some time, though her swipe card was still validating. Undeterred, she continued to use the facilities on a periodic basis, until finally one day the computer at the turnstile refused to yield entry, gave a loud beep, and instructed her to report straight to the front desk.

Her options: pay the not-insignificant arrears on the account and receive a slightly lower monthly rate thereafter, or begin a new membership with the club for the slightly higher introductory rate she had originally been paying. Since she believed the error to be the club's, Linds ultimately chose the latter (and less expensive) option.

One thing appears certain in this case: in an era of increasingly ubiquitous pantactile awareness, we tend not to internalize any marginal increase in the intensity of the control system's touch-sense. When Bentham's guard is temporally removed from the tower, so to speak, the threat of punishment is not internalized to the same degree that it otherwise would be. The imperative of the automated system, then, is to reduce this temporal gap between signal and engagement.

As DeLanda argues convincingly, much of the "intelligence" required to accomplish this goal in the military control system is being downloaded to computers. A similar shift is taking place with other systems of production, in business and elsewhere. But the system is only as good as its databases, algorithms, rule-sets, etc. If we are to understand databases as cellular, organic processes, then we must understand them as sites of rot, decay, and chaos. I presume that this is how Linds slipped through the gaps in the system.
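To make that failure concrete, here is a minimal sketch, assuming (since the club never explained it) that the turnstile consults only whether a card is marked active, that billing lives in a separate table, and that a periodic reconciliation job is what eventually flags the arrears. Every table name, field, card ID, and date below is invented for illustration; none of it is the club's actual system.

```python
# A minimal sketch of how an access check can drift out of sync with billing.
# Everything here is an assumption: the table names, fields, card IDs, and
# dates are invented for illustration, not the club's actual schema.
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cards   (card_id TEXT PRIMARY KEY, active INTEGER);
    CREATE TABLE billing (card_id TEXT PRIMARY KEY, paid_through TEXT);
""")
# The card is active, but the billing record lapsed long ago:
# two tables, each holding its own version of the truth.
conn.execute("INSERT INTO cards VALUES ('CARD-001', 1)")
conn.execute("INSERT INTO billing VALUES ('CARD-001', '2005-01-31')")

def turnstile_admits(card_id):
    """What the turnstile checks: whether the card is active, nothing about payment."""
    row = conn.execute("SELECT active FROM cards WHERE card_id = ?",
                       (card_id,)).fetchone()
    return bool(row and row[0])

def reconcile(today):
    """The periodic job that finally closes the gap between signal and engagement."""
    conn.execute("UPDATE cards SET active = 0 WHERE card_id IN "
                 "(SELECT card_id FROM billing WHERE paid_through < ?)",
                 (today.isoformat(),))

print(turnstile_admits("CARD-001"))  # True: swipes keep validating despite the arrears
reconcile(date(2005, 6, 1))          # the computer at last notices the lapsed account
print(turnstile_admits("CARD-001"))  # False: report straight to the front desk
```

Until the reconciliation job runs, the gate keeps opening; the barrier, as Deleuze has it, matters less than the computer that eventually catches up.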

Comments

One response to Barriers To Entry

  1. Colin Wellum says:

    All systems are cellular. How could they not be born, rot, decay, and be organic processes, when we are only human? Humans have been the system of study for so many years, and we are still learning about them. The human body: the epitome of perfection, according to today’s scientists. Yet humans can break their bones, decay, rot, wither away, and be forgotten. Therefore, it is impossible to create the perfect system, for we can only create something as good as ourselves. Fractures (broken bones) of a system’s infrastructure can occur with as small a force as failing to read all the bars in a bar code, much like your wife’s card key at the gym. A system can be weak, based only on interpretation or fallacy instead of truth (rot/decay). A system can be replaced as easily as a change in power; the new power brings its own ideologies and beliefs, which are adopted and replace the “old” because the new system holds the power. Finally, all systems must have a beginning and an end. That end need not be as dramatic as a death; it could simply be the launch of a new system through a change in ideology or power. However it comes about, the end will always be there, like a blanket that always leaves your feet cold.