Loyalty cards are loyalty simulators. The corporate entity wishes to retain your business, so it assesses your service to it quantitatively. Your class, the level you have achieved in the loyalty system, triggers a specific script in the corporation's workers. Loyalty is an animal feeling, for people and dogs. If we think of a corporation as a rule-driven machine, one without emotions or emotional understanding of its own, or at least as something that is not an animal, we can see why the loyalty must be simulated. From the corporation's perspective, loyalty is something quantified for the advantage of the organization. In happy circumstances this is a symbiotic collaboration, profit gained for services rendered, though the relationship can be parasitic too, with the corporation exploiting its customers, or customers exploiting loopholes in a poorly designed loyalty system.
Now many people in service roles work hard to genuinely help people out, and really do like to see people looked after. I'm not dismissing (or idealizing) that human connection. The two elements that distinguish a loyalty program as a simulator in a fairly precise sense are, first, the corporate-authored script, and second, the systematic tracking of loyalty state (microsocial class), which in turn determines the roles of actors in the script. Humans write the scripts, and then different humans play them out. We go along with the script, as workers and customers, because it is our job, or because it is convenient or pleasant. In working with the script it's hard not to have some emotional response, however attenuated, and loyalty simulation becomes loyalty stimulation. (Perhaps your formidable willpower, dear reader, means you never respond emotionally to corporate transactions, and have never cursed at a late charge on a phone bill, but I certainly have.) At this point we grant the corporation some agency, and it is simulating loyalty in much the way the AI in a Turing Test simulates intelligence.
You can expand this viewpoint to encompass a whole worldview of corporate simulation, as in Baudrillard's descriptions of Disneyland, "capturing all the real world to integrate it into its synthetic universe". If we focus instead on the script, the actors, and the tracking of state, we can see loyalty programs as early adopters of gamification. Airline gold class status is the original level treadmill. The scripts of greeting are little interactive fictions, choose-your-own corporate adventures. Help desk scripts run the same way: once you exit the robot-driven entry point (press 2 to choose Mandarin, etc.), you simply enter another script with first-level support, where, initially, your lines and choices are still very constrained.
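The mechanism described above, a corporate-authored script whose lines are selected by a customer's tracked loyalty state, really is a small program, and it can be sketched in a few lines of Python. The tier names, greetings, and customer records here are invented for illustration, not any real program's:

```python
# A minimal sketch of a loyalty simulator: corporate-authored lines,
# selected by the customer's tracked loyalty state (microsocial class).
# Tiers, script lines, and the CRM dictionary are all hypothetical.

GREETING_SCRIPT = {
    "none":   "Hello, how can I help you today?",
    "silver": "Welcome back! How can I help you today?",
    "gold":   "Welcome back, {name}. Always a pleasure to see you.",
}

def greet(name: str, tracked_state: dict) -> str:
    """The worker acts whichever line the tracked state selects."""
    tier = tracked_state.get(name, "none")  # untracked customers get the default role
    return GREETING_SCRIPT[tier].format(name=name)

# The systematically tracked state: who has achieved which class.
crm = {"Alice": "gold", "Bob": "silver"}

print(greet("Alice", crm))  # the canned gold-class greeting
print(greet("Carol", crm))  # an untracked customer gets the base script
```

The worker's choice to improvise, discussed later, would be a deviation from `GREETING_SCRIPT`; the system only tracks and rewards the scripted path.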
In the play Cyrano de Bergerac, Cyrano woos Roxane by providing words to be delivered by another man with a more handsome face. The psychologist Stanley Milgram experimented with this idea, finding that people reacted to a person parroting another's words without suspecting the words came from someone else. They gave speakers the benefit of the doubt that their thoughts were their own, which honestly is usually a good working rule, if you aren't in a psychological experiment, or living in the twenty-first century. Milgram called the actors in his scenario cyranoids, after the play. It can seem a disturbing concept, the speaker as if controlled by another mind. But isn't every first-level help desk, every routine call centre call, every canned gold class greeting, a cyranoid scene? It is software using meat to impersonate meat. It is an inside-out Mechanical Turk, with the Turk on the outside and the machine within. Corporations and computing have really just made it cheap and banal. A military boot camp is full of cyranoid experience, of ritual interaction backed by systematic tracking. So is the political state itself, like the pre-modern state of special ranks, official clothing and carefully graded formal titles. We can be loyal to a state. We can be cyranoids going over the tops of trenches in our thousands, following a script written by someone else who lived before we were born.
These pre-Turing simulators are nuanced and complex but computationally crude. It's all IF-THEN: central storage is tremendously expensive ink on paper, and the bandwidth is a nightmare. The signal-to-noise ratio is appalling, and you constantly have to resort to hacks like having deserters shot.
Though we are sometimes fooled by gamification, crude or sophisticated, we often participate in it actively as well. Many soldiers of the Great War were sincere volunteers, putting themselves forward for what they saw as a greater cause, or in defense of people dear to them. A mechanistic view of society, where everyone is a robot programmed by some ritual, is too much of a caricature. The cyranoid is an everyday feature of formal social structures, ones people have been using for millennia. The new aspect is simply computational cheapness, so we can now have cyranoids in high-fidelity loops, remote procedure calls and chains of responsibility. The script is still acted by a person, who chooses with each interaction whether to stay on script or to improvise and face the systematic consequences, or benefits, of such disruption.
With the contemporary explosion of gamified systems, and every app and online marketer trying to quant a few extra percentage points out of our custom, we navigate in and out of cyranoid scripts, tacking and gybing based on advantages of the moment, from our own obligations to the structured rewards and compulsions of the systems we sail between. Cyranoid is too passive a term. We have made a society of cyranauts.