Modern Ornament

There is a landing, on the river side of the Queensland Museum, and above the Queensland Art Gallery, which is reliably mostly empty. It faces a large, busy cafe, often full of daggy and aspirational Brisbane parents and their noisy, curious kids. It is near walkways and stairs for people walking either way along the river. This corner landing repels people. People stop there for a moment, wanting to do something, and move on quickly.

The museum and art gallery complex is a reinforced concrete structure, built in 1982-86 in the style of the time, echoing the Barbican in London or the Lincoln Center in New York: a monumental grey and light brown culture production machine. The staff make clever and lively use of the cavernous spaces inside. There is an enormous fossil of hundreds of mud footprints, capturing a dinosaur stampede, against one cliff-like wall. Nearby, outside, full-scale models of humpback whales hang overhead, while whalesong pipes through discreetly placed speakers.

The whole building is a bit like that landing, though, at least on the bare concrete outside. It has a striking concrete slab geometry, all rectangles. It is unshaded in the glaringly hot summer sun and humidity, and open to thunderstorms. In the bright Brisbane winter the glare remains, but the heat gives way to a chilly draft, seasoned with city grit and dust. You can’t talk to people, read a newspaper, eat a sandwich, or even check your phone, really. So on busy days there is a steady stream of people needing a moment’s rest, not finding it, and disconsolately moving on. Perhaps I am misreading the building, and this is by design, like the seats in McDonald’s that are placed at a distance from the table precisely calculated to make you uncomfortable after five or ten minutes.

Queensland Art Gallery and Museum, by kgbo (cropped)

There’s a part early on in The Timeless Way of Building where Christopher Alexander meditates, in a somewhat angst-ridden way, on how problematic he finds contemporary (1970s) architecture. Very well, he relates, I must face that I am a conservative. (Horror of Berkeley horrors.) He then talks himself out of it, deciding that he is not against new buildings and materials so long as they learn from the beauty of old patterns, and that their design is in the hands of the communities they house.

Alexander has had an influential career, including inspiring the design patterns movement in software, but was never quite embraced by the mainstream of the architecture profession. He did get an award from the arch-conservative US National Building Museum. And there is that not-all-wrong, definitely reactionary article by Rennix and Robinson on ornament and modern architecture that does the rounds on Twitter every few months. The article highlights a 1982 debate between Alexander and Peter Eisenman on beauty. It’s a debate that seems more important to conservative partisans (alt-historical 1980s YouTube videos on how Alexander DESTROYS Eisenman flick before one’s eyes), but it is really about technical expertise and the way it causes pain, the way we overwhelmingly live in an ecosystem of industrial creative destruction.

From the very beginning it is clear that Alexander and Eisenman don’t really even share a common frame in which to debate. Alexander’s early work, Notes on the Synthesis of Form, is highly mathematical, and Eisenman also makes heavy use of repeated geometrical forms. But Alexander, as always, advocates for a sense of wholeness and harmony, like the way the senses are comforted by detail related at different scales, which he thinks can be got at mathematically. Eisenman is fonder of ideas of deep structure from literary postmodernists like Foucault, Barthes and Derrida, and distrusts these feelings of comfort.

My design sympathies are with Alexander – it’s certainly a better default – and yet, if we zoom out, Eisenman isn’t all wrong. Sometimes technical experts need to inflict pain. The Hippocratic oath doesn’t stop surgeons from using a bone saw; it just governs when they decide it’s worthwhile.

Peter Eisenman: Moneo’s courtyard … was taking away from something that was too large, achieving an effect that expresses the separation and fragility that man feels today in relationship to the technological scale of life, to machines, and the car-dominated environment we live in.

Christopher Alexander: Moneo intentionally wants to produce an effect of disharmony.

PE: What I’m suggesting is that if we make people so comfortable in these nice little structures of yours, that we might lull them into thinking that everything’s all right, Jack, which it isn’t. And so the role of art or architecture might be just to remind people that everything wasn’t all right.

So sure, art does need to do that, sometimes. It sounds like an awkward place to live, though. The residents of Eisenman’s House VI thought so, too – they even wrote a book about it.

The point of inflicting this pain – disharmony is pain – is usually that you go through it in order to become something else. The danger of a disharmonious building is surely that it is so permanent, that on even a generational timescale, it is a destination, not just a transformation. All pain and no amputation. Let alone the prelude to a cyborg prosthetic upgrade, or whatever mutant response to machinic modernity you might need.

Arizona State Football Stadium, by MCSixth

Eisenman is a prolific theorist, though I haven’t dived deeply into his writing. He has had a successful commercial career, too; it’s not all frozen museum pieces like House II. The curves of modern steel and AutoCAD construction have been kind to his later work. His firm built the University of Phoenix Stadium in Arizona. It’s a blobby magic schoolbus shape; a cutely monstrous gladiatorial arena.

PE: [Palladio’s Palazzo Chiericati] … makes me feel high in my mind, not in my gut. Things that make me feel high in my gut are very suspicious, because that is my psychological problem.

I guess he got over that. Of course you don’t live in a stadium, either, and they’re not meant to make you comfortable.

Eisenman is probably most famous for another late work, the 2005 Memorial to the Murdered Jews of Europe, in Berlin. The design was selected by competition. As described by James Young, one reason an earlier round of the competition had failed acrimoniously was that the winning design was too much of a kitschy ornament. Its very concrete elements (the giant tombstone, the specific numbers of boulders, and so on) all became points of disappointment and interrogation, inadequate symbolism under the scale of industrial murder; an enormous snow-globe of death.

This piece by K Michael Hays, part of a lecture series, gives a sense of the project and its institutional reception.

The memorial is a field of abstract, minimalist stone pillars, without a single defined entrance or exit path; variations in the height of the pillars and in the gradient of the ground create an uncanny, disturbing, maze-like effect amongst the tallest pillars at the centre.

It’s hard not to connect Eisenman’s earlier words with the strengths of the memorial: “reminding people that everything wasn’t all right”. The memorial to industrial genocide is human-repellent, unheimlich, uncanny, un-home-like. That’s the point. Though when pressed, Eisenman doesn’t commit even to a meaning that concrete; he considers it might be used by skateboarders, or in a spy film.

SPIEGEL ONLINE: Do you have a favorite monument?

Eisenman: Actually, I’m not that into monuments. Honestly, I don’t think much about them. I think more about sports.

(Spiegel Interview)

 

Lehman on Software, Models and Change

The modeled and evolving quality of software comes to the surface when thinking about software maintenance. A classic paper on this is Lehman’s 1980 paper Programs, life cycles, and laws of software evolution, which lays out the territory with great clarity, then confidently strides off in the completely wrong direction.

Model

Lehman introduces a distinction between 

  • S-programs, that implement a formal mathematical (s)pecification, such as the travelling salesman problem
  • P-programs, that solve some messy (p)roblem arising in the world, such as actually scheduling real salespeople and their ambiguities and partly-known preferences, and 
  • E-programs, those (e)mbedded in and changing the world they directly model, such as in air-traffic control.

For P-programs and E-programs, “the acceptability of a solution is determined by the environment in which it is embedded.” The distinction lies in each program’s relationship with its origin story: between top-down and bottom-up; transcendence and immanence.
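
To make the S/P contrast concrete, here is a toy Python sketch of my own; it is not from Lehman’s paper, and the function names and the greedy heuristic are purely illustrative:

    from itertools import permutations

    def tsp_exact(dist):
        """S-program: implements the formal travelling salesman specification.
        Correctness is judged against the mathematical spec alone."""
        tours = ((0,) + p + (0,) for p in permutations(range(1, len(dist))))
        return min(tours, key=lambda t: sum(dist[a][b] for a, b in zip(t, t[1:])))

    def schedule_salespeople(visits, preferences):
        """P-program: tackles the messy real-world scheduling problem.
        A greedy ordering by partly-known preferences; 'good enough' is
        decided by the environment, i.e. whether the sales team accepts it."""
        return sorted(visits, key=lambda v: preferences.get(v, 0), reverse=True)

An E-program would admit no such standalone sketch: an air-traffic system changes the very airspace it models, so any static toy version is already out of date.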

Lehman goes on to note that P-programs are also in a feedback loop arising from their use in the world. Their execution is observed, even lived, by their users, and this results in demand for change.

This is a cybernetic view, though Lehman doesn’t use the terminology. The paper sketches some more complicated loops, particularly where a specification intermediates between the P-program and the world. It is that intermediation, rather than feedback, that is foregrounded in the difficult and famous statement on code:world relations:

Any program is a model of a model within a theory of a model of an abstraction of some portion of the world or of some universe of discourse.

Lehman drops this on page two, before defining S-, P- or E-programs, and never gets around to defining theory or model, or otherwise directly elaborating, disconnected sagelike pronouncements being an expected feature of software engineering papers of the time. Cook (and a team including Lehman) later link this to the social process of Kuhn’s paradigm shifts – renaming P-programs to (p)aradigm-programs and E-programs to (e)volving-programs.

Weisberg’s work on the crucial role of models in science could also help. For Weisberg, a theory maps a model to the world through (mostly explicit) construals. This plays a similar role to “abstraction” in Lehman’s definition. (Bit more on Weisberg here.)

It’s also worth throwing Naur’s “Programming as Theory Building” into the mix, though his paper does not make much distinction between model-as-code and theory-as-code.

Lehman also introduces “laws” of software evolution, which did have some empirical basis, but appear hard to reproduce. They might be compared to more recent work on meaningful descriptive code metrics, or properties of software as a material.

 

The Rivers and The Lakes That You’re Used To

After accelerating through insight after insight into the fluid and evolving nature of software, Lehman closes off the theory section by casually inventing microservices (in 1980), then taking a screaming left turn at the Process Street T-Junction, crashing through a railing and tumbling over a cliff. For over that cliff flows a process waterfall, and in the structured programming era, there’s nothing more attractive.

Like the rest of the structured programming crowd, he has factory envy: “An assembly line manufacturing process is possible when a system can be partitioned into subsystems that are simply coupled and without invisible links … Unfortunately, present day programming is not like this.” Lehman goes on to emphasize the care and structure needed when writing separate elaborate requirement and technical specifications. You get the idea. The remaining process recommendations I’m just going to skip.

It is easy to be wise after the fact in 2019. Agile practices and literal miniature software assembly lines (continuous build infra) now exist, and have made us much more sensitive to the damage done by scope size and delivery delay in large software systems. Trying to solve complex problems with more upfront planning was a high modernist worldview going out of fashion, but still very much in the intellectual water in 1980: Lehman gave a lecture in 1974 referencing both city planning and the Club of Rome report Limits to Growth. Perhaps it would be fairer to point out that thinkers who advocated short simple changes as a response to complex systems – like Jane Jacobs, or John Boyd and his OODA loop – were always tacking into the intellectual wind.

References

Cook, S., Harrison, R., Lehman, M.M. and Wernick, P. (2006). Evolution in software systems: foundations of the SPE classification scheme. Journal of Software Maintenance and Evolution: Research and Practice, 18(1), 1-35.
Lehman, M.M. (1974). Programs, cities, students – Limits to growth? Inaugural Lecture, May 14, 1974, ICST Inaugural Lecture Series, vol. 9, pp. 211-229; reprinted in Programming Methodology, D. Gries, Ed. New York: Springer-Verlag, 1979, pp. 42-69.
Lehman, M.M. (1980). Programs, life cycles, and laws of software evolution. Proceedings of the IEEE, 68(9), 1060-1076.
Naur, P. (1985). Programming as theory building. Microprocessing and Microprogramming, 15(5), 253-261.
Weisberg, M. (2013). Simulation and Similarity: Using Models to Understand the World. Oxford University Press.

Ancillary SYN-ACK

Ancillary Justice is a cyborg soldiers and AI spaceships novel (with complications) in a space opera setting, built around a soldier called Breq. Think Iain M Banks but with the Roman Empire instead of plush toy communism. The complications are both cool and fundamental to the characters, and I won’t spoil the slow reveal of the first book here, even though it’s all over the web.

The trilogy is completed with Ancillary Sword and Ancillary Mercy. After the first book’s galaxy-spanning wandering and race towards the capital, the second and third books focus back on a particular system. Breq muscles in as fleet captain of a capital ship. Aliens, ships and stations join the cast of characters. Interpersonal and gunboat diplomacy ensue.

The heavy Space Roman Empire vibe of the first volume evolves into something a bit more Space Girls Boarding School Naval Academy in the later books. Though both have their virtues, I always tend to favour first books, and the thick vertigo of new ideas is denser in Ancillary Justice than in the other two volumes. I still devoured all three at speed and with pleasure.

The Ancillary trilogy is, at some level, network space opera, about synchronization, replication, latency and packet corruption. The empire exists because it successfully replicates itself over distance and time. And then it stops: packet loss and fragmentation.

An Anatomical Sketch of Software As A Complex System

As intellectually awkward artifacts that open up new capabilities, that are surprising, frustrating and costly in other ways, and that regularly confound our physical intuitions about their behaviour, software systems meet an everyday-language definition of complexity. A more systematic comparison, presented here, shows a significant family resemblance. Complexity science studies techniques common across a number of fields, and using that framework to analyze software engineering could allow a more precise technical understanding of these software problems.

This isn’t a unique thought. Various approaches, such as David Snowden’s Cynefin framework, have used complexity science as a source of insight on software development. Herbert Simon, in works like “Sciences of the Artificial”, helped build complexity science, with software programs and Good Old Fashioned AI as reference points. Famous papers such as Parnas et al’s “The Modular Structure of Complex Systems” also point the same way. Still, when I was introduced to this material, I could not find a recent reference that lined up these features of complex systems with modern software in a brief and systematic way. These notes attempt that, in the form of an anatomical sketch.

This note considers software systems with many internal models and at least thousands of lines of code, rather than shorter programs analysed in formal detail. This places it more under software engineering than formal computer science, without intending any strict break from the latter. Likewise, by default, it addresses consciously engineered software rather than machine learning. This complexity also differs from the algorithmic time complexity captured by big-O notation, though there may be interesting formal connections to be explored there too.

 

Anatomical Sketch

Ladyman et al give seven features of complex systems, and I’ve added one more from Crutchfield.

1. Non-linearity

Software exhibits non-linearity in the small and in the large. Every ‘if’ condition, implicit or explicit, splits execution into distinct possible outputs, so a tiny change in input can produce a disproportionate change in behaviour. This is most obvious in response to unexpected input or state: error and exit, segmentation fault, stack trace, NullPointerException.
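
A toy illustration, my own, of how a single branch makes output discontinuous in its input (the bank-balance example is hypothetical):

    def withdraw(balance: float, amount: float) -> float:
        if amount > balance:                  # one comparison, two regimes
            raise ValueError("insufficient funds")
        return balance - amount

    print(withdraw(100.0, 99.99))   # 0.01: output proportionate to input
    print(withdraw(100.0, 100.01))  # ValueError: a two-cent change, a crash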

2. Feedback

From a use perspective, many software systems are part of a feedback loop with users and the world, and this feedback can often involve internal software state.

From an engineering perspective, all software systems beyond a trivial size are built in cycles where the current state of a codebase is a rich input into the next cycle of engineering. This is true whether iterative software development methodologies are used or not. For instance, consider bug fixes resulting from a test phase in waterfall.
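
As a minimal sketch of the use-time loop (my own, with hypothetical names), consider a recommender whose internal state is written by the world and then written back into it:

    from collections import Counter

    clicks = Counter()                  # internal state, accumulated from use

    def record_click(item: str) -> None:
        clicks[item] += 1               # the world writes into program state

    def recommend(catalogue: list) -> str:
        # State feeds back into what users see next, closing the loop.
        return max(catalogue, key=lambda item: clicks[item])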

3. Spontaneous Order

Spontaneous order is not a feature of large software systems. If anything, the usual condition of engineering large software systems is constant, deliberate work to maintain order against a tendency for these systems to decay into entropic, complicated disorder. The ideas of ‘software crisis’ and ‘technical debt’ are both reactions to a perceived lack of order in engineered software.

4. Robustness and lack of central control

In the small, or even at the level of the individual system, software tends to brittleness, as noted above. Robustness, being “stable under perturbations of the system” (Ladyman), must be specifically engineered in, by considering a wide variety of inputs and states and testing the system under those conditions. However, certain software ecosystems, such as the TCP/IP substrate of the Internet, display great robustness. Individual websites go down, but the whole Internet or World Wide Web tends not to. This is related to the choice of a highly distributed architecture based on relatively simple, standard protocols and on design guidelines like Postel’s principle (be tolerant in what you accept and strict in what you send). Like a flock of birds, the lack of central control makes the system tolerant of local failure. High availability systems make use of similar principles of redundancy.
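
A minimal sketch of Postel’s principle in this spirit (my own; the flag format is hypothetical, not from any RFC):

    ACCEPTED_TRUE = {"true", "yes", "on", "1"}
    ACCEPTED_FALSE = {"false", "no", "off", "0"}

    def parse_flag(raw: str) -> bool:
        """Be tolerant in what you accept: case, whitespace, synonyms."""
        token = raw.strip().lower()
        if token in ACCEPTED_TRUE:
            return True
        if token in ACCEPTED_FALSE:
            return False
        raise ValueError(f"unrecognised flag value: {raw!r}")

    def emit_flag(value: bool) -> str:
        """Be strict in what you send: exactly one canonical spelling."""
        return "true" if value else "false"

    assert emit_flag(parse_flag("  YES ")) == "true"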

5. Emergence

Software systems tend not to exhibit emergent behaviours as highly visible features of the system, in the way, say, a flock of birds assumes a particular overall shape once each bird follows certain simple rules about its position relative to its neighbours. Certain important non-visible features are emergent, however. Leveson, in Engineering A Safer World, argues that system safety (including software) is an emergent feature: “Determining whether a plant is acceptably safe is not possible, for example, by examining a single valve in the plant. In fact, statements about the ’safety of the valve’ without information about the context in which that valve is used are meaningless.” Difficult bugs in established software systems are often multi-causal, emerging from systemic interactions between components rather than isolated failures.

Conway’s law, the observation that a software system’s internal component structure mirrors the team structure of the organisation that created it, describes system shape emerging from social structure without explicit causal rules.

6. Hierarchical organisation

Formal models of computation did not originally differentiate between parts of a program; for instance, Turing machines and the Church lambda calculus do not even distinguish between programs and data. Many of the advances in software development have, by contrast, been tools for structuring programs into hierarchies and differing levels of abstraction. A reasonable history of programming could be told simply through differentiated structure, e.g.:

  • Turing machines / Church lambda calculus
  • Von Neumann machine: separation of program, data, input, output
  • MIT Summer Session Computer: named instructions
  • Hopper: A-0, an early compiler built from linked subroutines
  • Backus: FORTRAN, with distinguished control structures IF and DO
  • Parnas: module decomposition through information hiding
  • Smalltalk: object orientation
  • Codd: relational databases
  • GoF: design patterns
  • Beck: xUnit automated unit testing
  • Fowler: refactoring for improved structure
  • Maven: systematic library dependency management
Navigating program hierarchy from user interface through domain libraries to system libraries and services is a significant, even dominant, proportion of modern programming work (from personal observation, though a quantified study should be possible).
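
A toy sketch of that layering (all names hypothetical), where each layer speaks only to the layer below:

    import json
    import urllib.request                      # system libraries: bottom layer

    def fetch_raw(url: str) -> bytes:          # infrastructure layer
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    def load_patient(url: str) -> dict:        # domain layer
        return json.loads(fetch_raw(url))

    def render_summary(url: str) -> str:       # user interface layer
        patient = load_patient(url)
        return f"{patient['name']}: {len(patient['visits'])} visits"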

7. Numerosity (Many more is different)

The techniques for navigating, designing and changing a codebase of hundreds of classes are different from those for a short script, at least partly due to the limitations of human memory and attention span. An early recognition of this is Benington’s Production of Large Computer Programs; a more recent one is Feathers’ Working Effectively With Legacy Code. Feathers states: “As the amount of code in a project grows, it gradually surpasses understanding”.

8. Historical information storage

“Structural complexity is the amount of historical information that a system stores” according to Crutchfield. This is relevant for both use- and engineering-time views of software systems.

In use, the amount of state stored by a software system is historical information in this sense. An example might be a hospital patient record database. A subtlety here is that proposed measures of complexity based on amounts of information (such as Kolmogorov complexity) tend to specify maximally compressed information: simply allocating several blank terabytes of disk isn’t enough. This also covers implicit forms of complexity, such as dependencies in code on particular structures in data. Contrast a hospital database alone (just records and basic SQL) with the same database together with software that provides a better user interface and imposes rules on how records may be updated to suit the procedures of the hospital.
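
A rough illustration of the compression point, using zlib as a crude stand-in for Kolmogorov complexity (my own sketch; the “records” are simulated):

    import os
    import zlib

    blank = b"\x00" * 1_000_000         # blank disk space, in miniature
    records = os.urandom(1_000_000)     # stand-in for accumulated, varied history

    print(len(zlib.compress(blank)))    # roughly 1 KB: almost no information
    print(len(zlib.compress(records)))  # roughly 1 MB: history resists compression

Real records sit between these extremes, but the point stands: complexity tracks the compressed size of the stored state, not the allocated size.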

Source control provides an engineering-time view of historical information. In practice, when extending or maintaining a system, classes are rarely replaced wholesale or deleted. New classes are added, or existing classes modified, to add functionality. The existing code is always an input to the new state of the code for the programmer making the change, even if the existing code is left untouched. Welsh even declared, in a paper of that name, that “Software is history!”

The result, regardless, is increasing historical information in a codebase over time, and therefore complexity.

 

References
Conway – How Do Committees Invent? Datamation 1968
Crutchfield – Five Questions on Complexity, Responses
Feathers – Working Effectively With Legacy Code, 2006
Ladyman, Lambert, Wiesner – What Is A Complex System?
Leveson – Engineering a Safer World, Chapter 3 p64, 2011
Parnas – On The Criteria To Be Used in Decomposing Systems into Modules, Communications of the ACM, 1972
Parnas, Clements, Weiss – The Modular Structure of Complex Systems, IEEE Transactions on Software Engineering, 1985
Postel – RFC 761: Transmission Control Protocol, https://tools.ietf.org/html/rfc761; https://en.wikipedia.org/wiki/Robustness_principle
Simon – Sciences of the Artificial
Snowden and Boone – A Leader’s Framework For Decision Making (Cynefin)
Welsh – Software Is History!

Accelerationism: A Brief Taxonomy

It is a moment of pause for the theory of accelerationism. The burst of self-identifying activity over the last few years has cycled into something of a bear market, even as the conceptual toolbox is more powerful than ever in navigating our present. Theorists of acceleration are connoisseurs of vertigo, and will insist any snapshot of their thought is dead or out of date. This taxonomy is both. But it’s short.

 

Accelerationism: ACC: Capitalism is a feedback cycle of increasing spiraling power, which it is not possible to comprehend or control from within, and therefore at all. The complexity of this alien system includes hyperstitions and reverse causalities. Capitalism melts and reassembles everything. Fictions become realities through their articulation. Future structures assemble themselves through their conditioning of the past.

Texts:

  • Deleuze and Guattari, Anti-Oedipus; A Thousand Plateaus
  • Land, Meltdown
  • CCRU – Writings 1997-2003
  • Collapse Journal I-VIII

 

Right Accelerationism: R#ACC: Capitalism is modernity, science, intelligence. What is powerful in all these things is one identical force. What is best in the world is represented by this force, the product of sharpening by relentless competition, brutal empiricism and blind peer review, the butcher’s yard of evolution. Historically, it was possible to put a defensive brake on capitalism and intelligence. That possibility is fast receding or likely already gone, and was always undesirable. Ethically and therefore politically, we should align ourselves with the emancipation of the means of production. Artificial intelligence, genetic engineering, corporate microstates, and breeding a cyborg elite are all means for achieving this end.

Right accelerationism is entwined with the techno-capitalist thread of neoreaction.

Texts:

 

Left Accelerationism: L#ACC: The tremendous productive power of capitalism is a world system that is impossible to fully control, but it may be harnessed and steered for progressive ends. Only with the wealth and productivity of capitalism has fully automated luxury gay space communism become possible, and now that it is within reach it can be seized. Only through computational power can the relationship between Homo sapiens and its ecosystem be understood and balanced. The great corporations and financial structures of the early twenty-first century are themselves prototypes of platform planned economies, leveraging enormous computational power to fulfill billions of needs and desires across society. By accelerating progressive technological invention, reinvigorating the domesticated industrial state as a platform state, nationalizing data utilities, sharing dividends and redefining work, the system may be made sustainable and wealth shared with all according to their need.

After a surge of activity, many left accelerationists rapidly swerved away from the name a few years ago. This was coincident with Srnicek and Williams’ book Inventing the Future, which is all about left accelerationism, without mentioning it once.

Texts:

 

Unconditional Accelerationism: U#ACC: To erect any political program that pretends to steer, brake, or accelerate this system is folly and human-centric hubris. The system can be studied as a matter of fascination, and of survival. The only politics that makes sense is to embrace fragmentation and create a safe distance from centralized political power. A patchwork of small communities built across and within networks, societies and geographies is a means for some to survive and thrive. Many small ships can ride through a storm with a few losses, where one giant raft will be destroyed, dooming all.

Texts:

 

Blaccelerationism: The separation of human and capital is a power structure shell game. Living capital, speculative value, and accumulated time are stored in the bodies of black already-inhuman (non)subjects.

Texts:

 

Gender Accelerationism: G#ACC: Everyone is becoming transsexual lesbian programmer cyborgs. Enjoy it.

Texts:

 

Zero Accelerationism: Z#ACC: The world-system is accelerating off a cliff.

Texts:

 

Accelerating The Contradictions: Capitalism is riven with conflict and contradictions. Revolutionaries should accelerate this destructive process as it hastens the creation of a system beyond capitalism.

No modern accelerationist group has held this position (D/G: “Nothing ever died of contradictions!”), but it’s a common misunderstanding, or caricature, of Left Accelerationism.

Texts:

 

Other introductions: Meta-nomad has a more theory-soaked introduction to accelerationism, which teases out the rhizomatic cross-connections between these threads, and is a good springboard for those diving further down the rabbit hole.