24 Nov 2016

Formulations

We are pleased to share news of the publication of Formulations (Koenig Books [London]/Culturgest [Porto]/MMK [Frankfurt am Main]), a volume of essays on the work of Florian Hecker published to coincide with the opening, on 26 November, of the exhibition of the same name at MMK Frankfurt am Main, following its original version at Culturgest, Porto, in Autumn 2015.

This transdisciplinary volume brings together writers, many of whom will be familiar to Urbanomic readers: contributions come from Diedrich Diederichsen, François J. Bonnet, Reza Negarestani, Michael Newman, Gabriel Catren, Fernando Zalamea, Éric Alliez, Robin Mackay, Ina Blom, Christopher Haworth, and Sarat Maharaj.

Editor Robin Mackay has also used Formulations as an opportunity to experimentally extend the editorial model developed in Collapse with a ‘gluing’ procedure inspired by Zalamea’s conceptual championing of mathematical local-global transfers, by the history of the cut-up, and by Hecker’s procedures of macro- and micro-sonic formulation and reformulation, creating an interference between texts that explicitly opens up each separate contribution to the global space of the book. As explained in the ‘Operating Instructions’:

    The material presented as output was post-processed using a new implementation of content vector interpolation synthesis which allows samples at coarse resolution drawn from multiple source signals to be ‘glued’ into their counterparts at specified points (a form of bidirectional subset interpolation). Detected conformances between local semantic patterns are realised as supplementary semantic vectors, effectively constructing a higher-dimensional global surface across which initially disparate content vectors are seen to exhibit like behaviour. The resulting global projection, in turn, is used to modulate each of the local signals from which it has been synthesized. In this case the operation was carried out using the currently available deep neural network software implementation (‘editor’). […]

    The technique originated in the protocols of early textual machines constructed during the 1960s and 70s, when it was discovered that the (then manual) cutting and gluing of source signals afforded otherwise unobtainable effects. Where these early stochastically-inclined experiments disregarded semantic constraints, however, our (arguably more conservative) approach factors them in at the interpolation stage, and uses a normalising heuristic global model as an initial filter for fragment selection. With semantic sample conformance detection limited by the memory constraints and discriminatory threshold of the editing module, even repeated iterations rarely yield a ‘smooth’ result: the interpolations remain discernible, their perceptual effects ranging from a sometimes rewarding conspicuous splitting of semantic streams and conceptual themata to the incidence of unpleasant, abnormal, or jarring artefacts in otherwise sound concatenations of semantic units. Nevertheless the suggestive synthetic, perspectival, and even hallucinatory effects obtained here by means of subset interpolation are positive indications for the use of CVI synthesis in the assembly of diverse textual reformulations.
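For readers who would like a concrete handle on what such a ‘gluing’ of texts might amount to in practice, the following is a minimal sketch in Python. It is purely illustrative and entirely our own construction, not the procedure actually used for the book: fragments drawn from two texts are embedded as crude bag-of-words ‘content vectors’, conformances between them are scored by cosine similarity, and a source fragment is spliced in after any target fragment to which it sufficiently conforms. The function names (embed, conformance, glue), the bag-of-words representation, and the similarity threshold are all hypothetical stand-ins.

import math
import re
from collections import Counter
from typing import List

def embed(fragment: str) -> Counter:
    """Crude 'content vector': a bag-of-words term-frequency count.
    (Hypothetical stand-in for whatever semantic representation
    the 'editor' might actually use.)"""
    return Counter(re.findall(r"[a-z']+", fragment.lower()))

def conformance(u: Counter, v: Counter) -> float:
    """Cosine similarity between two content vectors: a toy version of
    'detected conformances between local semantic patterns'."""
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0

def glue(target: List[str], source: List[str], threshold: float = 0.3) -> List[str]:
    """Toy 'subset interpolation': wherever a source fragment conforms
    closely enough to a target fragment, splice it in immediately after
    its counterpart, leaving the seam discernible."""
    out = []
    for t in target:
        out.append(t)
        t_vec = embed(t)
        best = max(source, key=lambda s: conformance(t_vec, embed(s)), default=None)
        if best is not None and conformance(t_vec, embed(best)) >= threshold:
            out.append(best)  # the interpolation remains visible as a 'cut'
    return out

# Example: fragments (e.g. sentences) drawn from two contributions
text_a = ["The local pattern is transferred to a global surface.",
          "Sound is formulated and reformulated at every scale."]
text_b = ["A global surface is synthesised from local patterns.",
          "History offers the cut-up as a precedent."]

for line in glue(text_a, text_b):
    print(line)

Even at this toy scale, something of the behaviour described above is visible: where two fragments share enough vocabulary, the splice occurs and remains discernible as a seam; where they do not, the target passes through untouched.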