Can digital environments have desire paths?

Path dependency, power and articulating preference in hard-coded systems

Right now, I can't seem to escape the idea that technology systems are a lot like urban planning. (Maybe because I just read James Scott's Seeing Like a State.) The other day I was walking through the National Gardens in Athens, a beautiful, tropical park in the centre of the city.

It sticks out, quite literally. I only recently learned that it was commissioned by the former queen of Greece in the mid-19th century, in direct denial of the surrounding climate. Big circular paths wind around massive palm trees, with irrigation systems throughout.

The park is also peppered with 'desire paths': a weaving network of footpaths that people actually want to use, rather than the paths that were planned and built into the park's landscape. Desire paths are unplanned and present themselves gradually; they are literally produced by consistent foot traffic.

When we move through digital systems, we don't have the options we have in physical spaces. We can't cut corners, or express ourselves outside the permutations that were designed for us.

And the digital world is built up over time through path dependency. Path dependency describes the phenomenon in which events in the past constrain events in the future, e.g. "The antitrust suit against Microsoft in the '90s will shape the antitrust suit against [insert any Big Tech company here]." This means that a past set of assumptions is driving what's on offer now, rather than our expression of what we actually want.

The hard-coded nature of, say, the UI of a web app is susceptible to path dependency (I'm looking at you, Bootstrap). You have to click these buttons in that sequence to achieve a desired effect, and the permutations of possibility seem very finite (and boooooooring) because we've built habitual design practices that scale. For accessibility, this type of path dependency can be a boon, as it increases the amount of digital space that can be accessed by people of all capabilities and decreases the 'creative' overhead and DIY nature of building adaptive, inclusive systems.

Unfortunately, trying to find desire paths in (largely path-dependent) digital spaces is a huge challenge without massive user research budgets and a team of data scientists looking solely at user experience. And that's assuming that all teams design humanely, looking for signs of how people actually want to move through online spaces. If not, dark patterns are more nefarious and much harder to fight than the landscaped paths in my local park.
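That said, a first pass at spotting desire paths doesn't necessarily require a huge budget. As a minimal sketch (all page names, logs, and thresholds here are hypothetical), you could mine session logs for page-to-page transitions that people take repeatedly even though the designed navigation never anticipated them:

```python
from collections import Counter

# Hypothetical session logs: each session is the ordered list of
# pages one user visited.
sessions = [
    ["home", "search", "account", "cancel"],
    ["home", "account", "help", "cancel"],
    ["home", "search", "account", "cancel"],
]

# The transitions the designers actually built navigation links for.
designed = {("home", "search"), ("search", "results"), ("home", "account")}

# Count every observed page-to-page transition across all sessions.
observed = Counter((a, b) for s in sessions for a, b in zip(s, s[1:]))

# Frequent transitions that were never designed for are candidate
# desire paths -- routes people keep taking that nobody laid down.
desire_paths = {t: n for t, n in observed.items() if t not in designed and n >= 2}
print(desire_paths)  # -> {('search', 'account'): 2, ('account', 'cancel'): 2}
```

In this toy data, people keep routing from search to their account and from their account to a cancel page, neither of which the designed navigation supports, which is exactly the kind of signal a team could use to pave a path people clearly want.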

Here's one of many examples: a few weeks ago, I spent 45 minutes email-shouting at a company who insisted I call them to cancel my digital subscription. They'd provided no way to cancel online. This tactical, anti-user friction exploits the fact that desire paths are largely impossible to make in digital environments. If we can't cut corners in these situations, then we cannot remake systems to suit our needs. In turn, designers won't be able to see the patterns of desire that are necessary to inform change in those systems.

And desire implies a nice-to-have and not a need-to-have. In some scenarios a desire path becomes a dignity path — or the path I should be able to take if I want to retain my dignity in a digital world that is treating me as less than human.

In capitalism, we expect consumerism to be the primary way of expressing our desire paths. But, just as with technological systems, we still have to choose from an existing and limited set of options. If we don't like any of them, we're told to build our own.

What happens when these limitations appear in environments that carry huge consequences for the people experiencing them? Immigration authorities operate under path dependency and a political mandate to make applications as challenging as possible; how does that manifest when they build websites for the application process? What responsibilities do states and other powerful actors have to look for desire paths? How can designers use randomness and research to spot and serve desire paths? And what role can organising play in articulating what we actually want out of the technology that hard-codes our systems?

Have you hacked the mainframe? Or just creatively got around the way a digital system tried to box you in? Share your stories below!