The Ugly Sweaters That Make You Invisible to Face Recognition Technology
Fashion fighting facial recognition, one $400 sweater at a time.

Your face is in a database. Everywhere you go, cameras are watching you, tracking you against it. Everything you do is logged.
Until recently, that would have been the suitably ominous opening to a science-fiction short story, but we live in Silicon Valley's world, and our online carelessness about data privacy has infected the real world too. In every city, at banks, football games, and concert halls, the same technology that unlocks your phone is used to track exactly who you are, what you're doing, and where you're going.
Police use it. Immigration officials use it. Even Taylor Swift uses it. One of the most infamous firms in the sector, Clearview AI, wants to integrate the tech into dating apps, allowing potential swipes to find everything there is to know about you ever captured on CCTV or posted to the internet. Every embarrassing moment and bad hair day is just a click away. Welcome to the future; hope you like it.
Many willfully ignore our "surveillance capitalism" status quo, in which the expectation of free products is paid for by unmonitored, endless, highly profitable data harvesting. Others are simply indifferent to it. However, there's also a growing range of people and brands raising awareness of the issue and fighting back.
One notable example is Cap_able, an Italian fashion start-up founded in 2018 by Federica Busani and Rachele Didero, whose "Manifesto" collection is the first line to market that renders you functionally invisible to facial recognition.
They're superbly made, and the price reflects that: between $300 and $500 apiece. They have a, um, "confrontational" style, but they work, and the boldness is the point. As Ms. Didero tells me: "This collection is trying to say 'I'm invisible to AI, but I'm so visible to other people, and I want them to know and to understand these values that I'm trying to express.'"
Cap_able is waving a red flag that only humans can see, a sartorial alarm that computers can't hear, and the technology behind it is extremely clever.
Facial recognition software works by first identifying the people in a scene, using "object recognition" to separate you from dogs, post boxes, hedges, and buildings, and then applying a more computationally intensive facial recognition algorithm, which compares your biometric identifiers with the faces in its database. It's basically a scaled-up version of your phone's face unlock, and just as I can't unlock my phone if I'm wearing sunglasses or a mask, the "adversarial patterns" of Virtue's "Camoflags" World Cup face paint, Hong Kong protester masks, and Cap_able's jumpers have a similar effect on security cameras.
Namely, these bright patterns break up the identifiable human form, tricking the "object recognition" into seeing a dog or giraffe instead of a person, so the facial recognition step is never applied. The brighter and bigger the pattern, the more effective the camouflage: a Cap_able coat will work better in shadows or difficult lighting than a beanie or scarf would, for instance. And there's a certain poetry to the fact that the same aesthetic boldness that makes them unignorable to the human eye is also the core of their camouflage.
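To make that gating concrete, here is a minimal Python sketch of the detect-then-recognize pipeline, written against the open-source ultralytics YOLO detector and the face_recognition library as stand-ins for a surveillance system's proprietary stack; the model weights file, the empty "database," and the identify_people helper are illustrative assumptions, not anything Cap_able uses.

```python
# A minimal sketch of the detect-then-recognize pipeline described above.
# Assumption: the open-source `ultralytics` YOLO detector and the
# `face_recognition` library stand in for a camera system's proprietary stack.
import numpy as np
import face_recognition
from ultralytics import YOLO

detector = YOLO("yolov8n.pt")   # pre-trained COCO object detector (class 0 = "person")
known_encodings = []            # stand-in for a database of enrolled face encodings

def identify_people(image_path: str) -> list[bool]:
    """Return one match/no-match flag per face found on a detected person."""
    image = face_recognition.load_image_file(image_path)  # RGB numpy array
    matches = []
    for result in detector(image_path):
        for box in result.boxes:
            # Stage 1: object recognition. Adversarial knitwear aims to stop
            # this label from ever reading "person".
            if detector.names[int(box.cls[0])] != "person":
                continue
            x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
            crop = np.ascontiguousarray(image[y1:y2, x1:x2])
            # Stage 2: the costlier biometric comparison, reached only for
            # regions the detector believed contained a person.
            for encoding in face_recognition.face_encodings(crop):
                matches.extend(face_recognition.compare_faces(known_encodings, encoding))
    return matches
```

The design point is the early `continue`: anything the detector fails to label "person" never reaches the biometric comparison at all, which is exactly the gap adversarial knitwear tries to exploit.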
Distinct from other adversarial patterns, Cap_able's proprietary camouflage is three-dimensional, with the garments produced on computerized knitting machines. Along with improving the material quality of the garments (Ms. Busani tells me they "will last basically your whole life," in stark contrast to our fast-fashion norm), this makes the pattern dramatically harder for facial recognition software to defeat while ensuring it won't become ineffective as it stretches with age and wear.
However, though the garments are long-lived, their camouflage isn't. They work for now, but it's only a matter of time before your $450 sweater no longer hides you. "It's not possible that our garments will be protective and efficient forever," Ms. Didero says, "it's just not possible, but you'll still have a beautiful garment with a deep meaning."
The fundamental problem is that there's a constant arms race between surveillance tech and adversarial clothing, where every increase in algorithmic efficiency, accuracy, and fidelity is countered by more complex and creative patterns; but you can't update a sweater. This rapid obsolescence is why previous versions of this technology never reached stores.
Ece Tankal's 2017 Hyperface adversarial pattern was widely praised in the press, but a note at the bottom of its website explains that it was never sold to the public "because its value as camouflage was temporary and has now passed." It adds: "Camouflage, in general, should be considered temporary, but especially technical camouflage that targets quickly evolving algorithms."
Ms. Didero says they're working on making their clothing "adaptable" and upgradeable, but they're not there yet, and "hardware," be it a sensor or a sweater, is inherently harder to update than software. Perhaps even more troubling: though their garments were developed using You Only Look Once (YOLO) v7, a widely used, open-source object-recognition algorithm, they're unable to test against the proprietary algorithms used by companies like Clearview AI. They may already be ineffective against them.
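That limitation is easy to illustrate. The sketch below shows how one might sanity-check a garment photo against an open detector in the spirit of that YOLO-based development; it uses the ultralytics package rather than the original YOLOv7 repository, and the weights file, the person_confidences helper, and the photo name are illustrative assumptions.

```python
# A rough check of whether a garment photo still registers as a "person" to an
# open-source detector. Assumption: the `ultralytics` package substitutes for
# the YOLOv7 setup mentioned above; results prove nothing about proprietary models.
from ultralytics import YOLO

detector = YOLO("yolov8n.pt")

def person_confidences(photo_path: str) -> list[float]:
    """Confidence score of every "person" detection in the photo."""
    scores = []
    for result in detector(photo_path):
        for box in result.boxes:
            if detector.names[int(box.cls[0])] == "person":
                scores.append(float(box.conf[0]))
    return scores

# If the adversarial pattern holds, this list should be empty or near zero;
# a different, closed model may still flag the same photo with high confidence.
print(person_confidences("manifesto_sweater_photo.jpg"))  # illustrative file name
```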
Put simply, if you're protesting against an authoritarian government, you shouldn't rely on Cap_able's clothes, and the company acknowledges that. "You can order it and we can ship it," Ms. Busani says in reference to Communist China, "but it's definitely not the market that we're aiming at." There's also a question of the garments' legality in such countries, with Hong Kong banning masks for precisely this purpose.
Rather, their target customers are the most engaged readers of an article like this: people living in camera-coated cities like London and New York who are concerned about "surveillance capitalism" and live within democratic systems where the status quo can be changed. The message in Cap_able's patterns, that the status quo of mass data gathering is unacceptable, is as important to those customers as the protection the garments offer.
Though Cap_able does sell directly to consumers, they've only sold "between 50 and 100 garments," and their intention is to work primarily with other partners. Aided by Italian startup accelerators, their primary goal over the next year is to improve the design and production of their patterns, bring them to new partners, and make their technology effective using less space, in more conditions, and with more muted styles and colors.
You might like the bold style of the "Manifesto" collection, but imagine a Burberry trench coat with a line pattern in cobalt blue; a black wool peacoat with reflective 3M graphics woven in; or even a classic white CDLP T-shirt with a vibrant adversarial box graphic in the center. I'd buy those.
I like the company, its founders, and its message, but I'm evidently in the minority, and the great flaw in Shoshana Zuboff's highly influential 2019 doorstop of a book, "The Age of Surveillance Capitalism," is that it overlooks the simplest explanation for why this status quo won't change: because we, collectively, don't want it to. Privacy was a brief blip in human history, one that started in the 19th century, as households got thick walls and multiple beds and people moved from small communities, where everyone knew everything about each other, to large cities, where you could disappear in the crowd.
Yet now we have the internet, and we like it more. Cameras are everywhere, our faces are in databases, everyone's thoughts endlessly stream to the internet, we broadcast our location to advertisers, strangers can find endless pictures of us, and we love it. A bold sweater isn't changing that.
"Privacy" is like many values: sincerely held until it collides with messy practice, where it loses to convenience and price. We care about human rights until we want cheap fashion and electronics. Our moral compulsion for animal welfare ends with cheap hamburgers. We buy red poppies and ignore homeless veterans.
Everyone uses Google, and few people care that the greatest software innovation of the modern era, generative AI art and language, works by consuming an unfathomable amount of publicly available data.
We care about privacy until it costs us anything, and $400 is a lot for a sweater.