Anti-Surveillance Fashion Tips
Tagged: ArtificialIntelligence / Beauty / CorporateLifeAndItsDiscontents / Politics / Sadness / TheDivineMadness / ϜΤΦ
Tired of all the mass surveillance in our late-capitalism culture? Maybe it’s your fashion sense.
Fashion Sense?!
Look, if you’re about to take fashion advice from me, think again. Those of you who know me IRL know why. For the rest of you: yes, I have a fashion sense (simple clothes, loose fitting, dark solid colors, inconspicuous); no, it is not conventional. Taking fashion advice from a nerd of low social skills like me will not make anything in your life better.
Mass Surveillance
[Yes, this post is post-dated. I have an official “Long COVID” diagnosis now, for the brain fog. Apologies for lack of timeliness!]
You know you’re being tracked online. Or at least you should.
But there are also tons of cameras all over, private and government, capturing video of general public scenes all the time. As the PBS Terra video embedded here [1] mentions, the number of surveillance cameras in the US alone grew from 47 million in 2015 to 70 million in 2018. A 2020 study cited there found such cameras produce a 13% reduction in theft, but have no effect on violent crime (7:37).
The police/government ones are usually armed with facial recognition, and are not shy about tagging you personally in the video. Also, highways have cameras all over that capture license plate numbers for the same purpose.
The authorities know where you went, who you were with, where you drove, how fast you drove, and so on. If there’s an arrest warrant out for you, you’re gonna get picked up pretty fast. That’s the good side, at least most of the time. The bad side is that your info gets captured anyway, without your consent, even if there’s no law enforcement reason. It can be used against you at any time.
Another chilling thought: the face databases on which the facial recognition software was trained include more or less all of our faces, again without permission. They take vast troves of surveillance camera footage, social media photos, state drivers license databases, etc., hand-annotate the faces, and train the AI on that. The people in the images had no choice in the matter.
If you’re Black, Hispanic, Indian/Pakistani, or Native American it gets even worse: the error rate for darker-skinned people is much higher than for Whites. You’re more likely to be mistaken for somebody wanted by the cops. That can be anywhere from inconvenient to life-breaking.
Creepy, much?
Adversarial Examples
There’s a trick widely known in the machine learning community: adversarial examples. Once you know how an AI has been trained, you can – sometimes – cook up a perverse example that fools the system. An early example I saw fooled a fruit-recognition system by sticking a sign reading “apple” on an orange – whereupon the system decided it was an apple.
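To make that concrete, here is a minimal sketch of the classic fast gradient sign method (FGSM) in PyTorch. This is purely illustrative – the pretrained model and the epsilon value are my choices, not anything from the systems discussed here – but it shows how little perturbation it takes to flip a classifier’s answer once you can compute gradients through it:

```python
# Minimal FGSM sketch (illustrative only): nudge every pixel slightly
# in the direction that most increases the classifier's loss.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm(image, true_label, eps=0.03):
    """image: a normalized 1x3xHxW tensor; returns an adversarial copy."""
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # A +/- eps shift per pixel is imperceptible to humans, but it can
    # change the model's top prediction entirely.
    return (image + eps * image.grad.sign()).detach()
```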
Can your clothing do something similar to at least some of the surveillance software?
Yes.
At about 2:35 into the video above, they begin to discuss “adversarial fashion”. Shown here is one of their examples: a shirt with a carefully designed pattern of noise crafted to make a facial recognition system decide there’s no face present. What it lacks in visual charm, it makes up for by making you hard for the surveillance software to see.
The finer details are complicated, but in a nutshell the adversarial patterns signal that something other than a face is present, or that there are lots of tiny faces instead of your one face. Either way, the algorithm comes to doubt that a human is present.
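In code, crafting such a pattern looks roughly like the sketch below. To be clear, this is my own schematic, not the shirt designers’ pipeline: `detector` stands for any model returning per-detection “person” confidence scores, and the patch size, paste location, and step count are invented for illustration.

```python
import torch

def train_patch(detector, images, steps=500, size=64, lr=0.01):
    """Optimize a printable patch that suppresses person detections.

    Assumes detector(img) returns a 1-D tensor of per-box "person"
    confidence scores for a 3xHxW image tensor (at least one score).
    """
    patch = torch.rand(3, size, size, requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)
    for step in range(steps):
        img = images[step % len(images)].clone()
        img[:, :size, :size] = patch.clamp(0, 1)  # paste patch into scene
        loss = detector(img).max()  # confidence of the strongest detection
        # Gradient descent pushes that confidence down, gradually shaping
        # noise the detector "reads" as not-a-person.
        opt.zero_grad()
        loss.backward()
        opt.step()
    return patch.detach()
```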
Here’s a striking video example reported on Twitter, of some research done by Wu, et al. at the University of Maryland. [2]
In the video, the person and those around him are initially well identified by the software, which encloses them in blue rectangles. But six seconds into the video, when he holds the sweater in front of his chest, he’s suddenly no longer recognized (although those around him continue to be).
It appears that the sweater has another scene of people walking printed on it, so perhaps it confuses the detection software as to scale? You’ll have to read the paper below to find out!
Here’s another example, specifically designed to foil night-vision cameras. [3] Instead of adversarial patterns, it uses a more brute-force attack: an array of high-power infrared LEDs.
The hoodie has LEDs that put out IR at roughly the same wavelengths security cameras use for night vision, but that are essentially (if not completely) invisible to humans.
They are then strobed at just the right frequency to mess with the camera’s auto-exposure function: when they’re off, the camera aperture dilates, and is immediately given a blast of bright IR, causing the aperture to contract. Repeat rhythmically as needed.
Result: overexposure and loss of definition. As you can see, the wearer’s entire head is obscured in a bright cloud.
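For flavor, here is what the strobe logic might look like on a Raspberry Pi using the gpiozero library. The GPIO pin, the MOSFET arrangement, and the quarter-second timing are my guesses at “slow enough to whipsaw the auto-exposure”, not Pierce’s actual design – his schematics are linked in note [3].

```python
# Hypothetical IR strobe driver: pin, timing, and driver circuit are
# illustrative guesses, not the Camera-Shy Hoodie's real design.
from gpiozero import PWMLED
from signal import pause

ir_array = PWMLED(18)  # GPIO18 switching the LED array through a MOSFET

# Bright burst -> exposure slams down; dark gap -> exposure creeps back up.
# Repeating the cycle keeps the camera's auto-exposure permanently off balance.
ir_array.blink(on_time=0.25, off_time=0.25, background=True)
pause()  # keep strobing until interrupted
```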
Some Drawbacks
So, are these things ready for prime time and use by everyone?
Not really.
Some reasons:
- Narrow adversary: Adversarial examples have to be re-computed for each supervised classifier they want to fool.
So this trick works against a very specific version of some very specific software. But in practice, the software (a) is always being updated, (b) will inevitably be trained to resist adversarial examples, and (c) has its version number kept secret in any case.
If you ask your institution about how they process their security camera footage, you’re very unlikely to get a cooperative answer. If you ask your local cops how they process surveillance footage, you not only won’t get a helpful answer but may enjoy the hostile scrutiny of a retaliatory investigation.
So the adversarial shirt trick works only once, and depends on you having information you’re unlikely to get in the real world.
- Single-system adversary: Even if you fool one system, you haven’t fooled the others. A newer version of the same software, a different vendor’s product, or an entirely different modality like gait recognition will still pick you out.
- Conspicuousness to people: These bits of clothing, or IR LEDs shining about the head, are difficult for certain bits of software to notice, but glaringly conspicuous to people.
- Wouldn’t you look funny in a shirt that couldn’t be seen by surveillance software, but also is, as one wag termed it, “so ugly we also wish we couldn’t see it”? (Unless you’re at a rave in Honolulu, or a Jimmy Buffett concert, both of which count as rare exceptions.)
- Isn’t the glare of the infrared about the head a – literally – shining counterexample to going unnoticed, when a person sees it?
Wouldn’t a person reviewing surveillance video immediately notice a person in a loud shirt not tagged as a person? Wouldn’t a head hidden in glare stand out?
After all, surveillance software already detects persons in masks skulking about, and flags them for prompt hostile scrutiny. Surely it will quickly do the same with these stunts. They may work once, if you can get the appropriate software spec and version numbers… somehow.
- Future legal issues: Suppose it does work, at least once. Further suppose it’s hard to update the surveillance software to compensate. How long do you think it will be before the rich and powerful institutions and people using surveillance get their pet legislators to make doing this highly illegal? As in, “felony illegal.”
Personally, I’d wager it would take no more than a small number of months. A single-digit small number.
So even under the most optimistic scenarios about this sort of thing working, the clock immediately starts ticking on when it becomes very difficult to try.
So in the (very) short term, you might be inconspicuous to cameras but conspicuous to people. In the longer term, you will be just plain conspicuous to everything.
The Weekend Conclusion
So this is not yet a workable response to surveillance, entertaining though it is. There are too many ways for the software to catch up, or to flag the trick when seen, as is already done with masks.
But we need something: constant surveillance of Muslims in the US after 9/11 changed people’s behavior, in a chilling way that is incompatible with democracy.
Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project (S.T.O.P.), points out that we may have differing levels of trust for different institutions. We probably have different feelings about the local police and the IRS. He says:
we may trust different institutions to wield this power, but none of us trust every institution that’s wielding it to do so unchecked.
(I’ve misplaced the source for this one. If you know, please tell me so I can add a footnote!)
We need policy solutions and regulation with very sharp teeth ready to bite those who abuse surveillance, such as:
- biometric privacy laws,
- “no match/no record”,
- limits on law enforcement, etc.
At this moment in history in the US, Cindy Cohn, executive director of EFF, says:
We’ve reached a kind of a moment in our society where we actually don’t think law could ever be on our side.
At least in the European Union, surveillance data can only be used to investigate serious crimes, not for constant surveillance of the public. The US has no federal policy; anybody can do anything, and the state legislatures are pretty hopelessly gerrymandered for Republican obstinacy and power-worship.
Notes & References
1: PBS Terra, “What If Our Clothes Could Disrupt Surveillance Cameras?”, YouTube, 2023-Sept. ↩
2: Wu, et al., “Making an Invisibility Cloak: Real World Adversarial Attacks on Object Detectors”, arXiv, last revised 2020-Jul-22 (retrieved 2023-Oct-10). arXiv:1910.14667v2. ↩
3: M Pierce, “The Camera-Shy Hoodie”, Mac Pierce web site, undated (retrieved 2023-Oct-10). He’s giving away schematics and a standalone assembly guide to make your own, if that floats your boat. ↩
Gestae Commentaria
Comments for this post are closed pending repair of the comment system, but the Email/Twitter/Mastodon icons at page-top always work.