Mindmasters
Targeted algorithms can influence our thoughts, feelings and behaviors.
After a few hundred likes, Facebook knows you better than your spouse. And that’s not because Mark Zuckerberg cares more about you.
Taken together, the pattern of choices we make online reveals more about us than we ever reveal to any one person. Sandra Matz, Columbia professor and author of the new book Mindmasters, knows this because she spent the early part of her career helping companies use these profiles to refine their marketing efforts.
She is also a pioneer in understanding how that knowledge can be used for much more than selling products. When the story broke in 2016 that Cambridge Analytica had used Facebook data to spread misinformation against Hillary Clinton, Matz was the only scientist at the time who had studied this type of psychological targeting. Her book is a startling compendium of how this technique has advanced since, with both good and bad potential applications.
Escape Is Futile
You might think, as I did, that avoiding social media insulates you from psychological targeting. It doesn’t. Your Google search history reveals so much that it might even be used to recreate your life from scratch - as the 2020 documentary Made to Measure did with a young Austrian woman. Financial transactions are even more revealing - research suggests just three can be enough to pinpoint the person who made them.¹ And then there’s your phone - a device that Matz says is like a stranger that walks behind you and watches everything you do.
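To see why so few data points can single someone out, here is a toy sketch of the re-identification logic. The people, shops and purchases are invented, and the real research works with far richer records (dates, amounts, locations), but the mechanism is just set intersection: each purchase alone leaves several suspects, while the combination leaves one.

```python
from collections import defaultdict

# Toy transaction log: (person, merchant, item).
transactions = [
    ("p1", "cafe", "flat white"),
    ("p1", "bookshop", "novel"),
    ("p1", "pharmacy", "sunscreen"),
    ("p2", "cafe", "flat white"),
    ("p2", "pharmacy", "sunscreen"),
    ("p3", "bookshop", "novel"),
    ("p3", "cafe", "flat white"),
]

# Index: who has made each kind of purchase?
purchasers = defaultdict(set)
for person, merchant, item in transactions:
    purchasers[(merchant, item)].add(person)

def candidates(observed_purchases):
    """Return everyone consistent with all of the observed purchases."""
    sets = [purchasers[p] for p in observed_purchases]
    return set.intersection(*sets) if sets else set()

# Each purchase alone matches two or three people; all three pinpoint p1.
print(candidates([("cafe", "flat white"),
                  ("bookshop", "novel"),
                  ("pharmacy", "sunscreen")]))   # {'p1'}
```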
Algorithms fed with this data can accurately predict your answers on standard personality tests like this one. But is predicting test responses really “knowing” who we are? After all, we might fudge the answers, disguising what we think of as unflattering information. A better benchmark is life outcomes - our professions, hobbies, where we live, etc. These represent what we actually do, not just what we say. Computers also do a great job of using our online footprint to predict those outcomes. In some cases, Matz says, the computer might know something that you don’t know yourself.
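For a sense of what “an algorithm fed with this data” can mean in practice, here is a minimal sketch of the kind of linear model common in this literature: a pattern of likes goes in, a predicted questionnaire score comes out. The numbers are toy data and the model is a stand-in, not the actual pipeline behind the studies Matz describes.

```python
import numpy as np

# Toy data: rows are users, columns are pages they might "like" (1 = liked).
# Real studies use tens of thousands of likes plus dimensionality reduction;
# this only shows the shape of the idea.
likes = np.array([
    [1, 0, 1, 0, 1],   # user A
    [0, 1, 0, 1, 0],   # user B
    [1, 1, 1, 0, 1],   # user C
    [0, 0, 0, 1, 0],   # user D
])
# Self-reported extraversion scores (1-5) from a standard questionnaire.
extraversion = np.array([4.2, 2.1, 4.8, 1.9])

# Ridge regression, closed form: w = (X^T X + lam*I)^-1 X^T y
lam = 0.1
X, y = likes, extraversion
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Predict a new user's extraversion from their likes alone.
new_user = np.array([1, 0, 1, 1, 0])
print(round(float(new_user @ w), 2))
```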
The Basic Bitch Effect
But we don’t always behave consistently. Matz is an introvert who likes dancing. I’m an extrovert who can watch sports for hours alone in a dark room. Algorithms don’t ‘like’ that because they ‘want’ the world to be as predictable - as unchanging - as possible. They aren’t optimized to throw random suggestions at you on the chance you’ll discover something new. They’re designed to feed you suggestions that bring you back to your typical profile. Matz is working on new research exploring the implications:
These agents are not optimized to take risks, and they’re certainly not optimized, currently, to keep you complex given the situation that you’re in.
My current working title for the paper is the Basic Bitch Effect. Because it all makes us more similar and it all makes us so shallow. We’re always the same person. It’s not just that we shrink as individuals, but we also look more similar over time because it’s pulling us in the direction of the average of the population.
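To make that pull toward your “typical profile” concrete, here is a deliberately crude sketch of an exploit-only recommendation loop. The two-dimensional “taste space” and the items are invented; the point is the feedback dynamic, not any real recommender system.

```python
import numpy as np

# Each item and each user lives in a tiny made-up "taste space".
items = {
    "familiar_pop":   np.array([0.90, 0.10]),
    "more_pop":       np.array([0.85, 0.15]),
    "weird_jazz":     np.array([0.10, 0.90]),
    "dance_class_ad": np.array([0.30, 0.80]),
}

# What you clicked before defines your "typical" profile.
history = [np.array([0.90, 0.10]), np.array([0.95, 0.05])]
profile = np.mean(history, axis=0)

def recommend(profile, items):
    # Pure exploitation: serve whatever sits closest to the existing profile.
    return min(items, key=lambda name: np.linalg.norm(items[name] - profile))

for _ in range(3):
    choice = recommend(profile, items)
    print(choice)                                  # familiar_pop, every time
    # Each accepted recommendation nudges the profile back toward itself,
    # so weird_jazz and the dance class never get a chance.
    profile = 0.9 * profile + 0.1 * items[choice]
```

There is no exploration step in this loop, which is the complaint in miniature: nothing ever pulls the profile anywhere new.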
This could be especially problematic for generations growing up now, whose life experiences might be dominated by online interactions. Matz worries that they will grow up lacking the skills to deal with the complexities, unpredictability and irrationality of the physical world.
Profiling does have positive applications, though, among them using deviations from a known behavioral baseline as an early warning sign for things like depression. Phones are potentially very powerful in this regard because they capture not only your online behavior but also your physical movements. Someone with a known risk of depression might be able to set a ‘smoke alarm’ that goes off when their behavior starts to signal a risky period - perhaps alerting a trusted person to check on them.
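Here is a minimal sketch of what such a smoke alarm could look like in its simplest form. The signal, window and thresholds are all invented for illustration and are not taken from the book; the idea is just a personal baseline plus a check for sustained deviation from it.

```python
import statistics

def smoke_alarm(daily_signal, baseline_days=14, threshold=2.0, min_run=3):
    """
    Flag days where a behavioral signal (e.g. time spent out of the house,
    messages sent, steps walked) drops well below the person's own baseline
    for several days in a row. All parameters here are illustrative.
    """
    baseline = daily_signal[:baseline_days]
    mean, sd = statistics.mean(baseline), statistics.stdev(baseline)
    alerts, run = [], 0
    for day, value in enumerate(daily_signal[baseline_days:], start=baseline_days):
        z = (value - mean) / sd
        run = run + 1 if z < -threshold else 0
        if run >= min_run:
            alerts.append(day)   # e.g. ping a trusted contact
    return alerts

# Two weeks of typical activity, then a sustained drop.
signal = [60, 55, 62, 58, 64, 59, 61, 57, 63, 60, 58, 62, 59, 61,
          20, 18, 15, 17]
print(smoke_alarm(signal))   # -> [16, 17]
```

A real system would use richer signals and personalized thresholds, but the core idea is the one Matz describes: deviation from your own baseline, not comparison with other people.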
A Picture Is Worth A Lot More Than 1,000 Words
Take a close look at this man. Is he conservative or progressive? Gay or straight? Introvert or extrovert? Can he hit a reliable backhand? We all know these questions can’t be answered just by looking at a photo.
Actually, new research suggests they can.
Some of this predictability could be due to grooming decisions. In the picture above, the choice of a lavender shirt that subtly accentuates the stripes in the jacket could signal the sartorial sophistication of an extrovert. In other images, extroverts are known to wear blue contact lenses and choose flattering camera angles. But the research goes further. Even when all grooming choices are stripped away and only facial features remain, the data suggests computers can still make good predictions for personality, sexual orientation and perhaps even political views. Here’s Matz:
Take extroversion, if you are a really beautiful kid…the likelihood that you might also turn out a little bit more social and extroverted and trusting in other people could actually go up…because you get a lot of positive social feedback. Another pathway could be hormones. We know that there’s certain types of hormones like testosterone that very much influence our behavior and the way that we show up. It’s essentially related to being somewhat more aggressive and assertive. But we also know that hormones shape our facial features.
So, there might just be certain parts of biology that determine both behavior and the way that our faces look. So, if you take some of these pointers of like, there are certain pathways by which this might play out, then it’s also conceivable that computers, just because they can take in so much information, might pick up on these subtle cues that we, as humans, dismiss.
It’s startling work and more evidence of the potential power these algorithms have. Power to influence your opinions, emotions, and ultimately behavior. Sometimes this might be good; sometimes it’s bad.
In the last third of her book, Matz talks about how to make data work for us. For example, many companies might prefer not to store your data at all - it’s expensive and comes with security risks. Instead, your data could stay locked on your phone, and the algorithm would be sent there whenever the app needs it. The model downloads to your phone, reads the data locally, produces its result and updates itself from any new data it encounters. You get the customized results without handing over your personal information:
That’s a total game changer because now I can say, hey, I do get exactly the same benefits, but without the downside of you now having my data.
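For the curious, here is a toy sketch of that on-device pattern. It is my own illustration rather than an architecture from the book: the model’s weights travel to the phone, the raw data never leaves it, and only updated weights come back.

```python
import numpy as np

class OnDeviceModel:
    """Toy linear model that is shipped to the phone, scores local data,
    and updates its own weights there. Only weights ever leave the device."""

    def __init__(self, weights):
        self.w = np.asarray(weights, dtype=float)

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y, lr=0.01):
        # One step of gradient descent on squared error, computed locally.
        error = self.predict(x) - y
        self.w -= lr * error * x
        return self.w   # the only thing that needs to be sent back

# --- on the server: ship the current weights, never see raw data ---
shipped_weights = [0.2, -0.1, 0.5]

# --- on the phone: private data stays here ---
local_features = np.array([1.0, 3.0, 0.5])   # e.g. derived from app usage
local_outcome = 2.0                           # e.g. the user's own rating

model = OnDeviceModel(shipped_weights)
print("local prediction:", model.predict(local_features))
updated = model.update(local_features, local_outcome)

# --- back to the server: only the updated weights travel ---
print("weights returned:", updated)
```

This is roughly the shape of federated and on-device learning, where model updates rather than raw behavior are what cross the network.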
Matz ends her book by saying we have a unique opportunity to reclaim control over our lives and create a collective data infrastructure that benefits all of us. Listen to our conversation to hear more about how this can happen:
In the interest of journalistic openness, here are my last three transactions. (1) Printer ink in Whakatane, New Zealand. (2) Heating pad at Ohope Beach Pharmacy, Ohope, New Zealand. (3) Wine from Liquorland, Ohope, New Zealand. How many other people in this rural New Zealand district are writing a book that requires excessive printing (purchase 1) and have an injured right shoulder (purchases 2 & 3)?