For years, start-ups have tried to capitalise on the internet’s new trust economy, which is core to services like Airbnb and eBay, where you have to trust strangers with your money and your house. Now, yet another start-up thinks it can solve the trust problem and measure trustworthiness with a browser extension. It’s called Karma, and it hates me.
Actually, I’m not sure exactly what Karma thinks of me. The service allows you to connect your various online accounts, and then, after combing through comments and reviews about you, a mysterious algorithm produces a so-called Karma score. It’s like a credit score for your behaviour online, especially in the sharing economy. I just signed up for the new service and connected as many accounts as I could, everything from Airbnb to eBay to Foursquare to Facebook. The more accounts you connect — and hence the more data you surrender — the higher your score grows. My score is 77 out of 100. That’s a C.
What would you do if you came across someone with a C, an awfully average trustworthiness score? You probably wouldn’t trust them very much. As with food safety ratings, you really want to stick to the 5-out-of-5s. Maybe you could stomach a 4 if there’s no other option, but you turn and run when you see a 2 or 3 in the window. If I broadcast my own C-grade trustworthiness score to Airbnb hosts, I’d expect them to turn and run too.
Just for the record: I’m trustworthy. I was a Scout, and trustworthiness is the first thing mentioned in the Scout Law. When I stay at an Airbnb, I make my own bed and steal nothing. When I buy a product from an Etsy seller, I pay promptly and act politely in our messages. I’m not sure what my Foursquare or Facebook profiles say about how trustworthy I am, but therein lies the problem with algorithmically rating human beings.
The new Karma extension bases its arbitrary score not only on what people say about you but also on how much data you surrender to the algorithm. We can only assume that Karma wants to do something with this data, perhaps sell it to advertisers. In his coverage of Karma, Wired’s Jason Tanz points to this quote from self-confessed “expert on the collaborative economy” Rachel Botsman:
In the 21st century, new trust networks, and the reputation capital they generate, will reinvent the way we think about wealth, markets, power and personal identity, in ways we can’t yet even imagine.
That sounds kind of scary to me. When people use phrases like “ways we can’t even imagine,” I usually think of scenes from post-apocalyptic movies. “We never could’ve imagined that cancer patients driving spike-covered trucks would rule the barren Earth one day,” a character might say. Frankly, the rating system Karma has concocted sounds a lot like the one in Dave Eggers’s dystopian novel about a social media-saturated future, where algorithms know all.
Maybe I’m being negative, but I don’t think that these trust-based startups add value for users. If Karma wants to be some sort of credit score for online behaviour, I don’t see how this makes my life better. Sure, Airbnb hosts will be able to make more snap judgements based on an arbitrary score. I just have to deal with being judged by yet another number that’s based on complicated personal data.
Here’s an example: I’ve been an eBay member since 1999, and my ratings are 100 per cent positive. However, sellers using this new Karma extension will see that I’m only 77 per cent trustworthy and may reconsider doing business with me. That sucks. Meanwhile, Karma will continue gathering data about my every move, every Airbnb stay, every eBay purchase, until I remember to delete my account. That’s creepy.
So I guess I do know what Karma thinks of me. As Botsman said, these new networks deal in the currency of trust. My reputation is a commodity to Karma, one that can be bought and sold. And if you’ve ever been tricked into signing up for a shitty free credit score service, you also know how that commodity can be exploited.
I like the idea of building more trust online. I don’t think arbitrary algorithms and data hoarding are the way forward, though. The world doesn’t need another way to rate humans.