It's a service that attempts to calibrate your "influence" and give that influence a score. It has been the centre of lots of debate about how realistic such an idea is and what we might want to do with it.
Klout is not attempting an easy thing - attempts to measure this sort of intangible quality inevitably run into problems - but it has gained traction as a score that some online habitués seek to maximise.
It has sparked interest of late for a few reasons: people are wondering what the score actually measures, they're wondering what the value exchange is that's being offered to users and they're wondering what all this suggests about media metrics generally.
It started with unease among bloggers about exactly what deal they're doing when they sign up to Klout. Their Klout gets measured - and everyone likes to have a little number indicating just how important they are - but, they wonder, what happens to the data that gets generated? How is it used?
Most bloggers who are deemed influential in a particular category are familiar with the deluge of PR spam that inevitably follows. Are they just signing up for more of that and providing valuable data for the company to sell to marketers?
At the same time, marketers are puzzled about the data they are getting. An article in Advertising Age the other week pointed out that Klout recently recalibrated all its scores, in part because people had learned how to game them. And anyone who watches Twitter for long will see bemused tweets from Klout users detailing the odd and erroneous things the company has declared they have influence over. This shouldn't surprise anyone. This stuff is hard. The social graph is not just dots on a map.
But the most interesting comment on the whole affair has come from Tom Ewing on his Blackbeard Blog. He pointed out that, while the Ad Age article lamented the quality of the Klout score, the greater sadness for the author seemed not to be that the score was wrong - but that it had changed. In an effort to improve it, Klout had undermined its consistency; it was no longer a useful standard.
And that's what the whole industry seems to be looking for: a measure of influence we can trade in. It would be good if it were accurate but, failing that, we could live with something we all agreed on. It's like the consensual hallucination of TV ratings - we all know they're flawed, but we go along with them because at least they're tradable.
It would be ironic if Klout lost influence because it was trying to be more accurate about influence, but that seems to be the odd world we live in.