I think that crowdsourcing is a really cool idea. Crowdsourcing is where you ask a couple thousand of your closest friends to help you with some project - maybe to answer an important question. Who will win the Super Bowl? Should I wear a red tie to the interview? Will FitBit stock go up? Will my doggy iPad be a successful product? And, most important, should I order another strawberry margarita, or move right on to my main course of Maker's Mark on the rocks?
Jimi doing some crowdsourcing on whether his vest was the coolest part of the '60s
Naturally, I have blogged about crowdsourcing before. After all, all great men repeat themselves. I wrote a blog post on recommendation engines, and talked a bit about how Netflix cogitates on all your ratings in order to make sure you have a superlative movie viewing experience. In this Valentine's Day blog post, I did a bit of crowdsourcing in the name of romance.
Opinion polls are an early form of crowdsourcing. The Nielsen company once solicited my opinion on TV shows and radio stations. I showed them! I didn't watch any TV, and only listened to classical music on the radio through the entire period. That'll show 'em.
As we are in a political season (and when aren't we?), we are inundated with the latest pontifications from pollsters. But can we trust the pollsters? Are they biased? Who rates the raters? I have an answer for that! Nate Silver is a prominent statistician who applies his science to meta-analysis - analyzing the analysis. Have a look at his webpage that rates the survey companies on how well they follow an unbiased protocol and on their accuracy. And check out this page for the latest compilation of presidential polls.
I don't know how I feel about these poll results
Opinion polls have one big deficiency. In order to get a statistically meaningful answer, you need to ask the opinions of a lot of people, and many of those people don't know or don't care about the question. This is not the most efficient or reliable way to make predictions.
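If you want to see why the sample has to be so big, the usual 95% margin of error for a simple yes/no question shrinks only with the square root of the sample size. Here's a quick back-of-the-envelope sketch (the sample sizes below are just made-up examples):

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate 95% margin of error for a yes/no poll.

    Uses the normal approximation: z * sqrt(p * (1 - p) / n).
    proportion=0.5 is the worst case (the widest interval).
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# The error shrinks with the square root of the sample size,
# so halving the error costs four times as many respondents.
for n in (100, 400, 1600, 6400):
    print(f"{n:>5} respondents -> +/- {margin_of_error(n) * 100:.1f} points")
```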
Let's say you had some reason to want to predict the outcome of the NBA playoffs. I dunno... maybe you had some sort of financial stake? (I assume you are like an owner or something... cuz betting on games is naughty.) One way to get a good prediction is to talk to some people who follow the basketball teams closely. Like me, for example. I can tell you, right off the top of my head, how many RBIs Peyton Manning had when he went up against Tiger Woods in the 2015 Stanley Cup. You definitely would want to get my opinions on Michael Phelps before you put money on the Yankees to win the Super Bowl!
(By the way, I advise against putting money on the Yankees for the Super Bowl.)
The problem is... opinion polls don't take into account the expertise of the people being polled. I would argue that the opinion of ten experts is more reliable than a good random sample of 1,000 random people who are randomly clueless on the random topic.
You don't want the opinion of this random actor!
But, coming up with a panel of real experts on a random topic is a lot of work. Might there be another way that is almost as good?
Here's an interesting take... how about letting people tell you whether they are an expert? Oh. That's a bad idea. Ok, how about this... ask people to put their money where their mouth is?
Racetracks do this every day. And here is the interesting part: The odds on a horse are not based on the expert opinion of some expert. The odds at the racetrack are based entirely on crowdsourcing. When a lot of money has been bet on a given horse, the odds change. Curiously, the odds change in such a way as to make sure that the track makes money. What are the odds of the track making money!?!
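For the curious, the racetrack's math is parimutuel betting: the track skims a fixed percentage off the total pool, and the payout on each horse is simply whatever is left of the pool divided by the money bet on that horse. A toy sketch (the takeout rate and the bets are invented numbers):

```python
# Toy parimutuel odds. The takeout rate and the bets are made-up numbers;
# real tracks keep a healthy double-digit percentage of the pool.
TAKEOUT = 0.18  # the track's guaranteed cut of the total pool

bets = {  # total money wagered on each horse
    "Maker's Marker": 6000.0,
    "Crowd Pleaser": 3000.0,
    "Long Shot Larry": 1000.0,
}

pool = sum(bets.values())
payout_pool = pool * (1 - TAKEOUT)  # what is left over to pay the winners

for horse, wagered in bets.items():
    payout_per_dollar = payout_pool / wagered
    print(f"{horse:>15}: pays about ${payout_per_dollar:.2f} per $1 bet if it wins")

# Notice that the track keeps TAKEOUT * pool no matter which horse wins.
# The "odds" are just the crowd's money talking, minus the house's cut.
```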
You don't like racetracks and all the shady undesirables lurking around? I have an example of crowdsourcing where people declare their expertise with their checkbook. The stock market. If a lot of people bet on a given stock, the price goes up. If no one likes the company, the stock goes down. Each individual decides how much they are willing to pay to buy a stock, or how much they are willing to sell a stock for. Just like the racetrack, only with a different sort of shady undesirables hanging around.
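If you want the toy version of how that price emerges, picture two lists: what buyers are willing to pay, and what sellers are willing to accept. The numbers below are invented; the point is just that nobody decrees "the price" - it lands wherever the most eager buyer meets the most eager seller:

```python
# A toy picture of how a market price comes out of individual decisions.
bids = [10.50, 10.40, 10.25, 9.90]   # prices buyers are willing to pay
asks = [10.45, 10.60, 10.75, 11.00]  # prices sellers are willing to accept

best_bid, best_ask = max(bids), min(asks)

if best_bid >= best_ask:
    print(f"Trade happens around ${best_ask:.2f} "
          f"(someone was willing to pay up to ${best_bid:.2f}).")
else:
    print(f"No trade yet; the market sits between ${best_bid:.2f} and ${best_ask:.2f}.")

# More buyers piling in pushes the best bid up; more sellers push the best
# ask down. That is the stock-market version of crowdsourcing.
```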
Now I have set the stage for a clever idea: Let's say that your company is considering whether a given idea for a new product will pay off. You could give one person that job and hope he/she gets it right. You could get a committee on it, and watch the committee form sub-committees, do focus groups, pay for market research, etc. And two years later, one person will finally have to fire the committee and make the decision. Committees are always the best way to get decisions made fast.
Or (get ready for the cool idea!) you could set up a virtual stock market for your employees to invest fake money in a bunch of potential product ideas. By introducing money - even though it's fake - you get people to invest where they feel they have some expertise. And those who actually have that expertise will tend to invest "correctly" and then have more money with which to sway future ideas.
Of course, the details get a bit involved. There is some fancy math under the hood that is needed to simulate how the price of a stock goes up when you put money into it and goes down when you sell or short a stock. This math is called a "Market Maker". It has nothing to do with Maker's Mark, unfortunately.
The domain of mathemagics
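This post doesn't say exactly which market-maker formula is under the hood, but one popular choice for prediction markets is Hanson's logarithmic market scoring rule (LMSR), where the prices always sum to one and so read directly as the crowd's probability estimates. Here is a minimal sketch of that idea - the liquidity parameter and the trades are made up:

```python
import math

class LMSRMarketMaker:
    """A minimal logarithmic market scoring rule (LMSR) market maker.

    One common way to price shares in a prediction market: each outcome's
    price rises as people buy it and falls as they sell, and prices across
    all outcomes always sum to 1, so they read as crowd probabilities.
    """

    def __init__(self, outcomes, liquidity=100.0):
        self.b = liquidity                        # bigger b = prices move more slowly
        self.shares = {o: 0.0 for o in outcomes}  # outstanding shares per outcome

    def _cost(self):
        # Cost function C(q) = b * ln( sum_i exp(q_i / b) )
        return self.b * math.log(sum(math.exp(q / self.b) for q in self.shares.values()))

    def price(self, outcome):
        # Instantaneous price: softmax of the outstanding shares.
        total = sum(math.exp(q / self.b) for q in self.shares.values())
        return math.exp(self.shares[outcome] / self.b) / total

    def buy(self, outcome, amount):
        """Buy `amount` shares of `outcome`; returns the (fake) money it costs."""
        before = self._cost()
        self.shares[outcome] += amount
        return self._cost() - before

# A made-up internal market: will the doggy iPad be a hit?
market = LMSRMarketMaker(["doggy iPad is a hit", "doggy iPad flops"], liquidity=50.0)
print(f"starting price: {market.price('doggy iPad is a hit'):.2f}")
cost = market.buy("doggy iPad is a hit", 40)      # a true believer piles in
print(f"40 shares cost {cost:.2f} fake dollars")
print(f"new price: {market.price('doggy iPad is a hit'):.2f}")
```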
There is a company in Milwaukee called IdeaWake that has developed some software to do all this. I'm happy to say that I helped them out, just a little bit.