Earlier this year, I was listening to an episode of the Planet Money podcast called “How Much Does This Cow Weigh?” (see Listen Now), and I began thinking about the idea of crowdsourcing inquiries in the clinical ophthalmic community. For those who are not into podcasts based on economic phenomena, Planet Money was launched during the 2008 financial crisis to explain how the economy works through storytelling and interviewing. Planet Money reporters and producers discuss a range of topics, from the raw milk industry to the differences between vodkas. As a longtime listener, I have learned about various economic principles through this program—wisdom of the crowd being one of them.


In 1906, a scientist, polymath, and statistician named Francis Galton came upon an opportunity to test how accurately ordinary people could answer a specific question. Galton was at a county fair where people were competing to guess the correct weight of an ox. He decided to average their guesses and compare the result with the correct answer, and what he discovered surprised him. The average of 800 guesses (1,197 lbs) was nearly identical to the actual weight of the ox (1,198 lbs). Galton concluded that, if you ask enough people the same question, they may arrive at an answer that matches or even surpasses that of an expert.
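The averaging behind Galton's observation is simple enough to sketch in a few lines. The simulation below is purely illustrative (the individual guesses are randomly generated, not Galton's actual fairground data): each simulated guess is wide of the mark, yet their mean lands close to the true weight.

```python
import random

random.seed(42)
TRUE_WEIGHT = 1198  # lbs, the ox's actual weight in Galton's account

# Simulate 800 fairgoers: each guess is the true weight plus an
# individual error (some high, some low, none especially accurate).
guesses = [TRUE_WEIGHT + random.gauss(0, 75) for _ in range(800)]

# The crowd's estimate is simply the mean of all guesses.
crowd_estimate = sum(guesses) / len(guesses)
print(round(crowd_estimate))  # lands within a few pounds of 1,198
```

Because the individual errors are independent and roughly balanced between over- and underestimates, they largely cancel when averaged, which is why the crowd's mean can outperform most individual guessers.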

This unexpected phenomenon—later dubbed the wisdom of the crowd effect—now drives everything from the stock market to the price of basic consumer products. Sourcing the opinion of a crowd to determine an answer was also the beginning of what we now know as crowdsourcing. The term itself is a portmanteau of, as one might expect, crowd and outsourcing and was coined in 2006 by Jeff Howe of Wired. Although crowdsourcing was originally used to describe companies that outsourced tasks to a group of individuals who worked collectively or individually, Howe later expanded the definition to include the application of open-source principles to fields outside software.

In both the tech and research communities, crowdsourcing is related to open innovation, or diverging from conventional closed innovation systems. Over the past decade, crowdsourcing has gained momentum as an approach to problem solving,1,2 through which an organization enlists a large group to solve a problem or a component of a problem and then share solutions.3 This concept has facilitated ways for the public to engage in medical research, including innovation challenges (also known as prize competitions, prize contests, or open contests), hackathons, online systems for collaboration, and other activities.2,4,5

Collective intelligence is not all good, however. A crowd can be swayed, especially if an influential member of the community expresses his or her opinion early in the process. In modern times, this sway is often fueled by moral outrage, as in the case of vaccine denial, which has led to low vaccine uptake in some communities and subsequent outbreaks of once-rare diseases. This so-called madness of the crowd can accompany other well-intentioned causes and can have dire consequences.

Listen Now

Planet Money | Episode 644: How Much Does This Cow Weigh?



When it comes to crowdsourcing in medicine, many patients today turn to the internet for a diagnosis. Crowdsourced medical websites offer patients one way to seek answers. For example, CrowdMed is a site on which patients can submit questions and have them answered by a team of medical detectives, ranging from medical students and nurses to doctors and acupuncturists. The questions may or may not be answered adequately, and posting a query can cost up to $250. The app Figure 1 is a free photo-sharing network for medical professionals—think Instagram for doctors (see Download Now). This service provides users with a way to share interesting cases and receive feedback. However, nothing quite like this exists specifically for the ophthalmic community.

Download Now



Ophthalmologists routinely encounter perplexing cases in clinical practice. Wouldn’t it be valuable if we could poll an audience of our colleagues on a particular diagnosis or a preferred course of action? Although this can partially be done through the American Glaucoma Society (AGS) open forum, larger-scale participation would be needed to truly crowdsource an answer. Further, previous answers would need to be masked so that they would not influence subsequent users. Certain changes to the way in which AGS users interact with the forum would have to be made, but the results could tremendously improve how we share ophthalmic knowledge and cases.


Crowdsourcing can be a valuable tool for answering minor questions, like how much an ox weighs, and more serious medical questions, like which treatment is best in a clinical vignette. The AGS forum and similar listservs pool experts, but it is usually the loudest and most senior voice that wins out. Although expertise is important, the collective experience of the crowd is often more meaningful and more likely to yield a better answer.

On a personal level, when I have a clinical question, I use Instagram stories to crowdsource answers from the ophthalmic community. The benefit is that usually more than a thousand users will see the post, and anywhere from 10% to 20% may reply to the question. On Instagram stories, a user cannot see how other users answered a poll, so each response is not influenced by the prior responses. Of course, the responses can vary, but I find that they usually point toward the correct path.

1. Han L, Chen A, Wei S, Ong JJ, Iwelunmor J, Tucker JD. Crowdsourcing in health and health research: a practical guide. World Health Organization. July 2018. www.who.int/tdr/publications/year/2018/crowdsourcing-practical-guide/en/. Accessed April 23, 2020.

2. Pan SW, Stein G, Bayus B, et al. Systematic review of design contests for health: spurring innovation and mass engagement. BMJ Innovations. 2017;3(4):227-237.

3. Van Ess H. Crowdsourcing: how to find a crowd. Paper presented at: ARD/ZDF Academy; September 17, 2010; Germany.

4. Brabham DC, Ribisl KM, Kirchner TR, Bernhardt JM. Crowdsourcing applications for public health. Am J Prev Med. 2014;46(2):179-187.

5. Tucker JD, Day S, Tang W, Bayus B. Crowdsourcing in medical research: concepts and applications. PeerJ. 2019;7:e6762.