Daren C. Brabham defined crowdsourcing as an "online, distributed problem-solving and production model." Kristen L. Guth and Brabham found that the performance of ideas offered on crowdsourcing platforms is affected not only by their quality, but also by the communication among users about the ideas and by their presentation on the platform itself.
Despite the multiplicity of definitions for crowdsourcing, one constant has been the broadcasting of problems to the public and an open call for contributions to help solve the problem. Members of the public submit solutions that are then owned by the entity that originally broadcast the problem. In some cases, the contributor of the solution is compensated monetarily with prizes or public recognition. In other cases, the only rewards may be praise or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time, from experts, or from small businesses.
While the term "crowdsourcing" was popularized online to describe Internet-based activities, some earlier projects can, in retrospect, be described as crowdsourcing.
Crowdsourcing has often taken the form of competitions to discover a solution. The French government sponsored several such competitions, often rewarded with Montyon Prizes. These included the Alkali prize, in which a reward was offered for a process to produce alkali from sea salt and which led to the Leblanc process, and a prize that led to Fourneyron's turbine, the first commercial hydraulic turbine.
A number of motivations exist for businesses to use crowdsourcing to accomplish their tasks. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than is present in one organization, and undertake problems that would have been too difficult to solve internally. Crowdsourcing allows businesses to submit problems on which contributors can work—on topics such as science, manufacturing, biotech, and medicine—optionally with monetary rewards for successful solutions. Although crowdsourcing complicated tasks can be difficult, simple work tasks can be crowdsourced cheaply and effectively.
Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use. Urban and transit planning are prime areas for crowdsourcing. For example, from 2008 to 2009, a crowdsourcing project for transit planning in Salt Lake City was created to test the public participation process. Another notable application of crowdsourcing for government problem-solving is Peer-to-Patent, which was an initiative to improve patent quality in the United States through gathering public input in a structured, productive manner.
Researchers have used crowdsourcing systems such as Amazon Mechanical Turk or CloudResearch to aid their research projects by crowdsourcing some aspects of the research process, such as data collection, parsing, and evaluation, to the public. Notable examples include using the crowd to create speech and language databases, to conduct user studies, and to run behavioral science surveys and experiments. Crowdsourcing systems provide researchers with the ability to gather large amounts of data and help them collect data from populations and demographics they may not have access to locally.
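The sketch below illustrates, under stated assumptions, how such a research task might be posted programmatically: it uses the boto3 client for Amazon Mechanical Turk to create a one-question survey HIT. The sandbox endpoint, reward, question text, and parameter values are illustrative choices, not details drawn from any particular study, and AWS credentials are assumed to be configured.

```python
# Minimal sketch: posting a survey task (HIT) to the Amazon Mechanical Turk sandbox.
# Endpoint, reward, and question text are illustrative assumptions, not study details.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",  # sandbox, not production
)

# A single open-ended question expressed in MTurk's QuestionForm XML schema.
question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>q1</QuestionIdentifier>
    <QuestionContent><Text>In one sentence, describe your typical commute.</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

hit = mturk.create_hit(
    Title="Short survey about daily commuting",
    Description="Answer one open-ended question (about 1 minute).",
    Keywords="survey, research",
    Reward="0.10",                      # USD per completed assignment
    MaxAssignments=50,                  # number of distinct workers requested
    LifetimeInSeconds=3 * 24 * 3600,    # how long the task stays available
    AssignmentDurationInSeconds=600,    # time each worker has to finish
    Question=question_xml,
)
print("HIT created:", hit["HIT"]["HITId"])
```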
Artists have also used crowdsourcing systems. In a project called the Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world. Artist Sam Brown leveraged the crowd by asking visitors to his website explodingdog to send him sentences to use as inspiration for his paintings. Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized. As in other fields, artists use crowdsourcing systems to generate and collect data. The crowd can also be used to provide inspiration and to collect financial support for an artist's work.
The use of crowdsourcing in medical and health research is increasing systematically. The process involves outsourcing tasks to, or gathering input from, a large and diverse group of people, often facilitated through digital platforms, to contribute to medical research, diagnostics, data analysis, promotion, and various healthcare-related initiatives. This approach provides a useful community-based method for improving medical services.
From funding individual medical cases and innovative devices to supporting research, community health initiatives, and crisis responses, crowdsourcing has been applied to a wide range of healthcare challenges.
Another approach is to source the results of clinical algorithms from the collective input of participants. In a study published by SPIE, researchers developed a crowdsourcing tool to train individuals, especially middle and high school students in South Korea, to diagnose malaria-infected red blood cells. Using a statistical framework, the platform combined expert diagnoses with those from minimally trained individuals, creating a gold standard library. The objective was to swiftly teach people to achieve high diagnostic accuracy without extensive prior training.
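As a rough illustration of how expert and minimally trained diagnoses might be combined, the following sketch weights each contributor's vote by an assumed historical accuracy. The data, weights, and aggregation rule are invented for illustration and do not reproduce the statistical framework used in the cited study.

```python
# Sketch: accuracy-weighted majority vote over crowd diagnoses of blood-cell images.
# Voter accuracies and labels are illustrative, not taken from the study.
from collections import defaultdict

def weighted_vote(labels_by_voter, accuracy_by_voter):
    """labels_by_voter: {voter: {image_id: 'infected' | 'healthy'}}"""
    scores = defaultdict(lambda: defaultdict(float))
    for voter, labels in labels_by_voter.items():
        weight = accuracy_by_voter.get(voter, 0.5)  # unknown voters get a neutral weight
        for image_id, label in labels.items():
            scores[image_id][label] += weight
    # Pick the label with the highest total weight for each image.
    return {img: max(lbls, key=lbls.get) for img, lbls in scores.items()}

labels = {
    "expert_1":  {"img1": "infected", "img2": "healthy"},
    "student_1": {"img1": "infected", "img2": "infected"},
    "student_2": {"img1": "healthy",  "img2": "healthy"},
}
accuracy = {"expert_1": 0.98, "student_1": 0.70, "student_2": 0.65}
print(weighted_vote(labels, accuracy))
# {'img1': 'infected', 'img2': 'healthy'}
```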
A more recent example of crowdsourcing in astronomy is NASA's photo organizing project, which asked internet users to browse photos taken from space and try to identify the location each picture documents.
In the field of behavioral science, crowdsourcing is often used to gather data and insights on human behavior and decision-making. Researchers may create online surveys or experiments that are completed by a large number of participants, allowing them to collect a diverse and potentially large amount of data. Crowdsourcing can also be used to gather real-time data on behavior, such as through mobile apps that track and record users' activities and decision-making. The use of crowdsourcing in behavioral science has the potential to greatly increase the scope and efficiency of research, and it has been used in studies on topics such as psychology, political attitudes, and social media use.
Institutes that have records of interest to genealogical research have used crowds of volunteers to create catalogs and indices to records.
Crowdsourcing is increasingly used in professional journalism. Journalists organize crowdsourced information by fact-checking it and then use it in their articles as they see fit. A daily newspaper in Sweden successfully used crowdsourcing in 2013–2014 to investigate home loan interest rates in the country, which resulted in over 50,000 submissions. A daily newspaper in Finland crowdsourced an investigation into stock short-selling in 2011–2012, and the crowdsourced information led to revelations of a tax evasion scheme at a Finnish bank. The bank executive was fired and policy changes followed. In 2008, TalkingPointsMemo in the United States asked its readers to examine 3,000 emails concerning the dismissal of federal prosecutors. The British newspaper The Guardian crowdsourced the examination of hundreds of thousands of documents in 2009.
Data donation is a crowdsourcing approach to gathering digital data. It is used by researchers and organizations to gain access to data from online platforms, websites, search engines, apps, and devices. Data donation projects usually rely on participants volunteering their authentic digital profile information.
Crowdsourcing public policy and the production of public services is also referred to as citizen sourcing. While some scholars regard crowdsourcing of this kind as a policy tool or a definite means of co-production, others question this and argue that crowdsourcing should be considered only a technological enabler that increases the speed and ease of participation. Crowdsourcing can also play a role in democratization.
The first conference focusing on Crowdsourcing for Politics and Policy took place at Oxford University, under the auspices of the Oxford Internet Institute, in 2014. Since 2012, research has emerged that focuses on the use of crowdsourcing for policy purposes, including experimental investigations of virtual labor markets for policy assessment and assessments of the potential for citizen involvement in process innovation for public administration.
Governments across the world are increasingly using crowdsourcing for knowledge discovery and civic engagement. Iceland crowdsourced its constitutional reform process in 2011, and Finland has crowdsourced several law reform processes to address its off-road traffic laws. The Finnish government allowed citizens to go on an online forum to discuss problems and possible resolutions regarding some off-road traffic laws. The crowdsourced information and resolutions would then be passed on to legislators to refer to when making decisions, allowing citizens to contribute to public policy in a more direct manner. Palo Alto has crowdsourced feedback for its Comprehensive City Plan update in a process started in 2015. The House of Representatives in Brazil has used crowdsourcing in policy reforms.
Crowdsourcing has been used extensively for gathering language-related data.
In linguistics, crowdsourcing strategies have been applied to estimate word knowledge, vocabulary size, and word origin. Implicit crowdsourcing on social media has also been used to approximate sociolinguistic data efficiently. Reddit conversations in various location-based subreddits were analyzed for the presence of grammatical forms unique to a regional dialect, and these were then used to map the extent of the speaker population. The results could roughly approximate large-scale surveys on the subject without engaging in field interviews.
Mining publicly available social media conversations can be used as a form of implicit crowdsourcing to approximate the geographic extent of speaker dialects. Proverb collection is also being done via crowdsourcing on the Web, most notably for the Pashto language of Afghanistan and Pakistan. Crowdsourcing has been extensively used to collect high-quality gold standards for creating automatic systems in natural language processing (e.g. named entity recognition, entity linking).
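A minimal sketch of this kind of implicit crowdsourcing is shown below: it counts occurrences of a single regional grammatical form ("needs" plus a past participle, as in "needs washed") across hypothetical location-based subreddits. The subreddit names, comments, and single-pattern approach are illustrative assumptions, not the methodology of any specific study.

```python
# Sketch: estimating the regional spread of a dialect feature from subreddit comments.
# Comments and subreddit names are made-up examples; a real study would pull data
# from an API or data dump and use many features, not a single regex.
import re

# "Needs + past participle" (e.g. "needs washed"), a form found in some US dialects.
FEATURE = re.compile(r"\bneeds\s+\w+ed\b", re.IGNORECASE)

comments_by_subreddit = {
    "Pittsburgh": ["The car needs washed before winter.", "Anyone else stuck in traffic?"],
    "Columbus":   ["My lawn needs mowed again.", "Great game last night!"],
    "Boston":     ["The car needs to be washed.", "Where can I park near Fenway?"],
}

def feature_rate(comments):
    """Fraction of comments containing the dialect feature."""
    hits = sum(bool(FEATURE.search(c)) for c in comments)
    return hits / len(comments) if comments else 0.0

for subreddit, comments in comments_by_subreddit.items():
    print(f"r/{subreddit}: {feature_rate(comments):.0%} of comments show the feature")
```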
Organizations often leverage crowdsourcing to gather ideas for new products as well as to refine established products. Lego allows users to submit new product designs while conducting requirements testing. Any user can provide a design for a product, and other users can vote on it. Once a submitted design has received 10,000 votes, it is formally reviewed in stages and, provided no impediments such as legal flaws are identified, goes into production; the creator receives royalties from the net income. Labelling new products as "customer-ideated" through crowdsourcing initiatives, as opposed to not specifying the source of design, leads to a substantial increase in the actual market performance of the products. Merely highlighting the source of design to customers, particularly attributing the product to crowdsourcing efforts from user communities, can lead to a significant boost in product sales. Consumers perceive "customer-ideated" products as more effective in addressing their needs, leading to a quality inference. The design mode associated with crowdsourced ideas is considered superior in generating promising new products, contributing to the observed increase in market performance.
Crowdsourcing is widely used by businesses to source feedback and suggestions on how to improve their products and services. Homeowners can use Airbnb to list their accommodation or unused rooms. Owners set their own nightly, weekly, and monthly rates, and the business, in turn, charges guests and hosts a fee. Guests typically pay a booking fee, usually between $9 and $15, each time they book a room. The host, in turn, pays a service fee on the amount due. The company has more than 1.5 million listings in 34,000 cities in more than 190 countries.
Crowdsourcing is frequently used in market research as a way to gather insights and opinions from a large number of consumers. Companies may create online surveys or focus groups that are open to the general public, allowing them to gather a diverse range of perspectives on their products or services. This can be especially useful for companies seeking to understand the needs and preferences of a particular market segment or to gather feedback on the effectiveness of their marketing efforts. The use of crowdsourcing in market research allows companies to quickly and efficiently gather a large amount of data and insights that can inform their business decisions.
Internet and digital technologies have massively expanded the opportunities for crowdsourcing. However, the effect of user communication and platform presentation can have a major bearing on the success of an online crowdsourcing project. The crowdsourced problem can range from huge tasks (such as finding alien life or mapping earthquake zones) to very small ones (such as identifying images). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into niche knowledge of proud experts, and subjects that people find sympathetic.
Ivo Blohm identifies four types of crowdsourcing platforms: microtasking, information pooling, broadcast search, and open collaboration. They differ in the diversity and aggregation of the contributions that are created. The diversity of information collected can be either homogeneous or heterogeneous, and the aggregation of information can be either selective or integrative. Some common categories of crowdsourcing that have been used effectively in the commercial world include crowdvoting, crowdsolving, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests.
Crowdvoting occurs when a website gathers a large group's opinions and judgments on a certain topic. Some crowdsourcing tools and platforms allow participants to rank each other's contributions, e.g. in answer to the question "What is one thing we can do to make Acme a great company?" One common method for ranking is "like" counting, where the contribution with the most "like" votes ranks first. This method is simple and easy to understand, but it privileges early contributions, which have more time to accumulate votes. In recent years, several crowdsourcing companies have begun to use pairwise comparisons backed by ranking algorithms. Ranking algorithms do not penalize late contributions. They also produce results quicker. Ranking algorithms have proven to be at least 10 times faster than manual stack ranking. One drawback, however, is that ranking algorithms are more difficult to understand than vote counting.
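The contrast between the two approaches can be sketched as follows: ranking by raw "like" counts versus fitting a simple Bradley–Terry score from pairwise judgments. The ideas, vote counts, and comparisons below are invented for illustration; real platforms use their own algorithms.

```python
# Sketch: ranking crowd contributions by "like" counts vs. pairwise comparisons.
# Ideas, like counts, and comparisons are invented for illustration.

like_counts = {"idea_A": 120, "idea_B": 95, "idea_C": 40}   # early ideas accumulate more likes
ranked_by_likes = sorted(like_counts, key=like_counts.get, reverse=True)

# Pairwise judgments: (winner, loser) pairs from "which of these two is better?" questions.
comparisons = [
    ("idea_C", "idea_A"), ("idea_C", "idea_B"), ("idea_C", "idea_A"),
    ("idea_A", "idea_B"), ("idea_B", "idea_A"), ("idea_C", "idea_B"),
]

def bradley_terry(comparisons, n_iters=100):
    """Fit a simple Bradley-Terry strength score for each item from pairwise wins."""
    items = sorted({item for pair in comparisons for item in pair})
    strength = {i: 1.0 for i in items}
    for _ in range(n_iters):
        updated = {}
        for item in items:
            wins = sum(1 for winner, _ in comparisons if winner == item)
            denom = 0.0
            for winner, loser in comparisons:
                if item == winner:
                    denom += 1.0 / (strength[item] + strength[loser])
                elif item == loser:
                    denom += 1.0 / (strength[item] + strength[winner])
            updated[item] = wins / denom if denom else strength[item]
        total = sum(updated.values())
        strength = {i: s / total for i, s in updated.items()}  # normalize each round
    return strength

scores = bradley_terry(comparisons)
ranked_by_pairs = sorted(scores, key=scores.get, reverse=True)
print("By likes:   ", ranked_by_likes)   # ['idea_A', 'idea_B', 'idea_C']
print("By pairwise:", ranked_by_pairs)   # 'idea_C' ranks first despite the fewest likes
```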
Crowdvoting's value in the movie industry was shown when in 2009 a crowd accurately predicted the success or failure of a movie based on its trailer, a feat that was replicated in 2013 by Google.
On Reddit, users collectively rate web content, discussions, and comments, as well as questions posed to persons of interest in "AMA" and AskScience online interviews.
Crowdfunding is the process of funding projects by a multitude of people contributing a small amount to attain a certain monetary goal, typically via the Internet. Crowdfunding has been used for both commercial and charitable purposes. The crowdfunding model that has been around the longest is rewards-based crowdfunding, in which people can pre-purchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.
Individuals, businesses, and entrepreneurs can showcase their businesses and projects by creating a profile, which typically includes a short video introducing the project, a list of rewards per donation, and illustrations through images. Funders make monetary contributions for numerous reasons.
As of 2012, equity crowdfunding in the US was in limbo while the Securities and Exchange Commission refined its regulations; the agency had until 1 January 2013 to adjust the fundraising rules. Regulators were overwhelmed trying to implement Dodd-Frank and all the other rules and regulations involving public companies and the way they traded. Advocates of regulation claimed that crowdfunding would open up the flood gates for fraud, called it the "wild west" of fundraising, and compared it to the 1980s days of penny stock "cold-call cowboys". The process allowed for up to $1 million to be raised without some of the regulations being involved. Under the then-current proposal, companies would have exemptions available and be able to raise capital from a larger pool of persons, with lower thresholds for investor criteria, whereas the old rules required that each person be an "accredited" investor. Investors are often recruited from social networks, and the funds can be acquired through an equity purchase, loan, donation, or pre-ordering. The amounts collected have become quite high, with some requests exceeding a million dollars, as for Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.
Web-based idea competitions or inducement prize contests often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. One example is IBM's 2006 "Innovation Jam", which was attended by over 140,000 international participants and yielded around 46,000 ideas. Another example is the Netflix Prize, awarded in 2009, which asked people to come up with a recommendation algorithm more accurate than Netflix's existing one. Its grand prize of US$1,000,000 was given to a team whose algorithm beat Netflix's own algorithm for predicting ratings by 10.06%.
Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet can still be very effective in completing certain tasks. Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely where a third party gains information for another topic based on the user's actions.
Piggyback crowdsourcing can be seen most frequently in websites such as Google that data-mine users' search histories and browsing behavior to discover keywords for ads, correct spelling, and find synonyms. In this way, users unintentionally help to modify existing systems such as Google Ads.
The demographics of Microworkers.com differ from those of Mechanical Turk in that the US and India together account for only 25% of workers; 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest shares. However, 28% of employers are from the US.
Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession. Claiming that crowds are amateurs rather than professionals is factually untrue and may lead to the marginalization of crowd labor rights.
Gregory Saxton et al. studied the role of community users, among other elements, in their content analysis of 103 crowdsourcing organizations. They developed a taxonomy of nine crowdsourcing models (intermediary model, citizen media production, collaborative software development, digital goods sales, product design, peer-to-peer social financing, consumer report model, knowledge base building model, and collaborative science project model) in which to categorize the roles of community users, such as researcher, engineer, programmer, journalist, graphic designer, etc., and the products and services developed.
Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations refer to motivations related to the fun and enjoyment contributors experience through their participation. These motivations include: skill variety, task identity, task autonomy, direct feedback from the job, and taking the job as a pastime. Community-based motivations refer to motivations related to community participation, and include community identification and social contact. In crowdsourced journalism, the motivation factors are intrinsic: the crowd is driven by a possibility to make social impact, contribute to social change, and help their peers.
Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs, through monetary payment, are the immediately received compensations given to those who complete tasks. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially, such as the altruistic motivations of online volunteers. Chandler and Kapelner found that US users of Amazon Mechanical Turk were more likely to complete a task when told they were going to help researchers identify tumor cells than when they were not told the purpose of their task. However, of those who completed the task, the quality of output did not depend on the framing.
Motivation in crowdsourcing is often a mix of intrinsic and extrinsic factors. In a crowdsourced law-making project, the crowd was motivated by both intrinsic and extrinsic factors. Intrinsic motivations included fulfilling civic duty, affecting the law for sociotropic reasons, and deliberating with and learning from peers. Extrinsic motivations included changing the law for financial gain or other benefits. Participation in crowdsourced policy-making was an act of grassroots advocacy, whether to pursue one's own interest or more altruistic goals, such as protecting nature. Participants in online research studies report their motivation as both intrinsic enjoyment and monetary gain.
Despite the potential global reach of IT applications online, recent research illustrates that differences in location affect participation outcomes in IT-mediated crowds.
While there is a lot of anecdotal evidence that illustrates the potential of crowdsourcing and the benefits organizations have derived from it, there is also scientific evidence that crowdsourcing initiatives often fail. At least six major topics cover the limitations and controversies about crowdsourcing:
Crowdsourcing initiatives often fail to attract sufficient or beneficial contributions. The vast majority of crowdsourcing initiatives attract hardly any contributions; an analysis of thousands of organizations' crowdsourcing initiatives shows that only initiatives at the 90th percentile and above attract more than one contribution a month. While crowdsourcing initiatives may be effective in isolation, when faced with competition they may fail to attract sufficient contributions. Nagaraj and Piezunka (2024) illustrate that OpenStreetMap struggled to attract contributions once Google Maps entered a country.
Crowdsourcing allows anyone to participate, allowing for many unqualified participants and resulting in large quantities of unusable contributions. Companies, or additional crowdworkers, then have to sort through the low-quality contributions. The task of sorting through crowdworkers' contributions, along with the necessary job of managing the crowd, requires companies to hire actual employees, thereby increasing management overhead. For example, susceptibility to faulty results can be caused by targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, a financial incentive often causes workers to complete tasks quickly rather than well. Verifying responses is time-consuming, so employers often depend on having multiple workers complete the same task to correct errors. However, having each task completed multiple times increases time and monetary costs. Some companies, like CloudResearch, control data quality by repeatedly vetting crowdworkers to ensure they are paying attention and providing high-quality work.
Crowdsourcing quality is also affected by task design. Lukyanenko et al. argue that the prevailing practice of modeling crowdsourcing data-collection tasks in terms of fixed classes (options) unnecessarily restricts quality. Their results demonstrate that information accuracy depends on the classes used to model domains, with participants providing more accurate information when classifying phenomena at a more general level (which is typically less useful to sponsor organizations, and hence less common). Further, greater overall accuracy is expected when participants can provide free-form data compared with tasks in which they select from constrained choices. In behavioral science research, it is often recommended to include open-ended responses, in addition to other forms of attention checks, to assess data quality.
Just as limiting, oftentimes the crowd lacks the skills or expertise needed to accomplish the desired task successfully. While this scenario does not affect "simple" tasks such as image labeling, it is particularly problematic for more complex tasks, such as engineering design or product validation. A comparison between expert evaluations of business models and those of an anonymous online crowd showed that the anonymous crowd could not evaluate business models to the same level as the experts. In these cases, it may be difficult or even impossible to find qualified people in the crowd, as their responses represent only a small fraction of the workers compared with consistent but incorrect crowd members. However, if the task is "intermediate" in its difficulty, estimating crowdworkers' skills and intentions and leveraging them to infer true responses works well, albeit with an additional computation cost.
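One common family of techniques for this "intermediate" case alternates between estimating the true answers and estimating each worker's reliability. The sketch below is a simplified, Dawid–Skene-style illustration using invented data; it is not the specific model referred to above.

```python
# Sketch: jointly estimating true answers and worker reliability from redundant labels.
# A simplified Dawid-Skene-style iteration; the workers, tasks, and answers are invented.
def infer_answers(labels, n_iters=10):
    """labels: {worker: {task: answer}} -> (estimated answers, worker accuracies)"""
    tasks = sorted({t for per_worker in labels.values() for t in per_worker})
    accuracy = {w: 0.8 for w in labels}          # start by trusting everyone a bit
    answers = {}
    for _ in range(n_iters):
        # Step 1: pick, per task, the answer with the most accuracy-weighted support.
        for task in tasks:
            votes = {}
            for worker, per_worker in labels.items():
                if task in per_worker:
                    ans = per_worker[task]
                    votes[ans] = votes.get(ans, 0.0) + accuracy[worker]
            answers[task] = max(votes, key=votes.get)
        # Step 2: a worker's accuracy is how often they match the current consensus.
        for worker, per_worker in labels.items():
            agree = sum(answers[t] == a for t, a in per_worker.items())
            accuracy[worker] = (agree + 1) / (len(per_worker) + 2)  # smoothed estimate
    return answers, accuracy

labels = {
    "w1": {"t1": "A", "t2": "B", "t3": "A"},
    "w2": {"t1": "A", "t2": "B", "t3": "B"},
    "w3": {"t1": "B", "t2": "B", "t3": "B"},   # disagrees with the consensus on t1
}
answers, accuracy = infer_answers(labels)
print(answers)    # {'t1': 'A', 't2': 'B', 't3': 'B'}
print(accuracy)   # higher values for workers who agree with the consensus more often
```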
Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing to quickly and cheaply conduct studies with larger sample sizes than would otherwise be achievable. However, due to limited access to the Internet, participation in less developed countries is relatively low. Participation in highly developed countries is similarly low, largely because the low pay is not a strong motivation for most users in these countries. These factors bias the participant pool towards users in countries with a medium ranking on the Human Development Index. Participants in these countries sometimes masquerade as U.S. participants to gain access to certain tasks. This led to the "bot scare" on Amazon Mechanical Turk in 2018, when researchers thought bots were completing research surveys because of the lower quality of responses originating from medium-developed countries.
The likelihood that a crowdsourced project will fail due to lack of monetary motivation or too few participants increases over the course of the project. Tasks that are not completed quickly may be forgotten, buried by filters and search procedures; this results in a long-tail, power-law distribution of completion times. Additionally, low-paying research studies online have higher rates of attrition, with participants not completing the study once started. Even when tasks are completed, crowdsourcing does not always produce quality results. When Facebook began its localization program in 2008, it encountered some criticism for the low quality of its crowdsourced translations. One of the problems of crowdsourcing products is the lack of interaction between the crowd and the client. Usually little information is known about the final product, and workers rarely interact with the final client in the process. This can decrease the quality of the product, as client interaction is considered a vital part of the design process.
An additional cause of the decrease in product quality that can result from crowdsourcing is the lack of collaboration tools. In a typical workplace, coworkers are organized in such a way that they can work together and build upon each other's knowledge and ideas. Furthermore, the company often provides employees with the necessary information, procedures, and tools to fulfill their responsibilities. However, in crowdsourcing, crowd-workers are left to depend on their own knowledge and means to complete tasks.
A crowdsourced project is usually expected to be unbiased by incorporating a large population of participants with diverse backgrounds. However, much crowdsourced work is done by people who are paid or who directly benefit from the outcome (e.g., many of the open-source developers working on Linux). In many other cases, the end product is the outcome of a single person's endeavor, who creates the majority of the product, while the crowd participates only in minor details.
To make an idea turn into a reality, the first component needed is capital. Depending on the scope and complexity of the crowdsourced project, the amount of necessary capital can range from a few thousand dollars to hundreds of thousands, if not more. The capital-raising process can take from days to months depending on different variables, including the entrepreneur's network and the amount of initial self-generated capital.
The crowdsourcing process allows entrepreneurs to access a wide range of investors who can take different stakes in the project. As a result, crowdsourcing simplifies the capital-raising process and allows entrepreneurs to spend more time on the project itself and on reaching milestones rather than on getting it started. Overall, the simplified access to capital can save time in starting projects and potentially increase the efficiency of projects.
Others argue that easier access to capital through a large number of smaller investors can hurt the project and its creators. With a simplified capital-raising process involving more investors with smaller stakes, investors are more risk-seeking because they can take on an investment size with which they are comfortable. Because entrepreneurs no longer depend on a single investor for the survival of their project, they may lose the experience of convincing wary investors why the project can succeed despite its risks; such investors can simply be replaced by others who are willing to take on the risk.
Some translation companies and translation-tool consumers use crowdsourcing as a means of drastically cutting costs instead of hiring professional translators. This practice has been systematically denounced by IAPTI and other translator organizations.
The number of ideas that get funded and the quality of those ideas are major points of controversy surrounding crowdsourcing.
Proponents argue that crowdsourcing is beneficial because it allows the formation of startups with niche ideas that would not attract venture capital or angel funding, which are often the primary sources of startup investment. Many ideas are scrapped in their infancy due to insufficient support and lack of capital, but crowdsourcing allows these ideas to be started if an entrepreneur can find a community to take interest in the project.
Crowdsourcing allows those who would benefit from a project to fund and become a part of it, which is one way for small niche ideas to get started. However, as the number of projects grows, the number of failures also increases. Crowdsourcing assists the development of niche and high-risk projects due to a perceived need from a select few who seek the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower returns, and lower levels of success.
Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed minimum wage. In practice, workers using Amazon Mechanical Turk generally earn less than minimum wage. In 2009, it was reported that United States Turk users earned an average of $2.30 per hour for tasks, while users in India earned an average of $1.58 per hour, which is below minimum wage in the United States (but not in India). In 2018, a survey of 2,676 Amazon Mechanical Turk workers doing 3.8 million tasks found that the median hourly wage was approximately $2 per hour, and only 4% of workers earned more than the federal minimum wage of $7.25 per hour. Some researchers who have considered using Mechanical Turk to get participants for research studies have argued that the wage conditions might be unethical. However, according to other research, workers on Amazon Mechanical Turk do not feel they are exploited and are ready to participate in crowdsourcing activities in the future. A more recent study using stratified random sampling to access a representative sample of Mechanical Turk workers found that the U.S. MTurk population is financially similar to the general population. Workers tend to participate in tasks as a form of paid leisure and to supplement their primary income, and only 7% view it as a full-time job. Overall, workers rated MTurk as less stressful than other jobs. Workers also earn more than previously reported, about $6.50 per hour. They see MTurk as part of the solution to their financial situation and report rare upsetting experiences. They also perceive requesters on MTurk as fairer and more honest than employers outside of the platform.
When Facebook began its localization program in 2008, it received criticism for using free labor in crowdsourcing the translation of site guidelines.
Typically, no written contracts, nondisclosure agreements, or employee agreements are made with crowdworkers. For users of the Amazon Mechanical Turk, this means that employers decide whether users' work is acceptable and reserve the right to withhold pay if it does not meet their standards. Critics say that crowdsourcing arrangements exploit individuals in the crowd, and a call has been made for crowds to organize for their labor rights.
Collaboration between crowd members can also be difficult or even discouraged, especially in the context of competitive crowd sourcing. Crowdsourcing site InnoCentive allows organizations to solicit solutions to scientific and technological problems; only 10.6% of respondents reported working in a team on their submission. Amazon Mechanical Turk workers collaborated with academics to create a platform, WeAreDynamo.org, that allows them to organize and create campaigns to better their work situation, but the site is no longer running. Another platform run by Amazon Mechanical Turk workers and academics, Turkopticon, continues to operate and provides worker reviews on Amazon Mechanical Turk employers.
Besides insufficient compensation and other labor-related disputes, there have also been concerns regarding privacy violations, the hiring of vulnerable groups, breaches of anonymity, psychological damage, the encouragement of addictive behaviors, and more. Many, but not all, of the issues related to crowdworkers overlap with concerns related to content moderators.
Schenk, Eric; Guittard, Claude (1 January 2009). Crowdsourcing What can be Outsourced to the Crowd and Why. Center for Direct Scientific Communication. Retrieved 1 October 2018 – via HAL. https://hal.inria.fr/halshs-00439256v1
Hirth, Matthias; Hoßfeld, Tobias; Tran-Gia, Phuoc (2011). "Anatomy of a Crowdsourcing Platform – Using the Example of Microworkers.com" (PDF). 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing. pp. 322–329. doi:10.1109/IMIS.2011.89. ISBN 978-1-61284-733-7. S2CID 12955095. Archived from the original (PDF) on 22 November 2015. Retrieved 5 September 2015.
Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012), "Towards an Integrated Crowdsourcing Definition" (PDF), Journal of Information Science, 38 (2): 189–200, doi:10.1177/0165551512437638, hdl:10251/56904, S2CID 18535678, archived from the original (PDF) on 19 August 2019, retrieved 16 March 2012 https://web.archive.org/web/20190819041024/http://www.crowdsourcing-blog.org/wp-content/uploads/2012/02/Towards-an-integrated-crowdsourcing-definition-Estell%C3%A9s-Gonz%C3%A1lez.pdf
Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts; London, England: The MIT Press.
Brabham, D. C. (2008). "Crowdsourcing as a Model for Problem Solving an Introduction and Cases". Convergence: The International Journal of Research into New Media Technologies. 14 (1): 75–90. CiteSeerX 10.1.1.175.1623. doi:10.1177/1354856507084420. S2CID 145310730.
Prpić, J., & Shukla, P. (2016). Crowd Science: Measurements, Models, and Methods. In Proceedings of the 49th Annual Hawaii International Conference on System Sciences, Kauai, Hawaii: IEEE Computer Society. arXiv:1702.04221
Buettner, Ricardo (2015). A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective. 48th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE. pp. 4609–4618. doi:10.13140/2.1.2061.1845. ISBN 978-1-4799-7367-5.
Prpić, John; Taeihagh, Araz; Melton, James (September 2015). "The Fundamentals of Policy Crowdsourcing". Policy & Internet. 7 (3): 340–361. arXiv:1802.04143. doi:10.1002/poi3.102. S2CID 3626608.
Afuah, A.; Tucci, C. L. (2012). "Crowdsourcing as a Solution to Distant Search" (PDF). Academy of Management Review. 37 (3): 355–375. doi:10.5465/amr.2010.0146. https://infoscience.epfl.ch/record/180049/files/afuah_tucci_AMR_2012_FINAL.pdf
de Vreede, T., Nguyen, C., de Vreede, G. J., Boughzala, I., Oh, O., & Reiter-Palmon, R. (2013). A Theoretical Model of User Engagement in Crowdsourcing. In Collaboration and Technology (pp. 94–109). Springer Berlin Heidelberg
Sarin, Supheakmungkol; Pipatsrisawat, Knot; Pham, Khiêm; Batra, Anurag; Valente, Luis (2019). "Crowdsource by Google: A Platform for Collecting Inclusive and Representative Machine Learning Data" (PDF). AAAI Hcomp 2019. https://www.humancomputation.com/2019/assets/papers/143.pdf
Liu, Wei; Moultrie, James; Ye, Songhe (4 May 2019). "The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product". The Design Journal. 22 (3): 299–324. doi:10.1080/14606925.2019.1592324. S2CID 145931864. https://www.repository.cam.ac.uk/handle/1810/341960
Schlagwein, Daniel; Bjørn-Andersen, Niels (2014), "Organizational Learning with Crowdsourcing: The Revelatory Case of Lego" (PDF), Journal of the Association for Information Systems, 15 (11): 754–778, doi:10.17705/1jais.00380, S2CID 14811856 http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1693&context=jais
Taeihagh, Araz (19 June 2017). "Crowdsourcing, Sharing Economies, and Development". Journal of Developing Societies. 33 (2): 0169796X1771007. arXiv:1707.06603. doi:10.1177/0169796x17710072. S2CID 32008949.
Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired. https://www.wired.com/wired/archive/14.06/crowds.html
"crowdsourcing (noun)". Oxford English Dictionary. 2023. Retrieved 3 January 2024. https://www.oed.com/dictionary/crowdsourcing_n?tab=factsheet#288590721
"crowdsourcing (noun)". Merriam-Webster. 2024. Retrieved 3 January 2024. https://www.merriam-webster.com/dictionary/crowdsourcing
Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, CiteSeerX 10.1.1.175.1623, doi:10.1177/1354856507084420, S2CID 145310730, archived from the original (PDF) on 2 August 2012 https://web.archive.org/web/20120802162119/http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf
Guth, Kristen L.; Brabham, Daren C. (4 August 2017). "Finding the diamond in the rough: Exploring communication and platform in crowdsourcing performance". Communication Monographs. 84 (4): 510–533. doi:10.1080/03637751.2017.1359748. S2CID 54045924.
Wei, Zhudeng; Fang, Xiuqi; Yin, Jun (October 2018). "Comparison of climatic impacts transmission from temperature to grain harvests and economies between the Han (206 BC–AD 220) and Tang (AD 618–907) dynasties". The Holocene. 28 (10): 1606. Bibcode:2018Holoc..28.1598W. doi:10.1177/0959683618782592. S2CID 134577720.
O'Connor, J. J.; Robertson, E. F. (February 1997). "Longitude and the Académie Royale". University of St. Andrews. Retrieved 20 January 2024. https://mathshistory.st-andrews.ac.uk/HistTopics/Longitude1
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
Cattani, Gino; Ferriani, Simone; Lanza, Andrea (December 2017). "Deconstructing the Outsider Puzzle: The Legitimation Journey of Novelty". Organization Science. 28 (6): 965–992. doi:10.1287/orsc.2017.1161. ISSN 1047-7039. https://pubsonline.informs.org/doi/10.1287/orsc.2017.1161
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
Hern, Chester G. (2002). Tracks in the Sea, p. 123 & 246. McGraw Hill. ISBN 0-07-136826-4.
"Smithsonian Crowdsourcing Since 1849". Smithsonian Institution Archives. 14 April 2011. Retrieved 24 August 2018. https://siarchives.si.edu/blog/smithsonian-crowdsourcing-1849
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
Clark, Catherine E. (25 April 1970). "'C'était Paris en 1970'". Études Photographiques (31). Retrieved 2 July 2015. http://etudesphotographiques.revues.org/3407
Axelrod R. (1980), "Effective choice in the Prisoner's Dilemma", Journal of Conflict Resolution, 24 (1): 3–25, doi:10.1177/002200278002400101, S2CID 143112198
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
"SETI@home". University of California. Retrieved 20 January 2024. https://setiathome.berkeley.edu
Brabham, Daren C.; Ribisl, Kurt M.; Kirchner, Thomas R.; Bernhardt, Jay M. (1 February 2014). "Crowdsourcing Applications for Public Health". American Journal of Preventive Medicine. 46 (2): 179–187. doi:10.1016/j.amepre.2013.10.016. PMID 24439353. S2CID 205436420.
"A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
"UNV Online Volunteering Service | History". Onlinevolunteering.org. Archived from the original on 2 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150702145948/https://www.onlinevolunteering.org/en/org/about/history.html
"Wired 14.06: The Rise of Crowdsourcing". Archive.wired.com. 4 January 2009. Retrieved 2 July 2015. http://archive.wired.com/wired/archive/14.06/crowds.html
Lih, Andrew (2009). The Wikipedia revolution: how a bunch of nobodies created the world's greatest encyclopedia (1st ed.). New York: Hyperion. ISBN 978-1401303716.
Lakhani KR, Garvin DA, Lonstein E (January 2010). "TopCoder (A): Developing Software through Crowdsourcing". Harvard Business School Case: 610–032. https://www.hbs.edu/faculty/Pages/item.aspx?num=38356
Phadnisi, Shilpa (21 October 2016). "Appirio's TopCoder too is a big catch for Wipro". The Times of India. Retrieved 30 April 2018. https://timesofindia.indiatimes.com/deals/-ma/Appirios-TopCoder-too-is-a-big-catch-for-Wipro/articleshow/54970568.cms
Lardinois, F. (9 August 2014). "For The Love Of Open Mapping Data". Yahoo. Retrieved 20 January 2024. https://techcrunch.com/2014/08/09/for-the-love-of-open-mapping-data
Nagaraj, Abhishek; Piezunka, Henning (September 2024). "The Divergent Effect of Competition on Platforms: Deterring Recruits, Motivating Converts". Strategy Science. 9 (3): 277–296. doi:10.1287/stsc.2022.0125. ISSN 2333-2050. https://pubsonline.informs.org/doi/10.1287/stsc.2022.0125
"Crowdsourcing Back-Up Timeline Early Stories". Archived from the original on 29 November 2014.[better source needed] https://web.archive.org/web/20141129054631/http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/
"Crowdsourcing Back-Up Timeline Early Stories". Archived from the original on 29 November 2014.[better source needed] https://web.archive.org/web/20141129054631/http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/
"Amazon Mechanical Turk". www.mturk.com. Retrieved 25 November 2022. https://www.mturk.com/worker/help
Ohanian, A. (5 December 2006). "reddit on June23-05". Flickr. Retrieved 20 January 2024. https://www.flickr.com/photos/33809408@N00/315068778/in/photostream
"Waze". Waze Mobile. 2009. Retrieved 20 January 2024. https://www.waze.com/about
Piezunka, Henning; Dahlander, Linus (June 2015). "Distant Search, Narrow Attention: How Crowding Alters Organizations' Filtering of Suggestions in Crowdsourcing". Academy of Management Journal. 58 (3): 856–880. doi:10.5465/amj.2012.0458. ISSN 0001-4273. https://journals.aom.org/doi/10.5465/amj.2012.0458
Sengupta, S. (13 August 2013). "Potent Memories From a Divided India". New York Times. Retrieved 20 January 2024. https://www.nytimes.com/2013/08/14/arts/potent-memories-from-a-divided-india.html?_r=0
Garrigos-Simon, Fernando J.; Gil-Pechuán, Ignacio; Estelles-Miguel, Sofia (2015). Advances in Crowdsourcing. Springer. ISBN 9783319183411.
"Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon". New Advent. Retrieved 25 February 2012. http://www.newadvent.org/cathen/10552a.htm
"It Was All About Alkali". Chemistry Chronicles. Retrieved 25 February 2012. http://pubs.acs.org/subscribe/archive/tcaw/11/i01/html/01chemchron.html
"Nicolas Appert". John Blamire. Retrieved 25 February 2012. http://www.brooklyn.cuny.edu/bc/ahp/MBG/MBG4/Appert.html
"9 Examples of Crowdsourcing, Before 'Crowdsourcing' Existed". MemeBurn. 15 September 2011. Retrieved 25 February 2012. http://memeburn.com/2011/09/9-examples-of-crowdsourcing-before-%E2%80%98crowdsourcing%E2%80%99-existed/
Pande, Shamni (25 May 2013). "The People Know Best". Business Today. India: Living Media India Limited. http://businesstoday.intoday.in/story/crowdsourcing-is-the-new-buzzword-in-communications/1/195160.html/
Noveck, Beth Simone (2009), Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, Brookings Institution Press
Sarasua, Cristina; Simperl, Elena; Noy, Natalya F. (2012), "Crowdsourcing Ontology Alignment with Microtasks" (PDF), Institute AIFB. Karlsruhe Institute of Technology: 2, archived from the original (PDF) on 5 March 2016, retrieved 18 September 2021 https://web.archive.org/web/20160305071957/http://web.stanford.edu/~natalya/papers/iswc2012_crowdmap.pdf
Hollow, Matthew (20 April 2013). "Crowdfunding and Civic Society in Europe: A Profitable Partnership?". Open Citizenship. Retrieved 29 April 2013. https://www.academia.edu/3415172
Federal Transit Administration Public Transportation Participation Pilot Program, U.S. Department of Transportation, archived from the original on 7 January 2009 https://web.archive.org/web/20090107140521/http://www.fta.dot.gov./planning/programs/planning_environment_8711.html
Peer-to-Patent Community Patent Review Project, Peer-to-Patent Community Patent Review Project http://www.peertopatent.org/
Callison-Burch, C.; Dredze, M. (2010), "Creating Speech and Language Data With Amazon's Mechanical Turk" (PDF), Human Language Technologies Conference: 1–12, archived from the original (PDF) on 2 August 2012, retrieved 28 February 2012 https://web.archive.org/web/20120802162113/http://www.aclweb.org/anthology-new/W/W10/W10-0701.pdf
McGraw, I.; Seneff, S. (2011), "Growing a spoken language interface on Amazon Mechanical Turk", Interspeech 2011 (PDF), pp. 3057–3060, doi:10.21437/Interspeech.2011-765 http://people.csail.mit.edu/jrg/2011/McGraw_Interspeech11.pdf
Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user studies with Mechanical Turk" (PDF), Chi 2008 http://www-users.cs.umn.edu/~echi/papers/2008-CHI2008/2008-02-mech-turk-online-experiments-chi1049-kittur.pdf
Litman, Leib; Robinson, Jonathan (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. ISBN 978-1506391137.
Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on Amazon's Mechanical Turk", Behavior Research Methods, SSRN 1691163
Koblin, A. (2009). "The sheep market". Proceedings of the seventh ACM conference on Creativity and cognition. pp. 451–452. doi:10.1145/1640233.1640348. ISBN 9781605588650. S2CID 20609292.
"explodingdog 2015". Explodingdog.com. Retrieved 2 July 2015. http://www.explodingdog.com/
DeVun, Leah (19 November 2009). "Looking at how crowds produce and present art". Wired News. Archived from the original on 24 October 2012. Retrieved 26 February 2012. https://web.archive.org/web/20121024130503/http://www.wired.com/techbiz/media/news/2007/07/crowd_captain?currentPage=all
Linver, D. (2010), Crowdsourcing and the Evolving Relationship between Art and Artist, archived from the original on 14 July 2014, retrieved 28 February 2012 https://web.archive.org/web/20140714163540/http://www.crowdsourcing.org/document/crowdsourcing-and-the-evolving-relationship-between-artist-and-audience/5515
"Why". INRIX.com. 13 September 2014. Archived from the original on 12 October 2014. Retrieved 2 July 2015. https://web.archive.org/web/20141012000923/http://www.inrix.com/companyoverview.asp
Wang, Cheng; Han, Larry; Stein, Gabriella; Day, Suzanne; Bien-Gund, Cedric; Mathews, Allison; Ong, Jason J.; Zhao, Pei-Zhen; Wei, Shu-Fang; Walker, Jennifer; Chou, Roger; Lee, Amy; Chen, Angela; Bayus, Barry; Tucker, Joseph D. (20 January 2020). "Crowdsourcing in health and medical research: a systematic review". Infectious Diseases of Poverty. 9 (1): 8. doi:10.1186/s40249-020-0622-9. ISSN 2049-9957. PMC 6971908. PMID 31959234. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6971908
Hildebrand, Mikaela; Ahumada, Claudia; Watson, Sharon (January 2013). "CrowdOutAIDS: crowdsourcing youth perspectives for action". Reproductive Health Matters. 21 (41): 57–68. doi:10.1016/S0968-8080(13)41687-7. ISSN 0968-8080. PMID 23684188. S2CID 31888826. https://www.tandfonline.com/doi/full/10.1016/S0968-8080%2813%2941687-7
Feng, Steve; Woo, Min-jae; Kim, Hannah; Kim, Eunso; Ki, Sojung; Shao, Lei; Ozcan, Aydogan (11 March 2016). Levitz, David; Ozcan, Aydogan; Erickson, David (eds.). "A game-based crowdsourcing platform for rapidly training middle and high school students to perform biomedical image analysis". Optics and Biophotonics in Low-Resource Settings II. 9699. SPIE: 92–100. Bibcode:2016SPIE.9699E..0TF. doi:10.1117/12.2212310. S2CID 124343732. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/9699/96990T/A-game-based-crowdsourcing-platform-for-rapidly-training-middle-and/10.1117/12.2212310.full
Lee, Young Ji; Arida, Janet A.; Donovan, Heidi S. (November 2017). "The application of crowdsourcing approaches to cancer research: a systematic review". Cancer Medicine. 6 (11): 2595–2605. doi:10.1002/cam4.1165. ISSN 2045-7634. PMC 5673951. PMID 28960834. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5673951
Vergano, Dan (30 August 2014). "1833 Meteor Storm Started Citizen Science". National Geographic. StarStruck. Archived from the original on 16 September 2014. Retrieved 18 September 2014. https://web.archive.org/web/20140916020609/http://newswatch.nationalgeographic.com/2014/08/30/1833-meteor-storm-started-citizen-science/
Littmann, Mark; Suomela, Todd (June 2014). "Crowdsourcing, the great meteor storm of 1833, and the founding of meteor science". Endeavour. 38 (2): 130–138. doi:10.1016/j.endeavour.2014.03.002. PMID 24917173.
"Gateway to Astronaut Photography of Earth". NASA. https://eol.jsc.nasa.gov/
McLaughlin, Elliot. "Image Overload: Help us sort it all out, NASA requests". CNN. Retrieved 18 September 2014. http://www.cnn.com/2014/08/17/tech/nasa-earth-images-help-needed/
Liu, Huiying; Xie, Qian Wen; Lou, Vivian W. Q. (1 April 2019). "Everyday social interactions and intra-individual variability in affect: A systematic review and meta-analysis of ecological momentary assessment studies". Motivation and Emotion. 43 (2): 339–353. doi:10.1007/s11031-018-9735-x. S2CID 254827087.
Luong, Raymond; Lomanowska, Anna M. (2021). "Evaluating Reddit as a Crowdsourcing Platform for Psychology Research Projects". Teaching of Psychology. 49 (4): 329–337. doi:10.1177/00986283211020739. S2CID 236414676. https://doi.org/10.1177%2F00986283211020739
Brown, Joshua K.; Hohman, Zachary P. (2022). "Extreme party animals: Effects of political identification and ideological extremity". Journal of Applied Social Psychology. 52 (5): 351–362. doi:10.1111/jasp.12863. S2CID 247077069.
Vaterlaus, J. Mitchell; Patten, Emily V.; Spruance, Lori A. (26 May 2022). "#Alonetogether:: An Exploratory Study of Social Media Use at the Beginning of the COVID-19 Pandemic". The Journal of Social Media in Society. 11 (1): 27–45. https://thejsms.org/index.php/JSMS/article/view/887
Després, Jacques; Hadjsaid, Nouredine; Criqui, Patrick; Noirot, Isabelle (1 February 2015). "Modelling the impacts of variable renewable sources on the power sector: reconsidering the typology of energy modelling tools". Energy. 80: 486–495. Bibcode:2015Ene....80..486D. doi:10.1016/j.energy.2014.12.005. /wiki/Bibcode_(identifier)
"OpenEI — Energy Information, Data, and other Resources". OpenEI. Retrieved 26 September 2016. http://en.openei.org
Garvin, Peggy (12 December 2009). "New Gateway: Open Energy Info". SLA Government Information Division. Dayton, Ohio, USA. Retrieved 26 September 2016.[permanent dead link] http://govinfo.sla.org/2009/12/12/new-gateway-open-energy-info/
Brodt-Giles, Debbie (2012). WREF 2012: OpenEI — an open energy data and information exchange for international audiences (PDF). Golden, Colorado, USA: National Renewable Energy Laboratory (NREL). Archived from the original (PDF) on 9 October 2016. Retrieved 24 September 2016. https://web.archive.org/web/20161009172347/https://ases.conference-services.net/resources/252/2859/pdf/SOLAR2012_0677_full%20paper.pdf
Davis, Chris; Chmieliauskas, Alfredas; Dijkema, Gerard; Nikolic, Igor. "Enipedia". Delft, The Netherlands: Energy and Industry group, Faculty of Technology, Policy and Management, TU Delft. Archived from the original on 10 June 2014. Retrieved 7 October 2016. https://archive.today/20140610231532/http://enipedia.tudelft.nl/
Davis, Chris (2012). Making sense of open data: from raw data to actionable insight — PhD thesis. Delft, The Netherlands: Delft University of Technology. Retrieved 2 October 2018.Chapter 9 discusses in depth the initial development of Enipedia. https://www.researchgate.net/publication/255823727
"What Is the Four-Generation Program?". The Church of Jesus Christ of Latter-day Saints. Retrieved 30 January 2012. https://www.churchofjesuschrist.org/study/ensign/1972/03/what-is-the-four-generation-program?lang=eng
King, Turi E.; Jobling, Mark A. (2009). "What's in a name? Y chromosomes, surnames and the genetic genealogy revolution". Trends in Genetics. 25 (8): 351–60. doi:10.1016/j.tig.2009.06.003. hdl:2381/8106. PMID 19665817. The International Society of Genetic Genealogy advocates the use of genetics as a tool for genealogical research, and provides a support network for genetic genealogists. It hosts the ISOGG Y-haplogroup tree, which has the virtue of being regularly updated. https://figshare.com/articles/journal_contribution/10096019
Mendex, etc. al., Fernando (28 February 2013). "An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree". The American Journal of Human Genetics. 92 (3): 454–459. doi:10.1016/j.ajhg.2013.02.002. PMC 3591855. PMID 23453668. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3591855
Wells, Spencer (2013). "The Genographic Project and the Rise of Citizen Science". Southern California Genealogical Society (SCGS). Archived from the original on 10 July 2013. Retrieved 10 July 2013. https://web.archive.org/web/20130710014353/http://www.scgsgenealogy.com/Jamboree/2013/DNAday.htm
"History of the Christmas Bird Count | Audubon". Birds.audubon.org. 22 January 2015. Retrieved 2 July 2015. http://birds.audubon.org/history-christmas-bird-count
"Thank you!". Audubon. 5 October 2017. Archived from the original on 24 August 2014. https://web.archive.org/web/20140824051327/http://www.audubon.org/thank-you-0
"Home – ISCRAM2015 – University of Agder" (PDF). iscram2015.uia.no. Archived from the original (PDF) on 17 October 2016. Retrieved 14 October 2016. https://web.archive.org/web/20161017122308/http://iscram2015.uia.no/wp-content/uploads/2015/05/8-9.pdf
Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change and Peer-Learning". International Journal of Communication. 9: 3523–3543. http://crowdsourcinginjournalism.com/2015/10/28/motivation-factors-in-crowdsourced-journalism-social-impact-social-change-and-peer-learning/
Aitamurto, Tanja (2016). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism: Ruptured Ideals and Blended Responsibility". Digital Journalism. 4 (2): 280–297. doi:10.1080/21670811.2015.1034807. S2CID 156243124. http://crowdsourcinginjournalism.com/2015/07/04/crowdsourcing-as-a-knowledge-search-method-in-digital-journalism-ruptured-ideals-and-blended-responsibility/
Aitamurto, Tanja (2013). "Balancing between open and closed: co-creation in magazine journalism". Digital Journalism. 1 (2): 229–251. doi:10.1080/21670811.2012.750150. S2CID 62882093. /wiki/Doi_(identifier)
"Algorithm Watch". Algorithm Watch. 2022. Retrieved 18 May 2022. https://algorithmwatch.org/en/
"Overview in English". DataSkop. 2022. Retrieved 18 May 2022. https://dataskop.net/overview-in-english/
"FAQs". Mozilla Rally. Archived from the original on 14 March 2023. Retrieved 14 March 2023. Mozilla Rally is currently available to US residents who are age 19 and older https://web.archive.org/web/20230314142545/https://rally.mozilla.org/how-rally-works/faqs/
"It's your data. Use it for change". Mozilla Rally. Retrieved 14 March 2023. https://rally.mozilla.org/
Angus, Daniel (16 February 2022). "A data economy: the case for doing and knowing more about algorithms". Crikey. Retrieved 24 March 2022. https://www.crikey.com.au/2022/02/16/data-economy-algorithms/
Burgess, Jean; Angus, Daniel; Carah, Nicholas; Andrejevic, Mark; Hawker, Kiah; Lewis, Kelly; Obeid, Abdul; Smith, Adam; Tan, Jane; Fordyce, Robbie; Trott, Verity (8 November 2021). "Critical simulation as hybrid digital method for exploring the data operations and vernacular cultures of visual social media platforms". SocArXiv. doi:10.31235/osf.io/2cwsu. S2CID 243837581. https://eprints.qut.edu.au/226345/
The Markup (2022). "The Citizen Browser Project—Auditing the Algorithms of Disinformation". The Markup. Retrieved 18 May 2022. https://themarkup.org/citizen-browser
Pretus, Clara; Gil-Buitrago, Helena; Cisma, Irene; Hendricks, Rosamunde C.; Lizarazo-Villarreal, Daniela (16 July 2024). "Scaling crowdsourcing interventions to combat partisan misinformation". Advances.in/Psychology. 2: e85592. doi:10.56296/aip00018. ISSN 2976-937X. https://advances.in/psychology/10.56296/aip00018/
Allen, Jennifer; Arechar, Antonio A.; Pennycook, Gordon; Rand, David G. (3 September 2021). "Scaling up fact-checking using the wisdom of crowds". Science Advances. 7 (36): eabf4393. Bibcode:2021SciA....7.4393A. doi:10.1126/sciadv.abf4393. ISSN 2375-2548. PMC 8442902. PMID 34516925. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8442902
Smith, Graham; Richards, Robert C.; Gastil, John (12 May 2015). "The Potential ofParticipediaas a Crowdsourcing Tool for Comparative Analysis of Democratic Innovations" (PDF). Policy & Internet. 7 (2): 243–262. doi:10.1002/poi3.93. http://westminsterresearch.wmin.ac.uk/15138/1/Participedia%20PSA%20Version.pdf
Moon, M. Jae (2018). "Evolution of co-production in the information age: crowdsourcing as a model of web-based co-production in Korea". Policy and Society. 37 (3): 294–309. doi:10.1080/14494035.2017.1376475. S2CID 158440300. https://doi.org/10.1080%2F14494035.2017.1376475
Taeihagh, Araz (8 November 2017). "Crowdsourcing: a new tool for policy-making?". Policy Sciences. 50 (4): 629–647. arXiv:1802.03113. doi:10.1007/s11077-017-9303-3. S2CID 27696037. /wiki/ArXiv_(identifier)
Diamond, Larry; Whittington, Zak (2009). "Social Media". In Welzel, Christian; Haerpfer, Christian W.; Bernhagen, Patrick; Inglehart, Ronald F. (eds.). Democratization (2 ed.). Oxford: Oxford University Press (published 2018). p. 256. ISBN 9780198732280. Retrieved 4 March 2021. Another way that social media can contribute to democratization is by 'crowdsourcing' information. This elicits the knowledge and wisdom of the 'crowd' [...]. 9780198732280
Aitamurto, Tanja (2012). Crowdsourcing for Democracy: New Era In Policy–Making. Committee for the Future, Parliament of Finland. pp. 10–30. ISBN 978-951-53-3459-6. 978-951-53-3459-6
Prpić, J.; Taeihagh, A.; Melton, J. (2014). "Crowdsourcing the Policy Cycle. Collective Intelligence 2014, MIT Center for Collective Intelligence" (PDF). Humancomputation.com. Archived from the original (PDF) on 24 June 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150624044131/http://humancomputation.com/ci2014/papers/Active%20Papers%5CPaper%2040.pdf
Prpić, J.; Taeihagh, A.; Melton, J. (2014). "A Framework for Policy Crowdsourcing. Oxford Internet Institute, University of Oxford – IPP 2014 – Crowdsourcing for Politics and Policy" (PDF). Ipp.oxii.ox.ac.uk. Retrieved 2 October 2018. https://www.researchgate.net/publication/262523774
Prpić, J.; Taeihagh, A.; Melton, J. (2014). "Experiments on Crowdsourcing Policy Assessment. Oxford Internet Institute, University of Oxford – IPP 2014 – Crowdsourcing for Politics and Policy" (PDF). Ipp.oii.ox.ac.uk. Archived from the original (PDF) on 24 June 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150624041608/http://ipp.oii.ox.ac.uk/sites/ipp/files/documents/IPP2014_Taeihagh.pdf
Thapa, B.; Niehaves, B.; Seidel, C.; Plattfaut, R. (2015). "Citizen involvement in public sector innovation: Government and citizen perspectives". Information Polity. 20 (1): 3–17. doi:10.3233/IP-150351. http://content.iospress.com/articles/information-polity/ip351
Aitamurto and Landemore (4 February 2015). "Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law reform in Finland". Journal of Social Media for Organizations (1): 1–19. http://thefinnishexperiment.com/2015/02/04/design-for-crowdsourced-policy-making/
Aitamurto, Tanja; Landemore, Hélène; Saldivar Galli, Jorge (2016). "Unmasking the Crowd: Participants' Motivation Factors, Profile and Expectations for Participation in Crowdsourced Policymaking". Information, Communication & Society. 20 (8): 1239–1260. doi:10.1080/1369118x.2016.1228993. S2CID 151989757. http://thefinnishexperiment.com/2016/09/21/motivation-factors-for-participation-in-crowdsourced-policymaking/
Aitamurto, Tanja; Chen, Kaiping; Cherif, Ahmed; Galli, Jorge Saldivar; Santana, Luis (2016). "Civic CrowdAnalytics: Making sense of crowdsourced civic input with big data tools". Proceedings of the 20th International Academic Mindtrek Conference. pp. 86–94. doi:10.1145/2994310.2994366. ISBN 978-1-4503-4367-1. S2CID 16855773 – via ACM Digital Archive. 978-1-4503-4367-1
Aitamurto, Tanja (31 January 2015). Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6. 978-951-53-3459-6
Aitamurto, Tanja (31 January 2015). Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6. 978-951-53-3459-6
"Home". challenge.gov. https://challenge.gov/
Aitamurto, Tanja (31 January 2015). Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6. 978-951-53-3459-6
Nussbaum, Stan. (2003). Proverbial perspectives on pluralism. Connections: the journal of the WEA Missions Committee October, pp. 30, 31.
"Oromo dictionary project". OromoDictionary.com. Retrieved 3 February 2014. http://oromodictionary.com/index.php
Albright, Eric; Hatton, John (2007). Chapter 10. WeSay, a Tool for Engaging Native Speakers in Dictionary Building. Natl Foreign Lg Resource Ctr. hdl:10125/1368. ISBN 978-0-8248-3309-1. 978-0-8248-3309-1
"Developing ASL vocabulary for science and math". Washington.edu. 7 December 2012. Retrieved 3 February 2014. http://www.washington.edu/news/2012/12/07/crowdsourcing-sit-compiles-new-sign-language-for-math-and-science/
Keuleers; et al. (February 2015). "Word knowledge in the crowd: Measuring vocabulary size and word prevalence in a massive online experiment". Quarterly Journal of Experimental Psychology. 68 (8): 1665–1692. doi:10.1080/17470218.2015.1022560. PMID 25715025. S2CID 4894686. https://doi.org/10.1080%2F17470218.2015.1022560
Bill, Jeremiah; Gong, He; Hamilton, Brooke; Hawthorn, Henry; et al. "The extension of (positive) anymore". Google Docs. Retrieved 27 September 2020. https://docs.google.com/presentation/d/1qalqZCbuFG7_HVgQ9vIgK8JOcyWfJZwzB_VPLs4usvU/edit?usp=embed_facebook
Bill, Jeremiah; Gong, He; Hamilton, Brooke; Hawthorn, Henry; et al. "The extension of (positive) anymore". Google Docs. Retrieved 27 September 2020. https://docs.google.com/presentation/d/1qalqZCbuFG7_HVgQ9vIgK8JOcyWfJZwzB_VPLs4usvU/edit?usp=embed_facebook
"Pashto Proverb Collection project". AfghanProverbs.com. Archived from the original on 4 February 2014. Retrieved 3 February 2014. https://web.archive.org/web/20140204002337/http://www.afghanproverbs.com/the_pashto_proverbs_project
"Comparing methods of collecting proverbs" (PDF). gial.edu. Archived from the original (PDF) on 17 December 2014. Retrieved 17 December 2014. https://web.archive.org/web/20141217213801/http://www.gial.edu/images/gialens/vol8-3/Unseth_collecting_proverbs.pdf
Edward Zellem. 2014. Mataluna: 151 Afghan Pashto Proverbs. Tampa, Florida: Culture Direct.
Zhai, Haijun; Lingren, Todd; Deleger, Louise; Li, Qi; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre (2013). "Web 2.0-based crowdsourcing for high-quality gold standard development in clinical Natural Language Processing". Journal of Medical Internet Research. 15 (4): e73. doi:10.2196/jmir.2426. PMC 3636329. PMID 23548263. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3636329
Piezunka, Henning; Dahlander, Linus (June 2015). "Distant Search, Narrow Attention: How Crowding Alters Organizations' Filtering of Suggestions in Crowdsourcing". Academy of Management Journal. 58 (3): 856–880. doi:10.5465/amj.2012.0458. ISSN 0001-4273. https://journals.aom.org/doi/10.5465/amj.2012.0458
Martin, Fred; Resnick, Mitchel (1993), "Lego/Logo and Electronic Bricks: Creating a Scienceland for Children", Advanced Educational Technologies for Mathematics and Science, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 61–89, doi:10.1007/978-3-662-02938-1_2, ISBN 978-3-642-08152-1 978-3-642-08152-1
Nishikawa, Hidehiko; Schreier, Martin; Fuchs, Christoph; Ogawa, Susumu (August 2017). "The Value of Marketing Crowdsourced New Products as Such: Evidence from Two Randomized Field Experiments". Journal of Marketing Research. 54 (4): 525–539. doi:10.1509/jmr.15.0244. ISSN 0022-2437. http://journals.sagepub.com/doi/10.1509/jmr.15.0244
Piezunka, Henning; Dahlander, Linus (June 2015). "Distant Search, Narrow Attention: How Crowding Alters Organizations' Filtering of Suggestions in Crowdsourcing". Academy of Management Journal. 58 (3): 856–880. doi:10.5465/amj.2012.0458. ISSN 0001-4273. https://journals.aom.org/doi/10.5465/amj.2012.0458
Reinhold, Stephan; Dolnicar, Sara (December 2017), "How Airbnb Creates Value", Peer-to-Peer Accommodation Networks, Goodfellow Publishers, doi:10.23912/9781911396512-3602, ISBN 9781911396512 9781911396512
"Prime Panels by CloudResearch | Online Research Panel Recruitment". CloudResearch. Retrieved 12 January 2023. https://www.cloudresearch.com/products/prime-panels/
Nunan, Daniel; Birks, David F.; Malhotra, Naresh K. (2020). Marketing research : applied insight (6th ed.). Harlow, United Kingdom: Pearson. ISBN 978-1-292-30872-2. OCLC 1128061550. 978-1-292-30872-2
Parker, Christopher J.; May, Andrew; Mitchell, Val (November 2013). "The role of VGI and PGI in supporting outdoor activities". Applied Ergonomics. 44 (6): 886–894. doi:10.1016/j.apergo.2012.04.013. PMID 22795180. S2CID 12918341. https://dspace.lboro.ac.uk/2134/10350
Parker, Christopher J.; May, Andrew; Mitchell, Val (15 May 2014). "User-centred design of neogeography: the impact of volunteered geographic information on users' perceptions of online map 'mashups'". Ergonomics. 57 (7): 987–997. doi:10.1080/00140139.2014.909950. PMID 24827070. S2CID 13458260. https://dspace.lboro.ac.uk/2134/23845
Brown, Michael; Sharples, Sarah; Harding, Jenny; Parker, Christopher J. (November 2013). "Usability of Geographic Information: Current challenges and future directions" (PDF). Applied Ergonomics. 44 (6): 855–865. doi:10.1016/j.apergo.2012.10.013. PMID 23177775. S2CID 26412254. Archived from the original (PDF) on 19 July 2018. Retrieved 20 August 2019. https://web.archive.org/web/20180719082903/http://eprints.nottingham.ac.uk/2809/1/Brown_et_al_2013_Usabilty_of_Geographic_Information.pdf
Parker, Christopher J.; May, Andrew; Mitchell, Val (August 2012). "Understanding Design with VGI using an Information Relevance Framework". Transactions in GIS. 16 (4): 545–560. Bibcode:2012TrGIS..16..545P. doi:10.1111/j.1467-9671.2012.01302.x. S2CID 20100267. https://dspace.lboro.ac.uk/2134/10349
Nagaraj, Abhishek; Piezunka, Henning (September 2024). "The Divergent Effect of Competition on Platforms: Deterring Recruits, Motivating Converts". Strategy Science. 9 (3): 277–296. doi:10.1287/stsc.2022.0125. ISSN 2333-2050. https://pubsonline.informs.org/doi/10.1287/stsc.2022.0125
Lardinois, F. (9 August 2014). "For The Love Of Open Mapping Data". Yahoo. Retrieved 20 January 2024. https://techcrunch.com/2014/08/09/for-the-love-of-open-mapping-data
Holley, Rose (March 2010). "Crowdsourcing: How and Why Should Libraries Do It?". D-Lib Magazine. 16 (3/4). doi:10.1045/march2010-holley. Retrieved 21 May 2021. http://www.dlib.org/dlib/march10/holley/03holley.html
Trant, Jennifer (2009). Tagging, Folksonomy and Art Museums: Results of steve.museum's research (PDF). Archives & Museum Informatics. Archived from the original (PDF) on 10 February 2010. Retrieved 21 May 2021. https://web.archive.org/web/20100210192354/http://conference.archimuse.com/files/trantSteveResearchReport2008.pdf
Andro, M. (2018). Digital libraries and crowdsourcing, Wiley / ISTE. ISBN 9781786301611. /wiki/ISBN_(identifier)
Rahman, Mahbubur; Blackwell, Brenna; Banerjee, Nilanjan; Dharmendra, Saraswat (2015), "Smartphone-based hierarchical crowdsourcing for weed identification", Computers and Electronics in Agriculture, 113: 14–23, Bibcode:2015CEAgr.113...14R, doi:10.1016/j.compag.2014.12.012, retrieved 12 August 2015 http://dl.acm.org/citation.cfm?id=2784520
"2015 Cheating Scandal". Bridge Winners. 2015. Retrieved 20 January 2024. https://bridgewinners.com/article/series/2015-cheating-scandal/
Tang, Weiming; Han, Larry; Best, John; Zhang, Ye; Mollan, Katie; Kim, Julie; Liu, Fengying; Hudgens, Michael; Bayus, Barry (1 June 2016). "Crowdsourcing HIV Test Promotion Videos: A Noninferiority Randomized Controlled Trial in China". Clinical Infectious Diseases. 62 (11): 1436–1442. doi:10.1093/cid/ciw171. PMC 4872295. PMID 27129465. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4872295
Zhang, Ye; Kim, Julie A.; Liu, Fengying; Tso, Lai Sze; Tang, Weiming; Wei, Chongyi; Bayus, Barry L.; Tucker, Joseph D. (November 2015). "Creative Contributory Contests to Spur Innovation in Sexual Health: 2 Cases and a Guide for Implementation". Sexually Transmitted Diseases. 42 (11): 625–628. doi:10.1097/OLQ.0000000000000349. PMC 4610177. PMID 26462186. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4610177
Créquit, Perrine (2018). "Mapping of Crowdsourcing in Health: Systematic Review". Journal of Medical Internet Research. 20 (5): e187. doi:10.2196/jmir.9330. PMC 5974463. PMID 29764795. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5974463
Zhang, Ye; Kim, Julie A.; Liu, Fengying; Tso, Lai Sze; Tang, Weiming; Wei, Chongyi; Bayus, Barry L.; Tucker, Joseph D. (November 2015). "Creative Contributory Contests to Spur Innovation in Sexual Health: 2 Cases and a Guide for Implementation". Sexually Transmitted Diseases. 42 (11): 625–628. doi:10.1097/OLQ.0000000000000349. PMC 4610177. PMID 26462186. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4610177
van der Krieke; et al. (2015). "HowNutsAreTheDutch (HoeGekIsNL): A crowdsourcing study of mental symptoms and strengths" (PDF). International Journal of Methods in Psychiatric Research. 25 (2): 123–144. doi:10.1002/mpr.1495. PMC 6877205. PMID 26395198. Archived from the original (PDF) on 2 August 2019. Retrieved 26 December 2018. https://web.archive.org/web/20190802163143/https://pure.rug.nl/ws/files/30435764/2015_Van_der_Krieke_Jeronimus_HowNutsAreTheDutch_A_Crowdsourcing_Study_of_Mental_Symptoms_and_Strengths.pdf
Prpić, J. (2015). Health Care Crowds: Collective Intelligence in Public Health. Collective Intelligence 2015. Center for the Study of Complex Systems, University of Michigan. Papers.ssrn.com. SSRN 2570593. /wiki/SSRN_(identifier)
van der Krieke, L; Blaauw, FJ; Emerencia, AC; Schenk, HM; Slaets, JP; Bos, EH; de Jonge, P; Jeronimus, BF (2016). "Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback (2016)" (PDF). Psychosomatic Medicine. 79 (2): 213–223. doi:10.1097/PSY.0000000000000378. PMID 27551988. S2CID 10955232. https://pure.rug.nl/ws/files/40193705/00006842_201702000_00011.pdf
Guth, Kristen L.; Brabham, Daren C. (4 August 2017). "Finding the diamond in the rough: Exploring communication and platform in crowdsourcing performance". Communication Monographs. 84 (4): 510–533. doi:10.1080/03637751.2017.1359748. S2CID 54045924. /wiki/Doi_(identifier)
Ess, Henk van (2010) "Crowdsourcing: how to find a crowd", ARD ZDF Akademie, Berlin, p. 99 https://www.slideshare.net/searchbistro/harvesting-knowledge-how-to-crowdsource-in-2010
Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM, 54 (4): 86–96, doi:10.1145/1924421.1924442, S2CID 207184672 https://cacm.acm.org/magazines/2011/4/106563-crowdsourcing-systems-on-the-world-wide-web/fulltext
Brabham, Daren C. (2013), Crowdsourcing, MIT Press, p. 45
Blohm, Ivo; Zogaj, Shkodran; Bretschneider, Ulrich; Leimeister, Jan Marco (2018). "How to Manage Crowdsourcing Platforms Effectively" (PDF). California Management Review. 60 (2): 122–149. doi:10.1177/0008125617738255. S2CID 73551209. Archived from the original (PDF) on 20 July 2018. Retrieved 24 August 2020. https://web.archive.org/web/20180720145920/https://www.alexandria.unisg.ch/252464/1/BlohmEtAl_2018_HowToManageCrowdsourcingIntermediaries.pdf
Howe, Jeff (2008), Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business (PDF), The International Achievement Institute, archived from the original (PDF) on 23 September 2015, retrieved 9 April 2012 https://web.archive.org/web/20150923191141/http://www.bizbriefings.com/Samples/IntInst%20---%20Crowdsourcing.PDF
Dahlander, Linus; Jeppesen, Lars Bo; Piezunka, Henning (1 January 2019), Sydow, Jörg; Berends, Hans (eds.), "How Organizations Manage Crowds: Define, Broadcast, Attract, and Select", Managing Inter-organizational Collaborations: Process Views, Research in the Sociology of Organizations, vol. 64, Emerald Publishing Limited, pp. 239–270, doi:10.1108/s0733-558x20190000064016, ISBN 978-1-78756-592-0, retrieved 19 April 2025 978-1-78756-592-0
"Crowdvoting: How Elo Limits Disruption". thevisionlab.com. 25 May 2017. https://thevisionlab.com/crowdsourcing/crowdvoting-elo
Robson, John (24 February 2012). "IEM Demonstrates the Political Wisdom of Crowds". Canoe.ca. Archived from the original on 7 April 2012. Retrieved 31 March 2012. https://web.archive.org/web/20120407121438/http://tippie.uiowa.edu/iem/media/story.cfm?ID=2793
"4 Great Examples of Crowdsourcing through Social Media". digitalagencymarketing.com. 2012. Archived from the original on 1 April 2012. Retrieved 29 March 2012. https://web.archive.org/web/20120401224920/http://www.digitalagencymarketing.com/2012/03/4-great-examples-of-social-crowdsourcing/
Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, CiteSeerX 10.1.1.175.1623, doi:10.1177/1354856507084420, S2CID 145310730, archived from the original (PDF) on 2 August 2012 https://web.archive.org/web/20120802162119/http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf
Goldberg, Ken; Newsom, Gavin (12 June 2014). "Let's amplify California's collective intelligence". Citris-uc.org. Retrieved 14 June 2014. http://citris-uc.org/lets-amplify-californias-collective-intelligence-op-ed-ken-goldberg-gavin-newsom-california-report-card/
Escoffier, N. and B. McKelvey (2014). "Using "Crowd-Wisdom Strategy" to Co-Create Market Value: Proof-of-Concept from the Movie Industry." in International Perspective on Business Innovation and Disruption in the Creative Industries: Film, Video, Photography, P. Wikstrom and R. DeFillippi, eds., UK: Edward Elgar Publishing Ltd, Chap. 11. ISBN 9781783475339 /wiki/ISBN_(identifier)
Block, A. B. (21 April 2010). "How boxoffice trading could flop". The Hollywood Reporter. https://www.hollywoodreporter.com/news/how-boxoffice-trading-could-flop-22886
Chen, A. and Panaligan, R. (2013). "Quantifying movie magic with Google search." Google White Paper, Industry Perspectives+User Insights https://adwords.googleblog.com/2013/06/quantifying-movie-magic-with-google.html
Williams, Jack (17 February 2017). "An Indoor Football Team Has Its Fans Call the Plays". The New York Times. ISSN 0362-4331. Retrieved 7 February 2018. https://www.nytimes.com/2017/02/17/sports/football/indoor-football-league-screaming-eagles.html
Prive, Tanya. "What Is Crowdfunding And How Does It Benefit The Economy". Forbes.com. Retrieved 2 July 2015. https://www.forbes.com/sites/tanyaprive/2012/11/27/what-is-crowdfunding-and-how-does-it-benefit-the-economy/
Choy, Katherine; Schlagwein, Daniel (2016), "Crowdsourcing for a better world: On the relation between IT affordances and donor motivations in charitable crowdfunding", Information Technology & People, 29 (1): 221–247, doi:10.1108/ITP-09-2014-0215, hdl:1959.4/unsworks_38196, S2CID 12352130 https://unsworks.unsw.edu.au/bitstreams/c64b500c-b9a6-4ad8-a955-569cb9325363/download
Barnett, Chance. "Crowdfunding Sites In 2014". Forbes.com. Retrieved 2 July 2015. https://www.forbes.com/sites/chancebarnett/2014/08/29/crowdfunding-sites-in-2014/
Agrawal, Ajay; Catalini, Christian; Goldfarb, Avi (2014). "Some Simple Economics of Crowdfunding" (PDF). Innovation Policy and the Economy. 14. University of Chicago Press: 63–97. doi:10.1086/674021. hdl:1721.1/108043. ISSN 1531-3468. S2CID 16085029. https://www.nber.org/system/files/working_papers/w19133/w19133.pdf
Agrawal, Ajay; Catalini, Christian; Goldfarb, Avi (2014). "Some Simple Economics of Crowdfunding" (PDF). Innovation Policy and the Economy. 14. University of Chicago Press: 63–97. doi:10.1086/674021. hdl:1721.1/108043. ISSN 1531-3468. S2CID 16085029. https://www.nber.org/system/files/working_papers/w19133/w19133.pdf
Agrawal, Ajay; Catalini, Christian; Goldfarb, Avi (2014). "Some Simple Economics of Crowdfunding" (PDF). Innovation Policy and the Economy. 14. University of Chicago Press: 63–97. doi:10.1086/674021. hdl:1721.1/108043. ISSN 1531-3468. S2CID 16085029. https://www.nber.org/system/files/working_papers/w19133/w19133.pdf
Leimeister, J.M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009), "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition", Journal of Management Information Systems, 26 (1): 197–224, doi:10.2753/mis0742-1222260108, S2CID 17485373 http://portal.acm.org/citation.cfm?id=1653890
Ebner, W.; Leimeister, J.; Krcmar, H. (September 2009). "Community Engineering for Innovations: The Ideas Competition as a Method to Nurture a Virtual Community for Innovations". R&D Management. 39 (4): 342–356. doi:10.1111/j.1467-9310.2009.00564.x. Retrieved 20 January 2024. https://www.researchgate.net/publication/227500941
"DARPA Network Challenge". DARPA Network Challenge. Archived from the original on 11 August 2011. Retrieved 28 November 2011. https://web.archive.org/web/20110811233340/https://networkchallenge.darpa.mil/Default.aspx
"Social media web snares 'criminals'". New Scientist. Retrieved 4 April 2012. https://www.newscientist.com/article/dn21666-social-media-web-snares-criminals.html
Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, CiteSeerX 10.1.1.175.1623, doi:10.1177/1354856507084420, S2CID 145310730, archived from the original (PDF) on 2 August 2012 https://web.archive.org/web/20120802162119/http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf
"Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". 20 February 2012. Retrieved 30 March 2012. http://www.fourhourworkweek.com/blog/2012/02/20/beyond-x-prize-the-10-best-crowdsourcing-tools-and-technologies/
Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, CiteSeerX 10.1.1.175.1623, doi:10.1177/1354856507084420, S2CID 145310730, archived from the original (PDF) on 2 August 2012 https://web.archive.org/web/20120802162119/http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf
Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM, 54 (4): 86–96, doi:10.1145/1924421.1924442, S2CID 207184672 https://cacm.acm.org/magazines/2011/4/106563-crowdsourcing-systems-on-the-world-wide-web/fulltext
Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user studies with Mechanical Turk" (PDF), Chi 2008 http://www-users.cs.umn.edu/~echi/papers/2008-CHI2008/2008-02-mech-turk-online-experiments-chi1049-kittur.pdf
Liu, Wei; Moultrie, James; Ye, Songhe (4 May 2019). "The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product". The Design Journal. 22 (3): 299–324. doi:10.1080/14606925.2019.1592324. S2CID 145931864. https://www.repository.cam.ac.uk/handle/1810/341960
Cunard, C. (19 July 2010). "The Movie Research Experience gets audiences involved in filmmaking." The Daily Bruin https://dailybruin.com/2010/07/19/the_movie_research_experience_gets_audiences_involved_in_filmmaking
MacArthur, Kate. "Squadhelp wants your company to crowdsource better names (and avoid Boaty McBoatface)". chicagotribune.com. Retrieved 28 August 2017. http://www.chicagotribune.com/bluesky/originals/ct-squadhelp-startup-names-bsi-20170331-story.html
"Compete To Create Your Dream Home". Co.Exist. FastCoexist.com. 4 June 2013. Retrieved 3 February 2014. http://www.fastcoexist.com/1682162/a-site-that-lets-designers-compete-to-create-your-dream-home
"Designers, clients forge ties on web". Boston Herald. 11 June 2012. Retrieved 3 February 2014. http://bostonherald.com/business/technology/technology_news/2012/06/designers_clients_forge_ties_web
Dolan, Shelagh, "Crowdsourced delivery explained: making same day shipping cheaper through local couriers.", Business Insider, archived from the original on 22 May 2018, retrieved 21 May 2018 https://web.archive.org/web/20180522060126/http://www.businessinsider.com/crowdsourced-delivery-shipping-explained
Murison, Malek (19 April 2018), "LivingPackets uses IoT, crowdshipping to transform deliveries", Internet of Business, retrieved 19 April 2018 https://internetofbusiness.com/livingpackets-iot-international-deliveries/
Biller, David; Sciaudone, Christina (19 June 2018), "Goldman Sachs, Soros Bet on the Uber of Brazilian Trucking", Bloomberg, retrieved 11 March 2019 https://www.bloomberg.com/news/articles/2018-06-19/goldman-sachs-soros-bet-on-the-uber-of-brazilian-trucking
Tyrsina, Radu, "Parcl Uses Trusted Forwarders to Bring you Products that don't Ship to your Country", Technology Personalised, archived from the original on 3 October 2015, retrieved 1 October 2015 https://web.archive.org/web/20151003234051/http://techpp.com/2015/10/01/parcl-buy-products-that-dont-ship-to-your-country/
Geiger D, Rosemann M, Fielt E. (2011) Crowdsourcing information systems: a systems theory perspective. Proceedings of the 22nd Australasian Conference on Information Systems. https://aisel.aisnet.org/acis2011/33/
Powell, D (2015). "A new tool for crowdsourcing". МИР (Модернизация. Инновации. Развитие). 6 (2-2 (22)). ISSN 2079-4665. https://cyberleninka.ru/article/n/a-new-tool-for-crowdsourcing
Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired. https://www.wired.com/wired/archive/14.06/crowds.html
Yang, J.; Adamic, L.; Ackerman, M. (2008), "Crowdsourcing and knowledge sharing: Strategic user behavior on taskcn", Proceedings of the 9th ACM conference on Electronic commerce (PDF), pp. 246–255, doi:10.1145/1386790.1386829, ISBN 9781605581699, S2CID 15553154, archived from the original (PDF) on 29 July 2020, retrieved 28 February 2012 9781605581699
Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM, 54 (4): 86–96, doi:10.1145/1924421.1924442, S2CID 207184672 https://cacm.acm.org/magazines/2011/4/106563-crowdsourcing-systems-on-the-world-wide-web/fulltext
"Mobile Crowdsourcing". Clickworker. Retrieved 10 December 2014. http://www.clickworker.com/en/crowdsourcing-glossar/mobile-crowdsourcing/
Thebault-Spieker, Jacob; Terveen, Loren G.; Hecht, Brent (28 February 2015). "Avoiding the South Side and the Suburbs". Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing. New York, NY, USA: ACM. pp. 265–275. doi:10.1145/2675133.2675278. ISBN 9781450329224. 9781450329224
Chatzimiloudis, Konstantinidis & Laoudias, Zeinalipour-Yazti. Crowdsourcing with smartphones (PDF). http://www.cs.ucy.ac.cy/~dzeina/papers/ic12-crowdsourcing.pdf
Arkian, Hamid Reza; Diyanat, Abolfazl; Pourkhalili, Atefe (2017). "MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications". Journal of Network and Computer Applications. 82: 152–165. doi:10.1016/j.jnca.2017.01.012. /wiki/Doi_(identifier)
Felstiner, Alek (August 2011). "Working the Crowd: Employment and Labor Law in the Crowdsourcing Industry" (PDF). Berkeley Journal of Employment & Labor Law. 32: 150–151 – via WTF. http://wtf.tw/ref/felstiner.pdf
"View of Crowdsourcing: Libertarian Panacea or Regulatory Nightmare?". online-shc.com. Retrieved 26 May 2017.[permanent dead link] https://online-shc.com/arc/ojs/index.php/JOHE/article/view/4/4
WEI, F-F.; CHEN, W-N.; Guo, X-Q.; Zhao, B.; Jeon, S-W.; Zhang, J. (2024). "CrowdEC: Crowdsourcing-based Evolutionary Computation for Distributed Optimization". IEEE Transactions on Services Computing. 17 (6): 3286–3299. doi:10.1109/TSC.2024.3433487. https://ieeexplore.ieee.org/document/10618890
Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson, B. (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk" (PDF). Chi 2010. Archived from the original (PDF) on 1 April 2011. Retrieved 28 February 2012. https://wayback.archive-it.org/all/20110401101755/http://www.ics.uci.edu/~jwross/pubs/RossEtAl-WhoAreTheCrowdworkers-altCHI2010.pdf
Litman, Leib; Robinson, Jonathan (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. ISBN 978-1506391137. 978-1506391137
Huff, Connor; Tingley, Dustin (1 July 2015). ""Who are these people?" Evaluating the demographic characteristics and political preferences of MTurk survey respondents". Research & Politics. 2 (3): 205316801560464. doi:10.1177/2053168015604648. S2CID 7749084. https://doi.org/10.1177%2F2053168015604648
Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023. https://psyarxiv.com/jbc9d/
Levay, Kevin E.; Freese, Jeremy; Druckman, James N. (1 January 2016). "The Demographic and Political Composition of Mechanical Turk Samples". SAGE Open. 6 (1): 215824401663643. doi:10.1177/2158244016636433. S2CID 147299692. https://doi.org/10.1177%2F2158244016636433
Hirth, M.; Hoßfeld, T.; Train-Gia, P. (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform (PDF) http://www3.informatik.uni-wuerzburg.de/TR/tr478.pdf
Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday. 13 (6). doi:10.5210/fm.v13i6.2159. https://doi.org/10.5210%2Ffm.v13i6.2159
van der Krieke, L; Blaauw, FJ; Emerencia, AC; Schenk, HM; Slaets, JP; Bos, EH; de Jonge, P; Jeronimus, BF (2016). "Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback (2016)" (PDF). Psychosomatic Medicine. 79 (2): 213–223. doi:10.1097/PSY.0000000000000378. PMID 27551988. S2CID 10955232. https://pure.rug.nl/ws/files/40193705/00006842_201702000_00011.pdf
Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday. 13 (6). doi:10.5210/fm.v13i6.2159. https://doi.org/10.5210%2Ffm.v13i6.2159
Lakhani; et al. (2007). The Value of Openness in Scientific Problem Solving (PDF). Retrieved 26 February 2012. http://www.hbs.edu/research/pdf/07-050.pdf
Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet". International Journal of Communication. 6: 20. http://ijoc.org/ojs/index.php/ijoc/article/view/1542/751
Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society. 13 (8): 1122–1145. doi:10.1080/13691181003624090. S2CID 143402410. /wiki/Doi_(identifier)
Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society. 15 (3): 394–410. doi:10.1080/1369118X.2011.641991. S2CID 145675154. /wiki/Doi_(identifier)
Saxton, Gregory D.; Oh, Onook; Kishore, Rajiv (2013). "Rules of Crowdsourcing: Models, Issues, and Systems of Control". Information Systems Management. 30: 2–20. CiteSeerX 10.1.1.300.8026. doi:10.1080/10580530.2013.739883. S2CID 16811686. /wiki/CiteSeerX_(identifier)
Aitamurto, Tanja; Landemore, Hélène; Saldivar Galli, Jorge (2016). "Unmasking the Crowd: Participants' Motivation Factors, Profile and Expectations for Participation in Crowdsourced Policymaking". Information, Communication & Society. 20 (8): 1239–1260. doi:10.1080/1369118x.2016.1228993. S2CID 151989757. http://thefinnishexperiment.com/2016/09/21/motivation-factors-for-participation-in-crowdsourced-policymaking/
Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday. 13 (6). doi:10.5210/fm.v13i6.2159. https://doi.org/10.5210%2Ffm.v13i6.2159
Lakhani; et al. (2007). The Value of Openness in Scientific Problem Solving (PDF). Retrieved 26 February 2012. http://www.hbs.edu/research/pdf/07-050.pdf
Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society. 13 (8): 1122–1145. doi:10.1080/13691181003624090. S2CID 143402410. /wiki/Doi_(identifier)
Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". International Journal of Communication. 9: 3523–3543. http://crowdsourcinginjournalism.com/2015/10/28/motivation-factors-in-crowdsourced-journalism-social-impact-social-change-and-peer-learning/
Kaufmann, N.; Schulze, T.; Viet, D. (2011). "More than fun and money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk" (PDF). Proceedings of the Seventeenth Americas Conference on Information Systems. Archived from the original (PDF) on 27 February 2012. https://web.archive.org/web/20120227173340/http://schader.bwl.uni-mannheim.de/fileadmin/files/publikationen/Kaufmann_Schulze_Veit_2011_-_More_than_fun_and_money_Worker_motivation_in_Crowdsourcing_-_A_Study_on_Mechanical_Turk_AMCIS_2011.pdf
Brabham, Daren C. (2012). "Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning". Journal of Applied Communication Research. 40 (3): 307–328. doi:10.1080/00909882.2012.693940. S2CID 144807388. /wiki/Doi_(identifier)
Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
Dahlander, Linus; Piezunka, Henning (1 June 2014). "Open to suggestions: How organizations elicit suggestions through proactive and reactive attention". Research Policy. Open Innovation: New Insights and Evidence. 43 (5): 812–827. doi:10.1016/j.respol.2013.06.006. ISSN 0048-7333. https://linkinghub.elsevier.com/retrieve/pii/S0048733313001108
Kaufmann, N.; Schulze, T.; Viet, D. (2011). "More than fun and money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk" (PDF). Proceedings of the Seventeenth Americas Conference on Information Systems. Archived from the original (PDF) on 27 February 2012. https://web.archive.org/web/20120227173340/http://schader.bwl.uni-mannheim.de/fileadmin/files/publikationen/Kaufmann_Schulze_Veit_2011_-_More_than_fun_and_money_Worker_motivation_in_Crowdsourcing_-_A_Study_on_Mechanical_Turk_AMCIS_2011.pdf
Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". International Journal of Communication. 9: 3523–3543. http://crowdsourcinginjournalism.com/2015/10/28/motivation-factors-in-crowdsourced-journalism-social-impact-social-change-and-peer-learning/
"State of the World's Volunteerism Report 2011" (PDF). Unv.org. Archived from the original (PDF) on 2 December 2014. Retrieved 1 July 2015. https://web.archive.org/web/20141202072036/http://www.unv.org/fileadmin/docdb/pdf/2011/SWVR/English/SWVR2011_full.pdf
Chandler, D.; Kapelner, A. (2010). "Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets" (PDF). Journal of Economic Behavior & Organization. 90: 123–133. arXiv:1210.0962. doi:10.1016/j.jebo.2013.03.003. S2CID 8563262. http://www.danachandler.com/files/Chandler_Kapelner_BreakingMonotonyWithMeaning.pdf
Aparicio, M.; Costa, C.; Braga, A. (2012). "Proposing a system to support crowdsourcing". Proceedings of the Workshop on Open Source and Design of Communication (PDF). pp. 13–17. doi:10.1145/2316936.2316940. ISBN 9781450315258. S2CID 16494503. 9781450315258
Aitamurto, Tanja; Landemore, Hélène; Saldivar Galli, Jorge (2016). "Unmasking the Crowd: Participants' Motivation Factors, Profile and Expectations for Participation in Crowdsourced Policymaking". Information, Communication & Society. 20 (8): 1239–1260. doi:10.1080/1369118x.2016.1228993. S2CID 151989757. http://thefinnishexperiment.com/2016/09/21/motivation-factors-for-participation-in-crowdsourced-policymaking/
Ipeirotis, Panagiotis G. (10 March 2010). Demographics of Mechanical Turk. http://archive.nyu.edu/handle/2451/29585
Ross, Joel; Irani, Lilly; Silberman, M. Six; Zaldivar, Andrew; Tomlinson, Bill (10 April 2010). "Who are the crowdworkers?". CHI '10 Extended Abstracts on Human Factors in Computing Systems. CHI EA '10. New York, USA: Association for Computing Machinery. pp. 2863–2872. doi:10.1145/1753846.1753873. ISBN 978-1-60558-930-5. S2CID 11386257. 978-1-60558-930-5
Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023. https://psyarxiv.com/jbc9d/
Quinn, Alexander J.; Bederson, Benjamin B. (2011). "Human Computation:A Survey and Taxonomy of a Growing Field, CHI 2011 [Computer Human Interaction conference], May 7–12, 2011, Vancouver, BC, Canada" (PDF). Retrieved 30 June 2015. http://alexquinn.org/papers/Human%20Computation,%20A%20Survey%20and%20Taxonomy%20of%20a%20Growing%20Field%20(CHI%202011).pdf
Hauser, David J.; Moss, Aaron J.; Rosenzweig, Cheskie; Jaffe, Shalom N.; Robinson, Jonathan; Litman, Leib (3 November 2022). "Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk". Behavior Research Methods. 55 (8): 3953–3964. doi:10.3758/s13428-022-01999-x. PMC 10700412. PMID 36326997. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10700412
Prpić, J; Shukla, P.; Roth, Y.; Lemoine, J.F. (2015). "A Geography of Participation in IT-Mediated Crowds". Proceedings of the Hawaii International Conference on Systems Sciences 2015. SSRN 2494537. /wiki/SSRN_(identifier)
Dahlander, Linus; Piezunka, Henning (9 December 2020). "Why crowdsourcing fails". Journal of Organization Design. 9 (1): 24. doi:10.1186/s41469-020-00088-7. hdl:10419/252174. ISSN 2245-408X. https://doi.org/10.1186%2Fs41469-020-00088-7
Dahlander, Linus; Piezunka, Henning (1 June 2014). "Open to suggestions: How organizations elicit suggestions through proactive and reactive attention". Research Policy. Open Innovation: New Insights and Evidence. 43 (5): 812–827. doi:10.1016/j.respol.2013.06.006. ISSN 0048-7333. https://linkinghub.elsevier.com/retrieve/pii/S0048733313001108
"How Generative AI Can Augment Human Creativity". Harvard Business Review. 16 June 2023. ISSN 0017-8012. Retrieved 20 June 2023. https://hbr.org/2023/07/how-generative-ai-can-augment-human-creativity
Borst, Irma. "The Case For and Against Crowdsourcing: Part 2". Archived from the original on 12 September 2015. Retrieved 9 February 2015. https://web.archive.org/web/20150912024759/http://www.crowdsourcing.org/editorial/the-case-for-and-against-crowdsourcing-part-2/2850
Litman, Leib; Robinson, Jonathan (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. ISBN 978-1506391137. 978-1506391137
Ipeirotis; Provost; Wang (2010). Quality Management on Amazon Mechanical Turk (PDF). Archived from the original (PDF) on 9 August 2012. Retrieved 28 February 2012. https://web.archive.org/web/20120809230548/http://people.stern.nyu.edu/panos/publications/hcomp2010.pdf
Hauser, David J.; Moss, Aaron J.; Rosenzweig, Cheskie; Jaffe, Shalom N.; Robinson, Jonathan; Litman, Leib (3 November 2022). "Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk". Behavior Research Methods. 55 (8): 3953–3964. doi:10.3758/s13428-022-01999-x. PMC 10700412. PMID 36326997. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10700412
Lukyanenko, Roman; Parsons, Jeffrey; Wiersma, Yolanda (2014). "The IQ of the Crowd: Understanding and Improving Information Quality in Structured User-Generated Content". Information Systems Research. 25 (4): 669–689. doi:10.1287/isre.2014.0537. /wiki/Doi_(identifier)
Hauser, David; Paolacci, Gabriele; Chandler, Jesse (15 April 2019), "Evidence and Solutions", Handbook of Research Methods in Consumer Psychology, doi:10.4324/9781351137713-17, ISBN 9781351137713, S2CID 150882624, retrieved 12 January 2023 9781351137713
Moss, Aaron J; Rosenzweig, Cheskie; Jaffe, Shalom Noach; Gautam, Richa; Robinson, Jonathan; Litman, Leib (11 June 2021). Bots or inattentive humans? Identifying sources of low-quality data in online platforms. doi:10.31234/osf.io/wr8ds. S2CID 236288817. https://osf.io/wr8ds
Goerzen, Thomas; Kundisch, Dennis (11 August 2016). "Can the Crowd Substitute Experts in Evaluation of Creative Ideas? An Experimental Study Using Business Models". AMCIS 2016 Proceedings. https://aisel.aisnet.org/amcis2016/Virtual/Presentations/10
Burnap, Alex; Ren, Alex J.; Papazoglou, Giannis; Gerth, Richard; Gonzalez, Richard; Papalambros, Panos. When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation (PDF). Archived from the original (PDF) on 29 October 2015. Retrieved 19 May 2015. https://web.archive.org/web/20151029001614/http://ode.engin.umich.edu/publications/PapalambrosPapers/2015/316J.pdf
Kurve, Aditya; Miller, David J.; Kesidis, George (30 May 2014). "Multicategory Crowdsourcing Accounting for Variable Task Difficulty, Worker Skill, and Worker Intention". IEEE Kde (99).
Hirth; Hoßfeld; Tran-Gia (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform (PDF) http://www3.informatik.uni-wuerzburg.de/TR/tr478.pdf
Moss, Aaron J; Rosenzweig, Cheskie; Jaffe, Shalom Noach; Gautam, Richa; Robinson, Jonathan; Litman, Leib (11 June 2021). Bots or inattentive humans? Identifying sources of low-quality data in online platforms. doi:10.31234/osf.io/wr8ds. S2CID 236288817. https://osf.io/wr8ds
PhD, Aaron Moss (18 September 2018). "After the Bot Scare: Understanding What's Been Happening With Data Collection on MTurk and How to Stop It". CloudResearch. Retrieved 12 January 2023. https://www.cloudresearch.com/resources/blog/after-the-bot-scare-understanding-whats-been-happening-with-data-collection-on-mturk-and-how-to-stop-it/
Ipeirotis, Panagiotis G. (2010). "Analyzing the Amazon Mechanical Turk Marketplace" (PDF). XRDS: Crossroads, the ACM Magazine for Students. 17 (2): 16–21. doi:10.1145/1869086.1869094. S2CID 6472586. SSRN 1688194. Retrieved 2 October 2018. https://archive.nyu.edu/bitstream/2451/29801/4/CeDER-10-04.pdf
Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on Amazon's Mechanical Turk", Behavior Research Methods, SSRN 1691163 /wiki/SSRN_(identifier)
Hosaka, Tomoko A. (April 2008). "Facebook asks users to translate for free". NBC News. https://www.nbcnews.com/id/wbna24205912
Britt, Darice. "Crowdsourcing: The Debate Roars On". Archived from the original on 1 July 2014. Retrieved 4 December 2012. https://web.archive.org/web/20140701173128/http://insite.artinstitutes.edu/crowdsourcing-the-debate-roars-on-39739.aspx
Borst, Irma. "The Case For and Against Crowdsourcing: Part 2". Archived from the original on 12 September 2015. Retrieved 9 February 2015. https://web.archive.org/web/20150912024759/http://www.crowdsourcing.org/editorial/the-case-for-and-against-crowdsourcing-part-2/2850
Woods, Dan (28 September 2009). "The Myth of Crowdsourcing". Forbes. Retrieved 4 December 2012. https://www.forbes.com/2009/09/28/crowdsourcing-enterprise-innovation-technology-cio-network-jargonspy.html
Aitamurto, Tanja; Leiponen, Aija. "The Promise of Idea Crowdsourcing: Benefits, Contexts, Limitations". Ideasproject.com. Retrieved 2 July 2015. https://www.academia.edu/963662
Aitamurto, Tanja; Leiponen, Aija. "The Promise of Idea Crowdsourcing: Benefits, Contexts, Limitations". Ideasproject.com. Retrieved 2 July 2015. https://www.academia.edu/963662
"International Translators Association Launched in Argentina". Latin American Herald Tribune. Archived from the original on 11 March 2021. Retrieved 23 November 2016. https://web.archive.org/web/20210311031022/http://www.laht.com/article.asp?ArticleId=344753&CategoryId=14093
Kleeman, Frank (2008). "Un(der)paid Innovators: The Commercial Utilization of Consumer Work through Crowdsourcing". Sti-studies.de. Retrieved 2 July 2015. http://www.sti-studies.de/ojs/index.php/sti/article/view/81/62
Jason (2011). "Crowdsourcing: A Million Heads is Better Than One". Crowdsourcing.org. Archived from the original on 3 July 2015. Retrieved 2 July 2015. https://web.archive.org/web/20150703021755/http://www.crowdsourcing.org/document/crowdsourcing-a-million-heads-is-better-than-one/8619
Dupree, Steven (2014). "Crowdfunding 101: Pros and Cons". Gsb.stanford.edu. Retrieved 2 July 2015. http://www.gsb.stanford.edu/ces/crowdfunding-101
Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson, B. (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk" (PDF). Chi 2010. Archived from the original (PDF) on 1 April 2011. Retrieved 28 February 2012. https://wayback.archive-it.org/all/20110401101755/http://www.ics.uci.edu/~jwross/pubs/RossEtAl-WhoAreTheCrowdworkers-altCHI2010.pdf
"Fair Labor Standards Act Advisor". Retrieved 28 February 2012. http://www.dol.gov/elaws/faq/esa/flsa/001.htm
Hara, Kotaro; Adams, Abigail; Milland, Kristy; Savage, Saiph; Callison-Burch, Chris; Bigham, Jeffrey P. (21 April 2018). "A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York, USA: ACM. pp. 1–14. doi:10.1145/3173574.3174023. ISBN 9781450356206. S2CID 5040507. 9781450356206
Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on Amazon's Mechanical Turk", Behavior Research Methods, SSRN 1691163 /wiki/SSRN_(identifier)
Greg Norcie, 2011, "Ethical and practical considerations for compensation of crowdsourced research participants", CHI WS on Ethics Logs and VideoTape: Ethics in Large Scale Trials & User Generated Content, [1][usurped], accessed 30 June 2015. https://web.archive.org/web/20120630074708/http://www.crowdsourcing.org/document/ethical-and-practical-considerations-for-compensation-of-crowdsourced-research-participants/3650
Busarovs, Aleksejs (2013). "Ethical Aspects of Crowdsourcing, or is it a Modern Form of Exploitation" (PDF). International Journal of Economics & Business Administration. 1 (1): 3–14. doi:10.35808/ijeba/1. Retrieved 26 November 2014. http://www.ijeba.com/documents/papers/2013_1_p1.pdf
Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023. https://psyarxiv.com/jbc9d/
Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023. https://psyarxiv.com/jbc9d/
Hosaka, Tomoko A. (April 2008). "Facebook asks users to translate for free". NBC News. https://www.nbcnews.com/id/wbna24205912
Paolacci, G; Chandler, J; Ipeirotis, P.G. (2010). "Running experiments on Amazon Mechanical Turk". Judgment and Decision Making. 5 (5): 411–419. doi:10.1017/S1930297500002205. hdl:1765/31983. S2CID 14476283. https://doi.org/10.1017%2FS1930297500002205
Graham, Mark; Hjorth, Isis; Lehdonvirta, Vili (1 May 2017). "Digital labour and development: impacts of global digital labour platforms and the gig economy on worker livelihoods". Transfer: European Review of Labour and Research. 23 (2): 135–162. doi:10.1177/1024258916687250. PMC 5518998. PMID 28781494. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5518998
Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society. 15 (3): 394–410. doi:10.1080/1369118X.2011.641991. S2CID 145675154. /wiki/Doi_(identifier)
The Crowdsourcing Scam (Dec. 2014), The Baffler, No. 26 http://www.thebaffler.com/salvos/crowdsourcing-scam
Lakhani; et al. (2007). The Value of Openness in Scientific Problem Solving (PDF). Retrieved 26 February 2012. http://www.hbs.edu/research/pdf/07-050.pdf
Salehi; et al. (2015). We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers (PDF). Archived from the original (PDF) on 17 June 2015. Retrieved 16 June 2015. https://web.archive.org/web/20150617061055/http://www.kristymilland.com/papers/Salehi.2015.We.Are.Dynamo.pdf
Irani, Lilly C.; Silberman, M. Six (27 April 2013). "Turkopticon". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, USA: ACM. pp. 611–620. doi:10.1145/2470654.2470742. ISBN 9781450318990. S2CID 207203679. 9781450318990
Shmueli, Boaz; Fell, Jan; Ray, Soumya; Ku, Lun-Wei (2021). "Beyond Fair Pay: Ethical Implications of NLP Crowdsourcing". Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Association for Computational Linguistics. pp. 3758–3769. doi:10.18653/v1/2021.naacl-main.295. S2CID 233307331. https://aclanthology.org/2021.naacl-main.295