
How much is too much? The effects of information quantity on crowdfunding performance

  • Naomi Moy ,

    Contributed equally to this work with: Naomi Moy, Ho Fai Chan, Benno Torgler

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    n.moy@qut.edu.au

    Affiliation School of Economics & Finance, Faculty of Business, Queensland University of Technology, Brisbane, Queensland, Australia

  • Ho Fai Chan ,

    Contributed equally to this work with: Naomi Moy, Ho Fai Chan, Benno Torgler

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation School of Economics & Finance, Faculty of Business, Queensland University of Technology, Brisbane, Queensland, Australia

  • Benno Torgler

    Contributed equally to this work with: Naomi Moy, Ho Fai Chan, Benno Torgler

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations School of Economics & Finance, Faculty of Business, Queensland University of Technology, Brisbane, Queensland, Australia, Center for Research in Economics, Management and the Arts (CREMA), Zurich, Switzerland


Abstract

We explore the effects of the quantity of information on the tendency to contribute to crowdfunding campaigns. Using the crowdfunding platform Kickstarter, we analyze the campaign descriptions and the performance of over 70,000 projects. We look empirically at the effect of information quantity (word count) on funding success (as measured by amount raised and number of backers). Within this empirical approach, we test whether an excessive amount of information affects funding success. To do so, we test for a non-linear (quadratic) effect of our independent variable (word count) using regression analysis. Consistent with the hypothesis that excess information negatively affects funds raised and the number of contributors, we observe a consistent inverted U-shaped relationship between campaign text length and overall success, which suggests that an optimal number of words exists for crowdfunding texts and that exceeding this point reduces a project’s chance of fundraising success.

Introduction

… in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.

Herbert A. Simon (1971, pp. 40–41)

It has often been said,

There’s so much to be read,

You never can cram

All those words in your head.

So the writer who breeds

More words than he needs

Is making a chore

For the reader who reads.

That’s why my belief is

The briefer the brief is,

The greater the sigh

Of the reader’s relief is.

And that’s why your books

Have such power and strength.

You publish with shorth!

(Shorth is better than length.)

Theodor Seuss Geisel (Dr. Seuss)

Lo bueno, si breve, dos veces bueno. (What is good, if brief, is twice as good.)

Baltasar Gracián

Information gathering is a crucial part of both decision-making and problem-solving, whether it be pedestrians checking for approaching traffic before crossing a road or consumers reading hotel reviews before making a reservation. Such efforts are made in the belief that having too little information could result in negative outcomes, such as a severe traffic accident or a disastrous hotel stay. When information is overabundant, on the other hand, individuals cannot process every single datum even though doing so would probably lead to a better outcome (e.g., a higher-quality decision). Likewise, when confronted with an information surplus like a lengthy text, they find it difficult to distinguish between relevant and unnecessary data. Rather, because the cognitive costs of acquiring all relevant information outweigh the potential benefits [1], they tend to rely on heuristic or reasoning-based shortcuts to reach decisions. These observations have led the developing field of attention economics to treat human attention as a scarce resource [2], a limiting factor in information consumption.

To shed more light on this relationship between the amount of information provided and decision-making, this paper examines it in the context of the crowdfunding platform Kickstarter, which seeks public financial support for innovative projects and ideas. Because potential contributors rely on project description content to inform their pledging decision, it is important to understand the role of information quantity in investment behaviour. To this end, our analysis is guided by one primary research question: Does the amount of information provided by the creator influence the funding outcome? Using a large empirical dataset from Kickstarter.com that encompasses over 80,000 projects, we identify an inverted U-shaped relation between project description length and the number of funders or amount raised.

Text length and information

Information overload occurs when individuals receive too much information and reach a point at which they can no longer process it [1], [3]. In effect, too much information becomes too much of a good thing [4]. This overabundance is evidenced by individual performance during decision-making, which steadily improves as information is added but then declines as data input becomes excessive and leads to information overload (see [3] for an overview). Providing too much textual information, and thereby increasing text length, could therefore (unintentionally) have a negative effect on individual contributor decisions. It may, for example, lead to feelings of stress, confusion, pressure, anxiety, or low motivation ([3], p. 328). Naturally, the propensity to ignore lengthy informative texts is closely related to individual attention spans. The problems associated with attention span can be further exacerbated when the comprehensibility of certain text sections depends heavily on a clear understanding of preceding parts. As well illustrated by the “TL;DR” (“too long, didn’t read”) notation on very long articles, readers can usually gauge the level of effort required to digest information through such cues as the number of pages or the thickness of a document. Academic journal articles, for example, tend to be around 20 pages long, many non-fiction books are no longer than 300 pages, and online news articles are progressively becoming shorter [5].

The effect of writing style, in terms of text length, has been investigated in various disciplines. For example, various studies in the informetrics and scientometrics literature report a significantly positive relation between scientific article length and citation outcomes (see [6], [7], [8], [9], [10]). On the other hand, short and succinct abstracts are more likely to increase citations than longer abstracts [11]. A large body of survey research is similarly devoted to the link between response rates and questionnaire length. Whereas shorter questionnaires tend to increase both response quality and response rate [12], [13], [14] (see also [15] for an overview), a distinct U-shaped relation has been reported between response rate and questionnaire length [16]. In other words, although shorter questionnaires elicit the greatest response rate, the longest questionnaires do not necessarily have the lowest response rate.

In consumer settings, such as exhibit labels in a museum or health claims on foodstuffs, text length can affect individual attention. Whereas the longer the text, the less likely it is to be read or understood (see [17], [18], [19], [20]), the length of online reviews for consumer products is positively correlated with purchasing behaviour. For instance, [21] examines the effect of the average length of online book reviews on book sales on Amazon.com and Barnesandnoble.com, and [22] studies the relationship between comment length and the perceived helpfulness of online reviews on Yelp.com. This research concludes that, in the absence of information overload, which would diminish attentiveness and lower decision quality [1], [23], [24], [25], [26], longer commentaries increase perceived helpfulness and sales [21], [22]. In sum, the length of a body of text (i.e., the number of words or pages) influences various types of success, including item sales, decision quality, and article citations.

Crowdfunding: Kickstarter

Building upon the earlier literature, our analysis assesses the effects of text length on success in an entrepreneurial setting using data from the crowdfunding website Kickstarter, which links innovators with individuals who are willing to contribute funds in exchange for physical (product) and/or non-physical (gratitude) rewards. To convince individuals to contribute to their campaigns, creators must pitch their idea using text, which may be supported by images and videos. The amount of information used within each medium is important given Kickstarter’s all-or-nothing funding model (i.e., a project creator receives no money if the funding goal is not reached). Because both the descriptive text and the outcome are discernible in this setting, we are able to observe how the number of words deters or encourages monetary contributions to a project. Even more important, by holding most things equal in outcome creation (same goal, platform, possibilities, and restrictions), Kickstarter campaigns provide a controlled setting equivalent to a real-world laboratory. Kickstarter descriptions are thus ideal texts for study. In our case, we anticipate an inverted U-shaped relation between project text length and funding success; that is, an overabundance of information, quantified by the number of words, will decrease the amount raised and the number of project contributors. Previous studies [27], [28], [29], [30] do not explore this non-linearity.

Data and methodology

Dataset

The dataset for this study was obtained from a GitHub project developed by [31]. The raw data were collected by [31] from Kickstarter using Python, in line with previous research using data from crowdfunding websites: for example, [30] and [32] developed customized computer scripts that took daily snapshots of live campaigns on Kickstarter, while [32], [33] and [34] deployed web crawlers to extract project information from the site. Our dataset comprises detailed information on all Kickstarter campaigns from 21st April 2009 to 6th May 2013, for a total of 87,260 projects. Each observation records project-related information, such as the outcome of the funding campaign in terms of the number of ‘backers’ (individuals who supported the campaign financially) and the amount of funds raised, the full text of the campaign description, the main category and sub-category of the project, the funding goal, the project’s geographic location, the campaign launch and end dates, and project and creator identifiers. We removed 3,868 projects that were still live at the date of data collection. We also removed 1,505 projects that were cancelled or suspended, or whose description contained fewer than three words. The content of the latter projects was manually inspected; example descriptions include “Sorry, project withdrawn!”, “Removed, for now.”, and “Video to come.”. The numbers of projects with 1, 2 and 3 words are 126, 12 and 18, respectively. The resulting dataset spans 21st April 2009 to 29th April 2013 and contains 81,892 projects. Although [34] excludes, as non-serious, projects with fundraising goals above 2.5 million dollars or below 10 cents, we retain them; doing so does not alter our results.
For the dependent variables, we use the overall amount raised (in U.S. dollars) and the number of funding contributors to measure the fundraising success of a project. These two variables indicate a project’s level of financial success and popularity. On visual inspection of the data, the distribution of both dependent variables appears to follow an exponential function (see Fig 1); we therefore apply a natural log transformation to both for our analysis.
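The transformation itself is a one-liner; note that the paper does not say how zero-valued outcomes (projects with no funds or backers) were handled, so the log(1 + x) variant used in this sketch is an assumption:

```python
import numpy as np

def log_transform(values):
    """Natural-log transform for heavily right-skewed outcomes such as
    funds raised and backer counts. log1p(x) = log(1 + x) keeps
    zero-valued projects defined (an assumed choice, not the paper's)."""
    return np.log1p(np.asarray(values, dtype=float))

# An exponential-looking outcome becomes far less skewed after the transform.
raised = [0.0, 25.0, 100.0, 5000.0, 250000.0]
print(log_transform(raised))
```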

Fig 1. Distribution of amount raised and the number of backers, (a) original and (b) log transformed.

The right-hand tail (above the 99th percentile) of the distributions in (a) is not shown, for visualization purposes.

https://doi.org/10.1371/journal.pone.0192012.g001

Our key independent variable, the quantity of information provided by the creator, is measured as the total word count of the Kickstarter project description. A typical project contains around 500 words on average (with a standard deviation of 466 words). It is worth noting that the distribution of word counts varies across project categories; for example, technology projects include more words than art projects, possibly reflecting the need to explain the technical aspects of the product. Summary statistics for description word counts are given by category in Table 1. Descriptions of game, technology, and design projects are longest (634–938 words on average), while descriptions of music, dance, and theatre projects are shortest (351–398 words on average). Additionally, as is typical for Internet-mediated texts such as emails, eforums, and Wikipedia articles [35], the distribution of description word count is right-skewed (see Fig 2). Notably, two projects in the publishing category provide sample chapters totaling over 10,000 words. To prevent such outliers, although genuine, from unduly affecting the estimates of the non-linear relation between text length and funding outcomes, we censor (winsorize) the word count variable at the top 99th percentile within each category (see Table 1). In other words, we replace the word counts of the longest 1% of projects with the 99th-percentile value of their category.
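The per-category censoring described above can be sketched as follows (a minimal version; the authors' exact implementation is not published):

```python
import numpy as np

def winsorize_by_category(word_counts, categories, pct=99):
    """Cap word counts above the pct-th percentile, computed separately
    within each project category, at that percentile value."""
    word_counts = np.asarray(word_counts, dtype=float)
    categories = np.asarray(categories)
    capped = word_counts.copy()
    for cat in np.unique(categories):
        mask = categories == cat
        cap = np.percentile(word_counts[mask], pct)  # category-specific cap
        capped[mask] = np.minimum(word_counts[mask], cap)
    return capped
```

Under this scheme a 12,000-word publishing project would be recoded to its own category's 99th-percentile length, while all shorter projects are left untouched.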

Fig 2. Distribution of the total word count across projects (a) and by category (b).

The right-hand tail of each distribution is truncated for the purposes of visualization.

https://doi.org/10.1371/journal.pone.0192012.g002

Description edits and potential endogeneity bias

Although the goal and project deadline cannot be edited after launch, and no information can be modified after the campaign has ended, project creators can amend certain project details during the fundraising period (e.g., the text, images, and/or videos in the description) or add additional rewards. Kickstarter also provides them with a separate ‘Project update’ tab through which to communicate with (potential) backers and provide updated information. These amendments, which unfortunately are not documented, create the potential for endogeneity bias through reverse causality; for example, a creator may urge potential backers to contribute as the deadline approaches, or overfunded projects may offer additional rewards to attract more backers. We therefore draw on two earlier studies that assess the dynamic effect of such amendments and their possible impact on success.

In the first study, [36] examine a smaller sample of 19,299 Kickstarter projects, 64% of which underwent no edits throughout the entirety of the project. The vast majority of edited projects were only amended once or twice, most often in the first few days of the campaign. The authors do report an increased number of edits towards the project’s end, which they attribute to creators either showing appreciation for backers as the goal is being reached or urging more contributions when success seems near. However, they provide no statistical evidence for these claims. Nor do they find any statistically significant relationship between their measure of edit size (extent) and project success.

The second study, [37], focuses specifically on updates posted by the creator via the update tab before the campaign outcome was determined. Of the 8,529 projects studied, 58.6% had at least one update, and having at least one update significantly increased the chances of success. To identify the frequency of update types, the authors classify the updates into seven latent Dirichlet allocation (LDA) categories. In order of most to least frequent, these are social promotion, progress report, new content, reminder, answer question, new reward, and appreciation. Of these, reminder, progress report, and new reward updates are the most influential in predicting project success. In our study, we account for potential endogeneity bias by using cues similar to those employed by [37] to identify edited projects. For example, we apply the label “progress report” to any description containing “*ve reached” phrases and the label “reminder” to any containing “days to go” type phrases (see S1 Table for the complete list). If, however, the description contains the word ‘update’ or ‘UPDATE’ (case sensitive) but none of the search terms intimating a progress report or reminder, we classify it as a general edit that might provide information on new content or a new reward. We thus code our edit indicators as no edit, general edit, reminder edit, or progress edit.
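The cue-based coding above can be sketched as a small classifier. Only a couple of illustrative phrases appear below; the full search-term list is in S1 Table, so these cue lists are hypothetical stand-ins:

```python
import re

# Illustrative cues only; see S1 Table for the actual search terms.
PROGRESS_CUES = ["ve reached"]                # e.g. "we've reached 50%"
REMINDER_CUES = ["days to go", "days left"]   # e.g. "only 3 days to go"

def classify_edit(description):
    """Label a description as 'no edit', 'general edit', 'reminder edit',
    or 'progress edit' from textual cues, mirroring the coding above."""
    text = description.lower()
    if any(cue in text for cue in PROGRESS_CUES):
        return "progress edit"
    if any(cue in text for cue in REMINDER_CUES):
        return "reminder edit"
    # 'update' or 'UPDATE' exactly (case sensitive), with neither progress
    # nor reminder cues present, counts as a general edit.
    if re.search(r"\b(update|UPDATE)\b", description):
        return "general edit"
    return "no edit"

print(classify_edit("Thanks everyone, we've reached 50% of our goal!"))
print(classify_edit("UPDATE: a new reward tier has been added."))
```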

[36] identified 6,998 (36.26%) edited projects, a large portion of which had only minor amendments (e.g., corrected typos). In our sample, most projects show no identifiable amendments; however, 6,478 projects (7.91%) have been edited. We find that 72.95% of the projects with an edited description, and just over 80% of the projects with a reminder or progress edit, achieved funding (see Table 2). This echoes the finding of [37] that reminders had the strongest effect on achieving success, followed by progress reports and general updates (new content or reward).

Controls

As control variables, we consider characteristics of the project and its creator as well as external factors within the crowdfunding platform. Project and creator characteristics include the category of the project, the funding goal, the funding duration (and its squared term), the geographic location (latitude and longitude), and the creator’s experience (the number of existing or previous projects by the same creator). To control for changes to the description text during the campaign, we include three dummy variables, one for each type of identified project edit (general, reminder, and progress edit). Furthermore, we capture the level of competition using the average number of projects in the same sub-category during the project campaign. Table 3 presents the descriptive statistics of the variables used in the empirical analysis.
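The competition control can be sketched as a daily-overlap count (an assumed construction; the paper does not give the exact formula):

```python
import pandas as pd

def competition(df):
    """For each project, the average (over its campaign days) number of
    other projects in the same sub-category that were live that day.
    Expects columns 'subcat', 'start', 'end' (datetimes)."""
    averages = []
    for idx, proj in df.iterrows():
        days = pd.date_range(proj["start"], proj["end"], freq="D")
        daily = [
            ((df["subcat"] == proj["subcat"])
             & (df["start"] <= day)
             & (df["end"] >= day)
             & (df.index != idx)).sum()   # exclude the project itself
            for day in days
        ]
        averages.append(sum(daily) / len(daily))
    return pd.Series(averages, index=df.index)
```

This quadratic-time loop is fine as an illustration; a production version over ~80,000 projects would use interval joins instead.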

Model

Using the following model (1), we test the hypothesis that an excessive amount of information can affect funding success (non-linearity of word count):

ln(Yik) = β0 + β1 TCik + β2 TCik² + γ Pik + δ Cik + θ Eik + εik (1)

where,

Yik Outcome: amount raised or number of backers of the ith project of creator k

TCik Total word count

TCik² Total word count, squared

Pik Project characteristics (goal, category, duration, duration², latitude, longitude, edit dummies)

Cik Creator experience (number of existing or previous projects for the creator)

Eik External characteristic, competition (average number of sub-category competitors during the campaign)

εik Error term
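Specification (1) can be sketched in stripped-down form with ordinary least squares on simulated data. The controls and the creator-level clustering of standard errors are omitted for brevity, and the coefficients below are invented purely so that the simulated outcome has an inverted U-shape:

```python
import numpy as np

def fit_quadratic_ols(word_count, ln_outcome):
    """OLS of the logged outcome on word count and its square.
    The full model also includes project, creator, and competition
    controls; this sketch keeps only the word-count terms."""
    X = np.column_stack([
        np.ones_like(word_count),  # intercept (beta_0)
        word_count,                # linear term (beta_1)
        word_count ** 2,           # quadratic term (beta_2)
    ])
    beta, *_ = np.linalg.lstsq(X, ln_outcome, rcond=None)
    return beta

# Simulated data with a true inverted U (made-up coefficients).
rng = np.random.default_rng(0)
wc = rng.uniform(50, 3000, 1000)
ln_raised = 2.0 + 0.002 * wc - 6e-7 * wc ** 2 + rng.normal(0, 0.5, 1000)

b0, b1, b2 = fit_quadratic_ols(wc, ln_raised)
print(b1 > 0, b2 < 0)  # positive linear, negative quadratic: inverted U
```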

Results

Our estimations use multivariate OLS regressions of funds raised and number of contributing backers (in logs) on the total word count of the description (as shown in Table 4). We include both the linear and squared terms of word count to assess the potential non-linear relationship with the outcome variables. For all specifications, we cluster the standard errors at the project-creator level. Across all specifications in Table 4, the coefficient for the linear (quadratic) term of word count is positively (negatively) significant at the 1% level. This finding suggests that having more words in the project description increases both the overall amount raised and the number of contributors, but with a diminishing or even negative effect once the text becomes too long. The main effects are robust to controls for project/creator characteristics and external factors. In specifications (3) and (6), we further control for potential description edits (Edits) and exclude projects outside the U.S. (approximately 7% of the total). Holding all other factors constant, the linear term of the word count is positive and the squared term is negative, with both coefficients statistically significant at the 1% level. This outcome indicates a robust inverted U-shaped relation between text length and overall funding success, as illustrated in part (a) of Figs 3 and 4. For example, based on estimates from specifications (3) and (6), adding 100 words to the description of a 1,000-word project would increase the amount raised and the number of backers by approximately 0.111% and 0.085%, respectively. On the other hand, if a project with a lengthy description (say 2,000 words) added 100 words, its funds raised and number of contributors would decrease by 0.069% and 0.042%, respectively. Overall, around 2.3% of the projects exceeded the optimal length of about 1,700 words.
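The turning point and the marginal-effect figures follow mechanically from the fitted quadratic. A small helper illustrates the arithmetic; the coefficients below are hypothetical values chosen only so that the peak falls near the roughly 1,682 words reported for specification (3), not the actual Table 4 estimates:

```python
def turning_point(b1, b2):
    """Word count at which the inverted U peaks: solve d/dw (b1*w + b2*w^2) = 0."""
    return -b1 / (2.0 * b2)

def pct_change(b1, b2, w, extra=100):
    """Approximate percentage change in the (level) outcome from adding
    `extra` words to a description of length `w`, in a log-level model
    (valid for small changes, where %-change ~ 100 * delta in logs)."""
    delta_ln = b1 * extra + b2 * ((w + extra) ** 2 - w ** 2)
    return 100.0 * delta_ln

# Hypothetical coefficients (not the paper's estimates).
b1, b2 = 2.96e-5, -8.8e-9
print(round(turning_point(b1, b2)))  # peak lands near 1,682 words
print(pct_change(b1, b2, 1000) > 0)  # below the peak: adding words helps
print(pct_change(b1, b2, 2000) < 0)  # past the peak: adding words hurts
```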

Fig 3. Effects of project description length (word count) on fund raised (in log).

Part (a) depicts the results from specification (3) in Table 4. There exists a concave (inverted U-shaped) relationship between description length and funds raised: increasing the length of the project description increases funding until it reaches the optimal point (~1681.77 words), after which the amount raised is negatively affected and starts to decline. Part (b) reports the outcomes for the same specification by project category, as demonstrated in Table 5. The concave relationship is visible in all categories, but the optimal length point varies.

https://doi.org/10.1371/journal.pone.0192012.g003

Fig 4. Effects of project description length (word count) on the number of contributors (in log).

Part (a) depicts the results from specification (6) in Table 4. Similar to Fig 3, there is an inverted U-shaped relationship between project description length and the number of backers (contributors); the optimal length point is approximately 1689.21 words. Part (b) reports the outcomes for the same specification by project category, as demonstrated in Table 6. A similar pattern is found in all categories, but the optimal length point varies.

https://doi.org/10.1371/journal.pone.0192012.g004

Table 4. Multivariate analysis of description length (word count censored at 99%).

https://doi.org/10.1371/journal.pone.0192012.t004

Because the main effect might vary to different degrees among the various project categories, we extend our analysis by examining each category individually using the same specification structures (see (3) and (6)). As Tables 5 and 6 show, the primary finding of an inverted U-shape is evident in each category (statistically significant at the 1% level); however, the optimal length point varies across categories (see Figs 3b and 4b). The same pattern also emerges within each sub-category (see S1 and S2 Figs), with the exception of electronic music, whose coefficient on the quadratic term is not statistically significant.

Table 5. Multivariate analysis of description length (word count censored at 99%) on ln(Raised).

https://doi.org/10.1371/journal.pone.0192012.t005

Table 6. Multivariate analysis of description length (word count censored at 99%) on ln(Backers).

https://doi.org/10.1371/journal.pone.0192012.t006

Discussion

To assess the influence of information quantity on funding success, we measure the former as the word count of the project description and quantify the latter as the funds contributed and the number of contributing backers. Using these variables, we demonstrate a clear non-linear relation between the amount of descriptive text and funding outcomes, with increasing the number of words having a positive effect in the lower word-count range. We also demonstrate that, as evidenced by the inverted U-shape in Figs 3 and 4, in all Kickstarter product categories there is an optimal number of words beyond which the project creator’s ability to attract contributions and contributors (backers) is reduced. That each category’s turning point varies may indicate a degree of flexibility in backer perceptions of what constitutes too much information, with the consumption of information being more elastic in some categories than in others. For example, the Games and Publishing categories both attempt to capture the imagination of their respective audiences, yet the optimal description length is greater for Games than for Publishing. It may be that to sell a game you need to sell the story, whereas in publishing you need to sell the story without giving away the plot line, and thus need to be more succinct. Regardless of whether the creator is emphasizing certain points or providing extra detail, crossing this optimal point (which occurred in around 2% of the cases analysed) has a negative impact on overall project success by deterring backers and their funds. This effect may be caused by the extra effort needed to read a lengthy text or by sheer information overload. On the other hand, given that projects with fewer words are more likely to be scams, much shorter texts may not provide sufficient detail to convince potential contributors of a project’s high quality or even its legitimacy [38].

In drawing these conclusions about the influence of text-length cues on decision-making behaviour, we are careful to consider the limited cognitive capacity of boundedly rational individuals. Nevertheless, our analysis is not without limitations. First, it is highly likely that we have omitted-variable bias. For example, social network size, the presence and scope of images and videos, the frequency and timing of updates, and spelling errors within the text can all significantly affect success [27], [39], [40], [41]. Nor can we control for the distinctive personal characteristics of backers, such as age or category interests, which may influence their funding decisions by shaping their information capacity or interest. Moreover, although we measure the quantitative aspect of the text, we do not operationalize its qualitative features, which offers a useful direction in which to extend the research. Finally, although we try to account for suspected endogeneity bias, future studies might address this aspect better by working with more precise data and monitoring all changes over time.

Supporting information

S1 Fig. Effects of word count on ln(Raised) based on sub-category.

* designates unspecified sub-categories. Each section is based on the sub-category that is offered on Kickstarter during the time period of data collection, and represents the marginal effects of the word count based on specification (3).

https://doi.org/10.1371/journal.pone.0192012.s001

(TIF)

S2 Fig. Effects of word count on ln(Backers) based on sub-category.

* designates unspecified sub-categories. Each section is based on the sub-category that is offered on Kickstarter during the time period of data collection, and represents the marginal effects of the word count based on specification (6).

https://doi.org/10.1371/journal.pone.0192012.s002

(TIF)

S1 Table. Search terms to identify type of edit.

Note: All terms are case sensitive unless the word/phrase contains at least one upper case letter.

https://doi.org/10.1371/journal.pone.0192012.s003

(DOCX)

Acknowledgments

We would like to thank the reviewer Jose Roberto Iglesias for his helpful suggestions that improved this paper.

References

  1. Malhotra NK. Information load and consumer decision making. J Consum Res. 1982; 8(4): 419–430.
  2. Huberman BA, Wu F. The economics of attention: Maximizing user value in information-rich environments. Adv Complex Sys. 2008; 11(04): 487–496.
  3. Eppler MJ, Mengis J. The concept of information overload: A review of literature from organization science, accounting, marketing, MIS, and related disciplines. Inform Soc. 2004; 20(5): 325–344.
  4. Shenk D. Information overload. In: Johnston DH, editor. Encyclopedia of International Media and Communications. New York: Elsevier; 2003. p. 395–405.
  5. Baron NS. Words Onscreen: The Fate of Reading in a Digital World. Oxford: Oxford University Press; 2015.
  6. Torgler B, Piatti M. A Century of American Economic Review: Insights on Critical Factors in Journal Publishing. Basingstoke: Palgrave Macmillan; 2016.
  7. Falagas ME, Zarkali A, Karageorgopoulos DE, Bardakas V, Mavros MN. The impact of article length on the number of future citations: A bibliometric analysis of general medicine journals. PLoS ONE. 2013; 8(2): e49476. pmid:23405060
  8. Chan H, Guillot M, Page L, Torgler B. The inner quality of an article: Will time tell? Scientometrics. 2015; 104(1): 19–41.
  9. Weinberger CJ, Evans JA, Allesina S. Ten simple (empirical) rules for writing science. PLoS Comput Biol. 2015; 11(4): e1004205. pmid:25928031
  10. Stremersch S, Verniers I, Verhoef PC. The quest for citations: Drivers of article impact. J Marketing. 2007; 71(3): 171–193. They find that article length has a positive linear relation with citation number and that non-linearity does not increase the model fit.
  11. Letchford A, Preis T, Moat HS. The advantage of simple paper abstracts. J Informetr. 2016; 10(1): 1–8.
  12. Burchell B, Marsh C. The effect of questionnaire length on survey response. Qual Quant. 1992; 26(3): 233–244.
  13. Galesic M, Bosnjak M. Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opin Quart. 2009; 73(2): 349–360.
  14. Herzog AR, Bachman JG. Effects of questionnaire length on response quality. Public Opin Quart. 1981; 45(4): 549–559.
  15. Rolstad S, Adler J, Rydén A. Response burden and questionnaire length: Is shorter better? A review and meta-analysis. Value Health. 2011; 14(8): 1101–1108. pmid:22152180
  16. Lund E, Gram IT. Response rate according to title and length of questionnaire. Scand J Public Health. 1998; 26(2): 154–160.
  17. Bitgood SC, Patterson DD. The effects of gallery changes on visitor reading and object viewing time. Environ Behav. 1993; 25(6): 761–781.
  18. Cota A, Bitgood S. Recall of label content: The effects of length and sequence. Recall. 1993; 8(4): 12.
  19. Williams P. Consumer understanding and use of health claims for foods. Nutr Rev. 2005; 63(7): 256–264. pmid:16121480
  20. Bitgood S, Dukes S, Abbey L. Interest and effort as predictors of reading: A test of the general value principle. Current Trends in Audience Research. 2006; 19(2): 2.
  21. Chevalier JA, Mayzlin D. The effect of word of mouth on sales: Online book reviews. J Marketing Res. 2006; 43(3): 345–354.
  22. Kampouris P. TL;DR (Too long; didn’t read)! A study on the effects of length and fluency on the helpfulness of online reviews. M.Sc. Thesis, Tilburg University. 2013.
  23. Lee BK, Lee WN. The effect of information overload on consumer choice quality in an on-line environment. Psychol Market. 2004; 21(3): 159–183.
  24. Lurie NH. Decision making in information-rich environments: The role of information structure. J Consum Res. 2004; 30(4): 473–486.
  25. Chen YC, Shang RA, Kao CY. The effects of information overload on consumers’ subjective state towards buying decision in the internet shopping environment. Electron Commer RA. 2009; 8(1): 48–58.
  26. Krasnova H, Kolesnikova E, Guenther O. “It won’t happen to me!”: Self-disclosure in online social networks. AMCIS 2009: Proceedings of the Americas Conference on Information Systems; 2009; paper 343.
  27. Evers MW. The main drivers of crowdfunding success: A conceptual framework and empirical analysis. M.Sc. Thesis, Erasmus University. 2012.
  28. Geva H, Barzilay O, Oestreicher-Singer G. A potato salad with a lemon twist: Using supply-side shocks to study the impact of low-quality actors on crowdfunding platforms; 2016. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2777474
  29. Crosetto P, Regner T. Crowdfunding: Determinants of success and funding dynamics. Preprint. Jena Economic Research Papers; 2014. https://ideas.repec.org/p/jrp/jrpwrp/2014-035.html
  30. Zhou M, Du Q, Fan W, Qiao Z, Wang G, Zhang X. Money talks: A predictive model on crowdfunding success using project description. AMCIS 2015: Proceedings of the 21st Americas Conference on Information Systems; 2015; Puerto Rico.
  31. Neight-Allen. django-kickstarter-scraper; 2013 [cited 2016 Jun 8]. Database: GitHub [Internet]. https://github.com/neight-allen/django-kickstarter-scraper#db-dump
  32. Kuppuswamy V, Bayus BL. Crowdfunding creative ideas: The dynamics of project backers in Kickstarter. In: Hornuf L, Cumming D, editors. The Economics of Crowdfunding: Startups, Portals, and Investor Behavior. Forthcoming; 2017. Available from: https://ssrn.com/abstract=2234765 or http://dx.doi.org/10.2139/ssrn.2234765
  33. Etter V, Grossglauser M, Thiran P. Launch hard or go home!: Predicting the success of Kickstarter campaigns. COSN 2013: Proceedings of the First ACM Conference on Online Social Networks; 2013; New York, NY, USA. p. 177–182. Available from: http://dx.doi.org/10.1145/2512938.2512957
  34. Mollick E. The dynamics of crowdfunding: An exploratory study. J Bus Venturing. 2014; 29(1): 1–16.
  35. Sobkowicz P, Thelwall M, Buckley K, Paltoglou G, Sobkowicz A. Lognormal distributions of user post lengths in Internet discussions: A consequence of the Weber-Fechner law? EPJ Data Sci. 2013; 2(1): 1.
  36. Venugopal V, Bagadia S. Understanding the dynamics of crowdfunding: Kickstarter edits. CS224N Final Project. 2015. https://nlp.stanford.edu/courses/cs224n/2015/reports/19.pdf
  37. Xu A, Yang X, Rao H, Fu WT, Huang SW, Bailey BP. Show me the money! An analysis of project updates during crowdfunding campaigns. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; 2014. http://dl.acm.org/citation.cfm?id=2556288
  38. Shafqat W, Lee S, Malik S, Kim H. The language of deceivers: Linguistic features of crowdfunding scams. Proceedings of the 25th International Conference Companion on World Wide Web; 2016. http://dl.acm.org/citation.cfm?id=2872518&picked=prox
  39. Cumming DJ, Leboeuf G, Schwienbacher A. Crowdfunding models: Keep-it-all vs. all-or-nothing. In: Paris December 2014 Finance Meeting EUROFIDAI-AFFI; 2014 Dec; Paris.
  40. Gao Q, Lin M. Lemon or cherry? The value of texts in debt crowdfunding; 2015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2446114
  41. Simon H. Designing organizations for an information-rich world. In: Greenberger M, editor. Computers, Communications, and the Public Interest. Baltimore, MD: The Johns Hopkins Press; 1971. p. 37–72.