
Learning by doing? The relationship between effort, learning effect and product quality during hackathons of novice teams

Published online by Cambridge University Press:  08 April 2024

Nuno Miguel Martins Pacheco*
Affiliation:
TUM School of Engineering and Design, Technical University of Munich, Munich, Germany
Mara Geisler
Affiliation:
TUM School of Management, Technical University of Munich, Munich, Germany
Medina Bajramovic
Affiliation:
Department of Statistics, Ludwig-Maximilians-Universität München, Munich, Germany
Gabrielle Fu
Affiliation:
TUM School of Engineering and Design, Technical University of Munich, Munich, Germany
Anand Vazhapilli Sureshbabu
Affiliation:
TUM School of Engineering and Design, Technical University of Munich, Munich, Germany
Markus Mörtl
Affiliation:
TUM School of Engineering and Design, Technical University of Munich, Munich, Germany
Markus Zimmermann
Affiliation:
TUM School of Engineering and Design, Technical University of Munich, Munich, Germany
* Corresponding author: Nuno Miguel Martins Pacheco, martins.pacheco@tum.de

Abstract

Design education prepares novice designers to solve complex and challenging problems requiring diverse skill sets and an interdisciplinary approach. Hackathons, for example, offer a hands-on, collaborative learning approach within a limited time frame for quickly gaining practical experience and developing problem-solving skills. They enable collaboration, prototyping and testing among interdisciplinary teams. Typically, hackathons focus strongly on the solution, assuming that this will support learning. However, building the best product and achieving a strong learning effect may not be related. This paper presents the results of an empirical study that examines the relationship between product quality, learning effect and effort spent in an academic 2-week hackathon. In this course, thirty teams identified user problems and developed hardware and mechatronic products. The study collected the following data: (1) effort spent during the hackathon, through task tracking; (2) learning effect, through self-assessment by the participants; and (3) product quality after the hackathon, assessed by an external jury. The study found that the effort a team spends has a statistically significant but moderate correlation with product quality. The correlation between product quality and learning effect is statistically insignificant, suggesting that, in this setting, there is no relevant association.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press

1. Introduction

The importance of design in driving economic performance has been increasingly recognized by employers (Shute & Becker Reference Shute and Becker2010; Matthews & Wrigley Reference Matthews and Wrigley2017; Sheppard et al. Reference Sheppard, Sarrazin, Kouyoumjian and Dore2018). Beyond designers’ ability to develop effective solutions to ill-structured problems (Jonassen, Strobel, & Lee Reference Jonassen, Strobel and Lee2006; Chen, Kolmos, & Du Reference Chen, Kolmos and Du2021), the landscape has evolved to encompass increasingly complex and intertwined systems, processes and contextual factors (Friedman Reference Friedman2019; Meyer & Norman Reference Meyer and Norman2020). Equally significant is the challenge designers face in operating within sociotechnical systems, where they must responsibly address societal and environmental concerns (Friedman Reference Friedman2019; Meyer & Norman Reference Meyer and Norman2020). Despite this growing interest and the evolving demands placed on designers, the literature points out that education systems and corporate cultures often fail to impart these critical skills to novice designers (Lawson Reference Lawson2005; Anthony Reference Anthony2014; Norman Reference Norman2016; Meyer & Norman Reference Meyer and Norman2020). Novice designers are new to the design field and possess limited experience designing products or systems (Deininger et al. Reference Deininger, Daly, Sienko and Lee2017; Menold et al. Reference Menold, Berdanier, McComb, Hocker and Gardner2018; Flus & Hurst Reference Flus and Hurst2021a). This designation may apply to young engineers in companies and engineering students. In many cases, these young designers end up learning on the job (Meyer & Norman Reference Meyer and Norman2020) what they were initially expected to acquire through their formal education.

Design education emerges as a compelling solution to this paradox, equipping novice designers with the knowledge and skills demanded by the industry (Tu, Liu, & Wu Reference Tu, Liu and Wu2018; Porras et al. Reference Porras, Knutas, Ikonen, Happonen, Khakurel and Herala2019; Meyer & Norman Reference Meyer and Norman2020). However, traditional design education has historically placed significant emphasis on skill-based training, with a primary focus on acquiring technical skills as the cornerstone of the education of novice designers (Norman Reference Norman2016; Meyer & Norman Reference Meyer and Norman2020; Lin, Huang, & Lin Reference Lin, Huang and Lin2021). While technical proficiency undeniably holds significant importance, it is crucial to recognize that achieving excellence in design extends beyond skill refinement. It entails gaining a comprehensive understanding of the design process, fostering teamwork skills and applying technology in a human-centered context. At its core, the discipline of design revolves around the acts of creation and execution (Norman Reference Norman2016; Meyer & Norman Reference Meyer and Norman2020) and, as such, the design project plays a pivotal role in the education of novice designers (Meyer & Norman Reference Meyer and Norman2020). Therefore, design education often takes the form of problem- and project-based learning (PBL) due to its anticipated advantages in enhancing students’ academic performance and developing transferable skills, effectively preparing students for the complexities of real-world design work (Kolmos, Fink, & Krogh Reference Kolmos, Fink and Krogh2004; Lawson Reference Lawson2005; Norman Reference Norman2016; Meyer & Norman Reference Meyer and Norman2020; Chen et al. Reference Chen, Kolmos and Du2021).

One prevalent form of design education in universities is hackathon-like events or seminars that claim to reflect the real professional world (Lawson Reference Lawson2005; Flus & Hurst Reference Flus and Hurst2021a). A literature review by Flus & Hurst (Reference Flus and Hurst2021a) encompassing 39 studies on design research at hackathons revealed various themes in educational hackathon research, including the experiences of participants (Olesen, Hansen, & Halskov Reference Olesen, Hansen, Halskov and Buchanan2018) and the impact of interdisciplinary teams on team performance (Legardeur et al. Reference Legardeur, Masson, Gardoni, Pimapunsri, Bitran, Conn, Gernreich, Heber, Huizingh and Torkkeli2020). Furthermore, numerous papers have shared qualitative insights and anecdotes from hackathons as design educational formats (Artiles & Wallace Reference Artiles and Wallace2013; Artiles & LeVine Reference Artiles and LeVine2015; Lewis et al. Reference Lewis, Parker, Cheng and Resnick2015; Fowler Reference Fowler2016; Nandi & Mandernach Reference Nandi, Mandernach and Alphonce2016; Page et al. Reference Page, Sweeney, Bruce, Baxter, Bohemia, Kovacevic, Buck, Tollestrup, Eriksen and Ovesen2016; Gama et al. Reference Gama, Alencar, Calegario, Neves and Alessio2018; Kos Reference Kos2019). Remarkably, among the 39 papers examined, merely 10 employed quantitative methods, exposing a gap in the quantitative research of hackathons as educational formats.

By bridging this gap, we have the opportunity to lay the groundwork for a more comprehensive exploration of the effectiveness of hackathons as a pedagogical tool in design education. Such research could provide a data-driven foundation that offers evidence-based recommendations, ultimately enhancing design education practices within hackathon events. This, in turn, holds the potential to significantly improve learning outcomes for novice designers. Therefore, the primary objective of this study is to investigate the efficacy of hackathons as an educational instrument for novice designers.

2. Hackathons

Hackathons are project-based events exposing participants to design (Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015; La Place et al. Reference La Place, Jordan, Lande and Weiner2017). Since the term “hackathon” (a combination of “hack” and “marathon”) was first coined in 1999 (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Kollwitz & Dinter Reference Kollwitz, Dinter, Hildebrandt, van Dongen, Röglinger and Mendling2019), the popularity of these events has multiplied. As the number of hackathon events increased, so did new terms (e.g., game jam, code fest, design sprints and makeathon) and classifications. This has created a situation where there is no widely accepted definition of a hackathon (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015).

In the context of this paper, hackathons are defined as intensive, time-limited, collaborative events where participants from diverse disciplinary backgrounds develop innovative solutions, typically in the form of functional prototypes, to address specific challenges. These events emphasize the synthesis of creative thinking, problem-solving and technical skills and interdisciplinary collaboration within a constrained time frame.

Although hackathons vary greatly in their objectives and implementation, they share common characteristics and a consistent structure. In general, hackathons are defined as brief and fixed-duration events in which small, interdisciplinary teams collaborate to develop functional software or hardware prototypes (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015; Flores et al. Reference Flores, Golob, Maklin, Herrera, Tucci, Al-Ashaab, Williams, Encinas, Martinez, Zaki, Sosa, Pineda, Moon, Lee, Park, Kiritsis and von Cieminski2018; Taylor & Clarke Reference Taylor, Clarke, Mandryk, Hancock, Perry and Cox2018; Lifshitz-Assaf, Lebovitz, & Zalmanson Reference Lifshitz-Assaf, Lebovitz and Zalmanson2021; Flus & Hurst Reference Flus and Hurst2021a).

Hackathon events typically follow a set schedule (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015). The event begins with a kick-off presentation outlining the goals, schedule and prizes. Participants then form teams if they have not already done so. After the teams have come together, the actual hack begins. Teams explore the problem space, ideate, build prototypes and often work long hours. At the end of the event, the ideas are presented to an audience, often in competition for a prize. Due to this emphasis on creating and presenting a tangible output, hackathons usually focus on the final product (Briscoe & Mulligan Reference Briscoe and Mulligan2014).

3. Design education and hackathons

Design education has traditionally placed a strong emphasis on skill-based training, primarily aimed at honing the technical abilities of novice designers (Norman Reference Norman2016; Meyer & Norman Reference Meyer and Norman2020; Lin et al. Reference Lin, Huang and Lin2021). This lays the foundation for lower-level educational objectives, encompassing tasks such as memorization, description and interpretation of information corresponding to the “remember” and “understand” levels in Krathwohl’s (Reference Krathwohl2002) revised taxonomy of Bloom (Reference Bloom1979). Nevertheless, this conventional approach falls short of reaching higher-level educational goals (Fernando et al. Reference Fernando, Mansur, Alves and Torres2019).

Over the past 50 years, higher design education has recognized the need to bridge this gap between knowledge acquisition and practical skill application, which has led to the integration of more pragmatic PBL formats into the curriculum (Kolmos et al. Reference Kolmos, Fink and Krogh2004; Chen et al. Reference Chen, Kolmos and Du2021). PBL, a student-centered pedagogy, aligns with the attainment of higher educational objectives, including “apply”, “analyze”, “evaluate” and “create” as outlined in Bloom’s (Reference Bloom1979) revised taxonomy (Sasson, Yehuda, & Malkinson Reference Sasson, Yehuda and Malkinson2018; Fernando et al. Reference Fernando, Mansur, Alves and Torres2019). This transformation in educational methodology is underpinned by three fundamental principles (Kolmos, de Graaff, & Du Reference Kolmos, de Graaff, Du, Du, de Graaf and Kolmos2009). The first principle centers around cognitive learning, where students engage with authentic, real-world problems in their learning journey. This process encourages collaboration among peers and stakeholders to develop prototypes of viable solutions within defined timeframes (Kolmos et al. Reference Kolmos, de Graaff, Du, Du, de Graaf and Kolmos2009; Savin-Baden Reference Savin-Baden2014; Kokotsaki, Menzies, & Wiggins Reference Kokotsaki, Menzies and Wiggins2016; Boelt, Kolmos, & Holgaard Reference Boelt, Kolmos and Holgaard2022). The second principle underscores the importance of team-based learning, redefining education as a social event (Kolmos et al. Reference Kolmos, de Graaff, Du, Du, de Graaf and Kolmos2009; Savin-Baden Reference Savin-Baden2014; Boelt et al. Reference Boelt, Kolmos and Holgaard2022). Within this context, students not only engage in self-directed learning but also actively participate in the sharing and organization of knowledge. This collaborative dynamic naturally transitions into the third principle, interdisciplinary learning, as team members often bring diverse subject backgrounds to the table (Kolmos et al. Reference Kolmos, de Graaff, Du, Du, de Graaf and Kolmos2009; Savin-Baden Reference Savin-Baden2014; Boelt et al. Reference Boelt, Kolmos and Holgaard2022). PBL equips novice designers with critical insights into design processes, collaboration and principles of good design (Kokotsaki et al. Reference Kokotsaki, Menzies and Wiggins2016; Meyer & Norman Reference Meyer and Norman2020). In fact, it has been linked to improved retention rates as well as skill and knowledge development (Norman & Schmidt Reference Norman and Schmidt2000; Dochy et al. Reference Dochy, Segers, van den Bossche and Gijbels2003; Strobel & van Barneveld Reference Strobel and van Barneveld2009; Sasson et al. Reference Sasson, Yehuda and Malkinson2018).

One widely recognized example of PBL is design studios, dedicated classes where students work separately on projects under the guidance of an instructor (Cennamo Reference Cennamo, Boling, Schwier, Gray, Smith and Campbell2016; Ferreira, Christiaans, & Almendra Reference Ferreira, Christiaans and Almendra2016; Emam, Taha, & ElSayad Reference Emam, Taha and ElSayad2019; Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023). This teacher-centered learning approach has students engage in an iterative process to find solutions for open-ended, project-based problems, known as design briefs (Cennamo Reference Cennamo, Boling, Schwier, Gray, Smith and Campbell2016; Emam et al. Reference Emam, Taha and ElSayad2019; Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023). Throughout this creative journey, the instructor assumes a pivotal role by steering students through the design process and providing periodic feedback on their designs, which they use to refine their work (Cennamo Reference Cennamo, Boling, Schwier, Gray, Smith and Campbell2016; Emam et al. Reference Emam, Taha and ElSayad2019; Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023). This aims to replicate a real-world studio environment within an educational context (Ferreira et al. Reference Ferreira, Christiaans and Almendra2016). Critics of studio-based learning, however, contend that this teacher-centered approach fosters dependency on instructors, diverting students’ focus from honing their skills (Belluigi Reference Belluigi2016; Souleles Reference Souleles2017). Consequently, we conclude that studio design education may hinder the attainment of the educational objective “evaluate” from Bloom’s revised taxonomy, which involves the ability to make informed judgments. Souleles (Reference Souleles2017) even goes so far as to argue that an over-reliance on teacher-centered instructional strategies impedes the development of competencies required for practices like participatory design, co-design, design activism, disruptive design, critical design and other human-centered design strategies. Moreover, as higher education institutions grapple with space constraints due to the proliferation of higher education (Micklethwaite & Knifton Reference Micklethwaite and Knifton2017; Corazzo Reference Corazzo2019; Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023), it becomes imperative to reimagine design studios as collaborative learning environments (Micklethwaite & Knifton Reference Micklethwaite and Knifton2017; Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023). Alternatively, there is a growing trend in design education to shift away from traditional studio-based learning activities toward PBL approaches that immerse students more deeply in the design process and transform the role of the teacher from an “all-knowing master” into a mentor who facilitates learning (Brosens et al. Reference Brosens, Raes, Octavia and Emmanouil2023).

In contrast to design studios, which typically follow a teacher-centered approach to PBL, hackathons embrace a student-centered approach (Horton et al. Reference Horton, Jordan, Weiner and Lande2018), prioritizing collaborative and hands-on problem-solving in a time-limited and high-pressure environment. Unlike traditional design studio courses that extend over an entire semester, hackathons offer short yet intense projects, enabling students to fully immerse themselves in their design projects. The inherent nature of hackathons makes them exceptionally well-suited for design education as they closely mimic real-world, innovative environments (Lawson Reference Lawson2005; Flus & Hurst Reference Flus and Hurst2021a). During hackathons, students are encouraged to nurture their analytical problem-solving abilities and collaborate actively, relying less on continuous guidance.

These shared characteristics (see Table 1) make hackathons an excellent platform for design education, as they offer participants an opportunity to gain practical experience with the entire design process in a short period and to develop valuable skills for new product development (Lawson Reference Lawson2005; Flus & Hurst Reference Flus and Hurst2021a). Instructors can easily tailor the hackathon format and content to align with participants’ academic level and educational objectives (Hurst, Litster, & Rennick Reference Hurst, Litster and Rennick2020). Hackathons, as informal learning platforms, foster learning-by-doing, peer learning and creativity among students in design education (Nandi & Mandernach Reference Nandi, Mandernach and Alphonce2016; Flus & Hurst Reference Flus and Hurst2021a). Moreover, hackathons can lead to the creation of finished products with genuine business value, making them effective idea-generation platforms (Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015; Page et al. Reference Page, Sweeney, Bruce, Baxter, Bohemia, Kovacevic, Buck, Tollestrup, Eriksen and Ovesen2016). Prior research has indicated that greater learning during projects enhances team performance by avoiding problem-solving errors and promoting innovative combinations of knowledge (Huang & Li Reference Huang and Li2012; McKay & Ellis Reference McKay and Ellis2014). Additionally, investing more effort has been associated with greater learning and improved achievements in academic settings (Hagger & Hamilton Reference Hagger and Hamilton2019; Putwain et al. Reference Putwain, Nicholson, Pekrun, Becker and Symes2019). These findings not only contribute to the entrepreneurial environment within institutions but also highlight the significance of hackathons in education. As a result, universities have extensively implemented hackathons in educational settings such as THINK. MAKE. START., ME310, XTech, Integrated Product Development, Multidisciplinary Group Project and Advanced Embodiment Design.

Table 1. Characteristics of a hackathon

Despite the benefits outlined above, there are concerns about the extent to which hackathons replicate realistic innovation environments. One criticism is the lack of genuine customers providing problem statements for teams to work on (Lawson Reference Lawson2005). Instead, teams begin the hack by exploring various problem spaces, hastily seeking input from users and customers (Flus & Hurst Reference Flus and Hurst2021a). The emphasis on speed in hackathons can lead participants to rush through certain design phases, such as requirements elicitation and code maintenance (Gama Reference Gama, LaToza and Mao2017). This tendency is driven by the pressure to showcase a final prototype to the audience at the event’s conclusion (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Porras et al. Reference Porras, Knutas, Ikonen, Happonen, Khakurel and Herala2019). Consequently, participants may become excessively preoccupied with their solution, neglecting sufficient reflection on their design process (Lawson Reference Lawson2005; Gama Reference Gama, LaToza and Mao2017; Flus & Olechowski Reference Flus, Olechowski and Gero2023). A recent study by Thomson & Grierson (Reference Thomson, Grierson, Seybold and Mantwill2021) confirms concerns expressed by researchers regarding the predominant focus on solutions in hackathons. The study revealed that students assigned increased importance to the stages associated with designing the solution after participating in a design project. Another criticism is that hackathons fail to accurately mirror the decision-making environment of realistic innovation settings (Flus & Olechowski Reference Flus, Olechowski and Gero2023). Real-world settings often involve cross-functional teams (Cooper Reference Cooper2019), whereas students frequently lack formal exposure to such projects (Kelland, Brisco, & Whitfield Reference Kelland, Brisco, Whitfield, Bohemia, Lyndon and Hilary2022). Furthermore, hackathons are characterized by their fast-paced and dynamic nature (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015; Flus & Hurst Reference Flus and Hurst2021a), which can create ambiguity and uncertainty (Böhle, Heidling, & Schoper Reference Böhle, Heidling and Schoper2016; Flus & Hurst Reference Flus and Hurst2021a), calling for slow design thinking (Kannengiesser & Gero Reference Kannengiesser and Gero2019; Flus & Olechowski Reference Flus, Olechowski and Gero2023). However, due to time constraints, participants are often required to adopt a rapid design thinking approach, potentially limiting the exploration of design alternatives (Flus & Olechowski Reference Flus, Olechowski and Gero2023). Experienced designers, by contrast, exhibit fast design thinking because their expertise allows them to design intuitively and effectively (Hurst et al. Reference Hurst, Nespoli, Abdellahi and Gero2019; Kannengiesser & Gero Reference Kannengiesser and Gero2019).

In conclusion, hackathons are viewed positively as both idea-generation platforms and teaching tools. They facilitate collaboration, knowledge exchange and the creation of innovative solutions for complex problems (Komssi et al. Reference Komssi, Pichlis, Raatikainen, Kindstrom and Jarvinen2015; Flus & Hurst Reference Flus and Hurst2021a). However, it is crucial to acknowledge that hackathons’ solution-focused nature has also received criticism in design education (Lawson Reference Lawson2005; Gama Reference Gama, LaToza and Mao2017; Flus & Olechowski Reference Flus, Olechowski and Gero2023). While the quality of the product concepts produced is undoubtedly significant, it should not be the sole measure of success. Factors such as participants’ learning outcomes, efficient resource allocation and the proper use of agile methodologies are crucial for overall educational success. This raises the question of the efficacy of hackathons as instruments for design education, particularly in terms of participants’ learning outcomes and the quality of the resultant products. Remarkably, these crucial aspects have not, to the best of our knowledge, been examined in conjunction with the effort invested by students in the design project.

4. Research objectives

This study aimed to investigate the relationship between effort invested in product development, learning effect and product quality in educational hackathons involving teams of novice designers, which are referred to as novice teams. The research question guiding this investigation was: What is the relationship between time spent on product development, product quality and learning effect? Four hypotheses were formulated and tested to address this question.

Based on the outcome-focused nature of hackathons (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Porras et al. Reference Porras, Knutas, Ikonen, Happonen, Khakurel and Herala2019; Flus & Olechowski Reference Flus, Olechowski and Gero2023), the first hypothesis posited a significant correlation between the total hours invested in product development and product quality. We expected that greater time investment by participants would lead to improved final product quality. This hypothesis aligns with the notion that effort predicts academic achievements (Lee Reference Lee2014; Hagger & Hamilton Reference Hagger and Hamilton2019; Putwain et al. Reference Putwain, Nicholson, Pekrun, Becker and Symes2019), although its applicability to hackathons, characterized by intense time pressure and collaborative innovation, remains unexplored. Therefore, the present study sought to investigate whether the number of hours invested in the development phase would be associated with higher product quality in hackathons.

H1: The number of hours a team invests in developing a product during a student-based hackathon correlates with the product quality at a moderate to high level ($|\rho| \ge .3$).

The second hypothesis proposed a significant correlation between the number of hours a team spends in the problem/solution phase and the resulting product quality. We expected that greater time investment during the problem phase would positively impact product quality. During this phase, teams conduct research, observations and interviews to understand and define the problem they intend to solve. This leads to a clear problem definition that guides the subsequent solution phase. Prior research suggests that effective problem exploration enables designers to generate a broader range of solutions and potentially more innovative outcomes (Daly et al. Reference Daly, McKilligan, Studer, Murray and Seifert2018; Studer et al. Reference Studer, Daly, McKilligan and Seifert2018; Murray et al. Reference Murray, Studer, Daly, McKilligan and Seifert2019). Additionally, it was anticipated that the hours invested in the solution phase would also positively influence product quality. This aligns with hackathons’ emphasis on generating solutions and producing a high-quality final prototype (Briscoe & Mulligan Reference Briscoe and Mulligan2014; Porras et al. Reference Porras, Knutas, Ikonen, Happonen, Khakurel and Herala2019; Thomson & Grierson Reference Thomson, Grierson, Seybold and Mantwill2021; Flus & Olechowski Reference Flus, Olechowski and Gero2023).

H2: The number of hours a team invests in the problem/solution phase of product development during a student-based hackathon correlates with the product quality at a moderate to high level ($|\rho| \ge .3$).

The third hypothesis of this study examined the relationship between total effort and the learning effect in hackathons. The learning effect refers to the improvement in self-assessed knowledge after completing the course compared to before starting it. The hypothesis proposed that greater total effort invested by participants would lead to a stronger learning effect. This hypothesis was derived from previous literature suggesting that increased effort promotes learning (Huang & Li Reference Huang and Li2012; McKay & Ellis Reference McKay and Ellis2014).

H3: The number of hours a team invests in developing a product during a student-based hackathon correlates with the learning effect at a moderate to high level ($|\rho| \ge .3$).

The fourth hypothesis of this study investigated the relationship between the learning effect and product quality in hackathons. The hypothesis suggested that the learning effect influences the quality of the final solution. This hypothesis was based on previous research highlighting the significant role of learning in achieving project success within teams and organizations (Zwikael et al. Reference Zwikael, Levin, Parviz and Rad2008; Huang & Li Reference Huang and Li2012; McKay & Ellis Reference McKay and Ellis2014).

H4: The learning effect a team experiences during a student-based hackathon correlates with the product quality at a moderate to high level ($|\rho| \ge .3$).

This study is significant for understanding the effectiveness of hackathons as a pedagogical tool for novice designers. Effective pedagogy, as understood by the authors, encompasses an approach to learning that not only equips novice designers with essential skills for real-world problem-solving, such as collaboration and analytical skills, but also imparts practical skills in crafting high-quality products. By exploring the relationship between effort, learning effect and product quality in educational hackathons, this study contributes to the development of more effective pedagogical strategies for teaching design skills. It offers practical recommendations for the design and implementation of hackathon events within design education, aimed at enhancing their effectiveness as a means to prepare novice designers for the complexities of real-world problem-solving and the creation of products that align with the principles of desirability, viability and feasibility.

The rest of this paper is organized as follows. First, we present the methods used in the study and the results. We then discuss the implications of our findings within the context of our theoretical framework and accompany this discussion with recommendations for educators and future research.

5. Method

5.1. Participants

The participants were 151 novice designers from different disciplines (e.g., engineering, computer science, design, business, entrepreneurship) enrolled in a product development course during the academic years 2022–2023. Novice designers were graduate students with limited experience in solving product development challenges who sought to develop foundational skills and knowledge in development principles and techniques (Deininger et al. Reference Deininger, Daly, Sienko and Lee2017; Menold et al. Reference Menold, Berdanier, McComb, Hocker and Gardner2018; Flus & Hurst Reference Flus and Hurst2021a). Of the 151 participants in the course, 37 participated in batch 1, 51 in batch 2 and 63 in batch 3. The participants applied for the following roles: (1) business role (BR; e.g., responsible for business attributes), (2) problem role (PR; e.g., responsible for customer attributes) and (3) tech role (TR; e.g., responsible for technical attributes) (Martins Pacheco et al. Reference Martins Pacheco, Behrenbeck, Tariq, Vazhapilli Sureshbabu, Zimmermann, Mortensen, Hansen and Deininger2020).

During the application process, all applicants were asked to rate their peer applicants on a scale from 1 to 10 (with 1 being the least favorable and 10 the most favorable). Subsequently, applicants who received the highest average ratings among the peer students were invited to participate in the course. The distribution of roles among the participants was as follows: 31 BRs, 35 PRs and 85 TRs. The participants formed 30 teams (7 teams in batch 1, 10 in batch 2 and 13 in batch 3). The teams consisted of 1–2 BRs (8 in batch 1, 10 in batch 2 and 13 in batch 3), 0–2 PRs (10 in batch 1, 12 in batch 2 and 13 in batch 3) and 2–3 TRs (19 in batch 1, 29 in batch 2 and 37 in batch 3). Table 2 presents each variable’s mean (M) and ranges across the different batches. The “total” column represents the values across all batches.

Table 2. Overview of three hackathon batches (2022–2023)

Note: Pre- and post-test score, knowledge self-assessment; learning effect, ratio between post- and pre-test score; BR, business role; PR, problem role; TR, tech role; product quality, jury assessment.

All teams participated in a 2-week hackathon course focusing on the development of new hardware or mechatronic products. The teams were composed of novice designers who had not worked together before and had limited experience collaborating on a shared design project; we therefore refer to them as novice teams (Kiernan, Ledwith, & Lynch Reference Kiernan, Ledwith and Lynch2020).

5.2. Course setting

This study examined a practical 2-week hackathon in which novice designers worked in interdisciplinary teams of four to six students to develop new hardware or mechatronic products. During the course, teams were guided through the process of developing an idea and creating a proof of concept. At the start of the course, each team began its problem investigation by engaging in brainstorming sessions to generate potential ideas or problems. Subsequently, teams defined personas, performed desk research and conducted preliminary interviews to assess the feasibility and potential for success of their ideas. Data collection during the hackathon began only after all teams had firmly established their initial ideas, so all teams started from a common point. Teams retained the opportunity to adjust their ideas at later stages. Throughout the course, the teams worked on the following steps: (1) identification of a user problem based on interviews, (2) validation of the problem based on interviews, (3) finding a solution to the problem and building prototypes to demonstrate the potential solution and (4) iteration of the prototyping process with further user and expert interviews. The teams were provided with a 400 euro project budget, infrastructure and access to workshops for prototyping activities. Furthermore, they were supported by teaching assistants with expertise in engineering, computer science, design, business and entrepreneurship. At the end of the course, they presented their products to a jury panel composed of experts from industry and research. The jury evaluated product quality using a quality scheme (Girotra, Terwiesch, & Ulrich Reference Girotra, Terwiesch and Ulrich2010).

The teams were provided with methodological support through (1) a Double Diamond process model and (2) an adapted scrum method (see Figures 1 and 2, respectively). The Double Diamond process model consisted of four phases (discover, define, develop and deliver) and served as a road map for the development (Design Council 2023).

Figure 1. Double Diamond model adapted from Design Council (2023).

The Double Diamond was divided into the problem phase (discover and define), dedicated to understanding user needs and the solution phase (develop and deliver), focused on developing solutions. The teams started with an idea in the problem phase, tried to understand the customer needs, tested the prototypes and concluded the phase by formulating a problem definition (e.g., requirements list). In the solution phase, teams developed further prototypes to test their solution with stakeholders and concluded the solution phase with a proof-of-concept.

The teams followed scrum-like prototyping cycles (Martins Pacheco et al. Reference Martins Pacheco, Vazhapilli Sureshbabu, Nürnberger, Durán Noy, Zimmermann, Seybold and Mantwill2021) using the plan, do, check and act approach (Deming Reference Deming1986). Instead of traditional scrum roles, the teams adopted the course roles: (1) PR, (2) BR and (3) TR (Martins Pacheco et al. Reference Martins Pacheco, Behrenbeck, Tariq, Vazhapilli Sureshbabu, Zimmermann, Mortensen, Hansen and Deininger2020).

Each day, the teams started by planning their prototyping cycle, setting objectives and “definitions-of-done” for the day. Then, the teams assigned a phase in the Double Diamond for the current prototyping cycle. Next, they assigned tasks to team members to work on throughout the day, aiming to achieve their previously set goals. Each time a task was completed, the effort spent (in hours) was documented. Incomplete tasks were added to the backlog for future work. At the end of the day, the teams came together to review and reflect on whether their goals were achieved. Based on their insights, they planned their next prototyping cycle.

The PETRA software tool, specifically developed for this course, was utilized to facilitate this methodology, allowing for the planning and documentation of prototyping cycles and tasks. The tool featured a kanban board to visualize individual tasks within a cycle (Martins Pacheco et al. Reference Martins Pacheco, Vazhapilli Sureshbabu, Nürnberger, Durán Noy, Zimmermann, Seybold and Mantwill2021). At the end of a prototyping cycle, teams reflected on the cycle by documenting their development outcome (including documents), key insights and a rating of whether the “definitions-of-done” were achieved. This tool helped the teaching assistants track team activities and progress. The teams were introduced to the methods and tools before the hackathon started and were supported throughout the course.

Data collection was conducted across three course batches, with consistent course settings, methods and tools used throughout the study.

5.3. Outcomes

The study measured product quality, effort spent and learning effect as the key variables. Product quality was the primary dependent variable, while effort spent was the independent variable. The learning effect acted as both a secondary dependent and independent variable.

5.3.1. Product quality

Product quality was assessed by a jury consisting of five industry and research experts on the final day of the hackathon. In batch 1, the jury consisted of a software developer from a Bavarian OEM, a manager from a research association, a startup investor and two startup founders (alumni of the course). In batch 2, the jury consisted of a software developer from a Bavarian OEM, a manager from a research association and three startup founders (alumni of the course). In batch 3, the jury consisted of a software developer from a Bavarian OEM, a manager from a research association, a startup founder (an alumnus of the course), a professor specialized in software development and a makerspace electronics expert. The jury members were well-equipped to evaluate the student projects thanks to their unique perspectives, professional expertise and deep industry knowledge. The jury saw the products for the first time during a three-minute pitch, which included a product demo. Subsequently, they had 5 minutes to ask questions and 10 more minutes to test and discuss the product in detail at a product fair with each team. The product quality assessment utilized a quality scheme (Girotra et al. Reference Girotra, Terwiesch and Ulrich2010) comprising five metrics: (1) technical feasibility (the extent to which the proposed product can be developed at a reasonable price with existing technology), (2) novelty (originality of the idea with respect to the unmet need and proposed solution), (3) specificity (the extent to which the idea included a proposed solution), (4) demand (reflecting market size and attractiveness) and (5) business value of the generated product idea (utility of the idea to a commercial organization that might develop and sell the product). The jury was asked to rate each category on a scale from 1 (lowest value) to 10 (highest value). The average scores from the jury were calculated for each team.
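The team-level quality score can be reproduced in two averaging steps: first across the five metrics per juror, then across jurors per team. The following R fragment is a minimal sketch of this aggregation, not the authors’ code; the data frame layout and column names (team, juror and the five metric columns) are assumptions for illustration.

```r
# Sketch: average jury ratings per team (hypothetical data layout).
# One row per juror-team pair; the five metrics are rated 1-10.
jury <- data.frame(
  team        = rep(c("A", "B"), each = 2),
  juror       = rep(1:2, times = 2),
  feasibility = c(5, 6, 8, 7),
  novelty     = c(4, 5, 7, 8),
  specificity = c(6, 5, 7, 7),
  demand      = c(5, 4, 8, 8),
  business    = c(4, 5, 7, 8)
)

# Mean across the five metrics for each juror ...
jury$score <- rowMeans(jury[, c("feasibility", "novelty", "specificity",
                                "demand", "business")])
# ... then mean across jurors for each team
product_quality <- aggregate(score ~ team, data = jury, FUN = mean)
product_quality
```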

5.3.2. Effort

Effort spent by the teams was measured through task tracking with the PETRA software tool, which the teams used throughout the course. Upon completing a task, the team recorded the time spent (in hours) on that task. Each task completed within the daily prototyping cycle was taken into account, with each cycle corresponding to a phase in the Double Diamond model. In this study, effort was defined as the time in hours that teams invested in their work to achieve specific goals. This definition centered on the quantitative aspect of effort, using the allocation of time as a measure of the work invested by the teams in completing tasks and progressing through the various phases of the hackathon. It did not address the qualitative aspects of the work, such as the quality of the outcomes or the effectiveness of the efforts, which could vary even when teams invested similar amounts of time.
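Because each task carries its logged hours and the Double Diamond phase of its cycle, total, problem and solution effort follow from simple aggregation. The R sketch below illustrates this under an assumed task-log layout (it is not the PETRA export format); the column names are hypothetical.

```r
# Sketch: aggregate logged task hours into effort measures.
# One row per completed task, tagged with its Double Diamond phase.
tasks <- data.frame(
  team  = c("A", "A", "A", "B", "B"),
  phase = c("discover", "define", "develop", "discover", "deliver"),
  hours = c(4, 3, 6, 5, 8)
)

# Map the four Double Diamond phases onto the problem and solution halves
tasks$half <- ifelse(tasks$phase %in% c("discover", "define"),
                     "problem_effort", "solution_effort")

# Total effort per team, and effort per team within each half
total_effort <- aggregate(hours ~ team, data = tasks, FUN = sum)
phase_effort <- aggregate(hours ~ team + half, data = tasks, FUN = sum)
total_effort
phase_effort
```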

5.3.3. Learning effect

The learning effect of teams was evaluated as the ratio of post- to pre-course self-assessed familiarity with course-related topics. Participants completed a knowledge self-assessment of 15 items using a 5-point Likert scale ranging from 1 (not at all familiar) to 5 (extremely familiar). The following items were considered: prototyping, design thinking, agile development (i.e., scrum, lean), user research, software and hardware development, industrial design, business tools (i.e., business model canvas), business plan, finance, market analysis, marketing, project management, lean startup and sprints. An example question asked: “Rate your level of familiarity according to your experiences on prototyping”. Participants rated their familiarity before the course (pre-test), which also indicates their level of preexisting knowledge. Preexisting knowledge is categorized according to the mean and standard deviation of the pre-test into three groups: (1) low, (2) medium and (3) high preexisting knowledge. Table 3 presents each group’s mean (M) and range between the 25th and 75th percentiles across the different variables. After the course, but before the final jury evaluation, the participants rated their familiarity again (post-test). The post-test ratings were then divided by the pre-test ratings to calculate a ratio. A ratio $>1$ indicated improved self-evaluation after the course, while a ratio $<1$ indicated better self-evaluation before the course. A ratio of 1 indicated no change in self-evaluation. The team-level value was obtained by averaging the participants’ ratings.
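To make the ratio construction concrete, the following R sketch computes the learning effect from pre- and post-test means and averages it to team level. It is an illustration under assumed column names, not the study’s code; here each participant’s pre_mean and post_mean stand for the mean over the 15 Likert items.

```r
# Sketch: learning effect as post/pre ratio of self-assessed familiarity.
participants <- data.frame(
  team      = c("A", "A", "B", "B"),
  pre_mean  = c(2.4, 2.8, 3.0, 2.9),  # mean over the 15 pre-test items
  post_mean = c(3.1, 3.2, 3.3, 3.0)   # mean over the 15 post-test items
)

# ratio > 1: higher self-assessment after the course; < 1: lower; 1: unchanged
participants$ratio <- participants$post_mean / participants$pre_mean

# Team-level learning effect as the average over team members
learning_effect <- aggregate(ratio ~ team, data = participants, FUN = mean)
learning_effect
```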

Table 3. Overview of preexisting knowledge and its relationship to the variables

Note: M, mean; IQR, interquartile range between 25th and 75th percentile. The teams were categorized into three distinct groups: “low” for teams with pre-test scores below the mean minus one standard deviation (n = 6), “medium” for teams with pre-test scores falling within one standard deviation of the mean (n = 20) and “high” for teams with pre-test scores exceeding the mean plus one standard deviation (n = 4).

5.4. Statistical analysis

All statistical analyses were performed using R version 4.3.0 (R Core Team 2023).

First, we assessed whether there were any baseline differences between the three batches in terms of product quality, effort spent and learning effect. An ANOVA was used to test for significant differences between the batches at a significance level of .05. For the primary dependent variable, “product quality”, no significant differences between the batches were detected (p = .09). For the secondary dependent variable, “learning effect”, we assessed the variables “pre-test” and “post-test” for significant mean differences between the batches. We found no significant differences between the batches for the “pre-test” (p = .21) or the “post-test” (p = .12). However, significant differences were observed between batches in the independent variables “total effort” (p = .01) and “solution effort” (p = .01), but not in “problem effort” (p = .11). A possible explanation is that teams influence other teams’ ideas and efforts, so time commitment differed across the batches; this matches our experience of the course. Subsequently, the data of the three batches were combined to obtain a larger sample size.
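The batch comparison amounts to a one-way ANOVA per variable with batch as the factor. The R sketch below shows the pattern on a small hypothetical team-level data frame (the real data have 30 teams); the variable names are assumptions for illustration.

```r
# Sketch: one-way ANOVA testing for baseline differences between batches.
teams <- data.frame(
  batch           = factor(rep(1:3, times = c(2, 2, 2))),  # batches 1-3
  product_quality = c(5.8, 6.4, 6.0, 6.9, 5.5, 7.1),
  total_effort    = c(110, 230, 200, 310, 150, 335)
)

# H0: equal batch means; compare the reported p-value against .05
summary(aov(product_quality ~ batch, data = teams))
summary(aov(total_effort ~ batch, data = teams))
```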

In this study, we used correlation analysis to examine the relationships between variables; in the hypotheses, $\rho$ denotes the true (population) correlation coefficient. In the results section, we report Spearman’s rank correlation estimates as $\rho$ and Pearson’s product-moment correlation estimates as $r$ (see Table 4 for an overview of the hypotheses tested in this study). The data were tested for normal distribution using the Shapiro–Wilk test. The relationships between the variables were examined through correlation analyses. Total effort, solution effort, product quality and learning effect were normally distributed; thus, we applied Pearson’s product-moment correlation. Problem effort was not normally distributed, so we used Spearman’s correlation. The statistical significance of the correlation coefficients was assessed using t-tests. Scatterplots of the variables were used for visual interpretation. The correlation coefficients were interpreted according to Cohen (Reference Cohen1988), with $|\rho| < .30$ representing small, $.30 \le |\rho| < .50$ moderate, and $|\rho| \ge .50$ strong associations.
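This workflow — normality check, then Pearson or Spearman correlation with a significance test — maps directly onto base R. The sketch below illustrates it with hypothetical team-level vectors (the study’s sample is n = 30 teams); it is not the authors’ analysis script.

```r
# Sketch: normality check and correlation tests, as in the study's workflow.
total_effort    <- c(110, 230, 200, 310, 150, 335)
product_quality <- c(5.8, 6.4, 6.0, 6.9, 5.5, 7.1)

# Shapiro-Wilk test; a small p-value argues against normality
shapiro.test(total_effort)
shapiro.test(product_quality)

# Pearson's r for normally distributed pairs; cor.test assesses
# H0: rho = 0 via a t-test on the estimate
cor.test(total_effort, product_quality, method = "pearson")

# Spearman's rho when a variable is not normally distributed
# (as for problem effort in the study)
cor.test(total_effort, product_quality, method = "spearman")
```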

Table 4. Overview of hypotheses and results

* p < .05,

** p < .01.

5.5. Case-study analysis

A comparative case study analysis was conducted on three teams representing the highest, middle and lowest product quality scores evaluated by a jury. The study examined the detailed processes, outcomes and development artifacts of these teams, documenting the evolution of their design processes. The progression of effort invested in each prototyping cycle was recorded, and the corresponding learning effect was examined. The jury evaluations were analyzed in detail, focusing on the individual elements of each team’s work. This thorough analysis aimed to provide concrete examples, reinforcing the findings from the statistical analysis, thereby enhancing the credibility and comprehensibility of the study. We adopted a mixed-methods approach to address the limitations associated with quantitative data analysis by incorporating statistical analyses alongside qualitative case study analysis of three teams. This approach enhanced our understanding of the research context and provided a more comprehensive examination of the data.

6. Results

The teams exhibited a high level of commitment within the course, as indicated by an average total effort of M = 224.1 (SD = 89.46; range: 75–371). Most of their time was spent in the solution phase, with an average solution effort of M = 125.67 (SD = 62.49; range: 25–247). Comparatively, the teams spent less time in the problem phase, with an average problem effort of M = 98.43 (SD = 48.23; range: 31–195). According to the jury, the teams were effective in realizing quality products, with a mean rating of M = 6.19 (SD = 0.86; range: 4.24–7.68). Furthermore, the teams exhibited a learning effect, with an average value of M = 1.14 (SD = 0.11; range: 0.98–1.41). Descriptive values for the batches are presented in Table 2.

6.1. Relationship between total effort and product quality

This analysis investigated the correlation between the total effort and the product quality (see Table 4). The normal distribution of both variables was confirmed through the Shapiro–Wilk test (total effort: W = .93, p = .24; product quality: W = .91, p = .14).

The results revealed a moderate positive correlation (see Figure 3) between total effort and product quality (r = .42, p = .02). Thus, the null hypothesis, which suggests a less than moderate correlation between total effort and product quality, was rejected. These findings support the hypothesis ($H_1$) of a moderate positive relationship between total effort and product quality.

Figure 3. Scatterplot of the variables total effort and product quality with its regression line.
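Figures 3–7 are scatterplots of variable pairs with a fitted least-squares line. A minimal R sketch of this kind of plot, using the same hypothetical vectors as above (not the study’s data), could look as follows:

```r
# Sketch: scatterplot with a fitted regression line, in the style of
# Figures 3-7 (hypothetical vectors, not the study's data).
total_effort    <- c(110, 230, 200, 310, 150, 335)
product_quality <- c(5.8, 6.4, 6.0, 6.9, 5.5, 7.1)

plot(total_effort, product_quality,
     xlab = "Total effort (hours)", ylab = "Product quality (jury score)")
abline(lm(product_quality ~ total_effort))  # least-squares regression line
```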

6.2. Relationship between problem/solution effort and product quality

A correlation analysis investigated the relationship between the number of hours invested by a team during the problem/solution phase and the product quality, which corresponds to hypotheses $H_{21}$ and $H_{22}$ in Table 4.

The first analysis examined the correlation between the problem effort, defined as the number of hours a team invests in the problem phase of product development during a student-based hackathon, and the product quality. The Shapiro–Wilk test indicated a deviation from normality for the variable problem effort (W = .93, p = .03).

The correlation analysis showed a weak positive correlation (see Figure 4); however, the relationship between problem effort and product quality was not statistically significant ($\rho$ = .14, p = .46). Thus, the null hypothesis ($H_{21}$) of a less than moderate correlation between problem effort and product quality cannot be rejected.

Figure 4. Scatterplot of the variables problem effort and product quality with its regression line.

The second analysis explored the correlation between the solution effort, defined as the number of hours a team invests in the solution phase of product development during a student-based hackathon, and the product quality (solution effort: W = .93, p = .24; product quality: W = .91, p = .14).

A significant positive correlation (see Figure 5) was found between solution effort and product quality (r = .50, p = .01), leading to the rejection of the null hypothesis ($H_{22}$) and demonstrating a relationship between solution effort and product quality.

Figure 5. Scatterplot of the variables solution effort and product quality with its regression line.

6.3. Relationship between total effort and learning effect

The correlation between total effort and the learning effect was examined next, corresponding to hypothesis $H_3$ in Table 4. Both variables were confirmed to be normally distributed based on the Shapiro–Wilk test results (total effort: W = .93, p = .24; learning effect: W = .91, p = .14).

The correlation analysis indicated a nonsignificant moderate correlation (see Figure 6) between total effort and learning effect (r = .34, p = .06). Since the p-value exceeded the conventional .05 threshold for statistical significance, the null hypothesis ($H_3$) of a less than moderate correlation between total effort and learning effect cannot be rejected. Therefore, we conclude that there is no statistically significant relationship between total effort and learning effect.

Figure 6. Scatterplot of the variables total effort and learning effect with its regression line.

6.4. Relationship between learning effect and product quality

The current study investigated the correlation between the learning effect of a team during a student-based hackathon and product quality (see Table 4). The Shapiro–Wilk test confirmed the normal distribution of both variables (learning effect: W = .91, p = .14; product quality: W = .93, p = .24).

The correlation analysis revealed a negligible negative correlation (see Figure 7) between learning effect and product quality (r = −.04, p = .84). The obtained p-value, well above the .05 threshold, indicates a lack of statistical significance. Therefore, we cannot reject the null hypothesis of a less than moderate correlation and conclude that there is no statistically significant relationship between learning effect and product quality.

Figure 7. Scatterplot of the variables learning effect and product quality with its regression line.

6.5. Single-case analysis

This subsection provides a detailed description of the product development process for three selected teams: Team A (lowest jury score), Team B (average jury score) and Team C (highest jury score). It includes information on the effort invested, artifacts produced and assessments by an external jury. An overview of the raw data of all teams is included in Supplementary Table S1.

6.5.1. Team A – Automated cereal dispenser

Team A, composed of three TRs and one BR, focused on developing an automated cereal dispenser (see Figure 8).

Figure 8. Development process overview of Team A.

At the beginning of the problem phase, Team A engaged in product idea generation and investigating the need for an automated kitchen dispenser. The team comprehensively analyzed various kitchen dispensers, including existing products, ongoing development projects and initial design concepts. Comparing their design concept with existing solutions, they assessed the potential market value and volume of their proposed solution. The team conducted 10 interviews to gain deeper insights into user needs and preferences. The results revealed that more than half of the interviewees were willing to pay over 50 euros for the team’s conceptual product. They also sought feedback from hotels, which helped them identify specific product requirements. Throughout the problem phase, the team completed five prototyping cycles, with three in the discovery phase and two in the define phase.

Concluding the problem phase, Team A built a cardboard prototype and designed basic subsystems for their automated cereal dispenser. Moving into the solution phase, they conducted an in-depth analysis of their functional prototype to uncover additional technical requirements. They also focused on enhancing the system’s functionality and usability by incorporating more components while assessing its practicality. The team dedicated most of their time to the deliver phase, spending 34 hours testing the subsystems and components’ feasibility and finalizing the proof-of-concept for the final presentation. During the solution phase, the team completed five prototyping cycles, three in the develop phase and two in the deliver phase.

Team A generated 10 artifacts during the design process. Their total effort of 110 hours was below the average for all teams. However, they exhibited a higher-than-average learning effect, with a ratio of 1.20. Referring to Table 3, the team can be assigned to the group with low preexisting knowledge (2.56). Notably, they demonstrated significant improvement in agile product development, increasing their knowledge score by 0.75. Despite these advancements, the jury awarded Team A the lowest product quality score of 4.2 points among all 30 teams. To summarize, despite receiving the lowest score for product quality and devoting fewer hours to the development process than the average, Team A exceeded the average learning effect.

6.5.2. Team B – Agriculture drone for pesticide use

Team B, consisting of three TRs, one PR and one BR, focused on creating an agriculture drone for pesticide usage (see Figure 9).

Figure 9. Development process overview of Team B.

Initially, Team B brainstormed two product ideas but realized they lacked specific user needs and customer value after analysis and consultations with teaching assistants. Consequently, they explored additional ideas and decided to address challenges in the agriculture industry by the end of the problem phase. They completed four prototyping cycles, two in the discovery phase and two in the definition phase. In the solution phase, they developed a comprehensive design concept using CAD software, incorporating technical features and stakeholder suggestions collected in interviews. During the delivery phase, they conducted various physical tests on their prototypes to confirm functionality and refine their marketing strategy. They completed six prototyping cycles, four in the development phase and two in the delivery phase.

Team B exhibited a learning effect ratio of 1.15, close to the average learning effect of 1.14 across all teams. Referring to Table 3, the team can be assigned to the group with medium preexisting knowledge (2.90). Their knowledge score in user research increased by one point out of a maximum of five. Regarding product quality, Team B received an average score among the 30 teams. They invested 203 hours in the development process, close to the average of 224 hours for all teams.

6.5.3. Team C – Smart manufacturing glove

Team C, composed of three TR, one PR and one BR, focused on designing a glove to improve the efficiency of manufacturing processes (see Figure 10).

Figure 10. Development process overview of Team C.

Team C began by brainstorming and creating personas for potential product users while conducting thorough market research, user interviews and technical feasibility assessments. In the problem phase, the team completed five prototyping cycles, three in the discover phase and two in the define phase. With a total of 88 hours, Team C spent 10 hours less than the team average in the problem phase. However, they allocated a significant amount of time, 247 hours, to the solution phase, the highest among all teams and nearly double the average of 125.7 hours. During this phase, they visited a renowned German automobile OEM production site to gather insights and evaluate the effectiveness and desirability of their solution. They focused on building an audio detection device that identifies correct plug-in click connections during the production process, utilizing machine learning to enhance their software algorithms. They completed five prototyping cycles in the solution phase, three in the develop phase and two in the deliver phase. Team C achieved the highest jury score among all 30 teams, indicating their effective approach and the success of their final product.

Despite having the highest product quality score and investing the most time in the development process (335 hours), Team C’s learning effect ratio of 1.07 falls below the average of 1.14. Referring to Table 3, the team can be assigned to the group with medium preexisting knowledge (2.93). Notably, their self-assessed knowledge scores in software and hardware development declined by 0.8 and 0.4 points, respectively. However, they improved in design thinking, with an increase of 0.6 points in their knowledge score. In conclusion, while Team C achieved the highest product quality score and invested the most time in the development process, their learning effect fell below average.

7. Discussion

This research aimed to examine the relationship between effort, learning effect and product quality in educational hackathons. Given the small sample size, only moderate to strong effects were expected to be detectable.

7.1. Relationship between total effort and product quality

In this study, we examined the association between total effort and product quality. The results supported our hypothesis, showing a significant positive correlation between effort and product quality. This finding aligns with previous research highlighting the predictive role of effort in academic achievement (Lee 2014; Hagger & Hamilton 2019; Putwain et al. 2019). The case analyses further supported our findings: Team A, with the lowest effort, received the lowest product quality score, while Team C, with the highest effort, achieved the highest score. This underscores the crucial role of effort in attaining high-quality outcomes in educational product development settings and reaffirms that dedicating more time to the development process can lead to better products.
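The statistical analysis behind results of this kind was carried out in R (R Core Team 2023). As a minimal sketch of how such an effort–quality correlation could be tested (the variable names and data values below are hypothetical, and the choice of Pearson’s product-moment correlation is our assumption rather than the study’s documented procedure):

```r
# Minimal sketch (not the study's actual analysis script): correlating
# per-team total effort with jury-rated product quality in base R.
# 'effort_hours' and 'jury_score' hold one hypothetical value per team.
effort_hours <- c(110, 160, 203, 250, 290, 335)
jury_score   <- c(4.2, 5.1, 6.3, 6.8, 7.5, 8.4)

# Pearson's product-moment correlation with a two-sided significance test
result <- cor.test(effort_hours, jury_score, method = "pearson")
result$estimate  # correlation coefficient r
result$p.value   # p-value for the null hypothesis r = 0
```

With real data, the sign and magnitude of `result$estimate` and the associated p-value are what would underpin a claim of a “significant positive correlation.”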

7.2. Relationship between problem and solution effort and product quality

We explored the associations of problem effort and of solution effort with product quality.

Our results showed no statistically significant correlation between problem effort and product quality, although a small positive association was observed; problem effort may thus have some bearing on product quality, but the effect did not reach significance. In contrast, a strong and significant correlation was found between solution effort and product quality, emphasizing the critical role of the solution phase in determining evaluated product quality. The case analyses supported these findings, with Team A (lowest score) allocating 60% of their total time to the solution phase, Team B (average score) devoting 67% and Team C (highest score) spending 74%. These cases further underscore the importance of solution effort in achieving higher product quality.

Our findings support the importance of a prompt transition from problem formulation to solution generation, facilitating iterative design processes involving idea generation, testing and evaluation (Darke 1979; Lloyd & Scott 1995; Ball & Christensen 2019; Batliner et al. 2022). Ball & Christensen (2019) argue that teams should move quickly into the solution space, as speculative solution ideas aid in formulating design problems. This iterative process helps designers to gain clarity on the problem at hand and assess ideas’ compatibility with requirements and constraints by constantly uncovering missing or poorly articulated information. It prompts further interactions with users for clarification or making feasible assumptions. Our study builds on this knowledge by emphasizing the contribution of solution phase efforts to product quality.

Moreover, our study aligns with Batliner et al. (2022), who identified early and ongoing investment of time in testing as a predictor of project success. While Batliner et al. (2022) focused specifically on testing, our study explored the overall solution effort and its association with product quality. By corroborating their findings in a broader context, our study enhances understanding of the factors influencing project success in hackathons. It further supports the notion that the solution-oriented nature of hackathons plays a significant role in determining the quality of the final product (Briscoe & Mulligan 2014; Porras et al. 2019). However, it is worth noting that hackathons are typically not only seen as a means to an end for producing solutions (Lawson 2005; Gama 2017; Flus & Olechowski 2023) but are also frequently viewed as catalysts of PBL (Horton et al. 2018) and idea generation (Komssi et al. 2015; Flus & Hurst 2021a). In addition to developing solutions, hackathons can serve as platforms for participants to rapidly generate and validate innovative ideas. This experiential process enables participants to apply, analyze, evaluate and create new, original work, aligning with the higher-order educational objectives of Bloom’s revised taxonomy (Krathwohl 2002). This multifaceted role underscores the versatility of hackathons in simultaneously promoting project success and fostering innovation within the hackathon ecosystem.

Hackathons may not fully replicate realistic environments, as they often prioritize solution development over a comprehensive design process. This became evident in the observation by Sadowska & Laffy (2017), where design briefs in hackathons triggered an emphasis on solution development rather than a design process and learning. Although our study’s teams independently sought out problems without a design brief, the deadline for the final presentation of their solutions may have still led to a significant emphasis on solution development.

7.3. Relationship between total effort and learning effect

This study aimed to examine the impact of total effort on the learning effect in hackathons, building on previous research indicating that effort predicts learning (Huang & Li 2012; McKay & Ellis 2014).

Contrary to our expectations, the results did not show a significant correlation between the effort invested and the learning effect. Moreover, the case analyses revealed that as effort increased, the learning effect decreased. Additionally, the data presented in Table 3 reveal a notable insight: teams with limited preexisting knowledge invested the most time and achieved the highest level of learning. Surprisingly, teams with the highest preknowledge learned the least, despite investing more time than teams with moderate preknowledge.

These results challenge conventional assumptions regarding the effectiveness of hackathon formats in education (Nandi & Mandernach 2016; Page et al. 2016; Gama et al. 2018; Horton et al. 2018; Rennick et al. 2018). They suggest that teams with low preknowledge may need to invest more time as they grapple with the initial skill threshold. The concept of an initial threshold describes the need to attain a certain skill level to effectively utilize a given medium for product development (Boa, Mathias & Hicks 2017; Ranscombe et al. 2020). For instance, a designer who is a novice in CAD, with a limited skill level in CAD sketching, may encounter challenges in creating sophisticated designs while still mastering basic CAD operations (Ranscombe et al. 2020). Conversely, materials like cardboard or LEGO offer prototyping tools with a lower initial barrier to entry, characterized by a low skill threshold (Boa et al. 2017; Ranscombe et al. 2020). However, the course requirement of a functional prototype, which carries a high skill threshold, might have led students with low initial knowledge to invest extra effort in achieving a functional prototype for the final presentation. On the other hand, teams with higher preknowledge may not have the same learning incentives because they might be overly confident about their skill level. While complete novices lack confidence, learning even a little can boost their confidence, sometimes beyond the accuracy of their judgment (Sanchez & Dunning 2018). Such teams may invest more time in prototype development due to overconfidence in their existing design-evaluation skills and a strong focus on perfecting their prototypes. However, this can result in less active exploration of new concepts or approaches during the hackathon. This may suggest that novice designers are less challenged by the hackathon format when they possess more initial skills and knowledge; in fact, they might even internalize incorrect design practices as a result.

Further investigation is needed to explore the factors contributing to learning success. However, research by Sadowska & Laffy (2019) shows that the impact of learning in hackathons may manifest in the long term, with benefits becoming apparent years later. This may be because hackathons focus on developing transferable skills and internalizing these approaches rather than on specific subject knowledge. It highlights the broader educational implications of hackathons and their potential to cultivate skills that extend beyond the immediate context.

This finding underscores the importance of recognizing the dynamic interplay between initial knowledge levels, effort investment and learning outcomes in hackathons. It prompts further inquiry into how to effectively engage and motivate participants with varying levels of preknowledge to encourage continuous learning of all participants.

7.4. Relationship between learning effect and product quality

Finally, this study aimed to examine the relationship between the learning effect and product quality in hackathons, drawing on earlier literature emphasizing the importance of (organizational and team) learning for project success (Zwikael et al. 2008; Huang & Li 2012; McKay & Ellis 2014).

Contrary to expectations, the results indicate no statistically significant relationship between the learning effect and product quality in educational hackathons. The scatterplot suggests a small negative correlation between the two variables. The case analysis further supports these findings, with the lowest-scoring team (Team A) exhibiting a relatively higher learning effect than the highest-scoring team (Team C). Additionally, despite investing more time in improving product quality, the teams in the case studies allocated a decreasing portion of their time to the problem phase and experienced a decreasing learning effect. This suggests that teams with a lower learning effect, indicating a higher preexisting knowledge base, were more efficient in the design process. A higher level of prior knowledge allowed these teams to allocate their time more effectively to solution development, ultimately requiring less time to achieve better product outcomes. This phenomenon aligns with the concept of a skill threshold (Boa et al. 2017; Ranscombe et al. 2020). The observation gains further support from the data presented in Table 3, which show that the average learning effect of teams is most pronounced in groups with low initial skill levels and least pronounced in groups with high initial skill levels. Interestingly, teams with high initial skill levels, on average, produce less favorably evaluated products than groups with low skill levels, who also invest more time on average. Consequently, while groups with low preexisting knowledge expend significantly more effort to surmount the skill threshold, this effort appears to pay off, enabling them to compete with their peers while learning. This phenomenon can be partially explained by Aryana, Naderi & Balis (2019) and Flus & Hurst (2021a), who propose that familiarizing participants with the stages of a design process fosters learning and encourages adherence to the process, leading to improved outcomes in hackathons.

Further, our findings align with the idea of individual differences in problem formulation, as highlighted by Ball & Christensen (2019), who emphasized that successful designers effectively balance information gathering, goal definition and solution development. The present study supports the suggestion that excessive time spent on problem formulation activities may hinder problem-solving progress (Ball & Christensen 2019), emphasizing the importance of designers’ experience and their ability to strategically manage problem formulation and solution development.

Essentially, this implies that while prior knowledge can accelerate solution development, the relationship between learning effect and product outcome in hackathons is multifaceted and may not consistently adhere to a linear pattern. Understanding these dynamics is pivotal for both educators and practitioners aiming to leverage hackathons effectively in design education and real-world problem-solving contexts.

7.5. Practical implications

This study offers practical implications for educators and organizations involved in team-based product development processes.

While hackathons provide valuable opportunities for participants to engage with design processes, the lack of a significant correlation between effort invested and learning effect sheds light on their limitations. Consequently, hackathons may be better suited as platforms for brief exposure to the entire design process rather than as comprehensive representations of it. Educators should critically examine the scope and limitations of hackathons in educating novice designers and supplement their learning experiences accordingly. For instance, alternative approaches such as structured feedback mechanisms (Nicol & Macfarlane-Dick 2006; Hattie & Timperley 2007), reflection and self-assessment (Boud, Keogh & Walker 1985; Zimmerman 2002) and cooperative learning (Johnson & Johnson 2014; Lara & Lockwood 2016) can facilitate learning success. Additionally, implementing partial hackathons with a problem or solution focus could offer a more balanced representation of the design process, addressing both problem formulation and solution development.

Allocating sufficient time and resources to the solution phase can lead to higher product quality in student-based hackathons. Educators and team leaders should emphasize the importance of dedicating adequate time and effort to the solution phase, providing guidance on effective time management, design requirements and testing to maximize the impact on product quality (Aryana et al. 2019). Furthermore, to support teams with limited initial knowledge, hackathon practitioners might consider presenting such teams with product development tools that have a lower skill threshold (Boa et al. 2017; Ranscombe et al. 2020). By offering tools with a lower skill threshold, practitioners level the playing field and cater to the foundational knowledge and comprehension levels of Bloom’s (1979) taxonomy. This approach ensures that participants have the essential tools and knowledge to engage meaningfully in the hackathon, setting the stage for more successful learning and problem-solving outcomes.

In conclusion, educators and organizations should optimize the allocation of time and resources in team-based product development processes to enhance product quality and explore alternative learning formats to improve the learning effect in these contexts.

7.6. Limitations and future research

Two factors constrained the generalizability of this study. First, the limited sample size of 30 teams participating in a single educational hackathon course raises the possibility of sample bias. Therefore, applying these findings to other contexts, such as professional settings or different educational environments, should be done cautiously. Second, given the unique goals and dynamics of various hackathon types (Kollwitz & Dinter 2019), caution should be exercised when extending our results beyond the specific context of our study, which focused on the development of new hardware and mechatronic products. To overcome these limitations, future research should aim to replicate our study with larger, diverse samples drawn from a range of hackathon formats. Furthermore, recognizing that hackathons are integral to broader educational or professional goals, forthcoming research should delve deeper into the multifaceted relationships between hackathon outcomes and these overarching objectives.

While teamwork dynamics and team constellation may have influenced the performance of the sample teams, exploring these factors fell outside the scope of this study. Nevertheless, it is essential to recognize several limitations associated with the sample teams in this research, as addressing these limitations could notably enhance the ongoing development of knowledge in design pedagogy research. First, the study did not control for the teams’ varying levels of subject-related knowledge, which may have influenced their performance and the jury’s assessment. Thomson & Grierson (2021) highlight variations in design project performance among students in different academic years, indicating the significance of knowledge diversity in such contexts. Additionally, the multidisciplinary and multifunctional nature of the sample teams, while potentially influencing their performance, was not controlled for in this study. Existing evidence suggests that prior experience in various fields within a team can significantly affect group effectiveness (Brown & Eisenhardt 1995; Edmondson & Nembhard 2009). Thus, it is imperative that future research extends its focus to explore the influence of teams’ diverse subject-related knowledge on hackathon outcomes. This exploration can provide valuable insights into the role of prior experience diversity in shaping the teamwork of multidisciplinary teams. Second, cultural differences among the participating teams were not taken into account. Considering the evidence suggesting cultural variations in creative thinking and knowledge application (Zhang, Bohemia & McCardle 2019), future studies should incorporate these factors to better understand their influence on learning, hackathon performance and effort allocation.

A jury of five industry and research specialists evaluated product quality. The subjective nature of this assessment introduces the possibility of individual biases affecting the scoring and ranking of the products (Stylidis, Wickman & Söderberg 2020; Boudier et al. 2023). Biases can be observed, as professionals in industry tend to overestimate the significance of attributes they are currently focusing on (Stylidis et al. 2020). Additionally, experts with similar backgrounds might have varying perceptions of ideas (Boudier et al. 2023). Efforts were made to address this limitation by appointing a diverse jury with individuals from different backgrounds. The industry experts provided explicit feedback to the teams and posed questions after their presentations, following the recommendation of Boudier et al. (2023). However, it is essential to acknowledge that biases cannot be eliminated entirely. Future research exploring the effect of potential biases in the jury’s judgments and of prototype sophistication on jury evaluation could provide more comprehensive knowledge of what constitutes “quality” in this context. Examining whether the perceived quality of a product is influenced by its level of refinement or technical advancement would provide valuable insights into the evaluation process for organizations and educators.

The study focused on the association between total, problem and solution effort and product quality, without considering other potential mediators or moderators that could influence product quality. Future research should investigate additional parameters related to product quality. For instance, future investigations could categorize effort using alternative frameworks, such as the three lenses of human-centered design (e.g., desirability, feasibility, viability) or the purpose of prototypes (e.g., exploration, communication, evaluation), to examine their influence on product quality.

Next, the ratio of self-assessed familiarity with course-related subjects was used to measure the learning effect. This self-reported metric may be biased and inaccurate. To provide a more trustworthy and valid evaluation of participants’ knowledge, future research could add objective measures, such as standardized performance tests before and after the intervention (Gustafsson & Borglin 2013; Shrivastava, Shah & Navaid 2018). By addressing these limitations and conducting further research, educators can optimize learning outcomes in hackathon-based educational contexts.
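As a minimal sketch of the kind of objective pre/post comparison suggested above (all variable names and scores below are hypothetical; the present study used self-assessments, not such tests), a paired analysis in R could look as follows:

```r
# Sketch: paired t-test on hypothetical standardized test scores,
# one pre- and one post-hackathon score per participant.
pre_test  <- c(52, 61, 48, 70, 66, 58)  # illustrative scores before the event
post_test <- c(60, 63, 57, 74, 71, 65)  # illustrative scores after the event

# Tests whether the mean within-participant change differs from zero
t.test(post_test, pre_test, paired = TRUE)
```

Such a measure would complement, rather than replace, the self-assessment ratio, since the two capture perceived and demonstrated knowledge, respectively.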

A strong emphasis was placed on quantitative data because of its objectivity. While such data are valuable, collecting qualitative insights may also be advantageous: reflections from participants or observational data could contribute to a more nuanced picture of the learning process and its consequences.

Moreover, the study presumes a linear relationship between effort, learning effect and product quality. However, the relationships between these variables could be more complex and nonlinear. The existing linear approach could therefore be extended to include nonlinear relationships and interactions, which could reveal dynamics such as threshold effects.
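To illustrate one way such an extension could be probed (a sketch under assumed, illustrative data; neither the data frame nor the model choice comes from the study), a linear fit can be compared against a fit with a quadratic effort term in R:

```r
# Sketch: testing for nonlinearity by nesting a quadratic model
# over the linear one. 'teams' holds hypothetical per-team values.
teams <- data.frame(
  effort  = c(110, 150, 203, 240, 290, 335),
  quality = c(4.2, 5.6, 6.3, 7.1, 7.8, 8.4)
)

linear    <- lm(quality ~ effort, data = teams)
quadratic <- lm(quality ~ effort + I(effort^2), data = teams)

# F-test: does the quadratic term significantly improve the fit?
anova(linear, quadratic)

# Threshold-style effects could instead be probed with segmented or
# spline regression (e.g., via the 'segmented' or base 'splines' packages).
```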

Finally, the present study found no statistically significant relationship between the learning effect and product quality. However, the case analyses showed that teams with lower learning effects demonstrated greater efficiency in the design process. Future research should explore the impact of sample characteristics, such as participants’ level of expertise or prior experience, on product design effectiveness. This investigation could shed light on whether more effective product design is facilitated by a higher level of the team’s prior design knowledge. Such a study could also explore whether teams that demonstrated less learning did so due to possessing more design knowledge prior to the event. This knowledge could have enabled them to design products more effectively, leading to better evaluation outcomes. Replication studies with larger sample sizes are needed to confirm and strengthen these initial findings.

8. Conclusion

This study explored the relationship between effort, learning effect and product quality in educational hackathons of novice teams. We discovered a significant positive correlation between total effort and product quality, emphasizing the importance of allocating adequate time and resources to product development for higher-quality outcomes. However, no meaningful relationship was observed between overall effort and learning effect, calling into question the premise that effort alone promotes learning success in hackathons. Furthermore, no significant association was found between the learning effect and product quality, a finding that calls into question the rationale for running hackathons with novice teams in higher education institutions. These findings provide practical insights for educators and organizations involved in team-based product development processes, highlighting the importance of prioritizing effective solution development and exploring alternative learning methods in hackathon settings. They also have the potential to substantially influence how hackathons are planned and facilitated, particularly those involving novice teams, and imply that how we define and measure success in these environments may need to be reconsidered, with an emphasis not only on the finished output but also on the learning process. Implementing these insights can enhance both educational success and product quality in hackathons with novice teams.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/dsj.2024.9.

Acknowledgments

The authors thank all the students and teaching assistants of the Technical University of Munich who participated and supported this study.

References

Anthony, S. D. 2014 Innovation Leadership Lessons from the Marshmallow Challenge, online document www.hbr.org/2014/12/innovation-leadership-lessons-from-the-marshmallow-challenge (accessed April 25, 2023).
Artiles, J. A. & LeVine, K. E. 2015 Ta-da! You’re a design thinker! Validating the DesignShop as a model for teaching design thinking to non-designers and achieving systemic re-design in the education system. In 2015 ASEE Annual Conference & Exposition. American Society for Engineering Education.
Artiles, J. A. & Wallace, D. R. 2013 Borrowing from hackathons: Overnight designathons as a template for creative idea hubs in the space of hands-on learning, digital learning, and systems re-thinking. In World Engineering Education Forum 2013. Asociación Colombiana de Facultades de Ingeniería (ACOFI).
Aryana, B., Naderi, E. & Balis, G. 2019 Strategies for empowering collective design. The Design Journal 22 (1), 2073–2088.
Ball, L. J. & Christensen, B. T. 2019 Advancing an understanding of design cognition and design metacognition: Progress and prospects. Design Studies 65, 35–59.
Batliner, M., Boës, S., Heck, J. & Meboldt, M. 2022 Linking testing activities with success in agile development of physical products. Procedia CIRP 109, 146–154.
Belluigi, D. Z. 2016 Constructions of roles in studio teaching and learning. International Journal of Art & Design Education 35 (1), 21–35.
Bloom, B. S. 1979 Taxonomy of Educational Objectives: The Classification of Educational Goals. Longman.
Boa, D., Mathias, D. & Hicks, B. 2017 Evolving LEGO: Prototyping requirements for a customizable construction kit. In DS 87–4: Proceedings of the 21st International Conference on Engineering Design (ICED 17) (ed. Maier, A., Škec, S., Kim, H., Kokkolaras, M., Oehmen, J., Fadel, G., Salustri, F. & van der Loos, M.), pp. 397–306. Design Society.
Boelt, A. M., Kolmos, A. & Holgaard, J. E. 2022 Literature review of students’ perceptions of generic competence development in problem-based learning in engineering education. European Journal of Engineering Education 47 (6), 1399–1420.
Böhle, F., Heidling, E. & Schoper, Y. 2016 A new orientation to deal with uncertainty in projects. International Journal of Project Management 34 (7), 1384–1392.
Boud, D., Keogh, R. & Walker, D. 1985 Reflection: Turning Experience into Learning. Routledge.
Boudier, J., Sukhov, A., Netz, J., Le Masson, P. & Weil, B. 2023 Idea evaluation as a design process: Understanding how experts develop ideas and manage fixations. Design Science 9, e9.
Briscoe, G. & Mulligan, C. 2014 Digital Innovation: The Hackathon Phenomenon, online document www.core.ac.uk/download/pdf/30697508.pdf (accessed April 25, 2023).
Brosens, L., Raes, A., Octavia, J. R. & Emmanouil, M. 2023 How future proof is design education? A systematic review. International Journal of Technology and Design Education 33 (2), 663–683.
Brown, S. & Eisenhardt, K. 1995 Product development: Past research, present findings, and future directions. Academy of Management Review 20 (2), 343–378.
Burroughs, J. E., Dahl, D. W., Moreau, C. P., Chattopadhyay, A. & Gorn, G. J. 2011 Facilitating and rewarding creativity during new product development. Journal of Marketing 75 (4), 53–67.
Cennamo, K. S. 2016 What is studio? In Studio Teaching in Higher Education (ed. Boling, E., Schwier, R. A., Gray, C. M., Smith, K. M. & Campbell, K.). Routledge.
Chen, J., Kolmos, A. & Du, X. 2021 Forms of implementation and challenges of PBL in engineering education: A review of literature. European Journal of Engineering Education 46 (1), 90–115.
Cohen, J. 1988 Statistical Power Analysis for the Behavioral Sciences (2nd edition). Routledge.
Cooper, R. G. 2019 The drivers of success in new-product development. Industrial Marketing Management 76, 36–47.
Corazzo, J. 2019 Materialising the studio. A systematic review of the role of the material space of the studio in art, design and architecture education. The Design Journal 22 (1), 1249–1265.
Daly, S. R., McKilligan, S., Studer, J. A., Murray, J. K. & Seifert, C. M. 2018 Innovative solutions through innovated problems. International Journal of Engineering Education 34, 695–707.
Darke, J. 1979 The primary generator and the design process. Design Studies 1 (1), 36–44.
Deininger, M., Daly, S. R., Sienko, K. H. & Lee, J. C. 2017 Novice designers’ use of prototypes in engineering design. Design Studies 51, 25–65.
Deming, W. E. 1986 Out of the Crisis. The MIT Press.
Design Council 2023 The Double Diamond, online document www.designcouncil.org.uk/our-resources/the-double-diamond (accessed September 28, 2023).
Dochy, F., Segers, M., van den Bossche, P. & Gijbels, D. 2003 Effects of problem-based learning: A meta-analysis. Learning and Instruction 13 (5), 533–568.
Edmondson, A. C. & Nembhard, I. M. 2009 Product development and learning in project teams: The challenges are the benefits. Journal of Product Innovation Management 26 (2), 123–138.
Emam, M., Taha, D. & ElSayad, Z. 2019 Collaborative pedagogy in architectural design studio: A case study in applying collaborative design. Alexandria Engineering Journal 58 (1), 163–170.
Fernando, A., Mansur, A. F. U., Alves, A. C. & Torres, R. B. 2019 Trello as virtual learning environment and active learning organiser for PBL classes: An analysis under Bloom’s taxonomy. In International Symposium on Project Approaches in Engineering Education (PAEE-ALE). PAEE Association.
Ferreira, J., Christiaans, H. & Almendra, R. 2016 A visual tool for analysing teacher and student interactions in a design studio setting. CoDesign 12 (1–2), 112–131.
Fischer, C., Malycha, C. P. & Schafmann, E. 2019 The influence of intrinsic motivation and synergistic extrinsic motivators on creativity and innovation. Frontiers in Psychology 10, 115.
Flores, M., Golob, M., Maklin, D., Herrera, M., Tucci, C., Al-Ashaab, A., Williams, L., Encinas, A., Martinez, V., Zaki, M., Sosa, L. & Pineda, K. F. 2018 How can hackathons accelerate corporate innovation? In Advances in Production Management Systems (ed. Moon, I., Lee, G. M., Park, J., Kiritsis, D. & von Cieminski, G.), Vol. 535, pp. 167–175. Springer.
Flus, M. & Hurst, A. 2021a Design at hackathons: New opportunities for design research. Design Science 7, e4.
Flus, M. & Hurst, A. 2021b Experiences of design at hackathons: Initial findings from an interview study. In DS 109: Proceedings of the Design Society: 23rd International Conference on Engineering Design (ICED21) (ed. Seybold, C. & Mantwill, F.), Vol. 1, pp. 1461–1470. Cambridge University Press.
Flus, M. & Olechowski, A. 2023 A focussed literature review of dual-process thinking to inform the study of hackathons. In Design Computing and Cognition’22 (ed. Gero, J. S.). Springer.
Fowler, A. 2016 Informal STEM learning in game jams, hackathons and game creation events. In Proceedings of the International Conference on Game Jams, Hackathons, and Game Creation Events. ACM.
Friedman, K. 2019 Design Education Today: Challenges, Opportunities, Failures, online document www.academia.edu/40519668/Friedman_2019_Design_Education_Today_Challenges_Opportunities_Failures (accessed March 16, 2024).
Gama, K. 2017 Preliminary findings on software engineering practices in civic hackathons. In Proceedings of the 4th International Workshop on Crowd Sourcing in Software Engineering (CSI-SE 2017) (ed. LaToza, T. & Mao, K.), pp. 14–20. IEEE.
Gama, K., Alencar, B., Calegario, F., Neves, A. & Alessio, P. 2018 A hackathon methodology for undergraduate course projects. In Frontiers in Education, pp. 1–9. IEEE.
Girotra, K., Terwiesch, C. & Ulrich, K. T. 2010 Idea generation and the quality of the best idea. Management Science 56, 591–605.
Gustafsson, M. & Borglin, G. 2013 Can a theory-based educational intervention change nurses’ knowledge and attitudes concerning cancer pain management? A quasi-experimental design. BMC Health Services Research 13, 328.
Hagger, M. S. & Hamilton, K. 2019 Grit and self-discipline as predictors of effort and academic attainment. British Journal of Educational Psychology 89 (2), 324–342.
Hattie, J. & Timperley, H. 2007 The power of feedback. Review of Educational Research 77 (1), 81–112.
Herd, K. B. & Mehta, R. 2019 Head versus heart: The effect of objective versus feelings-based mental imagery on new product creativity. Journal of Consumer Research 46 (1), 36–52.
Horton, P. A., Jordan, S. S., Weiner, S. & Lande, M. 2018 Project-based learning among engineering students during short-form hackathon events. In ASEE Annual Conference and Exposition, Conference Proceedings 2018. American Society for Engineering Education.
Huang, J.-W. & Li, Y.-H. 2012 Slack resources in team learning and project performance. Journal of Business Research 65 (3), 381–388.
Hurst, A., Litster, G. & Rennick, C. 2020 Operationalizing Jonassen’s design theory of problem solving: An instrument to characterize educational design activities. In 2020 ASEE Virtual Annual Conference Content Access Proceedings. ASEE Conferences.
Hurst, A., Nespoli, O. G., Abdellahi, S. & Gero, J. S. 2019 A comparison of design activity of academics and practitioners using the FBS ontology: A case study. In DS 94: Proceedings of the 22nd International Conference on Engineering Design (ICED19), Vol. 1, pp. 1323–1332. Design Society.
Johnson, D. W. & Johnson, R. T. 2014 Cooperative learning: Improving university instruction by basing practice on validated theory. Journal on Excellence in College Teaching 25 (3–4), 85–118.
Jonassen, D., Strobel, J. & Lee, C. B. 2006 Everyday problem solving in engineering: Lessons for engineering educators. Journal of Engineering Education 95 (2), 139–151.
Kannengiesser, U. & Gero, J. S. 2019 Design thinking, fast and slow: A framework for Kahneman’s dual-system theory in design. Design Science 5, 121.
Kelland, C. A., Brisco, R. & Whitfield, I. 2022 Students’ perception of risk: Team members’ contribution within collaborative projects. In DS 117: Proceedings of the 24th International Conference on Engineering and Product Design Education (E&PDE 2022) (ed. Bohemia, E., Lyndon, B. & Hilary, G.). Design Society.
Kiernan, L., Ledwith, A. & Lynch, R. 2020 Comparing the dialogue of experts and novices in interdisciplinary teams to inform design education. International Journal of Technology and Design Education 30 (1), 187–206.
Kokotsaki, D., Menzies, V. & Wiggins, A. 2016 Project-based learning: A review of the literature. Improving Schools 19 (3), 267–277.
Kollwitz, C. & Dinter, B. 2019 What the hack? – Towards a taxonomy of hackathons. In Business Process Management (ed. Hildebrandt, T., van Dongen, B. F., Röglinger, M. & Mendling, J.), Lecture Notes in Computer Science. Springer International Publishing.
Kolmos, A., de Graaff, E. & Du, X. 2009 Diversity of PBL – PBL learning principles and models. In Research on PBL (ed. Du, X., de Graaf, E. & Kolmos, A.). Sense Publishers.
Kolmos, A., Fink, F. & Krogh, L. 2004 The Aalborg PBL Model. Aalborg University Press.
Komssi, M., Pichlis, D., Raatikainen, M., Kindstrom, K. & Jarvinen, J. 2015 What are hackathons for? IEEE Software 32 (5), 60–67.
Kos, B. A. 2019 Understanding female-focused hackathon participants’ collaboration styles and event goals. In Proceedings of the International Conference on Game Jams, Hackathons and Game Creation Events 2019. Association for Computing Machinery.
Krathwohl, D. R. 2002 A revision of Bloom’s taxonomy: An overview. Theory Into Practice 41 (4), 212–218.
La Place, C., Jordan, S. S., Lande, M. & Weiner, S. 2017 Engineering students rapidly learning at hackathon events. In Proceedings of the ASEE Annual Conference and Exposition. American Society for Engineering Education.
Lara, M. & Lockwood, K. 2016 Hackathons as community-based learning: A case study. TechTrends 60 (5), 486–495.
Lawson, B. 2005 How Designers Think: The Design Process Demystified (4th edition). Elsevier.
Lee, J.-S. 2014 The relationship between student engagement and academic performance: Is it a myth or reality? The Journal of Educational Research 107 (3), 177–185.
Legardeur, J., Masson, D. H., Gardoni, M. & Pimapunsri, K. 2020 The paradox of diversity’s influence on the creative teams: Lessons learned from the analysis of 14 editions of “the 24h of innovation” hackathon. In Proceedings of ISPIM Connects Bangkok (ed. Bitran, I., Conn, S., Gernreich, C., Heber, M., Huizingh, E. & Torkkeli, M.). The International Society for Professional Innovation Management (ISPIM).
Lewis, B. A., Parker, J., Cheng, L. W. & Resnick, M. 2015 UX day design challenge. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, p. 1. SAGE Publications.
Lifshitz-Assaf, H., Lebovitz, S. & Zalmanson, L. 2021 Minimal and adaptive coordination: How hackathons’ projects accelerate innovation without killing it. Academy of Management Journal 64 (3), 684–715.
Lin, C., Huang, J. & Lin, R. 2021 From STEAM to CHEER: A case study of design education development in Taiwan. Education Sciences 11 (4), 171.
Lloyd, P. & Scott, P. 1995 Difference in similarity: Interpreting the architectural design process. Environment and Planning B: Planning and Design 22 (4), 383–406.
Martins Pacheco, N. M., Behrenbeck, J., Tariq, B., Vazhapilli Sureshbabu, A. & Zimmermann, M. 2020 A role-based prototyping approach for human-centred design in fuzzy front-end scenarios. In DS 101: Proceedings of NordDesign 2020 (ed. Mortensen, N. H., Hansen, C. T. & Deininger, M.). Design Society.
Martins Pacheco, N. M., Vazhapilli Sureshbabu, A., Nürnberger, M. C., Durán Noy, L. I. & Zimmermann, M. 2021 A fuzzy front-end product development framework for start-ups. In DS 109: Proceedings of the Design Society: 23rd International Conference on Engineering Design (ICED21) (ed. Seybold, C. & Mantwill, F.). Cambridge University Press.
Matthews, J. & Wrigley, C. 2017 Design and design thinking in business and management higher education. Journal of Learning Design 10 (1), 41.
McKay, D. S. & Ellis, T. J. 2014 Tracking the flow of knowledge in IT organizations: The impact of organizational learning factors and project learning practices on project success. In Proceedings of the 47th Hawaii International Conference on System Sciences (HICSS), pp. 5185–5194. IEEE.
Medina Angarita, M. A. & Nolte, A. 2019 Does it matter why we hack? – Exploring the impact of goal alignment in hackathons. In Reports of the European Society for Socially Embedded Technologies (ed. Ciolfi, L., Salovaara, A. & Wagner, I.). European Society for Socially Embedded Technologies (EUSSET).
Menold, J., Berdanier, C., McComb, C., Hocker, E. & Gardner, L. 2018 Thus, I had to go with what I had: A multiple methods exploration of novice designers’ articulation of prototyping decisions. In Proceedings of the ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference – 2018. The American Society of Mechanical Engineers.
Meyer, M. W. & Norman, D. 2020 Changing design education for the 21st century. Journal of Design, Economics, and Innovation 6 (1), 13–49.
Micklethwaite, P. & Knifton, R. 2017 Climate change. Design teaching for a new reality. The Design Journal 20 (1), 1636–1650.
Murray, J., Studer, J., Daly, S., McKilligan, S. & Seifert, C. 2019 Design by taking perspectives: How engineers explore problems. Journal of Engineering Education 108 (2), 248–275.
Nandi, A. & Mandernach, M. 2016 Hackathons as an informal learning platform. In Proceedings of the 47th ACM Technical Symposium on Computing Science Education (ed. Alphonce, C.). Association for Computing Machinery.
Nicol, D. J. & Macfarlane-Dick, D. 2006 Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education 31 (2), 199–218.
Norman, D. A. 2016 When you come to a fork in the road, take it: The future of design. The Journal of Design, Economics, and Innovation 2 (4), 343–348.
Norman, G. R. & Schmidt, H. G. 2000 Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Medical Education 34 (9), 721–728.
Olesen, J. F., Hansen, N. B. & Halskov, K. 2018 Four factors informing design judgement at a hackathon. In Proceedings of the 30th Australian Conference on Computer-Human Interaction (ed. Buchanan, G.). Association for Computing Machinery.
Page, F., Sweeney, S., Bruce, F. & Baxter, S. 2016 The use of the “hackathon” in design education: An opportunistic exploration. In DS 83: Proceedings of the 18th International Conference on Engineering and Product Design Education (E&PDE16) (ed. Bohemia, E., Kovacevic, A., Buck, L., Tollestrup, C., Eriksen, K. & Ovesen, N.). Design Society.
Porras, J., Knutas, A., Ikonen, J., Happonen, A., Khakurel, J. & Herala, A. 2019 Code camps and hackathons in education – Literature review and lessons learned. In Proceedings of the 52nd Hawaii International Conference on System Sciences. ScholarSpace/AIS Electronic Library (AISeL).
Putwain, D. W., Nicholson, L. J., Pekrun, R., Becker, S. & Symes, W. 2019 Expectancy of success, attainment value, engagement, and achievement: A moderated mediation analysis. Learning and Instruction 60, 117–125.
Ranscombe, C., Bissett-Johnson, K., Mathias, D., Eisenbart, B. & Hicks, B. 2020 Designing with LEGO: Exploring low fidelity visualization as a trigger for student behavior change toward idea fluency. International Journal of Technology and Design Education 30 (2), 367–388.
R Core Team 2023 R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. R version 4.3.0. www.R-project.org.
Rennick, C., Hulls, C., Wright, D., Milne, A. J. B., Li, E. & Bedi, S. 2018 Engineering design days: Engaging students with authentic problem-solving in an academic hackathon. In Proceedings of the 2018 ASEE Annual Conference & Exposition. American Society for Engineering Education.
Sadowska, N. & Laffy, D. 2017 The design brief: Inquiry into the starting point in a learning journey. The Design Journal 20 (1), S1380–S1389.
Sadowska, N. & Laffy, D. 2019 Measuring the impact of strategic design learning experience long after the classroom delivery. The Design Journal 22 (1), 1305–1315.
Sanchez, C. & Dunning, D. 2018 Research: Learning a Little About Something Makes Us Overconfident, online document www.hbr.org/2018/03/research-learning-a-little-about-something-makes-us-overconfident (accessed March 16, 2024).
Sasson, I., Yehuda, I. & Malkinson, N. 2018 Fostering the skills of critical thinking and question-posing in a project-based learning environment. Thinking Skills and Creativity 29, 203–212.
Savin-Baden, M. 2014 Using problem-based learning: New constellations for the 21st century. Journal on Excellence in College Teaching 25, 124.
Sheppard, B., Sarrazin, H., Kouyoumjian, G. & Dore, F. 2018 The Business Value of Design, online document www.mckinsey.com/capabilities/mckinsey-design/our-insights/the-business-value-of-design/ (accessed December 13, 2023).
Shrivastava, M., Shah, N. & Navaid, S. 2018 Assessment of change in knowledge about research methods among delegates attending research methodology workshop. Perspectives in Clinical Research 9 (2), 83–90.
Shute, V. J. & Becker, B. J. (eds) 2010 Innovative Assessment for the 21st Century: Supporting Educational Needs (1st edition). Springer.
Souleles, N. 2017 Design for social change and design education: Social challenges versus teacher-centred pedagogies. The Design Journal 20 (1), 927–936.
Strobel, J. & van Barneveld, A. 2009 When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning 3 (1), 44–58.
Studer, J. A., Daly, S. R., McKilligan, S. & Seifert, C. M. 2018 Evidence of problem exploration in creative designs. AI EDAM 32 (4), 415–430.
Stylidis, K., Wickman, C. & Söderberg, R. 2020 Perceived quality of products: A framework and attributes ranking method. Journal of Engineering Design 31 (1), 37–67.
Taylor, N. & Clarke, L. 2018 Everybody’s hacking. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (ed. Mandryk, R., Hancock, M., Perry, M. & Cox, A.), pp. 1–12. ACM.
Thomson, A. & Grierson, H. 2021 Engineering student attainment and engagement throughout the design process. In DS 109: Proceedings of the Design Society: 23rd International Conference on Engineering Design (ICED21) (ed. Seybold, C. & Mantwill, F.). Cambridge University Press.
Tu, J.-C., Liu, L.-X. & Wu, K.-Y. 2018 Study on the learning effectiveness of Stanford design thinking in integrated design education. Sustainability 10 (8), 2649.
Zhang, Y., Bohemia, E. & McCardle, J. 2019 Aspects of a study of creative thinking and knowledge application. In Academy for Design Innovation Management Conference (ADIM 2019). Academy for Design Innovation Management.
Zimmerman, B. J. 2002 Becoming a self-regulated learner: An overview. Theory Into Practice 41 (2), 64–70.
Zwikael, O., Levin, G. & Rad, P. F. 2008 Top management support – The project friendly organization. Cost Engineering 50, 22–30.
