
Science Teaching Reconsidered: A Handbook (1997)

Chapter 7: Choosing and Using Instructional Resources

What issues should I consider when selecting instructional materials?

How can I use electronic resources to enhance student learning?

How can I help my students use textbooks more effectively?

A key feature of effective teaching is the selection of instructional materials that meet the needs of students and fit the constraints of the teaching and learning environment. There are many pressures for educators to match the audiovisual stimuli of television, computers, and electronic games with which students are experienced. The speed of personal computers and the ease of authoring systems permit instructors to design and customize computer-based audiovisual presentations and to develop computer-based assignments for their students. The tremendous increases in rates of information transfer, access to the Internet, and posting of materials on the World Wide Web give instructors and students an almost limitless supply of resource material. In addition, the ease of electronic communications between an instructor and students, and among students, provides new opportunities for sharing questions, answers, and discussions during a course. At the same time, there remains a major role for student use of textbooks and for instructional use of demonstrations, films, videos, slides, and overhead transparencies.

Carefully scripted presentations and activities run the risk of emphasizing teacher delivery rather than student learning. Carefully planned and prepared instructional resources can also tempt instructors to race ahead and cover more material. Rapid-fire presentations combined with audiovisual overload can leave students intellectually passive. One way to avoid this is to intersperse activities that assess student understanding and encourage reflection and critical thinking. Another is to reduce the pace of the class session by pausing periodically to invite questions.

Instructional resources usually fall into one of two categories: student-centered and teacher-centered. In the student-centered model, instructional resources can be used for tutorials, problem solving, discovery, and review. In the teacher-centered model, resources are used for presentations of supplementary or primary material in the classroom, as described in some examples in Chapter 2. Information technology can also be used for communication and for information retrieval.

TEXTBOOK USE IN TEACHING AND LEARNING

The mode of teaching so common today—the lecture-text-exam approach—is an artifact of centuries of European education. The professor's main role before the wide availability of the printing press was to lecture on information obtained from a rare copy of an often ancient book. Despite the fears of the faculty at the University of Salamanca during the sixteenth century, the textbook rapidly became a useful supplement to the class lecture rather than its replacement. Today a textbook is available for almost every college science class. As McKeachie (1994) notes, ". . . my years of experience in attempting to assess teaching effectiveness have led me to think that the textbook, more than any other element of the course, determines student learning."

Advantages and Disadvantages of Using Textbooks

Books are a highly portable form of information and can be accessed when, where, and at whatever rate and level of detail the reader desires. Research indicates that, for many people, visual processing (i.e., reading) is faster than auditory processing (i.e., listening to lectures), making textbooks a very effective resource (McKeachie, 1994). Reading can be done slowly, accompanied by extensive note taking, or it can be done rapidly, by skimming and skipping. There are advantages to both styles, and you may find it useful to discuss their merits with your students.

One important aspect of any science class is helping the student to make sense of the mass of information and ideas in a field. This can be done by showing students how to arrange information in a meaningful hierarchy of related major and minor concepts. Well-chosen textbooks help students understand how information and ideas can be organized.

Textbooks have several major limitations. Although a well-written book can engage and hold student interest, it is not inherently interactive. However, if students are encouraged to ask questions while they read, seek answers within the text, and identify other sources to explore ideas not contained in the text, they will become active readers and gain the maximum benefit from their textbook. In order to meet the needs of a broad audience, texts are often so thick that they overwhelm students seeking key information. Texts are often forced to rely on historical or dated examples, and they rarely give a sense of the discovery aspects and disorganization of information facing modern researchers.

Changes in Textbook Style and Content

Science textbooks have evolved considerably from the descriptive and historical approaches common before World War II. Today's texts are far more sophisticated, less historical, and contain more facts than in the past, with complex language and terminology (Bailar, 1993). Illustrations and mathematical expressions are more common. Emphasis has shifted toward principles and theory. Modern texts attempt to deal with issues of process as well as matters of fact or content. They are replete with essays, sidebars, diagrams, illustrations, worked examples, and problems and questions at many different levels. One result of these changes is that the average book length has increased two to four times in the past several decades.

In response to the need for quality science textbooks for all students, not just science majors, some authors are returning to descriptive and historical approaches. Generally, books for science literacy courses describe important ideas and discoveries, present a limited number of fundamental concepts, and emphasize the links among different facts and principles. Others (e.g., Trefil and Hazen, 1995) take an interdisciplinary approach, covering a range of science disciplines in a coherent, connected manner.

Textbooks and Effective Learning

Research on the effectiveness of textbooks has focused on two general areas: text structure and layout. Work on text structure has examined how the reader builds cognitive representations from text. Recent work categorizes the structure of science text as either a proof-first or a principle-first organization (Dee-Lucas and Larkin, 1990). The proof-first organization develops a proof or argument that builds to a conclusion, usually in the form of a fundamental concept, principle, or law. In the principle-first organization, a concept or principle is stated explicitly, then the evidence needed to support it is presented. The prevalence of the proof-first structure in contemporary textbooks may reflect the fact that most college science textbooks are written by scientists with little formal training in education; they present science the way it is practiced by experts. However, studies by Dee-Lucas and Larkin (1990) indicate that the principle-first structure is more effective for long-term retention and understanding by novice readers.

Layout and illustrations are important predictors of a text's effectiveness. One of the most effective types of illustration, especially for students with low verbal aptitude, is a simple multicolor line drawing (Dwyer, 1972; Holliday et al., 1977). Although more visually appealing, and more prevalent in the current textbook market, realistic drawings or photographs are less effective at enhancing student learning. The organization of information on a page also affects student learning (Wendt, 1979).

How to Choose and Use an Appropriate Textbook

Before selecting a text, it is important to know what books are currently on the market. Colleagues who teach the same or a similar course (in your department or at other institutions) are good sources of ideas and information. Your campus bookstore's textbook manager can provide the names and phone numbers of textbook sales representatives from many different companies. Science education publications (see Appendix B) carry advertisements from major publishers, and some feature a book review section or annual book buyer's guide. Professional society meetings also provide a chance to talk to publishers and see their new textbooks. Many companies will supply review copies to potential textbook adopters, in return for information about the course in which the book might be used.

There are a number of factors to consider when selecting a textbook. To be of greatest value to students, the objectives of a textbook must be consistent with those of the course. Authors often try to meet particular objectives in their books, and these may differ among the choices. Skim the preface to see whether you share the author's approach to the subject.

Consider how the table of contents aligns with your course syllabus and teaching philosophy:

Is coverage of topics broad or specific?

Are key principles stated precisely and clearly?

Are the explanations and interpretations consistent with your teaching style?

In addition to content, evaluate the text structure and layout as discussed in the previous section. Textbooks vary greatly in their level of difficulty with respect to readability, depth of theoretical treatment of information, and complexity of end-of-chapter problems. Colleagues who have adopted the book can provide insight about these issues. They are also helpful for determining whether a textbook contains errors, which have been shown to have a large, negative effect on student learning (Iona, 1987).

The text itself is rarely the only resource available to the students and instructor. Many publishers have a separate study guide, often with chapter summaries and solutions to textbook problems. Upon adoption of a text, publishers often provide (or offer for sale at a reduced price) transparencies, slides, and computer test banks. Software to accompany textbooks is also becoming more popular. This software can vary considerably in quality and usefulness, so you may want to ask for a demonstration disk before purchasing it or requiring that students purchase it.

Once you have chosen a textbook, help your students use it effectively. A number of suggestions are given in the sidebar. Allow time during the first week of class to introduce the text and outline your strategy for its use. Encourage your students to use the text by asking them questions that require higher-order critical thinking skills, drawing on and extending its material, methods, or examples. Simple factual questions contribute little to long-term retention or true understanding. Higher-order questions require students to think about the readings, ask questions, integrate material, and develop answers in their own words.

When appropriate, help students to understand that a textbook is not always the final authority on a topic, particularly in fields where new information is discovered at a very fast rate. Students may learn that it is okay to question the text if the instructor also openly disagrees with some interpretations or approaches in the book. The instructor can use different interpretations as examples of unresolved problems and illustrate critical thinking by presenting reasons and evidence for differing opinions. However, be careful not to develop such a negative attitude toward the text that students stop using it or question your judgment for choosing it.

What If I Can't Find the "Perfect" Textbook?

After a thorough search, you may find that the book you want simply does not exist. Publishers have realized this and have taken steps to customize their products to meet faculty needs. It is possible to select certain chapters of a given book to be bound as a volume. It is also possible to combine chapters of different books from the same publisher. This approach offers considerable flexibility, given that many smaller textbook publishers are now subsidiaries of larger corporations. Another option is to combine resources from several different publishers and to offer students a "coursepack" instead of a textbook. Many college bookstores and copy centers will work with faculty members to collect chapters, readings, and supplements. They obtain the required copyrights, and bind and sell custom-designed materials tailored for a particular course.

INFORMATION TECHNOLOGY USE IN TEACHING AND LEARNING

The Internet is an international high-speed electronic communications network (linking many local, regional, and national networks) that allows individuals at institutions or at home to access each other's computers for rapid communication or information retrieval. For some, the value of the Internet is that it allows users at remote locations to sign on to computers where they have accounts, often using connection software called telnet. For others, rapid electronic communication and document sharing replaces phone conversations and meetings and facilitates collaboration.

Another major use of the Internet has been to provide free public access to documents in electronic form. Many individuals and organizations "post" documents on their own computers so that others can obtain electronic copies (without need for special accounts and passwords). File transfers can be made by FTP (file transfer protocol) software, and for many who have posted documents to their Web pages (see below), file transfers can be initiated by as little as the click of a button on the title of the document.
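Such transfers can also be scripted rather than done by hand. As a minimal sketch in modern Python (the host name and file path below are hypothetical placeholders, not a real archive), an anonymous FTP download looks like this:

```python
from ftplib import FTP


def fetch_document(host: str, remote_path: str, local_path: str) -> None:
    """Download a publicly posted file via anonymous FTP."""
    with FTP(host) as ftp:
        ftp.login()  # no arguments: anonymous login, no special account needed
        with open(local_path, "wb") as f:
            # RETR streams the remote file; each chunk is written to disk
            ftp.retrbinary(f"RETR {remote_path}", f.write)


# Hypothetical usage:
# fetch_document("ftp.example.edu", "pub/syllabus.txt", "syllabus.txt")
```

The anonymous login mirrors the point above: posted documents are retrievable without special accounts or passwords.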

World Wide Web

The World Wide Web (WWW) is a system of linking information (text, sound, graphics, and video) in a way that allows for easy movement between related documents and sites. To use the Web you need a computer with special software called a browser, such as Lynx, Mosaic, Cello, or Netscape, or equivalent services available through commercial Internet providers. Highly detailed text, graphics, and videos are available on a wide array of topics.

The Internet and the ease of information viewing and retrieval that are possible through the Web mean that students are no longer limited to information provided by textbooks and printed materials in libraries. Students may "search" on the World Wide Web for preprints and reprints of articles, for discussion bulletin boards on specialized topics, for conference abstracts and proceedings, or for topical compilations of materials for research or teaching. Most Web navigational software systems include search engines that allow the user to locate information or sites by topic area. With more than a thousand new Web sites added every day, browsing for information on the Web needs to be done even more carefully than a literature search for library references. Bear in mind that while the Web holds enormous potential in providing access to information, much of the information available has not been reviewed for quality or reliability.

A number of electronic resources are available to those seeking information about education. Many professional societies have created Web pages with information about their educational initiatives and with links to other resources. Also, consider looking at the information posted by those who fund educational initiatives, including the National Science Foundation, the Howard Hughes Medical Institute, and the Department of Education. Other databases of references and curricular initiatives are provided by the NRC Committee on Undergraduate Science Education (http://www2.nasedu/cusehome), Project Kaleidoscope (PKAL), the Eisenhower Clearinghouse, and the Educational Research Information Center (ERIC).

Electronic Communication

Electronic mail ("e-mail") enables students and faculty to communicate with each other and with people all over the world. Many groups have adopted or created systems under which messages sent to a single address are delivered to mail accounts of all members of the group. This kind of electronic bulletin board is called a "listserv." A variation of a listserv bulletin board is a moderated listserv for which all messages are viewed by a moderator (and perhaps condensed, grouped, arranged, and/or edited) before being broadcast. Another form of group electronic communication is through a bulletin board on which messages are posted, called a newsgroup. Interested readers must sign on to a particular electronic address to find and read messages or posted documents. Bulletin boards of this type permit readers to leave their reactions to and comments on the postings of others.

Many instructors use electronic communication to facilitate interactions among students, and between students and themselves. Some faculty members create course-related Web pages with a mechanism for students to enter their comments or messages when they are connected to the Web page. Sample uses of e-mail or Web pages for communication include:

Students send questions electronically to the instructor, which gives them an opportunity to express a doubt or misconception that they might have been afraid to voice in class. The instructor can transmit the question and the answer simultaneously to all students, without identifying the individual who asked the question.

Students send or post questions about course material and are encouraged to answer each other's questions. Faculty members can monitor these exchanges to gauge student understanding and progress.

Faculty hold "electronic office hours" in addition to traditional ones, so that students can ask a question and receive an answer almost immediately. This approach is becoming more common at institutions with a large commuter population, where students cannot always attend the faculty member's office hours.

Faculty require drafts of student papers to be submitted electronically; not only does this make it easier for some faculty to review the draft, it forces the student to become familiar with technology used in the workplace.

Faculty members distribute or post homework assignments, homework solutions, exam solutions, and other supplemental information electronically.

Faculty create electronic "suggestion boxes" where students can post their comments about the course; consult the administrator of your campus e-mail system for ways to make the postings anonymous.
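The first use above, relaying a question and its answer to the whole class without identifying the asker, can be sketched with Python's standard email and smtplib modules. The addresses below are hypothetical placeholders, and the actual send is left commented out:

```python
from email.message import EmailMessage
import smtplib  # only needed if the message is actually sent


def broadcast_anonymous_qa(question: str, answer: str,
                           list_addr: str, instructor_addr: str) -> EmailMessage:
    """Build a message relaying a student's question and the answer
    to a class mailing list, with no mention of who asked."""
    msg = EmailMessage()
    msg["Subject"] = "Course Q&A (asker anonymous)"
    msg["From"] = instructor_addr
    msg["To"] = list_addr  # one list address delivers to every subscriber
    msg.set_content(f"Question:\n{question}\n\nAnswer:\n{answer}")
    return msg


msg = broadcast_anonymous_qa(
    "Why does the text state the principle before the proof?",
    "Principle-first organization aids retention for novice readers.",
    "physics101-list@example.edu",  # hypothetical class listserv
    "instructor@example.edu",       # hypothetical instructor address
)
# To actually send (assuming a reachable mail host):
# with smtplib.SMTP("localhost") as s:
#     s.send_message(msg)
```

Addressing the message to a single list address, rather than to individual students, is what makes the listserv mechanism described earlier do the fan-out.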

Choosing and Using Electronic Technologies

Before reviewing particular software, it is important to know which course goal it will help you to achieve. The next step is to talk to publishers, colleagues, and personnel from your campus's academic computing department. Lists such as those published by Boettcher (1993) and Kozma and Johnson (1991) describe award-winning software developed by faculty members. Many software vendors offer demonstration disks that illustrate their products' features. In addition to working with the demonstration disks yourself, invite students to give you feedback on the product.

After purchasing software for student use, you should invest the time necessary to maximize its benefit to students. Some class time (or special sessions in a computer lab) may be needed to teach the students how to use the software effectively. If students will use the product outside of class, introduce the software to the staff at the campus computer labs, so that they will be prepared to answer students' questions. Faculty usually need to develop "courseware" to help guide the students through the software.

The great advantage of multimedia systems is that the combined audio and visual explanation helps students learn and remember. But to avoid student frustration with interactive systems, instructors should make their expectations clear and should provide opportunities for students to get assistance.

Effective science teaching requires creativity, imagination, and innovation. In light of concerns about American science literacy, scientists and educators have struggled to teach this discipline more effectively. Science Teaching Reconsidered provides undergraduate science educators with a path to understanding students, accommodating their individual differences, and helping them grasp the methods—and the wonder—of science.

What impact does teaching style have? How do I plan a course curriculum? How do I make lectures, classes, and laboratories more effective? How can I tell what students are thinking? Why don't they understand? This handbook provides productive approaches to these and other questions.

Written by scientists who are also educators, the handbook offers suggestions for having a greater impact in the classroom and provides resources for further research.


This site belongs to UNESCO's International Institute for Educational Planning


IIEP Learning Portal


Learning and teaching materials

The central role of textbooks and other learning and teaching materials (LTM) in enhancing the quality of learning and improving student performance is widely recognized (Smart and Jagannathan, 2018; GEM Report, 2016b). In low-income countries, quality LTM can compensate for disabling factors such as large class sizes, poorly trained or unqualified teachers, a shortage of instructional time, high levels of illiteracy among parents, and a lack of reading materials in homes (Smart and Jagannathan, 2018; Read, 2015).

Quality LTM are crucial for achieving SDG 4. Ensuring that every institution has appropriate learning materials and technology is a key strategy for reaching target 4a in particular. According to the Education 2030 Framework for Action , ‘[e]ducation institutions and programmes should be adequately and equitably resourced, with … books, other learning materials, open educational resources and technology that are non-discriminatory, learning conducive, learner friendly, context specific, cost effective and available to all learners – children, youth and adults’ (Education 2030, 2016: 33).

Definitions and practical considerations

  • Textbooks are the most visible aspects of a curriculum and are often considered the main script that shapes the teaching and learning processes (UNESCO, 2017). Quality textbook development and provision involves four main steps: development (based on curricular frameworks); procurement systems (state or private sector, approved textbooks list); distribution and access (arrival in schools, issuance to students); and storage and conservation.
  • Teachers’ guides support teachers in their teaching practices. Effective teachers’ guides should: contain explicit communication of conceptual goals with links to proposed activities, provide knowledge and support to help understand and implement teaching plans, reinforce pedagogical content knowledge, give guidance on the practice and understanding of relevant pedagogical activities, present alternatives and freedom of choice, and engage teachers in ongoing reflection.
  • Supplementary materials include books, newspapers, informational pamphlets, and other materials printed in mother tongue and instructional languages reflecting local customs and concerns. They enrich teaching, engage students in multi-dimensional learning, build students’ abilities to apply their knowledge (Elliott and Corrie, 2015), and are thus critical for literacy outcomes (Read, 2015). Studies show that investments in reading books and school libraries have an even greater correlation with increases in student achievement in lower grades than investments in textbook provision (Read and Treffgarne, 2011; Read, 2015).
  • Multimedia and digital resources are a growing source of knowledge for teachers and learners. Several studies show that greater access to information and communication technologies in schools can help reduce the digital divide between low- and high-income groups (UNESCO, 2014a; Jacob, 2016). The digital era has challenged conventional textbook practices. Textbooks need updating more frequently and need to support collaborative and interactive pedagogical methods (Smart and Jagannathan, 2018).

What we know

Studies suggest that textbooks and similar materials (workbooks, exercise books) can increase student learning (Glewwe et al., 2011). The two characteristics most consistently associated with improved student performance are the availability of LTM and the presence of well-trained, prepared, supervised, and motivated teachers. Since providing textbooks is cheaper than training and motivating teachers, textbooks are the most cost-effective of all education inputs for raising student achievement (Read, 2015).

Several studies in Africa documented the positive correlation of textbooks and learning achievement (UIS, 2012). PASEC (Programme d'analyse des systèmes éducatifs de la CONFEMEN) and SACMEQ (Southern and Eastern Africa Consortium for Monitoring Educational Quality) results show significant positive correlations between access to textbooks and student test scores in both reading and mathematics (SACMEQ, 2010; Kuecken and Valfort, 2013; PASEC, 2015).

However, many conditions need to be met for LTM to enhance learning. For textbooks to be effective, they must be regularly used in class, be in a language that is widely understood by both students and teachers (Read, 2015), and improve teacher–learner interaction (World Bank, 2018a). Kuecken and Valfort (2013) warn about possible biases due to omitted variables such as teacher qualification or school infrastructure that may influence both textbook access and educational outcomes. They make a distinction between the impact of textbook sharing and textbook ownership on learning outcomes, finding an impact for textbook sharing only amongst students of a high socio-economic status.

In order to enable quality learning for all users, all links of the LTM chain – including definition, design, creation, development, production, distribution, storage, and classroom usage – must be carefully considered (Read, 2015: 29–30).

Many countries still face the challenges of insufficient availability, poor quality, and ineffective usage of LTM (Elliott and Corrie, 2015). Equity and inclusion are also key issues to be addressed at all stages of LTM development and provision.

Provision, cost, and accessibility

Accessibility is ‘the extent to which an individual or group is able to acquire and use these tools, either freely or at an affordable cost’ (UNESCO 2014b: 13). Adequate supply is generally defined as a minimum of one textbook for every three students and, at primary level, enough reading books so that every child has access to at least one new book per week. Given that LTM are often the first items to be hit by severe funding constraints, reducing their cost is key to improving their accessibility (Read and Treffgarne, 2011). With increased enrolment rates, LTM provision systems are more expensive to maintain, and the high risk of corruption across the LTM value chain may influence price. For example, textbook contracts may be awarded in favour of books of lower quality and higher cost (GEM Report, 2016a). While highly centralized book production systems are expensive, decentralization requires the creation of specialized management, monitoring, and supervision systems operated by trained staff and supported by regular and reliable budget allocations. It also requires the establishment of approved textbook lists, from which schools themselves may select the titles they want (Read, 2015).

Data and monitoring

Many countries do not have clearly defined, achievable LTM provision targets, nor access to data that enable them to estimate LTM supply and allocation to schools (Read, 2016). Private sector competition can lead to better production, higher quality, and reduced prices, but only if good management and monitoring processes exist within ministries of education (Read, 2015).

Quality and relevance

The physical characteristics of textbooks have a strong impact on their longevity and ultimately on their lifetime costs. The quality of layout, font, illustrations, and/or graphics, as well as the balance between visuals and text, also plays a key role in learning processes. For electronic media (e.g. audio, graphics, video, animation), quality may be judged in terms of functionality as well as design, interactivity, and ease of navigation. For web resources, ease of access and navigation is important.

LTM should be a product of the curriculum development process and therefore aligned to the philosophy, objectives, content, methodology, and evaluation of the curriculum (UNESCO, 2005; Oates, 2014; Smart and Jagannathan, 2018). They should be age-appropriate and take into account different linguistic environments, local and indigenous knowledge, skills, and materials as well as the background and needs of learners (UNESCO, 2005; UNESCO, 2014b).

LTM need to be grounded in both learning theory and subject-specific content theory, provide varied application of concepts and principles, facilitate active and equitable participation of all learners, and guide learners to reflect on what they are learning (Oates, 2014). Finally, the likelihood of LTM leading to quality learning highly depends on how teachers use them. Many teachers have little or no practical experience in the correct and creative use of textbooks and associated teachers’ guides.

Equity and inclusion

Quality textbooks should be free from divisive stereotypes and prejudices, and frequently revised and updated to reflect changing local, national, regional, and international contexts (UNESCO, 2017). While LTM must adapt and respond to the diverse needs of all learners ‘in a wide range of cultural contexts, economic conditions and educational settings’, as well as personal situations (UNESCO, 2005: 3), they also need to represent this diversity in their content. However, some textbooks still present stereotypical, simplistic interpretations of gender and of ethnic, cultural, religious, and linguistic minorities (GEM Report, 2016b). The underrepresentation of people with disabilities in textbooks across the world perpetuates their invisibility and disadvantage.

Resources should be available in a language comprehensible to learners, in particular for ‘low-achieving’ students (Read, 2015). Textbooks should accommodate the special needs of learners with disabilities, through large font and Braille editions, augmentative and alternative modes, and adapted versions at simpler levels of reading difficulty. In crises and emergencies, textbooks need to respond to these particular contextual challenges as part of integrated, crisis-sensitive education content and planning approaches (Batton et al., 2015).

Policy and planning

Design and implement a textbook policy

A textbook policy can help align the ‘quality’ components of education – curriculum, textbooks, and assessment systems – with the learning process in the classroom. A textbook policy can also facilitate allocation of budgets between physical and digital materials and ensure coherence between curriculum, classroom practices, and learning objectives. The policy should set out the roles of the different actors involved in the process (Smart and Jagannathan, 2018).

Provide capacity building

Capacity building may involve training textbook producers to create inclusive materials; supporting national and local publishing industries as providers of affordable textbooks and reading materials; training staff in content authoring and evaluation; and training teachers to develop and use textbooks and supplementary learning materials (UNESCO, 2014b; UNESCO, 2014c).

Develop computerized LTM management information systems

Investment in a national, computerized LTM management system can provide information, system control, and accurate forward-cost projections. Examples include Rwanda’s Learning and Teaching Materials Management Information System (LTMMIS) and Namibia’s Learning Support Materials Management Information System (LSMMIS) (Read, 2016: 14–19).

Decentralize supply and distribution

Decentralizing from supply-side policies to demand-based school selection allows schools to select and order LTM efficiently, and ensures ownership of the materials selected (Read and Treffgarne, 2011).

Investing in school and classroom storage and simple school management and usage systems, as well as opting for materials with high production specifications and a long book life, can help achieve maximum cost amortization and minimum distribution costs (Read, 2015). A shift from state- to private-sector authorship, publishing, production, and distribution in a public-private partnership (PPP) with government offers potential for better production and presentational quality as well as reduced prices (Read, 2015; Smart and Jagannathan, 2018). Innovative financing models based on PPPs include Gavi, the Vaccine Alliance, whose approach could increase access to textbooks in low-income countries (Elliott and Corrie, 2015), and the Global Book Alliance (Results for Development and International Education Partners Ltd., 2016).

Programmes and reforms

  • Rwanda: an innovative textbook distribution programme significantly increased access to textbooks (GPE Secretariat, 2013).
  • Cameroon: a textbook reform that introduced a single book per subject and price approval led to a reduction in corruption and costs (Ntap, 2017).
  • Kenya: Secondary Education Quality Improvement Project reforms reduced the cost of textbooks for grades 7 to 12 by almost 65 per cent (Jena, 2018).
  • Swaziland: free textbooks have been provided to all primary school pupils since 2003, leading to a 25 percentage point increase in the number of pupils with textbooks (SACMEQ, 2011).
  • Guatemala and Nicaragua: free textbook programmes target the most disadvantaged learners (GEM Report, 2016a).

Plans and policies

  • Ghana: Textbook Development and Distribution Policy for Pre-Tertiary Education (2002)*
  • Pakistan: National Textbook and Learning Materials Policy and Plan of Action (2007)
  • Namibia: Republic of Namibia Textbook Policy (2008)
  • Mali: Politique nationale du manuel scolaire et du matériel didactique (2010)
  • Democratic Republic of the Congo: Politique nationale des manuels scolaires (2017)

*See Opoku-Amankwa, Brew-Hammond, and Mahama, 2015.

  • Brunswic, E.; Valérien, J. 1995. Planning the Development of School Textbooks: A Series of Twelve Training Modules for Educational Planners and Administrators. Paris: UNESCO-IIEP.
  • Crabbe, R.A.B.; Nyingi, M.; Abadzi, H. 2014. Textbook Development in Low Income Countries: A Guide for Policy and Practice. Washington, DC: World Bank.
  • RTI International. 2015. A Guide for Strengthening Gender Equality and Inclusiveness in Teaching and Learning Materials. Washington, DC: USAID.

Batton, J.; Alama, A.; Sinclair, M.; Bethke, L.; Bernard, J. 2015. Textbooks and other education materials: what key messages do we want to convey and how? Paris: UNESCO-IIEP.

Education 2030. 2016. Incheon Declaration and Framework for Action for the implementation of Sustainable Development Goal 4: Ensure inclusive and equitable quality education and promote lifelong learning.

Elliott, L.; Corrie, L. 2015. ‘The GAVI approach to learning and teaching materials in sub-Saharan Africa.’ Background paper prepared for the Education for All Global Monitoring Report 2015, Education for All 2000–2015: Achievements and Challenges.

GEM Report. 2016a. ‘Every child should have a textbook’. Policy paper 23. UNESCO.

----. 2016b. ‘Textbooks pave the way to sustainable development’. Policy paper 28. UNESCO.

Glewwe, P.; Hanushek, E.; Humpage, S.; Ravina, R. 2011. School resources and educational outcomes in developing countries: A review of the literature from 1990 to 2010. Cambridge, MA: National Bureau of Economic Research.

GPE Secretariat. 2013. ‘Books for All: Rwanda’s innovative textbook distribution program’. Global Partnership for Education, retrieved 29 May 2019.

Jacob, B.A. 2016. ‘The opportunities and challenges of digital learning’. Brookings, retrieved 29 May 2019.

Jena, N. 2018. ‘Making textbooks affordable and available for every student in Kenya’. World Bank Blogs, retrieved 29 May 2019.

Kuecken, M.; Valfort, M.-A. 2013. ‘When do textbooks matter for achievement? Evidence from African primary schools’. Economics Letters 119(3): 11–15.

Ntap, E.J. 2017. ‘Réforme de livres scolaires au Cameroun pour lutter contre la corruption’. VOA, 19 December 2017.

Oates, T. 2014. Why textbooks count: A policy paper . Cambridge: Cambridge Assessment.

Opoku-Amankwa, K.; Brew-Hammond, A.; Mahama, A.K. 2015. ‘Publishing for pre-tertiary education in Ghana: The 2002 textbook policy in retrospect’ . Educational Studies 41(4): 414–429.

PASEC. 2015. PASEC2014 education system performance in Francophone sub-Saharan Africa: competencies and learning factors in primary education. Dakar: CONFEMEN.

Read, N. 2016. ‘Measures of learning and teaching material availability and use in sub-Saharan Africa and other low income countries’. Background paper prepared for the Global Education Monitoring Report 2016, Education for People and Planet: Creating Sustainable Futures for All.

Read, T.; Treffgarne, C. 2011. ‘Learning and teaching materials: policy and practice for provision’. Guidance Note. DfID.

Read, T. 2015. Where have all the textbooks gone? Toward sustainable provision of teaching and learning materials in sub-Saharan Africa. Washington, DC: World Bank.

Results for Development; International Education Partners Ltd. 2016. ‘Global Book Fund feasibility study: Final report’. Washington, DC: Results for Development.

Smart, A.; Jagannathan, S. 2018. Textbook policies in Asia: Development, publishing, printing, distribution, and future implications . Manila: Asian Development Bank.

SACMEQ. 2010. How successful are textbook provision programmes? SACMEQ Policy Issues Series 6.

----. 2011. Quality of primary school inputs in Swaziland . Policy Brief 2.

UIS. 2012. ‘School and teaching resources in sub-Saharan Africa: Analysis of the 2011 UIS Regional Data Collection on Education’ . UIS information bulletin 9. Montreal: UNESCO-UIS.

UNESCO. 2005. A Comprehensive strategy for textbooks and learning materials. Paris: UNESCO.

----. 2014a. Teaching and learning: achieving quality for all; EFA Global Monitoring Report, 2013-2014. Paris: UNESCO.

----. 2014b. Textbooks and learning resources: A global framework for policy development . Paris: UNESCO.

----. 2014c. Textbooks and learning resources: Guidelines for developers and users . Paris: UNESCO.

----. 2017. Making textbook content inclusive: A focus on religion, gender, and culture . Paris: UNESCO.

World Bank. 2018. World Development Report 2018: Learning to realize education’s promise. Washington, DC: World Bank.

Related information

  • Why textbooks are a crucial part of every child’s learning journey
  • UNESCO International Bureau of Education (IBE)
  • Lesson plans


Understanding Science


Educational research.

The teaching resources recommended on our site are consistent with what is known about how students learn the nature and process of science. Educational research suggests that the most effective instruction in this area is explicit and reflective, and provides multiple opportunities for students to work with key concepts in different contexts. But just how do we know that this sort of instruction works? And how do we know which concepts are hardest for students to learn and which are the most difficult misconceptions to address? To find out, browse the links below. Each link summarizes a journal article from the education research literature and helps reveal how we know what we know about how students learn.

  • “That’s what scientists have to do”: Preservice elementary teachers’ conceptions of the nature of science during a moon investigation.  (Abell et al., 2001)
  • Influence of a reflective activity-based approach on elementary teachers’ conceptions of nature of science.  (Akerson et al., 2000)
  • Evaluating knowledge of the nature of (whole) science.  (Allchin, 2011)
  • Learners’ responses to the demands of conceptual change: Considerations for effective nature of science instruction.  (Clough, 2006)
  • Examining students’ views on the nature of science: Results from Korean 6th, 8th, and 10th graders.  (Kang et al., 2004)
  • Influence of explicit and reflective versus implicit inquiry-oriented instruction on sixth graders’ views of nature of science.  (Khishfe and Abd-El-Khalick, 2002)
  • Teachers’ understanding of the nature of science and classroom practice: Factors that facilitate or impede the relationship.  (Lederman, 1999)
  • Revising instruction to teach nature of science.  (Lederman and Lederman, 2004)
  • Science teachers’ conceptions of the nature of science: Do they really influence teacher behavior?  (Lederman and Zeidler, 1987)
  • Examining student conceptions of the nature of science.  (Moss, 2001)
  • Student conceptualizations of the nature of science in response to a socioscientific issue.  (Sadler et al., 2004)
  • Explicit reflective nature of science instruction: Evolution, intelligent design, and umbrellaology.  (Scharmann et al., 2005)
  • Developing views of nature of science in an authentic context: An explicit approach to bridging the gap between nature of science and scientific inquiry.  (Schwartz et al., 2004)
  • Tangled up in views: Beliefs in the nature of science and responses to socioscientific dilemmas.  (Zeidler et al., 2002)

Abell, S., M. Martini, and M. George. 2001. “That’s what scientists have to do”: Preservice elementary teachers’ conceptions of the nature of science during a moon investigation.  International Journal of Science Education  23(11):1095-1109. Two sections of an undergraduate course in elementary science education were observed during an extended investigation, in which students made observations of the moon and tried to develop explanations for what they saw. Students worked in groups, were engaged in many aspects of the process of science, and were asked to reflect on their own learning regarding the moon. Eleven student journals of the experience, along with interview transcripts from these students, were analyzed for student learning regarding observation in science, the role of creativity and inference in science, and social aspects of science. Major findings include:

  • Students recognized that observations are key in science but did not recognize the role that inference plays in science.
  • Students recognized that their own work involved observing, predicting, and coming up with explanations, but they did not generally connect this to the process of science.
  • Students recognized that collaboration facilitated their own learning but did not generally connect this to the process of science.

This research highlights the pedagogical importance of making the nature and process of science explicit: even though students were actively engaged in scientific processes, they did not get many of the key messages that the instructors implicitly conveyed. The researchers also recommend asking students to reflect on how their own understandings of the nature and process of science are changing over time.

Akerson, V.L., F. Abd-El-Khalick, and N.G. Lederman. 2000. Influence of a reflective activity-based approach on elementary teachers’ conceptions of nature of science.  Journal of Research in Science Teaching  37(4):295-317. Fifty undergraduate and graduate students enrolled in a science teaching methods course engaged in six hours of activities designed to target key nature-of-science concepts, consistent with those outlined in Lederman and Lederman (2004). After the initial set of activities and throughout the course, students were encouraged to reflect on those concepts as opportunities arose within the designated pedagogical content, and were assigned two writing tasks focusing on the nature of science. By the end of the course, students were so accustomed to these reflections that they frequently identified such opportunities for themselves. Students were pre- and post-tested with an open-ended questionnaire targeting the key concepts, and a subset of students was interviewed on these topics. Responses were analyzed for key concepts to determine whether students held adequate conceptions in these areas. Major findings include:

  • There were few differences between graduates and undergraduates: most students began the course with largely inadequate conceptions.
  • Students began the course understanding least about the empirical nature of science, the tentative nature of scientific knowledge, the difference between theories and laws, and the role of creativity in science.
  • Significant gains were achieved as a result of instruction. Student conceptions improved most in the areas of the tentative nature of scientific knowledge, the difference between theories and laws, and the difference between observation and inference.

The explicit, reflective instruction was effective, but despite the gains achieved, many students still held inadequate conceptions at the end of the course. This supports the idea that students hold tenacious misconceptions about the nature and process of science, and, the authors argue, suggests that instructors should additionally focus on helping students see the inadequacy of their current conceptions. The authors suggest that the role of subjectivity, as well as of social and cultural factors, in science is best learned through rich historical case studies, which are hard to fit into a methods course. Finally, the authors conclude that nature-of-science instruction is effective in a methods course, but would likely be more effective in a science content course.

Allchin, D. 2011. Evaluating knowledge of the nature of (whole) science.  Science Education  95:518-542. The author argues that commonly used instruments assessing knowledge of the nature of science are inadequate in several ways. They focus too much on declarative knowledge instead of conceptual understanding, are designed for research, not classroom assessment, and are inauthentic in the sense that they do not examine student knowledge in contexts similar to those in which we want students to use this knowledge. Furthermore, lists of the tenets of the nature of science (which such assessments are based upon) are oversimplified and incomplete. The author argues that instead of assessing whether students can list the characteristics of scientific knowledge, we should be interested in whether students can effectively analyze information about scientific and socioscientific controversies and assess the reliability of scientific claims that affect their decision making. In order to do this, students need to understand how the process of science lends credibility to scientific ideas. The author proposes an alternative assessment form (based on the AP free-response essay) that requires well-informed analysis on the part of the student, involves authentic contexts, and can be adapted for many different assessment purposes and situations. In it, students are asked to analyze historic and modern case studies of scientific and socioscientific controversies. Prototypes for this type of assessment are provided.

Clough, M. 2006. Learners’ responses to the demands of conceptual change: Considerations for effective nature of science instruction.  Science & Education  15:463-494. The author introduces the idea that many aspects of student learning about the nature and process of science can be explained, and that learning may be improved, by viewing this learning as a process of conceptual change. Just as in learning about Newtonian physics, students often enter an instructional setting with tenacious misconceptions about what science is and how it works — probably resulting from previous instruction (e.g., cookbook labs) and other experiences. Students may then distort new information to fit their existing incorrect knowledge frameworks. The author proposes that this is why explicit, reflective instruction (which provides students with opportunities to assess their previous conceptions) helps students learn about the nature and process of science, while implicit, non-reflective instruction does not. Furthermore, the author argues that explicit instruction on the nature and process of science can be placed along a continuum from decontextualized to highly contextualized. Examples of each are:

  • Decontextualized: black-box activities
  • Moderately contextualized: students reflecting on the process of science in their own labs
  • Highly contextualized: students reflecting on a modern or historic example of science in progress

Highly contextualized activities are useful because they make it difficult for a student to dismiss their learning as applying only to “school science” and because teachers are less likely to view such activities as add-ons. However, decontextualized activities also have advantages because they make it very easy to be explicit and emphasize key concepts. The author concludes that instruction that incorporates activities from all along the continuum and that draws students’ attention to the connections between the different positions along the continuum is likely to be most effective.

Kang, S., L. Scharmann, and T. Noh. 2004. Examining students’ views on the nature of science: Results from Korean 6th, 8th, and 10th graders.  Science Education  89(2):314-334. A multiple-choice survey (supplemented by open-ended questions) on the nature and process of science was given to a large group of 6th, 8th, and 10th grade students in Korea. Most students thought that:

  • Science is mainly concerned with technological advancement
  • Theories are proven facts
  • Theories can change over time
  • Scientific knowledge is not constructed, but discovered (i.e., can be read off of nature)

Interestingly, Korean students don’t tend to hold the common Western misconception of theories as “just hunches.” The researchers found little improvement in understanding among older students. This suggests that special attention is needed to help students learn about the nature of science. The researchers argue that we should begin instruction in this area early in elementary school.

Khishfe, R., and F. Abd-El-Khalick. 2002. Influence of explicit and reflective versus implicit inquiry-oriented instruction on sixth graders’ views of nature of science.  Journal of Research in Science Teaching  39(7):551-578. Two sixth-grade classes (62 students total) in Lebanon experienced two different versions of a curriculum spanning ten 50-minute segments. One class participated in an inquiry-oriented science curriculum, which included a discussion component that explicitly emphasized how the nature of science was demonstrated through student activities. The other participated in the same inquiry curriculum, but their discussion focused exclusively on science content or the skills students had used in the activity. Both groups completed open-ended questionnaires and participated in interviews regarding their views of the nature of science before and after the intervention. The two groups started off with similar, low levels of understanding, but the students in the class with explicit discussion of the nature of science substantially improved their understanding of key elements of the nature of science (the tentative, empirical, and creative nature of scientific knowledge, as well as the difference between observation and inference) over the course of the intervention. The other group did not. However, even with the enhanced, explicit curriculum, only 24% of the students achieved a consistently accurate understanding of the nature of science. These findings support the idea that inquiry alone is insufficient to improve student understanding of the nature of science; explicit, reflective instruction is necessary as well. The researchers further conclude that this instruction should be incorporated throughout teaching over an extended period of time in order to see gains among a larger fraction of students.
The researchers emphasize that explicit, reflective teaching does not mean didactic teaching, but rather instruction that specifically targets nature of science concepts and that provides students with opportunities to relate their own activities to the activities of scientists and the scientific community more broadly.

Lederman, N.G. 1999. Teachers’ understanding of the nature of science and classroom practice: Factors that facilitate or impede the relationship.  Journal of Research in Science Teaching  36(8):916-929. Five high school biology teachers were observed weekly for one year to examine whether their conceptions of the nature of science were reflected in their teaching. The researcher also collected data from questionnaires, student and teacher interviews, and classroom materials. All five teachers had accurate understandings of the nature of science. The most experienced teachers used pedagogical techniques consistent with the nature of science, though they weren’t explicitly trying to do so and did not claim to be trying to improve students’ understanding of the nature of science. Less experienced teachers did not teach in a manner consistent with their views of the nature of science. This suggests that an adequate understanding of the nature and process of science and curricular flexibility alone are not sufficient to ensure that teachers will use pedagogical techniques that reflect that understanding. In addition, the researchers found that students in these classrooms gained little understanding of the nature of science, regardless of whether they were taught by a more or less experienced teacher. This lends further support to the idea that teachers need to be explicit about how lessons and activities relate to the nature and process of science in order for students to improve their understandings in this area. The researcher concludes that teacher education programs need to make a concerted effort to help teachers improve their ability to explicitly translate their understanding of the nature of science into their teaching practices. Furthermore, teachers should be encouraged to view an understanding of the nature of science as an important pedagogical objective in its own right.

Lederman, N.G., and J.S. Lederman. 2004. Revising instruction to teach nature of science.  The Science Teacher  71(9):36-39. The authors describe seven aspects of the nature of science that are important for K-12 students to understand:

  • the difference between observation and inference
  • the difference between laws and theories
  • that science is based on observations of the natural world
  • that science involves creativity
  • that scientific knowledge is partially subjective
  • that science is socially and culturally embedded
  • that scientific knowledge is subject to change.

They argue that most lessons can be modified to emphasize one or more of these ideas and provide an example from biology instruction. Many teachers use an activity in which students study a slide of growing tissue and count cells at different stages of mitosis in order to estimate the lengths of these stages. The authors recommend modifying this activity in several ways:

  • asking students to reason about how they know when one stage ends to emphasize the sort of subjectivity with which scientists must deal
  • asking students to grapple with ambiguity in their data
  • asking students to reason about why different groups came up with different estimates and how confident they are in their estimates in order to emphasize the tentativeness of scientific knowledge
  • asking students to distinguish between what they directly observed on the slide and what they inferred from those observations.

The authors emphasize that incorporating the nature and process of science into this activity involves not changing the activity itself, but rather carefully crafting reflective questions that make the relevant aspects of the nature and process of science explicit.

Lederman, N.G., and D.L. Zeidler. 1987. Science teachers’ conceptions of the nature of science: Do they really influence teacher behavior?  Science Education  71(5):721-734. Eighteen high school biology classrooms led by experienced teachers were studied over the course of one semester. Teachers’ understandings of the nature and process of science were assessed at the beginning and end of the semester. In addition, the researchers made extensive observations of each classroom at three different points in the semester and categorized the teachers’ and students’ behaviors along many variables relating to teaching the nature and process of science. The researchers found  no  relationship between a teacher’s knowledge of the nature and process of science and the teacher’s general instructional approach, the nature-of-science content addressed in the classroom, the teacher’s attitude, the classroom atmosphere, or the students’ interactions with the teacher. This finding challenges the widely held assumption that student understanding of the nature and process of science can be improved simply by improving teacher understanding. Instead, the teachers’ level of understanding of this topic was unrelated to classroom performance. The authors emphasize that this doesn’t indicate that a teacher’s ideas don’t matter at all; teachers need at least a basic understanding of the topics they will teach, but this alone isn’t enough. The authors suggest that to improve their teaching in this area, instructors also need to be prepared with strategies designed specifically for teaching the nature and process of science.

Moss, D.M. 2001. Examining student conceptions of the nature of science.  International Journal of Science Education  23(8):771-790. Five 11th and 12th grade students, with a range of academic achievement, taking an environmental science class, were interviewed six times over the course of a year. The class was project-based and engaged students in data collection for real scientific research. Interviews focused on students’ views of selected aspects of the nature and process of science. The researcher coded and interpreted transcripts of the interviews. Major findings include:

  • In contrast to previous studies, most students understood that scientific knowledge builds on itself and is tentative. Students also seemed to understand science as a social activity.
  • Many students didn’t know what makes science science and had trouble distinguishing science from other ways of knowing.
  • Many students viewed science as merely procedural.
  • Most students didn’t understand that scientists regularly generate new research questions as they work.
  • Despite the authentic, project-based nature of the course, there were few shifts in student views of the nature and process of science.

This research supports the view that explicit instruction is necessary to improve student understanding of the nature/process of science. The researcher suggests that this can be done by having students develop their own descriptions of the fundamentals of the nature and process of science. The researcher also suggests that teachers need to focus on helping students understand the boundaries of science, perhaps by explicitly discussing how science compares to other human endeavors.

Sadler, T.D., F.W. Chambers, and D. Zeidler. 2004. Student conceptualizations of the nature of science in response to a socioscientific issue.  International Journal of Science Education  26(4):387-409. A group of average- to below average-achieving high school students was asked to read contradictory reports about the status of the global warming debate and answer a series of open-ended questions that related to the nature and process of science. Each report included data to support its conclusions. The researchers examined and coded students’ oral and written responses. On the positive side, the researchers found that:

  • Most students understood that science and social issues are intertwined.
  • Most students were comfortable with the idea that scientific data can be used to support different conclusions and that ideological positions may influence data interpretation.

On the other hand:

  • Almost half of the students were unable to accurately identify and describe data, and some conflated expectations and opinions with data.
  • There was a tendency for students to view the interpretation consistent with their prior opinion as the most persuasive argument – even in cases where they judged the opposite interpretation to have the most scientific merit. This suggests that students may not incorporate scientific information into their decision-making process, dichotomizing their personal beliefs and scientific evidence.

The researchers suggest that instruction should focus on the above two issues and that teachers should encourage students to consider scientific findings when making decisions. In addition, students should be encouraged to deeply reflect on socioscientific issues and consider them from multiple perspectives.

Scharmann, L.C., M.U. Smith, M.C. James, and M. Jensen. 2005. Explicit reflective nature of science instruction: Evolution, intelligent design, and umbrellaology.  Journal of Science Teacher Education  16(1):27-41. Through multiple iterations of a preservice science teacher education course, the researchers designed a 10-hour instructional unit. In the unit, students:

  • attempt to arrange a set of statements along a continuum from more to less scientific
  • develop a set of criteria for making such judgments
  • participate in a set of inquiry activities designed to teach the nature of science (e.g., the black box activity)
  • read and reflect on articles about the nature of science
  • analyze intelligent design, evolutionary biology, and umbrellaology (a satirical description of the field of umbrella studies) in terms of the criteria they developed.

The final iteration of this set of activities was judged by the authors to be highly effective at changing students’ views of the nature of science and perhaps even helping them recognize that intelligent design is less scientific than evolutionary biology. Furthermore, the researchers suggest that using a continuum approach regarding the classification of endeavors as more or less scientific may be helpful for students who have strong religious commitments and that explicit, respectful discussion of religion in relation to science early in instruction is likewise important for these students.

Schwartz, R.S., N.G. Lederman, and B. Crawford. 2004. Developing views of nature of science in an authentic context: An explicit approach to bridging the gap between nature of science and scientific inquiry. Science Education 88(4):610-645. A group of preservice science teachers participated in a program that included 10 weeks of work with a scientific research group, discussions of research and the nature of science, and writing prompts that asked the preservice teachers to make connections between their research and the process of science. Participants were interviewed and observed, and responded to a questionnaire about the nature of science. Eighty-five percent of the participants improved their understanding of the nature of science over the course of the program. The two participants who did not improve their understanding were the two who focused on the content of their research and did not reflect on how it related to the nature of science. Participants also seemed to gain a better understanding of how to teach the nature and process of science explicitly. The researchers conclude that the research experience alone did little to improve students' understanding, but that this experience was important for providing the context in which active reflection about the nature and process of science could occur. They recommend that scientific inquiry in the K-12 classroom incorporate reflective activities and explicit discussions relating the inquiry activity to the nature and process of science.

Zeidler, D.L., K.A. Walker, W.A. Ackett, and M.L. Simmons. 2002. Tangled up in views: Beliefs in the nature of science and responses to socioscientific dilemmas. Science Education 86(3):343-367. A sample of 248 high school and college students was given open-ended questions eliciting their views of the nature of science. In addition, researchers elicited students' views on a socioscientific issue (the appropriateness of animal research) using both a Likert scale item and open-ended questions. From this large sample, 42 pairs of students with differing views of the appropriateness of animal research were selected. These pairs of students were allowed to discuss the issue with each other and were probed by an interviewer. Finally, they were presented with data anomalous to their own view and were probed again on their confidence in the data and their willingness to change their view. Researchers analyzed these students' responses to the open-ended questions using concept mapping and compared their responses to Likert items. They found that students did change their views on the issue as a result of discussion and exposure to anomalous data. They also found that younger students tended to be less skeptical of anomalous data presented to them from an official-sounding report. In only a few cases were students' views of the nature of science obviously related to their analysis of the socioscientific issue. These were mainly situations in which a student expressed a belief that scientists interpret data to suit their personal opinion, and then, correspondingly, the student selectively accepted or rejected evidence according to whether it supported his or her opinion. In addition, many students seemed to believe that all opinions are equally valid and immune to change regardless of the scientific evidence.
The authors conclude that instruction on the nature of science should be incorporated throughout science courses and should include discussion in which students are asked to contrast different viewpoints on socioscientific issues and evaluate how different types of data might support or refute those positions.

Thanks to Norm Lederman and Joanne Olson for consultation on relevant research articles.


Am J Pharm Educ. 2014 May 15;78(4).

Where and How to Search for Evidence in the Education Literature: The WHEEL

An awareness of how and where to search the education literature, and of how to appraise it, is essential to being a teacher scholar (an academic who takes a scholarly approach to teaching), to developing high-quality education research, and to performing the scholarship of teaching and learning. Most pharmacy faculty scholars do not receive training in searching the education literature; thus, a framework for searching the education literature is needed. The framework presented here on where and how to search for evidence in the education literature, referred to as the WHEEL for teaching, is meant to serve as a guide for faculty members in conducting comprehensive and exhaustive literature searches when publishing scholarship of teaching and learning projects, conducting educational research, or approaching one's teaching in a scholarly manner. Key resources to search and methods for searching the education literature are listed and described.

INTRODUCTION

Pharmacy faculty members are faced with many teaching challenges today, including an emphasis on incorporating active-learning strategies using instructional technology tools; measuring learning and outcomes through such things as progress examinations, objective structured clinical examinations, rubrics, and portfolios; and incorporating various instructional strategies such as team-based learning, high-fidelity simulations, and interprofessional education. In the recommendations from the American Association of Colleges of Pharmacy Task Force on Best Evidence Pharmacy Education, Hammer and colleagues recommended that a series of papers on "how to's" for the scholarship of teaching and learning (SoTL) and educational research be published in The American Journal of Pharmaceutical Education (AJPE).1 During the last few years, researchers have provided the academy with numerous key articles in the area of SoTL that guide the development of projects.2-5 What is lacking is a guide to searching the educational literature. Even if faculty members do not perform SoTL or educational research, they should take a scholarly approach to teaching, which requires knowing how and where to search for education literature beyond the pharmacy discipline that supports teaching. Having an awareness of the levels of evaluation used to appraise the educational literature is also important. The intent of this article is to guide faculty members in conducting a comprehensive and exhaustive literature search and in appraising the literature in preparation for publishing scholarship of teaching and learning, conducting educational research projects, and being a teacher scholar. These guidelines are summarized by the acronym WHEEL, which stands for Where and How to search for Evidence in the Education Literature. The analogy of a wheel is appropriate because as new information is learned and the search process is repeated, the wheel moves forward.

WHY TO SEARCH THE EDUCATION LITERATURE

Health professionals are familiar with the practice of evidence-based medicine for the best treatment of patients. For this purpose, they identify a question, search the literature, appraise the evidence, apply the evidence, and evaluate the outcomes. 6,7 This process is also applicable to teaching methods used to educate pharmacy students. In order for educators to take a scholarly approach to teaching, they must first review and be familiar with what is already known about different teaching methods. Evidence, including published literature as well as data collected internally, should be used whenever educators are doing anything from trying a new active-learning technique in the classroom to revising an entire curriculum. Searching the literature is a key component in this process. Reviewing the evidence prior to implementing a change can help to inform educators about potential pitfalls and best practices. A comprehensive literature search should be conducted to review all potentially valuable information. Assistance with how and where to search for education-related literature is useful as most pharmacy faculty scholars do not receive this training.

Published literature provides background support to indicate what is known about a specific area. It also assists in identifying the needs or unknowns in the particular area of study. Reviewing the background support and data then allows educators and researchers to formulate their specific ideas or research in an evidence-based manner and provides focus to support a unique project that would have a great impact in the classroom and could add to the literature. Literature from other disciplines or from higher education in general may have already evaluated a similar teaching method and what they found may be applicable to pharmacy education. Analyzing the existing evidence in light of the current proposed project is an important step for conducting research that matters. 8 Incomplete searches may lead to a biased publication and an inappropriate interpretation of the impact of a project. 9 It is important to know what has been done in the past in order to build on established knowledge. Without first consulting published literature and data, researchers will be repeating work already accomplished by other scholars.

WHERE TO SEARCH FOR EDUCATION LITERATURE

Because there is no one database that houses all education literature, multiple searches within multiple sources are necessary in order to find all pertinent literature. Common databases (listed in Table 1 ) include thousands of citations relevant to health education. Unfortunately, the databases may not always have appropriate subject headings for searching health education literature and not all health education resources are indexed. Multiple searches using various terminology and use of related citations may be helpful to ensure complete searches in databases. Foundations and organizations focusing on education are good sources of information that may be useful to supplement literature searches in databases. Many of these are listed in Table 2 . Search functions within these resources vary greatly. Ancestry searching, which involves reviewing the bibliographies from any useful citations found in these searches, should be used to identify additional citations.

Table 1. Databases Useful in Researching the Education Literature


Table 2. Education Resources Available Within Organizations and Foundations


It is important to also manually search pharmacy-related education journals such as the American Journal of Pharmaceutical Education and Currents in Pharmacy Teaching and Learning . Additionally, scanning other education-related journals and information from other health professions is essential as well. Databases may not include all education-related journals, and indexing of these journals, if they are included, may not be comprehensive. Table 3 includes a list of key journals related to teaching and notes if they are currently indexed within the most commonly used databases. This may include only partial indexing of the journals (eg, indexing only 1 year of a journal’s articles). These journals may be added or deleted from the databases over time. The resources listed in the tables are not all-inclusive lists, but may be ideas for beginning searches. 10

Table 3. Key Teaching and Learning Journals for Educators in the Health Professions and Their Inclusion in Popular Databases a


Finally, experts in the field may be contacted and Web searches for grey literature (any literature that is not controlled by commercial publishers) may be conducted. Grey literature for health education research may include items such as dissertations, theses, committee reports, government reports, technical reports, etc. 11

HOW TO SEARCH THE EDUCATION LITERATURE

The systematic approach to searching for any type of literature also applies to searching for education literature. The steps to searching for education literature are summarized in Figure 1. The figure includes numbers to order the steps; however, it is drawn as a circle (a wheel) because, as new information is learned, the process may be repeated to incorporate the new information. The first step in the process is to clearly define the question or hypothesis and the scope of the question. If a team of scholars is looking to create a tool to assess an interprofessional experience including nursing and pharmacy students, they would need to think about specifically which aspect(s) of the interaction they are looking to assess. Are they looking to assess students' learning, contributions, attitudes, etc? Are they looking only at experiential learning vs classroom lectures, or a one-time event vs multiple interactions? They should think about the participants, intervention, comparison, outcome, and inclusion and exclusion criteria when determining the scope of the query. This is followed by determining the essential concepts for which more information is needed. In the interprofessional experience example, this may include understanding what is meant by interprofessional vs interdisciplinary.


Figure 1. WHEEL for Searching the Education Literature. (*Involve experts as the question is defined and again as methodology is developed.)

Search terms should be developed, taking time to think about all possible related terms, different spellings, and synonyms to ensure a complete search. Related citation searching may be used to identify additional appropriate citations. Searching the term “interprofessional” in Medline links to the Medical Subject Heading (MeSH) terms interprofessional relations or interdisciplinary communication. Searching the term “interdisciplinary team” links to the MeSH term “patient care teams.” Definitions for the MeSH terms should be reviewed to make sure that the chosen term is the most appropriate for the search. Subject trees may also be reviewed to identify sub-themes. Multiple combinations of MeSH terms and keywords should be used.
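As a loose illustration of the advice above, combining synonym groups into a single search string can be done by OR-ing related terms within a concept and AND-ing across concepts. The helper function and term lists below are hypothetical examples, not an official MeSH mapping or a specific database's syntax:

```python
# Hypothetical sketch: assembling a boolean search string from synonym groups.
def build_query(concepts):
    """OR terms within each concept group, AND across groups."""
    groups = []
    for synonyms in concepts:
        groups.append("(" + " OR ".join(f'"{term}"' for term in synonyms) + ")")
    return " AND ".join(groups)

query = build_query([
    ["interprofessional", "interdisciplinary", "multidisciplinary"],
    ["pharmacy students", "nursing students"],
])
print(query)
```

Running multiple variants of such queries, swapping in different synonyms and subject headings, is one way to approximate the "multiple combinations of MeSH terms and keywords" recommended above.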

Once the key concepts are identified, resources should be identified in which to complete searches. Searches should begin in tertiary resources (resources that provide an overview and summarize information from other resources), followed by searches in databases.11,12 All resources likely to cover the essential concepts should be included in the search. Databases differ in the types of journals and publication dates for the journals they index. Brief descriptions of the content of several databases are included in Table 1. When searching for information about evaluating interprofessional interactions between nursing and pharmacy students, any resources that would have information about nursing, pharmacy, and interprofessional assessment should be accessed. Medline, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and education databases would be appropriate resources in which to begin. On the other hand, it would likely be a waste of time to search a database such as PsycINFO for this type of literature.

A Web search for information, especially from appropriate government agencies, organizations, and foundations, would help to ensure a complete search. For assessment of interprofessional interactions, multiple organizations listed in Table 2 may be appropriate to search; the search may also include the accreditation standards of each discipline involved for interprofessional education requirements. If the interaction involved managing patients with diabetes, a search for organizations related to diabetes management and education should also be conducted. Searches should not be limited only to pharmacy literature, but should include interprofessional interactions among any health fields.

These searches should be supplemented by manual searching of appropriate journals that may not have been indexed in the databases searched or may not have been completely indexed, and ancestry searching of useful citations. Once all of the information found is organized and appraised, areas where information is lacking should be identified; the process should be repeated to identify and fill in any gaps.

APPRAISING THE EVIDENCE

The final step in the process is to manage the information found by appraising the evidence and determining what is relevant to the project. The concept of using the "best evidence" is recommended.1,8 This concept has parallels to the Cochrane Collaboration, which has a mission to gather the best evidence for medical practice. The Best Evidence Medical Education (BEME) Collaboration and the Campbell Collaboration are 2 groups that strive for the systematic review of studies in teaching and education domains.1 The BEME Collaboration suggests appraising the evidence in medical education using the QUESTS criteria.8 Details of the QUESTS dimensions are provided in Table 4. Educational information retrieved from the research literature should be selected based on the highest quality of evidence in design, utility, extent, strength, target, and setting. Guidelines for evaluating educational manuscripts have been published.8,13 Following appraisal of the current literature, areas of additional research should be identified so that researchers can target their education projects. Learning to search for high-quality research in education and being able to appraise its quality is critical for the enhancement of SoTL projects as well as for using a scholarly approach to teaching.

Table 4. QUESTS Method for Evaluating Evidence in Educational Practice 1,4,a


The analysis of the design and methodology of high-quality education literature is not dissimilar from that of basic science or clinical science literature evaluation. Kirkpatrick described the different levels of evaluation that can be used in SoTL or in evaluating effectiveness of teaching in terms of a hierarchy. 14 These 4 levels from lowest to highest include: evaluation of reaction such as satisfaction, evaluation of learning such as knowledge or skills acquired, evaluation of behavior such as changes brought to the real world, and evaluation of results such as impact of behavior changes on society, outcomes, or costs. As the scholar progresses in this hierarchy, the level of evidence for the educational research is strengthened. Ideally, researchers in education should be striving to identify strategies that have impacts on the higher levels of evaluation so that these strategies may be reproduced. Higher levels of evidence also would be more useful for enhancing teaching. In searching the education literature, more of the evidence found is often at the lower levels of evaluations, ie, levels 1 and 2, in contrast to evidence published in the basic science or clinical science literature.

In the same way that pharmacy educators take an evidence-based approach to research and to caring for patients, it is important for them to take a scholarly approach to teaching. It is essential to effectively search for and use evidence to improve teaching, to identify where there are needs for further research in teaching and learning, and to enhance the application and integration of SoTL. Ideas should be defined and search terms should be identified. A comprehensive search should be conducted in all appropriate resources. Additionally, published literature outside of pharmacy education may be useful in guiding applications to pharmacy. All information found should be appraised to determine its value in improving pharmacy education.


Do open educational resources improve student learning? Implications of the access hypothesis

Affiliation: OpenStax, Rice University, Houston, Texas, United States of America

  • Phillip J. Grimaldi, 
  • Debshila Basu Mallick, 
  • Andrew E. Waters, 
  • Richard G. Baraniuk


  • Published: March 6, 2019
  • https://doi.org/10.1371/journal.pone.0212508


Open Educational Resources (OER) have been lauded for their ability to reduce student costs and improve equity in higher education. Research examining whether OER provides learning benefits has produced mixed results, with most studies showing null effects. We argue that, given the predictions of the access hypothesis, the methods commonly used to examine OER efficacy are unlikely to detect positive effects. The access hypothesis states that OER benefits learning by providing access to critical course materials, and therefore predicts that OER should only benefit students who would not otherwise have access to those materials. Through the use of simulation analysis, we demonstrate that even if there is a learning benefit of OER, standard research methods are unlikely to detect it.

Citation: Grimaldi PJ, Basu Mallick D, Waters AE, Baraniuk RG (2019) Do open educational resources improve student learning? Implications of the access hypothesis. PLoS ONE 14(3): e0212508. https://doi.org/10.1371/journal.pone.0212508

Editor: James A.L. Brown, National University Ireland Galway, IRELAND

Received: December 20, 2018; Accepted: February 5, 2019; Published: March 6, 2019

Copyright: © 2019 Grimaldi et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: Data used in this report were generated via code, and are available on GitHub ( https://github.com/openstax/oer-simulation-study ).

Funding: Authors PJG, DBM, and AEW are employees of OpenStax, a non-profit OER textbook publisher based out of Rice University. RGB is the founder. OpenStax provided support in the form of full or partial salaries for authors PJG, DBM, AEW, & RGB, but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.

Competing interests: Authors PJG, DBM, and AEW are employees of OpenStax, a non-profit OER textbook publisher based out of Rice University. RGB is the founder of OpenStax. This does not alter our adherence to PLOS ONE policies on sharing data and materials.

Introduction

The textbook has long been a critical component of the education system at all levels. In addition to providing a scaffold for content discussed in a course, textbooks have historically been the primary learning resource for students. For a variety of market-based reasons, the price of textbooks has risen dramatically over the last two decades, outpacing the price increases of all goods and services by almost four times [1]. Within higher education, these price increases ultimately fall on the students, who are responsible for procuring their own course materials. In response to these price trends, many educators have turned to open educational resources (OER) [2, 3]. While OER refers to any educational resource that is openly licensed and freely distributed, for the purposes of this document we will limit our discussion to OER textbooks. Over the last decade, OER has risen dramatically in popularity. According to OpenStax, the leading producer of OER textbooks, adoption of OER textbooks has saved students an estimated $500 million since 2012 [4]. Moreover, recent survey data [5] suggest that OER textbooks now rival commercial textbooks in terms of overall market share. More importantly, textbook prices appear to have recently leveled off for the first time in three decades, an effect which is partially attributed to increased competition from OER alternatives [6].

While the OER movement has been successful in reducing the cost of educational materials, many have wondered whether adoption of OER affords additional benefits, such as improved student learning outcomes [ 7 ]. This question has motivated a flurry of empirical research comparing the grades of students who used OER textbooks to students who used a commercial textbook (for a recent review, see [ 8 ]). Overall, this research has produced somewhat mixed results. Several studies have found no significant differences between OER and traditional textbooks on student grades [ 9 – 12 ]. Occasionally, however, negative or positive effects are found. One study [ 13 ] found no significant difference in regular exam scores, but did find a benefit of OER adoption on a specialized exam score. Another study [ 14 ] compared OER and traditional texts across seven high school classes and found a negative effect of OER in two classes, and no significant difference in the other five classes. In a study comparing OER and a commercial textbook across fifteen courses [ 15 ] a negative effect of OER was found in one course, a positive effect of OER in five courses, and a non-significant difference in the remaining nine courses. A six-semester study comparing OER to non-OER [ 16 ] observed a negative effect in two semesters, and a positive effect in one semester. However, a later analysis revealed these effects were likely artifacts of confounding variables. A study comparing digital and print OER books to traditional print text across three course exams [ 17 ] found a positive effect of digital OER on only one exam. A large scale evaluation of OER [ 18 ] found positive effects of OER adoption on student grades. It is worth noting that this research varies considerably in terms of quality and rigor. Nearly all used quasi-experimental designs, and some failed to control for possible confounding variables (e.g., [ 18 ]; see [ 19 ] for a discussion). 
Nevertheless, the important thing to note is that the majority of comparisons in the literature find null effects of OER adoption on learning outcomes.

Why do most comparisons of OER to traditional materials fail to find a positive effect of OER? On one hand, the primary goal of OER is to offer an alternative to commercial textbooks that is comparable in quality, but free and openly licensed. Assuming an OER textbook is no different in quality, there are no meaningful differences that would explain effects on learning outcomes; license and cost certainly should not affect learning at a cognitive level. In this sense, the frequency of null effects is expected. On the other hand, the price of a textbook can affect whether a student decides to purchase it, and a student cannot learn from a textbook they do not have. If we reasonably assume that having a textbook is better for student learning than not having one, then students who forgo purchasing the textbook are at a learning disadvantage. Thus, adoption of OER would be effective as a learning intervention because it ensures that all students have access to the textbook, and would therefore result in better learning outcomes (for similar discussion, see [8, 15, 18]). We refer to this idea as the access hypothesis.

If access is the primary mechanism for how OER might affect learning outcomes, then we can see that current research approaches are not well suited for detecting an effect of OER adoption. In most educational research, an intervention is expected to impact all students who receive the intervention, and its impact is measured by comparing students who receive the intervention to students in a control condition. However, the access hypothesis predicts that an OER intervention should only affect a subset of students—specifically those who would not otherwise have access to the textbook. Students who are willing or able to purchase the textbook should not be affected. Yet, every study that has evaluated OER efficacy to date has treated OER as any other intervention, specifically by comparing an entire sample of students who received the OER intervention to a sample of students who did not. Indeed, this is the approach recommended by the most active researchers of the field [ 20 ]. The problem with this approach is that the effect of the intervention is washed out by students who are not expected to be affected by the intervention. To draw an analogy, the current research approach in OER is the equivalent of measuring the effect of a pain relieving drug on a sample of people who are mostly not in pain. In this sense, we should not expect to observe effects of an OER intervention, even if we believe that having access to a textbook is beneficial to learning.

If the impact of OER is measured across an entire sample of students, then it is necessary for researchers to consider the textbook access rates prior to implementation of OER. Past research reveals some insights as to what the expected textbook access rates are in a typical classroom. A recent survey of over 22,000 Florida students enrolled in public universities and colleges found that close to 66.5% of students reported not purchasing a textbook at some point in their academic career [ 21 ]. While this statistic is concerning, the data are limited in that they do not indicate what the access rates are in any given classroom. Just because a student avoids purchasing a textbook once does not mean they will repeat the behavior for all of their classes. Indeed, more targeted research reveals that access rates can be very high. A survey of 824 University of Colorado at Boulder Physics students [ 22 ] found that 97% purchased the required texts. Another survey of 1023 students at an undisclosed university across a range of introductory level science courses [ 23 ] found that 96% of students reported purchasing their required texts. A survey of 162 students in a political science course [ 12 ] found a 98% access rate. We can imagine that if an OER intervention were conducted on these samples, it would be very difficult to observe a positive effect because the existing access rates are already so high. Of course, we cannot expect access rates to be high in every classroom. An internal survey at Virginia State University [ 24 ] reported that only 47% of students purchased textbooks. Unfortunately, they did not report how many students were included in this sample. Regardless, it is fair to say that the rate of textbook access will vary across contexts and student populations. As we will see, the access rate of any given population can have a profound effect on the results of research aimed at evaluating the impact of OER adoption.
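The dilution argument can be made concrete with a back-of-the-envelope calculation (our illustration, not a computation from the paper): if a fraction a of students would have obtained the book anyway, the class-wide effect of OER shrinks from d to roughly (1 − a)d, and the sample size needed to detect it grows accordingly. Here we use Lehr's rule of thumb (n ≈ 16/d² per group for ~80% power at two-sided α = 0.05) as a rough approximation:

```python
# Illustrative only: dilution of a textbook-access effect by the access rate a.
def diluted_effect(d, a):
    """Class-wide effect size when only a fraction (1 - a) of students benefit."""
    return (1 - a) * d

def approx_n_per_group(d):
    """Lehr's rule of thumb: n per group for ~80% power, two-sided alpha = 0.05."""
    return 16 / d ** 2

for a in (0.47, 0.90, 0.97):             # access rates reported in the surveys above
    d_obs = diluted_effect(0.5, a)       # assume a true access effect of d = 0.5
    print(f"a={a:.2f}  diluted d={d_obs:.3f}  n per group ~ {approx_n_per_group(d_obs):,.0f}")
```

With a 97% access rate, even a sizable assumed effect of d = 0.5 dilutes to about 0.015, requiring on the order of 70,000 students per group to detect; at a 47% access rate, a few hundred per group would suffice.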

In this paper, we argue that the standard approach taken in past research on OER efficacy is severely limited in its ability to properly evaluate the impact of OER. This limitation is governed by the textbook access rate that exists prior to an OER intervention. To formally illustrate this point, we conducted a series of simulated experiments designed to mimic a typical study of OER effectiveness. We used these simulated experiments to measure the likelihood that a standard OER efficacy study correctly rejects the null hypothesis (i.e., its statistical power [ 25 ]). A simulation study is useful because we can examine the expected results of an experiment under perfectly controlled conditions. Most real-world educational research is plagued by instructor artifacts, confounding variables, and random differences between groups. Moreover, it is incredibly difficult to implement a randomized controlled trial with real students. In a simulation, we do not have to worry about any of these constraints. A simulation is also necessary in this case because traditional power analysis does not allow us to vary the number of students who might be affected by an intervention, as predicted by the access hypothesis. The primary goal of these simulations was to examine the influence of access rate on statistical power in a typical study of OER effectiveness, and to make inferences about the likelihood of detecting a positive effect of OER adoption in a real-world study. We then apply this model to evaluate existing studies.

Simulation Study

For each simulated experiment, we first generated a sample of n student scores s from a normal distribution s ∼ N ( μ , σ 2 ), truncated between 0 and 100. These scores represented the final grade of each student in the course on a 100 point scale, where μ was the sample mean and σ was the sample standard deviation. Second, students were randomly determined to have access to the textbook at a rate of a , and not have access at a rate of 1 − a . Third, students were randomly assigned to either an OER or Non-OER condition with the constraint that both conditions must have an equal size. Fourth, in order to simulate the effect of access, we decreased the score of the students in the Non-OER condition who were previously determined to not have access to the textbook. The scores of students in the OER condition were unaffected, representing the fact that all of these students now have access to the book. The magnitude of the score decrease was equivalent to dσ . The parameter d represents the effect size [ 26 ] of having access to a textbook. Finally, we fit a regression model that predicted student score by condition, and tested the condition coefficient against 0 by using a standard t-test. An overview of the simulation is shown in Fig 1 .
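The five steps above can be sketched in code. The following is a Python analogue of the authors' R simulation (the actual code is linked later in the paper); the function name and default parameter values are ours, chosen to match the text.

```python
import numpy as np
from scipy import stats

def simulate_experiment(n=1000, mu=70, sigma=20, a=0.7, d=0.25, seed=None):
    """Run one simulated OER experiment; return the p-value of the condition test."""
    rng = np.random.default_rng(seed)
    # 1. Draw n final scores from N(mu, sigma^2), truncated to [0, 100].
    lo, hi = (0 - mu) / sigma, (100 - mu) / sigma
    scores = stats.truncnorm.rvs(lo, hi, loc=mu, scale=sigma, size=n,
                                 random_state=rng)
    # 2. Randomly mark students as having textbook access at rate a.
    has_access = rng.random(n) < a
    # 3. Randomly split students evenly into OER and Non-OER conditions.
    oer = np.zeros(n, dtype=bool)
    oer[rng.permutation(n)[: n // 2]] = True
    # 4. Non-OER students without access lose d * sigma points; OER students
    #    are unaffected, since all of them now have the book.
    scores = scores - d * sigma * (~oer & ~has_access)
    # 5. Test the condition difference (equivalent to the t-test on the
    #    condition coefficient in a score-by-condition regression).
    return stats.ttest_ind(scores[oer], scores[~oer]).pvalue
```

A single call such as `simulate_experiment(n=1000, a=0.8)` yields one experiment's p-value; the power analysis below repeats this many times.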

For each experiment, a sample of student scores was generated, and each student was determined either to have or not have access to the textbook. Students were then randomly assigned to either an OER or Non-OER condition. The effect of access was simulated by reducing the scores of students determined not to have access, but only in the Non-OER condition. Lastly, a statistical test comparing the OER and Non-OER conditions was performed. See text for more information.

https://doi.org/10.1371/journal.pone.0212508.g001

Parameter values.

For determining the value of n , we wanted to use sample sizes similar to those of past studies of OER. Of the 42 direct comparisons of OER and non-OER materials on course grade [ 9 , 12 , 14 – 18 , 27 , 28 ], 95% involved sample sizes smaller than 5,000 students. Thus, we examined levels of n between 100 and 5,000. We address sample sizes larger than 5,000 later in this report.

For generating our sample distribution, we set μ to 70 and σ to 20. We chose these values because we felt they were representative of a typical classroom, and similar to those we have observed in past research. However, because the effect of an intervention is measured by relative differences in scores, the actual values used here do not have much influence on the outcome of the simulation.

For the a parameter, which represents the proportion of students who are expected to have access to the textbook, we wanted to examine access rates that would be expected in a typical college classroom. We examined six levels of a : 40%, 50%, 60%, 70%, 80%, and 90%. This range is likely to cover most student populations that might appear in OER research.

The d parameter represents the effect of having access to a textbook versus not having access. We anticipated that the literature would provide clear direction for setting this parameter. To our surprise, despite the ubiquity of textbooks in higher education, there are few studies examining the effects of textbooks in general (both OER and non-OER) on learning outcomes. To our knowledge, there are no experimental studies that would afford calculation of a reasonable effect size of textbook usage. There are some correlational studies that at least show positive relationships between textbooks and learning. One study [ 22 ] reported a moderate correlation between the number of reading assignments a student completed and their final course grade in conceptual physics courses ( r = .45), but no correlation in calculus-based physics ( r = .07). Another study [ 23 ] found that students who reported regularly reading their textbook had higher grades than students who read their textbooks only occasionally. However, it also found no difference between students who never read their books and those who read regularly. A third study found a positive relationship between student grades and engagement with a digital textbook, even after accounting for general student aptitude [ 29 ]. In sum, these studies show at the very least that use of the textbook can be beneficial to learning. We concluded that if there is an effect of textbook access on learning, it is likely to be small. Thus, we set d to 0.25, which is considered the minimum effect size necessary for an educational intervention to be substantively important [ 30 ].

For each level of n and a , we repeated the experiment 10,000 times and recorded the p -values of each experiment. Statistical power was computed as the proportion of studies with p -values lower than α . All simulations were conducted using R [ 31 ], and based on code presented in [ 32 ]. The full code used for the simulations is available on GitHub ( https://github.com/openstax/oer-simulation-study ).
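This power computation can be sketched as follows. This is a self-contained Python analogue of the authors' R code (linked above), with all function names ours and the number of repetitions reduced from 10,000 for speed.

```python
import numpy as np
from scipy import stats

def one_pvalue(n, a, mu=70, sigma=20, d=0.25, rng=None):
    """One simulated experiment: returns the p-value of the condition test."""
    rng = np.random.default_rng(rng)
    lo, hi = (0 - mu) / sigma, (100 - mu) / sigma
    scores = stats.truncnorm.rvs(lo, hi, loc=mu, scale=sigma, size=n,
                                 random_state=rng)
    no_access = rng.random(n) >= a          # students lacking the textbook
    oer = np.zeros(n, dtype=bool)
    oer[rng.permutation(n)[: n // 2]] = True
    scores = scores - d * sigma * (~oer & no_access)  # harm only in Non-OER
    return stats.ttest_ind(scores[oer], scores[~oer]).pvalue

def estimate_power(n, a, reps=2000, alpha=0.05, seed=0):
    """Power = proportion of simulated experiments with p < alpha."""
    rng = np.random.default_rng(seed)
    return np.mean([one_pvalue(n, a, rng=rng) < alpha for _ in range(reps)])
```

Running `estimate_power` across a grid of `n` and `a` values reproduces the qualitative pattern described next: power falls sharply as the pre-existing access rate rises.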

We examined the proportion of simulated experiments that rejected the null hypothesis at the standard α of .05 (i.e., power). The results are shown in Fig 2 . As in all experiments where samples are drawn from normal distributions, the probability of success increases with n [ 25 ]. However, we also see that access rate ( a ) has a strong influence on the ability to detect the effect of OER. When access is very low, experiments have a much higher likelihood of correctly rejecting the null hypothesis at smaller n . This makes sense, because there are more students in the sample who can be impacted by the intervention. However, as a increases, it pulls the probability of success down. Indeed, when a is large, very large numbers of students are required to detect a significant effect. To illustrate the strength of this relationship, an OER experiment with 10,000 students will have an 89.3% chance of success when the access rate is 70%. However, the same 10,000-student experiment conducted on a sample with an access rate of 80% will have only a 56.5% success rate. The situation gets considerably worse when the access rate is 90%: such an experiment would have only a 19% success rate. The fact that an experiment with 10,000 students would have such a low chance of correctly rejecting the null hypothesis demonstrates the influential role of access rate.

A study sample’s access rate to textbooks prior to adopting OER can severely hinder the likelihood of detecting an effect of OER, even at large sample sizes.

https://doi.org/10.1371/journal.pone.0212508.g002

Examination of past studies

The results of these simulations raise the question: how are we to interpret previous studies that have examined the effects of OER interventions on learning outcomes? To this end, we used the simulation procedure described previously to conceptually replicate prior studies on OER efficacy, with the goal of estimating the probability that such a study would have detected an effect of OER, given the sample sizes reported in those studies, at different levels of access.

Selection of research.

From the literature, we were able to find 42 direct comparisons of OER to traditional materials across 9 publications [ 9 , 12 , 14 – 18 , 27 , 28 ]. We did not include a study or comparison if tests of statistical significance were not reported. Further, we only included comparisons that used a continuous performance metric as the dependent variable (i.e., grades on a 0–4 scale or test scores). Comparisons that used non-performance-based dependent variables (e.g., drop or withdrawal rates) were not included, as they are not suitable measures of learning. Some studies (e.g., [ 15 ]) examined both grades and pass rates, the latter being a dichotomous version of grade (i.e., C- or better). As an aside, it is not clear to us why both measures are sometimes used, as they are likely highly correlated. In cases where both measures were used, we only included the comparisons on course grade. We did not examine studies that only examined pass rates, because such studies use non-parametric statistics that are not applicable to the power analysis we conducted. Also, several studies conducted both an analysis that collapsed across different courses or semesters and separate analyses for each of those levels [ 16 , 27 ]. In these cases, we included only the separate analyses, not the overall analyses. In the case of [ 18 ], the authors collapsed across multiple courses without conducting separate analyses for those courses; here, we included only the overall analysis. The complete list of comparisons is shown in Table 1 .


https://doi.org/10.1371/journal.pone.0212508.t001

Simulation power analysis.

For each of the prior comparisons, we conducted 10,000 simulations using the same sample sizes reported by the authors. Since there is no way to determine the true access rate of the samples used in these comparisons, we used a range of a values (40%, 60%, and 80%). All other assumptions of the prior simulations were the same ( μ = 70, σ = 20, d = .25, α = .05), with one exception. We noted that many of the prior studies under consideration had imbalanced numbers of Non-OER and OER students, typically with far more Non-OER students. Rather than assuming equal sample sizes as in the previous simulations, we matched the sample size allocation ratio of each comparison study. For example, the study in [ 15 ] reported one comparison with 4615 students, 4531 in the Non-OER condition and 84 in the OER condition. In our simulations of this study, we drew samples of 4615 students and allocated 98.2% to the Non-OER condition and 1.8% to the OER condition.

The estimated power for each comparison, at access rates of 40%, 60%, and 80%, is shown in the far-right columns of Table 1 . We can see that for most comparisons, even under the most optimistic scenario (i.e., 40% access), the expected likelihood that the comparison would yield a positive significant effect of OER is very small. Only the comparisons with very large sample sizes had substantial power at the 40% access level [ 16 , 18 ], and even some of these had low power at an 80% access rate. Note that for many studies, power at the 80% access level is so low that correctly rejecting the null hypothesis is no more likely than falsely rejecting a true null hypothesis ( α = .05)! Thus, at an 80% access rate, these experiments were just as likely to detect a real effect of OER as a false one. Interestingly, one comparison [ 15 ] had low power even with a sample size over 4000 students. This was due to the extreme imbalance of students between the Non-OER and OER conditions.
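As an aside, the cost of this kind of imbalance can be quantified with a standard power-analysis identity (our illustration, not a computation from the paper): the harmonic mean of the two group sizes gives the balanced per-group sample size to which an imbalanced design is effectively equivalent.

```python
def effective_n(n1, n2):
    """Balanced per-group sample size equivalent to an n1 vs. n2 design
    (the harmonic mean of the two group sizes)."""
    return 2 * n1 * n2 / (n1 + n2)

# A near-even split of 4615 students keeps essentially all of its size,
# while the 4531 vs. 84 split behaves like roughly 165 students per group.
balanced = effective_n(2307, 2308)
imbalanced = effective_n(4531, 84)
```

This is why a comparison with over 4000 students can still be badly underpowered: its effective size is closer to that of a small classroom study.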

Given the results of this power analysis, we can determine the expected number of comparisons that should have correctly rejected the null hypothesis by summing the power values for each level of access. Across the 42 reported comparisons, we would only expect to observe significant effects of OER 18, 11.5, or 5.2 times, for the 40%, 60%, and 80% access rates, respectively. Note that only 9 of the 42 comparisons on Table 1 found positive effects of OER on learning outcomes. Even though this number seems very low, the results of the simulation power analysis demonstrate that this is well aligned with what should be expected, even if there is a real effect of OER. Of course, our power estimates assume perfect conditions. These real world studies have many confounding factors to contend with, so it is likely that the real power of these studies was even lower than what we estimated. In this case, it is possible that the number of significant effects found so far could even be higher than what would be expected.

Discussion

Over the last decade, there has been a fair amount of research examining whether the adoption of OER textbooks improves student learning outcomes relative to commercial textbooks. The majority of this research has found no significant difference between OER and commercial texts when measuring learning performance. We have argued that one possible reason why most tests of OER efficacy fail follows from the predictions of the access hypothesis. The access hypothesis, formally introduced by us, states that OER might improve learning outcomes relative to traditional course materials by improving access to the textbook. Therefore, an OER intervention should only affect the subset of students who would not otherwise have access to the textbook. Through the use of simulated power experiments, we have demonstrated that the textbook access rate of a research sample prior to the intervention has profound effects on statistical power. As the access rate of a sample increases, the power of an experiment decreases dramatically. If the access rate is high, even studies with very large sample sizes should produce null results most of the time.

Overall, our analysis helps to provide better context to the studies that have examined OER efficacy. Even under ideal conditions, detecting positive effects of OER should be extremely difficult. The fact that most studies have found null effects is not surprising; in fact, these null effects are expected. Furthermore, our results stress the importance of being skeptical of studies that report positive effects of OER interventions. This is especially true if the study used relatively small sample sizes. In our simulation experiments, even comparisons with 1000 students are more likely to discover null effects than positive ones, even with access rates at the low end of the scale.

Implications for OER research

These results have several implications for future research in OER. First, we recommend that researchers attempt to measure textbook access rates in their student population prior to implementation of OER. If access rates are very high, it is important to consider that the likelihood of detecting an effect on learning outcomes should be very low. The effect of access rates should be considered when interpreting null results.

Second, it is critical that future research work towards determining the true effect size of textbook access on learning. Determining the true effect size will afford far more reliable power calculations and, more importantly, enable more meaningful interpretation of research studies. For instance, a high-powered study that produces a null result is more meaningful than a low-powered study that produces a null result, because the null result is unexpected in the high-powered case. Unfortunately, accurate measures of power require a reliable measure of effect size, and the vast majority of studies on OER efficacy do not report enough statistics for computation of effect size estimates. It is critical that researchers report all relevant test statistics, p-values, sample sizes, means, and measures of dispersion. We encourage reviewers and editors of future research to insist that authors report these measures.

Finally, it is common for OER researchers to conduct comparisons without making an explicit hypothesis or prediction. Hypotheses and predictions are critical, because they help guide research designs and interpretation of results. In the case of the access hypothesis, having an explicit mechanism makes it clear that the intervention should only affect some students. We cannot help but wonder if so many low power null effects would have been published had the access hypothesis been formally proposed earlier.

Potential theoretical mechanisms for OER efficacy

In this paper, we have discussed access as being the primary mechanism for why OER might improve learning. It is certainly possible that adoption of OER could affect learning outcomes in other ways. One idea is that the open nature of OER affords the ability to teach in ways that are not possible given the constraints of closed source materials [ 7 ]. Another idea is that OER may provide better or worse quality than the commercial counterpart (e.g., [ 17 ]). However, as mentioned previously, these ideas are rarely expressed as a formal hypothesis, and the mechanisms are rarely tested as part of the research. One exception is the work of [ 17 ], which compared learning outcomes from an OER and commercial textbook, but also examined the perceived quality and readability of the books. While differences in perceived quality and readability were observed, these differences did not translate into strong benefits to learning [ 17 ]. It should be noted that other mechanisms would not be subject to the same power constraints as access, as these mechanisms would presumably affect all students in the study. Thus, detecting quality difference effects, for example, should require far fewer students than access effects.

With regard to the access hypothesis, we assumed throughout this paper that students who have access to the textbook will use it in effective ways. Of course, access is no guarantee of learning. A student with access to a textbook could easily choose to ignore it or engage in ineffective learning strategies. These students are no better off with a textbook than they were without one. This fact creates a general boundary condition on the ability of access alone to affect learning. Practically speaking, the effect of access on learning depends critically on usage after access to the materials is supplied. If students engage with the book in ineffective ways, then access will be an irrelevant factor. To this end, we simply caution readers that while access is an important step towards improving learning, it is not sufficient.

Limitations

It is important to point out that our simulated experiments provide only a proxy measure of statistical power. In particular, these simulations estimate power under an unrealistically optimistic experimental scenario. The situation only gets more difficult in real-world studies, which have instructor effects, student effects, and other confounds to control for. These variables add noise to the data and reduce the probability of success even further. Thus, a researcher hoping to estimate statistical power with real-world data should understand that their actual power will be lower than the values shown in Fig 2 .

Another limitation of our analysis of past research is that we assumed an effect size d of 0.25, rather than computing observed effect sizes post hoc. Unfortunately, as previously mentioned, the vast majority of the research we reviewed did not report sufficient statistics to conduct such an analysis. If the true effect size of OER adoption is larger, then these studies may have had considerably more statistical power than we estimated. To this end, we conducted a supplemental analysis estimating the minimum effect size required for an OER study of varying sample sizes and access rates to achieve an acceptable level of power. This analysis is explained in detail in S1 Appendix , and the results are shown in S1 Fig . Should additional research suggest that the effect size differs from the one we used, S1 Fig can be used to determine whether the power of these past studies was adequate. Also, we remind readers that the source code of our analysis is available so that anyone can rerun it with varying levels of d .

Relevance to educational research

While it is tempting to think that the research failings discussed in this paper are unique to OER, the reality is that they result from a common mistake in educational research (and social science research more broadly): overgeneralizing the influence of an experimental variable without critically considering the context in which that variable is manipulated. The importance of contextual factors was articulated decades ago by [ 33 , 34 ], who noted the fragile nature of many of the landmark findings in memory research (e.g., levels of processing [ 35 ]). In particular, it was observed that minor changes to an experimental design could completely change the outcome of a manipulation. The critical insight of [ 33 ] was that variables not manipulated by the experimenter are just as important as those that were. The materials used, the final assessment, the types of students, and the interactions among all these factors are critically important. Thus, if one wants to understand whether an intervention affects learning, one needs to be aware of the context in which that intervention takes place.

Of course, researchers and practitioners are naturally compelled to focus on variables of interest in isolation. To illustrate, one of the most influential studies in education is a meta-analysis of over 800 factors that affect student learning [ 36 ]. While compiling such an extensive list of factors is quite an achievement, in our view it presents an unrealistic picture of the nature of learning. It leads one to a misplaced belief that certain techniques are simply better than others. However, even the strongest factors listed by [ 36 ] could quite easily be rendered ineffective by applying them to certain topics, certain populations of students, or certain outcome measures. For example, it is well known that the effectiveness of an educational strategy or intervention can depend on the prior aptitude of the individuals in a study [ 37 ]. The very existence of such interactions prevents us as a field from ever discovering “laws” of human learning [ 38 ] or making broad sweeping claims about any intervention. In sum, the effectiveness of any educational intervention will almost always depend on the context in which it is implemented.

Failing to consider the importance of context can lead to poor study design and misleading conclusions. In this paper, we discussed the importance of student access in moderating the effectiveness of OER. Past researchers assumed OER would have a general effect on learning and failed to consider contextual influences, which led to an abundance of underpowered and ill-designed studies. A similar case comes from the oft-maligned enterprise of media comparison studies [ 39 – 42 ]. Media comparison studies typically evaluate student learning from a standard instructional strategy delivered on different types of “media” (e.g., computer vs. paper). Like OER studies, most of these have produced null results, and they have been vehemently criticized for decades as being without merit [ 39 ]. Indeed, [ 39 ] took the strong stance that media is only a vehicle for delivering instructional strategies and that media itself will never influence learning. While this is often true, others [ 41 , 42 ] have argued that many media comparison studies employed standardized research designs that were not well suited to measuring the unique mechanisms afforded by the media evaluated in the study. [ 43 ] reviews a wide variety of media studies that reveal the nuances of when media can have a meaningful influence on student outcomes. Thus, by carefully considering the context in which an intervention occurs and is evaluated, and by devising appropriate hypotheses to test, one can design studies that effectively and appropriately measure the unique merits of the intervention.

Conclusions

The goal of educational research is to answer important questions about education through scientific analysis. However, studies that are not grounded in theory or lack statistical power do not provide meaningful insights for answering these questions. On the contrary, such studies only muddy the waters, and move us further from determining the truth. Despite the large number of studies that have been conducted on OER efficacy, these studies unfortunately do not provide much information about the potential impacts of OER on student learning. While the large number of null effects may suggest that OER adoption may not provide much benefit to student learning, the reality is these studies do not provide much insight, because they were incapable of detecting positive effects even if they did exist. As it currently stands, the question of whether OER affects student learning remains unanswered.

Supporting information

S1 Appendix. Supplemental simulation analysis.

https://doi.org/10.1371/journal.pone.0212508.s001

S1 Fig. Minimum effect size required to detect an effect of OER at 80% success rate as a function of access rate and sample size.

For a given value of a , the minimum value of d necessary to detect an effect of OER is very sensitive to sample sizes n below 1000. Conversely, for a given value of n , the minimum value of d is extremely sensitive to the access rate.

https://doi.org/10.1371/journal.pone.0212508.s002

Acknowledgments

The authors would like to thank Micaela McGlone for her project management support on this project.

  • 1. Perry MJ. Chart of the Day: The Astronomical Rise in College Textbook Prices vs. Consumer Prices and Recreational Books; 2016. Retrieved from http://www.aei.org/publication/chart-of-the-day-the-astronomical-rise-in-college-textbook-prices-vs-consumer-prices-and-recreational-books/ .
  • 4. Ruth D. 48 Percent of Colleges, 2.2 Million Students Using Free OpenStax Textbooks This Year; 2018. Retrieved from: https://openstax.org/press/48-percent-colleges-22-million-students-using-free-openstax-textbooks-year .
  • 5. Seaman JE, Seaman J. Opening the Textbook: Educational Resources in U.S. Higher Education, 2017; 2017. Retrieved from https://www.onlinelearningsurvey.com/reports/openingthetextbook2017.pdf .
  • 6. Perry MJ. Wednesday Afternoon Links; 2017. Retrieved from http://www.aei.org/publication/wednesday-afternoon-links-30/ .
  • 9. Allen G, Guzman-Alvarez A, Molinaro M, Larsen DS. Assessing the Impact and Efficacy of the Open-Access ChemWiki Textbook Project; 2015. Retrieved from https://library.educause.edu/resources/2015/1/assessing-the-impact-and-efficacy-of-the-openaccess-chemwiki-textbook-project .
  • 10. Bowen WG, Chingos MM, Lack KA, Nygren TI. Interactive Learning Online at Public Universities: Evidence from Randomized Trials; 2012. Retrieved from http://mitcet.mit.edu/wpcontent/uploads/2012/05/BowenReport-2012.pdf .
  • 14. Robinson TJ. The Effects of Open Educational Resource Adoption on Measures of Post-Secondary Student Success [Unpublished Dissertation]. Brigham Young University; 2015.
  • 20. Hilton J, Wiley D, Fischer L, Nyland R. Guidebook to Research on Open Educational Resources Adoption. Retrieved from http://openedgroup.org/wp-content/uploads/2016/08/OER-Research-Guidebook.pdf ; 2016.
  • 21. Florida Virtual Campus. 2016 Student Textbook and Course Materials Survey; 2016. Retrieved from http://www.openaccesstextbooks.org/pdf/2016_Florida_Student_Textbook_Survey.pdf .
  • 26. Cohen J. Statistical Power Analysis for the Behavioural Sciences. 2nd ed. Hillsdale, NJ: Erlbaum; 1988.
  • 30. Institute of Education Sciences, U S Department of Education. What Works Clearinghouse, What Works Clearinghouse Procedures Handbook (Version 4.0). Retrieved from http://whatworks.ed.gov ; 2017, October.
  • 31. R Core Team. R: A Language and Environment for Statistical Computing; 2017. Available from: https://www.R-project.org/ .
  • 32. Coppock A. Power Analysis Simulations in R; 2013. Retrieved from http://egap.org/content/power-analysis-simulations-r .
  • 33. Jenkins JJ. Four points to remember: a tetrahedral model of memory experiments. In: Craik FM, Cermak L, editors. Levels of Processing in Human Memory. Hillsdale, NJ: Erlbaum; 1979. p. 429–446.
  • 36. Hattie J. Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Abingdon, Oxon: Routledge; 2008.
  • 37. Cronbach L, Snow R. Aptitudes and Instructional Methods: A Handbook for Research on Interactions. Oxford, England: Irvington; 1977.

Advertisement

Advertisement

Learning to Argue Through Dialogue: a Review of Instructional Approaches

  • Review Article
  • Open access
  • Published: 04 September 2021
  • Volume 34 , pages 477–509, ( 2022 )

Cite this article

You have full access to this open access article

how do educational researches and literature differ from instructional resources

  • Chrysi Rapanta   ORCID: orcid.org/0000-0002-9424-3286 1 &
  • Mark K. Felton 2  

8738 Accesses

16 Citations

3 Altmetric

Explore all metrics

Over the past 20 years, a broad and diverse research literature has emerged to address how students learn to argue through dialogue in educational contexts. However, the variety of approaches used to study this phenomenon makes it challenging to find coherence in what may otherwise seem to be disparate fields of study. In this integrative review, we propose looking at how learning to argue (LTA) has been operationalized thus far in educational research, focusing on how different scholars have framed and fostered argumentative dialogue, assessed its gains, and applied it in different learning contexts. In total, 143 studies from the broad literature on educational dialogue and argumentation were analysed, including all educational levels (from primary to university). The following patterns for studying how dialogue fosters LTA emerged: whole-class ‘low structure’ framing with a goal of dialogue, small-group ‘high structure’ framing with varied argumentative goals, and studies with one-to-one dialectic framing with a goal of persuasive deliberation. The affordances and limitations of these different instructional approaches to LTA research and practice are discussed. We conclude with a discussion of complementarity of the approaches that emerged from our analysis in terms of the pedagogical methods and conditions that promote productive and/or constructive classroom interactions.


Over the past 20 years, a broad and diverse research literature has emerged to address how students learn to argue through dialogue in educational contexts. This field has sprung, in part, from a proliferation of research into the benefits of argumentation for learning (see Andriessen and Baker 2014 ; Asterhan and Schwarz 2016 ), which is based on the view that argumentation supports knowledge construction, by situating learned facts, or knowledge, within evidence-based explanatory frameworks (Leitão 2000 ). In fact, extensive research has shown that by engaging in argumentative practices students from childhood to adulthood learn how to better structure their reasoning, consider alternative viewpoints (e.g., Kuhn et al. 2016a ; Larrain et al. 2019 ; Nussbaum and Edwards 2011 ; Reznitskaya et al. 2009 ), and develop more nuanced understandings of the topics they argue about (e.g., Asterhan and Schwarz 2007 ; Iordanou et al. 2019 ; Zohar and Nemet 2002 ). Moreover, several scholars (e.g., Rapanta 2021 ; Kuhn 2018 , 2019 ; Osborne et al. 2013 ) have confirmed that argumentation can be used as a pedagogical method to help students think critically, solve problems, and make decisions that concern their lives as individuals, as members of a society, and as world citizens.

Argumentation may involve an individual discourse activity, like argumentative writing (Klein et al. 2019 ) or graphical representation (Noroozi et al. 2018 ; Nussbaum 2008 ), or it may take the form of a social discourse activity in which dialogue partners collaboratively construct, critique, and reconcile arguments in the service of knowledge construction (Kuhn 2015 ). This view, rooted in social constructivist theories of learning, explains how skills of argument emerge and are refined on the social plane before becoming internalized to manifest as individual reasoning and learning (Billig 1987 ; Kuhn and Crowell 2011 ; Mercer and Howe 2012 ). However, for this socially rooted learning to take place, an immersion in dialogue is necessary (Cavagnetto 2010 ; Prawat 1991 ). According to Chinn and Clark ( 2013 ), when students learn how to argue through dialogue, they learn ‘how to engage effectively in the practice of argumentation’ (p. 321).

The current research is divided regarding what this ‘effective engagement’ entails and what it takes for researchers and educators to achieve it (Asterhan et al. 2020 ; Kim and Wilkinson 2019 ). On one extreme, there is the non-instrumental view, in which dialogue represents a form of social meaning-making, valuable in itself and not defined by external gains or pre-defined goals (Clarà 2021 ). From this perspective, learning to argue naturally emerges from adopting dialogic moves that facilitate the exchange of ideas or perspectives, such as asking open-ended questions, elaborating on others’ ideas, exploring competing viewpoints, coordinating views, and reflecting on dialogue metacognitively (Hennessy et al. 2016 ; Howe et al. 2019 ). On the other extreme, there is the intentional instructional framing of dialogue to produce specific argumentative gains. From this perspective, learning to argue involves using dialogue to define problems, produce claims, critiques and rebuttals using evidence, and coalesce arguments in ways that produce more accurate, developed, or sound outcomes (Baker 1999 ; Chen et al. 2016 ; Zohar and Nemet 2002 ).

From the non-instrumental point of view, engaging in dialogue is the primary focus of LTA, whereas from the argument-oriented point of view, engaging in argumentation is the primary focus. Independently of the approach taken, dialogue needs to have certain characteristics and/or lead to certain gains so that it can fulfil its learning potential (Resnick et al. 2018). Although extensive research is available on the educational dialogue that leads, directly or indirectly, to LTA outcomes, these outcomes are not defined consistently across studies, and the methods adopted to define and measure them do not always involve controlled comparison with traditional teaching methods, which limits the valid conclusions that can be drawn about the effectiveness of instructional approaches (Hoadley 2006). Our goal with this review is to better distinguish and define the range of approaches taken by researchers and teachers to elicit and advance students' reasoning in the social context of classroom interaction, with effectiveness being defined either in terms of interventions on the quality of discourse and dialogue, or as a characteristic of the dialogue itself. By characterizing and mapping these various approaches to LTA, we hope to contribute to the current discussion of how to promote students' argumentation competence through interactive discussions (Noroozi et al. 2018). In the section that follows, we provide definitions of dialogue and argumentation, as well as our own conceptualization of LTA from an instructional framing perspective.

Defining Dialogue and Argumentation

From a social constructivist perspective, dialogue is a critical activity through which knowledge construction, in the broad sense of meaning making, takes place (Ford 2012 ). Dialogue partners must commit to taking turns articulating their own ideas as speakers and seeking to understand their partner’s ideas as listeners (Wells 2007 ). This dialogicality (Koschmann 1999 ) introduces a tension between the effort to enter into a shared understanding ( intersubjectivity ) and the effort to distinguish one’s own thoughts from another’s ( alterity ). As a result, knowledge is constructed between speakers, rather than being transmitted from one to the other. This basic idea of dialogic exchange cannot be taken for granted in a classroom context. When it comes to teacher-student interactions, research has shown that much of teacher discourse is monological (Scott et al. 2006 ), meaning the dynamic and equitable exchange of roles between speakers does not take place. Similarly, when it comes to student-student interaction, much of the classroom dialogue is either disputative or cumulative (Mercer 2004 ), neither of which leads to the productive resolution or contrast of perspectives.

Argumentative dialogue is a dialogue activity in which the tension between intersubjectivity and alterity is addressed by the balance, constantly constructed by participants, between construction and critique (Ford 2008 ). On one hand, dialogue participants use language to construct and support a contested or potentially contested assertion (Rapanta 2019 ; Rapanta and Macagno 2019 ); on the other hand, they continuously strive to coordinate between the evidence supporting their own position and evidence supporting alternative or contrary positions (Kuhn 1999 ). In other words, argumentative dialogue involves an exchange ‘in which the participants not only defend their own claims, but also engage constructively with the argumentation of their peers’ (Nielsen 2013 ; p. 373). In so doing, they propose and critique the grounds for accepting a possible resolution to the doubt or uncertainty driving argumentation. Productive engagement, aimed at the resolution of doubt or uncertainty, creates the potential for argumentative dialogue to lead to knowledge construction, as speakers engage deeply with each other’s thinking in a critical discussion (Keefer et al. 2000 ). However, there is always the risk of a ‘position-driven’ argument, where authentic efforts at reasoning about opposing claims and evidence are replaced by strategic efforts at winning the exchange regardless of the quality of the arguments under consideration (Keefer et al. 2000 ).

It is because of this dialectical nature—which, as we will show further on, can be more or less explicit depending on how dialogue is framed—that argumentative dialogue can have more possibilities than any other type of dialogue to be both constructive and critical. This quality, as a result, brings several benefits at a social, cognitive, and epistemological level, briefly described below.

Social Outcomes

Research that emphasizes social gains as a result of engagement in argumentative dialogue operates on the assumption that a so-called ‘argument-as-process’ emerges from dialectical exchanges between speakers who hold opposing views on a topic. From this perspective, divergent views fuel a form of discourse in which individuals draw out and challenge the claims and evidence for each other’s conclusions (Felton and Kuhn 2001 ; Kuhn and Zillmer 2015 ). Focusing speakers on the differences in their views encourages them to explore the relative strengths of arguments in ways that are often absent when speakers focus on similarities in their perspectives (Thiebach et al. 2016 ). As a result, the social dynamic of dialectical exchange creates a context in which speakers naturally prompt one another to produce more fully elaborated and carefully examined arguments than might be later observed in independent reasoning (Kuhn et al. 1997 ; Kuhn 2019 ; Larrain et al. 2019 ). This results in the development of the social, critical thinking skill of ‘antilogos’, that is, one’s ability to identify limitations in one’s own assumptions and positions, which may lead to totally different or even oppositional assumptions and positions (Billig 1987 ).

Cognitive Outcomes

Several higher-order cognitive and metacognitive processes are inherent in argumentative dialogue due to the complexity and diversity of the task of putting forward a position, defending it with sufficient and relevant evidence, addressing counterarguments, generating valid rebuttals to further strengthen one’s own side, or making revisions to one’s argument in light of valid critique. Together, these activities draw on a set of cognitive, meta-cognitive and meta-strategic resources (Rapanta et al. 2013 ), including inferential abilities, evaluation abilities, argument appraisal, as well as more advanced strategies of theory-evidence coordination and undermining an opponent’s point of view with counterevidence. In a climate of fake news and alternative facts, learning to identify, assess, and integrate information from sources to draw valid conclusions is a direct benefit of argumentative dialogue.

Epistemological Outcomes

Argumentative reasoning, including both the construction and evaluation of arguments, and epistemic cognition are highly intertwined (Chinn et al. 2011 ; Iordanou et al. 2016 ). Argumentative dialogue, as a social manifestation of argumentative reasoning, helps epistemic cognition to become explicit, especially when it comes to the acquisition of norms of what counts as a good argument and argumentative dialogue (Kuhn et al. 2013 ). As argumentative dialogue externalizes thinking, it creates a context for reflective judgments about the validity of each speaker’s claims and evidence. As a result, it opens the door to new insights into the nature of knowing, standards for judging certainty, and practices for constructing knowledge from carefully vetting claims and evidence.

Although not all educational dialogues are aimed at argumentation as a goal, argument-related gains like the ones described above are part of the learning potential of dialogue (Resnick et al. 2018). However, when dialogue is not designed to be argumentative, those social, cognitive, and epistemological gains often emerge as desired by-products or even manifestations of the dialogic process, even though they were not the direct objective of the dialogue. To be able to bridge the two approaches, i.e., the more instrumentalist one, which directly aims at argument gains, and the more ontological one, which welcomes argument gains as part of dialogic engagement (Clarà 2021), we provide below a conceptualization of LTA instructional approaches that applies to both cases.

LTA Instructional Approaches

According to Morrison et al. (2019), the fundamental components of any instructional design, no matter what the designed learning activity/environment is, are: the Objectives, which 'provide a map for designing the instruction and for developing the means to assess learner performance' (p. 17); the Methods, which define how the subject content and/or skills are best learnt; the Learners, referring to the characteristics of the learners for whom the activity/environment is designed; and the Evaluation, referring to ways of determining the extent to which learning is achieved. Applying this instructional effectiveness model directly to the LTA literature is not possible, due to the great variety of approaches that exist within it, as discussed above. Therefore, what we propose is to transform the generic instructional effectiveness model by Morrison et al. (2019) into a framework for reviewing the LTA literature, one that can adapt to the diversity of approaches encountered within it. Our MeDOL framework adapts Morrison et al.'s (2019) effectiveness components as follows: Methods refer to the authors' approach to framing educationally effective dialogue, and to the fostering methods used to increase LTA outcomes; Dialogue goals refer to the epistemic goals pursued by the study participants during the LTA activity; Outcomes refer to the participants' LTA gains described in each study; and Learners refer to the main student and learning characteristics described in each study, such as age/educational grade, learning structure (i.e., whole class, small groups, dyads, or mixed), and disciplinary field.

It has been argued that framing methods shape, or at least characterize, the process by which educational goals are achieved through dialogue (Ford and Wargo 2012 ). When the goal is LTA, dialogue can either be framed a priori as argumentation, or emerge as a result of the dialogic activity. In this sense, argumentative dialogue may emerge through a more structured approach, where the learning environment is intentionally designed to produce specific argumentative interactions and outcomes; or it may follow a less structured approach, where the learning environment is designed to promote dialogic norms, without specifically focusing on the production of particular argumentative interactions or outcomes, as the ones described above. We will henceforth refer to this distinction as ‘high vs low structure’ approaches to argumentative dialogue.

Low-structure approaches to educational dialogue do not explicitly focus on structuring argumentation as a goal-oriented activity itself, but rather on establishing norms for dialogue and its productivity. In this approach, educational (argumentative) dialogue is productive insofar as participants actively engage in exploring different perspectives. Argumentation may emerge as a natural result of exploring these perspectives, but it is not set as the primary outcome. An example of this approach is ‘exploratory talk’ (Mercer 2004 ).

In contrast to low-structure dialogic approaches, the high-structure ones focus on argumentation as an explicit goal, and not a by-product of students’ interactions. This is primarily done through focusing on argumentative knowledge construction, therefore on the types of dialogic activities students should engage in so that some type of new knowledge or understanding can emerge. This type of educationally effective dialogue is particularly common within the fields of computer-supported collaborative learning (CSCL) (see, for example, the argument scripting approaches—Jermann and Dillenbourg 2003 ) and science education (see, for example, the Science Writing Heuristic approach—Hand et al. 2021 ).

A separate case of high-structure dialogic approaches focuses on the resolution of a controversy as the vehicle for promoting advances in argumentation. Within this approach, which can be called dialectic (Asterhan 2013 ), the characteristics of the issue and the nature of the task assigned to students as part of its resolution are of high priority in terms of designing the interaction. Regarding the issue, the more controversial it is, the greater the possibilities that authentic argumentation will emerge, in the form of a critical discussion (van Eemeren and Grootendorst 1992 ; Walton and Krabbe 1995 ). Moreover, a clear need for the issue resolution must be present either in the form of a dilemma (e.g., Kuhn 2018 ; Zohar and Nemet 2002 ), or in the form of a decision among multiple alternatives (e.g., Jiménez-Aleixandre 2002 ; Garcia-Mila et al. 2013 ).

Dialogue Goals

From a dialogue theory perspective, the identification of types of dialogue according to dialogue goals has attracted researchers' attention (see, for example, Walton 1989, 1998). When it comes to educational contexts, it is still not clear what such a typology should take into consideration as a 'goal'; therefore, different 'goal' approaches result in different dialogue types. For example, Osborne et al. (2016) proposed two main interlinked goals for science argumentation: constructing and critiquing scientific explanations. Keefer et al. (2000) propose four types of educational dialogues by crossing 'convergent' or 'divergent' participant approaches, on one hand, with how successful they are in fulfilling their goal, on the other, giving: (a) a successful divergent approach called 'critical discussion' (a term borrowed from Van Eemeren and Grootendorst 2003), (b) an unsuccessful divergent approach called 'eristic discussion', (c) a successful convergent approach called 'explanatory inquiry', and (d) an unsuccessful convergent approach called 'rapidly reaching consensus'.

In our view, both proposals discussed above are insufficient to capture the complexity of dialogue goals in LTA instructional settings. On one hand, Osborne et al.'s (2016) proposal clearly focuses on the epistemic aspects of instructional framing for LTA without refining the dynamics of knowledge construction and critique. On the other hand, Keefer et al.'s (2000) proposal focuses on the dialogue's fluidity and versatility 'on the go', and not as part of an instructional framing. We view dialogue goals as epistemic goals enacted by the student participants in each dialogic context. Examples of such dialogue goals are the ones proposed by Berland and Reiser (2009), namely: sensemaking, articulation, and persuasion. Extending Berland and Reiser's (2009) initial proposal, sensemaking dialogues aim at making sense of what others say and/or a specific phenomenon under consideration. Articulation focuses on either articulating the relationship between sources of evidence or coordinating theories with evidence; it is about understanding at least two types of data and establishing the relationship between them. Finally, persuasion dialogues focus on either arriving at a consensus (deliberation) or proving one's side or position to be the best according to epistemic standards. When persuasion is the ultimate goal in educational dialogue, a deliberation phase is always necessary, particularly when a final decision must be made. For this reason, and also to avoid the misconception that persuasion must be confrontational (Kruger 1993; Micheli 2012), we refer to this goal as a deliberation goal, rather than persuasion.

The traditional distinction between argument1 and argument2 (O’Keefe 1992 ) proposes two legitimate and complementary approaches to argument, one that focuses on argument-as-product (i.e., something that a person makes) and another that focuses on argument-as-process (i.e., something that a person engages in).

Arguments-as-products may comprise both structural and functional elements of discourse (for more on this distinction, see Rapanta et al. 2013; Macagno 2016). In terms of structural elements, effective argumentative reasoning might take some of the following forms: (a) effective integration and use of evidence (e.g., Berland and Reiser 2011; Kuhn et al. 2013); (b) effective integration of arguments and counterarguments, i.e., balanced or dialogical arguments (e.g., Kuhn and Udell 2007; Polo et al. 2016); and (c) effective use of counter-arguments and rebuttals (e.g., Kuhn and Udell 2003; Crowell and Kuhn 2014). It may also relate to the elaboration of arguments, as with the structural elements proposed by Toulmin (1958). More precisely, data, warrants, and backings are related to the identification and use of evidence, whereas claims, qualifiers, and rebuttals are related to the skill of construction and critique. Of course, to be able to assert that the use of specific argument elements is effective, other pragmatic criteria may be needed, such as conceptual complexity, coherence, and relevance.

In terms of argumentative function, effective dialogue takes the form of discursive moves that elicit reasoning, operate on reasoning, or redirect conversation. Early work in the transactive nature of dialogic reasoning (Berkowitz and Gibbs 1983 ; Kruger 1993 ) plays an important role in many functional models of argumentative reasoning. This work focuses on the ways in which speakers engage with and operate on each other’s thinking through dialogue, with an emphasis on how dialogue can be used as a way to socially elicit, elaborate, critique, and revise arguments. Generally, analyses of discursive moves in argumentation focus on the ways in which instructional interventions, scaffolds, or contextual variables facilitate these discursive processes. Some studies also go beyond the analysis of individual moves to look at how these moves are coordinated to form either coherent dialogic processes or effective argumentative strategies. It is common to speak of ‘dialogic transactivity’ in the first case, and of ‘dialectic transactivity’ in the latter (Vogel et al. 2016 ).

In recent years, social scientists have begun to challenge the notion that good argumentation involves distilling rational argument from the soup of social and emotional conflict. Instead, argumentation is seen as a dynamic interplay of social, emotional, and cognitive functions (Asterhan 2013 ; Gilbert 2004 ; Plantin 2004 ). On the one hand, emotions play an important role in promoting active engagement in reasoning and complement rationality (Lipman 2003 ). On the other hand, there is clear evidence that negative emotions drive the distortion of facts, increase miscommunication, and lead arguments astray in disputative argumentation (Polo et al. 2016 ). For example, Polo and her colleagues (Polo et al. 2016 ) propose that attempts to maintain positive emotion may be the reason that cumulative talk fails to draw out the kind of constructive criticism found in exploratory talk. Therefore, the ability to modulate and navigate emotions in a dialogue is an important component of argumentation (Andriessen et al. 2011 ).

There is also a social dimension to the regulation of argumentation, since the norms of social interaction can also have a direct impact on the progress of argumentative dialogue. In fact, there seems to be a close relationship between norms for social interaction and emotional tension in argumentative dialogue. At the most basic level, turn-taking is a social norm that, when broken, can negatively impact dialogue. When speakers talk over one another, cut each other off, or dominate the conversation, they undermine the appearance of good faith, and threaten other speakers’ face, i.e., perceived public image (Chiu 2008 ). Conversely, positive social regulation, such as active listening among peers, and between teacher and students (Alexander 2017 ; Michaels et al. 2008 ), can serve to facilitate effective argumentative dialogue by creating a safe space for the critical evaluation of ideas.

Finally, metadialogue (Krabbe 2003 ) is a socio-epistemic regulator of argument-as-process that contributes to effective argumentation and at the same time functions as a manifestation of argument gains. It may include appeals to standards for valid reasoning, where speakers make an explicit statement about the nature of claims and evidence and the ways in which they are coordinated to draw a conclusion (Macagno et al. 2015 ). Metadialogue can also be used to regulate the goals, process, and outcomes of argumentative dialogue (Felton et al. 2015a , b ): speakers may explicitly coordinate goals, focus dialogue on points of disagreement, or propose solutions to apparent contradictions in their views. Finally, metadialogue can be used to impose social norms for regulating argumentative dialogue (Kuhn and Zillmer 2015 ; Michaels et al. 2002 ). Across these disparate applications, metadialogue can be understood as the conscious and explicit attempt to optimize the social construction of arguments. Seen in this way, metadialogue represents a form of epistemic cognition operating on the process of argumentation (Kuhn et al. 2013 ). Indeed, metacognitive knowledge and regulation of argumentation are naturally epistemological because of the role that argumentation plays in establishing the strength, validity, truth, or applicability of conclusions through the coordination of claims and evidence. When speakers choose to direct or correct the course of argumentation with metadialogue, they manifest epistemological aims, knowledge, and values about argument.

Lastly, all the above (Methods, Goals, Outcomes) make sense only in relation to a concrete type of learner in a concrete context. The learner characteristics expected to be found in all reviewed empirical studies are: (a) the age/educational grade; (b) the disciplinary field or topic on which the dialogue is held; and (c) the setting of the learning situation, meaning whether the dialogue takes place in a whole-class discussion format, in small groups, or in one-to-one settings.

The Present Study

To address the existing tension between 'dialogue' as the main focus and 'argumentation' as the main focus of research, we propose to look at LTA as a common instructional process that can be pursued either explicitly or as a by-product. Our approach differs from theoretical approaches focusing on dialogue purposefulness (Alexander 2017) or argumentation goal-orientedness (Walton 2013): rather than looking at the concrete aims participants strive for during their engagement in argumentative dialogue, we opt for studying the different ways in which learners' engagement in the practice of argumentation is framed, fostered, and evaluated by educational researchers. We propose that by studying the literature in this way, we might better understand how different approaches to LTA might inform and complement one another. We believe that doing so might bring more coherence to our understanding of how to foster argumentative dialogue and to what ends.

Our research questions are:

How do educational researchers frame argumentative dialogue and its gains in instructional settings?

How is argumentative dialogue scaffolded and fostered in different contexts?

Method

The integrative review was selected as the most appropriate way to address the research problem of mapping existing educational research focusing on learning to argue (LTA) through dialogue. Given the complexity of argumentative discourse and the breadth of approaches taken to describe and study the phenomenon, we decided to include both experimental and non-experimental studies in our review, whereas a meta-analysis would have limited our scope to experimental studies only (Whittemore and Knafl 2005). In addition, our goal was not to measure the effectiveness of dialogue on students' learning (such a goal would have been impossible given the variety of methodological approaches), but to understand how learning to argue is actually carried out in different instructional contexts. We therefore drew on studies from both the educational dialogue and argumentation fields, while limiting our search to those studies that offered an evaluation of the quality of discussions revealing or leading to some type of learning-to-argue gains. The indicators we used for the LTA evaluation are included in the search keywords described below.

A systematic search of peer-reviewed articles was conducted using four large databases: ISI’s Web of Science (WOS), Elsevier’s Scopus, EBSCO, and PROQUEST. We intentionally aimed at studies explicitly focusing on some manifestation and assessment of dialogue quality, defined in a variety of ways, such as productive, constructive, effective, strategic, or persuasive. For each one of the searches, the keywords used were: ‘argument*’ or ‘dialog*’, on the one hand, and ‘effect*’ or ‘construct*’ or ‘productive’ or ‘strateg*’ or ‘persuas*’ as LTA quality indicators, on the other.
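The wildcard keyword pairs described above can be approximated with regular expressions, for instance when post-filtering records exported from the databases. The following sketch is illustrative only: the `matches_query` helper and the patterns are our own approximation, not the actual query syntax of WOS, Scopus, EBSCO, or PROQUEST.

```python
import re

# Topic terms: 'argument*' or 'dialog*' (the * wildcard becomes \w*).
TOPIC = re.compile(r"\b(argument\w*|dialog\w*)", re.IGNORECASE)
# LTA quality indicators: 'effect*', 'construct*', 'productive', 'strateg*', 'persuas*'.
QUALITY = re.compile(r"\b(effect\w*|construct\w*|productive|strateg\w*|persuas\w*)",
                     re.IGNORECASE)

def matches_query(text: str) -> bool:
    """A record is retained only if it matches one topic term AND one quality indicator."""
    return bool(TOPIC.search(text)) and bool(QUALITY.search(text))

print(matches_query("Constructing arguments in classroom dialogue"))  # True
print(matches_query("A history of rhetoric"))                         # False
```

Combining a topic term with a quality indicator mirrors the two-part keyword design reported above, in which dialogue/argumentation vocabulary alone was not sufficient for inclusion.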

The initial search yielded 3013 results, after excluding unpublished manuscripts, book chapters, and conference proceedings to ensure a higher quality of studies through stricter peer-review criteria. A first screening of these documents based only on the title resulted in a sample of 945. This reduction was mainly due to the frequent use of the term 'argument' in its ordinary, non-technical sense. Because several databases were searched simultaneously, the pool was further reduced to 761 articles after the exclusion of duplicates. A second screening based on the abstract resulted in 341 articles. For this second screening, we used the following inclusion criteria: (a) that the article was empirical rather than theoretical or methodological; (b) that the article addressed K-16 education; and (c) that the study addressed teacher-student and/or student-student interactions in either face-to-face or computer-supported classroom contexts. The selected articles underwent a third screening based on their full text. To the previous inclusion criteria, the following two were added: (a) that the quality of dialogue and/or reasoning was assessed in some way; and (b) that the interactions were held in the classroom or another educational setting (e.g., a computer laboratory). This final screening resulted in 145 articles, with two cases of pairs of articles reporting on the same studies; we therefore ended with a total of 143 studies published between 1997 (date of the first included source) and 2020 (through November, and only documents in final published form).

Coding Process

All 143 studies were coded in terms of the MeDOL framework presented in the theoretical part of this paper. Eleven coding categories emerged from our analysis (Figure 1 ). Four of these variables (coach-based, materials-based, peers-based fostering methods, and argument-as-process) were openly coded, and the remaining seven (dialogue approach, task-based scaffolds, dialogue goal, argument-as-product, dialogue setting, educational grade, and subject field) were defined a priori, with sub-categories explained below.

Figure 1. The coding scheme (categories marked with an asterisk were open-coded)

During the coding process, the two authors randomly selected 20% of the studies and double-coded them independently for three high-inference variables, namely dialogue approach (low structure/high structure/dialectic), dialogue goal (sensemaking/articulation/deliberation), and argument-as-product (main/elaborated/complex/dialogic/dialectic). The inter-rater reliabilities were satisfactory for all three variables (κ = .776, κ = .727, and κ = .702, respectively), and disagreements were resolved through discussion.
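The kappa values reported here are Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance given each coder's label frequencies. A minimal sketch of the computation follows; the labels are invented for illustration and are not taken from the study's data.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters assigning nominal codes to the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items on which the coders agree.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical double-coding of ten studies by dialogue approach.
a = ["low", "low", "high", "dialectic", "high", "low", "high", "dialectic", "low", "high"]
b = ["low", "low", "high", "dialectic", "low", "low", "high", "dialectic", "low", "high"]
print(round(cohens_kappa(a, b), 3))  # → 0.844
```

Values above roughly .70, as in the three variables reported here, are conventionally read as acceptable agreement for high-inference coding, with residual disagreements resolved by discussion.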

Dialogue Approach

As mentioned in the theoretical part of this review, three sub-categories were used to describe the different approaches to LTA: (a) low-structure dialogic, (b) high-structure dialogic, and (c) dialectic. A main distinction between the high versus low structure approaches to argumentative dialogue is that in the low structure approach, dialogic interaction has the predominant ‘formative potential’ (Chin and Teou 2009 ), shaping students’ ideas and thinking by creating a shared dialogic ethos (Littleton and Mercer 2013 ). In contrast, in the high structure approach, it is the design of the learning environment that supports students’ enactment of meaningful argumentative practices (Ravenscroft 2000 ; Wu and Krajcik 2006 ). Similarly, a main distinction between the dialectic and the high structure dialogic approach is that the high structure guides students through a defined set of procedures to reach a specific outcome (e.g., an answer to a problem), whereas the dialectic approach focuses on eliciting opposing views to encourage each party to consider their arguments in a framework of alternatives.

Task-based Scaffolds

These refer to task components that have been shown to be relevant or significant for the quality of argumentative dialogue in the various studies. Among the task-based scaffolds, we pre-defined the following sub-categories: (a) the issue(s)/topic discussed; (b) the class organization (seating/group structure); (c) how the task is organized in terms of content and actions (task structure/script); or (d) a specific instructional technique adopted as a task activity (e.g., philosophical circle, Collaborative Reasoning, etc.).

Dialogue Goal

As mentioned in the theoretical part of this review, three sub-categories were used to describe the different dialogue goals of LTA: (a) sensemaking, (b) articulation, and (c) deliberation/deliberative persuasion. According to Dougherty et al. (2000), ‘people cannot collectively use knowledge unless they first make shared sense of it’ (p. 323). Sensemaking is therefore a starting point for any constructive dialogue (i.e., a dialogue that explicitly aims at constructing new knowledge). When learners engage in sensemaking as their main task, they question each other, clarify content, build on each other’s ideas, and share their individual explanations of a phenomenon. As part of this task-process, several cognitive processes can take place, such as describing multiple aspects of a phenomenon, comparing two or more phenomena, or describing different types/cases of a phenomenon (Meyer et al. 2015).

Examples of studies focusing on sensemaking as a primary task for students are Lee, Kang, and Kim (2015) in science (students were asked to explain a scientific phenomenon using modelling strategies) and Chisholm and Loretto (2016) in a language classroom (‘how students made meanings in interaction by dialoguing with other students, texts, and ideas’, p. 1).

Articulation focuses on either articulating the relationship between sources of evidence or coordinating theories with evidence. It is about understanding at least two different types of data and establishing a relationship between them. In articulation-oriented dialogue tasks, students exchange, contrast, and coordinate their interpretations of a phenomenon using some type of data (information) as evidence. Based on Baker (1999), Veerman et al. (2000) define articulation as a knowledge transformation process during which ‘already stated information is evaluated and integrated into the collective knowledge base in such a way that, a new insight or a new direction transpires, that can be used to answer questions or to solve problems’ (p. 272). Whereas sensemaking prepares the ground for joint knowledge construction and learning with others, articulation makes this goal explicit. Examples of studies focusing on articulation as a primary task for students are Kim and Song (2006) in science (all students were asked to defend their scientific evidence-based reports in front of classmates, who would challenge them with questions, without the goal being to choose the report that best explained the phenomenon at hand) and Jadallah et al. (2011) in a language classroom (‘One of Ms. Jackson’s principal objectives was to have children support their arguments with story evidence’, p. 204).

Finally, deliberation focuses on either arriving at and defending one’s position on a topic or reaching consensus after weighing alternatives against epistemic standards. The activities characterizing deliberation dialogue involve addressing explicit disagreement to reach a conclusion, decision, or final state of a debate, revising one’s position as necessary. Defining deliberation as the main task of learning to argue implies an explicit focus on increasing the plausibility of one position, either through accepting it as the most accountable one (e.g., by consensus or compromise) or through persuading the other party using logical arguments. Sensemaking and articulation are both elements of the deliberative process, as opposing claims must first be elaborated and then substantiated in order to be critically evaluated. Examples of studies focusing on persuasion/deliberation as a primary task include Felton et al. (2015a, b), where science students used critical dialogue to evaluate solutions to a socio-scientific problem, and Muller-Mirza et al. (2007), where students engaged in a historical debate through role-playing.

Argument-as-Product

The sub-categories and codes we used to capture argumentative reasoning outcomes, when these were expressed as functional or structural products of discussion (see the theoretical part of this review), were: claim (C), reason (R), claim quality (Cq), evidence (E), questions (Q), Toulmin’s (1958) Argument Pattern (TAP) structure (e.g., claim-data-(warrant)-backing), connection between claim and reasons (C/R), evidence quality/types (Eq), types or quality of reasons (Rq), transactive moves (T), counterargument and/or rebuttal (CAR), evidence-based counterargument and/or rebuttal (CAR/E), revised or integrated argument (RE), quality/types of counter-argumentation strategies (CARq), and use of critical questions by students (Qq). Table 1 describes the five levels of argument quality, introduced previously, based on the presence or combination of the above elements.

Study Demographics

In terms of the research design employed, the 143 studies comprised 82 (57%) descriptive, 37 (26%) quasi-experimental, and 24 (17%) experimental studies. When it comes to students’ educational grade, 43 of the 143 studies (30%) focused on primary school children, 43 (30%) on middle grades, 26 (18%) on secondary students, 28 (20%) on university students, and the remaining 3 (2%) on more than one grade. Finally, regarding disciplinary field or type of issue discussed by students, 61 studies (43%) were about science, 32 (22%) about general interest/social issues, 17 (12%) about language/literature, 11 (8%) about mathematics, 11 (8%) about socio-scientific issues, 10 (7%) about a combination of fields, and 1 about history.
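The reported shares are simple rounded proportions of the 143-study sample, and the breakdowns can be reproduced directly (the counts below are taken from the text above):

```python
# Counts reported in the text; percentages are rounded shares of N = 143.
designs = {"descriptive": 82, "quasi-experimental": 37, "experimental": 24}
grades = {"primary": 43, "middle": 43, "secondary": 26, "university": 28, "mixed": 3}

def shares(counts):
    """Return the total and each category's percentage share, rounded to integers."""
    n = sum(counts.values())
    return n, {k: round(100 * v / n) for k, v in counts.items()}

n_design, design_pct = shares(designs)   # n_design == 143
n_grade, grade_pct = shares(grades)      # n_grade == 143
```

Both groupings sum to the full sample, and the rounded shares match the percentages reported in the text.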

Instructional Framing Approaches to LTA

To explore patterns in framing argumentative dialogue, we looked at the relationship between ‘dialogue approach’ and ‘dialogue task’, testing their independence using Fisher’s exact test, with Cramer’s V (φc) as a measure of effect size. Adjusted standardized residuals of +/−2 were used to determine the contribution of individual cells to the omnibus tests (Beasley and Schumacker 1995). The test yielded significant results (p < .0001), with a moderate effect size (φc = .429). Based on the analysis of standardized residuals, the four relationships driving our findings were low-structured sensemaking, high-structured articulation, high-structured deliberation, and dialectic deliberation (Table 2). Given the frequency of their cross-tabulated co-occurrence, these four instructional framings emerging from the analysed studies were identified as study patterns (see Sandelowski and Barroso 2007 for a detailed view of how patterns are defined in literature reviews).
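The cell-level diagnostics used here are standard: Cramer’s V rescales the χ² statistic by sample size and table dimension, and an adjusted standardized residual beyond ±2 flags a cell as contributing to the omnibus association (Beasley and Schumacker 1995). A pure-Python sketch of both computations (the example table is illustrative, not the review’s Table 2 data):

```python
import math

def adjusted_residuals_and_cramers_v(table):
    """table: list of rows of observed counts.
    Returns (chi-square, Cramer's V, matrix of adjusted standardized residuals)."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    resid = []
    for i, row in enumerate(table):
        resid_row = []
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / n
            chi2 += (obs - exp) ** 2 / exp
            # Adjusted standardized residual (Beasley and Schumacker 1995):
            # (O - E) / sqrt(E * (1 - row share) * (1 - column share))
            adj = (obs - exp) / math.sqrt(
                exp * (1 - row_totals[i] / n) * (1 - col_totals[j] / n))
            resid_row.append(adj)
        resid.append(resid_row)
    # Cramer's V: sqrt(chi2 / (n * (min(rows, cols) - 1)))
    k = min(len(table), len(table[0]))
    v = math.sqrt(chi2 / (n * (k - 1)))
    return chi2, v, resid

# Illustrative 2x2 cross-tabulation of approach by task:
chi2, v, resid = adjusted_residuals_and_cramers_v([[10, 20], [20, 10]])
```

A useful sanity check: for a 2×2 table, all four adjusted residuals equal ±√χ².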

Argumentative dialogue evaluation was approached in terms of gains in argumentative reasoning (argument-as-product) and/or socio-emotional and metadialogic gains in argumentation (argument-as-process). In terms of reasoning gains, we had previously identified five types or levels of arguments, comprising both structural and functional elements, namely: main, elaborated, complex, dialogic, and dialectic (see Table 1). A significant association was found between the ‘study pattern’ and the ‘reasoning gains’ using Fisher’s exact test, p < .002, with a moderate effect size, φc = .329. Table 3 shows the reasoning gain types reported per study pattern.

As can be observed in Table 3, a large majority of Pattern 2 (high-structured articulation) studies report an elaborated structure of student arguments (i.e., a TAP, evidence-based claim structure), and a large majority of Pattern 4 (dialectic deliberation) studies focus on integrated arguments (i.e., integrating the dialectical argument-counterargument relationship), whereas no clear association emerged for Pattern 3 (high-structured deliberation) and Pattern 1 (low-structured sensemaking) studies.

When it comes to the reported socio-emotional and metalevel gains, 48 of the 108 pattern studies mentioned at least one of the two argument-as-process gains (i.e., socio-emotional and/or metalevel) as relevant for learning to argue. Of these, 17 studies focused exclusively on socio-emotional argument-as-process gains, 21 focused exclusively on metalevel gains, and 11 considered both types of gains simultaneously. These studies were concentrated among Pattern 2 and Pattern 3 studies.

What different stories do the four patterns tell us about how learning to argue is achieved? Below we give a descriptive account of each of the four identified patterns, each composed of different instructional approaches, tasks, and outcomes related to the ‘umbrella’ learning to argue (LTA) instructional goal.

LTA Pattern 1: Low-structured Sensemaking

Studies belonging to the first pattern (low-structured sensemaking) tend to describe argumentative dialogue as an organic, whole-class, student-driven (Aukerman et al. 2016) or student-dominant (McNeill and Pimentel 2010) dialogue aiming at the co-construction of ideas and shared interpretive authority among teacher and students alike (Chisholm and Loretto 2016). Within this study pattern, argumentative dialogue is a learning conversation (Simon et al. 2008) in which student agency and authority in discourse (Forman et al. 2017) are promoted. These studies tend to define argument quality as a description of different modes of participation, such as public warrantability (Atwood et al. 2010), ‘interthinking’ (Aukerman et al. 2016), agreeing/disagreeing (Topping and Trickey 2007), or divergent thinking (Damico and Rosaen 2009). When they focus on the structural elements of the arguments produced by students, this is mostly the warrant or explanation of reasoning (e.g., Coker Jr and Erwin 2011; Frijters et al. 2008; Langer-Osuna 2015; Langer-Osuna and Avalos 2015; Lee and Majors 2003). In very few cases do they present socio-emotional or metalevel gains as a result of students’ participation in dialogue. Some of these gains are explained below.

Reznitskaya et al. (2012) describe a study in which Philosophy for Children (P4C) sessions provide a discussion frame suitable for constructing new meanings and reflecting on the reasoning processes. Making sense of the concepts discussed, and of the dialogue itself, is therefore the primary goal set for the discussion. In the science context, Simon and her colleagues (Simon et al. 2008) describe how students’ engagement in scientific talk during whole-class discussions can be fostered in ways that lead children to talk about their ideas, clarify their thinking, and, consequently, develop their capacity to reason. In both examples, the gains of dialogue are not separate from the dialogue itself, adopting a participation rather than an acquisition approach to learning (Sfard 1998). In particular, Reznitskaya et al. (2012) characterize LTA dialogues as inquiry dialogues, placing the focus on the fact that there are no right and wrong answers during P4C discussions. Simon et al. (2008) largely define LTA dialogues as argumentative interactions, as opposed to recall responses, characterizing the former as learning conversations. An emphasis on low-structure instructional framing is evident in both definitions. Other studies within this pattern group highlight some manifestations of LTA gains, either as a product or as a process. For example, Topping and Trickey (2007) emphasize that students explicitly agree or disagree based on reasons, and for that to be possible, a process of increased student participation and responsiveness is recommended. Sutherland (2006) presents more advanced reasoning gains as manifestations of LTA dialogue, such as posing higher-order questions and producing critical responses. Veerman et al. (2000) found that checking conceptual information relevant to the argument was more important than argumentative moves; in their study, students’ transactive reasoning gains were accompanied by reflection as a relevant metalevel (epistemic) process. Planning essays was another relevant metalevel (metacognitive) process accompanying low-structured sensemaking dialogue in a study by Coker Jr and Erwin (2011).

In short, sensemaking tasks, highly represented among LTA Pattern 1 studies, are a prerequisite for advanced argumentation. Making claims clear and precise, justifying claims, and providing reasons for agreeing or disagreeing are common elements across LTA sensemaking tasks.

LTA Pattern 2: High-structured Articulation

Studies belonging to the second pattern (i.e., high-structured articulation) tend to conceive of dialogue as a collective negotiation of meanings driven by the critical evaluation of evidence (e.g., Arvaja et al. 2000; Cavagnetto et al. 2010; Ford 2012; Hogan et al. 1999; Kim and Song 2006; Sampson and Clark 2011). In these studies, argumentation can be used to justify or explain, focusing on the use of evidence-based arguments to draw conclusions (e.g., Baines et al. 2009; Choi et al. 2014; Gillies 2013; Hsu et al. 2015; Kim and Song 2006; Kulatunga et al. 2013; Ryu and Sandoval 2012; Yun and Kim 2015) and/or on the generation, contrast, and evaluation of alternative explanations of the same phenomenon (e.g., Ford 2012; Sampson and Clark 2011). High-structured dialogue studies focusing on articulation as their major task include operationalizations such as students challenging or questioning each other about evidence (Choi et al. 2014), interpreting and evaluating data (Selcen Guzey and Aranda 2017), or engaging in scientific inquiry (Kim and Song 2006).

When it comes to gains in argument-as-process, Pattern 2 studies tend to place equal focus on socio-emotional and metalevel processes. For instance, Alexopoulou and Driver (1996) focused on the extent to which students considered and evaluated their own and their peers’ assertions instead of simply presenting their views, thereby avoiding repetition and circular argument. Beyond that reasoning gain, the researchers focused on the socio-emotional process of avoiding tensions and conflict through balancing power in interaction, and on students’ willingness and openness to negotiate ideas at a metadialogical level. Similarly, Baines et al. (2009) focused simultaneously on the socio-emotional process of achieving egalitarian student participation through group maintenance versus group blocking, and on the metalevel process of metatalk about the group itself rather than the task. Another example is Ryu and Sandoval (2012), whose main reported student gains were listening to each other, at a socio-emotional level, and evaluating arguments while developing evidentiary norms, at a ‘meta’ level.

Given this great emphasis on socio-emotional and meta-level processes, we might also expect a more explicit focus on sophisticated argument products that take alternative points of view into account. However, as shown in Table 3, Pattern 2 studies mainly focus on elaborated (TAP-structure) types of argument. This is probably related to the disciplinary influence that TAP has in science (Erduran et al. 2004), and most Pattern 2 studies, as we will see later on, come from the science field.

LTA Pattern 3: High-structured Deliberation

When deliberation is the major goal-based task enacted by the studies, a different LTA pattern emerges, always under a highly structured dialogic framing, as with Pattern 2. The difference is that Pattern 3 (high-structured deliberation) studies place the weight not on the articulation between theory and evidence (which is presupposed) but on the articulation between different theories, of which only one can be chosen as the ‘best explanation’. Examples of this enactment include considering opposing viewpoints to critique and potentially refine ideas (Golanics and Nussbaum 2008), coming to an agreement on the most sensible solution (Cross et al. 2008), and using argumentation to refocus attention away from personal positions and towards the reasons underlying those positions in light of alternatives (Hsu et al. 2015). The distinction between justificatory and explanatory discourse is also evident in this group of studies (e.g., Asterhan et al. 2012; Noroozi et al. 2013; Oliveira et al. 2015), as with Pattern 2. However, among Pattern 3 studies, attention to metalevel processes is more explicit, including knowledge of the formal qualities of single arguments and of argumentative sequences (Noroozi et al. 2013), regulation of metacognition (Hsu et al. 2015), acquisition of argumentation norms as a classroom discourse pattern (Yun and Kim 2015), and willingness to disagree (Kim et al. 2007).

Given the focus on deliberation in Pattern 3 studies, we might again expect more explicit attention to ‘dialectic’ arguments (integrated arguments taking two contrary perspectives into consideration). However, as shown in Table 3, this is not the case. It seems that dialogic rather than dialectic transactivity is highlighted, which also explains the increased emphasis on meta-level processes. This may be because the majority of these studies take place in small groups, as we will see later in this section; considerable attention to calibrating a variety of perspectives is therefore necessary.

LTA Pattern 4: Dialectic Deliberation

Finally, studies belonging to the fourth pattern (i.e., dialectic deliberation) explicitly define argumentation as deliberative dialogue (e.g., Felton et al. 2015a, b; Garcia-Mila et al. 2013; Villarroel et al. 2016), with deliberation being either ‘purely’ dialectical, i.e., focusing on the genuine difference in opinions (Asterhan and Schwarz 2009), or a path towards consensus building (Berland and Lee 2012). From an instructional framing perspective, the only difference between Pattern 4 and Pattern 3 is that in Pattern 4 studies, disagreement is established as a necessary starting point for the dialogue, whereas in Pattern 3 studies, disagreement may or may not emerge during interaction. This is largely because Pattern 4 studies leverage disagreement to drive gains in counter-argumentation (e.g., Asterhan and Schwarz 2009), rebuttals (e.g., Felton 2004; Kuhn et al. 2016b), and argument-counterargument integration, expressed through revisions, concessions, and compromises (e.g., Felton et al. 2015b; De Vries et al. 2002; Muller-Mirza et al. 2007; Nussbaum and Edwards 2011). This explicit focus on dialectic deliberation, more often framed in one-to-one rather than small-group settings as illustrated below, leads to a greater focus on the quality of arguments-as-products, with the significant majority of Pattern 4 studies focusing on dialectic types of arguments (see Table 3).

LTA Fostering Methods

Considering our whole sample of studies (N = 143), the two main methods for fostering LTA were related to (a) the teacher or coach and (b) the task itself. In total, 72 studies (50%) reported that some type of coach-based method was effective in bringing out some type of gain in students’ argumentative dialogue, and 77 studies (54%) did so for task-based methods (32 studies used both coach-based and task-based methods). Among the task-based methods, a majority focused either on the instructional task’s structure (e.g., a script for interaction) (37 of 77 studies) or on an instructional technique adopted as part of the class (e.g., Philosophy for Children) (33 of 77 studies).

In addition, 41 studies (29%) used some type of material scaffold (visual, online, mixed, or other) to ensure or ‘boost’ the quality of argumentative dialogue, whereas 28 studies (20%) referred to some type of peer role, either prescribed or enacted during dialogue, as a relevant fostering factor. Though less common than coach-based and task-based methods, peer roles were an important fostering method, whether set a priori or enacted during interaction. Examples of prescribed peer roles are discussion host (Yun and Kim 2015; Zhang et al. 2016), critical audience (Forman and Ford 2014; Gillies and Haynes 2011), and peer coach (Veerman et al. 2000). Examples of enacted peer roles are leader versus helper (Albe 2008), expert versus follower (Cross et al. 2008), and metacognitive questioner (Gillies 2013).

When considering only the studies belonging to one of the four patterns described above, the distribution of fostering methods per study pattern reveals further insights into how different instructional framings come with different ways of scaffolding dialogue gains. An interesting observation is that the further we move from low-structured sensemaking to high-structured deliberation, the more concrete the teacher’s role becomes: from dialogue facilitator to encourager of transactive reasoning, and from thinking guide to challenger and critical thinker. In dialectic deliberation, the instructor’s role vanishes, meaning that the focus is not on what the instructor does or does not do, but on what the students do as a result of the learning environment’s guidelines. Alongside this shift, the material supports also move from visual materials (such as drafts, graphs, and sheets) towards highly structured, often computer-supported, collaborative learning environments. Moreover, the richest study patterns in terms of the variety of scaffolding methods used are Patterns 2 (high-structured articulation) and 3 (high-structured deliberation). Figure 2 presents a comprehensive summary of representative examples of how LTA dialogue is fostered in each study pattern.

Figure 2: Selective presentation of fostering methods per study pattern

Contextual Aspects of LTA Studies

Overall, the four instructional framing patterns described above seem to define argumentative dialogue in different ways and with different purposes. In this section we will focus on RQ2: In what different ways is argumentative dialogue fostered and/or scaffolded in different contexts? We consider ‘context’ in the following ways, directly relating to the learner characteristics in our MeDOL framework: (a) the dialogue setting, (b) the subject matter, and (c) students’ educational grade.

Study Patterns and Dialogue Setting

To test for an association between the four study patterns and the dialogue setting, we again performed Fisher’s exact test, which returned a significant result, p < .0001, with a moderate effect size, φc = .545. A first differential contextual aspect of the four study patterns is therefore the dialogue setting: (a) one-to-one, i.e., a peer or dyad of peers arguing ‘against’ another peer or dyad; (b) small group, i.e., the LTA task relies on small student groups to solve or engage with it; or (c) whole class, i.e., the teacher guides the dialogue with the whole class at once. Based on the analysis of residuals (Table 4), Pattern 1 studies are mostly associated with whole-class discussions, Pattern 2 and Pattern 3 studies with small-group settings, and Pattern 4 studies with one-to-one discussions (as in the case of structured debates).

Study Patterns and Subject Matter

When it comes to the use of an LTA pattern in a specific disciplinary context, Fisher’s exact test again showed a significant association (p < .0001), this time between study pattern and subject matter, with a moderate effect size (φc = .415). An analysis of residuals (Table 5) suggests that the majority of Pattern 2 studies focus on science (including maths) and socio-scientific issues (SSI), Pattern 4 studies focus mainly on general interest and scientific topics, and Pattern 1 studies focus more on language/history.

Study Patterns and Educational Grade

Finally, a test of association between the four study patterns and students’ grade level produced a result that was significant by Fisher’s exact test (p = .031) but only approached significance under a Bonferroni-corrected alpha level of .01; the effect size was moderate (φc = .262). As shown in Table 6, Pattern 1 studies were most common at the elementary level, and Pattern 4 studies were most common at the university level.
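The Bonferroni logic is a simple division of the family-wise alpha by the number of tests. Assuming the .01 threshold comes from five planned pattern-context tests at α = .05 (the count of tests is our assumption, not stated above), the grade-level result sits between the two thresholds:

```python
def bonferroni(alpha, m):
    """Per-test significance threshold after a Bonferroni correction for m tests."""
    return alpha / m

p = 0.031                              # grade-level association, Fisher's exact test
corrected = bonferroni(0.05, 5)        # assumed family of five tests -> 0.01
significant_raw = p < 0.05             # significant at the conventional level
significant_corrected = p < corrected  # only approaches significance when corrected
```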

In summary, four study patterns emerged in our sample based on how learning to argue is framed and operationalized as a pedagogical goal:

Low-structured dialogic sensemaking (Pattern 1) is commonly applied to teacher-mediated whole-class discussions with young children in a variety of disciplinary contexts, with teachers’ dialogic moves being used as the main scaffold for students.

High-structured dialogic articulation (Pattern 2) is associated with TAP structure arguments in small-group settings, particularly in science inquiry where templates and graphs are used as a primary scaffold for students to construct and compare their scientific explanations.

High-structured dialogic deliberation (Pattern 3) is associated with a high presence of socio-emotional and meta-level processes, and is applied mostly with secondary school students, particularly on science and SSI topics.

Dialectic deliberation (Pattern 4) is associated primarily with the production of integrated dialectic arguments in peer-to-peer discussions about general interest issues among older students (adolescents and adults).

Despite extensive research, defining the characteristics of effective argumentative dialogue remains an open problem, reflected in the complexity and importance of designing argumentative learning environments ‘that work’ (Berland and McNeill 2010; Bell and Linn 2000; Clark et al. 2007; Jiménez-Aleixandre 2008). Such an enterprise is relatively simple when the goal of argumentative dialogue is some type of measurable conceptual gain in content learning, or performance on a reading/comprehension test. The same is not true when the goal is to promote advances in argumentation itself, either as a process or as an outcome. A common paradox in sociocultural learning emerges: How can learners learn how to argue effectively, when effective engagement in argumentation is a necessary part of such learning? We address this paradox by drawing on two terms commonly used in the literature on effective educational dialogue, namely productivity and constructiveness.

Drawing on Bereiter’s ideas, Wells and Arauz (2006) define productive dialogue as one in which ‘participants are willing to revise their own opinions as they open-mindedly consider the proposals and arguments of others’ and thus ‘the common understanding jointly created is superior to that with which the participants started’ (pp. 415–416). Similarly, a dialogue is constructive when it ‘literally adds to the (co-)construction or building of something—meaning, understanding, solutions to problems and sometimes knowledge’, and when ‘it generally contributes in some way to cooperative goal-oriented activity’ (Baker 1999, pp. 180–181). In other words, from a productive dialogue perspective, what matters is the production of dialogue moves and sequences that may be considered of ‘high’ dialogic quality, such as open questions, elaboration of previous contributions, reasoned discussion of competing viewpoints, linkage and coordination across contributions, and metacognitive engagement with dialogue (Hennessy et al. 2016; Howe et al. 2019). From the perspective of constructive (argumentative) dialogue, in contrast, the focus is on the construction of new argumentative and/or content knowledge (e.g., Baker 2009; Noroozi et al. 2013) as part of completing a task with a concrete goal, such as solving a complex problem, addressing an ill-defined issue, or making a decision. When such a goal is reached through a careful calibration of ideas and weighing of points of view, we can even talk about ‘productive argumentation’ (Andriessen and Schwarz 2009).

This theoretical triad of productive dialogue, constructive argumentative interaction, and productive argumentation is confirmed by our findings, pointing to the richness of approaches within what can generally be referred to as ‘learning to argue’ (Muller-Mirza and Perret-Clermont 2009; Von Aufschnaiter et al. 2008). What this review further brings to our knowledge is the instructional effectiveness potential of each of these theoretical paradigms when applied in concrete contexts. Below we summarize the different instructional framings that emerged, discussing the affordances and limitations of each.

Instructional Framing Used to Optimize Classroom Discussions

Under this approach, applied by Pattern 1 studies in our sample, sensemaking is the foundational dialogue goal-task, related to grasping content, framing an authentic question, and co-constructing possible responses. Studies that use instructional framing to optimize classroom discussions tend to focus on dialogue quality per se as an indicator of how thinking and reasoning take place. This characterization of dialogue quality includes exploratory talk (as opposed to presentational or recitation talk) (Brown 2016; Molinari and Mameli 2013), dialogic co-construction of knowledge (Mason 1998), interpretive authority (Chisholm and Loretto 2016), or even the presence of arguments (Lee and Majors 2003). Although the focus is not directly on the production of more sophisticated argumentative products, this is often a desired outcome. For example, in Larrain et al. (2014), students’ justification of counterarguments and rebuttals is the main indicator of the dialogue’s productivity, and it is explicitly fostered by teachers’ requests for justifications. Similarly, Sutherland (2006) focuses explicitly on students posing higher-order questions and encouraging critical responses, which are considered Level 5 argument quality in our coding system (see Table 1). This is made possible, again, through the instructor’s explicit orientation towards modelling questions.

When it comes to gains in argument-as-process, Pattern 1 studies in our sample did not typically focus on socio-emotional and meta-level processes as outcomes. And yet, at least conceptually, both processes could play an important role in sensemaking. The collective elaboration of, and engagement with, different views, also known as ‘relational agency’ (Edwards 2011), is a prerequisite for social knowledge construction. Learners must not only express their own ideas in dialogue, but also interact with others in ways that invite, interpret, and examine views. Commitment to these processes naturally emerges from a growing epistemic awareness and meta-level processing of the dialogic process. These are essential processes in any argumentative learning environment, especially when concrete content learning outcomes are expected, because social agency and epistemic agency are so interrelated that one cannot take place without the other. As Miller et al. (2018) remark, for students to be able to act with epistemic agency in the classroom, soliciting and building on each other’s knowledge as a resource for learning is a first step in their participatory sensemaking. Additional research exploring these complex relationships in sensemaking dialogue is warranted.

When it comes to fostering methods and conditions related to this type of LTA instructional framing, a preference for instructors’ roles/moves and instructional techniques is observed (Figure 2). It is therefore possible that students’ LTA gains, among studies following this framing, are largely due to the teacher’s effectiveness in engaging with students’ thinking (Leach and Scott 2002; Murphy et al. 2016).

Instructional Framing Used to Optimize Articulation of (Scientific) Contents

Articulation, the main focus of Pattern 2 studies, commonly corresponds, in our review, to collaborative inquiry, i.e., the process in which students engage in collaborative efforts to advance their shared understanding of the phenomenon or issue at hand (Hakkarainen 2003). The passage from sensemaking (content problematization) to articulation, or from Pattern 1 to Pattern 2 approaches, is marked by a shift from information seeking and elaboration to critique (Chen et al. 2016), or from knowledge-sharing to knowledge-constructing discourse (Fu et al. 2016). Exploratory talk in its full original sense, as both construction and critique (Mercer 1995), is what characterizes the dialogical interactions taking place in Pattern 2 articulation studies. But how is this balance between construction and critique achieved? In other words, how can students learn to argue effectively when the focus is on advancing knowledge? One might expect this to happen through transactive reasoning focused on counterarguments and rebuttals. However, this was the case in only very few studies in our sample (e.g., Ford 2012; Gillies 2016), as the majority of Pattern 2 studies focused on the TAP structure as an indicator of constructive argumentation. This may be because scientific inquiry centres on the construction of scientific explanations, and the TAP structure gives teachers and researchers a (more or less) clear idea of what a scientific explanation looks like (Erduran et al. 2004).

Productive dialogue, for studies focusing on articulation/collaborative inquiry as their main task, is commonly part of interactions where students navigate and negotiate their understandings of phenomena towards more coherent explanations (Baker 1999; Chen et al. 2019). For these moments of collaboration, understood as deep engagement with each other’s ideas, to emerge, both arguments-as-products and arguments-as-process are necessary: the more constructive students become with their own contributions, the more productive the interaction with each other’s contributions becomes (Chi and Menekse 2015). However, what we observe in Pattern 2 studies is a focus either on high-level (4 & 5) arguments-as-products alone, or on a combination of medium-level arguments (2 & 3) with some type of relevant socio-emotional or meta-level processing. We consider this a rich area for inquiry that could fill a gap in the literature on articulation. As Chen et al. (2019) argue, students’ epistemic understanding of argument, especially its plausible and ‘uncertain’ nature, cannot be separated from their social negotiation processes. What material and visual scaffolds usually do is reduce such uncertainty, orienting students towards the construction of a theory or solution. Attention to peers’ and instructors’ roles in maintaining uncertainty, for example through the use of critical questions, would be an asset for this type of study, as it would increase dialogical transactivity around plausible scientific explanations.

Instructional Framing Used to Optimize the Search for the Best Explanation

Finally, deliberation, whether framed as small-group interaction (Pattern 3 studies) or one-on-one debates (Pattern 4 studies), focuses on the production of transactive arguments. This includes studies that prompt students to co-construct meanings without necessarily critically contrasting these meanings with each other. In such cases, the learning environment serves as a context of conceptual convergence, understood as a common focus on the construction of shared understanding (Oliveira and Sadler 2008). When this focus on collaboration rather than persuasion also forms an explicit part of the instructional goals for deliberative dialogue, it is often the case that dialectically transactive moves (e.g., claim revision, articulating opposing positions, argument-counterargument integration) emerge in students’ discourse (Berland and Lee 2012; Nussbaum 2002; Nussbaum and Edwards 2011; Vogel et al. 2016). This is explained by the principles of coalescent argumentation (Gilbert 2013), according to which collaboration and confrontation are both aspects of a balanced process of critical questioning and coalescing arguments.

Nonetheless, it is also often the case that argumentative dialogue is framed as a process of committing to a position and proving its validity against an opposing position. Even in this context of adversarial argumentation, however, a focus on individual reflection on the arguments produced during interaction is often present, either as part of the instructional prompts or as part of the computer-supported collaborative learning environment in which argumentation takes place. This finding is in line with recent recommendations for more research on the role and nature of reflection as a mediating process in students’ learning to argue (Iordanou and Constantinou 2014). That said, when students are asked to deliberate in a context of highly divergent positions, the focus must not be on divergence per se, but on how and why these positions differ from each other in terms of their evidence quality and relevance (Macagno 2016, 2019), and on how argumentative talk is productively employed by the students as required or prompted by the activity design (Schwarz 2009). The latter is not always guaranteed, calling for additional research into the types of instructional prompts and dialogic moves used to support student interaction. This might include, for example, further research into the metacognitive questions teachers ask as they move from group to group or from dyad to dyad, an aspect studied in depth by the Pattern 1 studies, which mostly focus on whole-group discussions. In addition to the teacher’s mediatory role, the role of peers can also be examined further, as recent research suggests that focusing on affordances for peer-to-peer counter-argumentation and rebuttal using relevant evidence is an effective strategy for learning to argue (see, for example, Larrain et al. 2019).

Study Limitations

As with meta-analyses (Glass 1976), the integrative review bears the limitation of shifting away from the original units of analysis in the studies reviewed to treating the studies themselves as the unit of analysis. Doing so runs the risk of collapsing disparate findings across settings in the interest of seeing larger patterns in the literature. That being said, we have mitigated this problem by focusing not on the conclusions drawn by each study, but instead on the approach taken by the researchers. Nonetheless, we recognize (per Jackson 1980) that we have had to make inferences about how researchers frame learning to argue, rather than drawing on explicit statements made by the authors themselves (particularly in the field of educational dialogue, where learning to argue is not necessarily positioned as the goal of dialogue).

A second limitation is that we cannot use our findings to infer which of the studies’ characteristics caused the results reported. Any statistical results we report must be read as meta-analytic associations among study characteristics coded a posteriori, not as causal relationships between the units of analysis originally used in the studies we reviewed. A final limitation regards the number and type of studies selected for review. As Jackson (1980) again would point out, the studies selected for a review should be considered only a sample of the phenomenon being studied. Though we have followed recommended practices in sampling from the literature, selecting studies according to their relevance to the phenomenon under investigation (i.e., learning to argue), we recognize that we have not addressed the whole population of studies in the fields of educational dialogue and argumentation, as that would be beyond the scope of this paper and would preclude a focused analysis of the literature.
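The associational (rather than causal) reading described above can be illustrated with a minimal sketch: cross-tabulating two a-posteriori coded study characteristics, computing a chi-square statistic, and flagging the cells driving the association via adjusted standardized residuals, the post-hoc procedure described by Beasley and Schumacker (1995, cited in the references). The contingency table counts below are hypothetical, purely for illustration.

```python
import math

# Hypothetical 2x2 contingency table: counts of reviewed studies cross-coded
# on two a-posteriori characteristics (illustrative numbers only).
observed = [[12, 4],
            [5, 9]]

n = sum(sum(row) for row in observed)
row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]

# Expected counts under independence, and the chi-square statistic.
expected = [[r * c / n for c in col_totals] for r in row_totals]
chi2 = sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
           for i in range(len(observed)) for j in range(len(observed[0])))

# Adjusted standardized residuals (Beasley & Schumacker 1995): cells with
# |residual| > 2 are commonly read as driving the overall association.
residuals = [[(observed[i][j] - expected[i][j]) /
              math.sqrt(expected[i][j]
                        * (1 - row_totals[i] / n)
                        * (1 - col_totals[j] / n))
              for j in range(len(observed[0]))]
             for i in range(len(observed))]
```

Even when such residuals are large, they describe which coded characteristics co-occur across studies; they say nothing about causation within any single study.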

Conclusions

Constructing learning environments that promote learning to argue (LTA) through dialogue has been characterized as a complex enterprise (Clark et al. 2007; Jiménez-Aleixandre 2008), without the reasons for this complexity being sufficiently addressed. We focus on instructional framing as a lens on the LTA literature, by which we mean: the methods and approaches implemented by researchers to frame educational dialogue, the dialogue goals pursued as part of the LTA activity, the noted gains in terms of LTA, and the learners’ characteristics. Our analysis of the reviewed studies revealed a continuum in the LTA literature, with four different concentrations in empirical studies, namely: engaging in low-structured sensemaking, engaging in high-structured articulation, engaging in high-structured deliberation, and engaging in dialectic deliberation. Each research focus has its own affordances and limitations, as discussed above. Interestingly, what emerges as a limitation for one strand of research is covered by another, pointing to the complementarity of existing instructional approaches aiming, either directly or indirectly, at learning to argue.

An implication of the above is that when it comes to teaching students how to argue, all four approaches need to be considered to fully leverage the benefits of engaging in argumentative dialogue. The complementary patterns we have identified clear the way for a better understanding of pedagogical content knowledge in argumentative dialogue, in the interest of designing more effective and purposeful dialogue-based and dialogue-oriented learning environments.

Data Availability

Data will be available upon request.

It is interesting to note that Lee et al. (2015) used the triadic model (sensemaking-articulation-persuasion) proposed by Berland and Reiser (2009) as a method of analysing goals emerging in students’ discourse. This view of the three tasks as individual cognitive operations is different from the approach adopted in this study, where we view the three tasks as social activities held dialogically in the classroom.

It is worth noting here that this study, and others in our sample, did not identify learning to argue as their main pedagogical goal. Muller-Mirza et al. (2007) explicitly state that their goal ‘was not to teach students how to argue, but to provide them with the opportunity to learn from argumentation’ (p. 256). However, the study was still included because enough information was provided regarding how students engaged in the practice of argumentation and the gains achieved from this engagement, other than content learning.

Albe, V. (2008). When scientific knowledge, daily life experience, epistemological and social considerations intersect: Students’ argumentation in group discussions on a socio-scientific issue. Research in Science Education, 38 (1), 67–90. https://doi.org/10.1007/s11165-007-9040-2

Alexander, R. J. (2017). Towards dialogic teaching: Rethinking classroom talk (5th ed.). Dialogos.

Andriessen, J., & Baker, M. J. (2014). Arguing to learn. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 439–460). Cambridge University Press.

Andriessen, J. E., & Schwarz, B. B. (2009). Argumentative design. In N. Muller-Mirza & A. N. Perret-Clermont (Eds.), Argumentation and education: Theoretical foundations and practices (pp. 145–174). Springer.

Andriessen, J., Baker, M., & van der Puil, C. (2011). Socio-cognitive tension in collaborative working relations. In S. Ludwigsen, A. Lund, I. Rasmussen, & R. Säljö (Eds.), Learning across sites: New tools, infrastructures and practices (pp. 222–242). Routledge.

Arvaja, M., Häkkinen, P., Eteläpelto, A., & Rasku-Puttonen, H. (2000). Collaborative processes during report writing of a science learning project: The nature of discourse as a function of task requirements. European Journal of Psychology of Education, 15 (4), 455–466. https://doi.org/10.1007/bf03172987

Asterhan, C. (2013). Epistemic and interpersonal dimensions of peer argumentation. In M. Baker, J. Andriessen, & S. Järvelä (Eds.), Affective learning together (pp. 251–271). Routledge.

Asterhan, C. S., & Schwarz, B. B. (2007). The effects of monological and dialogical argumentation on concept learning in evolutionary theory. Journal of Educational Psychology, 99 (3), 626–639. https://doi.org/10.1037/0022-0663.99.3.626

Asterhan, C. S., & Schwarz, B. B. (2009). Argumentation and explanation in conceptual change: Indications from protocol analyses of peer-to-peer dialog. Cognitive Science, 33 (3), 374–400. https://doi.org/10.1111/j.1551-6709.2009.01017.x

Asterhan, C. S., & Schwarz, B. B. (2016). Argumentation for learning: Well-trodden paths and unexplored territories. Educational Psychologist, 51 (2), 164–187. https://doi.org/10.1080/00461520.2016.1155458

Asterhan, C. S., Schwarz, B. B., & Gil, J. (2012). Small-group, computer-mediated argumentation in middle-school classrooms: The effects of gender and different types of online teacher guidance. British Journal of Educational Psychology, 82 (3), 375–397. https://doi.org/10.1111/j.2044-8279.2011.02030.x

Asterhan, C. S., Howe, C., Lefstein, A., Matusov, E., & Reznitskaya, A. (2020). Controversies and consensus in research on dialogic teaching and learning. Dialogic Pedagogy, 8 . https://doi.org/10.5195/dpj.2020.312

Atwood, S., Turnbull, W., & Carpendale, J. I. (2010). The construction of knowledge in classroom talk. Journal of the Learning Sciences, 19 (3), 358–402. https://doi.org/10.1080/10508406.2010.481013

Aukerman, M., Martin, P. C., Gargani, J., & McCallum, R. D. (2016). A randomized control trial of Shared Evaluation Pedagogy: The near-term and long-term impact of dialogically organized reading instruction. L1 Educational Studies in Language and Literature, 16 , 1–26. https://doi.org/10.17239/L1ESLL-2016.16.02.02

Baines, E., Rubie-Davies, C., & Blatchford, P. (2009). Improving pupil group work interaction and dialogue in primary classrooms: Results from a year-long intervention study. Cambridge Journal of Education, 39 (1), 95–117. https://doi.org/10.1080/03057640802701960

Baker, M. J. (1999). Argumentation and constructive interaction. In P. Coirier & J. Andriessen (Eds.), Foundations of argumentative text processing (pp. 179–202). University of Amsterdam Press. https://doi.org/10.1007/978-94-017-0781-7_3

Baker, M. (2009). Argumentative interactions and the social construction of knowledge. In N. Muller-Mirza & A. N. Perret-Clermont (Eds.), Argumentation and education: Theoretical foundations and practices (pp. 127–144). Springer.

Beasley, T. M., & Schumacker, R. E. (1995). Multiple regression approach to analyzing contingency tables: Post hoc and planned comparison procedures. Journal of Experimental Education, 64 (1), 79–93. https://doi.org/10.1080/00220973.1995.9943797

Bell, P., & Linn, M. C. (2000). Scientific arguments as learning artifacts: Designing for learning from the web with KIE. International Journal of Science Education, 22 (8), 797–817. https://doi.org/10.1080/095006900412284

Berkowitz, M. W., & Gibbs, J. C. (1983). Measuring the developmental features of moral discussion. Merrill-Palmer Quarterly (1982-), 29 , 399–410.

Berland, L. K., & Lee, V. R. (2012). In pursuit of consensus: Disagreement and legitimization during small-group argumentation. International Journal of Science Education, 34 (12), 1857–1882. https://doi.org/10.1080/09500693.2011.645086

Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94 (5), 765–793. https://doi.org/10.1002/sce.20402

Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93 (1), 26–55. https://doi.org/10.1002/sce.20286

Berland, L. K., & Reiser, B. J. (2011). Classroom communities’ adaptations of the practice of scientific argumentation. Science Education, 95 (2), 191–216. https://doi.org/10.1002/sce.20420

Billig, M. (1987). Arguing and thinking: A rhetorical approach to social psychology . Cambridge University Press.

Brown, A. C. (2016). Classroom community and discourse: How argumentation emerges during a Socratic circle. Dialogic Pedagogy: An International Online Journal , 4 . https://doi.org/10.5195/dpj.2016.160

Cavagnetto, A. R. (2010). Argument to foster scientific literacy: A review of argument interventions in K–12 science contexts. Review of Educational Research, 80 (3), 336–371. https://doi.org/10.3102/0034654310376953

Cavagnetto, A., Hand, B. M., & Norton-Meier, L. (2010). The nature of elementary student science discourse in the context of the science writing heuristic approach. International Journal of Science Education, 32 (4), 427–449. https://doi.org/10.1080/09500690802627277

Chen, Y. C., Park, S., & Hand, B. (2016). Examining the use of talk and writing for students’ development of scientific conceptual knowledge through constructing and critiquing arguments. Cognition and Instruction, 34 (2), 100–147. https://doi.org/10.1080/07370008.2016.1145120

Chen, Y. C., Benus, M. J., & Hernandez, J. (2019). Managing uncertainty in scientific argumentation. Science Education, 103 (5), 1235–1276. https://doi.org/10.1002/sce.21527

Chi, M. T., & Menekse, M. (2015). Dialogue patterns in peer collaboration that promote learning. In L. Resnick, C. Asterhan, & S. Clarke (Eds.), Socializing intelligence through academic talk and dialogue (pp. 263–274). American Educational Research Association.

Chin, C., & Teou, L. Y. (2009). Using concept cartoons in formative assessment: Scaffolding students’ argumentation. International Journal of Science Education, 31 (10), 1307–1332. https://doi.org/10.1080/09500690801953179

Chinn, C. A., & Clark, D. B. (2013). Learning through collaborative argumentation. In C. E. Hmelo-Silver, C. A. Chinn, C. K. K. Chan, & A. M. O’Donnell (Eds.), International handbook of collaborative learning (pp. 314–332). Taylor & Francis.

Chinn, C. A., Buckland, L. A., & Samarapungavan, A. L. A. (2011). Expanding the dimensions of epistemic cognition: Arguments from philosophy and psychology. Educational Psychologist, 46 (3), 141–167. https://doi.org/10.1080/00461520.2011.587722

Chisholm, J. S., & Loretto, A. J. (2016). Tensioning interpretive authority during dialogic discussions of literature. L1 Educational Studies in Language and Literature, 16 , 1–32. https://doi.org/10.17239/L1ESLL-2016.16.02.04

Chiu, M. M. (2008). Effects of argumentation on group micro-creativity: Statistical discourse analyses of algebra students’ collaborative problem solving. Contemporary Educational Psychology, 33 (3), 382–402. https://doi.org/10.1016/j.cedpsych.2008.05.001

Choi, A., Hand, B., & Norton-Meier, L. (2014). Grade 5 students’ online argumentation about their in-class inquiry investigations. Research in Science Education, 44 (2), 267–287. https://doi.org/10.1007/s11165-013-9384-8

Clarà, M. (2021). Conceptually driven inquiry: Addressing the tension between dialogicity and teleology in dialogic approaches to classroom talk. Educational Review , 1–20. https://doi.org/10.1080/00131911.2021.1923462

Clark, D. B., Sampson, V., Weinberger, A., & Erkens, G. (2007). Analytic frameworks for assessing dialogic argumentation in online learning environments. Educational Psychology Review, 19 (3), 343–374. https://doi.org/10.1007/s10648-007-9050-7

Coker Jr., D. L., & Erwin, E. (2011). Teaching academic argument in an urban middle school: A case study of two approaches. Urban Education, 46 (2), 120–140. https://doi.org/10.1177/0042085910377426

Corcelles Seuba, M., & Castelló, M. (2017). Learning philosophical thinking through collaborative writing in secondary education. Journal of Writing Research, 7 (1), 157–199. https://doi.org/10.17239/jowr-2015.07.01.07

Cross, D., Taasoobshirazi, G., Hendricks, S., & Hickey, D. T. (2008). Argumentation: A strategy for improving achievement and revealing scientific identities. International Journal of Science Education, 30 (6), 837–861. https://doi.org/10.1080/09500690701411567

Crowell, A., & Kuhn, D. (2014). Developing dialogic argumentation skills: A 3-year intervention study. Journal of Cognition and Development, 15 (2), 363–381. https://doi.org/10.1080/15248372.2012.725187

Damico, J., & Rosaen, C. L. (2009). Creating epistemological pathways to a critical citizenry: Examination of a fifth-grade discussion of freedom. Teachers College Record, 111 (5), 1163–1194.

De Vries, E., Lund, K., & Baker, M. (2002). Computer-mediated epistemic dialogue: Explanation and argumentation as vehicles for understanding scientific notions. Journal of the Learning Sciences, 11 (1), 63–103. https://doi.org/10.1207/s15327809jls1101_3

Dougherty, D., Borrelli, L., Munir, K., & O’Sullivan, A. (2000). Systems of organizational sensemaking for sustained product innovation. Journal of Engineering and Technology Management, 17 (3-4), 321–355. https://doi.org/10.1016/s0923-4748(00)00028-x

Edwards, A. (2011). Building common knowledge at the boundaries between professional practices: Relational agency and relational expertise in systems of distributed expertise. International Journal of Educational Research, 50 (1), 33–39. https://doi.org/10.1016/j.ijer.2011.04.007

Erduran, S., Simon, S., & Osborne, J. (2004). TAPping into argumentation: Developments in the application of Toulmin’s argument pattern for studying science discourse. Science Education, 88 (6), 915–933. https://doi.org/10.1002/sce.20012

Evagorou, M., & Osborne, J. (2013). Exploring young students’ collaborative argumentation within a socioscientific issue. Journal of Research in Science Teaching, 50 (2), 209–237. https://doi.org/10.1002/tea.21076

Felton, M. K. (2004). The development of discourse strategies in adolescent argumentation. Cognitive Development, 19 (1), 35–52. https://doi.org/10.1016/j.cogdev.2003.09.001

Felton, M., & Kuhn, D. (2001). The development of argumentive discourse skill. Discourse Processes, 32 (2-3), 135–153. https://doi.org/10.1080/0163853X.2001.9651595

Felton, M., Crowell, A., & Liu, T. (2015a). Arguing to agree: Mitigating my-side bias through consensus-seeking dialogue. Written Communication, 32 (3), 317–331. https://doi.org/10.1177/0741088315590788

Felton, M., Garcia-Mila, M., Villarroel, C., & Gilabert, S. (2015b). Arguing collaboratively: Argumentative discourse types and their potential for knowledge building. British Journal of Educational Psychology, 85 (3), 372–386. https://doi.org/10.1111/bjep.12078

Ford, M. (2008). Disciplinary authority and accountability in scientific practice and learning. Science Education, 92 (3), 404–423. https://doi.org/10.1002/sce.20263

Ford, M. J. (2012). A dialogic account of sense-making in scientific argumentation and reasoning. Cognition and Instruction, 30 (3), 207–245. https://doi.org/10.1080/07370008.2012.689383

Ford, M. J., & Wargo, B. M. (2012). Dialogic framing of scientific content for conceptual and epistemic understanding. Science Education, 96 (3), 369–391. https://doi.org/10.1002/sce.20482

Forman, E. A., & Ford, M. J. (2014). Authority and accountability in light of disciplinary practices in science. International Journal of Educational Research, 64 , 199–210. https://doi.org/10.1016/j.ijer.2013.07.009

Forman, E. A., Ramirez-DelToro, V., Brown, L., & Passmore, C. (2017). Discursive strategies that foster an epistemic community for argument in a biology classroom. Learning and Instruction, 48 , 32–39. https://doi.org/10.1016/j.learninstruc.2016.08.005

Frijters, S., ten Dam, G., & Rijlaarsdam, G. (2008). Effects of dialogic learning on value-loaded critical thinking. Learning and Instruction, 18 (1), 66–82. https://doi.org/10.1016/j.learninstruc.2006.11.001

Fu, E. L., van Aalst, J., & Chan, C. K. (2016). Toward a classification of discourse patterns in asynchronous online discussions. International Journal of Computer-Supported Collaborative Learning, 11 (4), 441–478. https://doi.org/10.1007/s11412-016-9245-3

Garcia-Mila, M., Gilabert, S., Erduran, S., & Felton, M. (2013). The effect of argumentative task goal on the quality of argumentative discourse. Science Education, 97 (4), 497–523. https://doi.org/10.1002/sce.21057

Gilabert, S., Garcia-Mila, M., & Felton, M. K. (2013). The effect of task instructions on students’ use of repetition in argumentative discourse. International Journal of Science Education, 35 (17), 2857–2878. https://doi.org/10.1080/09500693.2012.663191

Gilbert, M. A. (2004). Emotion, argumentation and informal logic. Informal Logic, 24 (3), 245–264. https://doi.org/10.22329/il.v24i3.2147

Gilbert, M. A. (2013). Coalescent argumentation . Routledge.

Gillies, R. M. (2013). Productive academic talk during inquiry-based science. Pedagogies: An International Journal, 8 (2), 126–142. https://doi.org/10.1080/1554480x.2013.767770

Gillies, R. M. (2016). Dialogic interactions in the cooperative classroom. International Journal of Educational Research, 76 , 178–189. https://doi.org/10.1016/j.ijer.2015.02.009

Gillies, R. M., & Haynes, M. (2011). Increasing explanatory behaviour, problem-solving, and reasoning within classes using cooperative group work. Instructional Science, 39 (3), 349–366. https://doi.org/10.1007/s11251-010-9130-9

Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5 , 3–8.

Golanics, J. D., & Nussbaum, E. M. (2008). Enhancing online collaborative argumentation through question elaboration and goal instructions. Journal of Computer Assisted Learning, 24 (3), 167–180. https://doi.org/10.1111/j.1365-2729.2007.00251.x

González-Howard, M., McNeill, K. L., Marco-Bujosa, L. M., & Proctor, C. P. (2017). ‘Does it answer the question or is it French fries?’: An exploration of language supports for scientific argumentation. International Journal of Science Education, 39 (5), 528–547. https://doi.org/10.1080/09500693.2017.1294785

Hakkarainen, K. (2003). Progressive inquiry in a computer-supported biology class. Journal of Research in Science Teaching, 40 (10), 1072–1088. https://doi.org/10.1002/tea.10121

Hand, B., Chen, Y. C., & Suh, J. K. (2021). Does a knowledge generation approach to learning benefit students? A systematic review of research on the science writing heuristic approach. Educational Psychology Review, 33 (2), 535–577. https://doi.org/10.1007/s10648-020-09550-0

Harney, O. M., Hogan, M. J., Broome, B., Hall, T., & Ryan, C. (2015). Investigating the effects of prompts on argumentation style, consensus and perceived efficacy in collaborative learning. International Journal of Computer-Supported Collaborative Learning, 10 (4), 367–394. https://doi.org/10.1007/s11412-015-9223-1

Hennessy, S., Rojas-Drummond, S., Higham, R., Márquez, A. M., Maine, F., Ríos, R. M., García-Carrión, R., Torreblanca, O., & Barrera, M. J. (2016). Developing a coding scheme for analysing classroom dialogue across educational contexts. Learning, Culture and Social Interaction, 9 , 16–44. https://doi.org/10.1016/j.lcsi.2015.12.001

Hoadley, U. (2006). Analysing pedagogy: The problem of framing. Journal of Education, 40 (1), 15–34.

Hogan, K., Nastasi, B. K., & Pressley, M. (1999). Discourse patterns and collaborative scientific reasoning in peer and teacher-guided discussions. Cognition and Instruction, 17 (4), 379–432. https://doi.org/10.1207/s1532690xci1704_2

Howe, C., Hennessy, S., Mercer, N., Vrikki, M., & Wheatley, L. (2019). Teacher-student dialogue during classroom teaching: Does it really impact on student outcomes? Journal of the Learning Sciences, 28 (4-5), 462–512. https://doi.org/10.1080/10508406.2019.1573730

Hsu, P. S., Van Dyke, M., Chen, Y., & Smith, T. J. (2015). The effect of a graph-oriented computer-assisted project-based learning environment on argumentation skills. Journal of Computer Assisted Learning, 31 (1), 32–58. https://doi.org/10.1111/jcal.12080

Iordanou, K., & Constantinou, C. P. (2014). Developing pre-service teachers’ evidence-based argumentation skills on socio-scientific issues. Learning and Instruction, 34 , 42–57. https://doi.org/10.1016/j.learninstruc.2014.07.004

Iordanou, K., Kendeou, P., & Beker, K. (2016). Argumentative reasoning. In J. A. Greene, W. A. Sandoval, & I. Braten (Eds.), Handbook of epistemic cognition (pp. 39–53). Routledge.

Iordanou, K., Kuhn, D., Matos, F., Shi, Y., & Hemberger, L. (2019). Learning by arguing. Learning and Instruction, 63 , 101207. https://doi.org/10.1016/j.learninstruc.2019.05.004

Jackson, G. B. (1980). Methods for integrative reviews. Review of Educational Research, 50 (3), 438–460.

Jadallah, M., Anderson, R. C., Nguyen-Jahiel, K., Miller, B. W., Kim, I. H., Kuo, L. J., Dong, T., & Wu, X. (2011). Influence of a teacher’s scaffolding moves during child-led small-group discussions. American Educational Research Journal, 48 (1), 194–230. https://doi.org/10.3102/0002831210371498

Jermann, P., & Dillenbourg, P. (2003). Elaborating new arguments through a CSCL script. In J. Andriessen, M. Baker, & D. Suthers (Eds.), Arguing to learn: Confronting cognitions in computer-supported collaborative learning environments (pp. 205–226). Springer.

Jiménez-Aleixandre, M. P. (2002). Knowledge producers or knowledge consumers? Argumentation and decision making about environmental management. International Journal of Science Education, 24 (11), 1171–1190. https://doi.org/10.1080/09500690210134857

Jiménez-Aleixandre, M.-P. (2008). Designing argumentation learning environments. In S. Erduran & M.-P. Jiménez-Aleixandre (Eds.), Argumentation in science education (pp. 91–116). Springer.






Selecting Potential Instructional Materials for Literature Teaching in the 21st Century Milieu: Findings from a Systematic Review of Literature

Ferdinand Bulusan

2019, Asian EFL

The kind of instructional materials given to students in a literature class is believed to contribute to the improvement or deterioration of students' achievement, yet criteria for choosing such materials remain underexplored. This study addresses that gap by identifying potential criteria for selecting appropriate instructional materials for literature teaching. It draws on a systematic review of studies published in refereed journals available from various databases. Responses to the research questions were mapped onto a repertory grid, and the grid data were analyzed to identify gaps in the existing research. The surveyed materials then underwent thematic analysis, yielding four criteria encapsulated in the acronym "CARE": cultural enrichment, authenticity of the material, relevant language enrichment, and ease of understanding. The study recommends that language teachers, especially those specializing in literature teaching, try the proposed criteria, and it closes with pedagogical implications and recommendations for further research along this vein.
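The abstract's methodology can be pictured concretely. A minimal sketch (not from the study; the study names, grid entries, and the `>= 2` threshold are invented for illustration) models a repertory grid as a mapping from reviewed study to the selection criteria it supports, then tallies criteria across studies, mirroring how recurring themes such as the "CARE" criteria emerge from a thematic analysis:

```python
from collections import Counter

# Hypothetical repertory grid: reviewed study -> criteria it supports.
# All data below is illustrative, not taken from the actual review.
grid = {
    "Study A": ["cultural enrichment", "authenticity"],
    "Study B": ["authenticity", "language enrichment"],
    "Study C": ["cultural enrichment", "authenticity", "ease of understanding"],
}

# Count how often each criterion recurs across the surveyed studies.
tally = Counter(c for criteria in grid.values() for c in criteria)

# Criteria supported by at least two studies become candidate themes.
themes = sorted(c for c, n in tally.items() if n >= 2)
print(themes)  # ['authenticity', 'cultural enrichment']
```

The real review, of course, derives themes through qualitative judgment rather than a fixed count; the sketch only shows the grid-then-tally structure of the analysis.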

Related Papers


Journal Of Language Education and Development (JLed)

Siska Maldin

Literature and culture are two elements that cannot be separated in general knowledge. EFL learners are therefore encouraged to study literature when learning English as a foreign language, since literature can serve as motivating material through the values represented in its stories. Literature also gives EFL students access to the people whose language they are studying. Research shows, however, that high school teachers find literature difficult, and even tedious, to teach in the EFL classroom. This paper therefore presents several strategies that EFL teachers might use to teach literature in their classrooms as part of a genre-based approach. The strategies are divided into pre-activity, whilst-activity, and post-activity.


Language Teaching

The resurgence in the use of literature in language teaching has been accompanied by an increasing number of research articles in this area. Research (in a number of second languages) has looked at the type of interactions and the type of language that arise from classroom discussions about literature, as well as at the views of teachers and learners. Importantly, the reactions that learners have to incorporating literature in their language lessons are linked to the type of approach and type of task that are used in the classroom. The paper surveys the existing research, as well as evidence from practitioners about approaches that are used and the range of works and authors that are taught.

Wan Roselezam Wan Yahya

SMART MOVES JOURNAL IJELLH

Using literature as a resource in the language classroom has been a challenge for EFL teachers across the globe for years. In EFL settings, teachers are mostly trained and certified to teach the integrated skills of language and components such as grammar, vocabulary, and pronunciation, drawing on a variety of English Language Teaching (ELT) methods and techniques. The primary aim of this study is to identify the role and use of literature as a resource in EFL classrooms, a practice that can be both useful and challenging for teachers and learners alike in improving proficiency.




  11. Capturing instructional differentiation in educational research

    This paper aims to contribute to a better understanding of instructional differentiation. It discusses definitions and operationalisations of instructional differentiation in the educational research literature and argues for the inclusion of deliberateness and adaptiveness as two defining characteristics of instructional differentiation.

  12. Full article: The role of instructional materials in the relationship

    Conceptions of the official curriculum and the operational curriculum. To conceptualize aspects of curriculum, we turn to Remillard and Heck (Citation 2014).They explain that the official curriculum is comprised of curriculum goals and objectives, the content of consequential assessments, and the designated curriculum. The goals and objectives include "expectations for student learning or ...

  13. (PDF) Teachers' Effective Use of Educational Resources ...

    Conclusion - Teachers must be resourceful in order to improve the ways in which students learn in the 21st century. Teachers can improve how they work as educators by utilizing the varied ...

  14. PDF Research-Based Instructional Strategies that Improve Educational

    strategies emerge from systematic review of the literature of four different sources: Research on the mind that help learners learn compound tasks; Research on classroom instructional processes that are used by most effective educators; Research on instructional ideologies that every educator must know;

  15. PDF AVAILABILITY AND USE OF INSTRUCTIONAL MATERIALS IN THE TEACHING OF ...

    Percentages of what is learnt using Different Senses Senses used when Learning % of what is learnt Taste 1.0 Touch 1.5 Small 3.5 Hearing 11.5 Sight 83.0 Total 100.0 Source: Sampath (1990) Sampath also notes that we remember more when we see, hear, say and do. Table-2. What is remembered from Learning Activities using Different Senses

  16. (PDF) Selecting Potential Instructional Materials for Literature

    Educational Drama and Language Arts: What Research Shows (2007) by Betty Jane Wagner is an important book that gives the panoramic view of recent research on drama as a classroom strategy for ...

  17. Open Educational Resources: A Review of the Literature

    This chapter begins by reviewing the many definitions of the term open educational resources and concludes by discussing challenges and opportunities for the approach. Open educational resources (OER) are educational materials either licensed under an open copyright license or in the public domain. Neither the term "open educational resources ...

  18. Do open educational resources improve student learning ...

    Open Educational Resources (OER) have been lauded for their ability to reduce student costs and improve equity in higher education. Research examining whether OER provides learning benefits have produced mixed results, with most studies showing null effects. We argue that the common methods used to examine OER efficacy are unlikely to detect positive effects based on predictions of the access ...

  19. Learning to Argue Through Dialogue: a Review of Instructional

    Over the past 20 years, a broad and diverse research literature has emerged to address how students learn to argue through dialogue in educational contexts. However, the variety of approaches used to study this phenomenon makes it challenging to find coherence in what may otherwise seem to be disparate fields of study. In this integrative review, we propose looking at how learning to argue ...

  20. (PDF) Selecting Potential Instructional Materials for Literature

    The resurgence in the use of literature in language teaching has been accompanied by an increasing number of research articles in this area. Research (in a number of second languages) has looked at the type of interactions and the type of language that arise from classroom discussions about literature, as well as at the views of teachers and learners.

  21. Instructional Materials for Diverse Learners: Features and

    American Educational Research Journal, 17, 211-218. Google Scholar. ... is a program specialist for the Great Lakes Area Regional Resource Center. Through this federally funded project, she assists state departments of special education in their work to provide quality special education and related services for children with special needs and ...

  22. Differentiated instruction: Curriculum and resources provide a roadmap

    Conclusion and recommendations for future research. Given what the literature says and what the research findings in terms of teachers' abilities to modify the state-prepared curricular resources to meet learners' needs demonstrated, differentiating UDL-based curricular resources may appear challenging but possible and necessary.

  23. FS 2 Learning Episode 6

    Teacher-made resources (improvised teaching materials) Digital media is an ICT-based instructional resource that can be created and preserved in digital formats (digital videos, digital audios, digital images, presentations, e-books, electronic documents, infographics, social media, video games). Educational research such as research journals ...