Critical Thinking: 40 Useful Performance Feedback Phrases

Critical Thinking: Use these sample phrases to craft meaningful performance evaluations, drive change and motivate your workforce.

Critical Thinking is the ability to think clearly and rationally, understanding the logical connections between ideas through reflective and independent thinking.

Critical Thinking: Exceeds Expectations Phrases

  • Demonstrates strong assertiveness and decisiveness when finding solutions to problems.
  • Communicates ideas and issues easily, in a clear and concise manner.
  • Able to piece together elements and come up with sound deductions on issues.
  • Knows how to clarify problems and solutions easily so that other people can understand.
  • Evaluates ideas and opinions in an unbiased manner, without favoritism.
  • Thinks through issues in a logical manner, arriving at the best solution to a problem.
  • Demonstrates excellent problem-solving skills by assessing a problem and devising the best possible solution for it.
  • Takes different and varied perspectives into consideration when solving problems.
  • Carefully examines the implications and possible consequences of any particular action.
  • Solves problems one by one so as not to mix up issues and ideas.

Critical Thinking: Meets Expectations Phrases

  • Applies a strategic and skillful approach when solving issues.
  • Demonstrates assertiveness and decisiveness when handling problems.
  • Tries to consider all factors at play before deciding on a particular method or approach.
  • Gathers all the required facts and figures before starting to solve a particular problem.
  • Always asks questions as a way of finding a sound basis for solving a problem.
  • Not afraid to make mistakes and tries to find creative ways to handle any issue.
  • Looks at issues from different angles rather than in a one-sided way.
  • Shows great flexibility in changing strategies and tactics when dealing with different problems.
  • Regularly asks whether the decision taken will achieve the desired effect.
  • Feels comfortable and confident seeking help or advice whenever stuck on an issue.

Critical Thinking: Needs Improvement Phrases

  • Does not take the time to carefully consider all available factors before making a decision.
  • Looks at issues in a one-sided manner instead of considering them from different angles.
  • Finds it challenging to arrive at a concrete conclusion after undertaking an evaluation.
  • Not willing to do proper research, and relies on outdated data and information.
  • Does not demonstrate curiosity or try to find out more about issues.
  • Not willing to accept corrections or take calculated risks when necessary.
  • Does not show enough willingness to improve their critical thinking skills.
  • Does not present ideas and points in a logical order or outline.
  • Looks at issues in a biased and unfair way by not evaluating all factors.
  • Unwilling and inflexible when it comes to changing tactics or strategy when the initial plan fails to achieve the desired effect.

Critical Thinking: Self Evaluation Questions

  • How well do you research and gather facts and information before solving an issue?
  • Describe an instance when you hurriedly made a decision without thinking it through. What was the result?
  • Describe an occasion when you made the best possible decision. How did you feel about it?
  • How well do you consider all the available factors before making your decision?
  • Do you appreciate and learn from your mistakes, and how do you deal with making wrong decisions?
  • Do you always inquire further into what missing elements could achieve a positive outcome?
  • How careful are you to reach a concrete conclusion when presenting your ideas?
  • When expressing your ideas, do you deliver them in a logical manner?
  • Do you usually look at issues from a single angle or from several different angles?
  • How flexible are you when it comes to trying ways of solving problems other than the intended way?


Critical Thinking: Performance Review Examples (Rating 1 – 5)

By Status.net Editorial Team on July 15, 2023 — 8 minutes to read

Critical thinking skills are an essential aspect of an employee’s evaluation: the ability to solve problems, analyze situations, and make informed decisions is crucial for the success of any organization.

Questions that can help you determine an employee’s rating for critical thinking:

  • Does the employee consistently analyze data and information to identify patterns and trends?
  • Does the employee proactively identify potential problems and develop solutions to mitigate them?
  • Has the employee demonstrated the ability to think creatively and come up with innovative ideas or approaches?
  • Does the employee actively seek out feedback and input from others to inform their decision-making process?
  • Has the employee demonstrated the ability to make sound decisions based on available information and data?

Performance Review Phrase and Paragraph Examples for Critical Thinking

5 – Outstanding

Employees with outstanding critical thinking skills are exceptional at identifying patterns, making connections, and using past experiences to inform their decisions.

Phrase Examples

  • Consistently demonstrates exceptional critical thinking abilities
  • Always finds creative and innovative solutions to complex problems
  • Skillfully analyzes information and data to make well-informed decisions
  • Frequently provides valuable insights and perspectives that benefit the team
  • Continuously seeks out new learning opportunities to sharpen their critical thinking skills
  • Demonstrates exceptional ability to identify and analyze complex issues
  • Consistently develops innovative solutions to problems
  • Skillfully connects disparate ideas to create coherent arguments
  • Effectively communicates well-reasoned conclusions
  • Exceptional ability to recognize trends in data
  • Expertly applies existing knowledge to new situations
  • Consistently anticipates potential challenges and develops solutions

Paragraph Example 1

“Jane consistently demonstrates outstanding critical thinking skills in her role. She not only engages in deep analysis of complex information, but she also presents unique solutions to problems that have a significant positive impact on the team’s performance. Her ability to make well-informed decisions and offer valuable insights has led to numerous successes for the organization. Moreover, Jane’s dedication to improvement and learning demonstrates her commitment to personal and professional growth in the area of critical thinking.”

Paragraph Example 2

“Jessica consistently displays outstanding critical thinking skills. She is able to identify and analyze complex issues with ease and has demonstrated her ability to develop innovative solutions. Her skill in connecting disparate ideas to create coherent arguments is impressive, and she excels at communicating her well-reasoned conclusions to the team.”

Paragraph Example 3

“Melanie consistently demonstrates an exceptional ability to recognize patterns and trends in data, which has significantly contributed to the success of our projects. Her critical thinking skills allow her to apply her extensive knowledge and experience in creative and innovative ways, proactively addressing potential challenges and developing effective solutions.”

4 – Exceeds Expectations

Employees exceeding expectations in critical thinking skills are adept at analyzing information, making sound decisions, and providing thoughtful recommendations. They are also effective at adapting their knowledge to novel situations and displaying confidence in their abilities.

  • Excellent analytical capabilities
  • Provides well-reasoned recommendations
  • Demonstrates a solid understanding of complex concepts
  • Regularly demonstrates the ability to think analytically and critically
  • Effectively identifies and addresses complex problems with well-thought-out solutions
  • Shows exceptional skill in generating innovative ideas and solutions
  • Exhibits a consistently high level of decision-making based on sound reasoning
  • Proactively seeks out new information to improve critical thinking skills
  • Routinely identifies potential challenges and provides solutions
  • Typically recognizes and prioritizes the most relevant information
  • Logical thinking is evident in daily decision-making
  • Often weighs the pros and cons of multiple options before selecting a course of action

“Eric’s critical thinking skills have consistently exceeded expectations throughout his tenure at the company. He is skilled at reviewing and analyzing complex information, leading him to provide well-reasoned recommendations and insights. Eric regularly demonstrates a deep understanding of complicated concepts, which allows him to excel in his role.”

“In this evaluation period, Jane has consistently demonstrated an exceptional ability to think critically and analytically. She has repeatedly shown skill in identifying complex issues while working on projects and has provided well-thought-out and effective solutions. Her innovative ideas have contributed significantly to the success of several key initiatives. Moreover, Jane’s decision-making skills are built on sound reasoning, which has led to positive outcomes for the team and organization. Additionally, she actively seeks opportunities to acquire new information and apply it to her work, further strengthening her critical thinking capabilities.”

“John consistently exceeds expectations in his critical thinking abilities. He routinely identifies potential challenges and provides thoughtful solutions. He is skilled at recognizing and prioritizing the most relevant information to make well-informed decisions. John regularly weighs the pros and cons of various options and selects the best course of action based on logic.”

3 – Meets Expectations

Employees meeting expectations in critical thinking skills demonstrate an ability to analyze information and draw logical conclusions. They are effective at problem-solving and can make informed decisions with minimal supervision.

  • Capable of processing information and making informed decisions
  • Displays problem-solving skills
  • Demonstrates logical thinking and reasoning
  • Consistently demonstrates the ability to analyze problems and find possible solutions
  • Actively engages in group discussions and contributes valuable ideas
  • Demonstrates the ability to draw conclusions based on logical analysis of information
  • Shows willingness to consider alternative perspectives when making decisions
  • Weighs the pros and cons of a situation before reaching a decision
  • Usually identifies relevant factors when faced with complex situations
  • Demonstrates an understanding of cause and effect relationships
  • Generally uses sound reasoning to make decisions
  • Listens to and considers different perspectives

“Sarah consistently meets expectations in her critical thinking skills, successfully processing information and making informed decisions. She has shown her ability to solve problems effectively and displays logical reasoning when approaching new challenges. Sarah continues to be a valuable team member thanks to these critical thinking skills.”

“Jane is a team member who consistently meets expectations with regard to her critical thinking skills. She demonstrates an aptitude for analyzing problems within the workplace and actively seeks out potential solutions by collaborating with her colleagues. Jane is open-minded and makes an effort to consider alternative perspectives during decision-making processes. She carefully weighs the pros and cons of the situations she encounters, which helps her make informed choices that align with the company’s objectives.”

“David meets expectations in his critical thinking skills. He can usually identify the relevant factors when dealing with complex situations and demonstrates an understanding of cause and effect relationships. David’s decision-making is generally based on sound reasoning, and he listens to and considers different perspectives before reaching a conclusion.”

2 – Needs Improvement

Employees in need of improvement in critical thinking skills may struggle with processing information and drawing logical conclusions. They may require additional guidance when making decisions or solving problems.

  • Struggles with analyzing complex information
  • Requires guidance when working through challenges
  • Difficulty applying past experiences to new situations
  • With some guidance, Jane is able to think critically, but she struggles to do so independently.
  • John tends to jump to conclusions without analyzing a situation fully.
  • Sarah’s problem-solving skills need improvement, as she often overlooks important information when making decisions.
  • David’s critical thinking skills are limited and need further development to enhance his overall work performance.
  • Occasionally struggles to identify and analyze problems effectively
  • Inconsistently uses logic to make decisions
  • Often overlooks important information or perspectives
  • Requires guidance in weighing options and making judgments

“Bob’s critical thinking skills could benefit from further development and improvement. He often struggles when analyzing complex information and tends to need additional guidance when working through challenges. Enhancing Bob’s ability to apply his past experiences to new situations would lead to a notable improvement in his overall performance.”

“Jenny is a valuable team member, but her critical thinking skills need improvement before she will be able to reach her full potential. In many instances, Jenny makes decisions based on her first impressions without questioning the validity of her assumptions or considering alternative perspectives. Her tendency to overlook key details has led to several instances in which her solutions are ineffective or only partly beneficial. With focused guidance and support, Jenny has the potential to develop her critical thinking skills and make more informed decisions in the future.”

“Tom’s critical thinking skills require improvement. He occasionally struggles to identify and analyze problems effectively, and his decision-making is inconsistent in its use of logic. Tom often overlooks important information or perspectives and may require guidance in weighing options and making judgments.”

1 – Unacceptable

Employees with unacceptable critical thinking skills lack the ability to analyze information effectively, struggle with decision-making, and fail to solve problems without extensive support from others.

  • Fails to draw logical conclusions from information
  • Incapable of making informed decisions
  • Unable to solve problems without extensive assistance
  • Fails to analyze potential problems before making decisions
  • Struggles to think critically and ask relevant questions
  • Cannot effectively identify alternative solutions
  • Lacks the ability to apply logic and reason in problem-solving situations
  • Does not consistently seek input from others or gather information before making a decision
  • Regularly fails to recognize or address important issues
  • Makes hasty decisions without considering potential consequences
  • Lacks objectivity and often relies on personal biases
  • Resistant to alternative viewpoints and constructive feedback

“Unfortunately, Sue’s critical thinking skills have been consistently unacceptable. She fails to draw logical conclusions from available information and is incapable of making informed decisions. Sue has also shown that she is unable to solve problems without extensive assistance from others, which significantly impacts her performance and the team’s productivity.”

“Jane’s performance in critical thinking has been unacceptable. She often fails to analyze potential problems before making decisions and struggles to think critically and ask relevant questions. Jane’s inability to effectively identify alternative solutions and apply logic and reason in problem-solving situations has negatively impacted her work. Furthermore, she does not consistently seek input from others or gather information before making a decision. It is crucial for Jane to improve her critical thinking skills to become a more effective and valuable team member.”

“Susan’s critical thinking skills are unacceptable. She regularly fails to recognize and address important issues, and her decision-making is often hasty, with little consideration of potential consequences. Susan frequently lacks objectivity and tends to rely on personal biases. She is resistant to alternative viewpoints and constructive feedback, which negatively affects her work performance.”



Article • 8 min read

Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills you need for critical thinking, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. By thinking outside the box, you can use the pieces of information you already have to identify new possible outcomes.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases, and they can be difficult to identify in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking, for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.


Original Research Article

Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation


  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas involving ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) The storyline describes a carefully curated version of a complex, real-world situation. (2) The challenge frames the task to be accomplished. (3) A portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics. (4) The scoring rubric comprises a set of scales, each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program that involve various refinements and extensions of the assessment framework, a number of empirical studies, along with linkages to current work in online reading and information processing.

Introduction

In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010; Hyytinen et al., 2019). The importance of CT is echoed by business leaders (Association of American Colleges and Universities [AACU], 2018), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents (Indiana University, 2019). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard (Arum and Roksa, 2011; Shavelson et al., 2019; Zlatkin-Troitschanskaia et al., 2019). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today, when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose (Leu et al., 2020).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment.

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills (Pellegrino and Hilton, 2012) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct (Liu et al., 2014). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument (Stolzenberg et al., 2019).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003; Kosslyn and Nelson, 2017; Shavelson et al., 2018). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information (Oser and Biedermann, 2020).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis, is the most complex of the three. Critical Analysis requires both knowledge in a specific discipline (conceptual) and procedural analytical (deduction, inclusion, etc.) knowledge. The second level is Critical Reflection, which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one must not only apply analytic reasoning, but also adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of the various actors involved in the dilemma of interest. The third level, Critical Alertness, involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types: (i) when solving problems and making decisions in professional and everyday life, for instance, related to civic affairs and the environment; and (ii) in situations where various mental processes (e.g., comparing, evaluating, and justifying) are developed through formal instruction, usually in a discipline. Hence, in both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments (Nagel et al., 2020). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solving a problem, deciding on a course of action, finding an answer to a given question, or reaching a conclusion) (Shavelson et al., 2019). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of action; and communicating decisions and actions clearly and concisely. The order in which these activities are carried out can vary among individuals, and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also from the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – to which individuals have to apply these skills (Kegan, 1994; Tessier-Lavigne, 2020). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second-generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about and acting in learning, professional, and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. For those interested in current approaches to the measurement of CT that are not the focus of this paper, consult Zlatkin-Troitschanskaia et al. (2018).

In this paper, we have singled out performance assessment as it offers important advantages for measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer, and scenario-based tasks (for an overview, see Liu et al., 2014). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct, such as perspective taking and communication. High-fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses that are more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items (Messick, 1994; Braun, 2019). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, using traditional assessments (Lane and Stone, 2006; Braun, 2019; Shavelson et al., 2019). However, these assertions must be empirically validated, and the measures should be subjected to psychometric analyses. Evidence on the reliability, validity, and interpretative challenges of performance assessment (PA) is extensively detailed in Davey et al. (2015).

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment (Davey et al., 2015, p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019). In this paper, we refer to both individual performance tasks and constructed-response tasks as performance tasks (PTs) (for an example, see Table 1 in section “iPAL Assessment Framework”).


Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth in higher education of research and practice in measuring CT with performance tasks (Shavelson et al., 2018). In this section, we present iPAL’s assessment framework as the basis for measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council for Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication (Klein et al., 2007; Shavelson, 2010). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PTs: one in which students critique an argument, and one in which they provide a solution in response to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document which elicits an open-ended response. The CLA+ added the SRQ section (which is not linked substantively to the PT scenario) to increase the number of student responses and thereby obtain more reliable estimates of performance at the student level than could be achieved with a single PT (Zahner, 2013; Davey et al., 2015).

iPAL Assessment Framework

Methodological Foundations

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007). It was also informed by the results from the AHELO pilot study (Organisation for Economic Co-operation and Development [OECD], 2012, 2013), as well as the KoKoHs research program in Germany (for an overview, see Zlatkin-Troitschanskaia et al., 2017, 2020). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) (Mislevy et al., 2003; Mislevy and Haertel, 2006; Haertel and Fujii, 2017).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct (Mislevy et al., 2003). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to PT developers. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative such as behavioral samples from real-world situations of interest (criterion sampling; McClelland, 1973), as well as the intended use(s) (for an example, see Shavelson et al., 2019). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument (Messick, 1994).

An assessment framework can be specified at different levels of granularity: an assessment battery (an “omnibus” assessment; for an example, see below), a single performance task, or a specific component of an assessment (Shavelson, 2010; Davey et al., 2015). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model, which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability), and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant, and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rest on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010, 2013). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: a storyline, a challenge, a document library, and a scoring rubric. Table 1 displays these aspects, brief descriptions of each, and corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019). Storylines are drawn from various domains; for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto, which adapted for purposes of the assessment, and which discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends, in large part, on the nature and extent of the information provided in the document library, the amount of scaffolding included, as well as the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to play to judgmental errors due to fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit. A minimal sketch of these four aspects as a data structure follows.
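
The following is a minimal sketch (ours, not iPAL code; the project specifies the framework in prose and tables, not software) of how the four aspects could be represented as data types. The field names and document characteristics are illustrative assumptions only:

```python
# Sketch of the four aspects of the iPAL 'omnibus' framework as data types:
# a storyline, a challenge, a document library, and a scoring rubric.
# All field names here are hypothetical, chosen to mirror the prose above.

from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    source: str         # e.g., news article, blog post, official statistics
    trustworthy: bool   # characteristics deliberately varied by the task designer
    relevant: bool

@dataclass
class RubricDimension:
    facet: str          # facet of CT this scale is linked to
    anchors: dict[int, str] = field(default_factory=dict)  # score level -> behavioral anchor

@dataclass
class PerformanceTask:
    storyline: str                  # curated version of a complex, real-world situation
    challenge: str                  # the task the respondent must accomplish
    library: list[Document]         # portfolio of documents with chosen characteristics
    rubric: list[RubricDimension]   # one scale per facet of the construct
```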

The dimensions of the scoring rubric are derived from the Task Model and Student Model (Mislevy et al., 2003); they signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model. More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric, namely the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully, as indicated by interrater agreement coefficients, to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria (Braun, 2019).

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks (Lane and Stone, 2006). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019, p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).
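
To illustrate how such an analytic dimensional scoring could be aggregated, the following minimal Python sketch rolls indicator-level ratings up into dimension scores and a total score. The dimension labels follow the quoted construct definition; the indicator names and the unweighted averaging are our assumptions for illustration, not the actual 23 indicators or aggregation rules used in the study:

```python
# Sketch: roll up one rater's indicator-level Likert ratings (six-point scale)
# into per-dimension scores and an overall score. Indicator names are hypothetical.

from statistics import mean

# One rater's ratings of one student response (1 = lowest, 6 = highest).
ratings = {
    "D1-Info":     {"identifies_relevant_sources": 5, "evaluates_trustworthiness": 4},
    "D2-Decision": {"weighs_arguments": 4, "justifies_decision": 3},
    "D3-Conseq":   {"anticipates_consequences": 4},
    "D4-Writing":  {"coherence": 5, "persuasiveness": 4},
}

def dimension_scores(ratings: dict) -> dict:
    """Average the indicator ratings within each dimension."""
    return {dim: mean(inds.values()) for dim, inds in ratings.items()}

scores = dimension_scores(ratings)
total = mean(scores.values())  # unweighted mean across the four dimensions

for dim, score in scores.items():
    print(f"{dim}: {score:.2f}")
print(f"Total: {total:.2f}")
```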

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” (Shavelson et al., 2019, p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019, p. 478), three score profiles (low-, middle-, and high-performers) were identified for students. Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores, such as poor CT skills, a lack of disposition to engage with the challenge, or the two attributes jointly. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence (Davey et al., 2015; Zlatkin-Troitschanskaia et al., 2019). These kinds of concerns are less critical when PTs are used in classroom settings. The instructor can draw on other sources of evidence, including direct discussion with the student.
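
For intuition about the relation between the two reliability figures quoted above, a simplified single-facet calculation (ours, not the authors') treats raters as the only source of measurement error and applies the Spearman-Brown prophecy formula to project the 2-rater coefficient to 4 raters:

```latex
% Simplified illustration (not from the paper): with raters as the sole error
% facet, doubling the number of raters changes the coefficient as follows.
\[
  \rho_{4} \;=\; \frac{2\,\rho_{2}}{1 + \rho_{2}}
         \;=\; \frac{2 \times 0.74}{1 + 0.74}
         \;\approx\; 0.85
\]
```

The projected 0.85 is close to the reported 0.84; an exact match is not expected, since the actual generalizability study also modeled an items facet, whose error contribution does not shrink as raters are added.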

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a particular discipline (e.g., economics, law, or medicine), for example, then the framework must specify the characteristics of the narrative and the complementary documents with respect to the breadth and depth of disciplinary knowledge represented.

At present, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally an inadequate amount of time for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019). Accordingly, it would be helpful to develop modified frameworks for PTs that require substantially less time. For an example, see a short performance assessment of civic online reasoning, requiring response times from 10 to 50 min (Wineburg et al., 2016). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT, and by specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PTs, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PTs could represent all facets of the construct while affording instructors and students more specific insights into strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short-answer and/or multiple-choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills, such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills. The parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT, taking 90 min or more, could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing faculty some flexibility to tailor the assessment to their unique needs. In so doing, it addresses a key weakness of the AAC&U's VALUE initiative 2 (retrieved 5/7/2020), which has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains, including CT, problem-solving, and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions, with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language intended to clearly differentiate among levels. 3 Faculty are asked to submit student work products from a senior-level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge nor any control over task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses, which in turn causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than on other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures (Braun, 2019).

Concluding Reflections

Challenges to Interpretation and Implementation

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010; Shavelson et al., 2019). The attraction rests mainly on the assumption that elaborated PT's are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a "promissory note" that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric qualities such as reliability (Davey et al., 2015).

One reason for Messick's (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence (American Educational Research Association et al., 2014). Following ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes one important type of evidence. Further, as Leighton (2019) argues, response process data ("cognitive validity") are needed to validate claims regarding the cognitive complexity of PT's. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye-tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity but also valuable information to guide refinements of the PT.

Going forward, iPAL PT's must be subjected to validation studies as recommended in the Standards for Educational and Psychological Testing (American Educational Research Association et al., 2014). With a particular focus on the criterion "relationships to other variables," such a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators' relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the need to evaluate construct validity is the need to consider potential sources of construct-irrelevant variance (CIV). One source pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, their performance is likely to be affected by factors unrelated to their (construct-relevant) ability (Lane and Stone, 2006; Braun et al., 2011; Shavelson, 2013). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT were administered in the context of a course, with the promise of generating useful feedback on students' skill profiles.

Construct-irrelevant variance can also arise when students are not equally prepared for the format of the PT or do not fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT's. Finally, the use of novel forms of documentation, such as those drawn from the Internet, can introduce CIV due to differential familiarity with the forms of representation or contents. Interestingly, this suggests that there may be a tension between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and in their use of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of students' responses or the frequency of grammatical errors (Lane and Stone, 2006). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to measure student performance reliably (Messick, 1994; Davey et al., 2015). Indeed, it is well known that more than one performance task is needed to obtain high reliability (Shavelson, 2013), owing to both student-task interactions and variability in scoring. Sources of student-task interactions include differential familiarity with the topic (Hyytinen and Toom, 2019) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For formative assessment within an instructional program, reliability can be lower than for summative purposes, since other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required by many psychometric models might be untenable for performance tasks (Davey et al., 2015). Davey et al. (2015) give the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students' reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multidimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic (Lane and Stone, 2006). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories (Zlatkin-Troitschanskaia et al., 2019).
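To illustrate the latent class idea concretely, analytic rubric scores can be clustered into a small number of profiles, such as the low-, middle-, and high-performer profiles noted above. The following sketch uses scikit-learn's Gaussian mixture as a convenient stand-in for a formal latent class model; the simulated data, the six rubric dimensions, and the three-class choice are all illustrative assumptions, not the published analysis.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# One row per student, one column per analytic-rubric dimension
# (random data standing in for real rubric scores).
rng = np.random.default_rng(0)
rubric_scores = rng.integers(0, 5, size=(300, 6)).astype(float)

# Fit a three-component mixture and assign each student to a profile.
gm = GaussianMixture(n_components=3, random_state=0).fit(rubric_scores)
profiles = gm.predict(rubric_scores)

# Inspect the mean rubric profile of each latent class.
for k in range(3):
    print(f"class {k}:", rubric_scores[profiles == k].mean(axis=0).round(2))
```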

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PT's and their adaptation to different languages and cultures must continue. To date, there are a number of examples: the refugee crisis PT (cited in Table 1) was translated and adapted from Finnish to US English and then to Colombian Spanish; a PT concerning kidney transplants was translated and adapted from German to US English; and two PT's based on 'legacy admissions' to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially of the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation (Zlatkin-Troitschanskaia et al., 2019). In addition, more intensive study of response processes, through cognitive laboratories and the like, is needed to strengthen the evidential argument for construct validity (Leighton, 2019). We are currently conducting empirical studies, collecting data on both iPAL PT's and other measures of CT; these studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support the different ways CT PT's might be used (i.e., use cases), especially those that call for formative use of PT's. Incorporating formative assessment into courses can plausibly be expected to improve students' competency acquisition (Zlatkin-Troitschanskaia et al., 2017). With suitable choices of storylines, appropriate combinations of (modified) PT's, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PT's (as with the CLA+), loosely coupled with the PT's (as in drawing on the same storyline), or tightly linked to the PT's (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students' generic CT skills. Core curriculum or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at the population level. As another example, these PA's could be used to assess the competence profiles of students entering Bachelor's or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding, both in the extensive preliminary work that task development requires and in the time students need to complete a task properly. It would therefore be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents, with the challenge to the student suitably reduced.

Some members of the iPAL collaborative have developed PT's that are embedded in disciplines such as engineering, law, and education (Crump et al., 2019; for teacher education examples, see Jeschke et al., 2019). These are proving to be of great interest to various stakeholders, and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. Whether a single framework can guide development across different domains is both a conceptual and an empirical question.

Performance Assessment in Online Learning Environments

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models of the new literacies that attempt to capture key characteristics of these activities. A prominent example is the model proposed by Leu et al. (2020), which frames online reading as a process of problem-based inquiry involving five practices that occur during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest that there may be benefits from closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, supplementary materials can now include archival photographs, audio recordings, or videos. Additional tasks might include an online search for relevant documents, though this would add considerably to the time demands. Such a search could take place within a simulated Internet environment, as is the case for the IEA's ePIRLS assessment (Mullis et al., 2017).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting, yet it can also add ambiguity and information overload. Increased authenticity, then, must be weighed against validity concerns and the time required to absorb the content of these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students' responses could be scored automatically, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Taking advantage of the online environment to incorporate new types of supplementary documents – and, perhaps, to introduce new response formats – should therefore be a high priority. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. 201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ https://www.ipal-rd.com/
  • ^ https://www.aacu.org/value
  • ^ When test results are reported by means of substantively defined categories, the scoring is termed "criterion-referenced." This is in contrast to results reported as percentiles; such scoring is termed "norm-referenced."

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.


Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is Value? Available online at: https://www.aacu.org/value (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: https://www.aacu.org/research/2018-future-of-work (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274


Braun, H. I., Kirsch, I., and Yamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019


Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: http://fsse.indiana.edu/pdf/FSSE_IR_2019/summary_tables/FSSE19_Frequencies_(FSSE_2019).pdf (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MAL: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MA: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for "intelligence." Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4-9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at: http://timssandpirls.bc.edu/pirls2016/international-results/ (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO: Feasibility Study Report. Vol. 1: Design and Implementation. Paris: OECD.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO: Feasibility Study Report. Vol. 2: Data Analysis and National Experiences. Paris: OECD.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for life and work: Developing Transferable Knowledge and Skills in the 21st Century. Washington DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Ed. Available online at: https://www.insidehighered.com/views/2009/10/16/limitations-portfolios

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity – CLA+. Council for Aid to Education. Available online at: https://pdfs.semanticscholar.org/91ae/8edfac44bce3bed37d8c9091da01d6db3776.pdf

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords: critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.


Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Henry I. Braun, [email protected]

This article is part of the Research Topic

Assessing Information Processing and Online Reasoning as a Prerequisite for Learning in Higher Education

Technology-scaffolded peer assessment for developing critical thinking in pre-service teacher training: the importance of giving feedback

  • Development Article
  • Published: 05 December 2022
  • Volume 71, pages 667–688 (2023)


  • Camila Barahona 1,3,
  • Miguel Nussbaum 1,
  • Vicente Martin 1,
  • Alejandra Meneses 2,
  • Silvana Arriagada 1,
  • Angela Di Serio 4,5 &
  • Isabel Hilliger 1


Developing critical thinking is becoming increasingly important, as is giving and receiving feedback during the learning process. The aim of this work is to study how technology can scaffold peer assessment activities to develop critical thinking among pre-service teachers, and to examine the relevance of giving and receiving feedback. A series of practice and application activities were introduced using technology-scaffolded peer assessment. The technological scaffolding minimized classroom logistics and, because tasks were assigned at random, avoided personal issues between peers. Mixed-methods analysis revealed that technology-scaffolded peer assessment with anonymous feedback significantly aided the development of critical thinking, and that the feedback given was a predictor of the success of these activities. The added value of this work is to show that, for pre-service teachers in a Reading Methods course, critical thinking skills can be improved with technology-scaffolded peer assessment, and that giving feedback proves to be more relevant than receiving it.



Data availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.



Acknowledgements

This work was funded by ANID / CONICYT 1180024 and 2116576.

Author information

Authors and Affiliations

Escuela de Ingeniería, Pontificia Universidad Católica de Chile. Avenida Vicuña Mackenna, 4860, Santiago, Chile

Camila Barahona, Miguel Nussbaum, Vicente Martin, Silvana Arriagada & Isabel Hilliger

Facultad de Educación, Pontificia Universidad Católica de Chile. Avenida Vicuña Mackenna, 4860, Santiago, Chile

Alejandra Meneses

Instituto de Éticas Aplicadas, Pontificia Universidad Católica de Chile. Avenida Vicuña Mackenna, 4860, Santiago, Chile

Camila Barahona

Escuela Superior de Ingeniería, Ciencia y Tecnología, Universidad Internacional de Valencia, Valencia, Spain

Angela Di Serio

Departamento de Computación y Tecnología de la Información, Universidad Simón Bolívar, Caracas, Venezuela

Angela Di Serio


Corresponding author

Correspondence to Camila Barahona.

Ethics declarations

Ethical approval

The study received approval from the University's ethics committee (protocol identifier number: 160915003). Before participating in the first peer assessment activity, the students were informed of the purpose of the research and the specific activities involved. The voluntary nature of participation in the study was also highlighted.

Informed consent

The students were requested to sign an informed consent form.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 65 kb)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Barahona, C., Nussbaum, M., Martin, V. et al. Technology-scaffolded peer assessment for developing critical thinking in pre-service teacher training: the importance of giving feedback. Education Tech Research Dev 71 , 667–688 (2023). https://doi.org/10.1007/s11423-022-10173-1

Download citation

Accepted : 19 November 2022

Published : 05 December 2022

Issue Date : April 2023

DOI : https://doi.org/10.1007/s11423-022-10173-1


Keywords

  • Critical thinking
  • Pre-service teachers
  • Peer assessment
  • Giving feedback
  • Technological scaffolding


Constructing a critical thinking evaluation framework for college students majoring in the humanities

Associated Data

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author/s.

Abstract

Education for sustainable development (ESD) focuses on promoting sustainable thinking skills, capacities, and abilities for learners at different educational stages. Critical thinking (CT), one of the key competencies in ESD, plays an important role in the lifelong development of college students. Developing a sound framework for assessing college students' CT is important for understanding their level of CT. Therefore, this study aimed to construct a reliable self-evaluation CT framework for college students majoring in the humanities.

Exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis were conducted to examine the reliability and validity of the CT evaluation framework. Data were collected from 642 college students majoring in the humanities, and the sample was randomly divided into two subsamples (n1 = 321, n2 = 321).
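As a sketch of how such a split-sample workflow can be reproduced, the snippet below uses Python's factor_analyzer package to run an EFA on one random half of the data. The file name, item columns, and the seven-factor choice are illustrative assumptions, not the authors' actual code.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item-level responses, one row per student.
df = pd.read_csv("ct_items.csv")

# Random split into two equal subsamples: EFA on one half, CFA on the other.
half1 = df.sample(frac=0.5, random_state=42)
half2 = df.drop(half1.index)

# Seven-factor EFA with an oblique (promax) rotation.
fa = FactorAnalyzer(n_factors=7, rotation="promax")
fa.fit(half1)
print(fa.loadings_.round(2))  # pattern matrix of item loadings
```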

The Cronbach's alpha coefficient for the whole scale was 0.909, and the Cronbach's alpha coefficients for the individual factors ranged from 0.724 to 0.878. CFA was then conducted as part of the validity study of the scale, confirming the structure of the seven-factor scale. The results indicated that the constructed evaluation framework was consistent with the collected data. CFA also confirmed a good model fit for the 22 items of the college students' CT framework (χ2/df = 3.110, RMSEA = 0.056, GFI = 0.927, AGFI = 0.902, NFI = 0.923, and CFI = 0.946).
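Both statistics reported here are straightforward to compute. Below is a minimal sketch, reusing the hypothetical ct_items.csv from above and an illustrative three-factor fragment of the measurement model (the full specification would list all seven factors and their actual items); semopy is one Python SEM package whose fit output includes the same indices (chi2, RMSEA, GFI, AGFI, NFI, CFI).

```python
import pandas as pd
import semopy

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

df = pd.read_csv("ct_items.csv")  # hypothetical item-level data
print(round(cronbach_alpha(df), 3))  # whole-scale alpha (reported value: 0.909)

# Illustrative CFA fragment: three of the seven factors, three items each.
desc = """
MO =~ mo1 + mo2 + mo3
AT =~ at1 + at2 + at3
OM =~ om1 + om2 + om3
"""
model = semopy.Model(desc)
model.fit(df)
print(semopy.calc_stats(model).T)  # includes chi2, RMSEA, GFI, AGFI, NFI, CFI
```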

These findings revealed that the CT self-evaluation scale is a valid and reliable instrument for measuring the CT abilities of college students in the humanities. The framework comprises three dimensions: discipline cognition (DC), CT disposition, and CT skills. CT disposition consists of motivation (MO), attention (AT), and open-mindedness (OM), while CT skills include clarification skills (CS), organization skills (OS), and reflection (RE). The framework can therefore serve as an effective instrument for measuring college students' CT. Suggestions are also put forward regarding how to apply the instrument in future studies.

Introduction

Nowadays, individuals should be equipped with the abilities to identify problems, think deeply, and generate effective solutions in order to cope with the various risks and challenges brought about by the rapid development of science and technology (Arisoy and Aybek, 2021). In this context, critical thinking (CT) is gaining increasing attention. Promoting college students' CT is an important way of improving their problem-solving and decision-making abilities and thereby enhancing their lifelong development (Feng et al., 2010). Although human beings are not born with CT abilities (Scriven and Paul, 2005), these abilities can be acquired through learning and training, and they are sustainable (Barta et al., 2022).

CT should be particularly valued in the field of education (Pnevmatikos et al., 2019). Students should be good thinkers who can apply critical evaluation, find and collate evidence for their views, and maintain a questioning attitude toward the validity of facts provided by their teachers or fellow students (Sulaiman et al., 2010). Many countries have made the development of students' CT one of their fundamental educational goals (Flores et al., 2012; Ennis, 2018). CT helps students develop constructive, creative, and productive thinking, fosters their independence (Wechsler et al., 2018; Odebiyi and Odebiyi, 2021), and gives them the power to broaden their horizons (Les and Moroz, 2021). Meanwhile, college students with a high level of CT abilities are likely to perform better in their future careers (Stone et al., 2017; Cáceres et al., 2020). Therefore, college students should be capable of learning to access knowledge, solve problems, and embrace different ideas in order to develop their CT ability (Ulger, 2018; Arisoy and Aybek, 2021).

Given the significance of CT abilities at all education levels and across disciplines, how to cultivate students' CT abilities has been a focus of CT-related research (Fernández-Santín and Feliu-Torruella, 2020). Many studies have shown that inquiry-based learning activities or programs are an effective way to exercise and enhance students' CT abilities (Thaiposri and Wannapiroon, 2015; Liang and Fung, 2020; Boso et al., 2021; Chen et al., 2022). Students need not only the motivation and belief to participate actively in such learning activities and commit to problem solving, but also the learning skills to cope with the problems they may encounter in problem-solving-oriented learning activities. These requirements are in line with the cultivation of students' CT abilities. Meanwhile, research has also indicated an interrelationship between problem solving and CT (Dunne, 2015; Kanbay and Okanlı, 2017).

However, another important issue is how to test whether learning activities actually improve students' CT abilities, which requires reliable CT measurement instruments. Some CT measurement frameworks have been developed to meet the need to cultivate CT abilities in teaching and learning activities (Saad and Zainudin, 2022), but these existing frameworks still have shortcomings. Most studies on college students' CT are in the field of science; there is very little research on students in the humanities, and even less on developing CT assessment frameworks specifically for them. Only Khandaghi et al. (2011) studied the CT disposition of college students in the humanities, finding that their CT abilities were at an intermediate level. Compared to the humanities, science disciplines seem to place more emphasis on logical and rational thinking, which might cater more to the development of CT abilities (Li, 2021). However, it is also vital for college students in the humanities to engage in rational thinking processes (Al-Khatib, 2019). Hence, it is worth evaluating the CT abilities of college students in the humanities by constructing a CT evaluation framework specifically for such students. In addition, previous CT measurements have tended to be constructed according to only one dimension of CT, either CT skills or CT disposition. The two are equally important, and the level of CT abilities can be assessed more comprehensively and accurately by measuring both dimensions simultaneously. Therefore, the purpose of this study was to develop a self-evaluation CT framework that integrates both the CT skills and disposition dimensions to comprehensively evaluate the CT ability of college students in the humanities.

Literature review

CT of college students in the humanities

CT is hardly a new concept; it can be traced back 2,500 years to the dialogs of Socrates (Giannouli and Giannoulis, 2021). In How We Think, Dewey (1933, p. 9; first edition, 1910) noted that thinking critically can help us move forward in our thinking. Since then, researchers have offered different explanations of CT from different perspectives. Some hold that CT means thinking with logic and reasonableness (Mulnix and Mulnix, 2010), while others suggest that CT refers to the specific learning process in which learners think critically to achieve learning objectives through decision making and problem solving (Ennis, 1987).

There is general consensus that CT involves two aspects: CT skills and CT disposition (Bensley et al., 2010; Sosu, 2013). CT skills refer to the abilities to understand problems and produce reasonable solutions to them, such as analysis, interpretation, and the drawing of conclusions (Chan, 2019; Ahmady and Shahbazi, 2020). CT disposition emphasizes the willingness of individuals to apply these skills when a problem or issue needs to be solved (Chen et al., 2020). CT disposition urges people to engage in a reflective, inferential thinking process about the information they receive (Álvarez-Huerta et al., 2022); specific CT skills are then applied in specific problem-solving processes. CT disposition is the motivation for critical behavior and an important quality for the learning and use of critical skills (Lederer, 2007; Jiang et al., 2018).

For college students, the cultivation of CT abilities is usually based on specific learning curricula (O'Reilly et al., 2022). Hence, many studies of students' CT have been conducted across disciplines. For example, in science education, Ma et al.'s (2021) study confirmed a significant relationship between CT and science achievement, so they suggested that it might be valuable to treat fostering CT as an important outcome of science education. In political science, when developing college students' CT, teachers should focus not only on the development of skills, but also on meta-awareness (Berdahl et al., 2021), which emphasizes the importance of CT disposition: learners not only need to acquire CT skills such as analysis, inference, and interpretation, but also need a clear cognition of how to apply these skills. Duro et al. (2013) found that psychology students valued explicit CT training. For students majoring in mathematics, Basri and Rahman (2019) developed an assessment framework to investigate students' CT when solving mathematical problems. This body of literature reflects the significant importance of CT for the development of students across disciplines. However, most studies on CT have been conducted in science subjects, such as mathematics, business, and nursing (Kim et al., 2014; Siew and Mapeala, 2016; Basri and Rahman, 2019); there have been few studies on the CT of students in the humanities (Ennis, 2018).

There is a widespread stereotype that, compared to humanities students, science majors are more logical, and so more attention should be paid to their CT (Lin, 2016). This begs the question: are all students in the humanities (e.g., history, pedagogy, Chinese language and literature, and so on) purely sensual or "romantic"? Do they not also need to develop independent, logical, and critical thinking? Can they depend only on "romantic" thinking? This may be a prejudice. In fact, the humanities are subjects that focus on human culture and society (Lin, 2020), and they should be seen as ends rather than tools. The academic literacy of the humanities needs to be developed and enhanced through a long-term, subtle learning process (Bhatt and Samanhudi, 2022), and its significance for individuals is profound. Hence, the humanities and the sciences play equally important roles in an individual's lifelong development. What, then, should students majoring in the humanities do to develop and enhance their professional competence? Chen and Wei (2021) suggested that individuals in the humanities should have the abilities to identify and tackle unstructured problems in order to adapt to changing environments, a suggestion in line with a developmental pathway for fostering CT. Therefore, developing CT abilities is an important way to foster the humanistic literacy of students in the humanities. Specifically, it is important for them to be equipped with the abilities to think independently and questioningly, to read individually, and to interpret texts in depth and in multiple senses. They also need to learn and understand the content of texts and evaluate the views of others in order to expand the breadth of their thinking (Barrett, 2005). Moreover, they need the ability to analyze issues dialectically and rationally, and to continually reflect on themselves and offer constructive comments (Klugman, 2018; Dumitru, 2019). Collegiate CT skills are taught via independent courses or embedded modules (Zhang et al., 2022), and the humanities are no exception. Yang (2007) designed thematic history projects, as independent courses, to foster students' disposition toward CT concerning the subject of history, and the results showed that such projects can support learners' development of historical literacy and CT. In short, the humanities also play an important role in fostering the development and enhancement of college students' CT, esthetic appreciation and creativity, and cultural heritage and understanding (Jomli et al., 2021). Good CT therefore also plays a crucial role in the lifelong development of students in the humanities.

An accurate assessment of the level of CT abilities is an important prerequisite for targeted improvement of students’ CT abilities in special disciplines ( Braeuning et al., 2021 ). Therefore, it might be meaningful to construct a self-evaluation CT framework for college students in the humanities according to their professional traits.

Evaluating college students’ CT

Given that CT can be cultivated (Butler et al., 2017), much attention has been paid to how to improve students' level of CT abilities in instruction and learning (Araya, 2020; Suh et al., 2021). However, it is also important to examine how CT can be better assessed, since the evaluation of thinking helps students think at higher levels (Kilic et al., 2020). Although the definitions of CT are contested (Hashemi and Ghanizadeh, 2012), many researchers have reached a consensus on its main components, skills and disposition (Bensley et al., 2016), and different CT evaluation frameworks have been developed according to one of these two dimensions. For example, Li and Liu (2021) developed a five-skill framework for high school students comprising analysis, inference, evaluation, construction, and self-reflection. Meanwhile, in recent years, the assessment of CT disposition has attracted the interest of a growing number of researchers. Sosu (2013) developed the Critical Thinking Disposition Scale (CTDS), which includes two dimensions: critical openness and reflective skepticism. The specific taxonomies of the evaluation frameworks of CT skills and dispositions are shown in Table 1. As illustrated in Table 1, there are some universal core items describing CT skills: the sub-dimensions of interpretation, analysis, inference, and evaluation are the important components of that dimension. These CT skills are usually applied along with the general process of learning activities (Hsu et al., 2022). For instance, at the beginning of a learning activity, students should use interpretation skills to gain a clear understanding of the issues raised and the knowledge required. Likewise, there are some universal core items describing CT dispositions, such as open-mindedness, attentiveness, flexibility, and curiosity.

Table 1. Taxonomies of the evaluation frameworks of CT skills and dispositions.

For a good critical thinker, dispositional CT and CT skills are equally important. Students need the awareness to apply CT abilities when thinking about problem solving, and then the ability to utilize a variety of CT skills in specific problem-solving processes. Therefore, we argue that a CT self-evaluation framework integrating the two dimensions will provide a more comprehensive assessment of college students' CT. For CT disposition, motivation, attentiveness, and open-mindedness were included as the three sub-dimensions. Motivation is an important prerequisite for all thinking activities (Rodríguez-Sabiote et al., 2022); especially in problem-solving-oriented learning activities, the development of CT abilities is significantly influenced by the level of motivation (Berestova et al., 2021). Attentiveness refers to the learner's state of concentration during the learning process, which reflects their level of commitment to learning and plays a crucial role in the development of CT abilities. Open-mindedness requires learners to keep an open mind toward the views of others when engaging in learning activities. These three sub-dimensions reflect learners' disposition to think critically. Especially in the humanities, it is only through in-depth communication between learners that a clash of minds and an improvement in abilities can take place (Liu et al., 2022), so it is essential that learners maintain a high level of motivation, attentiveness, and open-mindedness in this process to develop their CT abilities. For CT skills, three sub-dimensions were likewise selected to measure the level of learners' CT skills: clarification skills, organization skills, and reflection. In the humanities, the abilities to understand, analyze, and describe literature and problems comprehensively and exactly are essential for students (Chen and Wei, 2021). Clarification is followed by the abilities to extract the key information about the problem, to organize and process it, and to structure it with the help of organizational tools such as diagrams and mind maps. Finally, the whole problem-solving process is reflected upon and evaluated (Ghanizadeh, 2016); research has shown that reflection-based learning interventions can significantly improve learners' CT abilities (Chen et al., 2019).

Research purpose

CT plays an important role in college students' academic and lifelong career development (Din, 2020). Current measurements of college students' CT can be improved in two main ways.

Firstly, previous studies have paid insufficient attention to the discipline cognition related to CT. Generally, students' CT abilities can be cultivated in two contexts: the subject-specific instructional context and the general-skills instructional context (Ennis, 1989; Swartz, 2018). In authentic teaching and learning contexts, the generation and development of CT usually takes place in problem-oriented learning activities (Liang and Fung, 2020), in which students achieve their learning objectives by identifying and solving problems. According to Willingham (2007), to think critically one must have a sound knowledge base regarding the problem or topic of enquiry and view it from multiple perspectives. Because disciplines differ in nature, the format of specific learning activities should also vary. Hence, an adequate cognition of the discipline is an important prerequisite for learning activities; meanwhile, college students' level of cognition of their discipline should also be an important criterion for understanding their own level of CT abilities. Cognition refers to the acquisition of knowledge through mental activity (e.g., forming concepts, perceptions, judgments, or imagination; Colling et al., 2022), and learners' thinking, beliefs, and feelings affect how they behave (Han et al., 2021). By analogy, discipline cognition refers to an individual's understanding of their discipline's background and knowledge (Flynn et al., 2021), and cognition should be an important variable in CT instruction (Ma and Luo, 2020). In the current study, we added the dimension of discipline cognition to the self-evaluation CT framework for college students in the humanities. Moreover, to represent the learning contexts of humanities disciplines, the specific item descriptions concern knowledge of the humanities (e.g., "I can recognize the strengths and limitations of the discipline I am majoring in" and "Through studying this subject, my understanding of the world and life is constantly developing").

Secondly, the measurement factors of CT skills and disposition should be made more specific to the humanities background. In previous studies, researchers tended to measure students' CT in terms of only one of its two dimensions. CT skills have typically been measured from perspectives such as analysis, interpretation, inference, self-regulation, and evaluation. However, in specific learning processes, how should students concretely analyze and interpret the problems they encounter, and how can they self-regulate their learning processes and evaluate their learning outcomes? These issues should also be considered in order to evaluate college students' levels of CT abilities more accurately. The current study therefore attempted to construct a CT framework in a more specific way, integrating both the disposition and skills dimensions of CT. What specific factors, then, would work well as dimensions for evaluating the CT abilities of college students in the humanities? Firstly, students' disposition to think critically is assessed in terms of three sub-dimensions, motivation, attention, and open-mindedness, to help students understand the strength of their own awareness to engage in CT (Bravo et al., 2020). Motivation is an important prerequisite for all thinking activities (Rodríguez-Sabiote et al., 2022), and it contributes to the development of engagement, behavior, and the analysis of problems (Berestova et al., 2021); a positive relationship has also been found between academic motivation and CT. Therefore, in the current study, motivation remains one of the crucial factors. The sub-dimension of attentiveness, which investigates the persistence of attention, is another important measurement factor, as attentiveness positively influences a variety of student behaviors (Reynolds, 2008). The sub-dimension of open-mindedness mainly assesses college students' flexibility of thinking, which is also an important factor of CT (Southworth, 2020); a good critical thinker should be receptive, with an open-minded attitude, to views that challenge their own prior beliefs (Southworth, 2022). Secondly, college students' CT skills are assessed in three sub-dimensions, clarification skills, organization skills, and reflection, with the aim of understanding how well students use CT skills in the problem-solving process (Tumkaya et al., 2009). These three sub-dimensions are consistent with the specific learning process of problem solving, which begins with a clear description and understanding of the problem, i.e., clarification skills. In the humanities, the ability to understand, analyze, and describe literature and problems comprehensively and exactly is an essential competence (Chen and Wei, 2021).

We thus constructed a model for evaluating the CT of college students in the humanities (see Figure 1 ). The proposed evaluation framework incorporates three dimensions: discipline cognition (DC), CT disposition, and CT skills. Among them, CT disposition includes the three sub-dimensions of motivation (MO), attention (AT), and open-mindedness (OM), while CT skills include the three sub-dimensions of clarification skills (CS), organization skills (OS), and reflection (RE). In other words, this study aimed to construct a seven-dimensional evaluation framework and to test whether it is an effective instrument for measuring the CT of college students in the humanities.

Figure 1. A model for evaluating the CT abilities of college students in the humanities.

Materials and methods

Research design

In order to address the two problems of existing college students' CT evaluation frameworks mentioned above, a CT self-evaluation framework for college students in the humanities was preliminarily developed in this study, comprising seven factors: discipline cognition (2 items), motivation (5 items), attentiveness (5 items), open-mindedness (5 items), clarification skills (3 items), organization skills (3 items), and reflection (4 items).

Then, to ensure the content validity of the measurement framework, four experts who have studied CT and five teachers working in the humanities were invited to review all items and give feedback. The research team compared the similarities and differences in the experts' opinions and made joint decisions. Meanwhile, to ensure the clarity, accuracy, and objectivity of the items, 25 college students majoring in the humanities took a pretest, and the presentation and description of the items were improved according to their feedback. Finally, a questionnaire consisting of 30 items was constructed: three items on participants' socio-demographic information (gender, grade, and subject), two on discipline cognition, five on motivation, five on attention, five on open-mindedness, three on clarification skills, three on organization skills, and four on reflection (as shown in Table 2). Each item used a 5-point Likert scale (5 = strongly agree, 4 = agree, 3 = neutral, 2 = disagree, 1 = strongly disagree).

Table 2. Dimensions and items of the college students' CT evaluation framework.

Participants and data collection

In the current study, simple random sampling was adopted, and the online questionnaire was hosted on Questionnaire Star (www.wjx.cn; accessed on 18 March 2022), a professional online survey tool widely used in China (Sarjinder, 2003). The link to the online questionnaire was sent to humanities teachers at several colleges in Jiangsu, China, who then forwarded it to their students. The first part of the questionnaire informed students that they were participating in an anonymous study whose content might be published without any commercial use; those who did not want to participate could simply leave the questionnaire website, while those who agreed filled it in. In addition, to ensure the reliability of the subsequent data analysis, the ratio of questionnaire items to participants should be at least 1:5, and the larger the sample size the better (Gorsuch, 1983). Eventually, 654 college students agreed to take part in the study and completed the online questionnaire. After deleting questionnaires with the same answer for all items or with overly short response times, 642 valid samples remained, an effective rate of 98.2%.
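The screening rule described above can be expressed compactly. The sketch below is illustrative only (the authors processed their data in SPSS and AMOS, not Python); the DataFrame layout, the `duration_seconds` column name, and the 60-second cutoff are all hypothetical.

```python
# Sketch of the questionnaire-screening rule: drop submissions where every
# item received the same answer (straight-lining) or the response time was
# overly short. `duration_seconds` is a hypothetical survey-export column.
import pandas as pd

def clean_responses(raw: pd.DataFrame, item_cols, min_seconds: float = 60.0) -> pd.DataFrame:
    same_answer = raw[item_cols].nunique(axis=1) == 1   # identical answer on all items
    too_fast = raw["duration_seconds"] < min_seconds    # overly short response time
    return raw.loc[~(same_answer | too_fast)].copy()

# cleaned = clean_responses(raw, item_cols)   # here: 654 collected -> 642 retained (98.2%)
```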

The recruited effective sample comprised 642 participants, of whom 67.4% were female ( n  = 433), and 32.6% were male ( n  = 209). Sophomores ( n  = 215, 33.5%) and juniors ( n  = 249, 38.8%) made up most of the total number of participants. Meanwhile, the current study aimed to construct a CT framework for college students in the humanities field; hence, all participants were students in humanities disciplines, such as history ( n  = 187, 29.1%), educational history ( n  = 78, 12.2%), philosophy ( n  = 97, 15.1%), Chinese language and literature ( n  = 221, 34.4%), and pedagogy ( n  = 59, 9.2%). The specific socio-demographic information is shown in Table 3 .

Table 3. Socio-demographic profile of respondents.

Data analysis

To construct an evaluation framework of college students' CT and to confirm its reliability and validity, exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis were carried out. First, the 642 samples were randomly assigned to two groups of 321 each (Yurdakul et al., 2012) to avoid inflation of the Cronbach's alpha value or other effects (Devellis, 2011). EFA was conducted on the first group of samples to determine the underlying factor structure of the CT evaluation framework and to make decisions about item retention (Kieffer, 1988); principal component analysis (PCA) was applied as the factor extraction technique (Vogel et al., 2009). CFA was then used to confirm the factor structure of the scale on the second group of 321 samples (Kline, 2005). Lastly, all samples were analyzed to test the discrimination and suitability of the items (Yurdakul et al., 2012). SPSS 18.0 and AMOS 24.0 were applied to analyze the collected data.
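As a concrete illustration of the sample-splitting step, the following minimal Python sketch randomly halves the sample so that the factor structure can be explored (EFA) and confirmed (CFA) on independent subsamples; it assumes the item responses sit in a pandas DataFrame and is not the authors' actual SPSS/AMOS workflow.

```python
# Sketch of the random split: one half for EFA, the other for CFA.
import pandas as pd

def split_sample(responses: pd.DataFrame, seed: int = 42):
    shuffled = responses.sample(frac=1.0, random_state=seed).reset_index(drop=True)
    half = len(shuffled) // 2
    return shuffled.iloc[:half], shuffled.iloc[half:]

# efa_half, cfa_half = split_sample(responses)   # here: 642 -> 321 + 321
```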

The EFA was conducted in SPSS, with the maximum variance (varimax) method adopted for factor rotation.

Reliability analysis of the scale

Prior to the EFA, sample variance and sample size evaluations were conducted. Bartlett's test of sphericity was significant (χ2 = 9162.198; p < 0.001), confirming that the correlation matrix was suitable for factor analysis. The Cronbach's alpha value (Pallant, 2007) was then used to evaluate the reliability of the scale; the whole scale showed good reliability (α = 0.909), and the Cronbach's alpha values of the seven factors were 0.724 (DC), 0.771 (MO), 0.878 (AT), 0.839 (OM), 0.819 (CS), 0.755 (OS), and 0.878 (RE), indicating their reliability. The Kaiser-Meyer-Olkin (KMO) value of the questionnaire was 0.907, showing the appropriateness of the EFA (Kaiser, 1974).
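For readers who want to reproduce these checks outside SPSS, the sketch below computes Cronbach's alpha from its definition and runs the KMO and Bartlett tests with the open-source factor_analyzer package; `efa_half` is the assumed name of the EFA subsample from the earlier split.

```python
# Reliability and sampling-adequacy checks:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# chi2, p = calculate_bartlett_sphericity(efa_half)  # significant p -> factorable data
# kmo_per_item, kmo_total = calculate_kmo(efa_half)  # reported value here: 0.907
# alpha = cronbach_alpha(efa_half)                   # reported value here: 0.909
```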

Validity analysis of the scale

To confirm the validity of the evaluation dimensions, PCA was applied to extract factors. Kieffer (1988) suggested applying two rotation strategies in EFA, so both oblique and orthogonal (maximum variance) rotations were used; if the two methods give similar results, the orthogonal solution can be adopted. In the current study, the results of the two methods showed no significant difference, so the maximum variance orthogonal rotation solution was used. MO5, OM4, and OM5 were removed because their maximum factor loadings were not in line with their intended evaluation dimensions (Conway and Huffcutt, 2016). Factors with eigenvalues higher than 1 were retained, and items with factor loadings of less than 0.4 or with inconsistent content were removed through multiple orthogonal rotations (Zhao et al., 2021); 25 items with independent factor loadings greater than 0.5 were retained (Fabrigar et al., 1999). Table 4 presents the results of the component transformation matrix. Finally, seven factors were selected, with a cumulative variance contribution of 71.413% (Conway and Huffcutt, 2016). The eigenvalues and cumulative variance contributions of the seven factors are shown in Table 5.
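The rotation comparison described above can be sketched as follows, again using factor_analyzer as an open-source stand-in for SPSS; promax is assumed as the oblique rotation, which the text does not specify.

```python
# EFA sketch: principal-component extraction with an orthogonal (varimax)
# and an oblique (promax, assumed) rotation; keep factors with eigenvalues > 1
# and flag items whose maximum loading falls below 0.5.
from factor_analyzer import FactorAnalyzer

def run_efa(data, n_factors=7, rotation="varimax"):
    fa = FactorAnalyzer(n_factors=n_factors, rotation=rotation, method="principal")
    fa.fit(data)
    return fa

# fa_varimax = run_efa(efa_half, rotation="varimax")
# fa_promax  = run_efa(efa_half, rotation="promax")
# eigvals, _ = fa_varimax.get_eigenvalues()
# n_keep = int((eigvals > 1).sum())                    # number of factors to retain
# weak = abs(fa_varimax.loadings_).max(axis=1) < 0.5   # candidate items for removal
```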

Table 4. The factor analysis of the college students' CT framework (N = 321).

Table 5. The eigenvalues and contribution rates of the seven factors in the model.

First-order CFA was adopted to determine the validity, convergence, and identifiability of the framework (Kline, 2005). CFA was used to explore the relationships between the factors and then to construct the evaluation framework of humanities college students' CT.

Fitting validity analysis for the framework

As shown in Figure 2, first-order CFA was conducted. According to Hair et al. (2014), items with standardized loadings below 0.5 must be eliminated. Absolute and relative fit indices were applied to verify the framework's fit. The chi-square/df in this analysis was 3.651, and the RMSEA was 0.044 (<0.08; Liu et al., 2021). In addition, the goodness-of-fit index (GFI) and adjusted goodness-of-fit index (AGFI) were 0.923 and 0.906 respectively, both meeting the reference standard proposed by Foster et al. (1993). Moreover, consistent with Hair et al.'s (2014) recommendations, the normed fit index (NFI), comparative fit index (CFI), and incremental fit index (IFI) were 0.975, 0.982, and 0.972 (>0.9). In addition, the parsimonious normed fit index (PNFI) and parsimonious goodness-of-fit index (PGFI) were both above 0.5. These results indicated the good fit of the framework (Table 6).
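The paper fitted this model in AMOS; as a rough open-source equivalent, the sketch below specifies a seven-factor first-order CFA in semopy's lavaan-like syntax. The item names and the item-to-factor assignment are hypothetical (they merely sum to the final 22 items), and `cfa_half` is the assumed CFA subsample.

```python
# First-order CFA sketch with semopy; calc_stats reports chi2, RMSEA,
# GFI, AGFI, NFI, CFI, and related fit indices.
import semopy

MODEL_DESC = """
DC =~ DC1 + DC2
MO =~ MO1 + MO2 + MO3
AT =~ AT1 + AT2 + AT3 + AT4
OM =~ OM1 + OM2 + OM3
CS =~ CS1 + CS2 + CS3
OS =~ OS1 + OS2 + OS3
RE =~ RE1 + RE2 + RE3 + RE4
"""

# model = semopy.Model(MODEL_DESC)
# model.fit(cfa_half)
# print(semopy.calc_stats(model).T)   # compare against the cutoffs cited above
```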

Figure 2. The first-order CFA model.

Table 6. Fit indices of the evaluation framework.

Convergence validity analysis for the framework

The CFA results are shown in Table 7. Composite reliability (CR) and average variance extracted (AVE) were used to test the construct validity of the framework. According to Hair et al. (2014), CR values should exceed 0.7; the CR values for the 22 remaining items were therefore good. Furthermore, Fornell and Larcker (1981) pointed out that an AVE higher than 0.5 indicates good convergent validity. The results in Table 7 therefore show that this evaluation framework has high validity and is reasonable.
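The CR and AVE computations follow directly from the standardized loadings; a minimal sketch, with illustrative loading values, is given below.

```python
# Composite reliability and average variance extracted (Fornell and Larcker, 1981):
#   CR  = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))
#   AVE = mean(lambda^2)
import numpy as np

def cr_and_ave(lambdas):
    lam = np.asarray(lambdas, dtype=float)
    error_var = 1.0 - lam ** 2                 # indicator error variances
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + error_var.sum())
    ave = (lam ** 2).mean()
    return cr, ave

loadings = {"DC": [0.78, 0.74], "MO": [0.71, 0.69, 0.73]}   # hypothetical values
for factor, lam in loadings.items():
    cr, ave = cr_and_ave(lam)
    print(f"{factor}: CR = {cr:.3f} (want > 0.7), AVE = {ave:.3f} (want > 0.5)")
```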

Table 7. Results of the confirmatory factor analysis.

Discriminant validity analysis of the framework

The discriminant validity of the framework was tested via the correlation matrix among dimensions. Schumacker and Lomax (2016) proposed that, in the structural discriminant validity analysis of instruments, the square root of each factor's AVE must exceed the absolute value of its Pearson correlation coefficient with every other factor for discriminant validity to be recognized. As shown in Table 8, the results of the structural discriminant validity analysis indicated that this framework had good discriminant validity.
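A sketch of this Fornell-Larcker comparison is shown below; `factor_scores` (composite scores per factor) and `ave` (each factor's AVE) are assumed inputs, not quantities taken from the paper.

```python
# Discriminant validity sketch: place sqrt(AVE) on the diagonal of the
# inter-factor correlation matrix; each diagonal entry should exceed the
# absolute correlations in its row and column.
import numpy as np
import pandas as pd

def fornell_larcker(factor_scores: pd.DataFrame, ave: dict) -> pd.DataFrame:
    cols = list(factor_scores.columns)
    mat = factor_scores.corr().abs().to_numpy()
    np.fill_diagonal(mat, [np.sqrt(ave[f]) for f in cols])
    return pd.DataFrame(mat, index=cols, columns=cols)

# table = fornell_larcker(factor_scores, ave)   # compare diagonal vs. off-diagonal
```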

Table 8. The inter-factor correlation matrix and the square roots of the AVE.

***Significant at the 0.001 level; **Significant at the 0.01 level; *Significant at the 0.05 level.

Item analysis

Item analysis was conducted to determine how well the items discriminate between college students with high and low CT abilities, within the scope of the item validity of the CT evaluation scale. To this end, item discrimination statistics were calculated from the differences between the means of the lowest-scoring 27% and the highest-scoring 27% of participants, determined according to the scores on each item and the total scale scores (Aridag and Yüksel, 2010). First, each participant's total scale score was calculated, and the totals were ranked from highest to lowest. Of all the participants in the study group (N = 642), the 27% (174) with the highest scores formed the high group, and the 27% with the lowest scores formed the low group. An independent-samples t-test was applied to statistically test the difference between the mean scores of the two groups; the results are presented in Table 9. Further, items whose dimensional Pearson correlation coefficients or standardized factor loadings did not reach the standard values (0.4 and 0.45 respectively) were eliminated. Finally, for the remaining 22 items, the discrimination values were higher than 0.3, and the item-total correlations were higher than 0.4. Overall, the item analysis results showed that the remaining 22 items met the standard.
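The high-low 27% procedure lends itself to a short sketch, assuming the full sample's item responses are in a DataFrame; scipy provides the independent-samples t-test.

```python
# Item discrimination sketch: compare each item's mean in the top 27% of
# total scorers against the bottom 27% with an independent-samples t-test.
import pandas as pd
from scipy import stats

def high_low_item_ttests(responses: pd.DataFrame, tail: float = 0.27) -> pd.DataFrame:
    total = responses.sum(axis=1)
    n = int(round(len(responses) * tail))           # here: 27% of 642, i.e., ~174
    high = responses.loc[total.nlargest(n).index]   # high-scoring group
    low = responses.loc[total.nsmallest(n).index]   # low-scoring group
    rows = []
    for col in responses.columns:
        res = stats.ttest_ind(high[col], low[col])
        rows.append({"item": col, "t": res.statistic, "p": res.pvalue})
    return pd.DataFrame(rows)

# results = high_low_item_ttests(responses)   # significant t -> good discrimination
```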

Table 9. t-test results for the item means of the high-low 27% groups.

Discussion

CT is one of the key competencies that college students need to acquire (Bandyopadhyay and Szostek, 2019). This study aimed to construct a self-evaluation CT framework for college students in the humanities. In the initial framework, three dimensions and 27 items were conceived. EFA was then conducted, and items with independent factor loadings below 0.5 were excluded (Fabrigar et al., 1999), leaving 25 items for CFA. The CFA showed that three further items should be eliminated because of standardized loadings below 0.5. The resulting 22-item evaluation model had an acceptable fit, and good convergent and discriminant validity were shown by calculating CR, AVE, and the square roots of the AVE. Finally, item analysis was conducted to verify the suitability and distinctiveness of the items: for the remaining 22 items, the discrimination values were higher than 0.3 and the item-total correlations were higher than 0.4, so all 22 items met the standard. The final self-evaluation CT framework is therefore a 22-item instrument measuring three dimensions and six sub-dimensions: discipline cognition; CT disposition (open-mindedness, motivation, and attentiveness); and CT skills (reflection, organization skills, and clarification skills).

Compared to previous studies on the construction of CT assessment frameworks, this study focused on three important points: the CT abilities of college students majoring in the humanities were its focus; both CT skills and CT dispositions were included; and more specific dimensions of CT were the core measurement factors. In previous CT assessment frameworks, students in science disciplines (mathematics, business, nursing, engineering, etc.) were often the main subjects of study (Kim et al., 2014; Michaluk et al., 2016; Siew and Mapeala, 2016; Basri and Rahman, 2019), while college students majoring in the humanities received less attention. However, CT, as a guide for belief and action (Gyenes, 2021), is an important ability for college students in all fields (Davies, 2013; Zhang et al., 2022). In the humanities, research has shown that independent thinking skills are valuable indicators of students' discipline-specific abilities (Bertram et al., 2021), and college students in the humanities need CT abilities to identify problems and find critical solutions (Baş et al., 2022). Meanwhile, the assessment instrument developed in this study adds the dimension of discipline cognition, considered a prerequisite for helping college students gain a clear idea of their subject background. The CT assessment framework therefore provides a practical method for teachers and learners in the humanities to investigate their level of CT abilities. For example, in the discipline of history, thematic history projects can be applied to foster students' CT abilities in authentic history teaching contexts (Yang, 2007); to verify whether such projects help improve learners' CT abilities, this CT evaluation framework can be applied before and after the project to determine whether learners' levels of CT abilities differ. Likewise, in the philosophy classroom, philosophical whole-class dialog can be a useful teaching strategy for activating learners to think critically about moral values (Rombout et al., 2021); learners in dialogs must take others' perspectives into account (Kim and Wilkinson, 2019), which is in line with the sub-dimension of open-mindedness in the current CT evaluation framework. Hence, the CT evaluation framework can also be applied in specific disciplines.

In addition, the current CT evaluation framework includes both CT skills and CT dispositions, with more specific dimensions of CT as the core measurement factors. CT disposition reflects the strength of students' belief in thinking and acting critically; in the current instrument, the three sub-dimensions of motivation, open-mindedness, and attentiveness are its evaluation factors. The cultivation of college students' CT abilities is usually based on specific educational activities, and when college students get involved in learning activities, they have opportunities to foster their CT abilities (Liu, 2014; Huang et al., 2022). An important factor influencing student engagement is motivation (Singh et al., 2022), which has an important effect on college students' behavior, emotion, and cognitive processes (Gao et al., 2022). Hence, it makes sense to regard motivation as a measurement factor of CT disposition, and it is crucial for college students to self-assess their motivation level first in order to gain a clear insight into their overall level of CT. As noted above, attentiveness investigates the persistence of attention and positively influences a variety of student behaviors (Reynolds, 2008), while open-mindedness assesses college students' flexibility of thinking (Southworth, 2020): a good critical thinker should be receptive, with an open-minded attitude, to views that challenge their own prior beliefs (Southworth, 2022). CT skills were assessed in the three sub-dimensions of clarification skills, organization skills, and reflection, with the aim of understanding how well students use CT skills in the problem-solving process (Tumkaya et al., 2009). These sub-dimensions follow the specific learning process of problem solving: a clear description and understanding of the problem (clarification skills); extracting the key information and organizing and processing it with the help of organizational tools such as diagrams and mind maps (organization skills); and finally reflecting on and evaluating the whole problem-solving process (reflection), which research has shown can significantly improve learners' CT abilities (Chen et al., 2019).

In other words, the self-evaluation framework of college students' CT constructed in this study focuses on college students in the humanities, and the descriptions of specific items reflect the characteristics of the humanities. Moreover, because students differ in the extent to which they apply specific CT skills and are aware of how to use CT to solve problems depending on their disciplinary backgrounds (Belluigi and Cundill, 2017), the constructed CT assessment framework provides a practical pathway and a more comprehensive instrument for assessing the CT abilities of college students majoring in the humanities, as well as a research entry point for researchers to better study the CT of these students.

Conclusion

Based on the previous literature review of CT, this study further investigated the necessity of college students' CT, constructed a framework for evaluating the CT of college students in the humanities, and tested its effectiveness. EFA, CFA, and item analysis were conducted to construct a three-dimensional college students' CT self-evaluation framework, and the results indicate that the framework has good reliability and validity. Finally, a framework with three dimensions (discipline cognition, CT disposition, and CT skills) comprising seven factors (discipline cognition, motivation, attentiveness, open-mindedness, reflection, organization skills, and clarification skills) and totaling 22 items was developed.

Implications

The main significance of this study is reflected in three aspects. Firstly, the current study constructed a CT evaluation framework for college students majoring in the humanities. The results of the EFA, CFA, and item analysis supported the reliability and validity of the three-dimensional framework, which consists of discipline cognition, CT disposition, and CT skills. The specific assessment factors not only integrate the two dimensions of CT (skills and disposition), making the assessment framework more comprehensive, but also integrate the dimension of discipline cognition, enabling specific measures to be developed for specific disciplinary contexts and ensuring that CT is assessed more accurately and relevantly. Secondly, the CT evaluation framework can be applied in specific instruction and learning contexts. CT is widely regarded as one of the essential abilities of the 21st century, and specific instructional strategies and learning activities should be applied purposefully according to specific humanistic backgrounds. Before undertaking specific teaching activities, it is worth gaining a baseline understanding of college students' level of CT abilities by inviting them to complete the self-evaluation CT instrument; likewise, after the learning activities, the instrument can be used to evaluate how effectively the activities cultivated college students' CT abilities. Finally, the construction of the CT assessment framework provides a practical pathway for assessing the CT abilities of college students majoring in the humanities and a research entry point for researchers to better study the CT of these students in the future.

Limitations and future work

There are two main limitations to this study. First, the sample was drawn from one area by random sampling and cannot cover all college students in the humanities; larger and more representative samples will be needed in the future to assess the extent to which the findings apply to other population groups and to confirm the conclusions of the study. In addition, this evaluation framework of college students' CT is still at the theoretical research stage and has not yet been put into practice; the framework should therefore be applied in further research so that its applicability and usability can be improved according to practical feedback.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author/s.

Ethics statement

Ethical review and approval were not required for the study on human participants, in accordance with the local legislation and institutional requirements. Written informed consent for participation was not required for this study, in accordance with the national legislation and the institutional requirements.

Author contributions

QL: conceptualization. SL: methodology. SL and ST: writing—original draft preparation. SL, XG, and QL: writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the School Curriculum Ideological and Political Construction Project (no. 1812200046KCSZ2211).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.


  • Abrami P., Bernard R. M., Borokhovski E., Wade A., Surkes M. A., Tamim R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev. Educ. Res. 78, 1102–1134. doi: 10.3102/0034654308326084
  • Ahmady S., Shahbazi S. (2020). Impact of social problem-solving training on critical thinking and decision making of nursing students. BMC Nurs. 19:94. doi: 10.1186/s12912-020-00487-x
  • Al-Khatib O. (2019). A framework for implementing higher-order thinking skills (problem-solving, critical thinking, creative thinking, and decision-making) in engineering and humanities. Advances in Science and Engineering Technology International Conferences (ASET).
  • Álvarez-Huerta A., Muela A., Larrea I. (2022). Disposition toward critical thinking and creative confidence beliefs in higher education students: the mediating role of openness to diversity and challenge. Think. Skills Creat. 43:101003. doi: 10.1016/j.tsc.2022.101003
  • Araya A. E. M. (2020). Critical thinking for civic life in elementary education: combining storytelling and thinking tools / Pensamiento crítico para la vida ciudadana en educación primaria: combinando narrativa y herramientas de pensamiento. Educación 44, 23–43.
  • Aridag N. C., Yüksel A. (2010). Analysis of the relationship between moral judgment competences and empathic skills of university students. Kuram ve Uygulamada Egitim Bilimleri 10, 707–724.
  • Arisoy B., Aybek B. (2021). The effects of subject-based critical thinking education in mathematics on students' critical thinking skills and virtues. Eur. J. Educ. Res. 21, 99–120. doi: 10.14689/ejer.2021.92.6
  • Bandyopadhyay S., Szostek J. (2019). Thinking critically about critical thinking: assessing critical thinking of business students using multiple measures. J. Educ. Bus. 94, 259–270. doi: 10.1080/08832323.2018.1524355
  • Barrett A. (2005). The information-seeking habits of graduate student researchers in the humanities. J. Acad. Libr. 31, 324–331. doi: 10.1016/j.acalib.2005.04.005
  • Barta A., Fodor L. A., Tamas B., Szamoskozi I. (2022). The development of students' critical thinking abilities and dispositions through the concept mapping learning method – a meta-analysis. Educ. Res. Rev. 37:100481. doi: 10.1016/j.edurev.2022.100481
  • Baş M. T., Özpulat F., Molu B., Dönmez H. (2022). The effect of decorative arts course on nursing students' creativity and critical thinking dispositions. Nurse Educ. Today: 105584. doi: 10.1016/j.nedt.2022.105584
  • Basri H., Rahman A. A. (2019). Investigating critical thinking skill of junior high school in solving mathematical problem. Int. J. Instr. 12, 745–758. doi: 10.29333/iji.2019.12345a
  • Bellaera L., Weinstein-Jones Y., Ilie S., Baker S. T. (2021). Critical thinking in practice: the priorities and practices of instructors teaching in higher education. Think. Skills Creat. 41:100856. doi: 10.1016/j.tsc.2021.100856
  • Belluigi D. Z., Cundill G. (2017). Establishing enabling conditions to develop critical thinking skills: a case of innovative curriculum design in environmental science. Environ. Educ. Res. 23, 950–971. doi: 10.1080/13504622.2015.1072802
  • Bensley D. A., Crowe D. S., Bernhardt P., Buckner C., Allman A. L. (2010). Teaching and assessing critical thinking skills for argument analysis in psychology. Teach. Psychol. 37, 91–96. doi: 10.1080/00986281003626656
  • Bensley D. A., Rainey C., Murtagh M. P., Flinn J. A., Maschiocchi C., Bernhardt P. C., et al. (2016). Closing the assessment loop on critical thinking: the challenges of multidimensional testing and low test-taking motivation. Think. Skills Creat. 21, 158–168. doi: 10.1016/j.tsc.2016.06.006
  • Bentler P. M. (1990). Comparative fit indexes in structural models. Psychol. Bull. 107, 238–246. doi: 10.1037/0033-2909.107.2.238
  • Berdahl L., Hoessler C., Mulhall S., Matheson K. (2021). Teaching critical thinking in political science: a case study. J. Political Sci. Educ. 17, 910–925. doi: 10.1080/15512169.2020.1744158
  • Berestova A., Kolosov S., Tsvetkova M., Grib E. (2021). Academic motivation as a predictor of the development of critical thinking in students. J. Appl. Res. High. Educ. 14, 1041–1054. doi: 10.1108/JARHE-02-2021-0081
  • Bertram C., Weiss Z., Zachrich L., Ziai R. (2021). Artificial intelligence in history education. Linguistic content and complexity analyses of student writings in the CAHisT project (computational assessment of historical thinking). Comput. Educ. Artif. Intell.:100038. doi: 10.1016/j.caeai.2021.100038
  • Bhatt I., Samanhudi U. (2022). From academic writing to academics writing: transitioning towards literacies for research productivity. Int. J. Educ. Res. 111:101917. doi: 10.1016/j.ijer.2021.101917
  • Boso C. M., Van Der Merwe A. S., Gross J. (2021). Students' and educators' experiences with instructional activities towards critical thinking skills acquisition in a nursing school. Int. J. Afr. Nurs. Sci. 14:100293. doi: 10.1016/j.ijans.2021.100293
  • Braeuning D., Hornung C., Hoffmann D., Lambert K., Ugen S., Fischbach A., et al. (2021). Cogn. Dev. 58:101008. doi: 10.1016/j.cogdev.2021.101008
  • Bravo M. J., Galiana L., Rodrigo M. F., Navarro-Pérez J. J., Oliver A. (2020). An adaptation of the critical thinking disposition scale in Spanish youth. Think. Skills Creat. 38:100748. doi: 10.1016/j.tsc.2020.100748
  • Butler H. A., Pentoney C., Bong M. P. (2017). Predicting real-world outcomes: critical thinking ability is a better predictor of life decisions than intelligence. Think. Skills Creat. 25, 38–46. doi: 10.1016/j.tsc.2017.06.005
  • Cáceres M., Nussbaum M., Ortiz J. (2020). Integrating critical thinking into the classroom: a teacher's perspective. Think. Skills Creat. 37:100674. doi: 10.1016/j.tsc.2020.100674
  • Chan C. (2019). Using digital storytelling to facilitate critical thinking disposition in youth civic engagement: a randomized control trial. Child Youth Serv. Rev. 107:104522. doi: 10.1016/j.childyouth.2019.104522
  • Chen F., Chen S., Pai H. (2019). Self-reflection and critical thinking: the influence of professional qualifications on registered nurses. Contemp. Nurse 55, 59–70. doi: 10.1080/10376178.2019.1590154
  • Chen Q., Liu D., Zhou C., Tang S. (2020). Relationship between critical thinking disposition and research competence among clinical nurses: a cross-sectional study. J. Clin. Nurs. 29, 1332–1340. doi: 10.1111/jocn.15201
  • Chen K. L., Wei X. (2021). Boya education in China: lessons from liberal arts education in the U.S. and Hong Kong. Int. J. Educ. Dev. 84:102419. doi: 10.1016/j.ijedudev.2021.102419
  • Chen X., Zhai X., Zhu Y., Li Y. (2022). Exploring debaters' and audiences' depth of critical thinking and its relationship with their participation in debate activities. Think. Skills Creat. 44:101035. doi: 10.1016/j.tsc.2022.101035
  • Colling J., Wollschlager R., Keller U., Preckel F., Fischbach A. (2022). Need for cognition and its relation to academic achievement in different learning environments. Learn. Individ. Differ. 93:102110. doi: 10.1016/j.lindif.2021.102110
  • Conway J. M., Huffcutt A. I. (2016). A review and evaluation of exploratory factor analysis practices in organizational research. Organ. Res. Methods 6, 147–168. doi: 10.1177/1094428103251541
  • Davies M. (2013). Critical thinking and the disciplines reconsidered. High. Educ. Res. Dev. 32, 529–544. doi: 10.1080/07294360.2012.697878
  • Devellis R. F. (2011). Scale Development. New York: SAGE Publications, Inc.
  • Dewey J. (1933). How We Think. Boston: D. C. Heath.
  • Din M. (2020). Evaluating university students' critical thinking ability as reflected in their critical reading skill: a study at bachelor level in Pakistan. Think. Skills Creat. 35:100627. doi: 10.1016/j.tsc.2020.100627
  • Dumitru D. (2019). Creating meaning. The importance of arts, humanities and culture for critical thinking development. Stud. High. Educ. 44, 870–879. doi: 10.1080/03075079.2019.1586345
  • Dunne G. (2015). Beyond critical thinking to critical being: criticality in higher education and life. Int. J. Educ. Res. 71, 86–99. doi: 10.1016/j.ijer.2015.03.003
  • Duro E., Elander J., Maratos F. A., Stupple E. J. N., Aubeeluck A. (2013). In search of critical thinking in psychology: an exploration of student and lecturer understandings in higher education. Psychol. Learn. Teach. 12, 275–281. doi: 10.2304/plat.2013.12.3.275
  • Dwyer C. P., Hogan M. J., Stewart I. (2014). An integrated critical thinking framework for the 21st century. Think. Skills Creat. 12, 43–52. doi: 10.1016/j.tsc.2013.12.004
  • Ennis R. H. (1962). A concept of critical thinking. Harvard Educ. Rev. 32, 81–111.
  • Ennis R. H. (1987). Critical thinking and the curriculum, in Thinking Skills Instruction: Concepts and Techniques, 40–48.
  • Ennis R. H. (1989). Critical thinking and subject specificity: clarification and needed research. Educ. Res. 18, 4–10. doi: 10.3102/0013189X018003004
  • Ennis R. H. (2018). Critical thinking across the curriculum: a vision. Topoi 37, 165–184. doi: 10.1007/s11245-016-9401-4
  • Fabrigar L. R., Wegener D. T., MacCallum R. C., Strahan E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychol. Methods 4, 272–299. doi: 10.1037/1082-989X.4.3.272
  • Facione P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. Available at: http://files.eric.ed.gov/fulltext/ED315423.pdf (Accessed November 3, 2022).
  • Facione N. C., Facione P. A., Sanchez C. A. (1994). Critical thinking disposition as a measure of competent clinical judgment: the development of the California Critical Thinking Disposition Inventory. J. Nurs. Educ. 33, 345–350. doi: 10.3928/0148-4834-19941001-05
  • Feng R. C., Chen M. J., Chen M. C., Pai Y. C. (2010). Critical thinking competence and disposition of clinical nurses in a medical center. J. Nurs. Res. 18, 77–87. doi: 10.1097/JNR.0b013e3181dda6f6
  • Fernández-Santín M., Feliu-Torruella M. (2020). Developing critical thinking in early childhood through the philosophy of Reggio Emilia. Think. Skills Creat. 37:100686. doi: 10.1016/j.tsc.2020.100686
  • Flores K. L., Matkin G. S., Burbach M. E., Quinn C. E., Harding H. (2012). Deficient critical thinking skills among college graduates: implications for leadership. Educ. Philos. Theory 44, 212–230. doi: 10.1111/j.1469-5812.2010.00672.x
  • Flynn R. M., Kleinknecht E., Ricker A. A., Blumberg F. C. (2021). A narrative review of methods used to examine digital gaming impacts on learning and cognition during middle childhood. Int. J. Child Comput. Interact. 30:100325. doi: 10.1016/j.ijcci.2021.100325
  • Fornell C., Larcker D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 18, 39–50. doi: 10.1177/002224378101800312
  • Foster J., Barkus E., Yavorsky C. (1993). Understanding and Using Advanced Statistics. New York: SAGE Publications.
  • Gao Q. Q., Cao B. W., Guan X., Gu T. Y., Bao X., Wu J. Y., et al. (2022). Emotion recognition in conversations with emotion shift detection based on multi-task learning. Knowl.-Based Syst. 248:108861. doi: 10.1016/j.knosys.2022.108861
  • Ghanizadeh A. (2016). The interplay between reflective thinking, critical thinking, self-monitoring, and academic achievement in higher education. High. Educ. 74, 101–114. doi: 10.1007/s10734-016-0031-y
  • Giannouli V., Giannoulis K. (2021). Critical thinking and leadership: can we escape modern Circe's spells in nursing? Nurs. Leadersh. (Toronto, Ont.) 34, 38–44. doi: 10.12927/cjnl.2021.26456
  • Gorsuch R. (1983). Factor Analysis, 2nd Edn. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Gyenes A. (2021). Student perceptions of critical thinking in EMI programs at Japanese universities: a Q-methodology study. J. Engl. Acad. Purp. 54:101053. doi: 10.1016/j.jeap.2021.101053
  • Hair J. F., Black W. C., Babin B. J., Anderson R. E. (2014). Multivariate Data Analysis, 7th Edn. Upper Saddle River, NJ: Pearson Prentice Hall.
  • Halpern D. F. (1998). Teaching critical thinking for transfer across domains: disposition, skills, structure training, and metacognitive monitoring. Am. Psychol. 53, 449–455. doi: 10.1037/0003-066X.53.4.449
  • Han J., Usher E. L., Brown C. S. (2021). Trajectories in quantitative and humanities self-efficacy during the first year of college. Learn. Individ. Differ. 91:102054. doi: 10.1016/j.lindif.2021.102054
  • Hashemi M. R., Ghanizadeh A. (2012). Critical discourse analysis and critical thinking: an experimental study in an EFL context. System 40, 37–47. doi: 10.1016/j.system.2012.01.009
  • Hsu F. H., Lin I. H., Yeh H. C., Chen N. S. (2022). Effect of Socratic reflection prompts via video-based learning system on elementary school students' critical thinking skills. Comput. Educ. 183:104497. doi: 10.1016/j.compedu.2022.104497
  • Huang Y. M., Silitonga L. M., Wu T. T. (2022). Applying a business simulation game in a flipped classroom to enhance engagement, learning achievement, and higher-order thinking skills. Comput. Educ. 183:104494. doi: 10.1016/j.compedu.2022.104494
  • Jiang J., Gao A., Yang B. Y. (2018). Employees' critical thinking, leaders' inspirational motivation, and voice behavior: the mediating role of voice efficacy. J. Pers. Psychol. 17, 33–41. doi: 10.1027/1866-5888/a000193
  • Jomli R., Ouertani J., Jemli H., Ouali U., Zgueb Y., Nacef F. (2021). Comparative study of affective temperaments between medical students and humanities students (evaluation by validated temps-a) . Eur. Psychiatry 64 :S199. doi: 10.1192/j.eurpsy.2021.529 [ CrossRef ] [ Google Scholar ]
  • Kaiser H. F. (1974). An index of factorial simplicity . Psychometrika 39 , 31–36. doi: 10.1007/BF02291575 [ CrossRef ] [ Google Scholar ]
  • Kanbay Y., Okanlı A. (2017). The effect of critical thinking education on nursing students’ problem-solving skills . Contemp. Nurse 53 , 313–321. doi: 10.1080/10376178.2017.1339567, PMID: [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kember D., Leung D. Y. P., Jones A., Loke A. Y., Mckay J., Sinclair K., et al.. (2010). Development of a questionnaire to measure the level of reflective thinking . Asses. Eval. High. Edu. 25 , 381–395. doi: 10.1080/713611442 [ CrossRef ] [ Google Scholar ]
  • Khandaghi M. A., Pakmehr H., Amiri E. (2011). The status of college students’ critical thinking disposition in humanities . Proc. Soc. Behav. Sci 15 , 1866–1869. doi: 10.1016/j.sbspro.2011.04.017 [ CrossRef ] [ Google Scholar ]
  • Kieffer K. M. (1988). Orthogonal Versus Oblique Factor Rotation: A Review of the Literature Regarding the Pros and Cons. In Proceedings 554 of the Annual Meeting of the 27th Mid-South Educational Research Association, New Orleans, LA;4 November 1998, 4-6, 555. Available at: https://files.eric.ed.gov/fulltext/ED427031.pdf (Accessed November 3, 2022).
  • Kilic S., Gokoglu S., Ozturk M. A. (2020). Valid and reliable scale for developing programming-oriented computational thinking . J. Educ. Comput. Res. 59 , 257–286. doi: 10.1177/0735633120964402 [ CrossRef ] [ Google Scholar ]
  • Kim D. H., Moon S., Kim E. J., Kim Y. J., Lee S. (2014). Nursing students' critical thinking disposition according to academic level and satisfaction with nursing . Nurs. Educ. Today 34 , 78–82. doi: 10.1016/j.nedt.2013.03.012, PMID: [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kim M.-Y., Wilkinson I. A. G. (2019). What is dialogic teaching? Constructing, deconstructing, and reconstructing a pedagogy of classroom talk . Learn. Cult Soc. Inter 21 , 70–86. doi: 10.1016/j.lcsi.2019.02.003 [ CrossRef ] [ Google Scholar ]
  • Kline T. J. B. (2005). Psychological Testing: A Practical Approach to Design and Evaluation . Thousand Oaks, London, New Delhi: Sage Publications. [ Google Scholar ]
  • Klugman C. M. (2018). How health humanities will save the life of the humanities . J. Med. Humanit. 38 , 419–430. doi: 10.1007/s10912-017-9453-5 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lederer J. M. (2007). Disposition toward critical thinking among occupational therapy students . Am. J. Occup. Ther. 61 , 519–526. doi: 10.5014/ajot.61.5.519, PMID: [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Les T., Moroz J. (2021). More critical thinking in critical thinking concepts (?) A constructivist point of view . J Crit Educ Policy Sci 19 , 98–124. [ Google Scholar ]
  • Li N. (2021). Reasonable or unwarranted? Benevolent gender prejudice in education in China . Asia Pac. Educ. Res 31 , 155–163. doi: 10.1007/s40299-020-00546-6 [ CrossRef ] [ Google Scholar ]
  • Li X. Y., Liu J. D. (2021). Mapping the taxonomy of critical thinking ability in EFL . Think. Skills Creat. 41 :100880. doi: 10.1016/j.tsc.2021.100880 [ CrossRef ] [ Google Scholar ]
  • Liang W., Fung D. (2020). Development and evaluation of a WebQuest-based teaching programme: students’ use of exploratory talk to exercise critical thinking . Int. J. Educ. Res. 104 :101652. doi: 10.1016/j.ijer.2020.101652, PMID: [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lin S. S. (2016). Science and non-science undergraduate students’ critical thinking and argumentation performance in reading a science news report . Int. J. Sci. Math. Educ. 12 , 1023–1046. doi: 10.1007/s10763-013-9451-7 [ CrossRef ] [ Google Scholar ]
  • Lin L. (2020). The future of "useless" Liberal arts . Univ. Mon. Rev. Philos. Cult. 47 , 93–110. [ Google Scholar ]
  • Liu O. L., Frankel L., Roohr K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessment . ETS Res. Rep. Series 2014 , 1–23. doi: 10.1002/ets2.12009 [ CrossRef ] [ Google Scholar ]
  • Liu H., Shao M., Liu X., Zhao L. (2021). Exploring the influential factors on readers' continuance intentions of e-book APPs: personalization, usefulness, playfulness, and satisfaction . Front. Psychol. 12 :640110. doi: 10.3389/fpsyg.2021.640110, PMID: [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Liu T., Zhao R., Lam K.-M., Kong J. (2022). Visual-semantic graph neural network with pose-position attentive learning for group activity recognition . Neurocomputing 491 , 217–231. doi: 10.1016/j.neucom.2022.03.066 [ CrossRef ] [ Google Scholar ]
  • Ma L., Luo H. (2020). Chinese pre-service teachers’ cognitions about cultivating critical thinking in teaching English as a foreign language . Asia Pac. J. Educ 41 , 543–557. doi: 10.1080/02188791.2020.1793733 [ CrossRef ] [ Google Scholar ]
  • Michaluk L. M., Martens J., Damron R. L., High K. A. (2016). Developing a methodology for teaching and evaluating critical thinking skills in first-year engineering students . Int. J. Eng. Educ. 32 , 84–99. [ Google Scholar ]
  • Mulnix J. W., Mulnix M. J. (2010). Using a writing portfolio project to teach critical thinking skills . Teac. Phi 33 , 27–54. doi: 10.5840/teachphil20103313 [ CrossRef ] [ Google Scholar ]
  • Murphy E. (2004). An instrument to support thinking critically about critical in thinking online asynchronous discussions . Aust. J. Educ. Technol. 20 , 295–315. doi: 10.14742/ajet.1349 [ CrossRef ] [ Google Scholar ]
  • Nair G. G., Stamler L. L. (2013). A conceptual framework for developing a critical thinking self-assessment scale . J. Nurs. Educ. 52 , 131–138. doi: 10.3928/01484834-20120215-01, PMID: [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • O’Reilly C., Devitt A., Hayes N. (2022). Critical thinking in the preschool classroom - a systematic literature review . Think. Skills Creat. 46 :101110. doi: 10.1016/j.tsc.2022.101110 [ CrossRef ] [ Google Scholar ]
  • Odebiyi O. M., Odebiyi A. T. (2021). Critical thinking in social contexts: A trajectory analysis of states’ K-5 social studies content standards . J. Soc. Stud. Res. 45 , 277–288. doi: 10.1016/j.jssr.2021.05.002 [ CrossRef ] [ Google Scholar ]
  • Pallant J. F. (2007). SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS . 3rd Edn.. Routledge. Region 6th Series, Bangi, 551 . [ Google Scholar ]
  • Perkins D. N., Jay E., Tishman S. (1993). Beyond Abilities: A Dispositional Theory of Thinking. Merrill-Palmer Quarterly (1982), 1-21. Available at: http://www.jstor.org/stable/23087298 (Accessed November 3, 2022).
  • Pnevmatikos D., Christodoulou P., Georgiadou T. (2019). Promoting critical thinking in higher education through the values and knowledge education (VaKE) method . Stud. High. Educ. 44 , 892–901. doi: 10.1080/03075079.2019.1586340 [ CrossRef ] [ Google Scholar ]
  • Quinn S., Hogan M. J., Dwyer C., Finn P. (2020). Development and validation of the student-educator negotiated critical thinking dispositions scale (SENCTDS) . Think. Skills Creat. 38 :100710. doi: 10.1016/j.tsc.2020.100710 [ CrossRef ] [ Google Scholar ]
  • Reynolds S. J. (2008). Moral attentiveness: who pays attention to the moral aspects of life? J. Appl. Psycho. 93 , 1027–1041. doi: 10.1037/0021-9010.93.5.1027, PMID: [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Rodríguez-Sabiote C., Olmedo-Moreno E. M., Expósito-López J. (2022). The effects of teamwork on critical thinking: A serial mediation analysis of the influence of work skills and educational motivation in secondary school students . Think. Skills Creat. 45 :101063. doi: 10.1016/j.tsc.2022.101063 [ CrossRef ] [ Google Scholar ]
  • Rombout F., Schuitema J. A., Volman M. L. L. (2021). Teaching strategies for value-loaded critical thinking in philosophy classroom dialogues . Think. Skills Creat. 43 :100991. doi: 10.1016/j.tsc.2021.100991 [ CrossRef ] [ Google Scholar ]
  • Saad A., Zainudin S. (2022). A review of project-based learning (PBL) and computational thinking (CT) in teaching and learning . Learn. Motiv. 78 :101802. doi: 10.1016/j.lmot.2022.101802 [ CrossRef ] [ Google Scholar ]
  • Sarjinder S. (2003). “ Simple random sampling ,” in Advanced Sampling Theory with Application (Dordrecht: Springer; ) [ Google Scholar ]
  • Schumacker R. E., Lomax R. G. (2016). A Beginner' s Guide to Structural Equation Modeling (4th Edn..) New York: Routledge. [ Google Scholar ]
  • Scriven M., Paul R. (2005). The Critical Thinking Community. Available at: http://www.criticalthinking.org (Accessed November 3, 2022).
  • Siew M., Mapeala R. (2016). The effects of problem-based learning with thinking maps on fifth graders’ science critical thinking . J. Balt. Sci. Educ. 15 , 602–616. doi: 10.33225/jbse/16.15.602 [ CrossRef ] [ Google Scholar ]
  • Simpson E., Courtney M. (2002). Critical thinking in nursing education: Literature review . Int. J. Nurs. Pract. 8 , 89–98. doi: 10.1046/j.1440-172x.2002.00340.x [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Singh M., James P. S., Paul H., Bolar K. (2022). Impact of cognitive-behavioral motivation on student engagement . Helyon 8 . doi: 10.1016/j.heliyon.2022.e09843 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Sosu E. M. (2013). The development and psychometric validation of a critical thinking disposition scale . Think. Skills Creat. 9 , 107–119. doi: 10.1016/j.tsc.2012.09.002 [ CrossRef ] [ Google Scholar ]
  • Southworth J. (2020). How argumentative writing stifles open-mindedness . Arts Hum. High. Educ. 20 , 207–227. doi: 10.1177/1474022220903426 [ CrossRef ] [ Google Scholar ]
  • Southworth J. (2022). A perspective-taking theory of open-mindedness: confronting the challenge of motivated reasoning . Educ. Theory 93 , 1027–1041. doi: 10.1037/0021-9010.93.5.1027 [ CrossRef ] [ Google Scholar ]
  • Stone G. A., Duffy L. N., Pinckney H. P., Templeton-Bradley R. (2017). Teaching for critical thinking: preparing hospitality and tourism students for careers in the twenty-first century . J. Teach. Travel Tour. 17 , 67–84. doi: 10.1080/15313220.2017.1279036 [ CrossRef ] [ Google Scholar ]
  • Suh J., Matson K., Seshaiyer P., Jamieson S., Tate H. (2021). Mathematical modeling as a catalyst for equitable mathematics instruction: preparing teachers and young learners with 21st century skills . Mathematics 9 :162. doi: 10.3390/math9020162 [ CrossRef ] [ Google Scholar ]
  • Sulaiman W. S. W., Rahman W. R. A., Dzulkifli M. A. (2010). Examining the construct validity of the adapted California critical thinking dispositions (CCTDI) among university students in Malaysia . Procedia Soc. Behav. Sci. 7 , 282–288. doi: 10.1016/j.sbspro.2010.10.039 [ CrossRef ] [ Google Scholar ]
  • Swartz R. J. (2018). “ Critical thinking, the curriculum, and the problem of transfer ,” in Thinking: The Second International Conference . eds. Perkins D. N., Lochhead J., Bishop J. (New York: Routledge; ), 261–284. [ Google Scholar ]
  • Thaiposri P., Wannapiroon P. (2015). Enhancing students’ critical thinking skills through teaching and learning by inquiry-based learning activities using social network and cloud computing . Procedia Soc. Behav. Sci. 174 , 2137–2144. doi: 10.1016/j.sbspro.2015.02.013 [ CrossRef ] [ Google Scholar ]
  • Thomas K., Lok B. (2015). “ Teaching critical thinking: an operational framework ,” in The Palgrave Handbook of Critical Thinking in Higher Education . eds. Davies M., Barnett R. (New York: Palgrave Handbooks; ), 93–106. [ Google Scholar ]
  • Tumkaya S., Aybek B., Aldag H. (2009). An investigation of university Students' critical thinking disposition and perceived problem-solving skills . Eurasian J. Educ. Res. 9 , 57–74. [ Google Scholar ]
  • Ulger K. (2018). The effect of problem-based learning on the creative thinking and critical thinking disposition of students in visual arts education . Interdis. J. Probl-Bas. 12 :10. doi: 10.7771/1541-5015.1649 [ CrossRef ] [ Google Scholar ]
  • Vogel D. L., Wade N. G., Ascheman P. L. (2009). Measuring perceptions of stigmatization by others for seeking psychological help: reliability and validity of a new stigma scale with college students . J. Couns. Psychol. 56 , 301–308. doi: 10.1037/a0014903 [ CrossRef ] [ Google Scholar ]
  • Wechsler S. M., Saiz C., Rivas S. F., Vendramini C. M. M., Almeida L. S., Mundim M. C., et al.. (2018). Creative and critical thinking: independent or overlapping components? Think. Skills Creat. 27 , 114–122. doi: 10.1016/j.tsc.2017.12.003 [ CrossRef ] [ Google Scholar ]
  • Willingham D. T. (2007). Critical thinking: why it is so hard to teach? Am. Fed. Teach. Summer 2007 , 8–19. [ Google Scholar ]
  • Yang S. (2007). E-critical/thematic doing history project: integrating the critical thinking approach with computer-mediated history learning . Comput. Hum. Behav. 23 , 2095–2112. doi: 10.1016/j.chb.2006.02.012 [ CrossRef ] [ Google Scholar ]
  • Yurdakul I. K., Odabasi H. F., Kiliçer K., Çoklar A. N., Birinci G., Kurt A. A. (2012). The development, validity and reliability of TPACK-deep: A technological pedagogical content knowledge scale . Comput. Educ. 58 , 964–977. doi: 10.1016/j.compedu.2011.10.012 [ CrossRef ] [ Google Scholar ]
  • Zhang Q., Tang H., Xu X. (2022). Analyzing collegiate critical thinking course effectiveness: evidence from a quasi-experimental study in China . Think. Skills Creat. 45 :101105. doi: 10.1016/j.tsc.2022.101105 [ CrossRef ] [ Google Scholar ]
  • Zhao L., He W., Su Y. S. (2021). Innovative pedagogy and design-based research on flipped learning in higher education . Front. Psychol. 12 :577002. doi: 10.3389/fpsyg.2021.577002, PMID: [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]

Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

Critical thinking is the active, skillful evaluation, analysis, and synthesis of information gathered from observation, experience, reflection, reasoning, or communication, used as a guide to belief and action. That is why it is so central to education and academic work.

Some may even view it as the backbone of modern thought.

However, it's a skill, and like any skill it must be trained and exercised to reach its full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating their intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help you plan your paper and make it more concise, though the benefit isn't obvious at first. Here are some of the questions you should ask yourself when bringing critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Applying critical thinking goes beyond outlining your paper; it also raises a further question: how can we use critical thinking to solve problems within the topic we're writing about?

Let's say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You'll first have to define critical thinking for your viewers, then use plenty of critical thinking questions and related terms so the audience becomes familiar with your methods and the thinking process behind them.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively from just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and anyone else who needs a helping hand. We cover a wide range of topics, deliver high-quality work on time, and aim to leave our customers completely satisfied with what they ordered.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline of your essay.
  • Provide us with any details, requirements, statements that should be emphasized, or particular parts of the essay writing process you struggle with.
  • Leave the email address where your completed order will be sent.
  • Select your preferred payment type, then sit back and relax!

With years of experience on the market, professionally degreed essay writers, 24/7 online customer support, and incredibly low prices, you won't find a service offering a better deal than ours.


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

Example: Good critical thinking in an academic context. While researching, you find a study reporting positive results for a new treatment. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context. You're researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

Example: Good critical thinking in a nonacademic context. You read a glowing review of a home alarm system. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context. You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you're using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recollect information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.



16 constructive feedback examples — and tips for how to use them


Giving constructive feedback is nerve-wracking for many people. But feedback is also necessary for thriving in the workplace. 

It helps people flex and grow into new skills, capabilities, and roles. It creates more positive and productive relationships between employees. And it helps to reach goals and drive business value.

But feedback is a two-way street. Sooner or later, nearly every employee will have to give constructive feedback in their career. That's why it's helpful to have constructive feedback examples to leverage for the right situation.

We know employees want feedback. But one study found that people want feedback far more when they're on the receiving end; in fact, in every case, participants rated their desire for feedback higher as the receiver. While the fear of giving feedback is very real, it's important not to shy away from constructive feedback opportunities. After all, it could be the difference between a flailing and a thriving team.

If you're trying to overcome your fear of providing feedback, we've compiled a list of 16 constructive feedback examples for you to use. We'll also share some best practices on how to give effective feedback.

What is constructive feedback? 

When you hear the word feedback, what's the first thing that comes to mind? What feelings do you associate with it? Oftentimes, feedback conversations are anxiety-ridden because the feedback is assumed to be negative. Unfortunately, feedback carries a binary stigma: it's either good or bad.

But in reality, there are plenty of types of feedback leveraged in both personal and professional relationships. They don’t all fall into one camp or the other. And each type of feedback is serving a purpose to ultimately better an individual, team, or work environment. 

For example, positive feedback can be used to reinforce desired behaviors or big accomplishments. Real-time feedback is reserved for those “in the moment” situations. If I've made a mistake or a typo in a blog post, for instance, I'd want my teammates to give me real-time feedback.

However, constructive feedback is its own ball game. 


Constructive feedback is a supportive way to improve areas of opportunity for an individual person, team, relationship, or environment. In many ways, constructive feedback is constructive criticism paired with coaching skills.

16 constructive feedback examples to use 

To truly invest in building a feedback culture, your employees need to feel comfortable giving feedback. After all, organizations are made up of people, which means we're all human. We make mistakes, but we're all capable of growth and development. And most importantly, everyone everywhere should be able to live with more purpose, clarity, and passion.

But we won’t unlock everyone’s full potential unless your people are comfortable giving feedback. Some employee feedback might be easier to give than others, like ways to improve a presentation. 

But sometimes, constructive feedback can be tricky, like managing conflict between team members or addressing negative behavior. As any leader will tell you, it’s critical to address negative behaviors and redirect them to positive outcomes. Letting toxic behavior go unchecked can lead to issues with employee engagement , company culture, and overall, your business’s bottom line. 

Regardless of where on the feedback spectrum your organization falls, having concrete examples will help set up your people for success. Let’s talk through some examples of constructive feedback. For any of these themes, it’s always good to have specific examples handy to help reinforce the feedback you’re giving. We’ll also give some sample scenarios of when these phrases might be most impactful and appropriate. 


Constructive feedback examples about communication skills  

An employee speaks over others and interrupts in team meetings.

“I’ve noticed you can cut off team members or interrupt others. You share plenty of good ideas and do good work. To share some communication feedback, I’d love to see how you can support others in voicing their own ideas in our team meetings.”

An employee who doesn’t speak up or share ideas in team meetings.

“I’ve noticed that you don’t often share ideas in big meetings. But in our one-on-one meetings, you come up with plenty of meaningful and creative ideas to help solve problems. What can I do to help make you more comfortable speaking up in front of the team?”

An employee who is brutally honest and blunt.

“Last week, I noticed you told a teammate that their work wasn’t useful to you. It might be true that their work isn’t contributing to your work, but there’s other work being spread across the team that will help us reach our organizational goals. I’d love to work with you on ways to improve your communication skills to help build your feedback skills, too. Would you be interested in pursuing some professional development opportunities?”  

An employee who has trouble building rapport because of poor communication skills in customer and prospect meetings.

“I’ve noticed you dive right into the presentation in our customer and prospect meetings. To build relationships and rapport, it’s good to make sure we’re getting to know everyone as people. Why don’t you try learning more about their work, priorities, and life outside of the office in our next meeting?”


Constructive feedback examples about collaboration 

An employee who doesn’t hold to their commitments on group or team projects.

“I noticed I asked you for a deliverable on this key project by the end of last week. I still haven’t received this deliverable and wanted to follow up. If a deadline doesn’t work well with your bandwidth, would you be able to check in with me? I’d love to get a good idea of what you can commit to without overloading your workload.”  

An employee who likes to gatekeep or protect their work, which hurts productivity and teamwork.

“Our teams have been working together on this cross-functional project for a couple of months. But yesterday, we learned that your team came across a roadblock last month that hasn’t been resolved. I’d love to be a partner to you if you hit any issues in reaching our goals. Would you be willing to share your project plan or help provide some more visibility into your team’s work? I think it would help us with problem-solving and preventing problems down the line.” 

An employee who dominates a cross-functional project and doesn’t often accept new ways of doing things.

“I’ve noticed that two team members have voiced ideas that you have shut down. In the spirit of giving honest feedback, it feels like ideas or new solutions to problems aren’t welcome. Is there a way we could explore some of these ideas? I think it would help to show that we’re team players and want to encourage everyone’s contributions to this project.” 

Constructive feedback examples about time management 

An employee who is always late to morning meetings or one-on-ones.

“I’ve noticed that you’re often late to our morning meetings with the rest of the team. Sometimes, you’re late to our one-on-ones, too. Is there a way I can help you with building better time management skills? Sometimes, the tardiness can come off like you don’t care about the meeting or the person you’re meeting with, which I know you don’t mean.”

A direct report who struggles to meet deadlines.

“Thanks for letting me know you’re running behind schedule and need an extension. I’ve noticed this is the third time you’ve asked for an extension in the past two weeks. In our next one-on-one, can you come up with a list of projects and the amount of time that you’re spending on each project? I wonder if we can see how you’re managing your time and identify efficiencies.” 

An employee who continuously misses team meetings.

“I’ve noticed you haven’t been present at the last few team meetings. I wanted to check in to see how things are going. What do you have on your plate right now? I’m concerned you’re missing critical information that can help you in your role and your career.” 


Constructive feedback examples about boundaries 

A manager who expects the entire team to work on weekends.

“I’ve noticed you send us emails and project plans over the weekends. I put in a lot of hard work during the week, and won’t be able to answer your emails until the work week starts again. It’s important that I maintain my work-life balance to be able to perform my best.” 

An employee who delegates work to other team members.

“I’ve noticed you’ve delegated some aspects of this project that fall into your scope of work. I have a full plate with my responsibilities in XYZ right now. But if you need assistance, it might be worth bringing up your workload to our manager.” 

A direct report who is stressed about employee performance but is at risk of burning out.

“I know we have performance reviews coming up and I’ve noticed an increase in working hours for you. I hope you know that I recognize your work ethic but it’s important that you prioritize your work-life balance, too. We don’t want you to burn out.”  

Constructive feedback examples about managing 

A leader who is struggling with team members working together well in group settings.

“I’ve noticed your team’s scores on our employee engagement surveys. It seems like they don’t collaborate well or work well in group settings, given their feedback. Let’s work on building some leadership skills to help build trust within your team.” 

A leader who is struggling to engage their remote team.

“In my last skip-levels with your team, I heard some feedback about the lack of connections. It sounds like some of your team members feel isolated, especially in this remote environment. Let’s work on ways we can put some virtual team-building activities together.”

A leader who is micromanaging , damaging employee morale.

“In the last employee engagement pulse survey, I took a look at the leadership feedback. It sounds like some of your employees feel that you micromanage them, which can damage trust and employee engagement. In our next one-on-one, let’s talk through some projects that you can step back from and delegate to one of your direct reports. We want to make sure employees on your team feel ownership and autonomy over their work.” 

8 tips for providing constructive feedback 

Asking for and receiving feedback isn’t an easy task. 

But as we know, more people would prefer to receive feedback than give it. If giving constructive feedback feels daunting, we’ve rounded up eight tips to help ease your nerves. These best practices can help make sure you’re nailing your feedback delivery for optimal results, too.

Be clear and direct (without being brutally honest). Make sure you’re clear, concise, and direct. Dancing around the topic isn’t helpful for you or the person you’re giving feedback to. 

Provide specific examples. Get really specific and cite recent examples. If you’re vague and high-level, the employee might not connect feedback with their actions.


Set goals for the behavior you’d like to see changed. If there’s a behavior that’s consistent, try setting a goal with your employee. For example, let’s say a team member dominates the conversation in team meetings. Could you set a goal for how many times they encourage other team members to speak and share their ideas? 

Give time and space for clarifying questions. Constructive feedback can be hard to hear. It can also take some time to process. Make sure you give the person the time and space for questions and follow-up. 

Know when to give feedback in person versus written communication. Some constructive feedback simply shouldn’t be put in an email or a Slack message. Know the right communication forum to deliver your feedback.   

Check in. Make an intentional effort to check in with the person on how they're doing in the respective area of feedback. For example, let's say you've given a teammate feedback on their presentation skills. Follow up on how they've invested in building their public speaking skills. Ask if you can help them practice before a big meeting or presentation.

Ask for feedback in return. Feedback can feel hierarchical and top-down sometimes. Make sure that you open the door to gather feedback in return from your employees. 

Start giving effective constructive feedback 

Meaningful feedback can be the difference between a flailing and a thriving team. To create a feedback culture in your organization, constructive feedback is a necessary ingredient.

Think about the role of coaching to help build feedback muscles with your employees. With access to virtual coaching, you can make sure your employees are set up for success. BetterUp can help your workforce reach its full potential.


Madeline Miles

Madeline is a writer, communicator, and storyteller who is passionate about using words to help drive positive change. She holds a bachelor's in English Creative Writing and Communication Studies and lives in Denver, Colorado. In her spare time, she's usually somewhere outside (preferably in the mountains) — and enjoys poetry and fiction.



Critical Thinking in the Age of AI

Over the past few decades, online discussions have occasionally turned to the question of whether the tools that allow us easy access to a wealth of information (like Google) help or hurt our ability to reason (i.e., to be critical thinkers) and to learn. We are now seeing this same discussion about new artificial intelligence tools, like ChatGPT. Before reading on, take a moment and think about your own perspective on this: Do these tools help make you a stronger critical thinker or not?

To figure this out, we need a clear definition of critical thinking, something that is frequently lacking in these discussions. Let's use what Daniel Willingham, a cognitive scientist, calls his “commonsensical view.” His definition is that critical thinking is:

  • Novel: Not a direct repetition of something you’ve learned before. 
  • Self-directed: Not just repeating steps you’ve been given. 
  • Effective: Following patterns that are likely to yield useful conclusions.

These elements are best thought about in the context of the tools we have available: As we use generative AI to create more writing and other content, it will be even more important to approach media with the right framework. Daniel Dennett, the philosopher and scientist, wrote about some mental habits to use (he called them “intuition pumps”). Occam's razor is one that many people are familiar with: don't rely on a complex explanation when a simpler one works just as well. He also talks about certain things to be on the lookout for, such as a “deepity”: a statement that sounds simple and profound but can be read multiple ways and which is actually quite pointless (Dennett gives the example “Love is just a word”). If you find yourself gasping and going “wow” after encountering one of these, take a moment to see if you can explain what interesting idea it actually revealed; you may not be able to!

Like any habit of mind, effective general patterns of thinking can be adopted through effortful practice. And domain-specific critical thinking skills can be supported through instruction. That is, if your goal is to teach someone how to, say, debug a program that isn't working, there is evidence that this can be taught through direct instruction and practice applying it, with feedback along the way; a sketch of what such an exercise might look like follows.
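
To make that concrete, here is a minimal sketch, in Python, of the kind of practice-with-feedback exercise such direct instruction might use. The function, the seeded bug, and the test are hypothetical illustrations, not taken from any study mentioned here.

    # Hypothetical debugging exercise: a function with a seeded bug.
    # The learner runs the test, gets concrete feedback (a failing
    # assertion), and practices a diagnose -> hypothesize -> verify loop.

    def average_of_last_n(values, n):
        """Return the mean of the last n items in `values`."""
        # Seeded bug: this slice keeps only n - 1 items, so the mean
        # is computed over too small a window (the fix: values[-n:]).
        window = values[-(n - 1):]
        return sum(window) / len(window)

    def test_average_of_last_n():
        # The failure tells the learner *that* something is wrong;
        # tracing the slice by hand tells them *why*.
        assert average_of_last_n([1, 2, 3, 4], 2) == 3.5

    if __name__ == "__main__":
        try:
            test_average_of_last_n()
            print("pass: the bug has been fixed")
        except AssertionError:
            print("fail: inspect the slice inside average_of_last_n")

The specific bug matters less than the loop around it: the learner gets immediate, concrete feedback (the failing test) and a repeatable procedure to practice, which is the direct-instruction-plus-feedback pattern described above.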

But this gets at another challenge. Frequently, what people mean by critical thinking is the application of approaches to new situations. You might think that someone who just received that training in debugging could use the same underlying skills to, say, revise an essay. However, a lot of psychology research finds that we are generally terrible at transferring critical thinking skills to new situations. We'll dive into this limitation in people's ability to transfer their knowledge in another post; for now, we shouldn't assume that people using generative AI will be protected by expertise in their own fields when evaluating false or misleading claims outside that expertise.

So what does all of this mean for a person trying to keep up their critical thinking skills as the use of generative AI ramps up? Here are a few suggestions:

  • Learn: Continue building your own body of knowledge and skills, even if it is seemingly something that a computer could do for you. That will give you the grounding to potentially make connections and form new ideas that go beyond what even ChatGPT can generate.
  • Evaluate: Stress the critical in the idea of “critical thinking.” It is well known that generative AI can hallucinate, particularly when it comes to up-to-date research. Even outside interactions with a tool like ChatGPT, try to apply some healthy skepticism, whether to a news article, a YouTube video, an interesting newsletter, a corporate strategy document, or any other media. Look for additional sources for claims you see, particularly ones that seem too good to be true.
  • Reflect: After you work with an AI, do some reflecting. For example, if you are using ChatGPT to help you craft a persuasive message (like a marketing email or even a LinkedIn post), ask yourself how it went. Did it produce what you wanted? What elements seemed to align with your thinking and which didn't? Making sure you stop to explore these questions in your own interactions will help make you a stronger critical thinker when dealing with the output of AI systems.



How to Give (and Receive) Critical Feedback

  • Patrick Thean


It can make or break your relationships with your team.

New leaders often put off difficult discussions, at the expense of themselves and their teams. At the root of this avoidance is usually a lack of experience and practice, both of which can be gained with intention and time. Here are two especially “spicy” conversations that all new managers face, and how to navigate them now and in the future:

  • Giving Feedback to Direct Reports: You should be having regular weekly or biweekly one-on-one meetings with each of your team members to check in on their work and offer your support. This is a great time to share feedback (both positive and negative). When delivering critical or negative feedback, take a coaching approach. Instead of telling your direct report, “Hey, you filled this report out all wrong. I need you to fix it,” begin by giving them the benefit of the doubt and then offer them guidance.
  • Receiving Feedback from Direct Reports: First, you need to create a psychologically safe space where your team members feel comfortable expressing thoughts, doubts, and perspectives without the fear of a consequence. You can do this by sharing vulnerably when the opportunity arises and openly appreciating team members who do the same. If people seem hesitant to approach you, directly ask for feedback and show your appreciation by taking it seriously when received. When a direct report expresses their opinions and frustrations, repeat back what they say to you to be sure you’re on the same page.

What’s the biggest challenge new managers face when trying to grow in their roles? Answers could include broad, complex, and difficult-to-qualify problems like figuring out how to fulfill the company’s vision, retain clientele, or lead a productive team. While these goals are worthy of pursuit, there is one overarching skill needed to see them come to fruition: building strong relationships with your direct reports.


  • Patrick Thean is an international speaker, a USA TODAY and Wall Street Journal bestselling author, a CEO coach, and a serial entrepreneur. He is the author of Rhythm: How to Achieve Breakthrough Execution and Accelerate Growth.


    Developing critical thinking is becoming increasingly important as is giving and receiving feedback during the learning process. The aim of this work is to study how technology can scaffold peer assessment activities to develop critical thinking among pre-service teachers and study the relevance of giving and receiving feedback. A series of practice and application activities were introduced ...

  14. What Good Feedback Really Looks Like

    Summary. Feedback — both positive and negative — is essential to helping managers enhance their best qualities and address their worst so they can excel at leading. Strengths-based development ...

  15. Constructing a critical thinking evaluation framework for college

    The evaluation of thinking is helpful for students to think at higher levels (Kilic et al., 2020). Although the definitions of CT are controversial ... Impact of social problem-solving training on critical thinking and decision making of nursing students. BMC Nurs. 19:94. doi: 10.1186/s12912-020-00487-x, ...

  16. Critical thinking: reasoned decision making

    Critical thinking helps making decisions within a company, selecting the best action for the organization. In this course of critical thinking the students will learn the tendencies, approximations and assumptions on which their reflections are based, and the conditions and the outcomes derived from their ways of thinking. ... Evaluation of the ...

  17. What Are Critical Thinking Skills and Why Are They Important?

    It makes you a well-rounded individual, one who has looked at all of their options and possible solutions before making a choice. According to the University of the People in California, having critical thinking skills is important because they are [ 1 ]: Universal. Crucial for the economy. Essential for improving language and presentation skills.

  18. Using Critical Thinking in Essays and other Assignments

    Critical thinking definition. Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement. Active and skillful approach, evaluation, assessment, synthesis, and/or evaluation of information obtained from, or made by, observation, knowledge, reflection, acumen or ...

  19. Bridging critical thinking and transformative learning: The role of

    In recent decades, approaches to critical thinking have generally taken a practical turn, pivoting away from more abstract accounts - such as emphasizing the logical relations that hold between statements (Ennis, 1964) - and moving toward an emphasis on belief and action.According to the definition that Robert Ennis (2018) has been advocating for the last few decades, critical thinking is ...

  20. What Is Critical Thinking?

    Critical thinking is the ability to effectively analyze information and form a judgment. To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources. Critical thinking skills help you to: Identify credible sources. Evaluate and respond to arguments.

  21. How to Give

    Listen to the original Dear HBR episode: Critical Feedback (Feb 2019) Find more episodes of Dear HBR. Discover 100 years of Harvard Business Review articles, case studies, podcasts, and more at ...

  22. 16 Constructive Feedback Examples (And Tips For How to Use Them)

    Some employee feedback might be easier to give than others, like ways to improve a presentation. But sometimes, constructive feedback can be tricky, like managing conflict between team members or addressing negative behavior. As any leader will tell you, it's critical to address negative behaviors and redirect them to positive outcomes.

  23. Critical Thinking in the Age of AI

    And domain-specific critical thinking skills can be supported through instruction. That is, if your goal is to teach someone how to, say, debug a program that isn't working, there is evidence that it can be taught through direct instruction and practice applying it, getting feedback along the way. ‍ But this gets at another challenge.

  24. How to Give (and Receive) Critical Feedback

    Giving Feedback to Direct Reports: You should be having regular weekly or biweekly one-on-one meetings with each of your team members to check in on their work and offer your support. This is a ...

  25. Challenge-based learning and design thinking in higher education

    As shown in Figure 1, the research methodology is developed in three phases: during the training process, at the end of the training process in the subjects and at the end of the evaluation of the generic competencies declared in them.. During the training process, the VAR is applied to the students and feedback is given to the tutor. At the end of the training process in the subjects, the ...