Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills you need for critical thinking, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics that help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. Thinking outside of the box lets you identify new possible outcomes using pieces of information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences can take the form of cognitive biases, which are difficult to identify in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion rather than fact? Is any of the information designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
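If it helps to see those mechanics, here is a minimal sketch in Python of how a paired comparison tally could work, assuming you record a winner and a 0-3 weight for each pair. The options and judgments below are invented for illustration; they're not part of the technique itself.

```python
from itertools import combinations

def paired_comparison(options, judge):
    """Rank options by comparing them two at a time.

    judge(a, b) returns (winner, weight), where weight says how much
    more important the winner is (0 = no real difference, 3 = much more).
    """
    scores = {option: 0 for option in options}
    for a, b in combinations(options, 2):
        winner, weight = judge(a, b)
        scores[winner] += weight
    # The highest total marks the highest-priority option.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Invented pairwise judgments for three competing improvements.
preferences = {
    ("fix signup bug", "redesign homepage"): ("fix signup bug", 3),
    ("fix signup bug", "write FAQ page"): ("fix signup bug", 2),
    ("redesign homepage", "write FAQ page"): ("write FAQ page", 1),
}

ranking = paired_comparison(
    ["fix signup bug", "redesign homepage", "write FAQ page"],
    judge=lambda a, b: preferences[(a, b)],
)
print(ranking)  # [('fix signup bug', 5), ('write FAQ page', 1), ('redesign homepage', 0)]
```

Writing the judgments down this way makes your comparisons explicit, so others can challenge an individual weight rather than argue with the final ranking.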

The final step involves challenging the information and evaluating its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking, for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, rational to apply logic, and self-aware enough to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.


How to develop critical thinking skills


A client requests a tight deadline on an intense project. Your childcare provider calls in sick on a day full of meetings. Payment from a contract gig is a month behind. 

Your day-to-day will always have challenges, big and small. And no matter the size and urgency, they all ask you to use critical thinking to analyze the situation and arrive at the right solution. 

Critical thinking includes a wide set of soft skills that encourage continuous learning, resilience, and self-reflection. The more you add to your professional toolbelt, the more equipped you’ll be to tackle whatever challenge presents itself. Here’s how to develop critical thinking, with examples explaining how to use it.

What are critical thinking skills?

Critical thinking skills are the skills you use to analyze information, imagine scenarios holistically, and create rational solutions. It’s a type of emotional intelligence that stimulates effective problem-solving and decision-making.

When you fine-tune your critical thinking skills, you look beyond face-value observations and knee-jerk reactions. Instead, you harvest deeper insights and string together ideas and concepts in logical, sometimes out-of-the-box, ways.

Imagine a team working on a marketing strategy for a new set of services. That team might use critical thinking to balance goals and key performance indicators, like new customer acquisition costs, average monthly sales, and net profit margins. They understand the connections between overlapping factors to build a strategy that stays within budget and attracts new sales.

Looking for ways to improve critical thinking skills? Start by brushing up on the following soft skills that fall under this umbrella: 

  • Analytical thinking: Approaching problems with an analytical eye includes breaking down complex issues into small chunks and examining their significance. An example could be organizing customer feedback to identify trends and improve your product offerings.
  • Open-mindedness: Push past cognitive biases and be receptive to different points of view and constructive feedback. Managers and team members who keep an open mind position themselves to hear new ideas that foster innovation.
  • Creative thinking: With creative thinking, you can develop several ideas to address a single problem, like brainstorming more efficient workflow best practices to boost productivity and employee morale.
  • Self-reflection: Self-reflection lets you examine your thinking and assumptions to stimulate healthier collaboration and thought processes. Maybe a bad first impression created a negative anchoring bias with a new coworker. Reflecting on your own behavior stirs up empathy and improves the relationship.
  • Evaluation: With evaluation skills, you tackle the pros and cons of a situation based on logic rather than emotion. When prioritizing tasks, you might be tempted to do the fun or easy ones first, but evaluating their urgency and importance can help you make better decisions (see the sketch after this list).
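As a minimal sketch of that last point (the tasks and the 1-5 urgency and importance scores below are invented for illustration), an evaluation can be as simple as scoring each task on logic-relevant criteria and sorting:

```python
# Invented tasks scored 1-5 for urgency and importance.
tasks = [
    {"name": "reply to client email", "urgency": 5, "importance": 4},
    {"name": "tidy project wiki", "urgency": 1, "importance": 2},
    {"name": "prepare quarterly report", "urgency": 3, "importance": 5},
]

# Sort by combined score so logic, not fun, decides what comes first.
for task in sorted(tasks, key=lambda t: t["urgency"] + t["importance"], reverse=True):
    print(task["urgency"] + task["importance"], task["name"])
```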

There’s no magic method to change your thinking processes. Improvement happens with small, intentional changes to your everyday habits until a more critical approach to thinking is automatic. 

Here are 12 tips for building stronger self-awareness and learning how to improve critical thinking: 

1. Be cautious

There’s nothing wrong with a little bit of skepticism. One of the core principles of critical thinking is asking questions and dissecting the available information. You might surprise yourself at what you find when you stop to think before taking action. 

Before making a decision, use evidence, logic, and deductive reasoning to support your own opinions or challenge ideas. It helps you and your team avoid falling prey to bad information or resistance to change.

2. Ask open-ended questions

“Yes” or “no” questions invite agreement rather than reflection. Instead, ask open-ended questions that force you to engage in analysis and rumination. Digging deeper can help you identify potential biases, uncover assumptions, and arrive at new hypotheses and possible solutions. 

3. Do your research

No matter your proficiency, you can always learn more. Turning to different points of view and information is a great way to develop a comprehensive understanding of a topic and make informed decisions. You’ll prioritize reliable information rather than fall into emotional or automatic decision-making. 


4. Consider several opinions

You might spend so much time on your work that it’s easy to get stuck in your own perspective, especially if you work independently on a remote team. Make an effort to reach out to colleagues to hear different ideas and thought patterns. Their input might surprise you.

If or when you disagree, remember that you and your team share a common goal. Divergent opinions are constructive, so shift the focus to finding solutions rather than defending disagreements. 

5. Learn to be quiet

Active listening is the intentional practice of concentrating on a conversation partner instead of your own thoughts. It’s about paying attention to detail and letting people know you value their opinions, which can open your mind to new perspectives and thought processes.

If you’re brainstorming with your team or having a 1:1 with a coworker , listen, ask clarifying questions, and work to understand other peoples’ viewpoints. Listening to your team will help you find fallacies in arguments to improve possible solutions.

6. Schedule reflection

Whether waking up at 5 am or using a procrastination hack, scheduling time to think puts you in a growth mindset. Your mind has natural cognitive biases to help you simplify decision-making, but squashing them is key to thinking critically and finding new solutions besides the ones you might gravitate toward. Creating time and calm space in your day gives you the chance to step back and visualize the biases that impact your decision-making.

7. Cultivate curiosity

With so many demands and job responsibilities, it’s easy to seek solace in routine. But getting out of your comfort zone helps spark critical thinking and uncover more solutions than you usually would.

If curiosity doesn’t come naturally to you, cultivate a thirst for knowledge by reskilling and upskilling . Not only will you add a new skill to your resume , but expanding the limits of your professional knowledge might motivate you to ask more questions. 

You don’t have to develop critical thinking skills exclusively in the office. Whether on your break or finding a hobby to do after work, playing strategic games or filling out crosswords can prime your brain for problem-solving. 


9. Write it down

Recording your thoughts with pen and paper can lead to stronger brain activity than typing them out on a keyboard. If you’re stuck and want to think more critically about a problem, writing your ideas can help you process information more deeply.

The act of recording ideas on paper can also improve your memory. Ideas are more likely to linger in the background of your mind, leading to deeper thinking that informs your decision-making process.

10. Speak up

Take opportunities to share your opinion, even if it intimidates you. Whether at a networking event with new people or a meeting with close colleagues, try to engage with people who challenge or help you develop your ideas. Having conversations that force you to support your position encourages you to refine your argument and think critically. 

11. Stay humble

Ideas and concepts aren’t the same as real-life actions. There may be such a thing as negative outcomes, but there’s no such thing as a bad idea. At the brainstorming stage , don’t be afraid to make mistakes.

Sometimes the best solutions come from off-the-wall, unorthodox decisions. Sit in your creativity, let ideas flow, and don’t be afraid to share them with your colleagues. Putting yourself in a creative mindset helps you see situations from new perspectives and arrive at innovative conclusions.

12. Embrace discomfort

Get comfortable feeling uncomfortable. It isn’t easy when others challenge your ideas, but sometimes, it’s the only way to see new perspectives and think critically.

By willingly stepping into unfamiliar territory, you foster the resilience and flexibility you need to become a better thinker. You’ll learn how to pick yourself up from failure and approach problems from fresh angles. 


How to practice critical thinking skills at work

Thinking critically is easier said than done. To help you understand its impact (and how to use it), here are two scenarios that require critical thinking skills and provide teachable moments.

Scenario #1: Unexpected delays and budget

Imagine your team is working on producing an event. Unexpectedly, a vendor explains they’ll be a week behind on delivering materials. Then another vendor sends a quote that’s more than you can afford. Unless you develop a creative solution, the team will have to push back deadlines and go over budget, potentially costing the client’s trust. 

Here’s how you could approach the situation with creative thinking:

  • Analyze the situation holistically: Determine how the delayed materials and over-budget quote will impact the rest of your timeline and financial resources. That way, you can identify whether you need to build an entirely new plan with new vendors, or if it’s worth it to readjust time and resources.
  • Identify your alternative options: With careful assessment, your team decides that another vendor can’t provide the same materials in a quicker time frame. You’ll need to rearrange assignment schedules to complete everything on time. 
  • Collaborate and adapt: Your team has an emergency meeting to rearrange your project schedule. You write down each deliverable and determine which ones you can and can’t complete by the deadline. To compensate for lost time, you rearrange your task schedule to complete everything that doesn’t need the delayed materials first, then advance as far as you can on the tasks that do. 
  • Check different resources: In the meantime, you scour through your contact sheet to find alternative vendors that fit your budget. Accounting helps by providing old invoices to determine which vendors have quoted less for previous jobs. After pulling all your sources, you find a vendor that fits your budget. 
  • Maintain open communication: You create a special Slack channel to keep everyone up to date on changes, challenges, and additional delays. Keeping an open line encourages transparency on the team’s progress and boosts everyone’s confidence. 


Scenario #2: Differing opinions 

A conflict arises between two team members on the best approach for a new strategy for a gaming app. One believes that small tweaks to the current content are necessary to maintain user engagement and stay within budget. The other believes a bold revamp is needed to encourage new followers and stronger sales revenue. 

Here’s how critical thinking could help this conflict:

  • Listen actively: Give both team members the opportunity to present their ideas free of interruption. Encourage the entire team to ask open-ended questions to more fully understand and develop each argument. 
  • Flex your analytical skills: After learning more about both ideas, everyone should objectively assess the benefits and drawbacks of each approach. Analyze each idea's risk, merits, and feasibility based on available data and the app’s goals and objectives. 
  • Identify common ground: The team discusses similarities between each approach and brainstorms ways to integrate both ideas, like making small but eye-catching modifications to existing content or using the same visual design in new media formats.
  • Test new strategy: To test out the potential of a bolder strategy, the team decides to A/B test both approaches. You create a set of criteria to evenly distribute users by different demographics to analyze engagement, revenue, and customer turnover (a sketch of this assignment step follows the list).
  • Monitor and adapt: After implementing the A/B test, the team closely monitors the results of each strategy. You regroup and optimize the changes that provide stronger results after the testing. That way, all team members understand why you’re making the changes you decide to make.
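Here is the sketch of that assignment step. Everything in it is an assumption for illustration (the user IDs, the age-band strata, the variant names, and the hash-based split); real experimentation platforms handle this for you:

```python
import hashlib

def assign_variant(user_id: str, stratum: str) -> str:
    """Deterministically split users 50/50 within each demographic stratum,
    so both strategies see a comparable mix of users."""
    digest = hashlib.sha256(f"{stratum}:{user_id}".encode()).hexdigest()
    return "small_tweaks" if int(digest, 16) % 2 == 0 else "bold_revamp"

# Invented users tagged with an age-band stratum.
users = [("u1001", "18-24"), ("u1002", "18-24"), ("u2001", "25-34"), ("u2002", "25-34")]

for user_id, stratum in users:
    print(user_id, stratum, assign_variant(user_id, stratum))
```

Hashing within each stratum keeps the assignment deterministic and roughly even per demographic, so differences in engagement, revenue, and turnover reflect the strategies rather than the audience mix.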

You can’t think your problems away. But you can equip yourself with skills that help you move through your biggest challenges and find innovative solutions. Learning how to develop critical thinking is the start of honing an adaptable growth mindset. 

Now that you have resources to increase critical thinking skills in your professional development, notice whether you embrace change or routine, are open or resistant to feedback, and turn to research or emotion; recognizing these tendencies will build self-awareness. From there, tweak and incorporate techniques to be a critical thinker when life presents you with a problem.

Elizabeth Perry, ACC


How to build critical thinking skills for better decision-making

It’s simple in theory, but tougher in practice – here are five tips to get you started.


Have you heard the riddle about two coins that equal thirty cents, but one of them is not a nickel? What about the one where a surgeon says they can’t operate on their own son?

Those brain teasers tap into your critical thinking skills. But your ability to think critically isn’t just helpful for solving those random puzzles – it plays a big role in your career. 

An impressive 81% of employers say critical thinking carries a lot of weight when they’re evaluating job candidates. It ranks as the top competency companies consider when hiring recent graduates (even ahead of communication). Plus, once you’re hired, several studies show that critical thinking skills are highly correlated with better job performance.

So what exactly are critical thinking skills? And even more importantly, how do you build and improve them? 

What is critical thinking?

Critical thinking is the ability to evaluate facts and information, remain objective, and make a sound decision about how to move forward.

Does that sound like how you approach every decision or problem? Not so fast. Critical thinking seems simple in theory but is much tougher in practice, which helps explain why 65% of employers say their organization has a need for more critical thinking. 

In reality, critical thinking doesn’t come naturally to a lot of us. In order to do it well, you need to:

  • Remain open-minded and inquisitive, rather than relying on assumptions or jumping to conclusions
  • Ask questions and dig deep, rather than accepting information at face value
  • Keep your own biases and perceptions in check to stay as objective as possible
  • Rely on your emotional intelligence to fill in the blanks and gain a more well-rounded understanding of a situation

So, critical thinking isn’t just being intelligent or analytical. In many ways, it requires you to step outside of yourself, let go of your own preconceived notions, and approach a problem or situation with curiosity and fairness.

It’s a challenge, but it’s well worth it. Critical thinking skills will help you connect ideas, make reasonable decisions, and solve complex problems.

7 critical thinking skills to help you dig deeper

Critical thinking is often labeled as a skill itself (you’ll see it bulleted as a desired trait in a variety of job descriptions). But it’s better to think of critical thinking less as a distinct skill and more as a collection or category of skills. 

To think critically, you’ll need to tap into a bunch of your other soft skills. Here are seven of the most important. 

Open-mindedness

It’s important to kick off the critical thinking process with the idea that anything is possible. The more you’re able to set aside your own suspicions, beliefs, and agenda, the better prepared you are to approach the situation with the level of inquisitiveness you need. 

That means not closing yourself off to any possibilities and allowing yourself the space to pull on every thread – yes, even the ones that seem totally implausible.

As Christopher Dwyer, Ph.D., writes in a piece for Psychology Today, “Even if an idea appears foolish, sometimes its consideration can lead to an intelligent, critically considered conclusion.” He goes on to compare the critical thinking process to brainstorming. Sometimes the “bad” ideas are what lay the foundation for the good ones.

Open-mindedness is challenging because it requires more effort and mental bandwidth than sticking with your own perceptions. Approaching problems or situations with true impartiality often means:

  • Practicing self-regulation: Giving yourself a pause between when you feel something and when you actually react or take action.
  • Challenging your own biases: Acknowledging your biases and seeking feedback are two powerful ways to get a broader understanding.

Critical thinking example

In a team meeting, your boss mentioned that your company newsletter signups have been decreasing and she wants to figure out why.

At first, you feel offended and defensive – it feels like she’s blaming you for the dip in subscribers. You recognize and rationalize that emotion before thinking about potential causes. You have a hunch about what’s happening, but you commit to exploring all possibilities and input from your team members.

Observation

Observation is, of course, your ability to notice and process the details all around you (even the subtle or seemingly inconsequential ones). Critical thinking demands that you’re flexible and willing to go beyond surface-level information, and solid observation skills help you do that.

Your observations help you pick up on clues from a variety of sources and experiences, all of which help you draw a final conclusion. After all, sometimes it’s the most minuscule realization that leads you to the strongest conclusion.

Over the next week or so, you keep a close eye on your company’s website and newsletter analytics to see if numbers are in fact declining or if your boss’s concerns were just a fluke. 

Research

Critical thinking hinges on objectivity. And, to be objective, you need to base your judgments on the facts – which you collect through research. You’ll lean on your research skills to gather as much information as possible that’s relevant to your problem or situation.

Keep in mind that this isn’t just about the quantity of information – quality matters too. You want to find data and details from a variety of trusted sources to drill past the surface and build a deeper understanding of what’s happening. 

You dig into your email and website analytics to identify trends in bounce rates, time on page, conversions, and more. You also review recent newsletters and email promotions to understand what customers have received, look through current customer feedback, and connect with your customer support team to learn what they’re hearing in their conversations with customers.

Analysis

The critical thinking process is sort of like a treasure hunt – you’ll find some nuggets that are fundamental for your final conclusion and some that might be interesting but aren’t pertinent to the problem at hand.

That’s why you need analytical skills. They’re what help you separate the wheat from the chaff, prioritize information, identify trends or themes, and draw conclusions based on the most relevant and influential facts. 

It’s easy to confuse analytical thinking with critical thinking itself, and it’s true there is a lot of overlap between the two. But analytical thinking is just a piece of critical thinking. It focuses strictly on the facts and data, while critical thinking incorporates other factors like emotions, opinions, and experiences. 

As you analyze your research, you notice that one specific webpage has contributed to a significant decline in newsletter signups. While all of the other sources have stayed fairly steady with regard to conversions, that one has sharply decreased.

You decide to move on from your other hypotheses about newsletter quality and dig deeper into the analytics. 
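As a toy sketch of that kind of analysis (the page paths and conversion rates are invented, and a real analytics platform would expose this data differently):

```python
# Invented signup conversion rates (signups / visits) per page, per period.
before = {"/blog": 0.040, "/pricing": 0.035, "/landing-webinar": 0.050}
after = {"/blog": 0.038, "/pricing": 0.034, "/landing-webinar": 0.004}

# Rank pages by how sharply their conversion rate fell.
drops = {page: before[page] - after[page] for page in before}
worst_page = max(drops, key=drops.get)
print(worst_page, round(drops[worst_page], 3))  # /landing-webinar 0.046
```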

Inference

One of the traps of critical thinking is that it’s easy to feel like you’re never done. There’s always more information you could collect and more rabbit holes you could fall down.

But at some point, you need to accept that you’ve done your due diligence and make a decision about how to move forward. That’s where inference comes in. It’s your ability to look at the evidence and facts available to you and draw an informed conclusion based on those. 

When you’re so focused on staying objective and pursuing all possibilities, inference can feel like the antithesis of critical thinking. But ultimately, it’s your inference skills that allow you to move out of the thinking process and onto the action steps. 

You dig deeper into the analytics for the page that hasn’t been converting and notice that the sharp drop-off happened around the same time you switched email providers.

After looking more into the backend, you realize that the signup form on that page isn’t correctly connected to your newsletter platform. It seems like anybody who has signed up on that page hasn’t been fed to your email list. 

Communication


If and when you identify a solution or answer, you can’t keep it close to the vest. You’ll need to use your communication skills to share your findings with the relevant stakeholders – like your boss, team members, or anybody who needs to be involved in the next steps.

Your analysis skills will come in handy here too, as they’ll help you determine what information other people need to know so you can avoid bogging them down with unnecessary details. 

In your next team meeting, you pull up the analytics and show your team the sharp drop-off as well as the missing connection between that page and your email platform. You ask the web team to reinstall and double-check that connection and you also ask a member of the marketing team to draft an apology email to the subscribers who were missed. 

Problem-solving

Critical thinking and problem-solving are two more terms that are frequently confused. After all, when you think critically, you’re often doing so with the objective of solving a problem.

The best way to understand how problem-solving and critical thinking differ is to think of problem-solving as much more narrow. You’re focused on finding a solution.

In contrast, you can use critical thinking for a variety of use cases beyond solving a problem – like answering questions or identifying opportunities for improvement. Even so, within the critical thinking process, you’ll flex your problem-solving skills when it comes time to take action. 

Once the fix is implemented, you monitor the analytics to see if subscribers continue to increase. If not (or if they increase at a slower rate than you anticipated), you’ll roll out some other tests like changing the CTA language or the placement of the subscribe form on the page.

5 ways to improve your critical thinking skills


Think critically about critical thinking and you’ll quickly realize that it’s not as instinctive as you’d like it to be. Fortunately, your critical thinking skills are learned competencies and not inherent gifts – and that means you can improve them. Here’s how:

  • Practice active listening: Active listening helps you process and understand what other people share. That’s crucial as you aim to be open-minded and inquisitive.
  • Ask open-ended questions: If your critical thinking process involves collecting feedback and opinions from others, ask open-ended questions (meaning, questions that can’t be answered with “yes” or “no”). Doing so will give you more valuable information and also prevent your own biases from influencing people’s input.
  • Scrutinize your sources: Figuring out what to trust and prioritize is crucial for critical thinking. Boosting your media literacy and asking more questions will help you be more discerning about what to factor in. It’s hard to strike a balance between skepticism and open-mindedness, but approaching information with questions (rather than unquestioning trust) will help you draw better conclusions. 
  • Play a game: Remember those riddles we mentioned at the beginning? As trivial as they might seem, games and exercises like those can help you boost your critical thinking skills. There are plenty of critical thinking exercises you can do individually or as a team.
  • Give yourself time: Research shows that rushed decisions are often regrettable ones. That’s likely because critical thinking takes time – you can’t do it under the wire. So, for big decisions or hairy problems, give yourself enough time and breathing room to work through the process. It’s hard enough to think critically without a countdown ticking in your brain. 

Critical thinking really is critical

The ability to think critically is important, but it doesn’t come naturally to most of us. It’s just easier to stick with biases, assumptions, and surface-level information. 

But that route often leads you to rash judgments, shaky conclusions, and disappointing decisions. So here’s a conclusion we can draw without any more noodling: Even if it is more demanding on your mental resources, critical thinking is well worth the effort.


Original Research Article

Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation


  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas involving ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) The storyline describes a carefully curated version of a complex, real-world situation. (2) The challenge frames the task to be accomplished. (3) A portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics. (4) The scoring rubric comprises a set of scales each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program that involve various refinements and extensions of the assessment framework, a number of empirical studies, along with linkages to current work in online reading and information processing.

Introduction

In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010 ; Hyytinen et al., 2019 ). The importance of CT is echoed by business leaders ( Association of American Colleges and Universities [AACU], 2018 ), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018 ). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents ( Indiana University, 2019 ). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard ( Arum and Roksa, 2011 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019 ). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017 ); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose ( Leu et al., 2020 ).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003 ), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment.

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills ( Pellegrino and Hilton, 2012 ) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct ( Liu et al., 2014 ). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument ( Stolzenberg et al., 2019 ).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003 ; Kosslyn and Nelson, 2017 ; Shavelson et al., 2018 ). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information ( Oser and Biedermann, 2020 ).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis, is the most complex of the three levels. Critical Analysis requires both knowledge in a specific discipline (conceptual) and procedural analytical (deduction, inclusion, etc.) knowledge. The second level is Critical Reflection, which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one needs not only to apply analytic reasoning, but also to adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of various actors involved in the dilemma of interest. The third level, Critical Alertness, involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types: (i) when solving problems and making decisions in professional and everyday life, for instance, related to civic affairs and the environment; and (ii) in situations where various mental processes (e.g., comparing, evaluating, and justifying) are developed through formal instruction, usually in a discipline. Hence, in both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments ( Nagel et al., 2020 ). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solve a problem, decide on a course of action, find an answer to a given question or reach a conclusion) ( Shavelson et al., 2019 ). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of actions; and communicating clearly and concisely decisions and actions. The order in which activities are carried out can vary among individuals and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – to which individuals have to apply these skills ( Kegan, 1994 ; Tessier-Lavigne, 2020 ). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about and acting in learning, professional and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. For those interested in current approaches to the measurement of CT that are not the focus of this paper, consult Zlatkin-Troitschanskaia et al. (2018) .

In this paper, we have singled out performance assessment as it offers important advantages to measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer, and scenario-based tasks (for an overview, see Liu et al., 2014 ). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct such as perspective taking and communication. High fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses that are more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items ( Messick, 1994 ; Braun, 2019 ). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, using traditional assessments ( Lane and Stone, 2006 ; Braun, 2019 ; Shavelson et al., 2019 ). However, these assertions must be empirically validated, and the measures should be subjected to psychometric analyses. Evidence of the reliability, validity, and interpretative challenges of performance assessment (PA) are extensively detailed in Davey et al. (2015) .

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment ( Davey et al., 2015 , p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). In this paper, we refer to both individual performance- and constructed-response tasks as performance tasks (PT) (For an example, see Table 1 in section “iPAL Assessment Framework”).


Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth in higher education of research and practice in measuring CT with performance tasks ( Shavelson et al., 2018 ). In this section, we present iPAL’s assessment framework as the basis of measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council of Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication ( Klein et al., 2007 ; Shavelson, 2010 ). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PTs: one in which students critique an argument and one in which they propose a solution to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document which elicits an open-ended response. The CLA+ added the SRQ section (which is not linked substantively to the PT scenario) to increase the number of student responses and thereby obtain more reliable estimates of performance at the student level than could be achieved with a single PT ( Zahner, 2013 ; Davey et al., 2015 ).

iPAL Assessment Framework

Methodological Foundations

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007) . It was also informed by the results from the AHELO pilot study ( Organisation for Economic Co-operation and Development [OECD], 2012 , 2013 ), as well as the KoKoHs research program in Germany (for an overview see, Zlatkin-Troitschanskaia et al., 2017 , 2020 ). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) ( Mislevy et al., 2003 ; Mislevy and Haertel, 2006 ; Haertel and Fujii, 2017 ).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct ( Mislevy et al., 2003 ). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to the developers of PT’s. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative such as behavioral samples from real-world situations of interest (criterion-sampling; McClelland, 1973 ), as well as the intended use(s) (for an example, see Shavelson et al., 2019 ). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument ( Messick, 1994 ).

An assessment framework can be specified at different levels of granularity: an assessment battery (“omnibus” assessment, for an example see below), a single performance task, or a specific component of an assessment ( Shavelson, 2010 ; Davey et al., 2015 ). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model , which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability) and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems, and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rests on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010 , 2013 ). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019 ).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: a storyline, a challenge, a document library, and a scoring rubric. Table 1 displays these aspects, brief descriptions of each, and the corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019 ). Storylines are drawn from various domains; for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto, which are to be adapted for purposes of the assessment, and which are to be discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends, in large part, on the nature and extent of the information provided in the document library, the amount of scaffolding included, as well as the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to play to judgmental errors due to fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit.
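
To make the structure of the framework concrete, the sketch below represents the four aspects as a simple data structure. It is only an illustration: the class and field names (e.g., SourceDocument, trustworthy, biased) are our own shorthand rather than part of the iPAL specification, and the instantiated content is invented rather than drawn from an operational task.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SourceDocument:
    """One complementary document in a task's document library."""
    title: str
    trustworthy: bool   # is the source reliable?
    relevant: bool      # does it bear on the challenge?
    biased: bool        # designed to invite fast or motivated reasoning?

@dataclass
class PerformanceTask:
    """An iPAL-style performance task: storyline, challenge,
    document library, and the rubric dimensions used in scoring."""
    storyline: str      # real-world scenario, often with ethical stakes
    challenge: str      # what the respondent must produce
    library: List[SourceDocument] = field(default_factory=list)
    rubric_dimensions: List[str] = field(default_factory=list)

# Invented example content, not an operational iPAL task:
task = PerformanceTask(
    storyline="A town council must decide whether to resettle refugee families.",
    challenge="Write a recommendation to the council, citing the documents provided.",
    library=[
        SourceDocument("Ministry statistics", trustworthy=True, relevant=True, biased=False),
        SourceDocument("Anonymous blog post", trustworthy=False, relevant=True, biased=True),
    ],
    rubric_dimensions=[
        "evaluating information", "decision making",
        "weighing consequences", "written communication",
    ],
)
print(len(task.library), "documents in the library")
```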

The dimensions of the scoring rubric are derived from the Task Model and Student Model ( Mislevy et al., 2003 ) and signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model . More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric; namely, the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria ( Braun, 2019 ) as indicated by interrater agreement coefficients.
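
As an illustration of how rater agreement might be monitored, the following sketch computes a weighted Cohen's kappa for two raters scoring the same set of responses on a six-point scale. The ratings are invented, and kappa is only one of several agreement coefficients a program of this kind might report.

```python
# Illustrative ratings only, not data from the studies cited here.
from sklearn.metrics import cohen_kappa_score

rater_a = [5, 4, 2, 6, 3, 4, 1, 5, 4, 2]  # six-point scale, one score per response
rater_b = [5, 3, 2, 6, 4, 4, 2, 5, 5, 2]

# Quadratic weighting penalizes large disagreements more than near-misses,
# which suits ordinal rubric scales.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```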

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks ( Lane and Stone, 2006 ). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019 , p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” ( Shavelson et al., 2019 , p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019 , p. 478), three student score profiles were identified (low-, middle-, and high-performers). Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores, such as poor CT skills, a lack of a disposition to engage with the challenge, or the two attributes jointly. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short-constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence ( Davey et al., 2015 ; Zlatkin-Troitschanskaia et al., 2019 ). These kinds of concerns are less critical when PT’s are used in classroom settings. The instructor can draw on other sources of evidence, including direct discussion with the student.
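
As a rough plausibility check, the reported coefficients are consistent with the classical Spearman-Brown projection, a simplification of the generalizability analysis actually used in that study. The sketch below recovers the implied single-rater coefficient from the two-rater value of 0.74 and projects it to four raters.

```python
def spearman_brown(rho_single: float, n_raters: float) -> float:
    """Projected reliability of an average across n parallel raters."""
    return n_raters * rho_single / (1 + (n_raters - 1) * rho_single)

rho_two = 0.74                        # reported coefficient with 2 raters
rho_single = rho_two / (2 - rho_two)  # inverse Spearman-Brown for n = 2
print(round(rho_single, 2))           # ~0.59 implied single-rater coefficient
print(round(spearman_brown(rho_single, 4), 2))  # ~0.85, near the reported 0.84
```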

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a certain discipline (e.g., economics, law, or medicine), for example, then the framework must specify characteristics of the narrative and the complementary documents as to the breadth and depth of disciplinary knowledge that is represented.

To date, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally an inadequate amount of time for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019 ). Accordingly, it would be helpful to develop modified frameworks for PT’s that require substantially less time. For an example, see a short performance assessment of civic online reasoning, requiring response times from 10 to 50 min ( Wineburg et al., 2016 ). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT, and by specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PT’s, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PT’s could represent all facets of the construct while affording instructors and students more specific insights on strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short answer and/or multiple choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills. The parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT taking 90 min or more could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014 ), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing some flexibility for faculty to tailor the assessment to meet their unique needs. In so doing, it addresses a key weakness of the AAC&U’s VALUE initiative 2 (retrieved 5/7/2020) that has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains including CT, problem-solving and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language that is intended to clearly differentiate among levels. 3 Faculty are asked to submit student work products from a senior level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge, nor any control on task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses. This also causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures ( Braun, 2019 ).

Concluding Reflections

Challenges to Interpretation and Implementation

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010 ; Shavelson et al., 2019 ). The attraction mainly rests on the assumption that elaborated PT’s are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a “promissory note” that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric quality such as reliability ( Davey et al., 2015 ).

One reason for Messick’s (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence ( American Educational Research Association et al., 2014 ). Following the ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes an important type of evidence. Further, as Leighton (2019) argues, response process data (“cognitive validity”) are needed to validate claims regarding the cognitive complexity of PT’s. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye-tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity, but also valuable information to guide refinements of the PT.

Going forward, iPAL PT’s must be subjected to validation studies as recommended in the Standards for Psychological and Educational Testing by American Educational Research Association et al. (2014) . With a particular focus on the criterion “relationships to other variables,” a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators’ relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the necessity of evaluating construct validity, there is the need to consider potential sources of construct-irrelevant variance (CIV). One pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, then their performance is likely to be impacted by factors unrelated to their (construct-relevant) ability ( Lane and Stone, 2006 ; Braun et al., 2011 ; Shavelson, 2013 ). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT is administered in the context of a course with the promise of generating useful feedback on students’ skill profiles.

Construct-irrelevant variance can also occur when students are not equally prepared for the format of the PT or do not fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT’s. Finally, the use of novel forms of documentation, such as those from the Internet, can potentially introduce CIV due to differential familiarity with forms of representation or contents. Interestingly, this suggests that there may be a conflict between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and usage of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of the students’ responses or the frequency of grammatical errors ( Lane and Stone, 2006 ). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to reliably measure student performance ( Messick, 1994 ; Davey et al., 2015 ). Indeed, it is well known that more than one performance task is needed to obtain high reliability ( Shavelson, 2013 ). This is due to both student-task interactions and variability in scoring. Sources of student-task interactions are differential familiarity with the topic ( Hyytinen and Toom, 2019 ) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For formative assessment as part of an instructional program, reliability can be lower than for summative purposes. In the former case, other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.
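
The dependence of score reliability on the numbers of tasks and raters can be made concrete with a generalizability-theory "decision study." The variance components in the sketch below are purely illustrative, not estimates from iPAL data; with a large person-by-task interaction, as the literature suggests, adding tasks improves the coefficient far more than adding raters.

```python
# Variance components below are illustrative, not iPAL estimates.
def g_coefficient(var_p, var_pt, var_pr, var_ptr, n_tasks, n_raters):
    """Relative G coefficient for a persons x tasks x raters design,
    with scores averaged over tasks and raters."""
    error = (var_pt / n_tasks
             + var_pr / n_raters
             + var_ptr / (n_tasks * n_raters))
    return var_p / (var_p + error)

components = dict(var_p=0.40, var_pt=0.35, var_pr=0.05, var_ptr=0.20)

for n_tasks in (1, 2, 3):
    for n_raters in (1, 2):
        g = g_coefficient(**components, n_tasks=n_tasks, n_raters=n_raters)
        print(f"{n_tasks} task(s), {n_raters} rater(s): G = {g:.2f}")
```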

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required for many psychometric models might be untenable for performance tasks ( Davey et al., 2015 ). Davey et al. (2015) provide the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students’ reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT, but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multi-dimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic ( Lane and Stone, 2006 ). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories ( Zlatkin-Troitschanskaia et al., 2019 ).
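
For instance, once per-dimension scores are available for a large sample, profiles such as the low-, middle-, and high-performer groups noted above can be recovered with standard mixture modeling. The sketch below uses a Gaussian mixture over simulated four-dimension score profiles as a rough stand-in for the latent class analysis reported in Zlatkin-Troitschanskaia et al. (2019); the data are simulated, not from that study.

```python
# Simulated rubric-score profiles, not iPAL data; a Gaussian mixture over
# continuous scores stands in for the latent class analysis cited above.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# 300 students x 4 rubric dimensions, drawn from three performance levels.
level_means = np.array([[2.0, 2.0, 2.0, 2.0],
                        [3.5, 3.5, 3.0, 3.0],
                        [5.0, 5.0, 4.5, 5.0]])
X = np.vstack([rng.normal(m, 0.5, size=(100, 4)) for m in level_means])

gm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gm.predict(X)
for k in range(3):
    print(f"profile {k}: mean scores {X[labels == k].mean(axis=0).round(2)}")
```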

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PT’s and their adaptation to different languages and cultures must continue. To this point, there are a number of examples: The refugee crisis PT (cited in Table 1 ) was translated and adapted from Finnish to US English and then to Colombian Spanish. A PT concerning kidney transplants was translated and adapted from German to US English. Finally, two PT’s based on ‘legacy admissions’ to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation ( Zlatkin-Troitschanskaia et al., 2019 ). In addition, more intensive study of response processes through cognitive laboratories and the like are needed to strengthen the evidential argument for construct validity ( Leighton, 2019 ). We are currently conducting empirical studies, collecting data on both iPAL PT’s and other measures of CT. These studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support different ways CT PT’s might be used—i.e., use cases—especially those that call for formative use of PT’s. Incorporating formative assessment into courses can plausibly be expected to improve students’ competency acquisition ( Zlatkin-Troitschanskaia et al., 2017 ). With suitable choices of storylines, appropriate combinations of (modified) PT’s, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PT’s (as is the case with the CLA+), loosely coupled with the PT’s (as in drawing on the same storyline), or tightly linked to the PT’s (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students’ generic CT skills. Core curriculum courses or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at a population level. For another example, these PA’s could be used to assess the competence profiles of students entering Bachelor’s or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding with respect to the extensive preliminary work required by a task and the time required to properly complete it. Thus, it would be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents. The challenge to the student should be suitably reduced.

Some members of the iPAL collaborative have developed PT’s that are embedded in disciplines such as engineering, law and education ( Crump et al., 2019 ; for teacher education examples, see Jeschke et al., 2019 ). These are proving to be of great interest to various stakeholders and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. It is both a conceptual and an empirical question as to whether a single framework can guide development in different domains.

Performance Assessment in Online Learning Environment

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models of the new literacies that attempt to capture some key characteristics of these activities. A prominent example is a model proposed by Leu et al. (2020) . The model frames online reading as a process of problem-based inquiry involving five practices that occur during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest there may be benefits to closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, supplementary materials can now include archival photographs, audio recordings, or videos. Additional tasks might include the online search for relevant documents, though this would add considerably to the time demands. This online search could occur within a simulated Internet environment, as is the case for the IEA’s ePIRLS assessment ( Mullis et al., 2017 ).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting. Yet it can also add ambiguity and information overload. Increased authenticity, then, should be weighed against validity concerns and the time required to absorb the content in these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students’ responses could be automatically scored, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Therefore, taking advantage of the online environment to incorporate new types of supplementary documents, and perhaps to introduce new response formats as well, should be a high priority. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.
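
To indicate how such formats could be scored automatically, the sketch below scores a hypothetical drag-and-drop item in which students order sources from most to least trustworthy, awarding partial credit for each correctly placed source. The item content, key, and scoring rule are our own illustration, not an implemented iPAL format.

```python
def score_ordering(response: list[str], key: list[str]) -> float:
    """Partial credit: the proportion of slots matching the scoring key."""
    hits = sum(r == k for r, k in zip(response, key))
    return hits / len(key)

# Keyed order: most to least trustworthy source (invented item content).
key = ["ministry_report", "newspaper_article", "anonymous_blog"]
student = ["ministry_report", "anonymous_blog", "newspaper_article"]
print(round(score_ordering(student, key), 2))  # 0.33: only the first slot matches
```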

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. #201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ https://www.ipal-rd.com/
  • ^ https://www.aacu.org/value
  • ^ When test results are reported by means of substantively defined categories, the scoring is termed “criterion-referenced”. This is in contrast to results reported as percentiles; such scoring is termed “norm-referenced”.

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.


Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is value? Available online at: https://www.aacu.org/value (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: https://www.aacu.org/research/2018-future-of-work (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274


Braun, H. I., Kirsch, I., and Yamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019


Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: http://fsse.indiana.edu/pdf/FSSE_IR_2019/summary_tables/FSSE19_Frequencies_(FSSE_2019).pdf (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MAL: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MA: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for “intelligence.”. Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4-9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at: http://timssandpirls.bc.edu/pirls2016/international-results/ (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO Feasibility Study Report, Vol. 1: Design and Implementation. Paris: OECD.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO Feasibility Study Report, Vol. 2: Data Analysis and National Experiences. Paris: OECD.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for life and work: Developing Transferable Knowledge and Skills in the 21st Century. Washington DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Educ. Available online at: https://www.insidehighered.com/views/2009/10/16/limitations-portfolios

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity–CLA+. Council for Aid to Education. Available online at: https://pdfs.semanticscholar.org/91ae/8edfac44bce3bed37d8c9091da01d6db3776.pdf.

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords : critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.


Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Henry I. Braun, [email protected]

This article is part of the Research Topic

Assessing Information Processing and Online Reasoning as a Prerequisite for Learning in Higher Education

Tara Well Ph.D.

How to Improve Your Critical Thinking Skills

Traditional tools and new technologies.

Posted September 29, 2023 | Reviewed by Lybi Ma


Technology provides access to vast information and makes daily life easier. Yet, too much reliance on technology potentially interferes with the acquisition and maintenance of critical thinking skills in several ways:

1. Information Overload: The constant influx of data can discourage deep critical thinking as we may come to rely on quick, surface-level information rather than delving deeply into a subject.

2. Shortened Attention Span: Frequent digital distractions can disrupt the sustained focus and concentration required for critical thinking.

3. Confirmation Bias and Echo Chambers: Technology, including social media and personalized content algorithms, can reinforce confirmation bias. People are often exposed to information that aligns with their beliefs and opinions, making them less likely to encounter diverse perspectives and engage in critical thinking about opposing views.

4. Reduced Problem-Solving Opportunities: Technology often provides quick solutions to problems. While this benefits efficiency, it may discourage individuals from engaging in complex problem-solving, a fundamental aspect of critical thinking.

5. Loss of Research Skills: The ease of accessing information online can diminish traditional research skills, such as library research or in-depth reading. These skills are essential for critical thinking, as they involve evaluating sources, synthesizing information, and analyzing complex texts.

While technology can pose challenges to developing critical thinking skills, it's important to note that technology can also be a valuable tool for learning and skill development. It can provide access to educational resources, facilitate collaboration, and support critical thinking when used thoughtfully and intentionally. Balancing technology use with activities that encourage deep thinking and analysis is vital to lessening its potential adverse effects on critical thinking.

Writing is a traditional and powerful tool to exercise and improve your critical thinking skills. Consider these ways writing can help enhance critical thinking:

1. Clarity of Thought: Writing requires that you articulate your thoughts clearly and coherently. When you need to put your ideas on paper, you must organize them logically, which requires a deeper understanding of the subject matter.

2. Analysis and Evaluation: Critical thinking involves analyzing and evaluating information. When you write, you often need to assess the validity and relevance of different sources, arguments, or pieces of evidence, which hone your critical thinking skills.

3. Problem-Solving: Writing can be a problem-solving exercise in itself. Whether crafting an argument, developing a thesis, or finding the right words to express your ideas, writing requires thinking critically about approaching these challenges effectively.

4. Research Skills: Good writing often involves research, and research requires critical thinking. You need to assess the credibility of sources, synthesize information, and draw conclusions based on the evidence you gather.

5. Argumentation: Constructing a persuasive argument in writing is a complex process requiring critical thinking. You must anticipate counterarguments, provide evidence to support your claims, and address potential weaknesses in your reasoning.

6. Revision and Editing: To be an effective writer, you must learn to read your work critically. Editing and revising require evaluating your writing objectively, identifying areas that need improvement, and refining your ideas and arguments.

7. Problem Identification: In some cases, writing can help you identify problems or gaps in your thinking. As you write, you might realize that your arguments are not as strong as you initially thought or that you need more information to support your claims. This recognition of limitations is a crucial aspect of critical thinking.

Writing is a dynamic process that engages multiple facets of critical thinking. It has been a valuable tool in education, business, and personal development for centuries.

Yet this traditional approach of self-generated written thought is rapidly being supplanted by AI-generated writing tools like ChatGPT (Generative Pre-trained Transformer). With over 100 million users of ChatGPT alone, we cannot ignore its potential impact. How might the increasing reliance on AI-generated writing tools influence our critical thinking skills? The impact can vary depending on how the tools are used and the context in which they are employed.


Critical thinking involves evaluating information sources for credibility, relevance, and bias. If individuals consistently trust the information provided by chatbots without critically assessing its quality, it can hinder their development of critical thinking skills. This is especially true if they depend on the chatbot to provide answers without questioning or verifying the information. Relying solely on chatbots for answers may also reduce people's effort in problem-solving. Critical thinking often requires wrestling with complex problems, considering multiple perspectives, and generating creative solutions. If we default to chatbots for quick answers, we may miss opportunities to develop these skills.

However, it's essential to note that the impact of chatbots on critical thinking skills may not be entirely negative. These tools can also have positive effects:

1. Chatbots provide quick access to vast information, which can benefit research and problem-solving. When used as a supplement to critical thinking, they can enhance the efficiency of information retrieval.

2. Chatbots can sometimes assist in complex tasks by providing relevant data or suggestions. When individuals critically evaluate and integrate this information into their decision-making process, it can enhance their critical thinking.

3. Chatbots can be used as learning aids. They can provide explanations, examples, and guidance, which can support skill development and, when used effectively, encourage critical thinking.

In summary, the impact of chatbots on critical thinking skills depends on how we use them. The effect will be harmful if they become a crutch to avoid independent thought or analysis. However, they can be valuable resources when used as tools to facilitate and augment critical thinking and writing processes. Individuals must balance leveraging the convenience of chatbots and actively engaging in independent critical thinking and problem-solving to maintain and enhance their cognitive abilities. You can do that effectively through writing regularly.

Copyright 2023 Tara Well, PhD

Tara Well Ph.D.

Tara Well, Ph.D., is a professor in the department of psychology at Barnard College of Columbia University.


International Study Reveals Measuring and Developing Critical-Thinking Skills as an Essential Best Practice in Higher Education

Opportunities exist for higher education institutions worldwide to increase critical-thinking skills among graduates through explicit instruction, practice, and measurement of the skills employers are most seeking in today’s innovation economy.

NEW YORK, October 18, 2023 | Source: GlobeNewswire

The Council for Aid to Education, Inc. (CAE), a leader in designing innovative performance tasks for measurement and instruction of higher-order skills, recently co-authored an article on a six-year international study in the European Journal of Education . Key findings shared in “Assessing and Developing Critical-Thinking Skills in Higher Education” include that it is feasible to reliably and validly measure higher-order skills in a cross-cultural context and that assessment of these skills is necessary for colleges and universities to ensure that their programs are graduating students with the skills needed for career success after graduation.

Between 2015 and 2020, 120,000 students from higher education institutions in six different countries — Chile, Finland, Italy, Mexico, the UK, and the US — were administered CAE’s Collegiate Learning Assessment (CLA+) , a performance-based assessment that measures proficiency in critical thinking, problem solving, and written communication. Analysis of the data shows that students entering a higher education program on average performed at the Developing mastery level on the test, while exiting students on average performed at the Proficient mastery level. The amount of growth is relatively small (d = 0.10), but significant. However, half of exiting students perform at the two lowest levels of proficiency, indicating that a higher education degree does not necessarily mean students have gained the higher-order skills needed for innovation-oriented workplaces.
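
For readers unfamiliar with the metric, an effect size like the reported d = 0.10 is simply the mean difference expressed in pooled standard deviation units. The sketch below illustrates the computation with invented summary statistics, not the study's actual data.

```python
import math

def cohens_d(mean_exit, mean_entry, sd_exit, sd_entry, n_exit, n_entry):
    """Standardized mean difference with a pooled standard deviation."""
    pooled_var = (((n_exit - 1) * sd_exit ** 2 + (n_entry - 1) * sd_entry ** 2)
                  / (n_exit + n_entry - 2))
    return (mean_exit - mean_entry) / math.sqrt(pooled_var)

# Invented summary statistics: a 15-point gain on a scale with SD 150
# corresponds to d = 0.10, the magnitude reported in the study.
print(round(cohens_d(1115, 1100, 150, 150, 5000, 5000), 2))  # 0.1
```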

“In response to employer concerns about graduate employability, assessing and developing students’ higher-order skills is an essential component of best practices in higher education,” said Doris Zahner, Ph.D., CAE’s chief academic officer. “The ability to measure these skills in a cross-cultural context addresses a current gap between the skills that higher education graduates possess and the skills that are required by hiring managers for success in the workplace.”

This study reinforces the findings of OECD’s 2013 Assessment of Higher Education Learning Outcomes (AHELO) Feasibility Study and builds upon a recently published 2022 OECD report, Does Higher Education Teach Students to Think Critically? Since this original study, CAE has further improved CLA+ through lessons learned from its implementation, analytical research on the data gathered, and international collaboration.

The research discussed in “Assessing and Developing Critical-Thinking Skills in Higher Education” reinforces the need for policymakers, researchers, and higher education leaders to have valid and reliable internationally comparative assessments of the skills that are needed for today’s knowledge economy. “The results outlined in this report show the power of assessing critical-thinking skills and how such assessments can feed into the higher education policy agenda at the national and international level,” said article co-author Dirk Van Damme, former head of the Centre for Educational Research and Innovation at OECD and current senior research fellow at the Centre for Curriculum Redesign.

CAE, in collaboration with the Finland Ministry of Education and Culture, will continue to study the impact of higher education on the development of critical-thinking skills. Starting in 2023 and continuing through 2025, a cohort of students from 18 Finnish higher education institutions will use CLA+ to measure their growth with critical thinking, adding a longitudinal component to this ongoing research.

To learn more about this study, CAE’s other research, and CAE’s performance-based assessments and critical thinking instruction, visit  cae.org .

About CAE

As a nonprofit whose mission is to help improve the academic and career outcomes of secondary and higher education students, CAE is the leader in designing innovative performance tasks for measurement and instruction of higher-order skills and within subject areas.

Over the past 20 years, CAE has helped over 825,000 students globally understand and improve their proficiency in critical thinking, problem solving, and effective written communication. Additionally, CAE’s subject-area assessments have helped millions of K-12 students across the US. Supported by best practices in assessment development, administration, and psychometrics, CAE’s performance-based assessments include the Collegiate Learning Assessment (CLA+) and College and Career Readiness Assessment (CCRA+). To learn more, please visit cae.org and connect with us on LinkedIn and YouTube .



Two Rivers Learning Institute

Assessing Critical Thinking and Problem-Solving

Critical Thinking

How do you assess critical thinking and problem solving skills?

In considering how we assess critical thinking and problem solving skills, we wanted to answer the question of how we know whether students are learning the cognitive processes we are teaching and are able to transfer them to novel situations. In answer to this challenge, we have designed short performance tasks that target each of our constructs of critical thinking and problem solving.

What are performance tasks?

Performance tasks are specific activities that require students to demonstrate mastery of knowledge or skills through application within the task. The performance tasks that we utilize to assess critical thinking and problem solving are each aligned with a specific thinking type. In each task, students are required to make their thinking visible either through demonstration of their work, through oral description of their thinking, or through writing.

How do you design performance tasks aligned with constructs of critical thinking and problem solving?

In designing performance tasks, we always begin with the cognitive skill that we want to assess. Every decision about how to design performance tasks then grows from that clear understanding of the target.

Because the focus is on a specific cognitive skill, we want to remove barriers posed by content knowledge and by basic math and reading skills. Thus we choose tasks that are situated in contexts with which most students are already familiar. In addition, we ensure that the literacy and math demands of the task are low enough that most students are not hindered by the reading or computational components.

However, we strive to design tasks that are problematic for students. In other words, students shouldn’t have a quick solution to the tasks. We make tasks problematic in a couple of ways. First, we make tasks problematic by giving open-ended assignments where there are multiple possible solutions. Second, we make tasks problematic through the complexity of the problem that students need to think through.

How do you evaluate students’ critical thinking and problem solving skills through a performance task?

When students complete performance tasks, they generate evidence of their thinking that we can use to evaluate their critical thinking and problem solving skills. Using our rubrics, we evaluate student responses on each dimension of the rubric across the task. We don’t generate a single score for each construct. Instead, students are scored on each component of the rubric, which allows us to give refined feedback to students.
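To make this per-component scoring concrete, here is a minimal Python sketch. The dimension names, the numeric scale, and the RubricScore structure are illustrative assumptions, not Two Rivers’ actual rubric.

```python
from dataclasses import dataclass, field

# Hypothetical rubric dimensions for one thinking type; the real
# rubrics' dimension names and scale are not specified in the text.
RUBRIC_DIMENSIONS = ["claim", "evidence", "reasoning"]

@dataclass
class RubricScore:
    student: str
    # One score per rubric dimension (e.g., 1-4) -- deliberately no
    # single composite score for the construct.
    scores: dict = field(default_factory=dict)

    def feedback(self) -> str:
        """Report each dimension separately, mirroring the refined
        per-component feedback described above."""
        return "; ".join(f"{dim}: {self.scores.get(dim, 'not scored')}"
                         for dim in RUBRIC_DIMENSIONS)

result = RubricScore("Student A", {"claim": 3, "evidence": 2, "reasoning": 3})
print(result.feedback())  # claim: 3; evidence: 2; reasoning: 3
```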

Status.net

Critical Thinking: Performance Review Examples (Rating 1 – 5)

By Status.net Editorial Team on July 15, 2023 — 8 minutes to read

Critical thinking skills are an essential aspect of an employee’s evaluation: the ability to solve problems, analyze situations, and make informed decisions is crucial for the success of any organization.

Questions that can help you determine an employee’s rating for critical thinking:

  • Does the employee consistently analyze data and information to identify patterns and trends?
  • Does the employee proactively identify potential problems and develop solutions to mitigate them?
  • Has the employee demonstrated the ability to think creatively and come up with innovative ideas or approaches?
  • Does the employee actively seek out feedback and input from others to inform their decision-making process?
  • Has the employee demonstrated the ability to make sound decisions based on available information and data?

Performance Review Phrases and Paragraph Examples for Critical Thinking

5 – Outstanding

Employees with outstanding critical thinking skills are exceptional at identifying patterns, making connections, and using past experiences to inform their decisions.

Phrases Examples

  • Consistently demonstrates exceptional critical thinking abilities
  • Always finds creative and innovative solutions to complex problems
  • Skillfully analyzes information and data to make well-informed decisions
  • Frequently provides valuable insights and perspectives that benefit the team
  • Continuously seeks out new learning opportunities to sharpen their critical thinking skills
  • Demonstrates exceptional ability to identify and analyze complex issues
  • Consistently develops innovative solutions to problems
  • Skillfully connects disparate ideas to create coherent arguments
  • Effectively communicates well-reasoned conclusions
  • Exceptional ability to recognize trends in data
  • Expertly applies existing knowledge to new situations
  • Consistently anticipates potential challenges and develops solutions

Paragraph Example 1

“Jane consistently demonstrates outstanding critical thinking skills in her role. She not only engages in deep analysis of complex information, but she also presents unique solutions to problems that have a significant positive impact on the team’s performance. Her ability to make well-informed decisions and offer valuable insights has led to numerous successes for the organization. Moreover, Jane’s dedication to improvement and learning demonstrates her commitment to personal and professional growth in the area of critical thinking.”

Paragraph Example 2

“Jessica consistently displays outstanding critical thinking skills. She is able to identify and analyze complex issues with ease and has demonstrated her ability to develop innovative solutions. Her skill in connecting disparate ideas to create coherent arguments is impressive, and she excels at communicating her well-reasoned conclusions to the team.”

Paragraph Example 3

“Melanie consistently demonstrates an exceptional ability to recognize patterns and trends in data, which has significantly contributed to the success of our projects. Her critical thinking skills allow her to apply her extensive knowledge and experience in creative and innovative ways, proactively addressing potential challenges and developing effective solutions.”

4 – Exceeds Expectations

Employees exceeding expectations in critical thinking skills are adept at analyzing information, making sound decisions, and providing thoughtful recommendations. They are also effective at adapting their knowledge to novel situations and displaying confidence in their abilities.

  • Excellent analytical capabilities
  • Provides well-reasoned recommendations
  • Demonstrates a solid understanding of complex concepts
  • Regularly demonstrates the ability to think analytically and critically
  • Effectively identifies and addresses complex problems with well-thought-out solutions
  • Shows exceptional skill in generating innovative ideas and solutions
  • Exhibits a consistently high level of decision-making based on sound reasoning
  • Proactively seeks out new information to improve critical thinking skills
  • Routinely identifies potential challenges and provides solutions
  • Typically recognizes and prioritizes the most relevant information
  • Logical thinking is evident in daily decision-making
  • Often weighs the pros and cons of multiple options before selecting a course of action

“Eric’s critical thinking skills have consistently exceeded expectations throughout his tenure at the company. He is skilled at reviewing and analyzing complex information, leading him to provide well-reasoned recommendations and insights. Eric regularly demonstrates a deep understanding of complicated concepts, which allows him to excel in his role.”

“In this evaluation period, Jane has consistently demonstrated an exceptional ability to think critically and analytically. She has repeatedly shown skill in identifying complex issues while working on projects and has provided well-thought-out and effective solutions. Her innovative ideas have contributed significantly to the success of several key initiatives. Moreover, Jane’s decision-making skills are built on sound reasoning, which has led to positive outcomes for the team and organization. Additionally, she actively seeks opportunities to acquire new information and apply it to her work, further strengthening her critical thinking capabilities.”

“John consistently exceeds expectations in his critical thinking abilities. He routinely identifies potential challenges and provides thoughtful solutions. He is skilled at recognizing and prioritizing the most relevant information to make well-informed decisions. John regularly weighs the pros and cons of various options and selects the best course of action based on logic.”

3 – Meets Expectations

Employees meeting expectations in critical thinking skills demonstrate an ability to analyze information and draw logical conclusions. They are effective at problem-solving and can make informed decisions with minimal supervision.

  • Capable of processing information and making informed decisions
  • Displays problem-solving skills
  • Demonstrates logical thinking and reasoning
  • Consistently demonstrates the ability to analyze problems and find possible solutions.
  • Actively engages in group discussions and contributes valuable ideas.
  • Demonstrates the ability to draw conclusions based on logical analysis of information.
  • Shows willingness to consider alternative perspectives when making decisions.
  • Weighs the pros and cons of a situation before reaching a decision.
  • Usually identifies relevant factors when faced with complex situations
  • Demonstrates an understanding of cause and effect relationships
  • Generally uses sound reasoning to make decisions
  • Listens to and considers different perspectives

“Sarah consistently meets expectations in her critical thinking skills, successfully processing information and making informed decisions. She has shown her ability to solve problems effectively and displays logical reasoning when approaching new challenges. Sarah continues to be a valuable team member thanks to these critical thinking skills.”

“Jane is a team member who consistently meets expectations in regards to her critical thinking skills. She demonstrates an aptitude for analyzing problems within the workplace and actively seeks out potential solutions by collaborating with her colleagues. Jane is open-minded and makes an effort to consider alternative perspectives during decision-making processes. She carefully weighs the pros and cons of the situations she encounters, which helps her make informed choices that align with the company’s objectives.”

“David meets expectations in his critical thinking skills. He can usually identify the relevant factors when dealing with complex situations and demonstrates an understanding of cause and effect relationships. David’s decision-making is generally based on sound reasoning, and he listens to and considers different perspectives before reaching a conclusion.”

2 – Needs Improvement

Employees in need of improvement in critical thinking skills may struggle with processing information and making logical conclusions. They may require additional guidance when making decisions or solving problems.

  • Struggles with analyzing complex information
  • Requires guidance when working through challenges
  • Difficulty applying past experiences to new situations
  • With some guidance, Jane is able to think critically, but she struggles to do so independently.
  • John tends to jump to conclusions without analyzing a situation fully.
  • Sarah’s problem-solving skills need improvement, as she often overlooks important information when making decisions.
  • David’s critical thinking skills are limited and need further development to enhance his overall work performance.
  • Occasionally struggles to identify and analyze problems effectively
  • Inconsistently uses logic to make decisions
  • Often overlooks important information or perspectives
  • Requires guidance in weighing options and making judgments

“Bob’s critical thinking skills could benefit from further development and improvement. He often struggles when analyzing complex information and tends to need additional guidance when working through challenges. Enhancing Bob’s ability to apply his past experiences to new situations would lead to a notable improvement in his overall performance.”

“Jenny is a valuable team member, but her critical thinking skills need improvement before she will be able to reach her full potential. In many instances, Jenny makes decisions based on her first impressions without questioning the validity of her assumptions or considering alternative perspectives. Her tendency to overlook key details has led to several instances in which her solutions are ineffective or only partly beneficial. With focused guidance and support, Jenny has the potential to develop her critical thinking skills and make more informed decisions in the future.”

“Tom’s critical thinking skills require improvement. He occasionally struggles to identify and analyze problems effectively, and his decision-making is inconsistent in its use of logic. Tom often overlooks important information or perspectives and may require guidance in weighing options and making judgments.”

1 – Unacceptable

Employees with unacceptable critical thinking skills lack the ability to analyze information effectively, struggle with decision-making, and fail to solve problems without extensive support from others.

  • Fails to draw logical conclusions from information
  • Incapable of making informed decisions
  • Unable to solve problems without extensive assistance
  • Fails to analyze potential problems before making decisions
  • Struggles to think critically and ask relevant questions
  • Cannot effectively identify alternative solutions
  • Lacks the ability to apply logic and reason in problem-solving situations
  • Does not consistently seek input from others or gather information before making a decision
  • Regularly fails to recognize or address important issues
  • Makes hasty decisions without considering potential consequences
  • Lacks objectivity and often relies on personal biases
  • Resistant to alternative viewpoints and constructive feedback

“Unfortunately, Sue’s critical thinking skills have been consistently unacceptable. She fails to draw logical conclusions from available information and is incapable of making informed decisions. Sue has also shown that she is unable to solve problems without extensive assistance from others, which significantly impacts her performance and the team’s productivity.”

“Jane’s performance in critical thinking has been unacceptable. She often fails to analyze potential problems before making decisions and struggles to think critically and ask relevant questions. Jane’s inability to effectively identify alternative solutions and apply logic and reason in problem-solving situations has negatively impacted her work. Furthermore, she does not consistently seek input from others or gather information before making a decision. It is crucial for Jane to improve her critical thinking skills to become a more effective and valuable team member.”

“Susan’s critical thinking skills are unacceptable. She regularly fails to recognize and address important issues, and her decision-making is often hasty and without considering potential consequences. Susan frequently lacks objectivity and tends to rely on personal biases. She is resistant to alternative viewpoints and constructive feedback, which negatively affects her work performance.”



  • Data Descriptor
  • Open access
  • Published: 25 April 2024

Students’ performance, attitude, and classroom observation data to assess the effect of problem-based learning approach supplemented by YouTube videos in Ugandan classroom

  • Nicholus Gumisirizah 1,
  • Joseph Nzabahimana 1 &
  • Charles M. Muwonge 2

Scientific Data volume 11, Article number: 428 (2024)


Subjects: Applied physics

In response to global demands, Uganda’s Vision 2040 seeks to transform the country into a modern and prosperous nation by implementing Sustainable Development Goal (SDG) 4, focusing on equitable and quality education. The 21st-century workforce requires individuals who can effectively navigate complex workplace challenges. This dataset was gathered from Form-2 Ugandan secondary school students (aged 12 to 15) across 12 schools in the Sheema District. The dataset comprises three types of data: students’ performance in a physics topic (simple machines), their attitudes toward problem-solving and critical thinking when learning physics using Problem-Based Learning (PBL) supplemented by YouTube videos, and classroom observations documented with the Reformed Teaching Observation Protocol (RTOP). The PBL teaching intervention was carried out in 2022, with data collected from 973 lower secondary school students. The intervention involved three approaches: one group (144 students) received PBL along with YouTube videos, another group of 482 students received PBL alone, and a third group (347 students) was taught using the traditional method. This data article explains the study’s data creation, collection, and analysis process. The dataset holds significance for secondary school teachers, policymakers, and researchers, offering insights into the impact of PBL with and without ICT resources on learning physics and students’ attitudes toward these learner-centered approaches.


Background & Summary

Physics education in secondary schools plays a vital role in developing students’ social, physical, leadership, and problem-solving skills. Understanding physics concepts equips learners to know how things work, enabling them to apply this understanding to real-life situations1. Physics teaching is structured around activity-based2 chapters and topics, emphasizing hands-on experiences3 and practical applicability in everyday life. However, many students find physics challenging, necessitating an active teaching approach. Physics teaching remains dynamic and interactive, with teachers adopting various strategies to engage students actively. Reciprocal teaching involves dialogues between the teacher and small student groups, while peer collaboration fosters cooperative work on class activities. Problem-Based Learning (PBL)4,5,6 is a student-centered approach that encourages group-based learning and teacher facilitation. It has been widely adopted in various educational fields, promoting problem-solving in learning environments. Implementing PBL follows a five-stage process:

Finding a problem

The teacher prepares a task for students to investigate, stimulating problem-solving abilities.

Organizing ideas on the problem

Learners investigate the problem, generate ideas, and receive probing questions from the facilitator to stimulate critical thinking.

Organizing learners into groups

The teacher facilitates the distribution of learners into groups, each focusing on solving a particular problem related to the main task. Responsibilities are assigned within each group, promoting cooperation.

Present findings

Learners present solutions to the problem and receive feedback from peers, consolidating their learning outcomes.

Generalizing

Problem-solving leads to the development of skills essential for solving complex, real-world situations. These skills, including problem-solving, creativity, communication, cooperation, and innovation, prepare students to adapt to change and overcome 21st-century challenges.

Integrating YouTube videos as Information and Communication Technology (ICT) tools within a PBL approach offers a multifaceted strategy to enhance physics education7,8. High-quality videos aligned with curriculum objectives introduce real-world problems and cater to diverse learning styles. Interactive features and accessibility allow continuous learning, and educators can curate playlists to align with curriculum goals. The flipped classroom model9 combines videos with problem-solving discussions10, creating a dynamic learning environment that deepens students’ understanding of physics concepts and their practical applications.

Physics is a subject that holds a significant position in promoting scientific literacy, critical thinking, and essential life skills. However, conventional teaching methods often struggle to engage and empower students in the subject matter effectively. This inadequacy is a pressing concern, as it can hinder students from developing a strong foundation in physics, which is essential for their academic and practical pursuits. This study was critically important due to the existing challenges within physics education in Ugandan secondary schools. Incorporating innovative teaching approaches, such as PBL supplemented by YouTube videos, becomes pivotal in addressing these challenges. These methods can enhance students’ comprehension of physics and nurture vital skills like problem-solving, creativity, communication, cooperation, and innovation. These skills are indispensable for students to thrive in a rapidly evolving, knowledge-driven world.

Sharing the data generated through this study is equally significant. It is a valuable resource for educators, policymakers, curriculum designers, and researchers. By making this data accessible, the study contributes to the ongoing efforts to improve the quality and relevance of secondary education in Uganda. Educators can utilize this data to adopt innovative and effective teaching methods that align with the goals of the educational system, ultimately enhancing students’ performance and fostering lifelong learning. Policymakers and curriculum designers can use the insights derived from this data to conduct essential reviews and make informed decisions about teacher competence and the adoption of innovative teaching methodologies. Furthermore, researchers in similar fields can leverage this data to understand better the impact of PBL and the use of multimedia resources in education. This data identifies gaps and challenges and offers potential solutions and avenues for further research.

This data-sharing article presents insights into the effects of Problem-Based Learning (PBL) supplemented by YouTube videos on students’ comprehension of simple machines in physics within Ugandan lower secondary schools. The research collected data from 973 students, encompassing both public and private schools in the Sheema district of Uganda. Three primary types of data were collected: students’ performance data, attitude data, and classroom observation data.

Performance data was acquired through a Physics Learning Achievement Test (PLAT), involving students from various school types and teaching methods. Attitude data were collected via two surveys, one focusing on problem-solving ability (AAPS) and the other on critical thinking ability (CTMS) under PBL with YouTube videos. The Approaches to Problem-Solving Survey (AAPS) and the Critical Thinking Motivational Scale (CTMS) are measurement tools commonly used in the field of physics. The AAPS assesses various strategies individuals employ when solving problems, while the CTMS evaluates motivational factors influencing critical thinking abilities. The Reformed Teaching Observation Protocol (RTOP) assessed classroom practices and teaching methods.

The dataset, available in raw, filtered, and analyzed formats, offers valuable insights into the impact of innovative teaching methods on student performance, attitudes, and classroom practices. It addresses critical questions about the effectiveness of PBL approaches, with potential implications for science education in Uganda.

This dataset was intended to assess the impact of PBL supplemented with YouTube videos on Ugandan Form-2 lower secondary students’ learning of simple machines. The research questions were as follows:

To what extent do PBL and PBL supplemented with YouTube videos enhance students’ conceptual understanding of simple machines in physics?

What are the problem-solving and critical thinking levels brought by learning with PBL supplemented by YouTube videos?

How is physics teaching reformed when learning simple machines in physics with PBL supplemented by YouTube videos?

Are there differences in students’ academic achievement by school type (government versus private)?

Ethics statements

The research project rigorously adhered to ethical standards established by the University of Rwanda College of Education’s (UR-CE) Research and Innovation Unit under the ethical protocol number Ref. 03/DRI-CE/078/EN/gi/2021, dated 30th November 2021. All necessary permissions were obtained systematically and ethically, as outlined in the research project description. Here is a summary of the ethical considerations and recruitment process:

Ethical protocol

The research project adhered to the ethical standards and principles of the UR-CE’s Research and Innovation Unit. The protocol number and approval date are explicitly mentioned above, demonstrating a formal ethical review.

Permissions from authorities

Formal permission to access schools was obtained from the Ministry of Education and Sports through the Permanent Secretary’s (PS) office. The PS communicated with the Chief Administrative Officer (CAO), District Education Officer (DEO), and Resident District Commissioner (RDC) to secure the necessary support for the study.

Engagement with schools

With approval from the CAO, the DEO contacted school heads to inform them about the research study. The school heads responded positively and made their physics teachers available for a three-day problem-based learning (PBL) training as part of the research. It is worth noting that all participating teachers held teaching qualifications, and as part of the research process, we provided them with a three-day training session specifically focused on implementing PBL interventions. This training aimed to ensure consistent delivery of PBL across treatment classrooms and schools, thereby mitigating variations attributable to individual teaching styles.

Informed consent

Teachers and students, with parental consent, willingly participated in the research study. Informed consent forms were signed, indicating they fully understood the study’s purpose, procedures, potential risks, and benefits. Anonymity was ensured for students by not including their names on the test papers.

Sampling

The research employed purposive sampling to select 973 students from 12 schools. These schools were divided into three groups, each with a different teaching method: PBL with YouTube videos, PBL alone, and traditional teaching.

Geographic considerations

Schools were selected from different town councils at extreme ends of the district, sharing similar characteristics suitable for the study. This approach helps ensure that the study’s findings are robust and generalizable.

Research design

The study utilized a non-equivalent comparison group pre/post-test design (Creswell, 2012). It involved Form 2 students from six schools in the Sheema District, Western Uganda: three of the selected schools were public, while the remaining three were private, offering a diverse representation of school types in the district. The selection of schools was purposeful, aiming to ensure diverse representation and maximize the study’s validity. This approach allowed for the strategic allocation of schools to treatment or control groups based on specific criteria pertinent to the research objectives. Notably, the selection criteria considered factors such as geographical location, school size, academic performance, and availability of resources to ensure a balanced representation of different educational contexts. The traditional method, characterized by conventional lectures supplemented with textbooks and teacher-centered content delivery, was employed in the control group schools. Students in this group primarily learned through note-taking with minimal demonstrations. Conversely, four other secondary schools were designated as the first treatment group, where Problem-Based Learning (PBL) was implemented. Four additional schools comprised the second treatment group, which utilized PBL supplemented by educational YouTube videos. These groups collectively engaged in constructing knowledge and enhancing conceptual understanding. The participants in the study were Form-2 students, ranging in age from 12 to 15 years, who were already enrolled in the schools.

We administered a performance (achievement) test to all 973 students before and after the teaching interventions in all groups. We also administered an attitude survey (motivation scale) and observed classes in the group that used PBL with YouTube videos. Table 1 presents the sample size under each teaching intervention of the design groups implemented.

The objective of the performance test was to gauge students’ grasp of the conceptual understanding acquired through the problem-based learning approach following the completion of the topic on simple machines. The test, spanning 25 minutes, consisted of ten questions sourced from practice exercises on simple machines within form-two secondary learners’ physics textbooks. The National Curriculum Development Center and the Ministry of Education and Sports in Uganda approved these textbooks. The examination encompassed themes outlined in the approved lower secondary curriculum physics syllabus, covering concepts like the applications of simple machines, mechanical advantage, velocity ratio, and efficiency of machines. Specific topics included levers (covering classes and applications), pulley systems (encompassing types, applications, mechanical advantage, velocity ratio, and efficiency), inclined planes (including applications, mechanical advantage, velocity ratio, and efficiency), wheel and axle (exploring understanding, applications, and velocity ratio), gears (addressing simplification of work, applications, and velocity ratio), and methods of enhancing machine efficiency. The test was validated by four researchers from Mbarara University of Science and Technology (MUST) and the University of Rwanda College of Education (URCE). Test 1 was scored in MS Excel with an “IF(EXACT(...))” formula, while Test 2 was marked manually, and results were entered in the same software.
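As a rough illustration of this keyed scoring (the Excel formula compares each answer choice with the key and awards one point per exact match), here is a minimal Python sketch. The answer key and the sample response are invented for illustration, not taken from the deposited data.

```python
# Minimal sketch of the keyed scoring described above: one point per
# answer choice that exactly matches the key, then a percent score.
# The key and the sample response are invented, not the study's data.
ANSWER_KEY = ["B", "D", "A", "C", "C", "A", "D", "B", "A", "C"]  # 10 items

def score_student(choices):
    """Return (per-item marks, percent score) for one student."""
    marks = [1 if choice == key else 0
             for choice, key in zip(choices, ANSWER_KEY)]
    return marks, 100.0 * sum(marks) / len(ANSWER_KEY)

marks, percent = score_student(["B", "D", "A", "A", "C", "A", "D", "B", "C", "C"])
print(marks, percent)  # [1, 1, 1, 0, 1, 1, 1, 1, 0, 1] 80.0
```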

Attitude surveys were all adapted from existing literature. The Critical Thinking Motivational Scale (CTMS) was used as our Survey 1 and was adapted from Valenzuela et al.11, while the Attitudes and Approaches to Problem-Solving Survey (AAPS) was used as our Survey 2 and was adapted from Singh and Mason12; it is available at Physport ( https://www.physport.org/assessments/assessment.cfm?A=AAPS ). Problem-solving and critical thinking are integral to effective physics education. They deepen students’ understanding by connecting theoretical concepts to real-world situations13,14. These skills encourage active engagement and foster analytical abilities, allowing students to break down complex problems. Additionally, they promote creativity, help apply theory to practice, and cultivate logical reasoning. Problem-solving and critical thinking prepare students for future challenges in scientific and engineering fields, encourage collaboration, boost confidence, and instill a mindset for lifelong learning. Incorporating these skills into physics teaching enhances academic performance and equips students with valuable personal and professional growth tools. We adopted all 19 items from the CTMS and only 31 items from the AAPS to meet our research aim; the last two items (32 and 33) in the AAPS were removed as they were not related to the content delivered in our study. All surveys were rated on a Likert scale (from strongly disagree to strongly agree). In the CTMS, items 1–4 relate to expectancy, items 5–8 to attainment, items 9–12 to utility, items 13–16 to interest, and items 17–19 to cost.
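A minimal Python sketch of that CTMS subscale grouping, assuming the usual 1–5 Likert coding; the example response vector is invented.

```python
# Sketch of the CTMS subscale grouping described above. Responses are
# assumed to be coded 1-5 (strongly disagree .. strongly agree).
CTMS_SUBSCALES = {
    "expectancy": range(1, 5),    # items 1-4
    "attainment": range(5, 9),    # items 5-8
    "utility":    range(9, 13),   # items 9-12
    "interest":   range(13, 17),  # items 13-16
    "cost":       range(17, 20),  # items 17-19
}

def subscale_means(responses):
    """responses: dict mapping item number (1-19) to a 1-5 rating."""
    return {name: sum(responses[i] for i in items) / len(items)
            for name, items in CTMS_SUBSCALES.items()}

example = {i: 4 for i in range(1, 20)}  # invented: "agree" to every item
print(subscale_means(example))          # every subscale mean is 4.0
```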

Classroom observation data were collected with the standardized Reformed Teaching Observation Protocol (RTOP) from Piburn and Sawada15, available at Physport ( https://www.physport.org/assessments/assessment.cfm?A=RTOP ). RTOP has demonstrated validity and reliability across the globe16,17,18,19, with the potential to reveal reformed teaching while a new teaching method is being implemented. It comprises 25 statements, each evaluated on a five-point scale: an item is scored 0 when the practice was not found in a lesson and 4 when the practice was very descriptive of the delivered lesson. During a classroom observation, the observer sits in the classroom and watches what the teacher and students do. He or she may take notes on what is happening but waits until the class is over to rate the 25 items.
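To make the rating scheme concrete, here is a tiny Python sketch that aggregates one observation into a 0–100 RTOP total (25 items, each scored 0–4); the ratings themselves are invented.

```python
# A tiny sketch of aggregating one RTOP observation: 25 items, each
# rated 0 (never occurred) to 4 (very descriptive). Summing gives a
# conventional 0-100 total. These ratings are invented, not study data.
ratings = [3, 2, 4, 3, 2, 3, 3, 4, 2, 3,
           3, 2, 3, 4, 3, 2, 3, 3, 2, 3,
           4, 3, 2, 3, 3]

assert len(ratings) == 25 and all(0 <= r <= 4 for r in ratings)
total = sum(ratings)  # 100 is the maximum possible score
print(f"RTOP total for this lesson: {total}/100")
```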

Data Records

All data described in this descriptor are deposited in figshare ( https://figshare.com/articles/dataset/RTOP_Data_for_the_implementation_of_Problem-based_learning_in_a_Physics_classroom_Uganda/23974902 )20.

To evaluate the impact of PBL teaching intervention on students’ performance and attitude toward learning physics, we gathered three data types (performance, attitude, and observation) presented in five datasets (two performance tests, two attitude surveys, and one classroom observation).

Students’ performance data

The student performance data comprises two datasets or MS Excel files. The first file contains data for Test 1 and is titled “Performance data _ Test 1 (Multiple choice) _ 12102022 figshare.” It holds data from ten multiple-choice questions across three sheets: the first sheet shows the test items (all ten questions), the second presents pretest answer choices, and the third presents post-test answer choices or results. Each results sheet records the school code (column B), student code (column C), school type (column D), and treatment group (column E) as variables. Columns “F” to “O” contain student answer choices under each test question, and columns “Q” to “Z” contain the marks (one score for each correct question). Column “AB” shows the percent score. Row “3” shows the expected correct answer, while row “4” shows the variables and the number of test items.

The second file contains data for Test 2 and is titled “Performance data _ Test 2 (Problem solving) _ 12102022 figshare.” It holds data from ten word-problem questions across three sheets: the first sheet shows the test items (all ten questions), the second presents pretest scores, and the third presents post-test scores or results. Each results sheet records the school code (column C), student code (column D), school type (column E), and treatment group (column F) as variables. Columns “G” to “P” contain student scores under each test question. Column “R” shows the total score, while column “S” shows the percent score. Row “3” shows the score assigned when each question’s expected correct answer was provided, and row “4” shows the variables and the number of test items.
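For readers who want to work with these files programmatically, a hedged pandas sketch of loading the post-test sheet of the Test 1 workbook and recomputing the percent score is shown below. The sheet index, header row, and column positions follow the description above but have not been verified against the deposited files.

```python
import pandas as pd

# Hedged loader for the Test 1 workbook described above. The file name
# comes from the text; the sheet index, header row, and column
# positions are assumptions to check against the figshare deposit.
test1 = pd.read_excel(
    "Performance data _ Test 1 (Multiple choice) _ 12102022 figshare.xlsx",
    sheet_name=2,  # third sheet: post-test answer choices (per the text)
    header=3,      # row 4 holds the variable names (assumption)
)

# Per the description, spreadsheet columns Q-Z hold the per-item marks
# and column AB the percent score; recompute the percent score from the
# marks as a consistency check (10 items, 10% each).
mark_cols = test1.columns[16:26]  # zero-based positions of columns Q..Z
test1["percent_check"] = test1[mark_cols].sum(axis=1) * 10
print(test1[["percent_check"]].head())
```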

Students’ attitude data

The student attitude data comprises two datasets or MS Excel files. The first file contains data for Survey 1 and is titled “Motivation data _ Survey 1 (Critical thinking ability) _ 12102022 figshare.” It holds data from the 19 items of the critical-thinking-ability survey across two sheets: the first shows the pre-test results, while the second shows the post-test results. Each sheet records the school code (column C), student code (column D), school type (column E), and treatment group (column F) as variables. Columns “G” to “Y” contain student answers or agreement (1: strongly disagree, 2: disagree, 3: neutral, 4: agree, and 5: strongly agree) for each item of the survey. Row “2” shows the survey title, while row “4” shows the variables and the number of survey items.

The second file contains data for Survey 2 and is titled “Attitude data _ Survey 2 (Problem solving ability) _ 12102022 figshare.” It holds data from the 31 items related to problem-solving ability in learning physics across two sheets: the first shows the pre-test results, while the second shows the post-test results. Each sheet records the school code (column C), student code (column D), school type (column E), and treatment group (column F) as variables. Columns “G” to “AK” contain student answers or agreement (1: strongly disagree, 2: disagree, 3: neutral, 4: agree, and 5: strongly agree) for each item of the survey. Row “2” shows the survey title, while row “4” shows the variables and the number of survey items.

Classroom observation data

The classroom observation data file is titled “Classroom observation data _ RTOP for video & pbl group _ 12102022 figshare” and contains only one sheet. Columns “B” to “C” list the RTOP items, while the following columns (D–AA) present the data. Row “10” shows the school codes, while row “5” shows the number of observations and frequencies under each school supplied with the PBL-and-YouTube-videos teaching intervention. The data range from 0 (never occurred) to 4 (very descriptive).

Technical Validation

Initially, we had 20 problem-solving questions, but evaluators rated 10 as valid, and these were included in the final administration. We also initially had 15 multiple-choice questions, of which evaluators rated 10 as appropriate and aligned with the study objectives. A pilot study was conducted with 90 students to evaluate the face validity and reliability of the questions. We assessed the reliability of these items using a split-half method and obtained high reliability (r = 0.87) for the multiple-choice items and medium reliability (r = 0.68) for the problem-solving items, measured by the Pearson product-moment correlation coefficient. The split-half reliability assumes that the two halves of the test are equivalent in difficulty and content21.
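A minimal Python sketch of one common split-half computation, assuming dichotomously scored items and an odd/even split (the text does not say how the halves were formed, and it reports the raw Pearson r; the Spearman-Brown correction that often follows is omitted here). The pilot-style data are invented.

```python
import statistics

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_half_r(score_rows):
    """score_rows: one list of per-item scores per student.
    Correlates odd-item and even-item half-totals; an odd/even split is
    one common choice, but the text does not specify the split used."""
    odd = [sum(row[0::2]) for row in score_rows]
    even = [sum(row[1::2]) for row in score_rows]
    return pearson(odd, even)

# Invented pilot-style data: five students by ten dichotomous items.
pilot = [[1, 1, 0, 1, 1, 1, 0, 1, 1, 1],
         [0, 1, 0, 0, 1, 0, 0, 1, 0, 1],
         [1, 1, 1, 1, 1, 1, 1, 1, 0, 1],
         [0, 0, 0, 1, 0, 1, 0, 0, 1, 0],
         [1, 0, 1, 1, 1, 1, 0, 1, 1, 1]]
print(round(split_half_r(pilot), 2))
```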

CTMS and AAPS

During our pilot phase, the internal consistency of the CTMS, assessed using Cronbach’s alpha, was high for all 19 items (0.793) and medium for the subscales: expectancy (0.428), attainment (0.411), utility (0.686), interest (0.574), and cost (0.594). The AAPS exhibited an internal consistency reliability of Cronbach’s alpha = 0.685. It is important to note that the AAPS contains nine items formulated negatively; for a positive attitude, students were required to respond with ‘Disagree’ or ‘Strongly disagree’ to these items (1, 3, 5, 8, 11, 12, 16, 23, and 30). Consequently, the reliability of the 22 positively formulated items was 0.601, while that of the negatively formulated items was 0.480.
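The two steps described here, reverse-coding the nine negatively worded AAPS items and computing Cronbach’s alpha, can be sketched as follows; the item numbering is 1-based as in the text, and the data shapes are illustrative.

```python
# Sketch of the two steps described above: reverse-code the nine
# negatively worded AAPS items, then compute Cronbach's alpha.
NEGATIVE_ITEMS = {1, 3, 5, 8, 11, 12, 16, 23, 30}

def reverse_code(responses):
    """responses: dict of item number -> 1-5 Likert rating."""
    return {i: (6 - r if i in NEGATIVE_ITEMS else r)
            for i, r in responses.items()}

def cronbach_alpha(matrix):
    """matrix: list of per-student response lists, already reverse-coded."""
    k = len(matrix[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in matrix]) for j in range(k)]
    total_var = var([sum(row) for row in matrix])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Example: three invented students by four items, already reverse-coded.
print(round(cronbach_alpha([[4, 5, 4, 4], [2, 3, 2, 3], [4, 4, 5, 4]]), 3))
```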

Before observing actual classes, we underwent a 2-hour training session and watched and coded a YouTube classroom video on physics. The inter-rater agreement between the first author and the assistant exceeded 80% on two occasions, indicating the reliability of the data.

Scope and potential limitations

In our study, we recognized the significance of investigating potential bias in the results obtained from students in both private and public schools. To ensure the credibility and robustness of our findings, we conducted a comparative analysis to determine whether any notable disparities existed between these two groups. Our data collection process was comprehensive, encompassing a diverse range of schools, including both private and public institutions. This approach allowed us to capture a broad spectrum of socioeconomic backgrounds and educational settings. The study itself involved Form-2 students who were enrolled in schools situated in different town councils at opposite ends of the district. Despite their geographical diversity, these schools shared pertinent characteristics relevant to our research objectives. To facilitate our investigation, we categorized these schools into distinct treatment groups, comprising PBL alone and PBL with videos, along with a control group following traditional teaching methods. Importantly, we deliberately chose to maintain the existing class arrangements in these schools. Our commitment to preserving each school’s established class organization and cultural norms guided this decision.

However, it is essential to acknowledge the limitations inherent in the research design. One notable limitation is the observation of attitudes, which was limited to the student group exposed to the PBL with the video teaching method. This restriction may impact the generalizability of the findings, as attitudes toward learning may vary among students exposed to different instructional methods. Future research endeavors could consider incorporating measures to assess attitudes across all treatment groups to provide a more comprehensive understanding of the intervention’s effects.

The current data files do not contain information on individual teachers due to the scope and focus of the study. These variables could include educators’ teaching experience, pedagogical approach, content knowledge, and instructional effectiveness. Since we recognize the significance of teacher impact, we would consider incorporating such variables in future research projects to provide a more comprehensive analysis of instructional effectiveness and its associated factors.

Regarding the decision to maintain existing class arrangements in schools, particularly considering cultural norms, it is crucial to recognize its potential influence on the study outcomes. The intervention’s impact may have been influenced by preserving the existing class structures, including student composition and dynamics. For instance, certain class arrangements may foster greater collaboration and engagement, while others may present challenges in implementing collaborative learning approaches such as PBL. Therefore, future studies could explore the relationship between class arrangements and instructional effectiveness to provide insights into optimizing learning environments.

Usage Notes

Value of the data

The data presented is valuable and beneficial to science education in Uganda as it elucidates the status of students’ content knowledge and their perceptions about learning simple machines with PBL approaches.

Policymakers and curriculum designers have the opportunity to conduct essential reviews that highlight the competence of teachers. This process can pave the way for advocating innovative and relevant teaching methodologies, subsequently informing the identification of professional development requirements for educators.

Researchers in similar fields can re-use these data to measure the effect of PBL intervention on student achievement, identify gaps, and predict possible remedies. Thus, data can be analyzed using various variables such as teaching intervention and school type.

Code availability

No custom code was used.

References

1. Putri, I. E. & Sinaga, P. Collaborative problem-solving: how to implement and measure it in science teaching and learning. In International Conference on Mathematics and Science Education (ICMSCE) 2020, vol. 1806 (IOP Publishing Ltd, 2021).

2. Sokoloff, D. R. Active Learning of Introductory Optics: Interactive Lecture Demonstrations and Optics Magic Tricks. In Education and Training in Optics and Photonics, OSA Technical Digest Series (Optical Society of America, 2007). https://doi.org/10.1364/ETOP.2007.EWA2 (2007).

3. Dohn, N. B., Fago, A., Overgaard, J., Madsen, P. T. & Malte, H. Students’ motivation toward laboratory work in physiology teaching. Adv. Physiol. Educ. 40, 313–318 (2016).

4. Nguyen, D.-H., Gire, E. & Rebello, N. S. Facilitating Strategies for Solving Work-Energy Problems in Graphical and Equational Representations. In 2010 Physics Education Research Conference (eds Singh, C., Sabella, M. & Rebello, S.) vol. 1289, 241–244 (Amer. Inst. Physics, 2010).

5. Seltpuk, G. S. & Tarak, M. Physics Teaching in Problem-Based Learning. 844, 1–2 (2017).

6. Bilgin, I. & Erdal, Ş. The Effects of Problem-Based Learning Instruction on University Students’ Performance of Conceptual and Quantitative Problems in Gas Concepts. EURASIA J. Math. Sci. Technol. Educ. 5, 153–164 (2009).

7. Susilawati, S., Rahmana, F. & Kosim, K. Practicality of problem-based physics learning tools with video assistance to improve problem-solving ability of students. J. Sci. Sci. Educ. 3, 55–59 (2022).

8. Nasbey, H. & Raihanati, R. Developing a video education on the topic of Modern Physics based on problem-based learning (PBL) assisted PhET online learning. In Journal of Physics: Conference Series (eds F.C., W. et al.) vol. 2377 (Institute of Physics, 2022).

9. Zhang, Y., Zhang, X., Zhang, K. & Shen, D. A Topic-based Computer Aided Instruction System for Programming Practice Courses. In Proceedings of the 2015 International Conference on Social Science, Education Management and Sports Education (ed. Chen, L.) vol. 39, 1342–1345 (Atlantis Press, 2015).

10. Becerra-Labra, C., Gras-Martí, A. & Martínez Torregrosa, J. Effects of a problem-based structure of physics contents on conceptual learning and the ability to solve problems. Int. J. Sci. Educ. 34, 1235–1253 (2012).

11. Valenzuela, J., Nieto, A. M. & Saiz, C. Critical Thinking Motivational Scale (CTMS): una aportación para el estudio de la relación entre el pensamiento crítico y la motivación. Electron. J. Res. Educ. Psychol. 9, 823–848 (2011).

12. Singh, C. & Mason, A. Physics graduate students’ attitudes and approaches to problem solving. AIP Conf. Proc. 1179, 273–276 (2009).

13. Ukobizaba, F., Ndihokubwayo, K., Mukuka, A. & Uwamahoro, J. Insights of teachers and students on mathematics teaching and learning in selected Rwandan secondary schools. African J. Educ. Stud. Math. Sci. 15, 93–106 (2019).

14. Dorimana, A., Uworwabayeho, A. & Nizeyimana, G. Enhancing Upper Secondary Learners’ Problem-solving Abilities using Problem-based Learning in Mathematics. Int. J. Learn. Teach. Educ. Res. 21, 235–252 (2022).

15. Piburn, M. et al. Reformed Teaching Observation Protocol (RTOP) Training Guide. ERIC No. ED419696 (Arizona Collaborative for Excellence in the Preparation of Teachers, Tempe, Arizona, 2000).

16. Ndihokubwayo, K., Uwamahoro, J. & Ndayambaje, I. Implementation of the Competence-Based Learning in Rwandan Physics Classrooms: First Assessment Based on the Reformed Teaching Observation Protocol. EURASIA J. Math. Sci. Technol. Educ. 16, 1–8 (2020).

17. Ndihokubwayo, K., Uwamahoro, J. & Ndayambaje, I. Classroom observation data collected to document the implementation of physics competence-based curriculum in Rwanda. Data Br. 36, 107055 (2021).

18. Park, S., Jang, J. Y., Chen, Y. C. & Jung, J. Is Pedagogical Content Knowledge (PCK) Necessary for Reformed Science Teaching?: Evidence from an Empirical Study. Res. Sci. Educ. 41, 245–260 (2010).

19. MacIsaac, D., Sawada, D. & Falconer, K. Using the Reformed Teaching Observation Protocol (RTOP) as a Catalyst for Self-Reflective Change in Secondary Science Teaching (2001).

20. Gumisirizah, N., Nzabahimana, J. & Muwonge, C. M. Students’ academic achievement test, Survey and RTOP data for the implementation of problem-based learning method supplemented by YouTube videos, Uganda. figshare https://doi.org/10.6084/m9.figshare.23974902.v2 (2023).

21. Adalikwu, S. & Iorkpilgh, I. The Influence of Instructional Materials on Academic Performance of Senior Secondary School Students in Chemistry in Cross River State. Glob. J. Educ. Res. 12, 39–45 (2013).


Acknowledgements

We acknowledge the African Center of Excellence for Innovative Teaching and Learning Mathematics and Science (ACEITLMS) for funding this study, and the authors of the research tools we used for making them freely available. All study participants, teachers, and school headteachers are also gratefully acknowledged.

Author information

Authors and Affiliations

African Center of Excellence for Innovative Teaching and Learning Mathematics and Science (ACEITLMS), University of Rwanda College of Education (URCE), Kayonza, P.O. Box 55, Rwamagana, Rwanda

Nicholus Gumisirizah & Joseph Nzabahimana

Mbarara University of Science and Technology (MUST), Mbarara, Uganda

Charles M. Muwonge


Contributions

Nicholus Gumisirizah: Conceptualization, Methodology, Visualization, Data curation, Software, Writing – original draft preparation. Joseph Nzabahimana, Charles M. Muwonge: Conceptualization, Validation, Writing – review and editing.

Corresponding author

Correspondence to Nicholus Gumisirizah.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Gumisirizah, N., Nzabahimana, J. & Muwonge, C. M. Students’ performance, attitude, and classroom observation data to assess the effect of problem-based learning approach supplemented by YouTube videos in Ugandan classroom. Sci Data 11, 428 (2024). https://doi.org/10.1038/s41597-024-03206-2


Received: 04 November 2023

Accepted: 02 April 2024

Published: 25 April 2024

DOI: https://doi.org/10.1038/s41597-024-03206-2



