
Inquiry: Critical Thinking Across the Disciplines, Volume 30, Issue 3

1. Frank Fair, From the Editor's Desk
2. Donald Hatcher, Critical Thinking Instruction: A Realistic Evaluation: The Dream vs. Reality
Since the 1980s, educators have supported instruction in critical thinking (CT) as "an Educational Ideal." This should not be a surprise given some of the more common conceptions, e.g., Ennis's "reasonable reflective thinking on what to believe or do" or Siegel's "being appropriately moved by reasons," as opposed to bias, emotion, or wishful thinking. Who would want a doctor, lawyer, or mechanic who could not skillfully evaluate arguments, causes, and cures? So educators endorsed the dream that, through proper CT instruction, students' critical skills and "rational passions" could be greatly improved. In spite of the dream's appeal, after 30+ years there is little reason to think the dream resembles reality. After describing what I take to be an adequate definition of CT, I support this depressing conclusion with CT assessment scores from across academe, the continued widespread disagreement among experts in nearly all fields, including CT, and the abundant psychological research on rationality and decision making. Finally, while the ideal extols the value of objectivity, I argue that bias may be unavoidable because personal values play a vital role in the evaluation of many arguments.
3. David Wright, Are We Asking the Right Questions about Critical Thinking Assessment?: A Response to Hatcher
This is a response essay to Donald Hatcher's (2015) "Critical Thinking Instruction: A Realistic Evaluation: The Dream vs. Reality." Hatcher argues that critical thinking (CT) instruction seriously falls short of the ideal of honestly evaluating alternative evidence and arguments. This failure is apparent, he argues, when one surveys student performance on a variety of CT assessment tests. Hatcher reviews the current CT assessment data, which include an extensive pool of results collected from Baker University, where Hatcher oversaw a sophisticated and well-funded CT program for about two decades. Hatcher also argues that evidence from the philosophical and psychological literatures on disagreement and judgment suggests that even CT experts fail to model the ideal and that CT has suffered from an unrealistic conception of rationality and human decision-making. I reply by arguing that, by putting the CT assessment data in a different context and asking an alternative set of questions, one can justifiably derive a more positive evaluation of the future of CT instruction in light of the CT ideal. Instead of focusing on whether students are achieving the CT ideal by the time of the post-test, instructors should ask whether students are making the kind of progress that there is good reason to expect. I close by challenging the soundness of the proposed implications of Hatcher's arguments.
4. Donald Hatcher, Effect Size and Critical Thinking Assessment: A Response to Wright
This is a brief response to David Wright's commentary on my paper, "Critical Thinking Instruction: A Realistic Evaluation: The Dream vs. Reality." Wright claims that, if one looks more closely at the literature on critical thinking (CT) assessment, the reported effect sizes for CT instruction are quite respectable and my standards are too high. My comments will focus on whether effect size is problematic and whether it is an adequate measure for assessment.
5. Ada Haynes, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning
This article provides a brief overview of the efforts to develop and refine the Critical Thinking Assessment Test (CAT) and its potential for improving the design of classroom assessments. The CAT instrument was designed to help faculty understand their students' strengths and weaknesses using a short-answer essay format. The instrument assesses a broad collection of critical thinking skills that transcend most disciplines. The questions were deliberately designed around real-world scenarios that do not require specialized knowledge from any particular discipline. Faculty who collaborated in the national dissemination of the CAT instrument found that it was a helpful model for designing better course assessments to grade student work. Classroom assessments modeled on the CAT emphasize more critical thinking within the discipline and less rote retention of factual information. We describe the ongoing work to help faculty successfully adapt the CAT to applications that can be used in each discipline's courses to evaluate and encourage students' critical thinking.
6. Pauletta G. Baughman, Gustavo M.S. Oliveira, Elizabeth M. Smigielski, Vida M. Vaughn, Evidence-Based Critical Thinking Exercise: Shaping Tomorrow's Dental Choices
The objective was to educate first-year dental students on how to appraise new dental treatments by applying critical thinking (CT) and evidence-based dentistry (EBD) skills. To facilitate this task, we utilized a learning exercise involving a simulated office visit by a dental pharmaceutical representative. The simulated sales call was conducted after instruction by dental school faculty and clinical librarians on EBD and CT principles. Students' critical thinking and evidence-based practice skills were tested using a validated critical thinking assessment tool and a rubric-based written assignment. Results showed that ninety-one percent of students demonstrated a high/positive response. Students agreed that the exercise helped them to consider multiple perspectives on the subject matter. The majority of students also scored high/positive in understanding the components of a clinical question employing the PICO format. Students agreed that the instruction they received supported their ability to demonstrate critical thinking skills. Eighty percent rated the instruction as having a high/positive impact on their ability to navigate complex clinical questions. The authors concluded that the simulated office visit, combined with explicit instruction in EBD principles, improved first-year dental students' CT skills.
7. Michael Lively, Critical Thinking and the Pedagogy of Music Theory
Students of music theory are often challenged both by the complexity of the concepts they are expected to learn and by the abstract nature of these ideas. For many students, direct experience with music, acquired while studying the skills associated with musical performance, does not translate directly into the intellectual environment of the traditional music theory class. The difficulty derives from the gap between students' perception of musical structure and the understanding of these concepts generally held by composers and music theorists. In this study, I suggest that, in addition to systematically teaching the content of the established music theory curriculum, instructors have more success when they develop instructional materials and design their courses with higher-level critical thinking skills in mind.