Reflecting on the International PISA Problem-Solving Test

The Programme for International Student Assessment (PISA) tests evaluate the academic skills of a subset of 15-year-old students in countries belonging to the Organization for Economic Cooperation and Development (OECD), which comprises 35 countries worldwide. The OECD also tracks economic, educational, and equity data, and a full reading of the results document reveals correlations between PISA results and these measures. An overall geographical map of the 2015 PISA findings is available at http://www.oecd.org/pisa/. The 2015 version of the test, which is offered every 3 years, included results in science and mathematics, which should be of interest to medical educators. A direct link to the full report and database of country results is found at http://www.oecd.org/edu/pisa-2015-results-volume-i-9789264266490-en.htm.

The next few postings in this blog will attempt to take a deeper dive into some of this data, focusing on its potential relevance to medical educators.

One of the missions of VetMedAcademy.org is to cultivate a dialogue on the role of intentional training of our students in problem-solving. You’ll even see a learning module on the site entitled Blended Learning and Critical Thinking, which suggests strategies relevant to veterinary educators. The PISA test last evaluated individual problem-solving skills in 2012 and then evaluated collaborative problem-solving in 2015. Let’s look first at the individual problem-solving results from the PISA 2012 test (http://www.oecd.org/pisa/keyfindings/pisa-2012-results.htm). We might be disappointed that these results are “dated,” but should note that the students evaluated are the cadre of students currently in veterinary schools.

First, the test purported to evaluate the following processes through a series of static and interactive computer exercises of escalating difficulty:
“• Exploring and understanding
• Representing and formulating
• Planning and executing
• Monitoring and reflecting”
Some interesting patterns were revealed. Students in Singapore, Korea, and Japan scored highest on the problem-solving section. About 20% of students worldwide were able to solve only simple problems that referred to familiar situations. At the other end of the spectrum, only about 11% of the 15-year-olds in this sample could systematically explore a complex scenario, working through multiple steps that included consideration of constraints and of feedback on results. Problem-solving performance correlates positively with performance in the other assessed subjects, but more weakly than the traditional academic areas of math, science, and reading correlate with one another. We’ll come back to additional data on this later.

Some countries perform more strongly on problem-solving than their performance in these traditional areas would predict. This suggests that some educational systems provide proportionally more experience in problem-solving than others. It also suggests that we might look to these systems for clues, and that each country might consider adjusting its approach to seek the ideal balance of educational priorities.

For example, students in China, Korea, and Singapore perform strongest on “static” problems, which require understanding, formulating, or representing new knowledge, compared to other types of problems. In contrast, students in Brazil, Ireland, Korea, and the United States perform best on “interactive” problems, which require discovering some of the information needed to solve the problem. Static problems are those typically found in textbooks. “Real-life” problems are more interactive than that, and solving them may not benefit as much from rote practice of particular problem types. However, the “static” problem-solving skillset is also needed for solving “real-life” problems.

The static or interactive nature of a problem situation is related to how information is presented. Static problems, where all relevant information is disclosed at the outset, are the typical textbook problems encountered in schools, whereas in most contexts outside of school, the information relevant to solving the problem has to be obtained by interacting with the environment. Static problems can thus be regarded as a special case of interactive problems, which highlights the fact that the skills required to solve static tasks are a subset of the skills required for interactive tasks. In addition, however, students must “be open to novelty, tolerate doubt and uncertainty, and dare to use intuitions (‘hunches and feelings’)” to initiate a solution. The PISA report suggested that students with poorer performance on interactive problems might benefit from opportunities to practice “curiosity, perseverance and creativity.”

Lest educators from any country become sanguine about these results, it is sobering to move beyond averages and note that we would certainly want all of our students to perform at a higher level than most achieve. The figures below are cumulative: each gives the percentage of students performing at or above that proficiency level (a short sketch after the list turns these into the share of students at each exact level). For more detail on the nature of each level of problem-solving ability, see page 57 of the report and the graphic at the top of this post:
91.8% at Level 1 or above: can explore a problem, but generally only when they have previously encountered similar situations.
78.6% at Level 2 or above: can explore and evaluate a small part of a problem, and can test a simple hypothesis in linear fashion.
56.6% at Level 3 or above: can explore a problem and infer simple relationships between its components, can fully deal with one condition, and can develop tests for a hypothesis.
31.0% at Level 4 or above: can explore a “moderately complex problem in a focused way,” understand connections between the components of a problem, adjust plans in response to feedback, and formulate a hypothesis and develop plans to test it.
11.4% at Level 5 or above: can explore a scenario to understand the structure of the underlying information, react quickly to feedback, and think ahead to develop a plan that accommodates given constraints.
2.5% at Level 6: can develop models of diverse problem scenarios that facilitate efficient problem-solving, set up general hypotheses about a system and propose multi-step ways to test them with appropriate monitoring, quickly recognize when the available information is inadequate to reach a conclusion, and modify strategies to accommodate both explicit and implicit constraints.
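
Because the percentages above are cumulative, the share of students at each exact level is simply the difference between adjacent entries. Here is a minimal sketch of that arithmetic in Python, using only the figures transcribed from the list above:

```python
# Convert the cumulative "at or above" percentages quoted above
# into the share of students at each exact proficiency level.
at_or_above = {1: 91.8, 2: 78.6, 3: 56.6, 4: 31.0, 5: 11.4, 6: 2.5}

share_at_level = {}
for level in sorted(at_or_above):
    higher = at_or_above.get(level + 1, 0.0)  # nothing sits above Level 6
    share_at_level[level] = round(at_or_above[level] - higher, 1)

below_level_1 = round(100.0 - at_or_above[1], 1)

print(f"Below Level 1: {below_level_1}%")  # 8.2%
for level, share in share_at_level.items():
    print(f"Exactly Level {level}: {share}%")
# Exactly Level 1: 13.2%, Level 2: 22.0%, Level 3: 25.6%,
# Level 4: 19.6%, Level 5: 8.9%, Level 6: 2.5%
```

Note that only 11.4% of students clear the Level 5 bar proposed below, which is the basis for the “almost 90%” figure later in this post.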

A student at which of these performance levels would YOU want treating you or your animals?

For the sake of argument, we’d suggest that the minimal performance for a medical professional should be Level 5, if not Level 6! So there is clearly plenty of room for development for almost 90% of students worldwide; but do we develop our medical professional students toward these ends? More later on correlations between educational systems and PISA results, and on the 2015 results in collaborative problem-solving.

Article in J. Vet. Med. Education on Development of Clinical Reasoning Skills

A recent study by Vinten et al. in the J. Vet. Med. Education documented the outcome of focus groups among faculty, staff, and recent graduates of the University of Nottingham School of Veterinary Medicine and Science (SVMS) on the adequacy of current training in clinical reasoning skills. One conclusion was that student development of this skill would be enhanced by giving students increased responsibility for the outcomes of clinical decisions within the training program. Furthermore, financial issues were felt to receive short shrift during training, leading to less preparedness when graduates later face actual practice decisions.

The fundamental message from this study supports the idea of consistently providing students with “real-life” practice in clinical decision-making. A parallel question follows naturally but is not addressed by this study: since veterinarians and other medical professionals are constantly making decisions, some with quite high stakes, why are we as medical educators not carving out more time for students to practice these skills with cases that include all of the messy aspects of reality? If we did, we would include cases with distracting information (inconsistent or missing lab data), weave in the uncomfortable reality of a client’s financial situation, and confront the other ethical dilemmas of professional life. The dilemma for many veterinary educators then becomes: what can be cut from the curriculum to make room for this kind of training?

Abstract