By PETER OMMUNDSEN

Problem-based learning (PBL) is an exciting way to learn biology and is readily incorporated into large classes in a lecture-hall environment. PBL engages students in solving authentic biological case problems, stimulating discussion among students and reinforcing learning. A problem-based learning environment emulates the workplace and develops self-directed learners, which is preferable to a mimetic environment in which students merely watch, memorize, and repeat what they have been told. The examples given here are suitable for use in a first-year college biology lecture theater, but the method is applicable to any class size and educational level. [A more detailed explanation of PBL in biology may be found in Chapter Four of Inspiring Students, published in 1999 by Kogan Page.]

METHOD FOR INSTRUCTORS

(1) Form Small Groups

You may decide to devote all or part of a class session to PBL, but students must form small work groups during that time. Ask the students to form groups of 3-5 people, or assign the groups yourself or by lottery.
(2) Present the Problem

Present the students with a brief problem statement (preferably on a printed work sheet, an example of which is shown below), e.g., "A 28-year-old man appears to have osteoporosis." In some cases a video clip or specimen might serve as the trigger. Emphasize to the students that they are dealing with an authentic case history. Bizarre problems work best [more examples follow]. Prior to class, review the case history and arm yourself with data that can be released incrementally (progressive disclosure) as the case proceeds. There is a comprehensive data set for the osteoporosis problem in the New England Journal of Medicine, 1994, 331:1056-61 and 1088-9. Needless to say, the students should not be given the reference, as the objective is to solve a problem, not read a solution.
(3) Brainstorm Hypotheses

Ask the groups to brainstorm possible causes of the osteoporosis. Each group will have to discuss, review, or investigate the biology of bone, including the role of osteoblasts, diet, vitamin D, parathyroid hormone, growth hormone, calcitonin, kidney function, etc. This is when much learning occurs, as the students help each other understand the basic biology. PBL students must reflect upon biological mechanisms rather than just memorize facts (as might occur in some traditional lecture-only courses). The instructor circulates among the groups, providing assistance but not solutions. The groups may well explore avenues unanticipated by the instructor; this is highly desirable and should not be discouraged. The instructor should avoid controlling the agenda of the groups. Each group ranks its hypotheses in order of priority and prepares requests for more data (e.g., for the calcium-deficiency hypothesis: "What did he usually eat?").
(4) Pool Ideas and Provide Feedback

Ask a representative from each group to place the group's top-priority hypothesis or data request on the chalkboard (if it has already been entered by another group, the second choice, and so on). If this is not practical, ask for oral suggestions from the groups when the small-group work is halted and the class is reconvened. Student suggestions may include:

  • Low calcium diet
  • Immobility
  • Low density of vitamin D receptors
  • Calcitonin deficiency
  • Excessive PTH
  • Chronic acidosis buffered by salts mobilized from bone

The small-group work can be stopped and the instructor can briefly discuss the ideas with the entire class. It is important to value every contribution, to assist the students in analysis of the biology involved, and to provide further information [he was not immobile, he had a normal diet, etc.]. The students can be prompted for data requests: "If you could ask for just three test results from examination of this man, what would they be?" It is not likely that the students will solve a problem on the first pass, and the feedback from the instructor motivates the next round of small-group work. The students could now be told that the man's lumbar spine density is 3.1 standard deviations below the average for age-matched healthy females (osteoporosis = 2.5+ SD below the mean), his height is 204 cm, his left middle finger is 10 cm, and knee films show open epiphyses. (The students should now be able to figure out that the man may still be growing at age 28.) The cycle of small-group work and instructor feedback can be continued during the current class session or on future occasions. The key to managing a PBL session is providing continual feedback to maintain student enthusiasm while prolonging the resolution of the problem so that adequate learning occurs.
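The density criterion released in this step is arithmetic the students can check for themselves. A minimal sketch in Python; the function name is illustrative, and the 1.0-2.5 SD osteopenia band is a standard clinical convention rather than data from the case:

```python
# Hedged sketch: classify a bone mineral density score, given as standard
# deviations below the reference mean, using the cutoff quoted in the case
# (osteoporosis at 2.5+ SD below the mean). Illustrative only.

def classify_bmd(sd_below_mean: float) -> str:
    """Classify a BMD score expressed as SD below the reference mean."""
    if sd_below_mean >= 2.5:
        return "osteoporosis"
    elif sd_below_mean > 1.0:
        return "osteopenia"   # conventional intermediate band (assumed)
    else:
        return "normal"

print(classify_bmd(3.1))  # the patient in this case, at 3.1 SD below the mean
```

At 3.1 SD below the mean, the patient clears the 2.5 SD cutoff, which is the inference the groups are expected to draw.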
(5) Resolution

At this point in our example the groups will likely focus on the hormones required for epiphyseal closure and bone mineralization. They may ask you for serum estrogen levels (high), which will suggest estrogen resistance. Were the estrogen receptors defective? (Yes.) When a reasonable number of groups have solved the problem, you might request a brief written analysis from each group describing the biology involved in the case. Students may be asked to include certain key words in their reports. If you wish to pursue this case further at a later date, you could tackle the genetics of the defect. (A C-to-T transition in both alleles of the estrogen receptor gene causes a premature stop codon; both parents were heterozygous, with consanguinity in the pedigree.)
PROBLEM-SOLVING METHOD FOR STUDENTS

Effective problem-solving requires an orderly approach. Problem-solving skills do not magically appear in students as a result of instructors simply throwing problems at them. Our students use the following heuristic for how to make a DENT in a problem: Define, Explore, Narrow, Test.
Define

What exactly are you trying to determine? Does the problem have several components? If several, state them separately. Does everyone in the group agree with the way the problem has been framed? Ask group members to "think out loud," as that slows down their reasoning and enables people to check for errors of understanding.
Explore

Brainstorm ideas that may contribute to a solution. Justify your ideas to group members. Clarify for them the biology involved. Have them paraphrase your ideas. Listen carefully to the ideas of other group members and give positive feedback. Make a list of learning issues. What do we know? What don't we know? Is this problem analogous to any past problem? What core biological concepts may apply to this problem? Assign research tasks within the group.
Narrow

After developing a list of hypotheses, sort them, weed them, and rank them. List the type of data required to test each hypothesis. Give priority to the simplest, least costly tests. It is easier to get information on the diet of a subject than it is to do sophisticated biochemical tests.
Test

Seek from your instructor the data that you need to test your ideas. If all your possible solutions are eliminated, begin the cycle again: define, explore, narrow, test. When you encounter data that confirm one of your hypotheses, you may be asked to write a biological explanation of your solution and justify it using the available evidence.
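The DENT cycle can be sketched as a simple rank-and-filter loop. This is only an illustration of the heuristic: the hypothesis list, the cost scores, and the `ruled_out_by` sets below are invented for the osteoporosis example, not part of the method itself.

```python
# Sketch of one Narrow/Test pass of the DENT heuristic. Illustrative data
# structures only; in class the "tests" are data requests answered by the
# instructor.

def dent_cycle(hypotheses, evidence):
    """Rank hypotheses (cheapest tests first), then drop any that the
    available evidence contradicts."""
    ranked = sorted(hypotheses, key=lambda h: h["cost"])
    surviving = [h for h in ranked
                 if all(fact not in h["ruled_out_by"] for fact in evidence)]
    return surviving

hypotheses = [
    {"name": "low calcium diet",    "cost": 1, "ruled_out_by": {"normal diet"}},
    {"name": "immobility",          "cost": 1, "ruled_out_by": {"not immobile"}},
    {"name": "estrogen resistance", "cost": 3, "ruled_out_by": set()},
]
evidence = {"normal diet", "not immobile"}
print([h["name"] for h in dent_cycle(hypotheses, evidence)])
```

When all but one hypothesis is eliminated, the groups move to writing the biological explanation; when all are eliminated, the cycle restarts at Define.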
EXAMPLE CASE PROBLEMS

Following are examples of typical case problems that have been culled from biological journals and successfully class-tested at the first-year college level.
A 58-year-old woman experienced attacks of confusion: she would repeat the same question 30 times even though it was answered for her each time. [New England Journal of Medicine 315:1209-19.] This is a good introductory case, as the students are able to generate a wide range of ideas: Alzheimer's disease, trauma, alcohol abuse, atherosclerosis, arrhythmia, hypotension, cancer, epilepsy, diabetes, hypocalcemia, emphysema, dehydration, hypoglycemia, stroke, etc. The students perceive that the class as a whole is a credible learning resource, and the instructor can help the class reflect upon the biological implications of each suggestion. Eventually the students will ask about the circumstances of the woman's attacks (e.g., "Did they follow alcohol consumption?"). When the students learn that the attacks occurred in the late afternoon, they will likely focus on diet and blood sugar. The instructor might at this point present a short talk on carbohydrate function and blood sugar regulation. This can be done using a transparency, with copies available to the students; it is important in a PBL environment to minimize the time required for note-taking. The students will ask for information on the woman's blood glucose level (1.6 mmol/L) and urine glucose level (zero). The student groups can now brainstorm and investigate possible causes of the low blood glucose: glucagon deficiency, insulin poisoning, anorexia nervosa, extreme exercise, etc. They may ask for an x-ray image of her abdomen, which the instructor can display as a transparency copied from the article. The students can be assisted in identifying the anatomy, including an abnormal mass in the pancreas (an insulin-secreting tumour). Additional discussion and learning opportunities can be generated by displaying copies of the ultrasonogram, angiogram, histopathology, etc. The students in each group may then collaborate in writing a brief report that explains the biology of the case.
Sabrina the cat fell 32 stories from a New York skyscraper and easily survived, as do most cats that fall from skyscrapers, especially those that fall more than several stories. Not so for humans. Why? [Natural History Magazine, August 1989: 20-26.] This intriguing case requires students to confront (or review) fundamental concepts that have wide application in biology, including allometry, momentum, stress, compliance, friction, surface area, acceleration, equilibrium, adaptation, and natural selection.
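Part of the answer the students usually reach is the scaling of terminal velocity with body size. A back-of-envelope sketch using the standard quadratic-drag balance, v_t = sqrt(2mg / (rho * Cd * A)); the masses, frontal areas, and drag coefficient below are rough assumed values for illustration, not data from the article:

```python
# Hedged estimate of terminal velocity for a cat vs. a human.
# Mass scales roughly with length cubed but area with length squared,
# so smaller animals reach lower terminal velocities.
import math

def terminal_velocity(mass_kg, area_m2, cd=1.0, rho=1.2, g=9.8):
    """Terminal velocity (m/s) where drag balances weight."""
    return math.sqrt(2 * mass_kg * g / (rho * cd * area_m2))

v_cat = terminal_velocity(mass_kg=4.0, area_m2=0.12)    # sprawled cat (assumed values)
v_human = terminal_velocity(mass_kg=70.0, area_m2=0.5)  # adult human (assumed values)
print(f"cat ~ {v_cat:.0f} m/s, human ~ {v_human:.0f} m/s")
```

With these inputs the cat's terminal velocity comes out roughly half the human's, which, together with the cat's righting reflex and compliant landing posture, frames the class discussion.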
A woman with type AB blood gave birth to a child with blood type O. A second type-O child was born six years later. [Nature 277:210-211.] This case appears to contradict Mendelian inheritance, which the students will be obliged to thoroughly review, but it also demands that they make a rigorous examination of meiosis, gametogenesis, fertilization, and early development in order to propose some credible explanatory mechanisms.
A farmer was alarmed to notice tomato plants that were stunted and withered. This case initially requires the students to reflect carefully upon many basic concepts of plant anatomy, histology, physiology, ecology, and pathophysiology. Students might discuss and explore possible effects of soil quality, water relations, humidity, transpiration, hormones, and nutrition. Students should be encouraged to explore examples of pathogenic mechanisms, perhaps involving TMV, wilt fungi, wilt viruses, stunt viruses, and wilt bacteria. Ultimately the cause may be attributed to ABA deficiency, and the instructor might suggest this by introducing evidence of viviparity. Students can then focus on the roles of ABA and ethylene, and further work might address the genetics of the defect. There is a comprehensive literature on ABA-deficient mutants, and many easily accessible web resources, e.g., Plant Biology 2000 Abs 706, XVI International Botanical Congress Abs 6158, etc.
A 94-year-old woman admitted to hospital for pneumonia had a swollen abdomen. A CT scan revealed a fetus. The woman had dementia so was unable to explain what had happened. [New England Journal of Medicine 321:1613-14.] This case prompts exhaustive brainstorming of all aspects of reproductive physiology and will produce many imaginative hypotheses.
In a coyote-control experiment coyote population density was greatly reduced. The number of rodent species then declined from ten to only two! Rodent species richness did not change on comparison areas where coyote density remained high. [Journal of Wildlife Management 63:1066-81.] This case opens many avenues of biology for exploration, including trophic levels, population regulation, population limitation, competitive exclusion, niche breadth, keystone species, umbrella species, predator control policy, biodiversity, and species richness.
A 24-year-old man experienced abdominal pain, diarrhea, and distention whenever he consumed sugar. This was a life-long problem. [New England Journal of Medicine 316:438-442.] This case ensures that students master the taxonomy of carbohydrates, and the physiology of carbohydrate digestion and absorption.
A woman encountered her 30-year-old daughter squeezing the toothpaste and unable to let go. Later that day the daughter was found holding the doorjamb and unable to move forward. [New England Journal of Medicine 317:493-501.]
Obtain a selection of DNA-typing profiles (RFLP autorads or STR electropherograms) from local police, and construct a brief but equivocal fictional case history. Divide the class into groups of five, each group with one judge, two prosecutors, and two defense attorneys. Each student should have a copy of the case and copies of raw DNA profiles. (The old autorads force the students to measure by hand.) Each side must argue the evidence before the judge and submit to the instructor a brief written report along with a written decision from the judge. This exercise demands that students help each other to thoroughly understand the genetics, and the proceedings result in much hilarity. It is desirable to introduce some complexity; for example, we included an autorad from blood on a knife that contained specimens from several people. Another good source of DNA-typing problems is wildlife census data from hair traps (e.g., grizzly bears).
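The core comparison the student "attorneys" perform can be sketched as a locus-by-locus genotype match. The locus names below are real STR loci, but the allele calls are fabricated for illustration, and real casework also weighs a match statistically rather than treating it as a simple yes/no:

```python
# Illustrative sketch of exclusion-based DNA profile comparison.
# A suspect is excluded if any locus genotype differs from the evidence.

def profiles_match(evidence, suspect):
    """True if the suspect's genotype matches the evidence at every locus."""
    return all(sorted(evidence[locus]) == sorted(suspect[locus])
               for locus in evidence)

evidence  = {"D3S1358": (15, 17), "vWA": (16, 16), "FGA": (21, 24)}
suspect_a = {"D3S1358": (17, 15), "vWA": (16, 16), "FGA": (21, 24)}  # fabricated
suspect_b = {"D3S1358": (15, 16), "vWA": (16, 17), "FGA": (22, 24)}  # fabricated

print(profiles_match(evidence, suspect_a))  # consistent at every locus
print(profiles_match(evidence, suspect_b))  # excluded
```

The mixed-specimen autorad mentioned above complicates exactly this step, since the evidence lane then carries more than two alleles per locus.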
121 cases of illness were characterized by sleeplessness, headache, tachycardia, shortness of breath, sweating, tremor, heat intolerance, and weight loss. [New England Journal of Medicine 316:993-998.]
A fitness test of applicants to a fire department resulted in 32 hospitalizations with back pain, muscle pain, and reduced urine output. One person died. [MMWR 39:751-6.] The students will at some point address muscle physiology. What happens when muscle cells break during exertion? What are the consequences of hyperkalemia on the heart? Where does all the potassium originate? What are the effects of myoglobin on the kidneys? What is the impact of oxygen free radicals produced by damaged muscles?
An 80-year-old woman suffered from confusion, falls, and fractures. Her lungs were gritty like hard sponges. [New England Journal of Medicine 315:1209-19.]
A one-year-old boy began to have recurrent bacterial infections including pneumonia, sinusitis, and middle ear infections. This pattern continued, and at age 9 he developed Hodgkin's disease. He is HIV-negative. [New England Journal of Medicine 320:696-702.]
In one mite species of the genus Adactylidium the male is born, does nothing, and dies within a few hours. What evolutionary selection pressures might have shaped this life-style? [Stephen J. Gould, The Panda's Thumb (book) pp 73-75.]
An 88-year-old man had eaten 25 eggs per day for many years, yet his serum cholesterol was only in the range of 150-200 mg/dL. [New England Journal of Medicine 324:896-900.]
An 18-year-old man fatigued quickly during exercise. [New England Journal of Medicine 324:364-9.] This is an excellent case for application of principles of cellular energy metabolism.
Four hundred people at a rock concert collapsed or experienced faintness, with possibly as many as six different proximal causes. [New England Journal of Medicine 332:1721.] Students must reflect on the biology of a number of organ systems and mechanisms: fasting hypoglycemia, fasting acidosis, orthostasis, hyperventilation-induced cerebral vasoconstriction, Valsalva pressure from screaming and crowding, etc.
A forest patch was logged, then replanted, but within seven years the newly planted trees began to die. [Local example -- acid precipitation, leaching of soil nutrients, inadequate woody debris left on ground as a soil nutrient bank after logging.]
A rattlesnake can flick its tail 90 times per second. (Compare that to the speed at which you can flick a finger and address the possible differences in muscle biology.) [Science News 150:53 July 27 1996.]
A 26-year-old woman complained of weakness and lassitude. Her blood pH was 7.56 and her arterial pCO2 was 45.2 mm Hg. Blood pressure was 90/60. This is a terrific case, well presented, with a wealth of data on blood gas and electrolyte values. The case requires students to consider the functional interaction of several organ systems. [Nephrology Dialysis Transplantation 16:1066-1068.] A printable PDF copy is available at Teaching Point.
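For this case, students can run the numbers themselves with the Henderson-Hasselbalch equation, pH = 6.1 + log10([HCO3-] / (0.03 x pCO2)), with pCO2 in mm Hg and bicarbonate in mmol/L. The sketch below treats the reported 45.2 as mm Hg (an assumption about the intended unit):

```python
# Worked Henderson-Hasselbalch check from the case data.
# Solving the equation for bicarbonate: [HCO3-] = 0.03 * pCO2 * 10**(pH - 6.1)

def bicarbonate(ph, pco2_mmhg):
    """Serum [HCO3-] in mmol/L implied by a given pH and pCO2."""
    return 0.03 * pco2_mmhg * 10 ** (ph - 6.1)

hco3 = bicarbonate(7.56, 45.2)
print(f"[HCO3-] ~ {hco3:.0f} mmol/L")  # well above the ~24 mmol/L norm
```

A bicarbonate near 39 mmol/L in an alkalemic patient points the class discussion toward metabolic alkalosis and its renal and electrolyte causes.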


The scientific method


  • Make an observation.
  • Ask a question.
  • Form a hypothesis, or testable explanation.
  • Make a prediction based on the hypothesis.
  • Test the prediction.
  • Iterate: use the results to make new hypotheses or predictions.

Scientific method example: Failure to toast

1. Make an observation.

  • Observation: the toaster won't toast.

2. Ask a question.

  • Question: Why won't my toaster toast?

3. Propose a hypothesis.

  • Hypothesis: Maybe the outlet is broken.

4. Make predictions.

  • Prediction: If I plug the toaster into a different outlet, then it will toast the bread.

5. Test the predictions.

  • Test of prediction: Plug the toaster into a different outlet and try again.
  • If the toaster does toast, then the hypothesis is supported—likely correct.
  • If the toaster doesn't toast, then the hypothesis is not supported—likely wrong.

6. Iterate.

  • Iteration time!
  • If the hypothesis was supported, we might do additional tests to confirm it, or revise it to be more specific. For instance, we might investigate why the outlet is broken.
  • If the hypothesis was not supported, we would come up with a new hypothesis. For instance, the next hypothesis might be that there's a broken wire in the toaster.



Biology LibreTexts

1.3: Problem Solving


Educators and employers alike have argued strongly in recent years that the ability to solve problems is one of the most important skills that should be taught to and nurtured in university students. Medical, professional, and graduate schools look for students with a demonstrated ability to solve problems; the MCAT has even recently changed its format to more specifically assess students' ability to solve problems. Life is full of problems to solve, irrespective of the profession one chooses. Effective problem-solving skills are important!

Despite a clear demand for this skill set, it is surprisingly rare to find problem solving taught explicitly in formal educational settings, particularly in core science courses where the transmission and memorization of “facts” usually take precedence.

In BIS2A, we want to start changing this. After all, nobody really cares if you’ve memorized the name or catalytic rate of the third enzyme in the citric acid cycle (not even standardized tests), but a lot of people care if you can use information about that enzyme and the context it functions in to help develop a new drug, design a metabolic pathway for making a new fuel, or help understand its importance in the evolution of biological energy transformations.

Your instructors believe that the ability to solve problems is a skill like any other. It is NOT an innate (i.e. you’ve either got it or you don’t) aptitude. Problem solving can be broken down into a set of skills that can be taught and practiced to mastery. So, even if you do not consider yourself a good problem solver today, there is no reason why you can’t become a better problem solver with some guidance and practice. If you think that you are already a good problem solver, you can still get better.

Cognitive scientists have thought about problem solving a lot. Some of this thinking has focused on trying to classify problems into different types. While problems come in many different flavors (and we’ll see some different types throughout the course), most problems can be classified along a continuum of how well-structured they are.

At one end of the continuum are well-structured problems. These are the types of problems that you usually encounter in school. They usually have most of the information required to solve the problem, ask you to apply some known rules or formulas, and have a pre-prescribed answer. On the other end of the continuum are ill-structured problems. These are the types of problems you will usually face in real life or at work. Ill-structured problems are often poorly defined and usually do not include all of the information required to solve them. There may be multiple ways of solving them, and even multiple possible “correct” outcomes/answers.

Note: Possible Discussion

Well-structured problems (like the story problems you might often encounter in textbooks) are often set in an artificial context, while the ill-structured problems one faces in day-to-day life are often set in a very specific context (your life). Is it possible for multiple people to observe the same situation and perceive different problems associated with it? How do context and perception influence how one might identify a problem, its solution, or its importance? To have a fruitful, enriching discussion, it pays to start by presenting an example AND some direct reasoning. Replies that acknowledge the initial comment and either extend the original argument (by way of a new perspective or example) or provide a reasoned counter-argument are the most valuable follow-ups.

Problems can also be “simple” or “complex,” depending on how many different variables need to be considered to find a solution. They can also be considered as “dynamic” if they change over time. Other problem classification schemes include story problems, rule-based problems, decision-making problems, troubleshooting problems, policy problems, design problems, and dilemmas. As you can see, problem solving is a complicated topic, and a proper, in-depth discussion about it could take up multiple courses. While the topic of problem solving is fascinating, in BIS2A we aren’t interested in teaching the theories of problem solving per se. However, we ARE interested in teaching students skills that are applicable to solving most types of problems, giving students an opportunity to practice these skills, and assessing whether or not they are improving their problem-solving abilities.

Note: Since we are asking you to think explicitly about problem solving, it is fair to expect that your ability to do so will be evaluated on exams. Do not be surprised by this. We are going to incorporate problem solving into the class in a number of different ways:

  • We will be explicitly teaching elements of problem solving in class.
  • We will have some questions on the study guides that encourage problem solving.
  • We will make frequent use of the pedagogical tool we call the “Design Challenge” to help structure our discussion of the topics we cover in class.

When we are using the Design Challenge in class, we are working on problem solving. Within the context of the Design Challenge, your instructor may also present other specific concepts related to problem solving, like decision-making. Slides will be marked explicitly to prompt you to think about problem solving. Your instructor will also remind you verbally on a regular basis.

Instructions: The following problems have multiple choice answers. Correct answers are reinforced with a brief explanation. Incorrect answers are linked to tutorials to help solve the problem.

The Biology Project, Department of Biochemistry and Molecular Biophysics, University of Arizona. October 1, 1998; revised November 2004.

MIT OpenCourseWare: Introductory Biology (7.013) Problem Sets

Instructors:

  • Prof. Hazel Sive
  • Prof. Tyler Jacks
  • Dr. Diviya Sinha

Topics: Biochemistry, Cell Biology, Developmental Biology, Molecular Biology

In this section, Dr. Diviya Sinha describes how she and the course team create and grade the problem sets for 7.013.

In this course, students are assigned seven long problem sets, roughly one for every two weeks. These problem sets are designed to get students to understand and apply lecture material in order to solve problems. For more discussion about our goals for problem sets, please read Prof. Hazel Sive’s comments on Teaching Students to Solve Problems . This section focuses on how we create the problem sets and their solution keys, and how we grade problem sets.

Developing Problem Set Questions

"I often look at news articles to help me develop the problem set questions."

For each problem set, I think about the key concepts we want to cover and work on developing questions. I often look at news articles to help me develop the problem set questions. For example, we have three lectures on cell biology in 7.013. I came across an article on the MIT homepage titled, “Living cells say: Can you hear me now?” The article discusses how a cell communicates with another neighboring cell. It was written in a very nice way, and the work was done by an MIT professor, so we incorporated it into problem set 4 and problem set 5 . A lot of the problems are similarly grounded in very current research.

Prof. Sive and Prof. Jacks often send me ideas and problems that they would like to include. For example, for the first problem set this semester, Prof. Jacks suggested that we include a question about the DNA that is present in all the cells of just one person: How many trips can you take to the sun with the length of all the DNA present in one person’s body? We incorporated this as a question, and the students really liked it.
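As a hedged back-of-envelope answer to that question: the cell count, DNA length per cell, and Earth-sun distance below are common textbook estimates, not values from the course.

```python
# Rough estimate: total DNA length in one human body vs. the Earth-sun distance.
DNA_PER_CELL_M = 2.0     # ~2 m of DNA per diploid human cell (assumed)
CELLS_PER_BODY = 3.7e13  # ~37 trillion cells (common estimate)
EARTH_SUN_M = 1.496e11   # one astronomical unit, in meters

total_dna_m = DNA_PER_CELL_M * CELLS_PER_BODY
trips = total_dna_m / EARTH_SUN_M
print(f"about {trips:.0f} one-way trips to the sun")
```

With these inputs the answer comes out in the hundreds of trips; the point of the exercise is the estimation, not the exact figure.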

I am always in touch with the professors when I’m developing the course materials, and they offer feedback promptly.

Problem Set Solutions

For each problem set, we have an extra staff meeting. When I write problem sets, I eventually become sort of immune to any issues the problem sets might have because I’m so familiar with them. The TAs and I each write out a complete set of answers for all the questions on every problem set. For the most part, the TAs’ answers are the same as mine, but once in a while their solutions have parts that aren’t in my prepared solution key. So, the TAs’ answers can help me understand how I need to improve or add onto each problem set and solution key. I find it very helpful.

Grading responsibilities are split between TAs and undergraduate graders (students who did well in the course in past semesters). If a graduate TA is teaching two recitation sections, the TA might grade the problem sets for one of the recitation sections, and the undergraduate grader might grade the problem sets for the other. For the next problem set, they switch. This way, the TA gets a picture of how all of the students are doing without having to shoulder all of the grading responsibilities.

This is a big class, and we strive to be consistent when it comes to grading. To promote consistency, the graders, TAs, and I maintain a shared Google document. If a grader or a TA comes across an answer on the problem set that is not part of the solution key and they want to accept it, then they post their comments on the Google document so everyone can see it. If it’s accepted by me or the TA who is in charge of that problem set, then everyone who comes across that same type of answer can accept it.



Problem-Solving in Biology Teaching: Students’ Activities and Their Achievement

Nataša Nikolić & Radovan Antonijević
Published 21 July 2023. Volume 22, pages 765–785 (2024).


Problem-solving is, by nature, a creative process which, by teaching through the implementation of research and discovery activities, allows students to create their knowledge, revise it and link it to broader systems. The aim of the research was to describe and analyse the process of solving biological problems through activities that are performed during the process of solving them, as well as to study how the implementation of these activities affects the level and quality of student achievement in biology. This study employed a quantitative method research strategy to describe the problem-solving process in biology teaching and determine student achievement. Data collection was by means of survey and testing. A Likert-scale survey and a biology knowledge test were constructed for the purposes of the research. For data analysis, descriptive statistics, factor analysis and the Pearson correlation coefficient were used. The data of eighth-grade students were collected from September 2016 to February 2017, in 72 schools in Serbia (565 students). The factor analysis confirmed that problem-solving activities could be grouped into the following five areas: (1) analysing and planning problem-solving; (2) discovering solution(s) to the problem; (3) problem-solving evaluation activities; (4) additional activities involving the discussion of the problem; (5) the degree of student independence in the process of discovering a solution to a problem. The results show that with the increasing frequency of the realisation of the research problem-solving activities, the achievement of students also increases. With regard to achievement quality, a positive but low correlation was found in all three domains—knowledge acquisition, understanding and application.


Ali, A. R., Toriman, M. E., & Gasim, M. B. (2014). Academic achievement in biology with suggested solutions in selected secondary schools in Kano State, Nigeria. International Journal of Education and Research, 2 (11), 215–224.

Google Scholar  

Anderson, J. R. (2005). Cognitive Psychology and Its Implications: Sixth Edition . Macmillan.

Aryulina, D., & Riyanto, R. (2016). A problem-based learning model in biology education courses to develop inquiry teaching competency of preservice teachers. Cakrawala Pendidikan, 35 (1), 47–57.

Antonijević, R., & Nikolić, N. (2019). The role of problem-oriented teaching in the process of the development of critical and creative thinking. In V. Orlović Lovren, J. Peeters & N. Matović (Eds.), Quality of education: Global development goals and local strategies (pp. 49–63). Institute for Pedagogy and Andragogy.

Bowden, E. M. (1997). The effect of reportable and unreportable hints on anagram solution and the Aha! experience. Consciousness & Cognition, 6 (4), 545–573.

Bruner, J., Goodnow, J., & Austin, A. (1956). A study of thinking . Wiley.

Cai, J., & Brook, M. (2006). Looking back in problem-solving. Mathematics Teaching, 196 , 42–45.

Chin, C., & Chia, L. G. (2006). Problem-based learning: Using ill-structured problems in biology project work. Science Education, 90 (1), 44–67.

Chin, C., & Osborne, J. (2008). Students’ questions: A potential resource for teaching and learning science. Studies in Science Education, 44 (1), 1–39.

Costa, V., & Sarmento, R. P. (2019). Confirmatory factor analysis: A case study. Retrieved from

DeVellis, R. F. (1991). Scale development: Theory and applications . Sage Publications Inc.

Dunlap, J. C. (2005). Problem-based learning and self-efficacy: How a capstone course prepares students for a profession. Educational Technology Research and Development, 53 (1), 65–85.

Etherington, M. B. (2011). Investigative primary science: A problem-based learning approach. Australian Journal of Teacher Education, 36 (9), 53–74.

Gagné, E. D., Yekovich, C. W., & Yekovich, F. R. (1993). The cognitive psychology of school learning (2nd ed.). HarperCollins College Publishers.

Gbore, L. O., & Daramola, C. A. (2013). Relative contributions of selected teachers’ variables and students’ attitudes toward academic achievement in biology among senior secondary schools students in Ondo State. Nigeria. Current Issues in Education, 16 (1), 1–11.

Gijbels, D., Dochy, F., Van den Bossche, P., & Segers, M. (2005). Effects of problem-based learning: A meta-analysis from the angle of assessment. Review of Educational Research, 75 (1), 27–61.

Harun, N. G., Yusof, K. H., Jamaludin, M. Z., Helmi, S. A., & Hassan, S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences, 56 (2012), 233–242.

Hoskinson, A. M., Caballero, M. D., & Knight, J. K. (2013). How can we improve problem-solving in undergraduate biology? Applying lessons from 30 years of physics education research. CBE Life Sciences Education, 12 (2), 153–161.

Hurst, R. W., & Milkent, M. M. (1996). Facilitating successful prediction problem-solving in biology through application of skill theory. Journal of Research in Science Teaching, 33 (5), 541–552.

Jamari, D., Mohamed, H., Abdullah, Z., Mohd Zaid, N., & Aris, B. (2018). Biology problem-solving: The high achiever students. European Proceedings of Social and Behavioural Sciences, 68 , 831–842.

Kapa, E. (2001). A metacognitive support during the process of problem solving in a computerized environment. Educational Studies in Mathematics, 47 (3), 317–336.

Kirui, J. M., & Kaluyu, V. (2018). Influence of selected psychosocial factors on learners’ performance in science subjects: A case of public secondary schools in Moyale Sub-County, Kenya. International Journal of Education and Research, 6 (1), 15–28.

Kolber, B. J. (2011). Extended problem-based learning improves scientific communication in senior biology students. Journal of College Science Teaching, 41 (1), 32–39.

Meiring, S. P. (1980). Problem solving. A basic mathematics goal, parts 1 and 2 . GLC Publishers, Agincourt.

Montague, M. (2005). Math problem solving for upper elementary students with disabilities (p. 8). The Access Center: Improving Outcomes for All Students K.

Nehm, R. H. (2010). Understanding undergraduates’ problem-solving processes. Journal of Microbiology & Biology Education, 11 (2), 119–122.

Okoye, N. S., & Okechukwu, R. N. (2006). The effect of concept mapping and problem-solving teaching strategies on achievement in genetics among Nigerian Secondary School Students. African Journal of Educational Studies in Mathematics and Sciences, 4 , 93–98.

Polya, G. (1973). How to solve it: A new aspect of mathematical method  (2nd ed.). Princeton University Press.

Ranjanie, B. (2017). Impact of problem-based learning on teaching biology for higher secondary students. International Journal of Current Research, 9 (12), 62932–62934.

Sabella, M. S., & Redish, E. F. (2007). Knowledge organization and activation in physics problem solving. American Journal of Physics, 75 (11), 1017–1029.

Squires, J. E., Estabrooks, C. A., Newburn-Cook, C. V., & Gierl, M. (2011). Validation of the conceptual research utilization scale: An application of the standards for educational and psychological testing in healthcare. BMC Health Services Research, 11 (1), 1–14.

Stanisavljević, J. D., & Đurić, D. Z. (2012). Efekat primene problemske nastave biologije na trajnost i kvalitet stečenih znanja [The effect of the application of problem-based biology teaching on the durability and quality of acquired knowledge]. Uzdanica, 9 (1), 303–312.

Sungur, S., Tekkaya, C., & Geban, O. (2006). Improving achievement through problem-based learning. Journal of Biological Education, 40 (4), 155–160.

Syafii, W., & Yasin, M. R. (2013). Problem-solving skills and learning achievements through problem-based module in teaching and learning biology in high school. Asian Social Science, 9 (12), 220–228.

Thakur, P., & Dutt, S. (2017). Problem-based learning in biology: Its effect on achievement motivation of students of 9th standard. International Journal of Multidisciplinary Education and Research, 2 (2), 99–104.

Trauth Nare, A., Buck, G., & Beeman-Cadwallader, N. (2016). Promoting student agency in scientific inquiry: A self-study of relational pedagogical practices in science teacher education. In G. Buck & V. Akerson (Eds). Allowing our professional knowledge of pre-service science teacher education to be enhanced by self-study research: Turning a critical eye on our practice (pp. 43–67). Springer.

Woolfolk, A., Hughes, M., & Walkup, V. (2013). Psychology in education (2nd ed.). Pearson Education.

Xun, G., & Land, S. M. (2004). A conceptual framework for scaffolding ill-structured problem-solving processes using question prompts and peer interactions. Educational Technology Research and Development, 52 (2), 5–22.

Author information

Authors and Affiliations

Department of Pedagogy and Andragogy, Faculty of Philosophy, University of Belgrade, Belgrade, Serbia

Nataša Nikolić & Radovan Antonijević

Corresponding author

Correspondence to Nataša Nikolić.

About this article

Nikolić, N., & Antonijević, R. Problem-Solving in Biology Teaching: Students’ Activities and Their Achievement. International Journal of Science and Mathematics Education, 22, 765–785 (2024).

Received: 29 June 2021

Accepted: 07 July 2023

Published: 21 July 2023

Issue Date: April 2024


  • Biology teaching
  • Problem-solving
  • Problem-solving activities
  • Students’ achievement

Successful Problem Solving in Genetics Varies Based on Question Content

  • Jennifer S. Avena
  • Betsy B. McIntosh
  • Oscar N. Whitney
  • Ashton Wiens
  • Jennifer K. Knight

Department of Molecular, Cellular, and Developmental Biology

School of Education, University of Colorado Boulder, Boulder, CO 80309

Department of Applied Mathematics, University of Colorado Boulder, Boulder, CO 80309

*Address correspondence to: Jennifer Knight ([email protected]).

Problem solving is a critical skill in many disciplines but is often a challenge for students to learn. To examine the processes both students and experts undertake to solve constructed-response problems in genetics, we collected the written step-by-step procedures individuals used to solve problems in four different content areas. We developed a set of codes to describe each cognitive and metacognitive process and then used these codes to describe more than 1800 student and 149 expert answers. We found that students used some processes differently depending on the content of the question, but reasoning was consistently predictive of successful problem solving across all content areas. We also confirmed previous findings that the metacognitive processes of planning and checking were more common in expert answers than student answers. We provide suggestions for instructors on how to highlight key procedures based on each specific genetics content area that can help students learn the skill of problem solving.


The science skills of designing and interpreting experiments, constructing arguments, and solving complex problems have been repeatedly called out as critical for undergraduate biology students to master ( American Association for the Advancement of Science, 2011 ). Yet each of these skills remains elusive for many students, particularly when the skill requires integrating and evaluating multiple pieces of information ( Novick and Bassok, 2005 ; Bassok and Novick, 2012 ; National Research Council, 2012 ). In this paper, we focus on describing the steps students and experts take while solving genetics problems and determining whether the use of certain processes increases the likelihood of success.

The general process of solving a problem has been described as building a mental model in which prior knowledge can be used to represent ways of thinking through a problem state ( Johnson-Laird, 2010 ). Processes used in problem solving have historically been broken down into two components: those that use domain-general knowledge and those that use domain-specific knowledge. Domain-general knowledge is defined as information that can be used to solve a problem in any field, including such strategies as rereading and identifying what a question is asking ( Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ). Although such steps are important, they are unlikely to be the primary determinants of success when specific content knowledge is required. Domain-specific problem solving, on the other hand, is a theoretical framework that considers one’s discipline-specific knowledge and processes used to solve a problem (e.g., Prevost and Lemons, 2016 ). Domain-specific knowledge includes declarative (knowledge of content), procedural (how to utilize certain strategies), and conditional knowledge (when and why to utilize certain strategies) as they relate to a specific discipline ( Alexander and Judy, 1988 ; Schraw and Dennison, 1994 ; Prevost and Lemons, 2016 ).

Previous studies on problem solving within a discipline have emphasized the importance of domain-specific declarative and conditional knowledge, as students need to understand and be able to apply relevant content knowledge to successfully solve problems ( Alexander et al. , 1989 ; Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ). Our prior work ( Avena and Knight 2019 ) also supported this necessity. After students solved a genetics problem within a content area, they were offered a content hint on a subsequent content-matched question. We found that content hints improved performance overall for students who initially did not understand a concept. In characterizing the students’ responses, we found that the students who benefited from the hint typically used the content language of the hint in their solution. However, we also found that some students who continued to struggle included the content language of the hint but did not use the information in their problem solutions. For example, in solving problems on predicted recombination frequency for linked genes, an incorrect solution might use the correct terms of map units and/or recombination frequency but not actually use map units to solve the problem. Thus, these findings suggest that declarative knowledge is necessary but not sufficient for complex problem solving and also emphasize the importance of procedural knowledge, which includes the “logic” of generating a solution ( Avena and Knight, 2019 ). By definition, procedural knowledge uses both cognitive processes, such as providing reasoning for a claim or executing a task, and metacognitive processes, such as planning how to solve a problem and checking (i.e., evaluating) one’s work (e.g., Kuhn and Udell, 2003 ; Meijer et al. , 2006 ; Tanner, 2012 ). We explore these processes in more detail below.

Cognitive Processing: Reasoning

Generating reasoning requires using one’s knowledge to search for and explain an appropriate set of ideas to support or refute a given model ( Johnson-Laird, 2010 ), so reasoning is likely to be a critical component of solving problems. Toulmin’s original scheme for building a scientific argument ( Toulmin, 1958 ) included generating a claim, identifying supporting evidence, and then using reasoning (warrant) to connect the evidence to the claim. Several studies have demonstrated a positive relationship between general reasoning “ability” ( Lawson, 1978 ), defined as the ability to construct logical links between evidence and conclusions using conceptual principles, and performance ( Cavallo, 1996 ; Cavallo et al. , 2004 ; Johnson and Lawson, 1998 ). As elaborated in more recent literature, there are many specific subcategories of reasoning. Students commonly use memorized patterns or formulas to solve problems: this approach is considered algorithmic and could be used to provide logic for a problem ( Jonsson et al. , 2014 ; Nyachwaya et al. , 2014 ). Such algorithmic reasoning may be used with or without conveying an understanding of how an algorithm is used ( Frey et al. , 2020 ). When an algorithm is not appropriate (or not used) in describing one’s reasoning, but instead the solver provides a generalized explanation of underlying connections, this is sometimes referred to as “explanatory” or “causal” reasoning ( Russ et al. , 2008 ). Distinct from causal reasoning is the domain-specific form of mechanistic reasoning, in which a mechanism of action of a biological principle is elaborated ( Russ et al. , 2008 ; Southard et al. , 2016 ). Another common form of reasoning is quantitative reasoning, which can also be described as statistical or, in other specialized situations, graph-construction reasoning (e.g., Deane et al. , 2016 ; Angra and Gardner, 2018 ). 
The detailed studies of these specific subcategories of reasoning have usually involved extensive interviews with students and/or very specific guidelines that prompt the use of a particular type of reasoning. Those who have explored students’ unprompted general use of reasoning have found that few students naturally use reasoning to support their ideas ( Zohar and Nemet, 2002 ; James and Willoughby, 2011 ; Schen, 2012 ; Knight et al. , 2015 ; Paine and Knight, 2020 ). However, with explicit training to integrate their knowledge into mental models ( Kuhn and Udell, 2003 ; Osborne, 2010 ) or with repeated cueing from instructors ( Russ et al. , 2008 ; Knight et al. , 2015 ), students can learn to generate more frequent, specific, and robust reasoning.

Metacognitive Processing

Successfully generating possible solutions to problems likely also involves metacognitive thinking . Metacognition is often separated into two components: metacognitive knowledge (knowledge about one’s own understanding and learning) and metacognitive regulation (the ability to change one’s approach to learning; Flavell, 1979 ; Jacobs and Paris, 1987 ; Schraw and Moshman, 1995 ). Metacognitive regulation is usually defined as including such processes as planning, monitoring one’s progress, and evaluating or checking an answer ( Flavell, 1979 ; Jacobs and Paris, 1987 ; Schraw and Moshman, 1995 ; Tanner, 2012 ). Several studies have shown that helping students use metacognitive strategies can benefit learning. For example, encouraging the planning of a possible solution beforehand and checking one’s work afterward helps students generate correct answers during problem solving (e.g., Mevarech and Amrany, 2008 ; McDonnell and Mullally, 2016 ; Stanton et al. , 2015 ). However, especially compared with experts, students rarely use metacognitive processes, despite their value ( Smith and Good, 1984 ; Smith, 1988 ). Experts spend more time orienting, planning, and gathering information before solving a problem than do students, suggesting that experts can link processes that facilitate generating a solution with their underlying content knowledge ( Atman et al. , 2007 ; Peffer and Ramezani, 2019 ). Experts also check their problem-solving steps and solutions before committing to an answer, steps not always seen in student responses ( Smith and Good, 1984 ; Smith, 1988 ). Ultimately, prior work suggests that, even when students understand content and employ appropriate cognitive processes, they may still struggle to solve problems that require reflective and regulative skills.

Theoretical Framework: Approaches to Learning

Developing domain-specific conceptual knowledge requires integrating prior knowledge and new disciplinary knowledge ( Schraw and Dennison, 1994 ). In generating conceptual knowledge, students construct mental models in which they link concepts together to generate a deeper understanding ( Johnson-Laird, 2001 ). These mental constructions involve imagining possible relationships and generating deductions and can be externalized into drawn or written models for communicating ideas ( Chin and Brown, 2000 ; Bennett et al. , 2020 ). Mental models can also trigger students to explain their ideas to themselves (self-explanation), which can also help them solve problems ( Chi et al. , 1989 ).

As our goal is to make visible how students grapple with their knowledge during problem solving, we fit this study into the approaches to learning framework (AtL: Chin and Brown, 2000 ). This framework, derived from detailed interviews of middle-school students solving chemistry problems, defines five elements of how students approach learning and suggests that these components promote deeper learning. Three of these elements are identifiable in the current study: engaging in explanations (employing reasoning through understanding and describing relationships and mechanisms), using generative thinking (application of prior knowledge and analogical transfer), and engaging in metacognitive activity (monitoring progress and modifying approaches). The remaining two elements: question asking (focusing on facts or on understanding) and depth of approaching tasks (taking a deep or a surface approach to learning: Biggs, 1987 ) could not be addressed in our study. However, previous studies showed that students who engage in a deep approach to learning also relate new information to prior knowledge and engage in reasoning (explanations), generate theories for how things work (generative thinking), and reflect on their understanding (metacognitive activity). In contrast, those who engage in surface approaches focus more on memorized, isolated facts than on constructing mental or actual models, demonstrating an absence of the three elements described by this framework. Biggs (1987) also previously provided evidence that intrinsically motivated learners tended to use a deep approach, while those who were extrinsically motivated (e.g., by grades), tended to use a surface approach. Because solving complex problems is, at its core, about how students engage in the learning process, these AtL components helped us frame how students’ learning is revealed by their own descriptions of their thinking processes.

Characterizing Problem-Solving Processes

Thus far, a handful of studies have investigated the processes adult students use in solving biology problems, and how these processes might influence their ability to develop reasonable answers ( Smith and Good, 1984 ; Smith, 1988 ; Nehm, 2010 ; Nehm and Ridgway, 2011 ; Novick and Catley, 2013 ; Prevost and Lemons, 2016 ; Sung et al. , 2020 ). In one study, Prevost and Lemons (2016) collected and analyzed students’ written documentation of their problem-solving procedures when answering multiple-choice questions. Students were taught to document their step-by-step thinking as they answered multiple-choice exam questions that ranged from Bloom’s levels 2 to 4 (understand to analyze; Bloom et al. , 1956 ), describing the steps they took to answer each question. The authors’ qualitative analyses of students’ documented problem solving showed that students frequently used domain-general test-taking skills, such as comparing the language of different multiple-choice distractors. However, students who correctly answered questions tended to use more domain-specific procedures that required knowledge of the discipline, such as analyzing visual representations and making predictions, than unsuccessful students. When students solved problems that required the higher-order cognitive skills of application and analysis, they also used more of these specific procedures than when solving lower-level questions. Another recent study explored how students solved exam questions on the genetic topics of recombination and nondisjunction through in-depth clinical interviews ( Sung et al. , 2020 ). These authors described two approaches that are not conceptual: using algorithms to bypass conceptual thinking and using non–biology specific test-taking strategies (e.g., length of answer, specificity of terminology). 
They also showed that students sometimes alternate between using an algorithm and a conceptual strategy, defaulting to the algorithm when they do not understand the underlying biological concept.

Research Question 1. How do experts and students differ in their descriptions of problem-solving processes? We address this question using a much larger sample size than in the previous literature (e.g., Chi et al. , 1981 ; Smith and Good, 1984 ; Smith, 1988 ; Atman et al. , 2007 ; Peffer and Ramezani, 2019 ).

Research Question 2. Are certain problem-solving processes more likely to be used in correct than in incorrect student answers?

Research Question 3. Do problem-solving processes differ based on content and are certain combinations of problem-solving processes associated with correct student answers for each content area?

Mixed-Methods Approach

This study used a mixed-methods approach, combining both qualitative and quantitative research methods and analysis to understand a phenomenon more deeply ( Johnson et al. , 2007 ). Our goal was to make student thinking visible by collecting written documentation of student approaches to solving problems (qualitative data), in addition to capturing answer correctness (quantitative data), and integrating these together in our analyses. The student responses serve as a rich and detailed data set that can be interpreted using the qualitative process of assigning themes or codes to student writing ( Hammer and Berland, 2014 ). In a qualitative study, the results of the coding process are unpacked using examples and detailed descriptions to communicate the findings. In this study, we share such qualitative results but also convert the coded results into numerical representations to demonstrate patterns and trends captured in the data. This is particularly useful in a large-scale study, because the output can be analyzed statistically to allow comparisons between categories of student answers and different content areas.
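
The conversion from qualitative codes to numerical data described above can be pictured as building a binary presence/absence matrix, one row per answer. The sketch below is purely illustrative: the code names and example data are invented for demonstration and are not the study's actual coding scheme.

```python
# Sketch: turn the qualitative process codes assigned to each answer into a
# binary presence/absence matrix suitable for statistical comparison.
# Code names and example data are hypothetical, not the study's scheme.

CODES = ["Plan", "Reason", "Draw", "Check", "Claim"]

def presence_matrix(coded_answers):
    """coded_answers: dict mapping answer id -> set of codes observed.

    Returns a dict mapping answer id -> 0/1 vector ordered as CODES.
    """
    matrix = {}
    for answer_id, codes in coded_answers.items():
        matrix[answer_id] = [1 if c in codes else 0 for c in CODES]
    return matrix

answers = {
    "student_1_q1": {"Reason", "Claim"},
    "student_2_q1": {"Plan", "Reason", "Check", "Claim"},
}
m = presence_matrix(answers)
# Each row can now be compared across answer categories or content areas.
```

Representing each answer this way is what allows the coded qualitative data to feed directly into counts, proportions, and the regression models described later.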

Students in this study were enrolled in an introductory-level undergraduate genetics course for biology majors at the University of Colorado in Spring 2017 ( n = 416). This course is the second in a two-course introductory series, with the first course being Introduction to Cell and Molecular Biology. The students were majority white, 60% female, and 63% were in their first or second year. Ninety percent of the students were majoring in biology or a biology-related field (neuroscience, integrative physiology, biochemistry, biomedical engineering). Of the students enrolled in the course, 295 students consented to be included in the study; some of the student responses have been previously described in the prior study ( Avena and Knight, 2019 ). We recruited experts from the Society for the Advancement of Biology Education Research Listserv by inviting graduate students, postdoctoral fellows, and faculty to complete an anonymous online survey consisting of the same questions that students answered. Of the responses received, we analyzed responses from 52 experts. Due to the anonymous nature of the survey, we did not collect descriptive data about the experts.

Problem Solving

As part of normal course work, students were offered two practice assignments covering four content areas related to each of two course exams (also described in Avena and Knight, 2019 ). Students could answer up to nine questions in blocks of three questions each, in randomized order, for three of the four content areas. Expert participants answered a series of four questions, one in each of the four content areas. All questions were offered online using the survey platform Qualtrics. All participants were asked to document their problem-solving processes as they completed the questions (as in Prevost and Lemons, 2016 ), and they were provided with written instructions and an example in the online platform only (see Supplemental Material); no instructions and no explicit discussion of the types of problem-solving processes to use were provided in class at any point during the semester. Students could receive extra credit up to ∼1% of the course point total, obtaining two-thirds credit for explaining their answer and an additional one-third if they answered correctly. All students who completed the assignment received credit regardless of their consent to participate in the research.

We used questions developed for a prior study ( Avena and Knight, 2019 ) on four challenging genetics topics: calculation of the probability of inheritance across multiple generations (Probability), prediction of the cause of an incorrect chromosome number after meiosis (Nondisjunction), interpretation of a gel and pedigree to determine inheritance patterns (Gel/Pedigree), and prediction of the probability of an offspring’s genotype using linked genes (Recombination; see example in Figure 1 ; all questions presented in Supplemental Material). These content areas have previously been shown to be challenging based on student performance ( Smith et al. , 2008 ; Smith and Knight, 2012 ; Avena and Knight, 2019 ). Each content area contained three isomorphic questions that addressed the same underlying concept, targeted higher-order cognitive processes ( Bloom et al. , 1956 ), and contained the same amount of information with a visual ( Avena and Knight, 2019 ). Each question had a single correct answer and was coded as correct (1) or incorrect (0). For each problem-solving assignment, we randomized 1) the order of the three questions within each content area for each student and 2) the order in which each content area was presented. During each set of three isomorphic questions, while solving one of the isomorphic problems, students also had the option to receive a “content hint,” a single most commonly misunderstood fact for each content area. We do not discuss the effects of the content hints in this paper (instead, see Avena and Knight, 2019 ).

FIGURE 1. Sample problem for students from the Gel/Pedigree content area. Problems in each content area contain a written prompt and an illustrated image, as shown in this example.

Process Coding

Students may engage in processes that they do not document in writing, but we are limited to analyzing only what they do provide in their written step-by-step descriptions. For simplicity, throughout this paper, a “process” is a thought documented by the participant that is coded as a particular process. When we refer to “failure” to use a process, we mean that a participant did not describe this thought process in the answer. Our initial analysis of student processes used a selection of codes from Prevost and Lemons (2016) and Toulmin’s ( 1958 ) original codes of Claim and Reason. We note that all the problems we used can potentially be solved using algorithms, memorized patterns previously discussed and practiced in the class, which may have limited the reasoning students supplied. Because of the complexity of identifying different types of reasoning, we did not further subcategorize the reasoning category in the scheme we present, as this is beyond the scope of this paper. We used an emergent coding process ( Saldana, 2015 ) to identify additional and different processes, including both cognitive and metacognitive actions. Thus, our problem-solving processes (PsP) coding scheme captures the thinking that students document while solving genetics problems (see individual process codes in Table 1 ). We used HyperRESEARCH software (ResearchWare, Inc.) to code each individual’s documented step-by-step processes. A step was typically a sentence and sometimes contained multiple ideas. Each step was given one or more codes, with the exception of reasoning supporting a final conclusion (see Table 2 for examples of coded responses). Each individual process code captures when the student describes that process, regardless of whether the statement is correct or incorrect. Four raters (J.K.K., J.S.A., O.N.W., B.B.M.) coded a total of 24 student answers over three rounds of coding and discussion to reach consensus and identify a final coding scheme. 
Following agreement on the codes, an additional 12 answers were coded by the four raters to determine interrater agreement. Specifically, in these 12 answers, there were 150 instances in which a code for a step was provided by one or more raters. For each of these 150 instances, we identified the number of raters who agreed. We then calculated a final interrater agreement of 83% by dividing the total number of raters who agreed for all 150 instances (i.e., 524) by the total number of possible raters to agree for four raters in 150 instances (i.e., 600). We excluded answers in which students did not describe their problem-solving steps and those in which students primarily or exclusively used domain-general processes (i.e., individual process codes within the General strategy category in Table 1 ) or made claims without any other supporting codes. The latter two exclusion criteria were used because such responses lacked sufficient description to identify the thought processes. The final data set included a total of 1853 answers from 295 students and 149 answers from 52 experts. We used only correct answers from experts to serve as a comparison to student answers, excluding an additional 29 expert answers that were incorrect.
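
The percent-agreement calculation described above (the total number of agreeing rater judgments divided by the total number possible) can be sketched as follows; the example counts are hypothetical, not the paper's data.

```python
def percent_agreement(agreeing_per_instance, n_raters):
    """Pooled percent agreement over all coded instances.

    agreeing_per_instance: list giving, for each coded instance, the number
    of raters (out of n_raters) who agreed on the code assigned.
    """
    total_agreed = sum(agreeing_per_instance)
    total_possible = n_raters * len(agreeing_per_instance)
    return total_agreed / total_possible

# Hypothetical example: 3 coded instances, 4 raters; all four raters agreed
# on two instances, and three of four agreed on the third.
rate = percent_agreement([4, 4, 3], 4)  # 11 agreeing judgments out of 12
```

The same formula covers the later drawing-use check, where each answer counts as one instance.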

aExamples of student responses are drawn from a variety of content areas and have been edited for clarity. Each individual process code captures the student’s description, regardless of whether the statement is correct or incorrect.

aThe responses above are all solutions to the question in Figure 1.

After initial coding and analyses, we identified that student use of drawing was differentially associated with correctness based on content area. Thus, to further characterize drawing use, two raters (J.S.A. and J.K.K.) explored incorrect student answers from Probability and Recombination. One rater examined 33 student answers to identify an initial characterization, and then two raters reviewed a subset of answers to agree upon a final scheme. Each rater then individually categorized a portion of the student answers, and the final interrater agreement on 10 student answers was 90%. Interrater agreement was calculated as described earlier, with each answer serving as one instance, so we divided the total number of raters agreeing for each answer (i.e., 18) by the total possible number of raters agreeing (i.e., 20).

Statistical Analyses

The unit of analysis for all models considered is an individual answer to a problem. We investigate three variations of linear models, specified below. The response variable in all cases is binary (presence/absence of a process, or correct/incorrect answer). Thus, the models are generalized linear models and, more specifically, logistic regression models. Because our data contain repeated measures in the form of multiple answers per student, we use generalized linear mixed models (GLMMs) that include a random effect on the intercept term in all models, grouped by participant identifier (Gelman and Hill, 2006; Theobald, 2018). This component of the model accounts for variability in the baseline outcome between participants. In our case, we can model each student’s baseline probability of answering a problem correctly or each participant’s baseline probability of using a given process (e.g., one student may use Reason more frequently than another student). Accounting for this variation yields better estimates of the fixed effects in the models.
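As a small illustration of why the random intercept matters (a synthetic sketch, not the study’s data or fitted model), the following simulation gives each student their own baseline log odds of success; the resulting spread in per-student success rates exceeds what a single shared baseline would produce, and this between-student variability is what the random intercept absorbs.

```python
import math
import random

random.seed(42)

def inv_logit(x):
    """Convert log odds to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Simulate 100 students, each answering 8 problems. Each student gets a
# baseline log odds drawn from N(0, 1); this per-student baseline is what
# the random intercept in a GLMM estimates.
n_students, n_answers = 100, 8
rates = []
for _ in range(n_students):
    baseline = random.gauss(0.0, 1.0)   # student-specific intercept
    p = inv_logit(baseline)             # that student's success probability
    correct = sum(random.random() < p for _ in range(n_answers))
    rates.append(correct / n_answers)

mean_rate = sum(rates) / len(rates)
var_rate = sum((r - mean_rate) ** 2 for r in rates) / len(rates)
# Under a single shared intercept, the variance of per-student rates would
# be at most p(1 - p) / 8 = 0.03125; the random intercepts push it higher.
print(round(mean_rate, 3), round(var_rate, 3))
```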


The fitted models give some, but not all, pairwise comparisons among predictor groups. We conducted pairwise post hoc comparisons (e.g., expert vs. correct student, expert vs. incorrect student, correct student vs. incorrect student, or among the four content areas) to draw inferences about the differences among all groups. In particular, we performed Tukey pairwise honestly significant difference (HSD) tests for all pairs of groups, comparing estimated marginal means (estimated using the fitted model) on the logit scale. Using estimated marginal means corrects for unbalanced group sample sizes, and using the Tukey HSD test provides adjusted p values, facilitating comparison to a significance level of α = 0.05.

To ease reproducibility, we specify our models using the “formula” notation conventional in R, which has the following general form: outcome = fixed effect + (1 | group). The random-effects component is specified within parentheses, with the random effect to the left of the vertical bar and the grouping variable to the right.

Model 1: Process present = Expert/Student answer status + (1| ID)

where “Process present” is the binary response variable: process absent (0)/present (1) in an answer; “Expert/Student answer status” is the fixed effect with levels expert, correct student, and incorrect student; and “(1|ID)” is a random intercept grouped by participant identifier.

Model 2: Process present = Content area + (1| ID)

where “Process present” is the response variable as described for model 1; “Content area” is the fixed effect with levels Probability (1), Nondisjunction (2), Gel/Pedigree (3), and Recombination (4); and “(1|ID)” is the random effect as described for model 1.

Model 3: Student answer correctness = Process 1 + Process 2 + … + Process X + (1| ID)

where “Student answer correctness” is the response variable: incorrect (0)/correct (1); “Process 1 + Process 2 + … + Process X” is the list of process factors entered into the model as fixed effects: absent (0)/present (1); and “(1|ID)” is the random effect as described for models 1 and 2.

We identified which processes were associated with correctness by determining which predictor coefficients remained nonzero in a representative lasso model. We identified a representative model for each content area by first finding the lasso penalty with the lowest Akaike information criterion (AIC) to reduce variance and then choosing a penalty with a similar AIC that could be used across all content areas. Because a penalty parameter of 25 yielded an AIC similar to the lowest-AIC penalty for each content area, we consistently used a penalty parameter of 25. Note that when the penalty parameter is set to zero, the GLMM model is recovered; when the penalty parameter is very large, no predictors are included in the model. Thus, the selected penalty parameter forced many, but not all, coefficients to 0, giving a single representative model for each content area.
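The authors fit this model with the glmmLasso R package. As a language-agnostic sketch of how an L1 penalty forces weak predictors’ coefficients to exactly zero, the following contrasts a large and a small penalty; it uses illustrative synthetic data and plain proximal-gradient (ISTA) logistic regression without a random effect, so it is not the authors’ model.

```python
import math
import random

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink z toward 0 by t."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_logistic(X, y, lam, lr=0.5, iters=1000):
    """L1-penalized logistic regression via proximal gradient descent.
    Returns (intercept, coefficients). Larger lam zeroes more coefficients."""
    n, d = len(X), len(X[0])
    b0, b = 0.0, [0.0] * d
    for _ in range(iters):
        g0, g = 0.0, [0.0] * d
        for xi, yi in zip(X, y):              # gradient of mean log-loss
            z = b0 + sum(w * x for w, x in zip(b, xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            g0 += err / n
            for j in range(d):
                g[j] += err * xi[j] / n
        b0 -= lr * g0                         # intercept is not penalized
        b = [soft_threshold(bj - lr * gj, lr * lam) for bj, gj in zip(b, g)]
    return b0, b

# Illustrative data: the outcome depends on feature 0 only; 1-3 are noise.
random.seed(0)
X = [[random.random() for _ in range(4)] for _ in range(200)]
y = [1 if row[0] > 0.5 else 0 for row in X]

_, b_strong_pen = lasso_logistic(X, y, lam=0.5)   # large penalty: all zeroed
_, b_weak_pen = lasso_logistic(X, y, lam=0.01)    # small penalty: some survive
print(sum(w != 0 for w in b_strong_pen), sum(w != 0 for w in b_weak_pen))
```

This mirrors the behavior described in the text: a very large penalty leaves no predictors in the model, while a small penalty recovers something close to the unpenalized fit.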

All models and tests were performed in R (v. 3.5.1). We used the lme4 package in R (Bates et al., 2015) for models 1 and 2, and estimation of parameters was performed using residual maximum likelihood. For model 3, we used the glmmLasso package, and the model was fit using the default EM-type estimate. Post hoc pairwise comparisons were performed using the emmeans package.

Human Subjects Approval

Human research was approved by the University of Colorado Institutional Review Board (protocols 16-0511 and 15-0380).

The PsP Coding Scheme Helps Describe Written Cognitive and Metacognitive Processes

We developed a detailed set of codes, which we call the PsP scheme, to characterize how individuals describe their solutions to complex genetics problems. Table 1 shows the 18 unique processes along with descriptions and examples for each. With the support of previous literature, we grouped the individual processes into seven strategies, also shown in Table 1. All strategies characterized in this study were domain specific except the General category, which is domain general. We categorized a set of processes as Orientation based on a previously published taxonomy for think-aloud interviews (Meijer et al., 2006) and on information management processes from the Metacognitive Awareness Inventory (Schraw and Dennison, 1994). Orienting processes include: Notice (identifying important information in the problem), Recall (activating prior knowledge without applying it), Identify Similarity (among question types), and Identify Concept (the “type” of problem). Orientation processes are relatively surface level, in that information is observed and noted, but not acted on. The Metacognition category includes the three common elements of planning (Plan), monitoring (Assess Difficulty), and evaluating (Check) cited in the metacognitive literature (e.g., Schraw and Moshman, 1995; Tanner, 2012). The Execution strategy includes actions taken to explicitly solve the problem, including Use Information (apply information related to the problem), Integrate (i.e., linking together two visual representations provided to solve the problem or linking a student’s own drawing to information in the problem), Draw, and Calculate. The Use Information category is distinguished from Recall by a student applying a piece of information (Use Information) rather than just remembering a fact without directly using it in the problem solution (Recall). Students may Recall and then Use Information, just Recall, or just Use Information. If a student used the Integrate process, Use Information was not also coded (i.e., Integrate supersedes Use Information).

The Reasoning strategy includes just one general process of Reason, which we define as providing an explanation or rationale for a claim, as previously described in Knight et al. (2013), Lawson (2010), and Toulmin (1958). The Conclusion strategy includes Eliminate and Claim, processes that provide types of responses to address the final answer. The single process within the Error strategy category, Misinterpret, characterizes steps in which students misunderstand the question stem. Finally, the General category includes the codes Clarify, State the Process, and Restate, all of which are generic statements of execution, representing processes that are domain general (Alexander and Judy, 1988; Prevost and Lemons, 2016).

To help visualize the series of steps students took and how these steps differed across answers and content areas, we provide detailed examples in Tables 2 and 3. In Table 2, we provide three examples of similar-length documented processes for the same Gel/Pedigree problem (Figure 1) from a correct expert, a correct student, and an incorrect student. Note the multiple uses of planning and reasoning in the expert answer, the multiple uses of reasoning in the correct student answer, and the absence of both processes in the incorrect student answer. The reasoning used in each case provides a logical explanation for the claim, which either immediately precedes or follows the reasoning statement. For example, in the second instance of Claim and Reason for Eliot, “because otherwise Zach could not be unaffected” is a logical explanation for the claim “it has to be dominant.” Similarly, for Cassie’s Claim and Reason code, “If both parents are heterozygous for the disease” is a logical explanation for the claim “it is probably inherited in a dominant manner.” Table 3 provides additional examples of correct student answers for the remaining three content areas. Note that for Probability and Recombination questions, the Reason process often explains why a certain genotype or probability is assigned (e.g., “otherwise all or none of the children would have the disease” explains why “Both parents of H and J must be Dd” in Li’s Probability answer) or how a probability is calculated, for example, “using the multiplication rule” (Li’s Probability explanation) or “multiply that by the 100% chance of getting ‘af’ from parent 2” (Preston’s Recombination explanation). In Nondisjunction problems, a student may claim that a nondisjunction occurred in a certain stage of meiosis (the Claim) because it produces certain gamete genotypes consistent with such an error (the Reason), as seen in Gabrielle’s answer.

aResponses edited slightly for clarity. See Table 2 for a correct student documented solution to the Gel/Pedigree problem.

Across All Content Areas, Expert Answers Are More Likely Than Student Answers to Contain Orientation, Metacognition, and Execution Processes

For each category of answers (expert, correct student, and incorrect student), we calculated the overall percentage of answers that contained each process and compared these frequencies. Note that, in all cases, frequency represents the presence of a process in an answer, not a count of all uses of that process in an answer. The raw frequency of each process is provided in Table 4, columns 2–4. To determine statistical significance, we used GLMMs to account for individual variability in process use. The predicted likelihood of each process per group and the pairwise comparisons between groups from this analysis are provided in Table 4, columns 5–10. These comparisons show that expert answers were significantly more likely than student answers to contain the processes of Identify Concept, Recall, Plan, Check, and Use Information (Table 4 and Supplemental Table S1). The answers in Table 2 represent some of the typical trends identified for each group. For example, expert Eliot uses both Plan and Check, but these metacognitive processes are not used by either student, Cassie (correct answer) or Ian (incorrect answer).

aPairwise comparison: incorrect students to correct students (i–c), incorrect students to correct experts (i–e), correct students to correct experts (c–e). NA, no comparison made due to predicted probability of 0 in at least one group. *** p < 0.001; ** p < 0.01; * p < 0.05; ns: p > 0.05. See Supplemental Table S1 for standard error of coefficient estimates. Interpretation example: 82.05% and 92.36% of incorrect and correct student answers, respectively, contained Reason. The GLMM, after accounting for individual variability, predicts the probability of an incorrect student using Reason to be 91.80%, while the probability of a correct student using Reason is 96.68%.

Across All Content Areas, Correct Student Answers Are More Likely Than Incorrect Answers to Contain the Processes of Reason and Eliminate

Students most commonly used the processes Use Information, Reason, and Claim, each present in at least 50% of both correct and incorrect student answers (Table 4). The processes Notice, Recall, Calculate, and Clarify were present in 20–50% of both correct and incorrect student answers (Table 4). In comparing correct and incorrect student answers across all content areas, we found that Integrate, Reason, Eliminate, and Clarify were more likely to be used in correct than in incorrect answers (Table 4). As illustrated in Table 2, the problem-solving processes in Cassie’s correct answer include reasoning for a claim of dominant inheritance and eliminating when ruling out the possibility of an X-linked mode of inheritance. In describing the incorrect answer, however, Ian documents neither of these processes.

Process Use Varies by Question Content

To determine whether student answers contain different processes depending on the content of the problem, we separated answers, regardless of correctness, by content area. We excluded several processes from this analysis: the Error and General codes; Claim, which was present in virtually every answer across content areas; and the rarely used processes Identify Similarity and Identify Concept, which were present in 5% or fewer of both incorrect and correct student answers. For the remaining 11 processes, we found that each content area elicited different frequencies of use, as shown in Table 5 and Supplemental Table S2. Some processes were nearly absent in a content area: Calculate was rarely seen in answers to Nondisjunction and Gel/Pedigree questions, and Eliminate was rarely seen in answers to Probability and Recombination questions. Furthermore, in answering Probability questions, students were more likely to use the processes Plan and Use Information than in any other content area. Recall was most likely in Recombination and least likely in Gel/Pedigree. Examples of student answers showing some of these trends are shown in Table 3.

aAll student answers (correct and incorrect) are reported. Processes excluded from analyses include Claim, those within the Error and General strategies, and processes that were present in 5% or fewer of both incorrect and correct student answers. Pairwise comparisons between: Probability (P), Recombination (R), Nondisjunction (N), and Gel/Pedigree (G). NA: no comparison made due to prevalence of 0% in at least one group. *** p < 0.001; ** p < 0.01; * p < 0.05; ns: p > 0.05. See Supplemental Table S2 for standard errors of coefficient estimates. Interpretation example: In Probability questions, 94.43% of answers contain Reason, while in Nondisjunction, 84.69% of answers contain Reason. Based on GLMM estimates that account for individual variability in process use, an answer to a Probability question had a 97.52% predicted probability of containing Reason, and an answer to a Nondisjunction question had a 92.88% predicted probability of containing this process.

The Combination of Processes Linked to Correctness Differs by Content Area

Performance varied by content area. Students performed best on Nondisjunction problems (75% correct), followed by Gel/Pedigree (73%), Probability (54%), and then Recombination (45%). Table 6 shows the raw data of process prevalence for correct and incorrect student answers in each of the four content areas. To examine the combination of problem-solving processes associated with correct student answers for each content area, we used a representative GLMM with a lasso penalty. This type of analysis measures the predictive value of a process on answer correctness, returning a coefficient value. The presence of a factor with a higher positive coefficient increases the probability of answering correctly more than a factor with a lower positive coefficient. With each additional positive factor in the model, the likelihood of answering correctly increases in an additive manner on the log-odds scale (Table 7 and Supplemental Table S3). To interpret these values, we show the probability estimates (%) for each process, which represent the probability that an answer will be correct in the presence of one or more processes (Table 7). The strength of association of each process with correctness, measured by positive coefficient size, is listed in descending order. Thus, for each content area, the process with the strongest positive association with a correct answer is listed first. A process with a negative coefficient (a negative association with correctness) is listed last, and models with negative associations are highlighted in gray in Table 7. An example of how to interpret the GLMM is as follows: for the content area of Probability, Calculate (strongest association with correctness), Use Information, and Reason (weakest association with correctness) in combination are positively associated with correctness; Draw is the only negative predictor of correctness.
For this content area, the intercept indicates a 7.31% likelihood of answering correctly in the absence of any of the processes tested. If an answer contains Calculate only, there is a 40.19% chance the answer will be correct. If an answer contains both Calculate and Use Information, there is a 58.60% chance the answer will be correct, and if the answer contains the three processes of Calculate, Use Information, and Reason combined, there is a 67.56% chance the answer will be correct. If Draw is present in addition to these three processes, the chance the answer will be correct slightly decreases to 66.40%. For Recombination, the processes of Calculate, Recall, Use Information, Reason, and Plan in combination are associated with correctness, and Draw and Assess Difficulty are negatively associated with correctness. For Nondisjunction, the processes of Eliminate, Draw, and Reason in combination are associated with correctness. For Gel/Pedigree, only the process of Reason was associated with correctness. The examples of correct student answers for each content area, as shown in Tables 2 and 3 , were selected to include each of the positively associated processes described.

aAll student answers (correct and incorrect) are reported. Processes excluded from analyses include Claim, those within the Error and General strategies, and processes that were present in 5% or fewer of both correct and incorrect student answers.

aBased on a representative GLMM model with a lasso penalty predicting answer correctness with a moderate penalty parameter (lambda = 25). The intercept represents the likelihood of a correct answer in the absence of all processes initially entered into the model: Notice, Plan, Recall, Check, Assess Difficulty, Use Information, Integrate, Draw, Calculate, Reason, Eliminate. Shaded rows indicate the inclusion of negative predictors in combination with positive predictors. Probabilities were calculated using the inverse logit of the sum of the relevant log-odds coefficient estimates and the intercept from Supplemental Table S3.
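This inverse-logit calculation can be reproduced in a few lines. The coefficients below are approximate values back-calculated from the probabilities reported in the text for the Probability content area; they are illustrative, not the exact estimates in Supplemental Table S3.

```python
import math

def inv_logit(x):
    """Convert log odds to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Approximate log-odds estimates back-calculated from the reported
# probabilities for the Probability content area (illustrative values,
# not the exact coefficients in Supplemental Table S3).
intercept = -2.540  # no processes present -> ~7.3% chance of a correct answer
coef = {"Calculate": 2.142, "Use Information": 0.745,
        "Reason": 0.386, "Draw": -0.052}

def p_correct(processes):
    """Predicted probability of a correct answer given the processes present."""
    return inv_logit(intercept + sum(coef[p] for p in processes))

for procs in ([], ["Calculate"],
              ["Calculate", "Use Information"],
              ["Calculate", "Use Information", "Reason"],
              ["Calculate", "Use Information", "Reason", "Draw"]):
    print(procs, round(100 * p_correct(procs), 1))  # ~7.3, 40.2, 58.6, 67.5, 66.4
```

Because the model is additive on the log-odds scale, each added process shifts the log odds by its coefficient; the resulting probabilities are therefore not additive.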

To identify why drawing may be detrimental for Probability and Recombination problems, we further characterized how students described their process of Draw in incorrect answers from these two content areas. We identified two categories: Inaccurate drawing and Inappropriate drawing application. Table 8 provides descriptions and student examples for each category. For Probability problems, 49% of the incorrect student answers that used Draw were Inaccurate, as they identified incorrect genotypes or probabilities while drawing a Punnett square. Thirty-one percent of the answers contained Inappropriate drawing applications, such as drawing a Punnett square for each generation of a multiple-generation pedigree rather than multiplying probabilities. Five percent of the answers displayed both Inaccurate and Inappropriate drawing (Figure 2). For Recombination, 83% of incorrect student answers using Draw used an Inappropriate drawing application, typically treating linked genes as if they were unlinked by drawing a Punnett square to calculate probability. Ten percent of answers used both Inappropriate and Inaccurate drawing (Figure 2).

FIGURE 2. Drawing is commonly inaccurate or inappropriate in incorrect student answers for Probability and Recombination. Drawing categorization from student answers that used Draw and answered incorrectly for content areas of (A) Probability (n = 55) and (B) Recombination (n = 71). The categories are mutually exclusive, so answers with both Inaccurate drawing and Inappropriate drawing are not counted in the individual use categories. “No drawing error” indicates neither inaccurate nor inappropriate drawings were described. “Cannot determine” indicates not enough information was provided in the student’s written answer to assign a drawing use category.

In this study, we identified and characterized the processes that a large sample of students and experts documented in their answers to complex genetics problems. Overall, although their frequency of use differed, experts and students used the same set of problem-solving strategies. Experts were more likely to use orienting and metacognitive strategies than students, confirming prior findings on expert–novice differences (e.g., Chi et al., 1981; Smith and Good, 1984; Smith, 1988; Atman et al., 2007; Smith et al., 2013; McDonnell and Mullally, 2016; Peffer and Ramezani, 2019). For students, we also identified which strategies were most associated with correct answers. The use of reasoning was consistently associated with correct answers across all content areas combined as well as for each individual content area. Students used other processes more or less frequently depending on the content of the question, and the combination of processes associated with correct answers also varied by content area.

Domain-Specific Problem Solving

We found that most processes students used (i.e., all but those in the General category) were domain specific, relating directly to genetics content. Prevost and Lemons (2016), who examined students’ processes of solving multiple-choice biology problems, found that domain-general processes were more common in answers to lower-order than to higher-order questions. They also found that using more domain-specific processes was associated with correctness. In our study, students solved only higher-order problems that asked them to apply or analyze information. Students also had to construct their responses to each problem, rather than selecting from multiple predetermined answer options. These two factors may explain the prevalence of domain-specific processes in the current study, which allowed us to investigate further the types of domain-specific processes that lead to correct answers.

Metacognitive Activity: Orienting and Metacognitive Processes Are Described by Experts but Not Consistently by Students

Our results support several previous findings from the literature comparing the problem-solving tactics of experts and students: experts are more likely to describe orienting and metacognitive problem-solving strategies than students, including planning solutions, checking work, and identifying the concept of the problem.

While some students used planning in their correct answers, experts solving the same problems were more likely to do so. Prior studies of solutions to complex problems in both engineering and science contexts found that experts more often used the orienting/planning behavior of gathering appropriate information compared with novices (Atman et al., 2007; Peffer and Ramezani, 2019). Experts have likely engaged in authentic scientific investigations of their own, and planning occurs more often when the problem to be solved is complex (e.g., Atman et al., 2007), so experts may be more familiar with, and see more value in, planning before pursuing a particular problem-solving approach.

Experts were much more likely than students to describe checking their work, as also shown in previous studies (Smith and Good, 1984; Smith, 1988; McDonnell and Mullally, 2016). McDonnell and Mullally (2016) found greater levels of unprompted checking after students experienced modeling of explicit checking prompts and were given points for demonstrating checking. These researchers also noted that when students reviewed their work, they usually checked only some problem components, not all. Incomplete checking was associated with incorrect answers, while complete checking was associated with correct answers. In the current study, we did not assess the completeness of checking, and therefore may have missed an opportunity to correlate checking with correctness. However, if most students checked their answers only superficially (i.e., checking one step in the problem-solving process rather than all steps), this could explain why there were no differences in the presence of checking between incorrect and correct student answers. In contrast to our study, Prevost and Lemons (2016) found that checking was the most common domain-specific procedure used by students when answering both lower- and higher-order multiple-choice biology questions. The multiple-choice format may itself prompt checking, as candidate answers are already provided. In addition, while that study assessed answers to graded exam questions, we examined answers to extra-credit assignments. Thus, a lack of motivation may have influenced whether the students in the current study reported checking their answers.

Identifying the Concept of a Problem.

Although this strategy was relatively uncommon even among experts, experts were more likely than students to describe identifying the concept of a problem in their solutions. This is consistent with previous research showing that nonexperts use superficial features to solve problems (Chi et al., 1981; Smith and Good, 1984; Smith et al., 2013), a tactic also associated with incorrect solutions (Smith and Good, 1984). The process of identifying the relevant core concepts in a problem allows experts to select the appropriate strategies and knowledge needed for any given problem (Chi et al., 1981). Thus, we suggest that providing students with opportunities to recognize the core concepts of different problems, and thus the similarity of their solutions, could be beneficial for learning successful problem solving.

Engaging in Explanations: Using Reasoning Is Consistently Associated with Correct Answers

Our findings suggest that, although reasoning appears frequently in both correct and incorrect student answers, it is strongly associated with correct answers. Correct answers were more likely than incorrect answers to use reasoning across all content areas combined; furthermore, reasoning was associated with a correct answer in each of the four content areas we explored. This supports previous work showing that reasoning ability in general is associated with overall biology performance (Cavallo, 1996; Johnson and Lawson, 1998). Students who use reasoning may be demonstrating their ability to think logically and sequentially connect ideas, essentially building an argument for why their answers make sense. In fact, teaching the skill of argumentation helps students learn to use evidence to provide a reason for a claim, as well as to rebut others’ claims (Toulmin, 1958; Osborne, 2010), and can improve their performance on genetics concepts (Zohar and Nemet, 2002). Thus, the genetics students in the current study who were able to explain the rationale behind each of their problem-solving steps are likely to have built a conceptual understanding of the topic that allowed them to construct logical rationales for their answers.

In the future, think-aloud interviews could be used to more closely examine the types of reasoning students use. Students may be more motivated and better able to explain their rationales verbally, or with a combination of drawn and verbal descriptions, than they are when typing their answers in a writing-only situation. Interviewers can also ask follow-up questions to confirm student explanations and ideas, something that cannot be obtained from written explanations. In addition, the problems used in this study were near-transfer problems, similar to those that students previously solved during class. Such problems can often be solved using an algorithmic approach, as also recently described by Frey et al. (2020) in chemistry. Future studies could identify whether and when students use more complex approaches such as causal reasoning (providing connections between ideas) or mechanistic reasoning (explaining the biological mechanism as part of making causal connections; Russ et al., 2008; Southard et al., 2016) in addition to or instead of algorithmic reasoning.

Students Use Different Processes to Answer Questions in Different Content Areas

Overall, students answered 60% of the questions correctly. Some content areas were more challenging than others: Recombination was the most difficult, followed by Probability, then Gel/Pedigree and Nondisjunction (see also Avena and Knight, 2019). While our results do not indicate that a certain combination of processes is both necessary and sufficient to solve a problem correctly, they can be useful to instructors wishing to guide students in their strategy use when considering solutions to certain types of problems. In the following sections, we discuss the processes that were specifically associated with correctness in student answers for each content area.


Solving a Probability question requires calculation, while many other types of problems do not. To solve the questions in this study, students needed to consider multiple generations from two families and calculate the likelihood of independent events occurring by using the product rule. Smith (1988) found that both successful and unsuccessful students often find this challenging. Our previous work also found that failing to use the product rule, or using it incorrectly, was the second most common error in incorrect student answers (Avena and Knight, 2019). Correctly solving probability problems likely also requires a conceptual understanding of the reasoning behind each calculation (e.g., Deane et al., 2016). This type of reasoning, specific to the mathematical components of a problem, is referred to as statistical reasoning, a suggested competency for biology students (American Association for the Advancement of Science, 2011). The code of Reason includes reasoning about other aspects of the problem (e.g., determining genotypes; see Table 3) in addition to reasoning related to calculations. While reasoning was prevalent in both incorrect and correct answers to Probability problems, using reasoning still provided an additional 9% likelihood of answering correctly for students who had also used calculating and applying information in their answers.

Generally, calculation alone was not sufficient to answer a Probability question correctly. When students used Calculate, their likelihood of generating a correct answer was 40%; this increased to 59% only when they also applied information to the specific problem (captured with the Use Information code), such as determining genotypes within the pedigree or assigning a probability (see Table 7). We previously found that the most common content error in these types of probability problems was mis-assigning a genotype or probability due to incorrectly using information in the pedigree; this error was commonly seen in combination with a product rule error (Avena and Knight, 2019). This aligns with our current findings on the importance of applying procedural knowledge: both Use Information and Calculate, under the AtL element of generating knowledge, contribute to correct problem solving.


Both the Probability and Recombination questions are fundamentally about calculating probabilities; thus, not surprisingly, Calculate is also associated with correct answers to Recombination questions. Determining map units and determining the frequency of one possible genotype among possible gametes both require calculation. Using Recall in addition to Calculate increased the likelihood of answering correctly from 18% to 39%. This may be due to the complexity of some of the terms in these problems. As shown previously, incorrect answers to Recombination questions often failed to use map units in the solution (Avena and Knight, 2019). Appropriately using map units thus likely requires remembering that the map unit designation represents the probability of recombination and then applying this definition to the problem. When students Used Information, along with Calculate and Recall, their likelihood of answering correctly increased to 63%.
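The definition students must recall, that one map unit corresponds to 1% recombination, can be made concrete with a short sketch. The function name, the 20-map-unit distance, and the default gamete labels are illustrative assumptions, not part of the study's materials.

```python
def gamete_frequencies(map_units, parental=("AB", "ab"), recombinant=("Ab", "aB")):
    """Convert a map distance between two linked genes into expected gamete
    frequencies for a doubly heterozygous parent (1 map unit = 1% recombination)."""
    r = map_units / 100.0                              # total recombination frequency
    freqs = {g: r / 2 for g in recombinant}            # split between the two recombinant classes
    freqs.update({g: (1 - r) / 2 for g in parental})   # remainder split between parental classes
    return freqs

# Hypothetical genes 20 m.u. apart in an AB/ab parent:
# parental AB and ab gametes at 0.40 each; recombinant Ab and aB at 0.10 each.
freqs = gamete_frequencies(20)
```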

Reasoning and planning also contribute to correct answers in this content area. In their solutions, students needed to consider the genotypes of the offspring and both parents to solve the problem. The multistep nature of the problem may give students the opportunity to plan their possible approaches, either at the very beginning of the problem and/or as they walk through these steps. This was seen in Preston’s solution (Table 3), in which the student sequentially made a short-term plan and then immediately used information in the problem to carry out that plan.

Drawing: A Potentially Misused Strategy in Probability and Recombination Solutions

Only one process, Drawing, was negatively associated with correct answers in solutions to both Probability and Recombination questions. Drawing is generally considered beneficial in problem solving across disciplines, as it allows students to generate a representation of the problem space and/or of their thinking (e.g., Mason and Singh, 2010; Quillin and Thomas, 2015; Heideman et al., 2017). However, when students generate inaccurate drawings or use a drawing methodology inappropriately, they are unlikely to reach a correct answer. In a study examining complex meiosis questions, Kindfield (1993) found that students with more incorrect responses provided drawings with features not necessary to solving the problem. In our current study, we found that the helpfulness of a drawing depends on its quality and the appropriateness or context of its use.

When answering Recombination problems, many students described drawing a Punnett square and then calculating the inheritance as if the linked genes were actually genes on separate chromosomes. In doing so, students revealed a misunderstanding of when and why to appropriately use a Punnett square as well as their lack of understanding that the frequency of recombination is connected to the frequency of gametes. Because we have also shown that planning is beneficial in solving Recombination problems, we suggest that instructors emphasize that students first plan to look for certain characteristics in a problem, such as linked versus unlinked genes, to identify how to proceed. For example, noting that genes are linked would suggest not using a Punnett square when solving the problem. Similarly, in Probability questions, students must realize that uncertainty in genotypes over multiple generations of a family can be resolved by multiplying probabilities together rather than by making multiple possible Punnett squares for the outcome of a single individual. These findings connect to the AtL elements of generative thinking and taking a deep approach: drawing can be a generative behavior, but students must also be thinking about the underlying context of the problem rather than a memorized fact.
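The contrast drawn above, Punnett squares are valid when genes assort independently but mislead when genes are linked, can be illustrated with a small sketch. The `punnett` helper, the 20-map-unit distance, and the gamete-frequency numbers are hypothetical choices for illustration, not the study's questions.

```python
from collections import Counter
from itertools import product

def punnett(gametes1, gametes2):
    """Cross two gamete-frequency dicts; return offspring genotype frequencies,
    with each genotype keyed by its sorted allele string (e.g., 'aabb')."""
    offspring = Counter()
    for (g1, p1), (g2, p2) in product(gametes1.items(), gametes2.items()):
        offspring["".join(sorted(g1 + g2))] += p1 * p2
    return offspring

# Unlinked genes assort independently, so an AaBb parent makes each gamete
# type at 1/4 -- the implicit assumption behind a standard Punnett square.
unlinked = {"AB": 0.25, "Ab": 0.25, "aB": 0.25, "ab": 0.25}
# For linked genes that assumption fails: with a hypothetical 20 m.u. between
# the genes, an AB/ab parent makes parental gametes at 0.40, recombinants at 0.10.
linked = {"AB": 0.40, "ab": 0.40, "Ab": 0.10, "aB": 0.10}

print(punnett(unlinked, unlinked)["aabb"])  # 0.0625 (the familiar 1/16)
print(punnett(linked, linked)["aabb"])      # 0.16 -- the Punnett-square answer is wrong here
```

The square itself is not the error; feeding it equal gamete frequencies for linked genes is, which is why checking for linkage belongs in the planning step.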


In Nondisjunction problems, students were asked to predict the cause of an error in chromosome number. Our model for processes associated with correctness in Nondisjunction problems (Table 7) suggested that the likelihood of answering correctly in the absence of several processes was 70%. This may explain the higher percentage of correct answers in this content area (75%) compared with other content areas. Nonetheless, three processes were shown to help students answer correctly. The process Eliminate, even though used relatively infrequently (10%), provides a benefit. Using elimination when there is a finite number of obvious solutions is a reasonable strategy, and one previously shown to be successful (Smith and Good, 1984). Ideally, this strategy would be coupled with drawing the steps of meiosis and then reasoning about which separation errors could not explain the answer. Drawing was associated with correct answers in this content area, though it was neither required nor sufficient. Instead of drawing, some students may have used a memorized series of steps in their solutions. This is referred to as an “algorithmic” explanation, in which a memorized pattern is used to solve the problem. For example, such a line of explanation may go as follows: “beginning from a diploid cell heterozygous for a certain gene, two of the same alleles being present in one gamete indicates a nondisjunction in meiosis II.” Such algorithms can be applied without a conceptual understanding (Jonsson et al., 2014; Nyachwaya et al., 2014), and thus students may inaccurately apply them without fully understanding or being able to visualize what is occurring during a nondisjunction event (Smith and Good, 1984; Nyachwaya et al., 2014). Using a drawing may help provide a basis for analytic reasoning, providing logical links between ideas and claims that are thoughtful and deliberate (Alter et al., 2007).
Indeed, in Kindfield’s (1993) study, in which participants (experts and students) were asked to complete complex meiosis questions, those with more accurate models of meiosis used their drawings to assist in their reasoning process. Kindfield suggested that these drawings provided additional working memory space, thus supporting an accurate problem-solving process.
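The memorized "algorithmic" rule quoted above can be written out explicitly. This sketch carries the same caveat the text raises: it assumes a heterozygous A/a parent and no crossover between the gene and the centromere, and applying it without understanding those assumptions is exactly the failure mode described.

```python
def nondisjunction_stage(gamete, alleles=("A", "a")):
    """Infer the meiotic stage of a nondisjunction event from an n+1 gamete of a
    heterozygous (A/a) parent, assuming no crossover between gene and centromere.
    One copy of EACH allele -> homologs failed to separate (meiosis I);
    two copies of the SAME allele -> sister chromatids failed to separate (meiosis II)."""
    if sorted(gamete) == sorted(alleles[0] + alleles[1]):
        return "meiosis I"       # both homologs (A and a) ended up in one gamete
    if gamete[0] == gamete[1]:
        return "meiosis II"      # duplicate sister chromatids (AA or aa)
    raise ValueError("unexpected gamete for this parent")

print(nondisjunction_stage("Aa"))  # meiosis I
print(nondisjunction_stage("AA"))  # meiosis II
```

A drawing of both meiotic divisions encodes the same logic visually, which is why the two approaches are interchangeable only for students who understand what the rule summarizes.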


Unlike in other content areas, the only process associated with correctness in the Gel/Pedigree model was Reasoning, which contributed more to correct solutions here than in any other content area. In these problems, students are asked to find the most likely mode of inheritance given both a pedigree of a family and a DNA gel that shows representations of alleles for each family member. The two visuals, along with the text of the problem, give students the opportunity to provide logical explanations at many points in the problem. Students use reasoning to support intermediate claims as they think through possible solutions, again for their final claims, or to explain why they eliminate an option. Almost half of both correct and incorrect student answers to these questions integrated features from both the gel and the pedigree. However, although both correct and incorrect answers tend to integrate, correct answers additionally reason. We suggest that the presence of two visual aids prompts students to integrate information from both, potentially increasing the likelihood of using reasoning.


In this study, we captured the problem-solving processes of a large sample of students by asking them to write their step-by-step processes as part of an online assignment. In so doing, we may not have captured the entirety of a student’s thought process. For example, students may have felt time pressure to complete an assignment, may have experienced fatigue after answering multiple questions on the same topic, or simply may not have documented everything they were thinking. Students may also have been less likely to indicate they were engaging in drawing, as they were answering questions using an online text platform; exploring drawing in more detail in the future would require interviews or the collection of drawings as a component of the problem-solving assignment. Additionally, students may not have felt that all the steps they engaged in were worth explaining in words; this may be particularly true for metacognitive processes. Students are not likely accustomed to expressing their metacognitive processes or admitting uncertainty or confusion during assessment situations. However, even given these limitations, we have captured some of the primary components of student thinking during problem solving.

In addition, our expert–student comparison may be biased, as experts had different reasons than students for participating in the study. The experts likely did so because they wanted to be helpful and found it interesting. Students, on the other hand, had very different motivations, such as using the problems for practice in order to perform well on the next exam and/or to get extra credit. Although it is likely not possible to put experts and students in the same affective state while they are solving problems, it is worth realizing that the frequencies of processes they use could reflect their different states while answering the questions.

Finally, the questions in the assignments provided to students were similar to those seen previously during in-class work. The low prevalence of metacognitive processes in their solutions could be due to students’ perception that they have already solved similar questions. This may prevent them from articulating their plans or from checking their work. More complex, far-transfer problems would likely elicit different patterns of processes for successful problem solving.


Calculating: In questions regarding probability, students will need to be familiar with mathematical representations and calculations. Practicing probabilistic thinking is critical.

Drawing: Capturing thought processes with a drawing can help visualize the problem space and can be used to generate supporting reasoning for one’s thinking (e.g., a drawing of the stages of meiosis). A cautionary note, however: drawing can lead to unsuccessful problem solving when used in an inappropriate context, such as using a Punnett square for linked genes, or using multiple Punnett squares when probabilities from multiple generations should instead be multiplied.

Eliminating: In questions with clear alternate final answers, eliminating answers, preferably while explaining one’s reasons, is particularly useful.

Practicing metacognition: Although there were few significant differences in metacognitive processes between correct and incorrect student answers, we still suggest that planning and checking are valuable across content areas, as demonstrated by the more frequent use of these processes by experts.

In summary, we suggest that instructors not only emphasize key pieces of challenging content for each given topic, but also consistently demonstrate possible problem-solving strategies, provide many opportunities for students to practice thinking about how to solve problems, and encourage students to explain to themselves and others why each of their steps makes sense.


This work was supported by the National Science Foundation (DUE 1711348). We are grateful to Paula Lemons, Stephanie Gardner, and Laura Novick for their guidance and suggestions on this project. Special thanks also to the many students and experts who shared their thinking while solving genetics problems.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC.
  • Bassok, M., & Novick, L. R. (2012). Problem solving. In Holyoak, K. J., & Morrison, R. G. (Eds.), Oxford handbook of thinking and reasoning (pp. 413–432). New York, NY: Oxford University Press.
  • Biggs, J. B. (1987). Student approaches to learning and studying. Research monograph. Hawthorn, Australia: Australian Council for Educational Research.
  • Bloom, B. S., Krathwohl, D. R., & Masia, B. B. (1956). Taxonomy of educational objectives: The classification of educational goals. New York, NY: David McKay.
  • Gelman, A., & Hill, J. (2006). Data analysis using regression and multilevel/hierarchical models. Cambridge, England: Cambridge University Press.
  • Groll, A. (2017). glmmLasso: Variable selection for generalized linear mixed models by L1-penalized estimation. R package version 1.1.25.
  • Kindfield, A. C. H. (1993). Biology diagrams: Tools to think with. Journal of the Learning Sciences, 3(1), 1–36.
  • Lemke, J. L. (1990). Talking science: Language, learning, and values. Norwood, NJ: Ablex Publishing.
  • McDonnell, L., & Mullally, M. (2016). Teaching students how to check their work while solving problems in genetics. Journal of College Science Teaching, 46(1), 68.
  • Novick, L. R., & Bassok, M. (2005). Problem solving. In Holyoak, K. J., & Morrison, R. G. (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–349). New York, NY: Cambridge University Press.
  • Osborne, J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science, 328, 463–466.
  • Saldana, J. (2015). The coding manual for qualitative researchers. Los Angeles, CA: Sage.
  • Schen, M. (2012, March 25). Assessment of argumentation skills through individual written instruments and lab reports in introductory biology. Paper presented at: Annual Meeting of the National Association for Research in Science Teaching (Indianapolis, IN).
  • Smith, M. K., & Knight, J. K. (2012). Using the Genetics Concept Assessment to document persistent conceptual difficulties in undergraduate genetics courses. Genetics, 191, 21–32.
  • Smith, M. K., Wood, W. B., & Knight, J. K. (2008). The Genetics Concept Assessment: A new concept inventory for gauging student understanding of genetics. CBE—Life Sciences Education, 7(4), 422–430.
  • Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.


Submitted: 21 January 2021 Revised: 16 July 2021 Accepted: 22 July 2021

© 2021 J. S. Avena et al. CBE—Life Sciences Education © 2021 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License.


Hyndland Science Faculty


National 5 Problem Solving

You can access the National 5 Problem Solving Help Sheet below:

National 5 Problem Solving Help Sheet

You can access the full National 5 Problem Solving Booklet below:

National 5 Problem Solving Booklet Questions

National 5 Problem Solving Booklet Marking Scheme

You can access problem solving questions by category using the links below:

Biological Keys Questions | Biological Keys Marking Scheme

Calculation Questions | Calculations Marking Scheme

Graphs, Charts & Experiments Questions | Graphs, Charts & Experiments Marking Scheme

Ratio Questions | Ratio Marking Scheme




Punnett Square Practice Problems

  • Alleles, genotype and phenotype
  • Punnett Square
  • Genotype and phenotype probabilities with a monohybrid cross
  • Allele, genotype and Phenotype questions
  • Monohybrid genotype and phenotype probability questions

Problem Solving

A substantial proportion of the National 5 Biology exam will consist of problem solving questions.

Problem solving questions in Biology involve using unfamiliar data and information and calculating or extracting answers. You need to be able to process, present, select and analyse data.

Students often find these questions challenging so it is important to spend as much time practising problem solving questions as revising knowledge from the course.


Each section focuses on a different problem solving skill and includes:

Worked Examples

Practice Questions

Checkpoint Test


2. Percentages


3. Percentage Change


5. Drawing Graphs


6. Analysing Graphs


7. Experimental Design


8. Scientific Literature


Biology Problem-Solving: The High Achiever Students


Problem-solving has been acknowledged as one of the compulsory skills needed to face and overcome challenges of the modern world, in both learning and everyday life. However, there is limited information regarding the level of problem-solving skills demonstrated by school students learning biology compared to physics and mathematics. This study aims to identify the problem-solving level of 16-year-old high achievers in selected boarding schools in the Southern and Central Regions of Malaysia. The problem-solving skills of 70 students were measured using a validated open-ended test, UKPM, which consists of general and topic-specific problem-solving questions for biology. These questions focus on the different steps in the problem-solving process. High achievers from boarding schools were chosen to ensure the homogeneous background of the participants. The data were analysed descriptively, and the overall score was used to determine the students’ problem-solving level based on the classification in the Programme for International Student Assessment (PISA). The results showed that the majority of the participants are low (35%) or intermediate (64%) problem solvers, and that they showed incompetence in manipulating information and making justifications. They have a high tendency to seek the absolute answer but lack the ability to reflect when answering the test. These criteria and limitations indicate that the participants are prone to a convergent thinking pattern. In this light, educators should introduce innovative alternative teaching and learning approaches to enhance the students’ problem-solving skills. Keywords: problem-solving skills, problem-solving processes, biology, school


Problem-solving (PS) skills refer to a person’s ability to make critical judgments and decisions based on appropriate justification of the problem’s situation and its surroundings (Kivunja, 2014). Solving a problem requires an individual to explore the root cause of the problem and create potential solutions pragmatically by using logical, lateral, and creative thinking (Ismail & Atan, 2011). This approach is parallel with 21st-century learning, which emphasises the construction of new knowledge, a shift from focusing solely on the rote memorisation and classroom knowledge transfer that have become habitual in schools over the years. PS is not an innate skill (Bal & Esen, 2016); therefore, providing students with chances to solve problems is an effective way to develop this skill (Shute, Ventura, & Ke, 2015; Shute & Wang, 2013). Learning instruction that emphasises understanding of core concepts helps develop students’ PS, as this skill is best learnt through domain-specific problem-solving activities that are challenging for students (Prevost & Lemons, 2016). During the process, increasing students’ understanding of the topic will help them to create and relate to new knowledge. In the aspect of learning and education, repeated cycles of the PS process through practice will equip them with PS skills that can be applied to different problems regardless of context, discipline, or situation (Yang, 2012). In this light, PS skills can have long-term benefits and, subsequently, help students take charge of their profession, personal encounters, and everyday hurdles (Bal & Esen, 2016; Syafii & Yasin, 2013). Moreover, biologists agree that students should acquire PS skills in order to learn biology better (Hoskinson, Caballero, & Knight, 2013).

The numerous proposed models of PS stipulate that the basic components of the PS process are to identify problems, to suggest solutions, to apply solutions, and to reflect at the end of the process. One might interpret the problem-solving process as sequential steps in a linear progression. In reality, however, most individuals demonstrate flexible and inventive approaches based on different circumstances and do not adhere to a strictly linear PS process (Yu, Fan, & Lin, 2014). Although PS skills are taught through integration with different subjects, fields, domains, or contents, the steps, process, or stages remain unchanged; therefore, understanding the meaning and function of each PS step is crucial for the success of problem-solving. The PS steps can be used as a guideline on what to observe and measure when evaluating PS proficiency. Teachers will also know the types of learning support or scaffolding that have to be given to students during the teaching and learning process.

PS requires a variety of mental skills, including interpreting information, planning, trying alternative strategies, reflecting, and decision-making. However, studies have found that students are not aware of the processes taking place in problem solving (Yu et al., 2014). Early PS studies demonstrated that students have difficulties with the PS steps, especially the orientation stage, which is to identify the problem. In this regard, the first step of PS is vital, and students should be taught it explicitly for a better understanding of this component. Hence, this component should be set as the rubric or benchmark during PS measurement and assessment. Studies in both general and domain-specific PS affirm that students’ inability to solve a problem does not stem from a lack of domain-specific knowledge or skills; rather, it is due to the failure to properly identify the source of the problem and its details. This can be seen in PS studies comparing novice and expert problem solvers, where PS performance is highly influenced by the individual’s ability to understand the problem, as well as to analyse the potential answer or solution (Prevost & Lemons, 2016). Nevertheless, enhancing students’ PS skills is one of the primary goals of all educational institutions; therefore, developing PS skills is necessary in order to improve students’ ability in scientific thinking, especially in science subjects such as biology (Ulusoy, Turan, Tanriverdi, & Kolayis, 2012; Yenice, Ozden, & Evren, 2012). In other words, PS skills should be developed early, in the students’ schooling years.

Previous studies have shown that Malaysian students face difficulty in problem-solving (Abd Razak, Mohd Johar, Andriani, & Yong, 2014; Johnny, Abdullah, Abu, Mokhtar, & Atan, 2017; Kaus, Phang, Ali, Abu Samah, & Ismail, 2017). On the other hand, the term ‘problem-solving’ is commonly treated as synonymous with explicit calculation, and this has resulted in a lack of studies on problem-solving in Biology. In this regard, despite the differences in problem structure and content between science subjects, the instructional purpose, which is to elucidate the patterns and processes of the natural world and its systems, aligns comparably across them. Research has noted that solving biology problems requires engaging the same skills practised by physicists and biologists (Hoskinson et al., 2013). Nevertheless, compared to mathematics or physics, there has been a prominent gap in research on PS skills in biology for the past three decades (Kim, Prevost, & Lemons, 2015).

Problem Statement

The implementation of PS in pedagogical activities has led to the measurement of PS skills among students. Studies have shown a significant positive relationship between skills competencies and academic achievement, career success, and certain habits of mind or behaviour (OECD, 2014; Stecher & Hamilton, 2014; Wüstenberg, Greiff, & Funke, 2012). Measuring the competency level of PS skills is more challenging than measuring interpersonal skills; however, there are guidelines that can be used in developing the instrument or selecting the rubrics to measure PS skills. These higher-order PS skills are arduous to measure, and discrepancies over relevant and credible measurement scales are still debated among researchers (McCoy, Braun-Monegan, Bettesworth, & Tindal, 2015; Stecher & Hamilton, 2014). By observing and measuring these PS processes, this study will obtain valuable information related to the cognitive habits of one’s mind when solving a task. Observing and measuring these processes during an intervention study will provide formative information as well as evidence of the students’ development of PS skills. There is a lack of information about problem-solving skills in biology, and research on how school students solve biology problems is still in its formative stages.

Research Questions

The research questions that lead this study are as follows:

What is the students’ problem-solving competency level for Biology?

How do the students perform in non-routine Biology questions in terms of the steps in the problem-solving process?

Purpose of the Study

For the purposes of this article, domain-specific problem-solving refers to the topic of Cell Division in the secondary school biology syllabus, used to investigate the PS level of the students. Therefore, this study aims to identify the students’ abilities regarding the steps in PS as well as their problem-solving competency level for Biology.

Research Methods

Seventy science-stream students (39 females and 31 males), aged 16, from three high-achieving fully residential schools located in the Central and Southern Regions of Malaysia participated in this study. The students were chosen to ensure the homogeneous background of the participants. Their PS skills for the biology subject were measured using the UKPM, a validated open-ended test with 20 topic-specific questions in Section A and 20 general questions in Section B. The topic-specific PS questions are related to cell division, while the general PS questions are related to biology or science, as well as questions adapted from the problem-solving domain of the Programme for International Student Assessment (PISA). This study referred to the Ge & Land PS Model, which comprises four problem-solving steps: identifying problems (PS1), giving suggestions and options to solve the problem (PS2), making justifications (PS3), and reflecting on the action (PS4) (Bixler, 2007). A total of 10 questions were allocated to each PS step, and each question focuses on a different step in the PS process. The maximum score for the UKPM is 120, and the data were analysed descriptively to identify the participants’ performance for each step in the PS process. The assessment rubric was adapted from previous research (Bixler, 2007), while the classification of PS competency level followed the Organisation for Economic Co-operation and Development (OECD) classification used in PISA (OECD, 2014). The UKPM was validated prior to the research by experts in the PS domain, as well as against the biology syllabus for Malaysian secondary schools.

Table 2 summarises the participants’ scores. The results show that neither the female nor the male participants scored well on the UKPM test. The mean score for female participants was 42.92, which differs little from the male participants’ mean score of 42.42. The overall mean of 42.70 did not reach 50% of the maximum score. The dispersion of scores is almost identical across groups, with an average spread of about 8. The highest score was 62/120, while the lowest was 26/120; both extremes were obtained by female participants.

Each individual score was compared and classified according to the six PS competency levels presented by the Organisation for Economic Co-operation and Development (2014). Diagram 1 shows the percentage of participants at each competency level; only 1% of the participants could be classified as possessing level 4 competency. The majority of the participants (64%) could be categorised as level 3 problem solvers, while 35% could be classified as having level 2 competency.

Problem-Solving Skill Level

The problem-solving skill level of each gender was compared, and only minor differences were found. Diagram 2 shows that 2% of the female participants possess level 4 competency. Moreover, there is a 4% difference between the proportion of females (36%) and males (32%) who demonstrated level 2 competency, and only a 6% difference between the two groups at level 3.

Gender Differences in Problem-Solving Skill Competency

Table 3 describes the score for each PS step in detail. Of the four steps in the PS process, making reflections (PS4) has the lowest mean score: 9.33 ± 3.90 for females and 9.39 ± 3.35 for males. These mean scores show that most participants obtained only about 10 of the 40 marks allocated for PS4 in the UKPM. Although the other three steps have higher mean scores, they are still in the low range, as none of the PS steps reached even 50% of the allocated marks. The minimum and maximum scores for each PS step are also in the lower range: the highest score is 21 (PS4) and the lowest is 3 (PS2). Diagram 3 summarises the findings related to the PS steps. There are no major differences between genders in the overall achievement on each PS step.

Mean Score for Each PS Steps According to Gender

The results provide insights into students’ behaviour when solving problems in the biology subject. Biology differs from physics and mathematics in that calculation plays only a minimal role compared with reading and understanding facts. It was discovered that participants of both genders have poor knowledge and capabilities in all the PS steps and, consequently, obtained poor results in the PS domains based on the Programme for International Student Assessment (PISA).

It was discovered that the participants did not plan well and did not evaluate the situation in the questions. In one of the PS1 questions, the participants were asked to list all the barriers and factors they should consider before choosing the most appropriate option; only a small percentage of students managed to list appropriate answers beyond the question given, while the rest only listed a few factors that could be found in the question itself. In another PS1 question, the participants were required to propose an arrangement plan regarding the number of people to be placed in eight rooms, taking into account the criteria given. The researcher expected the participants to perform some calculations; however, the majority presented wrong answers even though a draft table was provided to assist them in planning and evaluating the problem by giving specific directions for the key stages. They only provided answers with which they were familiar, despite the expectation that they would find the solution by delving deeper into the question. This shows that excellent achievement in public school examinations does not ensure good competency in non-routine PS, as most school examinations revolve around routine problems (Abd Razak et al., 2014).

Meanwhile, the planning process is seldom practised when answering open-ended problems, even though it is commonly used in routine and algorithmic problems; hence, it should be considered when developing the skills to solve open-ended problems (Reid & Yang, 2002). Moreover, most past PS studies have focused only on the earlier PS steps, namely identifying the root cause of the problem and planning the solutions, even though actual success in PS is determined by the capability to work out what needs to be solved and how to do it effectively (Ulusoy et al., 2012).

Identifying the root cause of the problem and planning the solutions are categorised as knowledge acquisition in the PISA PS framework (OECD, 2014), or as rule identification and application (Schweizer, Wüstenberg, & Greiff, 2013; Wüstenberg et al., 2012). These earlier steps are assumed to draw more on higher-order thinking skills (HOTS) than the later steps in the PS model, whereas reflection and monitoring require judgement and deep thinking, and reflection can be carried out in most of the PS steps. Because of their strong influence on the PS stages, some studies have divided the 'Identify Problem' and 'Solution Planning' steps into smaller sub-steps (e.g., 'Gather Info'), and others have added related indicators (e.g., 'Avoiding Problem' and 'Flexibility') with a specific checklist of criteria to be observed. These additions were based on the researchers' perspectives and research needs.

The results revealed that the students were confused and had difficulty linking the function of the spindle fibre with mitosis or meiosis failure (Figure 4). The participants' lack of understanding of key concepts contributed to the poor results for the PS steps. For example, one question concerned meiosis and its function in producing haploid gamete cells, which differ in chromosome number and in genetic content owing to random segregation and crossing over. The students were unable to answer this question even though it differed only slightly from examination-format questions (Figure 5); they produced varied, partly inaccurate responses and entirely incorrect answers. By contrast, the success rate improved when the same question was modified with additional explicit hints or restructured to resemble the examination format. Without this explicit linkage, participants had difficulty connecting what they had learnt about cell division with the questions. Solving previous exam questions and drill practice appear to be common in biology lessons so that students can ace their examinations; despite their impressive results, such students tend to be limited in their thinking skills. They answer questions through memorisation, and the drilling creates a mental model and schema stored in their minds rather than an understanding of the principles. In other words, the students memorise the content and struggle when presented with questions in a new context or structure.

Example of a PS3 Question on the Function of the Spindle Fibre

The fourth step in the PS process is making reflection. In the UKPM, the PS4 questions prompt participants to state whether they agree on a topic (Figure 6). For example, students are provided with a formula as a guide; to answer, they are expected to review and identify the formula and to suggest the correct formula when giving their justification. Unfortunately, few participants were able to present a sound reflection. Most simply gave answers by referring to the given calculation without reflecting, and many also provided incorrect answers. As a result, they scored very low on PS4, which affected their overall score. This indicates that the learners have low ability and face difficulty in linking skills with knowledge (Reid & Yang, 2002). This study also observed the participants' habits and patterns when answering the test. At the end of the UKPM test, the researcher obtained verbal feedback from students, who appeared to dislike lengthy questions: they only glanced through the instructions and provided answers without any description, explanation, or justification. These students are more familiar with routine questions that require a single right answer; they looked uncomfortable when asked to answer non-routine, abstract questions requiring opinions and justifications and, consequently, gave opinions that did not reflect the lesson they had learnt.

Example of PS4 Involving Student’s Agreement

A good problem solver has three characteristics: a good conceptual understanding of the domain involved, domain-specific skills, and the ability to adjust wisely in the use of automated skills. This is because PS requires two interdependent types of knowledge, declarative and procedural (Yu et al., 2014). Expert problem solvers are more mature in forming an integrated mental representation of the problem and demonstrate a better understanding of its core concepts, nature, and form (Prevost & Lemons, 2016). They take more time to define and understand the issues than novice problem solvers, who prefer to complete the task impatiently and often skip PS steps, including the first and most important one, problem identification (Yu et al., 2014). The same pattern can be observed among primary, secondary, and college students. Novice problem solvers usually seek the solution without a definite understanding of the problem and lack the ability to reflect on their own performance. They tend to skip the analysis and reflection processes even when they know they are stuck with an inappropriate solution.

PS processes, whether flexible, linear, or sequential, can be practised differently according to the problem solver's creativity and needs, as well as the situation. However, students' lack of understanding of these processes affects their perception of the processes' progression and nature. Yu et al. (2014) reported a similar pattern in the early phase of their study: students lacked flexibility and creativity, opting for a linear mode of an incomplete PS process. It is therefore important to incorporate effective teaching strategies that enhance students' understanding of the meaning and function of each PS step, so that they can develop individual problem-solving skills. A good problem solver who practises an effective monitoring step consistently reflects on the chosen strategy to ensure they are on the right track, while also checking for other solutions. Students should be able to monitor and steer the direction of their own progress and to ask questions among themselves that help maximise effective strategies and prevent them from repeatedly using unproductive approaches to generate solutions (Jamari, Mohamed, Abdullah, Mohd Zaid, & Aris, 2017b). Choosing a strategy without revising it, or without a self-correcting mechanism to monitor progress, is comparable to failing to choose a good method and strategy from the start; either can lead to failure in PS. Students must therefore be encouraged to verify their choices through monitoring and reflection, which can raise their HOTS and PS levels.

Previous studies suggest that instructional scaffolding is necessary to support students' problem-solving processes (Jamari, Mohd Zaid, Abdullah, Mohamed, & Aris, 2017a; Kim et al., 2015). In school, it is important to focus on content-specific scaffolding, also known as conceptual scaffolding, because mastering the content of the lesson is the ultimate goal of learning assistance, whether or not instructional materials or a teacher are present. The teacher is responsible for helping students understand the function of each step involved in PS in class, regardless of subject or domain (Yenice et al., 2012). Merely mentioning these processes during teaching, without giving students opportunities to think about, learn, and practise each step, will not enhance their PS skills (Yu et al., 2014). The similarity between ill-structured tasks and common everyday problems makes it worthwhile to inculcate and develop students' PS skills: in the short term, these skills support the many tasks of schooling and learning, and in the long term they shape a capable problem solver for later life. Since an ill-structured task usually has a complex structure and may admit numerous potential solutions, it requires more cognitive activity to process all the problem information in search of the best solution.

Promising instructional strategies to enhance HOTS and PS have been widely studied, including inquiry learning approaches and a focus on STEM education (Jamari et al., 2017b). However, few studies have examined these approaches in the Malaysian context, although there are plenty from other countries. Teaching strategies that emphasise authentic ill-structured problems and can be applied by teachers include case-based learning (CBL) and problem-based learning (PBL); in these, the problem or task stems not from the textbook but from everyday problems that require applying similar concepts or principles. CBL and PBL fall under inquiry, and they suit science-learning environments because of their potential to attract students' interest, spark inquiry, and encourage continued exploration of the task (Herreid, Schiller, & Herreid, 2012; Pai et al., 2010). A teacher's face-to-face or online involvement provides guidance that helps students interpret information and accelerates active information-transfer processes by providing a learning environment that can develop HOTS and PS skills (Kivunja, 2014; McCoy et al., 2015).

Although time is a factor in the development of PS in an individual, it is important to expose students to the PS steps and processes so that they can learn and practise these skills and become competent citizens. Teaching approaches that emphasise authentic problems and active learning, such as inquiry learning and STEM (Science, Technology, Engineering and Mathematics) education, should therefore be combined with appropriate instructional scaffolding that focuses on students' mastery of the lesson while nurturing and developing their PS skills (Bybee, 2010; Moore, Johnson, Peters-Burton, & Guzey, 2016; Tseng, Chang, Lai, & Chen, 2013). This study has several limitations. One is the small number of participants, as the study focused on high achievers; the sample may therefore not represent the whole population. Nevertheless, it provides insight into how high-achieving students conduct PS, adding to the body of knowledge on problem solving among Malaysian school students. It is assumed that other students face the same PS problems as those shown by the high achievers; future research can therefore be conducted with students from different categories and backgrounds, and PS research focused on biology can replicate this study with other topics in the biology syllabus.

  • Abd Razak, N. N. F., Mohd Johar, A., Andriani, D., & Yong, C. Y. (2014). Mathematical Problem Solving Ability Among Form Two Students. Jurnal Pendidikan Matematik, 2(2), 1-13.
  • Belgin Bal, İ., & Esen, E. (2016). Problem Solving Skills of Secondary School Students. China-USA Business Review, 15(6). doi:10.17265/1537-1514/2016.06.002
  • Bixler, B. A. (2007). The Effects of Scaffolding Students' Problem-Solving Process Via Question Prompts on Problem Solving and Intrinsic Motivation in an Online Learning Environment (Doctoral dissertation). Pennsylvania State University.
  • Bybee, R. W. (2010). Advancing STEM Education A 2020 Vision. Technology And Engineering Teacher, 70(1), 30-35.
  • Herreid, C. F., Schiller, N. A., & Herreid, K. F. (2012). Science Stories: Using Case Studies to Teach Critical Thinking. VA, USA: National Science Teachers Association.
  • Hoskinson, A. M., Caballero, M. D., & Knight, J. K. (2013). How can we improve problem solving in undergraduate biology? Applying lessons from 30 years of physics education research. CBE Life Sci Educ, 12(2), 153-161. doi:10.1187/cbe.12-09-0149
  • Ismail, S., & Atan, A. (2011). Aplikasi Pendekatan Penyelesaian Masalah Dalam pengajaran Mata Pelajaran Teknikal dan Vokasional di Fakulti Pendidikan UTM. Journal of Educational Psychology and Counseling, 2, 113-144.
  • Jamari, D., Mohamed, H., Abdullah, Z., Mohd Zaid, N., & Aris, B. (2017b). Fostering Higher Order Thinking And Problem Solving Skills Through Social Media. Man In India, 97(12), 1-10.
  • Jamari, D., Mohd Zaid, N., Abdullah, Z., Mohamed, H., & Aris, B. (2017a). Instructional Scaffolding To Support Ill-Structured Problem Solving A Review. Sains Humanika, 9(1-4), 33-39.
  • Johnny, J., Abdullah, A. H., Abu, M. S., Mokhtar, M., & Atan, N. A. (2017). Difficulties In Reasoning Among High Achievers When Doing Problem Solving In Mathematics. Man In India, 97(12), 61-70.
  • Kaus, M. A., Phang, F. A., Ali, M. B., Abu Samah, N., & Ismail, A. K. (2017). Problem Solving And Social Supports: The Roles of Parents. Man In India, 97(12), 279-287.
  • Kim, H. S., Prevost, L., & Lemons, P. P. (2015). Students' usability evaluation of a Web-based tutorial program for college biology problem solving. Journal of Computer Assisted Learning, 31(4), 362-377. doi:10.1111/jcal.12102
  • Kivunja, C. (2014). Do You Want Your Students to Be Job-Ready with 21st Century Skills? Change Pedagogies: A Pedagogical Paradigm Shift from Vygotskyian Social Constructivism to Critical Thinking, Problem Solving and Siemens’ Digital Connectivism. International Journal of Higher Education, 3(3). doi:10.5430/ijhe.v3n3p81
  • McCoy, J. D., Braun-Monegan, J., Bettesworth, L., & Tindal, G. (2015). Do Scaffolded Supports between Aspects of Problem Solving Enhance Assessment Usability? Journal of Education and Practice, 6(36), 175-185.
  • Moore, T., Johnson, C. C., Peters-Burton, E. E., & Guzey, S. S. (2016). The need for a STEM road map: A framework for integrated STEM education. In (pp. 33-12). NY: Routledge Taylor & Francis Group.
  • OECD. (2014). PISA 2012 Results: Creative Problem Solving: Students' Skills in Tackling Real-Life Problems (Volume V). Paris: OECD.
  • Pai, A., Benning, T., Woods, N., McGinnis, G., Chu, J., Netherton, J., & Bauerle, C. (2010). The Effectiveness of a Case Study-Based First-Year Biology Class at a Black Women's College. Journal of College Science Teaching, 40(2), 32.
  • Prevost, L. B., & Lemons, P. P. (2016). Step by Step: Biology Undergraduates' Problem-Solving Procedures during Multiple-Choice Assessment. CBE Life Sci Educ, 15(4). doi:10.1187/cbe.15-12-0255
  • Reid, N., & Yang, M.-J. (2002). Open-ended problem solving in school chemistry: A preliminary investigation. International Journal of Science Education, 24(12), 1313-1332. doi:10.1080/09500690210163189
  • Schweizer, F., Wüstenberg, S., & Greiff, S. (2013). Validity of the MicroDYN approach: Complex problem solving predicts school grades beyond working memory capacity. Learning and Individual Differences, 24, 42-52. doi:10.1016/j.lindif.2012.12.011
  • Shute, V. J., Ventura, M., & Ke, F. (2015). The power of play: The effects of Portal 2 and Lumosity on cognitive and noncognitive skills. Computers & Education, 80, 58-67. doi:10.1016/j.compedu.2014.08.013
  • Shute, V. J., & Wang, L. (2013). Measuring Problem Solving Skills in Portal 2. In IADIS International Conference on Cognition and Exploratory Learning in Digital Age (CELDA 2013) (pp. 33-39). Fort Worth, Texas, USA: International Association for Development of the Information Society (IADIS).
  • Stecher, B. M., & Hamilton, L. S. (2014). Measuring Hard-to-Measure Student Competencies: A Research and Development Plan (ISBN 978-0-8330-8806-2). Santa Monica, California.
  • Syafii, W., & Yasin, R. M. (2013). Problem Solving Skills and Learning Achievements through Problem-Based Module in teaching and learning Biology in High School. Asian Social Science, 9(12). doi:10.5539/ass.v9n12p220
  • Tseng, K. H., Chang, C. H., Lai, S. J., & Chen, W. P. (2013). Attitudes towards science, technology, engineering and mathematics (STEM) in a project-based learning (PjBL) environment. International Journal of Technology Design Education, 23, 87-102.
  • Ulusoy, Y. O., Turan, H., Tanriverdi, B., & Kolayis, H. (2012). Comparison of Perceived Problem Solving Skills of Trainee Students Graduated from Different. Procedia - Social and Behavioral Sciences, 46, 2099-2103. doi:10.1016/j.sbspro.2012.05.435
  • Wüstenberg, S., Greiff, S., & Funke, J. (2012). Complex problem solving — More than reasoning? Intelligence, 40(1), 1-14. doi:10.1016/j.intell.2011.11.003
  • Yang, Y. T. C. (2012). Building virtual cities, inspiring intelligent citizens: Digital games for developing students’ problem solving and learning motivation. Computers & Education, 59(2), 365-377. doi:10.1016/j.compedu.2012.01.012
  • Yenice, N., Ozden, B., & Evren, B. (2012). Examining of Problem Solving Skills According to Different Variables for Science Teachers Candidates. Procedia - Social and Behavioral Sciences, 46, 3880-3884. doi:10.1016/j.sbspro.2012.06.165
  • Yu, K.-C., Fan, S.-C., & Lin, K.-Y. (2014). Enhancing Students’ Problem-Solving Skills through Context-Based Learning. International Journal of Science and Mathematics Education, 13(6), 1377-1401. doi:10.1007/s10763-014-9567-4

Copyright information

Creative Commons License

About this article

Publication date: 01 May 2018. Published by Future Academy, 1st Edition.

Keywords: Business, innovation, sustainability, environment, green business, environmental issues

Cite this article as:

Jamari, D., Mohamed, H., Abdullah, Z., Zaid, N. M., & Aris, B. (2018). Biology Problem-Solving: The High Achiever Students. In M. Imran Qureshi (Ed.), Technology & Society: A Multidisciplinary Pathway for Sustainable Development, vol 40. European Proceedings of Social and Behavioural Sciences (pp. 831-842). Future Academy.



CBE Life Sci Educ, 15(4), Winter 2016

Step by Step: Biology Undergraduates’ Problem-Solving Procedures during Multiple-Choice Assessment

Luanna B. Prevost
† Department of Integrative Biology, University of South Florida, Tampa, FL 33620

Paula P. Lemons
‡ Department of Biochemistry and Molecular Biology, University of Georgia, Athens, GA 30602


Findings from a mixed-methods investigation of undergraduate biology problem solving are reported. Students used a variety of problem-solving procedures that are domain general and domain specific. This study provides a model for research on alternative problem types and can be applied immediately in the biology classroom.

This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this allowed us to systematically investigate their problem-solving procedures. We identified a range of procedures and organized them as domain general, domain specific, or hybrid. We also identified domain-general and domain-specific errors made by students during problem solving. We found that students use domain-general and hybrid procedures more frequently when solving lower-order problems than higher-order problems, while they use domain-specific procedures more frequently when solving higher-order problems. Additionally, the more domain-specific procedures students used, the higher the likelihood that they would answer the problem correctly, up to five procedures. However, if students used just one domain-general procedure, they were as likely to answer the problem correctly as if they had used two to five domain-general procedures. Our findings provide a categorization scheme and framework for additional research on biology problem solving and suggest several important implications for researchers and instructors.
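The reported relationship between procedure count and correctness is, in effect, a tabulation of success rates by the number of procedures used. A minimal sketch of that tabulation, with placeholder tallies rather than the study's data:

```python
# Tabulate the proportion of correct answers by the number of
# domain-specific procedures a student used. The (correct, total)
# tallies below are illustrative placeholders, not the study's data.
TALLIES = {1: (30, 100), 2: (45, 100), 3: (55, 100),
           4: (62, 100), 5: (70, 100)}

def proportion_correct(n_procedures):
    """Success rate for students who used n domain-specific procedures."""
    correct, total = TALLIES[n_procedures]
    return correct / total

rates = {n: proportion_correct(n) for n in sorted(TALLIES)}
# Under the reported trend, rates rise monotonically up to five procedures.
```

The same tabulation run separately for domain-general procedures would show the flatter pattern the abstract describes, where one procedure performs about as well as two to five.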


The call to reform undergraduate education involves shifting the emphasis in science classes away from rote memorization of facts toward learning core concepts and scientific practices ( National Research Council [NRC], 2003 ; American Association for the Advancement of Science [AAAS], 2011 ). To develop instruction that focuses on core concepts and scientific practices, we need more knowledge about the concepts that are challenging for students to learn. For example, biology education research has established that students struggle with the concepts of carbon cycling (e.g., Anderson et al. , 1990 ; Hartley et al ., 2011 ) and natural selection (e.g., Nehm and Reilly, 2007 ), but we know much less about students’ conceptual difficulties in ecology and physiology. Researchers and practitioners also need to discover how students develop the ability to use scientific practices. Although these efforts are underway (e.g., Anderson et al. , 2012 ; Gormally et al. , 2012 ; Dirks et al. , 2013 ; Brownell et al. , 2014 ), many research questions remain. As research accumulates, educators can create curricula and assessments that improve student learning for all. We investigate one key scientific practice that is understudied in biology education, problem solving ( AAAS, 2011 ; Singer et al. , 2012 ).

For the purposes of this article, we define problem solving as a decision-making process wherein a person is presented with a task, and the path to solving the task is uncertain. We define a problem as a task that presents a challenge that cannot be solved automatically ( Martinez, 1998 ). Problem-solving research began in the 1940s and 1950s and focused on problem-solving approaches that could be used to solve any problem regardless of the discipline ( Duncker and Lees, 1945 ; Polya, 1957 ; Newell and Simon, 1972 ; Jonassen, 2000 , 2012 ; Bassok and Novick, 2012 ). Despite the broad applicability of these domain-general problem-solving approaches, subsequent research has shown that the strongest problem-solving approaches derive from deep knowledge of a domain ( Newell and Simon, 1972 ; Chi et al. , 1981 ; Pressley et al. , 1987 ). Domain is a term that refers to a body of knowledge that can be broad, like biology, or narrow, like ecosystem structure and function. This body of literature has developed into a theoretical framework called domain-specific problem solving. We situate our research within this theoretical framework.


Domain-specific problem solving has its origins in information-processing theory (IPT; Newell and Simon, 1972 ). IPT focuses on the cognitive processes used to reach a problem solution and emphasizes the general thinking processes people use when they attempt problem solving, such as brainstorming ( Runco and Chand, 1995 ; Halpern, 1997 ) and working backward by beginning with the problem goal and working in reverse toward the initial problem state ( Newell et al. , 1958 ; Chi and Glaser, 1985 ). Despite the empirical evidence for general thinking processes, one of IPT’s shortcomings as a comprehensive view of human cognition ( Dawson, 1998 ) is that the knowledge base of the problem solver is not considered.

Domain-specific problem solving expands IPT to recognize that experts in a particular domain have a relatively complete and well-organized knowledge base that enables them to solve the complex problems they face (e.g., Chase and Simon, 1973 ). One of the landmark studies showing the differences between the knowledge base of experts and nonexperts, or novices, was conducted in science, specifically in physics. Chi and colleagues (1981) compared the classification of physics problems by advanced physics PhD students (i.e., experts) and undergraduates who had just completed a semester of mechanics (i.e., novices), identifying fundamental differences. Chemistry researchers built on Chi’s work to identify differences in how experts and novices track their problem solving and use problem categorization and multiple representations ( Bunce et al ., 1991 ; Kohl and Finkelstein, 2008 ; Catrette and Bodner, 2010 ). Biology researchers built upon this work by conducting similar problem-solving studies among experts and novices in evolution and genetics ( Smith, 1992 ; Smith et al. , 2013 ; Nehm and Ridgway, 2011 ). Taken together, these studies established that experts tend to classify problems based on deep, conceptual features, while novices classify problems based on superficial features that are irrelevant to the solution.

Domain-specific problem-solving research within biology also has revealed important individual differences within groups of problem solvers. These studies show that wide variation in problem-solving performance exists. For example, some novices who solve problems about evolution classify problems and generate solutions that are expert-like, while others do not ( Nehm and Ridgway, 2011 ). This research points to the importance of studying variations in problem solving within novice populations.

Given the centrality of the knowledge base for domain-specific problem solving, it is necessary to describe the components of that knowledge base. Domain-specific problem-solving research recognizes three types of knowledge that contribute to expertise. Declarative knowledge consists of the facts and concepts about the domain. Procedural knowledge represents the how-to knowledge that is required to carry out domain-specific tasks. Conditional knowledge describes the understanding of when and where to use one’s declarative and procedural knowledge ( Alexander and Judy, 1988 ). Note that the field of metacognition also uses this three-type structure to describe metacognitive knowledge, or what you know about your own thinking ( Brown, 1978 ; Jacobs and Paris, 1987 ; Schraw and Moshman, 1995 ). However, for this paper, we use these terms to describe knowledge of biology, not metacognitive knowledge. More specifically, we focus on procedural knowledge.

Procedural knowledge consists of procedures. Procedures are tasks that are carried out automatically or intentionally during problem solving ( Alexander and Judy, 1988 ). Procedures exist on a continuum. They can be highly specific to the domain, such as analyzing the evolutionary relationships represented by a phylogenetic tree, or general and applicable to problems across many domains, such as paraphrasing a problem-solving prompt ( Pressley et al. , 1987 , 1989 ; Alexander and Judy, 1988 ).


We used domain-specific problem solving to investigate the most common form of assessment in the college biology classroom, multiple-choice assessment (Zheng et al., 2008; Momsen et al., 2013). College biology and science, technology, engineering, and mathematics (STEM) courses rely on multiple-choice assessment due to large enrollments, limited teaching assistant support, and ease of scoring. Outside the classroom, multiple-choice assessment is used on high-stakes exams that determine acceptance to professional and graduate schools, such as the Medical College Admission Test and the Graduate Record Examinations. To our knowledge, the framework of domain-specific problem solving has not been applied previously to investigate multiple-choice assessment in college biology.

It has become common practice within the biology education community to think about assessment, including multiple-choice assessment, by determining the Bloom’s taxonomy ranking of assessment items (e.g., Bissell and Lemons, 2006 ; Crowe et al. , 2008 ; Momsen et al. , 2010 , 2013 ). Bloom’s Taxonomy of Educational Objectives was built to facilitate the exchange of test items among faculty; it was not based primarily on the evaluation of student work ( Bloom, 1956 ; Anderson and Krathwohl, 2001 ). Bloom’s taxonomy helps educators think about the range of cognitive processes they could ask their students to perform and has served as an invaluable resource enabling educators to improve alignment between learning objectives, assessments, and classroom curricula (e.g., Crowe et al. , 2008 ). When applying Bloom’s taxonomy to assessment items, items are ranked as remembering, understanding, applying, analyzing, evaluating, and synthesizing. Items ranked as remembering and understanding are grouped as lower-order items; and items ranked as applying, analyzing, evaluating, and synthesizing are grouped as higher-order items ( Zoller, 1993 ; Crowe et al. , 2008 ). Despite the value of Bloom’s taxonomy for instructors, what is not known is the relationship between the procedural knowledge of domain-specific problem solving and the Bloom’s ranking of biology assessments. This is a critical gap in the literature, because efforts to improve student learning in college science classrooms may be stymied if critical insights about student work from domain-specific problem solving are not linked to our understanding of assessment and curricular design.

In the study reported here, we used the theoretical lens of domain-specific problem solving to describe the procedural knowledge of nonmajors in an introductory biology course. We addressed the following research questions:

  • What are the domain-general and domain-specific procedures students use to solve multiple-choice biology problems?
  • To what extent do students use domain-general and domain-specific procedures when solving lower-order versus higher-order problems?
  • To what extent does the use of domain-general or domain-specific procedures influence the probability of answering problems correctly?

Setting and Participants

We recruited participants from a nonmajors introductory biology course at a southeastern public research university in the Spring 2011 semester. One of the authors (P.P.L.) was the course instructor. The course covered four major areas in biology: evolution, ecology, physiology, and organismal diversity. The instructor delivered course content using lecture interspersed with clicker questions and additional opportunities for students to write and discuss. Students also completed five in-class case studies during the semester; students completed cases in self-selected small groups and turned in one completed case study per group for grading. In addition to group case studies, the instructor assessed student learning via individual exams. Students also received points toward their final grades based on clicker participation.

In the second week of the semester, the instructor announced this research study in class and via the course-management system, inviting all students to participate. Students who volunteered by completing an informed consent form were asked to produce written think-alouds for problems on course exams throughout the semester. One hundred sixty-four students completed an informed consent form. Of the 164 consenting students, 140 actually produced a written think-aloud for at least one of 13 problems; of those 140, 18 did written think-alouds for all 13 problems, and the remaining students completed written think-alouds for between one and 12 problems. On average, research participants provided written think-alouds for 7.76 problems.

The 164 consenting students represented 73.9% of the course enrollment ( n = 222). The 164 consenting students included 70.8% females and 29.2% males; 20.4% freshmen, 40.9% sophomores, 24.1% juniors, and 13.9% seniors. The 164 students were majoring in the following areas: 3.7% business, 1.5% education, 4.4% humanities, 11.0% life and physical sciences, 5.9% engineering, and 72.3% social sciences.

This research was conducted under exempt status at the University of Georgia (UGA; IRB project 201110340).

Data Collection

Problem Development.

We wrote 16 multiple-choice problems for this study. All problems related to material covered in class and focused specifically on ecosystems, evolution, and structure–function relationships. During data analysis, three problems were excluded because most students either were confused by the wording or visual representations or were able to solve the problem correctly with a superficial strategy. Each problem was preceded by a prompt for students to provide their written think-aloud (see Written Think-Alouds section). Each problem was also labeled with a preliminary Bloom's taxonomy categorization (Anderson and Krathwohl, 2001). A summary of all problems, including a description, the preliminary Bloom's ranking, and the faculty consensus Bloom's ranking, is provided in Table 1. As an example, one of the final 13 problems is shown in Figure 1. All other problems are shown in Supplemental Figure S1.

Figure 1.

Sample problem from the domain of evolution used to probe students’ problem-solving procedures. The preliminary ranking that students saw for this question was Applying and Analyzing based on Bloom’s taxonomy. Experts ranked this problem as Analyzing. The correct answer is E. Images of benthic and limnetic males are courtesy of Elizabeth Carefoot, Simon Fraser University.

Table 1. Summary of problems used for data collection

For each problem, a description is included along with the preliminary Bloom’s ranking, and the final consensus Bloom’s ranking. The actual problems are included in Supplemental Figure S1.

Ranking of Problems by Bloom’s Level.

We wanted to investigate the use of domain-general or domain-specific procedures on lower-order versus higher-order problems. We asked three biology faculty members who were not investigators in this study to rank the Bloom's levels of the problems we developed. These faculty members were selected because they have extensive experience both teaching college biology and ranking assessment items using Bloom's taxonomy. The faculty used a protocol similar to one described previously (Momsen et al., 2010). To assist with Bloom's ranking, we provided them with class materials relevant to the problems, including lecture notes and background readings. This is necessary because the ranking of a problem depends on the material students have previously encountered in class. The faculty members independently ranked each problem. Interrater reliability of the independent rankings was determined using an intraclass correlation coefficient (0.82). The faculty members then met to discuss their rankings and settled disagreements by consensus. The preliminary Bloom's rankings and the faculty consensus Bloom's rankings are reported in Table 1. For the remainder of the paper, we use the consensus Bloom's rankings to describe problems as either lower order or higher order.
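As a rough illustration of the reliability statistic above: an intraclass correlation can be computed from an items × raters matrix of numeric Bloom's ranks. The sketch below implements one common form, the two-way random-effects single-rater ICC(2,1); the paper does not state which ICC form was used, and the ratings here are entirely hypothetical.

```python
import numpy as np

def icc2_1(ratings):
    """Two-way random-effects, single-rater ICC(2,1).

    ratings: 2-D array, rows = items (problems), cols = raters.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between items
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((x - grand) ** 2) - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical numeric ranks (1 = remembering ... 6 = highest level)
# assigned to five problems by three raters
ranks = [[2, 2, 3],
         [5, 4, 5],
         [1, 1, 1],
         [4, 4, 3],
         [6, 5, 6]]
print(round(icc2_1(ranks), 2))
```

Perfect agreement across raters yields an ICC of 1.0; disagreement on individual items pulls the value down toward zero.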

Administration of Problems to Students.

The 13 problems included in this study were administered on exams 1, 2, and 3 and the final exam as follows: three on exam 1, three on exam 2, four on exam 3, and three on the final exam. Students' multiple-choice responses counted toward the actual exam score. Students received 0.5 extra-credit points for providing satisfactory documentation of their thought processes; they did not receive extra credit if we judged their documentation to be insufficient. Insufficient responses were those in which students made only one or two brief statements about their problem-solving process (e.g., "I chose C"). Students could answer the multiple-choice problem and opt not to provide documentation of their thinking for extra credit. Students could earn up to 6.5 points of extra credit for documenting the full problem set. The total points possible for the semester were 500, so extra credit for this research could account for up to 1.3% of a student's grade.

Written Think-Alouds.

Based on the think-aloud interview approach, we developed a protocol to capture students' written descriptions of their thought processes while they solved problems on exams. In a think-aloud interview, research participants are given a problem to solve and are asked to say aloud everything they are thinking while solving the problem (Ericsson and Simon, 1984; Keys, 2000). In the written think-aloud, students are asked to write, rather than say aloud, what they are thinking as they solve a problem. To train students to perform a written think-aloud, the course instructor modeled the think-aloud in class. She then assigned a homework problem that required students to answer a multiple-choice question and construct a written think-aloud recounting how they solved it. We then reviewed students' homework and provided feedback. We selected examples of good and poor documentation and published them anonymously on the online course-management system. After this training and feedback, we included three or four problems on every exam for which we asked students to provide a written think-aloud description. We collected 1087 written think-alouds from 140 students (63% of course enrollment, n = 222) for 13 problems. Figure 2 shows a typical example of a student written think-aloud.

Figure 2.

Written think-aloud from an introductory biology student who had been instructed to write down her procedures for solving a multiple-choice biology problem. This document describes the student’s procedures for solving the problem shown in Figure 1 .

Data Analysis

We analyzed students’ written think-alouds using a combination of qualitative and quantitative methods. We used qualitative content analysis ( Patton, 1990 ) to identify and categorize the primary patterns of student thinking during problem solving. We used quantitative analysis to determine the relationship between use of domain-general, hybrid, and domain-specific procedures and problem type and to investigate the impact of domain-general/hybrid and domain-specific procedure use on answering correctly.

Qualitative Analyses of Students’ Written Think-Alouds.

The goal of our qualitative analysis was to identify the cognitive procedures students follow to solve multiple-choice biology problems during an exam. Our qualitative analysis took place in three phases.

Phase 1: Establishing Categories of Student Problem-Solving Procedures.

Independently, we read dozens of individual think-alouds for each problem. While we read, we made notes about the types of procedures we observed. One author (P.P.L.) noted, for example, that students recalled concepts, organized their thinking, read and ruled out multiple-choice options, explained their selections, and weighed the pros and cons of multiple-choice options. The other author (L.B.P.) noted that students recalled theories, interpreted a phylogenetic tree, identified incomplete information, and refuted incorrect information. After independently reviewing the written think-alouds, we met to discuss what we had found and to build an initial list of categories of problem-solving procedures. Based on our discussion, we built a master list of categories of procedures (Supplemental Table S1).

Next, we compared our list with Bloom’s Taxonomy of Educational Objectives ( Anderson and Krathwohl, 2001 ) and the Blooming Biology Tool ( Crowe et al. , 2008 ). We sought to determine whether the cognitive processes described in these sources corresponded to the cognitive processes we observed in our initial review of students’ written think-alouds. Where there was overlap, we renamed our categories to use the language of Bloom’s taxonomy. For the categories that did not overlap, we kept our original names.

Phase 2: Assigning Student Problem-Solving Procedures to Categories.

Using the list of categories developed in phase 1, we categorized every problem-solving procedure articulated by students in the written think-alouds. We analyzed 1087 documents for 13 problems. For each of the 13 problems, we followed the same categorization process. In a one-on-one meeting, we discussed a few written think-alouds. While still in the same room, we categorized several written think-alouds independently. We then compared our categorizations and discussed any disagreements. We then repeated these steps for additional think-alouds while still together. Once we reached agreement on all categories for a single problem, we independently categorized a common subset of written think-alouds to determine interrater reliability. When interrater reliability was below a level we considered acceptable (0.8 Cronbach’s alpha), we went through the process again. Then one author (either L.B.P. or P.P.L.) categorized the remainder of the written think-alouds for that problem.
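The Cronbach's alpha threshold mentioned above can be illustrated by treating the two raters' codings as "items." The sketch below uses entirely hypothetical per-document counts, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha.

    scores: 2-D array, rows = think-aloud documents, cols = raters.
    """
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]                        # number of raters ("items")
    item_vars = x.var(axis=0, ddof=1)     # variance of each rater's scores
    total_var = x.sum(axis=1).var(ddof=1) # variance of per-document totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical numbers of procedures coded by two raters in ten documents
coder_a = [3, 1, 4, 2, 5, 2, 3, 1, 4, 3]
coder_b = [3, 1, 4, 2, 4, 2, 3, 2, 4, 3]
alpha = cronbach_alpha(np.column_stack([coder_a, coder_b]))
print(round(alpha, 2))
```

Identical codings give alpha = 1; values below a chosen threshold (0.8 in the study) signal that the coding scheme needs renegotiation before proceeding.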

At the end of phase 2, after we had categorized all 1087 written think-alouds, we refined our category list, removing categories with extremely low frequencies and grouping closely related categories. For example, we combined the category Executing with Implementing into a category called Analyzing Visual Representations.

Phase 3: Aligning Categories with Our Theoretical Framework.

Having assigned student problem-solving procedures to categories, we determined whether the category aligned best with domain-general or domain-specific problem solving. To make this determination, we considered the extent to which the problem-solving procedures in a category depended on knowledge of biology. Categories of procedures aligned with domain-general problem solving were carried out without drawing on content knowledge (e.g., Clarifying). Categories aligned with domain-specific problem solving were carried out using content knowledge (e.g., Checking). We also identified two categories of problem solving that we labeled hybrids of domain-general and domain-specific problem solving, because students used content knowledge in these steps, but they did so superficially (e.g., Recognizing).

Supplemental Table S1 shows the categories that resulted from our analytical process, including phase 1 notes, phase 2 categories, and phase 3 final category names as presented in this paper. Categories are organized into the themes of domain-general, hybrid, and domain-specific problem solving (Supplemental Table S1).

Quantitative Analyses of Students’ Written Think-Alouds.

To determine whether students used domain-general/hybrid or domain-specific problem solving preferentially when solving problems ranked by faculty as lower order or higher order, we used generalized linear mixed models (GLMM). GLMM are similar to ordinary linear regressions but accommodate nonnormal response distributions. GLMM can also be applied to unbalanced repeated measures (Fitzmaurice et al., 2011). In our data set, an individual student could provide documentation for one or more problems (up to 13 problems). Thus, in some but not all cases, we have repeated measures for individuals. To account for these repeated measures, we used "student" as our random factor. We used the problem type (lower order or higher order) as our fixed factor. Because our dependent variables, the number of domain-general/hybrid procedures and the number of domain-specific procedures, are counts, we used a negative binomial regression. For this analysis and subsequent quantitative analyses, we grouped domain-general and hybrid procedures. Even though hybrid procedures involve some use of content knowledge, that use is superficial; we specifically wanted to compare weak content-knowledge use with strong content-knowledge use. Additionally, the number of hybrid procedures in our data set is relatively low compared with the numbers of domain-general and domain-specific procedures.
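As a sketch of the count-data regression described above, the following fits a fixed-effects-only negative binomial (NB2) model by maximizing its log-likelihood directly. This omits the random student intercept of the actual GLMM, and all data and coefficients here are simulated for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Simulate procedure counts: higher_order = 1 raises the expected count
n = 1000
higher_order = rng.integers(0, 2, n)
true_beta = np.array([0.8, 0.4])        # intercept, problem-type effect (invented)
mu_true = np.exp(true_beta[0] + true_beta[1] * higher_order)
r_true = 2.0                            # dispersion parameter (r = 1/alpha)
y = rng.negative_binomial(r_true, r_true / (r_true + mu_true))

X = np.column_stack([np.ones(n), higher_order])

def neg_loglik(params):
    # NB2 log-likelihood with log link for the mean and log-scale dispersion
    beta, log_r = params[:2], params[2]
    r = np.exp(log_r)
    mu = np.exp(X @ beta)
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    return -ll.sum()

fit = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
b0, b1 = fit.x[0], fit.x[1]
print(round(b0, 2), round(b1, 2))
```

With 1000 simulated observations, the recovered coefficients land close to the generating values; adding a per-student random intercept, as the study did, would require a mixed-model package (e.g., lme4 in R or statsmodels' Bayesian mixed GLMs).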

To determine whether students who used more domain-general/hybrid procedures or domain-specific procedures were more likely to have correct answers to the problems, we also used GLMM. We used the number of domain-general/hybrid procedures and the number of domain-specific procedures as our fixed factors and student as our random factor. In this analysis, our dependent variable (correct or incorrect response) was dichotomous, so we used a logistic regression ( Fitzmaurice et al. , 2011 ). We also explored the correlations between the average number of domain-general/hybrid and domain-specific procedures used by students and their final percentage of points for the course.

In this section, we present the results of our analyses of students' procedures while solving 13 multiple-choice biology problems (Figure 1 and Supplemental Figure S1). We used the written think-aloud protocol to discover students' problem-solving procedures for all 13 problems.

Students Use Domain-General and Domain-Specific Procedures to Solve Multiple-Choice Biology Problems

We identified several categories of procedures practiced by students during problem solving, and we organized these categories based on the extent to which they drew upon knowledge of biology. Domain-general procedures do not depend on biology content knowledge. These procedures also could be used in other domains. Hybrid procedures show students assessing multiple-choice options with limited and superficial references to biology content knowledge. Domain-specific procedures depend on biology content knowledge and reveal students’ retrieval and processing of correct ideas about biology.

Domain-General Procedures.

We identified five domain-general problem-solving procedures that students practiced ( Table 2 ). Three of these have been described in Bloom’s taxonomy ( Anderson and Krathwohl, 2001 ). These include Analyzing Domain-General Visual Representations, Clarifying, and Comparing Language of Options. In addition, we discovered two other procedures, Correcting and Delaying, that we also categorized as domain general ( Table 2 ).

Table 2. Students’ problem-solving procedures while solving multiple-choice biology problems

The procedures are categorized as domain general, hybrid, and domain specific. Superscripts indicate whether the problem-solving procedure aligns with previously published conceptions of student thinking or was newly identified in this study: a, Anderson and Krathwohl (2001); b, identified in this study; c, Crowe et al. (2008).

During Correcting, students practiced metacognition. Broadly defined, metacognition occurs when someone knows, is aware of, or monitors his or her own learning ( White, 1998 ). When students corrected, they identified incorrect thinking they had displayed earlier in their written think-aloud and mentioned the correct way of thinking about the problem.

When students Delayed, they described their decision to postpone full consideration of one multiple-choice option until they considered other multiple-choice options. We interpreted these decisions as students either not remembering how the option connected with the question or not being able to connect that option to the question well enough to decide whether it could be the right answer.

Hybrid Procedures.

We identified two problem-solving procedures that we categorized as hybrid, Comparing Correctness of Options and Recognizing. Students who compared correctness of options stated that one choice appeared more correct than the other without giving content-supported reasoning for their choice. Similarly, students who recognized an option as correct did not support this conclusion with a content-based rationale.

Domain-Specific Procedures.

In our data set, we identified six domain-specific problem-solving procedures practiced by students ( Table 2 ). Four of these have been previously described. Specifically, Analyzing Domain-Specific Visual Representations, Checking, and Recalling were described in Bloom’s taxonomy ( Anderson and Krathwohl, 2001 ). Predicting was described by Crowe and colleagues (2008) . We identified two additional categories of domain-specific problem-solving procedures practiced by students who completed our problem set, Adding Information and Asking a Question.

Adding Information occurred when students recalled material that was pertinent to one of the multiple-choice options and incorporated that information into their explanations of why a particular option was wrong or right.

Asking a Question provides another illustration of students practicing metacognition. When students asked a question, they pointed out that they needed to know some specific piece of content that they did not know yet. Typically, students who asked a question did so repeatedly in a single written think-aloud.

Students Make Errors While Solving Multiple-Choice Biology Problems

In addition to identifying the domain-general, hybrid, and domain-specific procedures that supported students' problem solving, we identified errors in students' problem solving. We observed six categories of errors, four of which we categorized as domain general and two as domain specific ( Table 3 ).

Table 3. Students’ errors while solving multiple-choice biology problems

The errors are presented in alphabetical order, described, and illustrated with example quotes from different students’ documentation of their solutions to the problem shown in Figure 1 (except for Misreading, which is from problem 13 in Supplemental Figure S1).

The domain-general errors include Contradicting, Disregarding Evidence, Misreading, and Opinion-Based Judgment. In some cases, students made statements that they later contradicted; we called this Contradicting. Disregarding Evidence occurred when students failed to indicate use of evidence. Several problems included data in the question prompt or in visual representations. These data could be used to help students select the best multiple-choice option, yet many students gave no indication that they considered them. When students' words led us to believe that they did not examine the data, we assigned the category Disregarding Evidence.

Students also misread the prompt or the multiple-choice options; we termed this Misreading. For example, Table 3 shows an instance of Misreading: the student states that Atlantic eels are in the presence of krait toxins, whereas the question prompt stated that there are no kraits in the Atlantic Ocean. In other cases, students stated that they arrived at a decision based on a feeling or because an option just seemed right. For example, in selecting option C for the stickleback problem ( Figure 1 ), one student said, "E may be right, but I feel confident with C. I chose Answer C." These procedures were coded as Opinion-Based Judgment.

We identified two additional errors that we classified as domain specific, Making Incorrect Assumptions and Misunderstanding Content. Making Incorrect Assumptions was identified when students made faulty assumptions about the information provided in the prompt. In these cases, students demonstrated in one part of their written think-aloud that they understood the conditions for or components of a concept. However, in another part of the written think-aloud, students assumed the presence or absence of these conditions or components without carefully examining whether they held for the given problem. In the example shown in Table 3 , the student assumed additional information on fertility that was not provided in the problem.

We classified errors that showed a poor understanding of the biology content as Misunderstanding Content. Misunderstanding Content was exhibited when students stated incorrect facts from their long-term memory, made false connections between the material presented and biology concepts, or showed gaps in their understanding of a concept. In the Misunderstanding Content example shown in Table 3 , the student did not understand that the biological species concept requires two conditions, that is, the offspring must be viable and fertile. The student selected the biological species concept based only on evidence of viability, demonstrating misunderstanding.

To illustrate the problem-solving procedures described above, we present three student written think-alouds ( Table 4, A–C ). All three think-alouds were generated in response to the stickleback problem ( Figure 1 ); pseudonyms are used to protect students' identities. Emily correctly solved the stickleback problem using a combination of domain-general and domain-specific procedures ( Table 4A ). She started by thinking about the type of answer she was looking for (Predicting). Then she analyzed the stickleback drawings and population table (Analyzing Domain-General Visual Representations) and explained why options were incorrect or correct based on her knowledge of species concepts (Checking). Brian ( Table 4B ) took an approach that included domain-general and hybrid procedures, and he made some domain-general and domain-specific errors, which resulted in an incorrect answer. Brian analyzed some of the domain-general visual representations presented in the problem but disregarded others. He misunderstood the content, incorrectly accepting the biological species concept. He also demonstrated Recognizing when he correctly eliminated choice B without giving a rationale for this step. In our third example ( Table 4C ), Jessica used domain-general, hybrid, and domain-specific procedures, along with a domain-specific error, and arrived at an incorrect answer.

Table 4. Students’ written think-alouds describing their processes for solving the stickleback problem

Different types of problem-solving processes are indicated with different font types: domain-general problem-solving steps, blue lowercase font; domain-specific problem-solving steps, blue uppercase font; hybrid problem-solving steps, blue italics; domain-general errors, orange lowercase font; domain-specific errors, orange uppercase font. The written think-alouds are presented in the exact words of the students. A, Emily: all domain-general and domain-specific steps; correct answer, E. B, Brian: domain-general and hybrid steps, domain-general and domain-specific errors; incorrect answer, C. C, Jessica: domain-general, hybrid, and domain-specific steps; domain-specific errors; incorrect answer, C.

Domain-Specific Procedures Are Used More Frequently for Higher-Order Problems Than Lower-Order Problems

To determine the extent to which students use domain-general and domain-specific procedures when solving lower-order versus higher-order problems, we determined the frequency of domain-general and hybrid procedures and domain-specific procedures for problems categorized by experts as lower order or higher order. We grouped domain-general and hybrid procedures, because we specifically wanted to examine the difference between weak and strong content usage. As Table 5, A and B , shows, students frequently used both domain-general/hybrid and domain-specific procedures to solve all problems. For domain-general/hybrid procedures, by far the most frequently used procedure for lower-order problems was Recognizing ( n = 413); the two most frequently used procedures for higher-order problems were Analyzing Domain-General Representations ( n = 153) and Recognizing ( n = 105; Table 5A ). For domain-specific procedures, the use of Checking dominated both lower-order ( n = 903) and higher-order problems ( n = 779). Recalling also was used relatively frequently for lower-order problems ( n = 207), as were Analyzing Domain-Specific Visual Representations, Predicting, and Recalling for higher-order problems ( n = 120, n = 106, and n = 107, respectively). Overall, students used more domain-general and hybrid procedures when solving lower-order problems (1.43 ± 1.348 per problem) than when solving higher-order problems (0.74 ± 1.024 per problem; binomial regression B = 0.566, SE = 0.079, p < 0.005). Students used more domain-specific procedures when solving higher-order problems (2.57 ± 1.786 per problem) than when solving lower-order problems (2.38 ± 2.2127 per problem; binomial regression B = 0.112, SE = 0.056, p < 0.001).
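One standard way to read the B coefficients reported above is to exponentiate them into rate ratios, the usual interpretation of negative binomial regression coefficients (the comparison directions are as reported in the text):

```python
import math

# Fixed-effect coefficients (B) reported for the two negative binomial regressions
b_general = 0.566   # domain-general/hybrid: lower- vs. higher-order problems
b_specific = 0.112  # domain-specific: higher- vs. lower-order problems

# exp(B) is the multiplicative change in the expected procedure count
print(round(math.exp(b_general), 2))   # prints 1.76
print(round(math.exp(b_specific), 2))  # prints 1.12
```

That is, the model predicts roughly 76% more domain-general/hybrid procedures per lower-order problem and about 12% more domain-specific procedures per higher-order problem. These model-based ratios need not equal the raw ratios of means (e.g., 1.43/0.74 ≈ 1.93), because the model adjusts for repeated measures within students.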

Table 5. Frequency of each problem-solving procedure for lower-order and higher-order problems

Procedures are presented from left to right in alphabetical order. A color scale is used to represent the frequency of each procedure, with the lowest-frequency procedures shown in dark blue, moderate-frequency procedures shown in white, and high-frequency procedures shown in dark red.

Most Problem-Solving Errors Made by Students Involve Misunderstanding Content

We also considered the frequency of problem-solving errors made by students solving lower-order and higher-order problems. As Table 6 shows, most errors were categorized with the domain-specific category Misunderstanding Content, and this occurred with about equal frequency in lower-order and higher-order problems. The other categories of errors were less frequent. Interestingly, the domain-general errors Contradicting and Opinion-Based Judgment both occurred more frequently with lower-order problems. In contrast, the domain-specific error Making Incorrect Assumptions occurred more frequently with higher-order problems.

Table 6. Frequency of errors for lower-order and higher-order problems

Categories of errors are presented from left to right in alphabetical order. A color scale is used to represent the frequency of each type of error, with the lowest-frequency errors shown in dark blue, moderate-frequency errors shown in white, and high-frequency errors shown in dark red.

Using Multiple Domain-Specific Procedures Increases the Likelihood of Answering a Problem Correctly

To examine the extent to which the use of domain-general or domain-specific procedures influences the probability of answering problems correctly, we performed a logistic regression. Predicted probabilities of answering correctly are shown in Figure 3 for domain-general and hybrid procedures and in Figure 4 for domain-specific procedures. Coefficients of the logistic regression analyses are presented in Supplemental Tables S2 and S3. As Figure 3 shows, using zero domain-general or hybrid procedures was associated with a 0.53 predicted probability of being correct. Using one domain-general or hybrid procedure instead of zero increased the predicted probability of correctly answering a problem to 0.79. However, using two or more domain-general or hybrid procedures instead of one did not further increase the predicted probability of answering correctly. In contrast, as Figure 4 shows, using zero domain-specific procedures was associated with only a 0.34 predicted probability of answering the problem correctly, and students who used one domain-specific procedure had a 0.54 predicted probability of success. Strikingly, the more domain-specific procedures students used, the more likely they were to answer a problem correctly, up to five procedures: students who used five domain-specific procedures had a 0.97 predicted probability of answering correctly. Predicted probabilities for students using seven and nine domain-specific procedures show large confidence intervals around the predictions due to the low sample sizes (n = 8 and 4, respectively). We also examined the extent to which the use of domain-general or domain-specific procedures correlates with course performance. We observed a weak positive correlation between the average number of domain-specific procedures used by students for a problem and their final percentage of points in the course (Spearman's rho = 0.306; p < 0.001).
There was no correlation between the average number of domain-general/hybrid procedures used by students for a problem and their final percentage of points in the course (Spearman’s rho = 0.015; p = 0.857).
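The shape of these logistic-regression predictions, and the rank correlation used for course performance, can be illustrated with a small sketch. The coefficients below are hypothetical values chosen only so that the toy curve reproduces the reported probabilities for domain-specific procedures (0.34 at zero, 0.54 at one, 0.97 at five); they are not the fitted coefficients from Supplemental Tables S2 and S3, and `spearman_rho` is a hand-rolled helper for tie-free data, not the analysis code used in the study.

```python
import math

def predicted_prob(intercept, slope, n_procedures):
    # Logistic model: p = 1 / (1 + e^-(b0 + b1 * x))
    z = intercept + slope * n_procedures
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients (NOT the study's fitted values), picked so the
# curve echoes the reported pattern for domain-specific procedures.
b0, b1 = -0.66, 0.82
for k in range(6):
    print(k, round(predicted_prob(b0, b1, k), 2))

def spearman_rho(x, y):
    # Spearman's rank correlation for data without ties:
    # rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the rank difference.
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

A perfectly monotone pairing yields rho = 1.0; the association observed in the study was far weaker (rho = 0.306), which is why it is described as a weak positive correlation.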

[Figure 3. Predicted probability of a correct answer based on the number of domain-general and hybrid procedures.]

[Figure 4. Predicted probability of a correct answer based on the number of domain-specific procedures.]

Discussion

We have used the theoretical framework of domain-specific problem solving to investigate student cognition while students solved multiple-choice biology problems about ecology, evolution, and systems biology. Previous research on undergraduate cognition during problem solving has focused on problem categorization or on students' solutions to open-response problems (Smith and Good, 1984; Smith, 1988; Lavoie, 1993; Nehm and Ridgway, 2011; Smith et al., 2013). Our goal was to describe students' procedural knowledge, including the errors they made in their procedures. Below we draw several important conclusions from our findings and consider the implications of this research for teaching and learning.

Domain-Specific Problem Solving Should Be Used for Innovative Investigations of Biology Problem Solving

Students in our study used a variety of procedures to solve multiple-choice biology problems, but only a few procedures were used at high frequency, such as Recognizing and Checking. Other procedures that biology educators might most want students to employ were used relatively infrequently, including Correcting and Predicting. Still other procedures that we expected to find in our data set were all but absent, such as Stating Assumptions. Our research uncovers the range of procedures promoted by multiple-choice assessment in biology. It also provides evidence that multiple-choice assessments are limited in their ability to prompt some of the critical types of thinking used by biologists.

We propose that our categorization scheme and the theoretical framework of domain-specific problem solving be applied to further study of biology problem solving. Future studies could examine whether different ways of asking students to solve a problem at the same Bloom's level stimulate them to use different procedures. For example, if the stickleback problem (Figure 1) were instead presented as a two-tier multiple-choice problem, as multiple true–false statements, or as a constructed-response problem, how would students' procedures differ? Additionally, it would be useful to investigate whether the highly desired but less often observed procedures of Correcting and Predicting are used more frequently in upper-level biology courses and among more advanced biology students.

We also propose research to study the interaction between procedure and content. With our focus on procedural knowledge, we intentionally avoided an analysis of students’ declarative knowledge. However, our process of analysis led us to the conclusion that our framework can be expanded for even more fruitful research. For example, one could look within the procedural category Checking to identify the declarative knowledge being accessed. Of all the relevant declarative knowledge for a particular problem, which pieces do students typically access and which pieces are typically overlooked? The answer to this question may tell us that, while students are using an important domain-specific procedure, they struggle to apply a particular piece of declarative knowledge. As another example, one could look within the procedural category Analyzing Visual Representations to identify aspects of the visual representation that confuse or elude students. Findings from this type of research would show us how to modify visual representations for clarity or how to scaffold instruction for improved learning. We are suggesting that future concurrent studies of declarative and procedural knowledge will reveal aspects of student cognition that will stay hidden if these two types of knowledge are studied separately. Indeed, problem-solving researchers have investigated these types of interactions in the area of comprehension of science textbooks ( Alexander and Kulikowich, 1991 , 1994 ).

Lower-Order Problems May Not Require Content Knowledge, While Higher-Order Problems Promote Strong Content Usage

Because of the pervasive use among biology educators of Bloom’s taxonomy to write and evaluate multiple-choice assessments, we decided it was valuable to examine the relationship between domain-general and domain-specific procedures and lower-order versus higher-order problems.

For both lower-order and higher-order problems, domain-specific procedures were used much more frequently than domain-general procedures (Table 5, A and B). This is comforting and unsurprising. We administered problems about ecosystems, evolution, and structure–function relationships, so we expected and hoped students would use their knowledge of biology to solve them. However, two other results strike us as particularly important. First, domain-general procedures are highly prevalent (Table 5A, n = 1108 across all problems). Some use of domain-general procedures is expected; certain procedures are good practice in problem solving regardless of content, such as Analyzing Domain-General Visual Representations and Clarifying. However, students' extensive use of other domain-general/hybrid categories, namely Recognizing, is disturbing. Here we see students doing what all biology educators who use multiple-choice assessment fear: scanning the options for one that looks right based on limited knowledge. It is even more concerning that students' use of Recognizing is nearly four times more prevalent in lower-order problems than in higher-order problems and that domain-general procedures overall are more prevalent in lower-order problems (Table 5A). As researchers have discovered, lower-order problems, not higher-order problems, are the type most often found in college biology courses (Momsen et al., 2010). Thus, biology instructors' overreliance on lower-order assessment is likely contributing to students' overreliance on procedures that do not require biology content knowledge.

Second, it is striking that domain-specific procedures are more prevalent among higher-order problems than lower-order problems. These data suggest that higher-order problems promote strong content usage by students. As others have argued, higher-order problems should be used in class and on exams more frequently (Crowe et al., 2008; Momsen et al., 2010).

Using Domain-Specific Procedures May Improve Student Performance

Although it is interesting in and of itself to learn the procedures students use during multiple-choice assessment, the description of these categories raises the question: does the type of procedure used by students make any difference in their ability to choose a correct answer? As explained in the Introduction, the strongest problem-solving approaches stem from a relatively complete and well-organized knowledge base within a domain (Chase and Simon, 1973; Chi et al., 1981; Pressley et al., 1987; Alexander and Judy, 1988). Thus, we hypothesized that use of domain-specific procedures would be associated with solving problems correctly, but use of domain-general procedures would not. Indeed, our data support this hypothesis. While limited use of domain-general procedures was associated with an improved probability of success in solving multiple-choice problems, students who used domain-specific procedures extensively almost guaranteed themselves success. In addition, as students used more domain-specific procedures, course performance showed a weak positive increase, while use of domain-general procedures showed no correlation with performance. These data reiterate the conclusions of prior research that successful problem solvers connect information provided within the problem to their relatively strong domain-specific knowledge (Smith and Good, 1984; Pressley et al., 1987). In contrast, unsuccessful problem solvers depend heavily on relatively weak domain-specific knowledge (Smith and Good, 1984; Smith, 1988). General problem-solving procedures can be used to make some progress toward a solution to a domain-specific problem, but a problem solver can get only so far with this type of thinking. At some point, the solver has to understand the particulars of the domain to reach a legitimate solution (reviewed in Pressley et al., 1987; Bassok and Novick, 2012). Likewise, problem solvers who misunderstand key conceptual pieces or cannot identify the deep, salient features of a problem will generate inadequate, incomplete, or faulty solutions (Chi et al., 1981; Nehm and Ridgway, 2011).

Our findings strengthen the conclusions of previous work in two important ways. First, we studied problems from a wider range of biology topics. Second, we studied a larger population of students, which allowed us to use both qualitative and quantitative methods.

Limitations of This Research

Think-aloud protocols typically take place in an interview setting in which students verbally articulate their thought processes while solving a problem. When students are silent, the interviewer is there to prompt them to continue thinking aloud. We modified this protocol and taught students how to write out their procedures. However, one limitation of this study and all think-aloud studies is that it is not possible to analyze what students may have been thinking but did not state. Despite this limitation, we were able to identify a range of problem-solving procedures and errors that inform teaching and learning.

Implications for Teaching and Learning

There is general consensus among biology faculty that students need to develop problem-solving skills ( NRC, 2003 ; AAAS, 2011 ). However, problem solving is not intuitive to students, and these skills typically are not explicitly taught in the classroom ( Nehm, 2010 ; Hoskinson et al. , 2013 ). One reason for this misalignment between faculty values and their teaching practice is that biology problem-solving procedures have not been clearly defined. Our research presents a categorization of problem-solving procedures that faculty can use in their teaching. Instructors can use these well-defined problem-solving procedures to help students manage their knowledge of biology; students can be taught when and how to apply knowledge and how to restructure it. This gives students the tools to become more independent problem solvers ( Nehm, 2010 ).

We envision at least three ways that faculty can encourage students to become independent problem solvers. First, faculty can model the use of the problem-solving procedures described in this paper and have students write out their own procedures, which makes them explicit to both the students and the instructor. Second, this modeling should focus on domain-specific procedures, because these are the steps that improve performance. Explicit modeling of domain-specific procedures would be eye-opening for students, who tend to think that studying for recognition is sufficient, particularly for multiple-choice assessment. However, our data and those of other researchers (Stanger-Hall, 2012) suggest that studying for and working through problems using strong domain-specific knowledge can improve performance, even on multiple-choice tests. Third, faculty should shift from the current predominant use of lower-order problems (Momsen et al., 2010) toward more higher-order problems. Our data show that lower-order problems prompt domain-general problem solving, while higher-order problems prompt domain-specific problem solving.

We took what we learned from the investigation reported here and applied it to develop an online tutorial called SOLVEIT for undergraduate biology students (Kim et al., 2015). In SOLVEIT, students are presented with problems similar to the stickleback problem shown in Figure 1. The problems focus on species concepts and ecological relationships. In brief, SOLVEIT asks students to provide an initial solution to each problem, and then it guides students through the problem in a step-by-step manner that encourages them to practice several of the problem-solving procedures reported here, such as Recalling, Checking, Analyzing Visual Representations, and Correcting. In the final stages of SOLVEIT, students are asked to revise their initial solutions and to reflect on an expert's solution as well as their own problem-solving process (Kim et al., 2015). Our findings of improved student learning with SOLVEIT (Kim et al., 2015) are consistent with the research of others showing that scaffolding can improve student problem solving (Lin and Lehman, 1999; Belland, 2010; Singh and Haileselassie, 2010). Thus, research that uncovers students' difficulties during problem solving can be directly applied to improve student learning.

Supplementary Material


Acknowledgments

We thank the students who participated in this study and the biology faculty who served as experts by providing Bloom's rankings for each problem. We also thank the Biology Education Research Group at UGA, who improved the quality of this work with critical feedback on the manuscript. Finally, we thank the reviewers, whose feedback greatly improved the manuscript. Resources for this research were provided by UGA and the UGA Office of STEM Education.

References

  • Alexander PA, Judy JE. The interaction of domain-specific and strategic knowledge and academic performance. Rev Educ Res. 1988;58:375–404.
  • Alexander PA, Kulikowich JM. Domain-specific and strategic knowledge as predictors of expository text comprehension. J Reading Behav. 1991;23:165–190.
  • Alexander PA, Kulikowich JM. Learning from physics text: a synthesis of recent research. J Res Sci Teach. 1994;31:895–911.
  • American Association for the Advancement of Science. Vision and Change in Undergraduate Biology Education: A Call to Action. Washington, DC: AAAS; 2011.
  • Anderson C, Sheldon TH, Dubay J. The effects of instruction on college non-majors' conceptions of respiration and photosynthesis. J Res Sci Teach. 1990;27:761–776.
  • Anderson LW, Krathwohl DR. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn & Bacon; 2001.
  • Anderson TR, Schönborn KJ, du Plessis L, Gupthar AS, Hull TL. Identifying and developing students' ability to reason with concepts and representations in biology. In: Treagust DF, Tsui C, editors. Multiple Representations in Biological Education, vol. 7. Dordrecht, Netherlands: Springer; 2012. pp. 19–38.
  • Bassok M, Novick LR. Problem solving. In: Holyoak KJ, Morrison RG, editors. Oxford Handbook of Thinking and Reasoning. New York: Oxford University Press; 2012. pp. 413–432.
  • Belland BR. Portraits of middle school students constructing evidence-based arguments during problem-based learning: the impact of computer-based scaffolds. Educ Technol Res Dev. 2010;58:285–309.
  • Bissell AN, Lemons PP. A new method for assessing critical thinking in the classroom. BioScience. 2006;56:66–72.
  • Bloom BS. Taxonomy of Educational Objectives: The Classification of Educational Goals. New York: McKay; 1956.
  • Brown AL. Knowing when, where, and how to remember: a problem of metacognition. In: Glaser R, editor. Advances in Instructional Psychology, vol. 1. Hillsdale, NJ: Erlbaum; 1978. pp. 77–165.
  • Brownell SE, Wenderoth MP, Theobald R, Okoroafor N, Koval M, Freeman S, Walcher-Chevillet CL, Crowe AJ. How students think about experimental design: novel conceptions revealed by in-class activities. BioScience. 2014;64:125–137.
  • Bunce DM, Gabel DL, Samuel JV. Enhancing chemistry problem-solving achievement using problem categorization. J Res Sci Teach. 1991;28:505–521.
  • Cartrette DP, Bodner GM. Non-mathematical problem solving in organic chemistry. J Res Sci Teach. 2010;47:643–660.
  • Chase WG, Simon HA. The mind's eye in chess. In: Chase WG, editor. Visual Information Processing. New York: Academic; 1973. pp. 115–181.
  • Chi MTH, Feltovich PJ, Glaser R. Categorization and representation of physics problems by experts and novices. Cogn Sci. 1981;5:121–152.
  • Chi MTH, Glaser R. Problem-solving ability. In: Sternberg RJ, editor. Human Abilities: An Information-Processing Approach. New York: Freeman; 1985.
  • Crowe A, Dirks C, Wenderoth MP. Biology in Bloom: implementing Bloom's taxonomy to enhance student learning in biology. CBE Life Sci Educ. 2008;7:368–381.
  • Dawson MRW. Understanding Cognitive Science, 1st ed. Malden, MA: Wiley-Blackwell; 1998.
  • Dirks C, Leroy C, Wenderoth MP. Science Process and Reasoning Skills Test (SPARST): development and early diagnostic results. Presented at the Society for the Advancement of Biology Education Research (SABER) annual meeting, held 11–14 July 2013, in Minneapolis, MN.
  • Duncker K, Lees LS. On problem-solving. Psychol Monogr. 1945;58:i–113.
  • Ericsson KA, Simon HA. Protocol Analysis: Verbal Reports as Data, rev. ed. Cambridge, MA: MIT Press; 1984.
  • Fitzmaurice GM, Laird NM, Ware JH. Applied Longitudinal Analysis, 2nd ed. Hoboken, NJ: Wiley; 2011.
  • Gormally C, Brickman P, Lutz M. Developing a test of scientific literacy skills (TOSLS): measuring undergraduates' evaluations of scientific information and arguments. CBE Life Sci Educ. 2012;11:364–377.
  • Halpern DE. Critical Thinking across the Curriculum: A Brief Edition of Thought and Knowledge. Mahwah, NJ: Erlbaum; 1997.
  • Hartley LM, Wilke BJ, Schramm JW, D'Avanzo C, Anderson CW. College students' understanding of the carbon cycle: contrasting principle-based and informal reasoning. BioScience. 2011;61:65–75.
  • Hoskinson A-M, Caballero MD, Knight JK. How can we improve problem solving in undergraduate biology? Applying lessons from 30 years of physics education research. CBE Life Sci Educ. 2013;12:153–161.
  • Jacobs JE, Paris SG. Children's metacognition about reading—issues in definition, measurement, and instruction. Educ Psychol. 1987;22:255–278.
  • Jonassen D. Designing for problem solving. In: Reiser RA, Dempsey JV, editors. Trends and Issues in Instructional Design and Technology, 3rd ed. Boston, MA: Pearson Education; 2012. pp. 64–74.
  • Jonassen DH. Toward a design theory of problem solving. Educ Technol Res Dev. 2000;48:63–85.
  • Keys CW. Investigating the thinking processes of eighth grade writers during the composition of a scientific laboratory report. J Res Sci Teach. 2000;37:676–690.
  • Kim HS, Prevost L, Lemons PP. Students' usability evaluation of a Web-based tutorial program for college biology problem solving. J Comput Assist Learn. 2015;31:362–377.
  • Kohl PB, Finkelstein ND. Patterns of multiple representation use by experts and novices during physics problem solving. Phys Rev Spec Top Phys Educ Res. 2008;4:010111.
  • Lavoie DR. The development, theory, and application of a cognitive-network model of prediction problem solving in biology. J Res Sci Teach. 1993;30:767–785.
  • Lin X, Lehman JD. Supporting learning of variable control in a computer-based biology environment: effects of prompting college students to reflect on their own thinking. J Res Sci Teach. 1999;36:837–858.
  • Martinez ME. What is problem solving? Phi Delta Kappan. 1998;79:605–609.
  • Momsen J, Offerdahl E, Kryjevskaia M, Montplaisir L, Anderson E, Grosz N. Using assessments to investigate and compare the nature of learning in undergraduate science courses. CBE Life Sci Educ. 2013;12:239–249.
  • Momsen JL, Long TM, Wyse SA, Ebert-May D. Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE Life Sci Educ. 2010;9:435–440.
  • National Research Council. BIO2010: Transforming Undergraduate Education for Future Research Biologists. Washington, DC: National Academies Press; 2003.
  • Nehm RH. Understanding undergraduates' problem-solving processes. J Microbiol Biol Educ. 2010;11:119–122.
  • Nehm RH, Reilly L. Biology majors' knowledge and misconceptions of natural selection. BioScience. 2007;57:263–272.
  • Nehm RH, Ridgway J. What do experts and novices "see" in evolutionary problems? Evol Educ Outreach. 2011;4:666–679.
  • Newell A, Shaw JC, Simon HA. Elements of a theory of human problem solving. Psychol Rev. 1958;65:151–166.
  • Newell A, Simon HA. Human Problem Solving. Upper Saddle River, NJ: Prentice-Hall; 1972.
  • Patton MQ. Qualitative Evaluation and Research Methods. Thousand Oaks, CA: Sage; 1990.
  • Polya G. How to Solve It. Garden City, NY: Doubleday; 1957.
  • Pressley M, Borkowski JG, Schneider W. Cognitive strategies: good strategy users coordinate metacognition and knowledge. Ann Child Dev. 1987;4:89–129.
  • Pressley M, Goodchild F, Fleet J, Zajchowski R, Evans ED. The challenges of classroom strategy instruction. Elem Sch J. 1989;89:301–342.
  • Runco MA, Chand I. Cognition and creativity. Educ Psychol Rev. 1995;7:243–267.
  • Schraw G, Moshman D. Metacognitive theories. Educ Psychol Rev. 1995;7:351–371.
  • Singer SR, Nielsen NR, Schweingruber HA, Committee on the Status, Contributions, and Future Directions of Discipline-Based Education Research. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: National Academies Press; 2012.
  • Singh C, Haileselassie D. Developing problem-solving skills of students taking introductory physics via Web-based tutorials. J Coll Sci Teach. 2010;39:42–49.
  • Smith JI, Combs ED, Nagami PH, Alto VM, Goh HG, Gourdet MA, Hough CM, Nickell AE, Peer AG, Coley JD, et al. Development of the biology card sorting task to measure conceptual expertise in biology. CBE Life Sci Educ. 2013;12:628–644.
  • Smith MU. Successful and unsuccessful problem solving in classical genetic pedigrees. J Res Sci Teach. 1988;25:411–433.
  • Smith MU. Expertise and the organization of knowledge: unexpected differences among genetic counselors, faculty, and students on problem categorization tasks. J Res Sci Teach. 1992;29:179–205.
  • Smith MU, Good R. Problem solving and classical genetics: successful versus unsuccessful performance. J Res Sci Teach. 1984;21:895–912.
  • Stanger-Hall KF. Multiple-choice exams: an obstacle for higher-level thinking in introductory science classes. CBE Life Sci Educ. 2012;11:294–306.
  • White RT. Decisions and problems in research on metacognition. In: Fraser B, Tobin KG, editors. International Handbook of Science Education. Dordrecht, Netherlands: Springer; 1998. pp. 1207–1213.
  • Zheng AY, Lawhorn JK, Lumley T, Freeman S. Application of Bloom's taxonomy debunks the "MCAT myth." Science. 2008;319:414–415.
  • Zoller U. Are lecture and learning compatible? Maybe for LOCS: unlikely for HOCS. J Chem Educ. 1993;70:195–197.


  18. National 5 Biology

    A substantial proportion of the National 5 Biology exam will consist of problem solving questions. Problem solving questions in Biology involves using unfamiliar data and information and calculating or extracting answers. You need to be able to process, present, select and analyse data. Students often find these questions challenging so it is ...

  19. Biology Problem-Solving: The High Achiever Students

    This study aims to identify the problem-solving level of 16-year old high achievers in selected boarding school in the Southern and Central Regions of Malaysia. The problem-solving skills of 70 students were measured using a validated open-ended test, UKPM, which consists of general and topic-specific problem-solving questions for biology.

  20. AI Biology Problem Solver

    Smodin Biology AI Homework Solver has several benefits, including: Time-saving: Complete your assignments in a fraction of the time it would take you to solve them manually. Accuracy: Smodin Homework AI solver provides accurate and error-free solutions every time. 24/7 Availability: Access Smodin AI homework solver at any time, from anywhere.

  21. Step by Step: Biology Undergraduates' Problem-Solving Procedures during

    Asking a Question c: Asked a question about the problem stem or multiple-choice options. Checking a: ... One reason for this misalignment between faculty values and their teaching practice is that biology problem-solving procedures have not been clearly defined. Our research presents a categorization of problem-solving procedures that faculty ...

  22. Higher Biology

    Globular & Fibrous Proteins. Download File. Key Area 4 Cellular Differentiation. Download File. Key Area 5 Structure of the Genome.

  23. Using Systems and Systems Thinking to Unify Biology Education

    To translate the systems paradigm into concrete outcomes to support instruction and assessment in the classroom, we introduce the biology systems-thinking (BST) framework, which describes four levels of systems-thinking skills: 1) describing a system's structure and organization, 2) reasoning about relationships within the system, 3) reasoning ...