Competency Exam Final Report

July 2001

As part of our work to enhance the mathematical and quantitative components of the majors' sequence of General Biology courses at the University of Tennessee, we designed a series of competency exams to be given to students in the Biology majors' core courses (Biology 130: Biodiversity and Biology 140: Cell Biology) at the beginning of each semester and again at the end. Both the initial and follow-up exams consisted of multiple-choice questions and word problems, none of which required a calculator. The follow-up exams consisted of similar questions with different values from those on the initial exam.

One main objective of the exams was to provide a self-assessment for the students' use; the exam covered basic mathematical concepts that would be used in the course and provided information about available tutorial resources. Students were expected to score at least 90% on the exam and were advised to review any missed concepts if they did not. The exam also served to emphasize to students that math is not only a fundamental part of biology but would also be an important part of the course they were about to take. Another objective was for us to gain an idea of students' general familiarity with these concepts at the beginning of the course.

The exam was first administered to the Biology 140 students in the spring semester of 1999. This course was taught by three different professors, one of whom was a co-Primary Investigator on the grant (Dr. B. Mullin). Administering the exam both to students whose instructor was involved with this project and to students whose instructors were not also provided us with a potential comparison between course sections with different emphases. Dr. Mullin gave the exams during lecture and emphasized mathematical concepts with examples and exercises for students throughout the course. The other two course sections were given the exam by teaching assistants during their lab periods.

In the fall semester of 1999 the exams were administered to students in both the Biology 130 course (taught by Dr. S. Riechert, also a co-Primary Investigator) and to students in trailer sections of the Biology 140 course. All exams were administered by teaching assistants during lab periods. In the Biology 130 course, Dr. Riechert emphasized mathematical concepts, while the instructors of the Biology 140 course were not encouraged to do so.

The competency exams were administered a third time in the Biology 140 course during spring semester 2000. All exams were given by teaching assistants during laboratory periods, and none of the course instructors were involved with this project.

The subjects covered on the Biology 140 exams were: (Q1) exponents; (Q2) metric conversions; (Q3) mean and median for a small set of data; (Q4) calculating the molecular weight of a compound; (Q5) dilution; (Q6) concentration; (Q7) graph interpretation; (Q8) using reaction rate to calculate the amount of product produced; (Q9) approximate volume of a cylinder; (Q10) molarity; and (Q11) graph interpolation. The subjects covered on the Biology 130 exams were: (Q1) exponents; (Q2) concentration and metric conversions; (Q3) mean and median for a small data set; (Q4) half-life; (Q5) probability based on a sample; (Q6) mean, variance, and graph interpretation; (Q7) dilution; (Q8) rate of population increase/decrease; (Q9) volume and surface area; and (Q10) graph interpolation. Examples of the competency exams are attached. Results are presented below.

Biology 140: Spring 1999

The graph of all the scores shows a general increase in the scores between the initial exam and the follow-up exam. The mean score on the initial exam was 7.43 out of 11; on the follow-up exam the mean was 8.61 out of 11.

The most commonly missed questions were Q5, Q6, Q8, and Q9, but the percentage of students missing each question decreased considerably on the follow-up exam.

However, pooling the scores from all the students taking the exam confounds the results somewhat. Scores for the two groups (Dr. Mullin's class and the other sections) on the initial exam were compared with a t-test, and their means were not significantly different (p = 0.29). However, students in Dr. Mullin's class were exposed to mathematical concepts throughout the semester and were also told that they would be given extra credit based on their score on the second exam. The mean for the second exam in Dr. Mullin's class was considerably higher than the mean for the first exam, probably in part because students knew ahead of time that they would receive extra credit. For those of Dr. Mullin's students who took the exam both times, we compared the scores with a paired t-test. The mean increase in students' scores from the first exam to the second was 2.44 points, which was significant at α = 0.01 (t = 12.05, p < 0.0001). Although this could be an artifact of the students' knowledge that their scores would count as extra credit, it does indicate that emphasizing the importance of mathematical concepts may encourage students to review (and perhaps relearn) these ideas.
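The paired comparison described above can be sketched in a few lines. The scores below are hypothetical (the actual per-student data are not reproduced in this report); the sketch simply shows how the paired t statistic is computed from each student's change in score between the two exams.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic for per-student score changes.

    Returns (mean difference, t statistic, degrees of freedom);
    the p-value would then come from a t table with df = n - 1.
    """
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    d_bar = mean(diffs)
    se = stdev(diffs) / sqrt(n)  # standard error of the mean difference
    return d_bar, d_bar / se, n - 1

# Hypothetical initial and follow-up scores (out of 11) for six students
initial  = [7, 8, 6, 9, 7, 8]
followup = [9, 10, 8, 11, 10, 10]
d_bar, t, df = paired_t(initial, followup)
```

With real data one would normally use a statistical package (e.g., `scipy.stats.ttest_rel`), which also reports the p-value directly.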

Unfortunately, the second exam for the other sections was given to students at the end of their lab practical during finals week and was not administered to all students. Furthermore, these students did not receive extra credit for the exam. Some students in this group appeared to attempt only a few of the questions, and two exams were blank after the first question. As a result, the scores for the other sections' second exams are suspect, and comparing these scores either with the initial exam scores or with Dr. Mullin's second-exam scores would be meaningless.

Biology 130 and Biology 140: Fall 1999

The Biology 130 students and the students in the 140 trailer section were both given the same exam (i.e., the 140 students that semester took an exam based on more ecological/evolutionary concepts, although the mathematical concepts were essentially the same as those emphasized in the standard 140 competency exam). Although several of the TAs in the 130 labs allowed their students to use calculators on the initial exam, a two-tailed t-test indicated that there was no significant difference between the scores of sections in which students did or did not use calculators (p = 0.62). Students were not given extra credit for their scores on the follow-up exam. Scores actually dropped significantly (p = 0.03, one-tailed t-test) between the first and second exams; the mean for the first exam was 5.38 and the mean for the second exam was 5.09.

Results from the Biology 130 exams were rather puzzling at first. However, it turned out that several TAs had told their students that the tests did not matter and not to worry about their performance. This, combined with the lack of the incentive that extra credit might have provided, likely explains the students' poor performance on the follow-up exam. Even the scores of students who could be compared with a paired t-test dropped from the initial exam to the follow-up (p = 0.05).

Students in the Biology 140 course who took the same exam that semester also did better on the initial exam than on the follow-up (the means for the initial and follow-up exams were 6.16 and 5.51, respectively; p = 0.005, two-tailed t-test). Although the instructors in this course were not specifically incorporating mathematical concepts, one would have expected the scores not to differ significantly between the two exams.

Biology 140: Spring 2000

The last group of initial and follow-up competency exams was administered to students in the Biology 140 course during the spring semester of 2000. The mean scores were not significantly different between the initial and follow-up exams (mean for initial exam = 7.33, mean for follow-up exam = 7.34; p = 0.48, one-tailed t-test). A paired t-test of scores from students who had taken both exams was also not significant (p = 0.52). The higher mean scores for students in this course may have resulted in part from the fact that many of the students taking Bio 140 had taken Bio 130 the semester before, so the competency exam was somewhat familiar to them.

Biology 240 (Genetics): Spring 2000

We were then curious to see how students who had taken Dr. Mullin's Bio 140 course in the spring semester 1999 (where mathematical concepts were emphasized, but students were also given extra credit for the follow-up exam) would score on the exam that they had taken one year earlier. With the permission of the Genetics (Bio 240) instructor we administered the exams to students in this course during the spring semester 2000. Scores were considered only for those students (regardless of instructor) who had taken Bio 140 the previous spring, so that the scores from students who had not been exposed to the test would not confound the results.

Twenty-two students in the Genetics course had taken Bio 140 in the spring of 1999. Of these, 11 had been in Dr. Mullin's class and 11 had been in a class taught by a different instructor. The mean score for Dr. Mullin's former students was 8.55 out of 10, while the mean for students from the other sections was 6.73 out of 10 (p = 0.02, one-tailed t-test). These results suggest that although the conclusions from the Spring 1999 analyses were likely affected by the extra credit given in Dr. Mullin's class, these students did appear to have retained the math they learned during that course.
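The two-group comparison above can be illustrated with a pooled (Student's) two-sample t statistic. The scores below are hypothetical, not the actual Genetics-class data; they are only meant to demonstrate the computation.

```python
from math import sqrt
from statistics import mean, variance

def two_sample_t(x, y):
    """Pooled (Student's) two-sample t statistic, assuming equal variances.

    Returns (t statistic, degrees of freedom); a one-tailed p-value
    would come from a t table with df = nx + ny - 2.
    """
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    t = (mean(x) - mean(y)) / sqrt(pooled_var * (1 / nx + 1 / ny))
    return t, nx + ny - 2

# Hypothetical exam scores (out of 10) for two groups of former students
emphasized = [9, 8, 10, 8, 9]   # section where math was emphasized
other      = [7, 6, 8, 7, 6]    # other sections
t, df = two_sample_t(emphasized, other)
```

A statistical package (e.g., `scipy.stats.ttest_ind`) would compute the same statistic along with its p-value.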

Conclusions

Several complicating factors affected our ability to assess the effect on students' math comprehension of instructors emphasizing mathematical concepts in biology. This difficulty seems to be primarily a shortcoming of test administration. In the spring of 1999, students may have "crammed" the night before the exam, knowing they would be getting extra credit, and the consequent increase in scores may not permit the conclusion that students' mathematical understanding had increased. However, the follow-up exam given to students in the Genetics class suggests that students who had taken the course in which mathematical concepts were emphasized did retain a better understanding of mathematics than their counterparts whose instructors had not emphasized math.

In the fall of 1999, students were not given extra credit for the follow-up exam. All other things being equal, their poorer scores on this exam might have suggested that the instructor's emphasis on mathematics had no (or even a negative) effect on students' grasp of these concepts. However, this conclusion is not appropriate: numerous teaching assistants (at least one-third) reported that they had discouraged students from taking the exam seriously, and some reported overhearing students say that they would try to do poorly because the exam seemed pointless.

In the spring of 2000, students were again not given extra credit. However, this time we asked the TAs to emphasize the importance of the exam to their students. The instructors in these courses had not emphasized mathematical concepts, and we therefore expected the difference in mean scores between the two exams to be negligible, which it was.

These results demonstrate that a number of factors influence students' performance on these competency exams. The answer to our primary question, whether emphasis (examples and exercises, in particular) by the instructor in lecture increased students' mathematical literacy, was somewhat obscured by the manner in which the exam was administered. Students receiving extra credit had higher scores than those who did not; however, these students also performed very well on the exam one year later. It is unlikely that "cramming" for extra credit on the earlier exam would account for this, and therefore the conclusion that emphasis by the instructor has a positive effect on students' quantitative skills is supported. Perhaps most importantly, it is clear from our results that the attitude of the person directly giving the exam (in this case the lab TAs) can significantly influence students' performance on the test.

While interpretation of our overall results was complicated by aspects of the test administration, they indicate that emphasizing mathematical concepts in the Biology core courses can positively affect students' math comprehension. Our results also lead to some simple recommendations for future testing of this nature. First, students need to be motivated to perform to the best of their ability on the exams by some means other than extra credit. Emphasizing to them the rationale behind the testing, as well as how the results will be used, may encourage them to do their best. Second, individuals administering the test must also understand the rationale of the testing, and be encouraged to communicate its importance to their students.