At the end of the semester, as part of the instructor evaluation process, students completed a short questionnaire that addressed their level of agreement with respect to having learned about scientific literacy, literature type, and the peer-review process (see Table 5). This questionnaire was completed anonymously, and thus no identifiers could be attached to the individual responses.
Did It Work?
Statistically significant differences were observed with respect to both formatting (p = 0.001) and understanding the peer-review process (p = 0.002) (Figures 1 and 2). Post hoc analyses showed no significant variation in the means between either the pre-test and the first post-test, or between the two post-tests; the significant difference lay specifically between the Pre-test and Post-test 2. In other words, student scores improved only slightly, and not significantly, immediately after the assignment was completed. However, scores continued to improve throughout the semester, culminating in a statistically significant improvement in understanding after students had applied this knowledge to a project.
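For readers who wish to run this kind of pre/post comparison on their own course data, a minimal sketch is given below. It assumes per-student total scores stored as Python lists and uses paired t-tests with a Bonferroni adjustment; the data, variable names, and choice of test are illustrative assumptions, not the analysis actually performed in this study.

```python
# Illustrative sketch: pairwise paired comparisons of pre/post test scores.
# The scores below are placeholders, not the data reported in this study.
from itertools import combinations
from scipy import stats

scores = {
    "Pre":    [1, 2, 1, 3, 2, 1, 2, 2],   # hypothetical per-student totals
    "Post 1": [2, 2, 1, 3, 2, 2, 2, 3],
    "Post 2": [3, 3, 2, 3, 2, 3, 3, 3],
}

pairs = list(combinations(scores, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-adjusted threshold for three comparisons

for a, b in pairs:
    t, p = stats.ttest_rel(scores[a], scores[b])   # paired t-test
    flag = "significant" if p < alpha else "not significant"
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f} ({flag})")
```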
With respect to scientific literacy, student scores did not increase significantly. The questions posed addressed specific features of scientific literacy that were not explicitly reinforced in the paper component of the assignment, which could explain the lack of a significant score increase in this section.
There was a significant increase in student confidence in all three categories (p < 0.0001). In each case, both post-test results were significantly higher than the pre-test results, but there was no statistically significant increase in student confidence between Post-tests 1 and 2. This indicates that student confidence, and students' perception of their knowledge gain, increased after the assignment without a concurrent statistically significant increase in measured knowledge. Though this seems counterintuitive, this apparent contradiction has been observed before: students can become more engaged in the scientific process without necessarily being able to articulate the associated knowledge (i.e., they feel they know how to do it, but cannot use the appropriate vocabulary) (Salter & Atkins, 2014). Thus, procedural knowledge (how they do it) precedes declarative knowledge (what it means).
Across the board, students showed an overwhelming response of strongly agree/agree for each of these three categories. Students felt they had a better understanding of the issue of scientific literacy, knew the difference between primary and secondary literature, and understood the process of peer review (Table 5).
Table 4. Student confidence level questions with respect to the three objectives. The students used a ranking system from 1 (no confidence) to 5 (full confidence).
How confident do you feel about being able to identify a correctly formatted reference?
How confident do you feel about understanding the concept of scientific literacy and its connection to source type?
How confident do you feel about how the peer-review process works?
Table 5. The questions posed at the end of the
semester. The students chose from the following
categories: Strongly Agree, Agree, Disagree, Strongly
Disagree, and Not Applicable.
I now have a better understanding of the issue of scientific literacy.
I now know the difference between primary and secondary literature.
I now understand the process of peer review.
Figure 1. Average and standard error for total test scores for
Formatting (F) and Scientific Literacy (SL) in pre-tests (Pre),
Post-test 1 (Post 1), and Post-test 2 (Post 2). Formatting
maximum attainable score was 3. Asterisks indicate the paired
tests that showed statistically significant differences.
Figure 2. Average and standard error for total test scores for Peer-Review (PR) in pre-tests (Pre), Post-test 1 (Post 1), and Post-test 2 (Post 2). Peer-review maximum attainable score was 2. Asterisks indicate the paired tests that showed statistically significant differences.