Leaving the Nest: The Evolution of CHRPP (the Course in Human Research Participant Protection) | Quitter le nid : l’évolution du cours d’éthique sur la protection des participants humains

Four years ago, Queen's University launched an online tutorial called CHRPP, the Course in Human Research Participant Protection, and published a paper about its purpose, design, and usability (Balkwill, Stevenson, Stockley, & Marlin, 2009). CHRPP was originally created to raise awareness among research students of the federal policy regarding research ethics and to encourage ethical research practices. Self-assessments and interactive activities were built into the tutorial to help achieve these goals. Since its first publication, CHRPP has been updated based on feedback from a user satisfaction survey. The generally positive reception of this innovative tutorial led to its serving as the basis of a new national research ethics tutorial hosted by the Government of Canada's Panel on Research Ethics. This paper summarizes the evolution of CHRPP from a homegrown solution for Queen's University to an essential piece of Canada's national research ethics education program.
 
En 2008, l’Université Queen’s a lancé un tutoriel en ligne nommé CHRPP (Course in Human Research Participant Protection, cours sur la protection des participants humains à la recherche) et publié un article sur son objectif, sa conception et sa convivialité dans Balkwill, Stevenson, Stockley et Marlin (2009). Le CHRPP a été créé pour sensibiliser les étudiants qui font de la recherche à la politique fédérale relative à l’éthique de la recherche et pour favoriser les pratiques éthiques de recherche. Des autoévaluations et des activités interactives ont été intégrées au tutoriel pour nous aider à atteindre nos objectifs. Depuis sa première publication, le CHRPP a été mis à jour en se basant sur la rétroaction tirée d’une enquête sur la satisfaction des utilisateurs. La réception généralement positive qu’a reçue ce tutoriel innovateur lui a valu de servir de base pour un nouveau tutoriel national en éthique de la recherche qu’héberge le Groupe consultatif en éthique de la recherche du gouvernement du Canada. Cet article résume l’évolution du CHRPP qui, d’une solution maison pour un établissement, est devenu une partie essentielle du programme national canadien d’éducation en éthique de la recherche.


Introduction
Researchers are required to have a basic understanding of research ethics as it applies to their work (for the purposes of this article, we are primarily concerned with the ethical conduct of research involving human participants). The extent to which researchers receive training in research ethics varies across disciplines, institutions, and national funding bodies (e.g., Braunschweiger, 2010; Dubois, Schilling, Heitman, Steneck, & Kon, 2010; Löfström, 2012). In the mid-2000s, our team at Queen's University evaluated a number of different off-the-shelf products for research ethics education. Finding nothing that was both user-friendly and relevant to the Canadian research ethics environment, we decided to create our own online tutorial to address the specific needs of our institution.
The Course in Human Research Participant Protection (CHRPP) was developed by a team of experts in research ethics, e-learning, and pedagogy. As described in a preceding Journal of Online Learning and Teaching article, Hatching CHRPP: Developing an e-learning tutorial for research ethics, the purpose of the tutorial was "to ensure all graduate students involved in research with human participants were aware of the national standard of research ethics" (Balkwill, Stevenson, Stockley, & Marlin, 2009, ¶ 1) set out in the Tri-Council Policy Statement (TCPS). A description of the intended design specifications and the process of creating CHRPP is outlined below. This research was authorized by the General Research Ethics Board at Queen's University.

Canadian Research Ethics Regulatory Context
Research ethics guidance at the federal level in Canada is provided by the Panel on Research Ethics (PRE) and the Secretariat on Responsible Conduct of Research (SRCR). The policy that details this guidance is the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans, second edition (TCPS 2). The TCPS 2 is a joint policy of the three federal research funding agencies (the Natural Sciences and Engineering Research Council, the Social Sciences and Humanities Research Council, and the Canadian Institutes of Health Research). Originally launched in 1998 and updated in 2010, it is similar in spirit to the Belmont Report and the Declaration of Helsinki. The policy provides guidance to institutions, researchers, and research ethics board (REB) members on the design, review, and conduct of ethically acceptable research. It addresses key issues such as the evaluation of risks and benefits to participants, the elements of informed consent, the safeguarding of privacy and confidentiality, the management of conflicts of interest, and the consideration of the needs of people in vulnerable circumstances. Adopted by Canadian research institutions that are eligible for Agency funding, TCPS 2 applies to all research involving human participants, regardless of discipline.

The CHRPP Approach: Design Specification of the Online Ethics Course
The original TCPS, on which CHRPP was based, was available to researchers and REB members as a hard-copy book or an online PDF document. The PRE website offered a basic online tutorial that was primarily text-driven, with no interactive elements. However, a needs assessment previously conducted at the institution revealed that student and faculty researchers found the PRE website tutorial uninteresting and overly simplistic, and that they lacked practical knowledge of the TCPS and of how to apply its guidance to research design. CHRPP was thus developed to be a more interesting, authentic, and effective learning experience for students, staff, and faculty. Intended as a practical guide to the ethical considerations outlined in the TCPS and to applying the TCPS and institution-specific policies to research design and conduct, CHRPP took an applied approach, offering practical examples of ethical issues in a wide variety of disciplines and providing users with opportunities to develop a more in-depth understanding of how to incorporate ethical considerations into research design.
In the design specifications, CHRPP users who completed the tutorial were to: (Goal 1) have an improved appreciation for the value of an ethical approach; (Goal 2) have improved judgement skills in regard to ethical issues; (Goal 3) know the TCPS guidelines and be able to apply them; (Goal 4) have a better appreciation for the rights of participants; (Goal 5) have a better understanding of their role in protecting research subjects; (Goal 6) understand the ethics process at Queen's University; and (Goal 7) have an appreciation of emerging ethical issues.
CHRPP was launched at Queen's University in 2008 with eight modules: (1) Why Ethics, (2) Defining Research, (3) Assessing Risk and Benefits, (4) Informed Consent, (5) Privacy and Confidentiality, (6) Vulnerable Populations, (7) Conflict of Interest, and (8) REB Review. In 2009, four other institutions licensed modules 1-7 of CHRPP to make it accessible to their students and staff, with module 8 remaining Queen's-specific in its reflection of our review process and contacts. See Figures 1 and 2 for screenshots illustrating the look and feel of the modules. For a full description of the case study regarding the development of this tutorial, the design specifications, and the usability results, see Balkwill, Stevenson, Stockley, and Marlin (2009). After its initial development and launch, CHRPP was adopted for use by researchers and research ethics board (REB) members at several other institutions. In this paper, we present the results of a mixed-method user satisfaction survey voluntarily completed by a sample of CHRPP users across institutions.
To explore the effectiveness of CHRPP's modules and features for multiple users, the evaluation of CHRPP asked: 1. How useful was the tutorial (CHRPP) and its specific modules? 2. How effective were the specific online design features? 3. How effective was the tutorial in meeting the learning goal of improving users' understanding of research ethics policy and how to apply it to their research? 4. How effective is the tutorial interface for multiple users in different disciplines and roles? 5. To what extent were particular features predictive of user ratings of the tutorial overall?

Methods
After completing CHRPP, users were invited to complete an online user satisfaction survey. The Likert-type rating items and a numerical estimate of time to completion were grouped into four thematic categories (number of items in parentheses):
1. Ratings of usefulness, overall and for each of the eight tutorial modules (9)
2. Ease of use: structure, navigation, duration, enjoyability (5)
3. Perceived effect of pedagogical elements on learning (5)
4. Perceived impact of CHRPP on research ethics knowledge and application (6)
Each of the Likert-type scales consisted of seven response options ranging from most critical (e.g., strongly disagree; not useful at all) to most favourable (e.g., strongly agree; essential). The data in the free-response field were coded (no comment, positive, critical) for analyses of possible response bias; sample comments relevant to each analysis are included in the results section. Responses to this survey were collected from the time of CHRPP's launch in June 2008 until June 2011. At the time of data collection, there were 341 respondents from at least four institutions (with some participants' institutions not specified). Of these, 338 completed the survey and were included in our analyses. As three of the four specified universities did not begin using CHRPP until 2009, 86% of respondents were from the first institution, Queen's University. The number of responses for each question varied because respondents were permitted to skip questions; the sample sizes for each analysis are reported separately. All responses were anonymous, as no identifying information, including age or gender, was collected.
Survey items asked users to evaluate specific features of CHRPP, the usefulness of each module, and the perceived learning outcomes of the tutorial using Likert-type scales and free-response fields. Descriptive and inferential statistics were computed for the quantitative items, followed by a discussion of how the results relate to the stated goals. The free-response items were thematically coded and summarized, with illustrative quotes where relevant.

Usefulness of CHRPP and Its Modules
Overall, CHRPP was considered mostly useful by users (mean = 5.78), with all eight modules rated similarly. Table 1 shows the mean usefulness rating of each module and of the tutorial overall. Average ratings of the eight modules all fell between mostly useful (5) and useful (6), with no significant differences between modules, as the means fell within one standard deviation of each other; in addition, the modal rating for all modules, and overall, was 6 (Useful), with 82% to 92% of responses rating the modules 5 or higher. (Response scale: Not useful at all (1) to Essential (7), with intermediate labels including Useful only in some aspects (3), Undecided (4), Mostly useful (5), and Useful (6).)
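The module-level summaries reported above (mean, mode, and the share of favourable ratings) are straightforward to compute; the following minimal sketch uses hypothetical ratings on the seven-point usefulness scale, not the actual survey data:

```python
import numpy as np
from collections import Counter

# Hypothetical ratings for one module on the 7-point usefulness scale
# (1 = Not useful at all ... 7 = Essential); illustrative values only.
ratings = np.array([6, 5, 6, 7, 6, 5, 4, 6, 7, 5, 6, 6, 3, 6, 5])

mean_rating = ratings.mean()
mode_rating = Counter(ratings.tolist()).most_common(1)[0][0]
pct_favourable = (ratings >= 5).mean() * 100  # share rating "Mostly useful" or higher

print(f"mean = {mean_rating:.2f}, mode = {mode_rating}, favourable = {pct_favourable:.0f}%")
```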
The final module, Module 8, was distinct in its focus on how to apply for institutional REB approval, rather than on research ethics principles or considerations, and was the only module specifically mentioned in Queen's University users' comments. The national and institutional policy compliance information provided by CHRPP about this process was particularly appreciated. Users commented regarding Module 8 that: "For Module 8, when critiquing the ethics application, it would be helpful to be able to access REB requirements for each section (e.g., consent form, letter of information)." "I wish that there was more coverage of the process for the ethics application (i.e., more materials or details in module 8)." "I found section 8 to be helpful in preparing for my ethics submission." "The last section where we had to rate the ethics form was a bit overwhelming, but was helpful in getting me to think critically about the entire process." Comments about the overall usefulness of the tutorial tended to focus on the national and institutional policy compliance information provided by CHRPP, including: "This tutorial was very useful to fully understand the details of the ethics review process." "Found it to be too repetitive (sic) of the TCPS tutorial so very little was new." "Overall I found the CHRPP very useful particularly all of the information about the Tri-Council Policy Statement." Several respondents commented that they found the information in CHRPP too basic for their own experience but saw the value of it as an introductory course for those new to research ethics. For example: "I think this would be useful for first year master or doctoral students. Individuals who have submitted applications before may not gain as much information and could find parts of the course as review of concepts they are already familiar with. Overall is seems worthwhile to complete though."

Effectiveness of CHRPP's Online Design Features
Ratings of Pedagogical Elements. Users generally agreed (see Table 2) that the pedagogical design features of the tutorial were effective: the module objectives were clear, the examples given were easily connected to their own area of research, and the interactive elements helped them to better understand the material. The quizzes challenged them to think about the module content, and the feedback on the quizzes was a useful way to check and consolidate their learning. (Response scale: Strongly disagree (1) to Strongly agree (7), with labels including Undecided (4), Somewhat agree (5), and Agree (6).)
Qualitative feedback from users responding to an open-ended comment question similarly revealed their appreciation for several features of CHRPP, including its ease of navigation, quizzes, examples, and its use of media and historical accounts of real-life research ethics incidents.
"Overall the CHRPP is very easy to navigate and is well-designed." The examples and media clips particularly served to raise awareness of the relevance and importance of ethical considerations. Learning about ethical requirements in the context of real scenarios grounded CHRPP's message of ethics in real experiences. As noted by one user: "Having the excerpts from CBC news around certain scenarios helps to make one aware of violations of the past, and the relevance of General Ethics Board Reviews in our contemporary time. Clearly though some violations of privacy have created useful research, this is not worth the violations to human dignity." However, there may be technological limitations to consider, as for at least one user, "The videos took a long time to load!" Quizzes at the end of each module required users to apply the knowledge gained from that module by selecting the best of several feasible responses to each question. Explanations of why a response was, or was not, the best choice appeared after each selection. Users were encouraged to continue selecting responses and reading feedback until they found the best answer(s). The ratings of the three items about the module quizzes indicated that most respondents somewhat to strongly agreed (>90%) that they were effective learning tools. Comments about this quiz format included: "Some of the quiz options were difficult to choose between however, the captions associated with the modules made it somewhat more clear why one option was better than another." "Bravo, very well done online course. I loved the multifaceted explanations for each quiz answers. Supportive interactive elements were reliable and of high quality." "The quiz questions are probably too easy much of the time." Overall, user responses indicated that the specific design features highlighted in the user satisfaction survey were considered effective.
One respondent highlighted the features and benefits of an online course that is interactive, clearly formatted, and provides real-world examples for other mandatory policy training topics, such as Workplace Hazardous Materials Information System (WHMIS) training: "I like the interactive format of this course a lot. Materials are clearly presented and good examples were used at the course. I hope to see similar courses (e.g. WHMIS) to be presented in a similar style in the future!"

Amount of time and material.
We also asked users to indicate how much time they spent doing the tutorial, whether the amount of content was appropriate, and whether it was easy to navigate. The self-reported average time to complete the tutorial was 3.81 hours, with the majority (255 out of 338) reporting two to five hours and 90% within one to six hours. There was considerable variance in responses to this question, likely due to users' ability to choose how many features (media, external links, example and exercise options) of the tutorial to access. Responding on a Likert-type scale of strongly disagree (1) to strongly agree (7), 90% of respondents agreed (somewhat to strongly) that the amount of content in CHRPP was appropriate (M = 5.80, SD = 0.93), while some comments revealed that some trimming, or shortcuts for customizing material, may be beneficial in the future: "I liked how I could complete the modules at my own pace." "The materials are too long. A more compact explanation and highlights of important points would be more helpful." "The modules were a little long but I understand that the information was necessary." "At times it was quite lengthy as were some of the examples, but overall the materials were appropriate." "To be honest, it was a little bit too long."

Ease of navigation.
On the same scale, over 90% of respondents agreed (somewhat to strongly) that it was easy to navigate CHRPP (M = 6.18, SD = 1.06), with comments such as "Overall the CHRPP is very easy to navigate and is well-designed." However, comments also highlighted technical issues associated with navigation, such as reports of broken hyperlinks (n = 10), missing buttons, and inconsistencies in display (e.g., "There were times that I felt the navigation was somewhat inconsistent (new pop-up window)...."). Participants also expressed the desire for a progress bar or indicator (n = 5): "You might consider putting up a bar along the top showing you your progress so you know how much more you have to go."

Meeting CHRPP's Learning Goals
After completing the tutorial, users found CHRPP to be of practical value to their research activities as well as to the research ethics review process. Mean ratings of the six statements related to CHRPP's learning outcomes indicate that survey respondents generally agreed they felt more capable and confident about their knowledge of research ethics, and their ability to apply it to their work, after completing the tutorial (see Table 3; e.g., "The information provided by the CHRPP is a useful resource in planning for my ethics application").
Over 90% of respondents agreed (somewhat to strongly) that completing CHRPP had given them more knowledge about how to protect the welfare of participants, how to plan an ethics application and successfully meet ethics guidelines, and how to assess research-attributable risks to participants. Furthermore, these respondents also felt more capable of advising colleagues about the application of research ethics. The majority of comments about these aspects of the tutorial were positive: "Overall the CHRPP was a great tool to expand my knowledge throughout the research procedure." "I am really glad I had this learning and exposure before going into my own research design and GREB ethics application submission." "This is a long process that is only useful if you have a complicated study planned." "This was very helpful in terms of constructing my ethics proposal and conducting my research."

Addressing the Needs of Diverse CHRPP Users
Diverse research areas.
In terms of research area, 47.4% of respondents (n = 340) hailed from the health sciences, 37.4% from the social sciences, 12.4% from the humanities, and 2.9% from computer science, engineering, and professional programs (such as Education and Urban Planning). Two pie charts provide a breakdown of user roles (Figure 4) and of research areas (Figure 3), with rounded values. Overall ratings of the tutorial did not differ by research area (see Table 2). CHRPP users across these three disciplinary areas also reported no difference in the usefulness of each module (χ²s ≤ 14.711, ps ≥ .258) and rated specific features equivalently (χ²s ≤ 14.098, ps ≥ .295), except that more users in the social sciences than expected by chance agreed that the quizzes made them think carefully (χ²(12) = 23.10, p = .027, z-critical = 2.157, standard residual = 2.1). Ratings of feeling more confident and capable did not differ by research area (χ²s ≤ 18.533, ps ≥ .100). Due to the small number of users in "other" disciplines, chi-square analysis could not be conducted on their responses with reasonable validity, as too many cells contained expected counts of zero (Conover, 1980).

Diverse roles.
Of the 338 CHRPP users who indicated their role, 76% were graduate students. The remaining 24% consisted of undergraduate students (9.5%), research associates and assistants (5.9%), faculty and adjunct faculty members (3.3%), post-doctoral fellows (1.8%), and others (3.6%), as shown in Figure 4 with rounded values. In terms of experience with the research ethics review process, 34% had had a research proposal approved by an REB, and 66% had either been unsuccessful in submitting or had not yet been through this process (n = 326). Further detailed comparisons of module and feature ratings across these roles were not possible because of the small samples in most role categories other than graduate students (e.g., faculty, post-doctoral fellows): cross-tabulations between roles and ratings produced several cells with expected counts of zero, below the minimum required for a chi-square analysis (Conover, 1980). Even after removing groups of fewer than 15 users whose expected counts included 0 or .1, such as faculty, others, and post-doctoral fellows, expected counts remained similarly low, with any significant finding attributable to one to four individuals.
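The expected-count constraint on chi-square analyses described above can be checked directly: scipy's chi2_contingency returns the table of expected frequencies alongside the test statistic. The sketch below uses hypothetical counts (not the survey data) to show how a tiny role category, such as faculty, produces expected counts too low for a valid test:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation: rows = user roles, columns = rating bins.
# The faculty row is deliberately tiny, mimicking the small role categories in the survey.
observed = np.array([
    [5, 120, 131],  # graduate students
    [1,  14,  17],  # undergraduate students
    [0,   2,   1],  # faculty
])

chi2, p, dof, expected = chi2_contingency(observed)

# Common rule of thumb: the chi-square approximation is unreliable when
# expected cell counts are very small (here, the entire faculty row).
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
print(f"smallest expected count = {expected.min():.2f}")
if expected.min() < 5:
    print("Warning: expected counts too low; collapse categories or drop small groups.")
```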

Prior Research Ethics Approval Success
We also examined differences in ratings of tutorial and module usefulness between users who reported success in receiving REB approval for a project ("yes") and those responding "no," which might indicate either no experience with REB review or no success. Those reporting success in their ethics applications generally did not differ from users who said "no" in their overall ratings (F(5) = 3.056, p = .079) or in their ratings of features (χ²s ≤ 9.305, ps ≥ .097; quizzes yielded a significant chi-square but a non-significant post-hoc analysis, χ²(6) = 13.705, p = .033, z-critical = 2.086, standard residual = 2.0). While there was also no difference across most modules (χ²s ≤ 11.094, ps ≥ .086), four successful applicants rated Module 5 (χ²(6) = 12.79, p = .046, z-critical = 2.086, standard residual = 2.3) and Module 6 (χ²(6) = 14.82, p = .022, z-critical = 2.086, standard residual = 2.3) as not at all useful, more often than expected by chance. Perhaps these modules are easier for those with experience.
The possibility of a ceiling effect in learning for those already successful with REB applications was somewhat evident in the ratings of two learning outcomes, with no difference for the other learning goals (see list in Table 3). Users who had been successful with an REB application were less likely to strongly agree, and were more often undecided, about the statement "I know more now about protecting research participants (human subjects) than I did before taking the course" (strongly agree: 17 observed, 29.1 expected, z-critical = 2.086, standard residual = -2.2; undecided: 6 observed, 2.4 expected, z-critical = 2.086, standard residual = 2.4; χ²(6) = 25.520, p < .001). Successful REB applicants were also less likely than expected by chance to strongly agree with "The information provided by CHRPP is a useful resource in planning for my ethics application" (strongly agree: 24 observed, 37.1 expected, z-critical = 2.086, standard residual = -2.2), though they were proportionally more likely to somewhat agree, while non-successful applicants were less likely to somewhat agree (successful: 25 observed, 14 expected, z-critical = 2.086, standard residual = 3.0; not successful: 16 observed, 27 expected, z-critical = 2.086, standard residual = -2.1; χ²(6) = 26.915, p < .001).
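The post-hoc comparisons above flag cells whose standardized residuals exceed a critical z value. A minimal sketch of that computation, with hypothetical counts rather than the survey data:

```python
import numpy as np

def standardized_residuals(observed, expected):
    """Cell-wise standardized residuals: (O - E) / sqrt(E)."""
    return (observed - expected) / np.sqrt(expected)

# Hypothetical 2x3 cross-tabulation: rows = prior REB success (yes/no),
# columns = agreement level (strongly agree / somewhat agree / undecided).
observed = np.array([[17.0, 40.0, 6.0],
                     [95.0, 60.0, 4.0]])
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals * col_totals / observed.sum()

resid = standardized_residuals(observed, expected)
z_critical = 2.086  # the critical value used in the paper's post-hoc comparisons
flagged = np.abs(resid) > z_critical
print(np.round(resid, 1))
print("cells driving the effect:", [tuple(map(int, c)) for c in zip(*np.where(flagged))])
```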
Users' open-ended comments echoed these findings, suggesting that while CHRPP was generally valuable for all users, it was more informative for new researchers than for experienced researchers: "This course also aids in the completion of a General Research Ethic Board application, simplifying it for first timers!" "Information is basic and good for those individuals starting out." "I appreciated the introduction to ethics provided by this course." "I am a PhD student and therefore have already completed a full semester ethics course, and have already completed an ethics application for my Masters. Therefore, much of this information was a review for me. However, I can see it being useful to newer graduate students who may not have been exposed to all of this information in the past" "Some of the elements described were very basic and elementary; however, may be necessary as most students may not have the background in grant or research proposals." "I think this would be useful for first year master or doctoral students. Individuals who have submitted applications before may not gain as much information and could find parts of the course as review of concepts they are already familiar with. Overall is seems worthwhile to complete though." However, even with prior knowledge there was still value in completing CHRPP: "In my undergraduate and social worker master's degree I was required to do a total of 3 research courses, so I found this material to be a review for me, but it was a very good review and there was a lot of good information. I think this should be essential learning." "I have a fairly good knowledge and experience in ethics, but this tutorial was an excellent review." "I found this to be a good review, although I have taken other similar courses before from NIH. This course does provide a Canadian context that is useful." "In my current job I actively work with the REB and submission of many proposals, although I am not a PI. I have been active in interpretation and use of the current (and new) TCP statement as well as the provincial privacy legislation. In this regard most of the material was not new but there were a few parts that made me think and back track."

Relative Impact of Specific Features or Modules on the Usefulness of CHRPP
With multiple specific features and modules potentially contributing to users' overall experience with CHRPP, we wanted to see whether particular aspects of the tutorial were related to user ratings of the tutorial's usefulness. First, we examined whether user ratings of particular modules predicted ratings of the overall usefulness of CHRPP using a multiple regression. We found a significant relationship explaining almost 60% of the variation (R² = .592), with higher ratings of modules 3, 4, and 8 predicting higher ratings of CHRPP overall, as shown in Table 4. A similar regression with CHRPP's features and ease-of-use items as predictors found a significant relationship explaining about 50% of the variation in overall usefulness (R² = .517). Higher ratings of clear objectives, challenging quizzes, appropriate amount of content, easy navigation, and enjoyable interactive elements predicted higher ratings of CHRPP overall, as shown in Table 5.
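The regressions above estimate how much variance in overall usefulness is explained by the predictor ratings (R²). A minimal sketch of this approach with simulated data follows; the coefficients and simulated ratings are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

# Simulated data: three module ratings (predictors) and an overall usefulness
# rating (outcome); coefficients are illustrative, not the paper's estimates.
rng = np.random.default_rng(0)
n = 200
modules = rng.integers(1, 8, size=(n, 3)).astype(float)  # 7-point ratings
overall = (0.30 * modules[:, 0] + 0.25 * modules[:, 1] + 0.20 * modules[:, 2]
           + rng.normal(0, 0.8, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), modules])
beta, *_ = np.linalg.lstsq(X, overall, rcond=None)
pred = X @ beta

# R^2: proportion of variance in the outcome explained by the predictors.
ss_res = np.sum((overall - pred) ** 2)
ss_tot = np.sum((overall - overall.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```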

Discussion
Overall satisfaction with CHRPP appeared to be significantly related to specific modules and features of this online tutorial. The qualitative comments suggested possible reasons, including the usefulness of media clips for illustrating the importance of ethics in the real world and within research studies, as well as the interactive examples, exercises and quizzes which challenged users to apply newly acquired knowledge. Individual features including objectives, quizzes, content volume, interactive elements and easy navigation were predictive of overall course usefulness ratings. Interestingly, specific modules, rather than all modules, were predictive of overall course ratings, suggesting potential foci for further development or perhaps indicating to developers that not all modules were equally important for overall experience.
Taken together, these findings provide evidence that particular features, and modules 3, 4, and 8, were particularly relevant to users' satisfaction with CHRPP. The tutorial appeared to address the needs of users across diverse backgrounds, including research area and prior success with REB applications. Evidence for consistency across roles was inconclusive because the small numbers of non-graduate-student participants produced zero expected counts in some of the cross-tabulation cells used for chi-square calculations, and because most participants rated modules and features quite highly.

The Evolution of CHRPP
CHRPP underwent several improvements inspired by user comments in the user satisfaction survey. By this time it had also come to the attention of PRE and was evaluated as a possible replacement for the existing TCPS tutorial by the Secretariat on Research Ethics (SRE, now known as SRCR). Upon their recommendation, PRE decided to adopt CHRPP as the basis for its new online tutorial, which would be based on the extensive update of their policy (TCPS 2, launched in December 2010).
Queen's University agreed to license the CHRPP infrastructure so that SRCR and PRE could begin work on an updated tutorial that would eventually be known as TCPS 2: CORE (Course on Research Ethics). The French version of the tutorial would be called EPTC 2: FER (Formation en éthique de la recherche).
The development of CORE followed a pathway similar to that of CHRPP. The development team, which included Balkwill, sought to retain the strengths reflected in the respondents' comments about CHRPP. There were still eight modules, but some were renamed and others were reconceptualised to reflect the perspectives of TCPS 2 (see Figure 3).
Why Ethics became Core Principles. Vulnerable Populations became Fairness and Equity. Module 8 was transformed from a tour of the Queen's University research ethics board process to a more general guide to REB review. Instead of separate paths for a general REB and a health sciences REB, one path was written for researchers and the other path was written for REB members.
CORE retained most of the audio and video material featured in CHRPP. To meet accessibility requirements, each of these excerpts was made available as a downloadable printed transcript in French and English. All new material followed the same ICE pedagogical model used for CHRPP, which focuses on teaching ideas, connections, and extensions (Fostaty Young & Wilson, 2000). After an extensive development and vetting process, including beta and usability testing, CORE/FER was launched in June 2011. See Figure 5 for a screenshot of CORE (Module 1, Concern for Welfare).
Within two weeks of the launch, over 2000 users had created accounts. Four months after launch, CORE/FER had served over 16,000 users. As of this writing, over 50,000 users have completed this tutorial. More than 60 institutions have made the tutorial a requirement (e.g. for course credit, to receive participant pool access, to be eligible for REB review, etc.) and have requested institutional access accounts to track user completion. Thanks to the feedback of initial users summarized in this article and ongoing development, CHRPP, now CORE, has not only left the nest, it is a soaring success.

Benefits of Online Tutorials for Research Ethics
The obvious benefits of an online tutorial for research ethics are that it is available to anyone with internet access, it is self-paced, and it can be used on its own or as a component of a course, a training program, or a research ethics curriculum. By including interactive elements, examples from a wide variety of disciplines, and media excerpts highlighting how research affects real people, CORE/FER engages users in the decision-making process of researchers and REB members. The first phase of this tutorial addresses TCPS 2 guidance that applies to all research, regardless of discipline. Phase two will consist of new modules addressing specific research areas such as qualitative research, biological materials and genetics, and research involving First Nations, Inuit and Métis peoples of Canada.
We have been asked if CORE/FER is currently part of any accreditation or certification process. It is not at present, but it could be adapted to fill this role. The current quiz structure is not pass/fail, but rather another opportunity for users to self-evaluate and learn from each response. The addition of a stand-alone exam based on the tutorial, written offline in real time, would be one way to turn CORE/FER into a more rigorous evaluation of research ethics knowledge.

Limitations
Three limitations of our current study of CHRPP offer considerations for future research on online tutorials. First, our survey relied on self-report of confidence and improved understanding, which may reflect individuals' self-efficacy rather than actual ability in navigating research ethics guidelines. Future research could compare users' responses to questions about research ethics before and after the modules are completed to indicate knowledge gained. A means of tracking the success of user applications for REB approval could also serve as a measure of success in applying research ethics guidance. Either of these measures would provide a valuable and less subjective assessment of the impact of this online tutorial. However, they would also require the resources necessary to sufficiently measure knowledge and subsequent action.
Second, there is the potential for bias in responses and participation. Because the questions were all positively worded, social desirability, for example, might produce positive responses through the tendency of respondents to agree with a statement rather than disagree (Friedman, 1988). However, Friedman (1993) reported that the tendency to agree with positively worded statements manifested only when the order of the scale options ran from most positive (left side) to most negative (right side). While future research might benefit from including a few negatively worded questions (e.g., "the module objectives were unclear"), a significant association between comment valence (positive or negative) and specific item ratings suggested that valence was consistently reflected. For example, comments provided by dissatisfied respondents tended to be more negative than those provided by satisfied respondents, consistent with the literature (e.g., Borg, 2005; Lewicka et al., 1992; Poncheri et al., 2008). Respondents who commented positively were nearly equal in number to those who wrote negative comments (a ratio of 86:81), and commenters did not differ significantly from non-commenters in their ratings based on a two-tailed independent t-test, suggesting no respondent bias or confusion in the item ratings. In terms of survey participation across roles, most respondents were graduate students, which may reflect that graduate students are the population most often required to complete such training. Future research could focus on users in other roles (e.g., faculty, administrators, and REB members).
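The commenter versus non-commenter comparison above relies on a two-tailed independent-samples t-test. As a minimal sketch of that kind of comparison, the following uses hypothetical 5-point ratings (not the study's actual data) and a pooled-variance t statistic implemented with Python's standard library:

```python
from statistics import mean, variance

def independent_t(a, b):
    """Two-sample t statistic with pooled variance (equal-variance form)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical item ratings on a 5-point scale (illustration only)
commenters = [4, 5, 3, 4, 4, 5, 3, 4]
non_commenters = [4, 4, 3, 5, 4, 4, 4, 3]

t = independent_t(commenters, non_commenters)
# With df = 14, |t| below the two-tailed .05 critical value (~2.145)
# would indicate no significant difference between the two groups.
print(round(t, 3))
```

In practice, `scipy.stats.ttest_ind` computes the same statistic along with a p-value; the standard-library version is shown only to keep the sketch dependency-free.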
Third, the demographics collected included role, institution, success in prior REB applications, and area of research, but missed other factors, such as prior experience with research or with the TCPS, that users raised in the open-ended comments. Beyond demonstrating the value of a mixed-methods survey, these insights suggest that future surveys include familiarity with the online course content (e.g., research ethics policy, the REB review process, and research). The current question about REB application success could be refined to distinguish those who have not submitted an application from those who have submitted but did not receive REB approval. It should also allow a "not applicable" response for those whose role does not involve submitting applications for REB approval. Given the potential for confusion in the original version of this question, the corresponding analyses can only be taken as preliminary results suggesting minimal but potential differences across roles.

Conclusion
The results of this evaluation were used to form recommendations for the development of the next version of this tutorial, as described in the Discussion section titled Application of Study Findings to the Evolution of CHRPP. Subsequent analyses of relationships among the quantitative and qualitative data offered further insights into e-learning design for this and other online educational activities. Since our original conception of CHRPP in 2005, and its official launch in 2008, we have been fortunate to see it so well received by users and then leave the nest to become a key component of Canada's national research ethics education program. We attribute its success to sound pedagogical design (e.g., Driscoll, 1998; Driscoll & Carliner, 2005) and to built-in flexibility to evolve with changes to the TCPS and to changes in online learning. Another important component was our commitment to continuous program evaluation, which allowed us to further improve the tutorial. That evaluation was designed and implemented according to established standards of practice, including the need for utility, feasibility, propriety, and accuracy (Yarbrough, Shulha, Hopson, Carruthers, & the Joint Committee on Standards for Educational Evaluation, 2011).
This design protocol has been adopted by PRE for CORE/FER. Continuous program review has been built into the new online tutorial, and user feedback is currently being analyzed and incorporated into a future update of CORE/FER. We look forward to the growth and development of this new hatchling, the legacy of CHRPP.