Evaluation of eLearning for Effective Practice Guidebook/Class Projects/Kevin and Jon - project page

From WikiEducator

Effectiveness Evaluation Plan - Kevin and Jon

[Jon and Kevin this is an excellent plan, and you have provided a detailed background and rationale for the evaluation. The detail about your methodology is comprehensive and based on "solid" evidence. You clearly have an understanding of the evaluation process, and the need for mixed methods. The limitations are well described, and this is something you can mention in the report - whether these factors impacted on the evaluation. The reference list is impressive, and demonstrates you have read widely. Good to see APA referencing in action. It would be good to see some more current journal articles in the report. The purpose, decisions and evaluation questions fit very well apart from a couple of aspects - see further on.

Some suggestions: You mention "LMS data collection to ensure quality outcomes" early on, but have not included this in the data collection method. This is okay, as including it would make this particular project too big, but you need to state in the methods section that this method of data collection will not be used for this project and will be done in the future.

One of your evaluation questions - *ST9: Do the technologies employed help students successfully meet the learning outcomes? - may not be necessary, because I cannot see how the survey questions will answer this. It would be helpful to see an outline of what you expect to cover in the interviews - perhaps this evaluation question will be answered in the interviews. Are the learning outcomes related to the learning objectives for each individual course in the programme, or the graduate outcomes for the programme? Just make it clear in the report what the learning outcomes are expected to be.--Bronwynh 05:51, 16 November 2010 (UTC)]

Introduction

The purpose of this plan is to outline an evaluation study to be carried out on web-supported content that was developed to enhance teaching and learning in a Certificate in Hospitality, Level 4 programme that is delivered in a polytechnic institute. The intention is to gather data to determine the effectiveness of the on-line content and the use of the learning management system, Moodle, and to make recommendations for enhancement and improvement. Included in the plan are the rationale and background to the evaluation, together with the methods employed and the target audience.


http://4.bp.blogspot.com/_4RraWTO7ZWs/TJVMYNnISJI/AAAAAAAAAME/Y_s53KIWdLs/s320/UCol+snip.JPG


In choosing to conduct an effectiveness evaluation, the evaluators have selected a process which will help determine whether the on-line materials fully support learners to achieve the programme's learning outcomes, and whether the objectives of the evaluation have been met. It is believed that an effectiveness evaluation will increase the probability of positive outcomes and will help the identification of areas for improvement. The evaluators are confident that the data collected in this evaluation will result in, but not be limited to, the following, as presented by Reeves and Hedberg (2003):

• Clarification of adaptation, adoption, and implementation decisions

• Support for the programme’s future through the development of a solid, sustainable business case and model for approval review.

The 'Multiple Methods Evaluation Model', which involves triangulation to capture a broad data viewpoint, will be used. This will involve employing a combination of interviews, surveys, and LMS data collection to ensure quality outcomes. It is envisaged that the resultant recommendations will be acted on by the organisation. It is implicit in this evaluation's objectives that “feedback should be sought at the level at which one is endeavouring to monitor quality...the focus should be on students’ perceptions of key aspects of teaching or on key aspects of the quality of their programmes” (Marshall, 2006, p.54).

Background

The focus of the evaluation is the on-line materials that have been developed to support a National Certificate in Hospitality (Professional Cookery), Level 4 programme offered by a polytechnic institute in the North Island of New Zealand. The programme is unit standards-based and comprises 138 New Zealand Qualifications Authority (NZQA) credits.

Delivery is on an intramural, full-time basis over 36 weeks, is face-to-face (a mixture of practical and theoretical learning), and comprises 790 lecturer-supported learning hours and 590 independent learning hours. The target market for the programme is those with prior experience in catering who require a formal, national qualification in the area of culinary arts.

The programme offers students a broad overview of the advanced cookery skills required by the catering and hospitality industries, and recognises the competence, skills and knowledge required of chefs with some experience who are involved in cooking non-routine dishes in a commercial kitchen.

The entry requirements for the programme are previous study and skills in culinary arts (equivalent to a Certificate in Culinary Arts, Level 3, a National Certificate in Hospitality (Basic Cookery), Level 3, or a minimum of five years’ industry experience). Students who do not meet these criteria but have a combination of previous study in culinary arts and industry experience may be granted entry to the programme at the discretion of the Programme Leader and with the approval of the Faculty Board of Studies. International students require a level of English language ability equivalent to IELTS 5.5; students who do not meet this criterion may be granted entry to the programme at the discretion of the Programme Leader.

Graduates are able to cook dishes from fresh ingredients and use a substantial range of preparation methods, cookery methods, and finishing techniques. They will have demonstrated the ability to cook a variety of different dishes across each of the main food groups and types, using advanced techniques, multiple ingredients and differing flavours and textures. They will also have demonstrated knowledge and understanding of food commodities and their application, food safety, basic nutrition, commercial food costing, and portion control. In addition, graduates will have demonstrated a range of interpersonal, personal presentation, customer care, numeracy, literacy, and teamwork skills required for working in a commercial kitchen.

The 8 papers (courses) within the programme are supported by on-line materials, using the institute's learning management system (LMS), Moodle. They are: Food Safety, Nutrition and Commodities; Hospitality Operations; Advanced Hot Kitchen (1); Advanced Hot Kitchen (2); Advanced Hot Kitchen (3); Advanced Hot Kitchen (4); Patisserie (1); and Patisserie (2).

The topic blocks in Moodle are not aligned to the papers and are organised as follows: Class Notes, Patisserie Theory, Homework, Theory Assessments, Practical Assessments Information, Practical Assessments Two, Videos from Class, and Restaurants and Cafe Menus. Section links are provided: Culinary Terms, Assessment Information, Video Clips, Shared Recipes, Recap of Lessons, and Student Handbook. Links to PowerPoints, handouts, videos, recipes and underpinning knowledge questions (UPKs) are provided in each topic block.

Teaching staff demonstrate how to access Moodle during the first week of the programme and articulate their expectation that students will access the materials to support their learning. The importance of regularly accessing the materials on Moodle is reinforced during face-to-face sessions.

Staff teaching on the programme value being able to identify how often and for how long students access the on-line materials. They have, however, expressed concern that the current cohort of students are not fully engaging with Moodle, particularly when compared to the previous cohort. The age range of the current cohort is between 16 and 24 years, and all are international students from China. [I am interested to know which aspects of the online materials students are not engaging with - is it specific content, activities or other aspects, e.g., online assessments, or discussions?--Bronwynh 05:23, 16 November 2010 (UTC)]

Purposes

The principal purpose of this evaluation is to explore and critically review the on-line materials that support the face-to-face delivery of a National Certificate in Hospitality (Professional Cookery), Level 4 programme.


This will enable the evaluators to make judgements as to the effectiveness of the materials, their contribution to the overall learning experience of the students, and the achievement of the learning outcomes. The findings and conclusions drawn from undertaking the evaluation will lead to the formulation of recommendations for enhancements and improvements, based on best practice guidelines (Milne & Dimock, 2006).

Limitations

This year's cohort of 14 is substantially smaller than previous cohorts (18 students were enrolled on the programme and 4 have withdrawn), which greatly reduces the anticipated sample size.

It is intended that every student from the previous year's cohort will be contacted to elicit their feedback, but this may not be achievable as some graduates have relocated and their email addresses or contact details may not be known. If this is the case, or the response rate is low, relevant data will be extracted from the programme's 2009 Student Satisfaction Survey, which includes questions related to the delivery and effectiveness of on-line content.

Interviews with current students may prove problematic, as English is their second language. To overcome these problems, the interviews will be recorded (with the students' consent) and the support of an ESOL specialist will be sought to help with transcription.

It is intended to use the free version of 'Survey Monkey'. As this version limits users to 10 questions, the restricted question set may slightly reduce the reliability and validity of the responses.

Audiences

The primary audiences for this evaluation will be students enrolled on the programme, the Programme Leader and teaching staff involved in the delivery and assessment of the 8 papers.

Secondary audiences will be the Heads of School, the Faculty Deans and the institute's eLearning Steering Group; it is intended that the evaluation findings and subsequent recommendations will be widely disseminated and help inform the design and management of other Moodle courses in the institute.

Decisions

It is envisaged that the findings of this evaluation will be used to influence future decisions in the School of Hospitality, in particular, and the polytechnic institute, in general. The findings and recommendations will highlight areas for improvement, and suggest alternative and more effective approaches and tools to incorporate when developing on-line content to support lecturer-supported and independent learning hours. Of particular value will be designing content that meets the needs of students for whom English is a second language.

Anticipated outcomes of the evaluation are:

Positive Outcomes

Students fully engage with the on-line content.

Students achieve the programme learning outcomes and demonstrate competence against all unit standards.

Adequate resources and time are allocated to improving the instructional design and content.

Students are actively encouraged to undertake regular evaluations of the on-line content and make recommendations for improvement.

The on-line content is included in the programme's Self Assessment and External Evaluation and Review Process.

Negative Outcomes

Teaching staff may feel threatened by the findings and recommendations.

Teaching staff may lack the knowledge, skills, motivation, and/or time to incorporate the recommended enhancements and tools.

Students may still not fully engage with the on-line content.

The on-line content requires extensive modification.

The recommended changes do not fully meet the needs of students for whom English is a second language.

Questions

[This section is for the "big picture" evaluation questions which you have put in the data collection matrix. I am putting them here. I remember mentioning you would be best to do this previously, and also about putting the survey and interview questions in the Appendix. I will settle for the way you have done it in the plan, but will expect the re-arrangement to be done when the report is prepared. :) --Bronwynh 05:30, 16 November 2010 (UTC)]

  • ST2: Do the students know at the start of the course what is expected of them?
  • ST9: Do the technologies employed help students successfully meet the learning outcomes?
  • SO12: Do students have access to content support in a timely fashion?
  • ST5: Have activities been identified that allow individuals and groups to learn through experience, including opportunities to demonstrate, reinforce knowledge, develop understanding, and practice skills?
  • How did the online experience and/or materials add to participant learning?
  • Does the paper successfully integrate technology and instruction?



Question development was influenced by the Evaluation Cookbook (Harvey, 1998) and the eLearning Guidelines (Milne & Dimock, 2006), with the aim of producing a balanced response for this evaluation's purpose (see Methods, and Instrumentation).


Survey Questions (Students)


Student Computer Access (Yes / No)

1. I have access to a computer at home

2. I have access to Broadband at home

3. I can easily get to a campus to use a computer

4. My computer runs Microsoft Word 97-2003 (.doc)

5. My computer runs Microsoft Word 2007 (.docx)

6. I use alternative word processing software


UCOL Moodle (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• This programme makes good use of web resources.

• The Moodle programme is well organized.

• Technology does not get in the way of my learning.

• Technology supports my learning.

• More on-line options would help me learn.

• I learn better online than I do in a traditional classroom.

• I would have more control of my learning if more material was available on-line.

• I would find an on-line discussion forum helpful for communicating with my lecturer(s) and classmates.


Re: Online Experience

• Overall, how satisfied or dissatisfied are you with the format of the on-line component of this course? (Very Dissatisfied / Dissatisfied / Satisfied / Very Satisfied / Additional Comment)

• It is easy to find my way around the course (or paper) online. (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• It is easy to use the online features of the Moodle course, for example, the discussion forum and the quizzes. (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• How much on-line interaction have you had with other students in your programme? (A lot / Some basic interaction / Very little / None / Additional Comment)


Survey Questions (Staff)


• How do you think the online component of the paper encourages a performance-based learning experience for the student?

• How does the on-line component successfully integrate technology and instruction?

• Does the on-line component successfully integrate technology and instruction? (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• Do the on-line lessons present easy-to-follow instruction? (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• Is the on-line component addressing academic expectations? (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• Is there any bias shown within the online component (social, gender, etc.)? (Strongly Disagree / Disagree / Agree / Strongly Agree / Additional Comment)

• Are documents and audio presented in appropriate format in your opinion?

• Have you had any feedback, and in what form, about the paper and its online component?

• Do you have any specific concerns about the paper as an online resource?

• Do you have anything you would care to remove from the paper for the benefit of all participants?

Methods

A mixed methods approach to evaluation has been chosen, as using only a quantitative approach, which provides statistical results, may make it difficult to assess the efficacy of the on-line content. Adding qualitative ‘flesh’ to the quantitative ‘bones’ is acknowledged as a sound strategy for overcoming this problem.

Greene et al. (1989) highlight five major purposes for mixed-method evaluation:

Triangulation tests the consistency of findings obtained through different evaluation instruments. In this evaluation, triangulation will increase the opportunities to control and assess some of the threats or multiple causes influencing the results.

Complementarity clarifies and illustrates results from one method with the use of another method. In this evaluation, the focus group will add information about the quality and effectiveness of the on-line materials and will help to qualify the statistics gained from the questionnaires.

Development uses the results from one method to shape subsequent methods or steps in the evaluative process. In this evaluation, results from the electronic questionnaires might suggest that other instruments should be introduced.

Initiation stimulates new evaluation questions or challenges the results obtained through one method. In this evaluation, semi-structured interviews with the developer and teaching staff will provide insights into how the on-line content has been perceived and valued.

Expansion provides richness and detail to the evaluation as it enables an exploration of the specific features of each method. Use of a range of evaluation tools will expand the breadth of the evaluation and enlighten the more general debate on the effectiveness of on-line content to support face-to-face programme delivery.

The evaluation methodology, integrating different methods, is likely to produce better results in terms of quality and scope. It will also enable the evaluators to probe the underlying issues and provide the means to recommend creative alternatives to the existing content. It is anticipated that the evaluation outcomes may trigger small but significant shifts in practice in the short term and be useful to audiences outside of the programme.


Sample

The student sample includes the 14 students who are currently undertaking the National Certificate in Hospitality (Professional Cookery), Level 4, plus students from the previous intake (number currently unknown).

The staff member who developed the on-line content will be invited to participate in the evaluation, along with four other staff who are currently teaching or have previously taught on the programme.

Three teaching staff members who have developed on-line content for similar programmes on other campuses will also be invited to participate. The rationale for this is that they have developed their content in Moodle and may have already evaluated their materials and chosen tools; it is anticipated that these staff will be willing to share the outcomes of their evaluations.

Instrumentation

It is proposed to use a range of data collection instruments to evaluate the effectiveness of the on-line content of the programme: questionnaires, a focus group, and semi-structured interviews.

“Questionnaires and surveys constitute one of the fundamental methods of research, especially in the area of education, where the human factor holds the center role” (Kaskalis, 2005). Questionnaires were chosen as they are familiar to most people, generally do not make people apprehensive, and are less intrusive than telephone surveys. It was decided to use electronic questionnaires as they have the advantage of providing data that can be quantified quickly and efficiently.
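To illustrate how quickly electronic questionnaire data can be quantified, the sketch below (a hypothetical Python example, not part of the evaluation toolkit) tallies responses to a single Likert-scale item of the kind used in the student survey:

```python
from collections import Counter

# Four-point Likert scale used in the student survey (no neutral midpoint).
SCALE = ["Strongly Disagree", "Disagree", "Agree", "Strongly Agree"]

def summarise(responses):
    """Tally responses to one survey item and report the percentage who agree."""
    counts = Counter(responses)
    total = len(responses)
    agree = counts["Agree"] + counts["Strongly Agree"]
    return {
        "counts": {option: counts[option] for option in SCALE},
        "percent_agree": round(100 * agree / total, 1),
    }

# Hypothetical responses to the item "Technology supports my learning."
sample = ["Agree", "Strongly Agree", "Agree", "Disagree", "Agree"]
print(summarise(sample))  # 4 of 5 responses agree, i.e. 80.0% agreement
```

A summary of this kind can be produced for every survey item as soon as the electronic responses are exported, which is the efficiency advantage referred to above.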

Focus groups were originally called ‘focused interviews’ or ‘group depth interviews’ and the technique was developed after World War II to evaluate audience response to radio programmes (Stewart & Shamdasani, 1990). A focus group may be defined as a group of interacting individuals having a common interest or characteristics, brought together by an interviewer, who uses the group and its interaction as a means of gaining information about a specific or focused issue. Programme evaluators have found focus groups to be useful in developing their understanding of how or why people hold particular views about a topic or programme of interest.

The focus group will be conducted in three phases (Krueger, 1988):

1. Conceptualization

2. Interview

3. Analysis and reporting.

Careful and systematic analysis of the focus discussions will enrich the information gleaned from the electronic questionnaire and provide the evaluators with indications and insights as to how the on-line content is perceived, used and valued by the students.

Semi-structured interviews will be conducted with a fairly open framework to allow for focused, conversational, two-way communication. The evaluators are cognisant that “semi-structured interviews, to be successful require:

• as much preparation before the session,

• probably more discipline and more creativity in the session, and

• certainly more time for analysis and interpretation after the session" (Wengraf, 2001, p.5).

Not all questions will be designed and phrased ahead of the interviews; most of the questions will be created and posed during the interview in order to allow the interviewer the flexibility to probe for details or discuss issues with the interviewee. Two underlying principles that will be followed when conducting the interviews will be to avoid leading the interview or imposing meanings, and to create a relaxed and comfortable conversation.


The chosen instruments will be used as follows:

- Current students will be surveyed using an electronic questionnaire, followed by a focus group.

- The previous year's students will be surveyed using an electronic questionnaire.

- The on-line developer will be invited to take part in a semi-structured interview*.

- Staff who taught on the programme in 2009 and those who are currently teaching on the programme will be surveyed using an electronic questionnaire.

- Three developers of on-line content for similar programmes of study will be invited to take part in a semi-structured group interview*.

  * Consent for interviews to be recorded will be requested to allow the interviewer to draw out the main themes.

Logistics, Timeline, and Data Collection Matrix



Data Collection Matrix

| Questions | Survey | Interviews (lead teachers) | Interviews (individual teachers) |
| ST2: Do the students know at the start of the course what is expected of them? | X | X | X |
| ST9: Do the technologies employed help students successfully meet the learning outcomes? | | X | X |
| SO12: Do students have access to content support in a timely fashion? | X | X | X |
| ST5: Have activities been identified that allow individuals and groups to learn through experience, including opportunities to demonstrate, reinforce knowledge, develop understanding, and practice skills? | X | X | |
| How did the online experience and/or materials add to participant learning? | X | X | |
| Does the paper successfully integrate technology and instruction? | | | |

Dates and Events:
Weeks 5-8 (Sept.5 - Oct.2)
  • Design initial plan
  • Contact Staff to be sampled
  • Compile questions for interviews (students/staff)
  • Organise dates/times
  • Gather/review research


Week 9 (Oct.3 - 9)
  • Disseminate survey
Weeks 10-12 (Oct. 10 - 30)
  • Implement interviews
  • Analyse evaluation results
Weeks 13-16 (Oct. 31 - Nov.27)
  • Review/Present Data
Week 17 (Nov.29 - Dec.4)
  • Present summary of project to the class
  • Hand in the final evaluation report


Budget

Jon: 60 hours @ $80 per hour = $4,800.00
Kevin: 60 hours @ $80 per hour = $4,800.00
Total: $9,600.00
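As a quick sanity check, the budget arithmetic can be verified with a short calculation (illustrative only; the names, hours, and rate are the figures quoted in the budget):

```python
# Check of the evaluation budget: two evaluators, 60 hours each at $80 per hour.
HOURS_PER_EVALUATOR = 60
RATE_PER_HOUR = 80  # dollars per hour

line_items = {name: HOURS_PER_EVALUATOR * RATE_PER_HOUR for name in ["Jon", "Kevin"]}
total = sum(line_items.values())

for name, cost in line_items.items():
    print(f"{name}: ${cost:,.2f}")  # Jon: $4,800.00 / Kevin: $4,800.00
print(f"Total: ${total:,.2f}")      # Total: $9,600.00
```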


References

Gratton, C., & Jones, I. (2004). Research methods for sport studies. New York: Routledge.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation design. Educational Evaluation and Policy Analysis, 11(3), 255-274.

Harvey, J. (1998). Evaluation cookbook. Retrieved from http://www.icbl.hw.ac.uk/ltdi/cookbook/

Kaskalis, T. H. (2005). Localizing and experiencing electronic questionnaires in an educational web site. World Academy of Science, Engineering and Technology, 3. Retrieved from http://www.waset.org/journals/waset/v3/v3-36.pdf

Krueger, R. A. (1988). Focus groups: A practical guide for applied research. Newbury Park, CA: Sage Publications.

Marshall, S. (2006). E-learning maturity model: Process assessment workbook. Wellington, New Zealand: Victoria University of Wellington.

Milne, J., & Dimock, E. (2006). eLearning guidelines: Guidelines for the support of eLearning in New Zealand. Retrieved from http://elg.massey.ac.nz/Guidelines-questions.pdf

Munn, P., & Drever, E. (1999). Using questionnaires in small-scale research: A teachers’ guide. Edinburgh: Scottish Council for Research in Education.

Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.

Stewart, D. W., & Shamdasani, P. N. (1990). Focus groups: Theory and practice. Applied Social Research Methods Series, Vol. 20. Newbury Park, CA: Sage Publications.

Wengraf, T. (2001). Qualitative research interviewing: Semi-structured, biographical and narrative methods. London: Sage Publications.