Evaluation of eLearning for Effective Practice Guidebook/Class Projects/Alex and Rachel - project page


Alex and Rachel - Evaluation Plan

A note about our approach to this project page

Our approach to the planning phase of this project is to draft our plans independently, share them on this page and then provide feedback on each other's plans. We have agreed to approach planning this way because, while we both work in similar environments, may use similar evaluation methods and may be working towards similar outcomes, our projects are fundamentally different: Rachel is evaluating an established evaluation methodology, while Alex is evaluating the manager's role in an induction programme. With the application of evaluative practice being so diverse, it would be difficult for either of us to draft a section that could be applied to both projects. We agree that this approach is the most beneficial for us both.

We also believe that by approaching the project in this way, we will be able to contribute to and collaborate on each other's work - a peer review [1] - prior to inviting feedback from our fellow classmates. We will then be in a position to take the best of our collaborative work, along with any suggestions or refinements, so that each of us finishes up with a highly polished evaluation plan. After all, two heads are better than one ...

Follow Rachel on her blog 

Alex's Blog http://alexselearningevaluationblog.blogspot.com/

[Alex, your part of the plan is looking good. You have covered all the sections, in particular, the rationale for using interviews actually provides valuable background information - it would be good to see this in the methods section. There needs to be information about the type of evaluation you are planning, i.e., effectiveness. There is a good list of Decisions and limitations. The interview questions are moved to the Appendices.

Please provide clarification about your "big picture" questions - see further on. Also there are a lot of interview questions which may mean the interviews will take some time. My advice is to gather as much information as possible with the survey to save time. Be careful about using interview questions which ask for opinion "What do you think..?" - you need to clearly indicate what you want to know otherwise you will get a lot of vague responses.

For this project, it is possible to only conduct a few of the interviews, and get responses from a proportion of the anticipated survey respondents. Bronwynh 01:52, 18 October 2010 (UTC)]

Introduction

Provide an overview of the intentions and design of the evaluation project, and introduce the major sections of the plan as well as the primary people involved in writing it.

Rachel

In 2008, the organisation was awarded the Achievement Award - Gold Level from the New Zealand Business Excellence Foundation in recognition of its strengths, including its focus on continuous improvement and its clear vision to be a world-class provider of general insurance. The organisation continues to strive towards business excellence across all of its activities and areas, including how it measures the success of delivering quality learning solutions to its employees.

High levels of employee engagement, a positive workplace climate and the quality of the learning & development solutions provided for employees have seen the organisation ranked within the top five 'Large Workplaces' in the independent nationwide Unlimited/JRA 'Best Places to Work' survey every year since 2004.

The organisational capability & learning (OC&L) team have committed to a review of the organisation’s learning evaluation practices. The OC&L team are learning & development professionals employed by the organisation to provide specialist expertise in the areas of learning analysis, design, development, implementation and evaluation. The evaluation review will be completed by the OC&L Team and will cover both behavioural Competency (customer service, leadership & other “soft skills”) and Technical (industry, vocational & technology) learning within the organisation.

The review will focus on:

  • The usability of the current tools for collecting evaluation data from learners
  • The usability of the current on-line evaluation/survey tool for the OC&L Team
  • The process of completing learning evaluations for learners
  • Producing meaningful data and reports to enable stakeholders to make decisions which will (in time) deliver specific business results.
  • Identifying alternative evaluation data collection methods (beyond the current paper based or on-line collection tools).

This review will feature input from learners, team leaders and managers at all levels of the organisation who are undertaking learning activities as part of their overall individual development.

Alex

The Learning & Development (L&D) team was responsible for the design and implementation of a new induction programme for our organisation in 2009. While considerable focus has been given to the process and resources for the ‘Discover’ programme, L&D would like to better understand the manager’s experience of inducting a new employee under the new process. The team would also like to understand how the new induction material has been incorporated into the eight-week Call Centre Representative induction.

Background

Present any information which is needed to provide the reader with an understanding of the background of the eLearning that is being evaluated and the rationale.

Rachel

Approximately 5 years ago, the organisation adopted the Kirkpatrick Four Level Model of Evaluation as its evaluation standard. The Kirkpatrick Four Levels are:

  • Level 4: Results - the degree to which targeted outcomes occur as a result of the learning event(s) and subsequent reinforcement.
  • Level 3: Behavior - the degree to which participants apply what they learned during training when they are back on the job.
  • Level 2: Learning - the degree to which participants acquire the intended knowledge, skills and attitudes based on their participation in the learning event.
  • Level 1: Reaction - the degree to which participants react favorably to the learning event.

However, deployment of the Kirkpatrick model has been inconsistent. Delivery of surveys, collection of data and reporting have all been sporadic, and different processes and question sets exist between Technical & Competency learning. This is a direct result of the organisation’s lack of a collaborative and holistic approach to evaluation across all arenas of learning.

When reviewing the combined evaluation activities from both the Technical & Competency learning areas, the following issues were isolated:

  • Some learning activities have been “over-evaluated”, e.g. spamming learners with multiple on-line surveys, which resulted in no or very low response rates. “The problem of fatigue usually arises out of intensive interaction between the …. system and the respondent. Consequently, due to lack of interest or attention, the respondent will provide biased answers either intentionally or unintentionally.”[1]

  • One evaluation form (set of survey questions) was distributed over a period of 6 months to all learners for all solutions, regardless of their delivery mode; e.g. two surveys were longer (29 questions over 4 pages) than the 2-page learning activity being evaluated.
  • Some evaluation data was collected which was outside the organisation’s area of influence to rectify, e.g. “What topics or subjects were missing from this book that you would like to learn more about?”
  • Surveys and evaluations had been distributed sporadically and some responses collected over a 3 year period without any:

    o review or analysis of the data

    o follow-up with facilitators

    o feedback to team leaders/managers

    o action planning or any other output post learning activity.

  • The context of the words used in some evaluations was inconsistent, leading to different interpretations – some learners were daunted and feared being ‘tested’ or ‘assessed’, whilst others were confused or unable to interpret the questions.

Almost no evaluation currently takes place in the Technical Learning area. After a very long break of 2-3 years with no Technical evaluations being delivered, the OC&L Team has recently recommenced sending surveys and collecting Level 1 data from learners. The recently revised Level 1 evaluation uses a different set of questions which are customised for each solution type. For example, different questions are asked if the solution is a job aid designed and developed using in-house capability versus an on-line module purchased from a third-party supplier. The organisation has always recorded summative assessment results as proof of knowledge acquisition, and in the past this has been considered sufficient to meet the data collection needs of Kirkpatrick’s Level 2: Learning. The organisation has decided there will be no Level 1 evaluation conducted for any compliance training, as these are mandated regulatory requirements.

In the Competency Learning space, another approach has been taken. All instructor-led training (ILT) courses and workshops have been fairly consistently evaluated over the past 2-3 years. The organisation also has a small library of topically relevant books and articles which were being evaluated over the past 6 months, but this has recently been suspended pending the outcome of this review. When the organisation adopted the Kirkpatrick model 2-3 years ago, post-course assignments were created for nearly all of the ILT courses/workshops to provide a means of assessing the knowledge, skills and attitudes acquired (summative assessment). Unfortunately, no records have been kept of these results and it appears that not all learners have completed or submitted the post-course requirements.

Alex

The ‘Discover’ induction programme was launched in late 2009 and has not yet been evaluated. A learner evaluation is being carried out by the Learning & Development Manager using the online survey prepared by our design partner, Inspire. The L&D Advisor's role is to carry out an evaluation of the resources and process provided for managers to implement the Discover programme with new staff. This includes a Manager's guide to induction (both paper-based and online), several online modules and a process that involves the recruitment team instigating the induction through direct contact with the manager.

The items themselves contain learning outcomes for managers - a fact that may not have been clearly articulated to either managers or those who are part of the induction process. As discussed in Assessing Learning Outcomes (2006), 'assessing learning outcomes is concerned with determining whether or not learners have acquired the desired type or level of capability, and whether they have benefited from the educational experience'. Managers should be using the guides and resources to acquire the desired capability and benefit from the educational experience, so to ensure their validity and usefulness these resources should be evaluated.

Purposes

Describe the purposes of the evaluation, that is, what you are evaluating and the intended outcomes.

Rachel

The overall purpose of this evaluation review is to review current evaluation processes and to design and implement evaluation processes for Levels 1 (Reaction) and 2 (Learning), to ensure learning activity continues to contribute to, and remain aligned with, the strategic goals of the organisation. However, the organisation acknowledges it has limited ability to influence the design of externally developed third-party modules.

Therefore, the writer will concentrate on aspects of the overall workplace review which are within the organisation’s direct control, by focusing on the evaluation of the internally developed Technical Learning on-line modules at Level 1 (Reaction) and developing a plan to address any gaps or areas for improvement which may be identified as a result.

Alex

The purpose is to evaluate the effectiveness of the process, resources, communication and support available to managers for the ‘Discover’ programme, to ensure management activities during an employee's first ninety days contribute to the intended outcomes of the programme. While focused on the overall effectiveness of the programme, the work is interwoven with outcome and impact evaluation to measure whether the programme has caused a 'demonstrable effect' in the way our managers induct new employees, and what broader, overall effect this has on our organisation as a whole (Research Methods Knowledge Base, 2006, Types of Evaluation).

Limitations

Outline any limitations to the interpretation and generalizability of the evaluation. Also describe potential threats to the reliability and validity of the evaluation design and instrumentation.

Rachel

In Scope: Level 1 evaluations for technical learning on-line modules.
Out of Scope: Evaluations for other types of learning solutions at Level 1 or 2 (e.g. group learning, books, tertiary study, self-paced workbooks, etc.). Any evaluation at Level 3 or Level 4, as well as any Return on Investment (ROI) or Return on Expectations (ROE) analysis.

Alex

The evaluation activities described in this plan are to measure the effectiveness of the manager's portion of the induction only. The employee's (learner's) experience will be considered only in terms of their interaction with their manager during the programme. Data regarding the employee's experience will be gathered outside of this plan.

Potential threats to the reliability and validity of this evaluation include:

1. The evaluator cannot guarantee that all Managers will participate in the online survey.

2. Face to face data collection from managers is limited to a small pool of managers due to time constraints.

3. Not all managers have participated in supporting new employees through the induction process, so only a limited amount of data can be gathered on the user experience.

4. Not all support staff have participated in supporting managers through inducting a new employee, so only a limited amount of data can be gathered from this perspective.

Audiences

Specify all the primary and secondary audiences or consumers of the evaluation.

Rachel

This review will feature input from learners, team leaders and managers at all levels of the organisation. They will be completing technical learning on-line modules as part of their overall individual development activities.


Alex

The primary audience for the evaluation will be the Organisational Capability team and the People & Culture Leadership Team. The evaluation and subsequent reporting will determine whether further investment in the programme should be made, and to what degree. The secondary audiences are the P&C Business Partners, Recruitment Team and Retail Trainers. These groups all play a significant part in the induction process and will have contributed to the evaluative process; they will need to be consulted on any resulting changes to process or resources.

Decisions

This section is probably the most difficult, but it should be included if the evaluation is to have meaningful impact on decision-making. Both positive and negative outcomes should be anticipated.

Rachel

To enable stakeholders to make learning-related decisions which will (in time) deliver specific business results, they need access to meaningful learning data and reports. For example, changes to the content, navigation or support for learners who are completing on-line Technical Learning modules will be made if the current modules are found to be insufficient for delivering the relevant knowledge for the learner’s role and position within the organisation.
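For illustration, here is a minimal sketch of how Level 1 (Reaction) responses could be rolled up into the kind of per-module report such decisions depend on. It assumes responses are available as (module, rating) pairs on a 1-5 scale; the module names, ratings and review threshold are hypothetical examples, not the organisation's actual data.

    # A minimal sketch: summarise Level 1 (Reaction) ratings per module and
    # flag candidates for review. All names, ratings and the threshold are
    # hypothetical illustrations, not real organisational data.
    from statistics import mean

    responses = [
        ("Policy Basics", 4), ("Policy Basics", 5), ("Policy Basics", 4),
        ("Claims Handling", 2), ("Claims Handling", 3), ("Claims Handling", 2),
    ]

    REVIEW_THRESHOLD = 3.0  # modules averaging below this are flagged

    by_module = {}
    for module, rating in responses:
        by_module.setdefault(module, []).append(rating)

    for module, ratings in sorted(by_module.items()):
        avg = mean(ratings)
        flag = "REVIEW" if avg < REVIEW_THRESHOLD else "OK"
        print(f"{module}: n={len(ratings)}, mean={avg:.2f} [{flag}]")

A summary of this shape would let the OC&L team see at a glance which modules fall short and prioritise changes to content, navigation or learner support.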

Alex

A key outcome of this evaluative process is to determine whether further investment in the induction is required to improve the experience of new starts in the organisation. From the anecdotal feedback the Organisational Capability team has received, it appears that a significant number of managers may not be adequately aware of the programme or may not understand how to implement it successfully. The data will therefore lead to specific decisions the Organisational Capability Manager and P&C Leadership team need to make. The report will need to cover a variety of recommendations and present possible options, including an option for revitalising the programme without further investment.

Key decisions the Organisational Capability Manager and P&C Leadership team need to make include:

1. whether the findings indicate a critical need for redevelopment/refreshment of the programme;

2. the financial implications of redevelopment/refreshment and the budget managers are prepared to allocate;

3. whether redevelopment should form part of the Manager's on-boarding project or be kept separate;

4. who will be responsible for running the redevelopment/refreshment project if it is to go ahead;

5. the timeframes the Organisational Capability Manager and P&C Leadership team are prepared to commit to for redevelopment/refreshment.
Questions

List your 'big picture' evaluation questions and sub-questions here. A key element of a sound evaluation plan is careful specification of the questions to be addressed by the evaluation design and data collection methods. The clearer and more detailed these questions are, the more likely that you will be able to provide reliable and valid answers to them. Note: The eLearning Guidelines can assist here.

Rachel

MD1   What systems are in place for monitoring the quality of learning material, including its periodic review and or redevelopment?

MO17 How does the organisation monitor the impact and effectiveness of e-learning?

[Analyse] 3 factors that determine the effectiveness of elearning:

1. What the learner’s manager does before the learning?

2. What the tutor or instructional designer [OC&L Team] does before the learning takes place?

3. What the manager does after the learning has taken place?

Giving people new knowledge and skills is way down the list.[2]



Alex

The overall purpose of interviewing Business Partners, Managers and the Retail Trainers is to identify how well these groups understand the induction process, their role in it, and how to access support and resources for the programme. The interviews and surveys are designed to determine how well People & Culture have disseminated information via online mechanisms and whether support channels have been adequately advertised. This approach aligns with Process S5 of the E-Learning Maturity Model: teaching staff are provided with pedagogical support and professional development in using e-learning. While Marshall discusses the process from the teaching staff's perspective, the same can easily be applied to the corporate world and the role of the manager: 'Teaching staff need training and support if they are to be effective with new technologies [they] need to be able to access a range of professional supports as they encounter issues during their work.' (Harasim, cited in Marshall, 2007, p.134). Our managers require training and support to implement our induction programme, and this evaluation will determine whether they have adequately received it.

Key (big-picture) question:
Are our managers provided with sufficient educational support and professional development in using the tools and resources provided to successfully implement the Discover programme?

Sub-questions
• How well do Business Partners, Managers and the Retail Trainers understand the induction process, and their role in it?
• What do key personnel know about accessing support and resources for the programme?
• What are the perceptions of managers about managing the induction process?

Please see the appendix for a full list of questions to be used in the interviews.

Methods

Describe the evaluation design and procedures. Match the method to the purposes and questions of your evaluation, and the phase of eLearning, e.g., analysis, design, development, implementation.

Rachel

  • Draft interview questions to be used when meeting with stakeholders to discuss what evaluation is and what it is not in order to provide a meaningful benchmark of quality for the organisation. [Analysis phase]
  • Review current evaluation processes for on-line modules within the Technical learning space to enable contribution at a later stage towards the definition of evaluation processes across the whole organisation. [Analysis phase]
  • Review use of the organisation’s current on-line survey/evaluation tool to determine whether a more user-friendly application can be identified for later consideration. [Development phase]
  • Draft a set of Technical Learning evaluation practices for on-line modules including:
     o A timetable/calendar of when and how often the evaluation will occur [Implementation & Evaluation phase]
     o Identify which on-line modules will be evaluated at each level [Implementation phase]
     o Determine how feedback will be distributed to team leaders/managers and the various authors of on-line modules [Design & Development phases]
     o Document the evaluation process for new on-line solutions [Design & Development phases]

Alex

As the programme is already being utilised by a significant number of managers across the organisation, the evaluation of the manager's resources and the process of the Discover programme is an effectiveness evaluation. Prior to the development of the programme, a needs analysis was carried out by the external design team to identify the organisation's requirements for the programme. Within the content itself, new employees are provided with the opportunity to evaluate their experience of the programme through a survey that is both an effectiveness and an impact evaluation. Effectiveness evaluation is concerned with measuring the degree to which a course, programme or resource supports the user in meeting the desired learning outcomes, and what measures, if any, need to be taken to improve its effectiveness. This evaluation is focused on identifying the effectiveness of the tools and resources provided for managers to prepare and support them through inducting a new employee. The desired outcome of the tools and resources we provide is that managers will carry out a thorough induction process that is aligned with the organisation's intentions and brand values.

The evaluator proposes a mix of qualitative and quantitative data collection methods. Qualitative data is descriptive data, generally captured through interviews, observations or open-answer survey questions; its purpose is to gather evidence of the experience from the user/learner/employee's perspective, in their own words. Quantitative data is generally expressed in numbers and can be accumulated through assessments, metric-driven survey questions (Likert scales), business reporting metrics and database analysis; its purpose is to provide defined, verifiable data that can often be manipulated to provide several angles of information about the sample group.
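As a minimal sketch of what those "several angles" could look like in practice, the fragment below summarises hypothetical Likert-scale records overall, by distribution and by business group, while setting the qualitative comments aside for thematic reading. All groups, ratings and comments are illustrative assumptions, not actual survey data.

    # Each record: (business group, Likert rating 1-5, open-answer comment).
    # Groups, ratings and comments are hypothetical examples.
    from collections import Counter
    from statistics import mean

    records = [
        ("Retail", 4, "Clear process, easy to follow"),
        ("Retail", 2, "Hard to find the checklist online"),
        ("Call Centre", 5, "The buddy guide was very useful"),
        ("Call Centre", 3, "Modules took longer than expected"),
    ]

    ratings = [rating for _, rating, _ in records]
    print(f"Overall mean: {mean(ratings):.2f}")
    print(f"Distribution: {dict(sorted(Counter(ratings).items()))}")

    # The same quantitative data, split by business group
    groups = {}
    for group, rating, _ in records:
        groups.setdefault(group, []).append(rating)
    for group, group_ratings in sorted(groups.items()):
        print(f"{group}: mean={mean(group_ratings):.2f} (n={len(group_ratings)})")

    # Qualitative comments are kept aside for thematic reading, not averaged
    comments = [comment for _, _, comment in records]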

The evaluation will consist of the following activities:

  • Online voluntary survey for all Contact managers
  • Face to face interview with 12 Contact managers
  • Face to face interviews with P&C Business Partners
  • Face to face interviews with P&C Recruitment Consultants
  • Face to face interviews with Retail Trainers
  • Analysis of learner evaluation results
  • Organisational results of Hewitt Engagement Survey

The evaluator feels that collecting data from a variety of sources, using varied collection methods, will provide a broader basis for analysing and understanding the effectiveness of the programme. Some academics refer to this ‘broad strategy of data collection and analysis within which a range and variety of techniques’ are used as triangulation (Elliott, date unknown, Triangulation section).

Interviews (qualitative data) were chosen to enable conversations directly with managers, for deeper insight into what it is like to manage the process. All managers being interviewed will have had a new employee since 1 January 2010, so should have been through the induction process and will be able to give a detailed first-hand account of it from their perspective. The interviews with Business Partners and Retail Trainers will offer the same benefits.

The purpose of the survey (quantitative data) is to widen the data pool. The organisation has around 120 managers across New Zealand, so the survey is the best way to accumulate as much data from them as possible. The survey will also provide metrics to report on and compare with our engagement survey results. The mix of methods was chosen to gain deeper insight into various areas of investigation; in Methods of Assessment (2006) the author describes how a range of assessments can help determine the level of achievement.

The evaluator feels that the interview method will provide data about 'actual performance' that would not be evident in a survey response. The evaluator is also looking for the 'oral response' from participants, which can better articulate the actual level of a participant's capability (Methods of Assessment, 2006).

Sample

Specify exactly which students and personnel will participate in the evaluation. If necessary, a rationale for sample sizes should also be included.

Rachel

Up to 250 learners from across all levels of the organisation who complete at least one technical learning on-line module from Jul to Nov 2010.

3 x Technical Reviewers (subject matter experts) who are responsible for reviewing and approving on-line technical module content.

2 x Authors who are responsible for creating on-line technical modules from approved storyboards.

2 x Testers who are responsible for reviewing and publishing on-line technical modules for learners.


Alex

4 P&C Business Partners - There are currently only four P&C Business partners within the organisation.

2 Recruitment Consultants - There are currently only two full-time Recruitment Consultants in the organisation.

3 Retail Trainers - Only three Retail Trainers are available for interview; the fourth is on annual leave.

12 Managers (Interview only) - This is an achievable sample size and 10% of the full manager group. Two managers from each of the six business groups will be selected.

120 Managers (Survey only) - The survey will be made available to all one hundred and twenty managers within the organisation, in order to get as substantial a response as possible.

Instrumentation

Outline all the evaluation instruments and tools to be used in the evaluation. Actual instruments, e.g., questionnaire, interview questions etc., should be included in appendices.

Rachel

The evaluation instruments and tools to be used in the evaluation are face-to-face interviews, online surveys, storyboard edits, testers' checklists and email feedback from the learners who are involved with the design, development or completion of at least one technical learning on-line module from Jul to Nov 2010.


Alex

The evaluation instruments and tools to be used in the evaluation are face-to-face interviews, online surveys, data from the employee feedback process for induction, and data gathered in the 2009 Hewitt survey.

Logistics and Timeline

Outline the steps of the evaluation in order of implementation, analysis, and reporting of the evaluation, and include a timeline.

Rachel

The writer will be responsible for drafting the report, with peer review feedback to be incorporated as deemed essential to meet the evaluation purpose.

Draft stakeholder interview questions: Jul, 1 day
Review current evaluation processes for on-line Technical learning modules: Aug, 1 day
Draft a set of Technical Learning evaluation practices for on-line modules: Sep, 4 days
Review use of the organisation’s current on-line survey/evaluation tool: Oct, 2 days
Implement elements as detailed above: Oct-Nov, 5 days
Reporting and recommendations: Dec, 2 days


Alex

All evaluation design, development, implementation and administration will be carried out by the Learning & Development Advisor and signed off by the Learning & Development and Organisational Capability Managers.

The due dates for each section of the evaluation plan are outlined below.

1. Online survey design: 30 September
2. Manager interview design: 30 September
3. Business Partner interview design: 15 September
4. Business Partner interviews: 1 October
5. Online survey implementation: 15 October
6. Manager interviews: 15 October
7. Analysis of learner evaluations: 30 October
8. Reporting and recommendations: 30 November

Budget

Provide a rough costing of the evaluation process. For example, the evaluator's time and costs, payment of other participants, etc.

Rachel

The project does not require a budget; however, costs have been estimated using the organisational average cost per hour as follows:

Evaluator costs: 7.5 hours per day x 15 days = 112.5 hours @ $100 per hour = $11,250

Sample audience costs: too variable across the different groups to estimate.


Alex

The project does not require a budget; however, costs have been estimated as follows:

Evaluator costs: Approx 40 hour total/Approx $1500

Participant costs: Approx 87 hours (based on 147 participants each providing between 30 minutes and an hour of their time). The dollar amount is too variable across groups to determine.
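As a quick sanity check on the arithmetic in both budget estimates, here is a minimal sketch using only the rates and times quoted in this plan; the hourly rate is the organisational average already assumed above, not a confirmed figure.

    # Cost arithmetic from the two budget estimates above.
    HOURS_PER_DAY = 7.5
    RATE = 100  # dollars per hour (organisational average)

    # Rachel: 15 days of evaluator time, per her timeline
    rachel_hours = HOURS_PER_DAY * 15
    print(f"Rachel's evaluator cost: {rachel_hours} h x ${RATE} = ${rachel_hours * RATE:,.0f}")

    # Alex: 147 participants giving 30-60 minutes each
    low_hours, high_hours = 147 * 0.5, 147 * 1.0
    print(f"Alex's participant time: {low_hours}-{high_hours} h (plan estimates approx 87 h)")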


Appendices

Alex

Business Partner interview questions

1. On a scale of 1 – 10 how well do you feel you understand the role of the manager in the Discover induction?

2. Where have you found the majority of the information you require about the process? (Please choose one).

  • a. Online – the Discover homepage
  • b. From Learning & Development
  • c. From the Recruitment Team
  • d. From another Business Partner

3. Have you ever supported a manager through inducting a new start?

  • a. Yes – what were the biggest challenges? What were the best resources?
  • b. No – if you were to what kind of support do you think you might need?

4. Tell me about your experience of our online resources (Discover homepage, schedules, checklist, managers' guides, buddy guide) for managers. (Direct the answer to quality, relevance and usefulness for managers.)

5. How easy is it to access those resources online? Was it easy to find what you needed quickly?

6. How effectively do you feel the resources (Discover homepage, schedules, checklist, managers guides, buddy guide) prepare managers to carry out the induction process?

7. Have you completed the online modules yourself?

  • a. Yes – can you give me an example of what you found valuable in the modules? Was there anything that was challenging about the content or the ability for users to complete the module?
  • b. No – do you think that not completing them disadvantages your ability to support managers through the process?

8. What type of feedback from managers have you had about the programme?

9. Are there any common themes or issues you have noticed in supporting managers through the process? (If so what?)

10. Do you have any comments or feedback you would like to give about the programme?

11. If you could change one thing about the programme what would it be?


Manager online survey questions

The organisation does not currently have a Learning Management System (LMS), so it is unable to identify how many managers have progressed their staff through the programme. The survey questions are intended to provide People & Culture with a big-picture overview of how familiar managers are with the Discover programme, how many have used it, and what their general impressions of the process and resources are.

Questions for this survey are available on request (it is not possible to upload a document or post an image of them here). The survey will be carried out over a week in October. A manager may or may not complete the survey prior to being interviewed.

Manager interview questions

These questions are a guide for the interview process; the conversation will depend on the existing relationship with the manager and their experience with the programme. The purpose of the interviews is to gain deeper insight into how well the programme works in practice and to identify common issues between managers. Managers will be asked if they have completed the survey prior to the interview; if they have, the interviewer will modify the questioning appropriately.

1. You have managed an employee through the induction process – tell me about how you prepared to do this. (Looking for references to people, timing, place, resources.)

2. During preparation for the induction did you face any challenges? Tell me about them.

3. Have you completed the online modules yourself?

  • a. Yes – how well do you feel each module prepares the inductee for employment at Contact?
  • b. No – do you think that not completing them disadvantages your ability to support new employees through the process?

4. If you didn’t face challenges tell me about why you feel the process ran smoothly.

5. If you had to support a colleague through the manager’s process how would you do it?

  • a. Where would you access information and resources?
  • b. If you wouldn’t go straight to the Discover homepage why would that be?

6. What sort of conversations with other managers/colleagues have you had about the programme?

  • a. What would you say the general consensus about the programme is – positive/negative – and why?

7. Explain how/why you believe (or don't believe) the programme supports you in providing a new start with a great induction experience.

8. Do you have any other comments /feedback you would like to provide?

Retail Trainer interview questions

1. When the Discover programme was launched in 2009, what steps did you take to incorporate it into the current Call Centre induction programme?

2. How many Inductions have you done since then?

3. How does the online component fit into the Full Contact Induction?

4. Now that you have the extra modules - what percentage of the Full Contact induction is presented online?

5. How does the manager carry out their responsibilities, or do you act as the manager for the first 8 weeks?

6. If you act as manager in the first 8 weeks, how do you manage the transition when the CSRs move out into their teams?

7. What do you think are the most valuable tools for the manager in the Discover programme?

8. What do you think are the least valuable tools for the manager in the Discover programme?

9. Are there any further resources you feel we should consider adding?

10. Does the online component of the programme meet your needs?

Rachel

Stakeholder interview questions

What is your vision of the improvements expected when the evaluation review is complete?

What do you see as being the specific objectives of the evaluation review?

In meeting the objectives, what factors do you see as being critical to the success of the evaluation review?

Are there any working environment considerations that could impact on the evaluation review’s success in either a positive or restrictive way, for example, markets, geography, culture, policy, working practices, methods, quality standards?

Is there any other information relevant to the evaluation review of which we should be aware?

Are there any assumptions being made in carrying out the evaluation review?

What do you see as the major risks to the success of the evaluation review?

What evaluation review status updates do you require and at what frequency (e.g. phase reviews, customer satisfaction surveys, supplier reviews, post-implementation reviews, etc.)?

Technical Learning On-line Modules

1. My knowledge of this subject prior to the module/s was

2. My knowledge of this subject after the module/s is

3. What did you find MOST VALUABLE? Please explain why.

4. What did you find LEAST VALUABLE? Please explain why.

5. What improvements or changes (if any) would you like to see?

6. The module/s were clear and easy to follow.

7. The module/s provided useful information that helped me to more fully understand the topics covered.

8. The module/s content and activities were engaging.

9. Did you complete each module in one go or break it into smaller parts over a few days?

10. The module/s were relevant to my role and position.

11. I will be able to use what I have learnt in my day to day activities.

12. If you have answered YES to question 11, please explain HOW you will apply what you've learnt.

13. If you have answered NO to question 11, please explain WHY you don't think you will be able to apply what you've learnt.

14. My overall rating for these module/s is

References

Rachel

  1. Wang, L., Wei, P., & Chang, T. (2007). Reducing Evaluation Fatigue in Interactive Evolutionary Algorithms by Using an Incremental Learning Approach. Abstract [Adobe Digital Editions version]. Retrieved from http://www.springerlink.com
  2. Little, B. (2009). The Benefits of Experience, Knowledge Alone is Not Learning. eLearn Magazine. Retrieved from http://elearningmag.org


Alex

Assessing Learning Outcomes (2006). WikiEdProfessional eLearning Guidebook/Assessment, feedback, and e-moderation/Assessing Learning Outcomes. Retrieved September 28, 2010 from http://wikieducator.org/WikiEdProfessional_eLearning_Guidebook/Assessment,_feedback,_and_e-moderation/Assessing_Learning_Outcomes

Elliott, J. (date unknown). Collecting, analysing and reporting data in action-research: some methods and techniques. Retrieved November 22, 2010 from University of East Anglia, School of Education and Professional Development Web site: http://www.uea.ac.uk/edu/phdhkedu/acadpapers/jeoecdpage1.html

Hegarty, B. (2009). What are the functions or methods of evaluation? Retrieved September 28, 2010.

Introduction to Evaluation (2006). Research Methods Knowledge Base. Retrieved September 28, 2010 from http://www.socialresearchmethods.net/kb/intreval.htm

Marshall, S. (2007). E-Learning Maturity Model: Process Descriptions. Retrieved September 30, 2010 from Victoria University of Wellington, University Teaching Development Centre Web site: http://www.utdc.vuw.ac.nz/research/emm/documents/versiontwothree/20070620ProcessDescriptions.pdf

Methods of Assessment (2006). WikiEdProfessional eLearning Guidebook/Assessment, feedback, and e-moderation/Assessing Learning Outcomes/Methods of assessment. Retrieved September 28, 2010 from http://wikieducator.org/WikiEdProfessional_eLearning_Guidebook/Assessment,_feedback,_and_e-moderation/Assessing_Learning_Outcomes/Methods_of_assessment