Training Educators to Design and Develop ODL Materials/Types of Assessment in ODL/Content

From WikiEducator



PRINCIPLES OF ASSESSMENT

The Role of Assessment

Assessment: a general term that includes the full range of procedures used to gain information about student learning...and the formation of value judgments concerning learning progress...(Linn and Gronlund, 2000, 31-32).

In its broadest sense, e-assessment is the use of information technology for any assessment-related activity. This definition embraces a wide range of student activity, from the use of a word processor to on-screen testing. Due to its obvious similarity to e-learning, the term e-assessment is becoming widely used as a generic term to describe the use of computers within the assessment process. E-assessment can be used to assess cognitive and practical abilities. Cognitive abilities are assessed using e-testing software; practical abilities are assessed using e-portfolios or simulation software. For more information, see E-assessment.
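To make the e-testing idea concrete, here is a minimal sketch of automated scoring for an objective (fixed-response) test. The items, answer key, and responses are invented for illustration; real e-testing software adds item banks, timing, and security on top of this basic idea.

```python
# Minimal sketch of automated scoring for an objective e-test.
# The answer key and student responses below are invented examples.

answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}

def score_test(responses):
    """Return the number of items answered correctly."""
    return sum(1 for item, answer in responses.items()
               if answer_key.get(item) == answer)

student = {"Q1": "B", "Q2": "C", "Q3": "A"}
print(score_test(student))  # 2 (Q1 and Q3 correct, Q2 wrong)
```

Because the scoring rule is applied identically to every script, this kind of objective marking is highly reliable, though validity still depends on how well the items sample the intended content.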

It is generally accepted that assessment propels student learning. There must therefore be a match between what is assessed (content) and how this content is assessed (format) in order to make assessment meaningful and meet the instructional goals.



Icon activity.jpg
Activity
Brainstorming Exercise

In groups, discuss the role of assessment in general. Apply this to the ODL situation. Report back to the plenary.



Modes of Assessment

Icon activity.jpg
Activity
Matching Exercise
  1. This kind of assessment is administered during instruction to find out which learning outcomes learners are handling well and which they need help with; it is done in order to shape and improve performance and behaviour.
  2. This kind of assessment is administered at the end of a specified period of time (course, unit, year) to identify whether learners have achieved the objectives of the course. Emphasis can be placed on assigning grades.
  3. This assessment is given to compare learners’ scores with the average score of the other students in the class. The teacher can include a large number of easy items.
  4. This assessment is given to compare learner performance against a standard or a set of performance tasks. A learner’s outcome depends on what he/she can do – which objectives each learner has mastered. The facilitator can use some very easy and some very difficult items.
  5. This assessment is based on authentic tasks that require students to show what they can do. Learners have to create responses to given problems and be able to defend their positions. At times learners must demonstrate their ability in real-life contexts while working on projects or designing something. This encourages higher analytic and critical thinking.

Match each of the descriptions above with one of the modes of assessment listed below.
A. Criterion-referenced Assessment
B. Formative Assessment
C. Norm-referenced Assessment
D. Performance-based assessment
E. Summative or Final Assessment



Types of Assessment in ODL

Remember that the learner's learning style is also a major consideration. See Types of Online Assessment.

A handout on Types of Assessment is provided.

Issues Related to Validity and Reliability

Two other factors contribute to the effectiveness of assessment, namely reliability and validity.


Icon define.gif
Definitions
  • Reliability is related to the consistency of assessment scores over time, and between and among raters.
    • Refers to the consistency of results obtained with an assessment instrument
    • Consistency of test scores or assessment results from one measurement to another
    • Inter-rater – consistency of scores between raters
    • Intra-rater – consistency of scores given by the same rater at different times
  • Validity is related to the appropriateness of the inferences that are made on the basis of the results of assessment.
    • Accuracy
    • Criterion-related: Concerned with the adequacy and appropriateness of the interpretation and use of assessment results
    • Content-related: How well the sample tasks represent the domain of tasks or content to be measured
    • Construct-related: The correspondence between achievement test items and the instruction for which the test is built
    • Does the test measure what it sets out to measure?
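The inter-rater idea above can be illustrated numerically. A very simple indicator is percent agreement: the proportion of scripts on which two raters give the same mark. The marks below are invented for illustration; real reliability studies typically use statistics such as Cohen's kappa, which also corrects for chance agreement.

```python
# Illustrative check of inter-rater reliability as simple percent agreement.
# The marks below are invented example data for six student scripts.

rater_a = [4, 3, 5, 2, 4, 3]   # marks given by rater A
rater_b = [4, 3, 4, 2, 4, 3]   # marks given by rater B to the same scripts

def percent_agreement(a, b):
    """Proportion of scripts on which the two raters give the same mark."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

print(percent_agreement(rater_a, rater_b))  # 5 of 6 scripts agree ~ 0.83
```

A low agreement figure would signal the need for clearer marking schemes or rater training, as discussed below.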


In order to ensure a high degree of reliability and validity there are several approaches the facilitator can utilize.

  • How can the facilitator improve Reliability?
    • Avoid ambiguous questions and directions or instructions.
    • Sample more items with similar content.
    • Use well-defined scoring/marking schemes.
    • Train raters/markers in an effort to standardize the marking or interpretation of students’ work.
  • How can the facilitator improve Validity?
    • Design a table of specifications.
    • Test only what is taught.
    • Consider ‘for whom’ and ‘for what’.
    • Use item types that enhance the reliability of tests – both subjective and objective items.
    • Ensure that instructions are clear.
    • Ensure appropriate sampling of content.
    • Determine which low-discriminating items to discard after item analysis.
    • Pay attention to scoring procedures and test administration.
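The item-analysis step mentioned above can be sketched with a common discrimination index: the proportion of high scorers who answer an item correctly minus the proportion of low scorers who do. Items with an index near zero (or negative) fail to separate strong from weak learners and are candidates for discarding. The group results below are invented for illustration.

```python
# Simple item discrimination index: D = p_upper - p_lower, where p_upper and
# p_lower are the proportions of the upper and lower scoring groups answering
# the item correctly. The counts below are invented example data.

def discrimination_index(upper_correct, lower_correct, group_size):
    """D ranges from -1 to +1; values near 0 flag low-discriminating items."""
    return (upper_correct - lower_correct) / group_size

# Item answered correctly by 9 of 10 top scorers but only 3 of 10 bottom scorers
print(discrimination_index(9, 3, 10))  # 0.6 -> discriminates well

# Item answered correctly by 5 of 10 learners in each group
print(discrimination_index(5, 5, 10))  # 0.0 -> candidate to discard
```

Running such an analysis after a pilot administration helps the facilitator keep only items that genuinely distinguish levels of mastery.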

Gronlund (2000) points out: “The degree of validity is the single most important aspect of a test”. Furthermore, the teacher must be aware of the many factors which may influence the validity of test, measurement, or evaluation results at any given time in the assessment process. Therefore, the teacher must pay attention to:

  1. the test;
  2. administration and scoring;
  3. pupil’s responses;
  4. the group and the criterion.


Icon key points.gif
Key points
  • Integrate assessment and instruction in order to support student learning. This will promote a high degree of validity and reliability of the educational decisions we make that can impact the learners' future.
    • Specify the purpose of the assessment
    • Use a variety of assessments as appropriate
    • Communicate the criteria
    • Provide timely feedback
    • Use appropriate grading procedures


Web Resources



Activities and Reflections

Icon activity.jpg
Activity
Journal Tasks:
A. Utilize the factors outlined by Gronlund (1998, pp. 17-27; Chapter 11):
  1. Outline procedures you can follow to enhance the validity and reliability of recent assessment results
  2. Find ONE other reference related to the topic and discuss the main ideas with a colleague

B.

  1. Explore the merits of ipsative assessment.
  2. Apart from the four modes of assessment dealt with in Activity 1, Diagnostic Assessment is important:
    • Describe Diagnostic Assessment
    • Explain why Diagnostic Assessment would be useful to you as a facilitator / tutor
    • Design an assessment, taking care to indicate the following:
      • Type:
      • Subject Area:
      • Topic(s):
      • Objectives: (Please indicate the levels/Bloom's Taxonomy)
      • Items: (Ensure each item is matched with one of your stated objectives; try to include at least three item formats)





Icon reflection.gif

Reflection

Assessment Checklist: This checklist is based on the views of Ebel & Frisbie (1991) and Gronlund (1998). Have you...

Right.png decided on the purpose of the assessment?

Right.png identified learning outcomes or objectives?

Right.png prepared test specifications?

Right.png constructed items which match the learning outcomes or objectives?

Right.png reviewed items?

Right.png edited items?

Right.png organized the test?


