Peer Evaluation/Recommended rubric refinements

Components of the rubric system

 * 1) Criterion: The dimension to be evaluated, for example knowledge, relevance or critical thinking.
 * 2) Levels: The rating scale used, e.g.:
 * 3) * Not achieved, achieved, merit
 * 4) * Below average, average, good, excellent
 * 5) * Beginning, intermediate, exemplary
 * 6) Descriptors: Definitions describing the desired performance at each level.
 * 7) Weightings: The relative contribution of each criterion to the overall score.
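The components above could be captured in a simple data model. This is an illustrative sketch only; all class and field names are assumptions, not an existing API:

```python
# Hypothetical data model for the rubric components described above.
# Names (Criterion, Rubric, weighting, etc.) are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    name: str                  # dimension evaluated, e.g. "Relevance"
    descriptors: list[str]     # one descriptor per level, in ascending order
    weighting: float = 1.0     # relative contribution to the overall score

@dataclass
class Rubric:
    levels: list[str]          # e.g. ["Beginning", "Intermediate", "Exemplary"]
    criteria: list[Criterion] = field(default_factory=list)
```

A rubric instance would then pair each criterion's descriptors with the shared list of levels.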

Criterion referenced rubric (optional)
''A criterion-referenced rubric comprises objectively verifiable criteria of the yes/no type. It is typically used to assess the completeness of a submission.''


 * 1) "Optional" means that this is not a required component of the evaluation system, but if included it functions as described below.
 * 2) It can be used alone or in conjunction with an analytic or rating scale rubric.
 * 3) The "Completeness" criterion assesses whether the post meets the minimum criteria for "Achieved" and specifies the criteria for a "Merit" rating. If the post does not meet the minimum criteria, it is designated "Not achieved" (or another suitable three-level typology is used).
 * 4) Results for the "Completeness" dimension are displayed separately from the average score determined by an analytic or rating scale rubric, and can simply be displayed as "Not achieved", "Achieved" or "Merit".
 * 5) The respondent does not rate the post as such, but merely completes a checklist of yes/no questions to verify that the minimum specified elements are present in the submission.
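The mapping from the yes/no checklist to the three-level typology could be sketched as follows. The thresholds (all minimum items required for "Achieved", plus all merit items for "Merit") are assumptions for illustration:

```python
# Sketch of mapping a yes/no completeness checklist to the three-level
# typology described above. Threshold rules are illustrative assumptions.
def completeness_rating(minimum_items: list[bool], merit_items: list[bool]) -> str:
    # Any missing minimum element means the post is not achieved.
    if not all(minimum_items):
        return "Not achieved"
    # All minimum items present plus all merit items earns "Merit".
    if merit_items and all(merit_items):
        return "Merit"
    return "Achieved"
```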

Example based on Activity 3.1
Notes


 * 1) Include a comment text field for the item "Does this post relate to the activity?".
 * 2) Every peer evaluation submission should include the validation question confirming that the post relates to the activity.

Example based on Activity 4.1
Note: Include a comment text field for the item "Does this post relate to the activity?"

Example based on Second Learning Reflection
Note: Include a comment text field for the item "Does this post relate to the activity?"

Analytic rubrics
Analytic rubrics provide descriptors for each criterion at every level of the rubric. They can be used in conjunction with the criterion-referenced rubric, but not in conjunction with the rating scale rubric below.


 * Suggested UI behaviour: The user clicks on the cell (or corresponding star) which best describes the performance for the criterion concerned.

Example based on Activity 3.1
Notes


 * 1) Scoring: 1 = Beginner, 2 = Intermediate, 3 = Advanced, multiplied by the weighting.
 * 2) Final score: could be displayed to users as three stars (e.g. &#9734; &#9734; &#9734;) partially shaded according to the aggregate score from all evaluations. For example, an aggregate score of 2.5 would show the first two stars fully shaded and the third star half shaded.
 * 3) Number of levels: The system should allow the user to specify the number of levels (two or more), although in practice, with four or more levels it becomes increasingly difficult to write descriptors that discriminate successfully. I suggest restricting this to a maximum of five.
 * 4) Layout: The current table layout is not optimised for mobile display. Perhaps each level descriptor could be a row under the heading of the criterion.
 * 5) Comments: Provide one open text field for comments.
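The weighted scoring and partial star shading described above could work roughly as follows. This is a minimal sketch; the function names and the normalisation by total weight are assumptions:

```python
# Sketch of the analytic rubric scoring described above: per-criterion level
# scores (1 = Beginner ... 3 = Advanced) combined by weighting, then rendered
# as partially shaded stars. Function names are illustrative assumptions.
def weighted_score(level_scores: list[int], weightings: list[float]) -> float:
    """Weighted mean of per-criterion level scores, on the same 1-3 scale."""
    total_weight = sum(weightings)
    return sum(s * w for s, w in zip(level_scores, weightings)) / total_weight

def star_fill(aggregate: float, max_stars: int = 3) -> list[float]:
    """Fraction (0-1) to which each star is shaded, left to right."""
    return [min(1.0, max(0.0, aggregate - i)) for i in range(max_stars)]
```

With equal weightings, scores of 2 and 3 on two criteria aggregate to 2.5, which `star_fill` renders as two full stars and one half star.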

Example based on Activity 4.1
Notes


 * 1) As for the Activity 3.1 example above (scoring, final score display, number of levels, layout and comments).

Rating scale rubrics
''A rating scale rubric lists the criteria and their corresponding descriptions. Respondents rate the quality of the item using a five-point star system. It is useful, but not required, to provide examples of what constitutes a weak or strong answer, depending on the format of the descriptor.''

Example based on 2nd Learning reflection
Notes


 * 1) I've deliberated on whether the system should provide options for different rating scales, but feel we should restrict the rating scale to a standard five-star system.
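How the five-star ratings from several respondents are combined into one displayed value is not specified above; one plausible approach, rounding to the nearest half star so the result maps cleanly onto partial star shading, is sketched here as an assumption:

```python
# Illustrative sketch (an assumption, not specified in the notes above) of
# aggregating several respondents' 5-star ratings into one displayed value.
def aggregate_rating(ratings: list[int]) -> float:
    """Mean of all 5-star ratings, rounded to the nearest half star."""
    mean = sum(ratings) / len(ratings)
    return round(mean * 2) / 2
```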

Thoughts about colour coding stars
I've been thinking about a "traffic light" analogy for colour coding the stars used in the rating system as a visual cue. The stars can be used for displaying results and also as input for the rating scale rubric approach.

From the learner's perspective (self evaluation) using a 5 star system:


 * 1) Red (Star 1): I need help or need to work more, my post is not up to standard
 * 2) Amber (Stars 2 - 4): I'm getting there - still need to improve on a few aspects
 * 3) Green (Star 5): I've mastered this!

From the evaluator's perspective


 * 1) Red (Star 1): Learner's post is below standard - needs more work
 * 2) Amber (Stars 2 - 4): Learner gets the basics, but there are areas for improvement
 * 3) Green (Star 5): Model answer which learners should emulate

For three star reporting


 * 1) Red (Star 1)
 * 2) Amber (Star 2)
 * 3) Green (Star 3).
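The colour bands above, and the compression of a five-star rating into the three-star report, could be sketched as follows. The mapping rules mirror the lists above; the function names are assumptions:

```python
# Sketch of the traffic-light colour coding described above: a 5-star rating
# maps to red/amber/green, and that colour maps to the 3-star report
# (1 = red, 2 = amber, 3 = green). Function names are assumptions.
def star_colour(star: int) -> str:
    """Colour cue for a 5-star rating: 1 = red, 2-4 = amber, 5 = green."""
    if star <= 1:
        return "red"
    if star >= 5:
        return "green"
    return "amber"

def three_star_report(star: int) -> int:
    """Compress a 5-star rating into the 3-star report."""
    return {"red": 1, "amber": 2, "green": 3}[star_colour(star)]
```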