User talk:Akash Agarwal/GSOCideas
Contents
Thread title | Replies | Last modified |
---|---|---|
Peer evaluation | 3 | 13:53, 9 March 2014 |
Machine blog posting analysis | 1 | 06:20, 9 March 2014 |
Upload student CSV list | 3 | 14:58, 4 March 2014 |
Emailing participants | 0 | 22:03, 3 March 2014 |
The peer evaluation idea sounds quite interesting. Do you already have a mental model of the workflow and what the user would actually see in the wiki? It might be useful to draw a page or two to illustrate your idea.
I agree with Jim, the peer evaluation sounds interesting and is a necessary component for scaling OERu courses in the future.
One of the unique aspects of an open online course is that learners can choose how they participate, from low-level "sip and dip" or "read only" engagement through to active participation in all the activities. To date, the evaluation (validation) of the blog posts has been used for certification of participation rather than course grades. (That's not to say that peer evaluation for some elements of grading will be excluded in the future.)
You will need to have a think about an implementation model for assigning learners to review designated blog posts (given that not all learners will be posting blogs, and that a few posts could attract most of the reviews). One implementation is to assign the peer review task only to those learners who have already posted blogs themselves, for example allocating the two most recent posts to a learner for review and notifying the learner of the reviews they should complete on their course dashboard. We will need to think about what happens when someone is assigned a blog to review and doesn't complete the task. We could, for example, design a reward system where a blog reviewer gains points, recognised as a course participation metric, for completing their assigned reviews. We will also need a mechanism to know which posts have been reviewed, and should consider the option of a minimum number of reviews per post.
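A minimal sketch of the assignment rule suggested above (only learners who have posted get review tasks, each assigned the two most recent posts by other learners). The Post data model and field names here are purely illustrative, not an existing WikiEducator structure:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str          # wiki username of the learner who posted
    url: str             # URL of the blog post
    posted_at: datetime  # when the post was published

def assign_reviews(posts, per_learner=2):
    """Return {reviewer: [posts to review]} for learners who have posted."""
    authors = {p.author for p in posts}
    newest_first = sorted(posts, key=lambda p: p.posted_at, reverse=True)
    assignments = {}
    for reviewer in authors:
        # Skip the reviewer's own posts and take the most recent posts by others.
        candidates = [p for p in newest_first if p.author != reviewer]
        assignments[reviewer] = candidates[:per_learner]
    return assignments
```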
I have given the peer evaluation workflow a bit of thought.
- The peer evaluation dashboard will be very similar to the Activity register for a course, like http://wikieducator.org/Open_content_licensing_for_educators/E-Activity_register but with the following changes:
- Only posts for activities to which the learner has already submitted a post of their own will be visible to them.
- Only posts with fewer than a fixed number of reviews will be visible in this dashboard. For example, a course could require roughly 2-5 peer reviews per activity, so only the posts with fewer than 5 reviews would be displayed.
- Posts will be sorted by the number of peer reviews and then newest first, i.e. the latest posts with no peer evaluations yet will appear higher in the list.
- The table will look like https://www.dropbox.com/s/vkmvxj4kx73ekjd/IMG_20140308_221215.jpg (a very rough diagram.)
- The user can then click on a particular row to go to the peer evaluation page of that post.
- It will contain a link to the actual blog, where the learner can go and have a look at it.
- It will look something like https://www.dropbox.com/s/itggpqb7z8hmzmp/IMG_20140308_221558.jpg (a very rough diagram.)
- Now, the course can set a minimum number of posts that each user has to review for each activity, say 3 or 4. This not only serves the purpose of evaluation but also increases learning, since learners will be reading the posts of their peers, without taking away the freedom to choose which posts they review. They would get some credit for each post they review.
The above workflow ensures that all posts are peer reviewed while learners keep the freedom to choose which posts to read from among those that have not yet received enough evaluations. A rough sketch of the selection and sorting logic is included below.
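This sketch only illustrates the dashboard filter described in the list above (hide the viewer's own posts, hide posts that already have enough reviews, order by fewest reviews then newest first); the data model is assumed for illustration:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ActivityPost:
    author: str
    url: str
    posted_at: datetime
    review_count: int

def dashboard_posts(posts, viewer, max_reviews=5):
    """Posts the viewer may review: not their own, still under the review cap,
    ordered by fewest reviews first and newest first within a tie."""
    visible = [p for p in posts
               if p.author != viewer and p.review_count < max_reviews]
    return sorted(visible,
                  key=lambda p: (p.review_count, -p.posted_at.timestamp()))
```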
Because we don't have a clear way to tie together a user and a natural person, self-assignment allows the creation of sock-puppets for grading or favorably marking friends. At the same time it allows smaller self-organized cohorts to work within a larger class. (My own experience in MOOCs suggests the latter is helpful and should be encouraged.)
I actually think all the submissions should be visible, with those needing review being highlighted. I don't want to prevent a motivated student from reading everything (not that he is likely to review everything).
It would be important to record the time of the review (since remote content could change after review).
I think some sort of peer review is important to the Academic Volunteers International scheme. It also suggests there might be multiple types of reviewers, possibly categories like: current student, student that has completed the course, tutor, community volunteer.
I remain skeptical of machine analysis of blog postings. Can you show us examples of how this is being done now, either in a project you have contributed to or in another Open Source project? Doing rigorous machine analysis in the time available for a GSoC project sounds very ambitious.
I plan to do very basic analysis within the GSoC timeframe, and I am also working out how many of these ideas I would be able to complete in the time available for GSoC.
There was a project done at my lab some time back (http://cdeproject.iiit.ac.in/htap/). Although it is a bit different from what I plan to do, I can use the concepts learned there and apply them accordingly.
To describe one of the relevant projects, related to Twitter trends, in very broad terms, I'll use an example. We take a specific event and make a list of all tags related to it. Then, from a continually updated database of tweets, we query for the tweets in which at least some fixed percentage of the text matches the tags. I can do the same for blogs. For each activity a list of tags can be written (I plan to do this manually to start with, but it can be automated as well). I then search for the tags in the blog posts and check whether a post contains at least some of them. This is the most simplistic approach I could think of to achieve automatic detection. I do have some other approaches in mind; for example, I could perform outlier detection (http://www.eng.tau.ac.il/~bengal/outlier.pdf) on the blog posts, so that the instructor or peers would only have to revalidate the outliers.
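A simplistic sketch of the tag-matching idea: count how many of an activity's (manually curated) tags appear in the post text and flag the post as on-topic if the proportion passes a threshold. The tag list, threshold, and sample text below are placeholders, not project data:

```python
import re

def matches_activity(post_text, activity_tags, threshold=0.3):
    """True if at least `threshold` of the activity's tags occur in the post."""
    words = set(re.findall(r"\w+", post_text.lower()))
    hits = sum(1 for tag in activity_tags if tag.lower() in words)
    return hits / len(activity_tags) >= threshold

# Example: check a post against a hypothetical tag list for one e-activity.
tags = ["copyright", "creative", "commons", "licence", "attribution"]
print(matches_activity("My reflections on Creative Commons attribution ...", tags))
```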
Of course, if you are happy with the results of these, I would be glad to perform more rigorous machine learning tasks outside the GSoC timeframe too.
I very much dislike subscribing users to mailing lists, so I don't like the idea of uploading a CSV of mailing addresses. I believe it is far better to invite them to join the course (from mailings from the opt-in lists at the OERu web site for example). This puts the student in control... and hopefully will also make it obvious to them how to unsubscribe (remove themselves from a class).
I agree, subscribing users without their express permission would not be permissible in the WikiEducator / OERu environment. However, I can see a use case for a "pre-registration" process where interest in an OERu course is registered, for example signing up for updates on the OERu website, or an OERu partner teaching a course in parallel mode where registered students are expected to participate in the open OERu version on WikiEducator.
A CSV file could "pre-populate" fields with information we have (e.g. name, email address) when this category of "pre-registration" student comes to the WE site to complete their registration.
That sounds tricky... since the pre-registration would then also have to create a Mediawiki user and provide a way for the student to claim it with a password. If the user already had a WikiEducator account, he might now end up with a second.
It is tricky --
I wasn't thinking about a physical "pre-registration" in the wiki as such. Rather, the email sent out from the CSV file to invite registration could include a link to the registration form, which would have some intelligence to recognise confirmed WE email accounts, provide a login link, and then pre-populate the relevant form fields from the information we already know once the WE user is confirmed through their password.
Not sure I'm making much sense -- but think of lowering the number of clicks / pages for registration of existing WE users.
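One rough sketch of that invitation flow, assuming the registration form can pre-fill fields from query parameters. The form URL and parameter names below are made up for illustration, not an existing WikiEducator endpoint:

```python
import csv
from urllib.parse import urlencode

# Placeholder URL for the registration form; not a real endpoint.
REGISTER_URL = "https://wikieducator.org/registration-form"

def invitation_links(csv_path):
    """Yield (email, link) pairs from a CSV with 'name' and 'email' columns."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Hypothetical query parameters the form would read to pre-fill fields.
            query = urlencode({"realname": row["name"], "email": row["email"]})
            yield row["email"], f"{REGISTER_URL}?{query}"
```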
A tool exists that allows instructors to email students (by walking the class category members). It needs to be revised so that it can only be used to email students in a particular class, as it currently gives the instructor too much power.
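A rough sketch of how a revised tool could restrict itself to one class by listing that class's category members through the MediaWiki API. The category name is only an example, continuation of long result sets is omitted for brevity, and actual mail delivery is left as a stub:

```python
import requests

API = "https://wikieducator.org/api.php"

def class_members(category, limit=500):
    """Return member page titles of a class category (e.g. 'Category:ExampleCourse2014')."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    return [m["title"] for m in data["query"]["categorymembers"]]

def email_class(category, subject, body):
    for member in class_members(category):
        # Placeholder: actual delivery would go through Special:EmailUser or SMTP.
        print(f"Would email {member}: {subject}")
```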