Enosha Hettiarachchi



Action Research 2.0 Project Planning for Enosha Hettiarachchi

My Action Research Plan at ELGG - http://htk.tlu.ee/ejump/pg/file/enosha/read/2263/action-research-plan-draft-version

Title: SCORM based Question Objects for Moodle

Overall aim:

Create SCORM based Question Objects that can be used to track student progress and participation in the Moodle Learning Management System, and to conduct practical e-testing for online learning programmes.

Key words: e-learning, Moodle development, interactive activities, SCORM Modules, online education, Web Development, Software Development, Action Research, assessment based learning, SCO, Assets, Moodle Gradebook.

Key developmental goals of the project: (3-5 ideally)

o Develop SCORM based question objects.
o Allow students to answer SCORM based questions as an examination within a given time period.
o Configure Moodle/write code to offer a limited number of attempts during the examination.
o Write appropriate code to calculate the marks obtained for SCORM based question objects.
o Transfer the highest score obtained across the given attempts to the Gradebook of the Moodle system.
o Offer a limited number of randomly generated questions to the students from the activity bank available in a SCORM object.
o Provide students with the facility to skip a particular question and revisit it later.
o Implement a way to transfer the scores from the SCORM objects to the Gradebook (so far the Moodle system does not support this facility).
o Simulate the Microsoft Office package to develop the questions needed to create a question bank.

Key research questions of the study: (3-5 ideally)

o How can SCORM based question objects be designed to conduct e-testing effectively?
o Do students prefer online simulated assessments to paper- and computer-based ones?
o What actions need to be taken to make a paradigm shift from the earlier method of MCQ assessments to activity based assessments?
o Can this be achieved within the Moodle environment, and what modifications and code are needed to accomplish it?

Research methodology and methods: (outline of approach to research and main methods with reference to key literature)

I am working at the e-learning centre of the University of Colombo School of Computing as an Instructional Designer. At the same time I am following the Master of Computer Science (MCS) programme. For my final year project I am doing a project called SCORM based Question Objects for Moodle, which can be used for assessment based learning and to conduct online e-testing.

First, questions for the question bank have to be developed by simulating the Microsoft Office package, because these SCORM objects will be used to conduct online practical assessments for a course called “ICT Applications”. These activities will be developed using Flash, JavaScript or Java applets. Marks/scores have to be calculated according to the actions performed by the students.

Then a way to implement communication between these activities and the LMS has to be worked out. This requires knowledge of SCOs, Assets and SCORM. All the questions developed fall into the category of SCOs. On their own, the questions are unable to communicate with the LMS, so communication between each SCO and the LMS has to be implemented.
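The usual starting point for this SCO-to-LMS communication in SCORM 1.2 is the API discovery routine: the LMS exposes an object named `API` on some ancestor frame, and the SCO walks up the frame hierarchy at launch to find it. The sketch below follows that standard pattern; the mock frame hierarchy at the bottom stands in for a real LMS page and is purely illustrative.

```javascript
// Standard SCORM 1.2 API discovery, as typically embedded in a SCO's
// launch page. Walks up the frame hierarchy looking for the LMS-provided
// API object.
function findAPI(win) {
  var tries = 0;
  while (win.API == null && win.parent != null && win.parent !== win) {
    if (++tries > 7) return null; // bound the search depth, per common practice
    win = win.parent;
  }
  return win.API || null;
}

// Illustrative mock standing in for a real LMS frame hierarchy:
var lmsAPI = {
  LMSInitialize: function () { return "true"; },
  LMSFinish: function () { return "true"; }
};
var topFrame = { API: lmsAPI };
topFrame.parent = topFrame;          // topmost frame points to itself
var scoFrame = { parent: topFrame }; // the frame the SCO is launched in

var api = findAPI(scoFrame);
if (api != null) {
  api.LMSInitialize(""); // must be called before any data-model access
}
```

In a real SCO, `findAPI(window)` is tried first and then `findAPI(window.opener)` if the content was launched in a pop-up window.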

Once communication between each of these SCOs (question objects) and the LMS is established, all the questions have to be packaged into SCORM objects, and these SCORM objects should be able to transfer the overall marks obtained by each student to the Moodle Gradebook. Each student should be offered a limited number of attempts at a particular online assessment. The marks/scores obtained on each attempt, the time taken and whether the assessment was completed should be recorded in each student's user report. The highest marks/scores obtained across all the attempts should be transferred to the Moodle Gradebook.
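The score transfer itself goes through the SCORM 1.2 data model: the SCO writes its result into `cmi.core.score.raw` and sets `cmi.core.lesson_status`, and the LMS (Moodle's SCORM module) picks these up for grading. A hedged sketch, where `api` is the adapter found at launch and the function name and pass mark are illustrative assumptions:

```javascript
// Report the best attempt to the LMS over the SCORM 1.2 data model.
// Element names (cmi.core.*) are standard SCORM 1.2; the helper itself
// is an illustrative sketch, not the project's actual code.
function reportResult(api, attemptScores, passMark) {
  var best = Math.max.apply(null, attemptScores);
  api.LMSSetValue("cmi.core.score.raw", String(best)); // 0-100 in SCORM 1.2
  api.LMSSetValue("cmi.core.score.min", "0");
  api.LMSSetValue("cmi.core.score.max", "100");
  api.LMSSetValue("cmi.core.lesson_status",
                  best >= passMark ? "passed" : "failed");
  api.LMSCommit(""); // ask the LMS to persist the data
  return best;
}

// Usage with a minimal mock API that records what was set:
var store = {};
var mockAPI = {
  LMSSetValue: function (el, val) { store[el] = val; return "true"; },
  LMSCommit: function () { return "true"; }
};
reportResult(mockAPI, [40, 85, 70], 50); // best of three attempts is 85
```

Moodle's SCORM module can also be configured to keep the highest of several attempts on its side, which complements this per-attempt reporting.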

Each SCORM question object acts like a question bank and should contain a large number of questions. Students will not be able to access all of these questions; they are given a limited number of questions, selected at random, so the chances of students getting the same questions are limited. At the same time, these online assessments have to be completed within a particular time frame; once the time is over, the marks/scores obtained up to that point will be transferred to the Moodle LMS.
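The random selection described above amounts to drawing a fixed-size sample from the bank without replacement, so that no question repeats within one attempt. A minimal sketch (function and variable names are illustrative):

```javascript
// Draw `count` questions at random, without replacement, from the bank
// packaged inside a SCORM question object. Removing each drawn item from
// a working copy keeps every question equally likely and prevents repeats.
function drawQuestions(bank, count) {
  var pool = bank.slice(); // copy so the bank itself is untouched
  var drawn = [];
  for (var i = 0; i < count && pool.length > 0; i++) {
    var j = Math.floor(Math.random() * pool.length);
    drawn.push(pool.splice(j, 1)[0]);
  }
  return drawn;
}

// Example: a 20-question bank, 5 questions shown per attempt.
var bank = [];
for (var q = 1; q <= 20; q++) bank.push("Q" + q);
var paper = drawQuestions(bank, 5);
```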

Within the SCORM based question object, students are given the facility to skip a particular question and revisit it later; the number of questions attempted and the number marked for revisiting will be displayed on the screen. After completing a particular SCORM based question object, the marks/scores obtained will be displayed on screen before the students exit that assessment.
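The attempted/revisit counters shown on screen only need a small per-question status record on the client side. A hedged sketch of such bookkeeping, with all names being assumptions rather than the project's actual code:

```javascript
// Illustrative tracker for the "attempted" and "marked for revisit"
// counters displayed within a SCORM question object.
function createTracker(questionCount) {
  var status = [];
  for (var i = 0; i < questionCount; i++) status.push("unseen");
  return {
    answer:      function (i) { status[i] = "answered"; },
    markRevisit: function (i) { status[i] = "revisit"; },
    attempted: function () {
      return status.filter(function (s) { return s === "answered"; }).length;
    },
    toRevisit: function () {
      return status.filter(function (s) { return s === "revisit"; }).length;
    }
  };
}

// Example: a five-question paper where one question is skipped for later.
var tracker = createTracker(5);
tracker.answer(0);
tracker.answer(1);
tracker.markRevisit(2);
```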

In addition, a short Camtasia recording will be given to the students explaining how these online assessments are carried out and the steps they have to follow.

Since our online LMS at the e-learning centre does not currently support this kind of practical online assessment, and since the students are used to computer based assessments, this will be a paradigm shift for the students.

Therefore a pre-test and a questionnaire should be given to the students before they take this kind of online e-test. After the students have been exposed to the new system, a post-test and a questionnaire should be given to obtain their feedback. Student progress can also be reviewed with the data obtained through the Moodle system.

A questionnaire should also be given to the head and the staff of the e-learning centre to obtain their feedback on the new system.

After all of the above is done, the actual results and findings of the research can be obtained.

After development, I will select a group of students who are currently following this course, and I will also use my colleagues as a separate group to test the system from the UI aspect. I will gather data from interviews (which will be recorded with the permission of the interviewees) and questionnaires. During the testing stage I will also use wikis and forums to get feedback from my colleagues.

Research ethics: (Include a short statement outlining potential ethical issues that you may face and in particular how you will ensure informed consent, anonymity, confidentiality, beneficence i.e. positive good and non-malfeasance i.e. no harm).

A SCORM based Question Object consists of several interactive activities that are made available to students on a random basis. There should be quite a number of activities; otherwise the same questions will be repeated, allowing students to memorise the questions. This might allow students to cheat.

Under no circumstances should the developers and Instructional Designers disclose the questions to students. Questions should be kept in a very secure manner.

All students should be treated equally. For example, if students are given only a limited number of attempts, this should not vary between different students.

Teachers or Subject Matter Experts should check whether the questions are of good quality and standard and whether they cover the relevant objectives.

Student marks and information should not be used inappropriately.

Transcripts of courses gathered automatically in machine-readable format should not be used for research without the students' permission.

Privacy, confidentiality and anonymity also need to be maintained.

The data collected for this study from questionnaires and Moodle will not be used for any other purpose.

Review of relevant literature (~1500 words plus list of references)

Research carried out by Australian universities on why online assessments were considered has expressed the following ideas.

[WWW1] “Many academics are seeking to diversify assessment tasks, broaden the range of skills assessed and provide students with more timely and informative feedback on their progress. Others are wishing to meet student expectations for more flexible delivery and to generate efficiencies in assessment that can ease academic staff workloads.

At the same time, in a climate of increasing academic workloads, the adoption of on-line assessment may help to manage large volumes of marking and assessment-related administration efficiently. The automation of routine on-line tasks, in particular, may have the potential in the long-term to provide time/cost-efficient student assessment, though the present evidence suggests that some on-line assessment, at least in the early stages, can add significantly both to staff workload and to overall expenses.” The answer received to the question “Is on-line assessment improving the assessment of student learning?” is as follows:

[WWW1] “There is some evidence that on-line assessment, unless carefully planned, can encourage students to focus on lower level cognitive skills. The educational effectiveness of on-line assessment that concentrates primarily or exclusively on true/false or multiple choice responses, for example, is highly questionable in a higher education environment. As is widely known, such approaches to assessment can have direct negative effects on student approaches to learning by encouraging narrow reproduction rather than the development of higher order cognition that involves, for example, critical evaluation.”

The above findings emphasise that true/false or multiple choice responses are not encouraged in higher educational environments. Yet that was the approach followed up to now, even for degree-level courses, because of the lack of practical assessments and the fact that the interactive activities developed did not incorporate a facility to transfer marks/scores to the Gradebook of the Moodle LMS. SCORM based question objects are the ideal solution to overcome these pitfalls.

[WWW1] Assessment plays a central role in student learning whatever the mode. On-line learning can challenge students to learn new skills and ways of studying and learning:

“[On-line assessment]…definitely teaches skills not possible to learn from doing a normal essay”

“It assesses personal initiative well. You have to be able to work through things by yourself”

On-line assessment is but one of many modes that may be used to indicate to students which aspects of their learning are valued and will be rewarded. The use of on-line tools to assess learner progress toward subject objectives can take many forms including:

• Electronic submission of written assignments

• Parallel print and on-line assessment options where students are given the choice of whether and how they use on-line tools in assessment tasks

• Publication of documents on the web

• Labelling of on-line diagrams

• Manipulation of on-line graphs

• Completion of on-line quizzes

• Completion of short-answer and multiple choice questions

• On-line exams with monitored and controlled start and stop times

• Any formative or summative task carried out in a web-based environment.

[WWW2] “Sharable Content Object Reference Model (SCORM) is a collection of standards and specifications for web-based e-learning. It defines communications between client side content and a host system called the run-time environment (commonly a function of a learning management system). SCORM also defines how content may be packaged into a transferable ZIP file.

SCORM is a specification of the Advanced Distributed Learning (ADL) Initiative, which comes out of the Office of the United States Secretary of Defense.

SCORM 2004 introduces a complex idea called sequencing, which is a set of rules that specify the order in which a learner may experience content objects. The standard uses XML, and it is based on the results of work done by AICC, IMS Global, IEEE, and Ariadne.”

At the moment SCORM 1.2 is used as the standard at the e-learning centre. [WWW4] “There are several tools for creating SCORM content, such as Articulate, Camtasia, Captivate, ContentGenerator.net SCORM functionality, Courselab, eXe OPEN SOURCE SCORM Development Package, Learning Essentials 2.0 for Microsoft Office, Microsoft Learning Development System, MyUdutu, Reload, MOS Solo, ToolBook, Wimba Create (formerly Course Genie), Wondershare PPT2Flash Professional, QuizForce, WordForce and PowerPointForce.”

Reload is used as the tool for creating SCORM content at the e-learning centre; therefore the same tool is used when creating the SCORM objects for this project.

[WWW3] “RELOAD (Reusable eLearning Object Authoring and Delivery) is a project funded under the JISC Exchange for Learning Programme (X4L). The project focuses on the development of tools that are based on emerging learning technology interoperability specifications. It is managed by the University of Bolton with staff located at the University of Bolton and the University of Strathclyde.

The primary aims of this project are to:

• facilitate the creation, sharing and reuse of learning objects and services

• enhance the range of pedagogical approaches realisable through the use of lesson plans

These aims will be achieved through the production of a suite of software tools for authoring and delivery of standard-compliant learning objects incorporating comprehensive user guides and exemplar resources. The tools will be of significant value to the JISC and wider community, since they provide the crucial "missing link" which allows users to author and transfer learning objects, in specification compliant format, between authoring and design tools, local and distributed digital repositories, and VLEs.”

[WWW4] A tool called the Test Suite was identified to perform self-testing on Learning Management Systems, Sharable Content Objects, Meta-data documents, and Content Packages. It is a product of ADL for SCORM® Version 1.2.

The three aspects specifically addressed by the SCORM are:

1. A common Content Aggregation Model which includes Content Packaging.

2. A common specification for a Run-Time Environment (RTE) that facilitates interoperable communication between Learning Management Systems (LMS) and course content components.

3. A common representation of content aggregation, sharable content object (SCO), asset, and package meta-data that can be utilized to promote discoverability and reuse.

The Test Suite can be used to assess the conformance to these three main areas of the SCORM by providing four testing components as follows:

1. LMS Run-Time Environment Conformance Test.

2. Sharable Content Object (SCO) Run-Time Environment Conformance Test.

3. Meta-Data Conformance Test

4. Content Package Conformance Test.

The Test Suite provides a common framework for all of the conformance tests. Each test includes a step-by-step set of detailed instructions as well as a log of each test action and the outcome.

The Test Suite runs entirely within a Web browser on one local computer. All of the software is loaded on the local hard-drive and requires no HTTP Web server to run.

A brief introduction to each of the testing components follows:

o LMS Run-Time Environment Conformance Test

The LMS RTE Conformance Test verifies that a test subject LMS is conformant with The Run-Time Environment section of the SCORM.

This test will verify that the test subject is able to:

o Launch known conformant Sharable Content Objects (SCOs)

o Support the Application Program Interface (API) functions defined in the SCORM

o LMSInitialize("")

o LMSFinish("")

o LMSCommit("")

o LMSGetValue("datamodel.group.element")

o LMSSetValue("datamodel.group.element","value")

o LMSGetLastError()

o LMSGetErrorString()

o LMSGetDiagnostic()

o Determine if all mandatory data model elements are implemented correctly by the LMS

o Determine which optional data model elements are supported by the LMS and implemented correctly.

o Sharable Content Object Run-Time Environment Conformance Test

The SCO RTE Conformance Test verifies that the test subject SCO is conformant with the Run-Time Environment section of the SCORM. This test will verify that the test subject:

o Can be launched by a known conformant LMS

o Supports the Application Program Interface (API) functions defined in the SCORM

Required

o LMSInitialize("")

o LMSFinish("")

Optional

o LMSCommit("")

o LMSGetValue("datamodel.group.element")

o LMSSetValue("datamodel.group.element","value")

o LMSGetLastError()

o LMSGetErrorString()

o LMSGetDiagnostic()

In addition to verification of the above requirements, this test will also audit which CMI data model elements the SCO(s) made use of in the content implementations, and verify that the data elements are used in the correct manner with regard to the data type rules associated with the data model elements.

o Meta-Data Conformance Test

The Meta-data Conformance Test verifies that the tested Asset, SCO, Content Aggregation, or Package Meta-data XML document is conformant with the Content Aggregation Model section of the SCORM. This test will verify that the test subject Meta-data:

• Required

o Is a well-formed XML document;

o Validates against the IMS Meta-data Version 1.2.1 Schema;

o Implements all mandatory elements correctly;

• Optional

o Uses optional elements correctly.

o Uses extensions.

The following will be tested for each use of a meta-data element, where applicable:

• Correct use of repeating elements (i.e. multiplicity).

• Correct use of "langstring" and "datetime" elements

• Adherence to "restricted vocabularies"

• Adherence to "best practice vocabularies"

• Adherence to the smallest permitted maximum number of characters

• Element 4.2 "size" must be given in a numeric format (i.e. bytes)

o Content Package Conformance Test

The Content Package Conformance Test verifies that the test subject content package is conformant with the Content Aggregation Model section of the SCORM.

The purpose of this test is to verify that the test subject package:

• Contains an imsmanifest XML document at the root of the content package that:

o Is well-formed;

o Validates against the ADL Content Packaging Version 1.2 Schema;

o Implements all mandatory elements correctly;

o Uses optional elements correctly;

o Optionally uses extensions.

• Contains all supporting schema definition files at the root of the package.

• Contains conformant Asset, SCO, Content Aggregation, and Package Meta-data as determined by the Meta-data Conformance Test.

• Contains conformant SCOs as determined by the SCO RTE Test.

The following will be tested for each element, where applicable:

• Correct use of repeating elements (i.e. multiplicity).

• Adherence to "restricted vocabularies"

• Adherence to the smallest permitted maximum number of characters

• Correct use of Timespan data types

• Correct use of ID data types

• Correct use of IDRef data types

• Correct use of location/href
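To make the packaging requirements above concrete, a minimal SCORM 1.2 imsmanifest.xml for a single-SCO question object might look as follows. The identifiers, titles and the question.html file name are placeholders, not taken from the project; the schema declarations and namespaces are the standard SCORM 1.2 / IMS Content Packaging ones.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative SCORM 1.2 manifest; placed at the root of the
     content package ZIP alongside question.html and the schema files. -->
<manifest identifier="question-object-1" version="1.2"
    xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
    xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations default="org1">
    <organization identifier="org1">
      <title>ICT Applications Practical Assessment</title>
      <item identifier="item1" identifierref="res1">
        <title>Word Processing Questions</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="res1" type="webcontent"
        adlcp:scormtype="sco" href="question.html">
      <file href="question.html"/>
    </resource>
  </resources>
</manifest>
```

The `adlcp:scormtype="sco"` attribute is what marks the resource as a communicating SCO rather than a plain Asset; the Content Package Conformance Test validates exactly this structure.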

References :

• [WWW1] http://www.cshe.unimelb.edu.au/assessinglearning/03/online.html [15/01/2009]
• [WWW2] http://en.wikipedia.org/wiki/SCORM [22/11/2008]
• [WWW3] http://www.reload.ac.uk/ [15/12/2008]
• [WWW4] http://www.adlnet.gov/downloads/DownloadPage.aspx?ID=215 [19/12/2008]

Activity plan: (a programme of work with particular tasks to be done)

   * February 1st – 15th – Explore the relevant literature on using SCORM and online e-assessment, blogs and forums; how to use blogs in an LMS to get feedback from students.
   * February 15th – 28th – Identify the ways and the tools that can be used to develop SCORM based Question Objects. Identify what type of questionnaires can be given to the students to obtain their feedback.
   * March 1st – March 6th – Prepare a short questionnaire for the students to test their understanding of SCORM based question objects for assessment, how reflecting on one’s own learning can help in the learning process, and their experiences of doing online assessments compared to offline computer based examinations.
   * March 12th – Start the implementation.
   * Week 1 of implementation: Develop simulations of the Microsoft Office package (limited to online Microsoft Word due to time constraints) using Flash. Convert them to SCORM based question objects (after doing the necessary implementation using JavaScript and PHP).
   * Week 2 of implementation: Select the volunteers to test the project, give them a pre-test and ask them to post their comments and feedback in their blogs and forums. Explain what a blog is and how it is created. Ask the students to fill in the questionnaire.
   * Week 3 of implementation: Ask the volunteer group to do the online assessment and post their comments to the forum again. Compare their pre-test results with the online assessment (post-test). Follow the progress of the volunteer group. Help the volunteer group in writing their blogs and encourage them to help each other.
   * Week 4 of implementation: Ask the volunteer group once again to fill in the same questionnaire. Set up a meeting with each volunteer who participated in the project and discuss his/her opinion on whether SCORM and online e-assessment improve learning and assessment compared to offline examinations, and whether it is a good tool for assessment.
   * April 6th – May 30th – Write the report about the project.