Feb 09, 2016
 

Mr Luke Attwood leads three modules (ranging from level 5 to level 7) in the School of Computer Science and Informatics at DMU. The modules are: Object Oriented Software Design & Development, Object Oriented Programming, and E-Commerce Software.

Each component requires that students submit their software solutions in source code form, and Luke uses the Assignment submission tool in DMU’s VLE (Blackboard) with an associated electronic rubric for marking and providing text-based feedback.

None of the components is usually marked anonymously, although it is feasible to do so when circumstances require it, by hiding student identification data.

The rubrics are consistent in that they all involve a set of criteria, each marked against a 5-tier scale that Luke has adopted to represent levels of achievement (0 – 100), with a percentage and a description attached to each tier; e.g. Significant room for improvement (40%).

Before rolling out the rubrics, Luke tested them by having colleagues second-mark to ensure the rubrics would produce consistent and accurate results. Luke continues to update and test the rubrics to further refine the wording of the criteria, but over the last two years minimal changes have been required and the rubrics are performing well in terms of consistency and the distribution of marks.

At first, colleagues were sceptical about whether the rubrics would provide enough variation in marks. Although each criterion has only a 5-tier scale (e.g. 0, 40, 60, 80, 100), because there are several different criteria Luke finds that the rubrics still return a granular spread of marks across the cohort. Luke did experiment with fewer criteria (and fewer levels of achievement), but this resulted in a bunching of marks, while too many criteria (or levels of achievement) became unmanageable. Luke would advise a minimum of 3 criteria, but 4 or more is better and suits the nature of the submissions on the modules he teaches. This, in combination with the 5-tier scale, has consistently worked well.
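The arithmetic behind this is easy to see in a short sketch. The criterion names, the equal weighting, and the tier values below are illustrative assumptions rather than details of Luke's actual rubrics, but they show why four criteria on a shared 5-tier scale still produce a granular spread of overall marks:

```python
# A minimal sketch: several criteria on a shared 5-tier scale can
# still yield many distinct overall marks across a cohort.
from itertools import product

TIERS = [0, 40, 60, 80, 100]  # the 5-tier scale (percentages)

def overall_mark(criterion_scores):
    """Average a list of tier scores into one overall percentage."""
    if any(score not in TIERS for score in criterion_scores):
        raise ValueError("every score must sit on the 5-tier scale")
    return sum(criterion_scores) / len(criterion_scores)

# Four criteria, each marked on the 5-tier scale
# (e.g. design, functionality, style, testing):
print(overall_mark([60, 80, 40, 80]))  # 65.0

# With 4 equally weighted criteria the rubric can land on 20 distinct
# overall marks, which is why the cohort spread stays granular:
possible = {overall_mark(combo) for combo in product(TIERS, repeat=4)}
print(len(possible))  # 20
```

With only two such criteria the number of reachable overall marks drops sharply, which matches the bunching Luke observed when he experimented with fewer criteria.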

When marking, Luke enables the ‘Show Feedback’ view in Blackboard so that personalised text feedback can be included alongside any pre-determined feedback prepared when the rubric was produced. Not all of Luke’s rubrics include pre-filled feedback, but if he finds that similar comments are being made frequently, Luke makes a note of these for pasting into the relevant feedback section later. Having the Show Feedback view enabled also allows Luke to override the overall grade if necessary, and he always provides a text-based comment briefly explaining any overridden grade.

Luke is confident that using rubrics to mark students’ work enables him to provide provisional grades and personalised text feedback in a timely manner. Whilst Luke’s students have yet to comment specifically on his use of rubrics, there has been no negative feedback from them.

Luke finds that there are multiple benefits to using rubrics in this way. Rubrics make it easy to mark and write comments in a single location that is then immediately available for students to view, rather than having to upload a separate mark sheet, which across a large cohort can in itself be time-consuming. The rubric grid also does all of the calculations automatically, so Luke does not need to worry about this or have the totals moderated. Furthermore, it provides an elegant way of placing comments directly next to where they apply.

Thank you to Luke Attwood for enabling this post to be created. If you wish to explore the use of rubrics in Blackboard further please contact your local ELT Project Officer.

Ian Pettit.

Aug 21, 2015
 

Dr Marie Bassford, Senior Lecturer, Faculty of Technology, started to teach a new first year Physics Fundamentals module in 2014/15. Having taught only second and third year students for a number of years, Marie saw the opportunity to develop a new first year module as a vehicle to re-engage with the use of phase tests for assessment.

In the past, Marie would have used a paper-based optical mark reading system as the platform for delivering phase tests, but with her learning technologist background, Marie sought a more automated approach that could be re-used year on year.

Marie delivers Physics Fundamentals with two colleagues and following an exploratory conversation with the ELT Project Officer, Marie and the team decided to move forward with the Blackboard Learn Test tool for administering phase tests via the Virtual Learning Environment.

Each member of the module team was tasked with producing questions for the phase test. These questions were linked to the learning outcomes and were all multiple-choice questions with one correct answer. The phase test itself was to be 25 questions in total, but with the input of three colleagues a bank of 58 questions was produced.

The ELT Project Officer helped Marie to develop her skills and understanding of the Test tool and Marie ensured that the questions were authored in Blackboard Learn.

This over-production of questions was purposeful: in conversation with the ELT Project Officer, Marie had decided to create a large pool of questions and have Blackboard Learn serve a random 25 questions to each student from the pool. There were a number of reasons for this approach:

  1. Having each student answer a randomly selected set of 25 questions helps to minimise copying in the test environment;
  2. Building a pool of questions enables Marie and the team to add to this pool each year; and
  3. The pool can be re-used year on year with minimal effort by including it in the annual Course Copy.
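The selection idea itself is simple sampling without replacement. Blackboard Learn does this internally via its question pool / random block feature; the plain-Python sketch below (with placeholder question text) is only an illustration of the mechanism, not of Blackboard's implementation:

```python
# A minimal sketch of serving each student a random 25 questions
# drawn, without replacement, from a pool of 58.
import random

pool = [f"Question {n}" for n in range(1, 59)]  # the 58-question pool

def build_test(pool, k=25, seed=None):
    """Draw k distinct questions for one student's sitting."""
    return random.Random(seed).sample(pool, k)

student_a = build_test(pool, seed=1)
student_b = build_test(pool, seed=2)

print(len(student_a))                        # 25
print(len(set(student_a) & set(student_b)))  # overlap varies by draw
```

Because each draw is without replacement, no student sees a duplicate question within their own test, while neighbouring students are very unlikely to be looking at the same question at the same time.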

There was an amount of effort required to create the initial pool of 58 questions, but the three colleagues teaching Physics Fundamentals spread this work across the module team and the pool will now roll over each year. Marie is confident that, although there will inevitably be updates and amendments to the question pool, in the long run the time that has been invested will be recouped.

Due to the multiple-choice nature of the questions, Blackboard Learn marks each test upon completion, and there are options for students to see their provisional grade instantly along with any generic/automated feedback. Compared with the work required to print and scan optically mark-read submissions, the Test tool minimises the effort needed to deliver the test and grades once the initial pool is created and the test is deployed on the module shell.

The students who engaged with the phase test in December 2014 were generally positive about the experience. One student did question the use of randomised question sets, but as the questions that each student received had all been carefully written to support one or more learning outcomes, this query was swiftly dealt with.

Conversely, one student actively told Marie that she thought it was “clever how the questions were randomised” and that she understood that it made it fairer to assess that way when taking the test together, side by side at PCs.

During the test there was one issue: a superscript character used in one of the questions did not display correctly. This was quickly dealt with on the day, and Marie has now re-formatted the question text to ensure that it displays fully to the students.

Marie’s next Physics Fundamentals phase test is in April 2015, when she will use this question pool with the addition of further questions for the second phase test. Marie will continue to use the Test tool this year and in years to come as part of the Physics Fundamentals assessment components.

Thank you to Dr Marie Bassford and the Physics Fundamentals module team for enabling the documentation of their use of online phase tests. If you wish to learn more about how to replicate this practice at DMU, please contact your local ELT Project Officer.

Ian Pettit and Marie Bassford.

Jul 02, 2014
 

Just a quick post about a neat trick I discovered today that could help improve access to files for students and staff.

A colleague was looking for a more intuitive way to point fellow staff members to a Blackboard course's file repository as the link (in Control Panel) is not always obvious.

Firstly we right-clicked the link in the browser and copied the link location to the clipboard.

We then created a new item in a content area, inserted a picture and submitted.

Once submitted, the item was edited; with the picture selected, the hyperlink button was clicked and the URL for the course file repository pasted into the link URL field.

The changes were submitted and we now have a big picture/button within a content area that takes staff and students (depending on permissions) to the course files or a specific directory. This is much more obvious for staff members who may be using a Blackboard course or Organisation for sharing files.

We also found that this approach can be used when creating an Announcement too. Using the divider on the Announcement page we can permanently stick a link to the student files to the top of the default course entry page.

Using this approach could help in a scenario where students need the contents of a directory: the instructor can save time by using this technique as an alternative to attaching individual files to an item or using the folder content type, which offers limited ability to wrap links within contextual and support information.

 

Ian.

Apr 03, 2014
 

It’s not as testing as it seems

It was anticipated that students studying their first year of the new Mathematics for Scientific Computing module would take four tests on paper, but newly appointed module leader, Dr Sarah Greenfield, had other ideas.

Sarah recognised that the nature of the tests (multiple choice with single correct answers) lends itself to an online format. This not only means that Sarah can have the tests marked by the system, but it also saves paper and makes the tests re-usable year on year with minor amendments.

One hurdle that had to be overcome was the use of pictures in one of the tests. The fourth phasetest is a thirteen-question test in which each question and each of its four options are presented in picture format.

Sarah talked to the relevant specialist, and it was decided that creating this test in the Blackboard Learn Virtual Learning Environment would be investigated, as it was anticipated that this platform would add the most value by way of time saving and re-using the tests.

With support from the ELT Project Officer, Sarah has been able to efficiently create a bank of questions for Phasetest 4, each comprising the question image and four answer images. It took a few hours to create all of the images so that they appear consistently, but this up-front investment is far outweighed by the time saved by having the tests marked by the VLE; on an ongoing basis, there is no longer a need to re-write and re-print paper tests every year.

Two issues were encountered:

Firstly, when adding pictures (.jpg format) to the test in the VLE, the default permissions do not include any access for students. This means that although the test appears to be fine for instructors, when students take the test for real, blank spaces appear where the pictures should be.

To overcome this, as the pictures were created they were saved to a single local directory with meaningful filenames, following an agreed structure/hierarchy. Once all of the pictures had been produced, the entire directory was uploaded to the VLE and the folder permissions were changed to give students read-only access. Before submitting, the ‘overwrite’ option was selected so that the new student permissions automatically percolated through to the sub-directories and files.

The second issue was how to create the thirteen questions in an efficient manner. Sarah could have created each question individually and browsed for each question image and its respective four option images as the questions were created, but it was estimated that this would represent the best part of a day’s work.

Instead, Sarah created the first question and, using the Edit Test screen, took twelve copies of it. Sarah then went into the editor for the second question and amended the question title, and so on. At this point, however, the images in the copied questions still showed the images for question one. To save time, instead of removing all of the images and browsing the course for the correct image, Sarah used the HTML mode in the editor to point the existing code at the correct image.

For example, this code points toward the third answer option for question one:

<div class="vtbegenerated"><img src="https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/Phasetest4files/Sarah%20G_Phasetest4images_March2014/Sarah%20G_Phasetest4images_March2014/Q1/Q1C.jpg" alt="" style="border: 0px solid rgb(0, 0, 0);" /></div>

The key part here is the “Q1/Q1C.jpg” section of the code. This represents the sub-directory and the filename of this particular image.

Using the HTML mode, Sarah was able to copy and paste this code into each of the questions and amend only the sub-directory and filename to show the correct picture for the correct question and its respective answer options.

For example, this code represents the first answer option in question eight:

<div class="vtbegenerated"><img src="https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/Phasetest4files/Sarah%20G_Phasetest4images_March2014/Sarah%20G_Phasetest4images_March2014/Q8/Q1A.jpg" alt="" style="border: 0px solid rgb(0, 0, 0);" /></div>

Note that the code is almost identical, with the exception of the sub-directory and filename, which is now “Q8/Q1A.jpg”.

Using this technique of copying and pasting with HTML mode on, Sarah was able to quickly re-point the links to the relevant picture files in around an hour. This use of HTML mode requires absolutely no knowledge of HTML tags or programming, just a well organised folder to begin with.
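The repetitive edit Sarah did by hand can also be sketched as a tiny template. The base URL and the `<div class="vtbegenerated">` wrapper are taken from the example code above; the helper function itself is purely hypothetical and is not something Sarah ran, but it shows how the sub-directory and filename are the only moving parts:

```python
# A hypothetical sketch of the repetitive <img> edit: only the
# "Qn/QnX.jpg" portion of the src changes between questions.
BASE = ("https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/"
        "Phasetest4files/Sarah%20G_Phasetest4images_March2014/"
        "Sarah%20G_Phasetest4images_March2014")

TEMPLATE = ('<div class="vtbegenerated">'
            '<img src="{base}/{q}/{img}" alt="" '
            'style="border: 0px solid rgb(0, 0, 0);" /></div>')

def img_tag(question, option_file):
    """Build the HTML for one answer-option image,
    e.g. img_tag("Q3", "Q3C.jpg")."""
    return TEMPLATE.format(base=BASE, q=question, img=option_file)

# Question 3, answer option C:
print(img_tag("Q3", "Q3C.jpg"))
```

This is exactly why the well organised folder mattered: with a predictable `Qn/filename.jpg` hierarchy, every question's code differs from the last only in that one short path segment.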

Sarah’s students will now sit this phasetest in a computer lab whilst logged into the VLE rather than on paper. This will save Sarah time as there is no longer a need to manually mark the tests and Sarah now has a bank of questions that can be easily added to or amended over the next few years.

 

Ian Pettit