Mar 15, 2017
 

Dr Indrani Lahiri teaches a second year undergraduate Public Relations module and two third year modules, International Public Relations and Global Advertising Practices, within the Leicester Media School at DMU.

All three modules are assessed using multiple methods, one of which is a presentation that the students are required to deliver in small groups. Due to the nature of the modules, and from experience of teaching similar modules at a different university, in 2015/16 Indrani sought a way to easily record the student presentations on the second year Public Relations module. Careers in public relations will involve graduates presenting in front of a panel or larger audience in industry, and recording the presentations provides a resource for students to use for reflection and development. The recordings can also be downloaded so that, with the appropriate consent from peers, students can use their recording as evidence of successful group work after graduation.

Following a conversation with the ELT Project Officer, Indrani engaged with the DMU Replay system, as it was important to record a traditional video feed alongside screen content and audio.

In the lab space available, a high definition webcam with microphone was used to record the students' audio and video alongside the screen of the PC used for presenting.

Building this recording element into the 2015/16 presentation assessment component helped the students to practise their presentation skills in this scenario. It also provided a resource to aid Indrani, the moderator and the External Examiner in assessing the students, as well as an artefact that the students can keep after graduation.

As the presentations are delivered in groups, conversations were held around Intellectual Property Rights and individual consent to use the videos for reflective purposes and/or as part of an online CV. The JISC model consent form was adapted and students were asked to sign a copy of the form to give permission for their colleagues to take and use a copy of their respective video.

Only one student took a copy of their group's recording in 2015/16, and this was found to be because the adapted consent form posed a barrier. Students wanted a copy of their recording, but it is suggested that signing the form made this feel 'official' and put them off engaging with this aspect of the initiative.

Indrani and the External Examiner have found having the presentation recordings available to be very valuable, and Indrani has received positive feedback from the External Examiner on this use of technology in the assessment activities.

Feedback from the students suggests that although the introduction of the recording technology was positively received, they would have appreciated it earlier in the module. Indrani has recognised the need to engage with DMU Replay earlier on, and in 2016/17 rolled out the use of DMU Replay on the two third year International Public Relations and Global Advertising Practices modules.

Since students found this approach valuable, in 2016/17 they recorded their individual presentations from their own space for International Public Relations as evidence of practical work to showcase their competency as video bloggers. Again this became part of a portfolio submission that they can take with them to job interviews as evidence of practical work.

Indrani also recorded the assessed presentations for Global Advertising Practices in 2016/17.

The plan for 2017/18 is to further expand the use of DMU Replay and make this available for the Global Advertising Practices module. Recording practice presentation runs, blog posts and the assessed presentation will add value for these students, who will be briefed to produce an advertising campaign focused on artificial intelligence and robotics based on a real brief from industry. Again, having their recording available after graduation will be valuable when seeking employment.

Alongside using DMU Replay to enable students to reflect and produce video artefacts that support their employability, Indrani is also using DMU Replay to provide video feedback on other written assessment components; the External Examiner is supportive of this approach.
Assessing students using different modes, and providing opportunities for students to reflect on their performance through feedback delivered with different tools, enables a Universal Design for Learning approach. Students are demonstrating their knowledge in different and more creative ways that align to the learning outcomes, and they can also watch their presentation and video blog posts back as part of their self-directed reflective activities.

In 2017/18 Indrani is considering opening the student presentations to a wider audience as part of a Public Relations event. This would provide further opportunities for Indrani to provide feedback, or even to encourage the student cohort to give feedback to each other as their presentations build toward the event.

Thank you to Dr Indrani Lahiri for enabling this blog post to be created.

Ian Pettit.
ELT Project Officer.

Aug 08, 2016
 

Since 2007, Dr Sophy Smith (DMU Teacher Fellow, 2015 and HEA Senior Teacher Fellow, 2015) has been taking a fresh approach to assessing students on the MA/MSc Creative Technologies course at DMU.

In essence, Sophy and the team provide an open choice to the students with regard to what they study, how they are assessed and the format that they express themselves in.

At the beginning of the Major Project module, the students will have a conversation with Sophy and will firstly decide whether they will go for an MA or an MSc in Creative Technologies. Then, based on the student's aspirations regarding employability, a set of learning objectives and the assessment format will be agreed.

The module runs for 15 weeks in total, with 2 hours per week delivered by Sophy and the other members of the teaching team. There are also workshops and seminars, and a project is started around halfway through.

The only rigid assessment component is that the students are required to provide a critical commentary regarding their project; however, the format of this component is also negotiable. Students often opt to provide a written critical commentary, but some have opted to provide a collection of blog posts, a film or other media.

Sophy began taking this approach on two modules in 2007 but this negotiated approach to assessment is now the norm on all of the course specific modules that make up the MA/MSc Creative Technologies.

As part of the course there are shared modules too, and students do not have this choice when taking them; only the MA/MSc specific modules provide the opportunity to follow a tailored path for assessment.

One example of this flexible approach to assessment involved a student who knew that he wanted to work in the games industry producing Machinima-style movies; the module was tailored toward this goal, and his project title and assessment mechanism were also focused on it.

Linking the assessment to the student's employability aspirations in this way ensures that students build a body of work throughout the life of the course to show prospective employers, and this triangulated approach (linking learning objectives to employability goals and enabling a preferred expression format) is believed to be linked to the high employment rates that graduates of this course demonstrate.

This tailored model also supports the principles of Universal Design for Learning (UDL) in that from the beginning the students are engaging in ways that suit them as individuals and they will be assessed in a manner that plays to their strengths.

Students also feel as though they own their learning experience on this course, which helps them to feel motivated and to achieve.

When marking, the learning objectives that were agreed at the beginning of the course are made known to the marker and second marker, and the student work is marked against these alongside general marking criteria in line with PG regulations.

The negotiated learning objectives align with the learning outcomes on each module to ensure parity and quality standards are upheld, and Sophy believes that the success of this approach is rooted in the clarity of the learning objectives.

Typically, the MA/MSc Creative Technologies will attract around 15 students per year and the flexible approach to assessment is now standard across all course specific modules on this course.

A new MA Digital Arts is coming on stream in 2016/17, and this good practice has been carried over to the new course, where students will again be able to negotiate their learning objectives and assessment style from the beginning based on their employment related goals.

Thank you to Dr Sophy Smith, Reader in Creative Technologies at the Institute of Creative Technologies, DMU for enabling this blog post.

 

Ian Pettit.

Feb 09, 2016
 

Mr Luke Attwood leads three modules (ranging from levels 5 to 7) in the School of Computer Science and Informatics at DMU. The modules are: Object Oriented Software Design & Development, Object Oriented Programming, and E-Commerce Software.

Each component requires that students submit their software solutions in source code form, and Luke uses the Assignment submission tool in DMU's VLE (Blackboard) with an associated electronic rubric for marking and providing text-based feedback.

None of the components are usually marked anonymously, although it is feasible to do this when circumstances require by hiding student identification data.

The rubrics are consistent in that they all involve a set of criteria, each marked against a 5 tier scale that Luke has adopted to represent the levels of achievement (0 – 100), with a percentage and description attached to each tier; e.g. 'Significant room for improvement' (40%).

Before rolling out the rubrics, Luke tested them with colleagues second marking to ensure the rubrics would produce a consistent and accurate result. Luke constantly updates and tests the rubrics to further refine the wording of the criteria, but over the last two years minimal changes have been required and the rubrics are performing well in terms of consistency and the distribution of marks.

At first, colleagues were sceptical as to whether the rubrics would provide enough variation of marks. Although each criterion only has a 5 tier scale (e.g. 0, 40, 60, 80, 100), because there are several different criteria Luke finds that the rubrics still return a granular spread of marks across the cohort. Luke did experiment with fewer criteria (and fewer levels of achievement), but this resulted in bunching of marks, while too many criteria (or levels of achievement) became unmanageable. Luke would advise a minimum of 3 criteria, but 4 or more is better and is appropriate to the nature of the submissions on the modules he teaches. This, in combination with the 5 tier scale, has consistently worked well.
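
As a rough illustration of why a handful of 5 tier criteria still gives a granular spread, the short Python sketch below enumerates every possible combination of tier awards. The four equally weighted criteria are an assumption for illustration, not Luke's actual rubric settings:

    from itertools import product

    # Five-tier scale from the post; four equally weighted criteria are an
    # illustrative assumption, not Luke's actual rubric settings.
    TIERS = [0, 40, 60, 80, 100]
    WEIGHTS = [0.25, 0.25, 0.25, 0.25]

    def overall_grade(tier_choices, weights):
        """Weighted sum of the tier awarded for each criterion."""
        return sum(t * w for t, w in zip(tier_choices, weights))

    # Enumerate every possible combination of tier awards.
    grades = {overall_grade(c, WEIGHTS) for c in product(TIERS, repeat=len(WEIGHTS))}
    print(len(grades), "distinct overall grades")  # 20 here, versus only 5 for one criterion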

When marking, Luke enables the 'Show Feedback' view in Blackboard so that personalised text feedback can be included alongside any pre-determined feedback he prepared as the rubric was produced. Not all of Luke's rubrics include pre-filled feedback, but if he finds that similar comments are being made frequently he will make a note of these for pasting into the relevant feedback section later. Having the Show Feedback view enabled also allows Luke to override the overall grade if necessary, and he will always provide a text based comment with a brief explanation regarding any overridden grades.

Luke is confident that using rubrics to mark students' work enables him to provide provisional grades and personalised text feedback in a timely manner. Whilst Luke's students are yet to specifically provide feedback on his use of rubrics, there has been no negative feedback from the students.

Luke finds that there are multiple benefits to using rubrics in this way. Rubrics provide the ability to easily mark and write comments in a single location where they are then immediately available for students to view, rather than having to upload a separate mark sheet, which for a large cohort can itself be time consuming. The rubric grid also does all of the calculations automatically, so Luke does not need to worry about this or even have the totals moderated. Furthermore, it provides an elegant way of placing comments directly next to where they are applicable.

Thank you to Luke Attwood for enabling this post to be created. If you wish to explore the use of rubrics in Blackboard further please contact your local ELT Project Officer.

Ian Pettit.

Jan 19, 2016
 

Dr Simon Coupland, School of Computer Science and Informatics, Faculty of Technology, started teaching a first year undergraduate C++ programming module in 2014/15. Another first for the module in 2014/15 was the enrolment of Mathematics students – some of whom may never have engaged with coding before.

The C++ module is an introduction to programming and underpins the complex threshold concepts that students are required to understand as they progress into areas such as Games Programming. It is therefore critical that students on this module are offered various ways to construct knowledge in line with their preferred learning style(s), as it forms part of the foundation of multiple undergraduate programmes.

Historically, students on the C++ module were tasked with producing a piece of code each week as part of their lab sessions and the previous member of teaching staff would mark each piece of code individually between sessions.

However, knowing that the cohort could be quite varied this year, Simon sought to provide alternative methods of assessing the lab work that would involve the students collaborating and learning from each other.

Following conversations with the ELT Project Officer, Simon identified that a peer marking model for the students' lab work would help the students to learn collaboratively, foster relationships in their first year and alleviate Simon's marking workload, as the module attracts a large (100+) student cohort.

Following a session that focused on the use of TurnItIn’s PeerMark solution, Simon set up the following scenario:

  • Each week the students will create and submit a piece of C++ code using a TurnItIn PeerMark link during the lab;
  • The following week the students will create a second piece of code and peer mark a colleague’s code; and
  • In the third week, further code is submitted, the previous week's piece is peer marked, and the peer marked submissions from week one are released so that students can see each other's comments on their work.

This is then a rolling plan with a new piece of code being generated each week for review and release over the three week cycle.
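
To make the rolling cycle concrete, here is a minimal sketch of which activities fall in which week. The schedule is the one described above; everything else (function name, wording) is invented for illustration, since PeerMark itself is configured through the TurnItIn interface:

    # Minimal sketch of the rolling three-week PeerMark cycle: each week a new
    # piece is submitted, the previous week's piece is peer marked, and the
    # comments on the piece from two weeks ago are released.
    def activities(week: int) -> list[str]:
        acts = [f"submit code piece {week}"]
        if week >= 2:
            acts.append(f"peer mark a colleague's piece {week - 1}")
        if week >= 3:
            acts.append(f"receive peer comments on piece {week - 2}")
        return acts

    for week in range(1, 6):
        print(f"Week {week}: " + "; ".join(activities(week)))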

Following the first week's lab session, all students created their C++ code and submitted it via the TurnItIn link in Blackboard. The following week, the majority of students seemed to engage with the peer marking aspect and offered feedback to a colleague via the TurnItIn PeerMark function.

However, after this initial flurry of interest, although all students continued to submit their weekly piece, very few re-engaged with peer marking. This trend continued for approximately six weeks until Simon removed the peer marking element and reverted to the traditional model, as despite encouragement the students were no longer peer marking each other's work.

As this is the first year that Simon has taught the C++ first year module, there is no benchmark for Simon regarding the demographic of the 2014/15 student cohort, but Simon feels that the following factors may have played a role in the students not continuing with the peer marking aspect of the module:

  1. Confidence – this is a first year undergraduate module and Simon believes that although the initial novelty of marking each other's work was appealing, the ongoing peer marking set up may have been daunting for students, whether confident with the subject or not.
    Simon did, in a later lab session, encourage the students to buddy up and talk to each other during informal peer to peer sessions away from the lab, but students who tried to engage in this activity found that their contemporaries would let them down by not honouring appointments, and this physical buddying/mentoring approach has also now ceased.
  2. The nature of the cohort – as noted, this is the first year that Simon has taught the year one C++ module, and Simon feels that the 2014/15 cohort may not have transitioned into HE well enough to understand and handle the value and responsibility of a peer marking approach to lab work. However, having not engaged with previous years' cohorts in this way, Simon has no benchmark to help identify whether this cohort is typical or atypical of a first year C++ cohort.
    Also, 2014/15 saw the first Mathematics student enrolments on the first year C++ module. Traditionally, only students who had engaged with coding prior to coming to university, and who were heading toward a career path involving coding, would be enrolled on this module; this year, students studying Mathematics were enrolled who may not have had any experience of writing code prior to week one. This widens the gap between those in the cohort who are already confident with coding and those who are not, and it is surmised that with a less spiky profile of coding skills across the cohort, the assumed confidence issues driving an unwillingness to engage in peer marking would be reduced.
  3. The technology – whilst the technology supported Simon's approach on the whole, Simon would have liked an automated way of identifying who had engaged with the peer marking activity each week. With this extra functionality, Simon would have introduced a scenario whereby students who failed to peer mark in any given week would not be eligible to receive colleagues' feedback the following week, or until they re-engaged with peer marking. This may have motivated more students to peer mark if their engagement in the previous week could be linked to their own work being peer marked the following week, but this is an assumption and there is no functionality in TurnItIn to support this scenario (a hypothetical sketch follows this list).
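
To be clear, TurnItIn offers no such automation; the gating rule Simon had in mind might look like the following, with invented names and data used purely for illustration:

    # Hypothetical gating rule: a student only receives peer comments this
    # week if they completed their own peer review the previous week.
    completed_reviews = {
        # student -> weeks in which they completed a peer review (invented data)
        "alice": {2, 3},
        "bob": {2},
    }

    def eligible_for_feedback(student: str, week: int) -> bool:
        return (week - 1) in completed_reviews.get(student, set())

    print(eligible_for_feedback("alice", 4))  # True: reviewed in week 3
    print(eligible_for_feedback("bob", 4))    # False: skipped week 3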

The experiences documented here are valuable for anyone considering a peer marking approach – the key lessons learned can be summarised as:

  • Ensure that students are fully bought into and understand the value of the peer marking approach ahead of embarking on this journey;
  • Get to know the cohort of students and critically evaluate whether they are at a stage where they will not let each other down (virtually or physically);
  • Look for ways to motivate students to peer mark. For example, should the students' marking be linked to their assessment, this would motivate them to continue to peer mark their colleagues' work; and
  • Do not assume that all students will always want to engage in this type of peer assessment and collaboration.

Other colleagues in the Faculty of Technology are also trialling alternative peer marking models in 2014/15, and it will be interesting to see how successful or otherwise they have been in engaging students in this type of activity, and whether Simon's experience is typical or not.

Click here to read about Dr Catherine Flick’s experiences of introducing a peer marking model to a second year Introduction to Research and Ethics module.

 

Ian Pettit

ELT Project Officer

Nov 23, 2015
 

Dr James Russell, Principal Lecturer in Film Studies, currently teaches two modules at first year and third year undergraduate level.

James has approximately eighty students enrolled across the two modules and in the 2014/15 academic year James looked to innovate his assessment technique by engaging with one of the electronic assessment tools that DMU subscribes to.

Students studying in both years are required to submit a final essay of around 1500 words, and traditionally James would print these and mark by hand. However, James felt that he had perfected his technique to the point where he could not mark any faster, and he was also finding that students were not always forthcoming in picking up their feedback in hard copy.

Therefore James sought to identify a different approach to marking that might be more efficient and also make feedback more readily available to the students.

Given that the students submit their essay via the TurnItIn system, James concluded this would be a good place to start and explored the use of GradeMark for marking electronically whilst online.

James quickly identified that he would be able to create a subset of QuickMarks relevant to the subject, and he marked the latest cohort's submissions using a combination of QuickMarks and the free form text feedback function that is available in GradeMark.

During this initial year, James also insisted that his students hand in hard copies to provide a contingency position, and conversations were had with the internal second marker and the external moderator, who in turn have found the use of GradeMark to be quick and easy.

In conversation with James, it is clear that the trial use of GradeMark in 2014/15 has been a success. James is also the Subject Group Leader for Media, Film and Journalism and at a recent Programme Management Board meeting James was almost evangelistic in front of colleagues about electronic online marking – hence this blog post.

The benefits of marking online include students being able to pick up their feedback immediately once James releases it, and James also feels that marking online is faster and more efficient than marking in a traditional paper based manner. GradeMark also works well for the second marker, who is able to see James' comments on screen, and the external moderator has been positive about the format of the downloaded submissions that are sent for moderation.

James will be continuing to mark in this way and next year he is planning to rely solely on the electronic approach. He is also encouraging colleagues to engage, where appropriate, with this scalable electronic marking technique.

Thank you to Dr James Russell for agreeing to have this practice documented and disseminated.

Ian Pettit
ELT Project Officer

Aug 21, 2015
 

Dr Marie Bassford, Senior Lecturer, Faculty of Technology, started to teach a new first year Physics Fundamentals module in 2014/15. Having taught only second and third year students for a number of years, Marie saw the opportunity to develop a new first year module as a vehicle to re-engage with the use of phase tests for assessment.

In the past, Marie would have made use of a paper based optical mark reading system as the platform for delivering phase tests but with her learning technologist background, Marie sought to identify a more automated approach that could be re-used year on year.

Marie delivers Physics Fundamentals with two colleagues and following an exploratory conversation with the ELT Project Officer, Marie and the team decided to move forward with the Blackboard Learn Test tool for administering phase tests via the Virtual Learning Environment.

Each of the module team members was tasked with producing questions for the phase test. These questions were linked to the learning outcomes and were all multiple choice questions with one correct answer. The phase test itself was to be 25 questions in total, but with the input of three colleagues a bank of 58 questions was produced.

The ELT Project Officer helped Marie to develop her skills and understanding of the Test tool and Marie ensured that the questions were authored in Blackboard Learn.

The over production of questions was purposeful: in conversation with the ELT Project Officer, Marie had decided to create a large pool of questions and have Blackboard Learn serve a random 25 questions to each student from the pool (a quick simulation of the effect follows the list below). There were a number of reasons for this approach:

  1. Having each student answer a randomly selected set of 25 questions helps to minimise copying in the test environment;
  2. Building a pool of questions enables Marie and the team to add to this pool each year; and
  3. The pool can be re-used year on year with minimal effort by including it in the annual Course Copy.
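
As a back-of-the-envelope check on the first point, the short simulation below estimates how many questions two students sitting side by side would share. It is illustrative only; random selection from a pool is a built-in Blackboard Test feature, not something staff need to code:

    import random

    # Two students are each served a random 25 of the 58 pooled questions;
    # how many questions do they have in common on average?
    POOL, SERVED, TRIALS = 58, 25, 10_000

    total_overlap = 0
    for _ in range(TRIALS):
        a = set(random.sample(range(POOL), SERVED))
        b = set(random.sample(range(POOL), SERVED))
        total_overlap += len(a & b)

    print(total_overlap / TRIALS)  # ~10.8 shared questions (25 * 25/58)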

A certain amount of effort was required to create the initial pool of 58 questions, but the three colleagues teaching Physics Fundamentals spread this work across the module team and the pool will now roll over each year. Marie is confident that although there will inevitably be updates and amendments to the question pool, in the long run the time that has been invested will be recouped.

Due to the multiple choice nature of the questions, Blackboard Learn marks each test upon completion, and there are options for students to see their provisional grade instantly on completing the test, along with any generic/automated feedback. Compared to the work required to print and scan optically mark read submissions, the Test tool minimises the effort required to deliver the test and grades once the initial pool is created and the test is deployed on the module shell.

The students who engaged with the phase test in December 2014 were generally positive about the experience. One student did question the use of randomised question sets, but as the questions that each student received had all been carefully written to support one or more learning outcomes, this query was swiftly dealt with.

Conversely, one student actively told Marie that she thought it was “clever how the questions were randomised” and that she understood that it made it fairer to assess that way when taking the test together, side by side at PCs.

During the test there was one issue: a superscript character that had been used in one of the questions did not display correctly. This was quickly dealt with on the day, and Marie has now re-formatted the question text to ensure that it displays fully to the students.

Marie's next Physics Fundamentals phase test is in April 2015, and she will be using this question pool, with the addition of further questions, for the second phase test. Marie will continue to use the Test tool this year and in years to come as part of the Physics Fundamentals assessment components.

Thank you to Dr Marie Bassford and the Physics Fundamentals module team for enabling the documentation of their use of online phase tests. If you wish to learn more about how to replicate this practice at DMU, please contact your local ELT Project Officer.

Ian Pettit and Marie Bassford.

Jun 05, 2015
 

Dr Neil Brown mainly teaches Energy Analysis Techniques, Energy Efficiency, and Mechanical and Electronic Engineering Labs in the School of Engineering and Sustainable Development at DMU.
 
Traditionally, all feedback in the Energy and Sustainable Development (ESD) Subject Group has been text based due to the use of a specific database for communicating feedback to students. The database was partly developed for the benefit of the Distance Learners in ESD who make up the majority of the cohort.

Neil's biggest single marking load is Energy Analysis Techniques, a core module on three MSc courses whose assessment comprises two written components. To provide as much meaningful feedback to students as possible, and to be able to mark efficiently away from the university whilst offline, he has identified an innovative and efficient way to provide feedback that his students have also embraced.

The approach adopted bypasses the computer keyboard by using speech to text software to simply dictate to the computer. Using this approach it is possible to generate feedback much more quickly and with less fatigue, while concentrating on the subject in hand. He also uses this technique to generate course notes for Distance Learners, and has found that dictation can be around 5-6x faster than typing.

For marking, the overall process is not sped up massively, but the extra detail possible in the feedback means that there are almost zero queries on marks from students, which in itself offers a massive time saving. One student recently commented that they were 'blown away' by the amount of feedback.

For Energy Analysis Techniques, comments on each report are grouped as: general comments, notable good features, and areas for improvement. Comments could also be placed in the submitted PDF of each assignment. This is done in conjunction with grid marking, where a spreadsheet is used to generate marks based on weighted criteria (a small sketch of the idea follows). It is not vital to mark in this way, but grouping comments like this, plus grid marking, makes things easier still.
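
For anyone unfamiliar with grid marking, the idea reduces to a spreadsheet-style weighted sum. The criteria names, weights and marks below are invented for illustration, not Neil's actual grid:

    # Grid marking sketch: overall mark as a weighted sum of criterion marks.
    criteria = {
        "technical accuracy": (0.40, 65),  # (weight, mark out of 100) - invented
        "depth of analysis":  (0.35, 72),
        "presentation":       (0.25, 58),
    }

    overall = sum(weight * mark for weight, mark in criteria.values())
    print(f"Overall mark: {overall:.1f}")  # 26.0 + 25.2 + 14.5 = 65.7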

Neil uses Dragon NaturallySpeaking 10, which now costs around £30. The basic microphone which comes boxed with the software works reasonably well, but he has found that suppliers of dictation software to GPs etc. offer microphones with much better results – expect to spend around £30-50.

Usually, the dictation is carried out using a basic Dell laptop from 2010, running Windows 7. Dragon NaturallySpeaking installs itself in Windows, and the software can be configured to run on Linux (with some tweaking) and Mac OS. Neil has also trialled other speech to text solutions such as Google speech recognition and IBM ViaVoice, but the Google product proved less reliable on accuracy. The IBM product worked well but required significantly more training.
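
For anyone wanting to experiment without a Dragon licence, a rough equivalent can be sketched with the open-source Python SpeechRecognition package and Google's free web recogniser (which, as noted above, Neil found less accurate than Dragon). The audio filename below is hypothetical:

    import speech_recognition as sr  # pip install SpeechRecognition

    # Transcribe a dictated feedback recording to text.
    recognizer = sr.Recognizer()
    with sr.AudioFile("feedback_dictation.wav") as source:  # hypothetical file
        audio = recognizer.record(source)  # read the whole recording

    try:
        print(recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        print("Speech was unintelligible - try a better microphone.")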

To dictate, a microphone is plugged into the laptop and the Dragon software is started along with the application (Word, Excel, Open/Libre office, Notepad). Training the software to recognise a specific voice takes around 30 minutes and involves reading some set passages before dictating for real. This ‘training’ can be one-off, although the software does become more accurate with more use by the same person/voice.

A bespoke database had been used in the past, long before Blackboard was used for providing feedback, but now the subject group uses DMU's Blackboard Learn VLE installation. Blackboard Learn offers the chance to provide audio feedback too, circumventing text altogether. Neil and the ELT Project Officer discussed this and Neil tested this multimedia based audio feedback approach, although after a trial the students stated a preference for text, as text is easier to skim read to pick out the salient points. He also felt that the audio files were rather lengthy and handling them became fiddly for a large cohort, so he has now reverted to dictation.

This approach to providing rich text based electronic feedback not only benefits students but colleagues who may have a disability could also adopt this technique to speak their feedback.  The software can also control the computer, offering improved functionality for anyone who is differently able.

Neil’s top tips for those who may wish to replicate this practice would be:

  1. Use a good quality microphone – background noise can reduce the accuracy of the software
  2. Set the software to be as accurate as possible and speak clearly
  3. Skim read the output text before releasing to the student as some specialist words or phrases can be misinterpreted
  4. Understand your students – Energy Analysis Techniques students prefer text based feedback but in other subjects it may be more appropriate to provide audio, text, or feedback in other media.

Ian Pettit, Neil Brown

Apr 03, 2014
 

It’s not as testing as it seems

It was anticipated that students studying their first year of the new Mathematics for Scientific Computing module would take four tests on paper, but newly appointed module leader Dr Sarah Greenfield had other ideas.

Sarah recognised that the nature of the tests (multiple choice with single correct answers) lends itself to an online format. This not only means that Sarah can have the tests marked by the system, but it saves paper and makes the tests re-usable year on year with minor amendments.

One hurdle that had to be overcome was the use of pictures in one of the tests. The fourth phasetest is a thirteen-question test in which each question and each of its four answer options are presented in picture format.

Sarah talked to the relevant specialist, and it was decided that creating this test in the Blackboard Learn Virtual Learning Environment would be investigated, as it was anticipated that this platform would add the most value by way of saving time and recycling the tests.

With support from the ELT Project Officer, Sarah has been able to efficiently create a bank of questions for Phasetest 4, each comprising the question image and four answer images. It took a few hours to create all of the images so that they all appear consistently, but this up-front investment is far outweighed by the time saved by having the tests marked by the VLE; on an ongoing basis, there is no longer a need to re-write and re-print paper tests every year.

Two issues were encountered:

Firstly, when adding pictures (.jpg format) to the test in the VLE, the default permissions do not include any access for students. This means that although the test appears to be fine for instructors, when students take the test for real, blank spaces appear where the pictures should be.

To overcome this, as the pictures were created, they were saved to a single local directory with meaningful filenames and following an agreed structure/hierarchy. Once all of the pictures had been produced, the entire directory was uploaded to the VLE and the folder permissions were changed to include students with read only permissions. Before submitting, the ‘overwrite’ option was selected and the new student permissions automatically percolated through to the sub-directories and files.

The second issue was how to create the thirteen questions in an efficient manner. Sarah could have created each question individually, browsing for each question image and its respective four option images as the questions were created, but it was estimated that this would represent the best part of a day's work.

Instead, Sarah created the first question and took twelve copies of it using the Edit Test screen. Sarah then went into the editor for the second question and amended the question title, and so on. However, at this point the images in the copied questions were still showing the images for question one. To save time, instead of removing all of the images and browsing the course for the correct image, Sarah made use of the HTML mode in the editor to point the existing code to the correct image.

For example, this code points toward the third answer option for question one:

<div class="vtbegenerated"><img src="https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/Phasetest4files/Sarah%20G_Phasetest4images_March2014/Sarah%20G_Phasetest4images_March2014/Q1/Q1C.jpg" alt="" style="border: 0px solid rgb(0, 0, 0);" /></div>

The key part here is the “Q1/Q1C.jpg” section of the code. This represents the sub-directory and the filename of this particular image.

Using the HTML mode, Sarah was able to copy and paste this code into each of the questions and amend only the sub-directory and filename to show the correct picture for the correct question and its respective answer options.

For example, this code represents the first answer option in question eight:

<div class="vtbegenerated"><img src="https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/Phasetest4files/Sarah%20G_Phasetest4images_March2014/Sarah%20G_Phasetest4images_March2014/Q8/Q1A.jpg" alt="" style="border: 0px solid rgb(0, 0, 0);" /></div>

Note that the code is almost identical with the exception of the sub-directory and filename, which is now "Q8/Q1A.jpg".

Using this technique of copy and paste with the HTML mode on, Sarah was able to quickly re-point the links to the relevant picture files in around an hour. This use of HTML mode requires absolutely no knowledge of HTML programming or tags, just a well organised folder to begin with.
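
Taking the idea one step further, a well organised folder means the tags could even be generated up front and pasted in. The sketch below is hypothetical (Sarah edited the tags by hand); the base URL and the Q1A–Q1D option filename pattern are taken from the two examples above:

    # Generate the img tag for every question/option ready for pasting into
    # the editor's HTML mode (style attribute omitted for brevity).
    BASE = ("https://vle.dmu.ac.uk/bbcswebdav/courses/IMAT1205_2014_Y/"
            "Phasetest4files/Sarah%20G_Phasetest4images_March2014/"
            "Sarah%20G_Phasetest4images_March2014")

    TAG = '<div class="vtbegenerated"><img src="{base}/Q{q}/Q1{opt}.jpg" alt="" /></div>'

    for q in range(1, 14):      # thirteen questions
        for opt in "ABCD":      # four answer options per question
            print(TAG.format(base=BASE, q=q, opt=opt))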

Sarah’s students will now sit this phasetest in a computer lab whilst logged into the VLE rather than on paper. This will save Sarah time as there is no longer a need to manually mark the tests and Sarah now has a bank of questions that can be easily added to or amended over the next few years.

 

Ian Pettit

Aug 21, 2013
 

I recently took delivery of a new piece of technology from the IT department here at DMU. It’s a telephone… or is it?

Last year DMU started the roll-out of Microsoft Lync to all of its staff members; the vision is that staff will use the software to video conference with each other, make use of Voice over IP, and manage time and meetings more effectively thanks to the integration with Outlook.

From a business justification perspective, all of the points above stand up by way of helping us to be more efficient and save costs but it is the other benefit that this project has ‘accidentally’ delivered that I want to talk about here.

The Centre for Enhancing Learning through Technology (CELT) works independently from the IT support department as part of the Library and Learning Services Directorate at DMU. However projects that the IT team delivers will often have a knock-on effect on CELT’s work and can sometimes deliver unexpected teaching, learning and assessment related benefits.

In order to take full advantage of MS Lync, the IT team has equipped each staff member with a new telephone. I was quite excited to take delivery of mine, as my previous device would have been more at home in a museum than on my desk. Aside from the obvious benefits of clearer sound, being notified when I had missed a call and being able to see my next scheduled meeting on the telephone's screen, I also noticed that there are now some new cables on my desk.

The cable I’m really excited about (if I can get excited about a piece of black wire) is the USB cable that now plugs into my PC and the new ‘phone.

This cable not only facilitates the communication between the MS Lync desktop application and the telephone; my PC also now 'sees' the new telephone and its discrete devices as hardware that the PC can access, control and interface with.

And here’s the point – by providing each staff member with a new telephone and the USB interface, the IT department has given everyone a good quality microphone and speaker that their computer can see and use.

Over the last couple of years, as one of the Enhancing Learning through Technology Project Officers at DMU, I have worked with a number of staff in areas such as creating screencast based resources, providing audio or audio/visual feedback and using screencast technology to provide resources and feedback using a variety of media for Distance Learners and attending students.

One area that has always been a sticking point is the provision of an appropriate microphone and speaker(s) to enable a teaching team to adopt such practice en masse.

Traditionally, I have always advised staff members to look for a mid-range wireless USB headset with microphone, as this can double up as a device for recording audio feedback in the office; if staff wish to record their session, the wireless USB headset can also be worn whilst teaching in order to capture audio as part of a lecture capture solution without having to loan or purchase a separate lapel mic'. However, such headsets can cost around £50 each and this cost can be prohibitive.

I have also come across instances whereby teaching staff are in possession of a microphone, but it is an older 3.5mm jack plug style microphone. This would be ok when using, say, the Windows sound recorder to produce audio files, but when interfacing with software such as Expression for the production of screencast type content, a USB microphone is required; in my experience Expression does not interface with more traditional equipment plugged in via a jack plug, and other applications struggle to pick up the older style microphones at a decent volume (even with a bit of tweaking of the levels).

So this brings me back to my nice new shiny telephone and the fact that when it was first plugged into my work PC it installed a few drivers, talked to MS Lync and did everything that the IT team expected it to. But now, when I open the 'recording devices' menu on my PC, I see a new USB microphone available to use that Expression can also see (or is that hear?), as can other software such as Panopto and the Windows sound recorder.

The 'phone actually has two microphones: one in the handset and one built into the body for use in loud-speaker mode. It doesn't matter which I use when recording audio on my PC; both deliver very good quality audio, and the PC doesn't need to switch between the handset and loud-speaker microphones, which makes using the telephone as a USB microphone really easy – it's just the same as plugging a USB microphone into a computer and talking to it.

The provision of these telephones at DMU has opened up a lot of potential for staff wanting to experiment with audio and audio/visual resources and feedback. Everyone now has a good quality microphone on their desk that will talk to software that is free to use, or to other centrally supported software, and they also have a speaker through which recorded content can be played for checking prior to uploading to the VLE – a real bonus for a project that was focused solely on providing a more corporate style communication tool for staff.

One member of academic staff at DMU is ahead of the game in this respect as his location was equipped early in the MS Lync project. Cormac Norton, School of Nursing, Faculty of Health and Life Sciences has already adopted the use of his new telephone as a USB microphone to add voice to PowerPoint slides – a case study that looks at Cormac’s technique in detail can be accessed on the CELT Hub here.

This experience also highlights the need for people such as myself, who support the use of technology from a teaching, learning and assessment perspective, to be aware of the technology that is centrally provided, and of how technology that might not have been designed or implemented with teaching, learning and assessment in mind can be exploited in order to make a difference.

I’m sure if we all looked hard enough we’d be able to squeeze just a bit more out of the kit that we are supplied to work with every day.


Ian Pettit.