Technology integration – a reflection

I recently presented a paper at the Australasian Simulation Congress on technology integration in the context of a first semester engineering course, “Introduction to Scientific Research and Writing and the Basics of Project Management”. I drew on Hattie’s (2009) research on high impact teaching and learning strategies to develop a pilot program (Pilot 2) in collaboration with a team of professors and a psychology research assistant; however, the Technology Integration Planning (TIP) model (Roblyer & Doering, 2014) provides a good framework for visualising how I integrated technology into the curriculum.

🏴 PHASE 1: ANALYSIS OF LEARNING AND TEACHING NEEDS

☝️ Step 1: Determine the relative advantage

Context: Funding was received from the German Federal Ministry of Education and Research (BMBF) in May 2011 to develop a course to scale project-oriented problem-based learning faculty-wide. The first semester engineering course, “Introduction to Scientific Research and Writing and the Basics of Project Management”, was designed as a remedial intervention to improve the academic literacy skills, metacognitive knowledge and self-regulation of first semester engineering students. During Pilot 1 (Winter Semester 2013/14 through to Summer Semester 2014), great emphasis was placed on maintaining a traditional unidirectional, vertical plane of assessment. The students attended frontal lectures during Phase 1 (the first half of the semester), completing a summative assessment in the form of a multiple-choice exam prior to progressing to Phase 2 (the second half of the semester). In order to successfully complete the course, the students were required to conduct exploratory research and disseminate the results in the form of a short article and a film. Each team was assigned a senior student tutor whose primary function was to record attendance at compulsory team meetings. The students received a final mark based on their performance in the multiple-choice exam in Phase 1, in addition to any marks gained through a bonus point program. Bonus points were awarded to students who participated in the campus mentoring program “Mentoring4Beginners” prior to Phase 1, as well as to those who produced one of the top three films during Phase 2. The topics selected for the Phase 2 quest were non-subject-specific.

An average of 350 students enrolled in Pilot 1 each semester. The overall student numbers were, and remain, far higher than the capacity of the facilities designed to accommodate them. Other challenges included the wide variance in achievement among newly enrolled students, the rising dropout rate and the decline in academic literacy levels. Student evaluations from Pilot 1 indicated that a significant number of students found it difficult to make the connection between the structure of the course and what they were expected to do. Many questioned the relevance of the course to their chosen degree path. There were also numerous complaints about the amount of on-campus time required to complete the course and about “death by PDF” (the subject material was provided to the students as PDF attachments uploaded en masse to an open-source LMS with limited functionality).

In view of the infrastructural constraints and the students’ learning requirements, I decided to leverage BYOD and mobile learning for Pilot 2 (Winter Semester 2014/15 through to Summer Semester 2015) to:

  1. Provide timely, frequent and individualized feedback
  2. Make the course content as accessible as possible across three learning spaces: on-campus lectures, on-campus group activities and tutorials, and off-campus mobile learning activities
  3. Increase student engagement through formalized peer tutoring from cross-age tutors

☝️ Step 2: Assess TechPACK

  1. Who (content knowledge): provided by a team of three professors (senior faculty)
  2. What (technological knowledge): provided by me – I have extensive experience integrating technology and digital media into the curriculum in secondary schools and higher education institutions
  3. How (pedagogical knowledge): I drew on Hattie’s (2009) research on high impact teaching and learning strategies to develop the pedagogical strategies for Pilot 2 in cooperation with the professors

Note: I find the TechPACK assessment a little too technology-focused for Step 2. I prefer the TPACK model – here is a short two-minute explanation:

☝️ Assess TPACK: My choice of technology was guided by the what (the subject content developed by the three professors) and the how (pedagogy guided by Hattie’s research on high impact teaching and learning strategies) within the local context (all students own smartphones and/or tablets; the operating systems vary).

🏴 PHASE 2: PLANNING FOR INTEGRATION

☝️ Step 3: Decide on objectives, assessments

Instead of the final grade being based solely on the multiple-choice exam results obtained at the end of Phase 1, plus any additional marks secured through the bonus point program, the team deliverables (Milestone II presentation, milestone documents, film and article) were weighted and counted towards the final grade. Three additional professors joined the team for Phase 2 from the start of Pilot 2, designing a subject-specific quest and providing subject expertise and support in physics, mechanics and mathematics throughout Phase 2.

The students were required to successfully complete two ‘levels’ to obtain their final grade, as opposed to one level in Pilot 1. The lower level, or ‘Phase 1’, consisted of a series of lectures (surface learning). The higher level, or ‘Phase 2’, consisted of project-oriented problem-based learning (deep learning). In order to progress to the second and final level, the students were required to demonstrate their mastery of the concepts taught in the first level through a multiple-choice test, or ‘side errand’. No prior knowledge of academic writing or project management was assumed.
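To make the assessment structure above concrete, here is a minimal sketch of how such a two-level grading scheme could be computed. The pass mark and the deliverable weights are purely illustrative, and the sketch treats the Phase 1 multiple-choice test as a gate only (rather than as a contributor of marks); the actual values and rules used in the course are not reproduced here.

```python
# Minimal sketch of a two-level grading scheme: a Phase 1 gate followed by
# weighted Phase 2 team deliverables. All numbers below are illustrative.

PASS_MARK = 0.5  # hypothetical threshold for the Phase 1 multiple-choice test

# Hypothetical weighting of the Phase 2 team deliverables
WEIGHTS = {
    "milestone_ii_presentation": 0.2,
    "milestone_documents": 0.2,
    "film": 0.3,
    "article": 0.3,
}


def final_grade(phase1_score: float, deliverables: dict[str, float]) -> float | None:
    """Return a weighted final grade, or None if the Phase 1 gate was not passed.

    phase1_score and the deliverable marks are assumed to be on a 0-1 scale.
    """
    # Students may only progress to Phase 2 after passing the multiple-choice test
    if phase1_score < PASS_MARK:
        return None
    # The final grade is the weighted sum of the team deliverables
    return sum(WEIGHTS[name] * mark for name, mark in deliverables.items())


print(final_grade(0.7, {
    "milestone_ii_presentation": 0.8,
    "milestone_documents": 0.7,
    "film": 0.9,
    "article": 0.6,
}))  # -> 0.75
```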

Elements of game design that have been found to be particularly impactful on learning, such as feedback, challenge and practice at the right level, featured strongly in the course design for Pilot 2, in addition to peer tutoring from cross-age tutors, formative assessment, valuing error and creating trust (Hattie, 2009, pp. 297-298).

☝️ Step 4: Design integration strategies

Centered on game-like learning delivered through a series of ‘side quests and errands’, or scaffolded tasks, Pilot 2 was supported by a responsive assessment feedback loop that guided the students in their learning while they competed with other groups in ‘multiplayer mode’. The students, or ‘players’, were required to complete the side quests and errands to progress through the course, or ‘real-time strategy game’, which was designed to simulate a professional setting. Three learning spaces were introduced: on-campus lectures, on-campus group activities and tutorials, and off-campus mobile learning activities.

Phase 1: Structured in-lecture questioning using online game-based learning platforms was introduced to increase student participation, improve knowledge retention and assist academic staff in better monitoring student learning. Online quests, designed to complement the on-campus lecture series, were made available for the students to complete in their own time on their mobile devices. Social networking technologies were leveraged to add a lateral plane of formative assessment and create a sense of community among the students, the senior student tutors and the staff.

Phase 2: An outcomes-based approach guided by constructive alignment was taken for Phase 2. The learning objectives were rewritten to be more clearly measurable and the course assignments were mapped more closely to the course objectives. Formalized peer tutoring from cross-age tutors was introduced – the groups were required to attend three face-to-face meetings in addition to collaborating online. This formed an integral component of the continuous feedback loop for Phase 2, which also included a mobile questionnaire and fixed milestone reviews. Online quests, designed to complement the on-campus activities, were made available for the students to complete in their own time on their mobile devices. Social networking technologies were leveraged to add a lateral plane of formative assessment and create a sense of community among the students, the senior student tutors and the staff.

☝️ Step 5: Prepare instructional environment

Fifteen senior students were employed, trained and each assigned four student teams to supervise throughout Phase 2. They were given a limited instructional role, supporting the first semester students in developing their technological skills, metacognitive knowledge and self-regulation through formative assessment. The senior student tutors were not viewed as a substitute for academic staff, but rather as a valuable component of the feedback loop that assisted in determining the ‘player levels’ of the students during the implementation of Phase 2. Regular meetings were held between the senior student tutors, the psychology research assistant and myself throughout Pilot 2.

🏴 PHASE 3: POST-INSTRUCTION ANALYSIS AND REVISIONS

☝️ Step 6: Analyze results

Both the student tutors and the students were enthusiastic about the BYOD and mobile learning components of the course, reporting that they felt they had acquired valuable work, study and time management skills.

http://derwagpblog.tumblr.com/post/144905655359/jobbenimstudium-welche-fähigkeiten-hast-du-durch


However, a significant number of students felt that the subject-specific quest was too difficult, and there was no indication that the subject-specific quest topic substantially improved learning outcomes.

☝️ Step 7: Make revisions

Based on the student feedback from Pilot 1 and Pilot 2, it was decided to maintain both the technology component and the structure of the course as implemented during Pilot 2 for Winter Semester 2015/16. However, unlike Pilot 2, a non-subject-specific topic for the Phase 2 quest was selected for Winter Semester 2015/16. Previous studies have shown that project-oriented learning at tertiary level can have a negative effect on learning if implemented before students have obtained sufficient surface knowledge (Hattie, 2015, p. 85). This could explain why the students’ perception of their achievement did not improve markedly between Pilot 1 (non-subject-specific Phase 2 quest topic) and Pilot 2 (subject-specific Phase 2 quest topic). Although challenge and feedback were included in the course design for Pilot 2, the level of difficulty (subject-specific quest topic) was not appropriate for the level of surface knowledge possessed by the first semester engineering students.

Read the paper here.


References

Hattie, J. (2009). Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. Oxon: Routledge.

Hattie, J. (2015). The applicability of Visible Learning to higher education. Scholarship of Teaching and Learning in Psychology, 1(1), 79–91. http://dx.doi.org/10.1037/stl0000021

Roblyer, M., & Doering, A. (2014). Integrating Educational Technology into Teaching (6th ed., International Edition). Pearson.

2 Comments
  • Ellen McIntyre
    January 14, 2017

    Hi Benita,
    That was quite an extensive study with a decent time frame and sample size. Will you get a chance to pursue the next phase of the project or are you moving on?
    I was not aware of the Hattie 2015 article, but having skimmed it tonight I will certainly be looking at it in more detail. I would be interested to have more details about the project. The subject title “Introduction to Scientific Research and Writing and the Basics of Project Management” appeared to be quite challenging for first year students and I wondered whether there was a differential focus on scientific research, scientific writing, and project management. The game-based strategy would have made some interesting activities and some examples of those would be good too. Is it a form of adaptive learning?
    Best wishes,
    Ellen

    • Benita Rowe
      January 14, 2017

      Hi Ellen, thanks for your comment! I’ve just started a new job but before I left the course was adopted into the curriculum.
      On the title – I was and am a little uncomfortable with it as the course was designed to teach students how to cite references. It’s not what we would call scientific research in English but that was the title that was assigned to it.
      The game-based strategy worked really well in that particular context. It wasn’t adaptive learning in a strict sense because it combined both physical and online environments; however, the feedback system was designed and used in a manner that made it possible to adapt the content as necessary and provide rolling feedback throughout the course, which was no small feat given the number of students! 🙂
      I did want to publish the materials and activities under an open access license, particularly as their creation was funded with taxpayers’ Euros. Unfortunately that was not permitted. I described the activities and structure as extensively as I could in the paper though – here it is.
      Here is a Google Spreadsheet with all of the software etc. that I use to create BYOD/ mLearning materials and activities (I used them for Pilot 2 as well) – feel free to download it or add it to your Google Drive.
      Some of the students published their entries – here is one of my favourite short films – enjoy!
      Cheers, Benita
