Areas of Interest: Possible Research in Ed Tech

I should have posted this back in September in order to track my progress in the field, but better late than never, right?


I keep going back to one of the very first pieces I ever read as a doctoral student, Grover's (2001) piece on the 10 mistakes doctoral students make. I refer to it in an attempt to avoid making too many of them myself. There are so many different things I am interested in, and in theory they would all make interesting studies, but as Grover (2001) said, my project must be more than just interesting and relevant; it must also be defensible and feasible (p. 13). I can imagine all sorts of interesting ways to test my possible research topics, but when I factor in time, ability, money, and practicality, many of those projects end up being crossed off the list.

My main areas of interest include the design and development of virtual environments and social media, including Web 2.0 technologies, but after reading Noffke (2008), I am less interested in developing something than in researching issues emanating from practice (p. 430). It would be great to sit down and develop a whole new learning environment (I even thought about developing a Second Life island for this cohort to see if it assisted with building a sense of community), but who am I kidding? Just the research into the requirements of creating a new learning environment would take months, and then there's the design, the development, the implementation, and the buy-in from instructors. Assuming I was able to convince someone to try it, I could then get around to testing whether it had any positive learning outcomes. By that time I'd be in year six of my degree. Maybe someday I'll have the opportunity to take on such a far-reaching project, but for now I need to scale it back.

While working on the dissertation review project I read several works that used action research with a mixed-methods approach, qualitatively analyzing surveys and quantitatively analyzing performance results. It was in the merging of the two that the researchers found not just conclusions but an area of real-world applicability. Many issues concerning the use of social media, virtual learning environments (VLEs), and Web 2.0 technologies live in this realm between the qualitative and the quantitative. Engagement, student interest, usability, and instructional design and implementation all merge to create the learning environment; the effectiveness of these technologies may therefore be measured by test scores, but their appropriateness is measured by human behavior. Education and learning are personal experiences and should be researched as such. Any research on the use of these technologies has to answer questions not only of relevancy but of action (Noffke, 2008, p. 431), such as: Is this the best tool for this time and place? Can it deliver results in a cost-efficient manner, or are there other options? If this were available in your school or organization, would you use it? Does using this tool give you a benefit no other tool does?

I went back and looked at the possible research topics I had listed earlier and tried to narrow them down based on which ones I could develop practical test models for (given my limited knowledge of creating test models). As I read through them I asked myself, "How would I go about setting up the research?" The idea of using Second Life and other virtual environments to teach cultural sensitivity to those working and being deployed abroad interests me in my role as a courseware developer for the Air Force, but I doubt I could ever get the Air Force to sign on to such a case study, at least not in a timely manner. I did notice that there is some overlap in how the research for some of the topics could be done, and while the specifics varied, a mixed-methods approach would be appropriate for most. I decided that the best way to proceed was to just pick one and use it as a test case; I can always adapt these ideas to another topic at a later time. The specific topic I chose is: Can virtual environments eliminate the "gender gap" in math and science? The advent of avatars allows learners to be whoever they want to be, so would gender-ambiguous learning environments help girls and women find their voice in the fields of math, science, and technology?

I believe it would be possible to set up this experiment via a VLE using three test groups. It would require a mixed methodology, analyzing not just test results but student response and instructor buy-in. In their article on mixed-methods research, Johnson and Onwuegbuzie (2004) contend that "mixed method research offers great promise for practicing researchers who would like to see methodologists describe and develop techniques that are closer to what researchers actually use in practice" (p. 15). The field of education technology is not just about numbers and test scores; it's about how people react to and interact with that technology. Jill Sellars Morris used a mixed-methods approach in her dissertation, A Case Study on Advanced Technology: Understanding the Impact of Advanced Technology on Student Performance. Morris (2010) conducted a direct comparison of test scores, but also more subjective interviews to measure the engagement of students and the attitudes of teachers (p. 13). Morris was able to do her research at a school that was just beginning to implement smart board technology; some classes had the boards and others did not, so her test groups were already in place. If I could find a similar situation with a school that was just beginning to implement VLEs, it would alleviate the costly and time-consuming process of having to develop a virtual world.

Like the dissertation research conducted by Park (2006) in his analysis of animation's effects on student learning, my research would use three separate test groups. One group would act as the control group, being taught a math or science course in a traditional manner, while the other two groups would take the course in a VLE. One of the VLE groups would use avatars designed to represent the students themselves, and the other would use generic avatars with gender-neutral aliases; the instructor would never know the gender or "real-world" identity of each student. The analysis would also be conducted blind, with avatar identities matched to student profiles only after the test results were analyzed. A quantitative analysis of the test scores could show whether there was a difference between traditional methods and VLEs overall, and could then be broken out by gender. The results might show no difference, worse performance in a VLE overall and/or by gender, or better performance overall and/or by gender.
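Just to sketch what that quantitative piece might look like, here is a rough outline in Python. Everything in it is hypothetical: the file name, column names, and group labels are placeholders I made up for illustration, not part of any actual study design.

```python
# Rough sketch of the quantitative comparison across the three test groups.
# Assumes a hypothetical file "scores.csv" with columns: group ("control",
# "self_avatar", or "neutral_avatar"), gender ("F" or "M"), and score
# (the numeric test result). All names are placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("scores.csv")

# One-way ANOVA: is there any overall difference among the three groups?
samples = [g["score"].values for _, g in df.groupby("group")]
f_stat, p_overall = stats.f_oneway(*samples)
print(f"Overall group difference: F = {f_stat:.2f}, p = {p_overall:.4f}")

# Then break each group out by gender to look for a gender gap
# (done only after the blind is lifted and identities are matched).
for group_name, g in df.groupby("group"):
    female = g.loc[g["gender"] == "F", "score"]
    male = g.loc[g["gender"] == "M", "score"]
    t_stat, p_gap = stats.ttest_ind(female, male, equal_var=False)
    print(f"{group_name}: gender gap t = {t_stat:.2f}, p = {p_gap:.4f}")
```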

In their article, Saeed, Yang, and Sinnappan (2009) argue that the appropriate use of technology positively influences student academic performance, and that students' learning styles influence their preferences for using that technology (p. 98). Part of the qualitative research would have to include a survey to test this hypothesis. One set of questions would go to the students: Did you feel comfortable asking questions in your class? Did you feel that the instructor gave you adequate explanations? Was the instructor accessible? Did you enjoy your learning experience? How would you rate this class against others you have taken? Do you feel that you learned the material? Another set of questions would go to the instructor: Did you notice a difference in the way you interacted with the students in the different environments? Did you notice greater student participation in one environment over the others? Were you comfortable teaching in the virtual environment? Were you able to interact with your students at the same level when you didn't know who they were? The answers to the survey questions would then be compared to the test results, and the relationship between engagement, enjoyment, and perception on the one hand and actual performance on the other could be explored.
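For that last comparison, here is one rough sketch of how the survey answers might be related to performance, assuming the student responses were collected on a 5-point Likert scale and joined to test scores by an anonymous student ID. Again, the file and column names are made up for illustration.

```python
# Sketch: relate Likert-scale survey responses to test scores.
# Assumes hypothetical files "survey.csv" (student_id, enjoyment, comfort,
# perceived_learning, each on a 1-5 scale) and "scores.csv"
# (student_id, score). All names are placeholders.
import pandas as pd
from scipy.stats import spearmanr

survey = pd.read_csv("survey.csv")
scores = pd.read_csv("scores.csv")
merged = survey.merge(scores, on="student_id")

# Spearman's rank correlation suits ordinal Likert data better than Pearson's.
for item in ["enjoyment", "comfort", "perceived_learning"]:
    rho, p = spearmanr(merged[item], merged["score"])
    print(f"{item} vs. score: rho = {rho:.2f}, p = {p:.4f}")
```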

No matter what the results, I'm sure the study would still leave a lot of questions. As always, more research will need to be done.

Works Cited

Grover, V. (2001, May). 10 mistakes doctoral students make in managing their program. Decision Line.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26.

Morris, J. (2010). A case study on advanced technology: Understanding the impact of advanced technology on student performance (Ed.D. dissertation, Northcentral University). Retrieved from ProQuest Digital Dissertations. (AAT 3411162)

Noffke, S. E. (2008). Research relevancy or research for change? Educational Researcher, 37(7), 429–431.

Park, J. (2006). Animation effects on student learning: The impact of animated graphics on student recall (Ph.D. dissertation, University of Florida). Retrieved from ProQuest Digital Dissertations. (AAT 3229726)

Saeed, N., Yang, Y., & Sinnappan, S. (2009). Emerging web technologies in higher education: A case of incorporating blogs, podcasts and social bookmarks in a web programming course based on students' learning styles and technology preferences. Educational Technology & Society, 12(4), 98–109.
