Issues in the Teaching of Research Methods

PDF with appendices available here.

“… science must begin with myths, and with the criticism of myths; neither with the collection of observations, nor with the invention of experiments, but with the critical discussion of myths, and of magical techniques and practices.” 

Popper (1963) p66. 

MacInnes (2012) has talked of the potential for research methods to shock, inspire and challenge, but he also states that this will require some creative innovation in the development of teaching resources and delivery techniques.  There are many research methods books available (Breakwell et al., 2012; Coolican, 2009; Flanagan, 2012; Miles and Banyard, 2007) that can provide a practitioner with the knowledge to teach research methods; however, this knowledge does not automatically translate into the capacity to teach it well (Garner, 2012).  Even ‘the best curriculum is worthless without proper pedagogy’ (Coombs and Rybacki, 1999, p56).

Throughout this assignment I will address these issues and provide strategies for a teacher who is returning to teaching following a break in service.  It will address four issues that cover the main areas where a teacher returning to service would need support and intervention.  Within each of these areas I will discuss the issues that the teacher could encounter, provide advice and practical strategies based on both personal experience and research, and detail ways to measure the effectiveness of the suggestions.  The four areas that I will cover are:

  1. What are they thinking? Here I will consider engagement in class, specifically linked to the teaching of research methods and improving critical thinking skills through the teaching of research methods and methodology.
  2. Where are they? This will consider measurement of student performance and consider both formative and summative assessment strategies that can be utilised when teaching research methods.
  3. What are they doing? A discussion of the content and teaching strategies that can be used, with examples of resources and discussion of their effectiveness.
  4. Get them doing it.  Ideas and strategies to encourage students to ‘get their hands dirty’ with research methods, both to engage them and to give them a better understanding of methods.

Core concepts throughout the following suggested strategies come from Angelo (1995) who suggests that classroom learning improves when (a) students are personally invested and actively engaged, (b) they receive prompt and comprehensible feedback, and (c) they work cooperatively with their classmates and teachers.


What are they thinking?

A solid understanding of research methods is a cornerstone of success for a psychology student, yet the topic is often seen as dull, with both students and teachers approaching it with disinterest and trepidation (Lammers, 1993). Research methods courses can lack the intrinsic elements of other areas of psychology that engage and interest learners; it is therefore vital that the teacher reveal to students the beauty of science by actively engaging them in the learning process.

The very act of being engaged builds the foundation of knowledge, skills and dispositions that are essential to developing an awareness of a topic; any strategy given to the returning teacher should therefore incorporate active and collaborative learning activities (Kuh, 2003).  These could range from abstract activities, such as writing letters to other students within the class to explain concepts (Dunn, 2000), to role-playing an ethics committee and making decisions on the legitimacy of research (Herzog, 1990) (appendix 5(i) & 5(ii)).  These role-plays, and reflection on them, help students with the important processes of evaluation and metacognition (Pellegrino et al., 2001).

Shulman’s ‘Table of Learning’ taxonomy asserts that learning begins with student engagement, which in turn leads to knowledge and understanding.  When the learner has gained this knowledge and understanding they become capable of performance; reflection on that performance then leads to higher-order thinking and an awareness of one’s own understanding (Shulman, 2002).  Therefore, without engagement, active learning, or investment in their own learning, the student will not progress and achieve.

Within the context of this assignment, engagement in a post-16 research methods module can be fostered through many different avenues, from delivery and pedagogical style to the content and context offered for methods and concepts.  Further to this, developing critical thinking skills and allowing students the freedom to explore the area within a scaffold, rather than forcing rote learning, will also provide a solid foundation on which they can build a successful understanding of research methods (Holt, 1972; Kuh et al., 2010).

Psychology students should be able to think critically, or evaluate claims, in a way that explicitly incorporates basic principles of psychological science; in other words, to have psychological critical thinking (Lawson, 1999).  Within research methods this can be achieved using a variety of strategies.  One suggestion is to start topic areas with more abstract activities to get students considering methodological issues outside of the narrow framework of each subject specification (Kaminski et al., 2008) and to bring these issues to life (Blair-Broeker, 2003).  Activities such as ‘More cat owners have degrees’, demonstrating the dangers of misinterpreting correlational research and the possible bias caused by funding, and ‘The dangers of bread’, again illustrating the issue of inferring causation from correlation (see appendix 2(i) and 2(ii)), act as excellent starting points for discussion about causation and correlation.  Articles such as these teach students to be ‘savvy consumers and producers of research’ and develop the abilities needed to analyse, synthesise and apply learned information (Sternberg, 1999).
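For the returning teacher who wants to demonstrate this point concretely, a short simulation can make the correlation-versus-causation argument vivid. The sketch below uses hypothetical simulated data rather than the figures from the articles in appendix 2: two variables that are both driven by a third, lurking variable correlate strongly even though neither causes the other.

```python
# Illustrative simulation only: variable names are hypothetical and are not
# taken from the appendix 2 articles. A lurking variable (income) drives both
# outcomes, producing a strong correlation with no causal link between them.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n = 200

income = rng.normal(50, 10, n)  # the lurking variable (arbitrary units)

# Both outcomes depend on income plus noise, but not on each other.
degree_likelihood = 0.8 * income + rng.normal(0, 5, n)
cat_ownership = 0.5 * income + rng.normal(0, 5, n)

r, p = pearsonr(degree_likelihood, cat_ownership)
print(f"r = {r:.2f}, p = {p:.3g}")  # a 'cats and degrees' correlation, no causation
```

Printing the correlation alongside a scatter plot of the two outcomes gives students a concrete starting point for the discussion questions above.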

To scaffold students’ analysis and evaluation skills, the returning teacher could use a set of critical thinking questions to frame their evaluation of research alongside their understanding of research methods (Lawson, 1999) (see appendix 3).  These critical thinking questions give students a structure with which to establish the credibility of a research method. They also allow differentiation across learners, providing the opportunity for lower-ability students to give limited responses and for the more able to expand and demonstrate their synoptic awareness of research methods and the surrounding issues and concepts.

Popper (1963) suggested that all science should start with myths and the techniques to test these myths. One should capitalise on the intrinsic engagement a learner gets from the ‘treasure hunt’ of considering a myth (or research question) and designing an experiment to see how feasible it is.  It has been suggested that using the television programme ‘Mythbusters’ is an effective way to engage students, demonstrate scientific thinking and provide context for the scientific research method (Burkley and Burkley, 2009).  A more specific example of this is Blessing and Blessing’s (2010) PsychBusters, in which students are given psychological ‘myths’ and then produce a presentation outlining the myth and providing supporting or refuting evidence for it.  Such assignments enter the domain of problem-based learning (PBL) and psychological applied learning scenarios (PALS) (Norton, 2004).

Within the context of post-16 psychology, the Mythbusters programme can seem unrelated to the subject, and implementing presentations over a long period of time (as in Blessing and Blessing, 2010) is impractical.  A more succinct activity that fosters the same thinking skills has therefore been developed (see appendix 1 and 2(iii)), in which students in small working clusters work through a list of ‘misconceptions’ based on the Edexcel specification.  Further to this, as extension work students are encouraged to access online videos considering psychology myths (‘Mulierculum’, 2012) and further reading (Lilienfeld et al., 2009).  A list of psychological myths was created to provide a task for students that would be used to measure the effectiveness of the critical thinking exercises (appendix 4).  Students would complete an exercise similar to Blessing and Blessing (2010), but would produce a written report on the ‘myth’.

The measurement of student engagement and its consequent impact on learning is an issue of much research and debate (Fredericks et al., 2005; Lippman and Rivers, 2008; Kuh, 2003; Shulman, 2002), and several measures of student engagement have been created, such as the School Engagement Scale (Fredericks et al., 2005), which considers engagement on three levels: behavioural, emotional and cognitive.  The impact of the strategies above will be evident in both formative and summative assessments. Increased critical thinking skills would manifest themselves in the AO2/AO3 bands, and the strategies would also produce learners who are more aware of the deeper psychological issues surrounding research development.

 

Where are they?

Testing plays a critical role in fostering student learning (Connor-Greene, 2000) and there is more leverage to improve teaching through changing assessment than through changing anything else (Gibbs and Simpson, 2005).  In the current landscape of A Level assessment we are seeing a movement towards more applied exam questions that require students to demonstrate more than rote learning of studies. However, many teachers choose to use high-stakes, widely spaced assessments to measure student progression; these rely heavily on memorisation of information and do not challenge higher-order thinking skills (Bol and Strage, 1996).

‘While it is generally acknowledged that increased use of formative assessment (or assessment for learning) leads to higher quality learning, it is often claimed that the pressure in schools to improve the results achieved by students in externally-set tests and examinations precludes its use’ (Wiliam et al., 2004, p49).

Research has shown that repeated retrieval induced through testing (and not repeated encoding during additional study) produces large positive effects on long-term retention (Karpicke and Roediger, 2008, p968).  This would suggest that regular (every session) assessments reviewing prior learning would be more effective than more widely spaced ‘mock exams’ or assessments.  In essence, measurement of student progress becomes a fluid and flowing process of assessment for learning (AfL).  Why should there be such a distinct line between teaching and testing anyway?

In the context of this assignment there are two main approaches that could be employed by the person returning to teaching to achieve a continuous assessment model: just-in-time teaching (JiTT) and daily assessments.  Both have strengths and weaknesses, and these will be reviewed along with specific ways they could be implemented in a research methods classroom.

JiTT is a pedagogical strategy that uses a feedback loop between classroom activities and the work students do at home in preparation for the classroom meeting (Novak et al., 1999).  JiTT is comparable to the more recent ‘flipped classroom’.  Flipped classrooms shift the way teachers provide instruction by inverting traditional teaching methods to engage students in the learning process: using technology, lectures are moved out of the classroom and delivered online as a means to free up class time for interaction and collaboration (Shimamoto, 2012).

Both of these methods place responsibility for learning on the student.  An example of JiTT would ask students to complete an assessment, essay or question prior to the lesson and send this to the teacher before the session.  The teacher can then prepare a session based on the essays that have been handed in, using examples of students’ work that demonstrate both good and bad practice.  This allows rolling assessment of classwork, but leaves little time for planning a session and could place additional pressure on the teacher.

The ‘flipped classroom’ is another method that empowers learners to take control of their learning and assessment by providing resources and tasks for them to complete outside the classroom, so that class time can be used for discussion and clarification of points.  Websites such as resourcd.com allow teachers to ‘flip’ their resources and create a portal for students to use (see appendix 6 for an example).  The returning teacher could make use of this, allowing more class time to be focused on clarifying and measuring learning, as well as supporting learning by giving students multiple retrieval attempts at the material covered (Karpicke and Roediger, 2008).

A further option for monitoring students is daily class assessments.  Connor-Greene (2000) found daily essay quizzes to be a valuable catalyst for all levels of thinking, and they also encouraged students to prepare thoroughly for every class.  With appropriate questions these assessments provide a starting point for each class and enable students to creatively apply, synthesise and evaluate what they have covered in prior sessions.  In using daily quizzes, testing becomes a dynamic process rather than a static measure of student knowledge, and teaching and testing are no longer distinct entities (Connor-Greene, 2000, p88).

There are many ways to address assessment, but the above suggests that embedding assessment into the planning of each session is essential.  Many teachers postpone planning of assessments until after a unit is taught and write the assessment reactively (Sanchez and Valcarcel, 1999). Using the above methods, or a mixture of JiTT, the flipped classroom and embedded daily assessments, would ensure that feedback on student progress is dynamic and that interventions can be put in place immediately for those who are not succeeding.

 

What are they doing?

Especially at A Level, teachers of research methods and statistics are often faced with teaching the content to heterogeneous groups of students who have a wide variety of academic backgrounds and knowledge (Porter et al., 2006).  Differentiation and planning for such a wide range of learners is a particular issue in the delivery of research methods because of the wide range of skills needed to successfully complete the module.  Research methods demands that learners are aware of scientific concepts and philosophies, descriptive and inferential statistical methods, and specific psychological terminology.

In the context of this assignment, one area to consider is the skill set that the teacher provides the learners with.  Teaching is not just about giving students knowledge but also about providing the learner with signposts to help develop their studentship skills and become a better learner in general (Dunlosky et al., 2013).  A recent monograph considered the relative benefits of a variety of revision and learning strategies that students use and reflected on the impact they have on both learning and retention of content.  Table 1 summarises the findings, illustrating the relative utility of ten different strategies that students use.

In its own right, Dunlosky et al. (2013) provides a wealth of information to the returning teacher about the skills that should be fostered within the class.  Using this research as a foundation, the returning teacher should encourage learners to use and develop the ‘spaced revision’ strategy (see appendix 8), which builds on ideas from the research including summarisation and practice testing.  Many students already use practice testing as a revision method, which suggests it is a strategy they will readily engage with.  However, students who test themselves while studying likely do so to assess what they have learned rather than to enhance their long-term retention by practising retrieval; the ‘spaced revision’ method should therefore aid retrieval and, subsequently, understanding and performance on assessments.
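As a purely illustrative sketch (not a reproduction of the ‘spaced revision’ method in appendix 8), the snippet below shows one way a seven-day schedule mixing practice testing and summarisation could be laid out; the topics, activities and expanding intervals are assumptions chosen for the example.

```python
# Illustrative only: NOT the 'spaced revision' method from appendix 8.
# Each topic is revisited on expanding intervals, alternating practice
# testing with summarising from memory.
from itertools import cycle

topics = ["sampling", "experimental design", "correlation", "inferential tests"]
activities = cycle(["practice test", "summarise from memory"])

schedule = {day: [] for day in (1, 2, 4, 7)}  # expanding intervals (days)
for topic in topics:
    for day in schedule:
        schedule[day].append(f"{topic}: {next(activities)}")

for day, tasks in schedule.items():
    print(f"Day {day}: " + "; ".join(tasks))
```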


Table 1: Relative utility of different learning and revision methods taken from Dunlosky et al. (2013).

Action research completed to assess the impact of the ‘spaced revision’ method illustrated a small effect between two classes on a class assessment on research methods.  One week before the assessment, one class was provided with the ‘spaced revision’ method and encouraged to use it over the subsequent seven days, while the second class was unaware of this strategy and was encouraged to revise each day using their preferred method.  The results suggested a small but significant effect of the revision method (U = 91, N = 19, 15, p < 0.05); however, the statistical analysis did not take into account prior achievement or predicted achievement (based on ALPS and ALIS scores).  The class that did not receive the revision strategy prior to the assessment was given it following the test.  The assessment was conducted solely for the basis of this assignment and had no impact on any internal grades or reports, ensuring that these learners were not disadvantaged by the investigation.
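For illustration, the kind of comparison reported above could be reproduced in a few lines of Python. The scores below are hypothetical placeholders with the same group sizes, not the actual class data, which are not reproduced here.

```python
# A minimal sketch of a Mann-Whitney U comparison between two independent
# groups, as appropriate for ordinal or non-normal assessment scores.
# The score lists are hypothetical placeholders (n = 19 and n = 15).
from scipy.stats import mannwhitneyu

spaced_revision_scores = [14, 17, 12, 18, 15, 16, 13, 19, 15, 17,
                          14, 16, 18, 13, 15, 17, 16, 14, 18]   # n = 19
own_method_scores      = [11, 13, 10, 14, 12, 13, 11, 15, 12,
                          10, 13, 12, 14, 11, 12]               # n = 15

u_stat, p_value = mannwhitneyu(spaced_revision_scores, own_method_scores,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```

A fuller analysis would, as noted above, also control for prior and predicted achievement rather than comparing raw assessment scores alone.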

Questioning strategies, both oral and on activity sheets, have a measurable impact on student learning of information and concepts (Wilen, 1987).  As effective research methods knowledge requires learning a wealth of terminology and the ability to apply it appropriately, the returning teacher should consider the impact of questioning.  Elaborative interrogation simply encourages the learner to justify their answer and can be seen as a form of metacognition, requiring the learner to think about their thinking (Dunlosky et al., 2013).  This can be achieved through the careful design of worksheets (see appendix 9 for a pre and post example) and through oral questioning strategies in the classroom, by simply asking ‘why?’ when a student responds.  This allows the teacher to identify the thought process the student is following; it is possible that a student has rote-learned responses that are incorrect.  An example taken from class:

 Studying the difference in running a maze between white rats and grey rats.

Teacher: “What design would be most appropriate here?”

Student: “… er … independent measures [?]”

Teacher: “Thank you.”

Ending the discussion here would leave the teacher believing that the student had grasped the concept of experimental design and could apply it correctly.  However, this may not be the case.  Let us now apply elaborative interrogation.

Teacher: “Why have you concluded that it is independent measures?”

Student: “’Cause it’s like that other answer with the dogs and cats, init?”

It is now evident that the student is not aware of the concept or how to apply it to a novel situation when no similar example has come before.  Elaborative interrogation appears to be a powerful learning procedure that is generally useful during fact learning (Pressley et al., 1988), and generating an elaboration has been found to lead to better memory for main ideas (Seifert, 1993).  Beyond elaborative interrogation, it would be important for the teacher to be aware of other questioning techniques, use appropriate question stems to stretch her students (see appendix 10), and explore techniques such as PPPB (see appendix 11 for feedback from action research conducted) and how PPPB and elaborative interrogation can be combined.

 

Get them doing it.

ACME’s Mathematical Needs report (2011) suggested that too much emphasis on ‘teaching to the test’ leaves both GCSE and A Level qualification holders poor at applying their mathematical knowledge in different contexts.  Further commentary (Garner, 2012; MacInnes, 2012) has identified that A Level and undergraduate psychology students lack transferable skills when it comes to research methods and statistics.  Having students conduct research and understand the practical and ethical issues first-hand is an integral element of supporting students and ensuring that they progress with a knowledge of how to conduct research, not just how to pass a particular specification’s methods paper (Jarrett, 2010).

There are several ways that students can take part in creating and conducting research, from practical demonstrations and studies to more theoretical applied learning scenarios (see appendix 7) (Norton, 2004).  When creating an activity the teacher should be wary of ‘teaching to the test’ and should encourage wider thinking from students.  It is all too easy to give learners a sheet with aim, procedure, results and conclusion to complete, but such sheets give a false impression of what research is like.  Worse, they foster the ‘what do I want my study to show’ mentality, with students beginning at the end and reverse engineering a study to support conclusions they have already made (MacInnes, 2012).

Conducting research is only part of understanding the ‘big picture’, and statistics is an area of research methods where students often fail to engage with the topic, or decide they will not be able to master it before they have even started.  Using the Jigsaw Technique, the teacher is able to differentiate well and scaffold the learning of the weaker students by using the more able (Cumming, 1983).  It has been shown that cooperative learning methods benefit students’ achievement and bring increased individual attention to a task (Perkins, 2001).

Following a session teaching statistical methods to an A2 psychology class, feedback suggested that students were not comfortable with the content: when each statistical test should be used, how it worked and what significance levels mean.  Consequently, the ‘jigsaw sheets’ (see appendix 12) were developed to allow clusters of students to work through completing a statistical test: learning by doing.  The feedback from this second session was more positive, and a summative assessment supported this, with all students achieving at least 60% on the test.  Interestingly, during the session one student also developed a mnemonic (Richmond et al., 2011) to recall when statistical tests are used, which was then passed on to the class (see appendix 13).
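To illustrate the kind of decision making the jigsaw sheets rehearse, the sketch below encodes the standard A Level rules for choosing a non-parametric test from the design and the level of measurement. It is a hypothetical helper written for this assignment, not a reproduction of the sheets in appendix 12.

```python
# Hypothetical helper mirroring the usual A Level decision rules for
# choosing a test: level of measurement first, then the design.
def choose_test(design: str, level: str) -> str:
    """design: 'independent', 'repeated' or 'correlation';
    level: 'nominal' or 'ordinal/interval'."""
    if level == "nominal":
        return "Chi-square"
    if design == "independent":
        return "Mann-Whitney U"
    if design == "repeated":
        return "Wilcoxon signed-rank"
    if design == "correlation":
        return "Spearman's rho"
    return "check the design and level of measurement again"

print(choose_test("independent", "ordinal/interval"))  # Mann-Whitney U
print(choose_test("correlation", "ordinal/interval"))  # Spearman's rho
```

Asking students to reconstruct (and then break) such a decision tree in their clusters rehearses exactly the ‘which test and why’ reasoning the mnemonic above captures.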

Building on the Jigsaw Technique, it is possible to combine a range of the interventions above to engage students, allow them to complete activities outside of the classroom (JiTT; flipped classrooms) and enable collaborative learning.  This was achieved in two ways that encouraged students to get involved and be empowered by their learning.  The returning teacher could take advantage of these techniques in several ways using web technologies.

Lineweaver (2010) found that online discussions improve students’ class preparation and engagement.  Through the use of a Twitter account (twitter.com/psych_at_wyke), students were encouraged to tweet about concepts, ideas and items of media research that interpreted data poorly.  This builds on the ideas of Connor-Greene (1993) and Hall and Seery (2006) of comparing media reports to original research.  The students actively engaged with this: 67% of students followed the Twitter account and 23% took part by tweeting, retweeting or contributing to a discussion on a concept or news article (appendix 14).

The use of Google Forms (https://docs.google.com/templates?type=forms) further engaged the students in active research and lesson delivery, and there are many other uses for this given the number of students who have access to mobile phones or devices that can connect to the Internet in class (Thornton and Houser, 2004).  Students can use Google Forms both to contribute to class activities and to collect empirical data for their own research.  Once a piece of research has been designed, it is simple and quick to create a web-based form to collect data; with descriptive statistics created ‘on the fly’ by the software, students can see how response rates affect the overall findings of a study.  Further to this, Google Forms can be used in class for students to contribute to activities such as creating multiple-choice questions in clusters, which are then collated by the teacher and given as a formative assessment at the end of a session (see appendix 15).
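As a sketch of the ‘on the fly’ descriptive statistics idea, the snippet below assumes a hypothetical CSV export of form responses (a file named responses.csv with a numeric ‘rating’ column). The file and column names are illustrative, not taken from the Google Form described above.

```python
# Minimal sketch: descriptive statistics from a hypothetical export of form
# responses. 'responses.csv' and the 'rating' column are assumed for the
# example; any numeric column exported from a form would work the same way.
import pandas as pd

responses = pd.read_csv("responses.csv")

print("Responses so far:", len(responses))
print(responses["rating"].describe())      # count, mean, std, quartiles
print(responses["rating"].value_counts())  # frequency table for class discussion

# Re-running the script as more responses arrive lets students see how the
# descriptives (and their stability) change with the response rate.
```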

 

Final thoughts

Currently the amount of curriculum space is not sufficient to give students a secure grasp of even the most basic quantitative methods (MacInnes, 2012). Within the current climate in education it is tempting for teachers to deliver strategic lessons and assessments that ‘teach to the test’ (Halonen et al., 2003) and so miss extending students’ knowledge and skill set and developing the higher-level skills needed (Jarrett, 2010).  Methods teaching is resource intensive: to be done well it needs a substantial amount of small group work, consolidating skills and placing them within a clear methodological framework, from research design through to analysis and reporting of results (MacInnes, 2012).

When providing advice and strategies for any teacher, it is important to consider the view that teaching is instinctive rather than learned, and that there are few particular patterns of behaviour that are more effective than others for all learners; this leaves the teacher room to differentiate and stretch everyone (Weinstein, 1988).  When providing advice to the teacher in question, one should support and scaffold their delivery rather than enforce explicit strategies.  Education courses and textbooks convey the notion that learning is nonproblematic if certain methods are applied and generally avoid discussion of what to do when faced with failure (Good, 1983).

Students are not just empty receptacles waiting to be filled with important facts, new and interesting concepts and practical titbits of information; as Bain (2004) suggests, they are active and inquisitive learners.  The teacher, facilitator, or whatever label is ‘in vogue’ at the time, plays an enormous role in learning through delivery and course design.  Research methods provide the foundation on which the rest of psychology is built, and for a student to achieve, a sound awareness of this foundation is needed.

The one element that seems to pervade all discussions of exceptional teaching is enthusiasm for the subject (Buskist et al., 2002), and this must be shown consistently, both overtly in discussion and through creating exciting and engaging lessons for learners.  Students should not only learn about the content and concepts within research but should also be pushed to learn how to generate and evaluate research effectively; they will consequently become learners who not only recognise good studies but also know how to design them (Sternberg, 1999).

 

References

Advisory Council on Mathematics Education (2011) Mathematical Needs: Mathematics in the workplace and Higher Education London: ACME. Available http://www.acme-uk.org/media/7624/acme_theme_a_final%20(2).pdf last accessed 18/05/2013.

Angelo, T.A. (1995) Classroom assessment for critical thinking. Teaching of psychology, 22, 6-7.

Bain, K. (2004) What the best college teachers do. Cambridge, MA: Harvard University Press cited in Saville, B.K., (2008) A guide to teaching research methods in psychology, Oxford: Blackwell Publishing.

Beins, B. (1993) Using the Barnum effect to teach about ethics and deception in research. Teaching of Psychology, 20, 33-35.

Blair-Broeker, C. (2003) Bringing psychology to life. In Buskist, W., Hevern, V., and Hill, G.W. (eds) (2002) Essays from e-xcellence in teaching. Available: http://teachpsych.lemoyne.edu/teachpsych/eit/index.html last accessed 15/5/13.

Blessing, S.B., and Blessing, J.S. (2010) PsychBusters: A means of fostering critical thinking in the introductory course. Teaching of Psychology, 37(3), 178-182.

Bol, L., & Strage, A. (1996) The contradiction between teachers’ instructional goals and their assessment practices in high school biology courses. Science Education, 80, 145–163.

Breakwell, G., Smith, J.A., and Wright, D.B. (Eds)(2012) Research Methods in Psychology, 4th Edition. London: SAGE Publications.

Buskist, W., Sikorski, J., Buckley, T., and Saville, B.K. (2002) Elements of Master Teaching cited in Saville, B.K., (2008) A guide to teaching research methods in psychology, Oxford: Blackwell Publishing.

Burkley, E., and Burkley, M. (2009) Mythbusters: A tool for teaching research methods in psychology. Teaching of Psychology, 36(3), 179-184.

Connor-Greene, P.A. (2000) Assessing and promoting student learning: blurring the line between teaching and testing. Teaching of Psychology, 27(2), 84-88.

Coolican, H. (2009) Research and Statistics in Psychology. NY: Routledge.

Coombs, W.T., and K. Rybacki. (1999) Public relations education: Where is pedagogy? Public Relations Review 25(1), 55–64.

Cumming, G. (1983) The introduction of statistics course: mixed student groups preferred to be streamed, Teaching of Psychology, 10(1), 34-37.

Dunlosky, J., Rawson, K.A., Marsh, E.J., Nathan, M.J., and Willingham, D.T. (2013) Improving students’ learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4-58.

Dunn, D. (2000) Letter exchanges on statistics and research methods: writing, responding and learning. Teaching of Psychology, 27, 128-130.

Flanagan, C. (2012) The Complete Companions: The research methods companion for A Level Psychology. Oxford: OUP.

Fredericks, J.A., Blumenfeld, P., Friedel, J., & Paris, A. (2005). School engagement. In K.A. Moore & L. Lippman (Eds.) What do children need to flourish?: Conceptualizing and measuring indicators of positive development. New York, NY:  Springer Science and Business Media

Garner, M. (2012) There’s madness in our methods: The pedagogical culture of research methods. The Proceedings of HEA Social Sciences teaching and learning summit: Teaching research methods. University of Warwick.

Gibbs, G. & Simpson, C. (2005) Conditions under which assessment supports student learning, Learning and Teaching in Higher Education, 1(1), 3-31.

Good, T.L. (1983) Recent classroom research: Implications for teacher education. In D. C. Smith (Ed.), Essential knowledge for beginning educators. Washington, DC: American Association of Colleges of Teacher Education.

Hall, S., and Seery, B. (2006) Behind the facts: helping students evaluate media reports of psychological research. Teaching of Psychology, 33, 101-104.

Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., et al. (2003) A rubric for learning, teaching and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196-208.

Herzog, H.A. (1990) Discussing animal rights and animal research in the classroom. Teaching of Psychology, 17, 90-94.

Holt, J. (1972) Freedom and beyond. Middlesex: Penguin Books.

Jarrett, C. (2010) The journey to undergraduate psychology. The Psychologist, 23 (9), 714-717.

Kaminski, J.A., Sloutsky, V.M., and Heckler, A.F. (2008) The advantage of abstract examples in learning math. Science, 320, 454-455.

Karpicke, J.D., and Roediger, H.L. (2008) The critical importance of retrieval for learning. Science, 319, 966-968.

Kuh, G.D. (2003) What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35, 2, 24-32.

Kuh, G.D., Kinzie, J.H., Schuh, J.H., and Whitt, E.J. (2010) Student Success in College: Creating conditions that matter. Wiley and Sons: San Francisco.

Lammers, W.J. (1993) Engaging Activities for Students Who are Learning Research Methods. Teaching of Psychology, 20, 33-35.

Lawson, T.J. (1999) Assessing Psychological Critical Thinking As A Learning Outcome for Psychology Majors, Teaching of Psychology, 26(3), 207-209.

Lilienfeld, S.O., Lynn, S.J., Ruscio, J., and Beyerstein, B.L. (2009) 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behaviour. Sussex: Wiley-Blackwell.

Lippman, L., and Rivers, A. (2008) Assessing School Engagement: a guide for out-of-school time programme practitioners. Child Trends. Available: http://www.childtrends.org/files/child_trends-2008_10_29_rb_schoolengage.pdf last accessed 16/05/2013.

MacInnes, J. (2012) Quantitative Methods teaching in UK Higher Education: The state of the field and how it might be improved. The Proceedings of HEA Social Sciences teaching and learning summit: Teaching research methods. University of Warwick.

Miles, J., and Banyard, P. (2007) Understanding and using statistics in psychology: a practical introduction. London:SAGE Publications.

‘Mulierculum’ (2012) Psychological Mythbusters. Available http://www.youtube.com/watch?v=prk8KVHZrik last accessed 14/5/2013.

Norton, L.  (2004). Psychology Applied Learning Scenarios (PALS): A practical introduction to problem-based learning using vignettes for psychology lecturers. LTSN.

Novak, G.M., Patterson, E.T., Gavrin, A.D., and Christian, W. (1999) Just-in-time teaching: blending active learning with web technology cited in Saville, B.K.  (2008) A guide to teaching research methods in psychology, Oxford: Blackwell Publishing.

Pellegrino, J.W., Chudowsky, N., and Glaser, R. (Eds) (2001) Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press. Available http://www.nap.edu/books/0309072727/html/ last accessed 10/5/2013.

Perkins, D.V. (2001) A “jigsaw classroom” technique for undergraduate statistics courses. Teaching of Psychology, 28(2).

Popper, K. (1963) Conjectures and Refutations: The Growth of Scientific Knowledge. NY: Routledge.

Porter, A., Cartwright, T., & Snelgar, R. (2006) Teaching statistics and research methods to heterogeneous groups: the Westminster experience. In Proceedings of the Seventh International Conference on Teaching Statistics, Salvador, Brazil. Voorburg: The Netherlands: International Statistical Institute.

Pressley, M., Symons, S., McDaniel, M. A., Snyder, B. L., & Turnure, J. E. (1988) Elaborative interrogation facilitates acquisition of confusing facts. Journal of Educational Psychology, 80(3), 268.

Richmond, A.S., Russell, N.C., and Levin, J.R. (2011) Got neurons? Teaching neuroscience mnemonically promotes retention and higher order thinking. Psychology Learning and Teaching, 10(1), 40-45.

Sanchez, G., and Valcarcel, V. (1999) Science teachers’ views and practices in planning for teaching. Journal of Research in Science Teaching, 36(4), 493-513.

Saville, B.K. (2008) A guide to teaching research methods in psychology, Oxford: Blackwell Publishing.

Seifert, T. L. (1993) Effects of elaborative interrogation with prose passages. Journal of Educational Psychology, 85(4), 642.

Shimamoto, D. (2012, April 17) Implementing a Flipped Classroom: An Instructional Module. PowerPoint presented at the Technology, Colleges, and Community Worldwide Online Conference.

Shulman, L.S. (2002) Making Differences: A table of learning. Change: The Magazine of Higher Learning, 34(6), 36-44.

Sternberg, R.J. (1999) Teaching psychology students to be savvy consumers and producers of research questions. Teaching of Psychology, 26(3), 211-213.

Thornton, P., & Houser, C. (2004). Using mobile phones in education. In Wireless and Mobile Technologies in Education, 2004. Proceedings. The 2nd IEEE International Workshop on (pp. 3-10). IEEE.

Tickle, L. (2000) Teacher induction: The way ahead. Buckingham: Open University Press.

Weinstein, C.S. (1988) Preservice teachers’ expectations about the first year of teaching. Teaching and Teacher Education, 4(1), 31-40.

Wilen, W. W. (1987). Questions, Questioning Techniques, and Effective Teaching. NEA Professional Library. Available from http://www.eric.ed.gov/ERICWebPortal/search/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED310102&ERICExtSearch_SearchType_0=no&accno=ED310102 last accessed 08/05/2013.

Wiliam, D., Lee, C., Harrison, C., and Black, P. (2004) Teachers developing assessment for learning: impact on student achievement. Assessment in Education, 11(1), 49-65.

 

 
