From research question to research design

A proposed experiment on design thinking in education

Within current contexts of teaching and learning, education is shifting towards providing learners with a skillset that enables them to become active citizens in a globally connected world. The types of pedagogies that might be relevant to developing mindsets and skills appropriate for emerging digital learning and working environments are hotly debated. However, in the literature, we see design thinking being explored as one possibility (Razzouk & Shute, 2012).

Design thinking is strongly endorsed by popular commentary for its disruptive, game-changing potential (Brown, 2009) and for its possibilities in providing a framework and common language for students, teachers and parents who are after great learning (McIntosh, 2012). Non-empirical works are also appearing, reporting on the positive influence of design thinking in the classroom (Mouldey, 2015). The few empirical explorations of design thinking in K-12 classrooms also paint a positive picture regarding the educational benefits to students (Carroll, 2014; Carroll, Goldman, Britos, Koh, Royalty & Hornstein, 2010; Scheer et al., 2012). However, as the application of design thinking in a K-12 setting is a new phenomenon, very little empirical evidence has been collected from the educational sector (Anderson, 2014; Retna, 2015). Experimental and quasi-experimental studies of design thinking are lacking, and most existing studies, if not all, are qualitative (Razzouk & Shute, 2012).

Hence, there is a clear need for evidence-based inquiry that explores design thinking in K-12 education. To “undertake a study that has not been conducted before and thereby fill a void in existing knowledge” (Creswell, 2014, p. 18), this paper poses the following research question, designed via an iterative process documented in Appendix A:

What is the relationship between design thinking and student learning and engagement within the context of middle school K-12 education?

This paper documents the research design and method that I have chosen to answer this research question. Research designs that I discounted as inappropriate for this specific work are also discussed briefly, as are the ethical issues and dilemmas surrounding the proposed research and the essential features of the paradigm that I acknowledge as framing this research.

Chosen research design

To act as a framework for the collection and analysis of data, a quasi-experimental research design (McMillan & Schumacher, 2014) incorporating both qualitative and quantitative research strategies (Bryman, 2014) has been chosen to answer the research question guiding this study. As an evaluation research project (Bryman, 2014; McMillan & Schumacher, 2014), this study will focus on manipulating design thinking as an intervention (Punch & Oancea, 2014) in a school environment. This planning process could have progressed either via a paradigm-driven approach or a pragmatic, question-driven approach (Punch & Oancea, 2014). For this research proposal, a question-driven approach (Bryman, 2014; Punch & Oancea, 2014) was adopted to ensure compatibility and integrity in the way the research question and research method fit together (Punch & Oancea, 2014).

This pragmatic approach to research design, as illustrated in Figure 1 (below), positions the chosen study design between the research question and any collected data, thereby connecting the question to the data that will answer it (Punch & Oancea, 2014).

Figure 1. General planning for educational research

The initial research problem and information from the literature also inform the research design, as depicted in Figure 1. A well-written research question should indicate what data is necessary to answer it (Punch & Oancea, 2014), and analysis of this question therefore led the decision-making process in choosing an appropriate research plan.

The research question indicates that this research should aim to determine whether there is a causal relationship between design thinking and other variables, such as learning outcomes and student engagement.

The chosen design therefore needs to support an experimental approach in which variation in the treatment variable can be controlled so that the consequences of doing so can be studied (Punch & Oancea, 2014). Furthermore, the chosen design should satisfy the criterion of validity, so that the integrity of the conclusions generated is strong (Bryman, 2014). The method I have chosen therefore incorporates an experimental design which, while allowing variables to be manipulated, provides strong validity, particularly internal validity, which relates mainly to the issue of causality (Bryman, 2014). However, rather than adopting a classical experimental design (Bryman, 2014), a quasi-experimental design (Bryman, 2014; Creswell, 2014; McMillan & Schumacher, 2014) was chosen because it was not possible to assign subjects randomly to the experimental and comparison groups. This choice does pose threats to the validity of the design, as discussed below.

I also chose to integrate quantitative and qualitative research strategies within this one project. This mixed methods design (Bryman, 2014; Creswell, 2014; McMillan & Wergin, 2010) should enable the work to develop a better understanding of design thinking in the classroom by combining “the strengths of methods focused on quantitative data with the strengths of methods focussed on qualitative data” (Punch & Oancea, 2014, p. 339). The choice to support quantitative data with qualitative observations was also guided by a perceived need to understand the context in which such an intervention occurs, as well as the diverse viewpoints of the stakeholders. This choice is underpinned by viewpoints in childhood research, concerned with seeking the child’s perspective, that require a qualitative method to be incorporated into research designs (Punch & Oancea, 2014).

Alternative Research Designs

A research design provides the structure that guides the execution of a research method and the analysis of the subsequent data (Bryman, 2014). With this in mind, and as discussed above, an experimental design was chosen to allow the classroom researcher to manipulate design thinking as an experimental intervention that determines what the subjects will experience (McMillan & Schumacher, 2014). Although a true experimental design would have provided the strongest argument for a causal effect (McMillan & Schumacher, 2014), it was discounted as being too disruptive (Creswell, 2014) and perhaps unethical, owing to the random assignment of students to each study group (Bryman, 2014; Creswell, 2014; Punch & Oancea, 2014). Furthermore, true experiments threaten the external validity of any results obtained (McMillan & Schumacher, 2014). A specific aim of this work is to produce results that can be generalised beyond the specific research context (Bryman, 2014); therefore, a quasi-experimental approach was more appropriate than a classical experimental design.

Likewise, other non-experimental designs, in which there is no control over what may influence subjects’ responses (McMillan & Wergin, 2010), such as a comparative design (Bryman, 2014, p. 72), a cross-sectional design (Bryman, 2014, p. 58) and a longitudinal design (Bryman, 2014, p. 63), were assessed as unsuitable for this study because they do not allow for the manipulation of variables (McMillan & Schumacher, 2014) or for the study of causality within a short-term study. Although a longitudinal design does allow causal inferences to be made (Bryman, 2014), it was not chosen due to the time and costs involved in such a study (Bryman, 2014). Finally, a case study design (Bryman, 2014, p. 66) was briefly considered but discounted, as such a design views the case as a point of interest in its own right (Bryman, 2014). Such a context-bound design (McMillan & Schumacher, 2014) is underpinned by an idiographic approach to research, whereas this project strives for a nomothetic design – to generate “statements that apply regardless of time and place” (Bryman, 2014, p. 69).

In summary, the need for strong internal and external validity was a guiding factor in deciding upon a quasi-experimental design, as was the need for a non-disruptive approach. The other non-experimental designs were not chosen because they do not readily support tests of causal inference within the constraints of this study.

Strengths and Limitations of the Chosen Design

The implication of not randomly assigning students to each study group in the quasi-experimental design is that the treatment and comparison groups may not be comparable, and thus the internal validity of the study may be compromised (Bryman, 2014; McMillan & Schumacher, 2014). However, this limitation is balanced by the strong external validity provided by such a design (Bryman, 2014; Creswell, 2014; McMillan & Schumacher, 2014). Furthermore, the quasi-experimental design provides for a stronger ethical approach, as discussed below, because the already intact classes are not disrupted by the study.

Further threats to internal validity are widely discussed in the research design literature (Bryman, 2014; Creswell, 2014). However, quasi-experimental designs do provide reasonable control over most sources of invalidity (McMillan & Schumacher, 2014). It should also be noted that this experiment will be undertaken on a small school campus where the students are very familiar with each other. In this situation, the treatment and comparison group participants will be able to communicate with each other, and the danger is that some aspects of the design thinking stimulus will be passed on from the treatment group to the comparison group. This diffusion of treatments may pose a threat to the internal validity of the obtained results (McMillan & Schumacher, 2014).

Compensatory rivalry (Creswell, 2014) may also need to be considered when interpreting any findings. If this scenario were to develop, the comparison group would exert extra effort because they realise that they are not being instructed via a design thinking framework, and their behaviour would thereby affect the dependent variables being studied.

However, the collection of qualitative data strengthens the overall experimental design by allowing an in-depth understanding of the context in which the intervention occurs to develop (Bryman, 2014; McMillan & Schumacher, 2014). Although such overt observations may also threaten internal validity by leading to subject effects and treatment effects (McMillan & Schumacher, 2014), the chosen design should allow for the collection of meaningful data.

Research Method

The educational setting for this project is an inner-city, all-boys secondary school in Melbourne, Australia. The study will take place on a Year 9 campus populated by 125 students aged 14 to 15. The student population is predominantly Anglo-Saxon, with a mix of other cultures and new immigrants; a small percentage of the students are fee-paying, non-English-speaking students who are new to Australia. As is common in educational research, and consistent with a quasi-experimental design, the project will make use of intact, already established groups of subjects (McMillan & Schumacher, 2014), designated as Class A and Class B, as shown in Table 1 below. Each class will be randomly assigned to one of the two interventions (X1 and X2) and taught by the same teacher.

The method chosen to answer the research question is a non-equivalent-groups pre-test, post-test comparison group design (McMillan & Schumacher, 2014). Identical research instruments (pre-tests) will be administered to each class on the first day of a four-week unit of work, and an identical instrument (post-test) at the conclusion of this unit. Following the pre-test, Class A will be taught using a pragmatic approach to instruction (designated X1), while Class B will be taught via a design thinking framework (designated X2). A series of qualitative observations (designated O) will be undertaken by the researcher throughout the intervention period, which will end at the post-test. The proposed quasi-experimental approach allows the researcher to compare the two interventions and make claims about causality.

Consistent with the view that qualitative and quantitative strategies are compatible (Bryman, 2014), quantitative data will be collected alongside qualitative data, the latter to enhance the ability to study process and to bring sensitivity to meaning and to context (Punch & Oancea, 2014).

Qualitative Instruments: The classroom researcher will engage in micro-ethnography (Bryman, 2014) to collect participant observations (refer to Table 1), allowing for triangulation of data (Creswell, 2014; McMillan & Schumacher, 2014) and for an understanding of the intervention to be developed from the participants’ perspective (McMillan & Schumacher, 2014). At the end of the study period, a semi-structured interview (Bryman, 2014) will be used to collect qualitative information from all students regarding learning outcomes and student engagement.

Quantitative Instruments: Data will be collected via a pre-test and a post-test that measure specific learning outcomes related to the topic being explored and assist with the analysis of the intervention (Creswell, 2014). A self-completion questionnaire, supervised by the researcher and administered online at the conclusion of the study, will collect student voice regarding the use of design thinking in the classroom.
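The proposal does not specify how the pre-test and post-test scores would be analysed. As an illustration only, the following minimal sketch assumes that each student sits both tests, that gain scores (post-test minus pre-test) are compared between the two intact classes, and that an independent-samples t-test is an acceptable first pass; all variable names and values are hypothetical placeholders rather than part of the proposed method.

```python
# Illustrative sketch only: one possible analysis of the pre-test/post-test
# scores for the two intact classes. The data below are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical score arrays (one entry per student, 0-100 scale assumed)
class_a_pre = np.array([55, 62, 48, 70, 65])   # Class A (X1, comparison), pre-test
class_a_post = np.array([60, 66, 55, 72, 70])  # Class A (X1, comparison), post-test
class_b_pre = np.array([52, 64, 50, 68, 63])   # Class B (X2, design thinking), pre-test
class_b_post = np.array([65, 75, 61, 80, 74])  # Class B (X2, design thinking), post-test

# Gain scores (post minus pre) partially adjust for non-equivalent starting points
gain_a = class_a_post - class_a_pre
gain_b = class_b_post - class_b_pre

# Independent-samples (Welch) t-test on the gain scores of the two classes
t_stat, p_value = stats.ttest_ind(gain_b, gain_a, equal_var=False)
print(f"Mean gain, Class A (X1): {gain_a.mean():.1f}")
print(f"Mean gain, Class B (X2): {gain_b.mean():.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```

An ANCOVA treating the pre-test as a covariate is a common alternative for non-equivalent groups, since simple gain scores only partially compensate for the classes starting at different levels.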

Strengths and limitations of the chosen method

The chosen design and method incorporate two treatments into this one study. As design thinking has been studied in a secondary setting, but not extensively, it remains unknown whether the intervention is distinct enough from that given to the comparison group to generate a statistically significant difference (McMillan & Schumacher, 2014). Perhaps the intervention needs to run for a longer period, as the literature suggests that the effects of design thinking take time to develop. This is an issue of fidelity of intervention (McMillan & Schumacher, 2014) that may be addressed by analysing any digital footage of the classroom teaching and learning collected throughout the experiment.
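Related to this concern is the question of whether two intact classes provide enough statistical power to detect a modest treatment effect at all. The proposal does not state the class sizes, so the sketch below assumes, purely for illustration, around 25 students per class and a conventional alpha of .05; it uses statsmodels to estimate the power available for small, medium and large effects.

```python
# Illustrative sketch only: a rough sensitivity check under assumed numbers
# (about 25 students per intact class, alpha = .05). Class size is a
# hypothetical assumption, not a figure taken from the proposal.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Statistical power to detect small, medium and large effects (Cohen's d)
for label, effect_size in [("small (d = 0.2)", 0.2),
                           ("medium (d = 0.5)", 0.5),
                           ("large (d = 0.8)", 0.8)]:
    power = analysis.solve_power(effect_size=effect_size, nobs1=25,
                                 alpha=0.05, ratio=1.0,
                                 alternative='two-sided')
    print(f"{label}: power = {power:.2f}")

# Smallest effect size detectable with 80% power under the same assumptions
detectable_d = analysis.solve_power(nobs1=25, alpha=0.05, power=0.8,
                                    ratio=1.0, alternative='two-sided')
print(f"Detectable effect at 80% power: d = {detectable_d:.2f}")
```

Under these assumed numbers, only fairly large effects would be reliably detectable, which reinforces the point that a short or weakly differentiated intervention may produce real but statistically invisible differences.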

A limitation of the method also stems from the difficulty of identifying appropriate variables with which to measure the value of design thinking (Razzouk & Shute, 2012). Thus, construct validity (Bryman, 2014) may be a limitation of the chosen method. However, in this situation, in which the variables are not well defined, the qualitative approach is well suited to addressing the research problem, as it allows the researcher to learn from participants via exploration (Creswell, 2014).

This quasi-experiment has been designed to develop an understanding of the intervention of design thinking while also examining the causal effects that inhibit or promote change when the intervention occurs (Bryman, 2014). Thus, although weaknesses have been identified, the proposed method should allow for the collection of credible data.

Ethical Issues

The ethical issues I considered while designing this project are tabulated in Table B1 in Appendix B. The resulting framework was used to establish an ethical stance on how individuals and groups directly affected by this research should be treated. To ensure ethical conduct and the quality of the research, this information should be referred to continually during planning, data collection and the dissemination of the research findings (Tangen, 2014). Analysis of this information illustrates that the researcher is ethically responsible for protecting the rights and welfare of the subjects who participate in this study (McMillan & Schumacher, 2014).

Although ethical considerations have been tabulated in Table B1, their practical application poses considerable dilemmas, as ethical research is also about responsible and situated judgement (Punch & Oancea, 2014). In particular, within the four-week timeframe of this project the subjects will be exposed to intense observation so as to achieve technically precise data collection (Punch & Oancea, 2014), and yet the question remains: is this what ought to be done (Punch & Oancea, 2014), given that such an approach may cause stress to the participants?

Furthermore, the guidelines presented in Table B1 state that all participants in the study should be fully informed regarding the procedures of this project. However, participants' awareness may change their behaviour because they know they are being studied (Bryman, 2014); thus, the decision to seek consent has consequences for the validity of the project. The option to adopt a more covert approach exists, but the ethical framework presented in Table B1 suggests that covert approaches are likely to impinge on human values and sensibilities. Within the context of this project, therefore, many contextual decisions will need to be faced that can perhaps never be neatly defined, nor fully resolved (Punch & Oancea, 2014).

Paradigms

A paradigm is a collection of beliefs about knowledge and our relationships with knowledge, along with practices based upon those beliefs (Hughes, 2010). Based on these worldviews, perhaps without realising it, researchers make assumptions about knowledge and how it can be obtained (Creswell, 2014). The proposed research strives to be question driven (Punch & Oancea, 2014) without declaring a particular paradigmatic allegiance (Punch & Oancea, 2014). However, published commentary on research design encourages researchers to be mindful of deeper philosophical issues and to discuss them as they arise (Hughes, 2010; MacKenzie & Knipe, 2006; Punch & Oancea, 2014).

Analysis of the chosen quasi-experimental, mixed methods design suggests that the design is consistent with the doctrine of positivism that advocates the application of the natural sciences to the study of social reality (Bryman, 2014). This viewpoint is inherent in the quasi-experimental design that incorporates the “practices and norms of the natural scientific model” (Bryman, 2014, p. 36) to establish relationships between variables (McMillan & Schumacher, 2014). The assumption being illuminated here is that empirical research strategies, such as an intervention strategy, normally associated with the natural sciences, can also be applied to the social sciences. This scientific approach also illuminates two further assumptions. The first is that there are stable social facts, with a single reality, which can be measured (McMillan & Schumacher, 2014). The second is a deterministic worldview in which causes determine effects or outcomes (MacKenzie & Knipe, 2006). Another positivist feature that is apparent is the use of observation and measurement in order to predict and control forces that surround us (Hughes, 2010, p. 195). Thus, the paradigm underpinning the proposed research is positivism, but not exclusively.

The qualitative approach, including student observations, built into the methodology is more consistent with interpretivism (Bryman, 2014) as a contrasting epistemological worldview. Interpretivism aims to ‘understand’ human behaviour rather than just ‘explain’ it (Bryman, 2014) and therefore concentrates on the meaning people bring to different contexts (Punch & Oancea, 2014). The interpretivist researcher thus seeks to understand socially constructed, negotiated and shared meanings and to represent them as theories of human behaviour (Hughes, 2010). In this project, the methodical collection of qualitative data on students’ attitudes and learning, gathered through video observation and student interviews, reflects such an interpretivist worldview. Also built into the proposed design and method is the assumption that the social researcher should gain “access to people’s ‘common-sense’ thinking and hence interpret their actions and their social world from their point of view” (Bryman, 2014, p. 30). This viewpoint is evident in the proposal's aim to comprehend the effects of design thinking via exploration of the wider context, including student and teacher perceptions. There are multiple realities that can be explored, rather than one reality, and the purpose of research is to understand a social situation from the participants’ perspective (McMillan & Schumacher, 2014). The interpretivist researcher recognises that reality is socially constructed and that participants’ views of the reality being studied are important considerations (MacKenzie & Knipe, 2006).

In summary, this short exploration of paradigms illuminates epistemological and ontological assumptions made by this research that might otherwise have remained hidden, even though, as we have seen, they feed into the ways the research is designed and carried out.

Conclusion

In conclusion, a research design and method were chosen to provide answers to the posed research question. The quasi-experimental mixed methods design, chosen via a pragmatic, question-driven approach to research design, should ideally produce valid results capable of answering the research question. By addressing this question, the hope is to fill a void in existing empirical knowledge regarding how design thinking might be used within the middle years classroom.

This design task was found to be iterative in nature, with the researcher needing to challenge and restate assumptions throughout the entire planning process, including assumptions about knowledge and how it is discovered. Limitations of the research design and method were uncovered; however, the strengths of the design should allow this work to generate externally valid data regarding the use of design thinking in the classroom setting. As discussed in this proposal, new questions would then need to be articulated once the researcher has a better understanding of the studied phenomenon of design thinking.


References

Anderson, N. (2014). Design thinking: Framework to foster creativity and innovation in rural and remote education. Australian and International Journal of Rural Education, 22(2), 43-52.

Brown, T. (2009). Change by design: How design thinking transforms organizations and inspires innovation. HarperCollins e-books.

Bryman, A. (2012). Social research methods (4th ed.). Oxford, UK: OUP.

Carroll, M., Goldman, S., Britos, L., Koh, J., Royalty, A., & Hornstein, M. (2010). Destination, imagination, and the fires within: Design thinking in a middle school classroom. International Journal of Art & Design Education, 29 (1), 37-53.

Carroll, M. (2014). Shoot for the moon! The mentors and the middle schoolers explore the intersection of design thinking and STEM. Journal of Pre-College Engineering Education Research (J-PEER), 4(1), 14-30.

Creswell, J. W. (2014). Educational research: Planning, conducting, and evaluating quantitative and qualitative research [Kindle version]. Retrieved from http://www.amazon.com.au/

Hughes, P. (2010). Paradigms, methods and knowledge. In G. Mac Naughton, S. A. Rolfe, & I. Siraj-Blatchford (Eds.), Doing early childhood research: international perspectives on theory & practice (2nd ed.). (pp. 35-61). Crows Nest, NSW: Allen & Unwin.

Luka, I. (2014). Design Thinking in Pedagogy. The Journal of Education Culture and Society, (2), 63-74.

MacKenzie, N., & Knipe, S. (2006). Research dilemmas: Paradigms, methods and methodology. Issues in Educational Research, 16(2), 193-205.

McIntosh, E. (2012). The problem finders: Designing for great learning. UnBoxed: A Journal of Adult Learning in Schools. Retrieved from https://www.hightechhigh.org/unboxed/issue9/the_problem_finders/

McMillan, J. H., & Wergin, J. F. (2010). Understanding and evaluating educational research (4th ed.). Upper Saddle River, NJ: Pearson/Merrill.

McMillan, J. H., & Schumacher, S. (2014). Research in education: Evidence-based inquiry [Kindle version]. Retrieved from http://www.amazon.com.au/

Punch, K. F., & Oancea, A. E. (2014). Introduction to research methods in education [Kindle version]. Retrieved from http://www.amazon.com.au/

Razzouk, R., & Shute, V. (2012). What is design thinking and why is it important? Review of Educational Research, 82(3), 330–348.

Retna, K. S. (2015). Thinking about “design thinking”: A study of teacher experiences. Asia Pacific Journal of Education, (ahead-of-print), 1-15.

Scheer, A., Noweski, C., & Meinel, C. (2012). Transforming constructivist learning into action: Design thinking in education. Design and Technology Education: An International Journal, 17(3).

Tangen, R. (2014). Balancing ethics and quality in educational research: The ethical matrix mix. Scandinavian Journal of Educational Research, 58(6), 678-694.

Appendix A

Research Topic or Problem

The problem being explored is the need for a pedagogical framework able to cater for teaching the skills necessary for a 21st Century context, which at present is being shaped by the ubiquity of technology and associated knowledge networks. This flood of technology and knowledge networks is now changing education, making anywhere, anytime learning possible.

Educational institutions, including K-12 schools in Australia, function in a world radically different from that in which such institutions were established. Due to vast social changes shaped by the disruptive effects of technology, a perceived educational need is to cultivate students with strong skills in creativity, problem solving and self-directed learning. The aim of education is slowly shifting from the dissemination of knowledge to a focus on providing learners with the skillset necessary to be active citizens in a globally connected world that is increasingly reliant on active knowledge networks while also beset by wicked design problems. The types of pedagogies that might be relevant to developing these mindsets and skills in emerging Digital Learning Environments are hotly debated.

In the literature, we see Design Thinking being explored as a possible pedagogical framework for encouraging innovative mindsets capable of solving wicked design problems. However, is Design Thinking a viable option in K-12 education, considering that many of its potential effects are long term? Is such an approach underpinned by a viable learning theory? And what are teachers' perceptions of Design Thinking as a pedagogical framework?

Research Question

Within the context of middle school education, what is the relationship between design thinking and student learning outcomes and levels of cognitive engagement?

From Literature to Research Question and Practical Importance

This perceived need for education to adapt to the 21st Century context also applies to pedagogy (Luka, 2014), and thus there exists a recent trend to adopt and apply design practices to education. One possible pathway is the application of design thinking (Razzouk & Shute, 2012) as a teaching and learning framework. Design thinking is being endorsed in popular media as providing a framework and a common language for students, teachers and parents who are after great learning (McIntosh, 2012), and the work of Mouldey (2015) supports such claims by reporting on the positive influence of design thinking in the classroom. However, while these non-empirical commentaries provide a positive introduction to explorations of Design Thinking in K-12 education, experimental and quasi-experimental studies of design thinking are lacking and most, if not all, existing studies are qualitative (Razzouk & Shute, 2012). It appears that little empirical research on design thinking has been conducted in the educational sector (Anderson, 2014).

Whether Design Thinking improves learning outcomes remains largely unknown, but the few academic papers that provide an empirical exploration of design thinking in K-12 classrooms paint a positive picture (Carroll, Goldman, Britos, Koh, Royalty & Hornstein, 2010; Scheer et al., 2012). The start of such an exploration is positive, but the application of design thinking in a K-12 setting is still a relatively new phenomenon and the current evidence base to inform debate is very small (Anderson, 2014). Therefore, there is a clear need for empirically driven evaluation research concerned with the evaluation of Design Thinking as a pedagogical framework in K-12 education.

With the above social changes in mind, and in recognition of the perceived need for a new approach to teaching and learning, the research question has been posed to further inform the debate on the application of Design Thinking in a K-12 setting, with a very specific focus on Year 9 boys due to their lessening engagement with education. The challenges of engaging middle school students are well documented (Honan, 2010). The question has been posed with the understanding that design thinking is a framework to support a constructivist approach to learning (Scheer et al., 2012) within an evolving Digital Learning Environment (DLE).

References

Anderson, N. (2014). Design thinking: Framework to foster creativity and innovation in rural and remote education. Australian and International Journal of Rural Education, 22(2), 43-52.

Carroll, M. (2014). Shoot for the moon! The mentors and the middle schoolers explore the intersection of design thinking and STEM. Journal of Pre-College Engineering Education Research (J-PEER), 4(1), 14-30.

Carroll, M., Goldman, S., Britos, L., Koh, J., Royalty, A., & Hornstein, M. (2010). Destination, imagination, and the fires within: Design thinking in a middle school classroom. International Journal of Art & Design Education, 29 (1), 37-53.

Honan, E. (2010). Literacies. In D. Pendergast & N. Bahr (Eds.), Teaching middle years: Rethinking curriculum and assessment (2nd ed). (pp. 139–154). Crows Nest, NSW: Allen & Unwin.

Luka, I. (2014). Design Thinking in Pedagogy. The Journal of Education Culture and Society, (2), 63-74.

McIntosh, E. (2012). The problem finders: Designing for great learning. UnBoxed: A Journal of Adult Learning in Schools. Retrieved from https://www.hightechhigh.org/unboxed/issue9/the_problem_finders/

Razzouk, R., & Shute, V. (2012). What is design thinking and why is it important? Review of Educational Research, 82(3), 330–348.

Scheer, A., Noweski, C., & Meinel, C. (2012). Transforming constructivist learning into action: Design thinking in education. Design and Technology Education: An International Journal, 17(3).

Appendix B

Ethical Concerns and Recommendations

Table B1 Ethical concerns and recommendations
Ethical concern: Harm to participants
Physical harm, harm to participants' development, loss of self-esteem, stress, and inducing participants to perform reprehensible acts.

Recommendations:
Non-maleficence must be a guiding principle.
Minimise disturbances both to the subjects themselves and to the subjects' relationships with their environment.
Develop strategies to keep records anonymous and report findings so that individuals cannot be identified; such an approach may entail the use of pseudonyms.
Work to ensure field notes do not disclose people's identities.
Ensure students have a voice when dialoguing with the researcher.
The safety and confidentiality of the researcher should also be considered.
Put in place data protection procedures.
Seek informed consent.

Ethical concern: Lack of informed consent
Participants are not fully aware of the study and cannot make an informed decision about whether or not to be included in it.

Recommendations:
Supply student participants and their guardians with as much information as possible (see Bryman, 2014, p. 153) so that they can make an informed decision regarding participation in the study.
Make use of a study information sheet and consent form, for example as offered by Bryman (2014, p. 141).

Ethical concern: Deception
Researchers represent their work as something other than what it is. This is ethically questionable and can also bring social research into disrepute.

Recommendations:
Avoid deception or covert approaches, as these violate principles of informed consent.
Do not permit researchers or collaborators to pursue methods of inquiry that are likely to infringe human values and sensibilities (Bryman, 2014, p. 143).
Participation must be voluntary and fully informed.

Ethical concern: Invasion of privacy
Information that is private enters the public domain.

Recommendations:
All participants have a right to privacy.
Structure the research to ensure confidentiality and anonymity.
Names should not be used without permission.
Take steps to ensure that the location of the research site is not identified.
All records are to be kept confidential.

This ethics information is adapted from Bryman (2014), Creswell (2014), McMillan and Schumacher (2014), Punch and Oancea (2014), and Tangen (2014).

END OF DOCUMENT

This work was submitted by Simon Keily for the CSU subject “Introduction to educational research” (EER500).

This is the work of Simon Keily. The work or ideas of other authors contained within this assignment have been acknowledged in full.